During and after the FDR era, Republicans got politically smacked around and were the minority party for decades. It wasn't until the advent of the Southern Strategy, Trickle Down Economics, Two Santa Claus Theory, removal of the Fairness Doctrine, Rush Limbaugh, and Newt Gingrich that Republicans started doing whatever it took to gain seats, and thus power, in government. The GOP's "Party > Country" mandate manifested itself only within the last 40 or so years.
Edit: Here's some data from Wikipedia. FDR and the New Deal were immensely popular, so Republicans needed some way to claw back positive sentiment, hence the culture wars and pandering to religious and racist angst, etc.