—“The Presidency and the Nominating Process,” by Lara Brown, in The Presidency and the Political System (ed. Nelson).
—Portions of chapters 3, 4, 5, and 7 in Presidential Power: Unchecked and Unbalanced.
The shift from selecting presidential nominees via national party conventions dominated by state party elites to letting the party rank-and-file select them through primaries was a slowly developing process that began in 1901 as an artifact of the progressive era, grew, had setbacks, grew again, and was not completed until the 1970s (and continues to evolve today). This change in the selection process altered the type of presidential nominees we choose between in the general election, presenting us with presidents who are more self-driven and possibly more demagogic than the Founders desired.
[A note: I will work on keeping these briefer, more concise. But to paraphrase Mark Twain, I don’t always have time to write a shorter essay. And my time may get shorter next month. Not only am I involved in a program review process for my department, but due to a successful union complaint (go union!), I am to be reviewed for promotion this year instead of next, which is great except it means I have to get my portfolio put together by April 1 instead of sometime in October. So if I fail to post lectures on the weekly schedule I’d hoped for, or if they are a bit rambly and not well-edited, I ask for your indulgence.]
The Significance of Institutional Change
Institutions matter.* I will have two lectures on how changing institutions have shaped the presidency. This first one is about the effect of changes in the nominee selection process, and the next will be about changes in the powers allotted to the president.
The institutions—rules—that govern any game determine the type of people who can have success in that game. For example, American football, with its line of scrimmage, favors having a hefty portion of behemoths on your team, while real football is ill-suited to people of gigantic stature. Politics, of course, is a type of game (a structured interaction between parties with non-congruent interests), and so it is not surprising that our electoral institutions will shape what type of players seek the presidency. In the era of the convention system, dynamic self-driven men—the type who tended to make enemies among some of their parties’ factions—tended to be blackballed, while some relatively non-offensive person—someone not driven enough to attract unfavorable attention—tended to be selected as the party’s nominee. The rise of the primary changed this, leading to the modern era of candidates who have the “fire in the belly” to make it through a long primary season while continually in the spotlight.
The First Signs of the Modern Presidency
The series of presidents from 1896 to the mid-20th century demonstrates the fitful rise of both the primary system and the modern presidency.
William McKinley’s election in 1896 marked the first appearance of the modern president. Significantly, he appeared before the rise of the primaries, but his ascendancy was made possible in large part by the increasing perception of the illegitimacy of the convention-based system for selecting presidential nominees. While a democratic “leap forward” from the party caucus system, the convention system still included only national and state level party elites, excluding the voice of the party rank-and-file. McKinley capitalized on the growing illegitimacy of the convention system by running against the party insiders, with the most candidate-centered campaign since Andrew Jackson, courting southerners who were usually ignored by national party leaders and using his personal popularity “to snatch the states from beneath their leaders” (Crenson & Ginsberg, p.115). Importantly, though, McKinley had kept a low profile in the years prior to the election so that he could go to the convention without having actively alienated many factions. He could only beat the party bosses in the convention system because he had not made enemies of them.
On McKinley’s death, Theodore Roosevelt, another very driven man, became president. TR’s view of the presidency, as expressed in his autobiography, was aggressively modern:
I declined to adopt the view that what was imperatively necessary for the Nation could not be done by the President unless he could find some specific authorization to do it. My belief was that it was not only his right but his duty to do anything that the needs of the Nation demanded unless such action was forbidden by the Constitution or by the laws.
Together, McKinley and Roosevelt seem to signal that the convention system could produce modern presidents. But they were anomalies, men who figured out how to manipulate the system successfully, and who built strength on its democratic illegitimacy. And the party elites were only temporarily set back.
The Rise, Decline, and Rise of Primaries
Primaries date from the first decade of the 1900s, and were very much a product of the progressive era. Just as states like California were instituting the initiative process as a way to unleash the power of the demos to break entrenched power, primaries were created to wrest control of nominee selection away from powerful state party leaders.
But in 1908 primaries were not yet relevant, and GOP party elites recaptured the machinery of the convention system, producing a president who was the antithesis of Teddy Roosevelt: William Howard Taft. Taft used a distinctly 19th-century term for the presidency, the “Chief Magistrate,” that emphasized the president’s strict constitutional role as executor of the law, rather than political leader. And in contrast to TR he argued that:
“the President can exercise no power which cannot be fairly and reasonably traced to some specific grant of power…There is no undefined residuum of power which he can exercise because it seems to him to be in the public interest.”
1912 was the first presidential election in which primaries—the votes of rank-and-file party members—played an important role in nominee selection. About a dozen states had primaries, which chose 1/3 of the Democratic delegates and over 40% of the Republican delegates. This benefited Democrat Woodrow Wilson, the man who argued for basing the president’s authority in his role as the only representative of the whole people, but who was disliked by the party bosses. A truly modern candidate, whose lust for the office led him to sacrifice “ideology, friendships, political alliances and loyalties” (Crenson & Ginsberg, p. 136), Wilson became a player at the convention only because of the primaries. He won 6 of the 13 held (the others were mostly won by favorite-son candidates), from which came a majority of the delegates pledged to him. But even with that he won the nomination only after the convention had deadlocked through dozens of votes, finally being selected on the 46th ballot.
The Republican side demonstrated the conflict between the old and new systems even more starkly, as Crenson and Ginsberg explain:
“On the Republican side, the election of 1912 approximated a natural experiment on the implications of presidential selection…Roosevelt had overwhelmed Taft in the thirteen states where there were Republican primaries, but Taft overwhelmed Roosevelt in the Republican convention. In other words, the primaries produced a candidate with an ambitious, new political agenda while the party convention produced a candidate who had never really wanted to be president in the first place.” (pp. 132-33)
Wilson’s victory was a small breakthrough for the primary system, but the party bosses fought back, and as progressivism lost favor so did the presidential primaries.
“The prominent role that the direct primary played in 1912 had virtually evaporated four years later. In 1916 Theodore Roosevelt and Charles Evans Hughes were the most popular Republican candidates, but neither would permit his name to be entered in any of the Republican primaries….Although more states held presidential primaries than ever before, most of the Republicans who entered them were favorite sons or long shots, not believable candidates. In the Democratic primaries, Woodrow Wilson was the only believable candidate. Turnout was low.” (Crenson & Ginsberg, p.141)
Although more states held primaries in 1916, 1/3 of them had abandoned the primary by 1935, with only one state (Alabama) adding it. Some of the states that still had primaries passed laws against delegates specifying which candidate they would support, and in others a favorite-son candidate would run to win the delegates and control them at the convention. The party elites had captured the machinery designed to strip them of influence. This “bastard” system gave us two presidents who would have fit in perfectly in the pre-McKinley era, Harding and Coolidge. Harding was literally plucked out of obscurity and groomed by party bosses to be president because he looked the part, and his personal weakness as a leader led to his underlings committing unprecedented scandals. Coolidge thought the country already had “too many laws,” and declined to be an active legislative leader.
But the primary system recovered and continued to grow, although slowly. Herbert Hoover’s 1928 capture of the Republican nomination was through the primaries. He was not the party leaders’ favorite because he’d supported Roosevelt’s breakaway Bull Moose party and had served in the Democratic Wilson administration as wartime food administrator. But based on his popular reputation he won all the primaries in which no favorite son was entered and arrived at the convention with a majority of delegates.
In 1932 Franklin Delano Roosevelt used primaries to reveal his success at building strength in the south and west to knock off the party’s inside favorite, Al Smith. In 1948 Harold Stassen’s primary efforts forced insider Thomas Dewey to contest in the primaries, demonstrating that they could no longer be ignored if a candidate had any hope of becoming his party’s nominee. In 1952, say Crenson and Ginsberg, quoting others, “the primaries may be said to have come into their own for the first time” as Eisenhower struck “a telling blow to the legitimacy of party leader control of delegates to the nominating convention.” (p. 173). Eisenhower’s success in primaries showed that he could win votes of the public; at the convention his managers challenged the credentials of delegates supporting his opponent, Robert Taft, who had been supported by party leaders, ultimately preventing them from being seated; and his general election campaign was a candidate-centered campaign that kept itself independent of the Republican National Committee. And in 1960 Kennedy used primaries to gather enough delegates to challenge the Democratic Party’s ultimate inside player, Lyndon Johnson, and ultimately wrest the candidacy away from him at the convention.
The growing importance of the primaries, and decline of control by party elites, in selecting the party’s nominee came to a head at the 1968 Democratic convention. In addition to the street protests outside, there was a battle royale inside about the continued control of (white male) party elites. Although Hubert Humphrey did not enter any primaries, on the first ballot at the convention he “trounced [Eugene] McCarthy in delegate votes (1,759¼ to 601)” (Brown, p. 202), a result that outraged the non-elite members of the party. This led to changes that in 1972 made the primary system almost fully dominant in delegate selection (although even today, there are enough “superdelegates,” unpledged party elites such as congressmembers, governors, and high-ranking party officials, to tip the balance if the primary outcomes are close),** and by the start of the next decade, “both the number of states holding primaries and the percentage of delegates awarded in those contests nearly doubled” (Brown p. 203). Today about 35 states hold primaries (the rest hold caucuses, which are more restrictive than primaries, but still not as controlled by party elites as under the old system) that select about 75% of the convention delegates.
Presidential Selection and Presidential Style
According to Crenson and Ginsberg,
primaries tended to recruit presidential candidates with sufficient stamina and motivation to make themselves presidential nominees instead of waiting for the party to designate them as potential presidents. Primaries opened the presidency to candidates of ambition who did not want the encumbrances that came from bargaining one’s way to the presidency, and who were sufficiently self-propelled and confident to weather the rigors of a primary campaign. (p. 140) […]
[T]he same drive that animates candidate-centered campaigns also motivates presidents and their staff assistants to expand the formal powers and administrative resources of the presidency. In the process they bring into being an institution with a structural interest in defending and expanding executive authority. (p. 177)
We can see this in a comparison of presidential styles. Grover Cleveland said that he “did not come here to legislate”; Harding thought it improper “for the executive to meddle in the business of the legislative branch” (Crenson & Ginsberg p. 143); and Coolidge’s aversion to having an active legislative agenda was demonstrated by his claim that the country already had too many laws “and we would be better off if we did not have any more.” (Crenson & Ginsberg p. 145)
In contrast, Woodrow Wilson believed that the President should be the leader of the party (which sounds normal to us now, but only because he helped to make it normal), and that he should act not just as the chief executive of the law, but as a “prime minister, as much concerned with guidance of legislation as with the just and orderly execution of the law.” (p. 138). He was the first president in over a century—since John Adams—to address Congress in person, leading some congressmembers to object to the imperialism of the president invading their turf. And his unprecedented efforts at legislative leadership led one Republican senator from Iowa to complain about the “persistent and determined” pressure of the “heavy hand of his power upon a branch of the Government that ought to be coordinate, but which in fact has become subordinate.” (p. 139)
Modern primary-selected nominees no longer owe their position to party leaders, but to the mass public. To modern democratic (small “d”) ears, this sounds wonderful. But appeals to the public require making promises beyond what candidates can possibly keep. It’s doubtful the public really believes those promises, yet those who do not overpromise gain no electoral traction. This dynamic favors a particular type of candidate, and I argue that this type is the demagogue. As Brown suggests:
Recent front runners have tended to have been known for their ability to connect with partisan voters. As political ‘outsiders’ … and good communicators…these aspirants enjoyed high ‘likeability’ ratings in the polls and made ample use of ‘soft’ media and talk show programs… In combination, all these traits suggest that the modern nomination process may now favor precisely those candidates the Framers had hoped to exclude through the Electoral College—those with ‘talents for low intrigue, and the little arts of popularity.’ (p. 208)
The amount of previous political experience of presidential nominees has also declined. First, there has been a decline in the amount of experience they have in the federal government, with governors being the most popular choice for party nominees. This is because the public likes candidates who run against the system, and (apparent) outsiders can more effectively do so, demagogically portraying themselves as unsullied, sinless, messiah figures who will lead us to the promised land outside the fever swamp that lies inside the beltway.
Second, overall political experience has declined from a pre-Civil War average of 11 political offices held over 21 years, to a post-1972 average of less than 8 offices over 16 years, to George W. Bush’s 1 office for 6 years (none in the federal government) and Obama’s 2 offices for 12 years (with 4 in the federal government). As much as we dislike professional politicians, the demands we place on presidents and the crucial foreign policy role they play make the office no place for amateurs.
Finally, successful candidates need not have any firm party support, diminishing their chances of successful governance. Jimmy Carter was, quite simply, a pain-in-the-ass to his party leaders in Congress, neither understanding how to work with them nor even recognizing that he did in fact have to work with them, instead of just trying to direct them. George W. Bush managed to destroy his party’s fragile control of the Senate (50-50, with VP Cheney casting the tie-breaking vote to give the GOP majority control) by so alienating Vermont’s James Jeffords that Jeffords quit the party and began caucusing with the Democrats for organizational purposes (although officially becoming an independent). Mitt Romney, had he won the presidency, would have governed with only nominal support from the Tea Party members of his own party-in-government.
This dynamic, made possible by the primary system, is, I argue, the root cause of the imperial presidency. The overwhelming majority of the presidential literature I read that talks about how to rein in the executive talks about statutory changes, reinvigorating Congress, or hopes for the public to become a serious and wise sovereign exercising meaningful electoral control. Much of this is perpetrated by legal scholars, who of course have an intellectual bias, as a consequence of their education, toward thinking in a legalistic framework, toward establishing rules to produce compliance. It’s not always a bad framework, but in this case all their solutions ignore the root cause.
But if the root cause is democratic choice, how do we persuade the demos to constrain itself?
Next Week: Institutional Change II—Growth in Executive Powers. The readings are Crenson & Ginsberg pp. 15-28 of chapter 1, and chapter 5.
[P.S. Note that I’ve revised the syllabus again. No, I don’t do that in real classes. In part, here, I’m taking advantage of the flexibility of this format to play around with how I might structure the course next time I teach it for real.]
[Image: A delegation visits McKinley on his front porch in Canton, Ohio. From Wikimedia Commons.]
* Social scientists like to speak of “institutions,” by which they mean rules, procedures, norms, etc. Where the layperson might look at some large organization, such as a university, and call it an institution, social scientists would pedantically emphasize that it is actually a set of institutions. (This is why social scientists are unpopular at parties.) Institutions may be formal (written down) or informal (unwritten), but that difference doesn’t necessarily reflect differences in their power to shape behavior. Unwritten rules can be as behaviorally significant as written ones (which, for all their formality, may go unenforced because of an unwritten institution, a norm or tradition, that says not to enforce them).