
When the Major Parties Failed the Common Man


Rural discontent had brewed in the United States since the sharp decline of farm prices in the 1870s. The Greenback Party emerged as a force in national politics, leading the agitation for expansion of the currency. The temporary return of prosperity blunted the Greenback message, but the return of hard times in the 1880s led to the emergence of the farmers' alliances.

In December 1890, representatives from a number of the alliances met in Ocala, Florida, to examine the issue of united political action. Racism, as well as party loyalty, played a role; some feared that splitting the Democratic vote would revive the old Republican-black alliance.

Two events in 1890 paved the way for a new political party. First, Congress passed the Sherman Silver Purchase Act, a totally inadequate gesture toward currency expansion. Second, Republicans in Congress chose to withhold support from a bill to enforce civil rights in the South, thus ending any hope for cooperation between the former slaves and the party of Lincoln. Into this void moved figures like Tom Watson of Georgia, who urged Southern white farmers to overcome their antipathy toward blacks because both groups were suffering at the hands of the same oppressors.

  • Election of 1892: Neither the Republicans nor the Democrats addressed rural distress in terms sufficient to encourage the farmers of the West and South. As a result, a convention was held in Omaha, Nebraska, in July 1892. Many members of the powerful farmers' alliances were present. The name "Populist" (from the Latin populus, meaning people) was borrowed from a state political organization in Kansas. The Populist convention nominated a truly national ticket:
    • James B. Weaver of Iowa, a former Greenback candidate and Union general, for president
    • James G. Field of Virginia, a former Confederate general, for vice president.
    The Populist platform, backed by nearly religious fervor, advocated an array of progressive ideas, many of which would later be adopted by law or amendment. The Populists ran a surprisingly successful campaign in 1892, polling more than one million popular votes and electing several of their number to Congress. Their real expectation, however, was to prepare for a serious run four years later.
  • Election of 1896: In 1896, the Populists gained control of the Democratic Party and engineered the nomination of William Jennings Bryan. The campaign was dominated by the silver issue. In a futile effort to assert their independence, however, the Populists refused to support the Democratic vice presidential candidate and instead nominated Thomas E. Watson of Georgia to run with Bryan. An energetic campaign failed to sway the electorate, except in the farm belt. The Republicans were returned to power, and the Populists were badly split between those who wished to remain with the Democrats and those who wanted to reclaim their identity.
  • Election of 1900: The depression of the 1890s had subsided and much of the fervor for silver had declined. Nevertheless, many Populist Party members elected to cast their lot with Bryan and the Democrats in 1900. A small minority of Populists refused to endorse "fusion," nominating Wharton Barker and Ignatius Donnelly instead. The duo finished at the bottom of the heap, outpolled even by the Prohibition and Socialist tickets. Clearly the Populist Party had become too closely identified with free silver, and that issue had vanished.
  • Election of 1904: In 1904, the Populist Party was reunited but sorely lacked numbers. Thomas Watson, the former vice presidential candidate, headed the ticket, with Thomas Tibbles as his running mate. The Populists won fewer than 120,000 popular votes and none in the Electoral College.
  • Election of 1908: Tom Watson was trotted out for a final round in 1908, paired with Samuel Williams. The ticket polled fewer than 30,000 votes, effectively ending the Populist Party's short life.

The Populist effort was probably doomed from the start. They advanced a number of stellar ideas, but fell prey to the allure of free silver, an issue that resonated poorly with urban workers whose votes were badly needed. Discontented farmers, despite their enthusiasm, simply lacked the numbers to move the nation.


Ku Klux Klan

Founded in 1865, the Ku Klux Klan (KKK) extended into almost every southern state by 1870 and became a vehicle for white southern resistance to the Republican Party’s Reconstruction-era policies aimed at establishing political and economic equality for Black Americans. Its members waged an underground campaign of intimidation and violence directed at white and Black Republican leaders. Though Congress passed legislation designed to curb Klan terrorism, the organization saw its primary goal, the reestablishment of white supremacy, fulfilled through Democratic victories in state legislatures across the South in the 1870s.

After a period of decline, white Protestant nativist groups revived the Klan in the early 20th century, burning crosses and staging rallies, parades and marches denouncing immigrants, Catholics, Jews, African Americans and organized labor. The civil rights movement of the 1960s also saw a surge of Ku Klux Klan activity, including bombings of Black schools and churches and violence against Black and white activists in the South.



Vance survives this endless turbulence, thanks in large part to the tough love he receives from Mamaw, living nearby, who sees in him a chance to redeem her parenting failures with Bev. His grades are good enough to get him into the best state colleges in Ohio. But fearing that he isn’t ready for unstructured campus life, he enlists in the Marine Corps, and gets a stint in Iraq and a large helping of maturity and perspective. After finishing his tour, he excels at Ohio State and, to his joyful amazement, is admitted to Yale Law School.

With the same appealing guilelessness that he brings to the story of his youthful ordeals, Vance describes the culture shock he experiences in New Haven. He doesn’t know what to make of the endless “cocktail receptions and banquets” that combine networking and matchmaking. At the fancy restaurant where he’s attending a law firm’s recruitment dinner, he spits out sparkling water, having never drunk such a thing. He calls his girlfriend from the restroom to ask her, “What do I do with all these damned forks?”

His estrangement often reflects poorly on the echelon he’s joined, whose members, he says with understatement, could do a better job of “opening their hearts and minds to” newcomers. He is taken aback when law-school friends leave a mess at a chicken joint, and stays behind with another student from a low-income background, Jamil, to clean it up. “People,” he writes, “would say with a straight face that a surgeon mother and engineer father were middle-class.” To his astonishment, he is regarded as an exotic figure by his professors and classmates, simply by virtue of having come from a small town in the middle of the country, gone to a mediocre public high school, and been born to parents who didn’t attend college.

He adapts to his new world well enough to land at a Washington, D.C., law firm and later in a court clerkship, and is today prospering as a principal at an investment firm in San Francisco. But the outsider feeling lingers—hearing someone use a big word like confabulate in conversation makes his blood rise. “Sometimes I view members of the elite with an almost primal scorn,” he admits. And questions nag at him: “Why has no one else from my high school made it to the Ivy League? Why are people like me so poorly represented in America’s elite institutions?” He is acutely aware of how easily he could have been trapped, had it not been for the caring intervention he received at key moments from people like Mamaw and his sister. “Thinking about … how close I was to the abyss, gives me chills. I am one lucky son of a bitch.”

Vance’s answers read like works in progress: His passages of general social commentary could have benefited from longer gestation, and are strongest when grounded in his biography. He is well aware of the larger forces driving the cultural decline he deplores. He knows how much of the deterioration in Middletown can be traced to the shrinkage of the big Armco steel-rolling mill that, during World War II, drew so many Appalachians—including Papaw—to the town. His tales of the increasingly rarefied world of elite education offer good evidence for why “many people in my community began to believe that the modern American meritocracy was not built for them.”

But he also sees the social decline in personal terms, as a weakening of moral fiber and work ethic. He describes, for instance, working at a local grocery store, where he “learned how people gamed the welfare system”:

They’d buy two dozen-packs of soda with food stamps and then sell them at a discount for cash. They’d ring up their orders separately, buying food with food stamps, and beer, wine, and cigarettes with cash … Most of us were struggling to get by, but we made do, worked hard, and hoped for a better life. But a large minority was content to live off the dole. Every two weeks, I’d get a small paycheck and notice the line where federal and state income taxes were deducted from my wages. At least as often, our drug-addict neighbor would buy T-bone steaks, which I was too poor to buy for myself but was forced by Uncle Sam to buy for someone else.

As Vance notes, resentment of this sort—which surfaces again and again in his book—helps explain why voters in the world he came from have largely abandoned the Democrats, the party of the social safety net.

Nor is the animus new: Isenberg traces it back to the days when poor Southerners were scorned for availing themselves of the aid extended to freed slaves—and joined in the scorn as soon as they escaped the dole. “The same self-made man who looked down on white trash others had conveniently chosen to forget that his own parents escaped the tar-paper shack only with the help of the federal government,” she writes. “ ‘Upscale rednecks’ had no trouble spotting those below them in their rearview mirrors.” In Vance’s book, those “below” are mostly fellow whites and the resentment is not primarily racially motivated, as many liberals would have one believe of all anti-welfare sentiment.

Vance does not pivot from such observations to a full-blown indictment of social-welfare programs. He isn’t ready to join the Republican chorus that blames the government (and specifically the black president who now heads it) for all ills. But he zealously subscribes to its corollary: The government, in his view, can’t possibly cure those ills. In a summary that borders on the polemical, he exhorts the “broad community of hillbillies” to “wake the hell up” and seize control of its fate.

Public policy can help, but there is no government that can fix these problems for us … Mamaw refused to purchase bicycles for her grandchildren because they kept disappearing—even when locked up—from her front porch. She feared answering her door toward the end of her life because an able-bodied woman who lived next door would not stop bothering her for cash—money, we later learned, for drugs. These problems were not created by governments or corporations or anyone else. We created them, and only we can fix them.

Vance’s intentions here are sincere and understandable. He’s tired of folks back home talking big about hard work when they are collecting checks just like the people they denigrate—tired of “the lies we tell ourselves.” He’s fed up with the quick resort to political blame, like the acquaintance in Middletown who told him that he had quit work because he was sick of waking up early but then declared on Facebook that it was the “Obama economy” that had set him back. “Whenever people ask me what I’d most like to change about the white working class,” writes Vance, “I say, ‘The feeling that our choices don’t matter.’ ”

He is wrong, though, that the burden of fixing things falls entirely on his people. The problems he describes—the reasons life in Middletown got tougher for his mom’s generation than it was for Mamaw and Papaw when they came north for work—have plenty to do with decisions by “governments or corporations.” The government and corporations have presided over the rise of new monopolies, the effect of which has been to concentrate wealth in a handful of companies and regions. The government and corporations welcomed China into the World Trade Organization; more and more economists now believe that move hastened the erosion of American manufacturing, by encouraging U.S. companies to shift operations offshore. The government and corporations each did their part to weaken organized labor, which once boosted wages and strengthened the social fabric in places like Middletown. More recently, the government has accelerated the decline of the coal industry, on environmentally defensible grounds but with awfully little in the way of remedies for those affected.

A family moves belongings into a trailer in Chauncey, Ohio. (Matt Eich)

Even at the edges, solutions lie within the purview of the powers that be—such as allowing Medicaid expansion to proceed in the South and expanding access to medication-assisted treatment to help people like Vance’s mother get off heroin. Yes, aid should be tailored to avoid the sort of resentment that Vance felt at the grocery store. At moments, he seems to acknowledge a role for taxpayer-funded compassion. “The best way to look at this might be to recognize that you probably can’t fix these things,” a friend who worked at the White House once told him. “They’ll always be around. But maybe you can put your thumb on the scale a little for the people at the margins.”

Perhaps you can even put your whole hand on the scale. One of the most compelling parts of Isenberg’s history is her account of the help delivered to struggling rural whites as part of the New Deal. Projects like the Resettlement Administration, led by Rexford Tugwell, which moved tenants to better land and provided loans for farm improvements, brought real progress. So did the Tennessee Valley Authority, which not only spurred development of much of the South but created training centers and entire planned towns—towns where hill children went to school with engineers’ kids. The New Deal had its flops. But men like Tugwell recognized that citizens in some places were slipping badly behind, and that their plight represented a powerful threat to the country’s founding ideals of individual self-determination and advancement.

A case can be made that the time has arrived for a major undertaking in, say, the devastated coal country of central Appalachia. How much to invest in struggling regions themselves, as opposed to making it easier for those who live in them to seek a livelihood elsewhere, is a debate that needs to happen. But the obligation is there, as it was 80 years ago. “We think of the left-behind groups as extinct,” Isenberg writes, “and the present as a time of advanced thought and sensibility. But today’s trailer trash are merely yesterday’s vagrants on wheels, an updated version of Okies in jalopies and Florida crackers in their carts. They are renamed often, but they do not disappear.”

Except they are now further out of sight than ever. As Isenberg documents, the lower classes have been disregarded and shunted off for as long as the United States has existed. But the separation has grown considerably in recent years. The elite economy is more concentrated than ever in a handful of winner-take-all cities. As Phillip Longman recently noted in the Washington Monthly, the per capita income of Washington, D.C., in 1980 was 29 percent above the average for Americans as a whole; in 2013, that figure was 68 percent. In the Bay Area, per capita income jumped from 50 percent to 88 percent above average over that period; in New York, from 80 percent to 172 percent. As these gaps have grown, the highly educated have become far more likely than those lower down the ladder to move in search of better-paying jobs.


The clustering is intensifying within regions, too. Since 1980, the share of upper-income households living in census tracts that are majority upper-income, rather than scattered throughout more mixed-income neighborhoods, has doubled. The upper echelon has increasingly sought comfort in prosperous insularity, withdrawing its abundant social capital from communities that relied on that capital’s overflow, and consolidating it in oversaturated enclaves.

So why are white Americans in downwardly mobile areas feeling a despair that appears to be driving stark increases in substance abuse and suicide? In my own reporting in Vance’s home ground of southwestern Ohio and ancestral territory of eastern Kentucky, I have encountered racial anxiety and antagonism, for sure. But far more striking is the general aura of decline that hangs over towns in which medical-supply stores and pawn shops dominate decrepit main streets, and Victorians stand crumbling, unoccupied. Talk with those still sticking it out, the body-shop worker and the dollar-store clerk and the unemployed miner, and the fatalism is clear: Things were much better in an earlier time, and no future awaits in places that have been left behind by polished people in gleaming cities. The most painful comparison is not with supposedly ascendant minorities—it’s with the fortunes of one’s own parents or, by now, grandparents. The demoralizing effect of decay enveloping the place you live cannot be overstated. And the bitterness—the “primal scorn”—that Donald Trump has tapped into among white Americans in struggling areas is aimed not just at those of foreign extraction. It is directed toward fellow countrymen who have become foreigners of a different sort, looking down on the natives, if they bother to look at all.


Civil Rights Act of 1964

Image courtesy of the Library of Congress. As the finale to the massive August 28, 1963, March on Washington, Martin Luther King Jr. gave his famous “I Have a Dream” speech on the steps of the Lincoln Memorial. This photograph shows the view from over the shoulder of the Abraham Lincoln statue to the marchers gathered along the length of the Reflecting Pool.

A reluctant Kennedy administration began coordinating with congressional allies to pass a significant reform bill. Freshman Representative Gus Hawkins observed in May 1963 that the federal government had a special responsibility to ensure that federal dollars did not underwrite segregation in schools, vocational education facilities, libraries, and other municipal entities, saying, “those who dip their hands in the public treasury should not object if a little democracy sticks to their fingers.” Otherwise “do we not harm our own fiscal integrity, and allow room in our conduct for other abuses of public funds?” 101 After Kennedy’s assassination in November 1963, his successor, Lyndon B. Johnson, invoked the slain President’s memory to prod reluctant legislators to produce a civil rights measure.

In the House, a bipartisan bill supported by Judiciary Chairman Celler and Republican William McCulloch of Ohio worked its way to passage. McCulloch and Celler forged a coalition of moderate Republicans and northern Democrats while deflecting southern amendments determined to cripple the bill. Standing in the well of the House defending his controversial amendment and the larger civil rights bill, Representative Powell described the legislation as “a great moral issue. . . . I think we all realize that what we are doing [today] is a part of an act of God.” 102 On February 10, 1964, the House, voting 290 to 130, approved the Civil Rights Act of 1964; 138 Republicans helped pass the bill. In scope and effect, the act was among the most far-reaching pieces of legislation in U.S. history. It contained sections prohibiting discrimination in public accommodations (Title II); in state and municipal facilities, including schools (Titles III and IV); and—incorporating the Powell Amendment—in any program receiving federal aid (Title VI). The act also prohibited discrimination in hiring and employment, creating the Equal Employment Opportunity Commission (EEOC) to investigate workplace discrimination (Title VII). 103

Having passed the House, the act faced its biggest hurdle in the Senate. President Johnson and Senate Majority Leader Mike Mansfield of Montana tapped Hubert Humphrey of Minnesota to build Senate support for the measure and fend off the efforts of a determined southern minority to stall it. One historian noted that Humphrey’s assignment amounted to an “audition for the role of Johnson’s running mate in the fall presidential election.” 104 Humphrey, joined by Republican Thomas Kuchel of California, performed brilliantly, lining up the support of influential Minority Leader Everett Dirksen of Illinois. By allaying Dirksen’s unease about the enforcement powers of the EEOC, civil rights proponents then co-opted the support of a large group of Midwestern Republicans who followed Dirksen’s lead. 105 On June 10, 1964, for the first time in its history, the Senate invoked cloture on a civil rights bill by a vote of 71 to 29, thus cutting off debate and ending a 75-day filibuster—the longest in the chamber’s history. On June 19, 1964, 46 Democrats and 27 Republicans joined forces to approve the Civil Rights Act of 1964, 73 to 27. President Johnson signed the bill into law on July 2, 1964. 106


Teacher’s Guide

From the 1820s through the 1850s American politics became in one sense more democratic, in another more restrictive, and, in general, more partisan and more effectively controlled by national parties. Beginning in the 1790s, politics became more democratic as one state after another ended property qualifications for voting. Politics became more restrictive as one state after another formally excluded African Americans from the suffrage. By 1840, almost all white men could vote in all but three states (Rhode Island, Virginia, and Louisiana), while African Americans were excluded from voting in all but five states and women were disfranchised everywhere. At the same time, political leaders in several states began to revive the two-party conflict that had been the norm during the political struggles between the Federalists and the Jeffersonian Republicans (1793–1815). Parties and party conflict became national with Andrew Jackson’s campaign for the presidency in 1828 and have remained so ever since. Parties nominated candidates for every elective post from fence viewer to president and fought valiantly to get them elected.

The number of newspapers exploded; the vast majority of them were mouthpieces for the Democratic Party or the Whig Party (the National Republican Party before 1834). Accompanying the newspapers was a flood of pamphlets, broadsides, and songs aimed at winning the support of ordinary voters and teaching them to think as a Democrat or a Whig. Parties also created gigantic and incredibly effective grass-roots organizations. Each party in almost every school district and urban ward in the country formed an electoral committee, which organized partisan parades, dinners, and picnics; distributed partisan newspapers and pamphlets; and canvassed door-to-door. In this way the parties got ordinary voters involved in politics, resulting in extremely high voter participation rates (80–90%). Even more than in the earlier period, parties were centrally coordinated and controlled. They expected their leaders, their newspapers, and their voters to toe the party line. Once the party caucus or convention had decided on a policy or a candidate, everyone was expected to support that decision.

The Democrats, National Republicans, and Whigs were not the only people creating a new kind of democracy, however. Several small, sectional parties promoted a way of conducting politics that was quite different from the practices of the major parties. The Workingmen’s Party, for example, organized in the major northeastern cities and in dozens of small, industrial towns in New England. Workingmen’s parties were part of the emerging labor movement and were made up primarily of skilled craftsmen whose trades were being industrialized. In addition, a growing movement of evangelical Christians sought to reform society by advocating temperance, an end to prostitution, the abolition of slavery, women’s rights, and more.

The two paintings and the cartoon offered here capture the passion, tumult, and divisions that came to characterize American democracy at this time.

George Caleb Bingham (1811–79) was one of the most successful and important American artists of the early nineteenth century. Born in 1811 to a prosperous farmer, miller, and slaveowner in western Virginia, Bingham knew prosperity but also experienced economic hardship when his father lost his property in 1818 and again when his father died in 1823. While he was a cabinet-maker’s apprentice, Bingham began painting portraits for $20 apiece and, by 1838, was beginning to acquire a reputation as an artist. During the 1840s he moved to St. Louis, the largest city in the West, where he pursued a successful career as a portrait artist. In 1848 he was elected to the Missouri General Assembly and later held several appointive posts. With gentle humor The County Election captures the arguing, the campaigning, and the drinking that accompanied the masculine ritual of voting in mid-nineteenth century rural America.

Richard Caton Woodville (1825–55) was born in Baltimore. His family hoped he would become a physician, and he did undertake medical studies in 1842. However, by 1845, when he traveled to Germany to train at the Dusseldorf Academy, he had abandoned medicine to pursue a career as an artist. Although he spent the rest of his life in Germany, France, and England, he devoted himself to re-creating his native Baltimore on canvas. With humor akin to that of Bingham, Politics in an Oyster House depicts a “conversation” between a young political enthusiast and a skeptical old-timer. As in The County Election, the political realm is exclusively masculine, for the oyster house is a male-only pub.

The Workingmen’s Party cartoon illustrates disillusionment with and dissent from the sharply divisive politics of the age. It suggests that the corruption of both the Whigs and the Democrats will lead to the oppression of the poor.


13a. The Declaration of Independence and Its Legacy

"When in the Course of human events, it becomes necessary for one people to dissolve the political bands which have connected them with another, and to assume among the powers of the earth, the separate and equal station to which the Laws of Nature and of Nature's God entitle them, a decent respect to the opinions of mankind requires that they should declare the causes which impel them to the separation."


The first public reading of the Declaration of Independence occurred at high noon on July 8, 1776, in the Old State House yard in Philadelphia (what is now Independence Hall).

So begins the Declaration of Independence. But what was the Declaration? Why do Americans continue to celebrate its public announcement as the birthday of the United States, July 4, 1776? While that date might just mean a barbecue and fireworks to some today, what did the Declaration mean when it was written in the summer of 1776?

On the one hand, the Declaration was a formal legal document that announced to the world the reasons that led the thirteen colonies to separate from the British Empire. Much of the Declaration sets forth a list of abuses that were blamed on King George III. One charge levied against the King sounds like a Biblical plague: "He has erected a multitude of New Offices, and sent hither swarms of Officers to harrass our people, and eat out their substance."

The Declaration was not only legalistic, but practical too. Americans hoped to get financial or military support from other countries that were traditional enemies of the British. However, these legal and pragmatic purposes, which make up the bulk of the actual document, are not why the Declaration is remembered today as a foremost expression of the ideals of the Revolution.

The Declaration's most famous sentence reads: "We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable rights, that among these are life, liberty, and the pursuit of happiness." Even today, this inspirational language expresses a profound commitment to human equality.

This ideal of equality has certainly influenced the course of American history. Early women's rights activists at Seneca Falls in 1848 modeled their "Declaration of Sentiments" in precisely the same terms as the Declaration of Independence. "We hold these truths to be self-evident," they said, "that all men and women are created equal." Similarly, the African-American anti-slavery activist David Walker challenged white Americans in 1829 to "See your Declaration Americans. Do you understand your own language?" Walker dared America to live up to its self-proclaimed ideals. If all men were created equal, then why was slavery legal?

The ideal of full human equality has been a major legacy (and ongoing challenge) of the Declaration of Independence. But the signers of 1776 did not have quite that radical an agenda. The possibility for sweeping social changes was certainly discussed in 1776. For instance, Abigail Adams suggested to her husband John Adams that in the "new Code of Laws" that he helped draft at the Continental Congress, he should, "Remember the Ladies, and be more generous and favorable to them." It didn't work out that way.


King George III showed signs of madness. He likely suffered from porphyria, a disease of the blood leading to gout and mental derangement.

Thomas Jefferson provides the classic example of the contradictions of the Revolutionary Era. Although he was the chief author of the Declaration, he also owned slaves, as did many of his fellow signers. They did not see full human equality as a positive social goal. Nevertheless, Jefferson was prepared to criticize slavery much more directly than most of his colleagues. His original draft of the Declaration included a long passage that condemned King George for allowing the slave trade to flourish. This implied criticism of slavery, a central institution in early American society, was deleted by a vote of the Continental Congress before the delegates signed the Declaration.

So what did the signers intend by using such idealistic language? Look at what follows the line, "We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness."

These lines suggest that the whole purpose of government is to secure the people's rights and that government gets its power from "the consent of the governed." If that consent is betrayed, then "it is the right of the people to alter or abolish" their government. When the Declaration was written, this was a radical statement. The idea that the people could reject a monarchy (based on the superiority of a king) and replace it with a republican government (based on the consent of the people) was a revolutionary change.

While the signers of the Declaration thought of "the people" more narrowly than we do today, they articulated principles that are still vital markers of American ideals. And while the Declaration did not initially lead to equality for all, it did provide an inspiring start on working toward equality.


41e. The Election of 1896

Everything seemed to be falling into place for the Populists. James Weaver made an impressive showing in 1892, and now Populist ideas were being discussed across the nation. The Panic of 1893 was the worst financial crisis to date in American history. As the soup lines grew larger, so did voters' anger at the present system.

When Jacob S. Coxey of Ohio marched his 200 supporters into the nation's capital to demand reforms in the spring of 1894, many thought a revolution was brewing. The climate seemed to ache for change. All that the Populists needed was a winning Presidential candidate in 1896.

The Boy Orator

Ironically, the person who defended the Populist platform that year came from the Democratic Party. William Jennings Bryan was the unlikely candidate. An attorney from Lincoln, Nebraska, Bryan possessed speaking skills that were among the best of his generation. Known as the "Great Commoner," Bryan quickly developed a reputation as defender of the farmer.

When Populist ideas began to spread, Democratic voters of the South and West gave enthusiastic endorsement. At the Chicago Democratic convention in 1896, Bryan delivered a speech that made his career. Demanding the free coinage of silver, Bryan shouted, "You shall not crucify mankind upon a cross of gold!" Thousands of delegates roared their approval, and at the age of thirty-six, the "Boy Orator" received the Democratic nomination.

Faced with a difficult choice between surrendering their identity and hurting their own cause, the Populist Party also nominated Bryan as their candidate.

The Stay-at-Home Candidate


William McKinley stayed out of the public eye in 1896, leaving the campaigning to party hacks and fancy posters.

The Republican competitor was William McKinley, the governor of Ohio. He had the support of the moneyed eastern establishment. Behind the scenes, a wealthy Cleveland industrialist named Marc Hanna was determined to see McKinley elected. He, like many of his class, believed that the free coinage of silver would bring financial ruin to America.

Using his vast wealth and power, Hanna directed a campaign based on fear of a Bryan victory. McKinley campaigned from his home, leaving the politicking for the party hacks. Bryan revolutionized campaign politics by launching a nationwide whistle-stop effort, making twenty to thirty speeches per day.

When the results were finally tallied, McKinley had beaten Bryan by an electoral vote margin of 271 to 176.

Understanding 1896

Many factors led to Bryan's defeat. He was unable to win a single state in the populous Northeast. Laborers feared the free silver idea as much as their bosses. While inflation would help the debt-ridden, mortgage-paying farmers, it could hurt the wage-earning, rent-paying factory workers. In a sense, the election came down to city versus country. By 1896, the urban forces won. Bryan's campaign marked the last time a major party attempted to win the White House by exclusively courting the rural vote.

The economy of 1896 was also on the upswing. Had the election occurred in the heart of the Panic of 1893, the results might have differed. Farm prices were rising in 1896, albeit slowly. The Populist Party fell apart with Bryan's loss. Although they continued to nominate candidates, most of their membership had reverted to the major parties.

The ideas, however, did endure. Although the free silver issue died, the graduated income tax, direct election of senators, initiative, referendum, recall, and the secret ballot were all later enacted. These issues were kept alive by the next standard bearers of reform: the Progressives.


Born in South Carolina to impoverished parents on March 15, 1767, Jackson began life quite differently from the previous six presidents. At 13, Jackson joined the Continental Army as a courier during the Revolutionary War. (Jackson was also the last president to have served during the Revolutionary War.) Jackson's father died before his birth, and the war then obliterated the rest of his family: losing his two brothers and his mother during the conflict fostered an intense hatred for the British that Jackson maintained his whole life.

Jackson initially had a sporadic education. After the war, he educated himself, reading law books so that he could find work as a lawyer in Tennessee in 1787. The wild frontier life suited Jackson, and he succeeded based upon his own hard work and merit. He became one of the first congressmen representing Tennessee, was elected a Tennessee senator in 1797, and was appointed to the Tennessee Supreme Court in 1798. These accomplishments set Jackson apart from most men, yet they would pale in comparison to his military career in the War of 1812.

During the War of 1812, Jackson garnered his nickname "Old Hickory" due to his strict command of his troops and the abilities he showed on the battlefield. The Battle of New Orleans on January 8, 1815, concluded with a major victory for Jackson. This victory forever made Jackson a national hero and gave him a place in the hearts of American citizens. Jackson's national identity and immense popularity enabled him to run for president in the 1828 election.

The Rise of the Common Man coincided with Jackson's election because Jackson served as the ideal common man. Common origins no longer detracted from a candidate, nor did a candidate have to attend Harvard or William and Mary. Jackson became the living embodiment of the changes and improvements going on throughout the United States, as well as the symbol of the aspirations and expectations that Americans had of themselves. Jackson's life was beset by obstacles: orphaned at 14, bankruptcy, many brushes with death in his military career, and a marriage tainted with gossip of bigamy. Despite his lowly beginnings, Jackson prospered in the western state of Tennessee and became the most powerful man in the country.


History of bombings in the US since the late 1800s, including famous attempts that failed

Police officers react to a second explosion at the finish line of the Boston Marathon in Boston, Monday, April 15, 2013. Two explosions shattered the euphoria of the Boston Marathon finish line on Monday, sending authorities out on the course to carry off the injured while the stragglers were rerouted away from the smoking site of the blasts. (AP Photo/The Boston Globe, John Tlumacki)

Here is a list of some of the worst bombings in the U.S. dating to the 1800s, including some famous attempts that failed:

— April 15, 2013: Two bombs explode in the packed streets near the finish line of the Boston Marathon, killing three people and injuring more than 260.

— Jan. 17, 2011: A backpack bomb is placed along a Martin Luther King Day parade route in Spokane, Wash., meant to kill and injure participants in a civil rights march, but is found and disabled before it can explode. White supremacist Kevin Harpham is convicted and sentenced to 32 years in federal prison.

— May 1, 2010: Pakistani immigrant Faisal Shahzad leaves an explosives-laden SUV in New York's Times Square, hoping to detonate it on a busy night. Street vendors spot smoke coming from the vehicle and the bomb is disabled. Shahzad is arrested as he tries to leave the country and is sentenced to life in prison.

— Dec. 25, 2009: The so-called "underwear bomber," Nigerian Umar Farouk Abdulmutallab, is subdued by passengers and crew after trying to blow up an airliner heading from Amsterdam to Detroit using explosives hidden in his undergarments. He's sentenced to life in prison.

— Sept. 11, 2001: Four commercial jets are hijacked by 19 al-Qaida militants and used as suicide bombs, bringing down the two towers of New York City's World Trade Center and crashing into the Pentagon. Nearly 3,000 people are killed in New York, Washington and Pennsylvania.

— Jan. 22, 1998: Theodore Kaczynski pleads guilty in Sacramento, Calif., to being the Unabomber in return for a sentence of life in prison without parole. He's locked up in the federal Supermax prison in Colorado for killing three people and injuring 23 during a nationwide bombing spree between 1978 and 1995.

— Jan. 20, 1998: A bombing at an abortion clinic in Birmingham, Ala., kills one guard and injures a nurse. Eric Robert Rudolph is suspected in the case.

— July 27, 1996: A bomb explodes at Centennial Olympic Park in Atlanta during the Summer Games, killing two people and injuring more than 100. Eric Robert Rudolph is arrested in 2003. He pleads guilty and is sentenced to life in prison.

— April 19, 1995: A car bomb parked outside the Murrah Federal Building in Oklahoma City kills 168 people and injures more than 500. It is the deadliest U.S. bombing in 75 years. Timothy McVeigh and Terry Nichols are convicted. McVeigh is executed in 2001 and Nichols is sentenced to life in prison.

— Feb. 26, 1993: A bomb in a van explodes in the underground World Trade Center garage in New York City, killing six people and injuring more than 1,000. Six men are eventually convicted of the crime.

— Nov. 7, 1983: A bomb blows a hole in a wall outside the Senate chamber at the Capitol in Washington. No one is hurt. Two leftist radicals plead guilty.

— May 16, 1981: A bomb explodes in a men's bathroom at the Pan Am terminal at New York's Kennedy Airport, killing a man. A group calling itself the Puerto Rican Armed Resistance claims responsibility. No arrests are made.

— Dec. 29, 1975: A bomb hidden in a locker explodes at the TWA terminal at New York's LaGuardia Airport, killing 11 people and injuring 75. Palestinian, Puerto Rican and Croatian groups are suspected, but no arrests are made.

— Jan. 29, 1975: The U.S. State Department building in Washington, D.C., is bombed by the Weather Underground. No one is killed.

— Jan. 24, 1975: A bomb goes off at historic Fraunces Tavern in New York City, killing four people. It was one of 49 bombings attributed to the Puerto Rican nationalist group FALN between 1974 and 1977 in New York.

— Jan. 27, 1972: A bomb wrecks the New York City office of impresario Sol Hurok, who had been booking Soviet artists. One person is killed and nine are injured, Hurok among them. A caller claiming to represent Soviet Jews claims responsibility, but no arrests are made.

— March 1, 1971: The Senate wing of the U.S. Capitol Building in Washington, D.C., is bombed by the Weather Underground. No one is killed.

— March 6, 1970: Three members of the revolutionary Weather Underground accidentally blow themselves up in their townhouse in New York City's Greenwich Village while making bombs.

— 1951-56: George Metesky, a former Consolidated Edison employee with a grudge against the company, sets off a series of blasts at New York landmarks, including Grand Central station and Radio City Music Hall. No one is killed. Known as The Mad Bomber, Metesky spends 16 years in a mental hospital.

— May 18, 1927: 45 people — 38 of them children — are killed when a school district treasurer, Andrew Kehoe, lines the Bath Consolidated School near Lansing, Mich., with hundreds of pounds of dynamite, and blows it up. Investigators say Kehoe, who also died in the blast, thought he would lose his farm because he couldn't pay property taxes used to build the school.

— Sept. 16, 1920: A bomb explodes in New York City's Wall Street area, killing 40 and injuring hundreds. Authorities conclude it was the work of "anarchists" and come up with a list of suspects, but all flee to Italy.

— Oct. 1, 1910: The Los Angeles Times building is dynamited during a labor dispute, killing 20 people. Two leaders of the ironworkers union plead guilty.

— May 4, 1886: A bomb blast during a labor rally at Chicago's Haymarket Square kills 11 people, including seven police officers, and injures more than 100. Eight "anarchists" are tried for inciting riot. Four are hanged, one commits suicide and three win pardons after seven years in prison.



The then-44-year-old Senator was great at giving inspiring speeches and people were attracted to his youthful energy, but he could also come off like a “hothead,” as he did in his “angry” questioning of Secretary of State George Shultz when the Senate heard testimony about South Africa in 1986. His position in the Senate offered him a chance to show his skill. In particular, as Biden chaired the Judiciary Committee, he hoped to gain more national attention during the uproar over polarizing conservative Supreme Court nominee Robert Bork. Biden, in charge of the confirmation hearings, oversaw what was seen as potentially “the culminating ideological showdown of the Reagan era,” as TIME put it back then. “For Chairman Biden, the hearings could provide a spark for his presidential campaign by giving him a chance to show his mettle in front of a national television audience.”

But Biden didn’t get a chance to shine during the Bork hearings in the way he had hoped.

A few days before they began, video surfaced that spliced together footage of U.K. Labour Party leader Neil Kinnock giving a speech and Biden clearly quoting Kinnock at the Iowa State Fair without attribution. More examples of misattribution came to light, and the plagiarism scandal became more memorable than his leadership during the Bork confirmation hearing. His mouth, or rather what he failed to say, got him in trouble again.

Here’s how TIME described why the fallout was so intense:

[T]he Biden brouhaha illustrates the six deadly requirements for a crippling political scandal.

1) A Pre-Existing Subtext. “The basic rap against Biden,” explains Democratic Pollster Geoff Garin, “is that he’s a candidate of style, not substance.”

2) An Awkward Revelation. The Kinnock kleptomania was particularly damaging to Biden since it underscored the prior concerns that he was a shallow vessel for other people’s ideas.

3) A Maladroit Response. Top Aide Tom Donilon claimed that Biden failed to credit Kinnock because “he didn’t know what he was saying. He was on autopilot.”

4) The Press Piles On. Once textual fidelity became an issue, reporters found earlier cases in which Biden had failed to give proper citation to Humphrey and Robert Kennedy. By themselves these transgressions would not have been worth noting.

5) The Discovery of Youthful Folly. During his first months at Syracuse University Law School, in 1965, Biden failed a course because he wrote a paper that used five pages from a published law-review article without quotation marks or a proper footnote. Since Biden was allowed to make up the course, the revelation was front-page news only because it kept the copycat contretemps alive.

6) An Overwrought Press Conference. With a rambling and disjointed opening statement, Biden failed to reap the benefits of public confession, even though he called himself “stupid” and his actions “a mistake.” Part of the problem is that he contradicted himself by also insisting that it was “ludicrous” to attribute every political idea.

The “final blow” for the campaign came when Newsweek unearthed C-SPAN footage of Biden rattling off his academic accomplishments, including saying that he graduated in the top half of his law school, when in fact, he ranked 76th out of 85.

Biden announced he was dropping out of the race on Sept. 24, 1987. (To make things even, Biden later jokingly gave Kinnock some of his speeches to use “with or without attribution” during a January 1988 trip to Europe.) About twenty years later, in his 2008 memoir Promises to Keep: On Life and Politics, he wrote that the plagiarism scandal was his own fault. “When I stopped trying to explain to everybody and thought it through, the blame fell totally on me,” he wrote. “Maybe the reporters traveling with me had seen me credit Kinnock over and over, but it was Joe Biden who forgot to credit Kinnock at the State Fair debate.”

Barrett helped break the news that the Kinnock attack video had come from the campaign of one of Biden’s main opponents, Massachusetts Governor Michael Dukakis. Paul Tully, a top aide to Dukakis, denied, on the record, that the video had come from the campaign, and Barrett says Tully expressed disbelief that the story would run anyway when they saw each other in Iowa. “I told you we were doing this story,” Barrett recalls telling Tully. “He looked at me as if I had done something awful.” Dukakis at first denied the story when the magazine hit newsstands, but hours later took back his denial. It was a particular embarrassment for the man known as the “straight arrow” candidate because of his “positive campaigning” tactics. Two of his aides stepped down: John Sasso, who leaked the video, and Tully, for lying to TIME.

The public was equally outraged.

Letters to the editor published in TIME offer a glimpse at the public reaction, finding neither Biden nor Dukakis to be honest or trustworthy. “Biden lied in situations in which it was not necessary or relevant,” wrote a Los Angeles reader. “I am alarmed that neither candidate viewed these acts as immoral and representative of his character.” Another reader was alarmed about a year later when Dukakis rehired Sasso after his campaign started to “tank,” literally: a goofy photo of him posing in a military tank was turned into an ad that painted Dukakis as not taking national security issues seriously enough. When the election rolled around, Republican George H.W. Bush won. “Dukakis might have been spared some of [his] mistakes had Sasso been at his elbow,” Barrett recalls many thinking.

Biden’s short-lived 1988 campaign would end up having a long-lasting effect on future political campaigns and political journalism, with Walter Shapiro arguing in a December 1987 TIME essay that it had helped turn political reporters into “character cops” who trade in “paparazzi politics and pop psychology.”

And for Biden, there was a silver lining to being driven out of the race: It saved his life. In February of 1988, he had a headache that turned out to be a brain aneurysm. He had surgery, and he had to have surgery again in the spring when a second, smaller aneurysm formed. “There is no doubt, the doctors have no doubt, that had I remained in the race, I’d be dead,” he told TIME later that fall, at his first event since the aneurysms. He also joked that “The good news is that I can do anything I did before. The bad news is that I can’t do anything better.”

When he announced his candidacy back in 1987, TIME reported that he had asked his then-teenage son Hunter if he should run. “You should,” Hunter said. “If you don’t do it now, I couldn’t see you doing it some other time.”

Hunter Biden, of course, was wrong.

Biden ran for the Democratic nomination again in 2008. He didn’t secure the nomination, but went on to serve as Vice President of the United States under Barack Obama. In his eight years in the office, he built up a foreign policy portfolio that included the Paris climate agreement and Iran nuclear deal. Now he hopes his policy portfolios and his high poll numbers, not his past runs for the White House, will define his candidacy.

“The huge difference between now and 1988 is that Biden has much more of a cause now,” says Barrett. “In ’88 he couldn’t really formulate why he was running. He didn’t have an ideological cause the way Reagan had a cause. Now we know why he’s running. He thinks he’s the guy who can defeat Trump.”