"Every Goodbye Ain’t Gone and Every Close Eye Ain’t Shut": Black Georgians' Memories and Election Unease

During the June 9th primary elections in Georgia, Black voters were alarmed as they stood in line to vote, in some cases for three to four hours. Terri Russell told the New York Times, “I refuse not to be heard and so I am standing in line.” Her dedication was even more poignant because Russell, who is 57 years old and suffers from both asthma and bronchitis, had requested but never received an absentee ballot, and was consequently risking her life by standing in line during a pandemic in a county with one of the highest rates of COVID-19 cases in the state. Election officials blamed the problems, which primarily plagued Black and Democratic regions of the state, on new voting machines. For many, including Stacey Abrams, the problems harkened back to her narrowly defeated 2018 bid to become the state’s first Black governor and the nation’s first Black woman to win a governorship.

Nearly four months later, on September 29th, when Chris Wallace called on President Donald Trump to denounce white supremacist groups like the Proud Boys during the first presidential debate, the President responded by calling on the group to “stand back and stand by.” For many in Black communities, a collective shudder like steel against porcelain was palpable. The primary elections and presidential debate of 2020 remind many in Black communities of efforts to mobilize Black voters during the 1960s, but the reality is that semblances of these events reach further back to Reconstruction, when formerly enslaved people were first able to vote en masse. This was a time when millions of Blacks, fresh from the horrors of slavery and the Civil War, held onto the promise that with Emancipation they could participate actively in the American experiment. But in Georgia, it would not be until the spring of 1868 that they would be able to run for office and vote for laws that they had participated in writing.

In the spring of 1868, African American voters faced long lines at the polls when and where polls opened at all. Georgia’s newly ratified constitution expanded education for all Georgians as well as property rights for women, but local white officials frequently questioned Black Georgians’ right to vote. Black Georgians faced tremendous odds to vote, even in regions of the state with Black majorities, and only thirty-three out of nearly two hundred legislators elected in 1868 were Black. Fewer than five months later, they would be expelled from office because they were Black, and it would be more than a year before they regained their legislative seats. During that year, white officials throughout the state targeted Black voters with the poll tax. For those who organized politically, the message was clear: the Black franchise was a threat to be quelled. A well-known incident of Black voter suppression was the Camilla Massacre of September 19th, 1868, in Southwest Georgia near Albany. Hundreds of local, primarily Black residents marched to Camilla to attend a political rally, but before they could reach the town square, white residents and officials shot and killed nearly twenty of them, while countless others were injured and hunted down as they fled back to Albany.

Efforts to dissuade Black voters frequently materialized as campaigns of threats against both Black voters and Black political leaders, who were beaten, often in the dark of night. Others involved in political organizing were held indefinitely in jail without cause, as was the case with F.H. Fyall, one of the Black legislators expelled from office. Fyall was arrested for his political activity and held in jail for nearly two months because he refused to switch his political affiliation. Among the more high-profile targets of Black voter suppression were Henry McNeal Turner, a Georgia legislator, minister, and future bishop in the African Methodist Episcopal Church, and sixty-four-year-old Tunis Campbell Sr., a Georgia legislator and an elder in the African Methodist Episcopal Zion Church. Turner faced systematic threats for his political activity, and Campbell would find himself incarcerated and on a Georgia chain gang. Blacks were also threatened economically for political activity. Sam Gilbert, a freedman from Houston County, exemplified this suppression when his wages were reduced for attending a political meeting.

By the fall elections of 1872, the state was firmly under conservative control, and state leaders reinstituted poll taxes as a means of discouraging Black voters, who were already facing economic hardship due to low wages. Governor James M. Smith legally reconvened the state militia under the guise of promoting peace and civility, and he supplied guns and munitions to state-sanctioned militias composed overwhelmingly of former Confederate soldiers. During the elections that fall, these armed groups frequently surrounded polling stations, bringing violence and intimidation to virtually every part of the state, blurring the line between state-sanctioned armed groups and the local Klan, and promoting their own form of “law and order.”

Black voters and politicians were helpless to oppose state-sanctioned suppression, and the number of Black legislators declined even in counties with clear Black majorities. Moreover, Georgia officials, hoping to avoid any return to federal oversight, moved state elections back a month to October and required Black voters to produce poll tax receipts. Unsurprisingly, Ulysses S. Grant again lost the state in his bid for re-election in November of 1872.

For most Georgians, the history of Reconstruction in their state is not a familiar one, but the unspoken realities of past voter suppression in Georgia during this era resonate in the present experience of many Black Georgians. Many have a tacit understanding that, even with the promise embodied in the election of Barack Obama in 2008 and 2012, the past is not gone, but promises to resurface like a shark in deep water with efforts to de-register voters or with voting machines that don’t work in Black or brown communities. Voter suppression also resurfaces as Black voters are forced to choose between staying home in an election or risking their lives to COVID-19 by waiting to vote in long lines and enduring the potential threat of intimidation by “patriots” emboldened by the President himself to monitor polling stations. More than 150 years after Emancipation, many Blacks in Georgia and throughout the nation still feel a familiar unease, and many understand the proverb, “Every Goodbye Ain’t Gone, and Every Close Eye Ain’t Shut.”

Treason, the Death Penalty, and American Identity

Pueblo de Taos, 1847

In 1847, in a dusty plaza in Taos (now part of the American state of New Mexico), American authorities tried and executed a thirty-nine-year-old man named Hipolito Salazar for treason against the United States. He is the only person ever executed for treason against the United States since the adoption of the Constitution. His story has faded into almost complete obscurity, but it tells us much about American national identity and whom we are willing (and not willing) to execute for acts of national betrayal.

According to the solemn declarations of judicial opinions and legal treatises, treason is the highest crime in American law, worse even than murder. Legal historians, however, know that such statements should not be taken at face value; the law in action is often quite different from the law stated in the books.

Since the late eighteenth century, America has executed thousands of people for murder, but only a minuscule number for treason. In addition to the Salazar execution, a small handful of people were executed as traitors to the individual states during the American Revolution, and John Brown and Edwin Coppoc were executed for treason against Virginia in 1859 for their roles in the raid on Harpers Ferry.

If murder were committed far more frequently than treason, this disparity would be unremarkable. In most years, treason is indeed rare. But in a handful of turbulent years, treason has been widespread. Although precise data do not exist, it is clear that tens of thousands of Americans have committed treason. During the American Revolution, thousands of people sided with the British and committed acts that brought them within the technical scope of state treason laws. The largest number of offenses came during the American Civil War. All the members of the Confederate military, along with any person who provided them aid and assistance, committed the crime of levying war against the United States.

But the Civil War did not lead to any treason executions, and there were very few during the American Revolution (many prosecutions failed because juries refused to convict, and where convictions were obtained, governors usually granted clemency).

So the disparity between treason and murder executions must be explained by other factors.  Two are particularly salient.

First, treason has generally been perceived as a political crime, not an individual crime of violence, even if it involved fighting as a soldier in the enemy’s military.  This was particularly true during the Revolution, when almost everyone knew someone—a friend, a neighbor, a relative—who had chosen the other side.  Such people were not incorrigible criminals, but ordinary Americans who could easily be welcomed back as productive members of society once the conflict was over.  Treason, in short, could be forgiven, whereas murder could not.

Second, conflicts like the Revolution and the Civil War would have been even more horrific if a wave of executions had followed on their conclusion.  Once the war was over, the primary goal became national reconciliation, welcoming back fellow Americans who had laid down their arms.  There was little appetite for further bloodshed, or for the creation of martyrs around which disaffected individuals could rally for generations. 

All of which suggests that something distinctive was happening in that plaza in Taos in 1847. Until the publication of my book On Treason: A Citizen’s Guide to the Law, Salazar’s case was entirely unknown to American treason scholars, who insisted that no such executions had occurred (I include myself in that category, having learned of the case only in 2019).

“Distinctive” would be an understatement. In 1847, New Mexico did not belong to the United States, but to Mexico. It would not be formally transferred to American jurisdiction until the 1848 Treaty of Guadalupe Hidalgo. Hipolito Salazar was a Mexican citizen who had never set foot in the United States. As such, he owed no allegiance to the United States and could not be lawfully prosecuted for treason against it.

But when American military forces stormed into New Mexico in the Mexican-American War, they ignored these legal niceties, issuing proclamations stating that New Mexico was now part of the United States and that all residents of New Mexico owed allegiance to the United States.  Any military resistance to the American occupation would be treated as an act of treason.

Taos would later erupt in violent resistance.  The “Taos Revolt,” as it came to be called, was eventually suppressed by the American military, but American officials were determined to place the surviving leaders on trial.  Some were charged with murder, some with treason.  Of the treason defendants, Salazar was the only one who was convicted and executed.

As word of these trials seeped back to Washington, DC, many members of Congress responded with horror—why were American officials trying Mexican citizens, on Mexican soil, for the crime of treason against the United States?  The Polk Administration was forced to concede that the treason indictments had been issued in error.  Since New Mexico had not yet been formally ceded to the United States, treason was an inappropriate charge, and Salazar’s conviction was legally invalid.  But administration officials argued that this was a mere technicality—Salazar was basically a murderer who deserved to die, and whether the charge was treason or murder, he had received his just deserts.

Curiously, the same argument could potentially have been made eighteen years later, in the aftermath of the Civil War.  The Confederates could have been viewed as murderers, attacking American forces in the same way as had Salazar and his men.  But no one referred to the Confederates in this manner.  They were once and future Americans, part of the political community who had made a political mistake, but who could be welcomed back with open arms.

Salazar, by contrast, had never been an American; he had always been an outsider, perceived to be racially different.  And that fact, perversely, made him far more susceptible to execution for treason against the United States.  His offense could be casually compared to murder, in ways that other forms of treason would not.  And as a quasi-murderer, execution was entirely appropriate.

At the gallows, Salazar complained bitterly about the unfairness of his trial.  As the platform was about to drop, he uttered his last words, perfectly capturing his outsider status: “Caraho, los Americanos!”  Or, in English, “F--- the Americans!”

It is not a story that is recounted in our textbooks, but it should be. It reveals much about who counts, and who doesn’t, in dealing with crimes of betrayal. White Americans who waged war against their country and killed thousands of other Americans in defense of race-based slavery were forgiven and allowed to return to their regular lives. By contrast, the only person to die for betraying America was not even an American at all, but a Mexican defending his homeland. In this case, as in so many others, the law in action leaves much to be desired.

Fear of the "Pussification" of America: A Short Cultural History

Of all the responses to the COVID-19 pandemic in the United States—ranging from debates over mask wearing to school closings—perhaps the most bizarre is the suggestion that this deadly disease can be avoided simply through manliness.

Nowhere was this made more explicit than when former US Navy SEAL Robert O’Neill shared a photo of himself, unmasked, on a Delta Airlines flight. “I’m not a pussy,” declared O’Neill on Twitter, as if to suggest that potent, masculine men, like those on SEAL Team 6, would not be cowed into wearing cowardly protective gear. (Never mind that a passenger sitting one row behind O’Neill, in a US Marine Corps baseball cap, was wearing his mask.)

O’Neill’s use of the “P-word” was far from an outlier; in fact, it has been employed near and far in recent months. Adam Carolla stoked public outcry only weeks later when he maintained, incorrectly, that only the “old or sick or both” were dying from the virus. “How many of you pussy’s [sic] got played?” the comedian asked.

Nor were these remarks limited to COVID-19. Not to be outdone by such repugnant rhetoric, President Donald Trump—who elevated the word during the 2016 presidential campaign for other reasons—reportedly lambasted senior military leaders, declaring that “my fucking generals are a bunch of pussies.” On the opposite end of the military chain of command, 2nd Lt. Nathan Freihofer, a young celebrity on TikTok, recently gained notoriety for anti-Semitic remarks on the social media platform. “If you get offended,” the young officer proclaimed, “get the fuck out, because it’s a joke…. Don’t be a pussy.”


What should we make of these men, young and old, employing the word as a way to shame potential detractors? Perhaps the most telling, and least surprising, explanation is that sexism and misogyny are alive and well in Trump’s America. Yet it would be mistaken to argue that the epithet has regained popularity simply because the president seemingly is so fond of the word. Rather, such language—and more importantly, what it insinuates—is far from new. 


In July, after Alexandria Ocasio-Cortez (D-NY) was verbally accosted on the Capitol steps by fellow representative Ted Yoho (R-FL), the congresswoman delivered a powerful speech on the House floor. The problem with Yoho’s comments, Ocasio-Cortez argued, was not only that they were vile, but that they were part of a larger pattern of behavior toward women. “This is not new, and that is the problem,” she affirmed. “It is cultural. It is a culture of lack of impunity, of accepting of violence and violent language against women, and an entire structure of power that supports that.”


She’s right. This “violent” language—calling women “bitches” and men “pussies”—and the understandings that accompany it have a long history in American popular culture. And few cultural artifacts depict such sexist notions more overtly than Cold War men’s adventure magazines.

These “macho pulps” were an outgrowth of earlier men’s periodicals, including Argosy and Esquire. In the aftermath of World War II, magazines with suggestive titles like Battle Cry, Man’s Conquest, and True Men exploded in popularity. The February 1955 issue of Stag, for example, sold more than 585,000 copies nationwide. The stories that filled these magazines portrayed the ideal man as physically tough, sexually virile, and unabashedly patriotic. Women, conversely, were represented either as erotic trophies of conquest or as sexualized villains to be overpowered.

Take, for example, an illustrative story from the March 1963 issue of Brigade. In “Castration of the American Male,” pulp writer Andrew Petersen decried how the “manly virtues—strength, courage, virility—are becoming rarer every day…. Femininity is on the march,” Petersen claimed, “rendering American men less manly.” To put a finer point on the message, Brigade included with the article a photograph of a sullen husband, in a floral apron, doing the dishes. The message seemed clear: the masculine ideal of sexual conqueror and heroic warrior, touted in nearly every issue of the pulps, was under assault.

Indeed, in Cold War men’s adventure magazines, “real men” were never “pussies.” They courageously defeated former Nazi henchmen and evil communist infiltrators. They exposed femmes fatales who were engaging in “sexological warfare,” using their bodies as weapons of war. And they seduced women across the globe, one Navy vet describing himself in the pulps as a virile “bedroom commando.”

Yet just below the surface of these hypermasculine narratives, a subtext of anxiety loomed. Read a different way, the pulps might also be seen as a form of escapism from deep anxieties about not measuring up in a rapidly changing postwar society. Fears of being emasculated by Cold War suburbia and a consumeristic society pervaded these men’s magazines. Pulp writers, as seen in the Brigade article, habitually expressed concerns over American men becoming “soft.”


Arguably, these fears of losing one’s masculinity engendered not only hostility toward women but also a backlash against those supposedly “weak” men who weren’t holding the line against allegedly aggressive feminism. As Betty Friedan argued in The Feminine Mystique (1963), male outrage was the result of an “implacable hatred for the parasitic women” who apparently were denying husbands and sons a more vigorous, manly lifestyle.

The Vietnam War, at least in the pages of men’s magazines, seemed only to widen the gap between “real men” and their “pussy” compatriots. Saga lashed out at members of the “new left” and the blatant “draft dodging underground” taking hold on college campuses. Man’s Illustrated condemned the “cardburners” and “slackers” who had worked the system to stay out of uniform. One antiwar activist recalled hearing epithets of “faggots” and “queers” as often as “commies” or “cowards.” In the pulps, the best American men went to war, while the weaklings stayed home.


Such narratives outlasted the pulps themselves, which died out in the early 1970s. Stories glorifying war and sexual conquest seemed out of step with the cultural revolutions rippling through the United States in the immediate aftermath of a failed overseas war. Yet the macho pulp storylines retained enough of their allure to resurface only a few years later.

By the mid-1980s, “re-masculinized” men returned in full force. A finely chiseled Rambo deployed back to Vietnam to save American prisoners of war still held captive there. So too did Chuck Norris’s Colonel Braddock in the Missing in Action films. Even President Ronald Reagan took his cue from these tough-minded action heroes, quipping in 1985 that he would now “know what to do” if faced with a hostage crisis after watching Rambo: First Blood Part II. Would anyone call Rambo or Braddock a “pussy”?


The militarization of masculinity portrayed in the Stallone and Norris action movies had clear roots in the Cold War macho pulps. Nor should we be surprised by former SEAL O’Neill’s use of the term “pussy.” Because the veteran had achieved his manhood through military service, especially in an elite unit, he could be secure in demeaning others who didn’t meet his masculine ideals—which apparently also inoculated him from deadly viruses.

Yet it’s not only the militarization of the “P” word that resonates, but the politicization of it as well. When Senator Ted Cruz (R-TX) recently claimed that “many liberal males never grow balls,” he was purposefully contrasting his own supposed conservative masculinity with the femininity of his political rivals, whether male or female. One wonders, though, if Cruz truly fashions himself as the new archetype for twenty-first-century manhood or simply hopes to score a few cheap political points via social media name-calling.

Or, conceivably, Cruz is channeling what has underscored a decades-long anxiety over American masculinity: that “real men” are on the verge of extinction because of political correctness gone awry, or a feminist movement subverting traditional gender norms, or any other imagined threat that stokes fears among mostly white, young, angry men.

Perhaps the most revealing expression of these anxieties comes from right-wing, all-male groups like the Proud Boys who see themselves as “aggrieved, marginalized, and depressed.” These traditionalists extol the imagined superiority of western culture, believe they are being disenfranchised by the left, and have found in Trump’s America a “place to put [their] political resentment.” As if to demonstrate their masculinity, the Proud Boys, according to one critic, “like to spoil for a fight.”


According to the “pussy” narrative, it’s not just the sensibilities of persecuted white men that are under attack, though, but the nation’s security as well. When I posted to social media a few covers from the macho pulps to promote my forthcoming book, one retired colonel who believes conservatives must win the current “culture war” replied that we all should focus more on crafting a militarized notion of masculinity “because a lot of Americans are pussies.” Another Twitter respondent claimed that the “pussification of America’s youth is a matter of national security. We need a touch of testosterone added to the water with fluoride,” he argued. “No more cavities and fewer softies.” By this logic, if only US soldiers and marines were equipped with more hormones, they might have achieved more lasting results in Iraq and Afghanistan.

So, what to make of this perceived “pussification” of America? Most importantly, we need to accept the fact that real violence stems from imagined grievances and misogynistic language. A congresswoman being verbally assaulted. A group of wrathful white men seeing themselves as a “right-wing fight club.” A militarization and polarization of society based on outdated gender norms.


For those who don’t want to examine the violence they perpetrate—against women or minorities or immigrants or any other perceived social or cultural threat—the “pussification” of America provides a warped justification for violent means.


Popular narratives of what it means to be a man, to paraphrase Josephine Livingstone, need not rest on connecting “the vulva with weakness” for those men who don’t act like chiseled Hollywood action heroes. In the Cold War men’s adventure magazines, “real men” were depicted as heroic warriors and sexual champions. More than a half century on, such depictions continue to resonate far too widely across American society. It seems well past time to evolve beyond mid-1950s mindsets and begin conversations about alternate models of masculinity. Chances are, America will survive, even if all men aren’t Rambo.

Ranking Donald Trump: No Cause for National Happiness

As editor of a recent book about presidential misconduct from George Washington’s administration through Barack Obama’s, I’m often asked where Donald J. Trump stands in the rankings of American presidents.  I respond that, in observance of historians’ practice, it’s too early to tell.  After all, the president’s term in office hasn’t yet ended.  And although we already know much about the Trump presidency from press coverage and court filings, the administration’s records are closed to examination.

But today this kind of reticence seems difficult to defend on civic grounds.  In this moment of political, environmental, public health, and resulting economic crisis, Americans deserve a considered answer to the question they ask: How does Trump fare in comparison with his predecessors?

The truth is that historians have never arrived at agreed-on criteria by which to compare American presidents.  Anyone who tries his hand at placing a single president in a ranking of some sort runs up against the fact that, despite many previous attempts, no group of specialists in presidential history has used the same set of measures.

Furthermore, although a number of historians’ presidential rankings have appeared since 1948, when Arthur M. Schlesinger Sr. led the first effort to create one, none is recent. Consequently, what follows is my own stab at an assessment of Trump’s presidency. Although it adopts some of the same yardsticks used in earlier attempts, it shouldn’t be taken to represent the views of historians generally. Nevertheless, with another presidential election rushing at us, let me try to determine where the incumbent president ranks against those who’ve occupied the presidency before him.

Preparation for office. It used to be asked about candidates for the White House whether they were “presidential timber.” By that was meant two things: previous experience as an elected official and possession of the proven knowledge, bearing, and authority appropriate to the presidency. Measured for public office-holding experience, nearly all pre-Trump chief executives served in elective posts prior to their election. Those who didn’t—most famously Zachary Taylor, Ulysses S. Grant, and Dwight D. Eisenhower—demonstrated their leadership and command abilities by leading large military forces in the field. Earlier experience in elective office, whether in national posts (like congressional seats and the vice presidency), through high military command, or in state office (say, a governorship), is assumed to give a chief executive the political skills, leadership abilities, and knowledge essential to governing. Not all presidents have possessed all these qualities even after previous experience; one thinks of, say, Andrew Johnson as lacking what’s needed in the presidency. An amateur never tested for leadership or command in public office, Trump fails when measured for previous preparation, too.

Fitness for office. This large measure encompasses aspects of a president’s mind and character—knowledge about the history, constitution, and culture of the nation, as well as possession of honesty, skill in selecting cabinet officers and advisors, balance of judgment, prudence of expression, empathy toward others, calmness in action, strength in decision, and the moral compass necessary for effective governance. Fitness also includes the absence of inherent personality traits that inhibit soundness of judgment, calm behavior in the face of critical challenges, and balance in decision-making. It’s difficult to find any previous president who exceeds Trump in his lack of so many of these qualities.

The successful pursuit of stated goals. Central to an administration’s record are the aims it sets for itself, the quality of those aims, and its success in achieving them. Historians’ favorite example of the successful achievement of campaign objectives is the one-term presidency of James K. Polk. During the 1840s, Polk met all four of his campaign objectives: incorporating the Pacific Northwest (today’s states of Washington and Oregon) after negotiations with Great Britain; gaining the American southwest from Mexico through war; lowering tariff rates; and establishing a federal monetary system independent of private banks. Most other presidents have achieved only parts of their platform goals. Trump’s major aims have been to free the U.S. of military and other foreign entanglements (half win), see to the repeal of the Affordable Care Act (fail), convince NATO to reduce its dependence on American funding (half win), erect barriers against immigrants, especially via a wall at the Mexican border (half win), nominate conservative federal judges (win), and reduce federal taxes and regulations (win). Whether Trump’s aims have been beneficial to the US and the world is debatable, just as the Polk administration’s addition of slave territory has never sat well with historians. But measured against campaign goals, Trump has done well, especially having been in office for less than four years.

Protecting the national interest.  This is considered the bedrock responsibility of a president.  It’s a major constituent of every campaign platform.  Its pursuit always faces serious challenges, its achievement many obstacles.  Central to it is the avoidance of war, victory in any conflicts that prove necessary, and the creation and preservation of good relations with other nation-states so as to guard and enhance the national interest.  On these grounds, Trump does better than, say, James Madison, James Polk, William McKinley, Woodrow Wilson, Franklin Delano Roosevelt, and the two Bushes, all of whom, claiming provocation or having seen the nation attacked, took the country into wars, some of them of questionable justification.  But are the unforced errors that have led to our recent fraying relationships with NATO, Iran, and China and our hard-to-explain cozying up to Russia and Saudi Arabia productive of greater American security?  By this metric, Trump comes out somewhere in the middle of former presidents.

Skill in governing.  Leading a real estate development firm is unlikely to give a president the political skills and other capabilities needed to govern.  In the White House, you’re head of a political party as well as head of state and chief of government, and it helps to be good at all three.  Sometimes—think of Millard Fillmore and Warren Harding—even prior experience in elective office fails to prepare you to be president of the United States.  Moreover, you’re president of all Americans, not of just some of them.  You’ve got to manage cabinet departments; represent the U.S. with dignity abroad; distinguish between campaigning and governing—all of these and many more skills and sensibilities being central to leading a large, powerful, and diverse nation like the U.S.  Abraham Lincoln and Franklin Roosevelt stand out as brilliant at governing the nation, their cabinets, and Congress in the midst of war.  Donald Trump?  His governing skills put him in the bottom half of the pack in any ranking of his predecessors.

Truth-telling and exemplary conduct. Exemplary conduct, honesty being its chief ingredient, is the coinage of effective governance. Six-year-old George Washington’s celebrated statement (a fictional one) that he could not tell a lie has set the standard for each president who followed him. Most presidents, most notoriously Richard M. Nixon, have shaded or avoided the truth, often by covering up misdeeds. Citizens are likely to shrug off a president’s lies if they’re infrequent, venial, and few. But if, like Nixon’s, they’re many and strike at the heart of the government’s integrity, they pass the bounds of tolerability. The number of Trump’s lies, tabulated by the press and other organizations, surpasses previous records by such an order of magnitude that it staggers belief. Never has a previous president proved as mendacious as today’s incumbent. In this category, Trump resides at the bottom.

Staying within the law. Being on the defensive is normal for a president; no act goes without scrutiny and attack. But once a president has to say, as Nixon did, that “I am not a crook,” that president has lost the authority and credibility to lead. Nixon was the previous champion of illegal behavior—in his case, being the first to orchestrate misconduct (what we know as the Watergate Affair) from the Oval Office. But compared to Trump, Nixon was a lightweight. Where Nixon acted purposefully to break the law through illegal acts and cover-ups, Trump has, while flouting the law, used his office to enrich himself at the public’s expense while ignoring existing legislation and breaking venerable norms of governance. Seeking favors from foreign governments (Russia’s and Saudi Arabia’s), being exposed for corruption (“Individual #1”), padding his company’s pockets in contravention of the Emoluments Clause of the Constitution, using his foundation’s tax-protected funds for personal purposes, and failing to see that his eponymous university delivered the promised education to its students—never before has a president so flouted the spirit and substance of the law. Another case of Trump’s landing at the very bottom of the list.

Lifting hearts, banishing fear. The greatest presidents, through the words they use, summon people to the nation’s service by raising their hopes, routing their anxieties, and helping the best in human nature express itself. An effective president speaks as a wise counselor and rousing coach, as FDR did in telling his fellow Americans that “the only thing we have to fear is fear itself.” Never before has a president referred to his fellow citizens as “scum,” “haters,” “losers,” “lowlifes,” and “thugs” or tried to set American against American. Although Nixon indulged himself privately in insults against his political enemies, no other president has ever done so publicly or with the same viciousness. Trump comes out at the bottom in this respect, too.

Empathy toward others.  Humility.  Social Conscience.  An ability to bring people together.    To merit the reins of government, chief executives must have the capacity to project themselves imaginatively into the feelings, thinking, and situations of those whom they lead—to act, in Abraham Lincoln’s words, “with malice toward none, with charity for all.”  Trump possesses no capacity to understand, accept, and empathize with others’ difficulties.  One recalls our great presidents for their magnanimity and inspiration.  On this score, Trump also falls to the end of the list.

Steering clear of self-dealing.  It has long been established—by the Constitution, law, and norm—that an incumbent president must not use his office for self-enrichment.  A chief executive may have inherited or accumulated wealth as, say, Washington and FDR did, and office may fit him for future earnings as it has Bill Clinton and Barack Obama.  But as we now know in detail, Trump has repeatedly manipulated law and regulation to protect and enlarge his and his family’s wealth while in office.  He has brazenly hawked his own products, directed the government to house federal officials at his resorts, made clear his expectation that foreign delegations and party stalwarts use his Washington hotel, and thwarted the move of the FBI to the Maryland suburbs so as to prevent the construction of a competing hotel in place of the FBI’s current headquarters.  No other president has attempted anything like Trump’s efforts to line his own pockets while in office.  On this count, too, he ends up at the bottom of any ranking.

Abiding by existing norms of government. A nation’s code of governance, as much as its laws, reflects the character of the nation itself. Since the birth of constitutional government in 1789, all presidents have abided by most of the standards of behavior, presentation, and action adopted before their time in office. Without exception—even Nixon at his most flagrantly illegal—no president has openly stated his defiance of constitutional and other norms as has Trump. None has flaunted his intention to challenge the outcome of a presidential election. This threat alone places Trump at the bottom, in a league of his own.

An assessment like this one constitutes a bill of indictment of Trump’s presidency. But it also gains his record one gold star of sorts. He’s accomplished what no other president has been able to achieve since the first presidential ranking in 1948: he’s managed to raise James Buchanan, Millard Fillmore, Andrew Johnson, and Warren Harding off the floor. The sad thing is that this is no achievement we can cheer.

Does the "Divided Loyalty" Question Still Dog Catholic Politicians?

Al Smith campaigns for president, 1928

The candidacy of Joe Biden, a cradle Catholic, together with the nomination to SCOTUS of Amy Coney Barrett, a cradle, charismatic Catholic, has again raised questions about Roman Catholicism’s place in American politics. Presenting too strict a faith can generate reactions like Mark Sumner’s at Daily Kos, who wrote in reaction to Barrett’s nomination that

the far right is so excited to see her name put forward [because] Barrett is a religious extremist, a member of a small sect that takes the inherent misogyny of traditional Catholicism and adds to it by doubling down with … more misogyny.


At the same time, Biden’s generic appeals to Christ or prayer barely register with voters and journalists accustomed to Christian rhetoric in political campaigns. A piece at NPR on how “Biden’s Catholic Faith Shaped His Life” included a quote from John McCarthy, the candidate’s national deputy political director. When McCarthy said, “It’s about the vice president being who he truly is, which is a Catholic and a deeply devout person of faith,” NPR’s reporter did not ask for an explanation. One way to account for the difference in coverage of Biden and Barrett is to acknowledge that a faith that inspires or comforts a politician is an easier sell than one that seems to challenge existing policies.

For some in the press corps, however, Biden’s faith needs more spice. Elizabeth Bruenig at the New York Times, for instance, has argued that Biden needs to incorporate more of Pope Francis’ teachings into the Democratic nominee’s campaign. “Mr. Biden could look to the example of Pope Francis as a model for a kind of Catholicity that is both pious and challenging to the powers that be — if he, or anyone else, were interested in that sort of thing.” For Bruenig, Francis provides grounds for challenging America’s ruling class, and Biden, who needs to move to the left, has spent too much time in the moderate middle.

That advice might make sense for someone like Bruenig, a socialist and relatively recent convert to Rome.  But the recent release of Fratelli Tutti, a papal encyclical that appeared after Bruenig’s column and that challenges Donald Trump’s brand of conservatism, raised the stakes of the columnist’s point.  Will Francis give Biden cover to move to the left?  Aside from differences between Republicans and Democrats over domestic policy, should a Catholic candidate aspiring to the presidency appeal to the papacy for support for public policy? The answer from American history is that for almost a century, Roman Catholic presidential hopefuls have distanced themselves from the church both to silence anti-Catholic critics and to affirm national norms. 


John F. Kennedy likely set the standard for all Catholic successors when he ran for POTUS. In 1960, the U.S. Senator from Massachusetts had to answer many critics who worried that a Catholic president could not uphold the Constitution owing to competing loyalties. In fact, anti-Catholicism had received a new lease on life only ten years earlier, when Paul Blanshard wrote the best-seller American Freedom and Catholic Power (1949). The author was no fundamentalist bigot. His anti-Catholicism tapped reputable Protestant sources, such as those that accompanied him from Harvard Divinity School to Union Seminary (New York) and into the Congregationalist ministry. Although Blanshard wound up agnostic, he believed Roman Catholics were a threat to liberal democracy and wrote a book to prove it. This was the predictable objection to Catholicism, namely, that a higher religious loyalty conflicted with duties to uphold American law (for some reason, it never applied seriously to Bible-thumping Protestant politicians). In Blanshard’s mind, Rome stood for “antidemocratic social policies” that were “intolerant,” “separatist,” and “un-American.” Perhaps the reason the book sold well was that Blanshard expressed what most white Protestants, whether mainline or evangelical, thought about Roman Catholicism. Even Reinhold Niebuhr, who remains practically every faith-friendly politician’s favorite theologian, believed Rome’s “authoritarianism” was fundamentally at odds with the “presuppositions of a free society.”


Blanshard’s anti-Catholicism was hardly original. The year before the 1928 presidential election, for instance, New York governor Al Smith, the Democratic nominee, needed to answer a long article in The Atlantic by Charles C. Marshall, a prominent New York City attorney and lay Episcopalian, that questioned whether a Roman Catholic could be loyal to both the Constitution and the pope. Smith, who wondered “what the hell is an encyclical?”, was blindsided. His reply, written with the help of a priest famous for his service and heroism as a chaplain during World War I, was to affirm every point of the American creed and claim that no tension existed between American patriotism and church membership. To the litany of quotations from papal pronouncements, Smith replied, “I have been a Catholic all my life and I never heard of these encyclicals and papal bulls.” On questions about his loyalty to American government, he blazed the trail for Kennedy: “I believe in the worship of God according to the faith and practice of the Roman Catholic Church” and “I recognize no power in the institutions of my Church to interfere with the operations of the Constitution of the United States.” Smith added that he believed in “absolute” freedom of conscience and the “absolute” separation of church and state.

Even if Smith’s devotion was closer to the norm for most American Catholics than a supposed following of every utterance from the Vatican, Rome’s reaction to the American Jesuit John Courtney Murray indicated that parts of anti-Catholicism had merit.  Murray came on to the radar of the Vatican in the early 1950s when he tried to defang Blanshard’s book and demonstrate the harmony between the American Founding and Roman Catholic natural law.  In the process, he challenged the Vatican’s default position on religious freedom which, put simply, was “error has no rights.”  Murray’s ideas became sufficiently alarming that his superiors instructed him to stop writing about church and state.  Although Murray eventually served as an advisor to the bishops at the Second Vatican Council, his positions were suspect within the Vatican even as the bishops convened in Rome. When the Council eventually embraced religious freedom in Dignitatis Humanae, Murray’s views seemed to prevail.  


Whether Vatican II also vindicated JFK’s earlier declaration of independence from church authority is debatable.  Although Murray had had to worry about offending his superiors, the church’s bishops offered no objections to Kennedy even when he said, before a body of Houston’s Protestant clergy: “I believe in an America where the separation of church and state is absolute, where no Catholic prelate would tell the president (should he be Catholic) how to act, and no Protestant minister would tell his parishioners for whom to vote.”  Chances are that Kennedy’s strict wall of separation was not what the bishops at Vatican II had in mind when endorsing religious freedom.  At the same time, Kennedy’s position of independence from the church has been the pattern over the last century for Catholic politicians in the United States. 


This is not only true for public figures; Rome’s bishops have also come around to a position that the United States is not a nation in need of correction by the church but is in fact a beacon of freedom and hope for the world.   In 2015, when Pope Francis visited the United States and spoke outside Independence Hall in Philadelphia, he avoided a prophetic witness and chose words of inspiration.  He invoked the examples of Abraham Lincoln, Martin Luther King, Dorothy Day, and Thomas Merton to underscore themes of his own papacy.  He also declared that the Declaration of Independence’s famous words – “all men are created equal” – were on the side of such Christian ideals as protecting “the good of the human person” and “respect for his or her dignity.”  He confessed a hope that the United States would “continue to develop and grow, so that as many young people as possible can inherit and dwell in a land which has inspired so many people to dream.”  Francis was echoing what the American bishops had been affirming since 2012 when they launched the Fortnight for Freedom, an annual two-week period prior to Independence Day that called on American Catholics to recognize and express gratitude for the freedoms their nation protected.  “We are Catholics. We are Americans,” the bishops asserted. “We are proud to be both, grateful for the gift of faith which is ours as Christian disciples, and grateful for the gift of liberty which is ours as American citizens. To be Catholic and American should mean not having to choose one over the other.”


Joe Biden will likely do what JFK and Al Smith did, namely, fit his faith into the norms of American politics.  In fact, Biden did something like that in his acceptance speech at the Democratic National Convention: 


We have a great purpose as a nation: To open the doors of opportunity to all Americans. To save our democracy. To be a light to the world once again. To finally live up to and make real the words written in the sacred documents that founded this nation that all men and women are created equal. Endowed by their Creator with certain unalienable rights. Among them life, liberty and the pursuit of happiness.


That may not be a vigorous expression of Roman Catholic conviction over and against the excesses of American society, but Biden is in a sense following the church hierarchy.  Since the 1960s, Roman Catholicism has provided more room for affirming America’s secular and liberal forms of government than the church ever did in the four centuries preceding Vatican II.

My Memories of Voter Suppression

Andrew Goodman, James Chaney and Michael Schwerner were murdered in 1964 for their efforts to secure Black voting rights in Mississippi. 

The author joined an interracial movement for voting rights in Louisiana in the years before.

Back in July 1962, when, according to Donald Trump, America was “great,” I was in the Deep South, working to register Black voters.  It was a near-hopeless project, given the mass disenfranchisement of the region’s Black population that was enforced by Southern law and an occasional dose of white terrorism.

It all started in the fall of 1961, the beginning of my senior year at Columbia College.  My roommate (Mike Weinberg) and I, both white, had joined the campus chapter of the Congress of Racial Equality (CORE) and participated in a few of its New York City projects.  The real action, though, was in the turbulent South, swept by sit-ins and Freedom Rides that demanded an end to racial discrimination and, especially, the right to vote.

On an evening in the spring of 1962, Ronnie Moore, a Black CORE Southern field secretary, brought the news of the Southern freedom struggle to our Columbia CORE meeting. Ronnie had headed up desegregation efforts in Baton Rouge, Louisiana, and he and three other students at Southern University, an historically Black institution, were out on bail on “criminal anarchy” charges. The laws under which they were charged and imprisoned, which provided for a penalty of ten years at hard labor and a hefty fine, dated back to the state’s early twentieth-century repression of union organizing among Black and white timber workers.

Stirred by what Ronnie told us, Mike and I went up to him after his talk and asked him how we could help the cause.  Looking us in the eyes, he said, smiling: “What are you boys doing this summer?”  In reply, we explained that, inspired by Jack Kerouac’s On the Road, we would be driving around the country.  “Any chance that you’ll get to Baton Rouge?” he asked.  “We could manage it,” we said.  “Well, do it,” he remarked, adding: “Maybe we could arrange to get you arrested!”  We all had a good laugh about that.

That July, as Mike and I drove along Louisiana roads enveloped in an atmosphere of racial segregation, racist remarks, and unbearably hot and steamy weather, the venture no longer seemed quite as amusing.  Nor, after arriving in Baton Rouge, was it easy to find Ronnie, for the Congress of Racial Equality wasn’t listed in the phone book.  But we did find a Committee on Registration Education, and figured that, with the same acronym, that must be his group.  It was.  The state authorities had obtained a court order to shut down its predecessor.

When we arrived at CORE’s tiny office, Ronnie was delighted to see us and, together with his coworkers, took us to an all-Black hangout for coffee.  In his view, and ours, the only safe people in the South were Black.  As for local whites, we considered them all actual or potential Nazis, and stayed clear of them and their institutions.  Whether they would stay clear of us remained uncertain.  Mike and I slept on the Moore family’s entry hall floor, and local residents had been known to fire bullets into it through the front screen door.

Although most of the voter registration campaign Mike and I worked on in Baton Rouge was rather mundane, one evening was particularly exciting.  At dinner time, Ronnie suggested that we drive over to Southern University, from which he and the other CORE activists had been expelled for their “crimes.”  As we entered the all-Black dining hall, students started yelling: “It’s Ronnie!  It’s Ronnie!”  Hundreds of students swiveled around and cheers rent the air.  Leaping onto one of the tables, Ronnie made an impassioned speech about the freedom struggle and, then, announced that he had brought with him two movement supporters from the North.  “Get up here, Larry and Mike!”  So we jumped up there, too, and did our best to deliver strong messages of solidarity.  We had just about finished when someone rushed in, warning that the campus security police were on their way and that we had better get out of there fast!  While students ran interference for us, we did.

One day, Ronnie suggested that Mike and I drive him to Jackson, Mississippi, where a region-wide CORE-SNCC conclave would be held at the local Freedom House.  Accordingly, after dinner, we hit the road through northern Louisiana (where a local gas station operator threatened to kill us) and, then, through Mississippi to Jackson.  Here, in an abandoned building taken over by the movement and around which police cars circled menacingly, we joined dozens of CORE and SNCC activists from the Deep South.  At night, they had lengthy political discussions, in which they expressed their bitterness toward the Kennedy administration for its failure to back civil rights legislation or to protect movement activists from racist violence.

During the days, Mike and I joined Luvaughn Brown, a Black activist recently incarcerated at the county prison farm, to go door to door in a Black Jackson neighborhood and encourage its residents to register to vote.  This was a tough job because people feared retaliation if they dared to exercise their voting rights and, also, because they would almost certainly be rejected.  At the time, Mississippi used a “literacy test” to determine if a citizen was qualified to vote.  A voting registrar would ask a potential registrant to define the meaning of a section in the lengthy state constitution.  If you were Black, the registrar announced that you had failed the test; if you were white, you passed.

Voter registration work was not only frustrating, but exceptionally dangerous.  The following summer, Medgar Evers, head of the local NAACP, was murdered in Jackson by a white supremacist for his leadership in a voter registration campaign.  The next June, James Chaney, Andrew Goodman, and Michael Schwerner—participants in the Mississippi Freedom Summer voter registration project—met a similar fate.  Although rattled by our fairly brief Southern venture, Mike and I escaped with our lives, as did Ronnie.

Mike and I kept in touch, and were delighted when Congress responded to the scandal of Southern voter suppression with the Voting Rights Act of 1965, which outlawed the discriminatory voting practices of the past and established federal oversight of any new voting procedures in the offending states.

Imagine, then, our sense of sorrow, mingled with disgust, when, in 2013, by a 5-4 vote, the Republican-dominated U.S. Supreme Court gutted the Voting Rights Act.  This opened the door for numerous Republican-controlled state governments—many but not all Southern—to implement mass purges of their voter rolls, closure of polling places in minority neighborhoods, government ID requirements, felony disenfranchisement, and other barriers that deprived millions of Americans of the right to vote.

I wonder how Republican leaders can live with themselves when they betray the most basic principle of democracy.  Of all the things they have done during their time in power, this is surely one of the most despicable.

Where In The World Are You? How My Great-Grandmother’s Letters Helped Me Locate My Great-Uncle after 78 Years

When did you last send a handwritten letter, or receive one? Can’t remember? Me neither, but there was a time, not that long ago, when the snap of the letterbox brought people rushing to see what had been delivered: reassuring words from a much-missed loved one, or the dreaded telegram bearing the very worst of news.

Much of what we know about the world wars comes from the letters and diaries of ordinary people caught up in those extraordinary events. Without these personal reflections, we wouldn’t have such an intimate knowledge of how war affected those who lived through it. Those like my great-grandmother, and great-uncle.


Between October and December 1942, my great-grandmother wrote a number of letters to her youngest son, Jack, who was reported as missing in action shortly after leaving home to fight for the British Army with his regiment from East Yorkshire. Jack, aged 25, was sadly never found and, heartbreakingly, my great-grandmother’s letters to him were all returned to her, marked simply, ‘To Mother’. Seven of these letters survive, and have been passed down through the family over the decades. In 2017, my father gave them to me.

 

 

Reading my great-grandmother’s unimaginable anguish, her desperate worry for Jack and her agony at not hearing from him, had a profound impact on me, and made me think about war in a different way. Her words made me realise that the pain of separation, of not hearing from loved ones for months, years, or ever again, wasn’t something that had happened to strangers in grainy black and white photographs, but had happened to my own family; to a mother, like me. Her letters inspired me to write a novel about ordinary people caught up in the war and cut off from their loved ones, and soon after I was given them, the right story found me: a story of the war in the Pacific; a story of resilient women and resourceful children; a story from a distant corner of that terrible war. 

 

I first learned about the events that inspired When We Were Young & Brave in a podcast. The episode began as an amusing anecdote about waylaid Girl Scout cookies, but went on to reveal the remarkable true events surrounding a group of schoolchildren and their teachers who were taken to a Japanese internment camp in China, following the bombing of Pearl Harbor in December 1941. The children’s parents were mostly British, American and European missionaries and diplomats, and their lives, up to that point, had been ones of great privilege. Many of the children were part of the school’s Girl Guides unit, and the principles of girl guiding, the routine of patrol meetings, the practical skills learned, and a willingness to Lend a Hand and Be Prepared, became increasingly important in helping them, and their teachers, endure their ordeal over the next five years. The account stirred fond memories of my own years as a Brownie, of the BBC drama Tenko, and of the Ingrid Bergman movie, The Inn of the Sixth Happiness, based on the life of missionary Gladys Aylward. 

 

I was intrigued, not only because World War II was an event I wanted to write about, but also because girl guides, schoolchildren and war simply didn’t belong together. I wanted to understand how it had happened, how the children and their teachers had coped so far from home, and how the experience, especially the prolonged separation from their families, had affected those children in later life. What I hadn’t expected to discover during my research was a story not only of unimaginable hardship, but of extraordinary hope, friendship, community and kindness as the children and their teachers adapted to their rapidly changing circumstances. 

 

As a historical novelist, I spend a lot of time walking in the shoes of those who have lived through world-changing events, so it felt very fitting to finish editing When We Were Young & Brave during lockdown in March, at the start of our own world-changing event. Now, more than ever, it seems to me that the past is not a foreign country where people do things differently, but is a reassuringly familiar place, one from which we can draw comfort, one from which we can learn. Disaster and tragedy are often where we find our strongest bonds, and as we find ourselves separated from family and distanced from loved ones, stories of community and shared hope are, arguably, more important than ever. 

 

 

 In February this year, with the help of the War Graves Commission website, I located my great-uncle Jack. The family had always believed he was lost in France, but he wasn’t. He was in North Africa, in Tunisia, so very far away from his rural Yorkshire home. His final resting place is marked with a memorial plaque, noting his age and the date of his death. He died on the day great-grandma wrote her final letter to him. 

 

Dec 27, 1942

 

My dear Jack, Where in the world are you? I keep writing and still no news of you. We are all ears at news time but seldom glean any news about your special team. How did you spend Christmas? We are all thinking of you … Everybody here has victory on their lips now, but I keep on looking for you. It does seem such long months since you went away.

 

Before my grandma passed away in May (after reaching her 100th birthday), we were able to tell her we’d found her brother’s final resting place, and show her the photograph of his memorial plaque. It gave her great comfort. Without my great-grandmother’s letters we would never have known how to find Jack, and I would never have understood how deeply the war had affected my family. 

 

 

How incredible that the words of a mother to her son, written nearly eighty years ago, still evoke such powerful emotions. I now look at the faces in old family photographs with a renewed sense of connection, sadness, and above all else, a profound sense that the past is not so different, or far away, after all.

Lessons from the 18th Century Dutch Republic

Binnenhof, seat of the States General of the Netherlands, The Hague

On the eve of the 2020 US presidential elections, American society remains deeply polarized. Study after study demonstrates that Republican and Democratic voters disagree on key policy issues, are becoming increasingly partisan, and rarely switch parties. Whoever gets elected president of the United States in November will unquestionably govern a divided and restless nation.

 

The 2020 election may be pressing and significant, but more relevant for the long term is how to move forward. If the United States is to survive, what can be done to mitigate the country’s divisive politics?

 

The history of the Netherlands serves as both a warning and an opportunity for the United States in its current polarized state. The Dutch lesson is that there is a way to achieve reconciliation and deal with the divisions that naturally arise in any society: the embrace of political pluralism.

 

Similar to the United States today, the Dutch Republic - the predecessor to the current Kingdom of the Netherlands - was a deeply divided country around the time of the American Revolution. Comparable to Democrats and Republicans in American politics, the Dutch Republic had two opposite poles on the political spectrum, the Patriots and the Orangists.

 

By the late eighteenth century, the Orangists were defenders of the status quo and their most ardent supporters hailed from the urban working classes and farmers in the countryside. They favored an alliance with Great Britain and large standing armies to defend against land invasions from France. 

 

In contrast, the Patriots were an opposition movement drawn predominantly from the urban mercantile middle classes. They supported larger navies to protect their overseas trade. The Patriots regarded the incumbent government as a corrupt aristocracy and sought to reform it through elections and the creation of citizens’ militias.

 

As in the United States today, various developments amplified the political stakes as well as the polarization between the two parties. The Dutch Republic had been a world power in the seventeenth century, famous for its riches, military might, and high culture, but the country experienced a gradual decline of prestige in the eighteenth century. Foreign powers such as France and Great Britain sought to exploit Dutch internal divisions for their own benefit. Meanwhile, wealth inequality had grown to epic proportions while real wages barely rose.

 

“Imbecility in the government; discord among the provinces; foreign influence and indignities; a precarious existence in peace, and peculiar calamities from war”, was how Alexander Hamilton and James Madison portrayed the Dutch Republic in the Federalist Papers in 1787. In other words, much like how many would currently characterize the United States.

 

The Orangists and the Patriots insisted that polarization was the root cause of Dutch decline. Both parties agreed that only unity (eendracht) could restore Dutch glory, basing their argument on the Dutch Republic’s motto, Eendracht maakt macht, or strength through unity.

 

But the pursuit of unity in a country in which people would never fully agree on policy was as cynical as it was destructive. Though the Patriots favored the implementation of elections, they often limited who could vote for local offices by requiring membership in a citizens’ militia to cast a ballot, a rule that heavily favored the Patriot party. Similarly, Orangists thought that unity could be achieved through collective deference to the Stadtholder, an executive with monarchical pretensions who was allied with the Orangists. In the end, neither party got its way, at least not permanently. After an invasion by French revolutionary forces in 1795, the Dutch Republic ceased to exist.

 

As in the eighteenth-century Dutch Republic, contemporary American politics has two major factions endlessly battling for dominance over the nation’s institutions. But where the eighteenth-century Dutch Republic represents a warning, the nineteenth and twentieth-century history of the Netherlands can provide a path towards national reconciliation for the United States. 

 

Political compromises on universal suffrage, labor rights, and freedom of education in the nineteenth and twentieth centuries institutionalized political pluralism in the Netherlands, which proved an effective method of easing tensions between various factions in politics and civil society. These compromises ushered in the period of “pillarization” (verzuiling) in Dutch society, the cultural foundation of political pluralism in the Netherlands today. It was broadly understood that different factions in society could peacefully coexist, as long as each group respected the others’ “sovereignty”, as Calvinist politician Abraham Kuyper put it. Socialists, various Christian denominations, and liberals each formed their own civil society in which they practiced their beliefs and shaped their political ideas.

 

The period of pillarization in the Netherlands is long gone, but the culture of political pluralism persists. Dutch Parliament currently counts thirteen different parties with the largest only commanding thirty percent of the seats. Though Dutch people complain about the quality of contemporary political debate and the splintering of political parties, having a different view than your neighbor on a political or cultural issue is hardly problematic.

 

We understand the contemporary United States as fundamentally different from the Netherlands in this regard, in part because politicians and their partisans have manufactured a red/blue, Republican/Democrat division to benefit electorally from polarization. Yet the idea that American citizens, nearly 330 million people from an endless variety of ethnicities, faiths, and socioeconomic backgrounds, can be categorized as either Republican or Democrat is a grotesque oversimplification. Even on abortion, arguably one of the most divisive issues in American politics, American citizens actually have a much less binary view than is usually understood.

 

It is unrealistic to expect that the United States will become a multiparty democracy any time soon. But what needs to be broadly recognized is that societies will always contain a wide variety of political ideas that somehow have to be reconciled through compromise in public policy, a fact the Dutch Patriots and Orangists never understood. Important steps can be taken towards gradually building a culture of E pluribus unum, the foundation of a new era of political pluralism in the United States.

 

Practically, citizens should be encouraged to envision themselves on a larger political canvas than just Democrat, Republican, or even independent. In the Netherlands, millions of voters fill out the StemWijzer (literally “Vote Indicator”) before every election to orient themselves on the political spectrum. State governments and educational institutions should popularize the use of similar nonpartisan tests, such as the Pew Research Center’s Political Typology Quiz.

 

The humanities also have an important role to play in promoting political pluralism. Through the humanities, students learn to appreciate how diverse political views and compromise shaped the society around them. For instance, documents like the Declaration of Independence or the United States Constitution should be examined as products of compromise between people with a wide variety of ideas, rather than as works of genius that simply fell out of the founders’ heads. 

 

Likewise, governments on every level should do more to combat the civic illiteracy of American students. A holistic and deep understanding of civics is key to creating an informed and politically tolerant citizenry and can become a vehicle for political pluralism. In addition to learning about the Republican and Democratic parties, students should know more about the rich socialist and libertarian political traditions in the United States to help them recognize the diversity of opinions on the American political spectrum.

 

The embrace of political pluralism is essential to the health of the American Republic and will be even more so as polarization persists in the coming years. Voting is often seen as the duty of every citizen in a democracy and that is certainly important. But voting is just one part of citizenship. The history of the Dutch Republic demonstrates that polarization can gradually destroy a country from within and can easily be exploited by foreign actors. The embrace of political pluralism by every citizen is the key antidote to the rot of polarization and partisanship that haunts American politics today.

Return to the Presidential Succession Act of 1886 (With Some Modification)

Ronald L. Feinman is the author of “Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama” (Rowman & Littlefield Publishers, 2015). A paperback edition is now available.

 

A major controversy has arisen over the issue of presidential succession in the wake of President Donald Trump’s diagnosis with COVID-19.

There have been three presidential succession laws enacted. The first, in 1792, placed the President Pro Tempore of the Senate and then the Speaker of the House of Representatives in line after the Vice President.  That law survived the crises that followed the deaths of William Henry Harrison, Zachary Taylor, Abraham Lincoln, and James A. Garfield, without the need to go beyond the Vice President.  

However, during the second abbreviated term of Abraham Lincoln and that of his successor Andrew Johnson (1865-1869), the nation potentially faced an unprecedented crisis, as two situations developed around Johnson. John Wilkes Booth plotted to eliminate both Lincoln and Johnson. If conspirator George Atzerodt had not gotten drunk and failed to assassinate Johnson, Connecticut Senator Lafayette Foster, the President Pro Tempore of the Senate, would have succeeded Lincoln, a point this author makes in his book on presidential assassinations.  Also, if Johnson had been removed from office by impeachment, then-President Pro Tempore of the Senate Benjamin Wade of Ohio would have become President. Wade was a major critic of Johnson and refused to abstain from the vote to convict him.  That fact led a group of seven Republican Senators, who disliked Wade and his lack of ethics, to join with twelve Democrats to save Johnson from conviction and removal from office in 1868.

In 1886, Congress wisely changed the succession law of 1792, eliminating both the President Pro Tempore of the Senate and the Speaker of the House of Representatives from the line of succession, thereby taking partisan politics out of the question of who should succeed a President.  The Cabinet officers, in order of the creation of their agencies, became the new order of succession, and remained so until 1947, spanning the deaths in office of William McKinley, Warren G. Harding, and Franklin D. Roosevelt.  

However, as reported in this author’s Assassinations book, Theodore Roosevelt faced a mostly unknown threat on September 1, 1903, when Henry Weilbrenner approached Roosevelt’s family home at Oyster Bay, New York, attempting to get past the Secret Service detail created after President William McKinley’s assassination in September 1901.  Weilbrenner, who was armed, claimed he wanted to marry the President’s daughter, Alice. Fortunately, he was never able to meet the President late that evening.  Had an untoward event occurred, however, Secretary of State John Hay, who had been a private secretary to Abraham Lincoln in the White House, would have become President.

After President Franklin D. Roosevelt’s death in 1945 and the succession of Harry Truman to the Oval Office, the Republican opposition gained control of the 80th Congress in the midterm elections of 1946 and passed the Presidential Succession Act of 1947, again putting the Speaker of the House and the President Pro Tempore of the Senate in the line of succession after the Vice President, and before the Cabinet officers. This was a purely partisan political act, with Republicans Joseph Martin and Arthur Vandenberg leading the way.  

The result: in the 74 years from 1947 to 2021, the Speaker of the House has been of the party in opposition to the president for 44 of those years, 60 percent of the time, and the opposition party has held the position of President Pro Tempore of the Senate for 34 of the 74 years, nearly half the time.

This is not a tenable arrangement in today's hyper-partisan environment. Therefore, reverting to the Presidential Succession Act of 1886, updated for the Cabinet positions created since, makes sense. One modification is needed: because of its national security responsibilities, the Department of Homeland Security, the last Cabinet agency, created after September 11, should move up in the order, placing its Secretary next in line after the Attorney General and before the Secretary of the Interior.

Even though Cabinet officers are not elected, succession through officials selected by the President makes for better continuity if we ever have to go beyond the Vice Presidency in an unforeseen emergency, and it ensures the continuation of the political party chosen by the voters to control the Executive branch for that term of office.

Vaughn Davis Bornet, RIP at 102 This blog post was written by Rick Shenkman, founder of the History News Network, and the author of Political Animals: How Our Stone-Age Brain Gets in the Way of Smart Politics (Basic Books). 

Let's begin with the cliche.  "So, Mr. Bornet, what is the secret to living a very long life?"  It was the question he got used to being asked. And it was the subject of a speech he gave to the Medford, Oregon Rotary Club when he turned 100, which we published a few months later. The following year in another piece on HNN he went into more detail. There was no secret, he answered.  He lived a long life, he supposed, because he didn't smoke, he stayed active "mentally and physically," and he married a good woman.  (She passed away in 2012.) 

He did have one secret about that speech, however, which he shared with me afterwards.  He managed to deliver it standing upright only with the help of a man who stood behind him out of view.  He may have graduated from Emory and Stanford, risen to the rank of commander in the US Navy during World War II, worked at RAND and written an armful of books on labor, Herbert Hoover, and Lyndon Johnson, but he was human.

He also was unsentimental.  Or was it just his endearing sense of humor on display when he wondered whether having children helped or hurt one's chances of living a long life:

"I am of two minds about children’s effect on longevity. They may shorten your life by sometimes almost driving you nuts. Or, they may actually lengthen your life, as they may pay part of the bill for that fancy retirement home. They can provide a really good motive to stay alive as they visit weekly or monthly, bringing chocolates."

Vaughn wrote some sixty articles for HNN through the years, beginning in 2007 with, "How Race Relations Touched Me During a Long Lifetime."  Characteristically, it showed his continuing engagement with world affairs. When Mitt Romney was on pace to win the GOP nomination in 2012 Vaughn penned a piece that helped put the issue of Romney's LDS faith in historical perspective.  Mixed in with the articles on politics were dozens that spoke specifically to historians:  reminiscences on the death of his friend, the diplomatic historian Norman Graebner, reflections on life as a historian here and here. (If you're a student thinking about a career in history those two articles might help you make up your mind.)  Along the way he wrote numerous articles about life in America as it used to be: here and here, for example.  

Throughout those articles from the early years of HNN Vaughn took the attitude that he'd seen it all and that we'd be fine.  In May 2016 he declared flatly:  "Why I’m Optimistic About Our Future." Then Donald Trump was elected president.  From then on Vaughn often seemed like a man in a state of shock.  This historian who had seen it all in his 100-plus years -- in the Great Depression he'd watched, powerless, as his family lost their house and car and he was shipped off to live with an aunt -- now seemed dumbfounded by events, caustically commenting on the "spectacle of government by guesswork."  "So it has come to this," he observed in despair.

As events unfolded he pleaded with me to do whatever I could to draw attention to Trump's failings.  Meanwhile, he did all he could. He reviewed Michael Wolff's book, then Omarosa Manigault Newman’s, then Bob Woodward's.

Vaughn was most comfortable in the role of patriot.  These were the kind of articles he wanted to write: "How Military Service Changes You,"  "It Has Been 63 Years Since I Raised My Right Arm and Joined the Navy," and "Good Luck, People of Our 50 States!"   And in his final piece for HNN, written back in May, he suggested, " 'This Too, Shall Pass.' History, and Life, Say So!"

Still, Trump unnerved him. His last book, published just a few weeks ago, is titled, "That Trump!" In the book, which consists of both new material and his HNN Trump articles, he aims to be objective but his disgust with Trump is self-evident.  At one point he hopes Trump will simply resign.

Vaughn hoped to live to see the end of Trump's presidency.  He didn't.  But maybe we will -- and soon. 

You can read Vaughn's many articles here at HNN and at his website, Clioistics. A family memorial can be found here and his obituary here.

"The Silent Guns of Two Octobers" Reviewing a New History of the Cuban Missile Crisis

 

HNN Editor's Note: This review was originally published in Washington Decoded on June 11, 2020, and is republished here with permission at the anniversary of the crisis. 

 

[Note to readers: Theodore Voorhees, whom I did not know, contacted me in 2017 about reading his manuscript. I concluded that his work added an important and fresh perspective to Cold War scholarship and, with Professor Martin Sherwin, assisted in finding a receptive university press. This article originally appeared on washingtondecoded.com on June 11, 2020] 

 

Part I: The Author’s Argument

 

The standard view of the Cuban missile crisis is engraved in our historical memory. My own books reflect that outlook, describing those iconic thirteen days as the most dangerous episode of the nuclear era and the thirteenth day, October 27, 1962, as the most perilous twenty-four hours in human history. That view is so widely shared in missile crisis literature that it was startling to read a book in which that interpretation was all but relegated to the status of “the conventional wisdom.”

 

Theodore Voorhees, Jr., Senior Counsel at Covington & Burling LLP in Washington, DC, concludes “that much of the Cold War rhetoric the leaders employed was posturing and that neither had any intention of starting a nuclear war.” Voorhees begins by dissecting the October 1961 confrontation along the Berlin Wall at Checkpoint Charlie when some sixty Soviet and US tanks faced each other “across a tense Cold War border.” His conclusion, however, is that John F. Kennedy and Nikita Khrushchev were personally determined to avoid escalation. Indeed, in a matter of hours, they maneuvered to assure that the confrontation evaporated without violence or casualties.  

One year later, a vastly more dangerous crisis arose when US surveillance aircraft discovered that the Soviets had secretly placed medium- and intermediate-range ballistic missiles in Cuba (the IRBMs were never actually delivered because of the imposition of the US naval blockade). How, Voorhees asks, did the rival leaders resolve the crisis “with lightning speed?” [i]

The simple answer is that the sudden, seemingly miraculous, restoration of peaceful coexistence was possible because both the underlying point of dispute and the ultimate deal terms that ended each crisis were matters under the personal control of each leader. When Kennedy and Khrushchev chose to settle, each man had the authority and the power to do so almost instantaneously. The two leaders personally directed all key decisions down to precise details…. It has become increasingly clear that Khrushchev and Kennedy felt free to reject the views of their closest advisers and brush aside the consternation they caused their alliance partners. … Neither Kennedy nor Khrushchev, whatever his publicly stated position, actually believed that his adversary’s actions presented a problem whose substantive importance warranted even a conventional military engagement, far less a nuclear showdown.

Voorhees acknowledges that hawks on both sides of the divide regarded the missile crisis as an opportunity to settle the Cold War militarily and “there was always the danger that men lower down the chains of command might pull the trigger, whether by mistake, through personal belligerence, through fear, or all three.” However, this shared outlook at the top also significantly diminished the potential for unwelcome contingencies. The two leaders kept both the conventional and nuclear buttons under tight control and used back-channel diplomacy (involving the president’s brother Robert and Khrushchev’s son-in-law Alexei Adzhubei) to make sure that the other side received unmistakable signals of their ultimate intent to restore the status quo. JFK intended the naval quarantine of Cuba as a sign of caution and sober restraint, 

and that is how Khrushchev and his colleagues at the Kremlin immediately interpreted it—with great relief. On the other hand, the president’s DEFCON-2 alert unmistakably signaled to the Soviets the dire peril into which their gamble in Cuba had placed them. … In the days that immediately followed, both Khrushchev and Kennedy were literally tripping over one another to be first to make a settlement proposal that would be so generous that his adversary would be unable to turn it down.  

Both leaders, Voorhees contends, understood that the US held “all the cards” in the nuclear balance of power with a twenty-to-one advantage in nuclear warheads. The extraordinary Kennedy-Khrushchev missile crisis correspondence, he insists, once the Cold War bluster is discounted, reveals two anxious men committed to “keeping the lid on” and ready “to get the deal done.” 

And, most importantly, the rivals understood the danger posed by the tinder box in West Berlin, located deep inside Soviet East Germany, and carefully avoided any sign of aggressive intent to alter the status of that divided city. The US had nuclear superiority, but the USSR, with a substantial advantage in troops on the ground in East Germany and the Soviet satellites in Eastern Europe, could quickly overrun West Berlin. President Kennedy had remarked at a White House meeting that “It is insane that two men, sitting on opposite sides of the world, should be able to bring an end to civilization.” Khrushchev, fortunately, shared that point of view. The antagonists “realized that no politician in his right mind was going to use nuclear weapons first.” 

There were, Voorhees concedes, unanticipated and very dangerous incidents: most notably the October 27th downing of a U-2 by a surface-to-air missile fired without Kremlin authorization by a Soviet officer on the ground in Cuba. Sergei Khrushchev recalled his father’s near-hysterical reaction to that stunning development, which led to the death of the American pilot, the only fatality of the missile crisis. The furious Khrushchev even threatened to exile the officer to Siberia because “Everything is hanging by a thread as it is.” From Voorhees’ perspective, Khrushchev’s response, surely one of the dramatic highpoints in missile crisis literature, coupled with Kennedy’s decision not to retaliate against the SAM site(s), confirms the shared determination in Moscow and Washington to avoid nuclear war. 

 

“Could it be,” Voorhees argues, 

that the Cuban missile crisis proved exactly the opposite of what was widely feared: namely, just how much safer and better protected the world had become from the risk of war arising between the superpowers given the widely appreciated horrors that nuclear weapons had introduced to modern war-fighting? … The lesson—perhaps counterintuitive to generations who have long accepted that the world came close to a nuclear holocaust in October 1962—is that the fearsome prospect of nuclear war-fighting of any kind virtually guaranteed that the crisis would be settled with remarkable speed and certainly well before the parties came anywhere near a point of no return.

Part II: The Reviewer’s Response

After listening to hundreds of hours of recorded meetings and telephone conversations, I agree that JFK would never have chosen the nuclear option. Kennedy eagerly pursued a secret fallback plan, the so-called Cordier Ploy, in the wee hours of October 27-28 to give Khrushchev a face-saving way out by offering a Cuba-Turkey missile withdrawal plan that would appear to the world at large to have been put together by the United Nations rather than the US. JFK was ready, albeit reluctantly, to face the inevitable political fallout in the upcoming midterm elections if the secret missile swap had to be made public to avert war. The president, in a state of near despondency, told his 19-year-old mistress that he would rather his children be red than dead—not the predominant view in the United States in 1962. The only other choice was nuclear fallout. 

Voorhees, however, in my judgment, seriously exaggerates the ability of the Kremlin to successfully micromanage a complex operation—carried out in secret for many weeks and more than 6,000 miles from the USSR. Soviet Ambassador Anatoly Dobrynin later acknowledged that erratic and limited communications severely undermined Moscow’s ability to cope with every conceivable or inconceivable eventuality in real time because their Washington Embassy did not have direct phone or radio communications with the Kremlin; coded messages had to be sent by Western Union telegram—which could take 8-12 hours—after being picked up by bicycle couriers who, oblivious to the urgency of the situation, were known to stop for a snack or to flirt with a girl. JFK and the ExComm struggled with similar constraints—for example, waiting hours to receive State Department translations of Khrushchev’s messages. And, of course, neither Kennedy nor Khrushchev was able to control a potentially lethal wild card in the crisis, Fidel Castro—as revealed by his October 26 cable to Khrushchev advocating a nuclear first-strike on the US and his refusal to accept on-site UN inspection of the missile sites even after the October 27-28 negotiated breakthrough.

 

There were, of course, several other perilous and potentially unmanageable episodes. Khrushchev had also ordered the nuclear warheads in Cuba to be stored miles away from the missile bases to prevent an accidental or rogue launch; but at least one base commander, again without authorization from Moscow, secretly moved them to his site. And, even more ominously, tactical nuclear cruise missiles had been put into position to obliterate the American naval base at Guantanamo if the US bombed or invaded Cuba. If the Soviets had killed thousands of Marines using tactical nuclear weapons, could Kennedy have kept the public demand for retribution in check? Voorhees seems confident that the answer is yes, despite the fevered Cold War context of 1962 (which included a poll in which most Americans concluded that a nuclear showdown with the USSR was inevitable). 

 

Perhaps the most striking incident, which has gained a great deal of notoriety in recent decades, involves a Soviet submarine near the quarantine line forced to surface on October 27 after the US Navy dropped so-called “practice depth charges” [PDCs]—with the explosive force of a hand-grenade—producing “harmless explosive sound signals.” Voorhees recapitulates:

 

One of these PDC hand grenades may have detonated close enough to inflict some modest damage on at least one of the Soviet submarines, B-59, which would have allowed its captain under his standing orders to respond to any presumed damage-causing attack by firing torpedoes, one of which available to him in this case carried a nuclear warhead. … This incident has earned an outsized place in missile crisis lore owing to reports that a Soviet naval officer named Vasily Arkhipov on board B-59 allegedly stood up to his vessel’s captain, Valentin Savitsky; single-handedly talked him out of his threat to arm the submarine’s nuclear-capable torpedo for possible firing at US naval vessels; and thereby became known as ‘the man who saved the world from nuclear apocalypse’. 

 

Voorhees argues that Savitsky “had received notice of the new American [PDC signals] policy,” sent from Washington to Moscow on October 25, and “presumably [my italics] knew the difference between the sound of signaling PDCs and a determined lethal attack using real, full-strength depth charges.” However, JFK and the ExComm, Michael Dobbs concluded, “assumed that the Soviet submarine captains had been informed about the new procedures and understood the meaning of the [PDC] signals. They were mistaken.” [my italics] The Kremlin failed to confirm receipt of the message about the underwater signals and did not alert their four submarines in harm’s way near Cuba. Savitsky “knew nothing about the signaling procedures” and “nobody [on board] knew what was going on.” The submarines, Svetlana Savranskaya stressed, were also unable to contact Moscow without reaching “periscope depth” or surfacing in waters teeming with US Navy vessels.[ii] Voorhees remains confident, however, about “the essential inevitability of the actual outcome.” 

 

Finally, also on Black Saturday, October 27, a U-2 from a Strategic Air Command base in Alaska, apparently on a “routine air sampling mission” to check on nuclear testing in the USSR, “accidentally” strayed into Soviet air space. MiG fighters scrambled and the plane was permitted to return to its base escorted by US F-102 fighters equipped with nuclear air-to-air missiles. Voorhees insists that the Soviets, “already facing actual [my italics] oncoming attack threats” from American B-52s, “took no responsive measures.” In short, he concludes that the evidence suggests that the threat was not an “actual” threat and the Soviets knew it. Fortunately, however, the MiGs could only reach a maximum of 60,000 feet while the U-2 flew at 70,000 feet—thus limiting the Soviet fighters, at least initially, to tracking the path of the American intruder. 

 

However, when Dean Rusk updated the president about the U-2 “accident” just hours later, he was reading from a prepared text—unlikely to have been written in the brief time since the intrusion: “Would there be,” Rusk asks President Kennedy, “any advantage [my italics] in our saying that ‘an Alaska-based U-2 flight engaged in routine air sampling operations in an area … normally 100 miles from the Soviet Union had an instrument failure and went off course … overflying a portion of the Soviet Union?’” Rusk’s calculated language and tone, captured on the tape recording, suggest that he was proposing a public relations cover story rather than simply presenting the facts to the president. 

 

Decades later, at a conference, Professor Scott Sagan asked Robert McNamara if the U-2 flight was part of the ultra-secret Single Integrated Operational Plan (SIOP) for nuclear war. The former defense chief curtly denied it but refused to discuss details—intensifying the skepticism of the panelists and the audience. Fred Kaplan, however, has documented that JFK, in 1961, had read and seriously discussed a nuclear first-strike plan that could have led to a million Soviet casualties in the first attack alone.[iii]

 

Michael Dobbs later utilized some newly released documents and interviewed U-2 pilots and senior SAC officers to nail down additional details on the overflight.[iv] He nonetheless stressed that the full report, originally ordered by McNamara, remains classified. Can historians rule out, without this potentially definitive evidence, the possibility that this episode was linked to a botched or aborted effort to “resolve” the crisis with a pre-emptive nuclear strike—in other words, that what began as a strategic gamble morphed, through contingency, into a hazardous unanticipated consequence?

 

Both Kennedy and Khrushchev, Voorhees insists, were resolved to avoid the use of nuclear weapons. But, as explicated above, the micromanagement of historical contingency is an illusion. “The destinies of nations,” Martin Sherwin demonstrates, “just as the lives of individuals, are moved inexorably forward through crossroad after crossroad by decisions and chance, with the influence of each in constant flux. The disconcerting conclusion … [is that] a global nuclear war was averted because a random selection process had deployed Captain Vasily Arkhipov aboard a particular Soviet submarine.”[v]

 

Theodore Voorhees, Jr. has written a boldly original and impressively researched account of how events, fortunately, did turn out in October 1962.  But, if those fateful thirteen days could be repeated one hundred times, it is all but inconceivable that fortuitous contingency, branded as “plain dumb luck” by former secretary of state Dean Acheson, would substantiate Voorhees’ confidence in “the essential inevitability” of a peaceful outcome. Kennedy was steadfast about deterring nuclear war—a fact incontrovertibly documented by the real-time tape recordings; Khrushchev’s apparently analogous motives must be deduced from his actions, his memoirs, and the testimony of those around him. Nonetheless, that shared outlook alone could not and did not predetermine the outcome. As historian Fredrik Logevall recently warned: “we should avoid the trap of hindsight bias, or what the philosopher Henri Bergson called ‘the illusion of retrospective determinism’—the belief that whatever occurred in history was bound to occur.”[vi]

 

 

[i] If, as Voorhees maintains, the Checkpoint Charlie standoff provided Kennedy and Khrushchev with “a kind of blueprint and preview in miniature” of the missile crisis, it did not have a noteworthy impact, despite the persistent angst about Berlin, on the ExComm discussions or the correspondence between the two leaders.   

[ii] Michael Dobbs, One Minute to Midnight: Kennedy, Khrushchev, and Castro on the Brink of Nuclear War, 2008, 297-303; Svetlana Savranskaya, “New sources on the Soviet submarines in the Cuban missile crisis,” Journal of Strategic Studies, 28/2 (2005) 233-59.

[iii] Fred Kaplan, “JFK’s First-Strike Plan,” Atlantic Monthly, October 2001, 81-86.

[iv] Dobbs, Op.Cit., 258-65, 268-72. 

[v] Martin Sherwin, www.cornerstone.gmu.edu/articles/4198 and Gambling with Armageddon: Nuclear Roulette from Hiroshima to the Cuban Missile Crisis, 1945-1962, forthcoming September 2020.

[vi] Fredrik Logevall, JFK: Coming of Age in the American Century, 1917-1956, 2020, 361.

The Roundup Top Ten for October 16, 2020

Republican Voter Suppression Efforts were Banned for Decades. Here's what Changed

by Kevin M. Kruse

In 2020, as in 1981, the realities of voter fraud don't matter. Republicans insist that their very real efforts at voter intimidation are warranted because, they claim, Democrats have done or will do or possibly might do something much worse. 

 

The Right's War on Universities

by Ruth Ben-Ghiat

"From the fascist years in Europe, nearly a century ago, to our own times, right-wing leaders have accused universities of being incubators of left-wing ideologies and sought to mold them in the image of their own propaganda, policy, and policing aims."

 

 

How Do Pandemics End? History Suggests Diseases Fade but are Never Truly Gone

by Nükhet Varlik

"Whether bacterial, viral or parasitic, virtually every disease pathogen that has affected people over the last several thousand years is still with us, because it is nearly impossible to fully eradicate them."

 

 

For 200 Years Courts Upheld Rules to Protect Americans’ Health. Until Now

by John Fabian Witt

"Now a new generation of judges, propelled by partisan energies, look to deprive states of the power to fight for the sick and dying in a pandemic in which the victims are disproportionately Black and brown."

 

 

Stop Othering Latinos

by Geraldo L. Cadava

When politicians see us as more than voters, we may give them our votes.

 

 

#WEWANTMOREHISTORY

by Greg Downs, Hilary N. Green, Scott Hancock, and Kate Masur

At historic sites across the United States on September 26, dozens of participating historians presented evidence to disrupt, correct, or fill out the oversimplified and problematic messages too often communicated by the nation’s memorial landscape.

 

 

Higher Ed’s Shameful Silence on Diversity

by Hasan Kwame Jeffries

Right-wing diatribes about diversity training often ended with a call for Trump to issue an executive order banning federal agencies from holding them. So it was not unexpected when, on September 22, Trump signed an executive order forbidding diversity training within the government.

 

 

The Real Black History? The Government Wants To Ban It

by Priyamvada Gopal

Tory attacks on "victim narratives" in the history curriculum defend entrenched power and ignore the fact that Black British histories are about the power of protest and activism to make social change. 

 

 

The Political History of Concealing Illness, from Brezhnev to Trump

by Joy Neumeyer

As with his Communist counterparts, Trump’s predilection for pageantry offers a hollow illusion of vitality while letting potentially fatal problems fester.

 

 

America Has No Reason to Be So Powerful

by Stephen Wertheim

"There was a time when Americans believed that armed dominance obstructed and corrupted genuine engagement in the world, far from being its foundation."

 

"Provided I Can Fuse on Ground Which I Think is Right": A Lincolnian View of the White House History Conference

Drs. Wilfred McClay and Allen Guelzo, National Archives, September 17.

Some friends have importuned me for an explanation of why I joined the panel that spoke at the National Archives as “The White House Conference on American History” on September 17th. Having been engaged in teaching the subject in various ways for forty years, I can say bluntly that I am not happy about its present condition. That I would say so at the behest of the White House set off an overabundance of anxiety in some quarters and over-congratulation in others, mostly about the fact that the Vice-President and President spoke on the same subject later in the event. I am not sure what the cause of either the anxiety or the congratulation was, since my comments, of course, were not directed to the President or Vice President, or made in consultation with them. I have never even met the former, and the latter only once, at a reception. 

 

The issue for me was history education; and if I anticipated causing upset, it was more for making no secret of my conviction that the Enlightenment universalism of the Founding, the Declaration and the Constitution is a remarkable and exceptional moment in human history, or for my resistance to the worrisome versions of tribalism which I see bidding to replace it. I am not ashamed to say that I am a Lincolnian on this point, and subscribe myself fully to Lincoln’s opinion. In 1858, he said that half of Americans then alive had come from some place other than the United States. “If they look back through this history to trace their connection” to the American Founding strictly “by blood,” then “they find they have none.”

 

But when they look through that old Declaration of Independence they find that those old men say that “We hold these truths to be self‑evident, that all men are created equal,” and then they feel that that moral sentiment taught in that day evidences their relation to those men, that it is the father of all moral principle in them, and that they have a right to claim it as though they were blood of the blood, and flesh of the flesh of the men who wrote that Declaration, and so they are.

 

The fact that Americans have not always lived up fully to that Enlightenment universalism, or that ethnicity has often gotten bloodily in the way of it, merely shows that we are human, not that it is wrong. Lincoln again:

 

It is said in one of the admonitions of the Lord, “As your Father in Heaven is perfect, be ye also perfect.” The Savior, I suppose, did not expect that any human creature could be perfect as the Father in Heaven; but He said, “As your Father in Heaven is perfect, be ye also perfect.” He set that up as a standard, and he who did most towards reaching that standard, attained the highest degree of moral perfection. So I say in relation to the principle that all men are created equal, let it be as nearly reached as we can. If we cannot give freedom to every creature, let us do nothing that will impose slavery upon any other creature. 

 

I do not see that aspiration represented in much of our history teaching today. I complained, in my panel comments, that to look through the tables-of-contents of our flagship quarterlies is frequently to encounter a witches-sabbath (and Night on Bald Mountain was thrumming in the back of my mind as I wrote that) of complaint about injustices, deportations, genocides, failures, co-optations, and miseries. Unhappily, I am not alone in this lament. As David Hackett Fischer complained years ago, we have made “the American past into a record of crime and folly” and told ourselves “that we are captives of our darker selves and helpless victims of our history.” Perhaps this fulfills a certain Puritanical gene in our national make-up which is never entirely happy unless we are unhappy; perhaps it’s because human nature is drawn to misanthropy and outrage because it makes us feel so powerful; or perhaps it’s an illustration of what Tocqueville observed when he said that the nearer we approach genuine equality, the more screamingly intolerable the remaining inequalities feel. I am not equipped to decide which of these preponderates in every case, but I see a good measure of each in the squinting vision that pervades our profession.

 

In my comments, I was particularly severe on critical theory, and especially critical race theory, which strikes me as indulging precisely the same circular reasoning as the Calhounites long ago, with the same appeal to the supremacy of “community” (that was Amos Kendall’s word in shutting down the circulation of abolitionist literature in the mails in the 1830s) and race. It’s that circular reasoning which leads me to reject the theorists’ despair; nor does their despair carry much persuasion when I hear it coming from people who, in flat contradiction of despair, occupy positions of prosperity, privilege, influence and (yes) property which would otherwise be the envy of the preceding three thousand years. Adorno thought that Enlightenment reason discards difference and thus victimizes two-thirds of the world’s population. The very point of the Enlightenment was that reason understood difference, and saw difference as the cult-goddess of violence. 

 

I don’t deny that academic historians always run the risk of being manipulated, infantilized and traduced, especially by the political classes. But if, to avoid that, we say nothing except to ourselves, then I think we forfeit what we owe to the history we purport to serve. And I do believe we have a responsibility as historians, both to those who cannot speak from the past and to those whom we teach, a responsibility not to wallow in guilt or drag others into the wallow, and it does not seem to me at all unreasonable to ask what souls we are forming as we teach. If we laugh at honor, we should not then profess shock when mass murder is perpetrated. I do not think it wrong to ask myself whether what I say builds up, or destroys; whether it would strengthen the resolve of some eighteen-year-olds to storm Omaha Beach, or whether it would incline them to sell nuclear secrets for the first offer in spot cash. History is an art that holds off dissolution; it should improve life rather than debasing it. 

 

I have no sympathy whatsoever with the pompous foolishness which argues that all Americans have been right, valiant, brave, noble, innocent, blue-eyed and pure. But the myths of the mindless patriots on the Right are not worse than the myths of the mindless cynics on the Left, and I do not need to explain that it is the Left that dominates in our profession. I suppose that this will invite the accusation that I am merely bourgeois. Very well. Susan B. Anthony was bourgeois, Frederick Douglass was bourgeois, and Lincoln was certainly the most bourgeois of all.

 

So, I will take the opportunity of any platform offered me short of outright tyrants, depraved fools and genocidal murderers to talk about American history -- I have done that for Dinesh D’Souza and was roundly condemned for doing so; I did it for the World Socialist Web Site, and was roundly condemned for doing that, too. I think I can do both without being either a Trotskyist or a D’Souzaist. Lincoln once more: “I have no objection to ‘fuse’ with any body provided I can fuse on ground which I think is right.” I would be just as willing to do so as an officer of the American Historical Association, except of course, that I was told by the chair of the committee on nominations years ago that people who thought like me were not wanted. So much for diversity and inclusion.

 

What is the way forward? That is the question I wish more people would ask. I begin with a comment John Cheever once made in Falconer, about “the inestimable richness of human nature” -- that we are tragic, selfish, and cruel, and yet capable of great vision. I then would take us to the foundations of law -- divine, natural and positive – and superimpose the conviction that the American democracy opens up for us, in a way seen in no society before 1776, the chance to fight for the natural rights with which we have all been endowed. Next, I would find in the American experience the rejection of tribe -- of blood, soil, and kings -- and the achievement, more than in any previous epoch, of happiness, of eudaemonia. To write this story, I would borrow a rule from John Gardner (from whom I have borrowed a lot): there is no true compassion without will, and no true will without compassion. And the last word should be from Sgt. William Carney: “The old flag never touched the ground, boys.”

Who Owns Churchill?: Three Mythic Configurations

We sent our manuscript, The Churchill Myths, to Oxford University Press on 24 July 2019, the very day that Boris Johnson became prime minister of the United Kingdom. The manuscript had been in conception, and came together in various drafts, for almost exactly a year: its completion at that moment of dramatic political change was no coincidence. The conjunction of the two felt strangely unnerving to us: for once our own historical study appeared unusually punctual, immediately addressing the contemporary moment.

 

Johnson’s career had become increasingly identified with Churchill’s memory, not least in his own dreams, which he proved keen to share with the nation. His bestselling book, The Churchill Factor (2014), signalled not only a literary but a political event. Since Churchill retired from office in 1955 many politicians, in the UK, the USA, and elsewhere, have endeavoured to appear “Churchillian.” Like Johnson, they have wished to exploit Churchill’s reputation as steadfast, determined, and resolute. Yet although the core of Churchill’s image -- indelibly linked with his cigar and V-sign -- has remained constant, there have been significant historical changes in how successive political generations have deployed him.

 

This is why we refer to myths in the plural. Churchill is constantly being reshaped in public debate and put to new political uses. How he is understood will continue to evolve in our own present, as well as in the future.

 

We suggest, speaking broadly, that there have been three overriding phases in the mythic configurations of May 1940, with the figure of Churchill himself assuming ever greater prominence as the story evolves. 

 

First, during the global crisis of May 1940, when the Nazi invasion seemed a matter of hours away, the place of Churchill in the telling of the story was not what we would expect. Mythic Churchill was present, of course, not least in Churchill’s own self-dramatizations. But this “Great Man” version of history had to compete with the role of “the people” as radical, and as the principal historical actor, with Churchill himself assuming a significant, but not an absolute, part in the story. This idea of ‘the people’ itself had mythic properties, while Churchill was relegated to acting as an adjunct to the larger popular mood, the one that was expressed in Labour’s 1945 election victory.

Second, from the late 1940s – that is, after Churchill lost the premiership – mythic Churchill really takes off. His own hand in the making of this state of affairs was not slight. In these years, Churchill became the means by which the compact between the state and the people could be harmoniously accomplished, magically unified through the commanding person of Winston Churchill. This is when Churchill himself became the totemic distillation of the nation.

 

In the last twenty years, thirdly, a new configuration of meanings has emerged. This is partly evident in Boris Johnson’s own revision of the story. Churchill still remains the incarnation of “the people.” But this is a more nativist, more belligerent people, understood to be not in harmony but in contention with the state. This reimagining of Churchill sees the “respectable,” mainstream Conservatives as the enemies and betrayers of the people. The historic Conservative Party is re-imagined as the enemy within.

 

The tangled relationship with Europe lies behind this. Johnson himself, when adopting the garb of Churchill, does so as a “man of destiny,” saving the people from the corruptions of the old Establishment, which includes both the advocates of Europe and the institution of the Conservative Party. This signals a more Jacobin, intemperate, and resolutely populist politics. 

 

When we were completing the book we knew Johnson’s electoral triumph was not the end of the story. But we could not possibly have anticipated how the explosion of Black Lives Matter would come to represent such an electric current in the public life of the British nation. History can always take us by surprise. Churchill, this time as the “racist,” is again emphatically a charged sign in the polarization of contemporary British politics.

 

But notwithstanding these divisions, Churchill as a transcendent figure remains powerful. When his name is uttered in public, chances are that it will be underwritten by faith in the empire as the unilateral medium for the benediction of others. Repetition of Churchill as the nation’s story runs deep. Its popular variations are impervious to critique. It’s not that contrary voices don’t exist. They do, and frequently they’re heard. It is, rather, that the Ur-story reproduces itself regardless of whatever manifestations of dissidence cross its path. The more repetitious it becomes, the more impregnable it is, and the further it departs from historical realities. In its telling, thought is eviscerated. The past is acted out, with no need for reflection. The story – the myth – obliterates history. The story tells itself.

Churchill becomes the means for the nation’s exceptionalism to exert its authority. There persists a puzzling faith in the notion that England, providentially, has been peculiarly immune to the existential darkness which shadows modern selfhood. That anyone actually believes this as a literal truth may be unlikely. But its echoes nonetheless persist. This is a voice heard in different frequencies. It constitutes a barely conscious substratum of popular experience, flaring into the light of day in tabloid headlines. It turns on an attachment to the protocols of an English fundamentalism. When critics point to the dark matter in the nation’s past – to the violence of colonialism, to enslavement, to the racialization of others – a reflex is triggered, as if even to say such things carries the lifeblood of English selfhood to the precipice of destruction.

Even now, to question Churchill’s historical record triggers all manner of aggrieved reactions. To do so is perceived to be denigrating not only the man but the nation. Churchill, in this sense, remains a charged element in the civilization of the British. The aim of our book is to help the reader understand how and why. 

Judicial Overreach in High Partisan Times: How the Dred Scott Decision Broke the Democrats and Boomeranged on the Court

With the imminent confirmation of Amy Coney Barrett, Republicans appear to have a solid Supreme Court majority in their grasp. But they – and the conservative Supreme Court majority – ought to heed a lesson from the Court’s history: Beware of overreaching.

 

The most dramatic example comes from the Court’s most infamous case. We usually parse Dred Scott v. Sandford as the Worst Decision Ever, but it also offers an overlooked political lesson. The Court waded into a highly partisan battle and badly damaged the institutions behind the ruling. The Democratic Party broke in two, and the Supreme Court itself endured a decade of court packing. 

 

Start with the case itself. An army surgeon was posted to the free territory of Wisconsin and took along a slave named Dred Scott. While in the territory, Scott married in a civil ceremony – something he could not have done as a slave. When the army sent the doctor back into slave states, Scott sued for his family’s freedom (by now they had two daughters) citing the traditional legal rule, “once free, always free.” After a 12-year legal saga through state and federal courts, Chief Justice Roger Taney decided to use the case to settle the fiercest question of the 1850s: Which of the vast western territories should be open to slavery? 

 

Democratic President James Buchanan, who never missed an opportunity to side with slaveholders, used his inaugural address to cheer the awaited court decision as the final word on the matter. Like all good citizens, intoned this soul of innocence, “I shall cheerfully submit … to their decision … whatever it may be.” Except that he already knew perfectly well what it would be. Buchanan had pushed Justice Robert Cooper Grier (a fellow Pennsylvanian) to join with the five southern justices in order to improve the optics when the legal bomb detonated. 

 

Two days later, on March 6, 1857, the Supreme Court announced its ruling in Dred Scott v. Sandford, an historically inaccurate, legally implausible, virulently partisan decision marked by eight different opinions (two dissenting). A clerk even misspelled the defendant’s name – it was Sanford, not Sandford – so that even the name of this infamous decision memorializes a typo. 

 

At the heart of all the jurisprudence sits Justice Taney’s majority opinion, an implacable picture of race and exclusion. The Constitution and its rights could never apply to Black people. What about Scott’s claim to have lived in a free territory? Not valid, ruled Taney, and for a blockbuster reason: no one had the authority to prohibit slavery in any territory – not the federal government, not the residents of the territory, not anyone. What about the Missouri Compromise of 1820, which forbade slavery above the 36°30′ parallel? “Not warranted by the Constitution and therefore void.” How about the Compromise of 1850 and the Kansas-Nebraska Act, which had turned to popular sovereignty? Nope. No one could limit slavery in any territory. 

 

The political fallout quickly spread to both the parties and the courts. The Republicans had sprung up in the mid-1850s to stop the spread of slavery into the territories. The Dred Scott decision, which was the first to strike down a major act of Congress in more than half a century, ruled out the party’s very reason for being. As historian George Fredrickson put it, the ruling was “nothing less than a summons to the Republicans to disband.” Republican leaders denounced the Court as part of the Slave Power and accused Chief Justice Taney of conspiring with pro-slavery Democrats in the White House and Senate. The decision itself helped propel these new-found enemies of the Court to power.

 

Across the party aisle, the detonation unexpectedly wrecked the Democrats. At their next political convention, in 1860, they paid the price of their victory. When it came time to write a party platform, northern Democrats opted for the same slavery plank they had used during the last presidential election: White men in the territories should decide the slavery question for themselves. After all, these politicians could not very well go before their voters – who were eager to claim western lands for white men and women – and announce that every territory was open to slavery regardless of local opinion. 

 

The southern Democrats, however, insisted on a plank that said exactly that. They bitterly denounced their party brethren for casually handing back what the Supreme Court had given. The southern version of the plank proclaimed that Congress had a “positive duty” to protect slaveholders wherever they went – “on the high seas, in the territories, and wherever else [Congress’s] constitutional authority extends.” The high seas? A sly call to bring back the Atlantic slave trade, which had been banned in 1808. When the convention narrowly chose the northern version of the platform, the southerners walked out and eventually nominated their own candidate. 

 

A divided Democratic Party eased the way for a Republican victory in 1860, and that, of course, gave the nation a hard shove toward the Civil War. Democrats had dominated Washington throughout the antebellum period. Now they fell from power. They would not control the Presidency and Congress again for thirty-two years. 

 

The recoil from the Dred Scott decision also shook the Court. Lincoln bluntly expressed the Republicans’ skepticism in his Inaugural Address: “The candid citizen must confess that if the policy of the government … is to be irrevocably fixed by decisions of the Supreme Court, the people will have ceased to be their own rulers.” The other branches of government were every bit as capable of enforcing the Constitution, he continued, and the Court had no business claiming that right for itself. 

 

Republican Senator John Hale (NH) added that the Court “had utterly failed” and called for “abolishing the present Supreme Court” and designing a new one. The new majority did not quite go that far, but they packed and repacked the Court. They added a tenth justice (in 1863), squeezed the number down to seven members (in 1866) and, finally, returned it to nine (in 1869). The Republicans also reached into the lower courts and rearranged the circuits. The partisan reorganization of the courts – the only sustained court packing in American history – went on for most of a decade. 

 

The lessons from Dred Scott echo down to the present day. A declining political party only injured itself by using the courts to settle a fierce political controversy. Even more important, the Court’s plunge into the hottest issue of the era blew right back on the Court itself. Taney’s botched effort to settle the slavery issue sends a warning to every generation. There are distinct limits to the Court’s legitimacy in highly partisan times. Modest jurisprudence can protect the Court. Overreach can cause all kinds of blowback.

This essay is taken from Republic of Wrath: How American Politics Turned Tribal from George Washington to Donald Trump (Basic Books, September 2020) 

The Coming Election and the Political State of Fugue

Americans teeter on the brink of a state of collective fugue. A psychiatric condition, the fugue state is caused by extreme distress in the aftermath of one or more cataclysmic events. It causes a person to fail to recall intrinsic identifying personal characteristics and to no longer remember what they believed in the past; the things they knew to be true no longer exist. This dissociative mental state erodes one’s fundamental concept of self. Under Donald Trump’s cataclysmic presidency, our collective memory and awareness of who we are as a people and our shared aspirations to perfect our union appear to be at the point of dissolution.

 

We are not at war with a conventional army, yet our nation is in chaos. Over 200,000 American lives have been lost to COVID-19, and with winter on the horizon, a second wave of the pandemic is emerging in Europe. The United States leads the world in the number of people infected with the virus and in the number of COVID-19 deaths, despite the nation’s exceptional biomedical and health-related research and infrastructure. 

 

Further, despite our proselytizing instinct to lecture the world about minority rights and good governance, police brutality against Black people in the United States is dramatically displayed in media across the globe. The ensuing continental uprising for civil rights, far-reaching economic spasms, and crisis of governance are exacerbated by the reflexive responses of an unpredictable President. 

 

The transition from a unipolar to a multipolar world, the emergence of economic centers in the East and the ensuing erosion of Pax Americana, accelerated by Trump and his team, compound our unease and search for identity. These seismic shifts nationally and internationally perpetuate our state of heightened anxiety. 

 

The erosion of our centuries-old governmental institutions is particularly distressing. In the wake of the death of Associate Justice Ruth Bader Ginsburg, we are now forced to put aside our national mourning and deal with the political ramifications of her passing. We must reckon with the seemingly assured confirmation of Amy Coney Barrett, a social conservative, as Ginsburg’s replacement, a confirmation that would certainly erode the civil rights of minorities, the healthcare gains of the contemporary era, the procedural rights of ordinary Americans in the justice system, and the bargaining power of workers. The death of Ginsburg, a champion of rights, portends a potential regression to a darker past.

 

Trump and his team claim to have a mandate from the American people and are preparing their Senate allies to complete the confirmation process within the span of a few weeks, prior to a presidential election. Mitch McConnell’s pledge to support the President ignores a precedent he declared just four years ago – not to confirm a Justice during an election year – which we are now expected to erase from our collective memory.

 

In many instances, particularly at the level of the High Court, the American judicial system has been predictable in rendering judgments based on the partisan political persuasions of Justices and other appointed federal judges. Nevertheless, the near balance of opposing forces within the Supreme Court provided stability and prevented the scales of justice from tilting so abruptly that they would overwhelm their point of balance and implode the judicial system. 

 

The Republicans obstructed President Obama’s appointments of federal judges during his tenure as a political maneuver, albeit one with an underlying racial element. Yet, in under four years, Trump has appointed over 200 federal judges. It is troubling that judges view national issues through a political lens, as a President who lost the popular vote and a Republican Senate whose members represent a minority of the nation have pushed the courts sharply rightward.

 

As Republicans abandon their legal philosophy when it is no longer expedient and backpedal from the position they took after the death of Justice Antonin Scalia just four years ago, they gaslight the American people. They claim that what we see and hear is not true. They ask us to forget what they committed to in 2016, to question everything, and to dispute the existence of what we know to be true.

 

Through a similar prism of purged memories, Trump and his team deny the existence of systemic racism that underpins police killings of unarmed Black people or the warming of our planet. They attempt to erase what is real from our memory. These unprecedented events have brought the nation to the precipice of a political state of fugue. Trump and his team are determined to push us off the cliff. When confronted with existential social and political crises, they foment political fugue with campaigns of disinformation.

 

Bob Woodward discloses that in February, President Trump was fully aware of the fatal potential of the coronavirus. Trump not only failed to share this information with the American public; he actively downplayed its deadly potential and strongly encouraged his followers to ignore preventive measures. The President had promised in his inaugural address to end “American carnage.” Paradoxically, his words foreshadowed what his legacy would be – the savaging of the American dream and hundreds of thousands of preventable American deaths during his presidency.

 

If Trump is re-elected, after another four years of his presidency the Justice Department, the Supreme Court, and other institutions of American democracy will not be recognizable. Our system of checks and balances, the foundation of American democracy, will be dismantled. Our identity as Americans and our aspirations for a more perfect union will cease to exist. Our government will be so fundamentally altered from what we know it to be that we will have entered a collective political fugue.

 

Corporate Money Turns Democracy Upside Down in California Initiative Process

The California ballot initiative process, created in 1911 by the progressive movement to control the influence of corporations, has instead been turned upside down by massive corporate spending. Corporate interests now use referenda to defeat popular legislation and grassroots reform efforts.

 

The most glaring example is the current fight over Proposition 22 on the November ballot, which has attracted a whopping $190 million in spending from Uber, Lyft and DoorDash.  The measure, backed by the companies, would overturn a current California law defining their drivers as employees who must be paid a minimum wage and be eligible for unemployment insurance. 

 

According to Ballotpedia, an independent tracker, this has become the most expensive ballot proposition in California history. Unfortunately, this massive outlay is only the latest in a long line of corporate efforts to sway public opinion on initiatives that impact their bottom line.

 

For example, in 2018, the dialysis clinic industry spent $110 million to defeat a measure to regulate their operations. In 2016, the pharmaceutical industry doled out $109 million to defeat drug price controls and in 2006 the oil industry spent $91 million in a successful effort to kill an oil extraction tax. 

    

Hiram Johnson

 

If Hiram Johnson, the progressive governor who championed the ballot initiative process, were alive today he would be appalled. Johnson was first elected in 1910 on a platform of controlling “the interests,” specifically the Southern Pacific Railroad, which basically owned the legislature. Johnson was a liberal Republican, and a follower of “Fighting Bob” LaFollette, the progressive Republican governor of Wisconsin, who instituted an income tax, a railroad commission and a pure food law. 

 

Although California’s proposition battles have attracted the most headlines, the Golden State was not the first to put initiatives on the ballot. South Dakota was the first to adopt statewide referenda, in 1898, followed by Utah in 1900 and Oregon in 1902. By 1918, an additional 16 states had implemented the practice.  

 

As the progressive movement ebbed in the Roaring Twenties, interest in the initiative process fell off. Between 1912 and 1969, fewer than three ballot initiatives per year, on average, appeared on the statewide ballot. 

 

However, the power of the proposition for implementing major change jumped into national view in 1978 with the passage of California’s Proposition 13. This measure, sponsored by apartment owner Howard Jarvis, rolled back residential and commercial property taxes and limited the legislature’s ability to raise new taxes. 

 

Awakened to the potential power of the initiative, grassroots activists and corporate interests alike rushed to qualify initiatives for the ballot. Between 1978 and 2003 there were 128 initiatives on the statewide ballot.  

 

Easy to qualify 

 

Getting on the ballot is not that difficult. Currently, 624,000 signatures are required for a basic or statutory referendum and 997,000 for a proposed amendment to the state constitution. The widespread use of paid signature gatherers makes the process easy for deep-pocketed interests. Attempts to ban the use of paid gatherers have been struck down by the courts. 

 

With Californians facing a long list of ballot measures year after year, many are questioning the wisdom of the process. A 2013 survey of likely voters found that 67% said there were too many propositions on the ballot and 84% said that the wording on initiatives was “too complicated.” 

 

This November, an even dozen measures will be up for a vote. They include rent control, dialysis clinic staffing (a second time), eliminating cash bail, voting rights for 17-year-olds and ending the state’s 22-year-long ban on affirmative action. 

 

As for the $190 million spent on Proposition 22, Los Angeles Times business correspondent Michael Hiltzik recently observed that “no other initiative campaign in California history — given that California campaigns are the most expensive in the country, that means U.S. history — has come close to the gig companies’ spending on Proposition 22, even accounting for inflation.”

 

The spending has manifested itself in a barrage of TV ads and mailers. The campaign features Uber and Lyft drivers (presumably volunteers) asking voters to “save my job.” The TV ads feature drivers (often young mothers and fathers) who state that the only way they can make ends meet is by driving for Uber or Lyft.  

 

Despite the ad onslaught, the ridesharing companies have an uphill climb to win approval. A poll released September 22 by the U.C. Berkeley Institute of Governmental Studies found that only 39% of likely voters would vote “yes” and support the ride-hailing companies, compared with 36% who would vote “no” to retain current law, with 25% undecided. Given that in most cases undecided voters ultimately vote “no” on ballot measures, Proposition 22 could wind up a costly defeat. 

 

A number of studies in recent years have found that the cumulative effect of California’s multitude of ballot initiatives has been to reduce the power of the state legislature and make it difficult for cities, counties and school districts to raise funds. This, of course, is hardly what Hiram Johnson intended.

 

Can anything be done to restore fairness to the initiative process? 

 

Some legislators have proposed raising the signature requirement or increasing the filing fee (now just $2,000). But as long as the federal courts equate corporate spending with free speech, those measures would do little to discourage wealthy special interests from using this policy-making tool.

Paul Revere Made the Boston Massacre a Flashpoint for Revolution

Paul Revere's Engraving "The Fruits of Arbitrary Power, or the Bloody Massacre," from Henry Pelham's drawing, 1770

At this moment that feels like a hinge in history—when America will swing either toward authoritarianism or toward a more just and liberal democracy—the ghosts of history rise up and speak to us. Five of those ghosts lay in the snowy gutters of King Street, Boston, nearly two and a half centuries ago, and their dying gasps resounded into revolution.

On the snowy night of March 5, 1770, a band of citizens allied as Patriots taunted and harassed a lone British sentry, Private Hugh White, who was standing guard over the Custom House, the repository of the funds General Thomas Gage needed to pay and operate the two regiments of troops occupying the city.

Some of his senior staff had counseled him to station the troops outside the city at Castle William in the harbor to avoid provoking violence and stiffening resistance to the occupation, but Gage intended to stun the self-proclaimed Patriots with a demonstration of overwhelming force. Thus he “quartered the soldiers” in the city—a military way of saying he ordered them to take over private homes—a move that enraged even those among the population who professed loyalty to the king. It’s useful to remember what seemingly innocuous phrases really mean.

The troublesome Bostonians were refusing to pay the taxes imposed on them by a Parliament across the ocean to pay off the debt incurred fighting the Seven Years War against the French. Patriot gangs routinely blacked their faces and accosted Customs collectors in the nighttime streets. One of their most troublesome leaders was reputed to be a silversmith named Paul Revere. Gage wanted to teach them all a lesson.

The lone sentry at the Custom House belonged to the 29th Foot—an unruly and unreliable regiment, hardly the nimble, disciplined force needed to project power and yet avoid violence. At some point during the altercation, in which he was knocked to the ground, he cut one of the civilians with his bayonet. Suddenly the injured man’s cohorts raised a great hue and cry, and a mob formed. Captain Thomas Preston arrived with eight reinforcements, also from the 29th.

The mob pelted the soldiers with snowballs—some of them probably cored with stones—some men daring the soldiers to fire, others pleading with them not to. Someone at last did shout “Fire!”—or, according to later court testimony, it may have been Preston ordering “Hold your fire!” In any event, the first Brown Bess musket went off—then others followed. 

The Brown Bess, so sweetly named, was a formidable and reliable weapon, in use since 1722. It would remain the standard British Army firearm, with modifications, for more than a hundred years. It fired a one-ounce .71 caliber ball—gigantic by modern standards—that could, it was claimed, penetrate five inches of solid oak. 

Three men died instantly, two others died later of their wounds, and six additional civilians were hit. 

It’s fitting to remember the names of the dead: Samuel Gray, a rope maker; James Caldwell, a seaman; Samuel Maverick; and Patrick Carr. The fifth fatality is often described as a dockworker of mixed race, or “mulatto”: Crispus Attucks. Like Caldwell, he was hit twice. The autopsy, performed by Dr. Benjamin Church, a prominent Patriot who would later betray the cause and be exiled into oblivion by George Washington, records horrific wounds. The first ball broke the second rib an inch from his breastbone, blasted downward through his diaphragm, blew his liver and gallbladder to pieces, severed the aorta descendens just above the iliacs, then exited through his spine. The trajectory would suggest that he was already on his knees when he was shot. He was likely dead before the second shot punched him in the ribs.

If you’ve ever fired such a musket—the stock socking your shoulder hard, the powder flaming out in a long sheet a delayed and startling instant after you’ve pulled the trigger—you realize it is not a quaint museum piece but a killing instrument of awesome power.

Just so, peaceful protesters today are learning with (literal) physical shock that so-called “non-lethal” “rubber bullets” and “beanbag rounds” are hard, brutal projectiles that can horribly maim and even kill. Again, it matters what words we use to describe things in the world of conflict.

As Preston wrote later, “None of them was a hero. The victims were troublemakers who got more than they deserved. The soldiers were professionals…who shouldn’t have panicked. The whole thing shouldn’t have happened.”[1]

The soldiers were arrested and jailed—their actions were clearly a matter for the bar of justice. The trial, winding up seven months later, was thorough—John Adams for the defense. Paul Revere—notorious to the British occupiers as an instigator and rabble-rouser—provided key evidence: a pen and ink diagram of the kind familiar to contemporary juries. It located each of the shooters and victims on King Street with clarity and precision.

That was Revere’s second and far less famous pictorial representation of the event. The first was rushed into circulation when the blood was hardly dry on the frozen ground: an engraving of a Henry Pelham drawing titled, “Fruits of Arbitrary Power, or The Bloody Massacre Perpetrated in King Street.” Thus the event was publicly and for all time named a “massacre.” The engraving removed any ambiguity about who was at fault in the episode, depicting a line of soldiers volley-firing into an unarmed crowd on the order of an officer with raised sword, as a little dog watches the horror. It became the ubiquitous graphic account of the violence of March 5, 1770.

The Boston Massacre was just the most prominent flashpoint so far; Gage belatedly removed his troops from the city. There were other flashpoints, not as infamous from our historical remove but equally inflammatory—and they began to add up.

On February 22, 1770, just weeks before the massacre, a hated customs informer named Ebenezer Richardson retreated into his home after being harassed by a gang of boys throwing dirt clods and waving sticks. He grabbed a musket and fired through a broken window into the crowd outside, killing an eleven-year-old boy named Christopher Seider. Four days later, some 2,000 Patriots staged a public funeral procession that began at the Liberty Tree, symbol of resistance to the King of England.

Among the inscriptions on the boy’s casket was a motto that could serve for Black Lives Matter: Innocentia nusquam tuta—“Innocence is nowhere safe.” 

Seider continued to inspire resistance. On the one-year anniversary of the massacre, Patriots gathered for a silent memorial—and it was no accident that the site chosen for the demonstration was the home of Paul Revere, an acknowledged leader of the Patriot movement. Once again, he understood the power of the visual. He created a triptych of iconic—and lurid—images that filled three windows, calculated to appeal to the smoldering resentment and fervent patriotism of the crowd. As the Boston Gazette reported:

“In the Evening, there was a striking Exhibition at the Dwelling House of Mr. PAUL REVERE, fronting the Old North Square. At one of the Chamber Windows was the Appearance of the Ghost of the unfortunate young Seider, with one of his Fingers in the Wound, endeavoring to stop the Blood issuing therefrom.” 

The portrait bore an incendiary caption:

            Seider’s pale Ghost fresh-bleeding stands,

            And Vengeance for his Death demands.

The Pelham-inspired print of the Boston Massacre filled the next window: “. . . the Soldiers drawn up, firing at the People assembled before them—the Dead on the Ground—and the Wounded falling, with the Blood running in Streams from their Wounds: Over which was wrote Foul Play.”

Revere understood how to fashion narrative through unifying the images: “In the third Window was the Figure of a Woman, representing America, sitting on the Stump of a Tree. With a Staff in her Hand, and the Cap of Liberty on the Top thereof—one Foot on the head of a Grenadier lying prostrate grasping a Serpent.—Her Finger pointing to the Tragedy.”

The exhibition worked its emotional magic, striking the thousands of assembled citizens to “solemn Silence” and “melancholy Gloom.”[2]

Two years after the massacre on King Street, Gen. Gage advised the Secretary of War, Viscount William Wildman Barrington, “Democracy is too prevalent in America, and claims the greatest attention to prevent its increase.”[3]

Gage’s lament seems to be the current mantra of the Republican Party, as it seeks to suppress voting and clear the streets of peaceful citizens assembled to petition the government for redress of grievances, a right explicitly—if inconveniently for those in power—enshrined in the Constitution.

As for Captain Preston and his grenadiers, a jury of non-Bostonians (chosen for their presumed lack of bias) took just three hours to acquit them of murder. Two were found guilty of manslaughter but did not suffer the usual sentence of death. Instead, their thumbs were branded: should they ever commit another crime, the consequences would be dire indeed.

So what do the ghosts of that bloody history whisper to us now?

First, that language matters. The words with which we describe a thing can be accurate or misleading, are often fraught, and hardly ever are neutral. As soon as the event on King Street was popularly labeled a “massacre,” the Patriots had a rallying cry as potent as “Remember the Alamo!” It turned an event into a story with a clear moral, removed ambiguity, assigned fatal blame, and demanded justice. 

Likewise, it matters whether we describe a peaceful assembly as a “demonstration,” a “protest”— or a “riot.” The terms escalate in their degree of danger and violence. The first requires official forbearance, the second forbearance with caution against possible escalation, and the third warrants heavily armed police with shields and the apparatus of violence. 

The people assembled on June 1, 2020, in Lafayette Square, across from the White House, stood firmly in the first category. The police and National Guard were the rioters, instigating violence in a previously peaceful arena using tear gas—which is banned as inhumane by the Geneva Conventions. “Tear gas” sounds relatively benign, the kind of thing that will make your eyes water for a while. But it can damage the lungs, cause respiratory distress, and in this era of pandemic, fatally compromise the health of its victims.

“Batons,” so genteelly named to conjure images of drum majorettes, are actually clubs with which to beat people into submission. So-called “beanbag rounds,” fired from shotguns, have broken a man’s head open. “Stun grenades” or “flashbangs” routinely cause temporary hearing loss, have started fires, and have triggered heart attacks.

Second, whatever you bring to the event will get used. If Ebenezer Richardson, the Customs man, had not had a musket handy, an eleven-year-old boy would have lived to see another day. The gang of boys would likely have gotten bored and left.

The grenadiers on King Street—and grenadiers were recruited for their size and strength to serve as shock troops, not to finesse their way out of confrontation—had their own muskets, and sooner or later they were bound to be fired. When police march onto the scene of a demonstration geared up with heavy firepower and protective vests and shields, they will likely find the riot they are equipped for and use their arsenal.

Third, frame the situation accurately. Enlightened military planners do this routinely: What are we facing? What are the facts on the ground? What outcome do we want, and how best can we achieve it? 

I wonder, for instance, what Captain Preston hoped to achieve on that snowy night. Why didn’t he just pull the lone sentry indoors and let the weather eventually disperse the crowd before it became a “mob”? For that matter, what did General Gage expect to happen when his 2,000 troops invaded the homes of ordinary Bostonians, most of them not part of the firebrand Patriot movement? His own officers warned him that such a provocation could only have a bad outcome, in fact might accomplish the opposite of his purpose by uniting the city against him and his troops.

Because fourth, the mindset of those in authority—and those they send to do their armed bidding—matters. Soldiers, like police, are trained to stand their ground. In the words of our own Secretary of Defense, Mark Esper, they must “dominate the battle space.” But crowds are not armies, and there is no battle space until it is created by confrontation with an opposing military force. Lexington and Concord were just peaceful farming towns until two armed forces determined to make them battlegrounds. Boston was just an unruly city, still part of a British colony.

And let’s be clear: American citizens protesting in American cities inhabit civic space—not battle space. There is no earthly need for a civic space to be cleared simply for the sake of clearing it and asserting dominance. Yet again and again, we see it happening exactly that way, because of the way an increasingly militarized police force is trained. From “To Serve and Protect” we seem to have evolved to a place of “Occupy and Dominate,” as if citizens were not the clients of police but their enemies in an occupied zone.

And as bad as the police mindset has too often become, the military is even worse as the guarantor of civic order—as many prominent military leaders have made clear. Troops are trained to subdue the enemy with force, and they are granted the extraordinary license to kill the enemy to make this happen—not the ideal recipe for guarding Americans’ constitutional right to petition for redress of grievances in the streets.

Fifth, real-life violence always comes as a shock to its victims. The eleven-year-old boy throwing dirt clods at the customs informer’s house surely never expected to be torn apart by a lead musket ball. Crispus Attucks and the others on King Street were probably used to brawling—but they hardly expected to be ripped apart by volleyed musket fire in their own hometown.

They should not have been so surprised, because organizations behave according to their training and habits and use whatever tools or weapons they bring to the situation. When we witness the extraordinary and unprovoked violence unleashed on unarmed citizens by police and soldiers on the streets of America, we are shocked to discover the violence of their habits and training. Yet it was always there, like the tear gas and stun grenades in their lockers, waiting to be used. Previously it was used on a select vulnerable population, off-camera. Now it is center stage, happening on a grand scale in broad daylight to citizens of all races, ages, and backgrounds. It is happening to journalists even as their cameras are rolling on live TV.

Finally, the ghosts tell us, images are forever. Paul Revere’s print, made from his engraving of Pelham’s depiction, survives today as the definitive visual of that event. What we are witnessing in the streets of America today is also a reaction to a horrific image—in this case a video of a slow-motion murder that plays out for almost nine agonizing minutes. That galvanizing image will forever haunt our nation. And like the Paul Revere triptych, it is woven into a narrative, connected to a train of other images, all of them frames in a dark movie about an America whose existence we have denied for far too long: the postcards of picnickers at lynching sites; Emmett Till’s ruined face in his casket; Rodney King beaten and beaten forever by the side of a freeway; and now the myriad new images of police beating and shooting and tear-gassing our neighbors.

When the grenadiers on King Street were taunted and hit by snowballs, they defaulted to their basic training and showed their true colors: they were indeed willing to shoot and kill their American cousins—to treat them as the enemy. 

Even as the worst of our leaders repeat the blundering, provocative, divisive policies of General Gage, far too many of our police and National Guard have shown us their true colors. They are indeed willing to treat their fellow Americans as the enemy.

They have become the redcoats.

 

[1] The Boston “Massacre,” Historical Scene Investigation (H.S.I.), College of William and Mary. https://hsi.wm.edu/cases/boston/boston_documents.html#doc2. See also David Hackett Fischer’s vivid account in Paul Revere’s Ride, Oxford University Press, 1994, pp. 23-25.

 

[2] Quotes from The Boston Gazette and Country Journal, March 11, 1771. https://www.masshist.org/dorr/volume/3/sequence/458.

[3] Gage to Barrington, Aug. 5, 1772, cited in Fischer, p. 379.

The Lenin Plot: The Concealed History of the US-Led Effort to Overthrow the USSR

With the United States again accusing Russia of election interference, and Moscow again accusing Washington of rattling sabers on the Russian border, there’s been talk of a “new” Cold War. But what exactly does that mean? What happened to the “old” Cold War?

 

For years the prevailing narrative said that the Cold War against the Soviet Union started shortly after the end of World War II. Financier Bernard Baruch coined the term in a speech he made before the South Carolina legislature in 1947. He warned that a “new kind of war” was being fought “in which guns were silent; but our survival was at stake nonetheless.” As it turned out, the guns were not silent, and the Cold War went on to include hot wars against Soviet surrogates in Korea, Viet Nam, Cuba, and other trouble spots. And it wasn’t just a shooting war. It was also an attempt by each side to defeat the other side politically, economically, and culturally. The difference was that the Cold War was not an officially declared war, as both the world wars had been.

 

But the true origin of what President Kennedy called “the long twilight struggle” goes back much further, as I explore in my new Cold War history, The Lenin Plot: The Untold Story of America’s War Against Russia, published by Pegasus Books in New York and Amberley Publishing in the U.K. The Lenin Plot was a two-pronged operation: (1) invade Russia and defeat the Red Army, and (2) stage a coup in Moscow, assassinate Soviet dictator V.I. Lenin, and get the country back in the war.

 

The plotting began shortly after Lenin seized power from the Provisional Government on October 24, 1917. Lenin called it the “Bolshevik Revolution” and the “Great October Socialist Revolution.” But it wasn’t a true revolution, a general uprising of the country. That had already occurred, in February 1917. Russian historians now see Lenin’s takeover as a military coup. 

 

The Western Allies were alarmed at the bloodbath the Bolsheviks were conducting against innocent civilians. The Provisional Government had declared political amnesty for all, but the paranoid and vindictive Lenin wanted his enemies exterminated. In the Western view, Lenin’s coup was actually a counterrevolution that returned Russia to old tsarist days of widespread terror, torture, and mass murder. 

 

The Allies were also alarmed by Lenin’s secret deal with Germany. Berlin had sent millions of marks to Lenin’s agents in Stockholm, and they laundered the money and passed it along to the Bolsheviks to finance a coup against the Provisional Government. In case of success, Lenin would take Russia out of the war, allowing Berlin to redeploy army divisions to the Western front, the main battleground. Speaking of this deal, Lenin said, “We would have been idiots not to have taken advantage of it.” 

 

Secretary of State Robert Lansing, a bored pacifist who usually sat doodling in Cabinet meetings with President Wilson, sprang into action and used the State Department as a bully pulpit to demand immediate action against Lenin. Lansing told Wilson that the United States had to stage a coup in Moscow and install an Allied-friendly “military dictatorship.” Lansing suggested U.S. funds be sent to the French and British as military assistance, and they could launder it for use in the plot against Lenin. 

 

“This has my complete approval,” Wilson told Lansing in December 1917. 

 

De Witt Clinton Poole, a young U.S. consul in Moscow and a former tennis star nicknamed “Poodles” at the University of Wisconsin, was sent on a secret mission to recruit a Cossack army in South Russia. But Poole found the generals down there too antagonistic toward one another to mount a coordinated attack on the Bolsheviks. Poole returned to Moscow without a new Caesar, but the fledgling Lenin Plot did not die. It merely segued into 1918.  

 

Poole became Washington’s spymaster in Russia. His chief field officer was Xenophon Kalamatiano, a University of Chicago track star who had sold tractors in Russia before the Allied embargo. Kal recruited dozens of assets, including the head of the Red Army’s communications office. Poole and Kalamatiano sent their reports to U.S. ambassador David Francis, a bourbon-sipping old Confederate who forwarded them to the State Department’s Bureau of Secret Intelligence, predecessor to the CIA and NSA. 

 

One of Kal’s closest spy colleagues was Henri de Verthamon, a French saboteur who wore a black trench coat and beret, and slept with his explosives under his bed. Another was the impressively named Charles Adolphe Faux-Pas Bidet, who had worked the Sûreté’s case against Mata Hari. The British Secret Intelligence Service (later MI6) was represented by Sidney Reilly, a freelance Russian adventurer and drug addict who had visions of himself as another Napoléon. The British Foreign Office sent Bruce Lockhart, a footballer susceptible to the charms of exotic women, one of whom, Maria Benckendorff, was a triple agent serving Britain, Germany, and the Soviets. Then Boris Savinkov, an experienced Socialist Revolutionary terrorist, was added to the Western plot. Savinkov was also a drug addict; he saw himself as a Nietzschean Superman immune to bullets. He and Reilly advanced the conspiracy from a simple capture of Lenin to an assassination plot. 

 

Allied forces invaded Russia and fought the Red Army in an attempt to support the Moscow plotters. But Wilson and Prime Minister Clemenceau made a mistake in placing U.S. and French troops under British command. Most of the British officers were mental or physical rejects called “crocks” or the “hernia brigade.” They resented being shuffled off to a “sideshow” like Russia, and took out their anger on the American and French troops under them. The crocks arrived with 40,000 cases of Scotch whiskey, and their drunken incompetence caused Allied battlefield deaths. The Yanks and poilus retaliated by staging mutinies against the British. One doughboy walked up to a crock, told him to say his prayers, and shot him dead. Sanity finally arrived after the British commander was sacked and replaced with Brigadier General Edmund Ironside, a decorated officer from the Western front. The troops loved him. 

 

Lenin was shot and seriously wounded by Fanny Kaplan, a hardened Socialist Revolutionary terrorist. Allied agent Savinkov said he gave Kaplan her pistol. The shooting caused a dramatic escalation of the Red Terror, resulting in thousands of deaths. Thousands more casualties were counted in the combat zones. 

The Lenin Plot was a massive embarrassment for the Allies, and they tried to cover it up. The denial continued for years. President Roosevelt said a “happy tradition of friendship” had existed between the U.S. and Russia “for more than a century.” President Reagan in a television address said “our sons and daughters have never fought each other in war.”

 

 *   *   *

 

I first found out about the Lenin Plot when I was a student at Tulane. I met a gentleman at the university library who had known some young men in Paris in the twenties who served in the war against Russia. I’d never heard of that. I started checking.

 

The internet was a primitive tool at that time, so I consulted bound volumes and microfilms of the London Times, the French L’Illustration, and the Literary Digest, an American news weekly. The Times’s coverage stood out because its Russian articles were written by historians and former military officers. The newspaper also published an encyclopedia, The Times History of the War, which covered the Russian campaign in detail.

 

A few other histories of Allied involvement in Russia had been written, and some were helpful in providing leads. But I learned to distrust a lot of “scholarly” research because many of the writers simply rewrote one another without verification. That’s hearsay, not original research.

 

Internet research is easier now. Old publications have been digitized and posted to the net, and I found a number of interviews that way. The Hoover Institution at Stanford and the national archives in Washington, Paris, and Kew, England, provided copies of many documents not available on the web.

 

But beware of censorship on the internet. The State Department on its website admits that certain documents have been “edited” before being posted, in order to “avoid impeding current diplomatic negotiations.” To get around that, I verified documents by using bound volumes of the State Department’s Foreign Relations of the United States, published long before the internet. Bound volumes of the Readers’ Guide to Periodical Literature also contained much valuable information not available on the web.

I shy away from websites with “wiki” or “.com” in the title. They, too, tend to run unverified information. I use “official” documents and the web only as a starting point, as a clue to what really happened. I recommend that researchers contact libraries, archives, and presidential libraries, and look for letters, notebooks, diaries, interviews, memoirs, autobiographies, photographs, and eyewitness accounts. You’ll find the truth only by persistent digging off the grid.

The Battle of Salamis Opened the Door for Ancient Greece’s Golden Age

Battle of Salamis, Wilhelm von Kaulbach, 1868

Twenty-five hundred years ago, in the Battle of Salamis (dated to September 480 BCE), the ancient Greeks defeated the invading Persians and paved the way for Greece’s Golden Age of the 5th century BCE, a foundational period for Western Civilization. 

 

By the late 6th century BCE, the Persians had come to dominate numerous peoples and reigned as the superpower of the era. At its height, the Persian Empire consisted of twenty provinces and stretched from the Indus River in the east to northern Greece and Egypt in the west.

 

At this time ancient Greece, or Hellas as the Greeks called it, consisted of some 1500 city-states spread across the Greek mainland, the Aegean Sea islands to the east, and Sicily and southern Italy to the west. The most important and powerful of these were Sparta, a highly regimented city-state (polis) with a mixed political system and an invincible army, and Athens, a democratic polis with the largest population and navy in all of Hellas.

 

As the Persian Empire expanded westward into Asia Minor (current day Turkey), it came to dominate a number of Greek city-states on its western coast and on the islands in eastern Aegean Sea. In 499 BCE, this domination became intolerable to some city-states and they rebelled, calling on other Greeks for assistance. Athens responded and provided support. Though the revolt was suppressed, King Darius of Persia never forgave the Athenians for their audacity in challenging him. Legend has it that at dinner he ordered a slave to say three times: “Master, remember the Athenians.”

 

Persia had launched two earlier expeditions, neither of which brought success. The first, in 492 BCE, proved disastrous. The second, in 490 BCE, ended in a stunning victory for the Greeks, led by Athens, at the Battle of Marathon. (Our current-day marathon is 26.2 miles because this was the distance that the messenger, Pheidippides, ran from the battle site of Marathon to Athens to announce the victory.) 

 

In 480 BCE, Persia, now led by Xerxes, renewed its campaign with overwhelming force. The ancient historian Herodotus indicated that 300,000 Persian allied forces crossed the Hellespont into northern Greece and faced Greek forces perhaps one-third that size. In his play The Persians, the Greek playwright Aeschylus, who fought in the battle, indicated that the Greeks had 310 ships facing a Persian allied fleet of 1207 ships. 

 

After defeating the Greeks, led by Leonidas and 300 valiant Spartans, at the Battle of Thermopylae, the Persian force marched south to Athens, now essentially evacuated, and sacked it. Most of the Athenians and other unconquered Greeks had withdrawn to the island of Salamis or manned the Greek fighting ships, the triremes. 

 

While the Spartans argued for withdrawal and the defense of the Peloponnesian Peninsula, the Athenian leader Themistocles won the debate on strategy. His plan for defeating the Persian navy was simple: lure the large Persian fleet northward into the narrow strait by feigning withdrawal, neutralizing its superior numbers, and then attack. 

 

To set the hook, he arranged for a slave, Sicinnus, to give the Persians false information: The Greeks were squabbling and were in disarray. They planned to withdraw the next day. Eager for victory, Xerxes took the bait. 

 

On September 29, 480 BCE, the Persian fleet—its rowers already in action for 12 hours—advanced into the trap. In his play, Aeschylus relates the action at dawn:

 

 “…first there came from the Greeks the sound of cheerful singing, and the island rocks loudly echoed it. Fear struck all the Persians who had been disappointed in their hopes. For the Greeks were not singing their hymns like men running away, but like men confidently going into battle. The noise of the war-trumpet on their side inflamed them all.”

“It was possible too to hear shouting: ‘Sons of the Greeks, forward! Liberate your country, liberate your children, your wives and the temples of your gods, and the graves of your ancestors. The fight is for everything.’” 

He also paints the picture of the utter defeat of the Persians. 

 

“The sea was full of wreckage and blood. The beaches and the low rocks were covered in corpses. Every ship rowed in a disorderly rout, every one of the Persian fleet. … Wailing and shrieking covered the sea until dark night put an end to it. I could not finish telling you of the terrible happenings even if I were to relate them for ten days. Of the one thing you can be sure, never in one day did such a multitude of men die.”

Xerxes observed the action from the heights above the strait. Aeschylus envisioned his reaction to the disaster. 

 

“Deep were the groans of Xerxes when he saw this havoc; for his seat, a lofty mound commanding the wide sea, o’erlooked his hosts. With rueful cries he rent his royal robes, and through his troops embattled on the shore gave the signal for retreat.”

 

Salamis has come down to us as a key event in the early history of Western Civilization. If the Greeks had succumbed and come under the Persian “barbarian” yoke, ancient Greece probably would not have experienced its Golden Age in the 5th century BCE, with all its achievements: scientific inquiry of the natural world free from religion, philosophy, architecture, sculpture, mathematics, organized athletic competition, the realization of the world’s first democracy and the enrichment of the idea of freedom. 

 

Charles Freeman, in his book The Greek Achievement: The Foundation of the Western World, gives the Greeks due credit for the victory. However, he argues that it was the land Battle of Plataea, the following year, which was more decisive. “It had dislodged the Persian forces from Greece and sent them home in humiliation and so, possibly, had changed the course of European history.” This is true; however, without the decisive naval battle of Salamis there would have been no decisive land battle of Plataea. 

 

The Greeks have been celebrating the anniversary of this battle, including a staging this summer of the play The Persians at the remarkable ancient amphitheater at Epidauros, which I was lucky enough to visit fifteen years ago.

 

Independent journalist John Psaropoulos witnessed the play and noted that the audience erupted in applause when the Persian queen Atossa asked of the Greeks, “Who is their master and commander of their armies?” The chorus leader answered: “They call themselves nobody’s slaves, nor do they obey any man.” 

 

Contributing editor Fred Zilian (zilianblog.com; Twitter: @FredZilian) teaches Western Civilization and politics at Salve Regina University, RI.

Winners, All: A Personal History of Soldiers at War

According to multiple, increasingly unimpeachable reports, President Donald Trump disparaged those who served in the military, including the wounded and the dead who defended the United States. He called them “suckers and losers.” He indicated they were fools to have served their country. 

 

When I went to Vietnam in 1966 as bureau chief for NBC News, I thought my assignment would be straightforward: run the bureau, decide which stories correspondents and cameramen would cover, ship those stories on time, be the best I could be in competition with CBS and ABC, and keep an accurate set of books. My bosses in New York asked nothing less from me than to cover a war that consumed America. I had a rotating staff of Americans, Vietnamese, Koreans, Japanese, French, British and Germans.  

 

Sometimes members of my staff got hurt, suffered minor wounds, or came down with illnesses that put them into hospital for care and recovery. That meant I put on yet another hat, that of visitor to the American Army 3rd Field Hospital at Tan Son Nhut Airbase just northwest of Saigon. 

 

Because of the generosity of the American military, and the respect given to NBC News, when someone on my staff got ill or hurt, I was able to secure for him a bed at 3rd Field. When I had someone in hospital I always visited him, sometimes three times a week. On those visits I learned more about our fighting men than from anything I saw in combat. With as many as 1200 beds to service Army, Marines, Navy and Air Force wounded in combat, 3rd Field was a microcosm of the men who fought and suffered in the war.  

 

It was there that I got my real baptism of fire in the war. It was there that I saw the results of the war, mostly in the horrific wounds that we did not often report because we never got around to it properly. It was there that I learned about the unflagging spirit of young men who would never be the same because of their serious wounds. The first time came after visiting one of my staff who was recovering from an attack of malaria. As I was leaving, a nurse approached me and said, “We are really shorthanded today. Can you help us feed the men?” I wanted to say no, but I could not refuse her request and said yes. There it was, a new role. I had become a volunteer to help the staff with their duties. It would become something I did after every visit to the hospital. Here is an excerpt of that first experience, as edited, from my oral history, “The Soldiers’ Story.” 

 

“As we spoke, doctors, medics, and nurses were on the incoming ramp outside the hospital, receiving a large number of severely wounded men who had just arrived from an ambush near the Cambodian border. True, as usual, the hospital did not have enough staff. The recently heavy influx of wounded demanded the staff’s full attention. It needed help. I suddenly became part of what the hospital needed. 

 

"It was lunchtime, the hour to feed the men on the ward. A medic brought me to a kitchen door. He gave me a small trolley loaded with food and an apron, and handed me a list of names to go with each tray. He then started me down a wide aisle with a long row of beds on each side. It felt like the inside of a World War II Hollywood movie—only this was real. One row of beds ran along the outside wall, which had large windows with white adhesive tape in crisscross patterns to prevent flying glass if bombs or rockets hit the building. The other row lined up against the inside wall, with a seriously wounded man in each bed. I planned to open each man's tray table, swing it up, around, and over his prone body, hand him the tray, and walk away. That proved unrealistic and impossible. Some of these men had no hands, no arms, no legs. They had so many serious wounds that they could not eat without help. It was the middle of 1967. I had been in Vietnam more than a year, and I had seen my share of horror. But being in the presence of so many wounded in one place was very difficult. As I marched down the aisle distributing trays of food, I saw that I had to feed many of the men. Some were patient; others were not. One man, more a boy of less than twenty, his body swathed in white bandages, lay unmoving. But his eyes were bright—they burned with life’s fire. And he could talk.

 

“Hey, man, over here. Don’t ignore me!”

 

"I stopped and turned to look at him. There seemed to be so little of him left, but he was still alive. Here was a young man who had held out for life when faced with almost certain death. The futility surrounding his future would come much later in his recovery. Now he was in charge, and he demanded service.

 

“Get that food over here. I’m hungry. I want to eat. Feed me.”

 

I moved over to him, unwrapping the tray as I approached his bedside. Wrapped in bandages and a plaster cast from his head to his toes, he resembled a mummy from a 1930s film. There were two black holes for his eyes, two black holes for his nostrils. His mouth was a larger black hole in his white-bandaged head. So I fed him. One spoonful at a time. Spoon by spoon. Slowly.

 

“More,” he said.

 

“Faster,” he said.

 

He demanded attention, and I readily complied. Then his tray was empty. There was no more food. His glass of water was empty. He could suck nothing more through his straw. There was nothing more for him to drink.

 

“Good, man,” he said.

 

He sighed deeply and was quiet. I moved away and distributed the rest of my trays. This was gut-real. War is mostly what is in front of you at the moment. War for me then was the seemingly hopeless situation of that blond-haired youth. But he was not helpless. I learned that, though badly wounded, these young men's individual spirits were strong, and that they had an enormous gusto for life." 

 

These wounded men, nearly all of them young, did not make the war they signed up for. After seeing combat, many did not want to be in Vietnam any longer than they had to be. Many complained, but they stayed the course and finished their enlistments or the terms of their draft. The wounded men I saw at 3rd Field Hospital and talked to week after week, year after year, were something special. Most had an enormous spirit and a gift for life unlike anything I had ever seen. No matter how seriously wounded, they belong in every parade. They should never be out of sight. We must never forget who they were, and who they now are. Whether or not they knew it at the time, they were the spirit of America. Today they still are. Winners all, they were not then suckers, losers, fools or mugs.  

Thu, 22 Oct 2020 07:11:29 +0000 https://historynewsnetwork.org/article/177725 https://historynewsnetwork.org/article/177725 0
Paris, City of Dreams: Napoleon III, Baron Haussmann and the Creation of Paris

Avenue de l'Opera, Camille Pissarro, 1898

It has been a long, long wait.  After years of dreaming and months of planning, your first trip – or perhaps equally anticipated return trip – to Paris fell apart due to the global outbreak of COVID-19.  When you cancelled your airline tickets and hotel, the chance to walk the cobblestone streets of the “City of Light” vanished with a sigh and considerable disappointment.  Yet you are strong and remain hopeful for the future.  Imagine: One year and some months have passed, and a vaccine has allowed the world to rediscover its passion for international travel.  While emerging from the underground Metro near the Eiffel Tower, you contemplate your good fortune and brim with excitement.  You are not only alive…but you can now fully live again.

On an elevator to the top of the “Iron Lady” – a popular nickname for the Eiffel Tower – you glance at the mesmerizing views of the city and become aware of the diversity of your companions.  Surrounded by people from Brazil, the Netherlands, Saudi Arabia, Senegal, South Korea, Spain, Taiwan, Thailand, Venezuela and elsewhere, it seems the entire planet has convened in Paris to celebrate the end of the COVID-19 era.  After purchasing a delicious made-to-order crepe from an outdoor kiosk near the Louvre, you take a short stroll and wait in line for twenty minutes to sip one of the finest hot chocolates in Europe at Angelina – a world-famous chocolatier and tea house established in 1903 on the Rue de Rivoli.  

While gazing at the chic Parisian elite at Angelina, two questions come to mind: 1) When was modern Paris constructed? and 2) Who was responsible for planning the streets and developing the distinctive white apartment buildings that define this stunningly beautiful city?  In her new monograph, Paris, City of Dreams: Napoleon III, Baron Haussmann and the Creation of Paris (2020), historian of France Mary McAuliffe has delivered a highly engaging and enjoyable narrative of how the “City of Light” achieved its distinctive architectural and structural character – a book recommended for anyone visiting Paris.

Visions of Grandeur: A New Nation, A New Paris

In early 1848, waves of revolt and revolution swept through the German states, the Italian states, the Austrian Empire, Denmark, France and elsewhere across Europe.  Decades and centuries of fossilized, monarchical regimes were besieged with demands for popular representation in government.  On 10 December, the newly-proclaimed Second Republic witnessed the election of Louis Napoleon – the nephew of Napoleon Bonaparte – to the presidency.  Paris at the time was anything but a majestic metropolis: contaminated water, dirt, suffocatingly narrow streets and dilapidated, overcrowded housing teeming with restless citizens defined the city.  At the outset of his term, the French president developed a strategic vision to transform the capital into a worthy symbol of the nation by recreating the city to enhance the beauty and the lives of its proud inhabitants.  Near the conclusion of his constitutionally-mandated single term in office, Louis Napoleon orchestrated a coup d’état in 1851, ending the Second Republic, and proclaimed himself Emperor Napoleon III the following year.  As McAuliffe deftly notes, the president-turned-monarch began the recreation of Paris in 1852 by partnering with two Jewish brothers, Emile and Isaac Pereire, and their banking house, Crédit Mobilier – a rival to the financial empire of the Rothschilds.  In the Pereires, Louis Napoleon secured the requisite means to underwrite the reconstruction of Paris through bond issues to raise capital.  

To accomplish the vast undertaking of revamping Paris and lifting more than 600,000 of its residents out of squalor (out of a population of one million), Louis Napoleon selected the prefect of the Gironde, based in Bordeaux – Georges-Eugène Haussmann.  Louis Napoleon and Haussmann fostered a complementary working relationship and set about bringing to fruition the urban dream Napoleon Bonaparte had proclaimed when he famously stated, “I intend to make Paris the most beautiful capital in the world.” (p.51) From chapters three to thirteen, McAuliffe follows the three “systems” of development unleashed by Haussmann.  The first phase or “system,” which focused on the nucleus of Paris, commenced with the elongation of the Rue de Rivoli and continued with a significant reconfiguration of the Latin Quarter, the design of an exquisite park (the Bois de Boulogne) and the establishment of large markets at Les Halles.  Haussmann’s attempt to add kilometers to the Rue de Rivoli stalled due to the disparate, steep grades of the streets, but he remained undeterred and promptly overcame the conundrum by elevating “most of the surrounding neighborhoods” to the same level with large-scale engineering tactics.  In 1858, Haussmann launched his second system – a vast and bold undertaking to sweep away considerable portions of the city and turn his urban vision of grandeur into a quotidian reality for Parisians of the nineteenth century and beyond.  On both the Right Bank (the northern side of the Seine) and the Left Bank (the southern side), Haussmann lengthened and widened streets, demolished old buildings, installed new sewage and water systems, remade the Île de la Cité – the small island in the Seine containing Notre-Dame cathedral (completed c. 1260) – planted new trees and added water fountains around the Avenue des Champs-Élysées, and redrew the boundaries of the arrondissements – the districts of Paris.  Most significantly, Louis Napoleon’s grand architect redefined Paris by constructing a plethora of simple yet elegant, off-white apartment buildings that still stand and exude the romance of the city today. 

 

Apartment Block on Boulevard Haussmann

Despite a public backlash against the steep cost of the project by the time of the third system in 1861-62, Haussmann doubled the size of Paris, increased its population, engineered a far more livable city for its residents and ultimately won over many of his critics with his aesthetically inspiring designs.  For a number of artists and intellectuals, however, Haussmann symbolized empty notions of progress at the expense of Parisian communities with ties to the ancient past. (p. 116-171)

New Artists in Old Paris

Through each chapter, McAuliffe intersperses the architectural remaking of Paris with biographical sketches of a new generation of artists and literary savants.  The novelist Victor Hugo, who had sided with the forces of order against the masses during the large-scale revolt by workers in June 1848, changed course upon the brazen consolidation of power by Louis Napoleon in a successful coup.  Beyond launching a “small resistance committee,” Hugo picked up his talented quill, issued a virulent broadside and accused the duplicitous monarch of being someone who “lies as other men breathe.” (p.10-14, 36-37) Beneath the façade of the newly-remade, bourgeois Paris, Hugo published the first segment of a literary masterpiece on the lives of the destitute and the working poor – the largest segment of the Parisian population – on 3 April 1862.  By the end of June, copies of Les Misérables had sold out at booksellers across the city. (p.166-171) His tale of Parisians struggling to survive day to day offered a riveting and eloquent contrast to the Paris of Louis Napoleon, Haussmann and the elites.  

In reaction to the triumph of Louis Napoleon and his authoritarian regime, Hugo, the female novelist George Sand, several emerging artists, including Édouard Manet, Claude Monet and Berthe Morisot, and a coterie of young intellectuals gathered in Montmartre (an area largely untouched by Haussmann) and on the Left Bank near the Sorbonne and produced a vibrant, countercultural alternative to the new order defined by kleptocratic power and crass materialism.  Indeed, their brilliant, soul-fulfilling work continues to flourish in academe, on theatrical stages and in the art world today.

Conclusion 

In Paris, City of Dreams: Napoleon III, Baron Haussmann and the Creation of Paris (2020), Mary McAuliffe has written a superb historical synthesis of scholarship on the period and thus provides a near-perfect introduction to the “City of Light” in the mid-nineteenth century.  Future visitors to Paris will also profit from reading two of the author's previously published books – Dawn of the Belle Époque: The Paris of Monet, Zola, Bernhardt, Eiffel, Debussy, Clemenceau, and Their Friends (2014) and Twilight of the Belle Époque: The Paris of Picasso, Stravinsky, Proust, Renault, Marie Curie, Gertrude Stein, and Their Friends through the Great War (2017) – to fully explore the evolution of Paris and of French society and culture from 1848 to 1914.  

It has been a long, long wait.  Hopefully, the world will receive a viable vaccine against COVID-19 in the coming months, and Paris – a true “City of Dreams” – will once again become a reality for millions of excited, historically curious travelers.

Thu, 22 Oct 2020 07:11:29 +0000 https://historynewsnetwork.org/article/177739 https://historynewsnetwork.org/article/177739 0
Like Lincoln, Biden at Gettysburg Urges Reunification

On October 6, Joe Biden gave a 22-minute speech near the famous battlefield of Gettysburg, Pennsylvania. He began it succinctly: “On July 4, 1863, America woke to the remains of perhaps the most consequential battle ever fought on American soil. It took place here on this ground in Gettysburg. Three days of violence, three days of carnage. 50,000 casualties wounded, captured, missing or dead. Over three days of fighting.” In November 1863, President Lincoln came to the battlefield to deliver the Gettysburg Address, which historian James McPherson called “the most famous speech in American history . . . only 272 words in length and took two minutes to deliver,” short enough to be reproduced on the walls of D.C.’s Lincoln Memorial. 

On his website Biden displayed the necessary humility, referring to his own speech as only “remarks,” not suggesting that they rose to the level of Lincoln’s Address. About the latter Biden said, “His words here would live ever after. We hear them in our heads, we know them in our hearts, we draw on them when we seek hope in the hours of darkness.” And yet, even though Biden’s “remarks” did not match the oratorical greatness of Lincoln’s Address, they were significant--and timely. 

Timely because Biden put himself forward, as he has consistently done this year, as the leader best equipped to unite our fractured nation. Of the many problems facing us, many exacerbated by President Trump, the extreme division separating the Trump supporters from the rest of us is certainly central. 

More than any of the many other Democratic candidates earlier in 2020, Biden stressed the need to heal our extreme and festering political divisions--sometimes, as early as 2019, even to the point of angering other Democrats for being too compromising. The proper balance between political passion, tolerance, and compromise is certainly difficult. But if Biden is correct that this divisiveness (and sometimes even hatred) is a central danger to our nation, then it could be argued, as I have done, that more than anyone else, “Biden has a better chance of unifying our nation and delivering positive long-range results.” 

In his Gettysburg speech, alluding to Lincoln’s House Divided Speech of 1858, Biden stated that “once again, we are a house divided. But that, my friends, can no longer be.” He warned of our shipwrecked state being “on the shoals of anger and hate and division.” 

Again citing Lincoln’s words, this time his Second Inaugural--“With malice toward none, with charity for all, with firmness in the right as God gives us to see the right, let us strive on to finish the work we are in, to bind up the nation’s wounds”--he pledged to “work with Democrats and Republicans,” to “work as hard for those who don’t support me as for those who do.” For our times of bitter rancor, he offered the balm of trying to “revive a spirit of bipartisanship in this country, a spirit of being able to work with one another.” (For lists of the large number of Republicans already opposing Trump and supporting Biden, including many conservative columnists, see here and here.)

Although Biden did not mention Barack Obama, the leader and friend he worked so closely with for eight years, his remarks also reflected the spirit of the former president. A spirit demonstrated in his keynote address at the 2004 Democratic National Convention, when he was still an Illinois state senator, in which he called for overcoming Red-state-Blue-state divisions, for overcoming “those who are preparing to divide us.” A spirit also demonstrated frequently as president, for example during his 2010 commencement address to University of Michigan graduates, when he told them, “We can't expect to solve our problems if all we do is tear each other down. You can disagree with a certain policy without demonizing the person who espouses it.” 

Unfortunately, however, this pragmatic president, temperamentally so well equipped to work with Republicans to achieve the common good, discovered little reciprocity from the likes of John Boehner and Mitch McConnell.

 

After Donald Trump succeeded Obama, matters got worse, in large part due to Trump’s belligerent style, so amply demonstrated in his first debate with Joe Biden. It is ironic that many conservatives support Trump, yet Biden seems to grasp far better than he does the truth of the words of one of the fathers of U.S. conservatism, Russell Kirk (1918-1994): “The prudential politician . . . is well aware that the primary purpose of the state is to keep the peace. This can be achieved only by maintaining a tolerable balance among great interests in society. Parties, interests, and social classes and groups must arrive at compromises, if bowie-knives are to be kept from throats.” 

 

Proceeding further in his speech, Biden linked many of our other most pressing problems to our national divisiveness, to our extreme partisanship. One of these problems is racial injustice . . . “the product of a history that goes back 400 years, to the moment when black men, women, and children were first brought here in chains.”  Recalling recent “peaceful protests giving voice to the calls for justice,” Biden also mentioned “examples of violence and looting and burning that cannot be tolerated.” But unlike President Trump, who stresses law and order but not racial justice, the former vice president stated that “we can have both,” and that our country needs “leadership that seeks to deescalate tensions, to open lines of communication, and to bring us together.”

He also linked the over 200,000 coronavirus deaths we have suffered to “the deep divisions in this country.” Wearing a mask, social distancing, testing, and developing a vaccine should “follow the science,” he said, and not be politicized. Echoing the rhythm of Obama’s 2004 Democratic Convention keynote address, Biden added, “The pandemic is not a red state versus blue state issue. The virus doesn’t care where you live or what political party you belong to.”

Finally, Biden targeted “the divisions in our economic life that give opportunity only to the privileged few. America has to be about mobility,” the type that enabled Lincoln, a child of the frontier, to “rise to our highest office.”

Throughout Biden’s speech a can-do, optimistic spirit prevails. It emulates not only Lincoln’s words, but also those of Franklin Roosevelt and Obama. 

In his first inaugural address (1933), coming near the height of the Great Depression, FDR said, “This great Nation will endure as it has endured, will revive and will prosper. So, first of all, let me assert my firm belief that the only thing we have to fear is fear itself—nameless, unreasoning, unjustified terror which paralyzes needed efforts to convert retreat into advance.” 

Similarly, in his first inaugural address (2009) in the latter stages of the Great Recession, Obama spoke of being in the midst of crises that included a “badly weakened” economy, lost homes, “jobs shed, businesses shuttered,” costly health care,  energy policies that “threaten our planet,” a “sapping of confidence across our land,” and “a nagging fear that America's decline is inevitable, that the next generation must lower its sights.” But Obama assured the nation that these challenges “will be met,” that our nation will choose “hope over fear, unity of purpose over conflict and discord.”  

More than a decade later with our inner political conflict and discord worsened by eight years of Trumpism, Biden at Gettysburg urged us to “talk to one another,” to  “respect one another,” to “love each other.” He promised to be a president that would “embrace hope, not fear. Peace, not violence. Generosity, not greed. Light, not darkness.” A president that followed the example of “Lincoln and Harriet Tubman and Frederick Douglass,” that represented an America that “welcomed immigrants from distant shores,” and broadened opportunities for women, minorities, and gays. A president that embraced “the dreams of a brighter, better, future.”

Near the end of his speech Biden once again echoed the spirit of Obama’s 2004  keynote address at the Democratic National Convention. “We can,” said Biden, “end this era of division. We can end the hate and the fear. We can be what we are at our best: the United States of America.”

Thu, 22 Oct 2020 07:11:29 +0000 https://historynewsnetwork.org/article/177741 https://historynewsnetwork.org/article/177741 0
My Wish for Trump

Steve Hochstadt is a writer and an emeritus professor of history at Illinois College.

Trump has COVID. What do I wish for him?

 

That question provokes a variety of verbal contortions. Official Democrats wish him and Melania well. A conservative, Ross Douthat, plays defense, blowing up a bit of evidence into an assumption that any other President would have made the same early mistakes. The more liberal Nicholas Kristof wants to abstain from offense, saying the main thing to do now is to avoid snark. David Barash at Daily Kos is refreshingly brutal. He won’t wish Trump well, and he’s right in everything he says. I’m sure there will be many more efforts to publicly acknowledge the emotional, moral, and political battle between our better and worse angels.

 

We have been schooled to believe we should always wish the best to everyone, even to a man who epitomizes hate toward his opponents. When Hillary had the flu, Trump mocked her. Our President in a time of plague is the greatest source of public misinformation about it, say Cornell University researchers who studied 38 million articles about the pandemic. If anyone deserved to get coronavirus, it’s Donald Trump. But let’s still play nice.

 

I won’t play nice, but I don’t want Trump to die, or even become deathly ill. That would not just be bad for him, but bad for my wishes for our national future. I want Trump to get well, to live many years beyond the end of his Presidency.

 

The last thing we need is for the Republicans to be able to validate his pose as the ultimate victim, so they can transform Trump into a martyr, even if he is a martyr to his own stupidity.

 

I am gleeful at the prospect of dozens of tell-some books by those who were present for Trump’s outrageous behavior, who heard what he said. Publishers will dangle millions of dollars in front of people who have thus far demonstrated little spine or conscience. Some of what they say will stick to his image like obscene Post-Its.

 

I look forward to countless court cases around Trump, a later life spent defending his whole life thus far. Eventually the accumulation of evidence and judgments will prove to any reasonable person that he was and is a crook, a fraud, a failure in everything but inherited privilege. I recognize how many unreasonable people there are in America, who could never be convinced of any truth about the object of their idolatry. Their fantasies will disappear into the dustbin of history, then reappear in some other guise as another generation of deluded souls gets taken in by the latest con. But some of the Americans who were duped by the greatest con man of our lives will eventually realize that they had no idea what was really going on. The history books will paint a damning portrait of Trump.

 

I look forward to the Republican Party explaining how it became Trump’s slave and where it is going now. The Never Again Trumpers still have a lot of squirming to do about their role in creating such low-hanging fruit for their most dangerous adherents. We all need to confront our participation in systemic racism, but most of all the systematically racist Republican Party. That could bring us a little closer to a just society.

 

That could only happen if the ultimately privileged Mr. Trump has a mild case of this disease and recovers quickly enough to continue his reign of terror on the country of his birth for just a few more months. Then I look forward to the crash.

 

I want Trump to get better, but I don’t wish him well.

 

Steve Hochstadt

Jacksonville IL

October 6, 2020

Thu, 22 Oct 2020 07:11:29 +0000 https://historynewsnetwork.org/blog/154415 https://historynewsnetwork.org/blog/154415 0
Trump's Opportunities to Thwart Democracy

Americans keep asking, “Can it possibly get worse?” and each day we discover that it can. During the first presidential debate, Donald Trump refused to commit to a peaceful transfer of authority if he loses in November, arguing that he did not have to abide by the results of a “rigged election.” This can no longer be dismissed as Trump being Trump. This is a threatened coup.

 

Five times the United States faced similar crises, and in four cases the “losing” candidate and his party accepted the decision, placing nation over political power. However, one case led to civil war, and in at least two of the cases the nation suffered from the result.

 

In 1800, Thomas Jefferson was elected President as a result of the 3/5 clause giving added electoral votes to Southern slaveholding states. President John Adams, who was seeking reelection, accepted the result. Adams’s decision not to contest the election led to the first political transition in the United States and established a precedent the country has largely followed for over two hundred years. The other problem in 1800 was that, as the Constitution was originally written, the leading candidate became President and the second-place candidate became Vice President. Jefferson and his Vice-Presidential candidate Aaron Burr had the same Electoral College vote total, so the outcome had to be sorted out in the House of Representatives, which chose Jefferson over Burr. This fiasco led to passage of the 12th Amendment to the Constitution, clarifying that Presidential and Vice-Presidential candidates would be designated in advance. 

 

The 12th Amendment also established that if the election was thrown into the House of Representatives, each state delegation polls its members and has a single vote. If the 2020 election ends up in the House of Representatives, California’s 53 members representing 39.5 million people will have the same voting power as Wyoming’s single member representing a little over 500,000 people. Some commentators speculate that getting the election thrown into the House may be part of Trump’s reelection strategy because although the Democrats will likely have a clear majority, Republicans may control a majority of the states.

 

In 1824, four candidates split the electoral vote so that none received the majority needed for election. Although Andrew Jackson had the largest electoral and popular vote totals, the House of Representatives, under the provisions of the 12th Amendment, selected the runner-up, John Quincy Adams. Adams won because the 4th-place candidate, Henry Clay, who was no longer eligible, disliked Jackson and threw his support to Adams. Jackson’s supporters in the emergent Democratic Party later swept the 1828, 1832, and 1836 elections.

 

But in 1860, crises and collapse could no longer be avoided. Abraham Lincoln, the candidate of the Republican Party, was a regional candidate who did not even appear on the ballot in ten Southern states. Lincoln secured under 40% of the popular vote but 60% of the electoral vote because the Democratic Party was split and also nominated regional candidates. In response to Lincoln’s election, eleven Southern states, anxious to protect slavery, tried to secede from the federal union and plunged the United States into civil war.

 

The 1876 election result was corrupted when competing slates claimed victory in three Southern states where former Confederates were trying to regain local control and to throw out Reconstruction governments committed to protecting the rights of formerly enslaved African Americans. Although the Democratic Party candidate Samuel Tilden appeared to have strong majorities in both the popular and electoral vote, a special committee appointed by Congress with seven Democrats and eight Republicans awarded all the disputed electoral votes and the Presidency to Republican candidate Rutherford Hayes. In exchange for Democratic Party acquiescence, Republicans agreed to end post-Civil War Reconstruction, effectively abandoning Southern Blacks to Jim Crow white-controlled governments and laws for the next 100 years. 

 

In 2000, a Republican majority on the Supreme Court voted 5-4 to block a recount in Florida, making George W. Bush President. In his concession speech, Democratic Party candidate Al Gore simply said, “Let there be no doubt, while I strongly disagree with the court's decision, I accept it . . . for the sake of our unity as a people and the strength of our democracy, I offer my concession.” The 2000 Supreme Court decision may have established a precedent for 2020: the Supreme Court, which will have a Republican majority, may decide the outcome of the election.

 

The Trump campaign’s unsubstantiated challenge to the legitimacy of mail-in ballots is angling to throw the election into the courts if he loses. A rightwing Republican majority on the Supreme Court could then support his claim and allow state governments controlled by Republicans to throw out disputed ballots or delay the results past the December 14, 2020 Electoral College deadline, so that small-state Republicans can throw the election to Trump in the House of Representatives. In either case, it would constitute a betrayal of democracy and an electoral coup.

 

This would set the United States up for four more years of Trump’s and Republicans’ contempt for democracy and majority rule. What would this mean? A frightening parallel: in Germany in 1933, Adolf Hitler and the Nazi Party took control of the German Parliament and government as a minority party, and within eighteen months established a one-party dictatorship.

Thu, 22 Oct 2020 07:11:29 +0000 https://historynewsnetwork.org/article/177742 https://historynewsnetwork.org/article/177742 0
Life during Wartime 522

Thu, 22 Oct 2020 07:11:29 +0000 https://historynewsnetwork.org/blog/154413 https://historynewsnetwork.org/blog/154413 0
The Roundup Top Ten for October 9, 2020

The Plot Against Whitmer Won’t Be The Last White Supremacist Threat

by Kathleen Belew

I'm very concerned that more violence is imminent, and that these ideologies pose an imminent threat to our democracy and to people going about their everyday lives.

 

Yes, Mike Lee, America is a Democracy

by Jonathan Bernstein

Mike Lee's insistence that the US is "a republic" and not "a democracy" is a petty distinction that ignores the historically interchangeable usage of the terms in American politics in order to justify undemocratic rule by a minority party. 

 

 

The Overlooked Queer History of Medieval Christianity

by Roland Betancourt

An attentive reading of the record shows that same-sex intimacy, gender fluidity, and diverse sexual identities were prevalent among early Christians, contrary to the claims made by some fundamentalists today that these represent deviations from historical norms. 

 

 

Why Heller is Such Bad History

by Noah Shusterman

Antonin Scalia's opinion in District of Columbia v. Heller ignored the actual history of the early American militia in order to invent an individual right to gun ownership.

 

 

What White Power Supporters Hear Trump Saying

by Alexander Hinton

Donald Trump's attacks on "political correctness" aren't calls for intellectual openness or academic freedom; they are coded messages invoking white grievance politics, including the longstanding idea that multiculturalism is part of a genocidal attack on the white race.

 

 

The Root of American Power

by Megan Beyer

"October is National Arts and Humanities Month. Observing what happens in America when we fail to protect them, invest in them, and recognize their value, is the best case that could ever be made for the Arts and Humanities."

 

 

A Brief History of the Taxpayer in Chief

by Margaret O'Mara

The revelation, at the height of the Watergate investigation, that Richard Nixon had abused deductions to avoid nearly all of his tax obligations initiated modern interest in presidential candidates' tax returns. 

 

 

Trump's Call for Freelance Poll-Watchers Summons a Dark History

by Nicole Hemmer

In 1981, the Republican National Committee used threatening signs and deployed off-duty officers to polling places in Black and Latino neighborhoods to help win the New Jersey governorship. This is the first presidential election year since the resulting consent decree expired, making Trump's call for supporters to "watch the polls" ominous. 

 

 

Trump’s Attacks on Refugees Expose the Inadequacy of the Current System

by Carl J. Bon Tempo

The Refugee Act of 1980 is the law allowing the President to set an annual ceiling for refugee admissions to the United States, and is in urgent need of revision by Congress. 

 

 

Coronavirus Can Afflict the Powerful. Yet Food Workers Remain the Most Vulnerable.

by Angela Stuesse

The rollback of workplace protections under a generation of conservative state and federal administrations has made low-wage service workers acutely vulnerable to COVID. 

 

Thu, 22 Oct 2020 07:11:29 +0000 https://historynewsnetwork.org/article/177723 https://historynewsnetwork.org/article/177723 0
Loyalty and Duty in the Federal Bureaucracy, From Nixon to Trump

Assistant to the President for Domestic Affairs John Ehrlichman and White House Chief of Staff H.R. Haldeman, 1973.

“There must be absolute loyalty,” said President Nixon during a meeting with his Chief of Staff H.R. Haldeman and Special Assistant Fred Malek, two months after his landslide victory over George McGovern. The White House’s repeated clashes with executive branch officials convinced Nixon that he needed to wrangle the federal bureaucracy during his second term. At one point, he even asked for the resignation of every cabinet member, a mostly symbolic gesture that was meant to send a message across the administration. Nixon demanded that the bureaucracy be at his disposal, particularly when it came to using the levers of government against his enemies. “There must be the ability that we speak out to this government; the damn government will start to pack,” exclaimed the president.[1]

While parallels between Nixon and Trump have received much attention, it is arguably more important to confront the fact that Trump has expanded on the process that Nixon was trying to build that winter. The White House has successfully carried out what Trump’s former advisor Steve Bannon referred to as the “deconstruction of the administrative state,” instilling the culture of loyalty to the president that Nixon sought during his time in office. Whereas the civil servants featured in my book They Said No to Nixon were able to rein in Nixon’s power grab on a few crucial fronts, the current administration has successfully moved forward with institutionalizing rampant abuses of power across the government. 

The American public has been flooded with stories that collectively show a concerted effort not only to cut the federal bureaucracy but also to turn much of it into the president’s political weapon. Trump’s push for more loyalty across his administration has not only led to more corruption, but has also enabled purposeful abuses of power that advance the president’s agenda. This campaign has had far-reaching consequences for government workers, leaving them with little power to provide any sort of check on the White House’s misdeeds. The Justice Department is just one example of an executive agency’s work being redefined over the last four years. As Attorney General, Bill Barr has been responsible for lies about the Mueller report and mail-in voting while also intervening in the Roger Stone case and sending out unmarked federal agents to terrorize Black Lives Matter protesters in American cities. A US Attorney for the District of Massachusetts wrote in a letter to the Boston Globe that William Barr “has done the president’s bidding at every turn.” Secretary of Health and Human Services Alex Azar recently made it clear that the Food and Drug Administration (FDA) could no longer sign any new rules related to medical products, including any future coronavirus vaccines. Under the leadership of longtime Republican donor Louis DeJoy, the United States Postal Service has seen, among other issues, significant slowdowns just months before an election that will likely see a record number of absentee ballots. Unfilled positions, unjust firings and budget cuts, combined with numerous abuses of power, have blurred the lines between the White House and the administrative state at an unprecedented level in the modern era. 

Unlike in the Nixon administration, there has been very little resistance from inside the Trump administration. Even before Watergate became a national story, there were individuals who opposed some of the President’s more controversial policies and also blocked his abuses of power. Republican appointees like Secretary of the Treasury George Shultz and IRS Commissioner Johnnie Walters stopped the White House from taking over the IRS when they refused to audit hundreds of individuals who were on Nixon’s enemies list. Three Assistant Directors of the OMB, Kenneth Dam, William Morrill, and Paul O’Neill, prevented the President from cutting federal funds to MIT and other elite universities when they threatened to resign and take their story to the press. Most famously, Attorney General Elliot Richardson and his Deputy William Ruckelshaus resigned in protest when they were asked to fire the Watergate Special Prosecutor Archibald Cox, setting off the Saturday Night Massacre. These substantive acts of resistance were linked to a much stronger culture of nonpartisan civil service that defined much of the federal bureaucracy of the era. 

Confronted with a political culture significantly different from that of 2020, Nixon felt pressured to appoint and keep independent-minded Republicans like Elliot Richardson and William Ruckelshaus who had butted heads with the White House behind the scenes. Richardson had promoted more liberal policies related to busing and childcare while he was the Secretary of Health, Education, and Welfare (HEW) from 1970 to 1973. As the first head of the newly created Environmental Protection Agency (EPA), Ruckelshaus had fought back against the White House’s attempts to weaken the agency. “I suppose you’ve got to keep one person in the goddamned government that’s considered to be, sort of interested in the people. You see he has that,” said Nixon during a meeting where he and his aides were mapping out his second term.[2] The credibility that Richardson brought to the administration was ultimately why Nixon selected him to become Attorney General in May 1973. Richardson resigned from the post five months later. Nixon later wrote that appointing Richardson was “a major mistake” and that “Richardson’s weakness, which came to light during the Cox firing, should have been apparent.”[3]

It has become clear that President Trump has little incentive to appeal to what Nixon often referred to as the nation’s “establishment,” as the White House has created an atmosphere that strongly discourages any form of dissent. The Trump impeachment hearings brought forth a few individuals who were willing to speak out against the president, but not at the cabinet level. One of the more notable dissenters was Lt. Col. Alexander Vindman, a Ukraine expert who worked inside the National Security Council. Vindman was removed from his position in February 2020 and retired from the military in July, citing the President’s "campaign of bullying, intimidation, and retaliation.”

Numerous successful attacks on the more independent parts of the administrative state have left the public with far fewer opportunities to hold a President accountable for abuses of power than in the Nixon era. Republican administration officials stopped Nixon, but Donald Trump has come dangerously close to achieving Nixon’s authoritarian vision. No matter the outcome of the upcoming election, future attempts to rein in the executive branch will need to confront the fact that our present-day political culture has unfortunately left little room for nonpartisan civil servants. 

 

[1] Oval Office, 836-9, January 9, 1973, Nixon Library.  

[2] Camp David Hard Wire, 224-15, November 14, 1972, Nixon Library.

[3] Nixon, RN: The Memoirs of Richard Nixon, 1004. 

Thu, 22 Oct 2020 07:11:29 +0000 https://historynewsnetwork.org/article/177618 https://historynewsnetwork.org/article/177618 0
A Founding Member Says the Commission on Presidential Debates Needs to Change

As a founding member of the Commission on Presidential Debates and its vice chairman for its early debate cycles, I watched Tuesday’s debate with both a sense of outrage and no small measure of personal chagrin: how could the bipartisan forum for presidential debates have allowed such an unabashed debacle to occur in the midst of what nearly everyone agrees is the most important election of our lifetimes?

 

The answer is obvious: no one foresaw that a major party candidate – let alone a sitting president – could appear on stage for the primary purpose of making a mockery of the election, the debate itself and, most important, the presidency. The immediate question is what can be done about it, in this cycle and beyond? Before getting into that, let me offer a brief history of the commission’s establishment and its purpose. 

 

In their early years, presidential debates had been sponsored primarily by the League of Women Voters. The League brought both its nonpolitical reputation and the worthiest of goals in offering a forum that would reveal the views and personalities of the candidates, supposedly unscripted and unaware of the questions they would receive. There were several problems, however, that occurred to both Democrats and Republicans. As helpful as the League’s sponsorship had been, it became clear over time that the League couldn’t assure that the candidates would appear regularly in a way that would make the debates permanent. Also, the League had become locked into a stiff, stale format that called for a panel of journalists who tended to offer stock questions, usually with little or no opportunity for follow-up and interaction between the candidates. 

 

In the mid-1980s, Robert Strauss, the former chairman of the Democratic National Committee, and Mel Laird, the much-admired former GOP congressman from Wisconsin and a former secretary of defense, put their heads together and formed a conference of political thinkers and practitioners to meet in Washington to discuss a range of national issues and see if it was possible to reach a bipartisan consensus to resolve them. High on the list for several of us was presidential debates, and it soon emerged that participants of both parties shared the same frustrations with the League’s sponsorship, particularly the lack of certainty every four years and the absence of varying formats. It would come as no surprise that a group of party activists concluded that the two political parties could do a better job, on the grounds that they were in a better position to assure that the parties’ nominees would show up and that the commission would have the flexibility to experiment with different formats designed to engage and maintain citizen interest. 

 

The notion was approved overwhelmingly by the group, and it adjourned happy with itself and what it had done, while understanding that in all likelihood this was just another Washington report that would live, and inevitably die, on numerous Washington shelves. As one of the more outspoken advocates of the idea, I thought that would be a classic missed opportunity. Strauss and Laird were persuaded to take the finding of their commission one step further, and to make it a reality. 

 

Before long, Paul Kirk, the Democratic national chairman, and his GOP counterpart Frank Fahrenkopf eagerly agreed to head up the new commission and assembled others from both parties who would commit to institutionalizing the presidential debates, providing varied formats such as town hall-style meetings, and ensuring strict fairness in everything the Commission did. Over time various issues got worked out and the commission gained important experience and credibility. Happily, Kirk and Fahrenkopf developed an easy relationship that allowed them to steer a steady course that, thanks to their successors, has proved largely effective for more than three decades. . . until Tuesday evening. 

 

In my view, the commission needs to establish at once its credibility in providing a fair setting for the two candidates by taking at least two essential steps before proceeding with further debates this month: it should empower the moderator to cut off the mic of a participant who refuses to obey the agreed-upon rules and time limits, in order to prevent a recurrence of the utter chaos and disaster that the president visited on some 100 million TV viewers. It should also bring back the idea of brief opening and closing statements, so that each candidate is assured of at least two uninterrupted minutes at the beginning and end of the debate to lay out his best case for why he should be elected. There may be other steps that should be considered as well, and the commission should go to great lengths to establish its independence as part of its mission to provide fairness, which needn’t conflict with the fact of party sponsorship. If one candidate objects to these modest but justifiable changes to the rules, the commission should insist on them anyway to protect its own integrity and the long-term future of presidential debates. The onus for ensuring future debates this year should not fall on the candidate who agrees to new rules that are patently fair and self-evidently needed given the atrocious events of Tuesday in Cleveland. 

 

On the matter of future debates, I hope the commission will take the time after the election to review these and other issues in the commission’s history that might warrant reexamination. One item to consider is whether there should be non-political public members added to the body, on the grounds that the debates are so important to the public interest that there should be voices representing the American electorate at large. 

Thu, 22 Oct 2020 07:11:29 +0000 https://historynewsnetwork.org/article/177621 https://historynewsnetwork.org/article/177621 0
Combatting History “Indoctrination” in 1945 and 2020

Image from Learning Democracy: Education Reform in West Germany, 1945-1965 (Berghahn Books).

In the aftermath of the Second World War, American soldiers embarked on a massive project of educational reform in the ruins of postwar Germany. Their ultimate objective was to democratize a population that had been indoctrinated during the previous twelve years spent living – and learning – under a dictatorship. From their very first days in power, the Nazis had used the schools to promote nationalism, spread their racist ideologies, and prepare a new generation for war. Of special interest to the Nazis was history instruction, which they rewrote to facilitate the creation of uncritical, ill-informed subjects willing to follow the orders of their Fuehrer. And so American officers, working with professional historians and educators – both American and German – sought to establish new history instruction in the German schools that would bolster the emerging democratic order under construction. 

On September 17, President Trump announced a new commission to promote “patriotic education” and criticized the “twisted web of lies” currently taught in American classrooms. In the face of history teaching that he said resembled “anti-American propaganda,” the president desired to see instruction return to a focus on political elites in an effort to present “the miracle of American history.”  While the federal government does not control the history curricula taught in the schools -- this is the domain of the states -- it can make recommendations, offer grants, and encourage new approaches. The president clearly hopes his 1776 Commission will offer a set of guidelines that can stop what he deems the “indoctrination” underway in our schools. Fortunately for this newly established group, the successful strategies the U.S. pursued to eliminate Nazi propaganda and promote democracy in the German schools are easily accessible – and surprisingly relevant – to those who are interested.    

Should the new commission investigate America’s educational work in Germany, it would likely be shocked by what it would find. First, American officials advocated a new generation of textbooks that promoted discussion and debate. Utilizing a wealth of primary sources to foster critical thinking and analysis, these new publications reduced the emphasis on rote memorization and teacher-centered instruction. Second, they demanded that history instruction illustrate the successes of the German nation as well as its shortcomings and the challenges with which it still grappled. As one might imagine, history was a particularly difficult subject for those who wished to avoid discussions of the recent past. Third, American officials encouraged German educators to shift the focus of their instruction. Whereas previously war, political elites, and the nation had stood at the center of the narrative, American officials recommended that teachers concentrate on Germany’s larger role in European history, the diversity of Germany’s population, and social and cultural approaches to understanding the past. Here the voices of non-elites entered the curriculum, in many cases, for the first time. 

These reforms made a significant impact on a German nation seeking to combat a legacy of racism, extreme nationalism, and militarism. Educational changes inaugurated during the occupation helped to eliminate the propagandistic instruction that produced uncritical subjects and gradually transformed Germans into engaged citizens. Yet the recommendations of American education officers in Germany have lost none of their relevance in the seven decades that have passed. Most Americans would probably agree that the creation of informed and engaged citizens willing and able to participate actively in our democratic society is one of the principal goals of our educational system. In light of the president’s concerns, now seems like the perfect time to revisit the work of these dedicated men and women who worked to institute social and cultural change abroad. As the new 1776 Commission begins to consider how to wield history as a weapon against indoctrination, America’s educational work in Germany can serve as a guidepost. The strategies employed in postwar Germany are not likely what the president has in mind, but they undoubtedly reflect a commitment to preparing vigilant young men and women to build and defend a new democratic state. What better way is there to reinvigorate and renew our own democracy at a time when informed and engaged citizens have never been more needed?

Thu, 22 Oct 2020 07:11:29 +0000 https://historynewsnetwork.org/article/177622 https://historynewsnetwork.org/article/177622 0
Will 2020 Place the US Alongside Apartheid South Africa in History's Hall of Shame?

D.F. Malan, leader of South Africa's National Party

On November 9, 2016, the United States woke up to Donald Trump’s unexpected victory. The incumbent Democratic Party had been led by the hand-picked successor of a charismatic, “cross-over” leader. The Democratic candidate received more popular votes, but Trump, darling of the Christian right, prevailed because of the differential weighting of urban and rural votes.

The new administration implemented anti-immigrant, isolationist policies and cracked down on dissent by labeling it “fake news.” The Republican-led Congress passed laws to increase the holdings of the wealthiest Americans and bolstered corporate power. Meanwhile Republican state legislatures worked to disenfranchise likely Democratic voters while Congress gutted voting rights protections.  

The Democrats have launched a massive effort to increase fundraising, counter the disenfranchisement efforts and appeal to voters of all persuasions. Anger over Trump’s use of unconstitutional, illegal foreign electoral assistance led to widespread Democratic gains in the 2018 elections. Buoyed, Congressional Democrats impeached Trump but were unable to rouse enough national indignation to convict him in the Republican-led Senate. 

The Democrats have chosen a close ally of charismatic, term-limited Barack Obama as their standard-bearer. Joe Biden is an experienced politician and party insider who is universally described as “decent.” He is trying to ride the dual bucking broncos of a rambunctious left wing and an elusive phalanx of conservative voters disgusted by Trump. 

In response, Republicans loudly portray themselves as the law-and-order saviors of white America. Trump relentlessly belittles stutter-challenged Biden as weak and paints the Democrats as in cahoots with black radicalism. The Black Lives Matter movement has led multi-racial crowds by the thousands in peaceful acts of protest and civil disobedience across the country. The Republicans, absurdly, have thrown the specter of socialism at the thoroughly capitalist Democrats, hoping to stoke the white electorate’s deepest fears of getting stuck in the backwaters with people of color. 

1953: Apartheid wins again

On May 29, 1948, South Africa woke up to the unexpected Parliamentary victory of the National Party (NP), champion of apartheid and darling of the Christian right. The losing, incumbent United Party (UP) was led by a charismatic “cross-over” leader. The UP received more votes, but lost because of the differential weighting of urban and rural votes. 

The NP immediately implemented isolationist policies and cracked down on dissent. It outlawed Communism, began racially registering the entire population, forbade sexual relations “across the color line,” segregated residential areas and education, and enforced identity documents on all people of color. It also declared its intention to change the Constitution to dispense with a few thousand so-called “coloured” voters. The NP prioritized economic growth in the white community and expanded corporate power.

The UP, now the opposition, faced huge challenges in readying itself for the next election in 1953. General Jan Smuts, the party’s charismatic leader, and his hand-picked successor had both died. The mantle fell on Jacobus “Koosie” Strauss, an experienced politician and party insider who was universally described as “decent.” The UP mounted an unprecedented effort to address the under-registration of likely UP voters, appeal to both fired-up progressives and conservative-minded whites, and modernize a flabby fundraising infrastructure.

In response, the NP loudly portrayed itself as the law-and-order savior of white South Africa, relentlessly belittling short-statured Strauss as weak. It claimed the UP was in cahoots with black radicalism like the 1952 Defiance Campaign in which the African National Congress staged peaceful, multi-racial civil disobedience demonstrations across the country. The NP, absurdly, accused the prim and proper UP of sympathizing with the anti-colonial Mau-Mau Uprising in Kenya, stoking the white electorate’s deepest fears of armed African insurgents taking back their land.  

The UP lost again in 1953. Strauss could not paper over the cracks of his party’s simultaneous appeals to left and right, its opposition to policies it would not outright condemn, and its inability to rouse sufficient indignation at the abrogation of the Constitution. The Parliamentary opposition shriveled to a few seats. The NP ruled for the next 45 years.

Could Trumpism last 50 years?

To be sure, almost seven decades, an ocean, and significant differences separate the US and South Africa. But they share common dynamics following a surprise right-wing electoral victory: cynical appeals to racial supremacy, creative voter suppression campaigns, and relentless belittling of a “decent” opponent heading up a hidebound party laboring to straddle vast ideological divides. Indignant requests for a return to civility can be ignored by a minority of inflamed voters with disproportionate electoral privilege. New social inequalities can then be bricked in for decades.

The bitter lessons and legacies of mid-20th century South African elections should warn Americans against complacency now.

"Follow the Science," but Explain and Apologize Poll after poll has shown that the increasing politicization of health information from government agencies is leading a majority of Americans to fear taking a vaccine against COVID-19. CDC guidance on the current pandemic, for example, has been issued and withdrawn, with dubious explanation. Yet all government health agencies make mistakes in a rush often to get the best information to the public.  The real issue is what they do with the information once these errors are clear, and what kinds of support or pressures they are getting from higher ups. 

 

A decade ago, on Friday morning, October 1, 2010, the United States federal government at its highest levels apologized to Guatemala for immoral research done by the U.S. Public Health Service six decades before.  It is hard to imagine that such honesty would happen now, but it is worth remembering as a potential model for cases when science goes awry.  I know because I was the medical historian who alerted CDC officials that I was about to publish an article about this research.

 

Between 1946 and 1948, U.S. government researchers infected more than 1400 Guatemalans with sexually transmitted diseases in a failed effort to determine if the newly available penicillin might serve as prevention against these infections. Unlike other immoral studies that were hidden in plain sight over the years, this one really was kept secret. This was because U.S. taxpayer dollars went to encourage sex workers to have infected sex with Guatemalan prisoners, while soldiers and mental patients had parts of their bodies scarified so that the inoculums of the various infections could pass into their systems. The reports of the details of the study were never published, but the correspondence and medical records were in the papers of physician John Cutler, who led the study (the papers are now in the Southeast Regional National Archives but were then at the archives of the University of Pittsburgh).

 

I had examined these archival records because I was working on a book about the U.S. Public Health Service’s study of “untreated syphilis in the male Negro,” primarily known as the Tuskegee Study (1932-72). One of the myths of the study in Tuskegee is that the men were given syphilis by the government doctors, but this is not true. Then there was this evidence from Guatemala of a study led by a physician who would later work in Tuskegee. I knew the difference between the two studies had to be explained.

 

I shared my research with David Sencer, the retired director of the CDC, asking him to make sure I had the medical aspects of my as yet unpublished paper right. Sencer, immediately understanding the possible backlash against the CDC and the Public Health Service when my article came out, asked if he could give my draft to CDC officials. I agreed because, unlike a government employee, I could not be stopped from publishing. Government health officials were alarmed enough to do their own research in the archives, and sent my still unpublished paper and their report to the NIH, the Department of Health and Human Services (HHS), State Department leadership, and the White House Domestic Policy Council. The decision was made to have President Obama call the president of Guatemala, and the secretaries of HHS and State formally apologized. Subsequently, the president’s bioethics commission issued a report about how this study had happened.

 

In many clinical trials researchers believe they are doing the best science possible to fight a deadly scourge. Those who did the studies in Guatemala, and in Tuskegee too, thought they were doing good science. Although they knew they were on an ethical edge, the researchers on the study in Guatemala, with help from Guatemalan government health officials, thought the work was necessary to understand how to stop an epidemic of sexually transmitted diseases that harmed or killed millions. They were following the science in their testing, just not letting their unwitting participants know what they were up to. Similarly, the doctors in Tuskegee misled their subjects by telling them they were being treated, not studied, but they thought the scientific payoff might be worth it, and they wrongly believed that syphilis was a different disease in Black and white patients.

 

The need to “follow the science” in making recommendations is not always easy to meet. David Sencer knew this. In 1976, as CDC director, he thought “following the science” and the advice of other scientists meant there would be an epidemic of deadly flu like that of 1918. He urged a national vaccination program, but the epidemic never emerged. Sencer was fired for his errors.

 

There will always be mistakes, rushes to judgment, the weighing of the best way forward, or career or political concerns that interfere.  We have the model of firings and apologies when actions and recommendations are wrong. There are ways to give the American people the best possible information to make informed judgments about their own and their communities’ health.  Right now, we do not have this. But we should. 

The Rise of the Anti-Analytical Presidency

The Cabinet of Donald Trump, 2017


Knowledge, we are told ad nauseam, is power. The first President, George Washington, embraced this idea and obsessed over getting information he could use to make sound decisions. He employed a cadre of spies to provide him with information about his allies as well as his adversaries. And when he became president, Washington continued with his goal of acquiring the best information from a wide range of sources prior to making key decisions.

When President Washington gathered the heads of his Departments (what we today refer to as the President’s Cabinet), he did more than what Article II, Section 2 of the Constitution prescribes: “to require the Opinion in writing, of the principal Officer in each of the executive Departments, upon any subject relating to the Duties of their respective Offices.” He used such meetings for face-to-face discussions (and, because of the friction between Hamilton and Jefferson, sometimes heated arguments) regarding the vexing problems of the day. At that time the Cabinet was small, and Washington had very few on his staff. One could hold a full Cabinet meeting sitting around an average-sized dinner table. Washington used these meetings as a way to gather information and make decisions based on the best available knowledge. Today, the executive branch is vastly larger in size and scope of power. Presidents appoint roughly 4,000 people to fill posts, and management has become a struggle.

In FDR’s day, the President’s Committee on Administrative Management, better known as the Brownlow Committee, recognized the growth in power and responsibilities of the executive branch and famously exclaimed, “The President needs help.” The large organization that developed around the President could assist him in acquiring sound information, but it could just as easily become so unwieldy that it isolated him or developed dysfunctional relationships that poisoned the decision-making process. Over time, the president’s need for information has remained largely unchanged, but the increased size and scope of the executive branch and of the president’s powers and responsibilities have complicated a president’s efforts to acquire the information necessary to decide.

And still, presidents often seem woefully bereft of high-quality, usable information. While there is no one decision-making model utilized by presidents, it is nonetheless useful to ask: how do presidents decide? From where do they get the vital information, options, and data they need in order to make sound decisions? Good judgment is the most important quality a decision-maker can have, and both individual and institutional forces can interfere with good decision making.

Humans are capable of exercising reason, yet a variety of cognitive biases can inhibit our ability to make reasoned judgments. From confirmation bias, to commitment escalation, to the use of false analogies, to anchoring and more, these biases can trip us up and lead us astray.

Presidents cannot afford to make many mistakes. When we make a bad decision, its negative impact is usually confined to ourselves and those immediately around us. When presidents make mistakes, people can die. 

There is no way to eliminate all impediments to sound decision-making, but there are things presidents can do to increase the likelihood that they will make sound decisions. Process is one key. Presidents must set up a rational process that gathers evidence and information, airs several sides of an argument, and puts before the president a variety of options, giving him what he needs to decide. Process is not a panacea, but when we provide a president with tools for deciding, we make it easier for the president to choose based on solid information.

Presidents are especially susceptible to isolation, groupthink, and ideological blindness. How, one might ask, can a president who sits atop the world’s greatest information-gathering machinery not have the tools necessary to decide? Because we all have cognitive biases that affect decision-making. The institutional machinery that surrounds a president and is intended to serve his needs may actually lock out vital information and key ideas and options, thereby isolating a president from reality. In serving the president, staff may become flatterers and tell the president what he wants, or even demands, to hear, not what he needs to hear. There is a tendency for staff to become “yes men,” sycophants, afraid to deliver hard truths to the leader. Or the demand for group solidarity can become an overarching goal, with contrary arguments ignored and critics silenced or self-silencing so as to maintain teamwork and common mission. This can become especially troublesome when the president has a clear ideology and little patience for “non-believers.”

Decision-making maladies can be diminished if the president (or his chief of staff) is an astute manager of process. But presidents are notoriously bad managers, or refuse to spend precious time or political capital on managing. To ignore management, though, is to invite trouble.

In recent years we have seen the rise of the “anti-analytical presidency,” defined as an administration that embraces a personalistic office where information and data are subordinated to the demands or needs of the president, and where rigorous science and robust information accumulation are secondary to his ideological or personal views. Rather than rely on evidence, information, or data, and rather than consult experts and scientists, such an administration employs an anti-evidence and anti-science bias in making policy, one that eschews critical thinking and turns a blind eye to the demands of logical reasoning. From the Trump administration’s embrace of “alternative facts” to its anti-science approach to climate change, the refusal to rely on expertise and evidence has become a central component of decision making in the Trump inner circle.

President Trump is notorious for personalizing process. In fact, he eschews process. He believes that he doesn’t need to read reports, as he is already a “stable genius.” Where past presidents started every day with a reading of the President’s Daily Brief (PDB), a compendium put together by the intelligence agencies of the key issues and challenges ahead, President Trump has found the PDBs long and boring, and often bypasses the briefings. Compounding the problem, Trump often expresses distrust of the intelligence community over which he presides (it is, he believes, part of the “Deep State” conspiracy against him), and refuses to rely on the information provided by intelligence professionals.

President Trump was warned numerous times, as early as January of 2020, that a lethal virus was spreading in Wuhan, China, and could cause a global pandemic. Trump refused to believe the experts and dismissed or downplayed the seriousness of the coronavirus, thereby refusing to take sufficient early action when it could have made a dramatic difference. Precious weeks were wasted, time when the US could have put a testing, tracing, and isolation program in place, set social distancing and mask-wearing guidelines, and accumulated the necessary personal protective equipment (PPE) and ventilators. The “process” got the warnings to the President. The President refused to listen.

An anti-analytical presidency has consequences, often leaving the president with faulty, incomplete, or misleading information. How can we solve complex scientific problems if we refuse to rely on science? The tools for making sound decisions are available, and while no panacea, they can be effectively employed to improve the odds of deciding well. Not to rely on them is a choice presidents sometimes make, often with tragic consequences. We cannot force a president to seek out information and review options. Deciding can be an intensely personal process. But rare is the president who openly and consciously chooses to short-cut information. It is, it should go without saying, a fool’s errand to act on instinct when information and data are available.

Presidents Lie About Their Health More Than Any Other Subject

 

This blog post was written by Rick Shenkman, founder of George Washington University’s History News Network, and the author of Political Animals: How Our Stone-Age Brain Gets in the Way of Smart Politics (Basic Books).

The Trump administration has so little credibility that when it was announced that the president and first lady had tested positive for COVID-19, the first reaction of many people online was to ask whether it was true. My Twitter feed filled with tweets explaining that Trump probably just wanted to distract voters from his unhinged debate performance. Some added that Trump would use a miraculously quick recovery to prove he’d been right all along that the disease is little worse than the flu.

It quickly became undeniable that Trump had contracted the disease, but people were right to question the administration’s story, and not just because Trump has a well-deserved reputation for prevarication. My research shows that the single issue presidents have lied about most is their health. They lie for three reasons. One, presidents believe their health is their own business and should be kept private. Two, they worry that an admission of physical weakness will become a metaphor for political weakness. Three, they fear (rightly) that their illness will quickly become the dominant political news, damaging their ability to control the political agenda; every minute spent discussing the president’s health is a minute people won’t be talking about what the president wants them to.

Until the 1880s no president lied about their health.  This wasn’t because they held themselves to a strict standard of truth-telling out of reach of their successors.  It was because reporters didn’t ask them questions about their health.  Thomas Jefferson frequently suffered from diarrhea during his two terms but nobody reported this.  Andrew Jackson was in near-constant stomach pain.  But it never occurred to the press to quiz him about this even though it was well-known that he often felt ill.

Everything changed on July 2, 1881. That day, as he was heading to catch a train out of Washington’s Baltimore and Potomac station for his summer getaway in New Jersey, President James Garfield, just four months into his term, was shot by a deranged man who’d been denied a civil service job. What happened next was wholly unprecedented. As Garfield struggled to survive over the coming weeks, the press began issuing hourly bulletins about his condition. The papers flew off the presses. It turned out that there was an enormous public appetite for health news about presidents. By the time the president finally succumbed — two and a half months later — the American media had changed. From then on the health of the president was considered a suitable subject for public debate.

Reporters did not have to wait long before another president’s health became a public concern.  Garfield’s successor, Vice President Chester Arthur, soon developed Bright’s disease, a then-fatal kidney disorder.  Reporters who followed him around noticed that his neck began shrinking and his shirt collars seemed loose.  Arthur suddenly began taking long trips to remote places like Florida where he’d be out of touch for long stretches.

This is when the lying began. As soon as reporters began asking questions about his health, Arthur began lying about it. He maintained the lie right through the end of his presidency. He died less than two years after leaving office. He was fifty-seven.

The next president to fall seriously ill was Grover Cleveland. Just before July 4th, 1893, he discovered that something was wrong with his jaw. When doctors investigated they discovered that he had cancer on the roof of his mouth. They quickly determined that he required an operation to remove the cancerous tissue. Because Cleveland was obese, the surgeons feared for his life.

Just then the country was slipping into a deep recession. Cleveland worried that if the news got out about his illness, the uncertainty about his health would aggravate the markets. Cleveland, a backer of hard money (he believed the currency should be backed by gold), inspired confidence in bankers. His vice president, however, believed in the loose monetary policy favored by broke and heavily indebted farmers. Were Cleveland to die, the soft-money silverites in the Democratic Party (who believed the currency should be backed by silver) would be in a position to weaken the currency.

So Cleveland, who had a reputation for honesty, lied. On July 1st he set off on a yacht for his surgery and told almost no one, not even the vice president. He spent the summer out of the public eye at Cape Cod, recovering. When reporters asked after him he sent out an administration official to lie. One reporter asked specifically if Cleveland had cancer. No, was the answer.

The Cleveland story took a fascinating turn after he’d recovered and gone back to work. One of his surgeons made the mistake of thinking, now that the president was well, that it was ok to tell the truth. A front-page story relating each of the important details of his operation, based on an interview with the surgeon, was published in a major Philadelphia paper. The Cleveland administration said the account wasn’t true and the story was forgotten. The truth wasn’t confirmed for decades.

In the twentieth century presidents continued to dissemble. Woodrow Wilson concealed the stroke he suffered in the White House after a grueling train trip across America to sell the Versailles Treaty, failing even to keep his vice president apprised of his true condition. Infamously, the first lady basically ran the government. Wilson never copped to the deception, though he remained partly paralyzed for the rest of his life. (Incredibly, he thought he should run for a third term. Advisors talked him out of it.)

Franklin Roosevelt, to his credit, admitted that he had contracted polio after his run for vice president in 1920, but routinely minimized his illness. Though he never recovered the use of his legs — he told a friend it took him a year to learn how to wiggle a toe — he deviously conspired with aides to leave the public with the impression that he had staged a complete recovery. Photographers were even forbidden from taking pictures of him getting into or out of a car. In that more naive time voters never caught on to the truth.

Dwight Eisenhower similarly concealed his health issues. A couple of years before he ran for president he spent about a month in the hospital but never publicly said a word about it. Some historians surmise he was bedridden after suffering a heart attack. In 1953, just months into his first term, he was giving a speech following the death of Joseph Stalin when he suddenly felt faint. To keep from falling over he gripped the podium as he struggled to finish. He sent his press spokesman out to say the president was suffering from indigestion. More likely, it was a mild heart attack.

Two years later he was in Colorado playing golf when he again was taken ill. This time there was no denying he’d had a heart attack, but even then the administration was slow to admit it. In 1956, on the eve of his second run for president, he suffered a severe case of abdominal pain. The diagnosis was Crohn’s disease. Ike was hospitalized and operated on. This time the public was given a crystal-clear account of his illness, including detailed drawings of his intestines. Ike recovered quickly and went on to win in a landslide.

Eisenhower, who was seventy years old when he left office, was followed by John Kennedy, who was forty-three. JFK was believed by the voters to be in perfect health, save for a bad back. This was a lie. At the Democratic convention in 1960 his chief rival, Senate Majority Leader Lyndon Johnson, revealed that Kennedy suffered from Addison’s disease, a debilitating adrenal gland disorder. Kennedy flatly denied the charge and went on to win the nomination. As president he dealt with a variety of illnesses, including severe back pain, which he managed by taking a cocktail of powerful drugs. Though voters knew he had a bad back they were led to believe he was supremely fit. Pictures of him playing touch football with his family circulated widely.

In 1981 came the assassination attempt on Ronald Reagan, which the public was able to see on film. Within hours the administration leaked a story that the president had joked with his surgeons that he hoped they were all Republicans. What the public was not told was that Reagan had lost so much blood that he almost died.

What this potted history of presidential illness shows is that presidents and their aides feel free to engage in deception when the subject involves their personal health.  This is true of both Democrats and Republicans.  It is worthwhile remembering this history now that another president has taken ill.  No one aware of these facts can blithely assume that the Trump administration is giving us the straight story.  If two of our most honest presidents, Cleveland and Ike, could lie in these circumstances, it has to be assumed unless proven otherwise that our most dishonest president can as well.

 

Trump Returns to Campaign, Live Rallies Despite COVID

A Personal Encounter with White Supremacy

St. Joseph, Missouri


As we travel through life--“til death do we part”--we are students of our environments: our family, church, peers, schools, civic issues, and what we read and imagine. And from all of this we take in what we consider “good” and what we consider “bad.” In this jumble of input we adjust our values. It is with this perceived “reality” that I explore my early encounter with white supremacy. Or, to put it another way, black subordination.

Civic learning evolved in middle school: “We are a government of laws, not men” still echoes from my civics class in the mid-1930s. Elsewhere, however, I learned of racial segregation, an everyday reality that nurtured white supremacy.

Missouri, my home state, is situated on the east bank of the Missouri River as it flows down from the north to Kansas City. There the river takes a sharp turn east to cross the state to St. Louis, where it merges with the Mississippi and flows on to the Gulf.

Known as the “Show Me” state, Missouri joined the Union in 1821 as a slave state. During the Civil War, however, the majority of citizens north of the river remained loyal to the Union. South of the river, where Missouri bordered on Arkansas, the Rebels reigned and their leaders dominated the racial issue: in due course, and with some support north of the river, racial segregation in public schools was established by Missouri law (racial segregation in private, religious, and social contacts varied throughout the state). This continued until the mid-1950s, when racial integration began in order to comply with Brown v. Board of Education, the 1954 Supreme Court decision.

From grammar school through university (my World War II experience not excluded) there was no consequential public agitation for change. The university, the faculty, community leaders, even the students, ignored, accepted, or excused legal segregation of Negro students (though not foreign students of color), which means, of course, many Americans were denied a fair shot at the political, economic, and social life of the nation. Racial segregation may not have been taught, but it was self-evident to students in the very first class attended.

Whether self-interest or simple indifference stilled any move to end racial segregation in Missouri, it fueled a lynching on the night of November 28, 1936, north of the Missouri River. There, in St. Joseph, fifty miles north of Kansas City, a mob estimated at five thousand lynched a young black man who had been convicted of no crime. Two National Guard tanks rumbled past my window on the way to the lynching, where they offered no resistance to an act of torture and murder.

"The mob was from outside the city," declared the white community.

"Perhaps not," responded the black community.

Even if the size of the mob fell short of five thousand “outsiders,” it would have required a caravan of autos and pickups. How, one must wonder, did every local fail to jot down license plates revealing the outsiders’ home counties? Could the citizens of this racially segregated community have willfully turned their collective backs on what was happening?

Kamala Harris, on being nominated as the vice presidential candidate, declared, “There is no vaccine for racism.”

Nor is there one for anti-Semitism, misogyny, and other uncivilized biases that degrade our democracy. There are, however, a couple of ways to deal with racism, etc.

1. Place more emphasis on the humanities throughout our educational system, including self-education.

2. Not even the most fair-minded community can hope to educate away prejudice. But I suggest communities deal with it like an alcoholic deals with drink: recognize that indulging bigotry even a little can have dire consequences, and just don’t do it.

The Etymology of "Jazz": A Cautionary Word About Digital Sources


In a footnote to an essay titled “The Oracle of Our Unease” about F. Scott Fitzgerald in the October 8, 2020, New York Review of Books, Sarah Churchwell wrote:

 

Aptly enough, the etymology and evolution of “jazz” are also semantically unstable. Although many claim its earliest print uses are from 1912-1913, and that its first recorded uses did not refer to music, there are at least two earlier references to jazz as music. In 1900 an Alabama paper reported on failed efforts to start “a jazz section” with “jazz palaces” in Glasgow (“Attempts to Jazz Glasgow have failed,” Anniston Star, January 1, 1900), while a 1909 article refers to the “jazz” and “pep” of the banjo (“Banjuke is the Latest,” Piedmont News, July 12, 1909).

Those reports seemed unlikely to me — literally incredible — so I looked them up. I found both among the digitized microfilms that are searchable on the subscription website newspapers.com, but both are misdated.

The 1900 Anniston Star report begins, “London.—United Press—.” This should have given Churchwell or her research assistant pause for two reasons. First, because the original United Press went out of business in 1897 and the United Press syndicate of our lifetimes (UPI today) was founded in 1907. Second, because searches that return wire-service articles invariably turn up the same article published in dozens of newspapers, not just one.

Although the Star dateline lacked the date, a search for an article about jazz music in Glasgow filed from London returned the same one dated March 1, 1924. Returning to the Star article, I scrolled through pages of the same issue. Somehow pages from a March 1924 issue had been spliced into or misfiled among the microfilmed January 1, 1900, issue.

The 1909 Piedmont News snippet ended “—.Boston Globe.” A search for the same report in the Boston Globe found it in the June 20, 1917, issue. Returning to the News squib, I scrolled back to page 1 of that issue, which was dated July 12, 1917.

Andrew Rotter's "Sensual Empires: Britain and America in India and the Philippines"


The five senses help us navigate nature but, according to historian Andrew J. Rotter, they can also help construct an empire. Rotter is the Charles A. Dana Professor of History and Peace and Conflict Studies at Colgate University. His book Empires of the Senses: Bodily Encounters in India and the Philippines uses sensory history to compare the British and American empires side by side. Oxford University Press published the book in 2019 and did a lovely job of producing a sensory experience for the reader: high-quality paper and a dust jacket with an attractive but calming green hue and two evocative images that would make for fine memes of the book on social media. Much work goes into designing covers, and OUP's artists did well to create a book with colors and lines that speak to organization and innovation.

     

Organization and innovation describe the content as well. Rotter was inspired by sensory historian Mark Smith, who has done much to take the ideas of French historian Alain Corbin and apply them to the American context. Corbin's examinations of perfume and bells in nineteenth-century France laid the groundwork for a whole generation of sensory historians, and Smith's recent The Smell of Battle and the Taste of Siege--a sensory history of the Civil War--places him in this genre. Recently, sonic history and aural studies, including deaf studies, have attempted to accentuate the ear's role in a historical genre that, at least since the Enlightenment, has tended to privilege the visual power of the eye. Recovering the history of the non-visual senses is sometimes posed as a critique of Enlightenment histories. Histories that privilege the senses that do not rely on the eye are often very difficult to write because primary sources such as newspaper accounts or journals do not often take the time to include such descriptions. But doing the hard work of finding sources with sensory information is important to history. Not only does it allow us to recover histories that Enlightenment historians have neglected in their obsessive pursuit of the visual, but humans interpret the environment through the five senses. All five are necessary to make sense of the world, and as new discoveries about how human memory works suggest, our minds use these five senses interdependently to understand social constructions of reality and to make memories.

     

Andrew Rotter takes advantage of how our senses work with our minds to explain how the British and Americans colonized faraway lands in India and the Philippines. He divides his book into chapters that focus on each sense: seeing, hearing, smelling, tasting, and touching. By organizing his book around the senses, Rotter has to sacrifice chronology. It can be a difficult comparison to make, for as much as Britain and America are culturally and politically similar, their imperial visions were profoundly dissimilar. The British empire, for example, recognized difference in colonial subjects and often governed as if certain rules that applied to people in India did not apply to the British because of this difference. Americans governed the Philippines with the design of making Filipinos into Americans: rules tended to be applied more evenly to everyone, at least theoretically. A seasoned historian, Rotter more than understands this dichotomy, and each chapter actually works well within this contrast. The British and Americans governed colonized subjects differently even as their imperial agents applied similar methods to subdue large populations. What gives the senses life is Rotter's reliance on Edward Said's concept of Orientalism. The idea that Said presented in his famous work on the relationship between the Orient (the East) and the Occident (the West) is that Westerners often defined people who lived in non-Western places through history, art, and literature. This, Said contends, is a powerful way to build colonial regimes and perpetuate racial stereotypes through cultural reproduction. Other historians, such as Paul Kramer, have picked up on this idea. Kramer argues in The Blood of Government that Americans, not people who lived in the Philippines, created the idea of being Filipino by governing through racial stereotypes. The book is filled with all kinds of extraordinary stories of Westerners going to India or the Philippines and being absolutely aghast at the sensory assault on their own Western values. Colonized peoples needed to be colonized, argued imperialists, because they had no understanding of how Enlightenment senses were supposed to be used. The trick for civilizing barbarians, they argued, was civilizing the barbarians' senses.

     

Here is Rotter's second major theoretical influence: Norbert Elias's The Civilizing Process. Elias examines Western European cultural practices, seemingly as trivial as blowing noses and using eating utensils, to suggest that nation-states emerged out of the Middle Ages at moments when elite culture, manners, and "civilized" behavior began to permeate society. As civilized culture took root, nations broke from medieval norms. Although the framework is Eurocentric and dated, Rotter uses Elias's notion of the civilizing process not to describe how the British or Americans tried to give birth to new nations in India and the Philippines. Rather, he attempts to decolonize The Civilizing Process and use Elias to describe Westerners' attempts at colonization. Rotter's coupling of Elias with Said may seem strange at first, but it pays dividends again and again in each chapter. Whether using the sense of sight to condemn the diseased or the over-sexualized, taste to distinguish "colonial" food from healthy food, smell to locate disease and poor hygiene, hearing to control political speech, or touch to control the sexuality of the colonized, British and American colonizers built their empires through what they sensed in India and in the Philippines.

     

This was not a one-way process, however: the colonized struck back against the imperial senses. Though many Britons had misgivings about curry, Indian people changed the tastes of the colonial overseer, seducing their colonizers through their food. Colonizers and colonized mixed sexually too. Many Brits and Americans brought Indian or Filipino food and customs home with them and incorporated them into their own societies as time went on. Sometimes the very senses that Westerners produced became objects of resistance and even protest, the Sepoy Rebellion being just the most obvious example in India. Rotter showcases well how colonizers often became colonized by the very senses they were trying to colonize.

     

Rotter does not show inter-imperial exchange and learning as well as I would have liked. While it appears that imperialists looked at each other from afar, it remains unclear how intimately and how much Britons and Americans directly learned from each other. How imperial agents share information with each other, and how empires learn from other empires, are questions that remain for further research. Rotter likewise alludes to the idea that imperial agents learn how to implement their empire of the senses in part from their experiences on the domestic front while growing up in the core of the empire. Interpreting the senses in a colonial system is thus shaped not only by experiences on the frontier but also by past experiences in the core. If domestic and frontier landscapes are linked in imperial practices, then an imperial memory of home must exist. While Rotter focuses solely on bodily encounters, how senses stimulate psychic memories to become imperialistic is an avenue that could be further explored by others. Finally, Rotter's focus on the anglophone world is insightful and important, and hopefully it will inspire similar comparisons in the francophone world, francophone-anglophone comparisons, or other comparative studies of empire.

     

Overall, Empires of the Senses provides a fascinating comparative exploration of how the senses work within two imperialistic regimes. It is a worthwhile read for historians, graduate students, and general readers interested in sensory history, food, disease, or empire-building in the anglophone world.

     

Corrected American History: The Cheery Colonial Period

In 1626, Peter Minuit bought Manhattan Island. This first real estate deal foreshadowed the great deals Donald Trump would make in the same market.


SAN DIEGO--President Trunk has decided to counter the current “crusade against American history” by setting up the 1776 Commission to teach students “about the miracle of American history.” Historians from Fox News have leaked advance outlines of the lesson plans for the different eras in the American past. Here’s the one for the colonial period.

1607: White Anglo-Saxon Protestants settled Jamestown. Since the indigenous inhabitants of the area hadn’t done anything to develop the land, the colonists claimed the area as their own, promising to civilize it by building casinos, golf courses, and, most importantly, fast food restaurants. An Indian maiden named Pocahontas recognized how superior white people were and married one of them to improve the genes of her children. Since this went against natural selection, the name Pocahontas would connote a nasty woman in the future.

1619: The Virginia colony had a labor shortage, so it advertised for workers from Africa and covered all their expenses to sail to America on spacious boats that had rowing machines for exercise. This arrangement worked out so well for both parties that millions more unemployed Africans chose to come to America over the next two centuries, where they enjoyed free room and board as fringe benefits of their jobs toiling in the cotton fields.

1626: Peter Minuit, a Dutchman who was white and Protestant but not British--and two out of three isn’t bad--purchased Manhattan for $24. This was the first Manhattan real estate transaction and foreshadowed the great deals Donald Trump would make in the same market.

1692-1693: Witches fomented anarchy in Salem. Since the Republicans governed Salem, they executed twenty of them. In nearby towns with Democrat mayors, the witches ruled and caroused with the devil in their suburbs.

1773: England had imposed taxes on the American colonies without letting the colonists vote on the matter. Voting would have been safe back then because the US Postal Service didn’t exist yet. The colonists protested by tossing imported tea into Boston Harbor. Many years later a political movement called the Tea Party would pave the way for Donald Trump to become president, obliging him to lower taxes on big corporations.

1776: The colonies issued the Declaration of Independence. The original document was recently discovered, and its opening paragraph differs from the one left-wing historians have foisted on unwitting American students for generations: “We hold these truths to be self-evident, that wealthy white men are created equal, that they are endowed by their Creator with certain rights; that among these are life for fetuses, liberty from masks, and the pursuit of gun ownership.”

Roundup Top Ten for October 2, 2020

The Electoral Punt

by Jonathan Wilson

Many people imagine they understand the Framers' intent in creating the Electoral College. They impute more clarity of purpose than they should to a group who essentially made a slapdash compromise in order to be finished with the ordeal of drafting the Constitution.

Is Amy Coney Barrett Joining a Supreme Court Built for the Wealthy?

by Kim Phillips-Fein

Amy Coney Barrett's judicial record indicates she would help the court move back to the Lochner era by crippling regulation and ruling against labor unions.

When the Privileged Don’t Pay their Fair Share in Taxes, it Can Spark Revolution

by Christine Adams

The unfairness of the tax system, especially as the French government faced bankruptcy in the late 1780s, was one of the factors that triggered revolution in 1789.

Trump is Afraid of Honest History

by James Grossman

Trump's proposal for a "1776 Commission" suggests that history teachers should be cheerleaders, reducing the nation’s complex past to a simplistic and inaccurate narrative of unique virtue and perpetual progress.

D.C. Statehood Is Good for the Democrats, Good for Democracy

by George Derek Musgrove and Chris Myers Asch

DC statehood will secure the citizenship rights of the city's residents and begin to repair the crisis of legitimacy caused by the gross imbalance of political representation in the U.S. Senate.

On the Peaceful Transfer of Power: Lessons from 1800

by Sara Georgini

Adams lost the presidency amid violent factionalism, a seething press, rampant electioneering, and the eruption of party politics, yet became a champion for the peaceful transfer of power.

How Trump Brought Home the Endless War

by Stephen Wertheim

The Global War on Terror reconfigured American foreign policy around military force against abstract ideas and indeterminate enemies. The divisions of domestic politics set the stage for Donald Trump to move the war to the streets of the United States.

The Supreme Court Used to be Openly Political. It Traded Partisanship for Power

by Rachel Shelden

Americans once assumed that the constitutionality of a given law was a matter to be settled through legislative politics and elections, and selected judges on a partisan basis. Today's court is no less political or ideological, but can exert more power because of its nominal freedom from partisan politics.

The Case to End the Supreme Court as We Know It

by Keeanga-Yamahtta Taylor

The Supreme Court has historically supported democratic and egalitarian change only when forced by social movements. People must stop looking to the power invested in the court and start looking for the power latent in themselves.

Trump's 1776 Education Plan Part of a Decades-Long, Right-Wing Movement — But Scarier

by Natalia Mehlman Petrzela

Today's battle over patriotic education doesn't just threaten a particular curriculum or course of social studies teaching, but is part of a broad attack on critical inquiry and public education.
