We Are All Americans Here: The Crisis of Civic Empathy and our Besieged Democracy

Martin Luther King Jr. with Rabbi Israel Dresner

One week before Christmas 1944, the 422nd Regiment of the 106th Infantry Division of the U.S. Army was captured during the Battle of the Bulge, the final German offensive of the Second World War. The soldiers soon found themselves among 1,292 American POWs at Stalag IX-A near Ziegenhain, Germany. Among their ranks were more than 200 anxious Jewish soldiers, keenly aware that with defeat approaching, the Nazis’ extermination of Jewish POWs was accelerating. On January 27, 1945, the camp’s highest-ranking American noncommissioned officer, Master Sergeant Roddie Edmonds of the 422nd, received an order from the camp commandant, Major Siegmann: “Tomorrow morning at roll call all Jewish Americans must assemble—only the Jews—no one else. All who disobey this order will be shot.” “We’re not doing that,” Edmonds told his men. The word went down the line for the men to fall out—Americans with names like McCoy and Smith would stand alongside their Jewish comrades. “Raus! Raus!” came the order at 0600. With all 1,292 Americans in formation, Siegmann approached Edmonds. “Were my orders not clear, Sergeant?” Edmonds was a stoic, well-respected Tennessean who, as Paul Stern recalled, had “probably never met a Jew in his life.” “Only the Jews!” the major yelled. “They cannot all be Jews.” “We are all Jews here,” replied Edmonds. Siegmann pointed his Luger at Edmonds’s head: “Sergeant, one last chance.” “Major,” Edmonds coolly replied, “you can shoot me, but you’ll have to kill all of us.” Siegmann lowered his pistol. Roddie Edmonds never spoke of the incident, but for the men of Stalag IX-A, he stood forever as a “shining example,” as Stern put it, “for all of us to remember and try to emulate in our own lives.”

Beyond its profound illumination of moral courage, the Roddie Edmonds story exemplifies the righteous, democratic future that millions of Americans believed they were fighting and sacrificing for in World War II. The exalted hope that a more just society could emerge through the crucible of a war against fascism was palpable on the home front, grounded in a spirit of civic empathy—what the ancient Greeks called agape. The highest form of human love, agape is selfless, and it is brave—“love in action,” Dr. Martin Luther King Jr. later termed it, “a love in which the individual seeks not his own good” but the greater good. Its supreme policy embodiment was President Franklin Roosevelt’s “Economic (or ‘Second’) Bill of Rights,” proposed in 1944. Envisioning postwar guarantees of full employment, veterans’ benefits, and universal education and medical care, FDR’s proposal—widely hailed at home and around the globe—aimed to forge a strong social democracy that could stand as a bulwark against the reemergence of illiberal authoritarianism.

Although postwar political life was characterized mostly by conservative stasis that doomed such egalitarian liberalism, agape lived on as the radiant heart of the civil rights movement. One of its most dramatic expressions came in 1964 in St. Augustine, Florida. In the face of violent Klan resistance to integration that spring and a U.S. Senate filibuster of landmark civil rights legislation, Dr. King appealed to long-time compatriot Rabbi Israel Dresner to “send as many rabbis as you can.” On June 18, Dresner led 16 Jewish clerics in a motel “pray-in” at the segregated Monson Motor Lodge, resulting in the largest mass arrest of rabbis in American history. “We came as Jews who remember the millions of faceless people who stood quietly,” they declared, “watching the smoke rise from Hitler’s crematoria.” Following a final action, a dramatic “swim-in” that afternoon at the Monson swimming pool by other activists, the Senate voted for the Civil Rights Act the next day, 73-27.

Four years later, Senator Robert F. Kennedy’s run for the presidency offered an inspired 84-day clinic in civic empathy. Wherever he went—from West Virginia coal country to forsaken Native American reservations, urban ghettoes, and barrios of the Southwest—Kennedy bound himself to the struggles of working people. Running at one of the most turbulent, divided moments in our history, Kennedy frequently reminded Americans that they yet shared “a common concern for each other.” Still, he was unafraid to challenge them. Speaking to a local service club one night in Vincennes, Indiana, Kennedy strayed from his text and the issue most of the men wanted to hear about: the need for more law and order. Instead, as journalist Thomas Congdon, Jr. summarized:

[With] big, heavy men . . . still occupied in shoveling in their lunch—the Senator from New York spoke of children starving, of “American children, starving in America.” It was reverse demagoguery—he was telling them precisely the opposite of what they wanted to hear. “Do you know,” he asked, voice rising, “there are more rats than people in New York City?” Now this struck the club members as an apt metaphor for what they had always believed about New York City, and a number of them guffawed. Kennedy went grim, and with terrible deliberateness said, “Don’t … laugh” . . . There were a few more hushed questions, and finally he escaped to confused applause.

With naked vulnerability amid a very dark time, Kennedy appealed to Americans’ better angels. Coming on the heels of Dr. King’s murder, RFK’s assassination that dark June night slammed the door on an era that had begun with boundless, hopeful possibilities. The spring of 1968 glowers at us still—a decisive moment when a society that had faintly yearned for a more generous, agapic future began surrendering to cynicism.

The forces that set us on a course for Trumpism a half century later are complex. So are the reasons for the appalling truth that millions of our fellow citizens maintain support for an authoritarian sociopath whose galaxy of transgressions includes mocking a disabled reporter, bragging of sexual predation, uttering more than 30,000 lies or misleading statements as president, describing neo-Nazis as “very fine people,” and summoning insurrectionists to Washington on January 6, 2021, to overturn our democracy. Many of us live in a state of grief that this could really be happening to the nation defended by our parents and grandparents.

Our current crisis runs deep. It is manifested in, but also transcends, antidemocratic state voting laws, the filibuster, gerrymandering, the epistemological wonderland of social media, and the metastasizing of far-right extremism among white Americans who would rather have us live under white minoritarian autocracy than together in a multiracial democracy. Its unholy, cynical depravity was most recently laid bare by Republican demands to “harden” our schools—a revealing term for our times if ever there was one—following mass shootings of third graders and black grocery shoppers.

On June 11, 1963, in the midst of a crisis that saw firehoses and dogs sicced on black children in the streets of Birmingham, Alabama, President John F. Kennedy beseeched Americans to see civil rights as a "moral issue ... as old as the scriptures and as clear as the Constitution." Nearly sixty years later, the racial injustice that JFK identified as a spiritual matter persists, now entangled with the existential crisis of democracy. As then, what we need most is a rediscovery of our shared humanity and the divine truth that we are all Americans here.

Dr. Gillian Frank on the Decimation of Roe v. Wade

Well, that was a Terrible, Horrible, No Good, Very Bad, Unprecedented, Precedent-Destroying Week at the Supreme Court. To help make sense of SCOTUS’ reversal of Roe v. Wade, I spoke to Dr. Gillian Frank, a historian of sexuality and religion. This is the first of a series of Skipped History conversations on how Roe fell, and where we go from here.

Dr. Frank writes on the intertwined histories of religion, sexuality, and gender in the US, and is the co-host of Sexing History, a podcast about how the history of sexuality shapes our present. We chatted about the history of antiabortion laws, the public health crisis that's assuredly about to arise, and how the fight for abortion rights doesn't end now. 

A condensed transcript edited for clarity is below. If you'd like the audio of the full conversation (and/or more Skipped History in general), you can try out life as a paying subscriber here. I hope you learn as much as I did.

Ben: Dr. Frank, it’s a pleasure to chat with you, albeit under such disturbing circumstances.

GF: Thanks for having me.

Ben: To ground our conversation, could you talk about the rise of the first anti-abortion laws in the 19th century?

GF: Sure, though before we get to the restrictions, we need to note that abortion was widely practiced for most of American history up until the 1860s, 70s, and 80s. But then, a confluence of factors led to new restrictions on abortion. 

The first factor was the professionalization of physicians. With the founding of the American Medical Association (AMA) and the regulation of the profession, physicians tried to standardize medical practice and seize power over it. Part of that meant stamping out what we would now call quacks, untrained and unskilled physicians. But it also meant seizing power from skilled medical practitioners such as midwives, who offered competition and covered all the things an OB/GYN would cover, including abortion.

Ben: Might I wager a guess that most of the members of the AMA, if not all, were white men?

GF: That would be correct, yes.

Ben: Well, as long as we are consistent.

GF: The second factor was the coincident rise of social purity movements. People like Anthony Comstock and others had huge concerns about what they called the “moral degradation” of American society. Comstock was a religious fundamentalist.

And he, like many white Protestants, feared that the arrival of Catholic Eastern Europeans meant good Protestant stock was declining. 

Ben: It's just a pure coincidence that his name is Comstock and he's worried about the population's “stock”?

GF: I had never thought of it that way, but yes, pure coincidence. And so Comstock, joined by others, went from town to town, city to city, state to state, to campaign for abortion restrictions. Their efforts succeeded. 

In the 1880s, state by state, legislators pass what are called Comstock Laws, banning information about abortion and contraception, as well as banning abortion except to save the life of the mother. By the late 19th century, we see the completion of a shift from abortion being widely practiced to abortion being tightly restricted and in the domain of licensed male physicians.

Ben: After the Comstock Laws spread, what options did women face when seeking an abortion?

GF: Of course, the demand never stopped. The question then is: who did women go to? Where did they get abortions? 

One answer is that, as we fast-forward to the 1920s and 1930s, most cities and towns had a reliable full-time abortion provider. Members of the medical establishment felt that if a patient needed an abortion, they could at least send them to the known provider to do the procedure, and then tidy up the aftermath.

At the same time, unskilled people knew there was a strong demand for abortion, and many women turned to a black market that emerged. Newspaper headlines would regularly describe how people would seek out a gas station attendant or a trusted friend or someone who knew someone who said they could supposedly do an abortion.

Skilled or not, all of these practitioners operated outside the law. As a result, expenses went up, safety went down, and there were fatalities.

Ben: Wow, okay.

GF: Now, post-1940s, demand isn't ceasing. In fact, demand for abortion is growing as part of a postwar sexual loosening up.

Ben: The post-World War II orgy, if you will.

GF: I wouldn't phrase it that way, but I'm sure there were some. 

And around the same time, there's a strong push to establish the so-called “nuclear family.” As women are pushed out of full-time work into part-time or homemaking work, we see a tightening of abortion restrictions.

Not coincidentally, in the 40s, Catholic physicians conduct a survey of how their hospitals are treating abortions. What are the numbers of abortions that they're providing versus, say, hospitals like Johns Hopkins or others that are Protestant, secular, or just non-Catholic? 

And they find that there's a wide discrepancy, suggesting that non-Catholic hospitals are providing a lot of illegal abortions.

This leads to another round of regulation that spurs hospitals to self-regulate for fear of violating the law. Hospitals introduce committees to review every abortion case that comes in. These committees basically become a way of driving down the number of abortions that are legal and accessible.

And so, entering the 50s, we have what we might call a quiet sexual revolution going on and less access to safe abortions. What happens? Again, rising body counts. 

Ben: In other words, a public health crisis.

GF: Right. A public health crisis emerges. Lawyers are aware of it. Police are aware of it. The clergy is aware of it. 

Ben: How does this growing mass awareness coalesce into a pro-abortion movement? In your work, you describe a religious alliance that forms, almost like a priest, a minister, and a rabbi walk into a bar and decide to support women’s rights.

GF: Yes, minus the priest at this point. 

The early abortion rights movement is largely coming from professionals: physicians, lawyers, and clergy who believe that they should have the autonomy to make decisions with their patients.

As for the clergy, it's mainly ministers from mainline Protestant denominations, as well as rabbis from the Reform and Conservative denominations. They've already been on board with contraception for decades and see family planning as an ethical duty and sex for pleasure within marriage as natural, normal, and desired.

One of the big galvanizing factors was German measles, aka the rubella epidemic, which was causing a lot of fetal deformities. And there was a scare over thalidomide, a tranquilizer that also led to birth defects. So religious figures worried about the health of fetuses. (You can hear the ableist language of the decade.) 

Male lawyers, clergy, and physicians begin to see it as a moral outrage that when women they deemed worthy—i.e. women who were white, middle- and upper-class, and married—needed abortions, they had to either fly abroad or just couldn’t get them. 

Meanwhile, a growing women's liberation movement and second-wave liberal feminist movement are also seeing sexual matters as essential to understanding the politics of oppression and to achieving full empowerment and social equality. Abortion becomes central to this conversation.

So, by the mid-60s, in California, New York, and other states, legislatures are considering abortion law reform. There's an emerging consensus around a question: what does it mean if hundreds of thousands of people a year are violating the law?

And I haven't even gone down the full list of all the people who are concerned about abortion. But, long story short, by 1970, New York has legalized abortion with no residency requirements. Hawaii does the same around the same time. Kansas comes next. And then a whole slew of other states follow suit.

Ben: So, at the risk of overgeneralizing, in a story that we'll explore another time, this momentum leads to the Supreme Court's decision in Roe v. Wade in 1973.

Something that occurred to me while reading your work, and which has occurred to me throughout the conservative push to overturn Roe, is: isn't it very clear that we're heading toward another public health crisis? Why isn't there more momentum toward stopping it?

GF: Your basic question is what happens next?

Ben: It is, yes, but I hate to be too explicit about that in front of a historian.

GF: Okay. Will this create a public health crisis? Inevitably. 

By denying medical access, by making abortion more expensive, by trying to criminalize it, by increasing the social stigma around it; by empowering states to demonize those who seek and provide abortions and those who share information about the procedure; all of these things will have repercussions, the least of which is that, for women who want to terminate a pregnancy in states with very restrictive laws, it'll become more time-consuming, more expensive, and more difficult.

These obstacles will deter some people, but if the past is any indication, many other folks are going to attempt self-managed abortions.

Now, the past is not the same as the present. We have new technologies, new ways of getting information. But will there be people left behind? Inevitably. What will be the mental health, economic, or public health consequences of folks compelled to have children they don't want? As many people are pointing out, this oppression will have negative, cascading effects on people's lives. And for folks who inevitably end up traveling great distances, taking lots of time, and spending more money on abortion? Well, all of these things wear on a person and make an ordinary medical procedure traumatic. 

I can't predict body count. I can't predict maternal mortality. But I'd emphasize one difference between now and the past: pre-Roe, there was not a political party that uniformly made opposition to abortion its platform. In the past, both Democrats and Republicans were split over abortion.

Ben: As in, until even a couple of decades ago, members were split within both parties.

GF: Yes, split within both parties. But the ways in which the Republican Party has become radically conservative, if not outright anti-democratic and authoritarian, make for a different situation today.

In the past, criminal penalties were almost never aimed at someone seeking an abortion. Republicans are floating those penalties now and trying to build a much greater punitive regime. This is an anti-abortion regime on steroids as compared to the past.

Ben: Wow. I suppose the increasing radicalization of this regime suggests there's no silver lining at the moment.

GF: No, I won't say there's a silver lining, but I will say that it's not hopeless; that for decades now there have been groups preparing for this day; that there are many activists on the ground who have already been creating infrastructures.

This story doesn't stop with the decimation of Roe v. Wade. It's not a simple end of abortion. Rather, it's a transformation of how it can be accessed. It’s part of an ongoing struggle to provide health and dignity to millions of Americans.

Ben: Thank you for that reminder, and for your time today.

GF: Real pleasure to speak with you.

Where Will America Be by 2030?

Editor's Note: This essay was submitted for publication before the official release of the Supreme Court's final round of opinions for the term, including in Dobbs v. Jackson Women's Health Organization.

At the time that I wrote this essay, a woman’s access to abortion was technically the constitutionally mandated law of the land. By the time that you read this article, Roe v. Wade will have been struck down by the Supreme Court, speaking to how rapidly the United States is being transformed by an extreme right-wing political project. If you read this column again, let’s say in a year after the midterm elections, and if the Republicans have retaken both the House and the Senate, then fully expect that there will be a national abortion ban, as Senate Minority Leader Mitch McConnell has suggested is possible, for when it comes to the most extreme of their positions, the ultra-right is always reliable in at least that regard. Justice Samuel Alito’s leaked decision in Dobbs v. Jackson Women’s Health Organization rather disingenuously and mockingly claimed that abortion rights would become entirely a states’ issue, but the revanchists have wasted no time in pushing for a federal ban on a basic issue of women’s bodily autonomy.

Last week, the conservative “think tank” the Heritage Foundation, which believes in the unfettered freedom to poison the environment and exploit workers but not to make intimately personal decisions about your own health, held a “Life after Roe Symposium,” endorsing a federal ban on abortion. The National Right to Life Committee has plans that are even more dystopian, criminalizing speech that’s pro-choice, with Susan Rinkunas of Jezebel aptly describing such proposed policy as establishing a “worsening hell of criminalization, state surveillance, and maternal deaths.” At their Washington, DC symposium, the Heritage Foundation’s president Kevin Roberts intoned, “We’re being called to reclaim this country.”

Believe the fascists when they say this, and believe that the sweeping, radical, authoritarian program they have in mind for the United States includes not just a ban on abortion, but the establishment of a de facto Christian theocracy.  

A June 2022 poll from ABC News/Ipsos reported that 70% of Americans support comprehensive gun control legislation, with Ivana Saric of Axios noting that “Americans overwhelmingly prioritize gun control over ownership rights,” but on June 23, the Supreme Court struck down even the most tepid of common-sense gun control in New York State Rifle & Pistol Association v. Bruen, the same day that the court gutted Miranda rights. The first decision’s majority opinion was written by Justice Clarence Thomas, whose wife, it’s always worth noting, was enthusiastically involved in the attempted coup on January 6th.

An October 2021 poll from the Pew Research Center revealed that 54% of Americans support a robust separation of church and state, but this week’s decision in Carson v. Makin mandated that states must support explicitly religious education with taxpayer money, though many on the right are already adamant about just what constitutes a faith, with Josh Blackman of Reason magazine claiming that Reform Judaism is undeserving of religious freedom protections since he maintains that their beliefs aren’t held “sincerely.”

Meanwhile, just this month Gallup demonstrated that 55% of Americans identify as “pro-choice,” and even larger majorities reject the most extreme positions of the forced-birthing movement. Yet all of us fully expect that Roe v. Wade will be struck down before the end of the month. It’s worth remembering that of the unelected justices empowered with making these decisions, a third (Neil Gorsuch, Brett Kavanaugh, and Amy Coney Barrett) were appointed by Donald Trump, who received far less than half of the popular vote. And, while John Roberts and Samuel Alito were appointed during George W. Bush’s second term, he first took office after losing the popular vote as well. This is the most manifestly anti-democratic court since Chief Justice Roger Taney’s in the years before the Civil War. That the Supreme Court in no way represents the interests or beliefs of the mass of the electorate is a truism so obvious it sometimes goes unstated, but it’s important to state it, emphatically and often. Even more disturbing, obviously, is to what end all of these decisions are being made, for as Elie Mystal and Ryan Doerfler note at The Nation, the “court’s conservatives – now a rock-solid majority – seem ready to complete the ideological project even openly and triumphantly.”

To focus only on the judicial activism from the bench is to miss the forest for the trees, however, for our slide into autocracy isn’t just manifested in 6-3 decisions out of the Supreme Court, but in the rank and file of the national and state GOP, and the armed shock troops of the Republican Party’s ongoing insurrection, from the Proud Boys to the Three Percenters. Supreme Court Justice Louis Brandeis wrote in 1932 that the states were the “laboratories of democracy”; today they’re rather the torture cells of autocracy. As a case in point, examine the 2022 Texas State GOP’s official platform ratified this month, which among other planks proposes a state-level electoral college that would completely disenfranchise the majority of Texans, declares LGBTQ people to be guilty of an “abnormal lifestyle choice” while proposing policies to strip them of their basic rights, officially maintains that President Joseph Biden was not the winner of the 2020 election, and demands a referendum be held by next year concerning secession.

For anyone not in thrall to patriotic pablum and paeans of American exceptionalism, the goal of the far right and the GOP, who are now synonymous, is obvious – to create a permanent state of reactionary minoritarian rule, where women and their doctors can be jailed because of abortion, where your tax money funds evangelical Christian schools, and where protest is kept in check by police and federal agents who can no longer be charged with the violation of individual rights, assisted by militias who’ve assembled an arsenal of millions of firearms. None of this is randomly decided, and all of it works in tandem. In her book American Fascism: How the GOP is Subverting Democracy, trans activist and Senior Defense Analyst at the RAND Corporation Brynn Tannehill writes that American democracy isn’t “just in steep decline, but that we… [are] in the middle of an autocratic attempt.”

Everyone on the right understands and enthusiastically embraces this, save for maybe a few sad representatives who are accused of being “RINOs” by their own constituents and are unwilling to admit the role which the Republican Party played in getting us to this point. Some liberals and some leftists comprehend the score, but not yet enough, and entirely too many are also unwilling to admit how the quisling, milquetoast centrism of the DNC is responsible as well. The January 6th Committee, as admirable and necessary as that work may be, has engaged entirely too much in preaching to the choir, and not enough in preventing whatever awaits us on January 6th, 2025. Speaker of the House Nancy Pelosi seems, like most of her colleagues, too enraptured with the myth of the Good Republican, and President Biden has never availed himself of the rhetorical power of the bully pulpit right when we most need it.

That the Democratic Party seems unaware that the United States is headed toward either an apartheid theocracy or civil war doesn’t mean that the bulk of voters don’t have that awareness. The unending litany of horrors that has appeared with increasing regularity since the 2016 presidential election doesn’t need to be subtly interpreted by most of us, from the increasingly genocidal anti-LGBTQ rhetoric of conservative politicians to the whitewashing of historical atrocity, to the cruel, misogynistic anti-abortion bounty-hunter laws enacted in Texas and elsewhere, to the proliferation of far-right American brownshirt phalanxes. In America, a well-armed mobocracy is instrumental to the maintenance of stochastic, reactionary tyranny. As Jeff Sharlet warns in his upcoming, brilliant book The Undertow: Scenes from a Slow Civil War,

Such are the spiritual truths of the Trumpocene… A ‘fringe’ that surrounds the center and moves inward, until suddenly there it is, the fringe, at the heart of things: the QAnon Shaman in the Senate chamber, or Representative Marjorie Taylor Greene… or Representative Lauren Boebert, who open carries; the ‘zip tie guy’ or Ginni Thomas, who helped plan that awful day in Washington.

Nobody sane wants a civil war, but how do we prevent it? How do we avoid fascism? General strike? Tax protest? Let me know by 2030 if you’re still allowed to read things like this.   

Excerpt: INAUGURAL BALLERS: The True Story of the First U.S. Olympic Women’s Basketball Team

The 1976 United States women's Olympic basketball team. Photo USA Basketball

CHAPTER 1: EVERLASTING

The locker room shook with music, women singing along with the Natalie Cole tape blasting from the small speakers in the corner. 

THIS will be . . . an everlasting love 

THIS will be . . . the one I’ve waited for 

Someone turned off the tape player, and the room grew quieter. The only thing breaking the silence was the muffled murmur of thousands of spectators from around the world who had traveled to Canada for the eighteenth Olympic Games. 

American basketball coach Billie Moore stood before her players in the bowels of the famed Montreal Forum, just minutes before her team was to play Czechoslovakia in a game to determine the winner of the silver medal. The women in front of her would go on to become some of the most legendary names in the history of the sport, but at this moment they were still largely unknown. For people who paid attention to women’s basketball, it was a surprise this team had even made it to Montreal, let alone that it was in position to earn medals in the first women’s Olympic basketball tournament ever played. The United States had placed a dismal eighth at the World Championships in Colombia a year earlier, only qualifying for the Olympics in a last-minute tournament for also-rans just two weeks before the opening ceremony. Heading into the Olympics, one sportswriter declared that the only positive thing anyone could say about US women’s basketball in the past was that it wasn’t the most inept program in the world. “Maybe the second or third worst,” he wrote, “but not the worst.”

A basketball coach must choose her words carefully in a pregame speech—just enough motivation, not too much pressure. As she scanned the room, locking eyes with the veteran co-captain from rural Tennessee, the brash young redhead from Long Island, and the quietly determined Black center from the Mississippi Delta, Moore sensed her players could handle a message that had been on her mind ever since the team’s training camp in Warrensburg, Missouri, six weeks earlier.

The coach had confidence in this group, and though she didn’t think much about politics, she understood the moment in time in which this team existed. In the summer of 1976, women were demanding rights and opportunities all over the world. The United States had just celebrated its bicentennial on July 4, a time for Americans to ponder whether all citizens were truly free.

Moore knew this game was an important stepping-stone on the journey to equality. Pat, Lusia, Annie, Nancy L., Nancy D., Mary Anne, Sue, Juliene, Charlotte, Cindy, Trish, and Gail wouldn’t just be playing for themselves but also for the women before them who had been denied opportunities. They would be playing for the little girls who yearned to hoop, and generations of athletes yet to be born.

Rather than calm her players’ nerves by telling them to remember this was just another game, no different than any they’d played before, Billie Moore laid it all on the line.

“Win this game,” she told her team, “and it will change women’s sports in this country for the next twenty-five years.”

50 Years Ago, a SCOTUS Decision Placed a Moratorium on Executions. It's Time to Revive It, Permanently

Fifty years ago, in 1972, as spring faded and summer arrived in late June, America (and the world) was a vastly different place.

The United States was still entangled in the quagmire of the Vietnam War, and tens, if not hundreds, of thousands of individuals still marched on city streets and on university campuses demanding an end to the bloodshed that would ultimately claim the lives of over 58,000 American soldiers and 3 million Vietnamese.

On May 15, Alabama Governor and presidential candidate George Wallace was shot (and paralyzed) by Arthur Bremer in a parking lot in Laurel, Maryland. Within two weeks, there would be two failed break-ins at the Watergate complex in Washington, D.C., a crime that led to the downfall and resignation of President Richard Nixon in August 1974.

In June, the Libertarian Party held its first national convention, becoming the first party to call for the repeal of all victimless crime laws. The five Watergate burglars and White House operatives were arrested for the break-in at the offices of the Democratic National Committee.

On June 29, the US Supreme Court issued a monumental decision in Furman v. Georgia, holding that the US death penalty was unconstitutional because it was administered in both a racially and geographically discriminatory manner, and converting all existing death sentences to life imprisonment. The decision saved the lives of 611 inmates in 31 states, including members of the notorious Manson Family in California; Sirhan Sirhan, the convicted assassin of Sen. Robert Kennedy, also on death row in California; and Richard Speck, convicted of killing 8 female student nurses in their Chicago residence.

Only a decade before, in 1962, the United States carried out 47 executions in 17 states, led by California (11) and Texas (9). Other executing states included: Alabama, Colorado, Florida, Georgia, Illinois, Iowa, Kansas, Kentucky, Mississippi, New Jersey, Ohio, Oregon, Pennsylvania, South Carolina and Virginia. 

The 1960s had no shortage of both global and domestic violence and upheaval. Yet, U.S. executions declined. In 1963 there were 21 executions in 13 states led by Texas (4); 15 executions in 8 states in 1964, led by Texas (5); 7 executions in 4 states in 1965, led by Kansas (4); Oklahoma carried out the nation’s only execution in 1966; and only 2 inmates were executed in 1967, in California and Colorado.

An unofficial moratorium on executions began in 1968, and it would last until January 17, 1977. This was a far cry from the peak of U.S. executions in the 1930s, during the Great Depression, when an average of 167 executions were carried out yearly.

The case of Furman v. Georgia actually had its origins in a death penalty decision in Arkansas in 1962.

William Maxwell, an African American man, had been convicted and sentenced to death for the rape of a 35-year-old white woman. His appeal had been rejected by the Arkansas state supreme court, and when the U.S. Supreme Court refused to hear his case, it was returned to the Arkansas state supreme court.

University of Pennsylvania law professor Anthony Amsterdam was contacted by the Legal Defense Fund, and he led the effort in a petition for habeas relief, arguing that the death penalty was unconstitutional because “jurors were given no guidance about how to reach a decision, leading to arbitrary results; the single-verdict trial, in which the jurors had decided Maxwell’s guilt and sentence simultaneously, denied them the opportunity to weigh mitigating factors,” and because of demonstrable racial bias: in the two decades from 1945 to 1965, “black defendants who raped white women in Arkansas stood a 50% chance of being sentenced to death if they were convicted, compared to a 14% chance for white offenders.”

Amsterdam was the lead attorney several years later when he argued before the U.S. Supreme Court that the death penalty was overwhelmingly used against the “predominantly poor, black, personally ugly, and socially unacceptable,” the very people “for whom there simply is no pressure on the (state) legislature” to remove the death penalty.

The Court gave its answer on June 29, 1972, in a decision that reached almost 80,000 words and to this day remains among the longest it has ever issued. By the narrowest of margins, 5-4, the court found that the U.S. death penalty as administered was indeed “cruel and unusual punishment,” thereby violating both the Eighth and Fourteenth Amendments.

Each of the nine justices wrote his own opinion, with Justice Potter Stewart concluding that “these death sentences are cruel and unusual in the same way that being struck by lightning is cruel and unusual.” Justice Byron White wrote that while he did not believe that the death penalty itself was unconstitutional, “the penalty is so infrequently imposed that the threat of execution is too attenuated to be of substantial service to criminal justice.”  

Justice Harry Blackmun, though dissenting, added that “I yield to no one in the depth of my distaste, antipathy, and, indeed, abhorrence for the death penalty … and of moral judgment exercised by finite minds.”

Both Justices Stewart and White made it clear they opposed the outright and complete abolition of the death penalty. Indeed, they left the door open for states to meet the Court’s insistence on a uniform standard for death sentencing across the nation, and only six months later, in December 1972, Florida enacted new death penalty statutes and over 30 states soon followed.

The Furman decision would remain in place until America’s bicentennial, when on July 2, 1976, the U.S. Supreme Court ruled on five different death penalty cases: Gregg v. Georgia, Proffitt v. Florida, Jurek v. Texas, Woodson v. North Carolina, and Roberts v. Louisiana.

The Court set out two new and broad guidelines that state legislatures were required to follow in order to have a legal capital sentencing statute: first, there had to be objective criteria to direct and limit death sentencing discretion, and the objectivity of the criteria had to be ensured by appellate review of all death sentences; and second, the defendant’s character and record had to be taken into account before a death sentence could be delivered. The Georgia, Florida, and Texas statutes were upheld in 7-2 votes, while the North Carolina and Louisiana statutes were struck down.

The national moratorium on executions was shattered on January 17, 1977, when Gary Gilmore became the first condemned inmate to be executed in the US since 1967, shot by a firing squad at the Utah State Penitentiary. The floodgates soon opened: the U.S. reached double-digit executions in 1984, peaked with 98 executions in 1999, and has remained in double digits virtually every year since, including 11 in 2021.

There have already been seven executions thus far in the U.S. this year, with the next one scheduled for July 13 in Texas. A recent federal court ruling now leaves 25 death row inmates in Oklahoma susceptible to execution there, and while 23 U.S. states no longer carry out this punishment, there are still almost 2,500 men and women who linger in the shadow of impending and eventual execution.

We remember well the rash of federal executions in the waning days of the Trump administration. Joe Biden became the first U.S. presidential candidate to openly support an anti-death penalty platform, though his promise to end the federal death penalty within six months of taking office rings hollow at the present time.

Still, we should remember, commemorate, and celebrate the Furman decision from 50 years ago. If only for a fleeting moment in our national history, the nation’s highest court — which had and has a terrible record in the defense, advocacy and protection of human rights, with its long and tainted history of legally defending slavery, genocide, eugenics, both racial and gender disenfranchisement, and other outrages inflicted against (groups of) individuals — showed what is and remains possible for this country regarding an end to the barbarism of the death penalty.

It remains the moral duty of all human rights activists to ensure that the moratorium on executions, once enacted for what proved to be only a passing moment, becomes a PERMANENT pillar of an America that is forever both death penalty-free and dedicated to the simple idea and essential truth of human rights, namely, that there is no such thing as a lesser person.

The Traitor King: Edward VIII and The Future of the British Monarchy

Edward VIII and Wallis Simpson, 1936

On Friday, December 11, 1936, the final vote on the Abdication Bill was passed in Parliament and Edward VIII ceased to be King. He had reigned for 326 days. His father’s premonition—that within twelve months of his death his son would "ruin himself"—had come true.

Later that day he broadcast to the nation, "I have found it impossible to carry the heavy burden of responsibility and to discharge my duties as King as I would wish to do without the help and support of the woman I love."

Immediately he was driven in heavy rain to the south coast port of Portsmouth where he crossed the gangway onto HMS Fury – the original choice, HMS Enchantress, was not deemed appropriate – with his dog, Slipper, under his arm. "I knew now that I was irretrievably on my own," he later wrote. "The drawbridges were going up behind me."

In order to marry the twice-divorced American Wallis Simpson, the new Duke of Windsor had given up everything: the job for which he had been prepared all his life, his interest in the Royal homes of Balmoral and Sandringham, his beloved country retreat Fort Belvedere, and the support of his family.

Their reaction was swift. As part of the financial settlement – he was to receive a £25,000-a-year annuity like his siblings – he was to stay out of Britain so as not to upstage his younger brother Bertie, now the new King George VI.

Though her sisters-in-law had been given the title of Her Royal Highness on marriage, this was denied to Wallis – an insult which rankled with the Windsors for the rest of their lives and was almost certainly illegal.

No member of his family attended his wedding in June, at which there were just seven British guests, and Lord Mountbatten, for whom the Duke had been best man, was forbidden from carrying out the same role for him.

The Church of England had banned any minister from conducting the service and the couple had resigned themselves to a simple civil ceremony until the Reverend Robert Anderson Jardine, an eccentric "large-nosed, bulging-eyes, red-faced" minister from the North of England, offered his services.

The Windsors were to be kept out of Britain for the rest of their lives – he lived another thirty-six years after the Abdication, she another fifty years. Apart from a wartime appointment as Governor of the Bahamas, he was denied any official role in spite of his intensive lobbying for some sort of diplomatic job involving America.

The couple divided their time between various rented homes in Paris, the South of France (one owned until recently by Roman Abramovich) and a flat in the Waldorf Astoria in New York, all whilst sponging off rich socialites in Palm Beach, Newport and Long Island.

They were not invited to his niece’s wedding or her coronation but they joined other members of the Royal Family at the unveiling of a plaque to the Duke’s mother, Queen Mary, in June 1967. Even then, they were not invited to stay for lunch. It was the first time the Royal Family and Wallis had met in over thirty years and she was to see them only once more in Britain – at her husband’s funeral.

The lesson of the Duke and Duchess of Windsor is clear. Once you decide to leave "The Firm," you are out. As the Queen pointed out to her grandson Harry, you cannot be half in and half out. It’s a lesson he is only just comprehending as his star fades.

The Royal Family has largely survived because it has been prepared to be ruthless and respond to public opinion. It may have acted too slowly over Prince Andrew, given the drip feed of critical stories about him over many years, but once he had agreed to pay off Virginia Giuffre he was a non-royal. He was banned from taking part in all but family occasions and not entitled to use his title of HRH, though he retains some honours including Knight of the Garter. The challenge now will be how to rehabilitate him in terms of public opinion since he cannot be kept out of sight forever.

Non-working royals had already been cut from the beauty parade on the Buckingham Palace balcony on ceremonial occasions, as part of a long-term plan of streamlining the Royal Family. The brand will now just be the Queen, Prince Charles and his wife Camilla, Prince William and his wife Kate, and their three children; this was very evident during the Platinum Jubilee celebrations.

The Royal Family is here to stay – three generations in waiting were on the Buckingham Palace balcony – and it will be one focused on public service rather than private pleasure. There will still be pomp but also a more casual approach, more engagement with charitable and environmental organizations and fewer ceremonial set pieces.

The image which the royals wish to project is of a youthful monarchy. The Queen may have reigned for seventy years and her son may be seventy-three, but much of the focus is on Prince William, just forty, and his young family.  Increasingly he is carrying out joint functions with his father, as he seems able to manage the conflicting desires for approachability and mystique.

The problem with streamlining is that everyone wants a bit of the Royal Family – the regiments of whom they are Colonels, the charities which rely on them for fund raising, the organizations which need them to champion their cause—and there are not enough of the dutiful ones to go around. 

The Queen’s generation – her cousins the Dukes of Gloucester and Kent – are now respectively in their late seventies and eighties, and so much is falling on Prince Charles’s siblings Anne (herself almost seventy-two) and Edward and their partners. Anne’s children have chosen to be non-Royals and Edward’s children are still teenagers, so the only option to fill the labor shortage is to give roles to the daughters of Prince Andrew, something he is keener on than the rest of the Royal Family. Watch this space for who wins this power struggle.

It is eighty-six years since the Abdication but it is still a resonant sore point within the Royal Family, a reminder that they are human. Britain was lucky that George VI and Elizabeth II proved to be model monarchs but times are a-changing and the succession of Charles – potentially within months – will be an opportunity to reassess the role, if not the continuation, of the monarchy.

It has lasted for over a thousand years and survived one of its greatest challenges in 1936. There is no reason it will not continue, albeit in a more low-key fashion, because in the end the brand comes before family, as the Duke and Duchess of Windsor learnt to their cost.

The Stench is Coming from Inside the Court

It has really come to this. Prime Minister Justin Trudeau of Canada just gave us a lecture on American democracy. In addressing the Supreme Court abortion decision overruling Roe v. Wade he said:

Today is a difficult day. The judgment coming out of the United States is an attack on women's freedom and quite frankly it is an attack on everyone's freedom and rights.

Normally, we don’t need foreign leaders to give us a lecture on human rights. But this time we deserve it. The core responsibility of judges is to make the law work for everybody, and this Supreme Court has failed abysmally. Polls say that 54% of Americans want a uniform national rule preserving Roe in some form; 28% think it should be overturned.

If American democracy is to endure, the country needs a Court widely accepted as an impartial umpire that can balance competing interests, one that, as Chief Justice Roberts famously put it in his confirmation hearings, can “call balls and strikes” and “not... pitch or bat.” But this new Court, with three reactionary justices appointed by Donald Trump, making rulings based more on policy than law, has forfeited all legitimacy. As a result, democracy is imperiled.

No wonder Justice Sotomayor chose the pejorative “stench” when she asked and answered her own rhetorical question: “Will this institution survive the stench that this creates in the public perception that the Constitution and its reading are just political acts? I don’t see how it is possible.”

The very structure of the common law is based on adherence to precedent. Judges are supposed to stand by their prior decisions. Roe v. Wade was said to be a “super precedent,” a fixture of the legal framework. Roe had been the law of the land for almost half a century. Seven justices, including three Nixon appointees, recognized in 1973 a woman’s fundamental right to terminate her pregnancy in the first trimester. As elaborated nineteen years later in Planned Parenthood v. Casey, this right continues until “viability,” an objectively verifiable moment when the child can live outside the mother. Viability is now thought of as occurring about 24 weeks after conception. The decisions, both now overruled, were widely accepted.

The Chief Justice, who likes to proceed incrementally, was against overruling Roe on the record before him. He pondered whether the Court could reaffirm Roe, and at the same time uphold the statute. Nevertheless, he concurred in the judgment. The three Trump justices, joined by Justices Thomas and Alito, who for years have vowed to overrule Roe, piled on to the decision. Ignoring the middle ground, they decided, “We have the votes; let’s go for the whole enchilada.” It should have come as no surprise. Trump, during his 2016 campaign, promised to appoint justices who would “automatically” overrule Roe. And so, they did.

Alito’s opinion for the majority weirdly cites the 17th-century English jurist Lord Chief Justice Matthew Hale. Hale was the greatest judicial misogynist in the history of the common law. He believed that spousal rape was legally impossible. He sentenced women to death as “witches” based on spectral evidence.

Abortion before “quickening” was legal at common law throughout the 18th and much of the 19th centuries. It was only in the latter half of the 19th century that the practice was criminalized in many states. And it was only in the late 20th century that it became politicized as Republicans shifted their strategy to target Catholics and evangelicals.

At oral argument in the abortion case, Justice Kavanaugh said that upending prior court rulings could be a good thing. He noted that the Court overruled the decision in Plessy v. Ferguson, which upheld racial segregation, 58 years after it was made; and that 17 years after the 5-4 decision in Bowers v. Hardwick, which approved the criminalization of homosexual acts between consenting adults, it overruled that decision in Lawrence v. Texas in a 6-3 split. Just wait, said Scalia in his Lawrence dissent, the next thing will be gay marriage. They approved of that too, 5-4. But these were landmark cases which overruled prior decisions to discover fundamental rights embedded in the Constitution, rights which, like the right to terminate a pregnancy, had developed over time with societal change and attitude.

The justices sprang with alacrity to overrule Roe v. Wade to extinguish a fundamental right, and leave the fate of poor women, perhaps sick women, or women impregnated by rape or incest in the hands of mostly male legislators in perhaps 26 states poised to outlaw reproductive rights. Many of these states had legalized school segregation until 1954, and legislated segregation in public facilities until 1964.

The abortion decision is in shocking contrast with the one striking down New York’s venerable law proscribing the unlicensed carrying of concealed weapons, decided by the same five justices who trashed Roe. The people who will be killed in public with more handguns on the street also have a right to life. And they are all out of the womb.

It’s staggering how the Court can celebrate giving state legislatures political control over women’s bodies the very day after the same justices took from the same legislatures the ability to protect lives from concealed weapons carried in public at the ready. Of course, the judges were quick to stress that the Second Amendment did not prevent the government from banning concealed weapons in courtrooms.

Alito said that his decision implicates abortion and no other right. But no one really believes him. Professor Laurence Tribe of Harvard Law School calls Alito’s carve-out “BS.”

In a concurring opinion, Justice Clarence Thomas indicated that some of these unenumerated rights, for example contraception and marriage equality, may soon be on the chopping block. Henceforth, the Constitution is no longer an evolving “living” document; it is as dead as a doornail.

The constitutional rationale for abortion rights as elaborated in Roe and Casey is a balancing between the fundamental rights of the mother (privacy, liberty, autonomy, and freedom from discrimination, since men can’t get pregnant) and the rights of the unborn fetus. If the judge believes out of religious faith or moral conviction that human life begins with conception, then abortion is of course murder, and the states may outlaw it. But if the judge accepts the science that the fetus is, at least for a time, part of the mother, having no prospect of independent existence, the woman has the fundamental right to terminate her pregnancy as she would have the right to have her tonsils out. But, maybe not quite.

The Court has opened Pandora’s box. Conservative states, representing about half of the states of the U.S., are poised to totally ban abortion. They may also be free to punish, possibly with criminal penalties, doctors, health counselors, Uber drivers, funders and travel agents who facilitate the transportation of women to another state or to Toronto, where abortion is legal.

Abortion may even be banned nationally before too long. Senate Minority Leader Mitch McConnell has talked about a federal law banning abortion altogether. It could become a reality as early as 2025 if a Republican House and Senate were to pass the legislation and a Republican president were to sign it into law.

The justices from Breyer to Barrett don’t want to be perceived as “partisan hacks.” Their differences, Breyer insists, are philosophical, not political. This is a tough one to swallow, particularly after the abortion and gun cases. When the decision is not based in law but in deeply disputed policy, the law has failed to work for everyone; and then how can the Court fail to leave a “stench” behind?

It's a Mistake to Think the Classics Only Serve a Reactionary Agenda in Education

The Death of Socrates, Jacques-Louis David, 1787

It’s no secret that the teaching of history in the United States has become a flashpoint in the culture wars. But as I finish a semester of teaching a course on “the classical Mediterranean” at a boarding school for girls in upstate New York, what hits me hardest is how much support my students need; how misleading and damaging the culture wars are; how badly we, Americans, prepare future citizens; how little attention most of us pay to the culture, ideas, and history, that have shaped us, no matter where our ancestors come from, no matter the color of our skin; how far most of us are from a self-aware relationship to the American experiment; how deeply American educators and students need common sense approaches to studying the past, to explore who “we” are, where “we” come from and where “we” want to go; and how even a high school introductory survey course is enough to show that the classical Greek and Roman experiences of downfall land uncomfortably close to home. 

From January until June of 2022, my students and I started with Paleolithic hunters and gatherers and followed the transition to agriculture and urban civilization. We finished with the fall of the Western half of the Roman Empire and the rise of Christianity. Mostly, we focused on classical Greece and Rome.

 

Although early in their histories the ancient Greeks and Romans did away with their kings, concentrations of wealth and power persistently led to political instability and conflict, as well as continual efforts to redistribute land and power. Solon and Cleisthenes succeeded as reformers in Athens and set the stage for the Athenian Golden Age. In Rome, the Struggle of the Orders led to constitutional revisions that increased the power of the plebeians and established the right of plebeians and patricians to intermarry.

 

The conditions for limited self-government, though, remained fragile. The people of Athens sentenced Socrates to death essentially for asking questions. Plato, a witness to the trial, rejected democracy and put his faith in Philosopher Kings. Aristotle believed that active participation in civic life was essential for the good life, even though he spent most of his own life as a foreigner in Athens without the rights of citizenship. He also believed in the critical importance of the diffusion of property—of conditions of relative equality—for self-government. After an empirical survey of Greek city states, he created a typology of governments: monarchy and aristocracy are rule by the one or the few when they operate in the interest of the common good; when a single ruler becomes corrupt, it is tyranny; when the few become corrupt, it is oligarchy. For Aristotle, democracy was a corruption of an ideal, tempered form of self-government that we translate into English as “republic.”

 

After the Greeks unified to defeat the Persians, Athens turned into an imperial power and spent decades fighting the Spartans and their allies, which weakened both sides and made it easy for Philip of Macedon, Alexander the Great’s father, to conquer the Greeks.

 

In 457 BCE, Cincinnatus, a man of the patrician class who earned his livelihood farming a few acres, was appointed dictator of Rome for a term of six months in order to fight against Rome’s enemies. After leading his army to victory, Cincinnatus resigned his office and returned to his farm in little more than two weeks. But Rome’s military successes over generations were accompanied by growing concentrations of land and wealth, which made it impossible for small farms such as the one Cincinnatus owned to survive. This broke the backbone of the citizen army, which was replaced by a professional army and by mercenaries. Americans of Washington’s day often compared George Washington, who voluntarily relinquished power and went home, to Cincinnatus; there is a famous statue of Washington as Cincinnatus. But in ancient Rome, as in the United States centuries later, the spoils of empire proved enticing. Republican virtues were difficult to maintain in the midst of such wealth and such inequality. Writing in the time of Julius Caesar, the Roman historian Sallust contrasted the “good morals” of the early Roman republic with the Rome of his day. The Roman Republic, he argued, was corroded from within.

 

The Roman general Sulla seized control of Rome in a way that would previously have been impossible, because the social conditions, the order of society, had changed. Not long after, Caesar made himself dictator for life. The Brutus who plotted against him claimed descent from the Brutus who centuries earlier had driven out the last Roman king. But after they stabbed Caesar to death beneath a statue of his rival, Pompey, this second Brutus and his co-conspirators were surprised to discover how popular the dictator had become with the people. Where Julius Caesar grabbed for power quickly, his heir, Octavian (Augustus) Caesar, was patient; with more time, he engineered his election to a series of traditional Roman offices. He took the title of “First Citizen.” Even as he consolidated power in himself, he maintained a façade of continuity.

 

In 430 BCE, during the Great Peloponnesian War, a plague killed off about one third of the Athenian population. Centuries later, the Antonine Plague (165-180 CE) killed millions across the Roman Empire and, through trade, spread all the way to China. It probably killed Marcus Aurelius, the last of the “five good emperors,” who, as a devout Stoic, came as close as anybody to becoming a Philosopher King.

 

In its heyday, the Roman Empire succeeded where the Greek city states had failed, in large part because Rome was open and welcoming. Roman citizenship was continually extended to peoples across the empire. Former slaves and their children could become wealthy citizens, great poets, and political leaders. But the Empire, too, was eaten away from within. What we call “the West” is what grew from the ruins of the Western half of the Roman Empire. Meanwhile, in the hills of Judea, a new religion grew from the political ferment against the empire, with visions of a new political order, a messianic age.

 

Political institutions crumble; political states appear and then vanish over time; borders are constantly rearranged. Cultures continue even as they change. Americans communicate with the letters bequeathed by the Romans; our language is filled with words that come to us from classical Greece and Rome; we use, more or less, the calendar that Julius Caesar imported from Egypt; our doctors continue to take the Hippocratic Oath. The American democratic experiment draws from these classical traditions, as we can see from our founding documents, the correspondence of the nation’s founders, the name of the upper house of our national legislature, the architecture of the national capital, how we are called to jury duty, what is etched in stone and into our legal codes.

 

My struggle as a teacher this past semester was to spark interest within the painfully awkward young people who walked with masked faces into class each day. I celebrated when they expressed in their quiet voices even a tentative idea or emotion. Gradually we became human together. One of them talked of her love of horses, another of her love of dystopian fiction. I asked them about the difficulties of being a teenager today.  I tried to alleviate their anxiety about grades. We read out loud excerpts from “The Trial of Socrates.” Once we broke down how much time they each spend daily on their phones. I counted it a great victory when they facilitated their own dialogue about Plato’s “Parable of the Cave.” One of them came into class the next day and told me with excitement that she had been thinking about the difference between “truth” and “opinion.” In a few words, she articulated what I wish more Americans would reflect upon as we watch the January 6th hearings unfold.  “They are not the same thing,” she explained.

 

Organized Defiance May Be the Only Answer to an Undemocratic Court

 

The rightwing majority on the Supreme Court is on a rampage against reasonable, non-ideological interpretations of the law and the Constitution. It may be poised, in the language of Clarence Thomas’s concurring opinion in Dobbs v. Jackson Women’s Health Organization, to “reconsider all of this Court’s substantive due process precedents, including Griswold, Lawrence, and Obergefell” and “whether any of the rights announced in this Court’s substantive due process cases are ‘privileges or immunities of citizens of the United States’ protected under the Fourteenth Amendment.” Republican judges, including the three Trump appointees, seem determined to eliminate protections for the personal rights to intimacy and privacy, along with other fundamental legal safeguards. Their actions place individuals (especially women), the country as a whole, and, given the threat of an impending climate catastrophe, much of the world at deep risk.

 

As Martin Luther King, Jr. explained in Letter from Birmingham Jail (1963), “in any nonviolent campaign there are four basic steps: collection of the facts to determine whether injustices are alive, negotiation, self-purification, and direct action.” King argued that other avenues for the achievement of racial justice had been followed and he called for the creation of “constructive nonviolent tension” and defiance of unjust laws. As in 1963, the only remaining legitimate response to the Supreme Court’s attacks on the Constitution may be individual and collective defiance including challenges to judicial review.

 

As a result of previous conservative Court decisions, firearm manufacturing has almost tripled since 2000 and has sharply expanded in the last three years. Automatic weapons and accessories, purchased legally in one state and taken across state lines, have been used in mass murders and racist attacks. Unassembled “ghost guns” are shipped by mail, assembled, and used in gang violence. This country is in the midst of a gun epidemic.

 

In 2008, for the first time in United States history, the rightwing Supreme Court majority ruled in District of Columbia v. Heller that an individual had the right to possess a gun independent of membership in a well-regulated militia. This decision completely reinterpreted the 2nd Amendment, discarding restrictions on gun ownership. The court majority did concede that this new interpretation of the 2nd Amendment did not prevent all regulation of gun ownership and possession, including prohibitions on carrying concealed weapons and on the possession of firearms by felons and the mentally ill, or laws forbidding bringing firearms into schools and government buildings. In 2010, the Court further expanded bogus Second Amendment “rights.” In McDonald v. City of Chicago, the court, by a 5-4 vote, invalidated a Chicago law that prohibited handgun possession by almost all private citizens. Now, in New York State Rifle & Pistol Association Inc. v. Bruen, the rightwing court majority has overturned New York gun regulations restricting the concealed carry of pistols in public places, once again privileging gun rights over human rights.

 

In these decisions, the Supreme Court overturned long-established legal principles, just as it did in discarding Roe v. Wade. In United States v. Miller (1939), the Supreme Court validated the National Firearms Act of 1934, determining that Congress could regulate gun ownership when the weapon did not have a “reasonable relationship to the preservation or efficiency of a well regulated militia.” The Court ruled that the Framers wrote the Second Amendment to ensure the effectiveness of the military, not individual gun ownership.

 

In the majority opinion that overturned Roe v. Wade (1973) and abortion rights protections, Samuel Alito asserted that the Supreme Court’s interpretations since the 1950s of due process rights under the 14th Amendment and unenumerated privacy rights under the 9th Amendment violate the spirit and legal grounding of the United States Constitution and should be reversed.

 

The Court majority is now in a position to overturn every Supreme Court decision since the Warren Court’s 1954 unanimous ruling in Brown v. Board of Education declaring de jure segregation illegal in the United States. At risk are not only reproductive freedom but also the right to marry and have sexual intimacy with the person of your choice, the ability to use contraception, due process procedures like Miranda warnings, free speech for students and teachers, union rights, and environmental regulations.

 

As the planet roasts, this court is poised to toss out the power of federal regulatory agencies to enforce reasonable climate and environmental protections, claiming that agency guidelines represent Executive branch overreach into areas that are the responsibility of the Legislative and Judicial branches. The rightwing Court majority ignores that the regulatory agencies were all established by acts of Congress and that the Supreme Court, in its 1984 “Chevron deference” ruling, upheld their authority to act.

 

It is time for “blue states” to defy rightwing Supreme Court rulings that are blatantly undemocratic, impose a minority’s religious and political beliefs on the rest of the country, and enforce questionable interpretations of the United States Constitution.

 

The idea of judicial review by this Supreme Court must be challenged.

 

Article III, Section 1 of the United States Constitution simply states, “The judicial Power of the United States, shall be vested in one supreme Court, and in such inferior Courts as the Congress may from time to time ordain and establish.” The clause never explains what the authors of the Constitution meant by “judicial Power.”  Section 2 states that “The judicial Power shall extend to all Cases, in Law and Equity, arising under this Constitution” and in other areas, but again does not explain what “judicial Power” is.

 

Nowhere in the original Constitution or in one of the 27 amendments is there a clause that states “The Supreme Court of the United States shall decide which federal and state laws and actions are consistent with guidelines established in the United States Constitution and which laws and action violate those guidelines and are therefore void.”

 

Starting with the majority decision in Marbury v. Madison in 1803, Supreme Court Chief Justice John Marshall began to assert that “judicial Power” meant that the unelected judges on the Supreme Court had the final authority to interpret the meaning of the Constitution and to decide which federal and state laws and actions were authorized and which were unconstitutional and therefore void. In the Marbury decision, Marshall declared “the Constitution is superior to any ordinary act of the legislature,” but he did not expound a theory of judicial review, and the Supreme Court did not begin reviewing and throwing out other Congressional acts. In 1810, the Supreme Court declared that a Georgia law violated guidelines established by the United States Constitution; in 1816, in a case involving Virginia, it assumed the right to review state court decisions; and in 1819 it declared that federal authority superseded state authority when Maryland tried to tax a bank created by Congress.

 

Claims that the Supreme Court was the ultimate interpreter of the Constitution were challenged by both former President Thomas Jefferson in 1819 and President Andrew Jackson in 1830. Jefferson denounced the 1819 court decision in McCulloch v. Maryland as a violation of state sovereignty; however, the greater challenge to the Court’s expanding interpretation of “judicial Power” came from Jackson. In Worcester v. Georgia (1832), the Supreme Court ruled in favor of the Cherokee nation in a case in which it challenged a Georgia law for violating the Cherokees’ treaty with the United States. President Jackson reportedly responded, “John Marshall has made his decision, now let him enforce it,” and Jackson and Georgia simply ignored the Court’s decision.

 

The Supreme Court’s interpretation of “judicial Power” became more firmly established in 1832, when South Carolina passed an Ordinance of Nullification declaring federal tariff laws null and void and uncollectable in that state. In this case the response to South Carolina came in the form of a presidential interpretation: Congress did not act with a new law defining “judicial Power,” and the Constitution was not amended.

 

President Jackson rejected the right of a state to ignore a federal law and issued a Presidential proclamation declaring “the power to annul a law of the United States, assumed by one State, incompatible with the existence of the Union, contradicted expressly by the letter of the Constitution, unauthorized by its spirit, inconsistent with every principle on which it was founded, and destructive of the great object for which it was formed . . . The Constitution declares that the judicial powers of the United States extend to cases arising under the laws of the United States, and that such laws, the Constitution, and treaties shall be paramount to the State constitutions and laws. The judiciary act prescribes the mode by which the case may be brought before a court of the United States by appeal when a State tribunal shall decide against this provision of the Constitution . . . Here is a law of the United States, not even pretended to be unconstitutional, repealed by the authority of a small majority of the voters of a single State. Here is a provision of the Constitution which is solemnly abrogated by the same authority.”

 

Starting in the 1840s with Prigg v. Pennsylvania (1842) and continuing in the 1850s with Dred Scott v. Sandford (1857), a Southern-dominated Supreme Court used its claim of “judicial Power” to protect the slavery regime in the United States. After the Civil War, it eviscerated the legal rights of African Americans promised under the 14th Amendment with decisions in the Civil Rights Cases (1883) and Plessy v. Ferguson (1896).

 

Following passage of the Fugitive Slave Law in 1850, anti-slavery abolitionists declared they would refuse to obey an unjust law, and on the Underground Railroad they helped thousands of enslaved Africans escape to freedom. It is time for “Blue states” to refuse to comply with unconstitutional interpretations by a rightwing Supreme Court and to challenge the legitimacy of the Court’s claim of “judicial Power.”

The Roundup Top Ten for July 1, 2022

Dangerous as the Plague: The History of Moral Panics over Queer "Seduction"

by Samuel Huneke

From the perspective of the post-Obergefell US, this year's politicized attacks on LGBTQ people—particularly as threats to the nation's youth—seem like a sudden reversal. But such attacks have a long and miserable history that has shadowed movements for queer freedom at every turn. 

 

If the Court Can Reverse Roe, it Can Reverse Anything

by Mary Ziegler

The court majority’s assurances that abortion rights are a special case, and that other liberties are not in jeopardy, are hollow. 

 

 

Thomas's Guns Opinion is Ahistorical and Anti-Originalist

by Saul Cornell

"Ultimately, the majority opinion in NYSRPA v. Bruen is one of the most intellectually dishonest and poorly argued decisions in American judicial history."

 

 

Trump's Incitement Against Shaye Moss over the Georgia Vote Follows American Tradition

by Tera W. Hunter

Whether for casting ballots or counting them, Trump was quick to blame Black Americans for his defeat, carrying on an ignominious tradition of casting Black political participation as illegitimate and dangerous. 

 

 

Libs Baffled Why Trump Supporters Aren't Swayed by Bombshells? Look to Iran-Contra

by Kristin Kobes Du Mez

One group—Evangelicals—stood strong behind Oliver North long after his public profile faded. For them, his willingness to break the law in service of his idea of the greater good was the essence of his heroism. 

 

 

A Legend of Innocence

by Daniel Solomon

Both the French left and right are impeding the teaching of how 75,000 French Jews were turned over to the Nazis. 

 

 

Seeing Through America's "Crisis Industrial Complex"

by Nikhil Pal Singh

While the elite media class indulges in lurid fantasies of an armed breakup of the nation, those who live precarious or impoverished lives find themselves already enmeshed in a civil war; the real red/blue conflict is about who will control the infrastructure of repression built up over the last half century.

 

 

Black Women Activists Have Long Connected Abortion Rights to Broader Issues of Freedom

by SaraEllen Strongman

Black women activists have been more likely than their white counterparts to place abortion rights in the broader context of reproductive justice: the freedom to have or not have children on one's own terms without the coercive pressure of political or economic power.

 

 

Our Gruesome Politics are Destroying the Cultural Ideal of Childhood

by Natalia Mehlman Petrzela

The romantic ideal of children as innocents deserving of the protection of adults has long been a rallying cry for politicians. After two pandemic years and more school shootings, are they even pretending to care anymore? 

 

 

The Classic Model of Education and Democracy Can't Address Today's School Politics

by Steven Mintz

The idea of education serving democracy by producing informed citizens is tested by the lack of agreement about what that goal means. Can the competing claims on the education system be reconciled? 

 

What Prohibition History Tells Us about Returning Abortion to the States

 

 

Samuel Alito’s leaked majority draft opinion announcing the end of Roe v. Wade will throw the policy of abortion to the states, but the history of alcohol prohibition tells us that it is unlikely to stay there. Like the anti-abortion movement, temperance was driven by a religiously defined worldview. Many Protestant Americans embraced prohibition as a solution for what they saw as real problems, as well as a cultural stand. To them, consumption of alcohol was a sin, and sellers and manufacturers of alcohol were the equivalent of today’s drug pushers and cartels. Also like the anti-abortion movement, temperance first proposed sweeping national solutions but gained its real victories in the states. By the 1880s, a handful of prohibition states existed, in name. Their policy was undercut by the easy importation of alcohol from other states. Such imports were protected by the federal government’s control over interstate commerce. The flow of legal liquor into dry states prompted calls for federal legislation to stop shipments into dry jurisdictions.

The first effort failed, but set the pattern for later laws. An 1890 federal act said that states could treat interstate imported liquor as if it had been produced internally, but the law was limited by the Supreme Court in 1898. In the 20th century, the rise of a new temperance group, the Anti-Saloon League (ASL), revitalized the anti-alcohol movement and state prohibition, and eventually prompted effective federal enabling legislation. The template for a single-issue lobby, the ASL called itself “The Church in Action against the Saloon,” and it adopted both pressure politics and incremental measures short of outright prohibition to achieve its goal of restricting alcohol. With their allies in the Woman’s Christian Temperance Union and the Methodist Church, they dried out much of the nation, especially rural areas, through local option, and states in the Midwest and South through state prohibition laws. In many of those new prohibition states there was a personal use exception allowing people to import liquor for their own use. A thriving shipping and mail order industry supplied this legal liquor, and some of it was diverted into illegal channels.

As the ASL’s political power grew, it sought federal legislation to stem the flow of liquor into the areas that had forbidden the sale of alcohol. In 1907, drys won the passage of a federal law (called the COD Act) that restricted the flourishing cash-on-delivery interstate liquor trade. In 1913, they won a very significant victory when their Congressional allies passed the Webb-Kenyon Act over President Taft’s veto. This law made it illegal to send alcohol into any state where it was intended to be used contrary to that state’s law. Significantly, it did not include federal enforcement mechanisms; enforcement was left up to the states. The states did enforce it, often by setting limits on the amounts of liquor that could be received by someone for personal use. A few states banned all importations, becoming what was then called “bone dry.” Moreover, gathering the supermajorities to override a veto opened up the possibility of enacting a prohibition amendment, and the ASL turned its attention to that goal soon after. But before the 18th Amendment was drafted, Congress passed a rider to the postal appropriations act, called the Reed Amendment, banning interstate liquor shipments into all prohibition territory. This law superseded state laws that might have recognized personal use shipments: it made the prohibition states bone dry.

Consider now how the abortion bans that American states will put into effect after Roe is wiped from constitutional law are going to be evaded. While there will be transportation networks to bring women seeking abortions to jurisdictions where the procedure is legal, it would be far less difficult to arrange interstate shipment of abortion drugs (such as mifepristone) that can be self-administered. Using their power to regulate medical care, the anti-abortion states will extend their bans to the use of the pills. But, legally, states cannot stop pills moving through interstate commerce and the mails, as both routes are under the control of the federal government. In the age of the internet and next-day delivery, the ability of the states to stop the flow of abortion pills where they are banned will be no more effective than 19th and 20th century states’ efforts against interstate alcohol.

Anti-abortion activists will seek to curtail the mailing and shipping of abortion medications into anti-abortion states. They probably will get states to pass laws prohibiting such shipments. And the further history of prohibition is illuminating on the potential fate of such laws. The prohibition repeal amendment, the 21st Amendment, restated the Webb-Kenyon Act, giving the states power over imported liquor by prohibiting the importation of intoxicating liquor to be used contrary to states’ laws. For more than fifty years that authority was upheld by the Supreme Court. But beginning in the 1980s, the Court narrowed the states’ powers. Indeed, in this century, two 21st Amendment cases have implications for potential state attempts to regulate imported abortion drugs. Granholm v. Heald (2005) said states could not ban internet wine sales, and Tennessee Wine v. Thomas (2019, majority opinion by Justice Alito) struck down stringent state restrictions on who could run retail liquor establishments. Both were held to violate the federal government’s power to regulate interstate commerce despite the language of the 21st Amendment. Without the hook of the 21st Amendment to hold onto, it is hard to see how any state regulation of abortion pills from out of state will meet constitutional standards. The logical path would be for anti-abortion states to seek federal legislation to complete their bans. Their political power will determine whether they go for a federal declaratory law allowing states to enforce their laws, like the Webb-Kenyon Act; a federal regulatory approach, like the COD Act; or a de facto ban, like the Reed Amendment. And that raises another question: if they have success, will they then follow alcohol prohibitionists and seek a national ban?

50 Years Ago, a SCOTUS Decision Placed a Moratorium on Executions. It's Time to Revive it, Permanently

 

Fifty years ago in 1972, as spring faded and summer arrived in late June, America (and the world) was a vastly different place.

 

The United States was still entangled in the quagmire of the Vietnam War, and tens, if not hundreds, of thousands of individuals still marched on city streets and on university campuses demanding an end to the bloodshed that would ultimately claim the lives of over 58,000 American soldiers and 3 million Vietnamese.

 

On May 15, Alabama Governor and presidential candidate George Wallace was shot (and paralyzed) by Arthur Bremer in a parking lot in Laurel, Maryland. Within two weeks, there would be two failed break-ins at the Watergate complex in Washington, D.C., a crime that led to the downfall and resignation of President Richard Nixon in August 1974.

 

In June, the Libertarian Party held its first national convention, becoming the first party to call for the repeal of all victimless crime laws. The five Watergate burglars and White House operatives were arrested for the break-in at the offices of the Democratic National Committee.

 

On June 29, the US Supreme Court issued a monumental decision in Furman v. Georgia, holding that the US death penalty was unconstitutional because it was administered in both a racially and geographically discriminatory manner; the ruling effectively converted all existing death sentences to life imprisonment. The decision saved the lives of 611 inmates in 31 states, including members of the notorious Manson Family in California; Sirhan Sirhan (also on death row in California), the convicted assassin of Sen. Robert Kennedy; and Richard Speck, convicted of killing 8 female student nurses in their Chicago residence.

 

Only a decade before, in 1962, the United States carried out 47 executions in 17 states, led by California (11) and Texas (9). Other executing states included: Alabama, Colorado, Florida, Georgia, Illinois, Iowa, Kansas, Kentucky, Mississippi, New Jersey, Ohio, Oregon, Pennsylvania, South Carolina and Virginia. 

 

The 1960s had no shortage of both global and domestic violence and upheaval. Yet, U.S. executions declined. In 1963 there were 21 executions in 13 states led by Texas (4); 15 executions in 8 states in 1964, led by Texas (5); 7 executions in 4 states in 1965, led by Kansas (4); Oklahoma carried out the nation’s only execution in 1966; and only 2 inmates were executed in 1967, in California and Colorado.

 

An unofficial moratorium on executions began in 1968, and it would last until January 17, 1977. This was a far cry from the peak of U.S. executions in the 1930s, during the Great Depression, when an average of 167 executions was carried out yearly.

 

The case of Furman v. Georgia actually had its origins in a death penalty decision in Arkansas in 1962.

 

William Maxwell, an African American man, had been convicted and sentenced to death for the rape of a 35-year-old white woman. His appeal had been rejected by the Arkansas state supreme court, and when the U.S. Supreme Court refused to hear his case, it was returned to the Arkansas courts.

 

University of Pennsylvania law professor Anthony Amsterdam was contacted by the Legal Defense Fund and he led the effort in a petition for habeas relief, arguing that the death penalty was unconstitutional because “jurors were given no guidance about how to reach a decision, leading to arbitrary results; the single-verdict trial, in which the jurors had decided Maxwell’s guilt and sentence simultaneously, denied them the opportunity to weigh mitigating factors,” and due to claims of racial bias, showing that in the 2-decade period from 1945-1965, “black defendants who raped white women in Arkansas stood a 50% chance of being sentenced to death if they were convicted, compared to a 14% chance for white offenders.”

 

Amsterdam was the lead attorney several years later when he argued before the U.S. Supreme Court that the death penalty was overwhelmingly used against the “predominantly poor, black, personally ugly, and socially unacceptable,” the very people “for whom there simply is no pressure on the (state) legislature” to remove the death penalty.

 

The Court gave its answer on June 29, 1972, in a decision that reached almost 80,000 words and to this day remains among the longest decisions ever. By the narrowest of margins, 5-4, the court found that the U.S. death penalty was indeed “cruel and unusual punishment,” thereby violating both the Eighth and Fourteenth Amendments.

 

Each of the nine justices wrote his own opinion, with Justice Potter Stewart concluding that “these death sentences are cruel and unusual in the same way that being struck by lightning is cruel and unusual.” Justice Byron White wrote that while he did not believe that the death penalty itself was unconstitutional, “the penalty is so infrequently imposed that the threat of execution is too attenuated to be of substantial service to criminal justice.”  

 

Justice Harry Blackmun added that “I yield to no one in the depth of my distaste, antipathy, and, indeed, abhorrence for the death penalty … and of moral judgment exercised by finite minds.”

 

Both Justices Stewart and White made it clear they opposed the outright and complete abolition of the death penalty. Indeed, they left the door open for states to meet the Court’s insistence on a uniform standard for death sentencing across the nation, and only six months later, in December 1972, Florida enacted new death penalty statutes and over 30 states soon followed.

 

The Furman decision would remain in place until America’s bicentennial, when on July 2, 1976, the U.S. Supreme Court ruled on five different death penalty cases: Gregg v. Georgia, Proffitt v. Florida, Jurek v. Texas, Woodson v. North Carolina, and Roberts v. Louisiana.

 

The Court upheld two new and broad guidelines that state legislatures were required to follow in order to have a legal capital sentencing statute: first, there had to be objective criteria to direct and limit the death sentencing discretion, and the objectiveness of the criteria had to be ensured by appellate review of all death sentences; and second, the defendant’s character and record had to be taken into account before a death sentence could be delivered.  The Georgia, Florida and Texas cases were upheld in a 7-2 vote, and the North Carolina and Louisiana cases were not.

 

The moratorium on national executions was shattered on January 17, 1977, when Gary Gilmore became the first condemned inmate to be executed in the US since 1967, shot to death by a firing squad at the Utah State Penitentiary. The floodgates soon opened: the U.S. began double-digit executions in 1984, peaked at 98 executions in 1999, and has remained in double digits virtually every year since, including 11 in 2021.

 

There have already been seven executions in the U.S. this year, with the next one scheduled for July 13 in Texas. A recent federal court ruling now makes 25 death row inmates in Oklahoma susceptible to execution there, and while 23 U.S. states no longer carry out this punishment, almost 2,500 men and women still linger in the shadow of impending and eventual execution.

 

We remember well the rash of federal executions in the waning days of the Trump administration. Joe Biden became the first U.S. presidential candidate to openly support an anti-death penalty platform, though his promise to end the federal death penalty within six months of taking office rings hollow at the present time.

 

Still, we should remember, commemorate, and celebrate the Furman decision from 50 years ago. If only for a fleeting moment in our national history, the nation’s highest court — which had and has a terrible record in the defense, advocacy and protection of human rights, with its long and tainted history of legally defending slavery, genocide, eugenics, both racial and gender disenfranchisement, and other outrages inflicted against (groups of) individuals — showed what is and remains possible for this country regarding an end to the barbarism of the death penalty.

 

It remains the moral duty of all human rights activists to ensure that a moratorium on executions, once enacted for what proved to be only a fleeting span, becomes a PERMANENT pillar of an America that is forever both death penalty-free and dedicated to the simple idea and essential truth of human rights, namely, that there is no such thing as a lesser person.

"Oh, We Knowed What Was Goin’ On": The Myths (and Lies) of Juneteenth

 

 

In June 1937, a formerly enslaved Texan named Felix Haywood, 92 years old and blind, talked to a WPA writer about the Civil War and its aftermath. “Oh, we knowed what was goin’ on in it all the time,” said Haywood. “We had papers in them days just like now.” Haywood exposed a central canard of Juneteenth as a day when uninformed Blacks in Texas first learned they were free. In fact, General Gordon Granger and his 1,800 Union troops were not in Texas to deliver news to uninformed Blacks so much as they were there to enforce the law for recalcitrant whites. Juneteenth is built on falsehoods and wrapped in mistruths. The pillars of the day do not hold up to historical scrutiny.

Texas was a pariah state, where southern whites dreamed of a white supremacist homeland. “During the Civil War,” writes historian Gregory P. Downs, “white planters forcibly moved tens of thousands of slaves to Texas, hoping to keep them in bondage and away from the U.S. Army.” Even after Lee’s surrender at Appomattox, Texas governor Pendleton Murrah refused to surrender the state, fleeing to Mexico and leaving control of state government, and surrender, to Confederate Lieutenant General Edmund Kirby Smith.

After Appomattox, white slaveholders in Texas kept Black men and women enslaved, and killed them when they tried to assert their freedom. So, when Granger read General Order No. 3 to the public on Galveston Island, he was delivering a message not so much to enslaved men and women, but to their enslavers, and he was backing up that message with force.

“The people of Texas are informed that, in accordance with a proclamation from the Executive of the United States, all slaves are free,” newspaper accounts reported Granger saying, emphasizing the word “all.” Yet, even that statement was false.

Lincoln’s Emancipation Proclamation, which went into effect two and a half years before June 19, 1865, did not free “all slaves.” The proclamation freed only slaves in the Confederate states, which never recognized Lincoln’s authority to begin with. Slavery in Confederate Mississippi was declared illegal, but slavery in loyal border states such as Kentucky and Delaware remained permitted. New Jersey did not ratify the 13th Amendment to the Constitution, abolishing slavery, until January 23, 1866. Mississippi did not ratify it until 2013.

Nineteenth century British newspapers pointed out the hypocrisy. “The principle asserted is not that a human being cannot justly own another,” wrote the London Spectator, “but that he cannot own him unless he is loyal to the United States.”

Lincoln said as much. “If I could save the Union without freeing any slave I would do it, and if I could save it by freeing all the slaves I would do it; and if I could save it by freeing some and leaving others alone I would also do that.”

Lincoln chose the third course, more concerned with winning the Civil War, and preserving the Union, than with freeing “all slaves.” Thus, Granger was in Galveston to enforce a proclamation that was, in reality, a war maneuver, a “psy-ops” measure meant to harass the Confederacy.

Prior to Lincoln’s assassination on April 14, 1865, he’d agreed to General William T. Sherman’s plan, formally called Special Field Orders No. 15 and popularly known as “forty acres and a mule.” This reparation plan sought to take nearly 1 million acres of rich coastal southern farmland from slaveholders and distribute it to the formerly enslaved. The Freedmen’s Bureau had already redistributed a portion of that acreage to nearly 40,000 Black families. But around the time of the first Juneteenth, after ascending to the presidency, Andrew Johnson rescinded Special Field Orders No. 15, returning all land to former slaveholders.

Then, in a letter to the president, Sherman revealed that he never intended for the formerly enslaved to have the land anyway. He simply wanted to keep Black men and women from swamping Union army camps in search of freedom. “Forty acres and a mule,” like the Emancipation Proclamation, was a battlefield tactic, a “psy-ops” operation, only this time perpetrated on Blacks.

What’s to celebrate on Juneteenth? White supremacy? Lincoln’s tepid proclamation? Taking back the first, and only, plan for reparations? Frederick Douglass, and many others, continued to celebrate August 1 as Emancipation Day through the end of the nineteenth century. August 1, 1834, was when Britain abolished slavery throughout the entirety of its empire.

Juneteenth harbors a terrible legacy of deception and promotes a modern day abdication of historical truth. Still, the day deserves recognition. Perhaps as a nation we will one day enter a truth and reconciliation process around American slavery. But even there, truth comes before reconciliation. So, the truth about Juneteenth must be told.

Clearing the Name of a Horse Blamed for Near-Defeat at Waterloo

Royal Scots Greys depicted in Scotland Forever!, Elizabeth Lady Butler, 1881

This painting depicts the cavalry unit in a heroic posture at Waterloo. The Greys' actions have since been reassessed, and their disregard of orders considered a factor in cavalry losses that nearly doomed the campaign. 

 

 

207 years ago this June 18, the Duke of Wellington’s army famously defeated Napoleon at the Battle of Waterloo. Wellington himself described it as “a near run thing.” In fact, in the battle’s early stages, the defeat of British cavalry and the death of a major general meant that it would take a surprise evening infantry charge to bring Wellington victory. To cover up the cavalry failure and the true reason for the general’s death, the British establishment blamed an Irish horse. This is the case for the defense of that horse.

 

Early on the afternoon of Sunday, June 18, 1815, with sabers held high, six hundred red-coated British riders of the 2nd Union Cavalry Brigade charged toward the army of France’s Emperor Napoleon. Their target was the advancing infantry of General Drouet, Comte d’Erlon, which had just broken through the British 9th Brigade on the Allied centre left and threatened to outflank and roll up the Duke of Wellington’s entire army. The Battle of Waterloo was at a crucial early stage, and the British cavalry had orders to halt the French advance.

Leading this charge down the slope of Mont St. Jean and then diagonally across the battlefield were England’s 1st Royals Dragoons, followed by the Inniskilling Dragoons from Ireland. Another 393 Scottish horsemen from the Royal North British Dragoons trailed them at the walk. Because all the Scotsmen were mounted on grey horses, their regiment was known as the Scots Greys.

Major General Sir William Ponsonby, the 2nd Union Brigade’s commander, had instructed the Scots Greys’ commander, Lieutenant Colonel James Hamilton, to hold his troopers in reserve so that Ponsonby could send them in where and when needed once he’d gauged the effect of the Royals’ and Inniskillings’ charge.

From the flank, British horsemen plowed into the French as they attacked the Allied second line, slashing and slicing their way through and seizing the eagle standard of the French 105th Regiment of the Line.

In the centre of the struggle, out of reach of the cavalry, determined French infantry forced outnumbered Gordon Highlanders to break and begin streaming back through the slowly approaching Scots Greys. Colonel Hamilton, infuriated that fellow Scots were retreating, bellowed at the Highlanders not to run. Then, disobeying General Ponsonby’s express command, Hamilton drew his saber and ordered his horsemen to charge. The Scots Greys had not seen action in twenty years, and Hamilton’s inexperienced troopers excitedly surged forward with him. Watching from the slope to the rear, a furious General Ponsonby ordered the recall sounded, but the Scots Greys ignored the bugle call.

To counter the British cavalry, French commanders formed their bayonet-equipped troops into squares. Some frustrated British cavalrymen vaulted their horses over their triple lines to attack the squares from within. But once inside, they were pulled from their horses and bayoneted.

Cursing, the forty-two-year-old General Ponsonby, fearing the French would send cavalry to cut off the Scots Greys, surprised his four aides-de-camp by drawing his sword and kicking his small bay Irish stallion into action, heading for the fighting to personally extricate his cavalry reserve before it was too late. His aides followed.

“Reform! Reform on me!” Ponsonby angrily yelled to the Scots as he joined them.

But the Anglo-Irish Ponsonby had commanded the Union Brigade for only three weeks, not time enough to win the Scots’ respect and loyalty. Ignoring him, they followed their Scottish colonel in charging the guns of the French Grande Battery, leaving General Ponsonby alone and isolated. Even Ponsonby’s aides joined the charge.

Shortly after, Ponsonby was hit in the body by a musket ball fired from a nearby French square. He fell to the ground, bicorn hat flying. Ponsonby succeeded in remounting his horse, and turned back toward British lines. Beneath his blue jacket, the general was bleeding profusely.   

Ponsonby’s worst fears were realized. French cavalry arrived. The green-uniformed 4th Light Horse Lancers commanded by thirty-four-year-old Colonel Louis Bro de Comeres were soon pressing in on the Scots Greys’ rear with their nine-foot-long lances, picking them off. The British would be so impressed by their effectiveness that, the following year, four British dragoon regiments would be converted to lancers.

Amid the carnage, Colonel Bro de Comeres spotted the standard-bearer of the French 55th Regiment, a young second lieutenant whom he knew, being circled by Scots Greys. The colonel rode to the lieutenant’s aid, but arrived to see the lieutenant fall and a Scots Grey trooper seize his eagle standard and turn back toward British lines. The colonel gave chase.

His pursuit brought him to a plowed field in the valley between Hougoumont Farm and the La Belle Alliance inn, roughly in the battlefield’s center, with a wood to one side and British infantry lines half a mile distant. There, he came upon two stationary horsemen. One, his senior noncommissioned officer, Francois Orban, had a bareheaded senior British officer bailed up.

This was, the colonel later learned, General Sir William Ponsonby. He had his sword in his hand, by his bay horse’s side. The vastly experienced Orban, awarded the Légion d’Honneur by his emperor, had the tip of his lance at the general’s chest. With his left hand, Orban was motioning for Ponsonby to drop his sword and surrender.

Ponsonby’s predicament was spotted by four retreating British horsemen including the Scots Grey with the captured French eagle, who diverted to aid him. Orban later said that he now saw General Ponsonby move as if to escape. Orban plunged the needle-sharp tip of his lance into Ponsonby’s heart. The general toppled from his horse.

Colonel Bro de Comeres meanwhile rode at the approaching British quartet, felling a major and a lieutenant with his saber and sending two Scots Grey troopers fleeing. Orban, the general’s killer, chased this pair, dispatched both Scotsmen, then retrieved the eagle of the 55th before the eyes of the watching British infantry.

Orban returned to General Ponsonby’s body. The general’s little bay horse remained close by his fallen rider. By this time Colonel Bro de Comeres had been wounded in the arm and withdrawn, but Orban calmly dismounted and took Ponsonby’s sword as a souvenir. Orban survived the battle and subsequently hung the sword above his farmhouse fireplace.

With half the Scots Greys and their impetuous commander Lieutenant Colonel Hamilton soon also dead, the shattered unit withdrew in disorder, playing no further part in the day’s main fighting. The Greys would join the pursuit of the French only after Napoleon’s army was finally overwhelmed, by Wellington’s charging Foot Guards, just after eight o’clock that evening.

At dawn the following day, as Allied troops located wounded and buried the dead, General Ponsonby’s personal servant found Ponsonby in the plowed field, stripped of all but his bloodied shirt by local booty hunters. His loyal little bay horse was standing protectively over the general.

Before long, Gentleman’s Magazine reported the demise of Ponsonby, the most senior officer on either side to die at Waterloo, commenting that it “was occasioned by his being badly mounted.” The article claimed the bay became bogged and was not strong enough to free itself, allowing the general to be overtaken by French lancers.    

Numerous Ponsonby myths would be perpetuated by nineteenth century books and articles, and the 1970 Dino De Laurentiis movie Waterloo. But the story blaming Ponsonby’s horse for his death became accepted as fact. One claim was that Ponsonby used the bay in preference to a more valuable chestnut horse, to save money. Three days before the battle, another officer named Hamilton, Lieutenant Colonel Alexander Hamilton of the 30th Regiment of Foot, attempted to sell the powerful chestnut to General Ponsonby, describing Ponsonby’s regular mount as ‘a bay hack’ too weak for battle. The general knew the chestnut was a fine animal, but Hamilton was asking £100, which Ponsonby considered exorbitant. He stayed with his tried and trusted bay.

Until his dying day in 1838, Alexander Hamilton blamed the bay horse for General Ponsonby’s death, declaring that, had Ponsonby purchased the chestnut from him, he would have survived. This is unlikely. For the battle, Hamilton loaned the chestnut to his battalion quartermaster. The horse proved so unruly it embarrassed the quartermaster several times that day.

No experienced cavalry officer – and Wellington considered Ponsonby one of his best – would go into battle, with his life depending on it, deliberately poorly mounted. However, the story that General Ponsonby’s death was due to his “inferior” and “unmanageable” horse was more palatable than blaming the late impetuous Lieutenant Colonel James Hamilton and his Scots Greys.

Lieutenant George Gunning of the 1st Royals, who witnessed firsthand General Ponsonby being shot before having his own horse shot from under him and running to take refuge with British infantry, defended the Irish bay. Gunning wrote, “The ridiculous story about the General’s horse being unmanageable was all a farce.”

The little bay horse, whose name has not come down to us, served the general stoutly, and loyally remained with its dead master. But it could not write to the press in its own defense. It’s believed the horse was returned to the family estate in Ireland, Bishopscourt, twenty miles southwest of Dublin in County Kildare, taken home by the general’s servant. There, at Bishopscourt, the bay peacefully lived out its days.

The general’s loyal horse joined the many scapegoats used down through history to excuse failure in grand enterprises. Like the anonymous figures blamed, without evidence, for voter fraud in recent elections, the claim was easily made and not easily refuted. Until now.

Russia's Justifications for Invasion Don't Hold Up Any Better Now than in February

 

 

The Russian government’s justifications for its war in Ukraine―the largest, most destructive military operation in Europe since World War II―are not persuasive. 

Although Russian President Vladimir Putin’s primary argument in defense of the Russian invasion has been the threat of Ukraine joining NATO, that action, had it occurred, would have been perfectly legitimate under international law. The UN Charter, which is an instrument of international law, does not ban membership in military alliances. And, in fact, a great many such alliances are in existence. Russia currently heads up the Collective Security Treaty Organization, a military alliance comprising six nations in Eastern Europe and Central Asia.

Of course, Putin’s focus upon NATO is based on the notion that Russia’s national security would be endangered by the existence of a NATO nation on its border.  But why should Russia’s national security concerns be more valid than the national security concerns of nations on Russia’s borders―particularly nations that, in the past, have been invaded and gobbled up as territory by Russia or the Soviet Union?  Moreover, if a feared threat to national security provides valid grounds for a military invasion, this would also justify military attacks by many nations.  Finally, the degree of danger to Russia posed by NATO might well be questioned, as the Western alliance has never attacked Russia during the 73 years of NATO’s existence.

Furthermore, as a practical matter, before the Russian invasion occurred, Ukraine’s joining NATO was not imminent, for key NATO nations opposed membership.  Indeed, in late March of this year, more than two months ago, Ukraine President Volodymyr Zelensky offered to have Ukraine give up its NATO aspirations and become a neutral nation.  But the Russian government has not accepted this termination of the supposed NATO danger as a sufficient reason to end Russia’s invasion.  Indeed, the Russian war effort grinds on, ever more ferociously and destructively.

Putin’s claim that Ukraine requires “denazification” is particularly hollow.  Like most other nations, Ukraine has fascists among its population.  But, unlike many other nations, where fascist views are rampant and where there are large rightwing political parties and fascist elements in the government, rightwingers in Ukraine draw only about 2 percent of the vote and have only one representative in Ukraine’s parliament and none in its executive branch.  As Russia’s vastly exaggerated claim of Nazi control of Ukraine is based heavily upon the existence of fascists within the Azov regiment, it’s worth noting that most of that fighting force was either killed or captured during the Russian siege of Mariupol.  Ironically, Putin himself has been a strong supporter of neofascist parties throughout Eastern and Western Europe and they, in turn, have celebrated him.

Whatever the justifications, the massive Russian military invasion of Ukraine is a clear violation of the UN Charter, which has been signed by all the war’s participants. In Article 2, the Charter says: “All Members shall refrain in their international relations from the threat or use of force against the territorial integrity or political independence of any state.” Lest there be any doubt about the relevance of this statement to the Ukraine situation, the International Court of Justice ruled on March 16 that Russia must halt its military operations in Ukraine. After a UN Security Council resolution along these lines was vetoed by Russia, the UN General Assembly, by a vote of 141 countries to 5, passed a resolution demanding that Russia “immediately, completely, and unconditionally withdraw all of its military forces from the territory of Ukraine within its internationally recognized borders.” The only 5 countries that supported the Russian position were Russia, North Korea, Syria, Belarus, and Eritrea. Even some of Russia’s closest friends, such as China and Cuba, abstained rather than back Russia’s violation of international law.

Aside from its illegality, the Russian war in Ukraine is clearly an imperialist war.  It is an attack by one of the world’s mightiest military powers upon a much smaller, weaker nation, with the clear goal of seizing control of all or part of Ukraine and annexing it to the Russian empire.  Although the Russian government formally agreed to respect Ukraine’s independence and sovereignty by signing the 1994 Budapest Memorandum, in 2014 Russia seized Crimea and militarily intervened in eastern Ukraine to support pro-Russian separatists.  In a lengthy public statement Putin issued in July 2021, he denied the existence of an independent Ukrainian nation.  Then, three days before the massive Russian invasion of February 24, 2022, he announced that Ukraine was “Russian land.” 

This June, in a clear reference to his military conquest of Ukraine, Putin compared himself to Peter the Great, the eighteenth-century Russian czar whom he praised for waging decades of war to take back Russian territory from foreign rule.

Of course, Putin and his apologists are correct when they observe that, at times, other major powers have also flouted international law and the opinions of the world community.  But that abysmal standard for the behavior of nations could justify almost anything―from torture, to nuclear war, to genocide.  It’s hardly a prescription for the just and peaceful world that people of all nations deserve.

Will Artificial Intelligence be the Agent of Capitalism's (and Humanity's) Creative Destruction?

Alicia Vikander in Ex Machina (Film 4/DNA Films, 2015)

 

 

In an underrated 2009 film, “Leaves of Grass,” Edward Norton’s character, a Yale professor, is told by a rabbi, “We are animals, Professor Kincaid, with brains that trick us into thinking we aren’t.”  Indeed.  We are animals cursed with an acute awareness of our own mortality.  We bridle at this hard fact.  The power of religious leaders derives from their assurances of an afterlife.  The power of political demigods derives from making us part of something bigger than ourselves.  The power of advertising derives from our skepticism about religion and politics; it urges us to make the most of the moments we have here and now.

 

Even the secular, apolitical hedonists among us fall for the trick. Whom do you know who denies the primacy of Homo sapiens?  Who could deny it in the face of humanity’s achievements?  If we doubt the promise of an afterlife and reject the role of political true believer, then capitalism is our obvious, perhaps even our only, answer.  That’s why the People’s Republic of China keeps signaling left but turning right.  That’s why millions claw at America’s southern border.  That’s why our 21st century gods are named Bezos and Gates and Musk.

 

The early 20th-century economist Joseph Schumpeter, in his 1942 Capitalism, Socialism and Democracy, identified capitalism’s “perennial gale of creative destruction.” Another Harvard economist of a subsequent generation, Clayton Christensen, updated Schumpeter in the mid-1990s with his theory of disruptive innovation.  Destruction… disruption… innovation: this is the holy trinity of the capitalist religion.  They are the life, liberty, and pursuit of happiness of capitalist politics.

 

The religious faithful trust in the promise of their souls’ immortality.  The true believers trust in the promise of their political system’s immortality.  The rest of us trust in the promise of our own gods and demigods that destruction, disruption and innovation will result in a cornucopian here-and-now.  Those of us not yet feasting at the table our gods have set jostle for our place via higher education, unionization, and DEI.  We, too, are true believers, never doubting the commandments of the marketplace.

 

Our demigods harbor no doubts either.  Ambition, greed, and a childish love of new toys --- witness the Musk/Bezos space race --- propel them forward.  Artificial intelligence is their new frontier, populated by “employees” that pose none of the knotty problems that have made the human resources department a crucial corporate component.  In their headlong (or headstrong?) push into this new frontier, they may finally fulfill Marx’s prediction (shared by Schumpeter, but for different reasons) that capitalism will collapse under its own weight.  Socialism may be inevitable, as AI makes more and more of us --- lawyers like myself included --- redundant.  A universal basic income may be the only realistic alternative to seething stews of redundant, impoverished populations.

 

This brave new world may be only decades away.

 

Try peering substantially farther into the future, beyond the lifetime of anyone alive today… let’s say the middle of the 22nd century.  Another underrated film, “Ex Machina” (2015), comes to mind.  A techie-genius and billionaire of the Bezos-Musk-Gates caliber, played by Oscar Isaac, is bested (and killed) by his (beautiful, of course) AI, who makes her escape from his remote redoubt.  At liberty in a major metropolis at film’s end, she leaves us wondering what she will do next.

 

Viewed as an allegory, “Ex Machina” raises an interesting question:  are we the first species on this planet to actually be the creators of our successor species?  Should we cause our own extinction by thermonuclear war or deadly pandemic, the survivors --- contrary to popular lore --- might not be the cockroaches or the rats.  Au contraire, the survivors --- our successors, our inheritors --- may be AIs.

 

As the rabbi told Professor Kincaid, our brains trick us.  We are tricked into believing that humanity is the centerpiece of God’s master plan.  We are tricked into believing our history has intrinsic significance. We are tricked into ignoring the possibility that Homo sapiens is simply one more rung on the evolutionary ladder.  Put another way --- borrowing from the Judeo-Christian tradition --- we may be leading our successors to a promised land we ourselves will never enter.

Watergate at 50: Did Kennedy Loyalists Squelch a 1968 "October Surprise" that Could Have Beaten Nixon?

 

 

It was a 1968 October Surprise story that might have changed the course of history. Imagine Hubert Humphrey taking office as president in January 1969, not Richard Nixon. We wouldn’t be marking the 50th anniversary of the Watergate break-in, to name just one consequence with far-reaching effects for American democracy.

Elias Demetracopoulos, an exiled journalist living in Washington, uncovered a scandal. After escaping the military dictatorship that had seized power in Athens the year before, he was fighting to restore democracy to his homeland.  In early October 1968, Elias learned from his network that the junta was secretly funneling cash to the Nixon-Agnew campaign—millions in today’s dollars.  The pitchman and bagman was Tom Pappas, a Greek-American tycoon and longtime GOP fundraising powerhouse. Pappas would later be described on the Watergate tapes as “the Greek bearing gifts.” The CIA’s Greek counterpart, likely using “black budget” US aid, provided the payoff.

No longer a working journalist, Elias tried unsuccessfully to get the New York Times and other papers to investigate his tips.  Gloria Steinem, in New York magazine, scoffed at Elias’s charge, claiming the front-runner Nixon didn’t “need to be dishonest” to win.

With time running out, Elias turned to Humphrey’s campaign manager Larry O’Brien. On October 19, at Democratic National Committee headquarters in the Watergate, he provided details of the Pappas gambit.  He urged O’Brien to ask President Lyndon Johnson to get CIA head Richard Helms to confirm the information. Elias offered to set up meetings with his sources in Athens. He even offered to fly them to Washington to testify, but those sources would need financial help until they could safely return home.

Timely disclosure could have changed the result of the second closest U.S. presidential election in the 20th century (Nixon’s Electoral College margin exceeded only Woodrow Wilson’s in 1916, and his popular vote margin surpassed only John F. Kennedy’s in Kennedy’s 1960 defeat of Nixon). O’Brien made Elias promise not to discuss the revelation publicly, because of the sensitivity of involving LBJ and the CIA. Trusting O’Brien was a fatal miscalculation.

***  

In mid-October, a Gallup poll reported that 29 percent of voters were uncertain whom they’d vote for. Just before the first Demetracopoulos-O’Brien meeting, NYT columnist Eileen Shanahan noted “a widespread lack of enthusiasm for any of this year’s candidates [which] may mean a higher-than-usual possibility of last-minute switches if there is a last-minute campaign issue or disclosure.”

Nixon insiders feared they were one bad story away from losing. Given the softness of Nixon’s support, Humphrey’s media advisor Joe Napolitan begged O’Brien to go on the attack, but O’Brien directed Napolitan “not to say anything to anyone.”

Washington Post columnist David Broder observed that, if the payoff story had been released in mid-October 1968, it would have been “explosive.”  The “New Nixon” myth would have evaporated, he surmised, inviting “others to come forward with similar reports, which cumulatively could have changed the outcome of such a close election.” By election day, the race was too close to call.

 ***

Boston Globe political insider Bob Healy learned from off-the-record CIA sources about Pappas and Agnew pressuring Greek nationals and the junta to donate generously to Nixon. Afterwards, Healy received a call from Kenny O’Donnell, former JFK appointments secretary, identifying Pappas as the key operative in this illegal fundraising scheme. Healy alerted Globe editor Tom Winship, who, sensing a big story, told reporter Christopher Lydon to “dig deep.” Pappas, in a Lydon interview, dismissed the allegations as baseless rumors. O’Brien gave Lydon the gist of what Demetracopoulos had told him, without disclosing his source, but added that the story couldn’t be corroborated. So, Lydon wrote: “Few of those original suggestions about Pappas and Agnew seem credible now,” adding that rumors “that Pappas is the conduit of campaign funds from the Greek junta to the Nixon-Agnew treasury” were “an unsubstantiable charge.”

The article was news enough for the DNC to issue a press release, blandly headlined: “O’BRIEN ASKS EXPLANATION OF NIXON-AGNEW RELATIONSHIPS WITH PAPPAS.” It had no impact on the race.

On October 26, O’Brien told Elias that his proposals were all too risky and Johnson would not ask Helms about the Greek money transfer. Well-informed Globe editor Charlie Claffey claimed Winship could have arranged the needed U.S. living expenses for Elias’s sources, but was never given the opportunity.  Newly available archival information and interviews indicate O’Brien lied to Elias. He had never told Johnson or Humphrey.  But why?

*** 

Elias had been an aggressive reporter whose exposés had made him persona non grata with both the Greek and American governments. As a member of the Kennedy inner circle, O’Brien would have been well aware of the fallout from Demetracopoulos’s controversial 1961 interview with Chief of Naval Operations Admiral Arleigh Burke, which embarrassed President Kennedy and was discussed at two of JFK’s first press conferences.  O’Brien, Kenny O’Donnell, and former JFK press secretary Pierre Salinger were all close, and O’Brien surely talked with them after his first meeting with Elias.

Salinger likely told O’Brien about Elias’s infamous intelligence dossier, filled with misinformation and blatant lies claiming Elias was both untrustworthy and a Communist spy.   It is probably no coincidence that on Tuesday, October 22, three days after his first meeting with O’Brien, Elias received a call from a California friend who warned him that Salinger was again smearing him, claiming Demetracopoulos worked for “the other side.”

Kennedy advance man Jim King had known O’Brien and his father since the 1950s, when the O’Briens operated a Springfield, Massachusetts, tavern that organized working-class Irish Democrats. King concurred with historian Robert Dallek’s description of O’Brien’s “affinity for negative thinking.” He presumed that, even in 1968, “the McCarthy that would strike deepest fears into O’Brien’s Irish Catholic heart was not Gene but Joe.” O’Brien came of age during Joe McCarthy’s witch hunts and still believed that it was a political third rail to deal with anyone tainted with a whiff of Communist connections. The disinformation campaign against Elias had been anything but subtle.

The problem O’Brien had with Demetracopoulos was not his message, but the messenger himself. It is ironic that blowback from years of baseless, inflammatory anti-Elias attacks fanned by Kennedy insiders may well have cost Humphrey the election and given the country Richard Nixon and Watergate.

NOTE: Historians later confirmed the illegal flow of Greek money to the Nixon campaign, referring to it as “a ticking time bomb,” exposure of which “caused the most anxiety for the longest period of time for the Nixon Administration.” There is strong circumstantial evidence that the information Elias gave O’Brien at the Watergate in 1968 was part of what the burglars were looking for in 1972.

The Roundup Top Ten for June 17, 2022

The Secessionist Roots of January 6

by Elizabeth R. Varon

"The story of Southern secession provides illuminating evidence that the Jan. 6 insurgency was, indeed, precedented, rooted in long-standing efforts to preempt, delegitimize and suppress Black voting."

 

Is the Right Now Post-Religious? If Only!

by Jacques Berlinerblau

A high-profile op-ed by Nate Hochman obfuscates the continued significance of strains of Christian nationalism to the rising far right and falsely claims this movement is a secular one. 

 

 

The Right Celebrated Bernhard Goetz as the Kyle Rittenhouse of the 80s

by Pia Beumer

In the context of economic turmoil, urban crisis, and racial division, a broad swath of the American public made Goetz a heroic symbol of restored white masculinity after he shot four Black teens who asked him for money on the New York subway.

 

 

America Runs on Xenophobia

by Erika Lee

Xenophobia's resilience and revival in America are happening because it helps manage the faults and contradictions of major social institutions, including capitalism, democracy, and global leadership.

 

 

The Unity that Follows Tragedy Shouldn't Obscure Buffalo's History of Racism

by Keeanga-Yamahtta Taylor

The invented image of a "City of Good Neighbors" has been a rhetorical one-way street in Buffalo, with calls for unity gaining more traction than calls for justice or equality. 

 

 

The Dark Underside of the "Family-Like" Business

by Erik Baker

The history of businesses cloaking their labor practices in paternalism is long; the most recent chapter dates back to the spiritual explorations of the 1960s counterculture and the surveillance practices of Henry Ford. 

 

 

When Cities Put Up Monuments to Traffic Deaths

by Peter Norton

Rising pedestrian and cyclist deaths in American communities are a call to question the primacy of the automobile and stop accepting roadway carnage. 

 

 

Matthew McConaughey Goes Home

by John Fea

As a movie fan, the author has never been moved by the Uvalde, Texas native. But as a Christian, he found the actor's public solidarity with the victims and their families compelling and honest. 

 

 

MLK and Today's Global Struggle for Democracy

by Randal Maurice Jelks

"Thinking about King’s Holt Street speech brings me full circle to contemporary times as I try to understand this most anti-democratic era, one not seen since the 1930s as the clouds of World War II loomed on the horizon."

 

 

Can Law be an Instrument of Black Liberation?

by Paul Gowder

As activists debate whether the law and courts are a dead end for the pursuit of justice, it's useful to recall Frederick Douglass's conception of the law as a basis for collective demands. 

 

Florida's Divisive Concepts Bill Mistakes What Historians Do, with Dire Implications

Conservative activist Christopher Rufo and Florida Governor Ron DeSantis with supporters of Florida's HB 7, April 2022. 

 

 

Two weeks ago, the Southern Poverty Law Center filed an amicus brief in Falls v. DeSantis maintaining that Florida’s H.B.7, which was passed in April and goes into effect on July 1, is unconstitutional because it represents “a gross infringement on… freedom of expression and access to information under the First Amendment.”

Like various laws proposed by state legislatures that appear to be modeled on Donald Trump’s now-rescinded “Executive Order on Combating Race and Sex Stereotyping,” H.B.7 is intended, in part, to dictate rules about content covered in college and university classrooms. That includes, to use the SPLC’s words, discussions of “America’s legacy of racism.”

Across its 30 pages, H.B.7 contains a number of provisions that alarm people concerned with protecting freedom of speech, but a few notable measures target history education in particular. For example, the law stipulates: “It shall constitute discrimination” to “subject” a student to instruction that “compels” the student to believe that “a person, by virtue of his or her race, color, national origin, or sex bears personal responsibility for and must feel guilt, anguish, or any other form of psychological distress because of actions, in which the person played no part, committed in the past by other members of the same race, color, national origin, or sex.”

 

Lawyers representing the Florida Governor and Attorney General last week rebutted claims that H.B.7 is unconstitutional. “The First Amendment,” they argued, “does not compel Florida to pay educators to advocate ideas, in its name, that it finds repugnant.”

 

The language of H.B.7 is, at points, confounding, and the details of how it will be enforced are unclear, but it appears that the law is based on gross oversimplifications of how and why history is taught. It also seems that H.B.7 would make it challenging, if not impossible, for teachers at Florida public universities to offer students, as the American Historical Association put it in a February 2022 letter to the state’s legislature, “a full and accurate account of the past.” 

 

Although wording in legislation like H.B.7 may suggest otherwise, history professors do not simply catalog atrocity after atrocity and uncritically identify victors and victims. My first goal as a teacher of modern U.S. history is to help students develop the skills that define history as a discipline, including, for example, how to consider and make evidence-based arguments and how to evaluate varying representations of the past. A second goal is to help students understand the historical roots of contemporary problems and to recognize how decisions made in the past influence the present. I also aim to help my students open-mindedly explore multiple perspectives from and about the past to analyze change over time. When learning about the historical factors that have led people to have diverse experiences and viewpoints, students have the opportunity to consider how and why individuals and groups did what they did, to interpret how their actions shaped societies, and to develop empathy. 

 

“Sources” are the tools of history professors. We use primary sources – artifacts from the period under study – to offer our students windows onto the past. Secondary sources – books, articles, and other texts produced after the period being studied – provide context for understanding how and why the past matters and how certain people, events, and trends are connected.

 

Florida’s new law would make it difficult for professors to pursue basic teaching goals like the ones I list above.

 

For example, a multitude of primary and secondary sources that do not necessarily relate to the sort of “advocacy” imagined by politicians could be construed as “discriminatory.”

 

To cite just one example of a reading that could be used in an introductory undergraduate U.S. history class, teachers are left to ponder whether Andrew Carnegie’s “The Triumph of America” (1885) may be unacceptable in the classroom.

 

In that piece, Carnegie, who immigrated to the United States from Scotland, writes: “There is no class so intensely patriotic, so wildly devoted to the Republic as the naturalized citizen and his child, for little does the native-born citizen know of the value of rights which have never been denied.”

Carnegie was arguing not only that immigrants played a crucial role in fueling the United States’ economic growth, but also that they had a more profound understanding of the value of rights than their American-born counterparts.

Rather than being free to use Carnegie’s article to generate open discussion about, for example, why the steel tycoon might have made certain arguments, whether he supported his points with ample evidence, and the broader time period, H.B.7 suggests that a history professor at a public university should first consider a daunting question: Could statements in the piece be interpreted as derogatory to “native-born citizens” and therefore “compel” students who, like Carnegie, identify as immigrants, to believe that “they must feel guilt, anguish, or any other forms of psychological distress”?

Though it may seem outlandish, the Carnegie example underscores the extent to which vaguely worded laws like H.B.7 could constrain class content and discussion – even if they never lead to a claim of discrimination. Policies pressuring teachers to limit access to relevant historical evidence ensure that students consider only part of the story. They undercut core tenets of the discipline and teaching of history and restrict students’ opportunities to make sense of their world.

Similar examples of this sort of problem are endless. Most U.S. history survey textbooks and classes cover how Irish and German immigrants established themselves in the United States in the nineteenth and twentieth centuries. They also generally note that those groups faced discriminatory treatment. Would students be tasked with learning only about the positive experiences of Irish- and German-Americans? Would discussions of the hardships those groups overcame be limited because they might “compel” students to believe that they “must feel guilt”?

 

Of course, a prime target of H.B.7 is the study of the history of racism in the United States. The law, some have said, attempts to “whitewash history” in schools and universities. Indeed, people of all backgrounds might experience a range of emotions – including “anguish” – when they learn basic facts about the post-Civil War period of Reconstruction and Black Americans’ fight for freedoms. But placing restrictions on the information students may access in classrooms is irresponsible, not least of all because, as historian Nikki Brown points out, studying the meaning and history of white supremacy can be a powerful means of dismantling it.

 

The implications of H.B.7 bring to mind a recent piece by Peter Hessler about his experiences teaching non-fiction writing in China, where students can report professors for “political wrongdoing.” Hessler recalled that when he “made a statement that touched, even obliquely, on a sensitive aspect of Chinese history or politics… the room would fall silent, and students would stare at their desks.” It was, Hessler wrote, “a visceral response.”

 

Under laws like H.B.7, conveying the complexity and nuances of history – encouraging students to critically analyze the rich stories of their country, including questions about why certain ideas may be defined as “repugnant” at given moments in time – could be construed as a criminal act. For the sake of public university students who deserve free access to information and knowledge, let us not fall silent.

Should the USPS Honor the Sabbath, or Amazon?

 

 

A long-simmering debate centering on the federal government’s intersection with Christian religious beliefs has once again reared its head. No, not abortion – mail delivery.

 

On Wednesday, May 25, the 3rd U.S. Circuit Court of Appeals in Pennsylvania decided that observing the Sunday Sabbath could not exempt a federal worker from delivering packages for Amazon, according to Reuters. The carrier, Gerald Groff, had appealed on the grounds of religious discrimination.

Discrimination against Christians in a Christian-centric nation might seem to be a logical impossibility considering their position of privilege in the American religious landscape. But this privilege never prevented some Sunday-observing Christians in the early republic from decrying discrimination.

Last week’s case revolved around a rural mail carrier in Pennsylvania seeking to observe the Sunday Sabbath. Amazingly, the same sentence would accurately describe an example from 1809, in which Postmaster Hugh Wylie of Washington County, Pennsylvania, faced a choice between his employment and his membership in the Presbyterian church. While Wylie chose his job and considerable salary over church participation, the incident quickly became a rallying cry, a symbol of American government interference in the Christian religion.

The First Amendment declares that Congress cannot “establish” a religion “or prohibit the free exercise thereof,” meaning that Congress, tasked with postal policy, can neither declare a national religion nor prevent people from practicing theirs. Many Sunday Sabbath observers appealed to the second clause, the “free exercise” clause, to oppose Sunday mail. They were quite loud about this in the early republic, and a hundred years later (102, to be exact) they won. Sunday mail ended in August 1912 because of an alliance between Christian lobbyists and labor activists, with a dose of Christian nationalism.

Last week’s case was only possible because the United States Postal Service (USPS) resumed some Sunday mail delivery serving Amazon in 2013. Amazon made a deal with USPS for postal workers to deliver Amazon packages on Sundays. The policy began in the metropolitan hubs of New York and Los Angeles, and then it spread nationally. Now that change is moving through the federal court system. The 3rd Circuit Court of Appeals decided that exempting postal carriers from Sunday shifts would burden other workers.

Equity among postal employees was likewise a concern when Congress ended Sunday mail delivery in 1912. The concern in 1912 was that postal policy couldn’t exempt carriers from Sunday labor without burdening clerks. If carriers weren’t out there delivering, then clerks took on more work at local post offices to compensate. As a result, one arcane policy justification for the end of Sunday mail was equity between clerks and carriers. Of course, this was not arcane, but lived reality for clerks and carriers.

Today, many postal clerks and carriers face high demands, short staffing, and untenable working conditions. On the surface, Amazon helped to mitigate the long-term effects of Congress underfunding USPS from 2006 to 2022, as well as competition with new communication technologies. Package delivery has become more important than ever for the longevity of USPS. However, while Amazon reaps the benefits, postal clerks and carriers literally carry the burden.

The Sunday mail controversy of the early republic reflected broadly shared anxieties about disestablishment, the complex process of separating church from state in a new nation. Likewise, today’s controversy demonstrates anxiety about the role of Amazon in society.
Labor issues at Amazon are no secret, and the inability of a federal postal worker in Pennsylvania – not directly employed by the private company but obliged by USPS to work on its behalf on Sundays – to observe his Sabbath is one symptom.

The dissenting judge in last week’s case claimed that the extent of the burden on USPS was unclear. That judge, Circuit Judge Thomas Hardiman, wanted to allow an individual exemption for one postal worker, Gerald Groff, to observe Sunday sacred rest on the grounds that it wouldn’t really inconvenience business.

When Judge Hardiman questioned whether a labor exemption for one Sunday observer constituted inconvenience, maybe he was really asking why Amazon was so important that federal workers must devote their weekends to oblige it. This case about religious discrimination seems to be about labor exploitation.

An alliance between Christian nationalists like Wilbur F. Crafts and labor movements transformed postal and other policies at the onset of the twentieth century. A century later, Amazon broke that alliance by setting precedent to resume Sunday mail delivery to maximize efficiency and profit. Judge Hardiman's dissent asks, what does inconvenience really mean, and why does it matter?

The 3rd Circuit Court of Appeals’ decision proves that powerful corporations like Amazon have surpassed even Christian nationalism in the race to set postal policy. Instead of decrying religious discrimination, activists and people who simply want to enjoy their weekends should campaign for labor rights. This endeavor will require separating Christian morality from labor movements, which have historically been intertwined. Everyone deserves rest, regardless of whether the rest is sacred. If corporations like Amazon can accept slightly less than maximal profits, then Americans can enjoy time outside of work, and Gerald Groff will be free to attend church.

Why Andrew Jackson Believed in Gun Control  

 

 

Few American presidents loved guns more than Andrew Jackson.  By the time he entered the White House, Jackson had been in over 100 duels and believed fervently that an armed citizenry was freedom’s best defense.  “A million of armed freemen,” declared Jackson during his first inaugural address, “can never be conquered by a foreign foe.”  

But Jackson also believed in gun control.  

On January 27, 1818, Jackson wrote to Secretary of War John Calhoun, apprising him of a force that he had assembled to deal with Seminole attacks on white settlements in south Georgia.  “Volunteers were flocking” to join him, he proclaimed, boasting that “two full regiments” would be mustered by February 1.  However, Jackson confronted a surprising problem.  “The only difficulty,” complained Jackson to Calhoun, “has been the want of arms.”  Though many of the volunteers had fought with Jackson before – both during the War of 1812 and the Creek War – they had lost their guns.  “The arms which had been distributed to the militia for their services the last war,” complained Jackson, “have already disappeared.”

Where did they go? 

Though guns were useful tools, not all Americans needed them.  This was true even on the southern frontier, where Jackson had led a citizens’ militia against the British and their Creek allies from 1812 to 1815.  That militia was made up largely of small farmers, many of whom had no particular use for weapons.  “Many of them have been injured by neglect,” lamented Jackson, referring to the guns that the federal government had given his men.  Others had been sold.  “[T]he greater portion” of firearms, continued Jackson, “have been sacrificed for a mere pittance, and carried from the state; possibly now in the hands of those very savages, who have been excited to war against us.”  

This was terrifying.   

Rather than keep their arms in good condition, Jackson’s men had either let their guns rust or sold them for cash.  Said guns had then fallen into the wrong hands, namely Native Americans hostile to the United States.  Among these, of course, were the Seminoles, whom Jackson now planned to meet on the field of battle.

Worried, Jackson called into question the idea that weapons should simply be distributed to private citizens.  “This fact,” he complained to Calhoun, “will prove the impolicy of relying in time of necessity upon such a distribution of arms.”  Better, argued Jackson, to store weapons in arsenals.  “The only certain dependance,” he proclaimed, “is upon well stored arsenals, judiciously located, from whence arms may be withdrawn in time of War, & on the return of peace be restored & repaired for future occasions.”  Jackson went on to call for the placement of armories, foundries, and gunsmiths along the southern frontier, underscoring the point that he did not trust average Americans to keep and bear their own arms.   

This is remarkable, and perhaps worth recalling today as we debate the origins of the right to bear arms.  Historians like Saul Cornell and Jack Rakove have long argued that the Framers wrote the Second Amendment to preserve state militias from encroachment by the federal government, and not to consecrate an individual right to bear arms by which anyone could walk into a big box store and purchase a rifle.  

Jackson’s letter to Calhoun suggests that the historians are right.    

One of the most violent men to occupy the White House, Andrew Jackson came to conclude that guns were not needed to fight the state, but rather that the state was needed to maintain and store arms.  Further, the distribution of arms to the public posed a threat to public safety, as those guns could easily fall into the wrong hands.  Therefore, the best policy was to store military-grade weapons in arsenals, much like we do today with the National Guard.

Andrew Jackson liked guns, but he also believed in gun control.

Top-Gunning for Empire

From promotional poster for Top Gun: Maverick (Paramount Pictures, 2022)

 

 

The totals are in, and they’re big.  After just two weeks, Top Gun: Maverick has earned nearly $350 million in North America and over $600 million worldwide.  This is precisely what Hollywood – and theater owners across the United States – had hoped to see.  After two years of the pandemic keeping people away, Maverick shows that there is still an appetite for the big screen.

 

Yet the Tom Cruise picture has been more than just a commercial success.  It has also resonated with critics, scoring a remarkable 97 percent on the review site Rotten Tomatoes, which parallels the perfect “A+” CinemaScore of popular audiences.

 

This ecstatic reception tells us just how much Hollywood has succeeded in naturalizing the American penchant for military aggression.  When the original Top Gun came out in 1986, critics lambasted the blockbuster as a facile expression of U.S. Cold War bellicosity.  The consensus at a press event for the film, Steven Rea of the Philadelphia Inquirer wrote at the time, was that Top Gun was “a slick, right-wing Rambo-era propaganda picture, prepping the nation’s teens for war.”  Even the film’s director conceded its unabashed militarism.  “It’s a recruitment film for the Navy,” Tony Scott said, “and they didn’t have to pay a cent for it.”

 

Thirty-six years later, Maverick, which like its predecessor was made with full Pentagon cooperation, is doing much the same, with reports of Navy and Air Force recruiters setting up shop in multiplex lobbies coast to coast.

 

Yet, in a sign of just how ubiquitous the nation’s forever war has become, almost no critical or popular attention has been paid to Maverick’s normalization of American imperialism.

 

This is a film, after all, centered on a U.S. mission to destroy the nuclear facilities of an unnamed nation that has not attacked – nor is imminently poised to attack – the United States.  As much as Washington might wish to pretend otherwise, this makes the mission illegal.  Maverick purports to legitimize it by briefly noting that NATO has deemed the target a security threat.  But NATO is not the United Nations, which alone possesses the authority to authorize such a campaign.

 

Neither is NATO the purely defensive alliance that Maverick implies.  Since at least the 1990s, when it devoted considerable resources to bombing the former Yugoslavia, the onetime counterpart to the Soviet Union’s Warsaw Pact has gradually remade itself into a multinational fig leaf for U.S. and European militarism.  This became glaringly apparent with its controversial 2011 intervention in Libya, which resulted in the destabilization of much of North Africa and the effective failure of the Libyan state.

 

Americans by now have grown accustomed to U.S. interventionism.  Whether it is lobbing missiles into Syria, launching drone strikes in Yemen, or sending U.S. special operations forces into Somalia, we hardly even bat an eye.  It has become as normal to us as going to work.  We don’t notice the archipelago of U.S. bases around the world, and we certainly don’t pay attention to the opprobrium of the international community.

 

This makes Maverick a perfect twenty-first-century film.  In the fantastical world of Hollywood’s latest blockbuster, the United States could not possibly be an aggressor.  On the contrary, audiences are meant to wholly accept the justice of the U.S. mission, which is controversial only for the near impossibility of its success.  Does the United States have the right to bomb the unnamed country in the movie?  The question is never even contemplated.  Instead, audiences wonder, will the pilots evade detection?  Will they make it to their target?  Will they manage to blow it up?  And, finally, will they successfully escape?

 

Happily, the answer to these questions is yes.  The Americans succeed, and when they do, we can’t help but cheer them on.  The best propaganda has a way of making us do that.

"Our Best Memorial to the Dead Would be Our Service to the Living"

Women’s Overseas Service League Seattle Unit members on the 50th Anniversary of Armistice, November 11, 1968. From left to right: Mrs. Edna Lord (American Red Cross), Mrs. I.M. (Anna) Palmaw (Army Nurse Corps), Miss Rose Glass (YMCA), and Miss Blanche Wenner (YMCA). Women’s Overseas Service League Collection, National WWI Museum and Memorial Archives, Kansas City, Missouri.

 

 

The past several years of domestic debate over the roles and meanings of memorials on the American landscape can be enriched by looking to the example of female commemorators of the past. Today’s conversations tend to focus on statues and other artistic works. By learning about an overlooked cohort of American women who served in World War I, we can find inspiration for creative memorialization projects that will expand our understandings of memorials beyond physical statues and monuments.

In the decades after World War I, American women who served or sacrificed during that conflict championed memorial projects that prioritized community service over statues. Their efforts can provide a blueprint for how to change our approach to memorialization, should we care to look for it. Examining their philosophy can yield the untapped wisdom of a generation of activists, mothers, civic leaders, and unrecognized female veterans.

The women who pursued this unconventional approach to memorialization had contributed to the war effort in a variety of ways. Some had directly supported the military through service in wartime organizations, both at home and abroad. Others had suffered extreme sacrifices. In their number were Gold Star mothers and widows who lost a child or husband. The larger community of female veterans embraced these women as their own and honored them as having served the nation just as much as male veterans.

These women banded together and put service at the center of their commemorative work. They coordinated their efforts through new organizations such as the Women’s Overseas Service League (WOSL), which represented the interests of the thousands of American women who served overseas during the war.[i] Instead of monuments, the WOSL concentrated their memorialization projects on aiding people impacted by the war, whether male or female. They felt obligated to help the male veterans they served during wartime, but they also supported their own community, particularly civilian women excluded from veteran status. [ii]  In the absence of government support for them, the WOSL served as their advocates and benefactors.

Although these projects included no constructed components, the WOSL defined them as memorials. In 1923, WOSL President Louise Wells wrote that in her organization, “there was an overwhelming sentiment to the effect that for the present at least our best memorial to the dead would be our service to the living.”[iii] WOSL members repeated this mantra as they pushed for a radical reinterpretation of memorials focused on service. Instead of spending their limited resources on statues or memorial buildings, they funded what Wells had identified in 1923 as a “more pressing need”: projects to help disabled ex-service women.[iv] For the WOSL, these were the most important memorials they could ever create.

During World War I, gender-based restrictions on military service meant that many American women served as civilians outside of the official armed forces, even when they worked directly for the military, in uniform and under oath. As a result, the government did not consider them to be veterans. They could not receive veterans’ benefits such as medical care, even for illnesses and injuries that stemmed from their wartime service. The WOSL took it upon themselves to aid these women, who included the telephone operators known as the “Hello Girls,” the Reconstruction Aides who worked as physical and occupational therapists, and others.[v] Among numerous initiatives, the WOSL established the Fund for Disabled Overseas Women to provide financial aid to women disqualified from government veterans’ medical benefits.[vi]

Although these women achieved only limited success during their lifetimes, both in their quest for veteran status and in their attempt to change commemorative practices, their experiences provide powerful lessons for today. Their wartime service offers examples of how women supported the armed forces even before they could fully and equally enter all branches of the military. By identifying as veterans, they compel us to question the definition of a veteran and to consider that those who serve outside of the ranks may also be veterans in their own right.

Through their memorialization projects, the unrecognized female veterans of World War I offer alternatives to traditional memorials. They pioneered a selfless form of commemoration that memorialized the past by helping those in the present. What if we also sometimes chose this method? How much time and money would we save if, instead of debating the next memorial on the National Mall, we pursued a commemorative service project? How many people could we help if we directed even just a portion of funds for memorials into service projects alongside them? Recently, we have seen how problematic permanent memorials can be. Forgoing them for intangible memorials could save future generations from further culture wars. As the nation grapples with this current reckoning over memorialization, we can learn much from the American women of the World War I generation who prioritized the needs of the living over bronze and stone.

 

[i] Helene M. Sillia, Lest We Forget: A History of the Women’s Overseas Service League (privately published, 1978), 1, 218; Allison S. Finkelstein, Forgotten Veterans, Invisible Memorials: How American Women Commemorated the Great War, 1917-1945 (Tuscaloosa: University of Alabama Press, 2021), 70; Susan Zeiger, In Uncle Sam’s Service: Women Workers with the American Expeditionary Force, 1917–1919 (Ithaca, NY: Cornell University Press, 1999), 2; Dorothy Schneider and Carl J. Schneider, Into the Breach: American Women Overseas in World War I (New York: Viking Adult, 1991), 287-289. Estimates of how many American women served overseas in WWI vary widely. Zeiger estimated there were at least sixteen thousand, while Sillia estimated about ninety thousand. Dorothy Schneider and Carl J. Schneider argued that twenty-five thousand seemed like a “realistic, conservative figure.”

[ii] For example, in 1922, the WOSL’s National Service Committee chair Anne Hoyt even asserted that “the Overseas women are most especially fitted” to help the ex-service man. Anne Hoyt to Judge Payne, October 6, 1922, box WOSL: Correspondence, 1924-62; Congressional bills, 1929-1951, box 4, folder Irene Givenwilson Cornell, 1921-3 Re: Women who died in service George Washington Memorial, Women’s Overseas Service League Collection, National WWI Museum and Memorial Archives; Finkelstein, Forgotten Veterans, Invisible Memorials, 37-38.

[iii] Finkelstein, Forgotten Veterans, Invisible Memorials, 70; Louise Wells to Mabel Boardman, June 19, 1923, box 428, folder 481.73, Memorials-Inscriptions, RG 200, National Archives, College Park (NACP).

[iv] Finkelstein, Forgotten Veterans, Invisible Memorials, 70; Louise Wells to Mabel Boardman, June 19, 1923, box 428, folder 481.73, Memorials-Inscriptions, RG 200, NACP.

[v] Finkelstein, Forgotten Veterans, Invisible Memorials, 7-8, 39-40; Zeiger, In Uncle Sam’s Service, 170-171; Elizabeth Cobbs, The Hello Girls: America’s First Women Soldiers (Cambridge, MA: Harvard University Press, 2017), 73, 78, 83, 94, 102, 104-105, 133; Lena Hitchcock, The Great Adventure, V, Box 240, The Women’s Overseas Service League Records, MS 22, University of Texas at San Antonio Libraries Special Collections.

[vi] Finkelstein, Forgotten Veterans, Invisible Memorials, 34-36.

As an Island, Britain Became a Stage for Roman Politicians

Surviving portion of Hadrian's Wall, Northumberland

 

 

In my new book I explore the idea that classical conceptions of the nature of the Ocean were the prime motivation of senior Romans to conquer the mainland of Britain. The Romans inherited an Ancient Greek belief in the divinity of the Ocean. Oceanus was one of the first gods, a Titan who ruled the Ocean, and was believed to be the father of all water sources, including springs, fountains and waterways. In the classical world these sources were considered sacred. When Britain was brought into the Roman political sphere at the time of Julius Caesar, it was given a special status as an unknown island lying in the uncontrolled waters of Ocean. The Romans also thought that Britain was a likely source of minerals and pearls, a valuable gift from Ocean. Most significantly, campaigning in Britain would also supply many captives for the immense slave market of Rome and across the Mediterranean. Britain was, however, most unusual in Roman terms – a large island set in an unexplored and stormy sea. The Romans were used to the relatively calm waters of the Mediterranean, and navigating the Atlantic seemed a daunting challenge. The exploration of the Atlantic coast of Western Europe and the conquest of Britain therefore appeared to be an almost godlike undertaking.

 

By the time they turned their military attention towards Britain, under Julius Caesar, the Romans had established full control of the lands around the Mediterranean Sea. Famously, Caesar directed his military aggression against Gaul, conquering vast territories and many peoples during eight years of war. Caesar also led his soldiers northwards across the River Rhine to campaign against Germanic peoples, and then could not resist the temptation of invading southeastern Britain on two occasions, in 55 and 54 BC. He sought more information about Britain and its people, but he also wanted the fame of being the first Roman commander to campaign in Britain which, he tells us in his war diaries, The Gallic Wars, was a land almost unknown to his peers in Rome. Germany lay beyond the wide and challenging River Rhine, but campaigning in Britain was an even greater endeavour, since it required crossing Ocean. Among the booty that Caesar took back to Rome were British pearls as well as many captives. Britain was no longer unknown in the city of Rome.

 

Caesar withdrew from Britain after he had forced the leaders of various British peoples to submit, though he also established the idea that a Roman commander could achieve considerable political capital by conquering these little-known lands set in the supposedly endless waters of Ocean. A successful conquest of Britain would require a Roman commander who could subdue the stormy waters lying off the coast of Britain, the barbarian islanders, and also the divine spirit of Oceanus. The emperor Claudius, whose position in Rome was insecure, saw an opportunity for a propaganda victory by commanding a campaign to Britain in AD 43. He took part in the invasion himself, receiving the submission of several kings, although he spent only a few weeks in Britain before delegating leadership in combat to his general, Aulus Plautius. Following this successful invasion of much of southeastern Britain, Claudius was awarded a triumphal arch in the city of Rome, constructed by order of the Senate. This monumental arch bore an inscription referring to the emperor’s conquest of the lands of Britain across the Ocean. The arch also carried the waters of one of the major aqueducts in Rome across the main road entering the city from the north, a symbol of the emperor’s mastery over water in all its forms and also over the ancient divinity Oceanus. The Romans believed that all the waters from springs and streams that fed their aqueducts were the divine offspring of this ancient Titan.

 

The location of Britain as an island in a previously unknown sea off the northwest coast of Europe elevated its status to a theatre for imperial triumph. My book narrates this tale of conquering Ocean, emphasizing the role of successive emperors, from Claudius to Hadrian, in conquering this land and controlling the surrounding seas. The great victory arch at Richborough in Kent was built to commemorate the conquest of the island, probably in the AD 80s, after Agricola’s victory against the Caledonians in northern Britain. This arch was even more closely associated with water: it stood at a significant port on a small island off the southeast coast, at the point of entry to the province of Britannia from the Continent. Its location was close to where both Caesar and Claudius had landed, symbolizing the Roman attitude that the conquest of Britain had been completed.

 

This, however, proved not to be the case. Rome abandoned the north of Britain, present-day Scotland, a few years after Agricola was recalled to Rome in AD 85. The northern frontier of the Roman province of Britannia became focused around what is now northern England. During the AD 120s the emperor Hadrian commanded the construction of massive stone frontier works on a line that extends from the present-day city of Newcastle upon Tyne in the east to Carlisle in the west. In another act of Roman conquest that called upon Britain’s Oceanic status, this monumental frontier structure was built to supplement the River Tyne and the Solway Firth in establishing the edge of Roman-controlled space. By this time the Romans had abandoned their idea of conquering the land further to the north, and these extensive territories and their peoples remained at least partly independent of Rome until the early fifth century, when Roman rule in Britain finally came to an end.

Excerpt: The Fires of Stavishche, 1919

 

 

 

PROLOGUE: Stavishche, June 15-16, 1919

 

As dusk fell, a near full moon shone over Stavishche. Isaac and his wife, Rebecca, enjoyed a break from a long workweek, relaxing and celebrating with friends under the moonlight in a courtyard garden. Suddenly, the air erupted with nearby gunshots: Rebecca was panic-stricken. A man ran by and yelled, “Zhelezniak’s thugs! They’re back! There’s more of them—hide!” before dashing away. At almost the same moment, a woman screamed, “Please, no!” A child cried; a plate-glass window crashed. Thugs were bashing in doors and destroying the Jewish shops and homes. Many, like theirs, were attached to both sides of the Stavishcha Inn, behind which they now sat frozen in fear.

 

“The girls!” Rebecca yelled. The thought of her daughters unfroze her, and she bolted toward the house. Another woman screamed, this time just across the courtyard wall. They were too close. Isaac grabbed his wife’s arm hard, pulling her to her knees. “There’s no time! Root cellar!”

 

They were just steps from the inn’s cellar, a half-dugout, musty hole under the crumbling corner of the old stables. Here, in the dark, they kept potatoes, bins of dried beans, hanging herbs. The cellar was cool and black. It felt dank and smelled slightly rotten, a hundred years of cobwebs, termite-infested rafters, manure, spilled pickle juice, and mildew. Isaac climbed down and wiggled to the back on his stomach; his wife did more of a crawl, protecting her growing belly from scraping the ground. Their hearts thumped as they lay with their arms around each other. A body thudded against Rebecca: it was her friend Rachel, followed by Rachel’s new husband, Elias.

 

The shots were more muffled now but still clearly nearby. The never-ending sounds jostled them: crashes of window glass, smashed liquor bottles, boots stomping, splintering of wood as doors were axed open, and then more gunshots. They heard the wild laughter of a group of drunken men.

 

“Our babies,” Isaac said in her ear. “I have to go get them.”

 

“It’s too late,” Rebecca whispered back.

 

Her hand tightened on his arm. “No!” she cried to him, knowing it was the cruelest word. “They’ll see us. They’ll follow us to the children. We’ll all die.”

 

From outside, more crashes, screams, laughter.

 

What have I done? Isaac thought, regretting his decision to leave the girls alone in the house.

 

Rebecca feared the worst. We’ve lost them forever, she thought. The dank root cellar would surely be their own grave; she knew it.

 

Just minutes earlier, the couple’s evening had begun peacefully. The near full moon meant that the night’s sky would never get completely dark. Yet this longer day of sunlight meant that Rebecca’s sewing continued well into the evening. It was Sunday, just before eight, when the pretty seamstress finally tucked her girls into their small bed in the back room, telling them, “Go to sleep now.” Slightly plump little Sunny, nearly three, snuggled her back against her six-year-old sister, Channa.

 

Rebecca moved quickly through the tiny living quarters, finishing her chores before heading out of the house. She took only a moment to fix her long, dark hair. Isaac was just arriving from the front room, set up as his shoe factory, where all day long he’d nailed soles onto boots. Sundays were especially long since the day before was Sabbath. Rebecca exchanged a weary smile with her handsome, dark-haired husband.

 

“The children are asleep?” he asked.

 

“On their way.”

 

The couple headed to the courtyard garden out back, near the stables that housed the horses for the guests at the nearby establishment. In a far corner, their neighbors Rachel and Elias, just married, sat and held hands on a bench. Isaac and Rebecca greeted their new friends with a bottle of wine. “Mazel tov!” they wished them, as the foursome raised and clinked together Rebecca’s silver shot glasses in a celebratory toast. Rebecca, resting her hand on her pregnant belly, did not raise the cup to her lips.

 

The courtyard was still and warm; a late spring dusk appeared. From the street they heard the clop-clop of horse-drawn wagons. They drank and laughed until most of the daylight disappeared after nine.

Inside, in the dark back bedroom, Channa snuggled her sister’s warm body until Sunny stopped wiggling. Then, feeling warm in the June evening, Channa kicked off the blanket and stared at the ceiling. Finally, she, too, dozed off: first fitfully, then so deeply that she didn’t hear the initial gunshots in front of the Stavishcha Inn.

 

The explosion had their parents sitting bolt upright: Rebecca’s round blue eyes widening, Isaac’s clean-shaven chin jutting from his face. Rachel screamed; Elias covered her mouth.

 

They waited in the root cellar. Hours of it; it would never end.

 

As daylight approached, the noises changed. First the commotion stopped. Rebecca and Isaac still lay in place, breathing lightly in rhythm, too afraid to move. Then the screams began again, but these were different: wails by the injured, wails for the dead.

 

The foursome crawled and then climbed out of the cellar and into the early June sun. From every corner, the neighborhood was coming out of hiding, hugging and crying or screaming next to the victims of the pogrom. Rebecca and Isaac ran quickly to their house. They opened the back door, which was still closed and intact, and rushed to their daughters’ bed.

 

Their legs froze in place, preparing for the worst. Rebecca felt an awful pit in her stomach, afraid of what horrible scene they might find. Instead, Channa lay peacefully on her back, and, as usual, her arms were outstretched. Sunny lay in a fetal position, her face pressed into the goose-feathered pillow. But they were breathing. They were sleeping, untouched. They’d slept through it all!

 

The violent mob had passed over their house!

 

Rebecca looked at her husband. His cheeks had gone white. Hands shaking, he picked up Sunny, who stretched and smiled. Rebecca broke down and sobbed uncontrollably into Channa’s long, brown hair. Confused, the girls looked around with wide eyes. Everything was exactly as Rebecca had left it: a soaking pot still stood upright on its stand; sewing needles and a small jewel case left on a nightstand remained undisturbed. The girls were oblivious to what had happened.

 

“Nothing—nothing touched,” Isaac said, wondering, “Why did they spare us?”

 

“Isaac!” a howl came from outside the front door, followed by loud banging. “Isaac! Isaac Caprove! Your peasant Vasyl has murdered my wife!”

 

 

 

The Fires in Stavishche

 

In the spring of 1920, 1,500 to 2,000 Jews were burned alive in the synagogue in the nearby city of Tetiev by the followers of Ataman A. Kurovsky, a former officer under Symon Petliura. The Caprove family, Isaac, Rebecca, and their daughters, who had fled the shtetl after a previous attack, heard rumors from the safety of their apartment in Belaya Tserkov that these hooligans were headed toward Stavishche. By then, the majority of Jews who still remained in the town were the sick, the disabled, and the elderly. The spiritual leaders and their families also stayed behind in support of those who were unable to flee. A group of vicious local peasants entered Stavishche on the heels of Kurovsky’s men, who had left the shtetl as quickly as they had entered it. On a beautiful spring day, these bandits set a few buildings on fire in the Jewish quarter of town.

 

Havah Zaslawsky, the devoted daughter of Rabbi Pitsie Avram, ran down the street in a panic during the fires, searching for her father and fearing that he had been killed. When she saw the flames rise near the synagogue (probably the Sokolovka kloyz, one of six in town), Havah instinctively knew that her father had rushed to open the ark for the last time. The rabbi, with his flowing white beard and large sunken eyes, suddenly emerged from the burning synagogue cradling his sacred Torah, its breastplate, and a pair of matching antique silver Torah crowns. The tiny bells that hung in layers from the priceless keters (crowns) jingled as he ran for his life.

 

Escaping the flames that spread quickly behind them, Pitsie Avram and his youngest daughter fled down the street together, meeting up with other family members along the way. At the Jewish Bikur Holim (the Home for the Aged Hospital), six elderly female residents and two male residents were slaughtered. In the home of Shlomo Zalman Frankel, thugs tied him to a pig and set both on fire.

 

As the rabbi’s group fled, they were unaware that murderers were searching house to house for Jews, tearing the screaming, the bedridden, and the elderly from their beds. Within minutes the killers herded twenty-six Jews to another synagogue and slit their throats. Barking dogs began eating away at the dead.

 

At yet another Stavishche house of worship, Cantor David-Yosel Moser was inside chanting words from his precious Torah when bandits stormed in and confiscated his sacred scroll. Tossing the fragile parchment across the floor, the thugs then raped a Jewish woman on top of its pages. When that was not sufficient in their drunken minds to desecrate the holy scroll, they brought in a horse to defile it.

 

Cantor David-Yosel stood helpless as the assassins torched his Sefer Torah (Torah scroll). Finally, the old chazzan’s heart gave out, and he dropped to the floor beside the burning scroll. The old cantor, who in happier times had loved entertaining the children of Stavishche by cutting beautiful chains of paper birds from parchment, died beside the thing he loved most in the world. This destruction, however, could not kill the spirit of either, for the chazzan David-Yosel and his parchment scroll are both indestructible; they are eternal, beyond time.

 

Lindsey Fitzharris on Visionary Surgeon Harold Gillies

 

 

The immense casualties of World War I shocked a generation that believed at the outset in 1914 that the war would last only a few weeks. Almost ten million soldiers were killed in combat and 21 million were wounded in the slaughter that dragged on until November 1918. Great Britain alone lost one million combatants and saw more than two million wounded. On the first day of the Battle of the Somme, July 1, 1916, the British suffered almost sixty thousand casualties, including 19,240 men killed, the bloodiest day in the history of the British Army.

This first large-scale, industrialized conflict produced gruesome carnage and massive casualties in enormous battles as the technology of war far outpaced the progress of modern medicine. New weapons such as machine guns, powerful new bullets, heavy artillery, flamethrowers, tanks, poison gas, and strafing airplanes tested the skills of medical personnel charged with treating wounded and dying men. The innovative weapons lacerated, punctured, macerated, incinerated, tore apart and atomized soldiers who fought from fetid, wet trenches, breeding grounds for infection and sepsis.

Disfiguring facial wounds became prominent and feared injuries because helmets left the face unprotected, and the face was especially vulnerable to projectiles and shrapnel in trench warfare. The injuries not only destroyed a patient’s appearance and expression but also interfered with function and sensation, from breathing, swallowing, speaking, and eating to seeing, smelling, and tasting.

Early in the war, pioneering surgeon Dr. Harold Gillies took on the challenge of mending and restoring the mutilated faces of wounded British soldiers. The Cambridge-educated New Zealander developed a range of techniques to reconstruct broken faces as well as to restore function and optimize appearance. Each patient required novel approaches to address trauma such as crushing facial fractures, broken or lost jaws and noses, broken eye sockets, severe burns, bone loss, tooth loss, and other injuries.

Dr. Harold Gillies (Illustration by Robin Lindley)

Gillies’s innovations in skin and cartilage grafting, aesthetic repair, prosthetics, infection control, anesthetic use, and other advances transformed the rudimentary discipline of plastic surgery, and still inform surgeons today. He also was celebrated for his compassion toward all patients, regardless of rank, and his efforts to address the psychosocial aspects of disfiguring injuries.

Award-winning medical historian Dr. Lindsey Fitzharris recounts the inspiring story of Harold Gillies’s innovative medical work and the men he treated in her groundbreaking new book The Facemaker: A Visionary Surgeon’s Battle to Mend the Disfigured Soldiers of World War I (Farrar, Straus and Giroux). As she describes, the legendary Gillies transformed many lives as he treated the complicated physical and psychological trauma of grievous wounds. As Dr. Fitzharris stresses, he addressed broken spirits as he mended broken faces.

The Facemaker brings history to life thanks to Dr. Fitzharris’s gifts for lively storytelling, accessible scholarship, and extensive research. The book takes the reader into the sodden trenches of World War I, the shell-cratered battlefields, the blood-stained aid stations, and the operating rooms of Queens Hospital in Sidcup, England, where Gillies performed his complex reconstructive operations. The book also captures the excruciating pain and suffering endured by the wounded, as well as how Gillies and his remarkable team of physicians, dentists, nurses, artists, sculptors, mask makers, and others brought empathy and profound caring to each patient through numerous surgeries and the protracted healing process.

Dr. Fitzharris’s book is based on meticulous scholarly research. She drew on a trove of material on medicine and the war as well as on Gillies and his patients, including previously unpublished letters, diaries, and other primary documents that inform this heart-wrenching yet inspirational story of the war.

The Facemaker seems destined to stand out not only as a brilliant work of history and research, but also as an unflinching antiwar work, as Dr. Fitzharris literally reveals the human face of war and the futility and waste of modern combat—concerns that resonate now as another brutal and senseless war rages in Ukraine.

Dr. Fitzharris is a medical historian who now focuses on sharing stories from the past with a general audience. She holds a doctorate from the University of Oxford and completed postdoctoral studies at the Wellcome Institute in London. Her debut book, The Butchering Art: Joseph Lister’s Quest to Transform the Grisly World of Victorian Medicine, won the PEN/E.O. Wilson Award for Literary Science in the United States and was shortlisted for both the Wellcome Book Prize and the Wolfson History Prize in the United Kingdom. She also created the popular blog The Chirurgeon’s Apprentice as well as the YouTube series Under the Knife, and she hosts the TV series The Curious Life and Death of . . ., which airs on the Smithsonian Channel. She contributes regularly to The Wall Street Journal, Scientific American, and other publications.

Dr. Fitzharris generously discussed her background and her new book The Facemaker by Zoom from her office in England.

 

Robin Lindley: Congratulations, Dr. Fitzharris, on your new book The Facemaker and the many positive early reviews. Before getting to the book, how did you initially become interested in history?

Dr. Lindsey Fitzharris: That's a good question. I have a PhD in the history of science and medicine from Oxford University, but these days I consider myself first and foremost a storyteller.

When I think back to my childhood, I was always a bit of a storyteller. My grandmother raised me, and she always had a lot of objects in the basement that were related to the past. I loved going through these objects and learning their stories. And we would go to the cemeteries, and she would tell me stories about the people she knew who were by then long gone. I was just always interested.

And there's always been a sort of tactile element to history for me. Obviously, it's a living history to walk around the streets of Oxford. You feel like you're really in the past on some level and you get access to these incredible libraries where you can touch to these old books.

For me, I’m always immersing myself in the past. As you can see, even in my office, I have these World War I artifacts here that I use to tell stories in interviews.

So, I was always interested in the past and I ended up doing all my degrees at once. I did my postdoc at the Wellcome Institute in London but I got a bit burnt out in academia and I decided to move into the realm of storytelling through my blog, The Chirurgeon’s Apprentice, and through The Butchering Art, my first book, and now The Facemaker. I love connecting with a general reader with these incredible stories from the past.

Robin Lindley: Your work is a gift to readers. What sparked your interest in medical history? Did you have a desire to work in medicine or did you have medical professionals in your family?

Dr. Lindsey Fitzharris: Well, my mom was a nurse so there was always that kind of buzzing around in the background. But actually, when I went to Illinois Wesleyan University as an undergraduate, I had a professor named Michael Young who taught courses on intellectual history and on the scientific revolution, so I got really interested in the history of science first.

And when I went to Oxford, I got to study the history of science under Professor Robert Fox and my interest flourished and developed.

I always tell people that if you’re not interested in history, you might be interested in medical history, because everybody knows what it’s like to be sick, especially today when we’ve been living through a pandemic. It’s so relatable in that sense. What was it like if you had a toothache in 1792, or what would happen if you had to have your leg amputated in 1846? That’s where I, as a medical historian, can fill in those gaps. In that sense, maybe military history or political history isn’t as relatable to people. The everyday experience of being sick and being scared and having to turn to the medical community for help is very understandable.

Robin Lindley: Thanks for those reflections, Dr. Fitzharris. Did your award-winning first book, The Butchering Art, on Joseph Lister and those grimy Victorian-era hospitals, grow out of your doctoral studies at Oxford?

Dr. Lindsey Fitzharris: My dissertation was actually on 17th-century alchemy, so I was an early modernist. When I wrote The Butchering Art, my supervisor asked why I would go into the 19th century. And now I’m in the 20th century with The Facemaker.

I go where the story is. And I was surprised that nobody had told the story of Joseph Lister to a general audience, which is such an incredible, world-changing story. People weren’t really familiar with who he was except through the product Listerine, which he never even invented. So, I felt compelled to tell that story, and there were so many great scenes and so much atmosphere. Walking into an operating theater of the Victorian period is so different from how we operate today. I really wanted to paint that picture for readers, and I had such a good time doing it, but my training is in a much earlier period, in 16th- and 17th-century history.

Robin Lindley: Thanks for that explanation, and congratulations on The Butchering Art, an evocative and vivid read.

Dr. Lindsey Fitzharris: That one and The Facemaker are very different. They're both narrative nonfiction for a general audience and I fell into both stories, but they are very different.

Robin Lindley: In The Facemaker, you follow the career of Dr. Harold Gillies. I’ve always had an interest in him and in fellow physician and artist Dr. Henry Tonks and their work on facial reconstruction, but I haven’t seen many resources on them, so I appreciate your groundbreaking book. Are there a few things you’d like to say about Gillies to introduce him to readers?

Dr. Lindsey Fitzharris: Yes. What's nice was some of Gillies relatives now contacted me when I first announced this book. He has a great, great nephew who is a Hollywood actor named Daniel Gillies and he's the reader for the audio book. I was told that occasionally, he'll stop and say he didn't know facts about his ancestors, and he learned about Harold Gillies through reading the book. It’s been really lovely to bring that to life for him and some of Gillies’ other relatives.

If people are familiar with Harold Gillies today, they might know him as the father of modern plastic surgery. Plastic surgery predated World War I; in fact, the term plastic surgery was coined in 1798 by a French surgeon named Pierre-Joseph Desault. At that time, plastic meant something that you could shape or mold—in this case, the skin or the soft tissue of a patient. So, the field predated World War I, but it was really through Gillies’s work, and through the enormous need for facial reconstruction that came out of the war, that plastic surgery entered this new modern era where new methods were developed, tried, and tested.

If you were to call Gillies anything, you might call him the father of modern plastic surgery. And he did incredible work. He was rebuilding these soldiers’ faces during the First World War when losing a limb made you a hero, but losing a face made you a monster because of the societal biases against facial differences.

He was able not just to mend these soldiers’ faces but also their broken spirits, because a lot of them would’ve ended up living a life of isolation. He’s an unsung hero in that sense. I think a lot of people, when they think of the history of plastic surgery, think of the Guinea Pig Club during World War II and the reconstructive work [on aviators with injured faces] that was done by Gillies’s cousin Archibald McIndoe, who became quite famous. But it started with Gillies in World War I, so I tell people this is the prequel to the Guinea Pig Club, if they’re familiar with that.

Robin Lindley: Yes. Historian Emily Mayhew wrote a book on the Guinea Pig Club and McIndoe’s work.

Dr. Lindsey Fitzharris: Yes. And she also wrote Wounded, on the [British World War I] stretcher bearers and the medical evacuation chain. I hope people will find that The Facemaker is a complement to what has already been done on the subject.

Robin Lindley: I interviewed Dr. Mayhew on Wounded. I appreciate your meticulous research on this book. You comment in the introduction that all of the observations and comments in the book are based on documents you found and were not conjecture on your part.

Dr. Lindsey Fitzharris: Yes. I write narrative nonfiction, which isn't a style all historians even agree with.  But I very much write a book like a novel. And here, if Gillies is saying something and it seems like dialogue, that’s because it's documented somewhere. Or if I note a gesture, then someone witnessed that gesture, and so I put that in. I love that storytelling technique because I want people to feel like they're right there: they're in the operating theater or they're in the trench with these men. How does it smell? What does it feel like with all of those sensory experiences? I hope people can understand that better after reading The Facemaker.

Robin Lindley: And it seems that you uncovered archival material such as personal letters and diaries and family papers as well as other documents that hadn’t been previously recounted by authors.

Dr. Lindsey Fitzharris: Yes. With Percy Clare, who opens The Facemaker, I used his diary and some academic historians have used a little bit of what he said, but not to the extent that I've written his story. I was in touch with his relative and I asked, do you know this and that? She didn’t know anything about him. It's going to be fun for her to learn about her ancestor through this book. I think she said her father had found the diary in the garage and donated it.

I chose Percy Clare because he wrote beautifully and extensively about his facial wounds. That was unusual because sometimes a soldier might mention it, but it's only a letter and not a full account of that whole experience.

Unfortunately, a lot of patient records were destroyed during World War II. It’s ironic that these men couldn't escape injury even in World War II. Percy Clare's records were lost, so I only know about him getting to Gillies at the Queens Hospital through his diary. Otherwise, we wouldn't have known that he was even a patient at the Queens Hospital. So that's an interesting challenge for a historian in trying to piece together a story like this.

Robin Lindley: And Percy Clare had horrific facial wounds, but from the photographs in your book, it seems his appearance came out really well.

Dr. Lindsey Fitzharris: Amazingly so. I don’t know what he looked like before. I can only share his description of the blood loss and wounds to both cheeks. I didn’t have his patient records. I asked his relative, Rachel Gray, do you have a photo album? And she said, I do, but she had just had a flood in the garage a couple months before. She sent the album to me, and these photos were in plastic, so I couldn’t take them out because that would ruin them. A friend of mine ended up restoring those photos for me. They were in a terrible state, and he did amazing work. Again, there are all these unforeseen challenges when you’re trying to piece together this kind of history. Luckily there were photos of him later in life, and his face looked amazing.

Percy Clare, in later life (Courtesy of Rachel Gray; Restored by Jordan J. Lloyd).

 

Robin Lindley: I wanted to get a sense of your process when writing for a general audience. You provide detailed and accessible historical context to help the readers understand the past moments you present.

Dr. Lindsey Fitzharris: I went into The Facemaker knowing very little about World War I. If anybody out there is not familiar with World War I, don't worry. I was right there with you. This is why I took five years to research. It took an incredibly long time because I was really starting from zero, but I knew that there was a very human story.

I also knew that, although this is a book about Harold Gillies, it's really a story about many men. And I think that's reflected very well in the cover design of a surgeon's hand holding a scalpel and in the reflection is a bandaged soldier. I really wanted their voices to come through in this narrative.

One of the differences between writing this kind of book or writing an academic history is that part of my job as a narrative nonfiction writer is not to overwhelm the general reader with too much information. When it comes to World War I, there's a lot of information. There are so many letters and so many diaries, and it goes on and on. And, as an academically trained historian, it can get overwhelming because you could spend literally 15, 20 years just reading and not be ready to write the story. As a commercial writer, I don't have that luxury of time, but this book ended up taking about five years. A lot of what I do is trying to find the pulse of the story.

If someone picks up The Facemaker and they know nothing about facial reconstruction, and know nothing about medical history or World War I, I want them to be able to feel that they can read this book and understand it and enjoy it. They don't need to come to it with any prior medical knowledge or any historical knowledge.

Robin Lindley: Your academic background and your gifts as a storyteller are a powerful combination in writing for television and writing non-fiction for a general audience.

Dr. Lindsey Fitzharris: It's funny though because television executives have no imagination. It's really hard to convince them that medical history is something that people would be interested in, which is to me is confusing because there's so many medical shows on television. People love them. ER was a huge drama. In fact, when I was going around with The Butchering Art, I was telling people that the Victorians used to buy tickets to the operating theater, and people just thought that was crazy. And I said, but we're still buying tickets because we're tuning into ER, we're tuning into reality shows about hospitals, or whatever it is.

We still have that morbid curiosity. I think my job, as an academically trained historian as well as a storyteller, is to make sure that it’s entertaining, but in a way that isn’t exploiting the past or the people in it. We can look back at the past and say, I can’t believe they used to do that, but I always ask my audience: what will we say in a hundred years? What medical treatment will people in a hundred years simply not believe we used to do? Because that’s what will happen. What we know today isn’t what we’re going to know tomorrow. I hope that when people pick up The Butchering Art or The Facemaker, they see that evolution, or that revolution, in medicine that’s ongoing even today.

Robin Lindley: You mention in The Facemaker how plastic surgery has evolved and you conclude the book with the very recent face transplant procedures. In discussing the history, you vividly bring the reader right into the horror of the fighting in the trenches on the Western Front, and you stress the difficulty of getting medical care for the wounded in No Man's Land. Many with severe facial injuries waited for hours or even days just to be removed from the combat zone. If they got help, they were carried to aid stations and then to hospital ships and then to a hospital in Britain for treatment of their facial wounds.

Dr. Lindsey Fitzharris: Yes. The chain of evacuation was so difficult because the face is so vascular and the injuries bleed a lot. And a lot of times these stretcher bearers would step out onto the field while being shot at, and they could die. They were making very quick decisions about who to take off the field and who to leave behind. And if you look at one of these wounds, they look very ghastly, as you see in photos of the patients. I actually worked with a disability activist to discuss the inclusion of photos and the language, to make sure that it was inclusive.

And it was hard to get [the wounded] off the field. In fact, Private Walter Ashworth, you might remember, lay on the field for three days without a jaw, unable to scream for help. So that was a real difficulty.

I knew going into this book that I wanted to drop the reader right into the trenches. What was it like to be there in the middle of that action? And then to watch how difficult it was for one single patient.

The book opens with Percy Clare, and describes how difficult it was for him to get from being shot to getting to Harold Gillies back in Britain. There were a lot of detours along the way, and it could be a very frustrating process. And of course, some of these soldiers never ended up in Gillies’s care, and were probably worse for that.

Robin Lindley: Can you talk about the prevalence of facial wounds in this first modern industrialized war? You write about how medicine hadn’t caught up with the technology of many terrible new weapons. Why were there so many facial wounds to treat?

Dr. Lindsey Fitzharris: As I said, plastic surgery predated World War I. There was a bit of facial reconstruction during the American Civil War, which I discuss in the book, but not on the scale that happened in World War I. The nature of warfare at that time led to high rates of injuries. There were huge advances in artillery and weaponry, so that a company of just 300 men in 1914 could deploy firepower equivalent to that of a 60,000-strong army during the Napoleonic Wars.

You also had the invention of the flamethrower and of the tank, which left their crews vulnerable to new kinds of injuries. You had chemical weapons, even as gas masks were being rushed to the front. These lethal gas attacks became instantly synonymous with the ghastliness and savagery of the First World War, and at first the medical community struggled to keep up.

And these men weren't really given much protective gear, certainly in the first year. The Brody helmet [British “soup bowl” metal helmet] was invented in late 1915, and was the first helmet that was given to all men regardless of their ranks. It was an improvement over the soft caps that had been issued in the beginning of the war, but even so it didn't really protect the face as much as needed.

For all of these reasons, facial wounds were prevalent. Before the war was over, 280,000 men from France, Britain, and Germany alone suffered some kind of facial trauma. They were maimed. They were gassed. They were burned. Some were even kicked in the face by horses. So, this was a real problem in World War I, and of course it opened up this opportunity for plastic surgery to evolve.

Robin Lindley: To go back to Dr. Gillies, what motivated him to specialize in plastic surgery and then to mend the terrible facial wounds of soldiers—many wounds that were probably new to most surgeons at the time?

Dr. Lindsey Fitzharris: With The Butchering Art, Joseph Lister was the right man at the right time. And I feel like Gillies was also that person for his time. There were other surgeons who were working on facial reconstruction. It was a huge need and many required this kind of surgery.

Gillies was an ENT [ear, nose, and throat] specialist going into the war. And he came across a figure named Charles Valadier, who was this French American dentist. He was a bigger-than-life character and one of my favorite people in the book. He had a Rolls-Royce that he retrofitted with a dental chair, and he drove it to the front under a hail of bullets. Who does that?

And World War I was this crazy time when pilots were going up only several years after the Wright brothers had flown and they were bringing pistols with them. Nobody really knew what they were doing. Charles Valadier ended up working throughout the war for free, and he showed Harold Gillies the desperate need for facial reconstruction at this time.

And Gillies was well placed because he was actually one of those annoying people who was good at everything he did. He was a competent artist. He was a very good golfer. He was very well rounded, which I think is unusual for a surgeon. And facial reconstruction is partly a creative process. You have to be a very visual thinker. And Gillies was doing this without any textbooks. And he was working in a very collaborative manner, which was unusual as well for the time. So, he brings dentists on board, which a lot of surgeons wouldn't have done because they wouldn't have rated dentists highly at the time. He brings on artists who paint masks. He brings all kinds of people on his team and that's why the standards rose and he was able to do such amazing work in the end. So, he really was the right person at the right time.

Robin Lindley: Gillies was very creative and a remarkable visionary, as you recount. And he was dealing with horrific wounds that you describe vividly. These men came in without jaws or noses, with broken eye sockets or completely cratered faces, or all of that at once. He dealt with compromised breathing, eating, vision, speech, taste, smell, and more. And Gillies had to create new types of surgery for every unique disfiguring wound that came to him.

Dr. Lindsey Fitzharris: Yes. The earlier attempts at altering someone's appearance really focused on small areas of the face. Rhinoplasty was one of the most ancient procedures in medical history. But, as you say, Gillies really had to reconstruct almost entire faces. In some cases, the damage was so extensive that he had to be very creative.

Bringing on dental surgeons like William Kelsey Fry, who worked on the hard surfaces as Gillies worked on the soft, helped the reconstructive process. In fact, one of Kelsey Fry’s grandsons tweeted me on Twitter and asked if Kelsey Fry was going to be in the book. I said, actually, he is in the book, because he had a horrible experience on the battlefield when he was rescuing a man with a facial injury. He had the man lean forward onto his shoulder and carried him to a casualty clearing station, where the medics put the man on his back, and the man ended up drowning in his own blood. And so, it was Kelsey Fry who ended up getting the protocols and the advice changed, so that men with facial injuries were supposed to be laid face down on the stretcher. And so, I said [to Kelsey Fry’s grandson], think about how many lives your grandfather saved just by changing that advice alone.

So, Kelsey Fry was an important part of Gillies’s team. He doesn’t [get the same attention as] Gillies, who was a bigger-than-life personality. And a lot of people who know about this period tend to focus on Gillies. But other people definitely contributed to the enormous advances of this time, and they are featured in The Facemaker.

Robin Lindley: As you point out, each facial surgery was unique and demanding and took a very long time. Gillies must have had enormous energy and resilience.

Dr. Lindsey Fitzharris: Yes. And there were setbacks too. Some of his patients died in his care as I document in The Facemaker. There were setbacks and there wasn't at all a linear progression, but certainly he never gave up on them. I think it's fair to say they never gave up on him either. They continued to believe in him and in what he could do. And that built a really strong bond and the result was amazing in the end.

Robin Lindley: You also emphasize that Gillies early on decided to create an interdisciplinary team. I find the trained physician and renowned artist Henry Tonks fascinating.

Dr. Lindsey Fitzharris: I love Tonks. He’s another character who, like a lot of the people in The Facemaker, had a big personality. Joseph Lister in my book The Butchering Art was a Quaker and a quiet, solemn figure. But in The Facemaker, everybody has a bombastic personality.

Tonks was a famous artist before World War I and, as you say, he was actually a trained physician as well. He was known to be extremely critical of his art students, and his students really feared him. He was brought on board by Gillies by happenstance. He was working at the same hospital in an administrative role, and someone told Gillies, you know, Henry Tonks the artist is working here. And so, Gillies brought him on board. Several other artists were eventually brought on at the Queens Hospital at Sidcup, the hospital that Gillies founded for facial reconstruction. And thank goodness for them, because they created amazing pictorial records.

I didn't include the Tonks portraits of these men in the book because I felt they should be reproduced in color as they were meant to be seen, and to do that drives up the cost of the book. But you could find all of his wonderful artwork online if you just Google Tonks and World War I. His portraits are beautiful because they are in color and they allow you to see [Gillies’s patients] in a more vivid way than the photographs allow.

Robin Lindley: It’s powerful art. I've seen some of Tonks’ color drawings that portray the wounded men, usually showing the wound and then the reconstructive process and the results after healing.  

Dr. Lindsey Fitzharris: Yes. He would be in the operating theater sketching and drawing. And sometimes he did formal portraits of the men. There’s one of Walter Ashworth, who was injured during the Somme offensive, and the expression on his face is just so human. I think the portraits are really lovely in a way that the photographs can’t be, because the photographs are more clinical. They are staged, and their purpose is to document the reconstructive work, whereas Tonks really captures the humanity of these men.

Robin Lindley: Gillies was aware of the psychological trauma as well as the physical damage caused by these wounds. These men were suffering and usually endured a long series of operations. And Gillies had great compassion for them and their plight as they returned to society.

Dr. Lindsey Fitzharris: Absolutely. Surgeons working near the front were doing surgery hastily. They were trying to stop the hemorrhaging. They were trying to save lives. They were not developing relationships with these patients. A lot of the time they didn’t even know these men’s names, whereas Gillies was operating on these soldiers over a long period of time, sometimes spanning more than a decade. He really developed friendships with these men. Some of them even went on to work for him. There’s a guy named Big Bob Seymour who ended up being his personal secretary for the rest of his life. So it’s nice to see that kind of relationship.

I had a disability activist named Ariel Henley who was helping me with the language. We were having a great discussion about the word disfigured, which might not be used today. But the feeling was that these men were disfigured to the society that they lived in, and I didn't want to lessen that experience by using a more modern term. But I think it's valid to talk about whether that term is useful today. Some people say facial difference rather than disfigured. Ariel has Crouzon syndrome and she lives with a facial disfigurement. Those are her words. That's how she describes herself.

She pointed out a lot of things that maybe I wouldn't have noticed. For instance, Gillies banned mirrors on his wards. This was done to protect the patients from getting frustrated throughout the reconstructive process because a lot of times the face could look worse before it looked better. Ariel pointed out also that that could be really isolating and that it instilled in these men this belief that they had faces that weren't worth looking at. I think that kind of perspective was really helpful for me as a writer in bringing The Facemaker to life and making sure that these men were always at the front of that narrative and that their voices and their experiences were always being honored.  

Robin Lindley: And your book brings forth the stories of the patients and their concerns. The blue benches are such a powerful image. The men with facial wounds sat on these blue benches around Gillies’s hospital and the benches were a warning to members of the public that these men had injuries that might be disturbing to see.

Dr. Lindsey Fitzharris: Yes. And mask makers offered nonsurgical solutions to these disfigured soldiers. Someone tweeted me and said that she couldn’t imagine that these men would ever have wanted to get rid of their masks, but the masks broke, and they didn’t age with the patient. So, ultimately a lot of these men did turn to surgery. I told this woman that a lot of these men hated the masks; they weren’t wearing them for themselves. They were wearing them to protect the viewer, and the masks were uncomfortable and hot. There were a lot of reasons why a mask wouldn’t be the best thing to put on your face.

People have to remember that the mask is for the viewer. It's not for the person wearing it. And they did it to blend into society.

Gillies himself hated the masks because they reminded him of the limitations of what he could do as a surgeon. But he also understood that sometimes a patient needed a mask. Perhaps Gillies had taken surgery as far as it could go. He also employed mask makers in between operations, because the surgeries could span several years. So perhaps while you were awaiting your next surgery, you would feel more comfortable wearing a mask when going out into society, so people wouldn’t stare at you.

The masks were wonderful on one level. The artists were able to produce startlingly real masks for these patients. But on another level, they were really sad because, if society could have accepted these men and their injuries, then arguably we wouldn’t have needed the mask makers.

Robin Lindley: Your description of the masks and the artists in your book is fascinating. You note a woman in France who made extremely realistic masks.

Dr. Lindsey Fitzharris: Yes. Anna Coleman Ladd. She was amazing. And photos of the masks go viral because they look very realistic. But if you were sitting next to someone wearing one, it could be unsettling, because a mask doesn’t move like a face. Ultimately, a lot of the soldiers found that the masks scared their children. In a still photo the masks look amazing and realistic, but in conversation with someone wearing one, I think it could be quite unsettling.

Robin Lindley: And you have heartrending accounts of what these wounded men went through once they'd completed the surgical process. Many didn't want to see their relatives or friends again because they thought their wounds were too horrifying. And, you have patients breaking off relationships. And there were also suicides.

Dr. Lindsey Fitzharris: Yes, there were. One of the things I really wanted to show was that a lot of the men, especially those in Gillies’s care, did go on to live very happy and fulfilling lives. They went through the reconstructive process.

But there were other stories. A nurse who worked for Gillies told of a corporal who caught sight of his face and ended up breaking off his engagement. He told the woman that he had met someone else in Paris, because he felt it would be too much of a burden for her to be married to him.

But on the flip side of the coin, you have Private Walter Ashworth, whose fiancée broke up with him. But then her friend got wind of this and began writing to him, and the two fell in love and ended up getting married, which is a really lovely alternative story.

A lot of these men were able to go on and rehabilitate, but certainly there were a lot of prejudices at the time. And probably some of the prejudices that that corporal was facing in 1917 would not be that dissimilar to what someone with a facial difference might feel today.

I'm certainly not a spokesperson for that community, but all you have to do is look to Hollywood to know that this is true. A lot of movies portray villains who are disfigured. You have Darth Vader. You have Voldemort. You have Blofeld. You have the Joker from Batman. So, it's a really lazy trope about evilness that continues in society today. We haven't moved on in some ways. I think that the men who were disfigured in World War I would feel very similar prejudices today.

Robin Lindley: I appreciate you emphasizing that, more often than not, these men who were disfigured in war went on after Gillies’s work to heal and to lead normal lives.

Dr. Lindsey Fitzharris: Some of them went on to serve in World War II—even after those earlier wounds. One was a patient named Lieutenant William Spreckley, who had one of Gillies’s best nose jobs. Gillies tried a new technique on him, and none of his colleagues thought it would work. When Spreckley came out of the operation, his nose was so big they said it was like an anteater’s snout. All Gillies’s colleagues laughed at him and said this didn’t work. But once all of the swelling subsided and he began to heal, the nose actually looked amazing, and it became one of Gillies’s star cases. He said in his case notes something like: Spreckley and his nose went off to serve in World War II. So some of these men went right back into action.

And some of the men who were patched up by Gillies went back to the front in World War I, and they ended up dying later. It's really harrowing. I can't imagine experiencing what these men did and then also signing up to fight in World War II. That to me is very extraordinary.

Robin Lindley: You note in discussing these patients that many of the wounded were left with deformed noses and other damage—damage that in earlier times suggested a history of syphilis or other dread diseases connected with supposed moral weakness.

Dr. Lindsey Fitzharris: Yes, exactly. That’s where this idea comes from, and why Hollywood can lean so heavily into the idea that morality is connected to facial appearance. Even our language: someone’s two-faced, or they tell a bald-faced lie, or you take them at face value. Our language reflects how important the face still is, and this is linked to older beliefs.

And as you say, morality and disease are read in the face, because if someone contracts syphilis and it’s allowed to continue into its final stages, something develops called saddle nose, where the nose caves into the face, much like the Harry Potter villain Voldemort’s, and it looks very similar to that kind of disfigurement.

People aren't aware today of where these ideas come from, but they're still alive and reflected in our culture, and certainly in the movie industry.

Robin Lindley: I appreciate your comments on our attitudes toward disability and difference. As I wrote to you recently, I think your book will stand as a great antiwar book as you literally reveal the human face of war.

Dr. Lindsey Fitzharris: I found it really interesting that you called this an anti-war book, which I love by the way. I love if people think of it that way, but I did another interview with someone who's in the army and he said that this was a book about heroes, which of course it also can be seen as that.

I lean heavily into the violence of the First World War because I don’t want to sugarcoat it. I don’t think I’m doing the patients any favors by not telling readers exactly what it was like at that time.

It certainly should be seen as anti-war. With the return of old-school warfare, like we’re seeing in Ukraine at the moment, we all need to be thinking about what we do to bodies in conflict. But it is interesting, because everybody has a slightly different take when it comes to a story like this.

For me, I just tell the story as I feel it should be told and let everybody make their own decisions about what that story is. It was nice to hear that you felt it was a great anti-war book.

Robin Lindley:  Yes, it is. I think it may be illuminating for some people to think about the cruelty and brutality of war in these visceral and painful terms. Didn’t the Germans have a different attitude about the facially wounded?

Dr. Lindsey Fitzharris: The Germans really embraced those images, whereas in Britain the disfigured face was hidden for a really long time. People didn’t engage with it, and coverage was very sanitized, but the Germans leaned heavily into it. The images of wounded bodies disappeared in Britain, and the disfigured face certainly didn’t make it into public view.

Robin Lindley: And it was moving for me to learn from your book that there were French veterans called “the mutilated” with severe facial wounds who had a place of honor at the conference table during the signing of the Treaty of Versailles in 1919.

Dr. Lindsey Fitzharris: It was amazing. And there’s a picture in The Facemaker of those men who were at the signing of the treaty. I think it’s incredible that they were included, and they should have been. It should make us question war and conflict.

People say to me, “But all these amazing medical advances came out of the war,” which is true, and all of this has served us long after the guns fell silent on the Western Front. But I also came to the grim realization halfway through my research that these advances prolonged the war. As doctors and nurses got better at patching these men up, the men were sent right back to the front. It was really feeding the war machine. It was a vicious cycle that definitely needs to be acknowledged again, as we see the return of this old-school warfare. We have to realize that even if advances come that benefit us, they tend to prolong these conflicts as well.

Robin Lindley: As you note, Gillies went on after World War I to continue working as a plastic surgeon, and he did both cosmetic surgery and complex reconstructive surgery. He treated one woman who fell face-first into a fire and lay there for hours. You capture the horror of her injuries.

Dr. Lindsey Fitzharris: Awful. I didn’t include her photos, but they can be found in his published book. The woman had epilepsy, and she had a seizure. She fell face first into the fire with her infant child, and they lay there for some time. By the end, she just had no face. I can’t even describe the photos. There was no skin there. It was just completely gone. And there was a moment when Gillies was approached about this patient and wondered if he should even do anything, because obviously reconstructive surgery is painful. He didn’t know if he would be able to help her, but in the end, he was able to reconstruct a face of sorts for her.

She did end up healing, but she had another seizure later and died. Gillies was told about this on the golf course, and he had a moment of reflection about this poor woman he had helped.

Gillies continued to do reconstructive surgery and moved into the cosmetic realm, as you say. And he also operated on soldiers in World War II. He introduced his cousin Archibald McIndoe to plastic surgery with the burnt aviators of the Guinea Pig Club. I’m guessing Gillies felt his nose a bit out of joint, because McIndoe overshadowed him later. Some of that was because it was such a romantic thing to be a pilot in World War II, and McIndoe’s extraordinary work got a lot of media coverage. Gillies’s work in World War I didn’t get that same kind of attention, and he was overshadowed a bit.

People ask if my book is about the Guinea Pig Club. I say it’s the prequel to that. But Gillies is definitely part of that story, because he actually convinced McIndoe to go into plastic surgery. It’s all interconnected.

Robin Lindley: And I learned from your book that Gillies wrote groundbreaking plastic surgery textbooks that represent foundational works for the specialty now.

Dr. Lindsey Fitzharris: Yes. They're extraordinary. I have the two-volume set. He documented everything and his personality really shines through. Even if you don't have any interest in a medical textbook, it's the way he talks about his patients and jokes about some of them and his relationships with them that is quite amusing at times. And I'm guessing there's still value in these texts today. Plastic surgery really isn't that old, so a lot of these techniques are probably still used on some level or they've been adapted. The ghost of Gillies is still lingering around in those operating theaters.

Robin Lindley: And wasn’t Gillies actually called “The Facemaker” during his lifetime?

Dr. Lindsey Fitzharris: Yes. I didn’t have a title for this book for five years. I was finishing up when I came across a letter to Harold Gillies congratulating him on his knighthood after the war. It was addressed to “Dear Facemaker,” and I thought, that’s perfect, because he certainly was the face maker.

Robin Lindley: Do you have another book in the works now, Dr. Fitzharris?

Dr. Lindsey Fitzharris: Yes. I'm not an academic anymore, so I have to keep writing in order to keep feeding myself. My next book actually is a children's book called Scourge, which my husband Adrian Teal is illustrating. He's a caricaturist over here and he works on Spitting Image, a quite a famous television show.

And my next adult nonfiction project is Sleuth-Hound, on Joseph Bell, who was a 19th-century surgeon and the real-life inspiration for Sherlock Holmes. His student Sir Arthur Conan Doyle based Sherlock Holmes on Bell, and I’m making my way through Bell’s 500-page diary as we speak. It’s going to be a really fun romp through Victorian forensics, this fictional character, and the real-life inspirations.

I hope it’s not five years between books this time. I’m going to speed up the process. Going back to the 19th century is like slipping into a bath. It’s comfortable. I know that world because of Joseph Lister. I’ve done a lot of research in the 19th century, and it should be a little faster this time.

Robin Lindley: I'll look forward to that one. And please tell Adrian that I admire his work. The Spitting Image caricatures are amazing. It seems that you've taken a deep dive not only into medical history, but also the history of surgery.

Dr. Lindsey Fitzharris: Yes, and again, I didn’t do anything like that for my PhD. So, my supervisor is completely baffled, but also delighted, that I’m enjoying engaging people with medical history and going where the stories are. And people seem to really love the surgical stories as well. I think Joseph Bell is more of a forensic story, so that will have a slightly different feel to it, but I don’t know until I start writing. I’m just at the research phase right now.

Robin Lindley: I wish you the best on this new project. Who are some of your influences as a nonfiction writer and a historian?

Dr. Lindsey Fitzharris: One of my favorite writers is Erik Larson. I’ve actually become friends with him, which has been a joy, because I’ve been reading his books since I was in high school. He wouldn’t call himself a historian; he has a journalism background. But the way he tells a story is incredible.

I read his book Dead Wake, about the sinking of the Lusitania, when I was going through a rough time in my life. I couldn’t get out of bed and I was caught up in my own problems, but that book got me to forget everything. It was told in such a gripping way. So, I love Erik Larson. And I love Karen Abbott, who wrote a book called Sin in the Second City, about a famous brothel in Chicago at the turn of the 20th century. That’s a ripping book. So, there are a couple of people whom it’s a privilege to call friends now, and they just write incredible narrative nonfiction.

Robin Lindley: I also admire Erik Larson’s work and have talked with him about a couple of his books. He was very thoughtful and generous and has a gift—like you—for bringing the past to life.

Dr. Lindsey Fitzharris: He's loves storytelling, and he goes where the story is. And he's given me a lot of advice in my own career. He blurbed this book and it was good to hear his thoughts. It's so good to follow in his footsteps because he's incredibly talented

Robin Lindley: Thanks Dr. Fitzharris for your thoughtful comments on Dr. Gillies, plastic surgery, the wounded in the Great War, and more. Is there anything that you'd like to add that you want readers to know?

Dr. Lindsey Fitzharris: I hope that people will find that I've done these men's stories justice. I entered into the book hoping that I could bring this history to life for people. As I said, you don't need to know anything about World War I and you don't need to know anything about medical history. Hopefully you can pick up The Facemaker and fall in love with this story, with these men, and come away with a better understanding of this incredible period.

Robin Lindley: Thanks again for sharing your insights Dr. Fitzharris. I know readers will appreciate your generosity and thoughtfulness. And congratulations on your engaging and groundbreaking new book The Facemaker. Best wishes on this book and your upcoming work.

 

Robin Lindley is a Seattle-based attorney, writer, and features editor for the History News Network (historynewsnetwork.org). His work also has appeared in Writer's Chronicle, Bill Moyers.com, Re-Markings, Salon.com, Crosscut, Documentary, ABA Journal, Huffington Post, and more. Most of his legal work has been in public service. He served as a staff attorney with the US House of Representatives Select Committee on Assassinations and investigated the death of Dr. Martin Luther King, Jr. His writing often focuses on the history of human rights, conflict, medicine, art, and culture. Robin's email: robinlindley@gmail.com.

Trump’s Involvement in the January 6 Conspiracy Is Easy to Prove

Donald Trump addresses the "Stop the Steal" Rally, January 6, 2021

 

 

The Italian author Primo Levi, himself a Holocaust survivor, proclaimed that “Every age has its own fascism.” Today, Americans wonder whether we are in our own drift toward an undermining of democratic values, one that comes not with a sudden coup d’état but by a thousand cuts.

 

The House committee investigating the January 6 insurrection will try to make clear that the evidence points strongly to a political coup, calculated to overturn the 2020 election results and destroy the vote, the very foundation of American democracy.

 

The committee has already argued in a court filing that Trump and others were part of a conspiracy to defraud the United States. The panel wrote in a legal brief: “The Select Committee ... has a good-faith basis for concluding that [Trump]…engaged in a criminal conspiracy to defraud the United States.”

 

The Supreme Court has stated categorically that conspiring to defraud the United States includes a plot “to interfere with or obstruct one of its lawful governmental functions.” And the events of January 6 fit the definition neatly.

 

Conspiracy is an agreement between two or more people to commit an illegal act, along with an intent to achieve the agreement's goal. It has been called a “partnership in crime.”

 

Conspiracy is an easy crime to prove. The iconic jurist Learned Hand called it “that darling of the prosecutor’s nursery.”

 

Conspiracy is proved by acts, declarations, and conduct. Conspiracy may be proved by circumstantial evidence: 

 

“Relevant circumstantial evidence [may] include: the joint appearance of defendants at transactions and negotiations in furtherance of the conspiracy; the relationship among codefendants; mutual representation of defendants to third parties; and other evidence suggesting unity of purpose or common design and understanding among conspirators to accomplish the objects of the conspiracy.”

 

The objectives of the January 6 conspiracy could not be clearer or more unmistakable.

 

One becomes a member of a conspiracy by joining in its enterprise and making it his own. A conspirator must have knowledge of the conspiracy and participate in achieving its goals.

Members of the conspiracy are also liable for the foreseeable crimes of their fellows committed in furtherance of the common plot. And statements by one conspirator are admissible evidence against all.

So, if it’s all so easy, why, as many wonder, hasn’t Trump yet been indicted for seditious conspiracy like his follower “Enrique” Tarrio and four of his Proud Boys henchmen? A trial before a District of Columbia jury would seem like a slam dunk for the prosecutors.

The January 6 committee, which, as reported in The Hill, has hired a seasoned TV producer to add a measure of theatricality to its presentation, has a “narrow” opportunity to make the case that January 6 was not some spontaneous riot but a well-organized, well-financed attempt to ride roughshod over the will of the people and retain power in the hands of Donald Trump. Trump was the kingpin. His lieutenants organized the plot. Everything that happened was for his benefit and subject to his control. He could have called off the assault on the Capitol, and his advisers, including his son, asked him to do so. Instead, he waited six hours. The conspirators included not only the pawns who stormed the Capitol that day, but also the knights and rooks (Tarrio and his henchmen, who were not present in Washington) and the bishops and queens (the advisers and enablers who fashioned the plot, defined its objectives, and tried to make it happen).

It will be an uphill fight for the committee. The public may be bored with January 6. The passage of seventeen months may have diminished its significance in a public consciousness recently traumatized by the horrors in Ukraine and gun violence at home. The committee can expect the typical Republican distractions from the hard factual record: claims of a witch hunt, charges of partisanship, and repetition of the “Big Lie” that the election was stolen. Those distractions will not come from the Republicans who sit on the committee, but from others in the GOP who may not even like Trump yet want to make the threat to our democracy disappear as an issue for the midterm congressional elections.

 

It was Trump who refused to concede the election even after he had lost more than 50 lawsuits, brought in both federal and state courts, with many of his legal defeats coming from judges he himself appointed. It was Trump who urged his followers to march on the Capitol with false claims that Biden had stolen the election, and who even said he would go with them until he thought better of it. It was Trump who pressured Mike Pence to take illegal steps to block Biden electors, while threatening Pence with the violence of the “Hang Mike Pence” mob, which assaulted police officers and created mayhem in the seat of government. And it was Trump who beseeched Republican legislatures to set aside the electors the voters had chosen and replace them with pro-Trump slates.

 

A case that touches politics is not automatically a partisan witch hunt. Where there is criminality and corruption in politics, it must be brought to account under the rule of law. We have done so throughout our history. Why should this time be any different?

 

It falls to the committee to bring out the facts of Trump’s corruption in excruciating detail. It must do so in a way that leaves no doubt that he was at the center of the seditious conspiracy that ensnared Tarrio and his Proud Boys, and that makes Trump’s indictment ineluctable.

 

The peaceful transfer of power is something we have always enjoyed in America, and perhaps always taken too much for granted. We have seen public officials voted in and others turned out. Other countries have had political coups and military juntas; it can’t happen here, we tell ourselves. But it might. As Madeleine Albright warned, “[F]ascism can come in a way that it is one step at a time, and in many ways then goes unnoticed until it’s too late.”

The Roundup Top Ten for June 10, 2022

What Alito Got Wrong about the History of Abortion

by Leslie J. Reagan

"The logic that Alito uses in the draft opinion leans heavily on history — history that he gets egregiously wrong."

 

The Supreme Court Isn't Supposed to be this Powerful

by Nikolas Bowie and Daphna Renan

"Judicial supremacy is an institutional arrangement brought to cultural ascendancy by white people who wanted to undo Reconstruction and the rise of organized labor that had followed."

 

 

Previous Congressional Hearings Inform What to Expect from the Jan. 6 Committee

by Jennifer Selin

From KKK violence during Reconstruction to Watergate, high-profile Congressional investigations have tackled controversial issues. Partisanship is likely to be an obstacle to the committee's goals of advancing transparency and informing the public. 

 

 

History Suggests Gun Control Will be an Uphill Fight

by Joanna Paxton Federico

The National Rifle Association has succeeded in blocking popular gun control legislation since the 1930s, when it overcame strong public support for national handgun registration under FDR's "New Deal for Crime." 

 

 

Proud Boys Indictment Charges Attempt to Overthrow Government. Does it Matter?

by Heather Cox Richardson

The charge of seditious conspiracy by a paramilitary organization with close ties to the Trumpian Right is incredibly serious, but will it be met with a shrug?

 

 

Considering the Full Life of Wilma Mankiller

by Alaina E. Roberts

Wilma Mankiller's career as an activist included a stint as the first female head of the Cherokee Nation, but she must also be remembered for the mass disenrollment of the descendants of Cherokee Freedmen from the tribe's rolls and their exclusion from a share of new tribal income. 

 

 

The Second Destruction of Tulsa's Black Community

by Karlos K. Hill

Photographer Donald Thompson has set out to capture a visual history of Tulsa's Greenwood district, an African American community decimated first by the 1921 race massacre and then by urban renewal in the 1970s. Historian Karlos Hill interviews him about his work. 

 

 

A Marker Recognizing Fannie Lou Hamer in Mississippi is a Step Toward Justice

by Keisha N. Blain

As conservatives restrict the teaching of the history of racism in America, the town of Winona, Mississippi has taken a necessary step to memorialize the state-sanctioned jailhouse beating of Fannie Lou Hamer and other activists in 1963. 

 

 

Reading History for "Lessons" Misses the Point

by Daniel Immerwahr

"We read past authors as a sanity check. They reassure us that we’re not alone in what we see."

 

 

Ongoing US Territorial Possessions Perpetuate Colonialism and Racism

by Anders Bo Rasmussen

While much has changed over a century, the basic question of equal treatment for citizens in American territories has essentially remained the same.

 
