David Maraniss Follows Jim Thorpe's "Path Lit by Lightning"

Author photo by Linda Maraniss

 

 

I’ve had a special fondness for the phenomenal athlete Jim Thorpe (1887-1953) since grade school in the late fifties. I read all I could about his incredible story. I almost never kept track of sports and I was the worst athlete ever, always the last chosen for teams, if at all. But Jim Thorpe was an inspiring presence—a Native American from humble roots who came out of obscurity to master every sport he attempted.

After he won gold medals in the pentathlon and decathlon at the 1912 Olympics in Stockholm, the Swedish king called Thorpe “the greatest athlete in the world.” In addition to his prowess in track and field, he was an All-American collegiate football player, a charter member of the Pro Football Hall of Fame, and a major league baseball player, among many distinctions.

But I never knew much about my hero beyond his athletic achievements and yearned to learn more. And when I mention Thorpe to younger people, to my surprise, most have never heard of him. He loomed large for many of us baby boomers.

Thanks to prize-winning author David Maraniss, we now have a magisterial account of Thorpe’s extraordinary life in the meticulously researched, riveting, and first comprehensive biography of this mythic athlete, Path Lit by Lightning: The Life of Jim Thorpe (Simon & Schuster). The book not only recounts Thorpe’s athletic triumphs but also places his life in the context of the history of Native Americans and of sports in the first half of the twentieth century. The result is an engaging and assiduously detailed portrait of a man who negotiated between two worlds: traditional Indigenous society and a majority white America that offered the internationally renowned sports icon opportunity as well as exploitation and racism, with their attendant degradation and stereotyping.

Readers of Path Lit by Lightning (an English rendering of Jim Thorpe’s Native name) will learn of Thorpe’s childhood challenges with his Native family in Oklahoma, his schooling at soul-crushing Indian residential schools, his spectacular gold medal performances at the 1912 Olympics, the painful rescinding of his Olympic records and medals a year later, and his triumphs in intercollegiate and professional sports.

But after the days of athletic glory, as Mr. Maraniss details, Thorpe became an itinerant “athletic migrant worker.” He endured a Sisyphean cycle of hope and disappointment as he repeatedly made false starts in ventures involving sports and other enterprises. To provide for his growing family, he took on many odd jobs with minimal pay, such as playing bit parts in dozens of movies, usually without credit, and even digging ditches during the Depression.

A parade of swindlers and speculators exploited Thorpe’s fame and then abandoned him. His peripatetic life and alcoholism also interfered with work and disrupted his relationship with his family that he seldom saw. And the press tended to romanticize and patronize Thorpe as it dehumanized and mocked him with coverage that was rife with misunderstanding and overtly racist stereotypes.

Despite many disappointments and the disadvantages of life for a Native American in a racist society, Thorpe persisted. As Mr. Maraniss stresses in Path Lit by Lightning, Jim Thorpe was a model of fortitude and resilience and grace despite setbacks. He never gave up.

David Maraniss is a fellow of the Society of American Historians, a visiting distinguished professor at Vanderbilt University, and an associate editor at The Washington Post where he has worked for more than forty years. The Thorpe biography is his thirteenth book, the conclusion of a trilogy that includes his previous biographies of Roberto Clemente and Vince Lombardi. He has also written authoritative biographies of Presidents Bill Clinton and Barack Obama and a trilogy on the 1960s with Rome 1960, Once in a Great City, and They Marched into Sunlight.

As an editor and writer for The Washington Post, in 1993 Mr. Maraniss received the Pulitzer Prize for National Reporting for his coverage of Bill Clinton and, in 2007, he was part of a team that won a Pulitzer for coverage of the Virginia Tech shooting. He was also a Pulitzer finalist three other times, including for his book, They Marched into Sunlight. He has won many other major writing awards, including the George Polk Award, the Robert F. Kennedy Book Prize, the Anthony Lukas Book Prize, and the Frankfurt eBook Award.

Mr. Maraniss generously responded by email to a barrage of questions on his career and his new Thorpe biography.

 

Robin Lindley: Thank you, Mr. Maraniss, for discussing your work and your illuminating new book on the life of phenomenal athlete Jim Thorpe. You’re an award-winning journalist, and you’ve also written widely acclaimed books of history and biography, always based in painstaking research. How did you come to write these comprehensive books on the likes of Presidents Clinton and Obama, on sports figures such as Vince Lombardi and Roberto Clemente, and on historical incidents?

David Maraniss: For each of the books you mention, there had to be an obsession and a strong theme that drove me to devote years of my life to the subject, and in each case, it was somewhat different.

I had covered Clinton during the 1992 campaign and thought I understood him as deeply as anyone. That was my first biography. My obsession was to capture his duality and at the same time use him (and Hillary) as a means for writing about that generation, my generation, the Baby Boomers. I also wrote about Obama for the Post in 2007-8 and again felt I understood him. My obsession in that book was to understand the forces that shaped him and how he eventually reshaped himself.

I have come to consider the Lombardi, Clemente, and Thorpe biographies as a trilogy. In each case, I was looking to use the drama of sports and their unparalleled lives as a means of illuminating sociology and history.

Robin Lindley: You have a gift for breathing life into the history you recount. Who are some of your influences as a writer of history and biography?

David Maraniss: Before I wrote my first book, I had long conversations with Robert Caro and Taylor Branch about how they did what they did, and I would say they were major influences in both methods of organization and writing. Other influences were my father, a newspaperman with a wonderful ability to write intelligently but fluently and accessibly, and Richard Harwood, my first editor at the Trenton Times and the Post, who had that same skill. Also, I tried to model my work to some degree after George Orwell, not his novels but his essays, which I find models of clarity.

Robin Lindley: What inspired you to take on what has been called already “the definitive biography” of the legendary Jim Thorpe?

David Maraniss: The Thorpe story was first mentioned to me by a writer from the Oneida Nation, Norbert Hill, more than twenty years ago, but I had many other projects at hand at that time. He planted a seed that took a long time to grow. By then I had written Lombardi and Clemente and saw Thorpe’s incredible story as a means of exploring the Native American experience.

Robin Lindley: I’m surprised that several younger people I’ve talked with don’t know of Thorpe. He loomed large in my childhood in the fifties and sixties. How would you briefly introduce him to readers?

David Maraniss: Jim Thorpe rose from the Sac and Fox Nation in Oklahoma to become the greatest athlete in the world. It is impossible to compare athletes from different generations but he accomplished a trifecta that has never been matched: an All-American football player, a gold medalist in the decathlon and pentathlon [at the 1912 Stockholm Olympics], and a professional baseball player. The struggles he faced during his life from 1887 to 1953 mark the struggles of his people.

Robin Lindley: Your meticulous research on Thorpe’s life and times is astounding. You get down to details from ten-dollar loans to parking tickets, as well as recounting many, many major (and minor) events in his life. What was your research process for this magisterial biography?

David Maraniss: I try to use the same research methodology for all of my books, what I call the Four Legs of the Table, but in this case there were some limitations. For instance, the first leg is “Go There,” but for this book I was limited by COVID. For Lombardi, I moved to Green Bay. I could not live in Oklahoma or get to Stockholm, where Thorpe won his gold medals, because of COVID.

The second leg is “Interviews,” but in this case, Thorpe was from an era where none of his contemporaries are still alive. In fact, even his seven children were gone by the time I started, so interviews were less important.

I relied more on the final two legs, “Archives” and “Looking for what is Not There,” meaning breaking through the encrustation of mythology to find the real story. Archives were essential in this case, providing letters, oral histories, and all of the documents of the government boarding schools that Thorpe attended. The archives ranged from the Beinecke Rare Book and Manuscript Library at Yale to the Cumberland County Historical Society to the National Archives to the Library of Congress and many more.

Robin Lindley: Thorpe died almost 70 years ago, but I wondered if you had a chance to interview any surviving family members or friends of Thorpe? The challenges of a biography of a past celebrity must be much different than writing about living subjects such as Presidents Clinton and Obama.

David Maraniss: As I said, none of his children were alive. I did not rely on the family but I talked to a few great-grandchildren. It was an entirely different process than writing about living figures such as Clinton and Obama, but my search for understanding the subject is pretty much the same.

Robin Lindley: As you recount, Jim Thorpe was a member of the Sac and Fox nation and was raised in Oklahoma. What are a few things you’d like readers to know about his childhood? Were there events as he grew up in Oklahoma that presaged his astounding athletic abilities?

David Maraniss: Throughout the book, I connect the story of Jim Thorpe to the larger plight of the American Indian. That starts with the year he was born, 1887, the year of the Dawes Act, which took communal tribal lands away from many Native peoples and gave them private parcels instead, a land grab that led to the Oklahoma Land Runs and was essentially another act of forced assimilation.

The Sac and Fox among other tribes lost much of their land. One thing most people don’t know about Jim is that he was a twin. His twin brother Charlie died when they were nine and boarding at the Sac and Fox Indian school when a disease swept through the institution. It was the first of many grievous losses in Jim’s life. There were few indications of his later athletic prowess yet, but it is apparent that he took much of his strength and stamina from his father, Hiram, who was so strong he could walk home 20 miles from a hunting trip with a felled deer on each shoulder.

Robin Lindley: How did Thorpe come to leave his family in Oklahoma as a teen and then wind up at the Carlisle Indian Industrial School in Pennsylvania?

David Maraniss: The Sac and Fox boarding school was the first of three Indian boarding schools Jim was sent to. The second was the Haskell Institute in Lawrence, Kansas. Jim began to love football there, but he did not like the school and ran away. His father then sent him to the Carlisle Indian Industrial School in Pennsylvania. Jim was not quite 16 when he arrived. Unbelievably, considering his later stature, he stood 5-5 and weighed about 115 pounds.

 

Robin Lindley: The goal of the Carlisle residential school was to prepare Native American students for assimilation into the majority white society. Its motto was “Kill the Indian. Save the Man,” as you stress. What was the historical context of this philosophy for addressing American Indian students in the early twentieth century?

 

David Maraniss: The “Kill the Indian, Save the Man” philosophy was developed by the founder of the Carlisle school, Richard Henry Pratt, who thought he was doing good – that it was better to force Indigenous students to assimilate into white society as a means of survival rather than literally kill them, as the wars of the mid-nineteenth century had done. This became the government policy throughout the boarding schools. Take away their culture, religion, language, shear their hair, dress them in military uniforms – this was the way to “save” the poor Indians from annihilation. There was no consideration given to the grave and long-lasting damage done by this approach.

 

Robin Lindley: There was a similar educational philosophy in Canada, and Pope Francis recently apologized for the abusive treatment of Indigenous students at residential schools there. Were US schools comparable to those in Canada?

 

David Maraniss: There were all sorts of Indian boarding schools. Some run by the federal government, some by local governments, and some by religious institutions, including Catholic, Lutheran, and Quaker schools. The pope came to Canada because so many of the First Nation boarding schools there were Catholic. But in Canada and in the US, much like in Australia and New Zealand with their aboriginal populations, the nefarious idea was much the same – involuntary assimilation through the boarding schools.

 

Robin Lindley: You tell the story of Thorpe in the context of the Native American history of dispossession, genocide, and ethnic cleansing (as embodied in residential schools).

Even Thorpe, perhaps the most accomplished and celebrated Indigenous person of his time, could not escape racist stereotyping and degrading mockery. How was he covered by the press after his days of athletic glory? How does his life echo the experience of other Native Americans during his life?

 

David Maraniss: Like his people as a whole, Thorpe was romanticized and diminished at the same time. Sportswriters for the most part supported the movement to restore his medals and thought they were championing his cause, yet they routinely resorted to stereotypes when writing about him. He was described as being on the warpath and taking scalps and was called chief or the big chief. The common phrase when writing about him was “Lo, the poor Indian,” taken from a line in an Alexander Pope poem.

In Hollywood, where he was an extra in more than 70 films, he tried to push back against the stereotypes perpetuated in many westerns and to force the studios to hire real American Indians to play Indians.

Robin Lindley: The feature-length movie Jim Thorpe—All American, starring Burt Lancaster, a white actor, in the title role came out just a couple of years before Thorpe died. How did Thorpe view that movie and Lancaster’s portrayal?

 

David Maraniss: The 1951 movie about his life, Jim Thorpe – All American, was for the most part a sympathetic portrait. It was directed by Michael Curtiz, who also directed Casablanca, and starred Burt Lancaster, a dynamic actor who was also an excellent athlete.

 

Many people I’ve talked to since the book came out said they were first attracted to Thorpe because of that movie. I understand that sentiment, yet the movie is wrong in ways small and large. The opening scene, for instance, shows Jim supposedly at his home in Oklahoma as a boy, yet the San Gabriel Mountains of California are in the background. That is a minor mistake. The big one is that the narrator of the film, and in essence its hero, is not Thorpe but his coach at Carlisle, Pop Warner, and the implication is that if only Jim had listened to Pop more and assimilated more successfully into white society, he would not have had the post-athletic troubles that he had.

 

This is not only wrong but maddening, because in reality, when Jim faced the biggest trauma of his career, when his Olympic gold medals were rescinded [in 1913] after it was reported that he had played bush league baseball for two seasons, Warner lied about his knowledge of that to save his own reputation. He knew exactly what Jim and many of his Indian athletes were doing, and that scores of other college athletes were also playing pro baseball in the summers, though most of them were doing it under aliases while Jim played under his own name.

 

Robin Lindley: Thorpe navigated between Native and white worlds. He never gave up his search for a place in America. His story can be seen as a tragedy, of early glory and ensuing false starts and failures, or as a life of survival, resilience and inspiration. How do you see the arc of his life?

 

David Maraniss: As I was finishing the book’s final chapter, I kept hoping for something better to happen to Jim even while knowing that it would not. His “afterlife” after his athletic days were done was a struggle. He roamed from state to state looking for stable work, his first two wives divorced him, he did not see his seven children as much as he might have, he tried to overcome his addiction to alcohol, he had two heart attacks before a final one killed him at age 65 when he was living in a trailer park in southern California.

But was this story a tragedy? I decided it was not. It was more a story of persistence against the odds, emblematic of the struggles of all Indigenous people.

 

Robin Lindley: Is it fair to call Thorpe the greatest athlete of all time? He excelled in virtually every sport or athletic pursuit he tried.

 

David Maraniss: It's fruitless to compare athletes from different generations because of differences in training, diet, equipment, and other advancements in the sports world. But no one before or since accomplished all that Thorpe did in the trifecta of football, track and field, and baseball. His feats were unparalleled. In the modern world, perhaps only Bo Jackson comes close.

Robin Lindley: Thanks for your thoughtful comments, Mr. Maraniss, and for this comprehensive biography of Jim Thorpe, the greatest athlete of all time. And congratulations on the overwhelmingly positive response to your heartfelt, engaging and extensively researched book, Path Lit by Lightning. Best wishes.

 

Robin Lindley is a Seattle-based attorney, writer and features editor for the History News Network (historynewsnetwork.org). His work also has appeared in Writer’s Chronicle, Bill Moyers.com, Re-Markings, Salon.com, Crosscut, Documentary, ABA Journal, Huffington Post, and more. Most of his legal work has been in public service. He served as a staff attorney with the US House of Representatives Select Committee on Assassinations and investigated the death of Dr. Martin Luther King, Jr. His writing often focuses on the history of human rights, social justice, conflict, medicine, art, and culture. Robin’s email: robinlindley@gmail.com.  

 

 

 

 

A Portrait of Carlos Franqui

Carlos Franqui (center) hosts Jean-Paul Sartre and Simone de Beauvoir in the offices of the newspaper Revolución, 1960

 

 

He was, or he appeared to be, the stereotypical "shrewd peasant": short of height, dark—later gray—hair that covered his head like small tight strings, intense eyes that looked back at one with a hard squint, tan, thick skin. But when he spoke, he became another, maybe a different, person – hard and gentle at the same time, as well as literate, sensitive, eloquent.

It is not possible to know if Carlos Franqui was born shrewd, but he was born a peasant. He attended school for a time but was an autodidact with an autodidact’s determination. That determination brought him to journalism and to Havana, where he joined the fight against the Batista regime and for the Revolution. The Revolution reciprocated, making use of Franqui’s way with words by giving him editorship of its newspaper, Revolución, and direction of Radio Rebelde.  

He thus became the propaganda chief or de facto minister of information of the July 26th Movement, and then, once in power, of the new Cuba. In addition to writing many speeches and pamphlets, he oversaw much of what today one would call the Movement’s intellectual and political network, in particular with the European Left: organizing visits by prominent figures who came to Cuba to pay homage, particularly to Fidel Castro and Che Guevara, who were, for a time, two of the most popular men in the world. Franqui, in other words, was the man who helped turn the Cuban Revolution chic.

He was also one of the first to bolt, or so it seems in retrospect. At the time it must have seemed to take a long time, about a decade. Like many of his comrades, Franqui fell out with the Castros and the other Revolutionaries after they consolidated power and, step by step, kneeled before the Soviets with cap in hand. For his part, Franqui took with him most of what constituted the Revolution’s archive. Then, once in exile, he published it.

Two of his books – Diario de la Revolución and Retrato de familia con Fidel – give as honest a portrait of the Cuban Revolution from one of its ideologues as one can get. Among other things, they show that the Revolution was won not by bearded, heroic figures pouring down from the mountains ahead of an angry, determined peasantry but instead by middle class young men and women rising up, leading marches and demonstrations, and conducting acts of sabotage, in the cities. Many of these young people were killed early in the Revolution. The names of only a few of them, like Frank País, are remembered.

As it happens, I met Franqui about thirty years ago. My colleagues and I, working for a small Washington, D.C. "think tank," were commissioned by some well-meaning private foundations to prepare a contingency plan for the moment that Cuba came in from the cold – which was to say, the moment after Communism’s collapse in Eastern Europe was replicated in the ever-faithful isle 90 miles off the US coast.

That this expectation proved to be a tad optimistic didn’t stop us from preparing for the best. In the fashion of the day we recruited a "civil society" advisory group, which was about as diverse a group of Cuban exiles and fellow-travelers as had ever met in a single room. There were about 80 members of the group. One was Franqui.

I had corresponded with him several times and spoke to him on the telephone in Puerto Rico, where he lived. He became animated when I told him that our group would meet in Washington. I did all I could, however, to convince him not to attend in person: I did not want him to be a distraction, and our tiny project budget couldn’t afford the cost of hosting him. He said that he understood.

On the morning of our group’s meeting, my telephone rang. It was a friend of Franqui’s who lived in Washington. He said, “Why didn’t you tell me Franqui was coming here?” “He isn’t,” I said. “Then why is he standing on my doorstep?” I apologized and told the friend that I was sure I had dissuaded Franqui from making the journey, but the man just laughed. “He’s an old Revolutionary. He does what he wants.” 

Franqui arrived early to the meeting in a large conference room of a Washington law firm. I remember that he sat in a corner and said nothing to anyone. There were some glares and one or two gasps when some of his fellow Cubans – especially his former enemies who lost nearly all they had in the Revolution – realized who was sitting there. But nobody spoke out against him. He only listened, apart from insisting at one point that the Revolution was never anti-US, at least at the outset.

Later I went to hear him speak to a student group at Georgetown University. The mood there was more volatile because the event was open to the public and a number of Cuban exiles turned up. One of them issued a direct challenge that went something like this: how dare you, Franqui, come here to speak of reconciliation, of rebuilding a free Cuba, when you once did so much to enslave it. How can we believe anything you say? 

Franqui leaned forward and gave what I remember to be a monologue as formidable and persuasive as only someone with the gift of thinking and speaking in full paragraphs can deliver. The old wordsmith still had it. 

We spent some time together during the next few days; a good deal of that time was spent walking around the city. When we came in sight of the Corcoran museum, Franqui said he wanted to go in because a sign announced an exhibit of a work by Václav Havel.

The work was a transcribed address about the use of words, enlarged to fit several panels extending from floor to ceiling and translated into several languages. We stood there, alone, because we had arrived just before closing time, and remained there for a while once the doors were shut. Franqui said nothing but walked around the room and stared at each of the panels, which faced one another in a circle. I remember that at one point he took out a pen and notebook and copied some of what he saw:

In the beginning of everything is the word.

It is a miracle to which we owe the fact that we are human.

But at the same time it is a pitfall and a test, a snare and a trial.

More so, perhaps, than it appears to you who have enormous freedom of speech, and might therefore assume that words are not so important.

They are.

They are important everywhere.

That remains my image of Carlos Franqui, the shrewd peasant, Revolutionary, and propagandist. Silently staring at words about words. 

 

50 Years After the Paris Accords: How the US Lost, then Won, in Vietnam

Ho Chi Minh City, 2022

 

 

On January 27, 1973, the United States officially ended its war in Vietnam by signing the Paris Peace Accords and withdrawing from a land where it had been involved in warfare for over two decades.  In the 50 years since, there has been a significant political and scholarly debate over the meaning and outcome of the war in Vietnam.  In 1973, the outcome was clear, as American representatives essentially conceded defeat.  The U.S. left Vietnam with the Communist North stronger than the country it had “invented” in the South; the Democratic Republic of Vietnam (North Vietnam) was barely two years away from outright victory, and when that came in April 1975, the U.S. clearly had lost the war.

But in the half-century since, the way we look at Vietnam has shifted, the goals the U.S. sought in Vietnam have come into focus, and the results look different today than they did at the end of January 1973.  On this 50th anniversary of the peace treaty, we can now say the U.S. both lost and won the Vietnam War.

The Vietnam War was exhausting and bloody. American policymakers after World War II became involved in the political affairs of Indochina, albeit reluctantly at first, to support France’s re-entry as the imperial power in the region and to contain Communist liberation movements, especially the Viet Minh in Vietnam, which might hamper the development of Asian Capitalism with Japan as the main ally and commercial partner of the U.S.

By the early 1950s the U.S. was pouring hundreds of millions of dollars into Vietnam to quash the nationalist-Communist forces fighting the French, but the Viet Minh succeeded in defeating France at the pivotal battle of Dien Bien Phu in 1954.  The U.S., however, refused to allow the movement led by Ho Chi Minh to accept victory and denied the sovereignty and territorial integrity of Vietnam, inventing a country below the 17th Parallel, the Republic of Vietnam [RVN], led by Ngo Dinh Diem.

From that point on, the story is well-known . . . The U.S. increasingly ramped up its commitment to Vietnam, with money, arms, and ultimately troops.  In 1961, John F. Kennedy took office with 800 American military personnel in Vietnam, and by the time he was killed there were 16,000 troops there, along with armor, helicopters, defoliants like Agent Orange, and other heavy firepower.  By 1966 there were over 400,000 troops in Vietnam and the U.S. had “Americanized” the war with free-fire zones, search-and-destroy missions, round-the-clock B-52 attacks on a country the size of New Mexico, a massive campaign of ecological warfare, supporting repression in the RVN, and atrocities.  America’s attacks on Vietnam constituted one of the greatest war crimes in military history. 

By the time the war finally ended, the U.S. had more than 58,000 personnel killed and spent close to $200 billion (about $1 trillion today) during its involvement there.  The Vietnamese suffered considerably more—with perhaps 3 million killed, the country devastated by 4.6 million tons of U.S. bombs and immense firepower (and still today suffering the effects of the environmental war there), its economy destroyed, and millions of refugees created (many of whom now make up a thriving immigrant community in the U.S.).  The neighboring countries of Cambodia and Laos, the “sideshows” in the war, met similar fates.

But since 1973, due to American and international pressure on Vietnam (renamed the Socialist Republic of Vietnam or SRV in 1975) and the lure of capitalist globalization, the SRV now significantly resembles the country the U.S. hoped to create when it began its intervention there right after World War II. 

Partly this was due to continued coercion after the peace treaty ended the U.S. role.  The SRV desperately needed outside funding to rebuild basic infrastructure and create a new economy but the U.S. reneged on a promise of $3.25 billion in reconstruction aid made during the peace talks, and then pressured international lending organizations such as the International Monetary Fund [IMF], World Bank, and United Nations agencies to reject Hanoi’s applications for loans or aid.  

This forced Vietnam to rely on the Soviet Union for economic help, which led to increased tension between Hanoi and the People’s Republic of China, and that led to more regional conflict, which exacerbated the SRV’s economic crisis.  In 1978, the SRV intervened in Cambodia to remove the brutal Khmer Rouge, who were supported by the Chinese and Americans, from power.  They ended the “killing fields” but took on a huge economic burden with the occupation there.

The U.S. and China continued to recognize the Khmer Rouge as the government of Cambodia and the U.S. prodded China to act against the SRV. President Jimmy Carter expressed his desire to punish Vietnam, whom he called “invaders” of Cambodia, by pressuring others to reduce aid to Hanoi, increasing military aid to Thailand to contain the SRV, and warning the Soviet Union that helping Vietnam would damage its relations with the U.S.   Most dramatically, the Chinese, with U.S. backing, invaded Vietnam in February 1979 and, while suffering big losses, caused over 10,000 Vietnamese deaths and imposed a huge financial toll on the SRV.  The burden of fighting against China, right after intervening in and occupying Cambodia, would plague the SRV for the coming decades.

Inside Vietnam, the government made a turn toward a market economy, and reduced and cut programs to help workers and veterans, the forces that had fought for liberation for years and led the victory, and reached out to international groups for aid.  The SRV did get some support from the IMF in the 1980s but had to adjust its economic plans—reducing subsidies, increasing exports, privatizing foreign investment, and moving toward a market economy—to get funding, and these measures hurt Vietnamese workers and peasants.

The costs of maintaining an allied government in Cambodia and agricultural failures at home led to even more desperate conditions and, similar to Gorbachev’s “market socialism” in the Soviet Union, the SRV embarked on doi moi, its own version of that doctrine.  Private entrepreneurs, generally with close contacts to the state bureaucracy, began taking over key economic sectors, and private investment came into Vietnam.  Production rose and new businesses came to the SRV, but little of that wealth ever trickled down to workers and veterans.

Since then, Vietnam has been on an inexorable march toward the marketplace.  The United States lifted its trade embargo on Vietnam in 1994 and normalized relations a year later, and in 2001 negotiated a commercial agreement that virtually removed tariffs and opened trade between the two nations.   In 2007, Vietnam joined the World Trade Organization, which further led to tariff reductions and trade liberalization. In 2013 the U.S. and SRV began a “Comprehensive Partnership” in the economic, environmental, military, and education sectors. 

Bilateral trade has increased 200-fold since the 1990s and last year American investment in Vietnam reached $2.8 billion.   Meanwhile, the SRV began to move away from agriculture to industrialization and had an average growth rate of about 6.3% in the decade after joining the WTO, with annual export growth of about 12-14%.  Vietnam has over 22,000 foreign investment projects, valued at $300 billion, with companies such as Samsung, LG, Toyota, Honda and Canon. 

In April 1975, the Vietnamese Communists seemed to have finally achieved sovereignty and socialism after decades of war, while the United States apparently had seen the limits of its postwar power and suffered a defeat in a war against a small peasant nation in Asia.  Now, 50 years later, Vietnam resembles the country the Americans had hoped to create when they first became interested in Indochina as an economic partner for a restored Japan in a Capitalist Asia. 

After a 20th Century full of nationalist uprisings, Japanese occupation, and wars against France and the United States, with many millions dead and ecological devastation, Vietnam shook off foreign control but eventually became a major manufacturing base and trading partner for U.S. companies.

One of the more famous anecdotes from the Vietnam War came from the journalist Peter Arnett, who reported that a U.S. army major explained his decision to shell Bến Tre, a city in the Mekong Delta, regardless of civilian casualties, in order to displace Viet Cong guerrillas there, by saying “it became necessary to destroy the town to save it.”  That can also be used as an allegory for the whole war—the U.S. destroyed Vietnam and made it abandon the visions of its revolutionary ancestors and accept the realities of the capitalist global market.  If Lyndon Johnson, Richard Nixon or other policymakers from that era could see Vietnam today they very well might conclude that the U.S. won the war after all.

"Cut His Head Off if Necessary"—The Flimsy, Politically-Driven "Peace" Nixon Made in Vietnam

The inscription on President Richard Nixon's grave marker repeats a line from his first inaugural address in 1969. The diplomatic machinations on the day of his second inaugural belie his claim to the title. Photo by author.

 

 

The month of January 1973 stands for many things. Roe v. Wade, the decision establishing a right to abortion, was announced that month. Lyndon Johnson died at just 64 years of age. Harry Truman’s memorial took place in the National Cathedral (Truman died the day after Christmas, 1972).

And the Watergate burglars’ trial resulted in convictions of the two defendants who had not pled guilty, starting a chain of events that would bring down a president.

But the enduring lesson from January 1973 comes from the flimsy peace President Nixon forced on the South Vietnamese under the guise of “peace with honor.”

Nixon’s desperation to be known as peacemaker led to a dramatic showdown on the day of his second inaugural, Saturday, January 20, 1973—fifty years ago this month.

The White House tapes reveal a true sense of panic on the morning of the inaugural. Nearly half of Nixon’s second inaugural address was dedicated to the subject of peace, and it was premised on his belief that Henry Kissinger, his national security adviser, had concluded the peace negotiations in Paris and that the peace accords would be announced soon after the inaugural ceremonies.

But that plan was thrown into chaos when Kissinger called Nixon at 9:32 a.m. (per the 20th Amendment, Nixon was to be sworn in at noon that day).

Nixon had been up talking on the phone with adviser Chuck Colson until well past midnight, happily reading passages from his speech for Colson’s entertainment. Just a few hours later, he arose and ran five hundred steps in place before eating breakfast at 8:30. “It left me breathless,” he wrote in his memoirs, “but I thought it was a good idea to be in as good a shape as I could for the ceremonies to take place that day.”

Aware of the historical significance of the day, he stepped into the Lincoln Bedroom to say a silent prayer in the spot where the Emancipation Proclamation was preserved and where he understood Lincoln’s desk had been located.

Then the phone rang.

Expecting good news from Kissinger, Nixon was stunned by the report from General Alexander Haig, whom Nixon had sent to Saigon to meet with President Nguyen Van Thieu to enlist support for the peace terms negotiated by Kissinger in Paris.

Thieu balked. “Well, [Haig’s] had a session and Thieu has written you another letter,” Kissinger told Nixon.

“Oh God!” Nixon responded. One of the wonders of the White House taping system is how intimately listeners can participate in these historic moments during the Nixon presidency. In this instance, you can hear Nixon’s breathing grow heavier—the fear in his voice is clearly detectable.

Kissinger, as he so often had to do, immediately tried to soothe Nixon’s anxiety. “What the guy has done, he’s obviously posturing himself, step-by-step,” he said. “He’s now reduced his—in his letter he made four conditions and he’s now reduced them to two.” Then some more bad news. “He’s also sending his foreign minister to Paris to meet with me.”

Another exhale by Nixon. “Oh God!”

Kissinger’s initial reaction had been the same, he told Nixon, but after analyzing it with his team for the past two hours, he concluded that “the problem with [Thieu] is that if we initial an agreement Tuesday, without visible participation by them, it is a great loss of face.”

Nixon asked, “the foreign minister is his nephew?” No, Kissinger responded, “the nephew is that little bastard, that kid who is the Minister of Information [referring to Hoang Duc Nha].”

Kissinger had even less regard for the foreign minister Tran Van Lam, telling Nixon, “The foreign minister is an ass, and he won’t be able to do anything.”

Yet the risk that South Vietnam would not go along with a peace agreement was real. One of the two remaining conditions was easy to fix—allowing South Vietnamese to carry carbines instead of just pistols or sidearms. The other went to the heart of the peace terms. Thieu continued to object to the condition that allowed the North Vietnamese to keep their troops in place in the South. This, Thieu recognized, effectively spelled the end of the existence of the South.

Nixon grew petulant, instructing Kissinger to threaten the cut-off of all aid and financial support if Thieu rejected the Paris Peace Accords. “I’ll do any damn thing it is, or cut off his head if necessary,” he said in total exasperation.

The fact is that Nixon and Kissinger had sabotaged America’s ally by giving in to the demand that North Vietnamese troops remain in place in the South as part of the settlement. They had engaged in a violent and horrific bombing campaign in December to bring the North Vietnamese back to the bargaining table, and they were not going to let the legitimate concerns of the South Vietnamese upend the impending settlement.

Two hours later, Nixon took to the inaugural stand on the East Front of the Capitol to take the oath of office for a second time. The first part of his short speech continued to focus on peace and, without betraying the irony, Nixon said, “The peace we seek in the world is not the flimsy peace which is merely an interlude between wars, but a peace which can endure for generations to come.”

Nixon said nothing about cutting off President Thieu’s head to accomplish peace in Vietnam.

Americans had voted for Nixon twice based in large part on his pledge to have “peace with honor.” Nixon’s own grave marker at the Nixon Library in Yorba Linda, California, in fact has only one message engraved from his first inaugural: “The greatest honor history can bestow is the title of peacemaker.”

Perhaps Nixon and Kissinger intended to go back into Vietnam if the North violated the settlement agreement by reinstituting war in the South. But by the time that happened, Watergate had taken Nixon down and President Gerald Ford could only watch helplessly as the South fell in 1975, helicopters desperately removing American diplomats from the roof of the American embassy.

The peace turned out—not surprisingly—to be simply an interlude. It was a “flimsy peace.”

The lesson of Vietnam is that no matter the military might, it is impossible to bomb into submission people who are united, resistant, and fighting for their own independence and sovereignty. Putin and Russia should take note.

As importantly, lying to the public about wars and foreign affairs always compounds the danger to national security. Nixon should have learned through the Pentagon Papers, the secret study showing the United States could not win in Vietnam, that dissembling about war makes things worse. His own sales job to the nation that he had reached peace with honor was a sham. The terms of the Paris Peace Accords, signed fifty years ago, did nothing but hasten the end for South Vietnam.

On Ukraine, International Law is Against Russia—But to What Consequence?

"Little Green Men"—Russian-affiliated troops wearing generic uniforms—in Crimea, March 2014

photo Anton Holoborodko, CC BY-SA 3.0.

 

 

The Ukraine War has provided a challenging time for the nations of the world and, particularly, for international law.

Since antiquity, far-sighted thinkers have worked on developing rules of behavior among nations in connection with war, diplomacy, economic relations, human rights, international crime, global communications, and the environment.  Defined as international law, this “law of nations” is based on treaties or, in some cases, international custom.  Some of the best known of these international legal norms are outlined in the United Nations Charter, the Universal Declaration of Human Rights, and the Geneva Conventions.

The UN Charter is particularly relevant to the Russian invasion of Ukraine.  Article 2, Section 4, perhaps the most important and widely recognized item in the Charter, prohibits the “use of force against the territorial integrity or political independence of any state.”  In Article 51, the Charter declares that "nothing in the present Charter shall impair the inherent right of individual or collective self-defense if an armed attack occurs against a member of the United Nations.”

Ukraine, of course, although partially or totally controlled by Russia or the Soviet Union during portions of its past, has been an independent, sovereign nation since 1991.  That year, the Soviet Union, in the process of disintegration, authorized Ukraine to hold a referendum on whether to become part of the Russian Federation or to become independent.  In a turnout by 84 percent of the Ukrainian public, some 90 percent of participants voted for independence.  Accordingly, Ukraine was recognized as an independent nation.  Three years later, in the Budapest Memorandum, Ukraine’s government officially agreed to turn over its large nuclear arsenal to Russia, while the Russian government officially pledged not only to “respect the independence and sovereignty and the existing borders of Ukraine,” but to “refrain from the threat or use of force” against that country.  In 1997, Ukraine and Russia signed the Treaty on Friendship, Cooperation, and Partnership, in which they pledged to respect one another’s territorial integrity.

Despite these actions, which have the status of international law, the Russian government, in 2014, used its military might to seize and annex Crimea in southern Ukraine and to arm pro-Russian separatist groups in the nation’s eastern region, the Donbas.  Although a Russian veto blocked a UN Security Council rebuke, the UN General Assembly, on March 27, 2014, passed a resolution (“Territorial Integrity of Ukraine”) by a vote of 100 nations to 11, with 58 nations abstaining, condemning the Russian military seizure and annexation of Crimea.  Ignoring this condemnation of its behavior by the world organization, the Russian government incorporated Crimea into the Russian Federation and, in August, dispatched its military forces into the Donbas to bolster the beleaguered separatists.  Over the following years, Russia’s armed forces played the major role in battling the Ukrainian government’s troops defending eastern Ukraine.

Then, on February 24, 2022, the Russian government, in the most massive military operation in Europe since World War II, launched a full-scale invasion of Ukraine.  Although UN  Security Council action was again blocked by a Russian veto, the UN General Assembly took up the issue.  On March 2, by a vote of 141 countries to 5 (with 35 abstentions), it demanded the immediate and complete withdrawal of Russian military forces from Ukrainian territory.  Asked for its opinion on the legality of the Russian invasion, the International Court of Justice, the world’s highest judicial authority, ruled on March 16, by a vote of 13 to 2 (with Russia’s judge casting one of the two negative votes) that Russia should “immediately suspend” its invasion of Ukraine.

In late September 2022, when the Kremlin announced that a ceremony would take place launching a process of Russia’s annexation of the Ukrainian regions of Donetsk, Luhansk, Kherson, and Zaporizhzhia, UN Secretary-General Antonio Guterres warned that “any annexation of a state’s territory by another state resulting from the threat or use of force is a violation of the principles of the UN Charter and international law.”  Denouncing the proposed annexation, Guterres declared:

It cannot be reconciled with the international legal framework.

It stands against everything the international community is meant to stand for.

It flouts the purposes and principles of the United Nations. 

It is a dangerous escalation. 

It has no place in the modern world. 

Nevertheless, the following day, Russian President Vladimir Putin signed an accord to annex the regions, declaring that Russia would never give them up and would defend them by any means available.

In turn, the nations of the world weighed in on the Russian action.  On October 12, 2022, the UN General Assembly, by a vote of 143 countries to 5 (with 35 abstentions), called on all nations to refuse to recognize Russia’s “attempted illegal annexation” of Ukrainian land.

What, then, after surveying this sorry record, are we to think about the value of international law?  It is certainly useful for defining the rules of international behavior―rules that are essential to a civilized world.  Addressing the UN Security Council recently, the UN Secretary General declared that “the rule of law is all that stands between peace and stability” and “a brutal struggle for power and resources.”  Even so, although it is better to have agreed-upon rules rather than none at all, it would be better yet―indeed, much better―to have them enforced. 

And therein lies the fundamental problem:  Despite agreement among nations on the principles of international law, the major entities providing global governance―the United Nations and the International Court of Justice―lack the power to enforce them.  Given this weakness at the global level, nations remain free to launch wars of aggression, including wars of territorial conquest.

Surely the Russian invasion of Ukraine should convince us of the need to strengthen global governance, thereby providing a firmer foundation for the enforcement of international law.      

Latino Activists Changed San Antonio in the 1960s

Henry B. Gonzalez was elected to Congress from San Antonio's West Side in a 1961 special election, the first Mexican American to represent Texas in the House. 

Detail from mural "Mi Tierra" by Robert Ytuarte, photo by author.

 

 

In 1960 San Antonio’s Latino population reached 40 percent, and for the first time politicians campaigning for national office paid attention to the Latino vote. The success of the Viva Kennedy Clubs in San Antonio contributed to another pivotal historical moment–the election of one of the first Latinos in Texas to a federal post. San Antonio also led the way in racial integration, becoming the first Southern city to integrate its schools, libraries, and restaurants. These demographic shifts translated into representation in federal posts, which in turn generated rapid economic development in San Antonio’s downtown center.

 

The era also brought unprecedented cultural innovation, racial conflict, and the beginning of a prolonged war in Asia. For young and old, the great rock and roll music of the sixties stands out. The U.S. also witnessed political and social disruptions with the assassinations of John F. Kennedy in 1963, and Robert Kennedy and Martin Luther King in 1968.

The sixties opened with great promise and hope as Americans elected the young, charismatic, first Catholic President John Kennedy. Kennedy campaigned in San Antonio in September of 1960, shortly after selecting U.S. Senator Lyndon Johnson as his running mate. Senator Johnson knew San Antonio well, and his knowledge of local politics represented a boost for the Democratic ticket as well as for the Latino community.

With the exception of New Mexico, a state that had elected several Mexican Americans to the U.S. Congress and the Senate in the 20th century, no other state had sent a Latino to Washington in 100 years. New Mexico had several congressional districts populated largely with Mexican American residents.

The 1960s saw the rise of the various Latino organizations such as the United Farmworkers and Chicano college student movements. Mural by Leo Tanguma and Gonzo. Houston, TX. Photo by Ricardo Romo.

 

Both Democratic and Republican powers in Texas successfully kept Mexican Americans out of state and national political participation by gerrymandering congressional districts and maintaining the dreaded poll tax which required paying to vote. The poll tax in Texas assured low participation among blue collar workers and proved especially harmful to voter registration drives among low-income Latino families. A special election in 1961 in San Antonio’s 20th Congressional District gave Westside voters the opportunity to elect the first Texas Mexican American ever to the U.S. Congress—Henry B. Gonzalez.

The Mexican American community proved able and ready for this opportunity. In the fall of 1961, San Antonians gathered at Las Palmas Shopping Center in the Westside to hear Vice President Lyndon Johnson’s endorsement of Texas State Senator Henry B. Gonzalez. This campaign helped usher in significant political change for San Antonio. With the help of Vice President Johnson, Henry B. Gonzalez won the 20th U.S. Congressional seat. The newly elected Mexican American congressman went to Washington in 1961 with federal connections and significant local support.

Back home, San Antonians witnessed cultural change expressed in rock and roll music, as well as the advent of feminism, youthful defiance to the “establishment,” and the gradual collapse of segregation. 

How San Antonians handled this new era of greater inclusion says much about the role of progressive city leaders and reasonable policy strategies. San Antonio had been among the first Southern cities to fully integrate the local schools following the Supreme Court Brown vs. Board of Education decision in 1954 prohibiting segregation.

By the fall of 1960, newspaper stories of Black athletes excelling in football, track, and basketball in San Antonio’s formerly all white high schools were commonplace. Still, however, these Black high school students could only sit in the “colored” section of the major movie theaters downtown. In the early sixties I witnessed small demonstrations of Black students from my own high school, Fox Tech, marching in front of the Aztec Theater protesting the segregated seating policy.

The early months of 1960 also saw major development in the city’s civil rights posture. Old traditions were withering, and young Black students impatient with the progress of change took the lead to challenge segregation. Across America, Black college students led the way in developing new strategies of resistance and confrontation in the civil rights movement. The anti-racists' new strategies gained national attention when four North Carolina college students staged a “sit in” at a “whites only” lunch counter in the Greensboro, North Carolina Woolworth store. The students were beaten and arrested.

San Antonio became one of the many southern cities tested that spring by Black student activists. The local quest to end segregation came on March 16, 1960, when Blacks entered the Woolworth department store downtown, where lunch counters had long been reserved for white patrons only. Black patrons were allowed to shop, but not to eat at the lunch counters. In the previous weeks, Mary Andrews, a student at Our Lady of the Lake University on the Westside, had taken on the task of asking six stores to end their segregated practices.

San Antonio Latino political leaders stepped up to ease any potential racial tension. City attorney Carlos Cadena instructed the police not to arrest students engaged in peaceful demonstrations. When the stores opened their lunch counters to Blacks the next day, San Antonio became the first city in the South to peacefully integrate public eating establishments.

San Antonians participated in a political revolution and social transformation and set an example for the rest of America. However, the right to sit at lunch counters at local department stores was only a small victory in a long battle that soon shifted to challenging discrimination in voting rights, housing, and employment. With the passage of the 1964 Civil Rights Act signed by President Lyndon B. Johnson, segregation was declared illegal. Local Black leaders also praised the passage of the 1965 Voting Rights Act which expanded the rights of Black voters.

San Antonio’s Hemisfair 68 ushered in a new era for the city. Downtown had been popular with U.S. servicemen, but few tourists went there; visitors preferred the local historic missions and the Alamo. With the opening of Hemisfair, restaurants and jazz clubs clustered along the Riverwalk, and large new hotel chains opened in the downtown center. The city earned a new nickname, River City, as it expanded the San Antonio Riverwalk.

The San Antonio Riverwalk is now one of the most visited tourist spots in Texas. Photo by Ricardo Romo

 

Hemisfair 68 succeeded because of earlier resolutions ending the political exclusion of Mexican Americans. The end of segregated public facilities that had kept Blacks from enjoying the downtown businesses and cultural events opened the community to all residents. Congressman Gonzalez stepped up, using his political connections to secure federal funds to bring Hemisfair 68 to San Antonio. These early victories remain important because they made San Antonio a model city that avoided racial strife and supported progress in civil rights at a steady pace.

The 1960s represented a transformative era for San Antonio. The city’s progressive response to racial integration helped win additional support for federal funds, including the expansion of five military bases. Several Latinos won local political posts, including Albert Pena, who won a seat on the powerful Bexar County Commissioners Court. The city also celebrated the Brackenridge football team that won a State football championship in 1962 with Black, Latino, and Anglo players–another first for Texas football and perhaps for the entire South.

Led by Latino quarterback Victor Castillo [14] and sensational running back Warren McVea [42], Brackenridge High School was the first high school in the South to win a State Championship with an integrated football team.

 

The influence of progressive Latino leaders has continued to make San Antonio a focal point for Latino culture, inclusiveness, and fair political representation.

Tue, 31 Jan 2023 21:01:30 +0000 https://historynewsnetwork.org/article/184891 https://historynewsnetwork.org/article/184891 0
Lesley M.M. Blume on Hiroshima and Nuclear War

According to the Bulletin of the Atomic Scientists, a nuclear watchdog group with the gloomiest job on earth, we’re closer to nuclear war than at any point since World War II. Ahead of their planned update to the Doomsday Clock, which currently stands at 100 seconds to midnight, I spoke to Lesley M.M. Blume, an award-winning journalist, historian, and New York Times bestselling author.

Lesley is the author of Fallout: The Hiroshima Cover-up and the Reporter Who Revealed It to the World, which documents how American war correspondent John Hersey helped expose the true effects of the nuclear bombs detonated over Hiroshima and Nagasaki.

Paying subscribers to Skipped History can access audio of the full conversation here (~1 hour... Lesley and I covered a lot of alarming ground). Skipped History is a reader-supported publication, so consider signing up today!

Ben: Lesley, thank you so much for being here.

LB: Well, thanks for inviting me.

Ben: To begin, let’s discuss Little Boy. What was it, and why is “Little Boy” arguably the worst euphemism ever?

LB: Yeah, Little Boy was not a little boy, but rather the first nuclear weapon ever used in warfare, dropped over Hiroshima on August 6th, 1945. We’ll never know the full extent of casualties, but probably 100,000 to 250,000 people died after the US detonated the bomb.

Little Boy was followed by another bomb, three days later—more appropriately called Fat Man—which decimated Nagasaki. A slightly smaller casualty count there, but numbers become academic when you’re in the tens of thousands.

Ben: In Fallout you quote Hiroshima’s governor as saying even today, “You dig two feet and there are bones.” Pretty stunning.

LB: Yeah. Researching in Hiroshima was really disturbing because, as the governor told me, you are literally walking on a graveyard.

Ben: On that ghoulish note, let’s dive into the US attempts to cover up the radioactive fallout of the bomb. What happened after August 6th, 1945?

LB: When looking at the initial coverage of the bombing, it was undoubtedly a huge story. For example, the New York Times ran a huge banner headline, and not long after that, the US released pictures of devastated landscapes in both of the bombed-out cities. 

So, it seemed like the government was divulging information, and that newspapers were fully covering it. But after the bombing, a few especially daring reporters entered the bombed-out cities to see what life was like. It was very clear to them that, actually, something very sinister was happening; that the atomic bombs continued to kill long after detonation. 

They managed to get a few initial reports out of Hiroshima, but the US government and military were very quick to lock down Japan and squelch subsequent accounts. They didn’t want the international community or American citizens to know that radioactive activity was still killing people in really agonizing ways.

Ben: Per your book, US Secretary of War Henry Stimson explained that the US wasn’t eager to “get the reputation for outdoing Hitler in atrocities.”

LB: Which makes sense, right? The US had just won an incredibly difficult war against undeniably evil powers. Japanese atrocities were horrific, just like those carried out by Germany and Italy.

So the US had the moral high ground, and the truth of what had just happened to the largely civilian populations in Hiroshima and Nagasaki would put that position at risk.

Also, the US military was about to station tens of thousands of Allied troops throughout Japan, including in the atomic cities. Some of them, especially in Nagasaki, were quite close to ground zero. And so of course the US government would say, Look, no harm done here, this area is safe for anybody.

So after September 1945, the story went quiet. US occupation authorities wouldn’t even allow mention of Hiroshima in Japanese poetry, let alone press reports about people dying in Hiroshima from horrific hemorrhaging and worse. As reporters moved on, the American public moved on, too.

Ben: Your discussion of the US public's (and the world’s) fatigue with war stories is quite thought-provoking.

LB: Thanks for bringing that up. Yes, the bandwidth for the American public to absorb yet another outside atrocity story in the immediate aftermath of Hiroshima and Nagasaki was pretty minimal. People suffered from what in the book I call “atrocity exhaustion.” When the government released pictures of the ruins of Hiroshima and Nagasaki, what differentiated those ruins from Dresden or Cologne or London after the blitz? So the exhaustion was total, and I'm more empathetic than ever with that exhaustion given all we’ve been through over the last few years with the Trump era, climate change, Ukraine, and more.

Ben: Ditto. And as a New York Knicks fan, I’ve felt a far lesser yet still persistent form of atrocity exhaustion for a long time.

LB: We have Mets and Jets fans in our family, so I hear you.

Ben: How did John Hersey manage to penetrate this exhaustion?

LB: So John Hersey was this young, gorgeous, Pulitzer Prize-winning novelist and war correspondent who'd worked for Time magazine from 1939 to 1944.  He also had a reputation as a hero for evacuating wounded Marines while covering battle in the South Pacific.

A free agent in the fall of 1945, Hersey had lunch with William Shawn, the New Yorker’s managing editor. Shawn was this strange, elfish, quiet man with shrewd, unerring news instincts. They called him the “hunch man” because sometimes he’d just send a correspondent to some part of the globe on a hunch that something was going on, and he was never wrong.

Ben: Little do people know that Quasimodo was actually a very famous journalist during his time, too.

LB: Yes, the New Yorker tried to get him on contract.

Ben: But he was mired in his own drama.

LB: A sad chapter. 

So Hersey and William Shawn were at this lunch and they realized that there was something kind of odd about the Hiroshima coverage. It all seemed to have been about what the bomb did to the landscape, but they were like, what happened to the humans?

Hersey had a deep background in Asian coverage and grew up in China, so his idea was to go back to Asia, starting in China, and then try to get into Japan and find out what had happened to civilians in the atomic cities.

Ben: He was up against some pretty nasty American instincts toward the Japanese, right? There’s a quote from Hersey where he says, “If our concept of civilization was to mean anything, we had to acknowledge the humanity of even our misled and murderous enemies.”

LB: An extraordinary thing to have written on Hersey's part given how Hollywood, the military, and much of the media recast the Japanese as this kind of bestial subspecies during the war. 

So Hersey arrived in China in early 1946. While on assignment, he got sick, and amid a sort of feverish haze on a military ship, he read a book called The Bridge of San Luis Rey, which detailed the intersecting lives of five Peruvians killed when a suspension bridge broke. Hersey thought that’d be a good approach to telling the story of Hiroshima: pick a handful of everyday citizens and describe their overlapping experiences of the attack. He wanted American readers to put themselves in the shoes of his protagonists.

After going from China to Tokyo, spoiler alert, Hersey managed to get to Hiroshima.

Ben: Right, that's a dramatic tale in its own right, which you detail compellingly in Fallout. For now, let’s just say it involves surfing a giant sea horse into town.

LB: Yes… anyway, suffice to say, Hersey’s war reputation made him a perfect Trojan horse of sorts to get into the city.

Once there, over the course of a few weeks, he interviewed dozens of survivors who were in various positions vis-a-vis the blast, some of them quite close (it was deeply miraculous that they’d survived). Ultimately he whittled down his list of protagonists to six people: a young Japanese medic, a young female clerk, an older Japanese doctor, a young widowed mother who had three young children, a Protestant minister who also had a young family, and then finally a German priest who’d been living in Hiroshima. All of their stories overlapped in the lead-up to the bombing.

Hersey returned to Tokyo and then to the US to write his story, titled “Hiroshima.” He began at the moment of detonation, describing what each of his subjects was doing. The reader then gets background information on each of the individuals, before Hersey shows what each of them and each of their families went through in the minutes, hours, days, and weeks after the bomb went off. Interspersed throughout are statistics about casualties and facts about radiation that almost nobody knew.

For most people then (as now), the story is unputdownable; it’s the most compulsive reading possible. It’s one of the few stories you can access for free on the New Yorker website.

Ben: You cite one of Hersey's contemporaries, a reporter named Lewis Gait, who said, “You swallowed statistics, gasped in awe, and turning away to discuss the price of lamb chops, forgot. But if you read what Mr. Hersey writes, you won't forget.”

LB: Yeah, by eliciting sympathy and empathy, “Hiroshima” penetrated the public consciousness. Millions of people in the US and around the world read it in real time. Over 500 radio stations in America covered it. ABC, BBC, and newspapers and publications around the world syndicated it. When the story came out as a book, it immediately sold out. I can't overstate the global impact of the book and how voraciously it was consumed.

Ben: How did the US government respond to Hersey blowing their coverup?

LB: To put it mildly, the government reacted with displeasure. 

I mean, first, officials tried to ignore the story entirely. When Harry Truman was asked if he’d read “Hiroshima,” he said he’d never even read the New Yorker. 

But when it became clear they couldn’t ignore the story anymore, a handful of war department old boys got together and published a retort article (though they never called it a retort), ostensibly written by former Secretary of War Henry Stimson but really written by committee. And the retort basically argued in a very calm and unemotional way that the bomb saved a million lives that would’ve been lost through invasion—not just American lives but Japanese ones, too. The authors argued the US couldn’t in good conscience just sit on the technology when they could’ve ended the war in a faster way.

It’s a very paternalistic, reassuring document that conveniently didn’t mention civilian costs, radiation, the ongoing effects of the bombs, the fact that the US had sent troops into the atomic cities without really knowing if there was residual radiation (luckily, there wasn’t), or the fact that the Japanese had put out peace feelers via the Soviets before the bombs were dropped.

Fascinatingly, the Soviets were also extremely pissed about “Hiroshima.” Their US allies left them out of the decision to use the bomb, and now they were at a huge disadvantage. The US had this mega weapon and overnight became the world’s sole superpower.

The Soviets immediately accelerated their own efforts to create an atomic bomb. In the meantime, they had no interest in their public being panicked by their disadvantage, so the Soviets barely reported on Hersey’s story. And then a Soviet publication sent a correspondent to Nagasaki to write a rebuttal article about how the bombs weren’t that bad after all, and any suggestion otherwise was American propaganda.

So, in effect, both the US and the USSR engaged in their own respective coverups of the true aftermath in Hiroshima. To me, as a researcher, that was pretty bananas.

Ben: As a reader, it was too. And regarding the suppression of memory, you write, “The greatest tragedy of the 21st century may be that we've learned so little from the greatest tragedies of the 20th century.” Could you elaborate on that conclusion?

LB: Sure. So at least in the US, try as it might, the government couldn’t counteract the influence of Hersey’s article. We know what nuclear warfare looks like largely because of John Hersey. His article alone became a pillar of deterrence over the following years because the US government knew it couldn’t use nuclear weapons in other conflicts without generating the kind of outcry that followed the publication of “Hiroshima.”

Still, today, I think the threat of nuclear annihilation is probably sharper and fresher for most people than it has been for years. We’re learning yet again that the world has never been able to figure out how to walk away from the nuclear framework. 

One question that’s of profound importance to me, and which I believe was similarly profound for Hersey, was what are we capable of when we’ve dehumanized another race or country on a big scale? The answers to that question, which the world witnessed during World War II, are fading from memory. We’re seeing a terrifying rise in anti-Semitism right now; ethnic concentration camps in China; and of course war with the unlikely but still possible use of nuclear weapons in Ukraine.

So it’s a scary time, riddled with disinformation. The Bulletin of the Atomic Scientists is very clear in its warnings that the nuclear threat is more real now than at any time since World War II.

With that being said, I don't think that most people, especially in America, realize how much agency they have in nuclear matters. Whom we elect to control our nuclear destiny really matters. And at some point, we also have to take on the issue of presidents having sole authority to launch nuclear attacks.

I’d also remind everyone to support their local journalism communities, whom we need to tell the kinds of policy-affecting, eyewitness stories that Hersey was able to bring to the world in 1946.

Ben: An excellent reminder, and a good concluding note. The unsettling content aside, Lesley, this has been a blast.

LB: No pun intended?

Ben: Oh, wow. Not at all. I’m going to stop this interview before I’m tempted to launch more. Thanks again for being here.

LB: My pleasure.

Tue, 31 Jan 2023 21:01:30 +0000 https://historynewsnetwork.org/blog/154673 https://historynewsnetwork.org/blog/154673 0
The "Critical Race Theory" Controversy Continues

Tue, 31 Jan 2023 21:01:30 +0000 https://historynewsnetwork.org/article/177258 https://historynewsnetwork.org/article/177258 0
The Roundup Top Ten for January 27, 2023

What My Mother's Activism Before Roe Shows Us about the Upcoming Fights after Dobbs

by Felicia Kornbluh

"The first thing we’ve missed about Roe is that it was merely the final scene in a drama whose origins lay far from the U.S. Supreme Court... a movement that resembled the movement for abortion rights today, centered on policy change in individual states and localities."

 

The 14th Amendment Should Put a Stop to Debt Ceiling Hostage Taking

by Eric Foner

The provisions of the Reconstruction Amendments dealing with the national debt were tied to the nation's short-lived commitment to interracial democracy in the South; today they offer the Biden administration a possible tool to use if Congress pushes to the brink of default. 

 

 

Margaret Bingham Stillwell, Women Archivists, and the Problem of Archival Inclusivity

by Amanda E. Strauss and Karin Wulf

Two scholars who are the first women leaders of their institutions reflect on the ongoing lessons of a pioneering woman archivist and rare books librarian for understanding how archival practices can be made to include or exclude. 

 

 

Why We Went to War on Iraq

by Melvyn P. Leffler

One foreign policy historian argues that the decision to invade Iraq was made out of genuine concern for thwarting attacks on Americans and preserving the United States' ability to use military power in the Middle East. 

 

 

Why do Republicans Keep Calling it the "Democrat Party"?

by Lawrence B. Glickman

The odd rhetorical device isn't just trolling; it reflects 70 years of the Republican Party seeking to define itself against the opposition even as terms like "liberal" and "conservative" had not yet taken on stable meanings. 

 

 

Some Escaped Slavery Without Escaping the South

by Viola Franziska Müller

The majority of people escaping slavery before Emancipation never crossed the Mason-Dixon line, finding a measure of freedom in southern cities. 

 

 

Bolsonaro's Long Shadow

by Nara Roberta Silva

The recently departed president is only the latest, and probably not the last, avatar of antidemocratic impulses in Brazilian politics, generally reflected by the elite recruiting the anxieties of the middle class to thwart broader social rights for the nation's poor. 

 

 

Atlanta's BeltLine Project a Case Study in Park-Driven "Green Gentrification"

by Dan Immergluck

Although the ambitious combination of multiuse trails and apartment complexes "was designed to connect Atlantans and improve their quality of life, it has driven up housing costs on nearby land and pushed low-income households out to suburbs with fewer services than downtown neighborhoods."

 

 

Family Histories where Black Power Met Police Power

by Dan Berger

Fighting back against mass incarceration today means learning from the stories of Black Power activists who fought against the expansion of police power and surveillance since the 1960s. 

 

 

Miami-Dade has Lurched Right, but Still Loves "Obamacare"

by Catherine Mas

Even though conservative Latinos in Miami are generally suspicious of "socialism", the long history of local government support for medical access means that many carve out a big exception for the Affordable Care Act. 

 

Tue, 31 Jan 2023 21:01:30 +0000 https://historynewsnetwork.org/article/184888 https://historynewsnetwork.org/article/184888 0
The US is a Procedural, Not a Substantive, Democracy

 

 

 

The United States is well on its way to becoming a strictly procedural democracy, wherein legal and constitutional norms are observed, but the core requirements for democratic decision-making—the rule of the majority, the right of all citizens to vote without hindrance—are ignored.

 

In the eight presidential elections since 1988, Republicans have carried the popular vote only once, in 2004, when George W. Bush won 51% against John Kerry. Otherwise, the GOP is distinctly a minority party, incapable of winning the popular vote in a national election. 

 

That fact is essentially meaningless, however. Because of our archaic and increasingly corrupted electoral system, Joe Biden’s seven-million-vote majority in 2020 could easily have been trumped by the suppression of a few hundred thousand votes in just four states (Arizona, Georgia, Pennsylvania, and Michigan).  

 

That we are no longer a stable democracy is no new insight. Editorial boards, pundits, and scholars constantly decry this danger. But too often the precedents cited are from somewhere else: “illiberal democracies” like Hungary and India, or outright fascism, as in pre-war Europe.

 

We do not need to go abroad, seeking monsters. The precedents for a sham democracy are amply available in our own history, and our established constitutional order.

 

For most of our history, the U.S. could not be construed as democratic in any valid sense. It was “a republic, not a democracy,” as conservatives like to claim. They are right. There was no requirement for majoritarian rule. Open disfranchisement and restricted electorates were widely accepted.  A return to those norms is the present danger, not Donald Trump’s return to power, because today’s Republicans began pursuing that goal long before he rode down that escalator in 2015, and will continue to pursue it even if he disappears from the political arena.

 

What are the legal and constitutional precedents Republicans have and will invoke to justify a procedural democracy?

 

First, there is no affirmative right to vote anywhere in the Constitution or its later judicial interpretations.  In 1789, the Constitution delegated authority over elections to the states via Article I, Section 4: “The Times, Places and Manner of holding Elections for Senators and Representatives, shall be prescribed in each State by the Legislature thereof.” 

 

In practice, this meant that individual states did as they saw fit, either excluding or admitting poor men (“paupers”), immigrants whether naturalized or not, and free Black men. And despite the later amendments barring the use of race, gender, age, or a poll tax as conditions for voting, that ultimate control over elections remains with the states today. 

 

Of course, the largest anti-democratic factor prior to the Civil War was the provision allowing states to count the enslaved as three-fifths of a person for purposes of representation. This special privilege vastly expanded the South’s weight in the House of Representatives and the Electoral College: most presidents before 1860 were either slaveholders or defenders of the institution. 

 

Radical Reconstruction after the Civil War is the great exception to this long history of exclusion.  For about a decade, the federal government under President U.S. Grant enforced voting rights throughout the South, and formerly enslaved men voted in vast numbers.

 

In 1876, however, the Electoral College deadlocked with rival slates from three Southern states where Democrats tried to exclude Black voters. In the “Compromise of 1877,” Republicans made a deal with Democrats to keep the presidency in return for withdrawing federal troops from the South. After that, democracy slowly imploded. Between 1890 and 1908, the former slave states all found constitutionally permissible ways to disfranchise their Black citizens. As a consequence, an entire section of the electorate was excluded for several generations. And because poll taxes discouraged voting in general, many poor whites also did not vote. 

 

Consider Alabama in 1932. In that crucial election when FDR took charge at the Depression’s height, a mere 18% of eligible Alabamians cast a ballot. Turnout like that was typical across the South throughout the first half of the twentieth century.  By what logic could anyone call such a system “democratic?”  

 

All of these restrictions received the Supreme Court’s sanction. Only the Voting Rights Act of 1965 ended the South’s system of minority rule, and today’s Court has shredded what is left of that legislation. Since Shelby County in 2013, the justices have repeatedly refused to block efforts to guarantee Republican victories by restricting the electorate: arbitrary purges of voter rolls, “voter identification” laws, the elimination of early voting and mail balloting, limits on voter registration, and much more.

 

In sum, we are rapidly heading back to what the United States has been for most of its history—a strictly procedural, sham democracy, where minorities claim power and then hold onto it by any means necessary.  To turn a favorite phrase of President Biden’s on its head, this is who we are.

 

Tue, 31 Jan 2023 21:01:30 +0000 https://historynewsnetwork.org/article/184833 https://historynewsnetwork.org/article/184833 0
Why CRT Belongs in the Classroom, and How to Do It Right

Professor Derrick Bell flanked by pro-diversity protestors at Harvard Law School. 

 

Right-wing politicians in eight states have enacted laws and mandates banning Critical Race Theory (CRT) from their schools, and since 2021 an astounding total of 42 states have seen bills introduced in their legislatures that would restrict the teaching of CRT and limit how teachers can discuss the history of racism and sexism in public schools. This has been done on the dubious grounds that such teaching amounts to left-wing indoctrination, which they denounce as divisive, anti-American, racist, and damaging to white students’ self-esteem. Such gags on teachers constitute the greatest violation of academic freedom since the McCarthy era. The hysteria against CRT has been so extreme that Republican legislators in states such as North Dakota enacted anti-CRT bans while publicly acknowledging that there was no evidence that their state’s public schools even taught CRT. The bans amount to a new front in the culture wars, designed to strike preemptively against critical historical thinking and sow political division at the expense of meaningful learning experiences.

 

Though we are veteran teacher educators, we never taught CRT to our student teachers prior to this era of anti-CRT hysteria. This was not because we disdained CRT, but rather because secondary school history tends to be atheoretical, focusing primarily on the narration of political – and to a lesser extent social – history.(1) We thought of CRT primarily as a set of ideas taught at the graduate level, especially in law schools, and of little use for high school teachers. Though we observed New York City public school history teachers for years, we never saw one teach CRT. But all the controversy about CRT provoked us to explore its origins and meaning, which led us to realize our error in failing to see CRT’s utility for teaching US history and debating the history of racism and the theory itself. Note that we speak here of having students debate the history of racism and CRT, not indoctrinating students, as right-wing politicians imagine. We are convinced that CRT, with its controversial assertion that racism is a permanent feature of American society, is a powerful tool that enables students to analyze, discuss, and debate the meaning of some central events and institutions in US history, including slavery, Indian Removal, Jim Crow, Chinese Exclusion, Japanese internment, the mass incarceration of Black men, and the Trumpist movement to bar Latinx immigrants. Those seeking to ban CRT either do not understand it or distort its meaning to obfuscate the educational benefits of discussing and debating its provocative perspective. We witnessed this positive impact firsthand as we piloted a high school unit on CRT, its uses, and the debates and criticism surrounding it.

 

Based as we are in New York, we were drawn to study and teach about the writings of the late New York University law professor Derrick Bell, a widely admired teacher and mentor regarded as Critical Race Theory’s intellectual godfather.(2) Un-American? Hardly. Hired as a civil rights attorney by Thurgood Marshall for the NAACP’s Legal Defense Fund, Bell spent years championing equal opportunity in historic desegregation cases. But Bell was troubled by the fact that even when he won such cases, whites evaded school integration to the extent that by the early 21st century many school systems remained de facto segregated and scholars wrote about the resegregation of American public education. Seeking an explanation for this persistent, effective white resistance to racial integration, Bell argued that racism was a permanent feature of American society, and that anti-racist court victories and political reforms would have limited impact since whites would always find ways to avoid integration and limit progress towards racial equality.

 

Was Bell right?  This question has great potential to spark historical debate in our nation’s classrooms because his perspective offers one possible explanation for key events in African American history. Think, for example, of the emancipation of enslaved Blacks at the end of the Civil War, which the white South quickly limited by adopting Black Codes. Congress responded by enacting Radical Reconstruction to empower and enfranchise formerly enslaved people, but this multiracial democracy was overthrown violently by white supremacists and replaced with what became the South’s Jim Crow regime.  The dynamic of racial progress yielding white backlashes--asserted by Bell and documented exhaustively in Carol Anderson’s recent study, White Rage: The Unspoken Truth of Our Racial Divide (2016)-- can be seen in the way the Brown decision sparked a furious massive resistance movement in the South, the Supreme Court’s refusal in Milliken to mandate busing to integrate schools across municipal lines, and the Court’s assault on affirmative action. Think, too, of how Barack Obama’s two terms as  America’s first Black president were followed by Donald Trump’s presidency, which championed white grievance, flirted with white nationalism, and demonized the Black Lives Matter movement and the national wave of protests following the police murder of George Floyd, culminating in banishing CRT from schools.  How do we account for this pattern of racial progress followed quickly by reversals? And what are we to make of the fact that this pattern seems to conform to Bell’s argument about the permanence of racism in America?  In confronting, rather than evading or banning these questions, we enable students to probe some of the central questions in American history.

 

Discussing and debating Bell and CRT works best when we also explore their most perceptive critics’ arguments. Harvard Law School Professor Randall Kennedy, for example, charges that Bell was too pessimistic in his outlook on the history of racial progress and unrealistic in his yardstick for measuring the impact of civil rights law.  According to Kennedy, Bell

 

…was drawn to grand generalities that crumple under skeptical probing. He wrote, for example, that “most of our civil rights statutes and court decisions have been more symbol than enforceable laws, but none of them is … fully honored at the bank.” Yet consider that phrase “fully honored at the bank.” It does suggest a baseline – perfect enforcement. But such a standard is utopian. All law is underenforced; none is “fully’” honored.(3)

 

Kennedy draws upon voting rights to support this critique, finding that deep South Black voter registration skyrocketed thanks to the Voting Rights Act of 1965. Whereas in 1965 Black voter registration in Alabama was meager, with only 19.3% of Blacks registered, by 2004 72.9% were registered. In Mississippi the percentage rose from 6.7% in 1965 to nearly 70% in 2004.(4)  Kennedy viewed such statistics as proof that civil rights law worked over the long run, undermining Bell’s pessimistic claim that “Racism in America is not a curable aberration. [O]ppression on the basis of race returns time after time – in different guises, but it always returns.”(5) 

 

Clearly, then, debates about Bell and CRT are thought provoking and merit inclusion in high school history classes since they challenge students to assess the trajectory of a central theme in American history: the ongoing struggle for racial equity.  We partnered with a New York City high school teacher in designing a unit on debating Derrick Bell and Critical Race Theory.  We describe this unit below, but we would like to preface this summary by assuring you that – contrary to the hysterical fears of right-wing politicians – no students found these lessons anti-American, racist, divisive, or emotionally disturbing.  To the contrary, the students learned a great deal of history from this unit and came to see it as foolish, even outrageous, that teaching about CRT was banned from many school systems.

 

As we began to plan the unit certain things were clear: students needed to learn about Bell’s ideas, life, experiences, and intellectual turning points; the unit had to include resources and information that explained CRT in a way that high school students could understand; we needed to include a range of views on CRT from those who support it, to scholars who critiqued it, to polemics against it from the Right; and it was essential for students to evaluate historical and current events and decide for themselves if Critical Race Theory is, in fact, persuasive.   We were intentional in our planning–this could not be a unit that explicitly or implicitly steered students’ thinking in one way or another.  Our goal was to enable students – with proper support and resources – to discuss and debate CRT and its use as a tool for assessing key patterns in American history, arriving at their own conclusions.  The unit, therefore, gave students the tools to engage in this work.

 

We worked with an AP Government teacher at a large comprehensive Brooklyn high school. He taught this unit over three days to his senior-level class, whose racial composition was 50% white, 29% Black, 14% Asian, and 7% Latinx. The teacher was white. Students had previously learned about racial conflict in the United States, including lessons on slavery, Reconstruction, segregation, violence against Black people, and resistance to each; this unit built on that prior knowledge. The readings and resources, though used here with a senior class, could be used in any high school class.

 

We established two Essential Questions to frame the unit: “To what extent is backlash an inevitable response to Black Americans’ legal and societal progress?” and “To what extent does Critical Race Theory (CRT) provide an accurate framework for the US’s relationship to and problems with race in the past and present?” These questions challenged students to assess historical developments and CRT’s validity as an overarching theory. To help students answer these questions, the lessons explored Bell’s central claim about the permanence of racism in the United States, and the ways racism is institutionalized. We were mindful of planning a unit for high school students and tailored our intended understandings about Bell and CRT to that audience; we focused on Bell’s most important argument about the endurance of racism and chose not to explore his secondary arguments (such as his claim that fleeting moments of Black progress only occur when they align with white self-interest). At the end of this unit, students would understand the most important component of a nuanced and complicated legal theory and, through historical analysis, be able to assess the extent to which it explained the role of race and racism in the United States.

 

Students navigated a variety of resources, including biographical information on Derrick Bell, videos of scholars explaining CRT, excerpts from Randall Kennedy’s critical essay on Bell, primary sources focused on instances of progress and backlash in Black history, and statistics and media reports on school segregation and recent attempts to prohibit discussions of CRT in classrooms. Ultimately, students used all that they learned to evaluate CRT. At the unit’s end, students responded to two prompts: “To what extent does history align with Bell’s ‘one step forward, two steps back’ argument?” and “Indicate the extent to which you agree with the following statement: ‘Critical Race Theory accurately depicts the impact of racism in the United States.’” Additionally, the students responded to a scenario addressing the New York State Assembly’s proposal to ban discussions of Critical Race Theory from schools, drawing upon information from the lessons to support their positions.

 

Most students knew little about CRT before the unit began.  Four recalled hearing of it but were not sure of its precise meaning.  Their previous study of racial conflict in American history – from slavery through and beyond the Jim Crow era– made them more open to learning about this and understanding Bell’s views.  Three surmised, based on prior study, that it was related to systemic racism.  Students participated in discussions and group work, volunteering to share their thoughts with their peers.  From the first day of the unit, where students learned about Derrick Bell and the origins and critiques of Critical Race Theory, takeaways included: “Derrick Bell was one the first people to discuss this theory” and “Racism is more than just how people talk to each other. It’s more systemic.” Students were especially animated on Day Two, when they watched video of North Dakota legislators debate banning CRT in classrooms and worked in groups to apply CRT to pairs of historical events.

 

Overall, students gained an understanding of the debate over Critical Race Theory and the extent to which arguments and theories on the permanence of racism in the US explain Black Americans’ struggles.  Through historical analysis they made connections between events that signified progress towards racial equality, such as the Fourteenth Amendment, Brown v. Board of Education, and Obama’s election, and the backlash that curtailed that progress–Jim Crow laws, massive resistance, and the way Trump’s “birther” slander against America’s first Black president helped make Trump a popular figure on the right, paving the way for his presidential campaign and ascendance to the presidency.  Seventy-five percent of the students identified “one step forward, two steps back” as a trend over time, claiming, for example, “I think throughout most events in history involving race, there had been more setbacks than step forwards for people of color.”  Of course, this pessimism merits critical interrogation since such steps forward as the abolition of slavery and Jim Crow were not followed by a “two steps” return to that degree of racial oppression.  

 

Clearly, the CRT argument about the endurance of racism resonated with many students who had come to political consciousness in a city where there had been vocal opposition to Trump and his rhetoric of white racial backlash. When asked if CRT accurately depicts the impact of racism in the United States, about 75% of the students wholeheartedly agreed that it does, positing, for example, “One of the main points of CRT is that racism is fundamentally and deliberately worked into our government and society, and I think that that is absolutely true in the United States. A variety of factors, including healthcare outcomes, educational attainment, average income, and incarceration rates, all indicate that there is a disparity in opportunities offered to white people versus people of color.”  

 

But on the other hand, twenty-five percent of the students took more moderate stances, asserting, “Regression does happen but that does not mean that substantial progress has not/ can't be made.”  Just under a fifth of the class  aligned with Kennedy and his critique of Bell.  One student, for example, stated, “While racism was indubitably present in society, I don't completely agree with it being embodied in law and government institutions because people have tried making some progress by passing laws that would make people more equal.” 

 

Learning about CRT did not offend students, and none felt pressured to agree with Bell.   Students’ differences of opinion indicate that this unit, which provided plenty of room for debate and discourse, didn’t indoctrinate students.  Though the students’ views on Bell/ CRT differed, evidence suggests that they found these ideas intellectually stimulating and so were unanimous in their belief that they should be taught.  The same student who critiqued CRT said, “People have to be aware of darker aspects of history so they remember those bad times and prevent them from happening; it encourages understanding of each other.”  A classmate who agreed with CRT’s assessment of US history connected what happens in classrooms to society at large, stating, “I would say that for the sake of our democracy, it is always better to err on the side of protecting free speech.  This is especially true when it comes to students and teachers.” 

 

As students became more familiar with the critique of American racism offered by Bell and CRT and with the movement to ban CRT in schools, they grew more vocally critical of that movement, which they saw as “an attack on unbiased education” and proof that “the system has been working against people of color up until even now.”  They reacted passionately when asked how they felt about New York considering such a ban, saying, “It’s not right to pass laws saying we can’t learn about it in school” and “CRT is as much a part of history as everything else we learn about.  We should learn about virulent racism happening at the same time as all these other events.”  Students also questioned, “What is education if we erase history?”

 

None of the students’ comments disparaged the country or sought to evoke white guilt. Rather, learning about CRT and the historical evidence that supports and contradicts it enabled students to better investigate and understand events of the past and develop informed conclusions about the present. We observed a huge chasm between anti-CRT polemics, such as that of North Dakota Representative Terry Jones (R), who compared teaching CRT to “feeding our students… poison,” (6) and our class sessions, where students were not poisoned but intellectually stimulated by engaging in open discussion and drawing their own evidence-based conclusions. Such open-minded inquiry is, after all, a goal of historical and social studies education.

 

Creating this unit and working with a high school teacher to implement it demonstrated the possibilities and benefits of exploring Bell and CRT’s claims about the permanence of racism in America.  Students learned about figures and ideas omitted from their textbooks and most curricula and engaged with multiple and diverse resources.   Did every student agree with Bell? No.  Did that indicate that the unit failed?  Of course not -- and such disagreement attests that the lesson succeeded in fostering debate.  Did students walk away with a better sense of Bell and CRT’s critical take on racism and the way it might be applied to US historical events?  Certainly.  Whether or not students’ analysis of racism aligned with Bell’s, they had the time and space to think deeply about CRT, its roots, and the debate over its place in education in the last year and a half. 

 

If classroom realities matter at all to those governors and state legislators who imposed CRT bans on schools, they ought to be embarrassed at having barred students in their states from the kind of thought provoking teaching we witnessed in this project.

 

 

Notes:

(1) Though CRT has been applied to analyses of educational inequities, it is not a pedagogical practice or topic that most American students encountered in K-12 education prior to this.  As Stephen Sawchuk wrote in Education Week, “much scholarship on CRT is written in academic language or published in journals not easily accessible to K-12 teachers.”  (Stephen Sawchuk, “What Is Critical Race Theory, and Why Is It Under Attack?,” Education Week, May 18, 2021, https://www.edweek.org/leadership/what-is-critical-race-theory-and-why-is-it-under-attack/2021/05.)

(2) “Tributes,” Derrick Bell Official Site, 2014, accessed August 10, 2022, https://professorderrickbell.com.

(3) Randall Kennedy, Say It Loud!: On Race, Law, History, and Culture (New York: Pantheon Books, 2021), 45.

(4) Kennedy, 50-51.

(5) Kennedy, 44.

(6) Maddie Biertempfel, “North Dakota Senate passes bill banning critical race theory, heads to governor’s desk,” KX News, November 12, 2021, https://www.kxnet.com/news/local-news/north-dakota-senate-passes-bill-banning-critical-race-theory-heads-to-governors-desk/.

 

References:

“Black [Americans] Upbeat about Black Progress, Prospects.” Pew Research Center, January 12, 2010.  https://www.pewresearch.org/social-trends/2010/01/12/blacks-upbeat-about-black-progress-prospects/.

Calixte, Christiane. “Take it from a high schooler who’s actually learned about CRT: Adults need to chill out.”  Washington Post, January 14, 2022.  https://www.washingtonpost.com/opinions/2022/01/14/high-school-critical-race-theory-message-to-protesters/.

Cobb, Jelani. “The Man Behind Critical Race Theory.” The New Yorker, September 13, 2021.  https://www.newyorker.com/magazine/2021/09/20/the-man-behind-critical-race-theory.

“Critical race theory: Experts break down what it actually means.” Washington Post, July 13, 2021.  https://www.youtube.com/watch?v=svj_6w0EUz4.

Delgado, Richard and Jean Stefancic, eds. The Derrick Bell Reader. New York: NYU Press, 2005.

Fortin, Jacey.  “Critical Race Theory: A Brief History.” New York Times, November 8, 2021.  https://www.nytimes.com/article/what-is-critical-race-theory.html.

“Most Americans Say Trump’s Election Has Led to Worse Race Relations in the U.S.” Pew Research Center, December 19, 2017.  https://www.pewresearch.org/politics/2017/12/19/most-americans-say-trumps-election-has-led-to-worse-race-relations-in-the-u-s/.

Schwartz, Sarah.  "Who's Really Driving Critical Race Theory Legislation?: An Investigation." Education Week, July 19, 2021. https://www.edweek.org/policy-politics/whos-really-driving-critical-race-theory-legislation-an-investigation/2021/07.

Stout, Cathryn and Wilburn, Thomas. “CRT Map: Efforts to restrict teaching racism and bias have multiplied across the U.S.” Chalkbeat, updated February 1, 2022.  https://www.chalkbeat.org/22525983/map-critical-race-theory-legislation-teaching-racism.

Tue, 31 Jan 2023 21:01:30 +0000 https://historynewsnetwork.org/article/184803 https://historynewsnetwork.org/article/184803 0
What's Hiding in Putin's Family History?

In 1949, Mikhail Putin visits a worker club at the Red Vyborzhets factory in Leningrad with his former coworker (in uniform) Boris Kruglov. Above is a famous painting of Putin signing the first contract [dogovor] for socialist competition in 1929.

 

 

“Who is Mr. Putin?” This question, first posed in 1999, remains unanswered. Married? Children? Even basic information on Russia’s president is a state secret. Kremlin propaganda trumpets Vladimir’s humble origins, but offers meager details. For example, on October 27, 2022, Putin claimed his working class background allowed him to “delicately feel the pulse of common people.” 

I question this legend by asserting that Putin benefited from a familial connection to a prominent member of the Soviet elite, Mikhail Eliseevich Putin (1894–1969). Mikhail helped establish “socialist competition,” a crucial institution for Stalinist modernization whereby workers were encouraged to contend for social recognition instead of wages. Once a “name familiar to all Soviets,” Mikhail Putin is a forgotten figure, and this is no coincidence.

The nineteenth-century thinker Marquis de Custine held that Russia was a nation that strives to forget. George Orwell used the imagery of “memory holes” to describe Stalinist history. For Hannah Arendt, “the only rule of which everybody in a totalitarian state may be sure is that the more visible government agencies are, the less power they carry, and the less is known of the existence of an institution, the more powerful it will ultimately turn out to be.” Thus Russian constitutions and political parties are facades: real power resides in the ruling families. Dynastic bloodlines are studiously concealed from public scrutiny.

Vladimir Putin, who now fights a war to prevent what he calls a “rewriting of history,” pathologically fears any examination of his own family. Since he became president, even the blandest of biographies have had to be cleared by Kremlin handlers. Aleksey Navalny sits in prison over his “rude” investigations into Putin’s love life. In a 2020 interview, the journalist Andrei Kolesnikov, a main source for Putinology, could not say whether Putin had remarried: “I honestly don’t know, and it’s better not to know.” In Orwellian fashion, all the parish registers back to the sixteenth century that mention Putin are off limits to researchers. Clearly, much is being hidden.

One taboo topic is Mikhail Putin. I have pieced together his biography from archival sources, interviews, Soviet newspapers and books. Apart from the reluctance of Russians to discuss this man on record, my investigation was made difficult because of Soviet falsifications in which Mikhail willingly participated. While Stalinist propaganda depicted Mikhail Putin as a vanguard Leninist, he was in reality a son of rural Russia.

  

Mikhail Putin: from Wrestling to Socialist Competition

According to Aleksandr Putin, the sole family chronicler and the President’s cousin, the Putins form a tight-knit clan [rod] who today number around 3,000. All hail from Tver, a rural province that lies between Moscow and St. Petersburg. The Putins were serfs tied to patrimony-estates [votchinas]. A family legend holds that a smallpox outbreak in 1771 wiped out the entire Putin line except a 13-year-old, Alesha. By the end of the nineteenth century, the small Putin clan remained centered in a lightly populated region of Tver. The parish records for the local Pokrovskyaya church record a mere 148 births for the year 1910. The Tver villages inhabited by the Putin clan were small communities where everyone knew one another.

The Putin men, starting with Ivan Petrovich (1845–1918), the President’s paternal great-grandfather, were migrant workers who established a family association (artel) that supplied workers for restaurants in St. Petersburg. After Ivan established connections in the city, he was followed there by Spiridon, the President’s grandfather, and then by Mikhail Eliseevich. Mikhail’s father was a switchman for the Nikolaivskii Railroad at the Bezhetsk Station in Tver, about 140 km from the Putin homeland, Pominovo. With nine siblings, Misha, born on November 8, 1894, started work at age nine helping his father. Along the way, Misha received a few years of elementary education, presumably at the local Aleksando-Mariinskaya Church, which had a total of 14 students. At twelve, Mikhail began traveling to St. Petersburg, lodging with Vladimir’s grandfather, Spiridon, on Gorokhovo Street. Born in Pominovo, Tver, Spiridon had apprenticed at age sixteen under a relative as a cook at the swanky Astoria. Thanks to Spiridon, Misha became a busboy at the nearby cafe of Jean Cubat.

As a teenager, the muscular Mikhail lugged cargo for a longshoreman artel in the rough-and-tumble beer manufacturing docks. During breaks, he would wrestle peasant-style (bor’ba na opoiaskakh). Local sportsmen noticed Putin and invited him to work out at Sanitas, a gym precursor. Frequented by the great wrestlers of the era, Sanitas employed scientific methods developed by physiologist Peter Lesgaft, who wrote: “mental and physical activities should be in complete harmony, for only then is it possible to fully attain self-awareness.” Successful wrestlers gained fame doing tricks in the circus for semi-literate workers. Mikhail Putin, a middleweight, never reached Olympian heights, but fought some of the famous wrestlers of the era. The lads at the docks began calling Putin “Mishka the Wrestler.”

During the Civil War, as trade froze up, the dock artel organized show matches. In Tomsk, the workers wanted to see who could last the longest against the legendary Ivan Piddubny. Putin, out of fear, retreated. Piddubny, smiling, pulled him aside and told him: “Why are you chickening out? [Chto tikaish’?] Scary, yes, but fight!”[1] Putin took his advice to heart and lasted seven minutes in the ring against this Samson.

Mikhail Putin (second from left) with his brigade at Red Vyborzhets in 1929. [Nedeli, nom 10, 1980 p.8]

 

Mikhail Putin’s training was interrupted by war: he served in the Red Army from May 1920 to May 1922. In 1923, Mikhail Putin became a furnace stoker at a war-ravaged Leningrad factory, Red Vyborzhets. This factory, capable of producing a multitude of products at short notice, was crucial not only for industrialization, but also for state propaganda. Notably, its vanguard workers forged the Lenin statue at the Finland Station.

The Bolsheviks soon faced the grim reality of a Marxist revolution in a peasant land of drunkenness and illiteracy. Putin, a “half-proletarian,” became quickly valued by Party bosses. Working the furnace, Putin drank 40 cups of water per shift to endure the heat. Between shifts, hearty Mikhail organized wrestling matches, thus gaining authority (avtoritet) among his illiterate mates. Impressed, the factory Party supervisor made him an agitator.

 

The Bolshevik conundrum was how to transform rowdy peasants into proletarians. What was needed was a way to present factory life in a fun, theatrical light. In Stalinist fashion, the solution would be found by supposedly turning back to Lenin. According to Lenin’s essay “How to Organize Competition,” workers must initiate a ruthless terror. The bourgeoisie were “parasites” who “must be dealt with mercilessly.” Stalin had “How to Organize Competition” published in Pravda in 1929 to justify forced industrialization. In the pamphlet, Lenin suggested the Bolsheviks experiment with different methods to motivate the “half-proletarians.”

 

Although Putin’s effort was one out of many, Red Vyborzhets was canonized to become the template for industrialization. According to legend, later taught to every Soviet schoolchild, Putin read Lenin’s work to his brigade. “So great was the impression of Lenin’s simple words that everyone was lost in thought.” The workers exclaimed: “How can Leninist thoughts be realized?” There was a heated dispute, but Putin remembered the advice of Piddubny: “Don’t chicken out!”

Putin suggested: “Let’s write a contract! We will compete with each other, and challenge our fellows.”

“And win a prize?” a fellow worker, Kruglov, exclaimed simplistically.

“It’s not who wins,” Putin objected, “this is not our principle. But to finish the job faster, better.” Putin found a student notebook and drew up the first contract of socialist competition on March 15, 1929.   

In reality, this worker initiative was stage-managed from above. The Party sent skilled propagandists to Vyborzhets to concoct a story. At first, 186 workers, under strict supervision, were to “compete.” The Party bosses asked skilled machine operators to formalize their obligations in a written contract, but they refused. By April, the Party had browbeaten several brigades into signing contracts. Putin’s brigade was the only one that agreed to wage reductions. They signed not in March, but on May 13.

Stalin soon proclaimed, “competition is a communist method of building socialism based on the maximum activity of millions of workers.” Indeed, this “grass-roots initiative” was a Stalinist masterstroke: Actual proletarians, professionals, who realized that “socialist competition” was preposterous and counter-productive, were marginalized. To boot, many of these seasoned workers were Trotskyites. Young provincials, such as Putin, would be elevated through “competition” while owing their allegiance to Stalin. Thus, Stalin forged a pivotal political base.

Socialist competition fostered a carnival atmosphere that focused on social recognition rather than economics. A key to acclimatizing peasants to factory life, competition spread through the socialist world and is still prominent in North Korea. The name “Putin” entered Ukrainian discourse thanks to “sotsialistychne zmahannia.”

Mikhail Putin was the model for Ivan Shadr’s sculpture "Cobblestone - the Weapon of the Proletariat" (1927)

 

While not the “initiator” of competition, Mikhail Putin was no mere cog in the machine. The athletic Putin embodied the Bolshevik ideal of the “new” worker. A shirtless Putin served as the model for the I.D. Shadr sculpture “Cobblestone: Weapon of the Proletariat,” a 1927 glorification of macho proletarians. Sergei Kirov, who voiced worries about the influx of unruly peasants to the factories, would have found Putin an invaluable enforcer. Mikhail Putin received visits from Kirov who was instrumental in propagating socialist competition.

After “initiating” socialist competition, Putin was soon entrusted with another sensitive mission, agitating for collectivization. In fall 1930, Putin’s brigade left for a village, Nizhnee Chuevo, in Tambov. Putin went to the houses of the poorest peasants to explain the benefits of collectivization. In true Putin style, Mikhail embellished his tale by recounting how he was attacked by three wolfhounds unleashed by the kulaks.

Mikhail Putin was well compensated for his services. In 1931, he was awarded the Soviets’ highest honor, the Order of Lenin. Graduating from the School of Trade Unions in 1933, he managed a Leningrad construction trust. Moving into an elite apartment, dubbed “fairy tale,” next to the Kirov Theater, Mikhail married a beautiful young woman, 16 years his junior. During a pivotal (and still enigmatic) moment in Soviet history, Putin in 1934 chaired Sergei Kirov’s funeral. This, no doubt, endowed him with a powerful aura.

During the war, Mikhail Putin heroically supervised construction projects in Leningrad, often close to the front lines. Even during wartime, Mikhail returned to his factory, Vyborzhets, to celebrate militarized anniversaries of Socialist Competition.

After World War II, Mikhail Putin became a trusted elder. In Pravda he was lionized along with the miner Alexei Stakhanov. While Stakhanov’s debauchery so angered the Party that he was stripped of his Moscow furnishings and quietly returned to the Donbass, Putin, living “a humble life,” continued agitating up to age 75. Unlike other labor heroes, Putin was consulted by scholars. A typical propaganda piece relates: “Time passed and the labor veterans aged, but they never forget their factories. A gray-haired man with the Order of Lenin on his chest often visited Vyborzhets: Putin. The shop was changing before his eyes: no cramped, dark cells anymore. Powerful, high-performance tube mills stand along the wide, bright aisles: Putin’s profession has disappeared.” Putin became known for his impassioned talks.

Mikhail Putin, a Leningrad Construction Boss (with the Order of Lenin)

 

Putin thus became a valuable tool for indoctrination. “The participation of the veterans of the Revolution and labor […] is extremely important in educating working youth on revolutionary and labor traditions.” University students would be bused to Vyborzhets and “introduced to the latest equipment,” and sometimes to Putin himself. On November 25, 1958, on the eve of the 21st Party Congress, 1,500 people gathered at the storied Tauride Palace for a meeting, broadcast by radio, that showcased Putin. In encyclopedias, “Putin” appeared next to Alexander Pushkin. A “1929” installment of the Soviet TV program Our Biography (1978) and the film Spring of Labor (1975) focused on Mikhail.

 

MIKHAIL AND VLADIMIR 

During the famine years of the 1930s, as a way for Mikhail Putin to return a favor, Spiridon (the President’s grandfather) was set up in Moscow at the Gorki Palace to cook for Party bosses, including Stalin.[2] Instead of moving in with Spiridon, the President’s father Vladimir and his wife Maria left Tver for Leningrad, presumably because of Mikhail. During the blockade, Putin’s mother, according to the President, “lived with a relative on the embankment of the Fontanka River.” This, assuredly, would be with Mikhail Putin. In 1942, Mikhail’s apartment at the Skazka House was destroyed by bombing, so he moved to a nearby apartment at 109 Fontanka St. Off and on, Vladimir’s family continued living with “relatives” until Vladimir landed a good job at the Egorov factory and they were given an apartment on Baskov Lane. (In “yet another coincidence, to which we have become accustomed,” Russian state-controlled media reported in 2004 that Mikhail Putin’s grandson, Viktor, was living on Baskov Lane.) We know that relatives of Mikhail’s wife became well acquainted with both the President’s father and grandfather. The future President talked little with his father, who was scarred by the war. But he used to visit a “relative”—perhaps Mikhail?—who recounted family history (by 1995, around one hundred Putins lived in St Petersburg, but from 1930 to 1970, there were only two or three Putin households in the city).

 

While nepotism was officially discouraged, the Soviets did promote “worker dynasties.” Propaganda articles highlighted the “wonderful” Vyborzhets families. Mikhail Putin was regarded as a paterfamilias of the “school of communist labor.” At Vyborzhets, according to Soviet propaganda, family dynasties enjoyed the “authority and deep respect of the collective. The display of such glorious labour traditions of hereditary working families in lecture and propaganda work is important in educating young people and instilling in them a love of work.” Mikhail’s efforts to guide struggling Vladimir would thus have received official blessing. Under the radar, nepotism became entrenched in the Party ranks as seen with Leonid Brezhnev’s own daughter, Galina.

 

Inter-generational sport was also part of Soviet indoctrination. Mikhail “kept in touch with his native factory,” helping to build a good club and stadium. From time to time, Mikhail met his old Sanitas wrestling mate Sergei Dashkevich (1896–1953). Dashkevich took Judo courses under the legendary Vasily Oshchepkov. To set oneself up teaching Judo in Leningrad, it would certainly have been helpful to be connected with someone of Putin’s sway. Mikhail Putin may have been instrumental in helping Dashkevich’s pupil, Anatolii Rakhlin, establish the Judo club currently located across the street from the Vyborzhets factory. In the Soviet police state of the 1960s, the idea of a Judo (or Sambo) club, directed by a Jew and open to the public, would have been unheard of. Tellingly, Rakhlin’s club was not in some basement but was first located in the renowned Yusupov Palace, the site of Rasputin’s murder and a four-minute walk from Mikhail’s house. This unique club was named “Pipe-builder” (Trubostroitel’)—Mikhail’s profession.

Times had changed: in place of peasant-brawlers, the cultural heroes were now scientists and scholars. Vladimir was estranged from his father and adrift at school. It is reasonable to assume that he would have jumped at the chance to follow in the footsteps of the iconic Misha the Wrestler. Certainly, martial arts shaped Putin’s personality. This straightened out the spoiled Vladimir, but training and the 40-minute trolleybus commute left little time for study. Rakhlin, acknowledging Putin’s limited academic potential, recommended that he enter a community technical college (Vtuz); at school Putin had received many Cs (troiki), which would have barred him from entering university.

Instead, Putin inexplicably got into the international division of the law faculty at Leningrad University, a notorious bastion of golden youth. Here students interacted with foreigners, read banned “petty-bourgeois” scholars, and took subjects such as “State Law in Bourgeois Countries.” All this was strictly limited to “verified” youth, and certainly not open to a nobody who was also a brawler and who fraternized with Jews, at a time when the 1967 Arab-Israeli War had caused a wave of anti-Semitism. Clearly, Putin’s entrance required connections (blat). It must have been Mikhail Putin who pulled the strings. Mikhail, old Leningraders whisper, wrote the required recommendation letter for Putin to enter the KGB.[3] Mikhail Putin, who died in 1969, was lionized on the front pages of Pravda: few would question the last wishes of this legendary man. According to Dmitrii Gantserov, a recruiter working in the 3rd department of the 5th Chief Directorate of the KGB, Putin, inexplicably, was considered from his freshman year a prime candidate out of an already elite group of law faculty students. The Leningrad KGB headquarters gave a green light to Putin’s candidacy based on a review of family relations (proverka dal’nikh rodstvennikov). For the final acceptance, during Putin’s last year of study in 1974, Gantserov was ordered to make a thorough review of Putin’s family background by personally interviewing family members (without referring to himself, Vladimir Putin has admitted that connections (blat) were what made a KGB career in those days). The shadow of Mikhail would continue to give Putin a leg up. Key members of the Putin elite such as Valentina Matvienko would have heard Grigorii Romanov, the first secretary in Leningrad, herald Mikhail Putin as a hero whom “our whole country follows today.”

As is the case in the majority of post-socialist societies, the leader’s princeling status is essential for acting as guardian and arbitrator over the ruling dynasties. This is what drove Putin’s rise to power. Amid a reactionary backlash, Putin protected the legacy of his former boss, the Mayor of St. Petersburg, Anatoli Sobchak, as well as the business and political interests of his wife, Liudmila Narusova, and daughter Kseniia. Based on this reputation, Valentin Yumashev, Tat’iana Yeltsin, and other members of Boris Yeltsin’s family urged the president to select Putin as successor in 1999.

As with the President’s appearance and gait, Vladimir’s career closely hews to Mikhail’s: from macho wrestler, to wily political insider, to Party sage. The model of socialist competition—the notion that political theatre can replace trade-union politics and market forces—epitomizes Putin’s authoritarianism. Facing turmoil in his war with Ukraine, Vladimir Putin continues to turn to the people he trusts, his Leningrad Judo partners and their children.        

 

[1] In strikingly similar language, Vladimir Putin often tells school children that his mentor, Anatoly Rakhlin, urged him to “fight to the end.”

 

 

[2] In paranoid Russian society, the leader’s cook is no humble job. Putin’s chef, Yevgeny Prigozhin, is a critical member of the elite.

 

 

[3] In 2022, Putin named as his mentors: two school teachers, Tamara Chizhova and Vera Gurevich, and his trainer Anatoli Rakhlin. None of these would be able to provide a recommendation authoritative enough to get into the KGB.

Do Sanctions on Russia Portend a Return to the Interwar Order of Trade Blocs?

 

 

 

The sanctions levied against Russia in reaction to its war against Ukraine represent, along with the war itself, a striking break with the norms of the global order that have characterized the world since the late 1940s. Along with rising tensions between China and the United States, the sanctions may irredeemably weaken the economic foundations undergirding the global order. Were this to happen, what would the world be like? The collapse of economic globalization after World War I offers a sobering parallel to what might lie ahead.

The world before World War I was in some ways more globalized than it is today: migration, agricultural trade, and investment in public works all flowed more freely than they have since.  Blockades, seizures of enemy property, and capital controls during the war made almost all countries’ economies less globalized. Trade walls and monetary instability left over from the war brought about long-term deglobalization during the Depression. Foreign investments dropped from 18 percent of world GDP in 1914 to only 5 percent in 1950. They only surpassed the level of 1914 in the 1980s.

As the world economy unraveled, leaders and activists proclaimed autarky, economic self-sufficiency, as a goal. Yet vital raw materials and export earnings were too crucial for any industrial economy to practice true autarky. Instead, economic blocs emerged. Britain tried to knit together its colonies, dominions, and trading partners into a “sterling bloc” surrounded by tariffs, the latter enshrined in what were known as the Ottawa agreements. Nazi Germany’s Grossraumwirtschaft and Japan’s Greater East Asian Co-Prosperity Sphere had deep roots in peacetime patterns of trade and investment before they became war plans.

None of these schemes could be justified in purely economic terms. Despite costly efforts to rely on domestic or imperial sources for raw materials, these economic blocs never truly succeeded. Nazi Germany’s largest supplier of raw materials on the eve of the Second World War was the United States, a country with which it contemplated going to war. Forty-five percent of all Japan’s imports came from the United States, against which it went to war in December 1941.

Nonetheless, nations continued to try to build economic blocs. World War I had demonstrated how vulnerable nations were if political forces moved against them. Blockades had helped defeat both Germany and Russia and nearly brought Britain to its knees. Diplomats supporting the League of Nations hoped that sanctions could replace war as a way to curb aggressors. In fact, the near success—but ultimate failure—of League sanctions against Italy for its war against Ethiopia in 1935-36 had the opposite effect. The sanctions helped convince German and Japanese leaders that they needed secure sources of supply and that war to acquire them would not be stopped by the League powers. As exiled German economist Moritz Bonn recognized in 1934, “international economic interdependence is a gamble.” Peace made it a wise choice. The threat of war could make it appear foolish.

Even though academic and business leaders in the 1930s warned that economic blocs often made little sense, a floodtide of agitation and propaganda helped convince people of the wisdom of autarky or self-sufficiency. Radio, newsreels, and air travel made people around the world more aware of events in foreign countries than ever before. Yet more news led to fear, not understanding.  In 1936, U.S. Agriculture Secretary Henry Wallace pointed out why: “From the point of view of transportation and communication, the world is more nearly one than ever before. From the point of view of tariff walls, nationalist striving, and the like, the nations of the world are more separated than ever before.” Putin’s onslaught of lies about Ukraine has its counterparts in the propaganda that flowed from Berlin, Moscow, Rome, and Tokyo in the 1930s. Economic de-globalization and continued informational globalization, however skewed in its content, can happen simultaneously.

The contemporary global order began as an attempt by U.S. leaders during the 1940s to overcome economic blocs. What is striking is how long it took for the architecture of multilateral trade agreements and monetary stability to support globalization. As late as 1961, distinguished scholars of international affairs such as Karl Deutsch and Alexander Eckstein concluded that “In all probability, the world will not see again, in this respect, as in many others, the ‘normality’ of pre-1914.” Only in the 1990s, with the reform era in China under Deng Xiaoping and the collapse of the Soviet bloc, did globalization surpass the levels of trade and investment that had been reached before World War I.

Will economic blocs and declining inter-dependence return? Centered primarily on western Europe, the world economy of the early twentieth century proved vulnerable to contraction. Today, East Asia, the EU, and North America all integrate economies across borders. The real test for the future will be whether China turns to building walls, not bridges. Dependent on exporting raw materials like oil, Russia could have a secure, though subordinate, place in a Chinese-led economic bloc. China, despite its growing nationalism, remains more deeply enmeshed in the global economy than any country in the 1930s. In 2018, seventy percent of the countries in the world traded more with China than with the U.S. Nonetheless, opposition to the global order led by the United States and its allies could tempt China’s leaders to turn away from the global economy and towards an economic bloc. The future may then look more like the past. A turn by China towards building an economic bloc in combination with Russia would be a momentous decision: difficult, drawn-out, dangerous, but not, in the light of history, out of the question. Politics can override purely economic calculations. They have in the past.

The Pope at War: Pius XII and the Vatican's Secret Archives

 

David I. Kertzer: The Pope at War: The Secret History of Pius XII, Mussolini, and Hitler (2022)

 

“The Pope. And how many divisions has he?”

--Joseph Stalin at the 1943 Teheran conference, responding to Winston Churchill’s suggestion that the Pope be involved in post-war planning.

 

Pope Pius XII (Eugenio Pacelli, 1876-1958) was the most powerful religious figure in Europe during World War II. Based in the tiny state of Vatican City, he held sway over Europe’s 200 million Catholics. Known as a quiet, intellectual man, fluent in four languages, he served from 1939 until his death in 1958.

His legacy has been dominated by one haunting question: could he have done more to save the Jews?

After the war, the Vatican’s propaganda office mounted a coordinated effort to portray Pius XII as a hero, a moral leader who spoke out against anti-Semitism and pleaded with the warring countries to protect innocent civilians, including minorities. The Vatican claimed, however, that Pope Pius XII, isolated inside Fascist Italy, heard only unverified “rumors” about the Nazis’ organized genocide. Thus, he was unable to provide any help, other than offering prayers, for the Jews in Germany, Poland, Hungary, and other occupied countries.

In 2009, his defenders even mounted a campaign to have him declared a saint. This effort ran into serious opposition from Holocaust survivors and was put on indefinite hold by Pope Francis in 2014.

In the past two decades, a "Pope Pius XII War" has quietly raged among historians, with accusers and defenders publishing articles and books about the wartime Pope. Works critical of the Vatican include David Kertzer’s The Popes Against the Jews, Peter Godman’s Hitler and the Vatican, and Susan Zuccotti’s Under His Very Windows.

In 2022, another book, The Pope and the Holocaust: Pius XII and the Secret Vatican Archives by Michael Hesemann, a German history professor, came out defending the wartime pope and claiming he saved thousands of lives of Jews and other minorities.

 

Secret Archives

In 2020, after much prodding from historians, the Vatican finally granted access to a vast trove of World War II archives, previously locked away since the end of the war.

This breakthrough resulted in tens of thousands of pages of records, letters, reports and internal memos becoming accessible to scholars. The new evidence was damning. Pius XII had received detailed reports about the death camps and had been asked repeatedly by Jewish leaders, Allied governments, and clergy to intervene. Many visitors pleaded with him to speak out publicly against the Nazis’ mass murders. Later, when Mussolini began stripping Italian Jews of their jobs and property, priests and rabbis begged him to intervene with the dictator.

The answer was always “no.”

In his new book, The Pope at War: The Secret History of Pius XII, Mussolini, and Hitler, David Kertzer, a history professor at Brown University, details the dark truths of Pius XII’s wartime actions. Kertzer, who has written six previous books about the Vatican in the twentieth century, was one of the first researchers to access the secret wartime archives.

In The Pope at War, he describes how Cardinal Eugenio Pacelli, who served as the Vatican’s secretary of state from 1930 until his election as Pope Pius XII in 1939, received detailed reports about the Nazis’ campaign of persecution against the Jews and political dissenters (including anti-fascist Catholic priests) from the very beginning of the Hitler regime.

Later came Kristallnacht in 1938 and then the organization of huge death camps in Poland and Germany. Throughout this dark period, local priests and diplomats sent hundreds of letters, telegrams and detailed reports of the death camps to the Vatican. But the Pope and his close advisors consistently rejected any effort to protest the killings, either publicly or privately. The Vatican’s powerful propaganda machine (two daily newspapers, a radio network and papal messages) ignored the roundups of Jews, and its only references to the war were anodyne statements calling for warring nations to spare “innocent civilians.”

 

Public Neutrality

Why was Pius XII so cautious?

Kertzer suggests that while Pius XII was privately shocked, he felt the Vatican and the Catholic Church in Italy and Germany were very vulnerable and could face violent attacks, should he anger either Mussolini or Hitler.

He also feared for the independence of Vatican City. When Italy was unified in the mid-19th century, the Vatican was stripped of the Papal States, a region in central Italy. Tiny (109 acres) Vatican City, including historic St. Peter’s Basilica, lost its sovereignty and became part of the new, unified Italian state.

After Mussolini became prime minister in 1922, he was eager to cement his power as an absolute dictator in a one-party state. Some 99% of Italy’s 44 million people were Catholic, so the church represented a potential threat. He struck a deal with the Vatican in 1929, known as the Lateran Accords, that recognized Vatican City as a sovereign state, independent of Italy. The Vatican was now free to police its own territory and act as a nation-state, establishing diplomatic relations with other nations.

In return, the Vatican agreed to subsume its powerful political party, the PPI (Partito Popolare Italiano), into Mussolini’s Fascist Party. It also agreed to re-organize Catholic Action, a nationwide youth organization, into a training ground for fascist ideology.

Many aspects of fascism appealed to Pius XII and the Vatican hierarchy. Most important was Mussolini’s suppression of Italy’s small but well-organized Communist Party. Second was the Fascist Party’s rejection of modern Europe’s popular culture, including jazz, avant-garde literature, and racy movies. The Vatican was very concerned that these would lead to public amorality, particularly sexual promiscuity.

 

1,000 Years of Anti-Semitism

While the official ideology of Italy’s Fascist Party was not as virulently antisemitic as Germany’s Nazi Party, many of Mussolini’s lieutenants were outspoken Jew-haters. They were comfortable in the Catholic Church, which had a 1,000-year tradition of antisemitism. According to early church doctrine, the Jews were condemned to “eternal slavery” for their sin of murdering Jesus and then refusing to accept his teaching.

During the first decade of Mussolini’s dictatorship, Italy’s small (50,000) Jewish population did not face the violent antisemitism unleashed in Nazi Germany. In 1938, however, Mussolini, under pressure from Hitler, began an official purge of Jews from society. Jewish doctors, teachers and civil servants were forced out of their jobs.    

In 1940 Mussolini, following Hitler’s example, ordered the construction of some 200 concentration camps across Italy. The first to be confined in them were the thousands of Jewish and political refugees who had fled Germany, Austria and Czechoslovakia. Within three years, most would be sent to their death in Nazi death camps.

After the successful American and British invasion of Sicily in July 1943, the Fascist Party leadership, with the approval of King Victor Emmanuel III, had Mussolini arrested. Six weeks later, the Italian government formally surrendered to the Allies. The Germans quickly sent an army to occupy northern and central Italy. Under the direction of the Nazi SS, Italian Jews were rounded up, some living only blocks from the Vatican. Many were sent directly to death camps. This was the darkest hour of Pope Pius XII’s reign, as he refused to speak out or order any clandestine resistance by local priests.

On June 4, 1944, the Allies liberated Rome. Pope Pius XII quickly established a liaison with American generals and greeted groups of Allied soldiers in the Vatican. But he still refused to publicly condemn the Nazis, even as they held out in Northern Italy and continued to send Italian Jews to death camps.

 

Moral Judgment

Kertzer saves his own moral judgment for the last chapter of his book. He states:

If Pius is to be judged for his action in protecting the institutional interests of the Roman Catholic Church at a time of war…his papacy was a success. However, as a moral leader Pius XII must be judged a failure.  At a time of great uncertainty, Pius XII clung firmly to his determination to do nothing to antagonize either (Hitler or Mussolini). In fulfilling this aim, the pope was remarkably successful.

 

 

"The Dawn of Everything" Stretches its Evidence, But Makes Bold Arguments about Human Social Life

Excavation site at Çatalhöyük, a proto-urban settlement which dates to approximately 7,000 BCE

Photo: Murat Özsoy 1958, CC BY-SA 4.0

 

 

Review of David Graeber and David Wengrow, The Dawn of Everything: A New History of Humanity. New York: Farrar, Straus and Giroux, 2021.

 

 

The title of The Dawn of Everything announces the book’s grand ambition: to challenge the established narrative of civilization as progress in material comforts and power (for some) and decline (of the others) into greater deprivation and unfreedom. The authors contend that the developments and decisions that led to a world-wide system of mostly hierarchical and authoritarian states need not be considered inevitable, nor the result unavoidable. Through a remarkably wide-ranging synthesis of the last thirty years of work on the Neolithic Age and the transition to agriculture and urban life, Graeber and Wengrow seek to open our political imaginations to recognize other ways of caring for the common good, some of which, they contend, have been realized in the past, and survived for many hundreds of years. They succeed in decoupling urban life from farming, and then cite cases of cities that appear to have been organized along horizontal, egalitarian lines. In doing so, they accomplish part of their goal. However, like other writers of provocative works, in pressing their case to the utmost, Graeber and Wengrow at times strain the evidence, and in castigating the writers of speculative history, they often seem to forget that they are writing speculative history also.

 

Graeber and Wengrow follow the method of paying attention to groups that have largely been silent or invisible in history because they did not have writing, or did not construct large stone monuments—those who lived in darkness in the interregnum between empires. Providing a more detailed and accurate portrait of such people, not considering them simply as underdeveloped barbarians or savages, can lead to a fuller history of humanity, a history not solely based on a single line of cultural evolution, through which all societies must proceed in a fixed set of stages.

 

The authors, an anthropologist and an archaeologist, are most successful in contesting the narratives of two developments—the origins of farming and of cities—that have previously been considered to be closely and even necessarily related. In the established and popularly accepted narrative of cultural evolution dominant in the last two and a half centuries, stages of social life succeed each other fairly quickly and decisively. On this view, farming displaced hunting and foraging over perhaps a few generations; moreover, agriculture in its early stages must have included, as appears in the early written record, the use of ploughs and planting, leading to the founding of permanent settlements, the production of surplus food, a greater division of labor, the appearance of craftspeople, priests, and permanent political hierarchies. In addition, hunting and foraging would not have been carried over into the newer social form because they are incompatible with settled life and the requirements of agricultural labor.

 

By contrast, relying on archaeology of the Neolithic era that has been published in the last three decades, Graeber and Wengrow show that the appearances of farming and of cities have in some cases been separated from each other by centuries or millennia, that many complex, hybrid forms existed, and that sometimes a people chose to remain in such a hybrid state, or even to return to hunting and foraging after having engaged in agriculture for generations or centuries. It is probable that plant cultivation first developed in many places—for example, along the shores of rivers, lakes, and springs. Flood-retreat farming near rivers required no ploughing, little investment of effort and time, and could serve as a supplement to other means of subsistence. It was not likely to lead to private property because different pieces of land would be exposed and be productive each year.

 

In the early Neolithic, farming probably developed in the valleys of the Jordan and Euphrates as a “niche activity”—one of several forms of specialization, supplements to economies based primarily on wild resources. In some locations, the first steps toward cultivation consisted of (mostly women) observing which plants bore fruit at which time of year, and returning to harvest them in season, perhaps eventually establishing gardens next to temporary dwellings. There is evidence for seasonal alternation of forms of social organization: a hierarchical, patriarchal structure under a single leader during the hunting season, and a more egalitarian, perhaps matriarchal, organization in the season for foraging and gardening, which were mostly performed by women. Graeber and Wengrow contend that this pattern of seasonal variations of social structure held at Çatalhöyük in modern Turkey, “the world’s oldest town” (212) for more than a thousand years.

 

Even though they might have persisted for centuries, many of these hybrid forms could be considered partial or provisional farming. Some groups, like many northern Californian tribes, which were probably acquainted with agriculture from other tribes, apparently deliberately chose not to pursue the practice, while others, like those in England at the time of Stonehenge around 3300 B.C.E., after a period when they engaged in farming, turned away from it.  Almost all these developments are necessarily conjectural, because such small societies made up of hunters, foragers, and gardeners or small-scale farmers did not produce systems of writing.  

 

The case of Çatalhöyük indicates how the authors’ arguments about the halting growth of farming and the emergence of non-hierarchical cities complement each other. Just as they cite evidence that many societies maintained themselves in hybrid states combining seasonal or small-scale cultivation with hunting, fishing, and foraging, so they contend that early cities produced neither a division of labor, nor classes based on unequal wealth, nor a bureaucracy to organize the distribution of surpluses, nor a centralized political or religious authority. They cite more than a half dozen sites from around the world that challenge the established narrative of a set of institutions originating at nearly the same time in urban civilizations.

 

They assert that the early cities of Southern Mesopotamia such as Uruk provide no evidence of monarchy. The archaeological remains of Taljanky, the largest of the “mega-sites” in Ukraine, dating to 4100-3300 B.C.E. and with an estimated population of over 10,000, provide no signs of central administration, government buildings, or monumental architecture, no temples, palaces, or fortifications. However, the site presents evidence of small-scale gardening and the cultivation of orchards, some enclosed livestock, as well as hunting and foraging. According to the archaeological record, this town survived and prospered for more than five hundred years.

 

Teotihuacan in the Valley of Mexico provides perhaps the most striking example of urban life without kingship, central religious authority, bureaucracy, or wide inequalities. At its height, it is thought to have housed a population of about 100,000. In its early centuries (100-300 C.E.), Teotihuacan followed the pattern of other Mesoamerican cities ruled by warrior aristocracies, erecting monumental pyramids and other sacred structures, requiring the work of thousands of laborers and involving the ritual sacrifice of hundreds of warriors, infants, and captives who were buried in the pyramids’ foundations.

 

Yet the people of Teotihuacan appear to have reversed course around 300 when the Temple of Quetzalcoatl, the Feathered Serpent, was sacked and burned, and work on all pyramids came to a halt. Instead of pursuing the construction of palaces and temples, the city embarked on an ambitious program of building stone housing for the entire population. Each dwelling of about 10,000 square feet with plastered floors and painted walls would have housed 60 to 100 people, ten or twelve families, each with its own set of rooms.

 

The wall paintings of the new order contain scenes of everyday life, but no representations of warfare, captives, overlords, or kings. These colorful paintings appear to celebrate the activities of the entire community, not the greatness of a royal dynasty. Three-building complexes distributed throughout the city might have been used as assembly halls, suggesting that the unit of organization was the neighborhood, with local councils providing for the construction and maintenance of buildings, overseeing the distribution of necessary goods and services, and performing other public functions. This egalitarian, “republican,” de-centralized social organization—which has been called a “utopian experiment in urban life” (332)—survived for about 250 years before the bonds holding the city together seem to have dissolved, and the population dispersed, perhaps because of tensions between neighboring ethnic, linguistic, and occupational groups.

 

Graeber and Wengrow cite other, more ambiguous sites as evidence of egalitarian early cities. For example, Mohenjo-daro, founded in the Indus valley near the middle of the third millennium B.C.E., attained a peak population of perhaps 40,000. Its Lower Town, laid out in a grid of nearly straight lines, possessed an extensive system of terracotta sewage pipes, private and public toilets, and bathing facilities. Merchants and craftsmen in the Lower Town possessed metals and gems, signs of wealth absent in the Upper Citadel. On the other hand, the Citadel contained the Great Bath, a pool forty feet long by six feet deep that appears to have been the center of civic life. Excavations of the city have uncovered no evidence of monumental architecture, monarchs, warriors, or priests with political authority. However, the remains do give clear evidence of hierarchical organization, containing three of the four groups that later would be classified as castes in the Rig Veda (c. 1250 B.C.E.): ascetic priests in the Upper Citadel, merchants and laborers in the Lower Town (the absent fourth caste would have been composed of warriors). This hierarchy may not have distinguished groups on the basis of political authority, but it did classify them on the basis of purity and cleanliness. Although we do not know how public affairs were administered, it seems a stretch to consider Mohenjo-daro an instance of an early egalitarian city.

 

This example points to one of the principal limitations of Graeber and Wengrow’s book. In trying to provide a counterweight to a narrative they believe has paid inordinate attention to centralized, authoritarian regimes, they lean toward interpretations that accept the possibility of large-scale, nonliterate, non-hierarchical societies. Like most polemical writers, however, they tend to exaggerate and to strain the evidence. For example, Graeber and Wengrow want to argue that many despotic regimes have been brought down when oppressed people reclaimed their freedom by just walking away from their oppressors. Their argument is in line with the literal sense of the Sumerian word for “freedom,” ama(r)gi—a “return to mother” (426)—or with the verbal phrase for governmental change in Osage—to “move to another country” (469). But the authors cite only two clear examples of such desertions, both from Mississippian civilizations: Cahokia, centered at present-day East St. Louis, where, from 1150 to the city’s collapse in 1350, much of the commoner population deserted a culture based on aggressive warfare, mass executions for the burials of nobles, and strict surveillance of commoners. A similar exodus occurred several centuries later among the Natchez in the Lower Mississippi. But as they seek to generalize this finding, Graeber and Wengrow erroneously maintain that it was similarly possible to reclaim freedom by simply walking away from large empires such as the Roman, Han, or Incan. It was a notorious and bitter complaint among Romans, for instance, that one could not escape the reach of the Emperor, whose power extended to the ends of the known world.

 

Graeber and Wengrow similarly overreach in order to produce what they call the “indigenous critique” of European civilization. In 1703, an impoverished Baron Lahontan, who had spent ten years as a soldier and traveller in New France, published his Dialogues with a Savage of Good Sense, which recount the author’s conversations with a Native American he calls Adario, who articulates a devastating critique of French civilization. His targets include monarchy, the chasm between rich and poor, the dishonesty and faithlessness of the French, their lack of charity, the absurdities of Christian beliefs, the celibacy of priests, and many other institutions and practices. Lahontan calls Adario “the Rat,” which was also the (non-pejorative) cognomen of the celebrated Wendat (Huron) orator, statesman, and strategist, Kandiaronk, on whom Adario is clearly based. It is true, as Graeber and Wengrow state, that throughout the eighteenth century, European readers assumed that “Adario” was simply a fictional mouthpiece used by Lahontan to avoid persecution or censorship. Europeans, the authors claim, refused to believe that a “savage” Native American could have formulated a thoughtful political and social analysis of European society.

 

By contrast, Graeber and Wengrow at the other extreme assert that Adario’s criticisms and arguments are entirely Kandiaronk’s, as though no European could advance a forceful critique of their own civilization (it is likely that Adario’s critique derives from both Kandiaronk and Lahontan, more from the former than the latter). The authors go much further to infer that the criticism in the Dialogues constitutes not just one brilliant native’s considered insights, but a systematic judgment of Europeans by Native American political thought. Thus, their “indigenous critique” plays the role of a fully formulated political philosophy to contrast with the emerging narrative of progress. In fact, the “indigenous critique” may also have been in part a European “autocritique” of Enlightenment (as Mark Hulliung names his study of Rousseau’s thought). Graeber and Wengrow assert repeatedly that the “indigenous critique” influenced and perhaps catalyzed Enlightenment social and political thought and revolutionary practice—for which they believe Kandiaronk deserves credit. At the same time, they deplore  Enlightenment conjectural histories of early human societies as conservative responses intended to counteract the “indigenous critique.” In this way, they at once implicitly celebrate and explicitly disparage Enlightenment political and historical thought.

 

In fact, The Dawn of Everything stands in a much closer relation to the Enlightenment conjectural histories than its authors acknowledge. They recognize Rousseau’s importance, blaming him throughout for asserting, even while lamenting, the full-blown simultaneous appearance of agriculture and property, as well as the disappearance of innocent but stupid savages. They also refer to Adam Smith and Adam Ferguson, who proposed influential, clearly demarcated three- and four-stage theories of universal social development. But they do not mention alternate conjectural histories by Germans such as J. G. Herder and Georg Forster, who avoided a single, rigid scheme of social evolution, and argued in different ways that each people follows its own path of development at its own speed. Even among the Scots, James Dunbar questioned the category of savagery, contending that a society termed savage might be morally superior to a “civilized” empire. Through their almost exclusive focus on Rousseau, whose thought was unrepresentative in its utter condemnation of agriculture and property, Graeber and Wengrow may do what they accuse the Enlightenment thinkers of doing to indigenous people: they simplify a complex phenomenon in order to produce a derogatory representation of it.

 

The conjectural histories of the late eighteenth century were attempting to make sense of the fragmentary and often unreliable accounts from the previous two centuries of the world about which Europeans had previously known nothing or close to nothing. They speculated by necessity; they were thinking about periods for which there were no written records and few material remains. Yet they were speculating responsibly, attempting to work out an understanding of nonliterate societies that did not conform to Biblical accounts, mythical narratives, or dynastic histories, but was based on the best, if scanty, evidence they had before them. In that sense, they took a scientific approach.

 

Graeber and Wengrow also write responsible speculative history concerning societies about which much remains unknown, again in large part because of an absence of written records. Their speculations are based on many sites, material remains, and methods of analysis that were not available in the eighteenth century. Çatalhöyük was only excavated beginning in the late 1950s, and the Ukrainian mega-sites in the 1970s. It makes sense that Graeber and Wengrow have a different story to tell based on different, more plentiful evidence. In providing a provocative synthesis of the last thirty years of specialized archaeological research, however, their speculations are not more scientific than those of their Enlightenment predecessors. 

 

In fact, despite their straining of the evidence and occasional glib remarks (calling Rousseau, for example, a “not particularly successful eighteenth-century Swiss musician” [494]), they largely succeed in their primary aim of showing that the currently dominant form of bureaucratic, centralized, warlike state is neither inevitable nor inescapable. Most significant, perhaps, is their report that current research shows that cities with 100,000 people could be organized on egalitarian lines so that they were not centrally and hierarchically administered by a monarchy, aristocracy, or priesthood. Instead, neighborhood councils based on widespread participation were able to organize peaceful communal life for hundreds of years at Teotihuacan, the Ukrainian mega-sites, the Hopewell Interaction Sphere in Ohio, and Knossos in Crete, where the prominent role played by women appears to have been of signal importance.

 

In addition, the existence of societies that have been acquainted with or practiced full-scale farming, but turned away from the complete set of agricultural practices, enjoying greater freedom of thought and action for hundreds or even thousands of years—longer than most empires—indicates that human groups can evaluate the undesirable consequences of technological innovations and choose not to adopt all means of cultural or territorial expansion, economic growth, and resource exploitation. Indeed, the survival of our species and of others may depend on our developing sustainable forms of democratic self-government and adopting self-imposed restrictions on unchecked growth. Although such forms of social organization are widely dismissed as utopian, by showing their existence at many places and times in the past, this book demonstrates that they are indeed possible. Recognizing that such possibilities actually took shape in the past may encourage the realization of similar egalitarian societies in the future.  

As the Progressive Era Ideal of Regulation Vanishes, What Will Stop the March of AI?

Scholars have eagerly demonstrated that ChatGPT's facility with PR weaselspeak exceeds its grasp of historical fact and analytical discernment.

Screenshot of tweet by Zane G.T. Cooper, Jan. 18, 2023

 

 

Recent advances in Artificial Intelligence (AI)--like its ability to disrupt our democracy, write acceptable college essays, and cause teachers and professors to rethink the type of assignments they require--have raised new ethical questions. But they are related to an older one: “What force, if any, can limit the development and sale of a product that makes money?”

While it’s true that ChatGPT, the newest AI-based language model that can write acceptable essays and do much more, is “at the moment . . . available for free to anyone through a sign-up . . . , no one knows if they’ll eventually make it a paid tool.” The “they” that produced ChatGPT is OpenAI, co-founded in 2015 by Elon Musk and others. It has already “raised more than $1 billion in venture funding . . . with Microsoft as its largest investor. . . . The San Francisco-based organization expects $200 million in revenue in 2023 and $1 billion by 2024,” mainly, “by charging developers to license its technology to generate text and images.” Thus, as with other technology developed in capitalist societies by profit-earning companies, including Internet ones, no one should expect that ChatGPT will not end up being profitable (or at least pursuing profit).

As I have stated before, the primary purpose of capitalism has been to earn a profit. Conservative economist Milton Friedman even argued that the “social responsibility of business is to increase its profits.” And sociologist Daniel Bell noted that capitalism has “no moral or transcendental ethic.”

 

In the times of Karl Marx and Charles Dickens, laissez-faire capitalism, in which the government was not to interfere in the making of profits, was more common than it is today. During the Irish famine, which caused approximately 1 million deaths between 1846 and 1851, “ship after ship sailed down the river Shannon laden with rich food, carrying it from starving Ireland to well-fed England, which had greater purchasing power.” The British government refused to let food grown by Irish peasants feed them, partly because it was the private property of absentee English landlords, whose profit-making capabilities the government was not about to hinder.

 

From about 1890 until World War I, however, a U. S. Progressive Era existed. Progressivism was a diverse movement “to limit the socially destructive effects of morally unhindered capitalism, to extract from those [capitalist] markets the tasks they had demonstrably bungled, to counterbalance the markets’ atomizing social effects with a countercalculus of the public weal [well being].” This movement did not attempt to overthrow or replace capitalism but to constrain and supplement it in order to ensure that it served the public good. It did, however, increase government controls.

 

After WWI, however, the U.S. had three conservative Republican presidents who opposed progressive measures. In 1932 the last of them, Herbert Hoover, stated that “Federal aid would be a disservice to the unemployed.” But with the election of Franklin Roosevelt (FDR) that same year, progressive policies once again became acceptable. Harry Truman, who succeeded FDR in 1945, and most subsequent presidents accepted at least a minimum of progressive acts (like Social Security), with Ronald Reagan perhaps being the closest ideologically to earlier anti-progressives. Nevertheless, among the U.S. public and a minority of politicians there were always some who decried the amount of government activity implied by progressivism.

 

Yet the failure of Republican governments, more influenced by anti-progressivism, and even of more progressive Democratic governments to end the sale of harmful profit-earning products has been notable.

 

Look, for example, at U. S. Prohibition, in effect from 1920 until 1933. It did not end people’s purchasing of liquor or profiteering from its sale, but did stimulate a large increase in organized crime. The same is true for drug laws and the drug trade today, especially regarding fentanyl. A December 2022 Washington Post story reported that “during the past seven years, as soaring quantities of fentanyl flooded into the United States, strategic blunders and cascading mistakes by successive U.S. administrations allowed the most lethal drug crisis in American history to become significantly worse.”

 

In January 2023 The New York Times informed us that “increasingly in drug hot zones around the country, an animal tranquilizer called xylazine . . . is being used to bulk up illicit fentanyl, making its impact even more devastating.”

 

And it has not only been illegal alcohol and drug sales that have caused tremendous damage, but also legal drugs like OxyContin, which was a leading cause of our earlier opioid crisis, but which “made billions in profits” for “U.S. drug manufacturers, distributors and chain pharmacies.”

In my “The Opioid Crisis and the Need for Progressivism” (2019), I indicated how Purdue Pharma, owned by the Sackler family, sparked the earlier crisis through its marketing of OxyContin and “put profits first. Before any ethical considerations. Before the interests of people. Even if it killed them.” In November 2022, Reuters reported that the management of CVS, Walgreens, and Walmart agreed to pay about $13.8 billion to resolve thousands of U.S. state and local lawsuits accusing the pharmacy chains of mishandling opioid pain drugs. One of the lawyers suing the companies said that “reckless, profit-driven dispensing practices fueled the crisis.”

Thus, it has not only been criminal elements that have engaged in practices that have harmed and killed innumerable people, but legal, profit-first companies. Although it could be argued that eventually the U. S. legal system dealt with the problem of legal drugs killing people, it did not do so until after countless deaths had occurred. And only the future will determine whether “the greed of the pharmaceutical industry,” as Sen. Bernie Sanders charged in August 2022, will continue to “literally” kill Americans.

For a more comparable view of AI’s future we can look at the past of social media. In her highly praised These Truths: A History of the United States (2018), Jill Lepore writes that by deregulating the communications industry, the 1996 Telecommunications Act greatly reduced anti-monopoly stipulations, permitted media companies to consolidate, and prohibited “regulation of the Internet with catastrophic consequences.” Moreover, she states that social media, expanded by smartphones, “provided a breeding ground for fanaticism, authoritarianism, and nihilism.” Developments, especially Trumpian ones, that have occurred since her book first appeared about four and a half years ago only confirm the accuracy of her observations.

Some 15 years ago I devoted the last chapter of my An Age of Progress? to the question of whether or not the twentieth century was such an age. My answer was that in some areas such as science, technology, and the expansion of freedom there had certainly been significant progress, but in other areas such as the environment and moral growth, advancement was more debatable. I also discussed certain technological developments such as television and tried to present different views on whether or not they contributed to overall progress (e.g., Marshall McLuhan reputedly once saying about televisions, “If you want to save a single shred of Hebrew-Hellenistic-Roman-Christian humanist civilization, take an axe and smash those infernal machines”).

In subsequent years I have often dealt with capitalism and progressivism in such essays as "Pope Francis's Christian Capitalist Criticism" (2013), "Capitalism Versus Democracy" (2014), "What Does History Tell Us about Capitalism, Socialism, and Progressivism?" (2016), "Why Progressivism Should Be Our Nation's Political Philosophy" (2021), and "Is Capitalism Killing Our Planet and Our Concern for the Common Good?" (2021). Most recently, in "Climate Change, Fake Claims and Greenwashing" (2022), I defined "greenwashing," as the UN does, as "misleading the public to believe that a company or entity is doing more to protect the environment than it is."

 

ChatGPT's own promotional material highlights unlikely statements from historical figures that conform to contemporary anodyne corporate communication while expunging political conflict from history and from those figures' ideas.

 

 

Much of this activity comes from the false advertising and public relations operations of big companies more interested in profits than in the common good. In late 2021 Facebook whistle-blower Frances Haugen declared that the main reason companies like Facebook have not done more to advance the common good is “it makes the companies less profitable. Not unprofitable, just less profitable. And no company has the right to subsidize their profits with your health.” Yet that is exactly what some drug, fossil fuel, tobacco, and social media companies have sometimes done.

 

If enough people believe that Milton Friedman is correct that the “social responsibility of business is to increase its profits,” is there much hope that any force exists that can limit the development and sale of a product that harms society but makes money? Is Progressivism such a force, or is even it too weak?

The Roundup Top Ten for January 20, 2023

Why Are We Arguing About History But Letting the Profession Die?

by Daniel Bessner

If nobody can expect to earn a decent living researching and writing history, then vast swaths of our past will be unknown to the future, and the history that is written will suit the whims of the rich hobbyists who can afford to do the work. 

 

Hamline Mess Shows Religion and Neoliberal Administration Converge to Reject Expertise

by Alexander Jabbari

An instructive contrast can be drawn from a 1997 controversy over a frieze depicting Muhammad at the United States Supreme Court. Since then, post-9/11 Islamophobia, a culture of deliberate trolling under the banner of free speech, and the rise of corporate-style university management have drained the capacity for nuance.

 

 

Florida's Ban on AP African American Studies Class is Authoritarian

by Jeremy C. Young

The decision is "bad for free speech and for educational practice, and it's especially worrisome for Florida high school students. When politicians go to war with teachers, students always lose."

 

 

Masculinity and Trauma in War and Football

by Sarah Handley-Cousins

Sports have been cast as a (relatively) peaceful way of inculcating a set of masculine virtues otherwise associated with war. But the experience of injury and grief will continue to confound the rules of manhood—and football fans and citizens should pay attention. 

 

 

Why Do "Secret" Documents Keep Showing Up in the Wrong Places?

by Matthew Connelly

The near-unilateral authority of presidents to declare material secret in the name of national security is intoxicating and it's nearly impossible for the chief executive to resist abusing it, creating not a "deep state" but a "dark state" of secrecy and impunity. 

 

 

Blasphemy Is Not a DEI Issue

by Joan W. Scott

Hamline has mistaken the vital imperative of care and respect toward members of minority communities on campus with capitulation to religious censorship, which a university cannot abide. 

 

 

DeSantis Merging Fear of Lessons on Race and Sexuality with Attacks on Public Education

by Jonathan Feingold

Ron DeSantis's General Counsel defined "woke" as “the belief there are systemic injustices in American society and the need to address them.” The governor's education agenda is neatly summed up by this statement, and it's spreading nationwide. 

 

 

Colleges are Vulnerable to Political Attacks Because They've Abandoned their Roots

by Christine Adams

"Despite the persistence of conservative campaigns against higher education, American colleges and universities have never really hit on an adequate response to these attacks."

 

 

The Romance of the Highway Obscures Harm to Communities of Color

by Ryan Reft

Secretary Pete Buttigieg's comments that interstate construction entrenched racial segregation were denounced as "woke" by critics. But history shows that highway planners knew that such consequences were likely to ensue, and proceeded anyway. 

 

 

The Middle Ages Were Much Cleaner Than We Think

by Eleanor Janega

Our myths about medieval cleanliness are contradicted by mountains of evidence about the lengths people of all social classes went to to bathe. 

 

50 Years Ago, "Anti-Woke" Crusaders Came for My Grandfather

Norma and Mel Gabler

 

 

On April 22nd, 2022, Florida Governor Ron DeSantis signed House Bill 7 (popularly called the “Stop WOKE” Act). Christopher Rufo then took to the podium. After praising the Governor and the bill, Rufo denounced Critical Race Theory (CRT) in schools on three points: CRT segregates students based on race, teaches white heterosexual males that they are fundamentally oppressive, and paints America as a place where racial minorities have no possibility of success. 

 

While the bogeyman of CRT is a new iteration, Rufo's objections fit into the long history of the politics of American education. Like his predecessors, Rufo misrepresents ideas critical of conservative hegemony in order to maintain it. “I am quite intentionally,” Rufo tweeted, “redefining what ‘critical race theory’ means in the public mind, expanding it as a catchall for the new orthodoxy. People won’t read Derrick Bell, but when their kid is labeled an ‘oppressor’ in first grade, that’s now CRT.”  But if the public does read Bell, they will see the fallacious humbug Rufo has concocted. “America offers something real for black people,” Bell writes in Silent Covenants, “...the pragmatic approach that we must follow is simply to take a hard-eyed view of racism as it is, and of our subordinate role in it. We must realize with our slave forebears that the struggle for freedom is, at bottom, a manifestation of our humanity that survives and grows stronger through resistance to oppression even if we never overcome that oppression.” Rufo’s deliberate obfuscation of CRT furthers the American lost cause of white resentment. Attaching the politics of education to the politics of whiteness places Rufo’s actions within a longer historical pattern.

 

In 1972, Search for Freedom: America and Its People came up for review at a public hearing in Texas for statewide textbook adoption. Noted Texan conservatives Mel and Norma Gabler derided the fifth-grade social studies text for several reasons. First, they alleged, it questioned American values and patriotism. Second, it encouraged civil disobedience. Third, it championed Robin Hood economics (taxing the rich and giving to the poor). Fourth, it committed blasphemy for comparing the ideas of Thoreau, Gandhi, and King with those attributed to Jesus in the Gospels. Fifth, it glorified Andy Warhol and, worst of all, only mentioned George Washington in passing but devoted six-and-a-half pages to Marilyn Monroe. After the hearing, the Texas legislators agreed with the Gablers’ objections and effectively banned the textbook from Texas classrooms. Because of Texas's outsized role in textbook adoption, the textbook did not make it into any other classrooms.  

 

William Jay Jacobs, my grandfather, wrote the book. 

 

My personal connection to this history helps me see how Rufo carries the Gablers’ legacy into the twenty-first century. Acting as guardians of the American republic, Rufo and the Gablers turn complex ideas into soundbites and use those soundbites to make claims about radical indoctrination in schools. They portray this indoctrination as so dangerous that censorship is the only possible solution. The Gablers and Rufo, in their way, share Plato’s conviction that “the young are not able to distinguish what is and what is not...for which reason, maybe, we should do our utmost that the first stories that they hear should be composed as to bring the fairest lessons of virtue to their ears.” Should any story question or contradict the conservative virtues the Gablers and Rufo hold so dear, “it becomes [their] task, then, it seems, if [they] are able, to select which and what kind of natures are suited for the guardianship of a state.”

 

In a modern democracy, though, which “lessons of virtue” should be taught to the young, and who “select[s] which and what kind of natures,” are questions open for public debate. The Gablers and Rufo have therefore worked to manipulate ideas, and how the public perceives those ideas, to justify both conservative curricula and their roles as legitimate guardians of the common-sense virtues of the American republic.

 

After the 1972 Search for Freedom hearings, as the right questioned the left’s patriotism and labeled any dissent as anti-American, the Gablers took to the press, seeding sensational soundbites. Headlines shouted: "The Sexy Textbook!" and "More MM than GW!" Mel and Norma then headed to "The Phil Donahue Show" and "60 Minutes" with my grandfather's textbook in hand. Proclaiming themselves as neutral textbook evaluators, they held the book up to the screen and claimed that my grandfather had swapped Marilyn Monroe for Martha Washington as mother of our country. But as my grandfather wrote in a retort,  

"Marilyn" made for a good laugh. Yet what better contemporary symbol have we of the potential for barrenness in the American dream when, stripped of its inherent idealism, it is reduced to a mindless groping for money and fame? The Marilyn Monroe sketch raised questions for young readers about mass "spectatorism" and the commercial packaging of human vulnerabilities. It illustrated that not every story beginning with "Once upon a time" necessarily will end with the hero (or heroine) living "happily ever after."

 

Rather than juxtaposing the moral of my grandfather’s story with their objection, the Gablers simply skipped over my grandfather’s critical rendition of the American dream and turned it instead into made-for-TV moral panic. They used live television to warn the American public that dangerous ideas were in their textbooks. The Gablers’ posture—as common-sense Americans shocked by outrageous lessons—spoke to conservative Americans and encouraged them to join their effort to prevent subversive ideas from entering classrooms.

 

Before Rufo spoke on the podium with DeSantis, he began his crusade on Fox News with Tucker Carlson. On live television, Rufo claimed that CRT “has pervaded every institution in the federal government.” He further proclaimed, “I’ve discovered… that critical race theory has become in essence the default ideology of the federal bureaucracy and is now being weaponized against the American people.” With a captivated, frown-eyed Carlson watching, Rufo explicated findings from three “investigations” that purported to “show the kind of depth of this critical race theory occult indoctrination and the danger and destruction it can wreak.” First, he presented snippets from a seminar led by Howard Ross, who asked treasury department employees “to accept their white privilege...and accept all of the baggage that comes with this reducible essence of whiteness.” Second, Rufo described a weekly seminar on intersectionality held by the Federal Bureau of Investigation, which aimed “to determine whether you are an oppressor or oppressed.” Third, Rufo detailed a “three-day re-education camp,” sponsored by the Sandia National Laboratories, to “deconstruct their white male culture and actually force them to write letters of apology to women and people of color.”

 

Rufo ended his diatribe with a call to action: “conservatives need to wake up that this is an existential threat to the United States...I call on the president to immediately issue this executive order and stamp out this destructive divisive pseudoscientific ideology at its root.” With his hyperbolic language, his tying CRT to anything that criticized the power of white American males, and his call for conservatives to “wake up” to defeat an “existential threat,” Rufo put his telegraphed approach to work.

 

The Carlson interview aired on the first of September; by the 4th the Trump administration had sent a memo stating, “...according to press reports, employees across the Executive Branch have been required to attend trainings where they are told that ‘virtually all White people contribute to racism’ or where they are required to say that they ‘benefit from racism’.”

 

Extracting CRT from the halls of academia and claiming to find its pernicious presence across all federal agencies, Rufo and Carlson brewed moral panic to transform CRT into an existential bogeyman coming to destroy white America. In both cases, the Gablers and Rufo used television to gain support for their cause. They turned critical ideas about American society into a demon that must be slain. By inflating distant employee training sessions and fifth-grade social studies textbooks into a vast anti-white, anti-American conspiracy, they encouraged viewers to see schools as a nearby battlefront they could, and must, fight on.

 

In an article titled “Ideological Book Banning is Rampant Nationally,” published in the Washington Post on October 16th, 1983, Alison Muscatine reported the following:

 

"Our children are totally controlled," said Norma Gabler, displaying a social studies textbook that devotes six pages to Marilyn Monroe but that makes only three references to George Washington. "Can you imagine a sex symbol being given more time than the father of our country? I don't think it's fair that our children be subjected to this kind of information. They are being totally indoctrinated to one philosophy.”

 

To try to fight the alleged indoctrination, the Gablers created the Educational Research Analysts—an explicitly Christian conservative organization—to review, revise, and censor any textbook that ran counter to their vision of what American children should be taught. In their attempt to guard the American child from subversive stories, the Gablers claimed children were being “totally indoctrinated to one philosophy.” Their censorious actions, however, did more to indoctrinate American children to one way of seeing the world than did my grandfather’s parable on Marilyn Monroe. Citing indoctrination, the Gablers justified their censorship to preserve their version of America as the only legitimate story American children should read.

 

Although Rufo himself has not censored textbooks, his actions led to legislation that did. The Florida Department of Education published a press release labeled “Florida Rejects Publishers’ Attempts to Indoctrinate Students.” In 5,895 pages, the department details two reasons for rejecting 41 percent of the textbooks that were reviewed. The textbooks either followed Common Core Standards (which the Florida Department of Education rejects), or the textbooks included CRT (defined, of course, in Rufo’s expansive terms). Like the Gablers, the Florida textbook evaluators assume controversial ideas in a text will indoctrinate the children reading them. Again, the Gablers and Rufo posture as guardians standing against a radical activist agenda, not as censors. They both throw their hands up, sit, and watch as other citizens act upon their calls to censor ideas. And when others call them censorious zealots, they simply dodge the charges by claiming they themselves did not censor ideas, even though their actions clearly encouraged others to do so.

 

In an exposé on the Gablers, Mel details how they understand this guardianship. “‘When they eliminate good books and put garbage in, they are the censors,’ he said. ‘All we do is point it out.’” Because they only reported the textbooks to the Texas Education Agency, the Gablers did not see themselves as censors. Semantically, they may be right. Practically, however, the Gablers’ actions effectively “canceled” certain ideas. Forget merit; for the Gablers, an idea should only be taught if it fits into an understanding of “good books” that happens to coincide with their conservative worldview. The good books argument is akin to the argument Plato’s Socrates makes in the Republic. Namely, those who have the power and guard the republic are the rightful persons to decide which stories and thereby which virtues the future guardians should learn. The problem, however, is that neither the Gablers nor any other single entity in a modern democratic state has the sole right to decide what the next generation ought to know.

 

On Twitter, Rufo invoked this exact line of reasoning. He wrote, “there are no ‘book bans’ in America. Authors have a First Amendment right to publish whatever they want, but public libraries and schools are not obligated to subsidize them. Voters get to decide which texts—and ultimately, which values—public institutions transmit to children.” Rufo is right, to a point. The voters do make those decisions but do so, presumably, by understanding good-faith arguments on both sides of an issue. But Rufo’s sensationalized, bad-faith reporting—which turned CRT into something it is wholly not—prevents voters, and especially children, from seeing both sides of the issue and forming their own opinions. Positioning himself as a defender of America, Rufo turns progressive ideas into anti-American rhetoric to excite the conservative base to enact censorship.

 

Let me be clear: the difference between the Gablers and Rufo is one of degree, not kind. The Gablers aimed at textbooks, while Rufo aims at a broad and diffuse set of ideas and practices now dubbed “wokeness.” The Gablers raised hell at textbook adoption meetings, while Rufo raises hell on the internet. Both position themselves as protectors against supposedly subversive ideas. Both (along with Plato), however, fall into the same faulty assumption. Critical or not, ideas do not simply transmit themselves to children. Children, like adults, can reason. Thus, children—not just books, not just ideas—shape how they understand the world they live in.

 

In his response editorial, my grandfather leaves us with a prescient insight:

 

Meanwhile, it’s comforting to know that the issue of book banning continues to generate controversy. It means that at least someone, somewhere, still takes the written word seriously as a means of influencing the minds of young people.

Resisting Nationalism in Education

Public Art, Copenhagen, Denmark. Photo by Author

 

 

You’ve seen the photos from Ukraine. The bombed-out schools, the ghostly writing left behind on blackboards, desks turned over, and posters in tatters. As Russian attacks mercilessly drum on, innocent Ukrainian families and children flee westward.

 

Education in Exile

 It was in a rural school in Northern Germany where I first met two elementary school-age brothers from Ukraine, now living as refugees.  I had not anticipated meeting children from Ukraine in my month-long research trip to German schools, and they hadn’t expected to be there either. Yet, there we all were, at a small school focused on democratic living and practices.

The boys arrived at the school not speaking any German.  One of their multilingual peers reported to me in English that the boys had made a lot of progress since the start of the school year. The younger of the two acted out and was known to hit others, perhaps responding to the chaos and upheaval that he had endured at such an early age.

The Ukrainian boys' new German teachers told me that there were many more children who had arrived in the area, but that they opted to take courses over Zoom with their teachers back home who had been unable to leave the country due to travel restrictions.  There was hope that the frequent video calls would ease the eventual return of the children to their regular schools in a post-conflict Ukraine.

As a sixth-grade social studies teacher, I was curious to find out how a country balances talking about its own difficult past with encouraging pluralism, respect, and youth engagement. The immediacy of the crisis in Ukraine highlighted the pressing need to address how our systems of education respond to such inhumanity.

 

Authoritarian Education

Putin’s nationalistic war of aggression seeks to create a mythic “Greater Russia.” It is part of a long campaign to erase the unique history of the Ukrainian people, a campaign that has featured comments from the Russian leader such as his claim that “Modern Ukraine was entirely and fully created by Russia.” Along these lines, he believes that if Russia created Ukraine, then Russia can also destroy Ukraine as it deems fit. The brave resistance of the Ukrainian people shows how wrongheaded Mr. Putin is in his notions and actions.

The warped view of history, as told by Putin, is a feature of nationalistic ideology, which divides the world into “us” and “them.”  The basic logic of this simplistic binary is to rationalize the use of state-sanctioned violence in the name of “us” to protect against the dangerous “them.”  As Yale historian Timothy Snyder points out, constructing the mythic past is part of “the politics of eternity,” whose followers believe that there is always a danger to civilization posed by the “outsider.”

The pull of the narrative of a divided world is especially powerful in times of heightened anxiety and social upheaval. Those in power who try to smooth over the past do so out of a desire to mask the pain of resentment and embarrassment.  For Putin, there is the desire to hide the failures of the post-Soviet state, to reclaim a pseudo-historic image of Russia, and to channel outrage over lack of economic development toward the West. All of these aims, in addition to geopolitical military positioning against the perceived threat of NATO expansion in the Baltics and beyond, have driven Putin’s quixotic escapade of death in Ukraine.  Autocracy rots the promise of education.  It is a blight on the dreams of future generations.  It produces fear and conformity, stunting creativity, expression, and the power of imagination. A world divided is a destructively simple idea that masks brutality.

 

Resisting Authoritarian Education: In Exile and At Home

Authoritarianism has no place in education. Even in the bucolic state of New Hampshire, I have seen it sprout up in the bullying tactics of politicians and in lawmakers’ attempts to ban honest conversations about our nation’s history of sexism, racism, antisemitism, and xenophobia. At the core of such restrictions on intellectual and academic life is the notion that independent analytical thinking is unnecessary because the politics of eternity explain all that the general public needs to know: the past was glorious.

Countering the pull toward nationalistic authoritarianism requires intellectual openness and curiosity.  This is a challenge in the time of recovery from the global pandemic, environmental catastrophe and jagged economic turbulence.  In these times, we want security, consistency, and remedies to our social ailments.  These desires can close people off to new ways of thinking and being, as many are in harm-reduction holding patterns that disallow newness out of fear.

Children know that pain exists in this world. All too many of them live that pain. Others carry intergenerational wounds that require adults’ attention to help them heal and grow. While I can empathize with parents, such as one German-American parent who went to great lengths to hide news of the war in Ukraine from her child, such a decision deprives a classmate of another understanding peer. We have an obligation to listen to children in a way that provides them with the knowledge they seek and the knowledge that allows them to be fully in the world today, as they are.

 

In pictures and words, children are processing the trauma of wartime violence.  One public art display in Copenhagen places the illustrations and writings of children of 1930s Poland next to the experiences of children caught in the Ukrainian conflict today.  Such displays remind adults of the importance of speaking up against all forms of oppression and the need to have spaces in schools and in public that honor the voices and experiences of children.  

While there is room for rational discomfort and fear, we must be able to work through that fear to model for our young people how we can make the most of this moment of change – how we can live with uncertainty and create a new way of living that acknowledges the hurt and harm of the past while also moving to be more honest about the possibilities of today.

Teaching and learning history will not be without controversy or conflict, but the ability to recognize and critique the tropes of nationalism is a step toward preserving peace and freedom. These ideals of nonviolence start in a place close to the heart and grow through intermittent bursts that entangle us in the beautiful gnarl of life. Nested here we find ourselves in the company of sorrow and joy.

There is no escaping the hard truths of the past, but we can all strive to see how we are shaped by those who have come before us. It is only then that we can meet the challenges of today.

One Term, Two Presidencies: Biden's Prospects under Divided Government

 

 

President Joe Biden has been, by most objective measures, an effective president. His first two years in office produced several landmark legislative achievements which, given his razor-thin majority control of the House and a 50-50 tie in the Senate, speak to his skill as well as his success in the domestic arena. The American Rescue Plan, infrastructure spending, a large stimulus package, an expansion of the social safety net, COVID vaccine distribution, the Inflation Reduction Act, climate spending, gun control reforms, the CHIPS Act, the reauthorization of the Violence Against Women Act, and other accomplishments make Biden an especially successful president.

But as has been the pattern historically, in the midterm election of 2022, Republicans took control of the House by a slim margin, while the Democrats actually expanded their control of the Senate, winning one seat to take a 51-49 majority. Even if the Democrats did fairly well by comparative standards in the midterms, the loss of control of the House means that, as so often happens, Biden will face divided government in his next two years in office. How is this likely to impact his ability to govern?

The simple answer is that it will make things more difficult for Biden. Expect deadlock, roadblocks and a slew of investigations into any number of issues (or pseudoissues)—Hunter Biden and his infamous laptop, the Afghan pullout, the border situation, the COVID response—the list seems inexhaustible now and will certainly grow. Republicans will seek to distract Biden from governing and put him on the defensive. How have past presidents governed in divided government, and what lessons might Biden draw from his predecessors? Political scientists have extensively studied this question and while there is no one-size-fits-all strategy for success, there are several things that Biden can do to increase the odds he can govern amid divided government.

Biden’s governing path will be made even more difficult as a result of the chaos and conflict in the Republican House caucus, conflicts that were played out in front of a national television audience in the early days of January 2023 when it took the Republicans fifteen ballots to select a Speaker. In order to secure the Speaker’s job, California Republican Kevin McCarthy was forced to make compromises and bargains with the most extreme MAGA wing of the Republican Party, compromises that will severely weaken the Speaker as he tries to herd the cats of Congress. Clearly, President Biden will face a House of Representatives with an emboldened extreme right wing and a weakened Speaker, making broad-based coalitions and compromise between the parties more difficult if not impossible.

One proviso: many of the lessons for navigating divided government draw from a time when, even with strong partisan divisions, it was not uncommon for landmark legislation to have a bipartisan flavor. Some of the biggest reforms in domestic policy came about with a fairly broad consensus and cross-party voting in Congress. Civil rights reforms, Medicare, tax cuts, and environmental laws often were the result of some legislators crossing party lines to form a broad-based consensus that, while led by presidents, had a bipartisan congressional stamp of approval. That has not been the case in the past thirty years.

Beginning in the Clinton years, the two parties became more divided ideologically, more tribal in their attitudes, less willing to broker deals, more rigid in their insistence that principles could not be compromised as more issues were seen as a defense of core principles. The opposition party was increasingly seen as “the enemy” to be destroyed.  With the parties so ideologically divided yet so evenly divided numerically in the nation, the obvious result was gridlock, deadlock, blame-game accusations, and the weaponization of policy differences. Politics, always a blood sport, became all-out war. Thus, strategies often employed by presidents like Lyndon Johnson and Ronald Reagan – schmoozing, making trade-offs, give and take, bargaining, compromising, meeting in the White House to pressure legislators, doing favors for members, and brokering exchange deals – while not off the table, happen less frequently than before.

Today, merely talking to a President of the opposition party can bring a barrage of criticism from the more extremist members of the opposition, and even mainstream members often see such political and even social exchanges as traitorous. When politics is war, sleeping with the enemy – or even breaking bread – is seen as a cardinal sin.

How have the more recent presidents adapted to this new hyper-partisan environment? In the past thirty years we have seen the development of the “two-stage presidency.” The modern presidency is bifurcated into two-year segments. In the first two years of an administration, presidents tend to stress legislative achievements and attempt to pressure Congress to pass their legislative agenda. Having (usually) a majority in both Houses, presidents are better positioned to seek legislative victories by appealing to party loyalty or seeking limited bipartisan support. Most of the modern presidents have seen years one and two produce most of their legislative successes.

But after the midterms, which usually see an incumbent president’s party lose roughly 30 seats in the House and 3 or 4 in the Senate, the political calculations change sharply, and presidents can no longer expect Congress to pass significant legislation. Presidents figure this out quickly and adjust to a different governing strategy for years three and four: developing an administrative approach to governing. In the back half of the term, presidents seek to pursue policy changes using unilateral actions that do not require the approval of Congress. Governing “with the stroke of a pen” can be tempting, but also dangerous. Utilizing executive orders, proclamations, executive agreements, National Security Directives, regulatory changes, and other managerial tactics, presidents attempt to bypass Congress and govern directly (and alone).  Of course, this strategy is not as permanent as passing landmark legislation that is protected from the whims of change of the political seasons. Executive orders – which have the force of law – can be undone by a successor (as often happens) and while many do have lasting power, many of the more controversial orders (e.g. on civil rights or the environment) can be easily undone. President Trump made it a point of pride to undo as many Obama-era executive orders as he could. And then Joe Biden sought to undo Trump’s undoing of Obama.

If the managerial route to governing is so transient, why do presidents even bother? Because if that’s all you have, that’s what you use. Executive orders can be a good way to signal to important interest groups in the party that you are behind them and working for their interests. And at times, such orders stick and become set policy. Yes, the managerial or administrative presidency is a recognition and acceptance of weakness, but all presidents know they will be judged by the public, and they need trophies to display indicating they are governing for the people. Presidents would much rather go the legislative route, but when that door is slammed shut, effective presidents develop alternative strategies and adjust to the realities of the times.

All presidents are susceptible to two forces: numbers and times. By numbers we mean the number of members of the president’s party in Congress (a majority allows them to have a fighting chance of passing legislation), and by times, we mean the mood and demands of the times. In a crisis, presidents are ceded power; in normal times, they are usually hemmed in. If the people are in a public-spirited mood, presidents might be able to use popular opinion to press Congress. At times these two factors merge (e.g., FDR and the New Deal years of 1933-37), but only rarely are the political stars so aligned. Presidents can’t create the political environments in which they govern. But they can be sensitive to mood swings and political opportunities. To everything there is a season, and the season for legislative achievement is short-lived and susceptible to interruption. Get what you can, when you can, in the ways you can.

We have today a new “two presidencies” model of the office: one template for years one and two, another for years three and four. In the first two years presidents pursue legislative victories; in the last two, they seek administrative victories. President Joe Biden will continue to make legislative proposals, but watch his administrative actions for a clue to where he intends to take the nation.

With Academic History in Crisis, can Departments Pivot to Reach Interested Audiences?

 

 

 

When you work in higher education, people will tell you what they think of your discipline. All of the mathematicians I know are constantly told by people they’ve just met how much they hate math. As a historian, probably about a third of the general public quickly tells you that they never liked history. But it’s probably also a third that tells you they really appreciate history now that they’re older and they wish they had studied it more. What are we doing with this information?

 

On the one hand, history is one of the struggling liberal arts disciplines. When we look at undergraduate majors, history is in decline almost everywhere. Of course, there are exceptions. In The New Yorker, Eric Alterman examined the “decline of historical thinking” as an example of inequality. People at the most prestigious universities still do study history. But the overall decline is significant enough that it may be irresponsible to send your best students to grad school. According to recent numbers, of the 1,799 new history PhDs between 2019 and 2021, only 175 found full-time work as faculty. This is beyond bleak.

 

On the other hand, Americans don’t actually hate history. They just don’t necessarily appreciate it as undergraduates. Ken Burns’ Country Music reached over 34 million viewers. And history programming happens all the time on PBS and other channels. History also shows up in book sales. Americans buy books from the history and biography categories every year. History trends in podcasts. Dan Carlin’s Hardcore History gets millions of downloads per episode—and the episodes aren’t short. Malcolm Gladwell gets paid by Pushkin to make Revisionist History, which is not entirely about history but always engages the past. There are millions of people who subscribe to Substacks, but very few faculty who make them as part of their work. All kinds of adults are interested in history, but in higher ed we aren’t doing too much with that interest.

 

It's appropriate for universities to focus on undergraduate education and degrees, but it may be unnecessarily limiting to stop there. Why can’t history departments spend some of their time following the existing audience for their knowledge? Why doesn’t “running the university like a business” include attuning departments to the market and not just downsizing because of small class sizes of 18-22 year-olds? Barnes & Noble has turned itself around, why can’t our history departments do the same?

 

What could a reimagined history department look like? There are many ways that universities could better position their history departments. Right now, historians who attempt to reach the actual market of adults interested in history do so largely on their own, on the side. Why don’t history departments allow some faculty to do a podcast as a course substitution? Universities often have the technology and facilities to produce quality podcasts. A successful podcast could be an opportunity for revenue-sharing between the professor and the university. Why not try a Substack as a course substitution? History departments could get encouragement to offer adult education classes, especially non-degree-seeking ones. Many people want to learn more about history; they don’t all want to get another degree. History departments could also be positioned to offer quality lecture series for the community—again, not as an additional burden on faculty but as a substitute for some of their more traditional responsibilities. No doubt departments can think of many more unconventional ideas for engaging the public. These kinds of things could be revitalizing for departments and revenue-generating for universities.

 

Universities have everything to gain and nothing to lose from more creative approaches to expanding the reach of their history departments. You can squeeze faculty over class size and journal publications, or you can set them free to increase the impact of the university in your actual community. Most universities would like to have a higher profile in their neighborhood. Universities also benefit from having more people on campus more often. Traditional undergraduate course work does not do that. Neither does increasing graduate offerings. Events and programs that attempt to serve the actual market of people interested in history could do that. In many cases, faculty have ideas and skills. Why not attempt to cater to existing consumers of history where we find them—outside the traditional classroom?

 

When we downsize departments and don’t hire for history, we often say there’s just no audience for history. That’s just not true. It’s worth considering what responding to the existing audience of adults interested in history might look like within history departments. Not every history department can generate a hit podcast, but consider the contrast in number of listeners between a history survey course and a moderately successful YouTube channel. It’s time to reimagine what a history professor’s work can look like with more flexibility and fewer constraints. Colleges and universities stand to gain from such innovation.

Martin Sherwin's "Gambling with Armageddon" Strips away the Myths of Nuclear Deterrence

A US helicopter flies above the Soviet submarine B-59 during the blockade of Cuba, October 28-29, 1962

 

 

Martin J. Sherwin, Gambling with Armageddon: Nuclear Roulette from Hiroshima to the Cuban Missile Crisis (Vintage paperback edition, 2022).

 

 

The development and the deployment of nuclear weapons are usually based on the assumption that they enhance national security.  But, in fact, as this powerful study of nuclear policy convincingly demonstrates, nuclear weapons move nations toward the brink of destruction.

The basis for this conclusion is the post-World War II nuclear arms race and, especially, the Cuban missile crisis of October 1962.  At the height of the crisis, top officials from the governments of the United States and the Soviet Union narrowly avoided annihilating a substantial portion of the human race by what former U.S. Secretary of State Dean Acheson, an important participant in the events, called “plain dumb luck.”

The author of this cautionary account, Martin Sherwin, who died shortly after its publication, was certainly well-qualified to tell this chilling story.  A professor of history at George Mason University, Sherwin was the author of the influential A World Destroyed: Hiroshima and Its Legacies and the co-author, with Kai Bird, of American Prometheus: The Triumph and Tragedy of J. Robert Oppenheimer, which, in 2006, won the Pulitzer Prize for biography.  Perhaps the key personal factor in generating these three scholarly works was Sherwin’s service as a U.S. Navy junior intelligence officer who was ordered to present top secret war plans to his commander during the Cuban missile crisis.

In Gambling with Armageddon, Sherwin shows deftly how nuclear weapons gradually became a key part of international relations.  Although Harry Truman favored some limitations on the integration of these weapons into U.S. national security strategy, his successor, Dwight Eisenhower, significantly expanded their role.  According to the Eisenhower administration’s NSC 162/2, the U.S. government would henceforth “consider nuclear weapons as available for use as other munitions.”  At Eisenhower’s direction, Sherwin notes, “nuclear weapons were no longer an element of American military power; they were its primary instrument.” 

Sherwin adds that, although the major purpose of the new U.S. “massive retaliation” strategy “was to frighten Soviet leaders and stymie their ambitions,” its “principal result . . . was to establish a blueprint for Nikita Khrushchev to create his own ‘nuclear brinkmanship’.”  John F. Kennedy’s early approach to U.S. national security policy―supplementing U.S. nuclear superiority with additional conventional military forces and sponsoring a CIA-directed invasion of Cuba―merely bolstered Khrushchev’s determination to contest U.S. power in world affairs.   Consequently, resumption of Soviet nuclear weapons testing and a Soviet-American crisis over Berlin followed.     

Indeed, dismayed by U.S. nuclear superiority and feeling disrespected by the U.S. government, Khrushchev decided to secretly deploy medium- and intermediate-range ballistic nuclear missiles in Cuba.  As Sherwin observes, the Soviet leader sought thereby “to protect Cuba, to even the balance of nuclear weapons and nuclear fear, and to reinforce his leverage to resolve the West Berlin problem.”  Assuming that the missiles would not be noticed until their deployment was completed, Khrushchev thought that the Kennedy administration, faced with a fait accompli, would have no choice but to accept them.  Khrushchev was certainly not expecting a nuclear war.

But that is what nearly occurred.   In the aftermath of the U.S. government’s discovery of the missile deployment in Cuba, the Joint Chiefs of Staff demanded the bombing and invasion of the island. They were supported by most members of ExComm, an ad hoc group of Kennedy’s top advisors during the crisis.  At the time, they did not realize that the Soviet government had already succeeded in delivering 164 nuclear warheads to Cuba and, therefore, that a substantial number of the ballistic missiles on the island were already operational.  Also, the 42,000 Soviet troops in Cuba were armed with tactical nuclear weapons and had been given authorization to use them to repel an invasion.  As Fidel Castro later remarked:  “It goes without saying that in the event of an invasion, we would have had nuclear war.”

Initially, among all of Kennedy’s advisors, only Adlai Stevenson, the U.S. ambassador to the United Nations, suggested employing a political means―rather than a military one―to secure the removal of the missiles.  Although Kennedy personally disliked Stevenson, he recognized the wisdom of his UN ambassador’s approach and gradually began to adopt his ideas.  “The question really is,” the president told his hawkish advisors, “what action we take which lessens the chance of a nuclear exchange, which obviously is the final failure.”  Therefore, Kennedy tempered his initial impulse to order rapid military action and, instead, adopted a plan for a naval blockade (“quarantine”) of Cuba, thereby halting the arrival of additional Soviet missiles and creating time for negotiations with Khrushchev for removal of the missiles already deployed.

U.S. military leaders, among other ostensible “wise men,” were appalled by what they considered the weakness of the blockade plan, though partially appeased by Kennedy’s assurances that, if it failed to secure the desired results within a seven-day period, a massive U.S. military attack on the island would follow.  Indeed, as Sherwin reveals, at the beginning of October, before the discovery of the missiles, the U.S. Joint Chiefs of Staff were already planning for an invasion of Cuba and looking for an excuse to justify it.

Even though Khrushchev, like Kennedy, regarded the blockade as a useful opportunity to negotiate key issues, they quickly lost control of the volatile situation.

For example, U.S. military officers took the U.S.-Soviet confrontation to new heights.  Acting on his own initiative, General Thomas Power, the head of the U.S. Strategic Air Command, advanced its nuclear forces to DEFCON 2, just one step short of nuclear war―the only occasion when that level of nuclear alert was ever instituted.  He also broadcast the U.S. alert level “in the clear,” ensuring that the Russians would intercept it.  They did, and promptly raised their nuclear alert level to the same status. 

In addition, few participants in the crisis seemed to know exactly what should be done if a Soviet ship did not respect the U.S. blockade of Cuba.  Should the U.S. Navy demand to board it?  Fire upon it?  Furthermore, at Castro’s orders, a Soviet surface-to-air battery in Cuba shot down an American U-2 surveillance flight, killing the pilot.  Khrushchev was apoplectic at the provocative action, while the Kennedy administration faced the quandary of how to respond to it.

A particularly dangerous incident occurred in the Sargasso Sea, near Cuba.  To bolster the Soviet defense of Cuba, four Soviet submarines, each armed with a torpedo housing a 15-kiloton nuclear warhead, had been dispatched to the island.  After a long, harrowing trip through unusually stormy seas, these vessels were badly battered when they arrived off Cuba.  Cut off from communication with Moscow, their crews had no idea whether the United States and the Soviet Union were already at war. 

All they did know was that a fleet of U.S. naval warships and warplanes was apparently attacking one of the stricken Soviet submarines, using the unorthodox (and unauthorized) tactic of forcing it to surface by flinging hand grenades into its vicinity.  One of the Soviet crew members recalled that “it felt like you were sitting in a metal barrel while somebody is constantly blasting with a sledgehammer.”  Given the depletion of the submarine’s batteries and the tropical waters, temperatures in the submarine ranged between 113 and 149 degrees Fahrenheit.  The air was foul, fresh water was in short supply, and crew members were reportedly “dropping like dominoes.”  Unhinged by the insufferable conditions below deck and convinced that his submarine was under attack, the vessel’s captain ordered his weapons officer to assemble the nuclear torpedo for action.  “We’re gonna blast them now!” he screamed.  “We will die, but we will sink them all―we will not become the shame of the fleet.”

At this point, though, Captain Vasily Arkhipov, a young Soviet brigade chief of staff who had been randomly assigned to the submarine, intervened.  Calming the distraught captain, he eventually convinced him that the apparent military attack, plus subsequent machine gun fire from U.S. Navy aircraft, probably constituted no more than a demand to surface.  And so they did.  Arkhipov’s action, Sherwin notes, saved not only the lives of the submarine crew, “but also the lives of thousands of U.S. sailors and millions of innocent civilians who would have been killed in the nuclear exchanges that certainly would have followed from the destruction” that the “nuclear torpedo would have wreaked upon those U.S. Navy vessels.”

Meanwhile, recognizing that the situation was fast slipping out of their hands, Kennedy and Khrushchev did some tense but serious bargaining.  Ultimately, they agreed that Khrushchev would remove the missiles, while Kennedy would issue a public pledge not to invade Cuba.  Moreover, Kennedy would remove U.S. nuclear missiles from Turkey―a reciprocal action that made sense to both men, although, for political reasons, Kennedy insisted on keeping the missile swap a secret.  Thus, the missile crisis ended with a diplomatic solution.

Ironically, continued secrecy about the Cuba-Turkey missile swap, combined with illusions of smooth Kennedy administration calibrations of power spun by ExComm participants and the mass communications media, led to a long-term, comforting, and triumphalist picture of the missile crisis.  Consequently, most Americans ended up with the impression that Kennedy stood firm in his demands, while Khrushchev “blinked.”  It was a hawkish “lesson”―and a false one.  As Sherwin points out, “the real lesson of the Cuban missile crisis . . . is that nuclear armaments create the perils they are deployed to prevent, but are of little use in resolving them.”

Although numerous books have been written about the Cuban missile crisis, Gambling with Armageddon ranks as the best of them.  Factually detailed, clearly and dramatically written, and grounded in massive research, it is a work of enormous power and erudition.  As such, it represents an outstanding achievement by one of the pre-eminent U.S. historians.

Like Sherwin’s other works, Gambling with Armageddon also grapples with one of the world’s major problems:  the prospect of nuclear annihilation.  At the least, it reveals that while nuclear weapons exist, the world remains in peril.  On a deeper level, it suggests the need to move beyond considerations of national security to international security, including the abolition of nuclear weapons and the peaceful resolution of conflict among nations.

Securing these goals might necessitate a long journey, but Sherwin’s writings remind us that, to safeguard human survival, there’s really no alternative to pressing forward with it.

 

 

 

 

 

 

 

Teach the History Behind "Emancipation" with the Primary Sources

From Harper's Weekly, July 4, 1863. At center is a reproduction of Mathew Brady's famous 1863 photograph of Peter's scars. 

 

 

 

Emancipation, directed by Antoine Fuqua and starring Will Smith, is loosely based on the life of an escaped slave in Louisiana, known in contemporary accounts as both Gordon and Peter, who fought in the Union Army helping to defeat the Confederacy and end slavery in the United States. Gordon/Peter is best known for an early photograph of keloid scarring on his back that was used by abolitionists to show the inhuman nature of enslavement. I found the movie dark and difficult to watch, both because it was filmed in black and white and because of its focus on the horrors of slavery and the Civil War. Slavecatchers killed and beheaded runaways; in Union camps surgeons amputated limbs from the wounded and deposited them in piles to be carted away; reconstructed battlefields were filmed covered with dead bodies. For teaching, my preferred movies on slavery and African American participation in the Civil War remain Gordon Parks’ Solomon Northup’s Odyssey (1984) and Edward Zwick’s Glory (1989). Both films did a better job with character development, and the Parks film did a much better job of portraying slavery as a work system and showing African American community and humanity under the direst circumstances. They provide more useful segments that can be played and discussed in classes.

 

The best thing about Fuqua’s Emancipation was its rediscovery of the life of Gordon/Peter, whose image has survived because of the well-known photograph but whose personal and heroic story was largely forgotten. Fortunately, there are at least three accessible contemporary reports that provide documentary coverage of his life and the battle of Port Hudson and that can be used in history classes: a Harper’s Weekly account from July 4, 1863 (429-430), a letter to the editor of the N.Y. Tribune dated November 12, 1863, and a New York Times article from June 14, 1863 that details the brutal treatment of enslaved Africans on Louisiana cotton and sugar plantations. The Harper’s Weekly report included three photographs: “Gordon as he Entered Our Lines,” barefoot and dressed in rags; “Gordon Under Medical Inspection,” showing the keloid scarring; and “Gordon in his Uniform as a U. S. Soldier.” In the movie, director Antoine Fuqua did a good job of building on these sources, although the parts about Peter’s family, the way he was ripped away from them, and their eventual reunification are fictional.

 

The Harper’s Weekly article was titled “A TYPICAL NEGRO.”

 

“WE publish herewith three portraits, from photographs by McPherson and Oliver, of the negro GORDON, who escaped from his master in Mississippi, and came into our lines at Baton Rouge in March last. One of these portraits represents the man as he entered our lines, with clothes torn and covered with mud and dirt from his long race through the swamps and bayous, chased as he had been for days and nights by his master with several neighbors and a pack of blood-hounds; another shows him as he underwent the surgical examination previous to being mustered into the service —his back furrowed and scarred with the traces of a whipping administered on Christmas-day last; and the third represents him in United States uniform, bearing the musket and prepared for duty.”

 

Harper’s Weekly credited Gordon with “unusual intelligence and energy” and described how he was able to “foil the scent of the blood-hounds who were chasing him” by rubbing “his body freely with these onions, and thus, no doubt, frequently threw the dogs off the scent.” According to the Harper’s Weekly story, Gordon initially acted as a guide for federal troops during the Louisiana campaign and was captured by Confederate forces and nearly killed. Left for dead, he managed to escape back to Union lines and entered the 1st Louisiana Native Guard, also known as the Corps d’Afrique. I believe the Corps d’Afrique was unique among African American units during the Civil War because its officers up to the rank of Captain were Black.

 

The New York Tribune published an extended letter signed by “Bostonian” and dated November 12, 1863. The letter claimed to correct misstatements in the Harper’s Weekly article and to

 

contradict the malicious falsehoods that have appeared in the Rebel organs of the North. No sooner had this heart-striking picture begun to circulate, and awaken a thrill of horror among the loyal and humane portion of the community, then the Copperhead press at once spit forth their poisonous venom, and boldly asserted that the whole story was a fabrication from beginning to end—the fruitful results of a fanatical Abolitionist’s deluded imagination.

 

In response,

 

the friends of freedom in New York and Boston have purchased "carte de visite" size photograph copies of the abused negro, as a faithful picture of the realities of Slavery as it exists in the southern States.

 

A “carte de visite” photograph was approximately 2 by 3.5 inches, about the size of a business card today.

 

In the letter, "Bostonian" claimed to have brought the original photographs to Harper’s Weekly after a March visit to Louisiana and he vouched for their “entire accuracy, as well for the truthfulness of the brief account of the outrages perpetrated upon the unoffending negroes which was published in connection with the pictures.”

In a correction to the Harper’s Weekly article, Bostonian explained that the three photographs published by the journal actually showed two different people. The person dressed in rags was a man named Gordon who was part of the group that escaped through the swamps together. The person with the lacerated back was named Peter. Peter barely spoke English because French was the language of the Louisiana planters and the people they held in bondage. When military officials interviewed him, Peter reported that his “master’s name is Captain John Lyon, cotton planter, on Atchafalaya River, near Washington, La” and that the scars were the result of a whipping “two months before Christmas.”

Bostonian ended the letter praising the men who

chased by "hunters" with their savage pack of hounds, but they were ingenious enough to wade and swim through every stream they could find on their way, twice swimming the turbid waters of the Amite River in their wanderings. Upon coming from the water, they had presence of mind and sagacity enough to rub every portion of their body with onions and strong-scented weeds, in order to elude the trail of the bloodhounds, who were several times close upon them. To their intelligence may be attributed their narrow and fortunate escape from the terrible fate that befell “Poor John” their companion.

On June 14, 1863, The New York Times ran a front-page article headlined “FROM THE MOUTH OF RED RIVER.; Gunboat and Army Movements on the Mississippi. Gen. Banks’ Investment of Port Hudson. IMPORTANT SUBSEQUENT EVENTS.” The article recounted

 

“Several parties of colored people have come down in canoes and flats -- having escaped on Monday and Tuesday last. They represent that the blacks are subjected to the greatest brutality by the enraged rebels, who find no other objects on whom to vent their chivalric spite. They run them down with horses; shoot them on the road, or tie and drag them with ropes at the tail of their horses toward the jail, which is now crowded so full that they cannot get any more inside the walls. All, white or black, who have shown any favor or countenance to the invading Yankees, have been arrested for punishment. Several are known to have been shot.”

 

Refugees reaching Union lines were interviewed about the conditions they were fleeing, and the bulk of the article reported on the treatment of enslaved Africans on Louisiana plantations. At least one planter threatened enslaved Africans that if they escaped to the Union lines "They would be sold to Cuba; be worked to death, or turned out to starve."

Professor Marcia Chatelain on McDonald's and MLK

To reflect on Martin Luther King Jr.’s legacy, celebrated on Monday, I spoke to Professor Marcia Chatelain about the interconnected rise of McDonald’s. Professor Chatelain is the Pulitzer Prize-winning author of Franchise: The Golden Arches in Black America.

Professor Chatelain and I discussed McDonald’s growth in Black communities following MLK’s death, corporations’ distortion of MLK’s legacy, and the long entrenchment of food inequity. A condensed transcript edited for clarity is below.

Ben: Professor Chatelain, thank you so much for being here.

MC: An absolute pleasure, Ben.

Ben: Of course—it's all downhill from here. To begin, who started McDonald’s and when?

MC: So Richard and Maurice McDonald were the founders of McDonald's, and they were two brothers from New Hampshire who moved to California during the Depression to figure out their lives. They worked in the movie industry, then they tried hotdog carts, then opened their first restaurant in San Bernardino, California, in 1945.

The restaurant served barbecue, and generally they had a relatively big menu. When the brothers saw that their burgers were bestsellers, they realized they could automate the production of burgers, fries, and drinks if they kept the menu simple. So they closed and reopened with a smaller menu. The idea was a success, but they didn’t expand very much. They opened a couple more restaurants in Southern California, and one or two in Arizona.

And then, in the mid-1950s, Ray Kroc, who people often think of as the founder of the franchise system, got involved. At the time, Kroc sold milkshake makers. He couldn’t understand why the McDonald brothers needed so many, and he went and checked out McDonald's and was like, what is this?

It really sparked his curiosity about the business, which is why he began working with the McDonald brothers in the 50s before purchasing the business outright in 1961. The McDonald brothers sold the restaurants to Kroc for $2,000,000—that's it! It's kind of bananas to think about.

Ben: It’s even more bananas when you consider that Ray Kroc also made a fortune selling rubber shoes later on.

MC: I don't think I knew that...

Ben: They’re called Crocs?

MC: Oh, I see what you did there.

Ben: So moving into the 1960s, how did the death of Martin Luther King Jr. factor into the expansion of McDonald's?

MC: So this really weird thing happens after MLK’s death.

In April 1968, MLK had been in Memphis talking about sanitation workers and economic boycott before he was killed. After he was assassinated on April 4th, McDonald's became one of many companies involved in a public racial reckoning, akin to what we experienced in 2020 in the United States. People asked: What did it mean for businesses to operate in Black communities, and what did it mean for them to give economic opportunities to Black entrepreneurs?

Conversations and studies among foundations, think tanks, and commissions revealed that economic and social problems plagued Black America. As advertising and marketing reports advised companies to target a growing market of Black consumers, all of a sudden, Black wealth building and the opening of businesses like a black-owned McDonald's became the fulfillment of MLK’s dream. 

So following King's assassination, McDonald's, along with a lot of major companies, started to recruit Black franchise owners. 

Ben: And this wasn’t exactly the fulfillment of MLK’s dream... as you write, “Almost immediately after he was laid to rest in his hometown of Atlanta, King’s death became inextricably tied to the advancement of capitalism, which he had believed ‘failed to meet the needs of the masses,’ and was on a par with the ‘evils of militarism and evils of racism.’”

That sounds about as evil as evils get. 

MC: Yeah, but corporate America was only too happy to turn up the volume on King’s alleged support of Black capitalism as a way to suppress his far more critical and complex ideology. That’s why he almost seamlessly became tied to the advocacy for programs like Black capitalism and Black entrepreneurship.

And the federal government, under Nixon, promoted the idea of Black capitalism, too. Relatedly, McDonald’s learned how profitable Black-owned franchises could be. Even though the number of Black franchises was small, throughout the 60s and 70s their margins were high. They were located in the urban core, serving a consumer market that visited multiple times a day because McDonald’s was a cheap, reliable food option, and many white franchise owners had fled cities for the suburbs.

So fast food companies like McDonald’s, which had good relationships with the White House, pushed the government to provide loans to Black franchisees. And Nixon was only too happy to comply and champion Black capitalism because it meant supporting individual business owners rather than addressing the systemic socioeconomic issues that had long held Black communities back.

Ben: That reminds me of another quote of yours: “The option of bartering civil rights for economic opportunity has been presented to African Americans for centuries. In exchange for silence, Black communities could acquire a plethora of resources.”

MC: Right, and I’ll add that people across the political spectrum, many Black people included, embraced Black capitalism because they felt it was the one strategy where they would see real, tangible outcomes; that, while policy initiatives often failed, you could go to a store or franchise opening and say that is a Black-owned McDonald's.

I say to my students all the time that it's very easy for us to be dismissive or make fun of people in the past because we actually know what happened now. But from the vantage point of a Black citizen in 1968, you can see how owning a McDonald's might’ve seemed super hopeful. There was so much potential. I liken it to if Mark Zuckerberg texted me right now and said you own Meta. Do whatever you want with it. I wouldn't be able to wrap my head around that level of wealth and power and access.

Ben: Eh, when comparing you and Elon Musk...

MC: Right, maybe I couldn’t do much worse owning Twitter. But all of this is to say that the opportunity of owning a McDonald’s was unfathomable at the time. And people thought if we open this McDonald's in this community, it'll provide good jobs, we’ll make money and we can support local youth programs and athletics.

It's not that people weren’t critical of companies like McDonald’s suddenly offering business opportunities in Black communities. Resistance to McDonald's came from the Black Panther Party, from local groups, and from other Black businesses that were trying to compete, all of whom were very suspicious and skeptical.

But from the vantage point of that moment—when considering the cautious optimism that Black people were emerging from years of strife and grief and loss—there was a lot of hopeful speculation about the opportunities that McDonald’s, which was incredibly powerful, could bring to this long-marginalized group of people.

Ben: Moving into the 80s, can you speak about how McDonald’s doubled down on its presence in Black communities through marketing campaigns?

MC: One of the most poignant parts of my research process was watching old McDonald's commercials all day. I went to the Paley Center for Media in New York and I was in tears. I couldn't figure out why, until I realized it was my whole childhood unfolding in front of me.

McDonald's was the leader in the type of advertising that we would call “ethnic” or “segmented” marketing: the use of Black celebrities, Black models, and Black athletes like Michael Jordan to sell the franchise. McDonald’s enlisted the services of Burrell Communications, a Chicago-based Black advertising and marketing firm, to create content for Black consumers.

Their efforts were really, really effective. Obviously, so much of advertising is about gender and class fantasies, and so whether it was first dates or family meals, McDonald's really knew how to aesthetically create a world that I think a lot of African American consumers wanted to be in. 

Ben: In Franchise, you describe one commercial, whose catchphrase was “a hamburger wallet and a beefsteak appetite.”

MC: Oh my gosh. They had this crazy product called a McSteak sandwich, and it was supposed to elevate the dining experience. The market research showed that adults didn't really love McDonald's; they just tolerated it because their kids liked it. The research also found that African American men especially didn't like eating there.

So they tried to create this fake steak sandwich that you could order on a date, and “onion nuggets” to go along with it. But the nuggets gave people really bad gas.

Ben: So funny. This calls to mind some of McDonald's other failed food creations, like the Hula Burger.

MC: Oh, so gross. It was just a piece of grilled pineapple with cheese melted on it and put in a bun.

Ben: Maybe they should’ve tried a “McCantaloupe” and used mayo to glue french fries to a melon.

MC: Perhaps.

Ben: Speaking of healthy food options (or not), how do you reflect on the connection between fast food and health disparities in the US today? To cite some stats from your book, Black children are at far greater risk of developing type 2 diabetes than white children, and as of 2015, Black citizens were 1.4 times more likely than white citizens to be obese.

MC: Yeah, people often think that when you write a book about race and fast food, there’ll be a lot of finger-wagging, saying it’s better to eat kale than burgers. I'm really not interested in that. What I'm interested in is helping us understand the racialized food system that exists today.

Since the late 60s, fast-food restaurants have been hyper-concentrated in the poorest and most racially segregated places. Fast food is often identified as the culprit for high rates of obesity, diabetes, and hypertension among the Black population. But in the public conversation about fast food, race, and health, we have to remember the centuries-old structural indifference to providing Black communities with nutritious foods. 

Sometimes, people will say to me, well, my family lived on the farm and we ate great, and then we came to the city and we ate poorly. And I'm like, that's really unlikely when we think about sharecropping and subsistence farming and the poverty that gripped people in the South. There has always been a long fight for diverse and robust and nutritionally balanced access to food for African Americans.

I don't blame McDonald's for that. But I think McDonald's is a symbol of the failures of the state to really take seriously how we're going to facilitate racial justice and all of the things that justice entails, from food to jobs to healthcare to education. When we abdicate the responsibility of the public good to corporations—when we blame Black citizens for eating at McDonald’s more than the structural lack of better food options for our communities—well, then more McDonald’s is what we get.

Ben: To quote you one last time, “Ultimately history encourages us to be more compassionate toward individuals navigating few choices, and history cautions us to be far more critical of the institutions and structures that have the power to take choices away.”

MC: Yes, and with the rise of Black Lives Matter, I think more and more people are starting to get it. They’re noticing the tendency for government and corporations to offer meaningless gestures that don’t address the origins of the rage and the disappointment that people have in a structure that has stayed unchanged for so many people.

Thankfully, too, people increasingly look at things like MLK weekend sales for sheets and guns and washing machines, and they’re like: you’ve got to be kidding. That is not the fulfillment of his dream.

Ben: A good concluding note. Professor Chatelain, thank you so much for your scholarship and for being here. It’s been a pleasure.

MC: Thank you.

Tue, 31 Jan 2023 21:01:30 +0000 https://historynewsnetwork.org/blog/154661 https://historynewsnetwork.org/blog/154661 0
Revisiting Kropotkin 180 Years After His Birth

Pyotr Kropotkin, 1842-1921

“Anarchism is an aspect of socialism (among many others) that those of us wishing socialism, or some comparable form of resistance, to survive will have to think about again, this time without a prearranged sneer.”

T.J. Clark, Farewell to an Idea

This December 9th marked 180 years since the birth of Pyotr Kropotkin (1842-1921), the great Russian anarchist, sociologist, historian, zoologist, economist, and philosopher. Now, of all times, we should be remembering, revitalizing, and creatively reconstructing his legacy.

One might assume that a 19th-century Russian anarchist would have nothing to say that could have real bearing on the world today, that his political philosophy, whatever relevance it might once have held, has long since been surpassed. I would venture another point of view: not only can we not justify confining Kropotkin to the history of ideas—or worse, the dustbin of history—he is a thinker who remains ahead of us, a thinker whose vision has yet to be truly realized. We have not yet caught up with Kropotkin, but there are indications that conditions more favorable to receiving his thought are on the horizon, and that a day may be approaching when we begin to see his ideas implemented on a scale that could radically transform our communities and, most especially, our workplaces. Kropotkin’s importance for us has only grown because material conditions, post-scarcity, and technological advances have made it possible, no doubt for the first time in history, to truly realize his vision of unfettered human creativity.

There is one chapter in The Conquest of Bread (1892) that I want to focus on because it may surprise those who are new to anarcho-communist political philosophy. The chapter is entitled "The Need for Luxury," and its thesis is quite a simple one: “After bread has been secured, leisure is the supreme aim.” The anarchist commune—or what is sometimes referred to today as “luxury communism”—recognizes “that while it produces all that is necessary to material life, it must also strive to satisfy all manifestations of the human mind.” We can agree with Aaron Bastani, who argues in Fully Automated Luxury Communism (2020) that “There is a tendency in capitalism to automate labor, to turn things previously done by humans into automated functions. In recognition of that, then the only utopian demand can be for the full automation of everything and common ownership of that which is automated.” Bastani is talking about using the levels of post-scarcity and automation that we’ve attained to finally usher in a society free of drudgery and toil, one in which the full range of tastes can be satisfied.

Given the multiple crises we are facing, the general name for which is global capitalism, how should we answer the question famously posed by Lenin, “What is to be done?” There are at least three basic principles that can be derived from the work of Kropotkin and that can and should strategically guide us as we move forward. The first is ending the tyranny of private property, which has produced greater economic inequality today than we have ever seen in the history of the world. The concentration of capital has produced a condition in which a handful of individuals possess wealth exceeding the combined wealth of the billions of people who share this planet.
So, as the great French philosopher Alain Badiou has also reiterated, our first principle must be that of collectivism in opposition to the dictatorship of capital: “It is not a necessity for social organization to reside in private property and monstrous inequalities.”

The second principle involves democratizing our workplaces through worker self-management, or more precisely through what the economist Richard Wolff calls "worker self-directed enterprises" – in a word, economic democracy. Experiments with non-traditional, non-hierarchical firms have largely met with success. Perhaps the greatest example is Spain’s Mondragon Corporation, but there are many others. We are well past the stage of asking ourselves whether such non-capitalist forms of organization can succeed and be competitive. It has been amply proven that they indeed can.

The non-capitalist reorganization of our workplaces would undoubtedly improve the condition of workers, which is under assault around the world. In many countries, union leaders are routinely threatened with violence or murdered. Indeed, the International Trade Union Confederation reports that 2019 saw “the use of extreme violence against the defenders of workplace rights, large-scale arrests and detentions.” The number of countries that do not allow workers to establish or join a trade union increased from 92 in 2018 to 107 in 2019. In 2018, 53 trade union members were murdered, and in 52 countries workers were subjected to physical violence. In 72 percent of countries workers have only restricted access to justice, or none at all. As Noam Chomsky observed, “Policies are designed to undermine working class organization and the reason is not only the unions fight for workers' rights, but they also have a democratizing effect. These are institutions in which people without power can get together, support one another, learn about the world, try out their ideas, initiate programs, and that is dangerous.”

And third, it is time we recognize, as Badiou put it two weeks after the election of Trump, “that there is no necessity for a state in the form of a separated and armed power.” The principle of free association as opposed to the state is one that anarchism has long advocated. But we need to be clear here: anarchism is usually taken to mean, if anything, opposition to all government or to government as such. In fact, this is a mistakenly one-sided view of anarchism, and it certainly does not represent a nuanced understanding of Kropotkin, who made a clear and sharp distinction between government and the state. Anarcho-communism is opposed to the state inasmuch as it represents centralized power in the hands of a few, hierarchical relationships, and class domination. But Kropotkin was not necessarily opposed to a condition of society in which certain elements of decentralized community government remain. Martin Buber underscored this point: Kropotkin’s “‘anarchy,’ like Proudhon’s, is in reality ‘anocracy’; not absence of government, but absence of domination.” The distinctive feature of anarchist programs is not that governments are excluded from the process and have no meaningful contribution to make. The essential characteristics are voluntarism, antiauthoritarianism, the decentralization of political authority, worker self-management (economic democracy), and in general a tendency to address social problems from the bottom up, rather than by imposing solutions from the top down.
Kropotkin was one of Russia’s finest minds, and among those most dedicated to ideals of which we are in danger of completely losing sight. There is no better time than now to salvage the very best of Russian thought, and to reaffirm its universality and its inherently critical posture towards authoritarianism and the self-destructive pursuit of power through violence.

Tue, 31 Jan 2023 21:01:30 +0000 https://historynewsnetwork.org/article/184752 https://historynewsnetwork.org/article/184752 0
The Roundup Top Ten for January 13, 2023

As a History of Insurrection, the January 6 Report is a Mess

by Jill Lepore

The Committee delivered a potent indictment of Donald Trump's responsibility for the events of January 6, but shed little light on the origins or the future of the antidemocratic insurrection and failed to tell a compelling story about what happened. 

DeSantis's New College Coup Will Fail

by Adam Laats

Transforming colleges along ideological lines is much more difficult than amassing political power or appointing allies to governing boards. Conservatives are able to operate successful and ideologically friendly institutions when they accept that they will occupy a niche, not change the ecosystem.

The Black Widows' Struggle for Civil War Pensions

by Hilary Green

Black women's struggles to claim pensions earned by their late husbands' service in the Union Army reflected the incomplete realization of freedom after emancipation and the intrusive controls the pension system and growing administrative state placed on Black families. 

A Profession, If You Can Keep It

by Erin Bartram

"Let’s be clear: this “burnout” that secure scholars are feeling is phantom pain where their colleagues should be.... You are suffering from the effects of intentional systemic understaffing."

50 Years at Cook County Hospital Prove Abortion is Healthcare

by Amy Zanoni

Abortion rights activists have focused on horror stories of the pre-Roe era as cautionary tales, but the history of public hospitals since Roe shows that real reproductive freedom requires expanded access to care and a robust social safety net. 

Despite Aggressive Rebrand, Charles Koch is Still Fighting Against Democracy

by Nancy MacLean and Lisa Graves

The media have latched on to Charles Koch's recent expressions of regret over partisanship. But this is a rebranding, not a redirection. 

Will Ukraine Be the Death of German Pacifism?

by Stephen Milder

The real transformation wrought in Europe by the Russian invasion isn't the return of war (which was certainly present in the 1990s) but the turn of Germany away from a post-fascist pacifist posture to a potential remilitarization. 

Israel's Ruling Coalition Turns Toward Theocracy

by Bernard Avishai

Netanyahu has engineered an alliance among three disparate strains of religious parties and secular Israelis favoring an aggressive nationalism and occupation policy. But the religious parties have broader goals of undermining secular society in Israel. 

Oil and Spills Have Always Gone Hand in Hand

by Nolan Varee

Transporting toxic substances quickly over long distances to market will inevitably produce spills. Though the technology of oil transport has changed, this essential fact remains unchanged, and it will remain so as long as regulation treats the risk as an acceptable part of the business.

John Fetterman and the Politics of Disability

by Anya Jabour

The personal became political for a historian who experienced disability similar to that affecting the new Pennsylvania senator on the campaign trail. The media must significantly readjust its framing of how disability impacts the ability to perform work. 

Tue, 31 Jan 2023 21:01:30 +0000 https://historynewsnetwork.org/article/184778 https://historynewsnetwork.org/article/184778 0