Breaking News — History News Network (Fri, 01 Nov 2024)

Josh Hawley Earns F in Early American History

Josh Hawley has a history degree from Stanford, where he wrote a thesis on the righteousness of Theodore Roosevelt, graduated with honors, and was remembered as “a serious scholar of the Constitution.” It’s reasonable, then, to assume that he is familiar with the basic premises of the American experiment. Yet Hawley is also a Republican politician in an era that has seen his party mount a determined assault on the honest teaching of American history, on everything from race to foreign policy. So the relentlessly ambitious senator from Missouri has chosen to toss aside his learning in favor of a right-wing ideological fantasy and the political rewards he hopes will flow from it.

That’s the best explanation for the misinformation that Hawley disseminated on July 4, when he tried to turn the 247th anniversary of American independence into a fact-free celebration of Christian nationalism.

Hawley’s 1.4 million Twitter followers were offered up a supposed pronouncement from one of the most outspoken advocates of American independence, Patrick Henry: “It cannot be emphasized too strongly or too often that this great nation was founded, not by religionists, but by Christians; not on religions, but on the Gospel of Jesus Christ. For this very reason, peoples of other faiths have been afforded asylum, prosperity, and freedom of worship here.”

The problem is that Patrick Henry never uttered those words. While he was at times in disagreement with fellow Virginians Thomas Jefferson and James Madison on precise questions of separating church and state, Henry is today recalled not only for his “Give me liberty or give me death!” rhetoric but also for his reflections on the value of a “general toleration of Religion.” Hawley’s inaccuracy was immediately called out by historians, religious scholars, and patriots of varying political tendencies. A clarification was attached to the senator’s tweet, which explained, “Patrick Henry never said that. This is a line from a 1956 piece in The Virginian that was about Patrick Henry, not by him.” With it came a link to the Fake History website, which explained why the attribution to Henry was especially “puzzling”:

The language is twentieth-century. The word “religionists,” for example. In Patrick Henry’s time, it meant a fanatic, a person obsessed with religion; not as here people of different religions (or something like that). The piece looks back on the founding of “this great nation” (would Patrick Henry really have used that phrase?) as something in the past, and it seems to know that “peoples of other faiths” are going to be “afforded asylum, prosperity and freedom of worship” in it. It’s wrong historically, and it’s wrong linguistically.

Presumably, a “serious scholar of the Constitution” should have recognized the disconnect. Or, at the very least, should have been embarrassed when it was pointed out. Not Josh Hawley. A week after the incident, the tweet was still featured on his Twitter timeline, where it had been viewed 3.4 million times. Worse yet, Hawley responded to rebukes from historians with the snarky observation, “I’m told the libs are major triggered by the connection between the Bible and the American Founding.” To support the latter assertion, Hawley featured quotes from John Quincy Adams and Daniel Webster. The problem, of course, is that Adams was just 8 years old when independence was declared, while Webster was born six years after the Declaration was made—making both men flawed as exemplars of the founding circle.

....

What was the source of Hawley’s misinformation? The origins of the misquote, which has circulated for years in Christian nationalist publications, can be traced to that 1956 article in The Virginian, a segregationist-era publication that Willamette University history professor Seth Cotlar has described as “virulently antisemitic & white nationalist.”

https://historynewsnetwork.org/article/185905
Does Germany's Holocaust Education Give Cover to Nativism?

Sometime in the 2000s, a group of mostly Turkish women from an immigrant group called Neighborhood Mothers began meeting in the Neukölln district of Berlin to learn about the Holocaust. Their history lessons were part of a program facilitated by members of the Action Reconciliation Service for Peace, a Christian organization dedicated to German atonement for the Shoah. The Neighborhood Mothers were terrified by what they learned in these sessions. “How could a society turn so fanatical?” a group member named Nazmiye later recalled thinking. “We began to ask ourselves if they could do such a thing to us as well . . . whether we would find ourselves in the same position as the Jews.” But when they expressed this fear on a church visit organized by the program, their German hosts became apoplectic. “They told us to go back to our countries if this is how we think,” Nazmiye said. The session was abruptly ended and the women were asked to leave.

There are a number of anecdotes like this in anthropologist Esra Özyürek’s Subcontractors of Guilt, a recently published study of the array of German Holocaust education programs dedicated to integrating Arab and Muslim immigrant communities into the country’s ethos of responsibility and atonement for Nazi crimes. As Özyürek shows, those who pass through these programs often draw connections their guides do not intend—to nativist violence in contemporary Germany, or to the bloody circumstances they fled in Syria, Turkey, and Palestine. For many Germans, the anxieties these historical encounters stoke for migrants are, in Özyürek’s words, the “wrong emotions.” One German guide who leads concentration camp tours recalled being “irritated” by members of immigrant tour groups voicing the fear that “they will be sent there next.” “There was a sense that they didn’t belong here, and that they should not be engaging with the German past,” the guide said. To be really German, they were supposed to play the part of repentant perpetrators, not potential victims.

This expectation has become the basis for what scholars Michael Rothberg and Yasemin Yildiz have called the “migrant double bind.” In this paradigm, the core of contemporary “Germanness” is found in a certain sensitivity to antisemitism, conferred through a direct, likely familial relationship to the Third Reich. Migrants and racialized minorities are expected to assume the per­petrators’ legacy; when they fail, this is taken as a sign that they do not really belong in Germany. In other words, in a paradox typical of the upside-down dynamics surrounding Jews, Arabs, and Germans in contemporary Germany, a questionably conceived anti-antisemitism has become the mechanism for keeping Germanness Aryan.

These dynamics are largely absent from the mainstream story about memory culture in Germany, which in recent decades has cemented its reputation as a paragon of national reckoning. For The Atlantic’s December 2022 cover story, poet and scholar Clint Smith traveled to Germany to see for himself what the country’s atonement process might teach the United States about confronting its own history of racist atrocity. In the piece’s final line, he appears to give the Germans an A for effort: “It is the very act of attempting to remember that becomes the most powerful memorial of all.” Smith is far from the only one to come away impressed by Germany’s example; from Canada to Britain to Japan, observers have looked to Germany as a model for how to contend with their own nations’ crimes. As Andrew Silverstein reports in this issue, Spanish memory activists seeking to jump-start their country’s internal reckoning with the violence of Francisco Franco’s fascist dictatorship have adopted the German practice of installing “Stolpersteine,” or remembrance stones, in the street.

Germany’s commitment to memory is undeniably impressive; no other global power has worked nearly as hard to apprehend its past. Yet while the world praises its culture of contrition, some Germans—in particular, Jews, Arabs, and other minorities—have been sounding the alarm that this approach to memory has largely been a narcissistic enterprise, with strange and disturbing consequences. The leftist German Jewish writer Fabian Wolff argued in a viral 2021 essay that Germany’s attachment to the past had diminished the space for Jewish life in the present: Germans have no place for “Jewish life [that] exists outside their field of vision and their way of knowing,” he wrote, or for “Jewish conversations about Jewish issues [that] have a meaning beyond and apart from what these Germans themselves think or would like to hear.” 

https://historynewsnetwork.org/article/185892
"Car Brain" Has Long Normalized Carnage on the Roads

Francis Curzon, born in 1884 and later named the fifth Earl Howe, loved a souped-up Bugatti. And he loved to drive fast. He was famous for his “great skill and daring” on the racetrack, and also, eventually, for crashing into pedestrians—knocking down a boy in Belfast, Northern Ireland; slamming into a horse-drawn cart and killing a peasant in Pesaro, Italy.

These incidents (and 10 more) were recounted in a 1947 polemic by J. S. Dean, chair of the Pedestrians’ Association in England. Dean took particular issue with an assertion the earl had once made that the “recklessness” of pedestrians was the main safety problem on Britain’s roads. People who drive cars, Dean pointed out, do consider themselves to be “pedestrians” in other situations—that is, when they themselves are walking—and they agree that safety laws are important. Still, no matter what they may say, they continue to do whatever they want. Dean asked: “What are we to do with these people with their split minds?”

If the term had been available to him, he might have used the pejorative car brain to describe the conundrum he was observing. In the past five years or so, the term has become a common joke in left-leaning online spaces devoted to public transportation and urban planning, including the Facebook group “New Urbanist Memes for Transit-Oriented Teens.” Car brain also appears daily in the even more explicit Reddit forum r/fuckcars (404,000 members). It describes both a state of mind (“you’re car-brained”) and a type of person (“she is a car brain”). Obviously, the term is rude and very smug—in the same vein as the guys who wear “one less car” T-shirts while riding their bikes. But there is also something true about it: Reason is failing in the face of the majestic automobile. People make excuses for cars and remain devoted to them, despite the incontrovertible evidence that they’re extremely dangerous.

This is an unresolvable tension of life in the United States. It’s been that way as long as there have been cars to drive and crash, and it’s especially notable now. An estimated 46,270 people were killed by cars last year. In 2019, deaths numbered 39,107. Car deaths started to spike drastically in 2020, a phenomenon that some at first ascribed to one of the many riddling consequences of the pandemic. Americans were driving much less than usual in the early days of COVID, but those who did take their cars out were found to be driving more recklessly and even faster than they were before, perhaps because everyone was simply more anxious, or perhaps because the roads were more open and people felt free to speed, or perhaps the threat of a deadly virus made other threats seem less consequential. Those explanations became less convincing, however, as pandemic restrictions faded yet car fatalities continued to rise. The number of people killed by cars in 2022 was 9 percent higher than in 2020.

Of course, one problem with these numbers is the simple fact that cars are necessary. Americans have to get places, and in much of the country there is no other way to do that. Sometimes, becoming “car-brained” is just what you have to do to get through the day without constant dread. I grew up in a rural area, and was happily car-brained as I commuted to my job at the mall. Now I’ve been living in New York City for the better part of a decade and am rarely in a car. I find myself acutely terrified by the idea; I feel sharp, pit-in-the-stomach anxiety whenever a phone call to a family member produces the knowledge that they will soon be driving somewhere. Yet I still love cars. I plan imaginary road trips as I fall asleep. I sigh with envy when I see someone pull into a Wegmans parking lot. I used to have a red Hyundai Elantra; when I say Hyundai Elantra, I say it like I am saying the name of the one who got away.

A new study attempts to model the confusion I’m feeling. Co-authored by Ian Walker, an environmental-psychology professor at Swansea University, in Wales, the preprint is titled “Motonormativity: How Social Norms Hide a Major Public Health Hazard.” It was based on survey data collected in the U.K., but nonetheless has some relevance: Walker and his team created pairs of questions designed to suss out the existence of a pro-car bias in society. The questions range from clever to somewhat chin-scratching. For instance, should people smoke cigarettes in highly populated areas where other people would have to breathe in the smoke? Forty-eight percent of respondents strongly agreed that they should not. Should people drive cars in highly populated areas where other people would have to breathe in the exhaust fumes? Only 4 percent strongly agreed that they should not. If you leave your car in the street and it gets stolen, is it your fault? Eighty-seven percent said no. If you leave anything else in the street and it gets stolen, is that your fault? Forty percent said yes.

https://historynewsnetwork.org/article/185888
Hawley's Use of Fake Patrick Henry Quote a Revealing Error

Among the prominent Republicans on Capitol Hill who have been caught peddling fake historical quotes, Sen. Rand Paul tends to be in a league of his own. The Kentucky Republican has, after all, been caught pushing fake quotes from Thomas Jefferson, James Madison, Patrick Henry, Abraham Lincoln, and George Washington.

But Paul is not without competition. HuffPost highlighted a related senatorial misstep from yesterday:

Sen. Josh Hawley (R-Mo.) is coming under fire for a Fourth of July tweet that managed to include both a false claim and a false quote at the same time. Hawley tweeted a quote he claimed to be from Founding Father Patrick Henry saying the United States was founded “on the Gospel of Jesus Christ.” Just one problem: Henry ― a slave owner perhaps best remembered for his “give me liberty or give me death” quote ― never said it.

It was late in the afternoon on the Fourth of July holiday when the Missouri Republican published this tweet — which, as of this morning, has not been taken down — with a purported quote from Patrick Henry, a prominent figure from late-18th century Virginia.

“It cannot be emphasized too strongly or too often that this great nation was founded, not by religionists, but by Christians; not on religions, but on the Gospel of Jesus Christ,” the quote read. “For this very reason, peoples of other faiths have been afforded asylum, prosperity, and freedom of worship here.”

What Hawley should’ve realized before promoting the quote is that Henry didn’t say it. The line was reportedly published instead by a white nationalist publication in 1956 — more than a century and a half after the founding father’s death.

On the surface, it’s obviously unfortunate to see a senator — a graduate of Stanford and Yale — make a mistake like this, especially as so many other Republicans also fall for fake quotes.

But let’s not brush past the underlying point the Missouri Republican was trying to make by way of a made-up line: Hawley seems certain that the United States was founded as a Christian nation, with members of one faith tradition — his own — enjoying exalted status over others.

https://historynewsnetwork.org/article/185882
Health Researchers Show Segregation 100 Years Ago Harmed Black Health, and Effects Continue Today

From the choice of schools to safety to access to green spaces and healthy food, the neighborhood where a child is raised can play a determining role in their future health. And because structural racism can systematically silo nonwhite people in certain neighborhoods, those local factors shape the health of millions of people of color in the United States. Now, census data link Black children’s neighborhoods and mortality rates in the early 20th century, exposing segregation’s devastating impact on health more than 100 years ago.

The study shows segregation drove racial health disparities “not just today, but [also] in the past,” says New York University community psychologist Adolfo Cuevas, who was not involved in the work.

John Parman, an economist at the College of William & Mary, says the new results are striking because they document these impacts even before the Jim Crow era took shape in the late 19th century, legalizing and enforcing racial segregation in ways that are known to have exacerbated health inequities.

A growing body of evidence has shown that, today, neighborhoods with majority nonwhite residents tend to have poorer health—the result of many accumulated social and environmental inequalities such as systematic overcrowding, higher noise levels due to industrial projects, and exposure to toxic hazards. But how early such residential segregation began to affect health was not clear, says J’Mag Karbeah, a health services researcher at the University of Minnesota (UM) who led the new study. “What is really missing is this crucial period post-Emancipation and before the formalization of Jim Crow legislation,” she says.

So Karbeah and J. David Hacker, a demographic historian at UM, set out to correlate early segregation with child mortality, a proxy for the health of the entire population. “If you don’t have a healthy young population, you won’t have healthy working-age adults, [and] you will not have healthy seniors,” Karbeah says. “It’s really predictive of the quality of your society in the next 40 or 50 years.”

The researchers used census data from 1900 and 1910 that were recently processed by the Minnesota Population Center at UM. The lists, which together cover about 168 million people, include information on literacy, race, and whether the individual lived in a rural or urban area. Census takers also asked each surveyed woman who had ever been married how many children she had given birth to and how many were still alive.

From the data, Karbeah and Hacker reconstructed the number of children born in the 5 years before each census to arrive at a sample of nearly 4.7 million Black and white children. Focusing on the South because 90% of the Black population resided there at the time, they compared the mortality rates for Black and white children. They also calculated the spatial distribution of houses headed by Black or white people as a measure of segregation.

The largest mortality gap was in Savannah, Georgia, in 1910, where Black children were 3.2 times more likely to die than their white counterparts, with almost half dying before age 5. To tease out the influence of segregation, the team controlled for socioeconomic and other variables such as literacy, occupation, and unemployment. They found that in 1910, neighborhood segregation as much as doubled the mortality gap between Black and white children in cities, the team reported last week in Population, Space and Place.

Although the researchers couldn’t explain exactly how segregation affects child mortality, children are extremely vulnerable to environmental pollutants as well as to poor sanitation, Karbeah says, all of which tend to go hand in hand with housing segregation. “These are the populations that have been most impacted by these inequities within the neighborhood environment,” she says.

https://historynewsnetwork.org/article/185867
Understanding the Leading Thinkers of the New American Right

For more than half a century, the luminaries of the mainstream American right had a clear mission and sense of where they came from. If liberals were fixated on quixotic schemes for building a perfect society, conservatives would be on hand to do the sober work of defending liberty against tyranny. Conservatives traced their roots to 1790, with the British statesman Edmund Burke’s warnings about the dangers of revolution and his insistence on the contractual relationship between the inherited past and the imagined future. They counted the English philosopher Michael Oakeshott and the Austrian émigré economist Friedrich Hayek as ancestors and viewed public intellectuals, such as the American writer William F. Buckley, Jr., and people of action, such as British Prime Minister Margaret Thatcher and U.S. President Ronald Reagan, as fighters for the same cause: individualism, the wisdom of the market, the universal yearning for freedom, and the conviction that solutions to social problems will bubble up from below, if only government would get out of the way. As Barry Goldwater, the Arizona senator and forefather of the modern Republican Party, put it in The Conscience of a Conservative, in 1960, “The Conservative looks upon politics as the art of achieving the maximum amount of freedom for individuals that is consistent with the maintenance of the social order.”

Over the last decade, however, this account has given way to an alternative reading of the past. For a vocal cohort of writers and activists, the real conservative tradition lies in what is sometimes called “integralism”—the weaving of religion, personal morality, national culture, and public policy into a unified order. This intellectual history no longer reflects the easy confidence of a Buckley, nor does it advance an argument, formed primarily in conversation with the American founders, for government resting on a balance-of-powers constitution and enabling a free citizen’s pursuit of happiness. Instead, it imagines a return to a much older order, before the wrong turn of the Enlightenment, the fetishizing of human rights, and the belief in progress—a time when nature, community, and divinity were thought to work as one indivisible whole.

Integralism was born on the Catholic right, but its reach has transcended its origins, now as an approach to politics, law, and social policy known to its promoters as “common-good conservatism.” In states such as Florida and Texas, its worldview has informed restrictions on voting access, curbs on public school curricula dealing with race and gender, and purges of school libraries. Its legal theory has shaped recent Supreme Court decisions that narrowed the rights of women and weakened the separation between religion and public institutions. Its theology has lain behind the bans on abortion passed by nearly half of U.S. state legislatures. Its proponents will be present in any future Republican presidential administration, and in their fight against liberals and cosmopolitans, they are more likely than earlier American conservatives to look for allies abroad—not on the British or European center-right but among newer, far-right parties and authoritarian governments committed to unraveling the “liberal order” at home and abroad. “They hate me and slander me and my country, as they hate you and slander you and the America you stand for,” Hungarian Prime Minister Viktor Orban told a crowd last year in Dallas, at the annual Conservative Political Action Conference, a gathering of conservative activists, politicians, and donors. “But we have a different future in mind. The globalists can all go to hell.”

For all these reasons, reading right-wing philosophers is the first step toward understanding what amounts to the most radical rethinking of the American political consensus in generations. Theorists such as Patrick Deneen, Adrian Vermeule, and Yoram Hazony insist that the United States’ economic ills, its political discord, and its relative decline as a world power spring from a single source: the liberalism that they identify as the dominant economic, political, and cultural framework in the United States since World War II and the model that the country has spent the better part of a century foisting on the rest of the globe. Yet these ideas also point toward a deeper change in how conservatives diagnose their country’s troubles. On the American right, there is a growing intuition that the problem with liberal democracy is not just the adjective. It is also the noun.

https://historynewsnetwork.org/article/185863
Want to Understand the Internet? Consider the "Great Stink" of 1858 London

For years, a primary metaphor for the internet has been the “town square,” an endless space for free expression where everyone can have their say. But as scaled digital platforms have grown to dominate most of modern life, metaphors centered solely on speech have failed to explain our current civic dysfunction.

Perhaps the better way to understand the internet is to compare it to a much older infrastructure problem: citywide sanitation systems. Posted content is akin to water; websites and other interfaces are analogous to pumps; and unintended feedback loops correspond to risk of infection. A public-health framework for understanding the internet would focus not on online information itself but on how it is generated, spread, and consumed via digital platforms.

This model’s genesis lies in the two-century-old story of early advocates for clean water in Victorian England. At the time, the life-threatening diseases that ravaged cities—cholera, typhus, tuberculosis, and scarlet fever—were not new. What were new were modern living conditions. Infections that might have taken weeks to spread through a village suddenly ravaged whole populations within days, and no one understood what was causing the massive outbreaks.

The Victorian working classes knew whom to blame when disease broke out: doctors. Mobs assaulted members of the medical establishment, leaving government officials unsure how to weigh the safety of physicians against the public interest. Why the rage? The traditional response to disease—quarantines—had become ineffective in industrialized cities, prompting the public to distrust those who profited from treatment.

The first serious approach to the problem was taken by a coalition of doctors, liberal advocates, and social reformers starting in the 1830s. Known as miasmists, they pushed the idea that noxious air was the culprit in epidemics. If a neighborhood could not pass the smell test, the argument went, one immediately knew it was already too late to be saved.

Miasmists, including prominent ones such as Florence Nightingale, have an ambivalent legacy. They were among the first to emphasize that disease had not just biological but also social and economic causes, a crucial insight. But simultaneously, they were dead wrong about the role of air in the spread of the common diseases of the time, a reflection of an elitist worldview and overprescribed morality.

This tension revealed itself during two key events. One was the Great Stink of 1858, in which a combination of hot weather and poor waste disposal transformed the Thames into a cesspool. The stench was so bad that even the curtains of the houses of Parliament had to be caked with lime. No one was safe from the foul air, and by the miasmists’ assumptions, that meant that no one was safe from disease. But, in fact, no major outbreak followed the Great Stink.

Second was the groundbreaking work of a brilliant doctor, John Snow, who had suspected for years that water (not air) was the actual cause of urban epidemics. In a painstaking natural experiment, Snow demonstrated that the Broad Street pump was the source of the 1854 cholera epidemic in the Soho area of London. His data revealed that residents living across the city became sick if they happened to get water from the pump, even while a nearby brewery that drew its water from a different source had no recorded cases. There was no other reasonable explanation: Some as-yet-undiscovered mechanism, localized at the pump, was responsible for infection. Though Snow was careful to frame his results so as not to explicitly reject the miasma theory, the implications were obvious.

https://historynewsnetwork.org/article/185860
As More Schools Ban "Maus," Art Spiegelman Fears Worse to Come

Right-wing culture warriors pushing restrictions on classroom instruction sometimes defend these measures by insisting that they avoid targeting historically or intellectually significant material. In their telling, these laws restrict genuinely objectionable matter — such as pornography or “woke indoctrination” — while sparing material that kids truly need to learn, even if it’s controversial.

A new fracas involving a school board in Missouri will test this premise. The controversy revolves around Art Spiegelman’s graphic novel about the Holocaust, and it indicates that those seeking to censor books seem oddly unconstrained by the principle that they are supposed to avoid restricting important, challenging historical material.

“It’s one more book — just throw it on the bonfire,” Spiegelman told me ruefully, suggesting the impulse to target books seems to have a built-in tendency to expand, sweeping in even his Pulitzer-winning “Maus” under absurd pretenses.

“It’s a real warning sign of a country that’s yearning for a return of authoritarianism,” Spiegelman said.

The board in Nixa, a small city south of Springfield, will debate the fate of “Maus” this month. The Springfield News-Leader reports that district employees flagged the book during a review conducted under a Missouri law that makes it illegal to provide minors with sexually explicit material.

It’s not yet clear what the employees found objectionable. But “Maus” — which illustrates Spiegelman’s parents’ experience of the Holocaust and features Nazis as cats and Jews as mice — graphically depicts his mother naked in a bathtub after taking her own life.

“She was sitting in a pool of blood when my father found her,” Spiegelman said of his mother. It is a “rather unsexy image seen from above,” he noted, and “not something I think anybody could describe as a nude woman. She’s a naked corpse.”

....

The repeated targeting of “Maus” over alleged sexual content, Spiegelman lamented, is a mere pretext. “It was the other things making them uncomfortable, like genocide,” he said. “I just tried to make them clean and understandable, which is the purpose of storytelling with pictures.”

https://historynewsnetwork.org/article/185844
PEN Condemns Censorship in Removal of Coates's Memoir from AP Course

PEN America responded today to the removal of Ta-Nehisi Coates’ acclaimed memoir Between the World and Me from an advanced placement course in South Carolina, calling it “an outrageous act of government censorship.”

As has been reported, earlier this spring students in a Chapin High School classroom reported a teacher for including Coates’ memoir and two related short videos in her argument essay unit. The unit, designed as preparation for the AP Language test, which is accepted for credit by many colleges, included questions such as: “Do you think racism is a pervasive problem in America? Why or why not?”

Several students wrote to the school board about the class, saying it made them feel “ashamed to be Caucasian” and “in shock that she would do something illegal like that…I am pretty sure a teacher talking about systemic racism is illegal in South Carolina.” South Carolina passed an educational gag order last year that banned “divisive concepts” related to race and sex.

In response, Jeremy C. Young, PEN America’s Freedom to Learn program director, released the following statement:

“This is an outrageous act of government censorship and a textbook example of how educational gag orders corrupt free inquiry in the classroom. In a course designed to be taught at the college level, students complained that a teacher assigning a National Book Award-winning volume about race was ‘illegal in South Carolina.’ Instead of defending the teacher’s right to teach this material, the school board sided with the students and censored the curriculum.

As with the AP African American Studies course in Florida – which also involved the politically-motivated removal of Ta-Nehisi Coates’ writings – government attempts to limit discourse and expression in high school classrooms have spilled over into early college programs. Educational gag orders in South Carolina and elsewhere are doing exactly what they are designed to do: censor teachers who dare to discuss race and gender in class. Anyone who believes these laws are harmless, or are designed to prevent indoctrination, should look no further than the Lexington-Richland School District to realize what these laws truly are: a license to silence students’ education.”

https://historynewsnetwork.org/article/185842
Teaching Hard Histories Through Juneteenth

Editor’s note: Since the publication of this article, Juneteenth was declared a federal holiday in 2021.

Each year around June 19, Black communities across the country unite for a family reunion of sorts. Juneteenth activities feature the sights and sounds of Blackness: People enjoying art, music and food that connect them to a shared ancestry and history. They celebrate being their authentic selves. They celebrate freedom in both solemn and festive ceremonies.  

This celebration marks a day in 1865 when enslaved Texans learned they’d be free—two months after Robert E. Lee surrendered and ended the Civil War and two and a half years after President Abraham Lincoln issued the Emancipation Proclamation. Initially a uniquely Texan observance, Juneteenth has now been recognized in some form in every corner of the country.

There are many ways to teach students about this celebration. Lessons about Juneteenth need to recognize the challenges those who fight injustice have always faced, but they shouldn’t be marked only by the tragedy of enslavement. Students, particularly Black students, can find empowerment in the jubilant celebrations of culture, activism and the humanity of a people.

Although the truth had been hidden from them—and they continued to face threats of oppression, violence and death—formerly enslaved people rallied around that date a year after learning of their freedom and made the celebration an annual ritual. Early Juneteenth observances included a search for lost family members and an opportunity to uplift each other as they moved through hostile environments. 

With this knowledge, students can also identify ways the descendants of the enslaved recapture and honor the cultures, customs and practices lost through slavery. 

Early celebrations involved readings of the Emancipation Proclamation, religious ceremonies, singing, games and enjoying foods that enslaved people ate. Today, it doesn’t look that much different. People retell histories, have family reunions, eat foods reminiscent of early Juneteenth celebrations such as barbeque, attend religious services or choir performances and have elaborate displays such as fancy dress and parades. 

That’s why Juneteenth is more than an observance of freedom. It’s also a time to share the experiences of those who fought—literally and figuratively—to seek true freedom for future generations. It’s important that we don’t whitewash this history. 

A common mistake among those who teach the history of American slavery is to center the U.S. government’s role in granting freedom while also placing the onus to navigate through a racist society solely on the formerly enslaved.

Perhaps many center Lincoln in this history because we tend to think of the Emancipation Proclamation, instead of the 13th Amendment, as ending slavery. Our 2018 Teaching Hard History report found that 59 percent of high school students couldn’t correctly identify the latter as the legal end to slavery in the United States. 

But it’s important for students to know that enslaved people didn’t willfully accept enslavement or wait for others to free them. They resisted often and consistently. While rare, violent rebellions did occur. Some people successfully escaped enslavement. And everyday acts of resistance, such as breaking tools or pretending to be ill were other ways enslaved people asserted their humanity. 

https://historynewsnetwork.org/article/185839
Arkansas Libraries and Booksellers Sue over State Book Restrictions

Arkansas booksellers and librarians have filed a lawsuit challenging a new state law that would punish them with up to a year in prison for providing “harmful” books to people under age 18.

The suit challenges two parts of the law, Act 372, which is due to take effect Aug. 1. One section makes it a criminal offense to knowingly lend, make available or show to a minor any material deemed “harmful” to them. State law defines material as “harmful to minors” if it contains nudity or sexual content and appeals to a “prurient interest in sex,” “lacks serious literary, scientific, medical, artistic or political value for minors” and if, by current community standards, it is “inappropriate for minors.”

The plaintiffs also contest a section of the law that requires county and municipal libraries to establish written guidelines for the “selection, relocation and retention” of materials, including a process for individuals to challenge their “appropriateness” and request that they be moved to an area inaccessible to children.

By passing Act 372, which also strikes a statute that protected librarians from being prosecuted for circulating material “claimed to be obscene,” Arkansas became one of at least seven states that have passed laws criminalizing librarians and school employees who provide books deemed sexually explicit or “harmful” to minors, The Washington Post’s Hannah Natanson reported in May. Another dozen states have considered similar bills.

None of the bill’s four sponsors in the Arkansas General Assembly responded to a request for comment. In a May op-ed in the Arkansas Democrat-Gazette, state Sen. Dan Sullivan defended Act 372, saying that it simply expands existing prohibitions on “displaying” harmful material and creates a process for parents to “appeal the decisions of unelected librarians to local elected officials.”

Kandi West, the primary owner of WordsWorth Books in Little Rock, a plaintiff in the suit, expressed concerns about how to interpret the law’s definition of making material “available” to a minor. “While our store is organized with our children’s section in its own part of the store, there is nothing that keeps a minor from browsing the entire store,” she wrote in an email.

“We pride ourselves on our collection, and on having books that represent all kinds of people that are in our community,” said Daniel Jordan, who co-owns Pearl’s Books, another plaintiff, with his wife, Leah. Now he worries the law might force them to hide or remove critically acclaimed, popular books for fear that someone might deem them inappropriate for minors.

https://historynewsnetwork.org/article/185835
Biden Administration Seeks US Readmission to UN Cultural Body, Aims at Countering China's Soft Power

The U.S. is moving to rejoin Unesco—with plans to pay hundreds of millions of dollars in membership fees—in a bid to counter the growing influence of China and other adversaries at the United Nations culture and heritage organization.

On Thursday, a delegation of U.S. diplomats delivered a letter to Unesco Director-General Audrey Azoulay seeking readmission next month to the Paris-based organization. In the letter, which was viewed by The Wall Street Journal, a senior State Department official said the Biden administration plans to request an appropriation of $150 million from Congress for fiscal 2024 to pay Unesco, adding that similar contributions would be made in ensuing years until the country’s membership arrears are fully repaid. The U.S. currently owes Unesco $619 million, according to the organization.

The move aims to reverse the Trump administration’s decision to withdraw from the United Nations Educational, Scientific and Cultural Organization in 2017, when it cited a need for overhauls at the organization as well as its “continuing anti-Israel bias.” Since then, China has become one of Unesco’s largest donors. The organization’s No. 2 official is now Chinese, positioning Beijing to help steer discussions at the organization on issues ranging from press freedom to education in Ukraine and other war-torn countries.

In an interview, Azoulay said the U.S. was eager to re-establish its influence at an organization that—in addition to designating heritage sites around the world—is at the forefront of global efforts to develop guidelines for artificial intelligence and other sensitive technologies.

“The U.S. are coming back because Unesco has grown stronger, and because it is dealing with issues that concern them,” she added.

In its letter, the State Department said the Biden administration planned to work with Congress to provide additional funding of $10 million in support of certain Unesco programs, including the preservation of cultural heritage in Ukraine and education about the Holocaust.

https://historynewsnetwork.org/article/185831
Major Genealogical Group Apologizes for Past Associations with Eugenics and White Supremacy

One of the nation’s oldest and largest genealogical societies, founded to help Americans trace their family ancestries, apologized Thursday for its history of racism, which includes a founder who was a eugenicist, and early resistance to integration.

“In order to be credible, we have to be transparent, and we have to fully discover what our past was, as so many organizations are doing right now,” said Kathryn Doyle, president of the National Genealogical Society, based in Falls Church, Va.

The society’s effort began in 2017 after complaints about the lack of diversity among the expert presenters at the society’s annual conferences. It gained momentum, she said, after the murder of George Floyd by a police officer in Minneapolis in 2020 sparked a national conversation on race.

While the society members used their digging prowess to scour the organization’s archives, “we haven’t looked at everything yet,” Doyle said. “There may be more.”

The apology, which was made public at the organization’s conference in Richmond, comes five months after the American Society of Human Geneticists issued a similar apology and announced steps to rectify past harms, which also included the promotion of eugenicist beliefs. The ASHG is the largest group of human geneticists in the world.

Beliefs in biologically superior and inferior races — which contradict modern genetic knowledge — have permeated both the study of genetics and the practice of genealogy.

In a report issued with its apology, titled “Our Journey from Exclusion to Inclusion,” the National Genealogical Society noted that its founding in Washington, D.C., in 1903 coincided with the rise of the American eugenics movement, which was based on the long-discredited theory that humanity can be improved through breeding, with supposedly pure White people of European ancestry as the ideal.

One NGS founder, Joseph Gaston Baillie Bulloch, a physician from Georgia and president of the group from 1909 to 1912, was an adherent of eugenics, the report said. In a 1912 article he published in the society’s quarterly journal, he advised how genealogy should be used to protect the White race from genetic mixing and “tainted blood.”

Given the society’s decision to publish the article and its segregated membership, “it is reasonable to assume that other founders may have shared Bulloch’s beliefs in eugenics or racism and that those beliefs informed the exclusionary practices NGS maintained throughout its early years,” the report said.

In early 1960, James Worris Moore, an African American employee at the National Archives, attended a meeting of the NGS as a guest and was given a membership application, prompting an angry response among members and a rancorous discussion as to whether to integrate, the report says. On Nov. 19 that year, the society voted to both deny Moore’s membership and bar all Black people from joining.

https://historynewsnetwork.org/article/185830
Latin American Historians, Diplomats Slam The Economist for Racist Description

A recent article published in The Economist has drawn online backlash after its headline and heading characterized Latin American workers as "useless" and "unproductive."

"It's racist, it's incendiary, it's insulting," Alexander Aviña, an associate professor of history at Arizona State University, told NBC News in a phone interview.

The story, published Thursday in the London-based magazine that covers global and economic affairs, focuses on why Latin America, according to World Bank figures, is the world's slowest-growing regional economy. The article posited various factors, including governments' lack of investment in education, limited competition, a large informal economy and corruption.

But scholars, journalists and historians pushed back online, arguing that the article's heading ("A land of useless workers") and its headline ("Why are Latin American workers so strikingly unproductive?") played into racist tropes of Latin American workers as lazy.

"If you look at some of the U.S. press in the 19th century, even into the 20th century, they would talk about Mexicans having the quote-unquote mañana habit and that explains the lack of development and productivity to characterize Mexico," Aviña said.

"Having followed The Economist for a really long time, they know what they're doing," he said. "There's a click-baity aspect to this title and this argument and the subtitle but, nonetheless, it's still, I think, incredibly insulting, objectively inaccurate on a variety of levels, and I think it's racist."

....

Ignacio Sánchez Prado, a professor of Latin American studies at Washington University in St. Louis, said in a phone interview that "there is an existing framework that disparages Latin American countries to stereotypes ... pieces like this feed from that representation."

In Sánchez Prado's opinion, the article doesn't address the history of past economic policies, some even endorsed by the magazine, that have contributed to undermining economic conditions in Mexico and Latin America.

https://historynewsnetwork.org/article/185829
Isabella Weber and the Historical Case for Price Controls

Cancelling Christmas was, of course, a disaster. Raised in West Germany during the reunification era, Isabella Weber had been working as an economist in Britain and the United States for the better part of a decade. An annual winter flight back to Europe was the most important remaining link to her German friends and family. But in December, 2021, the Omicron variant was surging, and transcontinental travel felt too risky. Weber and her husband drove from the academic enclave of Amherst, Massachusetts, to a pandemic-vacated bed-and-breakfast in the Adirondacks, hoping to make the best of a sad situation. Maybe Weber could finally learn how to ski.

Instead, without warning, her career began to implode. Just before New Year’s Eve, while Weber was on the bunny slopes, a short article on inflation that she’d written for the Guardian inexplicably went viral. A business-school professor called it “the worst” take of the year. Random Bitcoin guys called her “stupid.” The Nobel laureate Paul Krugman called her “truly stupid.” Conservatives at Fox News, Commentary, and National Review piled on, declaring Weber’s idea “perverse,” “fundamentally unsound,” and “certainly wrong.”

“It was straight-out awful,” she told me. “It’s difficult to describe as anything other than that.”

She gave up on skiing. The proprietor of the hotel made extra soup to cheer her up. But every time Weber checked her phone she was being mocked by a new round of critics. “The ugliness of the reaction to Weber’s op-ed is depressing,” Adam Tooze wrote, in his popular “Chartbook” newsletter. “Depressing and telling.”

In a matter of hours, Weber, who was thirty-three years old, had transformed from an obscure but respected academic at the University of Massachusetts, Amherst, into the most hated woman in economics—simply for proposing a “serious conversation about strategic price controls.” The uproar was clearly about something much deeper than a policy suggestion. Weber was challenging an article of faith, one that had been emotionally charged during the waning years of the Cold War and rarely disputed in its aftermath. For decades, the notion of a government capping prices had evoked Nixonian cynicism or Communist incompetence. And Weber was making her case in a climate of economic fear. Although the most acute disruptions of the pandemic seemed to be over—businesses were reopening and jobs were coming back—supply chains remained snarled and prices were rising faster than they had in forty years. Fringe fantasies of hyperinflation and economic doom were starting to go mainstream.

But Weber’s argument was carefully grounded in history. Price controls, she argued, had been an essential element of the U.S. mobilization strategy during the Second World War. And there were several striking similarities between the economy of the nineteen-forties and that of the present day, including very high consumer demand for goods, record corporate profits, and production bottlenecks in important areas. Back then, the Office of Price Administration simply prohibited companies from raising prices above certain levels. Violators could be sued, or worse. In 1944, Montgomery Ward, the department-store chain, refused to accept the terms of a collective-bargaining agreement—a cap on the price of labor—brokered by the government. President Roosevelt ordered the National Guard to seize the business and remove Sewell Avery, its chairman, from its headquarters.

The O.P.A. program was born of necessity. The traditional inflation-control tactic—jacking up interest rates—would have reduced employment and industrial activity, making it harder for the military to obtain the supplies that it needed to fight. Industry-specific price controls contained consumer costs while encouraging companies to boost profits through higher sales volume. The initiative worked. During the First World War, inflation had run rampant. During much of the Second, it was close to two per cent. And yet factories were operating at peak levels. If contemporary policymakers could do the same thing, Weber argued, they could limit inflation without inducing layoffs and wage cuts.

Today, in a host of key sectors, that’s more or less happening. The European Union is regulating the price of natural gas, the Biden Administration is regulating the price of oil, and the G-7 is enforcing a global cap on the price of petroleum products produced in Russia. Inflation appears to be cooling, and by nearly every measure we are living in the best labor market in a quarter century.

https://historynewsnetwork.org/article/185827
Progress Digitizing the Johnson Publishing Archive, a Vast Resource in African American History

Rows upon rows of ordinary-looking boxes, stacked high and surrounded by fencing, fill an unassuming warehouse in Chicago.

But what’s inside these boxes is anything but ordinary. Photographs of Ray Charles taking business calls, Louis Armstrong celebrating his birthday, Maya Angelou lounging on her bed, and millions more intimate and striking images of Black celebrities and everyday life are tucked away in the vast Johnson Publishing Company (JPC) archive.

“Once you start reading the names on the boxes, then you say ‘Ohhh…’ To me, that’s exciting,” said Vickie Wilson, who was the publishing company’s archival specialist for 20 years and is now consulting with Getty on the project to archive and digitize these materials and make them available to the public.

The Johnson Publishing Company produced iconic magazines including Ebony and Jet, and its archive is regarded as one of the most significant collections of 20th-century Black American culture. The archive contains around 5,000 magazines, 200 boxes of business records, 10,000 audio and visual recordings, and 4.5 million prints and negatives that chronicle Black life from the 1940s until the present day—not only preeminent figures like Muhammad Ali and Aretha Franklin, but also scenes of ordinary life like high school graduations and vacations, which weren’t often documented or celebrated in other publications.

After the publishing company filed for bankruptcy in 2019, a consortium comprising five institutions including the J. Paul Getty Trust and the Smithsonian Institution purchased the archive. Getty committed $30 million in support of the processing and digitization of the archive, which will make the collection available to scholars, journalists, and the general public like it never has before.

Last year, ownership of the archive was officially transferred to the Smithsonian’s National Museum of African American History and Culture (NMAAHC) and Getty. Since then, the two institutions have been laying important groundwork so that this fall, the monumental task of digitizing the entire archive, making it available on a brand-new website, and moving the materials to their new home in Washington, DC, can truly begin.

https://historynewsnetwork.org/article/185826
New College Visiting Prof. Out of Job—Rufo's Public Remarks Suggest Politics the Motive

Erik Wallenberg won’t be teaching at New College of Florida next academic year. The circumstances suggest he might be the latest target of the state’s partisan battle over higher education.

In March, Wallenberg and a colleague wrote an opinion essay criticizing an attempted ideological overhaul of their campus. This year, Gov. Ron DeSantis, a Republican, stacked the board at New College, a small public liberal-arts institution, with a cohort of conservative activists — including Christopher F. Rufo, known for waging a national campaign against critical race theory.

“What the DeSantis administration is trying to do, in brief, is force a conservative Christian model of education onto our public college,” Wallenberg and his colleague wrote in Teen Vogue.

Their commentary soon attracted the attention of Rufo, who called Wallenberg and his colleague “pure left-wing Mad Libs” on Twitter.

At the time, Wallenberg, a visiting assistant professor of history, was awaiting news on whether his contract would be renewed. He continued opposing his institution’s sudden political shift, helping organize a teach-in on academic freedom and bringing a prominent Black historian and DeSantis critic to campus. Meanwhile, turmoil continued at New College; at one board meeting, five faculty members were denied tenure and a professor resigned on the spot in protest.

Last month, Wallenberg learned that New College wouldn’t be renewing his faculty appointment. He said the head of his division told him that the decision was made by Richard Corcoran, New College’s interim president and a former Republican speaker of the Florida House of Representatives.

Rufo shared on Twitter last week that Wallenberg wouldn’t return in the fall, and seemed to connect the contract nonrenewal to the professor’s political views and criticism of New College’s administration. “It is a privilege, not a right, to be employed by a taxpayer-funded university,” Rufo wrote. “New College will no longer be a jobs program for middling, left-wing intellectuals.”

Rufo’s remarks prompted the Foundation for Individual Rights and Expression to get involved. FIRE on Thursday declared the decision a violation of Wallenberg’s First Amendment and academic-freedom rights, warning of a chilling effect that would prevent “any reasonable faculty member from expressing views, extramurally or in class, that might cost them their jobs.” Wallenberg said he had not been in touch with FIRE but appreciated its efforts; New College did not return a request for comment.

The Chronicle spoke with Wallenberg about his experience. This interview has been edited for length and clarity.

The Teen Vogue essay was your first experience publicly opposing change at New College. Why did you and your colleague write it?

We decided that we needed to write something to our students, who were really struggling over the course of the semester. We had to make time at the beginning of our classes for them to talk about that, and we could see how this crisis was weighing on students. We wanted to write something that addressed the history of what some of their fears were — in particular, losing access to curriculum, things like Black freedom studies, gender studies.

Shortly after it was published in March, Rufo tweeted a criticism of the essay, screenshotting your CV and your colleague’s and writing, “Luckily, both are visiting professors.” What was your reaction to that?

I certainly didn’t expect him to be civil or interested in an actual conversation. But I was shocked by the fact that as a trustee, he had no sense of what his role is and should be, and that he would so blatantly attack two faculty members. He demeans the fact that we would publish in Teen Vogue, which has a very large readership of young people, which is why we chose to publish there. In his criticism, there is nothing of substance — nothing of what we’ve actually written in the articles that we’ve published. It’s just based on some buzzwords that he pulled out.

Do you have any regrets about the essay, looking back on it now?

I have no regrets. I stand by what we wrote. What we wrote has been confirmed in a lot of ways. There were a lot of faculty who were hopeful that this new Board of Trustees wouldn’t really disturb our teaching as much as it has. I think everyone’s realized how bad things really are.

....

Did the nonrenewal come as a surprise to you?

A major part of teaching U.S. history, to me, is teaching the construction of race and racism in policy, law, and practice. Knowing that that is something that the governor has spoken out against teaching and said we shouldn’t be talking about those things, I was certainly expecting that the board would try to not renew my contract.

I think the reason is clear if we go by Christopher Rufo’s tweets; it suggests that the reason they did not renew my contract is because of what he perceives as my politics and what he sees as my publications. But I will also say I would like more information. I would like to actually hear Corcoran give a reason why he refused to sign that contract renewal.

https://historynewsnetwork.org/article/185824
Hollywood Has Abandoned the Citizen-Inventor

As a child of the 1990s, only three possible jobs existed to my young mind: paleontologist, Chicago Bull, and inventor. The last seemed the most practical of the options, as I lacked the height to dunk or the lateral agility to juke velociraptors. Invention seemed practical by its sheer omnipresence. In grade school we read about Great Men like Thomas Edison and Nikola Tesla, singular visionaries who invented the future on their own initiative.

Television and movies brought it closer to home. Edison and Tesla were distant from me by time and complicated suits, but on the big (and little) screen, inventors were my contemporaries. They were of similar look and means, most notably in their “laboratory.” Pop culture inventors invariably worked out of their garages, that emblem of middle-class mobility. Invention seemed within reach when it was two steps out the front door.

This next generation doesn’t suffer from the same delusion, and their sanity frightens me. Instead, invention has become a secret knowledge, accessible only to M.I.T. grads (and occasionally Stanford). Rather than a meritocratic act of creation, invention in the public consciousness has become elite in nature and limited in scope. The pool of possible inventors has grown smaller, and the depth of their potential shallower. We used to dream of flying cars; now we only hope for slightly less buggy apps.

The fault lies in a subtle yet violent shift in our imagination away from our own responsibility to invent. Pop culture’s vision of invention creates a place where inventions are not only possible but expected. In an ouroboros of cause and effect, our depiction of invention on the screen has shifted from populist obligation to the exclusive right of a technocratic priestly caste. To put it less verbosely, the inventors in film used to look like us; now they look like Robert Downey Jr.

Some of the fondest memories from my childhood are of staying home sick, watching daytime television. One commercial in this timeslot played more frequently than most. It showed a cartoon caveman carving the first wheel out of stone, and the voice-over encouraged viewers to patent their own invention at the advertised company, because clearly patent law should have come before pants.

The ad, like all daytime television commercials without Wilford Brimley, proved to be a scam. But then the whole myth of the lone wolf inventor was a scam as well. Tesla got his break working for Westinghouse, and Edison had his own sweatshop of engineers cranking out inventions while he was busy electrocuting dogs.

But the unreality is hardly the point. After all, culture is just myth with the skepticism withered away. In printing the legend over the truth, we created a society where inventors did arise. Likewise, through a conspiracy of education and Bob Barker, I believed I could invent a cotton gin and steer the course of human history.

American film and television had no shortage of such inspiring lies. For the sake of brevity and my lacking a Criterion Channel subscription, let’s start with the 1960s. The three most prominent pop culture inventors of the time were, not coincidentally, all professors; one Absent-Minded, one Nutty, and the last so renowned on his little island he was known simply as “the Professor.”

Fred MacMurray’s Absent-Minded Professor Brainard wants to use flubber to make basketball players jump and help out the military, resisting the attempts of a local businessman to exploit it for pure profit. Jerry Lewis’s Nutty Professor Kelp doesn’t sell his transformation serum to the military-industrial complex to create an army of super soldiers and instead keeps it for the more relatable desire of scoring a hot blonde. And if Gilligan’s Island is indeed a microcosm of civilization, as we’ve long suspected, then the Professor is invention at its most altruistic. He never condescends to his fellow castaways, instead using all manner of coconut to ease their troubles. He always has time for a bamboo fashion show or whatever nonsense, even if constructing a raft is a better use of his talents.

While academia is far from a blue-collar field, all three demonstrated their populist bona fides. Only fifteen years removed from the Manhattan Project, the public still saw invention as the domain of the university. But more importantly, they saw academia as the domain of the people. The GI Bill wedged a work boot into the college door — it was no longer just the nesting ground for George Plimpton types. Here the professors use their inventions for plebeian good: finding love and helping white kids dunk.

New Book Says Cure for Girls in Crisis is Revolution

YOUNG AND RESTLESS: The Girls Who Sparked America’s Revolutions, by Mattie Kahn

American girls are in crisis — on that much we can all agree. In a 2021 study of young women’s mental health, a large share of respondents between the ages of 12 and 19 reported that they had felt “persistently sad or hopeless”; rates of sexual assault and violence in the same population are alarmingly high. As caregivers, many of us are looking for ways to shore up our children’s well-being, self-esteem and happiness. And although it is not the aim of a historical survey to be prescriptive, heartening inspiration can be found in “Young and Restless,” Mattie Kahn’s thoroughgoing examination of the role of young women and girls in America’s uprisings.

Her subjects have agitated on behalf of labor and voting rights, racial dignity and equality, sexual and reproductive freedom, freedom of speech and against climate change. The solutions she illustrates include objecting, resisting — and, yes, acting up, rather than sinking into sadness and accepting the unacceptable. By taking direct action in the service of shared values, in alliance with beloved communities for a better future, girls throughout American history have discovered a sense of personal agency, often during eras when their opportunities were sharply circumscribed. Sometimes they even changed history.

Kahn, whose stated aim is to write girls back into the historical record, also considers her subjects’ lives before and after their time in the trenches. Many of the young women who took on activist roles — especially those who lived before the mid-20th century — faced intense blowback, even as they inspired others to their causes. The book also examines the place of childhood itself as a battleground on which America’s culture wars have historically been fought.

The author maintains an admirable ability to complicate her own assertions — girls have been a force for progressive change, for instance, but also a force in reactionary movements. And, delightfully, she brings a onetime women’s magazine editor’s attentiveness to the importance of style and theatricality in the lives of young women whose sashes and hats, hairstyles and armbands and, finally, pants, have marked their movements for change.

The Debt Ceiling Law is now a Tool of Partisan Political Power; Abolish It

The debt ceiling drama seems to be nearing its end, as the US House of Representatives passed legislation that would lift the debt ceiling in accordance with a deal reached last weekend between President Joe Biden and Kevin McCarthy, the Republican speaker of the House. The Republicans have been fighting to force cuts in spending and/or eligibility for food stamps (SNAP), Medicaid, childcare and preschools, education and grants for higher education.

By linking these and other provisions to the lifting of the debt ceiling, the Republicans tried to use the threat of default on the public debt to force Democrats to accept them. The legislation, which now goes to the Senate where it is expected to pass, did not satisfy most of their demands.

The worst abuse that Republicans managed to include will be suffered by the hundreds of thousands of poor people who will likely lose access to food assistance under the SNAP program. Many are in poor health and will not be able to meet the work requirements that Republicans have insisted on imposing on people aged 50 to 54; others will lose benefits due to additional red tape.

There was also damage done by the fictitious narrative that Republicans were able to successfully promote about the “ticking time bomb” of the public debt. There is no bomb and if there were, it would not be ticking.

The relevant measure of our debt burden is how much we pay annually in net interest on the debt, as a share of our national income (or, roughly, GDP). That figure was 1.9% for 2022, which is not large by historical standards. We averaged about 3% in the 1990s, while experiencing what was then America’s longest-running economic expansion.

The constant repetition of the “threat” posed by our national debt was a big win for Republicans, who are always looking to cut spending on social needs and safety nets; and, more strategically, to cut spending that could aid recovery from an economic downturn when Democrats are in power.
