Teaching "All Men are Created Equal" (Part II)

Pulling Down the Statue of King George III, N.Y.C., by Johannes Adam Simon Oertel, ca. 1859, depicts the destruction of a statue of the monarch in the wake of the reading of the Declaration of Independence, 1776.

Note: This is the second part of Jeff Schneider's essay on teaching the content and contradictions of the Declaration of Independence. Part I is here.

 

The assignment for the second day is to find the passages in which Jefferson discusses revolution and describes government and democracy. Unfortunately, even though the idea that “all men are created equal” is the basis of the right to revolution, we will have to put off most of that discussion until the last day. However, here is a key sentence describing the relation between revolution and government:

 

(T)hat to secure these (natural) rights, governments are instituted among men, deriving their just powers from the consent of the governed; that whenever any form of government becomes destructive of these ends, it is the right of the people to alter or abolish it, and to institute new government, laying its foundation on such principles … as to them shall seem most likely to effect their safety and happiness.

 

The causal relationships among natural rights, agency, and government are at the heart of the description of the right to revolution. It is these lines that inspired peoples all over the world to take their fate into their own hands and overthrow tyrannical rule. It is these ideas that inspired Tom Paine to write in the first edition of Common Sense:

 

O ye that love mankind! Ye that dare oppose not only the tyranny, but the tyrant, stand forth! Every spot of the old world is overrun with oppression. Freedom hath been hunted round the globe. Asia and Africa have long expelled her. Europe regards her like a stranger, and England hath given her warning to depart. O! receive the fugitive, and prepare in time an asylum for mankind.

 

On the third day we discuss the meanings of the right to revolution, the paragraph leading to the Grievances, and the conclusion, sometimes called the declaration of war. The last lines declare that all the previous ideas and grievances will be defended at all costs.

 

And for the support of this Declaration, with a firm reliance on the protection of Divine Providence, we mutually pledge to each other our lives, our fortunes and our sacred honor.


The fourth day is devoted to an episode of Oprah in which she interviews two white and two black descendants of Thomas Jefferson.

I showed the episode up to the end of the comment by the great historian Annette Gordon-Reed, who speaks from the audience about halfway through. The discussion is riveting: a real discussion between Blacks and whites about race, heritage, and truth in history. “All men are created equal” is its background. Students are fascinated by the straightforward exchanges on America's creed and its complexities. Many questions are raised in this discussion, including passing for white and the shocking actions of Jefferson, who took the enslaved half-sister of his deceased wife as a concubine when she was a young teenager. The story of Sally Hemings could fill a week at least by itself. Oprah is one of the few TV personalities who could make a success of such a show.

 

Now we are ready for the last day of Declaration classes in my series. The assignment is for the students to list all the meanings they can think of for “All men are created equal.” Who is equal in the Declaration? The list grows as we talk. First there are the white men over 21 who own property or pay a minimum of taxes. These men comprised the vast majority of the voters in all the states. However, were there other groups of people to whom the Declaration was addressed? Let us consider some possibilities. Could George Washington and Thomas Jefferson have fought the war by themselves? All the troops could not come from the elite families of the Continental Congress. To be sure, the protesters of the 1760s and '70s were not voters by and large, yet they were essential to the development of the revolutionary movement.

 

If we consider the dates we discussed during the first class, the first battle of the Revolutionary War was fought in April 1775 and the treaty with Great Britain was signed in 1783, but the date of the Declaration was July 4, 1776. One might then ask why the Declaration was written so long after the first battle and so long before the treaty, or why it was written at all. One purpose, besides the recruitment of troops, was to gain support for the war and for the organizations in the states and local committees of public safety that enforced boycotts and discouraged Americans from supporting the Loyalists. In addition, the Declaration itself was addressed to the supporters of democracy and opponents of Great Britain around the world. The Americans were declaring that they were equal to all the peoples of the world, including the British people. The line before the Grievances was “(L)et facts be submitted to a candid world.” The Continental Congress expected honest people and countries all over the world to accept the long list of grievances against the king of Great Britain. France, Spain, and Holland eventually gave money and political support to the Americans. Generals and troops from Prussia, France, and Poland supported the Revolution.

 

I should make clear that when we discuss “All men are created equal,” we are on a series of simultaneous tracks. First, the clause is part of the natural law argument at the beginning of the document: the Revolution is “impelled” by the laws of the universe. Then it is an element in the relation between the people and the government: since the government is founded by the people, they have the right to “alter or abolish it” when it threatens their natural rights to liberty and safety. As we saw in the quote above on democracy and revolution, it is their government, one they built to “secure” their rights. All men have these rights and the responsibility to defend themselves against tyrannical government.

 

Thus equality is a law of nature and a political factor in democracy. Equality is an organizing concept, as we have seen. Finally, equality is a concept that breaks through the prejudices of thousands of years of hereditary rights in Europe.

 

Now we come to the question of why the king is addressed as “He.” If a response was slow in coming, I would ask my students what happens when you are at a large family gathering and, while you are talking to your cousin, you refer to your father as “he.” Some of the students laugh. Some are dumbfounded. As we say in the vernacular of southern Long Island, where I grew up, you would get “slapped upside the head.” Demystifying the king is the key here. During the coronation ceremony in England, the king is anointed – in secret – with a holy oil that causes him to attain a status between man and God. He obtains thaumaturgic powers: divine right. If you are going to have a revolution against the king, he must be a man like anyone else. Calling him “He” strips him of his mystery.

 

As we have been pointing out since the beginning of this journey through the Declaration of Independence, the clause “All men are created equal” embodies the hopes and some of the worst nightmares of every American. Who was not equal or addressed in the clause empowering white men? Women; Blacks, most of whom were slaves; and those white men without property. Native Americans were also not included in the so-called equality promoted by the Declaration. It is usually stated that about 6% of the total population could vote. New York, New Jersey, Massachusetts, Pennsylvania, and New Hampshire all allowed free Black men to vote if they met property requirements which, at least in New York, were higher than those for whites. Women of any race could vote in New Jersey if they were widowed or unmarried, but in 1807 the state changed the relevant clause in its constitution from “inhabitants” to “male inhabitants.” More than 40,000 Blacks ran away during the Revolution, and by 1800 there were nearly 60,000 free Blacks in the US. They and their allies frequently petitioned the states for freedom based, for example, on the bill of rights of the constitution of Massachusetts, which stated that “all men are born free and equal.” Here are the essential parts of one of those petitions:

 

To the honorable Counsel and House of Representatives for the State of Massachusetts in General Court Assembled, January 13, 1777:

 

The petition of a great number of blacks detained in a state of slavery in the bowels of a free and Christian country humbly show … that they have in common with all other men a natural and inalienable right to that freedom which the Great Parent of the heavens has bestowed equally on all mankind.... (We petition) your honors... (to) cause an act of the Legislature to be passed whereby they may be restored to the enjoyments of that which is the natural right of all men – and their children who were born in this land of liberty – not to be held as slaves.

 

The drama in this cannot be denied. These Black petitioners took only six months and nine days after the signing of the Declaration to demand freedom for themselves. Their demand was based legally not only on the clause referred to above in the Massachusetts bill of rights; it also quoted the Declaration's assertion of “inalienable” rights and claimed the natural right of equality for “all mankind.” These petitioners changed the meaning of “men” in “all men are created equal.” It was a verbal act of agency that led to a legal ruling that slavery was unconstitutional under the new Massachusetts Constitution of 1780. The Declaration gave them an opening to claim their freedom, something that the US Constitution later denied them. There are no natural rights written into the Constitution of 1787.

 

I taught these lessons in class after class for more than 30 years. My students welcomed the explanations and close readings of the founding documents. They were proud of learning from the sources and figuring out the truth about our founding as they read and discussed. They took it as a matter of course that they were learning the history of our country in all its complexity. We later read the Constitution, the Articles of Confederation, the Seneca Falls Declaration, the 4th of July Oration by Frederick Douglass, the speeches and letters of Lincoln, the platform of the Populist Party, the Supreme Court cases of Plessy v. Ferguson, Korematsu v. US, and Brown v. Board of Education, the Inaugural Addresses of FDR and JFK, and Martin Luther King, Jr.'s Letter from Birmingham Jail. I could not wait to get to class to discuss these great works. It was a privilege to teach. I tried to grab my students by the brain and run. I gave them the documents and we learned together. Race was not the only topic we covered, but it was a frequent subject. I did not confront their prejudices; instead, I had them confront the ideas of our history and how our leaders and ordinary people explained them. We covered strikes from the Lowell Mills to the Great Railroad Strike of 1877 to the sit-downs of the 1930s. We read the obituary of one of the survivors of the Triangle Fire and an article by a leader of the garment workers. We watched sections of “Eyes on the Prize,” where students learned that agency was a way of life for the Civil Rights workers.

 

Over time students will see that there are ways of looking at the world other than the one in which some of them grew up. Eventually they can come to understand that racism is virulent and dangerous to peace in society, and physically dangerous to Blacks, Hispanics, Asians, and Indigenous peoples in the US. I taught American history with all its flaws. Doing so is not impossible, but it takes intellectual effort and trust that your students can understand and learn how to think for themselves.

 

The Declaration is complex. It contains soaring rhetoric and grievances that can pull at your heartstrings and a call for equality that can make you want to believe every word. But it is also riveted to and riven by slavery and hypocrisy. The idea that these slaveholders and their non-slaveholding allies were calling for freedom in the name of a humanity that was so narrowed by race, religion and wealth is appalling. Yet it is still a powerful document even when we understand the context: the contradictions of the real lives of the people and the undemocratic character of the government of the United States. Nevertheless we can face all this in class without blaming our students or their parents for the sins of our Founding Fathers.

Psychologically Speaking, Who Were the Heads of the Chinese Communist Party?


An individual leader’s lifelong socialization is also critical to understanding their behavior. It is well established in the fields of psychiatry, psychology, social psychology, psychohistory, political psychology, and psychopathology that pre-adult and early adult socialization are critical formative periods that do much to shape how a person behaves throughout their life. In particular, at least the following experiences and relationships have been identified as key influences: where one grows up and one’s standard of living; relationships with mothers and fathers; the degree of maternal nurturing (or lack thereof); experiences in school and relationships with teachers; and relationships with peers. All of these encounters usually have far-reaching impacts on subsequent personality development. For example—for boys—maternal nurturing, a secure home environment, interactive siblings, financial stability, supportive primary and secondary school teachers, and inclusion in a network of peers can all lead to a secure ego, confidence, and an outgoing adult personality. Conversely, antagonistic relations with fathers, a sense of neglect from mothers or abandonment by parents (even if they are away from home working), bickering with siblings, financial instability, harsh discipline from teachers, exclusion by peers—these experiences can all lead to an alienated, frustrated, angry, repressed, aggressive, insecure, insular adult personality. This latter profile is frequently associated with the development of strongly anti-authoritarian and often narcissistic adult personality types. These two sets of general pre-adult characteristics have also been found across multiple national-cultural environments and are not simply features of modern Western societies. Indeed, in pre-modern agrarian or early industrialized societies they are quite common.

 

To what degree do these early family rearing and socialization features shed light on the five leaders covered in this study? One interesting commonality is that only one of the five (Jiang Zemin) grew up in a close-knit and stable nuclear family environment. All the others had very disrupted youths with absent or deceased parents.

 

Mao and his father had very strained relations; they clashed frequently, and Mao’s anti-authority persona has been attributed to his deeply antagonistic relationship with his father. His father made Mao work in the fields beginning at age six, something he resented. As Mao described his father in an interview with Edgar Snow in Yanan in 1937 (his only known reflection on his youth and family): “He was a hot-tempered man and frequently beat both me and my brothers.” Mao told Snow that he grew to “hate” his father. Being unfilial toward his father, in such a patriarchal traditional culture, gave Mao an “Oedipus complex” (a Freudian reference to the Greek king Oedipus, who unknowingly killed his father and married his mother), in the view of Sinologist and social scientist Richard H. Solomon, who authored a comprehensive psychocultural biography of Mao. In sharp contrast to his father, Mao’s mother was very nurturing and indulgent of her first son—thus providing him with a strong sense of self-confidence and a self-assured ego. As he described her to Snow: “My mother was a kind woman, generous and sympathetic, and ever ready to share what she had.” Mao was also very protective of his mother (sometimes physically) when she clashed with his father. There was much acrimony in the Mao family household. Mao’s anti-authority trait deepened in primary school, where his teacher frequently punished and beat him. Mao described his teacher as belonging to the “stern treatment school; he was harsh and severe.” After five years in this school and one too many beatings, Mao ran away, never to return. These early childhood experiences proved pivotal for Mao—producing resentment of his father and authority figures, and instilling “revolutionary” traits in him at an early age.

 

In Deng Xiaoping’s case, his father was absent for long periods from the family residence in rural Sichuan, and thus Deng did not have much of a relationship with him. His mother, like Mao’s, was loving and doted on her first-born son. But she died when Deng was only 14. Deng then left home for middle school in Chongqing, also never to return. At just 16, Deng had the wrenching experience of being sent on a long steamship voyage to France for an overseas work-study program (which turned into much work and little study). Altogether Deng spent six years in France and one year in Moscow before returning to China at age 23.

 

These early experiences on his own certainly bred a self-reliance in Deng. While in France, Deng developed a liking for French food and liquor and a passion for croissants. He found a series of odd jobs and factory work, but his schooling lasted only three months. Deng did find a peer group among other young Chinese and Vietnamese (including Ho Chi Minh) then studying and working there, many of whom were active in socialist politics following the Bolshevik Revolution (1917). One individual who played an important mentoring role in Deng’s life was Zhou Enlai, who was six years Deng’s senior and who brought him into the nascent Chinese Communist Party Socialist Youth League. Deng’s main job in the League was producing propaganda pamphlets, for which he became known as “Monsieur Mimeograph.”

 

Hu Jintao was also deprived of his parents early. His mother died when he was only 7, and because his father (a merchant) was often away on business, traveling throughout the lower Yangzi delta region, Hu and his three sisters were raised by an aunt. While the aunt was a good provider, Hu never had the security and familiarity of a close nuclear family. This likely contributed to his self-reliance, and possibly also to the aloofness he displayed as an adult.

 

Jiang Zemin is the only one of the five leaders in this study to have had a fairly normal nuclear and extended family life, growing up in Yangzhou, Jiangsu province. His father was a writer and part-time electrician, and Jiang recalled later that his mother was doting and loving. The Jiang family was well-to-do and well-known in Yangzhou, an important cultural and commercial center for centuries. Jiang was one of five children. His uncle Jiang Shangqing and his uncle’s wife were second parents to Jiang Zemin, essentially raising him. Jiang Shangqing was a leftist intellectual who was active in communist underground activities, was arrested and rearrested by the Nationalists’ police, and had just joined the communist Red Army when he was killed in an ambush during the Japanese occupation in 1939, thus becoming a CCP martyr (and giving the extended Jiang family a communist pedigree). Following his death, Jiang Zemin’s natural father, Jiang Shijun, offered his son to his brother’s widow, as the couple had no male children of their own. This was not as disruptive for young Jiang Zemin as it might seem, as he had been living mostly with his aunt and uncle from an early age. Other than this anomaly, as described in Chapter 4, Jiang’s upbringing was quite normal and quite intellectual—which may have given him a secure self-confidence.

 

Xi Jinping was also thrust into the world at the tender age of 14, when he was sent from Beijing to rural Shaanxi province during the Cultural Revolution. His father had been imprisoned five years earlier, and his mother had been sent to a rural cadre school. The Xi family household thus broke apart early in Xi’s young life—he was just 9 when his father was imprisoned—and Xi was thereafter sent to a boarding school on the outskirts of Beijing.

 

Thus, if self-confidence and independence born of adversity at an early age are characteristics of individuals who become leaders, it is notable that all five leaders in this study had their youths and home lives disrupted and had to learn to cope on their own in their mid-teens. Only Jiang Zemin had the semblance of a normal family life, and even he grew up split between the households of his birth parents and his aunt and uncle.

 

Of course, leaders—like all humans—are not static creatures. Despite the important impact of childhood and early adult socialization, we all learn and change as we grow older. Certain learned “lessons” from the past are assimilated and applied to the future—or at least they should be (“Those who cannot remember the past are condemned to repeat it,” the philosopher George Santayana famously observed). So a leader at one stage of his or her career may act differently at another. It is thus relevant to consider mid-life and late-life experiences; as they approach death, paranoia and irrationality grip some leaders. However, it is not only that people pass through identifiable stages in the life cycle; psychologists observe that the transitional periods from one stage to the next can be particularly unpredictable and unsettled (like “power transitions” in international relations). Three key transitions are distinguishable: adolescence to young adulthood (ages 17–22); young adulthood to middle adulthood (ages 40–45); and mid-life to late adulthood (ages 60–65). The literature in psychology generally argues that one’s political orientations are formed by stage 1, habits of decision-making and leadership emerge by stage 2, and increasing decisiveness occurs by stage 3, but that “decisional sclerosis” can set in after age 65, with increasing unpredictability, irrationality, and dogmatism frequently apparent (which fear of death only exacerbates).

 

In the case of the five leaders in this study, it does not seem to me that these transition points were very influential (the exception being Mao, who certainly grew quite irrational, unpredictable, and dogmatic in his sixties). Rather, I would argue, more significant in shaping the personas and leadership styles of the five were experiences they all had during their twenties and thirties, prior to the mid-life transition point noted above. As is described in detail in their individual chapters, it was primarily during these two decades of their lives that each really began to form a distinct leadership style and modus operandi.

 

While it is important to analyze leaders in their adult years, psychologists have long established that people are creatures of habit, quite resistant to change and adaptation. People’s essential personalities are fairly firmly established by early adulthood—absent profound experiences (such as war, natural calamity, or other life-altering events). Their basic belief systems and worldviews (Weltanschauungen) are predominantly set by their twenties, determined by a combination of family, school, community, and peer group socialization. Thereafter, as the psychological theory of cognitive dissonance (expounded most thoroughly by Leon Festinger) tells us, adults go through life selectively accepting evidence that confirms pre-existing beliefs and images while rejecting (dissonant) information that contradicts the core belief systems established by their twenties.

 

With respect to our five Chinese leaders, I would argue that the theory of cognitive dissonance and the argument that worldviews are strongly formed before one’s thirties really apply only to Mao. Deng, Jiang, Hu, and Xi all forged their professional personas during their thirties and forties—through working in and managing CCP institutions. All four were strong institutionalists, and I would argue that this was an outgrowth of their work experience rather than of their childhoods, teenage years, twenties, or revolutionary activities (in the case of Deng).

 

All of these features of human development and behavior should be kept in mind when reading this book, as Chinese leaders are not unique human beings—they are susceptible to many of the behavioral patterns that psychologists, political scientists, and other researchers have discovered across multiple countries and cultures. That said, individual countries and cultures also exert their own specific influences on individual leaders. In this context, the next two sections discuss, respectively, the unique impacts of Chinese traditional political culture and of Leninist-type communist parties.

 

This essay is excerpted from David Shambaugh's China's Leaders: From Mao to Now (2021) with permission of Polity Books. 

For Constitution Day, Let's Toast the Losers of the Convention

Antifederalist Luther Martin. Etching by Albert Rosenthal, c. 1905.


The earliest critics of the U.S. Constitution struggled against aspects of a document with many avoidable flaws, mistakes that haunt us to the present day. Go back to that hot room in the summer of 1787. How many of us would endorse the Electoral College? The three-fifths clause? The fugitive slave clause? The continuation of the African slave trade for 20 years? An Executive elected with no term limits? A ban on paper money? Vast Presidential pardon powers? An impeachment process that is ineffective and nearly impossible to complete?

In this era of highly charged debates over how we study American history, I suggest we mark Constitution Day not by bowing at the feet of Madison, Hamilton, and Washington, but by remembering those who lost. Several skeptical delegates left Philadelphia before the September 17, 1787, signing day. One was Luther Martin of Maryland.

Martin was a difficult man. Imagine a cross between Ralph Nader, George Wallace, and Hunter Thompson. Catherine Drinker Bowen labeled him the “wild man of the convention.” We created the four-part documentary series Confounding Father: A Contrarian View of the U.S. Constitution because this fascinating man was, in the words of the eminent revolutionary-era historian Gordon Wood, “full of predictions and most of them came true...” We would all have been better served if Martin and other gadflies like Yates and Lansing of New York had stayed until the end of the convention, as much was decided without them.

“Happiness is preferable to the splendors of a national government...”

Maryland Delegate Luther Martin at the Constitutional Convention

According to Bill Kauffman, author of Forgotten Founder, Drunken Prophet: The Life of Luther Martin, this statement was a plea to the framers of the U.S. Constitution for a modest American confederation. Luther Martin thought the leading framers lusted to create an empire that would compete with European powers. Strange as it may seem from the 21st century, he and many opponents of the Philadelphia convention did not want that. For Luther Martin, empires inevitably led to unhappiness and ruin. Might we have been more like Canada or Switzerland?

One of Martin's prescient predictions was that Washington DC would become isolated and removed from the people because the House was too small and members would soon neglect their constituents. Many have indeed come to feel that what happens “inside the beltway” is removed from their lives and concerns. In our documentary, historian Woody Holton calls this an “invisible wall.”

In one of his many appearances in the series, biographer Bill Kauffman recites Martin's anti-slavery words from an August 1787 convention speech:

“The continuance of slavery ought to be considered as justly exposing us to the displeasure and vengeance of Him (God) who is equally lord of us all and who views with equal eye the poor African slave and his American master.”

Kauffman is quick to remind us that Martin owned six slaves at the time. However, given an opportunity to go back in time to speak in that room, wouldn't we similarly challenge the slavery protections?

In her ground-breaking book Madison's Hand: Revising the Constitutional Convention, Mary Sarah Bilder suggests that James Madison, the sainted “Father of the Constitution,” may actually have put some of Martin's anti-slavery words into his own mouth.

Martin on August 21, 1787 arguing against slavery protections:

“It was inconsistent with the principles of the revolution and dishonorable to the American character”

Madison records himself arguing against the slave trade continuing for 20 years on August 25, 1787:

“...so long a term will be more dishonorable to the National character...”

Bilder notes the suspicious use of the words “dishonorable” and “character,” and suggests these uncharacteristic anti-slavery statements by Madison were not really uttered in Philadelphia but were inserted later as he revised his notes of the convention to make himself look better. If Madison wanted to sound like Martin, perhaps the legacy of this forgotten, drunken founder warrants a more in-depth look?

Martin's bold criticisms of the Constitution's framers were printed in the rambling pamphlet The Genuine Information. This was an important source for many of the so-called Anti-federalists (who called themselves the true Federalists) during the ratification debates – but it is admittedly not especially quotable or well written.

The late historian Pauline Maier (Ratification: The People Debate the Constitution, 1787-1788), in probably her final on-camera interview, opined:

“He (Martin) thought the federal convention should have done what it was authorized to do: and that is to propose amendments to the Articles of Confederation. He thought it was much smarter to fix what was broken, and to give new powers sparingly...because it was always easier to give power, than to take it back.”

As it happened, it took a Civil War and 700,000 dead to create a more perfect union with the 13th, 14th, and 15th amendments – yet those gains are not necessarily guaranteed and many are under threat in the 21st century. Our work is not done.

Luther Martin was attorney general of Maryland for most of his adult life, argued important cases before the Supreme Court, and was an alcoholic who spent his final years as a pauper in the home of Aaron Burr – another unpopular figure in American History. Remarkably, Martin died on July 10, 1826 – just six days after the passing of Jefferson and Adams on July 4, 1826. He's buried in an unmarked grave under a playground in Lower Manhattan.

We are currently mired in absurd debates over criticisms of the more revered Founding Fathers. Our mantra during production of the film was: The more we elevate the founders, the more we diminish ourselves.

We don't deserve it.

On Constitution Day, I say drink a toast to the losers. After all, some of them agitated for a Bill of Rights, a proposal that was rejected by the framers in September of 1787. Let us study their criticisms of our imperfect system. Imagine what might have been and what might be as the republic suffers in a realm of political dysfunction caused in no small part by our odd framework of government.

 

We are All Becoming Cassandras: Leaders Must Heed the People on Climate, Disarmament, and Pandemic

Cassandra, Evelyn de Morgan, 1898


In ancient Greek mythology, Cassandra was a priestess who was able to predict the future but unable to convince others to act upon her prophecies.

The fate of Cassandra seems particularly relevant today, for there has been ample warning about three developments that threaten continued human existence—preparations for nuclear war, climate change, and disease pandemics—without, however, adequate measures being taken to safeguard human survival.

Ever since the atomic bombing of Japan in 1945, prophetic voices have warned of doom if the world does not ban nuclear weapons.  And yet, the nine nuclear powers are currently engaged in a new nuclear arms race to build ever faster, more devastating weapons that, if used, will annihilate nearly all life on earth.

About three decades ago, climate change also became a major public issue, with scientists, politicians, and environmental organizations issuing prophetic statements about the extreme dangers ahead.  Today, following a remarkable display of inaction, massive wildfires and floods sweep across nations, the polar ice caps are melting, sea levels are rising, and millions of climate refugees are fleeing for their lives.

More recently, specialists have warned of the outbreak of new, highly contagious and potentially deadly diseases around the globe.  And the result?  Thanks to the failure of governments to implement the necessary public health measures, a COVID-19 pandemic has already produced over 222 million cases and 4.6 million deaths, with no end in sight. 

Curiously, though, there is a major difference between the Cassandra of the Greek myths and her modern counterparts.  In the myths, Cassandra was ineffective because she was simply not believed.  By contrast, most people do believe our modern Cassandras and want action taken to avert catastrophe.

When it comes to nuclear weapons, polls have repeatedly shown that most people favor eliminating them.  A 2008 public opinion survey in 21 nations worldwide found that large majorities in nearly all the nations supported the total abolition of nuclear weapons.  Recently, public opinion surveys in Europe, Japan, and Australia reported similar results. 

Substantial majorities of people polled around the world also feel seriously endangered by climate change.  A 2018 Pew Research Center survey of people in 26 nations in North America, South America, Europe, Asia, the Middle East, and Africa found that a median of 68 percent regarded climate change as a “major threat,” 20 percent as a “minor threat,” and only 9 percent as “not a threat.”  In early 2021, the UN Development Program announced the results of the “People’s Climate Vote” that covered 50 nations with over half the world’s population.  The program’s administrator declared that they “clearly illustrate that urgent climate action has broad support amongst people around the globe.”

The COVID-19 pandemic also sparked an exceptionally strong demand for remedial action.  In late February 2021, an Ipsos survey of people in 15 nations found overwhelming majorities intending to be vaccinated, including in Brazil (89 percent), Italy (85 percent), China (82 percent), Spain (82 percent), Mexico (80 percent), South Korea (80 percent), Canada (79 percent), Australia (78 percent), and Japan (74 percent).  Although rightwing politicians in the United States downplayed the seriousness of the epidemic and railed against vaccines and other public health measures, recent polls have found that 64 percent of Americans approve of mandatory vaccinations for everyone in the United States and that the same percentage backs mask mandates for all public places.

Even so, governments have not taken adequate action to stave off the catastrophes of nuclear war, climate change, and disease pandemics.  Why?

One key factor is the control of public policy by self-interested economic forces.  Seeking lucrative military contracts from the U.S. government, giant corporations campaign relentlessly for the building of new nuclear weapons.  In 2020, the major nuclear weapons contractors in the United States employed 380 lobbyists and spent $60 million on lobbying, with great success.  This expenditure, of course, does not include their lavish campaign contributions to friendly politicians.   

Nor should we forget the immense role that wealthy fossil fuel corporations have played in sabotaging action to avert climate catastrophe.  Although ExxonMobil and other oil companies knew decades ago about what their products were doing to the environment, they funded a massive misinformation campaign designed to deny the findings of climate science, subvert public opinion, and block international treaties that could curb greenhouse gas emissions.  Thus far, they have been very successful.

As for the giant pharmaceutical companies, they treat the COVID-19 pandemic as an opportunity to reap vast profits.  Public health, of course, is dependent upon the worldwide distribution of antiviral vaccines as quickly as possible.  But the corporations manufacturing the vaccines, determined to maximize their income, refuse to waive their patent rights, thus preventing other companies or governments from producing or distributing the vaccines and, thereby, competing with them.  In this situation of scarcity, they sell the vaccines to the highest bidders among governments—overwhelmingly those of the richest nations.  Consequently, as of August 30, 2021, 57 percent of people in high-income countries had received at least one dose of a vaccine, while only 2 percent had received one in low-income nations.

A second key factor behind the inadequate response to these crises is the absence of a system of global governance.  Even when the baneful influence of powerful corporate entities is overcome, on occasion, in individual nations, there is no structure that can take remedial action on a global basis.  The closest the world has come to that structure is the United Nations.  But, if the United Nations is to meet the challenges posed by these existential crises, it needs substantial strengthening of its authority and resources.

Consequently, until corporate influence is curbed and the United Nations strengthened, our modern Cassandras’ warnings seem likely to go unheeded.

Inequality Tends to Reach Political Tipping Points – Is That Happening Today?

US Rep. Alexandria Ocasio-Cortez (D-NY) wears a dress with the slogan "Tax the Rich" at the 2021 Met Gala. With her is the dress's designer Aurora James.


Two things have been ever-present in the history of American wealth: 1) inequality grows, and 2) it then gets redressed through adjustments and policies when the gap between the haves and have-nots brings pronounced stress in society. Both are part of American business history, so the calls in 2021 for child tax credits or higher minimum wages are not inconsistent with similar calls in the past.

 

In its earliest days, America had far less wage inequality than its European forebears — slaves and the scourge of slavery excepted — since land was plentiful and labor was scarce, the very opposite of the conditions found in Europe. Average wages in America quickly exceeded those in Europe.

 

But wealth inequality rose dramatically to its first historical peak in the mid-to-late 1800s with the massive railroad, steel, and coal companies of the Industrial Revolution. Many workers’ lives deteriorated as they descended into coal mines and trudged into steel mills, and their livelihoods became bound to the unforgiving boom and bust cycle of industry.  

 

Soon a chasm separated the very wealthy and the poor. Immigrants flooding into the United States often lived in heartbreaking, abject poverty — made all the more disturbing in contrast to the glittering lifestyles of the wealthy that surrounded them.  

 

Depressions such as the one that followed the Panic of 1893 led to steep wage reductions at large companies. For miners and other workers already living at subsistence levels, these reductions had grim consequences. They led to social disruption in the form of strikes that were often lethal, including the Bituminous Coal Miners’ Strike of 1894, the Lattimer Massacre in 1897, and the Battle of Virden in 1898.  

 

As the fortunes of a select few grew to previously unimaginable heights and economic inequality climbed right along with them, the Progressive political movement developed in the early 1900s to oppose laissez-faire capitalism and monopolistic corporations and to close the inequality gap. 

 

Business history shows that wealth inequality tends to reach a tipping point. Progressives such as Jacob Riis, Ida Tarbell, and Jane Addams condemned harsh work conditions, especially for children, and advocated for public education. Theodore Roosevelt adopted Progressive trust-busting and regulatory policies when he became president in 1901.

 

But Henry George, the most famous economic writer of the Progressive era, had a more dramatic approach to wealth inequality. He railed against low wages and inequality and espoused a philosophy known as Georgism, which contended that the benefits of land belonged equally to all. He advocated a “single tax” on the unimproved value of land, arguing that all government revenue should come from this tax. Although George is all but forgotten today, his most famous work, Progress and Poverty (1879), sold millions of copies internationally and likely had a larger worldwide circulation than any work on economics ever written.

 

The decimation of fortunes in the Great Depression and the widespread demand for labor in World War II once again reduced our nation's inequality, albeit by grim means, and in the 1950s and 1960s the middle class reached an apex as measured both by percent of the population and by relative income.

 

Unions ascended in power and influence — in automobiles, steel, mining, communications, trucking, and more. Union members were 7 percent of the workforce in 1930. Through the Great Depression that figure rose to 18 percent and then peaked at 28.3 percent in 1954. These unions won and expanded key benefits, especially pensions and health care. Today unions represent only 10.8 percent of the workforce. 

 

In recent decades, however, inequality has crept up yet again. In the United States, the Gini coefficient, a measure of inequality in which a higher number represents greater inequality, has climbed markedly from 0.40 in 1980 to 0.48 in 2019.

 

From 1989 to 2019, the financial net worth of the lowest 59.9 percent of U.S. households declined from 43 percent to 24 percent of their income, leaving them with diminished relative capacity to afford education, investments, and other expenditures. For the top 10 percent, financial net worth more than doubled, from 158 percent to 335 percent of their income. The contrast could hardly be more stark.

 

It should not shock or surprise us, therefore, that we're once again hearing about policies to narrow the wealth inequality gap. Some attribute the large followings of both Donald Trump and Bernie Sanders partly to the discontents created when the haves and have-nots are so far apart.

 

Proposals for a $15 minimum wage, more support for childcare and education, and even an alternative minimum income did not emerge out of nowhere as radical notions imported from Western Europe. To the contrary: measures to narrow wealth inequality have been as constant in U.S. business history as the growth of wealth inequality itself. 

 

The Roundup Top Ten for September 17, 2021

Martin Luther King Knew: Fighting Racism Meant Fighting Police Brutality

by Jeanne Theoharis

Despite efforts to portray contemporary movements like Black Lives Matter and radical groups like the Black Panther Party as deviations from the "respectable" movement led by MLK, the SCLC leader insisted on the need to combat police brutality despite the unpopularity of that position.

 

Politicians, not Migrants, are Fueling the Pandemic's Resurgence

by Randa Tawil

At the height of colonialism, European governments rejected calls for quarantine to keep global commerce humming, and blamed supposedly unsanitary local populations for the inevitable spread of cholera. Governors in some US states are repeating this mistake today. 

 

 

Another 9/11 Legacy? The Spread of Conspiracy Theories Online

by Jeff Melnick

9/11 happened as traditional American media outlets were being consolidated into a small number of corporate networks, encouraging people seeking information to turn to decentralized sources and, eventually, social media, opening space for misinformation and conspiracy theories. 

 

 

The Conspiracy Theorists Are Coming for Your Schools

by Thomas Lecaque

"Over the past year, as the conspiracy theorists have come together under one big apocalyptic tent we have seen organized campaigns of harassment, threats of violence, attempts to harm members of school administrations, and physical altercations at school board meetings when masks are mandated."

 

 

There’s a Very Good Reason ‘Washington Slept Here’

by Nathaniel Philbrick

"Today the phrase 'Washington slept here' is a historical joke, but during the two years of intermittent travel at the beginning of his presidency, all those nights spent in taverns and homes across the country were essential to establishing an enduring Union."

 

 

The 70s are Back, But Not How You Think

by Lauren Rebecca Sklaroff

"In the coronavirus era, disco themes resonate. People long for community and wonder if leaders have our backs. Social media offers some of the trappings that defined disco — from the clothes to the allure of being seen in a new way."

 

 

The Melting of the American Mind: Internet Pop Psychology and the Authoritarian Personality

by Maya Vinokour

The internet and social media have worked to normalize and validate authoritarian and illiberal worldviews, making the mindset that baffled thinkers like Theodor Adorno in 1947 commonplace today. 

 

 

On the Eve of Destruction: Breaking the Double-Bind of the Nuclear Arms Race

by Richard Rhodes

Politicians and defense contractors who wanted American nuclear supremacy won out over scientists seeking international effort to contain the extinction-level threat posed by thermonuclear weapons, even to the point of denying the planet-destroying power of the H-bomb. 

 

 

The Winner in Afghanistan? China

by Alfred McCoy

While the similarities between the American exits from Vietnam and Afghanistan are superficially obvious, the differences are more significant, and signal a steep decline in America's ability to influence world affairs. 

 

 

Melcher's Ghosts

by Monica Black

"Denazification prompted less soul-searching than resentment and anxiety among the German population. People worried that their prior affiliations and involvement in everything from war crimes to far less nefarious acts—like having obtained property illegally during the Nazi years—would be revealed."

 

Memo From Irish History: Welcome to Your Future, American Women

A memorial to Savita Halappanavar, Dublin, 2018. Halappanavar died in 2012 of sepsis related to a miscarriage because her doctors, fearing prosecution under Irish abortion law, delayed performing life-saving surgery. The mural calls for a "yes" vote on a referendum to overturn Ireland's Eighth Amendment, which established a fetal right to life and committed the state to its defense. Photo Zcbeaton, CC BY-SA 4.0


A misogynistic new anti-abortion law that imposes the strictest limitations on access to abortion in the entire United States took effect in Texas on September 1. According to the law, pregnant people–who, despite the text of this legislation, may or may not be “women” in the sense that they are adult human females who self-identify as such–must recognize that they are pregnant and procure an abortion by the gestational age of six weeks.

The now-infamous law deputizes private citizens to snitch on their friends and neighbors in a manner akin to witch hunts. It incentivizes such snitching (in the form of “civil suits”) with the promise of a $10,000 reward if the accused is found liable for providing or aiding and abetting an abortion after six weeks’ gestation.

It may be difficult for many Americans to envision a world in which pregnant women and girls are seen as the tools of men and/or of the state, because Roe v. Wade has been the law of our land since January 1973. But as a historian of modern Ireland who wrote her dissertation on sexual assault, I understand this world all too well. Americans can and should learn what the Texas law means in practice. It means that in the state of Texas people with male bodies can exercise full autonomy over themselves, including their sexual choices and the consequences thereof; people with female bodies cannot.

In the Republic of Ireland, however, all living adults remember the time prior to 2018 when abortions, which had already been illegal in Ireland for over a century, were banned more forcefully under the Eighth Amendment of Bunreacht na hÉireann, the Constitution of Ireland. Indeed, that document read, “The State acknowledges the right to life of the unborn and, with due regard to the equal right to life of the mother, guarantees in its laws to respect, and, as far as practicable, by its laws to defend and vindicate that right.” 

Under this amendment, Irish women’s bodies were not their own; the state claimed the right to force a pregnant person to give birth. This policy instrumentalized women as not fully human individuals; rather, in the words of philosopher Martha Nussbaum in her book Sex and Social Justice, pregnant people were “means to the ends of others.” 

For much of the history of independent Ireland, the state concerned itself with showing Irish Catholic morality in contrast with the alleged immorality of Britain, its former colonial master. As a consequence, the sexual ethics of Irish people, especially Irish women, became the focus of the Catholic church in alliance with the government. In fact, as historian James M. Smith has written, this alliance worked to “contain” sexuality in such a way that Irish women could be seen as “pure” and Ireland could be portrayed as a place where immoral sexual acts such as fornication, rape, and abortion did not occur. 

Of course, these acts occurred regularly, but the goal was to keep such immorality out of sight by pretending it did not exist. The most odious methods by which “immorality” was hidden involved either (1) confining such “immoral” girls and women to the now-notorious Magdalene asylums, where “fallen” girls and women lived out miserable lives, or (2) forcing them to reside and give birth in “mother and baby homes,” where they were similarly removed from the eyes of society.

The consequences of “containing” sexuality in Ireland are myriad and have been detailed by many historians, including myself, in other places. Here we are concerned only with the consequences these policies had on the lives of pregnant girls and women, as this is the horror show that is about to ensue in Texas and other states that will inevitably pass similar statutes now that the Supreme Court of the United States has allowed the Texas law to take effect. In order to make the comparison clear and relevant, I will focus on cases that occurred in Ireland in the past thirty years.

In 1992, a fourteen-year-old girl publicly referred to as X was raped by a neighbor and became pregnant. The young girl was suicidal as a result of the rape and pregnancy, so her parents decided to take her to Britain to have an abortion. Before they did so, though, her parents contacted the gardaí (the Irish police) to inquire whether fetal tissue obtained after an abortion would be admissible as evidence of paternity at the rape trial. Guided by the Eighth Amendment’s declaration that the state had to “vindicate” the “life of the unborn,” the state issued an injunction that compelled X to return to Ireland before the abortion was performed. The Irish High Court interpreted the Eighth Amendment to mean that the government was obligated to seek her return to Ireland and keep her in the country until she gave birth. In the end, President Mary Robinson invoked her power to refer the X Case to the Supreme Court, which reversed the High Court’s decision and lifted the injunction (by which point this was moot because X had already had a miscarriage).

In 2010, cancer patient Michelle Harte discovered that she had accidentally become pregnant. While her oncologists advised her to have an abortion, the ethics committee of Cork University Hospital said that she was not eligible for an abortion in Ireland because they did not believe that Michelle’s life was in “immediate threat.” In the meantime, cancer treatment was stopped because of the pregnancy, and Michelle frantically gathered together the resources to travel to Britain to obtain an abortion. She was reportedly so weak by the time she was ready to travel that she could not board the plane without assistance. A writer for the Irish Times noted, “...the fact that [Michelle’s] life was at risk as a result of pregnancy wasn’t just a matter of probability: it was a medical certainty.” In Harte’s case, the “rights” of a fetus were given equal consideration to the dire need of the woman for cancer treatment. 

Although these cases certainly aroused the fury of Irish women and their allies, the death of Savita Halappanavar in 2012 was the moment everything changed in Ireland. Savita was a 31-year-old woman who was seventeen weeks pregnant when, in October 2012, she presented at a Galway hospital in the middle of a miscarriage in which the gestational sac was already protruding through her cervix. Fetal death was imminent, but the doctors, fearing the wrath of the Eighth Amendment, did not perform a surgical abortion because there was still a fetal heartbeat. By the time they did perform the procedure, Savita was suffering from septicemia, and she died a painful death that could have been prevented entirely had the doctors not feared the legal consequences of providing appropriate treatment for their patient.

As Irish feminist Rosita Sweetman wrote in her book Feminism Backwards, “Savita’s death symbolised what all women faced as long as the Eighth Amendment remained, as long as the life of the foetus was legally on a par with the life of the mother.” 

Americans, welcome to your future. 

The outcry following Savita’s death galvanized the movement that eventually culminated in the repeal of the Eighth Amendment to the Irish constitution in May of 2018. The role of the Catholic church in Ireland had been declining for many years at this point, as the role of women in the public sphere had been expanding. Still, it took the tragic death of a young woman to finally force the policy to change in Ireland. Unfortunately, ultra-conservative members of the U.S. Republican Party are heading in the opposite direction, and American girls and women will suffer the same fates that these Irish women suffered.

Elizabeth Nash, state policy analyst for the Guttmacher Institute, told the New York Times that “Health providers will be very conservative about interpreting the law, because they don’t want to cross a line.” This fear will lead to the premature or preventable deaths of pregnant cancer patients like Michelle Harte, as doctors struggle to understand when an abortion beyond six weeks gestation is considered medically necessary under the law. 

The Texas law will likely lead to cases that resemble the tragedy of Savita Halappanavar, the woman who died of a septic miscarriage because her doctors were afraid of being prosecuted for performing an illegal abortion, even though Savita’s pregnancy was clearly no longer viable when she presented at the hospital in Galway. How many girls and women like Savita will present at Texas hospitals with similar cases? How many of them will die? 

The Texas law will force questions about the right of girls and women to move freely around the state, as people who drive them to abortion clinics are considered to be "aiding and abetting" the illegal procedure. Uber and Lyft have promised to pay the legal fees of drivers accused of aiding and abetting in this fashion, which is commendable, but their pledge is really beside the point. Will pregnant people be enjoined from leaving the state to procure an abortion, as the girl-child X was in Ireland in 1992? And if not, is the law really intended to punish indigent girls and women, those who cannot afford to travel beyond Texas? 

Protests over the X Case occurred on both sides of the Atlantic Ocean, with protesters holding posters that read "Ireland Defends Men's Right to Procreate By Rape." The Texas law may similarly be interpreted to allow sexually violent men to force girls and women to carry pregnancies to term. Although the text of the law reads that "a civil action under this section may not be brought by a person who impregnated the abortion patient through an act of rape, sexual assault, incest, or any other" criminal conduct, the law does not specify whether the criminal act must be proven or merely alleged for this provision to apply. Men could therefore avail themselves of this law by impregnating a girl or woman through rape and then reporting a violation in order to collect $10,000. At least in Ireland, the law did not incentivize the sexual abuse of women.

There is further evidence that the Texas law is intended to punish people with female bodies and commit violence against them. Were the law truly intended to protect women and fetuses, it would compel the impregnating party (that is, the provider of the sperm that fertilized the egg) to begin providing monetary support for the pregnancy at the same point at which abortion becomes illegal. The law does not do that. The law therefore fosters a society that seeks (1) to shame women for acting on their own sexual agency, and (2) to shield men from the consequences of their sexual choices.

Ireland was able to look at its recent history of the horrific treatment and preventable deaths of pregnant women and girls and say “Enough.” I hope that the government of the United States can say “Enough” and act to make girls and women fully autonomous human beings under the law before we have our own X, Michelle Harte, and Savita Halappanavar.

Untold Stories from the Largest Boat Lift in History


As we approach the 20th anniversary of the terrorist attacks on September 11, 2001, we might think that we've heard all we need to (and then some) about that day. In a nation consumed with mourning more than 650,000 people lost to the continuing COVID-19 pandemic, what does it mean to remember the outpouring of unity and grief that followed the murder of close to 3,000 people after passenger-filled planes were piloted as missiles?

Innumerable accounts have combed the depths of the grief, shock, anger, and resilience of that day. They chronicle epic heroism and tiny acts of kindness. But “9/11 fatigue” does a disservice to history. Beyond the jingoistic rhetoric, the market-driven spectacle, the bumper-sticker sloganization of memory, the printed tourist guides and Twin Towers tchotchkes for sale around the site perimeter, lie essential, inspiring, and instructive stories about basic human goodness and the power of collective action. Episodes of pragmatism, resourcefulness, and compassion. Moments of grace and solicitude. Lifesaving efforts born of professional honor. The joining of unlikely hands.

Still, remarkably, some of the most affecting of these stories have gone unheard.

Here’s just one. On the morning of September 11, telecom specialist Rich Varela was working a contract gig on the twelfth floor of 1 World Financial Center, directly across the street from the South Tower. A few minutes into his workday, he looked around the windowless, nearly soundproof “comp data” room humming with servers, telephone switchboards, and other electronics, and noticed that things seemed oddly quiet. There must be a late bell today, he reasoned.

About 15 minutes after he’d sat down at his desk, a “crazy, ridiculous rumble” erupted, and the building did a little shimmy. Maybe it was one of those big 18-wheelers, thought Varela, picturing the thunder created when a large truck rolls over metal plates in the street. He gave a passing thought to his surroundings. If that had been an explosion, from a blown gas line or something, nobody would even know I was in here.

A few minutes later, his buddy called from Jersey. “Get outta there! They’re crashing planes into the World Trade Center!”

Because his friend had a history as a prankster, Varela didn’t buy his story right away. “No, I’m serious,” the friend said. And then it clicked. That rumbling sound. Varela gathered his things, neither panicked nor dawdling. He opened the thick door of the comp room that had shielded him from the screeching and strobes of the building’s fire alarm. Down the hall, phone receivers dangled off corkscrewed cords, papers lay strewn across worktables, and chairs loitered at odd angles rather than nestling neatly under their desks. The trading floor was wholly uninhabited. “It was like people just evaporated.”

In the lobby, Varela caught his first glimpse of what looked to him like Armageddon. The whole front of 1 World Financial Center had imploded, leaving the plate glass window in shards. Amid piles of smashed concrete and polished stone, pockets of flame feasted on combustibles. Varela stared at the blazing hole in the side of the South Tower. “You could hear a pin drop. You’re in Lower Manhattan. You could hear sirens in the distance but immediately in the area there was no motion of life. I thought that was so eerie.”

Outside, a series of artillery-like blasts made him duck for cover. The eruptions sounded like “cannon fire or missiles coming into Manhattan.” Boom! Boom! Are there battleships out in the water shooting planes out of the sky? Varela wondered. Turning to face the towers, he realized he was hearing the sounds of bodies hitting the ground. He tried to shake off the images as he headed toward the Hudson.

Halfway to the river, still reeling, Varela heard a rumbling. The South Tower was imploding. He tried to outrun the plume of ash, dust, and smoke that barreled toward him. At the seawall, Varela spotted a boat.

On September 11, 2001, nearly half a million civilians caught in an act of war on Manhattan escaped by water when mariners conducted a spontaneous rescue. This was the largest waterborne evacuation in history—more massive than the famous World War II rescue of troops pinned by Hitler’s armies against the coast in Dunkirk, France. In 1940, hundreds of naval vessels and civilian boats rallied to rescue 338,000 British and Allied soldiers over the course of nine days. But on that Tuesday in 2001, approximately 800 mariners helped evacuate between 400,000 and 500,000 people within nine hours. The speed, spontaneity, and success of this effort was unprecedented.

Somehow, in the sea of reporting that followed the attacks, this fact has garnered remarkably little attention. I wrote Saved at the Seawall: Stories from the September 11 Boat Lift to address that omission.

Within minutes after thick, gray smoke began spilling through the airplane-shaped hole in the World Trade Center’s north tower, adults and children—some burned and bleeding, some covered with debris—had fled to the water’s edge, running until they ran out of land. Never was it clearer that Manhattan is an island. Mariners raced to meet them, white wakes zigzagging across the harbor. Hours before the Coast Guard’s call for “all available boats” crackled out over marine radios, ferries, tugs, dinner boats, sailing yachts, and other vessels had begun converging along Manhattan’s shores.

So many people ached to contribute something that day and the days that followed. They donated blood, bagged up stacks of peanut butter and jelly sandwiches, built stretchers that went unused. But, as fate would have it, mariners who had skills and apparatus that could help right away became first responders. Their collaborative efforts that morning saved countless lives. So did other selfless acts made by people from all walks of life. Varela is just one shining example.

As the debris cloud rained down on him, Varela bounded over the steel railing separating the water from the land and leapt onto the bow of fireboat John D. McKean. He felt his leg buckle and almost snap when he hit the deck. A mass of people jumped on after him, falling onto the deck, some landing on him, and the boat rocked under the weight of the leaping hordes. Varela worried it might capsize. He stumbled over people on his way to the far side of the deck, away from the avalanche, then curled in on himself, choking as everything went black.

When the air cleared a bit, Varela saw casualties all around him. Somebody was nursing a broken leg. A woman lay splayed out beside the bow pipe. It looked as if she had landed face-first on the steel deck. He hollered to a nearby firefighter that she needed medical attention—that she was unconscious and might already be dead. There was little anyone could do on the boat, so reaching a triage center, quickly, was imperative. But other lives needed saving, and people continued to clamber aboard.

Quickly, Varela made his first choice of many that day to help others. Coughing and gagging, he yanked off his gray-green long-sleeved cotton shirt, tore off a strip, and wet it with water he found dribbling from some leaky hose on deck. He tied the makeshift filter around his face and then tore off more strips for fellow passengers.

Soon, the fireboat McKean would evacuate people to safety at a rundown pier in Jersey City. Before that, though, Varela helped heave up a docking line used to rescue a young woman from the water. While there on the Jersey side, he helped carry a chair to transport a man with a shattered leg off the boat.

Then, just as the fireboat crew prepared to cast off lines, the second tower collapsed. Varela saw the looks on the faces of the fireboat’s crew as 1 World Trade Center pancaked down, burying their fellow firefighters along with the civilians they’d been sent to save.

Their horror prompted Varela to make the decision of a lifetime. "I'm coming with you," he said. "You guys need help."

“Let’s go,” came the reply, and Varela jumped back on board. So did an older gentleman, explaining, “My son’s in that building.”

“It really felt like, I might die today,” Varela later said. “And I was okay with it.” These guys need help, he thought. And that was it.

So many people that day made choices to take risky action for the sake of others: firefighters climbing the stairwells, co-workers helping those less mobile to escape the burning buildings, office workers in dress shirts hauling equipment with rescue workers in the plaza, and mariners dropping evacuees at safer shores, over and over, then setting course straight back to the island on fire to save still more people trapped at the seawall.

Surfacing these long-overlooked stories grants us a window onto who we have been for one another and who we can be again. Remembering that this, too, is part of our heritage can help us reclaim our humanity as we face the perils of today.

9/11's Memorials and the Politics of Historical Memory

National 9/11 Memorial, Manhattan. Photo by author.


The post-9/11 era, which effectively began shortly after the terrorist attacks on September 11, 2001, has decidedly come to an end. Shaped not only by the loss of life on 9/11 but also by all that ensued in its wake, including the long wars in Afghanistan and Iraq, it was defined by fears of foreign terrorism, security culture, and a xenophobic and jingoistic form of patriotism. But the post-9/11 era was also significantly shaped by an under-appreciated force: the culture of memory.

 

As the 20th anniversary of 9/11 looms, it makes sense to reflect on this preoccupation, if not obsession, with memory. The urge to memorialize two decades ago was swift and strong, rising out of a deep sense of grief and loss. Over one thousand 9/11 memorials were built around the country and around the world. Many of these memorials were prompted by the decision of the Port Authority of New York and New Jersey to hand out pieces of steel recovered from the site for memorials from 2010 until 2016. In New York, the 9/11 memorial and museum garnered extraordinary attention when they opened in 2011 and 2014, respectively. Together, they also cost almost $1 billion, a significant percentage of it coming from public funds.

 

Memorialization has been a nationally affirming enterprise, providing comforting narratives of national unity whose coherence nonetheless required the exclusion of many aspects of the event. With memorials being built well into the 2010s, it seemed increasingly that the surfeit of 9/11 memory was not about 9/11, or even those who died that day. Rather it reflected a desire to return to that post-9/11 moment of national unity, in which, however falsely, the nation seemed to speak with one voice: we are Americans.

 

What does it mean to remember 9/11 20 years later, when an entire generation has been born since? The memory-focused rebuilding of Ground Zero in lower Manhattan raises this question most painfully. Does anyone still care that One World Trade Center, formerly known as the Freedom Tower, is 1,776 feet tall in a gesture of patriotism? Or that the $4 billion publicly funded Oculus shopping mall (excuse me, transportation hub) has a skylight that opens on the anniversary of 9/11? The 9/11 museum, which tells a nationalistic story of 9/11 as an exceptional historical event and sells 9/11 hoodies and souvenirs in its gift shop, looks increasingly dated. Both the museum and the memorial are hugely expensive to run, the memorial because of its water features and security costs. The museum's business model of selling entry tickets to tourists for $26 has been challenged by the pandemic, forcing it to furlough or lay off more than half its staff and cancel its plans for 20th anniversary special exhibitions. It has become the subject of debate about its relevance a mere seven years after its opening. 

 

Despite the proliferation of 9/11 memory, a notable shift in national memorialization was signaled when in 2018 the National Memorial for Peace and Justice and Legacy Museum opened in Montgomery, Alabama. A memorial to over 4,000 victims of lynchings, it demands recognition that terrorism is not a new or foreign aspect of American history, but has long been a part of the US national story as racial terrorism. The Legacy Museum re-narrates the US myth of racial progress to argue that the contemporary mass incarceration of Black Americans is evidence that slavery never completely ended.

 

National Memorial for Peace and Justice and Legacy Museum, Montgomery, Ala. Photo by author.


Here, memory is being deployed not to uphold a myth of national unity, as 9/11 memory did, but to demand that the national script be revised. It is notable that Bryan Stevenson, who founded the memorial and museum through the Equal Justice Initiative, felt that memorialization was the best strategy to raise public awareness about the legacies of slavery. At the same time, long fought-over Confederate monuments have been toppled and removed, a reckoning with the past that once seemed impossible and now seems inevitable. 

 

We don’t know what the new era we have entered will bring. But the demand that the nation confront its own history of terrorism has been activated. This means remembering terrorism not as a force that comes from outside but as a fundamental aspect of the American project of settler colonialism and slavery that lives on today. The memory of this past must be confronted in the present. To recognize the history of US terrorism is not to demand shame but to open up the opportunity for the nation to move forward from its difficult histories. 

Scammed From the Beginning: Rejecting Expertise as an American Value


My home state of Arkansas has been the subject of many recent news reports due to a low incidence of vaccination against COVID-19 combined with a high incidence of the unvaccinated rushing to feed stores to purchase cattle deworming agents in the belief that these are more effective against a raging respiratory illness than any of that stuff being "pushed" by the "medical establishment." Many commentators have linked this behavior to the prevalence of conspiracy theories on social media, while others center the increasing failure of average citizens to respect the knowledge and skills experts have accumulated through years of education and experience. The various books touted as providing some kind of means for understanding our present moment fall into these frameworks, from Nancy L. Rosenblum and Russell Muirhead's A Lot of People Are Saying: The New Conspiracism and the Assault on Democracy (2019) to Tom Nichols's The Death of Expertise: The Campaign against Established Knowledge and Why It Matters (2018). However, I would argue that the best text for making sense of why people across the country are rushing to eat horse paste during a pandemic was published more than thirty years ago—Nathan O. Hatch's 1989 The Democratization of American Christianity.

By tracking the evolution of five different religious movements in the early American republic (the Christian movement, Methodists, Baptists, Black churches, and Mormons), Hatch demonstrates how the revolutionary fervor for all things democratic pervaded the spiritual and cultural realms as much as it did the political, resulting in the conscience becoming individualized and all traditionally earned authority being held suspect. "Above all, the Revolution dramatically expanded the circle of people who considered themselves capable of thinking for themselves about issues of freedom, equality, sovereignty, and representation. Respect for authority, tradition, station, and education eroded," Hatch writes. "It was not merely the winning of battles and the writing of constitutions that excited apocalyptic visions in the minds of ordinary people but the realization that the very structures of society were undergoing a democratic winnowing."

These popular religious movements “denied the age-old distinction that set the clergy apart as a separate order of men, and they refused to defer to learned theologians and traditional orthodoxies.” In addition, they also empowered “ordinary people by taking their deepest spiritual impulses at face value rather than subjecting them to the scrutiny of orthodox doctrine and the frowns of respectable clergymen.” One early Baptist leader, John Leland, emphasized the right of any layman to read and interpret the Bible for himself, writing, “Did many of the rulers believe in Christ when he was upon earth? Were not the learned clergy (the scribes) his most inveterate enemies?” Some backwoods religious dissenters went even further and emphasized their own illiteracy as making them purer vessels for the Almighty, unable to be corrupted by traditional, elite education.

The craving for democratic equality went beyond the statehouse and beyond the church and infused all spheres of life. In his work, Hatch draws occasional attention to folk like Samuel Thomson, an "uneducated practitioner of natural remedies who learned his botanic medicine in rural New Hampshire at the close of the eighteenth century." In his autobiographical narrative, Thomson argued that Americans "should in medicine, as in religion and politics, act for themselves." He and his adherents published an array of pamphlets and journals that "made their case by weaving together powerful democratic themes." As Hatch summarizes, the cornerstone of their teachings was this: "In the end, each person had the potential to become his or her own physician." No wonder, then, that such medical practices were adopted by these increasingly democratic religious movements. And medicine was not the only profession increasingly undergoing a "democratic winnowing," for everywhere, common people were determined to throw off the shackles of traditional expertise often associated with the old order as it pertained to law, economics, and more.

According to legend, as British troops surrendered to General George Washington at the end of the Siege of Yorktown, their band showed a remarkable sense of the historical import of the moment by playing a little tune called "The World Turned Upside Down." Most historians hold this tale to be apocryphal, but there is some irony in the legend, for the ballad was first published in the 1640s as a protest against Parliament's prohibition of the celebration of Christmas. In England, the common people defied the elite Parliamentary infringement upon their "lowly" traditions, singing:

Listen to me and you shall hear, news hath not been this thousand year:
Since Herod, Caesar, and many more, you never heard the like before.
Holy-dayes are despis'd, new fashions are devis'd.
Old Christmas is kickt out of Town.
Yet let's be content, and the times lament, you see the world turn'd upside down.

Yet in America, those "new fashions," associated with Herod and Caesar in this ballad, flew under the banner of Christ. But not only was Christmas "kickt out of Town" in America—so, too, was that more recent tradition of inquiry dubbed the Enlightenment. When I was in grade school, we were taught to regard the American Revolution as the apogee of Enlightenment thinking, the incarnation of those lofty ideas of liberty expounded by the likes of John Locke and Jean-Jacques Rousseau. And certainly, those who formulated the Declaration of Independence and the American Constitution drew from their works. However, the main thrust of American culture ran in the opposite direction.

“This vast transformation, this shift away from the Enlightenment and classical republicanism toward vulgar democracy and materialistic individualism in a matter of decades, was the real American revolution,” writes Hatch. Some writers may lament the so-called “death of expertise,” but we have to ask: when was it ever truly embraced here in the United States? Those empty shelves of cattle dewormer at the local feed store are as much a legacy of the American Revolution as the laws that govern this country. The motivation to self-medicate with horse paste is driven not by some kind of new, mad, conspiratorial thinking but, instead, by a fervent, foundational belief that “all men are created equal.”

The Specter of Emancipation and the Road to Revolution: A Rejoinder to Richard Brown et al.

Editor's Note: HNN recently reposted an excerpt of a Medium post authored by Carol Berkin, Richard D. Brown, Jane E. Calvert, Joseph J. Ellis, Jack N. Rakove, and Gordon S. Wood. That post took the form of an open letter criticizing remarks made by Dr. Woody Holton in the Washington Post addressing the significance of Lord Dunmore's proclamation, which promised emancipation to enslaved Virginians who took up arms on the side of the Crown in 1775, and the broader significance of the preservation of slavery as a motive for American independence.

HNN has offered Dr. Holton the opportunity to publish a rejoinder, which he has accepted. 

 

I am flattered that six distinguished professors of the American Revolution have taken an interest in my work—or at least its potential impact. Just one index of these scholars' significance is that I cite all six of them in my reappraisal of the founding era, Liberty is Sweet: The Hidden History of the American Revolution. It is due out next month.

 

But it saddens me that these senior professors have chosen to deny the obvious fact that the informal alliance between enslaved African Americans and British imperial officials infuriated white colonists and helped push them toward independence.  Surely the professors know that the Continental Congress chose as the capstone for its twenty-six charges against King George III the claim that the king (actually his representatives in America) had “excited domestic insurrections”—slave revolts—“amongst us.”

 

Congress's accusation capped more than a year's worth of colonial denunciations of the British for recruiting African Americans as soldiers and even—allegedly—encouraging them to slit their masters' throats (as writers in Maryland, Virginia, and North Carolina all expressed it). Indeed, the six professors' timing is perfect. Others have also doubted this claim, especially in reaction to the New York Times's "1619 Project," so last month I began a project of my own. Every day I tweet out one quotation from a white American of 1774-1776 who denounced Britain's cooperation with African Americans, along with an image of the quoted document.

 

The book version of the #1619 Project appears in 76 days. 1 of its central claims—that colonial whites’ rage at the Anglo-African alliance pushed them toward Independence—has been disputed. So I will tweet 1 piece of evidence every day for the next 76.

— Woody Holton (@woodyholtonusc) September 1, 2021

 

I will end the series after seventy-six days, but I have collected sufficient evidence to go on and on.

 

I am in no position to lecture these distinguished professors, who count three Pulitzer prizes among them, but since they have criticized my work, I have no choice but to speak plainly: I think their critique betrays a fundamental misunderstanding of how the Declaration of Independence came about.

 

It happened in stages. In 1762, most colonial freemen were, all in all, satisfied with their place in the British empire. Indeed, as Prof. Wood’s former student Brendan McConville emphasizes in The King’s Three Faces, they loved their new king. The initiative for changing the imperial relationship came not from the colonies but from Parliament. From 1763 through late 1774, Parliament sought more from the provincials, especially in the areas I like to summarize as the 4 Ts: taxes, territory, trade, and treasury notes (paper money). And all the free colonists wanted was . . . none of those changes. Until late in 1774, they strenuously resisted Parliament’s initiatives, but most of them would have been perfectly happy to return to the status quo of 1762. They did not seek revolution but (to use another loaded word from English history) restoration.

 

The grand question then becomes, "What converted the colonists from simply wanting to turn back the clock—their view from 1763 to 1774—to desiring, by spring 1776, to exit the empire?" Many things: the bloodshed at Lexington, Concord, and Bunker Hill; the news that the administration of Lord North was going to send German ("Hessian") mercenaries against them; the publication in January 1776 of Common Sense; and much, much more.

 

All I argued in the essay that the professors criticize is that one of these factors that turned these white restorationists into advocates for independence was the mother country’s cooperation with their slaves. It was not the reason, but it was a reason. And that is important, because it means that African Americans, who of course were excluded from the provincial assemblies and Continental Congress, nonetheless had a figurative seat at the table.

 

Nor was Blacks’ role passive. Congress depicted them as incited to action by the emancipation proclamation issued by Lord Dunmore, the last royal governor of Virginia, and the professors adopt that same formulation. But here again, the timeline is crucial. Whites began recording African American overtures to the British in the fall of 1774. At first British officials turned them away, but they kept coming, right up until Dunmore finally published his emancipation proclamation on November 15, 1775, four score and seven years before Lincoln’s.

 

The professors claim that white colonists were already headed toward independence in fall 1774, when these African American initiatives began. But in this they indulge in counterfactual history—assuming they know what would have happened. It seems clear to me that, even that late, had Parliament chosen to repeal all of its colonial legislation since 1762, it could have kept its American empire intact. What we are looking for are the bells that could not be unrung. Especially in the south, one of the British aggressions that foreclosed the possibility of reconciliation was the governors’ and naval officers’ decision to cooperate with the colonists’ slaves (as well as with Native Americans—the Declaration of Independence’s “merciless Indian Savages”—but that is another story).

 

In Liberty is Sweet, I supply much more evidence for my stadial (stages-based) view of the road to independence. I compare it to a mouse’s escape from a maze, since it was the product not of a grand design but of a series of discrete choices at intersections, from none of which the next was visible. Would that the distinguished professors had waited to judge my reinterpretation by my 700-page book rather than the 700-word Washington Post article I wrote to promote it!

 

The professors may be correct that we would still get independence even if we removed one of its main ingredients, like Dunmore’s Proclamation . . . or the Battle of Lexington and Concord, which I teach as not only the first battle of the revolution but also, for many, especially in New England, the final argument for independence. But I would never take that remote possibility as a reason to write a history of the American Revolution that omitted Lexington and Concord. And by the same token, I hope the professors would never omit the Anglo-African alliance.

 

I agree with the professors that it would be a disservice to pretend that enslaved Americans played a significant role in the origins of the American Revolution if there was no evidence that they did. But the evidence is overwhelming, and I invite you to sample it on Twitter at @woodyholtonusc. If we heed the professors’ call to ignore the influence of the enslaved people of the founding era, we will dishonor not only those heroic Americans but our own search for truth.

The Attica Prison Uprising, 50 Years Ago Today

Good morning, HNN!

I'm pleased to present the first episode of Season 3 of Skipped History, chronicling the Attica Prison uprising of 1971. It’s been 50 years since the stunning rebellion, and still the consequences are unfolding:

You can also watch the full episode on Instagram and a preview on Twitter.

Today’s story comes from Blood in the Water by Heather Ann Thompson. Have you read it? I found it truly jaw-dropping.

I hope you enjoy the video. Questions, comments, and suggestions for further reading are welcome!

Cheers,

Ben

Teaching "All Men are Created Equal" (Part I)


The controversy over Critical Race Theory has animated teachers, school administrators and state legislators — not to mention parents. The former chancellor of the New York City schools, Richard Carranza, went so far as to proclaim that it was the duty of teachers to combat “toxic whiteness” — a disastrous term that was picked up by the New York Post.

 

One of the difficulties in discussing Critical Race Theory is that the term has become entwined with the ideas in Robin DiAngelo’s White Fragility. Endless disclaimers that Critical Race Theory (CRT) is about systemic rather than individual racism seem specious to those who conflate the idea with the so-called “anti-racism training” associated with DiAngelo, and the passive-aggressive personal confrontations offered in her training sessions. Educators and others are afraid of undoing the self-esteem of white students, and this is a legitimate concern. I imagine that many race-training sessions at workplaces are intimidating to adults, but the idea is even more of a danger to classroom teaching. No teacher should enter a classroom and announce that “I will be very cautious about this, but you need to understand that you all as individual white people are perpetuating racism in this country.” You cannot have a real discussion after that, no matter how gently you try to approach the subject. As a long-time teacher of American history, I hope to show that it is possible to discuss racism and the years of protests against it without intimidating students of color or white students.

 

This essay is dedicated to the students and teachers who want to cut through the controversy about teaching race and racism to confront the truths in American history with all its twists and turns, lights and shadows.

 

I was a teacher of American history for more than 30 years at a high school in Brooklyn, at several of the New York City community colleges, and Hunter and City College as an adjunct instructor. I taught abolition, slavery and Civil Rights, which consumed much of my class time from the first day to the last every semester. My students were a glorious mixture of nearly every race and color in New York City.

 

I treated them, whatever their academic level, as intellectuals-in-training by assigning speeches and documents, long and short, for homework, which they had to bring to class the next day. I would not lecture, give them questions to answer, or point to ideas to look for when they read. Instead, they were asked to choose sentences they liked or disliked for whatever reason and to make brief comments explaining their choices.

 

In some classes I had the kids write the first few words of their sentences on the blackboard, and then we would discuss what they had chosen. They read their whole sentence out loud and everyone read silently along with them. These were works by ML King Jr., Frederick Douglass, Madison Grant (one of the founders of scientific racism) and Barack Obama. I also taught a class in which we read only American speeches. My first question, nearly every day we did these longish readings of 10 or 15 pages, was, for example, "What do you think about Frederick Douglass's 'Fourth of July Oration'?" That would lead to a discussion that served as an introduction to the lesson before we turned to their sentences. We would do shorter documents, sometimes in class, one or two times a week, and the longer ones twice in three weeks. In my high school classes I called this the Tarzan Theory of Reading, because we were swinging through the document by grabbing on to sentence after sentence.

 

To teach, you have to "bring the things before the eyes." When we discussed the clause "all men are created equal," the understanding came from the students that it embodies the hopes and dreams of every American and, simultaneously, the nightmares of inequality and violence that people of color have been forced to live with in this country. I did this on the first day of every class, when the students suggested events for a timeline from 1492 to 1865. Each student would write down three events; I would ask them to volunteer one event apiece, write the events on the board, and then we would discuss them as we went along. The Declaration of Independence and its most famous phrase always came up. When it came to teaching the American Revolution, I spent five days discussing the document and its implications for the Revolution and for history up to the present. Whatever you think of the New York Times' 1619 Project, with its attempt to de-emphasize the importance of the Declaration, it is still necessary to understand the most famous phrase in American, if not world, history. Here is a brief description of my lessons on the Declaration, concentrating on the last day in the sequence, when we discussed the meanings of "all men are created equal."

 

The first assignment for the series of Declaration classes was to determine how many parts the document had -- keeping within a limit of five. Then the students were asked to find and underline the references to the Native Americans, the Stamp Act, the Boston Port Act, the Quebec Act and the Massachusetts Government Act. The last three were parts of the Intolerable Acts, which caused the Americans to respond by forming the First Continental Congress in 1774. By narrowing the structure of the Declaration down to three parts, the students could see that in the middle part of the document all the sentences began with He or For. Those were the Grievances.

 

The students pointed out each of the grievances that they had underlined. We concluded that "For taxing us without our consent" was ambiguous: it could refer to the Stamp Act, the Navigation Acts, the Townshend Acts, or the Tea Tax, which was the motivation for the Boston Tea Party in 1773. The Intolerable Acts were more straightforward to identify. All of these details had been covered in class during the weeks prior to our study of the Declaration, which was written in June and July of 1776, more than a year after the first battles of the American Revolution at Lexington and Concord in 1775. Originally celebrated as Patriots' Day in Boston, the anniversary of April 19 marks the fighting that began with the famous "shot heard 'round the world."

 

Then I asked for a volunteer to read the first paragraph of the Declaration itself, which begins with "When in the Course of Human Events...." and ends with "impelled them to the separation" in their version. (The document I used was the version from the Yale Avalon 18th century document site.) When I asked them what they thought of the opening words, the students concluded that it is a theory of history: people make history. This is a description of agency, a key term for historians that describes how all peoples can take control of their fate. In our case, the Americans became revolutionaries by protesting the Stamp Act and the Tea Act, and by forming the First Continental Congress in response to the Intolerable Acts.

 

The first paragraph also discusses the laws of nature and nature's God that entitled the Americans to separate from Great Britain. Now a student reads the next few sentences, which contain the clause "That all men are created equal" and list the unalienable rights to life, liberty and the pursuit of happiness. Thus, "all men are created equal" stands alongside the rights to life and liberty and, of course, gravity, all on the same plane: natural laws, the kind discovered by Sir Isaac Newton to describe how the universe runs. We discussed the most famous clause, "all men are created equal," by itself on the last day of the week.

 

At this point I hand out the part of the Declaration written by Thomas Jefferson that the Second Continental Congress dropped from the final version. It was the section that blamed the king for slavery in the 13 American colonies. Here is the beginning:

 

 He has waged cruel war against human nature itself violating its most sacred rights of life and liberty in the persons of a distant people who never offended him, captivating and carrying them into slavery in another hemisphere, or to incur miserable death in their transportation thither. (T)his piratical warfare, the opprobrium of infidel powers is the warfare of the Christian King of Great Britain.

 

As my students saw immediately, Thomas Jefferson described the slaves as humans with natural rights, and he called slavery "cruel war." Clearly this omitted section was meant to be part of the grievances, because the paragraph begins with "He." Jefferson uses the phrase "piratical warfare," which might be obscure to readers today, but my students knew it referred to man-stealing, an abolitionist term for enslavement. They had read the five-page polemic "African Slavery in America" by Tom Paine for the third day of class. Man-stealing is one of the key phrases in his abolitionist pamphlet, published in 1775 by the Pennsylvania Journal and Weekly Advertiser. Jefferson also refers to the middle passage, which caused the enslaved people to suffer "miserable death" on their journey to North America. He sarcastically called the slave trade the work of the "Christian King of Great Britain," who was practicing the "execrable commerce" of the "infidel powers": the Spanish, Portuguese and Muslims who had preceded the British in the slave trade. Now the hypocrisy of future president Jefferson becomes the topic of discussion, especially since his livelihood depended on the labor of hundreds of enslaved persons on two plantations. He blamed the king for foisting the slaves on the Americans and complained that the king was also

 

exciting those very people to rise in arms among us and to purchase that liberty of which he has deprived them, by murdering the people upon whom he also obtruded them: thus paying off former crimes committed against the liberties of one people with crimes which he urges them to commit against the lives of another.

 

Such blatant hypocrisy was common among those defending the system of slavery, especially in view of Jefferson's words about equality and liberty for whites and blacks in this very document. The audacity of Jefferson to claim that "his" slaves were unfairly treated by the king, and that the king was to blame for his (Jefferson's) own ill-gotten gains, reminds me of a drug dealer who claims it is fine to sell drugs to "get over." Of course he does not take them himself, which would be dangerous to his health. It takes a close reading of the phraseology in the quote above to figure out who were the slaves and who were the Patriots in this convoluted grievance. That is remarkable, because almost all of the Declaration is clearly written: it is a prose poem that drives you on, the same way the Gettysburg Address does. This omitted section is turgid.

 

Finally, we must point out why the Second Continental Congress rejected the grievance on slavery and the king. The Congress had agreed that the Declaration had to be unanimous in order to create a united front against the king and his army. But the slaveholders, led by South Carolina, refused to vote for the Declaration if it included the section criticizing slavery. It was left out of the final version.

 

What makes this intimate bond of Enlightenment idealism and rank racism a grievance is the argument that the king is encouraging the slaves of the Patriots (not the Loyalists, truth be told) to kill the revolutionaries to obtain their freedom as members of the British Army. The slaves of the Loyalists were not offered that opportunity by Lord Dunmore in his recently famous but misunderstood Proclamation of 1775; after all, the Loyalists were supporting the king, so they could keep their slaves. This idea comes out in the final grievance, which says "he has excited domestic (slave) insurrection among us" and which Jefferson couples with a separate grievance condemning the king for encouraging the "merciless Indian savages" to wage war against the Patriots by murdering our people of "all ages, sexes and conditions." Students are not used to reading the word "savages": shocking language for a document about equality. It is an assault on modern sensibilities, but of course it was a common way to refer to the Native Americans.

 

So this class began with a discussion of the causes of the turmoil in the 1760s and '70s as examples of human agency, then moved to a description of unalienable rights based on nature or nature's God, and finally to a justification of the Revolution as part of natural law, with "all men are created equal" counted among those natural laws. But, as became apparent in context, all of this is bound up with the deep hypocrisy of holding humans endowed with natural rights in the bonds of what the slaveholding founding father himself called a "cruel" and "piratical" war: what we call chattel slavery.

 

At this point we have read only about 80 words at the beginning of the final version of the document.  

 

Part II of this essay will appear in the coming weeks on HNN. 

See a Piece of History: Retired FDNY Fireboat John D. McKean

FDNY Fireboat John D. McKean celebrates the 125th anniversary of the Brooklyn Bridge, 2008. 


Editor's Note: The retired FDNY Fireboat John D. McKean was pulled into emergency service on September 11, 2001, supporting firefighting with its still-functional pumps and transporting evacuees from Manhattan. The McKean and its crew are discussed in this week's essay by Jessica DuLong on the 9/11 boatlift. 

 

New Yorkers and visitors to the city will have the opportunity this month to visit the retired FDNY Fireboat John D. McKean at Hudson River Park Friends' Pier 25 in lower Manhattan. The McKean will be open to visitors and available for public and private rides in New York Harbor. 

The McKean was purchased at auction in 2016. In 2018, the Fireboat McKean Preservation Project was formed with the mission of preserving this historic vessel for museum and educational purposes. In 2019, the McKean underwent major repairs to its hull at the North River Shipyard in Upper Nyack, NY, with subsequent restoration of the ship's above-deck areas. Through Hudson River Park Friends, the McKean is able to dock at Pier 25, in proximity to lower Manhattan, for the 20th anniversary of the attack on the World Trade Center on September 11, 2001. 

John D. McKean was a marine engineer aboard the fireboat George B. McClellan when he was fatally burned by a steam explosion in 1953. Despite his injuries, he kept his post to help bring the ship in safely. An already-ordered but yet-to-be-commissioned fireboat was named in his honor the following year. John McKean's spirit was reflected in the vessel's service supporting firefighting and transporting evacuees on September 11. The McKean was also used to fight the 1991 Staten Island Ferry Terminal fire and assisted with the rescue of passengers after the Hudson River landing of US Airways Flight 1549, executed by pilot Chesley "Sully" Sullenberger, in 2009. 

For more information, go to www.fireboatmckean.org.

The Missed Lesson of Vietnam: Plan for Unconditional Victory or Don't Intervene at All

Secretary of State Henry Kissinger, President Richard Nixon, and Maj. Gen. Alexander Haig discuss the situation in Vietnam at Camp David, November 1972.


There have been any number of pundits comparing the fall of Saigon in 1975 to the horrors we are witnessing as the United States withdraws from Afghanistan. Events in Afghanistan are far worse, some argue; at least in Vietnam there was a “decent interval” between the Peace Accords signed in January 1973 and the collapse of South Vietnam in April 1975.

All of this misses the point. Both situations prove the folly of starting a war where there is little likelihood of unconditional surrender. History screams this lesson, and the United States in its delusional belief in its vast military power continues to fall into the trap that somehow this time will be different.

“Vietnamization” did not work any better than attempts by the United States to put the Afghan government and its military on their feet after the American invasion in October 2001. The chaotic end to both long wars was predestined from the first days of the conflicts. Unless an invading nation can impose its terms after an unconditional surrender, as seen at the end of the Second World War with Germany and Japan, the result will be the predictable loss of public support for the war and a withdrawal that leaves the situation in chaos. 

Even the end of the First World War illustrates this problem. After four years of brutal war, the parties stopped fighting under the terms of an armistice brokered in large part by the United States. The war had been a draw—there were no clear winners and no clear losers. But the peace treaty put the blame on Germany, and a weak democratic government, known as the Weimar Republic, was installed. After a brief period of intense disorder and fighting in the streets of Germany, the Nazis rose to power, arguing that the peace treaty was "a stab in the back," and a second, even more catastrophic world war followed after a twenty-year interregnum.

In Vietnam, the key architects of the end of the war, Richard Nixon and Henry Kissinger, knew that the peace they were forcing on South Vietnam was highly unlikely to last and that President Thieu of South Vietnam was doomed by the very terms Kissinger was negotiating in Paris with the North Vietnamese Special Advisor, Le Duc Tho.

Looking back, the terms seemed almost ridiculous: leave the Vietcong in place in the South, remove the Americans, hold supposed free elections to reunite the country, and allow material to continue to pass through the demilitarized zone (DMZ). Worse, the United States had to agree to pay billions of dollars to North Vietnam to help it rebuild after years of war (euphemistically referred to in the peace accords as “healing the wounds of war and postwar reconstruction”).

In one of the more instructive Nixon tapes, Nixon and Kissinger spoke just days into the new year of 1973 in the Oval Office about the inevitability of the fall of South Vietnam. Two options were under consideration: Option One (peace now) and Option Two (continued bombing for return of POWs).

“Look, let’s face it,” Nixon told Kissinger, “I think there is a very good chance that either one could well sink South Vietnam.”

Nixon said he had responsibilities “far beyond” Vietnam alone and that he could not allow “Vietnam to continue to blank out our vision with regard to the rest of the world.” He spoke of his efforts at détente with the Russians and his recent opening of China.

And while both options carried great risks, Nixon knew the American public wanted out, and that even a bad result was better than continuing. He had trouble even verbalizing the concept, given how painful the reality was to him. "In other words," he told Kissinger, "there comes a point when, not defeat—because that isn't really what we are talking about, fortunately, in a sense—but where an end which is not too satisfactory, in fact is unsatisfactory, is a hellava lot better then continuing. That's really what it comes down to."

Kissinger responded that if President Thieu’s government fell, it would be “his own fault.”

There is a unique humiliation brought down on a warring superpower like the United States when enemy combatants know they can outlast public opinion in America. The North Vietnamese correctly read the results of the November 1972 national election in the U.S. Yes, President Nixon won in a landslide; but in Congress, Democrats held onto their majorities in the House and the Senate. In fact, the Democrats gained seats in the Senate—one of them a freshman from Delaware named Joe Biden.

After that election, peace was no longer “at hand,” as Kissinger erroneously predicted. Instead the North Vietnamese dug in. Congress threatened to cut off all aid, military and otherwise, if Nixon did not end the war.

In Paris, Kissinger was rudely greeted with a request that the United States set a unilateral deadline for withdrawal. “North Vietnam’s sole reciprocal duty,” Kissinger sarcastically wrote, “would be not to shoot at our men as they boarded their ships and aircraft to depart.”

It took a massive bombing campaign in December 1972, the infamous “Christmas Bombings,” to finally exert enough pressure to obtain a peace agreement that was really nothing more than a veiled surrender.

We do not know what the future holds for Afghanistan. Predictions range from total disaster to perhaps, in time, normalization. Neither scenario is likely. Vietnam teaches us something. After 50,000 American dead and millions killed across the region, nearly fifty years later, Vietnam is on no one’s “axis of evil” list. Americans vacation there; trade is brisk between the two nations.

Korea, the one outlier to the rule that unconditional surrender is the only sure way to predict an outcome in war, actually proves the point, given the incredible price the United States continues to pay to maintain a military presence there. All the United States and the world have to show for this continuing commitment is a virtual madman in charge of North Korea, who justifies an aggressive nuclear arms program by invoking the bogeyman of a continuing threat of war following an armistice signed nearly seventy years ago.

The lessons of the First World War, Vietnam and Afghanistan are plain: don’t overestimate American military power and don’t expect clean exits where it was predictable from the start that a long-term war was not winnable.

The Roundup Top Ten for September 10, 2021

Americans Sought Safer Abortions in Mexico Before Roe, Too

by Lina-Maria Murillo

"No matter what antiabortion crusaders try, pregnant people will always find ways to have abortions — and networks that go beyond borders have long helped them navigate treatment options."

 

50 Years Since Attica, Will America Observe the Human Rights of Prisoners?

by Heather Ann Thompson

"The Attica prison uprising was historic because these men spoke directly to the public, and by doing so, they powerfully underscored to the nation that serving time did not make someone less of a human being." 

 

 

Stop Using Islam to Critique the Texas Abortion Ban

by Sajida Jalalzai

Using comparisons to the Taliban or other Islamic radicals to attack anti-choice laws obscures the deep roots of misogyny in white Christian America. 

 

 

What Is Owed: The Limits of Darity and Mullen's Case For Reparations

by William P. Jones

A historian argues that a recent and influential book calling for reparations could strengthen its case by considering the arguments made by historians about the connections of American slavery to other manifestations of racism. What's needed is to link reparations to a global overturning of racial inequality.

 

 

America Is Giving the World a Disturbing New Kind of War

by Samuel Moyn

The adoption of rhetoric of "humane war" after Vietnam has allowed discussions of how to wage war to sideline discussions of whether to wage war at all, and encourages secrecy, surveillance, and long-term engagement. 

 

 

SCOTUS Ended the Eviction Ban, but not the Fight Against Eviction

by Maia Silber

Philadelphia's housing crisis during the First World War shows that worker and citizen activism is essential to compel governments to act to secure adequate affordable housing. 

 

 

The Discursive Power of the Pittsburgh Courier and the Black Press

by Adam Lee Cilli

The influential Black newspaper's publisher Robert L. Vann has been criticized as a self-promoting tribune of the Black bourgeoisie. A historian argues he should be reconsidered as a pragmatist building alliances in a time of upheaval for Black America. 

 

 

Too Often, Politicians Pick Their Voters

by Warren E. Milteer Jr.

Political factions and then organized parties have fought over the size, composition and geographical ordering of the electorate since the founding. This legacy today undermines the legitimacy of government and the political will to protect the right to vote. 

 

 

Giving the Women of the Divine Comedy their Due

by Laura Ingallinella

One scholar's project is using Wikipedia and her students to recover the historical personhood of Dante's women and elevate them above literary symbols or caricatures. 

 

 

There's More War in the Classroom Than You Think

by William Hitchcock and Meghan Herwig

Whatever the causes of the decline in history enrollments, it's not because history departments have rejected the study of war and military history. 

Were the 9/11 Attacks Preventable?

It has now been twenty years since the terrorist attacks of September 11, 2001, plunged the nation into shock, consternation, grief, and fear. Amid the despair over the loss of nearly three thousand lives and the anxieties about further strikes, many questions arose over how such a devastating blow on American soil could have happened. The most important of them was also the most elusive: were the attacks preventable? After two decades of investigation, the answer remains an equivocal “perhaps.”

 

Presidents Bill Clinton and George W. Bush were well aware that the Islamist militant Osama bin Laden and his Al Qaeda network posed a serious threat to American interests and lives. Clinton compared him to the wealthy, ruthless villains in James Bond movies. To combat the dangers that Al Qaeda created, he and his advisers considered a wide range of military and diplomatic options that ranged from kidnapping bin Laden to U.S. military intervention in Afghanistan. But the use of cruise missiles against Al Qaeda camps in Afghanistan in 1998 produced acutely disappointing results. Other military alternatives seemed too risky or too likely to fail and diplomatic initiatives proved fruitless.

 

During the transition after the 2000 presidential election, Clinton and other national security officials delivered stark warnings to the incoming Bush administration that bin Laden and his network were a “tremendous threat.” The immediacy of the problem was heightened by Al Qaeda’s bombing of the destroyer USS Cole in the harbor of Aden, Yemen in October 2000, which caused massive damage to the ship and claimed the lives of 17 crew members. Clinton and his advisers strongly recommended prompt consideration of the options they had weighed.

 

Bush and high-level national security officials were not greatly impressed. They regarded terrorism as an important but not top-priority problem. The president later revealed that he did not feel a “sense of urgency” about bin Laden and that his “blood was not . . . boiling.”

 

The Bush administration viewed Clinton’s campaign against Al Qaeda as weak and ineffective, and it was dismissive of the advice it received. Rather than drawing on the experiences of its predecessor, it embarked on the preparation of a “more comprehensive approach” that National Security Adviser Condoleezza Rice believed would be more successful. During the spring and summer of 2001, it worked at an unhurried pace, even in the face of dire warnings from the U.S. intelligence community that Al Qaeda was planning attacks that could be “spectacular” and “inflict mass casualties,” perhaps in the continental United States.

 

Eight months after he took office, Bush’s White House completed its comprehensive plan to combat Al Qaeda. The steps it included in the form of a National Security Presidential Directive (NSPD) were strikingly similar to the options the administration had inherited from Clinton. The final draft of the NSPD called for greater assistance to anti-Taliban groups in Afghanistan, diplomatic pressure on the Taliban to stop providing bin Laden safe haven, enhanced covert activities in Afghanistan, budget increases for counterterrorism, and, as a last resort, direct military intervention by the United States. This proposal was little different in its essentials from what the Clinton administration had outlined, and it offered no novel suggestions on how to carry out its objectives more successfully. Deputy Secretary of State Richard Armitage later commented that there was “stunning continuity” in the approaches of the two administrations.

 

The NSPD landed on Bush’s desk for signature on September 10, 2001.

 

The troubling question that arises is: could the calamities that occurred the following day have been prevented if the NSPD had been approved and issued earlier? There is no way of answering this question definitively; it is unavoidably counterfactual. Yet it needs to be considered. The 9/11 plot was not so foolproof that it could not have been foiled by greater anticipation and modest defensive measures.

 

The threat that Al Qaeda presented was well known in general terms within the national security apparatus of the federal government, even if specific information about possible attacks was missing. But responsible officials and agencies did not do enough to confront the problem. A presidential statement like the NSPD of September 10, if distributed sooner, could have called attention to the dangers of potential terrorists present in the United States. The CIA and the FBI failed to track the whereabouts or investigate the activities of two known Al Qaeda operatives who lived openly in California for about 20 months, took flying lessons, and participated in the hijackings on 9/11.

 

On July 5, 2001, high-level officials from seven agencies received a briefing from the National Security Council’s National Coordinator for Counterterrorism, Richard A. Clarke. He cited the dangers that Al Qaeda presented and the possibility that it “might try to hit us at home.” The agencies responsible for homeland security did not react in meaningful ways to the warning, largely because a terrorist strike seemed far less likely in the territorial United States than abroad. Perhaps an earlier NSPD, armed with the weight of presidential authority, would have sharpened the focus on the risks of a terrorist plot within America and galvanized security officials and agencies into effective action. Perhaps, for example, the Federal Aviation Administration would have tightened airline boarding procedures or made terrorists’ access to cockpits more difficult. The FBI instructed its field offices to make certain they were ready to collect evidence in the event of a terrorist assault, but it did not order them to take any special steps to prevent an attack from occurring.

 

Even if the “what-if” queries surrounding the failures that allowed 9/11 to happen cannot be answered, we can agree with Condoleezza Rice’s heartfelt admission in her memoirs: “I did everything I could. I was convinced of that intellectually. But, given the severity of what occurred, I clearly hadn’t done enough.” Earlier adoption of the NSPD might not have made a difference. But the haunting thought remains that it might have spared America the agony of 9/11.

20 Years of Flawed Assumptions Led to Failure in Afghanistan

Mural at Bagram Air Base. Photo by author.

“What we are seeing now is truly shocking and shows we missed something fundamental and systemic in our intel, military and diplomatic service over the decades — deeper than a single (horrible) decision. Something at the very core that unraveled 20 years in only days.” In the emotional week following Kabul’s fall to the Taliban, these were the words of a close colleague who spent years advising senior U.S. military and the Afghan government. Countless explanations emerged for the well-trained and well-equipped Afghan Army’s loss to a barbaric insurgent militia, citing an antagonistically factional, corrupt, and illiterate army plagued by poor morale, lacking any incentive to keep fighting, and long over-reliant on U.S. close-air support. Poor governance by the power-hungry Afghan elites in Kabul, the same ones who consistently ignored military and security reforms, was freshly scrutinized. Finger-pointing in the District abounded, identifying intelligence failures, the lack of a conditions-based withdrawal or a consistent strategy, and a military culture unable to admit failure.

After the U.S. dedicated two decades and trillions of dollars to defeating the Taliban, one must ask: why were the best efforts and superior military might of the world’s superpower insufficient? During my time as the Political-Military Advisor to U.S. commanders in Eastern Afghanistan, I witnessed the Taliban’s ability to swiftly defeat the Afghan Army in the provincial capital of Kunduz in 2015. Even then, the problems plaguing the Afghan Army, such as high AWOL numbers and “ghost soldiers,” and the Taliban’s capabilities were evident. But there is much, much more to it than simple metrics.

Recent justifications and excuses fail to consider the central flawed assumption underpinning U.S. efforts from day one: that the majority of Afghans were as opposed to Taliban governance as the Coalition was. But local anti-Taliban uprisings are no more indicative of an entire nation’s political leanings than a mob storming the U.S. Capitol. We looked through our Western lens, anticipating the population’s embrace of a new government and believing we would liberate a nation from its fundamentalist oppressors.

The West’s perceptions of the Taliban’s human rights atrocities against the Afghan population are substantially graver than those of leading Afghan elders—specifically the elders who negotiated Jalalabad’s handover, which took place hours before Kabul’s fall and met with minimal resistance. How could this happen? Jalalabad is the fifth-largest city in the country and home of the 201st Corps, a top-performing Afghan unit, as well as of the Ktah Khas—one of the most elite special forces divisions in the country, consisting of army, police, and intelligence agency units. Just as in 1996, when the Taliban were welcomed as fellow Pashtuns filling a void and quelling the warring warlord factions left behind by the Soviet departure years earlier, many cities now quickly surrendered to them. This is largely because the same deeply rooted, patriarchal, conservative cultural and social conditions have persisted there over the last quarter of a century.

Ridding Afghanistan of the Taliban is akin to eradicating components of a 1,700-year-old “code of life.” For decades, the militant group has been intricately woven into the fibers of society – including by creating shadow governments where the state structure collapsed and by facilitating transport for the country’s booming drug trade, presided over by provincial leaders and esteemed village elders. For many Afghans, with roots in a culture drastically different from ours, Taliban governance was simply not as barbaric as it looked through our Western lens. Or at least not worth sacrificing their lives to prevent.

Long before the Taliban emerged, there were tribal policies of gender segregation (purdah), represented by burkah-clad women. These policies, considered draconian by the West, dominated the countryside of this patriarchal society. Confined to caring for families and working the fields, women gave little thought to pursuing parliamentary office or higher education, where interaction with men might occur. The burkah provides a gender barrier, and purdah safeguards honor while ensuring women remain as intended: protected and invisible. While severity varied by region and socio-economic status, the Taliban’s barring of women from schools was not new but an imposition of conservative rural village norms onto city women in Kabul or Kandahar. This did not suddenly change with the Taliban’s fall in 2001; Afghanistan does not now have an entire generation of educated and liberated women. In reality, only 29% of girls age 15 and older are literate today (compared to 55% of males). Honor crimes likewise continued throughout the country over the last two decades; as recently as May 2020, 18-year-old Nazela was murdered after running away with her boyfriend in Badakhshan. As extreme as the militant group’s gender ideology is, it is not solely responsible for the human rights abuses and oppression women have suffered. Accordingly, social repression could never serve as the basis for widespread opposition to Taliban rule.

Like the Taliban, 40% of Afghans are Pashtun, a fiercely independent people long ruled by Pashtunwali. This ancient tribal code and way of life is similar to the strict interpretation of Islamic law the insurgency promulgates. That ideology ensured centuries of tribal survival, and its customary, or tribal, law dominates the country’s informal justice systems. The Taliban have historically represented a unique blend of Deobandi Islam, Saudi Wahhabism, and tribal Pashtun beliefs and values. The punishments they administer have been present for centuries and, at times under Shari’a, were less brutal than tribal justice’s stonings or honor killings. By incorporating their extremist interpretation of Shari’a alongside the tribal law policies the rural population was already accustomed to, the Taliban minimized opposition throughout much of the country.

Following the 2021 Kabul takeover, President Biden accurately stated, “You can’t give them the will to fight.” Half a world away, in remote Mexican highland towns like Pantelho in the state of Chiapas, indigenous vigilantes armed with makeshift weapons have succeeded in driving out the powerful drug cartels terrorizing their communities. Lacking effective state security forces, advanced weapons, or training, they are armed only with the will to free their communities from militant oppression. Yet the Afghan Army, trained and armed with over 22,000 Humvees, 51,000 tactical vehicles, 600,000 key weapons, and 200 rotary- and fixed-wing aircraft, opted to hand over the country. It is clear that for many in Afghanistan, the Taliban threat was not dire enough to warrant defiance or resistance.

This is not to deny that many Afghans did sacrifice their lives in battle over the years, including those desperately rushing to the Kabul airport to escape feared political retribution. A brilliant commander I served under recently wrote publicly, “I had served with some true Afghan heroes and had too many episodes of Afghan leaders and people who actually were genuine, who didn't want a return of the Taliban. They wanted prosperity for their family and were humble. They were patriots in their own way. I now know and accept that these honorable, noble Afghans were actually very unrepresentative.” Whether these individuals were inspired to fight for their country, their families, or a better life remains unknown. And pockets of historical resistance remain, such as the Panjshir Valley, where the son of famed Northern Alliance fighter Ahmad Shah Massoud continues his father’s legacy against the Taliban. These, unfortunately, are not the majority of a 332,000-strong military that was provided with the capability and capacity to win.

We wrongly assumed that the majority of the Afghan Army, representative of the Afghan people, was as opposed to the Taliban as we in the West were, and especially that a predominantly male army would suddenly fight for freedoms for Afghan women, freedoms many never possessed. We assumed they would wage war rather than allow rule by an extremist interpretation of a religion they already practice. We assumed their training and capabilities equaled motivation. But no amount of intelligence assets, military strategies, nation-building efforts, or financial assistance could force the Afghans to fight for a Western version of peace and prosperity. This is not a U.S. loss, but an Afghan one.

The Generosity of James Loewen

This real estate advertisement, touting "No Malaria, No Mosquitoes, No Negroes," was one of many James Loewen discovered in Arkansas while writing an entry on "Sundown Towns" for the Online Encyclopedia of Arkansas, an entry that still informs Arkansans about their state's history.

One of the things I quickly learned working as an editor at the online Encyclopedia of Arkansas, a project of the Central Arkansas Library System, is that some scholars simply do not consider writing encyclopedia entries a valuable use of their time. Writing encyclopedia entries either does not count sufficiently toward tenure or simply does not pay enough. Granted, there are plenty of exceptions, and numerous examples of generosity out there, but to edit an encyclopedia is to face regular rejection by many of the leaders in their particular fields. I learned that as a graduate student serving as an editorial assistant on Dr. William M. Clements’s The Greenwood Encyclopedia of World Folklore and Folklife, and the experience carried over into this new position.

So it was a surprise to find, one day in early 2006, a message from the nationally renowned Dr. James W. Loewen in my inbox, and an even greater surprise to see that he was offering to write an entry on sundown towns for the Encyclopedia of Arkansas, which had not yet even gone live. Of course, we eagerly agreed, and we made sure his entry was ready to go by the time we formally launched the site on May 2 of that year. Later that summer, Dr. Loewen actually came to our state to speak at Arkansas Governor’s School, and while he was in the area, he dropped by our library to chat with us and see if our archives contained anything relevant to sundown towns. Sure, he could be a bit brusque, but he was deadly earnest in his desire to find and expose these manifestations of American racism that had, for so long, gone unnoticed and unexplored by most scholars.

As it happened, I had taken a leave of absence from my Ph.D. program in 2005 to assume my job at the Encyclopedia of Arkansas, and I would soon be returning to my classwork and contemplating a dissertation topic. Talking with Jim, as I would come to know him, got the gears turning in my head, and I emailed him a few days later to see if he thought there was enough material left uncovered to warrant additional research on sundown towns in Arkansas. “Of course!” he responded. “I wish you had talked to me about this while I was there!”

Over the coming years, we would regularly correspond as I pursued this subject, and he even gave me his home phone number in case I needed it. The night before I defended my dissertation in 2010, we had a long conversation that was equal parts encouragement and equal parts him putting me through my paces. A few years later, I turned my dissertation into the book Racial Cleansing in Arkansas, 1883–1924: Politics, Land, Labor, and Criminality.

After that, I must admit, I lost touch with Jim for a while as I pursued research into other aspects of Arkansas history. But he got back in touch after I started writing articles for the History News Network (he particularly liked my piece “The Garbage Troop”). We would catch each other up on our latest projects, and he was always complimentary toward our efforts in Arkansas to account for this state’s history of racial cleansings and lynchings.

I am reminded of Jim’s generosity every time I look at the analytics for the Encyclopedia of Arkansas, for his entry tops our most visited webpages by a large margin. Just yesterday, I took a glance at our real-time numbers to find that, of the ninety-or-so people on the Encyclopedia of Arkansas at that moment, fully one-third of them were reading about sundown towns. I think it would have pleased him mightily to know that.

Rest in Peace, Jim.

Nationalist Nostalgia in Russian Film Mirrors Putin's Political Bending of History

Russian director Nikita Mikhalkov with Vladimir Putin on the set of "Burnt By the Sun 2," 2008.

A recent viewing of Nikita Mikhalkov’s Sunstroke (2014) on Amazon Prime got me reflecting on this famous contemporary Russian director as well as Ivan Bunin (1870-1953), the author of the short story of the same title which inspired the film.

 

While the director is a great admirer of Russian President Vladimir Putin (a former Communist KGB officer), the earlier writer wrote Cursed Days, a scathing criticism of the Russian Communists during their revolution and civil war, and emigrated from Russia in January 1920. But what Mikhalkov and Bunin before him both possess (and Putin also shares) is a peculiar nostalgia for certain aspects of old tsarist Russia.

 

All three men are major figures in the last century and a quarter of Russian history. By 1900 Bunin had already written his first stories and poetry and was a friend of some of the most famous writers of his day, for example, Tolstoy, Chekhov, and Gorky. In 1933, he became the first Russian writer (although now an emigré living in France) to receive the Nobel Prize for Literature--first awarded in 1901. Primarily known for such short stories as “The Gentleman from San Francisco” and “Sunstroke,” Bunin also wrote novels (e.g., The Village) and poetry, as well as non-fiction like “About Chekhov.” Mikhalkov is the “most famous living Russian director.” His Burnt by the Sun won a U.S. Academy Award (1995) for Best Foreign Language Film, and he has directed more than 20 films. In addition, he has been active in various other realms--acting (about two dozen films), producing, heading the TriTe film studio and the Russian Filmmakers’ Union, and occasionally speaking out for what he likes to think of as an enlightened Russian conservatism. Putin, of course, has been Russia’s chief political leader since the resignation of Boris Yeltsin at the end of 1999.

 

Regarding Bunin’s nostalgia, an Italian scholar, D. Possamai, has written, “there is no doubt that the sense of melancholic nostalgia for a Russia that no longer exists is one of the key stylistic features in Bunin’s works.” The scholar writes that this is primarily true in his short fiction and cites his “Antonov Apples” (aka “The Scent of Apples”), which first appeared in 1900, as a prime example. Other scholars such as V. Terras (Bunin “is a nostalgic writer”) and M. Slonim (the emigre Bunin was a “seeker after things past and gone”) also highlight his nostalgia.

 

We can also sense it in “Sunstroke,” the 1925 story upon which the Mikhalkov film is based. A lieutenant stands on a summer night at the railing of a Volga River paddlewheel steamship; he suggests to a small, attractive woman he has met just hours before that they get off at the next stop. They do so and take a carriage, through a small provincial town, to an inn, where they make love in a large room with white curtains, still “smoldering from the day’s sun.” Their first kiss was so delirious “that for many years they remembered this moment, and neither one experienced anything like it for the rest of their lives.” But the next morning she insists on resuming her river journey while he stays behind. He agrees, but soon realizes “he would never see her again, this thought overwhelmed and astonished him. No, it can’t be! It was too bizarre, unnatural, unbelievable! And he felt such pain and such a sense of how useless the rest of his life would be without her, that he was gripped by horror, despair.” In the evening, he boards another Volga ship. Bunin’s final sentence reads, “The lieutenant sat on the deck under the awning feeling like he’d aged ten years.”

 

In his film, Mikhalkov interweaves this story, displayed in appealing colors, into another, presented in darker, somber tones, about hundreds of White Army officers who have surrendered to Red (communist) forces near the end of the Russian Civil War in 1920. The lieutenant of Bunin’s story is now one of these White officers held at an internment camp near the Black Sea, and his Volga adventure is displayed in flashbacks. Mikhalkov provides both sympathetic and unsympathetic characters on both sides of the conflict, and suggests what personal defects led the lieutenant and other Whites to lose the war. But his main conclusion seems to be that the civil war was a Russian tragedy. At the end of the film, the following words are flashed on the screen: “From 1918 to 1922, only in the South and the Crimea, Russia lost more than 8 million of its people.”

 

Regarding Mikhalkov’s nostalgia, Birgit Beumers has written in her Nikita Mikhalkov: Between Nostalgia and Nationalism (2005), “The overall argument of this volume is that Mikhalkov performs a shift from a nostalgia for a past that is openly constructed as a myth to a nostalgia for a past that pretends to be authentic. I contend that this move is a result of the collapse of the Soviet value system . . . and the director’s inability to face up to the reality of the 1990s, when he turned both past and present into a myth that he himself mistook for real and authentic.”

 

Beumers chronicles Mikhalkov’s development from his first full-length film, At Home Among Strangers (1974), up until Dark Eyes (1987), starring the Italian actor Marcello Mastroianni.  She believes the first typifies his early approach to basing “his work not on the historical facts but on a myth.” But the second moves “away from the reflective or ironic nostalgia of his earlier films towards a restorative nostalgia that tends towards nationalistic revival.”

 

Mikhalkov’s most acclaimed film, at least outside Russia, was his award-winning Burnt by the Sun (1994). Like his later Sunstroke, it is complex and reflects positive and negative aspects of both tsarist and Soviet Russia. Unlike Sunstroke, he is also one of the chief actors in this earlier work, portraying the communist officer Kotkov. Beumers concludes that Kotkov does not destroy “the beauty of Russia” inherited from the tsarist period, and quotes Mikhalkov as saying, “‘Yes, Bolshevism has not brought happiness to our country. But is it morally correct on the basis of this indisputable fact to put under doubt the life of entire generations only on the grounds that people happened to be born not in the best of times?”

 

Beumers adds that Mikhalkov “longs for a past when it was possible to believe in ideals,” but that he realizes that on the day portrayed in his film at the beginning of the Great Terror of 1936-38, “they are about to disappear. . . . By casting himself as a kind Bolshevik commander, who believes in the ideals of the Revolution and, furthermore, is a perfect father to his child, he offers himself in the role of a leader of the people, but one who would return to the roots of socialism . . . . This confusion of fiction and reality leads to the portrayal of a political Utopia, which Mikhalkov would gradually mistake for an authentic ideal.”

 

Mikhalkov’s next important film was The Barber of Siberia (1998), which was a box-office hit in Russia. It appeared the same year that the remains of the last tsar, Nicholas II, and his family were reburied in St. Petersburg, and it is nostalgic toward the reign of Nicholas’s father, the reactionary tsar Alexander III (r. 1881-1894), who is played by none other than Mikhalkov himself. Through the character of an American woman, Jane Callaghan (English actress Julia Ormond), the moral bankruptcy of the West is symbolized and contrasted with the moral superiority of Russians like Alexander III.

 

Although Mikhalkov directed other films between The Barber of Siberia and his 2014 Sunstroke, including 12, a remake of 12 Angry Men (1957), a more significant work of his (at least for this essay) was his 2010 political manifesto, a 63-page document reflecting his nostalgic nationalism and conservatism. It praises “law and order,” state power and loyalty to it, and the Russian Orthodox Church. It also considers Russia-Eurasia “the geopolitical and sacred center of the world.” Although it mentions the Stalinist terror, Mikhalkov considers the overall achievements of the Soviet era more important.

 

Mikhalkov’s views are very close to Putin’s (for the similarities see my two 2015 essays, “Vladimir Putin: History Man?” and “Is Vladimir Putin an Ideologue, Idealist, or Opportunist?”). Since 2015, two further indications of Putin’s nostalgic nationalism have been his comments at the 2017 unveiling of a Crimean monument to Alexander III--the same tsar Mikhalkov played in The Barber of Siberia--and his long 2021 essay “On the Historical Unity of Russians and Ukrainians.”

 

Speaking on Alexander III in Crimea (annexed by Russia in 2014 after earlier being part of Ukraine), Putin said:

 

We are unveiling a monument to Alexander III, an outstanding statesman and patriot, a man of stamina, courage and unwavering will.

He always felt a tremendous personal responsibility for the country’s destiny: he fought for Russia in battlefields, and after he became the ruler, he did everything possible for the progress and strengthening of the nation, to protect it from turmoil, internal and external threats. . . .

The reign of Alexander III was called the age of national revival, a true uplift of Russian art, painting, literature, music, education and science, the time of returning to our roots and historical heritage. . . .

He believed that a strong, sovereign and independent state should rely not only on its economic and military power but also on traditions; that it is crucial for a great nation to preserve its identity whereas any movement forward is impossible without respect for one’s own history, culture and spiritual values.

 

In his essay “On the Historical Unity of Russians and Ukrainians,” Putin stressed all in the past that had united the two peoples, and downplayed all that divided them (e.g., “Ukraine's ruling circles decided to justify their country's independence through the denial of its past. . . . They began to mythologize and rewrite history, edit out everything that united us, and refer to the period when Ukraine was part of the Russian Empire and the Soviet Union as an occupation. The common tragedy of collectivization and famine of the early 1930s was portrayed as the genocide of the Ukrainian people.”)

 

In her The Future of Nostalgia (2001), Svetlana Boym wrote that “the nostalgic desires to obliterate history and turn it into private or collective mythology.” In the case of Mikhalkov and Putin (less so with Bunin) their type of nostalgia does not so much “obliterate history,” but attempts to mythologize it to justify present political goals.

 

In doing so they are far from unique among modern cultural and political leaders. In his perceptive portrait of Ronald Reagan, Reagan's America: Innocents at Home (1998), Garry Wills described the former actor who became president as the “sincerest claimant to a heritage that never existed, a perfect blend of an authentic America he grew up in and of that America’s own fables about its past.” Similarly, Donald Trump’s slogan “Make America Great Again” appealed to a nostalgia grounded more in myth than in historical reality. Beyond Russia and the USA, as a 2019 Foreign Policy essay noted, “The problem is that the world is now primarily dealing with a toxic restorative nostalgia. . . . In many ways, nostalgic nationalism is the political malaise of our time.”

Three Badasses of the Revolutionary Era That Your Textbooks Never Told You About

The sculpture "Point of View" imagines a meeting between Guyasuta (Seneca) and George Washington which shaped the future of the Ohio Valley. Pittsburgh.

Image Lee Paxton CC BY-SA 4.0

Now that the kids are back in school and many will soon study the American Revolution, University of South Carolina historian Woody Holton, author of Liberty is Sweet: The Hidden History of the American Revolution, due out in October, wants to introduce you to three badass American Revolutionaries your textbooks never told you about.

Joseph Harris

 

Not all enslaved Americans of the colonial era grew rice and tobacco. All along the East Coast, Blacks were forced to serve as river pilots, entrusted with the safety of ship, cargo, and crew. During the American Revolution, numerous African American pilots used their skills to win their freedom. In July 1775, one of them, Joseph Harris of Hampton, Virginia, escaped and offered his services to Mathew Squire, captain of HMS Otter, who desperately needed help navigating Chesapeake Bay.

 

On September 2, a hurricane convulsed the Atlantic coast, driving Squire’s and Harris’s ship aground near Hampton. Harris borrowed a canoe from a slave and paddled his captain across the mile-wide mouth of the James River to the safety of the British fleet, anchored near the modern Norfolk Naval Shipyard.

 

The Patriots who controlled Hampton seized Squire’s grounded vessel. When he demanded it back, they insisted that he first return the property he had stolen: his “Ethiopian director,” Joseph Harris. He refused, and on October 27, a squadron of small Royal Navy craft attacked Hampton in what would become the first Revolutionary War battle fought south of New England.

 

The captain of Joseph Harris’s boat drove it too close to Hampton’s defenders, who captured it and most of its crew—but not Harris or his captain, who swam to another of the attacking vessels. That made twice in two months that Harris had escorted his captain to safety. Now that he and other fugitive slaves, including those serving in an “Ethiopian Regiment,” had proved their value, Lord Dunmore, the last royal governor of Virginia, made the Anglo-African alliance official. On November 15, 1775, he issued an emancipation proclamation similar to the one Lincoln would publish four score and seven years later. In it, he promised freedom to any Patriot’s slave “able and willing” to bear arms for his king. Thousands heeded the call.

 

Dunmore’s offer to African Americans infuriated whites, especially in the South. They blamed the governor, not the slaves, and the Anglo-African alliance became the single most important factor driving white southerners from merely seeking autonomy within the British empire to demanding total separation from the nation that had, in the words of the Declaration of Independence, “excited domestic insurrections amongst us.”

 

About the time Congress adopted the Declaration, Joseph Harris died aboard a British warship in Chesapeake Bay, apparently of disease.

 

Guyasuta

 

In 1753, when George Washington first crossed the Appalachian Mountains to the region around modern Pittsburgh, it was to deliver an eviction notice. A French army had occupied what is now western Pennsylvania, and Washington’s British employers wanted them gone.

 

Accompanying Washington on the last leg of his western trek was a Seneca warrior named Guyasuta. His job was to hunt game for the British, and his and Washington’s paths would cross again.

 

In the French and Indian War, which started the very next year, the Senecas sided with France. In July 1755, when a nominally French army consisting mostly of Native Americans decimated Gen. Edward Braddock’s expeditionary force eight miles east of the future site of Pittsburgh, two of the survivors, on opposite sides, were Guyasuta and Washington.

 

Guyasuta also joined in Pontiac’s Uprising (1763-1764), a Native American revolt against the British. But he played an even larger role in negotiating a settlement that was favorable to the natives. In a May 1765 conference, he observed that the British had only treated indigenous Americans fairly when they had to compete with the French for their support. “As soon as you conquered the French,” he reminded a British Indian agent, “you did not care how you treated us, as you then did not think us worth your notice.”

 

The removal of the French threat enabled the British to crack down on their own American colonists as well, and in the ensuing Revolutionary War, the Senecas fought on their side. Indeed, in 1782, when Parliament decided to make peace with the former colonists and asked the Senecas to do the same, Guyasuta held out, leading a mixed band of native warriors and white Loyalists in the last major incursion into Pennsylvania—at Hanna’s Town on July 13, 1782.

 

Esther DeBerdt Reed

 

You know Abigail Adams and Betsy Ross, but what about the Philadelphian who founded America’s first national organization of women, loosened the purse-strings of Lafayette, stood her ground against George Washington, and got posthumously plagiarized by Thomas Jefferson?

 

Esther DeBerdt Reed was born in Britain and came to America only in 1770 with her new husband, Pennsylvania’s Joseph Reed, whom she had met during his years studying law in London. Five years later, Joseph became an aide to George Washington but then forfeited his confidence by criticizing him in a letter to another Continental Army officer that fell into Washington’s hands.

 

By early 1780, Joseph was president of the Pennsylvania executive council, the highest position in the state. But he was still trying to angle his way back into his former commander’s good graces. Esther proposed to advance that cause with a grand gesture on behalf of the beleaguered, even mutinous, Continental soldiers. The women of Philadelphia would go door-to-door collecting funds from other women to be disbursed to the troops. They soon found hundreds of donors, including the fabulously wealthy Lafayette, who gave a thousand guineas on behalf of his wife back in France.

 

Washington welcomed the women’s campaign, but he also suggested changes. The funds should be deposited in a Philadelphia bank recently founded by Joseph’s political rivals; it urgently needed support. And when the women were ready to make their gift, they should not just hand the money to the soldiers, who might use it to get drunk. Instead, the women should buy cloth and sew shirts for the troops, most of whose clothes were in tatters.

 

Esther boldly informed the commander-in-chief that she did not like either of his changes. Since the Philadelphia bank was new, its banknotes would be worth less than the money she and the other women deposited. Moreover, merely fitting the soldiers out with shirts, which the army owed them as their employer, would defeat the women’s whole purpose of giving each man an “extraordinary gift.”

 

Washington gave in to Esther’s objection to using her husband’s enemies’ bank. But he was clearly rattled at her refusal to give the soldiers shirts instead of cash. He once more insisted on that change.

 

Part of Esther’s purpose in undertaking the campaign had been to improve her husband’s relationship with the commanding general, and both Reeds realized that it now threatened to do just the opposite. Esther gave in, and by the end of 1780, she and the other women had sewn 2,000 shirts.

 

Esther Reed did not live to witness this accomplishment. She died on September 18 at the age of thirty-three, apparently of dysentery. In a rare eighteenth-century obituary for a woman that actually mentioned her accomplishments, the Pennsylvania Gazette, Benjamin Franklin’s old newspaper, described the women’s campaign in loving detail and speculated that Reed may have damaged her health by “imposing on herself too great a part of the task.”

 

At the start of the campaign, Reed had written a broadside (single-sheet document) justifying the women’s extraordinary activism. She pointed to the examples of European queens who had extended “the empire of liberty”—a phrase no one had previously used except as a synonym for Heaven. Reed and the other women later spread their movement into other states by sending her broadside to the governors’ wives, including Martha Jefferson, partner of Thomas. Perhaps it is a coincidence that Governor Jefferson, who also received the Philadelphia broadside from another source, wrote a letter later that year in which he is universally credited with coining a description of the United States that is still frequently quoted today: “empire of liberty.”

Lafayette as "The Nation’s Guest" (1824-1825)

From Hero of Two Worlds: The Marquis de Lafayette in the Age of Revolution by Mike Duncan, copyright © 2021. Reprinted by permission of PublicAffairs, an imprint of Hachette Book Group, Inc.

 

As soon as he received the letter from President Monroe, Lafayette began arranging his return to America.

 

In July 1824, Lafayette, his son Georges, and his secretary Levasseur traveled to Le Havre, France, to meet their waiting ship. In France, local leaders couldn’t wait for Lafayette to leave. In America, they couldn’t wait for him to arrive.

 

Lafayette and the party arrived in New York on August 15, 1824. Newspapers publicized the imminent arrival of the Hero of Two Worlds, so when he reached Manhattan, boats of every shape and size packed the harbor. When he disembarked, an honor guard of aging veterans of the American War of Independence saluted the last surviving major general of the Continental Army. Lafayette had not set foot on American soil for forty years and already he could tell he was going to enjoy himself. It was nice to be loved again.

 

Here he was a living legend—a pristine icon of the most glorious days of the Revolution. He found himself as celebrated in Philadelphia as in New Orleans; in Vermont as much as in South Carolina; in rural hamlets as well as in big cities. Lafayette belonged to everyone, and wherever he went he was described as the Nation’s Guest. Whether intended or not, his very presence reminded local and state leaders they were a single nation with a shared past and collective future. Lafayette certainly never let them forget it.

 

••

 

Lafayette’s admirer and friend Fanny Wright, who believed the United States represented the vanguard of human progress, met up with the party from England. As feared, the presence of the young woman was socially awkward. Eleanor Parke Custis Lewis, Washington’s step-granddaughter, who also joined Lafayette’s party in New York, did not take kindly to the unaccompanied young woman.

 

Part of the reason for the tension with Nelly Lewis was Fanny Wright’s staunch abolitionism, while the Washingtons remained committed slave-owners. Lafayette was caught between his own abolitionist principles and the desire for social harmony. Though he never publicly embarrassed his slave-owning friends in America, Lafayette never missed an opportunity to demonstrate his own commitment to emancipation. Believing the universal education of the African population to be of paramount importance to successful emancipation, Lafayette made a point of visiting the African Free School, an academy established by the New York Manumission Society to give equal education to hundreds of black pupils.

 

Lafayette was greeted by an address from a bright eleven-year-old student named James McCune Smith: “Here, sir, you behold hundreds of the poor children of Africa sharing with those of a lighter hue in the blessings of education; and, while it will be our pleasure to remember the great deeds you have done for America, it will be our delight also to cherish the memory of General La Fayette as a friend to African emancipation, and as a member of this institution.” Young James McCune Smith would grow up to become the first African American to hold a medical degree, a prominent antebellum abolitionist, and mentor of Frederick Douglass.

 

As they moved into the southern states, Lafayette’s company confronted the unavoidable contradiction of American liberty and American slavery. Levasseur, as much an abolitionist as Lafayette and Wright, was not comfortable with the things he now saw. “When we have examined the truly great and liberal institutions of the United States with some attention,” he wrote, “the soul feels suddenly chilled and the imagination alarmed, in learning that at many points of this vast republic the horrible principle of slavery still reigns with all its sad and monstrous consequences.” Since Levasseur published his journals under the general editorial direction of Lafayette, whose political views they were meant to promote, we can take Levasseur’s observations as bearing Lafayette’s stamp of approval.

 

Reflecting on his travels, Levasseur remained hopeful emancipation was inevitable, partly because everyone agreed slavery was terrible. “For myself,” he wrote, “who have traversed the 24 states of the union, and in the course of a year have had more than one opportunity of hearing long and keen discussions upon this subject, I declare that I never have found but a single person who seriously defended this principle. This was a young man whose head, sufficiently imperfect in its organization, was filled with confused and ridiculous notions relative to Roman history, and appeared to be completely ignorant of the history of his own country; it would be waste of time to repeat here his crude and ignorant tirade.”

 

Lafayette and Levasseur shared a concern that this racist ignorance threatened the standing of the United States in the world because of its ongoing violations of fundamental human rights: “If slave owners do not endeavor to instruct the children of the blacks, to prepare them for liberty; if the legislatures of the southern states do not fix upon some period, near or remote, when slavery shall cease, that part of the union will be for a still longer time exposed to the merited reproach of outraging the sacred principle contained in the first article of the Declaration of Rights; that all men are born free and equal.” And as Lafayette had written in his own declaration of rights, violation of this sacred principle always left open the right of the victims of tyranny to exercise another fundamental right: resistance to oppression.

 

The next stop on the official itinerary was a visit to Thomas Jefferson at his slave plantation, Monticello. Lafayette and Jefferson had not seen each other since 1789, during the hopeful days after the fall of the Bastille. Lafayette’s party spent a week at Monticello, and Jefferson escorted them to Charlottesville to tour his pride and joy: The University of Virginia. James Madison even made an appearance.

 

Levasseur noted that while Lafayette stayed with his Virginian friends—all of them members of the plantation slave aristocracy—Lafayette did not shy away from bringing up emancipation: “Lafayette, who though perfectly understanding the disagreeable situation of American slaveholders, and respecting generally the motives which prevent them from more rapidly advancing in the definitive emancipation of the blacks, never missed an opportunity to defend the right which all men without exception have to liberty, broached among the friends of Mr. Madison the question of slavery. It was approached and discussed by them frankly. It appears to me, that slavery cannot exist a long time in Virginia, because all enlightened men condemn the principle of it, and when public opinion condemns a principle, its consequences cannot long continue to subsist.” Levasseur, however, was far too optimistic about the noble sentiments of the Virginians and the future prospects of slavery. Condemning something in principle has little bearing on whether it is allowed to persist in reality.

 

••

 

Lafayette made it back to Boston just in time for the Bunker Hill celebrations on June 17, 1825. After being escorted to the site in a carriage drawn by six white horses, he laid the cornerstone of the Bunker Hill monument. After a day of mutually admiring speechmaking, Lafayette requested a bag full of the dirt from the excavation site so he could take it home with him and always keep soil from the birthplace of American liberty.

 

 

Before Lafayette departed once and for all, George Washington Parke Custis conceived of sending a present to another liberty-loving American: Simón Bolívar. Bolívar had recently completed a series of campaigns ending Spanish rule in Venezuela and Colombia and was now campaigning in Peru. Citizens of the United States cheered the exploits of the Liberator, and Lafayette agreed Bolívar was the Washington of South America. The gift package included a pair of Washington’s pistols, a portrait of the late president, and a letter from Lafayette. Lafayette offered the President Liberator “personal congratulations from a veteran of the common cause,” and said of the enclosed gifts, “I am happy to think that of all the existing men, and even of all the men of history, General Bolívar is the one to whom my paternal friend would have preferred to offer them. What more can I say to the great citizen whom South America hailed by the name of Liberator, a name confirmed by both worlds, and who, endowed with an influence equal to his disinterestedness, carries in his heart the love of liberty without any exception and the republic without any alloy?” Lafayette departed for home proud that the great work of liberty continued its inexorable march through the Americas.

 

On September 8, 1825, a new naval frigate recently christened the Brandywine in Lafayette’s honor set sail for Europe with Lafayette aboard. He never returned to the United States. And while he sailed away content in the knowledge the legacy of his past glories would live forever in the New World, he hoped a few final future glories still lay ahead in the Old World.

The Scandalous Six Week Walk That Inspired D. H. Lawrence's Most Popular Novels (excerpt)

On August 5, 1912, Frieda von Richthofen, a thirty-three-year-old German aristocrat and married mother of three, awoke to the sound of rain. It was four thirty in the morning. Quivering strips of pearly light seeped through the sides of the shutters. She opened her eyes, dimly aware of her young lover strapping up their rucksacks and humming beneath his breath. At last she was about to embark on a real adventure, the sort of escapade she’d dreamt of for the last ten years. It had been a long, dry decade in which her emotionally restrained life in a comfortable suburban house on the edge of industrial Nottingham had almost driven her mad.

 

Her lover was the fledgling writer D. H. Lawrence, a penniless coal miner’s son whom she’d met four months earlier. The pair of them had been poring over maps and guidebooks for days, plotting a route that would take them through “the Bavarian uplands and foothills,” over the Austrian Tyrol, across the Jaufen Pass to Bolzano, and down to the vast lakes of Northern Italy.

 

Later, this six-week walk would become much mythologised as their “elopement.” But the evidence suggests this was less an elopement than a feverish bid for freedom and an inarticulate yearning for renewal. On the first misty, sodden step of that six-week walk, Frieda began the process of reinventing herself as a woman without children, scissoring herself free from the restrictions and responsibilities that accompanied being a mother in Edwardian England. Almost overnight she transformed herself from a fashionably dressed and hatted mother and manager of multiple household staff to someone else entirely: a woman who put comfort before fashion, who took responsibility for her own cooking and laundry, who swapped warm, soapy baths for ice-cold pools and the latest flushing lavatory for speedy squats among the bushes.

 

Frieda’s isolation was exaggerated by her choice of paramour. Lawrence spoke with a Derbyshire accent. He dressed in cheap clothes and came from a rough mining village. He was also six years younger than she was, at a time when women were expected to marry older men. To leave children, a comfortable home, and a successful husband broke every taboo. To leave them for a man like this was unthinkable.

 

In 1912, this was not how women behaved. Least of all mothers.

 

 

Frieda and Lawrence put on their matching Burberry raincoats. Frieda donned a straw hat with a red velvet ribbon round the brim. Lawrence wore a battered panama. They squeezed a spirit stove into a canvas rucksack, planning to cook their supper at the side of the road. They had twenty-three pounds between them, barely enough to reach Italy.

 

The pair chose a punishing route that would fully occupy them with its steep climbs and its perilous twists and turns. Neither of them had walked or climbed in mountains before, neither was a skilled orienteer, and neither was particularly fit. Lawrence found the mountains bleak and terrifying, seeing there the eternal wrangle between life and death. Later, he made full use of his Alpine terror in Women in Love, sending Gerald Crich to a lonely death in the barren glaciers of the Alps. Frieda, however, thought it was “all very wonderful.”  

On this walk, the pair averaged ten miles a day, much of it uphill and strenuous, much of it cold, always with their packs on their backs. On some days they walked farther still. Only when the weather was particularly hostile did they allow themselves the luxury of catching a train to the next town.

 

Even as Frieda and Lawrence celebrated their new-found freedom—from their past lives and from the passionless provincialism of pious England—they were acutely aware of how hemmed in they were. Frieda’s presence had a profound effect on Lawrence, sparking a creative surge that resulted in dozens of short stories, poems, and essays, as well as his three acknowledged masterpieces: Sons and Lovers, The Rainbow, and Women in Love. But as he led Frieda farther and farther away from her previous life, he began to see how necessary she was to him. Not only for his happiness but for the continued blooming of his genius. Many of his poems are testimony to this feeling of necessity, a feeling that occasionally tips into a terrified dependency:

 

The burden of self-accomplishment!

The charge of fulfilment!

And God, that she is necessary!

Necessary, and I have no choice!

 

Frieda discovered that her new-found liberty was similarly compromised. She left her husband, children, and friends to discover her own mind, to be freely herself. But freedom is infinitely more complicated than simply casting off the things we believe are constraining us. Hurting others in the pursuit of freedom and self-determination brought its own struts and bars, its own weight of guilt. Frieda never shared the great weight of her guilt. She couldn’t. Lawrence wouldn’t allow it. His friends joined forces with him, insisting that she put up or shut up, that her role was to foster his genius. At any price.

 

 

After six weeks, Frieda and Lawrence arrived in Riva, then an Austrian garrison town on Lake Garda. Vigorous ascents over steep mountain passes in snow and icy winds followed by nights in lice-ridden Gasthäuser had left them looking like “two tramps with rucksacks.” Within days a trunk of cast-off clothes from Frieda’s glamorous younger sister had arrived, swiftly followed by an advance of fifty pounds for Sons and Lovers from Lawrence’s publisher. In a big feathered hat and a sequinned Paquin gown, an exuberantly overdressed Frieda and a shabbier Lawrence sauntered round the lake, celebrating their return to civilisation and rubbing shoulders with uniformed army officers and elegantly dressed women.

 

So why did Frieda devote less than twenty-five lines of her memoir to this pivotal time in her life? She writes in the same book: “I wanted to keep it secret, all to myself.”

 

This journey was so vivid and intense, so personal, that neither Frieda nor Lawrence wanted to enclose it or share it. When Lawrence fictionalized a version of it in Mr. Noon, he never sought publication (unusually for him, as they invariably needed the money). Instead, he consigned it to a drawer. Nor, after his death, did Frieda try to have Mr. Noon published—despite publishing other writings Lawrence had chosen to keep private. Mr. Noon stayed unpublished until 1984.

 

 

Twenty months after their Alpine hike, and at his insistence, Frieda married an increasingly restive and cantankerous Lawrence, arguably exchanging one form of entrapment for another. It wasn’t until his death in 1930 that she became free to live as, and where, she wanted. In a bold attempt to finally assert her own identity she used the name “Frieda Lawrence geb. Freiin von Richthofen” on the opening page of her memoir. That, I suspect, was her definitive moment of freedom.

 

After Lawrence died, she lived on the same ranch (“wild and far away from everything”) in Taos, New Mexico, for much of her remaining twenty-six years. Here she cultivated a close group of friends, a surrogate for the family she’d sacrificed. And she walked. Her memoir is peppered with references to walking: “We were out of doors most of the day,” she says, on “long walks.” Her first outing with Lawrence, shortly after they met, had been “a long walk through the early spring woods and fields” of Derbyshire with her two young daughters. It was on this walk that she discovered she’d fallen in love with Lawrence. Later she wrote of “delicious female walks” with Katherine Mansfield, walks through Italian olive groves, walks into the jungles of Ceylon, walks along the Australian coast, walks through the canyons of New Mexico, or simply strolls among “the early almond blossoms pink and white, the asphodels, the wild narcissi and anemones.” Frieda walked in the countryside for the rest of her life.

 

But the pivotal walk of her life—the six-week walk she skirted in twenty-five lines—was the most significant. From here, Frieda emerged as herself, as the free woman she had always longed to be—dressing in scarlet pinafores and emerald stockings, swimming naked, making love en plein air, walking as she wished. She had also become the free woman Lawrence needed for his fiction. He made full use of her in his writing, continually remoulding her, most famously as Ursula in Women in Love, and Connie in Lady Chatterley’s Lover. His novels shaped history, but Frieda was the catalyst.

 

 

Excerpted from WINDSWEPT: Walking the Paths of Trailblazing Women by Annabel Abbs. Copyright (c) 2021 by Annabel Abbs. Published by Tin House.

Review: Heroes of Ireland's Great Hunger

 

 

Christine Kinealy, professor of history and founding director of Ireland’s Great Hunger Institute at Quinnipiac University in Connecticut, has created an impressive academic juggernaut for the study of the mid-19th-century Great Irish Famine and for bringing the famine to the attention of a broader public. Her more recent published work includes The Bad Times: An Drochshaol (2015), a graphic novel for young people developed with John Walsh; Private Charity to Ireland during the Great Hunger: The Kindness of Strangers (2013); Daniel O’Connell and Anti-Slavery: The Saddest People the Sun Sees (2011); and War and Peace: Ireland Since the 1960s (2010). She recently released a collection of essays, prepared with Jason King and Gerard Moran, Heroes of Ireland’s Great Hunger (2021, co-published by Quinnipiac University Press and Cork University Press). In Ireland, it is available through Cork University Press; in the United States, it is available in paperback on Amazon.

 

Co-editor Jason King is the academic coordinator for the Irish Heritage Trust and the National Famine Museum in Strokestown, County Roscommon, Ireland. Co-editor Gerard Moran is a historian and senior researcher based at the National University of Ireland – Galway.

 

Sections in Heroes of Ireland’s Great Hunger include “The Kindness of Strangers,” with chapters on Quaker philanthropy, an exiled Polish count who distributed emergency famine relief, and an American sea captain who arranged food shipments to Ireland; “Women’s Agency,” with three chapters on women, each of whom “rolled up her linen sleeves” to aid the poor; “Medical Heroes,” with chapters on five doctors who risked their own lives to aid the Irish; and sections on the role of religious orders in providing relief and on Irish leadership. Final reflections include a chapter on “The Choctaw Gift.” The Choctaw were an impoverished Native American tribe who had suffered through the Trail of Tears displacement to the Oklahoma Territory. They donated more than they could afford to Irish famine relief because they understood the hardship of oppression and going without.

 

Kinealy, King, and Moran managed to enlist some of the top scholars in the field of Irish Studies from both sides of the Atlantic to document how individuals and groups made famine relief a priority, despite official government reticence and refusal in Great Britain and the United States. In her work, Kinealy continually draws connections between the Great Irish Famine and current issues, using the famine as a starting point for addressing problems in the world today. The introduction to the book opens with a discussion of the global response to the COVID-19 pandemic.

 

Readers meet powerful individuals who deserve a special place in the history books. James Hack Tuke, a Quaker, not only provided and distributed relief but attempted to address the underlying issues that left Ireland, a British colony, impoverished. His reports from famine-stricken areas highlighted pre-existing social conditions caused by poverty, not just famine-related hunger. His reports challenged the stereotype popularized in the British press that the Irish were lazy and stressed the compassion the Irish showed for their neighbors. While working with famine refugees in his British hometown of York, Tuke became ill with typhus, also known as “Famine Fever,” a disease that left him with debilitating after-effects for the rest of his life. After the famine subsided in the 1850s, Tuke continued his campaign for Irish independence from the British yoke.

 

Count Paweł de Strzelecki of Poland was a naturalized British citizen who documented the impact of the Great Irish Famine so that British authorities could not ignore what was taking place, and who spoke out against the inadequacy of British relief efforts. Strzelecki was a geographer, geologist, mineralogist, and explorer. As an agent for the British Association for the Relief of Distress in Ireland and the Highlands of Scotland, he submitted reports on conditions in Counties Donegal, Mayo, and Sligo. The reports challenged the British government’s effort to minimize the impact of the famine on the Irish people. On a personal level, Strzelecki provided direct aid to impoverished Irish children and lobbied before Parliamentary committees for increased governmental and institutional attention to their plight. He later worked to provide assistance to women who were emigrating to Australia.

 

The chapter on Asenath Nicholson was written by my colleague at Hofstra University, Maureen Murphy, Director of the New York State Great Irish Famine Curriculum and author of Compassionate Stranger: Asenath Nicholson and the Great Irish Famine. Nicholson was an American Protestant evangelical who travelled the Irish countryside delivering relief packets and sending reports home to the United States in an effort to raise more money. While she distributed Bibles, she did not make participation in Protestant services a condition of aid, unlike a number of British aid workers. Murphy describes Nicholson as a “woman who was ahead of her time – a vegetarian, a teetotaler, a pacifist, and an outdoor exercise enthusiast” (96). Nicholson’s achievements were largely ignored by a male-dominated world until brought to public attention by Murphy’s work.

 

Daniel Donovan was a workhouse medical doctor in Skibbereen, perhaps the hardest-hit town in County Cork and in all of Ireland. I consider him one of the most significant heroes included in the book. As epidemic diseases ravaged the countryside, Dr. Dan, as he was known locally, treated the poor and documented conditions for the outside world. Donovan’s diary reporting on the impact of the famine in Skibbereen was published in 1846 as Distress in West Carberry – Diary of a Dispensary Doctor, and sections were reprinted in a number of newspapers in Ireland and England, including The Illustrated London News. Dr. Dan, who became a major international medical commentator on famine, fever, and cholera, continued to serve the people of Skibbereen until his death in 1877.

 

I do have one area of disagreement with the editors. I would have included a section on the leaders of Young Ireland and the 1848 rebellion against British rule, including William Smith O’Brien, John Blake Dillon, John Mitchel, and Thomas Meagher. Rebellion, as well as relief, was an important and heroic response to the Great Irish Famine.

 

At Hofstra University in suburban New York, I teach a course on the history of the Great Irish Famine and its significance for understanding the world today. Too often, courses like these focus on horrors, and I look forward to using this book in class to shift the focus, at least in part, to heroes.

 

Resettling Refugees is Harder than You Think – A Personal History

 

 

 

In the past I have written extensively for History News Network about my extended Vietnamese family, but I have never written about their settlement in America after the war and the new roots they established in the Washington, D.C. metro area. Because of the politically contentious prospect of resettling large numbers of Afghan refugees in the United States today, here is an essay about The Family, as we in our family call them, and some truths about the trials and tribulations of helping refugees enter and start a new life in America. To be clear, 1975 was not today. Afghans are not Vietnamese. Though the end of each war was chaotic, the circumstances of each war and how they ended are vastly different. Not everything I have written about the Vietnamese applies to Afghanistan, but there are enough similarities to bridge what took place 46 years ago and what is happening today.

 

To begin: it seems as if it were only yesterday that I woke up on a weekday morning in the spring of 1975 with 21 Vietnamese refugees living in the large, finished basement of my sprawling house in Rockville Centre, N.Y. In reality, I admit, it was no surprise. They were my wife Josephine’s mother and father. Her three brothers. Wives. Children. Cousins. They were there because I brought them there—they had no other place to go. The war had finally ended in Vietnam. They had become America’s newest refugees.

 

Without doubt, they missed their home and their former way of life. As far as I could tell, they bore no permanent scars from having left Saigon in a hurry, and in time they made a success of their new lives. Many years later some of The Family returned to Saigon, by then renamed Ho Chi Minh City, to witness a life they were glad to have missed while they were becoming successful and entrenched in Maryland and Virginia. To help them in America, my mother-in-law carried with her a half million Vietnamese piasters, which immediately lost their value after the country fell. I still have the bills, as crisp as they were when first obtained.

 

I helped get them out of Saigon as the country was falling to the North Vietnamese and Viet Cong. My Vietnamese wife, Josephine, and I were able to move them from the American government camps, usually US Army bases, where they had been patiently waiting for the freedom they knew they would have once they walked unimpeded on American soil. They wanted freedom badly enough to move as fast as they could from their homes in Saigon as the country collapsed into near chaos caused by marauding forces led by Hanoi in North Vietnam. Escape for them was difficult and harrowing, but when the war ended it was an easy decision for my family to travel thousands of miles into the unknown. If, after escaping, they had any doubts about what they had done, they never expressed them to me.

 

Now that they were free, the next step into their new life was living in my basement. I suddenly had the role of the all-knowing big uncle whose job was to resettle his family into a healthy, changed, and meaningful life. But there were challenges to overcome that had no immediate answers or solutions. Though they wore blue jeans and sneakers of every make and brand, they were not us, meaning they were foreigners without a hint of what an American was back then, at least on the surface. My family was a curious mix: they were Catholic and Buddhist, while my wife, our children, and I were Jewish. That mixture in one family was usually enough to create problems in a Western household, but it meant nothing to these Vietnamese to have three different religions under one roof. So, what some may have considered a major problem would not get in the way of the family’s resettlement.

 

My family had lived in well-built houses in central Saigon. But they did not understand the modern conveniences as we knew them in America. They cooked with charcoal on brick stoves and in brick ovens. In my home we had the usual gas stove, a microwave, and a tabletop oven for quick meals, toast and bagels, or heating a pastry or muffin. We had a huge refrigerator. They knew almost nothing of refrigeration because they had only had a mini model bought for them in the PX during my days in Saigon. We had to teach them how to use the stove and the refrigerator. For instance: close the door after using the fridge so the food stays cold and fresh. In Saigon, there was no cold to protect the food. A pot of rice was out and open in the kitchen all day, and family members filled a bowl whenever they were hungry. Buying rice in 50-pound bags, we duplicated that custom with great success, giving our guests a taste of their former homes. Teaching them to use a stove meant showing them how to turn off the gas when not in use, how to adjust the burners, and why not to use matches to light the gas. They loved the washer and dryer we had in the basement and learned to use them effortlessly and frequently. With 21 people living together, washing and drying the few items they owned was necessary and important. Most did not speak very much English. I spoke almost no Vietnamese. That made communication and instruction much more difficult, but with the help of my wife, who was a good linguist, we managed to muddle through because of the refugees’ desire to get everything right and to fit in as fast as they could.

 

Although we were new to the community, the people in Rockville Centre were very generous. There were many stories in Newsday about what we were doing. Every morning we found clothing and food on our doorstep in a meaningful effort to help us care for the new arrivals. I am not sure the same attitude would prevail today.

 

Soon, though, because of my work, we were on our way to Maryland and my new job as Washington producer for the Today Show. The 21 settled into my new home in Potomac, Maryland. My industrious, hard-working, and determined wife, Josephine, drawing on a few of my contacts, found them jobs as landscapers and in hotels and offices, where they cleaned rooms and started working in kitchens. They eagerly accepted their new jobs and worked hard. They earned their own money and experienced how people lived outside the confines of my home. With money in their pockets, growing bank accounts, and more of my and Josephine’s continued tutorials, in less than a year they were ready to rent their own homes and truly begin independent lives.

 

Granted, this is 46 years later and there have been enormous changes in the world. Forty-six years ago, smartphones did not exist. The Internet did not exist. Laptop computers did not exist. Social media did not exist. Movements fighting for freedom and equality were small, often underground, and not very influential. Today, all of those things are real, powerful difference-makers with which the new refugees are familiar. The differences, however, are still great. Cultures often built around deeply conservative religious beliefs play a bigger role in societies that are very dissimilar to even our diverse way of life. Our customs are not better than those of the average Afghan refugee. They are different. We are not the same, and we must recognize that, adjust, and learn to live with the differences the new refugees bring to our shores, just as they must learn to adjust to us. Experience shows it isn’t easy. But beginning with shared humanity, it is possible.

The Roundup Top Ten for September 3, 2021

Uncovering and Protecting the Black History of Nantucket

by Tiya Miles

"Although the Black community of New Guinea has passed into history, its mark on the landscape remains, a reminder that Nantucket was once a place of working-class ingenuity and Black daring."

 

What if the Coronavirus Crisis Is Just a Trial Run?

by Adam Tooze

The disjointed and haphazard global response to the COVID pandemic bodes poorly for the world's capacity for coordinated action in the face of inevitable crises in the near future. The problem isn't a lack of means but a lack of commitment to collective action.

 

 

70 Years after the UN Refugee Convention, the US Needs to Commit to Help Displaced People

by Linda K. Kerber

The UN Refugee Convention does not impose any real obligations on any nation to offer asylum. The United States must lead the way in recognition of the deeply interconnected world created in large part by American power.

 

 

Hurricane Ida Shows the Climate Dystopia Ahead for All of Us

by Andy Horowitz

"Structural problems need structural solutions. Don’t give charity to Louisiana because it’s unique. Demand that Congress take meaningful action, because Louisiana is not unique, and you may be next."

 

 

Daddy Issues

by Bethany Moreton

White American Christians have embraced aggressive patriarchy as access to social and economic power has become more concentrated in fewer hands. 

 

 

Black Women and Civil War Pensions

by Holly A. Pinheiro, Jr.

Widows and surviving children of Black veterans of the Civil War used their status as pensioners to claim belonging in the nation, but authorities frequently allowed notions of respectability rooted in white supremacy to undermine them. 

 

 

The Agency of the Irresponsible

by Sarah Swedberg

When universities bend to political pressure and adopt "personal responsibility" policies for vaccination, masking, and social distancing, they give agency to the irresponsible and take it away from those who are actively working to protect public health.

 

 

Don't Buy Egyptian Antiquities – Even Fakes

by Erin L. Thompson

Buying antiquities without due diligence into their provenance feeds a black market for looted archaeological objects. 

 

 

There's No "Labor Shortage," Just a Shortage of Wages and Worker Protection

by Lane Windham

American workers, especially women, aren't being lazy. They're taking part in an unrecognized general strike against low wages, inadequate childcare, and dangerous workplaces made more dangerous by COVID. 

 

 

Socialist Actor Ed Asner Fought for Labor

by Jeff Schuhrke

Ed Asner fought for the representation of small-time actors in the Screen Actors Guild and protested American support for right-wing autocrats in Central America. 

 
