The History and Mythology of the Mayflower Arrival in 1620

The Mayflower and its ‘Pilgrims’ remind us of an event which has entered into the cultural DNA of the United States. This is so despite the fact that those who sailed and settled did so as English citizens and subjects of the British crown. As with many such formational national epics, myths jostle with realities as events are remembered and celebrated. Three myths in particular stand out from this epic story.

The myth of Plymouth Rock and Mary Chilton’s bold leap ashore has reverberated through art and popular accounts of the Mayflower. It was a 17th-century event which might be summed up as: “One small step for a woman; one giant leap for the USA.”

The problem is that this story only emerged in 1741, when a project to build a new wharf at Plymouth, Massachusetts, prompted a ninety-five-year-old local resident - Elder Faunce - to claim that a particular rock (about to be buried in the construction process) had received the Pilgrims’ first step onto the shore. Elder Faunce had been born in about 1646, but he had heard the rock’s ‘history’ from his father, who had arrived in Plymouth in 1623 (three years after the Mayflower). However, no account contemporary with the landing in 1620 substantiates this claim. Neither William Bradford nor Edward Winslow made any reference to the ‘rock’ in their records relating to the momentous arrival.

Despite this, the rest, as they say, is ‘history’. Or rather, ‘mythology’. But today that potent myth is enshrined (literally) on the sea shore under its classically-inspired canopy. The rock itself is now much reduced in size, having been broken by various attempts to move it and by the actions of souvenir hunters. Nevertheless, it is a reminder of the power of such symbols to engage with personal and community imaginations.

Then there is the myth of Thanksgiving. As with Plymouth Rock, the image of this event is vivid. And yet the reality is that nobody in the fall of 1621 would have described what occurred as ‘Thanksgiving’. This was because ‘Thanksgivings’ were solemn observances, with long services, preaching, prayer and praise. The settlers did not officially hold one until July 1623.

What occurred in 1621 was a ‘Harvest Home’ celebration. We do not even know exactly when it happened; but it probably took place in late October or early November. Native Americans were definitely there. However, whether they were invited in gratitude for their assistance or simply arrived because food was available we cannot know. What is strange is that when William Bradford later compiled the record known as Of Plymouth Plantation he failed to mention the event at all. He just said that the Pilgrims enjoyed “good plenty” after the harvest of 1621. He had clearly forgotten the event!

If it were not for the 115 words preserved in another document, called Mourt’s Relation, we would know nothing about it whatsoever. This account, probably written by Edward Winslow, says that after the harvest was safely brought in, four men were sent off on a day of duck hunting to provision a special celebration. This celebration included marching and the firing of muskets, watched by both Pilgrims and Native Americans. It was followed by a feast that lasted three days, to which the Native Americans added a contribution of five deer. No turkey or cranberry sauce was present.

The myth of virgin territory is rather less specific but it tends to color much of how we view the matter of the Pilgrims’ arrival and settlement. In this construct we imagine the arrival of the Mayflower as the first footfall of Europeans on a territory hitherto untouched by such arrivals. Nothing seems to convey the sense of their epic voyage as much as the impression of ‘first contact’ between the emigrants and a landscape and native community that had no previous connection with Europeans. From this perspective, the First Encounter with the Nauset people, on Cape Cod in December 1620, seems to reveal a native community whose first reaction to the newcomers was inexplicably hostile.

The reality was much more complex and reveals both the remarkably connected nature of the northern Atlantic communities by the 1620s and the reasons why the Nauset were so unwelcoming in their reaction to the exploratory party of Pilgrims. French and English fishermen and Basque whalers had been landing on the New England coast for over a generation. This helps explain why the Pilgrims were later assisted by Native Americans (Samoset and Tisquantum) who could speak English. That same contact also meant that alien diseases had cleared coastal communities before anyone’s foot was placed on the legendary ‘Plymouth Rock’.

As early as 1616, perhaps as many as ninety per cent of the people living in the vicinity of what would become Plymouth had died in an epidemic. This was why the Pilgrims found cleared fields but no native inhabitants there. The Nauset, for their part, were hostile because they had lost members to English ‘traders’ who had kidnapped them as slaves. It was a slaving expedition that had taken Tisquantum to England, via Málaga in Spain, and then back to his (now devastated) North American home, with the ability to speak English.

When I was writing Mayflower Lives (published by Pegasus Books, New York) and exploring the contrasting lives of 14 ‘Saints’ and ‘Strangers’, the interplay between myth and reality was apparent. This is not to disparage the impact of the Mayflower voyage and settlement through a reductionist revisionism. It is simply to reaffirm the central importance of disentangling myth from reality in any historical exploration of this momentous time. That is the very nature of historical enquiry and we should not fear it, even when applying it to an iconic event.

Perhaps more importantly, though, it is a testimony to the remarkable potency of the Mayflower and its legacy that it has become the stuff of mythology as well as of history. I think that the original Pilgrims would have understood, for they believed that what they had embarked on was not a run-of-the-mill activity. They believed that they walked hand in hand with providence, and this was the foundation of their mindset and outlook. They always believed that what they were doing was of greater significance than it might outwardly appear.

In the 21st century we may agree or disagree with their perspective on life, but what is undeniable is the fact that what they achieved has inspired the imaginations of huge numbers of people and still challenges us, as we seek to understand it today. It is both history and myth intertwined to a remarkable extent.

We Need a Unifier in 2020

 

Two recent mass shootings in El Paso and Dayton remind us how hate-filled our country has become. In an online manifesto that appeared just before the first shooting, the killer of 22 people wrote: “This attack is a response to the Hispanic invasion of Texas. . . . I am simply defending my country from cultural and ethnic replacement brought on by an invasion.”

Some Democratic 2020 presidential candidates faulted President Trump for stoking racism and targeting Hispanic Americans and immigrants. Bernie Sanders, among others, also criticized the National Rifle Association (NRA) and Republican lawmakers who supported its opposition to meaningful gun control. 

Writing of the 2016 presidential election, historian Jill Lepore observed, “The election had nearly rent the nation in two. It had stoked fears, incited hatreds.” She also believed that the Internet and social media had “exacerbated the political isolation of ordinary Americans while strengthening polarization on both the left and the right.” Now, three years later, the situation has worsened. Shortly before the El Paso shooting, the shooter’s manifesto appeared on 8chan, an online message board, and encouraged his “brothers” on the site to spread its contents widely. As a New York Times article noted, “In recent months, 8chan has become a go-to resource for violent extremists.”

In contrast to such violent extremism, some of our finest Americans urged the opposite approach. Dorothy Day, a woman identified by both Barack Obama and Pope Francis as one of our great Americans, stated, “We must always be seeking concordances, rather than differences.” In his Keynote Address to the 2004 Democratic Convention, Obama (then still only an Illinois state senator) voiced a similar appeal. He urged us to remember that “we’re all connected as one people. . . . It is that fundamental belief . . . that makes this country work. It’s what allows us to pursue our individual dreams, and yet still come together as one American family: ‘E pluribus unum’ [as is stamped on much of our currency]. Out of many, one. . . . There is not a black America and white America and Latino America and Asian America [nor, he indicated, a Red-State America and a Blue-State America]; there is the United States of America.” The young Obama then told his listeners, “That’s what this election is about. Do we participate in a politics of cynicism or do we participate in a politics of hope?”

In 2010, now as president, he reiterated his plea for unity and told University of Michigan graduates, “We can't expect to solve our problems if all we do is tear each other down. You can disagree with a certain policy without demonizing the person who espouses it.”

Although President Trump has often demonized those who oppose him, we should remember that we can vigorously oppose his policies without demonizing all his supporters. In 2020 we need a unifier, as Franklin Roosevelt was: one who can help us narrow the yawning gap between Republicans and Democrats, and between the different segments of our multiethnic society.

Cory Booker began his presidential campaign in early 2019 acknowledging our nation’s divisiveness and stressing the need for love and unity.  Another candidate, South Bend (IN) mayor Pete Buttigieg, has urged the need for political humility, for trying to understand the position of those with whom one disagrees, and for realizing that some Trump supporters can, in many ways, be good people. 

For various reasons, however, neither Booker nor Buttigieg has emerged, at least yet, as one of the top four Democratic candidates. But like Dorothy Day, Obama, and Lepore, the two candidates are correct to urge more mutual understanding. Political compromises may not always be possible. Nor are they always for the best. But humility, empathy, and open-mindedness regarding political opinions, our own and others’, are worthwhile goals that should not be minimized.

Seeking unity out of the diverse composition of our multifaceted nation has always been the aim of our most enlightened leaders, from the framers of our constitution to President Obama. In his acclaimed biography of George Washington, Ron Chernow writes of our first president as “the incarnation of national unity” and as “an apostle of unity.” On the eve of the Civil War, Abraham Lincoln told our nation, “We [North and South] must not be enemies. Though passion may have strained it must not break our bonds of affection.” We should listen to “the better angels of our nature.”

One of Lincoln’s contemporaries and a great admirer of his, the poet Walt Whitman, also stressed what unified us as a people. In a recent Atlantic essay on the poet, Mark Edmundson wrote:

At a time when Americans hate one another across partisan lines as intensely perhaps as they have since the Civil War, Whitman’s message is that hate is not compatible with true democracy, spiritual democracy. We may wrangle and fight and squabble and disagree. Up to a certain point, Whitman approved of conflict. But affection—friendliness—must always define the relations between us. When that affection dissolves, the first order of business is to restore it. 

Several years after Lincoln’s death the great abolitionist Frederick Douglass gave a major speech praising the composite nature of the United States. His message was that if “we seek the same national ends” our diversity—“Indian and Celt; negro and Saxon; Latin and Teuton; Mongolian and Caucasian; Jew and Gentile”—can be a great blessing. 

A great admirer of Whitman and Lincoln—his six-volume biography of Lincoln earned him a Pulitzer Prize in History—poet Carl Sandburg extolled the beauty of our composite nation in all its glorious variety. The son of Swedish immigrants himself, he appreciated our ethnic diversity. His Jewish friend Harry Golden wrote in 1961 that “the fight against anti-Semitism and Negrophobia had been a special project” for Sandburg. As a newspaperman in Chicago after World War I, Sandburg printed the platform of the National Association for the Advancement of Colored People (NAACP), and was later honored by being made a lifetime member of that organization. During World War II, he hired two Japanese-Americans to work for him at the same time that over 100,000 other Japanese-Americans were being uprooted and sent to internment camps. In addition, he wrote a column warning against such prejudice.

Although a strong supporter of Franklin Roosevelt, he also understood and appreciated political differences. In his long, long poem The People, Yes (1936) he wrote:

The people have the say-so. 

Let the argument go on

. . . . . . . . . . . . . . 

Who knows the answers, the cold inviolable truth?  

 

Yet, like Whitman and Lincoln, he championed national, and even international, unity. In the Prologue for The Family of Man (1955), a book of photographs from around the world, he wrote of how alike we are “in the need of love, food, clothing, work, speech, worship, sleep, games, dancing, fun. From the tropics to arctics humanity lives with these needs so alike, so inexorably alike.”

 

His friend Adlai Stevenson (the Democratic presidential candidate in 1952 and 1956) once said that Sandburg was “the one living man whose work and whose life epitomize the American dream.” In 1959, on the 150th anniversary of Lincoln's birth, he addressed a Joint Session of the U. S. Congress, becoming the first private citizen to do so in the twentieth century. After his death in 1967, he was honored by almost six thousand people at the Lincoln Memorial, including President Lyndon Johnson, Chief Justice Warren of the Supreme Court, Sandburg’s friend Justice Thurgood Marshall, and various poets and members of Congress. 

 

Another poet, Langston Hughes, died the same year as Sandburg but first rose to fame as part of the Harlem Renaissance of the 1920s. Strongly influenced by both Sandburg and Whitman, Hughes reflects their spirit in his poem “Let America Be America Again” (1935).

 

Let America be the dream the dreamers dreamed—

Let it be that great strong land of love. 

. . . . . . . . . . . . . . 

I am the poor white, fooled and pushed apart, 

I am the Negro bearing slavery's scars. 

I am the red man driven from the land, 

I am the immigrant clutching the hope I seek—

. . . . . . . . . . . . . . 

O, yes, I say it plain, 

America never was America to me, 

And yet I swear this oath—

America will be!

 

Decades later, Martin Luther King, Jr. (MLK) expressed a similar dream in his most famous speech: “I have a dream that one day down in Alabama. . . little black boys and black girls will be able to join hands with little white boys and white girls as sisters and brothers. . . . With this faith we will be able to transform the jangling discords of our nation into a beautiful symphony of brotherhood.” 

 

Five years later, immediately following news of MLK’s death, Senator Robert Kennedy (RFK) urged a similar unity: “What we need in the United States is not division; what we need in the United States is not hatred; what we need in the United States is not violence and lawlessness, but is love, and wisdom, and compassion toward one another, and a feeling of justice toward those who still suffer within our country, whether they be white or whether they be black.” (For more on King and Kennedy, see here.)

 

Now, a half century after hatred had taken the lives of MLK and RFK, rancor again runs rampant in our country. Although we need a unifier to emerge as president in the 2020 election, we perhaps first need the electorate to value and seek unity, to remember that the aim of politics should be to further the common good, not just our own narrower partisan interests. 

 

Recently some groups and individuals have moved in that direction. In December 2016 a group of Trump and Hillary Clinton supporters got together in Ohio and from that meeting emerged “Better Angels,” a “national citizens’ movement to reduce political polarization in the United States by bringing liberals and conservatives together to understand each other beyond stereotypes, forming red/blue community alliances, teaching practical skills for communicating across political differences, and making a strong public argument for depolarization.”

 

On an individual level, writers such as the recently deceased Tony Horwitz, George Saunders, and Pam Spritzer have, like Dorothy Day, MLK, RFK, Barack Obama, and Pope Francis, advised dialogue and working (in King’s words) “to transform the jangling discords of our nation.” In the months leading up to the 2020 elections we would do well to heed their advice. We can do this not only by working toward the defeat of the divider Donald Trump, but also by replacing his politics of divisiveness with one of inclusiveness and compassion.

The Cultural History Behind Once Upon a Time...in Hollywood

Note: the following article contains spoilers

 

 

Once Upon a Time…in Hollywood, a film filled with nostalgia for old movies, music, television programs, cars, and celebrities, is Quentin Tarantino’s love letter to Los Angeles.  In an interview with Entertainment Weekly, the writer/director declared:  “I grew up in Los Angeles….the only people who love it the right way, are the people who grew up here….The film became a big memory piece.”  Though he does not make much of an effort to dig deep into historical issues, he creates a cast of characters (some real, some fictional) and provides images that offer an interesting and sometimes provocative glimpse of life in L.A. in 1969. He also explores some of the tensions between old-time Hollywood and the 1960s counterculture that was becoming more prevalent and menacing by the end of the decade.  Tarantino leaves few doubts about where his sympathies lie.  

 

The plot, if we can call it that, is very simple, covering just three days in 1969, two in February and the October day of the Manson family massacre. We follow the activities of two fictional aging Hollywood figures: a former big-time TV star, Rick Dalton (Leonardo DiCaprio), and his stunt double, Cliff Booth (Brad Pitt). Dalton is past his prime as an actor and is struggling to remain relevant, though his roles are now limited to playing the “bad guy,” which talent agent Marvin Schwarzs (Al Pacino) tells him is the kiss of death. The movie also follows Sharon Tate (Margot Robbie) as she shops for presents for her husband Roman Polanski, watches herself in a movie, and parties at the Playboy Mansion with numerous celebrities. As Rick works on the set of the TV show Lancer, Cliff spends his days fighting Bruce Lee, fixing Rick’s TV antenna, and picking up a hitchhiker who happens to be a member of the Manson “family.” After a six-month stay in Italy, where Rick tries to revive his career by taking roles in spaghetti westerns, the two return to LA several hours before the time of the massacre.

 

Los Angeles was (and still is) a city filled with cars. Many scenes in the movie are set in cars travelling through the remarkably recreated streets of 1960s L.A. The director emphasizes the personal nature of the film by using shots in moving cars that are pointed upwards, as if from the perspective of a six-year-old Tarantino sitting inside his stepfather’s Karmann-Ghia, which happens to be the type of car that Cliff drives when he’s not using Rick’s. Through the eyes of a child, we see the billboards for old products like RC and Diet Rite Cola, movie theater marquees announcing names of films being shown, and, of course, classic cars like Mustangs, Cadillac Coupe de Villes, and VW Beetles. The producer acquired nearly 2,000 cars to use in the background to help set the tone. Avid car enthusiasts may be bothered by the inclusion of some cars that had not been produced before 1969, but the overall impact of the old cars in the film is terrific.

 

Car radios supply much of the soundtrack throughout the film, which includes a number of hit songs by well-known artists who remain popular to this day, such as Neil Diamond, Deep Purple, and Simon and Garfunkel. Yet the soundtrack goes beyond a predictable 60s-greatest-hits collection and includes numerous songs by lesser-known acts like the Buchanan Brothers, the Box Tops, Buffy Sainte-Marie, and Willie Mitchell, which puts the viewer in the back seat of a car listening to whatever happens to come on the radio, just as it would have been in 1969 before the days of personal playlists and specialized satellite radio channels. We even have to hear the commercials. Tarantino also includes some news bulletins, including one on Sirhan Sirhan, Robert Kennedy’s assassin. (More of these kinds of bulletins would have added to the richness of the historical context.) The soundtrack also sets up some interesting references in the film. For example, Cliff is listening to “Mrs. Robinson” as he eyes a flirtatious hippie teenager named Pussycat, who ends up being a member of Manson’s cult. In an interesting twist on the song and the film with which it is inextricably linked, The Graduate, the older Cliff rebuffs Pussycat’s precocious sexual advances because she can’t provide proof of her age. The scene also brings to mind a comparison with one of the historical figures in the film, Polanski, who in 1977 sexually assaulted a thirteen-year-old girl.

 

More than anything else, Once Upon a Time is about the entertainment industry, and it is filled with dozens of references to movies and TV programs. In one of the more memorable scenes, in the middle of the day Sharon Tate walks into an LA theater showing The Wrecking Crew, starring herself and Dean Martin, and asks the manager if she can go in for free because she is in the movie, just as Tarantino had once done while trying to impress a date by taking her to see True Romance, which he had written. Instead of reshooting scenes from The Wrecking Crew with Margot Robbie playing Tate’s role, we see the actual fight scene between Tate and Nancy Kwan that was choreographed by Bruce Lee, as the fictional Tate (Robbie) soaks up the audience’s reaction. In this writer’s favorite scene, as Rick reflects on how his career would have been so different had he been given the iconic role of Virgil Hilts in The Great Escape instead of Steve McQueen, Tarantino splices Rick into the actual movie. We see Rick’s Hilts defiantly delivering the same lines (“I intend to see Berlin…before the war is over”) to Commandant von Luger as he is sent to “the cooler” after his first escape attempt is thwarted. We also see classic TV shows like Mannix in the background of many scenes, a show Brad Pitt told Entertainment Weekly was his father’s favorite. The film includes many actors playing the stars of the era. At a party at the Playboy Mansion, we see Michelle Phillips, Mama Cass, and Roman Polanski. Damian Lewis, who bears a striking resemblance to Steve McQueen, makes the iconic star seem rather creepy and strange as he talks about Tate and Polanski’s relationship. Mike Moh’s portrayal of Bruce Lee is also unflattering, so much so that the martial arts star’s family publicly objected to it.

 

An overarching theme of the film is the clash of old Hollywood and the counterculture.  Early in the film, a group of teenage hippies—who end up being part of the Manson cult—is shown digging through a dumpster as they sing the lyrics to an actual Charles Manson song, “I’ll Never Say Never to Always”:  “Always is always forever/As long as one is one/Inside yourself for your father/All is more all is one.”  Rick and Cliff, the Hollywood heroes, are repulsed as they catch a glimpse of the hippies in the dumpster and they frequently show contempt for them throughout.  Though Manson (Damon Herriman) appears only once in the film driving his Twinkie truck outside the Tate/Polanski home months before the murders, his presence is felt throughout by the way his “family” members talk about him.  There is a very convincing portrayal of the cult at its home base at Spahn Ranch, an old site used in westerns like the ones Rick used to star in.  

 

Cliff, a decorated war veteran from either World War II or Korea, is the hero in this movie.  It’s not hard to imagine Cliff doing stunts for actors like Gary Cooper and John Wayne. Though Cliff was rumored to have killed his wife (in a flashback, we see Cliff on a boat pointing a spearfishing gun at his wife as she berates him for being a “loser,” but we don’t see him pull the trigger), he acts with a cool, detached dignity for most of the film. For example, after refusing Pussycat’s sexual advances in his car, he drops her at Spahn Ranch where she lives with dozens of members of the Manson “family.” He asks the teen and other members of the cult about George Spahn (Bruce Dern), whom he remembers from his work on Rick’s TV show at the ranch. In his attempt to get to Spahn, Cliff encounters Squeaky Fromme (the future would-be assassin of Gerald Ford played by Dakota Fanning), who refuses to allow the stuntman to see Spahn. (Tarantino accurately portrays the fact that Fromme had a transactional sexual relationship with Spahn that enabled the cult to live at his ranch.  Spahn also gave her the infamous nickname by which she is known.)  Cliff calmly yet firmly informs Squeaky that he’s coming in and that she can’t stop him. Once convinced that Spahn is not threatened by the hippies, Cliff leaves the ranch, but not before pummeling one of the male cultists for slashing his tire.  

 

As the time of the massacre approaches, the washed-up Hollywood duo are hardly in any condition to heroically prevent the horrific violence at Tate’s home next door to Rick’s. As the Manson murderers walk up Cielo Drive, Rick is drunk, floating in the pool with headphones on, and Cliff is at the beginning of an LSD trip from an acid-laced cigarette. At this point, Tarantino abandons the original events and creates a fictional ending. Instead of breaking into Tate’s home, the three cult members (four were actually present) go to Rick’s, only to be brutally beaten (in Tarantino-style violence) by Cliff and mauled by his pit bull. Rick is oblivious to all of this until one of the screaming assailants jumps into his pool to escape. As if to punctuate the point that these old-time heroes are still relevant, Rick retrieves a flame thrower he had used in one of his movies to incinerate a group of Nazi officers, and turns it on the girl flailing in the pool to eliminate the threat of the cult. (We see the scene from Rick’s movie earlier, and it’s a clear reference to the climax of Tarantino’s Inglourious Basterds.) The film concludes with a pregnant Sharon Tate and one of the other guests greeting Rick and finding out about all the commotion. Old Hollywood has saved the day.

 

Make no mistake, this movie is a folktale, just as the title suggests.  Tarantino does not really attempt to explore in any great depth the many critical political, economic, and social developments at this critical juncture in history.   Viewers looking for signs of the environmental distress in the city brought on by the ubiquitous cars, racial tensions in the aftermath of the Watts riots, or other issues confronting the city will be sorely disappointed.  But like most folktales, Once Upon a Time…in Hollywood is filled with interesting characters, events, and messages from a bygone era.  

 

                  

 

                  

Russian History Gives the United States an Ominous Warning  

       

 

         Russia is often in the news these days – corrupt and repressive at home, aggressive and malevolent in relation to neighbors and rivals. Yet this Russia is heir to a country that shaped the twentieth century and had a formative impact on the cultural and political history of the modern world. It cannot be dismissed as a plaything of Vladimir Putin’s arrogant ambitions. Over the past hundred years, Russia has been a bellwether, not an exception. We should take heed.

 

         Russia has more than once demonstrated the ease with which complex societies can fall apart. It has shown how difficult it is to uphold the legitimacy of nations and to install and sustain democratic regimes. The country we know as the Russian Federation changed names, borders, and political systems twice in the course of the twentieth century. We remember the collapse of the Soviet Union in 1991. A megapower suddenly vanished--the ideology that sustained it deflated like a punctured balloon. The periphery defected – fourteen former Soviet republics emerged as independent nations. Nevertheless, Moscow remains the center of a multiethnic territory that continues to span the Eurasian continent. Democratic in form, authoritarian in practice, Russia is still a major player on the international stage.

 

         In 1991 the center not only held, but other structures also persisted. Positions of power within the Soviet hierarchy translated into opportunities to amass personal fortunes. Alumni, like Putin, of the Soviet political police, the former KGB, used their insider status and institutional leverage to shape a new form of authoritarian rule. A pseudo-capitalist oligarchy arose on the ashes of Soviet Communism, while the welfare of the majority of the population eroded. What we now deplore as the increasing disparity between rich and poor in the developed industrial nations was, in Russia, an instant product of Communism's collapse. Much has improved in the post-Soviet sphere: open borders, an uncensored press, freedom of speech and assembly (though increasingly imperiled), prosperity for the better off, not just the elite. But the rule of law is a mere fig leaf and the hope that free markets would result in a free society has not been fulfilled.

 

         This recent transition – by now thirty years old – was not the first time the center held, against all odds, and the promise of liberation was disappointed. Seven decades earlier, between 1917 and 1921, an entire civilization collapsed and a new one was founded. In 1913 Tsar Nicholas II celebrated the three-hundredth anniversary of the Romanov dynasty; in August 1914 he took Russia into World War I on the side of the Allied powers. In March 1917 mutinies in the imperial armed forces, bread riots by working-class women, industrial strikes in the key cities, and peasant revolts in the countryside led to the defection of the military and civilian elites. For years, a burgeoning civil society and a disaffected radical fringe had been dreaming of change – the one of the rule of law, the other of socialist revolution. When Nicholas renounced the throne, a seven-month experiment in democratic politics ensued – at the grass roots in the form of elected soviets (councils), on the scale of empire in the form of elections to a Constituent Assembly. Millions voted at every level; democracy was in the air. Yet, the Provisional Government, which honored Russia’s commitment to the Allied cause, could not cope with the same war that had proved the monarchy’s undoing. In October 1917, the Bolshevik Party, under the leadership of Vladimir Lenin, arrested the liberal ministers, took control of the soviets, and heralded the installation of the world’s first socialist government.

 

         Few thought this handful of radical firebrands would stay in the saddle. The old elites launched a fierce opposition. As committed internationalists, anticipating the outbreak of world revolution, the Bolsheviks immediately sued for peace. In March 1918 they signed a separate treaty with the Central Powers but the fighting nevertheless continued. Though relatively bloodless, the October coup unleashed a plethora of brutal civil conflicts lasting another three years. The old regime, desperate to bolster its popularity in wartime, had mobilized the population against itself, demonizing the inner enemy, only to weaken itself from within. The fissures held together by autocratic rule and the imperial bureaucracy now broke open – class against class, region against region, community against community.

 

         Armies were not the only combatants in the struggle for independence and domination that followed 1917. All sides used the energy of popular anger to strengthen their own cause. As early as December 1917 the Bolsheviks had established a political police, ancestor of the KGB, to direct the furies of class conflict at officially stigmatized social categories and political rivals. Defenders of monarchy vilified the Jews; Polish and Ukrainian nationalists took aim at each other. In the context of fluid and endemic violence, vulnerable communities took the brunt. Across the former Pale of Settlement (abolished after March 1917) tens of thousands of Jewish inhabitants were murdered; Muslims and Christians in the Caucasus settled old scores. Enemies and traitors, real and imagined, were everywhere targets of spontaneous and organized rage.

 

         The Civil War was a consequence of state collapse, but it generated the birth of nations. Poland, Finland, and the Baltic States, with outside backing and by force of arms, established their own borders. World revolution had not materialized; the movement for national self-determination, a Leninist slogan, took its place. In the end, the Bolsheviks maintained control of the heartland – moving the capital from Petrograd to Moscow in March 1918, conquering the breakaway Ukrainian provinces, defeating a range of military and ideological opponents on both the Right and the Left, and reconstituting a massive, multiethnic state on the footprint of the former empire. They created a new, “people’s army,” using the endemic violence of social breakdown to form a new type of regime, energized by continuous internal struggle. The self-proclaimed dictatorship of the proletariat promised a new and higher form of democracy and the inauguration of a new and brighter era for humankind. Instead, it resulted in a system that inflicted untold damage on its own population: forced collectivization, murderous famines, purges, and the Gulag.

        What eventually became the Soviet Union in 1924 nevertheless survived the Stalinist Terror and the onslaught of World War II, playing a decisive role in Allied victory. With Stalin's death, the system began slowly to soften, but until the last moment the basic principles of class warfare and ideological dictatorship endured. Soviet Communism is now dead; people are beginning to forget why it evoked passions on both sides – either fiery commitment or moral outrage. The Western democracies cannot boast, however, of the triumph of capitalist markets and liberal constitutions. Civil societies are generating antidemocratic populist movements, and corrupt and self-serving politicians are brazenly flouting the law. Racial and cultural antagonism and nationalist fervor, encouraged from on high, bolster the power of corrupt elites. Trump and Putin more and more mirror each other. The democratic impulse that flourished in the Revolution and was defeated in the Civil War emerged again in Russia after 1991 but has once again been foiled. We can't afford to look down our noses at Russia. Its history over the last century should give us pause. Great Powers die and democracy easily withers.

America’s Self-Cultivation Crisis

 

No sooner did candidate and self-help guru Marianne Williamson engineer her breakout moment in the Democrats’ presidential debate on July 31 in Detroit than she found herself panned for half-baked views on depression and mental health. But Williamson’s quixotic campaign has highlighted one salutary theme: America had better learn to up its game in cultivating civic empathy lest the “dark psychic force of collectivized hatred” of which she spoke tear us apart.

 

Mass shootings in El Paso and Dayton over the Aug. 3 weekend, in which hate-filled gunmen killed 31 people and wounded dozens more, brutally underscore the point. White supremacists and weaponized haters represent the antithesis of civic empathy, and by now we know good intentions alone won’t fix the curse of gun violence in America; we need consensus and action on sane gun-control measures. We also need a more robust empathy offensive to reknit our fraying commonweal.

 

What’s stopping us? The list is as long as a Donald Trump necktie, but let’s start with the president. 

 

For someone as uncouth as our trash-talking commander in chief, personal cultivation can evoke images of raised teacups, curled pinky fingers and snoots in the air; high culture is hedge-fund moneybags snapping selfies with a Hollywood celebrity at a golf tournament. In the president’s incurious, ego-bound world, self-promotion trumps self-cultivation. But nurturing respect for the urge to improve ourselves for the common good is as American as Abe Lincoln’s bootstrapping fondness for book learning or the heroes championing literacy and reading programs today. 

 

Even so, rowdy disregard for soulful striving is as old as it is nonpartisan. “It’s a revolt of the mediocre many against the excellent few,” wrote New York Times columnist Bret Stephens, speaking specifically of today’s “campus radicals” on the activist left. “And it is being undertaken for the sake of radical egalitarianism in which all are included, all are equal, all are special.” But you could make a similar argument about radical populists on the right, in the way, as Stephens says, it “emboldens offense-takers, promotes doublethink, coddles ignorance … [and] gets in the way of the muscular exchange of honest views in the service of seeking the truth.”

 

Purposeful self-cultivation is the natural antidote to that kind of obdurate yahooism. No, it’s not likely to dilute the toxic delusions of hardcore white nationalists any time soon, if ever. But just as Trump’s hate speech has created a climate in which  hate groups can flourish, it’s important in our competitive, free-agent nation that we work on a counter-climate—one that helps blunt our sharp elbows and creates space for sober reflection based on thought, study and regard for the importance of issues beyond the self. Civic-minded cultivation values individual well-rounding, dedication to craft and quiet competence. The rub, sad to say, is that increasingly larger segments of American society appear to want none of it. 

 

Author Tom Nichols argues our “Google-fueled” culture has eroded respect for personal achievement in the public interest. Skepticism of the high and mighty is a time-honored and healthy feature of American democracy. Yet today, as Nichols says in his 2017 book “The Death of Expertise,” we’re no longer just properly skeptical about our experts. Rather, “we actively resent them,” he writes, “with many people assuming that experts are wrong simply by virtue of being experts. We hiss at ‘eggheads’—a pejorative coming back into vogue—while instructing our doctors about which medications we need or while insisting to teachers that our children’s answers on a test are right even if they’re wrong. Not only is everyone as smart as everyone else, but we all think we’re the smartest people ever …. And we couldn’t be more wrong.”

 

It’s hard for Americans to cultivate fruitful conversation when we’re shouting across a mountain range of misplaced ego, let alone a cavernous income divide. For a majority of American workers real wages haven’t budged for some 40 years; the ever-widening gap between the rich and the rest now means that America’s “1 percent” averages 39 times more income than the bottom 90 percent; women on the job make 79 cents to men’s dollar, and the income split between whites and minorities has deepened.

 

Yet it’s clear the great American income squeeze has hit more than the pocketbook. How many people have time or energy to read a book, enjoy a concert, enroll in tango lessons or get involved in community building activities on a sustained basis when they’re struggling to keep heads above water? How do you fructify life in a world of shifting job prospects, burdensome college debt and eclipsed expectations?

 

“If we pull back from a narrow focus on incomes and purchasing power …  we see something much more troubling than economic stagnation,” Brink Lindsey argued in The American Interest. “Outside a well-educated and comfortable elite comprising 20-25 percent of Americans, we see unmistakable signs of … social disintegration — the progressive unraveling of the human connections that give life structure and meaning: declining attachment to work; declining participation in community life; declining rates of marriage and two-parent childrearing.”

“This is a genuine crisis,” said Lindsey, “but its roots are spiritual, not material, deprivation.”

Little wonder cognoscenti have touted a link between self-cultivation and self-preservation since time out of mind. In the ideal state, Cicero said, the individual “is endowed with reason, by which he comprehends the chain of consequences, perceives the causes of things, understands the relation of cause to effect and of effect to cause, draws analogies, and connects and associates the present and the future” so he can assess “the course of his whole life and makes the necessary preparations for his conduct.”

 

My maternal grandmother, Alice Brasfield, didn’t know from Cicero, but she saw the linkage between self-cultivation and survival clear as day. Forced to quit school at 12, she outlasted a gothic girlhood in 1890s Canada by reading voraciously and committing the dictionary to memory. When I knew her in the 1950s, Alice had cultivated a light touch on the piano, wrote thoughtful letters in an elegant hand, and relished handing all comers their rear-ends in Scrabble. She preached old school: reading until eyeballs bled, knowing some poetry, a few songs and jokes by heart, and learning to offer others something in conversation beyond self-regard. She had nothing against baseball, but bristled at my decision to give up the music lessons she paid for to dawdle, inconclusively, on the diamond.

 

Of course, it was easier working toward such high-minded goals in the booming economy of 60 years ago when earning a living wasn’t as much of an uphill fight as it can be today, and time moved at its less frantic, pre-digital pace.  

 

Like Alice, millions of working-class Americans, who had scaled the rough side of the mountain, saw self-cultivation not only as a stepping stone to a more complete life but a boon to community, as well. While looking forward to the Book-of-the-Month Club selection landing in their mailboxes, working-class folks read the news as a civic duty and treated the art of eyeball-to-eyeball conversation as a serious pastime. Even late-night TV tipped its hat to the higher culture, wedging in among the stupid pet tricks and celebrity buzz, literary lions like Lillian Hellman, James Baldwin and William Saroyan.

 

Empathy grew from the urge to experience a more expansive life. As Anton Chekhov put it in a letter to a troubled brother, the cultivated “have sympathy not for beggars and cats alone. Their heart aches for what the eye does not see.”

 

Philosopher John Dewey saw that impulse as vital in a democracy, the goal of which, according to Bill Kovach and Tom Rosenstiel in “The Elements of Journalism,” “was not to manage public affairs efficiently. It was to help people develop to their fullest potential.” As Dewey himself said, closing the loop between individual and community, “I believe that education is a regulation of the process of coming to share in the social consciousness; and that the adjustment of individual activity on the basis of this social consciousness is the only sure method of social reconstruction.”

 

Embracing culture allows individuals to see the kind of “dark psychic force” Marianne Williamson cited, as well. In his momentous novel “Invisible Man,” for example, Ralph Ellison, in advising us to remember that “the mind that has conceived a plan of living must never lose sight of the chaos against which that pattern was conceived,” is suggesting that power and entitlement, misused, are a force for social disintegration and blindness. Making the invisible visible, on the other hand, gives a society greater tensile strength.

 

Today, in a country riven by matters of race and gender, immigration and identity, and rural vs. urban rivalry, we’re at a historically delicate moment. “American confidence is in tatters …” New York Times columnist David Brooks wrote. “As a result, we’re suffering through a national identity crisis. Different groups see themselves living out different national stories and often feel they are living in different nations.” What’s needed, as Brooks suggests, is for Americans to create a new national story to help us explain to ourselves who we are and what we value to the point of action, and that’s not possible without the exercise of civic empathy.

 

As a college teacher, I’m hopeful we’ll get there. Young people I know, students and former students now in their 20s and 30s, are making headway against our material-driven culture by opting for downsized homes and more frugal lifestyles. Too often that’s out of necessity, but the shift also speaks to a focus on “genuine” rather than “plenty,” and a growing recognition that unchecked materialism not only plays havoc with the ozone layer, but punches holes in the soul in a way that only psychic income, not greenbacks, can fill.

 

Wild prediction: Marianne Williamson will not become our next president. Nonetheless, her call for spiritual renewal—you might call it a New Deal for Hearts and Minds—makes good practical sense. Sustained work at self-cultivation opens the eyes and feeds the spirit, defuses hair-trigger judgments, and generally makes for a more even-keeled society.

 

So, here’s a message for candidates who do have a shot at becoming president: Turn your telescopes around—see the state of the nation’s soul, not as new-age mumbo jumbo, but as an umbrella idea that houses important but necessarily wonky policy prescriptions for fixing immigration, healthcare, income inequality, access to education and student debt.

For most of us mortals, cultivating the self, and adding our “light to the sum of light,” as Tolstoy put it, is an elusive goal. But it’s worth aiming for. At a minimum, it’s the best revenge for living in a savage world. It’s also a prudent bet that integrating more rounded lives into our society will give us one that works better than it does now. Our survival may depend on it.

We Must Stop Valuing Guns More Than People

A memorial for the victims of the El Paso mass shooting

 

There is a darkness within the soul of America. It shows itself with increasing frequency all across the country, from garlic festivals to shopping malls, from schools to churches and synagogues. But these mass shootings are only symptoms of the disease at the core of the nation. Some of the worst diseases have the fewest symptoms, and the very worst have only one symptom: death with no warning.

 

The wave of mass shootings and the many thousands of others killed one by one are only symptoms of the American disease. It is a disease most people deny, though denial does not make it go away. The disease in the American soul is that we as a society value guns more than people. It is a choice this country has made and one it has no real desire to cure.

 

That this country values guns more than people doesn’t prevent the predictable reactions after the shootings: The cries of oh no, not again. The wringing of hands. The tears rolling down the cheek. The expected question of why doesn’t somebody do something to stop this, with the unspoken answer that nobody will do anything about it. We as a society are deeply hypocritical on this issue, publicly crying that something must be done without any real desire to change things. If we were to ever do something about this, which is improbable at best, we would have to first admit that we care more about guns and gun owners than about the murdered and maimed victims of guns.

 

America’s deep devotion to guns expresses itself in many other ways, most strongly in the modern distortion of the plain meaning of the Second Amendment. When the Constitution was in the process of ratification by the original states, one concern raised about taking this major step into a new government was whether each state could still maintain its local militia. Would, they asked, the Massachusetts or Georgia or other state militias be abolished or absorbed into a national army? The framers of the Second Amendment answered by guaranteeing “well regulated” state militias, now more commonly known as state national guards, alongside the federal army and navy. It never precluded individuals from owning firearms, but it never granted civilians an unlimited right to own and use any gun they wanted. If “well regulated” applied to organized military units, it certainly more than applied to individuals. The firearms industry and its lackey the National Rifle Association are lying when they claim the Second Amendment is anything more.

 

The cliché response to all mass shootings, including the latest in El Paso and Dayton, is to proclaim that our thoughts and prayers are with the victims. Thoughts and prayers are fine, but God can’t solve this problem. Nor is it enough to answer, as most people (including a majority of gun owners) do on polls, that they support background checks, assault weapon bans or other measures that would make at best a small dent in the problem. To think, in light of repeated shootings followed by nothing, that answering a poll will change anything is in plain terms stupid.

 

If mass shootings are ever to end, which they almost certainly never will, it will require first admitting we as a nation have valued guns over people. Then we would have to make the deepest change in our soul since the Civil War, when a majority stopped believing that black people were no more than livestock to be bought and sold. If we as a nation and as individuals could change to that degree we might change things.

 

Because most of us will not be killed by a gun, we can continue as before, pretending to care about something we have no will or intention to change. If we are not willing to change, let us at least treat the disease by being honest about it.

Art and Defacement: Basquiat at the Guggenheim

 

Consider the following facts as you wend your way to the Guggenheim Museum and its uppermost gallery, where you will presently find The Death of Michael Stewart (1983), Basquiat’s gut-punching tribute to a slain artist and the centerpiece for an exhibition that could hardly be more timely. Black people are three times more likely to be killed by police than white people. According to mappingpoliceviolence.org, in 2014 fewer than one in three black people killed by police in the U.S. were suspected of a violent crime and allegedly armed. As American pediatrician Dr. Benjamin Spock once observed, “Most middle-class whites have no idea what it feels like to be subjected to police who are routinely suspicious, rude, belligerent, and brutal.”

Such brutality is the focal point for Basquiat’s “Defacement”: The Untold Story, an exhibition that commences from a painting created by Jean-Michel Basquiat in honor of a young, black artist – Michael Stewart – who met his tragic end when he was supposedly caught by the New York City Transit Police tagging a wall in an East Village subway station during the early morning hours of September 15, 1983. What precisely transpired that night remains unsettled to this day, but what is sufficiently known is that the twenty-five-year-old Stewart was handcuffed, beaten and strangled by a chokehold with a nightstick – likely causing a massive brain hemorrhage. He fell into a coma, never regained consciousness, and died two weeks later.

Other artists, among them Andy Warhol and Keith Haring, responded to Stewart’s death with commemorative works of their own, which are featured in the exhibition. Also included is a yellow flyer created by David Wojnarowicz – portraying the officers with vicious, skeletal faces – to announce a September 26, 1983, rally in Union Square in protest of Stewart’s “near-murder”, when the young man was still languishing in a coma, “suspended between life and death”. In fact, Basquiat must have seen Wojnarowicz’s poster (which was taped “all over” downtown, as another artist recalls), and apparently it served as a direct source for the composition of Basquiat’s painting.

The Death of Michael Stewart (informally known as Defacement) was originally painted directly onto the drywall of Haring’s Cable Building studio; it was later cut out of the wall and placed within an ornate gilded frame, which Haring hung immediately above his bed. There the painting remained until Haring’s death from AIDS-related illness in February 1990. Two positively ravenous officers – with pink flesh and blue uniforms – wield their nightsticks above a solitary, black and haloed figure fixed motionlessly between them. In the upper register of the painting, above the trio of figures, is the word ¿DEFACEMENTO? – a word that during the 1980s was often used as a term for graffiti. In the context of the painting, the artist draws our attention to the reality that what was truly being defaced was not a piece of property but a life: it is the police officers, teeth bared and thirsty for blood, who are committing the act of defacement, of disfigurement. Basquiat’s art was constantly in dialogue with the history of Western painting, and in this case his work may in fact be seen as revisiting and restaging a classic theme – namely, the flagellation of Christ.

The exhibition includes several other works by Basquiat, dealing with closely related subjects that occupied him throughout his relatively short but intense and extraordinarily prolific career. Irony of a Negro Policeman (1981), La Hara (1981) and Untitled (Sheriff) (1981) all take up the themes of white power, authority and law enforcement – generally portraying the police as frightening and monstrous. La Hara is an especially mesmerizing work, the title of which – repeated four times in the upper left-hand portion – refers to a Nuyorican/Boricua slang term for a police officer, derived from O’Hara, since at one time a large contingent of New York City law enforcement was Irish. The officer in this work is downright scary: with a ghostly white complexion, bloodshot eyes and crooked, menacing teeth, set within a jaw that is open wide enough for the figure to be talking to us – all of which serves to convey a kind of seething rage, ready to explode in violence at the slightest provocation. As with many of his figures, Basquiat has painted this officer with his rib cage exposed, and in certain areas we can see right through him to the fire-engine red background. In other words, what we have is a skeletal figure, whose bleached white bones invoke a kind of living dead: not simply a monster but an abomination.

Charles the First (1982) and CPRKR (1982), both references to jazz saxophonist Charlie Parker, are among the paintings in which Basquiat champions and glorifies the father of bebop – granting him, in fact, the stature of a king. These two works, different as they are from Defacement, nevertheless share with it certain themes. At a basic level, all three works are concerned with death, and precisely the death of the young, black, male artist. CPRKR is a kind of grave marker for Parker, who was dead at thirty-five: a minimalist work consisting almost entirely of the initials in the title, references to the place (“THE STANHOPE HOTEL”) and year of Charlie Parker’s death (“NINETEEN FIFTYFIVE”), and a cross. At the bottom of the work, Basquiat has printed the name “CHARLES THE FIRST”. Charles the First abounds with references to the life and work of the great musician, but two features are particularly notable in the present context. At the painting’s top left corner is the word “HALOES” – indicating that in Basquiat’s scheme of things Parker is also a kind of saint, one of a number of characteristics he shares with the Stewart of Defacement. At the bottom of the painting, Basquiat issues the warning “MOST YOUNG KINGS GET THEIR HEADS CUT OFF” – which at the very least reminds us that, for Basquiat, a premature death is the price that the black artist pays for genius. Basquiat himself died in 1988 at the age of twenty-seven from a heroin overdose.

The Guggenheim’s glance back to 1983 and the death of Michael Stewart accomplishes what art exhibitions should, but all too rarely do – it grants us perspective on our present moment, a way of confronting the reality that we are currently living and navigating. We all know the names of unarmed black men who recently had their lives cut short – Trayvon Martin (killed in 2012), Eric Garner (killed in 2014, by an illegal chokehold like the one that killed Stewart), twelve-year-old Tamir Rice (shot dead in 2014 by white police officer Timothy Loehmann), eighteen-year-old Michael Brown (also shot dead in 2014, by white Ferguson police officer Darren Wilson), Philando Castile (killed in 2016) – and the list goes on. The show does not allow us to forget that this violence has a long, painful history in America. Basquiat’s “Defacement”: The Untold Story does what exhibitions should do – it tells us a story we don’t want to hear but need to hear.

Donald Trump is no James Monroe

 

 

When the vice-presidential candidates squared off against each other in the 1988 debate, Lloyd Bentsen delivered one of the sharpest political blows ever landed on an opponent. Republican Dan Quayle was proudly touting his years of experience and equating them with John F. Kennedy’s 14 years in Congress before his 1960 presidential campaign. That’s when Bentsen pounced on the unsuspecting Indiana senator in a memorable and flawlessly delivered take-down: “I served with Jack Kennedy. I knew Jack Kennedy. Jack Kennedy was a friend of mine. Senator, you’re no Jack Kennedy.” It was a defining moment, and Bentsen’s language has since been used as a formula for other political insults.

 

Whether he knows it or not, Trump casts himself as a latter-day James Monroe, claiming to have done what Monroe actually accomplished. Trump insists he is America’s favorite president, that he has made America great again, and that he is immune from impeachment because “you can’t impeach somebody that’s doing a great job.” But how does President Trump really stack up, especially when measured through the lens of history? How does Trump compare to the last of our founding father presidents as he sought reelection in 1820? Are Trump’s claims to greatness justified, or is he embellishing his record, just as Dan Quayle boastfully compared himself to John F. Kennedy?

 

There are four key areas where Trump falls short of the reputation and statesmanship of his predecessor, James Monroe, from 200 years ago.

 

Elections

 

Trump will never be as popular as James Monroe was in the early 1800s. When Monroe was first elected in 1816, his electoral victory was impressive: he racked up a rousing 84% of the electoral votes, the 15th-highest percentage ever won. By contrast, in 2016, Trump won just 56% of the electoral votes. Despite the president’s claims that he won the Electoral College in a landslide, his Electoral College victory ranks very low – 46th out of 58 presidential elections.

 

In 1820, Monroe received all the electoral votes except one, coming the closest of any president to tying the unanimous Electoral College victories of George Washington.

 

Without ever cracking a history book, Trump has created his desired narrative that his 2016 election was a very substantial victory despite evidence to the contrary. He also predicts he will win an even more astounding victory in 2020. But in our divided nation, it is a pipe dream to suggest that Trump will come anywhere close in 2020 to Monroe’s stunning second term victory. In fact, there are scenarios in which Trump may lose in a monumental landslide.

 

When it comes to electoral victories, Donald Trump is certainly no James Monroe and never will be. He simply doesn’t measure up to Monroe’s popularity.

 

Leadership

 

Monroe’s impressive electoral victories reflected the hope and sense of national unity and optimism following the War of 1812. In his inaugural address on March 4, 1817, Monroe pledged that “harmony among Americans…will be the object of my constant and zealous attentions.” It was the beginning of the Era of Good Feelings, a catch phrase that came to be associated with Monroe’s presidency.

 

Monroe was a pragmatic president. He tried to govern in a non-partisan manner, noting that “the existence of parties is not necessary to free government.” Monroe recognized his obligations to all Americans and not just those of his Democratic-Republican Party. Biographer Harry Ammon observed that Monroe “viewed the party as embracing all elements of American society and therefore he accepted the fact that it must also adopt measures meeting the needs of the widest possible spectrum of American opinion.”

 

While the Era of Good Feelings characterized Monroe’s tenure as president, Trump’s presidency might best be described in the dark “American carnage” language of his inaugural address. Rather than unite the nation and promote harmony and non-partisan governing, Trump has consistently stirred up and created divisions among Americans and governed from a blatantly political posture that panders only to a dedicated base of supporters. He quickly abandoned his election day victory pledge to “be president for all of Americans” and to “seek common ground, not hostility; partnership, not conflict.” Trump’s leadership from the gutter is a far cry from the stable and enlightened leadership demonstrated by James Monroe.

 

Monroe was certainly the beneficiary of the generally optimistic mood of the nation when he assumed the presidency. He was the right man for the moment, with “a good heart and an amiable disposition,” in the words of one congressman. It’s not a phrase likely to be used to describe Trump, who has also been the beneficiary – and notably creator – of the nation’s mood. He has stoked and fanned the flames of fear, anger, and bigotry in a deeply divided nation.

 

Maturity

 

In 1793, Monroe, then a senator from Virginia, wrote a letter to his friend, Secretary of State Thomas Jefferson, about the futility of responding to foreign and personal insults. “The insults of Spain, Britain, or any other of the combined powers,” he wrote, “I deem no more worthy of our notice as a nation than those of a lunatic to a man in health, - for I consider them as desperate and raving mad.”

 

By personality, Monroe was not a saber-rattling pugilist. He was able to ignore such “desperate and raving mad” comments. In contrast, Trump is, by his own admission, a “counterpuncher.” He allows no insult to go unanswered, and his behavior is quite predictable. While he may have praised an individual in the past, once that person’s real feelings about him become public, he attacks.

 

For example, on April 11, 2018, Trump tweeted his warm regard for House Speaker Paul Ryan, praising him as “a truly good man, and while he will not be seeking re-election, he will leave a legacy of achievement that nobody can question. We are with you Paul!”

 

The feeling was apparently not mutual from Ryan’s perspective. Ryan’s real opinion about Trump leaked out recently from Tim Alberta’s new book “American Carnage.” According to Alberta, Ryan thought Trump was inept:  “I told myself I gotta have a relationship with this guy to help him get his mind right. Because I’m telling you, he didn’t know anything about government…I wanted to scold him all the time.” At another point, Ryan said he saw retirement as his “escape hatch” from having to work with Trump for two more years.

 

With remarkable predictability, Trump tweeted an angry tirade on the same day that Ryan’s comments became public. “Paul Ryan, the failed V.P. candidate & former Speaker of the House, whose record of achievement was atrocious (except during my first two years as President),” the petulant president typed, “ultimately became a long running lame duck failure, leaving his Party in the lurch both as a fundraiser & leader…” Trump continued his Twitter rant against Ryan in two additional derogatory tweets.

 

When it comes to seasoned maturity, Monroe was secure enough in his own skin to recognize the futility of punching back with insults of his own. Trump has never learned that skill; he remains stuck in juvenile behavior that needlessly escalates the vitriolic heat in the public square. While Trump envisions himself as the perfect leader without any flaws, his record speaks otherwise and fails to demonstrate Monroe’s calm maturity.

 

Military

 

James Monroe was the second and last president to have served in the Revolutionary War. He volunteered to fight for independence. Serving under General George Washington during the Battle of Trenton – a surprise attack by American forces on Hessian mercenaries launched the morning after Christmas – 18-year-old Monroe was one of just a half-dozen American soldiers wounded in the fighting. The bullet that pierced his shoulder remained there the rest of his life.

 

In contrast, Donald Trump sought and received a deferment from the draft for military service in Vietnam due to bone spurs in his heels. Serious questions have been raised about whether the doctor issuing the report was simply doing a favor for Trump’s father, or if Trump really did have a medical condition that would disqualify him from military service.

 

James Monroe honorably served the embryonic nation with heroism in the military, while Donald Trump found a way to avoid military service, and doesn’t come close to matching Monroe’s sacrificial service.

 

Conclusion

 

The character and characteristics of the 45th president are a far cry from the honorable service and integrity of our 5th president. Trump fantasizes about big election victories; Monroe actually had them two centuries ago. Trump wants to be viewed as a strong president, but his vision of strength is to divide America while Monroe put his leadership skills to work in uniting the nation with the Era of Good Feelings. Trump has a microscopically thin skin while Monroe exhibited a tough and seasoned maturity by ignoring insults. Trump loves the armed forces enough to throw himself a July 4th extravaganza, but not enough to have served in the military. James Monroe put his life on the line in helping purchase American independence as a soldier. And so to paraphrase the words of Lloyd Bentsen, “I knew James Monroe. James Monroe was a friend of mine. Mr. President, you’re no James Monroe.”

The Greatest Generals Across Generations

 

 

Reflecting on the 1989 invasion of Panama and subsequent hunt for strongman Manuel Noriega, Colin Powell lamented that, "A President has to rally the country behind his policies. And when that policy is war, it is tough to arouse public opinion against political abstractions. A flesh-and-blood villain serves better." This American tendency to personalize conflicts and world events is reminiscent of the “Great Man” theory of history, which posits that world events can largely be explained by the impact – positive or negative – of individual leaders.

A similar phenomenon occurs in military history, as the analysis of campaigns and wars is often reduced to an assessment of the opposing commanders’ performance. Although the quality of an army’s generals is a critical factor determining victory or defeat in battle, chance often plays as much of a role as skill or institutional variables in determining who commands at a given time or place. Nowhere is this more evident than with the U.S. Army during World War II, which produced the greatest generation of operational commanders in U.S. history. It is easy to view photographs of George Marshall beside his mentor General Pershing, or of George Patton standing next to a tank in 1918, and – knowing what they achieved in building and leading the Army in WWII – to perceive their rise as inevitable. Yet as Edward Gibbon observed, “The fortune of nations has often depended on accidents.” Indeed, a string of accidents and coincidences was vital to shaping the roster of U.S. commanders in World War II – and hence the Allied victory – in three underappreciated ways.

First, there was the simple timing of the war. As the legendary British general Sir Edmund Allenby once told Patton, for every Napoleon and Alexander that made history “there were several born. Only the lucky ones made it to the summit.” In other words, which commanders achieve greatness is partly determined by fate, specifically whether they are actually in command when a great conflict starts. For example, if WWII had broken out in 1934 rather than 1939, Marshall would still have been in the exile pettily imposed upon him by then-chief of staff Douglas MacArthur, serving with the Illinois National Guard rather than in a position that allowed him to become what Winston Churchill called “the Organizer of Victory.” Conversely, if the war had broken out in 1944, Marshall would have already retired after serving four years as chief of staff from 1939-1943. Thus, the timing of Hitler’s invasion of Poland and France played a significant role in determining who led U.S. forces in WWII.

        

Second, in Strange Victory Ernest May suggests that the German invasion of France in 1940 was likely saved when a falling boar’s head in the Belgian hotel serving as Heinz Guderian’s command post narrowly missed the brilliant Panzer commander’s head. Similarly, a series of near-misses preserved the leaders who would command U.S. forces in that conflict. On the morning of November 11, 1918, hours before the Armistice, an errant bomb was dropped on the other side of a stone wall from where Marshall was eating breakfast in the 1st Army’s mess. Marshall escaped with just a nasty bump on his head, but as one historian observes, “Had the walls of the old house been less sturdy, a different chief of staff would have led the American armies against the Germans in the next war.” In 1920 Dwight Eisenhower and Patton came within “five or six inches” of being decapitated by a snapped steel cable while experimenting with tanks at Camp Meade, and in 1924 Omar Bradley was earning extra money working construction on the Bear Mountain Bridge when a cable snapped and cut the watch off his wrist. Eisenhower again narrowly escaped death in the 1930s when his plane nearly crashed upon takeoff in the Philippines. Although the pilot announced, “We ain’t going to make it,” the plane cleared the hill at the runway’s end by a few inches. Conversely, his friend James Ord – whom Ike called “the most brilliant officer” of his time – was killed in a plane crash in the Philippines in 1938. Other officers whom fate cruelly denied the opportunity to earn glory in WWII included Adna Chaffee, Jr., the father of the Armored Corps, who died on active duty of cancer in 1941; and Bradford Chynoweth, an innovative contemporary of Eisenhower and Patton's who took command of a Philippine division in November 1941 and hence was doomed to spend the war in a Japanese prison camp after the surrender on Bataan.

        

Finally, some tragedies inadvertently proved fortuitous to the American war effort. If Marshall’s wife Lily hadn’t suddenly died in 1927, he would have remained an instructor at the Army War College. Instead, unable to bear the constant reminders of her at Washington Barracks, his friends on the General Staff arranged for him to become assistant commandant of the Infantry School at Ft. Benning, where two hundred of the instructors and students who served under him from 1927-1932 – including Bradley, Matthew Ridgway, and “Lightning Joe” Collins, amongst others – rose to the rank of general during WWII. Similarly, Eisenhower was called to the War Department after Pearl Harbor because the head of the War Plans Division’s Asia department was killed in a plane crash on December 10, 1941. This accident forced Marshall to find a replacement, thereby setting in motion the partnership that was crucial to winning the war.

        

None of this is to say that individual commanders don’t matter, or that institutional factors such as professional military education or training exercises are unimportant in shaping those men eventually placed in command of great armies. Rather, it is important to recognize that whereas in retrospect history often appears to have unfolded in a straight line, reality is almost always more chaotic and uncertain. In the end, the Greatest Generation’s generals’ triumphs were anything but predetermined, and required a series of accidents and twists of fate to bring them to the point where their innate courage, intelligence, and determination could be decisive.

Tiananmen Square-1989: Beijing’s Amnesia and Memory Hole  

 

 

 

Recently we commemorated the 30th anniversary of the massacre of Chinese students in Tiananmen Square in Beijing. These students and other participants were peacefully protesting corruption in their government and calling for democratic political reforms in China. On June 4, 1989, their protest was brutally crushed by the People’s Liberation Army acting under the direct orders of Deng Xiaoping. Hundreds, maybe thousands, of students and their supporters were either killed or seriously wounded. Many more were arrested and imprisoned for trying to promote democracy in China.

 

Since that event, the Chinese government has launched a full campaign, largely successful, to expunge this episode from Chinese history and from the minds and memories of the Chinese people. It would also like the rest of the world to forget and disregard what happened on that day. Unfortunately for Beijing, this brutal suppression of political and human rights was broadcast globally on television. To this very day, the Chinese government has never apologized to the victims of this brutal political crackdown or to their families. Instead, the leaders in Beijing have tried to spread two separate narratives to two separate audiences. To the Chinese people, they say this event never happened. They want the rest of the world to believe it was the fault of the students, who only got what they deserved. But the Chinese Communist Party (CCP) cannot have it both ways.

 

On the one hand, the leaders in Beijing would have the world believe that such a dastardly event never happened. Or, if it did, and this is quite a concession, it was a minor affair. If it was such an inconsequential event, then why employ the full power of the People’s Liberation Army, which included both armed soldiers, who fired on the students, and tanks? Then, in an absolutely mind-boggling explanation, the CCP has turned itself into the victim and the unarmed students into the perpetrators.

 

As the journalist Louisa Lim argues in her well-researched book The People’s Republic of Amnesia: Tiananmen Revisited, Tiananmen seems to have gone down the Chinese version of George Orwell’s Memory Hole. In a recent article in the New York Times, she wrote that the CCP has waged a kind of war against any mention of Tiananmen. The episode is erased from official histories. Web sites that document it are blocked.

 

Any mention of this event calls forth Chinese defiance. Recently, Chinese Defense Minister Wei Fenghe told an international conference in Singapore that the student gathering was a kind of riot, “political turmoil that the central government needed to quell.”

 

Within China, the government has tried its best to completely inhibit or censor internal discussion of Tiananmen altogether. Late last year, the government began a crackdown on Twitter users who posted criticism of Communist rule. A former leader of the 1989 protest was barred from traveling to Hong Kong to commemorate the anniversary.

 

There are reports by students in China that their virtual-private-network services have recently been suspended. VPNs, as they are called, are used by citizens in authoritarian countries to bypass state censorship of the Internet. Some believe that this is directly connected to the Tiananmen “anniversary.” But the official story from Beijing is that it is retaliation for new American tariffs. Whatever the case, there is little doubt that every year the commemoration of Tiananmen is greeted with anxiety, even trepidation, in Beijing. It is simply amazing that this self-described “incident” in Chinese history could still stoke such fear among the Chinese leadership.

 

One wonders why, if for nothing else than its own peace of mind, the CCP does not just admit its culpability for what happened, apologize to the Chinese people, put this event in its rear-view mirror, turn the historical page, and move on. The unwillingness of the CCP to take this seemingly rational step might well speak to its political insecurity. Might such honesty not play well with, and gain the respect of, the Chinese people?

 

But sadly, this is not the case. The actions of the CCP are typical of dictatorships. The Soviet Union, from whom China borrowed its Leninist political system, would erase the entries of unpopular officials from official encyclopedias. In recent years, Mao’s last wife and the leader of the notorious “Gang of Four”, Jiang Qing, was cropped out of the picture taken at the funeral of Mao Zedong. Simply put, she was historically erased from this event. Maybe the fact that she ended up in prison where she committed suicide had something to do with her trip down Beijing’s “Memory Hole”.

 

What makes the threat from China quite dangerous, however, is that the Chinese are accumulating the power to also mold the collective memory of people around the world. They are not there yet.

 

But China’s intent to own pieces of the world’s digital infrastructure, as well as social-media platforms, threatens to make the free world’s internet as limiting as the one China imposes on its own citizens.

 

It isn’t just China’s push for Huawei to help build 5G wireless networks around the world. China’s ByteDance owns TikTok, a social-media platform popular among young web users in the West.

 

A Chinese gaming company now owns the popular gay dating app Grindr, and China’s WeChat is rapidly expanding its market share in Europe and Asia for its easy-to-use program to pay for goods and services with a phone.

 

If the CCP had learned the lessons of Tiananmen and met the demands of its people for more personal freedoms, these developments would not be so worrisome. In open societies there is a division between private business and the government. But this is often not the case in China.

 

In 2017, the state enacted a new national intelligence law that compels private Chinese businesses to cooperate with the Chinese government. US companies, as a price of doing business in China, have sometimes also cooperated but at least they have a choice.

 

This means that the data gathered by Huawei, WeChat and TikTok can be collected, stored and searched by Chinese security agencies.

 

Minimally, this arrangement would give China the means to bully Western companies into complying with the type of web censorship it imposes domestically on its own citizens. It has already done this with American technology companies when it comes to their Chinese products. A more sobering scenario would see Chinese government officials mining the personal data of non-Chinese citizens.

 

Fortunately, the United States has begun to meet this threat. It has launched a global campaign to convince allies to block Huawei from participating in the building of national 5G networks. Last month, President Trump signed an executive order prohibiting the purchase or use of communication technologies owned or controlled by a foreign adversary.

 

One can only hope that this is not too late to save the internet from a Chinese takeover. If it is, then the rest of the world could soon be subjected to the war on history China now wages on its own people.

Can Muslims get a fair shake in India?

India is back to square one, thanks to Prime Minister Narendra Modi's move Monday to scrap special political rights for a Muslim-majority state in the Hindu-dominated country. The Muslim-rights issue, which led to the creation of Pakistan as a Muslim homeland in 1947, has now resurfaced: Can the Muslims get a fair shake in India?

 

By scrapping Kashmir's special autonomy status, Modi has taken a dangerous step toward implementing the vision of his ultra-nationalist party's spiritual guru, the late V.D. Savarkar, who proposed more than 90 years ago to keep minorities under control in an India ruled by the Hindu majority.

 

Sitting in a prison cell on the Andaman Islands in the Bay of Bengal, the convicted-armed-revolutionary-turned-Indian-nationalist drew up his solution to the vexing question of India's minorities. His idea: The Muslims and Christians can stay in India, but they will be subservient to the Hindus; they will be granted no rights that may infringe upon Hindu rights; and since they are minorities, they must obey the majority.

 

This was not his initial plan, however. He first wanted to convert the Muslims and the Christians back to Hinduism. But he faced a big obstacle. Savarkar could convert the Muslims or the Christians, but he could not arbitrarily decide their caste. A Hindu must belong to a hierarchical caste, and it is acquired through birth only. The Hindu religion does not permit assigning a caste.

 

To get around this barrier, he revised his idea. He decided he was a Hindu, not an Indian. His motherland was Hindustan, which encompassed the land from the Himalayas to the Indus River. Hindustan boasted a rich, 5,000-year-old culture, which influenced a vast number of people from Greece to Japan. India, by contrast, was a concept championed by the nationalists who wanted an independent united country for all of its inhabitants, regardless of their religion.

 

Muslims and Christians Unwelcome

In Savarkar's Hindudom, the Muslims and the Christians were less than welcome. He disliked them because of their allegiance to Mecca and Rome; they worshiped foreign gods and had no cultural affinity toward Hindustan. Even though Buddhists and Sikhs were no longer pure, they were still acceptable because their religions originated in Hindustan.

 

Savarkar, an atheist who labeled his vision as non-religious and cultural, was unwilling to give the Muslims a separate homeland next to Hindustan. He feared that even though they were only 25 percent of the total population, they could still someday reconquer Hindustan if they were allowed to have their own country. The Muslims were a small band, too, when they conquered India in 712 AD and eventually built a vast empire.

 

He figured that the next time around they would be in a much stronger position to repeat their past success, because they would receive support from other Muslim nations. To nip that possibility in the bud, he supported the creation of Israel. He saw the Jewish state as a barricade against the Muslim Arab world.

 

He feared a Muslim resurgence so much that he wanted British rule in India to continue. He sought only dominion status for Hindustan. Only Britain, he believed, was powerful enough to keep the Muslims at bay if they ever attempted to invade Hindustan again.

 

But to his chagrin the nationalist tide swept India, as independence stalwarts like M.K. Gandhi, Jawaharlal Nehru and Moulana Abul Kalam Azad pressed the colonial power to leave. Savarkar's idea took the back seat, but remained very much alive, even though malnourished.

 

After the assassination of Prime Minister Indira Gandhi in 1984, the Indian National Congress party, the champion of secular India, fell on hard times; it had no comparable charismatic leader to carry forward the torch. Savarkar's followers gradually gained ground and picked Modi, who was once condemned globally as the mastermind behind the massacre of Muslims in his home state of Gujarat, as the reincarnation of their guru.

 

Modi shows anti-Muslim bias

With a huge re-election victory two months ago, Modi embarked upon implementing Savarkar's dream to appease his hardcore anti-Muslim forces. First, he nullified a Muslim marriage law that had existed for centuries. India's constitution, however, protects the religious laws of other minority groups, and Modi did not touch them, showing his bias against Islam. Even the Mughals and the British did not touch India's religious laws.

 

On Monday, keeping Muslim leaders under house arrest and deploying tens of thousands of soldiers in Kashmir, the prime minister moved in a blitzkrieg exercise, in a matter of hours, to take away the special rights — its own flag, its own laws and property rights — granted to the state by India's constitution.

 

Imran Khan, prime minister of nuclear-armed Pakistan, arch-rival of nuclear-armed India, has threatened war. Pakistan considers Kashmir a disputed territory. China, which occupies parts of the state, denounced India's action as “unacceptable,” but is unlikely to take any military action. Pakistan can do very little on its own, unless it wants to risk a nuclear confrontation. Washington seems less than thrilled to stick out its neck. Nonetheless, the danger level remains high, and the fallout will be felt in India and plague its neighbors.

Counterculture 1969: a Gateway to the Darkest and the Brightest

In late December 1966, Time released its first publication for the coming new year, an issue that has annually featured the magazine’s “Man of the Year” since 1927. The choice of Man of the Year goes, as Time reports, “to the person or idea or thing that, for better or worse, has done the most to influence the events of the year.”

 

For most of the years that the Man of the Year had been awarded, the recipients had been individuals – those towering figures of the 20th century from Lindbergh to Hitler to Churchill to Roosevelt. But for 1966, the honor went, for the first time, to an entire generation: the Time Man of the Year for 1966 was Americans under 25 years old – the young people in the United States who, the magazine reported, “had already shown they would remake the world.”

 

“In the closing third of the 20th century,” Time wrote, “that generation looms larger than all the exponential promises of science or technology: it will soon be the majority in charge.” “Never have the young been so assertive or so articulate, so well educated or so worldly,” Time reported. “Predictably, they are a highly independent breed, and – to adult eyes – their independence has made them highly unpredictable. This is not just a new generation, but a new kind of generation.”

 

A new kind of generation to be sure. The real impact of young people in 1960s America was beginning to be heard, with groundbreaking influence from the counterculture that was developing across the country.

 

How counterculture arrives

 

Recognizing that a counterculture exists is simple; appreciating its starting points can be more challenging. 

 

How does a counterculture emerge? Books, college courses, and academic careers are built on exploring that question. But the most basic answer is that every culture produces countercultures – to challenge, to question, and to defy mainstream society. The 1960s would produce its own unique brand of counterculture -- a product of events, timing, and the will of a new generation of Americans to confront a host of social ills ranging from civil rights to saving the environment -- with effects that endure today.

 

“Some of counterculture comes from the deep psychological needs of people to rebel, to create an identity,” said MIT sociologist Gary Marx.

 

“An important factor had to do with the end of World War II – the triumphs, and the economic expansion,” Marx said. “Depression-era people had to struggle and were focused on obtaining some kind of economic security. Suddenly the next generation saw more affluence, and affluence meant that there was no need to struggle.

 

“Once you had that security,” said Marx, “you had the leisure to engage in alternative kinds of ideas.”

 

And in the 60s there were more young people than ever to consider these ideas – a direct result of the Baby Boom of 1945 to 1964. By the middle of the 1960s, there were nearly as many Americans under 25 as over 25 (by the end of 1969, 100 million Americans in the Baby Boom Generation would be 25 or under).

 

The moment was right for protest, change, and rebellion.

 

 

 

Baby boomers: the key to the 1960s

 

“The story of 1960s counterculture is the story of those baby boomers,” said Ken Paulson, dean of the College of Media and Entertainment at Middle Tennessee State and former editor of USA Today. “With baby boomers, you have a generation that had far more free time and far more disposable income. They had discovered that by sheer numbers they could drive demand for the things they cared about.”

 

The counterculture of the 60s represented not only activism about national issues, but it also inspired rejection of social norms.

 

“There were at least two very distinct countercultures, especially here in California,” said Stanford cultural historian Fred Turner.  “One centered in Berkeley – the new left, doing politics to change politics; and the other centered in San Francisco – wanting to avoid politics, celebrating consciousness, psychedelia, and transformation of consciousness.”   

 

Vietnam: the center of dissent

 

Overarching all other counterculture grievances -- and the focal point for many of them -- was the war in Vietnam, which, over the course of four presidential administrations, had become a seemingly unsolvable political and strategic catastrophe.

 

For growing numbers of Americans in the late 1960s, Vietnam was a conflict with none of the clarity and noble mission of World War II, no end in sight, and progress measured only in terms of the growing death toll. By 1969, almost 50,000 Americans had died in Vietnam, and troop strength was at its highest: some 520,000 men and women. Everyone in America knew someone fighting in the war, or who had died in the war, or who might be drafted to fight in a country they did not know and for a cause they did not understand.

 

Colleges as the catalysts

 

In the 1960s, college campuses simultaneously became a fixture of middle-class culture and focal points for dissent and protest.

 

In stark contrast to previous generations, in which education after high school was a rare privilege, college attendance in post-war America exploded; in 1940, about 500,000 Americans were in college; 20 years later, college enrollment had increased to 3.6 million.  College granted more time for reflection and discussion, chances to question the status quo, and opportunities to explore the unconventional to make a better world. 

 

In 1960s America, counterculture became a mix of social activism, environmental awareness, and a platform of expanding demands for civil rights, social equality, and the rights of women.

 

The Free Speech Movement – which had developed at UC Berkeley in 1964 as a response to, of all things, the university’s policies that restricted political activities on campus – took root as the first significant civil disobedience on college grounds; most other major universities would soon follow with their own protests, especially in support of civil rights and opposition to the war.

 

And then there was the most tangible denunciation of all: complete rejection of society, and dropping out entirely. The hippie living in a commune – although only a minor percentage of counterculture lifestyle choices – became the high-profile symbol of counterculture values.

 

Exploring the divisions 

 

CBS News tried to make some sense out of the impact of counterculture with a project called “Generations Apart,” an ambitious exploration of the generation gap as seen through national surveys and interviews with both young and older Americans. Hosted by reporter John Laurence, the results of the project aired in three broadcasts in late May and June 1969. The programs became a vivid reminder of how countercultural values and generational differences had changed America. 

 

The episode of “Generations Apart” titled “A Profile in Dissent” was particularly hard-hitting, and described how young people (17-23) and parent-age Americans viewed social change and political controversy.

 

“It is a time of dissent for many of America’s young,” said Laurence. “The collision of events that they could not control has caused a challenge to values they cannot accept.”

 

“The majority are quiet as they always are, ready to conform as they always are,” admitted Laurence. “But a growing minority is shaking up society and raising their voice. There is a swelling tide of dissent among the young in America today. It is surging up against some of the most basic institutions of adult society.”

 

The survey showed a widening generation gap on a range of fundamental issues involving sex, religion, drugs, and money. The program’s most startling finding was the broad rejection of the most basic ideals of middle-class values: six out of 10 young people said they want “something different in life” from what their parents wanted. Among college students, only one-third said patriotism is important, while two-thirds of all young people said civil disobedience is sometimes justified.  

 

“Will they find the definitions for the ‘something different’ that they are searching for?” Laurence said of a new generation of Americans. “Only the young can tell us. And maybe not even the young can say for sure how they are going to shape this society, until they are older.”

 

The peak of counterculture

 

By some measures, counterculture in the 1960s would reach a pinnacle in August 1969, when two events in particular – incredibly, only a week apart – would symbolize both the worst and the best of the era: the first -- two nights of murder by the followers of Charles Manson -- would demonstrate the tragic vulnerability of some who sought alternatives to mainstream 1960s America.  And the second -- the gathering of 400,000 in rural New York for “three days of peace and music” that forever after would be known simply as Woodstock -- would showcase the new generation at its best.

 

The counterculture of the 1960s would continue to evolve into the early 1970s. By the mid-70s, much of the energy of the 60s had changed, in part because some of the major goals of the 60s, such as expanding national social programs, protecting the environment, and advancing civil rights, had been at least partially achieved – or, perhaps more important, had moved into the ongoing mainstream discussion of America’s political and social concerns. The counterculture of the 1960s in its endlessly evolving forms continues today, now as a broad influential force in a spectrum of social movements and cultural expression.

1919, the Year of Racial Violence: An interview with David Krugler

History is a record of the incessant struggle

of humanity against ignorance and oppression.

Helen Keller, 1918

In the wake of the Great War, Americans were hopeful for a new year of domestic tranquility and prosperity. Black troops came home from the battlefields of France to claim the same democracy for which they had fought. They were badly disappointed. Atrocities by whites against African Americans intensified. Lynching of black citizens continued with impunity and mob violence targeting black citizens exploded in cities across the country. African Americans suffered tremendous losses but also defended themselves against the onslaught of horrific violence launched to protect and enforce white supremacy.

But 1919 became the bloodiest year of racial violence in American history. In his new book 1919, The Year of Racial Violence: How African Americans Fought Back (Cambridge), history professor David F. Krugler vividly details the extent of the violence from white mob attacks on black citizens in cities from Charleston and Washington, D.C., to Chicago and even Bisbee, Arizona, as well as dozens of lynchings, from the murders of individual blacks to the greatest mass lynching in our history at Elaine, Arkansas, resulting in as many as 230 deaths, a white majority response to efforts by African American sharecroppers to organize a union.

Professor Krugler goes beyond previous histories of this tumultuous period by putting African Americans at the center of the story. He stresses the character of the violence as antiblack collective violence by whites—rather than “race riots.” He recounts the efforts of African Americans to resist the violence through heroic self-defense in the streets, media campaigns to correct inaccuracies in the mainstream white press, and work by the NAACP and others to achieve justice and equal protection through the legal system.

The book also shows how black resistance to white mob violence laid the foundation for the Civil Rights Movement 40 years later that would lead to many of the policies for which the African Americans of 1919 struggled. It’s also the first work to document government efforts to disarm African Americans and to obstruct their legal right to obtain weapons and defend themselves at that time.

Critics have praised 1919, The Year of Racial Violence for its new perspective on history, its extensive and original research, and its lively prose. Adriane Lentz-Smith of Duke University wrote: “This powerful book captures the high cost and high stakes of the War for Democracy brought home. By turns devastating and inspiring, it sets the new standard for exploring African Americans' struggle for safety, truth, and justice in the aftermath of World War I." And Chad Williams of Brandeis University commented: "With meticulous research and narrative force, David Krugler has produced a brilliant account of one of the most turbulent and bloody years in American history. As he powerfully demonstrates, African Americans, in the face of horrific nationwide racial violence, used every tool at their disposal to fight back and preserve both their citizenship and humanity. 1919, The Year of Racial Violence is a landmark achievement." 

David F. Krugler, a professor of history at University of Wisconsin—Platteville, specializes in modern US history. His other books include The Voice of America and the Domestic Propaganda Battles, 1945-1953, and This Is Only a Test: How Washington, D.C., Prepared for Nuclear War.  

Professor Krugler generously talked about his new book on racial violence by telephone from his office in Wisconsin.

Robin Lindley: What inspired your sweeping study on the racial violence in America in 1919?

Professor David Krugler: I originally set out to do a project on all of the post-World War I upheaval in the United States. Because most of my research was in the post-World War II period, I was looking for a new era within modern U.S. history to study.

I began my research with an episodic, sweeping view of 1919 in mind. Then in 2007, when I was in the early stages of research, the journalist and historian Anne Hagedorn published a book called Savage Peace: Hope and Fear in America in 1919. That proved to be great timing on her part because it got me thinking about what I could do that was new. So I decided to narrow my focus, and I’m happy I made that decision to focus on racial conflict and black resistance.

Robin Lindley: I think many readers will be surprised by the extent of racial violence in 1919. You detail many of the conflicts and atrocities, but you also offer a fresh perspective on the violence.

Professor David Krugler: In the original draft, I felt I was dwelling too much on white-on-black violence. I got some good feedback when I was writing. The reader suggested that I foreground black resistance rather than the violence visited upon African Americans.

The extent and the frequency of the violence and the seemingly minor causes of violence are shocking to the modern reader. I wanted to lay the groundwork so that readers understand the ideology of the time and how entrenched white supremacy was.

Today we see white supremacy as extremism and hate groups on the fringe, so it was important to me to lay out how structural white supremacy was. For millions of white Americans there was no discrepancy between being enthusiastic supporters of the war to make the world “safe for democracy,” as President Woodrow Wilson called the Great War, and denying African Americans equal opportunity and constitutional rights because of the scientific racism of the time and because the ideology and practice of white supremacy was so entrenched.

I wanted people to understand why violence and mob violence were used so frequently. Then the narrative turned toward what African Americans did in response so that the book didn’t become an almost numbing narrative of one violent episode after another.

Robin Lindley: You’re very careful about language and it was instructive that you distinguish the casual term of “race riot” that was used to describe the violent outbursts and substituted the more precise term of “antiblack collective violence” in the recognition that the violence—devastating acts of assault, murder and arson—was initiated by whites against black citizens.

Professor David Krugler: The main problem with the term “race riot” is that it suggests that all the participants have an equal responsibility for causing violence and breaking the law. That just wasn’t the case in 1919 with almost every violent incident involving whites organizing extralegal outbursts to punish blacks for perceived affronts to white supremacy or to punish those who allegedly committed crimes, particularly against whites. To call this violence a race riot would be totally misleading.

I want people to understand that, although African Americans found themselves in the middle of riots at great peril to their own lives, their response was shaped by instinct as well as a thinking decision to resist, not to riot.

Robin Lindley: And the mass violence was so widespread, from Arizona to Chicago to Washington, D.C., and in the South.

Professor David Krugler: It’s important to know that this wasn’t just a southern phenomenon. Indeed, some of the worst riots occurred in the upper Midwest—Chicago, for example.

And Washington, D.C. Even though some classify it as a southern city, in many ways it’s not. It’s the nation’s capital and that made the racial conflict and black resistance all the more public. It was viewed at the time by observers and participants as a particularly revealing episode of racial violence in terms of the causes of why whites attacked blacks and what blacks did in response.

Robin Lindley: When we read about this distant violence, I think there’s a perception that blacks were victims or passive, but you stress that African Americans defended themselves against white attacks. They also were much more likely to be prosecuted for violence than whites, even though blacks were not the perpetrators. In fact, whites were seldom charged.

Professor David Krugler: In my research, I was struck by the press reports of racial violence in mainstream newspapers. Time and again, editors and journalists in mainstream newspapers blamed African Americans for the violence. Because that reflected the beliefs of law enforcement officers and even federal troops, that became a justification for arresting African Americans and charging them with serious felonies when whites who caused the violence were not charged. That came out time and time again in Chicago and Washington D.C. with so many black men charged with carrying deadly weapons or concealed weapons when the facts of these cases showed that these weapons were procured for self-defense. African Americans couldn’t rely on law enforcement to protect them or stop the violence.

Robin Lindley: You explore the historical context of the 1919 riots. Black troops were returning from World War I combat and expecting democracy at home after fighting for it in Europe. Were there more black combat troops in that war than in World War II?

Professor David Krugler: In terms of numbers, there were more black troops in combat in World War II than World War I.

The major black divisions were segregated and assigned to French command in World War I, and it’s revealing that they were the only American troops assigned to French command. All white units remained under the command of General John Pershing and the American Expeditionary Force. The majority of African Americans in uniform were channeled to service positions in labor or supply battalions. Many of the black soldiers then in France were stevedores or doing backbreaking labor. That was true stateside as well.

But the four divisions of black soldiers that were in combat distinguished themselves many times. That’s what got a lot of attention. The demobilization of U.S. forces was so swift and white and black soldiers were returning in large numbers in ship after ship in places like New York and Charleston. That enabled the black units that were acclaimed in combat--divisions such as the Harlem Hellfighters—the 369th, and the 370th, the Eighth Illinois National Guard--to come home to welcoming parades that attracted so much attention and celebration. The arrival inspired African Americans in those cities--those in uniform, those that had loved ones in uniform, and those that lived and worked in the United States— to say, “All right, the time is here for us to have the democracy for which we fought in France. We will do whatever it takes to get it.”

Robin Lindley: And at that time you also had the Great Migration, the New Negro Movement, and the Red Scare, so there was almost a perfect storm in terms of the cry for justice for black Americans and then the white fear of a black uprising.

Professor David Krugler: Yes, these things came together in a short time. The New Negro Movement was well under way before the war, and the war provided a way to gel the movement around at last obtaining constitutional rights and equal opportunity.

At the same time, five hundred thousand African Americans moved from the South to the Midwest and Northern industrial cities, creating strains in those cities. And they experienced racism and segregation in those cities as well but they were finding more opportunity.

The draft brought hundreds of thousands of young black men in uniform to different parts of the country. Many of these draftees and enlistees then encountered the New Negro Movement so that, in some ways, the Army became an incubator for the Movement. There was a lot of awareness of that among white officers even before troops were sent to France. They were raising cries of alarm that there were so many black men in uniform that it would be harder to enforce the status quo. They didn’t quite use that language, but they said, “the Negro question would remain unsettled.” Those were code words for saying that black servicemen were not going to submit docilely to inequality if these forces were unleashed.

Robin Lindley: You also discuss the 1919 violence in terms of the American history of mob violence and “rough justice,” including lynching. What was “rough justice”?

Professor David Krugler: The still prevailing understanding of historic lynching today is that it took place in areas where legal systems were not fully in place so people took the law into their own hands to fill a need.

As historian Michael Pfeifer has shown convincingly and beyond doubt, that wasn’t true. Time after time, lynch mobs carried out their murderous acts in communities that had fully established law enforcement agencies and court systems. The mobs wanted immediate gratification. They did not want to go through the court process, even when a sentence of death was likely for the alleged criminal. They wanted to take that person’s life right away, as Pfeifer and other historians point out.

Much of this violence was directed against African Americans for even minor offenses against white supremacy. This really comes out in 1919, when African Americans were being lynched by white mobs because they refused to yield their vehicle on a road or they didn’t use proper forms of address. So rough justice was used to maintain and protect white supremacy and to terrorize other African Americans to quash the New Negro Movement and to provide the larger white community with the sense that they were in control and had the means to deliver immediate “justice” to those who presented any affront to it.

Robin Lindley: You recount graphically the many instances of horrific violence and atrocities. I think many readers may not know of the violence in Elaine, Arkansas, that resulted in perhaps the largest lynching of blacks in American history with the murder by some counts of more than 200 African Americans.

Professor David Krugler: Elaine was one of the later episodes in 1919 and was the most murderous episode of antiblack collective violence that year. It occurred in late September and early October.

In Phillips County, Arkansas, where Elaine is located, there had been some union organizing by African American sharecroppers who had hired U.S. Bratton, a white lawyer from Little Rock, to represent them. He had sent his son to Phillips County to take testimony from the sharecroppers.

When the planters learned of the unionization effort and the hiring of a lawyer, they cracked down hard. They sent a special agent of the Missouri Pacific Railroad as well as a sheriff’s deputy to break up a union meeting late one night in September. That led to a shootout. The sharecroppers were prepared for violence against them to block their movement. In that exchange of gunfire, the white special agent for the railroad was killed.

This shooting led to the mobilization of a mob of thousands [of whites] who broke into smaller mobs. Many were deputized so they had the authority of law behind them. The sheriff of Phillips County, Frank Kitchens, used parts of the mob as his posse. White people too joined from Mississippi.

What followed was a massacre—and it is not an exaggeration to use that word. Or a pogrom.

The estimates of the number of dead range from 20 to more than 230. The Equal Justice Initiative of Alabama issued a report this year putting the number of murdered African Americans at 237. Previous accounts I note in the book put the number around 25. The Equal Justice Initiative study hadn’t come out before my book was published, so I used the figure of 25, which is still a large sum and almost as many who died in Chicago that year. If the figures from the Equal Justice Initiative are correct, it was by and large the bloodiest incident of antiblack collective violence in our history.

And it really began because a group of African Americans were organizing to protect their economic interests. For years, they had been ruthlessly cheated out of their earnings and mired in debt peonage by the planters. They were seeking to break out of those chains, this form of slavery by another name that kept them tied to the land, enriching the landlords while leaving them in poverty. When they made a move to break out of that, the mobs formed to break up the union. They defended themselves and that led to even greater retaliatory violence, which also led to African Americans doing what they could to defend themselves, but they were hopelessly outnumbered.

The sharecroppers were rounded up, and that led to another episode of resistance—trying to save the lives of the men accused of conspiracy to murder whites and sentenced to death.

Robin Lindley: And you detail this story of the struggle for justice for these accused black sharecroppers, the Elaine 12.

Professor David Krugler: A lot of scholars are familiar with the U. S. Supreme Court decision that grew out of this case, Moore v. Dempsey, 1923, that established the federal government’s obligation to insure that state judicial proceedings protect the constitutional rights of the accused, particularly due process. To the modern person, it would seem self-evident that such protections were required in state proceedings, but that wasn’t the case until that decision. And that decision would not have reached the Court had it not been for the NAACP and a black lawyer in Arkansas, Scipio Jones, who undertook extensive efforts to defend the 12 sharecroppers who were sentenced to death after being tortured and convicted in hurried and grossly biased court proceedings.

Robin Lindley: Weren’t many innocent bystanders killed in the Elaine violence? I recall an incident of a woman and baby who were burned to death in their home.

Professor David Krugler: The mother and baby incident occurred in Florida in 1920.

In Elaine in 1919, there were many instances of people who were not part of the union effort who died. That’s not to say that the sharecroppers deserved what happened to them, but the efforts of the mobs and posses did not just punish the sharecroppers but also terrorized the majority black population so they would never again undertake anything that would question the status quo. With that blood thirst, you have instances like that of an elderly woman murdered and her body dragged out to the road and her dress pulled over her head. There was an elderly man killed in his bed. These were conscious, very disturbing decisions by the mob to set examples.

There are so many of these atrocities in 1919 that to dwell on them could be very disturbing and numbing, but we cannot ignore them because they happened, and we have to understand why they happened. As I said, my purpose in the book was to show what African Americans did in response when these atrocities were under way.

A photograph of the posse in Phillips County, Arkansas, that attacked black sharecroppers and their families. Source: AHC 1595.2, from the Collections of the Arkansas History Commission.

Robin Lindley: It seems that these acts of violence against blacks occurred for very trivial reasons, and the acts of self-defense by blacks usually were only after black citizens had asked for protection from law enforcement and local government.

Professor David Krugler: That’s an important point. In so many of the cities where the violence took place, the initial response of black organizations—particularly NAACP branches—was to go to the authorities and ask them to restore order.

A great example of this occurred in Washington, D.C., which had a very active and effective NAACP branch. Its officers met with the commissioner of the District of Columbia, the equivalent of a mayor, and the police chief, and they asked for protection. When they got a very qualified response from these two top officials, they were understandably indignant. In fact, the commissioner of the district, Louis Brownlow, was more interested in knowing what the NAACP leaders were going to do. They left the meeting saying it was [Brownlow’s] job to make sure that everyone is protected and order is restored. If you’re not going to do it, they said, the black men in Washington were not going to stand by and let themselves and their families be shot down like dogs.

This is a great example of that initial response seeking the protection of law and order and being rebuffed and not getting the obligatory response from authorities and then undertaking to do it themselves.

Robin Lindley: That failure of officials to protect African Americans was striking. And then, in terms of justice, whites who initiated violence were seldom charged with crimes yet blacks were frequently charged and prosecuted. The NAACP sought equal justice and compensation for damage to African Americans.

Professor David Krugler: That occurred in Charleston, South Carolina, which had the first major outbreak of antiblack collective violence in 1919. White sailors attacked black civilians and black-owned businesses and destroyed some of them. The NAACP branch in Charleston then sought compensation from the Navy, which did nothing. They also tried to get the Secretary of the Navy to do something and that didn’t work.

Seeking unbiased court proceedings was a consistent goal in the cities where there was antiblack collective violence and where authorities unfairly targeted black self-defenders. There were all sorts of legal efforts to see that these people received an adequate defense, that they were able to present the facts of the case, that their actions were presented as self-defense, and that even their possession of a weapon was seen in that context.

This fight for justice as a whole saw success through legal victories, even though there were many setbacks and failures.

Robin Lindley: You note some bright spots in this grim history with some legal victories and the black press acting as a corrective to the biased mainstream white media that presented the perspective of the white majority.

Professor David Krugler: Yes. James Weldon Johnson, an official with the national office of the NAACP, was especially effective in his writing for the New York Age, a black weekly. He took on the misleading and often outright false accounts that were published in the major dailies in U.S. cities about the causes of the violence. Because he went to some of the places where the violence occurred, he was able to provide well-sourced rebuttals to the narratives and establish the real causes of the violence.

Walter White of the NAACP was another writer and he was even more intrepid. He went to Phillips County, Arkansas, and his life was at risk down there when it was discovered he was from the NAACP and was African American. He was very fair skinned so he used that to his advantage to pass as white to carry out investigative work.

White also went to Chicago for its riots. His evidence was not only useful for identifying what caused Chicago's rioting and for rejecting the stories that were blaming blacks, but also essential to the fight for justice because he got affidavits from many black Chicagoans on the actions of individual white rioters.

White even delivered much of the evidence to the grand jury that was convened in Chicago to bring riot charges against those who were arrested. As a result of White’s efforts, the grand jury went on strike because in the initial stages of the grand jury hearing, the state’s attorney was presenting only black defendants. After receiving evidence from the NAACP through Walter White that in fact many whites had been arrested and that they were responsible for so much of the violence, the all-white grand jury said they had to see these cases to have a fair judicial proceeding.

Here we see how these efforts tie together—trying to establish the truth about what happened and also secure a chance for fair court proceedings for those who had been detained and charged.

Portrait of Will Brown published in the book Omaha’s Riot in Story and Picture (1919). Source: RG2467-8, Nebraska State Historical Society

Robin Lindley: You also share some vivid photographic evidence from the time. I’ve been haunted by the photograph of the burnt body of Will Brown, a black man lynched in Omaha, since I first saw it as a child in a history book.

Professor David Krugler: That photograph comes from a book published shortly after Omaha's racial conflict called Omaha's Riot in Story and Picture. To the modern reader, it's a disturbing publication because it has the feel of a children's book. The text is very simple, but the photographs are unforgettable, particularly the one that shows a young Will Brown with a pensive, almost sad expression, and of course, the readers know what happened to him. It's almost as if, in that expression, he's showing an awareness of his fate, of the horrible death he suffered at the hands of Omaha's courthouse lynch mob. There are other pictures that give a sense of how many people poured out in Omaha for the storming of the courthouse and the seizure of Will Brown, who had been falsely accused of the sexual assault of a young white woman and the assault of her boyfriend.

As I describe in the book, those assaults never took place. The story was concocted by an Omaha political boss, Tom Dennison. He had been planning for months to discredit and oust from office the progressive reformers who had displaced his party. A newspaper associated with Dennison and his machine had been publishing lurid accounts throughout 1919 alleging that black men had sexually assaulted white women and girls. It's my belief that Dennison saw that those stories created a stir but didn't accomplish his basic purpose. I believe he had his young assistant and his girlfriend make up the attack. I think they found Will Brown ahead of time and had him arrested. I don't think Dennison anticipated that the courthouse would be stormed and Will Brown seized, but that's what happened.

Will Brown was hanged from a light pole in downtown Omaha and shot numerous times and his body was cut down and burned. There’s a horrific photo of people—men, women and even a little boy—gathered around the burned body of Will Brown. This is where we see rough justice because the mob members didn’t see what had happened as a shameful crime. They were proud of what happened and wanted to be photographed.

Robin Lindley: It looked almost like a party atmosphere as Mr. Brown burned.

Professor David Krugler: The historians whose work I benefited from explain that this goes back to the notion of rough justice: people believed that what they were doing was right and that it was not actually a crime to kill someone, even though he had been arrested and would be going through the court system. By that logic, it shouldn't surprise us that they would pose for pictures. This photographic evidence of lynchings is substantial. It should shock us and bother us, but when we understand the logic behind it, we have a better sense of why it occurred.

Robin Lindley: 1919 must have been the worst year of racial violence in our history, at least since Reconstruction.

Professor David Krugler: Absolutely. Reconstruction saw some violent years, but in terms of the frequency and the compressed amount of time, 1919 stands out as the most violent year for racial conflict in US history. This perfect storm of these forces coming together helps us understand why 1919 was so remarkable: the end of the war; the purposes for which the US fought; the mobilization for the war; the Great Migration; the New Negro identity; black military service; increasing organizational activities of the NAACP. These and other factors came together to create the conflicts.

Of course, the overriding cause was the determination of many white Americans to maintain the prewar system of racial segregation and discrimination. Because so many African Americans were determined not to return to that and to achieve full equality and opportunity, that produced friction and resistance.

The number of dead was well over one hundred. Depending on how we count the Phillips massacre, from 20 to 237 African Americans were killed there. In Chicago, the death toll was 38 and 23 of those were blacks. Those were the two bloodiest conflicts. When you add up all of the events and then add more than one hundred lynchings of African Americans in 1919, then you have triple digit death tolls. When you consider the loss of property and livelihoods and businesses—especially in Charleston and Longview, Texas—those who survived with their lives lost everything in this antiblack collective violence.

By those measures, there was a terrible cost. But perhaps the greatest cost was to US democracy itself. I wouldn't want to impose modern sensibilities or expectations on a time one hundred years ago, but if it was possible for the United States to mobilize for the world's most devastating war to that time in order to make the world safe for democracy, it should not have been impossible to also have democracy at home for all. That's what African Americans were saying before, during and after the war. And there was agreement from some white Americans. The epigraph that begins the book is from a white officer who helped command the Harlem Hellfighters. He basically said that, having fought to make the world safe for democracy, it was time for America to have democracy—meaning for African Americans. That awareness was around in one voice after another, but it's unfortunate that it wasn't possible at that time.

Robin Lindley: Your book tells a remarkable and largely unknown story. It has so much resonance with recent events: the police killings of black men in Ferguson, Missouri, and elsewhere, and the mass shooting of African American churchgoers in Charleston by a deranged white supremacist.

Professor David Krugler: Thank you. I think there is a lot of contemporary relevance and 1919 has a lot to tell us about where we are as a nation today. I’ll offer one example.

The recently released Department of Justice report on policing practices in Ferguson reads—if you leave out the technology—as if it were from 1919: the deliberate targeting of African Americans, and a Ferguson civic leadership that viewed black constituents as revenue generators rather than as the people it serves, as citizens and taxpayers. There was no compunction about deliberately targeting individuals because of the color of their skin. In other words, there was a double justice system in place. That was true in so many places in 1919, when African Americans were not treated equally by the police and when, once rioting began and whites attacked African Americans, the police went after African Americans. This helps us understand the long history of distrust and friction between so many black communities and the police that are supposed to protect them, and why relations are at such a low point now.

This isn’t just something that happened recently in Ferguson or was happening in the 1960s to the present. It has a long history. One of the cartoons in the book from a black Washington weekly called The Bee shows a well-dressed black woman approaching a police officer lounging against a street pole while the background shows a white mob attacking African Americans. The black woman asks, “Why don’t you stop them?” And the police officer responds, “Ha, Ha. That’s what I would like to know.” That exchange was actually reported in one of Washington’s daily newspapers. The police were saying they weren’t getting guidance and weren’t told what to do.

From 1919 to the outbreaks in Ferguson, we see this continuum, and perhaps the best lesson is to understand that it has been a long time in the making and that we need to take steps to break that historical continuity of destructive and often biased policing.

Fortunately today, we don’t have mob attacks to maintain white supremacy, but does that mean all is well? I think the Justice Department report on Ferguson shows that’s not the case.

“This Nation’s Gratitude,” published in the Washington Bee, July 26, 1919, p.1.

Robin Lindley: Recently, on the first anniversary of the killing of Michael Brown in Ferguson, a white patriot group armed with assault weapons arrived on the scene supposedly to assist police. A black commentator doubted that groups of armed black men with assault weapons would have been welcome there.

Professor David Krugler: Yes. The Oath Keepers, I believe they’re called, tried to explain to those organizing protests in Ferguson that, “We’re on your side, to assure the police don’t do anything to you.” But I think it’s a legitimate question about what the response would be if large numbers of African Americans were openly bearing long rifles and assault weapons on the streets.

In 1919, the sight of African Americans bearing weapons or even the fear that they would have access to weapons preoccupied the Military Intelligence Division and the Bureau of Investigation, the forerunner to the FBI. The young J. Edgar Hoover headed one of its units.

In the book, I looked at the efforts to disarm blacks. The federal government, in collusion with local law enforcement and even gun dealers, denied African Americans their Second Amendment rights and denied them access to weapons they needed to defend themselves because the notion, the lie, was that they were organizing uprisings and conspiracies to murder whites en masse.

Robin Lindley: Your book is a vivid reminder of how ignorance and intolerance erode democracy. Thank you for your book and your thoughtful comments, Professor Krugler.

An Open Letter to Senator Elizabeth Warren

Steve Hochstadt is a professor of history emeritus at Illinois College, who blogs for HNN and LAProgressive, and writes about Jewish refugees in Shanghai.

 

Elizabeth Warren in the Democratic Debate on July 30, 2019

 

 

Dear Senator Warren,

 

Congratulations on your campaign thus far and on your performance during the two sets of televised debates. You have done a fine job of presenting yourself and your ideas for America.

 

But I don't think the Democratic debates have been successful so far. The apparent need for each candidate to distinguish themselves from the rest, and the relentless efforts of the media, including the debate organizers and hosts, to dig out points of difference and conflict, have resulted in a set of seemingly contradictory messages about what Democrats stand for, in contrast to the much more unified message of Trump and the Republicans.

 

I have believed for some time that the way for any Democrat to beat Trump is for all Democrats to emphasize what unifies you, or us. I wrote about that idea in May, but the message apparently did not have any effect. So I am trying a different tack: asking you to take the lead in helping the whole field of candidates be clearer about what all Democrats propose to the American people, each in their own way and with their own emphases.

 

I ask you to formulate a statement of Democratic principles and policies that every candidate could accept in a public way, preferably as part of the next debate. That statement should address the underlying agreement among all Democrats on issues that separate us from the Trump candidacy: the intention to address climate change rather than call it a hoax; to raise taxes on the rich, not the rest of us; to fund education, child care and infrastructure more vigorously; to move forward from Obamacare, not destroy it. Imagine the impact on the voting public if every candidate on the stage said the same few words about their support for what surveys show that the majority of the public wants.

 

Success in this effort does not depend on getting unanimous agreement. If someone wants to say, “I don’t support those ideas,” let them. That will further emphasize the unified stance of the rest. Success would be taking control of the debate and the whole effort to get rid of Trump.

 

I ask you to do this because of your status as a leading candidate, your willingness to take the lead in formulating what being a Democrat means, and your fearlessness in articulating your campaign message. Taking this lead may not help you personally to jump over the other Democrats, but it could help all Democrats gather the votes of the majority of Americans who disapprove of Trump. I don’t see how such an effort could harm your campaign among Democratic voters.

 

I understand that your own policy proposals would not get approval from all the candidates, or perhaps even from most. That is exactly what is confusing to the average American voter. You would have to formulate a statement that falls short of the plans that you have outlined. I support your candidacy because of those particular plans. But more important, I believe that the majority of Americans support the general foundation of Democratic ideas and ideals that inform you and the other candidates.

 

The media, from FOX to MSNBC, from the NY Times to the local papers that most Americans read, will not draw the obvious and important contrasts between what Republicans have done and tried to do and what Democrats would do if Trump is defeated. They will keep talking about a horse race, goading you all to attack each other, and emphasizing the smallest differences over the larger consensus.

 

Thank you for your service thus far to all Americans.

 

Steve Hochstadt

Springbrook WI

Please stop saying the past is in the past. It isn’t.

Betsy Ross 1777, a ca. 1920 depiction by artist Jean Leon Gerome Ferris, Library of Congress

 

When controversy flared last month over a Revolutionary War-era American flag and Nike sneakers, a reporter asked me what might sound like simple questions: 

 

Should the past stay in the past? Should we be making a fuss over historical realities? 

 

“It’s the history of the United States,” another person told the reporter for the story. “Can’t change it. … What happened happened.” 

 

Here’s the problem with that common sentiment: What happened in the past often has a profound impact -- on real people in real life. Right now. 

 

If you missed it: Nike dropped its Air Max USA 1 shoe after former NFL quarterback Colin Kaepernick worried that the “Betsy Ross” flag design -- with 13 stars in a circle, on the shoe’s heel -- harkens back to an era of black slavery and has been used by white nationalist groups, according to published reports. 

 

That version of the flag dates to the late 1700s. A century before, in 1676, both black and white workers, led by Nathaniel Bacon, a white Virginia colonist, rose up against the ruling body of the Jamestown colony. The uprising led the Governor's Council to find ways to separate African workers from their white peers.

 

The event is often credited as the rationale for dividing, demeaning and diminishing the worth of black people in America -- division that we’re still grappling with. It happened. 

 

Next let’s consider the Three-Fifths Compromise, a legislative agreement in 1787 that determined slaves would be considered three fifths of a free person. It happened.  

 

Although the three-fifths designation was ostensibly for purposes of taxation and legislative representation, it helped set the stage for how black Americans are viewed and treated in this country. If you are three fifths of a person, you are much easier to abuse, ignore and oppress. 

 

The three-fifths notion represents the genesis of the present debate on whether a citizenship question should appear on the national census. It explains why the census counts people, not citizens.  

 

More recently, Plessy v. Ferguson, the landmark U.S. Supreme Court case of 1896, cemented the American concept of “separate but equal.” In practice, that has always meant separate and unequal. Homer Plessy, a person of color (one-eighth black), refused to move to a rail car designated for black people only. He was arrested and took his case to the Supreme Court, where he lost in a 7-to-1 decision. It happened.  

 

The aftermath of the ruling led to the rise of Jim Crow laws across the South, affecting such everyday services and accommodations as schools, theaters and travel as well as voting rights. Even today, people are often segregated based on race all over America, and the voting rights of people of color are being challenged in several states.  

 

After slavery ended, lynching was a common means of racial intimidation and terrorism against men and women of color. There have been nearly 5,000 recorded lynchings since the late 19th century. But due to poor record-keeping and reporting, it's likely there have been many, many more. It happened.

 

As late as 1998, a black man in Texas was lynched, dragged by the neck three miles behind a car driven by three white supremacists. That happened, too. 

 

This "past" has never left. Nooses and their imagery keep reappearing and are still used to terrorize. Even in 2019, pictures of nooses were posted in a classroom in Roosevelt, Long Island.

 

When it comes to race in America, the past is not the past. Shakespeare got it right: The past is prologue. According to a Pew Research Center survey in June, eight in 10 black adults say "the legacy of slavery still affects black people in the U.S. today."

 

Last year, the center reported that “black households have only 10 cents in wealth for every dollar held by white households.” Likewise, the Economic Policy Institute reminds us that in this period of economic boom, black workers had the highest unemployment rate in the country, 6.3 percent -- nearly twice that of whites. And the Centers for Disease Control and Prevention tells us that black Americans “are more likely to die at early ages from all causes.” 

 

I suggest that these indicators, and many more, are not the result of happenstance or coincidence today, but directly caused by things that happened in the past. 

 

The flag credited to Betsy Ross, as an artifact of American history, is innocent. Unfortunately, Kaepernick was correct: This flag has been adopted as a proxy for the Confederate flag and is flown by white supremacist groups such as the Ku Klux Klan, the Patriot Movement, neo-Nazi groups and the militia gang Three Percenters, a group formed after the election of Barack Obama. The throwback flag represents an era when slavery was legal and commonplace.  

 

So is the past really in the past? Or is it a profound part of our everyday lives? To me, the answer is indisputable. 

The Constant Threat of Mass Shootings Requires Increased Protection for Presidential Contenders

The Secret Service flag

 

After the two latest mass shootings last  weekend (August 3-4, 2019) in El Paso, Texas and Dayton, Ohio, Americans once again mourned the lives lost and worried for the future. The weekend’s shootings were just two of 250 such massacres in 216 days in 2019. As many of these tragedies were connected to an explicit political philosophy, Americans must face the threat of assassination attempts against the Democratic Presidential contenders. 

 

In the not-so-distant past, two presidential candidates running for office in the midst of American chaos and division were shot. Senator Robert F. Kennedy was assassinated while running for his party’s nomination in 1968. Governor George Wallace was shot and paralyzed in 1972. I discuss both of these incidents in my book Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama (2015).

 

Today, mass shootings often target a specific group of people. For example, self-identified "incel" (involuntarily celibate) young men have targeted and attacked women, Dylann Roof targeted African Americans in the Charleston church shooting, the El Paso shooter intentionally killed Latinos because he was concerned that they would "take over" Texas politics, the Pulse Night Club shooter attacked the LGBTQIA community, and the perpetrator of the Pittsburgh synagogue shooting targeted Jews. Many of the presidential contenders are women, African American, Latino/a, gay, and/or Jewish. This makes these candidates potential targets of hate crimes.

 

So it seems reasonable to demand that the US government, through the Secret Service, the FBI, and other security agencies, immediately provide Secret Service protection to the ten candidates who will appear in the Democratic Presidential debates in September and beyond. I believe the risk is high enough that candidates should receive protection for a few months after they drop out of the race.

 

If the government can cover the $110 million and counting that Donald Trump's golf outings have cost in his first two and a half years in office, then the American people must demand protection for those who might be the next President, so that we, hopefully, avoid future tragedies such as those that made Robert F. Kennedy and George Wallace victims a half century ago.

Samantha Smith's Dream of Peace and Nuclear Disarmament

Samantha Smith and her letter to Andropov, Samantha Smith Foundation

 

Growing up in Massachusetts during the Carter and Reagan presidencies, I was one of many little kids worried about nuclear war. We knew about the horror of nuclear weapons from the atomic bombs dropped on Hiroshima and Nagasaki in August 1945, ending World War II.

 

Both America and the Soviet Union were testing a lot of nukes. I told my family "there was no need to test nuclear weapons because if you fire a nuclear missile at a coffee cup, obviously the cup will break." The Cold War arms race continued nonetheless.

In Maine, 10-year-old Samantha Smith was also deeply concerned about nuclear war. Her mother encouraged her to write to the Soviet leader Yuri Andropov in 1982. She did, in a personal handwritten letter, telling Mr. Andropov: "I have been worrying about Russia and the United States getting into a nuclear war. Are you going to vote to have a war or not?.....God made the world for us to share and take care of. Not to fight over or have one group of people own it all. Please lets do what he wanted and have everybody be happy too."

Off in the mail Samantha's letter went; she probably did not expect a reply. But then, months later, Samantha got the surprise of her life. Her letter was printed in the Soviet newspaper Pravda. Then she got a personal reply from Andropov himself, inviting her to the Soviet Union! The next thing you know, Samantha was on TV too, talking about what she wanted most of all: peace.

Samantha Smith toured the Soviet Union in July of 1983, meeting Russian kids, and became an ambassador for peace and nuclear disarmament. She believed people of rival nations could get along and did not want war. Samantha also visited Japan to support its people's desire to eliminate the kind of nuclear weapons they had suffered in the atomic bombings of Hiroshima and Nagasaki.

Samantha tragically lost her life in a plane crash in 1985. I remember hearing the shocking news. Samantha's spirit has never been forgotten. This is so important because a complacency has set in when it comes to nuclear weapons. Right now leaders are dragging their feet in reducing the nuclear threat. There are still about 14,000 nuclear weapons in the world, according to the Arms Control Association. The U.S. and Russia have about 90 percent of the nukes.

Shouldn't we all be worried about them today, too? There is still the risk of nuclear war, accidental launch or nuclear terrorism. How long do we want to live with this danger? Billions of dollars are poured into nuclear weapons each year. Wouldn't we rather spend this money on fighting hunger, poverty, disease and climate change? The Move the Nuclear Weapons Money Campaign wants to end nuke spending and use it toward the benefit of humanity.

The danger of nuclear weapons is shared by every person, every country. Everyone can take part in the goal of nuclear disarmament, much as Samantha encouraged. I see examples of this idealism today when working with the CTBTO youth group, whose passion is to achieve the Comprehensive Nuclear-Test-Ban Treaty. This treaty would end nuclear testing forever, helping pave the road for the elimination of all nukes. Samantha Smith taught us that your voice matters and that you can make a difference in ridding the world of nuclear weapons and achieving peace for all.

Woodstock, The Moon Landing and Sesame Street, Too: 50 Years of American Cult Art

The Lantern Bearers by Maxfield Parrish

 

The year 1969 was a seminal twelve months in the life of the Norman Rockwell Museum, in Stockbridge, Massachusetts. The hometown of the celebrated American artist, who made the past come lovingly to life on his canvasses and magazine covers, opened the museum, mourned the closing of the Saturday Evening Post magazine, so often graced by Rockwell’s illustrations, and stood eyewitness to some of the most remembered events in American history, such as the Apollo 11 moon landing, the Woodstock, N.Y. rock concert, mud and all, the Charles Manson murders, the student rights revolution, sexual revolution and just about any revolution you could find.

 

The museum is celebrating the year 1969 not with an exhibit about politics or history, but with one about how artists showcased their work in that one, single year. The exhibit, Woodstock to the Moon: 1969 Illustrated, which opened recently, has an unusual star – the gang from the Sesame Street television show, which debuted that year. When you walk into the first of the two exhibit halls you are face to face, in all of his glory, with the Cookie Monster. He's busy, too, chomping down hard on the first of seven huge chocolate chip cookies. Right next to him is a series of photographs from the first year of the show, along with Sesame Street memorabilia. The lighthearted Sesame Street display sets the tone for the rest of the exhibit.

 

The exhibit contains the art – illustrations, photographs, posters – of 1969 events.

 

You find out things you did not know or simply (my case) forgot, such as the fact that in 1969 the number one movie at the box office was not an acid rock movie, but a solid, good old western, Butch Cassidy and the Sundance Kid. The winner of the Oscar for Best Picture that year was none of the tried and true Hollywood hits, but an X rated movie – Midnight Cowboy (“I’m walkin’ here, I’m walkin’ here…” says Ratso Rizzo as he crossed a busy midtown New York Street).

 

There are numerous colorful movie posters from 1969. Everybody remembers that the hit musical Hello Dolly!, starring Barbra Streisand, debuted that year, but what about the James Bond movie ("Bond, James Bond") On Her Majesty's Secret Service? The Wild Bunch? Once Upon a Time in the West? And it was the year that one of the most memorable, and funny, movies of all time, Animal House, was released.

 

On the personal side, I forgot that 1969 was the year the satirical National Lampoon magazine started publishing (yes, the forefather of all those movies). It was the year four people were killed at a Rolling Stones concert at Altamont Speedway in northern California. It was the year that Elvis Presley began his long tenure in Las Vegas, selling out 636 consecutive concerts. And oh, yes, the cartoon character Scooby-Doo made his much-remembered debut. It was also the year that Kurt Vonnegut's wonderful offbeat novel, Slaughterhouse-Five, hit the bookshelves.

 

There are posters, large and small, that celebrate many of these films with wild illustrations by the prominent artists of the day, including Rockwell. There were illustrated posters for Janis Joplin (she of the gravelly voice) concerts, television shows, movies, political races and Presidents (yes, Nixon).

 

There is even a quite scary magazine illustration of the Frankenstein monster for one of his long-forgotten movies.    

 

Another part of the exhibit is a tribute to the moon landing with lots of photographs and a Rockwell painting of Neil Armstrong and Buzz Aldrin on the surface of the moon.

 

In the middle of one of the halls is a 1969 model television set that continually plays trailers for movies that premiered that year. There is a large and pretty comfortable sofa across from it, so you can sit back and enjoy the history and the memories.

 

The museum exhibit has some problems, though. First, it only fills two halls. The movie posters from that year alone could fill the entire building. Why not make the exhibit as big as possible?

 

Many parts of 1969 art history are ignored. There is practically nothing from the Vietnam War and less on student campus riots. There was certainly plenty of artwork for those two epochal events. There is Sesame Street, but what about all the other shows that debuted that year? You want color and drama? What about some photos or artwork about the downtrodden underdogs, the New York Mets, who won the World Series that year?

 

Even so, Woodstock to the Moon: 1969 Illustrated is worth the trip to the Rockwell Museum, happily celebrating its fiftieth anniversary amid the forested campus where it sits. A look back at the artwork from 1969 is a rare chance to gather memories for many and, for younger people, to learn how artists took on the army of events that happened in that one single year. Much of what we know is from the work of artists, whether filmmakers, illustrators, painters or writers.

 

Besides, how can you go wrong with an exhibit that gives you, in all their glory, John Wayne, James Bond and John Belushi?

 

The exhibit runs through October 27. The Rockwell Museum is just off route 183 in Stockbridge, Massachusetts. The museum is open daily from 10 a.m. to 5 p.m. 

Reflecting on Patriotism in the Trump Era

 

Sometimes these days I am asked what bothers me about the present leadership (or lack of it) that occupies the Oval Office, appoints subordinates, decides if our country will go to war, and proclaims the direction we are all going. If I hesitate, all I have to do is take a look at the columnists on the editorial page of the nearest newspaper.

 

That President Donald J. Trump is the target of every arrow is nearly inevitable, as there are many different causes for open hostility. He is en route to permanently dividing our "united" Country, one I had thought was headed in that happy direction. In our Congress, Republicans and Democrats seem to have precious little in common.

 

Men and women in uniform hope these days that the erratic Trump fellow represents only a minority of the “folks back home.”  Meanwhile, it does seem more than likely that almost any unit of professors and teachers will be deeply resentful of the Trumpian use of language on Twitter. 

 

There is bitterness among the competent over cabinet appointments and the handling of foreign relations. Many of us who spent so long fighting our wars do fear a Trump war stirred up almost casually—without thinking. As we try to follow what our president is up to, we discover he is likely to be vacationing. We observers feel that our hotel and golf course magnate only has one foot in "governing."

 

Let's admit it: the 4th of July that just passed became something of a strain midway when our erratic President did much as he pleased. Predominating was showmanship. It seems true enough that children and the military-minded did get a real "bang" out of watching those nine overflights in formation.

 

As we celebrated our nation’s birthday, many of us reflected on our shared history and our future. All that time we existed as the United States of America; all those wars and battles we can’t forget.  The heroism that we acknowledge without reservation. Yes, a lot of us did want to commemorate the 4th of July. Did we want it to become a Trumpian holiday?

 

We found ourselves detouring as we tried "celebrating" as well as "commemorating" the dead and dying in the aftermath of War. Our White House occupant clearly got in the way! The holiday was imperfect for some of us. I at least was a bit sorry for Donald as the rain trapped him in his apparently bullet-proof square glass cage where the prepared text had to be given, no matter what.

 

Patriotic holidays are decidedly a time for looking back with pride. Always, we center our attention on “our military,” even though there are an endless number of administrative units who have served usefully and faithfully over the decades. 

 

That brings up several new subjects for worry and concern. Will our intelligence services do their highly essential job under that inexperienced president? That's not all. How in the world is William Pendley, long critical of the Bureau of Land Management, going to manage once in charge? Out here, the BLM controls vast square miles of land. It stands for "conservation." Pendley apparently favors wholesale drilling for oil on the BLM acres—we shudder.

 

And yet, there is much to be proud of. We did win Independence, preserve our new capital after 1812, and expand to the Pacific after waging war briefly with Mexico to end up with West Texas, New Mexico, Arizona, and California! That bloody Civil War early in the 1860s proved so costly (there were over six hundred thousand casualties). In the years after Lee surrendered, many of those who fought and survived came to ponder if the former slave was as “free” as had been hoped. The memory at the time was of bravery on both sides, in any case! But as time passed, that Reconstruction Era proved a disappointment, for sure. A New South gradually arrived, but not in every county.

 

As the Indians were forced onto Reservations, they and sensitive observers hoped for a better outcome, somewhere down the line.  After all, the natives were being displaced (to put it gently), and results were by no means universally approved. Fighting Spain, we did manage to bring a form of freedom to Cuba and the Philippines, but taking the long view offers only diluted celebration.  

 

World War I brought celebration to our streets but also division as our homegrown Germans fumed. The era also brought women's right to vote and Prohibition. With massive World War II we came to appreciate fully what the American military is able to do under terrible pressure. Hitler, Mussolini, Tojo and dedicated Japanese all surrendered as we and our sacrificing Allies in two wars brought the right side to its final triumph. The path to sudden victory at the very end (in August 1945) revealed a tool of war that all hope will be avoided if there is a "next time."

 

We took on northern Korea, and then, apparently with no plans to do so, we fought intervening China. It was cold and there was plenty of discouragement. South Korea today looks good to most observers. (Except, oddly, to our President, who clearly isn't sure! One could almost accuse him of partisanship toward that awful North Korea.) The Republic of China is another entity in which we can take pride—although we didn't quite fight a war (not yet, anyway) for that independent state.

 

Any pride mainlanders have felt for interacting with Puerto Rico has been pretty much shelved in today’s time of post-storm neglect. There is turmoil, as one writes, and something drastic must be done. Lots of us hope our deteriorating national government will take on this neglected island as a project very soon now. 

 

Far on our periphery we are in an uproar over our Federal Government's conduct on the southern border. There is disrespect for family as an entity and cruel treatment of children who deserve Care.  

 

Meanwhile, we do realize that partly due to our Vietnam effort nearly all of Southeast Asia ended up safe from Communism as the last century ended. It isn't put that way very often—though President Reagan ventured to say "noble" when considering everything overall. (Maybe fighting for Saigon helped, a little, in the overall region.) But almost all admit these days that our massive South Vietnam effort didn't come to the kind of conclusion this sometimes war-making state likes to see.

 

Our Nation cherishes victories and deplores defeats—not surprisingly. We prefer to feel something like content with the outcome when the flags are packed away!  That Vietnam War with its humiliating end in 1975 is hard to reconsider….

 

Grenada and Panama don't begin to make up for that ghastly exit from Southeast Asia in 1975; not close. A few decades later, Kuwait was a small war won, but Iraq has attracted few to cheer. Using our weapons in Afghanistan has not brought much cheer (except maybe over at those National Guard buildings). Those casualties do not help in such places.

 

Thoughtful Americans are in an uproar as our Nation’s “leadership” seems willing to embark on an unsought, unthought, sudden New War.  Fight Iran? Why in the hell? Will they start it by attacking our carrier force? Or attack the British instead? Is that Oval Office seeking a war to divert us?  From what?  From all the administrators who quit or were jailed, or proved incompetent?

 

Wars worth "celebrating" are not easily come by, are they? Between memories of the dead and the wounded, sad analyses of how causes did and didn't work out, and frustrated hopes as expectations get compared with results, we do have to think, consider, reflect, and, yes, get much better at avoiding war--don't we?

 

At the outset of this essay I expressed giant unease about the state of our good Country as led by one who continues to act, repeatedly, as one out in the hustings, campaigning.  He spends every weekend at purely political rallies. 

 

Nonetheless, looking back to the recent July 4th remembrance, we did commemorate another historic Independence Day first class. It was easy, as always, wasn't it, to find a whole lot to remember and to celebrate en route? It seems clear that the problem we faced as we tried to be "patriotic" was that we do not respect or even trust our presidential leadership. We, most of us anyway, want New Leadership.

      

But:  let’s put our heads together before July 4th, 2020.  We do like tapping our feet cheerfully to that patriotic band music—yes, we do. But let’s make sure our Land stays at peace.  Most of all, we will not be bypassing this or any other patriotic holiday.  All of us with the label “American” want to participate with unmitigated, patriotic enthusiasm at times when loyalty to Country is appropriate.

 

If we must, we can commemorate July 4th this coming year in spite of who occupies that Pennsylvania Avenue address.  But it would be so nice if we could somehow be free of that odd figure by then.  We do want to be able to concentrate on a United States of America that is still proud to be the home of every one of us!

Remembering The Red Summer 100 Years Later

This summer marks the hundredth anniversary of 1919's Red Summer, when, from May to November, the nation experienced ten major "race riots" that took the lives of more than 350 people, almost all black. How should the challenging but essential task of remembering and commemorating this troubling history be confronted?

 

What to call the racial violence is the first challenge. Race riot is, as Ursula Wolfe-Rocca of the Zinn Education Project pointed out, a problematic term, implying that everyone caught up in the violence was equally responsible for the disorder. Yet almost every instance of racial violence in 1919 began with white people organizing to attack African Americans for specific purposes: to drive them from jobs and homes, to punish or lynch them for alleged crimes or insults against whites, to block black advancement. In Chicago, for example, white gangs carried out home invasions to drive black residents from houses in previously all-white neighborhoods. To call such actions “riots” minimizes their overtly racist intent and overlooks the instigators. 

 

Although Red Summer captures the scope of the violence, it doesn’t convey the purpose or totality of white mob attacks directed at African Americans during 1919. In my work on 1919’s racial violence, I propose using antiblack collective violence as a replacement for race riot. In some instances, the terms massacre and pogrom are warranted: in Phillips County, Arkansas, in the fall of 1919, white mobs and posses killed more than 200 African Americans in a frenzy that grew out of a pre-meditated attack on a union meeting of black sharecroppers. The very words we use to describe 1919’s violence can be a step toward an unflinching, complete understanding of the event’s significance and legacy.

 

Another challenge: to acknowledge the victimization of African Americans while also recognizing their sustained resistance through armed self-defense. In Washington, D.C., hundreds of black residents formed a cordon to deter white mobs; in Chicago, black veterans recently returned from combat during World War I put on their uniforms and patrolled streets to stop mobs when the police couldn’t or wouldn’t. Resistance also took the form of legal campaigns to clear African Americans charged with crimes for defending themselves and to pressure prosecutors to bring charges against attacking whites. Black self-defense is yet another reason to jettison the term race riot: in fighting back, African Americans were resisting, not rioting. 

 

Typical narratives about 1919’s antiblack collective violence, especially in school textbooks, often conclude abruptly: the attacks ended, the affected community moved on. Such treatment isolates the violence, implying it was an aberration without lasting effects. Yet for African Americans, discrimination and the upholding of white supremacy took other forms. Many cities are examining this problem through programming related to remembrance of 1919. In Chicago, the Newberry Library, along with numerous local partners, is sponsoring a series entitled Chicago 1919: Confronting the Race Riots. A basic purpose is to show Chicagoans “how our current racial divisions evolved from the race riots, as the marginalization of African Americans in Chicago became institutionalized through increasingly sophisticated forms of discrimination” such as red lining and racist housing policies. The Kingfisher Institute of Creighton University in Omaha, where a white mob lynched a black laborer falsely accused of assaulting a white woman in 1919, recently sponsored a lecture by Richard Rothstein on how federal housing policies built a residential racial divide across the country. In June, the Black History Commission of Arkansas hosted a symposium which in part focused on how survivors and the black community recovered from the Phillips County, Arkansas, race massacre. 

 

As these commemorations demonstrate, 1919’s racial violence—which, as the Newberry Library observes about Chicago, “barely register[s] in the city’s current consciousness”—is receiving in-depth attention to ensure that present and future Americans understand how a century-old event shaped the cities, and nation, where they live now.

Trump’s Tariff War Resembles the Confederacy’s Failed Trade Policies

 

Current efforts by the United States to put tariff pressures on China resemble the Confederacy’s efforts to pressure Great Britain during the American Civil War. In the early 1860s the Confederate leaders’ strategy backfired, damaging the southern economy and weakening the South’s military. Recent developments in the tariff fight with China suggest that President Trump’s strategy could backfire as well. America’s tariff negotiators should consider lessons from the record of Confederate missteps.

 

In the Confederates’ dealings with Britain and the Trump administration’s tensions with China, booming economies gave advocates of a tough negotiating stance exaggerated feelings of diplomatic influence. Southerners thought the robust cotton trade provided a powerful weapon in their efforts to secure official recognition from the British. President Trump expresses confidence that America’s flourishing economy can withstand a few temporary setbacks in order to win major trade concessions from the Chinese. In both cases, leaders failed to recognize that their gamble had considerable potential for failure.

 

During the 1850s, southern cotton planters enjoyed flush times. Sales of cotton to English textile mills brought huge profits to the owners of cotton plantations. “Our cotton is . . . the tremendous lever by which we can work out destiny,” predicted the Confederate Vice President, Alexander Stephens. Southerners thought English textile factories would have to shut down if they lost access to cotton produced in the United States and that closures would leave thousands of workers unemployed. To the Confederates, it seemed that the English would have no choice but to negotiate with them. Britain needed “King Cotton.” The Confederate government did not officially sponsor a cotton embargo, but the southern public backed it enthusiastically.

 

Presently, a booming economy has put the Trump administration in a strongly confident mood, too. President Trump’s advisers and negotiators on tariff issues, Peter Navarro and Robert Lighthizer, hope China will buckle under American pressure. They expect tariffs on China’s exports will force a good deal for the United States. President Trump encouraged the tariff fight, asserting trade wars are “easy to win.”

 

Economic pressures hurt the British in the 1860s and the Chinese recently, but in both situations coercive measures led to unanticipated consequences. During the American Civil War, some textile factories closed in Britain or cut production, yet British textile manufacturers eventually found new sources of cotton in India, Egypt, and Brazil. Now the Chinese are tapping new sources to replace lost trade with the United States. China is buying soybeans from Brazil and Argentina, purchasing beef from Australia and New Zealand, and expanding commercial relationships with Canada, Japan, and Europe.

 

The failed strategy of embargoing cotton represents one of the great miscalculations of the South’s wartime government. If the Confederacy had continued selling cotton to the English during the early part of the Civil War – before the Union navy had enough warships to blockade southern ports – it could have acquired precious revenue to purchase weapons of war. The absence of that revenue contributed to a wartime financial crisis. Inflation spiked. Economic hardship damaged morale on the home front. Many Confederate soldiers deserted after receiving desperate letters from their wives. Fortunately for African Americans, ill-conceived Confederate diplomacy speeded the demise of slavery.

 

Many economists now blame President Trump's trade fights with China and several other nations for volatility in stock markets. They attribute a recent global slowdown in commerce largely to President Trump's protectionist policies. More troublesome, though, may be the long-term consequences of the administration's policy. Much like the South's foolish cotton embargo, America's tariff war is forcing the Chinese to seek commercial ties with other countries. China appears to be moving away from close relationships with American business.

 

That shift could prove costly. American prosperity in recent decades owes much to commerce with China and the eagerness of Chinese investors to buy American stocks and bonds, including U.S. government debt. If the present conflict over tariffs leads to reduced Chinese involvement in American trade, the Trump administration's risky strategy may be a reiteration of the Confederates' foolish gamble on the diplomatic clout of King Cotton.

When Republicans Encouraged Immigration

 

Last month, President Donald Trump celebrated his narcissistic Fourth of July at the Lincoln Memorial. It is not surprising that  Trump chose this location as he makes it a point to invoke Abraham Lincoln’s name whenever it suits his purposes or his distorted view of history.  Apparently, however, he is totally unfamiliar with Lincoln’s legacy on immigration.    

 

Whereas Trump espouses a “go back where you came from” ideology and wants to slam the door shut on immigrants, Abraham Lincoln consistently articulated an economic philosophy that relied heavily upon immigrant labor. 

 

In his earliest speeches, Lincoln saw immigrants as farmers, merchants, and builders who would contribute mightily to the nation's economic future. 

"I again submit to your consideration the expediency of establishing a system for the encouragement of immigration. . . .While the demand for labor is thus increased here, tens of thousands of persons, destitute of remunerative occupation, are thronging our foreign consulates and offering to immigrate to the United States if essential, but very cheap assistance, can be afforded them. It is very easy to see that under the sharp discipline of the Civil War, the nation is beginning a new life. This noble effort demands the aid and ought to receive the attention of the Government."

This is far from Trump's characterization of immigrants as criminals, rapists, and drug dealers.

 

Whereas Trump has fought the Courts, his own Justice Department,  and just about everyone else to keep immigrants out, Lincoln’s endorsement resulted in the first, last, and only bill in American history to actually encourage immigration.  

 

Symbolically and appropriately, Lincoln's Act to Encourage Immigration became law with Lincoln's signature on July 4, 1864, exactly 155 years before Trump's narcissistic celebration of the holiday this past year. President Lincoln's message and legislation on this subject seemingly began a wave of support for federal and other action to encourage immigration which lasted for decades.

 

So strong was feeling on this matter that the 1864 platform of the Republican Party (running as the Union Party) noted, "That foreign immigration, which in the past has added so much to the wealth, development of resources, and increase of nations, should be fostered and encouraged by a liberal and just policy." Compare that to the Republican Party of today!  

 

The Bill was amended and strengthened after Lincoln stated in his Annual Message to Congress on December 6, 1864, "The act  . . . seems to need Amendment, which will enable the officers of the government to prevent the practice of frauds against the immigrants while on their way and on their arrival in the port, so as to secure them here a free choice of vocations and places of settlement."   Now envision the detention camps of the Trump era.  

 

Lincoln’s death and the act’s ultimate repeal could not erase the effect it had upon immigration. Its influence foretold the massive flow of immigration to the U.S. in the following decades. The secondary effects of the act, such as the popularization abroad of another of Lincoln's landmark laws, the Homestead Act, encouraged thousands of immigrants to settle as farmers in the Midwest and West. 

 

Though he did not live to see the completion of his dream, Lincoln deserves credit for initiating a plan that embodied Emma Lazarus's words long before they were memorialized on the Statue of Liberty. Trump, meanwhile, represents the repudiation of those words.

 

Lincoln’s humble origins were never far from his thoughts, and his conviction that everyone deserved the opportunity America affords permeated virtually all of his thinking. Trump, by contrast, born with a silver spoon in his mouth, stands in direct opposition to Lincoln’s background and compassion. Donald Trump seemingly scorns the common man and the middle class. He reviles the Constitution and the Declaration of Independence, whereas Lincoln stated quite explicitly, if not prophetically: 

“Wise statesmen as they were, they knew the tendency of prosperity to breed tyrants, and so they established these great self-evident truths, that when in the distant future some man, some faction, some interest, should set up the doctrine that none but rich men, or none but white men, were entitled to life, liberty, and pursuit of happiness, their posterity might look up again to the Declaration of Independence and take courage to renew the battle which their fathers began – so that truth, justice, mercy, and all the humane and Christian virtues might not be extinguished from the land; so that no man would hereafter dare to limit and circumscribe the great principles on which the temple of liberty was being built.”

Already prone to depression, Lincoln would be sent by Trump’s America into the “hypo” (for hypochondriasis), as he called it. He would be appalled that the Republican Party stands idly by and refuses to hold Trump accountable for trampling virtually all that Lincoln stood for.

 

In short, the marble Lincoln sitting majestically behind Trump during last month’s Fourth of July event weeps for what has been lost.  

]]>
Sat, 17 Aug 2019 18:13:04 +0000 https://historynewsnetwork.org/article/172709 https://historynewsnetwork.org/article/172709 0
The Professor Who Was Ostracized for Claiming the Civil War Was About Slavery – In 1911

The Battle of Williamsburg, Kurz and Allison

 

Sometimes when we’re poking around in an archive, we come across century-old documents that are strangely relevant. That’s what happened to me with the story of Enoch Marvin Banks. An aging letter from 1911 that I found in the Columbia University archive revealed a story that could be in today’s headlines: people in the Jim Crow South trying to capture the memory of the Civil War for political gain.

 

My main research involves Progressive-Era economic thought, and John Bates Clark was one of America’s foremost economists. Sifting through his papers, I came across the usual letters of economic theories and perspectives, but then something unexpected: A long letter from Enoch Marvin Banks dated April 2, 1911 (the quotes below come from this letter). Banks was a professor of history and political economy at the University of Florida, and he seemed distressed. He was “being violently assailed”, evidently over an article he’d written. I didn’t have the article at the time, but I could understand its context from the hints Banks gave. Basically, Banks had committed the crime of blaming the Civil War on slavery. Southern leaders, he stated, had made “a grievous mistake in failing to formulate plans for the gradual elimination of slavery from our States.” In his view, wise leadership would have ended slavery slowly, kept the union intact, and avoided the catastrophe of civil war.

 

With a Google search, I later found the article in question, “A Semi-Centennial View of the Civil War,” in The Independent (Feb. 1911). Upon reading it, I discovered that Banks was even more explicit in print: “The fundamental cause of secession and the Civil War, acting as it did through a long series of years, was the institution of Negro slavery” (p. 300). Banks didn’t stop there. He attacked the South’s leadership as well, praising Abraham Lincoln and criticizing Jefferson Davis as a statesman of “distinctly inferior order” (303). Such views were incendiary in the Jim Crow South, and the cause of Banks’ distress.

 

Banks’ views touched off a firestorm in his native South (he was born in Georgia and spent most of his life in the South). Confederate veterans’ groups responded with widespread criticism. Banks included a clipping from the United Confederate Veterans Robert E. Lee Camp No. 58 in his letter to Clark. The clipping excoriated the University of Florida for employing a staff member who sought to “discredit the South’s intelligence and to elevate the North and to falsify history.” “Shall such a man continue in a position as teacher where he will sow the seeds of untruth and make true history a falsifier?” they asked. The veterans demanded Banks be removed from the university and replaced with “a man who will teach history as it is and not mislead and poison the minds of the rising generation.”

 

As Banks told Clark, he simply couldn’t stand the controversy and pressure. He obliged these demands by resigning from the university and retreating back to Georgia. He died only a few months later. Some suspected that the strain of the ordeal worsened his already frail health and hastened his death.

 

This moment reflected the ongoing battle over the legacy of the Civil War and the ideology of the Jim Crow South. As Banks wrote his article, the South was building and codifying its system of racial segregation. Part of this project involved capturing the war’s historical memory. Confederate leaders had to be presented as noble warriors fighting for a lost cause. Jefferson Davis, who was attacked then and now for incompetence, was “one of the noblest men the South ever produced,” according to the Confederate veterans’ group. That’s why they blamed Banks for distorting history, as he challenged the history that was being constructed. As Fred Arthur Bailey wrote in one of the few articles dedicated to this affair: “This tragic incident was but a small part of a large, successful campaign for mind control. Self-serving, pro-Confederate historical interpretations accomplished their purposes” (17). I can’t help agreeing with Bailey’s conclusion.

 

This ordeal seems to me a perfect example of how history becomes a battlefield. It’s no secret that the historical memory of the Civil War became contentious almost as soon as the war ended. In a world where debates about Confederate statues and flags frequently make headlines, I can only conclude that the battle is very far from over.

]]>
Sat, 17 Aug 2019 18:13:04 +0000 https://historynewsnetwork.org/article/172695 https://historynewsnetwork.org/article/172695 0
JSTOR Interview Archive Helps Preserve History

History gets lost every day. Every time someone who witnessed -- or made -- history passes away, we lose their perspective unless it has been captured in memoirs or recordings. Documentarians like Peter Kunhardt of the Kunhardt Film Foundation play an important role in retaining this witnessed history: by recording interviews with historical figures and then crafting these into a narrative, they amplify these voices as they share their perspectives with a broader audience. 

But even documentaries still miss something important. When, for example, Kunhardt interviewed Jesse Jackson for King in the Wilderness, his HBO documentary about Martin Luther King, Jr.’s final three years, he recorded over 90 minutes of Jackson’s recollections and perspectives, but only used a small portion for the final film.  Kunhardt himself says, “…no matter how fascinating the interview, important information is edited out of the final project.” In all, for his 2018 film, Kunhardt recorded more than 31 hours of interviews with 19 people who witnessed and made history with King. Each uncut interview is a treasure trove of important, witnessed history, much of which ends up on the cutting room floor. What if we could preserve these full-length interviews for future generations and use technology to make them even more useful for education?

Introducing the Interview Archive, a Prototype by JSTOR Labs

Now we can. I am pleased to announce the release of the Interview Archive. This prototype includes all 19 uncut interviews filmed for King in the Wilderness – interviews with civil rights leaders like Marian Wright Edelman, John Lewis and Harry Belafonte, who made history alongside King. We didn’t stop there, though: the site also helps researchers and students explore the rich material. Each minute of each interview is tagged with the people, places, organizations and topics being discussed. Users can follow these tags to explore the different perspectives on over a thousand topics. They can also click on the tags while watching the interviews for background information from Wikipedia, to find and jump to other mentions of the topic in the Interview Archive, or to find scholarly literature and historical images related to the topic in JSTOR and Artstor.  
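For readers who think in data structures, a minimal sketch of the idea behind minute-level tagging might look like the following. It is purely illustrative and not JSTOR Labs’ actual data model; the class, field, interviewee and topic values are hypothetical examples.

```python
# Illustrative only: a toy representation of minute-level interview tags,
# plus a lookup that finds every tagged minute discussing a given topic.
from dataclasses import dataclass, field

@dataclass
class Segment:
    interviewee: str
    minute: int                                   # offset into the uncut interview
    tags: set[str] = field(default_factory=set)   # people, places, organizations, topics

def mentions(segments: list[Segment], topic: str) -> list[Segment]:
    """Return every tagged minute, across all interviews, that discusses `topic`."""
    return [s for s in segments if topic in s.tags]

# Hypothetical example data: jump to every minute tagged with a chosen topic.
archive = [
    Segment("Jesse Jackson", 12, {"Martin Luther King, Jr.", "Poor People's Campaign"}),
    Segment("John Lewis", 7, {"Selma", "Martin Luther King, Jr."}),
]
for seg in mentions(archive, "Poor People's Campaign"):
    print(f"{seg.interviewee}, minute {seg.minute}")
```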

 

 

The site is a fully-functioning prototype built by JSTOR Labs, a team at the digital library JSTOR that builds experimental tools for research and teaching. At this point, it contains the source interviews from a single documentary; enough, we think, to convey the concept and useful if you happen to be teaching or researching this specific topic. Our aim in releasing this prototype is to gauge interest in the idea.  We hope that historians and teachers will reach out to us at labs@ithaka.org with their thoughts on this concept as well as what material they would most like to see in an expanded site.

Most importantly, we are thrilled to be able to share and preserve the full-length interviews from King in the Wilderness. These interviews belong in the hands of educators, students and researchers, helping to keep this history from being lost.

]]>
Sat, 17 Aug 2019 18:13:04 +0000 https://historynewsnetwork.org/article/172696 https://historynewsnetwork.org/article/172696 0
PubMed Central Offers a Historical Treasure Trove  

Where can you freely read, download, text mine, and use for your research and teaching the full text of millions of historically-significant biomedical journal articles spanning three centuries, alongside millions more current biomedical journal articles? Look no further than PubMed Central (PMC) of the National Library of Medicine (NLM), National Institutes of Health. The NLM is the world’s largest biomedical library and one of the twenty-seven institutes and centers which constitute NIH, whose main campus is located in Bethesda, Maryland.

 

The NLM launched PMC in early 2000 as a digital counterpart to the library’s extensive print journal collection. In 2004, the NLM joined with Wellcome (a London-based biomedical charity, and one of the world’s largest providers of non-governmental funding for scientific research), Jisc (a UK-based non-profit membership organization providing digital solutions for UK education and research), and a number of medical journal publishers in agreeing that medical journals contain valuable information for research, education, and learning, and that journal archives should therefore be digitized and made freely available to all who wish to consult them. Two years later, that agreement yielded public access to the full text of 160 journals spanning nearly two centuries. More recently, the NLM completed a multi-year partnership with Wellcome to expand the historical corpus of PMC with dozens more titles encompassing three centuries and hundreds of thousands of pages. You will find a hyperlinked list of these titles at the end of this article; clicking on each title will take you to its associated, digitized run in PMC.

 

While medical journals have always been invaluable resources, their digitization increases their accessibility and creates new opportunities to realize their research value. PMC makes available the machine-readable full text and metadata of the digitized journal articles, including titles, authors (and their affiliations where present), volume, issue, publication date, pagination, and license information. Such article-level digitization also enables us to link data, that is, to connect individual and associated articles with corresponding catalog records, and sometimes even Digital Object Identifiers (DOIs), to improve discovery and use of the articles by interested researchers.

 

In writing about one of these newly-available titles—namely The Hospital, a journal published in London from 1886 to 1921—on the popular NLM History of Medicine Division blog Circulating Now, Dr. Ashley Bowen observed that “For researchers interested in the administration of British hospitals in the late 19th and early 20th century, [this journal] is a vital resource.” The Hospital “carried the tag-line ‘the modern newspaper of administrative medicine and institutional life,’ [and] published an enormous variety of items of interest to physicians, nurses, hospital administrators, and public health professionals—everything from medical research to notes on fire prevention and institutional kitchen management, reflections on ‘the dignity of medicine,’ opinions about housing policy, and much more.” Inspired by Dallas Liddle’s recent research, which “used [digitized] file size as a way to identify the rate of change in Victorian newspapers,” Bowen downloaded and analyzed every article in the entire run of The Hospital—including all the file names and file sizes—to study the changing content, trends, and sheer volume of this important journal over time, to appreciate the metadata created in the process of digitization, and to evaluate this metadata “in addition to…traditional content analysis.” 
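Readers who want to try this kind of metadata tallying can sketch it in a few lines. The example below is illustrative only (it is not Bowen’s actual workflow): it counts, year by year, how many PMC records a journal contributes, using NCBI’s public E-utilities search endpoint. The journal field tag, date parameters, and example journal name are assumptions to verify against the E-utilities documentation before relying on them.

```python
# A rough, illustrative sketch: tally PMC records per year for one journal
# via the public NCBI E-utilities search endpoint.
import time
import requests

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pmc_count(journal: str, year: int) -> int:
    """Number of PMC records matching `journal` with a publication date in `year`."""
    params = {
        "db": "pmc",
        "term": f'"{journal}"[Journal]',   # assumed field tag; check the E-utilities docs
        "mindate": str(year),
        "maxdate": str(year),
        "datetype": "pdat",                # filter on publication date
        "retmode": "json",
    }
    reply = requests.get(ESEARCH, params=params, timeout=30).json()
    return int(reply["esearchresult"]["count"])

if __name__ == "__main__":
    for yr in range(1886, 1922):           # The Hospital ran from 1886 to 1921
        print(yr, pmc_count("The Hospital", yr))
        time.sleep(0.4)                    # pause between requests to respect NCBI rate limits
```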

 

Bowen has also used PMC’s historical corpus to research Alfred Binet’s early 20th-century intelligence tests through The Psychological Clinic, and has drawn on the Bristol Medico-Chirurgical Journal and its semi-regular series of articles about “Health Resorts in the West of England and South Wales.” 

 

Understandably, given the sheer size and scope of the overall PMC corpus, Bowen’s studies only scratch the surface of the archive which currently encompasses nearly 5.5 million full-text articles. Nearly 500,000 of those articles were published in 1950 or earlier and over 1 million articles date from 1951-1999. 

 

Among the millions of articles you will find alongside those surfaced by Bowen are:

  • Sir Alexander Fleming’s discovery of the use of penicillin to fight bacterial infections, which appeared in the British Journal of Experimental Pathology, 1929
  • Sir Richard Doll’s groundbreaking study that confirmed that smoking was a “major cause” of lung cancer, which appeared in the British Medical Journal, 1954; 
  • Walter Reed’s paper proving that mosquitoes transmit yellow fever, which appeared in the Journal of Hygiene, 1902;  
  • reports of centralized health and relief agencies in Massachusetts during the 1918 influenza pandemic; 
  • an appeal for justice by Arthur Conan Doyle, related to the infamous case of the Parsi English solicitor George Edalji, which reflected contemporary racial prejudice;
  • a medical case report on America’s 20th president James A. Garfield, following his assassination in 1881; 
  • post-World War II thoughts about the future of the Army Medical Library by its director Frank Rogers; and 
  • a paper by the bacteriologist Ida A. Bengtson, the first woman to work in the Hygienic Laboratory of the U.S. Public Health Service, the forerunner of the National Institutes of Health. 

 

So, if we haven’t already tempted you to explore PMC for your own research and teaching—and explore its Open Access Subset and Historical OCR Collection, both ideal for text mining—what are you waiting for? Dive in! Encourage your colleagues and students to explore it. Be in touch and let us know what you discover in PMC. We would love to hear from you!

 

List of the historical journal titles made available freely in PMC  through the multi-year partnership between Wellcome and the NLM/NIH. 

Clicking on each title will take you to its associated, digitized run in PMC.  

 

Learn more about PMC and the partnerships dedicated to growing its freely-available historical content:

“Public-Private Partnerships: Joining Together for a Win-Win,” Jeffrey S. Reznick and Simon Chaplin, The Public Manager, December 9, 2016.

“PubMed Central: Visualizing a Historical Treasure Trove,” Tyler Nix, Kathryn Funk, and Erin Zellers, Circulating Now, the blog of the National Library of Medicine’s History of Medicine Division, February 23, 2016.

]]>
Sat, 17 Aug 2019 18:13:04 +0000 https://historynewsnetwork.org/article/172694 https://historynewsnetwork.org/article/172694 0
“Mr. Straight Arrow,” John Hersey, and the decision to drop the atomic bomb

John Hersey

 

Roy Scranton’s “How John Hersey Bore Witness” (The New Republic, July-August 2019) is an insightful look at a new book on one of my favorite authors. It touched all the right notes and has prompted me to add Mr. Straight Arrow to my “Christmas list.” Sadly, in the midst of this otherwise fine review, author Scranton repeats the discredited old chestnut that President Harry S. Truman dropped atomic bombs on Hiroshima and Nagasaki even though he knew Japan was trying to surrender. Truman’s real reason for using the weapons, according to Scranton, was to employ them as a diplomatic club against the Soviet Union. This allegation was popular in some quarters during the 1960s and 70s, but it was sustained only by a systematic falsifying of the historical record, and it continues to pop up even today.

Underscoring this sad fact is the link Scranton provides, which takes readers to a 31-year-old letter to the New York Times from Truman critic Gar Alperovitz purporting that “dropping the atomic bomb was seen by analysts at the time as militarily unnecessary.” Presented in the letter is an interesting collection of cherry-picked quotes from a variety of diary entries and memos by contemporaries of Truman, such as Dwight D. Eisenhower. All are outtakes that have long since been rebutted or placed in their actual contexts. Even key figures are misidentified. For example, FDR’s White House chief of staff Admiral William D. Leahy, who chaired the meetings of the Joint Chiefs of Staff, is elevated in the letter to the position of Chairman of the Joint Chiefs.

As for the notion that Japan was trying to surrender, this is not what America’s leaders saw as they read the secretly decrypted internal communications of their counterparts in Japan. In the summer of 1945, Emperor Hirohito requested that the Soviets accept a special envoy to discuss ways in which the war might be “quickly terminated.” But far from a coherent plea to the Soviets to help negotiate a surrender, the proposals were hopelessly vague and recognized by both Washington and Moscow as no more than a stalling tactic ahead of the Potsdam Conference to prevent Soviet military intervention --- an intervention that Japanese leaders had known was inevitable ever since the Soviets’ recent cancellation of their Neutrality Pact with Japan.

Japan was not trying to surrender. Even after the obliteration of two cities by nuclear weapons and the Soviet declaration of war, the militarists in firm control of the Imperial government refused to admit defeat until their emperor finally forced the issue. They had argued that the United States would still have to launch a ground invasion and that the subsequent carnage would force the Americans to sue for peace, leaving much of Asia firmly under Japanese control. The war had started long before Pearl Harbor with the Japanese invasion of China, and millions had already perished. That Asians in the giant arc from Indonesia through China --- far from Western eyes --- were dying by the hundreds of thousands each and every month that the war continued has been of zero interest to Eurocentric writers and historians, be they critics or supporters of Truman’s decision. As for the president himself, Truman rightfully hoped, after the bloodbaths on Okinawa and Iwo Jima, that atomic bombs might force Japan’s surrender and forestall the projected two-phase invasion which would result in “an Okinawa from one end of Japan to the other.”

Hersey understood this well. Fluent in Chinese (he was born and raised in China), Hersey was painfully aware of the almost unimaginable cost of the war long before the United States became involved and, after it did, observed the savagery of battle on Guadalcanal firsthand. Yes, he understood it quite well, and it will come as a surprise, even a shock, to many that neither Hersey nor Truman saw Hiroshima as an indictment of the decision to use the bomb. Those were very different times, and the prevailing attitude, according to George M. Elsey, was “look what Japanese militarism and aggression hath wrought!” (Truman also made similar observations when touring the rubble of Berlin during the Potsdam Conference.)

The president considered Hiroshima an “important” work and, far from being persona non grata, Hersey would sometimes spend days at a time in Truman’s company when preparing articles for The New Yorker. This level of access was not accorded to other journalists, and circumstances resulted in Hersey sitting in on key events, such as the moment Truman learned that the Chinese had just entered the Korean War and a secret meeting with Senate leaders over the depredations of Joe McCarthy. Although exceptions can be found in the literature, Hersey’s Hiroshima was simply not viewed in the postwar period as an anti-nuclear polemic, and Elsey, who served in both the Roosevelt and Truman administrations before going on to head the American Red Cross for more than a decade, remarked to David McCullough that “It’s all well and good to come along later and say the bomb was a horrible thing. The whole goddamn war was a horrible thing.”

Scranton himself gives a brief nod to this fact, admitting that the midnight conventional firebombing of Tokyo earlier that year killed even more people, approximately 100,000, yet one shudders to think what he teaches his unsuspecting students. The “revisionist” Japanese-were-trying-to-surrender hoax prominently recited in his review of Mr. Straight Arrow has long been consigned to the garbage heap of history by a host of scholarly books and articles,* including, ironically, a brilliant work by one of Scranton’s own colleagues at Notre Dame. Father Wilson D. Miscamble’s The Most Controversial Decision: Truman, the Atomic Bombs, and The Defeat of Japan (Cambridge University Press, 2011) is a hard-hitting, well-researched effort that is especially notable for its thoughtful exploration of the moral issues involved. Though Scranton and Miscamble share the same campus, a colleague of mine maintains that the two scholars have never met. Perhaps they should get together for coffee some morning.

--------------------------

* Six particularly useful works are: Sadao Asada’s award-winning “The Shock of the Atomic Bomb and Japan’s Decision to Surrender -- A Reconsideration,” Pacific Historical Review, 67 (November 1998); Michael Kort, The Columbia Guide to Hiroshima and the Bomb (New York: Columbia University Press, 2007) and “The Historiography of Hiroshima: The Rise and Fall of Revisionism,” The New England Journal of History, 64 (Fall 2007); Wilson D. Miscamble C.S.C., The Most Controversial Decision: Truman, the Atomic Bombs, and the Defeat of Japan (New York: Cambridge University Press, 2011); Robert James Maddox, ed., Hiroshima in History: The Myths of Revisionism (Columbia, Missouri: University of Missouri Press, 2007); Robert P. Newman, Enola Gay and the Court of History (New York: Peter Lang Publishing, 2004).

]]>
Sat, 17 Aug 2019 18:13:04 +0000 https://historynewsnetwork.org/article/172708 https://historynewsnetwork.org/article/172708 0
How Two Regicides Escaped to America and Became Folk Heroes

 

 

If you went to school in the USA in the late nineteenth or early twentieth centuries, chances are you would have been taught about Edward Whalley and William Goffe. Ask American – or indeed British – schoolchildren today about Whalley and Goffe and you will almost certainly be met with blank stares. This journey from widespread recognition to relative historical obscurity is perhaps surprising when we learn that Whalley and Goffe were significant colonial American figures who had been directly involved in one of the most seismic events in British history: they were the fourth and fourteenth signatories of the death warrant of King Charles I, the only king in British history to be lawfully tried and put to death.

 

But what also makes them stand out from the other fifty-seven signatories is that they were eventually lauded as American folk heroes, dominant figures in early American literature, and American proto-revolutionaries – men some would consider Founding Grandfathers. They earned this reputation because, when the British monarchy was restored in 1660, Whalley and Goffe fled to America where they continued to uphold the principle of revolutionary republicanism against tyrannical monarchy.

 

To achieve a peaceful and successful Restoration, Charles II and his courtiers understood the need for widespread forgiveness for many of those who had been involved in the events that led to Charles I’s death – increasing intransigence between king and parliament in the 1630s and civil wars in the 1640s – and those who had been involved in the constitutional experiments of the English Republic in the 1650s. But there were some key figures that had been simply too closely involved in the regicide, most notably the surviving signatories of the king’s death warrant, to be forgiven and their actions forgotten.

 

Some of these regicides remained in England, confident that their executions would secure their martyrdom, provide a chance to gain sympathisers while re-vocalising their commitment to Oliver Cromwell’s Good Old Cause, and to ensure eternity sitting at the right hand of God. But others preferred to keep promoting the cause by staying alive and taking their chances by fleeing: some to France, Germany, the Netherlands or Switzerland, or – like Whalley and Goffe – to America.

 

The American colonies were predisposed to be sympathetic to these two devoutly Puritan regicides who had devoted their energies to working against the Stuart dynasty – the religious policies of which had famously spurred early generations of colonists to cross the Atlantic. Indeed, Whalley and Goffe were openly welcomed in Boston and Cambridge, Massachusetts, and even when warrants arrived from England making it clear that the regicides were wanted men, the sincerity of the colonial authorities’ attempts to capture Whalley and Goffe was open to question. While individuals like Governor Endecott of Massachusetts Bay made overtures to suggest that they were being earnest in their pursuit of the king killers, their delayed and ineffectual actions suggested quite the opposite.

 

The closest that Whalley and Goffe came to being discovered, for example, was in 1661 when Endecott appointed two hapless bounty hunters, a merchant and a ship’s captain: Thomas Kellond and Thomas Kirke. They spent a couple of weeks being outwitted by the colonial authorities before being given generous grants of land, perhaps to buy them off and discourage any further pursuit. Whalley and Goffe spent the remaining fifteen to twenty years travelling around the New England colonies, residing in New Haven, Hadley and Hartford, hiding from a threat that, it could be argued, never really existed.

 

After a flurry of ineffectual activity against the regicides in the year or two following  the Restoration, it became clear that Charles II’s imperial pragmatism was taking precedence over an ideological witch-hunt. Aside from the potential repeated embarrassment of sending over agents to capture Whalley and Goffe, haplessly chasing them through unknown territory that the pair and their protectors inevitably knew more intimately, Charles II and his courtiers had to tread carefully with their newly regained American colonies.

 

It was economically and politically unwise to alienate potentially lucrative trading partners in the Atlantic basin by encroaching on the colonists’ liberties for the sake of capturing two anxious and ageing Puritans. Furthermore, Charles II was inevitably distracted by developments back home, from the incendiary religious politics of the post-civil war era to the natural disasters of the mid-1660s. While Whalley and Goffe were three thousand miles away cowering in basements, hiding in caves, and eking out an existence of a little trade and a lot of prayer, the urgency to capture and execute them gradually receded.

 

This did not stop the author of the first full-length history of Whalley and Goffe, Ezra Stiles, from reinterpreting their story to portray them as brave, thrusting revolutionaries acting as heralds from an earlier age. Stiles was researching and writing in the 1790s, in the aftermath of the American and French Revolutions, and he enthusiastically wanted to see the regicides as men sowing the seeds of such visionary ideas over a hundred years earlier.

 

To do so, he had to rely on oral histories. And since fugitives don’t tend to leave a detailed trail of documentary evidence, this inevitably led to distortion and excited exaggeration. Individuals retold debatable stories from earlier generations who were determined to associate themselves, their families, and their locality with the story of these heroic men on the run. It also helped Stiles’s cause that the most dramatic story involving the regicides – the tale of Goffe, the ‘Angel of Hadley’, appearing from nowhere to protect the colonists from the indigenous population in King Philip’s War – was probably true. 

 

It was on this basis that the majority of novels, plays, poems and paintings that featured the regicides in the nineteenth century portrayed them as heroic champions of liberty defeating the pantomime-villain efforts of the tyrant Charles II and his sneering courtiers. Such a caricature was naturally attractive to schoolchildren growing up with the tale of two obscure Englishmen whose breathlessly heroic actions could be seen as joining the teleological dots between the English and American Revolutions. Looking beneath the veneer of this simplistic image might undermine that narrative, but it restores the more fascinating truth of two men whose actions represented something far, far greater.

]]>
Sat, 17 Aug 2019 18:13:04 +0000 https://historynewsnetwork.org/article/172697 https://historynewsnetwork.org/article/172697 0
FDR's Token Jewish Refugee Shelter

Cartoon by Stan MacGovern in the New York Post, June 1, 1944

 

Seventy-five years ago today, a ship sailed into the New York harbor, carrying more than 900 European Jewish refugees. Unlike similar ships that had approached America’s shores in the 1930s, the S.S. Henry Gibbins was not turned back. Instead, the passengers were taken to an abandoned army camp in upstate New York, where they spent the rest of the war in safety, far from Hitler’s clutches.

 

Why did President Franklin D. Roosevelt permit this group of Jewish refugees to enter the United States? What had changed since the years when other ships were turned away?

 

By the autumn of 1943, news of the mass murder of Europe’s Jews had been verified by the Allies and widely publicized in the United States, although major newspapers often buried it in the back pages. There could be no doubt that at least several million Jews had been slaughtered by the Germans and their collaborators, and many more were still in danger.

 

President Roosevelt and his administration insisted that nothing could be done to help the Jews except to win the war. Others disagreed. In October 1943, U.S. Senator Warren Barbour, Republican of New Jersey, introduced a resolution calling on the president to admit 100,000 Jewish refugees “for the duration of the war and six months thereafter.” The resolution was endorsed by both the National Democratic Club and the National Republican Club.

 

Granting temporary refuge was a way of addressing the life-and-death crisis that Europe’s Jews faced, without incurring the wrath of those who opposed permanent immigration. The Jews who were saved would go back to Europe, or elsewhere, when the war ended.

 

Sen. Barbour tragically passed away just a few weeks later, but the idea of temporary refuge gained traction. In early 1944, a proposal for temporary havens was presented to President Roosevelt by the U.S. government’s War Refugee Board (a small, underfunded agency recently created by FDR under strong pressure from Jewish groups and the Treasury Department).

 

“It is essential that we and our allies convince the world of our sincerity and our willingness to bear our share of the burden,” wrote Josiah E. DuBois, Jr., a senior official of the War Refugee Board, in a memo to Roosevelt. The United States could not reasonably ask countries bordering Nazi-occupied territory to take in refugees if America itself would not take any, DuBois argued.

 

The president was reluctant to embrace the plan; he had previously confided to his aides that he preferred “spreading the Jews thin” around the world, rather than admitting them to the United States. 

 

Secretary of War Henry Stimson, for his part, vigorously opposed the temporary havens proposal. In his view, Jewish refugees were “unassimilable” and would undermine the purity of America’s “racial stock.”

 

Public pressure pushed the plan forward. Syndicated columnist Samuel Grafton played a key role in this effort, by authoring three widely-published columns advocating what he called “Free Ports for Refugees.”

 

“A ‘free port’ is a small bit of land… into which foreign goods may be brought without paying customs duties… for temporary storage,” Grafton explained. “Why couldn’t we have a system of free ports for refugees fleeing the Hitler terror?… We do it for cases of beans… it should not be impossible to do it for people.”

 

The activists known as the Bergson Group took out full-page advertisements in the Washington Post and other newspapers to promote the plan. Jewish organizations helped secure endorsements of “free ports” from religious, civic, and labor organizations, including the Federal Council of Churches and the American Federation of Labor. U.S. Senator Guy Gillette (D-Iowa) introduced a resolution calling for free ports; eight similar resolutions were introduced in the House of Representatives.

 

Support for the havens plan could be found across the political spectrum. The liberal New York Times endorsed it; so did the conservative Hearst chain of newspapers. Temporary refuge was fast becoming a consensus issue.

 

With public pressure mounting, the White House commissioned a private Gallup poll to measure public sentiment. It found that 70 percent of the public supported giving “temporary protection and refuge” in the United States to “those people in Europe who have been persecuted by the Nazis.”

 

That was quite a change from the anti-immigration sentiment of earlier years. But circumstances had changed, and public opinion did, too. By 1944, the Great Depression was over and the tide of the war had turned. The public’s fear of refugees had diminished significantly, and its willingness to make humanitarian gestures increased. 

 

Despite this overwhelming support for temporary refuge, President Roosevelt agreed to admit just one token group of 982 refugees. And he did not want them to be all Jews; FDR instructed the officials making the selection to “include a reasonable proportion of the [various] categories of persecuted minorities.” (In the end, 89% were Jewish.)

 

Ironically, the group was so small that they all could have been admitted within the existing immigration quotas. There was no need for a special presidential order to admit them, since the regular quotas for citizens of Germany and German-occupied countries were far from filled in 1944. In fact, they were unfilled in eleven of FDR’s twelve years in the White House, because his administration piled on extra requirements and bureaucratic obstacles to discourage and disqualify refugee applicants.

 

Of the 982 refugees whom the president admitted in August 1944, 215 were from Germany or Austria. Yet the German-Austrian quota was less than 5% filled that year. The second largest nationality group was from Yugoslavia; there were 151 Yugoslavs in the group. That quota was less than 3% filled in 1944. There were also 77 Polish citizens and 56 Czechs; those quotas were only 20% and 11% filled, respectively. Put another way, a combined total of 39,400 quota places from those particular countries sat unused in 1944, because of the Roosevelt administration’s policy of suppressing refugee immigration far below the limits that the law allowed.

 

The S.S. Henry Gibbins arrived in the New York harbor on August 4, 1944. Ivo Lederer, one of the passengers, recalled how they cheered when the ship approached the Statue of Liberty. "If you're coming from war-time, war-damaged Europe to see this enormous sight, lower Manhattan and the Statue of Liberty--I don't think there was a dry eye on deck."

 

The refugees were taken to Fort Ontario, an abandoned army camp in the upstate New York town of Oswego. It would be the only "free port" in America. By contrast, Sweden, which was one-twentieth the size of the United States, took in 8,000 Jews fleeing from Nazi-occupied Denmark. 

 

According to conventional wisdom, most Americans in the 1940s were against admitting Jewish refugees. It is also widely assumed that members of Congress—especially the Republicans—were overwhelmingly anti-refugee, too. America’s immigration system supposedly made it impossible for President Roosevelt to allow any more Jewish refugees to enter. And American Jews allegedly were too weak to do anything about it. 

 

Yet 75 years ago this summer, those myths were shattered when a coalition of Jewish activists, rescue advocates, and congressmen from both parties, backed by a large majority of public opinion, successfully pressured FDR to admit a group of European Jewish refugees outside the quota system. 

 

Refugee advocates had hoped the United States would take in hundreds of thousands of Jews. Sadly, President Roosevelt was interested in nothing more than an election-year gesture that would deflect potential criticism. Famed investigative journalist I.F. Stone was not off the mark when he called the admission of the Oswego group “a token payment to decency, a bargain-counter flourish in humanitarianism.” 

 

]]>
Sat, 17 Aug 2019 18:13:04 +0000 https://historynewsnetwork.org/article/172707 https://historynewsnetwork.org/article/172707 0
Woodstock at 50: A Conversation with Award Winning Filmmaker Barak Goodman

On Tuesday, August 6th, PBS is set to release the newest documentary in its series American Experience. “Woodstock: Three Days That Defined a Generation” explores the legendary music festival by turning the cameras on the crowd. Emmy Award-winning filmmaker Barak Goodman and PBS tell the story of those who attended the concert, and how they endured a three-day festival with deficient infrastructure.

 

Woodstock is often represented as the embodiment of 1960s counterculture. The legendary festival remains prominent in the lore of hippie culture and the mantra of sex, drugs, and rock ’n’ roll. Filmmaker Barak Goodman explores the culture of the late 1960s and tells the story of who made Woodstock a historic experience: the audience.

 

Prior to the film’s release, HNN connected with Goodman to discuss what making the film taught him about the importance of Woodstock. Goodman highlights the distinct culture that produced the festival and the importance of communitarianism in the crowd, and examines the lessons we should carry into the future. 

 

Jonathan Montano: Before we start, I wanted to say that I learned a lot from the documentary. It was very interesting and educational. To start, let’s just talk about what you think Woodstock says about the counter culture era in general.

 

Barak Goodman: Sure, you know I think a couple of things, it’s a big question. I mean by 1969 there had been a lot of talk, a lot of sort of expression of what the counter culture was about. There were slogans – like peace and love and so forth – and I think in some ways what Woodstock showed us was that those concepts – those slogans – had a basis in reality. That this generation and these kids had taken on board these concepts and really tried to make something real of them. 

 

What makes this festival the sort of window – the lens – into the counter culture is that they had to execute these big concepts in a real way and under trying circumstances. They had to express peace and love, they had to build a new city, as they say, in order to avoid a disaster. So, I think it really made it concrete, and brought into focus, what these concepts really meant in real life.

 

That’s the greatness and magic of what happened there and really what’s so inspiring about it. When the chips were down, they acted and put their money where their mouth was. That was the saving grace of the festival.

 

Montano: Right, especially considering everything they had to go through, it really seemed as though it might’ve gone terribly.

 

Goodman: In maybe 99 of 100 cases like this it would’ve gone terribly. These people were hungry, tired, living poorly, had little help from the outside except some medical assistance from the state of New York, and of course heroic help from the surrounding communities. But essentially, they were on their own. They had only each other. They had only what was in their hearts and souls at that point to get them through. I don’t want to exaggerate – this isn’t the Donner Party – but it wasn’t a pleasant experience. Many other festivals had devolved into violence, especially with drugs and all around. But this called on their better natures, and that’s what is inspiring.

 

Montano: Yeah, I agree. I know today there are countless festivals, from Coachella and Lollapalooza to what could’ve been Fyre Festival. But was Woodstock the first of its kind? Did it pave the way for festival culture? Or is it entirely distinct?

 

Goodman: I think both. There were other festivals that preceded Woodstock but nothing on that scale. It’s become a myth, an icon, an inspiration for festivals that have come since. I think we all have this idea of Woodstock in the back of our heads when we go to a festival, but I don’t think it has or ever will be repeated. I mean you see all the anniversary Woodstock concerts try to be mounted and they all fail in comparison.

 

I think ultimately you can’t recreate the particular set of circumstances that made Woodstock unique. You’re never going to have that many people show up unexpectedly with no infrastructure for them. You’ll never have the isolation, especially now with cell phones. There would be constant communication with the outside world. There were just so many unique circumstances that make Woodstock unrepeatable. But I think it is a beacon and an inspiration for every future festival that has happened. We all want to go to Woodstock when we go to Coachella.

 

Montano: Absolutely. Would you say that the culture in general was a unique circumstance that made Woodstock happen?

 

Goodman: Yes, I do, but I want to hedge it a bit. I do think that the late 60’s were a special moment. We had a galvanizing issue in Vietnam that brought people together in a way that was almost unique. We had a whole generation ready to turn the page on their parents’ generation, wanting to be different.

 

But I would also say that we are seeing a bit of a repeat. My kids are that age, and they feel somewhat the same as my generation did about Vietnam but for them it’s global warming. In other words, we let them down and left them an unsustainable world and we’re doing nothing about it. It’s going to have to be them that does something. While I don’t think it’s exactly the same, and you wouldn’t have Woodstock, you are seeing a level of activism and a level of communitarianism, and Us vs. Them, that – for the first time, I think – does feel like the 1960s. That’s my hope.

 

Montano: Certainly. The comparison between Vietnam and global warming is really interesting.

 

Goodman: Yeah. I mean, these are existential threats. Back then it was “yeah, that could be me going over there and dying.” It’s a bit more diluted now but young people do feel as though there might not be a habitable world to live in if we don’t take the issue on ourselves; if we don’t change things. The great thing about Woodstock is that it showed, on a microcosmic level, that you can change things by banding together and being a community if you’re willing to pull in the same direction. I’m hoping this film gets seen by young people because it shows a way to move forward.

 

Montano: The sixties are often represented as the era of sex, drugs, and rock n roll, especially in pop culture. How does Woodstock add a nuance to that depiction?

 

Goodman: Right. So, like a lot of stereotypes it’s rooted in something but it’s also a stereotype, a caricature really. I certainly had that caricature going into making the film. I thought they were just hippies doing drugs, but that just trivializes what they were.

 

I think what Woodstock did, and what I hope the film does, is in some ways rehabilitate the hippie culture. It wasn’t a caricature. Don’t make fun of stoned out hippies wandering naked in a field. This was about something real. It was really beautiful. Really beautiful. It was inspiring. It’s something to aspire to, not look down on. That was a real revelation for me in making the film and made it a joyous experience. The feeling that these kids were on to something, that they had something to teach us. I love the moment in the film when Max Yasgur, someone who represents a different generation and a different point of view, gets up there and makes a very appreciative speech to the audience saying you showed us, you taught us something.

 

Montano: Which leads me to my next question: Is there anything we should take from Woodstock and apply to contemporary times? In other words, is there something we should learn not just about Woodstock, but from it?

 

Goodman: Absolutely. And I think that’s the last part of the show. It’s basically the better angels of our nature. In this dark time, we tend to give up on people, at least I do. I begin to question if people are drawn to good or not good. Light or dark. What Woodstock shows is that we have within us an enormous capacity for sharing, generosity, communitarianism, all those things. And boy is that sorely needed right now. If nothing else, it is a reminder of how much can get done by following that path, rather than divisiveness or violence.

 

I think it’s that simple. It was a beautiful moment of collective goodness prevailing over what could’ve been a very dark and destructive experience. That’s what Woodstock has to teach us.

 

Montano: The documentary emphasizes a self-policing system, mainly through the Hog Farm, and even a self-help system for drug users who had bad trips. A user would be taken care of, then they’d take care of the next bad trip. Do you think anything similar is possible today, especially considering the era of social media, hyper-security, and even the helicopter-mom mentality in place today?

 

Goodman: You did a much better job pointing to the things that Woodstock can teach us than I did. Absolutely. How brilliant was that? What a stroke of genius to understand the crowd well enough to know that a bunch of ragtag hippies from New Mexico would be better cops than armed New York policemen. That to me shows a deep, deep understanding of who these kids were and what they were all about.

 

I do think, and I’m no expert on security, but I do think that hyper militarized, Us vs. Them attitude of policing right now draws out the worst and leads to more conflict than it needs to. And I would love to see an attempt to do something much more like Woodstock, with a please force not a police force. And just to deescalate – and we all see it – particularly in confrontations between cops, and usually people of color, that get escalated so quickly, and guns get pulled out and bullets fly. But isn’t it easier to take a deep breath, realize we’re all human beings and just talk to each other? It just feels like that is more productive. That’s what that festival did and thank god they did it.

 

Montano: In a really brilliant way

 

Goodman: In such a brilliant way! God, I mean, not only did the Hog Farm supply security, but they ended up feeding everybody and taking care of overdoses. And it was because they had already figured out how to take care of each other in a communitarian way. And they knew how. It wasn’t a set of skills that many had, but they did. Stanley Goldstein was like, ok, that’s who we need here. And yes, I would love to see that attempted at events today.

 

Montano: To begin to wrap up, what did you think of Woodstock before the documentary and did your experience change that thought at all?

 

Goodman: Totally. I think I felt as most people did: that Woodstock was a great rock ’n roll concert. The original movie showed that. And it was that! But I didn’t understand the real story, which is what happens to the crowd. That was the goal of the documentary. We wanted to turn the cameras around and show the crowd. Whatever made Woodstock Woodstock was not up on the stage, it was down with the people. That was the revelation, the gee whiz moment. That’s why our film can stand next to the other brilliant one from 1971, because it’s about something totally different.

 

Montano: Exactly. When I first watched it, I expected to see things like Hendrix and the Star Spangled Banner, you know – the rock 'n roll side - the brilliance of the music. But that’s not what you showed, and I was enamored by that. And obviously I’m a lot younger than Woodstock, but everybody kind of knows it. But I only knew it as a rock concert. I had no idea, for example, that there was free admission, and that really blew my mind.

 

Goodman: Right, right. We weren’t about to try and re-do the concert movie. That original movie is so great, who would want a new one? We wanted a different one. We got the material to make it and now the two films are companion pieces.

 

Montano: Just one final question: do you think it’s possible to have a festival today where there’s free admission? Do you think anyone would concede the way Woodstock did? To me it’s just impossible to imagine.

 

Goodman: No, I don’t. Not in our current climate. First of all, you can’t have the same thing because of cell phones. There will never be that isolation again. They were on their own and they had to make it through together.

 

And the money side, I just don’t see it. It’s so improbable that the quartet of people who put this on would all be so in over their heads. They were so naïve in ways, and this just hadn’t been done. And back then if you just had money you could do it. But nowadays you get disasters like Fyre Festival. But here it was also partly the human beings. Joel and Jon were – Joel remains, Jon passed away – wonderful human beings who just didn’t put money ahead of other people. And that’s what happened.

 

But it was also the times. Not everyone was counting money all the time and figuring out profit margins. It was a capitalist venture, but it was so loose. They were just writing checks – they weren’t even keeping count. They didn’t know how in debt they were. It was just a different moment in our history and one I’m nostalgic about. Today, all the accountants would be there and with their lawyers, and law suits would be flying long before a single chord of music was played. And actually, that is happening with the 50th anniversary concert. It’s probably going to fall apart because it’s a different moment in history. Woodstock was unique and entirely special.

 

Watch “Woodstock: Three Days That Defined a Generation” on PBS Tuesday, August 6th to enjoy an incredible documentary highlighting the 50th anniversary of the historic music festival.

]]>
Sat, 17 Aug 2019 18:13:04 +0000 https://historynewsnetwork.org/article/172668 https://historynewsnetwork.org/article/172668 0
All the world’s a stage when you’re Boris Johnson basking in the no-deal-Brexit spotlight

 

In politics, it is said that you campaign in poetry and govern in prose.  Alexander Boris de Pfeffel Johnson, the boy who once said he wanted to be ‘world king,’ has finally realized his life-long ambition of becoming British Prime Minister. The real task of uniting his fractured party over Brexit – and most importantly, uniting a divided country – now lies ahead.

 

Johnson defeated Jeremy Hunt, the mild-mannered former foreign secretary, by a convincing majority – that much was never in doubt. Some 160,000 Conservative or Tory Party members – or about 0.24% of the British population – chose the next prime minister of the United Kingdom. The opposition Labour Party has already criticized Johnson for lacking legitimacy.

 

For Americans wondering how such a small minority could determine the leader of the country: in the UK, as opposed to the U.S., people vote in a general election for a party – not an individual – and the leader of the party that commands a majority in the House of Commons becomes prime minister and forms the next government.

 

This time around, there wasn’t a general election. The Tory Party remained in office despite the resignation of its leader, Theresa May. The task then was to choose a new party leader. Among the Conservative Party faithful, turnout for the leadership election was 87.4%. Jeremy Hunt managed 46,656 votes against 92,153 for Boris Johnson. Although Johnson’s majority was thumping, his share was in fact smaller than David Cameron’s (the former prime minister) when he became Tory Party leader in 2005. 

 

It is still a decisive result and the scale of the victory is important.  It gives Boris Johnson tremendous lift-off as he begins his premiership.  He campaigned on a pledge to take Britain out of the European Union by Halloween, ‘no ifs ands or buts,’ come what may, ‘do or die.’  In a way, his election is an endorsement by Tories for a so-called ‘hard Brexit’ – leaving the European Union without a trade deal.

 

His choice of cabinet ministers clearly shows that Boris Johnson is intent on delivering Brexit.  It is also a sure sign to the EU based in Brussels that there’s a new sheriff in London.  Whether Brussels is going to be moved by a cabinet of die-hard ‘Brexiteers’ in Britain is another matter altogether.

 

The contrast between Boris Johnson and his predecessor could not be sharper.  Theresa May, the daughter of a vicar, was distinctly shy and notoriously non-sociable with a deep sense of duty.  She was happy to bury her head in work behind the scenes without any fanfare.  Boris Johnson on the other hand is clearly one of the more colorful prime ministers in recent times with his fair share of controversies.  He is known for his jokes and irreverence.  His personal life has raised a lot of eyebrows.  A twenty-five-year marriage has ended amid allegations of several extra-marital affairs.  The police were recently called to an apartment he shared with his girlfriend after a neighbor claimed to hear yelling, plate-smashing and a woman shouting ‘get off me’ and ‘get out.’  Johnson has refused to answer any questions about the alleged incident.

 

Boris Johnson and Donald Trump have a few things in common.  Both were born in New York, and both are disruptors who believe in throwing out the rule book of politics and pursuing a more unorthodox style of leadership.  Both men defy political gravity and have survived controversies that would have scuppered the leadership ambitions of almost any other candidate.  Both have also been dogged by alleged infidelity, gaffes, and remarks deemed racist.

 

One thing is certain: Donald Trump believes he has an ally in Downing Street, having tweeted that Boris Johnson would make a great prime minister.  But there are some fundamental differences.  In the past, Boris Johnson described Trump as 'unfit' to hold the office of President of the United States, and while serving as foreign secretary (Britain's equivalent of secretary of state) he openly opposed Trump's Muslim ban.  It will be interesting to watch how two men with such similar personalities interact.  I am sure all sins have been forgiven.  The question now is, will the so-called 'special relationship' between Britain and America be special only for the British?  Time will tell.

 

As mayor of London, Boris Johnson presided over a 27% reduction in crime.  He also won two consecutive mayoral elections in Labour-dominated London.  He helped bring the Olympics to the capital in 2012, and his supporters say that he will be equally successful as prime minister.  His optimism and can-do spirit for Britain are infectious, and powerful in a party trounced into fourth place in the recent European elections.

 

Johnson is also well known for his rhetorical flair and an ability to use oratory to amuse, entertain and wow his audience.  Contrary to the caricature of him as a gaffe-prone bombast, Boris Johnson has a formidable intellect and is the ultimate insider-outsider – he read classics at Oxford and is the 20th prime minister to be educated at the highly selective Eton College, where Britain's elect are born and bred.  He went on to become a journalist, was fired for making up quotes, and entered Parliament in 2001.

 

Johnson is a marmite figure – you either like him or loathe him.  His critics say that being mayor of London is not the same thing as being prime minister, and that the job at Number Ten entails uniting an entire country behind a vision.  His leadership style of surrounding himself with people who can deliver, they argue, won't be enough.  Johnson is notorious for his lack of attention to detail. 

 

Banking on his gold-dust quality as a campaigner – soundbites and rhetoric alone – may not be enough to succeed in the top job.  Boris Johnson will have to focus on the minutiae of policy across the whole spectrum of government.  As mayor, he could avoid the agony of policy detail, letting his officials do the heavy lifting instead.  In Downing Street, there is no hiding place.  The buck starts and stops with him.  His party's razor-thin majority of two in Parliament – likely to dwindle to just one after a by-election – will require all the diplomatic and persuasive skills he can muster.

 

The leadership at Downing Street might have changed, but the mathematics of the House of Commons remain exactly as they were when Theresa May encountered her difficulties, ultimately leading to her resignation.  There are mutterings among quite senior Tory pro-European members of parliament that they are prepared to bring down the new government if that is what it takes to stop a no-deal Brexit.  More importantly, the House of Commons has voted resoundingly to stop a no-deal Brexit from ever happening.  

 

That makes it difficult to see how Boris Johnson can persuade the European Union to renegotiate a new Brexit deal.  When he comes to Brussels, his EU counterparts will already know that he holds only a wafer-thin majority in Parliament, and they are therefore unlikely to yield an inch. 

 

The EU has made it clear that it will not reopen the Withdrawal Agreement it negotiated with Theresa May.  The so-called 'backstop' – the guarantee of a frictionless border between Northern Ireland (part of the UK) and the Republic of Ireland (part of the EU) should no other arrangement be agreed – remains the major sticking point between London and Brussels.  How Boris Johnson threads that needle where Theresa May failed remains to be seen.

 

I predict Boris Johnson will call a general election before October 31, 2019 in a bid to win a fresh mandate, this time from the British people.  The smart money is on him winning, given that the opposition Labour Party under Jeremy Corbyn is flat on its back with internal problems over its lack of a coherent direction on Brexit.  The Tories, in a bid to recover votes from Nigel Farage's Brexit Party, will tout a clear message as the party of 'Leave.'  The Cabinet picked by Boris Johnson does not strike me as one designed to manage the day-to-day affairs of the country.  It looks rather like a campaign team, ready to sell the message of a no-deal Brexit on the eve of a general election.

How African American Land Was Stolen in the 20th Century

 

Steve Hochstadt is a professor of history emeritus at Illinois College, who blogs for HNN and LAProgressive, and writes about Jewish refugees in Shanghai.

 

I recently read an article in the New Yorker that so shocked me that I knew I had to tell you, my small audience, all about it. Vast tracts of land owned by African Americans were taken from them in the 20th century. At the heart of the story is racism in many forms: how the promise of emancipation after the Civil War was broken; how whites used bureaucracy and twisted legalisms to take black land from owners too poor to defend themselves; how the teaching of American history was whitewashed to bury this story. I was shocked because, after decades of studying history, I had no idea about this fundamental cause of economic inequality in America. Writing this article pushed me into investigating the even larger story of how black Americans were prevented from owning real estate, one of the fundamental sources of wealth.

 

Here's a short version of the history. At the time of Emancipation, Union General William Tecumseh Sherman ordered that 400,000 acres formerly held by Confederates be given to African Americans. His order came to be known as the promise of "40 acres and a mule". But the newly established Freedmen's Bureau was never able to control enough land to fulfill this promise. In 1866, Congress passed the Southern Homestead Act, opening up 46 million acres of public land in southern states for Union supporters and freed slaves. The land was uncultivated forest and swamp, difficult for penniless former slaves to acquire or use. Southern bureaucrats made it difficult for blacks to claim any land, and southern whites used violence to keep them from occupying it. Within six months, the land was opened to former rebels. In 1876, the law was repealed.

 

The much more extensive Homestead Act of 1862 granted 160 acres of government land in the West to any American who applied and worked the land for 5 years. Over the course of the next 60 years, 246 million acres of western land, an area equal to California plus Texas, were given to individuals for free. About 1.5 million families were given a crucial economic foundation. Only about 5,000 African Americans benefited.

 

Despite obstacles, many black families had acquired farmland by World War I. There were nearly 1 million black farms in 1920, about one-seventh of all American farms, mostly in the South. During the 20th century, nearly all of this land was taken or destroyed by whites. Sometimes this happened by violent mob action, as in Tulsa, Oklahoma, in 1921, or the lesser known pogrom in Pierce City, Missouri, in 1901, when the entire black community of 300 was driven from town. A map shows many of the hundreds of these incidents of white collective violence, concentrated in the South. Many of the thousands of lynchings were directed at black farmers in order to terrorize all blacks and make them leave.

 

Other methods had a more legal appearance. Over 75 years, the black community of Harris Neck, Georgia, developed a thriving economy from fishing, hunting and gathering oysters, on land deeded to a former slave by a plantation owner in 1865. In 1942, the federal government gave residents two weeks' notice to leave; their houses were destroyed and an Air Force base was built on the site, which had been chosen by local white politicians. Black families were paid two-thirds of what white families got per acre. Now the former African American community is the Harris Neck National Wildlife Refuge.

 

Vast amounts of black property were taken by unscrupulous individuals using legal trickery, because African Americans did not typically use the white-dominated legal system to pass property to their heirs. White developers and speculators took advantage of poorly documented ownership through so-called partition sales to acquire land that had been in black families for generations. One family's story is highlighted in the New Yorker article, co-published with ProPublica. The 2001 Agricultural Census estimated that about 80% of black-owned farmland in the South had disappeared since 1969, about half of it lost through partition sales.

 

Decades of discrimination by the federal government made it especially difficult for black farmers to retain their land as farming modernized. The Department of Agriculture denied loans, information, and access to the programs essential to survival in a capital-intensive farm structure, and hundreds of thousands of black farmers lost their land. Even under President Obama, discrimination against black farmers by the USDA continued.

 

Because land was taken by so many different methods across the US, and the takers were not interested in recording their thefts clearly, it is impossible to know how much black land was taken. The authors of the New Yorker article say bluntly, “Between 1910 and 1997, African Americans lost about 90% of their farmland.” That loss cost black families hundreds of billions of dollars. In 2012, less than 2 percent of farmers were black, according to the most recent Agricultural Census.

 

While rural blacks lost land, real estate holdings of urban blacks were wiped out by a combination of government discrimination and private exploitation. Because black families could not get regular mortgages due to redlining by banks, if they wanted to buy a house they had to resort to private land sale contracts, in which the price was inflated and no equity was earned until the entire contract was paid off. If the family moved or missed one payment, they lost everything. A recent study of this practice in Chicago in the 1950s and 1960s showed that black families lost up to $4 billion in today’s dollars.

 

For the first time in decades, reparations for African Americans who were victimized by the white federal and state governments are being discussed seriously. This story about whites taking black property shows how superficial, disingenuous and ahistorical are the arguments made by conservatives against reparations. When Sen. Mitch McConnell delivered his simplistic judgment last month, he was continuing the cover-up of modern white real estate theft: “I don't think reparations for something that happened 150 years ago for whom none of us currently living are responsible is a good idea.”

 

Surveys showing that the majority of white Americans are against reparations only demonstrate how ignorance of America's modern history informs both public opinion and survey questions. Gallup asked, "Do you think the government should – or should not – make cash payments to black Americans who are descendants of slaves?" While blacks were in favor 73% to 25%, whites were opposed 81% to 16%. A different question might elicit a more useful response: Do you think the government should make cash payments to millions of black Americans whose property was stolen by whites and who were financially discriminated against by American governments since World War II?

 

Today’s economic gap between black and white began with slavery. Emancipation freed slaves, but left them with nothing. Hundreds of millions of acres of land were given away to white families. When blacks gradually managed to get some land, it was taken by violence and legal trickery during the 20th century. After World War II, blacks were denied access to another giant government economic program, the GI Bill, which helped millions of white veterans acquire houses. The collusion of federal, state and local governments, banks, and real estate professionals bilked African Americans of billions of dollars in real estate, with the subprime mortgage crisis only a decade ago as the latest chapter. What I have written here is only an outline of the racist narrative.

 

Despite the ravages of slavery, the American story would have been very different if the ideas and practices behind Lincoln's Emancipation had been put into effect. Instead, white supremacy reemerged in the South and throughout the US. The power that white supremacists exerted in 20th-century America is symbolized by James F. Byrnes, a South Carolina politician who served in the House of Representatives from 1911 to 1925, was one of the most influential Senators from 1931 to 1941, was appointed to the Supreme Court by FDR, left the Court to lead the Office of Economic Stabilization and the Office of War Mobilization during World War II, served as Secretary of State from 1945 to 1947, and was Governor of South Carolina from 1951 to 1955. In 1919, he offered his theory of American race relations: "This is a white man's country, and will always remain a white man's country." He followed that motto throughout his career.

 

Our nation is still paying the price.

Roundup Top 10!  

 

President Trump’s Baltimore tweets were racist — but he also fails to grasp what ails our cities

by Sara Patenaude

Decades of racist policies — not absentee congressmen — explain cities’ biggest struggles.

 

The Real Problem With Trump’s Rallies

by Kevin Kruse

There are a lot of similarities between the president and George Wallace of Alabama. But there’s also one big difference.

 

 

Donald Trump and Boris Johnson Rode the Same Wave Into Power. History Suggests the Parallels Won’t Stop There

by David Kaiser

The history of Anglo-American relations and how Trump and Johnson will utilize them.

 

 

When we hear populism, we think Donald Trump. But we should be thinking Elizabeth Warren.

by Gregg Cantrell

How the left can reclaim the mantle of populism that is rightly theirs.

 

 

How music took down Puerto Rico’s governor

by Verónica Dávila and Marisol LeBrón

Underground music overcame censors to gain popularity and political power.

 

 

What 'Infests' Baltimore? The Segregation History Buried in Trump's Tweets

by Paige Glotzer

In slamming Maryland Rep. Elijah Cummings’s 7th District as “rodent infested,” Trump borrows from the rhetoric that first segregated the city.

 

 

Trump’s Venom Against the Media, Immigrants, “Traitors,” and More Is Nothing New

by Adam Hochschild

The parallels between 1919 and 2019.

 

 

Winston Churchill Would Despise Boris Johnson

by Ian Buruma

Britain’s new leader has a sadly exaggerated sense of the importance his country will have after Brexit.

 

 

Don't just revile Amy Wax--rebut her

by Jonathan Zimmerman

Outrage does not and cannot refute what she said -- only facts can do that.

 

 

Expecting Ireland to be servile is part of a long British tradition

by Richard McMahon

The Brexit crisis is another example of how the UK so often ignores the consequences of its conduct on its neighbour.

 

Leonardo da Vinci: Still a Genius 500 Years Later

This summer marks the 500th anniversary of the death of Italian painter, scientist, inventor and architect Leonardo da Vinci. If you mention his name to most people they will say, rather quickly, "the Mona Lisa" or "the Last Supper."  They are two of the most impressive works in world art history, but da Vinci was far more of an artistic force than just two paintings. He was the man who sketched designs for an early bicycle, a tank, a machine gun, flying machines and all sorts of devices meant to make life easier for people. He also put together some 16,000 pages of notes and illustrations in large, thick notebooks. You needed an invention? He had it, or could quickly produce it.

 

Many of the world's most prominent museums are showing exhibits of his work this summer, from the mammoth Metropolitan Museum of Art in New York to the smaller Berkshire Museum in Pittsfield, Massachusetts. There are also da Vinci exhibits in museums in London; in Milan, Florence and Turin in Italy; and in Poland, Germany, Scotland and other nations.

 

The exhibits about da Vinci, who was born in 1452 and died in 1519, offer a rare and breathtaking look at the artistry of the great man, cover his life and times and offer some intriguing information about him. As an example, he suffered from strabismus, a permanent misalignment of the eyes that threw his vision slightly out of line. The benefit, though, was that it enabled him to "see" the three-dimensional foundations of his work and permitted him to produce three-dimensional drawings and designs that no one else could.

 

He was an ethereal artist, to be sure, but he was also a shrewd businessman. As a young man he realized that the Italian states of Venice, Tuscany, Milan and others were frequently attacked by other nations, so he plunged into efforts to design weapons and transports for the military. He designed a tank with four wheels, propelled by men inside who also used levers and triggers to fire weapons at the enemy (the tank was not actually used in war until World War I, 400 years later). He designed a mobile machine gun, a small, three-foot-high machine on wheels that could be moved about quickly in battle. Eight or nine "guns," or barrels, were mounted on it for rapid firing. It was deadly. He was told that the biggest problem armies had was crossing rivers, so he designed temporary bridges made of fallen trees arranged like the vaulted ceilings of domes, the trees leaning on one another to provide the support that pillars normally would. The bridge could be constructed quickly and taken down just as quickly.

 

The Berkshire Museum, in Pittsfield, has a large and impressive exhibition called "Leonardo da Vinci: Machines in Motion," produced by the Da Vinci Museum in Florence, Italy, and on loan to the Berkshire.

 

Da Vinci always believed that what we thought was true was not necessarily true. "All our knowledge has its origins in our perceptions," he often said.

 

He proved that with his machines, as shown in the Berkshire exhibit. He used the "worm screw," a long wooden screw that could be turned easily by levers and wheels; his screws were made to lift huge weights. He designed rotary screws that could be manipulated by hand and meshed with other screws or gears to move weights. In the Berkshire exhibit is the "double crane" that da Vinci invented: people used one side of the crane to lift things up and the other to lower them down to street level. Both sides could be used at the same time and operated by just one or two people. He even invented a machine for blacksmiths so they could pound down iron with a hammer without using their arms or hands at all. He invented a printing press just 40 years after Gutenberg's; his could be run by just one man. 

 

What were some of his most popular inventions? Well, first and foremost, the bicycle. On display is a wooden bike, built from a design attributed to da Vinci, that operates much like those of today. The day I was at the exhibit, kids flocked to it.

 

“Hey dad, I didn’t know they had bikes back then!” screamed one child.

 

The Berkshire Museum has weekly days on which kids can participate in hands-on da Vinci play sessions with their own drawings.

 

The Berkshire Museum exhibit sprawls over the entire second floor of the building and a gallery on the first. It is spacious. There are two large video screens on which you see examples of his art and his life story. They add a nice touch to the exhibit and carry da Vinci from the 15th century to the 21st.

 

Da Vinci worked during the Italian Renaissance, called the age of discovery, and people were eager for his inventions. City and state governments supported his work and he became friends with wealthy and powerful people.

 

Some museum exhibits are more compressed than the Berkshire Museum's, but elegant, such as the one at the Metropolitan Museum of Art in New York. The curators there decided not to stage a large exhibit of the artist/inventor's work, but instead to showcase one famous painting. They chose Saint Jerome Praying in the Wilderness, an unfinished masterpiece started in 1485 and about 85% complete. The painting hangs in its own gallery with religious music playing all day. That gallery is set inside a larger gallery of religious paintings, sculpture and artifacts to give the exhibit a very religious feel.

 

The Met exhibit, which drew quite a crowd when I was there, sets off Saint Jerome by himself, surrounded by black walls for effect. It is impressive. The curators urge you to study the painting to see how da Vinci worked. As an example, there is the outline of a lion at the bottom right of the painting that was never completed. There are also fingerprints at the top of the painting where Leonardo tried to smooth out paint with his gnarled fingers.

 

Max Hollein, director of the Met, said that the work "provides an intimate glimpse into the mind of a towering figure of western art."

 

The New York exhibit shows that da Vinci did not paint in a very careful way, working on some parts for weeks and then jumping to other parts. He sketched out few final portrait drafts and tended to approach his work in an uninhibited way.

 

So, the next time you sit back in an airplane, you can thank Leonardo.

 

The Berkshire Museum exhibit is on display until September 8. The Met exhibit continues to October 6.

Dear Moderators of the Presidential Debates: How About Raising the Issue of How to Avert Nuclear War?

 

You mass media folks lead busy lives, I’m sure.  But you must have heard something about nuclear weapons―those supremely destructive devices that, along with climate change, threaten the continued existence of the human race.  

 

Yes, thanks to popular protest and carefully-crafted arms control and disarmament agreements, there has been some progress in limiting the number of these weapons and averting a nuclear holocaust.  Even so, that progress has been rapidly unraveling in recent months, leading to a new nuclear arms race and revived talk of nuclear war.

 

Do I exaggerate? Consider the following.  

 

In May 2018, the Trump administration unilaterally withdrew from the laboriously constructed Iran nuclear agreement that had closed off the possibility of that nation developing nuclear weapons.  This U.S. pullout was followed by the imposition of heavy U.S. economic sanctions on Iran, as well as by thinly veiled threats by Trump to use nuclear weapons to destroy that country.  Irate at these moves, the Iranian government recently retaliated by exceeding the limits the shattered agreement set on its uranium stockpile and uranium enrichment.

 

At the beginning of February 2019, the Trump administration announced that, in August, the U.S. government would withdraw from the Reagan-era Intermediate-Range Nuclear Forces (INF) Treaty―the historic agreement that had banned U.S. and Russian ground-launched cruise missiles―and would proceed to develop such weapons.  The following day, Russian President Vladimir Putin declared that, in response, his government was suspending its observance of the treaty and would build the kinds of nuclear missiles that the INF treaty had outlawed.

 

The next nuclear disarmament agreement on the chopping block appears to be the 2010 New START Treaty, which reduces U.S. and Russian deployed strategic nuclear warheads to 1,550 each, limits U.S. and Russian nuclear delivery vehicles, and provides for extensive inspection.  According to John Bolton, Trump's national security advisor, this treaty, which he considers fundamentally flawed and which is scheduled to expire in February 2021, is "unlikely" to be extended.  To preserve such an agreement, he argued, would amount to "malpractice."  If the treaty is allowed to expire, it would be the first time since 1972 that there was no nuclear arms control agreement between Russia and the United States.

 

One other key international agreement, which President Clinton signed―but, thanks to Republican opposition, the U.S. Senate has never ratified―is the Comprehensive Test Ban Treaty (CTBT).  Adopted with great fanfare in 1996 and backed by nearly all the world’s nations, the CTBT bans nuclear weapons testing, a practice which has long served as a prerequisite for developing or upgrading nuclear arsenals.  Today, Bolton is reportedly pressing for the treaty to be removed from Senate consideration and “unsigned,” as a possible prelude to U.S. resumption of nuclear testing.

 

Nor, dear moderators, does it seem likely that any new agreements will replace the old ones. The U.S. State Department's Office of Strategic Stability and Deterrence Affairs, which handles U.S. arms control ventures, has been whittled down during the Trump years from 14 staff members to four.  As a result, a former staffer reported, the State Department is no longer "equipped" to pursue arms control negotiations.  Meanwhile, the U.S. and Russian governments, which possess approximately 93 percent of the world's nearly 14,000 nuclear warheads, have abandoned negotiations over controlling or eliminating them for the first time since the 1950s.

 

Instead of honoring the commitment, under Article VI of the 1968 nuclear Nonproliferation Treaty, to pursue negotiations for “cessation of the nuclear arms race” and for “nuclear disarmament,” all nine nuclear powers are today modernizing their nuclear weapons production facilities and adding new, improved types of nuclear weapons to their arsenals.  Over the next 30 years, this nuclear buildup will cost the United States alone an estimated $1,700,000,000,000―at least if it is not obliterated first in a nuclear holocaust.

 

Will the United States and other nations survive these escalating preparations for nuclear war? That question might seem overwrought, dear moderators, but, in fact, the U.S. government and others are increasing the role that nuclear weapons play in their “national security” policies.  Trump’s glib threats of nuclear war against North Korea and Iran are paralleled by new administration plans to develop a low-yield ballistic missile, which arms control advocates fear will lower the threshold for nuclear war.

 

Confirming the new interest in nuclear warfare, the U.S. Joint Chiefs of Staff, in June 2019, posted a planning document on the Pentagon’s website with a more upbeat appraisal of nuclear war-fighting than seen for many years.  Declaring that “using nuclear weapons could create conditions for decisive results and the restoration of strategic stability,” the document approvingly quoted Herman Kahn, the Cold War nuclear theorist who had argued for “winnable” nuclear wars and had provided an inspiration for Stanley Kubrick’s satirical film, Dr. Strangelove. 

 

Of course, most Americans are not pining for this kind of approach to nuclear weapons. Indeed, a May 2019 opinion poll by the Center for International and Security Studies at the University of Maryland found that two-thirds of U.S. respondents favored remaining within the INF Treaty, 80 percent wanted to extend the New START Treaty, about 60 percent supported “phasing out” U.S. intercontinental ballistic missiles, and 75 percent backed legislation requiring congressional approval before the president could order a nuclear attack.

 

Therefore, when it comes to presidential debates, dear moderators, don’t you―as stand-ins for the American people―think it might be worthwhile to ask the candidates some questions about U.S. preparations for nuclear war and how best to avert a global catastrophe of unprecedented magnitude?

 

I think these issues are important.  Don’t you?

White Nationalists and the Legacy of the Waffen-SS from Postwar Europe to Today

 

The military collapse of Germany in 1945 was so total that any fears held by the Allies of a Nazi revival soon dissipated. The emergence of West Germany as a stable, democratic state confirmed the view that Germany's militaristic history really was a thing of the past. And yet the specter of Nazi Germany has never completely disappeared; the exploits of the Waffen-SS continue to provide inspiration for extreme right-wing nationalists. More recently there has been a trend to rehabilitate Waffen-SS veterans from Eastern European countries and to reconsider them as patriots rather than agents of Nazi tyranny. 

 

The Waffen-SS was just one of many strange organizations to emerge from Nazi Germany. Originally intended as an elite palace guard to protect Adolf Hitler, it grew massively in size and scale to become a multi-national army, the military wing of Heinrich Himmler's dreaded SS. Of the 900,000 men who passed through its ranks, approximately half were drawn from countries outside the German Reich, some even recruited from ethnic groups normally considered beyond the Nazi racial pale. 

 

During the course of World War II, the armored formations of the Waffen-SS gained a reputation for excellence on the battlefield, a reputation that had to be set against its involvement in the mass killings of civilians and numerous battlefield massacres. In the aftermath of defeat in 1945, the Waffen-SS was not only judged to be a criminal entity by the Allies but was also condemned by many old comrades in the Wehrmacht, the German armed forces, eager to pass on blame for the many atrocities committed by Germany during the war. 

 

In response, former Waffen-SS soldiers established HIAG, a self-help group to lobby the new West German government for improved material conditions for its members and to promote the idea that the Waffen-SS was an honorable body of soldiers, similar to those of the German army. The advent of the Cold War reinforced and expanded this narrative, the SS veterans arguing that the multi-national Waffen-SS was a forerunner of NATO and a saviour of Western civilization from Soviet barbarism.

 

The concept of the honorable Waffen-SS achieved some traction, not least in forming its own sub-genre of military publishing. The memoirs and divisional histories written by former SS soldiers were taken up and amplified by a new postwar generation of enthusiasts in the United States and Western Europe. They also provided a convenient ideological template for the various neo-Nazi and extreme right-wing groups that emerged in Western Europe during the 1950s, which drew heavily on the distinctive iconography of the Waffen-SS. In the long run, however, these groups achieved little: they were rent by internal division, reviled or simply ignored by the general public and subject to hostile scrutiny by national security services. 

 

Behind the Iron Curtain, any form of celebration of the Waffen-SS would have been unthinkable – a treason against the memory of the Red Army’s victory over Nazi Germany. But this would all change with the fall of the Berlin Wall in 1989 and the collapse of communism in Eastern Europe. 

 

In many of these countries, the long years of Soviet domination were conflated with those of Nazi Germany. Resistance to Soviet rule was always based on nationalist lines, and any organization that even appeared to have fought for national sovereignty became an object of veneration, and in the 1990s this included the Waffen-SS. The governments of Hungary, Latvia and Estonia were in the forefront of this new welcoming attitude, even sending ministers to preside over SS commemorations. In actuality, Hitler and Himmler – totally opposed to any concept of national self-determination – had cynically employed these soldiers for anti-partisan duties or as cannon fodder on the Eastern Front. 

 

Regardless of the historical truth, extreme nationalists were quick to exploit the new opportunities in the East. Waffen-SS veterans from Germany and the rest of Western Europe were invited to Eastern Europe to take part in celebrations otherwise banned in their home countries. 

 

Hungary publicly acknowledged the Waffen-SS in its annual ‘Day of Honor’ celebrations, first held in 1997, which commemorated the siege of Budapest in 1944-45. In something of a festival atmosphere – complete with flying flags, martial music and the laying of wreaths – veterans from the Waffen-SS marched alongside those of the German Wehrmacht and the Hungarian Army, to the applause of an appreciative audience of nationalist and neo-Nazi groups. 

 

Latvia and Estonia were also prominent in welcoming Waffen-SS veterans from across Europe, who in turn donated relief supplies and money to their hosts. Support for the Waffen-SS was somewhat more controversial in the Baltic states, however, with its large minority populations of Russian-speaking citizens opposing the erection of memorials glorifying SS soldiers as freedom fighters. Despite this, Narva in Estonia became a key site of commemoration, the former battleground where Waffen-SS units from the Baltic states, Germany and Western Europe had fought together in defense of the city in 1944. 

 

The break-up of the Soviet Union in 1991 led to the formation of Ukraine as an independent state. During World War II the Waffen-SS had raised a Ukrainian division to fight on the Eastern Front. After independence its veterans found themselves transformed from fascist collaborators into heroes in the struggle for nationhood. The once neglected gravestones of former soldiers were tended by volunteers and the division’s distinctive lion insignia was publicly displayed by young Ukrainians. In the ensuing conflict between Russia and Ukraine, the link between the Waffen-SS and present-day Ukrainian paramilitary forces became apparent. The infamous Azov Battalion openly espoused anti-semitic, white supremacist attitudes and adopted the Wolfsangel insignia worn by several Waffen-SS divisions. 

 

In 21st century Europe, pressures from mass migration, the negative aspects of globalization and the rise of populist political parties emboldened extreme nationalists. The influx of migrants from Africa and the Middle East encouraged them to take their lead from Waffen-SS mythology and define themselves as defenders of Europe from outside threat. At a Waffen-SS commemoration in Estonia in 2005, a Swedish neo-Nazi described his meeting with a tall Belgian veteran: ‘I am so eager standing over here with this two-meter man. He asks me, for the sake of their honor, to free Sweden from the foreign occupiers and explains that we Aryans will die if nothing happens.’ The fawning encounter, as described here, chillingly suggested that the work of the Waffen-SS was not yet complete and needed others to finish the task. Racially motivated attacks on migrants have become increasingly commonplace. 

 

More mainstream nationalist political parties such as Jobbik in Hungary and the AfD in Germany have enjoyed some success in recent elections, and while they publicly disown any association with neo-Nazi groups, a close relationship exists between them. Thus, the neo-Nazi Hungarian Guard – modelled on the Nazi-sponsored Arrow Cross Party – acts as a shadowy paramilitary force on behalf of Jobbik. This resurgence of extreme nationalism demonstrates the enduring influence of National Socialism, including that of the Waffen-SS. It is a legacy Europe could well do without.

Racism in America: What We Can Learn from Germany’s Struggle with Its Own Evil History

Memorial to the Murdered Jews of Europe in Berlin

 

President Trump’s racist tweet storm and the reactions of members of Congress have generated substantial media coverage that often criticizes the ugly remarks. Yet these discussions generally lack any historical context and are a poor substitute for a meaningful national conversation on the central role of slavery in America’s economic and political development.  

 

What would such a “national reckoning” over slavery look like? 

 

Many countries have historically oppressed racial minorities or committed terrible ethnic cleansing, but few have adequately grappled with their past or paid substantial sums in compensation to victims. 

 

One nation that has done both is modern Germany, which has publicly acknowledged responsibility for the Holocaust. The Germans have even created a special noun, Vergangenheitsbewältigung, to describe the process. According to Wikipedia, the word has two closely related meanings: first, “a struggle to overcome the negatives of the past,” and second, “public debate within a country on a problematic period of its recent history.” 

 

In Germany, this national debate has produced dozens of major monuments honoring the victims of the Holocaust and the creation of a K-12 educational curriculum which explains the Nazi government’s role in war crimes and condemns the perpetrators.  

 

Since 1952, the German government has paid more than $75 billion in reparations to the state of Israel, Jewish relief organizations and individual victims of the Holocaust. 

 

It is beyond the scope of this article to compare the crimes committed during America's two hundred years of African-American slavery with the horrible tragedy of the Holocaust. We can, however, look at the process by which the German people have, in the last thirty years, slowly come to accept responsibility for the Nazi regime's crimes.  

 

As the late historian Tony Judt wrote in his classic Postwar: A History of Europe Since 1945, “a nation has first to have remembered something before it can begin to forget it.” 

 

In his book (published 2005), Judt notes that during the first five years after the end of World War II, the new, reconstituted German government tried to avoid any punishment or moral responsibility for crimes of the Third Reich. Their position, reflecting the attitude of most Germans, was “It was all Hitler’s fault.” 

 

In schools, in the news media, and in many government statements, most German adults avoided any mention of the crimes against the Jews. The people experienced a “national amnesia” regarding the years 1933-45. 

 

The first breakthrough came when Chancellor Konrad Adenauer in 1952 negotiated a treaty with the new nation of Israel. Known as the Luxembourg Agreement, it initiated a large-scale series of payments that continue to this day. 

 

While the initial decision to “write a check” (i.e. pay compensations) came relatively soon after the war, when many of the victims were still alive and in desperate need, it would require a new generation, one with no direct ties to the Nazi regime, to publicly acknowledge the German people’s responsibility for the crimes committed against the Jews and other minorities.

 

During the massive rebuilding effort of the 1950s and 1960s, many attempts were made to remove any traces of the Nazi regime. For example, in Munich local authorities wanted to tear down and pave over the Dachau concentration camp. The American military, which had captured the site intact, insisted it be preserved as a testament to the crimes committed there.

 

As Judt noted in Postwar, a national discussion was spurred in 1979 when a four-part American TV series on the Holocaust was shown on German TV. The series included portrayals of the round-ups of Jewish citizens and depictions of the gas chambers. Many younger Germans had never been exposed to these images before.  

 

On January 27, 1995, for the 50th anniversary of the liberation of Auschwitz, thousands of Germans voluntarily participated in ceremonies remembering the Holocaust. In 1999, the German parliament commissioned a new Memorial to the Murdered Jews of Europe, which opened in 2005. This stark display of 2,711 bare concrete slabs stands in the middle of the new, reunified Berlin. Today dozens of other major monuments and thousands of small plaques acknowledging the Holocaust are on display across Germany. 

 

Can we Americans begin our own Vergangenheitsbewältigung?

 

At best, we have taken a few tentative steps. We have had a few movies and TV series, notably Twelve Years a Slave, which depicted the horrors of slavery in antebellum America. We also have the new National Museum of African American History and Culture, which has a deeply moving exhibit on slavery. 

 

But there is also widespread ignorance or denial about slavery. Confederate statues still adorn many Southern cities and the halls of the U.S. Capitol. The Tennessee governor recently proclaimed July 13 Nathan Bedford Forrest Day in his state, honoring the former Confederate general, slave trader and early KKK leader.

 

A 2018 report by the Southern Poverty Law Center found that less than 8% of students knew why Southern states seceded from the union; only 12% knew about the economic importance of slavery to the North. A key problem, the SPLC noted, is that while a dozen states require teaching about some aspects of slavery, there is no nationally accepted, systematic approach. 

 

For example, few American history textbooks point out that protections for slavery were embedded in the Constitution or that slave owners dominated the federal government from 1787 through 1860. 

 

Sven Beckert, an acclaimed Harvard history professor, noted in Slavery's Capitalism, a volume he co-edited, that during most of the 20th century slavery was treated as just a Southern problem. However, a "new consensus" is emerging that all the American states benefited significantly from plantation slavery.  Rather than being a sidetrack or minor element in our history, Beckert asserts it is in fact "the interstate highway system of the American past."

 

If so, it has yet to appear on the cultural road map of most Americans. 

Is Stonehenge a Tourist Rip-Off?

 

Recently, Trip Advisor called Stonehenge the eleventh worst tourist rip-off in the UK. One reviewer called it a pile of bricks in a field. The prohibition on entering or touching the stones has not helped, but the invisibility of the culture that built Stonehenge also detracts from the site's significance. Interpreting the enigmatic stones is rather like visiting a cathedral without any knowledge of Christianity. 

 

Stonehenge sits, isolated, on a chalk down, the last stone circle to be built in Britain. Subsequent cultures littered tombs around the site. Farmers ploughed and destroyed all but a remnant of the avenue that linked the circle to the River Avon. The tribal lands were split between three counties; it is no longer the sum of its parts. Modern roads have dissected the site and compromised the archaeology. Vehicle noise overwhelms the wind soughing through the stones. In the past decade, Stonehenge has attracted far more visitors. The opening of the visitor centre by English Heritage in 2013 has stimulated much of this interest.

 

Prehistory, the unrecorded period before the Romans, is a challenge to us. As Christianity developed, many pagan sites were covered by churches in order to suppress pagan culture.  Archaeologists argue with each other about how and why Stonehenge was built and rarely agree. Horticulturalists can provide crucial information to help unravel Stonehenge’s mystery by examining its soil and topography. 

 

After the Ice Age, nomadic hunter-gatherers walked into Britain from Europe and roamed the country for 6000 years. By 4000 BC, Britain had become an island due to rising sea levels, and horticulture allowed people to settle. The adoption of horticulture is too often confused with farming. The latter requires metal implements and beasts of burden that would only later emerge. Farming utilized the fertile soils that were too heavy for manual tilling. Horticulture, however, is growing mixed crops on a small scale, specifically using manual labour. This horticultural phase demanded light soil, and the silt along the River Avon was ideal. I call this area Avonlands, and a culture developed there. Stonehenge is the proof of their success. 

 

Merely a hunting zone for 6000 years, Avonlands offered advantages unparalleled in Britain as horticulture began. The river, 60 miles long, linked the tribal area to the sea and provided a highway when land was difficult to cross. At the sea end, coastal trading routes extended east to continental Europe and west to Eire and the Scottish Isles. The river is slow moving and without rapids, ideal for log boats, and has the highest number of species of any UK river. It had annual salmon runs that continued until the 1950s, when the number and weight of fish severely declined. Similarly, sturgeon runs were once strong but have since disappeared. As a chalk river, the Avon contained very high levels of calcium. Two litres of water each day from the river provide 50% of human calcium needs. The skeletons found locally have large, strong bones. Some display severe bone breaks which healed. The river's calcium content may be why Stonehenge was identified as a healing centre. 

 

The Avon headwaters rise on chalk downs to the north of Stonehenge. The river meanders past the circle and down to the sea at Christchurch. It is a slow river, which means it rarely floods. When it does, usually in winter, floods last just two to four days. That inundation deposits fresh silt and a host of vegetative matter on the land alongside the river, maintaining the soil’s fertility. We call this water-meadow and it rarely experiences drought. Reeds grow in the wetter areas. 

 

The horticulturalists' tools were basic, including antler picks and flint axes. Flint is found in chalk. In flint mines to the east, it was found as dinner-plate-sized nodules that were extensively traded. Wood and bone were also used to create tools. 

 

The people grew grain. As a stored food it removed the jeopardy of winter. They increased their stock of cattle and pigs, the latter being the most prominent meat in feasts. Because pigs eat human faeces, keeping them reduced the incidence of disease and gut worms. Unlike cattle, pigs feed in woodlands, especially on autumn crops of acorns. 

 

These horticulturalists still foraged, hunted and fished. They used the river, water-meadows, sea, marshes and wildwood. They cropped bespoke wood for their huts through coppicing and pollarding. Cutting thin wood was relatively easy using a flint axe whilst felling a tree was onerous. They thatched their huts using local reed, still in use today. A dry hut was a health advantage when most people were restricted to leaking grass and heather roofs. Significantly, they operated an economy based on the production and use of these materials. The true measure of their success was a food surplus. Only with this could they spend months each year, for 500 years, building an increasingly complex temple to their horticultural gods.

 

Stonehenge is a barometer for this culture, the first in Britain. They began with a single circle of unworked bluestones in 3000 BC. These stones were floated or dragged from South Wales, over 140 miles away. Subsequent remodelling reflects greater resources in food and people. They hauled 80 sarsen stones 20 miles to Stonehenge, each weighing up to 50 tonnes. The final building phase ended in 2500 BC with the hand-shaped stones and lintels we see today. This remodelling suggests a changing relationship with the gods that the people believed gave them their horticultural success.

 

Avonlands had too few water meadows to maintain a growing population. With the introduction of metal, horticulture was replaced with the expansion of farming across Britain. An outmoded Stonehenge fascinated the Romans only to be damned by its pagan ancestry for the next 2000 years. If Stonehenge is to be restored to its rightful heritage then it must be reengaged with the River Avon and its tribal lands. Only then can we interpret the astounding achievement of these prehistoric people.    

 


Invented in the Fifties, Adolescence Swallowed the Nation—And Now the White House, Too

 

Donald Trump is the great mono-story of our time. Unless you’ve been away from the planet for a while, you know cable news is all Trump, all the time; Trumpworld is social media’s mesmerizingly dystopian subdivision. Yet the fixation with our president’s juvenile leadership style overlooks a deeper one: America’s obsession with eternal adolescence.

 

Hanging onto youth is an id-driven urge, right up there with sex and counting "likes" on Twitter and Instagram. But for all its pluses—inspiring seniors to trade the recliner for kettlebell training, for instance—it can lead to behavior that disses public norms, evades the bald-faced truth, and swaps fevered fantasies for common sense. We get uber-brattishness not only in the Oval Office, but also across the political spectrum.

 

I didn’t really tumble to the mythic power of adolescence until the 1990s, when my friend Alex Gibney and I created a television history of the 1950s, and I was reminded the decade of Hula Hoops, Frisbees and Silly Putty had invented teenagers, too. Born in 1950, I remember fretting the rock ‘n’ roll boom Big Mama Thornton and Elvis helped ignite would fizzle before I hit the magic No. 13. Little did I realize I was a foot soldier in a cultural revolution.

 

Teenagers had always been with us, of course. It wasn’t until after World War II, however, that rising prosperity merged with booming birthrates to make the young an irresistible force. In his eponymous bestseller on which we based “The Fifties” TV series, David Halberstam explained: “In the past when American teenagers had made money, their earnings, more often than not, had gone to help support their parents or had been saved for one … long-desired purchase, like a baseball glove or a bike.” In the new affluence, however, teenagers “were becoming a separate, defined part of the culture: As they had money, they were a market, and … they were listened to and catered to.”

 

All that being catered to led to baby boomers acquiring a bit of a reputation. Compared to our frugal, self-sacrificing Depression-era parents, it was said (mainly by our frugal, self-sacrificing parents) that we could be a spoiled, pouty and rebellious lot.

 

But boomers dug the attention—so much so we weren't about to let adolescence go. According to author Kurt Andersen, "the commitment of Americans, beginning with the baby boom generation, to a fantasy of remaining forever young" meant that "during the 1980s and '90s, when American adults, like no adults before them — but like all who followed — began playing video games and fantasy sports, dressing like kids, and grooming themselves and even getting surgery to look more like kids." Andersen called it the "Kids 'R' Us Syndrome."

 

Is it any wonder, then, we eventually got a “Kids ‘R’ Us” president? Trump’s daily tweet storms, with his “I know you are, but what am I?” tone, his dysfunctional relationship with the truth, and his ducking of personal responsibility, signal off-the-hook traits that once sank public careers, pronto. But let’s be honest: Isn’t the Donald an example, albeit extreme, of the degree to which our society has normalized such behavior?

 

When Trump was born in 1946, America was revving up twin revolutions in communication technology and transportation that would help fuel the drive to extended adolescence. The first big impact, dramatically shrinking time and distance, manifested in how ordinary Americans were growing hungry for flashier, sped-up lifestyles.

 

In an interview for "The Fifties," McDonald's co-founder Dick McDonald told me that after the war, he and his brother Mac McDonald witnessed the new appetites at their San Bernardino restaurant. "[A]ttitudes were completely different …," said Dick. "We were beginning to get many complaints [about] how slow the service was. We decided, if the customer wants fast service, we're going to give them fast service"; Dick and Mac started rolling out orders in 20 seconds, instead of 20 minutes.

 

Meanwhile, new technologies accelerated everything from book publishing and telecommunications to automobile and air travel. As the world got smaller and information gushed in, sociologist C. Wright Mills argued that faster times required new mental habits to let people make sense of the larger forces influencing their lives. “What they need, and what they feel they need,” wrote Mills, “is a quality of mind that will help them to use information and to develop reason in order to achieve lucid summations of what is going on in the world and of what may be happening within themselves.”

 

Fat chance of that. By the late 1950s, TV had swept the nation and, for broadcast impresarios, introspection didn’t monetize. A “great media-metaphor shift” was underway, as cultural critic Neil Postman put it in his landmark 1985 book “Amusing Ourselves to Death.” Americans were briskly moving from a print-based culture to an electronic-centered one. “[U]nder the governance of the printing press …,” Postman wrote, “discourse in America was … generally coherent, serious and rational.” Tailoring messages to fit the new TV medium meant fewer words and more pictures, less reflection and more emotion, and ended up rendering our national conversation “shriveled and absurd.”

 

That may be overstating the case, but it’s true that more and more Americans were traveling faster and lighter. Upwardly mobile, they more easily distanced themselves from many of the stark facts of life of only a few years before—backbreaking physical labor; lack of proper health care leading to untreated illness, poor teeth, and premature death; and limited access to higher education and comfortable retirements.

 

Fast-forward to the early 1990s, when personal computers and the rise of the internet induced the next great media shift. Suddenly, our wildest teenage dreams were now only a click or two away, stoking on-demand fantasies of leisure-time exoticism, sex and ever-unfolding material wonders. Ensorcelled, adult-adolescents suspended disbelief to a degree that, in the past, had been the preserve of dreaming, questing teenagers. 

 

Today, in our lucid moments, we know that social media moguls have actively encouraged some of our worst behavior. Pushing angry, hate-filled content that swamps reason, they lock in eyeballs and profits. As internet pioneer Jaron Lanier put it: "So if they can find a group in which they can elicit those emotions … they're going to keep on reinforcing that because they can get more engagement, action, and more money out of that group ... ." And, of course, today's ease of access to our primitive passions allows the world's real fake-news artists to inflame the public mind and mess with our elections.

 

Too often, internet-abetted outrage spills into violence; yet more often, in an adolescent society, it shows up with its passive-aggressive twin, complacency. The upshot is we have a hard time knowing our own minds or maintaining our focus on serious matters. “It’s very easy to ignore the world when the internet is fun and, at the margin, it’s cheap,” said economist Tyler Cowen. “You can protest politically on your Facebook page or write a tweet and just put it aside, get to the next thing” without breaking a sweat.

 

Cowen argues that our complacency derives from a long-term decline in “American dynamism.” “We now have a society where we’re afraid to let our children play outside,” he said. “We medicate ourselves at much higher rates. We hold jobs for longer periods of time. We move across state lines at much lower rates than before …. But once you’re on a track to make everything safer, it’s very hard to stop or restrain yourself along appropriate margins.”

 

What surely is true is that, try as we might to minimize risk, the door can never be shut tight. Instead, the failure to act on underlying problems typically opens the window to compounded trouble.

 

Which may explain what’s eating at our teenagers. Not only are kids taking longer to mature today, they’re coping with a hollowness at the heart of their digital lives that concerns psychologist Jean Twenge. “Parenting styles continue to change, as do school curricula and culture, and these things matter,” she wrote in The Atlantic. “But the twin rise of the smartphone and social media has caused an earthquake of a magnitude we’ve not seen in a very long time, if ever.” Pointing to a spike in teen suicide rates, Twenge said, “There is compelling evidence that the devices we’ve placed in young people’s hands are having profound effects on their lives—and making them seriously unhappy.”

 

You have to think that even C. Wright Mills would be shocked at the speed with which information technology has separated us from reality and amped up self-inflation. Who among us hasn’t posted Facebook items that elevate our achievements, and tout fancy friends or vacations? In such a world, maxims that once guided life, like “To thine own self be true,” can sound unbelievably trite. Meanwhile, our digital imps, like teen hoods haunting a fifties’ street corner, whisper that only suckers resist the urge to self-inflate.

 

It isn’t exactly Holmesian to deduce that trouble with setting sensible limits on our actions and aspirations has made it harder to talk sensibly with one another. President Trump’s spokespersons have talked unabashedly about “alternative facts” or insist that “truth isn’t truth”—prime examples of what Neil Postman called “dangerous nonsense.” Meanwhile, bloviators, left and right, insist on minor points of political dogma, and go haywire at any whiff of apostasy.

 

Such juvenile behavior is hard on a democracy. As Tom Nichols wrote in his 2017 book “The Death of Expertise,” when the concept “is understood as an unending demand for unearned respect for unfounded opinions, anything and everything becomes possible, including the end of democracy and republican government itself.”

 

To be sure, reasonable folks can understand how tempting it is for fellow citizens suffering the hard knocks of low pay or no pay, and reverse social mobility, to entertain a lazy slogan like “Make America Great Again.” But yearning for halcyon days when the economy was headed for the stars, sock hops were the groove, and a white, male-dominated supermajority told everybody, at home and abroad, how things were going to go down is, at best, a pipedream.

 

In our TV series, David Halberstam, that clear-eyed observer of American life, cautioned that hindsight is invariably 20-20. “There’s a tendency to romanticize the fifties and to forget the narrowness, the prejudice …” he said. “But I think there’s a generally far greater freedom and a sense of possibility, economic and other, today than in the past.”

 

That’s still true enough. Had David lived to see the changes churned up by our present decade (he died in 2007), however, I’m willing to bet he’d remind us that the greatest of cultures can develop bad habits of mind, lose their edge to adolescent whimsy or outright chicanery—and nations in that fix get the leadership they deserve.

The Long History of Unjust and Lawless Attorneys General

 

As Robert Mueller testifies this week before Congress, the United States Department of Justice is once again in the spotlight. Earlier this summer, the House of Representatives held Attorney General William Barr in contempt for his refusal to comply with a subpoena on the 2020 census. Barr is hardly the first Attorney General to use his appointment to promote lawlessness and injustice.

 

In fact, in the past 100 years, Attorneys General have violated the Bill of Rights; engaged in political corruption and lawless acts while advocating "law and order"; endorsed abuses of power in the name of "national security”; and refused to cooperate with Congressional investigations of wrongdoing. The list of controversial Attorneys General who have undermined their oath to uphold the Constitution of the United States is long and includes eight individuals who served from the time of the presidency of Woodrow Wilson to the presidency of Donald Trump.

 

The first is A. Mitchell Palmer, who served as Attorney General under Woodrow Wilson from March 1919 to March 1921. For much of the period from October 1919 to March 1921, Wilson was incapacitated by a stroke, giving Palmer license to abuse his position. Palmer initiated the “Palmer Raids”, a centerpiece of the first “Red Scare”, in which thousands of people suspected of being Socialists or Communists were rounded up and jailed. The prisoners were often denied their basic civil rights and the writ of habeas corpus, and detained for months before they were finally released. A small percentage who were not American citizens were deported. Assisting Palmer in his quest to “save the nation” from Soviet Russia’s new leader, Vladimir Lenin, were future FBI Director J. Edgar Hoover and other zealots who had no issue with violations of the Bill of Rights. Palmer undermined respect for the rule of law as he denied basic civil liberties and civil rights to thousands of victims of his “Red Scare”.

 

Palmer’s successor, Harry M. Daugherty, was the Attorney General under President Warren G. Harding and briefly under President Calvin Coolidge from March 1921 to April 1924. Prior to his appointment, Daugherty was Harding’s campaign manager and part of the infamously corrupt “Ohio Gang.” Two members of the cabinet under Harding and Coolidge, Secretary of State Charles Evans Hughes and Secretary of Commerce Herbert Hoover, were wary of Daugherty, and eventually Coolidge asked for his resignation after evidence emerged that Daugherty had knowledge of the infamous Teapot Dome scandal, in which Secretary of the Interior Albert Fall illegally leased federal oil lands in Wyoming to the Sinclair Oil Company. Hints of the Teapot Dome and other scandals began to emerge in the last days of Harding’s presidency before his sudden death in August 1923. Daugherty was indicted in 1926 and tried twice, but deadlocked juries led to the dismissal of the charges. Nevertheless, Daugherty was left under a cloud of corruption which undermined the historical reputation of President Harding.

 

Four decades later, President Richard Nixon appointed his campaign manager, John N. Mitchell, as Attorney General, a position he held from January 1969 to March 1972. Mitchell was regarded as one of the closest advisers to Nixon and was infamous, like his president, for his support of “law and order.” Ironically, Mitchell didn’t always follow the letter of the law. He was not vetted by FBI Director J. Edgar Hoover (President Nixon requested he not be), advocated the use of wiretaps in national security cases without obtaining a court order, promoted preventive detention of criminal suspects although it potentially violated the Constitution, and did not properly enforce court-ordered mandates for desegregation. Most famously, the Watergate tapes proved he helped plan the break-in at the Democratic National Committee headquarters and was deeply involved in the cover-up that followed. Even after he left the Justice Department and became the head of the Committee to Reelect the President, he threatened Watergate journalist Carl Bernstein and Washington Post publisher Katharine Graham. Mitchell was indicted and convicted of conspiracy, obstruction of justice, and perjury. He spent 19 months in prison and lost his law license for his illegal and unethical actions.

 

Nearly two decades later, President George H. W. Bush appointed William Barr as Attorney General, and Barr served from November 1991 to January 1993. In his first round as head of the Justice Department, Barr faced criticism after he encouraged the President to pardon former Secretary of Defense Caspar Weinberger, who had served under President Ronald Reagan from January 1981 to November 1987. In the aftermath of the Iran-Contra Affair, Weinberger faced indictment and trial on charges of perjury and obstruction of justice.

 

After the Presidential Election of 2000, President George W. Bush selected former Senator John Ashcroft of Missouri as his first Attorney General, serving from February 2001 to February 2005. Ashcroft endorsed the use of torture, a policy that culminated in the Abu Ghraib abuse scandal in Iraq in 2004. Further, he endorsed sweeping surveillance authorized through the Foreign Intelligence Surveillance Court as well as FBI surveillance of libraries and retail sales to track suspects’ reading habits. Critics of the Patriot Act and the post-September 11th policies of the Bush Administration argue this was a massive privacy violation. With his reputation undermined, Ashcroft decided to leave his position after Bush won a second term in 2004.

 

Ashcroft was replaced by Alberto Gonzales, who served from February 2005 to September 2007. Previously, Gonzales served as White House Counsel from January 2001 to February 2005 and as Bush’s general counsel during his Texas governorship from 1995 to 2001. Gonzales’s tenure as Attorney General was highly controversial, as he endorsed warrantless surveillance of US citizens and gave legal authorization to “enhanced interrogation techniques,” later generally acknowledged as torture. He also presided over the firing of nine US Attorneys who refused to adhere to back-channel White House directives to prosecute political enemies. Further, he authorized the use of military tribunals and the denial of the writ of habeas corpus to detainees at the Guantanamo Bay Naval Base in Cuba. Eventually, he resigned while under fire for abusing his office and politicizing it.

 

Corruption and abuse by the Attorney General have continued under President Donald Trump: first with Jeff Sessions, then with Matthew Whitaker as his replacement as Acting Attorney General, and finally with the recent return of William Barr to the office.

 

Sessions, who had been an Alabama Republican Senator since 1997, almost immediately sparked controversy after news broke that he misled the Senate on his contacts with Russian officials during the 2016 Presidential campaign. Sessions therefore recused himself from the investigation into Russian collusion during the campaign. Trump was immediately displeased and pressure slowly mounted from within the administration for Sessions to resign.

 

But when Sessions left the administration in fall 2018, Acting Attorney General Matthew Whitaker, who served from November 2018 to February 2019, assumed the office through an appointment that circumvented the normal Senate confirmation process. The move caused an uproar and led to numerous legal challenges to his claim that he could supervise the Mueller investigation.

 

The brief and controversial tenure of Whitaker ended in February 2019 with the appointment of William Barr, who became AG for the second time a quarter century after serving under George H.W. Bush. Today, Barr is even more controversial, as he has enunciated his vision of unitary executive authority, adding to Donald Trump’s belief that his powers as President are unlimited. Barr has refused to hand over the entire Mueller Report to committees in the House of Representatives, has refused to testify before the House Judiciary Committee, and was recently held in criminal contempt for refusing to share information about Trump Administration attempts to add a citizenship question to the upcoming 2020 Census.

 

So the Justice Department and the Attorneys General of the past century under Republicans Warren G. Harding, Richard Nixon, George H. W. Bush, George W. Bush and Donald Trump have undermined public faith in the department and its reputation as a fair-minded cabinet office intent on enforcing fair, just policies. The past century began, however, with a horrible tenure under Democrat Woodrow Wilson at his time of incapacity, allowing A. Mitchell Palmer to set a terrible standard followed by seven of his successors in the Justice Department. Regaining confidence in the agency and in the holder of the office of Attorney General will require a change in the Presidency, plain and simple.

Would Slavery Have Ended Sooner if the British Won the American Revolutionary War?

John Singleton Copley's painting The Death of Major Peirson depicts Black Loyalist soldiers fighting alongside British regulars

 

 

"I would never have drawn my sword in the cause of America, if I could have  conceived that thereby I was founding a land of slavery."

-Marquis de Lafayette, French military leader who was instrumental in enlisting French support for the colonists  in the American War of Independence 

 

 

Historians and the American public have long grappled with the contradiction that the Revolutionary War was waged under the banner "all men are created equal" yet was largely led by slave owners. 

 

The July 4th, 1776 Declaration of Independence (DI) was in itself a revolutionary document. Never before in history had people asserted the right of revolution not just to overthrow a specific government that no longer met the needs of the people, but as a general principle for the relationship between the rulers and the ruled: 

 

"We hold  these truths to be self-evident, that all men are created equal, that they are endowed by  their creator with certain unalienable rights, that among these are Life, Liberty, and the pursuit of happiness. That to secure these rights, governments are instituted among  Men, deriving their just powers from the consent of the governed.--That whenever any Form  of Government becomes destructive of these ends, it is the Right of the People to alter or abolish it, and to institute new Government..."   

 

And yes, "all men are created equal" excluded women, black people and the indigenous populations of the continent. Yes, it was written by slave-owner Thomas Jefferson with all his personal hypocrisies. Yes, once free of England, the U.S. grew over the next 89 years to be the largest slave-owning republic in  history. 

 

Americans are taught to see the birth of our country as a gift to the world, even when its original defects are acknowledged. The DI along with the Constitution are pillars of American exceptionalism--the belief that the U.S. is superior and unique from all others,  holding the promise of an "Asylum for the persecuted lovers of civil and religious  liberty" in the words of Thomas Paine in Common Sense. 

 

Indeed, the powerful words of the Declaration of Independence have been used many times since the Revolutionary War to challenge racism and other forms of domination and inequality. Both the 1789 French Revolution and the 1804 Haitian Revolution--the only successful slave revolt in human history--drew inspiration from this clarion call. In 1829 black abolitionist David Walker threw the words of the DI back in the face of the slave republic: "See your declarations Americans!!! Do you understand your own language?" The 1848 Seneca Falls women's rights convention issued a Declaration of Sentiments proclaiming that "We hold these truths to be self evident that all men and women are created equal." Vietnam used these very words in declaring independence from France in 1945. And as Martin Luther King, Jr. stated in his 1963 “I Have a Dream” speech, the Declaration was "A promise that all men, yes, black men as well as white men, would be guaranteed the unalienable rights of life, liberty, and the pursuit of happiness."

 

Historian Gary Nash, among others, has strongly argued against viewing history as inevitable. He argues this short-circuits any consideration of the fact that every historical moment could have happened differently. For instance, in his book “The Forgotten Fifth,” Nash argues that if Washington and Jefferson had been faithful to their anti-slavery rhetoric and chosen to lead a fight against slavery during the American Revolution, there was a good chance they could have succeeded.

 

Perhaps a different question might be asked: what if the British had won, had defeated the colonists' bid to break from the mother country? Is it possible that the cause of freedom  and the ideals of the DI would have been paradoxically better served by that outcome?  

 

England's Victory Over France Leads to the American War For Independence  

 

It was, ironically, England's victory over France for control of the North American continent in the Seven Years' War (1756-1763) that laid the basis for its North American colonies to revolt just 13 years later. As the war with France ended, the British 1763 Proclamation prohibited white settlement west of the Appalachian Mountains in an attempt at detente with Native Americans -- bringing England into conflict with colonists wanting to expand westward. More serious still were the series of taxes England imposed on the colonies to pay off its large war debt: the 1765 Stamp Act, the 1767-1770 Townshend Acts, and the 1773 Tea Act, among others. As colonial leaders mounted increasingly militant resistance to these measures, so too did British repression ramp up.

 

While "No taxation without representation" and opposition to British tyranny are the two most commonly cited causes propelling the colonists' drive for independence, recent scholarship (Slave Nation by Ruth and Alfred Blumrosen, Gerald Horne's The Counter-Revolution of 1776, and Alan Gilbert's Black Patriots and Loyalists in particular) has revealed a heretofore unacknowledged third major motivating force: the preservation and protection of slavery itself. In 1772, the highest British court ruled in the Somerset decision that slave owners had no legal claims to ownership of other humans in England itself, declaring slavery to be "odious". Somerset eliminated any possibility of a de jure defense of slavery in England, further reinforced at the time by Parliament refusing a request by British slave owners to pass such a law. While Somerset did not apply to England's colonies, it was taken by southern colonists as a potential threat to their ability to own slaves. Their fear was further reinforced by the 1766 Declaratory Act, which made explicit England's final say over any laws made in the colonies, and the "Repugnancy" clause in each colony's charter. Somerset added fuel to the growing fires uniting the colonies against England in a fight for independence.

 

"Seeing the Revolutionary War through the eyes of enslaved blacks turns its meaning  upside down"  Simon Schama, Rough Crossings   

 

Among the list of grievances in the DI is a rarely scrutinized statement: "He [referring to the king] has excited domestic insurrections amongst us." This grievance was motivated by Virginia Royal Governor Lord Dunmore's November 1775 proclamation stating that any person held as a slave by a colonist in rebellion against England would become free by joining the British forces in subduing the revolt. While 5,000 black Americans, mostly free, from northern colonies joined with the colonists' fight for independence, few of our school books teach that tens of thousands more enslaved black people joined with the British, with an even greater number taking advantage of the war to escape the colonies altogether by running to Canada or Florida. They saw they had a better shot at "Life, liberty and the pursuit of happiness" with the British than with their colonial slave masters. To further put these numbers in perspective, the total population of the 13 colonies at the time was 2.5 million, of whom 500,000 were slaves and indentured servants. While there is some debate about the exact numbers, Peter Kolchin in American Slavery points to the "Sharp decline between 1770 and 1790 in the proportion of the population made up of blacks (almost all of whom were slaves) from 60.5% to 43.8% in South Carolina and from 45.2% to 36.1% in Georgia" (73). Other commonly cited figures from historians estimate 25,000 slaves escaped from South Carolina, 30,000 from Virginia, and 5,000 from Georgia. Gilbert in Black Patriots and Loyalists says "Estimates range between twenty thousand and one hundred thousand... if one adds in the thousands of not yet organized blacks who trailed... the major British forces... the number takes on dimensions accurately called 'gigantic'" (xii). Among them were 30 of Thomas Jefferson's slaves and 20 of George Washington's; good ole "Give me liberty or give me death" Patrick Henry lost his slave Ralph Henry to the Brits as well. It was the first mass emancipation in American history. Evidently "domestic insurrection" was legitimate when led by slave owners against England but not when enslaved people rose up for their freedom--against the rebelling slave owners!

 

Before There Was Harriet Tubman, There Was Colonel Tye

 

Crispus Attucks is often hailed as the first martyr of the American revolution, a free  black man killed defying British authority in the 1770 Boston Massacre. But few have  heard of Titus, who just 5 years later was among those thousands of slaves who escaped to the British lines. He became known as Colonel Tye for his military prowess in leading black and white guerrilla fighters in numerous raids throughout Monmouth County, New Jersey, taking reprisals against slave owners, freeing their slaves, destroying their weaponry and creating an atmosphere of fear among the rebel colonists--and hope among  their slaves. Other black regiments under the British fought with ribbons emblazoned  across their chests saying "Liberty to Slaves".  One might compare Col. Tye to Attucks but if Attucks is a hero, what does that make Tye,  who freed hundreds of slaves? Perhaps a more apt comparison is with Harriet Tubman, who escaped slavery in 1849 and returned to the south numerous times to also free hundreds of her brothers and sisters held in bondage.  

 

So what if the British had won?  

 

At no point, however, did the British declare the end of slavery as a goal of the war; it was always just a military tactic. But if the Brits had won, as they came close to doing, it might have set off a series of events that went well beyond their control. Would England have been able to restore slavery in the 13 colonies in the face of certain anti-slavery resistance by the tens of thousands of now free ex-slaves, joined by growing anti-slavery forces in the northern colonies? As Gilbert puts it, "Class and race forged ties of solidarity in opposition to both the slave holders and the colonial elites." (10) Another sure ally would have been the abolitionist movement in England, which had been further emboldened by the 1772 Somerset decision. And if England had to abolish slavery in the 13 colonies, would that not have led to a wave of emancipations throughout the Caribbean and Latin America? And just what was the cost of the victorious independence struggle to the black population? To the indigenous populations who were described in that same DI grievance as "The merciless Indian Savages"? Might it have been better for the cause of freedom if the colonists lost? And if the colonists had lost, wouldn't the ideals of the DI have carried just as much if not more weight?

 

"The price of freedom from England was bondage for African slaves in America.  America would be a slave nation." Eleanor Holmes Norton, introduction to Slave Nation  

 

We do know, however, the cost of the colonists' victory: once independence was won, while the northern states gradually abolished slavery, slavery BOOMED in the South. The first federal census in 1790 counted 700,000 slaves. By 1810, 2 years after the end of the slave trade, there were 1.2 million enslaved people, a 70% increase. England ended slavery in all its colonies in 1833, when there were 2 million enslaved people in the U.S. Slavery in the U.S. continued for another three decades, during which time the slave population doubled to 4 million human beings. The U.S. abolished slavery in 1865; only Cuba and Brazil ended slavery at a later date. The foregoing is not meant to romanticize and project England as some kind of abolitionist savior had they kept control of the colonies. Dunmore himself was a slave owner. England was the center of the international slave trade. Despite losing the 13 colonies, England maintained its position as the most powerful and rapacious empire in the world till the mid-20th century. As England did away with chattel slavery, it replaced it with the capitalist wage slavery of the industrial revolution. It used food as a weapon to starve the Irish, and conquered and colonized large swaths of Asia, Africa and the Pacific.

 

Historian Gerald Horne wrote that "Simply because Euro-American colonists prevailed in their establishing of the U.S., it should not be assumed that this result was inevitable. History points to other possibilities...I do not view the creation of the republic as a great leap forward for humanity" (Counter-Revolution of 1776, ix). The American revolution was not just a war for independence from England. It was also a battle for freedom against the very leaders of that rebellion by hundreds of thousands of enslaved black people, a class struggle of poor white tenant farmers waged in many cases against that same white colonial elite, and a fight for survival of the indigenous populations. But the colonists' unlikely victory led to the creation of the largest slave nation in history, the near genocide of the indigenous populations, and a continent-wide expansion gained by invading and taking over half of Mexico. The U.S. went on to become an empire unparalleled in history, its wealth origins rooted largely in slave labor.

 

The struggles for equality and justice for all that the Declaration of Independence promised continue, of course, but Martin Luther King's promissory note remains unfulfilled. The late Chinese Premier Chou En-lai was once asked his assessment as to whether the French Revolution was a step forward in history. His response was, "It's too soon to tell". Was the founding of the United States a step forward in history? Or is it still too soon to tell?

Reconsidering Journalist and Gay Activist Randy Shilts

 

Twenty-five years ago, Randy Shilts went to his death a conflicted man. He talked openly about the frustration of his life being finished without being complete. Beyond the obvious pain, panic or dread he must have felt as he succumbed to AIDS-related health complications, he knew there was so much more to do, but he was not going to be around to do any of it. 

 

In many senses, he had not resolved personally the dissonance that seemed ever beneath the surface of just what his role was – or who he was supposed to be. He publicly clung to claims of being an objective journalist – “What I have to offer the world is facts and information,” he told Steve Kroft and the nation during one of his last interviews (aired on the CBS franchise 60 Minutes the week he died in February 1994). A closer examination of his life and work reveals, however, that it was not just facts and information that he sought to impart as an objective journalist. He sought to fully inhabit the role that Walter Lippmann long ago described as the “place” for many journalists: he embraced the idea that our society and culture were in need of an interpretation a trained journalist (and critical thinker) could provide, and, moreover, the need to shine the light of the media in the right places and set the agenda of what was important and what was not.

 

Shilts’s light-shining was what won him fierce critics, particularly as he sought to transform himself from just a “gay journalist” into a “journalist who happens to be gay.” As one of the nation’s very first openly gay journalists at a major daily newspaper, The San Francisco Chronicle, he sought to distinguish himself not only as a legitimate journalist, but also as a representative of the welcomed homosexual in a vehemently heterocentric world. Doing so meant he quickly personified the clichéd role of the journalist more skilled at making enemies than friends, one more worried about making a point. For Shilts, that is what journalism was for: to move society and the people in it to a new place, to a new understanding of or relationship with one another. Facts and information could do that, but Shilts understood how to focus those facts, right down to when a story appeared; he openly acknowledged that he timed stories that raised the frightening prospect of HIV and AIDS for Friday editions of The Chronicle to put some fear into his fellow gay tribe members as a weekend free for partying approached.

Writing about Shilts, someone I have grown to admire and love during more than nine years of research, required, however, more than just a tribute or formal canonization. After only a brief period of probing, one quickly finds that the legacy of Shilts remains a mixed one, depending upon whom one encounters. Many gay rights advocates laud his promotion of gay martyr Harvey Milk via Shilts’s first book, “The Mayor of Castro Street,” or his focus on the battle for lesbian and gay U.S. service members to serve openly in his last tome, “Conduct Unbecoming.” Others who fought the battles of the deadly HIV-AIDS pandemic that darkened America and the rest of the world reject praise and offer vocal and pointed criticism of Shilts as a heartless “Uncle Tom” of the gay liberation movement. These latter feelings flow directly from the praised – and scorned – work that won Shilts the most fame, “And the Band Played On,” and the portions of it resting on the idea of a “Patient Zero” responsible for “bringing AIDS to America,” as the editors of The New York Post opined in 1987.

My research attempts to bridge these two shores, unearthing the incredible drive and determination of this outspoken gay man from small-town Illinois who became an early and important barrier breaker as an openly gay reporter for a major daily newspaper. Shilts died at the age of 42, a full 22 years before scientists would clear his “Patient Zero” of having brought AIDS to America, and any posthumous review of Shilts seems incomplete and unfair if it does not include a full review of all aspects of the journalist, author and man. There is value in taking up the lingering issues of how to place Shilts as a journalist and early gay liberation leader, and in bringing at least some resolution to the remaining conflict. We can do so without relieving Shilts of having to own his own words and actions, all the while placing them in the context of his entire life.

The story of Shilts and the mixed legacy that remains a quarter century later has connecting points to our contemporary lives – where we seek a fuller understanding of historical people and times, and seek to do so in the correct context. Shilts’s story remains incomplete if we do not consider his successes alongside the problems centered primarily on his construction of the “Patient Zero” myth. Robbed of the living that could have resolved the dangling irritants in his story, we’re left with the same conflict Shilts felt as his life wound down. Similarly, we must await the end of many stories playing out around us and perhaps find some resolution in the context time affords.

Is Trump the Worst President in History?

 

As the chance of getting rid of Donald Trump — through impeachment or by voting him out — continues to dominate the headlines, the historical challenge is compelling.  No president has been a greater threat to the qualities that make the United States of America worthy (at its best) of our allegiance.

 

The rise of Trump and his movement was so freakish that historians will analyze its nature for a long time.  From his origins as a real estate hustler, this exhibitionist sought attention as a TV vulgarian.  Susceptible television viewers found his coarse behavior amusing. Then he announced that he was running for the presidency and it looked for a while like just another cheap publicity stunt.

 

But his name-calling tactics struck a chord with a certain group of voters.   Our American scene began to darken.  Before long, he was hurling such vicious abuse that it ushered in a politics of rage. As his egomania developed into full megalomania, the “alt-right” gravitated toward him.

 

The “movement” had started.

 

More and more, to the horror of everyone with the power to see and understand, he showed a proto-fascist mentality.  So alarms began to spread: mental health professionals warned that he exemplifies “malignant narcissism.”

 

Never before in American history has the presidential office passed into the hands of a seditionist.  And the use of this term is appropriate.  With no conception of principles or limits (“I want” is his political creed), he mocks the rule of law at every turn.

 

At a police convention in 2017, he urged the officers in attendance to ignore their own regulations and brutalize the people they arrest.  He pardoned ex-Arizona sheriff Joe Arpaio, who was convicted of criminal contempt of court.  He appointed Scott Pruitt to head the EPA so he could wreck the agency and let polluters have the spree of their lives.

 

Trump is fascinated by powerful dictators with little regard for human rights or democracy. He compliments Vladimir Putin and hopes to invite that murderer to stay in the White House.  He likes Rodrigo Duterte of the Philippines, a tyrant who subverts that nation’s democracy.

 

So, Trump certainly has the personality of a fascist.  But he is not quite as dangerous as other authoritarians in history.

 

In the first place, he lacks the fanatical vision that drove the great tyrants like Hitler and Stalin to pursue their sick versions of utopia.  He is nothing but a grubby opportunist.  He has no ideas, only appetites.   The themes that pass for ideas in the mind of Donald Trump begin as prompts that are fed to him by others — Stephen Miller, Sean Hannity, and (once upon a time) Steve Bannon. To be sure, he would fit right in among the despots who tyrannize banana-republics.  But that sort of a political outcome in America is hard to envision at the moment. 

 

Second, American traditions — though our current crisis shows some very deep flaws in our constitutional system — are strong enough to place a limit on the damage Trump can do.  If he ordered troops to occupy the Capitol, disperse the members of Congress, and impose martial law, the chance that commanders or troops would carry out such orders is nil.

 

Third, Americans have faced challenges before. Many say he is our very worst president — bar none.  And how tempting it is to agree.  But a short while ago, people said the same thing about George W. Bush, who of course looks exemplary now when compared to our presidential incumbent.

 

The “worst president.”

 

“Worst,” of course, is a value judgment that is totally dependent on our standards for determining “badness.”  And any number of our presidents were very bad indeed — or so it could be argued.

 

Take Andrew Jackson, with his belligerence, his simple-mindedness, his racism as reflected in the Indian Removal Act of 1830.  Take all the pro-slavery presidents before the Civil War who tried to make the enslavement of American blacks perpetual:  John Tyler, Franklin Pierce, James Buchanan. Take James K. Polk and his squalid war of aggression against Mexico.  Take Andrew Johnson, who did everything he could to ruin the lives of the newly-freed blacks after Lincoln’s murder.

 

The list could go on indefinitely, depending on our individual standards for identifying “badness.”  Shall we continue?  Consider Ulysses S. Grant and Warren G. Harding, so clueless in regard to the comparatively easy challenge of preventing corruption among their associates.  Or consider Grover Cleveland and Herbert Hoover, who blinded themselves to the desperation of millions in economic depressions.  And Richard Nixon, the only president to date who has resigned the office in disgrace.

 

Which brings us to Trump.

 

However incompetent or even malevolent some previous American presidents were, this one is unique. The Trump presidency is a singular aberration, a defacement of norms and ideals without precedent.  However bad some other presidents were, all of them felt a certain basic obligation to maintain at least a semblance of dignity and propriety in their actions.

 

Not Trump.

 

Foul beyond words, he lurches from one brutal whim to another, seeking gratification in his never-ending quest to humiliate others. He spews insults in every direction all day.  He makes fun of the handicapped.  He discredits journalists in order to boost the credibility of crackpots and psychopathic bigots.  He accuses reporters of creating “fake news” so he can generate fake news himself: spew a daily torrent of hallucinatory lies to his gullible followers.

 

He amuses himself, with the help of his money and the shyster lawyers that it pays for, by getting away with a lifetime’s worth of compulsive frauds that might very well lead to prosecutions (later) if the evidence has not been destroyed and if the statute of limitations has not expired.

 

So far, however, he has always been too brazen to get what he deserves, too slippery for anyone to foil.

 

Anyone with half an ounce of decency can see this wretched man for what he is.  They know what’s going on, and yet there’s nothing they can do to make it stop.  And that adds to Trump’s dirty satisfaction. Any chance to out-maneuver the decent, to infuriate them, quickens his glee.  It makes his victory all the more rotten, incites him to keep on taunting his victims.

 

It’s all a big joke to Donald Trump, and he can never, ever, get enough of it. 

 

The question must be asked:  when in our lifetimes — when in all the years that our once-inspiring Republic has existed — have American institutions been subjected to such treatment?  How long can American morale and cohesion survive this?

 

Nancy Pelosi has said that in preference to seeing Trump impeached, she would like to see him in jail.  Current Justice Department policy, which forbids the indictment of a sitting president, makes it possible for Trump to break our nation’s laws with impunity.  Impeachment is useless if the Senate’s Republicans, united in their ruthlessness and denial, take the coward’s way out.

 

So the prospect of locking him up may have to wait.  But the day of reckoning for this fake, this imposter who will never have a glimmer of a clue as to how to measure up to his office, may come in due time.  Then the presidential fake who accuses his victims of fakery will live with some things that are real:  stone walls, iron bars, a nice prison haircut, and the consequences of his actions.

(Re)-Claiming a Radical Evangelical Heritage at Christian Colleges

Wheaton College

 

The recent controversies at my alma mater, Taylor University, over the invitation of Vice President Mike Pence to deliver its Commencement Address have illuminated the continued fissures of this last decade at evangelical Christian colleges like Wheaton College (issues of race and gender), Gordon College (issues of sexual identity and gender discrimination), and Azusa Pacific (same-sex dating). Some Christian colleges, like Eastern Mennonite and Goshen College, have chosen to voluntarily disassociate themselves from the Council for Christian Colleges and Universities (CCCU), the touchstone organization for evangelical colleges and universities in North America, because the CCCU opposed their internal decisions to include homosexuals in undergraduate admissions and faculty and administrative hires.

 

At Taylor, the decision of university president Lowell Haines (ironically, a well-known long-haired folk and rock singer at TU in the Seventies) to invite the Vice President to graduation exposed latent political and social divisions among faculty, students and alumni that all previous TU Presidents had managed to avoid by their collective decisions not to bring divisive political figures to campus. As a direct (or indirect) result of crossing these long-held political boundaries, the relatively new President decided to resign just weeks after Commencement, while the University was in the midst of a new strategic plan with an ensuing Capital Campaign. It was clear that the President was “caught off guard” by the vast range of evangelical thought among his constituencies.

 

It would be easy to assume that the recent strains among evangelical higher education institutions are just the result of Protestant divisions that are ubiquitous throughout history. However, the history of American evangelicalism suggests something completely different: evangelical colleges are returning to a uniquely radical American nineteenth-century evangelicalism. This is especially the case with Christian colleges with no denominational control, such as Wheaton, Taylor, and Gordon. In the past century, Wesleyan scholar Don Dayton wrote Discovering an Evangelical Heritage, a small book about the conflicted identities of American evangelicalism that had a profound impact on Christian college campuses. Dayton chronicled the lives of nineteenth-century evangelicals like Theodore Weld, Jonathan Blanchard, Charles Finney, the Tappan Brothers and other abolitionists of the antebellum period who founded colleges like Oberlin, Knox, Grinnell, Berea, Wheaton and a host of other Midwestern colleges with egalitarian visions of race and gender accompanied by a heavy critique of our political establishments. These colleges were a true reflection of a radical evangelical re-visioning of the United States through liberatory practices towards women and African Americans in their teachings, theologies and political and social activism. Understanding the history of these colleges led to a more complex understanding of American evangelicalism that includes a totally different brand of Christianity from twentieth-century fundamentalism with its rigid social rules and literal interpretations of Scripture.

 

The most radical and renowned evangelical leader of that period was John Brown, who was, according to W.E.B. DuBois, “…The only white man worthy of a membership in the NAACP.” As most students of American history know, Brown and his family considered themselves part of a “martyr-age” of evangelicals who would live and die for the emancipation of enslaved African Americans. Brown’s motivation was based on Scripture, especially the proclamation that God “hath made of one blood all nations”.

 

The antebellum evangelical movement and Fundamentalists disagreed over issues like the role of women in the church, racial segregation, creationism and science, the definitive role of Scripture in faith and practice, and other theological issues that were informed by a premillennial view of the End Times. Thus, fundamentalist and evangelical churches both quietly (and sometimes openly) argued with each other until the issues were settled in civil courts, legislation, and within other societal structures.

 

After the Civil War and into the twentieth century, this post-millennial evangelicalism faded, collapsing in the aftermath of World War I. After witnessing the devastation of the war, few theologians believed that the world would continue to improve and thus usher in the Second Coming of Christ. The Fundamentalist movement was birthed in the Progressive Era as a counter to the secular teachings and influence of Marx, Freud, Darwin and the higher criticism of Scripture taught on college campuses and infiltrating main-line Protestant denominations.

 

Thus, many former evangelical colleges like Wheaton (and others) pulled inward and did not engage the wider collegiate academic culture that was perceived to be hopelessly secularized. Many Anabaptist institutions were the exception, and their oppositional culture helps explain why Eastern Mennonite and Goshen have departed from fundamentalist teachings on race, class and gender.

 

Other-worldly fundamentalism held its sway in interdenominational collegiate and church institutions until a small group of postwar evangelicals (among them Harold John Ockenga) began a new movement in 1947. Inspired by Carl Henry and the National Association of Evangelicals, this group of influential and intellectual evangelicals attempted to consolidate and organize the more moderate wing of the evangelical movement.

 

After the relatively calm period of the postwar Eisenhower administration (and with the election of our first Catholic President, John Kennedy), the turbulent Sixties, with the Vietnam War, Vatican II, the women’s movement and the Civil Rights Movement, spotlighted social issues that the Church had not effectively dealt with since the nineteenth century. Taken as a whole, these issues animated more evangelicals to splinter into smaller political and social interest groups represented by journals such as Sojourners, Christianity Today, Christian Herald and a slew of non-denominational publications from more fundamentalist institutions like the Sword of the Lord out of fundamentalist Tennessee Temple University. Right-wing preacher and broadcaster Carl McIntire of New Jersey, University President Bob Jones in the Deep South, popular Midwestern broadcaster Dr. M.R. De Haan, and other popular radio broadcasters attempted to counter the more inclusive messages from the moderate to liberal voices of the new evangelicalism represented by Billy Graham.

 

Many historians agree that the defeat of Jimmy Carter in 1980 by Ronald Reagan signaled a significant divide between the progressive evangelicalism represented by Carter and the conservative policies and personal born-again testimony of President Reagan. Even though Reagan rarely attended church and was a divorced man (an anathema to conservative evangelical groups), evangelicals believed, as they later would with Donald Trump, that Reagan’s policies, such as “trickle-down” economics, fit their newly found upper-middle-class lifestyles. Those lifestyles birthed both materialism and Mammon, the love of money, which Charles Finney had warned evangelicals against in the previous century and predicted would end the evangelical movement in America.

 

The Reagan administration also made evangelicals feel less guilty about their clear responsibilities to the poor (for which there are over two thousand admonitions in Scripture). Also tied to issues of class were the ubiquitous and conflicted issues of race and gender bound up with women’s ordination and the racial integration of churches and colleges. These issues were resurrected by the Civil Rights Movement (which few white evangelical leaders joined); other main-line Protestant denominations, however, felt that these social issues must also be resolved within their churches. As a result, the ordination of women and the wider acceptance of African Americans in positions of authority and empowerment moved slowly along, leaving evangelical Christian colleges in both a modern-day quandary and debate.

 

Issues of race, class and gender, joined by the growing acceptance of the LGBT community within society and main-line churches (like the Episcopal Church of the United States and other main-line Protestant denominations), also influenced a new generation of young evangelicals. With the election in 2008 of our first African American President, Barack Obama, a self-professed Christian, a member of the controversial minister Dr. Jeremiah Wright’s Trinity United Church of Christ in Chicago, and a recent “convert” to endorsing gay rights, the debates on campus and in most churches began in earnest and furthered the generational divide among evangelicals.

 

Thus, the current evangelical divide has long-standing historical roots. In addition, I identify three systemic causes behind this wide chasm among evangelicals that threaten the future existence of their institutions above and beyond growing costs and competition with state institutions.

 

1. Students at evangelical colleges no longer look solely to Scripture in order to make decisions on issues of human sexuality. While a literal interpretation of Scripture is a main tenet in fundamentalist churches (and some conservative evangelical institutions), it is no longer the only touchstone for students’ moral decision making. Also, the complex world of biblical hermeneutics (and who speaks for God in a post-modern era) has made it difficult for younger evangelicals to make universal moral proclamations. In Wesleyan-centered churches (like Anglicans and Methodists), the theological quadrilateral of Scripture, Tradition, Reason and Experience is being more widely applied to decisions on human sexuality and other moral issues. Thus, for some evangelicals, Experience (the subjective role of the Holy Spirit) has superseded literal interpretations of Scripture in support of same-sex rights and other controversial moral issues. Further, sexual norms in general are not as salient or relevant to younger evangelicals. The former taboos of living together and pre-marital sex are just about non-existent among the young. The former teachings and admonitions of complete abstinence before marriage or against living together without a license never took hold with this current generation. And, for some evangelicals, same-sex relationships were part and parcel of these former prohibitions: they did not hold up to the standard of reason or to the experience of people whose lives they respected.

 

2. The Reagan Revolution generation of evangelicals is perceived by younger evangelicals to be preoccupied with health, wealth, literal interpretations of Scripture, the coming Apocalypse and individual rights. There is also a deep division over the face and nature of secular Presidential leadership. Older evangelicals maintain a deliberate divorce between Donald Trump’s personal life and the policies that he favors that directly benefit them. Gone are the days when “worldly” was heard as a pejorative admonition among fundagelicals, given the obvious conspicuous consumption among mega-churches.

 

3. There is a current generation of evangelicals (like Sojourners, led by Jim Wallis in D.C.) and other popular authors and speakers (Tony Campolo, Rob Bell, Matthew Vines) who are harkening back to the radical evangelical heritage of the nineteenth century. They reject the current face and nature of evangelicalism as personified by media-driven leaders (Franklin Graham, Jerry Falwell, Jr., James Dobson), whom these young evangelicals consider culturally bound, both worldly and other-worldly, and perceive as compromising their professed values for their current material comforts, as imaged in this current Presidential administration. This distaste for America’s secular leaders (along with keeping a considered distance from the current crop of evangelical leaders that they believe compromise their values for the “approval of men”) echoes John Brown’s contempt for compromising Christians and slaveholders during the antebellum period of U.S. history.

 

If these trends continue, then I believe that evangelical colleges will collapse from within. They are already facing a steep enrollment decline, largely because their alumni base is not sending their children to their alma maters. Alumni do not see either the spiritual need or the monetary value, since there are so many “good” secular universities with active Christian groups on campus that cost a lot less. Driven by the current vocationalism and mounting student debt, shrinking financial reserves are also a major threat to these institutions.

 

The cultural ramifications of these social and cultural upheavals on their churches are also profound. As denominational churches and the evangelical movement (as represented by mega-churches) continue to differ in their stances towards the acceptance of LGBT church members and priests, pre-marital sexual relationships, multicultural church bodies and the face and nature of American hyper-capitalism, both church and college structures will continue to divide their institutions, and the degree to which Christians can and will compromise their spiritual values against secular realities will continue to be debated.

History is a Verb, Something You Do: An Interview with Mark Doyle

 

Mark Doyle is a historian of Ireland and the British Empire and Professor in the Department of History at Middle Tennessee State University.

 

 

What books are you reading now?

 

Because it’s summer, I’m being a bit self-indulgent in my reading. I just finished Ulysses, which, to my shame as an Irish historian, I had never read before. It was a harrowing but ultimately very rewarding experience - knowing the historical background was a help, but I wouldn’t have survived without online guides like Shmoop and joyceproject.com to guide me through the murky bits. I feel nothing but admiration for people who read and appreciated it upon its initial publication in the early 1920s without the assistance of our modern Joyce Industrial Complex.

 

I always try to keep at least one novel going – not just because it makes a nice break from my academic reading, but also because it helps me be a better writer. After Ulysses I started The Dispossessed by Ursula K. Le Guin, a science-fiction novel about two worlds with contrasting social and political systems, one capitalist/individualist and the other anarchist/collectivist. I don’t read much sci-fi, but I am drawn to books that take an abstract idea or ideology and follow it to its logical conclusions. George Orwell, HG Wells, Jose Saramago, Margaret Atwood, and George Saunders are other writers in this vein – in addition, I’m sure, to many sci-fi writers whose work I’ve yet to read. I’m quite enjoying The Dispossessed and rather wish I’d read it twenty years ago, when I first began thinking about the theory and practice of these different social systems.

 

I’m also reading At Freedom’s Door by Malcolm Lyall Darling. Darling was a member of the Indian Civil Service from 1904 to 1940, mostly in Punjab, and in 1946-7 he took a tour around northern India to gauge the state of the country on the eve of independence. While it is somewhat colored by Darling’s cultural assumptions and blind spots, it’s an invaluable source about the social and economic conditions in (primarily) rural India just before Partition. I’m particularly interested in the communal relations that Darling describes and am thinking about how his experiences in northwest and northern India might compare with conditions further east. I’m supervising a PhD student who is working on partition in Bengal (northeastern India), and the nature of Hindu-Muslim relations on the ground will be a crucial component of his research.

 

Finally, I’ve just started Fossil Capital by Andreas Malm. I’ve been thinking (and fretting) quite a lot about the role of the historian in the face of catastrophic climate change, and I’m hoping this book, about the roots of the fossil fuel economy, can suggest one way forward. Sometimes our work seems so trivial in comparison with the existential threats we’re faced with – more than once I’ve considered chucking it all in and just chaining myself to a tree in the Amazon – but I also know that our work is necessary for helping humanity survive whatever lies ahead. Malm’s book seems like one promising way forward, an effort to understand how we got into our current predicament and a suggestion about how to find our way out, but there are other approaches that may be just as important. I’ll say more about this below.

 

What is your favorite history book?

 

It’s impossible to choose just one, but I think the one that has had the biggest impact on me as a scholar (and a human) is Ordinary Men by Christopher Browning. It’s a study of a Nazi police battalion who rounded up and killed Jews during the Holocaust, but it’s also much more than that, a multilayered accounting of the social, political, and cultural forces that can lead people to commit extraordinary violence against other people. Browning’s conclusion, that any group of people placed in similar circumstances would act as these German policemen did, is a masterpiece of complex historical argument that made me feel personally implicated. It forced me to ask myself whether I would have the courage to stand up for what I knew was right even when everyone else was doing wrong, whether I would be the one in a hundred who refused to shoot or purposely misfired or actually tried to intervene to save lives, and it’s a question that I still ask myself all the time. When I read the book in graduate school I was still figuring out what the purpose of history was, what role it could play in contemporary society, and here was a powerful answer that continues to resonate in much of my work. History can help us understand the structural forces that foster suspicion, prejudice, resentment, and violence, and once we understand those forces we can begin to make better choices not just about how we live our own lives, but how we order our societies.

 

Why did you choose history as your career?

 

I never really chose to become a historian. It was more an accumulation of smaller decisions that led me in this direction: the decision to add a history major to my philosophy major as an undergrad, the decision to study abroad in Dublin my junior year, the decision to apply directly to PhD programs rather than getting a Master’s first, and so on. At a certain point I was so far along the road that I was incapable of imagining what else I would do with my life, and I also found that I was pretty good at it and mostly enjoyed it, and so here I am. It sounds trite to say “I didn’t choose history. History chose me,” but I suppose it’s sort of true. At a more fundamental level, though, I suppose I gravitated toward history because I liked hearing stories about people and places beyond my own experience, and that remains my primary motivation today. 

 

What qualities do you need to be a historian?

 

Curiosity, empathy, and a commitment to evidence-based, rational argument. It helps to have a bit of imagination, too. In a way, being a historian is like being a novelist: you have to imagine your way into lives that are very different than yours in order to come up with plausible explanations for why things happen the way they do. Unlike novelists, however, we’re required to root our imaginings in the available evidence: the art of history is essentially trying to get that equation right.

 

Who was your favorite history teacher?

 

It’s difficult to name a single teacher. The best teachers I’ve had, whether in history or something else, have all been good storytellers. I’m a frequent practitioner of abstract thought and advocate for big ideas, but the most effective entry point into any topic – before you get to the abstractions and ideas – is a good story that’s capable of eliciting an emotional response. As an undergrad at Tulane University I had a professor, Sam Ramer, who would tell the most outlandish (but true!) stories about Russian history, and I can remember laughing and shaking my head in wonder that such things ever really happened. It was almost enough to get me to adopt a Russian studies major, until Dr. Ramer persuaded me that my ignorance of the Russian language and impending departure for a year in Dublin might make it hard to fulfill the requirements. Fortunately, as it turned out, outlandish true stories aren’t confined to Russian history: Dr. Ramer was just unusually good at telling them.

 

What is your most memorable or rewarding teaching experience?

 

Many of my students, particularly at the survey level, come to history with a negative preconception about the discipline. In their high school experience history was mostly about memorizing dry facts (names, dates, etc.) in preparation for a standardized test, and they often don’t think of it as a subject devoted to argumentation, one in which the questions are as important as the answers. My most rewarding teaching moments come when a student tells me – or, better, inadvertently shows me – that my class has changed the way they think about history. I don’t need to convert all of my students into history majors, but I do want all of my students to develop certain habits of mind that they can apply to all realms of their lives: critically assessing information, considering multiple points of view, grasping the provisional nature of historical (and many other kinds of) knowledge, finding ways to articulate their ideas with clarity and precision. These habits might show up in their coursework, but I also see them when a student has a revelation in class (one student realized, in the midst of a discussion about 20th-century fundamentalism, that she had been raised by fundamentalists), when students cluster in the hallway to further debate something we were talking about in class, or when a previously reticent student begins to find her voice. These are the moments that make the job worthwhile.

 

What are your hopes for history as a discipline?

 

As I hinted earlier, I think historians have an important role to play in confronting the various crises we face at the moment. The chief crisis is climate change, and so we obviously need to be doing lots of environmental and climate-related history, but this is a problem whose impacts go well beyond weather, ecology, or the natural world. Migration (and attendant racism and xenophobia), resource scarcity, traumatic economic restructuring, public health crises, civil wars, interstate violence – the knock-on social effects of climate change will be massive, and many are already getting underway. All of these processes have their own history, and my hope is that historians will use their expertise in these matters to help our societies respond in humane, nuanced, and evidence-based ways to the crises that are coming.

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

 

I don’t deliberately collect old books or objects, but in this job it’s hard to avoid accumulating large quantities of both. Most of my recent acquisitions relate to a book I’ve just written about the English rock band the Kinks. Without really intending to, I’ve ended up with quite a few magazine clippings, records, and ephemera related to the band. My favorite is a small doll of Ray Davies, the band’s lead singer and songwriter, that my wife gave me a couple of years ago. He’s smirking at me from a bookcase as I type this, in fact, probably wondering why I haven’t corrected the page proofs yet.

 

What have you found most rewarding and most frustrating about your career? 

 

The most rewarding aspects are related to teaching, although the opportunities for travel have also been invaluable. The most frustrating aspects are the things that most humanities academics complain about, I suppose: a devaluing of our work in the public discourse, lack of government support for our work, the casualization of the professoriate and disappearance of good tenure-track jobs, expanding administrative duties that keep us from performing our core functions. On the whole, however, I feel tremendously fortunate to be in a profession that allows me to indulge my curiosity and share my enthusiasms with captive audiences.

 

How has the study of history changed in the course of your career?

 

I think we’ve become much more aware of how our work impacts the public. For all sorts of reasons – economic, political, demographic – we’re under growing pressure to justify our existence, and this has given rise to more public history programs, more outreach via websites and social media, more interventions in political debates, more efforts to communicate our research to people beyond the academy. On the whole I think this is a good thing, although too much emphasis on the impact or utility of historical scholarship can leave little space for the exploration of esoteric topics for their own sake, and I would like there to be continued space for that. I don’t want historians to be judged simply by their “outputs” or history departments to be valued simply for their ability to get their students jobs, and this tends to be the default position of university administrators and legislatures. The challenge is to define for ourselves the value of our discipline and then communicate that to the wider public, and on the whole I think we’re better at that now than when I started down this path twenty years ago.

 

What is your favorite history-related saying? Have you come up with your own?

 

I’ll go with Marx: “Men make their own history, but they do not make it as they please; they do not make it under self-selected circumstances, but under circumstances existing already, given and transmitted from the past.”

 

In my research methods class I tell my students to think of history not as something that you learn but as something that you do. That section of the class is called “History is a verb,” so I’ll claim that as my own history aphorism.

 

What are you doing next?

 

This summer I’m writing an article about several tours of Ireland by the African-American choral group the Fisk Jubilee Singers in the 1870s. This is an offshoot of a larger project, which may become a book or may become something else, about African and Asian migrants/immigrants to Ireland in the nineteenth century. They were there, but very few historians have thought to look for them. As Ireland becomes ever more multicultural, it’s important to know more about the history of migration into the country, particularly by people of color, and of the ways mainstream Irish society regarded outsiders in their midst.

Sat, 17 Aug 2019 18:13:04 +0000 https://historynewsnetwork.org/article/172609 https://historynewsnetwork.org/article/172609 0
Joe Biden is a Product of His Time

 

Democratic frontrunner Joe Biden recently apologized for his characterization of the segregationists he worked with as a U.S. senator: “At least there was some civility. We got things done. We didn’t agree on much of anything,” he said. This episode illustrates a troubling phenomenon of our current political culture and to a certain extent historical discourse. As American attitudes toward gender, race, and class have evolved, scholars and the public have a tendency to criticize the practices and beliefs of their elders. 

 

For example, some contemporary historians have been critical of the Founding Fathers, men who either owned slaves or agreed to include slavery as part of the new American nation. Particular opprobrium has been aimed at Thomas Jefferson, a man who condemned slavery, yet owned slaves. Furthermore, Jefferson claimed that the black race was intellectually inferior, yet penned the words “All men are created equal.”

 

The Founders are not the only dead white males whose behavior does not pass muster with today’s historians. Abraham Lincoln also has come under fire from a group of historians who challenge the conventional view of him as “the Great Emancipator.” The most outspoken of these voices was the late historian Lerone Bennett Jr., who wrote, “The real Lincoln... was a conservative politician who said repeatedly that he believed in white supremacy.” In his book Forced into Glory: Abraham Lincoln’s White Dream, Bennett writes that the entire concept of emancipation was antithetical to Lincoln, who reluctantly issued the Emancipation Proclamation only as a military tactic. This reading conveniently ignores Lincoln’s public statements disavowing slavery and his efforts to pass the Thirteenth Amendment.

 

How should we assess this phenomenon of a later generation’s unfavorable view of their political forebears’ attitudes? In the case of the Founding Fathers, clearly they either participated in or tolerated slavery. However, they were ahead of their time in creating a government based on the sovereignty of the people. That was their goal, and they would not let differences over slavery prevent them from achieving it.

 

Lincoln too was a product of his time. In his day, all but a tiny minority of abolitionists believed in racial inequality. Most whites in the North, like Lincoln, opposed slavery but had serious misgivings about racial equality and the mixing of the races. They were not willing to fight to end slavery in the South or the border states. But when Lincoln was faced with a choice between the destruction of the Union and the abolition of slavery, he chose the latter. His desire to save America was greater than the racial prejudice of his day. The abolitionists William Lloyd Garrison and Frederick Douglass both believed Lincoln had evolved.

 

Biden also is a product of his time. An incident in Biden’s first year in the Senate is instructive on this point. In 1973, Senator Jesse Helms (R-NC), also a freshman, was speaking on the Senate floor. Biden, who disagreed with Helms’ positions on civil rights and race, went to Senate Majority Leader Mike Mansfield to express his disgust. Mansfield advised the young Biden to find something good in every senator so that he could work with all of them to accomplish things. Biden took that advice. When he came to the Senate, many of the most powerful figures were Southern committee chairmen with segregationist pasts. But Biden learned to work with them, as did another young man who had come to the Senate ten years earlier, Ted Kennedy. As for Biden’s opposition to court-ordered busing, a poll taken in 1973 indicated that only nine percent of black parents wanted their children bused away from their neighborhood schools. By the end of the 1970s busing had disappeared as a divisive issue.

 

When assessing past attitudes, it is important to recognize that most people are influenced by the thinking of their day. Even figures who were ahead of their time, as the Founding Fathers and Lincoln were, are products of their environments. Moreover, politics in any era is complicated, and compromises are sometimes necessary. The Founding Fathers had a country to create. Abraham Lincoln had a country to preserve. Joe Biden, who believes in racial equality, wanted to create equality of opportunity and promote the progressive agenda. To achieve those ends, all these figures had to work with people whose views were antithetical to theirs. Their willingness to compromise when necessary in the service of a higher aim was not weakness. It was practical statesmanship.

Sat, 17 Aug 2019 18:13:04 +0000 https://historynewsnetwork.org/article/172655 https://historynewsnetwork.org/article/172655 0
Another Kind of Patriotism

Steve Hochstadt is a professor emeritus of history at Illinois College who blogs for HNN and LAProgressive and writes about Jewish refugees in Shanghai.

 

I went to a patriotic rally on Sunday. There was a lot of talk about flags, which were shown with great reverence. Military veterans were honored as heroes, due great respect. It was colorful and loud.

 

The rally had nothing to do with Trump. The event was a traditional Ojibwe, or Anishinaabe or Chippewa, Pow Wow, celebrated every year at the Lac Courte Oreilles reservation in northwestern Wisconsin. The Honor the Earth Homecoming Pow Wow is the opposite of the “patriotic” rallies that Trump is holding as the beginning of his re-election campaign.

 

On the way to the site, signs were posted along the road urging everyone to think of themselves as unique and worthy persons. Inside, the focus was entirely on the celebration of Native American traditions, wisdom, and culture, without any hint of comparison to other cultures. Members of the local tribe were joined by tribes from across the region, each of whom could sing and drum their own songs. There were no enemies, just friends.

 

Ojibwe veterans from all service branches were named and honored for their service to the American nation and to the Ojibwe nation. But no weapons were displayed, except ceremonial versions of traditional hunting weapons carried by brightly costumed dancers.

 

Politics was conspicuously absent, as was any complaint about how the Ojibwe and all other Native Americans have been treated by white settlers who invaded the lands they lived in and took them for their own. The only historical hint I heard from the announcer, who was also broadcasting over the reservation’s radio station WOJB, was his brief mention that the Anishinaabe had been defending their land for hundreds of years, long before the appearance of whites.

 

The messages of the Pow Wow were clear: “We are patriots. We love our land and our unique culture. We love America and have defended it in every war. We welcome and respect all Americans.”

 

Donald Trump’s rally in North Carolina, and his whole constant campaign about himself, send the opposite messages. “We are patriots, better patriots than you. We love America and therefore we hate you. Hating you is true patriotism.”

 

I find the implicit violence of the crowd in North Carolina to be just a few steps away from the real violence of the white supremacists in Charlottesville. What if a woman in a hijab had walked in front of that crowd as they chanted “Send her back”? That is the new Republican model of patriotism.

 

What could love of America mean? It could be love of the land, the amazing lands of our 50 states, encompassing beautiful vistas of mountains and lakes and prairies and desert that might be unmatched anywhere else. The Ojibwe love their land as a sacred trust from previous generations, the little bit that has been left to them after centuries of white encroachment. They wish to preserve it forever.

 

Love of America could be allegiance to the principles at the foundation of our political system. Those principles have not been consistently followed, and a truly democratic and egalitarian nation is still a dream to be realized, rather than a reality to be defended.

 

It could be reverence for American history, our unique national story of the creation of a new democracy by European immigrants and the evolution of the United States toward a more perfect union by embracing the lofty principles set forth in our founding documents. That story has many dark chapters, but we could say that American history is a narrative of overcoming – the struggle to overcome regional division, racism, sexism, homophobia, poverty, a struggle that may continue long into the future.

 

Love of America could be affection for Americans. I think of my own tendency to root for American athletes when they compete against athletes from other nations at the Olympics, the World Cup, or in tennis Grand Slams. Americans are incredibly diverse, and it is not easy to put into practice a love for all Americans, no matter the ethnic, economic, educational, regional, and personality differences among them. At the least, it should mean that one extends good will toward another American until that good will is proven misplaced by inhumane behavior.

 

I don’t see any of these forms of love for America in contemporary conservative politics. Conservatives support digging up American land rather than preserving it and fight against every attempt to preserve clean water and air. They taunt conservation organizations who worry about global warming, deny the science of climate change, and oppose all efforts to prevent our own land and the whole globe from becoming less friendly to human habitation. The Trump campaign now sells Trump-branded plastic straws as a deliberate sneer at attempts to save ocean life from being overwhelmed by plastic. For today’s conservatives, American land is a source of financial exploitation: don’t love the land, love the money you can make from it.

 

Today’s conservatives, preceding and following Trump, don’t respect the democratic principles that America has at least tried to embody. From blatant gerrymandering to vote suppression to attacks on the free press to praise for dictators and criticism of foreign democracies, principles have been entirely replaced by temporary political advantage as the source of conservative action.

 

Conservatives hate American history, instead trying repeatedly to substitute myths for facts. They deny the historical realities of racism, the “patriotic” excesses of McCarthyism, the expropriation of Native American lands. They attack historians who simply do their job of uncovering evidence about how Americans behaved in the past, good and bad. And they celebrate some of the worst Americans: the Republican state government in Tennessee has now named July 13 as “Nathan Bedford Forrest Day”, honoring the Confederate general who became the first Grand Wizard of the Ku Klux Klan.

 

Conservatives don’t like most Americans. Again led by Trump, and operating as his megaphone, Republican politicians attack Democrats as enemies of America, despite the fact that Democrats represent the majority of American voters.

 

I didn’t see any Trump hats at the Ojibwe Pow Wow, and I doubt that any Native Americans cheered for Trump in North Carolina. These very different rallies represent opposing ideas about patriotism and America. In my opinion, one expresses a beautiful vision of land and people that has stood for America for hundreds of years. The other is an incoherent reverence for a cult figure of dubious value.

 

I never liked cults.

Sat, 17 Aug 2019 18:13:04 +0000 https://historynewsnetwork.org/blog/154231 https://historynewsnetwork.org/blog/154231 0
Can You Ever Tame the Shrew? Is She Really the Shrew?

 

In Padua, Italy, in the 1500s, the populace had its rules. Rule #1 – marry for money. Rule #2 – marry for money. Rule #3 – marry for money.

 

Kate, the bombastic, headstrong, in-your-face daughter of a wealthy merchant there, Baptista, does not believe in those rules and is intent on shattering them. When she is of the age to be married, young Petruchio arrives in town, handsome as anyone can be, and full of good humor, charm and plenty of spunk. He’s the rich heir to his father’s massive fortune.

 

Kate falls for him, but keeps putting him off to upset the rules makers. There’s no rush, either, because her father has declared that she cannot marry until her younger sister Bianca walks down the aisle, flowers in hand.

 

Sweet, innocent and very quiet Bianca is pursued by an army of feisty young suitors and meets numerous men and women who are not who they seem to be. Who will marry first? Can Kate stay the shrew? Will old Padua ever be the same again?

 

All of those questions are answered amid the lovely forest and meadows of The Mount, writer Edith Wharton’s sprawling, beautiful estate in Lenox, Massachusetts, where Shakespeare and Co. stages The Taming of the Shrew in an outdoor dell just about every night of the week through August 17.

 

This new production of William Shakespeare’s classic is gorgeous, as gorgeous as the forest in which it is staged, as gorgeous as a moonlit night in New England.

 

Director Kelly Galvin has done wonders with the setting. Actors race on to and off the stage from the woods and disappear back into them. Gangs of characters meander under the trees. The forest is large and deep and the meadow wide so there is plenty of room for Galvin to work some magic and she does.

 

Galvin has given the play a splashy new look. It appears to be set in 2019, and the characters all seem like extras from Godspell or a college beer blast – all tie-dyed shirts, sneakers and shorts. Baptista, the dad, is his own fashion statement in a light green, 1601-meets-2019 suit and a lovely pea green hat. He looks like Leonardo DiCaprio, too.

  

The stage in the Dell is a small one, set on boards. It is framed by huge billboards with words like POW!, BUZZ! and O! painted on them in color. It looks like a glossy scene from one of the old Batman television shows (“To the Batcave, Robin!”). There is much audience participation in the comedy, too, with large, colorful applause signs raised from time to time that draw boisterous roars from the audience. The audience is frequently encouraged to groan AWWWW at some points in the play and OOOOO at others, and it does.

 

Director Galvin has also livened up the play with several contemporary rock and roll songs, such as Billy Idol’s White Wedding (staged during a wedding in the play).

 

The director works with a fine cast, too. Particularly good are Nick Nudler as Petruchio, Matthew Macca as dad Baptista, Jordan Marin and Caitlyn Kraft (don’t ask) as Gromio, Devante Owens as Lucentio, Bella Pelz as Bianca, Daniel Light as Hortensio, and Dara Brown as Tranio.

 

Kirsten Peacock, as the tempestuous Kate, steals the show. She plays her role to the hilt, at times appearing as a brutish woman wrestler ready to body-slam someone rather than the comely milady.

 

In short, this Taming of the Shrew  is a madcap romp through history, literature and the woods. It is a lot of fun. The theater really encourages people to attend the play, too, heavily advertising it in the Berkshires and slashing ticket prices to just $10.

 

NOTE – the recurring controversy. The play may have been a frolic in the 1600s, but its message, that it is better to be a docile, subservient wife than an aggressive shrew who must get her way, has reverberated over the centuries. You would think that the #MeToo movement would be out in full force against this one. The end of the play is particularly savage in its sexism, when the newlywed husbands stage a contest to see whose wife is the most subservient.

 

You could re-write the whole play and amend the “shrewishness” and sexism of it, but then it would not be Shakespeare and it would not be literary history. You don’t like the sexism? Frown, grimace and roll your eyes all at the same time.

 

It’s just a play, and as Shakespeare often said, the play’s the thing.

 

See this play. It is a first-rate production of a first-rate comedy, full of mirth, tricks, ooos and aaahs, wild clothing, joyous weddings, plenty of rockin’ good music and a bunch of young people doing acrobatics you did not think possible.

 

Hooray for sneakers, and Shakespeare too.

 

PRODUCTION: The play is produced by Shakespeare and Co. Sets: Devon Drohan, Costumes: Amie Jay. The play is directed by Kelly Galvin. It runs through August 17.

Sat, 17 Aug 2019 18:13:04 +0000 https://historynewsnetwork.org/article/172626 https://historynewsnetwork.org/article/172626 0
Roundup Top 10!  

Why disabled Americans remain second-class citizens

by David Pettinicchio

The big hole in our civil rights laws.

 

How the failure of popular politics triggered the rise of Boris Johnson

by Jesse Tumblin

Instead of solving intractable problems, public referendums simply exacerbate them.

 

 

Ellis Island's history casts today's border cruelty in an even harsher light

by Megan J. Wolff

Conditions right now are dirtier, more dangerous, and significantly crueler than they ever were at Ellis Island -- most pointedly so where children are concerned.

 

 

When the American right loved Mexico

by Mario Del Pero and Vanni Pettinà

Back when conservatives exalted free markets, our neighbor to the south was a vital ally.

 

 

2020 election is a test America can't afford to fail

by Nicole Hemmer

If the American system re-elects Trump, then something is deeply wrong with either our system or ourselves.

 

 

Trump's Supreme Court Challenge Has a Historical Precedent

by Bethany Berger

As Trump agrees reluctantly to respect the court—at least in the case of the census—he follows, in part, that long-ago legal victory of the Cherokee Nation.

 

 

Chicago’s resistance to ICE raids recalls Northern states’ response to the Fugitive Slave Act

by Kate Masur

Almost 170 years later, the Fugitive Slave Act is viewed as one of the most repressive federal laws in all of American history.

 

 

The long, ugly history of insisting minority groups can’t criticize America

by Tyler Anbinder

Trump’s attack against four Democratic members of Congress fits a pattern in U.S. politics.

 

 

Fifty Years After the Moon Landing, Recalling One Small Misstep

by Tad Daley & Jane Shevtsov

Why did the first humans to set foot off Planet Earth plant the flag of only part of Planet Earth?

 

 

Trump revives the idea of a ‘white man’s country’, America’s original sin

by Nell Painter

It can’t be left to black Americans alone to resist the president’s racism. Citizens of all colours need to resist, and embrace activism.

 

 

Lincoln Would Not Recognize His Own Party

by David W. Blight

He would see the Republicans as the antithesis of everything he fought for.

 

 

All the Presidents’ Librarians

by Michael Koncewicz

Despite being spied on and intimidated during my time in Yorba Linda, I still think presidential libraries are too important for historians to wash their hands of them.


 

The Vicious Fun of America’s Most Famous Literary Circle

by Jennifer Ratner-Rosenhagen

The Algonquin Round Table trained a generation of socially conscious writers.

Sat, 17 Aug 2019 18:13:04 +0000 https://historynewsnetwork.org/article/172649 https://historynewsnetwork.org/article/172649 0
Do We Want the America of Frederick Douglass or Donald Trump?

From left to right: Congresswomen Ayanna Pressley, Ilhan Omar, Alexandria Ocasio-Cortez, and Rashida Tlaib

 

 

One version of America is that of President Trump, whose recent tweets led the U. S. House of Representatives to condemn his “comments that have legitimized and increased fear and hatred of new Americans.” His slogan “Make America Great Again,” his attempts to limit voting, and his pandering to Christian evangelicals are not-so-subtle signals that he perceives himself as defending the fortress of white, primarily male and Christian, dominance against the threat of increasing darker-skinned peoples. 

 

Before his recent tweets against “The Squad”—congresswomen Alexandria Ocasio-Cortez (N.Y.), Ilhan Omar (Minn.), Ayanna Pressley (Mass.), and Rashida Tlaib (Mich.), Omar and Tlaib also being Muslim—there were also his claims that President Obama was born in Africa, his reference to Haiti and countries in Africa as “shithole countries,” and his stated preference for immigrants from countries like Norway.

 

His recent charge that the four congresswomen “originally came from countries whose governments are a complete and total catastrophe, the worst, most corrupt and inept anywhere in the world (if they even have a functioning government at all)” is not only factually wrong—three of them were born in the USA—it is also reminiscent of his “shithole countries” remark.

 

He added that rather than criticizing “the United States, the greatest and most powerful Nation on earth,” the four women should “go back and help fix the totally broken and crime infested places from which they came.” By doing so, Trump joined a long line of false patriots who equated criticizing America with hating and being disloyal to it.

 

A competing and completely different version of America is that of the abolitionist Frederick Douglass, whom President Obama identified as one of America’s “great reformers.” In a Boston speech a century and a half ago, Douglass insisted that “our greatness and grandeur will be found in the faithful application of the principle of perfect civil equality to the people of all races and of all creeds, and to men of no creeds.” Douglass advocated a “composite nation.” Historian Jill Lepore calls the concept “a strikingly original and generative idea, about a citizenry made better, and stronger, not in spite of its many elements, but because of them.”

 

Douglass recalled the U. S. mistreatment of both Native Americans and African Americans, but he also praised the contributions of the latter, as well as those of various immigrant nationalities like the Irish and the Germans. In his day, there were great fears regarding Chinese immigrants, but he thought that they, too, could enrich our nation. “Do you ask, if I favor such immigration, I answer I would. Would you have them naturalized, and have them invested with all the rights of American citizenship? I would. Would you allow them to vote? I would. Would you allow them to hold office? I would.”

 

Although Douglass’s Christian beliefs were “vital to understanding who he was, how he thought, and what he did,” he also believed that “we should welcome all men of every shade of religious opinion, as among the best means of checking the arrogance and intolerance which are the almost inevitable concomitants of general conformity. Religious liberty always flourishes best amid the clash and competition of rival religious creeds.”

 

In Douglass’s day, as in 2019, there were those who thought like Sen. Garrett Davis of Kentucky (“I want no negro government; I want no Mongolian [Chinese] government; I want the government of the white man which our fathers incorporated”) and Douglass recognized such fears and objections. “Is there not such a law or principle as that of self-preservation? . . . Should not a superior race protect itself from contact with inferior ones? Are not the white people the owners of this continent? Have they not the right to say, what kind of people shall be allowed to come here and settle? Is there not such a thing as being more generous than wise? In the effort to promote civilization may we not corrupt and destroy what we have?”

 

But Douglass believed that the talents various immigrants would bring to the USA, the example of nationalities and creeds living harmoniously together that we would set for the rest of the world, and the honoring of “essential human rights” overcame all of these objections.

 

Following Douglass’s speech, from the 1870s until World War I, millions of immigrants entered the United States. As darker skinned peoples from southern and eastern Europe began arriving in greater numbers, nativist sentiments against them increased. In the 1920s, such attitudes were strong in the Ku Klux Klan, which flourished in that decade.

 

In 1924, Congress passed the Immigration Act, containing an Asian Exclusion Act, almost banning Asian immigrants, and a National Origins Act, limiting European immigrants but favoring those from northern Europe. As Jill Lepore has written, the purpose of the new law “was to end immigration from Asia and to curb the admission of southern and eastern Europeans, deemed less worthy than immigrants from other parts of Europe.” Reflecting such sentiments was Indiana Republican Fred S. Purnell, who said, “There is little or no similarity between the clear-thinking, self-governing stocks that sired the American people and this stream of irresponsible and broken wreckage that is pouring into the lifeblood of America the social and political diseases of the Old World.” As Robert Dallek tells us, in the early 1930s, there was still strong prejudice against southern and eastern European immigrants: “The belief that these groups could never be turned into citizens who fully accepted Anglo-Saxon economic and political traditions” was widespread.

 

From the beginning of FDR’s presidency to that of Trump, immigration law and attitudes toward immigrants have varied, but the contrast between nativist views fearful of immigrants and Douglass’s favoring of a “composite nation” represents two ends of a wide spectrum of views on American immigration—and diversification. One end sees it as a threat, the other as a blessing.

 

Unlike Trump and many of his followers, the House of Representatives’ statement condemning the president’s remarks takes the positive view akin to that of Douglass. It quotes President Kennedy, “whose family came to the United States from Ireland.” His 1958 book, A Nation of Immigrants, states that “the contribution of immigrants can be seen in every aspect of our national life. We see it in religion, in politics, in business, in the arts, in education, even in athletics and entertainment. There is no part of our nation that has not been touched by our immigrant background. Everywhere immigrants have enriched and strengthened the fabric of American life.’’ The congressional statement also quotes President Reagan who thought that immigrants were ‘‘one of the most important sources of America’s greatness.”

 

President Trump’s 1960s-style “America-Love-It-or-Leave-It” attitude is also contrary to that of Douglass and many of our finest Americans. From Douglass’s 1852 speech “What to the Slave Is the Fourth of July?” to his words of 1892 that America still had “cause for repentance as well as complaisance, and for shame as well as for glory,” we see the abolitionist both praising and criticizing his country. His spirit was continued by such Americans as Martin Luther King, Jr. and James Baldwin, who in the 1950s wrote, “I love America more than any other country in this world, and exactly for this reason, I insist on the right to criticize her perpetually.” Just recently Sayu Bhojwani, New York's first commissioner of immigrant affairs and born in India herself, quoted Baldwin’s words in her opposition to Trump’s views, and added that sometimes “to create a more inclusive and transparent democracy, our paths demand disobedience and disruption.”

 

Taking the criticism of Trump and some of his followers one step further, they have stated or suggested that the four congresswomen he has criticized are socialists or communists. Referring to them, Trump himself tweeted, "We will never be a Socialist or Communist Country. IF YOU ARE NOT HAPPY HERE, YOU CAN LEAVE! It is your choice, and your choice alone. This is about love for America. Certain people HATE our Country.” Trump apologist Sen. Lindsey Graham, R-S.C., called the four congresswomen “a bunch of communists."

 

“Un-American,” “communists”: such over-the-top, dangerous, and polarizing rhetoric takes us back to the days of the House Un-American Activities Committee and Senator Joseph McCarthy’s wild charges and investigations of the early 1950s. Trump’s onetime closeness to and fondness for McCarthy’s chief counsel, Roy Cohn, is not coincidental.

 

This essay is not suggesting that the present immigration question is simple or undeserving of thoughtful consideration, but the Trumpian fearmongering now going on is not the way to approach it. In 2019 and 2020, we Americans have a choice. Which kind of America do we want? A “composite nation,” as Douglass advocated, one that takes pride in its ethnic and religious diversity and sets an example for other countries that are fearful of immigrants? Or an America afraid of immigrants, especially those who are not lily-white Christians, as Trump and many of his followers desire? Because of this choice and so many others, the 2020 U.S. elections are crucial, and all real patriots should hope, and do all they can to ensure, that we make the right choices.

Sat, 17 Aug 2019 18:13:04 +0000 https://historynewsnetwork.org/article/172582 https://historynewsnetwork.org/article/172582 0
Jane Addams and Lillian Wald: Imagining Social Justice from the Outside

Jane Addams (left) and Lilian Wald (right)

 

Anyone who has taken a United States history course in high school knows the story of Jane Addams and Chicago’s Hull House, the first Settlement House in America and arguably the genesis of social work in the country. More advanced textbooks may even have discussed Lillian Wald, founder of New York’s Henry Street Settlement House, who was instrumental in introducing the concept of “public health” – and the important epidemiological axiom that physical well being is inseparable from economic and living conditions. 

 

What no one learned in high school, or later, was that Addams and Wald were women who loved other women and that these relationships – as well as the female friendship networks in which they were involved – were profoundly instrumental to their vision of social justice that changed America. 

 

Since its founding – even amid deep-seated prejudices and politics of exclusion and animus – there has been an American impulse to help the less advantaged. This was the kinder aspect of Winthrop’s 1630 sermon “The City on the Hill” (also known as “A Model of Christian Charity”), and the sentiment was evident in George H. W. Bush’s 1988 “A Thousand Points of Light” speech. Helping fellow countrymen – at least those deemed worthy of help – was a social and political virtue.

 

What Jane Addams and Lillian Wald did was different. They imagined an America in which helping the poor was not charity but a work of democracy and a demonstration of equality. Addams and Wald, and many other women like them, were complicated products of the traditional American impulse for charity and the massive reforms of the progressive era. What made them distinct was that being single women, and lovers of women, gave them an outsider status that allowed them to envision different ways of structuring society.

 

Jane Addams, born in 1860, grew up in what looked like a nineteenth-century picture-book American home in Cedarville, Illinois, with servants and farmhands. Her family was prosperous and owned factories that produced wool and flour. Her father, a friend of Abraham Lincoln, was an abolitionist and progressive and raised his children likewise. While attending Rockford Female Seminary in Rockford, Illinois, Addams met Ellen Gates Starr and the two became a couple, exchanging constant letters while they were apart. In 1885 Starr wrote to her:

 

My Dear, It has occurred to me that it might just be possible that you would spend a night with me if you should be going east at the right time. If you decide to go the week before Christmas - I mean - what do I mean? I think it is this. Couldn't you decide to spend the Sunday before Christmas with me? Get here on Saturday and go on Monday? . . . Please forgive me for writing three letters in a week

 

In 1887, after hearing about Toynbee Hall in London’s impoverished East End, Addams became intrigued with the new concept of a settlement house: group living in poor neighborhoods that brought local women, men, and children together with teachers, artists, and counselors from various backgrounds. Today, we might call the concept “intentional living groups.” These collectives – often funded by wealthy people – offered education, health care, arts training, day care, meals, and emotional support for the economically disadvantaged. Addams and Starr visited Toynbee Hall and decided to open something similar in Chicago. In 1889 they opened Hull House with the charter “to provide a center for the higher civic and social life; to institute and maintain educational and philanthropic enterprises, and to investigate and improve the conditions in the industrial districts of Chicago.” Later, after Addams and Starr separated, her new lover Mary Rozet Smith joined her in this grand social experiment.

 

 

 

 

Lillian Wald had a similar story. Born into a comfortable, middle-class Jewish family in Cincinnati, Ohio, in 1867, she was raised in Rochester, New York. Although she was a brilliant student, she was turned down by Vassar College because, at 16, she was considered too young; she later went to nursing school instead. Inspired by Jane Addams and Hull House, upon graduating Wald and her close friend Mary Brewster moved into a tenement in the immigrant communities of New York’s Lower East Side and began their nursing careers. They believed that nursing involved more than physical care. It was important for them, and other nurses, to live in the neighborhoods of the people for whom they cared and to address the social and economic problems as much as the physical ills. Wald coined the term “public health nurse” to convey the broad swath of this goal. Soon, Wald and Brewster moved into a home on Henry Street that eventually became the Henry Street Settlement. This became a model of community-based health initiatives, and the Visiting Nurse Service eventually grew out of this work. By 1910, there were 12 branches of the Henry Street Settlement throughout the city, 54 nurses, and 15,492 patients.

 

Wald and Brewster received emotional and financial support from many women, and some men. But much of the core of the Henry Street Settlement was formed around a close network of single women who, among themselves, had a complex series of personal friendships and romantic relationships. The Manhattan socialite Mabel Hyde Kittredge, daughter of a prominent New York minister, for example, worked at the Henry Street Settlement for many years and was an intimate friend of Wald. In the early years of their friendship she wrote to Wald:

 

I seemed to hold you in my arms and whisper all of this. . . . If you want me to stay all night tomorrow night just say so when you see me. . . . Then I can hear you say "I love you"-and again and again I can see in your eyes the strength, and the power and the truth that I love. 

 

Wald had a vast network of women friends – lovingly referred to as her “steadies” – and at the end of her life she said “I am a very happy women... because I’ve had so many people to love, and so many who have loved me.” 

 

What does it matter that Addams and Wald were women who loved women? Addams had two major loves in her life, with whom she shared work, a vision and a bed. Wald’s relationships were less dedicated, but no less intense. Would they have been able to do this important work if they had been heterosexual, married and probably mothers? Certainly there were many married women – from Julia Ward Howe in the mid-nineteenth century to Eleanor Roosevelt in the mid-twentieth century – who took part in public life, public service, and social reform. What set Addams and Wald and their friendship circles apart was that they were outsiders to social conventions.

 

In a world dominated by heterosexual expectations, being a single woman culturally set you apart in ways that were dismissive – words and phrases such as “spinster” and “old maid” – but also liberating: you were not burdened with the duties of marriage and motherhood. Addams and Wald were also fortunate to come from wealthy families, which gave them the ability to dictate their own life choices. With limited opportunities for gainful employment, many women understood that marriage was their best path to economic security. As women unattached to male spouses, Addams and Wald were able to break from the traditional methods of female giving, such as the ideology of motherly love or the distanced, munificent “lady bountiful.”

 

Yet there is something else here as well. Unburdened by the expectations of heterosexual marriage these women imagined and explored new ways of organizing the world. They created new social and housing structures – extended non-biological families – that were more efficient and more capable of taking care of a wealth of human social, physical and emotional needs. In large part they were able to do this because they did not rely on the traditional model of heterosexual marriage and home as the building block of society. Instead, they rejected this model. 

 

Historian Blanche Wiesen Cook has written extensively on how these female friendship circles – precisely because they were homosocial, and in many cases homosexual – were able to transform American social and political life with a new vision of how to organize society and how to care for family in the largest sense of the word. Such a vision is not only profoundly American, it is the essence of social justice.

Sat, 17 Aug 2019 18:13:04 +0000 https://historynewsnetwork.org/article/172589 https://historynewsnetwork.org/article/172589 0
The History of the American System of Interrogation

 

Why do people confess to serious crimes they did not commit? Such an act appears totally against human nature. And yet, we have the case of the Central Park Five, in which five teenage boys didn’t just say “I did it” but gave detailed, overlapping confessions. Based upon those confessions, and with no supporting physical evidence, the boys were sentenced to prison. After they had served nearly a dozen years, someone else confessed to the rape, and that confession was backed up by DNA evidence. How can this happen?

 

The more important question is: were the Central Park Five confessions a freak accident? The answer is NO. Every day, adults and juveniles falsely confess to serious crimes they did not commit.

 

After 40 years of practicing law and having handled a case similar to the Central Park Five, I can tell you that it is not the people who make these false confessions that society should look to for an explanation, but rather the system itself. In particular, how the authorities question potential suspects.

 

At the core of American criminal justice is an accusatorial system that assumes a suspect is guilty. This accusatory model runs through all levels of law enforcement and naturally leads to an accusatory method of interrogation, where the suspect is presumed guilty by their questioner.

 

At first, physical torture was used to extract confessions, verifying the interrogator’s theory of guilt. Then in 1936, the United States Supreme Court ruled, in the case of Brown v. Mississippi, that confessions obtained through violence, such as beatings and hangings, could not be entered as evidence at trial. The court recognized that any human can be coerced to say anything, and as such, confessions obtained by torture were unreliable.

 

As a consequence, the authorities went to a softer and less obvious method of coercion: the “third degree.” The third degree left less-observable physical marks of torture; the police shoved the suspect’s head into a toilet, twisted arms, or struck the accused in places that would not leave an obvious mark. Interrogations were conducted nonstop for days, with sleep deprivation, bright lights, verbal abuse, and threats to the suspect and suspect’s family all commonplace.

 

In the early 1960s, John E. Reid, a polygraph expert and former police officer, and Fred E. Inbau, a lawyer and criminologist, devised an extensive method of psychological interrogation called the Reid Technique of Interrogation. This model is based on psychological manipulations and the ability of the questioner to tell when the suspect is lying, and it is used today by practically all police departments in the United States. The Reid Technique follows the American tradition of accusatory criminal investigation. Instead of torture, however, the Reid Technique utilizes isolation, confrontation, the minimization of culpability and consequences, and the officer’s use of lies about evidence that supposedly proves the suspect is guilty.

 

This accusatory method establishes control over the person being investigated by leaving the suspect alone in a small, windowless, claustrophobic room prior to interrogation; has the interrogator ask accusatory, closed-ended questions that reflect the police theory of what happened; and has the officers evaluate body language and speech in order to determine if the suspect is lying. The goal of this psychological interrogation is to overwhelm the person being questioned and to maximize the suspect’s perception of their guilt. When necessary, a softer approach by the investigator allows the suspect to perceive their conduct in a socially more acceptable light and thereby minimizes both the perception of the suspect’s guilt and the likely legal consequences if they confess.  

 

By the time the Supreme Court decided Miranda v. Arizona, psychological interrogations had supplanted physical coercion. But with no obvious marks of torture, the Supreme Court now had difficulty distinguishing voluntary from involuntary confessions. The Court noted:

[T]he modern practice of in-custody interrogation is psychological rather than physically oriented.

 

As we have stated before, this Court has recognized that coercion can be mental as well as physical, and that blood of the accused is not the only hallmark of an unconstitutional inquisition.

 

The justices went on to emphasize the “inherent coercion of custodial interrogation [when considering] the interaction of physical isolation and psychological manipulation,” and concluded that new safeguards were necessary in order to ensure non-coerced confessions. Thus, the Court required the now-famous Miranda rights warning that law-enforcement agencies read to suspects.

 

These safeguards are as follows:

  • You have the right to remain silent.
  • Anything you say will be used against you in court.
  • You have the right to an attorney.
  • If you cannot afford an attorney, one will be provided to you.

However, these Miranda safeguards do not prevent psychologically induced false confessions.

     

Scholars say the flaw in psychological interrogations like the Reid Technique is the assumption that the investigator can detect when the suspect is lying. Studies challenge the ability of anyone, even a trained investigator, to tell when a person is lying. This is particularly true when dealing with the young, the poorly educated, or the mentally ill, especially when the suspect is under the psychological stress of isolation, accusation, and the presentation of false evidence.

     

As a criminal defense attorney, I have had firsthand experience observing the results of the Reid Technique when used against juvenile suspects. The Crowe murder case is a prime example of how psychological interrogation goes wrong when the police act on their presumption that a suspect is guilty. In the Crowe case, the police had no evidence as to who killed 12-year-old Stephanie Crowe. But her 14-year-old brother, Michael, didn’t seem to be grieving appropriately. By the time the police were done interrogating Michael, he had confessed to the murder, and two other high school friends had given statements tying them to the murder. All three were charged as adults for the murder. I represented one of the boys. At the defense’s insistence, a mentally ill vagrant’s clothing was tested. DNA tests found Stephanie’s blood on the vagrant’s clothing. The boys were released and exonerated of all guilt.

     

Such false confessions need not happen. There is a new method of interrogation created by the High-Value Detainee Interrogation Group (HIG) that is quietly being used by a few law enforcement agencies. HIG was established in 2009 as a reaction to the physical and psychological torture at Guantanamo Bay, Abu Ghraib, and other overseas CIA facilities during the post-9/11 Bush years. The HIG technique represents a joint effort by the FBI, the CIA, and the Pentagon to conduct non-coercive interrogations.

     

How HIG works and the tactics used by interrogators are closely held government secrets. But this we do know: The United States government through HIG has funded over sixty studies in the psychological and behavioral sciences worldwide, with particular emphasis on studies of the law enforcement models in England and Canada. These two countries have abandoned the Reid Technique of psychologically accusatory interrogations for a “cognitive interview” model where the suspect is asked what they know about the crime. This method presumes the suspect is innocent and allows the suspect to tell their story without interruption or accusation. The investigator may ask the suspect about contradictions or inconsistencies between the suspect’s narration and the known evidence. But the interrogator may not lie about the evidence or deceive the suspect.

     

Could we be seeing the end of the American method of accusatory interrogation and the beginning of a new and more effective form of interrogation? One thing is sure: a 2014 HIG study found the cognitive-interview method more effective than the American accusatory approach at producing true confessions rather than false ones.

Sat, 17 Aug 2019 18:13:04 +0000 https://historynewsnetwork.org/article/172588 https://historynewsnetwork.org/article/172588 0
    "If ever I felt that I ought to be five priests it was that week:" Chaplains in World War 2

     

     

    An exasperated Father Henry Heintskill, C.S.C., a Notre Dame chaplain posted to naval duty in the Pacific, faced the same issue with which almost every military chaplain grappled during World War II—how to perform the multiple tasks that normally required the services of two or three chaplains. “There are all sorts of problems the men have,” Father Heintskill wrote in a letter to his superior, “they’re worried about conditions at home, etc.  We have to do what we can.”  He explained that after one Friday evening service, at least two hundred men gathered for Confession, requiring him to remain an hour after lights out at 9:30 p.m.  “If ever I felt that I ought to be five priests it was that week.”

     

    Chaplains celebrated Mass and helped the men complete government forms. Some soldiers, not long out of high school, wondered what combat would be like.  Others asked about the morality of taking another man’s life. 

     

    One duty, however, was paramount—to be at the side of a soldier or sailor as the young man died. It was then that the chaplain could administer Last Rites, with their promise of dying with a clear conscience. 

     

    That was evident from the first day of the war, when ninety Japanese aircraft struck Clark Field in the Philippines shortly after noon. Father John Duffy, a Notre Dame diocesan priest from Ohio, eluded bullets and bombs as he ran to the field, littered with dead and wounded, to hear confessions and administer Last Rites. To avoid wasting precious moments by inquiring about each soldier’s faith, he gave Last Rites to any dying serviceman he came across. “I knew it would be effective for the members of my faith & that it would do the others no harm,” he explained later. “There wasn’t sufficient time for inquiry about religious tenets of the wounded.” 

     

    Four months later, Father Duffy lay at the receiving end of the sacrament. After enduring severe abuse at the hands of cruel Japanese guards on what became known as the infamous Bataan Death March—“Extreme Unction, Baptism, Confessions administered daily on march,” wrote Father Duffy. “Death, pestilence, hunger, exhaustion, depleted all.”—the priest lay on the ground, apparently dying from bayonet slashes to his body. A Protestant chaplain knelt beside his friend, held Duffy’s head in his hands, and prayed, “Lord, have mercy on your servant. He’s a good man who served you well. Receive his soul.”  Within moments another Catholic chaplain came upon the scene and, also thinking the priest was dying, anointed Father Duffy.

     

    The importance of Last Rites extended even to the enemy. Two and one half years later, during the bitter combat in Normandy following the June 6, 1944 invasion, Father Francis L. Sampson spotted a German soldier lying in a creek a few feet away.  He crawled over to do what he could for the enemy soldier, but as Sampson lifted him into his arms, the German groaned a few times and died.  Because he saw a higher duty, the Catholic chaplain from Notre Dame, wearing an American uniform, gave absolution to a German soldier dying in a French creek.

     

    Father Joseph D. Barry, C.S.C., recognized that one of the most paralyzing fears for a wounded or dying soldier was to lie alone on the battlefield, with no one beside him. During his more than 500 days in European combat with the 45th Infantry Division, Barry exerted every effort to reach a boy prone on the ground and bring him the peace of knowing that someone was there with him. “After 54 years, I can still see Father Barry administering last rites to soldiers in the field while enemy shells exploded all around him,” wrote Albert R. Panebianco, a soldier in the 45th Infantry Division.  

     

    On one occasion Barry talked with a soldier who, due to go into battle in a few hours, feared that “this might be my last night.” The soldier confided that he accepted fear as part of his task, but wondered whether he could control his panic and still perform when it counted.  

     

    Barry inquired if there was anything the priest could do for him. Above all, the boy told Barry, he had wanted to be a good soldier—for his men, his family, his country, and his God—and if he died, would Barry please tell his family that he had fulfilled that wish.  During combat later that night, German fire cut down the youth. Father Barry rushed to him, cradled the mortally wounded boy in his arms, and with explosions and combat nearly drowning out his words, shouted into the dying boy’s ear, “Remember how we talked last night.  Here it is.  And I can say you were a good soldier.”

     

    Father Barry consoled more than just the soldiers he tended. He also penned letters to parents and loved ones, often at the behest of a dying soldier who asked the priest to inform his mother or wife that he loved her. Above all, he made certain that they knew their son had died with a priest at his side. “I wrote to so many. You could write what they wanted to know more than anything else, ‘I wonder if there was a priest with my boy,’” Barry explained in an interview. “And that is the only reason I wrote,” he said. 

     

    At Okinawa, Father John J. Burke, C.S.C., knew the difficulty of fashioning letters to grieving loved ones. After a Japanese torpedo struck the aft portion of his battleship, USS Pennsylvania, on August 12, 1945, killing twenty men, Father Burke mailed twenty letters in which he relayed, with dignity and compassion, information about the loss of a son, brother, or husband. Rather than send an identical form letter to each family, he crafted similar opening and closing paragraphs but inserted personal information unique to each individual in the main portion. “God bless you in your present sorrow,” Father Burke began each letter. “As the Catholic Chaplain aboard the U.S.S. Pennsylvania I want to assure you that your son [here he inserted their first name] received Catholic Burial. The Holy Sacrifice of the Mass was offered several times for the repose of his soul.”

     

    He then added personal information about each sailor. To Mrs. Angeline Ortbals of Ferndale, Michigan, whose son, nineteen-year-old Seaman 1/c Robert J. Ortbals, died, he wrote that Robert “had a heart of gold” and went out of his way to help his shipmates. To the parents of a sailor named Roemer, he wrote, “I feel that a boy so young must very soon, if not already, be enjoying the eternal happiness of heaven which is beyond human description and to which, in God’s mercy we all look forward.” 

     

    Father Burke closed all letters by explaining that their son had recently attended Mass and received Communion, and that as far as he knew, had led a religious life.  “It is impossible for me to express anything that will lessen the sorrow which you must endure.  You have returned to God your beloved son on your Country’s Altar of Sacrifice.  In this supreme sacrifice your son is most like our Divine Savior; and you, I trust, most like his Blessed Mother.  God bless you with the humble and Christian spirit of resignation to His Divine Will.”

     

    Though their duties were many and demanding, the chaplains cherished the knowledge that they had comforted dying young men, and subsequently their families, in those final moments. As Father Duffy related, “I did what I could for each regardless of his faith, and a look of ineffable peace came to the face of many a tortured soul in that last bitter hour on earth.”

    From the Founding Fathers to Trump: Who Can Be an American?

    “Why don’t they go back and help fix the totally broken and crime-infested places from which they came,” the president of the United States recently tweeted. Trump was referring to four Congresswomen of color, three of whom were born in the United States. The other is a naturalized American citizen. Trump continued his criticism of “the Squad,” in particular Congresswoman Ilhan Omar, at a campaign rally, and the crowd responded by chanting “send her back.” As David Leonhardt of the New York Times wrote, “It was an ugly, lawless, racist sentiment, and President Trump loved it.” Trump later denied that he supported the crowd’s chant. It’s hard to imagine him telling a white man, even someone he disagrees with like Senator Bernie Sanders, to go back to where he came from. Correctly condemned as racist, the tweet also raises an age-old question in our history: who can be an American?

     

    There have always been two views of what defines American identity. One is tied to a traditional racial or ethnic view, ethno-nationalism for short, the other is that America is an idea. Gunnar Myrdal of Sweden dubbed the second one the American Creed: Americans were bound together by “the ideals of the essential dignity and equality of all human beings, of inalienable rights to freedom, justice and opportunity.” Myrdal was referring to Jefferson’s natural rights section of the Declaration of Independence when he made this observation.

     

    Today, the United States is religiously, culturally, and ethnically diverse. Yet we see ourselves as Americans in large part due to this creedal notion of America. In 2018, two scholars at Grinnell College “polled Americans on what they most associate with being a real American.” They found that a “vast majority of respondents identified a set of values as more essential than any particular identity.” As the historian Mark Byrnes wrote for HNN back in 2016, “The United States is fundamentally an idea, one whose basic tenets were argued in the Declaration of Independence and given practical application in the Constitution.” These ideas revolve around liberty, equality, self-government, and equal justice for all, and have universal appeal. “Since America was an idea, one could become an American by learning and devoting oneself to” those universal ideas, Byrnes observes.

     

    Despite the strong appeal of the American Creed, 25 percent of those polled by Grinnell College held nativist views similar to those espoused by Donald Trump during his 2016 presidential campaign, and as further reflected in his comments after Charlottesville and in his recent tweet. The view that ethnicity and race made the United States one people predominated in the early American Republic. John Jay, in Federalist No. 2, appealed to ethno-nationalism in arguing, during the debate over ratification of the Constitution, that the United States was one nation. He wrote that we are “one united people—a people descended from the same ancestors, speaking the same language, professing the same religion, attached to the same principles of government, very similar in their manners and customs and who, by their joint counsels, arms, and efforts…established their general liberty and independence.” Jay’s thesis largely reflected the traditional view of nationhood. A majority of Americans were of English descent in 1788, and they viewed America as a nation for white people, with Caucasians, and specifically Anglo-Saxons, as the superior race. Some scholars have even defended these ideas: the late Samuel P. Huntington, a Harvard political scientist, argued that this Anglo-Saxon heritage ultimately contributed to the American Creed.

     

    While Jay’s ethno-nationalist perspective obviously cannot describe the United States today, it was inaccurate even in 1790, when we were already a diverse people. African Americans, most of them enslaved, were 20 percent of the total population in 1790. In Pennsylvania, thirty-three percent of the people were of German ancestry, and both New York and New Jersey had large numbers of German and Dutch peoples. There were also conflicts between the English and these other groups, including the Irish, the Scottish, and the Welsh, who were themselves from the British Isles.

     

    While ethno-nationalism has deep roots in the United States, so too does Jefferson’s American Creed. Jay himself noted that the United States was “attached to the same principles of government,” a reference to the support for the establishment of a government grounded in the consent of the governed. To Thomas Paine, the country was drawn from “people from different nations, speaking different languages” who were melded together “by the simple operation of constructing governments on the principles of society and the Rights of Man” in which “every difficulty retires and all the parts are brought into cordial union.” Washington saw America as a place that was “open to receive not only the Opulent and respectable Stranger, but the oppressed and persecuted of all Nations and Religions.” Hector St. John de Crevecoeur was another adherent of the view that America was an idea. Originally from France, he emigrated to New York during the colonial period. Crevecoeur talked about the amazing “mixture of English, Scotch, Irish, French, Dutch, Germans, and Swedes” who were “a strange mixture of blood.” He referred to people who came to the United States as Americans, a place where “individuals of all nations are melted into a new race of men.”

     

    Still, rapid increases in immigration have always threatened this notion of American identity. The founding fathers drew distinctions even among the peoples who inhabited the original thirteen colonies, largely drawn from northern Europe, though today we see few differences between people of European descent. Benjamin Franklin complained about the “Palatine Boors” who swarmed “into our Settlements…herding together” and creating a “Colony of Aliens.” Thomas Jefferson doubted that he shared the same blood as the “Scotch” and worried about immigrants from the wrong parts of Europe coming to the United States, as Francis Fukuyama notes in his recent book Identity. During periods of rapid immigration, nativist movements tend to emerge. 

     

    For people of color, America has rarely been a welcoming place. Many Black Americans were brought here as slaves, and Native Americans were overrun as the insatiable desire for land led to ever greater westward expansion. Our history must always take into account “the shameful fact: historically the United States has been a racist nation,” as the historian Arthur Schlesinger framed it in his book The Disuniting of America. “The curse of racism has been the great failure of the American experiment, the glaring contradiction of American ideals.”  

     

    So much of American history can be seen as an attempt by previously excluded groups to be granted their share of the rights for which the American Creed calls. By the 1850s, abolitionists had been agitating for an end to slavery and the extension of rights to black people. Lincoln eventually became committed to a creedal view of America that extended the rights enshrined in the Declaration of Independence to all people, black and white, native born and immigrant. Martin Luther King Jr., in his “I Have a Dream” speech, delivered in front of the Lincoln Memorial in 1963, reminded the nation that it had fallen short of its founding ideals, that the Declaration of Independence was a “promissory note to which every American was to fall heir…yes black men as well as white men.”  

     

    In the early 21st century, in the aftermath of the election of our first African American president, many of us hoped that our great nation had grown beyond the ethno-nationalist version of America, but the election of Donald Trump proved that this is not the case.  As Americans, our challenge, in the words of Francis Fukuyama, is to avoid “the narrow, ethnically based, intolerant, aggressive, and deeply illiberal form that national identity took” in our past, and which Trump is trying to reignite. Instead, we need “to define an inclusive national identity that fits [our] society’s diverse reality.” The challenge of our times is to continue our commitment to a creedal vision of America. We need to make a reality of the opening words of our Constitution, that “We the People” means all people who share the American creed, regardless of race, ethnicity, or religion, and to constantly strive to unleash, in Lincoln’s words, “the better angels of our nature.” We can start by denouncing Trump’s continuing appeal to racism. 

    Can Donald Trump Be Compared to Caligula, the Mad Emperor of Rome?

     

     

    Even before Donald Trump was elected president of the United States he was being compared to Caligula, third emperor of Rome. Following Mr. Trump’s election, comparisons flowed thick and fast. But is it fair to compare the unpredictable, ultimately chaotic reign and questionable mental state of Caligula with the administration and personality of the forty-fifth president of the United States? Do comparisons stand up to scrutiny?

     

    Well, both men ruled/rule the largest military and economic powers of their age. Caligula emptied the treasury with his extravagances. Trump presides over a ballooning U.S. national debt. Neither man had served in the military they ended up commanding.

     

    Both had few friends growing up. Both had multiple wives. Both men had successful, wealthy fathers. The parents of both Caligula and Trump died before their sons rose to the highest office in the land.

     

    Both men rid themselves of senior advisers who restrained them. Both were/are sports lovers, building their own sporting facilities in furtherance of their passions. In Caligula’s case it was chariot racing and hippodromes. For Trump, it’s been golf and golf courses.

     

    Then there are the obvious differences. Caligula was twenty-four years old when he came to power. Trump was seventy on taking the top job. Caligula had absolute power with no specified end date. Unless the system is changed, Trump can expect a maximum of eight years in power. Trump has made numerous outrageous claims. Caligula made just one—that he and his sister Drusilla were gods.

     

    Caligula was well-read and an accomplished public speaker with a lively if barbed wit. Trump’s wit can be similarly stinging. But he comes across as an inarticulate man, exhibiting an obvious discomfort with formal speeches and producing a nervous sniff when out of his comfort zone.

     

    It’s instructive to look at the handshakes of both. The handshake as a form of greeting went well back before the foundation of Rome. Originally, it demonstrated that neither party held a sword in the right hand. If a Roman respected the other party, he would “yield the upper hand” in a handshake, offering his hand palm up.

     

    Caligula yielded the upper hand to few men other than his best friend Herod Agrippa, grandson of King Herod the Great. Donald Trump sometimes yields the upper hand. But is it through respect, or diffidence?

     

    At his first public meeting as president with Russia’s president Vladimir Putin at the 2017 G20 summit in Germany, Trump offered his hand first, palm up, yielding the upper hand to Putin. He did the same when meeting France’s president Emmanuel Macron that same year. In contrast, Trump offered female leaders Germany’s chancellor Angela Merkel and Britain’s prime minister Theresa May a straight up and down handshake.

     

    Through late 2018, Trump was photographed yielding the upper hand to Japan’s Prime Minister Shinzo Abe and Australian Prime Minister Scott Morrison. In October, he did the same with America’s then ambassador to the United Nations, Nikki Haley, in the Oval Office.

     

    In terms of policy, Trump and Caligula are poles apart. Some of Caligula’s public infrastructure policies were ambitiously innovative and progressive, if expensive. While Trump has always painted himself as entrepreneurial, his policies have been regressive – a blanket program of retreat. Retreat from the Paris Climate Accord. Retreat from free trade. Retreat from government regulatory control of the economy and the environment. Retreat from military boots on the ground in Syria and Afghanistan.

     

    As US Secretary of State Mike Pompeo said in January, “When America retreats, chaos often follows.” From the available evidence, it seems Caligula did suffer from a mental illness. Trump’s mental stability is, in the words of an ancient Roman saying, still before the judge. 

     

    In the end, it wasn’t external foes who caused Caligula’s downfall. Caligula was brought down by a dread among his inner circle of being next as he eliminated many around him. Loyalty and friendship were no guarantee of survival. Similarly, it’s been said that President Trump turns on a dime when it comes to friends. In the case of Caligula’s friends, self-preservation eventually made the most loyal the most lethal.

     

    When Caligula’s reign was terminated at the point of swords wielded by assassins in his own guard, it had lasted around four years, the equivalent of a U.S. presidential term. Perhaps it will take that long for the proverbial knives to come out among the Republican old guard in Washington today. As was the case in AD 41, it will probably not be a pretty sight.

    The Beginning of the HIV/AIDS Epidemic – and One Doctor's Search for a Cure

     

    The following is an excerpt from The Impatient Dr. Lange by Dr. Seema Yasmin. 

     

    Before the living dead roamed the hospital, the sharp angles of their bones poking through paper-thin bed sheets and diaphanous nightgowns, there was one patient, a harbinger of what would consume the rest of Dr. Joep’s life. Noah walked into the hospital on the last Sunday in November of 1981. It was Joep’s sixth month as a doctor and a quiet day in the emergency room at the Wilhelmina hospital, a red brick building surrounded by gardens in the center of Amsterdam. 

    Noah was forty-two years old, feverish, and pale. His skin dripped a cold sweat. The insides of his cheeks were fuzzy with thick streaks of white fungus. And then there was the diarrhea. Relentless, bloody diarrhea. Noah’s stomach cramped, his sides ached, he couldn’t swallow food. Doctors admitted him to the infectious disease ward, a former army barracks in the ninety-year-old hospital, where they puzzled over the streaky plaques of Candida albicans, a yeasty fungus growing inside his mouth, and the bacteria Shigella breeding inside his gut. 

    Noah swallowed spoonfuls of antifungal medicine. Antibiotics were pushed through his veins until his mouth turned a rosy pink and his bowels quieted. Still, the doctors were baffled by his unlikely conglomeration of symptoms. “The patient needs further evaluation,” they wrote in his medical records. “He has anemia. And if the oral Candida recurs, it would be useful to check his immune function.” They discharged him on Friday, December 11, 1981. 

    Had they read the New England Journal of Medicine on Thursday, December 10, they would have found nineteen Noahs in its pages. 

    +++

    Reports were coming in from Los Angeles and New York City of gay men dying from bizarre infections usually seen in transplant patients and rarely in the elderly. Like Noah, their immune systems had been annihilated and they were plagued with a dozen different bugs—ubiquitous microbes that rarely caused sickness in young men. 

    The week that Noah walked out of Wilhelmina hospital, the New England Journal of Medicine dedicated its entire “original research” section to articles on this strange plague. In one report, scientists from Los Angeles described four gay men who were brewing a fungus, Pneumocystis carinii, inside their lungs, and Candida inside their mouths and rectums. Doctors in New York City puzzled over fifteen men with worn-out immune systems and persistent herpes sores around their anus. 

    By the time the New England Journal of Medicine article was printed, only seven of the nineteen men in its pages were still alive.

    +++

    ...Four days after Noah walked out of the Wilhelmina hospital, a young man walked into the emergency room at the Onze Lieve Vrouwe Gasthuis, or Our Lady Hospital, a ten-minute bike ride away in east Amsterdam. 

    Dr. Peter Reiss was on call that Tuesday night, his long white coat flapping around his knees as he hurried from bed to bed. Peter and Joep had met in medical school, where they realized they were born a day apart. Joep was one day older than Peter, and he liked to remind his friend of this fact. 

    Peter picked up the new patient’s chart and stroked his trim brown beard as he read the intake sheet. His bright blue eyes scanned the notes just as he had read the New England Journal of Medicine the previous Thursday. 

    He walked over to the cubicle and pushed aside the curtain. There was Daniel, a skinny nineteen-year-old with mousey blonde hair, sitting at the edge of the examination table. His skin was pale with bluish half-moons setting beneath his eyes. He was drenched in sweat and hacking a dry cough. 

    Daniel looked barely pubescent, Peter thought, and he checked the chart for his date of birth. There it was, February 1962. Daniel watched as his young doctor slipped blue gloves over his steady hands. Peter had a broad, reassuring smile and a calm manner. 

    “Tell me, how long have you been feeling sick?” he asked gently. Daniel had been ill since November. First, a prickly red rash dotted his chest and arms, then itchy red scabs appeared on his bottom. The diarrhea started soon after and he was running to the toilet every few hours. He had a fever that kept creeping higher. 

    Peter asked if he could palpate his neck and Daniel nodded. Pressing his fingers under the angles of his jaw, the pads of Peter’s fingers found shotty lymph nodes as large as jelly beans. Peering inside Daniel’s mouth, he saw a white blanket of Candida coating his tongue and tonsils. When he stepped back, Daniel coughed and caught his breath and Peter realized that the air had escaped his own lungs, too. 

    A teenaged boy with enlarged glands, oral thrush, and perianal herpes sounded a lot like the journal articles he had read on Thursday. The case reports flashed through his head: young gay men, history of drug use, American cities. 

    “Are you sexually active?” Peter asked softly. Daniel looked away. “Yes,” he whispered. “I had sex with a man for the first time ten weeks ago. He was much older than me, forty-two years old, I think. I heard he’s very sick.” 

    +++

    In the summer of 1981...Amsterdam was a safe haven. Lovers from the provinces could walk down the street and do the unthinkable: hold hands, hug, plant playful kisses on their boyfriend’s faces. Here, there was safety in numbers— freedom in a place where gay men were met with smiles instead of slurs. Those who took vacations in the United States reported back that Amsterdam was a lot like San Francisco with its kink bars and bathhouses, places where gay men could hang out and enjoy anonymous sex. 

    In both cities, the new illness was preying on love and freedom. If colonialism had sparked the spread of HIV from chimpanzees to humans, homophobia was the fuel that helped the epidemic spread from one person to another. The virus was exploiting the need for comfort and community as it swept through bedrooms and bathhouses in the Castro and on Reguliersdwarsstraat. 

    More than twenty bathhouses dotted San Francisco, including the Fairoaks Hotel, a converted apartment building on the corner of Oak and Steiner. Yoga classes ran alongside therapy sessions and group sex. Wooden-framed signs at the front desk advertised poppers and t-shirts at five dollars apiece. 

    Disease detectives from the CDC descended on these refuges to collect samples and stories, a nameless disease with an unknown mode of transmission giving them license to inject themselves into the private lives of strangers. They offered no answers, only long lists of questions: How many men did you have sex with? What kind of sex was it? Can you write down all your lovers’ names? 

    The men offered up memories and saliva samples, fearful of what the government doctors would find inside their specimens. The disease detectives were trying to work the investigation like any other outbreak, following the same steps in their usual logical manner. Except this time, the world was watching and waiting for answers. 

    A handful of diseases have been eliminated from a few pockets of the globe, their numbers dwindling to levels that give humans a sense of dominance over the microbial world. But only one infectious disease has been eradicated: smallpox. 

    The mastermind behind the global erasure of that virus was Dr. Bill Foege, a looming figure who worked tirelessly to eradicate smallpox in the 1970s. In 1977, he was appointed director of the CDC by President Jimmy Carter. 

    But a few years into his tenure, Bill’s scientific acumen was up against political fatuity. Carter lost the election to Ronald Reagan, who was supported by a political-action group called the Moral Majority. “AIDS is the wrath of God upon homosexuals,” said its leader, Reverend Jerry Falwell. Pat Buchanan, Reagan’s communication director, said the illness was “nature’s revenge on gay men.” 

    Reagan said nothing. He uttered the word “AIDS” publicly for the first time in May of 1987 as he neared the end of his presidency. By that time, fifty thousand people were infected around the world and more than twenty thousand Americans had died. 

    To make matters worse, the Reagan administration demanded cuts in public health spending. Bill had to tighten his purse strings just as the biggest epidemic to hit humanity was taking off. 

    Even within the CDC, some leaders were doling out politically motivated advice. “Look pretty and do as little as possible,” said Dr. John Bennett, assistant director of the division of the Center for Infectious Diseases. He was speaking to Dr. Don Francis, a young and outspoken epidemiologist who had returned from investigating the world’s first outbreak of Ebola in Zaire. 

    Bill possessed a stronger will. Armed with political savvy and epidemiologic expertise, he instructed Dr. James Curran to assemble a team. James was head of the research branch of the CDC’s Venereal Disease Control Division. By assigning him a new role, Bill was working the system to give James enough latitude to conduct what would be the most important investigation of their lives. 

    James gathered thirty Epidemic Intelligence Service officers and CDC staff to form a task force. Joined by Dr. Wayne Shandera, the Epidemic Intelligence Service officer assigned to Los Angeles County, the task force for Kaposi’s sarcoma and Opportunistic Infections got to work. 

    The first item on the to-do list in any outbreak investigation—even one as devastating as AIDS—is to come up with a case definition, a short list of criteria that will help other doctors look for cases. The disease detectives huddled around a table in their Atlanta headquarters and listed the major scourges of the new syndrome. 

    A case was defined as a person who had Kaposi’s sarcoma or a proven opportunistic infection such as Pneumocystis carinii pneumonia. They had to be aged younger than sixty, and they couldn’t have any underlying illness such as cancer or be on any medications that would suppress their immune system. 

    They shared the case definition with doctors around the country and by the end of 1981, as Noah and Daniel were walking into hospitals in Amsterdam, the CDC had a list of one hundred and fifty-eight American men and one woman who fit the description. Half of them had Kaposi’s sarcoma, 40 percent had Pneumocystis carinii pneumonia, and one in ten had both. Looking back, the earliest case they could find was a man who fell sick in 1978. 

    They looked for connections between the cases and found themselves writing names on a blackboard and drawing white lines between the people who had sex with one another. A spider’s web of a vast sexual network emerged. Thirteen of the nineteen men who were sick in southern California had had sex with the same man. 

    Task force Drs. David Auerbach and William Darrow cast their net wider, looking at ninety gay men across a dozen cities who fit the case definition. Forty of those men had sex with the same man, who was also sick. Still, some were vehemently opposed to the idea that the syndrome was sexually transmitted. If it was spread through sex, why hadn’t this happened before? But then came the summer of 1982 and reports of babies and hemophiliacs with Pneumocystis carinii. The common link was blood transfusions. This added a new mode of transmission. Like hepatitis B, the new illness was spread through sex and blood. 

    The CDC announced four groups of people were most vulnerable to the new illness, hemophiliacs, homosexuals, heroin users, and Haitians, and the disease earned a new name: 4H. With that public health announcement came public outrage and vitriol against those groups, especially gay men and Haitians. Houses were burned, children expelled from school, families forced to move towns because they were sick. Politicians sat complicit in their silence. 

    It was unparalleled, this confluence of public health, politics, clinical medicine, and public anxiety. The unknown disease was spreading faster than imagined. Humanity had never seen anything like it. 

     

    Climate Change and the Last Great Awakening

     

    Historians from Joseph Tracy in 1842 up to the present day have seen the religious revival movement of the mid-eighteenth century as the first mass movement in American history.  With its roots in the works of Congregationalist minister Jonathan Edwards and the John Wesley-influenced Anglican George Whitefield, it renewed and expanded the Puritan notion of the “second birth” in achieving salvation in a Christian-dominated milieu.  Tracy dubbed this movement the “Great Awakening” and it began what I like to think of as “conscience in American history.”  This movement, as it ebbed and flowed over time, influenced the American Revolution, abolitionism, women’s rights, the labor movement, social welfare, and environmental concerns right up to the present day.  

     

    Another American mass movement was the social revolution of the 1950s into the 1970s that resulted in more equitable civil rights, the end of the Vietnam War, the rejuvenation of the women’s rights movement, and the environmental movement. Today, this mass movement for change is in need of resurgence. We need another Great Awakening to convince the public and political leaders to accept the devastating reality of man-made climate change and embrace efforts to combat it. Even if movements like Extinction Rebellion take off and become massive, with millions of people in the streets, the attempt to curb the catastrophic effects of Climate Disruption may well be the Last Great Awakening.

     

    Fifty years ago, historian Richard Bushman wrote that twentieth-century inhabitants, if they had ever even heard of it, misunderstood the nature of the eighteenth-century Great Awakening.  This was a period of religious “revival” that ran through the middle third or so of the eighteenth century.  The fervor of the original sixteenth- and seventeenth-century Puritans – the ones who had made their way to Massachusetts aboard the Mayflower – sprang from their belief that they were creating a “City on the Hill” to welcome the imminent return of Christ the Messiah, ushering in a thousand-year reign known as the Millennium.  These colonists were Calvinists, meaning that they embraced not only millennialism, but a doctrine known as “pre-destination” – it had already been determined who was going to Heaven and who was going to Hell in the eternal realm of the Father, Son, and Holy Spirit.  But how could one know if one was pre-destined for Heaven or not? Calvinists responded that one could know by having a “second birth” of the spirit – a profound psychological experience that would leave a lasting mark on one’s psyche, making it quite clear that one versed in the doctrine had been “Chosen.” The relief from the experience, or the dread of not having it, could be, and usually was, profound.  Today, we hear of people being “Born Again” in charismatic Christian churches (and elsewhere), but many consider this to be either fake or the ramblings of the mildly insane.  So, when one mentions this business of a “second birth” now, many people simply ignore it and carry on with their lives.  They don’t understand the implications for those having the experience, especially during the eighteenth century’s Great Awakening.  This is what Bushman was saying.  

    Great Awakener Jonathan Edwards’s famous “Sinners in the Hands of an Angry God” is often put forward as an example of the kind of jeremiad that would induce the desired “second birth” of the reprobate (one who had not had the “second birth”). Subtitled “Sermon on the Danger of the Unconverted” and delivered at Enfield, CT in July of 1741, it needs to be remembered as a spoken sermon delivered "enthusiastically."  (Think “hellfire and brimstone”.)  The function of the sermon was to induce a sublime terror that would propel the listener into a cataclysmic psychological transformation.  

     

    This is not an unusual psychological journey for humans on Planet Earth. Visionary experiences are common in the mystical aspects of all religions.  The context and agency can vary.  Bushman implies this in his essay when he compares the eighteenth-century Great Awakening to the ‘60s Civil Rights and anti-war movements.  The Civil Rights movement – delayed justice for African Americans – and opposition to the atrocious crime known as the Vietnam War were “awakenings” that had both a political and a cultural side.  The politics were those of the New Left – Students for a Democratic Society, the Black Panthers, the American Indian Movement, and others.  The cultural side was a bohemian spirit inherited from the "Beats" of the 1950s that became truly massive with the "hippie movement" and featured, among other things, the shared experiences of rock music and psychotropic substances like LSD, psilocybin mushrooms, peyote cactus buttons, etc.  While this directly affected a fairly small percentage of the population overall, historians and other students of this period, including psychologists and other care-givers, are only beginning to understand its impact. And, like the Great Awakening of the eighteenth century, there was a ripple effect that spread throughout the culture at large.  

     

    Like the first Great Awakening, the fragmented twentieth-century movements mentioned above made an impact that is strongly felt today, although many people born since then don’t realize it.  The idea of self-realization – becoming the person you were meant to be (or, in Christian terms, the person God intended you to be), getting society back on a track of justice and equity and freedom, having a sense of mission for bringing positive change to the world – these are the results of such profound psychological experiences not unlike the Great Awakening. An entire generation awoke to both what we were doing to the planet and that we were essentially poisoning ourselves by not paying attention to what we were putting in our mouths.  In the eighteenth-century Awakening, Bushman estimated that up to twenty or thirty percent of a town could be converted to the “New Lights” (those who had experienced the “second birth”) in one pass by the iconic Anglican preacher George Whitefield.  The counterculture and its politics, while suppressed, have maintained significant numbers of adherents.  It is possible to see both of these “Awakenings” as “seasons of revival, outpourings of the Holy Spirit, and converted sinners experiencing God’s love personally.”  In the eighteenth century, those who experienced the “second birth” often “saw the light” of religious and political freedom, compassion for their fellow humans, and a strong sense of staying attuned to their inner life.  In the twentieth century, many in the Boomer Generation experienced similar feelings of love and compassion for not only their fellow humans, but for all life.  

     

    Now, in the twenty-first century, as the Boomer Generation has begun passing through the Sun Door, we have the reality of Climate Disruption and cataclysmic change staring us in the face.  It was the counterculture of the Sixties and Seventies – readers of Aldo Leopold and Rachel Carson (et al.) – who first awoke to this danger on a mass scale.  If one takes the language of the first Great Awakening metaphorically, Bushman’s admonition hits close to home.  The “slippery slope” that Edwards, Whitefield, and a small army of itinerant revivalists used to induce the transformative “second birth” describes the reality we now face.  We are sliding.  Edwards’s “God” is our “Climate” and its reality cannot be denied.  We are too late to prevent many of the cataclysmic disruptions already set in motion in the name of profit and convenience.  But we can still mitigate some of them.  We still have some agency, but it is slipping away daily. This is the Last Great Awakening. It’s not just humanity that is sliding, it’s the entire planetary ecological system as well as future generations.  This is the REAL DEAL: the Sixth Extinction is underway, and everyone is responsible now. 

     

    Richard Bushman’s observation that modern observers do not understand those eighteenth-century individuals who underwent a “second birth” has gained profound significance.  Ignoring what we have been doing to our home planet has created consequences that we are only beginning to grasp and that are no longer abstract.  The itinerant preachers of the Great Awakening, bent on inducing a “second birth,” were, in their way, absolutely right about the need for profound personal change and diligent attention to one’s inner life.  The Last Great Awakening, if it is to be massive and successful, must involve profoundly altering our personal behaviors and inner lives while seriously committing to living in a sustainable way.  Indeed, the sublime terror needed to propel massive action does not need the abstract references to HELL of the First Great Awakening.  The consequences of the “reprobate’s” failure to change are not abstract at all; they are happening now.

     

    ©Douglas Harvey 2019

    Exploring the Curious Sources of Medieval Law: An Interview with Acclaimed Historian Robin Chapman Stacey

     

    As my research has shown, lawbooks of this period could communicate ideas and opinions as well as information; they could convey outrage and resentment as well as the stability of custom.

    Our challenge as scholars is to read in ways that allow us to fully get whatever jokes these authors might be telling.

    Robin Chapman Stacey, Law and the Imagination in Medieval Wales

     

    I first met Professor Robin Chapman Stacey, an acclaimed medieval historian, at the University of Washington in Seattle in May 2019. She had recently published a book about Wales of the thirteenth century, Law and the Imagination in Medieval Wales (University of Pennsylvania Press, 2019). 

     

    I told her I thought that law and imagination were contradictions. She briefly explained that she had found that literature, rituals, myth and other imaginative enterprises had influenced and shaped the law she examined in this far-off time in Wales. I was hooked when she mentioned the bawdy humor, flights of whimsy, and even burlesque in this body of law. For her, the law then was actually a complex political fiction. 

     

    I was intrigued by her remarkable book. It was apparent that Professor Stacey had spent years on this ambitious project that required painstaking translating from medieval Welsh and Latin, as well as rigorous exploration of the work of other scholars—and a perceptive and knowing sense of humor. 

     

    In Law and the Imagination in Medieval Wales, Professor Stacey examines the literary and political aspects of the Welsh lawbooks, and argues that the laws are best read not as objective records of native custom but, rather, as important and often humorous commentaries on the politics of thirteenth-century Wales, a nation facing challenges from within and without.

     

    Professor Stacey finds political commentary and even bizarre comedy in the Welsh lawbooks, arguing that they addressed threats to native traditions posed by the encroaching English while attempting to assure stability in domestic concerns such as marriage, divorce, and inheritance, and to deal with corruption, abuse, and violence. Welsh law of the period also reflects a special concern for preserving male authority and a discomfort with the participation of women in economic and political affairs. Professor Stacey peppers her examination of the old lawbooks with examples of medieval Welsh irreverence, bawdiness, wit, and sexual humor as she breathes life into the dry bones of this law of yore.

     

    Robin Chapman Stacey is a Professor of History at the University of Washington. She teaches medieval history, and her academic focus is Ireland, Wales, and England from the Iron Age through the thirteenth century. In addition to her appointment in History, she is an Adjunct Professor in the Gender, Women, and Sexuality Studies Department. Her work with students has been honored with the UW Distinguished Teaching Award. She has graduate degrees from Yale and Oxford, and has done intensive academic study in medieval Welsh and Irish languages. 

     

    Professor Stacey’s other books include The Road to Judgment: From Custom to Court in Medieval Ireland and Wales (1994), on the Irish and Welsh institution of personal suretyship in the high middle ages; and Dark Speech: The Performance of Law in Early Ireland (2007), on the role played by speech and performance in ensuring social order in early medieval Ireland. Her books have received prizes from the Medieval Academy of America, the American Conference for Irish Studies, and the Board of Celtic Studies of the University of Wales. She has also written numerous articles on subjects pertaining to medieval Ireland, Wales, and England, including divorce, law and memory, riddles, and legal education. 

     

    Professor Stacey’s research has been supported by grants from the Guggenheim Foundation, the American Council of Learned Societies, All Souls College Oxford, and the Dublin Institute for Advanced Studies. She is a Past President of the Celtic Studies Association of North America, has served on the Board of Directors of the American Society for Legal History, and was a Councilor and Executive Board member of the Medieval Academy of America. 

     

    Professor Stacey graciously responded by email to my barrage of questions about her career and her new book. 

     

    Robin Lindley: Thank you Professor Stacey for agreeing to respond to some questions on your work as a historian and your new book, Law and the Imagination in Medieval Wales. Before getting to your book, could you mention how and why you decided to study history?

     

    Professor Robin Chapman Stacey: I have always been fascinated by the past, but until I went to college, I was planning to be an archaeologist.  In fact, I taught myself Egyptian hieroglyphics in middle school—only to realize once the deed was done that I wasn’t going to get very far knowing the script if I didn’t also know the language!   

    The switch from archaeology to history, however, was the direct result of my taking a mind-numbingly dull class in anthropology at the University of Colorado followed a year later by a mind-blowingly stupendous class in history on the French Revolution at Colorado College.  The teacher I had for French Revolution, Professor Susan Ashley, was the best undergraduate teacher I ever had:  one of those instructors who could make everything you did for class matter so intensely that you stopped paying attention to what was actually going on in your day-to-day life.  After her class, the die was cast--I switched to history and never looked back.  

    Ironically, of course, I now make use of both archaeology and anthropology in my historical work, whereas the closest I get to the French Revolution is the occasional novel!

     

    Robin Lindley: And how did you decide to specialize in medieval history with an emphasis on Wales, Ireland and England? 

     

    Professor Robin Chapman Stacey:  I had a class on medieval English history at Colorado College in which we were asked to write a short paper on the Easter controversy as depicted in Bede.  I was so intrigued by the manner in which Anglo-Saxon, British, and Irish met and mingled in the multi-cultural world of the north that I decided later to write my senior Honors thesis on a related topic. Naively, I thought this would be a great thing to study in graduate school—I didn’t even know enough about what I was doing when applying for schools to realize that most graduate programs wouldn’t have someone teaching that period of history.  

    Happily, while there were no Celtic specialists in sight at Yale, where I went (Harvard is the premier Celtic program in the country), I did have the good fortune to work with two very supportive (though decisively non-Celtic) medievalists, Professors John Boswell and Jaroslav Pelikan, who did everything they could to promote my interest in what seemed to them the most obscure of subjects.  Then, in my second year, Professor Warren Cowgill, an eminent Indo-European linguist, decided to offer a course in Old Irish, which I jumped at the chance to take. Honestly, I was terrible at it: I had never even had a course in linguistics before, and Old Irish is an incredibly complex language.  However, I was also stubborn, wouldn’t quit, and was fortunate enough to win a Fulbright to study Irish at Oxford, where I met my mentor and now good friend, Professor Thomas Charles-Edwards.  He was both a world expert in Irish and Welsh law and the nicest and most patient man in the universe; without his help, I might never have finished my degree.   

     

    Robin Lindley: You have a gift for writing that breathes life into the dry bones of the law. How did you also come to focus on law in your work as a historian? Did you ever consider law school and working as a lawyer?

     

    Professor Robin Chapman Stacey:  Thank you! I enjoy writing, at least when I don’t hate it, if you know what I mean.  

    In terms of law:  well, I am intellectually interested in legal issues and always have been.  However, the fact that law emerged as my professional focus was the result of an entirely random event:  when I went in to consult Professor Cowgill about a paper topic for Old Irish, he pulled a legal text down off the shelf and told me to work on it.  That text, Berrad Airechta, a tract on personal suretyship, ended up being the basis for both my Oxford and Yale theses.  It was probably also my experience with that text that caused Professor Charles-Edwards to agree to work with me later at Oxford.  Had he chosen a literary rather than a legal text, my career might have been altogether different.

     

    Robin Lindley: Your background is fascinating. You’ve studied Welsh and other languages and translated from documents that are hundreds of years old. I read that you had a special tutor at Oxford in Welsh. How would you describe your interest in learning often obscure languages and your facility with languages other than modern English?

     

    Professor Robin Chapman Stacey:   Well, I had done some French, Latin, and German in high school, and until I went to graduate school and studied Old Irish, I had thought of myself as being fairly decent at learning languages.  In fact, my graduate degree was actually in Medieval Studies rather than in History, because I had initially thought when applying that I would like to work across the disciplines of history, language, and literature.  A year of Old Irish cured me of any illusions I might have had about my facility with languages (!), though it also persuaded me that I needed them if I was to do serious historical work, and that is why I applied for the Fulbright.  

    I hadn’t intended to tackle Welsh until I went to Oxford and found myself on the receiving end of this rather intimidating question: “You do know French, Latin, German, Anglo-Saxon, Irish, and Welsh, don’t you?”  (For the record, my tutor now claims this account to be entirely apocryphal, but that’s my story and I’m sticking to it!)  Two weeks later, I had started Welsh and found myself completely wasting the time of the highly eminent Welsh professor D. Ellis Evans, who sat there patiently as I went painfully, word by word, through one of the easiest texts in the medieval language.  Then came a summer course in Modern Welsh, and now I work more with medieval Welsh and Irish than I do with Latin.

     

    Robin Lindley: It seems you have a long-term interest in the role of imagination in the politics and culture of the societies you study. Did your new book on Wales somehow grow from your previous research on performance and law in early Ireland in your book Dark Speech?

     

    Professor Robin Chapman Stacey: That is a fascinating question, and I’m not sure I know the answer to it.  My M.Litt. thesis at Oxford was fairly straight-forward legal history.  However, as I worked to turn that into a book, I became more and more interested in questions like the social context of the law, the nature of its authority, and its relationship to other genres and ways of thinking in the period.   

    In my subsequent work, I found myself returning more and more to the connections between law and language, on the one hand, and law and literature on the other.  Dark Speech is focused very much on the former topic, the main issue being the ways in which the use of heightened language in performance lent authority to particular legal rituals or specialists.  Law and the Imagination, by contrast, focuses on the latter, exploring the ways in which literary themes and tropes were used by the medieval Welsh jurists to comment on contemporary political events. I suppose one could say that imagination ties both of these projects together, and in that sense one may have led to the other.

     

    Robin Lindley: How do you see the historical problem you address in your new book on medieval Wales?

     

    Professor Robin Chapman Stacey:  The Welsh lawbooks are the most extensive body of prose literature extant from medieval Wales.  They are preserved in approximately 40 manuscripts, both in Welsh and in Latin, and were clearly extremely important to the people who wrote and made use of them.  One can read them as law is traditionally read, as more or less straight-forward (if stylized) accounts of legal practice.  Reading them in this way gives us a sense of how Welsh law worked and developed over time.  However, these lawbooks were written in the twelfth and thirteenth centuries, the last two centuries of Welsh independence and a time of rapid internal change and heightened external conflict with England.  It is my belief that these texts reflect the period in which they were composed in very direct ways, and that reading them in the way we read literature, with close attention to theme and symbolic detail, reveals them to be a sophisticated, opinionated, occasionally even humorous commentary on contemporary political events. 

     

    Robin Lindley: When I first saw your book, I thought that the concepts of law and imagination were absolutely contradictory. How do you respond to that sense in light of your findings on the law in medieval Wales?

     

    Professor Robin Chapman Stacey:  We are so accustomed to seeing historical legal texts in the light of our own experience—no one nowadays would likely turn to a statute book for fun!—that we often make presumptions about lawbooks written in the past, and those presumptions govern how we read them.  

    What I am arguing in this book is that at least with respect to this particular body of sources, we need to let go of our expectations and read these texts within their own historical context.  The lawbooks of medieval Wales were not legislative in nature, and their authors had extensive family connections with poets, storytellers, and other more overtly literary artists.  There is a rich body of political poetry extant from this period, as well as a number of fabulous tales and a considerable corpus of erotic verse.  The lawbook authors are aware of all of these and, I argue, deploy many of the same tropes and symbols in their own work.   If we abandon the idea that law is always factual and instead approach these lawbooks in the way we might a tale, I think we will see that these texts also are meant to be read on more than one level.

     

    Robin Lindley: Your work is deeply researched and thoughtful. What are some of the challenges and limitations regarding sources in the kind of work you do?

     

    Professor Robin Chapman Stacey:  The biggest general challenge was realizing what I was arguing and deciding how far I was willing to take it, on which see more below.  The biggest source challenge was that every tractate within the lawbooks is different from the others in its nature and development; additionally, many lawbook manuscript versions also differ significantly from one another.  This made it challenging to draw general conclusions while also respecting the fact that what is true of one tractate or manuscript might not be true of another.

     

    Robin Lindley: How did your book evolve from your initial conception to the final product?

     

    Professor Robin Chapman Stacey:  Honestly, this was one of those books that crept up on me article by article until I realized that everything I was writing about Welsh law was tending to go in the same direction and ought actually to be a book. 

    The very first article I wrote on the subject was an assignment given to me by the editors of a volume on the court tractate to write about the king, queen, and royal heir.  I was asked—a daunting invitation for a relatively young scholar!--to write on these figures because all the other contributors (infinitely more experienced and venerable than myself) had already written on the subject.  At first, I felt stymied and intimidated, but then I began to study these sections carefully and to realize that what I had expected to see was not at all what I was finding.  I was particularly taken by the manner in which the passages I had been assigned seemed intentionally to be creating an image of space within the court that was more politicized than real.  When I later found myself writing about divorce, I was struck again by the “unreality” of what was being described, and by the political subtext that seemed to me to be emerging from what purported to be a description of procedure.  Other chapters followed, and the book progressed from there.

     

    Robin Lindley: You write about Wales in the thirteenth century. What was Wales like then? Was it a loose group of principalities bound by a common language? 

     

    Professor Robin Chapman Stacey: The unity of Wales in the twelfth and thirteenth centuries was vested primarily in language, culture, law, and a shared mythology. Various Welsh rulers, usually but not always from the northern province of Gwynedd, had made attempts over the centuries to exert political control over the other regions of Wales. However, these attempts were sometimes successful and sometimes not, and they never lasted, not least because Welsh lords from other regions often resisted Gwynedd’s attempts to extend its rule. Additionally, the pressures posed by the presence of Marcher lords and by the aggression of the English crown were a constant obstacle to native unification. 

    Native law was an important factor in defining Welsh identity in the period, but it was already the case that individual Welshmen—and even some Welsh rulers—were beginning to adopt Common Law ideas and procedures even before the final conquest of Wales by Edward I in 1282-1283. The Statute of Rhuddlan, enacted in 1284, established the basis for English rule of the Principality up till 1536, permitting some aspects of native law to continue, but introducing Common Law procedures in other areas.

     

    Robin Lindley: You also write about a period when Wales was eventually conquered by England. How did Welsh law respond to English power and Welsh nationalist concerns?  

     

    Professor Robin Chapman Stacey: My argument is that the Welsh lawbooks speak directly to the need for political unity in face of external aggression and to the importance of native law as a marker of Welsh identity.  They tackle some of the criticisms commonly made against the Welsh in the period, such as sexual immorality and an inordinate penchant for violence.  They also voice, albeit obliquely, criticisms of native rulers for their abandonment of native customs and increasingly intrusive forms of governance.

     

    Robin Lindley: You see the law as a form of political literature where even the ridiculous can have meaning. Could you give a couple of examples of where you found this quality in law?

     

    Professor Robin Chapman Stacey:  My favorite examples are the “burlesques”—ridiculous and even obscene rituals described by the jurists as taking place in the court or localities.  For example, one of the things the authors of the lawbooks were concerned about was the degree to which native Welsh rulers were enlarging their jurisdiction by intruding on the traditional prerogatives of the uchelwyr, “noble” or “free” classes from which the lawbook authors came.  The manner in which they described the royal officers taking part in this process was intended (I think) to convey their contempt for them.  The royal sergeant or tax collector, for example, is depicted in the lawbooks as wearing ridiculously short clothing with boots better suited to a child than to a full-grown man; additionally, he is wrongly dressed for the season, wears his underwear on the outside, and goes about trying to do his dirty business holding a short (and flaccid) spear in front of him in a gesture that certainly looks sexual to me in the manuscript illustration we have of it.  Similarly, the judge is said to sleep at night on the pillow the king sits on during the day (imagine the odors here, not to mention the posterior as a source of royal justice); and the porter (who guards the entrances and exits to the court) is imagined as receiving all the bovine rectums from the court as his due.  If this isn’t satire, then I don’t know what is!  

     

    Also held up for ridicule are women and men who violate native Welsh sexual strictures by having sex without marrying.  Women would normally get property as part of an authorized marital arrangement.  What happens in one passage is that the greased tail of a steer is thrust through the keyhole of a house.  The woman is on the inside; the man is on the outside with two of his friends urging on the steer with whips.  If she can hold on to the tail, she can keep the animal; if she can’t, she gets nothing.  If one imagines the see-sawing motion back and forth, the grease coming off on her hand, and the two “helpers” spurring on the excited animal…well, the sexual imagery is pretty hard to avoid.  We don’t know whether this was an actual ritual in the community; however, it functions in the lawbook as a response to the perceived immorality of the Welsh by showing that they do indeed punish bad behavior and uphold marriage.

     

    Robin Lindley: The Welsh distinguished court and country in law. Did this mean that there was one law for royalty and elites at court and another law for workers and farmers in the country? Why was this distinction significant?   

     

    Professor Robin Chapman Stacey:  The basic pattern of the lawbook divides Wales into court and country.  The court is where the prince and his entourage and servants live, and the country is all the rest.  The intent is not so much to suggest that each live by different laws, as to highlight the differing spheres of jurisdiction.  The tractate on the court really just describes the different royal officers and their duties and prerogatives.  The tractates on the country are focused on different legal subjects and procedures affecting Welshmen as a whole:  suretyship, land law, marriage, animal law, and the like. Part of the idea here is, I think, to create a sense of spatial politics moving outwards from the royal court to encompass the settled and wild parts of the gwlad, “kingdom.”  Opposed to this (at least in one redaction) is the gorwlad, the lands outside of gwlad, which are portrayed as regions of danger and predation.  This is part of how the jurists highlight the need for political unity in the face of external threat.  

     

    Robin Lindley: What are some ways the Welsh law reflected past stories, lore, myth, and such?  

     

    Professor Robin Chapman Stacey: There are certain myths reflected in the lawbooks, again with political intent.  Perhaps the most obvious of them is the harkening back to a (mythical and politically motivated rather than historical) time before the coming of the English when Britain was ruled by a Welsh-speaking king residing in London.  But another, I think, is the image of Wales itself created in these lawbooks:  as timeless, set in the reign of no particular king and thus of them all, and enduringly native.  It says something important about these sources, I think, that some of the most obvious facts of political life in thirteenth-century Wales—such as the March and Marcher lords--aren’t even mentioned.

     

    Robin Lindley: It seems law always has a role in assuring stability. Under Welsh law, wasn’t there a special interest in limiting any social mobility and keeping people in their place?   

     

    Professor Robin Chapman Stacey: Yes, I think you would find that in any body of medieval law to some extent.  But because these texts are written by uchelwyr, this free or noble class, the focus is mainly on protecting themselves from unwarranted and untraditional royal demands, and on preserving their own distinctiveness as a class from economic pressures fragmenting their wealth and pushing them downwards.

     

    Robin Lindley: You stress “bodies” in the law. How did the law see gender and class in terms of the bodies pertaining to law?

     

    Professor Robin Chapman Stacey:  I think bodies and, particularly, the gendering of bodies, play an important role in the political commentary implicit in these tracts. One example would be the depiction of the sergeant mentioned earlier, where he is portrayed symbolically not only as ridiculous, but as simultaneously a child and a woman in the items he is given by the king.  In fact, I think the body itself is an important metaphor in these texts, with the prince being depicted as the only person fully in possession of an entire body, while his courtiers are represented by the body parts of the animals they receive as rewards for their service.  Each officer receives something appropriate for his office—the judge gets the tongues, the watchman gets the eyes, the doorkeeper the skins, etc.  It may be that we are supposed to take these rewards seriously; after all, animal body parts had real value in the Middle Ages. On the other hand, their symbolism is evident, and some of them (in my view) just have to have been invented by the jurists.  What was a porter to do with a pile of rectums, for example?  And what are we to make of the image of a watchman surrounded by a mound of eyes?

     

    Robin Lindley: You mention that slaves were considered “no-bodies” under Welsh law. What did this mean in terms of the treatment of slaves?

     

    Professor Robin Chapman Stacey: Given the way in which other ranks of person are represented in the lawbooks by body parts symbolic of their status and duties, it seems significant to me that slaves receive nothing themselves, and that their owners receive only the tools of their trade in compensation if the slave is killed.  They are, literally, “no-bodies” as far as the lawbooks are concerned.

     

    Robin Lindley: The Welsh sense of humor was often ribald and sexual and that came through in some laws. How did this humor influence legal relationships such as marriage?

     

    Professor Robin Chapman Stacey:  I don’t think we have any real way of knowing how humor played out in marriage.  However, I can see a mordant sort of humor in the divorce procedure described in these texts, where the divorcing parties divide up their goods one by one. Previous historical interpretations portrayed this process as practical in nature, as setting each party up to move on into a new life and new relationships.  However, when one looks at the items being separated, many of them are things that cannot function apart from one another—one party gets all the hens and only one cat to protect them, while the other gets all the cats and no hens, for example.  One party gets the sieve with large holes and the other the sieve with fine holes.  Moreover, several of these symbols are sexual and, I think, intended to be shaming and perhaps ruefully funny.  Not only are there lots of phallic symbols (validated as such by contemporary Welsh poetry), the top and bottom stones of the quern by which seed is ground go to different parties, and it is surely not coincidental that this provision comes right after one about the separation of the bedclothes.  What we have here, I think, is a symbolic “sermon” on the infertility and waste that result from divorce.

     

    Robin Lindley: Many readers may be surprised that Welsh law permitted divorce and canon law did not prevail there in the thirteenth century. How would you describe the reality of divorce in Wales then? Although permitted in law, it seems divorce was regarded as a great human failure.   

     

    Professor Robin Chapman Stacey:  Yes, I think that is exactly the point.  Welsh law did permit divorce, although the Welsh were severely criticized by the Church for this because divorce was not allowed under canon law.  My suspicion about the divorce “burlesque” is that it is a sign that attitudes to divorce were changing on this front even in native Wales, even though the practice had not yet been abolished.

     

    Robin Lindley: Female virginity was prized in the law. Why was this status so significant?

     

    Professor Robin Chapman Stacey:  I’m not sure how to explain why some societies value female virginity and some seem to care somewhat less about it.  For the Welsh, virginity was tied up in lordship, which might explain something of its importance.  Lords received monetary payments for the first sexual experiences of women under their jurisdiction; these payments are said to parallel in very direct ways the payments made to lords when men under their jurisdiction die.  My guess is that virginity was perceived as an asset for the lord:  both a duty that their female dependents owed them and a source of revenue and potential political connections.

     

    Robin Lindley: Outsiders saw the Welsh as violent and bloodthirsty. What was the reality you found when you compared Wales with other societies of the period?

     

    Professor Robin Chapman Stacey:  Most studies done of Welsh violence have found that the Welsh were no more or less bloodthirsty than others at this period. Indeed, at least one study has even argued that the Welsh learned some of their most gruesome practices from England! However, it was a common criticism of the Welsh, and I believe that many lawbook authors deliberately downplay violence in their work as a way of rebutting this accusation. 

     

    Robin Lindley: In the Welsh law of homicide you discuss, murderers are less of a concern than those who abet or assist in murders. Why was that the focus of the law?   

     

    Professor Robin Chapman Stacey:  Again, a fascinating question to which I wish I knew the answer!  Part of it likely reflects the fact that payments for abetment go to lords, and these sources are written in part to support good lordship.  Part of it also, I think, is the desire on the part of the lawbook authors to downplay the specter of actual violence.  They couldn’t just leave the abetments section out of their work, as all indications are that it was an old and traditional part of the lawbook pattern.  However, only certain manuscripts go on to describe actual violence:  most don’t, and I think that is deliberate.

     

    Robin Lindley: Is there anything you’d like to add for readers about your work or your groundbreaking new book on law and imagination in Medieval Wales?

     

    Professor Robin Chapman Stacey:  Merely my thanks to you for taking the time to ask about it!  Few people realize just how interesting and important these sources are.  The written Welsh legal tradition is extensive and, as I hope I have suggested above, wonderfully absorbing in its imagery and attention to symbolic detail.  And yet few medievalists are acquainted with these texts.  One of my hopes is that by tackling questions of interest to historians of other medieval regions, that might change.  

     

    Robin Lindley: Thank you for your generosity and thoughtful comments, Professor Stacey. And congratulations on your fascinating new book.  

     

    Professor Robin Chapman Stacey: You’re welcome and thank you!

    The Power of Microhistory: An Interview with Bancroft Prize Winner Douglas Winiarski

     

    Dr. Douglas Winiarski recently won a grant from the National Endowment for the Humanities. From the press release:

    "University of Richmond religious studies and American Studies professor Douglas Winiarski will begin two central chapters of his latest book project, Shakers & the Shawnee Prophet: A Microhistory of Religious Violence on the Early American Frontier, 1805–1815, this summer.

    His research is being funded by a $6,000 Summer Stipend by the National Endowment for the Humanities. 

    “Shakers and the Shawnee Prophet examines the local sources of religious violence on the early American frontier during the years leading up to the War of 1812,” said Winiarski. “I anticipate that the book will resonate with readers attuned to the politics of religious difference and the troubling connections between religion and violence in our own times.” 

    Winiarski is the author of Darkness Falls on the Land of Light: Experiencing Religious Awakenings in Eighteenth-Century New England which was awarded several honors in 2018 including the prestigious Bancroft Prize in American History and Diplomacy and the Peter J. Gomes Memorial Book Prize."

     

    Dr. Winiarski kindly gave HNN an interview and explained his exciting research.

     

    Your previous work focused on the religious history of New England. What prompted you to shift to studying the American frontier for this project? 

     

    This project actually predates my work on popular religion in eighteenth-century New England. I first started thinking about the pacifist Shakers’ unusual relationship with the militant followers of Tenskwatawa, the notorious Prophet and brother of the famed Shawnee war captain Tecumseh, more than two decades ago. As a graduate student at Indiana University, I was fortunate to study with two outstanding mentors: Stephen Stein, a leading historian of Shakerism and new religious movements in America, and David Edmunds, who had written the definitive biography of Tenskwatawa. They were colleagues and knew each other, of course, but had never discussed the intriguing connections between their scholarship. Steve’s definitive Shaker Experience in America makes only a brief reference to the Shakers’ 1807 mission to the Indians of the Old Northwest; David’s Shawnee Prophet relies on the Shaker missionaries’ journals but doesn’t explain why those sources existed in the first place. As I read their books side by side, I realized that both scholars were working around the edges of a fascinating, untold story. 

     

    I started poking around with some of the primary sources from the period and was stunned by what I found. The Shakers produced dozens of journals, letters, and speeches documenting their meetings with the Prophet and his followers. They provisioned the Prophet’s villages and invited Tecumseh to visit their communal settlement at Turtle Creek, Ohio. During a three-day meeting during the summer of 1807, in fact, the Shakers and Shawnee danced together in front of an astonished and horrified audience numbering in the hundreds. Then the frontier erupted into violence. The Prophet faced relentless pressure from all sides, native and American. The Shakers suffered intimidation, theft, and arson. In 1811, the Prophet’s movement was nearly destroyed at the Battle of Tippecanoe; one year earlier, an armed mob threatened to raze the Shakers’ entire village. It’s an amazing story, but also a tragic one. I always planned to come back to this project after I had completed Darkness Falls on the Land of Light.

     

    What is a religious microhistory? What made you use that term?

     

    I’m pretty sure the term “microhistory” won’t make the final cut in the title of the book, but I used it in my NEH application to signal the distinctive method I’ve adopted in this project. Microhistorians take well-documented but obscure individuals or events and use them to explore larger historical phenomena, issues, problems, and themes. It’s a narrative genre as well. Classic microhistories, such as John Demos’s Unredeemed Captive or Paul Johnson and Sean Wilentz’s Kingdom of Matthias, often read like gripping historical novels. They share much in terms of approach with popular nonfiction books, such as Erik Larson’s The Devil in the White City.

     

    A microhistorical approach allows me to tell a great story. This one features two groups of religious outsiders—both despised and feared by their own people—who briefly discovered common ground and mutual respect within the racially charged and frequently violent crucible of the early American frontier. But I think the story of the Shakers and the Shawnee Prophet raises larger questions about religion, race, and violence in the newly United States. Seen from a broader angle, it’s a cautionary tale about what could have been, what might have been. The violence that erupted in response to the Shakers’ meetings with Tenskwatawa and Tecumseh brings into sharp relief the important connections between America’s rising imperial aspirations, which were directly tied to the dispossession of native Americans, and the emergence of American evangelicalism and the rise of the southern Bible Belt.

     

    What do you think would surprise readers about this time period and subject? Did anything surprise you while you researched?

     

    Just about every Protestant denomination sent missionaries to the native peoples of the Ohio Valley and Great Lakes region during the early decades of the nineteenth century. Most missionaries were aggressive agents of culture change; a few, including the Moravians and Quakers, tried to work closely with native communities and sympathized with their plight. Yet all of them presumed that the Indians needed to change or perish, to become, in the language of the period, “civilized.” The Shakers were different. They were convinced that other denominations believed and practiced Christianity in error. In fact, they typically referred to Baptists, Methodists, and Presbyterians as “antichristians”! In 1805, three Shaker missionaries traveled to Kentucky and Ohio seeking to convert members of these evangelical denominations to the Shakers’ distinctive faith. After some impressive early gains, however, the Shaker mission faltered and they began to experience various acts of violence at the hands of their antichristian neighbors. Two years later, when members of the struggling Shaker community at Turtle Creek heard that a new group of “religious Indians” led by a charismatic prophet had recently established a new settlement less than fifty miles away, they quickly dispatched a delegation. But the Shakers never attempted to convert the Shawnee Prophet or his followers; nor did they think the Indians needed to be civilized. Instead, the Shakers understood the Prophet’s movement as a manifestation of the very same “operation of the holy Spirit” that had once existed among Anglo-American “revivalers” on the frontier. The Holy Spirit had abandoned evangelical Protestants, as one Shaker noted in his journal, and been “given to the Heathen, that they should bring forth the fruits of it.” This is an extraordinary statement. And it set the stage for a series of fascinating encounters between the two groups—and a lot of anti-Shaker violence as well.

     

    What lessons has your research into this period offered for American politics and society today?

     

    And I guess that’s the contemporary payoff of this project. I think my book will resonate with readers attuned to the politics of religious difference and the troubling connections between religion and violence in our own times. Loose talk of religious “extremism” seems to be everywhere in our public discourse these days—of individuals “radicalized” and religious beliefs “weaponized.” Anglo-Americans would have understood the Shakers and the Shawnee Prophet in similar terms in 1807. Then, as now, violence against outsider religious communities was fueled by partisan media and warhawking politicians. When Indiana territorial governor and future president William Henry Harrison wrote to the Secretary of War and mistakenly claimed that Tenskwatawa “affects to follow the Shaker principles in everything but the vow of celebacy,” he was beating the drums of racial and religious prejudice to drum up support for an Indian war that would begin with the Battle of Tippecanoe and continue through the War of 1812. Perhaps, in understanding the little-known story of the Shakers and the Shawnee Prophet, readers will be in a better position to assess the dangerous power of similar forms of political invective in our own times. We’ll see.

     

    What has been your favorite archival experience while researching this book?

     

    That’s an easy one. It’s my obsessive quest to recover the history of something called “the jerks.” My book is about the rise of western Shakerism and the believers’ complicated relationship with the Shawnee Prophet. Along the way, I needed to figure out why the Shakers of upstate New York and New England sent missionaries to the trans-Appalachian west in the first place. It turned out to be an interesting story in itself. By the turn of the nineteenth century, the Shakers had developed a notorious reputation for their ecstatic dancing worship practices. During these “laboring” exercises, as they were called, Shaker brothers and sisters frequently collapsed to the ground, trembled uncontrollably, or whirled in circles. Outsiders began referring to them as “convulsioners.” Powerful religious revivals began sweeping across western settlements during the first years of the nineteenth century, but the Shakers waited to send missionaries to investigate until a curious article appeared in a local newspaper in November 1804. The anonymous author of “The Jerks” described a peculiar new religious phenomenon that had recently surfaced in the Shenandoah Valley of Virginia in which the bodies of revival participants convulsed uncontrollably. Benjamin Seth Youngs, the central figure in my book, and two colleagues set off for the frontier just one month later. The Shaker missionaries targeted communities of these “jerkers” during their travels, and revival participants who experienced such unusual somatic phenomena emerged among the Shakers’ earliest converts in Kentucky and Ohio. So I spent quite a bit of time tracking down these “jerkers”—the first “Holy Rollers” in the history of American evangelicalism. Readers can learn more about that side project on my website (www.douglaswiniarski.com/essays) or by visiting my “History of the Jerks” digital archive (https://blog.richmond.edu/jerkshistory).

     

    History Provides a Critical Thinking ‘Toolbox’ for Students: An Interview with Ortal-Paz Saar

     

     

    Ortal-Paz Saar is a postdoctoral researcher in the Department of History and Art History at Utrecht University, where she specializes in religious studies and Jewish cultural history. 

     

    What books are you reading now?

     

    Yesterday I finished A Pale View of Hills by Kazuo Ishiguro, which I read slowly, so as to “save” it for as long as possible. It is a true masterpiece, from every imaginable point of view. It could make a very good horror film, meaning an intelligent, not a scary one. As it is, the novel makes you shiver because of the things it does not say, somewhat like Giorgio de Chirico’s painting “Mystery and Melancholy of a Street”. Before that I read Sarah Perry’s The Essex Serpent, a historically set novel with moving and convincing characters. I came across it by chance in the bookstore, while looking for a novel by a different Sarah, Sarah Moss, whose Ghost Wall I enjoyed very much.

     

    What is your favorite history book?

     

    If you mean “history” as in “non-fiction”, then Gideon Bohak’s Ancient Jewish Magic: A History (2008). Bohak was my PhD supervisor and is a true intellectual whom I profoundly admire. He also happens to write exceptional academic prose: clear, pleasant to read, and full of humor. 

     

    When reading fiction, I tend to notice the historical setting, and I often learn a lot from novels – good ones encourage you to go and read more about a period.

     

    Why did you choose history as your career?

     

    I started out as a classical archaeologist but fell in love with magic-related artifacts during my MA studies, which led to a doctorate focusing on manuscripts and history. For me, historical research is fascinating in a way similar to a detective investigation: you have clues, some of which are misleading and others fragmentary, and you need to piece together an image. You strive to achieve an accurate one, although history (especially pre-modern periods) precludes certainty.

     

    What qualities do you need to be a historian?

     

    To be a historian you probably just need to study, find a research topic and work on it. To be a good historian, however, you need to have curiosity, imagination, passion, and the courage to go against the current if you believe you are right. Come to think of it, one needs those qualities to be good in every profession.

     

    Who was your favorite history teacher?

     

    Tough question. I do not recall any positive (high)school experiences with this subject. At Tel Aviv University I had several good teachers, and particularly liked Prof. Israel Roll, who unfortunately passed away in 2010. He had a very systematic method of teaching, clear and easy to follow, whether he was teaching about classical art, excavations in Pompeii or sites in Israel. I can still remember many of his classes, and think the students appreciated him.

     

    What is your most memorable or rewarding teaching experience?

     

    Both memorable and rewarding: adult-education courses in which the participants were people from all walks of life and different religious backgrounds: an orthodox Jew sitting next to two Muslims and several atheists. My lectures were about the major world religions, and I will always remember the warm, respectful, and friendly atmosphere at those meetings. 

     

    What are your hopes for history as a discipline?

     

    If we want to maintain history as a subject worthy of being taught even as ever more historical information can be found online, we need to seriously think about its raison d’être. We need to ask: What does history really teach us, and why is it needed today? These questions seem to be even more pertinent when we talk about ancient history – why should people care about what happened more than two millennia ago? I do not often come across discussions on the philosophical aspects of this discipline; maybe because people working within the discipline love it, so they do not stop to ponder its future or its relevance. They climb the mountain because it is there. However, I find it important to pose these questions, both in class and among colleagues. 

     

    One of the things I often tell students is that I would like to teach them critical thinking, and that history provides a toolbox they can take with them once they finish the course. This is increasingly important in the age of fake (news-, deep-, you name it). It may not be long before distinguishing true history from its other forms becomes impossible, and worse: irrelevant. My hope is that we, and the student generations we help shape, will be able to prevent this.

     

    On a less serious note, I hope that someone will finally invent the time machine that history lovers have been dreaming about for so long (suggested reading: M.R. James’ “A View from a Hill” -- and anything else by this author).

     

    Do you own any rare history or collectible books? Do you collect artifacts related to history?

     

    None. I am not a collector, and tend to get rid of things that clutter my space (books are never clutter, of course).

     

    What have you found most rewarding and most frustrating about your career?

     

    Rewarding: the fact that my work and my hobbies coincide. I go to work each morning feeling happy about the hours ahead. This is probably one of the greatest blessings one can ask for. Frustrations? None so far.

     

    How has the study of history changed in the course of your career?

     

    One word: digitization. Two words: digital humanities. I remember writing my PhD and using microfilm images of Cairo Genizah manuscripts: black and white, poor quality, and the microfilm machines were constantly breaking. Only a decade has passed and those manuscripts, fully digitized, can be viewed from any laptop, in excellent resolution. Secondly, the rapid increase in the use of DH techniques and methodologies has brought to light historical patterns previously unseen, enabling people to ask questions that were inconceivable before. 

     

    What is your favorite history-related saying? Have you come up with your own?

     

    Not just history-related: “The broader your horizons, the more points of contact you have with infinity” (attributed to Blaise Pascal). I use it to justify spending time reading interesting (but irrelevant) things when I should finish writing my monograph on epitaphs that reflect Jewish diasporic identity.

     

    What are you doing next?

     

    Finishing that monograph, of course. 

    Apollo: America’s Moonshot and the Power of a National Project

     

    On October 4th, 1957, Americans looked skyward to see their world had forever changed. The Soviet satellite Sputnik was orbiting the Earth roughly every hour and a half. The Soviets had kicked off the Space Race in grand fashion, shocking and embarrassing America’s political establishment.

     

    Senate Majority Leader Lyndon Johnson was unequivocal about the new threat: “soon, the Russians will be dropping bombs on us from space like kids dropping rocks onto cars from freeway overpasses.” In 1958, Johnson supported a massive appropriations bill to expand the American space program and create NASA. During the 1960 election, John F. Kennedy would hammer Vice President Nixon about the “missile gap” between the USSR and America. 

     

    As Kennedy took office, the Soviets retained their edge in the Space Race. On April 12th, 1961, Yuri Gagarin became the first man to orbit the Earth. A month later, after Alan Shepard completed America’s first spaceflight, the president announced that America would land a man on the moon before the end of the decade. 

     

    Far from naïve idealism, the declaration laid out an ambitious roadmap to restore America’s technological superiority. As Kennedy later said, “we choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard.” His goal would demand a tremendous amount of time and resources. By the mid-1960s, NASA’s budget was over 4% of federal spending, and its activities engaged over 400,000 people. 

     

    On July 20th, 1969, Kennedy’s bold vision was realized when Neil Armstrong set foot on the lunar surface. However, NASA did far more than win the Space Race. Each dollar the agency has spent has produced eight to ten dollars of economic benefits. As historian Douglas Brinkley noted, space hardware spurred major advances in nearly all facets of modern life including: “satellite reconnaissance, biomedical equipment, lightweight materials, water-purification systems, [and] improved computing systems.” The Apollo program shows how big goals paired with effective execution and government-supported R&D can play a unique role in driving national progress. 

     

     

    The Longest Journey

    Mercury

    Before Americans could walk on the moon, they needed to reach space and return safely. Project Mercury was America’s first foray into manned spaceflight and established several key practices that were essential to the moon landing’s success. 

     

    First, Mercury established the public-private partnership approach NASA would effectively use during Project Gemini (the successor to Mercury) and Project Apollo. NASA’s Space Task Group designed the Mercury spacecraft and McDonnell Aircraft produced it. Likewise, Army engineers designed the Redstone rocket that carried the first Mercury capsules on their suborbital flights, and Chrysler built it. 

     

    Second, the Mercury project thrived on cooperation instead of competition. In the earlier days of rocket design, Army and Navy teams had competed against each other. Now the entire Mercury program fell under NASA, which concentrated personnel and resources on the task of spaceflight. The seven pilots who became the Mercury astronauts came from the Air Force, Marines, and Navy. 

     

    In 1961, Alan Shepard became the first American in space, with a fifteen-minute suborbital flight. The next year, John Glenn would become the first American to orbit the Earth. Three other manned flights would follow Glenn, and the astronauts would be feted as national heroes. However, a world of difference separated these brief journeys above Earth from a quarter-million-mile adventure to the moon.

     

    Gemini

    NASA recognized that the path to the moon demanded incremental steps over several years. This long-term perspective envisioned Gemini as a stepping-stone to Apollo. Although Gemini spacecraft never flew more than eight-hundred miles from Earth, the two-man missions provided invaluable knowledge about the tasks required to make a moon landing possible. 

     

    On Gemini 4, Ed White became the first American to perform an extra-vehicular activity (EVA), commonly known as a spacewalk. Later Gemini missions would refine the techniques for maneuvering outside the spacecraft required when the astronauts landed on the moon.

     

    Given that a roundtrip to the moon would take nearly a week, NASA had to ensure that the crew could survive in space for far longer than during the Mercury missions. Gemini 5 orbited the Earth one-hundred-twenty times during a weeklong mission. Later in 1965, Frank Borman and Jim Lovell spent two cramped weeks within Gemini 7.

     

    Borman and Lovell also participated in the first space rendezvous. Orbiting nearly two hundred miles above Earth, they were joined by Gemini 6. During the rendezvous, the two spacecraft came close enough for the astronauts to see each other clearly. This exercise provided confidence for advanced docking procedures, where Gemini craft would connect with an unmanned target vehicle. This docking simulated the detachment and reattachment of the Lunar Module (LM). 

     

    The Gemini program concluded by setting altitude records and practicing reentry into the Earth’s atmosphere. With Gemini completed, NASA was ready for Apollo. 

     

    Apollo

    Project Apollo marked the culmination of America’s manned space program. By the mid-1960s, over half of NASA’s annual $5 billion budget (approximately $40 billion in today’s dollars) went to the Apollo program. NASA contracted with thousands of companies, including IBM, which developed state-of-the-art computers. Dozens of universities provided their brightest minds too, including MIT, which developed navigation and guidance systems. Apollo was propelled into space by the Saturn V rocket, a 363-foot colossus designed at NASA’s Marshall Space Flight Center under Wernher von Braun’s direction. The three-man Apollo capsule also far exceeded the tiny Gemini capsule in spaciousness and complexity.

     

    However, Apollo suffered from numerous minor engineering defects and technical glitches, continually frustrating its first crew. Then, on January 27th, 1967, disaster struck. During a test of Apollo 1, faulty wiring created a spark that ignited a fire, which spread rapidly through the capsule’s pure-oxygen environment. Astronauts Gus Grissom, Ed White, and Roger Chaffee perished. 

     

    For a period, the space program’s very future was in doubt. Even before the fire, some critics had condemned Apollo as a “moondoggle.” Now, the public and Congress were demanding immediate answers. 

     

    Rather than attempting to deflect blame, NASA created a review board to investigate Apollo 1. Frank Borman and other astronauts literally walked the floors of North American Aviation, the company that had assembled the capsule. Engineers, research directors, and spacecraft designers also joined the review board. After several painstaking months, the board recommended a series of comprehensive changes that would ultimately make Apollo far safer and more reliable. “Spaceflight will never tolerate carelessness, incapacity or neglect,” Flight Director Gene Kranz told his team after the tragedy, “from this day forward Flight Control will be known by two words: ‘tough and competent.’” 

     

    When Apollo resumed manned spaceflights in October 1968, the culture of nonstop self-improvement instilled by Kranz and others had taken root. Apollo 7 was an operational success.

     

    Apollo 8 marked a huge step forward as the crew of Frank Borman, Jim Lovell, and Bill Anders became the first human beings to orbit the moon. After their quarter-million-mile journey, they approached within seventy miles of the lunar surface and glimpsed the far side of the moon. While in lunar orbit, Anders snapped a photo of our fragile home planet in the void of space. “Earthrise” would become an icon of the nascent environmental movement.

     

    After a successful test of the LM on Apollo 10, the Apollo 11 mission put Neil Armstrong and Buzz Aldrin on the moon (their crewmate Michael Collins piloted the command module as they descended). After Apollo 11, NASA completed five additional lunar landings. In the later missions, astronauts spent up to three days on the lunar surface and successfully deployed a lunar rover. They also conducted valuable experiments and returned with rock samples that have taught us much about the moon’s origins and the state of the early Earth. 

     

    No More Moonshots

    After the first moon landing, public interest in the space program waned. Even in the last years of the Johnson administration, NASA’s budget was cut as the Vietnam War escalated. Once America had conclusively won the Space Race, Nixon enacted even steeper cuts. By the early 1970s, the 400,000 people working with NASA had been reduced to under 150,000. Ambitious plans for lunar colonization and further exploration were scrapped along with the final Apollo missions.

     

    In the late 1970s, NASA turned its attention to the Space Shuttle program. The shuttle would provide a reusable and cost-effective vehicle for transporting astronauts into low-Earth orbit. However, the shuttle proved far more expensive and less dependable than expected. Among the Shuttle program’s greatest accomplishments was the construction of the International Space Station (ISS). Even so, many at NASA considered the shuttle a partial success at best. NASA Administrator Michael Griffin argued that the Saturn rocket program could have provided more frequent launches into deeper space at a similar cost. Had that path been pursued, “we would be on Mars today, not writing about it as a subject for ‘the next 50 years,’” Griffin asserted. The Shuttle program ended in 2011, and American astronauts now use Russian craft to reach the ISS. NASA’s current budget is less than 0.5% of total federal spending, barely 1/10th its mid-1960s peak.

     

    Interestingly, NASA’s grand plans also fell victim to the ideals of the Reagan revolution. While President Reagan supported Cold War military spending, he espoused the belief that “government is not the solution to our problem, government is the problem." That philosophy has become an article of faith for American political conservatives. Even among moderates, a deep skepticism of government programs has become commonplace. 

     

    Faith in government has been replaced by faith in markets. True believers claim that market competition alone drives human progress and advancement. However, economic realities have challenged that optimistic assessment. For corporations, executive compensation has become increasingly linked to stock performance. Investors press for management changes if companies underperform their targets. Corporate leaders are ever more beholden to the next quarterly earnings report and short-term growth. These demands make it far harder to invest in long-term R&D efforts, especially when the outcome is uncertain. For startups, the situation is not much better. A firm may develop a novel product, leading to a massive infusion of venture capital. However, that capital comes with a high price. To justify a sky-high valuation, investors require rapid expansion. That expansion puts an obsessive focus on user growth and customer acquisition for the existing product, leaving little time for meaningful innovation.

     

    Many in the business community have recognized the limitations of the current market system and have sought new ways to pursue ambitious projects. Earlier this year, a set of entrepreneurs launched the Long-Term Stock Exchange to address concerns with short-termism. Google and Facebook have created their own venture and innovation arms to pursue projects beyond their core business activities. 

     

    Returning to space exploration, Jeff Bezos’ Blue Origin and Elon Musk’s SpaceX both are working towards private spaceflight. Although Blue Origin and SpaceX have shown promise, their budgets represent a minuscule fraction of their founders’ assets (and a tiny percentage of Apollo’s budget in adjusted dollars). Their companies each employ only a few thousand people. While their accomplishments to date are impressive, both companies remain passion projects of ultra-wealthy individuals. 

     

    The Best of Our Energies

    Projects like Apollo show what a national mission can achieve. President Kennedy understood that his “goal [would] serve to organize and measure the best of our energies and skills.” We need similar thinking today. We face challenges that the market is poorly equipped to address, from infrastructure improvement to antibiotic development. Multiyear projects that require significant resources and provide broad-based benefits to society are prime candidates for government investment. That is not to say government should go it alone. Apollo succeeded as a collaborative effort between the government, companies, and research institutions. Indeed, given NASA’s partnerships today with Blue Origin and SpaceX, these companies may well be key contractors for a reinvigorated American space program. 

     

    Government-funded R&D also brings a cascade of associated benefits. As mentioned previously, NASA research has led to the development of many new technologies, from the everyday (memory foam, water filters, and smartphone cameras) to the lifesaving (cancer detection software, fireproofing, and search and rescue signals). The modern world would be unthinkable without satellite communication, advanced computers, and the internet, which all began within government research programs. 

     

    Finally, Apollo represents the best of our American spirit. It represents exploration and innovation, hard work and teamwork, as well as the relentless desire to push the limits of human possibility. Our history is one of big dreams. We dug the Panama Canal, built the Hoover Dam, sent a man to the moon, and sequenced the human genome. These accomplishments have become part of our national identity. We should be similarly audacious today. Let’s pledge to wipe out cancer or address the challenges of climate change head-on. Regardless of the mission, let us remember Apollo and shoot for the moon. 

     

    What is a Concentration Camp?

    Steve Hochstadt is a professor of history emeritus at Illinois College, who blogs for HNN and LAProgressive, and writes about Jewish refugees in Shanghai.

     

    A new argument has broken out over the Holocaust, or more precisely, over references to the Holocaust in contemporary life. The sequence of events is revealing about politics, but not especially reliable about history.

     

    In response to the increasing comparison of right-wing populists in Europe and America to Nazis, last December Edna Friedberg, a historian in the United States Holocaust Memorial Museum’s William Levine Family Institute for Holocaust Education, wrote an official statement for the Museum about the dangers of Holocaust analogies. She was clear about what she condemned: “sloppy analogizing”, “grossly simplified Holocaust analogies”, “careless Holocaust analogies”. Dr. Friedberg criticized the political use by left and right of “the memory of the Holocaust as a rhetorical cudgel”. She urged upon everyone better history, “conducted with integrity and rigor”.

     

    This was not controversial, but rather typical of what historians say about the much too common references to Hitler and Nazis and fascism in our political discourse.

     

    Congresswoman Alexandria Ocasio-Cortez said last month on social media that the U.S. is “running concentration camps on our southern border”. Many Jewish organizations and Holocaust institutions condemned her remarks, as well as the usual chorus of conservative politicians. Although she did not mention the Holocaust, it was assumed that she was making one of those careless analogies for political purposes.

     

    This appears to have prompted the USHMM to issue another brief statement on June 24 that then ignited a wider controversy: “The United States Holocaust Memorial Museum unequivocally rejects efforts to create analogies between the Holocaust and other events, whether historical or contemporary. That position has repeatedly and unambiguously been made clear in the Museum’s official statement on the matter,” a reference to Dr. Friedberg’s earlier statement.

     

    In response, an international list of over 500 historians, many or most of whom write about the Holocaust, signed an open letter to Sara J. Bloomfield, the director of the Museum, published in the New York Review of Books, urging retraction of that recent statement. They criticized the rejection of all analogies as “fundamentally ahistorical”, “a radical position that is far removed from mainstream scholarship on the Holocaust and genocide.” They argued that “Scholars in the humanities and social sciences rely on careful and responsible analysis, contextualization, comparison, and argumentation to answer questions about the past and the present.”

     

    There have been many media reports about the Museum’s June statement and the historians’ letter criticizing it. But there has been no discussion of the obvious distinction between the original statement by Dr. Friedberg and the newer unsigned “official” statement. Dr. Friedberg had noted that the “current environment of rapid fire online communication” tended to encourage the “sloppy analogizing” she condemned. Ironically, the too rapid response by someone at the Museum to Rep. Ocasio-Cortez’s remarks ignored the difference between bad historical analogies for political purposes and the careful use of comparisons by scholars. Now the stances of the Museum appear contradictory.

     

    The outraged historians also ignored the difference between the two versions of Museum statements, and demanded a retraction of the recent version without reference to Dr. Friedberg’s thoughtful statement.

     

    An easier out for the Museum is to issue one more statement affirming that Dr. Friedberg’s formulation is their official position, excusing itself for the poorly worded June statement, and thanking the historians for defending the proper context in which the Holocaust ought to be discussed and the proper means for that discussion.

     

    Lost in this furor is the fact that Ocasio-Cortez did not make a Holocaust analogy when she referred to concentration camps. Widely accepted definitions of concentration camp are worded differently but agree in substance. The online Merriam-Webster dictionary defines concentration camp as: “a place where large numbers of people (such as prisoners of war, political prisoners, refugees, or the members of an ethnic or religious minority) are detained or confined under armed guard.” The Oxford English Dictionary offers some history: “a camp where non-combatants of a district are accommodated, such as those instituted by Lord Kitchener during the Boer War (1899–1902); one for the internment of political prisoners, foreign nationals, etc., esp. as organized by the Nazi regime in Germany before and during the war of 1939–45.” The Encyclopedia Britannica offers a similar definition: “internment centre for political prisoners and members of national or minority groups who are confined for reasons of state security, exploitation, or punishment, usually by executive decree or military order.”

     

    Perhaps the most significant definition of the phrase “concentration camp” in this context comes from the USHMM itself, on its web page about Nazi camps: “The term concentration camp refers to a camp in which people are detained or confined, usually under harsh conditions and without regard to legal norms of arrest and imprisonment that are acceptable in a constitutional democracy. . . . What distinguishes a concentration camp from a prison (in the modern sense) is that it functions outside of a judicial system. The prisoners are not indicted or convicted of any crime by judicial process.”

     

    From what we have learned recently about the actual conditions in the places where asylum seekers are being held on our southern border, Rep. Ocasio-Cortez’s use of the term fits closely within these definitions. She is supported by people who understand the realities of concentration camp life. The Japanese American Citizens League, the oldest Asian-American civil rights group, calls the camps which the US government set up to hold Japanese American citizens “concentration camps”, and repeated that term in June 2018 to condemn the camps now used to hold asylum seekers.

     

    Rep. Ocasio-Cortez used careful and responsible analysis to make a comparison between current American policy and a century of inhumane policies by many governments against people who are considered enemies. It will take much more contextualization and argumentation to tease out the differences and similarities between all the regrettable situations in which nations have locked up entire categories of innocent people. But given the emotions which have prompted even the most thoughtful to leap to briefly expressed one-sided positions, it appears unlikely that such rational processes will determine our discourse about this important subject.

    Roundup Top 10!

    Videos of the Week

    How Do We Know Something Is Racist? Historian Ibram X. Kendi Explains

    Kendi explained the important historical context behind telling a person of color to go back to where they came from.

     

    Historian Jon Meacham: Trump now with Andrew Johnson as 'most racist president in American history'

    The historian drew parallels between Trump and Johnson one day after Trump targeted four freshman House lawmakers.

     

     

    Roundup Top 10


    Not Everyone Wanted a Man on the Moon

    by Neil M. Maher

    Protesters in the late ’60s and early ’70s pushed for spending at home on the same multibillion-dollar scale as the moon race.

     

    Republicans Want a White Republic. They'll Destroy America to Get It

    by Carol Anderson

    Already, Trump and the Republicans have severely harmed the institutional heft of checks and balances. But they’re not done.

     

     

    Tennessee just showed that white supremacy is alive and well

    by Keisha N. Blain

    Honoring a former Confederate general and KKK grand wizard in 2019 is outrageous.

     

     

    Trump’s America Is a ‘White Man’s Country’

    by Jamelle Bouie

    His racist idea of citizenship is an old one, brought back from the margins of American politics.

     

     

    Citizenship once meant whiteness. Here’s how that changed.

    by Ariela Gross and Alejandro de la Fuente

    Free people of color challenged racial citizenship from the start.

     

     

    How activists can defeat Trump’s latest assault on asylum seekers

    by Carly Goodman, S. Deborah Kang and Yael Schacher

    Immigration activists helped give power to asylum protections once before. They can do it again.

     

     

    Eisenhower's Worst Nightmare

    by William D. Hartung

    When, in his farewell address in 1961, President Dwight D. Eisenhower warned of the dangers of the “unwarranted influence” wielded by the “military-industrial complex,” he could never have dreamed of an arms-making corporation of the size and political clout of Lockheed Martin.

     

     

    Immigration and the New Fugitive Slave Laws

    by Manisha Sinha

    The abolitionists’ protests against the fugitive slave laws, which deprived large groups of people of their liberty and criminalized those who offered assistance to them, should be an inspiration in our dismal times.

     

     

    Marshall Plan for Central America would restore hope, end migrant border crisis

    by William Lambers

    The Marshall Plan was key to restoring stability to Europe after WWII. Now, a similar approach must be taken in Central America.

     

     

    What the French Revolution teaches us about the dangers of gerrymandering

    by Rebecca L. Spang

    Our institutions must remain representative and responsive.

     

     

    What Americans Do Now Will Define Us Forever

    by Adam Serwer

    "I want to be very clear about what the country saw last night, as an American president incited a chant of “Send her back!” aimed at a Somali-born member of Congress: America has not been here before."

     

    The Truth About the Holocaust and Holocaust Education

     

    Principal William Latson of Spanish River Community High School in Palm Beach County, Florida, was removed from his post and reassigned within the county school district after refusing to affirm that the Holocaust was a “factual, historical event.”

     

    It is not that he personally denied the Holocaust. Rather, in an email to a student’s parent, he relied on a faux professionalism and a dangerous sense of relativism, claiming that as a school district employee, he was not in a position to say that the Holocaust is a factual, historical event since not everyone believes the Holocaust happened.

     

    While it is important to recognize the limits of one’s own expertise, and it is usually a good idea to avoid speaking as an authority on issues outside the scope of one’s proficiency, Latson’s claim that he could not acknowledge the Holocaust as a historical fact is not only unacceptable but irresponsible. One does not need to be a professional historian to know that the Holocaust occurred.

     

    We know that it occurred. 

     

    We can visit Auschwitz and walk through the barracks of the concentration camps that now serve as memorials and house personal artifacts of the victims, such as clothing, shoes, prosthetic limbs, even human hair.  We can stand in the gas chambers and see the ovens used to burn the bodies of those who were murdered.  We can talk to survivors and witness the numbers tattooed on their arms.  

     

    Moreover, when educating students, it is vital to provide them with the skills to analyze data, verify which data are reliable, and arrive at justifiable conclusions; however, it does a disservice to students to make them question the veracity of obvious facts, simply because “not everyone believes in them.” Imagine if the principal questioned the importance of teaching “Introduction to Physics,” simply because not everyone believes that gravity exists.

     

    Encouraging students to “see all sides” of a complicated issue, one in which values can be prioritized in different ways and carry different implications, so that they learn to reach conclusions based on facts, their values, and the consequences, is a sound lesson. Presenting truth and falsity as two equally valid options is quite another matter.

     

    Underlying Latson’s refusal is a question about the importance of Holocaust education. Currently, 11 states, including Florida, where Mr. Latson was principal, have laws requiring schools to provide Holocaust education. The most recent to do so was Oregon, whose 2019 law stipulates that instruction be designed to “prepare students to confront the immorality of the Holocaust, genocide, and other acts of mass violence and to reflect on the causes of related historical events.” Yet the importance of Holocaust education does not lie simply in establishing awareness of a historical fact. Holocaust education can provide a unique lens for evaluating many of the social, political, and professional issues that challenge us today.

     

    While it is typically conceived of as a devastating moment in Jewish history, the Holocaust has much broader lessons to teach. The Holocaust is the only example of medically sanctioned genocide in history. It is the only instance in which medicine, science, and politics merged to endorse the labeling, persecution, and eventual mass murder of millions of people deemed “unfit” in the quest to create a more perfect society. Individuals were stripped of their dignity and viewed solely as a risk to the public health of the nation. Their value was determined by their usefulness – or danger – to society.

      

    Today’s political and social landscape is one in which the voices of nationalism and populism have grown louder and louder. While the ethos of liberal democracy and multiculturalism still provides a strong foundation for peaceful civil societies and international law, its influence in shaping the future of domestic and international politics is waning. One can see the deleterious effects of hardening ideologies and prejudice not only at the margins of society but also within sectors that have traditionally been seen as its stalwarts.

     

    By showing that the Holocaust is not only a tragedy in Jewish history but a lesson for everyone, Holocaust education can serve to foster civics and ethics education. The Holocaust can function as a historical example for understanding the danger of placing societal progress and political expediency ahead of individuals. Holocaust education is an opportunity to teach the next generation about the essential connection between the past and the future, to give them the tools they need to learn about moral decision making, and to emphasize our responsibility to stand up and speak out when we see evil in any form.

     

    How we teach the memory of the Holocaust is inextricably tied to our vision for the future of our society. Let’s stop thinking that “Never Forget” is enough of a message. Let’s remember not only for the sake of remembering, but for the sake of helping our students become people who respect one another.
