History News Network - Front Page. Sat, 15 May 2021 11:21:00 +0000

India's COVID Crisis Has Roots in its Colonial-Era Tax Structure

Indian Prime Minister Narendra Modi with the Statue of Unity, depicting Sardar Patel, the first Home Minister of independent India. 

The statue, the world's tallest, measures nearly 600 feet. 




So what does the current health crisis in India have to do with the British Raj? Quite a lot as it turns out. But to understand the connection, we need to rid ourselves of two preconceptions about Indian history.


The first, put about by Indian nationalist historians, is that the British Raj was heavily extractive. The second, made popular by Congress Party leaders, is that India is a socialist country. Once free of these two myths, we can trace the historical link between taxation and India’s current healthcare problems.


After 1857, the British government in India was permanently cash-strapped. The bill for suppressing the Uprising proved enormous, and a search for funding followed, in parallel with a desire to recruit local support. Investment in infrastructure to generate revenue and enhance popularity was considered a good idea, but where was the money to come from? Borrowing was difficult, even dangerous, for multiple reasons, and taxation was a sensitive issue. Liberals within the Raj were uncomfortable about taxation without representation, and didn’t like to interfere with markets and free trade; conservatives didn’t like the idea of taxation at all and queried the need for popularity. Eventually the only uncontroversial expenditure after 1857 was on the army, at levels that often approached fifty per cent of government revenue. But this was still a relatively small amount; the colonial state never disposed of more than around 15 per cent of GDP.


Taxation of various kinds was imposed and abolished to meet emergencies, but the prudential constant was that agricultural incomes – the main source of wealth in a pre-industrial economy – remained tax exempt. The government’s income came primarily from the middle and lower classes, via excise and indirect taxes.


This was pure political cowardice. The one really well-developed taxation structure in India was the Land Revenue system, inherited from the Mughals. It touched every landholder in the country, and was awash with accurate data. But by the late nineteenth century, the landlords were the main political prop of the Raj, and it was considered unwise to provoke the ‘natural leaders’ of society. This policy remained in place till 1947.


And did India’s landlords choose to tax themselves after independence? No: there was no one to force them. Around 70 per cent of India’s first members of parliament were large landowners, most of them in the Indian National Congress. But under the leadership of firebrand socialist Jawaharlal Nehru, wasn’t the Congress a socialist movement? No, it wasn’t.


There has been a long-standing assumption among left-leaning academics that a mass movement for national liberation must be a socialist movement. The poor constitute a majority in any society, so a mass movement must be composed of the poor, and they must want redistribution of wealth. So goes the reasoning.


But this is not what happened in India. The Indian National Congress was never a socialist party. It is better understood as a national alliance for self-government; it had a thin policy offer, which was a product of constant compromise. Congress tried to be all things to all Indians, in order to be an effective national movement. Inclusivity was always the priority, not the details of post-independence programs. The Karachi Resolution of 1931, which was Nehru’s baby, was as socialist as the movement ever got, and its text didn’t mention the word socialism.


The Congress Socialist Party, founded in 1934, was only an in-house pressure group, and Nehru never joined it. After independence, the socialists left the Congress en masse in 1948. Nehru was a self-proclaimed socialist, but he became the post-independence leader not because he was the ideological embodiment of the Congress, but because Gandhi believed him to be the best man to handle the process of demission with the British.


Very few other leading Congress figures were socialists. Gandhi wasn’t, and the number three in the Congress pantheon, Sardar Patel, definitely wasn’t. In fact, it is Patel, India’s first home minister, whom Narendra Modi has chosen to commemorate with a statue so tall it can probably be seen from China (where much of it was made). Patel, the iron man of the Congress, is now the composite material man of the BJP, because he was a religious, social conservative. Nehru’s legacy of seventeen years as prime minister was a dirigiste economy and a commitment to democracy, not socialism.


Indira Gandhi dabbled with socialist policies between 1970 and 1975 – nationalising banks, abolishing the princes’ privy purses – but never again. The most truly socialist prime minister India ever had was Chandra Shekhar in 1990-91, but he only managed to last five months at the head of a minority government during a time of acute national crisis.


India currently self-identifies as a socialist country because in 1976 the 42nd Amendment added two words to the preamble of the Constitution, turning India into a ‘sovereign, socialist, secular, democratic republic’. But where is the substance? India’s Election Commission requires that all political parties accept the Constitution. So, recalling Henry Ford, you can have any kind of politics you like in India, as long as it’s socialist.


Is this anything more than window dressing from a bygone era? India does not have collectivised agriculture, and the government has been backing away from running industry since liberalization in the 1990s. More telling is the continued survival of the colonial-era tax exemption for the rural rich. Indeed, there is a persistent and unchallenged aversion among India’s wealthier classes to submitting to any kind of serious taxation regime.


Only about 10 per cent of Indians pay income tax, and Narendra Modi, like others before him, remains as reluctant to tax rich Indians as ever the British were. He recently raised the threshold for income tax to about three times average income.


Bluntly put, no state can be run on socialist lines if the government disposes of only about 12 per cent of its GDP. This is where India has been for a long time, at a level that broadly matches the Raj. The defence budget has shrunk, but India spends about five times more on agriculture and ‘rural development’ combined than it does on health. Politicians pour money into inefficient agriculture in a bid to buy votes from the rural population. Around 60 per cent of India’s people still live on the land; more than 80 per cent of them are poor. And in a democracy, the votes are where the people are. You do the political math.


India’s government spends about 1.3 per cent of GDP on healthcare, with the private sector topping up the figure to somewhere around 3.5 per cent. This is barely adequate in normal times – the UK’s version of socialised medicine eats 10 per cent of the national cake – but it is woefully short in a time of pandemic. India suffered terribly under British management in the great influenza pandemic of 1918-20, losing up to twelve million lives. A century on, the same barebones health provision may exact another terrible toll.


India’s socialism has always been for the poor; the rich have made their own arrangements, as they learned to do under British rule. And it is the poor who will suffer most as the coronavirus rages on.


© Roderick Matthews 2021

Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180166
Reflections on Russia's "Victory Day"

Ukrainian civilians attack a Jewish woman during a pogrom encouraged by Nazi occupying forces in Lviv (then Poland, now Ukraine), July 1941.



Every year on May 9th, my birth country of Russia marks Victory Day to celebrate the end of the Great Patriotic War, otherwise known as WWII. You may know this holiday as May 8th, but it was after midnight in Moscow when Germany surrendered. These particular dates, though historical events, matter less for the story at hand. All that matters is that at some point in the months that followed that surrender, a young Jewish woman boarded a train from an orphanage in Siberia, where she had spent the war, and returned to the Ukraine. That woman was my grandmother. Stories like hers remind us that the Holocaust is not quite so firmly in the past: its impact continues to reverberate among the descendants of survivors. This has been eye-opening for me, as a professional historian who studies trauma in a different historical period.

My grandmother was the older of two sisters; their mother had died of a brain tumor a few years before the war. As the German army began its ruthless invasion of Western Ukraine in the summer of 1941, the extended family decided that it would be best for the father to stay home with the frail younger sister and send the older one away. The train of children made it out of the Ukraine, but barely: miraculously, it came through the German bombing unscathed. For four whole years, then, she lived in an orphanage. One can only imagine what this was like for a quiet and shy teenage girl, uprooted from home so abruptly and dramatically and sent to an isolated and impoverished part of the Soviet Union, a region best known as a place of exile for political criminals. The only clue to her experiences there has to do with one of the most basic of human needs: food. She recalled to her daughter, my mother, that the best meal she had at the orphanage came on the day the director’s horse died.

But after four long years away from her family, she finally got to board a train home. She perhaps wondered why she had not received any mail from her relatives over those years, but mail was notoriously unreliable during the war. It is possible, therefore, that she kept hoping against hope that someone was still left, waiting for her return. And so she may not have known the truth for certain until she got home. But, you see, there was no home left. As it turned out, her father and sister lived less than a month after her evacuation in 1941. The rest of her relatives were gone as well, along with the centuries-old Jewish community of which they were a part. The world may have heard of Babi Yar, but there were many such massacres all across the Ukraine. For my grandmother, then, homecoming was no celebration. Instead, it confronted her with indescribable mourning and loss.

The loss of family, community, and an entire tradition must have weighed heavily. But one loss surpassed them all and, as a result, went unmentioned: the loss of God. I do not remember my grandmother, as her last visit took place when I was still too young to form memories. I only know her story from the snippets that she reluctantly told my mother over the years. But what I do know is that before the war, the shtetls where my grandparents grew up were vibrant religious communities whose routines revolved around observing the rhythms of the Jewish religious calendar. And yet, in all of her growing-up years, my mother never heard them speak of God. That silence speaks heartbreaking volumes. I am left wondering what they would have said, had they known that their granddaughter’s search to know the God of her ancestors would ultimately lead her to Christ. After all, it was the self-proclaimed Christian neighbors of the Ukrainian Jews who were so eager to turn them in to the Nazis. And it was these same neighbors who did not welcome the bereaved and orphaned remnant, like my grandmother, back into their communities after the war, and who continued to reject and persecute them long after.

My mother, who grew up in the Ukraine, remembers an occasion when she, a young teenager, was walking down the street. A passer-by looked at her distinctly Jewish appearance, cursed at her, and spit in her face. Years later, her father went out for a brief walk in the evening and never came back. A drunk motorcyclist accidentally drove over the sidewalk, killing him. When my grandfather’s identity as a Jew was mentioned in court, the case against the motorcyclist was dismissed. The year was 1981.

I get to reflect on these stories, the family legacy of inter-generational trauma, the Holocaust, and structural anti-Semitism, from a position of safety and privilege as a white middle-class American citizen, and a convert to Christianity to boot. But my mother has sent numerous books over the years to my children. All of them are about the Holocaust.

As a military historian of the ancient world, I study the impact of the trauma of war on people who, in many cases, have been dead for over two thousand years. But in these crumbs of family history that my mother has dropped over the years, I see how trauma is lived out in real human relationships. The Holocaust is not a single event; through the generations that continue to live out its legacy, it continues to shape the real lives of real people today. As historians, we can sometimes forget how personal the tragedies that we study once were, and maybe still are. But as historians, we are also tasked with making sure that the world will not forget. Zachor, al tishkach (Remember; do not forget).

Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180165
A CIA Historian’s Photos of the Afghan War Tell the Story of Those Being Left Behind in Afghanistan  




On May 1, the Pentagon officially began withdrawing troops from Afghanistan, thus ending an extraordinary, two-decade-long war fought under four presidents (a departure I have criticized from the strategic perspective). I know this land well from my journeys across more than half of its provinces as a professor of Afghan history and as an employee of the CIA’s Counter-Terrorism Center (see HNN article on my journey to the Agency here). I was tasked by the CTC with tracking the movement of Taliban and Al Qaeda suicide bombers, who I discovered were being dispatched into our team’s zone from neighboring Pakistan’s North Waziristan tribal agency as part of a terroristic effort to shatter the democratic institutions we were attempting to build there. I also worked at a Forward Operations Base in Regional Command East as an S.M.E. (Subject Matter Expert) on legendary insurgent-hunter General Stanley McChrystal’s U.S. Army Information Operations team.


During the course of my journeys over more than half of Afghanistan’s war-torn provinces, I came to love the ancient, almost timeless people of this land, many of whom dreamed of building a better world with American help. While carrying out my independent scholarly fieldwork, and on my solo missions for the CIA and US Army beyond the safety of our base’s walls in what my team described as the Afghan “Red Zone,” I also did something that none of my US Army comrades---who traveled in convoys and were restricted by ROEs (Rules of Engagement)---could do. I freely photographed the fascinating Afghan people around me as they went about their lives in an active war zone.


Sadly, my rare photos are images of a world that is, in many ways, already fading as the Taliban continue their relentless march from the half of the country that they have already conquered in recent years. I fear these photos may be some of the final record of a threatened way of life---like those photographs taken of Afghanistan in the 1960s when Western tourists visited this relatively stable kingdom and found Westernized Afghan women wearing miniskirts---that will soon disappear. When our final support troops depart by September 11th, the increasingly confident Taliban are expected to launch a nationwide assault on an embattled Afghan democratic government ally that, like our former South Vietnamese allies, will be left to fend for itself without its American “big brothers.”



1. The Warlord. In this photograph, which could almost have come from the Middle Ages, my friend General Dostum, the legendary Uzbek Mongol cavalry commander and the focus of my book The Last Warlord: The Afghan Warrior Who Led US Special Forces to Topple the Taliban Regime, is pictured riding his prized war stallion Surkun. He rode Surkun into combat alongside horse-mounted U.S. Special Forces Green Berets to overthrow the ethnic Pashtun-dominated Taliban regime in 2001. Hundreds of his riders were killed in the desperate mountain campaign against their Taliban blood enemies, which was won in just two months with only 300 US troops. These allied proxy fighters are the unsung heroes of the war on the Taliban and, under Dostum, Mongol horsemen boldly rode into war to change the course of history for the first time in hundreds of years (see my video of this remarkable campaign here).


Led by the larger-than-life secular warlord Dostum---who was brilliantly portrayed in the movie 12 Strong: The Declassified True Story of the Horse Soldiers by Iranian actor Navid Negahban---the northern Persian-Tajik, Uzbek-Mongol and Hazara-Shiite Mongol tribes joined with a Green Beret A Team led by Captain Mark Nutsch (portrayed as a modern-day Lawrence of Arabia by Thor actor Chris Hemsworth) in breaking out of their remote mountain base to topple the hated Taliban regime by December 2001. See my article on the making of 12 Strong and my efforts to bring authenticity to the remote New Mexico mountain movie set where this movie, whose Afghan half was based on my book, was filmed here.



2. The Girl. I photographed this girl in a roadside tent babysitting her little brother while her parents worked in the fields. At nine years old, she was charged with protecting him and the family tent by herself. While American children grow up safely in kindergartens, learning with technology in advanced schools, partaking in school sports, dating, and having access to orthodontics and other medical care, books, computers, baby seats, and the internet, Afghan children have no such luxuries, which to them are unimaginable. Many die as infants from disease and lack of access to medical help. I have no idea what became of this curious and welcoming girl, but in Afghanistan many impoverished children like her never get the opportunity to learn how to read and write.





3. The Chicken Fighters. My weekly visit to the Friday afternoon chicken fights in Kabul, a favorite pastime for men who bet on winners in a hillside garden beloved by Kabuli families known as the Bagh e Babur. The famous Garden of Babur was built around the marble tomb of the 16th-century Turkic-Mongol Muslim warlord Babur “The Tiger.” This warlord marched out of Kabul with his cannon-equipped Central Asian warriors and fought his way across Hindustan to forge the legendary “Moghul” dynasty, famous for the Taj Mahal.


Many Afghans enjoy timeless recreational activities of the sort their ancestors enjoyed hundreds of years ago. Such pastimes include two-humped Bactrian camel fighting in the Uzbek north, kite flying, and chess. Among the Afghans’ most cherished sports is the ancient horse-mounted Afghan “polo” game of buzkashi, played by whip-wielding horse warrior-heroes known as chapanzades. These predominantly Turkic-Mongol Uzbek and Hazara riders---who are feted as heroes in the same way Americans hero-worship their baseball or football players---play a game first introduced to the Eurasian nomads by Genghis Khan the Jehanger (World Conqueror) in the 13th century to teach his warriors how to be tough.



4. The Hosts. Me (in sunglasses) in the middle of a smiling crowd of curious Afghans, of the sort that often gathers around a Westerner when he or she appears in their midst. I was always amazed at the warm welcome I received while traveling across this land. I was regularly invited into Afghans’ simple homes as an honored American guest. There, my impoverished but thrilled hosts would eagerly offer me lamb or goat, often after slaughtering their only source of meat, to honor me. The warmth I experienced in my travels contrasted sharply with the stereotypical image many Americans have of this as a uniformly hostile land. I wish many of my fellow Army teammates who were confined to our base could have had this sort of experience and gotten to know the Afghans.


Hospitality to honored guests like myself is an ancient tradition all of Afghanistan’s ethnic groups cherish. For them, offering panagah (Persian for sanctuary) or melmastiia (Pashto for hospitality to a guest in need) is almost a religious duty. So protective of me was the Uzbek leader General Dostum during my visits to his northern realm that he had armed fighters sleep outside my door and follow me around (much to my annoyance) to keep me safe. But the most famous recipient of melmastiia was the Saudi Arabian terrorist Osama bin Laden, who was granted sanctuary by the Taliban of the south in 1996.



5. The Burger and Pizza Chef. In the bustling capital of Kabul, I found Western-inspired sights that would have been unimaginable under the grim Taliban masters: previously banned beauty shops, internet cafes that brought the world to once-isolated Afghans, a few bars for foreign aid workers and military contractors (now closed due to Taliban attacks), a shopping mall (whose entrance was protected by armed guards with metal detectors against suicide bombers), TV studios, and even American-style restaurants, including one of my favorites, KFC: Kabul Fried Chicken.


The restaurant pictured above was run by an Afghan who had, like tens of thousands of his people, worked on a U.S. base. There he learned how to make such popular dishes as pizza and hamburgers. Two decades of American cultural influence and exposure to the modern West have radically transformed many Afghans, especially those who are better off and live in the cities. This influence---seen in women on such Afghan channels as Tolo, young men wearing American-style clothes, women in university, and young people on Facebook---will be hard for the provincial Taliban to completely eradicate should they re-conquer Kabul and other comparatively cosmopolitan, liberal cities.




6. The School Girls. A group of middle school girls who, after five years of being denied an education by the Taliban, were excited to be attending school. The girl in the middle was crying as she told the story of how her parents were killed by the Taliban. She worriedly told me, “The day the Americans leave, the Taliban will return and execute us if we try to learn to read and write, which is forbidden by their law.” Girls like her told me horror stories of the Taliban misogynists’ draconian brutality against women. One teenager told me of a girl in her village who was stopped by the Taliban’s dreaded, whip-wielding moral police, the Committee for the Prevention of Vice and Promotion of Virtue, and discovered upon investigation to be wearing forbidden nail polish. She was then dragged screaming to the town square. There, her fingers were cut off in public with a sword as a Taliban mullah, or cleric, read passages from the holy Koran condemning her as a “prostitute.”


In a sign of things to come, and a stark warning to girls who hope to continue receiving the sort of education they have been able to get for 20 years, on May 8th terrorists slaughtered 30 people, mainly school girls, and wounded another 52 with a massive bomb outside a girls’ school.



7. The Guardians of the Buddhas. High in the remote fastness of the mighty Hindu Kush Mountains live the Shiite Mongol Hazaras, who have been terribly repressed by the Sunni Muslim extremist Taliban from the southern lowlands. The Hazara highlanders’ villages were burned in the late 1990s, and their people continue to be slaughtered, even at their weddings, by the Sunni Taliban and ISIS fanatics who consider them Shiite “heretics.” Under the influence of their notorious Al Qaeda Saudi guest Osama bin Laden, the Taliban iconoclasts also blew up the magnificent stone Buddhas carved by the ancient Aryan Tokharians in the 6th century into the wall of one of Afghanistan’s most scenic valleys, the 8,000-foot-high Vale of Bamiyan.


Approximately 800 years after the Buddhas were carved by the Buddhist inhabitants of the valley, Mongol garrisons settled in the heights of the Hindu Kush (the name Hazara signifies a unit of one thousand troops) and gradually came to see themselves as protectors of Afghanistan’s most prized archeological treasure. The Taliban’s senseless culturecidal destruction of the locally cherished Buddhas, which had appeared on the Afghan currency, was a calculated blow to the Hazaras’ pride and spirit.


Here I am photographed standing with Hazara kids, with the beautiful Vale of Bamiyan behind me and the empty dark niche carved into its rock wall where the larger of the two Buddha carvings, 180 feet tall, guarded the secluded valley for a millennium and a half. I was led by a local guide through Taliban landmines to the mournful, rubbleized ruins of the massive Buddhas. He told me that his people believe that when evil approaches their lands, the ghosts of those in nearby ruins who were slain by Genghis Khan scream in warning. My Hazara friends on Facebook tell me their elders claim the voices are now screaming again as the Taliban once more approach their mountains from the south.




8. The Orphans. During my visit to this home for war orphans run by a wonderfully compassionate Afghan woman, I encountered this angel who sang an Afghan song for me. I wanted to take her, and all of them, home with me when I found they were sleeping three to a bunk bed and living on a simple, twice-daily meal of porridge and bread. When I asked about their future, the kind woman who ran the shelter told me the girls might get lucky and be married off as a second, third, or fourth wife to an older man and the boys might get lucky and be adopted to serve as laborers. As grim as these fates seemed, they were far better than the fate of hundreds of thousands of Pashtun war orphans in the 1980s and 1990s who were adopted by fundamentalist madrassas (religious seminaries) where they were indoctrinated by Saudi-funded mullahs to become fanatical, misogynistic Taliban.




9. The Nomads. As you drive around Afghanistan’s south, you encounter wandering Pashtun sheep and goat herders who have nomadized on these lands since the dawn of time. These primordial nomads, known as Kuchis, live disconnected from settled villagers, except when they wander into their markets to sell sheepskins, beautiful hand-woven carpets, milk, or meat. The Pashtun Taliban rarely interfere with the migrating Kuchi tribes, who live by their own ancient rules. For this reason, they were, and hopefully will remain, less affected by the Taliban’s harsh enforcement of shariah Islamic law, and their relatively free women rarely wear the all-encompassing burqa veil. The elder on my right welcomed me to his encampment with the famous hospitality of his people, and I was given insights into the timeless world of a people who do not have cell phones (of the sort many Afghans now have), or even electricity, running water, or heating.



10. The Americans. I chose not to share a typical combat mission photo, but instead to include this image, of the sort few non-military people see, of my base members enjoying rare downtime from the strains of war to celebrate July 4th. Here two of our base members who were famous for their “Scottish-rock” songs (hence the kilt and bagpipes) entertain the base during an Independence Day barbecue. While life as a “Fobbit” on a F.O.B. (Forward Operations Base) could be stressful---that summer we had a car-borne suicide bomber detonate at our heavily guarded gate, killing several soldiers, as well as some mortar shelling and a PTSD suicide---we were far better off behind our walls and blast barriers than troops living in much smaller, remote, exposed C.O.P.s (Combat Outposts).


Almost 800,000 American troops served in Afghanistan, and they drastically transformed this land that time forgot on many levels. I remember making the long drive from the town of Jalalabad near the Pakistani border, over the mighty Hindu Kush Mountains, and across the burning deserts of the north to the Uzbekistan border on a beautiful ribbon of black tarmac, and being proud we had built it (this road was far smoother than those found in my hometown of Boston!). American troops also built schools, hospitals, and wells, de-mined fields, trained a police force of 116,000 and an army of 180,000 troops, and helped prevent the Taliban insurgents from taking a single regional capital or town of any size for twenty years.


History shows that more aid was pumped into the building of Afghanistan’s democracy than was spent to rebuild post-World War II Europe under the Marshall Plan (when factoring in inflation). These efforts brought much, but certainly not all, of this undeveloped land (which was never colonized or modernized the way, say, Russian-Soviet Uzbekistan was) from the Middle Ages into the 21st century. They also gave Afghanistan a “government in a box” that included mandatory seats for women, ended the Taliban repression of the northern Uzbek Mongol, Tajik Persian, and Hazara Shiite Mongol ethnic groups, gave girls a chance to be educated, and gave all those who dreamed of an escape from the medieval cruelty of the Taliban a breathing space to try to forge a pluralistic democracy where women’s and minorities’ rights were respected. It remains to be seen how permanent these fragile gains are.




11. The Taliban. I took this photograph in a fortress-like prison in the northern deserts where thousands of Taliban prisoners of war were being held by General Dostum. I was given the rare chance to sit down with the notorious Taliban and talk with them while my glowering Uzbek guards looked on. The captives ranged from hardcore fanatics, including one who said he would kill me as an infidel were he not handcuffed, to less extreme “village Taliban” who had been paid to fight. I also met Pakistani Taliban who had been lured into Afghanistan in 2001 to defend it from the American “infidels.” As I took in the despair in the prisoners’ eyes, like that of the one I captured here, part of me felt sorry for them.


But when I remembered the stories of school girls who had had disfiguring battery acid thrown in their faces by the Taliban to punish them for daring to get an education, and the shredded bodies of victims of one of their senseless suicide bombings that I encountered, I forgot such emotions. Should they conquer post-American Afghanistan, girls will once again be forbidden from sitting in a classroom with boys to learn to read and write, Western-style restaurants and beauty salons will be banned (along with buzkashi and chicken fighting), and the Shiite Hazaras of the Hindu Kush Mountains will once again be repressed. I am still haunted by one Taliban prisoner’s bold recitation of the Taliban’s famously defiant mantra to me back in 2003: “You Americans may have the watches, but we have the time… We will outlast you.”


Sadly for the Afghan people, it appears he was right.


For more of Dr. Williams’ photos from Afghanistan and Islamic Eurasia, articles, videos and his books see his website at: brianglynwilliams.com


I would like to thank Carol Hansen and my wife, Miriam Braz Williams, for their assistance with this article.

Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180164
Are Campus Bookstores Undermining Student Learning?





Universities once ran most of their own facilities and on-campus services; today, most outsource them. The majority of U.S. college campuses have a cafeteria run by Aramark, Sodexo, or Compass Group, and universities also outsource custodial work, landscaping, and maintenance. And, of course, the campus bookstores that charge students hundreds of dollars for textbooks are typically outsourced as well. Follett alone operates 1,200 campus bookstores. While outsourcing contracts have raised concerns about worker exploitation on campus, faculty should also be concerned about the potential exploitation of students by the college bookstore.


When the campus bookstore is run by interests outside the school, student interests are not a priority. One easy place to see this is the push for digital textbooks in the classroom. Publishers and campus bookstores continue to promote semester-long digital access as “better for students,” who, they claim, prefer digital books and benefit from savings. These claims reflect mainly the self-interest of publishers and an ideology of technopoly. Though young people are “digital natives,” they do not prefer digital content for the classroom. Not only are students tired of reading on screens; claims of “lower prices” for digital textbooks are also dishonest, because while students pay less than the alleged market value of textbooks, they also get far less than previous generations did. Students hand over their money but do not receive a good, only a temporary service: a digital textbook provided online for a limited time. Many digital textbooks cannot be downloaded, and they can almost never be printed. Access ends with the semester.


Digital access is not ownership; it strings students along with services when they should be receiving something more durable. Students cannot return to these books later in college, or in life, to use them as a resource. For some classes, leasing the books may seem just fine, but there are many books students would like to consult again later and would benefit from having on hand. And the more difficult the subject, the more important it is to have the format best suited to learning: print. Nor can students resell what they never possessed. The elimination of the resale market will hurt students in the long run. As the subcontracted bookstores continue to push digital content, and incentivize it for universities, it is increasingly difficult for students to get books in other formats.


At some schools, students in certain classes have no option of getting their books anywhere else. College textbook publishers are increasingly convincing colleges to adopt “inclusive access.” In this model, the entire class receives digital course materials, which appear on the first day of class and disappear after the last, are arranged through the bookstore, and are included with tuition. This is often easier and cheaper than students finding the books themselves, but it eliminates the independence of the student as a consumer and, again, eliminates the possibility of ownership. Inclusive access is facing legal challenges for the monopolistic threat it poses to independent bookstores, but we should also be concerned about the ways it narrows options for our students. It is not wrong to rent books or to offer digital access as an alternative to ownership, but it is unethical to deny students the opportunity to own their learning materials.


Most significantly, digital textbooks can detract from the educational experience itself. Most students simply do not learn as well from digital content as they do from traditional print books, which can also be rented. And while students struggle to manage healthy amounts of screen time, mandating that more of their resources live on screens is not helpful. Excessive screen time is linked to many negative health outcomes, and the screens themselves often become a distraction in the classroom. What discounts are worth inferior learning and increased health risks? Lower prices should not be the only consideration in learning environments. Bookstore recommendations should not determine how faculty choose to assign books; student interests should.


In our country, college has been a pathway to intellectual discovery and a certain amount of financial stability. Today’s undergraduates are increasingly cornered into ongoing financial commitments for everything while never taking possession of anything. So much of today’s music and entertainment is accessed through services that cease as soon as the payments do. While digital textbooks provide ease of access on the first day of class for students whose bills are fully paid, students who struggle with bills may find themselves cut off from classroom materials. The advantage of ownership is actual possession. When students are denied the ability to own their textbooks through arrangements like “inclusive access,” faculty aren’t just choosing to assign “cheaper” books; we are selling our students to the publishers. College is not just about having a book in your hand, but when you can only lease the learning materials, you can leave a semester pretty empty-handed.


It is easier to outsource the college bookstore than to run it with university staff. It may even be wiser. But when university faculty or administrators decide on course materials based on publisher-oriented bookstore recommendations rather than student interests, the mission is at stake. Students come to college and learn to ponder the “good life” even as they prepare to take hold of it. Universities are traditionally non-profit, intended to exist for the mutual benefit of all involved; when they outsource to for-profit organizations, we should be wary of subcontracting away the souls of our institutions. College bookstores are right that digital access is often cheaper for students, but they are wrong to suggest that it is better. Online learning materials are not always optimal for student learning, for many reasons. And when our model of classroom education offers nothing more than temporary digital access to learning materials, we undercut the value of the material and threaten what we hope will be some of the lasting benefits of learning for our students.

Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180169
Mudlarking: Searching for Lost Treasure – and History – on the Banks of the Thames



Ever since man first quenched his thirst in its waters, he has left his mark on the riverbed.

Ivor Noël Hume, Treasure in the Thames (1956)



London would not exist without the River Thames. It is a source of fresh water and food, a path of communication and transportation, and a boundary both real and imagined. More importantly, its incoming and outgoing tides facilitate the trade that has made London such a functional and successful port.


Since the beginning of time, the River Thames in London has been a great repository, collecting everything that has been deposited into its waters. Once discovered, these objects reveal stories of the capital’s fascinating history and its inhabitants.


Since the Romans established the city in the 1st century AD, the edge of the river has been a hive of activity. Merchants, boat builders, sailors, fishermen, and even passengers crossing the Thames would have kept the riverside busy around the clock. Lightermen and stevedores (quayside laborers) would have worked tirelessly loading and unloading ships’ cargo and transporting imported goods to warehouses along the riverfront. Local traders, shops, and taverns would have been packed into the adjacent streets and lanes, providing materials and refreshments for those thriving industries and their employees.


There are many reasons why objects were deposited or accidentally lost in the river. The earliest settlers deposited votive offerings into the Thames because they considered its waters sacred. Celtic tribes also deposited valuable, highly decorated military items in the Thames. In the medieval period, pilgrims returning from long journeys abroad or pilgrimages in Britain would cast their pewter souvenir badges into the river to express gratitude for safe passage. Today, the Hindu community living in London considers the Thames a replacement for the holy Ganges River in India and deposits a wide variety of colorful offerings into the river.


The dense, silty Thames mud is “anaerobic,” meaning oxygen is absent. When objects are dropped into the mud, the turbulent current of the incoming tide quickly buries the object in the dense, black silt. Without oxygen, the objects are preserved in the condition in which they are dropped into the Thames. Sometimes objects are found that are perfectly preserved after many years in the river.


It is through all of these objects that we can discover and understand London’s rich history and its inhabitants who have lived along the river, from prehistoric man to modern Londoners in the 21st century.


Early historians and archaeologists first realized the historical importance of the River Thames through dredging works carried out during the 19th century. Some of the most significant and historically important artifacts were discovered during this time, including the Battersea Shield (Celtic), the Waterloo Helmet (Celtic), and the bronze head of Emperor Hadrian (Roman). Thomas Layton, Charles Roach Smith, and G. F. Lawrence were London antiquarians who collected valuable and historically important artifacts dredged from the river in the 19th century; many of their most significant finds are now on display in London’s museums. The term “mudlark” was first used during the 18th century as the name for people who scavenged for things on the riverbank. These original mudlarks were often children, mostly boys, who would earn a few pennies selling items like coal, nails, and rope that they found in the mud at low tide.


Today’s mudlark is different from those poor wretches of the 1700s. Instead of mudlarking to survive, mudlarks today have a passionate interest in London’s rich archaeology and history. Equipped with a mandatory license, mudlarks use a variety of methods to search the foreshore and have discovered and recovered an incredibly wide range of artifacts.


In 1980, the Society of Thames Mudlarks and Antiquarians was formed and was granted a special mudlarking license by the Port of London Authority. Its members work very closely with the Museum of London and the Portable Antiquities Scheme (PAS), where their finds are recorded.


Over the last 40 years, mudlarks have made an important contribution to the study of London’s history through the sheer volume and variety of finds they have discovered. Numerous toys, like miniature plates and urns, knights on horseback, and toy soldiers, have changed the way historians view the medieval period. Made mainly from pewter, these medieval toys are exceptionally rare and have helped transform perceptions of childhood during the Middle Ages. The Museum of London has acquired tens of thousands of mudlarking finds recovered from the River Thames foreshore, which is the longest archaeological site in Britain. Many of the most significant mudlark discoveries are on permanent display there and in other London museums.


Mudlarking has now become a popular hobby that gives both adults and children a unique “hands on history” experience and deepens their understanding of London’s past. The Thames Museum Trust, established in 2015, is currently developing a new museum in London to showcase a wide variety of amazing artifacts from the mudlarking community’s private collections. Exhibitions and lectures at the Tate Modern, the Bargehouse at the Oxo Tower, and the Art Hub Studios were extremely popular events.


Mudlarks Jason Sandy and Nick Stevens have combined their writing and photography skills to produce a new publication for Shire Books. With contributions from more than 50 mudlarks and accompanied by over 160 color photographs, this fascinating book tells the story of London using extraordinary mudlarking discoveries.


Written chronologically, the book begins with prehistory and takes the reader on an epic journey through time, from historically significant masterpieces such as the Battersea Shield and the Waterloo Helmet to personal items including dress and fashion accessories, children’s toys, and religious artifacts. Each object reveals a unique story and offers us a tantalizing glimpse into the past. This book brings them back to life, often uncovering new and important information about London’s history.


Early Bronze Age Flint Arrowhead. This delicate, barbed and tanged flint arrowhead has survived in exactly the same condition as when it was made approximately 4500 years ago. Finder and image: Tony Thira



Iron Age Bead 800-100 BC. Of the 50 or so known examples of this type, this is the only one that has been found attached to a metal ring.

Finder: John Higginbotham. Image: Nick Stevens


Roman Oil Lamp. A rare example of a North African ceramic oil lamp depicting a running lion symbolising Christianity. AD 300–410. Finder: Alan Suttie. Image: Stuart Wyatt



Roman Coin of Hadrian AD 117-138. This is the very first example of Britannia being used on a British coin.  Finder and image: Nick Stevens


Medieval Pilgrim’s Badge. Depicting the martyrdom of St. Thomas Becket, the Archbishop of Canterbury who had fallen out of favour with King Henry II. Dated to the 14th-16th centuries. Finder: Tony Thira. Image: PAS


17th century Seal Matrix. Used to authenticate documents and letters, seal matrices were often engraved with the owner’s initials, family crest, or personal emblem. This one is inscribed with a cursive letter K on the base; at the top of the rounded handle, the dotted outline of a heart emerges from flowing flower petals. Finder: Jason Sandy. Image: Jason Sandy


18th century Memento Mori Ring. These rings were sometimes given out at funerals to commemorate a deceased person. Often inscribed inside with initials and dates, their morbid style was very popular during this period. This memento mori ring is engraved with a skull and was most likely inlaid with black enamel. Finder: Nick Stevens. Image: Nick Stevens


Pudding Lane Token. A 17th century trader’s token from Pudding Lane (spelt “pudin”). “Pudin” was the medieval term for offal; nearby slaughterhouses contaminated the lane with blood and entrails. Dated 1657. Finder and image: Nick Stevens


18th century Prisoner’s Ball and Chain. Interestingly, the lock is closed. Did the prisoner die whilst shackled, or perform a miraculous escape? Finders: Steve Brooker and Rick Jones. Image: Rick Jones


Victoria Cross Medal. A VC medal from an unknown soldier, issued for an act of valour during the Battle of Inkerman in the Crimean War. Dated 1853. Finder: Tobias Neto. Image: PAS

Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180167
Experiments With Reality: New Histories of the Magical

Arthur's Seat Coffins, discovered 1836. Photo: Kim Traynor, CC BY-SA 3.0




"Magic never weakened. Magic never hid."

Leonard Cohen, Beautiful Losers

About a mile east of Edinburgh Castle, there is an extinct volcano with the rather mythic-sounding name of Arthur's Seat, which makes its gradual and green ascent over the Scottish capital. In 1836, a group of schoolboys out hunting rabbits discovered some slate pulled over the entrance to a cave. Being curious, they pulled it aside and descended into the earth. There they would find something most unusual – seventeen miniature coffins, professionally carved with brass hinge pieces for the lids, each a little under four inches tall. Naturally, inside the seventeen miniature coffins were seventeen miniature corpses – wooden dolls with unique faces and clothes stitched perfectly to their tiny bodies. Arranged in two lines of eight, with the final coffin by itself, the oldest was already showing signs of rot, while the latest could have been placed there only the day before. Something so mysterious, so strange, so creepy couldn't help but generate theories. Were they some sort of memorial? Votive offerings? Wooden homunculi? A contemporary editorial in The Scotsman noted that "Our own opinion would be, had we not some years ago abjured witchcraft and demonology, that there are still some of the weird sisters hovering about." Edinburgh was a city of the Enlightenment, where a generation before thinkers like David Hume, Adam Smith, and Thomas Reid promulgated the stolid, sober, and commonsensical; where James Watt perfected the steam engine and where James Clerk Maxwell would go on to formulate the laws of electromagnetism. And yet in 1836, it seemed like somebody was practicing magic on Arthur's Seat.


The Arthur's Seat burial isn't mentioned in archeologist Chris Gosden's Magic: A History from Alchemy to Witchcraft, from the Ice Age to the Present, but in his career as curator of Oxford University's Pitt Rivers Museum he has handled "a witch in a bottle; a slug impaled on a blackthorn to stop the rain; potatoes carried by an old man in his pocket to help with his rheumatism; the tip of a human tongue (no idea about that one); and an onion stuck up a pub chimney with a piece of paper on which was written the name of a temperance campaigner trying to get the pub closed down." Such magic, Gosden argues, is a good deal more common than is assumed, not a marginal activity practiced by charlatans or gurus, but a universal practice inextricably connected to religion and science, one that hasn't been eclipsed in the modern world so much as it has been sublimated.


Within Magic, Gosden provides accounts of phenomena as diverse as witchcraft among the Azande, Renaissance writings about Hermes Trismegistus, ancient Jewish incantation bowls used to capture demons, and shamanistic rituals of the Eurasian steppe. What these activities share is a view of existence which sees the sacred and the profane as ever permeable, where humans are able to affect not just their environment but the nature of being itself. Gosden explains that "magic was a humanization of the universe. There is a continuity between the human will or actions and the world around us," for "magic allows the universe to enter us, whether this be through the movement of the stars or the messages relayed by moving stones." Hidden within the book is a more succinct definition: when discussing Scythian art, Gosden describes it as a "series of experiments with reality," an apt summation which covers phenomena as diverse as Kabbalah, the I-Ching, and druidic ceremonies.


An expert in ancient Celtic art, Gosden has done fieldwork from Papua New Guinea to Mongolia, and his voracious interests encompass astrology, alchemy, and divination. At times Magic's focus can seem so vast that it reads a bit like Professor Casaubon's Key to All Mythologies in George Eliot's Middlemarch, and yet Gosden's enthusiasm is well taken. He isn't the first contemporary scholar to take magic seriously (something which he notes), as the past two generations of researchers in fields as diverse as history, anthropology, religious studies, literary studies, sociology, and psychology have all rejected the dismissiveness of previous scholars. Gosden explains that traditional historiography promoted "the idea that human intellectual culture moved from a belief in magic, to a belief in religion, and then to a belief in science. Each successive belief system was more rational, institutionally based and effective than its predecessor," but that model wasn't only simplistic and disrespectful; it also minimized the ways in which magic has been instrumental in the making of meaning.


Within my own discipline of Renaissance Studies, scholars such as Keith Thomas in Religion and the Decline of Magic (1971), Dame Frances Yates in The Rosicrucian Enlightenment (1972), Ronald Hutton in The Rise and Fall of Merry England: The Ritual Year 1400-1700 (1994), and Owen Davies in Grimoires: A History of Magic Books (2009) have all examined the centrality of magic. Davies writes in Magic: A Short Introduction that his subject "provided emotional empowerment… and inspired solutions when alternative sources of knowledge (including science) were inadequate." What magic offered wasn't an explanation of how things happened so much as of why they did; a science of the symbolic rather than the empirical. Gosden's approach to magic is to understand what it does for those who believe in it, rather than to dismiss it as mere superstition. He writes that "human history is not purely or mainly about a practical mastery of the world but creating a set of close relationships with magical forces," which is to say with ineffable, intangible, transcendent meaning. Magic rituals, be they casting lots, making an astrological star chart, or indeed suturing tiny wooden dolls into small coffins for some inscrutable reason, aren't done for the same practical purposes that planting a field or digging a well are, for magic is about giving shape to a "sense of self and group… Magic is derived from participation, and participation has many dimensions." What Gosden conveys is just how normal magic is, an inextricable part of the human experience that's as central to our expressions of meaning as formally organized religion.


Though Gosden is admirably anti-positivist, rightly refusing the fallacy which interprets magic as simply inadequate science, or worse, as mere hucksterism, the book would have benefited from more thorough engagement with philosophical argumentation. Writing with the confidence of Casaubon, Gosden sweepingly argues that "Human history as a whole is made up of a triple helix of magic, religion and science, the boundaries between which are fuzzy and changing, but their mutual tension is creative." This theoretical neologism of the "triple helix" is as woven through the text of Magic as it apparently is through history, and while the broad contours of the concept aren't necessarily erroneous, they can serve to obscure as much as illuminate. The difficulty with the concept is neither that it's too sweeping (for academe could benefit from more ambition) nor that it gives weight to divergent epistemologies. Rather, the problem is that these terms are inexactly used. Both "science" and "religion" are of course variable concepts with complicated genealogies; as deployed by Gosden, however, the former seems broadly to mean "empiricism" (which is notably simpler than the strict definition of science), while "religion" he defines as involving belief in deities combined with hierarchical organization. Both of these concepts are far more complicated than how they're used here, and more variable as well (though he acknowledges that latter point). While the spirit of the triple helix is well taken – that any epoch is defined by a complex relationship between ways of understanding that are both literal and symbolic, both rational and aesthetic – the fuzziness of the metaphor undercuts its utility.


Regardless, Magic is a welcome introduction, and in collating such a variety of examples from so many different cultures, Gosden provides convincing evidence for his claim that "magic was not marginal or dubious but central to many cultural forms." There is an enchantment to Gosden's enthusiasms, and more importantly to the copious cases he provides. The technocratic partisans might have it that magic is simply cock-eyed superstition, hoaxes played on the credulous, but Gosden rightly ranks it among the great explorations of meaning which mark our species, from philosophy to art to literature. Most evocatively, he argues that magic might supply an invaluable metaphysic that coolly rational science and technology are lacking, one that could be particularly crucial in the warming days of the Anthropocene, writing that "Magic encourages a holistic view of human beings, linking them to the planet through practical and moral relationship. At a time when we need positive and holistic planetary thinking, magic has much to offer." It's a fascinating claim, and one which I hope Gosden takes up in some future study.


For the casual reader, it might be hard to see how seventeen dolls in Scotland have much to do with planetary survival, but a reading of Magic will show how the most varied of practices – painting a buried Neolithic body with red ochre, divination from the stars, erecting circles of stones on a green field somewhere – are connected to an understanding of life, consciousness, and reality where humanity is not separate from the wider world, but an intrinsic part of it. "Magic offers the possibility of a communal life – a life lived together with all the cosmos," he writes, and indeed the disenchantments of rationality have alienated us from the wider universe, making us observers rather than agents (even as we irrevocably destroy our environment). Gosden writes that a "truly open community is hard to obtain or sustain, but the need to cool the planet and live in a greater state of equality is urgent… Magic allows for a sense of kinship with all things, living or not. And with kinship comes responsibility."


As a model for that sort of perspective, consider a site some two and a half thousand miles east of Arthur's Seat, on an arid plateau in Turkey named Göbekli Tepe. There, in 1994, archeologists discovered one of the most astounding of prehistoric sites, a complex of stone columns, pillars, and altars at least eleven thousand years old. Anonymous artists rendered stone depictions of lizards and birds, bulls and foxes, in a structure which some have argued is the earliest temple. The nature of the faith of those who worshiped here is hidden beyond the veil of deep history, and yet the sculptures seem to evidence an almost transcendent sense of humanity's place in nature. Amazingly, Göbekli Tepe predates agriculture; the temple was constructed by nomadic hunter-gatherers. There is evidence that local agriculture developed specifically to feed and house those who built and then visited Göbekli Tepe, so that magic was a prerequisite for the emergence of civilization itself. There is a certain irony in the fact that magic inadvertently led to the system of organization which, eleven millennia later, pushes us to the precipice of ecological collapse; so much so that there would be a poetic justice if some modern version of the faith of Göbekli Tepe – which understands humans as being both in and of nature – were responsible for our environmental redemption. Magic may yet protect us from our own blasphemies against nature's sacred order.

Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180170
The Roundup Top Ten for May 7, 2021

The Game Is Changing for Historians of Black America

by William Sturkey

"Active racism, exclusion, and environmental injustice have destroyed whole sections of Black history. Many who gripe about “erasing history” of Confederate monuments in the South have no idea how much history has already been erased." New digital tools are key to reconstructing the record of Black history.


Child Welfare Systems Have Long Harmed Black Children Like Ma’Khia Bryant

by Crystal Webster

Although the death of Ma’Khia Bryant has been discussed as yet another example of police violence against Black Americans, it's important to recognize that she was also a victim of a child and family services bureaucracy that has been shaped by racism and left Black children to fend for themselves.



What History Can Teach Banks About Making Change

by Destin Jenkins

"Celebrating Juneteenth and recruiting more Black bankers is one thing. It is quite another for financial firms to use their unique power to actively undermine the systems that perpetuate racial inequality."



Germany’s Anti-Vaccination History Is Riddled With Anti-Semitism

by Edna Bonhomme

Antisemitism has often flared during public health crises such as pandemics; it has also attached itself to suspicion of vaccination, a trend that has been disturbingly prominent among German anti-vaxxers. 



The End of the Commonwealth of Puerto Rico

by Pedro Cában

Puerto Rico's status as Estado Libre Asociado (Free Associated State) arose in response to American desires to control the island as a colony while nodding to the island's autonomy during the Cold War. It is not up to the challenges Puerto Ricans face today, but Congress appears unwilling to move on from a colonial relationship.



Police and the License to Kill

by Matthew D. Lassiter

The history of the Detroit Police Department shows that police reforms won't reduce killing as long as departments can set priorities that result in racially targeted and discretionary enforcement and are allowed to investigate and sanction the conduct of their own officers. 



Visions against Politics

by Eileen Boris and Annelise Orleck

Historians Eileen Boris and Annelise Orleck are the guest editors of the spring edition of the AAUP's magazine focusing on the need for a New Deal for Higher Education. This is their introductory essay. 



Except for the Miracles

by Olúfémi Táíwò

"The deciding aspect of politics over these next crucial years will turn on battles against overwhelmingly powerful foes who will try to prevent radical redistribution of resources," writes Olúfémi Táíwò. The legacies of two radicals, in Ireland and Kenya, show the value of partial victory and learning from defeat.



A DARPA for Health? Think Again, President Biden

by Victoria A. Harden

The founder of the Office of NIH History says that the Biden Administration is right to urge a national commitment to health-related research, but shouldn't bother with creating a special task-focused agency; the best support is more, and more secure, funding for basic research.



The Real ‘Second-Class Citizens’ Of Academia

by Donald Earl Collins

Cornel West's complaints of being treated like a second-class citizen by Harvard may reflect disrespectful treatment. But thousands of adjunct professors experience worse as a matter of routine on the front lines of American universities. 


Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180163
Remember The Essential Workers After COVID: They Deserve Better

NYC Transit presented workers at the Hudson Yards subway station with these shirts last June 11 in recognition of unprecedented work to clean and disinfect the city's transit system.

Photo: Marc Hermann, MTA New York City Transit. Used under Creative Commons Attribution 2.0 Generic license.



As widening vaccination makes it possible to imagine a post-COVID New York, what will happen to the “essential workers” who have borne so much of the brunt of the pandemic? Nurses, transit workers, delivery people, grocery clerks, warehouse workers, and the like suffered a terrible toll of disease, death, and family loss. At the MTA alone, at least 157 employees died of COVID during the first year of the pandemic. Workers who in the past had been all but invisible to many New Yorkers commanded new respect and acclaim. If nothing else, we realized how dependent we are on their hard, dangerous, unglamorous, and often poorly paid labor.

Pretty quickly, though, we began drifting back to our old ways.  The nightly clapping and pot banging for health care workers – a moving ritual of appreciation and solidarity – faded away.  Some retail and delivery companies instituted “hero” or “hazard” pay.  But by last July, Amazon, Whole Foods, Target, and Stop & Shop had reduced wages to their prior level (though some companies continued to pay occasional bonuses).  Protests and short strikes brought essential workers more PPE and improved safety conditions, but no long-lasting structural changes.  In a city with a short attention span, it seems like essential workers have had their fifteen minutes and it is time to move on.

Do we really want to leave it at that?  New York once paid much more attention to the millions of workers who did the grimy, monotonous, dangerous work to keep the city going.  Those workers, in turn, helped make the city a more livable, equitable, and secure community.  In the decades after World War II, at least a million New Yorkers belonged to unions, which showed their muscle as soon as the war ended with a series of strikes that virtually shut the city down.  Many also belonged to tenant groups, neighborhood associations, and ethnic societies that provided mutual assistance.  Politicians, not only in the two main parties but also in the American Labor and Liberal parties, both sponsored by unions, paid careful attention to the needs of their working-class constituents.

Unions, progressive professionals, and liberal and left-wing politicians together transformed New York life by creating institutions to meet the needs of the working population.  An expanded municipal hospital system and non-profit insurers like HIP, GHI, and Blue Cross served medical needs.  Tens of thousands of units of public housing and non-profit cooperative housing were constructed to meet a crisis of affordable housing, while rent control enabled those who already had homes to stay in them.  New public colleges like Baruch, Lehman, John Jay, and Medgar Evers were established, open enrollment instituted, and tuition eliminated.  While racial and gender discrimination remained pervasive, with union support New York State passed some of the first laws in the country banning discrimination in employment, union membership, and housing.  Plenty of people who today we would dub essential workers still had it hard, but as a group they had better homes, better health care, more income, and were more likely to be able to take vacations, retire in security, and launch their children off to better lives than at any time in the history of the city.

The advances stopped in the 1970s.  As the New York City government tottered on the edge of bankruptcy, working-class communities were devastated by municipal lay-offs, frozen wages, brutal service cuts, housing abandonment, fare hikes, and tuition charges at the City University.  When the city finally recovered, finance, real estate, and tourism dominated the economy, public policy, and civic culture like never before.  Michael Douglas’s unscrupulous Wall Street corporate raider Gordon Gekko displaced Jackie Gleason’s bus driver Ralph Kramden on The Honeymooners as the quintessential New Yorker.  Meanwhile, little attention was paid to the hundreds of thousands of immigrants who reversed the population decline of the city and contributed mightily to its revival.

Working people made occasional reappearances on center stage.  On 9/11, the city rediscovered the firemen, construction workers, restaurant workers, and police who were so many of the heroes and victims that terrible day.  But when lower Manhattan was rebuilt, little consideration was given to the needs of laboring New York, the kind of people who only shortly before had been hailed for their sacrifices and bravery.  In 2005, striking transit workers reminded New Yorkers of who—literally—kept the city going.  But mostly the sheen of wealth and style, which spread across Manhattan and into parts of Brooklyn and Queens, hid the labor of the home care workers, delivery people, dishwashers, street vendors and nail polishers who made the good life possible.  Most affluent New Yorkers did not know and did not care that they lived in a city with greater economic inequality and greater racial segregation than places like Houston that they looked down their noses at.

Now we have a choice to make.  Public policy helped turn New York into a finance center whose most outstanding product is inequality: subsidies for office towers, corporate tax exemptions, rezoning, prioritizing cars and commuter rail while buses creep along congested streets, massive investment in policing while social services remain woefully inadequate, and on and on.  In spite of scare talk from Albany, New York is not poor; tax revenues dropped only modestly during the pandemic and a huge inflow of federal money will soon arrive.  Now is the time to rebuild our social capital: repair our existing public housing and build government-owned housing that glistens like the high-rises for the rich; promote more non-profit cooperative housing, like Penn South and Rochdale Village, which have served residents so well; extend the subway, this time not to Hudson Yards but to southeastern Brooklyn and eastern Queens; build public hospitals that can compete with the rich Manhattan medical complexes, with their billionaire donors and government handouts, so when the next pandemic hits we do not see another sight like Elmhurst Hospital last spring. 

We can do a lot even without spending much public money.  New York State’s adoption of a fifteen dollar minimum wage was a good first step, but let’s be honest, that is still pretty paltry pay to survive in New York City.  Let’s raise the minimum further for essential workers, perhaps requiring bonus pay during health crises, and while we are at it let’s get serious about enforcing health and wage standards for all workers rather than looking the other way.

All of us literally owe our lives to our essential workers.  Let’s not forget it.  Giving them a fair shake is not only a moral obligation, it will make a better city for all of us.

Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180088
Another Bite at the Apple: Isaac Newton's Time as a Man of Politics and Economics

Isaac Newton, by Sir James Thornhill, 1712



‘The past is a foreign country: they do things differently there’. The opening sentence of L.P. Hartley’s The Go-Between (1953) has become a historical cliché. Yet contrary to its implications, the past has no independent existence, but is being continually updated by scholarly travelers: it is a constantly shifting territory, an endangered world plundered by souvenir-hungry historical tourists.


Since his death in 1727, Isaac Newton has been reborn in various incarnations that reflect the interests of their creators as much as the realities of his existence. His monument in Westminster Abbey records not only his contributions to physics, but also his commitment to biblical studies and the timetables of ancient chronology, while the posthumous statue in Trinity College Cambridge shows an enraptured Enlightenment gentleman wielding a prism like an orator’s baton. The tale of the falling apple began circulating only in the nineteenth century, and his alchemical interests were suppressed until the economist John Maynard Keynes unleashed a flurry of investigations after the Second World War.


Newton’s definitive biography remains Never at Rest (1980) by the American historian Richard Westfall, who subsequently undertook a prolonged psychoanalytical investigation of his authorial relationship with his subject. In an extraordinary article, Westfall outlined the conclusions he had drawn about himself. “Biography…cannot avoid being a personal statement,” he declared, confessing that he had painted “a portrait of my ideal self, of the self I would like to be.” As he continued soul-searching, he accused himself of resembling a puritanical Presbyterian elder determined to preserve unsullied Newton’s reputation as a scientific genius, and he admitted downplaying Newton’s thirty years of financial and political negotiations at the Mint.


I hold no such qualms. Immersed in modern concerns about global capitalism and international exploitation, in Life after Gravity (2021) I explore Newton’s activities during those last three decades that so discomfited Westfall. Many scientists regard his London years as an unfitting epilogue for the career of an intellectual giant, but economists see matters differently. More interested in falling stock markets than in falling apples, they are untrammelled by assumptions that the life scientific is the only one worth living. According to them, once Newton had tasted fame and the possibilities of wealth, he wanted more of both. And that entailed moving to the capital, where he earned a fortune, won friends and influenced people. I can only speculate about reasons – professional insecurity, the anguish of an impossible love affair, private worries about intellectual decline as he aged – but Newton engineered his move with great care. Determined to make a success of his new life, he broke definitively away from provincial Cambridge with its squabbling academics, and dedicated himself to his new metropolitan existence.


While he worked at the Mint, Newton continued to confirm and refine his theories of the natural world, but he was also a member of cosmopolitan society who contributed to Britain’s ambitions for global domination. He shared the aspirations of his wealthy colleagues to make London the world’s largest and richest city, the center of a thriving international economy. Like many of his contemporaries, he invested his own money in merchant shipping companies, hoping to augment his savings by sharing in the profits (although he sustained a substantial loss during the South Sea Bubble crash of 1720).


An uncomfortable historical truth is often glossed over: until 1772, it was legal to buy, own, and sell human beings in Great Britain. Newton knew that the country’s prosperity depended on the international trade in enslaved people, and he profited by investing in companies that carried it out. For fine-tuning his gravitational theories, he solicited observations of local tides from merchants stationed in trading posts. And when he was meticulously weighing gold at the Mint, he must have been aware that it had been dug up by Africans whose friends and relatives were being shipped westward across the Atlantic, where they were forced to cultivate sugar plantations, labor down silver mines, and look after affluent Europeans.


National involvement in commercial slavery was a collective culpability, and there is no point in replacing the familiar “Newton the Superhuman Genius” with the equally unrealistic “Newton the Incarnation of Evil.” By exploring activities and attitudes that are now deplored, I aim not to condemn Newton, but to provide a more realistic image of this man who was simultaneously unique and a product of his times.


Newton was a metropolitan performer, a global actor who played various parts. Since theatricality was a favourite Enlightenment metaphor, I chose to experiment by structuring my narrative around a dramatic conversation piece by William Hogarth: The Indian Emperor. Or the Conquest of Mexico (1732). It is steeped in Newtonian references. For example, centrally placed on the mantelpiece, Newton’s marble bust gazes out across an elegant drawing room, while the royal governess bids her daughter to pick up a fan that has dropped to the floor through the power of gravity. On a small makeshift stage, four aristocratic children arranged in a geometric square perform a revived Restoration play about imperial conquest and the search for gold. Traveling round the picture – the room, the audience, the stage – as if I were a fly on its walls, I describe how Newton interacted with ambitious wheelers and dealers jostling for power not only in Britain, but around the entire world.


Newton may have been exceptional, but he can no longer be seen as William Wordsworth’s Romantic genius soaring in strange seas of thought alone, an abstract mind divorced from the mundane concerns that affect every human being. Ensconced in a powerful position, he took decisions and implemented policies that contributed to fostering the exploitation and disparity lying at the heart of modern democracy. As a privileged British academic, I benefit from being enmeshed within a global economic system that promotes inequality, and whose growth has been linked with the rise of science and the rise of empire since the mid-seventeenth century.


Exploring Hartley’s foreign country of the past can help to reveal how we have reached the present, but for me the main point of doing that is to improve the future. The current state of the world is not pre-ordained. Instead, multiple individual choices have shaped the direction humanity has collectively taken, and millions of others will affect what lies ahead. Ensuring a better future requires that everybody – you, me – take personal action. In writing this book, I have tried to analyse some of the ways in which our predecessors went wrong: we must avoid repeating their mistakes.

Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180089
Locating the Present in the Past: a World History Teaching Challenge



One of the daunting aspects of teaching world history is the realization of how much has to be left out. Constraints of the survey course and even limits to the size of our often-huge textbooks inevitably compel a host of omissions or short-changes. While limitations apply most obviously to coverage, they also affect conceptual or analytical goals: we simply can’t do as much as we would like.

The following comment, framed within the burden of constraints, seeks to encourage debate about how those of us teaching world history are prioritizing our goals, particularly amid the growing complications of the world around us and as we seek to attract greater undergraduate interest in historical study. Debate is the operative term, for while I urge a new discussion of priorities, and risk raising a few hackles in the process, the effort must be framed in terms of a balance amid a number of desirable outcomes.

Here’s the starter: I have always believed that the primary goal of world history teaching was to help students place the world around them today in context of the past, to explore how current patterns have emerged from earlier developments by emphasizing the basics of historical analysis: change and continuity. Here, it seems to me, is the core argument for urging world history over other survey topics: we live in a complex world, not just a Western world or an American world or even a Chinese civ. world, and we need to work with students on historicizing this complexity, even seeking to promote a capacity to continue to evaluate the global present long after the course has ended. Arguably – though this merits testing – explicit linkage between survey history and current global concerns may also propel greater student interest, adding a further twist to the longstanding goal, particularly in light of the fact that the survey course is our principal encounter with non-majors.  

Primacy does not mean exclusivity. We can also hope that world history surveys, like other surveys, promote abilities in critical thinking and/or interpretation of sources and/or capacity to develop arguments; and I grant that world history (unlike other surveys) assures exposure to regional diversity, an asset in itself.   But, again, we can’t do everything, which is why we arguably need consistent attention to whether our effort to contextualize the global present is actually reaching the student audience or whether we have diluted the linkage with other concerns.

And this is where, in recent years, I have felt the need for some reassessment, because while my undergraduate students have continued to display the usual (varied) range of mastery of the major world history periods and regions, I have become less sure of their grasp of how current patterns have emerged from the past and how the connections provide greater contemporary understanding.

To be sure, an initial criterion, and an important one, continues to be met: my survey has always had a substantial 20th-21st century unit at the end of the course (though in what for me has always been a one-semester offering, “substantial” may be taken with a grain of salt). I have always believed that it was vital for survey instructors to avoid getting so sidetracked by earlier developments or other disruptions that merely reaching World War II seemed a major achievement. After all, the decades before, say, 2015 are often the hardest for students to grasp, between their own experience and conventional textbook coverage – yet are also the most vital in bridging between present and past.

I increasingly question, however, whether simply reaching the 21st century provides sufficient connectivity to establish the past-present linkage that really makes the survey a source of genuine perspective on contemporary issues, or a preparation for active use of world history insights in the students’ futures.  We may well need to create explicit opportunities for linkage discussions throughout the course, and not just at the end – at least this is what I hope to accomplish in significantly redoing my own course (without, I must add, converting into a contemporary problems exercise: the focus on history remains central). Yet this kind of approach complicates the priorities of many current world history surveys. Hence the desirability of further discussion of the needs involved, and the potential tensions in trying to meet those needs.

For asking students to apply what we looked at in week 2, on the classical civilizations, to our discussions of contemporary regional factors in week 13 may be an over-demanding stretch unless we prepare the analysis directly in discussing the classical period. It is arguably not enough to evoke the importance (and complexity) of continuity (or heritage) in principle, and assume that students will be able to link up on their own. The challenge is twofold: first, simply retaining active memory over the multi-week span (as opposed to retaining the capacity to recall a canned definition of Confucianism or Hinduism for examination purposes); and second, knowing enough about the contemporary world to anticipate some connections early in the course.

For many students don’t know a lot about what is going on in the world today. Of course there are marvelous exceptions, among the globally motivated (some of whom, however, frequently place out of the college survey). For many able students, finding out about the surge of Hindu nationalism, or the historical evocation in the Belt and Road Initiative, or the striking recent trends in global poverty and aging, or the fascinating regional differences in response to the pandemic is a first, which is why leaving the issue of connectivity to the end of the course often falters because of the unfamiliarity of the issues involved.

This is not a declension lament: I continue to be impressed with the ability and interest of many good students. I do think it is possible that contemporary students are slightly less prepared than their predecessors in the global arena, mainly because of the decline in social studies plus modern languages and the measurable shift of the principal news media away from international coverage since the end of the Cold War. And I am impressed at the widespread belief in generational uniqueness, particularly around the impact of social media and the often unexamined idea of the generation itself, though the issues are hardly unprecedented.  But the main point is not really new: connecting global past to global present is a hard job, and this in turn is the primary reason that further attention is warranted.

The tools are familiar enough, centering on the balance between change and continuity that undergird major current trends and patterns and link them to developments at least as far back as the classical period or even the advent of agriculture. The execution centers on a combination of time and explicit attention that allows students to begin to work on the relevant connections throughout the course, so that the challenges of the final, contemporary unit are at least somewhat familiar.

But this is where linkage needs bump up against two other, and recently more widely touted, world history goals. For creating a bit more space for more consistent linkage discussions, around change and continuity, means at least some reconsideration of coverage aspirations and, possibly, the amount of time available for analysis of sources.

We know from the recent agonies over the chronological redefinition of the Advanced Placement world history course that coverage revisions can be truly painful: it is demonstrably easier for historians to add than to subtract. Many world historians have a commendable desire to include as many regional experiences as possible, particularly those developing outside the West, and the current interest in racial and ethnic identities adds further fuel. An interest in pushing far back in time also runs deep. Again, however, we can’t do everything. There is no formula for the tradeoffs involved, but it may be desirable to experiment again with recombinations that can meet a goal of regional adequacy and chronological depth – certainly including a starting point well before 1500 – in order more fully to address the historical contextualization of the present.

A passion for work with sources runs deep as well, and may well have increased in recent years; and there is certainly no justification for pulling back entirely. Indeed, juxtaposition of sources may be an excellent way to support discussions of change and continuity, while contributing to the student skill set as well. Here too, however, there may be some need to cull a bit: too much attention to sources, though worthy in itself, may distract from the larger analytical goals, where, among other things, dealing with some relevant and challenging secondary interpretations may be more useful, while further building student skills. There are limits to the kinds of developments, particularly over time, that source work can easily explore.

The issue, again, is priorities. World history surveys must help introduce students to a wider past, but in my view it should be a past that actively sheds light on the present – and that takes work, and requires consistent attention as the course unfolds. If this is agreed – and again this is worth debate – then we also need discussion about what kinds of contemporary patterns particularly merit attention and about how ensuing analysis of change and continuity can best be promoted. The result will not change the survey beyond recognition, but it may help students understand why we are urging attention to the subject in the first place and how it can be an ongoing resource in sorting out the complexities of the world around us.  

Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180093
Aidez Madrid!

Detail of "Aidez Espagne," Joan Miró, 1937



As we enter the final stretch of the campaign to elect Madrid’s powerful regional government, here is one last journalistic appeal to democrats and progressives outside Spain.

Paraphrasing the message in Joan Miró’s famous ‘Aidez l’Espagne’ poster—a striking image of a peasant, fist raised, exhibited in the Spanish Pavilion of the Paris International Exhibition in 1937—the appeal is now "Aidez Madrid."

In Paris in 1937 the Spanish Republic requested the support and intervention of Europe in its defense of democracy against Franco’s military coup, the summary executions of tens of thousands of democrats, and Hitler’s Stuka dive bombers, which had obliterated the Basque town of Guernica. The appeals were in vain and Western democracies abandoned the republic to its fate.

With no desire to overdramatize, those of us who witness directly the rise of the extreme right in Madrid feel compelled to make the historical comparison. We appeal to democrats and antifascists in the rest of the world to intervene once again. Intervene by expressing your dismay at conservative Partido Popular candidate Isabel Díaz Ayuso’s decision to open her arms to the extreme right party Vox, a 21st century version of Franco’s falangism.

Ayuso has not only adopted most of Vox’s incendiary language but has made clear that she is prepared to form a coalition government with the extreme right racist group after the May 4 elections. Ayuso has crossed every red line in her bid to win over Vox voters—necessary to avoid the election of a progressive government. She explained an outbreak of COVID in the densely populated working class districts in the south of the city as the result of “our immigrants’ way of life.” Her electoral slogan is “Freedom versus Communism.” In one TV interview she remarked, “If they call you a fascist, you are on the right side of history.”


Vox poster, Madrid. "MENA" refers to unaccompanied minor foreign migrants (Menor Extranjero No Acompañado) who are claimed to each receive 4,700 Euros worth of social service assistance monthly.

Despite these views, Ayuso commands the support of 40% of Madrid’s electorate, according to the latest opinion polls, 80% of whom say they have no issue with her plan to form a coalition government with the extremist Vox. To get a taste of how Vox directs its vitriol against the most vulnerable, take a look at their election posters, plastered in Madrid’s metro stations, which criminalize unaccompanied minors—most from North Africa—who participate in social integration programs.

For this reason Aidez Madrid is now a moral imperative. Democrats within and without Spain must protest at the inclusion of Vox in a Madrid government and use their influence to try to persuade Madrid voters to turn their backs on the far right. There is much at stake. A PP-Vox government in Spain’s capital would set a highly dangerous precedent for the rest of the world.

It is crucial that Ayuso be made aware that, just as with her quasi-denialist policy of allowing bars and restaurants in the city center to fill with locals and tourists, she is playing with fire in her modus vivendi with Vox. She must be warned. Yet no one in Europe or the US seems even aware of what is happening here.

There is little knowledge outside Spain of just how much power accrues to the government of the capital in Spain’s decentralized state. While socialist prime minister Pedro Sanchez governs the Spanish state along with coalition partners Unidos Podemos, regional governments like Madrid’s are responsible for health, education, transportation, and most social services. If this were a Spanish general election, public opinion in Europe would surely have rallied against a possible extreme right government. But the Madrid election appears to have slipped under the international radar.

This is not just a matter of principle. More than ever, Spain depends on the rest of Europe for its economic survival. Madrid has requested more than 21 billion euros from the European support program for post-pandemic economic recovery. Wouldn’t it be problematic for Europe's democratic credibility if parts of this generous budget ended up under the control of a Vox minister in the Madrid regional government who chose, for example, to channel European funds into programs that excluded immigrants? How would Brussels feel if a Madrid government managing European funds for the green recovery included climate change deniers such as those in the Vox leadership?

It is an issue of added concern because Ayuso has promised to cut income tax in the Madrid region, making the European funds more important to her plans. This unfair tax competition discriminates not only against other cities in Spain but also against cities elsewhere in Europe.

Perhaps the first Aidez Madrid appeal should be addressed to the European and American tourists attracted by Ayuso’s free-for-all policies during the pandemic, despite continuing concerns about COVID and the slow roll-out of the vaccination program.

Some may choose to leave the bars and restaurants and visit the Reina Sofía museum, where they can admire Picasso’s Guernica—an extraordinary depiction of fascist violence painted in the first six months of 1937 and exhibited in the Paris Pavilion. Discriminating visitors can discover in the Reina Sofia’s moving collection of modernist Spanish art a room dedicated to the avant-garde Spanish pavilion in Paris designed by Luis Lacasa and the Bauhaus-influenced Josep Lluís Sert. On view are sculptures by Alexander Calder, Julio González (who died in a concentration camp), and Alberto Sánchez, whose surrealist totem can be seen in the square in front of the museum entrance. Visitors might choose to watch Buñuel’s Las Hurdes, originally screened inside the pavilion and now projected on a loop at the Reina Sofía. The pavilion containing these seminal works of modern art was the brainchild of Spain’s then socialist prime minister, Francisco Largo Caballero.

For that reason, we should all be concerned not only at the presence of Vox in the regional government but also by their influence on city authorities. Madrid Mayor Jose Luis Martinez Almeida, a close ally of Ayuso in the PP, has just announced, at the request of Vox, his coalition partner on the Madrid council, that all place names in Madrid dedicated to Largo Caballero and fellow socialist Indalecio Prieto should be withdrawn from the streets of the capital as part of a campaign to “rid the city of Communist symbols.”

Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180091
Elijah Lovejoy Faced Down Violent Mobs to Champion Abolition and the Free Press

Wood engraving of November 7, 1837 mob attack in Alton, IL. Antislavery publisher Elijah Lovejoy was killed and his press, hidden in this warehouse, was destroyed, with the pieces thrown into the Mississippi River.




It was gratifying that Rep. Jamie Raskin would invoke an obscure 19th century newspaper editor while laying out the impeachment case charging President Trump with incitement in the brutal January 6 insurrection at the U.S. Capitol. The editor was Elijah Lovejoy, whose slaying at the hands of a proslavery mob in 1837 encapsulated a chilling wave of political violence that Abraham Lincoln, then an Illinois lawmaker, condemned as “mob law.”


As Congress confronts the aftermath of the pro-Trump crowd’s frenzied violence—some of which was directed against journalists trying to cover the events—it is a good moment to recall courageous editors such as Lovejoy who defended American freedom of the press in the face of violent and often well-organized efforts to silence them. Lovejoy’s story is a reminder that the liberties journalists hold dear are more fragile than we’d like to admit.


Over many months before he was killed in Alton, Illinois, Lovejoy, who ran a weekly paper called the Observer, was repeatedly targeted by mobs over his persistent writings against slavery—a stance considered akin to treason along the edge of the slaveholding South. A white Presbyterian pastor from Maine, Lovejoy was chased from St. Louis in 1836 over columns denouncing slavery as immoral. He scurried across the Mississippi River to the free state of Illinois to keep printing his weekly, but ran afoul of critics there, too. Three times, vigilantes destroyed his press and threw it into the river. Three times, friends helped Lovejoy buy a new one.


Lovejoy belonged to a small fraternity of editors—often devout Quakers or Presbyterians, mostly white—who used their printing presses in the decades before the Civil War to call for an end to chattel slavery, sometimes within the very states that practiced it. Early antislavery editors, such as the Quaker Benjamin Lundy, sought to prick the nation’s conscience before any large abolitionist movement had taken shape.


Writing against slavery could be dangerous—Lundy was assaulted by a slave trader about whom he had written in his Baltimore newspaper. William Lloyd Garrison, the firebrand editor of the Liberator, was publicly paraded by a Boston crowd with a rope looped around him. Antislavery editors were threatened, pursued by vigilantes and, in Lovejoy’s case, shot dead. Even in free states of the North, antislavery journalists were reviled as deranged fanatics bent on destroying the Union—an earlier era’s version of “enemy of the people.”


Except for Garrison, abolitionist editors have languished in the footnotes of the larger saga of slavery and the Civil War. That’s a shame. By using their newspapers to press for an end to human bondage, editors such as Lovejoy and James G. Birney elevated a twin cause: freedom of the press. These ink-stained abolitionists defied attempts to muzzle their papers at a moment when the young country was still working out how free its newspapers would be. Their defense of the right to publish on any topic, even unpopular ones, contributed to a national understanding of how a free press acts—even today.


Slavery’s defenders went to astonishing lengths to stifle public criticism, even in the North. Slave states outlawed “incendiary” abolitionist journals, which they said could incite slaves to rebel. Southern lawmakers sought vainly to get counterparts in the North to crack down on abolitionist materials at their source. Suppression extended even to Congress: the House of Representatives, dominated by Southerners and their Northern allies, approved a “gag rule” in 1836 to prevent discussion of slavery within its walls.


These steps and rampant mobs in the North represented a vast effort to suppress antislavery sentiment as a small but growing number of whites and free Blacks in the North had begun to mobilize. Pro-slavery forces in the South went so far as to intercept and burn sacks of antislavery journals sent through the U.S. mail.


But canny abolitionists spotted advantage in marrying the cause of the slave with freedom of the press. Birney, a white Kentuckian who liberated his slaves and embraced abolitionism, founded an antislavery newspaper in Cincinnati. Mobs twice wrecked his press, though Birney was uninjured.


Birney viewed mob attacks on the press as a chance to sow indignation in the North, and to cast abolitionists as defenders of civil liberties in the face of tyranny. “The contest is becoming—has become, one not alone of freedom for the black,” he told a fellow activist, “but of freedom for the white.” Even after his second press was destroyed while Cincinnati’s mayor looked on, Birney resumed publishing.


No editor carried the fight further than Lovejoy, whose Observer grew more strident after his flight to Illinois. Thugs in Alton sacked his office and threatened to tar and feather him. Local elites, including the Illinois attorney general, and hostile newspapers across the river in the slave state of Missouri employed pressure and thinly veiled threats to get him to stop writing about slavery.


Lovejoy refused, again and again, arguing that his freedom to publish on any subject was granted by the Constitution and God. “You may burn me at the stake…or you may tar and feather me, or throw me in the Mississippi, as you have often threatened to do,” Lovejoy told a gathering of citizens summoned to persuade him to quit. “But you cannot disgrace me.”


Days after that, on November 7, 1837, Lovejoy was fatally shot while fending off a pro-slavery mob. His press was hammered to pieces and tossed into the Mississippi. Lovejoy, 34, became the first American martyred in defense of journalism. John Quincy Adams, the Massachusetts congressman and former president, compared the shock of Lovejoy’s killing to “an earthquake throughout this continent.”


That was hyperbole. Yet Lovejoy’s struggle did help solidify an expansive understanding of press freedom for future generations of journalists. At a moment when white nationalists seek to turn “Murder the Media” into a right-wing brand, Lovejoy and Birney remind us that an unfettered press has long tried to light our way, and that the liberties we take for granted didn’t come for free.

Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180090 https://historynewsnetwork.org/article/180090 0
The 1940s Fight Against the Equal Rights Amendment Was Bipartisan and Crossed Ideological Lines




The fight over the Equal Rights Amendment is often framed as a classic fight between liberals and conservatives with liberals supporting the amendment to ensure gender equality and conservatives opposing the amendment to preserve traditional gender roles. But the history of the ERA before the state ratification battles of the 1970s shows that the fight over complete constitutional sexual equality did not always fall along strict political boundaries. As the dynamics of the early ERA conflict suggest, support for and opposition to the ERA are not positions that are fundamentally tied to either conservatism or liberalism. The ERA was first introduced into Congress in 1923, and Congress held several hearings on the amendment from the 1920s through the 1960s. Early ERA supporters as well as amendment opponents included liberals and conservatives alike. At its roots, the ERA conflict reflects a battle over the nature of American citizenship and not a typical political fight between liberals and conservatives.


The resurrection of the anti-ERA campaign effort in the mid-to-late 1940s is a prime example of how the original ERA conflict transcended typical political disputes of the early twentieth century. The social upheaval of World War II created a surge in support for the ERA, which alarmed several notable ERA critics, such as Mary Anderson, former head of the Women’s Bureau, Dorothy McAllister, former Director of the Women’s Division of the Democratic Party, Frieda Miller, the new head of the Women’s Bureau, Frances Perkins, the Secretary of Labor, and Lewis Hines, a leading member of the American Federation of Labor (AFL). In a September 1944 meeting, the distressed ERA opponents decided to create the National Committee to Defeat the Un-Equal Rights Amendment (NCDURA). This organization hoped to break the growing energy behind the ERA by centralizing the opposition forces and launching a coordinated counterattack on the amendment.


While founders of the NCDURA were predominantly prominent liberal ERA opponents, the organization actively worked with conservative amendment critics to squash the growing support for the ERA. When word reached the NCDURA’s leaders in April 1945 that the full House Judiciary Committee intended to report the ERA favorably, the organization reached out to “the all-powerful” conservative Representative Clarence J. Brown (R-OH), as one NCDURA official had put it, to help stall the amendment in the House. Once the full House Judiciary Committee reported the ERA favorably in July 1945, the leadership of the NCDURA used its budding connections with Representative Brown and the House Rules Committee to delay action on the amendment.


The NCDURA worked with conservatives once again when the ERA made progress in the Senate in the period following World War II. After the full Senate Judiciary Committee reported the ERA favorably in January 1946, the NCDURA began to coordinate efforts with conservative Republican Senator Robert Taft of Ohio. Senator Taft opposed the ERA because, he claimed, it would nullify various sex-based state laws that he believed protected women as mothers and potential mothers. In preparation for the July 1946 Senate floor debate on the ERA, the NCDURA worked with Senator Taft to make sure that every senator received a copy of the “Freund Statement,” an extensive essay by eminent legal scholar and longtime ERA opponent Paul Freund that outlined various arguments against the amendment. Before the debate, the NCDURA and its allies in the Senate also introduced into the Congressional Record an article denouncing the ERA written by former First Lady Eleanor Roosevelt. The NCDURA’s work paid off. When the Senate voted on the ERA in July 1946, the amendment failed to receive the two-thirds majority of votes required for passage of a constitutional amendment.


In the final weeks of December 1946 and the early days of January 1947, it became clear to the NCDURA’s leaders that ERA supporters were not going to give up easily on their amendment. As a result, the NCDURA’s officials decided to continue to build relationships with prominent Republicans while creating a positive program that would provide an alternative measure for improving women’s status. In the months that followed, NCDURA leader Dorothy McAllister enlisted the help of influential Republican Party member Marion Martin, the founder of the National Federation of Women’s Republican Clubs, to encourage other important Republicans to oppose the amendment.


The NCDURA also began to work on a joint resolution that aimed to eliminate any possible harmful discrimination against women while reaffirming what ERA opponents believed to be equitable sex-based legal distinctions. The bill included two main objectives: declare a general national policy regarding sex discrimination and establish a presidential commission on the status of women. For the policy statement, the bill called for the elimination of distinctions on the basis of sex except for those that were “reasonably based on differences in physical structure, biological, or social function.” According to the bill’s backers, acceptable sex-based legal distinctions included maternity benefits for women only and placing the duty of combat service on men exclusively. The bill’s supporters also noted that the policy statement would only require immediate action by federal agencies; it would not necessitate immediate, compulsory action from the states. The purpose of the bill’s proposed presidential commission was to investigate sex-specific laws and make recommendations at the appropriate federal, state, and local levels.


By February 1947, the NCDURA had gained strong support for its bill from two influential conservative congressmembers: Senator Robert Taft of Ohio and Representative James Wadsworth of New York. While Senator Taft had started to help the anti-ERA effort in the mid-to-late 1940s, Representative Wadsworth had been a committed ERA opponent since the 1920s. Wadsworth supported the NCDURA’s bill because he believed that it would allow for the “orderly repeal” of unjust laws while preserving women’s right to special protection. Senator Taft and Representative Wadsworth introduced the Women’s Status Bill into Congress on February 17, 1947. To bolster support for the bill, the NCDURA changed its name to the National Committee on the Status of Women (NCSW) in the spring months of 1947.


The Women’s Status Bill, which was commonly referred to as the Taft-Wadsworth Bill in the late 1940s, obtained a decent level of support from both Democrats and Republicans. Most importantly for ERA opponents, the bill successfully helped to subdue the pro-ERA impulse that had taken root during World War II because the bill provided an alternative measure for improving women’s status that promised a degree of equality while preserving the rationale for sex-specific legal treatment. The NCSW had versions of the Women’s Status Bill introduced into Congress every year until 1954. While the measure failed to pass Congress, it did provide the blueprint for what would become President John Kennedy’s Presidential Commission on the Status of Women, which was created in 1961.


Opposition to the ERA is not the only position that has appealed to both conservatives and liberals. The pro-ERA momentum that accelerated during World War II had helped the ERA gain an array of backers from across the political spectrum. That momentum slowed because of the ERA opposition work in the post-war years. Still, it is important to recognize the ways in which support for and opposition to the ERA have the potential to attract conservatives and liberals alike. By giving greater attention to how the struggle over the ERA has defied conventional categories of political ideology, we can gain a greater appreciation for the complexities embedded in the fight over the ERA and a better understanding for why the amendment has yet to be ratified. ERA opponents succeeded in stopping the amendment in the post-World War II era because they embraced an alternative approach for improving women’s status. That approach appealed to many conservatives and liberals because it allowed for a limited equality that upheld what they believed to be women’s natural right to special protection.

Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180094 https://historynewsnetwork.org/article/180094 0
The Fateful Choice: Nuclear Arms Race or Nuclear Weapons-Free World

The Bulletin of the Atomic Scientists adjusted its Doomsday Clock to show 100 seconds to "midnight" in 2020, where it remains.



The recent announcement by the British government that it plans a 40 percent increase in the number of nuclear weapons it possesses highlights the escalation of the exceptionally dangerous and costly nuclear arms race.

After decades of progress in reducing nuclear arsenals through arms control and disarmament agreements, all the nuclear powers are once again busily upgrading their nuclear weapons capabilities.  For several years, the U.S. government has been engaged in a massive nuclear “modernization” program, designed to refurbish its production facilities, enhance existing weapons, and build new ones.  The Russian government, too, is investing heavily in beefing up its nuclear forces, and in July 2020, President Vladimir Putin announced that the Russian navy would soon be armed with hypersonic nuclear weapons and underwater nuclear drones. Meanwhile, China, India, Pakistan, and North Korea are expanding the size of their nuclear arsenals, while Israel is building a new, secret nuclear weapons facility and France is modernizing its ballistic missiles, cruise missiles, and missile-carrying submarines.

This nuclear buildup coincides with the scrapping of key nuclear arms control and disarmament agreements, including the Intermediate-Range Nuclear Forces Treaty, the Iran nuclear agreement, and the Open Skies Treaty.

Like arms races of the past, the reviving nuclear arms race places the world in immense danger, for when nations engage in military conflict, they are inclined to use the most powerful weapons they have available.  How long will it be before a nuclear-armed, aggressive government—or merely one threatened with military defeat or humiliation—resorts to nuclear war?

In addition to creating an enormous danger, a nuclear arms race also comes with a huge financial price—in this case, in the trillions of dollars.  Military analysts have estimated that the U.S. government’s nuclear “modernization” program alone will cost about $1.5 trillion.    

Of course, the nuclear arms control and disarmament process is not dead—at least not yet.  One of U.S. President Joseph Biden’s first actions after taking office was to offer to extend the U.S.-Russia New START Treaty, which significantly limits the number of U.S. and Russian strategic nuclear weapons.  And the Russian government quickly accepted.  In addition, efforts are underway to restore the Iran nuclear agreement.  Most dramatically, the UN Treaty on the Prohibition of Nuclear Weapons, which was adopted by 122 nations in 2017, secured sufficient ratifications to become international law in January 2021.  The provisions of this landmark agreement, if adhered to, would create a nuclear weapons-free world.

Even so, when it comes to freeing the world from the danger of nuclear destruction, the situation is not promising.  None of the nuclear powers has signed the Treaty on the Prohibition of Nuclear Weapons.  And without their participation, a nuclear-free world will remain an aspiration rather than a reality.  In fact, the most powerful nuclear nations remain in a state of high tension with one another, which only enhances the possibility of nuclear war.  Assessing the situation at the beginning of 2020 and 2021, a panel appointed by the editors of the Bulletin of the Atomic Scientists placed the hands of their famous “Doomsday Clock” at 100 seconds to midnight, the most dangerous setting in its history.

As a result, a fateful choice lies before the nuclear powers.  They can plunge ahead with their nuclear arms race and face the terrible consequences.  Or they can take the path of sanity in the nuclear age and join other nations in building a nuclear weapons-free world.


Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/179977 https://historynewsnetwork.org/article/179977 0
FDR's Court Packing Plan Backfired in the South. Will Biden Repeat The Error?



In assessing recent moves by President Joe Biden and congressional Democrats pointing to possible expansion of the Supreme Court, it helps to recall the immediate and long-term consequences of an earlier Democratic attempt to alter the makeup of that body.  Over a roughly two-year span beginning in May 1935, a majority-Republican Court had almost systematically eviscerated a succession of key New Deal measures, including the National Industrial Recovery Act and the Agricultural Adjustment Act. By February 1937, President Franklin D. Roosevelt had finally had his fill of sitting by while the Court made war on his recovery program. Accordingly, he unveiled a plan that would enable him to appoint up to six additional (and presumably more liberal) justices, based on the number of sitting members who were 70 or older.

In reality, Roosevelt's aggressive move did not stem entirely from his fear of an extended reign of judicial terror at that point. It also reflected his anger at mounting pushback against his entire New Deal legislative agenda, not only from House and Senate Republicans but from some key representatives of his own party as well. A number of these Democrats made no secret of their distaste for his Court plan, rendering it dead even before its arrival on Capitol Hill. This blatant display of ingratitude infuriated Roosevelt, whose historic landslide re-election victory just three months earlier had also swept a dozen new Democrats into the House and five more into the Senate, leaving the party with overwhelming majorities in both.

Still, for too many congressional Democrats, FDR's "court-packing plan" amounted to playing Russian Roulette with the constitutional balance of powers. It was particularly disquieting for the southerners among them who saw it paving the way for a judicial assault on the legal underpinnings of racial segregation in their region. In other circumstances, FDR might have recognized his attempt to dramatically remake the Supreme Court as the ultimate political longshot from the beginning, and, as he had done before, simply cut his losses and moved on.

In this case, though, his judgment was likely impaired by his exaggerated sense of the political loyalty and latitude that should be his due by virtue of the stunning tribute to his personal popularity recently rendered at the polls. This sense, in turn, only stoked his outrage at the congressional rebuff, leading him to compound the effects of one egregious political miscalculation with another. The latter lay in attempting what his critics deemed a Stalinesque "purge" of some of the Democrats who had the temerity not only to oppose his Court measure, but also to express reservations about some of his programmatic New Deal initiatives besides.

Having persuaded himself that his overwhelming personal popularity with the Democratic faithful put their votes in other political contests at his disposal, Roosevelt moved in 1938 to recruit and emphatically endorse primary challengers to several of what he saw as his most consistently obstructionist Democratic adversaries in Congress.  These included Ellison D. "Cotton Ed" Smith of South Carolina, who had been in the Senate since 1909, and Walter F. George of Georgia, who had joined him there in 1923. As southerners, both Smith and George likened Roosevelt's move against them to a second Yankee invasion, while other objects of his ire cited his interference as a violation of the sovereignty of their respective states. With but one exception, all of FDR's designated enemies bested their anointed primary challengers, typically by comfortable margins. Divided and worn down by all the primary in-fighting, and saddled with a slumping economy to boot, the Democrats would go on to lose 71 seats in the House and 6 in the Senate in the November general election.

In the end, Roosevelt had squandered an enormous amount of political capital, shattered his own image of invincibility and further widened the ideological and regional divisions within his own party. This additional stress on the potent but increasingly tenuous "New Deal Coalition" of southern whites, northern blacks and organized labor turned up the heat on the simmering tensions that finally boiled over with the Dixiecrat revolt in 1948. Beyond that, these southern Democrats now had additional incentive to join right-wing northern Republicans in resisting further expansion of not just presidential but federal prerogatives in general. In this, FDR's attempt to bring the Supreme Court to heel may have sown the seeds of the knee-jerk animus toward Washington that propelled Donald J. Trump to victory in 2016.

 In fact, it was this very hostility that spurred former president Trump and former Senate majority leader Mitch McConnell to pursue their own ideological realignment of the Court, aimed at skewing it in precisely the opposite direction envisioned by FDR. Their success in ramming through three decidedly conservative new appointments in rapid succession has prompted President Biden to create a thirty-six-member bipartisan commission of constitutional experts charged with exploring the pros and cons of Supreme Court reform. Meanwhile, on the heels of Biden's action, congressional Democrats have announced their own plans to introduce a bill that would expand the membership of the Court from 9 to 13.

Whether we may soon be looking at "deja vu all over again" remains to be seen, of course, but there is reason to think not. Ultimately Biden seems too much the political realist to believe that any plan to alter the composition of the Supreme Court has much better prospect of winning the requisite backing on either side of the aisle at this juncture than Roosevelt's did in 1937. Some have already ventured that, in assembling so cumbersome and diverse a commission with such an open-ended mandate, the President's initiative is a largely perfunctory attempt to fulfill a campaign pledge to the more liberal-minded cohort within his party.  

The largely positive popular response to the relative moderation of the new Democratic administration to date also raises the question of why the Democrats would choose to go all-in for so polarizing a measure at the risk of lending traction to GOP charges that Biden is engineering a radical leftist takeover of our government.   Joe Biden has made no secret of his admiration for Franklin Roosevelt, and it may well be that he could do with some of Roosevelt's charisma and elan. Yet, insofar as the circumstances and options he now confronts may compare to those facing his aspirational model some eighty-four Aprils ago, he enjoys at least one potential advantage. As a veteran of untold Senate skirmishes over White House initiatives, Biden should have entered the Oval Office fully aware of something that had apparently eluded FDR prior to the Court-packing debacle. Even in times less fiercely partisan than these, it is the ultimate ahistorical folly for any President to mistake even the most resounding personal triumph in winning the hearts of the voters for gaining sway over their choices of who represents them in Congress and elsewhere, much less for commanding the allegiance of those whom they select.

Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180092 https://historynewsnetwork.org/article/180092 0
The Roundup Top Ten for April 30, 2021

Rick Santorum And His Critics Are Both Wrong About Native American History

by Michael Leroy Oberg

Rick Santorum's recent comments reflect a white nationalist viewpoint. But too many liberal critics responded to them with historical references to Native "contributions" to the American nation that erase the violence that nation carried out against indigenous people. 


Vaccine Hesitancy Is as Old as Vaccines. I Take Comfort in That

by David Motadel

Resistance to vaccination is nothing new. But historical episodes of "vaccine hesitancy" have tended to dissolve. 



In the U.S, Praise for Anglo-Saxon Heritage has Always Been about White Supremacy

by L.D. Burnett

Labeling American political institutions as "Anglo Saxon heritage" reveals the ugly strain of thought that holds only some ethnic groups are congenitally capable of participating in citizenship. 



Police Reform Doesn’t Work

by Michael Brenes

Liberal calls for police reform operate within an ideological context where preserving order and assigning private responsibility for social problems forecloses any real consideration of inequality. Minneapolis, the site of Derek Chauvin's trial and the killing of Daunte Wright, is an illustrative example. 



Mary Seacole and the Politics of Writing Black History in 1980s Britain

by Margo Williams

The revival of British interest in the life of Jamaican-Scottish nurse Mary Seacole reflected the rise of a movement by Black British activists to see Black history as a story of struggle, rather than a color-blind narrative of Britishness. 



Colony of Cobblestone

by Carlos Santiago

San Juan's cobblestones are an illusion, aesthetic flourishes of "old world charm" intended to boost the tourism economy and conceal the island's status as a U.S. territory.



The Crushing Contradictions of the American University

by Chad Wellmon

The proliferation of student loan debt reflects the acceptance by banks, borrowers, and the federal government of the idea that higher education is transformative and beneficial. Is this ideology bordering on magical thinking? 



The Perils Of Participation

by Amanda Phillips de Lucas

The construction of US Highway 40 in West Baltimore blighted a Black community with far-reaching results. But it's important to understand that road planners used a selective idea of participatory planning to manufacture community consent for the project. 



The MOVE Bombing and the Callous Handling of Black Remains

by Jessica Parr

The remains of the victims of the Philadelphia Police Department's bombing of the MOVE organization in 1985, including two children, were acquired by the University of Pennsylvania, stored outside of climate control, passed on to Princeton, and eventually lost, a final indignity to the victims. 



Elegy for Op-Ed

by Michael J. Socolow

The decision by the Times to rebrand its outside commentaries reflects its failure to fight consistently over the years for the open exchange of ideas and to differentiate the views it published from its own official positions. 


Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180087 https://historynewsnetwork.org/article/180087 0
Lights, Camera... Survey! Americans Give History a Screen Test

Russian Filmmaker Mikhail Kaufman prepares a shot for Man With a Movie Camera (1929)




The results are in and it’s official: we are a nation of watchers. As Americans retreated to the security of their own homes amid the ravages of COVID-19, their love affair with screens only increased. According to Eyesafe/Nielsen, adults spent an average of thirteen hours, twenty-eight minutes per day watching a screen in March 2020.


That represents a daily increase of three hours, twenty minutes, relative to the third quarter of the previous year. Of those, live television viewing went up by over two hours each day for five-and-a-half hours total, while time-shifted watching increased by nearly twenty minutes.


Streaming video-on-demand viewings likewise spiked eighty-five percent over comparable three-week periods in 2019 and 2020.


What seems clear is that what we know about the world around us is increasingly dependent on electronic boxes of various sizes and dimensions, and on the content providers who fill them.


As a historian, I’m always intrigued by how the public learns about the past, which is why my colleagues and I recently ran a national poll to find out where people get their historical information. The results of that survey, a collaboration between the American Historical Association and Fairleigh Dickinson University, and funded by the National Endowment for the Humanities, indicate that historical consumption is a microcosm of the trends outlined above. Yet those same results, which will be published in full this summer, expose some fascinating incongruities as well.


First, the trends. Asked where they’ve gotten their information about the past since January 2019 (that is, pre-COVID to present), respondents showed an overwhelming preference for screens. Out of a range of nineteen possible sources, the top three choices – documentary film and TV, fictional film and TV, and TV news – were all video.


More traditional forms of historical information simply weren’t competitive: museum visits (tenth place), non-fiction history texts (twelfth) and college courses (dead last) trailed television and film by significant margins.


That said, the great bugaboo of recent disinformation, social media, likewise assumed back-of-the-pack status, coming in at fourteenth place. Although use of social media has remained robust during the pandemic, most respondents to our survey didn’t seem to view such platforms as having much to do with history, per se.


The incongruities emerged when we asked survey-takers to rank the perceived trustworthiness of those same sources above. Only documentary film and TV stayed in the top three, though it now trailed both museums and historic sites.


While TV news ranked third as a go-to source for history, it fared miserably in terms of reliability, coming in fifteenth. Fictional films and TV did even worse at seventeenth. Few respondents had taken a college history course since January 2019, but history professors were still highly trusted, garnering fourth position. The same was true for non-fiction books, which moved up the scale to sixth, despite being sparsely utilized.


A bit of a disjuncture thus emerges. Whereas the public reports largely turning to video for its historical information, those same viewers are skeptical of much of what they see on their screens.


Our survey couldn’t determine exactly what people were watching, a topic that awaits further investigation. But respondents’ high utilization of, and obvious trust placed in, documentaries – and their corresponding distrust in news and dramatizations – invites a certain amount of cynicism. Although one can find quality programming in the current state of “docu-mania,” there’s a proliferation of disinformation as well. Such nonsense as Mikki Willis’s Plandemic, or the all-day conspiratorial marathons on the History Channel (Ancient Aliens, anyone?), are wrapped in a patina of documentary that lends them unmerited credibility.


Meanwhile, news programs that may strive for factuality, and that are avidly consumed by history-minded viewers, were largely dismissed by our respondents as unreliable. Here, our survey reflects broader distrust in news services that have been assaulted by several years’ worth of “fake news” accusations. In a national survey from the 1990s similar to ours, people likewise looked askance at dramatized history on film and TV, but they have consistently devoured it nonetheless, if Academy and Emmy Awards are any indication. And just as documentaries can deceive, fictionalized video renditions of the past can be quite edifying if one bears in mind how to read historical films as cultural artifacts.


The increasing ease of access to video media may explain a lot about current consumption habits of historical information. But if so, it bodes ominously for sources of the past deemed more trustworthy, yet which take more effort (reading books) or intentionality (visiting museums) to engage.


Maybe we shouldn’t be surprised by such disjuncture. After all, the nation’s alcohol consumption has surged during the pandemic despite the drug’s well-known detrimental health effects. People knowingly acting against their own interests in where they turn for historical information is thus not an isolated phenomenon.


If there’s a glimmer of hope, it’s that Americans – no matter their age, race, gender or political affiliation – are often in agreement when it comes to their history consumption habits and views on the reliability of sources. Sixty-seven percent of our respondents in the 18-29 age bracket reported watching dramatic films and TV to learn about the past, a statistic that barely moved for the 65+ age cohort (66%). Meanwhile, 87% of those identifying as Democrats said they trusted documentaries somewhat or a great deal, compared with 84% of Republicans.


They may be watching very different historical programming, but the public’s preferences and attitudes toward it align more often than not. In a country as deeply divided as ours, that’s no small matter.

Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180008
Walter F. Mondale and the Creation of the New Vice Presidency

VP Walter Mondale with President Jimmy Carter, 1979. 



Walter F. Mondale transformed the American vice presidency.  Converting that disparaged position into the true second office of the land was an historic accomplishment that tells a lot about the gifted public servant he was.  Whereas others had failed to make the office consequential, Mondale created a new vision of the vice presidency and demonstrated that it could be a force for good.  He reinvented the office, not as an end in itself, but to allow government to better promote the general welfare and foster a more just society and more peaceful world.

The vice presidency had moved into the executive branch beginning in the 1950s, but it was experiencing troubled times when Jimmy Carter and Mondale were elected in November 1976.  President Lyndon B. Johnson had abused Hubert H. Humphrey, Spiro T. Agnew had resigned in disgrace, Gerald R. Ford had spent nine unhappy vice-presidential months as far from Richard M. Nixon and Watergate as he could get, and Ford’s well-intentioned promise to give Nelson A. Rockefeller a significant role did not work out, leading Ford to dump the vice president from the 1976 ticket.  And that was all in a decade!  By the mid-1970s, presidential historian Arthur M. Schlesinger, Jr. judged the vice presidency fatally flawed and favored abolishing it.

Carter and Mondale had a different idea.  Carter knew he needed help and thought Mondale could provide it.  But Carter lacked a blueprint to make the vice presidency functional, and Rockefeller’s unhappy experience demonstrated that presidential intentions and hopes alone wouldn’t elevate the second office without a sensible vision appropriately resourced and well-implemented.

Mondale provided the vision which he conceived by looking at the office in a novel way.   Whereas past vice presidents had tried to structure a role to empower themselves and position themselves as presidents-in-waiting, Mondale asked instead how he could help Carter succeed. Whereas prior vice presidents thought power would come from managing some government programs, Mondale rejected that approach.  He concluded that the vice president could contribute by being a general adviser to the president and by handling special presidential assignments.

Mondale’s insight reflected his faith that the presidency could promote positive change, coupled with his belief that presidents needed help, which a properly equipped vice president could uniquely provide.  Mondale thought recent presidents often lacked good and candid advice.  Other officials had departmental biases which skewed their perspectives, and people tended to avoid giving presidents advice they didn’t want to hear.  Mondale believed a vice president, as a senior political leader who shared the president’s interests and could consider the full range of problems that came to the Oval Office, was positioned to give the president unique and critical perspectives.  And he suggested that the vice president’s stature could enable him to discharge high-level presidential assignments.

Carter welcomed Mondale’s vision and gave him the access, information, and support to succeed.  He brought Mondale into the West Wing and his inner circle, invited him to any meeting on his schedule, saw him privately whenever Mondale wanted, gave Mondale all briefing papers Carter got, and insisted that administration officials treat Mondale as they would Carter.

Prior vice presidents had echoed John Adams’ complaint that they could do neither good nor evil, but Mondale did a lot of good.  He became Carter’s most important general adviser and gave him critical advice during their private meetings.  Mondale helped secure ratification of the Panama Canal treaties and create the Department of Education, fought to fund social programs, and worked with Carter to produce the Camp David Accords between Israel and Egypt.  Carter sent him on major diplomatic missions, including a highly successful trip to China to help normalize relations.

Yet more than any other initiative, Mondale’s work rescuing Southeast Asian refugees demonstrated the capacity of the vice presidency to align American performance with its high ideals. Mondale persuaded Carter to allow him to work on the problem before it had attracted much attention.  With Carter’s support, Mondale motivated the State Department to denounce mistreatment of these refugees and got the Pentagon to dispatch the Seventh Fleet to rescue boat people facing perilous seas.  He persuaded Carter to seek more funding for refugee resettlement and to increase the number of refugees the United States would accept.

Mondale took those demonstrations of American resolve to Geneva, where he headed America’s delegation to a U.N. Conference on Indochinese refugees.  After spending the first day lobbying other nations, Mondale gave one of the most stirring speeches in vice-presidential history.  He likened the situation to the one the world faced in 1938, when nations failed to intervene to save Jews from the Nazis.  Mondale proposed a seven-point international response and implored the world to act.  “Let us do something meaningful — something profound — to stem this misery. We face a world problem. Let us fashion a world solution.  History will not forgive us if we fail. History will not forget us if we succeed.” America’s leadership and Mondale’s speech persuaded other nations to join in addressing the humanitarian crisis.

After Carter and Mondale lost their re-election bid in 1980 to former Governor Ronald Reagan and former Ambassador George H.W. Bush, Mondale and his team proceeded to educate the incoming administration on their reinvention of the vice presidency.  Neither disappointment over the defeat nor the likelihood that Mondale would run in 1984 against Reagan (or, considering Reagan’s age, Bush), deterred Mondale from sharing the ins and outs of the new vice-presidential model.  Mondale saw Reagan and Bush as America’s new leaders, not his partisan rivals.  On their inauguration day, Bush said he hoped to imitate Mondale’s vice-presidential model.

Carter and Mondale demonstrated that the vice presidency was important not because the vice president might become president but because he or she could do good as vice president.  The model Carter and Mondale created of a “White House vice president” has largely defined the role of Mondale’s seven successors, three Democrats and four Republicans.  That’s not to say they have matched Mondale’s contributions to, or in, the office.  Those reflected the ideals, skill and character of a historic public servant and a special human being.

copyright Joel K. Goldstein, 2021

Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180004
The Bloody Handkerchief

The inscriptions read, “Eliza to Corinne Pickett” and “L.P.Walker to Eliza.”


Leroy Pope Walker first claimed my attention not from The Pile of documents in my closet but from my silverware drawer, where his name is engraved on a silver serving spoon: “L.P. Walker to Eliza.”  It kept company in the drawer with another serving spoon, this one engraved “Corinne to Eliza.”  I knew these were family names, but that was all.

I was getting ready to trace this path of my family history when my husband offered to take on some of the research.  I showed him the names on my spoons, and then left to go out for the evening.  By the time I returned, Peter had found the answer.  He’d typed L.P. Walker into the Google search box and up had popped an entry in Wikipedia.  L.P. Walker, it turns out, was Leroy Pope Walker, the first Secretary of War of the Confederacy.  I was stunned.  This was the man who ordered the bombardment of Ft. Sumter on April 12, 1861, starting the Civil War.  I couldn’t remember ever having heard his name.   How the hell did he end up in my silverware drawer? 



Walker’s wife was Eliza Dickson Pickett, of “L.P. Walker to Eliza” on my spoon.  This Eliza was a first cousin of the Eliza on the other spoon, my great-grandmother Eliza Ward Pickett – the mother-in-law of my beloved paternal grandmother Blanche.  Looking back, I imagine Granny probably told me the provenance of the silver I was to inherit.  I just hadn’t paid much attention.  That was the past.  My life was about the future.    

The two Elizas now nestled together in my silverware drawer were near contemporaries. Both were married to men of consequence in Alabama who were on opposite sides of the burning question of the day: whether their state should stay in the Union or secede.  

Eliza Ward Pickett’s husband, my father’s paternal grandfather Edwin Banks, was strongly for staying in the Union.  L.P. Walker, married to Eliza Dickson Pickett, was an ardent Secessionist.  He was from a wealthy and influential planter family near what is now Huntsville.  As a lawyer active in politics, Leroy chaired the Alabama delegation to the 1860 Democratic National Convention, where he helped lead a pro-slavery walk-out.  After Abraham Lincoln was elected, Walker joined the Confederate cabinet.

L.P. Walker was not Jefferson Davis’s first choice for Secretary of War: he was offered the job only after two other candidates had passed it up. Davis soon had reason to regret that Walker had said yes.  In one of his many blunders upon taking office, Walker gave a speech in which he famously prophesied not only that the South would win, but also that the Civil War would be over so quickly that he’d be able to sop up any blood that was spilled with his handkerchief.

He made an equally reckless declaration while the bombardment of Fort Sumter was underway:  "No man can tell where the war this day commenced will end, but I will prophesy that the flag which now flaunts the breeze here will float over the dome of the old Capitol at Washington before the first of May." 


The flag that survived the bombardment of Ft. Sumter.


Walker proved no better at administration than he was at prediction. He clashed with President Davis and soon quit as Secretary of War before he could be fired.  As a consolation, he was commissioned as a brigadier general, but his military career went no better than his stint as Secretary, and he resigned his commission abruptly in 1862.

It was beyond unnerving to realize that I’d  harbored the Confederate Secretary of War in a silverware drawer for so many years.  “Touching our food!” my daughter said. Just so.  Touching our food.  As James Baldwin wrote in his brilliant essay “White Man’s Guilt:” “The great force of history comes from the fact that we carry it within us, are unconsciously controlled by it in many ways, and history is literally present in all that we do.”

This connection to the past was more intimate than any of the documents in The Pile.  A piece of paper might represent something; a spoon is something.  I had regarded my inherited serving spoons through a dreamy haze, appreciating the inscriptions because they hinted at ancestral mysteries – precisely because I didn't know the stories behind them.  But once you know things, you can’t unknow them.  All you can do is learn more.


Read more about Ann's Confederates In My Closet on her website. 


Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/blog/154493
Whither Germany? Historiography and Public Reckoning with the National Past


Reviews of the German historian Hedwig Richter’s new book Demokratie: eine Deutsche Affäre (Democracy: A German Affair) have started an important debate in Germany about the relationship between democracy and dictatorship in German history. This article translates and summarizes the current conversation in German academic and media circles. It also looks at recent newspaper articles to illustrate how some historians link arguments about the past with prescriptions for the present.


Germany, like so many other western nations, faces uncertain times. After sixteen years of even-keeled leadership, Angela Merkel will step down as Chancellor this fall. A complex electoral situation shows Merkel’s CDU/CSU alliance losing ground over claims of mismanagement during the COVID pandemic. Electoral unknowns include the meteoric rise of the Green Party, the continued withering of the Social Democrats, and the likelihood that the far-right “Alternative for Germany” (AfD) will retain representation in parliament. So far, nothing unusual. Germany, like all of its European partners, faces important challenges: populist nationalism and xenophobia, dilemmas over European integration, navigating the pandemic, addressing climate change and the turn to renewable energies, and perhaps renegotiating the country’s relationship with the United States after the presidency of Donald Trump.

The end of the Merkel era, America’s descent into populist nationalism, and perhaps Brexit as well signal another important (and perhaps unique) problem for Germany: what role should Germany now play on the global stage, or in promoting further integration within the European Union? Should Germany have a more robust defense policy independent of the United States and NATO? Should Germany take the lead on international agreements to address climate change? Debates about the independent use of the German military, the European Union, and unilateral diplomacy have been a matter of introspection for many Germans since the end of the Second World War.

Enter the historian Hedwig Richter (University of the Bundeswehr, Munich) and her new book Demokratie: eine Deutsche Affäre [Democracy: A German Affair] (2020). As the title suggests, the book is about Germany’s long “affair” with democracy. In it, Richter suggests Germany has had a much longer relationship with democracy than has previously been acknowledged. Rather than writing the history of Germany as a “long road west,” as Heinrich August Winkler famously did, Richter claims that Germany has always been part of the west, and that the western tradition includes positive norms such as equality, freedom, and justice as well as obviously negative traits including racism, militarism, and nationalism. Little can stand in the way of Richter’s account of Germany’s long relationship with democracy. The Wilhelmine Empire, in Richter’s reading, was in fact a state more defined by the democratic politics of mass participation than an authoritarian constitutional monarchy offering a prime example of the ‘persistence of the old regime.’ In the same vein of democracy as an irresistible force, Nazism was not so much a break with the democratic traditions of the Weimar Republic as a “totalitarian democracy” built on mass politics.

Around this core conceptualization, Richter pursues four theses. First, that genuine democratic reform in Germany was not a project of revolutionaries and the masses, but rather an elite-driven phenomenon. Second, that more often than not popular revolutions in German history failed (1848 and 1989 excepted) and reform has been more successful in achieving democratic transitions. Third, Richter connects the history of democracy to the history of the body, “its mishandling, its care, its wanting, and its dignity.” Finally, she claims that the history of democracy must be an international history looking beyond national borders.

These theses all reject the argument of Germany’s “special path” (Sonderweg) to modernity. This thesis, from the post-1945 period, held that Germany was a “belated nation” whose delayed national unification in 1871 and late industrialization fostered a continuity of pre-industrial elites, preventing a successful bourgeois rights revolution akin to France and Great Britain. While the best historiography from the period always avoided the teleological line ‘between Luther and Hitler,’ special path historiography posited there were some continuities of German history from the Wilhelmine period (1871-1918) that helped explain Hitler’s rise to power and the subsequent support for the Nazi dictatorship.

Written with a popular audience in mind, Richter’s book surveys Germany’s relationship to democracy from the end of the eighteenth century to the present. The book has received plaudits in the form of the prestigious Anna Krüger Prize and a nomination for the Bavarian Book Prize. Despite some mixed reviews from media outlets, the book has been very commercially successful, and Richter has been a guest on numerous public outlets in Germany.

For all its public praise, Democracy: A German Affair has come under serious scrutiny from some of the big names of German history. At the beginning of February 2021, Christian Jansen (University of Trier) wrote a review for the popular online German news forum H-Soz-Kult and concurrently published an even longer review on his personal Academia page. In that review, he criticized Richter for ignoring historiography, using language unbefitting a serious academic book, terminological imprecision, and general factual discrepancies and exclusions of inconvenient facts. In March, the director of the Institute for Contemporary History, Andreas Wirsching, wrote another negative review in the online German journal Sehepunkte. He suggested that the only affair—in the scandalous meaning of the word—surrounding the book was that it was published by a renowned German publisher (C.H. Beck), treated uncritically in many newspaper reviews, and deemed prizeworthy.

While never mentioning the book by name, the emeritus historian Ulrich Herbert (Freiburg) was asked about recent discussions regarding the relationship between National Socialism and democracy:

Democracy is something different than the mobilization of the masses. While national conservatives and pre-industrial elites—not just in Germany—saw approaching doom in the rise of the masses, the National Socialists relied entirely on the dynamism of the mass movement. The concept of democracy is devalued if one sees an element of “democracy” in the rule of the Nazi regime, which destroyed all democratic institutions, got rid of free voting, and imprisoned and killed hundreds of thousands of Germans.

In yet another recent interview, the emeritus historian Jürgen Kocka (Free University of Berlin) offered a dualistic assessment of the recent controversy about the book, focusing his attention more on the reception of the chapters around the Wilhelmine Empire:

One must accept that the Wilhelmine Empire was both an authoritarian militarist-and-civil-servant state, which pursued aggressive colonial politics and bred extreme nationalism until the First World War; at the same time, it was the casing for enormous economic ascent, for quick societal and cultural changes, for a great awakening and emancipation. The new research has found a great deal of new material about the second aspect but occasionally loses the first from its view.

The dispute also revealed some methodological quarrels which might sound familiar to an American audience. More positive reviewers praised Richter for her ‘imaginative writing’ and her ability to provide a digestible history in a field saturated with dense academic writing. As much as anything, Richter’s defenders took a stand against the tone of critical reviews. In the Frankfurter Allgemeine Zeitung, the history correspondent Patrick Bahners wrote a personal polemic against Jansen and Wirsching in response to their supposedly overblown criticism of Democracy: A German Affair. Under the title “An Inverted Stab-in-the-Back Myth,” Bahners accused Jansen and Wirsching of mischaracterizing the very mixed reviews newspapers had given Richter’s work. The personal tone of his polemic was, ironically, unmatched by any of the reviews.

Hedwig Richter is a young, media-savvy female historian working in a field where many of the important positions are still held by older men. Is her addition to the historiography a new and creative way to bring history to a broader audience with an imaginative thesis meant to reconceptualize German history, or is it a violation of academic standards with an ill-defined and poorly articulated methodology? Not for the first time, the profession of history finds dissenters in the ranks of some journalists, public intellectuals, and opinion-makers.

A recent set of articles in the German newspapers Die Zeit and Süddeutsche Zeitung revealed just how much the historical-methodological debates around Richter’s book had been about revising modern Germany’s democratic self-perception. Under the title “The Angst about the Volk,” Richter and the associate editor of Die Zeit Bernd Ulrich complained that the idea of Germany’s “special path” to modernity is “chained to the Raison d'être of the Federal Republic” in a way that makes German political and media elites distrustful of the German people.

Why do German politics have such an angst about the Volk that they do not dare say anything about Corona or the climate, let alone do anything about them in time? Why does Germany fluctuate between admiration and dismay at the United States? Why is it not able to field a deployment-ready army? All of these things have different reasons and yet one common ground: an overwhelmed and one-sided relationship to Germany’s own past.

Germany tells the story of its past as “not a hero-or-victim history, but rather as one of guilt and purification.” This was somewhat for the good: “to recognize their guilt rather than relativizing, to accept responsibility for the terror of National Socialism and for the unique crime of the Holocaust has made Germans more humble, civil, and reflective—and carried them to their rehabilitation after 1945.” But this recognition was taken too far by people who pursued the special path thesis of German history:

In the public sphere there is a specific meaning of German guilt, in which the depth of the civilizational break is expanded with a bold causal logic into the depths of German history. Especially since the Wilhelmine Empire, but also where possible since Herder or Luther or even since the German Tribes in their dark forests, the Germans pursued their special path and manifested an almost mystical Germanness of difference and danger. The acceptance of a special Volk has important ramifications: up until today it sustains a hysteria and paralysis of the Federal Republic. 

The special path historiography had a pernicious impact on Germany up until today, according to Richter and Ulrich:

International comparative studies have shown in recent years that every country has its own history and its own uniqueness, but that at the same time parallel changes occurred across the north Atlantic region. And to stay with the era of the Wilhelmine Empire: antisemitism, racism, or militarism can barely be analyzed without looking at their global aggressiveness.

It is important not to simplify in national terms the good or bad phenomena of high modernity around 1900. Modern states developed, on the one hand, an impressive power of inclusion and participatory attractiveness; they bound a large part of the people to the state through national feeling and made possible the development of economic, scientific, and intellectual competencies. On the other side, these states became brutal machines of war, which conquered continents and robbed or annihilated other peoples. It is a violence that—next to authoritarian elites—the masses and democratic elements demanded, and that is open to Fascist attempts at rule.

Richter and Ulrich argued that the special path thesis overlooked the unique short-term historical circumstances explaining the rise of the Nazis. About the identity of this group who simplified German history, Richter and Ulrich had this to say:

That the thesis of a special path, of the destiny-driven predisposition of the Germans for barbarity and of an inevitable path from the Wilhelmine Empire to National Socialism, finds few voices and has almost entirely disappeared from the historical profession does not mitigate its power in politics and media. The leading generation of political and media figures were raised with this view of Germany and the world and chained it to the Raison d'être of the Federal Republic. Only the special path accounts, in short, protected the Republic from a relativization of the Holocaust and prevented a relapse into Fascism.

It is not the historical profession, but rather the “leading generation of political and media figures” who hold this view of Germany.

The historians Christina Morina (Bielefeld University) and Dietmar Süß (University of Augsburg) published a retort to this article with a piece in the Süddeutsche Zeitung called “German Spring.” The piece attacked what they see as Richter and Ulrich’s call for “a radical turn in the [German] culture of memory” about the Nazi period. First, they argued the piece in Die Zeit was bad history, as it “sacrifice[d] a differentiated view of the complexities of the present and history for a throwaway line.” Richter and Ulrich’s piece created a strawman of Sonderweg theory, held in fact by no particular German historian:

The historical discussion about the “special path” was about the question of structural foundations and societal bases for the rise of National Socialism. It dealt with Germany as a ‘delayed nation,’ democratic deficits, and the meaning of the authoritarian state and militarism. The theses of the fifties, in which a straight line led from Luther to Hitler, found almost no engagement … It [the ‘special path’] is supposed to be a reason for the “Angst about the Volk” felt in the “leading” circles of politics and media. Who is meant by this? Die Zeit? Ministerial civil servants in the Federal Defense Ministry? The German World Service? The Government? The question remains open. Empirical evidence: nothing doing.

Accordingly, in their account of the “special path” Richter and Ulrich ignored the core of that historical controversy: why did such a large part of middle-class society, white-collar workers, civil servants, and also workers so willingly associate themselves with the Völkisch movement of Adolf Hitler? Why did racism and antisemitism find such fruitful ground in Germany? Why, only here, did such a murderous stew emerge out of a longing for ‘community,’ for ‘leadership,’ and a radical ‘decency’? With reference to the underestimated democratic traditions in Germany and the relative ‘normality’ of racism and nationalism in other countries, one does not do justice to the matter. Rather, the question of specific German histories of violence is lost in a universal no-man’s-land.

To Morina and Süß, the argument about the place of the special path of German history was specious. Instead, the main point of the article was to revise the view of the German past with an eye to the German present:

The considerations of Hedwig Richter and Bernd Ulrich therefore represent nothing less than a deep historical-political cut. With their plea to break the “chains” of the until-now “Raison d'être” of the Federal Republic, and to dissolve the until-now valid “zombie paradigms,” they call into question the fundamental consensus of a basically self-critical reckoning with the Nazi period… Those who are interested in a ‘more sovereign’ Germany, a more forward-looking commitment to Europe, a more robust foreign policy, a more creative crisis management, and more justice should not sacrifice a critical relationship to their own history. Inner sovereignty and democratic creative power do not benefit from rhetorical swaggering and nationally-tinted quick-fix visions, but from the ability and will to achieve the highest possible degree of participation, individual freedom, and common good.

It took a long time for the realization to prevail in this country that National Socialism did not break over the Germans like a natural disaster. To treat the critical visualization of this history as a burden is nothing other than the old call for a thick line to be drawn under the past. The fact that the German history of democracy is being mobilized for this is what makes its advancement particularly embarrassing.

Morina and Süß’s critique of the article in Die Zeit lays out, perhaps more clearly, the stakes of the current debate in Germany. If it is true that Germany is in “chains” from a sixty-year-old historical dispute about the continuities of German history which leads to “zombie paradigms” about the German people on the part of political and media elites, then the politics of memory in Germany do need radical rethinking. If, on the other hand, this is an artificial crisis created by a historian and a journalist then one has to ask: who does it serve?

A basic question seems to be emerging from the ongoing disputes: what does the culture of critical historical self-reflection about the German past have to do with solving the pressing issues in contemporary Germany? To Christina Morina and Dietmar Süß, very little. Germany can have a more participatory democracy, a debate about independence from the United States, and a more active role in the EU without sacrificing an unflinchingly critical examination of the German past. To Bernd Ulrich and Hedwig Richter, Germany remains shackled by its past, with leaders who do not trust the German people because of their supposed proclivities towards authoritarian demagoguery. The identity of these Germanophobic German leaders is unclear. One thing is undeniable: the central role of commemorating the Nazi past in modern German public life is a very new and precarious phenomenon, and its current place in German public consciousness can never be taken for granted. This is neither the first nor the last time that a historian has recommended changing Germany’s historical self-perception. The historian Christian Goschler made the point that what is perhaps more novel in the recent controversy is that it reveals the fault-lines of public trust in academic work and professional gatekeeping. This problem of the culture of remembrance also touches larger debates about truth claims, scientific research, and the dissemination of knowledge.

The methodological issues debated by our German colleagues need to be heard in American universities. For Americans living in the era of Black Lives Matter, police violence, voter suppression, discussions about reparations for slavery, and a critical reexamination of the nation’s founding it might be useful to think about the international comparative dimensions of coming to terms with troubled pasts and the role historians are currently playing in this process elsewhere.

Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180010
Russia vs. Ukraine Redux? Mapping the Way Forward from the Recent Past

"Little Green Men" – masked and unbadged soldiers of the Russian Federation near the Ukrainian military base at Perevalne, Crimea, March 9, 2014.

Photo Anton Holoborodko (Антон Голобородько) CC BY-SA 3.0



In April 2014, because of Ukrainian-Russian tensions, I quoted historian Stephen Cohen: “I think that we are three steps from war with Russia, two steps from a Cuban missile crisis.”  Now, seven years later, New York Times Moscow correspondent Andrew Kramer writes of “the largest military buildup along the border with Ukraine since the outset of Kyiv’s war with Russian-backed separatists seven years ago. . . . The [Russian] mobilization is setting off alarms in the North Atlantic Treaty Organization, European capitals and Washington.”


On 20 April Ukrainian President Volodymyr Zelensky warned Ukrainian TV viewers of the possibility of war, urged Russian President Putin “to step back from the brink,” and proposed meeting with him. Only on 22 April did Russian Defense Minister Sergei Shoigu announce that Russia would begin withdrawing military forces from the Ukrainian border. 


A key question is Russian President Vladimir Putin’s motivation for Russia’s massive buildup, and we will return to it later in this essay. But first, as with many international crises, we need to understand its historical background.


Thus, to begin with, a brief survey of past Russian-Ukrainian relations. For several centuries, Ukraine was part of Russia. Soon after the collapse of that multi-ethnic Tsarist empire in 1917, and the creation of the USSR in 1922, Ukraine became one of the tightly Communist-controlled Soviet republics and remained so until the Soviet Union’s 15 republics broke up into separate countries in 1991. In the thirty years of its existence as an independent state, Ukraine has see-sawed in its relationship with Russia and the Western powers that dominate NATO.


Part of the reason for this see-sawing is the ethnic makeup of Ukraine. According to the 2001 Ukrainian Census, in western regions less than 5 percent of the people were ethnic Russians; in a few of the most eastern regions almost 40 percent were. Many others, in both eastern and western Ukraine, have both Ukrainian and Russian ancestors. In addition, many who consider themselves ethnic Ukrainians list Russian as their main language.  In general, citizens in western Ukraine have been more favorable to closer Ukrainian ties with the West, including possible European Union (EU) and NATO membership, while those in eastern Ukraine have been more likely than western Ukrainians to favor closer Russian ties.


In 2014, after Ukrainian President Viktor Yanukovych backed away from a closer relationship with the EU, protests in Kyiv led to his ousting. Putin, long concerned with growing Western influence in Ukraine and other former parts of the USSR, claimed that the West was behind Yanukovych’s removal from power. Putin then encouraged a secession vote in Crimea, which was part of Ukraine but whose population was more than 50 percent ethnic Russian. After Crimea voted to secede, Russia annexed it. Two other Ukrainian areas, aided by Russian “volunteers,” also chose to secede, both in the eastern Ukrainian Donbas region bordering Russia.


Those secession movements proclaimed the establishment of the Donetsk and Luhansk Peoples’ Republics (DPR and LPR), which in 2014 fought against Kyiv-backed troops and still exist today. Both republics contain only part of the regions (oblasti) in which they exist, the other parts being controlled by the Ukrainian government and its troops. As a result of the conflict, which has simmered from 2014 to the present, around 14,000 people have died and over 2 million have fled the regions, going either to Russia or to other parts of Ukraine.


A survey carried out in early 2021 by U.S. and western European scholars, including Gerard Toal (see below), reveals something of the people’s wishes in the Ukrainian-controlled (U) and rebel (R) areas. According to it, about three-fourths of those in the U area think the two “republics” should be returned to Ukraine (with or without Ukraine granting them some autonomy). In contrast, in the R areas a slight majority want to join Russia, either with or without some autonomous status. Comparing their survey with an earlier one from 2019, the scholars who conducted the 2021 poll conclude that the “desire to return to Ukrainian government control in the breakaway territories appears to be fading.” Thus, even if and when Russia reduces troop levels in border areas, Russian-Ukrainian tensions regarding the self-proclaimed republics will continue.


Also important is how Russian citizens feel about the status of the DPR and LPR. According to an April 2021 survey by the respected Levada Center, a little over one-fourth of those surveyed thought the two self-proclaimed republics should be recognized as an independent state or states; about half of the survey sample were split almost evenly between those who thought they should become part of Russia or part of Ukraine (with or without Ukraine granting them some autonomy), and a little less than one-fourth were unable or unwilling to choose any of the above options. Thus, public opinion in Russia is not pushing Putin toward annexing these two “republics” or setting them up as Russian satellite states.


As mentioned earlier, understanding Putin’s motivation for massing large numbers of Russian troops along the border is important. Significant in this regard is Zachary Shore’s A Sense of the Enemy: The High Stakes History of Reading Your Rival's Mind (2014). In a review of it, I quoted his words, “Understanding what truly drives others to act as they do is a necessary ingredient for resolving most conflicts where force is not desired. It is, in truth, an essential first step toward constructing a lasting peace.”


In a recent interview, former U.S. Ambassador to Ukraine William Taylor said, “no one can get inside the head of Mr. Putin. No one understands his motivations.” To some extent this is true, but as I pointed out in 2015 there is a “Surprising Consensus on What the Experts Say about Putin.” He is a pragmatist, not an ideologue. He intends to restore Russia to major power status. He believes that Russia and many other countries require a strong state government and encourages patriotism to support it. He is deeply distrustful of Western-style democracy, the United States, and NATO and believes that Russia offers a unique civilizational model that differs from—but is equally valid to—that of the West. He is especially distrustful of any talk of bringing Ukraine into NATO, and he suspects major Western countries of trying to weaken him and Russia. His past experience as a KGB officer encourages his suspicious, conspiratorial views.

Since 2015 we have had the Trump presidency, which brought with it President Trump’s surprising reluctance to find any fault with Putin. But now, with Joe Biden as U.S. president, U.S.-Russian relations are back on more familiar terrain. And Putin’s mindset does not seem to have changed much since the late Obama presidency.  

So where does that leave us in figuring out what’s up with this spring’s Russian military buildup on the Ukrainian frontier? As usual in trying to analyze crisis situations--or current affairs generally--determining reliable sources of information and analysis is of vital importance. (See, e.g., my 2014 op-ed “Whose Advice Should You Trust on Ukraine?”)

One whose expertise I value is Dmitri Trenin, head of Moscow’s Carnegie Center. Andrew Kramer cites him in the essay mentioned above, and Georgetown’s Angela Stent has called him “one of Russia’s most astute foreign policy observers.”

In an April 13 essay, Trenin indicates that “it was [Ukrainian President] Zelensky who moved first.” He mentions Zelensky closing down some Russian-language TV stations, charging some pro-Russian opposition “leaders with high treason,” and approving, in February, military moves close “to the conflict zone in Donbas.” These actions were

“enough to get him noticed in Moscow. . . . Even if Ukraine cannot seriously hope to win the war in Donbas, it can successfully provoke Russia into action. This, in turn, would produce a knee-jerk reaction from Ukraine’s Western supporters and further aggravate Moscow’s relations, particularly with Europe. . . . Being seen as a victim of Russian aggression and presenting itself as a frontline state checking Russia’s further advance toward Europe is a major asset of Kyiv’s foreign policy.” 

Although Trenin does not mention it in his essay, in early April Zelensky asked that NATO approve of a membership plan for Ukraine’s joining that organization, arguing that such a step would deter Russian aggression.

Trenin speculates that “the Kremlin decided to seize upon” Zelensky’s pre-April moves

“to raise the stakes. . . . Moscow felt it had nothing to lose and something to gain by acting boldly and on a larger scale. Russia decided not so much to test the new U.S. president as to warn him early on of the dangers involved regarding Ukraine.”

Trenin adds that

“it’s unlikely that Putin was bluffing when he said that a major attack against Donetsk and Luhansk would provoke a massive Russian response with catastrophic consequences for Ukraine.”

On the other hand, “Moscow will not formally recognize the [self-proclaimed] republics or allow them to accede to Russia.” Trenin does not view Putin as a warmonger: The main reason for the Russian border mobilization

“was to prevent the need to actually go to war against Ukraine in the future. . . . Going overkill in terms of military maneuvers on the Ukrainian border now may avoid the need to do terrible things at a later point. Under that same logic, doing nothing now would sow uncertainty and invite trouble, while doing nothing when trouble arrives would be suicidal for the Kremlin leadership.”

Besides Trenin, I also value the opinions of Gerard Toal, mentioned earlier as one of the scholars involved in a 2021 Donbas survey. Earlier, in 2017, he authored Near Abroad: Putin, the West, and the Contest over Ukraine and the Caucasus. In a review of it, I quoted favorably his words about trying to view objectively Putin’s motivations. Understanding them does not mean approving of them and does not provide the Biden administration and NATO with any detailed plan for negotiating with Putin and Ukraine.

But it does suggest some do’s and don’ts. Among the latter: don’t, if you’re the U.S. or NATO, support a Ukrainian military attack against the Donetsk and Luhansk Peoples’ Republics, and don’t bring Ukraine into NATO. Former ambassador to Russia Jack Matlock (named by President Reagan and serving from 1987 to 1991) once stated that NATO expansion and talk of bringing Ukraine into NATO have been “Putin’s main concern” and that such talk has been “irresponsible.”

A “do,” as I indicated in the ending of a 2014 op-ed on U.S. policy on Ukraine, is the exercise of “more political imagination—on all sides.” Matlock and two other former U.S. ambassadors to Russia noted in 2014 that U.S.-Russian relations had “descended into attempts by each side to pressure the other, tit-for-tat actions, shrill propaganda statements, and the steady diminution of engagement between the two governments and societies. . . . What the Western strategy lacks is an equally vigorous diplomatic approach to ending this conflict.” Such an approach might be similar to that taken by the USA, the USSR, the UK, and France in 1955 when they all agreed to pull out of occupied Austria provided it remained neutral. Austria agreed. It was a win for Austria--and a win for each of the former occupying powers.

Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180005 https://historynewsnetwork.org/article/180005 0
Is the Presidency a License to Kill? War Powers and the Constitution

George W. Bush was not alone among modern presidents in waging war without the constitutionally-mandated Congressional declaration of war.




George Washington presided as delegates to the Federal Convention in Philadelphia drew up the United States Constitution in some four months of 1787. Signers numbered 39, of 55 who attended, representing 12 states (Rhode Island absent).


History taught them “the executive is the branch of power most interested in war, and most prone to it” (James Madison). Monarchs often made war “for purposes and objects merely personal, such as a thirst for military glory, revenge for personal affronts, ambition ...” (John Jay). To discourage war, delegates allowed only Congress “to declare [i.e. initiate] war” (Article I, Section 8, Clause 11).


But they made two errors, one of commission, the other omission:


  • They gave one man all executive power, including command of the military. In 1789 the first Congress had 59 representatives and 22 senators; it assigned the Supreme Court 6 justices. Members now number 435, 100, and 9 respectively. Yet the executive has always been one man. While not king, he has become a ruler with more war power than George III had.
  • They failed to foresee abuse of the president’s military function and explicitly guard congressional war power from his encroachment. The impeachment process is available, but it has never been invoked for the high crime of illegal war. Anyway, what good would it do after a nuclear catastrophe?

    Some delegates opposed a one-man executive. Governor Edmund Randolph of Virginia called it “the foetus of monarchy.” He and Virginian George Mason favored giving three men joint executive power. Others, notably James Madison and Benjamin Franklin, wanted at least a council to assist the executive.


    Pennsylvanian James Wilson’s insistence on a solitary executive won out. He called it “the best safeguard against tyranny.” Perhaps Wilson et al. assumed that Washington would become president, setting high standards. Seven states assented (Delaware, Maryland, and New York did not).


    The delegates made the president the army and navy’s “commander-in-chief,” a historic, strictly military position. Convention records don’t explain why they thought everyone elected president would be qualified—or why they trusted him not to misuse the power and initiate war.


    Washington and other early presidents respected Congress’s constitutional war power. Some presidents, from Polk to Franklin Roosevelt, undermined it by provocation or circumvention. Outright defiance began with Truman. Every subsequent president has emulated him in some way. Yet all presidents take the oath to “preserve, protect and defend the Constitution” required by Article II, Section 1, Paragraph 8.


    The delegates did their best. Theirs was a noble experiment. The experiment failed.


    Biden Upholds the Constitution


    Three decades ago, President George Herbert Walker Bush massed troops in Saudi Arabia, preparing to wage war on Iraq. He railed against its invasion of Kuwait, ignoring his own invasion of Panama a year earlier. Bush claimed the authority to start a war as military commander-in-chief.


    On constitutional grounds, 54 members of Congress, led by Rep. Ron Dellums (D-California), sued to prevent the conflict. Federal Judge Harold H. Greene found the plaintiffs justified (12/13/90): The Constitution’s framers “felt it unwise to entrust the momentous power to involve the nation in a war to the president alone.” Hence “the clause granting to the Congress, and to it alone, the authority to decide war.”


    Greene refused to issue an injunction, however: Until a majority of Congress acted and war was certain, the case was not “ripe.” At least, like early U.S. judges, he tackled the constitutional issue. Modern courts usually duck such cases, saying plaintiffs lack standing to sue, raise a “political question,” et cetera.


    Senator Joseph R. Biden Jr. (D-Delaware), chairman of the Senate Judiciary Committee, called a hearing on constitutional war power (1/8/91) and said, “In England the king alone could decide to take a nation to war.” Here “the war power rests in the Congress.... The Constitution’s founders all understood this to be a key principle of our republic.”


    Biden quoted Alexander Hamilton: “commander-in-chief” amounts to just “supreme command and direction of the military and naval forces as first general and admiral....”  The president lacks the British king’s powers to declare war and raise and regulate fleets and armies—powers our Constitution gives the legislature (The Federalist, 69).


    Biden went on: “Americans once lived under a system where one man had unfettered choice to decide by himself whether we go to war ... and we launched a revolution to free ourselves from the tyranny of such a system.”


    Bush relented and allowed a congressional vote. It went his way, helped by testimony falsely alleging atrocities by Iraqi soldiers. Biden voted no.


    Biden Ignores the Constitution


    Biden’s 2007 memoir says he pressed President Clinton to bomb Serbia. Clinton did so in 1999, ignoring Congress. Biden urged him to keep it up.


    In 2002, as chairman of the Senate Foreign Relations Committee, Biden supported a resolution (originating in the White House) to let President George W. Bush decide whether to fight Iraq. The measure, adopted, violated the Constitution. As Biden had insisted 11 years earlier, such a decision was for Congress to make.


    In the 2020 debates, former Vice President Biden made known he would send forces into combat “very, very reluctantly” and only when the “overwhelming, vital interests of the United States are at stake.” He did not mention Congress.


    Last February 25, 36 days after inauguration, President Biden bombed Syria, reportedly killing 22 people believed to be “Iran-backed militants.” (Victims of our “precision-guided munitions” are seldom identified as children, women, or peaceable men.)


    Spokesmen gave various imaginative explanations for the aggression. It was “defensive” and “retaliation” for an attack on U.S. forces in Iraq (though not an attack by Syria). It aimed to “deescalate” the regional situation and “to send a message to Iran.” What did that message say? Forget my pre-election promise of peace? An e-mail would have been clearer and saved a bundle. Nobody explained what overwhelming, vital interests of the U.S. were served by taking the 22 lives.


    Based on Biden’s 1991 rhetoric, that act of war was also an act of “tyranny.” Such tyranny has arguably spanned some four score years, irrespective of the president’s political affiliation.


    In Congress, reactions crossed party lines. The raid drew both praise for avenging attacks and condemnation for violating constitutional war power. Prior congressional approval would have given the attack constitutional legitimacy, though it would not have sanctified it. Several U.S. treaties prohibit aggression.


    Moreover, what about the long-suffering people of Syria, whose homeland foreign leaders have appropriated as a battleground? Ex-Representative Ron Paul (R-Texas) expressed sympathy, writing that Biden, Trump, and Obama all deserved impeachment for attacking Syria.


    Throughout their terms, both Trump and Obama conducted unauthorized military actions in Asia and Africa. Biden’s election platform promised to end “forever wars.” Syria aside, how is he doing?


    Two days after inauguration, U.S. forces conducted an “emergency response” exercise in Somalia. Five days later came a U.S.-led air attack in Iraq, killing 11 “ISIS” people.  The Afghanistan war still rages. Biden supposedly halted support for the ravaging of Yemen, yet, without congressional authority, he promises to defend the Saudi monarchy, which perpetrates it. He assures Israel he will strengthen military cooperation. He deploys bombers to Norway. Warships approach China and Russia, as Biden insults and threatens America’s top nuclear rival. (Putin, a “killer,” will “pay a price.”)


    When Bush Senior invaded Panama, the late R. W. Apple Jr., chief New York Times Washington correspondent, postulated “a presidential initiation rite” since World War II for presidents “to demonstrate their willingness to shed blood ....” All believed “the American political culture required them to show the world promptly that they carried big sticks.” Bush—accused of timidity—showed by attacking he was “capable of bold action.”


    Was that Biden’s “message”?


    However administrations change, executive war-making persists. Imagine the reaction of the Constitution’s framers if they could see the power now at the president’s fingertips—capable of destroying life on earth.


    Presidential Hit Parade


    Summarized below (in reverse chronological order) are highlights of the records of the last 14 presidents, emphasizing war and violence. Reasons for the bloodshed are largely forgotten. These vignettes illustrate the executive’s affinity for war.


                                          *       *       *       *       *


    Donald Trump ran for president favoring both peace and the killing of “terrorists” and their families. In office, he escalated existing hostilities and loosened rules of engagement, causing soaring civilian casualties. MOAB (Mother of All Bombs), the largest non-nuclear U.S. bomb, was exploded for the first time in Afghanistan (casualties unannounced). Trump picked fights in Asia and Africa, renounced arms treaties, threatened North Korea with “fire and fury,” and supported Saudi assaults on Yemen, vetoing a congressional resolution to quit. He assassinated Iran’s top general, then ordered Iran bombed but changed his mind.


    Barack Obama entered office opposing “dumb wars” but conducted them for his entire eight years, the first presidency to permit no peace. He escalated the Afghan war. His Libyan “no fly zone” became a war for regime change. He helped Saudis bomb Yemen. He plotted periodic drone assassinations in various countries and took pride in ordering Osama bin Laden shot, without trial, in Pakistan.


    George W. Bush started the Afghan war—now in its 20th year—though Congress never authorized war on Afghanistan. He then instigated America’s second war on Iraq, lying that President Saddam Hussein had “weapons of mass destruction” and ties to terrorists. Estimates of resulting fatalities reach a million-plus. Bush approved torturing prisoners.


    Bill Clinton intervened in eight countries during eight years in office. Clinton bombed Yugoslavia for 11 weeks, aided by NATO (supposedly dedicated to peace). He ignored Congress, even after it voted against upholding his war. Other victims of his: Afghanistan, Bosnia, Colombia, Haiti, Iraq, Somalia, and Sudan.


    George H. W. Bush, George W.’s father, attacked Panama, without congressional authorization. He then planned war on Iraq. Congress narrowly approved forcing Iraqi troops to leave Kuwait. As Iraqis departed, U.S. forces fired on them. Civilians in Baghdad and other cities succumbed to U.S. bombs. Bush as vice-president was heavily involved in Reagan interventions.


    Ronald Reagan entered unauthorized hostilities in Lebanon, Grenada, and Central America. As though making up for the recent loss of 240 marines in a Lebanon bomb blast, he invaded Grenada, an island nation with one two-thousandth the U.S. population. About 80 people were killed, including 20 Americans. With active CIA participation, Reagan sponsored the Nicaraguan Contras, whom he called “freedom fighters” but critics considered “terrorists.” Scandal erupted when he sold Iran arms to finance the Contras. Reagan supported the Salvadoran regime despite its massacres of citizens, sending it military aid and armed “advisors.”


    Jimmy Carter ran for office pledging to involve the American people in forming foreign policy. The only U.S. president since Hoover to wage no overt warfare, Carter covertly armed anti-Soviet fighters in Afghanistan, forerunners of al-Qaeda. He threatened force to defend U.S. interests in the Persian Gulf. Trying to rescue hostages in Iran, Carter lost eight servicemen in an air accident.


    Gerald Ford, during his short, unelected term, sacrificed 41 Marines in a needless military assault on a Cambodian island. It aimed at freeing the Mayaguez, a merchant ship seized by Cambodia, which was preparing to free her anyway.


    Richard M. Nixon and Lyndon B. Johnson before him led the U.S. Indochina war, 1964–1973. The Wall in Washington, DC, commemorates 58,279 U.S. servicemen who fell in that presidential conflict—not the Vietnamese, Cambodian, and Laotian victims, who number as many as 3.6 million. Nixon also intervened covertly in Chile, and Johnson sent troops to the Dominican Republic.


    John F. Kennedy approved the CIA’s Bay of Pigs invasion and sabotage program in Cuba. To look tough, he risked war with Russia in demanding withdrawal of missiles it had installed in Cuba, though he had put missiles in Turkey and ordered plans to nuke Russia. He sent weapons and thousands of “advisors” to South Vietnam and covertly eliminated its peace-seeking president.


    Dwight D. Eisenhower, commander of Allied European forces in World War II, inherited the Korean war, reaching an armistice in six months in 1953. He threatened to use nuclear weapons if war recurred, then made nuclear “massive retaliation” his general “defense” policy. Using the CIA, he overthrew governments in Iran and Guatemala and OK’d the invasion of Cuba. He sent military aid to Vietnam before and after the French left.


    Harry S. Truman is infamous as the first to use atomic bombs, in annihilating Hiroshima and then Nagasaki. He also launched the practice of outright presidential war-making when in 1950 he ordered combat in Korea without congressional permission, claiming authority from the UN. His war killed nearly five million, the majority civilians. Truman armed rightist regimes and French forces fighting in Indochina.


    Franklin D. Roosevelt was the last president to obtain a constitutional declaration of war (excluding Bush Senior’s reluctant OK of an Iraq war vote). However, FDR’s policies, provoking Japan economically and militarily while concentrating warships in Hawaii, apparently invited the “date which will live in infamy.” Promising 1940 voters peace, FDR executively armed Britain, engaged U-boats, and sent troops abroad. In 1939 he protested aggressors’ bombing of civilians as “inhuman barbarism.” After war was official, he ordered massive bombings of cities, killing innumerable civilians.


                                          *       *       *       *       *


    Do we elect a chief executive—or a chief executioner? No president is likely to maliciously shoot someone to death point-blank. That’s murder. But no president seems to mind ordering many people shot or bombed in a distant land. That’s war.

    Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/179978 https://historynewsnetwork.org/article/179978 0
    “Not Your Fetish”: Protesting Racism and Misogyny Against Asian American Women

    Portsmouth Square, San Francisco, 1851




    Amid surging incidents of discrimination, bigotry, and violence against Asian Americans during the COVID-19 pandemic, the deadly shootings at three Atlanta-area spas heightened outrage and fear across the country. The shock waves reverberated through the Asian American community, galvanizing nationwide vigils and protests against anti-Asian racism. Reminiscent of the pan-Asian unity that coalesced after the killing of Vincent Chin in 1982, this level of solidarity among Asian Americans in the early response to the attacks is unparalleled in Asian American history.


    Yet dissenting voices, particularly doubts about the racist nature of the shootings and their implications for racial solidarity with other minorities, began to bubble up. On WeChat—a popular social media platform among Chinese Americans—some pointed to the massage parlors’ possible involvement in the sex trade, suggesting that the victims themselves deserved some blame. Some attributed the rise of violence to a general lack of law enforcement. The Asian American Coalition for Education, a national organization known for its strong opposition to race-based education policies, blamed “racial divisions and toxic identity politics,” “a lack of adequate parenting,” and “broken families,” among other factors, in a press release.


    The reluctance to condemn anti-Asian racism appears to echo the local police’s initial statement citing the suspect’s denial of racial motivation, despite the fact that six of the eight victims were women of Asian descent. According to the police, the suspect claimed that while suffering from “sexual addiction” he saw the massage parlors as “a temptation that he wanted to eliminate.” Thus, in sharp contrast to his denial of racial intent, the suspect’s placing of blame on the spas and Asian American masseuses for his own sexual impulses revealed precisely a sentiment of racialized misogyny. When a female Asian protester quietly stood in a rallying crowd in Portsmouth Square in San Francisco’s Chinatown holding a large “Not Your Fetish” sign over her head, this hidden psychology was suddenly brought to light. The psychologically entangled attitudes of racial danger and sexual temptation toward Asian women have deep roots in American history reaching back to the mid-nineteenth century.


    When they first encountered Chinese immigrants in the Gold Rush era, many whites who were outraged by Chinese prostitution viewed Chinese women as sexually excessive and immoral. Coupled with anti-Chinese prejudice and anti-miscegenation attitudes, the severe gender imbalance of Chinese immigrants, 95% of whom were men, engendered an illicit but lucrative sex business. An estimated 70 percent of Chinese women in San Francisco in 1870 were engaged in prostitution. Despite the prevalence of white prostitution, outcry and moral condemnation of prostitution by Chinese women led to government investigations and suppression. Stereotypes of Chinese women as promiscuous prompted Congress to pass the Page Act of 1875, which defined all Chinese women as potential prostitutes, thereby prohibiting the entry of most Chinese women to the United States.


    The perception of Chinese women as immoral aliens accompanied white males’ fascination with their presumed exotic sexuality. Some Chinese brothels tailored their services and settings to attract curious white patrons. Ah Toy, “tall, English-speaking, and alluring,” was among a few Chinese madams who exploited the fascination of influential white clients to gain a reputation as the most popular Chinese courtesan in San Francisco. Her exotic charm attracted white men who lined up and paid one ounce of gold just to see her face.


    Stereotypes of Asian women as exotic and servile objects of sexual fetishization persisted and grew. After the Chinese Exclusion Act, Japanese immigrants replaced the Chinese as a major source of cheap labor in the American West. Through an examination of a popular U.S.-Japan doll exchange in the 1920s, literary scholar Erica K. Kalnay unravels the intricacy of Americans’ sexual fantasies toward Asian femininity. American popular culture also propagated the racial fetishization of Asian women. In The Toll of the Sea, a 1922 silent film, Anna May Wong portrayed Lotus Flower as a docile and sexually excessive Chinese woman who was willing to give everything, including her life, to her white male lover only after finding out he could not bring her to the U.S.


    Asian women were at once desired and feared. The fetishization of Asian women stood in stark contrast with the prevailing yellow peril trope of the early twentieth century, a result of the perceived growing threat of Japanese immigrants. The racial fear arose not only from the economic threat of labor competition, but also from the family formation of unassimilated aliens, which presented a sexual danger to American racial purity. To preserve the ideal of American homogeneity, the Immigration Act of 1924 consequently barred all Asian immigrants from entry except Filipinos, who were later subjected to severe immigration restrictions in 1934.


    During the Cold War era, militarized U.S. foreign policy in Asia created the space and conditions under which U.S. soldiers perpetrated violent sexual exploitation of Asian women. Camptowns near U.S. military bases in the Philippines, Korea, and Vietnam thrived on prostitution. From the 1950s to the 1970s, more than one million Korean women provided sex services to U.S. servicemen in Korea. Although despised as fallen women by Korean society, these military prostitutes functioned as instruments to serve the sexual demands of U.S. soldiers, who in turn protected the security interests of both countries.


    Sexual violence against Asian women committed by U.S. soldiers intertwined with their sexualization. The popular Broadway musical Miss Saigon, set in wartime Saigon, simply replayed the 1920s Lotus Flower story. U.S. servicemen’s experience of military prostitution strengthened the exotic sexual image of Asian women, who were at once fetishized objects of exploitation, submissive domestic servants, and even docile war brides. Between 1950 and 1989, close to one hundred thousand Korean military brides, many of whom came from the camptowns through brokered marriages, entered the U.S. It is no surprise that a high percentage of military brides later divorced. Left to survive with few skills and resources in a foreign land, many took up prostitution again. This phenomenon in turn reinforced the image of Asian women as embodiments of immorality and sexual danger.


    Historian Yuri Doolan has traced the emergence of illicit massage businesses around domestic military bases to military prostitution in Korea and the military brides. One of the Atlanta shooting victims, Yong Ae Yue, reportedly came to the U.S. in the 1970s as a military bride. Whether the massage parlors involved in the shootings conducted illicit business is beside the point. Regardless of the six Asian American female victims’ employment, denying the suspect’s racial intent on the presumption that they engaged in sex work not only misses the point but also diverts attention. The suspect perceived them as fetishized sexual objects that were simultaneously deemed dangerous. That perception, perpetuated through lived experience and cultural production, embodies a century and a half of historical construction in which racism and misogyny against Asian American women intersected. Perhaps the movement against anti-Asian hate can serve as a catalyst for more Asian American women to rise and tell those who hold sexist and racist beliefs about them: We are “Not Your Fetish.”

    Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180011 https://historynewsnetwork.org/article/180011 0
    "The Man Who Ran Washington: The Life and Times of James A. Baker III" by Peter Baker and Susan Glasser




    He was dubbed the “velvet hammer,” the “handler,” and the “ultimate fixer” by the national news media. Canadian trade negotiators, disconcerted by his hardline tactics and frequent use of profanity, had a less flattering nickname: “Texas crude.”

    For 12 years, from 1980 to 1992, James A. Baker III operated as one of the most powerful non-elected officials in Washington, serving first as President Ronald Reagan’s chief of staff and later as his secretary of the treasury. But Baker is best known for his four years as secretary of state under his close friend President George H.W. Bush. Working closely with Bush, he negotiated major arms control agreements and maintained stability in Europe as the Soviet Union crumbled and Germany was reunified.

    A deft administrator, he readily adapted to new bosses, working for three different presidents. He was respected by Democrats, winning confirmation as secretary of state in 1989 with a 99-0 Senate vote.

    But who exactly was he and what did he stand for?

    A new biography by Peter Baker (no relation), a New York Times political reporter, and his wife, New Yorker staff writer Susan Glasser, provides a detailed (and at times hour-by-hour) account of his service in three administrations. The biography, written with Baker’s full cooperation, offers some fascinating accounts of the fierce debates conducted within the Reagan administration by ambitious aides who were given broad responsibility over policy matters by an elderly president.

    Yet despite the detailed accounts of White House discussions and personality clashes (e.g. feuds with Ed Meese and Dan Quayle), James Baker emerges as an enigma, a man with few core principles, other than a drive to continually maximize his own power.

    Richard Hofstadter, in his classic 1963 book, Anti-Intellectualism in American Life, noted the difference between an intellectual and a “professional” man. The professional man may be highly intelligent and well-educated, but he lacks the curiosity of an intellectual; he is unwilling to envision radically different perspectives or make strong dissents. In Hofstadter’s words, “the professional man lives off ideas, not for them.”  The professional’s mind is “made not for free speculation, but for a salable end.”

    James Baker embodies Hofstadter’s definition of the professional man. He was always goal-oriented, a master of tactics, not particularly concerned with (in the words of President Bush) “the vision thing.”

    As secretary of state, he spent 283 days out of the country, traveling more than 700,000 miles on his Air Force plane, but he showed little interest in the culture of foreign countries. He rarely left the U.S. embassy to engage in local sightseeing or meet with artists or writers. 

    Born into Privilege

    Baker was born in 1930 to a wealthy, powerful family. His father was the lead partner at Baker Botts, a prominent Houston law firm with a client list filled with big oil and gas companies. Baker grew up in a Texas that was deeply segregated and run by southern Democrats.

    He attended Princeton and the University of Texas law school, where he showed little interest in politics. He was a registered Democrat until he made a close friend in 1957 at the Houston Country Club: George H.W. Bush. The son of U.S. Senator Prescott Bush (R-Connecticut), George harbored a lifelong interest in politics. He successfully ran as a Republican for a Houston-based congressional seat in 1966, in a state then dominated by conservative Democrats. In 1970, urged by the national Republican Party to run for the U.S. Senate, he enlisted Baker to help manage his campaign in the Houston region.

    Bush lost that race to Democrat Lloyd Bentsen, but won more votes than expected. Bush was rewarded by being appointed as Ambassador to the United Nations by President Nixon.  At the Republican National Convention in 1972, he introduced Baker to Nixon and his cabinet. In 1975, Baker was named undersecretary for the Commerce Department by President Gerald Ford.

    Quickly impressing Ford administration aides Donald Rumsfeld and Dick Cheney, Baker was soon assigned to help manage President Ford’s campaign to win the Republican nomination for President in 1976. Ford faced a strong challenge from Ronald Reagan, but he ultimately won the party’s nomination, only to lose the general election to Jimmy Carter.

    In 1980, Baker was tapped by his close friend George H.W. Bush to be his campaign manager as Bush sought the Republican presidential nomination. Although Bush won contests in Iowa and Pennsylvania, he lost most of the others to the popular Reagan. But thanks to careful maneuvering by Baker, Reagan selected Bush as his vice-presidential nominee.

    Baker impressed Reagan, his wife Nancy, and top Reagan aides with his dogged determination masked by a soft Texas drawl (the velvet hammer approach). He was rewarded by being named Reagan’s chief of staff.

    As the administration’s top gatekeeper, Baker had to manage a fractious team that included men with large egos such as Alexander Haig, Ed Rollins, and Ed Meese, while fending off criticism from hard-right conservatives like Richard Viguerie. Reagan, 69 when inaugurated, sometimes dozed off at long meetings and was happy to delegate the nuts and bolts of policymaking to Baker.

    Today, Reagan is best known for “Reaganomics,” a series of tax cuts that benefited wealthy Americans and triggered our nation’s huge rise in income inequality. But as the authors of The Man Who Ran Washington show, Reagan’s policies were often enacted with the help of leading Democrats, who controlled the House during all eight years of the Reagan administration. Baker proved a masterful negotiator with Democratic leaders Tip O’Neill and Dan Rostenkowski, conceding minor points to let them save face while holding firm on principles he knew were important to Reagan.

    No Regrets

    In 1988 Baker managed George H.W. Bush’s campaign against Michael Dukakis. He opposed Bush’s selection of Dan Quayle as running mate, but Bush overrode him. This led to four years of animosity between the new vice president and Baker, now secretary of state.

    Bush’s 1988 campaign repeatedly attacked Dukakis, with the controversial Willie Horton ad a particularly low blow. The ad, which criticized Dukakis over a Massachusetts state prison furlough program, featured the face of Horton, a black convict who was released on furlough and went on to rape a white Maryland woman and stab her fiancé.

    While Baker did not create the ad (it was produced by an outside PAC), he never disavowed it, despite widespread criticism that it trafficked in racist stereotypes. He told the authors he had no regrets: “We won a big victory. The name of the game is to win ethically and I think we did.”

    So how does Baker view Donald Trump, who won the 2016 nomination after ridiculing former Florida Governor Jeb Bush and former President George W. Bush, the beloved sons of Baker’s close friend?

    At the request of Paul Manafort, James Baker met Trump in May 2016 in Washington. He presented him with a two-page memo that recommended Trump avoid isolationist talk, stop criticizing NATO, and work for a new nuclear arms control treaty.  Trump, of course, took none of that advice. Baker, sensitive to the Bush family’s wishes, ultimately declined to publicly endorse Trump, but he told the authors he did vote for him, saying that the billionaire was a conservative, “even if he’s crazy.” 

    It has been forty years since Reagan was first elected in 1980, yet the effects of his administration — including massive gains by the wealthy at the expense of the poor and the weakening of labor unions — continue to impact our nation today.  

    The Man Who Ran Washington offers an insightful look at “the velvet hammer,” a man who worked the levers of power for three Republican presidents and has no regrets on how he did it.

    Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180006 https://historynewsnetwork.org/article/180006 0
    Mars Perseverance and the Kings Bay Plowshares: A Study in Priorities



    I won, I thought, often.  Bobby Allan and I in Ms. Cooke’s fifth grade “show and tell” vied for attention with our astronaut scrapbooks.  New York in 1962 had six daily newspapers plus Life and Look magazines, all with color spreads after space flights and available at the nearby newsstand of Manny’s Candy Store.  Scissors sharpened, I gathered and clipped first editions.  Letters to NASA brought still more details.  When I put aside thoughts of running for President, I presumed I might go into space.

    The Perseverance Rover’s presence on Mars kindles memories, but “the thrill is gone,” as B.B. King sang of a soured romance.  My adult awareness of others’ suffering transcends childhood’s sense of adventure as the spacecraft’s $2.7 billion, 293 million-mile trip checks “different types of rocks such as clays and carbonates to determine whether they contain traces of ancient microbial life,” as Forbes reports.

    Just as my career goals evolved, our nation needs new priorities.

    The Hunts Point Produce strike settlement, hailed as a triumph in January, won little: a raise of $1.85 an hour over three years. Eighty-five undocumented immigrants now on hunger strike at Judson Memorial Church in Manhattan’s Greenwich Village are starving themselves to highlight the plight of 274,000 unemployed workers excluded from federal and state pandemic relief programs.

    “Never before has man had such capacity to control his own environment, to end thirst and hunger, to conquer poverty and disease, to banish illiteracy and massive human misery,” President Kennedy told the United Nations General Assembly on September 20, 1963.  He had grown from harsh Cold Warrior into an advocate of peace, having one month before espoused peaceful coexistence with the Soviet Union.  A Dallas gunman killed him two months later.

    One wonders what might have been had he lived.  Instead, how much has changed?

    “There were 55,915 homeless people, including 17,645 children, sleeping each night in the New York City municipal shelter system in January,” the Coalition for the Homeless reveals.  City Harvest adds that 1.5 million New Yorkers, 466,000 of them children, deal daily with food insecurity – 38% and 39% increases, respectively, over pre-pandemic figures.

    The 2020 National Defense Authorization Act, meanwhile, sailed through Congress, expending $2 billion daily, according to the Friends Committee on National Legislation, on weapons systems, 800 overseas bases, and our entanglements in Afghanistan and Iraq.

    “In the councils of government, we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex,” President Eisenhower’s 1961 Farewell Address warned.  Now that complex wages an invisible war against at-risk Americans.

    These issues are as old as I am, but the post-9/11 provision of war weapons to urban police departments, which an FBI inquiry found have been infiltrated by white supremacists, has prompted frequent, severe, and unwarranted police violence against people of color.  The claim that “a few bad apples” are at fault for the five-year death sequence that NYPD Officer Daniel Pantaleo launched with his chokehold on Eric Garner is disproved by what we know has happened and by the lack of mandatory reporting of police use-of-force statistics.

    “The incidence of wrongful use of force by police is unknown,” the National Institute of Justice and Bureau of Justice Statistics wrote in October 1999.  “Research is critically needed to determine reliably, vividly and precisely how often transgressions occur.”  The FBI’s National Use of Force Data Collection has had voluntary participation by federal, state and tribal law enforcement groups representing 41% of the nation’s sworn officers since its July 2019 origin.

    Bobby Allan and I in fifth grade thought our generation would solve problems like these. 

    Coming of age in the civil rights and Vietnam War eras convinced us it could.  High school peers and I chipped in for a bus to and from the November 1969 Moratorium at the Washington Monument, and my service as a US Senate aide and Peace Corps Volunteer stemmed from those ideals.  Watching “Blue Bloods” on television recalls my advocacy for change while chairing the NYPD Training Advisory Council’s Race Subcommittee, and that institution’s rigid resistance to it.

    Pablo Picasso’s 1937 Guernica mural speaks to me as its stark blue, black and white imagery shows the perpetual struggle of good and evil.  Even incremental change requires extreme effort, it seems in our time, as bureaucratic growth divides leaders from constituents and fewer people possess the nation’s vast wealth.  In 1983 I could question Congressman Stephen Solarz in person for his vote to fund President Reagan’s Nicaraguan Contra War; in 2019 Congressman Jerry Nadler’s New York scheduler – after I had filled out his required online contact box, sent letters, then called his Washington, Brooklyn and Manhattan offices – said “he’s too busy” to see me at any time.  

    “Be the change you want to see in the world,” Gandhi said.  I have his quote on a tee shirt that hung in my history classrooms through my 19-year teaching career.  Perhaps that is the best I can do as at age 68 I write from my living room corner about what I see through my window and the lens of experience about national and neighborhood life.

    The Kings Bay Plowshares poured vials of their blood, damaged a missile and spray-painted “Love One Another” on a sidewalk at the nuclear submarine base they entered in Georgia.  Because we value property over people, and are governed by leaders who function from fear, the seven, who deserve a parade, are in jail.  The Navy’s 14 Trident submarines can each launch 24 nuclear missiles and a $58.2 million order placed last year has more on the way.

    Is this to defend us or to dominate as we say Russia and China intend?

    We should instead address our internal enemies, homelessness and hunger to start.

    Our nation with democracy intact would be well worth defending.  It might cost less than our weapons and make policing unnecessary.

    The Perseverance Rover seeks Martian evidence of ancient microbial life.  Shouldn’t we figure out how to live better on Earth?

    Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180009 https://historynewsnetwork.org/article/180009 0
    The "Doughboys" Made their Biggest Contribution Fighting Postwar Hunger



    It was April 1917 when the United States entered World War One and America's soldiers, nicknamed the Doughboys, headed to Europe to fight Germany. Less well known is the Doughboys' role in feeding the hungry in Europe, long after Germany had been defeated.

    Following the Armistice of November 11, 1918, Latvia tragically saw no end to the fighting. It was invaded by the Communist Bolsheviks from Russia, and the occupation led to severe food shortages in early 1919. The Allies, including German forces retained under the Armistice, fought to retake Latvia's key port city of Riga from the Communists in May.

    Knowing that people were starving in Riga, the American Relief Administration (ARA) took action. Doughboys were part of this special force for WWI relief. Lt. George Harrington led a 40-car train of food supplies toward Riga to feed the starving civilians. They met resistance from the Bolsheviks. Captain Evan Foreman reported in a radiogram: "Between Libau and Riga, on Monday night, a band of robbers assaulted the train guard. The guard returned the hostile fire and an armoured train furnished machine-guns which completely routed the assailants."

    There were even more challenges for the ARA Doughboys. Ten miles short of the train's destination, a worst-case scenario was realized: the tracks were torn up, another consequence of war. Harrington wrote, "During the advance the railroad between Mitau and Riga was destroyed for several kilometers." We learn in school never to give up and to overcome obstacles. Lt. Harrington and his small force of Doughboys epitomized that spirit. They went to work repairing the tracks and getting the help of anyone they could. Herbert Hoover, who led WWI relief, recalled in his memoirs that President Woodrow Wilson said "We need a lot more Harringtons" fighting hunger.

    Harrington's force got the train moving again, and it arrived in Riga on May 29. What they saw shocked them: dead bodies from the fighting and from starvation. Now that Harrington and his team had arrived with food, they had to arrange an orderly distribution. But where do you begin amid such chaos? They went to the American consulate for help, but it had been abandoned when the Communists attacked. Then a frightened woman peeked through a crack and announced herself. Miss Pawla Poedder, of Latvian descent, had been the secretary of the consulate for five years and had been left in charge when the diplomats fled. It was quite a promotion.

    A story in Independent magazine tells how Ms. Poedder spent sleepless nights waiting for the Communists to do their worst, especially after she posted a picture of President Wilson on the consulate door. The Communists even threatened to seize typewriters; she never gave hers up. Her courage earned her the respect of the occupiers. Struggling with hunger herself, Poedder showed tears of relief that the Americans had returned, then went into action, using her local contacts to help organize the food distributions. The Independent said that Uncle Sam owed Poedder a medal for her bravery. The American Relief Administration achieved its mission of feeding Latvians, and child feeding programs led by Captain Thomas Orbison saved lives for months afterward.

    Today we face the challenge of famine threatening Yemen, where the UN World Food Program (WFP) has low funding for its mission to feed 13 million civilians impacted by the civil war. In South Sudan, 7.24 million people are in danger of severe hunger as summer approaches, and WFP is low on funding there as well. Burkina Faso and the Sahel region are also in danger of famine. Each of us can donate or write to our elected officials urging food and peace for these countries. Like the Doughboys, we must never give up the fight against hunger.

    Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/179979 https://historynewsnetwork.org/article/179979 0
    The Roundup Top Ten for April 23, 2021

    Racism Has Always Been Part of the Asian American Experience

    by Mae Ngai

    Anti-Asian racism draws from different historical origins than Jim Crow, but their histories are part of the same conflict: whether White Americans are entitled to rule over other people, domestically or globally. 


    The Derek Chauvin Verdict Won’t Stop Cops Murdering Black People

    by Kellie Carter Jackson

    Historical reflection shows that Derek Chauvin's killing of George Floyd was not an anomaly. His conviction won't purge policing of racial bias. 



    Why the Amazon Workers Never Stood a Chance

    by Erik Loomis

    "We may be in a period where economic justice concerns are more central to our politics than any time since the mid-20th century. But without a new round of labor law reform, organized labor cannot succeed."



    All the President’s Historians

    by Daniel N. Gullotta

    Joe Biden's attraction to Jon Meacham's historical narratives of American ideals triumphant over adversity makes sense for a president dedicated to healing and reunification. 



    How White Americans’ Refusal to Accept Busing has Kept Schools Segregated

    by Matthew D. Lassiter

    The legal distinction between "de facto" and "de jure" segregation has always been a convenient fiction allowing the perpetuation of segregation by obscuring the role of government in creating and sustaining a racially discriminatory housing market. 



    What the Rise and Fall of the Cinderella Fairy Tale Means for Real Women Today

    by Carol Dyhouse

    "Cinderella dreams an impossible dream: she isn’t a helpful role model for today’s young girls thinking about their future, and is unlikely to regain the intense hold over the female imagination that was evident in the 1950s."



    The Men Who Turned Slavery Into Big Business

    by Joshua D. Rothman

    "We still live in the world that Franklin and Armfield’s profits helped build, and with the enduring inequalities that they and their industry entrenched."



    Calls to Disarm the Police Won’t Stop Brutality and Killings

    by Maryam Aziz

    Calls to disarm police departments ignore the way that policing has used unarmed forms of violence in its efforts at social control, particularly of Black communities. 



    Adam Toledo's Killing is Part of a Brutal Pattern of Child Killings in America

    by Keisha N. Blain

    Repeated acts of police violence against children underscore the fundamentally racist roots of policing in the United States and demand a diversion of resources from police to social services. 



    American Journalism’s Role in Promoting Racist Terror

    by Channing Gerard Joseph

    American journalism profited from the sale of advertisements for the slave trade and stirred up lynch mobs. When will the industry acknowledge its role in American racism? 


    Sat, 15 May 2021 11:21:00 +0000 https://historynewsnetwork.org/article/180003 https://historynewsnetwork.org/article/180003 0