(R)evolutionary Biology, a blog brought to you by History News Network (https://www.historynewsnetwork.org/blog/author/43). Mon, 25 Sep 2023 17:23:09 +0000

It's Madness to Keep on Developing Nuclear Weapons, so Why Do We?

Related Link What Historians Can Learn from the Social Sciences and Sciences

David P. Barash is an evolutionary biologist and professor of psychology at the University of Washington; his most recent book is "Buddhist Biology: Ancient Eastern Wisdom Meets Modern Western Science" (Oxford University Press, 2014). 

Most people can be forgiven for ignoring the threat posed by nuclear weapons. It might seem surprising, but we have been preprogrammed by our own evolutionary history to engage in such ignorance. The nuclear age is just a tiny blip tacked onto our very recent phylogenetic past, so when it comes to the greatest of all risks to human survival, we are more threatened by the instincts we lack than by those we possess.

At the same time, we are immensely threatened by the weapons we possess, and which the current administration is planning to augment and “modernize.” But first, a glimpse into how our evolution figures into this mess:

It had been the world's first murder.  The ape-man exultantly threw his club (actually, the leg-bone of an early quadruped) into the air, and as it spun, it morphed into an orbiting space station.  In this stunning image from the movie "2001: A Space Odyssey," millions of cinema-goers saw the human dilemma in microcosm.  We are unmistakably animals, yet we also behave in ways that transcend the strictly organic.  Ape-men all, we are the products of biological evolution - a Darwinian process that is nearly always painfully slow - yet at the same time we are enmeshed in cultural evolution, a Lamarckian phenomenon which, by contrast, is blindingly fast and which proceeds under its own rules. 

As the cinematic ape-man's club traveled through air and ultimately, into outer space, director Stanley Kubrick collapsed millions of years of biological and cultural evolution into five seconds.  This isn't, however, simply a cinematic trick:  We are all time-travelers, with one foot thrust into the cultural present and the other stuck in our biological past.

We're unique among living things in being genuinely uncomfortable in our situation.  This should not be surprising: even though our cultural achievements must somehow have evolved along with our organic beastliness, the two evolutionary processes (biological and cultural) have become largely disconnected, and as a result, so have we: from our own self-interest.

Imagine two people chained together: one a world-class sprinter, the other barely able to hobble.  To understand why biological and cultural evolution can experience such a disconnect (despite the fact that both emanate from the same creature), consider the extraordinarily different rates at which they proceed. 

Biological evolution is unavoidably slow. Individuals, after all, cannot evolve.  Only populations or lineages do so.  And they are shackled to the realities of genetics and reproduction, since organic evolution is simply a process whereby gene frequencies change over time.  It is a Darwinian event in which new genes and gene combinations are evaluated against existing options, with the most favorable making a statistically greater contribution in succeeding generations.  Accordingly, many generations are required for even the smallest evolutionary step. 
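To put a rough number on that slowness, here is a minimal sketch (my illustration, not the author's) using a standard deterministic haploid selection model: even an allele with a full 1 percent fitness advantage needs on the order of a thousand generations to spread through a population. The function name and parameter values are illustrative assumptions.

```python
# Sketch of how slowly even a strongly favored gene spreads.
# Assumed model: deterministic haploid selection; an allele at frequency p
# with fitness advantage s updates each generation as
#   p' = p(1 + s) / (p(1 + s) + (1 - p))

def generations_to_spread(p0=0.01, p_target=0.99, s=0.01):
    """Generations for a beneficial allele to rise from p0 to p_target."""
    p, generations = p0, 0
    while p < p_target:
        p = p * (1 + s) / (p * (1 + s) + (1 - p))
        generations += 1
    return generations

# With a 1% advantage, spreading from 1% to 99% of the population
# takes on the order of 900 generations - tens of thousands of years
# for a creature like us.
print(generations_to_spread())
```

Even a tenfold-stronger advantage (s = 0.1) still takes roughly a hundred generations, whereas a cultural "mutation" can sweep a population within a single generation.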

By contrast, cultural evolution is essentially Lamarckian, and astoundingly rapid.  Acquired characteristics can be "inherited" in hours or days, before being passed along to other individuals, then modified yet again and passed along yet more - or dropped altogether - everything proceeding in much less time than a single generation.  Take the computer revolution.  In just a decade or so (less than an instant in biological evolutionary time), personal computers were developed and proliferated (also modified, many times over), such that they are now part of the repertoire of all technologically literate people.  If, instead, computers had "evolved" by biological means, as a favorable mutation to be possibly selected in one or even a handful of individuals, there would currently be only a dozen or so computer users instead of a billion. 

Just a superficial glance at human history shows that the pace of cultural change has been not only rapid - compared with the rate of biological change - but if anything the rate of that change has itself been increasing, generating a kind of exponential curve.  Today's world is vastly different from that of a century ago, which is almost unimaginably different from 50,000 years ago ... although neither the planet itself nor the biological nature of human beings has changed very much at all during this time. Cultural inventions such as fire, the wheel, metals, writing, printing, electricity, internal combustion engines, television and nuclear energy have all appeared seemingly overnight, while our biological evolution has plodded along, a tortoise to the cultural hare.

Try the following Gedanken experiment.  Imagine that you could exchange a newborn baby from the late Pleistocene - say, 50,000 years ago - with a 21st century newborn.  Both children - the one fast-forwarded no less than the other brought back in time - would doubtless grow up to be normal members of their society, essentially indistinguishable from the peers naturally born into it. 

A Cro-Magnon infant, having grown up in 21st century America, could very well find herself comfortably reading the New York Times on her iPad, while the offspring of today's technophiles would fit perfectly into a world of saber-toothed cats and stone axes.  But switch a modern human adult and one from the late Ice Age, and there would be Big Trouble, both ways.  Human biology has scarcely budged in tens of thousands of years, whereas our culture has changed radically.

Consider violence and aggression, since this, after all, was what our cinematic ape-man was doing when so adroitly represented on film.  The history of civilization is, in large part, one of ever-greater efficiency in killing: with increasing ease, at longer distance, and in larger numbers, as in the "progression" from club, knife and spear, to bow and arrow, musket, rifle, cannon, machine gun, battleship, bomber, and nuclear-tipped ICBM.  At the same time, the human being who creates and manipulates these marvelous devices has not changed at all.  

Considered as a biological creature, in fact, Homo sapiens is poorly adapted to kill: given his puny nails, non-prognathic jaws and laughably tiny teeth, a person armed only with his biology is hard-pressed to kill even one fellow human, not to mention hundreds or millions.  But cultural evolution has made this not only possible but easy. 

Animals whose biological equipment makes them capable of killing each other are generally disinclined to do so.  Eagles, wolves, lions, and crocodiles have been outfitted by organic evolution with lethal weaponry and, not coincidentally, they have also been provided with inhibitions against using it on fellow species-members.  (This generalization was exaggerated in the past.  Today, we know that lethal disputes, infanticide, and so forth do occur, but the basic pattern still holds: rattlesnakes, for example, are not immune to each other's venom, yet when they fight, they strive to push each other over backwards, not to kill.)  Since we were not equipped by biological evolution with lethal weaponry, there was little pressure to balance our nonexistent organic armamentarium with behavioral inhibitions concerning its use. 

The disconnect between culture and biology is especially acute in the realm of nuclear weapons.  In 1946, less than a year after the bombing of Hiroshima, Albert Einstein famously warned that "the unleashed power of the atom has changed everything save our modes of thinking, and we thus drift toward unparalleled catastrophe."


He might have been talking about musk oxen.  These great beasts, like shaggy bison occupying the Arctic tundra, have long employed a very effective strategy when confronted by their traditional enemies: wolves.  Musk oxen respond to a wolf attack by herding the juveniles into the center, while the adults face outward, arrayed like the spokes of a wheel.  Even the hungriest wolf finds it intimidating to confront a wall of sharp horns and bony foreheads, backed by a thousand pounds of angry pot roast.  For countless generations, this anti-predator response served the musk ox well. 

But now, the primary danger to musk oxen is not wolves, but human hunters, riding snowmobiles and carrying high-powered hunting rifles. Musk oxen would therefore be best served if they spread out and high-tailed it toward the horizon, but instead they respond as previous generations have always done: they form their trusted defensive circle, and are easily slaughtered.

The inventions of the snowmobile and the rifle have changed everything but the musk ox way of thinking; hence they drift toward unparalleled catastrophe.  (Musk oxen are now a threatened species.) They cling to their biology, even though culture - our culture - has changed the payoffs.  Human beings also cling to (or remain unconsciously influenced by) their biology, even as culture has dramatically revised the payoffs for us as well.  That musk ox-like stubbornness is especially evident when it comes to thinking, or not thinking, about nuclear weapons.

Take, for example, the widespread difficulty so many people have when it comes to conceiving nuclear effects.  When told something is "hot," human beings readily think in terms of boiling water, burning wood, or perhaps molten lava.  But the biological creature that is Homo sapiens literally cannot conceive of temperatures in the millions of degrees.  Before the artificial splitting of uranium and plutonium atoms (a cultural/technological innovation if ever there was one), nuclear explosions had never occurred on earth.  No wonder we are unprepared to "wrap our minds" around them.  

Similarly with the vast scale of nuclear destruction: we can imagine a small number of deaths - so long as none of them include our own! - but are literally unable to grasp the meaning of deaths in the millions, all potentially occurring within minutes.  And so the conflict between our biological natures and our cultural products cloaks nuclear weapons in a kind of psychological untouchability.

By the same token, the "cave man" within us has long prospered by paying attention to threats that are discernible - a stampeding mastodon, another Neanderthal with an upraised club, a nearby volcano - while remaining less concerned about what cannot be readily perceived.  Since nuclear weapons generally cannot be seen, touched, heard or smelled (something to which government policy contributes), they evade our psychological radar, allowing the nuclear Neanderthal to function as though these threats to his or her existence don't exist at all.

If a homicidal lunatic were to stalk your workplace, people would doubtless respond, and quickly.  But although we are all stalked by a far more dangerous nuclear menace, the Neanderthal within us remains complacent.

This doesn’t mean, however, that our situation is hopeless. Begin with this rather homely question: Why are human beings so difficult to toilet train, whereas dogs and cats - demonstrably less intelligent than people by virtually all other criteria - are housebroken so easily? Take evolution into account and the answer becomes obvious. Dogs and cats evolved in a two-dimensional world, in which it was highly adaptive for them to avoid fouling their dens. Human beings, as primates, evolved in trees such that the outcome of urination and defecation was not their concern (rather, it was potentially a problem for those poor unfortunates down below). In fact, modern primates, to this day, are virtually impossible to house-break.

But does this mean that things are hopeless, that we are helpless victims of this aspect of our human nature? Not at all. I would venture that everyone reading this article is toilet-trained. Despite the fact that it means going against eons of primate history, human beings are able - given enough teaching and patience - to act in accord with their enlightened self-interest. This is something to celebrate!

Is it unreasonable, then, to hope that a primate that can be toilet-trained can one day be planet-trained, too? As Carl Sagan emphasized, eliminating nuclear weapons – eventually, all of them – is a basic requirement of species-wide sanity and good planetary hygiene. And yet, the United States government is currently planning to upgrade its investment in the nuclear triad of “delivery systems” (missiles, bombers and missile-firing submarines) as well as the bombs and warheads themselves, at an estimated cost of one trillion dollars over the next three decades.  

I began by noting that people (at least, those of us not making a career from them) can be forgiven for having largely ignored nuclear weapons. After all, our evolutionary past is paradoxically working against our ability to focus effectively on these terrible devices. But we’re more than musk oxen, and if nothing else, the currently intended nuclear expansion, involving as it does a huge expenditure of money desperately needed for genuine human needs, offers the prospect of converting a proposed technological and political obscenity into a counter-evolutionary opportunity.

https://historynewsnetwork.org/blog/153626
Why Everybody Should Be Familiar with Tycho Brahe's Blunder David P. Barash is an evolutionary biologist and professor of psychology at the University of Washington; he is writing a book about paradigms lost.

Tycho Brahe lived with a hand-crafted nose made of brass, after his real one was sliced off in a duel. Of greater interest than this anatomical peculiarity, however, is Brahe’s intellectual anomaly – which turns out to be not that unusual after all. Tycho Brahe was a renowned 16th century Danish astronomer and a great empirical scientist whose data were used to formulate Johannes Kepler’s three laws of planetary motion.

Confronted with irrefutable evidence that the known planets (Mercury, Venus, Mars, Jupiter and Saturn) revolved around the Sun, Brahe was nonetheless committed to the prevailing biblical view of a geocentric universe. So he devised an ingenious model in which those planets indeed revolved around the Sun ... but with the resulting conglomeration obediently circling a central and immobile Earth!

Modern versions of Brahe’s Blunder abound, in which people accept – begrudgingly – what is undeniable, while nonetheless desperately clinging to their pre-existing beliefs: what they want to be true. Recent prominent cases include granting that the Earth’s climate is heating up (the data are as clear as that confirming planetary orbits), but refusing to accept that human beings are responsible, or agreeing that evolution is real (when it comes to microbial antibiotic resistance, for example) but denying that it produced human beings. A good paradigm is a tough thing to lose.  

This, in turn, may contribute to what appears to be a “war on science” these days, manifested most notably by a triad of resistances: to evolution, to human-caused climate change, and to medical vaccinations. There are doubtless many reasons for this unfortunate situation, including religious fundamentalism (evolution), economic interest on the part of fossil fuel companies and their minions (global warming), and misinformation deriving at least in part from a single discredited but nonetheless influential medical report (vaccinations).

Of course, other factors are also at work, but an especially important contributor to this anti-science epidemic - and one that hasn’t received the attention it warrants - is the speed with which scientific “wisdom” has reversed itself, combined with widespread public reluctance to modify an opinion once established. Just consider how much easier it is to change your clothes than to change your mind. Most people aren’t as inventive as Tycho Brahe, and so the rapid reversals of scientific consensus have left many people feeling jerked around, and thus, confused and increasingly resistant to the whole enterprise.

Yet much of the power of science derives, paradoxically, from the fact that it is open to dramatic changes if the evidence so demands. Unlike, say, theological doctrines, scientific paradigms are constantly tested and revised, which in turn leads to the false impression that science itself is somehow unreliable. Most people, in short, have a hard time dealing with the accumulating debris of scientific paradigms lost - to the extent that confidence in science itself has become a victim. Ultimately, however, science is uniquely and profoundly reliable; indeed, it offers our most dependable insights into the nature of the real world ... once we allow for the malleability of its wisdom.

In the past, the death of certain paradigms has been largely metabolized by the informed but non-professional public. Thus, most citizens don’t have much difficulty – at least these days – with replacing the Ptolemaic, earth-centered system with its Copernican, sun-centered version, or superseding Newtonian physics by Einsteinian relativity and quantum mechanics (so long as they aren’t asked to explain the latter two), or even recognizing, with Freud, that much of human mental life occurs in our unconscious.

In other cases, what Thomas Kuhn called a “paradigm shift” occasions substantial discomfort. Considering only biology, my area of expertise, examples include:

● Giving up the notion that human beings are somehow outside nature, but rather, are inextricably connected, via evolution as well as planetary ecology, to the rest of the living world.

● Understanding that evolution doesn’t work for “the good of the species,” but rather, it operates by maximizing the reproductive success of individuals, and, even more powerfully, genes.

● Acknowledging that communication between people, as between animals, naturally involves substantial amounts of deception, designed to manipulate others rather than simply to inform them.

● Absorbing the lesson that in many if not most cases seemingly “altruistic” behavior arises as a result of natural selection and thus isn’t evidence for divine intervention.

● Seriously undermining the subjectively potent but scientifically untenable belief in "free will," since thoughts and behavior derive from the actions of neurons, which are themselves entirely responsive to antecedent physical conditions of electrochemistry, ion exchange, and micro-anatomy.

There are many more highly specific but nonetheless consequential tremors in what had, until recently, been part of the received wisdom of biological science, but now constitute paradigms lost. For example, we now know that not all "germs" are bad (many - perhaps the majority - are necessary for a healthy life), that contrary to earlier dogma, many neurons are capable of regenerating, that cellular differentiation isn't necessarily a one-way street ("Hello, Dolly!"), and that although individual lives are fragile, life itself is remarkably robust (living organisms have been found in some of the bleakest, most toxic and extreme environments).

Milton’s great poem, intended to “justify the ways of God to men,” described the consequences of Adam and Eve disobeying God and eating fruit from the Tree of Knowledge of Good and Evil. Ours is a lesser punishment: loss of paradigms instead of paradise. But in the end, what justifies science to men and women is something more valuable and, yes, even more poetic than Milton’s masterpiece: the opportunity to consume the fruits of our own continually reevaluated, deeply rooted, profoundly nourishing Tree of Scientific Knowledge. And to do so without committing Brahe’s Blunder.

To do so, however, we need to know that in science, paradigm lost is merely another phrase for wisdom gained.

https://historynewsnetwork.org/blog/153646
Here Are One Dozen Reasons Why The Nuclear Agreement with Iran Is Better than the One with North Korea David P. Barash is an evolutionary biologist and professor of psychology at the University of Washington and a member of the Physicians for Social Responsibility national security task force. His most recent book is Buddhist Biology: Ancient Eastern Wisdom Meets Modern Western Science (Oxford University Press, 2014). 

Is the forthcoming nuclear agreement with Iran similar to the earlier, unsuccessful one with North Korea? There are in fact some similarities between Iran and North Korea, notably the fact that both countries have followed policies inimical to the interests of the US. However, there are also huge differences between Iran and North Korea, and equally immense differences between the proposed Iran Nuclear Agreement (INA henceforth) and the earlier 1994 arrangement with North Korea, formally known as the Agreed Framework (AF). Following are some key distinctions between the AF and the INA.

1. Verification procedures and techniques that existed in 1994, when the AF was agreed, have been greatly augmented in the intervening two decades, such that the INA constitutes a much more verifiable arrangement, covering all possible supply-chain routes. Although not perfect, the INA is the most verifiable nuclear agreement ever reached. (The details of verification within the INA have been extensively discussed elsewhere, and will not be repeated here.) Especially worth noting, however: the AF with North Korea specified very little in the way of actual verification procedures; by contrast, the overwhelming probability is that under the INA, any cheating on the part of Iran would be detected.

2. As far as can be determined from non-classified sources, the US did not possess any significant human intelligence assets in North Korea; hence we were limited in our ability to assess possible North Korean treaty violations. By contrast, human intelligence sources within Iran – whether reporting to the US or Israel – provided warning in 2002 that Iran was conducting undeclared nuclear activities in Natanz, Isfahan, and Arak. Human “intel,” in addition to satellite and other forms of surveillance, is available with regard to Iran.

3. When the AF was established, North Korea already had enough fissile material - in its case, plutonium - for at least one bomb and perhaps more. By contrast, Iran does not currently have enough fissile material - highly enriched uranium - for even one nuclear weapon. Therefore, under the INA, Iran cannot weaponize its current stockpile, even if it could somehow evade the INA's verification procedures; North Korea was able to do exactly that.

4. The AF explicitly prohibited plutonium separation as a possible route for North Korea to make bombs, but it did not include direct prohibitions on uranium enrichment. The North Koreans then proceeded to import uranium enrichment technology (from Pakistan). By contrast, the INA outlaws all routes to nuclear explosives, both uranium enrichment – which Iran has developed thus far, using high-speed centrifuges – and plutonium separation, which it has not thus far attempted, and which, under the INA, it will not be able to do.

5. The AF was remarkably brief (just four pages!), and therefore left many details open to diverse interpretations and hence, unresolved disagreements. By contrast, the INA contains detailed procedures not only for verification, but also for dispute resolution and clear consequences for noncompliance on the part of Iran.

6. The AF was a bilateral US-North Korean understanding. No other countries participated. By contrast, the INA involves Iran and the US, the UK, France, Russia, China and Germany. In addition, it has been codified as part of a UN Security Council resolution, which is important because Iranian noncompliance would be subject not only to a response by the US, but by the international community, with a multilateral military consequence thereby potentiated.

7. The AF called for eventual “full normalization” of relations between the US and North Korea; although normalization would have been a desirable outcome, many other (non-nuclear) impediments arose, which in turn provided both countries with occasions for finger-pointing, accusations and blame. By contrast, the Iranian leadership has disavowed full normalization of relations with the US as part of the INA, providing a much clearer focus for monitoring the agreement itself, relatively unimpeded by peripheral issues.

8. North Korea constitutes a substantial conventional threat to South Korea, since Seoul is within range of substantial North Korean artillery forces. Hence, a military response to noncompliance with the AF was unlikely – something the North Korean leadership well knew. By contrast, Iran has very limited ability to project military power; it does not comprise an immediate conventional threat, even to Israel, which enjoys a huge military advantage over the largely obsolescent Iranian forces. This is not to say that a military attack on Iran would be advisable, but it is much more likely – in the event of Iranian cheating – than it was against North Korea. And the Iranian leadership certainly knows this.

9. In the case of the AF, North Korea’s immediate antagonists (South Korea and Japan) were and still are under the US “nuclear umbrella.” Hence, neither of these rivals were liable to “go nuclear” in the event of North Korean noncompliance – and neither did. By contrast, Iran is acutely aware of its ongoing competition with other gulf states, notably Saudi Arabia, Egypt and the UAE. If Iran were to defy the INA and seek to obtain its own nuclear weapons, this would almost certainly induce its Sunni rival states to go nuclear as well, an outcome Iran is unlikely to welcome.

10. The North Korean regime has long worried about military pressure from other countries, notably South Korea, the US and Japan, and even possibly from China. Hence, it decided that a small nuclear arsenal was necessary for regime survival. By contrast, although Iran is definitely a troublemaker in its region, it is also among the most populous nations there, and it is not militarily threatened by any of its neighbors. Hence, it has nothing like the motivation to abrogate the INA that induced North Korea to do so. In short, unlike the case of North Korea, the Iranian regime does not feel that it needs nuclear weapons to ensure its survival.

11. North Korea was and still is the most isolated country in the world; hence, the consequences of undoing the AF were comparatively trivial; it lost little in the process. By contrast, Iran has a youthful population, much enamored of the West, along with numerous and influential economic stakeholders, all of whom would be seriously discomfited by a resumption of economic, social and political sanctions, which would result from their abandoning the INA, or cheating and being discovered.

12. North Korea was and still is a dictatorship, whose leadership felt free to do as it pleased with the AF; they were not concerned about any popular unrest caused by their decisions. By contrast, although Iran definitely isn’t a liberal democracy in the Western sense, its government was freely and popularly elected, and in Iran, public opinion is meaningful. Moreover, the Iranian public very much favors greater integration with the West, not less.

It has often been claimed that those who ignore history are doomed to repeat it. But it is also noteworthy that those who misread history – as by identifying parallels that are illusory or deeply misleading – are doomed to serious error, as in this case: failing to take advantage of an immensely favorable opportunity.


https://historynewsnetwork.org/blog/153674
What the Prostate Gland Can Teach Us About War-Fighting This is David P. Barash's blog.  He is an evolutionary biologist and professor of psychology at the University of Washington; he is writing a book about the process whereby people are forced to accept that they aren’t central to the cosmos.

                  In 1829, Francis Henry Egerton, the 8th Earl of Bridgewater, bequeathed 8,000 pounds sterling to the Royal Society of London to support the publication of works "On the Power, Wisdom, and Goodness of God, as Manifested in the Creation."  The resulting Bridgewater Treatises, published between 1833 and 1840, are classic statements of "natural theology," seeking to demonstrate God's existence by examining the natural world's "perfection." 

            Do I really think there is a lesson here for the just-reported tragic bombing of a Doctors Without Borders hospital in Afghanistan? Stick with me and I’ll try to explain. The presumption of natural perfection is a mainstay not only of believers in special creation, but even, to some degree, of evolutionary biologists, who generally know better, but whose enthusiasm for the power of natural selection to generate outcomes of unparalleled complexity tends to obscure the many equally revealing examples of evolution resulting in pronounced error and imperfection.

            It is widely assumed that human beings - along with other living things - are exceptionally well constructed: for religious fundamentalists, this is evidence of divine intervention, whereas for the biologically informed, the subtle reality of natural selection is revealed, paradoxically, by the frequent imperfections in the natural world.

            Fast-forward from the Bridgewater Treatises to 21st century war-fighting, and "anti-terrorist" airstrikes in particular. I have no doubt that the recent lethal airstrike on the hospital in Kunduz was an error. Certainly it wasn't an example of the perfection of today's "smart munitions," despite the extent to which apologists for high-tech missiles and bombs emphasize how "nearly perfect" and "highly accurate" they are.

            As Jeb Bush understated with regard to the school shooting in Roseburg, Oregon, "stuff happens," and often this stuff is bad. And stuff - often very unpleasant stuff - also characterizes the products of evolution by natural selection.

            Natural selection proceeds by tinkering with existing structures, producing outcomes that are often ramshackle and just barely functional, so that, counter-intuitively, some of the strongest evidence for evolution as a natural, material process is the prevalence of poor "design" resulting from historical constraints. For example, no intelligent designer (not even a first-year engineering student) would come up with the human lower back, the knee, the prostate, the location of the female birth canal, or the exit of the optic nerve through the retina (which produces a small but nonetheless suboptimal blind spot). When we understand that living things are highly contingent and imperfect products of natural selection, mired in history and thus unable to achieve perfect optimality, we see ourselves more clearly, although the transition has been a painful one, not easily accepted.

            Now we have Secretary Clinton joining with former Florida governor Jeb Bush and Ohio governor John Kasich in calling for a no-fly zone in Syria. We can be almost certain that such a policy, if implemented, would produce highly contingent, unpredictable and imperfect results. Critics might note that a fully material process such as natural selection is biological (and statistical) at its core, whereas a military strategy is the consequence of cultural rather than biological evolution. But the susceptibility to imperfection and error remains in both cases.

            Natural selection is a mathematically precise process, whose outcome should be - and for the most part, is - a remarkable array of "optimal" structures and systems.  A naive view therefore assumes that the biological world is essentially perfect and certainly highly predictable, like a carefully orchestrated geometric proof.  Or like a billiard game, in which a skilled player can be expected to employ the correct angles, inertia, force and momentum.  And in fact, living things – like today’s military hardware - reveal some pretty fancy shooting.

            And so it was that even David Hume - materialist and atheist - marveled at how the parts of living things "are adjusted to each other with an accuracy which ravishes into admiration all men who have ever contemplated them."  But admiration is not always warranted.  The Mikado, in Gilbert and Sullivan's operetta of that name, sings of "letting the punishment fit the crime," gleefully announcing, for example, that the billiard sharp will be condemned to play "on a cloth untrue, with a twisted cue, and elliptical billiard balls."  To a degree not generally appreciated, the organic world contains all sorts of imperfections, and as a result, shots often go awry ... not because the laws of physics and geometry aren't valid, or because the player isn't skillful, but because even Minnesota Fats was subject to the sting of reality.

            Make no mistake, evolution - and thus, nature - really is wonderful.  The smooth-running complexity of physiological systems, anatomical structures, ecological interactions, and behavioral adjustments is powerful testimony to the effectiveness of natural selection in generating highly nonrandom systems such as the near-incredible complexity of the human brain, the remarkable lock-and-key fit between organism and environment, the myriad details of how a cell reproduces itself, extracts energy from complex molecules, and so forth.

             It must be emphasized that the preceding does NOT constitute an argument against evolution; in fact, quite the opposite! If living things (including human beings) were the products of special creation rather than of natural selection, then the flawed nature of biological systems, including ourselves, would pose some awkward questions, to say the least.  If God created "man" in his image, does this imply that He, too, has comparably ill-constructed knee joints, a poorly engineered lower back, a dangerously narrow birth canal, and ridiculously ill-conceived urogenital plumbing?  A novice engineer could have done better.  The point is that these and other structural flaws aren't "anti-evolutionary" arguments at all, but rather cogent statements of the contingent, unplanned, entirely natural character of natural selection. Evolution has had to make do with an array of constraints, including - but not limited to - those of past history. 

            We are profoundly imperfect, deep in our nature. And we’re not doing all that great in Afghanistan, Iraq, Libya or Syria, either. Decades ago, we were told that the Strategic Air Command could “lob a nuclear warhead into the men’s room at the Kremlin.” And from Kosovo, Iraq, Afghanistan and Syria, we’ve been treated to video footage showing what happens when various highly accurate conventional munitions do what they are supposed to do. What we don’t see, however, are the cases where they misfire, when supposedly perfect aiming is compromised either by human, mechanical or electronic error, and we end up blowing up an Afghan wedding party or a hospital.

            Like evolved organisms, we are also severely constrained by past history, unable to generate “optimal” tactics and strategies, independent of the political, social and economic circumstances that brought us to the present state of things. And equally likely to screw up as a result.

            By the end of the 19th century, Thomas Huxley was perhaps the most famous living biologist, renowned in the English-speaking world as "Darwin's bulldog" for his fierce and determined defense of natural selection.  But he defended evolution as a scientific explanation, NOT as a moral touchstone.  In 1893, Huxley made this especially clear in a lecture titled "Evolution and Ethics," delivered to a packed house at Oxford University.  "The practice of that which is ethically best," he stated, "what we call goodness or virtue - involves a course of conduct which, in all respects, is opposed to that which leads to success in the cosmic struggle for existence. In place of ruthless self-assertion it demands self-restraint; in place of thrusting aside, or treading down, all competitors, it requires that the individual shall not merely respect, but shall help his fellows; its influence is directed, not so much to the survival of the fittest, as to the fitting of as many as possible to survive."

            "The ethical progress of society depends," according to Huxley, "not on imitating the cosmic process, [that is, evolution by natural selection] still less in running away from it, but in combating it." It also depends on recognizing that even our highest tech, like the most elaborately evolved organisms, is not only vulnerable to imperfection, but guaranteed to manifest it.


Mon, 25 Sep 2023 17:23:09 +0000 https://historynewsnetwork.org/blog/153677 https://historynewsnetwork.org/blog/153677 0
Why We Likely Are Monogamous – And Why Most Men Should Be Glad We Are

David P. Barash is an evolutionary biologist and professor of psychology at the University of Washington; his most recent book is Out of Eden: Surprising Consequences of Polygamy (2016, Oxford University Press). 

The evidence is undeniable. If a Martian zoologist were to visit Earth, he or she – or it – would conclude that the species Homo sapiens is somewhat polygynous (partaking of a mating system in which one male mates with more than one female). At the same time, and if our Martian looked hard enough, it would also be apparent that – paradoxically – we are also somewhat polyandrous: the mating system in which one female mates with more than one male. This does not mean, incidentally, that human beings were, or currently are, wildly promiscuous, despite the nonsensical assertions of at least one widely read book of pseudo-science (Sex at Dawn). Rather, as a species, we show the characteristic imprint of polygamy, which includes both polygyny (the more obviously manifested mating system) and polyandry (more subtly demonstrated, but no less real).

Mon, 25 Sep 2023 17:23:09 +0000 https://historynewsnetwork.org/blog/153743 https://historynewsnetwork.org/blog/153743 0

People are Polygynous

A Mormon polygamist family in 1888 (Wikipedia)

David P. Barash is an evolutionary biologist and professor of psychology at the University of Washington; his most recent book is Out of Eden: Surprising Consequences of Polygamy (2016, Oxford University Press). A version of this post recently appeared in Psychology Today.

I’ve written – both in recent posts and my latest book – that people are polygynous (naturally harem-keeping: one man, many women). Paradoxically, we’re also polyandrous (one woman, many men), and I’ll write about that next. But for now, what’s the evidence for polygyny?

#1) In all polygynous species, males are physically larger than females. Basically, this is because polygyny produces a situation in which males compete with each other for access to females and in the biological arena such competition typically occurs via direct confrontations, in which the larger and stronger nearly always wins. If the species is strictly polygynous – that is, if polygyny is the only mating system (such as in elk, or gorillas) – then a small number of males get to father many offspring whereas most males are unmated reproductive failures. The greater the “degree of polygyny” (essentially, the larger the average harem size), the more bachelors.

The evidence is even stronger when we consider that there is a direct correlation between the degree of polygyny – average harem size – and the degree of “sexual dimorphism,” the extent to which males and females differ. This difference is most evident in size, but is also reflected in degree of ornamentation (combs, wattles, bright feathers, fancy antlers, etc.).  Gorillas, for example, are quite polygynous and males are much bigger than females; gibbons are nearly monogamous, and males and females are almost exactly the same size. And human beings? Men are about 1.2 times larger than women, a difference that is even greater if we look at muscle mass, especially in arms and upper legs.

The disparity between the patterning of male and female reproductive success within polygynous species turns out to be very important.  Another, more technical way of saying this is that under polygyny, the “variance” in male reproductive success is high, whereas the variance in female reproductive success is low. Consider elk, for example: natural selection favors bulls who are physically imposing and therefore successful in bull-bull confrontations, because they are the ones whose genes are projected into the future, whereas there is no comparably unbalanced payoff for cows. For bulls, the situation is close to winner-take-all (all the females available in any given harem, along with all the reproductive success). For cows, everyone is, to some degree, a winner – although no one wins big time. One interesting result, incidentally, is that since they are largely freed from the tribulations of same-sex competition, females often get a curious benefit: they are more likely than males to be at the ecological optimum when it comes to body size. Males, on the other hand, since they are constrained by the rigors of sexual competition, are more likely to be too big for their own good.
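
The variance argument can be made concrete with a toy simulation (all numbers here are illustrative, not field data): give every female a similar, modest brood, but let only a small fraction of males do the siring. Average reproductive success comes out identical for the two sexes, yet the spread around that average differs enormously.

```python
import random
import statistics

random.seed(42)

N = 1000                    # equal numbers of males and females
breeders = range(N // 10)   # under polygyny, only 1 male in 10 sires young

male_offspring = [0] * N
female_offspring = [0] * N

for female in range(N):
    sire = random.choice(breeders)  # every female breeds; few males do
    brood = random.randint(1, 4)    # modest, similar brood per female
    male_offspring[sire] += brood
    female_offspring[female] += brood

# Means are identical (every offspring has one mother and one father),
# but the male variance dwarfs the female variance: many zeros among
# the bachelors, a few big winners among the harem-holders.
print(statistics.mean(male_offspring) == statistics.mean(female_offspring))
print(statistics.pvariance(male_offspring) >
      statistics.pvariance(female_offspring))
```

This is the "winner-take-all versus everyone-wins-modestly" contrast in miniature: the bulls' payoff distribution is lumpy, the cows' is flat.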

Whenever a species shows a consistent pattern of males larger and stronger than females, it’s a good bet that polygyny is involved. Greater male size isn’t by itself proof of polygyny, but it points in that direction. Note, as well, that in this and other cases, we are dealing with a statistical generalization, which is not invalidated by the fact that some men are indeed smaller than some women. The fact remains that, by and large, men are larger and physically stronger than women. Not coincidentally, by the way, women are “stronger” in that they live longer, something probably due in large part to the rigors of male-male competition … itself due to polygyny.

#2) In all polygynous species, males aren’t just larger than females, they are more prone to aggression and violence, especially directed toward other males. In many cases, males are also outfitted with intimidating anatomy that contributes to their potential success: horns, antlers, large canines, sharp claws, etc. But once again, these accouterments only make sense insofar as their possessors are inclined to employ them.

It wouldn’t do for a bull (elk, seal, or baboon), no matter how large and imposing, to refrain from using his bulk when it comes to competing with other bulls. There is little evolutionary payoff to being a Ferdinand among bulls. No matter how imposing, such a bull could refrain from the competitive fray, saving himself the time and energy his colleagues expend in threatening, challenging and – if need be – fighting each other, not to mention the risk of being injured or killed in the process. Ferdinand would doubtless live a longer life, and probably a more pleasant one. But when he died, his genes would die with him. Publish or perish.

Accordingly, just as polygyny generates sexual dimorphism in physical size, it works similarly with regard to behavior, and for the same basic reason. As with the male-female difference in physical size, male-female differences in violent behavior vary with the degree of polygyny. As expected, the male-female difference in aggressiveness and violence among highly polygynous species is very great. And moderately polygynous species? Here the difference is, not surprisingly, moderate.

Among human beings, men – starting as boys – are more aggressive and violence-prone than are women. I have found that the male-female difference in perpetrators of violent crime is about 10 to 1, consistent across every state in the US, and true of every country for which such data are available. Moreover, this difference grows when proceeding from less violent to more violent crimes: the male-female difference in petty crime, for example, is very slight, greater when it comes to robbery, greater yet with regard to assault, and most dramatic in homicides. This is true even when the actual crime rates differ dramatically across different countries. Thus, the homicide rate in Iceland is about one percent of that in Honduras, but the male:female ratio of those committing homicide is essentially unchanged. Overall cultural differences between Iceland and Honduras are very great, which doubtless explains the overall difference in homicide rates. However, male-female differences remain proportionately unchanged, just as male-female differences in human biology don’t vary between Iceland and Honduras, or indeed, any place people are found.

#3) In all polygynous species, females become sexually and socially mature at a younger age than do males. This phenomenon is superficially counter-intuitive, since when it comes to reproducing, females by definition are the ones who bear the greater physiological and anatomical burden: eggs are much larger than sperm; among mammals, females, not males, are the ones who must build a placenta and then nourish their offspring from their own blood supply. Females, not males, not only undergo the demands of pregnancy and birth; they also provide all the calories available to their infants via nursing (and, little known to most people, lactation actually makes even greater energy demands than does gestation).

Based on these considerations, we would expect that, if anything, females would delay their sexual maturation until they are proportionally larger and stronger than males, since when it comes to producing children, the biologically mandated demands upon males are comparatively trivial: just a squirt of semen. But in fact, not only are females typically smaller than males, as we have seen; they also become sexually mature earlier rather than later, because of yet another consequence of the power of polygyny. Male-male competition (mandated, as we have seen, for the harem-keeping sex) makes it highly disadvantageous for males to enter the competitive fray when they are too young, too small, and too inexperienced. A male who seeks to reproduce prematurely would literally be beaten up by his older, larger and more savvy competitors, whereas early-breeding females – who don’t have to deal with the same kind of socio-sexual competition – don’t suffer a comparable penalty.

And so, among polygynous species, females become sexually and socially mature earlier than do males. The technical term is “sexual bimaturism,” and anyone who has ever seen 8th, 9th or 10th graders, or simply been an adolescent, will immediately recognize the phenomenon, whereby girls aged 12 to 16 are not only likely to be (temporarily) taller than their male classmates, but considerably more mature, socially as well as sexually. 

Once again, and as expected, the degree of sexual bimaturism among animals varies directly and consistently with their degree of polygyny. Sexual maturation occurs at roughly the same age for males and females in primate species that are monogamous: e.g., Latin American marmosets and owl monkeys.  Among polygynous species – e.g., rhesus macaques, squirrel monkeys and, indeed, nearly all primates (human as well as nonhuman) – males mature more slowly and thus, they reach social and sexual maturity later than do females, when they are considerably older and larger than “their” females, and also, not coincidentally, considerably older and larger than the other, less successful males. The sexual bimaturism so familiar to observers of Western teenagers (and to those teenagers themselves!) is a cross-cultural universal. And finally,

#4) We have the simple historical record, confirmed by anthropology. A noted cross-cultural survey of 849 societies found that prior to Western imperialism and colonial control over much of the world - which included the imposition of historically recent Judeo-Christian marital rules - 708 (83%) of indigenous human societies were preferentially polygynous. Among these, roughly one-half were usually polygynous and one-half, occasionally so. Of the remainder, 137 (16%) were officially monogamous  and fewer than 1% polyandrous.*
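
For what it’s worth, the arithmetic behind those percentages checks out; a quick sketch using the counts quoted above (the remainder, 849 − 708 − 137 = 4 societies, is the polyandrous sliver):

```python
total, polygynous, monogamous = 849, 708, 137
polyandrous = total - polygynous - monogamous   # 4 societies

print(round(100 * polygynous / total))   # → 83: preferentially polygynous
print(round(100 * monogamous / total))   # → 16: officially monogamous
print(100 * polyandrous / total < 1)     # → True: "fewer than 1%" polyandrous
```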

So, our biological polygyny can be considered essentially proved (a Popperian would say it has withstood every available attempt to disprove it). But as I’ll describe in my next post, women aren’t nearly as sexually passive as all this polygyny business might suggest.

*As it happens, these proportions are almost identical to those found among primates, too: monogamy occurs in roughly 15% of nonhuman primate species, and various degrees of polygyny are found in nearly 85%, with polyandry virtually unknown. 

Mon, 25 Sep 2023 17:23:09 +0000 https://historynewsnetwork.org/blog/153746 https://historynewsnetwork.org/blog/153746 0
People Are Polyandrous, and Not Just Polygynous

Illustration of Draupadi, a princess and queen in the Indian epic "Mahabharata", with her five husbands (Wikipedia)

David P. Barash is an evolutionary biologist and professor of psychology at the University of Washington; his most recent book is Out of Eden: Surprising Consequences of Polygamy (2016, Oxford University Press). A version of this post recently appeared in Psychology Today.

Human history did not begin with historians, or with the events recorded and interpreted by them. It is as old as our species ... actually, older yet, but for my purposes, it’s enough to inquire into those aspects of our past that gave rise to our behavioral inclinations. Among these aspects, sex is prominent (albeit not uniquely formative). I wrote earlier about polygyny, which is dramatically evident in our bodies no less than our behavior. But polyandry, the mirror image of polygyny, is also “us”; ironically, we are both. Part of human nature inclines us to male-oriented harems, but also – although more subtly – to their female-oriented equivalent.

When biologists such as myself began doing DNA fingerprinting on animals, many of us were shocked, shocked, to find that the social partner of even some of the most seemingly monogamous bird species was not necessarily the biological father of her nestlings. And people aren’t altogether different, although for understandable reasons, the sexual adventuring of women has long been more obscured. Polyandry – unlike polygyny – has only rarely been institutionalized in human societies, and yet women, like men, are also prone to having multiple sexual partners. (This may seem – even be – obvious, but for decades biologists had assumed that female fidelity was generally the mirror-image opposite of predictable male randomness.)

Male-male competition and male-based harem-keeping (polygyny) are overt, readily apparent, and carry with them a degree of male-male sexual intolerance which also applies to polyandry, whereby “unfaithful” women along with their paramours are liable to be severely punished if discovered. This intolerance is easy enough to understand, since the evolutionary success (the “fitness”) of a male is greatly threatened by any extra-curricular sexual activity by “his” mate. If she were inseminated by someone else, the result would be a payoff for the lover and a fitness decrement for the cuckolded male. As a result, selection has not only favored a male tendency to accumulate as many females as possible (polygyny), but also an especially high level of sexual jealousy on the part of males generally and of men in particular. This, in turn, pressures polyandry into a degree of secrecy not characteristic of polygyny. Another way of looking at it: patriarchy pushes polyandry underground, but does not eliminate it.

Female harem-keeping – polyandry – goes against some aspects of human and mammalian biology, once again because of the difference between sperm-making (what males do) and egg-making (a female monopoly). Although a male’s fitness is enhanced with every female added to his mating prospects, the same is much less true for the fitness of a female who mates with additional males. There can indeed be a payoff to females who refrain from sexual exclusivity (actually, there are many such payoffs); however, there are also substantial costs, not least running afoul of the male sexual jealousy just described. Thus, even though females can sometimes enhance their fitness by mating with additional males, they are simultaneously selected to be surreptitious about their sexual adventuring. Hence, polyandry – unlike its overt counterpart, polygyny – is more likely to be covert and hardly ever proclaimed or institutionalized. It also doesn’t reveal itself in such blindingly obvious traits as sexual dimorphism in physical size, aggressiveness, or differences in age at sexual maturity, since unlike the situation among males, natural selection does not clearly reward such readily apparent traits among females.

Men, in their befuddlement, have had a hard time seeing female sexuality for what it is, consistently either over- or under-estimating it.  Thus, women have often been portrayed as either rapacious and insatiable, or as lacking sexual desire altogether.  At one time, Talmudic scholars entertained such an overblown estimate of women's sexuality (and society's responsibility to repress it) that widows were forbidden to keep male dogs as pets!  But as noted psychologist Frank Beach pointed out, "any male who entertains this illusion [that women are sexually insatiable] must be a very old man with a short memory or a very young one due for a bitter disappointment."  Or, as anthropologist Donald Symons put it, "The sexually insatiable woman is to be found primarily, if not exclusively, in the ideology of feminism, the hopes of boys, and the fears of men."

In this regard, it is worth mentioning that some anthropologists have recently begun reassessing the received wisdom as to polyandry’s rarity. Don’t get the wrong idea: it is still extremely unusual, although a new category, “informal polyandry,” has been proposed to include societies in which more than one man mates regularly with the same woman. These circumstances are found in a number of societies beyond the standard “classical polyandry” of the Himalayas, Marquesas Islands and parts of the Amazon basin. Informal polyandry often co-occurs with a local belief system known as “partible paternity,” in which it is thought that if multiple men (albeit rarely more than two or three) have sexual intercourse with a pregnant woman, then they literally share paternity of any offspring that result. 

Granted that the evidence for human polyandry is more speculative than dispositive, here, nonetheless, is a sample of the arguments.

1) Human beings are unusual among mammals in that females conceal their ovulation. In most species, ovulation is conspicuously advertised, but not us! Indeed, even in the medically sophisticated 21st century, and despite the fact that reproduction is such a key biological process, it is remarkably difficult to ascertain when most women are ovulating. There is considerable controversy over why women’s ovulation is kept so secret, but one intriguing possibility is that it facilitates a woman’s ability to deceive her mate as to precisely when she is fertile. If our great-great-grandmothers sported a dramatic pink cauliflower bottom when they were in season, our great-great-grandfathers could have mated with them at such times, then ignored them while pursuing their own interests. (Which notably would have included mating with other women.) But by hiding their ovulation, our female ancestors essentially made it impossible for our male ancestors to guard them all the time, giving the option for great-great-grandma to sneak copulations with other men, of her choosing, thereby avoiding the ire of her social partner while obtaining whatever benefits such “extra-pair copulations” may have provided.

2) Women are also unusual among mammals in lacking a clear behavioral estrus (or “heat”) cycle. As a result, they are able to exert a remarkable degree of control over their choice of a mate, unlike most female mammals, who find themselves helplessly in thrall to their hormones. Absent such choice, polyandry would be indistinguishable from literal promiscuity. The word “promiscuity” carries with it a value judgment, distinctly negative. For biologists, however, it simply means the absence of subsequent bonding between the  individuals involved. Some animals appear to be truly promiscuous. For example, many marine organisms – such as barnacles, or oysters – squirt their gametes (eggs and sperm) into the water, usually responding to chemical signals but not engaging in anything like mate choice. But for the most part, promiscuity is rare, since nearly all living things – females in particular - are more than a little fussy when it comes to settling on a sexual partner, even as they may end up with many such partners.

Females in general and women in particular have a substantial interest in making a good mating choice (or choices), if only because biology mandates that such decisions are especially consequential for them: children are born quite helpless, and their prospect of biological success is enhanced by many factors, notably parental care and attention, in addition to good genes. And indeed, females in general and women in particular are especially fussy when it comes to choosing their sexual partner(s).

3) Recent studies by evolutionary psychologists have shown that during their ovulatory phase, women are attracted to men whose body build and facial features reflect high testosterone and basic good health (that is, good genes), whereas otherwise, they are more influenced by indications of intelligence, kindness, sense of humor, ambition and personal responsibility. In other words, women follow a two-part reproductive strategy consistent with an evolutionary history of polyandry: mate, when possible, with partners carrying those good genes, but associate socially with those offering the prospect of being good collaborators and co-caretakers of children. For some women – those fortunate to pair with men providing both good genes and good parenting/protection prospects – the risks of polyandry (especially, their husband’s ire, potential violence and possible abandonment) outweigh the possible benefits. But for others, quite possibly the majority, the opposite can be true.

4) The adaptive significance (evolutionary payoff) of female orgasm has long been debated. Among the possible explanations – all consistent with polyandry – is that orgasm enables women to assess the appropriateness of a short-term mating partner as a long-term prospect, while another suggests that female climax is not only rewarding for the woman in question but also reassuring for her partner, providing confidence that she will be sexually faithful … while giving her the opportunity to be exactly the opposite.

5) Given that polyandry among animals is predictably correlated with reversals in the “traditional” forms of sexual dimorphism, why hasn’t human polyandry resulted in women being larger and more aggressive than men? For one thing, it isn’t possible for men to be larger than women (which, as we’ve seen is mostly a result of polygyny) and for women to be larger than men (because of polyandry)! And for another, because of the difference between sperm and eggs, the reproductive payoff to polygyny – and its associated male-male competition – is substantially greater than that of polyandry, which in turn has caused polygyny to be the more prominent driver of human sexual evolution. This is not to claim that polygyny (acting mostly on males) is any more real than is polyandry (acting mostly on women), but rather that its effects are more dramatic and more readily identified.

6) Because of the negative fitness consequences for men resulting from polyandry on the part of “their” women, we can expect that men would have been selected to be quite intolerant of it. And indeed, sexual jealousy is a pronounced human trait – also widespread among animals – not uncommonly leading to physical violence. Such a powerful and potentially risky emotional response would not have been generated by evolution if women weren’t predisposed, at least on occasion, to behave in a way that is adaptive for them. 

Mon, 25 Sep 2023 17:23:09 +0000 https://historynewsnetwork.org/blog/153752 https://historynewsnetwork.org/blog/153752 0
Human Beings Aren’t Elephant Seals. But Still ....

David P. Barash is an evolutionary biologist and professor of psychology at the University of Washington; his most recent book is Out of Eden: Surprising Consequences of Polygamy (2016, Oxford University Press). 

 To get a clear understanding of a general process, it often helps to pay special attention to the extremes. This is particularly true of a spectrum phenomenon, of which polygyny is an excellent example (there are varying “degrees of polygyny” just as there are degrees of polyandry). And the species Homo sapiens is both polygynous and polyandrous. Accordingly, let’s take a quick look at an extreme case of polygyny in the animal world, because when we do, we’ll see ourselves  - albeit in caricature.

            Elephant seals are very, very large. In fact, elephantine. Bulls can reach 16 feet in length and weigh more than 6,000 pounds. Cows are much smaller, about 10 feet long and weighing around 2,000 pounds. This size difference is important, since it arises because of the elephant seal mating system: the species might be the most polygynous of all mammals, with successful males establishing harems of up to 40 females. Since (as in most species) there are equal numbers of males and females, this means that for every highly successful bull seal there are roughly 39 unsuccessful, reproductively excluded bachelors. In the world of elephant seals, every healthy female gets mated, but only a very small proportion of males are comparable evolutionary winners. On average, four percent of the bulls sire 85% of all the offspring.[i] Bulls therefore fight long and hard among themselves for possession of a harem. Success requires large size, a violent temperament, massive canine teeth combined with willingness to employ them, a thick chest shield to provide protection from one’s opponent, and sufficient age and experience.
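
A back-of-envelope sketch of that arithmetic (illustrative, not Le Boeuf’s field data): with an even sex ratio, if every breeding bull holds a harem of average size h, then only 1/h of the males breed at all, which is why large harems necessarily mean many bachelors.

```python
def fraction_of_bulls_breeding(mean_harem_size):
    # Even sex ratio: one breeding bull per harem of females,
    # so the remaining males are reproductively excluded.
    return 1.0 / mean_harem_size

# A 40-cow harem leaves ~39 bachelors for every winner.
print(fraction_of_bulls_breeding(40))   # → 0.025

# The reported figure that ~4% of bulls do most of the siring is
# consistent with an effective harem size of roughly 25.
print(fraction_of_bulls_breeding(25))   # → 0.04
```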

            Female elephant seals wean their babies in late summer and early fall, after spending much of the summer on land, members of a crowded, beachfront harem. It turns out that by the time they are weaned, some young elephant seals are considerably larger than others – as much as twice the size of their fellow weanlings. These over-sized juveniles are known as “super-weaners.” Their greater size conveys a distinct benefit, since after spending a more or less idyllic time on their rocky beaches, nursing from their mothers, at summer’s end and upon being weaned the pups must begin a long sojourn at sea, not returning to land until the following spring.  This is, not surprisingly, a stressful time for young elephant seals, and – also not surprisingly – those who were super-weaners are more likely to survive. It isn’t known whether male super-weaners are, in turn, more prone to eventually become harem-masters, but it’s a good bet, since in a highly competitive system, anything that provides a “leg up” in physical condition is likely to bring benefits.

            So far, so good, at least for the super-weaners. A question arises, however. Why – given the payoff of being super-sized – aren’t all elephant seals super-weaners?  It turns out that since elephant seal mothers are limited in how much milk they can produce, there is only one way to become a super-weaner: a pup must obtain milk from two lactating females. How to achieve this? It’s not easy. Females are quite determined to make their milk available only to their offspring, not to someone else’s. This selfishness makes a lot of evolutionary sense, since nursing mothers who were profligate with their precious milk would have left fewer descendants (and thus, fewer copies of their milk-sharing genes) than others who were disinclined to wet-nurse an unrelated pup.

            Nonetheless, even though every pup has only one genetic mother, it’s still possible for a pup to get milk from two “mothers.” Elephant seal pups occasionally die while nursing, either from “natural causes” or because they are literally squashed during the titanic battles among oblivious, competing bulls, who have females (not the safety of young pups, who were sired the previous year, possibly by a different male) on their minds. The death of nursing infants provides an opportunity for an enterprising young pup: if he can locate a bereaved mother – quickly enough after her infant has died that her milk hasn’t dried up – he might induce her to permit him to nurse in place of the recently deceased infant.

            This is an effective strategy, but also a risky one, since most females don’t take kindly to allowing an unrelated baby to suckle. “Sneak sucklers” often get bitten, and may die of their wounds. But successful ones become what are known (in the technical literature, thanks to the detailed research of elephant seal maven Burney Le Boeuf) as “double mother suckers” … and they, in turn, become super-weaners. Here is the kicker: all double mother suckers are male! Chalk it up to the pressure of polygyny – in the case of elephant seals, super-polygyny – which, because of the potential payoff to males of being larger, stronger, and healthier than their competitors, leads to super-weaners by way of double mother sucking. All of this requires, of course, a willingness to take risks, certainly greater willingness than is shown by female pups, who, as the harem-kept sex rather than the harem-keepers, are pretty much guaranteed the opportunity to breed so long as they survive. For males in a highly polygynous species, mere survival isn’t enough. They must stand out from their peers.

As described in my recent book, a number of human traits can be understood as resulting from our shared history of moderate polygyny. Human beings aren’t elephant seals. Few – if any – of our fellow Homo sapiens are double mother suckers. Nonetheless, the data are overwhelming that little boys are more risk-taking, on average, than are little girls,[ii] a difference that continues throughout life and is most intense among adolescents and young adults – precisely the age at which reproductive competition was most intense among our ancestors, and to some extent, still is today. Examples of extreme polygyny, such as elephant seals, reveal exaggerations and caricatures of traits found in human beings as well. We are biologically designed to be mildly, not wildly, polygynous, but the traits found in such extreme cases as elephant seals, elk and gorillas shed light on the more modest but nonetheless real and otherwise perplexing reality of what it means to be human.

[i] B. J. Le Boeuf and J. Reiter (1988) Lifetime reproductive success in northern elephant seals. In T. H. Clutton-Brock (ed.), Reproductive Success. Chicago: University of Chicago Press.

[ii] E. E. Maccoby and C. N. Jacklin (1974) The Psychology of Sex Differences. Stanford, CA: Stanford University Press.

https://historynewsnetwork.org/blog/153757
The Pleistocene Push to Support Dominant Alpha Leaders*

David P. Barash is an evolutionary biologist and professor of psychology at the University of Washington. His most recent book is Out of Eden: The Surprising Consequences of Polygamy (Oxford University Press, 2016).

*And by support I mean "succumb."

   I’ve already noted – in posts here at HNN as well as in my most recent book, Out of Eden: The Surprising Consequences of Polygamy – that human beings carry with them undeniable indications of a polygamous heritage. Beyond the more obvious albeit often surprising implications for sexual, parental and aggressive behavior, this fact (and it is so well established that it can indeed be considered a “fact”) also has other, more indirect consequences.

            Among these, I suspect, are implications for our political lives, as manifest in “real time” as the 2016 presidential campaign develops. I’m thinking especially of the pressure we often feel to “fall in behind” the presumed candidate of our favored party.

            I’ve recently been in correspondence with a highly intelligent software engineer who is deeply committed to non-sexist parenting, nonviolence and means of overcoming what he and I see as some of the regrettable legacies of our polygamous – especially polygynous – past. This young man (I’ll call him Josh because, well, that’s his name) is painfully aware of the downsides of “alpha maleness” as well as of the widespread tendency to accept the status, and often the dictates, of such dominant dudes. He suggested to me that there must have been a substantial evolutionary advantage – at least during our Pleistocene childhood – to affiliating with such individuals, presumably a greater prospect of protection against common enemies, whether predatory animals or other predatory hominins.

            Maybe so. But I suspect that no less important than enhanced safety and security regarding outside threats was the payoff of gaining protection from the threatening, domineering and dangerous alpha dude himself. And that’s where the current election (and to some extent, any election) reveals a whiff of that ancient pattern: sign on with the leader or risk being left behind. The alpha silverback will remember who was on board and who wasn’t. LBJ was a master at threatening and bullying his fellow Senators, especially those who were shorter than he was.

            The New York Times of July 8 ran a story – “Some Senators in GOP Feel Trump’s Wrath” – describing an acerbic meeting between what passes for moderate Republican senators these days and the party’s “presumptive nominee,” and which “descended ... into an extraordinary series of acrid exchanges, punctuated by Mr. Trump’s threatening one Republican senator and deriding another as a ‘loser.’ ”

            According to the Times, “Mr. Flake, of Arizona, told him that he wanted to support Mr. Trump, but could not because of Mr. Trump’s statements about Mexican-Americans and attacks on a federal judge over his Hispanic descent. Mr. Trump responded by saying that he had been going easy on Mr. Flake so far, but that he would ensure that Mr. Flake lost his re-election bid this year if the senator did not change his tune. Dumbstruck, Mr. Flake informed Mr. Trump that he was not up for re-election this year…. Mr. Trump even aimed vitriol at a senator who did not show up, according to people who attended the meeting: Senator Mark S. Kirk of Illinois, who recently withdrew his support for Mr. Trump. Mr. Trump called Mr. Kirk ‘dishonest’ and a ‘loser’ and suggested that Mr. Kirk really wanted to support Mr. Trump but was refusing to for political reasons, the attendees said. Mr. Kirk is among the most embattled incumbent Republican senators seeking re-election in November.”

            Swear allegiance to the dominant alpha leader or risk reprisal: It’s a call that our genes have long known and to which they have nearly always succumbed, if only because those that didn’t were less likely to have a future, whether social, political or biological.

https://historynewsnetwork.org/blog/153786
Dr. Strangelove Resurfaces

David P. Barash is an evolutionary biologist and professor of psychology at the University of Washington; his most recent book is Out of Eden: The Surprising Consequences of Polygamy.

The bizarre possibility exists that under President Trump, the United States may at last get some leverage out of its nuclear arsenal.

                 When President Richard Nixon’s former chief of staff, H. R. Haldeman, was waiting to begin serving a prison term for his involvement in the Watergate scandal, he wrote a memoir. In it, Haldeman described how during the 1968 presidential campaign – at the height of the Vietnam War – Nixon shared his plan to get the North Vietnamese to bend to his will. “I call it the Madman theory, Bob. … We’ll just slip the word to them that, ‘for God’s sake, you know Nixon is obsessed about Communism. We can’t restrain him when he is angry—and he has his hand on the nuclear button.’—and Ho Chi Minh himself will be in Paris begging for peace.” It didn’t work.

                 As far as can be seen, “it” has never worked; that is, no country’s leadership (including but not limited to the United States) has been able to manipulate the heads of other countries by the threat of nuclear annihilation. Nukes didn’t help the U.S. in Vietnam, Iraq, or Afghanistan, nor did they prevent 9/11. They didn’t inhibit Argentina from invading the Falklands/Malvinas, even though the UK had nuclear weapons and the Buenos Aires junta did not. They didn’t enable France to hold onto Algeria, nor did they contribute in any positive way to the USSR’s tribulations in Afghanistan, or assist Russia in its bloody conflicts in Chechnya, Georgia, Ukraine or Syria. They were useless to NATO in Bosnia, Serbia, Libya, and Kosovo, and they haven’t helped the U.S. in Somalia or in confronting ISIS.

                 There are many reasons for this, not least that the use of nuclear weapons lacks credibility. As many strategists have lamented, it is impossible to fashion a credible threat out of an action that is literally unbelievable. The reason nuclear threats are incredible is itself multifaceted, partly a result of the horrific and grotesquely outsized degree of “collateral damage” they entail, exceeding any reasonable ethical construct consistent with the notion of a “just war.” In addition, when directed toward another nuclear-armed nation, such a threat is typically discounted because a “first strike” would almost certainly generate a catastrophic “second strike” in retaliation (this, for better or worse – mostly worse – is at the heart of nuclear deterrence). By the same token, a police officer could not credibly stop or arrest a bank robber by threatening to detonate a backpack nuclear explosive – that would obliterate the robber, the police officer, the bank and the community.

                 Enter Donald Trump. Ironically, President Trump could end up endowing U.S. nuclear weapons with precisely the credibility they have previously lacked. Even Richard Nixon, with his seriously flawed personality, didn’t frighten Ho Chi Minh as a credible madman. But from everything one can tell about Mr. Trump, he could very well fill the bill. I am not in a position to diagnose him psychiatrically, although I strongly suspect that he is in fact mentally ill: suffering from both paranoia and narcissistic personality disorder, quite possibly with a dose of bipolar disorder and no small degree of sociopathy. He is, in any event, undoubtedly impulsive, vindictive, egocentric and terrifyingly ill informed – particularly on nuclear issues. Hence, he may supply precisely the twisted credibility that has been lacking since the bombing of Hiroshima and Nagasaki in 1945.

                 The pursuit of credibility has long bedeviled strategic doctrine. It constituted the backdrop for a breakthrough book titled Nuclear Weapons and Foreign Policy, written in 1957 by a then little-known academic named Henry Kissinger, and was the backbone of two lectures titled “The Political Uses of Madness” given by another young scholar – one Daniel Ellsberg - at Harvard in 1959. 

                 Worry about nuclear credibility has been the demon responsible for some of the most dangerous escalations in nuclear weaponry, such as neutron bombs, designed to kill people but to leave infrastructure intact, thereby seeming more usable in such crowded venues as Germany, once described by a senior military officer as composed of towns “only two kilotons apart.”  Tactical battlefield nuclear weapons generally owe their development and deployment to worry that strategic nukes – intended to obliterate an adversary’s homeland – inherently lack credibility, because their use would presumably bring about an unacceptable retaliation.

                 Concern about credibility has also given rise to computer-based systems of “launch on warning,” which, by taking the decision to incinerate the world out of the hands of (presumably sane and thus inhibited) human beings, are designed to make their use more credible ... at the risk of being more subject to computer error or other hardware malfunction. With President Trump, the United States will be spared the need to shore up the credibility of its nuclear arsenal, even if he doesn’t follow through on his recently tweeted threat to “strengthen and expand its nuclear capability.”

                 Among the many paradoxes of nuclear weapons is this: there is no way to get around the credibility skeleton that lurks in the closet of deterrence – and which renders them unusable under any rational calculus – other than by making them more usable, or by putting them in the hands of someone who is demonstrably irrational. And the more usable they are, which includes the more unstable the hands that control them, then by definition the more likely they are to be used.

                 A U.S. Strategic Command report in 1995 was titled “Essentials of Post-Cold War Deterrence.” It argued that “The fact that some elements [of the nuclear command authority] may appear to be potentially ‘out of control’ can be beneficial to creating and reinforcing fears and doubts within the minds of an adversary’s decision makers. This essential sense of fear is the working force of deterrence. That the U.S. may become irrational and vindictive if its vital interests are attacked should be part of the national persona we project to all adversaries.” Got that? Out of control, irrational and vindictive, especially if his self-defined vital interests are attacked? Sound like anyone you know?

                 North Korea’s Kim Jong-un wouldn’t likely be alarmed if President Trump sent a middle-of-the-night tweet storm his way; he might feel otherwise, however, about the possibility of a fleet of nuclear-armed missiles – so alarmed, in fact, that he might make the “rational” decision to pre-empt such an attack.

            So, am I happy and relieved that Donald Trump is about to have his "finger on the nuclear button," thereby enhancing the credibility of our much-beleaguered deterrent? Not on your life.


https://historynewsnetwork.org/blog/153883
Watching the Trump Administration Unravel Is a Schadenfreude Delight

David P. Barash is professor of psychology emeritus at the University of Washington, and the author of numerous articles and books, including Peace and Conflict Studies, 4th ed. (with C. Webel, Sage, 2017), the forthcoming Approaches to Peace, 4th ed. (Oxford University Press, 2017), as well as Paradigms Lost: The Pain and Pleasure of Seeing Ourselves as We Really Are (Oxford University Press, 2018). With Judith Eve Lipton, he is currently researching a book taking issue with nuclear deterrence. 

Trump can’t stop tweeting; fine for him, since he obviously doesn’t have anything important or beneficial for the country, or the world, to do with his time. In fact, I hope he keeps it up, or even increases his bizarre outbursts—especially given his current impeachable and perhaps criminal entanglements—insofar as this activity corresponds to the legal warning that fish mostly get hooked because they’ve opened their mouths. But what about me? I don’t tweet and have no legal liabilities, but having just retired from my university teaching job, I have many benevolent demands on my time: wife, children, six grandchildren, contributing to the anti-Trump resistance, as well as the maintenance of two horses, a goat, four dogs, four cats, a 10-acre farm, and several book projects underway. Yet I can’t seem to tear myself away from the ongoing legal-political-ethical-personal soap opera that has the Orange One at its center.

I’m reminded of one of my earliest clear memories, the Army-McCarthy hearings of 1954, the first such proceedings to be televised. I vividly recall my mother glued to our TV, a large wood cabinet with a tiny black and white screen, for literally days on end. Once I even stayed home from school because she was so hooked by the unfolding drama that she forgot to take me there. It was the first of what was to become a periodic punctuation in many people’s news-watching lives: the launch of Sputnik, Kennedy’s assassination, the first moon landing, Watergate, the Anita Hill hearings, Clinton’s impeachment. But the Army-McCarthy hearings – precipitated when Tailgunner Joe went too far and included the U.S. Army in his smear campaign – were the beginning, so engrossing that they gave rise to a song, “The Senator McCarthy Blues,” by the Atomic Platters. There doesn’t appear to be a YouTube clip of it, but the hilarious lyrics (including “Mommy, mommy, where’s the commie”) are retrievable here. The song also bespeaks the prevailing ethos of its time, with a man bemoaning that because his wife spends all her time glued to the hearings, dinner goes uncooked, the floor unwashed, and so forth.

Back to 2017 and my delight and fascination as the noose tightens around Trump’s neck so that if nothing else, his presidency and what I see as his loathsome policy agenda are both increasingly derailed. Why am I caught like this? Sure, I’m rooting for maximum disclosure, pain, embarrassment, and devastating fallout—political no less than legal—from this unfolding story. But it will play out without me, no differently than it would with me. Maybe I’m just a sucker for a kind of newsworthy gossip (although I eschew anything about celebrity culture), or—more likely—having suffered genuine psychic trauma in the aftermath of the November 2016 election, I’m currently soaking up the Trump Troubles as a kind of soothing schadenfreude.

My mother would have understood—she felt about McCarthy pretty much as I do about Trump. My father didn’t sing the “Senator McCarthy Blues” when the sink was full of dishes and the washer full of unwashed clothes; he watched the Army-McCarthy hearings, too, whenever he could, just as my wife, like me, spends too much time following the Trump Troubles, enjoying sarcastic comedy clips and devastating political cartoons.

The key similarity, of course, between the Army-McCarthy hearings and the current Trump Troubles is that both represent appropriate responses of government reining in rogue politicians. McCarthy was a serious threat to democracy and to basic human decency; so is Trump. McCarthy, however, wasn't literally a threat to the whole planet; Trump is. But into each life, even in the aftermath of a dubious and disastrous election that poses a grave danger to all that I hold dear, a little sun will occasionally shine, albeit in this case via grave difficulties for someone I thoroughly detest. And that's not all: when we want entertainment that is equally compelling and even more cheery, we can watch the return of Twin Peaks!


https://historynewsnetwork.org/blog/153938
Our Gaslighting Geezer

David P. Barash is professor of psychology emeritus at the University of Washington. His next book, "Through a Glass Brightly: Using Science to See Our Species as It Really Is," will be published this summer by Oxford University Press. With his wife, Judith Eve Lipton, he is currently writing a book about nuclear deterrence.

I gaslight, you gaslight, he/she/it gaslights. What can this mean? Isn't gaslight some sort of compound noun rather than a verb? But it turns out that gaslight is indeed a verb, one with an interesting history and, moreover, a troublesome relevance to the present.

In 1938, British playwright Patrick Hamilton's work, "Gas Light," was produced in London, and ran for six months. Then it was adapted into a 1940 movie, titled "Gaslight," directed by Thorold Dickinson and starring Anton Walbrook and Diana Wynyard. Four years later, it debuted as an MGM film, featuring Charles Boyer and Ingrid Bergman, with a budget an order of magnitude larger and comparably less fidelity to the original play. (As part of the contract, MGM insisted on destroying all pre-existing copies of the 1940 film ... at which, fortunately, the mega-studio was unsuccessful.)

But what is "gaslighting," and what does it have to do with the year 2018?

In the story, a wealthy, psychologically "delicate" woman (Bella Mallen) has married a suave, debonair, controlling and decidedly creepy continental European (Paul Mallen) and purchased a multistory London apartment where, decades before, a rich old lady had been murdered and the place ransacked; she was known to have had some extremely valuable rubies. No one had been willing to live in the apartment until its current occupants arrived. Soon it is apparent that Bella is having a hard time: she repeatedly misplaces items, is blamed – by her husband – for taking things when she claims innocence, seems to imagine events that aren't real, and increasingly comes to doubt her own sanity.

It also becomes clear that she isn't crazy after all, but is being craftily manipulated by the nefarious Paul Mallen, who wants to have her "committed" because she accidentally encountered evidence that, unknown to her, could be used against him. The story is set in 1880 London, and in those Victorian times, a husband could have his wife committed simply on his say-so (#MeToo didn't exist.) 

Paul regularly prowls about the attic, searching, we discover, for those rubies which he – the murderer – hadn't been able to find when he committed the crime. As he searches, he turns on the attic gas lights (explaining the movie's title at last), which causes the lights in the rest of the house to dim and flicker; after all, this is 1880, before electricity. Poor belabored Bella notices this, but others – including her well-meaning maid-servant – attribute it to yet another flight of her over-stressed mind.

Eventually, all is set to rights, thanks to the intervention of a retired Scotland Yard detective, who – among other things – confirms the reality of Bella's perception and the guilt of homicidal, perfidious Paul. The film's major contribution to popular culture, as well as to the psychological literature, has been to introduce the verb "gaslight," meaning to induce someone to question their sanity and grasp of reality by malevolently manipulating their perceptions.

In the film, especially the 1940 original, which, despite its over-acting and comparatively poor production values, is far superior to the 1944 remake, Paul Mallen is frightening in his malignance, narcissism, lack of empathy, and overall creepiness. But he isn't half as scary, or creepy, as the current occupant of the White House.

Readers of HNN are doubtless sufficiently well informed that they don't need a recitation of the continuing slurry of deceptions, misrepresentations, and outright lies that have emanated from and continue to disfigure the Chief Executive and his administration. Moreover, when caught in these lies, Donald Trump has unfailingly blamed the truth-tellers and doubled down on his own mendacity, with a self-righteous insistence that makes Paul Mallen look like the mythically honest Abe Lincoln, or George "I cannot tell a lie" Washington.

I doubt that Trump, unlike the fictional Mr. Mallen, is consciously gaslighting the country, in the sense of deliberately trying to drive us insane; as many have commented, he may actually believe his prevarications, in at least some cases. But the effect of his behavior is more important than his intention. Many veteran journalists and seasoned political observers have noted the extent to which Trump's obvious lies, his self-serving, dishonest claims of "fake news," as well as the unending barrage of positions taken only to be disavowed immediately afterward have been not only confusing but downright disorienting, so much so in fact that the victims are at risk of becoming as unstable as the perpetrator.

In his March 5 column, the New York Times's Charles Blow put it with characteristically accurate acerbity: "People used to dealing with a sane, logical person who generally doesn’t lie and generally makes sense are left scratching their heads, wondering whether to believe what they have heard, whether to make plans and policies around it. Believing anything Trump says is a recipe for a headache and heartache. The old rules no longer apply. We see the world as through a window — as it is, even if we are a bit removed from the whole of it. But Trump sees it as if in a house of mirrors — everything reflecting some distorted version of him. His reality always seems to return to a kind of delusional narcissism."

At the end of "Gaslight," Paul Mallen gets his comeuppance, and I am not alone in hoping that Donald Trump gets his, too. Be warned, however: when this happens in the movie, the cinematic sinner goes more than a bit crazy himself – decompensating, as psychiatrists would put it. But unlike Mr. Trump, the malignant Mr. Mallen didn't have nuclear weapons at his disposal.

https://historynewsnetwork.org/blog/154074
Does MAD really work?

One of B.F. Skinner's pigeons in a box

David P. Barash is professor of psychology emeritus at the University of Washington; he is writing a book about nuclear deterrence.

For all its mathematical folderol – not to mention being entrusted with the fate of the Earth – deterrence is likely a superstition.

In 1948, psychologist B. F. Skinner conducted an experiment in which hungry pigeons were fed on a random schedule. Soon three quarters of them were behaving in unusual ways, depending on what each had been doing just before getting food: one rotated her body (always counter-clockwise), another swung his head like a pendulum, a third wiggled her feet, and so on. The resulting research report was titled “Superstition in the pigeon.”

We’ll never know what if anything Skinner’s pigeons were thinking. But there’s no doubt that it’s time for us to think – or rather, rethink – our reliance on deterrence.

Although conventional deterrence has existed for a long time – think of China’s Great Wall (deterrence by denial), Rome’s use of its legions (deterrence by punishment), or even the roars of a lion and the thorns of a rose bush – nuclear deterrence is quite new, having existed only since 1945. Initially, the weapons and their presumed deterrent effect were a U.S. monopoly, thought to prevent the Red Army from rolling into Western Europe. Then, when the USSR became nuclear armed, we and they entered the MAD era (Mutually Assured Destruction), from which neither country has yet emerged even as others have joined, doctrines have been refined, and new weapons deployed.

Throughout, people have been remarkably pigeon-like, rarely questioning the underlying assumptions of nuclear deterrence, of which Winston Churchill proclaimed “Safety will be the sturdy child of terror, and survival the twin brother of annihilation.” Despite the terror, maybe deterrence really has been sturdy; after all, we have so far survived the nuclear age and avoided annihilation. But such confidence is, at best, premature.

Correlations, after all, can be spurious, as with ice cream consumption and drowning: although the two are correlated, it’s not because eating ice cream makes people drown, but because both events tend to happen in hot weather.

If a pigeon spun around and didn’t get fed, it would presumably have been disappointed, but no great harm would have been done. But if deterrence had failed (a frequent and terrifying trope among strategic planners), we likely wouldn’t be around to bemoan that particular inadequacy. And it only needs to fail once. Moreover, if you play Russian roulette indefinitely – whether with six chambers or 600 – it is mathematically certain that eventually you will take a bullet.
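The Russian-roulette point is simple probability, not rhetoric. A minimal sketch (the function name is mine) of how survival odds decay with repeated play, whatever the number of chambers:

```python
# With n chambers and a fresh spin before each pull, the chance of
# surviving k pulls is (1 - 1/n)**k, which tends to zero as k grows,
# no matter how large n is.
def survival_probability(chambers: int, pulls: int) -> float:
    return (1 - 1 / chambers) ** pulls

for pulls in (10, 100, 1000):
    print(pulls, survival_probability(6, pulls), survival_probability(600, pulls))
```

Even with 600 chambers, a thousand pulls leaves well under a 20 percent chance of survival – and a deterrence system that must work forever faces the same arithmetic.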

Following the Cuban Missile Crisis of 1962, when we came horrifyingly close to World War III, former Secretary of State Dean Acheson observed that we had avoided Armageddon by “sheer dumb luck.” And one thing we all know about luck is that eventually it runs out.

Maybe we’re like the person who has fallen from a skyscraper and who reassures herself, as she plummets down, “So far, so good.” 

The iconic argument for the success of deterrence is that the Cold War between the U.S. and the USSR never went nuclear. But aside from luck, perhaps this cheery outcome arose simply because the two countries never had any motivation sufficient for any war, conventional or nuclear. Unlike, say, India and Pakistan – both of which have nuclear weapons and have also had conventional wars – the two Cold War opponents didn’t share a common border or have conflicting territorial claims. And it’s worth recalling that the Cuban Missile Crisis, rather than being a triumph of nuclear deterrence, was caused by deterrence itself: Khrushchev sought to shore up the Soviet Union’s posture vis-à-vis the U.S. after we had deployed nuclear-armed Thor missiles in the UK and Jupiter missiles in Turkey – deployments ordered by President Eisenhower in hopes of, well, furthering our deterrence of the Soviets! It is reasonable to conclude that nuclear war wasn’t avoided because of deterrence but in spite of it.

The same applies to the numerous cases in which false alarms have brought deterrence to the brink of failure, as for example in 1983 when Stanislav Petrov, a mid-ranking Soviet air defense officer, received a report that five missiles, fired from the U.S., were heading toward the Russian homeland. This occurred at an especially fraught time in U.S.-Soviet relations, when the Reagan Administration was cavalierly maintaining the feasibility of surviving a nuclear war with the “evil empire,” and had recently shot down a Korean Air passenger plane, mistaking it for an American spy mission. Petrov concluded on his own that since his country’s early warning system was newly installed and liable to have some bugs, the report was probably a false alarm, so – risking serious punishment for insubordination – he didn’t pass along the alert, which would have necessitated that the ill and elderly then-President Andropov decide within minutes whether to “retaliate”  … to an attack that never happened.

As for that seemingly long nuclear peace since 1945, the historical reality is that the time-span from the beginning of the nuclear age until now isn’t really all that impressive. Not only has the U.S. been involved in many conventional wars (Korea, Vietnam, Iraq, Afghanistan), but even war-prone Europe experienced lengthy periods of peace during the 19th century alone: between the end of the Napoleonic Wars and the Franco-Prussian War, and thence until World War I, and so forth into the 20th century. Each time, peace was followed by war, and when that happened, it was fought with the weapons then available. Considering this, the decades-long absence of nuclear war – so far – may be something to savor, but is less than dispositive.

All of which makes one doubt the dogma that nuclear deterrence has worked, and that we should feel confident that it will continue to do so. Moreover, there is no evidence that nuclear threats – whether overt, via a proclaimed policy of deterrence, or implied, simply by possessing a nuclear arsenal – have conveyed increased international clout. On many occasions, non-nuclear countries have even attacked nuclear-armed ones. China sent its army against U.S. forces during the Korean War, in late 1950, even though the U.S. had hundreds of nuclear bombs, and Mao would not have any until 14 years later. Non-nuclear Argentina was similarly undeterred when it invaded the Falkland Islands, a territory of nuclear-armed Britain. By the same token, during the first Gulf War in 1991, non-nuclear Saddam didn’t hesitate to fire Scud missiles at nuclear-armed Israel; the government of Yitzhak Shamir didn’t play its supposed deterrence card and vaporize Baghdad in return.

There are, moreover, a number of other reasons why the Emperor Deterrence has no clothes, of which one in particular brings us to the current crisis on the Korean Peninsula. Deterrence readily elides into provocation, since doctrine, weapons, military exercises and verbal taunts lend themselves to interpretation as signaling intent to mount a first strike. Conventional military postures on both sides of the 38th parallel have long provided more than enough deterrence, with the thousands of North Korean artillery tubes roughly matched by the well-equipped South Korean military, along with about 28,000 American troops serving as a “tripwire.” But the North’s excessive anxiety about an invasion aimed at regime change has driven the Kim government to pursue a hyperactive nuclear weapons and missile program, an example of deterrence run amok that has evoked a comparably overblown and potentially lethal response from the Trump Administration.

This action-reaction sequence italicizes one of the many deep weaknesses of deterrence: not only does it rely on the perception by each side that the other is being self-protective rather than aggressively threatening (easier said than done), it assumes that all participants will behave with cool, well-informed, thoughtful, and rational judgment – even though everything known about human behavior (perhaps especially that of Messrs. Kim and Trump) shows that leaders can be violent, impulsive, thin-skinned, vindictive, ill-informed and downright sociopathic.

There are other problems, not least that the U.S. in particular has been moving toward smaller and more accurate nuclear weapons, especially suitable for tactical, battlefield use. This transition has been motivated by efforts to overcome one of the most troublesome aspects of deterrence, the fact that all-out nuclear war would be so horrible, and its effects so globally destructive (no matter who initiates it) that the weapons themselves aren’t really feasible; hence, they – and the deterrence they ostensibly underpin – lack credibility. The potent paradox is that the only way to imbue nuclear weapons with credibility (and thus, to bolster deterrence) is to make them relatively small and accurate enough that they are credibly usable – but the more usable they are, the more liable they are to actually be used. Add to this the fact that every war game scenario shows that such use inevitably escalates to all-out nuclear war.

The good news – and there is some – is that there are ways out of the deterrence trap. For starters, effective deterrence can be achieved, at least in the short term, with a tiny fraction of the overkill arsenals currently deployed. Despite the ignorant insistence of President Trump, there is certainly no need for more, and a crying need for fewer, eventually down to zero. Threats can be toned down, not just verbally but in terms of the weapons being deployed. The destabilizing targeting of any other country’s nuclear forces can also be ended. A fissile materials cutoff can be implemented, along with no-first-use doctrines. We can ratify the Comprehensive Test Ban Treaty, and sign on to the recently passed Nuclear Ban Treaty, already endorsed by more than 120 countries, which offers moral and legal authority to the delegitimation of these genocidal weapons.

We cannot mandate changes in Pyongyang’s nuclear procedures, or imbue Donald Trump with insight, thoughtful decency and a sense of international responsibility. But we can pass legislation mandating that no U.S. president – not Trump, not Pence, no one – can initiate first use of nuclear weapons, ever. Better yet, we can ensure that this never happens by getting rid of these indefensible weapons altogether, along with the deeply flawed superstitious ideology of deterrence that has justified their existence.

In short, we can prove ourselves wiser than pigeons.

Mon, 25 Sep 2023 17:23:09 +0000 https://historynewsnetwork.org/blog/154081
What Lay Behind the Toronto Van Attack?

David P. Barash is professor emeritus of psychology at the University of Washington. His most recent book, Through a Glass Brightly: Using Science to See Our Species as We Really Are, will be published summer 2018 by Oxford University Press.

     Like most Americans, I hadn’t heard the word “incel” (derived from “involuntary celibate”) until the recent murderous attack in Toronto. But without knowing it, I have been researching the phenomenon for decades - in animals, not in people, but the situation among our nonhuman relatives is illuminating.

      A widespread pattern, especially among mammals such as Homo sapiens – and one that is relevant to an evolutionary understanding of those unfortunate incels among us – is polygyny, in which a male has sex with multiple females.

      To understand why polygyny is so common, look no further than eggs and sperm. The former, production of which literally defines females, are relatively large and even when –  as in mammals –  not encased in big hard shells, fertilization obligates the mother to a large investment during pregnancy and then, after birth, lactation. By contrast, the male specialty – sperm – are tiny and produced in vast quantities. As a result of this asymmetry, one male can fertilize many females, and in species in which there are equal numbers of each sex, the stage is set for intense male-male competition to do the fertilizing. The fact that a minority of males are often able to obtain more than their share when it comes to sexual success means that there must be many other males that are left out.

      Nonhuman animals who find themselves sexually and thus reproductively excluded don’t join Internet chat groups where they share their frustration and fury at fornication foregone, but they can be violent – even lethal – troublemakers.

      For an extreme case among mammals (and thus, a revealing one since it italicizes the more general situation), consider elephant seals. Among these highly polygynous animals, a dominant male can sometimes accumulate a harem of 30 or so females, which necessitates that for every such successful harem-master, 29 males are relegated to bachelorhood. These sexually frustrated animal incels aren’t especially aggressive toward females, but they are violent indeed, almost exclusively toward their fellow males. 

      The evidence for an underlying penchant for polygyny among human beings is convincing. For starters, there is the generally larger size of men compared to women: averaging between 10% and 20%, and applying to height, weight and muscle mass. (The fact that some women are heavier, taller and/or stronger than some men doesn’t negate the overall differences.) This differential, technically known as sexual dimorphism, doesn’t prove anything by itself, although it is consistent with the male-male competition characteristic of other polygynous species, in which the less competitive males are necessarily deprived of sexual and thus reproductive opportunities.

      Human sexual dimorphism is also consistent with a polygynous evolutionary history when it comes to behavioral inclinations, with boys substantially more aggressive, on average, than girls, just as men are more aggressive and violent than women – once again, a difference that corresponds to the biological situation of other species in which males have been selected to compete for access to females. And in which some males, far more than some females, lose out.

      More evidence is provided by sexual bimaturism, in which girls mature earlier than do boys, a circumstance that is immediately apparent in any middle school or high school classroom. Given that reproduction is more physically demanding of females than of males, it would seem counter-intuitive that among human beings, girls become capable of having children at a younger age than boys, but it makes sense when we realize that because of the male-male competition associated with polygyny, it is adaptive for young males to delay entry into the competitive arena until they are older and larger.

      Then there is the fact that prior to the social and cultural homogenization that came with Western colonialism, roughly 85% of human societies were preferentially polygynous. And finally, testimony from our genes themselves: all human populations evaluated thus far show greater genetic diversity among our mitochondrial DNA – inherited from mothers  –  than among our Y chromosomes (inherited by males from their fathers). This means that modern human beings are derived from a comparatively smaller number of male than of female ancestors, because a select few of the former mated with a larger number of the latter.

      Put it all together and there is no doubt that Homo sapiens is a mildly polygynous species – not as extreme as elephant seals, but definitely setting the stage for some men to be less sexually and reproductively successful than others, unlike the biologically generated condition for women, in which the difference between the most and least “fit” is more muted. Accordingly, compared to men, incel women are exceedingly rare. (In some ways, then, better to be female or a gay male! Either way, you are likely to encounter potential partners, while experiencing at least somewhat less exclusionary male-male competition.)

      Of all living things, the human species is undoubtedly the most liberated from biological constraints and imperatives; evolution whispers within our DNA. But for a tiny minority of men who are particularly unfulfilled, unhappy, and dangerously unmoored, it sometimes shouts. 

Mon, 25 Sep 2023 17:23:09 +0000 https://historynewsnetwork.org/blog/154097
Trump's Game of Chicken

David P. Barash is an evolutionary biologist and professor of psychology emeritus at the University of Washington; his most recent book is Through a Glass Brightly: Using science to see our species as we really are (Oxford University Press), 2018.

Those of us who worry about President Trump starting a shooting war might well be relieved that the focus, for now, is on trade rather than explosions. Clearly, trade “war” is a figure of speech, a metaphor. A better one is Game of Chicken, as analyzed by mathematical game theorists. Admittedly, Chicken also isn’t a perfect model for the current US-China imbroglio, but it can be illuminating. 

Like war and generals, or politics and politicians, Games of Chicken are too important to be left to the game theorists alone. So here is a primer.

What happens when a chicken, instead of crossing the road, decides to run headlong into another chicken, who is similarly determined? The result could be a Game of Chicken, if certain conditions apply. 

Consider the classic Game of Chicken. Two cars speed toward each other. Each driver can do one of two things: Swerve or go straight. In a trade war, swerving means giving in to the other’s demands (i.e., for China, buying more American-made products, and for the US, abandoning its new tariffs).

To win, you must go straight; the one who swerves is the “chicken.” If both drivers swerve, neither wins but neither suffers relative to the other. But here is the crunch, literally: If both drivers go straight – i.e., if the trade war goes on, injuring both economies - both lose.
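The payoff logic just described can be sketched in a few lines of code. The numbers below are hypothetical – only their ordering matters (winning beats mutual swerving, which beats being the lone “chicken,” which beats a head-on crash):

```python
# Hypothetical payoffs (row player's score listed first); only the
# ordering matters: win > mutual swerve > lone swerve > mutual crash.
PAYOFFS = {
    ("swerve", "swerve"):     (0, 0),      # tie: neither gains on the other
    ("swerve", "straight"):   (-1, 1),     # row player is the "chicken"
    ("straight", "swerve"):   (1, -1),     # row player wins
    ("straight", "straight"): (-10, -10),  # head-on collision: both lose badly
}

def best_response(opponent_move):
    """The move that maximizes a player's payoff, assuming the
    opponent's move is known in advance."""
    return max(["swerve", "straight"],
               key=lambda move: PAYOFFS[(move, opponent_move)][0])

# If you believe the other side will swerve, go straight and win...
assert best_response("swerve") == "straight"
# ...but if you believe they will go straight, swerving beats crashing.
assert best_response("straight") == "swerve"
```

This is why the game turns on manipulation: each of the two stable outcomes has exactly one player swerving, so each side’s best hope is to convince the other that it will not be the one to back down.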

It is said that Games of Chicken were first played by California teenagers during the 1950s, although that may simply be an urban legend. The philosopher Bertrand Russell, however, saw a gruesome parallel with nuclear brinkmanship: Each side wants the other to back down, although neither is willing to do so itself and so, a head-on collision beckons.

“We’re eyeball to eyeball,” said Secretary of State Dean Rusk in 1962, as the Cuban Missile Crisis approached its near-apocalyptic climax, “and I think the other fellow just blinked.” As games go, Chicken can be serious, and deadly. In nuclear confrontations: fried chicken. Trade wars, fortunately, are less dire, but nonetheless consequential.

Mutual swerving seems rational, but if you think the other fellow is a swerver, the temptation is to go straight. The rub is that the other driver is thinking the same thing. Trump claims – for the most part, falsely – that the US has a history of swerving, so perhaps China expects the US to swerve once more. Moreover, Trump has claimed that trade wars are “easy to win,” suggesting that he expects China to do the swerving. And by the rules of the game, if either side is convinced that the other will swerve, it can win by going straight. Should you therefore go straight? Not if the other player does the same. So the “game” often boils down to a matter of communication, or rather manipulation: trying to get the other side to swerve.

Accept, right off, that there is no way to guarantee victory. The best either player can hope for is to improve the odds of inducing the other one to buckle. Toward that end, there are many tactics, none especially appealing. Start with reputation. If you are known as a nonswerver, your opponent is bound to take that into account. Not surprisingly, national leaders have long been concerned that their country be known to stand by its commitments; Trump, by contrast, has distinguished himself by being capricious and unreliable, not a good prognostic sign.

Reputation can be burnished in several ways, like cultivating an image of being literally crazy, or, better yet, suicidal. Whether actually irrational or simply faking it, there is a payoff to convincing your opponent that you have taken leave of your senses. Chalk one up for Trump.

Yet another variant involves convincing the other player that you are unwilling or – better yet - literally unable to swerve. The logical, but nonetheless bizarre consequence, suggested in the 1960s by that bizarrely logical nuclear strategist, Herman Kahn, is to wait until you have reached high speed, and then throw the steering wheel out the window, showing the other driver you can’t swerve, which generates a contest to see who can toss out the steering wheel first! Maybe US success would be enhanced if Congress passed legislation requiring Trump not to back down, although given Republican distaste for tariffs, this seems unlikely.

There are other ways of convincing the oncoming driver that you aren’t going to swerve. Your determination to go straight may depend on your desire to be victorious, and Trump has made it clear that for him, being a “winner” trumps all. That might help.

A final tactic: Drive a large and imposing vehicle. If an armored cement truck is confronting a VW Beetle, who backs down? Given that the US economy is pretty strong – at least for now – this might also give Trump an advantage, although China’s economy has, if anything, even more current momentum.

The logic of Chicken is downright illogical, which brings up the advice offered by a high-powered Defense Department computer, playing a game of Global Thermonuclear War in the 1983 movie, WarGames: “The only winning move is not to play.”

Mon, 25 Sep 2023 17:23:09 +0000 https://historynewsnetwork.org/blog/154132
One Really Big Difference Between Science and History  

David P. Barash is an evolutionary biologist and professor of psychology emeritus at the University of Washington; his most recent book is Through a Glass Brightly: Using Science to See Our Species as We Really Are (Oxford University Press), 2018, from which this article is drawn.

Science differs from theology and the humanities in that it is made to be improved on and corrected over time. Hence, the new paradigms that follow are approximations at best, not rigid formulations of Truth and Reality; they are open to revision— indeed, much of their value resides precisely in their openness to modification in the light of additional evidence. By contrast, few theologians, or fundamentalist religious believers of any stripe, are likely to look kindly on “revelations” that suggest corrections to their sacred texts. In 1768, Baron d’Holbach, a major figure in the French Enlightenment, had great fun with this. In his satire Portable Theology (written under the pen-name Abbé Bernier, to hide from the censors), d’Holbach defined Religious Doctrine as “what every good Christian must believe or else be burned, be it in this world or the next. The dogmas of the Christian religion are immutable decrees of God, who cannot change His mind except when the Church does.”

Fundamentalist doctrines of Judaism and Islam are no different. Such rigidity is not, however, limited to religion. For a purely secular case, most people today agree that it would be absurd to improve on Shakespeare, as was attempted, for example, by Nahum Tate, who rewrote the ending of King Lear in 1682, a “correction” that was approved by none other than Samuel Johnson, who agreed with Mr. Tate that the death of Cordelia was simply unbearable.

By contrast, science not only is open to improvements and modifications but also is to a large extent defined by this openness. Whereas religious practitioners who deviate from their traditions are liable to be derogated— and sometimes killed— for their apostasy, and even among secularists rewriting Shakespeare or playing Bach on an electric guitar is likely to be treated as indefensible, science thrives on correction and adjustments, aiming not to enshrine received wisdom and tradition but to move its insights ever closer to correspondence with reality as found in the natural world. This is not to claim that scientists are less vulnerable to peer pressure and the expectations of intellectual conformity than are others. We are, all of us, only human. But an important part of the peer pressure experienced by scientists involves openness to revision and reformulation. The Nobel Prize–winning ethologist Konrad Lorenz once wrote that every scientist should discard at least one cherished notion every day before breakfast. Although I don’t recall Dr. Lorenz conspicuously following his own advice, it nonetheless captures a worthwhile aspiration.

Well-established scientific concepts aren’t discarded lightly. Nor should they be. As Carl Sagan emphasized, extraordinary claims require extraordinary evidence, and so, it is reasonable that any new findings that go against received scientific wisdom— especially in proportion as those claims are dramatic and paradigm-shifting— should receive special attention, which includes being held to a high standard. But closed-mindedness is among the worst sins of an enterprise devoted to seeking truth rather than to validating old texts and anointed wisdom (religion) or passing along something that has been woven out of whole cloth (the creative arts).

I employ the word “truth” without quotation marks, because despite the protestations of some postmodernists and outright data-deniers, the natural world is undeniably real, and it is the noble endeavor of science to describe and understand this reality, which leads to the following proclamation— which shouldn’t be controversial, but has become so, at least in some quarters: science is our best, perhaps our only way of comprehending the natural world, including ourselves. Moreover, we are approaching an ever more accurate perception of that world and of our place in it. This argument, by the way, is not intended to glorify science or scientists at the expense of the creative arts and its practitioners. In fact, a case can be made that an act of artistic creativity is actually more valuable than a scientific discovery, not least because humanistic creativity yields results that, in the absence of their creators, are unlikely to have otherwise seen the light of day. If Shakespeare had not lived, for example, it is almost certain that we would not have Hamlet, Othello, King Lear, Macbeth, the hilarious comedies, those magnificent sonnets, and so forth. Without Bach, no Goldberg Variations or Well-Tempered Clavier, and none of his toccatas and fugues. No Leonardo? No Mona Lisa. The list goes on, and is as extensive as human creativity itself.

By contrast, although counterfactual history is necessarily speculative, an intuitive case can be made that if there had never been an Isaac Newton, someone else— perhaps his rival Robert Hooke or the Dutch polymath Christiaan Huygens— would have come up with the laws of gravity and of motion, which, unlike the blank pages on which Hamlet was written, or Leonardo’s empty easel, were out there in the world, waiting in a sense to be discovered. Gottfried Leibniz, after all, has a strong claim to being at least the co-discoverer with Newton of calculus, which also was, equally, a truth waiting to be found. Unlike a Shakespeare or Leonardo, who literally created things that had not existed before and that owe their existence entirely to the imaginative genius of their creators, Newton’s contributions were discoveries, scientific insights based on preexisting realities of the world (e.g., the laws of motion) or logical constructs (e.g., calculus).

Similarly, if there had been no Darwin, we can be confident that someone else would have come up with evolution by natural selection. Indeed, someone did: Alfred Russel Wallace, independently and at about the same time. A comparable argument can be made for every scientific discovery, insofar as they were— and will be— true discoveries, irrevocably keyed to the world as it is and thus, one might say, lingering just off-stage for someone to reveal them, like the “New World” waiting for Columbus. If not Columbus, then some other (equally rapacious and glory-seeking) European would have sailed due west from the Old World, just as if not Copernicus then someone else— maybe Kepler or Galileo— would have perceived that the solar system is solar- and not Earth-centric.

Mendelian genetics, interestingly, was rediscovered in 1900 by Hugo de Vries, Carl Correns, and Erich von Tschermak, three botanists working independently, more than a generation after Gregor Mendel published his little-noticed papers outlining the basic laws of inheritance. For us, the key point isn’t so much the near simultaneity of this scientific awakening, but the fact that Mendel based his own findings on something genuine and preexisting (genes and chromosomes, even though they weren’t identified as such until a half-century later), which, once again, were populating the natural world and waiting to be uncovered by someone with the necessary passion, patience, and insight.

Ditto for every scientific discovery, as contrasted to every imaginative creation. No matter how brilliant the former or the latter, practitioners of both enterprises operate under fundamentally different circumstances. It does not disparage Einstein’s intellect to note that in his absence, there were many brilliant physicists (Mach, Maxwell, Planck, Bohr, Heisenberg, etc.) who might well have seen the reality of relativity, because no matter how abstruse and even unreal it may appear to the uninitiated, relativity is, after all, based on reality and not entirely the product of a human brain, no matter how brainy.

In some cases, the distinction is less clear between products of the creative imagination and those of equally creative scientific researchers. Take Freud, for example. His major scientific contribution— identifying the unconscious— is undoubtedly a genuine discovery; that is, the elucidation of something that actually exists, and had Freud never existed, someone else would have found and initiated study of those patterns of human mental activity that lie beneath or otherwise distinct from our conscious minds. On the other hand, it isn’t at all obvious that absent Freud, someone else would have dreamed up infantile sexuality, the Oedipus complex, penis envy, and so forth. And this is precisely my point: insofar as alleged discoveries meld into imaginative creativity, they veer away from science and into something else.

Among some historians, there has been a tendency to assume that history proceeds toward a fixed point so that the past can be seen (only in retrospect, of course) as aiming at a superior or in some sense a more “valid” outcome. Such a perspective has been called “Whig history,” and it is generally out of favor among contemporary scholars, as it should be. A notable example is the announcement by Francis Fukuyama that with the disbanding of the Soviet Union in 1991, we had reached the “end of history” and the final triumph of capitalist democracy. Subsequent events have made the inadequacy of this claim painfully obvious. Despite the importance of democracy and its widespread appeal (at least in the West), it is important to be wary of the seductiveness of such “Whiggery,” and to appreciate that changes in sociopolitics do not necessarily exemplify movement from primitive to advanced forms of governance, or from malevolent to benevolent, and so forth. Nor are they final.

On the other hand, such a Whiggish approach is mostly valid in the domain of science. Readers at this point may well be tired of hearing about the Copernican, sun-centered solar system, but it deserves repetition, and not only because it contributed mightily to some much needed anthropodiminution but also because it is quite simply a better model than the Ptolemaic, Earth-centered version, just as evolution by natural selection is superior to special creation. It seems likely, by the same token, that the new paradigms of human nature described in Part II of this book are superior— more accurate, more useful, more supported by existing facts, and more likely to lead to yet more insights—than are the old paradigms they are in the process of displacing. Paul Valéry wrote that a work of art is never completed but rather abandoned. This sense of unfinishedness is even more characteristic of science, not just because of the scientific “creator’s” persistent personal dissatisfaction with her product but also because reality is tricky, sometimes leaving false trails. As a result, our pursuit and even our successes can always be improved on. This is not to say (as the more extreme postmodernists claim) that reality is socially constructed and as a result, no scientific findings can be described as true. The physicist David Deutsch, making a bow to the philosopher of science Karl Popper, refers to the accumulation of scientific knowledge as “fallibilism,” which acknowledges that no attempts to create knowledge can avoid being subject to error. His point is that although we may never know for sure that we are correct in an assertion (after all, tomorrow’s discovery may reveal its falsity), we can be increasingly clear that we aren’t completely wrong. Moreover, science has never been superseded by any better way of making sense of nature.

Newton’s laws of motion are as close to truth as anyone can imagine, with respect to medium-sized objects moving at middling speeds, but even these genuine laws have been improved on when it comes to very small things (quantum physics), very fast movement (special relativity), or very large things (general relativity). Even in Newton’s day, his work on gravity was vigorously resisted – by well-informed scientifically minded colleagues – because it posited pushing and pulling by a mysterious, seemingly occult “action at a distance” that could not be otherwise identified or explained. Scientific insights, in short, can be as tricky as reality. Observing the disorienting aspects of quantum physics and how they often depart from both common sense and certain rarely questioned philosophical assumptions (such as cause and effect), Steven Weinberg wryly noted, “Evidently it is a mistake to demand too strictly that new physical theories should fit some preconceived philosophical standard.” He might have included new biological theories as well.

Biology, in fact, offers materialist insights that, in a sense, exceed even those of physics, the presumed zenith of hard-headed objective knowledge. Thus, although physics can tell us a lot about how matter behaves, it is so far stumped when it comes to the question of what matter is (wave functions, clouds of probability, gluons and muons and quarks and the like: but what are they made of?). By contrast, biology can and does tell us both what we are made of, at least at the organismic level— organs, tissues, cells, proteins, carbohydrates, fats, and so forth— and, increasingly, how and why we behave as we do: some of the deepest and hitherto unplumbed aspects of human nature. By an interesting coincidence, in 1543, the same year that Copernicus’s book on the solar system appeared, Vesalius published his magnum opus, on human anatomy. Anatomies large and small, far and near, planetary and personal: all give way to a more accurate, and typically more modest view of reality, including ourselves. Such a perspective on human behavior— mostly due to Darwin and his interpreters and extenders— can be seen as yet another continuation of this trend, dissecting the anatomy of human nature itself.

Mon, 25 Sep 2023 17:23:09 +0000 https://historynewsnetwork.org/blog/154160
Can Studying Human Evolution Help Us Understand Impeachment?

David P. Barash is professor of psychology emeritus at the University of Washington; among his recent books is Through a Glass Brightly: Using Science to See Our Species as We Really Are (Oxford University Press, 2018). 

Whatever you think about the potential – likely? – impeachment of Donald Trump (and I’m all for it), this development converges intriguingly with The Goodness Paradox, a fascinating 2018 book by anthropologist Richard Wrangham. In it, Wrangham makes the paradoxical suggestion that socially orchestrated murder - something very much like the modern death penalty - may have acted in our prehistoric past to make us less violent than we would otherwise be, at least within our own groups. Let me explain.

Impeachment and removal from office is hardly murder, and, although right-wing trolls will doubtless interpret this essay as me recommending the murder of Donald Trump, that is most assuredly NOT what I am saying. I am interested, however, in the parallels between eliminating a dangerous group member “with extreme prejudice” in our evolutionary past – as Wrangham hypothesizes – and removing a dangerous group leader via a recognized and legitimated political process; i.e., impeachment.

Compared to many other species, human beings are, by and large, notably nonviolent and unaggressive. Thus, non-relatives and even total strangers typically crowd together on city streets, in a bus, train, airplane, movie theater or lecture hall, with almost no violence or aggression. And yet, people have historically been quite fierce, at least on occasion, toward members of a different group. 

Wrangham proposes that our early ancestors, not unlike many nomadic, non-technological societies today, were prone to enforcing rules of civil conduct within their group. Unable to call 911 or to employ an independent judiciary and police, they would likely have relied on internal mechanisms for keeping within-group tranquility. In current traditional societies, a trouble-making jerk who consistently disrupts the community will typically face efforts to enforce the accepted social rules, by personal warnings, ridicule, or, if necessary, ostracism. But if these gentler attempts are unavailing, and especially if the problem individual is dangerously violent and thus a threat to the group’s well-being, capital punishment will frequently be agreed upon. There are few data as to how often this form of extra-juridical justice is meted out in contemporary traditional societies, and even less evidence speaking to its frequency and impact in human evolution. Nonetheless, the intriguing possibility exists that as a result of such lethal events, Homo sapiens may have actually become less violent, for two reasons. Number one: by removing those especially predisposed to dangerously lethal behavior, the human genome came to harbor fewer of their predisposing genes, making the rest of us less prone to mayhem. 

Number two: through most of our evolutionary history, the average group size was almost certainly small, so individuals knew each other and were also acutely aware of what befell those who conspicuously got out of line. Once such enforcement – lethally administered if need be - became established as cultural tradition, biological selection as well as social pressures and enlightened self-interest would have favored conformity to expected norms. In sum, the idea is that threats to society may have led to informal but effective policing of serious misbehavior, either by lethally eliminating perpetrators or by law-abiding group members suppressing whatever inner demons might remain within themselves. Perhaps we therefore owe our comparatively benevolent dispositions to a long history of group-enforced capital punishment. 

This hypothesis could be interpreted as an endorsement of the death penalty in modern life, but it needn’t be. It is likely that our binocular eyesight and the three-dimensional, stereoscopic vision it affords is attributable to our history as forest-dwelling primates, but that doesn’t mean we should start climbing trees and leaping from branch to branch. And the fact that our ancestors may well have scavenged the majority of their animal protein doesn’t suggest that we should all begin consuming road kill. Executing malefactors might, might, have made us what we are today, for better and worse. But that doesn’t mean that we should keep doing it.

On the other hand, what we should do – and what the US Constitution explicitly sets out rules for doing – is to remove (nonviolently, to be sure!) trouble-making leaders whose behavior is deleterious to the group. By doing so we probably will not usher in a new millennium, and perhaps not even repair the harm to American democracy, society, ideals, and the environment that a certain trouble-making leader has already done; nor are we likely to engage in the kind of self-domestication that in our evolutionary past might have made us less murderously violent. But it certainly won’t do any harm. 


Mon, 25 Sep 2023 17:23:09 +0000 https://historynewsnetwork.org/blog/154270
What Should We Do When a Ruler is Mentally Unstable?

David P. Barash is an evolutionary biologist and professor of psychology emeritus at the University of Washington; his most recent book is Through a Glass Brightly: Using Science to See Our Species as We Really Are (Oxford University Press, 2018).

F. Scott Fitzgerald is said to have commented to Ernest Hemingway that “The rich are different from you and me,” to which Hemingway replied “Yes, they have more money.” Many people similarly assume that national leaders are different from you and me not just because they generally have (a lot) more money, but also because their mental health is supposed to be more stable. 


And yet, history is full of political leaders whose mental stability was questionable at best, and others who were undoubtedly nut-cases. The Roman emperor Caligula is infamous for sexual excess, for having people killed for his personal amusement, and for other pathologies. Charles VI of medieval France became convinced he was made of glass and was terrified that at any moment he might break. Mad King Ludwig II of Bavaria appears in retrospect to have suffered from Pick’s disease (a form of frontotemporal dementia), along with schizotypal personality disorder. King George III of England evidently suffered from logorrhea, an uncontrollable need to speak and write to a degree that often became incomprehensible, as well as apparent bipolar disorder. And that is only a very limited sample. We can safely conclude that occupying a position of great political responsibility is no guarantee against mental illness.


“Only part of us is sane,” wrote Rebecca West. “Only part of us loves pleasure and the longer day of happiness, wants to live to our nineties and die in peace ...” It requires no arcane wisdom to realize that chronic mental illness is not the only source of “crazy” behavior: people often act out of misperceptions, anger, despair, insanity, stubbornness, revenge, pride, and/or dogmatic conviction – particularly when under threat. Moreover, in certain situations — as when either side is convinced that war is inevitable, or is under pressure to avoid losing face — an irrational act, including a lethal one, may appear appropriate, even unavoidable. When he ordered the attack on Pearl Harbor, the Japanese war minister observed that “Sometimes it is necessary to close one’s eyes and jump off the Kiyomizu Temple” (a renowned suicide spot). During World War I, Kaiser Wilhelm wrote in the margin of a government document that “Even if we are destroyed, England at least will lose India.” While in his bunker, during the final days of World War II, Adolf Hitler ordered what he hoped would be the total destruction of Germany, because he felt that its people had “failed” him.  

Both Woodrow Wilson and Dwight Eisenhower suffered strokes while president of the United States. Boris Yeltsin, president of the Russian Federation from 1991 to 1999, was a known alcoholic who became incoherent and disoriented when on a binge. Nothing is known about what contingency plans, if any, were established within the Kremlin to handle potential war crises had they arisen during the Yeltsin period. Richard Nixon also drank heavily, especially under the stress of the Watergate crisis, which ultimately led to his resignation. During that time, Defense Secretary James Schlesinger took the extraordinary step of insisting that he be notified of any orders from the president concerning nuclear weapons before they were passed down the command chain. 


Presumably, Schlesinger – and by some accounts, Henry Kissinger, who was then serving as National Security Adviser – would have inserted themselves, unconstitutionally, to prevent war, especially nuclear war, if Nixon had ordered it. As civilians, neither Yeltsin nor Nixon, when incapacitated by alcohol, would have been permitted to drive a car; yet they had full governmental authority to start a nuclear war by themselves. 


During his time in office, Donald Trump has been the first US president considered, simply by virtue of his own personal traits, to be a national security threat. He is renowned for repeated, possibly pathological, lying, and concerns have constantly been raised about his mental stability, impulsiveness, narcissism, sociopathy, and other traits that, in the opinion of many mental health professionals, would clearly disqualify him from numerous military and government posts … but not the presidency. The fact that the current impeachment inquiry has driven him to increasingly bizarre and unhinged acts and utterances, combined with the fact that by law he has sole authority to order the use of nuclear weapons, should be of the greatest concern to everyone, regardless of your politics.

https://historynewsnetwork.org/blog/154276
How Tolstoy's War and Peace Can Help Us Understand History's Complexity

David P. Barash is an evolutionary biologist and professor of psychology emeritus at the University of Washington; his most recent book is Through a Glass Brightly: Using Science to See Our Species as We Really Are (Oxford University Press, 2018).



We have extraordinary access to ideas and data, and yet it has become a truism that getting to the bottom of things is more difficult than ever. Truth itself seems increasingly unclear and under assault. Call it Nietzschean perspectivism if you wish, but I submit it is because things are rarely linear. Even without the astounding barrage of half-truths, outright lies, “alternative facts” and demented tweets emanating from the Orange Presence in the White House, the Internet alone has muddied the waters of our comprehension. Our times are best understood not via relativism or the subjectivity of “truth claims” but by recognizing that although events and other facts are indeed real, they mostly result from a complex, interlocking web of causation, with no one factor necessarily determinative and no single interpretation necessarily correct. In the brief essay to follow, I summon in support of this proposition no less a presence than Leo Tolstoy. (If you resist such arguments from authority, feel free to skip the quotation that is about to come!)


War and Peace can, not unlike our increasingly muddled grasp of reality, be perceptually overwhelming in its recounting of detailed human stories. Yet one of the author’s notable asides in this masterpiece begins with a seemingly simple and mundane natural occurrence – “A bee settling on a flower has stung a child” -  then proceeds to view it from various angles, eventually arriving at a grand generalization:


And the child is afraid of bees and declares that bees exist to sting people. A poet admires the bee sucking from the chalice of a flower and says it exists to suck the fragrance of flowers. A beekeeper, seeing the bee collect pollen from flowers and carry it to the hive, says that it exists to gather honey. Another beekeeper who has studied the life of the hive more closely says that the bee gathers pollen dust to feed the young bees and rear a queen, and that it exists to perpetuate its race. A botanist notices that the bee flying with the pollen of a male flower to a pistil fertilizes the latter, and sees in this the purpose of the bee's existence. Another, observing the migration of plants, notices that the bee helps in this work, and may say that in this lies the purpose of the bee. But the ultimate purpose of the bee is not exhausted by the first, the second, or any of the processes the human mind can discern. The higher the human intellect rises in the discovery of these purposes, the more obvious it becomes that the ultimate purpose is beyond our comprehension. All that is accessible to man is the relation of the life of the bee to other manifestations of life. And so it is with the purpose of historic characters and nations.


Despite the human yearning for simple, cause-and-effect explanations, people increasingly understand that nothing can be grasped in isolation, even as complexity makes things less graspable in their entirety. This is particularly true when it comes to the elaborate kaleidoscope that is human behavior, where no single factor explains everything – or indeed, anything.


Tolstoy himself was especially concerned with supporting his idea that history in general, war in particular, and the Napoleonic Wars most especially were due to processes that the human mind could not discern, and that contrary to Great Man notions, “historic characters and nations” are influenced by an array of factors such that every individual is no less influential than are deluded, self-proclaimed leaders. And so, Napoleon comes across in War and Peace as more than a bit ridiculous with his insistence that he and he alone drives events. (This, in service of Tolstoy’s urging that if citizens refuse to participate, there would be no wars; i.e., an early version of the bumper sticker, “What if they had a war and no one came?”)


Taking a hedgehoggy view of that brilliant old fox, today’s world is no less multi-causal and therefore confusing than Tolstoy proclaimed with his parable of the honeybee. Afflicted with an unending stream of presidential lies and gaslighting, constant yet shifting claims of “fake news” and “alternative facts,” a dizzying array of Internet information overload, and a blizzard of wild conspiracy theories, it is tempting to give up on any coherent interpretation … of anything.


Nor is this problem new. Social scientists have long disagreed about what is predominant when it comes to causation: language, socialization, learning, cultural tradition, historical circumstance, evolutionary inheritance, and so forth. Was World War I, for example, due to interlocking alliances, frustrated yearning for colonial empire on the part of Germany, Austria-Hungary’s anxiety about losing its empire, incompetent national leaders, a Europe bored with decades of more-or-less peace, the rigidity of war plans combined with strict mobilization timetables, the assassination of a certain Archduke, machinations by the “merchants of death,” a combination of these, or something else?


Was the 2003 invasion of Iraq due to George W. Bush’s yearning to outdo his father, W having been manipulated by Messrs Cheney, Rumsfeld, Wolfowitz et al, a genuine hope of bringing democracy to the Arab Middle East, greed for Iraqi oil, a real if misguided belief that Western “liberators” would be welcomed with chocolate and flowers, illusions about weapons of mass destruction, a combination of these, or something else? 


Is the climate crisis due to corporate greed, consumer indifference, technological over-reach, the cowardice and short-sightedness of politicians, human overpopulation in the face of limited energy resources of which fossil fuels are the most available and at the cheapest immediate cost, a collision between atmospheric physics and growth-oriented economics, the inexorable push of energy-gobbling civilizations, a combination of these, or something else?


Is the danger of nuclear annihilation due to the military-industrial complex, a human penchant for war, distrust of “the other,” excessive reliance on deterrence as a savior, a kind of psychic numbing due to the unimaginable consequences of thermonuclear holocaust, perceived helplessness on the part of ordinary citizens, widespread feelings of fatalism, a sense that if something really bad hasn’t happened yet it never will, a mesmerized delight in extreme power and potential violence whatever the consequences, a combination of these, or something else?


Although ultimate causes, and to some extent even reality itself, are often beyond our comprehension – possibly even beyond our ability to repair – it is nonetheless our duty to behave as though they aren’t, and certainly our duty to acknowledge such critical realities as war, climate change and nuclear weapons. Just as “All that is accessible to man is the relation of the life of the bee to other manifestations of life,” we can conclude with the existentialists that just as Albert Camus’s Sisyphus was heroic because he persevered in pushing his rock, we are equally obliged – and privileged – to push ours, even though our rock turns out to be not one but many, rolling in different directions and with uncertain outcomes. Or as Rabbi Tarfon (70 CE – 135 CE) proclaimed, “It is not your responsibility to finish the work of perfecting the world, but neither are you free to desist from it.” Tolstoy would agree.


https://historynewsnetwork.org/blog/154294
"Keep America Beautiful" and Personal vs. Corporate Environmental Responsibility

David P. Barash is professor of psychology emeritus at the University of Washington; among his recent books is Through a Glass Brightly: Using Science to See Our Species as We Really Are (Oxford University Press, 2018).

Today, many of us accept personal responsibility for climate change and struggle to adjust our carbon footprints, while giving little attention to the much larger effects of major corporations and countries. We might want to learn something from how mid-century Americans concerned about litter were conned and co-opted by forces larger and more guilty than they.


During and shortly after the Second World War, Americans produced much less garbage, having become accustomed to reusing items whenever possible and throwing things away only when absolutely necessary. This posed a serious challenge, however, to newly emerging industries that yearned to profit from making and selling throw-away items, notably glass, plastics, and paper products. These industries accordingly launched a vigorous advertising campaign, inducing people to abandon their wartime frugality and start throwing things away after a single use. Americans were told that it was more hygienic, more convenient and (not mentioned) more profitable for the manufacturers who made and sold those items.


But by the early 1950s, the throw-away campaign had begun to backfire as Americans started seeing nearly everything as garbage. The glass bottles that people used to rinse and reuse were increasingly thrown out of car windows, and they had an unfortunate tendency to end up broken in fields, where grazing cows would either step on them and injure themselves or swallow them and die. Dairy farmers became increasingly incensed, especially in Vermont, then as now a dairying state.


In response, Vermont passed legislation in 1953 that banned disposable glass bottles. Corporate America worried that this might be a harbinger of restrictions to come, so many of the bottle and packaging companies banded together in a counter-intuitive but politically and psychologically savvy way: they formed something called Keep America Beautiful. It still exists today, under a label that can only be applauded, both for what it officially stands for and for its social effectiveness. Keep America Beautiful began as an example of what is now often criticized as “virtue signaling,” but in this case, the goal wasn’t simply to signal virtue or even to engage in “greenwashing.”


Rather, the reason such behemoth companies as Coca Cola and Dixie Cup formed what became the country’s premier anti-littering organization was to co-opt public concern and regulatory responses by shifting the blame from the actual manufacturers of litter—those whose pursuit of profit led to the problem in the first place—to the public, the ostensible culprits whose sin was putting that stuff in the wrong place. Garbage in itself wasn’t the problem, we were told, and industry certainly wasn’t to blame either! We were. 


It became the job of every American to be a responsible consumer (but of course, to keep consuming) and in the process to Keep America Beautiful. At first and to some extent even now, legitimate environmental organizations such as the Audubon Society and the Sierra Club joined. Keep America Beautiful went big-time, producing print ads, billboards, signs, free brochures, pamphlets and eventually Public Service Announcements.


Keep America Beautiful coordinated with the Ad Council, a major advertising organization. People of a certain age will remember some of the results, including the slogan “Every litter bit hurts,” along with a jingle, to the tune of Oh, Dear! What Can the Matter Be: “Please, please, don’t be a litterbug …” Schools and government agencies signed on to the undeniably virtuous campaign. It’s at least possible that as a result, America became somewhat more beautiful. Even more important, that troublesome Vermont law that had caused such corporate consternation was quietly allowed to die a few years after it had been passed, and – crucially – no other state ever emulated it by banning single-use bottles.


But by the early 1970s, environmental consciousness and anti-establishment sensibilities began fingering corporations once again, demanding that they take at least some responsibility for environmental degradation, including pollution more generally. Keep America Beautiful once again got out in front of the public mood and hired a pricey, top-line ad agency that came up with an iconic ad that still resonates with Americans who were alive at the time: Iron Eyes Cody, aka “The Crying Indian.”


Appearing on national television in 1971, it showed a Native American (the actor was actually Italian-American) with a conspicuous tear in his eye when he encountered trash, while a voice-over intoned, “Some people have a deep, abiding respect for the natural beauty that was once this country. And some people don’t. People start pollution. People can stop it.” In short, it’s all our fault.


Iron Eyes Cody’s philosophy is reminiscent of Smokey Bear’s “Only you can prevent forest fires.” Of course, Smokey is right. Somewhat. Individuals, with their careless use of matches, can certainly precipitate forest fires, but as today’s wildfire epidemics demonstrate, there are also major systemic contributions: global over-heating with consequent desiccation, reduced snow-melt, diminished over-winter insect die-offs that produce beetle epidemics that in turn leave vast tracts of standing dead trees, and so forth. Individuals indeed have a responsibility to keep the natural environment clean and not to start fires, but more is involved.


It is tempting, and long has been, to satisfy oneself with the slogan “peace begins with me.” As logicians might put it, personal peace may well be a necessary condition for peace on a larger scale, but even if peace begins with each of us, it assuredly does not end with me, or you, or any one individual. The regrettable truth is that no amount of peaceful meditation or intoning of mantras will prevent the next war, just as a life built around making organic, scented candles will not cure global inequality or hold off the next pandemic.


Which brings us to global climate change. There is no question that each of us ought to practice due diligence in our own lives: reduce your carbon footprint, turn off unneeded appliances, purchase energy-efficient ones, and so forth. Climate activist Greta Thunberg is surely right to emphasize these necessary adjustments and, moreover, to model personal responsibility, for example by traveling to the UN via sailboat. But she is also right to keep her eyes on the prize and demand that, above all, governments and industry change their behavior.


Amid the barrage of warnings and advice about personal blame and individual responsibility, there is a lesson to be gleaned from the corporate manipulations that gave us Keep America Beautiful, and its subsequent epigones: Even as we are implored to adjust our life-styles and as we dutifully struggle to comply, let’s not allow such retail actions to drown out the need for change at the wholesale level, namely by those corporations and governments whose actions and inactions underpin the problem and whose behavior – even more than our own - must be confronted and overhauled. 


(Just after writing this piece, I discovered that some of its ideas were covered in an episode titled “The Litter Myth,” aired September 5, 2019, on NPR’s wonderful history-focused podcast, “Throughline”: https://www.npr.org/2019/09/04/757539617/the-litter-myth.  Anyone wanting a somewhat different take on this topic would do well to “give a listen.”)

https://historynewsnetwork.org/blog/154314
Nuclear Deterrence and Things Left to Chance

David P. Barash is professor of psychology emeritus at the University of Washington. His latest book is Threats: Intimidation and its Discontents (Oxford University Press, 2020).



USAF General Jack D. Ripper (Sterling Hayden) and RAF Group Captain Lionel Mandrake (Peter Sellers) overcome the credibility gap in Stanley Kubrick's Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb (1964) 



An ancient dilemma faced by leaders throughout history has been how to prevent — deter — attacks on their realm from outside (invasions) or from inside (rebellions). And an ancient answer, albeit not the only one, has been to threaten that any such perpetrators will be punished.  The most prominent alternative has been attempted deterrence by denial, which has experienced mixed success, from the Great Wall of China to the Maginot Line of 20th century France. 


Deterrence by punishment gets particular attention, not only because it underlies nuclear deterrence (there being no effective deterrence by denial against nuclear attack), but because its consequences have been so horrific. Some of the most riveting accounts of murderous cruelty come down to us from Bronze Age kings, who famously flayed their opponents and made mountains out of human skulls, often as a “lesson” to would-be opponents. 


A more recent but nonetheless hair-raising statement of this perspective came from Sir John Fisher, First Sea Lord, Admiral of the Fleet, and widely regarded as the most important British naval figure after Horatio Nelson. Speaking as the British naval delegate to the First Hague Peace Conference in 1899, Fisher emphasized that deterrence by punishment is likely to be effective in proportion as the threatener has a fearsome reputation: 


“If you rub it in both at home and abroad that you are ready for instant war . . . and intend to be first in and hit your enemy in the belly and kick him when he is down and boil your prisoners in oil (if you take any), and torture his women and children, then people will keep clear of you.”


Connoisseurs of deterrence by punishment have long struggled with how to make the concept work, challenged not only by a desire to be something less than incorrigibly bloodthirsty, but also — especially in the Nuclear Age — deeply worried about how to make an incredible threat credible. Here is one of the more intriguing and incredibly dangerous “solutions.”


Even if you’re not a mountain climber, imagine for a moment that you are. Moreover, you’re roped to another climber, both of you standing by the edge of a crevasse. You’re having a heated argument, trying to get the other to do something that she doesn’t want to do — or alternatively, trying to get her to keep from doing something that she wants to do. The details don’t matter; what does matter is that the two of you disagree, strongly, about what should happen next.


How can you get your partner/opponent to bend to your will?


This sets the stage for an imaginary situation developed by the late Thomas C. Schelling, one of the leading theoreticians of nuclear deterrence and co-recipient of the 2005 Nobel Prize in economics. In his book, Arms and Influence, Schelling used a mountaineering model to explain what he called the “threat that leaves something to chance.” He proposed it as a way to get around the problem of credibility when it comes to the use of nuclear weapons. It is a very big dilemma, one that has bedeviled nuclear strategists for decades and that despite efforts by the best and brightest (including Schelling) remains unsolved to this day. 


When it comes to nuclear deterrence, the credibility gap is easy to state, impossible to surmount: Nuclear weapons are so destructive and their use is so likely to lead to uncontrollable escalation and thus, to unacceptable consequences for all sides, that the threat to use them is inherently incredible. And this, in turn, presents an immense difficulty for strategists hoping to use the threat of nuclear war as a way of either coercing another side to do something they’d rather not (say, withdraw from a disputed region), or to refrain from doing something that they might otherwise do (e.g., attack the would-be deterrer). So, let’s return to those disputatious climbers. 


If you simply announce your demand, the other might well reject it. What, then, might you do if you really, really want to get your way? You could threaten to jump, in which case both of you would die; remember, you’re roped together. Because of its suicidal component, however, such a threat would lack credibility, so the other person might well refuse to take it seriously. But suppose you move right to the edge, becoming not only more insistent but also increasingly erratic in your movements. 


What if you start leaping up and down, or shuffling your feet wildly? Your credibility would be enhanced, not because falling in would then be any less disastrous, but because by increasing the prospect of shared calamity you would be adding a soupçon of potentially lethal unpredictability over which you have no control. Your literal brinkmanship just might kill both of you: not on purpose (as we already saw, that threat would lack credibility), but because chance factors – a sudden loss of balance, a gust of wind – might do what prudence would otherwise resist. Thus, according to Schelling, unpredictability – leaving something, somewhat, to chance – would surmount the problem of incredibility. This terrifying loss of control wouldn’t be a bug, but a feature.


In the world of nuclear strategy, the problem of credibility is like that of an adult trying to deal with a child who refuses to eat her vegetables; the frustrated parent might threaten, “Eat your spinach or I’ll blow up the house.” (Don’t try this at home; first of all, it probably won’t work.) Or consider a police officer, armed with a backpack nuclear weapon, who confronts a bank robber by demanding, “Stop, in the name of the law, or I’ll blow up you, me, and the whole city.” It’s what led a NATO general to complain during the Cold War, when the West’s nuclear weapons were deployed to deter the Red Army from over-running Europe, that “German towns are only two kilotons apart.” And what led to interest in neutron bombs (designed to kill troops but leave buildings intact), as well as in doctrines (“limited nuclear war-fighting”) and devices (battlefield nuclear weapons) designed to be usable and thus credible.


But the downside of dancing on the edge of a crevasse in order to make your threat credible by leaving it somewhat to chance, is that, well, it leaves that thing — and a rather important one at that — to chance! By the same token, making nuclear deterrence more credible by deploying weapons that because of their size and ease of employment are more usable means that they must in fact be more usable, a paradoxical situation for weapons whose ostensible sole purpose is to make sure that they won’t be used!


In an earlier book, The Strategy of Conflict, Schelling had discussed the means whereby one side might coerce another, despite the fact that it cannot credibly threaten nuclear war, by employing “the deliberate creation of a recognizable risk of war, a risk that one does not completely control, deliberately letting the situation get somewhat out of hand . . . harassing and intimidating an adversary by exposing him to a shared risk.” 


Schelling’s hair-raising mountain metaphor may have been inspired by the word “brinkmanship,” which seems to have been first used by Democratic Party presidential candidate Adlai Stevenson, who, at a campaign event in 1956, criticized Republican Secretary of State John Foster Dulles for “boasting of his brinkmanship — the art of bringing us to the edge of the nuclear abyss.” At about this time, Henry Kissinger (then a little-known university professor) began developing both the notion of “limited nuclear war” as a way of circumventing the credibility problem, and the concept of the “security dilemma,” in which “the desire of one power for absolute security means absolute insecurity for all the others.”


The idea of security dilemmas has typically been applied to the problem of reciprocal arms races, whereby a country’s effort to counter a perceived military threat by building up its arsenal results in its rival feeling threatened, which leads that rival, in turn, to build up its arsenal — and so on. As a result, both sides end up less secure than they were before. Brinkmanship, à la Dulles and Schelling, introduces yet another dilemma: when a lack of credibility leads to various stratagems intended to enhance credibility, they may well succeed in doing so, but in the process reduce security on all sides.


So, the next time you find yourself tethered to an adversary at the edge of a crevasse — whether in a thought experiment or reality — you might want to recall the advice offered by the supercomputer in the 1983 movie WarGames: “the only winning move is not to play.”

Mon, 25 Sep 2023 17:23:09 +0000 https://historynewsnetwork.org/blog/154436
The Pandemic Pied Piper




David P. Barash is professor of psychology emeritus at the University of Washington. His most recent book is Threats: Intimidation and its Discontents (2020, Oxford University Press).


A widespread tale out of medieval Europe involved the Pied Piper, a figure who arrived unexpectedly at the village of Hamelin and piped an alluring tune, thereby inducing the local rats to follow him out of town, whereupon they drowned in a nearby river. A well-known variant on this story goes on to note that the Piper had been promised payment for his services, but when the town reneged, thereby failing to “pay the Piper,” that magical, musical manipulator later returned and led Hamelin’s children out as well; they, too, never returned.

Given the deadly consequences of believing Mr. Trump’s lies and false assurances, it is more than a little tempting to see him as a modern-day Pied Piper, whether one envisions his followers as rats or — more generously — as children. Not coincidentally, perhaps, Mr. Piper was initially welcomed by Hamelin’s human residents because the town had been suffering from a plague … although the mythic medieval Piper (unlike the very real 21st-century president) did in fact undo the plague.

Donald Trump as Pied Piper is, admittedly, a speculative stretch. But one of the strangest aspects of the current pandemic has been the denial practiced by so many of its victims — not to mention the extent to which this denial has often contributed to the victimization itself. Stranger yet is that this denial includes not only people who haven’t contracted the virus or who don’t know anyone who has, but even many who actually test positive. Most astounding of all, some persist in this denial even as they are dying of the disease. Theirs is not “merely” a refusal to acknowledge their own impending death (not in itself uncommon), but an insistence that the cause isn’t COVID, because COVID isn’t a genuine, life-threatening disease but rather a hoax dreamed up by political liberals seeking to undermine the presidency of Donald Trump. Strange indeed.

Or maybe not.

As reported in The Washington Post of November 16, Jodi Doering — an emergency room nurse in South Dakota — said that she has cared for patients entirely reliant on ventilators in order to breathe, and who nonetheless insist that they don’t have COVID-19 (which has now claimed the lives of nearly 250,000 Americans … and counting). “I think the hardest thing to watch,” Ms. Doering told CNN, “is that people are still looking for something else and a magic answer and they do not want to believe COVID is real. Their last dying words are, ‘This can’t be happening. It’s not real.’” They swear, literally with their dying breaths, that they must have pneumonia or some other disease — anything but COVID-19. And why? Because President Trump said that the virus was going away, that the country had turned the corner, that it doesn’t hurt otherwise healthy people, that it was at most a minor inconvenience, and that it wouldn’t be encountered any more after election day, November 3, because the whole thing had been ginned up by Democrats and the “deep state” to do him political harm.

The tragic fact remains, however, that COVID-19 is currently killing roughly two Americans per minute, and many of these victims go to their deaths having followed their Pied Piper — President Trump — in a manner that makes that medieval story painfully real.

Insofar as there is any potency to the parallel, it raises the question as to the possible underlying psychological mechanism. At least one contributor is likely very old — part of our primate evolutionary history. In his book, Alpha God, evolutionary psychologist Hector Garcia has pioneered the assessment that monotheism itself is a derivative of Homo sapiens’ deep-seated tendency to defer to an alpha male, thereby ensconcing God as the dominant leader, endowed with such predictable qualities as large size, great power, intolerance of competitors, sexual jealousy, and the supposed ability to provide benefits to His faithful followers.

Garcia revisited this territory in his book Sex, Power and Partisanship, showing how the trope of Democrats as the “mommy party” and Republicans as the “daddy party” helps make sense of our current partisan divide, while illuminating how our evolved psychology plays into the impact of threat, fear, and anger upon political orientation. In times of stress those especially affected find themselves attracted to political leaders and platforms that promise authoritarian certainty, including a vigorous, uninhibited, and even potentially violent response to those individuals and circumstances regarded as threatening.

The phrase “messiah complex” is fairly well known, referring to a psychopathology whereby the patient believes that they are, well, a messiah, ordained to be a savior. Another interpretation of messiah complex would seem worth entertaining: A psychopathology characterized by a need to identify a messiah, and to follow unquestioningly.

Human nature and its enabling hardware change at a biological snail’s pace, whereas cultural circumstances gallop ahead, largely untethered from our slow-moving evolution. As a result, we are likely stuck with a kind of zombie psychology that staggers along, frequently out of touch with current needs and situations. There was a time in our Pleistocene past when alpha males, despite their fierce despotism (and in some cases because of it), were an asset, especially when dealing with outside enemies, and perhaps in quelling personal disputes within the group. But that was a long time ago. Current presidents don’t stand in the fighting vanguard, huffing and puffing and blowing the other guys down.

Our species’ susceptibility to leaders’ theatrics nonetheless remains deeply entrenched, a long-ago asset now turned liability that is all the more evident when politicians effectively stoke our fears, exaggerating the seeming threats, all the while thumping their chests and proclaiming themselves the best, the most, and maybe even the only genuine alpha creatures this side of God.

In Why People Believe Weird Things — one of the most insightful and important books of the early 21st century — Michael Shermer (guru of scientific skepticism, with a psychology PhD) identified five cogent reasons for that particular human susceptibility: the need for prompt gratification, for consolation, for moral closure, for satisfying simplicity, as well as our predisposition for hope, especially in dark times. Perhaps another should be added: a tendency for at least some of us, under some conditions, to follow a Pied Piper who masquerades as a savior-seeming alpha male, whatever the consequences.



Mon, 25 Sep 2023 17:23:09 +0000 https://historynewsnetwork.org/blog/154445