New Paper by Paul and Anne Ehrlich: Returning to “Normal”? Evolutionary Roots of the Human Prospect

Paul R. Ehrlich & Anne H. Ehrlich | July 5, 2022


This article was first published in BioScience on July 1, 2022


 

“We are not living in normal times, and every American knows it.” —Daniel Sherrell, The Guardian (5 January 2022)

A notion prominent in the news today is that the world of 2019 and the decades leading up to it were some kind of “normal” to which civilization might return after the COVID-19 pandemic. Talk of such a return dramatically underlines the educational system’s failure to inform most people about human history and our present predicament.

Today’s view of normality is possible because everyday thinking about human history largely ignores its first 300,000 years and does not recognize how extremely abnormal the last few centuries have been, roughly just one-thousandth of the history of physically modern Homo sapiens. Knowing how genetic and cultural evolution over millennia shaped us helps explain today’s human predicament and how hard it is to deal with, and it underlines how abnormal human life is in the twenty-first century.

Most people don’t realize that the world to which they wish to return was not normal (usual or typical) for our species. More importantly, it was not remotely sustainable (Dasgupta et al. 2021) and was perhaps inevitably unsustainable (Rees 2010). Indeed, it is relatively difficult to define in detail what normal behavior is for Homo sapiens as an entity, in part because of the largely blank pages of prehistory.

The most recent 300 out of 300,000 years have been abnormal in the sense that a fever of 107 degrees Fahrenheit is abnormal when, for most of a person’s life, her temperature has been about 98.6 degrees. Until 10,000 years or so ago, the normal lifestyle for Homo sapiens was living in small groups (Schmidt and Zimmermann 2019), hunting and gathering. Humanity’s fever started about ten millennia ago and rapidly led to a highly febrile system of giant groups, which have increasingly industrialized. Humanity grew from scattered groups of 20 to cities of 20 million, from normal to abnormal, in an evolutionary instant.

That has been a freak geological moment based on the adoption and spread of agriculture, later topped by a one-time energy bonanza from fossil fuels. It has, as is increasingly evident, entrained a complex of existential threats that are likely mortgaging the future of civilization. Those threats, all gradually (in terms of a human lifespan) unfolding changes in the human environment, include climate disruption, biodiversity loss, resource depletion, global toxification, expanding pandemics, and increasing chances of nuclear war, all driven by overpopulation, overconsumption, and escalating inequity (Pickett and Wilkinson 2011, Piketty and Saez 2014). Intertwined and mutually reinforcing, these drivers are pressing humanity toward a ghastly future (Ehrlich and Holdren 1971, National Academy of Sciences 1993, Perry 2015, Bradshaw et al. 2021, Boulton et al. 2021). There is no possibility of solving that human predicament by returning to foraging (Dasgupta 2021), but it might be possible to establish a relatively desirable future by learning some lessons from hunter–gatherer ancestors who, millions of years ago, had evolved ways to acquire more energy faster than other foraging apes (Kraft et al. 2021).

That evolutionarily recent pattern, which included the appearance of agriculture, also allowed the development of what we are calling the new abnormal of large populations and strong social stratification. Its beginnings can be seen in Hammurabi’s code of some 4000 years ago, and the pattern persists today with politicians, celebrities, CEOs, scientists, and so on, “leading.” Interestingly, however, not all human cultures accepted this as normal. For instance, Indigenous Americans such as the Huron statesman Kandiaronk thought the European social system ridiculous, especially the ability of individuals to convert wealth into power and the general lack of personal autonomy (Graeber and Wengrow 2021). Clearly, if Kandiaronk were transferred from the seventeenth century to today, he would have a very different view of normal human behavior from that of an average American.

We believe it is only by examining what is known of physically modern Homo sapiens’ entire 300,000-year history, including the ways in which human psychology has been shaped by both genetic and cultural evolution and their interactions, that the current threats to humanity can be understood and possibly averted. Indeed, if one expands the definition of human to include all our upright, tool-improving ancestors, human history must be considered to go back millions of years.

Mismatch: Old genes and new perceptual needs

Mismatches and culture gaps are foundational to civilization’s grave present situation. Even the most casual consideration of human history brings to light dramatic mismatches (Ehrlich and Blumstein 2018) that could not have occurred in prehistoric normal times. For instance, all organisms have evolved ways of detecting changes in their environments and, if possible, reacting when required in defensive ways to remain in conditions conducive to their survival and reproduction. For most mammals, the important environmental changes they need to detect tend to be immediate or sudden. Survival and reproduction may depend on perceiving the appearance of a predator nearby or a sudden change in the physical environment such as a rockslide or a flash flood. Most mammals have evolved nervous systems that can detect a leopard’s rush, the approach of a possible mate or a falling branch, do extremely rapid calculations of the likely consequences, and send signals to the appropriate organs to take life-saving actions. Our primate ancestors, being mammals, also evolved to be very good at sensing sudden danger and ducking or running.

Similarly, our nervous systems evolved to hold the environmental background constant while we assay or avoid a sudden threat (Ehrlich 2000). One can easily see this with the aid of a cell phone’s video camera. Shake your head rapidly from side to side and notice how your head moves while the room basically stays still. Then turn on the video and move the cell phone rapidly from side to side as you moved your head. Look at the recording and you’ll get dizzy as the background dances around. You could easily have detected a lion entering your room while you were moving your head, but the lion would likely have devoured you if your nervous system worked like the video.

What’s the difference? Your nervous system evolved proprioceptors that detect the head motions and tell your brain how your eyes are moving—and your brain automatically compensates and keeps its image of the background steady, as it does all the time as you go through your normal activities. Bottom line: We’re good at seeing things change in front of a constant background, and we’ve actually evolved to hold the background constant to improve the accuracy of our own movements and our ability to detect other movements. Add in habituation (as to warnings about climate disruption) and your perceptual system in the modern world is mismatched with the new need to detect and respond to gradually increasing existential threats in our environmental background.

There were plenty of reasons for australopithecines to evolve the ability to spot stalking predators but no reason at all to focus on gradual changes in the climate (as opposed to reacting to changing weather). Our distant ancestors weren’t causing climate disruption; if they had been, they couldn’t have done anything to correct it. Indeed, for much of our history, they didn’t have the language with syntax to even discuss it. The same can be said for the vast sweep of our normal evolutionary history. The utility of detecting changes in the environmental background (as opposed to immediate environments) came along primarily with the agricultural and industrial revolutions, which produced the technological means both for creating massive environmental changes and for detecting any gradual ones among them, communicating widely about them, and taking steps to deal with them. In the new abnormal, humanity started to cause extreme but initially scattered and gradual deleterious changes in the ecological theatre in which the human drama was being performed. Homo sapiens has yet to take significant steps to save the structure.

Both the need to be able to detect gradual environmental change and the difficulty for our hunter–gatherer brains to do so and to plan to respond appropriately are major features of the great mismatch. That mismatch is between the human genomes that evolved largely during our normal forager past and the rapidly transformed and transforming abnormal environments with which those genomes must now interact. This has been repeatedly illustrated in the phenomenon of shifting baselines of population sizes, both in human populations and in those of other organisms important to society. Each human generation in the new abnormal tends to view the conditions it first observed as being normal, which made evolutionary sense in a long-term relatively static environment. That’s why most people today view the pre-COVID-19 years as normal, especially because they did not experience the 1918–1919 flu pandemic, still less the great plagues of the middle ages. An instructive case in point is that of exploited fisheries over time (Pauly 1995), in which the current massive degree of depletion is often obscured because observers of one generation assume a stock was always in the state in which they first fished it, often unaware of how much more abundant fish once were. Much the same can be said about the massive erosion of biodiversity in general (Ceballos et al. 2017, Burns et al. 2021). In contrast, there is the case in which scientists developed the ability to detect dramatic change in Earth’s normal ozone shield (Molina and Rowland 1974), and international government cooperation (and luck with commercial interests) may have solved the problem (Goyal et al. 2019). Technology corrected the perceptual mismatch. Technology also corrected our inability to perceive the increase of greenhouse gases in the atmosphere (Keeling et al. 1976), but society has not started a process of dealing with it, although it talks about it.
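
To make the shifting-baseline effect concrete, here is a minimal illustrative sketch (ours, not drawn from Pauly 1995 or any cited data); the starting stock, decline rate, and generation length are hypothetical numbers chosen only to show how each generation’s private baseline hides the cumulative loss.

    # Illustrative sketch of shifting baselines; all figures are hypothetical.
    initial_stock = 1_000_000   # assumed starting fish abundance
    decline_rate = 0.02         # assumed 2 percent loss per year
    generation_span = 30        # assumed years between observer "generations"

    stock = initial_stock
    for generation in range(1, 6):
        baseline = stock  # what this generation first sees and takes to be "normal"
        for _ in range(generation_span):
            stock *= (1 - decline_rate)
        perceived_loss = 1 - stock / baseline       # loss relative to the observer's own baseline
        true_loss = 1 - stock / initial_stock       # loss relative to the original stock
        print(f"Generation {generation}: perceives a {perceived_loss:.0%} decline; "
              f"actual decline is {true_loss:.0%}")

Each generation in this toy model perceives the same decline relative to its own baseline, while the loss relative to the original stock compounds toward near-total depletion.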

For millions of years of hominin existence and a few hundred thousand years of modern Homo sapiens, there was relatively little need for—and, before the evolution of speech with syntax and then writing, little possibility of—long-term planning or recognition of ongoing trends. Some non-European complex farming societies did systematically look to the future. The Iroquois nations of eastern North America famously encouraged seven-generation stewardship, pressing people to consider how their actions would influence those yet to be born (Vecsey and Venables 1980). Our normal state, however, was to be clever but not wise. In short, we’ve evolved genetically and culturally to live in the here and now and consider it normal. As analyst Nate Hagens has put it, Homo sapiens suffers from “addiction to the present.” That has been underlined by the general lack of preparation for the perfectly predictable—and predicted (Ehrlich 1968, Garrett 1994, Daily and Ehrlich 1996)—virus invasion after recent experiences with SARS-CoV-1, MERS, swine flu, and so on. It is also seen in the continuing failure of nations to reduce the use of fossil fuels or even to invest significantly in the preventative maintenance and adjustment of infrastructure in the face of escalating climate disruption. Even more dramatic is the near total habituation to the gradual increase in the danger of nuclear war, which, until the Russian invasion of Ukraine, was rarely mentioned in the mass media and, even then, underestimated.

Culture gaps

The extreme rapidity of nongenetic (cultural) evolution in technology not only produced mismatches but also created large culture gaps that are features of the new abnormal. We are not referring to cultural differences between human groups (which have long been documented; Henrich and Boyd 1998) but to more recent—and, in some ways, more serious—differences between the collective knowledge of groups and that of most individuals within the group (Ehrlich and Ehrlich 2010). The overall culture gap began with the agricultural revolution, which first enabled people to specialize in different occupations—farmers, warriors, priests, traders, and so on—in which expertise was largely confined to the specialists. This specialization greatly increased and proliferated with the industrial revolution in the eighteenth and nineteenth centuries. By the twenty-first century, specialization and the compartmentalization of knowledge have ended up creating a cultural complex adaptive system (Levin 1999) that is mismatched to the biophysical complex adaptive system with which the culture interacts.

In hunter–gatherer groups, virtually all members possessed the same nongenetic information, the same culture. The exceptions, we conjecture, were few: perhaps a hunter with a favorite productive spot for placing rabbit snares, women who knew the medicinal properties of certain plants, a canoe builder who had a special way of lashing on an outrigger support, or a shaman whose mentor had taught him a secret incantation. One might guess that all adults possessed most of the group’s significant culture, as do male !Kung bushmen who, for example, are well aware of the diverse edible plants normally gathered by the women (Draper 1975). Similarly, the Aivilikmiut Inuit, when one of the present authors (PRE) lived with them more than a half century ago, showed no sign of a significant gap in their traditional culture.

Contrast that with a European, Japanese, or American today. Even the most educated individual can’t possibly possess more than a minuscule part of their society’s nongenetic information. How many people in an advanced society, given the correct pile of computer parts, could describe their provenance and assemble a computer? How many know how their cell phones (or refrigerators) work? How many understand where their food comes from and how it was grown and processed? How many have read even a few thousand of the millions of books in libraries? In recent years, only about one in four Americans reads books regularly (Kaestle and Damon-Moore 1991). In modern societies, knowledge has become deliberately and excessively divided into ever smaller units—siloed—and isolated from related information. Knowledge and information are so compartmentalized that even brilliant leaders do not see (or choose for political reasons not to point out) obvious and crucial connections.

In our opinion, that’s one reason the devastating environmental consequences of an ever-expanding human population have been largely ignored. Most “leaders” are hopelessly ignorant of the demographic facts and close connections of the expanding numbers of people to environmental and social dilemmas. The governments in many struggling poor countries fail to support family-planning programs adequately, whereas those in the rich countries of Europe are irrationally encouraging higher fertility. Few in either case recognize that adding another billion people to the population in the future will cause more damage to humanity’s critical life-support systems than did the most recent increment of a billion, because ever more scarce and remote resources must be tapped to support the newcomers.

From our perspective, because of the vast culture gap, few people in our society are able to draw inferences on the basis of knowing how the climate works, why most discussion about inherent intellectual differences among people of different genders or skin colors is nonsensical, the significance of the second law of thermodynamics, the potential consequences of a nuclear war, how biodiversity is related to ecosystem services, the importance of economic externalities, or why population growth increases the risk of novel pandemics. This is a small sample of the things a responsible citizen needs to understand in a world faced with a possible collapse of civilization; however, because of the antique structure of educational systems (think “subjects” and “departments”), learning them collectively requires visiting a great many silos.

The great acceleration

The recent great acceleration of change in the human situation and scale of activities took place in a historic three-century period, starting around 1750 with industrialization (Steffen and Saez 2015). In that blink of an eye in geological time, our ancestors have essentially completed the postagricultural replacement of the 300-millennia-old normal human situation of living in small, mobile groups. Along with creating mismatches and culture gaps, that acceleration saw the global population expand some dozen-fold in those three centuries. Humanity has benefited from a one-time bonanza of natural resources (Ehrlich 1989, Price 1995), especially the stored solar energy equivalent of some 10,000 hours of human labor that can be extracted from each barrel of oil (Love 2008, Mouhot 2011). People have used that energy to drastically modify every major ecosystem from the oceans to the forests to the ice caps, disrupt the climate of the planet, and create novel poisons, spreading them from pole to pole. Industrializing people changed their own patterns of activity, including vital sleep times (Walker 2017), using gas and then electric lights, and greatly altered their eating habits and, therefore, jaw structures (Kahn et al. 2020). Modern civilization has also fouled the air so that it often becomes lethal to breathe (Smith 2000), wiped out most other large animals and replaced them with more people and gigantic populations of a few domesticated species, and depleted much of the planet’s soils, underground freshwater stores, and high-grade mineral resources. Humanity has even developed and used weapons with the potential to exterminate everyone and managed to kill in a single war more than five times the estimated number of people that existed ten millennia ago, when our species began switching from normal hunting and gathering to abnormal agriculture and launched an acceleration of population growth.
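
As a rough check on the scale of that energy subsidy, here is a hedged back-of-envelope sketch (our assumed round numbers, not figures from Love 2008 or Mouhot 2011): a barrel of crude oil holds on the order of 6 gigajoules, or roughly 1,700 kilowatt-hours, of chemical energy, while sustained manual labor delivers only a small fraction of a kilowatt-hour of useful work per hour.

    # Back-of-envelope sketch with assumed round figures (not from the cited sources).
    barrel_energy_kwh = 1700            # ~6.1 GJ of chemical energy per barrel of crude, rounded
    labor_output_kwh_per_hour = 0.17    # assumed useful output of sustained manual labor

    labor_hours_per_barrel = barrel_energy_kwh / labor_output_kwh_per_hour
    print(f"{labor_hours_per_barrel:,.0f} labor-hours per barrel")  # on the order of 10,000

Lower estimates of sustained human output give correspondingly larger equivalents; the point is the order of magnitude, not the precise figure.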

We might trace that transformation’s beginning to the Mesolithic, about 12,000 years ago, when various technological and trade arrangements (Graeber and Wengrow 2021) and increasing sedentism and sociopolitical complexity evolved (Newell and Constandse-Westermann 1984, Reiter 2012, Pearl 2021). Humanity has more recently so disrupted normality that geologists describe its results as a new era in Earth history, the Anthropocene. A leading economist chimed in: “The Anthropocene can be read as being the era when the demand humanity makes on the biosphere’s goods and services—humanity’s ‘ecological footprint’—vastly exceeds its ability to supply it on a sustainable basis” (Dasgupta et al. 2021). Today, since the acceleration, it’s quite ordinary for the United States to spend nearly $800 billion annually on a military that can try to dominate other nuclear powers, while ignoring most existential threats and making one of them, a world-ending nuclear war, more likely (Kristensen et al. 2017, Baum et al. 2018, Redfern et al. 2021)—something that would have been impossible in a normal human society.

Humanity created the Anthropocene through cultural evolution; absent extreme and obvious selection pressures, a dozen or so—or even a hundred—generations is insufficient time to adapt genetically to the dramatically new human-made environments. Homo sapiens has therefore brought Stone Age genomes into a Facebook world, creating the great genome–environment mismatches that plague civilization.

Two historic revolutions

Of course, both genetic and cultural evolution continued throughout the 300,000 years of normal human existence, gradually altering phenotypes and changing the sociopolitical arrangements and technologies of foraging populations, some of which were leading lives of well-being and cooperation (Sahlins 1972, Churchland 2019). Although there has been immense variety in forager socioecological relationships (Kelly 2013), there seem to have been certain regular aspects to what was for millennia normal human behavior. Those features, in addition to very small population sizes, included no accumulation of artifacts or food, harvesting but not cultivating plants, sharing of food, and maintaining relatively egalitarian social structures, except between men and women, who typically assumed different roles in food acquisition (Winterhalder 2001, Schrire 2016), with the status of women varying with an array of factors (Hayden et al. 1986). Then came the agricultural revolution, only some 11,000 years ago. The switch to agriculture was hardly normal for any animal, including a primate whose ancestors had hunted and gathered throughout their multimillion-year existence. Somewhat counterintuitively, farming did not generally improve the human condition at the time; health declined because of increases in infectious disease and reductions in dietary quality. Shorter spacing of births, allowed by sedentism in some late foraging groups and then among agriculturalists, nonetheless resulted in rapid population expansion (Armelagos and Cohen 2013, Larsen 1995, 2006, Gibbons 2009, Dow and Reed 2015).

Only ants, termites, and ambrosia beetles also invented agriculture (Mueller et al. 2005)—and did so long before even Australopithecus, let alone farming Homo sapiens, strode onto the evolutionary scene. And unlike Homo sapiens, nonhuman primates did not develop ultrasociality—being especially cooperative beyond the level of other social mammals (Tomasello 2014). Neither did any social insects go on to a second gigantic transformation, an industrial revolution, which was made possible by agriculture. A basic feature of the new abnormal created by the two revolutions, agricultural and industrial, was the speed with which resultant population growth and technological innovation made huge changes in the human environment, generating the Anthropocene.

Like chimps and bonobos, our preagricultural ancestors generally formed assemblages of 20–150 individuals (Dunbar 1992), most commonly around 30 (Marlowe 2005), and continued to do so throughout the Mesolithic and until the agricultural revolution. After some 300,000 years of foraging, becoming ultrasocial, and living in those small groups, physically modern people began behaving abnormally by the standards of their past history, settling down to fish, herd, farm, and have lots of children. They quickly developed the present notion of private property by mixing the idea of their labor (“you reap what you sow”) into sustenance (Heller and Salzman 2021). They then learned to enslave other people, created huge inequities of wealth and gender rights, financialized our species, and made rentier capitalism (capitalism with profits accruing mostly to those holding property but not producing anything socially useful) possible (Standing 2021). Agricultural societies institutionalized many new behaviors, including racist practices and mass religion, which cemented the global subordination of women to men (Sultana 2010). That gender differential is the most pervasive inequity of the new abnormal (Epstein 2007), one that a political movement in the United States is struggling to exacerbate along with institutional inequity (e.g., Stevenson 2019)—both of which would have been impossible in hunter–gatherer bands (Fedurek et al. 2020).

In the recent three-century industrial stretch of its history, modern humanity also enabled a minority of people to develop an abnormal life by the standards of human history. That unusual lifestyle featured a superabundance of artifacts, relative freedom from infant and child death (the primary cause of longer average life expectancy), more physical comfort, sometimes happiness (Easterlin 2013), and, some believe, relative freedom from violence (Rose 2013, Bradshaw 2018). On one hand, that is an achievement of which we think Homo sapiens can be proud. On the other hand, our species’ failure to make universal well-being normal, to foresee and attempt to deal with the existential threats inherent in our achievements, and its frequent failure to seek sustainability rather than continual growth and immediate maximum return should, in our opinion, be a major source of shame.

Costs of going from small-group primates to densely populated nation states

It is likely that the evolution of variable human cooperation, morality, and fairness (Schäfer et al. 2015, Tomasello 2016, Churchland 2019) traces in part to humanity’s population structure (spatial arrangement and patterns of mating; Wright 1931) during hundreds of thousands of years of small semi-isolated hunter–gatherer groups (Okasha 2013, Rand and Nowak 2013, Boyd et al. 2014). It could go even further back to groups of nonhuman primates and other animals (Brosnan and De Waal 2003, 2014). Population structure has played an important evolutionary role long predating the appearance of mammals—for example, in the interactions over nitrogen of soil microbes and plants, which vary with the structure of the microorganism populations (Kinzig and Harte 1998).

Ironically, the vast population increase, development of global communications, and loss of population structure (relative lack of division) in the new abnormal may actually be eroding the human desire to cooperate (Boyd et al. 2014). That could be dangerous in itself. Even today, despite popular myths about individualism and accomplishment, industrial societies would crash and burn without the massive cooperation rooted in our ultrasocial primate history (Henrich 2018). The idea of the “self-made” billionaire is, in our opinion, a delusion built on widespread ignorance and abundant fossil fuels. Henry Ford would have been unable to get rich without the prior cooperation of thousands of people over centuries doing everything from inventing machine tools to learning to drill for oil and construct roads or, indeed, the cooperation of contemporaries in lending money, being employed by him, or buying his products. The delusion was not harmless, however; Ford’s competitiveness and anti-Semitism were inspirational to Adolf Hitler (Ullrich 2016).

The cooperation that language and ultrasociality fostered clearly was a major precursor to the astounding dominance that Homo sapiens has achieved. People themselves now have an aggregate biomass of over 300 million tons (Walpole et al. 2012) and with their domestic mammals compose some 96 percent of the weight of all Earth’s mammals (Bar-On et al. 2018). That’s extraordinary in the entire history of life on our planet as well as perilous.
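
The 96 percent figure follows from the approximate mammal biomass totals (expressed as gigatonnes of carbon) reported by Bar-On and colleagues (2018); a minimal arithmetic sketch using their rounded values:

    # Arithmetic sketch with rounded biomass values (gigatonnes of carbon)
    # as reported by Bar-On et al. (2018); figures are approximate.
    humans_gtc = 0.06          # all living people
    livestock_gtc = 0.10       # domesticated mammals, dominated by cattle
    wild_mammals_gtc = 0.007   # wild terrestrial and marine mammals combined

    share = (humans_gtc + livestock_gtc) / (humans_gtc + livestock_gtc + wild_mammals_gtc)
    print(f"Humans plus their domestic mammals: {share:.0%} of all mammal biomass")  # ~96%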

Agriculture: Humanity’s greatest mistake?

As human groups got larger, some of them apparently found it difficult in certain environments to obtain sufficient resources by intensive hunting and gathering (Cohen 1977, 2009). As a result, some groups stopped roaming and in stages began to practice first fish harvesting and herding and then plant agriculture, frequently switching between foraging and agriculture seasonally. The latter process has been described as the coevolution (Ehrlich and Raven 1964) of people and plants as they domesticated each other (Rindos 2013) in an interaction brought on by human need. Population growth, however, was central to agriculture being taken up: farming spread not because it was newly invented—foragers understood plant biology very well (Sutton and Walshe 2021)—but as a consequence of increased demand for food.

That demand may have been reinforced by an exogenous factor, the loss of much coastal foraging territory to sea-level rise as glaciers melted some 11,000 years ago. That possibly increased population pressures in many areas, depending on the comparative melting rates and foraging quality of land being freed of ice (Zvelebil 1986). In any case, much of Homo sapiens’ rise to planetary dominance (Ehrlich and Ehrlich 2009)—but also much problematic human behavior today—traces to the resultant agricultural sedentism (Sapolsky 2017).

The ability of small groups practicing some form of agriculture to produce more food than they needed—surpluses—allowed a greatly increased division of labor and, with population growth, provided an opportunity for more dominant individuals to usurp resources and turn dominance hierarchies into hierarchies of wealth (Perret et al. 2020). Soon there could be soldiers, farmhands, priests, builders, mechanics, and servants. Being sedentary enabled people to accumulate material things, expand trade, invent money, and develop great economic disparity, organized crime, slavery, and corruption. All of that required a wider extension of trust, not just among members of one’s own small group but in some cases to strangers and remote institutions.

Normality of the rat race

Settling down and farming, we believe, laid the foundations of today’s rat race for more—more status and (related) more stuff. The evolutionary roots of that rat race almost certainly lie in the nearly universal race of organisms to outreproduce other members of their populations. One could therefore trace what Thorstein Veblen (Veblen 1925) famously described as “conspicuous consumption” back to millions of years of sexual selection (Sundie et al. 2011, Collins et al. 2015). Sedentism, in our view, just added (for males) displays of material wealth to strength, bravery in defense, skill in hunting, peacock tails, giant antlers, and other costly signals related to male fitness. Indeed, in the perpetual competition among individuals for resources, including mates, one can see the historic roots in natural selection of one of our major current problems: massive greed, made possible by the cultural evolution of farming and manufacturing. The major role of comparative status in the consumption behavior of many human beings today can be seen in the thriving profession of marketing (O’Cass and McEwen 2004). And from our much more distant past, one can also see traces of optimal foraging behavior (Pyke 1984) at both individual and group levels, as, for instance, in finding ways to garner the most additional energy per unit of energy invested. The new abnormal has, in other words, added vast new dimensions to a very normal human attribute: trying to outreproduce others.

The new abnormal of globalism

The last 300-year stretch has therefore been not even remotely normal for our species, as the colossal increase in group size alone shows, accompanied more recently by the emergence of not just local or national but global concerns (Locher 2019). Moreover, the new abnormal is almost guaranteed to be temporary (Wackernagel et al. 2019, Dasgupta 2021), because human life-support systems are increasingly threatened and corrupt cadres with little interest in the common good increasingly control large nations. Ironically, however, the relatively new idea of a common good that now extends globally, far beyond the small group, is held by many elements of civil society. It could prove a redeeming feature of civilization if it were taken seriously in policy.

Trust extending beyond the immediate group, that essential ingredient of the human rise to dominance, may have started with long-distance forager trade and the expansion of group size. Its erosion in recent times may have been exacerbated by industrialization and the development of a global culture gap that the Internet, with its digital divide, seems helpless to close (Lissitsa and Lev-On 2014). The Internet has allowed the development of surveillance capitalism (capitalism with the commodification of personal information; McChesney 2013, Foster and McChesney 2014), which is creating the epistemic chaos (lack of agreement on sources of knowledge) that Shoshana Zuboff warns against (Zuboff 2019).

The new abnormal of mass movement

One important characteristic of our hunter–gatherer ancestors was the ability, indeed usually the necessity, of moving around. Resource availability varied greatly through space and time in many ecosystems, and people needed to migrate to find persistent pools of water during dry seasons, visit certain trees at fruiting time, go to specific rivers when there were seasonal influxes of spawning fish, or follow a herd of ungulate prey on its annual migration. When populations harvesting a resource began to exceed local carrying capacity, conflict could be avoided by groups moving apart. Indeed, the most prominent theory of the origin of states postulates that population growth following the agricultural revolution led to groups becoming circumscribed by environmental factors (e.g., shortage of agricultural land) or social factors (e.g., competing groups). Circumscription eventually led to the current nation-state political structure of the globe (Carneiro 1970, Langton 1988, Johnson and Earle 2000, Carneiro 2012, Schönholzer 2017, but see Stocker and Xiao 2019). It caused a transition from kin-centered politics to the importance of pseudokin—nonrelatives with whom we make emotional connections through constant exposure (Ehrlich 2000).

Is the postagricultural revolution phenomenon of mass migration normal? Hunter–gatherer groups moved around, usually within a relatively small area except when invading new habitat, sometimes to move apart. Over tens of thousands of years, they departed in small groups from their African homeland and occupied most of Earth’s land surface. But mass movements, such as the triangular slave trade, which transported as many as 12 million Africans to the Western Hemisphere over a few centuries, only became possible with the development of large populations, a need for agricultural laborers, steep social and power hierarchies, and appropriate technologies. The magnitude and rate of the Ukrainian exodus in response to the 2022 Russian invasion would have been impossible in normal human history.

Deepening of the culture gap and the power of cultural evolution

Although hunter–gatherer groups differentiated only slightly in basic genetics after they left Africa, they clearly and rapidly evolved culturally, as they had on their native continent (Toups et al. 2011), to fit into diverse environments. Within those groups, there were no substantial culture gaps such as exist in the new abnormal of industrial societies. Uninformed about how the world works, most people now are unable to participate in planning to avoid a collapse of civilization, and many may oppose taking appropriate measures. The vast scale of the culture gaps within modern societies, illustrated by near ubiquitous growth mania, is one of the main things that makes today’s civilization the new abnormal. It is ironic that cultural evolution, a major element of the success of Homo sapiens, now appears to have become maladaptive.

The power of cultural evolution is attested to by how genetically similar people learned to survive and then thrive in environments as different as Baffinland, the Kalahari, and the Amazon. Sometimes people have dramatically altered their stock of nongenetic information in only a few generations or even within a single generation, as international immigrants often demonstrate. We know, for instance, that a rich, complex, distinct Māori culture had developed just a couple of centuries after Polynesians reached New Zealand (e.g., Barber 2004). The historical significance of early cultural speciation can be seen in the postulated cultural evolution of monotheism in desert environments and polytheism in tropical forests (Sapolsky 2005, Sapolsky 2017). One can assume that cultural differentiation both in traits that had selective value (hunting techniques) and those that probably did not (inventing different gods) was normal for all of human history and went on at different rates (Rogers and Ehrlich 2008) right up to now. But the new abnormal era’s epistemic chaos today, with, for instance, many economists promoting perpetual growth and ecologists pointing out that it’s impossible, is clearly hindering adaptive cultural evolution, as climate deniers, antivaxxers, business schools, and QAnon so dramatically demonstrate.

Another major difference between today and even the early industrial era is the vastly accelerated creation and spread of cultural traits, from the use of antibiotics and mRNA vaccines, for example, to the explosive growth of cell phone use and the behavioral changes it encourages, such as sexting (Mitchell et al. 2012). New cultural streams, both prosocial and antisocial, trace from differences in values and ideologies that arose from the agricultural and industrial revolutions. They have flourished in the new abnormal of gigantic populations, proliferating technologies, global communications, and the mix of surveillance capitalism and rentier capitalism. The rapid dissemination of new cultural understandings theoretically could lead in the new abnormal to safety, peace, and prosperity, but at present it seems instead to favor violence, cults, and a general failure to deal with existential threats.

Perpetual growth has been called the “creed of the cancer cell,” and the parallels between the human population’s impact and that of a cancer on the skin of Earth have been made scientifically explicit (Hern 1992, MacDougall 1996, Rees 2020). The cancer has already produced symptoms such as colonialism, genocide, large-scale warfare, pandemics (Keeling and Grenfell 1997), environmental destruction (Harte 2007), and a possible erosion of cooperation (Lozano et al. 2020).

Can understanding our full history help us design a better future?

Natural selection in our distant past produced a primate with an extraordinary ability to store, communicate, and manipulate nongenetic information. In small hunting–gathering groups this led, in broad terms, to lives of relative power equality (Boehm 1997, Wilkinson 2001, Gray 2011), reciprocity, altruism, cooperation, trust, and, as the record shows, in some cases sustainability. Our ancestors’ behaviors also included some interpersonal and intergroup violence and other attributes we now consider undesirable when they still occur. More recently, our species has produced through cultural evolution amazingly rapid technological developments and a population explosion. That is proving, in terms of geological time, to be a flash in the pan, but a few centuries or so of a new abnormal have generated trends that threaten the very persistence of human civilization, or even of the human species.

Even when confronted by the lethal coronavirus epidemic, massive wildfires, extreme and erratic weather, disappearances of wildlife, environmental refugee flows, and other signs of ecological collapse, few people today recognize the likely shallow temporal depth of the new abnormal. Faith in a future of more seems still unshaken. The widespread assumption remains that everything from automobile numbers and passenger flights to corporate earnings, consumer demand, automation, and access to boundless energy (shifting to mostly “renewable”) will go on expanding indefinitely. And almost everyone assumes that there will always be technological solutions for escalating environmental risks, material constraints, and a wide range of diseases, thanks perhaps to artificial intelligence, whose possible contributions and attendant risks remain difficult to sort out (Haenlein and Kaplan 2019).

In opposition to that largely baseless optimism, humanity is possibly in a position to design and implement what we might call a normal 2.0, a viable future within the biophysical limits of Earth. One important element might be promoting more democracy. It’s many people’s favorite form of governance, however defined and however imperfect. Most foraging groups, especially early on, must have been much more democratic than even Athens in its democratic period (Bollen and Paxton 1997, Gray 2011).

As far as we know, forager societies were relatively more sustainable and less likely to be driven into the ground by leaders than have been most postagricultural undemocratic empires, perhaps because of foragers’ relative incapacity to overharvest resources. Slaughter of megafauna seems to have been a feature of some forager lifestyles (Martin 1967), but its impact on sustainability (considering the contributions of plants and aquatic resources to subsistence) is unclear. Scarcity of game did not necessarily lead to social collapse (Kay 1995). A North American Aboriginal foraging population sustained itself for millennia, switched in places to an agricultural base a couple of thousand years ago, and thrived until it was conquered by European pathogens (Smith 1989, Kay 1994). Perhaps the sustainability of forager societies traced to the wider distribution of leadership and a mobile lifestyle that did not provide surpluses for ambitious individuals to monopolize—to declare to be their personal property. We lean toward the latter explanation, but this is an issue that deserves more research.

Another possible reason for ancestral lifestyles to have been sustainable is that smaller group sizes allow for more face-to-face contact to work out courses of action while bringing fewer cultural viewpoints to the table. Ideological differences could make achieving needed consensus difficult. To help build consensus where it is necessary for civilization to persist, world leaders might try to put in place a multilevel regime of adaptive management that would continuously update the status of the global human enterprise and its environmental constraints and that would have as a major goal building consensus on how best to respond to those conditions. Unfortunately, such adaptive management clearly is now difficult to achieve even in subglobal arenas by giant “democracies” such as the United States (O’Toole 2021).

Humanity needs to find new ways of guiding cultural evolution (Ornstein and Ehrlich 1989, Ehrlich and Ornstein 2010) to design policies that will steer civilization away from catastrophe, establishing a worldwide survival regime. But that would require genuine leadership and education appropriate to twenty-first century conditions, both of which seem to be in vanishingly short supply as we write this. We need individuals who have the knowledge necessary to try to move socioecological complex adaptive systems (Levin et al. 2013, Preiser et al. 2018) in beneficial directions. In normal small-group forager culture, there was usually little need to develop leaders who could do even the relatively simple long-term planning essential to a farming society, such as to design and coordinate a small-scale irrigation scheme. Leaders now often must negotiate agreement between nuclear-armed nations or even find ways to form hierarchical social structures for huge groups (Powers and Lehmann 2014). The need for those leadership roles is critical in the now global agricultural, industrial, and demographic new abnormal, and filling them will be incredibly challenging. In more democratic systems, the chances for good adaptive management may be higher than in autocracies, because diverse leadership skills may be able to surface. We might even develop some modern Kandiaronks.

Establishing a flexible, fair, and evidence-based system of governance for the world is, we believe, the greatest challenge facing modern Homo sapiens, the sine qua non of its survival in the new abnormal (Rees 2010). What’s also clearly needed are much stronger constraints on rentier or surveillance capitalism, better judgment on technologies to deploy, avoidance of remaining stuck in a system of financial hierarchy such as has been clearly warned against (Graeber and Wengrow 2021), and a comprehensive plan of action for civilization to shrink its scale.

One avenue to gain some wisdom on sustainability and perhaps find ideas to adopt might be to look at an example of the longest sustained (tens of thousands of years) human societies, those of Australian Aborigines (Beattie 2021, Sutton and Walshe 2021). The Aborigines had one built-in advantage in their relatively small group size, which, as in other foraging cultures, tended to favor cooperation, healthy living, and individual autonomy. They did wipe out the continent’s megafauna, but, as did the Indigenous people of North America, they managed to evolve cultures that allowed sustainability despite that. A much smaller population size should be a long-range goal for Homo sapiens, but reaching it humanely will take numerous generations, and even reaching a population size of 1 or 2 billion would not likely provide many of the Aborigines’ band-size advantages.

A lesson we might more quickly learn from the Aborigines is closing critical parts of the culture gap. Aboriginal sustainability was built on an intimate and near universal knowledge of the biophysical environment in which the people were embedded. In current Western societies that knowledge is not only largely absent from the education system but systematically underrepresented and misrepresented in university curricula and by “leaders,” virtually all of whom revere growth and are unaware of the severity of impacts, often nonlinear, that growth can have on humanity.

Other forager behaviors could beneficially be adopted to help establish a normal 2.0. The !Kung bushmen shared food (Marshall 1961), as did other forager societies, and a rich–poor division did not exist. Today’s situation with hundreds of millions of undernourished people would be unlikely if the ethics of forager societies were reestablished through dedevelopment to redistribute wealth and power (Ehrlich et al. 1977), perhaps instituting the long-discussed guaranteed annual income (Bhatia 1968), or some other measures. Also essential would be reversing the privatization of the ecological, resource, social, and intellectual commons (Standing 2021).

Lessons might also be learned from the behavior of some early agricultural communities, such as the preindustrial Polynesian people of Tikopia. The small Tikopian population (approximately 1000 individuals) faced no culture gap and early on reportedly used a variety of population control techniques to keep their cleverly developed horticultural system from being swamped by overpopulation when disease or weather events failed to curb the size of the population (Borrie et al. 1957). More recently, that balance may have been threatened by the acquisition of ideologies (Christianity) and other intrusions from the new abnormal (Macdonald 1991, Firth 2013). In any case, Tikopia is an example of a society where human population size in relation to carrying capacity has long been an issue, whereas the issue is largely ignored everywhere in the new abnormal.

Can humanity move to a normal 2.0 with cooperation over competition and enough over more? Can we dramatically shrink the culture gap and the mismatches between our genomes and our environments? Can we accomplish the required humane shrinking of the scale of the human enterprise and reduction of the lethal inequities that now plague that enterprise? We hope so, and a new generation, symbolized by Greta Thunberg, gives us some of that hope.

Acknowledgments

We thank John Holdren, the late Lee Ross, and the Beijer ecological economics gang for many wonderful conversations around these ideas, and Pete Bing and the late LuEsther T. Mertz, whose long support made this work possible. Andy Beattie, Daniel Blumstein, Gerardo Ceballos, Jonathan Cobb, Jared Diamond, Joan Diamond, Peter Gleick, Larry Goulder, John Harte, Mel Harte, Simon Levin, Peter Raven, James Salzman, Robert Sapolsky, Chris Turnbull, Brian Walker, and two anonymous reviewers gave very helpful comments on this project and the manuscript.


Read or download the original paper here or use the link above.


Paul R. Ehrlich (pre@stanford.edu) is a Bing professor of population studies emeritus and Anne H. Ehrlich (aehrlich@stanford.edu) is a senior research scientist emeritus in the Department of Biology at Stanford University, in Stanford, California, in the United States.


The MAHB Blog is a venture of the Millennium Alliance for Humanity and the Biosphere. Questions should be directed to joan@mahbonline.org

 

 

The views and opinions expressed through the MAHB Website are those of the contributing authors and do not necessarily reflect an official position of the MAHB. The MAHB aims to share a range of perspectives and welcomes the discussions that they prompt.