The term was coined in 2000 by a non-geologist, the Nobel Prize-winning chemist Paul Crutzen, to drive home the idea that the global cycle of air, water and rock has come under human domination. He was standing at the front of a press conference, and it was a quip. As he wrote more about the topic, others took up the term, including some geologists. Within a decade, the International Commission on Stratigraphy (ICS) had formed a Working Group to consider it.
As the Working Group has explored the possibilities, a healthy skepticism has arisen. The essence of a geologic time unit is that it can be marked somewhere in the rocks, preferably with a “golden spike” monument placed at the boundary in a spot that everyone agrees is useful and correct.
How would we do that, and where would it go? And when? Some proposals for the boundary are based on human dates, not natural signs. The researcher William Ruddiman has long argued that the “human era” began thousands of years ago, with widespread deforestation and the extinction of many large animal species like the mammoths. Crutzen thought that 1800, marking the Industrial Revolution, would be best. Another proposal is for 1950, when the “Great Acceleration” of post-World War II economic globalization began.
And there’s a strong appeal to 1945—specifically July 16 at 11:29:21 a.m. Universal Time, when the first atomic bomb was exploded in New Mexico, marking an abrupt change in atmospheric radioactivity. That’s what the Working Group picked earlier this year as a preliminary recommendation.
But if we start a new period of geologic time during the lives of living people, what geologic materials would geologists study? The rocks and soil deposits of that age are infinitesimally thin and fragile; even layers of ice in the polar ice caps could be gone in a few centuries if climate change causes them to melt. There are no fossils. How would we show Anthropocene deposits on geologic maps? It would be like writing a thick book on the history of Earth—which is what the geologic time scale represents—in which the last chapter is a single sentence: “To be continued.”
Another strong argument against the Anthropocene is that we already have a formal time unit that includes the human era: the Holocene Epoch. The Holocene began 11,700 years ago, at the start of the latest interglacial, a warm period between ice ages like dozens before it during the Pleistocene Epoch; what sets this particular interglacial apart is the rise of civilization. So, the geologists Phil Gibbard and Mike Walker have argued, if we’re already living in the time of “the evolution of the human environment,” why should we end that epoch and start another one using the same criterion?
The trouble is that the Anthropocene proposal does not arise out of evidence from the rocks and sediments of the world’s past. It arises out of our awareness of the geologic processes that run the world now. In that sense the Anthropocene proposal, unlike all previous tinkerings with the time scale, is backwards.
Today in the journal Science, Ruddiman and three coauthors suggest, in effect, that we call the whole thing off. They argue that we’ve already changed Earth so much during the last several thousand years that setting a bright line in 1945 would be missing the boat. “Despite differing views, the term ‘Anthropocene’ is here to stay,” they say. “One way forward would be to use the term informally (with a small ‘a’). . . . In this way, we could avoid the confinement imposed by a single formal designation, yet acknowledge the long and rich history of humanity’s environmental transformation of this planet, both for better and for worse.”
By coining a name with a geological sound, Crutzen did a bit of memecrafting worthy of Shakespeare. One reason it caught on so well was that news stories and commentators could say, “Geologists are considering this radical step.” But if geologists reject the Anthropocene, the concept will lose some of the power Crutzen gave it.