New American Studies Journal
doi.org/10.18422/73-14

The Bomb and Climate Catastrophe in Fact and Fiction

Marianna Torgovnick

I will never forget the Friday in October when my handsome teacher, Mr. Towse, a true first crush, cancelled a test scheduled for the following week, saying that we should enjoy ourselves because “there might not be a Monday.” When we asked why, he suggested we consult our parents. When I did, I realized—for the first, though certainly not the last, time—how quickly everything around me could vanish—including me. Awareness of mortality and apocalypse was hardly new. But the instantaneousness of it all at human scale and under human rather than divine volition has defined modernity for generations. This essay counterpoints two existential threats: America’s understanding of nuclear events, in fact and fiction, and its understanding of climate change. Though the temporal pace of the two catastrophes differs substantially—the nuclear quick and clearly defined, climate change slow and more inchoate—the ways that they overlap make our different sense of the most likely endings all the more striking and even shocking.

Beginnings: The Story of Climate Change

A solitary farmer clearing and planting a field; the conquest of the Americas; smokestacks belching into a lowering sky; a white polar bear stranded on a tiny glacier: any number of images symbolize climate change because different people choose different events to mark the beginning of the Anthropocene, the current geological era in which humans are leaving their marks upon the Earth. Some locate the beginning in the felling of forests to farm or even in the cooking of food—markers that suit Yuval Noah Harari’s shocking thesis in Sapiens (2014) that the human race is and always has been genocidal and, ultimately, suicidal. A larger number identify imperialism and the extensive use of fossil fuels in the 18th- and 19th-century Industrial Revolution. Both timeframes include centuries of what Rob Nixon calls slow violence (2011). Still others point to the speed and clarity of the first atomic explosions in 1945, when man-made radiation entered Earth’s soil and atmosphere and the postwar economic boom produced a dramatic and unprecedented increase in the use of fossil fuel. Depending on which marker we choose, a certain temporal asymmetry in nuclear and climate change plots becomes inevitable, though common patterns of thought persist.

For my purposes, 1945 forms a convenient and logical beginning. For although climate change involves gradual processes over centuries and even millennia, the increase in carbon emissions after 1950, in what Roy Scranton and others call “the Great Acceleration” (2018), remains dramatic and (in Harari’s terms) suicidal (Harari, 2014: 18). Just as the post-World War II generations had the power to destroy civilization and to end humanity through total nuclear war, they have also had the power to arrest or to accelerate climate change. Seduced by creature comforts from air conditioning and automobiles to computers and even fashion, we have basically chosen to accelerate or, at least, not to arrest carbon emissions. We are the generations the future will identify as having lived what Amitav Ghosh calls “the great derangement” (2017).

For those who heard and understood correctly, non-fiction sounded early alarms about Earth’s fragility. In 1968, Paul Ehrlich published The Population Bomb, warning about an overstressed Earth that could not feed the huge growth in population after the end of World War II. Young people of conscience responded by vowing to reproduce no more than themselves: two children for every couple, a policy made easier by the women’s movement and birth control pills. And yet, once outside major cities, it was easy enough to look around and think, But what’s the problem? There’s so much open land. Meanwhile, what we now call climate change accelerated as carbon emissions grew alongside populations. As Roy Scranton puts it trenchantly, “the world groans under the weight of seven billion humans” (2018: 3), a figure closer to eight billion in 2022, and would need to lose seventy percent of its current population to return to 1940 levels (ibid.: 321). That’s a shocking figure that bears repeating: to return to a state of climate health, the Earth would need to support 70 percent fewer people. Scranton chooses 1940 because, with all its horrors, World War II killed, by the estimate he uses, “only” 4% of the world’s population (ibid.); if we compare 4% to 70%, we can appreciate the extent and likely intractability of the current climate catastrophe.

The first Earth Day was held at the beginning of spring 1970, after a damaging oil spill in California, and attracted a range of people around the world, especially the young. Its goals were peace (a nexus with nuclear plots) and environmental protection. Interest in the celebration spread and has never really vanished, but attendance and attention peak at ten-year anniversaries and are siphoned off by other movements, such as opposition to the Vietnam War and mobilization behind worthy but local issues—the plight of endangered species, the Amazon rainforests, and so forth. The “so forth” in the last sentence is not meant to be dismissive, but to indicate one reason that environmentalism, while extremely influential on everyday conduct—think, for example, of recycling in 2022 versus 1970—segued into a variety of issues rather than focusing like the proverbial laser beam.

After 1970, the government received other reports and alarms that came and went without reaching a tipping point, in part because the movement to halt climate change lacked clear and compelling terminology. “Greenhouse emissions” was always rather nebulous, conjuring visions of botanical gardens or rural farms. Skeptics offered, and continue to offer, cold weather or erratic snowstorms as a refutation of “global warming,” persistently conflating “weather” with climate. The U.S. government has taken some rather quaint concrete actions—for example, when Jimmy Carter installed solar panels on the White House that were removed under Ronald Reagan. But the government’s intentions to guard the environment often got sidetracked by events like the Energy (Oil) Shortage of 1973 and rising fuel prices in 2022. In what has been aptly described as a regular ritual, industry experts examined climate issues and “found good reason to be alarmed and even better excuses to do nothing” (Rich, 2018). Until the term “climate change” achieved currency in the 21st century, the movement lacked a name that included the large, systemic forces that permeate industrial, social, and governmental structures not fully subject to individual or communal will. Given the emerging crisis, the situation could not last, and it didn’t.

Beginnings: Gone Nuclear

Unlike climate change, nuclear histories announce their beginning quite clearly, with the first and to date only explosions of atomic bombs for wartime purposes in August of 1945, in Hiroshima and Nagasaki, Japan. The covert Manhattan Project in New Mexico developed “the Bomb” for use against Nazi Germany, but when the Third Reich collapsed in May of 1945, the Project’s weapons weren’t needed in Europe. Would they ever have been used there, even though the Allies had known with some certainty since 1942, after Germany’s devastating losses in the Soviet Union, that Nazi Germany’s defeat was inevitable? We’ll never know. We do know that they were used in Japan—to forestall the need for a deadly invasion after Okinawa, most said; to warn the Soviets against incursions into Asia, some said; or both.

Almost cartoonlike and childish in form, the mushroom clouds above Japan dominate representations of the atomic, masking the many complexities of the bombings and their aftereffects. The clouds became a synecdoche for Cold War fears; they also came to symbolize the nuclear energy basic to postwar life. Though atomic and then hydrogen bombs produce many diverse shapes and colors—stream the end of Stanley Kubrick’s Dr. Strangelove (1964) if you have any doubt about that—the mushroom cloud remains an immediately recognizable symbol that to some extent displaced detailed understandings of what happened on the ground, which would have been entirely predictable based on the firebombing already used in Germany and Japan. Yet Robert Oppenheimer, the head of the Manhattan Project, expressed his despondency at what had been achieved in tearful apocalyptic words from the Bhagavad-Gita that can be found on YouTube: “Now I am become death, destroyer of worlds.”

Well before the atomic bombings and, indeed, in their aftermath, World War II traded quite freely in mass death. Yet the first atomic explosions had, no pun intended, a special impact. As the provocative French critic Georges Bataille put it, Hiroshima shocked us because “it depended on their fellow men to kill [the residents of Hiroshima] or to let them live…the atomic bomb draws its meaning from its human origins” (1995: 227). Suddenly, and forever after, humans could destroy huge numbers of their fellow beings, within seconds and at a distance, with a total lack of discrimination between military and civilian targets. The atomic bomb stood out because of its human volition and an assured total destruction that mirrored, on a human scale, divine Apocalypse.

Between 129,000 and 226,000 people died in Hiroshima and Nagasaki—the precise numbers impossible to identify, with radioactivity taking its toll over years and decades. Would the joy that erupted in the U.S. have been muted if people had understood the full lethality of radiation and future threats? If they’d known that every person born after 1945—their children, grandchildren, and beyond—would carry traces of radiation in their teeth? Once again, we’ll never know. We do know that, years after the first impacts, tests of atomic and then hydrogen bombs routinely used American servicemen as witnesses.

The first detailed treatment of life and death on the ground came in John Hersey’s Hiroshima, first published in August of 1946 as a New Yorker article that flew off the newsstands and then as a widely read paperback. Focused on just six survivors, two of them Christian clergymen, Hersey’s gripping narrative is not especially representative. Yet simply by getting the facts out, Hersey’s book accomplished something remarkable. For as early as 1945 and continuing up to today, the United States has both claimed the atomic explosions as a symbol of its power and disavowed them. American jurists and politicians carefully crafted indictments for war crimes at Nuremberg to preclude America’s being blamed. Until 1952, when the U.S. occupation of Japan ended, it was illegal for the Japanese to publish photos or to testify in public about the bombings. The combination of delayed awareness and official suppression perhaps predicted the widespread mass denial that has followed in films, novels, and, more surprisingly, in non-fiction and even in history.

To date, one of the very few movies to show the devastation in Hiroshima is Alain Resnais’s Hiroshima Mon Amour (1959), a French rather than an American film and a love story rather than a full history. Its famous fourteen-minute opening intercuts a couple’s intimate lovemaking with actual footage from around the bombings and with shots of memorials. The atomic bomb hovers in the background of the interracial couple’s difficulties in remembering and connecting. In fact, the prologue references the Bomb as marking a before and an after in our understanding of what it means to be human. Later films excised the subject: the Oscar-winning The English Patient (dir. Minghella, 1996) completely cut the atomic bombings that form the climax of its source, Michael Ondaatje’s novel (1993), in which, after Hiroshima, the Sikh hero walks away from friendships with whites.

Well into the 21st century, museum exhibitions and other representations in the West stressed World War II in Europe rather than in the Pacific. Frequently, the use of atomic bombs appears in the passive voice—a bomb “was dropped, followed by a second bomb”—rather than “the United States dropped two atomic bombs on Japan in August 1945.” In 1995, an exhibition at the Smithsonian in Washington D.C. about the Enola Gay, the airplane that bombed Hiroshima, drew protests when it included Japanese points of view, which were subsequently curtailed. In 2010, a detailed history of events on the ground called Last Train to Hiroshima (Pellegrino, 2010) was pulled from bookshelves and destroyed after it was revealed that its author had been duped by a veteran masquerading as a crewmember aboard the Enola Gay. The story of Last Train to Hiroshima had several other bizarre twists: the author lied about his academic credentials; James Cameron—prestigious director of Titanic (1997) and Avatar (2009)—dropped his plan to make a major film based on the book. Perhaps the oddest twist of all: false claims by veterans that they flew on the Enola Gay are, apparently, common enough to be a thing.

In a comparable way, through the 1950s and into the 1960s, much of the rhetoric and government guidance about atomic and then nuclear bombs favored plucky versions of how civilians could and would cope. The famous government film for schoolchildren called Duck and Cover is the best-known example. But there are many others, including Civil Defense publications that make bomb shelters look not just feasible, but downright cozy: Betty Crocker in the bunker; card games and TV after the apocalypse. Once again, it couldn’t last, and it didn’t.

Empty Cities: The Nuclear

I remember being on Manhattan’s Fifth Avenue early one Sunday morning before department stores opened and tourists arrived. As I surveyed the completely empty streets, I thought of plots from Rod Serling’s TV show, The Twilight Zone (1959-1964), and felt panicky. My friend and I wondered whether the streets had emptied because of a nuclear threat and whether we should be heading somewhere, anywhere—perhaps the subway station from which we had emerged? The scene and the emotion mirrored images of empty cities that populated the news during the first six months of the Covid-19 pandemic—urban streets and landmarks devoid of people. The images were strange and unnerving, but also something I had seen before—something we had seen before—in films both about the nuclear and about environmental catastrophe.

During and after the 1950s Red Scare, the McCarthy hearings, and the House Un-American Activities Committee investigations, science fiction encoded and displaced nuclear anxieties and fears of Soviet infiltration in motifs from Godzilla (dir. Honda, 1954) through The Thing (dir. Nyby, 1951) to the pod people in Invasion of the Body Snatchers (dir. Siegel, 1956). Hollywood produced no movies as direct as Hiroshima Mon Amour (1959), but it developed its own ways of representing nuclear dangers. Best-selling novels and iconic films show and dwell on empty cities. What is the appeal of such images to the human imagination? Why do we enjoy the spectacle of empty cities, a negation of so much that civilization has achieved? What anxieties and obsessions does the absence of humans touch? We had a taste of that in 2020.

Silva, Paulo. “A deserted Times Square during the coronavirus lockdown in New York City, USA.” New York, April 13, 2020. URL: https://unsplash.com/photos/5oO1xH5h8kQ 

A scene from The World, the Flesh, and the Devil (dir. Ranald MacDougall, 1959), starring Harry Belafonte.

Perhaps the best-known 1950s example of empty cities is a novel by Nevil Shute called On the Beach (1957), made into a popular 1959 film starring Gregory Peck and Ava Gardner (dir. Kramer). Like Hiroshima Mon Amour, it’s a love story, in which the captain of an American submarine, underwater when radiation passed over the United States, and a woman in Australia wait, like everyone else Down Under, for their own deaths as a huge, deadly cloud moves inexorably from the Northern to the Southern hemisphere. In this story and several others of the period, nuclear disaster does not arrive in the form of actual bombs, but rather in widespread but quickly moving fall-out, often released by accident or for vague, unspecified reasons.

My favorite empty city movie, The World, the Flesh, and the Devil (dir. MacDougall, 1959), is less well known and has only recently become available on VHS, DVD, or streaming, perhaps because of its star Harry Belafonte’s radical politics. Its racial themes, in turn, make it the far more important movie today. Belafonte initially made his reputation as a singer by capitalizing on his West Indian heritage to project a happy-go-lucky, Calypso image. “Matilda,” “Calypso Rock,” “The Banana Boat Song”: daylight come and I wanna go home. He was the first Black singer to appear on the then-stellar Dinah Shore Show and later toured the nation with a white husband-and-wife dance team, Marge and Gower Champion, giving some of the first integrated performances in the American South. One night in segregated Richmond, they got carried away by the applause and held hands to bow. The stage manager nearly fainted but, to his surprise, the applause continued. It was a key moment that Belafonte would repeat in The World, the Flesh, and the Devil, which cast him as the romantic lead opposite a white actress, Inger Stevens, who was in real life, perhaps not coincidentally, married to a Black man. It was not the first occasion on which Belafonte, tutored by singer and activist Paul Robeson, fought for the right to display interracial romance, and it would not be his last political act.

In The World, the Flesh, and the Devil, Belafonte plays Ralph Burton, an engineer underground when a quickly moving nuclear cloud passes overhead. When he emerges from a mine after several days of being trapped and finds no people, he feels at first that his town is playing a trick on him. After he drives to New York and sees highways, bridges, and tunnels littered, as in many disaster films, with empty cars, he understands that he might, in fact, be the last man alive on Earth. New York looks a lot like end-of-days Rapture, since there are no corpses or any evidence of death in sight.

Many of the settings and some of the actual scenes in The World, the Flesh, and the Devil influenced a film more of my readers will have seen, 2007’s I Am Legend (dir. Lawrence), with Will Smith, in which a flawed vaccine, rather than nuclear fallout, has destroyed humanity. In fact, like Charlton Heston’s 1971 The Omega Man (dir. Sagal), which shows a radically changed Los Angeles, the films trace a common ancestry to a 1954 novella by Richard Matheson called I Am Legend and a 1901 novella by M. P. Shiel called The Purple Cloud. In each version, the cause of the disaster differs: genetic mutation in I Am Legend, a plague following a nuclear war in The Omega Man, and free-floating radiation in The World, the Flesh, and the Devil. Several of these films have distinct racial subtexts that also appear in many representations of climate change. In these and other disaster films, the end of the human race tends to come about by whatever the culture fears most at any given period, with the relationship between nuclear events and other disasters—pandemics, killer viruses, fierce weather, climate catastrophes—fluid and surprisingly labile. That’s a fact, and it continues to be important in novels that appeared in 2020—for example, Lawrence Wright’s The End of October (2020) and Rumaan Alam’s Leave the World Behind (2020), which reference both climate change and rivalries with Russia as proximate causes of catastrophe.

I began this section with pre-teen me, stranded and scared on Fifth Avenue and pondering nuclear apocalypse. In the end, my friend and I decided not to head back to the subway station but instead to proceed to our goal, the Museum of Modern Art, which had just opened for the day, where everything seemed normal. As in hard-core monster or zombie movies—Godzilla, Cloverfield (dir. Reeves, 2008), World War Z (dir. Forster, 2013)—we imagined total mayhem and destruction, Manhattan a ruin. But then, as when exiting a movie theater, buildings stood and people walked and breathed.

Middles: The Cuban Missile Crisis, Fail Safe, and Dr. Strangelove

After the 1950s, America’s nuclear policies had more twists, turns, and secret histories than can or should be accommodated here. A recent book by Fred Kaplan called The Bomb: Presidents, Generals, and the Secret History of Nuclear War (2020) details a lot of what we now know about long-hidden facts, including that Presidents Eisenhower and Kennedy—and all subsequent Commanders in Chief—vowed privately, sometimes only in diaries, that they would never unleash nuclear war, whatever the provocation. Such private convictions often jarred with public statements and with military advice.

As a result, during Kennedy’s tenure and for decades afterwards, the basic policy of the United States became “mutual deterrence” or—a less benign name for pretty much the same thing—“mutually assured destruction” (MAD). The policy did not prevent an arms race and, indeed, mandated one, for the two sides had to keep pace. But mutual deterrence did stop any sane nation from making the first nuclear strike, which has been enough to prevent devastation. Over several decades, mutual deterrence led to a series of strategic arms reductions and other agreements that I do not have space to rehearse here, noting only that the accords were generally seen as a good thing, since the world had come to understand that mass bombings would inaugurate a spoliation of the Earth so thorough that those who lived beyond the initial impact might come to envy those quickly dead. The term “nuclear winter” would not be coined until the 1980s, but the Cuban Missile Crisis of 1962 marks the beginning of a new stage in Americans’ understanding of and reaction to the threat of all-out nuclear war.

When the United States learned that the Soviets had placed missiles in Cuba, within easy reach of American cities, the crisis accelerated. Over thirteen days rather than the climactic weekend that sticks in the popular imagination (one of the many quasi-secret histories of the Bomb), President John F. Kennedy jousted with the Soviet Premier, Nikita Khrushchev, as generals on both sides exerted considerable warlike pressure for Armageddon (Munton and Welch, 2007). In fact, Kennedy masterfully exploited the slow speed of communications by telegraph (the famous red telephone “Hot-Line” did not yet exist) to explore his conviction that Soviet motivations were very much like America’s and not simply “nefarious.” Though it was not made public at the time—and, astonishingly enough, remained secret for decades afterwards—Kennedy agreed to stand down U.S. missiles stationed in Turkey in exchange for the Soviets’ removing their missiles from Cuba (Munton and Welch, 2007: 2). The resolution of the Crisis was thus a quid pro quo rather than, as popularly understood, a clear victory for the United States.

At an entirely different but hardly irrelevant level, the Cuban Missile Crisis boosted two best-sellers that featured plots in which technological and human safeguards fail, sending planes bearing nuclear weapons towards the Soviet Union. Both Fail Safe by Eugene Burdick and Harvey Wheeler (1962) and Red Alert by Peter Bryant (alias Peter George) (1958) were quickly made into films whose directors, mimicking the arms race, strove to reach movie theaters first, with Stanley Kubrick’s Dr. Strangelove (1964) winning this particular war.

In Fail Safe (dir. Lumet, 1964), an American President played by Henry Fonda and based on John Kennedy works with the Soviet Premier to destroy the rogue American plane. When everything fails and a pre-arranged signal confirms that Moscow has been destroyed, he makes the deal that the film’s poster promises will keep us “on the brink of eternity”: the American President authorizes an American bomber to drop two nuclear weapons of exactly the same size on New York, the American city of choice in many nuclear narratives. There are many wrinkles and tweaks, but director Sidney Lumet ends the film with a montage of ordinary New Yorkers going about their day—Black schoolchildren, white teens, a doorman, an older woman—and then stills each moving image to signify obliteration. Fail Safe might have been a big hit and an important film, except for Kubrick’s Dr. Strangelove. This well-known and much-loved film made its nuclear plot not high drama, as in Fail Safe, but high farce. And it reached America’s theaters first.

In Dr. Strangelove, the general who launches the attack, Jack D. Ripper, is played by Sterling Hayden, while George C. Scott’s General Buck Turgidson advocates for all-out war, savoring the possibility that men like him will have access to women in post-apocalypse bunkers. The titular character, played (along with two others) by comedian Peter Sellers, is a nuclear scientist and Nazi retread who eggs the generals on. In its memorable ending, the film uses a song by 1940s chanteuse Vera Lynn called “We’ll Meet Again [Don’t Know Where, Don’t Know When]” to accompany multiple images of atomic explosions culled from footage of tests around the world. The images usher the world out with nuclear war’s iconic image: the versatile and sometimes stunningly beautiful mushroom cloud.

Around the time of Kubrick’s film, the present writer’s teenage incarnation—who I’ll leave after this reference—opened a bubble gum wrapper that said: “The world is not such a bad place once you get used to being nervous about everything.” Kubrick’s subtitle conveyed much the same thought: How I Learned to Stop Worrying and Love the Bomb. While “love” was and remains a deliberate provocation, Americans by and large did learn to stop worrying about nuclear events because there was really nothing anyone could do. The same attitude permeated the very popular MAD magazine, founded in 1952, whose title would later resonate with the acronym for Mutually Assured Destruction and whose mascot was the ever-smiling Alfred E. Neuman. Strange as it seems, the macabre humor and irony of these cultural icons—MAD magazine and the Kubrick film, even my humble bubble gum wrapper—came to inform not just how people went about their lives, but also many of the twists and turns within national nuclear policy. In Vietnam, Lyndon Johnson and Richard Nixon used what Kaplan calls the “madman” theory (2020)—acting as if they, like Dr. Strangelove, would willingly consume the Earth. So did Ronald Reagan, during his first term, in his rhetoric towards the Soviet Union. In the same way, government leaders found that “mordant humor was another way of keeping sane: hence the special appeal of Dr. Strangelove to many nuclear strategists” (Kaplan, 2020: 193). Once again, it was time for paradigms to shift, and they did.

Middles: Climate Change

As Nathaniel Rich strikingly puts it in “Losing Earth” (2018), “[n]early every discussion we have in 2019 [about climate change] was being held in 1979.” The same remains true for 2020, 2021, 2022 and, one fears, beyond. For more than forty years, experts have known and warned about the most extreme dangers of climate change, dangers that, albeit at a slow pace, augur nothing less than the end of humanity. That being the case, we might well ask why, for so long, there were no equivalents to the nuclear treaties. What happened to prevent more robust action?

At the risk of oversimplifying: the Koch brothers and climate denial, as unlimited funds backed lobbies that favor fossil fuels, downplay human responsibility, and discourage federal action (Leonard, 2019). Add to this our dependence on electricity and our addiction to the comforts we take for granted, to which the 21st century has added an almost continual use of computers, cell phones, tablets, and crypto—all generators of carbon—with plane travel seen (except early in the pandemic) as a norm and even a necessity. By and large, most people and most communities proved willing to recycle for the good of the environment, with some favoring automobiles that emit less carbon or run on electricity. But without a national and, indeed, a worldwide strategy, climate change was addressed locally and, sometimes, not addressed at all. During the very period when nations worked together to reduce the nuclear threat, attention to climate change had, in the United States at least, less government support and less national consensus.

After George W. Bush took office in 2001, Al Gore sounded a widely heard alarm about the dangers of what was then called global warming in a documentary called An Inconvenient Truth (dir. Guggenheim, 2006). By 2017, when Gore issued An Inconvenient Sequel: Truth to Power (dirs. Cohen and Shenk), the term was climate change and the possibility of completely reversing it was gone. In 2006, Gore proposed that each viewer swap incandescent light bulbs for LEDs; how sweet and triste that seemed in 2017, when he recommended the far larger goal of worldwide political action. What had changed between 2006 and 2017 forms a major theme in Gore’s second film. For while Barack Obama made at least limited progress towards controlling climate change, most notably in the international Paris Agreement of April 2016, Donald Trump consistently worked to erase or even reverse it. As I write, President Joe Biden has put us back in the agreement and back on the job. But the last few years have taught us the possible transience of all things, and the invasion of Ukraine once again jeopardizes progress.

In 2020-21, satellite photographs showed that the Covid-19 crisis—so bad for so many, in so many ways—vastly improved air quality in China, India, and cities like Los Angeles. Water quality improved too. Just as ecology partly caused the Covid-19 crisis, the pandemic produced several months’ worth of ecological pause and consideration. But without consistent worldwide cooperation and corporate changes of heart—the kind of thing Kim Stanley Robinson projects in novels like New York 2140 (2017) and The Ministry for the Future (2020)—not to mention a motivated government in the United States, such gains have not been sustained. While some cities, states, and nations made lasting changes, most seem eager to return to business as usual.

Recent reports by the United Nations and authors like Roy Scranton (2015, 2018) and David Wallace-Wells (2019) have laid out, clearly, the dire consequences of climate change if current trends continue. While there are some signs that new forms of self-interest may motivate nations and businesses to make some helpful changes, many damaging effects of climate change can no longer be stopped or reversed: shrinking glaciers and rising seas, lost forests and rising carbon emissions, accelerated extinctions of plant and animal species. Long-term solutions require changing how humans think and how capitalism currently functions, transforming an ethos of self-interest and accumulation that works against common actions for the common good. Those are the facts, addressed by Naomi Klein (2014) and others. What do we find in fiction?

Writing in The Great Derangement (2017), Amitav Ghosh identifies the novel as an individualistic, psychological genre and claims that novels by their very nature fail to address climate change. Even decades before he wrote, Ghosh’s generic model was badly outdated and, since publication, his central thesis about climate change has worn badly. Many novels address climate change and related issues, including the spread of pandemics. Because developments remain very much in progress and not all the novels involved are well-known, this essay will point to a limited but representative sample of novels and their themes.

Several climate change (cli-fi) novels narrate the failure of science, individual scientists, and technology, a theme that also permeates nuclear plots: Ian McEwan’s Solar (2010), Barbara Kingsolver’s Flight Behaviour (2012), and a more radical novel, The Lamentations of Zeno (2016) by Bulgarian author Ilija Trojanow. In the 21st century, when racial injustice figures prominently in public discourse, cli-fi novels often show climate catastrophe disproportionately harming people of color, as has in fact already been the case: Omar El Akkad’s American War (2017); Cormac McCarthy’s now classic The Road (2006). Neither of these novels takes an especially cheerful view of human potential for cooperation.

In strong contrast, many cli-fi novels posit instead that community and coalition are not just possible, but also entirely likely after environmental disasters: African American sci-fi writer Octavia E. Butler, for example, in Parable of the Sower (1993) and Parable of the Talents (1998), albeit with mixed and sometimes pessimistic results. Less well-known novels with similar themes include Meg Little Reilly’s We Are Not Prepared: A Gripping Domestic Drama (2016) and a terrific Young Adult novel called The Carbon Diaries 2015 (2008), by Saci Lloyd. I have already mentioned Kim Stanley Robinson’s impressive New York 2140 (2017) and The Ministry for the Future (2020), which imagine social activists, greedy capitalists, computer nerds, and sundry others banding together to put global capitalism to work for the people and to save the Earth. While nuclear plots are often suspicious of the military, of government, and of technology, their critiques are not usually so broadly based, nor do they usually consider the resurgence of community as both possible and salutary.

Nuclear Plots: The End?

In 1982, Jonathan Schell published The Fate of the Earth, which argued that nuclear war must be avoided to save the planet—joining awareness of nuclear Armageddon to the climate catastrophe that would follow. It was one of several watershed moments in the 1980s in which actual policy and media representations converged. In the same shocking mode and soon after, ABC aired a made-for-television movie called The Day After (dir. Meyer, 1983) which, watched by roughly 100 million viewers, became a broadcast sensation. Instead of focusing on coastal New York, as Schell had, the film switched the scene to the heartland in Lawrence, Kansas. As the credits unroll, we see scenes that typify Americana, backed by the equally Americana soundtrack of Virgil Thomson’s “The River.” Then the film sets up a number of domestic plots with likeable families.

Suddenly, tensions around West Berlin escalate, the Soviets invade, and the good people around Lawrence and Kansas City see missiles streak into the sky. We see a flash of light like Hiroshima’s “million suns,” the screen goes red, and bodies vaporize into skeletons, some of them characters we’ve come to know and like. Red mushroom clouds and a firestorm of mass proportions follow, from which we emerge into a grey palette that shows us, for the remaining hour, the characters’ physical decay and, ultimately, death. The Day After is not stellar narrative, but its somber, grim mood suited its time, which was eager to reduce Cold War tensions. The events it showed lingered in the cultural imagination, recurring, for example, in Butler, in McCarthy, and in the landmark TV series The Americans (2013-2018), in which the characters watch and are deeply affected by The Day After.

In an unlikely, happy twist of history, political events in 1989-91 cooperated with the culture’s mood and greatly facilitated the reduction of nuclear threats. Gorbachev initiated glasnost and perestroika, the Berlin Wall fell, then the Soviet Union followed. Nuclear fears fell off, as they continued to do under George H.W. Bush, Bill Clinton, George W. Bush, and Barack Obama, all of whom pressed towards policies of nuclear disarmament, even when (like the two Bush Presidents) they willingly began conventional wars. There were still many political debates and maneuvers behind the scenes, too many to summarize here. But Americans had pretty much learned to stop worrying and live with the Bomb as a kind of low-level background noise.

Were there times when the noise intensified, accompanied by outbursts of nuclear fears? Sure. After 9/11, newspapers detailed, and America overreacted to, widespread anticipation of suitcase bombs and anthrax attacks. There were tense moments between India and Pakistan and more than just moments with North Korea. When he was elected President, Donald Trump characteristically wanted more nuclear weapons, not fewer. He also threatened both Iran and North Korea with “fire and fury.” Was Trump reenacting the madman theory? A man of irrepressible bluster? Like Vladimir Putin during the invasion of Ukraine, being extremely careless? A strange mentality? Hopefully in Putin’s case, as in Trump’s, we will never know—though, as this essay goes online in fall of 2022, fears are quite intense. In the same way, Trump targeted Obama’s progress on climate change, not just withdrawing from the Paris Accords of 2016 but also overturning roughly sixty environmental regulations. Though many of his rollbacks have since been reversed, his actions may well have a lasting effect on climate change.

Empty Cities: Climate Change

In Steven Soderbergh’s Contagion (2011), a bat carrying a banana casually drops it into a pigsty that houses a porker subsequently eaten at a banquet in Beijing. That bat, that pig, that virus initiate a massive pandemic that kills 30% of the world’s population. We do not see the bulldozers that razed the forest housing the bat until the very end of the film, but they represent the ecological source of disaster. The equally popular movie The Day After Tomorrow (dir. Emmerich, 2004) follows a multi-racial coalition of survivors after a sudden rise in oceans engulfs New York, followed almost immediately by a ferocious deep freeze—a combination indebted less to science than to a 1933 source called Deluge (dir. Feist). Just as most American movies avoid references to nuclear explosions, the first climate change films also proceeded mostly by indirection. Though it’s a zombie movie and hence in a different category, I Am Legend (dir. Lawrence, 2007) also fascinates in this regard: its whole first half displays New York’s infrastructure devoid of people.

In the winter and spring of 2020, in the daze of Covid-19’s long self-isolation, viewers around the world saw photographs on TV and in newspapers of empty cities: no people or, perhaps, just one who sets the scale, often within intact and familiar locales. People followed, “Liked,” and “Loved” Instagram and YouTube postings of wildlife returning to major cities: boars in Barcelona, goats parading through towns in Wales, a coyote in New York’s Tribeca.

Commentary on the videos often noted that the animals and, indeed, nature in general, seemed happier without us humans, and more exuberant. We might say that the Empty City motif that we encountered in nuclear narratives recurred later in history, big time, except that it never really left.

In The World Without Us, a 2007 non-fiction bestseller, Alan Weisman follows the Earth’s great cities year by year and century by century as nature imposes its own order in the absence of people, who have—the book insists as its premise—disappeared for unspecified reasons. Soon after, the History Channel series Life After People (2008-2010) shared the same premise, announcing quite prominently in Episode One (easily accessed online) that it was not “the story of how we [humans] might vanish” but instead “the story of what happens to the world we leave behind” (“The Bodies Left Behind” 0:24-0:34). Like nuclear war in empty city movies, world without people narratives strangely ignore the pachyderm in the room, which is climate change.

In fact, putting together its nuclear and climate-change incarnations, the empty city motif seems to involve some willful act of denial or repression in the culture at large—a shared sense that some things simply won’t be changed and can’t be faced, and so are best forgotten or left unsaid. When buildings stand but people vanish, these narratives seem to know, but refuse to acknowledge, a common truth: We will die, we will all inevitably die; what happens if things get so bad that humans die off completely? They seem to assert that, when we do, something permanent connected to us, something created by us, will remain as a form of immortality. The way these narratives avoid the materiality of dead bodies in their miraculously empty cities recalls a line in William Faulkner’s As I Lay Dying (1930) that defines a similar phenomenon: “The reason you will not say it is, when you say it, even to yourself, you will know that it is true.” In “Air War and Literature” (2003), W.G. Sebald identifies a similar syndrome motivating Germans after World War II, as they set to rebuilding their bombed cities without thinking or talking too much about what had caused the destruction. The empty city trope might even reflect some weird valuation of things and buildings over people, perhaps because of their very function as memorials. In an interview rather than in the book itself, The World Without Us (2007), Weisman says that world without people narratives are cautionary tales that seek “some non-threatening approach to disarm readers’ apprehensions about environmental destruction long enough that they might consider the impact of unbridled human activity” (The World Without Us – Alan Weisman). That’s true, though the compulsions behind the recurring motif seem deeper and stranger.

Whatever their motivations or set of motivations—and the growing awareness of climate catastrophe surely counts in the mix—empty city narratives have enjoyed an authentic vogue in the 21st century. To those already discussed or mentioned, we should add “Earth Without People” in Discover magazine (Weisman, 2005); Alan Taylor’s beautiful 2017 photographs of empty spaces that show only traces of humanity (an abandoned hut, a soda can); even Disney’s Wall-E (dir. Stanton, 2008), winner of the Oscar for Best Animated Feature and a pretty direct representation of the environmental damage that greedy capitalism can do. I would point as well to Eric Sanderson’s glorious Mannahatta project of 2007 and book (2009), which generate computer images of what Manhattan looked like before European settlement, juxtaposing images of highly populated areas with their woodland or wetlands origins; to some paintings at the 2022 Whitney Biennial; and to the 2022 staging of The Skin of Our Teeth (Wilder) at the Lincoln Center Theater. Like the first half of I Am Legend, which shows Central Park as a cornfield and tumbleweeds rolling down Fifth Avenue, all these empty city narratives encrypt the pastoral, but a pastoral without shepherds.

Climate Change: Beyond the End

In a what-if story rapidly becoming all-too-likely, newly formed deserts and dust bowls emerge on Earth, as do floods and tsunamis, food shortages, many millions of migrants, wildfires, viruses, and plagues. God or fate willing, children we love will be middle-aged, with some taste of “normal” life, but grandchildren or others now very young will still be in their prime. It’s terrible to think of them living at such vulnerable times.

Popular culture generates various scenarios for how things might go: at one extreme (see El Akkad, parts of Butler’s canon, McCarthy), a descent into violence and savagery; at the other (see some of Butler’s canon, Reilly, Lloyd, Robinson), the saving grace of cooperation and community. Once again, despite its zombie genre, the 2007 movie I Am Legend seems pertinent. At the end of the movie, the hero invents a vaccine that combines his own (male, Black) blood with that of a white, female zombie and commits delivery of the vaccine to the Irishwoman and the orphaned Canadian boy for whom she cares. To save humanity, the film enlists cooperation that is interracial, international, intergender, intergenerational, and, most startlingly (as also in 21st-century versions of fantasies like Planet of the Apes and Blade Runner), inter-species.

If it comes, when it comes, the world after catastrophe might be better or worse—or better for some of us—but it will still be bad. How bad it will get remains unknown, but projections are dire. Some experts coldly posit (they would say rationally project) the need for a radical drop in population, courtesy of man or nature. Perhaps a series of mini-nuclear events, since a large-scale conventional war like World War II no longer seems likely? Perhaps sharply negative population growth like that we have seen in the 2020s? Perhaps, more probably, a major pandemic: more lethal waves of something like Covid-19? For such thinkers “the population bomb” remains the simple fact that the Earth cannot support the carbon emissions that come with too many bodies. As a result, some thinkers now identify themselves as “new Malthusians,” a view that may factor into stories of things that kill large tranches of humanity. The climate, these thinkers say, would benefit from having many fewer people; therefore, let nature do its work, eliminating the weak. Should the Malthusian view prevail, the world would likely lose millions, perhaps even many millions. Afterwards, the need for fossil fuel would diminish as the number of humans did. We would need to clear less land for farming and housing; fewer cattle would add their carbon emissions to the mix; and, with a radical reduction in populations, there would be fewer cars and trips by plane. Not to mention that many places would not be worth seeing. The world would get a breather, though humanity would not, suffering terribly.

Some climate change deniers hope or even expect a technological miracle. Science or business invents something that grabs carbon from the air as effectively as rainforests and old-growth forests used to do. Science or business miraculously preserves the poles. Some device or creature cleans up water, restoring fish and coral reefs. Mars proves to be somehow hospitable to life. Urban planning gurus generate new cities like Venice around the world—not sinking and not endangered this time—that enable coastal cities to survive. Floating cities. Maybe. May be. But the darker Malthusian outcomes seem more likely.

Let’s say, in this what-if story, that everything goes bad, if not in 2040 or 2030, then in 2100 or decades after. The effects would surely impede human life or end it. In that worst-case what-if, it seems to me that popular culture’s fictions and non-fictions have already told us how things might go. While most cli-fi leaves humans in the picture, at least for now, some imagine—with ease and even tranquility—stories that won’t include humans at all. These plotlines don’t match any of the narratives we’ve seen so far. They’re uncanny narratives. Weird ones that become less uncanny and strange all the time as we move more deeply into the 21st century.

We’ve had a taste of such stories in narratives that have long been considered science fiction: the very end of H.G. Wells’s The Time Machine (1895), where crab-like beings scuttle on a people-less Earth; J.G. Ballard’s The Drowned World (1962), where the hero surrenders to an utter loss of identity as he merges with the swamp; episodes of Rod Serling’s classic 1960s series The Twilight Zone. 2018 provided a more realistic and more relevant example in Richard Powers’s best-selling and award-winning The Overstory which, in its first long section, called “Roots,” includes not just the ancestors of characters we’ll meet later, but trees, actual trees, as protagonists. When the novel becomes more traditional and focuses on human characters, they are all radically connected, in one way or another, to the trees we’ve met before, so that the plot develops both arboreal and human filiations and friendships. The novel’s view of trees is ultimately quite rhapsodic. When they share water through their roots, trees show community. When they warn nearby trees of insects or fire, they act like and, in fact, are sentient non-human beings. They act, in fact, a lot like us with our families and in our communities. Though valued for their wood by commercial culture, trees contain far richer treasures. They have seen and registered eons of history and contain wisdom that new plantings can never replicate, with old trees housing worlds upon worlds of fungal, insect, and animal life. They convert CO2 and provide medicines like aspirin and taxol—that is, unless we cut them down to make coffee tables and outdoor decks. The novel’s view of trees as agents and actors is radical, soul-stirring, and, in some sense, earth-shaking. Such views have also been documented, outside the novel, by contemporary scientists. While it’s possible to position them as partly fantasy, that does not, as Jedediah Purdy says in “Thinking Like a Mountain” (2017), make them “politically irrelevant.” Newly popular views of trees represent a genuine and remarkable change in the contemporary mindset, one also manifest in the popularity of plant-based diets, animal rights, and Animal Studies—all approaches that think differently about our relationship to the environment and to other species as forms of beingness.

The Overstory announced, in a popular fiction, a new mindset which accepts that “fires … come, despite all efforts, the blights and windthrow and floods. Then Earth will become another thing” (Powers, 2018: 500). In the book’s ultimate paradox, the natural world will renew itself “once the real world ends” (ibid.). A time—a soon-time—in which human life will have ended, ceding the Earth to non-human beings and to nature. It’s a peaceful ending and, in its own way, beautiful. A secular version, on a human scale, of what religious traditions have seen as divine apocalypse. As climate change tips into climate catastrophe, it’s a perspective for our time perhaps even more unsettling than Darwin’s resetting of the evolutionary clock was for the 19th century.

For, when all is said and done, let’s face it. Humans have been around on Earth for a while and might be around a while longer—surviving climate change as we have survived the nuclear threat—if, that is, we commit to massive, worldwide changes in the ways we think and act. But, as in Virginia Woolf’s To the Lighthouse (1927), time and nature remain neutral and even indifferent to our choice. Either we surrender some past and current values, or we don’t. Either we cooperate and endure, or we cease to exist. Either way, within nature, life will persist, and the fecund energy of the universe still throbs. We hear that message in a surprising number of “what if” stories today.

What I call thinking beyond the end appears in the culture at large, including youth movements that have begun to think climate catastrophe inevitable, in observations of vivid plant life after Chernobyl, and in various arts today. More and more, we see a willingness to accept that humans will not stop climate change, that humanity will end, and that other forms of life will persist or arise on Earth—and will not miss us.

Works Cited

Alam, Rumaan. Leave the World Behind. London, Oxford, New York, New Delhi, Sydney: Bloomsbury Publishing, 2020.

Ballard, James G. The Drowned World. New York: Berkley Books, 1962.

Bataille, Georges. “Concerning the Accounts Given by the Residents of Hiroshima.” France Soir, September 1946. Translated in Caruth, Cathy, ed. Trauma: Explorations in Memory. Baltimore: Johns Hopkins University Press, 1995.

Bryant, Peter. Red Alert. New York: Ace Books, 1958.

Burdick, Eugene, and Wheeler, Harvey. Fail Safe. New York: Dell Publishing, 1962.

Butler, Octavia E. Parable of the Sower. New York: Four Walls Eight Windows, 1993.

---. Parable of the Talents. New York: Seven Stories Press, 1998.

Cameron, James. Avatar. Twentieth Century Fox, 2009.

---. Titanic. Paramount Pictures, 1997.

Cohen, Bonny, and Shenk, Jon. An Inconvenient Sequel: Truth to Power. Paramount Pictures, 2017.

Ehrlich, Paul. The Population Bomb. New York: Ballantine Books, 1968.

El Akkad, Omar. American War. New York: Alfred A. Knopf, 2017.

Emmerich, Roland. The Day After Tomorrow. Twentieth Century Fox, 2004.

Faulkner, William. As I Lay Dying. New York: Jonathan Cape and Harrison Smith, 1930.

Feist, Felix E. Deluge. Admiral Productions, 1933.

Forster, Marc. World War Z. Paramount Pictures, 2013.

Ghosh, Amitav. The Great Derangement: Climate Change and the Unthinkable. Chicago: University of Chicago Press, 2017.

Guggenheim, Davis. An Inconvenient Truth. Paramount Vantage, 2006.

Harari, Yuval Noah. Sapiens: A Brief History of Humankind. New York: Harper, 2014.

Hersey, John. “Hiroshima.” The New Yorker, 23 Aug. 1946.

---. Hiroshima. New York: Alfred A. Knopf, 1946.

Kaplan, Fred. The Bomb: Presidents, Generals, and the Secret History of Nuclear War. New York: Simon and Schuster, 2020.

Kingsolver, Barbara. Flight Behaviour. London: Faber and Faber, 2012.

Klein, Naomi. This Changes Everything: Capitalism vs. the Climate. New York: Simon and Schuster, 2014.

Kramer, Stanley. On the Beach. Lomitas Productions Inc., Spinel Entertainment, 1959.

Kubrick, Stanley. Dr. Strangelove, Or: How I Learned to Stop Worrying and Love the Bomb. Hawk Films, Stanley Kubrick’s Production, 1964.

Lawrence, Francis. I Am Legend. Warner Bros., 2007.

Leonard, Christopher. Kochland. New York: Simon and Schuster, 2019.

Lloyd, Saci. The Carbon Diaries 2015. New York: Holiday House, 2008.

Lumet, Sidney. Fail Safe. Columbia Pictures Industries, 1964.

MacDougall, Ranald. The World, the Flesh, and the Devil. HarBel Productions, Sol C. Siegel Productions, 1959.

Matheson, Richard. I Am Legend. Gold Medal Books, 1954.

McCarthy, Cormac. The Road. New York: Alfred A. Knopf, 2006.

McEwan, Ian. Solar. London: Jonathan Cape, 2010.

Meyer, Nicholas. The Day After. ABC Circle Films, 1983.

Minghella, Anthony. The English Patient. Miramax, 1996.

Munton, Don, and David A. Welch. The Cuban Missile Crisis: A Concise History. New York/London: Oxford University Press, 2007.

Nixon, Rob. Slow Violence and the Environmentalism of the Poor. Cambridge, Massachusetts, London: Harvard University Press, 2011.

Nyby, Christian. The Thing from Another World. Winchester Pictures Corporation, 1951.

Ondaatje, Michael. The English Patient. New York: Vintage Books, Random House, 1993.

Pellegrino, Charles. Last Train to Hiroshima: The Survivors Look Back. New York: Henry Holt and Company, 2010.

Powers, Richard. The Overstory. New York: Norton, 2018.

Purdy, Jedediah. “Thinking Like a Mountain: On Nature Writing.” N+1, Issue 29, 16 Oct. 2017. https://www.nplusonemag.com/issue-29/reviews/thinking-like-a-mountain/.

Reeves, Matt. Cloverfield. Paramount, 2008.

Reilly, Meg L. We Are Not Prepared: A Gripping Domestic Drama. Mira Books, 2016.

Resnais, Alain, director. Hiroshima Mon Amour. Argos Films, Como Films, Daiei Motion Pictures, Pathé Overseas, 1959.

Rich, Nathaniel. “Losing Earth: The Decade We Almost Stopped Climate Change.” The New York Times, 1. Aug. 2018.

Robinson, Kim S. New York 2140. London: Orbit, 2017.

---. The Ministry for the Future. London: Orbit, 2020.

Sanderson, Eric. Mannahatta: A Natural History of New York City. New York: Abrams, 2009.

Sagal, Boris. The Omega Man. Walter Seltzer Productions, 1971.

Schell, Jonathan. The Fate of the Earth. New York: Alfred A. Knopf, 1982.

Scranton, Roy. Learning to Die in the Anthropocene. San Francisco: City Lights, 2015.

---. We’re Doomed, Now What? New York: Soho Press, 2018.

Sebald, Winfried G. “Air War and Literature.” In: On the Natural History of Destruction. Translated by Anthea Bell. New York: Random House, 2003.

Serling, Rod. The Twilight Zone. Cayuga Productions, CBS, 1959–1964.

Shiel, Matthew P. The Purple Cloud. London: Chatto & Windus, 1901.

Shute, Nevil. On the Beach. New York: William Morrow and Company, 1957.

Siegel, Don. Invasion of the Body Snatchers. Walter Wanger Production, 1956.

Soderbergh, Steven. Contagion. Warner Bros., 2011.

Stanton, Andrew. Wall-E. Walt Disney Pictures-Pixar Animation Studios, 2008.

“The Bodies Left Behind.” Life After People, created by David de Vries, Season 1, Episode 1, Flight 33 Productions, 2009.

“The Day After.” The Americans, created by Joe Weisberg, Season 4, Episode 9, Amblin Television-FX Production, 2016.

Trojanow, Ilija. The Lamentations of Zeno: A Novel. Trans. Philip Boehm. London/New York: Verso, 2016.

Wallace-Wells, David. The Uninhabitable Earth: Life After Warming. New York: Tim Duggan Books, 2019.

Weisman, Alan. “Earth Without People.” Discover Magazine, 6 Feb. 2005. https://www.discovermagazine.com/planet-earth/earth-without-people.

---. “The World Without Us - Alan Weisman [interview].” worldwithoutus.com/about_author.html.

---. The World Without Us. New York: Picador, 2007.

Wells, Herbert G. The Time Machine. London: William Heinemann, 1895.

Wilder, Thornton. The Skin of Our Teeth. Directed by Lileana Blain-Cruz, April 1-May 29, 2022. Lincoln Center Theater, New York.

Woolf, Virginia. To the Lighthouse. London: Hogarth Press, 1927.

Wright, Lawrence. The End of October: A Novel. New York: Alfred A. Knopf, 2020.

About the Author

Marianna Torgovnick is Professor of English at Duke University. Her work crosses disciplinary borders and, beyond English literature, is at home in Art History, Anthropology, Religious Studies, and Theory more broadly. Her publications include Closure in the Novel (1981), The Visual Arts, Pictorialism, and the Novel: James, Lawrence, and Woolf (1985), Gone Primitive: Savage Intellects, Modern Lives (1990), and Crossing Ocean Parkway (1994), the latter of which was awarded an American Book Award. Alongside numerous general interest articles, her other publications include Primitive Passions: Men, Women, and the Quest for Ecstasy (Knopf, 1997) and The War Complex: World War II in Our Time (2005). More information on Professor Torgovnick's writing can be found at www.mariannatorgovnick.com.