The Anthropocene Reviewed

Climate change is probably the biggest shared challenge facing twenty-first-century humans, and I fear future generations will judge us harshly for our failure to do much about it. They will likely learn in their history classes—correctly—that as a species, we knew carbon emissions were affecting the planet’s climate back in the 1970s. And they will learn—correctly—about the efforts in the 1980s and 1990s to limit carbon emissions, efforts that ultimately failed for complicated and multifaceted reasons that I assume the history classes of the future will have successfully boiled down into a single narrative. And I suspect that our choices will seem unforgivable and even unfathomable to the people reading those history books. “It is fortunate,” Charles Dudley Warner wrote more than a century ago, “that each generation does not comprehend its own ignorance. We are thus enabled to call our ancestors barbarous.”*

Even as we are beginning to experience the consequences of climate change, we are struggling to mount a global human response to this global problem caused by humans. Some of that is down to public misinformation and the widespread distrust of expertise. Some of it is because climate change feels like an important problem but not an urgent one. The wildfires that have become more common must be put out today. It is much harder for us to make the big changes that would, over generations, decrease the probability of those fires.

But I think it is also hard for us to confront human-caused climate change because the most privileged among us, the people who consume the most energy, can separate ourselves from the weather. I am certainly one such person. I am insulated from the weather by my house and its conditioned air. I eat strawberries in January. When it is raining, I can go inside. When it is dark, I can turn on lights. It is easy for me to feel like climate is mostly an outside phenomenon, whereas I am mostly an inside phenomenon.

But that’s all a misconception. I am utterly, wholly dependent on what I imagine as the outside world. I am contingent upon it. For humans, there is ultimately no way out of the obligations and limitations of nature. We are nature. And so, like history, the climate is both something that happens to us and something we make.

* * *


Here in Indianapolis, high temperatures reach above 90 degrees Fahrenheit only about thirteen days per year, and yet most of our homes and office buildings are air-conditioned. This is in part because architecture has changed dramatically in the last fifty years, especially when it comes to commercial buildings, to assume the existence of air-conditioning. But AC is also becoming more common because more of us expect to be able to control our interior environments. When I’m outside, if I can adjust my wardrobe a bit, I feel entirely comfortable if the temperature is anywhere between 55 and 85 degrees Fahrenheit. But inside, my comfort zone narrows dramatically, to a span of just a couple of degrees. I loathe sweating while sitting inside, as I often did when I lived in an un-air-conditioned apartment in Chicago. I find it equally uncomfortable to feel goose bumps of chill indoors. Like an expensive painting or a fragile orchid, I thrive only in extremely specific conditions.

I am not alone in this respect. A Cornell University study in 2004 found that office temperatures affect workplace productivity. When temperatures were increased from 68 degrees Fahrenheit to 77, typing output rose by 150 percent and error frequency dropped by 44 percent. This is no small matter—the author of the study said it suggested “raising the temperature to a more comfortable thermal zone saves employers about two dollars per worker, per hour.” Why, then, are so many summertime office environments so cool when it is both more expensive and less efficient to keep summertime temperatures low? Perhaps because the definition of “room temperature” has historically been established by analyzing the temperature preferences of forty-year-old, 154-pound men wearing business suits. Studies have consistently found that on average women prefer warmer indoor temperatures.

But when people point out the bias of AC settings in office buildings—especially when women point it out—they’ve often been mocked for being overly sensitive. After the journalist Taylor Lorenz tweeted that office air-conditioning systems are sexist, a blog in the Atlantic wrote, “To think the temperature in a building is sexist is absurd.” But it’s not absurd. What’s absurd is reducing workplace productivity by using precious fossil fuels to excessively cool an office building so that men wearing ornamental jackets will feel more comfortable.

* * *


I need to get used to feeling a bit warmer. It’s the only future for us. When I was a kid in Florida, it seemed natural to me to grab a sweatshirt before heading to the movie theater. Air-conditioning, like so much else in the Anthropocene, was a kind of background hum that reshaped my life without my ever thinking about it. But writing to you from the early hours of 2021, entering a movie theater at all feels wildly unnatural. What’s “natural” for humans is always changing.

I am immensely grateful for air-conditioning. It makes human life far better. But we need to broaden our definition of what constitutes climate control, and quickly.

I give air-conditioning three stars.


STAPHYLOCOCCUS AUREUS

YEARS AGO, I acquired an infection in my left eye socket caused by the bacterium Staphylococcus aureus. My vision clouded, and my eye swelled shut. I ended up hospitalized for over a week.

Had I experienced the same infection anytime in history before 1940, I would’ve likely lost not just my eye but my life. Then again, I probably wouldn’t have lived long enough to acquire orbital cellulitis, because I would’ve died of the staph infections I had in childhood.

When I was in the hospital, the infectious disease doctors made me feel very special. One told me, “You are colonized by some fascinatingly aggressive staph.” Only about 20 percent of humans are persistently colonized with Staphylococcus aureus—the precise reasons why are not yet clear—and I am apparently one of them. Those of us who carry the bacteria all the time are more likely to experience staph infections. After marveling at my particular staph colony, the doctor told me I wouldn’t believe the petri dishes if I saw them, and then called my continued existence a real testament to modern medicine.

Which I suppose it is. For people like myself, colonized by fascinatingly aggressive bacteria, there can be no hearkening back wistfully to past golden ages, because in all those pasts I would be thoroughly dead. In 1941, Boston City Hospital reported an 82 percent fatality rate for staph infections.

I remember as a child hearing phrases like “Only the strong survive” and “survival of the fittest” and feeling terrified, because I knew I was neither strong nor fit. I didn’t yet understand that when humanity protects the frail among us, and works to ensure their survival, the human project as a whole gets stronger.

* * *


Because staph often infects open wounds, it has been especially deadly during war. Near the beginning of World War I, the English poet Rupert Brooke famously wrote, “If I should die, think only this of me: That there’s some corner of a foreign field That is for ever England.” Brooke would indeed die in the war, in the spring of 1915—not in some corner of a foreign field, but on a hospital ship, where he was killed by a bacterial infection.

By then, there were thousands of doctors treating the war’s wounded and ill. Among them was a seventy-one-year-old Scottish surgeon, Alexander Ogston, who decades earlier had discovered and named Staphylococcus.

Ogston was a huge fan of Joseph Lister, whose observations about postsurgical infection led to the use of carbolic acid and other sterilization techniques. These drastically increased surgical survival rates. Ogston wrote to Lister in 1883, “You have changed surgery . . . from being a hazardous lottery into a safe and soundly based science,” which was only a bit of an exaggeration. Before antiseptics, Ogston wrote, “After every operation we used to await with trembling the dreaded third day, when sepsis set in.” One of Ogston’s colleagues, a nurse who worked with him at the Aberdeen Royal Infirmary, declined surgery for a strangulated hernia, choosing death, “for she had never seen a case which was operated on recover.”

* * *


After visiting Lister and observing complex knee surgeries healing without infection, Ogston returned to the hospital in Aberdeen and tore down the sign above the operating room that read, “Prepare to meet thy God.” No longer would surgery be a last-ditch, desperate effort.

Ogston was so obsessed with Lister’s carbolic acid spray that his students wrote a poem about it, which reads in part:


And we learned the thing of the future
