One of the more threatening aspects of climate change is its potential to unleash feedbacks, or situations where warming induces changes that drive even more warming. Most of those are natural, such as a warmer ocean being able to hold less carbon dioxide, resulting in even more of the greenhouse gas in the atmosphere. But at least one potential feedback has a very human element: air conditioning.
A lot of the carbon dioxide we emit comes from the production of electricity. The warming those emissions drive causes people to run air conditioning more often, which increases electricity use, which in turn drives further emissions. It's a feedback that will remain a threat until we manage to green the electrical grid.
A new report released this week looks at that feedback from the perspective of our climate goals, examining how much more often air conditioning is likely to be used in a world at our backup goal of 2° C of warming, and comparing it to one where we actually reach our primary goal of limiting warming to 1.5° C. The answer is that the extra half degree makes a big difference, but the impacts aren't evenly spread among countries.
Absolute and relative unpleasantness
There's no objective measure for when air conditioning should come on. People have different heat tolerances, and much of humanity doesn't even have access to air conditioning. But studies in this area typically use a measure called cooling degree days. These are built around an outdoor temperature at which buildings like offices or shopping centers would start running their air conditioning, often about 18° C (65° F). For each day warmer than that baseline, the cooling degree days are incremented by the number of degrees by which the baseline is exceeded.
So, if you set a baseline of 18° C and have a 25° C day, that registers as an additional seven cooling degree days. The figure thus captures not only whether cooling equipment needs to be turned on, but also how hard that equipment will have to work.
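To make the arithmetic concrete, here's a minimal sketch of that calculation in Python, assuming daily mean temperatures in °C and the 18° C baseline mentioned above (actual studies may use hourly data or slightly different conventions):

```python
BASELINE_C = 18.0

def cooling_degree_days(daily_mean_temps_c, baseline_c=BASELINE_C):
    """Sum the degrees by which each day exceeds the baseline."""
    return sum(max(t - baseline_c, 0.0) for t in daily_mean_temps_c)

# A single 25° C day adds seven cooling degree days.
print(cooling_degree_days([25.0]))              # 7.0
# Days at or below the baseline contribute nothing.
print(cooling_degree_days([16.0, 20.0, 30.0]))  # 0 + 2 + 12 = 14.0
```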
For the new work, a UK-based team ran climate models as a distributed computing project via climateprediction.net. For each of the two conditions (+1.5° and +2.0° C), 700 runs were performed, and the typical temperatures in each grid cell of a map of the world were calculated. Those results, in turn, were used to calculate cooling degree days for each of the two scenarios.
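I don't have the study's actual pipeline, but a gridded version of that calculation could look roughly like the sketch below, assuming an ensemble of daily temperature fields stored as a NumPy array with dimensions (runs, days, latitude, longitude); all names and shapes here are illustrative assumptions, not the paper's code:

```python
import numpy as np

def gridded_cdd(daily_temps_c, baseline_c=18.0):
    """Ensemble-mean cooling degree days per grid cell.

    daily_temps_c: hypothetical array of shape (runs, days, lat, lon) in °C.
    """
    exceedance = np.clip(daily_temps_c - baseline_c, 0.0, None)
    per_run_cdd = exceedance.sum(axis=1)   # total CDDs per run and grid cell
    return per_run_cdd.mean(axis=0)        # average over the ensemble

# One call per scenario, each with its own 700-member ensemble, e.g.:
# cdd_1p5 = gridded_cdd(temps_plus_1p5)
# cdd_2p0 = gridded_cdd(temps_plus_2p0)
```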
The researchers then made two comparisons. One was the absolute difference between the 1.5° and 2.0° worlds: the total number of extra cooling degree days produced by the additional warming. You can think of this as a misery index, registering just how much worse things will be in a given location.
The second measure can be considered a preparedness index, as captured by the magnitude of the relative change. Places that rarely experience cooling degree days in a 1.5° world might see a large relative difference if they suddenly rack up dozens with the additional heat. And, since those degree days were so rare to begin with, those countries probably lack the air conditioning equipment (and the grid capacity to power it) to cope with the unprecedented burden.
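Under the same illustrative assumptions, the two comparisons amount to an absolute difference and a relative one; the small floor on the denominator below is my own guard for places with almost no cooling degree days in the 1.5° world, not something from the report:

```python
import numpy as np

def compare_scenarios(cdd_1p5, cdd_2p0, floor=1.0):
    """Return the 'misery'-style and 'preparedness'-style indices per grid cell."""
    absolute_increase = cdd_2p0 - cdd_1p5          # extra cooling degree days
    relative_change = absolute_increase / np.maximum(cdd_1p5, floor)
    return absolute_increase, relative_change
```

A location that goes from a handful of cooling degree days to dozens will show a small absolute increase but a large relative one, which is exactly the pattern the preparedness index is meant to flag.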