What factor is the source of the most uncertainty in climate projections?
Scientists use models—calculations typically run on multiple powerful computers—to project how global warming pollution in the atmosphere will affect future average temperatures, precipitation, and other aspects of our climate. The formulas used to project climate change differ among models, but the most significant variable is always how much energy will be used over the course of this century (based on choices made by governments, businesses, and individual citizens).
To account for this uncertainty, climate models employ different “scenarios” to approximate the impact that different degrees of energy use will have on carbon dioxide and other heat-trapping emissions over time—which, in turn, yield different degrees of climate change. The Intergovernmental Panel on Climate Change, for example, used a set of six scenarios for its most recent climate assessment, ranging from low emissions (the “B1” scenario) to high emissions (“A1FI,” in which FI represents fossil-fuel-intensive energy use).
By the end of this century, under the B1 scenario, temperatures are projected to rise between 2.7 and 5.2 degrees Fahrenheit (°F) over the 1980–2000 average; under the A1FI scenario, the projected rise increases to between 6.1°F and 11.0°F. The difference between the average end-of-century temperature rises for these two scenarios, therefore, is substantial—4.6°F. This underscores the need to make energy choices today that will set us on a lower-emissions path and avoid the most dangerous consequences of global warming.
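The 4.6°F figure is simply the difference between the midpoints of the two projected ranges quoted above. A quick check of that arithmetic (a minimal sketch, not part of the original source):

```python
# Midpoint of each scenario's projected temperature rise
# (°F over the 1980-2000 average), from the ranges quoted above.
b1_mid = (2.7 + 5.2) / 2     # low-emissions B1 scenario -> 3.95 °F
a1fi_mid = (6.1 + 11.0) / 2  # high-emissions A1FI scenario -> 8.55 °F

gap = round(a1fi_mid - b1_mid, 1)
print(gap)  # 4.6 (°F separating the two scenario averages)
```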
I had the opportunity to attend the Intergovernmental Panel on Climate Change (IPCC) presentation at the 2008 ESRI International User Conference in San Diego. It was a great presentation, and you can see the PowerPoint slides here.