A major new study, “Impacts of climate change from 2000 to 2050 on wildfire activity and carbonaceous aerosol concentrations in the western United States” finds a staggering increase in “wildfire activity and carbonaceous aerosol concentrations in the western United States” by mid-century under a moderate warming scenario:

We show that increases in temperature cause annual mean area burned in the western United States to increase by 54% by the 2050s relative to the present-day … with the forests of the Pacific Northwest and Rocky Mountains experiencing the greatest increases of 78% and 175% respectively. Increased area burned results in near doubling of wildfire carbonaceous aerosol emissions by mid-century.


This graph shows the percentage increase in area burned by wildfires from the present day to the 2050s, as calculated by the model of Spracklen et al. [2009] for the May-October fire season. The model follows a scenario of moderately increasing greenhouse gas emissions, which leads to average global warming of 1.6 degrees Celsius (3 degrees Fahrenheit) by 2050. Warmer temperatures can dry out underbrush, leading to more serious conflagrations in the future climate.


And this is just the mid-century prediction for the IPCC’s “moderate” A1B scenario (CO2 at 522 ppm in 2050), which predicts “mean July temperatures to increase by 1.8°C from 2000 to 2050.” This is not the worst-case emissions path, which we are currently on (see U.S. media largely ignores latest warning from climate scientists: “Recent observations confirm … the worst-case IPCC scenario trajectories (or even worse) are being realised” – 1000 ppm). What would happen by 2100 on our current emissions path, when the mean July temperature increase from 2000 is triple (or more) the 1.8°C that the researchers modeled? Turns out someone did model that a few years ago.

Back in 2004, researchers at the U.S. Forest Service's Pacific Wildland Fire Lab looked at past fires in the West to create a statistical model of how future climate change may affect wildfires. Their paper, “Climatic Change, Wildfire, and Conservation,” published in Conservation Biology, found that by century’s end, states like Montana, New Mexico, Washington, Utah, and Wyoming could see burn areas increase fivefold.

For completeness’ sake – and because I remain optimistic that someday the media will routinely make the connection between increased forest fires and global warming – let me note that back in 2006 Science magazine published a major article analyzing whether the recent soaring wildfire trend was due to a change in forest management practices or to climate change. The study, led by the Scripps Institution of Oceanography, concluded:

Robust statistical associations between wildfire and hydroclimate in western forests indicate that increased wildfire activity over recent decades reflects sub-regional responses to changes in climate. Historical wildfire observations exhibit an abrupt transition in the mid-1980s from a regime of infrequent large wildfires of short (average of 1 week) duration to one with much more frequent and longer burning (5 weeks) fires. This transition was marked by a shift toward unusually warm springs, longer summer dry seasons, drier vegetation (which provoked more and longer burning large wildfires), and longer fire seasons. Reduced winter precipitation and an early spring snowmelt played a role in this shift.


That 2006 study noted that global warming (from human-caused emissions of greenhouse gases such as carbon dioxide) will further accelerate all of these trends during this century. Worse still, the increased wildfires will themselves release huge amounts of carbon dioxide, creating a vicious circle that accelerates the very global warming that is helping to cause more wildfires.

For more on the new study, see here.
