Their model uses temperature and precipitation to predict wildfire probability
A team of researchers from the University of Missouri and the U.S. Forest Service is continuing an effort to research how climate influences wildfire frequency. The model focuses on two variables, temperature and precipitation, to understand how climate drives wildfire across the world.
After acquiring historical fire occurrence data from tree-ring and other studies, they developed a mathematical model using temperature and precipitation as the two variables. In validation runs, the model’s predictions were close to actual fire patterns. As they collected additional historical data from locations around the world over the last several years, they refined the model, making it more accurate.
“You can see patterns in global wildfire frequency that are obviously predictable,” Michael Stambaugh, an associate research professor in forestry, said. “For example, Greenland doesn’t burn. It’s too icy and wet. It’s on one end of the spectrum. The other end of the spectrum is a place like the Sahara Desert, which doesn’t burn either. It’s too dry and there’s not enough fuel. Between those two extremes, we were confident that there was a way to describe the transition.”
The work is being done by Richard Guyette, Michael Stambaugh, Daniel Dey, and Rose-Marie Muzika, who developed what they call the “Physical Chemical Fire Frequency Model (PC2FM)”.
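The published PC2FM equation is not reproduced in this article, so the following is only a hypothetical sketch of the idea it describes: a reaction-rate (Arrhenius-style) temperature term multiplied by a precipitation term that peaks at intermediate rainfall, matching the Greenland (too wet) and Sahara (too dry) extremes in the quote above. Every coefficient here is invented for illustration.

```python
import math

def sketch_fire_frequency(mean_temp_c, annual_precip_mm,
                          a=1e4, e_act=35.0, r_gas=8.314e-3, p_opt=900.0):
    """Hypothetical two-variable fire-frequency sketch (NOT the published
    PC2FM equation). Combines an Arrhenius-style temperature term with a
    unimodal precipitation term; all coefficients are made up."""
    temp_k = mean_temp_c + 273.15
    # Reaction-rate term: warmer climates "react" (burn) faster.
    rate = a * math.exp(-e_act / (r_gas * temp_k))
    # Unimodal moisture term: too dry means no fuel, too wet means no fire.
    moisture = math.exp(-((annual_precip_mm - p_opt) / 600.0) ** 2)
    return rate * moisture  # relative frequency, unitless

# The two extremes from the quote give near-zero values;
# a temperate mid-range climate gives the highest:
print(sketch_fire_frequency(-5.0, 2000.0))   # cold and wet
print(sketch_fire_frequency(28.0, 50.0))     # hot but fuel-limited
print(sketch_fire_frequency(15.0, 900.0))    # temperate mid-range
```

The point of the multiplicative form is that either extreme drives the product toward zero, which is the transition between "too wet" and "too dry" that the researchers set out to describe.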
Pollution from fossil fuel-burning sources is decreasing, but the air is getting dirtier during the wildland fire season.
As a long and brutal fire season in California starts to wind down, Climate Central issued a report that lays out links between climate change, wildfire, and health effects. The report, titled Western Wildfires Undermining Progress on Air Pollution, analyzes air quality trends from 2000 through 2016 in two large California air basins — the Sacramento Valley and the San Joaquin Valley — that are heavily affected by pollution.
The report finds that while the air quality continues to improve as pollution from power plants, trucks and other fossil fuel-burning sources declines, it is getting dirtier during the fire season. Studies have shown that fire seasons in the West are getting longer and that more large wildfires are breaking out as temperatures rise.
“We focused on an especially bad actor called ‘fine particulate matter,’ or PM2.5 — particles that can reach deep into the lungs and exacerbate a wide array of health problems such as asthma, heart disease, and premature birth. A lot of hard work has been done to decrease PM2.5 from other sources, so it’s troubling to see progress getting undercut by wildfires — and to know that a warming climate will likely have wildfires becoming more frequent and burning more area in California and the West,” said Todd Sanford, Ph.D., a scientist with Climate Central.
Changes in human uses of the land have had a large impact on fire activity in California’s Sierra Nevada since 1600, according to research by a University of Arizona researcher and her colleagues.
Above: Indian Canyon Fire near Edgemont, SD, 2016. Photo by Bill Gabbert.
By Mari N. Jensen, University of Arizona College of Science
Forest fire activity in California’s Sierra Nevada since 1600 has been influenced more by how humans used the land than by climate, according to new research led by University of Arizona and Penn State scientists.
For the years 1600 to 2015, the team found four periods, each lasting at least 55 years, in which the frequency and extent of forest fires clearly differed from the time period before or after.
However, the shifts from one fire regime to another did not correspond to changes in temperature or moisture or other climate patterns until temperatures started rising in the 1980s.
“We were expecting to find climatic drivers,” said lead co-author Valerie Trouet, a UA associate professor of dendrochronology. “We didn’t find them.”
Instead, the team found the fire regimes corresponded to different types of human occupation and use of the land: the pre-settlement period to the Spanish colonial period; the colonial period to the California Gold Rush; the Gold Rush to the Smokey Bear/fire suppression period; and the Smokey Bear/fire suppression era to present.
“The fire regime shifts we see are linked to the land-use changes that took place at the same time,” Trouet said.
“We knew about the Smokey Bear effect — there had been a dramatic shift in the fire regime all over the Western U.S. with fire suppression. We didn’t know about these other earlier regimes,” she said. “It turns out humans — through land-use change — have been influencing and modulating fire for much longer than we anticipated.”
2014 was a busy year in California for wildland firefighters. Battles were fought over 555,044 acres of blackened ground in the state, the eighth largest number of acres burned in the last 28 years. So far in 2015, fires have covered 838,465 acres in California, the fifth highest total in 28 years. (Stats from Cal FIRE and the NIFC National Situation Report.)
I’m not a meteorologist or climate scientist, but some who are took a stab at investigating the possible attribution of extreme weather-related events in 2014 to global climate change. In their report, Explaining Extreme Events of 2014 from a Climate Perspective, 33 different research groups explored the causes of 29 different events that occurred that year.
The first event in the report is titled, Extreme Fire Season in California: A Glimpse Into the Future. It is debatable whether the 2014 fire season in California was “extreme”, since, as we wrote earlier, it had the eighth largest number of acres burned in the last 28 years according to data from the land management agencies. The authors, Jin-Ho Yoon, S.-Y. Simon Wang, Robert R. Gillies, Lawrence Hipps, Ben Kravitz, and Philip J. Rasch, reported “thousands more fires than the five-year average” between January 1 and September 20.
We don’t put very much stock in numbers of fires, since a small spot that can be stomped out by a couple of firefighters counts just as much as a 300,000-acre conflagration. Total burned acres is much more meaningful. The area burned data that the scientists studied was derived from satellite observations, which can underestimate wildfire extent due to limits on the minimum detectable burned area, the timing of satellite overflights, light fuels cooling before being detected, and obscuration by cloud cover.
The report also examined the Keetch-Byram Drought Index (KBDI), and determined that “in terms of the KBDI and the extreme fire risk, 2014 ranks first in the entire state”, but it was not clear what time period they were referring to (it may have been since 1979).
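For readers unfamiliar with the index: the KBDI tracks a running soil-moisture deficit on a 0–800 scale (hundredths of an inch against an assumed 8 inches of available moisture), increased daily by temperature-driven drying and reduced by rain. A simplified daily update following the standard Keetch-Byram formulation looks like this (the station values in the example are invented):

```python
import math

def kbdi_update(kbdi, max_temp_f, rain_in, annual_rain_in, cum_rain_in):
    """One daily Keetch-Byram Drought Index step (simplified).
    kbdi: yesterday's index, 0 (saturated) to 800 (extreme deficit).
    cum_rain_in: rain accumulated so far in the current wet spell.
    Returns (new_kbdi, new_cum_rain_in)."""
    if rain_in > 0:
        # Rain only counts after the first 0.20 in of a wet spell,
        # which is assumed intercepted by canopy and litter.
        prior = cum_rain_in
        cum_rain_in += rain_in
        net = max(0.0, cum_rain_in - 0.20) - max(0.0, prior - 0.20)
        kbdi = max(0.0, kbdi - 100.0 * net)
    else:
        cum_rain_in = 0.0  # a dry day ends the wet spell
    # Drought factor: evapotranspiration driven by max temperature (F),
    # scaled by mean annual rainfall (wetter climates dry out more slowly).
    dq = (800.0 - kbdi) * (0.968 * math.exp(0.0486 * max_temp_f) - 8.30) \
         / (1.0 + 10.88 * math.exp(-0.0441 * annual_rain_in)) * 1e-3
    return min(800.0, kbdi + max(0.0, dq)), cum_rain_in

# Invented example: a 100 F dry day in a 20 in/yr climate deepens the deficit.
print(kbdi_update(400.0, 100.0, 0.0, 20.0, 0.0))
```

Because the drying term scales with (800 − KBDI), the index climbs fastest when soils are wet and saturates as it approaches 800, which is why a run of hot, rainless days like California's 2014 season pushes stations toward the top of the scale.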
The authors stop short of attributing the “extreme” 2014 fire season in California to global climate change:
Our result, based on the CESM1 outputs, indicates that man-made global warming is likely one of the causes that will exacerbate the areal extent and frequency of extreme fire risk, though the influence of internal climate variability on the 2014 and the future fire season is difficult to ascertain.
The three-year drought in the western United States and especially in California became more obvious this year as wildfires were influenced by low moisture in live vegetation, and in some areas once-healthy trees began to show drought-induced stress.
The map above illustrates how much precipitation is needed over a three-month period to end or ameliorate the current drought. Most of northern California will need from 6 to 12 inches according to NOAA.
This [map] only tells you how much precipitation a location needs to get the Palmer Hydrological Drought Index (PHDI) to a certain value based on the model’s equations. It does not tell you how much precipitation is needed to refill a reservoir, restore groundwater to normal, or bring an ecosystem back to normality. It also does not incorporate snowpack into its calculations, and mountain snowpack is a crucial part of hydrology in the U.S. West.
The question of IF there will be a strong El Niño weather pattern in the contiguous United States this winter is now settled. NOAA reports that there is a 95 percent probability that El Niño will continue through the 2015-2016 Northern Hemisphere winter. In an indication of the strength to expect, the June-August average of sea surface temperatures in the Niño3.4 region was 1.22° C above normal. This is the third-highest June-August value since records started in 1950.
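For context on those numbers: NOAA's Oceanic Niño Index (ONI) classifies ENSO state from 3-month running means of Niño3.4 sea surface temperature anomalies, with ±0.5 °C as the El Niño/La Niña threshold; the strength bins below (weak/moderate/strong) are the commonly used informal ones, and a real declaration also requires five consecutive overlapping seasons at or beyond the threshold, which this single-value sketch ignores.

```python
def classify_enso(nino34_anomaly_c):
    """Classify a single 3-month-mean Nino3.4 SST anomaly (deg C) using
    NOAA's +/-0.5 C threshold and the commonly used informal strength bins."""
    a = nino34_anomaly_c
    if a >= 0.5:
        if a >= 1.5:
            return "El Nino (strong)"
        if a >= 1.0:
            return "El Nino (moderate)"
        return "El Nino (weak)"
    if a <= -0.5:
        return "La Nina"
    return "neutral"

# The June-August 2015 value cited above:
print(classify_enso(1.22))  # -> "El Nino (moderate)"
```

The 1.22 °C June-August value already sits in the moderate band, and NOAA's forecast of continued strengthening through the 2015-2016 winter is why a strong event was expected.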
El Niño isn’t a storm that will hit a specific area at a specific time. Instead, the warmer tropical Pacific waters cause changes to the global atmospheric circulation, resulting in a wide range of changes to global weather.
The map above is a composite of how precipitation varied from average during the strong El Niños of 1957–1958, 1965–1966, 1972–1973, 1982–1983, 1991–1992, and 1997–1998. There is a large variability in those six events that makes it difficult to predict the effects at any specific location. The map below is a composite of temperatures for the same periods. Again there is much variability, and you will notice that it is very different from the actual forecast, farther down, for this winter.
The impacts of El Niño are typically largest in the U.S. during the cool months from October through May. During the winter season, the southern half of the country — from California to the Southern Plains, as well as along the East Coast — typically receives above-average precipitation. Below-average temperatures also often accompany this above-average precipitation in these regions. Across the northern half of the country, the winter season tends to be warmer and drier than average, particularly in the Northwest, Northern Plains, and Ohio Valley.
Below are NOAA’s outlooks for temperature and precipitation for December 2015 through February 2016.