Last week Secretary of Agriculture Tom Vilsack reported that 13 wildland firefighters lost their lives in the line of duty in 2015. That was an increase from 2014, when there were 10 fatalities, and about a third of the 34 killed in 2013, the year that included the deaths of 19 members of the Granite Mountain Hotshots near Yarnell, Arizona.
The National Interagency Fire Center has statistics about line of duty deaths going back to 1910. During that time, according to their numbers, 1,099 firefighters died.
Looking at the 105 years of NIFC data, there appears to be an increasing trend. The figures below are the average number of fatalities each year for the indicated time periods:
One likely explanation for the apparent increase is that 80 to 105 years ago not all fatalities were necessarily reported or recorded in a centralized database, especially those that occurred on state or locally protected lands. Even if we look only at the figures since 1960, as in the chart above, there is still a steep increase over those 55 years.
It is possible that in the last 25 years the reporting of fatalities and the collection of the data have been more consistent and complete. The chart below covers that period, from 1990 through 2015, and shows a slight downward trend, which would be even more obvious if not for the 19 crew members who died in 2013 on the Yarnell Hill Fire.
I can’t prove that there was under-reporting of wildland firefighter fatalities during most of the 20th century, but if a firefighter was killed on a vegetation fire in Missouri in 1921, I can see how that statistic may not have made it into the database that is now maintained at NIFC.
So what does all this mean? Individuals can look at the same batch of statistics and develop vastly different interpretations. However, it would not be prudent to assume that the fatality rate almost tripled from the first part of the 105-year period to the last 25 years. There are several ways to analyze data like this. The least complex is to look at the trend of the raw numbers of fatalities year to year. A more complex and meaningful method would be to determine the fatality RATE. For example, the fatalities per million hours spent traveling to and working on fires. That would be impossible to ferret out during most of the last 105 years. But the firefighting agencies should be able to find a way to begin collecting this information, if they don’t have it already.
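As a sketch of what such a rate could look like, here is a minimal example. The exposure-hour figure is hypothetical, since, as noted above, this data was never collected for most of the last 105 years:

```python
# Hypothetical illustration of a fatality RATE rather than a raw count.
# The exposure-hour figure below is invented for demonstration purposes;
# real exposure data has not been collected by the agencies.

def fatality_rate_per_million_hours(fatalities, exposure_hours):
    """Fatalities per million hours spent traveling to and working on fires."""
    return fatalities / exposure_hours * 1_000_000

# Invented example: 13 fatalities against 20 million exposure hours.
rate = fatality_rate_per_million_hours(13, 20_000_000)
print(f"{rate:.2f} fatalities per million exposure hours")
```

A rate like this would let a year with heavy fire activity be compared fairly against a quiet one, which a raw body count cannot do.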
If the fatality and serious injury rates were calculated over a multi-year period, it should illustrate the effectiveness of a risk management program. Otherwise, the simple number of deaths each year might be affected to an unknown degree by the number of acres burned. Other factors could also affect the numbers, such as fire intensity influenced by fuel treatment programs, fire history, drought, climate change, or arson.
Should firefighting agencies have specific goals about serious injuries and fatalities? Is there an acceptable number? Is 5 a year too many? Is 15 too many? Is it stupid to have a goal of zero fatalities — or any number?
The chart below superimposes the number of fatalities over the acres burned in the United States from 1990 through 2015, but it does not include Alaska since many fires there are not suppressed, or they are only suppressed in areas where they threaten structures or people. In 2015 more acres burned in Alaska than all of the other states combined.
UPDATED January 17, 2016
One of our loyal readers, Bean, has been thinking about this issue and figured that since some measure of firefighters’ exposure to risk is needed in order to calculate trends, perhaps parameters other than acres burned could be correlated with the number of fatalities. Data that is publicly available as far back as 1990 or 1994 includes mobilizations of incident management teams, crews, overhead, helicopters, air tankers, air attack ships, infrared aircraft, MAFFS air tankers, caterers, military firefighters, and shower units. I considered all of those and concluded that the number of crews mobilized would come the closest to serving as a proxy for how many hours all firefighters spent traveling to and working on fires.
Data for crew mobilizations is available from 1990 through 2014. For each year I divided the number of fatalities by the number of crews mobilized and called this the Fatality/Crews Mobilized Index.
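A minimal sketch of the index, computed as fatalities divided by crews mobilized (consistent with the mean of 0.026 reported below). The fatality counts for 1997 and 2009 come from this article; the crew mobilization counts are invented placeholders, not the actual dispatch data:

```python
# Sketch of the Fatality/Crews Mobilized Index: fatalities divided by
# crews mobilized, year by year. Fatality counts for 1997 and 2009 are
# from the article; the crew counts are hypothetical placeholders.
fatalities = {1997: 10, 2009: 15}
crews_mobilized = {1997: 350, 2009: 500}   # hypothetical values

index = {year: fatalities[year] / crews_mobilized[year] for year in fatalities}
for year, value in sorted(index.items()):
    print(f"{year}: {value:.4f}")
```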
Like the earlier chart comparing fatalities to acres burned, this analysis also shows a decreasing trend over the last 25 years. In a comment posted January 17, Kevin9 said the earlier acres/fatalities analysis is “spiky”. This newer crews mobilized/fatalities data also has spikes (especially in 1997 and 2009), but not quite to the degree the earlier chart had. During the 25-year period, 1997 had the fewest acres burned and crews mobilized, but still had 10 fatalities. The second lowest number of crews mobilized occurred in 2009, a year with 15 fatalities.
As an experiment, knowing that there were mass casualty events in 1994 and 2013 (14 and 19 fatalities, respectively), I replaced the totals for those two years with the 25-year average of 17, just to see what the effect would be. There was no major change in the trend line, except that it was a little lower across the entire range.
It’s been a long time since I took statistics courses, but here’s what I came up with when analyzing the Fatality/Crews Mobilized Index data:
- Standard deviation: 0.019
- Mean: 0.026
- Coefficient of variation: 0.770
- Variance: 0.00037
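For readers who want to reproduce that kind of summary, here is a short sketch using Python’s statistics module. The index values below are invented placeholders, not the actual 1990–2014 series:

```python
from statistics import mean, pstdev

# Summary statistics for an index series. The values below are invented
# placeholders, not the actual Fatality/Crews Mobilized Index data.
index = [0.021, 0.013, 0.048, 0.019, 0.030, 0.025, 0.011, 0.041]

m = mean(index)
sd = pstdev(index)   # population standard deviation
cv = sd / m          # coefficient of variation: SD relative to the mean
var = sd ** 2        # variance is the standard deviation squared

print(f"mean={m:.3f} sd={sd:.3f} cv={cv:.3f} variance={var:.5f}")
```

As the code shows, the four figures are linked: the coefficient of variation is the standard deviation divided by the mean, and the variance is the standard deviation squared.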