Humans caused more than 1.2 million of the 1.5 million blazes recorded from 1992 to 2012

From the Smithsonian Magazine

According to a press release, researchers from the University of Colorado, Boulder’s Earth Lab took a deep dive into the U.S. Forest Service’s Fire Program Analysis-Fire Occurrence Database, analyzing all wildfires recorded between 1992 and 2012. The researchers found that humans caused more than 1.2 million of the 1.5 million blazes in the database.

The cost of those human-induced fires is staggering. The researchers estimate that human-caused fires have more than tripled the average fire season over the past 21 years, from 46 days to 154 days. Fighting the fires now costs over $2 billion per year, a figure that does not include damage to recreational lands or the local economic impacts fires can have.

Wild Fire

Wildfires USA – 80% Less Than The 1930s

Bjorn Lomborg has been trying to quell the hysteria about forest fires in the USA.


As Lomborg writes on Facebook:

Some people have pointed out that the National Interagency Fire Center writes that “Prior to 1983, sources of these figures are not known, or cannot be confirmed, and were not derived from the current situation reporting process. As a result the figures prior to 1983 should not be compared to later data.”

This is convenient, since the NIFC for the longest time didn’t even want to acknowledge that there were data before 1960. I’ve consistently pointed out that we had early data and where the data starting in 1926 comes from: it is the Wildfire Statistics from the USDA, summarized in the official Historical Statistics of the United States – Colonial Times to 1970, p. 537.

So, we all know, very well, where this data is from.

Interestingly, what the NIFC forgets to tell us is that the earlier data was based on reporting from *much* less land – about 200m ha of the 700m ha of burnable land. So, if anything, it is reasonable to argue that the early estimates should be multiplied by 3.5 (that is, divided by 2/7), which is indeed what one article did: “Littel et al. (2009) recognised the reporting bias and, as part of their analyses of fire–climate relationships in the western US, multiplied the USFS-reported WFAB estimates ‘by the ratio of the total area protected in 2003 to the area protected in a given year’.”
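The coverage correction described above is simple proportional scaling. A minimal sketch (the 200m/700m ha figures are the rough ones quoted above, and the function name and the 10m-ha sample report are my own illustration, not from Littell et al.):

```python
def corrected_burned_area(reported_area, area_covered, area_burnable):
    """Scale a reported burned-area figure up by the inverse of the
    coverage ratio, in the spirit of the adjustment quoted above."""
    coverage = area_covered / area_burnable  # fraction of burnable land actually reporting
    return reported_area / coverage

# Rough figures from the text: reporting covered ~200m of ~700m ha,
# so the correction factor is 700/200 = 3.5.
# A hypothetical 10m-ha report would scale to roughly 35m ha.
print(corrected_burned_area(10.0, 200, 700))
```

This is the same arithmetic as multiplying by 3.5; expressing it as (area covered / area burnable) makes the year-by-year version explicit, since the covered area grew over time.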

There are other legitimate concerns, such as the inclusion of intentional burning in the early years, which may have added millions of acres to the numbers in the early part of the century (4–10% too much). But still, this does not in any way jeopardize the general trend of the data. That is, of course, why this data has been used in many academic publications, including Houghton, R. A. (2000). “Changes in terrestrial carbon storage in the United States. 2: The role of fire and fire management”. Global Ecology and Biogeography, 9(2), 145.

If anything, the graph that I’m showing is likely *underestimating* the amount of burning in the early part of last century.

One way to see this is to compare the graph to the US estimate of forest-fire burning in the global carbon budget from “Fire history and the global carbon budget”. They estimate the burnt area in the Eastern and Western US (here added together) by decade from 1900 to 2000. It is very clear that not only is the graph broadly right, but early fire, for which we have no or very spotty data, is likely to have been even greater compared to the present.

The likely 2018 burnt area will be about 9% of the annual burnt area in the decade 1900–1910.