In light of the recent and ongoing damage caused by a wildfire in and around Fort McMurray, Alberta, experts and laypersons alike are wondering about the potential role played by climate change in the onset and growth of this as well as other wildfires to come.
And while many models predict that the warm, dry weather associated with climate change will lead to more frequent and more costly wildfires, these models have largely ignored the human factor.
That, at least, is the argument of a research team led by Dr. Michael L. Mann of the Department of Geography at the George Washington University in Washington, DC. Their recent study contends that in building predictive models for wildfires in the state of California, scientists have consistently neglected to incorporate variables such as housing density, proximity of development to forestland, water usage and public land management practices, all of which, the researchers say, are crucial to clear and accurate modeling of wildfires.
“Although a number of statistical and process-based wildfire models exist for California, there is enormous uncertainty about the location and number of future fires. Previous findings of substantially increased numbers of fires and burned area for California may be tied to omitted variable bias from the exclusion of human influences,” say the study’s authors.
Wildfires are a huge problem in California, the most populous state in the U.S. The annual cost of fires within the State Responsibility Areas – those lands where the state government is financially responsible for fire prevention and suppression – is estimated at $160.3 million (USD), and a total of $5.18 billion was spent on wildfire suppression between 1999 and 2011. Moreover, a whopping 95 per cent of fires in California are human-caused.
The researchers argue that the building of new housing in or near natural vegetation plays a key role. These areas, termed wildland-urban interface (WUI) areas, account for a majority of new housing in some parts of the state, such as San Diego County, where three of every four homes have been built in WUI areas.
Having included these human-centred factors on WUI housing development and density in their model, the researchers were able to go back over the past 40 years of data and “predict” the occurrence of past fires much more accurately than when the human-centred factors were excluded. “Our model is able to successfully replicate the distribution of observed fires as well as some of their spatiotemporal dynamics across California,” say the study’s authors.
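The statistical pitfall the authors describe, omitted variable bias, can be illustrated with a toy regression. The sketch below uses entirely made-up numbers (not the study's data or methods): wildfire activity is generated from both a climate signal and a housing-density signal, and because development correlates with climate, a climate-only regression inflates the apparent climate effect.

```python
# Toy illustration of omitted variable bias, with hypothetical coefficients.
# True model: fires = 1.0*climate + 2.0*housing + noise, and housing
# partly tracks climate, so a climate-only fit overstates climate's role.
import random

random.seed(0)
n = 10_000
climate = [random.gauss(0, 1) for _ in range(n)]
# Housing density correlates with climate (e.g. development in warm areas).
housing = [0.5 * c + random.gauss(0, 1) for c in climate]
fires = [1.0 * c + 2.0 * h + random.gauss(0, 0.1)
         for c, h in zip(climate, housing)]

def ols_slope(x, y):
    """Least-squares slope of y on a single predictor x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

# Climate-only model: the slope absorbs housing's effect (~1.0 + 2.0*0.5 = 2.0).
biased = ols_slope(climate, fires)

# Control for housing (Frisch-Waugh step): residualize climate against
# housing, then regress fires on the residuals to recover climate's own effect.
b_ch = ols_slope(housing, climate)
resid_climate = [c - b_ch * h for c, h in zip(climate, housing)]
unbiased = ols_slope(resid_climate, fires)

print(f"climate-only slope:      {biased:.2f}")   # ~2.0, overstated
print(f"housing-controlled slope: {unbiased:.2f}")  # ~1.0, the true effect
```

In the same way, the study suggests, fire models that leave out human development let the climate variables soak up variation that actually belongs to settlement patterns.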
Importantly, the researchers argue that without properly including the dynamics of human settlement, wildfire modeling is likely to consistently overemphasize the role played by climate change. “Our findings evidence that the failure to include anthropogenic effects in future fire estimates may be significantly overstating the response of wildfire to climatic change alone,” say the study’s authors.
So what does the new model predict for California over the next 30 to 40 years? It’s a mixed bag. Climate change, human settlement dynamics and topographical variation across the state point to likely declines in future wildfires for some areas, such as the central and north coastal regions, while in others, particularly high-elevation regions such as the Sierra Nevada, increases in the frequency of wildfires are expected.