This model may help predict wildfires

Researchers at Stanford are developing an AI model to help map the fuel wildfires need.

The raging conflagrations that have recently ravaged Australia and the West Coast are fueling concerns that climate change is leading to longer, more intense wildfire seasons. The race is on to find new and advanced methods to hold the firelines.

Now, Stanford News reports, researchers at the university are using deep-learning AI to map where land may be in danger of burning out of control. The model, based on satellite data, shows how dry plants and soil are in minute detail, from the Rockies to the Pacific. This kind of information could be used by landowners and government agencies to predict where dry fuel is accumulating — and maybe even preempt wildfires.

While it needs more testing before it can be trusted with property and lives, the algorithm shows promise, especially in predicting the moisture of shrublands — one of the most fire-prone types of environments.

Finding the Fuel

Wildfires use dry plant material as fuel. Knowing where, and how much, of this fuel is available can help predict where a fire may begin and the path it may follow. Gathering this data fast enough for an effective response is difficult, however, an issue AI is increasingly tackling in other emergencies such as floods, evacuations, and search and rescue at sea.

The U.S. Forest Service measures the amount of dried-out plant life on the ground using a painstaking process that can sample only a small subset of vegetation.

According to Stanford, the Forest Service cuts off tree limbs, weighs them, dries them out, then weighs them again; the difference between the two is the amount of water within the tree.

“That’s obviously really laborious, and you can only do that in a couple of different places, for only some of the species in a landscape,” Alexandra Konings, senior author of the study and Stanford ecohydrologist, told Stanford News.

All this data is then entered into the National Fuel Moisture Database, which has been tracking vegetation dryness since the 1970s. The measurement, called "live fuel moisture content," is a well-documented factor in wildfire risk, and scientists use it to make informed predictions of where fuel is driest and risk is highest.
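The weigh-dry-weigh procedure reduces to simple arithmetic. By the usual convention, live fuel moisture content is expressed as water mass relative to dry mass; the function name and sample numbers below are illustrative, not from the study:

```python
def live_fuel_moisture(wet_mass_g, dry_mass_g):
    """Live fuel moisture content, as a percent of dry mass.

    The water lost in oven-drying (wet minus dry) is divided by
    the dry mass, following the standard LFMC convention.
    """
    return 100.0 * (wet_mass_g - dry_mass_g) / dry_mass_g

# A hypothetical branch weighing 150 g fresh and 100 g after
# drying held 50 g of water: 50% fuel moisture.
print(live_fuel_moisture(150.0, 100.0))  # 50.0
```

Values well above 100% are common for healthy vegetation, since living tissue can hold more water than its own dry mass.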

Deep-Learning vs. The West Burning

The Stanford model uses a type of artificial intelligence system known as a recurrent neural network, which excels at teasing patterns out of long sequences of data.

The work, described in the August 2020 issue of Remote Sensing of Environment, used data from the National Fuel Moisture Database to train the model. The researchers then turned it loose to predict fuel moisture levels from two types of measurements taken from space.

One, called synthetic aperture radar (SAR), uses microwave radar signals. These waves can make it past the outer leaves and deeper into the vegetation, helping to form a more accurate picture of how much water the plants hold.

“One of our big breakthroughs was to look at a newer set of satellites that are using much longer wavelengths, which allows the observations to be sensitive to water much deeper into the forest canopy and be directly representative of the fuel moisture content,” Konings said.
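The overall idea of the recurrent approach can be sketched very simply: the network folds a time series of satellite observations into a running hidden state, then maps that state to a moisture estimate. The toy single-unit cell below uses made-up weights purely for illustration; a real model learns its weights from the National Fuel Moisture Database ground measurements:

```python
import math

def rnn_predict(sequence, w_x=0.8, w_h=0.5, b=0.0, w_out=100.0):
    """Toy one-unit recurrent network.

    Each step mixes the newest satellite observation (x) with the
    hidden state carried over from previous time steps (h), so the
    prediction reflects the recent history of the landscape, not
    just a single snapshot. All weights here are invented for
    illustration only.
    """
    h = 0.0
    for x in sequence:
        h = math.tanh(w_x * x + w_h * h + b)  # recurrent update
    return w_out * h  # linear readout to a moisture percentage

# A hypothetical drying trend in a normalized satellite signal:
observations = [0.9, 0.7, 0.5, 0.3]
print(rnn_predict(observations))
```

In practice the inputs would be multi-band radar and optical measurements per pixel, and the network would have many units and learned weights, but the sequence-folding structure is the same.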

To validate their work, they tested it against the National Fuel Moisture Database for hundreds of sites, and checked its predictions across various environments. The model was most accurate when predicting the fuel moisture level in shrubland.

Filled with shrubs, grasses, and small trees on often rocky slopes, shrubland covers a little under half of the American West and is highly susceptible to wildfire.

Those estimates were fed into an interactive map showing the data from 2016 to 2019.

“Creating these maps was the first step in understanding how this new fuel moisture data might affect fire risk and predictions,” Konings said. “Now we’re trying to really pin down the best ways to use it for improved fire prediction.”

The hope is that the same method can be used to create a detailed, interactive map that can be used to identify high-risk areas and help inform wildfire management.
