Wildfires that have raged in California this summer haven’t just overwhelmed firefighters. They’ve also stumped computer models designed to predict the intensity of flames and where they’ll burn.

“These fires are actually exceeding what our models will even predict,” said Ken Pimlott, director of the California Department of Forestry and Fire Protection.

Rapidly spreading fires, exacerbated by four years of drought, may have made blazes harder to forecast, but others suggest modeling methods haven't kept pace with available technology.

For nearly 40 years, modeling has been a primary tool for fire managers to plot where a fire will run and to help plan where to deploy firefighters, dig containment lines, fly water- and retardant-dropping aircraft and order evacuations. But it's not an exact science, and a forecast is often only as good as the expert doing the analysis, plus a little trial and error.

Modeling experts who work for fire agencies take variables such as vegetation type, humidity, temperature and terrain and plug them into a computer program to create virtual fires and see how they progress. Forecasts are usually created twice a day and shared with managers on the ground to make tactical decisions that day and plan for days ahead.

“It’s imperfect. Sometimes it’s spooky right. Other times you miss the mark,” said Rick Stratton, a fire analyst for the U.S. Forest Service. “More often than not, the science, I don’t want to say it’s right, but it helps make a risk-informed decision.”

Fires this summer have been growing bigger faster, and that’s one factor that could be making modeling harder, said Tim Sexton, a program manager with the Forest Service.

Earlier this month, a fire in Lake County torched more than 60 square miles in 12 hours, destroying nearly 600 homes, killing an elderly woman trapped in her Cobb Mountain home and sending thousands fleeing down flame-lined roads.

In the same general area north of California's Wine Country, the so-called Rocky Fire erupted in late July, destroying 43 homes and spreading over 100 square miles. Cal Fire ran models hundreds of times but could not replicate its rapid growth, Pimlott said.

To some, that’s because the model is outdated and doesn’t accurately account for the often turbulent weather created by the fire itself, which includes fierce winds not foreseen in daily forecasts.

“I think their technology is so outdated and what they’re modeling is so complex,” said Janice Coen, a meteorologist at the National Center for Atmospheric Research in Boulder, Colorado. “Most of us would just say this doesn’t work.”

Coen has used data to create simulations of some of the biggest fires in the West, including the one that killed 19 firefighters in Yarnell, Arizona, in 2013.

In that case, winds changed dramatically and shifted on the crew. Coen said those conditions could have been forecast given the weather pattern that was developing, which she said was common for that area. She wouldn’t have been able to tell exactly where the flames would burn, but she would have had a pretty good idea.

Coen has developed a model that couples more accurate weather information with fire behavior, accounting for air flows in steep terrain and for how the fire itself alters the weather.

“That’s where this model is strong because it’s incorporating the time-changing weather and all the weird things happening in the mountains and the fire feedbacks,” Coen said.

The Forest Service has improved its modeling from the days it used maps and calculators, and now uses a web-based system. It is working at its Missoula, Montana, fire lab to incorporate better weather information to create more sophisticated models, said Sexton, program manager with the Wildland Fire Management Research Development and Applications Program.

He said Coen’s work holds promise for the future, but isn’t “ready for prime time.”

Nevertheless, Colorado next year will begin using Coen's modeling to forecast fires after its governor signed a bill in May to spend about $3 million over five years testing the technology, which forecasts dozens of weather variables, in addition to a fire's position and intensity, based on aircraft observations, radar data and other sources.

Sexton acknowledged that current modeling doesn't always work well "right out of the box." Analysts have to make tweaks after they see how a fire behaves and then recalibrate the model, a process that mixes a little art with the science.

Modeling is not used on all fires and often isn’t employed until a fire demonstrates a serious threat to life and property.

When three firefighters were killed this summer in Twisp, Washington, they were responding to initial reports of a blaze. That fire had not yet been modeled, Stratton said.

However, it eventually merged with other blazes to become the biggest fire in state history.
