BOISE, Idaho — Early morning is a frenetic time at a wildfire command post. Biologists, meteorologists, foresters and firefighters hustle into tents and grab laptops to review overnight reports, prepping for the day's assault. Fire behavior analysts run computer models that spit out information crucial to putting out the blaze: how many acres a fire will probably burn, in which direction and with what intensity.
In recent years, those models have been rendered practically obsolete: they can no longer project how erratic Western fires have become, which makes tactical decisions harder for fire bosses and fire lines less safe for crews in the field.
The analytical work performed by fire scientists here at the National Interagency Fire Center also confirms what seems anecdotally evident: Wildfires are getting bigger — the average fire is now five times as large as it was in the 1980s — and these enormous conflagrations have a breathtaking capacity to dance and grow. Unforeseen winds send flames swerving and turning on fire crews, and it's no longer unusual for a fire to double in size in a day.
The unpredictable has now become the expected.
Last month at Rocky Mountain National Park, the Big Meadows fire amazed veteran firefighters by burning across snowfields. The fire wasn't carried by embers, but marched inexplicably over deep snow.
"No one had seen that before," said Dick Bahr, the National Park Service's fire science lead, still a bit in awe of the power of such a fire. "It's a bit crazy. The models are only as good as the data behind them, and the data is changing faster than we can update it."
Unexpected fire behavior — perhaps a sudden wind shift — may have been at work in the Yarnell Hill fire, the out-of-control blaze that killed 19 elite wildland firefighters in Arizona on Sunday. Exactly what caused the men to become trapped and overrun by flames is unknown, but the case suggests that fire managers are struggling to find secure positions on fire lines.
U.S. Forest Service Chief Tom Tidwell laid it out in stark terms last month while testifying before Congress: "The last two decades have seen fires that are extraordinary in their size, intensity and impacts."
Consider the example of three New Mexico fires:
In 2000, the Cerro Grande fire near Los Alamos became the largest fire in state history. It burned 40,000 acres in a week. In 2011, the Las Conchas fire burned through 61,000 acres of forest in one day and eventually charred 150,000 acres. It was the state's largest fire until last year, when the Whitewater-Baldy Complex fire burned nearly 300,000 acres.
"These huge fires are the new normal," said John Glenn, chief of fire operations for the federal Bureau of Land Management. "Look at any touchstone — global warming, fuels, invasive species, forest and rangeland health issues — and then you throw in the urban interface. It's almost like this perfect mix. What used to be the anomaly is almost like the normal now."
A multitude of factors contribute to making wildland fires more complicated to fight and more difficult to understand: drought, insect-ravaged trees, fewer resources set aside to thin overgrown forests. But the physics of fire remain immutable: a voracious force constantly seeking fuel to consume.
The practice of using computer models to predict fire behavior was first adopted in the 1970s. Analysts plug in the area's fire history — where and how previous fires have burned — vegetation type, topography, climate history and current weather.
The forecasting recipe had relied on fire and weather data collected in the last 50 or so years. But the fire landscape has changed so dramatically that analysts have found that only information from the most recent decade is of any use as a predictor of contemporary fire behavior.
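Models of this kind trace back to surface-fire spread equations developed in the 1970s, in which a fire's rate of spread rises sharply with wind and slope. The sketch below is purely illustrative — the fuel-bed numbers are invented, and this toy function is not the actual software used at the National Interagency Fire Center — but it shows why a sudden wind shift can transform a forecast:

```python
# Toy sketch of a 1970s-style surface fire spread calculation
# (Rothermel-type form: R = I_R * xi * (1 + phi_w + phi_s) / (rho_b * eps * Q_ig)).
# All inputs below are illustrative assumptions, not real fuel data.

def spread_rate(reaction_intensity, propagating_flux_ratio,
                wind_factor, slope_factor,
                bulk_density, effective_heating, ignition_heat):
    """Rate of spread (ft/min): heat reaching new fuel divided by
    the heat needed to ignite that fuel."""
    heat_source = reaction_intensity * propagating_flux_ratio * (
        1.0 + wind_factor + slope_factor)
    heat_sink = bulk_density * effective_heating * ignition_heat
    return heat_source / heat_sink

# Hypothetical grass fuel bed: calm air vs. a strong wind.
no_wind = spread_rate(5000, 0.04, 0.0, 0.0, 1.0, 0.2, 250)
windy = spread_rate(5000, 0.04, 9.0, 0.0, 1.0, 0.2, 250)
print(round(no_wind, 1), round(windy, 1))  # prints: 4.0 40.0
```

With these made-up numbers, a wind factor of 9 multiplies the spread rate tenfold — a crude version of the mathematics behind why an unforeseen wind shift can outrun a crew.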
Along with an extended drought and wild weather extremes, fire profilers have to take into account a new, explosive fuel type on the Western landscape: houses. By the Forest Service's reckoning, nearly one-third of the homes now built in the United States are on the fringe of settled areas, where timber and chaparral meet stucco and cul-de-sacs.
These houses in fire-prone zones are referred to by some fire professionals as "suicide subdivisions," and their popularity drives up the cost and complication of firefighting.
Not only are more and more Americans living in harm's way on the edges of development, but people have also caused about 85% of wildfires in the U.S. in the last decade.
Homes, along with the highways, power plants and transmission lines crisscrossing the rural West, are more than obstacles for fire commanders to work around. Saving that infrastructure becomes an imperative, and a headache, for fire bosses.
Incident commanders look to their fire analysts for their best advice about what a fire might do. Behavior analysts — careful and cautious scientists — confine their projections to the "probable": what they think is likely to happen, based on an array of variables.
They discount the "possible" as being too extreme to consider.
But as more and more fires blow up into unmanageable messes, even the most conservative analyst begins to expand the realm of what is probable on a fire.
"Looking at the increasing number of rare and significant events, you have to consider that the possible now becomes probable," Bahr said.
Ed Delgado, a meteorologist and predictive services manager at the National Interagency Fire Center, acknowledged that the recent string of fires along Colorado's Front Range, including one that destroyed 500 homes, "was beyond what we thought was probable."
"There's a tendency to not forecast extremes," he said. "It's human nature to not go beyond what your experience is. If you've never seen a 1-million-acre fire, you are probably not going to forecast a 1-million-acre fire."
Given what they are seeing, fire forecasters say they will probably expand the limits of the scenarios they provide fire commanders.
"The one thing we know about the West is that having no fire is not an option," Bahr said. "The only way you can get to no fire is to pave it all."