Tornado Frequency Trends Deceptive, Actuary Says

December 26, 2013

To develop adequate rates, actuaries don’t just need to spot the next trend that will push up claim frequencies or average loss costs.

They also need to sort out what the next trend will not be, actuaries said during a panel discussion of catastrophe models at the Casualty Actuarial Society annual meeting in November, with one actuary suggesting that tornado frequencies may not be on the rise.

If it looks like the number of tornadoes is on the rise, a sound actuarial rate for homeowners will capture that phenomenon, Scott R. Jean, a Fellow of the Casualty Actuarial Society and vice president and chief actuary at EMC Insurance Companies, noted during his presentation, “Avoiding Knee-Jerk Reactions to Short-Term Natural Catastrophes.”

According to a report of the presentation from the CAS, Jean showed a chart displaying the number of severe tornadoes annually over 25 years, with the actual years masked to help him later prove his point.

The number of storms fluctuated, of course, but there was a definite upward trend. The chart included a five-year moving average and a line fitted to the data, and both indicated the number of storms was continually growing.

Then Jean revealed what years the “trend” covered: 1950 to 1974.

Extrapolating the moving average or the fitted line forward would have overestimated tornado frequency by a factor of four, and would even have overestimated the 2011 tornado season, one of the worst, with tragedies in Tuscaloosa, Ala., and Joplin, Mo.

This chart demonstrated the dangers of using even 25 years’ worth of data to forecast into the future, as the sketch below illustrates.
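As a minimal sketch of the pitfall Jean describes, the snippet below fits a straight line and a five-year moving average to the first 25 years of a cyclical storm series. The counts are purely synthetic, not Jean’s actual chart data, but extrapolating the fitted window forward overshoots later years in the same way.

```python
# Illustrative sketch (synthetic data, not Jean's actual chart): fitting a
# linear trend to a 25-year window and extrapolating it forward.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical severe-tornado counts: a cyclical series with no long-run trend.
years = np.arange(1950, 2012)
cycle = 40 + 25 * np.sin((years - 1950) / 62 * 2 * np.pi)
counts = rng.poisson(cycle)

# Fit a straight line to the first 25 years (1950-1974), as in the masked chart.
window = years <= 1974
slope, intercept = np.polyfit(years[window], counts[window], 1)

# Extrapolate that "trend" to 2011 and compare with the series itself.
projected_2011 = slope * 2011 + intercept
actual_2011 = counts[years == 2011][0]
print(f"fitted slope over 1950-1974: {slope:+.2f} storms/year")
print(f"extrapolated 2011 count: {projected_2011:.0f} vs simulated: {actual_2011}")

# A 5-year moving average over the same window tells the same misleading story.
moving_avg = np.convolve(counts[window], np.ones(5) / 5, mode="valid")
print(f"5-year moving average, start vs end of window: "
      f"{moving_avg[0]:.1f} -> {moving_avg[-1]:.1f}")
```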

There’s also a reason there are more tornadoes reported in recent years, Jean said, and it’s fairly obvious—more people are reporting tornadoes.

In the past, the only tornadoes recorded were confirmed sightings—ones that did damage or were reported by credible witnesses, such as police. There were no video cameras to capture funnel clouds on tape.

Tornadoes that once went unreported are now chronicled in detail. Modern radar systems spot tornadoes that land in the most remote cornfield. And there’s a cottage industry of storm trackers, who shuttle from twister to twister to record the wreckage.

In fact, the National Weather Service has analyzed the data with the reporting bias removed. The result: over the decades, there was little change in the number of severe tornadoes, Jean said.
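The article does not spell out how the National Weather Service removed the bias. As a hedged sketch of the general idea, one can deflate each era’s raw count by an assumed reporting-efficiency factor; every number below is hypothetical.

```python
# Minimal sketch of the idea behind a reporting-bias adjustment. The actual
# NWS methodology is not described in the article; the counts and efficiency
# factors below are purely hypothetical assumptions for illustration.
raw_counts = {1955: 20, 1975: 28, 1995: 36, 2011: 44}  # hypothetical raw tallies

# Assumed fraction of tornadoes actually observed and recorded in each era
# (rising over time with radar coverage, storm trackers, and video).
reporting_efficiency = {1955: 0.45, 1975: 0.60, 1995: 0.80, 2011: 1.00}

for year, raw in raw_counts.items():
    adjusted = raw / reporting_efficiency[year]
    print(f"{year}: raw={raw:3d}  bias-adjusted={adjusted:5.1f}")
# Once deflated, the apparent upward trend largely disappears:
# all four adjusted values land near 44-47.
```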

Therein lies a lesson, Jean said: Look for the story behind the data. Tornadoes, it turns out, seem to follow long cycles. Periods with a lot of storms are followed by fallow periods.

No doubt, Jean said, “2011 was a very extreme year” for tornadoes. But it was less severe than 1974 or 1965.

Modeling Real Trends

Two other panelists described how data from modeling firms can project trends that do pan out, such as for wildfires and windstorms.

Modeling firms have been active in the industry for two decades, but only recently have cat modelers started focusing on tornadoes and hailstorms, according to Howard Kunst, a Fellow of the Casualty Actuarial Society and chief actuary at CoreLogic, a real estate data and analytics company, where he is responsible for collecting and analyzing catastrophe data.

The early days of cat modeling were devoted to hurricanes, he said. “Those were the big-ticket items.”

But now there’s a broader demand. Fifty-seven percent of U.S. catastrophe losses come from tornadoes and hailstorms, Kunst said.

Kunst and another modeling expert, Matthew Nielsen, director of model product management at RMS, said the models cannot predict precisely where tornadoes and hailstorms – known collectively as severe convective storms – will strike. But they have done a good job of pinpointing regions.

Nielsen, for example, showed a 2008 RMS map predicting which areas were at greatest risk of tornadoes. The famed tornado alley – Texas through the Dakotas – was prominently featured, of course. But at nearly as great a risk, according to the modelers, were the northern parts of Deep South states like Mississippi and Alabama.

So the models, while not knowing a convective storm would strike precisely at Tuscaloosa in 2011, did recognize a risk to the general vicinity.

Modelers are also concentrating on wildfire risk, as protracted droughts and more homes built in wooded areas have increased the exposure. In 2012, nine million acres burned in wildfires. Over the past 10 years, 2,500 homes a year were destroyed in wildfires; in prior decades, fewer than 1,000 homes a year burned.

Today, Kunst said, wildfire potentially threatens 1.26 million homes, collectively worth $189 billion.

To predict how a fire will behave as it approaches an insured property, modelers look at the type of vegetation (assessed via satellite), the slope of the surrounding hills, and which side of the hill the property sits on (preferably the lee side of the prevailing wind).
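As a hedged sketch of how those three factors might be combined into a single relative score, consider the toy function below. The vegetation categories, weights, and thresholds are invented for illustration; they are not CoreLogic’s or RMS’s actual model.

```python
# Toy scoring of the three wildfire factors the article names
# (vegetation type, slope, hillside aspect). All weights are hypothetical.
from dataclasses import dataclass

# Hypothetical relative flammability by vegetation class (from satellite data).
VEGETATION_RISK = {"grass": 0.5, "shrub": 0.8, "conifer": 1.0, "cleared": 0.1}

@dataclass
class Property:
    vegetation: str       # dominant fuel type around the structure
    slope_degrees: float  # steeper slopes speed a fire's uphill run
    windward: bool        # True if the home faces the prevailing wind

def wildfire_risk_score(p: Property) -> float:
    """Combine the three factors into a rough 0-to-1 relative score."""
    fuel = VEGETATION_RISK[p.vegetation]
    slope = min(p.slope_degrees / 45.0, 1.0)   # normalize slope to [0, 1]
    exposure = 1.0 if p.windward else 0.6      # lee side scores lower
    return fuel * (0.5 + 0.5 * slope) * exposure

# Example: conifer fuel on a steep windward slope vs. grass on flat lee ground.
print(wildfire_risk_score(Property("conifer", 30.0, windward=True)))   # ~0.83
print(wildfire_risk_score(Property("grass", 5.0, windward=False)))     # ~0.17
```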

In the end, though, the actuary’s job is to separate the trend from the random spike: to project whether a month like April 2011, whose 758 tornadoes more than doubled the previous monthly mark, is an aberration or the start of a surge in activity.
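One simplified way to frame that question is to ask how far the spike sits from recent experience. In the sketch below, the 758 figure comes from the article; the other April counts are hypothetical placeholders.

```python
# Hedged sketch: score the spike against the historical distribution. Apart
# from the 758 count cited in the article, the April tallies are invented.
import statistics

april_counts = [155, 267, 132, 189, 245, 177, 311, 136, 296, 758]  # hypothetical
history, spike = april_counts[:-1], april_counts[-1]

mean = statistics.mean(history)
stdev = statistics.stdev(history)
z = (spike - mean) / stdev
print(f"April mean {mean:.0f}, stdev {stdev:.0f}, 2011 z-score {z:.1f}")
# A z-score this large says the month sits far outside recent experience, but
# a single point cannot distinguish a random extreme from the start of a
# trend; that verdict needs more years of bias-corrected data.
```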

Jean points to the data, properly corrected for bias. “We are trained to look at that data,” he said, “and make forecasts from that data. And based on the data, it’s a little early to make that prediction.”

About CAS

The Casualty Actuarial Society fulfills its mission to advance actuarial science through a singular focus on research and education for property/casualty actuarial practice. Among its 6,000 members are experts in property/casualty insurance, reinsurance, finance, risk management, and enterprise risk management.

Source: Casualty Actuarial Society