What the Future Holds for Catastrophe Modeling: Aon’s Impact Forecasting

June 23, 2015 by Charles E. Boyle

Technologies are rather like bicycles. Just as riders must keep moving forward or risk falling over, so too must the designers of catastrophe models move full speed ahead to stay on course, experts said recently.

At a conference hosted by Aon Benfield’s Impact Forecasting in London on June 17, a number of the group’s representatives explained just how far catastrophe models have come in the 30 years since Karen Clark produced the first catastrophe model for hurricanes. They described recent developments, including open platforms and customizable components, and wrapped up with reports on how developers are now steering models for natural and manmade perils toward greater relevance for insurers and reinsurers.

Highlights from Impact Forecasting indicating what the future will hold for catastrophe risk analysis included these updates:

• Cristina Arango, who specializes in earthquakes, explained how her team is working on models covering the United States, Turkey, Southeast Asia, the Arabian Peninsula and Chile, among others.

She described the problem of having insufficient data in some areas, such as the Arabian Peninsula, and too much data in others, like Japan. In the former case, it is really only possible to construct scenarios. In the latter, probabilistic models can be quite complicated to produce, especially when the quakes frequently trigger tsunamis.

• Petr Puncochar reviewed the flood model developments in progress, noting that Impact Forecasting’s flood division plans to release six flood models this year. They include an updated U.S. model based on data from 22,000 rain gauges, and made necessary by the floods earlier this year in Texas and Oklahoma. Floods in Canada in 2013 have also hastened extensive flood model revisions.

In the Asia-Pacific region, getting information can be difficult. “We use the data available” for both fluvial (river) and pluvial (rain) causes of flood loss, Puncochar explained.

A similar situation exists in some European countries, notably Poland, where claims data extends only through 2010. The data in both France and Brazil, however, is good enough to enable Impact Forecasting to generate “bespoke” flood models for those countries.

• Alexandros Georgiadis, who has responsibility for developing the division’s probabilistic windstorm risk model for Europe, focused on the damage these storms cause to Europe’s forests, particularly the extensive ones in Scandinavia. The models require data on the specific type of tree (spruce, pine and birch are the most common), the type of soil in which the trees grow, their location (higher or lower elevation), and the residual, though diminished, value of a tree that has been cracked or uprooted, as sketched below.
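To make those inputs concrete, here is a minimal sketch of how a forestry windstorm vulnerability calculation might be structured. The species, soil and exposure factors, the damage ramp and the salvage fraction are all illustrative assumptions, not Impact Forecasting’s calibrated parameters.

```python
# Hypothetical sketch of a forestry windstorm vulnerability lookup.
# All factors below are illustrative placeholders.

# Relative fragility by species: shallow-rooted spruce tends to be
# more prone to uprooting than pine or birch.
SPECIES_FACTOR = {"spruce": 1.0, "pine": 0.7, "birch": 0.5}

# Wet soils reduce root anchorage; exposed (higher) stands catch
# stronger gusts than sheltered (lower) ones.
SOIL_FACTOR = {"wet": 1.3, "normal": 1.0, "rocky": 0.9}
EXPOSURE_FACTOR = {"higher": 1.2, "lower": 0.9}

SALVAGE_FRACTION = 0.4  # assumed residual value of cracked/uprooted timber


def stand_loss(value, gust_ms, species, soil, exposure):
    """Expected monetary loss for a forest stand in one windstorm."""
    # Assumed damage ramp: no damage below 20 m/s, total loss above 45 m/s.
    base = min(max((gust_ms - 20.0) / 25.0, 0.0), 1.0)
    damage = min(base * SPECIES_FACTOR[species] * SOIL_FACTOR[soil]
                 * EXPOSURE_FACTOR[exposure], 1.0)
    # Damaged timber retains some salvage value, so the insured loss
    # is the damaged share of value net of what can still be sold.
    return value * damage * (1.0 - SALVAGE_FRACTION)


# Example: an exposed spruce stand on wet soil in a 38 m/s gust.
print(stand_loss(1_000_000, 38.0, "spruce", "wet", "higher"))
```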

• Mark Lynch, Impact Forecasting’s terrorism expert, described the work he and his team have done in modeling the varying potential effects of explosions. The type of explosive used, the weight and volume of the charge, and where it might be placed all have to be considered in creating the model.

In addition, urban density, which includes the “geometry of streets” in potential target cities, the construction type of the buildings at risk, and where those buildings are located all shape the eventual model.

Using Frankfurt as an example, Lynch explained how the team “measures the propagation of pressure [from a blast wave],” analyzing the uncertainty of explosions in different locations, as well as the accident and health exposures present around a potential target. The greatest loss of life from terrorist bombings occurs when a building collapses.
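The pressure propagation Lynch described can be approximated, to first order, with the standard Hopkinson-Cranz scaled distance Z = R / W^(1/3) and a published free-field overpressure fit such as Mills (1987). The sketch below uses that approach with a hypothetical charge size and rough, assumed damage bands; a real model like Lynch’s would add street geometry, reflections and building response.

```python
def scaled_distance(r_m, tnt_kg):
    """Hopkinson-Cranz scaled distance Z = R / W**(1/3)."""
    return r_m / tnt_kg ** (1.0 / 3.0)


def incident_overpressure_kpa(z):
    """Peak free-field overpressure (kPa) via Mills' (1987) fit.
    Ignores the reflections and channeling that street geometry causes."""
    return 1772.0 / z**3 - 114.0 / z**2 + 108.0 / z


def damage_band(p_kpa):
    """Rough, assumed damage thresholds for illustration only."""
    if p_kpa < 7:
        return "glazing damage"
    if p_kpa < 35:
        return "moderate structural damage"
    return "severe damage / possible collapse"


charge_kg = 500.0  # hypothetical TNT-equivalent charge weight
for r in (25, 50, 100, 200):
    p = incident_overpressure_kpa(scaled_distance(r, charge_kg))
    print(f"{r:4d} m: {p:7.1f} kPa -> {damage_band(p)}")
```

The steep falloff with distance is why charge placement matters so much, and why the collapse scenario Lynch flagged dominates casualty estimates.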

Blending Data and Science

An overriding theme of the conference was that modelers need to turn to science to help them fill gaps in the data, even as the scientific community works to gather more data to sharpen its analyses.

“We blend science and data,” said Dail Rowe, who leads a team at WeatherPredict. “We have lots of data on auto accidents, but not nearly enough on weather” events, he said.

Rowe cited work done on how high- and low-pressure variations affect storm systems, noting that European windstorm models now take into account the phenomenon described as “clumping.” This occurs when a second or third storm system follows directly after the first, creating a series of linked weather events. The classic example is Lothar and Martin, two storms that followed one another across France, Germany and Switzerland within 24 hours of each other in 1999 and caused significant damage and loss of life.
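A common way to represent clumping is to drop the assumption that storms arrive independently (a Poisson process) in favor of an overdispersed count distribution such as the negative binomial. The sketch below, with an assumed storm rate and dispersion parameter rather than any model’s real calibration, shows how clustering raises the probability of multi-storm years like 1999.

```python
import numpy as np

rng = np.random.default_rng(42)
mean_storms, years = 1.5, 100_000  # assumed annual windstorm rate

# Independent arrivals: Poisson annual counts (variance equals mean).
poisson_counts = rng.poisson(mean_storms, years)

# Clustered ("clumped") arrivals: negative binomial with the same mean
# but extra variance, so years with several storms occur more often.
k = 2.0                    # assumed dispersion parameter
p = k / (k + mean_storms)  # numpy's (n, p) parameterization
nb_counts = rng.negative_binomial(k, p, years)

for name, c in (("Poisson", poisson_counts), ("Clustered", nb_counts)):
    print(f"{name:9s} mean={c.mean():.2f} var={c.var():.2f} "
          f"P(3+ storms)={np.mean(c >= 3):.3f}")
```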

It’s no longer sufficient to use past claims and specific storms as the main data source. These are now a starting point to bring in other data and to apply scientific analysis to create better models.

The more than 200 people attending the conference, titled “Impact Forecasting Revealed,” also heard Adam Podlaha, global head of Impact Forecasting and the conference chair, report that there is no unique model for any given catastrophe risk. Any insurer, reinsurer or broker who assesses property/casualty risk needs a wraparound picture of that risk, which requires input from different sources, he said.

Podlaha listed the line-up in ascending order, beginning with “scenarios,” i.e. overall views of a given geographic area where risks have been identified, and moving on to “probabilistic models,” where the likelihood of a catastrophic event is calculated and the potential losses are identified.
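At its core, a probabilistic model reduces to an event loss table: a catalog of simulated events, each with an annual rate of occurrence and an associated loss. The sketch below, using made-up numbers, shows how such a table yields two standard outputs, the average annual loss and the probability of exceeding a loss threshold in a year, assuming independent Poisson arrivals.

```python
import math

# Hypothetical event loss table: (annual rate, loss in $ millions).
events = [
    (0.10, 50), (0.04, 200), (0.01, 1000), (0.002, 5000),
]

# Average annual loss: sum of rate x loss over the event set.
aal = sum(rate * loss for rate, loss in events)
print(f"AAL = ${aal:.1f}m")

# Chance of at least one event exceeding each threshold in a year.
for threshold in (100, 1000):
    total_rate = sum(r for r, l in events if l >= threshold)
    prob = 1.0 - math.exp(-total_rate)
    print(f"P(annual loss event >= ${threshold}m) = {prob:.3f}")
```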

The next step is to narrow the point of focus by “customizing” the model, or in British terminology to create a “bespoke,” i.e. tailor-made, model for insurers, reinsurers and brokers to consult in order to understand the precise risks they are asked to cover.

Podlaha explained that this requires combining all the available knowledge relevant to the risk. But since that knowledge may be contained on more than one platform, it needs to be collated for a precise risk assessment. Impact Forecasting therefore developed “ELEMENTS,” its loss calculation platform, which is compatible with most other formats, including Oasis, the open source platform introduced in London last year.

In a September 2012 announcement, Aon Benfield said that ELEMENTS allows insurers to incorporate their own views on the risk of property damage through customized vulnerability curves, based on their actual loss experience or current portfolio.
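A vulnerability curve simply maps hazard intensity to an expected damage ratio, so “incorporating your own view” amounts to swapping in a different curve. The sketch below illustrates the idea with invented numbers; it is not ELEMENTS’ actual interface or data.

```python
import numpy as np

# Default vulnerability curve: mean damage ratio vs. peak gust (m/s).
gusts         = np.array([20.0, 30.0, 40.0, 50.0])
default_curve = np.array([0.00, 0.05, 0.25, 0.60])

# An insurer whose newer, better-built portfolio has produced lighter
# claims might override the default with a flatter curve of its own.
insurer_curve = np.array([0.00, 0.03, 0.15, 0.45])


def damage_ratio(gust, curve):
    """Interpolate the mean damage ratio from a vulnerability curve."""
    return np.interp(gust, gusts, curve)


tiv = 250e6  # assumed total insured value of the portfolio
for label, curve in (("default", default_curve), ("custom", insurer_curve)):
    loss = damage_ratio(45.0, curve) * tiv / 1e6
    print(f"{label}: 45 m/s gust -> ${loss:.1f}m expected loss")
```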

By integrating all of the elements that combine to define a risk, Aon Benfield can produce the “primary tools” required by underwriters to more fully comprehend the nature of the risk they have been presented with and the price range into which it falls.

The process comes full circle when an event occurs, especially one that hasn’t been adequately modeled, such as the 2011 Thailand floods or the recent earthquake in Nepal. Such events generate catastrophe reporting and analysis, which in turn enlarges the database for any similar events.

Weather and Climate

Weather-related events generate the greatest number of catastrophic losses, according to Aon Benfield. Whether it’s cyclonic storms (hurricanes and typhoons), thunderstorms and tornadoes, windstorms such as U.S. Nor’easters and the ones that hit Europe, or floods and mudslides from excessive rainfall, modeling them is a top priority for insurers, reinsurers and brokers.

Impact Forecasting’s Associate Director and Meteorologist Steve Bowen said that while there are differences between weather and climate, they are certainly related. “Climate is what you expect; weather is what you get,” he said, quoting Mark Twain. The slow but inexorable rise in ocean temperatures underlies most changes in formerly “normal” climates. “Sea level temperatures have been warmer than average for 363 consecutive months,” he said.

When water warms it expands, raising sea levels and increasing the melting of polar ice. It also causes “more atmospheric instability,” which increases the frequency and the power of the storms that occur, Bowen said. Global warming has been accompanied by, and is probably in part caused by, an increase in carbon dioxide (CO2) in the atmosphere, which Bowen said is now over 400 parts per million (ppm); roughly 93 percent of the resulting excess heat, he noted, is “stored in the oceans.”

The concentration is climbing higher and is expected to reach 450 ppm by 2050, a level that hasn’t been seen in millions of years. By 2050, population growth is also expected to put 66 percent of the global population on or near coastlines, he said.

Thunderstorms, cyclones and floods have caused over $1 trillion in private and public losses since 1980, Bowen said. Eighty-five percent of those losses, however, were attributable to growth in the value of the properties affected; only 15 percent were attributable to weather and climate trends themselves. Put another way, of that roughly $1 trillion, about $850 billion traces to exposure growth and some $150 billion to the hazard itself. In future years, however, the two factors will compound each other, causing greater casualties and larger economic losses.

Evolution and Improvements to Catastrophe Models

Challenging data gaps, described by Rowe and others, have resulted in a wave of new techniques and scientific applications in the construction and use of catastrophe models.

Aon Benfield’s Patrick Daniell described the application of engineering techniques in assessing the structural vulnerability of properties subject to European windstorms. “We applied different engineering elements, analyzed them and combined them with [existing] claims data,” he explained. The resulting “bespoke model,” produced on the ELEMENTS platform, analyzes residential, commercial, industrial and agricultural properties.
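One plausible reading of combining engineering analysis with claims data is a credibility-style blend: trust the claims experience where it is plentiful and lean on engineering judgment where it is sparse, typically at the extreme wind speeds that matter most. The sketch below is an assumed illustration of that idea, not Daniell’s actual method.

```python
import numpy as np

# Mean damage ratios per gust band from two sources (invented numbers).
gusts       = np.array([25.0, 35.0, 45.0])
engineering = np.array([0.02, 0.12, 0.40])  # from structural analysis
claims_mean = np.array([0.03, 0.10, 0.55])  # observed in claims
claims_n    = np.array([400, 60, 5])        # claims count per band

# Square-root credibility rule: full weight on claims at an assumed
# count, shrinking toward the engineering view as data thins out.
FULL_CREDIBILITY = 500.0
weight = np.minimum(np.sqrt(claims_n / FULL_CREDIBILITY), 1.0)
blended = weight * claims_mean + (1.0 - weight) * engineering

for g, w, b in zip(gusts, weight, blended):
    print(f"{g:.0f} m/s: claims weight {w:.2f} -> blended damage ratio {b:.3f}")
```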

Dave Martin, technical director of Australia’s Ambiental, described how the firm built hazard, exposure and loss-estimate models for its flood risk data products. The work has been validated against actual events and gives an accurate picture of river and coastal flood potential in Australia down to the postcode level.

Aon Benfield’s Charlie New walked through the intricate technical stages of the catastrophe modeling workflow. While the ultimate goal is to provide underwriters with the models they need to make accurate coverage decisions, a vast amount of material must first be collected, analyzed and incorporated into existing systems. At this point, technology and the people who know how to use it are critical.

As a result, after he had described the various platforms and services involved in creating accurate catastrophe models, New said, “Aon Benfield is now a software provider.”

Even 10 years ago that would have been a rather startling comment. The explosion in the creation of, and dependence on, cat models makes it today simply a statement of fact.