
Predictive analytics has transitioned from exciting buzzword to core component of a carrier’s arsenal over the last few years, and its effectiveness is no longer in doubt. In fact, a 2015 Willis Towers Watson survey found that 87 percent of property/casualty insurers reported gains in profitability from the use of predictive models.

Executive Summary

Predictive model monitoring is one of the most pivotal components of building and sustaining an overall analytics strategy within an organization. Bret Shroyer of Valen Analytics reviews three scenarios that signal when a model needs attention to keep meeting business goals.

Now carriers are asking, “How can I create the most effective model in the long term?” The answer ultimately depends on the available data and how well a company understands the unique questions and answers forming the basis of its model.

At Valen, we often talk about the phases that carriers go through in their adoption of predictive analytics. In Phase 1, carriers typically enjoy two or three years of easy wins picking the low-hanging fruit of initial model adoption. But it’s important to know the right “next move” beyond Phase 1. The options include:

  • Refreshing or updating an existing model to reflect new information.
  • Rebuilding the existing model with entirely new data, variables, targets, etc.
  • Incorporating predictive modeling into another area of business, like claims or marketing.

How should a carrier choose the optimal path, and when is the right time to do any of these? In the examples that follow, we present three scenarios that point to a clear need for change.

Scenario 1: Changes to the Distribution of Predictions

Particularly for carriers that are new to predictive analytics, it’s important to measure the distribution of predictive scores on live business compared to the distribution of scores from the model build benchmark that was done prior to implementation. One of the key features of a predictive model is that, given the same set of inputs, a model will always give the same outputs. So, if the output is now following an unexpected distribution, it should be clear that the inputs are no longer what was originally expected.
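
One common way to quantify this comparison is a population stability index (PSI) computed over score bins. The Python sketch below is illustrative rather than prescriptive; the function name, the inputs and the thresholds in the comments are assumptions, and any similar drift statistic would serve:

    import numpy as np

    def population_stability_index(benchmark, live, n_bins=10):
        """Quantify drift between the live score distribution and the
        model-build benchmark. Common rules of thumb: PSI below 0.1 is
        read as stable; above 0.25, a material shift worth investigating.
        """
        # Bin edges come from benchmark deciles, so both populations
        # are measured against the same yardstick.
        edges = np.percentile(benchmark, np.linspace(0, 100, n_bins + 1))
        edges[0], edges[-1] = -np.inf, np.inf  # catch out-of-range live scores

        bench_pct = np.histogram(benchmark, bins=edges)[0] / len(benchmark)
        live_pct = np.histogram(live, bins=edges)[0] / len(live)

        # Clip to avoid log(0) in sparsely populated bins.
        bench_pct = np.clip(bench_pct, 1e-6, None)
        live_pct = np.clip(live_pct, 1e-6, None)

        return np.sum((live_pct - bench_pct) * np.log(live_pct / bench_pct))

Run on a regular cadence, a check like this turns “the scores look different” into a number that can be tracked over time.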

Why are the inputs changing? It’s imperative to understand the answer to this question. It could be a simple data issue—there was data available during the model build (prior loss information) that’s not available in deployment (no prior loss information available on new business submissions). It may be that underwriting appetite is now shifting as a result of model uptake. Or it may be something really simple, such as “underwriters aren’t pulling scores on any policies under $5,000 premium.”
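
Diagnosing causes like these often takes nothing more than slicing production data. As an illustrative sketch (the data and field names here are hypothetical), checking score utilization by premium band would surface the under-$5,000 gap described above:

    import pandas as pd

    # Hypothetical production extract: one row per submission, with
    # written premium and whether a model score was ever requested.
    submissions = pd.DataFrame({
        "premium": [2500, 4800, 6200, 15000, 52000, 3100, 9800],
        "score_pulled": [False, False, True, True, True, False, True],
    })

    bands = pd.cut(submissions["premium"],
                   bins=[0, 5_000, 25_000, float("inf")],
                   labels=["<5K", "5K-25K", ">25K"])

    # Share of submissions in each band that received a score; a band
    # near zero (like <5K here) flags a utilization gap to investigate.
    print(submissions.groupby(bands, observed=True)["score_pulled"].mean())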

The point is this: If the model was initially built on data from one population, and now it’s being employed on a dramatically different population or with different data, there may be reason to doubt that the predictions will be actionable and accurate. At best, an error in application can be corrected. At worst, this may signal a need to recalibrate or refit the model based on the new understanding of how the model will be applied.

Scenario 2: Changes in Key Performance Indicators

Peter Drucker famously wrote, “What gets measured gets managed.” Every predictive analytics project should likewise have a set of key performance indicators (KPIs) so that model users and stakeholders can keep tabs on model performance and impact.

Some popular KPIs include:

  • Model lift. Is there agreement between model predictions and actual observed values? (A sketch of this check follows the list.)
  • Pricing accuracy. Are there any segments that are being consistently under- or overpriced?
  • Risk selection/adverse selection. Are underwriters seeing higher quote ratios and bind/quote ratios on good risks vs. bad risks? How is this changing over time?
  • Utilization. Are scores being generated where needed—including all segments, regions, products, etc.?
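
As promised above, here is a minimal sketch of the first KPI, model lift, with illustrative column names rather than any particular carrier’s fields. It cuts policies into score deciles and compares actual loss ratios across them; a model that still differentiates risk should show loss ratios climbing from the best decile to the worst:

    import pandas as pd

    def lift_by_decile(policies: pd.DataFrame) -> pd.Series:
        """Actual loss ratio by model-score decile.

        Expects columns 'score' (the model prediction), 'incurred_loss'
        and 'earned_premium'. A flat pattern across deciles suggests
        the model's lift has eroded and warrants a deeper look.
        """
        deciles = pd.qcut(policies["score"], 10, labels=range(1, 11))
        grouped = policies.groupby(deciles, observed=True)
        return grouped["incurred_loss"].sum() / grouped["earned_premium"].sum()

The same groupby pattern, cut on region, product or agency instead of score decile, can back the other KPIs on the list.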

By examining these KPIs, model managers can assess whether the model is being applied as initially intended and is performing adequately. If there is an issue on either front, it’s time to dig deeper into the root causes and perhaps refresh the model.

Scenario 3: Changes to Business Process

Unfortunately, it is becoming more common for carriers to want to apply an existing model to multiple business problems, a tendency captured by Maslow’s Hammer: “When all you have is a hammer, every problem looks like a nail.”

Each predictive model was developed to address a particular business problem; applying a good model to the wrong problem can yield suboptimal—or flat-out wrong—results.

Ideally, a predictive model is born out of the following thought process:

  1. Define key goals and strategies to accomplish them.
  2. Define the tactics needed to implement strategy.
  3. Define data and information needed to support tactics.

Step 3 is where predictive analytics should come in, but only where it can provide information not available from other, more traditional sources. Note that the model here should be uniquely tuned to the relevant business goals, strategies and tactics. Problems arise when carriers start working backward, formulating tactics, strategies and goals to fit a given model.

It’s not always explicit that this is happening, though. Look for these signs:

  • A new KPI is established to monitor model effectiveness that was not part of the original set of KPIs.
  • Underwriting processes or strategies are changing as a result of available model scores.

If either of these is true, you should at a minimum pause to ask whether the current model application aligns with the original design. If it doesn’t, consider going back to the core goals, strategies and tactics used in defining the original model. You may discover that the correct solution is to develop a new model in full alignment with the new strategy.

Model monitoring is pivotal to building and sustaining an overall analytics strategy. By paying attention to the warning signs above and reacting appropriately, carriers will be able to manage their long-term success with analytics more effectively and remain confident in both the efficacy of the model and the business goals surrounding its implementation.