Data Quality and the Commercial Property Insurance Marketplace

March 10, 2016 by Zack Schmiesing

Although big data and disruption have become hot topics in the language of market watchers and business leaders, in commercial property insurance, the quality of data is a paramount consideration for insurers underwriting and pricing the risks.

Executive Summary

Carriers are responding to the evolution and challenges of the commercial property marketplace through automated decision support technology, according to Zack Schmiesing of Verisk Analytics. While primary carriers are recognizing the benefits of integrating sophisticated analytics and data resources that drive underwriting consistency and free experienced underwriters from data gathering and validation tasks, reinsurers have been slower to adopt these tools.

In truth, insurers and their customers have embraced big data for decades. Profitable pricing of policies, accurate claims adjustment and sound risk-transfer strategies are all made possible through the use of public and proprietary data sources. Most insurers know that data quality, although essential, is often misunderstood or overlooked for the sake of expediency: often the fastest returned quote is the winning quote. An insurer can have the best model and processing available, but poor data quality too frequently leads to incorrect pricing. In this way, inferior data can have a significant negative impact, particularly in commercial property underwriting.
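The pricing effect of a single unvalidated field can be sketched in a few lines. Everything below is a hypothetical illustration, not an actual rating model: the rate, the field names and the simplified value-times-rate formula are all assumptions made for the example.

```python
# Hypothetical sketch: how one bad data field skews a commercial
# property premium. Rates, values and the formula are illustrative only.

def annual_premium(building_value: float, rate_per_100: float) -> float:
    """Premium = (insured value / 100) * rate -- a simplified model."""
    return building_value / 100 * rate_per_100

accurate_value = 2_500_000   # verified replacement cost (dollars)
reported_value = 1_500_000   # stale, unvalidated figure on the submission
rate = 0.35                  # hypothetical rate per $100 of value

accurate = annual_premium(accurate_value, rate)   # 8750.0
quoted = annual_premium(reported_value, rate)     # 5250.0
shortfall = accurate - quoted                     # 3500.0 of lost premium

print(f"Underpriced by ${shortfall:,.0f} per year")
```

Even with a sound model, the understated exposure flows straight through to a quote that is 40 percent short of adequate premium, which is why validating inputs matters as much as the model itself.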