Garbage in, garbage out.
It’s a phrase as old as computing, attributed to an Army specialist in an era when the cutting edge of processing was 16 bits on a device the size of a commercial refrigerator.
While computers’ capabilities have scaled beyond anything SPC William Mellin might have imagined in the year Elvis last appeared on Ed Sullivan and Sputnik launched the space race, users have not outgrown the imperative of accurate information on the front end of any computing task. Far from it.
Poor data quality is “enemy number one,” holding back big data’s potential, Thomas Redman argues in the Harvard Business Review. Advanced data machines are being built on top of legacy business systems that creak under the weight of the new technology, and the historically data-intensive insurance industry is no exception. Carriers and third-party administrators (TPAs) are investing in tools like machine intelligence and high-end analytics at a rapid clip. But that so-called intelligence is only as good as the underlying data.
Straight-through or no-touch claims processing isn’t possible without reliable information. The automation that makes such leaps in efficiency possible will only be a faster route to the wrong outcome unless the correct data is collected from the first interaction with a claimant. Data quality should be the priority for any insurance company expecting to profit from high-end analytics.
The importance of the first interaction with a claimant can’t be overstated. It’s an opportunity to gain meaningful margin, not just a commodity service. Handled nimbly and intelligently, the claims intake process can reduce the cost of claims through better operational efficiency and better loss ratios, cutting loss adjustment expenses and compressing turnaround time for settlement.
Carriers and TPAs can start by building a better foundation for innovation. In claims intake alone, they can run automated, software-based integrity checks during the intake process and track call center metrics such as average speed of answer, yielding better data and performance to feed advanced data systems or other innovations. If the intake process is found wanting, consider an upgrade by asking whether the system will be more flexible inside your walls or outside them. The answer is knowable: flexibility can be judged by the speed with which client-specific rules can be implemented, and then changed, without compromising data quality. The point is to adopt a mindset geared toward feeding the data machine the best information available, and toward ensuring you can change and upgrade that diet at the pace of your most demanding customers. The alternative is resigning yourself to innovation investments that may be sitting on top of garbage.
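To make the idea of automated integrity checks concrete, here is a minimal sketch in Python. The field names, the policy number format, and the rules themselves are illustrative assumptions, not any carrier’s actual intake schema; a real system would load client-specific rules so they can be changed quickly without touching code.

```python
from datetime import date

# Hypothetical intake schema -- field names and rules are illustrative only.
REQUIRED_FIELDS = ("claimant_name", "policy_number", "loss_date", "loss_description")

def integrity_check(record: dict) -> list:
    """Return a list of data-quality problems found in one intake record.

    An empty list means the record passed every check and is safe to feed
    downstream analytics; anything else should be fixed at first contact.
    """
    problems = []
    # 1. Required fields must be present and non-empty.
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            problems.append("missing field: " + field)
    # 2. The loss date cannot be in the future.
    loss_date = record.get("loss_date")
    if isinstance(loss_date, date) and loss_date > date.today():
        problems.append("loss_date is in the future")
    # 3. Policy numbers follow a fixed format (assumed: 2 letters + 6 digits).
    policy = record.get("policy_number", "")
    if policy and not (len(policy) == 8 and policy[:2].isalpha() and policy[2:].isdigit()):
        problems.append("policy_number does not match expected format")
    return problems

record = {
    "claimant_name": "Jane Doe",
    "policy_number": "AB123456",
    "loss_date": date(2019, 5, 1),
    "loss_description": "",  # empty -> caught at intake, not downstream
}
print(integrity_check(record))  # -> ['missing field: loss_description']
```

Catching the empty description at the first interaction is the whole point: the record is rejected or repaired before it ever reaches the analytics pipeline, rather than quietly corrupting predictions later.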
Adding urgency, the businesses that insurance covers, like the business of insurance itself, are changing so rapidly that companies need solutions that are not only fanatically accurate but can be constantly updated and adjusted. McKinsey points out that older companies struggle to move off legacy internal intake systems or partners. Nobody wants to be the dinosaur that evolution and disruption leave behind, yet many enterprises have trouble switching to the nimbler, more flexible architectures that big data requires, and so risk moving too late to external platforms run by agile specialists.
Rigid legacy systems that can’t adjust to unexpected needs or let companies experiment threaten scalability and the integrity of every prediction and finding, leaking margin at the very least. Add errors in the claims intake process, and the advantages every business seems to be seeking from advanced analytics and machine intelligence become all but impossible.
The issue is being forced. Little had changed in property/casualty insurance since it got off the ground some time before the American Civil War, but radical transformation is now being spurred by digital innovation. Even historically slow-moving small business insurance is evolving fast, according to Boston Consulting Group.
Insurance carriers are demonstrating that data from devices, ranging from in-home and automotive sensors and wearable technology to GPS and networked appliances, can improve risk assessment and engage policyholders in loss prevention, according to accounting firm EY. Distributed ledger technology, better known as blockchain, is already in play in insurance functions from fraud detection, where payment accuracy can improve by four percentage points, to distribution and payment models. Unmanned aircraft help with underwriting, disaster management, crop surveys, and better, cheaper claims adjustment. And the results should excite any CFO: McKinsey has found that innovation in auto insurance claims can cut expenses by as much as 30 percent while boosting customer satisfaction by 10 to 15 percent.
In the wider business world, an investigation by the BCG Institute and Princeton University found that businesses are failing faster than ever: a public company now has a one-in-three chance of being delisted in the next five years, six times the rate of 40 years ago. Only high-quality data will fully enable predictive tools to give organizations the long view they need to plan for an unpredictable world.
Experience shows the place to start is the edge of the process, where your customers and field operations are first hit by events. Get that right, and businesses can build trust and resiliency in the face of the unpredictable, becoming the kind of test bed for data solutions that can thrive in the new world.