Small Cyber Insurers Worry Regulators; Model Law Coming

February 1, 2016 by Susanne Sclafane

While insurance regulators are tackling cyber-related issues at a record pace, they’re focused on questions beyond whether insurers are adequately securing the personally identifiable information of policyholders, one regulator said recently.

In fact, they have solvency concerns about some of the companies selling network security and cyber insurance products, Adam Hamm, chair of the National Association of Insurance Commissioners’ Cybersecurity Task Force, said during the Property/Casualty Insurance Joint Industry Forum.

Hamm, who is also North Dakota’s Commissioner of Insurance, was responding to a question from Charles Chamness, president and chief executive officer of the National Association of Mutual Insurance Companies. Chamness, who moderated a panel on technology trends at the Forum, asked whether regulators worry that sellers of cyber insurance might be overextending themselves.

“I can tell you my colleagues and I, when we look at the few dozen or so companies that are selling these products—and predictably there are going to be more in the near future—we don’t have a large concern over the biggest of the big companies. I’m not going to name their names. We all know who they are.

“But when you get down to the smaller or medium-sized companies that are selling these products—and more and more trying to get into this market [to] make some money—we have concerns that they may not be completely understanding the risks that they’re taking on,” he said. It could be that “their sophisticated underwriting and pricing really masks the bull that they’re grabbing by the tail here,” he continued.

Chamness led into his question by citing widely quoted estimates that insurers wrote $2 billion of cyber insurance premium in 2015, with expectations of growth to $7 billion by 2020.
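
Taken at face value, those round numbers imply a compound annual growth rate of just under 30 percent. A quick back-of-the-envelope check (the premium figures are the panel’s estimates, not hard data, so this is illustrative arithmetic only):

```python
# Implied compound annual growth rate if cyber premium grows from
# $2 billion (2015) to $7 billion (2020). Both endpoints are the
# widely quoted estimates cited on the panel, not measured data.
cagr = (7 / 2) ** (1 / 5) - 1
print(f"implied CAGR: {cagr:.1%}")  # roughly 28.5%
```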

Adam Hamm (Photo by Don Pollard)

“This is an issue that my colleagues are extremely interested in and have a number of concerns with,” Hamm said, explaining the impetus for the NAIC task force to develop a cyber insurance supplement to the statutory blank for 2016 filings and beyond. “Those are estimates,” he said, referring to the $2 billion and $7 billion figures. “Nobody has any hard specific data on that. That’s one of the things as regulators we want to rectify,” he said. With data from the supplement, “we’ll no longer have a 30,000-foot view of the cyber liability market.” Instead, regulators will have “a treetop granular level understanding of what this market looks like,” Hamm said. (See also related video interview here.)

Describing the content of the supplement, he said that every time cyber insurance carriers file their annual financial statements at the end of March, they’ll have to disclose premiums and losses related to cyber insurance policies. “We’re going to take all of that data and analyze and scrub it and aggregate it and put it out in the public forum. And I can tell you that the work I do on cyber with my federal colleagues, the No. 1 issue they want to know about is what this cyber liability [insurance] market really looks like on a granular level. So, once that data becomes available, it’s going to be literally a fight…to get to that data,” he said.

Systemic Risk

In addition to leading the NAIC Cybersecurity Task Force, Hamm participates on the Financial and Banking Information Infrastructure Committee (FBIIC), a national cybersecurity committee chaired by the U.S. Treasury, and the Cybersecurity Forum for Independent and Executive Branch Regulators, which is chaired by the Nuclear Regulatory Commission. He also represents state insurance regulation as a non-voting member of the Financial Stability Oversight Council—the body created by the Dodd-Frank Act that designates systemically important financial institutions.

Speaking at a different conference, the Standard & Poor’s Annual Insurance Conference last June, Hamm talked about the efforts of the three groups to share information related to cybersecurity threats experienced by the firms they regulate—”seamlessly and on a real-time basis”—and to try to shrink the trust gaps between the public and private sectors over sharing threat information.

Singling out FSOC at one point during his remarks at the S&P meeting, Hamm noted that the Council’s authority under Dodd-Frank extends beyond identifying individual entities that represent a systemic risk to the greater economy and labeling them SIFIs. FSOC can also “analyze ‘activities’ as systemic risks to the system,” he said. “So it’s possible that you could get a majority of FSOC members—the voting members on FSOC—that could analyze and look at this issue and conclude that cybersecurity represents an activity that’s a systemic risk to the system and then potentially take action,” he suggested.

Back at the P/C Joint Industry Forum, Hamm spoke mainly about state insurance regulators and the NAIC’s accomplishments in launching initiatives aimed at protecting consumer information from potential breaches at insurance companies. But the word “systemic” entered the discussion again. This time, it was Hemant Shah, chief executive officer of RMS, who used the word to highlight the idea that insurers writing cyber coverage need to understand cyber risk accumulations.

The growth of the Internet of Things, Shah said, is driving a shift from covering assets at risk to systems at risk. In other words, “there are more systemic risks,” he stated.

Still, Shah repeatedly urged insurers to embrace the idea of covering IoT risks, noting that while claim severity may rise, claim frequencies will likely decline. (See related article, “Is Insurance the Economic Engine for the Internet of Things?”)

“Most of the cover that’s associated with the roughly $2 billion of premium…mentioned [earlier] provides [for] a shadow of the underlying exposure….

“We hear figures like $7 billion [and it sounds like] that’s a lot of growth. But what does a $20 billion market look like?” Shah asked. “What does a $100 billion market look like? After all, the modern economy is driven by digital activity and increasingly those risks are digital in nature,” he said.

Understanding correlation and risk accumulations will be required to go after those bigger numbers, Shah suggested.

Giving a modeler’s perspective, he drew an analogy to earthquake risk. Insurers can write earthquake business in San Francisco and in Tokyo, and be pretty confident that they won’t experience losses in both places at the same time in the same occurrence, he said. That’s not true for cyber.
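
A toy Monte Carlo sketch makes the diversification point concrete. Everything below is invented for illustration; the event probabilities and limits are arbitrary assumptions, not anything drawn from an actual RMS model:

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, p_event, limit = 100_000, 0.01, 100.0  # illustrative assumptions

# Independent perils: San Francisco and Tokyo earthquake books, each
# suffering a full-limit loss in 1 percent of simulated years.
quake_sf = rng.random(n_years) < p_event
quake_tokyo = rng.random(n_years) < p_event
indep_loss = limit * quake_sf + limit * quake_tokyo

# Correlated cyber: one event (say, a shared software flaw or cloud
# outage) hits both books in the same single occurrence.
cyber = rng.random(n_years) < p_event
corr_loss = 2 * limit * cyber

for name, loss in [("independent quakes", indep_loss),
                   ("correlated cyber", corr_loss)]:
    both_hit = np.mean(loss == 2 * limit)
    print(f"{name}: mean annual loss {loss.mean():.2f}, "
          f"P(both limits exhausted in one year) {both_hit:.4%}")
```

Both books carry the same expected loss, but the chance of both limits going in a single occurrence is roughly a hundred times higher in the correlated case (about 1 percent versus 0.01 percent). That gap is the accumulation problem Shah is describing: capital that looks diversified by geography may not be diversified at all in the cyber correlation space.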

“How do we think about the contours of the correlation space in a way that can allow us as an industry to allocate capital with some confidence that we can diversify it, and that all of our covers, which could be significantly massive when you consider the underlying economic exposure of these enterprises, are not going to go in the same single event occurrence or incident?” he asked.

What is “fundamental to understand is not how do we price the product. We can learn how to do that over time,” he said, suggesting instead that understanding cyber risk accumulation is the “critical technical and analytical challenge” to tackle so that insurers can deploy the amount of capital that the global economy is going to require.

Moving Fast and Slow: How the Industry is Responding to Technology

As the session moved forward, panelists answered questions from Chamness about the pace of technological change and the pace of industry and disrupter responses. Hamm described an environment that has regulators moving much faster than usual, while Shah suggested that insurers aren’t moving quite fast enough.

But they are moving. “I have been working with insurers for 26 years now.… Relative to what I have seen in the past, my view is that the pace of change is pretty unprecedented,” Shah said, citing the large number of thoughtful conversations he’s now involved in with insurers about subjects like machine learning, remote sensing, drones, telematics, big data and analytics.

“I’m in Silicon Valley, and many insurers and reinsurers now have operations going native in the Valley, investing and exploring collaborations with startups.”

So why doesn’t he think it’s fast enough?

“The industry, in my view, spends too much time thinking about technology as a kind of weapon,” Shah said. “We have to master new techniques and technologies to ward off these competitors or to operate more efficiently.”

A more profound assessment would align with his own: “The economy is fundamentally different. The pace of change is increasing. What consumers and corporations care about is changing at a breakneck pace. [Therefore,] more effort needs to be spent internalizing technology not as a technique to do what we do today somewhat better, but [instead asking] how can we think fundamentally differently about the nature of exposure and what needs to be covered? And how do we craft products that are relevant” to a rapidly changing world?

“The industry needs to spend more time asking this fundamental question rather than how many data scientists do we hire to better price the products we already offer?”

Model Law Coming

All the data that carriers are hiring data scientists to sift through has Hamm’s attention. He sounded several alarms to carrier representatives at the conference about the need to protect it.

“Here’s the reality. There’s a bullseye on the insurance industry. Everyone knows about the huge breaches in the insurance sector in 2015,” he said, referring to the Anthem and Premera Blue Cross breaches, among others.

“That’s not going to stop. Bad guys know that insurers have mountains of PII. For them, it’s almost a one-stop shop.

“You can break into the insurance company, and you can either get into that data or, best-case scenario, ‘exfiltrate’ that data. It’s the mother lode.”

During an interview with Carrier Management after the session, Hamm predicted that the battle against insurance industry breaches could continue for five to 10 years before “we even start to get over the hump” to a more manageable situation. “If we don’t get [our] arms around cyber [attacks], this could easily be the thing that kills the insurance industry,” he said, noting that consumers, who aren’t enamored with the idea of buying insurance as it is, won’t buy anything but mandatory coverage if more of them become victims of insurance company data breaches.


That’s the kind of thinking that spurred the NAIC Cybersecurity Task Force into action in 2015, he said, noting that the group wrapped up four initiatives—first putting out a set of guiding principles, and then adopting the annual statement supplement, a Consumer Bill of Rights and updated examiners’ protocols. (Hamm described these in more detail in a video interview with Carrier Management available here.)

“In the eight-and-a-half years I’ve been doing this, I have never seen an NAIC committee tackle that many issues in one year, let alone get them all done,” he said during the Forum session.

“Now that we’re rolling into 2016, you’re going to see a model law passed. We’re going to take everything that was in that Consumer Bill of Rights/Roadmap on cybersecurity protection—all the substance of that is going to go into a model law with an accreditation requirement on it.” That means that in a few years “all of these things will be in state law.”

Data Explosion

Another panelist at the Forum session, Michael Pritula, director of the Global Insurance Practice of McKinsey & Company, wondered about a regulator’s reaction to another aspect of what he termed an “explosion of data” at carriers. He noted that carriers McKinsey works with aren’t relying only on “structured internal data” residing on their servers for underwriting, pricing, fraud detection and other activities. They’re gaining access to unstructured external data, including data streams coming off of IoT devices and social media feeds.

Suggesting that regulators would surely be supportive of carrier fraud detection efforts that “scooped up this unstructured external data,” he asked Hamm about underwriting and pricing efforts that also use data outside of policy information and loss history. Carriers are relying on “machine learning,” he said. They are moving away from linear regression into “very complex models where you can’t really identify what was the cause of saying something was potentially fraudulent or a good risk or bad risk.” With carriers accessing mountains of data outside their organizations, they can’t “point to the variable that caused us to say the price needs to be 20 percent higher.”
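
Pritula’s contrast can be sketched in a few lines. The data below are simulated, and the scikit-learn models stand in for whatever carriers actually deploy; the only point is that a linear model yields a per-variable coefficient a regulator can inspect, while a tree ensemble does not:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)

# Hypothetical rating data: three variables plus a nonlinear interaction,
# the kind of structure external/IoT data streams tend to introduce.
X = rng.random((5_000, 3))
y = 100 + 20 * X[:, 0] + 5 * X[:, 1] + 30 * X[:, 0] * X[:, 2] \
    + rng.normal(0, 1, 5_000)

# Linear regression: each coefficient is a direct answer to "which
# variable moved the price, and by how much."
lin = LinearRegression().fit(X, y)
print("linear coefficients:", np.round(lin.coef_, 2))

# Gradient boosting: typically a better fit, but the model is a sum of
# 100 trees, so there is no single coefficient to point to. Feature
# importances are a post-hoc summary, not a causal explanation.
gbm = GradientBoostingRegressor(random_state=0).fit(X, y)
print("boosted-tree feature importances:",
      np.round(gbm.feature_importances_, 2))
```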

How do regulators deal with this? Pritula wanted to know.

“How do regulators assess the gut feel of computers?” Shah asked, restating the question.

Hamm replied that regulators are on a learning curve along with the industry. “You’re going to see all of us trying to wrap our minds around what [data] is being used, what it is being used for. [But] you’re going to see that we’re open to the idea that some of this has a [bigger] purpose, some of it has value—not just to the industry but also to the consumer as well.

“It’s not something that as regulators we should want to put a stop to. But if it in any way, shape or form increases the risk to consumers that we protect…then you’re going to see regulators reluctant to go along with it.”

The bottom line: This is going to be one of those areas where regulators and insurers “are going to have to be at the table together a lot so that there’s a complete understanding. Because if there’s not, you and I know full well that the regulators’ first reaction is going to be, eh, no.” The more open dialogue there is, the better, he said.

Pritula suggested that issues of disparate impact are only going to become more complex, and another panelist, Brian Sullivan, editor for Risk Information, Inc., agreed.

“The industry is going to have to learn to fight two battles. One is to prove that deep machine learning is accurate; the second is to prove that it’s not damaging to some segment of the population that we want to protect. And that is really hard to do,” Sullivan said. Not only is it hard, but “as an industry we’re really really bad…at doing that part,” he said. “I think I can add a couple more reallys in there.”
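
One simple version of Sullivan’s second test is a group-level rate comparison. The sketch below uses invented premiums and an invented group flag, and real disparate-impact analysis is considerably more involved, but the basic shape of the check is the same:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical modeled premiums and a protected-class flag. In practice
# the protected attribute is usually not a rating variable, so the test
# has to join data from outside the pricing model itself.
premiums = rng.normal(1_000, 150, 10_000)
group = rng.integers(0, 2, 10_000)

means = [premiums[group == g].mean() for g in (0, 1)]
ratio = max(means) / min(means)
print(f"group mean premiums: {means[0]:.0f} vs {means[1]:.0f}; "
      f"ratio {ratio:.3f}")
# A ratio persistently above 1.0 (after controlling for legitimate risk
# factors, which this toy example does not do) is the kind of evidence
# of disparate impact that Sullivan says is hard to rule in or out.
```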
