Big data technologies are no doubt recalibrating the insurance industry.

This expanded form of modeling is incredibly dynamic, drawing on an almost limitless range of new variables, assumptions and branched connections that will significantly improve, and may well revolutionize, how insurers measure risk.

Yes, this means that some will pay less and others more, but with the added chance that these costs will be offset through direct sales, which eliminate broker and underwriting expenses. Regardless, the result will arguably be more equitable, because everyone is charged a cost closer to their actual risk.

But this equity comes at a cost: individualized risk profiling that, if too prolific, could lead to excessive risk pool segmentation, adverse risk selection and new individualized risk assessments that could price out entire segments of policyholders.


Regulatory pathway needed

I believe the U.S. market is ready to realize the potential of these expansive technologies and networks, but a regulatory pathway needs to be established to safeguard adherence to our core insurance principles and an individual's right to privacy.

This new and exciting frontier of seemingly limitless data, cross-integrations and fluid networks brings a new approach to measuring and identifying risk that will ultimately transform our industry. Perhaps the most significant advancement is the development of policyholder behavioral risk profiles. This new risk measure brings a new degree of precision, but since behavior is often inconsistent and ever-changing, its success depends on a commitment to continuous refinement.

The power of these developing data technologies adds real value to sales, pricing, underwriting, claims and service administration, which all translates to expense reduction, fraud reduction, improved precision and accuracy, streamlined administration and time savings. But the current excitement dominating the discussion is in drastic need of sober regulation to move it forward.


Credible data

It is understood that all targeted data must be associative, authorized, secured and disclosed. Most importantly, every piece of public or non-public data targeted and collected must credibly correlate to the risk transferred. Using data that is credible (showing a strong probability or likelihood) and that carries quantifiably demonstrable experience correlating to the risk is an insurance gold standard, and one that cannot be compromised.
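To make that gold standard concrete, here is a minimal sketch of what a quantifiable correlation check might look like before a candidate rating variable is admitted into a model. The data, variable names and acceptance thresholds are illustrative assumptions, not regulatory or industry-mandated values.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(seed=42)

    # Illustrative synthetic data: one candidate rating variable for 1,000
    # policyholders and their observed annual claim costs.
    candidate_variable = rng.normal(loc=0.0, scale=1.0, size=1000)
    claim_costs = 500 + 120 * candidate_variable + rng.normal(0, 300, size=1000)

    # Pearson correlation between the candidate variable and loss experience.
    r, p_value = stats.pearsonr(candidate_variable, claim_costs)

    # Hypothetical acceptance criteria: the relationship must be both material
    # and statistically significant before the variable is used in rating.
    MIN_CORRELATION = 0.10  # assumed materiality threshold
    MAX_P_VALUE = 0.01      # assumed significance threshold

    credible = abs(r) >= MIN_CORRELATION and p_value <= MAX_P_VALUE
    print(f"correlation={r:.3f}, p-value={p_value:.2e}, credible={credible}")

A real admissibility test would of course go further (multivariate controls, credibility weighting, out-of-sample validation), but the principle is the same: no demonstrable, quantified link to loss experience, no use in rating.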

Here are five issues regulators must tackle:

1. "Disruptors" creating wide scale disintermediation and cherry picking risk or selecting against traditional carriers.

2. Conflicts created by collected data that cannot be quantifiably correlated to the assumed risk, or by data that maintains a quantifiable relationship to the risk but, because of its recent identification or linked associations, has no credible experience correlating to it. New multivariate data combinations raise the same concern and would need to be unwound and demonstrated.

3. The data that supports these behavioral risk profile measures is an ever-changing target that relies on accurate, updated and credible inputs. How long will these risk profiles be retained? What is their shelf life, and how long will they remain credible? What is the acceptable look-back period? Will they be continuously updated to track their volatility? Can ideal or adverse profiles be contested and corrected? Would the introduction of a formal appeals process be a reasonable consumer protection?

4. Some collected data relates to prohibited risk distinctions, or to potentially discriminatory distinctions, which would require validation.

5. This mile-high supply of new variable data sets and individualized custom risk profiling could lead to excessive risk segmentation, delivering runaway risk distinctions or classifications within risk pools. This, of course, runs counter to a basic precept of insurance, absorbing smaller incidental risks within larger risk pools (as the sketch after this list illustrates).
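As a toy illustration of why that precept matters (entirely synthetic numbers and an assumed segmentation scheme, not real rating data), the sketch below splits one community-rated pool into hundreds of micro-segments and shows how premiums diverge while each pool shrinks below the size needed to absorb volatility:

    import numpy as np

    rng = np.random.default_rng(seed=7)

    # Synthetic expected annual losses for 10,000 policyholders.
    true_risk = rng.gamma(shape=2.0, scale=250.0, size=10_000)

    # One community pool: everyone pays the same pooled rate.
    community_premium = true_risk.mean()

    # Excessive segmentation: rank policyholders by risk and price each of
    # 500 micro-segments on its own small, volatile average.
    segments = np.array_split(np.sort(true_risk), 500)
    segment_premiums = np.array([seg.mean() for seg in segments])

    print(f"community premium:         {community_premium:,.0f}")
    print(f"segment premium range:     {segment_premiums.min():,.0f} "
          f"to {segment_premiums.max():,.0f}")
    print(f"policyholders per segment: {true_risk.size // 500}")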

Arguably, any data that does not credibly correlate to the risk assumed should be off-limits for use.

Regulators must evaluate all of this data against state insurance standards, adhering to the fundamental principles of insurance that have proven timeless and have insulated the industry from some of its worst storms. The products we sell cover risk in the form of a contractual promise.

All data shared, collected, priced/modeled, underwritten and profiled must credibly correlate back to this risk. This is the foundation we must preserve and continue to build upon.

Robert Chester is an insurance examiner based in Hartford, Connecticut. Connect with him on LinkedIn.
