Alternative data and new tech may help insurers 'live for risk'
As insurers collect more granular data while improving analytical tools, they should be better able to anticipate and help mitigate major exposures.
Nearly a decade ago, I was tasked with developing a catchphrase that captured the essence of insurance for the benefit of young people considering education and career options in the field.
My suggestion — “Insurers live for risk” — rings as true today as it did back then.
I envisioned an entire ad campaign showing insurers on the front lines of risk management, tackling not only meat-and-potatoes exposures in standard personal lines and business coverages, but also emerging challenges — particularly in cutting-edge industries and technologies.
Beyond serving as an ad slogan and recruitment tool, I felt that “Insurers live for risk” accurately communicated the industry’s core mission and value proposition — to help transfer and share the exposures faced by individual policyholders and society, so that everyone could get all the coverage they need for relatively little in premiums.
An elusive credo
Unfortunately, nothing tangible came from this effort. The industry somehow manages to soldier on without an iconic catchphrase to entice the next generation of underwriters, actuaries, agents and claims adjusters. Yet I do think back on my slogan from time to time, particularly when insurers don’t quite live up to the credo I ascribed to them, at least not in the eyes of the public.
Despite all the talk about excess capacity in the property & casualty insurance sector, there are multiple examples of how insurers have stepped back from problematic risks due to the likelihood of high frequency or high severity losses, or both. This tends to raise the ire of consumers, regulators and legislators. It’s why private carriers still only write 14% of flood insurance premiums, and why homeowners’ insurers in California have been pulling out of or raising prices in wildfire-prone communities.
It’s also why only a handful of carriers are writing coverage for the booming cannabis industry, which has been legalized in a growing number of states but remains technically verboten under federal law. It’s likely one of the reasons insurers have been cautious with limits and exclusions when dipping their toes into the deepening cyber insurance risk pool. And it’s why a growing number of scholastic and amateur football leagues may have to stop playing, with coverage for traumatic head injuries becoming scarce.
This isn’t to suggest insurers are the bad guys. While they do indeed live for risk, that doesn’t mean they can afford to sell homeowners coverage to someone whose house is on fire, auto coverage for a car that has just been totaled in an accident, or life insurance to an applicant who has already passed on. These are extreme examples, but the underlying message is that insurers cannot take on any risk at any price. Insurers must be willing to walk away from underpriced or overexposed risks if they want to remain in business.
New data may embolden insurers
However, thanks to the growing array of data collection options (primarily sensor-driven technologies), the expanding availability of alternative data sources (from drones to monitoring of online activity), and the increasing sophistication of data analysis (via advanced analytics and predictive modeling), insurers may yet become bolder when faced with hard-to-place risks.
Take flood insurance, where “insurers have become increasingly comfortable using sophisticated models,” according to the Insurance Information Institute. In 2017, the last full year for which data is available, direct premiums written for private flood insurance were up 57% from the prior year to $589 million (compared to the $3.57 billion written by the federal government’s National Flood Insurance Program), while the number of carriers writing the coverage rose from 20 to 33.
In some cases, such as cyber risk, the relative lack of data for an evolving exposure is one factor discouraging insurers from more aggressively pursuing what should be the industry’s biggest potential organic growth opportunity. Yet as more data is collected and analyzed about the risk and the likely impact of various mitigation methods, carriers should become more confident writing cyber exposures — raising limits and/or lowering prices, for example, for policyholders with more proven cybersecurity programs.
Gaps in understanding
No matter how rational the explanation, when consumers are unable to get affordable coverage (or coverage at any price), it doesn’t do the industry’s reputation any good with regulators or the public. You can attribute a good part of that negative reaction to a general lack of financial literacy, since most “civilians” (those outside of insurance) I’ve spoken with over the years don’t understand the fundamental reality that insurance simply isn’t designed to transfer risk where serious damage is likely to occur on a regular basis, or where data on an exposure is lacking. But whose fault is that? Certainly not the consumer’s.
More effective communication about how the industry works (as well as how insurance benefits individuals and society) could perhaps take some of the heat off carriers avoiding problematic liabilities and industries. So could more proactive public service messaging calling attention to what policyholders and society at large can do to make difficult exposures insurable—such as being more careful about building in known catastrophe zones, upgrading and enforcing building codes, and encouraging sustainability efforts.
Living up to the catchphrase
Just because insurers do indeed “live for risk” doesn’t mean they are looking to jump out of a plane without a parachute and hope for the best. (Actually, that would be a powerful and even funny image for a public service ad trying to explain how insurance works!) Instead, insurers strive to understand underlying exposures and make sure reasonable mitigation steps are implemented before taking on a challenging liability at an actuarially sound price.
As time goes on and insurers collect more (and more granular) data while refining the tools to analyze it, they should be better able to anticipate and help mitigate an expanding array of emerging and legacy exposures. They’ll also be better able to live up to the catchphrase I coined for them, to the mutual benefit of the industry and consumers alike.
Former National Underwriter Editor in Chief Sam J. Friedman (samfriedman@deloitte.com) is now insurance research leader with Deloitte’s Center for Financial Services in New York. Follow Sam on Twitter at @SamOnInsurance, as well as on LinkedIn. These opinions are his own.