The Internet of Things (IoT) is changing the nature of risk assessment in real time. As more and more devices begin transmitting information about their locations, movements and patterns of use, a wealth of data is accumulating about the environmental and human factors that contribute to the likelihood of a loss event taking place. It's an embarrassment of riches from an actuarial standpoint, but the data is proliferating so rapidly, and from so many sources, that one could be forgiven for feeling overwhelmed by the challenge of putting it all to good use.
Perhaps that is why, according to a 2016 EY survey of C-level executives across multiple industries, only 36% of respondents from the insurance industry claimed that their companies could use insights from new data sources to boost customer value. Of the seven industries EY surveyed, the insurance sector ranked dead last on this question. How can that be? Surely the ability to use real information rather than calculated probability to determine premiums, prevent fraud and settle claims faster must be an advantage.
Using data to manage risks
So why doesn't the insurance industry see this as an opportunity? One of the biggest reasons is the sheer variety of data sources. IoT data comes to the world of insurance in myriad forms – from driving habits revealed by onboard diagnostic systems to utility consumption patterns gathered by “smart” thermostats and meters. All of it is potentially valuable, but it comes in disparate formats and from different providers.
Making sense of it and extracting real value requires a solution that can standardize and rationalize the data, leading EY to conclude that “legacy system limitations and the variety and volume of new data requires an overall ecosystem approach” rather than an attempt to make sense of it all with one devilishly overcomplicated system.
For the successful insurer of tomorrow, this will demand the ability to distill that high-volume raw data into meaningful nuggets, such as scores, factors and indicators. Core systems can then integrate and use data from an ever-broadening variety of sources in a centralized system capable of translating it all into product development, underwriting and claims value.
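As a rough illustration of what "distilling raw data into scores" might look like, the sketch below normalizes records from two hypothetical IoT sources (an onboard diagnostics feed and a smart thermostat) into a common risk-signal format. All field names, schemas and weights here are invented for illustration; real actuarial scoring models would be far more sophisticated.

```python
from dataclasses import dataclass

# Hypothetical unified record; field names are illustrative,
# not an actual carrier schema.
@dataclass
class RiskSignal:
    source: str
    policy_id: str
    score: float  # normalized 0.0 (low risk) to 1.0 (high risk)

def from_obd(raw: dict) -> RiskSignal:
    """Distill raw onboard-diagnostics data into a driving-risk score.
    The weights are made-up placeholders, not actuarially derived."""
    hard_brakes = raw["hard_brakes_per_100mi"]
    night_pct = raw["night_driving_pct"]
    score = min(1.0, 0.05 * hard_brakes + 0.5 * night_pct)
    return RiskSignal("obd", raw["policy_id"], round(score, 3))

def from_thermostat(raw: dict) -> RiskSignal:
    """Distill smart-thermostat readings into a freeze-risk score.
    Note the different key convention (camelCase) from this source."""
    vacant_days = raw["days_below_50F"]
    score = min(1.0, vacant_days / 30.0)
    return RiskSignal("thermostat", raw["policyId"], round(score, 3))

# Disparate formats from different providers, rationalized into one shape.
signals = [
    from_obd({"policy_id": "P-100",
              "hard_brakes_per_100mi": 4,
              "night_driving_pct": 0.2}),
    from_thermostat({"policyId": "P-100", "days_below_50F": 6}),
]
for s in signals:
    print(s.source, s.score)
```

The point of the pattern is that each source keeps its own adapter, while downstream underwriting and claims systems consume only the standardized signal.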
Legacy systems with outdated models based on historical data and risk probability simply can't keep up – the world of “the new insurance” will be supported by flexible, modern platforms that can take data from as many sources as necessary, in as many formats as are required, and allow carriers to put the valuable information they extrapolate from the bigger picture into action.
The carriers that don't see this as an opportunity to plan for the future now won't just find themselves drowning in data — they'll be washed away completely.
Jeff Wargin is vice president product management – policy and platform at Duck Creek Technologies. Contact him at [email protected].