The COVID-19 crisis has highlighted the need for operating-model change at insurance carriers. Many leading carriers are using data and analytics to become faster, leaner and more effective. One of the critical roadblocks to efficiency has been the legacy systems that form the backbone of actuarial and underwriting operations. Insurers increasingly operate in an environment where they need continuous access to insights drawn from structured and unstructured data sources. Agility is a prerequisite in the era of InsurTechs, and accelerating regulatory change calls for a sound analytics information architecture across the insurance enterprise. Actuaries and data analysts at insurance carriers are often bogged down by manually intensive processes, a lack of embedded data management and controls, and data inconsistencies across lines of business. The pressing question many insurance leaders are now grappling with is how to streamline the data sourcing and management processes that convert raw data into insights.
Common approach to insurance analytics
Over the last few years, we have seen insurance carriers gathering data from various sources. The new data streams include unstructured data from external sources such as social media, telematics, and online shopping behavior. Insurers have addressed the extraction, transformation, and loading (ETL) challenges of this type of data. They blend it with policy-level data on claims, premiums, broker information, risk profiles and insured details to get rich insights on the customer. A commonly used approach involves multiple business units across the enterprise pulling the required data through a centralized staging area from the corporate data warehouse for their respective analytics modeling activities. The data pulled from the centralized staging table may or may not undergo further cleaning prior to modeling. The first problem with this approach is that, in a typical model development process, almost half of a data scientist's time goes into data processing and setting the base for the planned model. The second problem is that data needs vary from one business unit to another, and pulling additional data points from the warehouse beyond those present in the staging table becomes chaotic. In addition to the wasted effort, it burdens the network and generates multiple copies of similar data, leading to numerous versions of conflicting truth rather than reliable insights.
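To make the blending step concrete, here is a minimal sketch in Python. The policy fields and telematics metrics are invented for illustration and do not come from any specific carrier's schema; the point is only that external behavioral data is left-joined onto policy-level records and must tolerate gaps.

```python
# Hypothetical sketch: blending policy-level data with an external
# telematics feed keyed on policy number. All field names are invented.

policies = [
    {"policy_id": "P-1001", "premium": 1200.0, "claims": 1},
    {"policy_id": "P-1002", "premium": 950.0, "claims": 0},
]

telematics = {
    "P-1001": {"avg_speed_mph": 41.2, "hard_brakes_per_100mi": 3.1},
    # P-1002 has no telematics record; the blend must tolerate gaps.
}

def blend(policies, telematics):
    """Left-join external behavioral data onto policy records."""
    enriched = []
    for p in policies:
        row = dict(p)  # copy so the source records stay untouched
        row.update(telematics.get(p["policy_id"], {}))
        enriched.append(row)
    return enriched

blended = blend(policies, telematics)
```

In practice this join would run inside the warehouse or an ETL tool rather than in application code, but the left-join semantics are the same.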
Data factory framework
Some progressive insurers have carefully reconsidered the existing approaches to underwriting and actuarial data needs. Driven by the need for agile methods and the digital transformation mandate, they follow the "data factory framework," which has several advantages over the usual approaches. Under the data factory approach, each business unit in an insurance enterprise extracts the data sets it needs directly from the corporate warehouse and third-party sources. These extractions undergo data engineering by engineers dedicated to this service and serve as input for a proactive analytics data mart. The data engineering tasks vary depending on the business need and primarily involve the following operations:
- Data reconciliation;
- Data standardization;
- Data normalization;
- Data linkages;
- Data integration;
- Mining unstructured data;
- Creating automation schedules; and
- Writing stored procedures.
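Three of the operations above — standardization, normalization, and reconciliation — can be sketched in a few lines of Python. The column semantics, alias table, and tolerance are illustrative assumptions, not any carrier's actual rules.

```python
# Hypothetical sketches of three data engineering steps.

def standardize_state(value):
    """Data standardization: map free-text state entries to one code."""
    aliases = {"new york": "NY", "ny": "NY", "n.y.": "NY"}
    return aliases.get(value.strip().lower(), value.strip().upper())

def min_max_normalize(values):
    """Data normalization: rescale a numeric column to [0, 1]."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def reconcile_totals(source_total, mart_total, tolerance=0.01):
    """Data reconciliation: check that the mart's premium total
    matches the source system's within a small tolerance."""
    return abs(source_total - mart_total) <= tolerance
```

In a real data factory these steps would be encoded in stored procedures or pipeline jobs on an automation schedule, as the list above indicates; the logic, however, is this simple at its core.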
Such a proactive analytics data mart serves as a precursor to analytical tasks such as making dashboards, generating reports, performing ad hoc analysis, and advanced modeling activities. The benefits of the data factory approach to insurance analytics lie in the fact that it helps to:
- Reduce internal analytics development costs;
- Accelerate analytics speed to market;
- Realize development and monitoring process efficiencies;
- Promote data-driven decision making;
- Accelerate analytics implementation;
- Enhance data quality and consistency; and
- Enable power users to drill down on their own.
We have witnessed insurance carriers (P&C and L&A) utilizing the data factory framework to align numerous distributed data sources. On the L&A side in particular, recent efforts to modernize the reporting framework with the help of the data factory framework have been successful. The challenges or problem areas cited by carriers include:
- Complex reporting infrastructure with fragmented data leading to manually intensive reporting;
- Lack of systemic support for data visualizations and interactive reporting; and
- Lack of self-service reporting & analytics capability.
A data factory approach (a slight variation of the one described above) has been an excellent remedy for the aforementioned issues. The preparatory steps involve an in-depth assessment of all the necessary data sources, reports and KPIs. Analysis to identify gaps in the current reporting systems and processes is a prerequisite to enable reporting transformation with enhanced drill-down capabilities and interactive views. Some insurers have also used data factory concepts to automate the entire process of pricing a policy and reduce manual intervention by collating data from different sources such as rating services, insurance financial history, indicators, longevity flags, inferential historical flags, exception directories, and new business/renewals. Engineering steps on the collated data help to arrive at the base model, or model bench data, for the respective business units. From this point onward, different advanced analytics activities are carried out, such as clickstream analysis, geospatial analytics and classification algorithms. Additionally, insurers can build refresh capabilities to retrain and rebuild existing models, followed by monitoring and comparison routines to observe the trends and business feeds the models generate. This kind of automation has helped carriers achieve both efficiency and effectiveness in their data operations.
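The monitoring-and-comparison routine mentioned above can be as simple as tracking a model metric against a stored baseline and flagging drift beyond a threshold. The metric (AUC) and 5% threshold below are illustrative assumptions, not a prescribed standard.

```python
# Hypothetical monitoring routine: flag when a model's current metric
# drifts from its stored baseline by more than a relative threshold.

def check_drift(baseline, current, threshold=0.05):
    """Return True if the relative change in the metric exceeds
    the threshold, signaling that a retrain may be warranted."""
    if baseline == 0:
        return current != 0
    return abs(current - baseline) / abs(baseline) > threshold

# e.g. a pricing model's AUC tracked month over month
baseline_auc = 0.82
assert check_drift(baseline_auc, 0.81) is False  # within tolerance
assert check_drift(baseline_auc, 0.70) is True   # retrain trigger
```

A production routine would persist these baselines per model and run on the same automation schedules the data factory already maintains.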
The big picture
As data and analytics transform the insurance industry at an increasing pace, centralizing and processing data puts insurance carriers in a powerful position. A data factory is an effective and user-friendly framework for reporting, analysis and modeling tasks. It changes the nature of data engineering work within the insurance organization and brings efficiencies to actuarial functions such as risk assessment, valuation and pricing. Doing away with the usual analytics approach and adopting smart analytics data marts also enables a sandbox environment in which different business functions can sustainably experiment with data and business processes. This approach can propel insurers toward lean development, help them negotiate infrastructural complexities, and provide a self-service environment for various reporting and analytical requirements.
Dheeraj Pandey ([email protected]) is an engagement manager at EXL Service, a provider of data analytics solutions to financial organizations including P&C Insurance firms.
Dr. Upendra Belhe ([email protected]) is the president of Belhe Analytics Advisory, which helps businesses achieve business outcomes through data-driven insights. He serves as a strategic advisor to EXL Service.
© 2024 ALM Global, LLC, All Rights Reserved. Request academic re-use from www.copyright.com. All other uses, submit a request to [email protected]. For more information visit Asset & Logo Licensing.