The value of analytics in the insurance industry is a point that no longer needs to be debated, but initiating an analytics program is by no means easy because the first step involves getting control of your data. We asked four leading insurance industry analysts the following question:

What are the important steps insurers need to take to improve the quality of their data in preparation for the installation of analytical tools?


Craig Beattie, Senior Analyst, Insurance, Celent

Improving the quality of data after it has been captured has always been a challenge. The first step is to improve quality at the point of capture. This can mean making it easy for customers to enter the right data, applying appropriate validation rules, and incentivizing staff to capture data accurately.

There are possibilities for improving existing data as well. A simple example is to take address data and use a tool to geo-code the address. This can highlight missing data and bad addresses, which can then be dealt with through exception processing. Tools that check for valid names of individuals are another good example, a side benefit of the rise of credit scoring in some countries.
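The geo-coding step above can be sketched as a simple partition: records that resolve go forward, records that do not go to an exception queue. The `geocode` function here is a stand-in for whatever geo-coding service an insurer actually uses; it is stubbed with a tiny lookup so the sketch is self-contained.

```python
def geocode(address: str):
    """Hypothetical geo-coder: returns (lat, lon), or None if unresolvable."""
    known = {"10 Main St, Hartford, CT": (41.76, -72.67)}  # stand-in lookup
    return known.get(address)

def partition_addresses(addresses):
    """Split addresses into geo-coded records and an exception queue."""
    coded, exceptions = [], []
    for addr in addresses:
        coords = geocode(addr)
        if coords is None:
            exceptions.append(addr)  # missing or bad address: manual review
        else:
            coded.append((addr, coords))
    return coded, exceptions

coded, exceptions = partition_addresses(
    ["10 Main St, Hartford, CT", "123 Nowhere Rd, , ZZ"]
)
```

In practice the exception queue would feed the kind of exception processing described above rather than simply sitting in a list.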

The third step is to tap into those looking after the system and capture the rules about the data that sit in people's heads, such as “before 1983 a blank in that field meant no, after 1983 we started using N and blank meant not answered.” This kind of corporate memory allows companies to operate well, but someone needs to let the analytics tool in on the secret.
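Letting the analytics tool in on the secret means writing the rule down as code or metadata rather than leaving it in someone's head. A minimal sketch, using the article's own example (field and value names are illustrative, not from any real system):

```python
def decode_flag(raw: str, policy_year: int) -> str:
    """Decode a legacy yes/no field whose meaning changed in 1983.

    Before 1983, a blank meant "no". From 1983 on, "N" means "no"
    and a blank means "not answered".
    """
    value = raw.strip().upper()
    if policy_year < 1983:
        return "no" if value == "" else "yes"
    if value == "":
        return "not answered"
    return "no" if value == "N" else "yes"

# The same raw blank decodes differently depending on the era:
decode_flag("", 1980)   # "no"
decode_flag("", 1990)   # "not answered"
```

Once the rule is explicit, it can be applied consistently during extraction instead of being rediscovered every time the data is analyzed.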

Karen Pauli, Research Director, Insurance, CEB TowerGroup

Among our customers, the best outcomes started with appointing an executive to head up data governance. One person has to have ultimate authority for data quality and data governance at a corporate level or it becomes an endless team exercise in turf protection and perfectionism.

A second point of consideration is understanding how core systems modernization aligns with data goals. If legacy data inputs are broken, then governance will not help. Our customers that brought core systems modernization into the plan have been more successful. This cannot be a roadblock but must be a concurrent initiative stream.

The third success factor is to establish a partnership with a technology provider with strong, if not sole, focus on data quality. Partners with experience helping other insurers with data initiatives can shorten time-to-value. Technology partners that bring best practices are valuable. The “to do's” are as important as the “don't do's.” Experience matters.

The final important point is to not boil the ocean. Having a short-term target for business value makes a difference. Knowing which business problem analytics adoption is meant to solve, then targeting the data initiative to support it, is a winning strategy.

An additional point is working with an analytics provider with strong data competencies. They can help shorten the value timeline.

Martina Conlon, Principal, Insurance, Novarica

A critical step in avoiding the old “garbage in, garbage out” dilemma in a new analytics environment is to conduct a proactive source system data assessment. Export data from core systems into temporary files or databases of any structure and use a basic statistical tool, or even SQL, to assess the data values.

For numeric fields, calculate minimum values, maximum values, sums, and averages. For code values, calculate frequency distributions. For one-to-many relationships, calculate typical relationship ratios, such as the number of locations per policy.

Review results with the business to validate that they make sense. Are there policies with negative annual premiums? Are there unexpectedly high numbers of NULL values? Do an unusually high number of drivers have the birth date 01/01/01? Are there property policies with no locations?
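The assessment described above can be sketched in a few lines of Python over an exported extract. The records and field names (premium, state, birth date, locations) are illustrative, not from any real core system:

```python
from collections import Counter
from statistics import mean

# A toy extract standing in for data exported from a core system.
policies = [
    {"premium": 1200.0, "state": "CT", "birth_date": "1975-06-14", "locations": 2},
    {"premium": -50.0,  "state": "CT", "birth_date": "01/01/01",   "locations": 1},
    {"premium": 980.0,  "state": "NY", "birth_date": "01/01/01",   "locations": 0},
]

# Profile numeric fields, code values, and a one-to-many ratio.
premiums = [p["premium"] for p in policies]
profile = {
    "premium_min": min(premiums),
    "premium_max": max(premiums),
    "premium_sum": sum(premiums),
    "premium_avg": mean(premiums),
    "state_freq": Counter(p["state"] for p in policies),
    "locations_per_policy": sum(p["locations"] for p in policies) / len(policies),
}

# Exception checks mirroring the review questions in the text.
negative_premiums = [p for p in policies if p["premium"] < 0]
placeholder_dobs  = [p for p in policies if p["birth_date"] == "01/01/01"]
no_locations      = [p for p in policies if p["locations"] == 0]
```

The same checks translate directly into SQL (`MIN`, `MAX`, `SUM`, `AVG`, `GROUP BY`) if the extract lands in a staging database instead.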

If possible, use the data assessment features of modern data quality tools to validate the formats and contents of text fields. Use this information to build a data profile, and determine how you will address any specific data issues uncovered (in the source system, or during the transformation).

This effort typically requires only a few resources and can be time-boxed into weeks, not months. It is well worth the investment to clean up the garbage early and deliver an analytics environment that can be used with confidence.

Bill Jenkins, Managing Partner, Agile Insurance Analytics

Insurers have spent millions of dollars on data integration, master data management, and data warehousing initiatives, only to be disappointed in the results returned from these investments. A main cause has been organizations' inability to properly manage the underlying data in these solutions, so that results are credible, reliable, and timely. In effect, the lack of high-quality data has been the leading culprit behind these unfulfilled expectations.

As business intelligence and analytics usage continues to expand within the industry, and high-quality data becomes imperative across internal, external, structured, unstructured, big, and social media data, organizations need to address their data quality issues head-on. Bad data can result in flawed management information and analytic results.

Effective data quality management comprises:

  • Development of a data governance organization and processes. Data quality management is a partnership between the respective business areas and IT. Formal roles, responsibilities, policies, and procedures concerning the acquisition, maintenance, dissemination, and disposition of data are created and monitored. Most importantly, data governance is about accountability for each of these roles. Accountability means enforcement, such as adding related metrics to annual performance goals.
  • Build awareness, educate stakeholders, and develop a value proposition for the organization. Develop a position paper pointing out where poor data has impacted processing, reporting, and operations within the organization. If economic impacts can be attached to these situations, so much the better. Obtain an influential business champion to support the data quality strategy.
  • Obtain needed data quality tools/solutions to identify and address data quality problems.
  • Measure/benchmark the quality of data to be used in warehousing, business intelligence, and analytics solutions.
  • Use these measures to identify, prioritize, and attack the data quality problems. Apply Pareto analysis: typically, 20 percent of the problem types account for 80 percent of the error universe.
  • Develop scorecards of the errors detected and assign the error types to the appropriate business unit/representative for correcting them. As mentioned, scorecards can be inserted into the applicable individual's or department's operational performance objectives.
  • Identify the causes of data problems and re-engineer the processes if need be. It is easier to fix a problem at the source than later in the process. But be practical: sometimes it is too expensive to fix an issue at the source, because the issue does not impact transaction quality and/or the fix would require customizing a proprietary business application at a cost that outweighs the value. In that case, do a standard cost-benefit analysis to determine the best solution.
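The Pareto step above can be sketched as ranking error types by volume and taking the smallest set that covers roughly 80 percent of detected errors. The error-type names and counts here are illustrative:

```python
def pareto_front(error_counts: dict, threshold: float = 0.8):
    """Return the highest-volume error types covering `threshold` of errors."""
    total = sum(error_counts.values())
    ranked = sorted(error_counts.items(), key=lambda kv: kv[1], reverse=True)
    selected, covered = [], 0
    for err_type, count in ranked:
        if covered / total >= threshold:
            break
        selected.append(err_type)
        covered += count
    return selected

# Hypothetical error counts from a data quality scorecard.
counts = {"bad_address": 500, "null_dob": 300, "neg_premium": 120,
          "bad_vin": 50, "misc": 30}
priority = pareto_front(counts)  # the few types worth attacking first
```

In this example two of the five error types account for 80 percent of the errors, which is exactly the prioritization the scorecards are meant to drive.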

The above is a continuous program: the more data an organization brings in to analyze, the more important a quality management strategy and program become.
