P-C Insurers Are Only As Good As The Quality Of Their Data

Since data is the lifeblood of the property-casualty business, more than a few insurers could find themselves in need of transfusions when poor data quality threatens their competitive advantage and bottom line.

The cost to U.S. p-c insurers of poor quality data is high. Measured in overpricing, underpricing, writing too many bad risks, regulatory fines or simply losing business from disgruntled customers, poor data can mean poor financial results.

A recent study by The Data Warehousing Institute, a Seattle-based research organization for the business intelligence and data warehousing industry, highlights the dimensions of the problem on a broader scale. Poor quality customer data costs U.S. businesses a stunning $611 billion a year, and this estimate doesn't include the cost of poor data used in financial, sales, decision support and other areas.

The importance of preventing errors in reporting, collecting and managing data for p-c insurance cannot be overstated. Consider this real-life example from the Data Warehousing Institute study of an insurance company that receives two million claims per month with 377 data elements per claim. Even with an accuracy rate of 99.9 percent, this carrier's claims data contains more than 754,000 errors per month and more than nine million errors per year.

If only 10 percent of the data elements are critical to the insurer's business decisions, that means the insurer must correct almost one million errors each year. At a very conservative $10 per error resulting from excessively high or low claims payouts (not to mention staff time to fix the errors downstream, as well as lost customer trust and loyalty), the company's exposure to poor quality data is $10 million a year.
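
The arithmetic behind this example is easy to reproduce. The following is a minimal sketch in Python using the figures quoted above; depending on rounding, the annual exposure lands in the $9 million to $10 million range.

    # Back-of-the-envelope arithmetic for the claims-data example above.
    claims_per_month = 2_000_000
    elements_per_claim = 377
    accuracy_rate = 0.999      # 99.9 percent of data elements are correct
    critical_share = 0.10      # share of elements critical to business decisions
    cost_per_error = 10        # conservative dollar cost per critical error

    elements_per_month = claims_per_month * elements_per_claim
    errors_per_month = elements_per_month * (1 - accuracy_rate)      # ~754,000
    errors_per_year = errors_per_month * 12                          # ~9 million
    critical_errors_per_year = errors_per_year * critical_share      # ~900,000
    annual_exposure = critical_errors_per_year * cost_per_error      # ~$9-10 million

    print(f"Errors per month: {errors_per_month:,.0f}")
    print(f"Critical errors per year: {critical_errors_per_year:,.0f}")
    print(f"Annual exposure: ${annual_exposure:,.0f}")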

Because data is critical to an insurer's ability to execute the fundamentals of sound underwriting and cost-based pricing, insurers invest significant (if not always adequate) amounts of money to maintain and improve the quality of data, their essential corporate asset. Unfortunately, in today's uncertain and highly competitive business environment, carriers seek competitive advantage in expense control, as well as in improved underwriting results and investment gains. For some, no doubt, it is easier to reduce spending in areas where return on investment is not immediately apparent than in areas where it is.

Data is a corporate asset companies need to manage with the same rigor for accuracy, reliability and quality as they do their financial and material assets. As businesses go increasingly global, the use of data has grown exponentially. Poor-quality data used in decision-making has severe financial and regulatory consequences that go beyond lost business and disgruntled customers.

Our industry lives and dies on the quality and credibility of its data, the raw ingredient for all p-c insurance products.

Consider how data is the very underpinning of our business. Data drives insurer decisions regarding underwriting and pricing risks. Data is at the heart of all customer service and support functions; it determines the level of support insurers must provide to retain and grow their customer bases. Claims data (from both internal and external sources) supports decisions and planning for the claims-settlement process. Past claims experience helps companies identify emerging claim trends, predict future claims-adjustment and settlement processes, develop contingencies for catastrophe claims, and identify problem areas for strict loss-control measures.

Data also helps insurers manage litigation, detect fraudulent claims and limit financial exposure to claims through reinsurance. Moreover, data provides the yardstick that investors, regulators and internal planners use to measure a company's financial health.

Like any major corporate asset, data needs to be properly managed and controlled. If data assets are neglected or poorly maintained, the reliability, availability or timeliness of an insurer's data will be in doubt and its value questioned, eventually hurting the company's financial stability and market reputation.

Managing Data

So what can insurers do about managing data and ensuring data quality?

From our perspective, it is critical that insurers adopt specific principles for data-management best practices. Insurance Services Office Inc. operates one of the largest private databases in the world for p-c insurance information, holding more than 9.3 billion records at any given time and representing up to 75 percent of the entire industry's premium volume for commercial lines and about a third for personal lines.

Insurers entrust their data to ISO for use in developing a wide range of services, such as statistical analysis, actuarial services, underwriting, claims-fraud detection and other claims- and loss-related information. Data from participating insurers helps ISO develop advisory prospective loss costs, the essential tool for companies to underwrite and price risks accurately and ensure coverage is available to insurance buyers.
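
To make the concept concrete, a prospective loss cost is, in essence, expected losses per unit of exposure, developed to their ultimate value and trended to the future policy period. The toy calculation below is purely illustrative; the figures and factors are hypothetical, and this is not ISO's actual methodology.

    # Purely illustrative loss-cost arithmetic; figures and factors are
    # hypothetical, and this is not ISO's actual methodology.
    incurred_losses = 5_400_000   # reported losses for a past experience period
    development_factor = 1.08     # brings reported losses to their ultimate value
    trend_factor = 1.05           # projects costs to the future policy period
    exposure_units = 12_000       # e.g., car-years or units of payroll

    ultimate_losses = incurred_losses * development_factor * trend_factor
    prospective_loss_cost = ultimate_losses / exposure_units
    print(f"Prospective loss cost per exposure unit: ${prospective_loss_cost:,.2f}")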

We define quality data as data fit for its intended use. (See the accompanying “Five Cornerstones of Quality Data.”)

From ISO's viewpoint, the following principles of data quality constitute best practices in data management.

Data stewardship. Maintain a corporate program with senior management-level oversight, if necessary, to understand the roles and responsibilities in data ownership, acquisition, quality assurance, storage and distribution. Make each functional area with data responsibility accountable for performance and good data management.

Data and data-quality standards. Develop internal standards and seek, where appropriate, useful external standards. Harmonize multiple standards and promote interoperability across multiple systems and platforms.

Organizational issues. Establish a cohesive data-management and data-quality function for creating data and assessing data acquisition across the organization. Consider support for assessing and improving data quality by tapping into resources of outside organizations, where appropriate.

Operations and processes. Develop processes to maximize data quality and usefulness. Use new technologies for data-management and data-quality processes. Consider various internal and external data sources for improving data quality. Monitor changing regulatory requirements that may affect data and data quality.

Data-element development and specification. Design and maintain data, system and reporting mechanisms to promote good data management and quality, and to serve end-user needs. Ensure that definitions of new data elements are in sync with underlying business processes and support their broadest possible use. Consider the level of detail in data and whether historical or retrospective data is necessary for developing system or reporting specifications. Design data and data-reporting requirements for easy modification and updating.

Data-management and data-quality tools. Develop tools to promote good data management and data quality, including a corporate data dictionary, edits and business rules, data-flow documentation, process models and mapping, and data-translation criteria by data source and recipient. (A simplified sketch of such edits appears after this list.)

In addition, adopt new technology resources, such as the Internet, predictive technologies, data-visualization tools and data dictionaries, and new data exchange standards, such as extensible markup language (XML), to improve data management and quality.

Make third-party data-management, data-reporting and data-quality tools such as statistical edit packages and statistical plans part of the workflow process at the corporate level.

Measurement. Develop performance metrics to capture the cost of poor data quality, such as the costs associated with correcting errors and reports, investigating and preventing errors, bad decisions, missed opportunities, fines and increased regulatory scrutiny. Measure and benchmark results for each data source.

Individual support. Institute support for both data management and data quality on individual and organizational levels, and follow standards of professionalism.

Privacy issues. Educate users about privacy issues, policies and compliance with privacy regulations. Control access to, and use of, nonpublic data and adopt best practices of organizations that promote data privacy.
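
As a concrete (and deliberately simplified) illustration of the "edits and business rules" and "measurement" principles above, the sketch below applies a handful of hypothetical business rules to incoming claim records and rolls the failures up into a rough cost metric. Field names, rules and costs are illustrative only.

    # Hypothetical business-rule edits applied to claim records, with failures
    # rolled up into a simple data-quality cost metric. Illustrative only.
    from datetime import date

    RULES = {
        "policy_number_present": lambda r: bool(r.get("policy_number")),
        "loss_date_not_in_future": lambda r: r.get("loss_date") is not None
                                             and r["loss_date"] <= date.today(),
        "paid_amount_non_negative": lambda r: r.get("paid_amount", 0) >= 0,
        "state_code_valid": lambda r: r.get("state") in {"NJ", "NY", "PA", "CT"},  # toy list
    }

    def edit_record(record):
        """Return the names of the rules this record fails."""
        return [name for name, rule in RULES.items() if not rule(record)]

    claims = [
        {"policy_number": "AB123", "loss_date": date(2004, 3, 1), "paid_amount": 2500, "state": "NJ"},
        {"policy_number": "", "loss_date": date(2004, 2, 7), "paid_amount": -40, "state": "XX"},
    ]

    COST_PER_ERROR = 10  # hypothetical dollar cost, echoing the earlier example
    error_count = sum(len(edit_record(c)) for c in claims)
    print(f"Errors found: {error_count}; estimated exposure: ${error_count * COST_PER_ERROR}")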

The upside of implementing data-management best practices is that insurers can leverage their data for newer and more varied uses and unlock the full value of their data, especially with the availability of new technology tools to improve data quality.

Technology is playing an increasingly major role in data-management processes and workflow. ISO has gained considerable experience using technology for its statistical plans and databases for personal and commercial lines of insurance, which insurers depend on for actuarial services, underwriting, ratemaking and pricing coverages.

Elements underlying ISO's sophisticated data-editing system include:

Statistical front-end edit packages that check data received from reporting insurers for validity.

Distributional edits and actuarial checks that test the accuracy and reasonability of statistical records sent by insurers (a simplified illustration follows this list).
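
As a simple, hypothetical illustration of what a distributional reasonability check can look like (this is not ISO's actual edit system), the sketch below flags reported claim amounts that fall far outside the range implied by comparable historical data.

    # Hypothetical distributional reasonability check: flag reported amounts
    # far outside the range implied by comparable historical data.
    import statistics

    historical_paid = [1200, 1500, 1800, 2100, 2400, 2600, 3000, 3200, 3500, 4000]
    mean = statistics.mean(historical_paid)
    stdev = statistics.stdev(historical_paid)

    def is_reasonable(amount, threshold=3.0):
        """True if the amount is within `threshold` standard deviations of the mean."""
        return abs(amount - mean) <= threshold * stdev

    for amount in [2750, 150000, -500]:
        status = "OK" if is_reasonable(amount) else "FLAG for review"
        print(f"Reported paid amount {amount:>8}: {status}")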

ISO is also researching new data-exchange standards and technologies, as software innovations and greater data-storage capacity on more powerful, lower-cost computers make data-collection processes more efficient. With instantaneous access via the Internet and common frameworks to consolidate and transmit data, insurers are better positioned for their data-quality efforts.

Several new uniform data-storage and transmission standards are making it increasingly easy to meet data-exchange and reporting requirements. The growing trend toward electronic data interchange with batch store-and-forward functionality and the popularity of XML both underscore the importance of data standards.

For example, ACORD has developed globally recognized common standards for insurers and industry associations to collaborate and exchange data in a uniform framework. ACORD has created XML standards for the p-c industry to meet real-time data-transaction requirements for personal, commercial and specialty lines, for premium as well as claims transactions. ISO plans to leverage ACORD's XML standards to structure data in a common, consistent format, making it potentially easier to collect and aggregate similar data accurately from multiple data sources.
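
To suggest what exchanging a record in a common XML format might look like, the sketch below structures a single claim record as XML. The element names are hypothetical; they are not drawn from the actual ACORD XML vocabulary, which defines its own tags and schemas.

    # Hypothetical example of structuring a claim record as XML for exchange.
    # Element names are illustrative; they are not the actual ACORD vocabulary.
    import xml.etree.ElementTree as ET

    claim = {"PolicyNumber": "AB123", "LossDate": "2004-03-01",
             "PaidAmount": "2500.00", "State": "NJ"}

    root = ET.Element("ClaimRecord")
    for field, value in claim.items():
        ET.SubElement(root, field).text = value

    print(ET.tostring(root, encoding="unicode"))
    # <ClaimRecord><PolicyNumber>AB123</PolicyNumber>...</ClaimRecord>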

In a fast-changing global business environment, insurers must be prepared to respond to challenges and identify opportunities for growth. Quality data is the foundation of sound decision-making and competitive advantage.

By managing data more efficiently through best practices in data quality, and in conjunction with new technologies and emerging data-exchange standards, insurers demonstrate that they recognize the value of data as a corporate intellectual asset, one requiring as much safeguarding as their traditional physical and business assets.

Carole J. Banfield is executive vice president of Insurance Services Office Inc. in Jersey City, N.J.


Reproduced from National Underwriter Edition, May 28, 2004. Copyright 2004 by The National Underwriter Company in the serial publication. All rights reserved. Copyright in this article as an independent work may be held by the author.

