ROI.
Who would have thought three letters could create such grief? But let's face it: If the sky was ever the limit for IT budgets, it sure hasn't been for some time. Execs signing off on data warehousing projects want to see results, and fast. Unfortunately, that doesn't fit well with the historical model of data warehousing, with its multi-year timeframes and high costs. Today's economic realities have caused many insurers to shift their data warehousing efforts to projects that have a more immediate and clearly defined financial or operating impact. Those realities have also driven the development of strategies on which economically successful warehousing projects are built. For a guide to tried and proven tactics, read on.
1. Focus on impact analytics that save or make money. Like any business, insurance looks to increase revenue and reduce expenses. Meeting these objectives requires focusing on customer and producer analytics for such goals as gap analysis, distribution channel efficiency and profitability, and targeted marketing, as well as finding ways to cut operational and claims costs and cancel business that's likely to be unprofitable.
A textbook example is Allstate Financial, whose data warehousing efforts garnered the insurer a 2002 Best Practices in Data Warehousing award from the Data Warehousing Institute. The warehouse, which took just over a year to implement, can hold up to three terabytes of data in an Oracle database, using Ab Initio for extract, transform, and load (ETL) from nine different administration systems that support Allstate's life insurance, long-term care, annuity, and mutual fund businesses. SAS Enterprise Miner and Brio are used for analytics, and ProClarity is used for online analytical processing (OLAP).
"One of our key strategies was to become consumer-centric," says John Hershberger, Allstate Financial's assistant vice president of database marketing. "You can't do that without information about your customers. So first and foremost, we looked to better understand what their needs were and to meet their expectations about products, service, and support." That meant continued involvement of both business and IT in the warehouse design.
One of the first impacts of Allstate's warehouse was the elimination of duplicate mailings to policyholders who held multiple annuity contracts for different beneficiaries. "We built an internal householding process using Trillium and built a carrier presort mail file," Hershberger says. "We estimate that the cost of putting a single prospectus in the hands of a customer approaches $5." Currently, Allstate is devoting its resources to analysis of the economic value of producer relationships, from the individual agent up to the aggregated agency level, including studying customer retention activity.
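The householding process Hershberger describes can be pictured in a few lines of code. This is a minimal sketch, assuming a toy record layout and exact-match keys; Trillium's actual matching is fuzzy and far more sophisticated, and every field name here is invented for illustration.

```python
# Minimal "householding" sketch: collapse multiple contracts at the same
# household into one mailing. A real tool standardizes addresses first;
# this exact-match key is a stand-in for that step.
from collections import defaultdict

def household_key(record):
    # Hypothetical key: last name + normalized address + ZIP.
    return (record["last_name"].upper(), record["address"].upper(), record["zip"])

def one_mailing_per_household(records):
    """Return one representative record per household."""
    households = defaultdict(list)
    for rec in records:
        households[household_key(rec)].append(rec)
    return [recs[0] for recs in households.values()]

contracts = [
    {"last_name": "Smith", "address": "12 Oak St", "zip": "60601", "contract": "A1"},
    {"last_name": "Smith", "address": "12 Oak St", "zip": "60601", "contract": "A2"},
    {"last_name": "Jones", "address": "9 Elm Ave", "zip": "60602", "contract": "B1"},
]
mailings = one_mailing_per_household(contracts)
# Three contracts collapse to two mailings; at roughly $5 per prospectus,
# the savings compound quickly across millions of policyholders.
```

A production process would also build the carrier-presorted mail file from the surviving records, which this sketch omits.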
While Allstate's actual project costs were not available, Hershberger reports the company spent serious money on the installation. "We still needed to create value along the way," he explains, "and break-even is not a long way off; we're getting there. It was never a concern to us that we're not going to gain value, but we've kept a focus on areas of impact on profit and loss."
Carriers may still find it difficult, however, to base decisions regarding warehousing and analytical projects purely on economic factors. "In the business case supporting the project, you can struggle with how to measure the results because you don't have the information in the first place; that's why you need the warehouse," says Richard Marx, vice president at Cap Gemini Ernst & Young and leader of its North American insurance consulting practice. "It takes a leap of faith by management to make that investment, and in these economic times, it's a tough sell."
2. Decide if an enterprise data warehouse (EDW) or data mart is a better fit for your legacy environment. The industry continues to be divided into two camps when it comes to the better way to approach a warehousing project: Create an enterprise data warehouse from the start, or develop individual, tactically focused data marts. The more disparate systems and different data formats an insurer has, the more difficulty there will be in building an EDW. But even if tactical data marts are the immediate solution, the difficult, and potentially expensive, work of standardization should still be completed first to save money in the long run.
"The trick with taking the approach of starting small is how do you set it up in a way that you can expand the depth and breadth of the warehouse so you don't have to redo it," says Marx. "Fortunately, the advantage companies now have over the ones that started three to five years ago is that the tools, particularly ETL tools, have gotten so much better. It's feasible to start small and, through various well-planned iterations, expand the comprehensiveness of the warehouse and begin delivering value."
Those who have built an EDW first have found they can, in turn, create subordinate data marts that share common standards while also addressing point solutions. Consider, for example, Insurance Services Office, Inc. (ISO), which claims to have the largest insurance datastore in the world, with seven terabytes of online data and 114 terabytes of near-line data. Over time, ISO has created a number of subordinate data marts, such as its ClaimSearch database, which contains 360 million claims records.
Raw claims data is stored in a central datastore, and separate claims records are stored in the ClaimSearch database. To synchronize data when a carrier makes a change to a reported claim transaction, ISO created an online correction tool that builds onset and offset records. "It's a closed-loop process," says Michael D'Amico, ISO's chief technologist and director of systems engineering.
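The onset/offset pattern D'Amico describes can be illustrated with a toy append-only claims ledger. ISO's actual correction tool is not public, so the record shapes, field names, and reversal logic below are assumptions for illustration only.

```python
# Onset/offset correction sketch: instead of rewriting a claim record,
# append an "offset" record that reverses the current value and an
# "onset" record carrying the corrected value. Downstream aggregates
# stay in sync without touching history.

def correct_claim(ledger, claim_id, new_amount):
    """Append offset and onset records so the claim's net amount changes."""
    current = sum(r["amount"] for r in ledger if r["claim_id"] == claim_id)
    ledger.append({"claim_id": claim_id, "type": "offset", "amount": -current})
    ledger.append({"claim_id": claim_id, "type": "onset", "amount": new_amount})

ledger = [{"claim_id": "C100", "type": "original", "amount": 12000}]
correct_claim(ledger, "C100", 9500)
total = sum(r["amount"] for r in ledger if r["claim_id"] == "C100")
# The net amount is now 9500, yet the original record is untouched:
# the full correction history remains auditable.
```

The appeal of this design is exactly the closed loop: every change is itself a record, so the central datastore and ClaimSearch can be reconciled by replaying the same stream.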
3. Decide what data to warehouse. With data-storage costs continuing to decrease, this decision no longer involves quantity as much as it does history. "We've definitely evolved from the point where we hem and haw over a field," says Marty Solomon, business systems architect in CIGNA's health care business.
CIGNA has a multi-terabyte divisional data warehouse for its health-care business, hosted on an OS/390 mainframe. The warehouse has been online for nine months and is fed by 22 different systems. CIGNA uses ETI's Extract for ETL and Brio for reporting. Solomon says that in deciding how much historical data to warehouse, legal and regulatory requirements have been the primary driver.
History is an issue because the costs of data cleansing increase the farther back insurers go. "Typically, what most insurance CIOs want is several years of data in the warehouse," says Marx. He reports that adding a shorter-term operational datastore is one way insurers can get more economically to the detailed information they need to support customer-facing functions such as call centers. "They may need to understand every premium, every disbursement on an annuity that took place in the last 90 days," he explains. When looking at the warehouse, by contrast, what insurers want are monthly buckets of premiums paid, to support trend analysis.
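The monthly-bucket idea Marx describes is easy to picture in code. This sketch rolls transaction-level premium records into year/month buckets of the kind a warehouse would serve for trend analysis; the record layout and amounts are invented for illustration.

```python
# Roll transaction-level premiums into (year, month) buckets, the shape
# a warehouse serves for trend analysis, while an operational datastore
# would keep the raw 90-day transaction detail.
from collections import defaultdict
from datetime import date

def monthly_buckets(transactions):
    """Sum premium amounts by (year, month)."""
    buckets = defaultdict(float)
    for t in transactions:
        buckets[(t["date"].year, t["date"].month)] += t["premium"]
    return dict(buckets)

txns = [
    {"date": date(2003, 1, 5),  "premium": 200.0},
    {"date": date(2003, 1, 20), "premium": 150.0},
    {"date": date(2003, 2, 3),  "premium": 175.0},
]
summary = monthly_buckets(txns)
# summary -> {(2003, 1): 350.0, (2003, 2): 175.0}
```

The economic point is that the warehouse only needs to carry these compact aggregates over long histories; the expensive transaction detail can age out of the cheaper, short-horizon operational store.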
In deciding what data should be warehoused, Allstate developed a strategy that was designed to minimize current data extract issues yet allow the most future flexibility. "We did not know in building the warehouse which data would or wouldn't add value," Hershberger explains. "So we went into the source system and looked at the segments that are used in the mainframe systems. Using Ab Initio, we took all of the data in the mainframe and dropped it into a collection area. We then went to our mainframe source system and asked for an evaluation of all the segments that were utilized on a regular basis, using the ETL tool to select only those portions we thought would have value. So if we need to go back and extract additional information at a later time, we can do so more easily."
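Allstate's land-everything-then-select pattern might look like the following in miniature. The segment names, usage statistics, and selection threshold are all invented for illustration; Ab Initio performs the real work at mainframe scale.

```python
# "Land everything, then select" sketch: all source segments are dropped
# into a collection area, but only segments in regular use flow into the
# warehouse. Unused segments stay behind for later extraction.

def select_active_segments(collection_area, usage_stats, threshold=1):
    """Carry forward only segments whose usage meets the threshold."""
    active = {seg for seg, hits in usage_stats.items() if hits >= threshold}
    return {seg: data for seg, data in collection_area.items() if seg in active}

collection_area = {"POLICY": ["..."], "RIDER": ["..."], "LEGACY_NOTES": ["..."]}
usage_stats = {"POLICY": 540, "RIDER": 87, "LEGACY_NOTES": 0}

warehouse_feed = select_active_segments(collection_area, usage_stats)
# Only POLICY and RIDER flow into the warehouse; LEGACY_NOTES remains in
# the collection area and can be pulled later if it proves valuable.
```

The design choice here is the one Hershberger names: deferring the keep-or-drop decision is cheap when everything already sits in the collection area, so a wrong guess never forces a new mainframe extract.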
4. Plan for hidden costs. Realize that a successful warehouse, or even a successful interim deliverable, will generate interest in the project among business users and get them thinking about new ways to analyze the information. And that, in turn, means the likelihood of changed and additional project requests.
"Usually costs go up because the requirements get rearticulated or expanded," says Solomon. For instance, an insurer might begin with a request to analyze customer retention by region, but then realize the answer does not address why those policyholders are leaving. Did my count drop because I closed down my main office in that region, so I lost contact? Did those policyholders move? Did they reach the age of 55? Was their insurance through an employer who switched to another carrier?
Planning for the unknown is not only difficult, but also potentially expensive if the anticipated needs do not ultimately materialize. "Justifiably, the business doesn't want to spend up front for what it won't use, so it's a constant battle," Solomon says. "You have to compromise between the visionary and the tactical, and hopefully end up with the strategic."
There's no substitute for good planning and business/IT alignment when it comes to managing this issue. "We try to inform the users, and have been successful in educating them on subject areas and how they work," says Solomon. "Not necessarily to know the whole academics behind dimensional data modeling, but to give them an understanding of it so that when we're building something, it allows everyone to be more flexible and to handle the questions that come up along the way."
5. Don't circle the wagons. Sticking to proprietary formats is a tempting way to reduce costs, but consider the long-term savings of standardization. "This is a time for companies to be looking outward at what's going on from a technology standpoint, and you can't afford to come up with a proprietary solution to every problem. Additionally, the more we move toward data standards, the more vendors will be able to design products that can be easily implemented," says Gary Knoble, past president of the Insurance Data Management Association and vice president of data management at The Hartford. "With the crunch toward expense savings and lack of resources, you've got to be looking for off-the-shelf solutions, even though you will still need to do work on them to implement them."
6. Do it right the first time, even if it costs more in the short term. Data quality is a continuing impediment to quick implementation of a data warehouse. However, failing to address issues of data quality intelligently will serve only to increase the ultimate cost of the warehouse.
"Scrubbing techniques can be very sophisticated, but at its most unsophisticated, it's nothing more than force coding," says Knoble. "There are certain data problems you can only find by looking at the original values, rather than simply making the data pass a system edit. You often don't understand the consequences of manipulating data in a certain way because that data has to be used downstream."
"The salient question is, what do you do when you find bad data? Do you attempt to address it back in the provisioning system, or do you put some scrubs in your warehouse?" Hershberger asks, adding that Allstate has opted for the latter strategy. "We try to trap as many bad variables as possible. If something is clearly an error, we take it out of the data warehouse during the ETL process and replace it with a variable that indicates the source system value is incorrect. If we have multiple sources of data, we make a decision as to which would dominate."
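Hershberger's trap-and-replace approach can be sketched as a small ETL cleansing step. The sentinel value, the validity rule, and the source ranking below are all assumptions invented for illustration; Allstate's actual rules are not described in detail.

```python
# Trap-and-replace sketch: a clearly bad value is swapped for a sentinel
# marking the source system value as incorrect, and when multiple sources
# disagree, a ranked "dominant" source wins.

SOURCE_ERROR = "SRC_ERR"            # sentinel: source system value is incorrect
SOURCE_RANK = ["admin_sys", "crm"]  # hypothetical ranking: admin system dominates

def clean_birth_year(value):
    """Trap an impossible birth year rather than load it."""
    if isinstance(value, int) and 1880 <= value <= 2003:
        return value
    return SOURCE_ERROR

def resolve(values_by_source):
    """Prefer the highest-ranked source that supplies a clean value."""
    for src in SOURCE_RANK:
        v = clean_birth_year(values_by_source.get(src))
        if v != SOURCE_ERROR:
            return v
    return SOURCE_ERROR

resolve({"admin_sys": 1964, "crm": 1066})  # admin system value wins
resolve({"admin_sys": 9999, "crm": 1971})  # bad admin value, fall back to CRM
resolve({"admin_sys": 9999, "crm": None})  # no clean source: load the sentinel
```

Loading the sentinel rather than a guess is the point of Hershberger's "no data becomes the fact" stance: analytics can count how often a source is wrong instead of silently absorbing fabricated values.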
In short, economics should never trump accuracy. "Ours is a pragmatic approach," Hershberger says. "I don't want to guess about a customer's variable attributes any more than I want to guess about producers' activity. If the data doesn't support a fact, then the fact that there's no data becomes the fact."
7. Get more mileage out of what you already have. Insurers have a wealth of information locked in legacy administration, ERP, and CRM systems, and need to leverage information across those systems. A data warehouse can be a way not only to link those systems but also to efficiently use data captured from new front-end systems, without incurring the cost, in dollars and time, of modifying legacy systems to accept new data formats.
"Previously, system development efforts focused on reengineering insurance processes as they related to input and transaction processing. Now, the goals are output and decision support. However, many legacy systems were not designed to achieve these goals," says Mike Schroeck, partner at IBM Business Consulting Services. Unfortunately, the cost of modifying those legacy systems to use new data formats is significant. The better solution is to use the warehouse as a standardized repository, with middleware converting the data to formats existing systems can use, while still retaining core data for business analytics.
8. Realize that sometimes economics are a secondary concern. There are instances when the most economical way to implement a data warehouse is incompatible with the regulatory requirements insurers must contend with. Cooperation and coordination between business and IT are essential, not only to increase the likelihood of a project's success but also to catch areas of data use or analysis that aren't allowed by ever-evolving security and privacy regulations.
For example, while customer data can be used within the organization, insurers need to take care regarding how much information can be shared with affiliates and subsidiaries in their cross-selling and upselling campaigns. Also, there are instances when data elements may need to be discarded altogether, such as with the prohibition against using Social Security numbers for customer identification, necessitating the creation of new customer ID tables for some carriers.
And lastly, continued legislation addressing corporate corruption, such as holding corporate executives personally responsible for financial reports, will likely be a driving factor in centralized data warehousing efforts. "When signing off on these financial reports, executives have a renewed interest and commitment to ensuring they are accurate, consistent, and timely," says Schroeck. "Data warehousing enables this, particularly when insurers are running several different financial systems."
Data Warehousing Vendor Guide
Net2S
New York, N.Y.
212-279-6565
www.net2s.com
Basis100
Toronto, Ont.
416-364-6085
www.basis100.com
CSC Financial Services
Austin, Tex.
310-615-0311
www.csc.com
Data Instrument Group
Mountain View, Calif.
408-516-8812
www.digdb.com
Decision Support Inc.
Matthews, N.C.
704-845-1000
www.decisionsupport.com
Delphi Technology, Inc.
Cambridge, Mass.
617-494-8361
www.delphi-tech.com
Evoke Software
San Francisco, Calif.
415-512-0300
www.evokesoft.com
FirstApex
Flower Mound, Tex.
866-700-2739
www.firstapex.com
Fiserv, Inc.
Brookfield, Wis.
262-879-5000
www.fiserv.com
Insight Decision Solutions, Inc.
Markham, Ont.
905-475-3282
www.insightdecision.com
lookNomore
Valley Stream, N.Y.
516-216-2311
www.looknomore.com
Millbrook, Inc.
Center Valley, Pa.
610-797-7400
www.millbrookinc.com
NCR/Teradata
Dayton, Ohio
937-445-5000
www.teradata.com
OuterBay
Campbell, Calif.
408-340-1200
www.outerbay.com
Priority Data Systems
Omaha, Neb.
800-228-9410
www.priority-quote.com
Risk Laboratories
Marietta, Ga.
678-784-4600
www.risklabs.com
Sagent Technology
Mountain View, Calif.
800-233-5478
www.sagent.com
Wipro Technologies
Richardson, Tex.
972-671-6130
www.wipro.com