Data warehouses fail precisely because they are conceived and built as data warehouses.
Trade places with the CFO of your insurance company. Now you're struggling with a loss ratio of 105 percent and can't get a handle on the underlying reasons. A software vendor pops in to see you, suggesting he can help. His sales pitch: He'll build you a nice thing called a "data warehouse" that will store the data from which you can easily "mine" information. Curious, you agree to a meeting and the dialogue soon begins.
Unfortunately, a roadblock appears. You quickly start to believe that you have been kidnapped and are now on an alien planet. The language doesn't seem to be English, or at least not insurance-speak (remember, you're the CFO here). You hear words and phrases like: "ODBC," "multi-dimensional," "RDBMS," "bit-mapped indices," "object-oriented design," "extendable," "scalable," "pre-packaged data models," and the kicker, "enterprise-wide data warehouse." That's it. For $2 million to $4 million and 18 to 24 months, the answer is an enterprise-wide data warehouse.
Or is it?
Wouldn't you rather have heard about a three- to six-month project to empower your actuaries to drill down into such depth that they would be able to surgically price products to improve the bottom line? Or how about providing your claims analysts with enough detailed information to be able to quickly reduce claim severity? That's solving your insurance business problems, and it's not a technology odyssey.
Feel Their Pain
In our quest to solve real problems, we've somehow flip-flopped the business and technical drivers. As a collective industry (vendors, internal IT, and consultants), we are guilty of focusing too much on the underlying technology and not enough on the real pain-points of the business executives.
That doesn't mean we should ignore the equipment or the need for a solid technology foundation. On the contrary, it's important that the various infrastructure layers within a solution are robust, scalable, and open. But if the pain-point isn't solved, really solved, the project is a failure.
In our example, if the CFO could see the overall loss ratio on a $50 million book of business drop from 105 percent to 100 percent, that would be a $2.5 million improvement to the bottom line. Not a bad ROI. A project like that would obviously be judged a success, with promotions and performance bonuses to follow.
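To make the arithmetic concrete, here is a minimal sketch in Python, using only the illustrative figures from the example above:

```python
# Bottom-line impact of a loss-ratio improvement on a book of business.
# All figures are the illustrative ones from the example above.
book_premium = 50_000_000   # annual written premium on the book ($)
loss_ratio_before = 1.05    # 105 percent
loss_ratio_after = 1.00     # 100 percent

savings = book_premium * (loss_ratio_before - loss_ratio_after)
print(f"Bottom-line improvement: ${savings:,.0f}")  # $2,500,000
```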
Here's my list of a few key insurance challenges and pain-points to focus on:
- Improve profitability
- Improve forecasting
- Improve underwriting criteria
- Reduce claim severity
- Surgically improve pricing
- Improve loss control programs
- Improve reserve analysis
- Streamline reconciliation
- Target marketing and sales on profitable business
- Optimize distribution channels
Now, about that underlying technology: Think back to "Raiders of the Lost Ark," that wonderful flick that captured our hearts back in 1981. Do you remember the ending? The ark wasn't lost; it was in the process of being stored in an enormous government warehouse. It was safe and sound, dutifully cataloged.
The imagery was clear: This was a priceless artifact that was probably never going to be retrieved again.
The lesson from the movie, at least on the technology side, is that you need a way of getting the data into the warehouse, an excellent multi-dimensional design layer, a fast database engine, business rules to process the information, and an interface/visualization layer that allows those forklifts to grab that historical artifact with ease.
Extract, Transform, Load. Moving data from legacy and operational systems into an analytic system is usually a straightforward process. The challenge lies in data quality. Quite often, annoying items (is "John Smith" the same person as "J Smith"?) create headaches of their own. Several vendors offer fairly sophisticated tools to assist in developing repeatable, more easily maintained processes.
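For a flavor of the kind of matching logic those tools automate, here's a minimal sketch of one naive rule, written in Python with only the standard library; the normalization steps and the 0.8 threshold are illustrative assumptions, not any vendor's actual algorithm:

```python
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Lowercase, strip punctuation, and collapse whitespace."""
    cleaned = "".join(c for c in name.lower() if c.isalnum() or c.isspace())
    return " ".join(cleaned.split())

def likely_same_person(a: str, b: str, threshold: float = 0.8) -> bool:
    """Flag two name strings as a probable duplicate for human review."""
    a, b = normalize(a), normalize(b)
    # Treat an initial as matching the full first name ("J" vs. "John")
    # when the last names agree exactly.
    if a.split()[-1] == b.split()[-1] and a[0] == b[0]:
        return True
    return SequenceMatcher(None, a, b).ratio() >= threshold

print(likely_same_person("John Smith", "J Smith"))   # True
print(likely_same_person("John Smith", "Jane Doe"))  # False
```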
Analytic Data Infrastructure. An often misunderstood and neglected component is the data-infrastructure-design layer, which serves as the foundation to support the analytic solution. With an effective design, the right information is collected and organized so that it can be efficiently used. The design should also support future expansions and customizations of the analytic functionality without the carrier having to embark upon a major redesign effort.
This layer has two critical requirements: It must support unlimited dimensions and measures, and it must store data at the lowest level of detail.
The more rating measures that can be analyzed, coupled with the ability to "slice and dice" across every dimension stored, the more powerful your ability to get at the root of exactly where, for example, the profitable and unprofitable business is hiding. Summary data are useful for surface-level analyses; to fully understand and transform the information into actionable knowledge, you need detail-level data.
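To make "slice and dice" concrete, here's a minimal sketch assuming pandas and a hypothetical policy-level detail table; the dimension and measure names are illustrative:

```python
import pandas as pd

# Hypothetical policy-level detail: one row per policy, with dimensions
# (state, product line) and measures (premium, incurred losses).
detail = pd.DataFrame({
    "state":   ["OH", "OH", "TX", "TX", "TX"],
    "product": ["auto", "home", "auto", "auto", "home"],
    "premium": [1200.0, 900.0, 1500.0, 1100.0, 950.0],
    "losses":  [1500.0, 450.0, 1200.0, 1600.0, 300.0],
})

# Slice by any combination of dimensions and recompute the loss ratio;
# because the data are stored at the policy level, any dimension works.
by_state_product = detail.groupby(["state", "product"])[["premium", "losses"]].sum()
by_state_product["loss_ratio"] = by_state_product["losses"] / by_state_product["premium"]
print(by_state_product)
```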
Informational Database Engine. One major difference between transactional and informational systems is the unpredictable nature of analytic workloads. Supporting large volumes of data access without knowing precisely which data elements the end users will want to slice and dice is challenging. The poor database administrators often find themselves in a no-win spiral, constantly a step behind the power users.
Supporting volumes of more than 50 million policies is not unusual.
The lesson? Carefully evaluate which database engines can meet today's known requirements, and which can also handle tomorrow's demands.
Insurance Business Rules, Measures, and Dimensions. In an industry with such complex business requirements, it's amazing how many solutions are generic. This layer provides the real insurance intellectual property that gives significant productivity gains and value to the business analysts and users.
Visualization Layer. This layer is usually associated with the "sizzle" and is the layer with which actuarial, claims, marketing, and financial analysts and power users interface on a regular basis. Of course, some people like to crunch through spreadsheets, and some like to read text-based reports. Others want to see beautiful 3-D graphics. The visualization layer must effectively support all of these diverse user preferences. In addition, it must support the ability to slice and dice through the data on an ad hoc basis in order to identify the underlying trends and relationships.
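As one illustration of serving the same detail data to both the spreadsheet crowd and the graphics crowd, here's a minimal sketch; pandas and matplotlib are illustrative tool choices, and the column names are assumptions carried over from the earlier sketch:

```python
import pandas as pd
import matplotlib.pyplot as plt

# The same hypothetical policy-level detail used earlier.
detail = pd.DataFrame({
    "state":   ["OH", "OH", "TX", "TX", "TX"],
    "product": ["auto", "home", "auto", "auto", "home"],
    "premium": [1200.0, 900.0, 1500.0, 1100.0, 950.0],
    "losses":  [1500.0, 450.0, 1200.0, 1600.0, 300.0],
})

# Spreadsheet-style view: a pivot table of loss ratios by state and product.
pivot = detail.pivot_table(index="state", columns="product",
                           values=["premium", "losses"], aggfunc="sum")
loss_ratio = pivot["losses"] / pivot["premium"]
print(loss_ratio)  # a text report for those who like to read numbers

# The same numbers as a chart for those who prefer graphics.
loss_ratio.plot(kind="bar", title="Loss ratio by state and product")
plt.ylabel("loss ratio")
plt.show()
```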
With any big tech implementation, people, attitudes, skills, and beliefs travel hand-in-hand with software and hardware functionalities. If one side fails, everything will crumble. (No pressure.) But if done right, someone might suggest that you've successfully delivered a data warehouse.
Paul Theriault ([email protected]) is senior vice president for marketing and channels at Pinpoint Solutions.
Executive buy-in is as crucial to your data warehouse as the technology you use.
by Ben P. Rosenfield
Once you've worked through the ins and outs of what you need to get a data warehouse up and running, it's time to convince the brass. Bottom lines, ROI, and other financials influence the way executives think, especially if they're technophobes. The top executives are the most critical people involved with the success (or failure) of the project. Their sponsorship of, and commitment to, the warehouse initiative will guarantee it becomes a pan-organizational tool.
Getting the big cheese to melt over your idea and open the funding floodgates can be tough. Let's face it: No one wants to pitch a project with an uncertain ROI. But it's a key issue that can't be avoided, according to Phil Bucci, managing partner of professional services at Teradata, an NCR division (www.teradata.com).
"Early success that delivers value is key to the future of the warehouse," he said. "Look at needs across the organization, make ROI calculations based on business objectives, and focus on objectives with high ROI."
If you can, identify something (anything meaningful) in the project that will indicate the warehouse's potential to pay for itself. By convincing the execs that they're funding a pay-as-you-go implementation, not a broad infrastructure build-out, you'll go a long way toward guaranteeing success.
If you're able to get past the first round of convincing, concentrate next on winning employees over. After all, you're not just adding more RAM to their desktop PCs; corporate culture is about to change. Failure comes in many flavors. Implementing a warehouse the employees don't understand (or blindly hate) will leave you with Dilbert cartoons slipped anonymously under your door.
Keep Your Friends Close
First, accept the fact that culture will shift. A new way to store, access, and share data across the organization is taking shape, so assess thoroughly what changes are likely to occur around the office and the company overall. Next, find the gaps and fill them with new or revised skill sets for IT people, or hire some new blood such as statistical modelers to add and maintain new functions. Finally, demonstrate to the workers how the new system will make their lives easier. They'll certainly do cartwheels over being able to access data and functionality as never before.
But don't worry about the people so much that you neglect the product. Company-wide acceptance of the warehouse could turn around and bite you in the ass. Too often, according to Bucci, lack of experience leads warehouse project managers to focus design specs and efforts on initial implementation, leaving near-future needs blowing in the wind.
"Once initial implementation is done successfully, the underestimated demand-the rush of users to get more information-can cause the system to fail," Bucci explained. "Ease of expansion will alleviate the burden created by unanticipated needs."
By creating a system with room to grow into, you will further lock in employee support for the warehouse. They'll believe the system was designed with their needs in mind, and they'll be thrilled to use it. In turn, the higher-ups will be happy.
But this is all philosophical, really. Pleasing the masses is a dynamic struggle; pleasing the machinery, on the other hand (making sure it has the muscle to carry the workload), is much more straightforward. The software makes the hardware work; disks won't spin until told to. You need to invest in data warehouse software with the potential for scalability, TCO (total cost of ownership) restraint, and end-to-end parallelism.
If your system is scalable, you'll be able to start small, develop a logical business model, and then make that model evident in your database. In turn, you'll be able to query the database in phases based on specific subject areas. In time, adding new questions to the system will be no problem, because you implemented technology that grows with you.
TCO, on the other hand, can choke off certain types of growth. How much does it cost to own your data warehouse? The figure includes manpower, workloads, overtime, and more. What support is absolutely necessary for the environment you chose? Keep your mind on these questions, and you should be able to keep a leash on your TCO.
"Think about how many database administrators are needed," said Thomas Grimm, partner in Teradata's professional services solution architecture group. "If you need one for each mainframe, and one for each data mart, that's a ton of support."
Grimm explained that the more the database manages itself, the better. Human-dependent systems that need rearranging to answer new questions typically require people to back data out onto tape, repartition the disks, and then reload the data from tape. That can be an exhausting, weekend-long project, and it can quickly burn out your best workers.
"If you have databases that deal with terabytes [of data] and there are humans involved, it's a major TCO issue," Grimm said. "You also run the risk of losing valuable staff by attrition."
Human involvement is costly, especially when those pesky humans make mistakes. According to Grimm, total absence of human intervention in the database fosters the best environment for data sharing across an organization. End-to-end parallelism is the operative phrase: Data are split evenly and served that way to everyone. As Grimm put it, the process is an "even, random distribution of data, managed by the database." In this scenario, data are stored, and tasks executed, in parallel across the enterprise.
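In modern terms, that even, random distribution is hash partitioning. Here's a minimal sketch of the idea; the row keys and the number of parallel units are illustrative assumptions:

```python
from collections import defaultdict
from zlib import crc32

NUM_UNITS = 4  # illustrative number of parallel processing units

def assign_unit(row_key: str) -> int:
    """Hash the row's key so rows spread evenly and deterministically."""
    return crc32(row_key.encode()) % NUM_UNITS

# Hypothetical policy rows keyed by policy number.
policies = [f"POL-{n:06d}" for n in range(1, 1001)]

units = defaultdict(list)
for key in policies:
    units[assign_unit(key)].append(key)

# Each unit receives roughly the same share, with no human rebalancing.
for unit, rows in sorted(units.items()):
    print(f"unit {unit}: {len(rows)} rows")
```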
It's all about openness. Is your software flexible enough to handle your changing and growing needs? Can it do so without too many people butting in? And can it employ emerging standards for data sharing? While you're focusing on those essential software attributes, keep your eye on the real goal: solving real pain-points quickly and completely. Once you've done that and have the right solutions in place, senior executives will be impressed with the results.