The rationalization and update of legacy systems can be difficult tasks requiring a great deal of IT time, management commitment, and company expense. However, some insurers in the life and property/casualty industry in London that migrated data from legacy systems found the benefits of increased efficiencies, long-term cost savings, and enhanced management information systems are well worth the effort. Among the various challenges they faced were data cleanup and legacy systems created as a result of mergers and acquisitions. The following case studies describe the hurdles they encountered and how they overcame them to achieve their goals.

Challenge: Data Cleanup
Chris Clegg, the former program manager for Prudential plc, the U.K. life company, saw the difficulties involved in doing a data migration in-house and the problems with migrating old data because he was in charge of the migration of two 30-year-old legacy mainframe systems. (When Clegg was interviewed, he was Prudential's program manager. At the end of April, his contract with Prudential ended, and he is currently freelancing for other companies.) "The program, completed in 2001, involved moving old data onto a smaller, more flexible, and robust client services system, which provided additional functionality for further growth," says Clegg.

Essentially, Prudential wanted to upgrade its old systems, which were slow, hard to maintain, and expensive, and move to newer technology that provides scalability, Web browsers, and report-writing capability, he says. "Making changes to old systems and maintaining them is expensive. It takes longer and is harder to do."

"The main issue of data migration of legacy systems is getting the data across from the old system to the new and preserving the integrity of the data," he says.

"That's complicated by the fact you don't know why some of the data is on the system or the rules that were used to create it," he says, explaining the rationale for including the data may be lost over time or be in the head of someone who has long since left the company. Further, the structure of the data was quite different, which made migration extremely hard, Clegg affirms.

Prudential made the initial decision to complete the migration in-house, he says. "It's a typical decision within large companies. They've got the people, and it's less expensive than bringing in outside consultants." Prudential went down that route for a while, and then Clegg decided it was costing too much money, taking too much time, and the software tool being used was never going to cope.

"So I decided to get some professionals in," he says, noting he used Cedar Knowledge Solutions, the predecessor of Kognitio Ltd., a consultant and system integrator based in High Wycombe, England. The original system Prudential was working on in-house was going to take seven days, 24 hours a day, to migrate, he notes.

"But that wasn't deemed acceptable," he says. "We had a target to migrate over a weekend, a 48-hour period, which was the maximum time the business could be without system capability," he adds, with a possible extra day if the migration took place over a holiday weekend. This was important so that the system would not be down for clients who phoned in.

Clegg says he was attracted to Kognitio/Cedar because it had experience with the data migration process and quite a rigorous system for determining whether everything had been migrated across properly.

Data cleanup was an integral part of the project, he says. "We had to determine which data we were going to take with us and decided not to bother to take some of the data that was 20 years old and didn't fit with new requirements." (Clegg notes the old data is still on file if Prudential needs it. "But legally, you're only required to keep it seven years in the United Kingdom.")

Because the new system had rigorous data-integrity requirements about what does or doesn't get in, "you can't migrate until you've sorted out these issues," he says. "The only alternative is to exclude the records that haven't got the proper data."

He indicates the new system is geared around data integrity. The old system, however, didn't have such requirements, he adds. As a result, when the data was dumped onto the live system as a test run, an exception report would be thrown out, he says, explaining the data migration program highlighted the records that would fail in the migration.

"This is a report that shows you all the data points that are wrong," he notes, which were then corrected on the live system. "We had people, for example, who didn't have their sex listed in their file," he says. "On the new system, that's not possible; you have to fill in the field for the client's sex. We had to run reports on all the people who didn't have the specific fields filled in."
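The scan Clegg describes, a pass over the records that lists every field the new system would reject, is straightforward to sketch. Below is a minimal, hypothetical illustration in Python; the field names and records are invented, since the article does not describe Prudential's actual schema or tooling:

```python
# Sketch of a migration exception-report scan: flag records that would
# fail the new system's data-integrity rules. Field names and records
# are illustrative only, not Prudential's actual schema.
REQUIRED_FIELDS = ["policy_id", "surname", "sex", "date_of_birth"]

def exception_report(records):
    """Return (record, missing_fields) pairs for records that would fail migration."""
    failures = []
    for record in records:
        missing = [f for f in REQUIRED_FIELDS if not record.get(f)]
        if missing:
            failures.append((record, missing))
    return failures

clients = [
    {"policy_id": "P001", "surname": "Smith", "sex": "F", "date_of_birth": "1941-03-02"},
    {"policy_id": "P002", "surname": "Jones", "sex": "", "date_of_birth": "1938-07-19"},
]

for record, missing in exception_report(clients):
    print(record["policy_id"], "is missing:", ", ".join(missing))
```

Running this scan repeatedly, fixing the flagged records, and rerunning until the report is empty mirrors the iterative cleanup cycle Clegg describes below.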

Another more difficult part of the data cleanup involved discovering why a particular calculation had been made for pension payments and why the payments didn't follow a regular trend, he says. "In other words, you might have had, for example, a policy that had an escalation rate of, say, five percent a year, but you might find one that has an escalation rate of five percent but six percent every second year." These were complexities that had to be sorted out, he points out.
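Detecting that kind of irregular trend amounts to recomputing the expected payment stream under the stated escalation rate and flagging the years that deviate. A hand-rolled sketch with invented figures (the article does not give Prudential's real calculations):

```python
# Sketch: flag pension payments that stray from a steady escalation
# trend. The five-percent rate and payment figures are invented.
def expected_payments(base, rate, years):
    """Payment per year under a constant annual escalation rate."""
    return [round(base * (1 + rate) ** n, 2) for n in range(years)]

def find_anomalies(actual, base, rate, tolerance=0.01):
    """Indices where an actual payment deviates from the expected trend."""
    expected = expected_payments(base, rate, len(actual))
    return [i for i, (a, e) in enumerate(zip(actual, expected))
            if abs(a - e) > tolerance * e]

steady = expected_payments(1000.0, 0.05, 5)
irregular = list(steady)
irregular[3] *= 1.06  # one year escalated an extra six percent

print(find_anomalies(steady, 1000.0, 0.05))     # []
print(find_anomalies(irregular, 1000.0, 0.05))  # [3]
```

Each flagged year then needs the kind of manual investigation described in the article: was the irregularity a deliberate policy term or a data error?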

In some cases, this meant quite a substantial change to the record, he says. "It might sometimes mean writing to the customer, saying, 'Look, you've been paying us too little for the last 10 years, and you now owe us £100, but we'll waive that,'" he says. "It was always done in the customer's favor."

Because these systems controlled the payment of retirees' pensions, Clegg says it was important the new system paid clients the right amount of money on a monthly, quarterly, or annual basis. It needed to be a seamless change for the clients.

However, he says, there always are anomalies. "As a result, we'd have to explain to clients we moved to a new system, but they shouldn't worry, and it wouldn't affect what they received."

All the records that were missing data points or had anomalies were sent back to Prudential's data-cleanup group to be entered manually, he says.

"This data-cleanup process was done over and over again, so when we came to the day of actual migration, we did one last run of the scans, and in most cases, there were zero anomalies because they'd all been fixed," Clegg says. "The one or two that crept through since the last update, we corrected manually."

As a result of this rigorous process, "we migrated with 100 percent success," he says, noting that there were only four records that couldn't be migrated and were corrected later.

To complete the implementation, the new system first went live for new business only, he says. "Then we did the migration of one of the systems, and then the migration of the second one," Clegg recalls. This meant customer services had to have all three systems running initially. The old systems were then turned off one at a time, "so at the end we just had the new system running with all the data on it."

He says Prudential performed three dress rehearsals for each system implementation. "Each dress rehearsal was actually a go-live with a back-out. When the dress rehearsal was done, we could have stayed live, but we chose to back out." This enabled Prudential to make sure everything was running well, and "it gave us a chance to check out the backup procedures."

As for the pitfalls of data migration of legacy systems, Clegg says it's important for a company to be prepared for an expensive and time-consuming exercise. "It's not something you can just take on and tuck into another project or do as part of your day-to-day line activities." It is a professional's job, which is best done by people who understand data migration, he says. "It's a specialist task; it needs a separate stream of people. It's not something that you give a programmer to do."

"It's like going on a trip to the Antarctic. You could go and buy all the right clothes, but you'd be miles and miles ahead if you took someone who had actually been there once," he says.

Challenge: Merging Companies
Some legacy systems need to be merged and updated in order to make an underwriter's job easier.
Derek Southgate, IT development manager, Advent Underwriting, a London-based Lloyd's managing agency, oversaw a nine-month data migration project involving the merger of two underwriting systems into one, which became necessary after a merger of two agencies that formed Advent.

Both of the underwriting systems, which had been supplied by different software houses, were aging and had limited ongoing support from the software companies, he says. "We did look at converting the data from one of the systems onto the other, so all the syndicates could be moved onto the same platform," he notes. "However, neither of those systems seemed to be an appropriate platform to go forward on."

These old AS/400 systems did risk recording, claim recording, reinsurance management, and all the associated accounting exercises in the context of the Lloyd's environment, he explains.

As part of the data migration process, Southgate and his team selected ROOM Solutions' Subscribe system; ROOM's associated aggregation management product, EXACT; and ROOM's business intelligence service that provides a data warehouse. The London-based ROOM Solutions offers products and consultancy services.

"We elected to do the data extract from the old systems by first using the data warehouse as a staging exercise," he says, explaining Advent migrated the data first into the business intelligence data warehouse and then subsequently did a more detailed conversion to the Subscribe product.

"We used the first conversion to prove we could extract the correct sets of information from the original AS/400 systems," he says.

The old systems were the result of 10 years of maintenance, patches, and data fixes, he notes. "Obviously, these things didn't always get done consistently, so we ended up with orphan policies, premiums, claims, and outstandings," he says. An accounting transaction should have an associated policy or claim, he explains, and all the missing links have to be resolved.
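Finding such orphans is essentially a referential-integrity check: every transaction's policy or claim reference must resolve to an existing parent record. A minimal sketch with invented identifiers (the real AS/400 record structures are not described in the article):

```python
# Sketch: find accounting transactions whose referenced policy or claim
# is missing. IDs and record shapes are invented for illustration.
policies = {"POL-1", "POL-2"}
claims = {"CLM-9"}

transactions = [
    {"txn_id": "T1", "policy_ref": "POL-1", "claim_ref": None},
    {"txn_id": "T2", "policy_ref": "POL-7", "claim_ref": None},  # POL-7 does not exist
    {"txn_id": "T3", "policy_ref": None, "claim_ref": "CLM-9"},
]

def orphan_transactions(txns, policies, claims):
    """IDs of transactions whose policy or claim reference cannot be resolved."""
    orphans = []
    for txn in txns:
        policy, claim = txn["policy_ref"], txn["claim_ref"]
        if (policy and policy not in policies) or (claim and claim not in claims):
            orphans.append(txn["txn_id"])
    return orphans

print(orphan_transactions(transactions, policies, claims))  # ['T2']
```

Each orphan found this way is a candidate for the kind of compromise-by-compromise resolution Southgate describes next.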

"We worked with ROOM to come up with the best set of compromises over the different sets of data deficiencies we found," he says. "We progressively worked on the quality of data until we could do a reconciliation with the basic accounting amounts."

The next stage involved proving this data against business-oriented reports, which was the start of the user-acceptance testing phase, he says, explaining comparisons were made between the analyses available from the old systems and the equivalent being developed with ROOM. "It became a progressive exercise in refining our understanding of the underlying data and what we were going to get out of the ROOM system."

He adds it's essential to get the business people to sign off on the new system so that the auditors, underwriters, claims, and accounts people actually accept it as an adequate replacement for the old system.

Due to time constraints, Advent didn't have the luxury of a large amount of parallel running, where effectively both systems could be operated at the same time. "We got to the deadline, and we did all the conversion over a long weekend, and from that time onward, underwriters were live on the new systems," Southgate adds.

"We had to go back and do a bit of data patching because we realized we hadn't got all the information out of the old systems or there were inconsistencies," he says. "However, by that stage, we understood enough about the new systems and the old systems to be able to do that in a fairly controlled fashion."

He says the migration of the legacy systems into the new system has helped the underwriters to become far more self-sufficient. "They now have an analysis tool so that they can go in and do the vast majority of statistical analyses themselves, which has reduced the requirement within the IT area for a lot of ad hoc reports." IT now has to help underwriters only when there's a problem or there's a more complicated analysis required, Southgate affirms.

What were the pitfalls? Southgate says Advent took too many steps in the conversion process. "It would have been better if we could have targeted the final conversion style on the first stage of the conversion rather than doing a total of three steps. That would have meant less rewriting of some of the scripts, as we could have progressively populated the conversion database," he says.

Another problem discovered after the conversions was "we hadn't realized there were certain business analyses done using some of the data, which had to be added later," he says. "Because the whole lot was done in such a restricted period of time, we didn't have a full set of all the business requirements defined up front."

Southgate says it would have been useful to have more time to find out about all the uses, often hidden in user spreadsheets, of the old systems. Nevertheless, even the missing data wasn't that important because Advent did some work after the implementation of the data migration, he says. "Once we understood the target systems, we could work with ROOM to minimize the impact of any changes, so when we fed through the corrected information, it was a relatively straightforward exercise, even though the data warehouse had been built. We managed to do that without holding up any of the production work," he adds.

Challenge: 15 Legacies, One Data Warehouse
The issue of multiple legacy systems also can be addressed via the use of a data warehouse, says Tony Ledger, IT director for Markel International in London. Markel International runs the international insurance operations of Markel Corp., the Richmond, Va.-based insurance company.

Ledger says Markel is using the services of Sagent UK Ltd. to assist in a data warehouse project, which will make the data in the legacy systems available to underwriters and managers within the business units, so they can analyze the business and enhance the quality of decision-making going forward. Sagent UK is a data integration and data analytics company based in Reading, England.

Currently, Markel has 15 line-of-business processing systems as a result of mergers and acquisitions, Ledger says, which has made it extremely difficult to gather data to analyze the business. Prior to the creation of the data warehouse, management reporting was a time-consuming process, he says.

He notes most of the systems were written in-house over many years, with some going back as far as 15 years. All the current systems capture the information on the risks Markel underwrites: they process premium payments, claims, and all of the technical accounting that goes with the underwriting, he says.

The data migration project will bring all the data together, consolidate it, put it into one data warehouse bucket, and organize it within the data warehouse database, Ledger affirms. The warehouse will make it easy to assess data for analysis, unlike the current legacy systems, he says. This gives the underwriters information in a format that enables them to manage the business.

The data warehouse migration will also ensure Markel is consistent in its use of definitions of data. "So when we say net premium, everybody understands what net premium means, and the relevant figures have been pulled through from the 15 systems into the data warehouse," he says. As an example, this will permit an underwriter to see net premium by broker for 2002 for business written in the United States, he explains.
Hopefully, via management information analysis, the data warehouse will assist the company in growing the business profitably and weeding out those parts of the business that don't make money, asserts Ledger.
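The example query Ledger gives, net premium by broker for 2002 for U.S. business, becomes a simple filter-and-aggregate once warehouse rows share one set of definitions. A sketch over invented rows (not Markel's actual data model):

```python
from collections import defaultdict

# Invented warehouse rows; the fields mirror the example in the text.
rows = [
    {"broker": "Alpha", "year": 2002, "territory": "US", "net_premium": 500.0},
    {"broker": "Alpha", "year": 2002, "territory": "US", "net_premium": 250.0},
    {"broker": "Beta",  "year": 2002, "territory": "UK", "net_premium": 900.0},
]

def net_premium_by_broker(rows, year, territory):
    """Total net premium per broker for one year and territory."""
    totals = defaultdict(float)
    for row in rows:
        if row["year"] == year and row["territory"] == territory:
            totals[row["broker"]] += row["net_premium"]
    return dict(totals)

print(net_premium_by_broker(rows, 2002, "US"))  # {'Alpha': 750.0}
```

The query only gives a correct answer because the transform step has already mapped every source system's figures onto the same "net premium" definition, which is the consistency point Ledger is making.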

Explaining the data migration process, Ledger says Markel is using Sagent, an extract, transform, and load (ETL) tool, to pull out the data from the 15 core systems and put it into a data staging area, which is where the 15 become one.

Sagent extracts the data out of the source systems, "then does some transformations to the data to ensure that when it then gets loaded into the data staging area, we're being consistent in what we're loading, so we're using the same terminology and definitions," he says.

Then from the data staging area, Sagent also is used to move the data into the data warehouse database, "where it's then organized in such a way that makes it accessible for analysis," he adds.
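The extract, stage, and load flow Ledger outlines can be sketched generically. Sagent's actual API is not shown in the article, so the following is a hand-rolled Python illustration of the same pattern, with invented source systems and field mappings:

```python
# Generic sketch of the pattern described above: extract rows from
# several source systems, transform them onto one set of terms in a
# staging area, then load the unified rows into the warehouse.
# Systems, field names, and figures are invented for illustration.
SOURCES = {
    "system_a": [{"prem_net": 1200.0, "yr": 2002}],
    "system_b": [{"net_premium": 800.0, "year": 2002}],
}

# Transform step: map each system's terminology to one shared definition.
FIELD_MAP = {
    "system_a": {"prem_net": "net_premium", "yr": "year"},
    "system_b": {"net_premium": "net_premium", "year": "year"},
}

def extract_and_stage(sources, field_map):
    """Extract rows from every source and stage them under common field names."""
    staging = []
    for system, rows in sources.items():
        rename = field_map[system]
        for row in rows:
            staging.append({rename[field]: value for field, value in row.items()})
    return staging

def load(staging):
    """Load staged rows into the warehouse with consistent types."""
    return [{"net_premium": float(r["net_premium"]), "year": int(r["year"])}
            for r in staging]

warehouse = load(extract_and_stage(SOURCES, FIELD_MAP))
print(sum(row["net_premium"] for row in warehouse))  # 2000.0
```

The per-system field maps are where "the 15 become one": adding a sixteenth source is just another entry in the mapping, which is the kind of easy extension Ledger credits the ETL tool with below.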

"The big advantage with a tool like Sagent is that it permits future changes to be made to the extraction of the data because it's extremely fast and easy to replicate," he contends.

"Within the next 18 to 30 months, we will be looking to replace 14 of those processing systems with one and the 15th with a new system," he explains, in a traditional system conversion exercise. "So we'll go from 15 to two. There obviously will be quite considerable data migration as part of that."

© 2024 ALM Global, LLC, All Rights Reserved.