The top 10 data-migration considerations for insurers
From cleaning data to avoiding security surprises, insurers should consider these steps when moving data from one store to another.
Migrating data for a system in any business can be challenging, but migrating data for insurance systems, such as policy administration or claims platforms, is even more so.
Unlike basic use cases, where simply moving the data from one store to another and monitoring for quality gets the job done, migrating data within insurance systems requires those disciplines and more.
The following are the top 10 considerations when migrating insurance data to help you get started:
1. First, clean it
The old saying “garbage in, garbage out” has stood the test of time for a reason: it’s a simple concept and almost always true. With any migration, and especially an insurance-related one, there is no reason to bring over dirty or incorrect data. Treat the conversion as an opportunity to upgrade data quality in flight, so the new system starts out cleaner than the old one.
In insurance, the system being migrated often contains blocks of data that need special handling, such as data that was previously migrated and doesn’t conform to expected business rules. Identify and remediate these blocks before or during conversion. Now is the time to invest in clean data.
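To make that concrete, here is a minimal sketch of the kind of rule-based triage that can run before or during conversion. The field names, statuses, and rules are illustrative assumptions, not an actual insurer’s schema:

```python
# Minimal sketch: flag legacy records that violate expected business rules
# before conversion. Field names and rules are illustrative only.
from datetime import date

def validate_policy(record: dict) -> list[str]:
    """Return a list of rule violations for one legacy policy record."""
    issues = []
    if not record.get("policy_number"):
        issues.append("missing policy number")
    if record.get("issue_date") and record["issue_date"] > date.today():
        issues.append("issue date in the future")
    if record.get("status") not in {"ACTIVE", "LAPSED", "TERMINATED"}:
        issues.append(f"unexpected status: {record.get('status')!r}")
    return issues

def triage(records: list[dict]) -> tuple[list[dict], list[tuple[dict, list[str]]]]:
    """Split records into clean ones and ones needing remediation."""
    clean, dirty = [], []
    for rec in records:
        problems = validate_policy(rec)
        if problems:
            dirty.append((rec, problems))
        else:
            clean.append(rec)
    return clean, dirty
```

In practice, rules like these come from business SMEs and grow over the life of the project; the value is in making them executable and repeatable across mock conversions.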
2. It is about more than mapping
In the case of insurance data, migration is about moving the substance and intent of the business. Moving only the data objects does not complete the job — you must make sure the multitude of business rules moves with them.
In fact, rules may need to change so the new system produces the same results as the old one. Insurance legacy systems often overload data elements with information that only operations SMEs know how to interpret, so now is the time to codify that business intent in clear, documented, governed data.
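As an illustration of codifying that intent, the sketch below decodes a hypothetical overloaded legacy status code into explicit, documented fields. The codes and their meanings are invented for the example:

```python
# Illustrative sketch: a legacy system overloads one "status" code with
# several meanings that only operations SMEs know. A decode table makes
# that tribal knowledge explicit and governable. Codes are hypothetical.

LEGACY_STATUS_DECODE = {
    # legacy code -> (policy_status, billing_mode, reinstatement_eligible)
    "A1": ("ACTIVE", "ANNUAL",  False),
    "A2": ("ACTIVE", "MONTHLY", False),
    "L9": ("LAPSED", "NONE",    True),   # lapsed, within grace period
    "LX": ("LAPSED", "NONE",    False),  # lapsed, beyond grace period
}

def decode_status(legacy_code: str) -> dict:
    """Translate an overloaded legacy code into explicit, governed fields."""
    try:
        status, billing, reinstatable = LEGACY_STATUS_DECODE[legacy_code]
    except KeyError:
        raise ValueError(f"unmapped legacy status code: {legacy_code!r}")
    return {
        "policy_status": status,
        "billing_mode": billing,
        "reinstatement_eligible": reinstatable,
    }
```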
3. Be sure to identify owners
Remember that resource constraints are inevitable. It is critical to know, early in the planning process, where conversion capacity can or should be sourced.
Also evaluate whether adding third-party staff is an option for keeping resource constraints from affecting timelines. If new tools and environments will be introduced to facilitate the process, plan for them up front: they will dramatically speed things along once implemented, but they will be a distraction until they are in place.
4. Right-size the work
Generally, conversion work will run only once in production and then be set aside, so it needs to be effective and accurate, but it does not need to be as fully automated and operationalized as ongoing processes. If manual or hard-coded approaches can properly convert edge cases, consider them rather than writing code to handle each case individually.
5. Understand the data retention complexity
Data retention requirements are complex, and applying them to applications and data stores that weren’t designed to support them can be difficult. Anticipate this challenge and design for it so the effort and the necessary resources are well understood. If this work is not on the critical path for the cutover, consider getting a “Day 2” deferral sign-off from key stakeholders.
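One way to scope this work is to make retention rules explicit and computable. The sketch below shows the general shape; the record classes and retention periods are invented for illustration, not actual regulatory values:

```python
# Hypothetical sketch: retrofit retention handling onto migrated records
# by tagging each with a retention class and computing its purge date.
from datetime import date

# Illustrative retention periods, in years, keyed by record class.
RETENTION_YEARS = {"CLAIM": 7, "POLICY": 10, "CORRESPONDENCE": 3}

def purge_date(record_class: str, trigger: date) -> date:
    """Compute the earliest date a record may be purged after its
    retention trigger (e.g., claim closure or policy termination)."""
    years = RETENTION_YEARS[record_class]
    try:
        return trigger.replace(year=trigger.year + years)
    except ValueError:  # trigger fell on Feb 29; fall back to Feb 28
        return trigger.replace(year=trigger.year + years, day=28)
```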
6. Avoid security surprises
Nothing can stop an implementation in its tracks faster than a security impasse.
Security and access standards can vary, and how they have been implemented across applications and data stores may be inconsistent. It is critical to understand all the security requirements early in the process, address everything flagged as mission-critical, and either resolve lower-priority needs before cutover or obtain a “Day 2” deferral sign-off.
7. History is hard
To properly service accounts in the target system, including historical point-in-time account valuation and reverse/reapply transactions, detailed financial transaction history is usually required. This can be particularly important for open blocks, or for closed blocks that are still accepting subsequent payments. The older the history, the harder it is to convert, especially if the migrating book of business has been through prior conversions.
For example, legacy policy administration systems store transactional history that can be used to rebuild the position of an annuity contract, but only if that information is 100% available and accurate.
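A simplified sketch of that rebuild logic might look like the following. The transaction types and fields are hypothetical, and a real contract would track units, funds, and fees separately:

```python
# Sketch of rebuilding an annuity contract's point-in-time value by
# replaying transaction history. Types and fields are hypothetical.
from datetime import date

SIGNS = {"PREMIUM": 1, "INTEREST": 1, "WITHDRAWAL": -1, "FEE": -1}

def position_as_of(transactions: list[dict], as_of: date) -> float:
    """Replay all transactions up to as_of to reconstruct contract value."""
    value = 0.0
    for txn in sorted(transactions, key=lambda t: t["effective_date"]):
        if txn["effective_date"] > as_of:
            break
        sign = SIGNS.get(txn["type"])
        if sign is None:
            # An unknown type means history is incomplete or mis-mapped;
            # the rebuilt value cannot be trusted.
            raise ValueError(f"unrecognized transaction type: {txn['type']!r}")
        value += sign * txn["amount"]
    return value
```

The failure mode in the code mirrors the failure mode in the project: one unmapped or missing transaction and the rebuilt position no longer ties out.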
8. Think holistically
If most of the complicated aspects of the migration go well but there are issues with someone’s email or personal drives, those issues tend to be what stakeholders remember about the project. Unstructured data (especially enterprise document management and email) is where a great deal of data-retention leakage can occur. Content outside of structured applications and data stores may or may not be treated as part of the core data migration/conversion workstreams, but it shares similar characteristics and needs to be planned for properly, including its retention requirements.
9. All about testing
Multiple mock conversions and/or parallel tests are recommended to fully validate the business accuracy of the conversion work; keep in mind this process may require months of concentrated effort. Properly planning testing timelines will determine your critical path, and managing environments properly will be key to keeping that path flexible. At least one production-like environment dedicated to conversion testing is a best practice. It must include the storage and processing capacity to handle full production conversion tests, as well as the security controls and sign-off to handle the full production dataset.
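Each mock conversion should end with a reconciliation pass. A minimal sketch, assuming per-policy account values keyed by policy number (an illustrative simplification of real balancing reports), could look like this:

```python
# Sketch of a reconciliation check run after each mock conversion:
# compare policy counts and account values between the source extract
# and the target load. Keys and the tolerance are illustrative.

def reconcile(source: dict[str, float], target: dict[str, float],
              tolerance: float = 0.01) -> list[str]:
    """Compare per-policy values between systems; return discrepancies."""
    findings = []
    missing = source.keys() - target.keys()
    extra = target.keys() - source.keys()
    if missing:
        findings.append(f"{len(missing)} policies missing from target")
    if extra:
        findings.append(f"{len(extra)} unexpected policies in target")
    for policy in source.keys() & target.keys():
        diff = abs(source[policy] - target[policy])
        if diff > tolerance:
            findings.append(f"{policy}: value differs by {diff:.2f}")
    return findings
```

An empty findings list after a full-volume mock conversion is the kind of objective evidence that makes a go/no-go decision defensible.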
10. Be environmentally sound
You can expect resource contention when running a conversion program. Whether the resources are development staff, shared-services staff or testing environments, conflicts and constraints are inevitable. The more of these constraints you can eliminate through flexibility in environments or staffing, the easier the delivery becomes.
These 10 considerations are critical to a successful data migration project in the complex insurance landscape. Above all, they require proactivity and a sharp understanding of available resources and the project environment to mitigate constraints. By anticipating challenges and implementing an end-to-end process that prioritizes organization and data expertise, your team can execute data migration projects smoothly.
Rob Nocera is a technologist and IT leader who has been architecting, designing, and developing systems and applications within the insurance and financial services industry for over 25 years. Rob is the head of data and technology within Capco’s U.S. insurance practice. In this role, he works with the technology and data practices’ leadership to bring an insurance perspective to the solutions, methodologies, and partnerships the group works with.
Eric Fairchild has 30 years of experience in insurance and is the head of delivery for Capco’s U.S. insurance practice. The majority of his career has been spent as a management consultant for retirement, life and annuities companies leading large and complex programs including policy administration systems implementations, book of business acquisition and conversion projects, large-scale capability transformations and agile transformations.
Opinions expressed here are the authors’ own.