Consider this: Gartner predicted that in 2017, 60% of big data projects would fail. That is a staggering figure, given the level of investment organizations are making in technology to take advantage of the data they generate.
The concept of big data is not new in the insurance industry. If leveraged correctly, big data can transform the way insurers do business by allowing them to continually enrich the customer experience, streamline operational costs and increase profitability. Given these intuitively obvious benefits, why is the momentum behind big data still lagging?
Here are three of the main challenges.
Related: How carriers can leverage the power of big data
No. 3: Lack of clear technology direction
Often, in a large organization, there are multiple technology initiatives and, in some cases, multiple big data projects. Sometimes these efforts proceed in parallel without sufficient communication between the separate teams. This can cause unnecessary overlaps and conflicts that minimize or, in the worst case, neutralize the overall potential benefit of these efforts to the organization. In other cases, there is an attempt to unify the different efforts into one strategic effort. This approach consumes time in building consensus and can move too slowly to reap timely benefits.
Given the flexibility, scalability and adaptability of today's technologies, it may be acceptable, and even necessary, in several instances to continue separate efforts. In a conceptual sense, this is no different from separate core systems (policy, CRM, claims) existing within an organization. However, there must be a focus on clearly defining and designing the interfaces between the disparate efforts, so that data can be exchanged and understood between any two environments easily, reliably and quickly.
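One way to make those interfaces concrete is for the teams to agree on a shared, validated record definition for the data they exchange. The sketch below is a minimal illustration in Python; the field names, systems and rules are hypothetical assumptions, not drawn from any particular carrier's environment.

```python
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass(frozen=True)
class PolicyRecord:
    """Hypothetical shared contract for policy data exchanged between
    two separate efforts (e.g., a policy-system feed and a CRM feed)."""
    policy_id: str
    product_code: str
    activation_date: Optional[date]   # may be missing in some sources
    annual_premium: float
    source_system: str                # which environment produced the record

def validate(rec: PolicyRecord) -> List[str]:
    """Return contract violations; an empty list means the record can be
    exchanged between environments as-is."""
    issues = []
    if not rec.policy_id:
        issues.append("policy_id is required")
    if rec.activation_date is None:
        issues.append("activation_date is missing")
    if rec.annual_premium < 0:
        issues.append("annual_premium cannot be negative")
    return issues

# A record produced by one effort, checked before it is handed to another.
rec = PolicyRecord("POL-1001", "AUTO", None, 1200.0, "policy_admin")
print(validate(rec))   # ['activation_date is missing']
```

With a contract like this agreed up front, each effort can evolve its own environment while the exchange points stay stable and verifiable.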
This is often easier to address than attempting to unify multiple technology efforts into one, and it should be a key consideration in the early stages of each effort. However, it is commonly overlooked during the design and deployment of individual efforts and becomes too expensive to address later. There are three key aspects to ensuring communication and understanding across multiple efforts:
- The technology framework to accomplish this;
- Establishing a consistent approach to ensuring data quality and integrity as data flows between systems; and
- Establishing a common, unified understanding of the data, such that technical data is mapped to business terms and functions in a way that is consistent and comprehensible across the overall organization (a brief sketch of this mapping follows the list).
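As a small illustration of that third aspect, a shared business glossary can map each system-specific field name to a single business term and definition. The entries below are hypothetical and purely illustrative.

```python
# Hypothetical business glossary: system-specific field names map to one
# agreed business term and definition.
BUSINESS_GLOSSARY = {
    "policy_system.APRV_DT":   {"term": "Policy activation date",
                                "definition": "Date coverage became effective"},
    "crm.activation_date":     {"term": "Policy activation date",
                                "definition": "Date coverage became effective"},
    "policy_system.WRTN_PREM": {"term": "Written premium",
                                "definition": "Premium booked at policy issuance"},
}

def business_term(system_field: str) -> str:
    """Translate a system-specific field name into its business term."""
    entry = BUSINESS_GLOSSARY.get(system_field)
    return entry["term"] if entry else "UNMAPPED - route to data governance"

print(business_term("crm.activation_date"))   # Policy activation date
print(business_term("claims.loss_dt"))        # UNMAPPED - route to data governance
```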
Related: 9 ways insurers can drive more value from technology investments
No. 2: Unclear mapping of business objectives to specific criteria
Another common challenge is that a big data effort is frequently initiated with a limited understanding or assessment of how to map the overall objectives to the specific business requirements and metrics that would clearly define the success of such an effort.
For example, a business case objective may be to reduce churn by a certain percentage. The focus then usually falls on the technology and tools to put in place, the data to ingest, and the analytics to create in order to predict, identify and analyze churn patterns. Sounds right? Yes, but a couple of elements are commonly overlooked.
What level of data quality is required? Even if the technology is spot on, the accuracy of the analytics will be questionable if the data quality and integrity requirements are not thought through. Here is a simple example: for any churn analysis, the approval/activation date of a policy is an essential attribute. What if, due to circumstances outside the control of the big data effort, that attribute is not reliably populated in a source system? That puts the effectiveness of the entire analysis in doubt.
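To make that requirement concrete, a data quality rule can be stated and tested before the churn analysis ever runs, for example: "the activation date must be populated, and not in the future, for at least 99% of policies in scope." A minimal sketch follows; the threshold and field names are assumptions used only for illustration.

```python
from datetime import date

# Illustrative sample of policy rows pulled for the churn analysis; in
# practice this comes from the ingestion pipeline.
policies = [
    {"policy_id": "POL-1", "activation_date": date(2016, 3, 1)},
    {"policy_id": "POL-2", "activation_date": None},             # unreliable source field
    {"policy_id": "POL-3", "activation_date": date(2017, 1, 15)},
]

REQUIRED_COMPLETENESS = 0.99   # assumed business requirement, for illustration

valid = sum(1 for p in policies
            if p["activation_date"] is not None and p["activation_date"] <= date.today())
completeness = valid / len(policies)

if completeness < REQUIRED_COMPLETENESS:
    print(f"Activation date completeness is {completeness:.0%}, below the "
          f"{REQUIRED_COMPLETENESS:.0%} requirement; churn results would be suspect.")
```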
Is there a common framework for understanding what the data from the different sources means from a business standpoint, and who owns which parts of the data? For example, if the approval/activation date of a policy is available from both the policy system and the CRM system, which of them is the correct owner? Is the definition and understanding of this attribute common across the organization?
The typical scenario in a big data effort is that the team consists of technologists and data scientists who end up having to address the data quality/integrity and governance issues before any useful results can be produced. This unplanned effort not only derails big data initiatives but also demotivates the team because they have to jump deep into areas that are not their core expertise or interest.
No. 1: Lack of easy, timely access to the results
Typically, more attention is given to the technology that enables ingestion and storage of big data than to the specific access requirements of business users. The thinking is that the bigger challenge is to source the data from disparate systems and then store it in a suitable big data platform; once that is done, the platform will enable access by business users.
This is true. However, will the data be available in a timely manner to each group of business users? Will the business user be able to "slice-and-dice" the data without having to rely on technologists? Will the business user have a set of relevant "out-of-the-box" analytics they can leverage, or will they have to build their own? Will the analytics be operationalized to work within current processes and business practices, or will users have to change the way they work to leverage the analytics?
Many of these questions are left unanswered until late in the game. This unfortunately means that business users are left with a platform that has tremendous potential that they are not able to exploit.
Related: From FNOL to settlement — data matters
So, what should be done?
In the challenges described above, there is a common theme: beyond the technical aspects of a big data initiative, a unified understanding of the data, its quality and integrity, and its accessibility for analysis is essential to the success of such efforts.
Data quality: It is crucial to include a clear definition of the data quality and integrity requirements in the planning for such efforts.
The data quality assessment should determine the quality of the data at the source as well as the quality of the data exchanged between systems, and should include an approach to validating data quality on an ongoing, automated basis. While data within a source may be of good quality at the moment, that may change as the source environment evolves over time and as data is moved from that source to other systems. With a standardized, auditable and automated end-to-end data quality framework, insurers can remain vigilant and catch errors before they impact the business.
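Here is a minimal sketch of what "standardized, auditable and automated" can look like in practice: the same rule definitions are run against each hop the data makes, on a schedule, and every run is appended to an audit log so quality drift can be tracked as source environments change. The rules, stage names and log file below are illustrative assumptions.

```python
import json
from datetime import datetime, timezone

# Standardized rules applied at every hop (source extract, landing zone,
# analytics layer) so results are comparable end to end.
RULES = {
    "activation_date_populated": lambda row: row.get("activation_date") is not None,
    "premium_non_negative":      lambda row: row.get("annual_premium", 0) >= 0,
}

def run_checks(stage, rows):
    """Run every rule against a batch of rows and append an auditable result."""
    result = {
        "stage": stage,
        "run_at": datetime.now(timezone.utc).isoformat(),
        "row_count": len(rows),
        "failures": {name: sum(0 if rule(r) else 1 for r in rows)
                     for name, rule in RULES.items()},
    }
    # The audit trail lets insurers see quality drift as source systems change.
    with open("dq_audit_log.jsonl", "a") as log:
        log.write(json.dumps(result) + "\n")
    return result

batch = [{"activation_date": "2017-01-15", "annual_premium": 1200.0},
         {"activation_date": None, "annual_premium": 950.0}]
print(run_checks("landing_zone", batch))
```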
Data governance: It is crucial that a data governance approach is part of any big data initiative.
Data governance capabilities can bridge the business and technical divide by delivering transparency into all aspects of an insurer's data assets, from the data available, its owner/steward, lineage and usage, to its associated definitions, synonyms and business attributes. Full transparency allows business decision makers to gain valuable insight into not only the details of their data assets, but also the risks associated with their use across business applications.
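In practice, that transparency is often captured as catalog metadata recorded alongside each data asset. The record below is a hypothetical sketch of the kinds of fields involved (owner/steward, lineage, definitions, synonyms and downstream usage), not the schema of any particular governance product.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CatalogEntry:
    """Hypothetical governance metadata recorded for one data asset."""
    asset: str              # technical name of the data asset
    business_term: str      # agreed, business-facing name
    definition: str
    owner: str              # accountable business owner
    steward: str            # day-to-day data steward
    lineage: List[str] = field(default_factory=list)   # upstream sources, in order
    synonyms: List[str] = field(default_factory=list)
    used_by: List[str] = field(default_factory=list)   # downstream applications

activation_date = CatalogEntry(
    asset="big_data_platform.policy.activation_date",
    business_term="Policy activation date",
    definition="Date coverage became effective",
    owner="VP, Policy Administration",
    steward="Policy data steward",
    lineage=["policy_system.APRV_DT", "etl.policy_daily_extract"],
    synonyms=["approval date", "effective date"],
    used_by=["churn model", "retention dashboard"],
)
print(activation_date.business_term, "- owned by", activation_date.owner)
```

An entry like this also settles the ownership questions raised earlier: when two systems carry the same attribute, the catalog names the system of record and the accountable owner.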
Accessibility to data and results: The data and any deployed analytics must be accessible in a relevant, flexible and timely manner to business users.
Finding an analytics "solution" is far more important than finding an analytics "tool." While tools generally come with a great deal of flexibility and capability, it is up to the business user either to rely on technologists to produce useful information or to spend considerable time and effort learning to exploit all the capabilities of the tool. Neither approach is scalable. It is important to select a solution that:
- Seamlessly operationalizes or integrates the analytics into current processes or work flows;
- Comes with a set of "out-of-the-box" analytics and results that business users can leverage and distribute in an intuitive manner from day one, in addition to other, more powerful capabilities that business users can learn over time; and
- Provides the data to business users in a timely manner, rather than leaving it to them to figure out how best to extract the data from the platform.
With the right combination of analytics, data quality and data governance, insurers can deliver on the success of big data initiatives.
Ravi Rao is a senior vice president of pre-sales at Infogix, a pioneer of automated data quality, data governance and advanced analytics solutions. He can be reached by calling (630) 505-1800.