The small commercial dilemma for large insurers

The race to use predictive analytics to grab small commercial insurance market share is heating up.


The combination of a constantly growing universe of data and increasingly sophisticated ways to analyze it has driven the insurance industry's transformation over the past decade.

In commercial lines, large insurance carriers paved the way in establishing precision risk assessment using predictive analytics. This created a highly competitive environment that divided the industry between the “haves” and “have-nots,” allowing larger players to gain considerable market share.

Ultimately, this technology imbalance and the threat of adverse selection spurred the adoption of analytics across a majority of insurance carriers of all sizes.

It’s no coincidence that large insurers were the first to leverage analytics. They had the in-house resources to become early adopters, with significant investment capital and sizable historical data sets to use when building their predictive models. Armed with unprecedented insight into risk, they focused on the big-ticket policies and were able to carve out profitable niches that other insurers were unable to appropriately identify, assess, and price.

Fast forward to 2019. With a vastly diminished pool of larger policy opportunities, many insurers are emphasizing growth in the wide-open small commercial market segment where, remarkably, no insurer has more than 4% market share.

The U.S. small commercial market is anything but small. According to a 2016 McKinsey & Co. report, it makes up more than one-third of the U.S. commercial lines market and accounts for up to $103 billion in direct written premium. Despite this, an estimated 40% of sole proprietorships don't have any business insurance coverage.

While large insurers had the first-mover advantage when analytics technology was first catching on, that’s no longer the case as the race heats up to grab market share in the small commercial market.

Numerous players vying for attention

With a large share of small businesses lacking commercial insurance coverage, a number of players are looking to claim a leading position in the market. Although large insurers held a significant technology advantage in the past, insurers of all sizes have followed suit to become competitive in assessing risk through data-driven decisions.

Small businesses, more so than large employers, express interest in direct purchasing akin to online retail; their purchasing behavior more closely resembles that of individual consumers. This desire opens the floodgates for InsurTech companies to offer modern distribution platforms, powered by technology and predictive analytics, that make purchasing insurance simple. To accurately understand the needs of small businesses, insurers must understand key factors such as what motivates them, how they make purchasing decisions, and how insurers can influence those decisions. As McKinsey's research report discusses, small commercial customers can be grouped into need- and behavior-based segments that are far more accurate predictors of buying preferences than traditional data points such as business size and industry segmentation.

To succeed in this market, large carriers must continue to capitalize on analytics and use its capabilities to develop exceptional front-end systems for both agents and policyholders.

Small commercial data ‘blind spots’

Leveraging existing treasure troves of data to win the best business may have led to large insurers' early success, but they've fallen short when it comes to the small commercial market. Many chose not to actively pursue these accounts in the past because the high cost of underwriting made the segment unprofitable using traditional methods. That avoidance left a gaping hole in raw transactional data – the pricing data that follows the lifecycle of a policy from quote through claims. To make this a viable market, insurers need to leverage technology and advanced data techniques to validate application information at scale.

If large insurance companies fail to address the gap in transactional data, they will lose out to other incumbents, as well as to the InsurTechs promising a better customer experience. While there are numerous sources of data, choosing the most predictive ones and then manipulating that data remains a challenge: it's expensive, complicated, and time-intensive for backlogged IT resources. A general rule of thumb is to acquire enough data points on at least 10,000 claims to produce accurate modeling insights.

Understanding the importance of having transactional data to combine with in-house and external data also means appreciating that some variables will be more predictive than others. It’s the appropriate combination of these different data types that creates unique synthetic variables to dramatically improve predictive model lift. Synthetic variables are built from computations of more than one variable, made possible by leveraging large and diverse datasets.
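To make the idea concrete, here is a minimal Python sketch of a synthetic variable. The field names and the payroll-ratio example are hypothetical, chosen only to illustrate combining an in-house policy field with an external benchmark; they are not Valen's actual model inputs.

```python
# Illustrative sketch only: the variable names and the ratio below are
# hypothetical, not any insurer's actual model inputs.

def synthetic_payroll_ratio(policy_payroll, industry_avg_payroll):
    """Combine an in-house variable (reported payroll) with an external
    benchmark (a consortium-style industry average) into one synthetic
    variable: the ratio of reported payroll to the industry norm. A ratio
    well below 1.0 can flag possible under-reporting on an application,
    even though neither raw number looks suspicious on its own."""
    if industry_avg_payroll <= 0:
        raise ValueError("benchmark must be positive")
    return policy_payroll / industry_avg_payroll

# A reported payroll at half the industry benchmark stands out as a
# candidate predictor that neither source of data provides by itself.
ratio = synthetic_payroll_ratio(policy_payroll=250_000,
                                industry_avg_payroll=500_000)
print(round(ratio, 2))  # 0.5
```

The point of the sketch is that the synthetic variable only exists once both data sets are joined, which is why access to large, diverse data matters.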

To explore this, we conducted a study on the Valen Data Consortium using gradient boosting (GBM), a powerful machine learning algorithm, to rank common model variables. The orange dots in the graph below represent in-house variables and policy information that insurers would typically use to build a model. The blue dots represent the same variables with additional consortium data appended to create synthetic variables.

We found that leveraging synthetic consortium variables can provide up to 13 times the predictive power of policy-only variables. This holds true even for insurers with data sets large enough to build their own models. With predictive analytics quickly becoming table stakes, competitive advantage will be dictated by an insurer's ability to skillfully combine a number of different data sources to identify the highest predictive value through rigorous testing.
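For readers curious how GBM-based variable ranking works mechanically, the sketch below fits scikit-learn's gradient boosting model on simulated data and ranks features by importance. This is an illustration of the technique only, not the Valen consortium study: every field name, the simulated loss-ratio target, and the coefficients are invented.

```python
# Hypothetical illustration of ranking model variables with gradient
# boosting (GBM), not the Valen study itself. All fields are invented.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n = 2_000

# Two stand-ins for "policy-only" fields, plus one synthetic variable
# (reported payroll relative to an industry benchmark).
class_code = rng.integers(0, 10, size=n).astype(float)
years_in_business = rng.uniform(0, 30, size=n)
payroll_vs_benchmark = rng.uniform(0.2, 2.0, size=n)

# Simulated target: mostly driven by the synthetic variable, plus noise.
loss_ratio = (0.8 * payroll_vs_benchmark
              + 0.05 * years_in_business / 30
              + rng.normal(0, 0.1, size=n))

X = np.column_stack([class_code, years_in_business, payroll_vs_benchmark])
model = GradientBoostingRegressor(n_estimators=100,
                                  random_state=0).fit(X, loss_ratio)

# feature_importances_ sums to 1; higher means more predictive lift.
names = ["class_code", "years_in_business", "payroll_vs_benchmark"]
ranking = sorted(zip(names, model.feature_importances_),
                 key=lambda t: t[1], reverse=True)
for name, importance in ranking:
    print(f"{name}: {importance:.3f}")
```

In this toy setup the synthetic variable dominates the ranking because it carries most of the simulated signal, mirroring the kind of lift comparison described above.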

An analytics approach also enables straight-through processing of low-risk policies, creating more efficient policyholder and agent experiences. Agents control a significant portion of the small commercial market, yet commissions are small, as are the incentives for handling these policies. Agents are more likely to work with insurance carriers that bind policies quickly and require less manual processing.

Small business owners are looking to unburden themselves from decisions that don’t relate to their day-to-day business operations, so they appreciate simplicity, efficiency, and sensible pricing. Only an insurer that creates a transparent process and functions as a true partner can succeed in this market. In the end, it comes down to taking a comprehensive approach, complete with smart channeling of the data, and embracing the mindset of working in partnership with small businesses. The race for the small commercial market highlights how prevalent the adoption of technology within the insurance industry is becoming, and insurers that can provide the fastest service by most accurately aligning price-to-risk will win the best business.

Kirstin Marr (kirstin.marr@valen.com) is president of Valen Analytics. These opinions are her own.
