What you need to know about the wildfire risk model market

Insurers that continue to evolve their risk modeling gain a competitive edge in underwriting and risk management.

The Wall Street Journal recently published an article that examined the role of risk models in property & casualty (P&C) insurance in combating growing losses from climate, natural catastrophe and extreme-weather risks.

Catastrophe and risk models serve as the bedrock for evaluating and quantifying potential risks of a property or portfolio of properties. When it comes to the growing threat of natural catastrophes, risk models offer invaluable insights into the likelihood of losses resulting from these catastrophic events.

Understanding the risk model market and the data it needs is critical to safeguard against these risks and related losses.

Risk models vs. catastrophe models

People, even within the insurance industry, often conflate risk models and catastrophe models.

Risk models primarily compare risk levels between individual properties, aiding insurers in underwriting and pricing. They may examine risk for any number of different perils, including wind, hail, flood, fire, wildfire, hurricanes and more.

However, they are not designed to predict losses.

Conversely, catastrophe models are designed to estimate the potential financial impact of catastrophic events on portfolios and insureds. These models provide insights into the potential for large-scale losses that could jeopardize an insurer's solvency or capacity to settle claims due to the sheer volume of losses. Catastrophe models offer estimates of Probable Maximum Loss (PML) and Average Annual Loss (AAL), informing decisions regarding reinsurance purchases and setting coverage thresholds.
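To make these two measures concrete, here is a minimal sketch of how AAL and PML can be read off a set of simulated annual losses. The lognormal loss figures and the 1-in-250-year return period are illustrative placeholders, not output from any actual catastrophe model.

```python
import numpy as np

# Minimal sketch: estimating AAL and PML from simulated annual portfolio
# losses. The loss distribution below is an illustrative placeholder,
# not output from any actual catastrophe model.
rng = np.random.default_rng(seed=42)

# Hypothetical simulation: 10,000 simulated years of aggregate wildfire
# losses for a portfolio (heavy-tailed, as catastrophe losses tend to be).
annual_losses = rng.lognormal(mean=13.0, sigma=1.5, size=10_000)

# Average Annual Loss (AAL): the mean loss across all simulated years.
aal = annual_losses.mean()

# Probable Maximum Loss (PML): a high quantile of the annual loss
# distribution, e.g. the 1-in-250-year loss (99.6th percentile).
pml_250 = np.quantile(annual_losses, 1 - 1 / 250)

print(f"AAL: ${aal:,.0f}")
print(f"1-in-250-year PML: ${pml_250:,.0f}")
```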

While both risk and catastrophe models are pivotal to the insurance industry, they serve distinct purposes and are employed separately by different departments within insurance organizations.

The evolution of wildfire risk models

Risk models have evolved over the years. The surge in the frequency and severity of natural catastrophes and extreme weather events, particularly wildfires, stands out as a primary impetus behind the evolution of more accurate and comprehensive risk assessment tools. With wildfires becoming an increasing threat to property, particularly because of rapid development in wildfire-prone areas, insurers have endeavored to evaluate this risk more accurately.

Traditionally, the market has relied primarily on two wildfire risk models. The most widely used wildfire scoring model considers factors for fuel load, slope and access; the other primary scoring model differs in that it considers aspect instead of access.

These two models constitute the majority of the wildfire risk model market.
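To make the distinction concrete, the hypothetical sketch below contrasts the two scoring approaches. The weights and 0-10 factor scales are invented for illustration and do not reflect any vendor's proprietary formula.

```python
# Hypothetical sketch of the two traditional wildfire scoring approaches.
# Factor names are from the description above; the weights and 0-10
# scales are illustrative assumptions, not any vendor's actual formula.

def score_model_a(fuel_load: float, slope: float, access: float) -> float:
    """Model A style: fuel load, slope and access, each rated 0-10."""
    return 0.5 * fuel_load + 0.3 * slope + 0.2 * access

def score_model_b(fuel_load: float, slope: float, aspect: float) -> float:
    """Model B style: aspect (the direction a slope faces, which affects
    sun and wind exposure) replaces access as the third factor."""
    return 0.5 * fuel_load + 0.3 * slope + 0.2 * aspect

# The same property can score quite differently under the two models.
print(score_model_a(fuel_load=8, slope=6, access=2))  # 6.2
print(score_model_b(fuel_load=8, slope=6, aspect=9))  # 7.6
```

That divergence, two defensible scores for the same property, is one source of the consumer confusion described next.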

Because these models were (and often still are) applied independently, with insurers generally relying on only one, consumers have often received inconsistent wildfire risk assessments.

Insurers often treat their risk models and data inputs as closely guarded trade secrets and competitive differentiators, and thus may shy away from sharing detailed information about their models. This approach has its shortcomings. When consumers receive conflicting wildfire risk scores, they may draw inconsistent conclusions about their actual risk levels and risk factors. This lack of transparency leaves consumers somewhat in the dark, fostering uncertainty and ambivalence that can impede wildfire mitigation efforts.

Following the devastating 2018 Camp and Woolsey fires in California, insurers began non-renewing policies for homeowners residing in ZIP codes with higher-than-average wildfire risk. This sparked growing demand for more advanced wildfire models that could recognize differences in exposure within a single ZIP code, and a push for increased transparency to reduce confusion and the sheer volume of consumer complaints received by the California Department of Insurance (CDI).

Different data inputs for wildfire-risk models

Sample map of wildfire risk across the U.S. using an enhanced wildfire risk model and Geospatial Hazard Ratings. (Image provided by Guidewire)

As this situation and the market evolved, insurers began to recognize the limitations of relying solely on a single risk model for their underwriting and pricing decisions.

As the types and accessibility of data used in these risk models have greatly increased over the past few years, insurers have recognized that more data elements can translate into greater accuracy in risk assessment and underwriting.

New data elements used to evaluate wildfire risk include weather conditions such as drought, average precipitation and wind, as well as wildfire history, fire suppression capabilities, vegetation flammability, rate of spread and even lightning frequency. Various organizations, including the Insurance Institute for Business & Home Safety (IBHS), the National Fire Protection Association (NFPA), CAL FIRE and others, have conducted extensive studies on some of these elements, offering data-driven insights into their impact on wildfire risk and mitigation.

In addition to using more data points, leveraging modern Geospatial Hazard Ratings can help insurers assess the risk of a property at a more granular, individual level, rather than using the larger area of a census block, ZIP code, or municipality.
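As a rough illustration of why parcel-level granularity matters, the hypothetical sketch below contrasts a ZIP-level lookup with a parcel-level one. The scores, parcel IDs and in-memory tables are invented stand-ins for a real geospatial hazard data service.

```python
# Hypothetical sketch: ZIP-level vs. parcel-level risk lookup. The scores
# and lookup tables are invented; a production system would query a
# geospatial hazard data service by coordinates, not in-memory dicts.
from dataclasses import dataclass

@dataclass
class Parcel:
    parcel_id: str
    zip_code: str
    lat: float
    lon: float

# ZIP-level score: one value for every property in the ZIP code.
ZIP_WILDFIRE_SCORE = {"95959": 7.0}  # illustrative

# Parcel-level scores: two neighbors in the same ZIP code can carry
# very different hazard ratings (slope, fuels, defensible space).
PARCEL_WILDFIRE_SCORE = {"APN-001": 3.2, "APN-002": 8.9}  # illustrative

def zip_level_score(p: Parcel) -> float:
    return ZIP_WILDFIRE_SCORE[p.zip_code]

def parcel_level_score(p: Parcel) -> float:
    return PARCEL_WILDFIRE_SCORE[p.parcel_id]

a = Parcel("APN-001", "95959", 39.26, -121.02)
b = Parcel("APN-002", "95959", 39.27, -121.01)
print(zip_level_score(a), zip_level_score(b))        # 7.0 7.0 (no differentiation)
print(parcel_level_score(a), parcel_level_score(b))  # 3.2 8.9 (granular)
```

Under the ZIP-level view both properties look identical; at the parcel level, one is clearly insurable and the other warrants scrutiny.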

Where the market is going and why

Sample map of wildfire risk across California using an enhanced wildfire risk model and Geospatial Hazard Ratings. By using more granular data and mapping, insurers can more accurately and predictably identify properties to insure even in areas once considered higher risk. (Image provided by Guidewire)

As risk model providers continue to evolve and add data elements to their models, carriers themselves are evolving. Many carriers have begun incorporating multiple, competing models to create a more comprehensive risk profile. By considering a broader range of variables, carriers can better comprehend wildfire risk.

In this evolving landscape, carriers are faced with pivotal decisions on how to best leverage multiple risk models to enhance their pricing, underwriting and risk management strategies. To achieve this, some insurers evaluate the components of competing models and develop their own ‘crosswalks’ to combine model outputs in alignment with their specific business objectives.
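A crosswalk can be as simple as a rule that maps two model scores onto a single internal tier. The sketch below shows one hypothetical blending rule with invented score bands; real carriers calibrate crosswalks against their own book and business objectives.

```python
# Hypothetical 'crosswalk' combining two wildfire model scores into one
# internal rating. The bands and blending rule are illustrative
# assumptions, not any carrier's actual calibration.

def crosswalk(score_a: float, score_b: float) -> str:
    """Map two 0-10 model scores to an internal risk tier.

    Conservative rule: when the models disagree, defer to the
    higher (worse) score so borderline risks get a closer look.
    """
    blended = max(score_a, score_b)
    if blended < 3:
        return "LOW"
    if blended < 6:
        return "MODERATE"
    if blended < 8:
        return "HIGH"
    return "SEVERE"

print(crosswalk(2.5, 2.0))  # LOW: both models agree
print(crosswalk(2.5, 7.1))  # HIGH: disagreement resolved conservatively
```

The design choice embedded here, resolving disagreement toward the worse score, is itself a business decision; a carrier chasing growth might instead average the scores or weight the model it trusts more.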

Carriers may want to choose a robust model for pricing to equip themselves with a broad range of rate levels that can accurately match risk across a wide spectrum. Simultaneously, they may employ a less robust model for underwriting to help streamline the analysis of properties or books of business, allowing them to focus on those that can be written profitably.

Robust models enable the combination of scores to create refined price segmentation, offering greater flexibility in risk selection. For instance, a carrier might initially only consider new business with low or moderate wildfire risk. However, by incorporating data on defensible space or home hardening, they may be comfortable writing higher wildfire risk when the exposure has been limited by wildfire mitigation preparations.
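One hypothetical way to express that logic in code: a raw score above the carrier's acceptance threshold can be offset by documented mitigation credits. The credit sizes and threshold below are illustrative assumptions, not actuarially derived values.

```python
# Illustrative sketch of mitigation credits adjusting an underwriting
# decision. Credit sizes and the acceptance threshold are assumptions,
# not actuarially derived values.

MITIGATION_CREDITS = {
    "defensible_space": 1.5,  # cleared vegetation buffer around the home
    "home_hardening": 1.0,    # ember-resistant vents, Class A roof, etc.
}

ACCEPT_THRESHOLD = 6.0  # maximum adjusted score the carrier will write

def adjusted_score(raw_score: float, mitigations: list[str]) -> float:
    credit = sum(MITIGATION_CREDITS.get(m, 0.0) for m in mitigations)
    return max(raw_score - credit, 0.0)

def accept(raw_score: float, mitigations: list[str]) -> bool:
    return adjusted_score(raw_score, mitigations) <= ACCEPT_THRESHOLD

# A property too risky on its raw score alone can become writable
# once documented mitigation is taken into account.
print(accept(7.5, []))                                      # False
print(accept(7.5, ["defensible_space", "home_hardening"]))  # True
```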

By combining the strengths of different models, carriers can enhance their risk assessment, pricing precision and underwriting practices, ultimately benefiting both the industry and consumers.

Navigating the future

The market will continue to evolve. Future advancements in AI and machine learning, coupled with real-time data from IoT devices, are poised to further transform risk modeling and wildfire risk assessments. Insurers that continue to evolve their risk model strategies can gain a competitive edge in precision underwriting, risk selection and risk management, and can enjoy sustainable profit margins.

Embracing innovation in risk models will enable insurers to better navigate the complexities of wildfire risk and ensure more resilient, informed decision-making in the face of increasing natural catastrophe threats.

Tammy Nichols Schwartz, CPCU, is the senior director of Data and Analytics at Guidewire, a provider of technology solutions to the P&C insurance industry. She has more than 20 years of experience as an actuary, underwriter and executive at leading insurance carriers and financial institutions, including Farmers Insurance and Bank of America. Prior to Guidewire, Schwartz was the founder and CEO of Black Swan Analytics.

Opinions expressed here are the author’s own.
