Insurers harness next-generation data models to combat wildfire risks
Next-generation wildfire data and risk models allow insurers to pinpoint localized wildfire risk and identify mitigation strategies to lower the propensity for loss.
The wildfires that swept through the Texas Panhandle have reignited concerns about the increasing frequency and severity of wildfires across the United States. The Smokehouse Creek Fire, the largest wildfire in Texas history, serves as a sobering reminder of the urgent need for proactive measures to address wildfire risk.
Last year, we saw the collateral and financial damage these natural catastrophes could cause when several major insurers declared they were withdrawing from California, the nation’s largest insurance market. This left homeowners and business owners with fewer options and more risk.
Wildfires have become a growing threat across the U.S. and globally in recent years, exacerbated by warmer temperatures, prolonged droughts and changing climate patterns. With temperatures projected to rise and weather patterns growing more unpredictable, the risk of wildfires will only increase, posing significant challenges for insurers tasked with assessing and managing it.
In California, which has seen over 8,500 wildfires per year over the past decade, newly proposed insurance reforms seek to address the threat and the regulatory and reinsurance issues exacerbating it. These reforms aim to improve transparency and accountability in wildfire risk assessment, requiring insurers to provide greater visibility into their wildfire risk models and offer discounts to homeowners who take proactive measures to mitigate their risk of wildfire exposure.
Next-gen solutions
However, addressing wildfire risk requires more than regulatory reforms. It demands innovative solutions. One area of innovation insurers should explore is upgrading risk assessment with next-generation data and risk models.
Traditional risk evaluation methods are proving inadequate in the face of evolving threats. They rely on outdated assumptions and incomplete data sets that fail to recognize the full extent and drivers of wildfire risk and other natural catastrophes, including extreme weather risks.
Traditional ratemaking methods also rely on large geographical zones to assess risk and price insurance policies. Each territory is assigned a rate level representing the average risk for that region. Whether the territory is a city, ZIP code or census block, that average is used to set premiums, creating pricing inefficiencies for locations whose actual risk differs from the territory average.
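To make the inefficiency concrete, here is a minimal sketch of zone-average pricing. The property names and loss costs are hypothetical illustrations, not real actuarial data:

```python
# Hypothetical example: one territory-wide rate vs. true location-level risk.
# All dollar figures are illustrative assumptions, not real actuarial data.

# Assumed true expected annual wildfire loss cost per property in one ZIP code
properties = {
    "ridge_top_home": 2400.0,   # brush-adjacent, steep slope
    "valley_home": 300.0,       # irrigated, good defensible space
    "suburban_home": 150.0,     # far from heavy fuel loads
}

# Territory-based ratemaking: a single average rate for the whole zone
zone_average = sum(properties.values()) / len(properties)

for name, true_cost in properties.items():
    mispricing = zone_average - true_cost  # >0: overcharged; <0: undercharged
    print(f"{name}: true cost ${true_cost:.0f}, "
          f"zone rate ${zone_average:.0f}, mispricing ${mispricing:+.0f}")
```

In this toy case the zone rate of $950 undercharges the highest-risk home by $1,450 while overcharging the two lower-risk homes, which is exactly the cross-subsidy that location-level models remove.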
Innovative insurers are embracing modern wildfire risk models that:
- Incorporate a broader range of data elements. Wildfire is a complicated exposure that can’t be adequately calibrated with just a few variables.
- Rely on more detailed and fine-scale geospatial data that reflect risk levels at each specific location (as opposed to census block or territory averages).
- Leverage advanced analytics and satellite imagery to provide a more nuanced understanding of wildfire risk, going beyond the physical hazards and including the vulnerabilities of the property and the resiliency of the surrounding community.
With such next-generation wildfire data and risk models, scores are more accurate, allowing the insurer to pinpoint localized wildfire risk and the property owner to identify mitigation strategies that reduce the propensity for loss.
Next-gen risk models draw on newly available data elements, such as fire-season rainfall, vegetation burn points, katabatic wind zones, proximity to high fuel loads, and fire suppression capabilities, to produce more accurate and predictive risk scores.
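As a rough sketch of how variables like these might combine into a single score, here is a hypothetical weighted model. The weights, scales, and variable names are illustrative assumptions only, not any vendor's actual methodology:

```python
# Hypothetical composite wildfire risk score (0-100, higher = riskier).
# Weights and caps are illustrative assumptions, not a production model.

def wildfire_risk_score(fire_season_rainfall_in, burn_points_nearby,
                        in_katabatic_wind_zone, miles_to_high_fuel_load,
                        suppression_rating):
    """Combine location-level hazard variables into a 0-100 risk score."""
    score = 0.0
    # Drier fire seasons raise risk (contribution capped at 30 points)
    score += max(0.0, 30.0 - 2.0 * fire_season_rainfall_in)
    # Historical burn points near the property (capped at 20 points)
    score += min(20.0, 4.0 * burn_points_nearby)
    # Katabatic wind corridors can spread fire rapidly downslope
    score += 20.0 if in_katabatic_wind_zone else 0.0
    # Proximity to heavy vegetation / high fuel loads
    score += max(0.0, 20.0 - 4.0 * miles_to_high_fuel_load)
    # Strong local fire suppression (rating 0-10) lowers the score
    score -= suppression_rating
    return max(0.0, min(100.0, score))

# Example: dry season, some burn history, wind-exposed, near fuel, good suppression
print(wildfire_risk_score(5.0, 3, True, 1.0, 8.0))  # → 60.0
```

The point of such a structure is that mitigation shows up directly: improving defensible space (greater distance to fuel) or community suppression capability lowers the score, which is what lets insurers reward proactive homeowners.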
Innovative insurers also leverage new imagery and data available through satellites and drones. With a much more granular base geography and faster ingestion of data, such as vegetation data, insurers can measure wildfire risk more precisely, especially in rural and wildland-urban interface areas.
Without more accurate wildfire models, many insurers would consider California uninsurable. Fortunately, with improved data and greater granularity, we are able to identify the vast majority of homes as low risk. In fact, 90% of the homes and buildings in California are in areas with acceptable wildfire risk. The problem lies in identifying them and distinguishing them from those properties that pose too much risk.
In the face of mounting challenges posed by wildfires, insurers must explore innovative solutions to manage and mitigate this growing risk more effectively. The devastating impact of the Smokehouse Creek Fire in the Texas Panhandle only underscores the need for more proactive measures. Similarly, the challenges experienced in California serve as a stark reminder of the dire consequences for homeowners and businesses in vulnerable regions.
Modernizing risk assessment through next-generation data and risk models presents a promising pathway for insurers to bolster their capabilities and mitigate risks. By integrating technological advancements and fostering collaboration among insurers, policymakers, and community stakeholders, we can fortify our defenses against wildfires and pave the way toward a safer, more resilient future for everyone.
Tammy Nichols Schwartz, CPCU, is the senior director of data and analytics at Guidewire, the leading provider of technology solutions to the P&C insurance industry. She has over 20 years of experience as an actuary, underwriter, and executive at leading insurance carriers and financial institutions, including Farmers Insurance and Bank of America. Prior to Guidewire, Schwartz was the Founder and CEO of Black Swan Analytics.
Opinions expressed here are the author’s own.