Rachel Olney, CEO/founder of Geosite, is working to simplify complex, multi-sourced spatial data to maximize business operations efficiency.


Nearly 100 people lost their lives when Hurricane Ida made landfall last year, causing an estimated $75 billion in damage. Adding to the tragic loss of life was the fact that many of those left to rebuild found navigating the insurance claims process exceedingly complex, and today — many months later — a significant number are still awaiting the resources they need to rebuild their lives.

Regrettably, the frustrations facing the victims of Hurricane Ida are not unique, and the storm has provided a clear example of the need for improved claims processing as climate change continues to increase the frequency, severity and unpredictability of natural catastrophes. Improving the speed and accuracy with which post-disaster claims are identified and resolved is crucial not only for property owners but also for insurance carriers.

Remote sensing technology (satellites, drones and IoT sensors) makes it possible for insurers to locate impacted properties, assess the extent of damage and inform policyholders of the status of their claims before they even return to a disaster-stricken area. But leveraging this data at scale can be difficult.

Current workflows rely on antiquated practices that slow claims processing at nearly every step, contributing to the industry's stagnant cost performance. Groups like the SENSE consortium are evidence that insurance leaders are pushing the industry to incorporate more diverse and robust data sources into adjuster workflows to improve the accuracy and speed of claims processing. But more needs to be done.

New technologies in remote sensing and analytics are already improving insurers’ ability to assess damage after a disaster. High-resolution imagery from before (blue sky) and after (gray sky) a disaster from sources like the Geospatial Insurance Consortium gives agents the ability to immediately assess wind and flood damage.
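As a rough illustration of how such before-and-after imagery can be screened programmatically, the sketch below compares two co-registered single-band rasters in Python. The file names and the change threshold are placeholders for illustration, not details from any particular imagery provider.

```python
# Minimal sketch of blue-sky / gray-sky change detection, assuming two
# co-registered, single-band rasters. File names and threshold are illustrative.
import numpy as np
import rasterio

with rasterio.open("pre_event.tif") as pre, rasterio.open("post_event.tif") as post:
    before = pre.read(1).astype("float32")   # blue-sky (pre-disaster) band
    after = post.read(1).astype("float32")   # gray-sky (post-disaster) band

# Pixels whose values changed sharply are flagged as potentially damaged.
change = np.abs(after - before)
damage_mask = change > 40.0  # illustrative threshold; real workflows calibrate this

pct_changed = 100.0 * damage_mask.mean()
print(f"{pct_changed:.1f}% of pixels show significant change")
```

A real pipeline would layer calibrated analytics on top of this kind of screen, but even a crude change mask helps point adjusters toward the areas that need attention first.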

Using analytic models to conduct assessments at scale can enable insurers to quickly resolve massive volumes of claims and spot potentially fraudulent claims that require more investigation. Specialized data and analytics can offer even greater precision at scale for measuring flood depth and extent from space-based assets.

The true challenge, however, is to ingest this information in a way that makes it actionable for claims adjusters. Large-scale catastrophic events, like Hurricane Ida and the Dixie Fire in California, can impact an enormous number of policies, and manually reviewing different forms of raw spatial data is costly, slow and demands new expertise on the part of adjusters.

One approach is to give adjusters automatic damage assessments through geospatial claims software that fuses flood, fire and wind damage data, as well as AI outputs, with policy information and building characterizations. By better automating the use of geospatial data, advanced software platforms can batch and triage claims by severity while also surfacing potential fraud.
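To make that concrete, here is a minimal Python sketch of severity-based triage under assumed inputs: a hypothetical policies.geojson of insured locations with a policy_id column, and a flood_extent.geojson of modeled flood polygons carrying a depth_m attribute. The depth thresholds are illustrative only.

```python
# Minimal sketch: join insured locations to modeled flood polygons and
# batch claims into triage tiers by flood depth. Inputs are hypothetical.
import geopandas as gpd

policies = gpd.read_file("policies.geojson")        # insured building locations
flood = gpd.read_file("flood_extent.geojson")       # post-event flood polygons

# Spatial join: attach modeled flood depth to each impacted policy location.
impacted = gpd.sjoin(policies, flood[["depth_m", "geometry"]],
                     how="inner", predicate="intersects")

def triage(depth_m: float) -> str:
    """Assign a triage tier from modeled depth (thresholds are illustrative)."""
    if depth_m >= 1.5:
        return "fast-track major loss"
    if depth_m >= 0.3:
        return "standard adjuster review"
    return "desk review / anomaly check"

impacted["triage_tier"] = impacted["depth_m"].apply(triage)
print(impacted[["policy_id", "depth_m", "triage_tier"]]
      .sort_values("depth_m", ascending=False))
```

The same join pattern extends to wind and fire footprints, and a production workflow would calibrate the thresholds against historical loss data and adjuster feedback rather than fixed cutoffs.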

The ability to do this all remotely — via satellites, aircraft and drones — also means those automatic assessments can occur before residents return to a disaster-stricken area. Most importantly, speeding the process by which insurers assess damage liability can allow property owners to bounce back more quickly.

The data integration and utilization challenges insurers face today are strikingly similar to those in other major industries: adapting established ways of doing business to leverage new technology can require sweeping changes to previously standard practices.

Since entering the insurance market, I have seen a lack of spatial granularity in policy data across most insurers. Without damage assessment data in true spatial formats, it is common for insured properties to have inaccurate (or nonexistent) spatial context. For example, a pair of latitude and longitude columns often marks the centroid of a parcel rather than the actual location of the insured structures. As a consequence, correlating risk and post-disaster data with policy data can be inaccurate or impossible.
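The toy example below, built with invented geometries, shows how a record geocoded to a parcel centroid can misclassify a loss entirely.

```python
# Illustrative only: a parcel centroid and the actual building can sit on
# opposite sides of a flood boundary, so a centroid-based lat/long record
# misclassifies the loss. All geometries here are invented.
from shapely.geometry import Point, Polygon

flood_extent = Polygon([(0, 0), (100, 0), (100, 40), (0, 40)])  # modeled flood area

parcel_centroid = Point(50, 60)    # what the policy database stores
building_location = Point(50, 30)  # where the insured structure actually sits

print("Centroid flagged as flooded:", flood_extent.contains(parcel_centroid))    # False
print("Building actually flooded:  ", flood_extent.contains(building_location))  # True
```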

The bottom line is that 30-centimeter accuracy in flood data won't help determine whether a building was affected if the underlying database identifying the locations of insured buildings is itself imprecise. To remain competitive, property and casualty carriers should consider modernizing their operations; otherwise they risk losing out on the billions of dollars associated with the industry's growth. To do this, insurers should automate processes through the use of data and software where possible, rather than throw bodies at the problem.

In particular, insurance companies that take advantage of the growing geospatial economy could improve operational efficiency and enhance the overall customer experience. Then we can all bounce back faster and continue to thrive in the face of major natural disasters.

