The rallying cry has sounded — to “close the protection gap”, the difference between what insurance pays out and the total cost of an incident or disaster. Here is an issue that can unite and promote the insurance industry, extending benefits to those in peril while expanding the insurance sector. Access to funding after a loss, we know, can bring important benefits.
Yet in reality, there is not just one, but three distinct insurance “protection gaps”, each with separate causes and each requiring different remedies. These protection gaps are so different from one another that we should stop treating them as a single category. Lumping them together can cause confusion.
In this series of four blogs, I will explore each of these three distinct gaps, together with the role of protection gap analytics, and the actions we can plan to address these protection gaps.
Catastrophe models, conceived in the 1970s and created at the end of the 1980s, have proved to be a “disruptive technology” in reshaping the catastrophe insurance and reinsurance sectors. The first wave of disruption saw the arrival of fresh capital, to found eight new “technical” Bermudan catastrophe reinsurers. The “Class of 1993” included Centre Cat Ltd., Global Capital Re, IPC Re, LaSalle Re, Mid-Ocean Re, Partner Re, Renaissance Re and Tempest Re. Using catastrophe models, these companies were able to set up shop and price hurricane and earthquake contracts without having decades of their own claims history. While only two of these companies survive as independent reinsurers, the legacy of the disruption of 1993 is Bermuda’s sustained dominance in global reinsurance.
A second wave of disruption starting in the mid-1990s saw the introduction of catastrophe bonds: a slow trickle at first but now a steady flow of new structures, as investors who knew nothing about catastrophic loss came to trust modeled risk estimates to establish the bond interest rates and default probabilities. Catastrophe bonds have subsequently undergone their own “Cambrian explosion” into a diverse set of insurance-linked securities (ILS) structures, including those in which the funds go back to supplement reinsurers’ capital. Again, this disruption in accessing novel sources of pension and investment fund capital would have been impossible without catastrophe loss models.
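For readers unfamiliar with how modeled risk estimates feed into bond terms, here is a minimal, purely illustrative sketch. Every number in it (attachment point, exceedance probabilities, market multiple) is a hypothetical placeholder, not output from any actual catastrophe model or transaction.

```python
# Illustrative sketch: how modeled exceedance probabilities can translate
# into a cat bond's default probability and an indicative coupon spread.
# All figures below are hypothetical.

attachment = 500e6   # bond principal starts eroding above this event loss (USD)
exhaustion = 700e6   # principal is fully eroded at this loss level (USD)

# Hypothetical annual exceedance probabilities from a catastrophe model:
p_attach = 0.02      # probability that losses exceed the attachment point
p_exhaust = 0.01     # probability that losses exceed the exhaustion point

# A simple approximation of annual expected loss on the tranche,
# averaging the probabilities at the two ends of the layer:
expected_loss = (p_attach + p_exhaust) / 2.0

# Investors commonly demand a spread at some multiple of expected loss;
# the multiple is set by market appetite, not by the model.
market_multiple = 2.5
indicative_spread = expected_loss * market_multiple

print(f"Probability of attachment: {p_attach:.1%}")
print(f"Annual expected loss:      {expected_loss:.2%}")
print(f"Indicative coupon spread:  {indicative_spread:.2%}")
```

The point is not the arithmetic, which is trivial, but the trust chain: the model supplies the exceedance probabilities, and everything an investor needs follows from them.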
Guest Blog: Jane Warring, senior counsel, Clyde & Co. U.S. LLP
A few weeks back I attended my first Exceedance. When you go to a new conference, you are never entirely sure what to expect. Suffice it to say, though, it was like no industry event I have ever attended — and in a good way.
Above all, though, I was blown away by the sessions. I have spent almost 15 years litigating first-party property/coverage disputes at both the trial and appellate levels. So, I know a thing or two about the interplay between exposure, risk and coverage. But Exceedance was a whole new level of learning: information-rich content, transparent discussions, engaged delegates and high-quality speakers.
Like many attendees, I filed a trip report on my return to the office. I wanted to make sure my colleagues who lead our resilience work benefited from my learnings. This got me thinking. What were some of the top takeaways from #Exceedance18?
But slowly, the hurdles to private sector involvement are starting to clear, through the combined efforts of the industry, FEMA, and even private citizens. It will be an exciting time for private insurers and Americans if the new flood reform bill, H.R. 2874, passes the Senate, as measures in the bill include increased acceptance of private flood insurance by mortgage providers, easing of fixed claims limits, and open source access to FEMA’s extensive claims database.
I invite you to explore the latest digital edition of EXPOSURE Magazine, which also hit the streets of Monte Carlo as a print edition for those attending Les Rendez-Vous de Septembre, and will be available at RMS events over the coming months.
There is a clear mission for EXPOSURE, which is “… to provide insight and analysis to help insurance and risk professionals innovate, adapt and deliver.” And change is in the air for all businesses in the industry, whether it is developing new opportunities, getting products to market faster, being more agile and efficient, or using data-driven insight to transform decision making.
Victor Roldan, regional director – Caribbean and Latin America, RMS
I live in Brickell, in the Financial District of Miami, in a condominium block of some 30 stories on Brickell Bay Drive and SE 12th Street, very close to downtown Miami. The block faces the waterfront, four blocks from the Four Seasons, with the Mandarin Oriental just over the water on Brickell Key. This is an area that the insurance industry knows well, with many RMS clients operating their Latin American and Caribbean business from offices in and around a square mile from here.
I have lived in Miami for the past 15 years, and it is a great city. Like most Miami residents, I have experienced hurricanes, and I know the difference between a hurricane and a major storm. From my view some ten floors up, and despite the eye being 130 miles away, this is the worst I have ever seen. My family is safe; they are out of the state.
RMS continues to refine its estimate of the insured losses from Harvey. In the meantime, I think it’s worth looking in more detail at the potential exposure of the National Flood Insurance Program (NFIP) to this major hurricane.
Last Monday, Daniel wrote that it was likely that “Harvey will produce at least US$4 billion in flood claims, triggering the NFIP reinsurance program.” With NFIP up next month for reauthorization and reform, this is an important point — and not just for the 25 reinsurers underwriting over US$1 billion of NFIP’s claims.
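To make the trigger mechanics concrete, here is a hedged sketch of how a layered reinsurance recovery works. The layer bounds and share below follow the publicly described structure of the 2017 NFIP placement (a share of losses between US$4 billion and US$8 billion), but treat them as illustrative assumptions rather than authoritative contract terms.

```python
def layer_recovery(loss, attach, top, share):
    """Recovery = the reinsurers' share of the loss falling inside
    the layer between `attach` and `top`."""
    return share * max(0.0, min(loss, top) - attach)

# Assumed layer: 26% of NFIP losses between $4bn and $8bn (illustrative).
attach, top, share = 4e9, 8e9, 0.26

for loss in (3e9, 5e9, 10e9):
    recovery = layer_recovery(loss, attach, top, share)
    print(f"NFIP loss ${loss / 1e9:.0f}bn -> recovery ${recovery / 1e9:.2f}bn")
```

Under these assumptions, a US$4 billion flood loss is exactly where the reinsurers begin to pay, which is why Daniel’s estimate matters so much to the program.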
Farhana Alarakhiya, vice president – Products, RMS
Hurricane Harvey continues to be top of mind at the RMS offices. On Wednesday, RMS hosted a client webinar where Mark Powell, Tom Sabbatelli and Pete Dailey discussed how we have applied our methodology developed for the RMS U.S. High Definition (HD) Flood Model to provide insights into the extent and severity of the flooding from Harvey, with Houston as our top priority. This effort has resulted in a high-fidelity hazard inundation map which is now available to all RMS clients.
For clients on the RMS(one)® platform who use Exposure Manager, this effort goes one step further. We automatically seed the Harvey hazard layer in the client tenant, to deliver instantaneous access to analytic insights from the U.S. Inland Flood HD Model. This models all sources of flooding across space and time, and can also be used to identify and differentiate locations at risk based on flood extent and severity.
RMS modeling reveals the wider risk of U.S. flooding, and a significant protection gap
A combination of heavy rainfall and melting snow had filled Lake Oroville in Northern California to near capacity. Dam operators released water through the main spillway to control the reservoir level, but a 300-foot hole unexpectedly emerged, and the surrounding soil was eroded by the water gushing out. Spillway outflows were reduced to stop the erosion.
Oroville Dam after heavy rainfall in mid-February (Source: California Dept of Water Resource)
But this made the problem with rising reservoir levels worse, as the water then began to flow over the emergency spillway. On February 12, 2017, at least 188,000 residents were told to evacuate, while trucks and helicopters dumped over 1000 tons of material per hour on the weakened structure to prevent a more significant breach. As water levels in the reservoir subsided, the risk reduced.
With no massive discharge or flooding, insurance losses are expected to be limited to coverage for business interruption, loss of use, or additional living expenses (ALE) incurred by the evacuees.
It could have been far worse if the dam had completely failed – this was a worrying near-miss.
Dam failure, though rare, is not a negligible risk. A similar near-miss occurred during the 1971 San Fernando earthquake, when the Lower Van Norman Dam came close to breaching and forced the evacuation of 80,000 people. Our modeling teams decided to model a counterfactual version of February’s Oroville Dam incident, in which either (i) the dam continues to disintegrate in a controlled manner or (ii), in the worst case, it collapses completely.
Our flood modelers, Ye Tian and Sonja Jankowfsky, simulated these ‘what if’ scenarios over a 72-hour cycle, using reservoir water level data from the California Department of Water Resources’ California Data Exchange Center, and their own estimation of the lake’s bathymetry (contours of the lake bed) and capacity.
The worst-case scenario assumes the height of the dam wall drops to zero instantaneously – as might happen, for example, if there were an explosion due to sabotage. For the controlled-breach scenario, the dam wall is lowered gradually at two meters per hour to simulate what could have happened had the erosion of the dam continued.
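As a rough illustration of the controlled-breach mechanics, the sketch below approximates outflow with a standard broad-crested weir formula as the crest is lowered at two meters per hour. The breach width, discharge coefficient and water depth are hypothetical placeholders, not the parameters of the RMS simulation, which used actual reservoir levels and bathymetry.

```python
# Hedged sketch of a controlled breach: the effective crest erodes downward
# at 2 m/hour over a 72-hour window, and discharge is approximated with a
# broad-crested weir formula Q = C * L * H^1.5. All parameters hypothetical.

C = 1.7            # weir discharge coefficient (SI units, typical value)
crest_len = 50.0   # effective breach width in meters (hypothetical)
depth = 100.0      # water depth above the crest's final eroded level (m, hypothetical)
lower_rate = 2.0   # crest lowering rate from the scenario, meters per hour

for hour in (0, 12, 24, 48, 72):
    # head over the crest grows as the crest erodes; reservoir drawdown ignored
    head = min(depth, lower_rate * hour)
    q = C * crest_len * head ** 1.5   # approximate discharge in m^3/s
    print(f"hour {hour:2d}: head {head:5.1f} m, outflow ~ {q:8.0f} m^3/s")
```

Even this crude approximation shows why a sustained breach overwhelms downstream flood controls: discharge grows faster than linearly as the head over the eroding crest increases.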
In our modeling, the amount of discharged water would be expected to overwhelm local flood control measures over 100 miles downstream, as was evidenced by the widespread flooding in the simulations. We estimate that under either scenario, about $21.8 billion of building value would be at risk, which, for the communities near Oroville, would be a huge problem because much of it is uninsured.
Insurance coverage is one of the most effective ways of ensuring a region rebounds quickly after disaster, but in the Oroville region flood insurance penetration is fairly low. Most property owners rely on the National Flood Insurance Program (NFIP), which is most commonly sold to homes in the 100-year flood plain along the Feather River.
FEMA 100-year flood zones
Potential inundation depths simulated for a complete and instantaneous collapse of the Oroville Dam
This flood plain is shown as blue areas in this map. Compare that to the map below.
The second map shows modeling from the RMS ‘what if’ scenario of the Oroville Dam breach. The 100-year flood plain in the first map covers significantly less area than would be inundated if the dam breached.
So, why does this matter? Residents living outside the 100-year flood zone are not required to purchase flood insurance, and therefore most do not. These areas include the towns of Biggs, Gridley, Live Oak, Oroville, South Oroville, Thermalito, and Yuba City. And yet as RMS modeling shows, many of those communities would have experienced major flooding if the dam had breached completely.
The Protection Gap
It is obvious that NFIP flood zone maps do not include dam failure scenarios, and yet these failures typically inundate a much wider area beyond the naturally occurring flood plain, because the volume of water and speed of release overwhelm natural and man-made defenses.
FEMA cannot quickly nor easily change the flood maps to incorporate this type of risk explicitly. At RMS, we are developing tools to quantify these kinds of extended risks to allow the private flood insurance market to step in and fill the current gaps in coverage.
Thankfully, our ‘what if’ scenario didn’t become a reality. But it highlights the risk of aging infrastructure which may not be able to withstand extreme weather events. Nationwide, the National Inventory of Dams indicates that of the 90,580 dams situated across the U.S., over 30% exhibit significant to high hazard potential due to structural deterioration.
[Note: clients can obtain modeling files for the Oroville Dam analysis from RMS account managers. This blog has been edited to provide further detail on the initial dam failure.]