Nigel Allen, February 11, 2021
Location, Location, Location: A New Era in Data Resolution

The insurance industry has reached a transformational point in its ability to accurately understand the details of exposure at risk. It is the point at which three fundamental components of exposure management are coming together to enable (re)insurers to systematically quantify risk at the location level: the availability of high-resolution location data, access to the technology to capture that data and advances in modeling capabilities to use that data.

Data resolution at the individual building level has increased considerably in recent years, including the use of detailed satellite imagery, while advances in data sourcing technology have given companies easier access to this more granular information. In parallel, innovations such as RMS® High Definition Models™ and the transition to cloud-based technologies have facilitated a major leap forward in the ability of companies to absorb, analyze and apply this new data within their actuarial and underwriting ecosystems.

Quantifying Risk Uncertainty

“Risk has an inherent level of uncertainty,” explains Mohsen Rahnama, chief modeling officer at RMS. “The key is how you quantify that uncertainty. No matter what hazard you are modeling, whether it is earthquake, flood, wildfire or hurricane, there are assumptions being made. These catastrophic perils are low-probability, high-consequence events, as evidenced, for example, by the 2017 and 2018 California wildfires, Hurricane Katrina in 2005 and Hurricane Harvey in 2017. For earthquake, examples include Tohoku in 2011, the New Zealand earthquakes in 2010 and 2011, and Northridge in 1994. For this reason, risk estimation for these severe perils cannot rely on an actuarial approach alone; physical models, based upon scientific research and event characteristic data, are needed to estimate the risk.”

A critical element in reducing uncertainty is a clear understanding of its sources across the hazard, vulnerability and exposure components of the risk. “Physical models, such as those using a high-definition approach, systematically address and quantify the uncertainties associated with the hazard and vulnerability components of the model,” adds Rahnama. “There are significant epistemic (also known as systematic) uncertainties in the loss results, which users should consider in their decision-making process. This epistemic uncertainty is associated with a lack of knowledge. It can be subjective and is reducible with additional information.”

What are the sources of this uncertainty? For earthquake, there is uncertainty in the ground motion attenuation functions, soil and geotechnical data, the size of the events and unknown faults. Rahnama explains: “Addressing the modeling uncertainty is one side of the equation. Computational power enables millions of events and more than 50,000 years of simulation to be used to accurately capture the hazard and reduce the epistemic uncertainty. Our findings show that in the case of earthquakes the main source of uncertainty for portfolio analysis is ground motion; however, vulnerability is the main driver of uncertainty for a single location.”

The quality of the exposure data fed into any model is essential to assessing the risk accurately and reducing loss uncertainty. Even so, exposure can itself be the main source of loss uncertainty, especially when exposure data is provided only in aggregate form.
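Rahnama’s description of simulating tens of thousands of years of events can be made concrete with a minimal sketch, shown below. It is purely illustrative and is not RMS code: it assumes a hypothetical simulated event loss table (simulation year, event loss) and shows how annual losses roll up into an average annual loss and an aggregate exceedance probability curve, the basic outputs of an event-based catastrophe model.

    import numpy as np

    # Hypothetical simulated event loss table: (simulation year, loss in USD).
    # A real model would contain millions of events over ~50,000 years.
    rng = np.random.default_rng(42)
    n_years = 50_000
    n_events = 200_000
    event_years = rng.integers(1, n_years + 1, size=n_events)
    event_losses = rng.lognormal(mean=11.0, sigma=2.0, size=n_events)  # placeholder severities

    # Aggregate event losses into annual losses
    annual_losses = np.zeros(n_years)
    np.add.at(annual_losses, event_years - 1, event_losses)

    # Average annual loss (AAL) and empirical aggregate exceedance probability (EP) curve
    aal = annual_losses.mean()
    sorted_losses = np.sort(annual_losses)[::-1]
    # P(annual loss >= sorted_losses[k]) is roughly (k + 1) / n_years
    exceedance_prob = np.arange(1, n_years + 1) / n_years

    def loss_at_return_period(rp_years: float) -> float:
        """Aggregate annual loss exceeded with probability 1 / rp_years."""
        k = max(int(n_years / rp_years) - 1, 0)
        return sorted_losses[k]

    print(f"AAL: {aal:,.0f}")
    print(f"250-year aggregate loss: {loss_at_return_period(250):,.0f}")

In practice each event loss carries its own hazard, vulnerability and exposure uncertainty, which is why, as Rahnama notes, the quality and resolution of the exposure input matters so much.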
Assumptions can be made to disaggregate exposure using other sources of information, which helps to some degree to reduce the associated uncertainty. Rahnama concludes: “Therefore, to minimize the uncertainty related to exposure, it is essential to obtain location-level information about the exposure, in particular for regions with liquefaction potential for earthquake, or for high-gradient hazards such as flood and wildfire.”

A critical element in reducing that uncertainty, removing those assumptions and enhancing risk understanding is combining location-level data and hazard information. That combination provides the data basis for quantifying risk in a systematic way. Understanding the direct correlation between hazard and exposure requires location-level data: the potential damage caused to a location by flood, earthquake or wind is significantly influenced by factors ranging from the first-floor elevation of a building, its distance to fault lines and the underlying soil conditions through to the quality of local building codes and structural resilience. And much of that granular data is now available and relatively easy to access.

“The amount of location data that is available today is truly phenomenal,” believes Michael Young, vice president of product management at RMS, “and so much can be accessed through capabilities as widely available as Google Earth. Straightforward access to this highly detailed satellite imagery means that you can conduct desktop analysis of individual properties and get a pretty good understanding of many of the building and location characteristics that can influence exposure potential to perils such as wildfire.”

Satellite imagery is already a core component of RMS model capabilities, and by applying machine learning and artificial intelligence (AI) technologies to such images, damage quantification and differentiation at the building level is becoming a much more efficient and faster undertaking — as demonstrated in the aftermath of Hurricanes Laura and Delta.

“Within two days of Hurricane Laura striking Louisiana at the end of August 2020,” says Rahnama, “we had been able to assess roof damage to over 180,000 properties by applying our machine-learning capabilities to satellite images of the affected areas. We have ‘trained’ our algorithms to understand damage degree variations and can then superimpose wind speed and event footprint specifics to group the damage degrees into different wind speed ranges. What that also meant was that when Hurricane Delta struck the same region weeks later, we were able to see where damage from these two events overlapped.”

The Data Intensity of Wildfire

Wildfire is by its very nature a data-intensive peril, and the risk has a steep gradient: houses in the same neighborhood can have drastically different risk profiles. The range of factors that can make the difference between total loss, partial loss and zero loss is considerable, and fully grasping their influence on exposure potential requires location-level data. The demand for high-resolution data has increased exponentially in the aftermath of recent record-breaking wildfire events, such as the devastating California seasons of 2017 and 2018 and the unparalleled bushfire losses in Australia in 2019-20.
Such events have also highlighted myriad deficiencies in wildfire risk assessment, including the failure to account for structural vulnerabilities, the inability to assess exposure to urban conflagrations, insufficient high-resolution data and the lack of a robust modeling solution to provide insight about fire potential given the many years of drought.

Wildfire devastation in the town of Paradise, California

In 2019, RMS released its U.S. Wildfire HD Model, built to capture the full impact of wildfire at high resolution, including the complex behaviors that characterize fire spread, ember accumulation and smoke dispersion. Able to simulate over 72 million wildfires across the contiguous U.S., the model creates ultrarealistic fire footprints that encompass surface fuels, topography, weather conditions, moisture and fire suppression measures.

“To understand the loss potential of this incredibly nuanced and multifactorial exposure,” explains Michael Young, “you not only need to understand the probability of a fire starting but also the probability of an individual building surviving.

“If you look at many wildfire footprints,” he continues, “you will see that sometimes up to 60 percent of buildings within that footprint survived, and the focus is then on what increases survivability — defensible space, building materials, vegetation management, etc. We were one of the first modelers to build mitigation factors into our model, such as those building and location attributes that can enhance building resilience.”

Moving the Differentiation Needle

In a recent study with the Center for Insurance Policy and Research, the Insurance Institute for Business and Home Safety and the National Fire Protection Association, RMS applied its wildfire model to quantify the benefits of two mitigation strategies — structural mitigation and vegetation management — assessing hypothetical loss reduction benefits in nine communities across California, Colorado and Oregon.

Young says: “By knowing what the building characteristics and protection measures are within the first 5 feet and 30 feet at a given property, we were able to demonstrate that structural modifications can reduce wildfire risk by up to 35 percent, while structural and vegetation modifications combined can reduce it by up to 75 percent. This level of resolution can move the needle on the availability of wildfire insurance, as it enables the development of robust rating algorithms to differentiate specific locations — and means that entire neighborhoods don’t have to be non-renewed.”

While acknowledging that modeling mitigation measures at a 5-foot resolution requires an immense granularity of data, RMS has demonstrated that its wildfire model is responsive to data at that level. “The native resolution of our model is 50-meter cells, which is a considerable enhancement on the zip-code-level underwriting grids employed by some insurers. That cell size in a typical suburban neighborhood encompasses approximately three to five buildings.
By providing the model environment that can utilize information within the 5-to-30-foot range, we are enabling our clients to achieve the level of data fidelity to differentiate risks at that property level. That really is a potential market game changer.”

Evolving Insurance Pricing

It is not hyperbole to suggest that being able to combine high-definition modeling with high-resolution data can be market changing. The evolution of risk-based pricing in New Zealand is a case in point. The series of catastrophic earthquakes in the Christchurch region of New Zealand in 2010 and 2011 provided a stark demonstration of how insufficient data left the insurance market blindsided by the scale of liquefaction-related losses from those events.

“The earthquakes showed that the market needed to get a lot smarter in how it approached earthquake risk,” says Michael Drayton, consultant at RMS, “and invest much more in understanding how individual building characteristics and location data influenced exposure performance, particularly in relation to liquefaction.

“To get to grips with this component of the earthquake peril, you need location-level data,” he continues. “To understand what triggers liquefaction, you must analyze the soil profile, which is far from homogenous. Christchurch, for example, sits on an alluvial plain, which means there are multiple complex layers of silt, gravel and sand that can vary significantly from one location to the next. In fact, across a large commercial or industrial complex, the soil structure can change significantly from one side of the building footprint to the other.”

Extensive building damage in downtown Christchurch, New Zealand, after the 2011 earthquake

The aftermath of the earthquake series saw a surge in soil data as teams of geotechnical engineers conducted painstaking analysis of layer composition. With multiple event sets to work from, it was possible to assess which areas suffered soil liquefaction and at which specific ground-shaking intensities.

“Updating our model with this detailed location information brought about a step change in assessing liquefaction exposures,” says Drayton. “Previously, insurers could only assess average liquefaction exposure levels, which was of little use where you have highly concentrated risks in specific areas. Through our RMS® New Zealand Earthquake HD Model, which incorporates 100-meter grid resolution and the application of detailed ground data, it is now possible to assess liquefaction exposure potential at a much more localized level.”

This development has underpinned a notable market shift from community-based to risk-based pricing in New Zealand. With insurers able to differentiate risks at the location level, companies such as Tower Insurance have been able to adjust premium levels to more accurately reflect the risk to an individual property or area. In its annual report in November 2019, Tower stated: “Tower led the way 18 months ago with risk-based pricing and removing cross-subsidization between low- and high-risk customers. Risk-based pricing has resulted in the growth of Tower’s portfolio in Auckland while also reducing exposure to high-risk areas by 16 percent.
Tower’s fairer approach to pricing has also allowed the company to grow exposure by 4 percent in the larger, low-risk areas like Auckland, Hamilton, and Taranaki.”

Creating the Right Ecosystem

The RMS commitment to enabling companies to put high-resolution data to both underwriting and portfolio management use goes beyond the development of HD Models™ and the integration of multiple layers of location-level data. Through the launch of RMS Risk Intelligence™, its modular, unified risk analytics platform, and the Risk Modeler™ application, which enables users to access, evaluate, compare and deploy all RMS models, the company has created an ecosystem built to support these next-generation data capabilities. Deployed within the cloud, the ecosystem thrives on the computational power that this provides, enabling proprietary and tertiary data analytics to rapidly produce high-resolution risk insights. A network of applications — including the ExposureIQ™ and SiteIQ™ applications and the Location Intelligence API — supports enhanced access to data and provides a more modular framework to deliver that data in a much more customized way.

“Because we are maintaining this ecosystem in the cloud,” explains Michael Young, “when a model update is released, we can instantly stand that model side by side with the previous version. As more data becomes available each season, we can upload that new information much faster into our model environment, which means our clients can capitalize on and apply that new insight straightaway.”

Michael Drayton adds: “We’re also offering access to our capabilities in a much more modular fashion, which means that individual teams can access the specific applications they need, while all operating in a data-consistent environment. And the fact that this can all be driven through APIs means that we are opening up many new lines of thought around how clients can use location data.”

Exploring What Is Possible

There is no doubt that the market is on the cusp of a new era of data resolution — capturing detailed hazard and exposure data and using the power of analytics to quantify and differentiate risk. Mohsen Rahnama believes the potential is huge. “I foresee a point in the future where virtually every building will essentially have its own social-security-like number,” he says, “that enables you to access key data points for that particular property and the surrounding location. It will effectively be a risk score, including data on building characteristics, proximity to fault lines, level of elevation, previous loss history, etc. Armed with that information — and superimposing other data sources such as hazard data, geological data and vegetation data — a company will be able to systematically price risk and assess exposure levels for every asset up to the portfolio level.”

Bringing the focus back to the here and now, he adds that the expanding impacts of climate change are making the need for this data transformation a market imperative. “If you look at how many properties around the globe are located just one meter above sea level, we are talking about trillions of dollars of exposure.
The only way we can truly assess this rapidly changing risk is by being able to systematically evaluate exposure based on high-resolution data and advanced modeling techniques that incorporate building resilience and mitigation measures. How will our exposure landscape look in 2050? The only way we will know is by applying that data resolution underpinned by the latest model science to quantify this evolving risk.”
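Rahnama’s vision of a per-building risk record can be illustrated with a small, purely hypothetical sketch. The field names, weights and scoring logic below are invented for illustration and do not represent an RMS schema or scoring method; the point is simply how building characteristics, hazard proximity and loss history might be combined into a single location-level indicator.

    from dataclasses import dataclass

    @dataclass
    class BuildingRecord:
        """Hypothetical per-building record of the kind described above."""
        building_id: str
        first_floor_elevation_m: float   # height of first floor above grade
        distance_to_fault_km: float
        roof_type: str                   # e.g., "tile", "shake", "metal"
        defensible_space_m: float        # cleared vegetation around the structure
        prior_losses: int                # number of past claims at this location

    def location_risk_score(b: BuildingRecord) -> float:
        """Toy scoring: higher means riskier. Weights are illustrative only."""
        score = 0.0
        score += 2.0 if b.first_floor_elevation_m < 0.5 else 0.0   # flood-prone elevation
        score += max(0.0, 3.0 - b.distance_to_fault_km)            # earthquake proximity
        score += 1.5 if b.roof_type == "shake" else 0.0            # ember-vulnerable roof
        score += 1.0 if b.defensible_space_m < 9.0 else 0.0        # under ~30 ft of clearance
        score += 0.5 * b.prior_losses
        return score

    example = BuildingRecord("BLDG-001", 0.3, 1.2, "shake", 5.0, 2)
    print(location_risk_score(example))  # 2.0 + 1.8 + 1.5 + 1.0 + 1.0 = 7.3

In practice such a score would be driven by modeled hazard, vulnerability and exposure data rather than hand-picked weights, but the underlying record is the kind of location-level detail the article describes.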

Nigel Allen, May 20, 2019
Clear Link Between Flood Losses and NAO

RMS research proves a relationship between the NAO and catastrophic flood events in Europe

The correlation between the North Atlantic Oscillation (NAO) and European precipitation patterns is well known. However, a definitive link between phases of the NAO and catastrophic flood events and related losses had not previously been established — until now.

A study by RMS published in Geophysical Research Letters has revealed a direct correlation between the NAO and the occurrence of catastrophic floods across Europe and the associated economic losses. The analysis not only established a statistically significant relationship between the two, but critically showed that average flood losses during opposite NAO states can differ by up to 50 percent.

A Change in Pressure

The NAO’s impact on meteorological patterns is most pronounced in winter. Fluctuations in atmospheric pressure between two semi-permanent centers of low and high pressure in the North Atlantic influence wind direction and strength as well as storm tracks.

The two-pronged study combined extensive analysis of flood occurrence and peak water levels across Europe with extensive modeling of European flood events using the RMS Europe Inland Flood High-Definition (HD) Model. The data sets included HANZE-Events, a catalog of over 1,500 catastrophic European flood events between 1870 and 2016, and a recent database of the highest recorded water levels based on data from over 4,200 weather stations.

“This analysis established a clear relationship between the occurrence of catastrophic flood events and the NAO phase,” explains Stefano Zanardo, principal modeler at RMS, “and confirmed that a positive NAO increased catastrophic flooding in Northern Europe, with a negative phase influencing flooding in Southern Europe. However, to ascertain the impact on actual flood losses we turned to the model.”

Modeling the Loss

The HD model generated a large set of potential catastrophic flood events and quantified the associated losses. It factored in not only precipitation but also rainfall runoff, river routing and inundation processes. Critically, the precipitation incorporated the impact of a simulated monthly NAO index as a driver for monthly rainfall.

“It showed that seasonal flood losses can increase or decrease by up to 50 percent between positive and negative NAOs, which is very significant,” states Zanardo. “What it also revealed were distinct regional patterns. For example, a positive state resulted in increased flood activity in the U.K. and Germany. These loss patterns provide a spatial correlation of flood risk not previously detected.”

Currently, NAO seasonal forecasting is limited to a few months. However, as this window expands, the potential for carriers to factor oscillation phases into flood-related renewal and capital allocation strategies will grow. Further, greater insight into spatial correlation could support more effective portfolio management.

“At this stage,” he concludes, “we have confirmed the link between the NAO and flood-related losses. How this evolves to influence carriers’ flood strategies is still to be seen, and a key factor will be advances in NAO forecasting. What is clear is that oscillations such as the NAO must be included in model assumptions to truly understand flood risk.”
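To make the methodology concrete, here is a minimal, hypothetical sketch of the kind of comparison described: simulated seasonal losses are tagged with the NAO phase associated with the precipitation that drove them, and mean losses are compared between phases. The numbers and the way losses are generated below are invented placeholders and are not the RMS HD model.

    import numpy as np

    rng = np.random.default_rng(7)

    # Placeholder: simulated winter seasons with a seasonal-mean NAO index and a
    # seasonal flood loss that is (artificially) shifted by NAO phase.
    n_seasons = 20_000
    nao_index = rng.normal(0.0, 1.0, size=n_seasons)
    base_loss = rng.lognormal(mean=10.0, sigma=1.0, size=n_seasons)
    # Toy region where positive NAO amplifies losses (e.g., Northern Europe).
    seasonal_loss = base_loss * (1.0 + 0.25 * np.tanh(nao_index))

    positive = seasonal_loss[nao_index > 0]
    negative = seasonal_loss[nao_index < 0]

    mean_pos, mean_neg = positive.mean(), negative.mean()
    print(f"Mean seasonal loss, NAO+ : {mean_pos:,.0f}")
    print(f"Mean seasonal loss, NAO- : {mean_neg:,.0f}")
    print(f"Relative difference      : {(mean_pos / mean_neg - 1):.0%}")

In the actual study the NAO enters through the simulated precipitation feeding the flood model rather than as a simple multiplier, and the reported differences of up to 50 percent vary by region.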

Nigel Allen, May 20, 2019
The Flames Burn Higher

With California experiencing two of the most devastating seasons on record in consecutive years, EXPOSURE asks whether wildfire now needs to be considered a peak peril

Some of the statistics for the 2018 U.S. wildfire season appear normal. The season was a below-average year for the number of fires reported, with 58,083 incidents representing only 84 percent of the 10-year average, while the 8,767,492 acres burned came in at 132 percent of that average. Two factors, however, made it exceptional. First, for the second consecutive year, the Great Basin experienced intense wildfire activity, with some 2.1 million acres burned — 233 percent of the 10-year average. And second, the fires destroyed 25,790 structures, with California accounting for over 23,600 of them, compared to a 10-year U.S. annual average of 2,701 residences, according to the National Interagency Fire Center.

As of January 28, 2019, reported insured losses for the November 2018 California wildfires, which included the Camp and Woolsey Fires, stood at US$11.4 billion, according to the California Department of Insurance. Add to this the insured losses of US$11.79 billion reported in January 2018 for the October and December 2017 California events, and these two consecutive wildfire seasons constitute the most devastating on record for the wildfire-exposed state.

Reaching its Peak?

Such colossal losses in consecutive years have sent shockwaves through the (re)insurance industry and are forcing a reassessment of wildfire’s secondary status in the peril hierarchy. According to Mark Bove, natural catastrophe solutions manager at Munich Reinsurance America, wildfire’s status needs to be elevated in highly exposed areas. “Wildfire should certainly be considered a peak peril in areas such as California and the Intermountain West,” he states, “but not for the nation as a whole.”

His views are echoed by Chris Folkman, senior director of product management at RMS. “Wildfire can no longer be viewed purely as a secondary peril in these exposed territories,” he says. “Six of the top 10 fires for structural destruction have occurred in the last 10 years in the U.S., while seven of the top 10, and 10 of the top 20, most destructive wildfires in California history have occurred since 2015. The industry now needs to achieve a level of maturity with regard to wildfire that is on a par with that of hurricane or flood.”

However, he is wary of potential knee-jerk reactions to this hike in wildfire-related losses. “There is a strong parallel between the 2017-18 wildfire seasons and the 2004-05 hurricane seasons in terms of people’s gut instincts. 2004 saw four hurricanes make landfall in Florida, with K-R-W (Katrina, Rita and Wilma) causing massive devastation in 2005. At the time, some pockets of the industry wondered out loud if parts of Florida were uninsurable. Yet the next decade was relatively benign in terms of hurricane activity.

“The key is to adopt a balanced, long-term view,” thinks Folkman. “At RMS, we think that fire severity is here to stay, while the frequency of big events may remain volatile from year to year.”

A Fundamental Re-evaluation

The California losses are forcing (re)insurers to overhaul their approach to wildfire, both at the individual risk and portfolio management levels.
“The 2017 and 2018 California wildfires have forced one of the biggest re-evaluations of a natural peril since Hurricane Andrew in 1992,” believes Bove. “For both California wildfire and Hurricane Andrew, the industry didn’t fully comprehend the potential loss severities. Catastrophe models were relatively new and had not gained market-wide adoption, and many organizations were not systematically monitoring and limiting large accumulation exposure in high-risk areas. As a result, the shocks to the industry were similar.”

For decades, approaches to underwriting have focused on the wildland-urban interface (WUI), the area where exposure and vegetation meet. However, exposure levels in these areas are increasing sharply. Combined with excessive amounts of burnable vegetation, extended wildfire seasons, and climate-change-driven increases in temperature and extreme weather conditions, these factors are causing a significant hike in exposure potential for the (re)insurance industry.

A recent report published in PNAS, “Rapid Growth of the U.S. Wildland-Urban Interface Raises Wildfire Risk,” showed that between 1990 and 2010 the WUI grew by 72,973 square miles (189,000 square kilometers) — an area larger than Washington State. The report stated: “Even though the WUI occupies less than one-tenth of the land area of the conterminous United States, 43 percent of all new houses were built there, and 61 percent of all new WUI houses were built in areas that were already in the WUI in 1990 (and remain in the WUI in 2010).”

“The WUI has formed a central component of how wildfire risk has been underwritten,” explains Folkman, “but you cannot simply adopt a black-and-white approach to risk selection based on properties within or outside of the zone. As recent losses, and in particular the 2017 Northern California wildfires, have shown, regions outside of the WUI zone considered low risk can still experience devastating losses.”

For Bove, while focus on the WUI is appropriate, particularly given the Coffey Park disaster during the 2017 Tubbs Fire, there is not enough focus on intermix areas, where properties are interspersed with vegetation. “In some ways, the wildfire risk to intermix communities is worse than that at the interface,” he explains. “In an intermix fire, you have both a wildfire and an urban conflagration impacting the town at the same time, while in interface locations the fire has largely transitioned to an urban fire.

“In an intermix community,” he continues, “the terrain is often more challenging and limits firefighter access to the fire as well as evacuation routes for local residents. Also, many intermix locations are far from large urban centers, limiting the amount of firefighting resources immediately available to start combatting the blaze, and this increases the potential for a fire in high-wind conditions to become a significant threat. Most likely we’ll see more scrutiny and investigation of risk in intermix towns across the nation after the Camp Fire’s decimation of Paradise, California.”

Rethinking Wildfire Analysis

According to Folkman, the need for greater market maturity around wildfire will require a rethink of how the industry currently analyzes the exposure and of the tools it uses. “Historically, the industry has relied primarily upon deterministic tools to quantify U.S.
wildfire risk,” he says, “which relate overall frequency and severity of events to the presence of fuel and climate conditions, such as high winds, low moisture and high temperatures.”

While such tools can prove valuable for addressing “typical” wildland fire events, such as the 2017 Thomas Fire in Southern California, their flaws have been exposed by other recent losses. “Such tools insufficiently address major catastrophic events that occur beyond the WUI into areas of dense exposure,” explains Folkman, “such as the Tubbs Fire in Northern California in 2017. Further, the unprecedented severity of recent wildfire events has exposed the weaknesses in maintaining a historically based deterministic approach.”

While the scale of the 2017-18 losses has focused (re)insurer attention on California, companies must also recognize that the scope for potential catastrophic wildfire risk extends beyond the boundaries of the western U.S. “While the frequency and severity of large, damaging fires is lower outside California,” says Bove, “there are many areas where the risk is far from negligible.” While acknowledging that the broader western U.S. is seeing increased risk due to WUI expansion, he adds: “Many may be surprised that similar wildfire risk exists across most of the southeastern U.S., as well as sections of the northeastern U.S., like in the Pine Barrens of southern New Jersey.”

As well as addressing the geographical gaps in wildfire analysis, Folkman believes the industry must also recognize the data gaps limiting its understanding. “There are a number of areas that are understated in underwriting practices currently, such as the far-ranging impacts of ember accumulations and their potential to ignite urban conflagrations, as well as the vulnerability of particular structures and mitigation measures such as defensible space and fire-resistant roof coverings.”

In generating its US$9 billion to US$13 billion loss estimate for the Camp and Woolsey Fires, RMS used its recently launched North America Wildfire High-Definition (HD) Models to simulate the ignition, fire spread, ember accumulations and smoke dispersion of the fires. “In assessing the contribution of embers, for example,” Folkman states, “we modeled the accumulation of embers, their wind-driven travel and their contribution to burn hazard both within and beyond the fire perimeter. The average ember contribution to structure damage and destruction is approximately 15 percent, but in a wind-driven event such as the Tubbs Fire this figure is much higher. This was a key factor in the urban conflagration in Coffey Park.”

The model also provides full contiguous U.S. coverage and includes other innovations such as ignition and footprint simulations for 50,000 years, flexible occurrence definitions, smoke and evacuation loss across and beyond the fire perimeter, and vulnerability and mitigation measures on which RMS collaborated with the Insurance Institute for Business & Home Safety.

Smoke damage, which leads to loss from evacuation orders and contents replacement, is often overlooked in risk assessments despite composing a tangible portion of the loss, says Folkman. “These are very high-frequency, medium-sized losses and must be considered. The Woolsey Fire saw 260,000 people evacuated, incurring hotel, meal and transport-related expenses.
Add to this smoke damage, which often results in high-value contents replacement, and you have a potential sea of medium-sized claims that can contribute significantly to the overall loss.”

A further data resolution challenge relates to property characteristics. While primary property attribute data is typically well captured, believes Bove, many secondary characteristics key to wildfire are either not captured or not captured consistently. “This leaves the industry overly reliant on both average model weightings and risk scoring tools. Information about defensible spaces, roofing and siding materials, and protecting vents and soffits from ember attacks are just a few of the additional fields that the industry will need to start capturing to better assess wildfire risk to a property.”

A Highly Complex Peril

Bove is, however, conscious of the simple fact that “wildfire behavior is extremely complex and non-linear.” He continues: “While visiting Paradise, I saw properties that did everything correct with regard to wildfire mitigation but still burned, and risks that did everything wrong and survived. However, mitigation efforts can improve the probability that a structure survives.”

“With more data on historical fires,” Folkman concludes, “more research into mitigation measures and increasing awareness of the risk, wildfire exposure can be addressed and managed. But it requires a team mentality, with all parties — (re)insurers, homeowners, communities, policymakers and land-use planners — playing their part.”
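Bove’s point about secondary property characteristics can be illustrated with a small, hypothetical schema sketch. The field names and categories below are invented for illustration and are not an industry or RMS data standard; the idea is simply that wildfire-relevant secondary attributes need explicit, consistently coded fields, plus a basic check of how completely they are populated.

    from dataclasses import dataclass
    from enum import Enum
    from typing import Optional

    class RoofMaterial(Enum):
        TILE = "tile"
        METAL = "metal"
        ASPHALT_SHINGLE = "asphalt_shingle"
        WOOD_SHAKE = "wood_shake"   # notably ember-vulnerable

    @dataclass
    class WildfireSecondaryAttributes:
        """Hypothetical secondary (wildfire-specific) fields for a property record."""
        defensible_space_ft: Optional[float]   # cleared zone around the structure
        roof_material: Optional[RoofMaterial]
        siding_material: Optional[str]         # e.g., "fiber_cement", "vinyl", "wood"
        ember_resistant_vents: Optional[bool]  # vents/soffits protected from ember entry
        enclosed_eaves: Optional[bool]

        def completeness(self) -> float:
            """Share of fields actually populated, as a simple data-quality check."""
            fields = [self.defensible_space_ft, self.roof_material, self.siding_material,
                      self.ember_resistant_vents, self.enclosed_eaves]
            return sum(f is not None for f in fields) / len(fields)

    record = WildfireSecondaryAttributes(30.0, RoofMaterial.WOOD_SHAKE, None, True, None)
    print(f"Secondary attribute completeness: {record.completeness():.0%}")  # 60%

Capturing these fields consistently is what allows a model, or a rating plan, to move beyond average weightings and score the individual structure.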

Helen Yates, September 05, 2018
The Future for Flood Protection

With innovation in the flood market increasing, EXPOSURE explores whether high-definition (HD) flood models are one of the keys to closing the protection gap

In August 2017, Hurricane Harvey brought the highest level of rainfall associated with a tropical cyclone in the U.S. since records began, causing catastrophic flooding in some of the most populated areas of the Texas coast, including Houston. The percentage of losses attributed to inland flood versus wind damage was significant, altering the historical view that precipitation resulting from a tropical storm or hurricane is an attritional loss and highlighting the need for stochastic modeling.

Total economic losses resulting from Harvey were around US$85 billion and insured losses were US$30 billion, revealing a significant protection gap, particularly where inland flood damage was concerned. Around 200,000 homes were inundated by the floods, and yet 80 percent of homes in the Houston area were uninsured.

An innovative catastrophe bond has suggested one way this protection gap could be reduced in the future, particularly as a private flood insurance market develops in the U.S. FloodSmart Re, announced at the end of July 2018, secured US$500 million of reinsurance protection on behalf of FEMA’s National Flood Insurance Program (NFIP). Hannover Re acted as the ceding reinsurer for the transaction, sitting between the NFIP and its Bermuda-based special purpose insurer.

“It’s a landmark transaction — the first time in history that the U.S. federal government is sponsoring a catastrophe bond,” says John Seo, co-founder and managing principal at Fermat Capital. “It’s just tremendous and I couldn’t be more excited. Events like Harvey are going to accelerate the development of the flood market in terms of risk transfer to the insurance-linked securities (ILS) market.

“You have to have more efficient risk pooling and risk sharing mechanisms,” he adds. “There’s over US$200 trillion of capital in the world, so there’s obviously enough to efficiently absorb event risk. So, it’s about, how do you get it out into that larger capital base in an efficient way?”

While the bond only provides cover for flooding arising from named storms, either due to storm surge or rainfall, it is a “good test case for the ILS market’s appetite for flood risks,” according to ILS blog Artemis. While “it is not a broad flood coverage, it will likely help to make it more palatable to cat bond investors given their comfort with modeling the probability of named storms, tropical storms and hurricanes.”

According to Cory Anger, global head of ILS origination and structuring at GC Securities, the ILS market is certainly showing an appetite for flood risk — including inland flood risk — with several catastrophe bonds completed during 2017 for European flood risk (Generali’s Lion II), Japanese flood risk (MSI and ADI’s Akibare Series 2018-1 Notes) and U.S. flood risk.

“Both public and private sector entities see value from utilizing capital markets’ capacity to manage flood risk,” she says. “We think there are other geographic regions that would be interested in ILS capacity that haven’t yet tapped the ILS markets. Given the recent success of FEMA/NFIP’s FloodSmart Re Series 2018-1 Notes, we expect FEMA/NFIP to continue to utilize ILS capacity (along with traditional reinsurance capital) to support future U.S.
flood risk transfer opportunities.”

The ILS sector has grown significantly over the past 15 years, with deals becoming more complex and innovative over time. Many market commentators feel the market was put to the test following the major natural catastrophe losses of 2017. Not only did bonds pay out where they were triggered, but fresh capital also re-entered, demonstrating investors’ confidence in the sector and its products.

“I’m hearing people starting to coin the phrase that 2018 is the ‘great reload,’” says Seo. “This is something I have been saying for quite some years: that the traditional hard-soft, soft-hard market cycle is over. It’s not that you can’t have an event so large that it doesn’t impact the market, but when it comes to capital markets, high yields are actually a siren call for capital.

“I don’t think anyone doubts that had 2017 occurred in the absence of the ILS market it would have been a completely different story, and we would have had a traditional hard market scenario in 2018,” he adds.

FloodSmart Re has clearly demonstrated the strong investor interest in such transactions. According to Anger, GC Securities acted as the structuring agent for the transaction and was one of two book runners. More than 35 capital markets investors provided fully collateralized protection to FEMA/NFIP on the landmark catastrophe bond.

“The appetite for new perils is generally strong, so there’s always strong interest when new risks are brought to market,” says Ben Brookes, managing director of capital and resilience solutions at RMS. He thinks improvements in underlying data quality, along with high-definition flood models, make it more likely that inland flood could be included as a peril in future catastrophe bond issuances on behalf of private insurers, on an indemnity basis.

“In the early days of the cat bond market, new perils would typically be issued with parametric triggers, because investors were skeptical that sufficient data quality was achieved or that the indemnity risks were adequately captured by cat models. But that changed as investor comfort grew, and a lot of capital entered the market and you saw all these deals becoming indemnity. Increased comfort with risk modeling was a big part of that.”

The innovative Blue Wings catastrophe bond, completed in 2007, which covered insurer Allianz for severe U.K. flood risk (and some U.S. and Canadian quake), is a good example. The parametric bond, which used an index to calculate flood depths at over 50 locations across the U.K., was ahead of its time and remains the only U.K. flood catastrophe bond to have come to market.

According to Anger, as models have become more robust for flood risk — whether due to tropical cyclone (storm surge and excess precipitation) or inland flooding (other than from tropical cyclone) — the investor base has been open to trigger selection (e.g., indemnity or parametric). “In general, insurers are preferring indemnity-triggered solutions,” she adds, “which the ILS market has concurrently been open to. Additionally, for this peril, the ILS community has been open to per-occurrence and annual aggregate structures, which gives flexibility to sponsors to incorporate ILS capital in their risk transfer programs.”

As the private market develops, cat bond sponsors from the insurance market would be more likely to bundle inland flood risk in with other perils, thinks Charlotte Acton, director of capital and resilience solutions at RMS.
“A degree of hurricane-induced inland flood risk is already present on a non-modeled basis within some transactions in the market,” she says. “And Harvey illustrates the value in comprehensive modeling of flooding associated with named storms.

“So, for a broader portfolio, in most cases, inland flood would be one piece of the picture, as it will be exposed to multiple perils. However, a stand-alone inland flood bond is possible for a public sector or corporate sponsor that has specific exposure to flood risk.”

With inland flood, as with all other perils, sophisticated models help to make markets. “A fund would look at the risk in and of itself in the deal, but of course they’d also want to understand the price and returns perspective as well,” says Brookes. “Models play into that quite heavily. You can’t price a bond well, and understand the returns of a bond, unless you understand the risk of it.”

As the ILS market makes increasing use of indemnity protection through ultimate net loss (UNL) triggers, sophisticated HD flood modeling will be essential in order to transfer the peril to the capital markets. This allows clear parameters to be set around different hours clauses and deductible structures, for instance, in addition to modeling all causes of flood and the influence of local defenses.

Jillian Williams, chief underwriting officer at Leadenhall Capital Partners, notes that ILS is increasingly bundling together multiple perils in an effort to gain diversification. “Diversification is important for any investment strategy, as you are always trying to minimize the risk of losing large amounts in one go,” she says. “Cat bonds (144A’s) currently have defined perils, but collateralized reinsurance and private cat bonds can cover all perils. Complexities and the flow of information to all parties will be a challenge for cat bonds to move from defined perils to UNL all perils.

“Any new peril or structure in a cat bond will generate many questions, even if they don’t have a major impact on the potential losses,” she continues. “Investors will want to know why the issuers want to include these new perils and structures and how the associated risk is calculated. For UNL, all flood (not just sea surge) would be included in the cat bond, so the definition of the peril, its complexities, variables and its correlation to other perils will need to be evaluated and represented in the flood models used.”

She thinks the potential to transfer more flood risk to the capital markets is there, but that the complexity of the peril presents challenges that need to be overcome, particularly in the U.S. “Flood coverage is already starting to move into the capital markets, but there are many issues that need to be worked through before it can be moved to a 144A transaction in a UNL format for many territories,” says Williams. “Just one of the complexities is that flood risk may be covered by government pools.

“To move flood perils from government pools to private insurers is like any evolution, it can take time, particularly if existing coverage is subsidized,” she adds. “For private insurers, the complexity is not just about flood modeling but also about ensuring risk-adequate pricing and navigating through government legislation.”
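The trigger structures discussed above can be made concrete with a small sketch. The example below is purely illustrative, with invented thresholds, locations and payout schedule: it shows a simplified parametric trigger in the spirit of the Blue Wings bond, where payout depends on an index of flood depths at fixed locations rather than on the sponsor’s actual losses.

    # Simplified parametric flood trigger: payout is a function of an index built
    # from flood depths (in meters) reported at predefined gauge locations.
    # All numbers are invented for illustration.

    ATTACHMENT = 10.0     # index level at which the bond starts to pay
    EXHAUSTION = 25.0     # index level at which the full limit is paid
    LIMIT = 100_000_000   # notional protection in USD

    # Per-location weights reflect exposure concentration near each gauge.
    location_weights = {"gauge_A": 2.0, "gauge_B": 1.5, "gauge_C": 1.0}

    def event_index(depths_m: dict[str, float]) -> float:
        """Weighted sum of flood depths above a 0.5 m nuisance threshold."""
        return sum(w * max(depths_m.get(loc, 0.0) - 0.5, 0.0)
                   for loc, w in location_weights.items())

    def payout(depths_m: dict[str, float]) -> float:
        """Linear payout between attachment and exhaustion of the index."""
        idx = event_index(depths_m)
        frac = (idx - ATTACHMENT) / (EXHAUSTION - ATTACHMENT)
        return LIMIT * min(max(frac, 0.0), 1.0)

    # Example event: deep flooding near gauge A, moderate elsewhere.
    event = {"gauge_A": 6.0, "gauge_B": 3.0, "gauge_C": 1.0}
    print(f"Index: {event_index(event):.1f}, payout: USD {payout(event):,.0f}")

An indemnity or UNL structure instead pays on the sponsor’s actual losses, which is why, as Brookes and Williams note, investor comfort with that format depends so heavily on the quality of the underlying flood model.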
