Marleen Nyst and Nilesh Shome of RMS explore some of the lessons and implications from the recent sequence of earthquakes in California

On the morning of July 4, 2019, the small town of Ridgecrest in California’s Mojave Desert unexpectedly found itself at the center of a major news story after a magnitude 6.4 earthquake occurred close by. This earthquake proved to be a foreshock for a magnitude 7.1 earthquake the following day, the strongest earthquake to hit the state in 20 years. These events, part of a series of earthquakes and aftershocks that were felt by millions of people across the state, briefly reignited awareness of the threat posed by earthquakes in California. Fortunately, damage from the Ridgecrest earthquake sequence was relatively limited. As the event did not cause a widespread social or economic impact, its passage through the news agenda was relatively swift. But there are several reasons why an event such as the Ridgecrest earthquake sequence should be a focus of attention both for the insurance industry and for the residents and local authorities of California.

“If Ridgecrest had happened in a more densely populated area, this state would be facing a far different economic future than it is today”
Glenn Pomeroy, California Earthquake Authority

“We don’t want to minimize the experiences of those whose homes or property were damaged or who were injured when these two powerful earthquakes struck, because for them these earthquakes will have a lasting impact, and they face some difficult days ahead,” explains Glenn Pomeroy, chief executive of the California Earthquake Authority. “However, if this series of earthquakes had happened in a more densely populated area or an area with thousands of very old, vulnerable homes, such as Los Angeles or the San Francisco Bay Area, this state would be facing a far different economic future than it is today — potentially a massive financial crisis,” Pomeroy says. Although one of the most populous U.S.
states, California has most of its population concentrated in metropolitan areas. A major earthquake in one of these areas could have repercussions for both the domestic and international economy.

Low Probability, High Impact

Earthquake is a low probability, high impact peril. In California, earthquake risk awareness is low, both among the general public and many (re)insurers. The peril has not caused a major insured loss for 25 years, the last being the magnitude 6.7 Northridge earthquake in 1994. Yet California earthquake has the potential to cause large-scale insured and economic damage. A repeat of the Northridge event would likely cost the insurance industry around US$30 billion today, according to the latest version of the RMS® North America Earthquake Models, and Northridge is far from a worst-case scenario. From an insurance perspective, one of the most significant earthquake events on record would be the magnitude 9.0 Tōhoku Earthquake and Tsunami in 2011. For California, the 1906 magnitude 7.8 San Francisco earthquake, when Lloyd’s underwriter Cuthbert Heath famously instructed his San Franciscan agent to “pay all of our policyholders in full, irrespective of the terms of their policies,” remains historically significant. Heath’s actions led to a Lloyd’s payout of around US$50 million at the time and helped cement Lloyd’s reputation in the U.S. market. RMS models suggest a repeat of this event today could cost the insurance industry around US$50 billion. Because of the surprisingly low penetration of earthquake insurance in the state, the economic cost of such an event could be around six times the insurance bill — as much as US$300 billion — even before considering damage to infrastructure and government buildings. Events such as the 1906 earthquake and even Northridge are too far in the past to remain in the public consciousness, and the lack of awareness of the peril’s damage potential is demonstrated by the low take-up of earthquake insurance in the state.
“Because large, damaging earthquakes don’t happen very frequently, and we never know when they will happen, for many people it’s out of sight, out of mind. They simply think it won’t happen to them,” Pomeroy says. Across California, an average of just 12 percent to 14 percent of homeowners have earthquake insurance. Take-up varies across the state, with some high-risk regions, such as the San Francisco Bay Area, experiencing take-up below the state average. Take-up tends to be slightly higher in Southern California and is around 20 percent in Los Angeles and Orange counties. Take-up typically increases in the aftermath of an event as public awareness rises, but rapidly falls as the risk fades from memory. As with any low probability, high impact event, there is a danger the public will not be well prepared when a major event strikes. The insurance industry can take steps to address this challenge, particularly by working to increase awareness of earthquake risk and actively promoting the importance of having insurance coverage for faster recovery. RMS and its insurance partners have also been working to improve society’s resilience against risks such as earthquake, through initiatives such as the 100 Resilient Cities program.

Understanding the Risk

While the tools to model and understand earthquake risk are improving all the time, there remain several unknowns that underwriters should be aware of. One of the reasons the Ridgecrest Earthquake came as such a surprise was that the fault on which it occurred was not one that seismologists knew existed. Several other recent earthquakes — such as the 2014 Napa event, the Landers and Big Bear Earthquakes in 1992, and the Loma Prieta Earthquake in 1989 — took place on faults or fault strands that were previously unknown or thought to be inactive. As well as not having a full picture of where the faults may lie, scientific understanding of how multiple faults can link together to form a larger event is also changing.
Events such as the Kaikoura Earthquake in New Zealand in 2016 and the Baja California Earthquake in Mexico in 2010 have helped inform new scientific thinking that faults can link together to cause larger magnitude, more damaging earthquakes. The RMS North America Earthquake Models have evolved to factor in this thinking and capture multifault ruptures based on the latest research results. In addition, studying the interaction between the faults that ruptured in the Ridgecrest events will allow RMS to improve the fault connectivity in the models. A further lesson from New Zealand came via the 2011 Christchurch Earthquake, which demonstrated how liquefaction of soil can be a significant loss driver where soil conditions are susceptible. The San Francisco Bay Area, an important national and international economic hub, could suffer a similar impact in the event of a major earthquake. Across the area, there has been significant residential and commercial development on artificial landfill over the last 100 years, and these areas are prone to significant liquefaction damage, similar to what was observed in Christchurch.

Location, Location, Location

Clearly, the location of an earthquake is critical to the scale of damage and the insured and economic impact of an event. Ridgecrest is situated roughly 200 kilometers north of Los Angeles. Had the recent earthquake sequence occurred beneath Los Angeles instead, it is plausible that the insured cost could have been in excess of US$100 billion. The Puente Hills Fault, which sits underneath downtown LA, wasn’t discovered until around the turn of the century. According to RMS modeling, a magnitude 6.8 Puente Hills event could cause an insured loss of US$78.6 billion, and a Newport-Inglewood magnitude 7.3 event an estimated US$77.1 billion.
These are just two of the examples within the RMS stochastic event set with a similar magnitude to the Ridgecrest events that could have a significant social, economic and insured loss impact if they took place elsewhere in the state. The RMS model estimates that magnitude 7 earthquakes in California could cause insurance industry losses ranging from US$20,000 to US$20 billion, and the maximum loss could exceed US$100 billion for an event in a high population center such as Los Angeles. The losses from the Ridgecrest events were at the low end of this range because they occurred in a sparsely populated area. For the California Earthquake Authority’s portfolio in Los Angeles County, a large loss event of US$10 billion or greater can be expected approximately every 30 years. As with any major catastrophe, several factors can drive up the insured loss bill, including post-event loss amplification and contingent business interruption, given the potential scale of disruption. In Sacramento, there is also a risk of failure of the levee system. Fire following earthquake was a significant cause of damage in the 1906 San Francisco Earthquake, estimated to account for around 40 percent of the overall loss from that event. Fire is, however, expected to make a much smaller contribution to future events, given modern construction materials and methods and fire suppressant systems. Political pressure to settle claims could also drive up the loss total from an event. Lawmakers could put pressure on the California Earthquake Authority (CEA) and other insurers to settle claims quickly, as has been the case in the aftermath of other catastrophes, such as Hurricane Sandy. The CEA has recommended that homes built prior to 1980 be seismically retrofitted to make them less vulnerable to earthquake damage. “We all need to learn the lesson of Ridgecrest: California needs to be better prepared for the next big earthquake because it’s sure to come,” Pomeroy says.
“We recommend people consider earthquake insurance to protect themselves financially,” he continues. “The government’s not going to come in and rebuild everybody’s home, and a regular residential insurance policy does not cover earthquake damage. The only way to be covered for earthquake damage is to have an additional earthquake insurance policy in place. “Close to 90 percent of the state does not have an earthquake insurance policy in place. Let this be the wake-up call that we all need to get prepared.”
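The recurrence figure quoted above, a US$10 billion-plus loss for the CEA's Los Angeles County portfolio roughly every 30 years, lends itself to a quick back-of-the-envelope calculation. Assuming, purely for illustration, that such events follow a simple Poisson occurrence model (an assumption of this sketch, not a statement of RMS or CEA methodology), the chance of seeing at least one such loss over different horizons works out as follows:

```python
# Hedged sketch: translating "a US$10 billion-plus loss roughly every 30 years"
# into occurrence probabilities, assuming a simple Poisson occurrence model
# (an illustration only, not the actual event-rate methodology).
import math

def prob_at_least_one(annual_rate: float, years: float) -> float:
    """P(at least one event in `years`) under a Poisson process with the given annual rate."""
    return 1.0 - math.exp(-annual_rate * years)

rate = 1.0 / 30.0  # one US$10B+ event per ~30 years on average

print(f"In any single year: {prob_at_least_one(rate, 1):.1%}")   # about 3.3%
print(f"Over a decade:      {prob_at_least_one(rate, 10):.1%}")  # about 28.3%
print(f"Over 30 years:      {prob_at_least_one(rate, 30):.1%}")  # about 63.2%
```

Even under this simple model, a loss that is "out of sight, out of mind" year to year becomes more likely than not over the life of a 30-year mortgage.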
Why is it that, in many different situations and perils, people appear to want to relocate toward the risk? What is the role of the private insurance and reinsurance industry in curbing their clients’ risk tropism?

Florida showed rapid percentage growth in terms of exposure and number of policyholders

If the Great Miami Hurricane of 1926 were to occur again today, it would result in insurance losses approaching US$200 billion. Even adjusted for inflation, that is hundreds of times more than the US$100 million damage toll in 1926. Over the past 100 years, the Florida coast has developed exponentially, with wealthy individuals drawn to buying lavish coastal properties — and the accompanying wind and storm-surge risks. Since 2000, the number of people living in coastal areas of Florida increased by 4.2 million, or 27 percent, to 19.8 million in 2015, according to the U.S. Census Bureau. This is an example of unintended “risk tropism,” explains Robert Muir-Wood, chief research officer at RMS. Just as the sunflower is a “heliotrope,” turning toward the sun, research has shown that humans have an innate drive to live near water, on a river or at the beach, often at increased risk of flood hazards. “There is a very strong human desire to find the perfect primal location for your house. It is something that is built deeply into the human psyche,” Muir-Wood explains. “People want to live with the sound of the sea, or in the forest ‘close to nature,’ and they are drawn to these locations thinking about all the positives and amenity values, but not really understanding or evaluating the accompanying risk factors. “People will pay a lot to live right next to the ocean,” he adds.
“It’s an incredibly powerful force and they will invest in doing that, so the price of land goes up by a factor of two or three times when you get close to the beach.” Even when beachfront properties are wiped out in hurricane catastrophes, far from driving individuals away from a high-risk zone, research shows they simply “build back bigger,” says Muir-Wood. “The disaster can provide the opportunity to start again, and wealthier people move in and take the opportunity to rebuild grander houses. At least the new houses are more likely to be built to code, so maybe the reduction in vulnerability partly offsets the increased exposure at risk.” Risk tropism can also be found in the encroachment of high-value properties into the wildlands of California, leading to a big increase in wildfire insurance losses. Living close to trees can be good for mental health until those same trees bring a conflagration. Insurance losses due to wildfire exceeded US$10 billion in 2017 and have already breached US$12 billion for last year’s Camp, Hill and Woolsey Fires, according to the California Department of Insurance. It is not the number of fires that has increased, but the number of houses consumed by the fires.

“Insurance tends to stop working when you have levels of risk above one percent […] People are unprepared to pay for it”
Robert Muir-Wood, RMS

Muir-Wood notes that the footprint of the 2017 Tubbs Fire, with claims reaching nearly US$10 billion, was very similar to the area burned during the Hanley Fire of 1964. The principal difference in outcome is driven by how much housing has been developed in the path of the fire. “If a fire like that arrives twice in one hundred years to destroy your house, then the amount you are going to have to pay in insurance premium is going to be more than 2 percent of the value per year,” he says.
“People will think that’s unjustified and will resist it, but actually insurance tends to stop working when you have levels of risk cost above 1 percent of the property value, meaning, quite simply, that people are unprepared to pay for it.” Risk tropism can also be found in the business sector, in the way that technology companies have clustered in Silicon Valley: a tectonic rift within a fast-moving tectonic plate boundary. The tectonics have created the San Francisco Bay and modulate the climate to bring natural air-conditioning. “Why is it that, around the world, the technology sector has picked locations — including Silicon Valley, Seattle, Japan and Taiwan — that are on plate boundaries and are earthquake prone?” asks Muir-Wood. “There seems to be some ideal mix of mountains and water. The Bay Area is a very attractive environment, which has brought the best students to the universities and has helped companies attract some of the smartest people to come and live and work in Silicon Valley,” he continues. “But one day there will be a magnitude 7+ earthquake in the Bay Area that will bring incredible disruption, and that will affect the technology firms themselves.” Insurance and reinsurance companies have an important role to play in informing organizations and high net worth individuals of the risks, and dissuading them from being drawn toward highly exposed locations; they can help by pricing the risk correctly and maintaining underwriting discipline. The difficulty comes when politics and insurance collide. The growth of Fair Access to Insurance Requirements (FAIR) plans and beach plans, offering more affordable insurance in parts of the U.S. that are highly exposed to wind and quake perils, is one example of how this function is undermined. At its peak, the size of the residual market in hurricane-exposed states was US$885 billion, according to the Insurance Information Institute (III).
It has steadily been reduced, partly as a result of the influx of non-traditional capacity from the ILS market and competitive pricing in the general reinsurance market. However, in many cases the markets-of-last-resort remain some of the largest property insurers in coastal states. Between 2005 and 2009 (following Hurricanes Charley, Frances, Ivan and Jeanne in 2004), the plans in Mississippi, Texas and Florida showed rapid percentage growth in terms of exposure and number of policyholders. A factor fueling this growth, according to the III, was the rise in coastal properties. As long as state-backed insurers are willing to subsidize the cost of cover for those choosing to locate in the riskiest locations, private (re)insurance will fail as an effective check on risk tropism, thinks Muir-Wood. “In California there are quite a few properties that have not been able to get standard fire insurance,” he observes. “But there are state or government-backed schemes available, and they are being used by people whose wildfire risk is considered to be too high.”
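Muir-Wood's arithmetic on insurability can be made concrete. The sketch below annualizes the expected loss from his example of two total losses in a century and compares it with the 1 percent rule of thumb he describes; the function name, the treatment of the threshold as a hard cutoff, and the figures themselves are illustrative assumptions, not actual rating inputs:

```python
# Hedged sketch of Muir-Wood's arithmetic: a home destroyed twice in a
# century carries a pure annual risk cost of 2% of the property value,
# above the ~1% level at which, he argues, insurance "stops working".
# All figures are illustrative assumptions, not actual rating inputs.

def annual_risk_cost_pct(total_losses: int, period_years: float,
                         damage_ratio: float = 1.0) -> float:
    """Annualized expected loss as a percentage of property value."""
    return (total_losses * damage_ratio / period_years) * 100.0

INSURABILITY_THRESHOLD_PCT = 1.0  # Muir-Wood's rough rule of thumb

risk_cost = annual_risk_cost_pct(total_losses=2, period_years=100)
print(f"Annual risk cost: {risk_cost:.1f}% of property value")
print(f"Below the ~1% insurability threshold? {risk_cost <= INSURABILITY_THRESHOLD_PCT}")
```

The point of the sketch is that the premium follows directly from event frequency: halving the frequency (or the damage ratio, via mitigation or building codes) is the only way to bring the risk cost back under the threshold.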
Are (re)insurers sufficiently capitalized to withstand a major earthquake in a metropolitan area during peak hours?

The U.S. workers’ compensation insurance market continues to generate underwriting profit. According to Fitch Ratings, the market delivered a statutory combined ratio of 86 percent in 2018, and 2019 is on track to mark the fifth consecutive year of profits. Since 2015, the market has achieved an annual average combined ratio of 93 percent. Its size has increased considerably since the 2008 financial crisis sparked a flurry of activity in the workers’ compensation arena. Over the last 10 years, written premiums have risen 50 percent, from approximately US$40 billion to almost US$60 billion, aided by low unemployment and growth in rates and wages. Yet market conditions are changing. The pricing environment is deteriorating, prior-year reserve releases are slowing and severity is ticking upward. And while loss reserves currently top US$150 billion, questions remain over whether these are sufficient to bear the brunt of a major earthquake in a highly populated area.

The Big One

California represents over 20 percent of the U.S. workers’ compensation market. The Workers’ Compensation Insurance Rating Bureau of California (WCIRB) forecasts a written premium pot of US$15.7 billion for 2019, a decline on 2018’s US$17 billion figure. “So, the workers’ compensation sector’s largest premium is concentrated in the area of the U.S. most exposed to earthquake risk,” explains Nilesh Shome, vice president at RMS. “This problem is unique to the U.S., since in most other countries occupational injury is covered by government insurance schemes instead of the private market. Further, workers’ compensation policies have no limits, so they can be severely impacted by a large earthquake.” Workers’ compensation insurers enjoy relatively healthy balance sheets, with adequate profitability and conservative premium-to-surplus ratios.
But when you assess the industry’s exposure to large earthquakes in more detail, the surplus base starts to look a little smaller. “We are also talking about a marketplace untested in modern times,” he continues. “The 1994 Northridge Earthquake in Los Angeles, for example, while causing major loss, occurred at 4:30 a.m. when most people were still in bed, so it had limited impact from a workers’ compensation perspective.”

Analyzing the Numbers

Working with the WCIRB, RMS modeled earthquake scenarios using Version 17 of the RMS® North America Earthquake Casualty Model, which incorporates the latest science in earthquake hazard and vulnerability research. The portfolio provided by the WCIRB contained exposure information for 11 million full-time-equivalent employees, including occupation details for each. The analysis showed that the average annual estimated insured loss is US$29 million, which corresponds to 0.5 cents per US$100 of payroll and US$2.50 per employee. The 1-in-100-year insured loss is expected to exceed US$300 million, with around 5,000 casualties including 300 fatalities; at peak working hours, the loss could rise to US$1.5 billion. A 1-in-250-year loss could top US$1.4 billion with more than 1,000 fatalities, rising to US$5 billion at peak working hours. A repeat of the magnitude 7.8 San Francisco Earthquake of 1906, which struck at 5:12 a.m., would cause an estimated 7,300 injuries, 1,900 fatalities and around US$1 billion in loss; at peak working hours, this would rise to 22,000 casualties, 5,800 fatalities and a US$3 billion loss. To help reduce the impact of major earthquakes, RMS is working with the Berkeley Research Lab and the United States Geological Survey (USGS) to research the benefits of an earthquake early warning system (EEWS) and safety measures such as drop-cover-hold and evacuating buildings after an EEWS alarm.
Initial studies indicate that an EEWS alert for large, faraway earthquakes, such as the 1857 magnitude 7.9 Fort Tejon Earthquake near Los Angeles, can reduce injuries by 20 to 50 percent. Shome concludes: “It is well known in the industry that the workers’ compensation loss distribution has a long tail, and at conferences RMS has demonstrated how our modeling best captures this tail. The model considers many low probability, high consequence events by accurately modeling the latest USGS findings.”
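The per-payroll and per-employee rates quoted in the WCIRB analysis above can be reproduced with simple arithmetic. The US$580 billion payroll figure below is backed out from the quoted numbers (US$29 million of average annual loss at 0.5 cents per US$100 of payroll) and is an assumption for illustration, not WCIRB data; note also that US$29 million over 11 million employees works out to about US$2.64, close to the roughly US$2.50 quoted:

```python
# Hedged sketch reproducing the per-payroll and per-employee rates from the
# WCIRB analysis. PAYROLL is backed out from the quoted figures and is an
# illustrative assumption, not WCIRB data.

def rate_per_100_payroll(aal: float, payroll: float) -> float:
    """Average annual loss expressed in cents per US$100 of payroll."""
    return aal / payroll * 100 * 100  # per-dollar -> per-US$100, dollars -> cents

AAL = 29e6        # average annual estimated insured loss (US$)
EMPLOYEES = 11e6  # full-time-equivalent employees in the portfolio
PAYROLL = 580e9   # implied annual payroll (assumption, see lead-in)

print(f"{rate_per_100_payroll(AAL, PAYROLL):.2f} cents per US$100 payroll")  # 0.50
print(f"US${AAL / EMPLOYEES:.2f} per employee")  # ~2.64, quoted as roughly US$2.50
```

Expressing AAL per unit of payroll is what makes the figure comparable across portfolios of different sizes, which is presumably why the WCIRB quotes it that way.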
With the introduction of the Risk Data Open Standard, the potential now exists to change the way the (re)insurance industry interacts with risk modeling data

In May 2019, RMS introduced the (re)insurance industry to a new open data standard. Set to redefine how the market structures data, the Risk Data Open Standard (RDOS) offers a flexible, fully transparent and highly efficient framework — spanning all risks, models, contracts and information sets — that can be implemented using a wide range of data technology.

“The RDOS has been constructed to hold the entire set of information that supports the analysis of any risk”
Ryan Ogaard, RMS

That this new standard has the potential to fundamentally alter how the market interacts with exposure data is not hyperbole. Consider the formats it is replacing. The RMS Exposure and Results Data Modules (EDM and RDM) have been the data cornerstones of the property catastrophe market for over 20 years. Other vendors use similar data formats, and some catastrophe modeling firms have their own versions. These information workhorses have served the sector well, transforming the way property catastrophe risk is transacted, priced and managed.

Out With the Old

But after more than two decades of dedicated service, it is past time these formats were put out to pasture. Built to handle a narrow range of modeling approaches, limited in their ability to handle multiple information formats, property-centric by design and powered by outdated technology, the EDM/RDM and other formats represent “old-gen” standards crumbling under current data demands. “EDM and RDM have earned their status as the de facto standards for property catastrophe data exchange,” explains Ryan Ogaard, senior vice president at RMS. “Clearly documented, easy to implement and SQL-based, they were groundbreaking and have been used extensively in systems and processes for over 20 years.
But the industry has evolved well beyond the capabilities of all the existing formats, and a new data model must be introduced to facilitate innovation and efficiency across our industry.” The RDOS is not the only attempt to solve the data formatting challenge. Multiple other initiatives have been attempted, or are underway, to improve data efficiency within the insurance industry. However, Ogaard believes all of these share one fatal flaw: they do not go far enough. “I have been involved in various industry groups exploring ways to overcome data challenges,” he explains, “and have examined the potential of different options. But in every instance, what is clear is that they would not advance the industry far enough to make them worth switching to.” Switching costs are a major issue with any new data standard. Transitioning to a new format from one so firmly embedded within your data hierarchy is a considerable move, and a shift to a new standard that offers only marginal relief from the data pains of the current system would not be enough. “The industry needs a data container that can be extended to new coverages, risk types or contracts,” he states. “If we require a different format for every line of business or type of model, we end up with a multiplicative world of data inefficiency. Look at cyber risk. We’ve already created a separate new standard for that information. If our industry is truly going to move forward, the switch must solve our challenges in the short, medium and long term. That means a future-proof design to handle new models, risks and contracts — ideally all in one container.”

Setting the Standard

Several years in the making, the RDOS is designed to address every deficiency in the current formatting framework, providing a data container that can be easily modified as needs change and can deliver information in a single, auditable format that supports a wide range of analytics.
It is already used within the framework of the recently launched risk management platform RMS Risk Intelligence™. “The RDOS is designed to be extended across several dimensions,” Ogaard continues. “It can handle the data and output to support any modeling algorithm — so RMS, or anyone else, can use it as a basis for new or existing models. It was originally built to support our high-definition (HD) modeling, which requires a domain-specific language to represent policy or treaty terms and structures — something that was not possible with the old format. During that process, we realized that we should design a container that would not have to be replaced in the future when we inevitably build other types of models.” The RDOS can also span all business lines. It is designed to accommodate the description of any risk item or subject at risk. The standard has inherent flexibility: new tables can be introduced to the framework without disrupting existing sets, while current tables can be extended to handle information for multiple model types or additional proprietary data. “EDM and RDM were fundamental to creating a much more stable, resilient and dynamic marketplace,” says Ogaard. “That level of modeling simply isn’t available across other lines — but with the RDOS it can be. Right off the bat, that has huge implications for issues such as clash risk. By taking the data that exists across your policy and treaty systems and converting it into a single data format, you can then apply an accumulation engine to evaluate all clash scenarios. So, essentially, you can tackle accumulation risk across all business lines.” The RDOS is also built to encompass the full “risk story.” Current data formats essentially provide exposure and modeling results but lack critical information on how the exposure was used to create those results. This means that anyone receiving these data sets must rely on an explanation of how an analysis was done — or figure it out themselves.
“The RDOS has been constructed to hold the entire set of information that supports the analysis of any risk,” he explains. “This includes exposures, (re)insurance coverage information, the business structure used to create the results, complete model settings and adjustments, the results, and the linkage between the information. Multiple analyses can also be included in a single container. That means more time can be spent on accurate risk decision-making.” The RDOS is also independent of any specific technology and can be implemented in modern object-relational technology, making it highly flexible. It can equally be implemented in SQL Server if the limitations of a relational representation are adequate for the intended usage. The insurance industry, and catastrophe analytics software in particular, has been slow to adopt tools such as Parquet, Spark, Athena and other powerful, often open-source data technologies that can drive deeper data insights.

Opening the Box

For the RDOS to achieve its full potential, however, it cannot be constrained by ownership. By its very nature, it must be an open standard operated in a neutral environment if it is to be adopted by all and serve a larger market purpose. RMS recognized this and donated the RDOS to the industry (and beyond) as an open standard, harnessing open-source principles common in the software industry. Taking this route is perhaps not surprising given the executive leadership now in place at the company, with both CEO Karen White and Executive Vice President of Product Cihan Biyikoglu having strong open-source credentials. “When they saw the RDOS,” Ogaard explains, “it clearly had all of the hallmarks of an open-source candidate.
It was being built by a leading market player with an industrywide purpose that required a collaborative approach.” What RMS has created with the RDOS represents a viable standard — but rather than a finished product, it is a series of building blocks designed to support a vast range of new applications from across the market. And to do that it must be a completely open standard that can evolve with the industry. “Some companies claim to have open standards,” he continues, “but by that they mean that you can look inside the box. Truly open standards are set up to be overseen and actually modified by the industry. With the RDOS, companies can not only open the box, but take the standard out, use it and modify it to create something better. They can build additions and submit them for inclusion and use by the entire industry. The RDOS will not be driven by RMS needs and priorities — it will exist as a separate entity. RMS cannot build every potential solution or model. We hope that by making this an open standard, new synergy is created that will benefit everyone — including us, of course.”

Under Scrutiny

To create a standard fit for all, RMS accepted that the RDOS could not be built in isolation and pushed out into the market — it had to be tested, the underlying premise reviewed, the format scrutinized. To ensure this, the company set up a steering committee drawn from across the (re)insurance market. Charged with putting the RDOS through its paces, the committee members are given a central role in virtually every development stage. The committee is currently sixteen companies strong and growing. It is intended to be dynamic, with membership changing over time as issues and company focuses evolve. The membership list can be seen at www.riskdataos.org. “You cannot sit in an ivory tower and decide what might work for the industry as a whole,” Ogaard explains.
“You need a robust vetting process, and by creating this group of leading (re)insurance practitioners, each committed not simply to the success of the project but to the development of the best possible data solution, the RDOS will be guided by the industry, not just one company.” The role of the committee is twofold. First, it reviewed the existing specification, documentation and tooling to determine whether they were ready for market consumption. The RDOS saw its industry launch at the end of January 2020; now that it is published, the committee’s second role is to advise on the priorities and scope of future developments, based on market-led requests for change and improvement. “Almost every open standard in any industry is based on a real, working product — not a theoretical construct,” he states. “Because the RDOS was built for a practical purpose and is in real-world use, it is much more likely to hold up to wider use and scrutiny.” So, while awareness of the RDOS may still be growing in the wider market, it has already established its data credentials within the RMS model framework. Of course, there remains the fundamental challenge of shifting from one data format to another — but measures are already in place to make this as painless as possible. “The RDOS is essentially a superset of the original EDM and RDM formats,” he explains, “offering an environment in which the new and old standards are interchangeable. So, a company can translate an EDM into an RDOS and vice versa. The open standard tooling will include translators to perform this conversion. Users will therefore be able to operate both formats simultaneously and, as they recognize the benefits of the RDOS, transition to that environment at their own pace.
The RDOS could be extended to include other modelers’ data fields as well — so could solve model interoperability issues — if the industry decides to use it this way.” The standard has launched on the global development platform GitHub, which supports open-source standards, offering a series of downloadable assets including the RDOS specification, documentation, tools and data so that companies can create their own implementation and translate to and from old data formats. The potential that it creates is considerable and to a degree only limited by the willingness of users to push boundaries. “Success could come in several forms,” Ogaard concludes. “The RDOS becomes the single universal container for data exchange, creating huge efficiencies. Or it creates a robust ecosystem of developers opening up new opportunities and promoting greater industry choice. Or it supports new products that could not be foreseen today and creates synergies that drive more value — perhaps even outside the traditional market. Ideally, all of these things.”
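The superset relationship described above — where an EDM can be translated into an RDOS and back, with fields the legacy format cannot carry held separately — can be sketched in code. This is a purely hypothetical illustration: all field names here are invented, and none of this reflects the actual RDOS specification or tooling.

```python
# Hypothetical sketch of a two-way format translation, in the spirit of the
# EDM <-> RDOS round trip described above. Field names are invented for
# illustration and do NOT come from the actual RDOS specification.

def edm_to_rdos(edm_record: dict) -> dict:
    """Map a legacy EDM-style location record onto an RDOS-style record."""
    return {
        "location_id": edm_record["LOCNUM"],
        "latitude": edm_record["LAT"],
        "longitude": edm_record["LON"],
        "replacement_value": edm_record["VALUE"],
        # Data the legacy format cannot represent would travel in an
        # extensions container, mirroring the "superset" idea above.
        "extensions": {},
    }

def rdos_to_edm(rdos_record: dict) -> dict:
    """Reverse mapping; extension data is dropped, as the legacy format
    has nowhere to put it."""
    return {
        "LOCNUM": rdos_record["location_id"],
        "LAT": rdos_record["latitude"],
        "LON": rdos_record["longitude"],
        "VALUE": rdos_record["replacement_value"],
    }
```

Because the newer format is a superset, the round trip from the legacy format and back is lossless, which is what lets users operate both formats simultaneously during a transition.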
Insurance-linked securities (ILS) investors want to know more about how climate change impacts investment decisions, according to Paul Wilson, head of non-life analytics at Securis Investment Partners, an ILS asset manager. “We make investments that typically run from one year to two or three years in duration, so we need to understand the implications of climate change on those timescales,” explains Wilson. “We reevaluate investments as part of any renewal process, and it’s right to ask if any opportunity is still attractive given what we know about how our climate is changing.

“The fundamental question that we’re trying to address is, ‘Have I priced the risk of this investment correctly for the next year?’” he continues. “And therefore, we need to know if the catastrophe models we are using accurately account for the impact climate change may be having. Or are they overly reliant on historical data and, as such, not actually representing the true current risk levels for today’s climate?”

Expertise in climate change is now a requirement for how Securis thinks about risk. “We have investors who are asking questions about climate change, and we have a responsibility to be able to demonstrate to them that we are taking the implications into consideration in our investment decisions.”

“We have investors who are asking questions about climate change, and we have a responsibility to demonstrate to them that we are taking the implications into consideration in our investment decisions”

Paul Wilson, Securis Investment Partners

The rate at which a changing climate may influence natural catastrophes will present both a challenge and an opportunity to the wider industry, as well as to catastrophe modeling companies, thinks Wilson. The results coming out of climate change attribution studies are going to have to start informing decisions around risk.
For example, according to attribution studies, climate change tripled the chances of Hurricane Harvey’s record rainfall. “Climate change is a big challenge for the catastrophe modeling community,” he says. “It’s going to put a greater burden on catastrophe modelers to ensure that their models are up to date. The frequency and nature of model updates will have to change. Models we are using today may become out of date in just a few years’ time. That’s interesting when you think about the number of perils and regions where climate change could have a significant impact.

“All of those climate-related models could be impacted by climate change, so we have to question the impact it is having today,” he adds. “If the model you are using to price the risk has been calibrated to the last 50 years, but you believe the last 10 or last 20 years are more representative because of the implications of climate change, then how do you adjust your model accordingly? That’s the question we should all be looking to address.”
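One simple way to frame Wilson’s recalibration question is to stop weighting all years of a historical catalog equally and instead down-weight older observations, so that the last 10 to 20 years dominate the estimated event rate. The sketch below is one illustrative approach, not any modeler’s actual methodology; the exponential form and the half-life parameter are assumptions chosen for clarity.

```python
# A minimal sketch of recency weighting for a historical event catalog:
# instead of a flat average over 50 years, weight each year's observation
# by an exponential decay so recent years count for more. The half-life
# value is an illustrative assumption, not a calibrated parameter.

import math

def weighted_annual_rate(event_counts_by_year, current_year, half_life=15.0):
    """Estimate an annual event rate, down-weighting older observations.

    event_counts_by_year: mapping of year -> number of events observed.
    half_life: number of years after which an observation's weight halves.
    """
    decay = math.log(2.0) / half_life
    weights = {
        year: math.exp(-decay * (current_year - year))
        for year in event_counts_by_year
    }
    total_weight = sum(weights.values())
    weighted_events = sum(
        event_counts_by_year[year] * weights[year] for year in weights
    )
    return weighted_events / total_weight
```

With a uniform catalog this reduces to the plain long-run average, but if activity has picked up in the most recent decades, the weighted rate sits above the flat 50-year mean — which is exactly the adjustment Wilson is asking about.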
As environmental, social and governance principles become more prominent in guiding investment strategies, the ILS market must respond

In recent years, there has been a sharper focus by the investment community on responsible investment. One indicator of this has been the increased adoption of the Principles for Responsible Investment (PRI), as environmental, social and governance (ESG) concerns become a more prominent influencer of investment strategies. Investment houses are also seeking closer alignment between their ESG practices and the United Nations’ Sustainable Development Goals (SDGs). The 17 interconnected SDGs, set in 2015, are a call to action to end poverty, achieve peace and prosperity for all, and create a sustainable society by 2030. As investors target more demonstrable outcomes from their investment practices, is there a possible opportunity for the insurance-linked securities (ILS) market to grow, given the potential societal capital that insurance can generate? “Insurance certainly has all of the hallmarks of an ESG-compatible investment opportunity,” believes Charlotte Acton, director of capital and resilience solutions at RMS. “It has the potential to promote resilience through enabling broader access and uptake of appropriate affordable financial protection and reducing the protection gap; supporting faster and more efficient responses to disasters; and incentivizing mitigation and resilient building practices pre- and post-event.” RMS has been collaborating on numerous initiatives designed to further the role of insurance and insurance technologies in disaster and climate-change resilience. These include exploring ways to monetize the dividends of resilience to incentivize resilient building, and using catastrophe models to quantify the benefits of resilience investments such as flood defenses and earthquake retrofit programs for housing.
The work has also involved designing innovative parametric structures to provide rapid post-disaster liquidity.

“Investors will want a clear understanding of the exposure or assets that are being protected and whether they are ESG-friendly”

Charlotte Acton, RMS

“ILS offers a clear route for investors to engage with insurance,” explains Acton. “Broadening the capital pool that supports insurance is critical, as it facilitates the expansion of insurance to new regions and allows the industry to absorb increasingly large losses from growing threats such as climate change.” Viewed as a force for social good, it can certainly be argued that insurance-linked securities support a number of the U.N.’s SDGs, including reducing the human impact of disasters, creating more sustainable cities, increasing overall resilience levels and increasing access to financial services that enhance sustainable growth potential.

While there is opportunity for ILS to play a large part in ESG, the specific role of ILS within PRI is still being determined. According to LGT Capital Partners’ ESG Report 2019, managers in the ILS space have, in general, yet to start “actively integrating ESG into their investment strategies,” adding that across the ILS asset class “there is still little agreement on how ESG considerations should be applied.” However, there is movement in this area. For example, the Bermuda Stock Exchange, a primary exchange for ILS issuers, recently launched an ESG initiative in line with the World Federation of Exchanges’ Sustainability Principles, stating that ESG was a priority in 2019 “with the aim to empower sustainable and responsible growth for its member companies, listings and the wider community.” For ILS to become a key investment option for ESG-focused investors, it must be able to demonstrate its sustainability credentials clearly.
“Investors will want a clear understanding of the exposure or assets that are being protected,” Acton explains, “and whether they are ESG-friendly. They will want to know whether the protection offered provides significant societal benefits. If the ILS market can factor ESG considerations into its approach more effectively, then there is no reason why it should not attract greater attention from responsible investors.”
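The parametric structures for rapid post-disaster liquidity mentioned earlier can be illustrated with a toy payout rule: the payout is triggered by a measured hazard parameter rather than a lengthy loss assessment, which is what makes the liquidity rapid. The thresholds and payout steps below are entirely illustrative assumptions, not any real structure’s terms.

```python
# A minimal sketch of a parametric trigger: payout depends only on a
# reported hazard measurement (here, earthquake magnitude), not on an
# assessed loss. All thresholds and percentages are illustrative.

def parametric_payout(magnitude: float, limit: float) -> float:
    """Stepped payout: nothing below M6.5, half the limit to M7.0,
    the full limit at M7.0 and above."""
    if magnitude < 6.5:
        return 0.0
    if magnitude < 7.0:
        return 0.5 * limit
    return limit
```

Because the trigger is an objectively measurable parameter, funds can be released within days of an event, at the cost of basis risk: the payout may not match the actual loss.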
Why the PRA’s stress test has pushed climate change to the top of (re)insurance company agendas

As part of its 2019 biennial insurance stress test, the U.K. insurance industry regulator — for the first time — asked insurers and reinsurers to conduct an exploratory exercise in relation to climate change. Using predictions published by the United Nations’ Intergovernmental Panel on Climate Change (IPCC) and in other academic literature, the Bank of England’s Prudential Regulation Authority (PRA) came up with a series of future climate change scenarios, which it asked (re)insurers to use as a basis for stress-testing the impact on their assets and liabilities.

The PRA stress test came at a time when pressure is building for commercial and financial services businesses around the world to assess the likely impact of climate change on their business, through initiatives such as the Task Force on Climate-related Financial Disclosures (TCFD). The submission deadline for the stress-tested scenarios was October 31, 2019, following which the PRA will publish a summary of overall results.

From a property catastrophe (re)insurance industry perspective, the importance of assessing the potential impact, both in the near and long term, is clear. Companies must ensure their underwriting strategies and solvency levels are adequate to account for additional losses from rising sea levels, more climate extremes, and potentially more frequent and/or intense natural catastrophes. Then there are the more strategic long-term considerations: How much will coverages change, and what will consumers demand in a changing climate? The PRA stress test, explains Callum Higgins, product manager of global climate at RMS, is the regulator’s attempt to test the waters. The hypothetical narratives are designed to help companies think about how different plausible futures could impact their business models, according to the PRA.
“The climate change scenarios are not designed to assess current financial resilience but rather to provide additional impetus in this area, with results comparable across firms to better understand the different approaches companies are using.”

“There was pressure on clients to respond to this because those that don’t participate will probably come under greater scrutiny”

Callum Higgins, RMS

RMS was particularly well placed to support (re)insurers in responding to the “Assumptions to Assess the Impact on an Insurer’s Liabilities” section of the climate change scenarios, with catastrophe models the perfect tools for evaluating such physical climate change risk to liabilities. This portion of the stress test examined how changes in both U.S. hurricane and U.K. weather risk under the different climate change scenarios may affect losses. The assumptions around U.K. weather included shifts in U.K. inland and coastal flood hazard, looking at the potential loss changes from increased surface runoff and sea level rise. In the U.S., the assumptions included a 10 percent and 20 percent increase in the frequency of major hurricanes by 2050 and 2100, respectively.

“While the assumptions and scenarios are hypothetical, it is important (re)insurers use this work to develop their capabilities to understand physical climate change risk,” says Higgins. “At the moment, it is exploratory work, but results will be used to guide future exercises that may put (re)insurers under pressure to provide more sophisticated responses.” Given the short timescales involved, RMS promptly modified the necessary models in time for clients to use in their submissions. “To help clients start thinking about how to respond to the PRA request, we provided them with industrywide factors, which allowed for the approximation of losses under the PRA assumptions but will likely not accurately reflect the impact on their portfolios.
For this reason, we could also run (re)insurers’ own exposures through the adjusted models, via RMS Analytical Services, better satisfying the PRA’s requirements for those who chose this approach. “To reasonably represent these assumptions and scenarios, we think firms do need help from vendor companies like RMS to adjust the model data appropriately, which is possibly out of scope for many businesses,” he adds.

Detailed results based on the outcome of the stress-test exercise can be applied to use cases beyond the regulatory submission to the PRA. These or similar scenarios can be used to sensitivity-test possible answers to questions such as: How will technical pricing of U.K. flood be affected by climate change? How should U.S. underwriting strategy shift in response to sea level rise? How will capital adequacy requirements change as a result of climate change? The answers can then inform strategic decisions accordingly.
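The hurricane-frequency assumption described above — a 10 or 20 percent increase in the frequency of major hurricanes — can be approximated against an event loss table by scaling the annual rates of the relevant events and recomputing the average annual loss. The sketch below illustrates the arithmetic only; the event data are invented, and this is not the RMS adjustment methodology.

```python
# A minimal sketch of applying a frequency-stress assumption to an event
# loss table (ELT): scale the annual rates of major-hurricane events by a
# multiplier (e.g., 1.1 for +10%, 1.2 for +20%) and recompute the average
# annual loss (AAL). Illustrative only; not the RMS implementation.

def stressed_aal(elt, major_rate_multiplier):
    """elt: list of (annual_rate, mean_loss, is_major_hurricane) tuples.

    Returns the AAL with major-hurricane event rates scaled by the
    supplied multiplier; a multiplier of 1.0 gives the baseline AAL.
    """
    total = 0.0
    for rate, loss, is_major in elt:
        if is_major:
            rate *= major_rate_multiplier
        total += rate * loss
    return total
```

Comparing the baseline AAL (multiplier 1.0) against the stressed AAL gives a first-order view of how losses respond under the scenario, which is essentially what the industrywide approximation factors mentioned above provide in aggregate.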