Helen Yates, September 04, 2017
Golden Gate
The Peril of Ignoring The Tail

Drawing on several new data sources and insights from recent earthquakes into how different fault segments might interact in future events, Version 17 of the RMS North America Earthquake Models sees the frequency of larger events increasing, making for a fatter tail. EXPOSURE asks what this means for (re)insurers from a pricing and exposure management perspective.

Recent major earthquakes, including the M9.0 Tohoku Earthquake in Japan in 2011 and the Canterbury Earthquake Sequence in New Zealand (2010-2011), have offered new insight into the complexities and interdependencies of losses that occur following major events. This insight, as well as other data sources, was incorporated into the latest seismic hazard maps released by the U.S. Geological Survey (USGS).

In addition to engaging with the USGS on its 2014 update, RMS went on to invest more than 100 person-years of work in implementing the main findings of this update, as well as comprehensively enhancing and updating all components of its North America Earthquake Models (NAEQ). The update reflects the deep complexities inherent in the USGS model and confirms the adage that “earthquake is the quintessential tail risk.” Among the changes to the RMS NAEQ models was the recognition that some faults can interconnect, creating correlations of risk that were not previously appreciated.

Lessons from Kaikoura

While there is still a lot of uncertainty surrounding tail risk, the new data sets provided by the USGS and others have improved the understanding of events with longer return periods. “Global earthquakes are happening all of the time, not all large, not all in areas with high exposures,” explains Renee Lee, director, product management at RMS. “Instrumentation has become more advanced and coverage has expanded such that scientists now know more about earthquakes than they did eight years ago, when NAEQ was last released in Version 9.0.”

This includes understanding of how faults creep and release energy, how faults can interconnect, and how ground motions attenuate through soil layers and over large distances. “Soil plays a very important role in the earthquake risk modeling picture,” says Lee. “Soil deposits can amplify ground motions, which can potentially magnify the building’s response, leading to severe damage.”

The 2016 M7.8 earthquake in Kaikoura, on New Zealand’s South Island, is a good example of a complex rupture where fault segments connected in more ways than had previously been realized. In Kaikoura, at least six fault segments were involved, and the rupture “jumped” from one fault segment to the next, producing a single larger earthquake.

“The Kaikoura quake was interesting in that we did have some complex energy release moving from fault to fault,” says Glenn Pomeroy, CEO of the California Earthquake Authority (CEA). “We can’t hide our heads in the sand and pretend that scientific awareness doesn’t exist. The probability has increased for a very large, but very infrequent, event, and we need to determine how to manage that risk.”

San Andreas Correlations

Looking at California, the updated models include events that extend from north of San Francisco to south of Palm Springs, correlating exposures along the length of the San Andreas Fault. While the prospect of a major earthquake impacting both northern and southern California is considered extremely remote, it will nevertheless affect how reinsurers seek to diversify different types of quake risk within their books of business.
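Return periods recur throughout this discussion. As a quick point of reference, the following is a minimal Python sketch, illustrative only and not part of any RMS model, converting a return period into an annual exceedance probability and, under a standard Poisson-occurrence assumption, the chance of at least one such event over a multi-decade horizon:

```python
import math

# Annual exceedance probability implied by a return period
def annual_prob(return_period_years):
    return 1.0 / return_period_years

# Probability of at least one such event over a multi-year horizon,
# assuming occurrences follow a Poisson process in time
def prob_within(return_period_years, horizon_years):
    return 1.0 - math.exp(-horizon_years / return_period_years)

for rp in (250, 500, 1000):
    print(f"1-in-{rp}-year event: {annual_prob(rp):.2%} per year, "
          f"{prob_within(rp, 30):.1%} chance within 30 years")
```

Even a “remote” 1-in-1,000-year event carries roughly a 3 percent chance of occurring over a 30-year horizon, which is why such events still matter for pricing the top of a program.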
“In the past, earthquake risk models have considered Los Angeles as being independent of San Francisco,” says Paul Nunn, head of catastrophe risk modeling at SCOR. “Now we have to consider that these cities could have losses at the same time, following a full rupture of the San Andreas Fault.

“However, it doesn’t make that much difference in the sense that these events are so far out in the tail … and we’re not selling much coverage beyond the 1-in-500-year or 1-in-1,000-year return period. The programs we’ve sold will already have been exhausted long before you get to that level of severity.”

While the contribution of tail events to return period losses is significant, as Nunn explains, this could be more of an issue for insurance companies than (re)insurers from a capitalization standpoint. “From a primary insurance perspective, the bigger the magnitude and event footprint, the more separate claims you have to manage. So, part of the challenge is operational, in terms of mobilizing loss adjusters and claims handlers, but primary insurers also have the risk that losses from tail events could go beyond the (re)insurance program they have bought.

“It’s less of a challenge from the perspective of global (re)insurers, because most of the risk we take is on a loss-limited basis; we sell layers of coverage,” he continues. “Saying that, pricing for the top layers should always reflect the prospect of major events in the tail and the uncertainty associated with that.”

He adds: “The magnitude of the Tohoku earthquake is a good illustration of the inherent uncertainties in earthquake science; it wasn’t represented in modeled scenarios at that time.”

While U.S. regulation stipulates that carriers writing quake business should capitalize to the 1-in-200-year event level, in Canada capital requirements are more conservative in an effort to better account for tail risk. “So, Canadian insurance companies should have less overhang out of the top of their (re)insurance programs,” says Nunn.

Need for Post-Event Funding

For the CEA, the updated earthquake models could reinvigorate discussions around the need for a mechanism to raise additional claims-paying capacity following a major earthquake. Set up after the Northridge Earthquake in 1994, the CEA is a not-for-profit, publicly managed and privately funded earthquake pool.

“It is pretty challenging for a stand-alone entity to take on large tail risk all by itself,” says Pomeroy. “We have, from time to time, looked at the possibility of creating some sort of post-event risk-transfer mechanism.

“A few years ago, for instance, we had a proposal in front of the U.S. Congress that would have created the ability for the CEA to do some post-event borrowing if we needed to pay for additional claims,” he continues. “It would have put the U.S. government in the position of guaranteeing our debt. The proposal didn’t get signed into law, but it is one example of how you could create additional claims-paying capacity for that very large, very infrequent event.”

The CEA leverages both traditional and non-traditional risk-transfer mechanisms. “Risk transfer is important. No one entity can take it on alone,” says Pomeroy.
“Through risk transfer from insurer to (re)insurer, the risk is spread broadly, with the entrance of the capital markets providing another source of claims-paying capability and another way of diversifying the concentration of the risk.

“We manage our exposure very carefully by staying within our risk-transfer guidelines,” he continues. “When we look at spreading our risk, we look at spreading it through a large number of (re)insurance companies from 15 countries around the world. And we know the (re)insurers have their own strict guidelines on how big their California quake exposure should be.”

The prospect of a higher frequency of larger events producing a “fatter” tail also raises the possibility of an overall reduction in average annual loss (AAL) for (re)insurance portfolios, a factor that is likely to add to pricing pressure as the industry approaches the key January 1 renewal date, predicts Nunn.

“The AAL for Los Angeles coming down in the models will impact the industry in the sense that it will affect pricing and how much probable maximum loss people think they’ve got. Most carriers are busy digesting the changes and carrying out due diligence on the new model updates.

“Although the eye-catching change is the possibility of the ‘big one,’ the bigger immediate impact on the industry is what’s happening at lower return periods, where we’re selling a lot of coverage,” he says. “LA was a big driver of risk in the California quake portfolio and that’s coming down somewhat, while the risk in San Francisco is going up. So (re)insurers will be considering how to adjust the balance between the LA and San Francisco business they’re writing.”
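Nunn’s point about loss-limited layers can be made concrete with a minimal Python sketch. The event loss table, rates, attachment and limit below are hypothetical figures invented for illustration, not model output; the key observation is that the extreme event’s contribution to the layer is capped at the limit, so severity beyond the top of the program does not increase the ceded loss.

```python
# Hypothetical event loss table: (annual rate, gross loss in US$m)
event_loss_table = [
    (0.0100,   400.0),   # moderate event
    (0.0020,  2500.0),   # major event
    (0.0005, 12000.0),   # extreme tail event
]

def layer_recovery(gross_loss, attachment, limit):
    # Loss ceded to an excess-of-loss layer: nothing below the
    # attachment point, capped at the layer limit above it
    return min(limit, max(0.0, gross_loss - attachment))

# Average annual loss (AAL) = sum over events of rate x loss
aal_gross = sum(rate * loss for rate, loss in event_loss_table)
aal_layer = sum(rate * layer_recovery(loss, 1000.0, 3000.0)
                for rate, loss in event_loss_table)

print(f"Gross AAL: US${aal_gross:.1f}m")
print(f"AAL ceded to a 3,000m xs 1,000m layer: US${aal_layer:.1f}m")
```

Under these illustrative numbers, the 12,000m tail event contributes only the 3,000m limit to the layer, which is the mechanism behind Nunn’s comment that programs are “exhausted long before you get to that level of severity.”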

Shaheen Razzaq, July 25, 2016
Beware the Private Catastrophe

Having a poor handle on the exposure on their books can result in firms facing disproportionate losses relative to their peers following a catastrophic event, but this is easily avoidable, says Shaheen Razzaq, senior director, product management, at RMS.

The explosions at Tianjin port, the floods in Thailand and, most recently, the Fort McMurray wildfires in Canada: what these major events have in common is the disproportionate losses they inflicted on certain firms’ portfolios.

Take the Thai floods in 2011, an event which, at the time, was largely unmodeled. The floods that inundated several major industrial estates around Bangkok caused an accumulation of losses for some reinsurers, resulting in negative rating action, loss in share price and withdrawals from the market.

Last year’s Tianjin Port explosions in China also resulted in substantial insurance losses, which had an outsized impact on firms with significant concentrations of risk at the port or within impacted supply chains. The insured property loss from Asia’s most expensive human-caused catastrophe, and the marine industry’s biggest loss since Superstorm Sandy, is thought to be as high as US$3.5 billion, with significant “cost creep” as a result of losses from business interruption and contingent business interruption, clean-up and contamination expenses.

Some of the highest costs from Tianjin were suffered by European firms, with some experiencing losses reaching US$275 million. The event highlighted the significant accumulation risk from non-modeled, man-made events in large transportation hubs such as ports, where much of the insurable content (cargo) is mobile and changeable and requires a deeper understanding of the exposures.

Speaking about the firm’s experience in an interview with Bloomberg in early 2016, Zurich Insurance Group chairman and acting CEO Tom de Swaan noted that, due to an accumulation of risk that had not been sufficiently detected, the firm was looking at ways to strengthen its exposure management to avoid such losses in the future.

There is a growing understanding that firms can avoid suffering disproportionate impacts from catastrophic events by taking a more analytical approach to mapping the aggregation risk within their portfolios. According to Validus chairman and CEO Ed Noonan, in statements following Tianjin last year, it is now “unacceptable” for the marine insurance industry not to seek to improve its modeling of risk in complex, ever-changing port environments.

Women carrying sandbags to protect ancient ruins in Ayutthaya, Thailand, during the seasonal monsoon flooding.

While events such as the Tianjin port explosions, Thai floods and more recent Fort McMurray wildfires may have occurred in so-called industry “cold spots,” the impact of such events can be evaluated using deterministic scenarios to stress test a firm’s book of business. This can either provide a view of risk where there is a gap in probabilistic model coverage or supplement the view of risk from probabilistic models.

Although much has been written about Nassim Taleb’s highly improbable “black swan” events, in a global and interconnected world firms increasingly must contend with the reality of “grey swan” and “white swan” events.
According to risk consultant Geary Sikich in his article “Are We Seeing the Emergence of More White Swan Events?” the definition of a grey swan is “a highly probable event with three principal characteristics: It is predictable; it carries an impact that can easily cascade … and, after the fact, we shift the focus to errors in judgment or some other human form of causation.” A white swan is a “highly certain event” with “an impact that can easily be estimated” where, once again, after the fact there is a shift of focus to “errors in judgment.”

“Addressing unpredictability requires that we change how enterprise risk management programs operate,” states Sikich. “Forecasts are often based on a ‘static’ moment, frozen in time, so to speak…. Assumptions, on the other hand, depend on situational analysis and ongoing tweaking via assessment of new information. An assumption can be changed and adjusted as new information becomes available.”

It is clear that Sikich’s observations on unpredictability are becoming the new normal in the industry. Firms are investing to fully entrench strong exposure management practices across their entire enterprise to protect against private catastrophes. They are also reaping other benefits from this type of investment: sophisticated exposure management tools are not just designed to help firms better manage their risks and exposures, but also to identify new areas of opportunity. By gaining a deeper understanding of their global portfolio across all regions and perils, firms are able to make more informed strategic decisions when looking to grow their business.

In specific regions and for certain perils, firms can use exposure-based analytics to contextualize their modeled loss results. This allows them to run “what if” analyses on the range of possible deterministic losses so they can stress test their portfolio against historical benchmarks, look for sensitivities and properly set expectations.

Exposure Management Analytics

Best-in-class exposure management analytics is all about challenging assumptions and using disaster scenarios to test how your portfolio would respond if a major event were to occur in a non-modeled peril region. Such analytics can identify the pinch points, potential accumulations both within and across classes of business, that may exist, while also offering valuable information on where to grow your business; a sketch of this kind of accumulation analysis follows below.

Whether it is through M&A or organic growth, having a better grasp of exposure across your portfolio enables strategic decision-making and can add value to a book of business. The ability to analyze exposure across the entire organization and understand how it is likely to impact accumulations and loss potential is a powerful tool for today’s C-suite. Exposure management tools enable firms to understand the risk in their business today, but also how changes can impact their portfolio, whether acquiring a book, moving into new territories or divesting a nonperforming book of business.
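To make the idea of accumulation analysis and deterministic stress testing concrete, here is a minimal Python sketch. The portfolio, accumulation zones and damage ratio are invented for illustration; real exposure management platforms work with far richer geocoded exposure data.

```python
from collections import defaultdict

# Hypothetical portfolio: (account, accumulation zone, class of business,
# total insured value in US$m) -- illustrative values only
portfolio = [
    ("A1", "Tianjin-port",       "cargo",    120.0),
    ("A2", "Tianjin-port",       "property",  80.0),
    ("A3", "Tianjin-port",       "BI",        40.0),
    ("A4", "Bangkok-industrial", "property", 200.0),
]

# Accumulate insured value by zone, across all classes of business,
# to surface potential "pinch points"
accumulation = defaultdict(float)
for _, zone, _, tiv in portfolio:
    accumulation[zone] += tiv

# Deterministic scenario: assumed mean damage ratio for a port explosion
SCENARIO_DAMAGE_RATIO = 0.35  # assumption for illustration only

zone = "Tianjin-port"
scenario_loss = accumulation[zone] * SCENARIO_DAMAGE_RATIO
print(f"{zone}: accumulated exposure US${accumulation[zone]:.0f}m, "
      f"scenario loss US${scenario_loss:.0f}m")
```

The point of such a sketch is the cross-class roll-up: cargo, property and business interruption exposures that look modest individually can sum to a significant concentration in a single zone, which a deterministic scenario then translates into a stress-test loss even where no probabilistic model exists.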
