Deconstructing Social Inflation
Helen Yates, September 22, 2021
Florida houses aerial view

After the loss creep associated with Hurricane Irma in 2017, (re)insurers are keen to quantify how social inflation could exacerbate claims costs in the future. The challenge lies in isolating the more persistent, longer-term trends so that these factors can be explicitly modeled, reducing the “surprise factor” the next time a major storm blows through.

A few days after Hurricane Irma passed over Marco Island, Florida, on September 10, 2017, RMS® deployed a reconnaissance team to offer some initial feedback on the damage that was sustained. Most properties on the island had clay tile roofs, and while the team noted some dislodged or broken tiles, damage did not appear to be severe. A year later, when Peter Datin, senior director of modeling at RMS, decided to revisit the area, he was shocked by what he saw. “There were so many roofing contractors still on the island, and almost every house seemed to be getting a full roof replacement. We found out that US$900 million worth of roofing permits for repairs had been filed in Marco Island alone.”

Trying to find the exact shape and color for tile replacements was a challenge, forcing contractors to replace the entire roof for aesthetic reasons. Then there is Florida’s “25 percent rule,” which previously applied to High-Velocity Hurricane Zones in South Florida (Miami-Dade and Broward Counties) before expanding statewide under the 2017 Florida Building Code. Under the rule, if a loss assessor or contractor determines that a quarter or more of the roof has been damaged in the last 12 months, it cannot simply be repaired; 100 percent must be replaced.

This begins to explain why, in the aftermath of Hurricane Irma and, to a lesser extent, Hurricane Michael in 2018, claims severity and loss creep were such an issue. “We looked at some modeling aspects in terms of the physical meaning of this,” says Datin. “If we were to directly implement an engineering or physics-based approach, what does that mean? How does it impact the vulnerability curve?

“We went through this exercise last summer and found that if you hit that threshold of the 25 percent roof damage ratio, particularly for low wind speeds, that’s a fourfold increase on your claims. At certain wind speeds, it can therefore have a very material increase on the losses being paid. It’s not quite that straightforward to implement on the vulnerability curve, but it is very significant.”

But issues such as the 25 percent rule do not tell the whole story, and in a highly litigious market such as Florida, determining whether a roof needs a complete replacement is not just down to physics. Increasingly, the confluence of additional factors that fall under the broad description of “social inflation” is also having a meaningful impact on the total cost of claims.

What Is Social Inflation?

Broadly, social inflation refers to all the ways in which insurers’ claims costs rise over and above general economic inflation (i.e., growth in wages and prices), which influences the cost of repairing and/or replacing damaged property. It therefore captures the growth in costs connected to the following: unanticipated emerging perils associated with, for example, new materials or technologies; shifts in the legal environment; evolving social attitudes and preferences toward equitable risk absorption; and demographic and political developments. (Source: Geneva Association)
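Before turning to those broader drivers, it is worth making Datin’s observation about the 25 percent rule concrete. The sketch below is a minimal illustration of how a roof-replacement threshold can be layered onto modeled damage; the damage ratios, the roof’s share of building value and the resulting uplift are invented assumptions for illustration, not RMS vulnerability curve parameters.

```python
# Illustrative sketch: how a 25 percent roof-damage threshold can amplify claims.
# All numbers below are hypothetical, not RMS vulnerability curve values.

ROOF_SHARE_OF_VALUE = 0.3     # assumed share of building value in the roof
REPLACEMENT_THRESHOLD = 0.25  # Florida "25 percent rule"

def claim_fraction(roof_damage_ratio, other_damage_ratio):
    """Claim as a fraction of building value, with and without the rule."""
    without_rule = (ROOF_SHARE_OF_VALUE * roof_damage_ratio
                    + (1 - ROOF_SHARE_OF_VALUE) * other_damage_ratio)
    # Under the rule, crossing the threshold triggers a full roof replacement.
    roof_paid = 1.0 if roof_damage_ratio >= REPLACEMENT_THRESHOLD else roof_damage_ratio
    with_rule = (ROOF_SHARE_OF_VALUE * roof_paid
                 + (1 - ROOF_SHARE_OF_VALUE) * other_damage_ratio)
    return without_rule, with_rule

# At a low wind speed: a modeled 25 percent roof damage ratio plus minor other damage.
base, adjusted = claim_fraction(roof_damage_ratio=0.25, other_damage_ratio=0.02)
print(f"without rule: {base:.3f} of value, with rule: {adjusted:.3f} of value, "
      f"uplift: {adjusted / base:.1f}x")
# ~0.089 vs ~0.314 of building value, roughly a 3.5x uplift, which is of the same
# order as the "fourfold increase" Datin describes at low wind speeds.
```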
Florida’s “David and Goliath” Law

A major driver is the assertive strategies of the plaintiffs’ bar, compounded by the three-year window in which to file a claim and the use, and potential abuse, of practices such as assignment of benefits (AOB). The use of public adjusters and broader societal attitudes toward insurance claiming also need to be taken into consideration. Meanwhile, the expansion of coverage terms and conditions in the loss-free years between 2005 and 2017, together with generous policy interpretations, has played its part in driving up claims frequency and severity.

What Is Assignment of Benefits (AOB)?

An assignment of benefits, or AOB, is a document signed by a policyholder that allows a third party, such as a water extraction company, a roofer or a plumber, to “stand in the shoes” of the insured and seek payment directly from the policyholder’s insurance company for the cost of repairs. AOBs have long been part of Florida’s insurance marketplace. (Source: Florida Office of Insurance Regulation)

More recently, the effects of COVID-19 have impacted the cost of repairs, in turn increasing insurers’ loss ratios. (Re)insurers naturally want to better understand how social inflation is likely to impact their cost of claims. But determining the impact of social inflation on the “claims signal” is far from simple. From a modeling perspective, the first step is isolating the different elements that contribute toward social inflation and understanding which trends are longer term in nature.

The recently released Version 21 of the RMS North Atlantic Hurricane Models incorporates an alternative view of vulnerability for clients and reflects the changing market conditions applicable to Florida residential lines, including the 25 percent roof replacement rule. However, the effects of social inflation are still largely considered non-modeled. They are removed from available data where possible during the model development process, and any residual impacts are implicitly represented in the model.

“Quantifying the impacts of social inflation is a complex task, partly because of the uncertainty in how long these factors will persist,” says Jeff Waters, senior product manager at RMS. “The question is, going forward, how much of an issue is social inflation really going to be for the next three, five or 10 years? Should we start thinking more about ways in which to explicitly account for these social inflation factors or give model users the ability to manually fold in these different factors?

“One issue is that social inflation really ramped up over the last few years,” he continues. “It’s especially true in Florida following events like Hurricanes Irma and Michael. At RMS, we have been working hard trying to determine which of these signals are caused by social inflation and which are caused by other things happening in Florida. Certainly, the view of vulnerability in Version 21 starts to reflect these elevated risk factors.”

AOB had a clear impact on claims from Irma and Michael. Florida’s “David and Goliath” law was intended to level the playing field between policyholders and economically powerful insurers, notes the Insurance Information Institute’s Jeff Dunsavage. Instead, the law offered an incentive for attorneys to file thousands of AOB-related suits. The ease with which unscrupulous contractors can “find” damage and make claims within three years of a catastrophe loss has further exacerbated the problem.
Waters points out that in 2006 there were only around 400 AOB lawsuits in the market. By 2018, that number had risen to over 135,000. In a decade that had seen very few storms, it was difficult to predict how significant an impact AOB would have on hurricane-related claims, until Irma struck. Of the Irma and Michael claims investigated by RMS, roughly 20 percent were impacted by AOB. “From a claims severity standpoint, the cost of those claims increased up to threefold on average compared to claims that were not affected by AOB,” says Waters.

Insurers on the Brink

The problem is not limited to recent hurricane events. Given the Sunshine State’s increase in litigation, insurers continue to face a barrage of AOB non-catastrophe claims, including losses relating to water and roof damage. Reforms introduced in 2019 initially helped rein in the more opportunistic claims, but notifications dialed back up again after attorneys were able to find and exploit loopholes. Amid pressures on the court system due to COVID-19, reform efforts are continuing. In April 2021, the Florida Legislature passed a new law intended to curb the market abuse of litigation and roofing contractor practices, among other reforms. Governor Ron DeSantis said the law was a reaction to “mounting insurance costs” for homeowners.

As loss ratios have risen, carriers have been passing some of the additional costs back to policyholders in the form of higher premiums (around US$680 per family on average). Meanwhile, some carriers have begun to offer policies with limited AOB rights, or none at all, in an effort to regain control over the spiraling situation. “There are some pushes in the legislature to try to curb some of the more litigious behavior on the part of the trial lawyers,” says Matthew Nielsen, senior director, regulatory affairs at RMS.

Nielsen thinks the 2021 hurricane season could be telling in terms of separating out some of the more permanent changes in the market where social inflation is concerned. The National Oceanic and Atmospheric Administration (NOAA) predicts another above-average season in the North Atlantic but does not currently anticipate the historic level of storm activity seen in 2020.

“What’s going to happen when the next hurricane makes landfall, and which of these elements are actually going to still be here?” asks Nielsen. “What nobody wants to see again is the kind of chaos that came after 2004 and 2005, when there were questions about the health of the insurance market and what the roles of the Florida Hurricane Catastrophe Fund (FHCF) and Florida Citizens Property Insurance Corporation were going to be.”

“Ultimately, we’re trying to figure out which of these social inflation signals are going to stick around, and the difficulty is separating the long-term signals from the short-term ones,” he continues. “The 25 percent roof replacement rule is written into legislation, and so that is going to be the new reality going forward. On the other hand, we don’t want to include something that is a temporary blip on the radar.”
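Taken together, the two figures from the RMS claims review imply a material uplift at the portfolio level. A rough, purely illustrative calculation, which assumes the affected share of claims all see the full threefold severity:

```python
# Rough, illustrative aggregation of the Irma/Michael AOB figures quoted above.
aob_share = 0.20           # share of claims impacted by AOB (per RMS claims review)
aob_severity_uplift = 3.0  # "up to threefold" severity increase on affected claims

portfolio_uplift = (1 - aob_share) * 1.0 + aob_share * aob_severity_uplift
print(f"Blended severity uplift: {portfolio_uplift:.2f}x")  # 1.40x, i.e. ~40% higher
# The "up to" caveat matters: if affected claims average, say, 2x rather than 3x,
# the blended uplift drops to 1.2x.
```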

The Earthquakes That Changed Everything
Helen Yates, June 15, 2021
Tohoku earthquake

In the decade since the devastating 2011 Tohoku and Christchurch Earthquakes, risk modelers such as RMS have been proactive in honoring the data generated from these events. It is a process of continuous learning and refinement, explain Chesley Williams and Laura Barksby from RMS, and the journey is still ongoing.

Within the space of just 17 days in 2011, Christchurch in New Zealand was rocked by an M6.3 earthquake, almost directly beneath the central business district (CBD), and Japan was reeling from the most powerful earthquake in its history. At RMS, event response and reconnaissance teams were mobilized with the implicit knowledge that they were there to do justice to those affected and to gather as much data as possible in order to advance the understanding of seismic hazard and risk.

The tsunami waves triggered by the M9.0 Tohoku Earthquake inundated approximately 532 square kilometers (205 square miles) of the country’s northeastern coastline. At their highest, the waves reached over 15 meters (49 feet) in some localized areas. They overtopped seawalls, destroyed 122,000 buildings, left over a million more buildings severely or partially damaged, and damaged 230,000 vehicles. The event also triggered level seven meltdowns at the Fukushima Dai-ichi Nuclear Power Station; the disaster at Chernobyl in 1986 was the only previous level seven event. The catastrophe was watched, in horror, in real time on news channels around the world. In total, it caused 15,899 deaths, with 2,527 people missing. Estimated economic damage totaled US$235 billion.

When initiating a risk model update, cat modelers generally start with published national seismic hazard maps, which are typically built on consensus-based research. An important first step is to review the key assumptions in such studies to make sure that they are consistent with the latest data and methods for seismic hazard and risk assessment. “When we developed our RMS® Japan Earthquake Model in 2005, the starting point was the first version of the national seismic hazard maps released in that year,” says Chesley Williams, senior director at RMS. “We had detailed discussions with the Japanese researchers who developed the maps to understand the key assumptions, particularly with a focus on the sources impacting Tokyo.”

After the 2011 event, it is clear that the Japan Trench can produce M9 events. The 2005 national hazard maps focused on the M7-M8 events that had occurred in the preceding 450-plus years, yet the geologic record suggests there have likely been large, possibly M9, events in the past, such as the Jogan Sanriku Earthquake in 869.

Honoring the Data

So much about both of these 2011 events, the Christchurch Earthquake on February 22 and the Tohoku Earthquake on March 11, was unexpected. Although New Zealand is a highly seismic region, the 2010-11 Canterbury Earthquake Sequence occurred in an area that historically had relatively low seismicity. Prior to the Canterbury Earthquake Sequence, there were fewer than 30 earthquakes of magnitude four or greater in Christchurch and the immediate surrounding area; in the last decade, there have been more than 370 earthquakes in this region.

The Christchurch Earthquake caused higher-than-expected ground motions and unprecedented liquefaction. As a result, it was the costliest event in the sequence and produced the second-highest insured loss from earthquake in history, after the Tohoku Earthquake.
Japan is also highly seismic, but the Tohoku event occurred on structures that had not shown their full potential during the historical record. The M9.0 Tohoku event was particularly surprising in scale, producing the highest-ever economic losses from a natural disaster, and the tsunami impact was unprecedented for Japan. “Both Christchurch and Tohoku taught us an awful lot about earthquake risk, including the secondary impacts of earthquakes — tsunami, liquefaction, landslides, nuclear disaster, aftershocks, business interruption, contingent business interruption, and post-event loss amplification,” commented Williams. “They transformed how we think about and model seismic hazard and risk.”

New Insights Into Large-Magnitude Events

Media coverage of Tohoku clearly showed that the damage in the tsunami inundation zones was catastrophic. Once the search and rescue work had been completed, RMS sent a reconnaissance team to Japan to examine tsunami damage as well as damage from strong ground shaking, which was extensive. Key observations from this work included: older (pre-1981) concrete buildings often sustained significant damage at high ground motions; traditional wooden homes with heavy tile roofs were more heavily damaged than more modern home construction; and contents damage in high-tech industrial facilities was particularly problematic for production continuity.

Tsunami damage from the Tohoku Earthquake

In the period immediately following a disaster, the Japanese government posts running tallies of the damage statistics as they are collected. This data is invaluable for understanding the scale of damage and also provides important insights into the drivers of loss. RMS used these damage statistics during the early event response process to help inform economic and insured loss estimates. In subsequent months, more comprehensive damage statistics compiled by Japan’s Ministry of Land, Infrastructure, Transport and Tourism proved vital for refining RMS modeling of building performance under strong ground shaking, as well as for developing vulnerability functions for tsunami inundation.

Japan has created and maintained what is arguably the best and most dense national seismic network in the world. This network recorded more than 1,000 observations of the ground motions produced by the Tohoku Earthquake. Because large M9+ events are so rare (only five in the last 150 years), this observation dataset is key for understanding the strong ground motions produced by such extreme earthquakes. “Prior to this event, modeling of ground motions for events in this magnitude range had to be extrapolated from observations of smaller magnitude events,” says Williams. “Having more data to constrain M9+ ground motions helps refine seismic hazard and risk for all regions that can experience events in this magnitude range. Additionally, the observation data captured the details of the interaction of sedimentary basins and shallow site conditions on ground motion amplitude and frequency content.” This information has allowed RMS to dramatically improve the assessment of site condition impacts (both shallow and deep), enabling a better assessment of how localized ground motions interact with structural performance.
Following the 2011 events, the Japanese government commissioned a series of comprehensive research studies to better understand earthquake potential for the key subduction zones (i.e., Japan Trench, Kuril Trench, Sagami Trough and Nankai Trough) and key crustal faults. The goal was to extend understanding of the historical record by utilizing the geologic record and providing information on past events over the last several thousand years. Key geologic datasets that were examined included paleotsunami deposits in coastal regions, coastal terraces uplifted in past events and paleoseismic studies to examine past ruptures on faults. The RMS Japan Earthquake Model was informed by all these datasets, allowing for a better representation of the range of events that can occur as well as better constraining the recurrence of future events on these structures.

Advances in Tsunami Modeling

Prior to the Tohoku event, RMS tsunami solutions had been focused on key tsunami scenario footprints that were developed to allow for an understanding of exposure accumulations at risk. “With the 2011 event and the contribution of approximately 30 percent of the loss from tsunami, it was clear that RMS needed to start providing fully probabilistic tsunami solutions,” said Williams. “The unique characteristics of the Tohoku tsunami event and its generation were key for guiding the RMS tsunami hazard and risk development.” The extremely high fault slip and large ocean bottom deformations highlighted the importance of modeling a range of slip models. RMS has chosen to use analytical slip modeling, and the sampling of alternative slip models for a given earthquake rupture allows for a more comprehensive understanding of tsunami and seismic risk.

Tsunami insights from Tohoku also informed tsunami modeling in New Zealand. Following Tohoku, GNS Science, the New Zealand geoscience research institute, updated the maximum magnitude potential for the Hikurangi Subduction Zone to the east of the North Island. This assumption is reflected in the RMS® New Zealand Earthquake HD Model, and when combined with other updates, the larger magnitude has consequential impacts for portfolios with exposure in the capital of Wellington.

Lessons in Liquefaction

Residents in Christchurch had certainly felt the initial M7.1 Darfield Earthquake on September 4, 2010, some 40 kilometers (25 miles) west of the city, and power and water supplies were disrupted. The event caused moderate damage, the worst of which was to unreinforced masonry chimneys and walls. Damage was also observed in historic buildings. Following the Darfield event, assessments were made to repair the damaged buildings. However, despite the lower magnitude of the February 2011 earthquake, its proximity almost directly beneath the CBD meant that the ground motions were considerable.

The Christchurch Earthquake generated widespread liquefaction and was part of an ongoing sequence of events, the largest of which, following February 2011, were M5.9, M5.5 and M5.3. A number of buildings that had been compromised during the September 2010 quake crumbled under the more intense ground motion of February 22, 2011. “It was the way the sequence moved eastward from Darfield to Christchurch so that it was virtually under the CBD that made it so devastating,” said Laura Barksby, product manager at RMS. “It occurred in the wrong place at the wrong time.” The Christchurch event exacerbated preexisting damage, as well as damaging previously unscathed structures.
Damage was so severe in some areas of Christchurch that a red zone was established, within which it was considered uneconomical to repair buildings, and structures were demolished regardless of their state. In total, the Canterbury Earthquake Sequence caused 185 fatalities and around NZ$40 billion in economic damage, of which an estimated NZ$33-38 billion was insured. The sudden change in seismicity was traumatic for residents and hampered efforts to assess the damage and begin the rebuild and restoration process. Access inside the CBD was restricted as many older structures, mostly unreinforced masonry buildings, were deemed unsafe. In the years immediately following the earthquake, demolitions outnumbered rebuilds by four to one.

Aftermath of the Christchurch Earthquake in 2011

“There has been a huge societal impact. The CBD was cordoned off and many businesses had to close,” says Barksby. “From a community perspective, they went from years of no earthquakes to almost waiting for the next to happen. The fact that the events were part of a sequence added to that sense of nervousness.”

The overall headline, she explains, was the damage caused by liquefaction. “When we think about earthquakes, our immediate thoughts are about the ground shaking, but with Christchurch the focus was the liquefaction. It was responsible for around 30 to 40 percent of the losses, which is considerable.”

During an earthquake, ground motions can increase the water pressure in soil layers beneath the surface. This can reduce the strength of the soil particles, which subsequently behave like a liquid, causing significant ground deformation. In Christchurch, buildings with shallow foundations suffered significant damage. One aspect that had not been appreciated prior to Christchurch was the scale of the destruction liquefaction could cause — and the loss it could generate. RMS reconnaissance observed that some buildings experienced no shake damage but considerable liquefaction damage. “The damage was particularly bad along the River Avon in Christchurch,” says Barksby. “Due to the lateral displacement, it looked as though some of the buildings had been pulled apart — the Christchurch Earthquake really demonstrated the different types of liquefaction displacement.”

This represented an important distinguishing feature when modeling liquefaction risk. “What was seen in Christchurch was a notable difference in the damage severity depending on the liquefaction process that had occurred. There was a correlation between the type of liquefaction displacement and building damage,” said Barksby. “Lateral spreading versus vertical displacement can have very different outcomes when it comes to loss. This distinction is not something we were able to capture before Christchurch, but thanks to data we can now model it at a high resolution and directly relate it to damage at a location.”

The liquefaction impact was highly variable, a feature best captured by aerial photographs taken in the immediate aftermath. While some streets were largely unscathed, others looked as though they had been inundated by flood waters from liquefaction expressed at the surface. Barksby added, “We also saw streets with the whole spectrum of liquefaction damage, ranging from none at all to severe damage just a few hundred meters down the road.” Geotechnical engineering experts from around the world seized the opportunity to better understand the hazard, using Christchurch as a liquefaction laboratory.
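The distinction Barksby draws between lateral spreading and vertical displacement lends itself to a simple lookup structure. The sketch below is purely illustrative: the displacement categories come from the article, but the damage factors are invented placeholders, not values from the RMS New Zealand Earthquake HD Model.

```python
# Illustrative only: hypothetical mean damage ratios by liquefaction displacement
# type and severity. These are placeholder values, not RMS model parameters.
LIQUEFACTION_DAMAGE = {
    ("vertical_settlement", "moderate"): 0.05,
    ("vertical_settlement", "severe"):   0.15,
    ("lateral_spreading",   "moderate"): 0.20,
    ("lateral_spreading",   "severe"):   0.45,  # pulled-apart foundations
}

def liquefaction_mean_damage(displacement_type: str, severity: str) -> float:
    """Look up an illustrative mean damage ratio for a location."""
    return LIQUEFACTION_DAMAGE[(displacement_type, severity)]

# Two locations a few hundred meters apart can land in very different bins:
print(liquefaction_mean_damage("vertical_settlement", "moderate"))  # 0.05
print(liquefaction_mean_damage("lateral_spreading", "severe"))      # 0.45
```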
Through its collaboration with the Canterbury Geotechnical Database (now known as the New Zealand Geotechnical Database), RMS was able to analyze borehole data along with claims insights in order to better understand how soil characteristics, water table depth and proximity to water courses influenced the exposure. It was also important to establish how liquefaction translated into damage, as ultimately this was the main area of concern.

Given the significant advances in seismic understanding after Christchurch, New Zealand was chosen as the location for the first RMS high-definition (HD) earthquake model. Released in 2016 and updated in 2020, the model leveraged the surge in data available from the Earthquake Commission (EQC) and GNS Science, among others, together with collaboration partners in the insurance market, to offer a more granular view of the hazard and vulnerability. The RMS New Zealand Earthquake HD Model was also the first to include an advanced liquefaction module in addition to landslide, fire following earthquake and probabilistic tsunami.

“We applied all the lessons from Christchurch to the rest of the country at a more localized level than had been possible before,” says Barksby. “New Zealand was selected for the first high-definition model because we had so much data arising from the sequence that we knew we could leverage HD methodologies and Cloud-computing technology, plus the country has a sophisticated insurance market.” Barksby describes it as a paradigm shift, with the same underlying principles and framework rolled out to improve the granularity and level of hazard and vulnerability detail captured by the other earthquake models, including those for Japan and North America.

Striving for a Resilient Future

A decade on from Tohoku and Christchurch, communities in Japan and New Zealand are still coming to terms with the tragedies and how the quakes have shaped their lives. While very large earthquakes remain relatively rare, it is important to understand their potential, including that of the associated perils. The return period for earthquakes on major faults or subduction zones is hundreds to thousands of years, and because they are so rare, each earthquake disaster has its own unique characteristics. The events of 2011 provided a unique opportunity to learn, to push the boundaries of earthquake science and seismic engineering, and to fundamentally improve the scientific and engineering communities’ understanding of earthquakes and their impacts. RMS has used this opportunity to redefine its perspective on seismic risk in Japan, in New Zealand and beyond.

Chesley Williams concludes: “At RMS, the goal is to implement the best available science, to understand the limitations of the modeling, to apply appropriate uncertainty assumptions and to ensure that we make the best estimate of seismic risk based on the information we have today.”

TreatyIQ: Striking a Difficult Balance
Helen Yates, May 05, 2020

As treaty underwriters prepare to navigate another challenging renewal season, compounded by an uncertain economic outlook, many are looking to new technological solutions to help them capitalize on nascent optimism around rates and build sustainable profitability. EXPOSURE explores the importance of reliable marginal impact analytics in biasing underwriting decisions in favor of diversification.

The fall in investment profits for insurance and reinsurance companies, as a result of the impact of COVID-19 on financial markets, is likely to encourage an upswing in reinsurance pricing. Low investment returns are one of the factors that facilitate a hardening market, making an underwriting profit even more of an imperative. As the midyear renewals approach, reinsurance companies are cautiously optimistic that the reinsurance rate on line will continue on an upward trend. According to Willis Towers Watson, pricing was up significantly on loss-affected accounts as of April 1, but elsewhere there were more modest rate rises. This suggests that at this point in the cycle reinsurers cannot count on rate increases, presenting market pricing uncertainty that will need to be navigated in real time during the renewals.

In years of weaker market returns, investment in tools to deliver analytical rigor and agile pricing to underwriters can be difficult to justify. But in many cases, existing analytical processes during busy periods can expose blind spots in the assessment of a cedant portfolio and latency in the analysis of current portfolio risk positions. These inefficiencies will be more pronounced in the current work-from-home era and will leave many underwriters wondering how they can quickly identify and secure the best deals for their portfolios.

Reducing Volatility Through the Cycle

Both parts of the underwriting cycle can put pressure on reinsurers’ underwriting decisions. Whether prices are hardening or softening, market forces can lead reinsurers toward higher volatility. “Part of the interplay in the treaty underwriting guidelines has to do with diversification,” explains Jesse Nickerson, senior director, pricing actuary at RMS. “Underwriters generally want to write risks that are diversifying in nature. However, when rates are low and competition is fierce, this desire is sometimes overwhelmed by pressure to put capital to use. Underwriting guidelines then have a somewhat natural tendency to slip as risks are written at inadequate prices.

“The reduced competition in the market during the period of low profitability triggers increases in rates, and the bounce upward begins,” he continues. “As rates rise and profitability increases, another loosening of underwriting guidelines can occur because all business begins to look like good business. This cycle is a challenge for all of these reinsurance companies to try and manage, as it can add significant volatility to their book.”

Tools such as RMS TreatyIQ™ help underwriters better carry out marginal impact analytics, which consider how the view of risk changes if new books of business are included in a treaty portfolio. Treaty underwriters are typically tasked with balancing the profitability of individual treaties against their impact on aggregate portfolio positions.
“One of the things that underwriters take into account as part of the underwriting process is, ‘What is the impact of this potential piece of business on my current portfolio?’” explains Oli Morran, director of product at RMS. “It’s just like an investment decision, except that they’re investing risk capital rather than investment capital.

“In order to get insight into marginal impact, treaty underwriters need to have a view of their portfolio in the application, and not just their current portfolio as it looks last week, month or quarter, but how it looks today,” he continues. “So, it collects all the treaty contracts you’ve underwritten and rolls it up together to get to your current portfolio position.”

Based on this understanding of a reinsurer’s aggregate risk position, underwriters are able to see in real time what impact any given piece of business would have, helping to inform how much capacity they are willing to bring to bear – and at what price. As reinsurers navigate the current, asymmetric market renewals, with the added challenge that increased home-working presents, such insight will allow them to make the right judgments based on a dynamic situation.

“Treaty underwriters can import that loss data into TreatyIQ, do some quick analysis and ‘math magic’ to make it work for their view of risk and then get a report in the app that tells them profitability metrics on each of the treaties in the structure, so they can configure the right balance of participation in each treaty when quoting to the broker or cedant,” says Morran.

An Art and Science

Relationships have always been a central part of treaty underwriting, whereby reinsurers select cedants to partner with based on many years of experience and association. Regardless of where the industry is in the market cycle, these important bonds help to shape the renewal process at key discussion points in the calendar. New tools, such as the TreatyIQ application, are enhancing both the “art” and “science” parts of the underwriting equation. They are reducing the potential for volatility as underwriters steer portfolios through the reinsurance cycle, while harnessing experience and pricing artistry in an auditable way.

While much of insurtech has until now been focused on the underlying insurance market, reinsurers are beginning to benefit from applications that offer them real-time insights. An informed approach can help identify the most profitable accounts and steer underwriters toward business that best complements their company’s existing portfolio, overall strategy and risk appetite. Reinsurance underwriters can now make decisions on whether to renew, and what pricing to set, based on a true understanding of what one risk sitting on their desk has the ability to do to the risks they already hold. With hundreds of treaty programs to assess during a busy renewal season, such insights support underwriters as they decide which deals to underwrite and what portion of each treaty to take on.

A constant challenge for treaty underwriters is how to strike the right balance between managing complex and often longstanding relationships with cedants and brokers, while at the same time ensuring that underwritten business complements an existing portfolio.
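The marginal impact analysis Morran and Nickerson describe boils down to pricing a portfolio risk metric with and without the candidate deal and comparing the two. The sketch below is a generic illustration of that pattern using simulated annual losses and a tail metric; the loss distributions, return period and premium loading are invented assumptions, not TreatyIQ’s actual algorithms.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical simulated annual losses, on the same set of simulation years,
# for the in-force portfolio and for a candidate treaty.
portfolio_losses = rng.lognormal(mean=16.0, sigma=1.0, size=50_000)
candidate_losses = rng.lognormal(mean=14.5, sigma=1.2, size=50_000)

def tail_loss(losses, return_period=250):
    """Loss at a given return period (e.g., the 1-in-250-year level)."""
    return float(np.quantile(losses, 1.0 - 1.0 / return_period))

standalone = tail_loss(portfolio_losses)
combined = tail_loss(portfolio_losses + candidate_losses)
marginal_capital = combined - standalone  # tail capital the new deal "consumes"

expected_loss = candidate_losses.mean()
premium = 1.3 * expected_loss  # placeholder pricing, not a real rating algorithm

print(f"Marginal 1-in-250 impact: {marginal_capital:,.0f}")
print(f"Marginal return on capital: {(premium - expected_loss) / marginal_capital:.1%}")
# A diversifying deal adds little to the tail and scores a high marginal return;
# a correlated one consumes far more capital for the same expected margin.
```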
Maintaining underwriting discipline while nurturing all-important client relationships is a more straightforward task when there is data and insight readily available, says Nickerson. “Much of the strength of TreatyIQ is in the efficiency of workflows in augmenting the insight underwriters have at their fingertips. The faster they can get back to a cedant or broker, the better it is for the relationship. The more completely they understand the impact to their portfolio, the better it is for their bottom line.”

RMS model data has long been a foundation in reinsurance treaty transactions, providing the common market view of risk for assessing probable catastrophe losses to a cedant’s portfolio. But using modeled data in treaty pricing analytics has traditionally been a multisystem undertaking, involving a supporting cast of actuaries and cat modelers. RMS Risk Intelligence™ – a modular risk analytics platform – has enabled RMS to develop TreatyIQ as a solution to the analytics needs of treaty underwriters, covering pricing and portfolio roll-up, and to close the analytical gaps that muddy pricing insights.

“TreatyIQ allows you to pass losses through potential treaties and quickly see which are the most profitable based on a user’s unique pricing algorithms and risk tolerance,” continues Nickerson. “You can see which have the most positive impact on your portfolio, allowing you to go back to the broker or cedant and make a more informed pitch. Ultimately, it allows underwriters to optimize internally against the constraints that exist in their world at a time of great uncertainty and change.”
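For readers less familiar with what “passing losses through a treaty” involves, the sketch below applies a simple per-occurrence excess-of-loss layer to simulated event losses and derives an indicative margin. The layer terms, loss distribution and loading are placeholder assumptions for illustration only, not a TreatyIQ pricing algorithm.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical largest-event loss per simulated year for one cedant (ignoring
# multiple events per year and reinstatements to keep the illustration short).
annual_max_event_loss = rng.lognormal(mean=16.5, sigma=1.3, size=100_000)

def ceded_xol(gross, attachment, limit):
    """Per-occurrence excess-of-loss recovery: pays above the attachment, capped at the limit."""
    return np.clip(gross - attachment, 0.0, limit)

attachment, limit = 50e6, 75e6   # a 75m xs 50m layer (illustrative terms)
ceded = ceded_xol(annual_max_event_loss, attachment, limit)

expected_ceded = ceded.mean()
attach_prob = (annual_max_event_loss > attachment).mean()
premium = 1.25 * expected_ceded  # placeholder loading for expenses and profit

print(f"Probability of attaching:   {attach_prob:.1%}")
print(f"Expected annual ceded loss: {expected_ceded:,.0f}")
print(f"Expected annual margin:     {premium - expected_ceded:,.0f}")
```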

Cyber Solutions 4.0: Modeling Systemic Risk
Helen Yates, May 05, 2020

The updated RMS cyber model leverages data, software vulnerabilities, attack scenarios and advanced analytics to help insurers and reinsurers get a handle on their risk aggregations.

From distributed denial of service (DDoS) attacks, cloud outages and contagious malware through to cyber physical exposures, cyber risk is an adaptive, ever-changing threat environment. The cyber insurance market has evolved with the threat, tailoring policies to the exposures most concerning businesses around the world, ranging from data breach to business interruption. But recent events have highlighted the very real potential for systemic risks arising from a cyberattack.

Nowhere was this more obvious than in the 2017 WannaCry and NotPetya ransomware attacks. WannaCry affected over 200,000 computers in businesses spanning industry sectors across 150 countries, including more than 80 National Health Service organizations in the U.K. alone. Had it not been for the discovery of a “kill switch,” the malware would have caused even more disruption and economic loss.

Just a month after WannaCry, NotPetya hit. It used the same weakness within corporate networks as the WannaCry ransomware, but without the ability to jump from one network to another. With another nation-state as the suspected sponsor, this new strain of contagious malware impacted major organizations, including shipping firm Maersk and pharmaceutical company Merck.

Both cyber events highlighted the potential for systemic loss from a single attack, as well as the issues surrounding “silent” cyber cover. The high-profile claims dispute between U.S. snack-food giant Mondelez and its property insurer, after the carrier refused a US$100 million claim based on a war exclusion within its policy, fundamentally changed the direction of the insurance market. It resulted in regulators and the industry coming together in a concerted push to clarify whether cyber cover was affirmative or non-affirmative.

The Cyber Black Swan

There are numerous sources of systemic risk arising from a cyber incident. For the cyber (re)insurance market to reach maturity, and a stage at which it can offer the limits and capacity now desired by commercial clients, it is first necessary to understand and mitigate these aggregate exposures. A report published by RMS and the Cambridge Centre for Risk Studies in 2019 found there is increasing potential for systemic failures in IT systems or for systemic exploitation of strategically important technologies. Much of this is the result of an ever more connected world, with the growth in the internet of things (IoT) and reliance on third-party vendors.

As the report states, “Supply chain attacks are a source of systemic risk, which will continue to grow over time with the potential for significant accumulation losses for the insurance industry.” The report also noted that many of the victims of NotPetya were unintentionally harmed by the ransomware, which is believed to have been a politically motivated attack against Ukraine.

Cyber Models Meet Evolving Market Demands

Models and other risk analysis tools have become critical to the ongoing development and growing sophistication of the cyber insurance and reinsurance markets.
As the industry continues to adapt its offering, there is demand for models that capture the latest threats and enable a clearer understanding of potential aggregations of risk within carriers’ books of business. As the insurance industry has evolved in its approach to cyber risk, so too has the modeling. Version 4.0 of the RMS Cyber Solutions, released in October 2019, brings together years of extensive research into the underlying processes that underpin cyber risk. It leverages millions of data points and provides expanded data enrichment capabilities on 13 million global companies, leading to improved model accuracy, explains Dr. Christos Mitas, head of the RMS cyber risk modeling group.

“We have been engaging with a couple of dozen clients for the past four years and incorporating features into our solution that speak to the pain points they see in their day-to-day business,” he says. “From introducing exclusions on the silent side and developing sophisticated models to understanding the hazard itself and modeling contagious malware as a physical process, we are gaining ever-greater insight into the physics and dynamics of the underlying risk.”

Feedback in the six months since the release of Version 4.0 has been extremely positive, says Mitas. “There has been genuine amazement around the data assets we have developed and the modeling framework around which we have organized this data collection effort. There has been a huge effort over the last two years by our data scientists, who have been using artificial intelligence (AI) and machine learning (ML) to collect data points from cyber events across all the sources of cyber risk that we model.

“Cyber 4.0 also included new functionality to address software vulnerabilities and motivations of cyber threat actor groups that have been active over the last few years,” he continues. “These are all datasets that we have collected, and they are complemented with third-party sources — including academia, cybersecurity firms, and partners within the insurance industry — into cyber damage events.”

There has been strong support from the reinsurance market, which has been a little behind the primary insurance market in developing its cyber product suite. “The reinsurance market has not developed as much as you would expect it to if they were relying on robust models,” says Mitas. “So, we have enhanced reinsurance modeling in our financial engines and exceedance probability (EP) curves to meet this need.

“We’ve had some good feedback from reinsurance pieces we have included in Version 4.0,” he continues. “From a cybersecurity point of view, very sophisticated clients that work with internal cybersecurity teams have commented on the strength of some of our modeling for contagious malware, and for cloud outages and data breach.”
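Mitas’s point about treating contagious malware as a physical process can be pictured with a toy epidemic-style simulation over a portfolio of firms. The compartmental structure, rates and firm count below are invented for illustration and are not the RMS methodology.

```python
# Toy susceptible-infected-removed (SIR) style spread of contagious malware
# across a portfolio of firms. All parameters are arbitrary placeholders.

def simulate_outbreak(n_firms=10_000, beta=0.35, gamma=0.25,
                      initial_infected=5, days=120):
    s, i, r = n_firms - initial_infected, float(initial_infected), 0.0
    infected_path = []
    for _ in range(days):
        new_infections = beta * s * i / n_firms   # contagion pressure
        new_recoveries = gamma * i                # patching and remediation
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        infected_path.append(i)
    return infected_path, r

path, total_affected = simulate_outbreak()
print(f"Peak simultaneous infections: {max(path):,.0f}")
print(f"Firms ultimately affected:    {total_affected:,.0f}")
# Mapping each affected firm to a business interruption or data restoration
# cost would turn this footprint into an aggregate insured loss for the event.
```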

Climate Change: The Cost of Inaction
Helen Yates, May 05, 2020

With pressure from multiple directions for a change in the approach to climate risk, how the insurance industry responds is under scrutiny.

Severe threats to the climate account for all of the top long-term risks in this year’s World Economic Forum (WEF) “Global Risks Report.” For the first time in the survey’s 10-year outlook, the top five global risks in terms of likelihood are all environmental. From an industry perspective, each one of these risks has potentially significant consequences for insurance and reinsurance companies:

- Extreme weather events with major damage to property, infrastructure and loss of human life
- Failure of climate change mitigation and adaptation by governments and businesses
- Man-made environmental damage and disasters, including massive oil spills and incidents of radioactive contamination
- Major biodiversity loss and ecosystem collapse (terrestrial or marine) with irreversible consequences for the environment, resulting in severely depleted resources for humans as well as industries
- Major natural disasters such as earthquakes, tsunamis, volcanic eruptions and geomagnetic storms

“There is mounting pressure on companies from investors, regulators, customers and employees to demonstrate their resilience to rising climate volatility,” says John Drzik, chairman of Marsh and McLennan Insights. “Scientific advances mean that climate risks can now be modeled with greater accuracy and incorporated into risk management and business plans. High-profile events, like recent wildfires in Australia and California, are adding pressure on companies to take action on climate risk.”

In December 2019, the Bank of England introduced new measures for insurers, expecting them to assess, manage and report on the financial risks of climate change as part of the bank’s 2021 Biennial Exploratory Scenario (BES) exercise. The BES builds on the Prudential Regulation Authority’s Insurance Stress Test 2019, which asked insurers to stress test their assets and liabilities based on a series of future climate scenarios. The Network for Greening the Financial System shows how regulators in other countries are moving in a similar direction.

“The BES is a pioneering exercise, which builds on the considerable progress in addressing climate-related risks that has already been made by firms, central banks and regulators,” said outgoing Bank of England governor Mark Carney. “Climate change will affect the value of virtually every financial asset; the BES will help ensure the core of our financial system is resilient to those changes.”

The insurance industry’s approach to climate change is evolving. Industry-backed groups such as ClimateWise have been set up to respond to the challenges posed by climate change while also influencing policymakers. “Given the continual growth in exposure to natural catastrophes, insurance can no longer simply rely on a strategy of assessing and re-pricing risk,” says Maurice Tulloch, former chair of ClimateWise and CEO of international insurance at Aviva.
“Doing so threatens a rise of uninsurable markets.”

The Cost of Extreme Events

In the past, property catastrophe (re)insurers were able to recalibrate their perception of natural catastrophe risk on an annual basis, as policies came up for renewal, believing that changes to hazard frequency and/or severity would occur incrementally over time. However, it has become apparent that some natural hazards have a much greater climate footprint than had previously been imagined. Attribution studies are helping insurers and other stakeholders to measure the financial impact of climate change on a specific event.

“You have had events in the last few years that have a climate change signature to them,” says Robert Muir-Wood, chief research officer of science and technology at RMS. “That could include wildfire in California or extraordinary amounts of rainfall during Hurricane Harvey over Houston, or the intensity of hurricanes in the Caribbean, such as Irma, Maria and Dorian.

“These events appear to be more intense and severe than those that have occurred in the past,” he continues. “Attribution studies are corroborating the fact that these natural disasters really do have a climate change signature. It was a bit experimental to start with, but now it’s just become a regular part of the picture, that after every event a designated attribution study program will be undertaken … often by more than one climate lab.

“In the past it was a rather futile argument whether or not an event had a greater impact because of climate change, because you couldn’t really prove the point,” he adds. “Now it’s possible to say not only if an event has a climate change influence, but by how much. The issue isn’t whether something was or was not climate change, it’s that climate change has affected the probability of an event like that by this amount. That is the nature of the conversation now, which is an intelligent way of thinking about it.”

Record catastrophe losses in 2017 and 2018, with combined claims costing insurers US$230 billion according to Swiss Re sigma, have had a significant impact on the competitive and financial position of many property catastrophe (re)insurers. The loss tally from 2019 was less severe, with global insurance losses below the 10-year average at US$56 billion, but Typhoons Faxai and Hagibis caused significant damage to Japan when they struck just weeks apart in September and October.

“It can be argued that the insurance industry is the only sector that is going to be able to absorb the losses from climate change,” adds Muir-Wood. “Companies already feel they are picking up losses in this area and it’s a bit uncharted — you can’t just use the average of history. It doesn’t really work anymore. So, we need to provide the models that give our clients the comfort of knowing how to handle and price climate change risks in anticipation.”
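The quantified statement Muir-Wood describes is typically expressed as a probability ratio or a fraction of attributable risk, comparing the event’s probability in today’s climate with its probability in a counterfactual climate without human influence. The probabilities below are invented for illustration:

```python
# Illustrative event attribution arithmetic (hypothetical probabilities).
p_with_climate_change = 1 / 25      # modeled annual probability of the event today
p_without_climate_change = 1 / 100  # probability in a counterfactual climate

probability_ratio = p_with_climate_change / p_without_climate_change
fraction_attributable = 1 - p_without_climate_change / p_with_climate_change

print(f"Probability ratio: {probability_ratio:.1f}x more likely")
print(f"Fraction of attributable risk: {fraction_attributable:.0%}")
# Here the event is 4x more likely, and 75% of its current probability is
# attributable to climate change - the kind of quantified statement that
# attribution studies now make after major events.
```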
The Cost of Short-Termism

While climate change is clearly on the agenda of the boards of international insurance and reinsurance firms, its emphasis differs from company to company, according to the Geneva Association. In a report, the industry think tank found that insurers are hindered from scaling up their contribution to climate adaptation and mitigation by barriers imposed at a public policy and regulatory level.

The need to take a long-term view on climate change is at odds with the pressures that insurance companies are under as public and regulated entities. Shareholder expectations and political demands to keep insurance rates affordable are in conflict with the need to charge a risk-adjusted price or reduce exposures in regions that are highly catastrophe exposed. This need to protect property owners from full risk pricing became an election issue in the Florida market, for example, when state-owned carrier Florida Citizens supported customers with effectively subsidized premiums. The disproportionate emphasis on using the historical record as a means of modeling the probability of future losses is a further challenge for the private market operating in the state.

“In the past when insurers were confronted with climate change, they were comfortable with the sense that they could always put up the price or avoid writing the business if the risk got too high,” says Muir-Wood. “But I don’t think that’s a credible position anymore. We see situations, such as in California, where insurers are told they should already have priced in climate change risk and they need to use the average of the last 30 years, and that’s obviously a challenge for the solvency of insurers.

“The Florida Insurance Commissioner’s function is more weighted to look after the interests of consumers around insurance prices, and they maintain a very strong line that risk models should be calibrated against the long-term historical averages,” he continues. “And they’ve said that both in Florida for hurricane and in California for wildfire. And in a time of change and a time of increased risk, that position is clearly not in the interest of insurers, and they need to be thinking carefully about that.

“Regulators want to be up to speed on this,” he adds. “If levels of risk are increasing, they need to make sure that (re)insurance companies can remain solvent. That they have enough capital to take on those risks. And supervisors will expect the companies they regulate to turn up with extremely good arguments and a demonstration of the data behind their position as to how they are pricing their risk and managing their portfolios.”

The Reputational Cost of Inaction

Despite the persistence of near-term pressures, failing to act and to take a long-term view on climate change is no longer a viable option for the industry. In part, this is due to a mounting reputational cost. European and Australian (re)insurers have, for instance, been more proactive in divesting from fossil fuels than their American and Asian counterparts. This is expected to change as negative attention mounts in both mainstream and social media. The number of insurers withdrawing cover for coal more than doubled in 2019, with coal exit policies announced by 17 (re)insurance companies.
“The role of insurers is to manage society’s risks — it is their duty and in their own interest to help avoid climate breakdown,” says Peter Bosshard, coordinator of the Unfriend Coal campaign. “The industry’s retreat from coal is gathering pace as public pressure on the fossil fuel industry and its supporters grows.”

The influence of climate change activists such as Greta Thunberg, the actions of NGO pressure groups like Unfriend Coal and growing climate change disclosure requirements are building critical momentum and scrutiny around the action (or lack thereof) taken by insurance senior management.

“If you are in the driver’s seat of an insurance company and you know your customers’ attitudes are shifting quite fast, then you need to avoid looking as though you are behind the curve,” says Muir-Wood. “Quite clearly there is a reputational side to this. Attitudes are changing, and as an industry we should anticipate that all sorts of things that are tolerated today will become unacceptable in the future.”

Insurance: The Next 10 Years
Helen Yates, September 06, 2019
buildings

Mohsen Rahnama, Cihan Biyikoglu and Moe Khosravy of RMS look to 2029, consider the changes the (re)insurance industry will have undergone and explain why all roads lead to a platform.

Over the last 30 years, catastrophe models have become an integral part of the insurance industry for portfolio risk management. During this time, the RMS model suite has evolved and expanded from the initial IRAS model, which covered California earthquake, to a comprehensive and diverse set of models covering over 100 peril-country combinations all over the world. RMS Risk Intelligence™, an open and flexible platform, was recently launched; it was built to enable better risk management and support profitable risk selection.

Since the earliest versions of catastrophe models, significant advances have been made in both technology and computing power. These advances allow for a more comprehensive application of new science in risk modeling and make it possible for modelers to address key sources of model and loss uncertainty in a more systematic way. These and other significant changes over the last decade are shaping the future of insurance. By 2029, the industry will be fully digitized, presenting even more opportunity for disruption in an era of technological advances. In what is likely to remain a highly competitive environment, market participants will need to differentiate based on the power of computing speed and the ability to mine and extract value from data to inform quick, risk-based decisions.

Laying the Foundations

So how did we get here? Over the past few decades we have witnessed several major natural catastrophes, including Hurricanes Andrew, Katrina and Sandy; the Northridge, Kobe, Maule, Tohoku and Christchurch Earthquakes; and costly hurricanes and California wildfires in 2017 and 2018. Further, human-made catastrophes have included the terrorist attacks of 9/11 and major cyberattacks, such as WannaCry and NotPetya.

Each of these events has changed the landscape of risk assessment, underwriting and portfolio management. Combining the lessons learned from past events, including billions of dollars of loss data, with new technology has enhanced risk modeling methodology, resulting in more robust models and a more effective way to quantify risk across diverse regions and perils. The sophistication of catastrophe models has increased as technology has enabled a better understanding of the root causes and behavior of events and improved analysis of their impact. Technology has also equipped the industry with more sophisticated tools to harness larger datasets and run more computationally intensive analytics. These new models are designed to translate finer-grained data into deeper and more detailed insights. Consequently, we are creating better models while also ensuring model users can make better use of model results through more sophisticated tools and applications.

A Collaborative Approach

In the last decade, the pace at which technology has advanced is compelling. Emerging technology has caused the insurance industry to question whether it is responding quickly and effectively enough to take advantage of new opportunities. In today’s digital world, many segments of the industry are leveraging the power and capacity enabled by Cloud-computing environments to conduct intensive data analysis using robust analytics.
Such an approach empowers the industry by allowing information to be accessed quickly, whenever it is needed, to make effective, fully informed decisions. The development of a standardized, open platform creates smooth workflows and allows for rapid advancement, information sharing and collaboration as common applications grow.

The future of communication between various parties across the insurance value chain — insurers, brokers, reinsurers, supervisors and capital markets — will be vastly different from what it is today. By 2029, we anticipate the transfer of data, use of analytics and other collaborations will be taking place across a common platform. The benefits will include increased efficiency, more accurate data collection and improvements in underwriting workflow. A collaborative platform will also enable more robust and informed risk assessments, portfolio roll-up processes and risk transfers. Further, as data is exchanged it will be enriched and augmented using new machine learning and AI techniques.

An Elastic Platform

We continue to see technology evolve at a very rapid pace. Infrastructure continues to improve as the cost of storage declines and computational speed increases. Across the board, the incremental cost of computing technology has come down. Software tools have evolved accordingly, with modern big data systems now capable of handling hundreds if not thousands of terabytes of data. Improved programming frameworks allow for more seamless parallel programming. User-interface components reveal data in ways that were not possible in the past. Furthermore, this collection of phenomenal advances is now available in the Cloud, with the added benefit that it is continuously self-improving to support growing commercial demands.

In addition to helping avoid built-in obsolescence, the Cloud offers “elasticity.” Elasticity means accessing many machines when you need them and fewer when you don’t. It means storage that can dynamically grow and shrink, and computing capacity that can follow the ebb and flow of demand. In our world of insurance and data analytics, the macro cycles of renewal seasons and the micro-level bursts of modeling demand can both be accommodated through the elastic nature of the Cloud (a toy sketch of this scaling pattern follows at the end of this section). In an elastic world, the actual cost of supercomputing goes down, and we can confidently guarantee fast response times.

Empowering Underwriters

A decade from now, the industry will look very different, not least due to changes within the workforce and the risk landscape. First-movers and fast-followers will be in a position of competitive advantage come 2029, in an industry where large incumbents are already partnering with more agile “insurtech” startups. The role of the intermediary will continue to evolve, and at every stage of risk transfer — from insured to primary insurer, reinsurer and into the capital markets — data sharing and standardization will become key success factors. Over the next 10 years, as data becomes more standardized and more widely shared, the concept of blockchain, or distributed ledger technology, will move closer to becoming a reality. This standardization, collaboration and use of advanced analytics are essential to the future of the industry. Machine learning and AI, highly sophisticated models and enhanced computational power will enable underwriters to improve their risk selection and make quick, highly informed decisions.
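As a minimal, purely illustrative sketch of the elasticity idea above, the Python snippet below sizes a hypothetical pool of modeling workers from the depth of a job queue. The thresholds, names and numbers are assumptions invented for the example; this is a generic demand-driven scaling rule, not RMS platform code.

```python
# Toy illustration of cloud "elasticity": compute capacity follows demand.
# All names and numbers are hypothetical and chosen only for the example.

from dataclasses import dataclass

@dataclass
class ScalingPolicy:
    jobs_per_worker: int = 50   # target number of queued analyses per worker
    min_workers: int = 2        # capacity kept warm during quiet periods
    max_workers: int = 200      # cost ceiling during renewal-season peaks

def workers_needed(queued_jobs: int, policy: ScalingPolicy) -> int:
    """Return how many workers to run for the current modeling backlog."""
    demand = -(-queued_jobs // policy.jobs_per_worker)  # ceiling division
    return max(policy.min_workers, min(policy.max_workers, demand))

if __name__ == "__main__":
    policy = ScalingPolicy()
    # A quiet week, a renewal-season burst, then back to quiet.
    for queued in (120, 9500, 300):
        print(f"{queued:>5} queued analyses -> {workers_needed(queued, policy)} workers")
```

Capacity grows to meet the burst and falls back to the minimum when the queue drains, which is the “ebb and flow” of demand described above.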
This ability to make quick, risk-informed decisions will enhance the role of the insurance industry in society, in a changing and altogether riskier world. The tremendous protection gap can only be tackled when there is more detailed insight into, and differentiation around, each individual risk. When there is greater insight into the underlying risk, there is less need for conservatism, risks become more accurately and competitively priced, and (re)insurers are able to innovate to provide products and solutions for new and emerging exposures.

Over the coming decade, models will require advanced computing technology to fully harness the power of big data. Underwater robots are now probing previously unmapped ocean waters to detect changes in temperatures, currents, sea level and coastal flooding. Drones are surveying our built environment in fine detail. Artificial intelligence and machine learning algorithms are searching for patterns of climate change in these new datasets, and climate models are reconstructing the past and predicting the future at a resolution never before possible. These emerging technologies and datasets will help meet our industry’s insatiable demand for more robust risk assessment at the level of an individual asset.

This explosion of data will fundamentally change the way we think about model execution and development, as well as the end-to-end software infrastructure. Platforms will need to be dynamic and forward-looking versus static and historic in the way they acquire, train and execute on data. The industry has already transformed considerably over the past five years, despite traditionally being considered a laggard in terms of its technology adoption. The foundation is firmly in place for a further shift over the next decade, where all roads lead to a common, collaborative industry platform on which participants are willing to share data and insights and, as they do so, open up new markets and opportunities.

RMS Risk Intelligence

The analytical and computational power of the Risk Intelligence (RI) platform enables the RMS model development team to bring the latest science and research to the RMS catastrophe peril model suite and build the next generation of high-definition models. The functionality and high performance of RI allow the RMS team to assess elements of model and loss uncertainty in a more robust way than before. The framework of RI is flexible, modular and scalable, allowing the rapid integration of future knowledge with a swifter implementation and update cycle. The open modeling platform allows model users to extract more value from their claims experience to develop vulnerability functions that represent a view of risk specific to their data, or to use custom-built alternatives (a simplified sketch of this idea follows below). This enables users to perform a wide range of sensitivity tests and take ownership of their view of risk.

Mohsen Rahnama is chief risk modeling officer and executive vice president, models and data; Cihan Biyikoglu is executive vice president, product; and Moe Khosravy is executive vice president, software and platform at RMS
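To make the custom-vulnerability point above a little more concrete, here is a minimal sketch, assuming entirely synthetic claims data and a generic S-shaped curve. It does not use the Risk Intelligence API; the dataset, functional form and parameter names are invented purely to illustrate how a claims-derived view of vulnerability might be fitted.

```python
# Illustrative only: fit a simple mean-damage-ratio curve to synthetic
# claims observations. Not RMS Risk Intelligence code; the data and the
# logistic functional form are assumptions made for this sketch.

import numpy as np
from scipy.optimize import curve_fit

def damage_ratio(wind_mph, v_half, slope):
    """S-shaped mean damage ratio: near 0 at low winds, approaching 1 at high winds."""
    return 1.0 / (1.0 + np.exp(-(wind_mph - v_half) / slope))

# Synthetic "claims experience": peak gust at each location versus paid claim
# as a fraction of building value (both invented for the example).
gusts = np.array([60, 70, 80, 90, 100, 110, 120, 130, 140, 150], dtype=float)
ratios = np.array([0.00, 0.01, 0.03, 0.07, 0.15, 0.30, 0.48, 0.66, 0.80, 0.90])

(v_half, slope), _ = curve_fit(damage_ratio, gusts, ratios, p0=[120.0, 15.0])
print(f"Fitted curve: 50% mean damage ratio near {v_half:.0f} mph (slope ~{slope:.1f} mph)")
print(f"Predicted damage ratio at 125 mph: {damage_ratio(125.0, v_half, slope):.2f}")
```

A model user could then compare such a client-specific curve against a reference vulnerability view as one of the sensitivity tests mentioned above.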

Helen Yates, September 06, 2019
storm
Severe Convective Storms: A New Peak Peril?
September 06, 2019

Severe convective storms (SCS) have driven U.S. insured catastrophe losses in recent years, with both attritional and major single-event claims now rivaling an average hurricane season. EXPOSURE looks at why SCS losses are rising and asks how (re)insurers should be responding

At the time of writing, 2019 was already shaping up to be another active season for U.S. severe convective storms (SCS), with at least eight tornadoes daily over a period of 12 consecutive days in May. It was the most active May for tornadoes since 2015, with no fewer than seven SCS outbreaks across central and eastern parts of the U.S. According to data from the National Oceanic and Atmospheric Administration (NOAA), there were 555 preliminary tornado reports, more than double the average of 276 for the month over the period 1991-2010. On current numbers, May 2019 produced the second-highest number of reported tornadoes for any month on record, after April 2011, which broke multiple records for SCS and tornado touchdowns.

It continues a trend set over the past two decades, which has seen SCS losses increase significantly and steadily. In 2018, losses amounted to US$18.8 billion, of which US$14.1 billion was insured. This compares with insured losses of US$15.6 billion from hurricanes in the same period. While SCS losses are often the accumulation of claims from multiple events, single events have cost insurers and reinsurers over US$3 billion in claims. This includes the costliest SCS to date, which hit Tuscaloosa, Alabama, in April 2011, involving several tornado touchdowns and causing US$7.9 billion in insured damage. The second-most-costly SCS occurred in May of the same year, striking Joplin, Missouri, and other locations, resulting in insured losses of nearly US$7.6 billion.

According to RMS models, average losses from SCS now exceed US$15 billion annually and are in the same range as hurricane average annual loss (AAL), a finding backed up by independently published scientific research. “The losses in 2011 and 2012 were real eye-openers,” says Rajkiran Vojjala, vice president of modeling at RMS. “SCS is no longer a peril with events that cost a few hundred million dollars. You could have cat losses of US$10 billion in today’s money if there were events similar to those in April 2011.”

Nearly a third of the tornadoes reported in an average year occur in Texas, Oklahoma, Kansas and Nebraska, all states within “Tornado Alley.” This is where cold, dry polar air meets warm, moist air moving up from the Gulf of Mexico, causing strong convective activity. “A typical SCS swath affects many states. So the extent is large, unlike, say, wildfire, which is truly localized to a small particular region,” says Vojjala.

Research suggests the annual number of Enhanced Fujita (EF) scale EF2 and stronger tornadoes hitting the U.S. has trended upward over the past 20 years; however, there is some doubt over whether this is a real meteorological trend. One explanation could be that more extensive observation simply means such weather phenomena are more likely to be recorded, particularly in less populated regions. According to Juergen Grieser, senior director of modeling at RMS, there is also debate over whether part of the increase in SCS-related claims can be attributed to climate change.
“A warmer climate means a weaker jet stream, which should lead to less organized convection while the energy of convection might increase,” he says. “The trend in the scientific discussion is that there might be fewer but more-severe events.”

Claims severity, rather than claims frequency, is the more significant driver of losses from hail events, he adds. “We have an increase in hail losses of about 11 percent per year over the last 15 years, which is quite a lot. But 7.5 percent of that is from an increase in the cost of individual claims,” explains Grieser. “So, while the claims frequency has also increased in this period, the individual claim is more expensive now than it was ever before.” (A rough illustration of this frequency-severity split appears at the end of this article.)

Claims Go ‘Through the Roof’

Another big driver of loss is likely to be aging roofs and the growing exposure at risk from SCS. The contribution of roof age was explored in a blog last year by Stephen Cusack, director of model development at RMS. He noted that one of the biggest changes in residential exposure to SCS over the past two decades has been the rise in the median age of housing from 30 years in 2001 to 37 years in 2013.

A changing insurance industry climate is also driving increased losses, thinks Vojjala. “There has been a change in public perception on claiming, whereby even cosmetic damage to roofs is now being claimed and contractors are chasing hailstorms to see what damage might have been caused,” he says. “So, there is more awareness and that has led to higher losses.

“The insurance products for hail and tornado have grown and so those perils are being insured more, and there are different types of coverage,” he notes. “Most insurers now offer not replacement cost but only the actual value of the roofs to alleviate some of the rising cost of claims. On the flip side, if they do continue offering full replacement coverage and a hurricane hits in some of those areas, you now have better roofs.”

How insurance companies approach the peril is changing as a result of rising claims. “Historically, insurance and reinsurance clients have viewed SCS as an attritional loss, but in the last five to 10 years the changing trends have altered that perception,” says Vojjala. “That’s where there is this need for high-resolution modeling, which increasingly our clients have been asking for to improve their exposure management practices.

“With SCS also having catastrophic losses, it has stoked interest from the ILS community as well, who are also experimenting with parametric triggers for SCS,” he adds. “We usually see this on the earthquake or hurricane side, but increasingly we are seeing it with SCS as well.”
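Grieser's quoted figures imply a rough split between severity and frequency. The snippet below backs out the implied frequency trend under the assumption, made purely for illustration, that annual hail losses grow multiplicatively as claim frequency times average claim cost; the article quotes only the two growth rates, not this decomposition.

```python
# Back-of-the-envelope split of the quoted hail-loss trend into severity and
# frequency components. The multiplicative model (loss growth = frequency
# growth x severity growth) is an assumption for illustration; only the
# ~11% and ~7.5% figures come from the article.

total_growth = 0.110     # ~11% per year rise in hail losses (quoted)
severity_growth = 0.075  # ~7.5% per year rise in cost per claim (quoted)

# Multiplicative model: (1 + total) = (1 + frequency) * (1 + severity)
implied_frequency_growth = (1 + total_growth) / (1 + severity_growth) - 1
print(f"Implied claim-frequency growth: ~{implied_frequency_growth:.1%} per year")

# For small rates, the additive shortcut (11% - 7.5% = 3.5%) is close.
print(f"Additive approximation: ~{total_growth - severity_growth:.1%} per year")
```

On either reading, most of the trend comes from more expensive individual claims, consistent with Grieser's point that severity rather than frequency is driving hail losses.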
