Helen Yates, May 05, 2020
TreatyIQ: Striking a Difficult Balance

As treaty underwriters prepare to navigate another challenging renewal season, compounded by an uncertain economic outlook, many are looking to new technological solutions to help them capitalize on nascent optimism around rates and build sustainable profitability. EXPOSURE explores the importance of reliable marginal impact analytics in biasing underwriting decisions in favor of diversification.

The fall in investment profits for insurance and reinsurance companies as a result of the impact of COVID-19 on financial markets is likely to encourage an upswing in reinsurance pricing. Low investment returns are one of the factors that facilitate a hardening market, making an underwriting profit even more of an imperative.

As the midyear renewals approach, reinsurance companies are cautiously optimistic that the reinsurance rate on line will continue on an upward trend. According to Willis Towers Watson, pricing was up significantly on loss-affected accounts as of April 1, but elsewhere rate rises were more modest. This suggests that at this point in the cycle reinsurers cannot count on rate increases, presenting market pricing uncertainty that will need to be navigated in real time during the renewals.

In years of weaker market returns, investment in tools that deliver analytical rigor and agile pricing to underwriters can be difficult to justify. But in many cases, existing analytical processes expose blind spots in the assessment of a cedant portfolio during busy periods, as well as latency in the analysis of current portfolio risk positions. These inefficiencies will be more pronounced in the current work-from-home era and will leave many underwriters wondering how they can quickly identify and secure the best deals for their portfolios.

Reducing Volatility Through the Cycle

Both phases of the underwriting cycle can put pressure on reinsurers' underwriting decisions. Whether prices are hardening or softening, market forces can lead reinsurers toward higher volatility.

"Part of the interplay in the treaty underwriting guidelines has to do with diversification," explains Jesse Nickerson, senior director, pricing actuary at RMS. "Underwriters generally want to write risks that are diversifying in nature. However, when rates are low and competition is fierce, this desire is sometimes overwhelmed by pressure to put capital to use. Underwriting guidelines then have a somewhat natural tendency to slip as risks are written at inadequate prices.

"The reduced competition in the market during the period of low profitability triggers increases in rates, and the bounce upward begins," he continues. "As rates rise and profitability increases, another loosening of underwriting guidelines can occur because all business begins to look like good business. This cycle is a challenge for all of these reinsurance companies to try and manage, as it can add significant volatility to their book."

Tools such as RMS TreatyIQ™ help underwriters better carry out marginal impact analytics, which consider the view of risk if new books of business are included in a treaty portfolio. Treaty underwriters are typically tasked with balancing the profitability of individual treaties against their impact on aggregate portfolio positions.
"One of the things that underwriters take into account as part of the underwriting process is, 'What is the impact of this potential piece of business on my current portfolio?'" explains Oli Morran, director of product at RMS. "It's just like an investment decision, except that they're investing risk capital rather than investment capital.

"In order to get insight into marginal impact, treaty underwriters need to have a view of their portfolio in the application, and not just their current portfolio as it looks last week, month or quarter, but how it looks today," he continues. "So, it collects all the treaty contracts you've underwritten and rolls it up together to get to your current portfolio position."

Based on this understanding of a reinsurer's aggregate risk position, underwriters are able to see in real time what impact any given piece of business would have, helping to inform how much capacity they are willing to bring to bear – and at what price. As reinsurers navigate the current, asymmetric market renewals, with the added challenge that increased home-working presents, such insight will allow them to make the right judgments based on a dynamic situation.

"Treaty underwriters can import that loss data into TreatyIQ, do some quick analysis and 'math magic' to make it work for their view of risk, and then get a report in the app that tells them profitability metrics on each of the treaties in the structure, so they can configure the right balance of participation in each treaty when quoting to the broker or cedant," says Morran.

An Art and Science

Relationships have always been a central part of treaty underwriting, whereby reinsurers select cedants to partner with based on many years of experience and association. Regardless of where the industry is in the market cycle, these important bonds help to shape the renewal process at key discussion points in the calendar.

New tools, such as the TreatyIQ application, are enhancing both the "art" and "science" parts of the underwriting equation. They are reducing the potential for volatility as underwriters steer portfolios through the reinsurance cycle, while harnessing experience and pricing artistry in an auditable way. While much of insurtech has until now been focused on the underlying insurance market, reinsurers are beginning to benefit from applications that offer them real-time insights.

An informed approach can help identify the most profitable accounts and steer underwriters toward business that best complements their company's existing portfolio, overall strategy and risk appetite. Reinsurance underwriters can now make decisions on whether to renew, and what pricing to set, based on a true understanding of what one risk sitting on their desk has the ability to do to the risks they already hold. With hundreds of treaty programs to assess during a busy renewal season, such insights support underwriters as they decide which deals to underwrite and what portion of each treaty to take on.

A constant challenge for treaty underwriters is how to strike the right balance between managing complex and often longstanding relationships with cedants and brokers, while at the same time ensuring that underwritten business complements an existing portfolio.
Maintaining underwriting discipline while nurturing all-important client relationships is a more straightforward task when there is data and insight readily available, says Nickerson. "Much of the strength of TreatyIQ is in the efficiency of workflows in augmenting the insight underwriters have at their fingertips. The faster they can get back to a cedant or broker, the better it is for the relationship. The more completely they understand the impact to their portfolio, the better it is for their bottom line."

RMS model data has long been a foundation in reinsurance treaty transactions, providing the common market view of risk for assessing probable catastrophe losses to a cedant's portfolio. But using modeled data in treaty pricing analytics has traditionally been a multisystem undertaking, involving a supporting cast of actuaries and cat modelers.

RMS Risk Intelligence™ – a modular risk analytics platform – has enabled RMS to develop TreatyIQ as a solution to the analytics needs of treaty underwriters, covering pricing and portfolio roll-up, and to close the analytical gaps that muddy pricing insights.

"TreatyIQ allows you to pass losses through potential treaties and quickly see which are the most profitable based on a user's unique pricing algorithms and risk tolerance," continues Nickerson. "You can see which have the most positive impact on your portfolio, allowing you to go back to the broker or cedant and make a more informed pitch. Ultimately, it allows underwriters to optimize internally against the constraints that exist in their world at a time of great uncertainty and change."
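The mechanics Nickerson describes (passing cedant losses through candidate treaty structures and measuring each structure's marginal effect on the book) can be illustrated with a minimal sketch. This is not the TreatyIQ implementation; the aggregate excess-of-loss terms, profitability metric and numbers below are illustrative assumptions only.

```python
import numpy as np

def layer_recovery(gross_losses, attachment, limit):
    """Ceded loss to an excess-of-loss layer (applied here to annual aggregate losses for simplicity)."""
    return np.clip(gross_losses - attachment, 0.0, limit)

def tail_mean(losses, prob=0.01):
    """Average annual loss in the worst `prob` fraction of simulated years."""
    threshold = np.quantile(losses, 1.0 - prob)
    return losses[losses >= threshold].mean()

def marginal_impact(portfolio_annual, treaty_annual, premium, prob=0.01):
    """Change in expected loss and tail metric if the candidate treaty is added to the book."""
    combined = portfolio_annual + treaty_annual
    return {
        "marginal_expected_loss": combined.mean() - portfolio_annual.mean(),
        "marginal_tail_loss": tail_mean(combined, prob) - tail_mean(portfolio_annual, prob),
        "premium": premium,
    }

# Illustrative numbers: simulated annual losses for the existing book and for a candidate
# cedant, using the same simulation years so any correlation between them is preserved.
rng = np.random.default_rng(42)
portfolio_annual = rng.lognormal(mean=16.0, sigma=1.0, size=50_000)
cedant_annual = rng.lognormal(mean=15.0, sigma=1.2, size=50_000)
treaty_annual = layer_recovery(cedant_annual, attachment=10e6, limit=15e6)

print(marginal_impact(portfolio_annual, treaty_annual, premium=2.0e6))
```

In this framing, a treaty that diversifies the book adds little to the tail metric, so the same premium buys more marginal profit than one that stacks onto existing peak exposures.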

Helen Yates, May 05, 2020
Cyber Solutions 4.0: Modeling Systemic Risk

The updated RMS cyber model leverages data, software vulnerabilities, attack scenarios and advanced analytics to help insurers and reinsurers get a handle on their risk aggregations.

From distributed denial of service (DDoS) attacks, cloud outages and contagious malware through to cyber physical exposures, cyber risk is an ever-changing threat environment. The cyber insurance market has evolved with the threat, tailoring policies to the exposures most concerning businesses around the world, ranging from data breach to business interruption. But recent events have highlighted the very real potential for systemic risks arising from a cyberattack.

Nowhere was this more obvious than in the 2017 WannaCry and NotPetya ransomware attacks. WannaCry affected over 200,000 computers in businesses that spanned industry sectors across 150 countries, including more than 80 National Health Service organizations in the U.K. alone. Had it not been for the discovery of a "kill switch," the malware would have caused even more disruption and economic loss.

Just a month after WannaCry, NotPetya hit. It used the same weakness within corporate networks as the WannaCry ransomware, but without the ability to jump from one network to another. With another nation-state as the suspected sponsor, this new strain of contagious malware impacted major organizations, including shipping firm Maersk and pharmaceutical company Merck.

Both cyber events highlighted the potential for systemic loss from a single attack, as well as the issues surrounding "silent" cyber cover. The high-profile claims dispute between U.S. snack-food giant Mondelez and its property insurer, after the carrier refused a US$100 million claim based on a war exclusion within its policy, fundamentally changed the direction of the insurance market. It resulted in regulators and the industry coming together in a concerted push to clarify whether cyber cover was affirmative or non-affirmative.

The Cyber Black Swan

There are numerous sources of systemic risk arising from a cyber incident. For the cyber (re)insurance market to reach maturity and a stage at which it can offer the limits and capacity now desired by commercial clients, it is first necessary to understand and mitigate these aggregate exposures.

A report published by RMS and the Cambridge Centre for Risk Studies in 2019 found there is increasing potential for systemic failures in IT systems or for systemic exploitation of strategically important technologies. Much of this is the result of an ever more connected world, with a growth in the internet of things (IoT) and reliance on third-party vendors. As the report states, "Supply chain attacks are a source of systemic risk, which will continue to grow over time with the potential for significant accumulation losses for the insurance industry."

The report also noted that many of the victims of NotPetya were unintentionally harmed by the ransomware, which is believed to have been a politically motivated attack against Ukraine.

Cyber Models Meet Evolving Market Demands

Models and other risk analysis tools have become critical to the ongoing development and growing sophistication of the cyber insurance and reinsurance markets.
As the industry continues to adapt its offering, there is demand for models that capture the latest threats and enable a clearer understanding of potential aggregations of risk within carriers' books of business.

As the insurance industry has evolved in its approach to cyber risk, so too has the modeling. Version 4.0 of the RMS Cyber Solutions, released in October 2019, brings together years of extensive research into the underlying processes that underpin cyber risk. It leverages millions of data points and provides expanded data enrichment capabilities on 13 million global companies, leading to improved model accuracy, explains Dr. Christos Mitas, head of the RMS cyber risk modeling group.

"We have been engaging with a couple of dozen clients for the past four years and incorporating features into our solution that speak to the pain points they see in their day-to-day business," he says. "From introducing exclusions on the silent side and developing sophisticated models to understanding the hazard itself and modeling contagious malware as a physical process, we are gaining ever-greater insight into the physics and dynamics of the underlying risk."

Feedback over the past six months since the release of Version 4.0 has been extremely positive, says Mitas. "There has been genuine amazement around the data assets we have developed and the modeling framework around which we have organized this data collection effort. There has been a huge effort over the last two years by our data scientists, who have been using artificial intelligence (AI) and machine learning (ML) to collect data points from cyber events across all the sources of cyber risk that we model.

"Cyber 4.0 also included new functionality to address software vulnerabilities and motivations of cyber threat actor groups that have been active over the last few years," he continues. "These are all datasets that we have collected, and they are complemented with third-party sources — including academia, cybersecurity firms, and partners within the insurance industry — into cyber damage events."

There has been strong support from the reinsurance market, which has been a little bit behind the primary insurance market in developing its cyber product suite. "The reinsurance market has not developed as much as you would expect it to if they were relying on robust models," says Mitas. "So, we have enhanced reinsurance modeling in our financial engines and exceedance probability (EP) curves to meet this need.

"We've had some good feedback from reinsurance pieces we have included in Version 4.0," he continues. "From a cybersecurity point of view, very sophisticated clients that work with internal cybersecurity teams have commented on the strength of some of our modeling for contagious malware, and for cloud outages and data breach."

Click here to learn more about RMS's purpose-built cyber model.
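A technical aside on the exceedance probability (EP) curves Mitas mentions: an EP curve can be built directly from a table of simulated annual losses. The sketch below is a generic, purely illustrative construction (the loss distribution and return period are invented for the example) and is not the RMS financial engine.

```python
import numpy as np

def exceedance_probability_curve(annual_losses):
    """Empirical EP curve: probability that the annual loss exceeds each simulated value."""
    losses = np.sort(np.asarray(annual_losses))[::-1]             # largest loss first
    exceedance_prob = np.arange(1, len(losses) + 1) / len(losses) # rank / simulated years
    return losses, exceedance_prob

def loss_at_return_period(annual_losses, return_period_years):
    """Loss exceeded, on average, once every `return_period_years` simulated years."""
    return np.quantile(annual_losses, 1.0 - 1.0 / return_period_years)

# Illustrative use: 100,000 simulated years of portfolio-level cyber losses
rng = np.random.default_rng(7)
simulated_annual_losses = rng.lognormal(mean=15.0, sigma=1.5, size=100_000)
print(f"1-in-250-year loss: {loss_at_return_period(simulated_annual_losses, 250):,.0f}")
```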

Helen Yates, May 05, 2020
Climate Change: The Cost of Inaction

With pressure from multiple directions for a change in the approach to climate risk, how the insurance industry responds is under scrutiny.

Severe threats to the climate account for all of the top long-term risks in this year's World Economic Forum (WEF) "Global Risks Report." For the first time in the survey's 10-year outlook, the top five global risks in terms of likelihood are all environmental. From an industry perspective, each one of these risks has potentially significant consequences for insurance and reinsurance companies:

- Extreme weather events with major damage to property, infrastructure and loss of human life
- Failure of climate change mitigation and adaptation by governments and businesses
- Man-made environmental damage and disasters, including massive oil spills and incidents of radioactive contamination
- Major biodiversity loss and ecosystem collapse (terrestrial or marine) with irreversible consequences for the environment, resulting in severely depleted resources for humans as well as industries
- Major natural disasters such as earthquakes, tsunamis, volcanic eruptions and geomagnetic storms

"There is mounting pressure on companies from investors, regulators, customers and employees to demonstrate their resilience to rising climate volatility," says John Drzik, chairman of Marsh and McLennan Insights. "Scientific advances mean that climate risks can now be modeled with greater accuracy and incorporated into risk management and business plans. High-profile events, like recent wildfires in Australia and California, are adding pressure on companies to take action on climate risk."

In December 2019, the Bank of England introduced new measures for insurers, expecting them to assess, manage and report on the financial risks of climate change as part of the bank's 2021 Biennial Exploratory Scenario (BES) exercise. The BES builds on the Prudential Regulation Authority's Insurance Stress Test 2019, which asked insurers to stress test their assets and liabilities based on a series of future climate scenarios. The Network for Greening the Financial System shows how regulators in other countries are moving in a similar direction.

"The BES is a pioneering exercise, which builds on the considerable progress in addressing climate-related risks that has already been made by firms, central banks and regulators," said outgoing Bank of England governor Mark Carney. "Climate change will affect the value of virtually every financial asset; the BES will help ensure the core of our financial system is resilient to those changes."

The insurance industry's approach to climate change is evolving. Industry-backed groups such as ClimateWise have been set up to respond to the challenges posed by climate change while also influencing policymakers. "Given the continual growth in exposure to natural catastrophes, insurance can no longer simply rely on a strategy of assessing and re-pricing risk," says Maurice Tulloch, former chair of ClimateWise and CEO of international insurance at Aviva.
"Doing so threatens a rise of uninsurable markets."

The Cost of Extreme Events

In the past, property catastrophe (re)insurers were able to recalibrate their perception of natural catastrophe risk on an annual basis, as policies came up for renewal, believing that changes to hazard frequency and/or severity would occur incrementally over time. However, it has become apparent that some natural hazards have a much greater climate footprint than had previously been imagined. Attribution studies are helping insurers and other stakeholders to measure the financial impact of climate change on a specific event.

"You have had events in the last few years that have a climate change signature to them," says Robert Muir-Wood, chief research officer of science and technology at RMS. "That could include wildfire in California or extraordinary amounts of rainfall during Hurricane Harvey over Houston, or the intensity of hurricanes in the Caribbean, such as Irma, Maria and Dorian.

"These events appear to be more intense and severe than those that have occurred in the past," he continues. "Attribution studies are corroborating the fact that these natural disasters really do have a climate change signature. It was a bit experimental to start with, but now it's just become a regular part of the picture, that after every event a designated attribution study program will be undertaken … often by more than one climate lab.

"In the past it was a rather futile argument whether or not an event had a greater impact because of climate change, because you couldn't really prove the point," he adds. "Now it's possible to say not only if an event has a climate change influence, but by how much. The issue isn't whether something was or was not climate change, it's that climate change has affected the probability of an event like that by this amount. That is the nature of the conversation now, which is an intelligent way of thinking about it."

Record catastrophe losses in 2017 and 2018 — with combined claims costing insurers US$230 billion, according to Swiss Re sigma — have had a significant impact on the competitive and financial position of many property catastrophe (re)insurers. The loss tally from 2019 was less severe, with global insurance losses below the 10-year average at US$56 billion, but Typhoons Faxai and Hagibis caused significant damage to Japan when they struck just weeks apart in September and October.

"It can be argued that the insurance industry is the only sector that is going to be able to absorb the losses from climate change," adds Muir-Wood. "Companies already feel they are picking up losses in this area and it's a bit uncharted — you can't just use the average of history. It doesn't really work anymore. So, we need to provide the models that give our clients the comfort of knowing how to handle and price climate change risks in anticipation."

The Cost of Short-Termism

While climate change is clearly on the agenda of the boards of international insurance and reinsurance firms, its emphasis differs from company to company, according to the Geneva Association.
In a report, the industry think tank found that insurers are hindered from scaling up their contribution to climate adaptation and mitigation by barriers imposed at a public policy and regulatory level. The need to take a long-term view on climate change is at odds with the pressures that insurance companies are under as public and regulated entities. Shareholder expectations and political demands to keep insurance rates affordable are in conflict with the need to charge a risk-adjusted price or reduce exposures in regions that are highly catastrophe exposed.

This need to protect property owners from full risk pricing became an election issue in the Florida market, where the state-owned carrier Florida Citizens supported customers with effectively subsidized premiums. The disproportionate emphasis on using the historical record as a means of modeling the probability of future losses is a further challenge for the private market operating in the state.

"In the past when insurers were confronted with climate change, they were comfortable with the sense that they could always put up the price or avoid writing the business if the risk got too high," says Muir-Wood. "But I don't think that's a credible position anymore. We see situations, such as in California, where insurers are told they should already have priced in climate change risk and they need to use the average of the last 30 years, and that's obviously a challenge for the solvency of insurers.

"The Florida Insurance Commissioner's function is more weighted to look after the interests of consumers around insurance prices, and they maintain a very strong line that risk models should be calibrated against the long-term historical averages," he continues. "And they've said that both in Florida for hurricane and in California for wildfire. And in a time of change and a time of increased risk, that position is clearly not in the interest of insurers, and they need to be thinking carefully about that.

"Regulators want to be up to speed on this," he adds. "If levels of risk are increasing, they need to make sure that (re)insurance companies can remain solvent. That they have enough capital to take on those risks. And supervisors will expect the companies they regulate to turn up with extremely good arguments and a demonstration of the data behind their position as to how they are pricing their risk and managing their portfolios."

The Reputational Cost of Inaction

Despite the persistence of near-term pressures, a lack of action and of a long-term view on climate change is no longer a viable option for the industry. In part, this is due to a mounting reputational cost. European and Australian (re)insurers have, for instance, been more proactive in divesting from fossil fuels than their American and Asian counterparts. This is expected to change as negative attention mounts in both mainstream and social media. The number of insurers withdrawing cover for coal more than doubled in 2019, with coal exit policies announced by 17 (re)insurance companies.
"The role of insurers is to manage society's risks — it is their duty and in their own interest to help avoid climate breakdown," says Peter Bosshard, coordinator of the Unfriend Coal campaign. "The industry's retreat from coal is gathering pace as public pressure on the fossil fuel industry and its supporters grows."

The influence of climate change activists such as Greta Thunberg, the actions of NGO pressure groups like Unfriend Coal and growing climate change disclosure requirements are building critical momentum and scrutiny into the action (or lack thereof) taken by insurance senior management.

"If you are in the driver's seat of an insurance company and you know your customers' attitudes are shifting quite fast, then you need to avoid looking as though you are behind the curve," says Muir-Wood. "Quite clearly there is a reputational side to this. Attitudes are changing, and as an industry we should anticipate that all sorts of things that are tolerated today will become unacceptable in the future."

To understand your organization's potential exposure to climate change, contact the RMS team here.
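A brief aside on the attribution arithmetic Muir-Wood describes above: event attribution studies typically express "by how much" climate change has shifted an event's likelihood as a risk ratio or a fraction of attributable risk (FAR). The probabilities below are invented purely to illustrate the calculation; they are not taken from any study cited in this article.

```python
# Illustrative only: probabilities are made up for the example.
# p_natural: estimated annual probability of an event of this severity in a
#            counterfactual climate without human influence
# p_actual:  estimated annual probability in today's climate
p_natural = 0.01
p_actual = 0.03

risk_ratio = p_actual / p_natural   # event is 3x more likely in today's climate
far = 1.0 - p_natural / p_actual    # fraction of attributable risk, ~0.67 here

print(f"Risk ratio: {risk_ratio:.1f}, FAR: {far:.2f}")
# Read as: under these assumed probabilities, roughly two-thirds of the event's
# current likelihood is attributable to the change in climate.
```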

Helen Yates, September 06, 2019
Insurance: The next 10 years

Mohsen Rahnama, Cihan Biyikoglu and Moe Khosravy of RMS look to 2029, consider the changes the (re)insurance industry will have undergone and explain why all roads lead to a platform.

Over the last 30 years, catastrophe models have become an integral part of the insurance industry for portfolio risk management. During this time, the RMS model suite has evolved and expanded from the initial IRAS model — which covered California earthquake — to a comprehensive and diverse set of models covering over 100 peril-country combinations all over the world. RMS Risk Intelligence™, an open and flexible platform, was recently launched, and it was built to enable better risk management and support profitable risk selection.

Since the earliest versions of catastrophe models, significant advances have been made in both technology and computing power. These advances allow for a more comprehensive application of new science in risk modeling and make it possible for modelers to address key sources of model and loss uncertainty in a more systematic way.

These and other significant changes over the last decade are shaping the future of insurance. By 2029, the industry will be fully digitized, presenting even more opportunity for disruption in an era of technological advances. In what is likely to remain a highly competitive environment, market participants will need to differentiate based on the power of computing speed and the ability to mine and extract value from data to inform quick, risk-based decisions.

Laying the Foundations

So how did we get here? Over the past few decades we have witnessed several major natural catastrophes, including Hurricanes Andrew, Katrina and Sandy; the Northridge, Kobe, Maule, Tōhoku and Christchurch Earthquakes; and costly hurricanes and California wildfires in 2017 and 2018. Further, human-made catastrophes have included the terrorist attacks of 9/11 and major cyberattacks, such as WannaCry and NotPetya.

Each of these events has changed the landscape of risk assessment, underwriting and portfolio management. Combining the lessons learned from past events, including billions of dollars of loss data, with new technology has enhanced the risk modeling methodology, resulting in more robust models and a more effective way to quantify risk across diverse regions and perils.

The sophistication of catastrophe models has increased as technology has enabled a better understanding of root causes and behavior of events, and it has improved analysis of their impact. Technology has also equipped the industry with more sophisticated tools to harness larger datasets and run more computationally intensive analytics. These new models are designed to translate finer-grained data into deeper and more detailed insights. Consequently, we are creating better models while also ensuring model users can make better use of model results through more sophisticated tools and applications.

A Collaborative Approach

In the last decade, the pace at which technology has advanced is compelling. Emerging technology has caused the insurance industry to question if it is responding quickly and effectively to take advantage of new opportunities. In today's digital world, many segments of the industry are leveraging the power and capacity enabled by Cloud-computing environments to conduct intensive data analysis using robust analytics.
Such an approach empowers the industry by allowing information to be accessed quickly, whenever it is needed, to make effective, fully informed decisions. The development of a standardized, open platform creates smooth workflows and allows for rapid advancement, information sharing and collaboration in growing common applications.

The future of communication between various parties across the insurance value chain — insurers, brokers, reinsurers, supervisors and capital markets — will be vastly different from what it is today. By 2029, we anticipate the transfer of data, use of analytics and other collaborations will be taking place across a common platform. The benefits will include increased efficiency, more accurate data collection and improvements in underwriting workflow. A collaborative platform will also enable more robust and informed risk assessments, portfolio rollout processes and risk transfers. Further, as data is exchanged it will be enriched and augmented using new machine learning and AI techniques.

An Elastic Platform

We continue to see technology evolve at a very rapid pace. Infrastructure continues to improve as the cost of storage declines and computational speed increases. Across the board, the incremental cost of computing technology has come down.

Software tools have evolved accordingly, with modern big data systems now capable of handling hundreds if not thousands of terabytes of data. Improved programming frameworks allow for more seamless parallel programming. User-interface components reveal data in ways that were not possible in the past. Furthermore, this collection of phenomenal advances is now available in the Cloud, with the added benefit that it is continuously self-improving to support growing commercial demands.

In addition to helping avoid built-in obsolescence, the Cloud offers "elasticity." Elasticity means accessing many machines when you need them and fewer when you don't. It means storage that can dynamically grow and shrink, and computing capacity that can follow the ebb and flow of demand. In our world of insurance and data analytics, the macro cycles of renewal seasons and micromodeling demand bursts can both be accommodated through the elastic nature of the Cloud. In an elastic world, the actual cost of supercomputing goes down, and we can confidently guarantee fast response times.

Empowering Underwriters

A decade from now, the industry will look very different, not least due to changes within the workforce and the risk landscape. First-movers and fast-followers will be in a position of competitive advantage come 2029, in an industry where large incumbents are already partnering with more agile "insurtech" startups.

The role of the intermediary will continue to evolve, and at every stage of risk transfer — from insured to primary insurer, reinsurer and into the capital markets — data sharing and standardization will become key success factors. Over the next 10 years, as data becomes more standardized and more widely shared, the concept of blockchain, or distributed ledger technology, will move closer to becoming a reality.

This standardization, collaboration and use of advanced analytics are essential to the future of the industry. Machine learning and AI, highly sophisticated models and enhanced computational power will enable underwriters to improve their risk selection and make quick, highly informed decisions.
And this ability will enhance the role of the insurance industry in society, in a changing and altogether riskier world. The tremendous protection gap can only be tackled when there is more detailed insight and differentiation around each individual risk. When there is greater insight into the underlying risk, there is less need for conservatism, risks become more accurately and competitively priced, and (re)insurers are able to innovate to provide products and solutions for new and emerging exposures.

Over the coming decade, models will require advanced computing technology to fully harness the power of big data. Underwater robots are now probing previously unmapped ocean waters to detect changes in temperatures, currents, sea level and coastal flooding. Drones are surveying our built environment in fine detail. Artificial intelligence and machine learning algorithms are searching for patterns of climate change in these new datasets, and climate models are reconstructing the past and predicting the future at a resolution never before possible.

These emerging technologies and datasets will help meet our industry's insatiable demand for more robust risk assessment at the level of an individual asset. This explosion of data will fundamentally change the way we think about model execution and development, as well as the end-to-end software infrastructure. Platforms will need to be dynamic and forward-looking versus static and historic in the way they acquire, train and execute on data.

The industry has already transformed considerably over the past five years, despite traditionally being considered a laggard in terms of its technology adoption. The foundation is firmly in place for a further shift over the next decade, where all roads are leading to a common, collaborative industry platform, where participants are willing to share data and insights and, as they do so, open up new markets and opportunities.

RMS Risk Intelligence

The analytical and computational power of the Risk Intelligence (RI) platform enables the RMS model development team to bring the latest science and research to the RMS catastrophe peril model suite and build the next generation of high-definition models. The functionality and high performance of RI allows the RMS team to assess elements of model and loss uncertainty in a more robust way than before.

The framework of RI is flexible, modular and scalable, allowing the rapid integration of future knowledge with a swifter implementation and update cycle. The open modeling platform allows model users to extract more value from their claims experience to develop vulnerability functions that represent a view of risk specific to their data, or to use custom-built alternatives. This enables users to perform a wide range of sensitivity tests and take ownership of their view of risk.

Mohsen Rahnama is chief risk modeling officer and executive vice president, models and data; Cihan Biyikoglu is executive vice president, product; and Moe Khosravy is executive vice president, software and platform at RMS.

Helen Yates, September 06, 2019
Severe Convective Storms: A New Peak Peril?

Severe convective storms (SCS) have driven U.S. insured catastrophe losses in recent years, with both attritional and major single-event claims now rivaling an average hurricane season. EXPOSURE looks at why SCS losses are rising and asks how (re)insurers should be responding.

At the time of writing, 2019 was already shaping up to be another active season for U.S. severe convective storms (SCS), with at least eight tornadoes daily over a period of 12 consecutive days in May. It was the most May tornadoes since 2015, with no fewer than seven outbreaks of SCS across central and eastern parts of the U.S. According to data from the National Oceanic and Atmospheric Administration (NOAA), there were 555 preliminary tornado reports, more than double the average of 276 for the month over the period 1991-2010.

According to the current numbers, May 2019 produced the second-highest number of reported tornadoes for any month on record after April 2011, which broke multiple records in relation to SCS and tornado touchdowns. It continues a trend set over the past two decades, which has seen SCS losses increasing significantly and steadily. In 2018, losses amounted to US$18.8 billion, of which US$14.1 billion was insured. This compares to insured losses of US$15.6 billion from hurricanes in the same period.

While losses from SCS are often the buildup of losses from multiple events, there are examples of single events costing insurers and reinsurers over US$3 billion in claims. This includes the costliest SCS to date, which hit Tuscaloosa, Alabama, in April 2011, involving several tornado touchdowns and causing US$7.9 billion in insured damage. The second-most-costly SCS occurred in May of the same year, striking Joplin, Missouri, and other locations, resulting in insured losses of nearly US$7.6 billion.

According to RMS models, average losses from SCS now exceed US$15 billion annually and are in the same range as the hurricane average annual loss (AAL), which is also backed up by independently published scientific research.

"The losses in 2011 and 2012 were real eye-openers," says Rajkiran Vojjala, vice president of modeling at RMS. "SCS is no longer a peril with events that cost a few hundred million dollars. You could have cat losses of US$10 billion in today's money if there were events similar to those in April 2011."

Nearly a third of all average annual reported tornadoes occur in the states of Texas, Oklahoma, Kansas and Nebraska, all of which lie within "Tornado Alley." This is where cold, dry polar air meets warm, moist air moving up from the Gulf of Mexico, causing strong convective activity. "A typical SCS swath affects many states. So the extent is large, unlike, say, wildfire, which is truly localized to a small particular region," says Vojjala.

Research suggests the annual number of Enhanced Fujita (EF) scale EF2 and stronger tornadoes hitting the U.S. has trended upward over the past 20 years; however, there is some doubt over whether this is a real meteorological trend. One explanation could be that increased observational practices simply mean that such weather phenomena are more likely to be recorded, particularly in less populated regions.

According to Juergen Grieser, senior director of modeling at RMS, there is a debate over whether part of the increase in claims relating to SCS could be attributed to climate change.
"A warmer climate means a weaker jet stream, which should lead to less organized convection while the energy of convection might increase," he says. "The trend in the scientific discussion is that there might be fewer but more-severe events."

Claims severity rather than claims frequency is a more significant driver of losses relating to hail events, he adds. "We have an increase in hail losses of about 11 percent per year over the last 15 years, which is quite a lot. But 7.5 percent of that is from an increase in the cost of individual claims," explains Grieser. "So, while the claims frequency has also increased in this period, the individual claim is more expensive now than it was ever before."

Claims Go 'Through the Roof'

Another big driver of loss is likely to be aging roofs and the increasing exposure at risk from SCS. The contribution of roof age was explored in a blog last year by Stephen Cusack, director of model development at RMS. He noted that one of the biggest changes in residential exposure to SCS over the past two decades has been the rise in the median age of housing from 30 years in 2001 to 37 years in 2013.

A changing insurance industry climate is also a driver of increased losses, thinks Vojjala. "There has been a change in public perception on claiming, whereby even cosmetic damage to roofs is now being claimed and contractors are chasing hailstorms to see what damage might have been caused," he says. "So, there is more awareness and that has led to higher losses.

"The insurance products for hail and tornado have grown and so those perils are being insured more, and there are different types of coverage," he notes. "Most insurers now offer not replacement cost but only the actual value of the roofs to alleviate some of the rising cost of claims. On the flip side, if they do continue offering full replacement coverage and a hurricane hits in some of those areas, you now have better roofs."

How insurance companies approach the peril is changing as a result of rising claims. "Historically, insurance and reinsurance clients have viewed SCS as an attritional loss, but in the last five to 10 years the changing trends have altered that perception," says Vojjala. "That's where there is this need for high-resolution modeling, which increasingly our clients have been asking for to improve their exposure management practices.

"With SCS also having catastrophic losses, it has stoked interest from the ILS community as well, who are also experimenting with parametric triggers for SCS," he adds. "We usually see this on the earthquake or hurricane side, but increasingly we are seeing it with SCS as well."
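Returning to Grieser's hail figures, the split between claim severity and claim frequency can be sanity-checked with simple growth arithmetic, under the common simplifying assumption that annual hail losses are the product of claim frequency and average claim severity. The figures are those quoted above; the decomposition itself is only a back-of-the-envelope check.

```python
# Assumption: annual hail loss ~ claim frequency x average claim severity,
# so annual growth rates compound: (1 + total) = (1 + frequency) x (1 + severity).
total_growth = 0.110     # ~11% per year rise in hail losses over the last 15 years
severity_growth = 0.075  # ~7.5% per year rise in the cost of an individual claim

implied_frequency_growth = (1.0 + total_growth) / (1.0 + severity_growth) - 1.0
print(f"Implied claim frequency growth: {implied_frequency_growth:.1%} per year")
# ~3.3% per year: most of the rise comes from costlier claims, not from more claims.
```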

Helen Yates, September 06, 2019
Risk in 2030

At this year's RMS Exceedance conference in Miami, Robert Muir-Wood and Michael Steel imagined 10 future risks.

Helen Yates, September 06, 2019
Ridgecrest: A Wake-Up Call

Marleen Nyst and Nilesh Shome of RMS explore some of the lessons and implications from the recent sequence of earthquakes in California.

On the morning of July 4, 2019, the small town of Ridgecrest in California's Mojave Desert unexpectedly found itself at the center of a major news story after a magnitude 6.4 earthquake occurred close by. This earthquake later transpired to be a foreshock of a magnitude 7.1 earthquake the following day, the strongest earthquake to hit the state for 20 years. These events, part of a series of earthquakes and aftershocks that were felt by millions of people across the state, briefly reignited awareness of the threat posed by earthquakes in California.

Fortunately, damage from the Ridgecrest earthquake sequence was relatively limited. With the event not causing a widespread social or economic impact, its passage through the news agenda was relatively swift. But there are several reasons why an event such as the Ridgecrest earthquake sequence should be a focus of attention both for the insurance industry and for the residents and local authorities in California.

"We don't want to minimize the experiences of those whose homes or property were damaged or who were injured when these two powerful earthquakes struck, because for them these earthquakes will have a lasting impact, and they face some difficult days ahead," explains Glenn Pomeroy, chief executive of the California Earthquake Authority.

"However, if this series of earthquakes had happened in a more densely populated area or an area with thousands of very old, vulnerable homes, such as Los Angeles or the San Francisco Bay Area, this state would be facing a far different economic future than it is today — potentially a massive financial crisis," Pomeroy says.

Although one of the most populous U.S. states, California's population is mostly concentrated in metropolitan areas. A major earthquake in one of these areas could have repercussions for both the domestic and international economy.

Low Probability, High Impact

Earthquake is a low probability, high impact peril. In California, earthquake risk awareness is low, both among the general public and many (re)insurers. The peril has not caused a major insured loss for 25 years, the last being the magnitude 6.7 Northridge earthquake in 1994.

California earthquake has the potential to cause large-scale insured and economic damage. A repeat of the Northridge event would likely cost the insurance industry around US$30 billion today, according to the latest version of the RMS® North America Earthquake Models, and Northridge is far from a worst-case scenario.

From an insurance perspective, one of the most significant earthquake events on record would be the magnitude 9.0 Tōhoku Earthquake and Tsunami in 2011. For California, the 1906 magnitude 7.8 San Francisco earthquake, when Lloyd's underwriter Cuthbert Heath famously instructed his San Franciscan agent to "pay all of our policyholders in full, irrespective of the terms of their policies," remains historically significant. Heath's actions led to a Lloyd's payout of around US$50 million at the time and helped cement Lloyd's reputation in the U.S. market. RMS models suggest a repeat of this event today could cost the insurance industry around US$50 billion.
But the economic cost of such an event could be around six times the insurance bill — as much as US$300 billion — even before considering damage to infrastructure and government buildings, due to the surprisingly low penetration of earthquake insurance in the state.

Events such as the 1906 earthquake and even Northridge are too far in the past to remain in public consciousness, and the lack of awareness of the peril's damage potential is demonstrated by the low take-up of earthquake insurance in the state. "Because large, damaging earthquakes don't happen very frequently, and we never know when they will happen, for many people it's out of sight, out of mind. They simply think it won't happen to them," Pomeroy says.

Across California, an average of just 12 percent to 14 percent of homeowners have earthquake insurance. Take-up varies across the state, with some high-risk regions, such as the San Francisco Bay Area, experiencing take-up below the state average. Take-up tends to be slightly higher in Southern California and is around 20 percent in Los Angeles and Orange counties. Take-up will typically increase in the aftermath of an event as public awareness rises, but will rapidly fall as the risk fades from memory. As with any low probability, high impact event, there is a danger the public will not be well prepared when a major event strikes.

The insurance industry can take steps to address this challenge, particularly by working to increase awareness of earthquake risk and actively promoting the importance of having insurance coverage for faster recovery. RMS and its insurance partners have also been working to improve society's resilience against risks such as earthquake, through initiatives such as the 100 Resilient Cities program.

Understanding the Risk

While the tools to model and understand earthquake risk are improving all the time, there remain several unknowns that underwriters should be aware of. One of the reasons the Ridgecrest earthquake came as such a surprise was that the fault on which it occurred was not one that seismologists knew existed. Several other recent earthquakes — such as the 2014 Napa event, the Landers and Big Bear Earthquakes in 1992, and the Loma Prieta Earthquake in 1989 — took place on faults or fault strands that were previously unknown or thought to be inactive.

As well as not having a full picture of where the faults may lie, scientific understanding of how multiple faults can link together to form a larger event is also changing. Events such as the Kaikoura Earthquake in New Zealand in 2016 and the Baja California Earthquake in Mexico in 2010 have helped inform new scientific thinking that faults can link together, causing larger, more damaging earthquakes. The RMS North America Earthquake Models have evolved to factor in this thinking and capture multifault ruptures based on the latest research results. In addition, studying the interaction between the faults that ruptured in the Ridgecrest events will allow RMS to improve the fault connectivity in the models.

A further lesson from New Zealand came via the 2011 Christchurch Earthquake, which demonstrated how soil liquefaction can be a significant loss driver in areas with susceptible soil conditions. The San Francisco Bay Area, an important national and international economic hub, could suffer a similar impact in the event of a major earthquake.
Across the area, there has been significant residential and commercial development on artificial landfill over the last 100 years, and these areas are prone to significant liquefaction damage, similar to what was observed in Christchurch.

Location, Location, Location

Clearly, the location of an earthquake is critical to the scale of damage and the insured and economic impact from an event. Ridgecrest is situated roughly 200 kilometers north of Los Angeles. Had the recent earthquake sequence occurred beneath Los Angeles instead, it is plausible that the insured cost could have been in excess of US$100 billion.

The Puente Hills Fault, which sits underneath downtown Los Angeles, wasn't discovered until around the turn of the century. A magnitude 6.8 Puente Hills event could cause an insured loss of US$78.6 billion, and a magnitude 7.3 Newport-Inglewood event would cost an estimated US$77.1 billion, according to RMS modeling. These are just two examples from the RMS stochastic event set of a similar magnitude to the Ridgecrest events that could have a significant social, economic and insured loss impact if they took place elsewhere in the state.

The RMS model estimates that magnitude 7 earthquakes in California could cause insurance industry losses ranging from US$20,000 to US$20 billion, but the maximum loss could be over US$100 billion if one occurred in a high population center such as Los Angeles. The losses from the Ridgecrest event were on the low side of this range because the event occurred in a less populated area. For the California Earthquake Authority's portfolio in Los Angeles County, a large loss event of US$10 billion or greater can be expected approximately every 30 years.

As with any major catastrophe, several factors can drive up the insured loss bill, including post-event loss amplification and contingent business interruption, given the potential scale of disruption. In Sacramento, there is also a risk of failure of the levee system. Fire following earthquake was a significant cause of damage after the 1906 San Francisco Earthquake and was estimated to account for around 40 percent of the overall loss from that event. It is, however, expected that fire would make a much smaller contribution to future events, given modern construction materials and methods and fire suppressant systems.

Political pressure to settle claims could also drive up the loss total from an event. Lawmakers could put pressure on the CEA and other insurers to settle claims quickly, as has been the case in the aftermath of other catastrophes, such as Hurricane Sandy. The California Earthquake Authority has recommended that homes built prior to 1980 be seismically retrofitted to make them less vulnerable to earthquake damage.

"We all need to learn the lesson of Ridgecrest: California needs to be better prepared for the next big earthquake because it's sure to come," Pomeroy says. "We recommend people consider earthquake insurance to protect themselves financially," he continues. "The government's not going to come in and rebuild everybody's home, and a regular residential insurance policy does not cover earthquake damage. The only way to be covered for earthquake damage is to have an additional earthquake insurance policy in place.

"Close to 90 percent of the state does not have an earthquake insurance policy in place. Let this be the wake-up call that we all need to get prepared."
