Helen Yates
TreatyIQ: Striking a difficult balance
May 05, 2020

As treaty underwriters prepare to navigate another challenging renewal season, compounded by an uncertain economic outlook, many are looking to new technological solutions to help them capitalize on nascent optimism around rates and build sustainable profitability. EXPOSURE explores the importance of reliable marginal impact analytics to bias underwriting decisions in favor of diversification.

The fall in investment profits for insurance and reinsurance companies as a result of the impact of COVID-19 on financial markets is likely to encourage an upswing in reinsurance pricing. Low investment returns are one of the factors that facilitate a hardening market, making an underwriting profit even more of an imperative. As the midyear renewals approach, reinsurance companies are cautiously optimistic that the reinsurance rate on line will continue on an upward trend. According to Willis Towers Watson, pricing was up significantly on loss-affected accounts as of April 1, but elsewhere rate rises were more modest. This suggests that at this point in the cycle reinsurers cannot count on rate increases, presenting market pricing uncertainty that will need to be navigated in real time during the renewals.

In years of weaker market returns, investment in tools that deliver analytical rigor and agile pricing to underwriters can be difficult to justify. Yet in many cases, existing analytical processes expose blind spots in the assessment of a cedant portfolio during busy periods, as well as latency in the analysis of current portfolio risk positions. These inefficiencies will be more pronounced in the current work-from-home era and will leave many underwriters wondering how they can quickly identify and secure the best deals for their portfolios.

Reducing volatility through the cycle

Both halves of the underwriting cycle can put pressure on reinsurers' underwriting decisions. Whether prices are hardening or softening, market forces can push reinsurers toward higher volatility.
“Part of the interplay in the treaty underwriting guidelines has to do with diversification,” explains Jesse Nickerson, senior director, pricing actuary at RMS. “Underwriters generally want to write risks that are diversifying in nature. However, when rates are low and competition is fierce, this desire is sometimes overwhelmed by pressure to put capital to use. Underwriting guidelines then have a somewhat natural tendency to slip as risks are written at inadequate prices.

“The reduced competition in the market during the period of low profitability triggers increases in rates, and the bounce upward begins,” he continues. “As rates rise and profitability increases, another loosening of underwriting guidelines can occur because all business begins to look like good business. This cycle is a challenge for all of these reinsurance companies to manage, as it can add significant volatility to their book.”

Tools such as RMS TreatyIQ™ help underwriters carry out marginal impact analytics, which consider the view of risk if new books of business are included in a treaty portfolio. Treaty underwriters are typically tasked with balancing the profitability of individual treaties against their impact on aggregate portfolio positions.

“One of the things that underwriters take into account as part of the underwriting process is, ‘What is the impact of this potential piece of business on my current portfolio?’” explains Oli Morran, director of product at RMS. “It’s just like an investment decision, except that they’re investing risk capital rather than investment capital.
“In order to get insight into marginal impact, treaty underwriters need to have a view of their portfolio in the application, and not just their current portfolio as it looked last week, last month or last quarter, but how it looks today,” he continues. “So, it collects all the treaty contracts you’ve underwritten and rolls them up together to get to your current portfolio position.”

Based on this understanding of a reinsurer’s aggregate risk position, underwriters can see in real time what impact any given piece of business would have, helping to inform how much capacity they are willing to bring to bear, and at what price. As reinsurers navigate the current, asymmetric market renewals, with the added challenge that increased home-working presents, such insight will allow them to make the right judgments in a dynamic situation.

“Treaty underwriters can import that loss data into TreatyIQ, do some quick analysis and ‘math magic’ to make it work for their view of risk, and then get a report in the app that tells them profitability metrics on each of the treaties in the structure, so they can configure the right balance of participation in each treaty when quoting to the broker or cedant,” says Morran.

An art and science

Relationships have always been central to treaty underwriting, with reinsurers selecting cedants to partner with based on many years of experience and association. Wherever the industry is in the market cycle, these important bonds help shape the renewal process at key discussion points in the calendar. New tools, such as the TreatyIQ application, are enhancing both the “art” and the “science” parts of the underwriting equation.
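The marginal impact idea described earlier can be illustrated with a minimal sketch. This is not the TreatyIQ implementation: the metric (tail value-at-risk), the loss distributions and all numbers are illustrative assumptions. It shows why a diversifying treaty and a correlated treaty with the same expected loss can have very different impacts on a portfolio's tail risk:

```python
import random

def tvar(losses, q=0.99):
    """Tail value-at-risk: mean loss beyond the q-th percentile."""
    ordered = sorted(losses)
    cut = int(len(ordered) * q)
    tail = ordered[cut:]
    return sum(tail) / len(tail)

def marginal_impact(portfolio, candidate, q=0.99):
    """Change in tail risk from adding a candidate treaty's simulated
    losses (same event trials, so losses add trial by trial)."""
    combined = [p + c for p, c in zip(portfolio, candidate)]
    return tvar(combined, q) - tvar(portfolio, q)

random.seed(42)
n = 50_000

# Hypothetical simulated annual losses for the current portfolio
portfolio = [random.lognormvariate(15, 1.0) for _ in range(n)]

# Two candidates with similar expected loss: one moves with the
# portfolio (same peril/region), one is independent (diversifying)
correlated = [0.05 * p for p in portfolio]
diversifying = [0.05 * random.lognormvariate(15, 1.0) for _ in range(n)]

print(f"correlated adds:   {marginal_impact(portfolio, correlated):,.0f}")
print(f"diversifying adds: {marginal_impact(portfolio, diversifying):,.0f}")
```

With these seeded trials, the correlated candidate adds materially more tail risk than the independent one despite an equal expected loss, which is exactly the bias toward diversification the article describes.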
These tools are reducing the potential for volatility as underwriters steer portfolios through the reinsurance cycle, while harnessing experience and pricing artistry in an auditable way. While much of insurtech has so far focused on the underlying insurance market, reinsurers are beginning to benefit from applications that offer them real-time insights. An informed approach can help identify the most profitable accounts and steer underwriters toward business that best complements their company’s existing portfolio, overall strategy and risk appetite.

Reinsurance underwriters can now decide whether to renew, and at what price, based on a true understanding of what the one risk sitting on their desk could do to the risks they already hold. With hundreds of treaty programs to assess during a busy renewal season, such insights support underwriters as they decide which deals to underwrite and what portion of each treaty to take on.

A constant challenge for treaty underwriters is striking the right balance between managing complex and often longstanding relationships with cedants and brokers, and ensuring that underwritten business complements the existing portfolio. Maintaining underwriting discipline while nurturing all-important client relationships is a more straightforward task when data and insight are readily available, says Nickerson.

“Much of the strength of TreatyIQ is in the efficiency of workflows in augmenting the insight underwriters have at their fingertips. The faster they can get back to a cedant or broker, the better it is for the relationship. The more completely they understand the impact to their portfolio, the better it is for their bottom line.”

RMS model data has long been a foundation of reinsurance treaty transactions, providing the common market view of risk for assessing probable catastrophe losses to a cedant’s portfolio.
But using modeled data in treaty pricing analytics has traditionally been a multisystem undertaking, involving a supporting cast of actuaries and cat modelers. RMS Risk Intelligence™, a modular risk analytics platform, has enabled RMS to develop TreatyIQ as a solution to the analytics needs of treaty underwriters, covering pricing and portfolio roll-up, and to close the analytical gaps that muddy pricing insights.

“TreatyIQ allows you to pass losses through potential treaties and quickly see which are the most profitable based on a user’s unique pricing algorithms and risk tolerance,” continues Nickerson. “You can see which have the most positive impact on your portfolio, allowing you to go back to the broker or cedant and make a more informed pitch. Ultimately, it allows underwriters to optimize internally against the constraints that exist in their world at a time of great uncertainty and change.”
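The loss pass-through Nickerson describes can be sketched for a simple excess-of-loss layer. The layer terms, premiums and loss trials below are hypothetical, and the profit metric is deliberately crude (premium minus expected ceded loss, ignoring expenses and cost of capital):

```python
def ceded_loss(gross: float, attachment: float, limit: float) -> float:
    """Loss ceded to an excess-of-loss layer: the part of a gross loss
    above the attachment point, capped at the layer limit."""
    return min(max(gross - attachment, 0.0), limit)

def layer_profit(event_losses, attachment, limit, premium):
    """Premium minus expected ceded loss over simulated annual trials."""
    expected = sum(ceded_loss(x, attachment, limit) for x in event_losses) / len(event_losses)
    return premium - expected

# Hypothetical simulated annual cedant losses, in millions
losses = [0, 0, 2, 5, 8, 14, 20, 35, 60, 120]

# Pass the same losses through two quoted layers to compare them
low_layer = layer_profit(losses, attachment=10, limit=40, premium=13.0)
high_layer = layer_profit(losses, attachment=50, limit=100, premium=9.0)
print(round(low_layer, 2), round(high_layer, 2))  # 1.1 1.0
```

Running the same trial set through every quoted structure is what lets an underwriter rank treaties on a consistent basis before deciding how much participation to take in each.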

NIGEL ALLEN
The data difference
May 05, 2020

The value of data as a driver of business decisions has grown exponentially as generating sustainable underwriting profit becomes the primary focus for companies responding to diminished investment yields. Increased scrutiny of risk selection is more important than ever to maintain underwriting margins, and high-caliber, insightful risk data is critical for the data analytics that support each risk decision.

The insurance industry is in a transformational phase, with profit margins stretched in a highly competitive marketplace. Changing customer dynamics and new technologies are driving demand for more personalized solutions delivered in real time, while companies work to boost performance, increase operational efficiency and drive greater automation. In some instances, this involves projects to overhaul legacy systems that are central to daily operations.

In such a state of market flux, access to quality data has become a primary differentiator. But there’s the rub. Companies now have access to vast amounts of data from an expanding array of sources, but how can organizations effectively distinguish good data from poor data? What differentiates the data that delivers stellar underwriting performance from the data that pushes a combined operating ratio above 100 percent?

A complete picture

“Companies are often data rich, but insight poor,” believes Jordan Byk, senior director, product management at RMS. “The amount of data available to the (re)insurance industry is staggering, but creating the appropriate insights that will give them a competitive advantage is the real challenge. To do that, data consumers need to be able to separate ‘good’ from ‘bad’ and identify what constitutes ‘great’ data.”

For Byk, a characteristic of “great data” is the speed with which it drives confident decision-making that, in turn, guides the business in the desired direction.
“What I mean by speed here is not just performance, but that the data is reliable and insightful enough that decisions can be made immediately, and all are confident that the decisions fit within the risk parameters set by the company for profitable growth.

“We’ve solved the speed and reliability aspect by generating pre-compiled, model-derived data at resolutions intelligent for each peril,” he adds.

There has been much focus on increasing data-resolution levels, but does higher resolution automatically elevate the value of data in risk decision-making? The drive to deliver data at 10-, five- or even one-meter resolution may not be the main ingredient of truly great data.

“Often higher resolution is perceived as better,” explains Oliver Smith, senior product manager at RMS, “but that is not always the case. While resolution is clearly a core component of our modeling capabilities at RMS, the ultimate goal is to provide a complete data picture and ensure quality and reliability of underlying data.

“Resolution of the model-derived data is certainly an important factor in assessing a particular exposure,” adds Smith, “but just as important is understanding the nature of the underlying hazard and vulnerability components that drive resolution. Otherwise, you are at risk of the ‘garbage-in, garbage-out’ scenario that can foster a false sense of reliability based solely on the ‘level’ of resolution.”

The data score

The ability to assess the impact of known exposure data is particularly relevant to the extensive practice of risk scoring. Such scoring expresses a particular risk as a score from 1 to 10, 1 to 20 or on another scale that indicates low to high risk, based on an underlying definition for each value.
This enables underwriters to make quick submission assessments and supports critical decisions relating to quoting, referrals and pricing.

“Such capabilities are increasingly common and offer a fantastic mechanism for establishing underwriting guidelines, and enabling prioritization and triage of locations based on a consistent view of perceived riskiness,” says Chris Sams, senior product manager at RMS. “What is less common, however, is ‘reliable’ and superior-quality risk scoring, as many risk scores do not factor in readily available vulnerability data.”

Exposure insight is created by adjusting multiple data lenses until the risk image comes into focus. If particular lenses are missing, or there is an overreliance on one particular lens, the image can be distorted. For instance, an overreliance on hazard-related information can significantly alter the perceived exposure levels for a specific asset or location.

“Take two locations adjacent to one another that are exposed to the same wind or flood hazard,” Byk says. “One is a high-rise hotel built in 2020 and subject to the latest design standards, while the other is a wood-frame, small commercial property built in the 1980s; or one location is built at ground level with a basement, while another is elevated on piers and has no basement.

“These vulnerability factors will result in completely different loss experiences in the occurrence of a wind- or flood-related event. If you were to run the locations through our models, the annual average loss figures would vary considerably.
But if the underwriting decision is based on hazard-only scores, they will look the same until they hit the portfolio assessment, and that’s when the underwriter could face some difficult questions.”

To help clients understand the differences in vulnerability factors, RMS provides ExposureSource, a U.S. property database comprising property characteristics for 82 million residential buildings and 21 million commercial buildings. With this high-quality exposure data set, clients can make the most of the RMS risk scoring products for the U.S.

Seeing through the results

Another common shortfall with risk scores is the lack of transparency around the definitions attributed to each value. Looking at a scale of 1 to 10, for example, companies have no insight into the exposure characteristics used to categorize a particular asset or location as, say, a 4 rather than a 5 or 6.

To combat data-scoring deficiencies, RMS RiskScore values are generated by catastrophe models incorporating the trusted science and quality you expect from an RMS model, calibrated on billions of dollars of real-world claims. With consistent and reliable risk scores covering 30 countries and up to seven perils, the apparent simplicity of the RMS RiskScore hides the complexity of the big data catastrophe simulations that create it.

The scores combine hazard and vulnerability to capture not only the hazard experienced at a site, but also the susceptibility of a particular building stock when exposed to a given level of hazard. The RMS RiskScore allows for user definition of exposure characteristics such as occupancy, construction material, building height and year built. Users can also define secondary modifiers such as basement presence and first-floor height, which are critical for the assessment of flood risk, and roof shape or roof cover, which are critical for wind risk.
“It also provides clearly structured definitions for each value on the scale,” explains Smith, “providing instant insight into a risk’s damage potential at key return periods and offering a level of transparency not seen in other scoring mechanisms. For example, a score of 6 out of 10 for a 100-year earthquake event equates to an expected damage level of 15 to 20 percent. This information can then be used to support a more informed decision on whether to decline, quote or refer the submission. Equally important, the transparency allows companies to easily translate the RMS RiskScore into custom scales, per peril, to support their business needs and risk tolerances.”

Model insights at point of underwriting

While RMS model-derived data should not be considered a replacement for the sophistication offered by catastrophe modeling, it can enable underwriters to access relevant information instantaneously at the point of underwriting.

“Model usage is common practice across multiple points in the (re)insurance chain for assessing risk to individual locations, accounts and portfolios, quantifying available capacity, reinsurance placement and fulfilling regulatory requirements, to name but a few,” highlights Sams. “However, running the model takes time, and underwriting decisions, particularly those made by smaller organizations, are often made ahead of any model runs. By the time the results are generated, the exposure may already be on risk.”

In providing a range of data products into the process, RMS is helping clients select, triage and price risks before such critical decisions are made. The expanding suite of data assets is generated by its probabilistic models and represents the same science and expertise that underpins the model offering.
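The per-peril custom-scale translation Smith mentions can be sketched as a simple mapping. The band edges and action names below are hypothetical examples of one insurer's own risk tolerance, not an RMS-published scale:

```python
# Hypothetical per-peril translation of a 1-10 risk score into an
# insurer's own underwriting actions; band edges are illustrative only.
CUSTOM_BANDS = {
    "earthquake": [(3, "accept"), (6, "quote"), (8, "refer"), (10, "decline")],
    "wind":       [(4, "accept"), (7, "quote"), (9, "refer"), (10, "decline")],
}

def to_custom_scale(peril: str, score: int) -> str:
    """Return the first band whose upper edge covers the score."""
    if not 1 <= score <= 10:
        raise ValueError(f"score must be 1-10, got {score}")
    bands = CUSTOM_BANDS[peril]
    for upper, action in bands:
        if score <= upper:
            return action
    return bands[-1][1]  # unreachable while the last band ends at 10

print(to_custom_scale("earthquake", 6))  # quote
print(to_custom_scale("wind", 8))        # refer
```

Because the underlying score carries a published damage-level definition, each band edge can be tied to an explicit damage tolerance rather than an opaque label.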
“And by using APIs as the delivery vehicle,” adds Byk, “we not only provide that modeled insight instantaneously, but also integrate that data directly and seamlessly into the client’s on-premise systems at critical points in their workflow. Through this interface, companies gain access to the immense datasets that we maintain in the cloud and can simply call down risk decision information whenever they need it. While these are not designed to compete with a full model output, until the time that we have risk models that provide instant analysis, such model-derived datasets offer the speed of response that many risk decisions demand.”

A consistent and broad perspective on risk

A further factor that can instigate problems is inconsistency in data and analytics across the (re)insurance workflow. Currently, with data extracted from multiple sources and, in many cases, filtered through different lenses at various stages in the workflow, consistency from the point of underwriting through to portfolio management has been the exception rather than the norm.

“There is no doubt that the disparate nature of available data creates a disconnect between the way risks are assumed into the portfolio and how they are priced,” Smith points out. “This disconnect can cause ‘surprises’ when modeling the full portfolio, generating a different risk profile than expected or indicating inadequate pricing. By applying data generated via the same analytics and data science that is used for portfolio management, consistency can be achieved for underwriting risk selection and pricing, minimizing the potential for surprise.”

Equally important, given the scope of modeled data required by (re)insurance companies, is the need to provide users with the means to access that breadth of data from a central repository. “If you access such data at speed, including your own data coupled with external information, and apply sophisticated analytics — that is how you derive truly powerful insights,” he concludes.
“Only with that scope of reliable, insightful information instantly accessible at any point in the chain can you ensure that you’re always making fully informed decisions — that’s what great data is really about. It’s as simple as that.”
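The API-based delivery Byk describes might be consumed along these lines on the client side. The payload shape, field names and location identifier are hypothetical illustrations, not the documented RMS API; the sketch only shows how pre-compiled, model-derived scores could be parsed into a quick triage signal:

```python
import json

# Hypothetical payload a score-lookup endpoint might return for one
# location; fields are illustrative, not the documented RMS schema.
SAMPLE_RESPONSE = json.dumps({
    "location_id": "LOC-001",
    "scores": [
        {"peril": "earthquake", "score": 6},
        {"peril": "wind", "score": 3},
    ],
})

def worst_peril(payload: str):
    """Pick the highest-scoring peril from a score-lookup response,
    e.g. to drive a quick referral rule at the point of underwriting."""
    doc = json.loads(payload)
    top = max(doc["scores"], key=lambda s: s["score"])
    return top["peril"], top["score"]

peril, score = worst_peril(SAMPLE_RESPONSE)
print(peril, score)  # earthquake 6
```

Because the heavy simulation work happens upstream in the cloud, the client-side logic stays this thin, which is what makes point-of-underwriting integration practical.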

NIGEL ALLEN
This changes everything
May 05, 2020

At this year’s Exceedance®, RMS explored the key forces currently disrupting the industry, from technology, data analytics and the cloud through to rising extremes of catastrophic events like the pandemic and climate change. This coupling of technological and environmental disruption represents a true inflection point for the industry. EXPOSURE asked six experts across RMS for their views on why they believe these forces will change everything.

Cloud Computing: Moe Khosravy, executive vice president, software and platforms

How are you seeing businesses transition their workloads over to the cloud?

I have to say it’s been remarkable. We’re way past basic conversations on the value proposition of the cloud to now having deep technical discussions that are truly transformative plays. Customers are looking for solutions that seamlessly scale with their business and platforms that lower their cost of ownership while delivering capabilities that can be consumed from anywhere in the world.

Why is the cloud so important or relevant now?

It is now hard for a business to beat the benefits that the cloud offers and getting harder to justify buying and supporting complex in-house IT infrastructure. There is also a mindset shift going on — why is an in-house IT team responsible for running and supporting another vendor’s software on their systems if the vendor itself can provide that solution? This burden can now be lifted using the cloud, letting the business concentrate on what it does best.

Has the pandemic affected views of being in the cloud?

I would say absolutely. We have always emphasized the importance of cloud and true SaaS architectures to enable business continuity — allowing you to do your work from anywhere, decoupled from your IT and physical footprint. Never has the importance of this been more clearly underlined than during the past few months.
Risk Analytics: Cihan Biyikoglu, executive vice president, product

What are the specific industry challenges that risk analytics is solving or has the potential to solve?

Risk analytics really is a wide field, but in the immediate short term one of the focus areas for us is improving productivity around data. So much time is spent by businesses trying to manually process data — cleansing, completing and correcting it — and on conversion between incompatible datasets. This alone is a huge barrier just to get a single set of results. If we can take this burden away and give decision-makers the power to get results in real time with automated and efficient data handling, then I believe we will liberate them to use the latest insights to drive business results. Another important innovation here is the HD Models™. The power of the new engine, with its improved accuracy, is I believe a game changer that will give our customers a competitive edge.

How will risk analytics impact activities and capabilities within the market?

As seen in other industries, the more data you can combine, the better the analytics become — that’s the universal law of analytics. Getting all of this data onto a unified platform and combining different datasets unearths new insights, which could produce opportunities to serve customers better and drive profit or growth.

What are the longer-term implications for risk analytics?

In my view, it’s about generating more effective risk insights from analytics, resulting in better decision-making and the ability to explore new product areas with more confidence. It will spark a wave of innovation to profitably serve customers with exciting products and understand the risk and cost drivers more clearly.

How is RMS capitalizing on risk analytics?
At RMS, we have the pieces in place for clients to accelerate their risk analytics with the unified, open platform Risk Intelligence™, which is built on a Risk Data Lake™ in the cloud and is ready to take all sources of data and unearth new insights. Applications such as Risk Modeler™ and ExposureIQ™ can quickly get decision-makers to the analytics they need to influence their business.

Open Standards: Dr. Paul Reed, technical program manager, RDOS

Why are open standards so important and relevant now?

I think the challenges of risk data interoperability and supporting new lines of business have been recognized for many years, as companies have been forced to rework existing data standards to try to accommodate emerging risks and to squeeze more data into proprietary standards that can trace their origins to the 1990s. Today, however, the availability of big data technology, cloud platforms such as RMS Risk Intelligence and standards such as the Risk Data Open Standard™ (RDOS) allows support for high-resolution risk modeling, new classes of risk, complex contract structures and simplified data exchange.

Are there specific industry challenges that open standards are solving or have the potential to solve?

I would say that open standards such as the RDOS are helping to solve risk data interoperability challenges, which have been hindering the industry, and provide support for new lines of business. The RDOS is specifically designed for extensibility, creating a risk data exchange standard that is future-proof and can be readily modified and adapted to meet both current and future requirements. Open standards in other industries, such as Kubernetes, Hadoop and HTML, have proven to be catalysts for collaborative innovation, enabling accelerated development of new capabilities.

How is RMS responding to and capitalizing on this development?
RMS contributed the RDOS to the industry, and we are using it as the data framework for our Risk Intelligence platform. The RDOS is free for anyone to use, and anyone can contribute updates that expand the value and utility of the standard — so its development and direction are not dependent on a single vendor. We’ve put in place an independent steering committee, currently made up of 15 companies, to guide the development of the standard. It benefits RMS clients not only by enhancing the new RMS platform and applications, but also by enabling other industry users to create new and innovative products and address new and emerging risk classes.

Pandemic Risk: Dr. Gordon Woo, catastrophist

How does pandemic risk affect the market?

There’s no doubt that the current pandemic represents a globally systemic risk across many market sectors, and insurers are working out both what the impact from claims will be and the impact on capital. For very good reasons, people are categorizing the COVID-19 disease as a game-changer. However, in my view, SARS [severe acute respiratory syndrome] in 2003, MERS [Middle East respiratory syndrome] in 2012 and Ebola in 2014 should also have been game-changers. Over the last decade alone, we have seen multiple near misses. Suppression strategies to combat the coronavirus will probably continue in some form until a vaccine is developed, and governments must strike an uneasy balance between their economies and the exposure of their populations to the virus.

What are the longer-term implications of this current pandemic for the industry?

It’s clear that the mitigation of pandemic risk will need to be prioritized and given far more urgency than before. There’s no doubt in my mind that events such as the 2014 Ebola crisis were a missed opportunity for new initiatives in pandemic risk mitigation.
Away from the life and health sector, all insurers will need a better grasp of future pandemics, having seen the wide-ranging business impact of COVID-19. The market could look to bold initiatives with governments to examine how to cover future pandemics, similar to how terror attacks are covered as a pooled risk.

How is RMS helping its clients in relation to COVID-19?

Since early January, when the first cases emerged from Wuhan, China, we’ve been supporting our clients and the wider market in gaining a better understanding of the diverse loss implications of COVID-19. Our LifeRisks® team has been actively assisting in pandemic risk management, with regular communications and briefings, and will incorporate new perspectives from COVID-19 into our infectious diseases modeling.

Climate Change: Ryan Ogaard, senior vice president, model product management

Why is climate change so relevant to the market now?

There are many reasons. Insurers and their stakeholders are watching the constant flow of catastrophes, from the U.S. hurricane season of 2017, wildfires in California and bushfires in Australia to recent major typhoons, and wondering whether climate change is driving extreme weather risk and what it could do in the future. They’re asking whether the current extent of climate change risk is priced into their premiums. Regulators are also beginning to conduct stress tests on the potential future impact of climate change, and insurers must respond.

How will climate change impact how the market operates?

As with any risk, insurers need to understand and quantify how the physical risk of climate change will impact their portfolios and adjust their strategy accordingly. Over the coming years it also appears likely that regulators will incorporate climate change reporting into their regimes. Once insurers understand their exposure to climate change risk, they can start to take action — which will impact how the market operates.
These actions could take the form of premium changes, mitigating actions such as supporting physical defenses, diversifying the risk or taking on more capital.

How is RMS responding to market needs around climate change?

RMS is listening to the needs of clients to understand their pain points around climate change risk, what actions they are taking and how we can add value. We’re working with a number of clients on bespoke studies that modify the current view of risk to project into the future and/or test the sensitivity of current modeling assumptions. We’re also working to help clients understand the extent to which climate change is already built into risk models, to educate clients on emerging climate change science and to explain whether or not there is a clear climate change signal for a particular peril.

Cyber: Dr. Christos Mitas, vice president, model development

How is this change currently manifesting itself?

While cyber risk itself is not new, anyone involved in protecting or insuring organizations against cyberattacks will know that the nature of cyber risk is forever evolving. This could involve changes in those perpetrating the attacks, from lone-wolf criminals to state-backed actors, or in the type of target, from an unpatched personal computer to a power-plant control system. The current COVID-19 pandemic has seen cybercriminals look to take advantage of millions of employees working from home, as well as vulnerable business IT infrastructure. Change in the threat landscape is a constant for cyber risk.

Why is cyber risk so important and relevant right now?

Simply because new cyber risks emerge, and insurers who are active in this area need to ensure they are ahead of the curve in terms of awareness and have the tools and knowledge to manage new risks.
There have been systemic ransomware attacks over the last few years, and criminals continue to look for potential weaknesses in networked systems, third-party software and supply chains, all requiring constant vigilance. It's this continual threat of a systemic attack that requires insurers to use effective tools based on cutting-edge science to capture the latest threats and identify potential risk aggregation.

How is RMS responding to market needs around cyber risk?

With our latest RMS Cyber Solutions, version 4.0, we've worked closely with clients and the market to really understand the pain points within their businesses, bringing a wealth of new data assets and modeling approaches. One area is the ability to know the potential cyber risk of the type of business you are looking to insure. In version 4.0, we have a database of over 13 million businesses that can help enrich the information you have about your portfolio and prospective clients, which then leads to more prudent and effective risk modeling.

A Time to Change

Our industry is undergoing significant disruption on multiple fronts. From the rapidly evolving exposure landscape and the extraordinary changes brought about by the pandemic to step-change advances in technology and seismic shifts in data analytics capabilities, the market is in an unparalleled period of transition. As Exceedance 2020 demonstrated, this is no longer a time for business as usual. This is what defines leaders and culls the rest. This changes everything.

Helen Yates
Cyber Solutions 4.0: Modeling Systemic Risk
May 05, 2020

The updated RMS cyber model leverages data, software vulnerabilities, attack scenarios and advanced analytics to help insurers and reinsurers get a handle on their risk aggregations

From distributed denial of service (DDoS) attacks, cloud outages and contagious malware through to cyber physical exposures, cyber risk is a sentient and ever-changing threat environment. The cyber insurance market has evolved with the threat, tailoring policies to the exposures most concerning businesses around the world, ranging from data breach to business interruption. But recent events have highlighted the very real potential for systemic risks arising from a cyberattack. Nowhere was this more obvious than in the 2017 WannaCry and NotPetya ransomware attacks. WannaCry affected over 200,000 computers in businesses spanning industry sectors across 150 countries, including more than 80 National Health Service organizations in the U.K. alone. Had it not been for the discovery of a “kill switch,” the malware would have caused even more disruption and economic loss. Just a month after WannaCry, NotPetya hit. It exploited the same weakness within corporate networks as the WannaCry ransomware, but without the ability to jump from one network to another. With another nation-state as the suspected sponsor, this new strain of contagious malware impacted major organizations, including shipping firm Maersk and pharmaceutical company Merck. Both cyber events highlighted the potential for systemic loss from a single attack, as well as the issues surrounding “silent” cyber cover. The high-profile claims dispute between U.S. snack-food giant Mondelez and its property insurer, after the carrier refused a US$100 million claim based on a war exclusion within its policy, fundamentally changed the direction of the insurance market. It resulted in regulators and the industry coming together in a concerted push to clarify whether cyber cover was affirmative or non-affirmative.
The cyber black swan

There are numerous sources of systemic risk arising from a cyber incident. For the cyber (re)insurance market to reach maturity and a stage at which it can offer the limits and capacity now desired by commercial clients, it is first necessary to understand and mitigate these aggregate exposures. A report published by RMS and the Cambridge Centre for Risk Studies in 2019 found there is increasing potential for systemic failures in IT systems or for systemic exploitation of strategically important technologies. Much of this is the result of an ever more connected world, with the growth of the internet of things (IoT) and reliance on third-party vendors. As the report states, “Supply chain attacks are a source of systemic risk, which will continue to grow over time with the potential for significant accumulation losses for the insurance industry.” The report also noted that many of the victims of NotPetya were unintentionally harmed by the ransomware, which is believed to have been a politically motivated attack against Ukraine.

Cyber models meet evolving market demands

Models and other risk analysis tools have become critical to the ongoing development and growing sophistication of the cyber insurance and reinsurance markets. As the industry continues to adapt its offering, there is demand for models that capture the latest threats and enable a clearer understanding of potential aggregations of risk within carriers’ books of business. As the insurance industry has evolved in its approach to cyber risk, so too has the modeling. Version 4.0 of the RMS Cyber Solutions, released in October 2019, brings together years of extensive research into the underlying processes that underpin cyber risk. It leverages millions of data points and provides expanded data enrichment capabilities on 13 million global companies, leading to improved model accuracy, explains Dr. Christos Mitas, head of the RMS cyber risk modeling group. “We have been engaging with a couple of dozen clients for the past four years and incorporating features into our solution that speak to the pain points they see in their day-to-day business,” he says. “From introducing exclusions on the silent side and developing sophisticated models to understanding the hazard itself and modeling contagious malware as a physical process, we are gaining ever-greater insight into the physics and dynamics of the underlying risk.” Feedback in the six months since the release of Version 4.0 has been extremely positive, says Mitas. “There has been genuine amazement around the data assets we have developed and the modeling framework around which we have organized this data collection effort. There has been a huge effort over the last two years by our data scientists, who have been using artificial intelligence (AI) and machine learning (ML) to collect data points from cyber events across all the sources of cyber risk that we model. “Cyber 4.0 also included new functionality to address software vulnerabilities and the motivations of cyber threat actor groups that have been active over the last few years,” he continues.
“These are all datasets that we have collected, and they are complemented with third-party sources on cyber damage events, including academia, cybersecurity firms and partners within the insurance industry.” There has been strong support from the reinsurance market, which has been a little behind the primary insurance market in developing its cyber product suite. “The reinsurance market has not developed as much as you would expect it to if it were relying on robust models,” says Mitas. “So, we have enhanced reinsurance modeling in our financial engines and exceedance probability (EP) curves to meet this need. “We’ve had some good feedback on the reinsurance pieces we have included in Version 4.0,” he continues. “From a cybersecurity point of view, very sophisticated clients that work with internal cybersecurity teams have commented on the strength of some of our modeling for contagious malware, and for cloud outages and data breach.” Quoted Source: Barracuda Networks
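The exceedance probability curves Mitas mentions can be illustrated with a small sketch. This is not RMS model code: it simply shows, under assumed toy loss data, how an empirical EP curve is derived from simulated annual losses and read off at a return period.

```python
import numpy as np

def exceedance_probability_curve(annual_losses):
    """Empirical EP curve: for each simulated annual loss, the estimated
    probability that a year's loss equals or exceeds that amount."""
    losses = np.sort(np.asarray(annual_losses, dtype=float))[::-1]  # descending
    n = len(losses)
    # Rank-based exceedance probability: the worst year gets 1/n, and so on.
    probs = np.arange(1, n + 1) / n
    return losses, probs

# Toy example: 10,000 simulated years of portfolio losses (arbitrary units)
rng = np.random.default_rng(42)
annual = rng.lognormal(mean=2.0, sigma=1.5, size=10_000)
losses, probs = exceedance_probability_curve(annual)

# Loss at the 1-in-250-year return period (exceedance probability 0.4%)
idx = np.searchsorted(probs, 1 / 250)
print(f"1-in-250-year loss: {losses[idx]:.1f}")
```

Reading the curve at a chosen exceedance probability, as in the last lines, is how return-period losses are typically quoted from simulation output.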

Nigel Allen
A Solution Shared
May 05, 2020

The Risk Data Open Standard is now available, and active industry collaboration is essential for achieving wide-scale interoperability objectives

On January 31, the first version of the Risk Data Open Standard™ (RDOS) was made available to the risk community and the public on the GitHub platform. The RDOS is an “open” standard because it is available with no fees or royalties, and anyone can review, download, contribute to or leverage the RDOS for their own project. With the potential to transform the way risk data is expressed and exchanged across the (re)insurance industry and beyond, the RDOS represents a new data model (i.e., a data specification or schema) specifically designed for holding all types of risk data, from exposure through model settings to results analyses. The industry has long recognized that a dramatic improvement in risk data container design is required to support current and future industry operations. The industry currently relies on data models for risk data exchange and storage that were originally designed to support property cat models over 20 years ago. These formats are incomplete: they do not capture critical information about contracts, business structures or model settings. This means that an analyst receiving data in these old formats has detective work to do, filling in the missing pieces of the risk puzzle. Because these formats lack a complete picture linking exposures to results, highly skilled, well-paid people waste a huge amount of time, and efforts to automate are difficult, if not impossible, to achieve. Existing formats are also very property-centric. As models for new insurance lines have emerged over the years, such as energy, agriculture and cyber, the risk data for these lines of business has either been forced suboptimally into the property cat data model, or entirely new formats have been created to support single lines of business.
The industry is faced with two poor choices: accept substandard data or deal with many data formats, potentially one for each line of business, possibly multiplied by the number of companies who offer models for a particular line of business. “The industry is painfully aware of the problems we are trying to solve. The RDOS aims to provide a complete, flexible and interoperable data format ‘currency’ for exchange that will eliminate the time-consuming and costly data processes that are currently required,” explains Paul Reed, technical program manager for the RDOS at RMS. He adds, “Of course, adoption of a new standard can’t happen overnight, but because it is backward-compatible with the RMS EDM and RDM, users have optionality through the transition period.”

Taking on the challenge

The RDOS has great promise. An open standard specifically designed to represent and exchange risk data, it accommodates all categories of risk information across five critical information sets: exposure, contracts (coverage), business structures, model settings and results analyses. But can it really overcome the many intrinsic data hurdles currently constraining the industry? According to Ryan Ogaard, senior vice president of model product management at RMS, its ability to do just that lies in the RDOS’s conceptual entity model. “The design is simple, yet complete, consisting of these five linked categories of information that provide an unambiguous, auditable view of risk analysis,” he explains.
“Each data category is segregated, creating flexibility by isolating changes to any given part of the RDOS, but also linked in a single container to enable clear navigation through and understanding of any risk analysis, from the exposure and contracts through to the results.” By adding critical information about the business structure and models used, the standard creates a complete data picture, a fully traceable description of any analysis. This unique capability is a result of the superior technical data model design that the RDOS brings to the data struggle, believes Reed. “The RDOS delivers multiple technical advantages,” he says. “Firstly, it stores results data along with contracts, business structure and settings data, which combine to enable a clear and comprehensive understanding of analyses. Secondly, the contract definition language (CDL) and structure definition language (SDL) provide a powerful tool for unambiguously determining contract payouts from a set of claims. In addition, the data model design supports advanced database technology and can be implemented in several popular database formats, including object-relational and SQL. Flexibility has been designed into virtually every facet of the RDOS, with design for extensibility built into each of the five information entities.” “New information sets can be introduced to the RDOS without impacting existing information,” Ogaard says. “This overcomes the challenges of model rigidity and provides the flexibility to capture multivendor modeling data, as well as the user’s own view of risk. This makes the standard future-proof and usable by a broad cross section of the (re)insurance industry and other industries.”

Opening up the standard

To achieve the ambitious objective of risk data interoperability, it was critical that the RDOS was founded on an open-source platform.
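The five-entity structure described above can be pictured as a single linked container. The sketch below is purely illustrative: the class and field names are hypothetical and do not reflect the published RDOS schema on GitHub, but they show the idea of segregated categories held together so any analysis is traceable from exposure through settings to results.

```python
from dataclasses import dataclass

# Hypothetical mini-container mirroring the five linked RDOS categories.
# Field names are illustrative only, not the published schema.

@dataclass
class Exposure:
    risk_items: list          # e.g., property, marine or cyber risk items

@dataclass
class Contracts:
    terms: list               # coverage terms, expressed in a CDL-like form

@dataclass
class BusinessStructure:
    hierarchy: dict           # e.g., {"division": ["portfolio_A", ...]}

@dataclass
class ModelSettings:
    model_id: str
    parameters: dict

@dataclass
class ResultsAnalysis:
    metrics: dict             # e.g., {"AAL": 17.2}

@dataclass
class RiskAnalysisContainer:
    """One container linking all five categories, so a received analysis
    carries its own full, auditable context."""
    exposure: Exposure
    contracts: Contracts
    structure: BusinessStructure
    settings: ModelSettings
    results: ResultsAnalysis

container = RiskAnalysisContainer(
    exposure=Exposure(risk_items=["US_property_portfolio"]),
    contracts=Contracts(terms=["cat_xl_layer_1"]),
    structure=BusinessStructure(hierarchy={"division": ["portfolio_A"]}),
    settings=ModelSettings(model_id="hypothetical_model_v1",
                           parameters={"event_rate_set": "default"}),
    results=ResultsAnalysis(metrics={"AAL": 17.2}),
)
```

Because each category is its own type, one part (say, a new Risk Item kind) can evolve without touching the others, which is the extensibility property Ogaard describes.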
Establishing the RDOS on the GitHub platform was a game-changing decision, according to Cihan Biyikoglu, executive vice president of product at RMS. “I’ve worked on a number of open-source projects,” he says, “and in my opinion an open-source standard is the most effective way of energizing an active community of contributors around a particular project. “You have to recognize the immense scale of the data challenge that exists within the risk analysis field. To address it effectively will require a great deal of collaboration across a broad range of stakeholders. Having the RDOS as an open standard enables that scale of collaboration to occur.” Concerns have been raised about whether, given its open-source status and the ambition to become a truly industrywide standard, RMS should continue to play a leading role in the ongoing development of the RDOS now that it is open to all. Biyikoglu believes it should. “Many open-source projects start with a good initial offering but are not maintained over time and quickly become irrelevant. If you look at the successful projects, a common theme is that they emanate from an industry participant suffering greatly from the particular issue. In the early phase, they contribute the majority of the improvements, but as the project evolves and the active community expands, the responsibility for moving it forward is shared by all. And that is exactly what we expect to see with the RDOS.” For Paul Reed, the open-source model provides a fair and open environment in which all parties can freely contribute.
“By adopting proven open-source best practices and supported by the industry-driven RDOS Steering Committee, we are creating a level playing field in which all participants have an equal opportunity to contribute.”

Assessing the potential

Following the initial release of the RDOS, much of the activity on the GitHub platform has involved downloading and reviewing the RDOS data model and tools, as users look to understand what it can offer and how it will function. However, as the open RDOS community builds and contributions are received, combined with guidance from industry experts on the steering committee, Ogaard is confident it will quickly start generating clear value on multiple fronts. “The flexibility, adaptability and completeness of the RDOS structure create the potential to add tremendous industry value,” he believes, “by addressing the shortcomings of current data models in many areas. There is obvious value in standardized data for lines of business beyond property and in facilitating efficiency and automation. The RDOS could also help solve model interoperability problems. It’s really up to the industry to set the priorities for which problem to tackle first. “Existing data formats were designed to handle property data,” Ogaard continues, “and do not accommodate new categories of exposure information. The RDOS Risk Item entity describes an exposure and enables new Risk Items to be created to represent any line of business or type of risk, without impacting any existing Risk Item. That means a user could add marine as a new type of Risk Item, with attributes specific to marine, and define contracts that cover marine exposure or its own loss type, without interfering with any existing Risk Item.” The RDOS is only in its infancy, and how it evolves, and how quickly, lies firmly in the hands of the industry.
RMS has laid out the new standard in the GitHub open-source environment and, while it remains committed to the open standard’s ongoing development, the direction that the RDOS takes is firmly in the hands of the (re)insurance community.

Helen Yates
Climate Change: The Cost of Inaction
May 05, 2020

With pressure from multiple directions for a change in the approach to climate risk, how the insurance industry responds is under scrutiny

Severe threats to the climate account for all of the top long-term risks in this year’s World Economic Forum (WEF) “Global Risks Report.” For the first time in the survey’s 10-year outlook, the top five global risks in terms of likelihood are all environmental. From an industry perspective, each one of these risks has potentially significant consequences for insurance and reinsurance companies:

- Extreme weather events with major damage to property, infrastructure and loss of human life
- Failure of climate change mitigation and adaptation by governments and businesses
- Man-made environmental damage and disasters, including massive oil spills and incidents of radioactive contamination
- Major biodiversity loss and ecosystem collapse (terrestrial or marine) with irreversible consequences for the environment, resulting in severely depleted resources for humans as well as industries
- Major natural disasters such as earthquakes, tsunamis, volcanic eruptions and geomagnetic storms

“There is mounting pressure on companies from investors, regulators, customers and employees to demonstrate their resilience to rising climate volatility,” says John Drzik, chairman of Marsh and McLennan Insights. “Scientific advances mean that climate risks can now be modeled with greater accuracy and incorporated into risk management and business plans.
High-profile events, like recent wildfires in Australia and California, are adding pressure on companies to take action on climate risk.” In December 2019, the Bank of England introduced new measures for insurers, expecting them to assess, manage and report on the financial risks of climate change as part of the bank’s 2021 Biennial Exploratory Scenario (BES) exercise. The BES builds on the Prudential Regulation Authority’s Insurance Stress Test 2019, which asked insurers to stress test their assets and liabilities based on a series of future climate scenarios. The Network for Greening the Financial System shows how regulators in other countries are moving in a similar direction. “The BES is a pioneering exercise, which builds on the considerable progress in addressing climate-related risks that has already been made by firms, central banks and regulators,” said outgoing Bank of England governor Mark Carney. “Climate change will affect the value of virtually every financial asset; the BES will help ensure the core of our financial system is resilient to those changes.” The insurance industry’s approach to climate change is evolving. Industry-backed groups such as ClimateWise have been set up to respond to the challenges posed by climate change while also influencing policymakers. “Given the continual growth in exposure to natural catastrophes, insurance can no longer simply rely on a strategy of assessing and re-pricing risk,” says Maurice Tulloch, former chair of ClimateWise and CEO of international insurance at Aviva.
“Doing so threatens a rise of uninsurable markets.”

The cost of extreme events

In the past, property catastrophe (re)insurers were able to recalibrate their perception of natural catastrophe risk on an annual basis, as policies came up for renewal, believing that changes to hazard frequency and/or severity would occur incrementally over time. However, it has become apparent that some natural hazards have a much greater climate footprint than had previously been imagined. Attribution studies are helping insurers and other stakeholders to measure the financial impact of climate change on a specific event. “You have had events in the last few years that have a climate change signature to them,” says Robert Muir-Wood, chief research officer of science and technology at RMS. “That could include wildfire in California, extraordinary amounts of rainfall during Hurricane Harvey over Houston, or the intensity of hurricanes in the Caribbean, such as Irma, Maria and Dorian. “These events appear to be more intense and severe than those that have occurred in the past,” he continues. “Attribution studies are corroborating the fact that these natural disasters really do have a climate change signature. It was a bit experimental to start with, but now it’s just become a regular part of the picture that after every event a designated attribution study program will be undertaken, often by more than one climate lab. “In the past it was a rather futile argument whether or not an event had a greater impact because of climate change, because you couldn’t really prove the point,” he adds. “Now it’s possible to say not only if an event has a climate change influence, but by how much. The issue isn’t whether something was or was not climate change; it’s that climate change has affected the probability of an event like that by this amount.
That is the nature of the conversation now, which is an intelligent way of thinking about it.” Record catastrophe losses in 2017 and 2018, with combined claims costing insurers US$230 billion, according to Swiss Re sigma, have had a significant impact on the competitive and financial position of many property catastrophe (re)insurers. The loss tally from 2019 was less severe, with global insurance losses below the 10-year average at US$56 billion, but Typhoons Faxai and Hagibis caused significant damage to Japan when they struck just weeks apart in September and October. “It can be argued that the insurance industry is the only sector that is going to be able to absorb the losses from climate change,” adds Muir-Wood. “Companies already feel they are picking up losses in this area, and it’s a bit uncharted; you can’t just use the average of history. It doesn’t really work anymore. So, we need to provide the models that give our clients the comfort of knowing how to handle and price climate change risks in anticipation.”

The cost of short-termism

While climate change is clearly on the agenda of the boards of international insurance and reinsurance firms, its emphasis differs from company to company, according to the Geneva Association. In a report, the industry think tank found that insurers are hindered from scaling up their contribution to climate adaptation and mitigation by barriers imposed at a public policy and regulatory level. The need to take a long-term view on climate change is at odds with the pressures that insurance companies are under as public and regulated entities.
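The attribution arithmetic Muir-Wood describes, saying how much climate change shifted an event's probability, is commonly expressed as a risk ratio and a fraction of attributable risk. A minimal sketch, with hypothetical probabilities chosen purely for illustration:

```python
def probability_ratio(p_factual, p_counterfactual):
    """Risk ratio and fraction of attributable risk (FAR) used in event
    attribution: compares an event's probability in today's climate
    (factual) against a modeled climate without warming (counterfactual)."""
    rr = p_factual / p_counterfactual
    far = 1.0 - p_counterfactual / p_factual
    return rr, far

# Hypothetical example: an extreme rainfall event with a 1-in-100 annual
# probability today versus 1-in-300 in the counterfactual climate.
rr, far = probability_ratio(1 / 100, 1 / 300)
print(f"Event made {rr:.0f}x more likely; FAR = {far:.0%}")
```

With these illustrative inputs the event is three times more likely and two-thirds of its current probability is attributable to the changed climate, which is exactly the "by this amount" framing quoted above.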
Shareholder expectations and the political demands to keep insurance rates affordable are in conflict with the need to charge a risk-adjusted price or reduce exposures in regions that are highly catastrophe exposed. This need to protect property owners from full risk pricing became an election issue in the Florida market, where state-owned carrier Florida Citizens supported customers with effectively subsidized premiums. The disproportionate emphasis on using the historical record as a means of modeling the probability of future losses is a further challenge for the private market operating in the state. “In the past, when insurers were confronted with climate change, they were comfortable with the sense that they could always put up the price or avoid writing the business if the risk got too high,” says Muir-Wood. “But I don’t think that’s a credible position anymore. We see situations, such as in California, where insurers are told they should already have priced in climate change risk and they need to use the average of the last 30 years, and that’s obviously a challenge for the solvency of insurers. “The Florida Insurance Commissioner’s function is more weighted to look after the interests of consumers around insurance prices, and they maintain a very strong line that risk models should be calibrated against the long-term historical averages,” he continues. “And they’ve said that both in Florida for hurricane and in California for wildfire. And in a time of change and a time of increased risk, that position is clearly not in the interest of insurers, and they need to be thinking carefully about that. “Regulators want to be up to speed on this,” he adds.
“If levels of risk are increasing, they need to make sure that (re)insurance companies can remain solvent. That they have enough capital to take on those risks. And supervisors will expect the companies they regulate to turn up with extremely good arguments and a demonstration of the data behind their position as to how they are pricing their risk and managing their portfolios.”

The reputational cost of inaction

Despite the persistence of near-term pressures, a lack of action and of a long-term view on climate change is no longer a viable option for the industry. In part, this is due to a mounting reputational cost. European and Australian (re)insurers have, for instance, been more proactive in divesting from fossil fuels than their American and Asian counterparts. This is expected to change as negative attention mounts in both mainstream and social media. The number of insurers withdrawing cover for coal more than doubled in 2019, with coal exit policies announced by 17 (re)insurance companies. “The role of insurers is to manage society’s risks; it is their duty and in their own interest to help avoid climate breakdown,” says Peter Bosshard, coordinator of the Unfriend Coal campaign. “The industry’s retreat from coal is gathering pace as public pressure on the fossil fuel industry and its supporters grows.” The influence of climate change activists such as Greta Thunberg, the actions of NGO pressure groups like Unfriend Coal and growing climate change disclosure requirements are building critical momentum and scrutiny into the action (or lack thereof) taken by insurance senior management. “If you are in the driver’s seat of an insurance company and you know your customers’ attitudes are shifting quite fast, then you need to avoid looking as though you are behind the curve,” says Muir-Wood. “Quite clearly there is a reputational side to this.
Attitudes are changing, and as an industry we should anticipate that all sorts of things that are tolerated today will become unacceptable in the future.”

Antony Ireland
Severe Convective Storms: Experience Cannot Tell the Whole Story
May 05, 2020

Severe convective storms can strike with little warning across vast areas of the planet, yet some insurers still rely solely on historical records that do not capture the full spectrum of risk at given locations. EXPOSURE explores the limitations of this approach and how they can be overcome with cat modeling

Attritional and high-severity claims from severe convective storms (SCS), which span tornadoes, hail, straight-line winds and lightning, are on the rise. In fact, in the U.S., average annual insured losses (AAL) from SCS now rival even those from hurricanes, at around US$17 billion, according to the latest RMS U.S. SCS Industry Loss Curve from 2018. In Canada, SCS cost insurers more than any other natural peril on average each year. “Despite the scale of the threat, it is often overlooked as a low-volatility, attritional peril,” says Christopher Allen, product manager for the North American SCS and winterstorm models at RMS. But losses can be very volatile, particularly when considering individual geographic regions or portfolios (see Figure 1). Moreover, they can be very high. “The U.S. experiences higher insured losses from SCS than any other country. According to the National Weather Service Storm Prediction Center, there are over 1,000 tornadoes every year on average. And while a powerful tornado does not cause the same total damage as a major earthquake or hurricane, these events are still capable of causing catastrophic losses that run into the billions.”

Figure 1: Insured losses from U.S. SCS in the Northeast (New York, Connecticut, Rhode Island, Massachusetts, New Hampshire, Vermont, Maine), Great Plains (North Dakota, South Dakota, Nebraska, Kansas, Oklahoma) and Southeast (Alabama, Mississippi, Louisiana, Georgia). Losses are trended to 2020 and then scaled separately for each region so the mean loss in each region becomes 100.
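The trend-and-scale treatment described in the Figure 1 caption can be sketched in a few lines. The loss values and trend factors below are made up for illustration; the point is simply the normalization step, scaling each region so its mean loss becomes 100.

```python
import numpy as np

def to_regional_index(losses, trend_factors):
    """Trend historical losses to present-day values, then rescale so the
    regional mean equals 100, as in the Figure 1 caption."""
    trended = np.asarray(losses, float) * np.asarray(trend_factors, float)
    return 100.0 * trended / trended.mean()

# Hypothetical annual regional losses with inflation/exposure trend factors
raw = [0.8, 2.1, 0.3, 5.6, 1.2]
factors = [1.9, 1.6, 1.4, 1.2, 1.05]
index = to_regional_index(raw, factors)
assert abs(index.mean() - 100.0) < 1e-9  # mean loss is now 100 by construction
```

Normalizing each region to a common mean makes the volatility of the three regional series directly comparable even though their absolute loss levels differ.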
Source: Industry Loss Data

Two of the costliest SCS outbreaks to date hit the U.S. in spring 2011. In late April, large hail, straight-line winds and over 350 tornadoes spawned across wide areas of the South and Midwest, including over the cities of Tuscaloosa and Birmingham, Alabama, which were hit by a tornado rated EF-4 on the Enhanced Fujita (EF) scale. In late May, an outbreak of several hundred more tornadoes occurred over a similarly wide area, including an EF-5 tornado in Joplin, Missouri, that killed over 150 people. If the two outbreaks occurred again today, according to an RMS estimate based on trending industry loss data, each would easily cause over US$10 billion of insured loss. However, extreme losses from SCS do not just occur in the U.S. In April 1999, a hailstorm in Sydney dropped hailstones of up to 3.5 inches (9 centimeters) in diameter over the city, causing insured losses of AU$5.6 billion according to the Insurance Council of Australia (ICA), currently the most costly insurance event in Australia’s history [1]. “It is entirely possible we will soon see claims in excess of US$10 billion from a single SCS event,” Allen says, warning that relying on historical data alone to quantify SCS (re)insurance risk leaves carriers underprepared and overexposed.

Historical records are short and biased

According to Allen, the rarity of SCS at a local level means historical weather and loss data fall short of fully characterizing SCS hazard. In the U.S., the Storm Prediction Center’s national record of hail and straight-line wind reports goes back to 1955, and tornado reports date back to 1950. In Canada, routine tornado reports go back to 1980. “These may seem like adequate records, but they only scratch the surface of the many SCS scenarios nature can throw at us,” Allen says. “To capture full SCS variability at a given location, records should be simulated over thousands, not tens, of years,” he explains.
“This is only possible using a cat model that simulates a very wide range of possible storms to give a fuller representation of the risk at that location. Observed over tens of thousands of years, most locations would have been hit by SCS just as frequently as their neighbors, but this will never be reflected in the historical records. Just because a town or city has not been hit by a tornado in recent years doesn’t mean it can’t be.”

Shorter historical records can also misrepresent the severity of SCS possible at a given location. Total insured catastrophe losses in Phoenix, Arizona, for example, were typically negligible between 1990 and 2009, but on October 5, 2010, Phoenix was hit by its largest-ever tornado and hail outbreak, causing economic losses of US$4.5 billion. (Source: NOAA National Centers for Environmental Information)

Just like the national observations, insurers’ own claims histories, and industry data such as that presented in Figure 1, are also too short to capture the full extent of SCS volatility, Allen warns. “Some primary insurers write very large volumes of natural catastrophe business and have comprehensive claims records dating back 20 or so years, which are sometimes seen as good enough datasets on which to evaluate the risk at their insured locations. However, underwriting based solely on this length of experience could lead to more surprises and greater earnings instability.”

If a tree falls and no one hears…

Historical SCS records in most countries rely primarily on human observation reports. If a tornado is not seen, it is not reported, which means that, unlike a hurricane or large earthquake, an SCS event can be missing from the recent historical record entirely.
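Allen’s contrast between a few decades of observations and thousands of simulated years can be illustrated with a toy stochastic catalog. The sketch below is purely illustrative — Poisson event counts and lognormal severities with arbitrary parameters, not the RMS model: AAL estimates drawn from 20-year “claims histories” swing widely, while a long simulated catalog converges.

```python
import math
import random
import statistics

random.seed(42)

# Illustrative only: a toy stochastic SCS catalog, not the RMS model.
# Assumed parameters: event counts ~ Poisson(25/yr), heavy-tailed
# lognormal per-event losses in arbitrary units.
FREQ = 25.0
MU, SIGMA = 0.0, 2.0

def poisson(lam):
    """Knuth's method for sampling a Poisson-distributed event count."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= random.random()
    return k - 1

def simulate_years(n):
    """Total simulated annual loss for each of n years."""
    return [
        sum(random.lognormvariate(MU, SIGMA) for _ in range(poisson(FREQ)))
        for _ in range(n)
    ]

# A long simulated catalog pins down the average annual loss (AAL)...
catalog = simulate_years(20_000)
aal = statistics.mean(catalog)

# ...whereas AALs estimated from 20-year observation windows are noisy.
windows = [statistics.mean(simulate_years(20)) for _ in range(10)]
print(f"catalog AAL: {aal:.0f}")
print(f"20-year estimates range: {min(windows):.0f} to {max(windows):.0f}")
```

Under these assumed parameters the true AAL is about 185 units, yet individual 20-year windows can land well above or below it — the same trap a short claims record sets for a real portfolio.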
“While this happens less often in Europe, which has a high population density, missed sightings can distort historical data in Canada, Australia and remote parts of the U.S.,” Allen explains.

Another key issue is that the EF scale rates tornado strength based on how much damage is caused, which does not always reflect the power of the storm. If a strong tornado passes through a rural area with few buildings, for example, it will not register high on the EF scale, even though the same storm could have caused major damage in an urban area. “This again makes the historical record very challenging to interpret,” he says. “Catastrophe modelers invest a great deal of time and effort in understanding the strengths and weaknesses of historical data. By using robust aspects of observations in conjunction with other methods, for example numerical weather simulations, they are able to build upon and advance beyond what experience tells us, allowing for a more credible evaluation of SCS risk than experience alone.”

Then there is the issue of rising exposures. Urban expansion and rising property prices, in combination with factors such as rising labor costs and aging roofs that are increasingly susceptible to damage, are pushing exposure values upward. “This means that an identical SCS in the same location would most likely result in a higher loss today than 20 years ago, or in some cases may result in an insured loss where previously there would have been none,” Allen explains.

Calgary, Alberta, for example, is the hailstorm capital of Canada. On September 7, 1991, a major hailstorm over the city resulted in the country’s largest insured loss to date from a single storm: CA$343 million paid out at the time. The city has expanded significantly since then (see Figure 2), and the value of the exposure in preexisting urban areas has also increased.
An identical hailstorm occurring over the city today would therefore cause far larger insured losses, even before accounting for inflation.

Figure 2: Urban expansion in Calgary, Alberta, Canada. Source: European Space Agency, Land Cover CCI Product User Guide Version 2, Tech. Rep. (2017). Available at: maps.elie.ucl.ac.be/CCI/viewer/download/ESACCI-LC-Ph2-PUGv2_2.0.pdf

“Probabilistic SCS cat modeling addresses these issues,” Allen says. “Rather than being constrained by historical data, the framework builds upon and beyond it, using meteorological, engineering and insurance knowledge to evaluate what is physically possible today. This means claims do not have to be ‘on-leveled’ to account for changing exposures, which may require the user to make some possibly tenuous adjustments and extrapolations; users simply input the exposures they have today, and the model outputs today’s risk.”

The catastrophe modeling approach

In addition to their ability to simulate “synthetic” loss events over thousands of years, Allen argues, cat models make it easier to conduct sensitivity testing by location, varying policy terms or construction classes; to drill into loss-driving properties within portfolios; and to optimize attachment points for reinsurance programs.

SCS cat models are commonly used in the reinsurance market, partly because they make it easy to assess tail risk (which is difficult to do using a short historical record alone), but they are currently used less frequently for underwriting primary risks. Some carriers use catastrophe models for reinsurance business yet still rely on historical claims data for direct insurance business. So why do some primary insurers not take advantage of the cat modeling approach? “Though not marketwide, there can be a perception that experience alone represents the full spectrum of SCS risk. This overlooks the historical record’s limitations, potentially adding unaccounted-for risk to their portfolios,” Allen says.
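The tail-risk and attachment-point uses described above both rest on the year-loss table a cat model produces. A minimal sketch, using a hypothetical 10,000-year table with made-up loss values (not model output), shows how return-period losses fall out of that table:

```python
# Illustrative only: return-period losses from a hypothetical year-loss
# table, the standard output of a probabilistic cat model.
def return_period_loss(year_losses, return_period):
    """Annual loss exceeded on average once every `return_period` years."""
    ranked = sorted(year_losses, reverse=True)
    # The k-th largest of n simulated years has exceedance probability k/n.
    k = max(1, round(len(ranked) / return_period))
    return ranked[k - 1]

# Hypothetical 10,000-year table: mostly modest years, a few extreme ones.
ylt = [50.0] * 9_800 + [500.0] * 150 + [2_000.0] * 40 + [10_000.0] * 10

print(return_period_loss(ylt, 100))    # 1-in-100-year loss -> 500.0
print(return_period_loss(ylt, 1_000))  # 1-in-1,000-year loss -> 10000.0

# e.g., set a reinsurance attachment at the modeled 1-in-100-year level
attachment = return_period_loss(ylt, 100)
```

A 20-year claims history simply contains no observation at the 1-in-1,000-year level, which is why tail metrics like these are practical only with a long simulated catalog.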
What is more, detailed studies of historical records and “on-leveling” of claims to account for changes over time are challenging and very time-consuming. By contrast, insurers already familiar with the cat modeling framework (for hurricane, for example) should find that adopting a probabilistic SCS model is relatively simple and requires little additional learning, as it employs the same framework as other peril models, he explains.

Furthermore, catastrophe model data formats, such as the RMS Exposure and Results Data Modules (EDM and RDM), are already widely exchanged, and the Risk Data Open Standard™ (RDOS) will have increasing value within the (re)insurance industry. Reinsurance brokers make heavy use of cat modeling submissions when placing reinsurance, for example, while rating agencies increasingly request catastrophe modeling results when determining company credit ratings.

Allen argues that with property cat portfolios under pressure and the insurance market now hardening, it is all the more important that insurers select and price risks as accurately as possible to increase profits and reduce their combined ratios. “A US$10 billion SCS loss is around the corner, and carriers need to be prepared and have at their disposal the ability to calculate the probability of that occurring for any given location,” he says. “To truly understand their exposure, risk must be determined based on all possible tomorrows, in addition to what has happened in the past.”

[1] Losses normalized to 2017 Australian dollars and exposure by the ICA. Source: https://www.icadataglobe.com/access-catastrophe-data.

To obtain a holistic view of severe weather risk, contact the RMS team.
