This changes everything

At this year’s Exceedance®, RMS explored the key forces currently disrupting the industry, from technology, data analytics and the cloud through to the rising extremes of catastrophic events such as the pandemic and climate change. This coupling of technological and environmental disruption represents a true inflection point for the industry. EXPOSURE asked six experts across RMS for their views on why they believe these forces will change everything.

Cloud Computing: Moe Khosravy, executive vice president, software and platforms

How are you seeing businesses transition their workloads over to the cloud?

I have to say it’s been remarkable. We’re way past basic conversations on the value proposition of the cloud to now having deep technical discussions that are truly transformative plays. Customers are looking for solutions that seamlessly scale with their business and platforms that lower their cost of ownership while delivering capabilities that can be consumed from anywhere in the world.

Why is the cloud so important or relevant now?

It is now hard for a business to beat the benefits that the cloud offers and getting harder to justify buying and supporting complex in-house IT infrastructure. There is also a mindset shift going on — why is an in-house IT team responsible for running and supporting another vendor’s software on their systems if the vendor itself can provide that solution? This burden can now be lifted using the cloud, letting the business concentrate on what it does best.

Has the pandemic affected views of being in the cloud?

I would say absolutely. We have always emphasized the importance of cloud and true SaaS architectures to enable business continuity — allowing you to do your work from anywhere, decoupled from your IT and physical footprint. Never has the importance of this been more clearly underlined than during the past few months.

Risk Analytics: Cihan Biyikoglu, executive vice president, product

What are the specific industry challenges that risk analytics is solving or has the potential to solve?

Risk analytics really is a wide field, but in the immediate short term one of the focus areas for us is improving productivity around data. So much time is spent by businesses trying to manually process data — cleansing, completing and correcting data — and on conversion between incompatible datasets. This alone is a huge barrier just to get a single set of results.

If we can take this burden away, give decision-makers the power to get results in real time with automated and efficient data handling, then with that I believe we will liberate them to use the latest insights to drive business results.

Another important innovation here is the HD Models™. The power of the new engine, with its improved accuracy, is, I believe, a game changer that will give our customers a competitive edge.

How will risk analytics impact activities and capabilities within the market?

As seen in other industries, the more data you can combine, the better the analytics become — that’s the universal law of analytics. Getting all of this data on a unified platform and combining different datasets unearths new insights, which could produce opportunities to serve customers better and drive profit or growth.

What are the longer-term implications for risk analytics?

In my view, it’s about generating more effective risk insights from analytics, resulting in better decision-making and the ability to explore new product areas with more confidence. It will spark a wave of innovation to profitably serve customers with exciting products and to understand the risk and cost drivers more clearly.

How is RMS capitalizing on risk analytics?

At RMS, we have the pieces in place for clients to accelerate their risk analytics with the unified, open platform, Risk Intelligence™, which is built on a Risk Data Lake™ in the cloud and is ready to take all sources of data and unearth new insights. Applications such as Risk Modeler™ and ExposureIQ™ can quickly get decision-makers to the analytics they need to influence their business.

Open Standards: Dr. Paul Reed, technical program manager, RDOS

Why are open standards so important and relevant now?

I think the challenges of risk data interoperability and supporting new lines of business have been recognized for many years, as companies have been forced to rework existing data standards to try to accommodate emerging risks and to squeeze more data into proprietary standards that can trace their origins to the 1990s.

Today, however, with the availability of big data technology, cloud platforms such as RMS Risk Intelligence and standards such as the Risk Data Open Standard™ (RDOS) allow support for high-resolution risk modeling, new classes of risk, complex contract structures and simplified data exchange.

Are there specific industry challenges that open standards are solving or have the potential to solve?

I would say that open standards such as the RDOS are helping to solve risk data interoperability challenges, which have been hindering the industry, and provide support for new lines of business. In the case of the RDOS, it’s specifically designed for extensibility, to create a risk data exchange standard that is future-proof and can be readily modified and adapted to meet both current and future requirements.

Open standards and open-source technologies in other industries, such as HTML, Hadoop and Kubernetes, have proven to be catalysts for collaborative innovation, enabling accelerated development of new capabilities.

How is RMS responding to and capitalizing on this development?

RMS contributed the RDOS to the industry, and we are using it as the data framework for our platform called Risk Intelligence. The RDOS is free for anyone to use, and anyone can contribute updates that can expand the value and utility of the standard — so its development and direction is not dependent on a single vendor.

We’ve put in place an independent steering committee to guide the development of the standard, currently made up of 15 companies. It provides benefits to RMS clients not only by enhancing the new RMS platform and applications, but also by enabling other industry users to create new and innovative products and to address new and emerging risk classes.

Pandemic Risk: Dr. Gordon Woo, catastrophist

How does pandemic risk affect the market?

There’s no doubt that the current pandemic represents a globally systemic risk across many market sectors, and insurers are working out both what the impact from claims will be and the impact on capital. For very good reasons, people are categorizing the COVID-19 disease as a game-changer. However, in my view, SARS [severe acute respiratory syndrome] in 2003, MERS [Middle East respiratory syndrome] in 2012 and Ebola in 2014 should also have been game-changers. Over the last decade alone, we have seen multiple near misses.

Suppression strategies to combat the coronavirus are likely to continue in some form until a vaccine is developed, and governments must strike an uneasy balance between their economies and the exposure of their populations to the virus.

What are the longer-term implications of this current pandemic for the industry?

It’s clear that the mitigation of pandemic risk will need to be prioritized and given far more urgency than before. There’s no doubt in my mind that events such as the 2014 Ebola crisis were a missed opportunity for new initiatives in pandemic risk mitigation. Away from the life and health sector, all insurers will need a better grasp of future pandemics, having seen the wide business impact of COVID-19. The market could look to bold initiatives with governments to examine how to cover future pandemics, similar to how terror attacks are covered as a pooled risk.

How is RMS helping its clients in relation to COVID-19?

Since early January when the first cases emerged from Wuhan, China, we’ve been supporting our clients and the wider market in gaining a better understanding of the diverse loss implications of COVID-19. Our LifeRisks® team has been actively assisting in pandemic risk management, with regular communications and briefings, and will incorporate new perspectives from COVID-19 into our infectious diseases modeling.

Climate Change: Ryan Ogaard, senior vice president, model product management

Why is climate change so relevant to the market now?

There are many reasons. Insurers and their stakeholders are looking at the constant flow of catastrophes, from the U.S. hurricane season of 2017, wildfires in California and bushfires in Australia, to recent major typhoons and wondering if climate change is driving extreme weather risk, and what it could do in the future. They’re asking whether the current extent of climate change risk is priced into their premiums. Regulators are also beginning to conduct stress tests on the potential impact of climate change in the future, and insurers must respond.

How will climate change impact how the market operates?

Similar to any risk, insurers need to understand and quantify how the physical risk of climate change will impact their portfolios and adjust their strategy accordingly. Also, over the coming years it appears likely that regulators will incorporate climate change reporting into their regimes. Once an insurer understands their exposure to climate change risk, they can then start to take action — which will impact how the market operates. These actions could be in the form of premium changes, mitigating actions such as supporting physical defenses, diversifying the risk or taking on more capital.

How is RMS responding to market needs around climate change?

RMS is listening to the needs of clients to understand their pain points around climate change risk, what actions they are taking and how we can add value. We’re working with a number of clients on bespoke studies that modify the current view of risk to project into the future and/or test the sensitivity of current modeling assumptions. We’re also working to help clients understand the extent to which climate change is already built into risk models, to educate clients on emerging climate change science and to explain whether there is or isn’t a clear climate change signal for a particular peril.

Cyber: Dr. Christos Mitas, vice president, model development

How is this change currently manifesting itself?

While cyber risk itself is not new, anyone involved in protecting or insuring organizations against cyberattacks will know that the nature of cyber risk is forever evolving. This could involve changes in those perpetrating the attacks, from lone-wolf criminals to state-backed actors, or in the type of target, from an unpatched personal computer to a power-plant control system. The current COVID-19 pandemic, for example, has seen cybercriminals look to take advantage of millions of employees working from home or of vulnerable business IT infrastructure. Change to the threat landscape is a constant for cyber risk.

Why is cyber risk so important and relevant right now?

Simply because new cyber risks emerge, and insurers who are active in this area need to ensure they are ahead of the curve in terms of awareness and have the tools and knowledge to manage new risks. There have been systemic ransomware attacks over the last few years, and criminals continue to look for potential weaknesses in networked systems, third-party software, supply chains — all requiring constant vigilance. It’s this continual threat of a systemic attack that requires insurers to use effective tools based on cutting-edge science, to capture the latest threats and identify potential risk aggregation.

How is RMS responding to market needs around cyber risk?

With our latest RMS Cyber Solutions, which is version 4.0, we’ve worked closely with clients and the market to really understand the pain points within their businesses, and have responded with a wealth of new data assets and modeling approaches. One area is the ability to know the potential cyber risk of the type of business you are looking to insure. In version 4.0, we have a database of over 13 million businesses that can help enrich the information you have about your portfolio and prospective clients, which then leads to more prudent and effective risk modeling.

A Time to Change

Our industry is undergoing significant disruption on multiple fronts. From the rapidly evolving exposure landscape and the extraordinary changes brought about by the pandemic to step-change advances in technology and seismic shifts in data analytics capabilities, the market faces an unparalleled period of transition.

As Exceedance 2020 demonstrated, this is no longer a time for business as usual. This is what defines leaders and culls the rest. This changes everything.

Breaking down the pandemic

As COVID-19 has spread across the world and billions of people are on lockdown, EXPOSURE looks at how the latest scientific data can help insurers better model pandemic risk

The coronavirus disease 2019 (COVID-19) was declared a pandemic by the World Health Organization (WHO) on March 11, 2020. In a matter of months, it has expanded from the first reported cases in the city of Wuhan in Hubei province, China, to confirmed cases in over 200 countries around the globe.

At the time of writing, approximately one-third of the world’s population is in some form of lockdown, with movement and activities restricted in an effort to slow the disease’s spread. The spread of COVID-19 is truly global, with even extreme remoteness proving no barrier to its relentless progression as it reaches far-flung locations such as Papua New Guinea and Timor-Leste.

After declaring the event a global pandemic, Dr. Tedros Adhanom Ghebreyesus, WHO director general, said: “We have never before seen a pandemic sparked by a coronavirus. This is the first pandemic caused by a coronavirus. And we have never before seen a pandemic that can be controlled. … This is not just a public health crisis, it is a crisis that will touch every sector — so every sector and every individual must be involved in the fight.”

Ignoring the near misses

COVID-19 has been described as the biggest global catastrophe since World War II. Its impact on every part of our lives, from the mundane to the complex, will be profound, and its ramifications will be far-reaching and enduring.

On multiple levels, the coronavirus has caught the world off guard. So rapidly has it spread that initial response strategies, designed to slow its progress, were quickly reevaluated and more restrictive measures have been required to stem the tide. Yet, some are asking why many nations have been so flat-footed in their response.

To find a comparable pandemic event, it is necessary to look back over 100 years to the 1918 flu pandemic, also referred to as Spanish flu. While this is a considerable time gap, the interim period has witnessed multiple near misses that should have ensured countries remained primed for a potential pandemic.

“For very good reasons, people are categorizing COVID-19 as a game-changer. However, SARS in 2003 should have been a game-changer, MERS in 2012 should have been a game-changer, Ebola in 2014 should have been a game-changer. If you look back over the last decade alone, we have seen multiple near misses.”  Dr. Gordon Woo, RMS

However, as Dr. Gordon Woo, catastrophist at RMS, explains, such events have gone largely ignored. “For very good reasons, people are categorizing COVID-19 as a game-changer. However, SARS in 2003 should have been a game-changer, MERS in 2012 should have been a game-changer, Ebola in 2014 should have been a game-changer. If you look back over the last decade alone, we have seen multiple near misses.

“If you examine MERS, this had a mortality rate of approximately 30 percent — much greater than COVID-19 — yet fortunately it was not a highly transmissible virus. However, in South Korea a mutation saw its transmissibility rate surge to four chains of infection, which is why it had such a considerable impact on the country.”

While COVID-19 is caused by a novel virus and there is no preexisting immunity within the population, its genetic makeup shares 80 percent of the coronavirus genes that sparked the 2003 SARS outbreak. In fact, the virus is officially titled “severe acute respiratory syndrome coronavirus 2,” or “SARS-CoV-2.” However, the WHO refers to it by the name of the disease it causes, COVID-19, as calling it SARS could have “unintended consequences in terms of creating unnecessary fear for some populations, especially in Asia which was worst affected by the SARS outbreak in 2003.”

“Unfortunately, people do not respond to near misses,” Woo adds, “they only respond to events. And perhaps that is why we are where we are with this pandemic. The current event is well within the bounds of catastrophe modeling, or potentially a lot worse if the fatality ratio was in line with that of the SARS outbreak.

“When it comes to infectious diseases, we must learn from history. So, if we take SARS, rather than describing it as a unique event, we need to consider all the possible variants that could occur to ensure we are better able to forecast the type of event we are experiencing now.”

Within model parameters

A COVID-19-type event scenario is well within risk model parameters. The RMS® Infectious Diseases Model within its LifeRisks® platform incorporates a range of possible source infections, which includes coronavirus, and the company has been applying model analytics to forecast the potential development tracks of the current outbreak.

Launched in 2007, the Infectious Diseases Model was developed in response to the H5N1 virus. This pathogen exhibited a mortality rate of approximately 60 percent, triggering alarm bells across the life insurance sector and sparking demand for a means of modeling its potential portfolio impact. The model was designed to produce outputs specific to mortality and morbidity losses resulting from a major outbreak.

In 2006, H5N1 exhibited a mortality rate of approximately 60 percent, triggering alarm bells across the life insurance sector and sparking demand for a means of modeling its potential portfolio impact

The probabilistic model is built on two critical pillars. The first is modeling that accurately reflects both the science of infectious disease and the fundamental principles of epidemiology. The second is a software platform that allows firms to address questions based on their exposure and experience data.

“It uses pathogen characteristics, including transmissibility and virulence, to parameterize a compartmental epidemiological model and estimate an unabated mortality and morbidity rate for the outbreak,” explains Dr. Brice Jabo, medical epidemiologist at RMS.

“The next stage is to apply factors including demographics, vaccines and pharmaceutical and non-pharmaceutical interventions to the estimated rate. And finally, we adjust the results to reflect the specific differences in the overall health of the portfolio or the country to generate an accurate estimate of the potential morbidity and mortality losses.”
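The two-stage approach Jabo describes, estimating a baseline rate from pathogen characteristics and then applying modifiers, can be sketched in highly simplified form. Everything below, including the use of the classic final-size relation as a stand-in for a full compartmental model and all parameter and modifier values, is an illustrative assumption, not RMS model detail.

```python
import math

# Toy sketch of the two-stage approach described above: derive a baseline
# attack rate from transmissibility, then apply multiplicative modifiers
# (interventions, population health). All values are illustrative.

def final_attack_rate(r0: float, tol: float = 1e-9) -> float:
    """Solve the classic final-size relation z = 1 - exp(-R0 * z)."""
    z = 0.9  # initial guess for the fraction ultimately infected
    while True:
        z_next = 1.0 - math.exp(-r0 * z)
        if abs(z_next - z) < tol:
            return z_next
        z = z_next

def expected_deaths(population: int, r0: float, ifr: float,
                    modifiers: dict) -> float:
    """Baseline deaths scaled by each modifier in turn."""
    deaths = population * final_attack_rate(r0) * ifr
    for factor in modifiers.values():
        deaths *= factor
    return deaths

deaths = expected_deaths(
    population=10_000_000, r0=2.5, ifr=0.01,
    modifiers={"interventions": 0.5, "population_health": 1.1})
```

In this sketch each modifier is a simple multiplicative factor; the real model compartmentalizes by age, geography and intervention type, which this one-line loop does not attempt to capture.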

The model currently spans 59 countries, allowing for differences in government strategy, health care systems, vaccine treatment, demographics and population health to be applied to each territory when estimating pandemic morbidity and mortality losses.

Breaking down the virus

In the case of COVID-19, transmissibility — the average number of infections that result from an initial case — has been a critical model parameter. The virus has a relatively high level of transmissibility, with data showing that the average infection rate is in the region of 1.5-3.5 per initial infection.

However, while there is general consensus on this figure, establishing an estimate of the virus’s severity, or virulence, is more challenging, as Jabo explains: “Understanding the virulence of the disease enables you to assess the potential burden placed on the health care system. In the model, we therefore track the proportion of mild, severe, critical and fatal cases to establish whether the system will be able to cope with the outbreak. The challenge, however, is that this figure is heavily dependent on the number of tests carried out in a particular country, as well as the eligibility criteria applied to testing.”
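The severity-tracking idea in the quote above can be illustrated with a toy calculation: split projected cases into severity buckets and compare critical-care demand with capacity. The proportions and numbers below are illustrative assumptions only, not model values.

```python
# Illustrative severity split for projected cases; the proportions below
# are assumptions for illustration, not RMS model parameters.
SEVERITY = {"mild": 0.81, "severe": 0.14, "critical": 0.04, "fatal": 0.01}

def care_demand(projected_cases: int, icu_beds: int) -> dict:
    """Estimate critical-care demand and flag a capacity breach."""
    critical = projected_cases * SEVERITY["critical"]
    return {"critical_cases": critical,
            "capacity_exceeded": critical > icu_beds}

demand = care_demand(50_000, icu_beds=1_500)
# 50,000 cases at 4 percent critical implies roughly 2,000 critical cases,
# exceeding the 1,500-bed capacity in this example.
```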

An effective way of generating more concrete numbers is to have a closed system, where everyone in a particular environment has a similar chance of contracting the disease and all individuals are tested. In the case of COVID-19 these closed systems have come in the form of cruise ships. In these contained environments, it has been possible to test all parties and track the infection and fatality rates accurately.

Another parameter tracked in the model is non-pharmaceutical intervention — those measures introduced in the absence of a vaccine to slow the progression of the disease and prevent health care systems from being overwhelmed. Suppression strategies are currently the most effective form of defense in the case of COVID-19. They are likely to be in place in many countries for a number of months as work continues on a vaccine.

“This is an example of a risk that is hugely dependent on government policy for how it develops,” says Woo. “In the case of China, we have seen how the stringent policies they introduced have worked to contain the first wave, as well as the actions taken in South Korea. There has been concerted effort across many parts of Southeast Asia, a region prone to infectious diseases, to carry out extensive testing, chase contacts and implement quarantine procedures, and these have so far proved successful in reducing the spread. The focus is now on other parts of the world such as Europe and the Americas as they implement measures to tackle the outbreak.”

The Infectious Diseases Model’s vaccine and pharmaceutical modifiers reflect improvements in vaccine production capacity and manufacturing techniques, as well as the potential impact of antimicrobial resistance. While an effective treatment is, at the time of writing, still in development, these modifiers allow users to conduct “what-if” scenarios.

“Model users can apply vaccine-related assumptions that they feel comfortable with,” Jabo says. “For example, they can predict potential losses based on a vaccine being available within two months that has an 80 percent effectiveness rate, or an antiviral treatment available in one month with a 60 percent rate.”
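Jabo’s vaccine “what-if” could be mocked up as a simple scaling of a monthly loss projection. The function name, the flat baseline and the single step-change at vaccine availability are all simplifying assumptions for illustration, not how the model itself applies these modifiers.

```python
# Hedged sketch of a vaccine "what-if": scale monthly expected losses by
# (1 - effectiveness) once a vaccine becomes available. The flat baseline
# and single step-change are simplifying assumptions.

def apply_vaccine_scenario(monthly_losses, available_month, effectiveness):
    """Reduce losses by `effectiveness` from `available_month` (0-indexed)."""
    return [loss if month < available_month else loss * (1 - effectiveness)
            for month, loss in enumerate(monthly_losses)]

baseline = [100.0] * 12  # flat 12-month loss projection, illustrative units
scenario = apply_vaccine_scenario(baseline, available_month=2,
                                  effectiveness=0.8)
# Months 0-1 are unchanged; months 2-11 are reduced by 80 percent.
```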

Data upgrades

Various pathogens have different mortality and morbidity distributions. In the case of COVID-19, evidence to date suggests that the highest levels of mortality occur in the 60-plus age range, with fatality levels declining significantly below this point. Recent advances in data relating to immunity levels have greatly increased our understanding of the specific age ranges most vulnerable to a particular virus.

“Recent scientific findings from data arising from two major flu viruses, H5N1 and A/H7N9, have had a significant impact on our understanding of vulnerability,” explains Woo. “The studies have revealed that the primary age range of vulnerability to a flu virus is dependent upon the first flu that you were exposed to as a child.

“There are two major flu groups to which everyone would have had some level of exposure at some stage in their childhood. That exposure would depend on which flu virus was dominant at the time they were born, influencing their level of immunity and which type of virus they are more susceptible to in the future. This is critical information in understanding virus spread and we have adapted the age profile vulnerability component of our model to reflect this.”

Recent model upgrades have also allowed for the application of detailed information on population health, as Jabo explains: “Preexisting conditions can increase the risk of infection and death, as COVID-19 is demonstrating. Our model includes a parameter that accounts for the underlying health of the population at the country, state or portfolio level.

“The information to date shows that people with co-morbidities such as hypertension, diabetes and cardiovascular disease are at a higher risk of death from COVID-19. It is possible, based on this data, to apply the distribution of these co-morbidities to a particular geography or portfolio, adjusting the outputs based on where our data shows high levels of these conditions.”
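The co-morbidity adjustment Jabo outlines can be approximated with a population-average risk multiplier. The prevalences and relative risks below are illustrative assumptions, not data drawn from the model.

```python
# Toy co-morbidity adjustment: scale a baseline fatality rate by a
# population-average multiplier for each condition. Prevalence and
# relative-risk values are illustrative assumptions only.

def adjusted_ifr(baseline_ifr: float, comorbidities: dict) -> float:
    """comorbidities maps condition name to (prevalence, relative_risk)."""
    multiplier = 1.0
    for prevalence, relative_risk in comorbidities.values():
        # average of the unaffected (risk 1.0) and affected (risk RR) groups
        multiplier *= (1 - prevalence) + prevalence * relative_risk
    return baseline_ifr * multiplier

ifr = adjusted_ifr(0.005, {
    "hypertension": (0.30, 1.8),
    "diabetes": (0.10, 2.0),
})
```

A geography or portfolio with a heavier co-morbidity burden would simply carry larger prevalence values, lifting the adjusted rate accordingly.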

Predictive analytics

The RMS Infectious Diseases Model is designed to estimate pandemic loss for a 12-month period. However, to enable users to assess the potential impact of the current pandemic in real time, RMS has developed a hybrid version that combines the model pandemic scenarios with the number of cases reported.

“Using the daily cases numbers issued by each country,” says Jabo, “we project forward from that data, while simultaneously projecting backward from the RMS scenarios. Using this hybrid approach, it allows us to provide a time-dependent estimate for COVID-19. In effect, we are creating a holistic alignment of observed data coupled with RMS data to provide our clients with a way to understand how the evolution of the pandemic is progressing in real time.”
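In spirit, the hybrid approach Jabo describes can be mocked up as matching observed case counts against precomputed scenario curves and projecting forward along the best match. The curves, the observed series and the least-squares selection rule below are all illustrative assumptions, not the actual mechanics of the RMS approach.

```python
# Illustrative sketch of scenario matching: compare observed daily case
# counts against toy scenario curves and project forward along the closest.

def closest_scenario(observed, scenarios):
    """Pick the scenario minimizing squared error over the observed window."""
    def err(curve):
        return sum((o - c) ** 2 for o, c in zip(observed, curve))
    return min(scenarios, key=lambda name: err(scenarios[name]))

scenarios = {  # toy daily-case curves, not model output
    "mild":   [10, 20, 35, 55, 80, 110, 145],
    "severe": [10, 30, 70, 150, 300, 580, 1100],
}
observed = [11, 28, 66, 140, 290]              # first five days of reports
best = closest_scenario(observed, scenarios)   # "severe" fits far better
projection = scenarios[best][len(observed):]   # forward projection, days 6-7
```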

Aligning the observed data with the model parameters makes the selection of appropriate model scenarios more robust. The forward and backward projections not only allow for short-term estimates but also form part of model validation, enabling users to derive predictive analytics to support their portfolio analysis.

“Staying up to date with this dynamic event is vital,” Jabo concludes, “because the impact of the myriad government policies and measures in place will result in different potential scenarios, and that is exactly what we are seeing happening.”

The data difference

The value of data as a driver of business decisions has grown exponentially as generating sustainable underwriting profit becomes the primary focus for companies in response to diminished investment yields. Greater scrutiny of risk selection is more important than ever to maintaining underwriting margins, and high-caliber, insightful risk data is critical to the analytics that support each risk decision.

The insurance industry is in a transformational phase where profit margins continue to be stretched in a highly competitive marketplace. Changing customer dynamics and new technologies are driving demand for more personalized solutions delivered in real time, while companies are working to boost performance, increase operational efficiency and drive greater automation. In some instances, this involves projects to overhaul legacy systems that are central to daily operation.

In such a state of market flux, access to quality data has become a primary differentiator. But there’s the rub. Companies now have access to vast amounts of data from an expanding array of sources — but how can organizations effectively distinguish good data from poor data? What differentiates the data that delivers stellar underwriting performance from that which sends a combined operating performance above 100 percent?

A complete picture

“Companies are often data rich, but insight poor,” believes Jordan Byk, senior director, product management at RMS. “The amount of data available to the (re)insurance industry is staggering, but creating the appropriate insights that will give them a competitive advantage is the real challenge. To do that, data consumers need to be able to separate ‘good’ from ‘bad’ and identify what constitutes ‘great’ data.”

For Byk, a characteristic of “great data” is the speed with which it drives confident decision-making that, in turn, guides the business in the desired direction. “What I mean by speed here is not just performance, but that the data is reliable and insightful enough that decisions can be made immediately, and all are confident that the decisions fit within the risk parameters set by the company for profitable growth.

“While resolution is clearly a core component of our modeling capabilities at RMS, the ultimate goal is to provide a complete data picture and ensure quality and reliability of underlying data” Oliver Smith, RMS

“We’ve solved the speed and reliability aspect by generating precompiled, model-derived data at resolutions intelligent for each peril,” he adds. There has been much focus on increasing data-resolution levels, but does higher resolution automatically elevate the value of data in risk decision-making? Delivering data at 10-, five- or even one-meter resolution may not in itself be what makes truly great data.

“Often higher resolution is perceived as better,” explains Oliver Smith, senior product manager at RMS, “but that is not always the case. While resolution is clearly a core component of our modeling capabilities at RMS, the ultimate goal is to provide a complete data picture and ensure quality and reliability of underlying data.

“Resolution of the model-derived data is certainly an important factor in assessing a particular exposure,” adds Smith, “but just as important is understanding the nature of the underlying hazard and vulnerability components that drive resolution. Otherwise, you are at risk of the ‘garbage-in-garbage-out’ scenario that can foster a false sense of reliability based solely around the ‘level’ of resolution.”

The data score

The ability to assess the impact of known exposure data is particularly relevant to the extensive practice of risk scoring. Such scoring expresses a particular risk as a score from 1 to 10, 1 to 20 or on another scale running from low risk to high risk, based on an underlying definition for each value. This enables underwriters to make quick submission assessments and supports critical decisions relating to quoting, referrals and pricing.
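As a rough illustration of score banding (not the definition of any RMS product), a modeled damage ratio can be mapped onto a 1-to-10 scale with fixed thresholds. The band edges below are made-up placeholders.

```python
import bisect

# Illustrative band edges only; these are not RMS RiskScore definitions.
BANDS = [0.01, 0.02, 0.04, 0.07, 0.11, 0.16, 0.22, 0.30, 0.40]  # upper edges

def risk_score(damage_ratio: float) -> int:
    """Map a modeled damage ratio to a 1-10 score; higher means riskier."""
    return bisect.bisect_right(BANDS, damage_ratio) + 1

low = risk_score(0.005)   # falls below every edge: score 1
high = risk_score(0.18)   # above six edges: score 7
```

A banding like this is what gives underwriters the quick triage signal described above; the transparency question discussed later is about publishing what each band actually means.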

“Such capabilities are increasingly common and offer a fantastic mechanism for establishing underwriting guidelines, and enabling prioritization and triage of locations based on a consistent view of perceived riskiness,” says Chris Sams, senior product manager at RMS. “What is less common, however, is ‘reliable’ and superior quality risk scoring, as many risk scores do not factor in readily available vulnerability data.”

“Such capabilities are increasingly common and offer a fantastic mechanism for establishing underwriting guidelines, and enabling prioritization and triage of locations based on a consistent view of perceived riskiness” Chris Sams, RMS

Exposure insight is created by adjusting multiple data lenses until the risk image comes into focus. If particular lenses are missing or there is an overreliance on one particular lens, the image can be distorted. For instance, an overreliance on hazard-related information can significantly alter the perceived exposure levels for a specific asset or location.

“Take two locations adjacent to one another that are exposed to the same wind or flood hazard,” Byk says. “One is a high-rise hotel built in 2020 and subject to the latest design standards, while another is a wood-frame, small commercial property built in the 1980s; or one location is built at ground level with a basement, while another is elevated on piers and does not have a basement.

“These vulnerability factors will result in a completely different loss experience in the occurrence of a wind- or flood-related event. If you were to run the locations through our models, the annual average loss figures will vary considerably. But if the underwriting decision is based on hazard-only scores, they will look the same until they hit the portfolio assessment — and that’s when the underwriter could face some difficult questions.”
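Byk’s point can be made concrete with a toy calculation: identical hazard, different vulnerability, very different expected loss. The values and the simple multiplicative form are illustrative assumptions, not model output.

```python
# Toy comparison of two neighboring locations with identical hazard but
# different vulnerability. All numbers are illustrative assumptions.

def expected_loss(value: float, hazard_damage_ratio: float,
                  vulnerability_factor: float) -> float:
    """Simplified loss: value x hazard x vulnerability adjustment."""
    return value * hazard_damage_ratio * vulnerability_factor

hazard = 0.10  # same modeled hazard damage ratio for both neighbors

hotel_2020 = expected_loss(50_000_000, hazard, vulnerability_factor=0.4)
woodframe_1980s = expected_loss(2_000_000, hazard, vulnerability_factor=1.8)
# A hazard-only score rates both locations identically; once vulnerability
# is applied, the expected losses diverge sharply.
```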

To help clients understand these differences in vulnerability factors, RMS provides ExposureSource, a U.S. property database comprising property characteristics for 82 million residential buildings and 21 million commercial buildings. With this high-quality exposure dataset, clients can make the most of the RMS risk scoring products for the U.S.

Seeing through the results

Another common shortfall with risk scores is the lack of transparency around the definitions attributed to each value. Looking at a scale of 1 to 10, for example, companies don’t have insight into the exposure characteristics being used to categorize a particular asset or location as, say, a 4 rather than a 5 or 6.

To combat data-scoring deficiencies, RMS RiskScore values are generated by catastrophe models incorporating the trusted science and quality you expect from an RMS model, calibrated on billions of dollars of real-world claims. With consistent and reliable risk scores covering 30 countries and up to seven perils, the apparent simplicity of the RMS RiskScore hides the complexity of the big data catastrophe simulations that create them.

The scores combine hazard and vulnerability to understand not only the hazard experienced at a site, but also the susceptibility of a particular building stock when exposed to a given level of hazard. The RMS RiskScore allows for user definition of exposure characteristics such as occupancy, construction material, building height and year built. Users can also define secondary modifiers such as basement presence and first-floor height, which are critical for the assessment of flood risk, and roof shape or roof cover, which is critical for wind risk.

“It also provides clearly structured definitions for each value on the scale,” explains Smith, “providing instant insight on a risk’s damage potential at key return periods, offering a level of transparency not seen in other scoring mechanisms. For example, a score of 6 out of 10 for a 100-year earthquake event equates to an expected damage level of 15 to 20 percent. This information can then be used to support a more informed decision on whether to decline, quote or refer the submission. Equally important is that the transparency allows companies to easily translate the RMS RiskScore into custom scales, per peril, to support their business needs and risk tolerances.”
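The score-to-action translation Smith describes can be sketched in a few lines. This is a minimal illustration, not the actual RMS product logic: the 15 to 20 percent damage band for a score of 6 at a 100-year return period comes from the article, while every other number below is an invented placeholder.

```python
# Hypothetical mapping from a 1-10 risk score to an expected damage range
# at a 100-year return period. Only the 6 -> (0.15, 0.20) band is cited in
# the article; any other entries a company added would be its own calibration.
DAMAGE_BANDS = {6: (0.15, 0.20)}  # score: (min, max expected damage ratio)

def to_custom_scale(score, thresholds=(3, 6, 8)):
    """Translate a 1-10 score into a company's own decline/refer/quote bands.

    The thresholds are illustrative placeholders for a given peril; the point
    is that a transparent score lets each company set its own cut-offs.
    """
    if score >= thresholds[2]:
        return "decline"
    if score >= thresholds[1]:
        return "refer"
    if score >= thresholds[0]:
        return "quote with loading"
    return "quote"
```

Because the damage definition behind each value is published, the same underlying score can be re-cut per peril and per risk tolerance without re-running any model.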

Model insights at point of underwriting

While RMS model-derived data should not be considered a replacement for the sophistication offered by catastrophe modeling, it can enable underwriters to access relevant information instantaneously at the point of underwriting.

“Model usage is common practice across multiple points in the (re)insurance chain for assessing risk to individual locations, accounts, portfolios, quantifying available capacity, reinsurance placement and fulfilling regulatory requirements — to name but a few,” highlights Sams. “However, running the model takes time, and, often, underwriting decisions — particularly those being made by smaller organizations — are being made ahead of any model runs. By the time the model results are generated, the exposure may already have been taken on.”

“Through this interface, companies gain access to the immense datasets that we maintain in the cloud and can simply call down risk decision information whenever they need it” Jordan Byk, RMS

By feeding a range of data products into the process, RMS is helping clients select, triage and price risks before such critical decisions are made. The expanding suite of data assets is generated by its probabilistic models and represents the same science and expertise that underpins the model offering.

“And by using APIs as the delivery vehicle,” adds Byk, “we not only provide that modeled insight instantaneously, but also integrate that data directly and seamlessly into the client’s on-premise systems at critical points in their workflow. Through this interface, companies gain access to the immense datasets that we maintain in the cloud and can simply call down risk decision information whenever they need it. While these are not designed to compete with a full model output, until a time that we have risk models that provide instant analysis, such model-derived datasets offer the speed of response that many risk decisions demand.”
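As a rough sketch of the pattern Byk describes, an underwriting system might call a cloud endpoint synchronously at the point of quote. The endpoint path, parameters and response fields below are invented for illustration and do not describe the real RMS API.

```python
import json

# Hypothetical illustration of calling a cloud risk-score service from an
# on-premise underwriting workflow. Endpoint, query parameters and response
# shape are all placeholders, not the actual RMS interface.
def build_score_request(base_url, lat, lon, peril):
    """Compose the URL a client system might call at the point of quote."""
    return f"{base_url}/riskscore?lat={lat}&lon={lon}&peril={peril}"

def parse_score_response(body):
    """Extract the decision-ready fields from a (hypothetical) JSON response."""
    data = json.loads(body)
    return data["score"], data["peril"]
```

The design point is latency, not modeling depth: the heavy simulation has already happened in the cloud, so the workflow only pays the cost of a lookup.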

A consistent and broad perspective on risk

A further factor that can instigate problems is data and analytics inconsistency across the (re)insurance workflow. Currently, with data extracted from multiple sources and, in many cases, filtered through different lenses at various stages in the workflow, consistency from the point of underwriting to portfolio management has rarely been the norm.

“There is no doubt that the disparate nature of available data creates a disconnect between the way risks are assumed into the portfolio and how they are priced,” Smith points out. “This disconnect can cause ‘surprises’ when modeling the full portfolio, generating a different risk profile than expected or indicating inadequate pricing. By applying data generated via the same analytics and data science that is used for portfolio management, consistency can be achieved for underwriting risk selection and pricing, minimizing the potential for surprise.”

Equally important, given the scope of modeled data required by (re)insurance companies, is the need to focus on providing users with the means to access the breadth of data from a central repository.

“If you access such data at speed, including your own data coupled with external information, and apply sophisticated analytics — that is how you derive truly powerful insights,” he concludes. “Only with that scope of reliable, insightful information instantly accessible at any point in the chain can you ensure that you’re always making fully informed decisions — that’s what great data is really about. It’s as simple as that.”

For further information on RMS’s market-leading data solutions, click here.

A solution shared

The Risk Data Open Standard is now available, and active industry collaboration is essential for achieving wide-scale interoperability objectives

On January 31, the first version of the Risk Data Open Standard (RDOS) was made available to the risk community and the public on the GitHub platform. The RDOS is an “open” standard because it is available with no fees or royalties and anyone can review, download, contribute to or leverage the RDOS for their own project.

With the potential to transform the way risk data is expressed and exchanged across the (re)insurance industry and beyond, the RDOS represents a new data model (i.e., a data specification or schema) specifically designed for holding all types of risk data, from exposure through model settings to results analyses.

The industry has long recognized that a dramatic improvement in risk data container design is required to support current and future industry operations. The industry currently relies on data models for risk data exchange and storage that were originally designed to support property cat models over 20 years ago. These formats are incomplete. They do not capture critical information about contracts, business structures or model settings. This means that an analyst receiving data in these old formats has detective work to do – filling in the missing pieces of the risk puzzle. Because formats lack a complete picture linking exposures to results, highly skilled, well-paid people are wasting a huge amount of time, and efforts to automate are difficult, if not impossible, to achieve.

Existing formats are also very property-centric. As models for new insurance lines have emerged over the years, such as energy, agriculture and cyber, the risk data for these lines of business have either been forced suboptimally into the property cat data model, or entirely new formats have been created to support single lines of business. The industry is faced with two poor choices: accept substandard data or deal with many data formats – potentially one for each line of business – possibly multiplied by the number of companies who offer models for a particular line of business.

The industry is painfully aware of the problems we are trying to solve. The RDOS aims to provide a complete, flexible and interoperable data format ‘currency’ for exchange that will eliminate the time-consuming and costly data processes that are currently required – Paul Reed, RMS

“The industry is painfully aware of the problems we are trying to solve. The RDOS aims to provide a complete, flexible and interoperable data format ‘currency’ for exchange that will eliminate the time-consuming and costly data processes that are currently required,” explains Paul Reed, technical program manager for the RDOS at RMS. He adds, “Of course, adoption of a new standard can’t happen overnight, but because it is backward-compatible with the RMS EDM and RDM, users have optionality through the transition period.”

Taking on the challenge

The RDOS has great promise. An open standard specifically designed to represent and exchange risk data, it accommodates all categories of risk information across five critical information sets – exposure, contracts (coverage), business structures, model settings and results analyses. But can it really overcome the many intrinsic data hurdles currently constraining the industry?

According to Ryan Ogaard, senior vice president of model product management at RMS, its ability to do just that lies in the RDOS’s conceptual entity model. “The design is simple, yet complete, consisting of these five linked categories of information that provide an unambiguous, auditable view of risk analysis,” he explains. “Each data category is segregated – creating flexibility by isolating changes to any given part of the RDOS – but also linked in a single container to enable clear navigation through and understanding of any risk analysis, from the exposure and contracts through to the results.”


By adding critical information about the business structure and models used, the standard creates a complete data picture – a fully traceable description of any analysis. This unique capability is a result of the superior technical data model design that the RDOS brings to the data struggle, believes Reed.

“The RDOS delivers multiple technical advantages,” he says. “Firstly, it stores results data along with contracts, business structure and settings data, which combine to enable a clear and comprehensive understanding of analyses. Secondly, the contract definition language (CDL) and structure definition language (SDL) provide a powerful tool for unambiguously determining contract payouts from a set of claims. In addition, the data model design supports advanced database technology and can be implemented in several popular database formats, including object-relational and SQL. Flexibility has been designed into virtually every facet of the RDOS, with design for extensibility built into each of the five information entities.”
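The five linked information categories can be pictured as a single container object. The sketch below is illustrative only; the field names are invented, and the actual RDOS schema is defined in the specification on GitHub.

```python
from dataclasses import dataclass, field

# Illustrative-only sketch of the five linked RDOS information categories
# described in the article. Field names are invented and do not reflect the
# real RDOS schema; the point is that one container holds all five, so any
# result is traceable back to the exposure and contracts that produced it.
@dataclass
class RiskAnalysis:
    exposure: list = field(default_factory=list)            # risk items
    contracts: list = field(default_factory=list)           # coverage terms
    business_structure: dict = field(default_factory=dict)  # org/treaty links
    model_settings: dict = field(default_factory=dict)      # how it was run
    results: list = field(default_factory=list)             # analysis output
```

Segregating the categories means a change to, say, model settings cannot corrupt the exposure record, while keeping them in one container preserves the audit trail from input to result.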

“New information sets can be introduced to the RDOS without impacting existing information,” Ogaard says. “This overcomes the challenges of model rigidity and provides the flexibility to capture multivendor modeling data, as well as the user’s own view of risk. This makes the standard future-proof and usable by a broad cross section of the (re)insurance industry and other industries.”

Opening up the standard

To achieve the ambitious objective of risk data interoperability, it was critical that the RDOS was founded on an open-source platform. Establishing the RDOS on the GitHub platform was a game-changing decision, according to Cihan Biyikoglu, executive vice president of product at RMS.

“You have to recognize the immense scale of the data challenge that exists within the risk analysis field. To address it effectively will require a great deal of collaboration across a broad range of stakeholders. Having the RDOS as an open standard enables that scale of collaboration to occur” – Cihan Biyikoglu, RMS

“I’ve worked on a number of open-source projects,” he says, “and in my opinion an open-source standard is the most effective way of energizing an active community of contributors around a particular project.

“You have to recognize the immense scale of the data challenge that exists within the risk analysis field. To address it effectively will require a great deal of collaboration across a broad range of stakeholders. Having the RDOS as an open standard enables that scale of collaboration to occur.”

Concerns have been raised about whether, given its open-source status and the ambition to become a truly industrywide standard, RMS should continue to play a leading role in the ongoing development of the RDOS now that it is open to all.

Biyikoglu believes it should. “Many open-source projects start with a good initial offering but are not maintained over time and quickly become irrelevant. If you look at the successful projects, a common theme is that they emanate from an industry participant suffering greatly from the particular issue. In the early phase, they contribute the majority of the improvements, but as the project evolves and the active community expands, the responsibility for moving it forward is shared by all. And that is exactly what we expect to see with the RDOS.”

For Paul Reed, the open-source model provides a fair and open environment in which all parties can freely contribute. “By adopting proven open-source best practices and supported by the industry-driven RDOS Steering Committee, we are creating a level playing field in which all participants have an equal opportunity to contribute.”

Assessing the potential

Following the initial release of the RDOS, much of the activity on the GitHub platform has involved downloading and reviewing the RDOS data model and tools, as users look to understand what it can offer and how it will function. However, as the open RDOS community builds and contributions are received, combined with guidance from industry experts on the steering committee, Ogaard is confident it will quickly start generating clear value on multiple fronts.

“The flexibility, adaptability and completeness of the RDOS structure create the potential to add tremendous industry value,” he believes, “by addressing the shortcomings of current data models in many areas. There is obvious value in standardized data for lines of business beyond property and in facilitating efficiency and automation. The RDOS could also help solve model interoperability problems. It’s really up to the industry to set the priorities for which problem to tackle first.

The flexibility, adaptability and completeness of the RDOS structure create the potential to add tremendous industry value – Ryan Ogaard, RMS

“Existing data formats were designed to handle property data,” Ogaard continues, “and do not accommodate new categories of exposure information. The RDOS Risk Item entity describes an exposure and enables new Risk Items to be created to represent any line of business or type of risk, without impacting any existing Risk Item. That means a user could add marine as a new type of Risk Item, with attributes specific to marine, and define contracts that cover marine exposure or its own loss type, without interfering with any existing Risk Item.”
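The extensibility Ogaard describes — adding a marine Risk Item without disturbing existing ones — can be illustrated with a simple type registry. All names here are hypothetical; the real Risk Item definitions live in the RDOS repository on GitHub.

```python
# Illustrative sketch of the Risk Item extensibility idea: each line of
# business is its own item type, registered without modifying anything that
# already exists. Names and attributes are invented, not the RDOS schema.
RISK_ITEM_TYPES = {}

def register_risk_item(name, attributes):
    """Add a new Risk Item type; existing registrations are untouched."""
    RISK_ITEM_TYPES[name] = list(attributes)

# An existing property Risk Item type...
register_risk_item("property", ["construction", "year_built", "occupancy"])

# ...and a marine type added later, with its own attributes, leaving the
# property definition exactly as it was.
register_risk_item("marine", ["vessel_type", "tonnage", "trading_area"])
```

Contracts can then reference the new item type and its loss types directly, which is how a single schema can serve many lines of business without per-line formats.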

The RDOS is only in its infancy, and how it evolves – and how quickly it evolves – lies firmly in the hands of the industry. RMS has laid out the new standard in the GitHub open-source environment and, while it remains committed to the open standard’s ongoing development, the direction that the RDOS takes is firmly in the hands of the (re)insurance community.


Access the Risk Data Open Standard here

The power of a crisis

As Christchurch City Council continues to build back better, will its resilience investment pay dividends when it comes to citywide insurance cover?

The Canterbury Earthquake Sequence is the largest insured event in New Zealand’s history. Between September 2010 and December 2011, four major earthquakes caused damage to approximately 168,000 residential buildings.

The earthquakes spawned more than 770,000 claims for the country’s Earthquake Commission (EQC) alone, resulting in a payout of around NZ$10 billion (US$6.4 billion). The private sector absorbed almost twice that, with the Insurance Council of New Zealand putting the figure at NZ$21.4 billion (as of March 31, 2019).


Nine years on from the initial tremors, there remain over 1,200 open property claims in the private market, while the outstanding figure for the EQC stood at some 2,600 claims in February 2018.

“Dealing with the property claims was extremely challenging,” explains Raf Manji, chair of the Christchurch City Council’s Finance Committee, “not just in terms of contractual issues, but because the insurance was based on building-by-building cover. And when you’re dealing with damage to so many buildings, it is going to take a very long time to agree what that damage is.”

Building back better

The need to rebuild Christchurch presented the city with an opportunity. 

“As American politician Rahm Emanuel once said, ‘Never let a crisis go to waste,’” says Lianne Dalziel, mayor of Christchurch. “The earthquakes provided a major opportunity to build back better and ensure we embed resilience into every aspect, from below ground up.”

That commitment means that new construction, whether of above-ground assets or horizontal infrastructure, is being carried out to a level much higher than building codes dictate. 

“With the information, we want more informed conversations with both traditional and alternative markets about how we transfer risk more effectively” — Raf Manji, Christchurch City Council

“We’re building to an exceptionally high standard,” states Mike Gillooly, chief resilience officer for the city. This is a relatively new public position created following Christchurch’s inclusion in the first wave of the Rockefeller Foundation’s 100 Resilient Cities program. “The city’s art gallery, for example, has been retrofitted to resist even the most severe earthquake activity,” Gillooly continues.

But this dedication to resilience goes beyond the immediate rebuild. The council is also making resilience a core component of its long-term strategic planning. The city’s 2021-2051 infrastructure strategy, which covers the council’s investments in water supply, wastewater, stormwater, transport, parks, facilities, solid waste and communication technology for the next 30 years, will have resilience as its overarching theme.

“This is the first time we are proactively building risk and resilience into our long-term planning framework,” states Dalziel. “We are developing a much deeper appreciation of risk and have spent considerable time understanding our infrastructure. We are also working toward a much more sophisticated engagement with risk at the community level.”

“It’s not only about strengthening our physical infrastructure,” she continues. “It’s also about strengthening our social infrastructure.” 

“We are committed to promoting greater community well-being. We need to build up social capital by bringing people together to plan for an uncertain future. High levels of social capital accelerate recovery in the aftermath of a shock, while also creating greater inherent resilience to more slow-moving challenges, such as climate change and associated rising sea levels.”

Dalziel is quick to stress the importance of insurance in all this. “There is a strong relationship between economic resilience and social resilience, and the role of insurance in facilitating both cannot be underestimated. The value of insurance does not simply equal the sum of claims paid — it’s as much about the financial and social well-being that it supports.” 

Making resilience pay

Recently, insurers across New Zealand have been shifting their appetite and premiums in high-hazard regions to better reflect the country’s risk profile. 

There has been a shift too in the council’s approach to insurance — a shift that is central to its resilience efforts, explains Manji.

“Following the earthquakes, Lianne asked me to run for council. I was a former financial markets trader, and she wanted someone on board with a financial background. But when I joined, I was taken aback by the lack of risk understanding that I saw at the local government level.”

One of his first steps was to set up an independently chaired audit and risk committee and introduce a new risk management framework — a model that has since been adopted by Auckland.

“Through this new framework, we were able to establish a much more sophisticated view of risk,” he explains, “and we also launched a five-year program to document every single asset in place — both above and below ground. Having this granular level of exposure insight means we can assess our approach to mitigating, retaining and transferring risk from a much more data-informed position.”

At present, Christchurch is conservatively insured. This is a very deliberate choice, however, and Manji is convinced of the benefits of this approach.

“This excess capacity means we have headroom into which we can grow as we continue to construct new and reconstruct old assets. That’s a much stronger position to be in than having to return to the market seeking additional limit when capacity may be constrained. It also demonstrates a long-term commitment to the insurance market, upon which you can build a much more constructive, ongoing dialogue.”

Data-informed dialogue

Christchurch City Council has been making use of insurance capital for many years. It was the 2010-11 earthquakes, though, that spurred its focus on arming itself with increasingly higher-resolution data.

“We’re now coming to the table each year with an ever more accurate picture of our exposure. Working with RMS, we’ve been able to significantly evolve our risk thinking based on a range of citywide loss scenarios, and to look at ways of creating a more effective balance between traditional and more innovative parametric-based solutions.”

That desire for balance does not just apply to the source of Christchurch capital, but also what kinds of assets that capital covers. At present, while the council has secured coverage for 65 percent of the value of its above-ground structures, it has only managed to buy insurance to cover approximately 15 percent of its underground infrastructure.

“The insurance market is not comfortable with providing cover for underground infrastructure because it tends not to be well understood or documented,” Manji continues. 

“Unlike most cities, however, we know exactly what is underground and just how resilient it is. With that information, we want to have more informed conversations — with both the traditional market and alternative providers of risk capital — about how we transfer this risk more effectively. Parametric-based solutions, for example, give us the opportunity to look beyond typical building replacement covers and take a bigger-picture view of what we want to achieve from our investment in risk transfer.

“And whereas an indemnity-based policy is designed primarily to return you to where you were prior to the loss, parametric payouts can be deployed for whatever purpose you want. That flexibility — along with the speed and certainty of payout — is incredibly valuable.”

For Gillooly, it is about becoming an increasingly sophisticated user of risk capital and engaging in ever more mature dialogue with the markets. “If we can demonstrate through the data and analytics that we understand the exposure, that we’ve quantified the risk and we’re investing in effective risk reduction, then the market needs to acknowledge these efforts in the form of increased capacity, reduced premiums or both. Data, analytics and risk insights will continue to be the key focus of our annual discussions with the London market — and will allow us to explore parametric insurance-linked securities with confidence too.”

A climate model challenge

Insurance-linked securities (ILS) investors want to know more about how climate change impacts investment decisions, according to Paul Wilson, head of non-life analytics at Securis Investment Partners, an ILS asset manager

“We make investments that are typically one to three years in duration, so we need to understand the implications of climate change on those timescales,” explains Paul Wilson, head of non-life analytics at Securis Investment Partners. “We reevaluate investments as part of any renewal process, and it’s right to ask if any opportunity is still attractive given what we know about how our climate is changing.

“The fundamental question that we’re trying to address is, ‘Have I priced the risk of this investment correctly for the next year?’” he continues. “And therefore, we need to know if the catastrophe models we are using accurately account for the impact climate change may be having. Or are they overly reliant on historical data and, as such, are not actually representing the true current risk levels for today’s climate?”

Expertise in climate change is a requirement for how Securis is thinking about risk. “We have investors who are asking questions about climate change, and we have a responsibility to be able to demonstrate to them that we are taking the implications into consideration in our investment decisions.”

“We have investors who are asking questions about climate change, and we have a responsibility to demonstrate to them that we are taking the implications into consideration in our investment decisions” — Paul Wilson, Securis Investment Partners

The rate at which a changing climate may influence natural catastrophes will present both a challenge and an opportunity to the wider industry, as well as to catastrophe modeling companies, thinks Wilson. The results coming out of climate change attribution studies are going to have to start informing decisions around risk. For example, according to attribution studies, climate change tripled the chances of Hurricane Harvey’s record rainfall. 

“Climate change is a big challenge for the catastrophe modeling community,” he says. “It’s going to put a greater burden on catastrophe modelers to ensure that their models are up to date. The frequency and nature of model updates will have to change. Models we are using today may become out of date in just a few years’ time. That’s interesting when you think about the number of perils and regions where climate change could have a significant impact.

“All of those climate-related models could be impacted by climate change, so we have to question the impact that is having today,” he adds. “If the model you are using to price the risk has been calibrated to the last 50 years, but you believe the last 10 or last 20 years are more representative because of the implication of climate change, then how do you adjust your model according to that? That’s the question we should all be looking to address.”

What a difference

As the insurance industry’s Dive In Festival continues to gather momentum, EXPOSURE examines the factors influencing the speed at which the diversity and inclusion dial is moving

September 2019 marks the fifth Dive In Festival, a global movement in the insurance sector to support the development of inclusive workplace cultures. An industry phenomenon, it has ballooned in size from a London-only initiative in 2015 attracting 1,700 people to an international spectacle spanning 27 countries and reaching over 9,000 people in 2018.

That the event should gather such momentum clearly demonstrates a market that is moving forward. There is now an industrywide acknowledgement of the need to better reflect the diversity of the customer base within the industry’s professional ranks.

The starting point

As Pauline Miller, head of talent development and inclusion (D&I) at Lloyd’s, explains, the insurance industry is a market that has, in the past, been slow to change its practitioner profile. “If you look at Lloyd’s, for example, for nearly three hundred years it was a men-only environment, with women only admitted as members in December 1969.

“It’s about bringing together the most creative group of people that represent different ways of thinking that have evolved out of the multiple factors that make them different” — Pauline Miller, Lloyd’s

“You also have to recognize that the insurance industry is not as far along the diversity and inclusion journey as other sectors,” she continues. “I previously worked in the banking industry, and diversity and inclusion had been an agenda issue in the organization for a number of years. So, we must acknowledge that this is a journey that will require multiple more steps before we really begin breaking down barriers.”

However, she is confident the insurance industry can quickly make up ground.

“By its very nature, the insurance market lends itself to the spread of the D&I initiative,” Miller believes. “We are a relationship-based business that thrives on direct contact, and our day-to-day activities are based upon collaboration. We must leverage this to help speed up the creation of a more diverse and inclusive environment.”

The positive effects of collaboration are already evident in how the initiative is evolving. Comparable programs within other financial sectors have tended to be confined to individual organizations, with few generating the level of industrywide engagement that Dive In, a weeklong focus on diversity and inclusion, has achieved.

However, as Danny Fisher, global HR business partner and EMEA HR manager at RMS, points out, for the drive to gain real traction there must be marketwide consensus on the direction it is moving in.

“There is always a risk,” he says, “that any complex initiative that begins with such positive intent can become derailed if there is not an understanding of a common vision from the start, and the benefits it will deliver.

“There also needs to be better understanding and acknowledgement of the multitude of factors that may have contributed to the uniformity we see across the insurance sector. We have to establish why this has happened and address the flaws in our industry contributing to it.”

It can be argued that the insurance industry is still composed of a relatively homogeneous group of people. In terms of gender disparity, ethnic diversity, and people of different sexual orientations, from different cultural or social backgrounds, or with physical or mental impairments, the industry recognizes a need to improve. 

Diversity is the range of human differences, including but not limited to race, ethnicity, gender, gender identity, sexual orientation, age, social class, physical ability or attributes, religious or ethical values system, national origin, and political beliefs.

“As a market,” Miller agrees, “there is a tendency to hire people similar to the person who is recruiting, whether that’s someone of the same gender, ethnicity or sexual orientation, or from the same university or social background.”

“You can end up with a very uniform workforce,” adds Fisher, “where people look the same and have a similar view of the world, which can foster ‘groupthink’ and is prone to bias and questionable conclusions. People approach problems and solutions in the same way, with no one looking at an alternative — an alternative that is often greatly needed. So, a key part of the diversity push is the need to generate greater diversity of thought.”

The challenge is also introducing that talent in an inclusive way that promotes the effective development of new solutions to existing and future problems. That broad palette of talent can only be created by attracting and retaining the best and brightest from across the social spectrum within a framework in which that blend of skills, perspectives and opinions can thrive.

“Diversity is not simply about the number of women, ethnicities, people with disabilities or people from disadvantaged backgrounds that you hire,” believes Miller. “It’s about bringing together the most creative group of people that represent different ways of thinking that have evolved out of the multiple factors that make them different.”

Moving the dial

There is clearly a desire to make this happen and strong evidence that the industry is moving together. Top-level support for D&I initiatives coupled with the rapid growth of industrywide networks representing different demographics are helping firm up the foundations of a more diverse and inclusive marketplace. 

But what other developments are needed to move the dial further?

“We have to recognize that there is no ‘one-size-fits-all’ to this challenge,” says Miller. “Policies and strategies must be designed to create an environment in which diversity and inclusion can thrive, but fundamentally they must reflect the unique dynamics of your own organization.

“We also must ensure we are promoting the benefits of a career in insurance in a more powerful and enticing way and to a broader audience,” she adds. “We operate in a fantastic industry, but we don’t sell it enough. And when we do get that diversity of talent through the door, we have to offer a workplace that sticks, so they don’t simply walk straight back out again. 

“For example, someone from a disadvantaged community coming through an intern program may never have worked in an office environment before, and when they look around are they going to see people like themselves that they can relate to? What role models can they connect with? Are we prepared for that?”

For Fisher, steps can also be taken to change processes and modernize thinking and habits. “We have to train managers in interview and evaluation techniques, and in the discipline needed to keep unconscious bias in check. There has to be consistency, with meaningful tests to ensure data-driven hiring decisions.

“At RMS, we are fortunate to attract talent from around the world and are able to bring them on board to add further variety to how we solve complex problems. A successful approach for us, for example, has been accessing talent early, often before their professional careers begin.”

There is, of course, the risk that the push for greater diversity leads to a quota-based approach. 

“Nobody wants this to become a tick-box exercise,” believes Miller, “and equally nobody wants to be hired simply because they represent a particular demographic. But if we are expecting change, we do need measurements in place to show how we are moving the dial forward. That may mean introducing realistic targets within realistic timeframes that are monitored carefully to ensure we are on track.

“Ultimately,” she concludes, “what we are all working to do is to create the best environment for the broadest spectrum of people to come into what is a truly amazing marketplace. And when they do, offering a workplace that enables them to thrive and enjoy very successful careers that contribute to the advancement of our industry. That’s what we all have to be working toward.”

A need for multi-gap analysis

The insurance protection gap is composed of emerging markets and high-risk and intangible exposures

There cannot be many industries that recognize that approximately 70 percent of market potential is untapped. Yet that is the scale of opportunity in the expanding “protection gap”.

Power outage in lower Manhattan, New York, after Hurricane Sandy

While efforts are ongoing to plug the colossal shortage, any meaningful industry foray into this barren range must acknowledge that the gap is actually multiple gaps, believes Robert Muir-Wood, chief research officer at RMS. 

“It is composed of three distinct insurance gaps — high risk, emerging markets and intangibles — each with separate causes and distinct solutions. Treating it as one single challenge means we will never achieve the loss clarity to tackle the multiple underlying issues.”

High-risk, high-value gaps exist in regions where the potential magnitude of loss outweighs the industry’s capacity to fund post-catastrophe recovery. High deductibles and exclusions reduce coverage appeal and stunt market growth.

“Take California earthquake. The California Earthquake Authority (CEA) was launched in 1996 to tackle the coverage dilemma exposed by the Northridge disaster. Yet increased deductibles and new exclusions led to a 30 percent expansion of the gap. And while recent changes have seen an uptick in purchases, penetration remains around 12-14 percent for California homeowners.”

On the emerging market front, micro- and meso-insurance and sovereign risk transfer efforts to bridge the gap have achieved limited success. “The shortfall in emerging economies remains static at between 80 and 100 percent,” he states, “and it is not just a developing world issue; it is clearly evident in mature markets like Italy.”

“The protection gap is composed of three distinct insurance gaps — high risk, emerging markets and intangibles — each with separate causes and distinct solutions” — Robert Muir-Wood, RMS

A further fast-expanding gap is intangible assets. “In 1975, physical assets accounted for 83 percent of the value of S&P 500 companies,” Muir-Wood points out. “By 2015, that figure was 16 percent, with 84 percent composed of intangible assets such as IP, client data, brand value and innovation potential.” 

While non-damage business interruption cover is evolving, expanding client demand for cover against events such as power outage, cloud disruption and cyberbreach greatly outpaces delivery.

To start closing these gaps, Muir-Wood believes protection gap analytics are essential. “We have to first establish a consistent measurement for the difference between insured and total loss and split out ‘penetration’ and ‘coverage’ gaps. That gives us our baseline from which to set appropriate targets and monitor progress.

“Probabilistic cat risk models will play a central role, particularly for the high-risk protection gap, where multiple region and peril-specific models already exist. However, for intangibles and emerging markets, where such models have yet to gain a strong foothold, focusing on scenario events might prove a more effective approach.”

Variations in the gaps according to severity and geography of the catastrophe could be expressed in the form of an exceedance probability curve, showing how the percentage of uninsured risk varies by return period.
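One way to picture such a curve: given simulated annual totals for economic and insured loss (the figures below are synthetic and purely illustrative, not model output), the uninsured share can be read off at each return period. A minimal sketch, assuming empirical quantiles stand in for a full EP-curve calculation:

```python
import numpy as np

rng = np.random.default_rng(42)
n_years = 50_000

# Hypothetical simulated annual losses, e.g., from a probabilistic cat model:
# total economic loss per year, and the insured share of it (both illustrative).
economic = rng.lognormal(mean=2.0, sigma=1.2, size=n_years)   # $bn
insured = economic * rng.beta(2, 5, size=n_years)             # partial take-up

def loss_at_return_period(losses, rp):
    """Empirical annual loss exceeded on average once every `rp` years."""
    return np.quantile(losses, 1.0 - 1.0 / rp)

for rp in (5, 50, 100, 500):
    total = loss_at_return_period(economic, rp)
    covered = loss_at_return_period(insured, rp)
    gap_pct = 100.0 * (1.0 - covered / total)
    print(f"1-in-{rp:>3} years: ~{gap_pct:.0f}% of the loss is uninsured")
```

A real analysis would event-match insured and economic losses rather than compare marginal quantiles, and would further split the gap into its penetration and coverage components.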

“There should be standardization in measuring and reporting the gap,” he concludes. “This should include analyzing insured and economic loss based on probabilistic models, separating the effects of the penetration and coverage gaps, and identifying how gaps vary with annual probability and location.” 

The value of defense

Current flood defenses in the U.K. reduce annual losses from river flooding by £1.1 billion, according to research by RMS

Flooding is one of the most significant natural hazards for the U.K., with over five million homes and businesses in England at risk of flooding and coastal erosion, according to the Environment Agency.

Flood barrier in Shropshire, England

In 2015, the U.K. government announced a six-year, £2.3 billion investment in flood defenses. But the Environment Agency proposes a further annual investment of £1 billion through 2065 to keep pace with the flood-related impacts of climate change and shifts in exposure levels.

Critical to targeted flood mitigation investment is understanding the positive impacts of current defenses. In June 2019, Flood Re* released its Investing in Flood Risk Management and Defenses study, conducted by RMS. 

Addressing the financial benefits of existing flood defenses for the first time, data from the RMS® Europe Inland Flood HD Model demonstrated that current infrastructure reduced annual losses from riverine flooding by £1.1 billion. This was based on ground-up losses, using the RMS U.K. Economic Exposure Database covering buildings and contents for residential, commercial, industrial and agricultural, plus business interruption losses. 

Critical to targeted flood investment is understanding the positive impacts of current defenses

“Our flood model incorporates countrywide defense data sourced from the Environment Agency and the Scottish Flood Defence Asset Database,” says Theresa Lederer, a consultant within the RMS capital and resilience solutions team, “including walls, levees and embankments, carefully reviewed and augmented by RMS experts. Our initial model run was with defenses in place, and then, using the in-built model functionality to enter user-defined defense values, we removed these [defenses in place].” 

The difference in average annual loss between the two analyses was £1.1 billion, with losses increasing from £0.7 billion under current defenses to £1.8 billion in the undefended case. The analysis also revealed a differentiated picture of flood risk and defenses at the regional and local levels.
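The headline figure is simply the difference in average annual loss (AAL) between the two model runs. A minimal sketch of that arithmetic, using synthetic losses tuned to echo the study's published numbers (all figures illustrative, not RMS model output):

```python
import numpy as np

rng = np.random.default_rng(1)
n_years = 200_000

# Illustrative simulated annual river-flood losses, GBP billions.
# Gamma(shape=0.3, scale=6.0) has mean 1.8, matching the undefended AAL.
undefended = rng.gamma(shape=0.3, scale=6.0, size=n_years)

# Crude stand-in for rerunning the model with defenses removed:
# scale every year's loss down to the defended AAL of GBP 0.7bn.
defended = undefended * (0.7 / 1.8)

saving = undefended.mean() - defended.mean()
print(f"Undefended AAL: {undefended.mean():.1f}, "
      f"defended AAL: {defended.mean():.1f}, "
      f"annual saving: {saving:.1f} (GBP bn)")
```

In the real study the two runs differ in the model's defense inputs, not by a flat scaling factor, which is what produces the regional variation described below.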

“The savings relative to total inland flood risk are more pronounced in Northern Ireland and England (both over a 50 percent reduction in average annual losses) than Scotland and Wales,” she explains. “But when you view the savings relative to surface-water flood risk only, these are similarly significant across the country, with loss reductions exceeding 75 percent in all regions. This reflects the fact that pluvial flooding, which is kept constant in the analysis, is a bigger loss driver in Scotland and Wales, compared to the rest of the U.K.”

The analysis also showed that the more deprived half of the population — based on the U.K. Townsend Deprivation Index — benefited from 70 percent of the loss reduction.

The study also showed that while absolute savings were highest for catastrophic events, the proportion of the savings relative to the overall loss from such events was less significant. “In the case of 1-in-5-year events,” Lederer says, “river flood defenses prevent approximately 70 percent of inland flood losses. For 1-in-500-year events this drops to 30 percent; however, the absolute value of that 30 percent is far higher than the absolute savings realized in a 1-in-5-year event.

“Should the focus of defenses therefore be on providing protection from major flood events, with potential catastrophic impacts even though return on investment might not be as attractive given their infrequency? Or on attritional losses from more frequent events, which might realize savings more frequently but fail to protect from the most severe events? Finding a balanced, data-driven approach to flood defense investment is crucial to ensure the affordability of sustainable flood resilience.”
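The trade-off Lederer describes is easy to see in miniature. The prevented fractions below are from the study; the undefended loss amounts are invented purely to illustrate the relative-vs-absolute distinction:

```python
# Prevented fractions per the study; undefended event losses are hypothetical
# illustrative values, GBP billions.
events = {
    "1-in-5-year":   {"undefended": 0.5,  "prevented_frac": 0.70},
    "1-in-500-year": {"undefended": 10.0, "prevented_frac": 0.30},
}

for name, e in events.items():
    saved = e["undefended"] * e["prevented_frac"]
    print(f"{name}: defenses prevent {e['prevented_frac']:.0%} of the loss, "
          f"an absolute saving of {saved:.2f}bn")

# With these numbers, the rarer event yields the larger absolute saving
# (3.00bn vs 0.35bn) despite the much smaller relative reduction.
```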


ILS: a responsible investment approach

As environmental, social and governance principles become more prominent in guiding investment strategies, the ILS market must respond 

In recent years, there has been a sharper focus by the investment community on responsible investment. One indicator of this has been the increased adoption of the Principles for Responsible Investment (PRI), as environmental, social and governance (ESG) concerns become a more prominent influencer of investment strategies.

Investment houses are also seeking closer alignment between their ESG practices and the United Nations’ Sustainable Development Goals (SDGs). The 17 interconnected SDGs, set in 2015, are a call to action to end poverty, achieve peace and prosperity for all, and create a sustainable society by 2030.

As investors target more demonstrable outcomes from their investment practices, is there a possible opportunity for the insurance-linked securities (ILS) market to grow, given the potential societal capital that insurance can generate?

“Insurance certainly has all of the hallmarks of an ESG-compatible investment opportunity,” believes Charlotte Acton, director of capital and resilience solutions at RMS. “It has the potential to promote resilience through enabling broader access and uptake of appropriate affordable financial protection and reducing the protection gap; supporting faster and more efficient responses to disasters; and incentivizing mitigation and resilient building practices pre- and post-event.”

RMS has been collaborating on numerous initiatives designed to further the role of insurance and insurance technologies in disaster and climate-change resilience. These include exploring ways to monetize the dividends of resilience to incentivize resilient building, using catastrophe models to quantify the benefits of resilience investments such as flood defenses, and earthquake retrofit programs for housing. The work has also involved designing innovative parametric structures to provide rapid post-disaster liquidity.

“Investors will want a clear understanding of the exposure or assets that are being protected and whether they are ESG-friendly” — Charlotte Acton, RMS

“ILS offers a clear route for investors to engage with insurance,” explains Acton. “Broadening the capital pool that supports insurance is critical, as it facilitates the expansion of insurance to new regions and allows the industry to absorb increasingly large losses from growing threats such as climate change.”

Viewed as a force for social good, it can certainly be argued that insurance-linked securities support a number of the U.N.’s SDGs, including reducing the human impact of disasters, creating more sustainable cities, increasing overall resilience levels and broadening access to financial services that enhance sustainable growth potential.

While there is opportunity for ILS to play a large part in ESG, the specific role of ILS within PRI is still being determined. According to LGT Capital Partners’ ESG Report 2019, managers in the ILS space have, in general, yet to start “actively integrating ESG into their investment strategies,” adding that across the ILS asset class “there is still little agreement on how ESG considerations should be applied.”

However, there is movement in this area. For example, the Bermuda Stock Exchange, a primary exchange for ILS issuers, recently launched an ESG initiative in line with the World Federation of Exchanges’ Sustainability Principles, stating that ESG was a priority in 2019 “with the aim to empower sustainable and responsible growth for its member companies, listings and the wider community.”

For ILS to become a key investment option for ESG-focused investors, it must be able to demonstrate its sustainability credentials clearly.

“Investors will want a clear understanding of the exposure or assets that are being protected,” Acton explains, “and whether they are ESG-friendly. They will want to know whether the protection offered provides significant societal benefits. If the ILS market can factor ESG considerations into its approach more effectively, then there is no reason why it should not attract greater attention from responsible investors.”


The flames burn higher

With California experiencing two of the most devastating seasons on record in consecutive years, EXPOSURE asks whether wildfire now needs to be considered a peak peril

Some of the statistics for the 2018 U.S. wildfire season appear normal. It was a below-average year for the number of fires reported — the 58,083 incidents represented only 84 percent of the 10-year average. The number of acres burned — 8,767,492 — was above average, at 132 percent of the 10-year figure.

Two factors, however, made it exceptional. First, for the second consecutive year, the Great Basin experienced intense wildfire activity, with some 2.1 million acres burned — 233 percent of the 10-year average. And second, the fires destroyed 25,790 structures, with California accounting for over 23,600 of the structures destroyed, compared to a 10-year U.S. annual average of 2,701 residences, according to the National Interagency Fire Center.

As of January 28, 2019, reported insured losses for the November 2018 California wildfires, which included the Camp and Woolsey Fires, were at US$11.4 billion, according to the California Department of Insurance. Add to this the insured losses of US$11.79 billion reported in January 2018 for the October and December 2017 California events, and these two consecutive wildfire seasons constitute the most devastating on record for the wildfire-exposed state.

Reaching its peak?

Such colossal losses in consecutive years have sent shockwaves through the (re)insurance industry and are forcing a reassessment of wildfire’s secondary status in the peril hierarchy.

According to Mark Bove, natural catastrophe solutions manager at Munich Reinsurance America, wildfire’s status needs to be elevated in highly exposed areas. “Wildfire should certainly be considered a peak peril in areas such as California and the Intermountain West,” he states, “but not for the nation as a whole.”

His views are echoed by Chris Folkman, senior director of product management at RMS. “Wildfire can no longer be viewed purely as a secondary peril in these exposed territories,” he says. “Six of the top 10 fires for structural destruction have occurred in the last 10 years in the U.S., while seven of the top 10, and 10 of the top 20 most destructive wildfires in California history have occurred since 2015. The industry now needs to achieve a level of maturity with regard to wildfire that is on a par with that of hurricane or flood.”

“The average ember contribution to structure damage and destruction is approximately 15 percent, but in a wind-driven event such as the Tubbs Fire this figure is much higher”
— Chris Folkman, RMS

However, he is wary about potential knee-jerk reactions to this hike in wildfire-related losses. “There is a strong parallel between the 2017-18 wildfire seasons and the 2004-05 hurricane seasons in terms of people’s gut instincts. 2004 saw four hurricanes make landfall in Florida, with Katrina, Rita and Wilma causing massive devastation in 2005. At the time, some pockets of the industry wondered out loud if parts of Florida were uninsurable. Yet the next decade was relatively benign in terms of hurricane activity.

“The key is to adopt a balanced, long-term view,” thinks Folkman. “At RMS, we think that fire severity is here to stay, while the frequency of big events may remain volatile from year-to-year.”

A fundamental re-evaluation

The California losses are forcing (re)insurers to overhaul their approach to wildfire, both at the individual risk and portfolio management levels.

“The 2017 and 2018 California wildfires have forced one of the biggest re-evaluations of a natural peril since Hurricane Andrew in 1992,” believes Bove. “For both California wildfire and Hurricane Andrew, the industry didn’t fully comprehend the potential loss severities. Catastrophe models were relatively new and had not gained market-wide adoption, and many organizations were not systematically monitoring and limiting large accumulation exposure in high-risk areas. As a result, the shocks to the industry were similar.”

For decades, approaches to underwriting have focused on the wildland-urban interface (WUI), which represents the area where exposure and vegetation meet. However, exposure levels in these areas are increasing sharply. Combined with excessive amounts of burnable vegetation, extended wildfire seasons, and climate-change-driven increases in temperature and extreme weather conditions, these factors are combining to cause a significant hike in exposure potential for the (re)insurance industry.

A recent report published in PNAS entitled “Rapid Growth of the U.S. Wildland-Urban Interface Raises Wildfire Risk” showed that between 1990 and 2010 the new WUI area increased by 72,973 square miles (189,000 square kilometers) — larger than Washington State. The report stated: “Even though the WUI occupies less than one-tenth of the land area of the conterminous United States, 43 percent of all new houses were built there, and 61 percent of all new WUI houses were built in areas that were already in the WUI in 1990 (and remain in the WUI in 2010).”

“The WUI has formed a central component of how wildfire risk has been underwritten,” explains Folkman, “but you cannot simply adopt a black-and-white approach to risk selection based on properties within or outside of the zone. As recent losses, and in particular the 2017 Northern California wildfires, have shown, regions outside of the WUI zone considered low risk can still experience devastating losses.”

For Bove, while focus on the WUI is appropriate, particularly given the Coffey Park disaster during the 2017 Tubbs Fire, there is not enough focus on intermix areas, where properties are interspersed with vegetation.

“In some ways, the wildfire risk to intermix communities is worse than that at the interface,” he explains. “In an intermix fire, you have both a wildfire and an urban conflagration impacting the town at the same time, while in interface locations the fire has largely transitioned to an urban fire.

“In an intermix community,” he continues, “the terrain is often more challenging and limits firefighter access to the fire as well as evacuation routes for local residents. Also, many intermix locations are far from large urban centers, limiting the amount of firefighting resources immediately available to start combatting the blaze, and this increases the potential for a fire in high-wind conditions to become a significant threat. Most likely we’ll see more scrutiny and investigation of risk in intermix towns across the nation after the Camp Fire’s decimation of Paradise, California.”

Rethinking wildfire analysis

According to Folkman, the need for greater market maturity around wildfire will require a rethink of how the industry currently analyzes the exposure and the tools it uses.

“Historically, the industry has relied primarily upon deterministic tools to quantify U.S. wildfire risk,” he says, “which relate overall frequency and severity of events to the presence of fuel and climate conditions, such as high winds, low moisture and high temperatures.”

While such tools can prove valuable for addressing “typical” wildland fire events, such as the 2017 Thomas Fire in Southern California, their flaws have been exposed by other recent losses.

Burning Wildfire at Sunset

“Such tools insufficiently address major catastrophic events that occur beyond the WUI into areas of dense exposure,” explains Folkman, “such as the Tubbs Fire in Northern California in 2017. Further, the unprecedented severity of recent wildfire events has exposed the weaknesses in maintaining a historically based deterministic approach.”

While the scale of the 2017-18 losses has focused (re)insurer attention on California, companies must also recognize the scope for potential catastrophic wildfire risk extends beyond the boundaries of the western U.S.

“While the frequency and severity of large, damaging fires is lower outside California,” says Bove, “there are many areas where the risk is far from negligible.” While acknowledging that the broader western U.S. is seeing increased risk due to WUI expansion, he adds: “Many may be surprised that similar wildfire risk exists across most of the southeastern U.S., as well as sections of the northeastern U.S., like in the Pine Barrens of southern New Jersey.”

As well as addressing the geographical gaps in wildfire analysis, Folkman believes the industry must also recognize the data gaps limiting their understanding.

“There are a number of areas currently understated in underwriting practices, such as the far-ranging impact of ember accumulation and its potential to ignite urban conflagrations, as well as the vulnerability of particular structures and the effect of mitigation measures such as defensible space and fire-resistant roof coverings.”

In generating its US$9 billion to US$13 billion loss estimate for the Camp and Woolsey Fires, RMS used its recently launched North America Wildfire High-Definition (HD) Models to simulate the ignition, fire spread, ember accumulations and smoke dispersion of the fires.

“In assessing the contribution of embers, for example,” Folkman states, “we modeled the accumulation of embers, their wind-driven travel and their contribution to burn hazard both within and beyond the fire perimeter. The average ember contribution to structure damage and destruction is approximately 15 percent, but in a wind-driven event such as the Tubbs Fire this figure is much higher. This was a key factor in the urban conflagration in Coffey Park.”

The model also provides full contiguous U.S. coverage, and includes other model innovations such as ignition and footprint simulations for 50,000 years, flexible occurrence definitions, smoke and evacuation loss across and beyond the fire perimeter, and vulnerability and mitigation measures on which RMS collaborated with the Insurance Institute for Business & Home Safety.

Smoke damage, which leads to loss from evacuation orders and contents replacement, is often overlooked in risk assessments, despite comprising a tangible portion of the loss, says Folkman. “These are very high-frequency, medium-sized losses and must be considered. The Woolsey Fire saw 260,000 people evacuated, incurring hotel, meal and transport-related expenses. Add to this smoke damage, which often results in high-value contents replacement, and you have a potential sea of medium-sized claims that can contribute significantly to the overall loss.”

A further data-resolution challenge relates to property characteristics. While primary property attribute data is typically well captured, believes Bove, many secondary characteristics key to wildfire risk are either not captured or not captured consistently.

“This leaves the industry overly reliant on both average model weightings and risk-scoring tools. For example, defensible space, roofing and siding materials, and the protection of vents and soffits from ember attack are just a few of the additional fields the industry will need to start capturing to better assess wildfire risk to a property.”

A highly complex peril

Bove is, however, conscious of the simple fact that “wildfire behavior is extremely complex and non-linear.” He continues: “While visiting Paradise, I saw properties that did everything correct with regard to wildfire mitigation but still burned and risks that did everything wrong and survived. However, mitigation efforts can improve the probability that a structure survives.”

“With more data on historical fires,” Folkman concludes, “more research into mitigation measures and increasing awareness of the risk, wildfire exposure can be addressed and managed. But it requires a team mentality, with all parties — (re)insurers, homeowners, communities, policymakers and land-use planners — all playing their part.”

Vulnerability: in equal measure

As international efforts grow to minimize the disproportionate impact of disasters on specific parts of society, EXPOSURE looks at how close public/private collaboration will be critical to moving forward

A woman carries items through Port-au-Prince, Haiti, after the 2010 earthquake destroyed the city

There is a widely held and understandable belief that large-scale disasters are indiscriminate events. They weigh out devastation in equal measure, irrespective of the gender, age, social standing or physical ability of those impacted.

The reality, however, is very different. Catastrophic events expose the various inequalities within society in horrific fashion. Women, children, the elderly, people with disabilities and those living in economically deprived areas are at much greater risk than other parts of society both during the initial disaster phase and the recovery process.

Cyclone Gorky, for example, which struck Bangladesh in 1991, caused in the region of 140,000 deaths — women made up 93 percent of that colossal death toll. Similarly, in the 2004 Indian Ocean Tsunami some 70 percent of the 250,000 fatalities were women.

Looking at the disparity from an age-banded perspective, during the 2005 Kashmir Earthquake 10,000 schools collapsed, resulting in the deaths of 19,000 children. Children also remain particularly vulnerable well after disasters have subsided. In 2014, a University of San Francisco study of death rates in the Philippines found that delayed deaths among female infants outnumbered reported typhoon deaths by 15-to-1 following an average typhoon season — a statistic widely attributed to parents prioritizing their male infants at a time of extreme financial difficulty.

And this disaster disparity is not limited to developing nations as some may assume. Societal groups in developed nations can be just as exposed to a disproportionate level of risk.

During the recent Camp Fire in California, figures revealed that residents of the town of Paradise aged 75 or over were eight times more likely to die than the average for all other age bands. This age-related disparity was only marginally smaller for Hurricane Katrina in 2005.

The scale of the problem

These alarming statistics are now resonating at the highest levels. Growing recognition of the inequalities in disaster-related fatality ratios is now influencing global thinking on disaster response and management strategies. Most importantly, it is a central tenet of the Sendai Framework for Disaster Risk Reduction 2015–2030, which demands an “all-of-society engagement and partnership” to reduce risk that encompasses those “disproportionately affected by disasters.”

Yet a fundamental problem is that disaggregated data for specific vulnerable groups is not being captured for the majority of disasters.

“There is a growing acknowledgment across many nations that certain groupings within society are disproportionately impacted by disasters,” explains Alison Dobbin, principal catastrophe risk modeler at RMS. “Yet the data required to get a true sense of the scale of the problem simply isn’t being utilized and disaggregated in an effective manner post-disaster. And without exploiting and building on the data that is available, we cannot gain a working understanding of how best to tackle the multiple issues that contribute to it.”

The criticality of capturing disaster datasets specific to particular groups and age bands is clearly flagged in the Sendai Framework. Under the “Guiding Principles,” the document states: “Disaster risk reduction requires a multi-hazard approach and inclusive risk-informed decision-making based on the open exchange and dissemination of disaggregated data, including by sex, age and disability, as well as on easily accessible, up-to-date, comprehensible, science-based, non-sensitive risk information, complemented by traditional knowledge.”

Gathering the data

Effective data capture, however, requires a consistent approach to the collection of disaggregated information across all groups — first, to understand the specific impacts of particular perils on distinct groups, and second, to generate guidance, policies and standards for preparedness and resilience that reflect the unique sensitivities.

While efforts to collect and analyze disaggregated data are increasing, the complexities involved in ascertaining the differentiated vulnerabilities of specific groups are becoming increasingly apparent, as Nicola Howe, lead catastrophe risk modeler at RMS, explains.

“We can go beyond statistics collection, and model those factors which lead to discriminative outcomes”
— Nicola Howe, RMS

“You have to remember that social vulnerability varies from place to place and is often in a state of flux,” she says. “People move, levels of equality change, lifestyles evolve and the economic conditions in specific regions fluctuate. Take gender-based vulnerabilities for example. They tend not to be as evident in societies that demonstrate stronger levels of sexual equality.

“Experiences during disasters are also highly localized and specific to the particular event or peril,” she continues. “There are multiple variables that can influence the impact on specific groups. Cultural, political and economic factors are strong influencers, but other aspects such as the time of day or the particular season can also have a significant effect on outcomes.”

This creates challenges, not only for attributing specific vulnerabilities to particular groups and establishing policies designed to reduce those vulnerabilities, but also for assessing the extent to which the measures are having the desired outcomes.

Establishing data consistency and overcoming the complexities posed by this universal problem will require the close collaboration of all key participants.

“It is imperative that governments and NGOs recognize the important part that the private sector can play in working together and converting relevant data into the targeted insight required to support effective decision-making in this area,” says Dobbin.

A collective response

At time of writing, Dobbin and Howe were preparing to join a diverse panel of speakers at the UN’s 2019 Global Platform for Disaster Risk Reduction in Switzerland. This year’s convening marks the third consecutive conference at which RMS has participated. Previous events have seen Robert Muir-Wood, chief research officer, and Daniel Stander, global managing director, present on the resilience dividend and risk finance.

The title of this year’s discussion is “Using Gender, Age and Disability-Responsive Data to Empower Those Left Furthest Behind.”

“One of our primary aims at the event,” says Howe, “will be to demonstrate the central role that the private sector, and in our case the risk modeling community, can play in helping to bridge the data gap that exists, and to promote the meaningful ways in which we can contribute.”

The data does, in some cases, exist and is maintained primarily by governments and NGOs in the form of census data, death certificates, survey results and general studies.

“Companies such as RMS provide the capabilities to convert this raw data into actionable insight,” Dobbin says. “We model from hazard, through vulnerability and exposure, all the way to the financial loss. That means we can take the data and turn it into outputs that governments and NGOs can use to better integrate disadvantaged groups into resilience planning.”

But it’s not simply about getting access to the data. It is also about working closely with these bodies to establish the questions that they need answers to. “We need to understand the specific outputs required. To this end, we are regularly having conversations with many diverse stakeholders,” adds Dobbin.

While to date the analytical capabilities of the risk modeling community have not been directed at the social vulnerability issue in any significant way, RMS has worked with organizations to model human exposure levels for perils. Collaborating with the Workers’ Compensation Insurance Rating Bureau of California (WCIRB), a private, nonprofit association, RMS conducted probabilistic earthquake analysis on exposure data for more than 11 million employees. This included information about the occupation of each employee to establish potential exposure levels for workers’ compensation cover in the state.

“We were able to combine human exposure data to model the impact of an earthquake, ascertaining vulnerability based on where employees were likely to be, their locations, their specific jobs, the buildings they worked in and the time of day that the event occurred,” says Howe. “We have already established that we can incorporate age and gender data into the model, so we know that our technology is capable of supporting detailed analyses of this nature on a huge scale.”

She continues: “We must show where the modeling community can make a tangible difference. We bring the ability to go beyond the collection of statistics post-disaster and to model those factors that lead to such strong differences in outcomes, so that we can identify where discrimination and selective outcomes are anticipated before they actually happen in disasters. This could be through identifying where people are situated in buildings at different times of day, by gender, age, disability, etc. It could be by modeling how different people by age, gender or disability will respond to a warning of a tsunami or a storm surge. It could be by modeling evacuation protocols to demonstrate how inclusive they are.”
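The time-of-day dependence Howe describes can be sketched in a few lines. The diurnal occupancy profile, vulnerability weights and worker count below are invented for illustration and are not RMS model parameters:

```python
def workplace_share(hour: int) -> float:
    """Crude diurnal profile: fraction of workers at their workplace
    at a given hour. Numbers are purely illustrative."""
    return 0.85 if 9 <= hour < 17 else 0.05

def expected_casualty_index(hour: int, workers: int,
                            work_vuln: float, home_vuln: float) -> float:
    """Weight the exposed population by where people are when the
    earthquake strikes (hypothetical vulnerability weights)."""
    at_work = workplace_share(hour) * workers
    at_home = workers - at_work
    return at_work * work_vuln + at_home * home_vuln

# Same shaking, same 11 million workers, different time of day:
print(expected_casualty_index(14, 11_000_000, 0.002, 0.001))  # mid-afternoon: 20350.0
print(expected_casualty_index(3, 11_000_000, 0.002, 0.001))   # overnight: 11550.0
```

The same pattern extends to any attribute that shifts where people are or how they respond, which is why disaggregation by age, gender and disability matters.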

Strengthening the synergies

A critical aspect of reducing the vulnerability of specific groups is to ensure disadvantaged elements of society become more prominent components of mitigation and response planning efforts. A more people-centered approach to disaster management was a key aspect of the forerunner to the Sendai Framework, the Hyogo Framework for Action 2005–2015. The plan called for risk reduction practices to be more inclusive and engage a broader scope of stakeholders, including those viewed as being at higher risk.

This approach is a core part of the “Guiding Principles” that underpin the Sendai Framework. It states: “Disaster risk reduction requires an all-of-society engagement and partnership. It also requires empowerment and inclusive, accessible and non-discriminatory participation, paying special attention to people disproportionately affected by disasters, especially the poorest. A gender, age, disability and cultural perspective should be integrated in all policies and practices, and women and youth leadership should be promoted.”

The Framework also calls for the empowerment of women and people with disabilities, enabling them “to publicly lead and promote gender equitable and universally accessible response, recovery, rehabilitation and reconstruction approaches.”

This is a main area of focus for the U.N. event, explains Howe. “The conference will explore how we can promote greater involvement among members of these disadvantaged groups in resilience-related discussions, because at present we are simply not capitalizing on the insight that they can provide.

“Take gender for instance. We need to get the views of those disproportionately impacted by disaster involved at every stage of the discussion process so that we can ensure that we are generating gender-sensitive risk reduction strategies, that we are factoring universal design components into how we build our shelters, so women feel welcome and supported. Only then can we say we are truly recognizing the principles of the Sendai Framework.”

Clear link between flood losses and North Atlantic Oscillation

RMS research demonstrates the relationship between the NAO and catastrophic flood events in Europe

The correlation between the North Atlantic Oscillation (NAO) and European precipitation patterns is well known. However, a definitive link between phases of the NAO and catastrophic flood events and related losses had not previously been established — until now.

A study by RMS published in Geophysical Research Letters has revealed a direct correlation between the NAO and the occurrence of catastrophic floods across Europe and associated economic losses. The analysis not only identified a statistically significant relationship between NAO phase and flood occurrence, but critically showed that average flood losses during opposite NAO states can differ by up to 50 percent.

A change in pressure

The NAO’s impact on meteorological patterns is most pronounced in winter. Fluctuations in the atmospheric pressure between two semi-permanent centers of low and high pressure in the North Atlantic influence wind direction and strength as well as storm tracks.

The two-pronged study combined extensive analysis of flood occurrence and peak water levels across Europe with large-scale modeling of European flood events using the RMS Europe Inland Flood High-Definition (HD) Model.

The data sets included HANZE-Events, a catalog of over 1,500 catastrophic European flood events between 1870 and 2016, and a recent database of the highest-recorded water levels based on data from over 4,200 weather stations.

“The HD model generated a large set of potential catastrophic flood events and quantified the associated losses”

“This analysis established a clear relationship between the occurrence of catastrophic flood events and the NAO phase,” explains Stefano Zanardo, principal modeler at RMS, “and confirmed that a positive NAO increased catastrophic flooding in Northern Europe, with a negative phase influencing flooding in Southern Europe. However, to ascertain the impact on actual flood losses we turned to the model.”

Modeling the loss

The HD model generated a large set of potential catastrophic flood events and quantified the associated losses. It not only factored in precipitation, but also rainfall runoff, river routing and inundation processes. Critically, the precipitation incorporated the impact of a simulated monthly NAO index as a driver for monthly rainfall.

“It showed that seasonal flood losses can increase or decrease by up to 50 percent between positive and negative NAOs, which is very significant,” states Zanardo. “What it also revealed were distinct regional patterns. For example, a positive state resulted in increased flood activity in the U.K. and Germany. These loss patterns provide a spatial correlation of flood risk not previously detected.”
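The mechanism Zanardo describes — conditioning seasonal losses on NAO phase — can be caricatured in a toy simulation, in which rainfall scales with a positive or negative NAO index and losses accrue only above a flood threshold. This is not the HD model; every number here is invented for illustration:

```python
import random

random.seed(42)  # reproducible toy run

def seasonal_loss(nao_phase: float) -> float:
    """Toy seasonal flood loss for a 'Northern Europe'-style region:
    rainfall scales with the NAO phase, and losses accrue only above a
    flood threshold. All numbers are invented."""
    rainfall = 100 * (1 + 0.25 * nao_phase) * random.uniform(0.8, 1.2)
    return max(0.0, rainfall - 80)

positive = [seasonal_loss(+1.0) for _ in range(10_000)]
negative = [seasonal_loss(-1.0) for _ in range(10_000)]

ratio = sum(positive) / sum(negative)
print(f"positive/negative NAO mean-loss ratio: {ratio:.1f}")
```

Even this caricature reproduces the qualitative result: because losses are a thresholded function of rainfall, a modest shift in precipitation between NAO states produces a much larger swing in flood losses.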

Currently, NAO seasonal forecasting is limited to a few months. However, as this window expands, the potential for carriers to factor oscillation phases into flood-related renewal and capital allocation strategies will grow. Further, greater insight into spatial correlation could support more effective portfolio management.

“At this stage,” he concludes, “we have confirmed the link between the NAO and flood-related losses. How this evolves to influence carriers’ flood strategies is still to be seen, and a key factor will be advances in NAO forecasting. What is clear is that oscillations such as the NAO must be included in model assumptions to truly understand flood risk.”

Earthquake risk – New Zealand insurance sector experiences growing pains

Speed of change around homeowners insurance is gathering pace as insurers move to differential pricing models

Road cracks appeared during the 2016 Kaikōura Earthquake in New Zealand

New Zealand’s insurance sector is undergoing fundamental change as the impact of the NZ$40 billion (US$27 billion) Canterbury Earthquake and more recent Kaikōura disaster spur efforts to create a more sustainable, risk-reflective marketplace.

In 2018, EXPOSURE examined risk-based pricing in the region following Tower Insurance’s decision to adopt such an approach to achieve a “fairer and more equitable way of pricing risk.” Since then, IAG, the country’s largest general insurer, has followed suit, with properties in higher-risk areas forecast to see premium hikes, while it also adopts “a conservative approach” to providing insurance in peril-prone areas.

“Insurance, unsurprisingly, is now a mainstream topic across virtually every media channel in New Zealand,” says Michael Drayton, a consultant at RMS. “There has been a huge shift in how homeowners insurance is viewed, and it will take time to adjust to the introduction of risk-based pricing.”

Another market-changing development is the move by the country’s Earthquake Commission (EQC) to increase the first layer of buildings’ insurance cover it provides from NZ$100,000 to NZ$150,000 (US$68,000 to US$101,000), while lowering contents cover from NZ$20,000 (US$13,500) to zero. These changes come into force in July 2019.

Modeling the average annual loss (AAL) impact of these changes based on the updated RMS New Zealand Earthquake Industry Exposure Database shows the private sector will see a marginal increase in the amount of risk it takes on as the AAL increase from the contents exit outweighs the decrease from the buildings cover hike.
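The buildings side of that AAL calculation can be sketched as a simple layer split. The event losses and occurrence rates below are hypothetical placeholders, not RMS event-set output; the sketch shows only why raising the cap shrinks the private buildings AAL — the contents exit, not modeled here, pushes risk the other way, which is why the net private-sector AAL rises marginally:

```python
def split_loss(ground_up: float, eqc_cap: float):
    """Split a ground-up buildings loss into the EQC first layer and the
    private-insurer layer above the cap."""
    eqc = min(ground_up, eqc_cap)
    return eqc, ground_up - eqc

# Hypothetical per-dwelling event losses with annual occurrence rates:
events = [(200_000, 0.002), (120_000, 0.01), (60_000, 0.03)]

def private_buildings_aal(eqc_cap: float) -> float:
    """Average annual loss retained by the private market above the cap."""
    return sum(split_loss(loss, eqc_cap)[1] * rate for loss, rate in events)

print(private_buildings_aal(100_000))  # old NZ$100,000 cap
print(private_buildings_aal(150_000))  # new NZ$150,000 cap: private AAL falls
```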

These findings have contributed greatly to the debate around the relationship between buildings and contents cover. One major issue the market has been addressing is its ability to accurately estimate sums insured. According to Drayton, recent events have seen three separate spikes around exposure estimates.

“The first spike occurred in the aftermath of the Christchurch Earthquake,” he explains, “when there was much debate about commercial building values and limits, and confusion relating to sums insured and replacement values.

“The second occurred with the move away from open-ended replacement policies in favor of sums insured for residential properties.

“Now that the EQC has removed contents cover, we are seeing another spike as the private market broaches uncertainty around content-related replacement values.

“There is very much an education process taking place across New Zealand’s insurance industry,” Drayton concludes. “There are multiple lessons being learned in a very short period of time. Evolution at this pace inevitably results in growing pains, but if it is to achieve a sustainable insurance market it must push on through.”

A risk-driven business

Following Tower Insurance’s switch to risk-based pricing in New Zealand, EXPOSURE examines how recent market developments may herald a more fundamental industry shift

The ramifications of the Christchurch earthquakes of 2010-11 continue to reverberate through the New Zealand insurance market. The country’s Earthquake Commission (EQC), which provides government-backed natural disaster insurance, is forecast to have paid around NZ$11 billion (US$7.3 billion) by the time it settles its final claim.

The devastating losses exposed significant shortfalls in the country’s insurance market. These included major deficiencies in insurer data, gaps in portfolio management and expansive policy wordings that left carriers exposed to numerous unexpected losses.

Since then, much has changed. Policy terms have been tightened, restrictions have been introduced on coverage and concerted efforts have been made to bolster databases. On July 1, 2019, the EQC increased the cap on the government-mandated residential cover it provides to all householders from NZ$100,000 (US$66,000), a figure set in 1993, to NZ$150,000. This is a significant increase, but still well below both the average house price in New Zealand as of December 2017, which stood at NZ$669,565, and the average rebuild cost of NZ$350,000. The EQC has also removed contents coverage.

More recently, however, one development has taken place that has the potential to have a much more profound impact on the market.

Risk-based pricing

In March 2018, New Zealand insurer Tower Insurance announced a move to risk-based pricing for home insurance. It aims to ensure premium levels are commensurate with individual property risk profiles, with those in highly exposed areas experiencing a price rise on the earthquake component of their coverage.

Describing the shift as a “fairer and more equitable way of pricing risk,” Tower CEO Richard Harding says this was the “right thing to do” both for the “long-term benefit of New Zealand” and for customers, with risk-based pricing “the fairest way to distribute the costs we face as an insurer.”

The move has generated much media coverage, with stories highlighting instances of triple-digit percentage hikes in earthquake-prone regions such as Wellington. Yet, what has generated significantly fewer column inches has been the marginal declines available to the vast majority of households in the less seismically active regions, as the high-risk earthquake burden on their premium is reduced.
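The arithmetic behind that asymmetry is simple to sketch: flat pricing charges every home the portfolio-average expected loss, so unwinding the cross-subsidy concentrates large percentage rises on a few homes while spreading small declines across the many. All figures below are invented for illustration:

```python
# 100 hypothetical homes: most low-risk, a few highly quake-exposed.
n_low, n_high = 97, 3
low_el, high_el = 100.0, 1500.0  # expected annual quake loss per home (invented)

# Flat (pooled) pricing charges everyone the portfolio average:
pooled = (n_low * low_el + n_high * high_el) / (n_low + n_high)
print(f"flat earthquake premium: NZ${pooled:.0f}")

# Risk-based pricing charges each home its own expected loss:
print(f"high-risk change: {high_el / pooled:.1f}x")  # a triple-digit % hike

# The quake component is only one slice of the total premium, so the
# saving on a low-risk household's whole bill is marginal:
other_components = 800.0
print(f"low-risk total change: {(low_el + other_components) / (pooled + other_components) - 1:+.1%}")
```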

A key factor in Tower’s decision was the increasing quality and granularity of the underwriting data at its disposal. “Tower has always focused on the quality of its data and has invested heavily in ensuring it has the highest-resolution information available,” says Michael Drayton, senior risk modeler for RMS, based in New Zealand.

“The earthquakes generated the most extensive liquefaction in a built-up area seen in a developed country” — Michael Drayton, RMS

In fact, in the aftermath of the Christchurch earthquakes, RMS worked with Tower, whose high-caliber data helped inform the rebuild of the RMS New Zealand High-Definition (HD) Earthquake Model. Prior to the earthquakes, claims data was in very short supply, given that there had been few previous events with large-scale impacts on highly built-up areas.

“On the vulnerability side,” Drayton explains, “we had virtually no local claims data to build our damage functions. Our previous model had used comparisons of building performance in other earthquake-exposed regions. After Christchurch, we suddenly had access to billions of dollars of claims information.”

RMS sourced data from numerous parties, including EQC and Tower, as well as geoscience research firm GNS Science, as it reconstructed the model from this swell of data.

“RMS had a model that had served the market well for many years,” he explains. “On the hazard side, the fundamentals remained the same — the highest hazard is along the plate boundary, which runs offshore along the east coast of North Island traversing over to the western edge of South Island. But we had now gathered new information on fault lines, activity rates, magnitudes and subduction zones. We also updated our ground motion prediction equations.”

One of the most high-profile model developments was the advanced liquefaction module. “The 2010-11 earthquakes generated probably the most extensive liquefaction in a built-up area seen in a developed country. With the new information, we were now able to capture the risk at much higher gradients and in much greater resolution,” says Drayton.

This data surge enabled RMS to construct its New Zealand Earthquake HD Model on a variable resolution grid set at a far more localized level. In turn, this has helped give Tower sufficient confidence in the granularity and accuracy of its data at the property level to adopt risk-based pricing.

The ripple effects

As homeowners received their renewal notices, the reality of risk-based pricing started to sink in. Tower is the third-largest insurer for domestic household, contents and private motor cover in New Zealand and faces stiff competition. Over 70 percent of the market is in the hands of two players, with IAG holding around 47 percent and Suncorp approximately 25 percent.

News reports also suggested movement from the larger players. AMI and State, both owned by IAG, announced that three-quarters of their policyholders — those at heightened risk of earthquake, landslide or flood — will see an average annual premium increase of NZ$91 (US$60); the remaining quarter at lower risk will see decreases averaging NZ$54 per year. A handful of households could see increases or decreases of up to NZ$1,000. According to the news website Stuff, IAG has not changed premiums for its NZI policyholders; NZI sells house insurance policies through brokers.

“One interesting dynamic is that a small number of start-ups are now entering the market with the same risk-based pricing stance taken by Tower,” Drayton points out. “These are companies with new purpose-built IT systems that are small and nimble and able to target niche sectors.”

“It’s certainly a development to watch closely,” he continues, “as it raises the potential for larger players to be selected against if they are not able to respond effectively. It will be interesting to see if the rate of these new entrants increases.”

The move from IAG suggests risk-based pricing will extend beyond the earthquake component of cover to flood-related elements. “Flood is not a reinsurance peril for New Zealand, but it is an attritional one,” Drayton points out. “Then there is the issue of rising sea levels and the potential for coastal flooding, which is a major cause for concern. So, the risk-based pricing shift is feeding into climate change discussions too.”

A fundamental shift

Policyholders in risk-exposed areas such as Wellington were almost totally unaware of how much higher their insurance should have been based on their property exposure, having been largely shielded from the risk reality of earthquakes in recent years. The move to risk-based pricing will change that.

“The market shifts we are seeing today pose a multitude of questions and few clear answers” — Michael Drayton, RMS

Drayton agrees that recent developments are opening the eyes of homeowners. “There is a growing realization that New Zealand’s insurance market has operated very differently from other insurance markets and that that is now changing.”

One major marketwide development in recent years has been the move from full replacement cover to fixed sums insured in household policies. “This has a lot of people worried they might not be covered,” he explains. “Whereas before, people simply assumed that in the event of a big loss the insurer would cover it all, now they’re slowly realizing it no longer works like that. This will require a lot of policyholder education and will take time.”

At a more foundational level, current market dynamics also raise questions about the fundamental role of insurance, exposing the conflicted position of the insurer as both a facilitator of risk pooling and a profit-making enterprise. When investment returns outweighed underwriting profit, cross-subsidization did not appear to be a major issue. Current conditions, however, have focused the operating model squarely on underwriting returns, which favors risk-based pricing.

Cross-subsidization is the basis upon which EQC is built, but is it fair? Twenty cents in every NZ$100 (US$66) of home or contents fire insurance premium, up to a maximum of NZ$100,000 insured, is passed on to the EQC. While to date there has been limited government response to risk-based pricing, it is monitoring the situation closely given the broader implications.
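The flat-rated structure can be made concrete with a one-line levy function. The reading below — 20 cents per NZ$100 of cover, applied up to NZ$100,000 of insured value — is one interpretation of the scheme as described above, used purely for illustration:

```python
def eqc_levy(sum_insured: float) -> float:
    """Flat-rated levy: 20 cents per NZ$100 of cover, applied up to
    NZ$100,000 of insured value, regardless of the property's hazard.
    (One illustrative reading of the scheme described above.)"""
    return 0.002 * min(sum_insured, 100_000)

# A low-risk and a high-risk home pay exactly the same levy:
print(eqc_levy(100_000))  # 200.0
print(eqc_levy(500_000))  # 200.0 -- capped, and blind to hazard
```

The levy depends only on the sum insured, never on the hazard at the property — which is precisely the cross-subsidy that risk-based pricing unwinds.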

Looking globally, in an RMS blog, chief research officer Robert Muir-Wood raises the question of whether “flat-rated” schemes, like the French cat nat scheme, will survive now that risk models have made clear the wide differentials in the underlying cost of the risk. He asks whether “such schemes are established in the name of ‘solidarity’ or ignorance?”

While there is no evidence yet, current developments raise the potential for certain risks to become uninsurable. Increasingly granular data combined with the drive for greater profitability may cause a downward spiral in a market built on a shared burden.

Drayton adds: “Potential uninsurability has more to do with land-use planning and building consent regimes, and insurers shouldn’t be paying the price for poor planning decisions. Ironically, earthquake loading codes are very sophisticated and have evolved to recognize the fine gradations in earthquake risk provided by localized data. In fact, they are so refined that structural engineers remark that they are too nuanced and need to be simpler. But if you are building in a high-risk area, it’s not just designing for the hazard, it is also managing the potential financial risk.”

He concludes: “The market shifts we are seeing today pose a multitude of questions and few clear answers. However, the only constant running through all these discussions is that they are all data driven.”

Making the move

Key to understanding the rationale behind the shift to risk-based pricing is understanding the broader economic context of New Zealand, says Tower CEO Richard Harding.

“The New Zealand economy is comparatively small,” he explains, “and we face a range of unique climatic and geological risks. If we don’t plan for and mitigate these risks, there is a chance that reinsurers will charge insurers more or restrict cover.

“Before this happens, we need to educate the community, government, councils and regulators, and by moving toward risk-based pricing, we’re putting a signal into the market to drive social change through these organizations.

“These signals will help demonstrate to councils and government that more needs to be done to plan for and mitigate natural disasters and climate change.” 

Harding feels that this risk-based pricing shift is a natural market evolution. “When you look at global trends, this is happening around the world. So, given that we face a number of large risks here in New Zealand, in some respects, it’s surprising it hasn’t happened sooner,” he says.

While some parties have raised concerns that there may be a fall in insurance uptake in highly exposed regions, Harding does not believe this will be the case. “For the average home, insurance may be more expensive than it currently is, but it won’t be unattainable,” he states. 

Moving forward, he says that Tower is working to extend its risk-based pricing approach beyond the earthquake component of its cover, stating that the firm “is actively pursuing risk-based pricing for flood and other natural perils, and over the long term we would expect other insurers to follow in our footsteps.” 

In terms of the potential wider implications if this occurs, Harding says that such a development would compel government, councils and other organizations to change how they view risk in their planning processes. “I think it will start to drive customers to consider risk more holistically and take this into account when they build and buy homes,” he concludes.

In total harmony

Karen White joined RMS as CEO in March 2018, followed closely by Moe Khosravy, general manager of software and platform activities. EXPOSURE talks to both, along with Mohsen Rahnama, chief risk modeling officer and one of the firm’s most long-standing team members, about their collective vision for the company, innovation, transformation and technology in risk management

Karen and Moe, what was it that sparked your interest in joining RMS?

Karen: What initially got me excited was the strength of the hand we have to play here and the fact that the insurance sector is at a very interesting time in its evolution. The team is fantastic — one of the most extraordinary groups of talent I have come across. At our core, we have hundreds of Ph.D.s, superb modelers and scientists, surrounded by top engineers, and computer and data scientists.

I firmly believe no other modeling firm holds a candle to the quality of leadership and depth and breadth of intellectual property at RMS. We are years ahead of our competitors in terms of the products we deliver.

Moe: For me, what can I say? When Karen calls with an idea it’s very hard to say no! However, when she called about the RMS opportunity, I hadn’t ever considered working in the insurance sector.

My eureka moment came when I looked at the industry’s challenges and the technology available to tackle them. I realized that this wasn’t simply a cat modeling property insurance play, but was much more expansive. If you generalize the notion of risk and loss, the potential of what we are working on and the value to the insurance sector becomes much greater.

I thought about the technologies entering the sector and how new developments on the AI [artificial intelligence] and machine learning front could vastly expand current analytical capabilities. I also began to consider how such technologies could transform the sector’s cost base. In the end, the decision to join RMS was pretty straightforward.

“Developments such as AI and machine learning are not fairy dust to sprinkle on the industry’s problems”

Karen: The industry itself is reaching a eureka moment, which is precisely where I love to be. It is at a transformational tipping point — the technology is available to enable this transformation and the industry is compelled to undertake it.

I’ve always sought to enter markets at this critical point. When I joined Oracle in the 1990s, the business world was at a transformational point — moving from client-server computing to Internet computing. This has brought about many of the huge changes we have seen in business infrastructure since, so I had a bird’s-eye view of what was a truly extraordinary market shift coupled with a technology shift.

That experience made me realize how an architectural shift coupled with a market shift can create immense forward momentum. If the technology can’t support the vision, or if the challenges or opportunities aren’t compelling enough, then you won’t see that level of change occur.

Do (re)insurers recognize the need to change and are they willing to make the digital transition required?

Karen: I absolutely think so. There are incredible market pressures to become more efficient, assess risks more effectively, improve loss ratios, achieve better business outcomes and introduce more beneficial ways of capitalizing risk.

You also have numerous new opportunities emerging. New perils, new products and new ways of delivering those products that have huge potential to fuel growth. These can be accelerated not just by market dynamics but also by a smart embrace of new technologies and digital transformation.

Mohsen: Twenty-five years ago when we began building models at RMS, practitioners simply had no effective means of assessing risk. So, the adoption of model technology was a relatively simple step. Today, the extreme levels of competition are making the ability to differentiate risk at a much more granular level a critical factor, and our model advances are enabling that.

In tandem, many of the Silicon Valley technologies have the potential to greatly enhance efficiency, improve processing power, minimize cost, boost speed to market, enable the development of new products, and positively impact every part of the insurance workflow.

Data is the primary asset of our industry — it is the source of every risk decision, and every risk is itself an opportunity. The amount of data is increasing exponentially, and we can now capture more information much faster than ever before, and analyze it with much greater accuracy to enable better decisions. It is clear that the potential is there to change our industry in a positive way.

The industry is renowned for being risk averse. Is it ready to adopt the new technologies that this transformation requires?

Karen: The risk of doing nothing given current market and technology developments is far greater than that of embracing emerging tech to enable new opportunities and improve cost structures, even though there are bound to be some bumps in the road.

I understand that change management can be daunting. But many of the technologies RMS is leveraging to help clients improve price performance and model execution are not new. AI, the Cloud and machine learning are already tried and trusted, and the insurance market will benefit from the lessons other industries have learned as it integrates these technologies.

“The sector is not yet attracting the kind of talent that is attracted to firms such as Google, Microsoft or Amazon — and it needs to”

Moe: Making the necessary changes will challenge the perceived risk-averse nature of the insurance market as it will require new ground to be broken. However, if we can clearly show how these capabilities can help companies be measurably more productive and achieve demonstrable business gains, then the market will be more receptive to new user experiences.

Mohsen: The performance gains that technology is introducing are immense. A few years ago, we were using computational fluid dynamics to model storm surge. We were conducting the analysis through CPU [central processing unit] microprocessors, which was taking weeks. With the advent of GPU [graphics processing unit] microprocessors, we can carry out the same level of analysis in hours.

When you add the supercomputing capabilities possible in the Cloud, which have enabled us to deliver HD-resolution models to our clients (in particular for flood, which requires a high-gradient hazard model to differentiate risk effectively), productivity has been enhanced significantly, and with it price performance.

Is an industry used to incremental change able to accept the stepwise change technology can introduce?

Karen: Radical change often happens in increments. The change from client-server to Internet computing did not happen overnight, but was an incremental change that came in waves and enabled powerful market shifts.

Amazon is a good example of market leadership born of digital transformation. It launched in 1994 as an online bookstore in a mature, relatively sleepy industry. It evolved into broad e-commerce, and evolved again with the introduction of Cloud services when it launched AWS [Amazon Web Services] 12 years ago — now a US$17 billion business that has disrupted the computer industry and accounts for a huge portion of Amazon's profit. Having disrupted the retail sector, Amazon has grown from nothing to total revenue of US$178 billion in 25 years.

Retail consumption has changed dramatically, but I can still go shopping on London’s Oxford Street and about 90 percent of retail is still offline. My point is, things do change incrementally but standing still is not a great option when technology-fueled market dynamics are underway. Getting out in front can be enormously rewarding and create new leadership.

However, we must recognize that how we introduce technology must be driven by the challenges it is being introduced to address. I am already hearing people talk about developments such as AI, machine learning and neural networks as if they are fairy dust to sprinkle on the industry’s problems. That is not how this transformation process works.

How are you approaching the challenges that this transformation poses?

Karen: At RMS, we start by understanding the challenges and opportunities from our customers’ perspectives and then look at what value we can bring that we have not brought before. Only then can we look at how we deliver the required solution.

Moe: It’s about having an “outward-in” perspective. We have amazing technology expertise across modeling, computer science and data science, but to deploy that effectively we must listen to what the market wants.

We know that many companies are operating multiple disparate systems within their networks that have simply been built upon again and again. So, we must look at harnessing technology to change that, because where you have islands of data, applications and analysis, you lose fidelity, time and insight and costs rise.

Moe: While there is a commonality of purpose spanning insurers, reinsurers and brokers, every organization is different. At RMS, we must incorporate that into our software and our platforms. There is no one-size-fits-all and we can’t force everyone to go down the same analytical path.

That’s why we are adopting a more modular approach in terms of our software. Whether the focus is portfolio management or underwriting decision-making, it’s about choosing those modules that best meet your needs.

“Data is the primary asset of our industry — it is the source of every risk decision, and every risk is itself an opportunity”

Mohsen: When constructing models, we focus on how we can bring the right technology to solve the specific problems our clients have. This requires a huge amount of critical thinking to bring the best solution to market.

How strong is the talent base that is helping to deliver this level of capability?

Mohsen: RMS is extremely fortunate to have such a fantastic array of talent. This caliber of expertise is what helps set us apart from competitors, enabling us to push boundaries and advance our modeling capabilities at the speed we are.

Recently, we have set up teams of modelers and data and computer scientists tasked with developing a range of innovations. It’s fantastic having this depth of talent, and when you create an environment in which innovative minds can thrive you quickly reap the rewards — and that is what we are seeing. In fact, I have seen more innovation at RMS in the last six months than over the past several years.

Moe: I would add though that the sector is not yet attracting the kind of talent seen at firms such as Google, Microsoft or Amazon, and it needs to. These companies are either large-scale customer-service providers capitalizing on big data platforms and leading-edge machine-learning techniques to achieve the scale, simplicity and flexibility their customers demand, or enterprises actually building these core platforms themselves.

When you bring new blood into an organization or industry, you generate new ideas that challenge current thinking and practices, from the user interface to the underlying platform or the cost of performance. We need to do a better PR job as a technology sector. The best and brightest people in most cases just want the greatest problems to tackle — and we have a ton of those in our industry.

Karen: The critical component of any successful team is a balance of complementary skills and capabilities focused on having a high impact on an interesting set of challenges. If you get that dynamic right, then that combination of different lenses correctly aligned brings real clarity to what you are trying to achieve and how to achieve it.

I firmly believe at RMS we have that balance. If you look at the skills, experience and backgrounds of Moe, Mohsen and myself, for example, they couldn’t be more different. Bringing Moe and Mohsen together, however, has quickly sparked great and different thinking. They work incredibly well together despite their vastly different technical focus and career paths. In fact, we refer to them as the “Moe-Moes” and made them matching inscribed giant chain necklaces and presented them at an all-hands meeting recently.

Moe: Some of the ideas we generate during our discussions and with other members of the modeling team are incredibly powerful. What’s possible here at RMS we would never have been able to even consider before we started working together.

Mohsen: Moe’s vast experience of building platforms at companies such as HP, Intel and Microsoft is a great addition to our capabilities. Karen brings a history of innovation and building market platforms with the discipline and the focus we need to deliver on the vision we are creating. If you look at the huge amount we have been able to achieve in the months that she has been at RMS, that is a testament to the clear direction we now have.

Karen: While we do come from very different backgrounds, we share a very well-defined culture. We care deeply about our clients and their needs. We challenge ourselves every day to innovate to meet those needs, while at the same time maintaining a hell-bent pragmatism to ensure we deliver.

Mohsen: To achieve what we have set out to achieve requires harmony. It requires a clear vision, the scientific know-how, the drive to learn more, the ability to innovate and the technology to deliver — all working in harmony.

Career highlights

Karen White is an accomplished leader in the technology industry, with a 25-year track record of leading, innovating and scaling global technology businesses. She started her career in Silicon Valley in 1993 as a senior executive at Oracle. Most recently, Karen was president and COO at Addepar, a leading fintech company serving the investment management industry with data and analytics solutions.

Moe Khosravy has over 20 years of software innovation experience delivering enterprise-grade products and platforms differentiated by data science, powerful analytics and applied machine learning to help transform industries. Most recently he was vice president of software at HP Inc., supporting hundreds of millions of connected devices and clients.

Mohsen Rahnama leads a global team of accomplished scientists, engineers and product managers responsible for the development and delivery of all RMS catastrophe models and data. During his 20 years at RMS, he has been a dedicated, hands-on leader of the largest team of catastrophe modeling professionals in the industry.

A model operation

EXPOSURE explores the rationale, challenges and benefits of adopting an outsourced model function 

Business process outsourcing has become a mainstay of the operational structure of many organizations. In recent years, reflecting new technologies and changing market dynamics, the outsourced function has evolved significantly to fit seamlessly within existing infrastructure.

On the modeling front, the exponential increase in data coupled with the drive to reduce expense ratios while enhancing performance levels is making the outsourced model proposition an increasingly attractive one.

The business rationale

The rationale for outsourcing modeling activities spans multiple possible origin points, according to Neetika Kapoor Sehdev, senior manager at RMS.

“Drivers for adopting an outsourced modeling strategy vary significantly depending on the company itself and their specific ambitions. It may be a new startup that has no internal modeling capabilities, with outsourcing providing access to every component of the model function from day one.”

There is also the flexibility that such access provides, as Piyush Zutshi, director of RMS Analytical Services, points out.

“That creates a huge value-add in terms of our catastrophe response capabilities — knowing that we are able to report our latest position has made a big difference on this front” — Judith Woo, Starstone

“In those initial years, companies often require the flexibility of an outsourced modeling capability, as there is a degree of uncertainty at that stage regarding potential growth rates and the possibility that they may change track and consider alternative lines of business or territories should other areas not prove as profitable as predicted.”

Another big outsourcing driver is the potential to free up valuable internal expertise, as Sehdev explains.

“Often, the daily churn of data processing consumes a huge amount of internal analytical resources,” she says, “and limits the opportunities for these highly skilled experts to devote sufficient time to analyzing the data output and supporting the decision-making process.”

This all-too-common data stumbling block for many companies is one that not only affects their ability to capitalize fully on their data, but also to retain key analytical staff.

“Companies hire highly skilled analysts to boost their data performance,” Zutshi says, “but most of their working day is taken up by data crunching. That makes it extremely challenging to retain that caliber of staff as they are massively overqualified for the role and also have limited potential for career growth.”

Other reasons for outsourcing include new model testing: it provides organizations with a sandbox environment in which to assess the potential benefits and impact of a new model on their underwriting processes and portfolio management capabilities before committing to the license fee.

The flexibility of outsourced model capabilities can also prove critical during renewal periods. These seasonal activity peaks can be factored into contracts to ensure that organizations are able to cope with the spike in data analysis required as they reanalyze portfolios, renew contracts, add new business and write off old business.

“At RMS Analytical Services,” Zutshi explains, “we prepare for data surge points well in advance. We work with clients to understand the potential size of the analytical spike, and then we add a factor of 20 to 30 percent to that to ensure that we have the data processing power on hand should that surge prove greater than expected.”
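Zutshi's buffer arithmetic can be sketched as a short calculation. This is illustrative only: the function name and the job counts are invented for the example, and only the 20 to 30 percent factor comes from the article.

```python
import math

def surge_capacity(expected_peak_jobs: int, buffer: float = 0.25) -> int:
    """Provision for the expected analytical peak plus a safety buffer.

    A `buffer` of 0.20-0.30 mirrors the 20-30 percent factor quoted
    in the article; the job counts below are invented for illustration.
    """
    if not 0.20 <= buffer <= 0.30:
        raise ValueError("buffer outside the 20-30 percent range quoted")
    return math.ceil(expected_peak_jobs * (1 + buffer))

# A client expecting a renewal-season peak of 400 analyses per day
# would provision capacity for between 480 and 520.
print(surge_capacity(400, 0.20))  # 480
print(surge_capacity(400, 0.30))  # 520
```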

Things to consider

Integrating an outsourced function into existing modeling processes can prove a demanding undertaking, particularly in the early stages where companies will be required to commit time and resources to the knowledge transfer required to ensure a seamless integration. The structure of the existing infrastructure will, of course, be a major influencing factor in the ease of transition.

“There are those companies that over the years have invested heavily in their in-house capabilities and developed their own systems that are very tightly bound within their processes,” Sehdev points out, “which can mean decoupling certain aspects is more challenging. For those operations that run much leaner infrastructures, it can often be more straightforward to decouple particular components of the processing.”

RMS Analytical Services has, however, addressed this issue and now works increasingly within the systems of such clients, rather than operating as an external function. “We have the ability to work remotely, which means our teams operate fully within their existing framework. This removes the need to decouple any parts of the data chain, and we can fit seamlessly into their processes.”

This also helps address any potential data transfer issues companies may have, particularly given increasingly stringent information management legislation and guidelines.

There are a number of factors that will influence the extent to which a company will outsource its modeling function. Unsurprisingly, smaller organizations and startup operations are more likely to take the fully outsourced option, while larger companies tend to use it as a means of augmenting internal teams — particularly around data engineering.

RMS Analytical Services operates several engagement models. Managed services are based on annual contracts governed by volume for data engineering and risk analytics. On-demand services are available for one-off risk analytics projects, renewals support, bespoke analysis such as event response, and new IP adoption. “Modeler down the hall” is a third option covering ad hoc work, while the firm also offers consulting services in areas such as process optimization, model assessment and transition support.

Making the transition work

Starstone Insurance, a global specialty insurer providing a diversified range of property, casualty and specialty insurance to customers worldwide, has been operating an outsourced modeling function for two and a half years.

“My predecessor was responsible for introducing the outsourced component of our modeling operations,” explains Judith Woo, head of exposure management at Starstone. “It was very much a cost-driven decision as outsourcing can provide a very cost-effective model.”

The company operates a hybrid model, with the outsourced team working on most of the pre- and post-bind data processing, while its internal modeling team focuses on the complex specialty risks that fall within its underwriting remit.

“The volume of business has increased over the years as has the quality of data we receive,” she explains. “The amount of information we receive from our brokers has grown significantly. A lot of the data processing involved can be automated and that allows us to transfer much of this work to RMS Analytical Services.”

On a day-to-day basis, the process is straightforward, with the Starstone team uploading the data to be processed via the RMS data portal. The facility also acts as a messaging function with the two teams communicating directly. “In fact,” Woo points out, “there are email conversations that take place directly between our underwriters and the RMS Analytical Service team that do not always require our modeling division’s input.”

However, reaching this level of integration and trust has required a strong commitment from Starstone to making the relationship work.

“You are starting to work with a third-party operation that does not understand your business or its data processes. You must invest time and energy to go through the various systems and processes in detail,” she adds, “and that can take months depending on the complexity of the business.

“You are essentially building an extension of your team, and you have to commit to making that integration work. You can’t simply bring them in, give them a particular problem and expect them to solve it without there being the necessary knowledge transfer and sharing of information.”

Her internal modeling team of six has access to an outsourced team of 26, she explains, which greatly enhances the firm’s data-handling capabilities.

“With such a team, you can import fresh data into the modeling process on a much more frequent basis, for example. That creates a huge value-add in terms of our catastrophe response capabilities —
knowing that we are able to report our latest position has made a big difference on this front.”

Creating a partnership

As with any working partnership, the initial phases are critical as they set the tone for the ongoing relationship.

“We have well-defined due diligence and transition methodologies,” Zutshi states. “During the initial phase, we work to understand and evaluate their processes. We then create a detailed transition methodology, in which we define specific data templates, establish monthly volume loads, lean periods and surge points, and put in place communication and reporting protocols.”

At the end of this phase, both parties have a fully documented data dictionary with business rules governing how data will be managed, coupled with the option to choose from a repository of 1,000-plus validation rules for data engineering. This is reviewed regularly to ensure all processes remain aligned with the practices and direction of the organization.

“Often, the daily churn of data processing consumes a huge amount of internal analytical resources and limits the opportunities to devote sufficient time to analyzing the data output” — Neetika Kapoor Sehdev, RMS

Service level agreements (SLAs) also form a central component of the relationship, alongside stringent data compliance procedures.

“Robust data security and storage is critical,” says Woo. “We have comprehensive NDAs [non-disclosure agreements] in place that are GDPR compliant to ensure that the integrity of our data is maintained throughout. We also have stringent SLAs in place to guarantee data processing turnaround times. That said, you need to agree on a reasonable time period that reflects the complexity of the data and when it is delivered.”

According to Sehdev, most SLAs that the analytical team operates require a 24-hour data turnaround rising to 48-72 hours for more complex data requirements, but clients are able to set priorities as needed.

“However, there is no point delivering on turnaround times,” she adds, “if the quality of the data supplied is not fit for purpose. That’s why we apply a number of data quality assurance processes, which means that our first-time accuracy level is over 98 percent.”

The value-add

Most clients of RMS Analytical Services have outsourced modeling functions to the division for over seven years, with a number having worked with the team since it launched in 2004. The decision to incorporate their services is not taken lightly given the nature of the information involved and the level of confidence required in their capabilities.

“The majority of our large clients bring us on board initially in a data-engineering capacity,” explains Sehdev. “It’s the building of trust and confidence in our ability, however, that helps them move to the next tranche of services.”

The team has worked to strengthen and mature these relationships, which has enabled them to increase both the size and scope of the engagements they undertake.

“With a number of clients, our role has expanded to encompass account modeling, portfolio roll-up and related consulting services,” says Zutshi. “Central to this maturing process is that we are interacting with them daily and have a dedicated team that acts as the primary touch point. We’re also working directly with the underwriters, which helps boost comfort and confidence levels.

“For an outsourced model function to become an integral part of the client’s team,” he concludes, “it must be a close, coordinated effort between the parties. That’s what helps us evolve from a standard vendor relationship to a trusted partner.”

Pushing back the water

Flood Re has been tasked with creating a risk-reflective, affordable U.K. flood insurance market by 2039. Moving forward, data resolution that supports critical investment decisions will be key

Millions of properties in the U.K. are exposed to some form of flood risk. While exposure levels vary massively across the country, coastal, fluvial and pluvial floods have the potential to impact most locations across the U.K. Recent flood events have demonstrated this dramatically, with properties in perceived low-risk areas nevertheless severely affected.

Before the launch of Flood Re, securing affordable household cover in high-risk areas had become more challenging — and for those impacted by flooding, almost impossible. To address this problem, Flood Re — a joint U.K. Government and insurance-industry initiative — was set up in April 2016 to help ensure available, affordable cover for exposed properties.

The reinsurance scheme’s immediate aim was to establish a system whereby insurers could offer competitive premiums and lower excesses to highly exposed households. To date it has achieved considerable success on this front.

Of the 350,000 properties deemed at high risk, over 150,000 policies have been ceded to Flood Re. Over 60 insurance brands representing 90 percent of the U.K. home insurance market are able to cede to the scheme. Premiums for households with prior flood claims fell by more than 50 percent in most instances, and an excess of £250 per claim (as opposed to thousands of pounds) was set.

While there is still work to be done, Flood Re is now an effective, albeit temporary, barrier to flood risk becoming uninsurable in high-risk parts of the U.K. However, in some respects, this success could be considered low-hanging fruit.

A temporary solution

Flood Re is intended as a temporary solution, albeit one with a considerable lifespan. By 2039, when the initiative terminates, it must leave behind a flood insurance market based on risk-reflective pricing that is affordable to most households.

To achieve this market nirvana, it is also tasked with working to manage flood risks. According to Gary McInally, chief actuary at Flood Re, the scheme must act as a catalyst for this process.

“Flood Re has a very clear remit for the longer term,” he explains. “That is to reduce the risk of flooding over time, by helping reduce the frequency with which properties flood and the impact of flooding when it does occur. Properties ought to be presenting a level of risk that is insurable in the future. It is not about removing the risk, but rather promoting the transformation of previously uninsurable properties into insurable properties for the future.”

To facilitate this transition to improved property-level resilience, Flood Re will need to adopt a multifaceted approach, promoting research and development, consumer education and changes to market practices that recognize the benefits. First, it must assess the potential to reduce exposure levels by implementing a range of resistance (the ability to prevent flooding) and resilience (the ability to recover from flooding) measures at the property level. Second, it must promote options for reflecting the resulting risk reduction in reduced flood cover prices and availability, requiring less support from Flood Re.

According to Andy Bord, CEO of Flood Re: “There is currently almost no link between the action of individuals in protecting their properties against floods and the insurance premium which they are charged by insurers. In principle, establishing such a positive link is an attractive approach, as it would provide a direct incentive for households to invest in property-level protection.

“Flood Re is building a sound evidence base by working with academics and others to quantify the benefits of such mitigation measures. We are also investigating ways the scheme can recognize the adoption of resilience measures by householders and ways we can practically support a ‘build-back-better’ approach by insurers.”

Modeling flood resilience

Multiple studies and reports have been produced in recent years on how to reduce flood exposure levels in the U.K. However, an extensive review commissioned by Flood Re, spanning over 2,000 studies and reports, found that while these help clarify which measures may be appropriate, there is a clear lack of data on the suitability of any of them to support the needs of the insurance market.

A 2014 report produced for the U.K. Environment Agency identified a series of possible packages of resistance and resilience measures. The study was based on the agency’s Long-Term Investment Scenario (LTIS) model and assessed the potential benefit of the various packages to U.K. properties at risk of flooding.

The 2014 study is currently being updated by the Environment Agency, with the new study examining specific subsets based on the levels of benefit delivered.

“It is not about removing the risk, but rather promoting the transformation of previously uninsurable properties into insurable properties” — Gary McInally, Flood Re

Packages considered will encompass resistance and resilience measures spanning both active and passive components, including waterproof external walls, flood-resistant doors, sump pumps and concrete flooring. The effectiveness of each is being assessed at various levels of flood severity to generate depth-damage curves.
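As a rough illustration of what a depth-damage curve is, the sketch below interpolates a damage ratio at a given flood depth for a notional baseline property and the same property with a resistance package fitted. All depth and damage values here are invented for the example; they are not drawn from the Environment Agency study or the RMS model.

```python
# (flood depth in metres, fraction of property value damaged) —
# invented values for illustration only
BASELINE = [(0.0, 0.00), (0.5, 0.20), (1.0, 0.40), (2.0, 0.70)]
WITH_RESISTANCE = [(0.0, 0.00), (0.5, 0.05), (1.0, 0.25), (2.0, 0.60)]

def damage_ratio(curve, depth):
    """Linearly interpolate the damage ratio at a given flood depth."""
    if depth <= curve[0][0]:
        return curve[0][1]
    for (d0, r0), (d1, r1) in zip(curve, curve[1:]):
        if depth <= d1:
            return r0 + (r1 - r0) * (depth - d0) / (d1 - d0)
    return curve[-1][1]  # beyond the deepest point, damage saturates

# Benefit of the package at a 0.75 m flood: baseline damage of 30%
# of property value versus 15% with resistance measures fitted.
benefit = damage_ratio(BASELINE, 0.75) - damage_ratio(WITH_RESISTANCE, 0.75)
print(round(benefit, 2))
```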

While the data generated will have a foundational role in helping support outcomes around flood-related investments, it is imperative that the findings of the study undergo rigorous testing, as McInally explains. “We want to promote the use of the best-available data when making decisions,” he says. “That’s why it was important to independently verify the findings of the Environment Agency study. If the findings differ from studies conducted by the insurance industry, then we should work together to understand why.”

To assess the results of key elements of the study, Flood Re called upon the flood modeling capabilities of RMS and its Europe Inland Flood High-Definition (HD) Models, which provide the most comprehensive and granular view of flood risk currently available in Europe, covering 15 countries including the U.K. The models enable the assessment of flood risk, and the uncertainties associated with that risk, right down to the individual property and coverage level. They also provide a much longer simulation timeline, capitalizing on advances in computational power through Cloud-based computing to span 50,000 years of possible flood events across Europe, generating over 200,000 possible flood scenarios for the U.K. alone.
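The way such a long simulated timeline feeds loss metrics can be shown with a toy calculation: an event-based catastrophe model derives the annual average loss (AAL) by dividing the total loss across all simulated events by the number of simulated years. The figures below are invented and vastly smaller than the 50,000-year catalogs described above.

```python
# Toy event set: every flood event produced across the simulated
# timeline, as (event id, loss in £m). Values are invented.
SIMULATED_YEARS = 10_000
event_losses = [("ev1", 120.0), ("ev2", 45.5), ("ev3", 300.0)]

# AAL = total simulated loss / simulated years
aal = sum(loss for _, loss in event_losses) / SIMULATED_YEARS
print(f"AAL: £{aal:.4f}m per year")
```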

The models also enable a much more accurate and transparent means of assessing the impact of permanent and temporary flood defenses, and their role in protecting against both fluvial and pluvial flood events.

Putting data to the test

“The recent advances in HD modeling have provided greater transparency and so allow us to better understand the behavior of the model in more detail than was possible previously,” McInally believes. “That is enabling us to pose much more refined questions that previously we could not address.”

While the Environment Agency study provided significant data insights, the LTIS model does not incorporate the capability to model pluvial and fluvial flooding at the individual property level, he explains.

RMS used its U.K. Flood HD model to conduct the same analysis recently carried out by the Environment Agency, benefiting from its comprehensive set of flood events together with the vulnerability, uncertainty and loss modeling framework. This meant that RMS could model the vulnerability of each resistance/resilience package for a particular building at a much more granular level.

RMS took the same vulnerability data used by the Environment Agency — broadly similar to that used within its own model — and ran it through the flood model to assess the impact of each of the resistance and resilience packages against a vulnerability baseline and establish their overall effectiveness. The results revealed a significant difference between the numbers generated by the LTIS model and those produced by the RMS Europe Inland Flood HD Models.

Because the hazard data used by the Environment Agency did not include pluvial flood risk, and its hazard layers were of generally lower resolution than those used in the RMS model, the LTIS study overestimated flood depths at the property level. As a result, it underestimated the benefits of the various resilience and resistance measures — in some instances, the potential benefits attributed to each package were almost double those of the original study.

The findings show that using a particular package across a subset of about 500,000 households in specific locations could reduce annual average losses from flood events by up to 40 percent at a country level. This could help Flood Re understand how to allocate resources where they will generate the most significant benefit.

A return on investment?

There is still much work to be done to establish an evidence base for the specific value of property-level resilience and resistance measures of sufficient granularity to better inform flood-related investment decisions.

“The initial indications from the ongoing Flood Re cost-benefit analysis work are that resistance measures, because they are cheaper to implement, will prove a more cost-effective approach across a wider group of properties in flood-exposed areas,” McInally indicates. “However, in a post-repair scenario, the cost-benefit results for resilience measures are also favorable.”

However, he is wary about making any definitive statements at this early stage based on the research to date.

“Flood by its very nature includes significant potential ‘hit-and-miss factors’,” he points out. “You could, for example, make cities such as Hull or Carlisle highly flood resistant and resilient, and yet neither location might experience a major flood event in the next 30 years while the Lake District and West Midlands might experience multiple floods. So the actual impact on reducing the cost of flooding from any program of investment will, in practice, be very different from a simple modeled long-term average benefit. Insurance industry modeling approaches used by Flood Re, which includes the use of the RMS Europe Inland Flood HD Models, could help improve understanding of the range of investment benefit that might actually be achieved in practice.”

Making it clear

Pete Dailey of RMS explains why model transparency is critical to client confidence

View of Hurricane Harvey from space

In the aftermath of Hurricanes Harvey, Irma and Maria (HIM), there was much comment on the disparity among the loss estimates produced by model vendors. Concerns were raised about significant outlier results released by some modelers.

“It’s no surprise,” explains Dr. Pete Dailey, vice president at RMS, “that vendors who approach the modeling differently will generate different estimates. But rather than pushing back against this, we feel it’s critical to acknowledge and understand these differences.

“At RMS, we develop probabilistic models that operate across the full model space and deliver that insight to our clients. Uncertainty is inherent within the modeling process for any natural hazard, so we can’t rely solely on past events, but rather simulate the full range of plausible future events.”

There are multiple components that contribute to differences in loss estimates, including the scientific approaches and technologies used and the granularity of the exposure data.

“Increased demand for more immediate data is encouraging modelers to push the envelope”

“As modelers, we must be fully transparent in our loss-estimation approach,” he states. “All apply scientific and engineering knowledge to detailed exposure data sets to generate the best possible estimates given the skill of the model. Yet the models always provide a range of opinion when events happen, and sometimes that is wider than expected. Clients must know exactly what steps we take, what data we rely upon, and how we apply the models to produce our estimates as events unfold. Only then can stakeholders conduct the due diligence to effectively understand the reasons for the differences and make important financial decisions accordingly.”

Outlier estimates must also be scrutinized in greater detail. “There were some outlier results during HIM, and particularly for Hurricane Maria. The onus is on the individual modeler to acknowledge the disparity and be fully transparent about the factors that contributed to it. And most importantly, how such disparity is being addressed going forward,” says Dailey.

“A ‘big miss’ in a modeled loss estimate generates market disruption, and without clear explanation this impacts the credibility of all catastrophe models. RMS models performed quite well for Maria. One reason for this was our detailed local knowledge of the building stock and engineering practices in Puerto Rico. We’ve built strong relationships over the years and made multiple visits to the island, and the payoff for us and our clients comes when events like Maria happen.”

As client demand for real-time and pre-event estimates grows, the data challenge placed on modelers is increasing.

“Demand for more immediate data is encouraging modelers like RMS to push the scientific envelope,” explains Dailey, “as it should. However, we need to ensure all modelers acknowledge, and to the degree possible quantify, the difficulties inherent in real-time loss estimation — especially since it’s often not possible to get eyes on the ground for days or weeks after a major catastrophe.”

Much has been said about the need for modelers to revise initial estimates months after an event occurs. Dailey acknowledges that while RMS sometimes updates its estimates, during HIM the strength of early estimates was clear.

“In the months following HIM, we didn’t need to significantly revise our initial loss figures even though they were produced when uncertainty levels were at their peak as the storms unfolded in real time,” he states. “The estimates for all three storms were sufficiently robust in the immediate aftermath to stand the test of time. While no one knows what the next event will bring, we’re confident our models and, more importantly, our transparent approach to explaining our estimates will continue to build client confidence.”

Data Flow in a Digital Ecosystem

There has been much industry focus on the value of digitization at the customer interface, but what is its role in risk management and portfolio optimization?

In recent years, the perceived value of digitization to the insurance industry has been increasingly refined on many fronts. It now serves a clear function in areas such as policy administration, customer interaction, policy distribution and claims processing, delivering tangible, measurable benefits.

However, the potential role of digitization in supporting the underwriting function, enhancing the risk management process and facilitating portfolio optimization is sometimes less clear. This is perhaps because risk assessment is by its very nature a more nebulous task, isolated to only a few employees, which makes the direct benefits of digitization harder to articulate.

To grasp the potential of digitization, we must first acknowledge the limitations of existing platforms and processes, and in particular the lack of joined-up data in a consistent format. But connecting data sets and being able to process analytics is just the start. There needs to be clarity in terms of the analytics an underwriter requires, including building or extending core business workflow to deliver insights at the point of impact.

Data limitation

For Louise Day, director of operations at the International Underwriting Association (IUA), a major issue is that much of the data generated across the industry is held remotely from the underwriter.

“You have data being keyed in at numerous points and from multiple parties in the underwriting process. However, rather than being stored in a format accessible to the underwriter, it is simply transferred to a repository where it becomes part of a huge data lake with limited ability to stream that data back out.”

That data is entering the “lake” via multiple systems and in different formats. These amorphous pools severely limit the potential to extract information in a defined, risk-specific manner, conduct impactful analytics and do so in a timeframe relevant to the underwriting decision-making process.

“The underwriter is often disconnected from critical risk data,” believes Shaheen Razzaq, senior product director at RMS. “This creates significant challenges when trying to accurately represent coverage, generate or access meaningful analysis of metrics and grasp the marginal impacts of any underwriting decisions on overall portfolio performance.

“Success lies not just in attempting to connect the different data sources together, but to do it in such a way that can generate the right insight within the right context and get this to the underwriter to make smarter decisions.”

Without the digital capabilities to connect the various data sets and deliver information in a digestible format to the underwriter, their view of risk can be severely restricted — particularly given that server storage limits often mean their data access only extends as far as current information. Many businesses find themselves suffering from DRIP, being data rich but information poor, without the ability to transform their data into valuable insight.

“You need to be able to understand risk in its fullest context,” Razzaq says. “What is the precise location of the risk? What policy history information do we have? How has the risk performed? How have the modeled numbers changed? What other data sources can I tap? What are the wider portfolio implications of binding it? How will it impact my concentration risk? How can I test different contract structures to ensure the client has adequate cover but is still profitable business for me? These are all questions they need answers to in real time at the decision-making point, but often that’s simply not possible.”

When extrapolating this lack of data granularity up to the portfolio level and beyond, the potential implications of poor risk management at the point of underwriting can be extreme. With a high-resolution peril like U.S. flood, where two properties just meters apart can have very different risk profiles, the ability to make accurate risk decisions is restricted without granular data at the point of impact. Roll that degree of inaccuracy up to the line of business and the portfolio level, and the ramifications are significant.

Looking beyond the organization and out to the wider flow of data through the underwriting ecosystem, the lack of format consistency is creating a major data blockage, according to Jamie Garratt, head of innovation at Talbot.

“You are talking about trying to transfer data which is often not in any consistent format along a value chain that contains a huge number of different systems and counterparties,” he explains. “And the inability to quickly and inexpensively convert that data into a format that enables that flow is prohibitive to progress.

“You are looking at the formatting of policies, schedules and risk information, which is being passed through a number of counterparties all operating different systems. It then needs to integrate into pricing models, policy administration systems, exposure management systems, payment systems, et cetera. And when you consider this process replicated across a subscription market, the inefficiencies are extensive.”

A functioning ecosystem

There are numerous examples of sectors that have transitioned successfully to a digitized data ecosystem that the insurance industry can learn from. One such industry is health care, which over the last decade has successfully adopted digital processes across the value chain and overcome the data formatting challenge.

It can be argued that health care has a value chain similar to that in the insurance industry. Data is shared between various stakeholders — including competitors — to create the analytical backbone it needs to function effectively. Data is retained and shared at the individual level and combines multiple health perspectives to gain a holistic view of the patient.

The sector has also overcome the data-consistency hurdle by collectively agreeing on a data standard, enabling the effective flow of information across all parties in the chain, from the health care facilities through to the services companies that support them.

Garratt draws attention to the way the broader financial markets function. “There are numerous parallels that can be drawn between the financial and the insurance markets, and much that we can learn from how that industry has evolved over the last 10 to 20 years.”

“As the capital markets become an increasingly prevalent part of the insurance sector,” he continues, “this will inevitably have a bearing on how we approach data and the need for greater digitization. If you look, for example, at the advances that have been made in how risk is transferred on the insurance-linked securities (ILS) front, what we now have is a fairly homogenous financial product where the potential for data exchange is more straightforward and transaction costs and speed have been greatly reduced.

“It is true that pure reinsurance transactions are more complex given the nature of the market, but there are lessons that can be learned to improve transaction execution and the binding of risks.”

For Razzaq, it’s also about rebalancing the data extrapolation versus data analysis equation. “By removing data silos and creating straight-through access to detailed, relevant, real-time data, you shift this equation on its axis. At present, some 70 to 80 percent of analysts’ time is spent sourcing data and converting it into a consistent format, with only 20 to 30 percent spent on the critical data analysis. An effective digital infrastructure can switch that equation around, greatly reducing the steps involved, and re-establishing analytics as the core function of the analytics team.”

The analytical backbone

So how does this concept of a functioning digital ecosystem map to the (re)insurance environment? The challenge, of course, is not only to create joined-up, real-time data processes at the organizational level, but also to look at how that unified infrastructure can extend out to support improved data interaction at the industry level.

An ideal digital scenario from a risk management perspective is where all parties operate on a single analytical framework or backbone built on the same rules, with the same data and using the same financial calculation engines, ensuring that on all risk fronts you are carrying out an ‘apples-to-apples’ comparison. That consistent approach would need to extend from the individual risk decision, to the portfolio, to the line of business, right up to the enterprise-wide level.

In the underwriting trenches, it is about enhancing the decision-making process and understanding the portfolio-level implications of those decisions.

“A modern pricing and portfolio risk evaluation framework can reduce assessment times, providing direct access to relevant internal and external data in almost real time,” states Ben Canagaretna, managing director at Barbican Insurance Group. “Creating a data flow, designed specifically to support agile decision-making, allows underwriters to price complex business in a much shorter time period.”

“It’s about creating a data flow designed specifically to support decision-making”— Ben Canagaretna, Barbican Insurance Group

“The feedback loop around decisions surrounding overall reinsurance costs and investor capital exposure is paramount in order to maximize returns on capital for shareholders that are commensurate to risk appetite. At the heart of this is the portfolio marginal impact analysis – the ability to assess the impact of each risk on the overall portfolio in terms of exceedance probability curves, realistic disaster scenarios and regional exposures. Integrated historical loss information is a must in order to quickly assess the profitability of relevant brokers, trade groups and specific policies.”
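The portfolio marginal impact analysis Canagaretna describes can be illustrated with a simple simulation: compare the portfolio's exceedance probability (EP) curve with and without a candidate risk at a chosen return period. The sketch below is a minimal illustration using made-up lognormal loss distributions; the figures, distributional assumptions and function names are ours, not RMS's or Barbican's.

```python
import numpy as np

def ep_curve(annual_losses, return_period):
    """Loss exceeded with annual probability 1/return_period (empirical quantile)."""
    return float(np.quantile(annual_losses, 1.0 - 1.0 / return_period))

rng = np.random.default_rng(42)
n_years = 100_000

# Hypothetical simulated annual losses (in $M) for the current portfolio
portfolio = rng.lognormal(mean=3.0, sigma=1.0, size=n_years)
# Hypothetical simulated annual losses for a candidate risk being priced
candidate = rng.lognormal(mean=0.5, sigma=1.2, size=n_years)

base_200yr = ep_curve(portfolio, 200)
with_risk_200yr = ep_curve(portfolio + candidate, 200)

# Marginal impact: how much the candidate moves the 200-year loss level,
# which is typically less than its standalone 200-year loss (diversification)
marginal = with_risk_200yr - base_200yr
standalone = ep_curve(candidate, 200)
print(f"200-year loss impact: {marginal:.1f} marginal vs {standalone:.1f} standalone")
```

The same comparison can be run at several return periods to trace the candidate risk's effect along the full EP curve rather than at a single point.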

There is, of course, the risk of data overload in such an environment, with multiple information streams threatening to swamp the process if not channeled effectively.

“It’s about giving the underwriter much better visibility of the risk,” says Garratt, “but to do that the information must be filtered precisely to ensure that the most relevant data is prioritized, so it can then inform underwriters about a specific risk or feed directly into pricing models.”

Making the transition

There are no organizations in today’s (re)insurance market that cannot perceive at least a marginal benefit from integrating digital capabilities into their current underwriting processes. And for those that have started down that route, tangible benefits are already emerging. Yet making the transition, particularly given the clear scale of the challenge, is daunting.

“You can’t simply unplug all of your legacy systems and reconnect a new digital infrastructure,” says IUA’s Day. “You have to find a way of integrating current processes into a data ecosystem in a manageable and controlled manner. From a data-gathering perspective, that process could start with adopting a standard electronic template to collect quote data and storing that data in a way that can be easily accessed and transferred.”
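Day's suggested starting point, a standard electronic template for quote data, amounts to an agreed schema that every counterparty populates and exchanges in the same way. A hypothetical sketch follows; the field names are illustrative assumptions of ours, not an actual market standard.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class QuoteRecord:
    # Illustrative fields only; a real standard would be agreed market-wide
    policy_ref: str
    insured_name: str
    line_of_business: str
    inception_date: str   # ISO 8601, e.g. "2021-01-01"
    currency: str         # ISO 4217, e.g. "USD"
    limit: float
    premium: float

quote = QuoteRecord(
    policy_ref="Q-2021-0001",
    insured_name="Example Marine Co",
    line_of_business="marine",
    inception_date="2021-01-01",
    currency="USD",
    limit=5_000_000.0,
    premium=125_000.0,
)

# Serializing every quote to one agreed format is what lets the data
# flow between pricing, policy administration and exposure systems
print(json.dumps(asdict(quote), indent=2))
```

Because every party writes and reads the same structure, the data can be stored once and streamed back out to any downstream system without reformatting.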

“There are tangible short-term benefits to making the transition,” adds Razzaq. “Starting small and focusing on certain entities within the group, transferring certain use cases rather than all at once, and taking a steady, stepwise approach rather than simply acknowledging the benefits but being overwhelmed by the potential scale of the challenge.”

There is no doubting, however, that the task is significant, particularly integrating multiple data types into a single format. “We recognize that companies have source-data repositories and legacy systems,” says Razzaq, “and the initial aim is not to ‘rip and replace’ those, but rather to create a path to a system that allows all of these data sets to move. For RMS, we have the ability to connect these various data hubs via open APIs to our Risk Intelligence platform to create that information superhighway, with an analytics layer that can turn this data into actionable insights.”

Talbot has ventured further down this path than many other organizations, and its pioneering spirit is already bearing fruit.

“We have looked at those areas,” explains Garratt, “where we believe it is more likely we can secure short-term benefits that demonstrate the value of our longer-term strategy. For example, we recently conducted a proof of concept using quite powerful natural-language processing supported by machine-learning capabilities to extract and then analyze historic data in the marine space, and already we are generating some really valuable insights.

“I don’t think the transition is reliant on having a clear idea of what the end state is going to look like, but rather taking those initial steps that start moving you in a particular direction. There also has to be an acceptance of the need to fail early and learn fast, which is hard to grasp in a risk-averse industry. Some initiatives will fail — you have to recognize that and be ready to pivot and move in a different direction if they do.”