Bringing Clarity to Slab Claims

How will a new collaboration between a major Texas insurer, RMS, Accenture and Texas Tech University make it possible to determine the source of slab claim losses with accuracy?

The litigation surrounding “slab claims” in the U.S. in the aftermath of a major hurricane has long been an issue within the insurance industry. When nothing is left of a coastal property but the concrete slab on which it was built, how do claims handlers determine whether the damage was predominantly caused by water or wind?

The decision that many insurers take can spark protracted litigation, as was the case following Hurricane Ike, a powerful storm that caused widespread damage across the state after it made landfall over Galveston in September 2008. The storm had a very large footprint for a Category 2 hurricane, with sustained wind speeds of 110 mph and a 22-foot storm surge. Five years on, litigation surrounding how slab claim damage had been wrought rumbled on in the courts.

Recognizing the extent of the issue, major coastal insurers knew they needed to improve their methodologies. It sparked a new collaboration between RMS, a major Texas insurer, Accenture and Texas Tech University (TTU). And from this year, the insurer will be able to utilize RMS data, hurricane modeling methodologies, and software analyses to track the likelihood of slab claims before a tropical cyclone makes landfall and document the post-landfall wind, storm surge and wave impacts over time.

The approach will help determine the source of the property damage with greater accuracy and clarity, reducing the need for litigation post-loss and thus improving the overall claims experience for both the policyholder and the insurer. To provide highly accurate wind field data, RMS has signed a contract with TTU to expand a network of mobile meteorological stations that are positioned in areas predicted to experience landfall during a real-time event.

“Our contract is focused on Texas, but they could also be deployed anywhere in the southern and eastern U.S.,” says Michael Young, senior director of product management at RMS. “The rapidly deployable weather stations collect peak and mean wind speed characteristics and transmit via the cell network the wind speeds for inclusion into our tropical cyclone data set. This is in addition to a wide range of other data sources, which this year includes 5,000 new data stations from our partner Earth Networks.”

The storm surge component of this project utilizes the same hydrodynamic storm surge model methodologies embedded within the RMS North Atlantic Hurricane Models to develop an accurate view of the timing, extent and severity of storm surge and wave-driven hazards post-landfall. Similar to the wind field modeling process, this approach will also be informed by ground-truth terrain and observational data, such as high-resolution bathymetry data, tide and stream gauge sensors and high-water marks.

“The whole purpose of our involvement in this project is to help the insurer get those insights into what’s causing the damage,” adds Jeff Waters, senior product manager at RMS. “The first eight hours of the time series at a particular location might involve mostly damaging surge, followed by eight hours of damaging wind and surge. So, we’ll know, for instance, that a lot of that damage that occurred in the first eight hours was probably caused by surge. It’s a very exciting and pretty unique project to be part of.”
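Purely as an illustration of the kind of time-slicing Waters describes, the sketch below walks a hypothetical hourly hazard series at one location and labels each hour by the peril most likely to be driving damage. The thresholds, field layout and values are invented for the example and are not the RMS methodology.

```python
# Hypothetical illustration: attribute hourly damage potential to wind or surge
# from a post-landfall hazard time series at one location. Thresholds are
# invented for the example, not RMS model parameters.

# (hour after landfall, peak gust in mph, surge depth above ground in feet)
time_series = [
    (1, 45, 3.5), (2, 55, 5.0), (3, 60, 6.5), (4, 70, 7.0),
    (5, 85, 6.0), (6, 95, 4.0), (7, 105, 2.0), (8, 110, 0.5),
]

DAMAGING_GUST_MPH = 80    # assumed onset of wind damage
DAMAGING_SURGE_FT = 2.0   # assumed onset of inundation damage

def dominant_peril(gust_mph: float, surge_ft: float) -> str:
    """Label the peril most likely driving damage in a given hour."""
    wind = gust_mph >= DAMAGING_GUST_MPH
    surge = surge_ft >= DAMAGING_SURGE_FT
    if wind and surge:
        return "wind + surge"
    if wind:
        return "wind"
    if surge:
        return "surge"
    return "below damage thresholds"

for hour, gust, surge in time_series:
    print(f"hour {hour:2d}: gust {gust:3d} mph, surge {surge:3.1f} ft -> {dominant_peril(gust, surge)}")
```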


Assigning a Return Period to 2017

Hurricanes Harvey, Irma and Maria (HIM) tore through the Caribbean and U.S. in 2017, resulting in insured losses over US$80 billion. Twelve years after Hurricanes Katrina, Rita and Wilma (KRW), EXPOSURE asks if the (re)insurance industry was better prepared for its next ‘terrible trio’ and what lessons can be learned  

In one sense, 2017 was a typical loss year for the insurance industry in that the majority of losses stemmed from the “peak zone” of U.S. hurricanes. However, not since the 2004-05 season had the U.S. witnessed so many landfalling hurricanes. It was the second most costly hurricane season on record for the (re)insurance industry, when losses in 2005 are adjusted for inflation.

According to Aon Benfield, HIM caused total losses over US$220 billion and insured losses over US$80 billion — huge sums in the context of global catastrophe losses for the year of US$344 billion and insured losses of US$134 billion. Overall, weather-related catastrophe losses exceeded 0.4 percent of global GDP in 2017 (based on data from Aon Benfield, Munich Re and the World Bank), the second highest figure since 1990. In that period, only 2005 saw a higher relative catastrophe loss at around 0.5 percent of GDP.

But, it seems, (re)insurers were much better prepared to absorb major losses this time around. Much has changed in the 12 years since Hurricane Katrina breached the levees in New Orleans. Catastrophe modeling as a profession has evolved into exposure management, models and underlying data have improved and there is a much greater appreciation of model uncertainty and assumptions, explains Alan Godfrey, head of exposure management at Asta.

“Even post-2005 people would still see an event occurring, go to the models and pull out a single event ID ... then tell all and sundry this is what we’re going to lose. And that’s an enormous misinterpretation of how the models are supposed to be used. In 2017, people demonstrated a much greater maturity and used the models to advise their own loss estimates, and not the other way around.”

It also helped that the industry was extremely well-capitalized moving into 2017. After a decade of operating through a low interest rate and increasingly competitive environment, (re)insurers had taken a highly disciplined approach to capital management. Gone are the days where a major event sparked a series of run-offs. While some (re)insurers have reported higher losses than others, all have emerged intact.

“In 2017 the industry has performed incredibly well from an operational point of view,” says Godfrey. “There have obviously been challenges from large losses and recovering capital, but those are almost outside of exposure management.”

According to Aon Benfield, global reinsurance capacity grew by 80 percent between 1990 and 2017 (to US$605 billion), against global GDP growth of around 24 percent. The influx of capacity from the capital markets into U.S. property catastrophe reinsurance has also brought about change and innovation, offering new instruments such as catastrophe bonds for transferring extreme risks.

Harvey broke all U.S. records for tropical cyclone-driven rainfall with observed cumulative rainfall of 51 inches

Much of this growth in non-traditional capacity has been facilitated by better data and more sophisticated analytics, along with a healthy appetite for insurance risk from pension funds and other institutional investors.

For insurance-linked securities (ILS), the 2017 North Atlantic hurricane season, Mexico’s earthquakes and California’s wildfires were their first big test. “Some thought that once we had a significant year that capital would leave the market,” says John Huff, president and chief executive of the Association of Bermuda Insurers and Reinsurers (ABIR). “And we didn’t see that.

“In January 2018 we saw that capital being reloaded,” he continues. “There is abundant capital in all parts of the reinsurance market. Deploying that capital with a reasonable rate of return is, of course, the objective.”

Huff thinks the industry performed extremely well in 2017 in spite of the severity of the losses and a few surprises. “I’ve even heard of reinsurers that were ready with claim payments on lower layers before the storm even hit. The modeling and ability to track the weather is getting more sophisticated. We saw some shifting of the storms — Irma was the best example — but reinsurers were tracking that in real time in order to be able to respond.”

Buffalo Bayou floods a park in Houston after the arrival of Hurricane Harvey

How Harvey inundated Houston

One lesson the industry has learned over three decades of modeling is that models are approximations of reality. Each event has its own unique characteristics, some of which fall outside of what is anticipated by the models.

The widespread inland flooding that occurred after Hurricane Harvey made landfall on the Texas coastline is an important illustration of this, explains Huff. Even so, he adds, it continued a theme, with flood losses being a major driver of U.S. catastrophe claims for several years now. “What we’re seeing is flood events becoming the No. 1 natural disaster in the U.S. for people who never thought they were at risk of flood.”

Harvey broke all U.S. records for tropical cyclone-driven rainfall with observed cumulative rainfall of 51 inches (129 centimeters). The extreme rainfall generated by Harvey and the unprecedented inland flooding across southeastern Texas and parts of southern Louisiana were unusual.

However, nobody was overly surprised by the fact that losses from Harvey were largely driven by water versus wind. Prior events with significant storm surge-induced flooding, including Hurricane Katrina and 2012’s Superstorm Sandy, had helped to prepare (re)insurers, exposure managers and modelers for this eventuality. “The events themselves were very large but they were well within uncertainty ranges and not disproportionate to expectations,” says Godfrey.

“Harvey is a new data point — and we don’t have that many — so the scientists will look at it and know that any new data point will lead to tweaks,” he continues. “If anything, it will make people spend a bit more time on their calibration for the non-modeled elements of hurricane losses, and some may conclude that big changes are needed to their own adjustments.”

But, he adds: “Nobody is surprised by the fact that flooding post-hurricane causes loss. We know that now. It’s more a case of tweaking and calibrating, which we will be doing for the rest of our lives.”

Flood modeling

Hurricane Harvey also underscored the importance of the investment in sophisticated, probabilistic flood models. RMS ran its U.S. Inland Flood HD Model in real time to estimate expected flood losses. “When Hurricane Harvey happened, we had already simulated losses of that magnitude in our flood model, even before the event occurred,” says Dr. Pete Dailey, vice president of product management at RMS, who is responsible for U.S. flood modeling.

“The value of the model is to be able to anticipate extreme tail events well before they occur, so that insurance companies can be prepared in advance for the kind of risk they’re taking on and what potential claims volume they may have after a major event,” he adds.

Does this mean that a US$100 billion-plus loss year like 2017 is now a 1-in-6-year event?

Harvey has already offered a wealth of new data that will be fed into the flood model. The emergency shutdown of the Houston metropolitan area prevented RMS meteorologists and engineers from accessing the scene in the immediate aftermath, explains Dailey. However, once on the ground they gathered as much information as they could, observing and recording what had actually happened to affected properties.

“We go to individual properties to assess the damage visually, record the latitude and longitude of the property, the street address, the construction, occupancy and the number of stories,” he says. “We will also make an estimate of the age of the property. Those basic parameters allow us to go back and take a look at what the model would have predicted in terms of damage and loss, as compared to what we observed.”
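The parameters Dailey lists map naturally onto a simple reconnaissance record. The sketch below shows one possible way such observations could be structured for comparison against modeled damage; the field names and example values are hypothetical, not the RMS data schema.

```python
from dataclasses import dataclass

@dataclass
class SurveyObservation:
    """One post-event reconnaissance record (hypothetical structure)."""
    latitude: float
    longitude: float
    street_address: str
    construction: str             # e.g., wood frame, masonry
    occupancy: str                # e.g., single-family residential
    stories: int
    estimated_year_built: int
    observed_damage_ratio: float  # surveyed damage as a fraction of value

def compare(obs: SurveyObservation, modeled_damage_ratio: float) -> float:
    """Return the difference between modeled and observed damage ratios."""
    return modeled_damage_ratio - obs.observed_damage_ratio

# Example record (invented values)
obs = SurveyObservation(29.76, -95.37, "123 Example St, Houston, TX",
                        "wood frame", "single-family residential", 2, 1995, 0.35)
print(f"model minus observed: {compare(obs, 0.30):+.2f}")
```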

The fact that insured losses emanating from the flooding were only a fraction of the total economic losses is an inevitable discussion point. The majority of claims paid were for commercial properties, with residential properties falling under the remit of the National Flood Insurance Program (NFIP). Many residential homes were uninsured, however, explains ABIR’s Huff.

“The NFIP covers just the smallest amount of people — there are only five million policies — and yet you see a substantial event like Harvey which is largely uninsured because (re)insurance companies only cover commercial flood in the U.S.,” he says. “After Harvey you’ll see a realization that the private market is very well-equipped to get back into the private flood business, and there’s a national dialogue going on now.”

Is 2017 the new normal?

One question being asked in the aftermath of the 2017 hurricane season is: What is the return period for a loss year like 2017? RMS estimates that, in terms of U.S. and Caribbean industry insured wind, storm surge and flood losses, the 2017 hurricane season corresponds to a return period between 15 and 30 years.

However, losses on the scale of 2017 occur more frequently when considering global perils. Adjusted for inflation, it is seven years since the industry paid out a similar level of catastrophe claims — US$110 billion on the Tohoku earthquake and tsunami, Thai floods and New Zealand earthquake in 2011. Six years prior to that, KRW cost the industry in excess of US$75 billion (well over US$100 billion in today’s money).

So, does this mean that a US$100 billion-plus (or equivalent in inflation-adjusted terms) loss year like 2017 is now a one-in-six-year event? As wealth and insurance penetration grows in developing parts of the world, will we begin to see more loss years like 2011, where catastrophe claims are not necessarily driven by the U.S. or Japan peak zones?
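One crude way to frame that question is to count exceedances in a long record of inflation-adjusted annual losses. The sketch below shows only the arithmetic, using invented annual figures rather than real data or an RMS estimate.

```python
# Empirical return period of a US$100 billion-plus (inflation-adjusted) loss year.
# The annual loss figures below are invented placeholders, not real data.

annual_insured_losses_bn = [
    30, 45, 110, 25, 40, 55, 35, 60, 120, 28,
    50, 42, 33, 70, 38, 105, 47, 52, 61, 134,
]  # 20 hypothetical years

threshold_bn = 100
exceedance_years = sum(1 for loss in annual_insured_losses_bn if loss >= threshold_bn)
n_years = len(annual_insured_losses_bn)

annual_exceedance_prob = exceedance_years / n_years
return_period = 1 / annual_exceedance_prob

print(f"{exceedance_years} of {n_years} years exceeded US${threshold_bn}bn")
print(f"empirical annual exceedance probability: {annual_exceedance_prob:.2f}")
print(f"empirical return period: {return_period:.1f} years")
```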

“Increased insurance penetration does mean that on the whole losses will increase, but hopefully this is proportional to the premiums and capital that we are getting in,” says Asta’s Godfrey. “The important thing is understanding correlations and how diversification actually works and making sure that is applied within business models.

“In the past, people were able to get away with focusing on the world in a relatively binary fashion,” he continues. “The more people move toward diversified books of business, which is excellent for efficient use of capital, the more important it becomes to understand the correlations between different regions.”

“You could imagine in the future, a (re)insurer making a mistake with a very sophisticated set of catastrophe and actuarial models,” he adds. “They may perfectly take into account all of the non-modeled elements but get the correlations between them all wrong, ending up with another year like 2011 where the losses across the globe are evenly split, affecting them far more than their models had predicted.”

As macro trends including population growth, increasing wealth, climate change and urbanization influence likely losses from natural catastrophes, could this mean a shorter return period for years like last year, where industry losses exceeded US$134 billion?

“When we look at the average value of properties along the U.S. coastline — the Gulf Coast and East Coast — there’s a noticeable trend of increasing value at risk,” says Dailey. “That is because people are building in places that are at risk of wind damage from hurricanes and coastal flooding. And these properties are of a higher value because they are more complex, have a larger square footage and have more stories. Which all leads to a higher total insured value.

“The second trend that we see would be from climate change whereby the storms that produce damage along the coastline may be increasing in frequency and intensity,” he continues. “That’s a more difficult question to get a handle on but there’s a building consensus that while the frequency of hurricane landfalls may not necessarily be increasing, those that do make landfall are increasing in intensity.”

Lloyd’s chief executive Inga Beale has stated her concerns about the impact of climate change, following the market’s £4.5 billion catastrophe claims bill for 2017. “That’s a significant number, more than double 2016; we’re seeing the impact of climate change to a certain extent, particularly on these weather losses, with the rising sea level that impacts and increases the amount of loss,” she said in an interview with Bloomberg.

While a warming climate is expected to have significant implications for the level of losses arising from storms and other severe weather events, it is not yet clear exactly how this will manifest, according to Tom Sabbatelli, senior product manager at RMS. “We know the waters have risen several centimeters in the last couple of decades and we can use catastrophe models to quantify what sort of impact that has on coastal flooding, but it’s also unclear what that necessarily means for tropical cyclone strength.

“The oceans may be warming, but there’s still an ongoing debate about how that translates into cyclone intensity, and that’s been going on for a long time,” he continues. “The reason for that is we just don’t know until we have the benefit of hindsight. We haven’t had a number of major hurricanes in the last few years, so does that mean that the current climate is quiet in the Atlantic? Is 2017 an anomaly or are we going back to more regular severe activity? It’s not until you’re ten or 20 years down the line and you look back that you know for sure.”


Where Tsunami Warnings Are Etched in Stone

As RMS releases its new Japan Earthquake and Tsunami Model, EXPOSURE looks back at the 2011 Tohoku event and other significant events that have shaped scientific knowledge and understanding of earthquake risk 

Hundreds of ancient markers dot the coastline of Japan, some over 600 years old, as a reminder of the danger of tsunami. Today, a new project to construct a 12.5-meter-high seawall stretching nearly 400 kilometers along Japan’s northeast coast is another reminder. Japan is a highly seismically active country and was well prepared for earthquakes and tsunami ahead of the Tohoku Earthquake in 2011. It had strict building codes, protective tsunami barriers, early-warning systems and disaster-response plans.

But it was the sheer magnitude, scale and devastation caused by the Tohoku Earthquake and Tsunami that made it stand out from the many thousands of earthquakes that had come before it in modern times. What had not been foreseen in government planning was that an earthquake of this magnitude could occur, nor that it could produce such a sizable tsunami.

The Tohoku Earthquake was a magnitude 9.0 event — off the charts as far as the Japanese historical record for earthquakes was concerned. A violent displacement of the ocean floor triggered an immense tsunami, with waves of up to 40 meters, that tore across the northeast coast of the main island of Honshu, traveling up to 10 kilometers inland in the Sendai area.

The tsunami breached sea walls and claimed almost everything in its path, taking 16,000 lives (a further 2,000 remain missing, presumed dead) and causing economic losses of US$235 billion. However, while the historical record proved an inadequate basis for preparing for the Tohoku event, the geological record shows that events of that magnitude had occurred before records began, explains Mohsen Rahnama, chief risk modeling officer at RMS.

“Since the Tohoku event, there's been a shift ... to moving further back in time using a more full consideration of the geological record” — Mohsen Rahnama, RMS

“If you go back in the geological record to 869 in the Tohoku region, there is evidence for a potentially similarly scaled tsunami,” he explains. “Since the Tohoku event, there’s been a shift in the government assessments moving away from a focus on what happened historically to a more full consideration of the geological record.”

The geological record, which includes tsunami deposits in coastal lakes and across the Sendai and Ishinomaki plains, shows there were large earthquakes and associated tsunami in A.D. 869, 1611 and 1896. The findings of this research point to the importance of having a fully probabilistic tsunami model at a very high resolution.

Rahnama continues: “The Tohoku event really was the ‘perfect’ tsunami hitting the largest exposure concentration at risk to tsunami in Japan. The new RMS tsunami model for Japan includes tsunami events similar to and in a few cases larger than were observed in 2011. Because the exposure in the region is still being rebuilt, the model cannot produce tsunami events with this scale of loss in Tohoku at this time.”

Incorporating secondary perils

In its new Japan earthquake and tsunami model release, RMS has incorporated the lessons from the Tohoku Earthquake and other major earthquakes that have occurred since the last model was released. Crucially, it includes a fully probabilistic tsunami model that is integrated with the earthquake stochastic event set.

“Since the Japan model was last updated we’ve had several large earthquakes around the world, and they all inform how we think about the largest events, particularly how we model the ground motions they produce,” says Ryan Leddy, senior product manager at RMS, “because good instrumentation has only been available over the last several decades. So, the more events where we sample really detailed information about the ground shaking, the better we can quantify it.

“Particularly on understanding strong ground shaking, we utilized information across events,” he continues. “Petrochemical facilities around the world are built with relatively consistent construction practices. This means that examination of the damage experienced by these types of facilities in Chile and Japan can inform our understanding of the performance of these facilities in other parts of the world with similar seismic hazard.”

The Maule Earthquake in Chile in 2010, the Canterbury sequence of earthquakes in New Zealand in 2010 and 2011, and the more recent Kumamoto Earthquakes in Japan in 2016, have added considerably to the data sets. Most notably they have informed scientific understanding of the nature of secondary earthquake perils, including tsunami, fire following earthquake, landslides and liquefaction.

The 2016 Kumamoto Earthquake sequence triggered extensive landsliding. The sequence included five events in the range of magnitude 5.5 to 7.0 and caused severe damage in Kumamoto and Oita Prefectures from ground shaking, landsliding, liquefaction and fire following earthquake.

“Liquefaction is in the model as a secondary peril. RMS has redesigned and recalibrated the liquefaction model for Japan. The new model directly calculates damage from the vertical deformation caused by liquefaction processes,” says Chesley Williams, senior director, RMS Model Product Management. “While the 1964 Niigata Earthquake with its tipped apartment buildings showed that liquefaction damage can be severe in Japan, on a countrywide basis the earthquake risk is driven by the shaking, tsunami and fire following, followed by liquefaction and landslide. For individual exposures, the key driver of the earthquake risk is very site specific, highlighting the importance of high-resolution modeling in Japan.”

The new RMS model accounts for the clustering of large events on the Nankai Trough. This is an important advancement as an examination of the historical record shows that events on the Nankai Trough have either occurred as full rupturing events (e.g., 1707 Hoei Earthquake) or as pairs of events (e.g., 1944 and 1946 and two events in 1854).

This is different from aftershocks, explains Williams. “Clustered events are events on different sources that would have happened in the long-term earthquake record, and the occurrence of one event impacts the timing of the other events. This is a subtle but important distinction. We can model event clustering on the Nankai Trough due to the comprehensive event record informed by both historical events and the geologic record.”
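To make the distinction concrete, the following sketch simulates a deliberately simplified clustered source in which each occurrence is drawn either as a single full rupture or as a pair of events a couple of years apart, rather than sampling segments independently. The rates and probabilities are invented and bear no relation to the RMS Nankai Trough parameterization.

```python
import random

random.seed(1)

# Hypothetical clustered source: each occurrence is either one full rupture
# or a pair of partial ruptures a short time apart. Rates are invented.
ANNUAL_OCCURRENCE_RATE = 0.01   # mean occurrences per year (hypothetical)
P_FULL_RUPTURE = 0.4            # probability an occurrence is a single full rupture

def simulate_catalog(years: int) -> list[tuple[int, str]]:
    """Return a list of (year, event_type) entries for a simulated catalog."""
    catalog = []
    for year in range(years):
        # Low-rate Poisson occurrence approximated by a Bernoulli draw per year
        if random.random() < ANNUAL_OCCURRENCE_RATE:
            if random.random() < P_FULL_RUPTURE:
                catalog.append((year, "full rupture"))
            else:
                catalog.append((year, "paired event 1"))
                catalog.append((year + 2, "paired event 2"))  # ~2 years later
    return catalog

catalog = simulate_catalog(10_000)
full = sum(1 for _, kind in catalog if kind == "full rupture")
paired = sum(1 for _, kind in catalog if kind.startswith("paired"))
print(f"{len(catalog)} events: {full} full ruptures, {paired} paired-event members")
```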

The Tohoku event resulted in insurance losses of US$30 billion to US$40 billion, the costliest earthquake event for the insurance industry in history. While the news media focused on the extreme tsunami, the largest proportion of the insurance claims emanated from damage wrought by the strong ground shaking. Interestingly, likely due to cultural constraints, only a relatively low amount of post-event loss amplification was observed.

“In general for very large catastrophes, claims costs can exceed the normal cost of settlement due to a unique set of economic, social and operational factors,” says Williams. “Materials and labor become more expensive and claims leakage can be more of an issue, so there are a number of factors that kick in that are now captured by the RMS post-event loss amplification modeling. The new Japan model now explicitly models post-event loss amplification but limits the impacts to be consistent with the observations in recent events in Japan.”
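A minimal sketch of the general idea Williams describes: scale a portfolio loss by an amplification factor that grows with the size of the industry event but is capped. The functional form and all parameters below are invented for illustration and are not the RMS post-event loss amplification model.

```python
def post_event_loss_amplification(base_loss_bn: float,
                                  industry_loss_bn: float,
                                  onset_bn: float = 20.0,
                                  slope: float = 0.004,
                                  cap: float = 1.15) -> float:
    """Apply a capped, demand-surge style uplift to a portfolio loss.

    All parameters are hypothetical: uplift starts above `onset_bn` of
    industry loss, grows linearly with the excess, and is capped at `cap`.
    """
    excess = max(industry_loss_bn - onset_bn, 0.0)
    factor = min(1.0 + slope * excess, cap)
    return base_loss_bn * factor

for industry_loss in (10, 30, 60, 120):
    amplified = post_event_loss_amplification(1.0, industry_loss)
    print(f"industry loss {industry_loss:3d}bn -> portfolio loss factor {amplified:.3f}")
```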

Supply chain disruption and contingent business interruption were significant sources of loss following the Tohoku event. This was exacerbated by the Level 7 meltdown at the Fukushima nuclear power plant that resulted in evacuations, exclusion zones and rolling blackouts.

“We sent reconnaissance teams to Japan after the event to understand the characteristics of damage and to undertake case studies for business interruption,” says Williams. “We visited large industrial facilities and talked to them about their downtime, their material requirement and their access to energy sources to better understand what had impacted their ability to get back up and running.”

Recent events have re-emphasized that there are significant differences in business interruption by occupancy. “For example, a semiconductor facility is likely going to have a longer downtime than a cement factory,” says Williams. “The recent events have highlighted the impacts on business interruption for certain occupancies by damage to supply sources. These contingent business interruptions are complex, so examination of the case studies investigated in Japan was instrumental in informing the model.”

Rebuilding in the seven years since the Tohoku Tsunami struck has been an exercise in resilient infrastructure. With nearly half a million people left homeless, there has been intense rebuilding to restore services, industry and residential property. US$12 billion has been spent on seawalls alone, replacing the 4-meter breakwaters with 12.5-meter-high tsunami barriers.

An endless convoy of trucks has been moving topsoil from the hills to the coastline in order to raise the land by over 10 meters in places. Most cities have decided to elevate by several meters, with a focus on rebuilding commercial premises in exposed areas. Some towns have forbidden the construction of homes in flat areas nearest the coasts and relocated residents to higher ground.


Tokyo-Yokohama: The world's most exposed metropolis

The Japanese metropolis of Tokyo-Yokohama has the world's greatest GDP at risk from natural catastrophes. Home to 38 million residents, it has potential for significant economic losses from multiple perils, but particularly earthquakes. According to Swiss Re it is the riskiest metropolitan area in the world.

A combination of strict building codes, land use plans and disaster preparedness has significantly reduced the city's vulnerability in recent decades. Despite the devastation caused by the tsunami, very few casualties (around 100) were related to partial or complete building collapse resulting from ground shaking during the magnitude 9.0 Tohoku Earthquake.


How Cyber Became a Peak Peril

As new probabilistic cyber models are launched, EXPOSURE explores how probabilistic modeling will facilitate the growth of the cyber (re)insurance market and potentially open up the transfer of catastrophic risks to the capital markets 

The potential for cyberattacks to cause global, systemic disruption continues to ratchet up, and to confuse matters further, it is state actors that are increasingly involved in sponsoring these major attacks. Last year’s major global ransomware attacks — WannaCry and NotPetya — were a wake-up call for many businesses, in terms of highlighting the potential scale and source of cyber incidents. The widespread disruption caused by these incidents — widely suspected of being state-sponsored attacks — confirmed that cyber risk is now in the realm of catastrophe exposures.

The introduction of probabilistic catastrophe modeling for cyber therefore comes at an opportune time. In terms of modeling, although a cyberattack is human-made and very different from a Florida hurricane or Japanese earthquake, for instance, there are some parallels with natural catastrophe perils. Most notable is the potential for sizable, systemic loss.

“Catastrophe modeling exists because of the potential correlation of losses across multiple locations and policies all from the same event,” explains Robert Muir-Wood, chief research officer at RMS. “This concentration is what insurers most fear. The whole function of insurance is to diversify risk.

“Anything that concentrates risk is moving in the opposite direction to diversification,” he continues. “So, insurers need to find every way possible to limit the concentration of losses. And cyber clearly has the potential, as demonstrated by the NotPetya and WannaCry attacks last year, to impact many separate businesses in a single attack.”

“What’s the equivalent of a cyber hurricane? None of the insurers are quite sure about that” — Tom Harvey, RMS

Cyberattacks can easily make a loss go global. Whereas a Florida hurricane can damage multiple properties across a small geographical area, a ransomware attack can interrupt the day-to-day running of thousands of businesses on an unprecedented geographical scale. “When I think of systemic risk I think of an attack that can target many thousands of organizations, causing disruption of digital assets using technology as a vector for disruption,” says Tom Harvey, senior product manager at RMS cyber solutions.

“What’s the equivalent of a cyber hurricane? None of the insurers are quite sure about that. When you write a cyber insurance policy you’re inherently taking a bet on the probability of that policy paying out. Most people recognize there are systemic risks out there, which increases the probability of their policy paying out, but until models have been developed there’s no way to really quantify that,” he adds. “Which is why we do what we do.”

RMS estimates a substantial outage at a leading cloud service provider could generate an insurable economic loss of US$63 billion — and that is just for the U.S. In economic loss terms, this is roughly equivalent to a catastrophic natural disaster such as Superstorm Sandy in 2012.

To estimate these losses, the RMS model takes into account the inherent resiliency of cloud service providers, capitalizing on extensive research into how corporations use the cloud for their revenue-generating processes and how cloud providers have adopted resilient IT architectures to mitigate the impact of an outage on their customers.

The majority of the loss would come from contingent business income (CBI), a coverage that typically has an 8-12 hour waiting period and is heavily sublimited. Coupled with still relatively low cyber insurance penetration, this means a significant proportion of the loss would fall on the corporates themselves rather than on the insurance industry.
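The mechanics of why so much of a cloud-outage loss would stay uninsured can be shown with a toy calculation in which a waiting period removes the first hours of downtime and a sublimit caps what remains. The outage parameters and policy terms below are invented for illustration.

```python
def insured_cbi_loss(outage_hours: float,
                     hourly_revenue_loss: float,
                     waiting_period_hours: float = 12.0,
                     sublimit: float = 1_000_000.0) -> float:
    """Contingent business interruption recovery under a simple policy.

    Hypothetical terms: no recovery during the waiting period, then losses
    are covered up to the sublimit.
    """
    covered_hours = max(outage_hours - waiting_period_hours, 0.0)
    gross_covered_loss = covered_hours * hourly_revenue_loss
    return min(gross_covered_loss, sublimit)

# Example: a 36-hour outage costing the insured US$100,000 per hour
outage_hours = 36
hourly = 100_000
economic_loss = outage_hours * hourly
recovery = insured_cbi_loss(outage_hours, hourly)
print(f"economic loss: ${economic_loss:,.0f}, insured recovery: ${recovery:,.0f}")
print(f"retained by the insured: ${economic_loss - recovery:,.0f}")
```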

The evolution of cyber modeling

In the early days of cyber insurance, when businesses and insurers were grappling with an esoteric and rapidly evolving threat landscape, cyber was initially underwritten using various scenarios to determine probable maximum losses for a portfolio of risks.

RMS launched its Cyber Accumulation Management System (CAMS) in 2015, initially focused on five key cyber exposures: data exfiltration, ransomware, denial of service, cloud failure and extortion. “Within each of those classes of cyberattack we asked, ‘What is the most systemic type of incident that we would expect to see?’” explains Harvey. “Then you can understand the constraints that determine the potential scale of these events.

“We have always conducted a great deal of historical event analysis to understand the technical constraints that are in place, and then we put all that together. So, for example, with data exfiltration there are only so many threat actors that have the capability to carry out this type of activity,” he continues. “And it’s quite a resource intensive activity. So even if you made it very easy for hackers to steal data there’s only so many actors in the world (even state actors) that would want to.

“From an insurance point of view, if you are insuring 5,000 companies and providing cyber coverage for them, you could run the model and say if one of these catastrophes impacts our book we can be confident our losses are not going to exceed, say US$100 million. That’s starting to provide some comfort to those insurers about what their PML [probable maximum loss] scenarios would be.”
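The accumulation logic Harvey describes can be sketched as: for each scenario, sum the modeled loss across every policy in the book and take the worst case as the PML. The book, scenarios, severities and rates below are random placeholders, not output from CAMS.

```python
import random

random.seed(7)

# Hypothetical book of cyber policies: (policy_id, limit in US$)
book = [(f"policy_{i}", random.choice([250_000, 500_000, 1_000_000]))
        for i in range(5_000)]

# Hypothetical accumulation scenarios: each one hits a random share of the
# book and destroys a random fraction of each affected limit.
scenarios = ["cloud outage", "mass ransomware", "data exfiltration campaign"]

def scenario_loss(affected_share: float, mean_severity: float) -> float:
    """Aggregate loss across the book for one accumulation scenario."""
    loss = 0.0
    for _, limit in book:
        if random.random() < affected_share:
            loss += limit * min(random.expovariate(1 / mean_severity), 1.0)
    return loss

losses = {name: scenario_loss(affected_share=0.05, mean_severity=0.2)
          for name in scenarios}
pml = max(losses.values())
for name, loss in losses.items():
    print(f"{name:28s} US${loss / 1e6:6.1f}m")
print(f"portfolio PML across scenarios: US${pml / 1e6:.1f}m")
```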

The affirmative cyber insurance market is now four times the size it was when RMS developed its first-generation cyber risk model, and as the market diversifies and grows, clients need new tools to manage profitable growth.

Harvey adds: “The biggest request from our clients was to assess the return periods of cyber loss and to link probabilities with accumulation scenarios, and help them allocate capital to cyber as a line of insurance.  In the release of RMS Cyber Solutions Version 3, which includes the first probabilistic model for cyber loss, we estimate the scalability of the various loss processes that make up the drivers of cyber claims.

“Stochastic modeling helps explore the systemic potential for catastrophe loss estimates resulting from each cyber loss process: incorporating the statistical volatility of claims patterns from these in recent years, the technical constraints on scaling factors and attack modes of each process, and the parallels with loss exceedance distributions from other perils that RMS has modeled extensively.

“From this, we now provide loss exceedance probability (EP) distributions for each cyber loss process, with reference accumulation scenarios benchmarked to key return periods from the EP curve. These are combined into a total loss EP curve from all causes. RMS has been expanding on these scenarios in recent years, coming up with new situations that could occur in the future and incorporating a rapidly growing wealth of data on cyberattacks that have occurred. Knowing how these real-life incidents have played out helps our cyber modeling team to assign probabilities to those scenarios so insurers can more confidently assign their capital and price the business.”
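In outline, an exceedance probability curve of the kind Harvey describes can be built by simulating many years of losses for each loss process, summing them into annual totals and reading off the loss at chosen return periods. The frequencies and severities below are invented; this is a sketch of the general technique, not the RMS cyber model.

```python
import random

random.seed(42)

# Hypothetical cyber loss processes: (annual event frequency, mean severity in US$m)
loss_processes = {
    "ransomware":        (1.5, 40.0),
    "cloud outage":      (0.2, 300.0),
    "data exfiltration": (0.8, 80.0),
}

N_YEARS = 50_000  # simulated years

def poisson_count(rate: float) -> int:
    """Sample a Poisson count via exponential inter-arrival times."""
    t, count = 0.0, 0
    while True:
        t += random.expovariate(rate)
        if t > 1.0:
            return count
        count += 1

def simulate_annual_total() -> float:
    """Total annual loss across all processes for one simulated year."""
    total = 0.0
    for frequency, mean_severity in loss_processes.values():
        for _ in range(poisson_count(frequency)):
            total += random.expovariate(1.0 / mean_severity)  # exponential severity (assumption)
    return total

annual_totals = sorted((simulate_annual_total() for _ in range(N_YEARS)), reverse=True)

for return_period in (10, 50, 100, 250):
    # loss exceeded with probability 1/return_period per year
    index = int(N_YEARS / return_period) - 1
    print(f"{return_period:4d}-year loss: US${annual_totals[index]:.0f}m")
```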

By enabling insurers to model cyber on a probabilistic basis and assign capital to their portfolio of risks more accurately, it is hoped these models will facilitate the growth of both the cyber insurance and reinsurance markets.

Taking out the peaks

As the cyber (re)insurance market develops, the need for mechanisms to transfer extreme risks will grow. This is where the capital markets could potentially play a role. There are plenty of challenges in structuring an instrument such as a catastrophe bond to cover cyber risk; however, the existence of probabilistic cyber models takes that one step closer to becoming a reality.

In 2016, Credit Suisse was able to transfer its operational risk exposures to the capital markets via the Operational Re catastrophe bond, which was fronted by insurer Zurich. Among the perils covered were cyberattack and rogue trading scenarios. Certainly, investors in insurance-linked securities (ILS) have the appetite to diversify away from peak zone natural catastrophe perils.

ILS investors have the appetite to diversify away from peak zone natural catastrophe perils

“On a high level, absolutely you could transfer cyber risk to the capital markets,” thinks Ben Brookes, managing director of capital and resilience solutions at RMS. “All the dynamics you would expect are there. It’s a potentially large systemic risk and potentially challenging to hold that risk in concentration as an insurance company. There is the opportunity to cede that risk into a much broader pool of investment risk where you could argue there is much more diversification.

“One question is how much diversification there is across mainstream asset classes?” he continues. “What would the impact be on the mainstream financial markets if a major cloud provider went down for a period of time, for instance? For cyber ILS to be successful, some work would need to be put into that to understand the diversification benefit, and you’d need to be able to demonstrate that to ILS funds in order to get them comfortable.

“It could be an insured, for example, a business highly dependent on the cloud, rather than an insurance or reinsurance company, looking to cede the risk. Particularly a large organization, with a sizable exposure that cannot secure the capacity it needs in the traditional market as it is at present,” says Brookes.

“The isolation and packaging of that cause of loss could enable you to design something that seems a little bit like a parametric cyber bond, and to do that relatively soon,” he believes.

“We’re at a point where we’ve got a good handle on the risk of cloud provider failure or data exfiltration at various different levels. You could envisage building an index around that, for instance the aggregate number of records leaked across the Fortune 500 in the U.S. And then we can model that — and that’s something that can be done in relatively short order.”
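A parametric structure of the sort Brookes sketches would pay out on an index value rather than on indemnified losses. The toy trigger below pays linearly between hypothetical attachment and exhaustion points on an aggregate count of leaked records; every number is invented.

```python
def parametric_payout(records_leaked: int,
                      principal: float = 200_000_000.0,
                      attachment: int = 200_000_000,
                      exhaustion: int = 500_000_000) -> float:
    """Linear payout between attachment and exhaustion on an index of
    aggregate records leaked in a year (all terms hypothetical)."""
    if records_leaked <= attachment:
        return 0.0
    if records_leaked >= exhaustion:
        return principal
    fraction = (records_leaked - attachment) / (exhaustion - attachment)
    return principal * fraction

for index_value in (150_000_000, 300_000_000, 600_000_000):
    payout = parametric_payout(index_value)
    print(f"index {index_value / 1e6:5.0f}m records -> payout US${payout / 1e6:.0f}m")
```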


Getting physical

There are only a handful of instances where a cyber intrusion has caused substantial physical damage. These are well-known and include an attack on a German steel mill and the Stuxnet virus, which attacked a uranium enrichment facility in Iran. However, in spite of this, many experts believe the potential for physical damage resulting from a cyberattack is growing.

“There are three instances globally where cyber has been used to cause physical damage,” says Julian Enoizi, CEO of Pool Re. “The damage caused was quite significant, but there was no attribution toward those being terrorist events. But that doesn’t mean that if the physical ISIL caliphate gets squeezed they wouldn’t resort to cyber as a weapon in the future.”

In an EXPOSURE article last year on the vulnerabilities inherent in the Internet of Things, written in the wake of the 2016 Mirai DDoS attack, we explored how similar malware could be used to compromise smart thermostats, causing them to overheat and start fires. Because there is so little data and significant potential for systemic risk, (re)insurers have been reluctant to offer meaningful coverage for cyber physical exposures.

They are also concerned that the traditional “air-gapping” defense used by energy and utilities firms to protect supervisory control and data acquisition (SCADA) systems could more easily be overcome in a world where everything has an Internet connection.

Until now. In March this year, the U.K.’s terrorism insurance backstop Pool Re announced it had secured £2.1 billion of retrocession cover, which included — for the first time — cyber terrorism. “We identified the gap in our cover about two-and-a-half years ago that led us to start working with academia and government departments to find out whether there was an exposure to a cyber terrorism event that could cause physical damage,” says Enoizi.

“While it was clear there was no imminent threat, we wanted to be able to future-proof the product and make sure there were no gaps in it,” he continues. “So, we did the studies and have been working hard on getting the insurance and reinsurance market comfortable with that.”

Even after two years of research and discussions with reinsurers and brokers, it was a challenge to secure capacity from all the usual sources, reveals Enoizi. “Pool Re buys the largest reinsurance program for terrorism in the world. And there are certain reinsurance markets who would not participate in this placement because of the addition of a cyber trigger. Some markets withdrew their participation.”

This does suggest the capital markets could be the natural home for such an exposure in the future. “It is clear that state-based actors are increasingly mounting some of the largest cyberattacks,” says RMS’s Muir-Wood. “It would be interesting to test the capital markets just to see what their appetite is for taking on this kind of risk. They have definitely got a bit bolder than they were five years ago, but this remains a frontier area of the risk landscape.”


Brazil: Modeling the world’s future breadbasket

How a crop modeling collaboration with IRB Brasil Re could help bridge the protection gap and build a more resilient agricultural base for the future in Brazil

Brazil is currently the world’s second largest corn exporter, and is set to overtake the U.S. as the globe’s biggest soybean exporter, with the U.S. Department of Agriculture (USDA) predicting a record Brazilian soybean crop of 115 million metric tons in its outlook for 2018.

Yet this agricultural powerhouse — responsible for around a quarter of Brazil’s GDP — remains largely underinsured, according to Victor Roldán, vice president and head of Caribbean and Latin America at RMS. It is a situation that must be addressed, he argues, given the importance of the sector to the country’s economy and the growing weather extremes farmers must contend with as the climate changes.

The effects of climate change over the next 25 years could lead to further heavy crop losses

“Natural perils are identified as the industry’s main risk,” he says. “Major droughts or excess of rain have been big drivers of losses for the sector, and their frequency and severity shall increase under future climate change conditions. During 2014 to 2017, El Niño affected Brazil with some of the largest droughts in some areas of the country and excess of rain in others.

“There is a need to structure more effective and attractive insurance products to protect the farmers,” he continues. “For this we need better analytics, a better understanding of the perils, exposure and vulnerability.”

Worst drought in 80 years

The worst drought in 80 years reached its height in 2015, with farmers in São Paulo losing up to a third of their crops due to the dry weather. Production of soy shrank by 17 percent between 2013 and 2014, while around a fifth of the state’s citrus crops died. Meanwhile, heavy rain and flash floods in the south of the country also detrimentally impacted agricultural output.

The effects of climate change over the next 25 years could lead to further heavy crop losses, according to a study carried out by Brazil’s Secretariat of Strategic Issues (SAE). It found that some of the country’s main crops could suffer a serious decline in the areas already under cultivation, anticipating a decline of up to 39 percent in the soybean crop. This could translate into significant financial losses, since the soybean crop currently brings in around US$20 billion in export earnings annually.

IRB Brasil Re has been the leader in the agricultural reinsurance sector of the country for decades and has more than 70 years of agricultural claims data. Today agricultural risks represent its second-largest business line after property. However, insurance penetration remains low in the agricultural sector, and IRB has been seeking ways in which to encourage take-up among farmers.

The 2015 drought was a turning point, explains Roldán. “As the largest reinsurance player in Brazil, IRB needed to address in a more systematic way the recorded 16.3 percent increase in claims. The increase was due to the drought in the Midwestern region, which adversely affected corn, soybean and coffee crops and, separately, an increase in the historical average rainfall level in the Southern region, which caused damage to the crops.”

Building a probabilistic crop model

A better crop-weather modeling approach and risk analytics of crop perils will help the market to better understand their risks and drive growth in crop insurance penetration. IRB is partnering with RMS to develop the first fully probabilistic hybrid crop model for the agricultural insurance sector in Brazil, which it is planning to roll out to its cedants. The model will assess crop risks linked with weather drivers, such as drought, excess rainfall, temperature variation, hail events, strong wind and other natural hazards that impact crop yield variability. The model will be suited for different crop insurance products such as named perils (hail, frost, etc.), Multiple-Peril Crop Insurance (MPCI) and revenue covers, and will also include livestock and forestry.

“Major droughts or excess of rain have been big drivers of losses for the sector, but also climate change is a worrying trend” — Victor Roldán, RMS

“Weather-driven impacts on crop production are complex perils to model given the natural variability in space and time, the localized nature of the hazards and the complex vulnerability response depending on the intensity, but also on the timing of occurrence,” explains Olivier Bode, manager, global agricultural risk at RMS.

“For instance, plant vulnerability not only depends on the intensity of the stress but also on the timing of the occurrence, and the crop phenology or growth stage, which in turn depends on the planting date and the selected variety along with the local weather and soil conditions,” he continues. “Thus, exposure information is critical as you need to know which variety the farmer is selecting and its corresponding planting date to make sure you’re representing correctly the impacts that might occur during a growing season. The hybrid crop model developed by RMS for IRB has explicit modules that account for variety specific responses and dynamic representation of crop growth stages.”

The model will rely on more than historical data. “That’s the major advantage of using a probabilistic crop-weather modeling approach,” says Bode. “Typically, insurers are looking at historical yield data to compute actuarial losses and they don’t go beyond that. A probabilistic framework allows insurers to go beyond the short historical yield record, adding value by coupling longer weather time series with crop models. They also allow you to capture future possible events that are not recorded in past weather data, for example, drought events that might span over several years, flood occurrences extending over larger or new areas as well as climate change related impacts. This allows you to calculate exceedance probability losses at different return periods for each crop and for specific scenarios.”
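The value of coupling long weather series with a crop response, as Bode describes, can be illustrated with a toy model: simulate many growing seasons of rainfall, convert each into a yield via a simple drought response and read losses off the resulting distribution at chosen return periods. The rainfall statistics, yield response, prices and area below are all invented and are not part of the RMS-IRB model.

```python
import random

random.seed(3)

N_SEASONS = 20_000          # simulated growing seasons
EXPECTED_YIELD_T_HA = 3.2   # hypothetical soybean yield, tonnes per hectare
PRICE_PER_TONNE = 350.0     # hypothetical farm-gate price, US$
INSURED_AREA_HA = 1_000.0

def seasonal_rainfall_mm() -> float:
    """Crude rainfall generator (normal, truncated at zero) - an assumption."""
    return max(random.gauss(700.0, 180.0), 0.0)

def yield_t_ha(rainfall_mm: float) -> float:
    """Toy drought response: yields fall linearly below 600 mm of seasonal rain."""
    if rainfall_mm >= 600.0:
        return EXPECTED_YIELD_T_HA
    return EXPECTED_YIELD_T_HA * max(rainfall_mm / 600.0, 0.2)

losses = []
for _ in range(N_SEASONS):
    shortfall_t_ha = EXPECTED_YIELD_T_HA - yield_t_ha(seasonal_rainfall_mm())
    losses.append(shortfall_t_ha * INSURED_AREA_HA * PRICE_PER_TONNE)

losses.sort(reverse=True)
for return_period in (10, 50, 100):
    print(f"{return_period:3d}-year loss: US${losses[N_SEASONS // return_period - 1]:,.0f}")
```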

There is also significant potential to roll out the model to other geographies in the future, with Colombia currently looking like the obvious next step and opportunity. “The El Niño weather phenomenon affects all of Latin America; it decreases rains by more than 60 percent during the rainy seasons in many countries,” explains Roldán. “Like Brazil, Colombia is a very biologically diverse country and features a variety of ecosystems. Currently, most of the country has under-utilized agricultural land.”

Colombia is already a key player worldwide in two products: coffee and cut flowers. But the country has signed a number of free trade agreements that will give its producers more access to foreign markets. “So, the expansion of agribusiness insurance is urgently needed in Colombia,” says Roldán.

 


Readying the insurance industry for a “moonshot”

There is growing acceptance that trying to squeeze more efficiency out of existing systems and processes is folly in an industry that must make fundamental changes. But are insurance and reinsurance companies ready for the cultural and digital change this necessitates?

In an article in Wired magazine, Google X lab director Eric “Astro” Teller (whose business card describes him as “Captain of Moonshots”) suggested that it might actually be easier to make something 10 times better than 10 percent better. Squeezing out a further 10 percent in efficiency typically involves tinkering with existing flawed processes and systems. It’s not always necessary to take a “moonshot,” but making something 10 times better involves taking bold, innovative steps and tearing up the rule book where necessary.

IBM also embraced the term “moonshot,” describing how it foresaw the impact of Cloud on the future of healthcare, specifically its impact on the hunt for a cure for cancer. IBM argued a new architectural strategy — one based on open platforms and designed to cope with rampant data growth and the need for flexibility — was required in order to take cancer research to the next level.

But is the 330-year-old insurance industry — with its legacy systems, an embedded culture and significant operational pressures — ready for such a radical approach? And should those companies that are not ready, prepare to be disrupted?

In the London and Lloyd’s market, where the cost of doing business remains extremely high, there are fears that business could migrate to more efficient, modern competitor hubs, such as Zurich, Bermuda and Singapore.

“The high cost of doing business is something that has been directly recognized by [Lloyd’s CEO] Inga Beale amongst others; and it’s something that has been explicitly highlighted by the rating agencies in their reports on the market,” observes Mike van Slooten, head of market analytics at Aon Benfield. “There is a consensus building that things really do have to change.”

The influx of alternative capacity, a rapidly evolving risk landscape — with business risks increasingly esoteric — a persistently low interest rate environment and high levels of competition have stretched balance sheets in recent years. In addition, the struggle to keep up with the explosion of data and the opportunities this presents, and the need to overhaul legacy systems, is challenging the industry as never before.

“You’ve got a situation where the market as a whole is struggling to meet its ROE targets,” says van Slooten. “We’re getting to a stage where pretty much everyone has to accept the pricing that’s on offer. One company might be better at risk selection than another — but what really differentiates companies in this market is the expense ratio, and you see a huge disparity across the industry.

“Some very large, successful organizations have proved they can run at a 25 percent expense ratio and for other smaller organizations it is north of 40 percent, and in this market, that’s a very big differential,” he continues. “Without cost being brought out of the system there’s a lot of pressure there, and that’s where these M&A deals are coming from. Insight is going to remain at a premium going forward, however, a lot of the form-filling and processing that goes on behind the scenes has got to be overhauled.”

“Efficiency needs to be partnered with business agility,” says Jon Godfray, chief operating officer at Barbican Insurance Group. Making a process 10 times faster will not achieve the “moonshot” an organization needs if it is not married to the ability to act quickly on insight and make informed decisions. “If we weren’t nimble and fast, we would struggle to survive. A nimble business is absolutely key in this space. Things that took five years to develop five years ago are now taking two. Everything is moving at a faster pace.”

As a medium-sized Lloyd’s insurance group, Barbican recognizes the need to remain nimble and to adapt its business model as the industry evolves. However, large incumbents are also upping their game. “I spent some years at a very large insurer and it was like a massive oil tanker … you decided in January where you wanted to be in December, because it took you four months to turn the wheel,” says Godfray.

“Large organizations have got a lot better at being adaptable,” he continues. “Communication lines are shorter and technology plays a big part. This means the nimble advantage we have had is reducing, and we must therefore work even faster and perform better. Organizations need to remain flexible and nimble, and need to be able to embrace the increasingly stringent regulatory climate we’re in.”

Creating a culture of innovation

Automation and the efficiencies to be gained by speeding up previously clunky and expensive processes will enable organizations to compete more effectively. “But not all organizations need to be pioneers in order to leverage new technology to their advantage,” adds Godfray. “We see ourselves as a second-level early adopter. We’d love to be at the forefront of everything, but there are others with deeper pockets who can do that.”

“However, we can be an early adopter of technology that can make a difference and be brave enough to close an avenue if it isn’t working,” he continues. “Moving on from investments that don’t appear to be working is something a lot of big organizations struggle with. We have a great arrangement with our investor where if we start something and we don’t like it, we stop it and we move on.”

The drive for efficiency is not all about technology. There is a growing recognition that culture and process are critical to the change underway in the industry. Attracting the right talent, enabling bold decisions and investments to be made, and responding appropriately to rapidly changing customer needs and expectations all rest on the ability of large organizations to think and act more nimbly.

And at the end of the day, survival is all about making tactical decisions that enhance an organization’s bottom line, Godfray believes. “The winners of the future will have decent P&Ls. If you’re not making money, you’re not going to be a winner. Organizations that are consistently struggling will find it harder and harder as the operating environment becomes less and less forgiving, and they will gradually be consolidated into other companies.”

Much of the disruptive change that has already occurred within the industry has been in general insurance, where the Internet of Things (IoT), artificial intelligence and product innovation are just some of the developments underway. As we move into an era of the connected home, wearable devices and autonomous vehicles, insurers are in a better position both to analyze individuals and to feed information back to them in order to empower them and reduce risk.

But even within personal lines there has not been a remarkable product revolution yet, thinks Anthony Beilin, head of innovation and startup engagement at Aviva. “The same can be said for disruption of the entire value chain. People have attacked various parts and a lot of the focus so far has been on distribution and the front-end customer portal. Maybe over the next 10 years, traditional intermediaries will be replaced with new apps and platforms, but that’s just a move from one partner to another.”

Innovation is not just about digitization, says Beilin. While it is important for any (re)insurance company to consistently improve its digital offering, true success and efficiencies will be found in redesigning the value chain, including the products on offer. “It isn’t just taking what was a paper experience onto the Internet, then taking what was on the Internet onto the mobile and taking a mobile experience into a chatbot … that isn’t innovation.

“What we really need to think about is: what does protecting people’s future look like in 50 years’ time? Do people own cars? Do people even drive cars? What are the experiences that people will really worry about?” he explains. “How can we rethink what is essentially a hedged insurance contract to provide a more holistic experience, whether it’s using AI to manage your finances or using technology to protect your health, that’s where the radical transformation will come.”

Beilin acknowledges that collaboration will be necessary. With a background in launching startups he understands the necessary and complementary characteristics of niche players versus large incumbents.

“It is an agreed principle that the bigger the company, the harder it is to make change,” says Beilin. “When you start talking about innovating it runs contrary to the mantra of what big businesses do, which is to set up processes and systems to ensure a minimum level of delivery. Innovation, on the other hand, is about taking the seed of an idea and developing it into something new, and it’s not a natural fit with the day-to-day operations of any business.”

This is not just a problem for the insurance industry. Beilin points to the disruption brought about in the traditional media space by Netflix, Facebook and other social media platforms. “Quite frankly startups are more nimble, they have more hunger, dynamism and more to lose,” he says. “If they go bankrupt, they don’t get paid. The challenge for them is in scaling it to multiple customers.”

This is where investments like Aviva’s Digital Garage come in. “We’re trying to be a partner for them,” says Beilin. “Collaboration is the key in anything. If you look at the success we’re going to achieve, it’s not going to be in isolation. We need different capabilities to succeed in a future state. We’ve got some extremely creative and talented people on staff, but of course we’ll never have everyone. We need different capabilities and skills so we need to make sure we’re interoperable and open to working with partners wherever possible.”


Achieving 10X: A platform-centric approach

Alongside increasing speed and agility and initiatives to drive down the transactional cost of the business, technology is seen as a savior for the way it enables better risk selection, pricing and capital allocation. Analytics is imperative, as is fusing the back office, where the data lives, with the front office, where the decision-makers sit.

According to 93 percent of insurance CEOs surveyed by PwC in 2015, data mining and analysis is the most strategically important digital technology for their business. Many (re)insurance company CIOs have taken the plunge and moved parts of their business into the Cloud, particularly those technologies that are optimized to leverage its elasticity and scalability, in order to enhance their analytical capabilities. 

When it comes to analytics, simply moving exposure data, contract data, and existing actuarial and probabilistic models into Cloud architecture will not enable companies to redesign their entire workflow, explains Shaheen Razzaq, director, software products at RMS. 

“Legacy systems were not designed to scale to the level needed,” he adds. “We are now in a world dealing with huge amounts of data and even more sophisticated models and analytics. We need scalable and performing technologies. And to truly leverage these technologies, we need to redesign our systems from the ground up.” He argues that what is needed is a platform-centric approach, designed to be supported by the Cloud, to deliver the scale, performance and insurance-specific capabilities the industry needs to achieve its moonshot moment. Clearly RMS(one)®, a big data and analytics platform purpose-built for the insurance industry, is one solution available.


 


The peril of ignoring the tail

Drawing on several new data sources and new insights from recent earthquakes into how different fault segments might interact in future events, Version 17 of the RMS North America Earthquake Models increases the modeled frequency of larger events, making for a fatter tail. EXPOSURE asks what this means for (re)insurers from a pricing and exposure management perspective.

Recent major earthquakes, including the M9.0 Tohoku Earthquake in Japan in 2011 and the Canterbury Earthquake Sequence in New Zealand (2010-2011), have offered new insight into the complexities and interdependencies of losses that occur following major events. This insight, as well as other data sources, was incorporated into the latest seismic hazard maps released by the U.S. Geological Survey (USGS).

In addition to engaging with USGS on its 2014 update, RMS went on to invest more than 100 person-years of work in implementing the main findings of this update as well as comprehensively enhancing and updating all components in its North America Earthquake Models (NAEQ). The update reflects the deep complexities inherent in the USGS model and confirms the adage that “earthquake is the quintessential tail risk.” Among the changes to the RMS NAEQ models was the recognition that some faults can interconnect, creating correlations of risk that were not previously appreciated.
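For reference, the “tail” in this context is the far end of the annual exceedance probability curve. The definitions below are standard catastrophe modeling conventions rather than anything specific to Version 17:

```latex
% Annual exceedance probability (EP) and return period (RP) for a loss level x
% (standard definitions, not specific to the RMS NAEQ implementation):
\[
  \mathrm{EP}(x) = P(\text{annual loss} \geq x), \qquad
  \mathrm{RP}(x) = \frac{1}{\mathrm{EP}(x)}
\]
% A "fatter tail" means EP(x) rises for large x: a given loss level now sits
% at a shorter return period, and the loss at a fixed return period
% (for example, 1-in-250 years) is larger than in the previous model version.
```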

Lessons from Kaikoura

While there is still a lot of uncertainty surrounding tail risk, the new data sets provided by USGS and others have improved the understanding of events with a longer return period. “Global earthquakes are happening all of the time, not all large, not all in areas with high exposures,” explains Renee Lee, director, product management at RMS. “Instrumentation has become more advanced and coverage has expanded such that scientists now know more about earthquakes than they did eight years ago when NAEQ was last released in Version 9.0.”

This includes understanding about how faults creep and release energy, how faults can interconnect, and how ground motions attenuate through soil layers and over large distances. “Soil plays a very important role in the earthquake risk modeling picture,” says Lee. “Soil deposits can amplify ground motions, which can potentially magnify the building’s response leading to severe damage.”

The 2016 M7.8 earthquake in Kaikoura, on New Zealand’s South Island, is a good example of a complex rupture where fault segments connected in more ways than had previously been realized. In Kaikoura, at least six fault segments were involved, where the rupture “jumped” from one fault segment to the next, producing a single larger earthquake.

“The Kaikoura quake was interesting in that we did have some complex energy release moving from fault to fault,” says Glenn Pomeroy, CEO of the California Earthquake Authority (CEA). “We can’t hide our heads in the sand and pretend that scientific awareness doesn’t exist. The probability has increased for a very large, but very infrequent, event, and we need to determine how to manage that risk.”

San Andreas correlations

Looking at California, the updated models include events that extend from the north of San Francisco to the south of Palm Springs, correlating exposures along the length of the San Andreas fault. While the prospect of a major earthquake impacting both northern and southern California is considered extremely remote, it will nevertheless affect how reinsurers seek to diversify different types of quake risk within their book of business.

“In the past, earthquake risk models have considered Los Angeles as being independent of San Francisco,” says Paul Nunn, head of catastrophe risk modeling at SCOR. “Now we have to consider that these cities could have losses at the same time (following a full rupture of the San Andreas Fault).


“However, it doesn’t make that much difference in the sense that these events are so far out in the tail ... and we’re not selling much coverage beyond the 1-in-500-year or 1-in-1,000-year return period. The programs we’ve sold will already have been exhausted long before you get to that level of severity.”

While the contribution of tail events to return period losses is significant, as Nunn explains, this could be more of an issue for primary insurance companies than for reinsurers from a capitalization standpoint. “From a primary insurance perspective, the bigger the magnitude and event footprint, the more separate claims you have to manage. So, part of the challenge is operational — in terms of mobilizing loss adjusters and claims handlers — but primary insurers also have the risk that losses from tail events could go beyond the (re)insurance program they have bought.

“It’s less of a challenge from the perspective of global (re)insurers, because most of the risk we take is on a loss limited basis — we sell layers of coverage,” he continues. “Saying that, pricing for the top layers should always reflect the prospect of major events in the tail and the uncertainty associated with that.”

He adds: “The magnitude of the Tohoku earthquake event is a good illustration of the inherent uncertainties in earthquake science and wasn’t represented in modeled scenarios at that time.”

While U.S. regulation stipulates that carriers writing quake business should capitalize to the 1-in-200-year event level, in Canada capital requirements are more conservative in an effort to better account for tail risk. “So, Canadian insurance companies should have less overhang out of the top of their (re)insurance programs,” says Nunn.
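To make Nunn’s “layers of coverage” point concrete, an excess-of-loss layer responds only to the slice of loss between its attachment and exhaustion points, so extreme tail events add little beyond what a merely severe event already recovers. A minimal sketch in generic notation, not tied to any specific program mentioned here:

```latex
% Recovery R from a single excess-of-loss layer with attachment A and limit L,
% for a gross loss X (generic notation):
\[
  R(X) = \min\bigl(L,\ \max(0,\, X - A)\bigr)
\]
% Once X exceeds A + L the layer is exhausted, so a 1-in-2,000-year loss
% recovers no more from that layer than a 1-in-500-year loss that already
% blew through the top of the program.
```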

Need for post-event funding

For the CEA, the updated earthquake models could reinvigorate discussions around the need for a mechanism to raise additional claims-paying capacity following a major earthquake. Set up after the Northridge Earthquake in 1994, the CEA is a not-for-profit, publicly managed and privately funded earthquake pool.

“It is pretty challenging for a stand-alone entity to take on large tail risk all by itself,” says Pomeroy. “We have, from time to time, looked at the possibility of creating some sort of post-event risk-transfer mechanism.

“A few years ago, for instance, we had a proposal in front of the U.S. Congress that would have created the ability for the CEA to have done some post-event borrowing if we needed to pay for additional claims,” he continues. “It would have put the U.S. government in the position of guaranteeing our debt. The proposal didn’t get signed into law, but it is one example of how you could create an additional claim-paying capacity for that very large, very infrequent event.”

“(Re)insurers will be considering how to adjust the balance between the LA and San Francisco business they’re writing” — Paul Nunn, SCOR

The CEA leverages both traditional and non-traditional risk-transfer mechanisms. “Risk transfer is important. No one entity can take it on alone,” says Pomeroy. “Through risk transfer from insurer to (re)insurer the risk is spread broadly through the entrance of the capital markets as another source for claim-paying capability and another way of diversifying the concentration of the risk.

“We manage our exposure very carefully by staying within our risk-transfer guidelines,” he continues. “When we look at spreading our risk, we look at spreading it through a large number of (re)insurance companies from 15 countries around the world. And we know the (re)insurers have their own strict guidelines on how big their California quake exposure should be.”

The prospect of a higher frequency of larger events producing a “fatter” tail also raises the possibility of an overall reduction in average annual loss (AAL) for (re)insurance portfolios, a factor that is likely to add to pricing pressure as the industry approaches the key January 1 renewal date, predicts Nunn. “The AAL for Los Angeles coming down in the models will impact the industry in the sense that it will affect pricing and how much probable maximum loss people think they’ve got. Most carriers are busy digesting the changes and carrying out due diligence on the new model updates.

“Although the eye-catching change is the possibility of the ‘big one,’ the bigger immediate impact on the industry is what’s happening at lower return periods where we’re selling a lot of coverage,” he says. “LA was a big driver of risk in the California quake portfolio and that’s coming down somewhat, while the risk in San Francisco is going up. So (re)insurers will be considering how to adjust the balance between the LA and San Francisco business they’re writing.”
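As a reminder of where the AAL figures Nunn mentions come from, the metric is simple arithmetic over the model’s event loss table. This is standard catastrophe modeling practice rather than anything particular to Version 17:

```latex
% Average annual loss from an event loss table of events i, each with an
% annual rate lambda_i and an expected loss E[L_i]:
\[
  \mathrm{AAL} = \sum_{i} \lambda_i \, \mathrm{E}[L_i]
\]
% Shifting rate from moderate Los Angeles events toward rarer, larger
% ruptures can reduce the AAL even while losses at long return periods grow.
```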

 


Quantifying the resilience dividend

New opportunities arise for risk capital providers and city planners as the resilience movement gets analytical. EXPOSURE explores the potential.

A hundred years ago, a seven-and-a-half-mile seawall was built to protect San Francisco from Mother Nature. It gave the city’s planning department the confidence to develop today’s commercially and culturally rich downtown.

But that iconic waterfront is under threat. The aging seawall has serious seismic vulnerability. Almost $80 billion of San Francisco property is exposed to sea level rise.

To ensure his city’s long-term resilience, Mayor Ed Lee commissioned a plan to design and fund the rebuild of the seawall. A cost of $8 million for the feasibility study last year and $40 million for the preliminary design this year is just the beginning. With an estimated price tag of up to $5 billion, the stakes are high. Getting it wrong is not an option. But getting it right won’t be easy.

San Francisco is no outlier. Investing in resilience is in vogue. Citizens expect their city officials to understand the risks faced and deal with them. The science is there, so citizens want to see their city planning and investing for a robust, resilient city looking fifty or a hundred years ahead. The frequency and severity of natural catastrophes continues to rise. The threat of terror continues to evolve. Reducing damage and disruption when the worst happens has become an imperative across the political spectrum.

Uncertainty around various macro trends complicates the narrative: sea level rise, coastal development, urban densification, fiscal constraints, “disaster deductibles.” Careful planning is required. An informed understanding of how the right intervention leads to a meaningful reduction in risk is higher on the City Hall agenda than ever before.

This has various implications for risk capital providers. Opportunities are emerging to write more profitable business in catastrophe-exposed areas. Municipal buyers are looking for new products that link risk transfer and risk reduction or deliver more than just cash when disaster strikes.

The innovators will win, thinks John Seo, co-founder and managing principal of Fermat Capital Management. “Considerable time and thought must be invested on what to do with funds, both pre- and post-event.

“All municipalities function on a relatively fixed annual budget. Risk transfer smooths the costs of catastrophe risk, which lessens the disruption on ongoing spending and programs. Ideally, risk transfer comes with a plan for what to do with the funds received from a risk transfer payout. That plan is just as valuable, if not more valuable, than the payout itself.”

Resisting a shock in New Orleans

This innovative approach to resilience has become central to New Orleans under Mayor Mitch Landrieu. Partnering with utilities and reinsurance experts, the city examined its drinking water, sanitation and rainwater evacuation facilities to determine their vulnerability to major storms. This analysis provided the basis for investments to ensure these facilities could withstand a shock and continue operating effectively.

“In New Orleans, the city’s pumps are a critical piece of infrastructure. So, the question was: can you create a better nexus between an engineering company with manpower and thought-power to help keep those pumps going, to prepare them in advance of a catastrophe, and align insurance contracts and risk so we are continuing service delivery,” explains Elizabeth Yee, vice president of city solutions at 100 Resilient Cities.

The aim is to focus on disaster response and business continuity, in addition to risk financing. “If there’s an earthquake it’s great the city might receive $10 million to help repair the airport, but what they really need is an airport that is up and running, not just $10 million,” says Yee. “So, there needs to be a way to structure insurance contracts so they better help continue service delivery, as opposed to just providing money.”

There is also the need to reflect the impact of strengthened infrastructure when modeling and pricing the risk. But this isn’t always an easy journey.

In the city of Miami Beach, Mayor Philip Levine decided to raise its roads, so the barrier island’s thoroughfares stay open even in a flood. While the roads remain dry, this intervention has brought some unwelcome consequences.

City residents and business owners are concerned that the runoff will flood adjacent properties. Irrespective of where the water from the streets goes, it is no longer clear whether in-force insurance policies would pay out in the event of flood damage. The ground floor is no longer technically the ground floor but a basement, since it now sits below street level, as one local restaurateur found out when Allstate denied his $15,000 claim last year.

“That’s an example of the kind of highly nuanced problem government agencies are grappling with all over the world,” explains Daniel Stander, global managing director at RMS. “There are often no quick and easy answers. Economic analysis is essential. Get it wrong and well-intentioned intervention can actually increase the risk — and the cost of insurance with it.

“The interventions you put in place have to reduce the risk in the eyes of the market,” he continues. “If you want to get the credit for your resilience investments, you need to make sure you understand your risk as the market does, and then reduce your risk in its eyes. Get it right, and communities and economies thrive. Get it wrong, and whole neighborhoods become uninsurable, unaffordable, unlivable.”

Retrofitting shelters in Berkeley

Through its partnership with 100 Resilient Cities, RMS is helping a growing number of cities determine which resilience interventions will make the biggest difference.

Knowing that a major Hayward fault rupture would displace up to 12,000 households, with up to 4,000 seeking temporary shelter, the city of Berkeley engaged RMS to ascertain whether the city’s earthquake shelters would withstand the most probable events on the fault. A citywide analysis highlighted that the shelters perform, on average, worse than the surrounding buildings from which residents would flee. The RMS analysis also found that a $17 million seismic retrofit investment plan is substantially more cost-effective and environmentally friendly than rebuilding or repairing structures after an earthquake.
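The cost-effectiveness finding rests on a standard benefit-cost comparison: the up-front retrofit cost is weighed against the discounted losses it is expected to avoid over the shelters’ lifetime. A minimal sketch of that arithmetic follows; apart from the $17 million retrofit figure cited above, every number is an illustrative placeholder rather than a value from the RMS Berkeley analysis.

```python
"""Benefit-cost sketch for a seismic retrofit decision.

The $17M retrofit cost is cited in the article; every other number here is
a hypothetical placeholder used only to show the shape of the calculation.
"""

def present_value(annual_benefit: float, years: int, discount_rate: float) -> float:
    """Discounted sum of a constant annual benefit over a planning horizon."""
    return sum(annual_benefit / (1 + discount_rate) ** t for t in range(1, years + 1))

retrofit_cost = 17e6          # from the article
avoided_annual_loss = 1.5e6   # hypothetical: expected annual loss avoided by retrofitting
horizon_years = 50            # hypothetical planning horizon
discount_rate = 0.03          # hypothetical real discount rate

benefit = present_value(avoided_annual_loss, horizon_years, discount_rate)
bcr = benefit / retrofit_cost  # benefit-cost ratio; > 1 favors retrofitting now

print(f"Present value of avoided losses: ${benefit/1e6:.1f}M, BCR = {bcr:.2f}")
```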

“We’ve encouraged our chief resilience officers who are new to a city to learn about their exposures,” explains Yee. “From that baseline understanding, they can then work with someone like RMS to carry out more specific analysis. The work that RMS did with Berkeley helped them to better understand the economic risk posed by an earthquake, and ensured the city was able to secure funding to upgrade earthquake shelters for its residents.”

Rewarding resilience

In parts of the world where the state or national government acts as (re)insurer-of-last-resort, stepping in to cover the cost of a catastrophe, there may be a lack of incentive to improve city resilience, warns Yee. “Many of the residents in my neighborhood have elevated our homes, because we had fish in our yards after Hurricane Sandy,” she says. “But some of our neighbors have decided to wait until the ‘next one’ because there’s this attitude that FEMA (the Federal Emergency Management Agency) will just pay them back for any damage that occurs. We need to change the regulatory framework so that good behavior is incentivized and rewarded.”

“You don’t have to go to emerging markets to find plenty of exposure that is not covered by insurance” — Daniel Stander, RMS

In the U.S., FEMA has suggested the introduction of a “disaster deductible.” This would require recipients of FEMA public assistance funds to expend a predetermined amount of their own funds on emergency management and disaster costs before they receive federal funding. Critically, it is hoped the proposed disaster deductible could “incentivize risk reduction efforts, mitigate future disaster impacts and lower overall recovery costs.”

City resilience framework

The City Resilience Framework, developed by Arup with support from the Rockefeller Foundation, helps clarify the primary factors contributing to resilient cities.  

Resilient cities are more insurable cities, points out Stander. “There are constraints on how much risk can be underwritten by the market in a given city or county. Those constraints bite hardest in high-hazard, high-exposure locations.”

“So, despite an overcapitalized market, there is significant underinsurance,” explains Stander. “You don’t have to go to emerging markets to find plenty of exposure that is not covered by insurance.”

Insurers need not fear that cities’ resilience investments will be to the detriment of premium income. “The insurance industry wants risk to be at an appropriate level,” says Stander. “There are parts of the world where the risk is so high, the industry is rightly reluctant to touch it. Informal neighborhoods throughout South America and South Asia are so poorly constructed they’re practically uninsurable. The insurance industry likes resilience interventions that keep risk insurable at a rate which is both affordable and profitable.”

“Besides, it’s not like you can suddenly make Miami zero-risk,” he adds. “But what you can do as a custodian of a city’s economy is prioritize and communicate resilience interventions that simultaneously reduce rates for citizens and attract private insurance markets. And as a capital provider you can structure products that reward resilient thinking, which help cities monetize their investments in resilience.”

Movements like 100 Resilient Cities, pioneered by the Rockefeller Foundation, are both responding to and driving this urgency. There is a real and present need for action to meet growing threats.

In San Francisco, investments in resilience are being made now. The city is beyond strategy formulation and on to implementation mode. Shovel-ready projects are required to stem the impacts of 66 inches of sea level rise by 2100. For San Francisco and hundreds of cities and regions around the globe, resilience is a serious business.


Quantifying the economic impact of sea level rise in San Francisco 

In May 2016, RMS published the findings of an analysis into the likely economic impact of sea level rise (SLR) in San Francisco, with the aim of informing the city’s action plan. It found that by the year 2100, $77 billion of property would be at risk from a one-in-100-year extreme storm surge event and that $55 billion of property in low-lying coastal zones could be permanently inundated in the absence of intervention.
The city’s Sea Level Rise Action Plan, which incorporated RMS findings, enabled San Francisco’s mayor to invest $8 million in assessing the feasibility of retrofitting the city’s seawall. The city subsequently commissioned a $40 million contract to design that retrofit program. 


 


A burgeoning opportunity

As traditional (re)insurers hunt for opportunity outside of property catastrophe classes, new probabilistic casualty catastrophe models are becoming available. At the same time, as catastrophe risks are becoming increasingly “manufactured” or human-made, so casualty classes have the potential to be the source of claims after a large “natural” catastrophe.

Just as the growing sophistication of property catastrophe models has enabled industry innovation, there is growing excitement that new tools available to casualty (re)insurers could help to expand the market. Through improved evaluation of casualty clash exposures, reinsurers will be better able to understand, price and manage these risks, as well as design new products that cater to underserved areas.

However, the casualty market must switch from pursuing a purely defensive strategy. “There is an ever-growing list of exclusions in liability insurance and interest in the product is declining with the proliferation of these exclusions,” explains Dr. Robert Reville, president and CEO of Praedicat, the world’s first liability catastrophe modeling company. “There is a real growth opportunity for the industry to deal with these exclusions and recognize where they can confidently write more business.

“Industry practitioners look at what’s happened in property — where modeling has led to a lot of new product ideas, including capital market solutions, and a lot of innovation — and casualty insurers are hungry for that sort of innovation, for the same sort of transformation in liability that happened in property,” he adds.

Perils — particularly emerging risks that underwriters have struggled to price, manage and understand — have typically been excluded from casualty products. This includes electromagnetic fields (EMFs), such as those emanating from broadcast antennas and cell phones. Cover for such exposures is restricted, particularly for the U.S. market, where it is often excluded entirely. Some carriers will not offer any cover at all if the client has even a remote exposure to EMF risks. Yet are they being over-apprehensive about the risk?

The fear that leads to an overapplication of exclusions is very tangible. “The latency of the disease development process — or the way a product might be used, with more people becoming exposed over time — causes there to be a build-up of risk that may result in catastrophe,” Reville continues. “Insurers want to be relevant to insuring innovation in product, but they have to come to terms with the latency and the potential for a liability catastrophe that might emerge from it.”

Unique nature of casualty catastrophe

It is a misconception that casualty is not a catastrophe class of business. Reville points out that the industry’s US$100 billion-plus loss relating to asbestos claims is arguably its biggest-ever catastrophe. Within the Lloyd’s market the overwhelming nature of APH (asbestos, pollution and health) liabilities contributed to the market’s downward spiral in the late 1980s, only brought under control through the formation of the run-off entity Equitas, now owned and managed by Warren Buffett’s Berkshire Hathaway.

As the APH claims crisis demonstrated, casualty catastrophes differ from property catastrophes in that they are a “two-tailed loss.” There is the “tail loss” both have in common, which describes the high severity, low probability characteristics — or high return period — of a major event. But in addition, casualty classes of business are “long-tail” in nature. This means that a policy written in 2017 may not experience a claim until 20 years later, providing an additional challenge from a modeling and reserving perspective.

“Casualty insurers are hungry for that sort of innovation, for the same sort of transformation in liability that happened in property” — Robert Reville, Praedicat

Another big difference between casualty clash and property catastrophe from a modeling perspective is that the past is not a good indication of future claims. “By the time asbestos litigation had really taken off, it was already a banned product in the U.S., so it was not as though asbestos claims were any use in trying to figure out where the next environmental disaster or next product liability was going to be,” says Reville. “So, we needed a forward-looking approach to identify where there could be new sources of litigation.”

With the world becoming both more interconnected and more litigious, there is every expectation that future casualty catastrophe losses could be much greater and impact multiple classes of business. “The reality is there’s serial aggregation and systemic risk within casualty business, and our answer to that has generally been that it’s too difficult to quantify,” says Nancy Bewlay, chief underwriting officer, global casualty, at XL Catlin. “But the world is changing. We now have technology advances and data collection capabilities we never had before, and public information that can be used in the underwriting process.

“Take the Takata airbag recall,” she continues. “In 2016, they had to recall 100 million airbags worldwide. It affected all the major motor manufacturers, who then faced the accumulation potential not only of third-party liability claims, but also product liability and product recall. Everything starts to accumulate and combine within that one industry, and when you look at the economic footprint of that throughout the supply chain there’s a massive potential for a casualty catastrophe when you see how everything is interconnected.”

RMS chief research officer Robert Muir-Wood explains: “Another area where we can expect an expansion of modeling applications concerns casualty lines picking up losses from more conventional property catastrophes. This could occur when the cause of a catastrophe can be argued to have ‘non-natural’ origins, and particularly where there are secondary ‘cascade’ consequences of a catastrophe — such as a dam failing after a big earthquake or for claims on ‘professional lines’ coverages of builders and architects — once it is clear that standard property insurance lines will not compensate for all the building damage.”

“This could be prevalent in regions with low property catastrophe insurance penetration, such as in California, where just one in ten homeowners has earthquake cover. In the largest catastrophes, we could expect claims to be made against a wide range of casualty lines. The big innovation around property catastrophe in particular was to employ high-resolution GIS [geographic information systems] data to identify the location of all the risk. We need to apply similar location data to casualty coverages, so that we can estimate the combined consequences of a property/casualty clash catastrophe.”

One active instance, cited by Muir-Wood, of this shift from property to casualty coverages concerns earthquakes in Oklahoma. “There are large amounts of wastewater left over from fracking, and the cheapest way of disposing of it is to pump it down deep boreholes. But this process has been triggering earthquakes, and these earthquakes have started getting quite big — the largest so far in September 2016 had a magnitude of M5.8.

“At present the damage to buildings caused by these earthquakes is being picked up by property insurers,” he continues. “But what you will see over time are lawsuits to try and pass the costs back to the operators of the wells themselves. Working with Praedicat, RMS has done some modeling work on how these operators can assess the risk cost of adding a new disposal well. Clearly the larger the earthquake, the less likely it is to occur. However, the costs add up: our modeling shows that an earthquake bigger than M6 right under Oklahoma City could cause more than US$10 billion of damage.”

Muir-Wood adds: “The challenge is that casualty insurance tends to cover many potential sources of liability in the contract, and the operators of the wells — and, we believe, their insurers — are not currently identifying this particular, and potentially catastrophic, source of future claims. There’s the potential for a really big loss that would eventually fall onto the liability writers of these deep wells ... and they are not currently pricing for this risk, or managing their portfolios of casualty lines.”
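The risk-cost logic Muir-Wood describes can be sketched as an expected-loss sum: an assumed magnitude-frequency relation for induced events is combined with a loss that grows steeply with magnitude. The Gutenberg-Richter form and every parameter below are illustrative assumptions only, not values from the RMS or Praedicat work described here.

```python
"""Sketch of an expected annual risk cost for an induced-seismicity source.

The Gutenberg-Richter parameters and the loss-by-magnitude figures are
illustrative assumptions only; they are not taken from the RMS/Praedicat
modeling described in the article.
"""

def annual_rate(m: float, a: float = 2.0, b: float = 1.0) -> float:
    """Gutenberg-Richter annual rate of events with magnitude >= m: 10**(a - b*m)."""
    return 10 ** (a - b * m)

# Hypothetical losses (USD) if an event of at least this magnitude strikes
# beneath a populated area; chosen only to show how the sum behaves.
loss_by_magnitude = {4.5: 5e7, 5.0: 2e8, 5.5: 1e9, 6.0: 1e10}

magnitudes = sorted(loss_by_magnitude)
risk_cost = 0.0
for i, m in enumerate(magnitudes):
    upper = magnitudes[i + 1] if i + 1 < len(magnitudes) else None
    # Annual rate of events falling in this magnitude bin (open-ended top bin).
    rate = annual_rate(m) - (annual_rate(upper) if upper else 0.0)
    risk_cost += rate * loss_by_magnitude[m]

print(f"Illustrative expected annual risk cost: ${risk_cost/1e6:.1f}M")
```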

A modeled class of business

According to Reville, the explosion of data and development of data science tools have been key to the development of casualty catastrophe modeling. The opportunity to develop probabilistic modeling for casualty classes of business was born in the mid-2000s when Reville was senior economist at the RAND Corporation.

At that time, RAND was using data from the RMS® Probabilistic Terrorism Model to help inform the U.S. Congress in its decision on the renewal of the Terrorism Risk Insurance Act (TRIA). Separately, it had written a paper on the scope and scale of asbestos litigation and its potential future course.

“As we were working on these two things it occurred to us that here was this US$100 billion loss — this asbestos problem — and adjacently within property catastrophe insurance there was this developed form of analytics that was helping insurers solve a similar problem. So, we decided to work together to try and figure out if there was a way of solving the problem on the liability side as well,” adds Reville.

Eventually Praedicat was spun out of the initial project as its own brand, launching its first probabilistic liability catastrophe model in summer 2016. “The industry has evolved a lot over the past five years, in part driven by Solvency II and heightened interest from the regulators and rating agencies,” says Reville. “There is a greater level of concern around the issue, and the ability to apply technologies to understand risk in new ways has evolved a lot.”

There are obvious benefits to (re)insurers from a pricing and exposure management perspective. “The opportunity is changing the way we underwrite,” says Bewlay. “Historically, we underwrote by exclusion with a view to limiting our maximum loss potential. We couldn’t get a clear understanding of our portfolio because we weren’t able to: we didn’t have enough meaningful, statistical and credible data.”

“We feel they are not being proactive enough because ... there’s the potential for a really big loss that would fall onto the liability writers of these deep wells” — Robert Muir-Wood, RMS

Then there are the exciting opportunities for growth in a market where there is intense competition and downward pressure on rates. “Now you can take a view on the ‘what-if’ scenario and ask: how much loss can I handle and what’s the probability of that happening?” she continues. “So, you can take on managed risk. Through the modeling you can better understand your industry classes and what could happen within your portfolio, and can be slightly more opportunistic in areas where previously you may have been extremely cautious.”

Not only does this expand the potential range of casualty insurance and reinsurance products, it should allow the industry to better support developments in burgeoning industries. “Cyber is a classic example,” says Bewlay. “If you can start to model the effects of a cyber loss you might decide you’re OK providing cyber in personal lines for individual homeowners in addition to providing cyber in a traditional business or technology environment.

“You would start to model all three of these scenarios and what your potential market share would be to a particular event, and how that would impact your portfolio,” she continues. “If you can answer those questions utilizing your classic underwriting and actuarial techniques, a bit of predictive modeling in there — this is the blend of art and science — you can start taking opportunities that possibly you couldn’t before.”
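The “what-if” exercise Bewlay describes boils down to running deterministic scenarios against the portfolio and asking what share of each modeled industry loss the carrier would retain, and with what likelihood. A minimal sketch follows; the scenario names, probabilities, industry losses and market shares are entirely hypothetical.

```python
"""What-if scenario sketch: a carrier's share of modeled industry losses.

Scenario names, annual probabilities, industry losses and market shares are
hypothetical placeholders used only to illustrate the calculation."""

scenarios = [
    # (name, annual probability, industry loss USD, carrier market share)
    ("cyber outage - personal lines", 0.02, 2.0e9, 0.05),
    ("cyber outage - commercial",     0.01, 8.0e9, 0.03),
    ("cyber outage - technology E&O", 0.005, 1.5e10, 0.02),
]

risk_appetite = 5.0e8  # hypothetical maximum tolerable loss per event

for name, prob, industry_loss, share in scenarios:
    carrier_loss = industry_loss * share
    within_appetite = "within" if carrier_loss <= risk_appetite else "exceeds"
    print(f"{name}: {prob:.1%} p.a., carrier loss ${carrier_loss/1e6:.0f}M "
          f"({within_appetite} appetite)")
```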


The Future of (Re)Insurance: Evolution of the Insurer DNA

The (re)insurance industry is at a tipping point. Rapid technological change, disruption through new, more efficient forms of capital and an evolving risk landscape are challenging industry incumbents like never before. Inevitably, as EXPOSURE reports, the winners will be those who find ways to harmonize analytics, technology, industry innovation, and modeling.

There is much talk of disruptive innovation in the insurance industry. In personal lines insurance, disintermediation, the rise of aggregator websites and the Internet of Things (IoT) – such as connected car, home, and wearable devices – promise to transform traditional products and services. In the commercial insurance and reinsurance space, disruptive technological change has been less obvious, but behind the scenes the industry is undergoing some fundamental changes.

The tipping point

The ‘Uber’ moment has yet to arrive in reinsurance, according to Michael Steel, global head of solutions at RMS. “The change we’re seeing in the industry is constant. We’re seeing disruption throughout the entire insurance journey. It’s not the case that the industry is suffering from a short-term correction and then the market will go back to the way it has done business previously. The industry is under huge competitive pressures and the change we’re seeing is permanent and it will be continuous over time.”

Experts feel the industry is now at a tipping point. Huge competitive pressures, rising expense ratios, an evolving risk landscape and rapid technological advances are forcing change upon an industry that has traditionally been considered somewhat of a laggard. And the revolution, when it comes, will be a quick one, thinks Rupert Swallow, co-founder and CEO of Capsicum Re.

“WE’RE SEEING DISRUPTION THROUGHOUT THE ENTIRE INSURANCE JOURNEY”

— MICHAEL STEEL, RMS

Other sectors have plenty of cautionary tales on what happens when businesses fail to adapt to a changing world, he explains. “Kodak was a business that in 1998 had 120,000 employees and printed 95 percent of the world’s photographs. Two years later, that company was bankrupt as digital cameras built their presence in the marketplace. When the tipping point is reached, the change is radical and fast and fundamental.”

While it is impossible to predict exactly how the industry will evolve going forward, it is clear that tomorrow’s leading (re)insurance companies will share certain attributes. This includes a strong appetite to harness data and invest in new technology and analytics capabilities, the drive to differentiate and design new products and services, and the ability to collaborate. According to Eric Yau, general manager of software at RMS, the goal of an analytic-driven organization is to leverage the right technologies to bring data, workflow and business analytics together to continuously drive more informed, timely and collaborative decision making across the enterprise.

“New technologies play a key role and while there are many choices with the rise of insurtech firms, history shows us that success is achieved only when the proper due diligence is done to really understand and assess how these technologies enable the longer term business strategy, goals and objectives,” says Yau.

Yau says one of the most important ingredients to success is the ability to effectively blend the right team of technologists, data scientists and domain experts who can work together to understand and deliver upon these key objectives.

The most successful companies will also look to attract and retain the best talent, with succession planning that puts a strong emphasis on bringing Millennials up through the ranks. “There is a huge difference between the way Millennials look at the workplace and live their lives, versus industry professionals born in the 1960s or 1970s; the two generations are completely different,” says Swallow. “Those guys [Millennials] would no sooner write a cheque to pay for something than fly to the moon.”

Case for collaboration

If (re)insurers drag their heels in embracing and investing in new technology and analytics capabilities, disruption could well come from outside the industry. Back in 2015, Lloyd’s CEO Inga Beale warned that insurers were in danger of being “Uber-ized” as technology allows companies from Google to Walmart to undermine the sector’s role of managing risk.

Her concerns are well founded, with Google launching a price comparison site in the U.S. and Rakuten and Alibaba, Japan and China’s answers to Amazon respectively, selling a range of insurance products on their platforms.

“No area of the market is off-limits to well-organized technology companies that are increasingly encroaching everywhere,” says Rob Procter, CEO of Securis Investment Partners. “Why wouldn’t Google write insurance… particularly given what they are doing with autonomous vehicles? They may not be insurance experts but these technology firms are driving the advances in terms of volumes of data, data manipulation, and speed of data processing.”

Procter makes the point that the reinsurance industry has already been disrupted by the influx of third-party capital into the insurance-linked securities (ILS) space over the past 10 to 15 years. Collateralized products such as catastrophe bonds, sidecars and non-traditional reinsurance have fundamentally altered the reinsurance cycle and exposed the industry’s inefficiencies like never before.

“We’ve been innovators in this industry because we came in ten or 15 years ago, and we’ve changed the way the industry is structured and is capitalized and how the capital connects with the customer,” he says. “But more change is required to bring down expenses and to take out what are massive friction costs, which in turn will allow reinsurance solutions to be priced competitively in situations where they are not currently.

“It’s astounding that 70 percent of the world’s catastrophe losses are still uninsured,” he adds. “That statistic has remained unchanged for the last 20 years. If this industry was more efficient it would be able to deliver solutions that work to close that gap.”

Collaboration is the key to leveraging technology – or insurtech – expertise and getting closer to the original risk. There are numerous examples of tie-ups between (re)insurance industry incumbents and tech firms. Others have set up innovation garages or bought their way into innovation, acquiring or backing niche start-up firms. Silicon Valley, Israel’s Silicon Wadi, India’s tech capital Bangalore and Shanghai in China are now among the favored destinations for scouting visits by insurance chief innovation officers.

One example of a strategic collaboration is the MGA Attune, set up last year by AIG, Hamilton Insurance Group, and affiliates of Two Sigma Investments. Through the partnership, AIG gained access to Two Sigma’s vast technology and data-science capabilities to grow its market share in the U.S. small to mid-sized commercial insurance space.

“The challenge for the industry is to remain relevant to our customers,” says Steel. “Those that fail to adapt will get left behind. To succeed you’re going to need greater information about the underlying risk, the ability to package the risk in a different way, to select the appropriate risks, differentiate more, and construct better portfolios.”

Investment in technology in and of itself is not the solution, thinks Swallow. He thinks there has been too much focus on process and not enough on product design. “Insurtech is an amazing opportunity but a lot of people seem to spend time looking at the fulfilment of the product – what ‘Chily’ [Swallow’s business partner and industry guru Grahame Chilton] would call ‘plumbing’.

“In our industry, there is still so much attention on the ‘plumbing’ and the fact that the plumbing doesn’t work, that insurtech isn’t yet really focused on compliance, regulation of product, which is where all the real gains can be found, just as they have been in the capital markets,” adds Swallow.

Taking out the friction

Blockchain, however, states Swallow, is “plumbing on steroids.” “Blockchain is nothing but pure, unadulterated disintermediation. My understanding is that if certain events happen at the beginning of the chain, then there is a defined outcome that actually happens without any human intervention at the other end of the chain.”
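Swallow’s description, in which agreed inputs at one end of the chain produce a defined payout at the other with no human intervention, is essentially a deterministic trigger function. The sketch below is ordinary Python used purely as illustration; a production version would be written as a smart contract on a distributed ledger, and the trigger metric and amounts are hypothetical.

```python
"""Conceptual sketch of the deterministic logic Swallow describes.

This is plain Python, not a smart contract: it only illustrates that, given
agreed inputs, the outcome is fully determined with no human intervention.
The trigger metric and amounts are hypothetical."""

from dataclasses import dataclass

@dataclass(frozen=True)
class ParametricPolicy:
    trigger_wind_speed_mph: float  # agreed, objectively observable metric
    payout_usd: float              # fixed payout if the trigger is met

def settle(policy: ParametricPolicy, observed_wind_speed_mph: float) -> float:
    """Deterministic settlement: the same inputs always give the same outcome."""
    if observed_wind_speed_mph >= policy.trigger_wind_speed_mph:
        return policy.payout_usd
    return 0.0

policy = ParametricPolicy(trigger_wind_speed_mph=110.0, payout_usd=1_000_000.0)
print(settle(policy, observed_wind_speed_mph=118.0))  # 1000000.0
print(settle(policy, observed_wind_speed_mph=95.0))   # 0.0
```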

In January, Aegon, Allianz, Munich Re, Swiss Re, and Zurich launched the Blockchain Insurance Industry Initiative, a “$5 billion opportunity” according to PwC. The feasibility study will explore the potential of distributed ledger technologies to better serve clients through faster, more convenient and secure services.

“BLOCKCHAIN FOR THE REINSURANCE SPACE IS AN EFFICIENCY TOOL. AND IF WE ALL GET MORE EFFICIENT, YOU ARE ABLE TO INCREASE INSURABILITY BECAUSE YOUR PRICES COME DOWN”

— KURT KARL, SWISS RE

Blockchain offers huge potential to reduce some of the significant administrative burdens in the industry, thinks Kurt Karl, chief economist at Swiss Re. “Blockchain for the reinsurance space is an efficiency tool. And if we all get more efficient, you are able to increase insurability because your prices come down, and you can have more affordable reinsurance and therefore more affordable insurance. So I think we all win if it’s a cost saving for the industry.”

Collaboration will enable those with scale to behave like nimble start-ups, explains Karl. “We like scale. We’re large. I’ll be blunt about that,” he says. “For the reinsurance space, what we do is to leverage our size to differentiate ourselves. With size, we’re able to invest in all these new technologies and then understand them well enough to have a dialogue with our clients. The nimbleness doesn’t come from small insurers; the nimbleness comes from insurance tech start-ups.”

He gives the example of Lemonade, the peer-to-peer start-up insurer that launched last summer, selling discounted homeowners’ insurance in New York. Working off the premise that insurance customers lack trust in the industry, Lemonade’s business model is based around returning premium to customers when claims are not made. In its second round of capital raising, Lemonade secured funding from XL Group’s venture fund, which is also a reinsurance partner of the innovative new firm. Lemonade is also able to offer faster, more efficient claims processing.

“Lemonade’s [business model] is all about efficiency and the cost saving,” says Karl. “But it’s also clearly of benefit to the client, which is a lot more appealing than a long, drawn-out claims process.”

Tearing up the rule book

By collecting and utilizing data from customers and third parties, personal lines insurers are now able to offer more customized products and, in many circumstances, improve the underlying risk. Customers can win discounts for protecting their homes and other assets, maintaining a healthy lifestyle and driving safely. In a world where products are increasingly designed with the digital native in mind, drivers can pay-as-they-go and property owners can access cheaper home insurance via peer-to-peer models.

Reinsurers may be one step removed from this seismic shift in how the original risk is perceived and underwritten, but just as personal lines insurers are tearing up the rule book, so too are their risk partners. It is over 300 years since the first marine and fire insurance policies were written. In that time (re)insurance has expanded significantly with a range of property, casualty, and specialty products.

However, the wordings contained in standard (re)insurance policies, the involvement of a broker in placing the business and the face-to-face transactional nature of the business – particularly within the London market – have not altered significantly over the past three centuries. Some are questioning whether these traditional indemnity products are the right solution for all classes of risk.

“We think people are often insuring cyber against the wrong things,” says Dane Douetil, group CEO of Minova Insurance. “They probably buy too much cover in some places and not nearly enough in areas where they don’t really understand they’ve got a risk. So we’re starting from the other way around, which is actually providing analysis about where their risks are and then creating the policy to cover it.”

“There has been more innovation in intangible type risks, far more in the last five to ten years than probably people give credit for. Whether you’re talking about cyber, product recall, new forms of business interruption, intellectual property or the huge growth in mergers and acquisition coverages against warranty and indemnity claims – there’s been a lot of development in all of those areas and none of that existed ten years ago.”

Closing the gap

Access to new data sources along with the ability to interpret and utilize that information will be a key instrument in improving the speed of settlement and offering products that are fit for purpose and reflect today’s risk landscape. “We’ve been working on a product that just takes all the information available from airlines, about delays and how often they happen,” says Karl. “And of course you can price off that; you don’t need the loss history, all you need is the probability of the loss, how often does the plane have a five-hour delay?”

“All the travel underwriters then need to do is price it ‘X’, and have a little margin built-in, and then they’re able to offer a nice new product to consumers who get some compensation for the frustration of sitting there on the tarmac.”
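Karl’s pricing logic can be written down in a couple of lines: the pure premium is the probability of the trigger event multiplied by the fixed payout, with a loading on top for expenses and margin. A minimal sketch with hypothetical figures; the delay probability, payout and loading below are not taken from any actual product.

```python
"""Parametric flight-delay pricing sketch (hypothetical figures).

Pure premium = probability of the trigger event x fixed payout; a loading
is added for expenses and margin. No loss history is needed, only the
delay probability estimated from airline punctuality data."""

def parametric_premium(p_trigger: float, payout: float, loading: float) -> float:
    """Price a fixed-payout parametric cover."""
    pure_premium = p_trigger * payout
    return pure_premium * (1.0 + loading)

# Hypothetical inputs: 1.5% of flights on this route are delayed 5+ hours,
# the product pays a flat 300, and the insurer loads 25% for expenses/margin.
print(parametric_premium(p_trigger=0.015, payout=300.0, loading=0.25))  # 5.625
```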

With more esoteric lines of business such as cyber, parametric products could be one solution to providing meaningful coverage for a rapidly evolving corporate risk. “The corporates of course want indemnity protection, but that’s extremely difficult to do,” says Karl. “I think there will be some of that but also some parametric, because it’s often a fixed payout that’s capped and is dependent upon the metric, as opposed to indemnity, which could well end up being the full value of the company. Because you can potentially have a company destroyed by a cyber-attack at this point.”

One issue to overcome with parametric products is basis risk: the risk that an insured suffers a significant loss of income, but its cover is not triggered. However, as data and risk management improve, the concerns surrounding basis risk should diminish.

Improving the underlying risk

The evolution of the cyber (re)insurance market also points to a new opportunity in a data-rich age: pre-loss services. By tapping into a wealth of claims and third-party data sources, successful (re)insurers of the future will be in an even stronger position to help their insureds become resilient and incident-ready. In cyber, these services are already part of the package and include security consultancy, breach-response services and simulated cyber attacks to test the fortitude of corporate networks and raise awareness among staff. “We’ve heard about the three ‘Vs’ when harnessing data – velocity, variety, and volume – in our industry we need to add a fourth, veracity,” says Yau. “When making decisions around which risks to write, our clients need to have allocated the right capital to back that decision or show regulators what parameters fed that decision.”

“WE DO A DISSERVICE TO OUR INDUSTRY BY SAYING THAT WE’RE NOT INNOVATORS, THAT WE’RE STUCK IN THE PAST”

— DANE DOUETIL, MINOVA INSURANCE

IoT is not just an instrument for personal lines. Just as insurance companies are utilizing data collected from connected devices to analyze individual risks and feed back information to improve the risk, (re)insurers also have an opportunity to utilize third-party data. “GPS sensors on containers can allow insurers to monitor cargo as it flows around the world – there is a use for this technology to help mitigate and manage the risk on the front end of the business,” states Steel.

Information is only powerful if it is analyzed effectively and available in real-time as transactional and pricing decisions are made, thinks RMS’ Steel. “The industry is getting better at using analytics and ensuring the output of analytics is fed directly into the hands of key business decision makers.”

“It’s about using things like portfolio optimization, which even ten years ago would have been difficult,” he adds. “As you’re using the technologies that are available now you’re creating more efficient capital structures and better, more efficient business models.”

Minova’s Douetil thinks the industry is stepping up to the plate. “Insurance is effectively the oil that lubricates the economy,” he says. “Without insurance, as we saw with the World Trade Center disaster and other catastrophes, the whole economy could come to a grinding halt pretty quickly if you take the ‘oil’ away.”

“That oil has to continually adapt and be innovative in terms of being able to serve the wider economy,” he continues. “But I think we do a disservice to our industry by saying that we’re not innovators, that we’re stuck in the past. I just think about how much this business has changed over the years.”

“It can change more, without a doubt, and there is no doubt that the communication capabilities that we have now mean there will be a shortening of the distribution chain,” he adds. “That’s already happening quite dramatically and in the personal lines market, obviously even more rapidly.”