The future for flood protection

With innovation in the flood market increasing, EXPOSURE explores whether high-definition (HD) flood models are one of the keys to closing the protection gap

In August 2017, Hurricane Harvey brought the highest level of rainfall associated with a tropical cyclone in the U.S. since records began, causing catastrophic flooding in some of the most populated areas of the Texas coast, including Houston. The share of losses attributed to inland flooding rather than wind damage was significant, challenging the historical view that precipitation from a tropical storm or hurricane is an attritional loss and highlighting the need for stochastic modeling.

Total economic losses resulting from Harvey were around US$85 billion and insured losses were US$30 billion, revealing a significant protection gap, particularly where inland flood damage was concerned. Around 200,000 homes were inundated by the floods, yet 80 percent of homes in the Houston area were uninsured against flood.
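For context, the protection gap implied by those figures can be expressed with a simple back-of-the-envelope calculation. The short Python sketch below uses only the approximate loss estimates quoted above; it is illustrative, not an official figure.

```python
# Rough protection-gap arithmetic using the Harvey loss estimates cited above.
economic_loss = 85.0  # approximate total economic loss, USD billions
insured_loss = 30.0   # approximate insured loss, USD billions

uninsured_loss = economic_loss - insured_loss
gap_share = uninsured_loss / economic_loss

print(f"Uninsured loss: ~US${uninsured_loss:.0f}bn ({gap_share:.0%} of the economic loss)")
```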


Now, an innovative catastrophe bond suggests one way this protection gap could be reduced in the future, particularly as a private flood insurance market develops in the U.S. FloodSmart Re, which was announced at the end of July, secured US$500 million of reinsurance protection on behalf of FEMA’s National Flood Insurance Program (NFIP). Reinsurer Hannover Re was acting as the ceding reinsurer for the transaction, sitting between the NFIP and its Bermuda-based special purpose insurer.

“It’s a landmark transaction — the first time in history that the U.S. federal government is sponsoring a catastrophe bond,” says John Seo, co-founder and managing principal at Fermat Capital. “It’s just tremendous and I couldn’t be more excited. Events like Harvey are going to accelerate the development of the flood market in terms of risk transfer to the insurance-linked securities (ILS) market.

“You have to have more efficient risk pooling and risk sharing mechanisms,” he adds. “There’s over US$200 trillion dollars of capital in the world, so there’s obviously enough to efficiently absorb event risk. So, it’s about, how do you get it out into that larger capital base in an efficient way?”

While the bond only provides cover for flooding arising from named storms, either due to storm surge or rainfall, it is a “good test case for the ILS market’s appetite for flood risks,” according to ILS blog Artemis. While “it is not a broad flood coverage, it will likely help to make it more palatable to cat bond investors given their comfort with modeling the probability of named storms, tropical storms and hurricanes.”

According to Cory Anger, global head of ILS origination and structuring at GC Securities, the ILS market is certainly showing an appetite for flood risk — including inland flood risk — with several catastrophe bonds completed over the last year for European flood risk (Generali’s Lion II), Japanese flood risk (MSI and ADI’s Akibare Series 2018-1 Notes) and U.S. flood risk.

“Both public and private sector entities see value from utilizing capital markets’ capacity to manage flood risk,” she says. “We think there are other geographic regions that would be interested in ILS capacity that haven’t yet tapped the ILS markets. Given the recent success of FEMA/NFIP’s FloodSmart Re Series 2018-1 Notes, we expect FEMA/NFIP to continue to utilize ILS capacity (along with traditional reinsurance capital) to support future U.S. flood risk transfer opportunities.”

The ILS sector has grown significantly over the past 15 years, with deals becoming more complex and innovative over time. Many market commentators feel the market was put to the test following the major natural catastrophe losses in 2017. Not only did bonds pay out where they were triggered, but fresh capital also re-entered the market, demonstrating investors’ confidence in the sector and its products.

“I’m hearing people starting to coin the phrase that 2018 is the ‘great reload,’” says Seo. “This is something I have been saying for quite some years: That the traditional hard-soft, soft-hard market cycle is over. It’s not that you can’t have an event so large that it doesn’t impact the market, but when it comes to capital markets, high yields are actually a siren call for capital.

“I don’t think anyone doubts that had 2017 occurred in the absence of the ILS market it would have been a completely different story, and we would have had a traditional hard market scenario in 2018,” he adds.

FloodSmart Re has clearly demonstrated the strong investor interest in such transactions. According to Anger, GC Securities acted as the structuring agent for the transaction and was one of two book runners. More than 35 capital markets investors provided fully collateralized protection to FEMA/NFIP on the landmark catastrophe bond.

“The appetite for new perils is generally strong, so there’s always strong interest when new risks are brought to market,” says Ben Brookes, managing director of capital and resilience solutions at RMS.

He thinks improvements in the underlying data quality along with high-definition flood models make it more likely that inland flood could be included as a peril in future catastrophe bond issuances on behalf of private insurers, on an indemnity basis.

“In the early days of the cat bond market, new perils would typically be issued with parametric triggers, because investors were skeptical that sufficient data quality was achieved or that the indemnity risks were adequately captured by cat models. But that changed as investor comfort grew, and a lot of capital entered the market and you saw all these deals becoming indemnity. Increased comfort with risk modeling was a big part of that.”

The innovative Blue Wings catastrophe bond, which covered insurer Allianz for severe U.K. flood risk (and some U.S. and Canadian quake) and was completed in 2007, is a good example. The parametric bond, which used an index based on flood depths at over 50 locations across the U.K., was ahead of its time and remains the only U.K. flood catastrophe bond to have come to market.

According to Anger, as models have become more robust for flood risk — whether due to tropical cyclone (storm surge and excess precipitation) or inland flooding (other than from tropical cyclone) — the investor base has been open to trigger selection (e.g., indemnity or parametric).

“In general, insurers are preferring indemnity-triggered solutions,” she adds, “which the ILS market has concurrently been open to. Additionally, for this peril, the ILS community has been open to per occurrence and annual aggregate structures, which gives flexibility to sponsors to incorporate ILS capital in their risk transfer programs.”

As the private market develops, cat bond sponsors from the insurance market would be more likely to bundle inland flood risk in with other perils, thinks Charlotte Acton, director of capital and resilience solutions at RMS. “A degree of hurricane-induced inland flood risk is already present on a non-modeled basis within some transactions in the market,” she says. “And Harvey illustrates the value in comprehensive modeling of flooding associated with named storms.

“So, for a broader portfolio, in most cases, inland flood would be one piece of the picture as it will be exposed to multiple perils. However, a stand-alone inland flood bond is possible for a public sector or corporate sponsor that has specific exposure to flood risk.”

With inland flood, as with all other perils, sophisticated models help to make markets. “A fund would look at the risk in and of itself in the deal, but of course they’d also want to understand the price and returns perspective as well,” says Brookes. “Models play into that quite heavily. You can’t price a bond well, and understand the returns of a bond, unless you understand the risk of it.”

As the ILS market makes increasing use of indemnity protection through ultimate net loss (UNL) triggers, sophisticated HD flood modeling will be essential in order to transfer the peril to the capital markets. This allows clear parameters to be set around different hours clauses and deductible structures, for instance, in addition to modeling all causes of flood and the influence of local defenses.
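To illustrate why a modeled loss distribution sits at the heart of that pricing exercise, the sketch below estimates the expected loss and attachment probability of an indemnity (UNL) layer from simulated annual portfolio losses. The loss distribution and layer terms are invented for illustration; they are not taken from any RMS model or real transaction.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical simulated annual ultimate net losses (UNL) for a flood-exposed
# portfolio, in USD millions. In practice these would come from an HD flood model.
annual_losses = rng.lognormal(mean=3.0, sigma=1.2, size=100_000)

attachment, exhaustion = 150.0, 250.0  # illustrative layer: 100m xs 150m

# Loss to the layer in each simulated year
layer_losses = np.clip(annual_losses - attachment, 0.0, exhaustion - attachment)

expected_loss = layer_losses.mean() / (exhaustion - attachment)  # share of limit
attach_prob = (annual_losses > attachment).mean()
exhaust_prob = (annual_losses > exhaustion).mean()

print(f"Expected loss: {expected_loss:.2%} of limit")
print(f"Attachment probability: {attach_prob:.2%}, exhaustion probability: {exhaust_prob:.2%}")
```

An investor would then weigh that modeled expected loss against the coupon on offer to judge the risk-adjusted return of the bond.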


Jillian Williams, head of portfolio analysis at Leadenhall Capital Partners, notes that ILS is increasingly bundling together multiple perils in an effort to gain diversification.

“Diversification is important for any investment strategy, as you are always trying to minimize the risk of losing large amounts in one go,” she says. “Cat bonds (144A’s) currently have defined perils, but collateralized reinsurance and private cat bonds can cover all perils. Complexities and flow of information to all parties will be a challenge for cat bonds to move from defined perils to UNL all perils.

“Any new peril or structure in a cat bond will generate many questions, even if they don’t have a major impact on the potential losses,” she continues. “Investors will want to know why the issuers want to include these new perils and structures and how the associated risk is calculated. For UNL, all flood (not just sea surge) would be included in the cat bond, so the definition of the peril, its complexities, variables and its correlation to other perils will need to be evaluated and represented in the flood models used.”

She thinks the potential to transfer more flood risk to the capital markets is there, but that the complexities of the peril are challenges that need to be overcome, particularly in the U.S. “Flood coverage is already starting to move into the capital markets, but there are many issues that need to be worked through before it can be moved to a 144A transaction in a UNL format for many territories,” says Williams. “Just one of the complexities is that flood risk may be covered by government pools.

“To move flood perils from government pools to private insurers is like any evolution, it can take time, particularly if existing coverage is subsidized,” she adds. “For private insurers, the complexity is not just about flood modeling but also about ensuring risk-adequate pricing and navigating through government legislation.”


When the lights went out

How poor infrastructure, grid blackouts and runaway business interruption have hampered Puerto Rico’s recovery in the aftermath of Hurricane Maria

As the 2018 North Atlantic hurricane season continues, Puerto Rico has yet to recover from the destructive events of the previous year. In September 2017, Category 4 Hurricane Maria devastated several Caribbean islands, including Puerto Rico, and left a trail of destruction in its path. For many, Maria was one of the worst natural catastrophes to hit a U.S. territory, causing an estimated US$65 billion to US$115 billion in damage and claiming as many as 4,500 to 5,000 lives.

The damage wrought has further strained the island’s sluggish economy. Puerto Rico had over US$70 billion in public debt when Maria hit. Economic forecasts for 2018 to 2020, considering the impact of Hurricane Maria, suggest Puerto Rico’s GDP will decline by 7 to 8 percent in 2018 and likely remain in a negative range of 5 to 7 percent for the next few years.


Power outages, business interruption (BI) and contingent BI (CBI) — including supply chain disruption — have hampered the economy’s recovery. “Resilience is also about the financial capacity to come back and do the reconstruction work,” explains Pooya Sarabandi, global head of data analytics at RMS. “You’re now into this chicken-and-egg situation where the Puerto Rican government already has a lot of public debt and doesn’t have reserves, and meanwhile the federal U.S. government is only willing to provide a certain level of funding.”

Maria’s devastating impact on Puerto Rico demonstrates the lasting effect a major catastrophe can have when it affects a small, isolated region with a concentrated industry and lack of resilience in infrastructure and lifelines. Whereas manufacturers based on the U.S. mainland have contingencies to tap into — the workforce, raw materials and components, and infrastructure in other parts of the country during times of need — there is not the same opportunity to do this on an island, explains Sarabandi.

Rolling blackouts

Following Maria’s landfall, residences and businesses experienced power outages throughout the island. Severe physical damage to electric power generation plants, transmission and distribution systems — including solar and wind power generation plants — plunged the island into a prolonged period of rolling blackouts.

Around 80 percent of utility poles were damaged in the event, leaving most of the island without electricity. Two weeks after the storm, 90 percent of the island was still without power. A month on, roughly 85 percent of customers were not connected to the power grid. Three months later, roughly half of Puerto Ricans were still without power. And even six months on, about 15 percent of residents did not have electricity.

“There’s no real damage on the grid itself,” says Victor Roldan, head of Caribbean and Latin America at RMS. “Most of the damage is on the distribution lines around the island. Where they had the better infrastructure in the capital, San Juan, they were able to get it back up and running in about two weeks. But there are still parts of the island without power due to bad distribution infrastructure. And that’s where the business interruption is mostly coming from.

“There are reports that 50 percent of all Maria claims for Puerto Rico will be CBI related,” adds Roldan. “Insurers were very competitive, and CBI was included in commercial policies without much thought to the consequences. Policyholders probably paid a fifth of the premiums they should have, way out of kilter with the risk. The majority of CBI claims will be power related, the businesses didn’t experience physical damage, but the loss of power has hit them financially.”

Damage to transportation infrastructure, including railways and roads, further delayed the pace of recovery. The Tren Urbano, the island’s only rail line, which serves the San Juan metropolitan area (where roughly 60 percent of Puerto Ricans live), started limited service for the first time almost three months after Hurricane Maria struck. There were over 1,500 reported instances of damage to roads and bridges across the island. San Juan’s main airport, the busiest in the Caribbean, was closed for several weeks.


A concentration of risk

Roughly half of Puerto Rico’s economy is based on manufacturing activities, with around US$50 billion in GDP coming from industries such as pharmaceuticals, medical devices, chemicals, food, beverages and tobacco. Hurricane Maria had a significant impact on manufacturing output in Puerto Rico, particularly on the pharmaceutical and medical devices industries, which together account for 30 percent of the island’s GDP.

According to Anthony Phillips, chairman of Willis Re Latin America and Caribbean, the final outcome of the BI loss remains unknown but has exceeded expectations due to the length of time taken to restore power. “It’s hard to model the BI loss when you depend on the efficiency of the power companies,” he says. “We used the models and whilst personal lines appeared to come in within expectations, commercial lines has exceeded them. This is mainly due to BI and the inability of the Puerto Rico Electric Power Authority (PREPA) to get things up and running.”

Home to more than 80 pharmaceutical manufacturing facilities, many of which are operated by large multinational companies, Puerto Rico’s pharmaceutical hub was a significant aggregation of risk from a supply chain and insurance perspective. Although only a few of the larger pharmaceutical plants were directly damaged by the storm, operations across the sector were suspended or reduced, in some cases for weeks or even months, due to power outages, lack of access and logistics.


“The perception of the BI that insurers anticipated, versus the reality, was a complete mismatch,” says Mohsen Rahnama, chief risk modeling officer at RMS. “All the big names in pharmaceuticals have operations in Puerto Rico because it’s more cost-effective for production. And they’re all global companies and have backup processes in place and cover for business interruption. However, if there is no diesel on the island for their generators, and if materials cannot get to the island, then there are implications across the entire chain of supply.”

While most of the plants were equipped with backup power generation units, manufacturers struggled due to the long-term lack of connection to the island’s only power grid. The continuous functioning of on-site generators was not only key to resuming production lines; power was also essential for refrigeration and storage of the pharmaceuticals. Five months on, 85 medicines in the U.S. were classified by the Food and Drug Administration (FDA) as “in shortage.”

There are several reasons why Puerto Rico’s recovery stalled. Its isolation from the U.S. mainland and poor infrastructure were both key factors, highlighted by comparing the island’s recovery to recovery operations following U.S. mainland storms, such as Hurricane Harvey in Texas last year and 2012’s Superstorm Sandy.

Not only did Sandy impact a larger area when it hit New York and New Jersey, it also caused severe damage to all transmission and distribution systems in its path. However, recovery and restoration took weeks, not months.

It is essential to incorporate the vulnerabilities created by an aggregation of risk, inadequate infrastructure and lack of contingency options into catastrophe and pricing models, thinks Roldan. “There is only one power company and the power company is facing bankruptcy,” he says. “It hasn’t invested in infrastructure in years. Maria wasn’t even the worst-case scenario because it was not a direct hit to San Juan. So, insurers need to be prepared and to underwrite business interruption risks in a more sophisticated manner, not succumbing to market pressures.”


CBI impact on hospitality and tourism

Large-magnitude, high-consequence events have a lasting impact on local populations. Businesses can face increased levels of disruption and loss of revenue due to unavailability of customers, employees or both. These resourcing issues need to be properly considered in the scenario-planning stage, particularly for sectors such as hospitality and tourism.

Puerto Rico’s hospitality and tourism sectors are a significant source of its GDP. While 69 percent of hotels and 61 percent of casinos were operational six weeks after Maria struck, according to the Puerto Rico Tourism Company, other factors continued to deter visitors. 

It was not until the end of February 2018, five months after the event, that roughly 80 percent of Puerto Rico’s hotels and restaurants were back in business with tourists returning to the island. This suggests a considerable loss of income due to indirect business interruption in the hospitality and tourism industry. 


IFRS 17: Under the microscope

How new accounting standards could reduce demand for reinsurance as cedants are forced to look more closely at underperforming books of business

They may not be coming into effect until January 1, 2021, but the new IFRS 17 accounting standards are already shaking up the insurance industry. And they are expected to have an impact on the January 1, 2019, renewals as insurers ready themselves for the new regime.

Crucially, IFRS 17 will require insurers to recognize immediately the full loss on any unprofitable insurance business. “The standard states that reinsurance contracts must now be valued and accounted for separate to the underlying contracts, meaning that traditional ‘netting down’ (gross less reinsured) and approximate methods used for these calculations may no longer be valid,” explained PwC partner Alex Bertolotti in a blog post.

“Even an individual reinsurance contract could be material in the context of the overall balance sheet, and so have the potential to create a significant mismatch between the value placed on reinsurance and the value placed on the underlying risks,” he continued.

“This problem is not just an accounting issue, and could have significant strategic and operational implications as well as an impact on the transfer of risk, on tax, on capital and on Solvency II for European operations.”

In fact, the requirements under IFRS 17 could lead to a drop in reinsurance purchasing, according to consultancy firm Hymans Robertson, as cedants are forced to question why they are deriving value from reinsurance rather than the underlying business on unprofitable accounts. “This may dampen demand for reinsurance that is used to manage the impact of loss making business,” it warned in a white paper.

Cost of compliance

The new accounting standards will also be a costly compliance burden for many insurance companies. Ernst & Young estimates that firms with over US$25 billion in Gross Written Premium (GWP) could be spending over US$150 million preparing for IFRS 17.

Under the new regime, insurers will need to account for their business performance at a more granular level. In order to achieve this, it is important to capture more detailed information on the underlying business at the point of underwriting, explained Corina Sutter, director of government and regulatory affairs at RMS.

This can be achieved by deploying systems and tools that allow insurers to capture, manage and analyze such granular data in increasingly high volumes, she said. “It is key for those systems or tools to be well-integrated into any other critical data repositories, analytics systems and reporting tools.

“From a modeling perspective, analyzing performance at contract level means precisely understanding the risk that is being taken on by insurance firms for each individual account,” continued Sutter. “So, for P&C lines, catastrophe risk modeling may be required at account level. Many firms already do this today in order to better inform their pricing decisions. IFRS 17 is a further push to do so.

“It is key to use tools that not only allow the capture of the present risk, but also the risk associated with the future expected value of a contract,” she added. “Probabilistic modeling provides this capability as it evaluates risk over time.”
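As a deliberately simplified illustration of account-level catastrophe analytics, the sketch below computes an average annual loss (AAL) for a single account from a toy event loss table. The event rates and losses are invented, and this is not an IFRS 17 calculation; it is simply the kind of contract-level risk metric that could feed the granular performance views Sutter describes.

```python
# Toy account-level event loss table: (annual event rate, expected loss if it occurs).
event_loss_table = [
    (0.02, 1_500_000),  # rare, severe event
    (0.10, 250_000),    # moderate event
    (0.50, 20_000),     # frequent, small event
]

# Average annual loss: sum of rate-weighted expected losses across events.
aal = sum(rate * loss for rate, loss in event_loss_table)
print(f"Account-level AAL: {aal:,.0f}")
```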


What one thing would... help improve the understanding of uncertainty when assessing risks?

Gregory Lowe

Global Head of Resilience and Sustainability, Aon

One thing that aspects of climate change are telling us is that past experience may not be reflective of what the future holds. Whether that means greater or fewer losses, we don’t always know as there are so many variables at play. But it is clear that as more uncertainty and complexity is introduced into a system, this creates a society that’s very vulnerable to shocks.

There is complexity at the climate level — because we are in uncharted territory with feedback loops, etc. — and complexity within the society that we’ve built around us, which is so dependent on interlinked infrastructure and technology. One story around Florida has been that the improvement in building codes since Hurricane Andrew has made a tremendous difference to the losses.

There is also this trade-off in how you deal with exposure to multiple hazards and underwrite that risk. So, if you’re making a roof wind resistant does that have an impact on seismic resistance? Does one peril exacerbate another? In California, we’ve seen some large flood events and wildfires, and there’s a certain interplay there when you experience extremes from one side and the other.

We can’t ignore the socio-economic as well as the scientific and climate-related factors when considering the risk. While the industry talks a lot about systemic risk, we are still a long way off from really addressing that. And you’re never going to underwrite systemic risk as such, but thinking about how one risk could potentially impact another is something that we all need to get better at.

Every discipline or industry is based upon a set of assumptions. And it’s not that we should necessarily throw our assumptions out the window, but we should have a sense of when we need to change those. Certainly, the assumption that you have this relatively stable environment with the occasional significant loss year is one to consider. Volatility is something I would expect to see a lot more of in the future.


David Flandro

Head of Global Analytics, JLT Re

It’s key for underwriters to understand the importance of the ranges in model outputs and to interpret the data as best they can. Of course, model vendors can help interpret the data, but at the end of the day it’s the underwriter who must make the decision. The models are there to inform underwriting decisions, not to make underwriting decisions. I think sometimes people use them for the latter, and that’s when they get into trouble.

There was noticeable skepticism around modeled loss ranges released in the wake of Hurricanes Harvey, Irma and Maria in 2017. So clearly, there was an opportunity to explore how the industry was using the models. What are we doing right? What could we be doing differently?

One thing that could improve catastrophe model efficacy is improving the way the models are understood. Better communication on the part of the modeling firms could improve outcomes. This may sound qualitative, but we’ve got a lot of very quantitative people in the industry and they don’t always get it right.

It’s also incumbent on the modeling firms to continue to learn to look at their own output empirically over a long period of time and understand where they got it right, where they got it wrong and then show everybody how they’re learning from it. And likewise, underwriters need to understand the modelers are not aiming for metaphysical accuracy, but for sensible estimates and ranges. These are supposed to be starting points, not endpoints.


Taking cloud adoption to the core

Insurance and reinsurance companies have been more reluctant than other business sectors to embrace Cloud technology. EXPOSURE explores why it is time to ditch “the comfort blanket”

The main benefits of Cloud computing are well-established and include scale, efficiency and cost effectiveness. The Cloud also offers economical access to huge amounts of computing power, ideal to tackle the big data/big analytics challenge. And exciting innovations such as microservices — allowing access to prebuilt, Cloud-hosted algorithms, artificial intelligence (AI) and machine learning applications, which can be assembled to build rapidly deployed new services — have the potential to transform the (re)insurance industry.

And yet the industry has continued to demonstrate a reluctance to move its core services onto a Cloud-based infrastructure. While a growing number of insurance and reinsurance companies are using Cloud services (such as those offered by Amazon Web Services, Microsoft Azure and Google Cloud) for nonessential office and support functions, most have been hesitant to consider Cloud for their mission-critical infrastructure.

In its research of Cloud adoption rates in regulated industries, such as banking, insurance and health care, McKinsey found, “Many enterprises are stuck supporting both their inefficient traditional data-center environments and inadequately planned Cloud implementations that may not be as easy to manage or as affordable as they imagined.”

No magic bullet

It also found that a “lift and shift” approach, where companies attempt to move existing, monolithic business applications to the Cloud, expecting them to be “magically endowed with all the dynamic features,” is not enough.

“We’ve come up against a lot of that when explaining the difference in what the RMS(one)® platform offers,” says Farhana Alarakhiya, vice president of products at RMS. “Basically, what clients are showing us is their legacy offering placed on a new Cloud platform. It’s potentially a better user interface, but it’s not really transforming the process.”

Now is the time for the market-leading (re)insurers to make that leap and really transform how they do business, she says. “It’s about embracing the new and different and taking comfort in what other industries have been able to do. A lot of Cloud providers are making it very easy to deliver analytics on the Cloud. So, you’ve got the story of agility, scalability, predictability, compliance and security on the Cloud and access to new analytics, new algorithms, use of microservices when it comes to delivering predictive analytics.”

This ease of tapping into highly advanced analytics and new applications, unburdened by legacy systems, makes the Cloud highly attractive. Hussein Hassanali, managing partner at VTX Partners, a division of Volante Global, commented: “Cloud can also enhance long-term pricing adequacy and profitability driven by improved data capture, historical data analytics and automated links to third-party market information. Further, the ‘plug-and-play’ aspect allows you to continuously innovate by connecting to best-in-class third-party applications.”

While moving from a server-based platform to the Cloud can bring numerous advantages, there is a perceived unwillingness to put high-value data into the environment, with concerns over security and the regulatory implications that brings.

This includes data protection rules governing whether or not data can be moved across borders. “There are some interesting dichotomies in terms of attitude and reality,” says Craig Beattie, analyst at Celent Consulting. “Cloud-hosting providers in western Europe and North America are more likely to have better security than (re)insurers do in their internal data centers, but the board will often not support a move to put that sort of data outside of the company’s infrastructure.

“Today, most CIOs and executive boards have moved beyond the knee-jerk fears over security, and the challenges have become more practical,” he continues. “They will ask, ‘What can we put in the Cloud? What does it cost to move the data around and what does it cost to get the data back? What if it fails? What does that backup look like?’”

With a hybrid Cloud solution, insurers that want to tap into the scalability and cost efficiencies of a software-as-a-service (SaaS) model, but are unwilling to relinquish their data sovereignty, can develop dedicated resources in which to place customer data alongside the Cloud infrastructure. But while a private or hybrid solution has been touted as a good compromise for insurers nervous about data security, these are also more costly options. The challenge is whether the end solution can match the big Cloud providers with global footprints that have compliance and data sovereignty issues already covered for their customers.

“We hear a lot of things about the Internet being cheap — but if you partially adopt the Internet and you’ve got significant chunks of data, it gets very costly to shift those back and forth,” says Beattie.

A Cloud-first approach

Not moving to the Cloud is no longer a viable option long term, particularly as competitors make the transition and competition and disruption change the industry beyond recognition. Given the increasing cost and complexity involved in updating and linking legacy systems and expanding infrastructure to encompass new technology solutions, Cloud is the obvious choice for investment, thinks Beattie.

“If you’ve already built your on-premise infrastructure based on classic CPU-based processing, you’ve tied yourself in and you’re committed to whatever payback period you were expecting,” he says. “But predictive analytics and the infrastructure involved is moving too quickly to make that capital investment. So why would an insurer do that? In many ways it just makes sense that insurers would move these services into the Cloud.

“State-of-the-art for machine learning processing 10 years ago was grids of generic CPUs,” he adds. “Five years ago, this was moving to GPU-based neural network analyses, and now we’ve got ‘AI chips’ coming to market. In an environment like that, the only option is to rent the infrastructure as it’s needed, lest we invest in something that becomes legacy in less time than it takes to install.”

Taking advantage of the power and scale of Cloud computing also advances the march toward real-time, big data analytics. Ricky Mahar, managing partner at VTX Partners, a division of Volante Global, added: “Cloud computing makes companies more agile and scalable, providing flexible resources for both power and space. It offers an environment critical to the ability of companies to fully utilize the data available and capitalize on real-time analytics. Running complex analytics using large data sets enhances both internal decision-making and profitability.”

As discussed, few (re)insurers have taken the plunge and moved their mission-critical business to a Cloud-based SaaS platform. But there are a handful. Among these first movers are some of the newer, less legacy-encumbered carriers, but also some of the industry’s more established players. The latter includes U.S.-based life insurer MetLife, which announced it was collaborating with IBM Cloud last year to build a platform designed specifically for insurers. Meanwhile Munich Re America is offering a Cloud-hosted AI platform to its insurer clients. “The ice is thawing and insurers and reinsurers are changing,” says Beattie. “Reinsurers [like Munich Re] are not just adopting Cloud but are launching new innovative products on the Cloud.”

What’s the danger of not adopting the Cloud? “If your reasons for not adopting the Cloud are security-based, this reason really doesn’t hold up any more. If it is about reliability, scalability, remember that the largest online enterprises such as Amazon, Netflix are all Cloud-based,” comments Farhana Alarakhiya. “The real worry is that there are so many exciting, groundbreaking innovations built in the Cloud for the (re)insurance industry, such as predictive analytics, which will transform the industry, that if you miss out on these because of outdated fears, you will damage your business. The industry is waiting for transformation, and it’s progressing fast in the Cloud.”


Are we moving off the baseline?

How is climate change influencing natural perils and weather extremes, and what should reinsurance companies do to respond?

Reinsurance companies may feel they are relatively insulated from the immediate effects of climate change on their business, given that most property catastrophe policies are renewed on an annual basis. However, with signs that we are already moving off the historical baseline when it comes to natural perils, there is evidence to suggest that underwriters should already be selectively factoring the influence of climate change into their day-to-day decision-making.

Most climate scientists agree that some of the extreme weather anticipated by the United Nations Intergovernmental Panel on Climate Change (IPCC) in 2013 is already here and can be linked to climate change in real time via the burgeoning field of extreme weather attribution. “It’s a new area of science that has grown up in the last 10 to 15 years,” explains Dr. Robert Muir-Wood, chief research officer at RMS. “Scientists run two climate models for the whole globe, both of them starting in 1950. One keeps the atmospheric chemistry static since then, while the other reflects the actual increase in greenhouse gases. By simulating thousands of years of these alternative worlds, we can find the difference in the probability of a particular weather extreme.”
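A minimal sketch of how such an attribution statement is typically quantified: the probability of exceeding an observed extreme is estimated in both the “factual” and the counterfactual ensembles, and the ratio of the two gives the change in likelihood. The numbers below are synthetic stand-ins, not output from any real climate model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for annual maximum rainfall (mm) from two model ensembles:
# a counterfactual world with 1950 atmospheric chemistry, and the actual world.
counterfactual = rng.gumbel(loc=300.0, scale=60.0, size=50_000)
factual = rng.gumbel(loc=330.0, scale=70.0, size=50_000)

threshold = 500.0  # an observed extreme, e.g. a Harvey-like rainfall total

p0 = (counterfactual >= threshold).mean()  # probability without greenhouse forcing
p1 = (factual >= threshold).mean()         # probability with it

print(f"P(extreme | counterfactual) = {p0:.3%}")
print(f"P(extreme | factual)        = {p1:.3%}")
print(f"Probability ratio: {p1 / p0:.1f}x more likely")
```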

"Underwriters should be factoring the influence of climate change into their day-to-day decision-making"

For instance, climate scientists have run their models in an effort to determine how much of the intensity of the precipitation that caused such devastating flooding during last year’s Hurricane Harvey can be attributed to anthropogenic climate change. Research conducted by scientists at the World Weather Attribution (WWA) project found that the record rainfall produced by Harvey was made at least three times more likely by the influence of global warming.

This suggests, for certain perils and geographies, reinsurers need to be considering the implications of an increased potential for certain climate extremes in their underwriting. “If we can’t rely on the long-term baseline, how and where do we modify our perspective?” asks Muir-Wood. “We need to attempt to answer this question peril by peril, region by region and by return period. You cannot generalize and say that all perils are getting worse everywhere, because they’re not. In some countries and perils there is evidence that the changes are already material, and then in many other areas the jury is out and it’s not clear.”

Keeping pace with the change

While the last IPCC report was published five years ago (the next one is due in 2019), there is some consensus on how climate change is beginning to influence natural perils and climate extremes. Many regional climates naturally have large variations at interannual and even interdecadal timescales, which makes observation of climate change, and validation of predictions, more difficult.

“There is always going to be uncertainty when it comes to climate change,” emphasizes Swenja Surminski, head of adaptation research at the Grantham Research Institute on Climate Change and the Environment, part of the London School of Economics and Political Science (LSE). “But when you look at the scientific evidence, it’s very clear what’s happening to temperature, how the average temperature is increasing, and the impact that this can have on fundamental things, including extreme events.”

According to the World Economic Forum’s Global Risks Report 2018, “Too little has been done to mitigate climate change and ... our own analysis shows that the likelihood of missing the Paris Agreement target of limiting global warming to two degrees Celsius or below is greater than the likelihood of achieving it.”

The report cites extreme weather events and natural disasters as the top two “most likely” risks to happen in the next 10 years and the second- and third-highest risks (in the same order) to have the “biggest impact” over the next decade, after weapons of mass destruction. The failure of climate change mitigation and adaptation is also ranked in the top five for both likelihood and impact. It notes that 2017 was among the three hottest years on record and the hottest ever without an El Niño.

It is clear that climate change is already exacerbating climate extremes, says Surminski, causing dry regions to become drier and hot regions to become hotter. “By now, based on our scientific understanding and also thanks to modeling, we get a much better picture of what our current exposure is and how that might be changing over the next 10, 20, even 50 to 100 years,” she says.

“There is also an expectation we will have more freak events, when suddenly the weather produces really unexpected, very unusual phenomena,” she continues. “That’s not just climate change. It’s also tied into El Niño and other weather phenomena occurring, so it’s a complex mix. But right now, we’re in a much better position to understand what’s going on and to appreciate that climate change is having an impact.”

Pricing for climate change

For insurance and reinsurance underwriters, the challenge is to understand the extent to which we have already deviated from the historical record and to manage and price for that appropriately. It is not an easy task given the inherent variability in existing weather patterns, according to Andy Bord, CEO of Flood Re, the U.K.’s flood risk pool, which has a panel of international reinsurers.

“The existing models are calibrated against data that already includes at least some of the impact of climate change,” he says. “Some model vendors have also recently produced models that aim to assess the impact of climate change on the future level of flood risk in the U.K. We know at least one larger reinsurer has undertaken their own climate change impact analyses.

“We view improving the understanding of the potential variability of weather given today’s climate as being the immediate challenge for the insurance industry, given the relatively short-term view of markets,” he adds.

The need for underwriters to appreciate the extent to which we may have already moved off the historical baseline is compounded by the conflicting evidence on how climate change is influencing different perils, and by the counterinfluence or confluence, in many cases, of naturally occurring climate patterns, such as El Niño and the Atlantic Multidecadal Oscillation (AMO).

The past two decades have seen below-normal European windstorm activity, for instance, and evidence is building that the unprecedented reduction in Arctic sea ice during the autumn months is the main cause, according to Dr. Stephen Cusack, director of model development at RMS. “In turn, the sea ice declines have been driven both by the ‘polar amplification’ aspect of anthropogenic climate change and the positive phase of the AMO over the past two decades, though their relative roles are uncertain.


“The (re)insurance market right now is saying, ‘Your model has higher losses than our recent experience.’ And what we are saying is that the recent lull is not well understood, and we are unsure how long it will last. Though for pricing future risk, the question is when, and not if, the rebound in European windstorm activity happens. Regarding anthropogenic climate change, other mechanisms will strengthen and counter the currently dominant ‘polar amplification’ process. Also, the AMO goes into positive and negative phases,” he continues. “It’s been positive for the last 20 to 25 years and that’s likely to change within the next decade or so.”

And while European windstorm activity has been somewhat muted by the AMO, the same cannot be said for North Atlantic hurricane activity. Hurricanes Harvey, Irma and Maria (HIM) caused an estimated US$92 billion in insured losses, making 2017 the second costliest North Atlantic hurricane season, according to Swiss Re Sigma. “The North Atlantic seems to remain in an active phase of hurricane activity, irrespective of climate change influences that may come on top of it,” the study states.

While individual storms are never caused by one factor alone, stressed the Sigma study, “Some of the characteristics observed in HIM are those predicted to occur more frequently in a warmer world.” In particular, it notes the high level of rainfall over Houston and hurricane intensification. While storm surge was only a marginal contributor to the losses from Hurricane Harvey, Swiss Re anticipates the probability of extreme storm surge damage in the northeastern U.S. due to higher seas will almost double in the next 40 years.

“From a hurricane perspective, we can talk about the frequency of hurricanes in a given year related to the long-term average, but what’s important from the climate change point of view is that the frequency and the intensity on both sides of the distribution are increasing,” says Dr. Pete Dailey, vice president at RMS. “This means there’s more likelihood of quiet years and more likelihood of very active years, so you’re moving away from the mean, which is another way of thinking about moving away from the baseline.

“So, we need to make sure that we are modeling the tail of the distribution really well, and that we’re capturing the really wet years — the years where there’s a higher frequency of torrential rain in association with events that we model.”
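A toy numerical illustration of that point, with invented figures: holding the mean annual storm count fixed while widening the distribution increases the probability of both very quiet and very active seasons.

```python
from scipy.stats import nbinom, poisson

mean_rate = 6.0  # hypothetical mean number of storms per season

# Baseline: Poisson counts (variance equals the mean).
quiet_base = poisson.cdf(2, mean_rate)   # P(2 or fewer storms)
active_base = poisson.sf(9, mean_rate)   # P(10 or more storms)

# "Wider" climate: negative binomial with the same mean but double the variance.
var = 12.0
p = mean_rate / var
n = mean_rate * p / (1 - p)
quiet_wide = nbinom.cdf(2, n, p)
active_wide = nbinom.sf(9, n, p)

print(f"Quiet seasons  (<=2 storms): {quiet_base:.1%} -> {quiet_wide:.1%}")
print(f"Active seasons (>=10 storms): {active_base:.1%} -> {active_wide:.1%}")
```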


The edge of insurability

Over the long term, the industry likely will be increasingly insuring the impact of anthropogenic climate change. One question is whether we will see “no-go” areas in the future, where the risk is simply too high for insurance and reinsurance companies to take on. As Robert Muir-Wood of RMS explains, there is often a tension between the need for (re)insurers to charge an accurate price for the risk and the political pressure to ensure cover remains available and affordable.

He cites the community at Queen’s Cove in Grand Bahama, where homes were unable to secure insurance given the repeated storm surge flood losses they have sustained over the years from a number of hurricanes. With owners unable to maintain a mortgage without insurance, properties were left to fall into disrepair. “Natural selection came up with a solution,” says Muir-Wood, whereby some homeowners elevated buildings on concrete stilts, thereby making them once again insurable.

“In high-income, flood-prone countries, such as Holland, there has been sustained investment in excellent flood defenses,” he says. “The challenge in developing countries is there may not be the money or the political will to build adequate flood walls. In a coastal city like Jakarta, Indonesia, where the land is sinking as a result of pumping out the groundwater, it’s a huge challenge. 

“It’s not black and white as to when it becomes untenable to live somewhere. People will find a way of responding to increased incidence of flooding. They may simply move their life up a level, as already happens in Venice, but insurability will be a key factor and accommodating the changes in flood hazard is going to be a shared challenge in coastal areas everywhere.”

Political pressure to maintain affordable catastrophe insurance was a major driver of the U.S. residual market, with state-backed Fair Access to Insurance Requirements (FAIR) plans providing basic property insurance for homes that are highly exposed to natural catastrophes. Examples include the California Earthquake Authority, Texas Windstorm Insurance Association and Florida Citizens Property Insurance Corporation (and state reinsurer, the FHCF).

However, the financial woes experienced by FEMA’s National Flood Insurance Program (NFIP), currently the principal provider of residential flood insurance in the U.S., demonstrate the difficulties such programs face in terms of being sustainable over the long term.

With the U.K.’s Flood Re scheme, investment in disaster mitigation is a big part of the solution, explains CEO Andy Bord. However, even then he acknowledges that “for some homes at the very greatest risk of flooding, the necessary investment needed to reduce risks and costs would simply be uneconomic.”  


Bringing Clarity to Slab Claims

How will a new collaboration between a major Texas insurer, RMS, Accenture and Texas Tech University provide the ability to determine with accuracy the source of slab claim loss?

The litigation surrounding “slab claims” in the U.S. in the aftermath of a major hurricane has long been an issue within the insurance industry. When nothing is left of a coastal property but the concrete slab on which it was built, how do claims handlers determine whether the damage was predominantly caused by water or wind?

The decision that many insurers take can spark protracted litigation, as was the case following Hurricane Ike, a powerful storm that caused widespread damage across the state after it made landfall over Galveston in September 2008. The storm had a very large footprint for a Category 2 hurricane, with sustained wind speeds of 110 mph and a 22-foot storm surge. Five years on, litigation surrounding how slab claim damage had been wrought rumbled on in the courts.

Recognizing the extent of the issue, major coastal insurers knew they needed to improve their methodologies. It sparked a new collaboration between RMS, a major Texas insurer, Accenture and Texas Tech University (TTU). And from this year, the insurer will be able to utilize RMS data, hurricane modeling methodologies, and software analyses to track the likelihood of slab claims before a tropical cyclone makes landfall and document the post-landfall wind, storm surge and wave impacts over time.

The approach will help determine the source of the property damage with greater accuracy and clarity, reducing the need for litigation post-loss, thus improving the overall claims experience for both the policyholder and insurer. To provide highly accurate wind field data, RMS has signed a contract with TTU to expand a network of mobile meteorological stations that are ultimately positioned in areas predicted to experience landfall during a real-time event.

“Our contract is focused on Texas, but they could also be deployed anywhere in the southern and eastern U.S.,” says Michael Young, senior director of product management at RMS. “The rapidly deployable weather stations collect peak and mean wind speed characteristics and transmit via the cell network the wind speeds for inclusion into our tropical cyclone data set. This is in addition to a wide range of other data sources, which this year includes 5,000 new data stations from our partner Earth Networks.”

The storm surge component of this project utilizes the same hydrodynamic storm surge model methodologies embedded within the RMS North Atlantic Hurricane Models to develop an accurate view of the timing, extent and severity of storm surge and wave-driven hazards post-landfall. Similar to the wind field modeling process, this approach will also be informed by ground-truth terrain and observational data, such as high-resolution bathymetry data, tide and stream gauge sensors and high-water marks.

“The whole purpose of our involvement in this project is to help the insurer get those insights into what’s causing the damage,” adds Jeff Waters, senior product manager at RMS. “The first eight hours of the time series at a particular location might involve mostly damaging surge, followed by eight hours of damaging wind and surge. So, we’ll know, for instance, that a lot of that damage that occurred in the first eight hours was probably caused by surge. It’s a very exciting and pretty unique project to be part of.”
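A highly simplified sketch of the idea Waters describes: scanning an hourly hazard time series at one location and flagging which peril plausibly drove damage in each window. The thresholds and values below are invented for illustration and do not represent the RMS methodology.

```python
# Hypothetical hourly hazard values at one coastal location after landfall:
# wind gust in mph and storm surge depth in feet above ground.
wind = [45, 60, 70, 75, 80, 95, 110, 115, 110, 100, 90, 75, 60, 50, 45, 40]
surge = [0.5, 1.5, 3.0, 4.5, 6.0, 6.5, 6.0, 5.0, 4.0, 3.0, 2.0, 1.0, 0.5, 0.2, 0.0, 0.0]

WIND_DAMAGE_MPH = 90   # illustrative damage-onset thresholds
SURGE_DAMAGE_FT = 2.0

for hour, (w, s) in enumerate(zip(wind, surge)):
    causes = []
    if s >= SURGE_DAMAGE_FT:
        causes.append("surge")
    if w >= WIND_DAMAGE_MPH:
        causes.append("wind")
    label = " + ".join(causes) if causes else "none"
    print(f"hour {hour:2d}: wind {w:3d} mph, surge {s:3.1f} ft -> damaging hazard: {label}")
```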


Assigning a Return Period to 2017

Hurricanes Harvey, Irma and Maria (HIM) tore through the Caribbean and U.S. in 2017, resulting in insured losses over US$80 billion. Twelve years after Hurricanes Katrina, Rita and Wilma (KRW), EXPOSURE asks if the (re)insurance industry was better prepared for its next ‘terrible trio’ and what lessons can be learned  

In one sense, 2017 was a typical loss year for the insurance industry in that the majority of losses stemmed from the “peak zone” of U.S. hurricanes. However, not since the 2004-05 season had the U.S. witnessed so many landfalling hurricanes. It was the second most costly hurricane season on record for the (re)insurance industry, when losses in 2005 are adjusted for inflation.

According to Aon Benfield, HIM caused total losses over US$220 billion and insured losses over US$80 billion — huge sums in the context of global catastrophe losses for the year of US$344 billion and insured losses of US$134 billion. Overall, weather-related catastrophe losses exceeded 0.4 percent of global GDP in 2017 (based on data from Aon Benfield, Munich Re and the World Bank), the second highest figure since 1990. In that period, only 2005 saw a higher relative catastrophe loss at around 0.5 percent of GDP.

But, it seems, (re)insurers were much better prepared to absorb major losses this time around. Much has changed in the 12 years since Hurricane Katrina breached the levees in New Orleans. Catastrophe modeling as a profession has evolved into exposure management, models and underlying data have improved and there is a much greater appreciation of model uncertainty and assumptions, explains Alan Godfrey, head of exposure management at Asta.

“Even post-2005 people would still see an event occurring, go to the models and pull out a single event ID ... then tell all and sundry this is what we’re going to lose. And that’s an enormous misinterpretation of how the models are supposed to be used. In 2017, people demonstrated a much greater maturity and used the models to advise their own loss estimates, and not the other way around.”

It also helped that the industry was extremely well-capitalized moving into 2017. After a decade of operating through a low interest rate and increasingly competitive environment, (re)insurers had taken a highly disciplined approach to capital management. Gone are the days when a major event sparked a series of run-offs. While some (re)insurers have reported higher losses than others, all have emerged intact.

“In 2017 the industry has performed incredibly well from an operational point of view,” says Godfrey. “There have obviously been challenges from large losses and recovering capital, but those are almost outside of exposure management.”

According to Aon Benfield, global reinsurance capacity grew by 80 percent between 1990 and 2017 (to US$605 billion), against global GDP growth of around 24 percent. The influx of capacity from the capital markets into U.S. property catastrophe reinsurance has also brought about change and innovation, offering new instruments such as catastrophe bonds for transferring extreme risks.


Much of this growth in non-traditional capacity has been facilitated by better data and more sophisticated analytics, along with a healthy appetite for insurance risk from pension funds and other institutional investors.

For insurance-linked securities (ILS), the 2017 North Atlantic hurricane season, Mexico’s earthquakes and California’s wildfires were the asset class’s first big test. “Some thought that once we had a significant year that capital would leave the market,” says John Huff, president and chief executive of the Association of Bermuda Insurers and Reinsurers (ABIR). “And we didn’t see that.

“In January 2018 we saw that capital being reloaded,” he continues. “There is abundant capital in all parts of the reinsurance market. Deploying that capital with a reasonable rate of return is, of course, the objective.”

Huff thinks the industry performed extremely well in 2017 in spite of the severity of the losses and a few surprises. “I’ve even heard of reinsurers that were ready with claim payments on lower layers before the storm even hit. The modeling and ability to track the weather is getting more sophisticated. We saw some shifting of the storms — Irma was the best example — but reinsurers were tracking that in real time in order to be able to respond.”

The Buffalo Bayou River floods a park in Houston after the arrival of Hurricane Harvey

How Harvey inundated Houston

One lesson the industry has learned over three decades of modeling is that models are approximations of reality. Each event has its own unique characteristics, some of which fall outside of what is anticipated by the models.

The widespread inland flooding that occurred after Hurricane Harvey made landfall on the Texas coastline is an important illustration of this, explains Huff. Even so, he adds, it continued a theme, with flood losses being a major driver of U.S. catastrophe claims for several years now. “What we’re seeing is flood events becoming the No. 1 natural disaster in the U.S. for people who never thought they were at risk of flood.”

Harvey broke all U.S. records for tropical cyclone-driven rainfall with observed cumulative rainfall of 51 inches (129 centimeters). The extreme rainfall generated by Harvey and the unprecedented inland flooding across southeastern Texas and parts of southern Louisiana was unusual.

However, nobody was overly surprised by the fact that losses from Harvey were largely driven by water versus wind. Prior events with significant storm surge-induced flooding, including Hurricane Katrina and 2012’s Superstorm Sandy, had helped to prepare (re)insurers, exposure managers and modelers for this eventuality. “The events themselves were very large but they were well within uncertainty ranges and not disproportionate to expectations,” says Godfrey.

“Harvey is a new data point — and we don’t have that many — so the scientists will look at it and know that any new data point will lead to tweaks,” he continues. “If anything, it will make people spend a bit more time on their calibration for the non-modeled elements of hurricane losses, and some may conclude that big changes are needed to their own adjustments.”

But, he adds: “Nobody is surprised by the fact that flooding post-hurricane causes loss. We know that now. It’s more a case of tweaking and calibrating, which we will be doing for the rest of our lives.”

Flood modeling

Hurricane Harvey also underscored the importance of the investment in sophisticated, probabilistic flood models. RMS ran its U.S. Inland Flood HD Model in real time to estimate expected flood losses. “When Hurricane Harvey happened, we had already simulated losses of that magnitude in our flood model, even before the event occurred,” says Dr. Pete Dailey, vice president of product management at RMS, who is responsible for U.S. flood modeling.

“The value of the model is to be able to anticipate extreme tail events well before they occur, so that insurance companies can be prepared in advance for the kind of risk they’re taking on and what potential claims volume they may have after a major event,” he adds.

Does this mean that a US$100 billion-plus loss year like 2017 is now a 1-in-6-year event?

Harvey has already offered a wealth of new data that will be fed into the flood model. The emergency shutdown of the Houston metropolitan area prevented RMS meteorologists and engineers from accessing the scene in the immediate aftermath, explains Dailey. However, once on the ground they gathered as much information as they could, observing and recording what had actually happened to affected properties.

“We go to individual properties to assess the damage visually, record the latitude and longitude of the property, the street address, the construction, occupancy and the number of stories,” he says. “We will also make an estimate of the age of the property. Those basic parameters allow us to go back and take a look at what the model would have predicted in terms of damage and loss, as compared to what we observed.”
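To show how such field observations might be lined up against model output, here is a minimal sketch in Python. It is purely illustrative: the field names, the example property and the damage-ratio comparison are assumptions made for this article, not the RMS reconnaissance workflow.

from dataclasses import dataclass

@dataclass
class SurveyedProperty:
    # Basic parameters recorded during post-event reconnaissance (illustrative field names)
    latitude: float
    longitude: float
    street_address: str
    construction: str             # e.g., "wood frame"
    occupancy: str                # e.g., "single-family residential"
    stories: int
    estimated_year_built: int
    observed_damage_ratio: float  # observed repair cost as a share of replacement value

def model_gap(prop: SurveyedProperty, modeled_damage_ratio: float) -> float:
    """Difference between modeled and observed damage (negative = model under-predicted)."""
    return modeled_damage_ratio - prop.observed_damage_ratio

# Hypothetical example: a two-story wood-frame home flooded by Harvey
home = SurveyedProperty(29.77, -95.38, "123 Example St, Houston, TX",
                        "wood frame", "single-family residential", 2, 1995, 0.35)
print(model_gap(home, modeled_damage_ratio=0.30))  # roughly -0.05: the model slightly under-predicts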

The fact that insured losses emanating from the flooding were only a fraction of the total economic losses is an inevitable discussion point. The majority of claims paid were for commercial properties, with residential properties falling under the remit of the National Flood Insurance Program (NFIP). Many residential homes were uninsured, however, explains ABIR’s Huff.

“The NFIP covers just the smallest amount of people — there are only five million policies — and yet you see a substantial event like Harvey which is largely uninsured because (re)insurance companies only cover commercial flood in the U.S.,” he says. “After Harvey you’ll see a realization that the private market is very well-equipped to get back into the private flood business, and there’s a national dialogue going on now.”

Is 2017 the new normal?

One question being asked in the aftermath of the 2017 hurricane season is: What is the return period for a loss year like 2017? RMS estimates that, in terms of U.S. and Caribbean industry insured wind, storm surge and flood losses, the 2017 hurricane season corresponds to a return period between 15 and 30 years.

However, losses on the scale of 2017 occur more frequently when considering global perils. Adjusted for inflation, it is seven years since the industry paid out a similar level of catastrophe claims — US$110 billion on the Tohoku earthquake and tsunami, Thai floods and New Zealand earthquake in 2011. Six years prior to that, hurricanes Katrina, Rita and Wilma (KRW) cost the industry in excess of US$75 billion (well over US$100 billion in today’s money).

So, does this mean that a US$100 billion-plus (or equivalent in inflation-adjusted terms) loss year like 2017 is now a one-in-six-year event? As wealth and insurance penetration grows in developing parts of the world, will we begin to see more loss years like 2011, where catastrophe claims are not necessarily driven by the U.S. or Japan peak zones?

“Increased insurance penetration does mean that on the whole losses will increase, but hopefully this is proportional to the premiums and capital that we are getting in,” says Asta’s Godfrey. “The important thing is understanding correlations and how diversification actually works and making sure that is applied within business models.

“In the past, people were able to get away with focusing on the world in a relatively binary fashion,” he continues. “The more people move toward diversified books of business, which is excellent for efficient use of capital, the more important it becomes to understand the correlations between different regions.”

“You could imagine in the future, a (re)insurer making a mistake with a very sophisticated set of catastrophe and actuarial models,” he adds. “They may perfectly take into account all of the non-modeled elements but get the correlations between them all wrong, ending up with another year like 2011 where the losses across the globe are evenly split, affecting them far more than their models had predicted.”

As macro trends including population growth, increasing wealth, climate change and urbanization influence likely losses from natural catastrophes, could this mean a shorter return period for years like last year, where industry losses exceeded US$134 billion?

“When we look at the average value of properties along the U.S. coastline — the Gulf Coast and East Coast — there’s a noticeable trend of increasing value at risk,” says Dailey. “That is because people are building in places that are at risk of wind damage from hurricanes and coastal flooding. And these properties are of a higher value because they are more complex, have a larger square footage and have more stories. Which all leads to a higher total insured value.

“The second trend that we see would be from climate change whereby the storms that produce damage along the coastline may be increasing in frequency and intensity,” he continues. “That’s a more difficult question to get a handle on but there’s a building consensus that while the frequency of hurricane landfalls may not necessarily be increasing, those that do make landfall are increasing in intensity.”

Lloyd’s chief executive Inga Beale has stated her concerns about the impact of climate change, following the market’s £4.5 billion catastrophe claims bill for 2017. “That’s a significant number, more than double 2016; we’re seeing the impact of climate change to a certain extent, particularly on these weather losses, with the rising sea level that impacts and increases the amount of loss,” she said in an interview with Bloomberg.

While a warming climate is expected to have significant implications for the level of losses arising from storms and other severe weather events, it is not yet clear exactly how this will manifest, according to Tom Sabbatelli, senior product manager at RMS. “We know the waters have risen several centimeters in the last couple of decades and we can use catastrophe models to quantify what sort of impact that has on coastal flooding, but it’s also unclear what that necessarily means for tropical cyclone strength.

“The oceans may be warming, but there’s still an ongoing debate about how that translates into cyclone intensity, and that’s been going on for a long time,” he continues. “The reason for that is we just don’t know until we have the benefit of hindsight. We haven’t had a number of major hurricanes in the last few years, so does that mean that the current climate is quiet in the Atlantic? Is 2017 an anomaly or are we going back to more regular severe activity? It’s not until you’re ten or 20 years down the line and you look back that you know for sure.”


Where Tsunami Warnings Are Etched in Stone

As RMS releases its new Japan Earthquake and Tsunami Model, EXPOSURE looks back at the 2011 Tohoku event and other significant events that have shaped scientific knowledge and understanding of earthquake risk 

Hundreds of ancient markers dot the coastline of Japan, some over 600 years old, as a reminder of the danger of tsunami. Today, a new project to construct a 12.5-meter-high seawall stretching nearly 400 kilometers along Japan’s northeast coast is another reminder. Japan is a highly seismically active country and was well prepared for earthquakes and tsunami ahead of the Tohoku Earthquake in 2011. It had strict building codes, protective tsunami barriers, early-warning systems and disaster-response plans.

But it was the sheer magnitude, scale and devastation caused by the Tohoku Earthquake and Tsunami that made it stand out from the many thousands of earthquakes that had come before it in modern times. What had not been foreseen in government planning was that an earthquake of this magnitude could occur, nor that it could produce such a sizable tsunami.

The Tohoku Earthquake was a magnitude 9.0 event — off the charts as far as the Japanese historical record for earthquakes was concerned. A violent change in the ocean bottom triggered an immense tsunami with waves of up to 40 meters that tore across the northeast coast of the main island of Honshu, traveling up to 10 kilometers inland in the Sendai area.

The tsunami breached sea walls and claimed almost everything in its path, taking 16,000 lives (a further 2,000 remain missing, presumed dead) and causing economic losses of US$235 billion. However, while the historical record proved an inadequate basis for anticipating the Tohoku event, the geological record shows that events of that magnitude had occurred before records began, explains Mohsen Rahnama, chief risk modeling officer at RMS.

“Since the Tohoku event, there's been a shift ... to moving further back in time using a more full consideration of the geological record” — Mohsen Rahnama, RMS

“If you go back in the geological record to 869 in the Tohoku region, there is evidence for a potentially similarly scaled tsunami,” he explains. “Since the Tohoku event, there’s been a shift in the government assessments moving away from a focus on what happened historically to a more full consideration of the geological record.”

The geological record, which includes tsunami deposits in coastal lakes and across the Sendai and Ishinomaki plains, shows there were large earthquakes and associated tsunami in A.D. 869, 1611 and 1896. The findings of this research point to the importance of having a fully probabilistic tsunami model at a very high resolution.

Rahnama continues: “The Tohoku event really was the ‘perfect’ tsunami hitting the largest exposure concentration at risk to tsunami in Japan. The new RMS tsunami model for Japan includes tsunami events similar to and in a few cases larger than were observed in 2011. Because the exposure in the region is still being rebuilt, the model cannot produce tsunami events with this scale of loss in Tohoku at this time.”

Incorporating secondary perils

In its new Japan earthquake and tsunami model release, RMS has incorporated the lessons from the Tohoku Earthquake and other major earthquakes that have occurred since the last model was released. Crucially, it includes a fully probabilistic tsunami model that is integrated with the earthquake stochastic event set.

“Since the Japan model was last updated we’ve had several large earthquakes around the world, and they all inform how we think about the largest events, particularly how we model the ground motions they produce,” says Ryan Leddy, senior product manager at RMS, “because good instrumentation has only been available over the last several decades. So, the more events where we sample really detailed information about the ground shaking, the better we can quantify it.

“Particularly on understanding strong ground shaking, we utilized information across events,” he continues. “Petrochemical facilities around the world are built with relatively consistent construction practices. This means that examination of the damage experienced by these types of facilities in Chile and Japan can inform our understanding of the performance of these facilities in other parts of the world with similar seismic hazard.”

The Maule Earthquake in Chile in 2010, the Canterbury sequence of earthquakes in New Zealand in 2010 and 2011, and the more recent Kumamoto Earthquakes in Japan in 2016, have added considerably to the data sets. Most notably they have informed scientific understanding of the nature of secondary earthquake perils, including tsunami, fire following earthquake, landslides and liquefaction.

The 2016 Kumamoto Earthquake sequence triggered extensive landsliding. The sequence included five events in the range of magnitude 5.5 to 7.0 and caused severe damage in Kumamoto and Oita Prefectures from ground shaking, landsliding, liquefaction and fire following earthquake.

“Liquefaction is in the model as a secondary peril. RMS has redesigned and recalibrated the liquefaction model for Japan. The new model directly calculates damage due to vertical deformation due to liquefaction processes,” says Chesley Williams, senior director, RMS Model Product Management. “While the 1964 Niigata Earthquake with its tipped apartment buildings showed that liquefaction damages can be severe in Japan, on a countrywide basis the earthquake risk is driven by the shaking, tsunami and fire following, followed by liquefaction and landslide. For individual exposures, the key driver of the earthquake risk is very site specific, highlighting the importance of high-resolution modeling in Japan.”

The new RMS model accounts for the clustering of large events on the Nankai Trough. This is an important advancement as an examination of the historical record shows that events on the Nankai Trough have either occurred as full rupturing events (e.g., 1707 Hoei Earthquake) or as pairs of events (e.g., 1944 and 1946 and two events in 1854).

This is different from aftershocks, explains Williams. “Clustered events are events on different sources that would have happened in the long-term earthquake record, and the occurrence of one event impacts the timing of the other events. This is a subtle but important distinction. We can model event clustering on the Nankai Trough due to the comprehensive event record informed by both historical events and the geologic record.”

The Tohoku event resulted in insurance losses of US$30 billion to US$40 billion, the costliest earthquake event for the insurance industry in history. While the news media focused on the extreme tsunami, the largest proportion of the insurance claims emanated from damage wrought by the strong ground shaking. Interestingly, likely due to cultural constraints, only a relatively low amount of post-event loss amplification was observed.

“In general for very large catastrophes, claims costs can exceed the normal cost of settlement due to a unique set of economic, social and operational factors,” says Williams. “Materials and labor become more expensive and claims leakage can be more of an issue, so there are a number of factors that kick in that are now captured by the RMS post-event loss amplification modeling. The new Japan model now explicitly models post-event loss amplification but limits the impacts to be consistent with the observations in recent events in Japan.”

Supply chain disruption and contingent business interruption were significant sources of loss following the Tohoku event. This was exacerbated by the Level 7 meltdown at the Fukushima nuclear power plant, which resulted in evacuations, exclusion zones and rolling blackouts.

“We sent reconnaissance teams to Japan after the event to understand the characteristics of damage and to undertake case studies for business interruption,” says Williams. “We visited large industrial facilities and talked to them about their downtime, their material requirement and their access to energy sources to better understand what had impacted their ability to get back up and running.”

Recent events have re-emphasized that there are significant differences in business interruption by occupancy. “For example,  a semiconductor facility is likely going to have a longer downtime than a cement factory,” says Williams. “The recent events have highlighted the impacts on business interruption for certain occupancies by damage to supply sources. These contingent business interruptions are complex, so examination of the case studies investigated in Japan were instrumental for informing the model.”

Rebuilding in the seven years since the Tohoku Tsunami struck has been an exercise in resilient infrastructure. With nearly half a million people left homeless, there has been intense rebuilding to restore services, industry and residential property. US$12 billion has been spent on seawalls alone, replacing the 4-meter breakwaters with 12.5-meter-high tsunami barriers.

An endless convoy of trucks has been moving topsoil from the hills to the coastline in order to raise the land by over 10 meters in places. Most cities have decided to elevate by several meters, with a focus on rebuilding commercial premises in exposed areas. Some towns have forbidden the construction of homes in flat areas nearest the coasts and relocated residents to higher ground.


Tokyo-Yokohama: The world's most exposed metropolis

The Japanese metropolis of Tokyo-Yokohama has the world's greatest GDP at risk from natural catastrophes. Home to 38 million residents, it has potential for significant economic losses from multiple perils, but particularly earthquakes. According to Swiss Re it is the riskiest metropolitan area in the world.

A combination of strict building codes, land use plans and disaster preparedness has significantly reduced the city's vulnerability in recent decades. Despite the devastation caused by the tsunami, very few casualties (around 100) were attributed to partial or complete building collapse caused by ground shaking during the magnitude 9.0 Tohoku Earthquake.


How Cyber Became a Peak Peril

As new probabilistic cyber models are launched, EXPOSURE explores how probabilistic modeling will facilitate the growth of the cyber (re)insurance market and potentially open up the transfer of catastrophic risks to the capital markets 

The potential for cyberattacks to cause global, systemic disruption continues to ratchet up, and to confuse matters further, it is state actors that are increasingly involved in sponsoring these major attacks. Last year’s major global ransomware attacks — WannaCry and NotPetya — were a wake-up call for many businesses, in terms of highlighting the potential scale and source of cyber incidents. The widespread disruption caused by these incidents — widely suspected of being state-sponsored attacks — confirmed that cyber risk is now in the realm of catastrophe exposures.

The introduction of probabilistic catastrophe modeling for cyber therefore comes at an opportune time. In terms of modeling, although a cyberattack is human-made and very different from a Florida hurricane or Japanese earthquake, for instance, there are some parallels with natural catastrophe perils. Most notable is the potential for sizable, systemic loss.

“Catastrophe modeling exists because of the potential correlation of losses across multiple locations and policies all from the same event,” explains Robert Muir-Wood, chief research officer at RMS. “This concentration is what insurers most fear. The whole function of insurance is to diversify risk.

“Anything that concentrates risk is moving in the opposite direction to diversification,” he continues. “So, insurers need to find every way possible to limit the concentration of losses. And cyber clearly has the potential, as demonstrated by the NotPetya and WannaCry attacks last year, to impact many separate businesses in a single attack.”

“What’s the equivalent of a cyber hurricane? None of the insurers are quite sure about that” — Tom Harvey, RMS

Cyberattacks can easily make a loss go global. Whereas a Florida hurricane can damage multiple properties across a small geographical area, a ransomware attack can interrupt the day-to-day running of thousands of businesses on an unprecedented geographical scale. “When I think of systemic risk I think of an attack that can target many thousands of organizations, causing disruption of digital assets using technology as a vector for disruption,” says Tom Harvey, senior product manager at RMS cyber solutions.

“What’s the equivalent of a cyber hurricane? None of the insurers are quite sure about that. When you write a cyber insurance policy you’re inherently taking a bet on the probability of that policy paying out. Most people recognize there are systemic risks out there, which increases the probability of their policy paying out, but until models have been developed there’s no way to really quantify that,” he adds. “Which is why we do what we do.”

RMS estimates a substantial outage at a leading cloud service provider could generate an insurable economic loss of US$63 billion — and that is just for the U.S. In economic loss terms, this is roughly equivalent to a catastrophic natural disaster such as Superstorm Sandy in 2012.

To estimate these losses, the RMS model takes into account the inherent resiliency of cloud service providers, drawing on extensive research into how corporations use the cloud for their revenue-generating processes and how cloud providers have adopted resilient IT architectures to mitigate the impact of an outage on their customers.

The majority of the loss would come from contingent business income (CBI), a coverage that typically has an 8-to-12-hour waiting period and is heavily sublimited. Coupled with still relatively low cyber insurance penetration, this means a significant proportion of the loss would fall on the corporates themselves rather than the insurance industry.
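A minimal sketch, assuming purely illustrative policy terms, shows how a waiting period and sublimit erode the insured share of such an outage loss; the figures below are invented and are not RMS estimates.

def insured_cbi_loss(outage_hours, hourly_income_loss, waiting_period_hours, sublimit):
    """Split a cloud-outage loss into gross, insured (CBI) and retained portions."""
    compensable_hours = max(0.0, outage_hours - waiting_period_hours)
    gross = outage_hours * hourly_income_loss
    insured = min(compensable_hours * hourly_income_loss, sublimit)
    return gross, insured, gross - insured

# Hypothetical insured: 36-hour outage, US$50,000 of lost income per hour,
# a 12-hour waiting period and a US$1 million CBI sublimit
gross, insured, retained = insured_cbi_loss(36, 50_000, 12, 1_000_000)
print(gross, insured, retained)  # 1,800,000 gross; 1,000,000 insured; 800,000 retained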

The evolution of cyber modeling

In the early days of cyber insurance, when businesses and insurers were grappling with an esoteric and rapidly evolving threat landscape, cyber was initially underwritten using various scenarios to determine probable maximum losses for a portfolio of risks.

RMS launched its Cyber Accumulation Management System (CAMS) in 2015, initially focused on five key cyber exposures: data exfiltration, ransomware, denial of service, cloud failure and extortion. “Within each of those classes of cyberattack we asked, ‘What is the most systemic type of incident that we would expect to see?’” explains Harvey. “Then you can understand the constraints that determine the potential scale of these events.

“We have always conducted a great deal of historical event analysis to understand the technical constraints that are in place, and then we put all that together. So, for example, with data exfiltration there are only so many threat actors that have the capability to carry out this type of activity,” he continues. “And it’s quite a resource intensive activity. So even if you made it very easy for hackers to steal data there’s only so many actors in the world (even state actors) that would want to.

“From an insurance point of view, if you are insuring 5,000 companies and providing cyber coverage for them, you could run the model and say if one of these catastrophes impacts our book we can be confident our losses are not going to exceed, say US$100 million. That’s starting to provide some comfort to those insurers about what their PML [probable maximum loss] scenarios would be.”
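As a toy illustration of that kind of accumulation check, the sketch below sums capped losses from a single scenario across a small, invented book. It is not the CAMS methodology, just the arithmetic behind a scenario-based PML estimate.

# Hypothetical book: each policy has a limit, a flag for whether the scenario affects it,
# and an assumed severity (loss as a fraction of the limit)
book = [
    {"insured": "Acme Corp", "limit": 2_000_000, "affected": True,  "severity": 0.4},
    {"insured": "Globex",    "limit": 5_000_000, "affected": False, "severity": 0.0},
    {"insured": "Initech",   "limit": 1_000_000, "affected": True,  "severity": 0.9},
]

def scenario_loss(book):
    """Accumulate scenario losses across the portfolio, capping each policy at its limit."""
    return sum(min(p["limit"], p["limit"] * p["severity"]) for p in book if p["affected"])

print(scenario_loss(book))  # 800,000 + 900,000 = 1,700,000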

The affirmative cyber insurance market is now four times the size it was when RMS developed its first-generation cyber risk model, and as the market diversifies and grows, clients need new tools to manage profitable growth.

Harvey adds: “The biggest request from our clients was to assess the return periods of cyber loss and to link probabilities with accumulation scenarios, and help them allocate capital to cyber as a line of insurance.  In the release of RMS Cyber Solutions Version 3, which includes the first probabilistic model for cyber loss, we estimate the scalability of the various loss processes that make up the drivers of cyber claims.

“Stochastic modeling helps explore the systemic potential for catastrophe loss estimates resulting from each cyber loss process: incorporating the statistical volatility of claims patterns from these in recent years, the technical constraints on scaling factors and attack modes of each process, and the parallels with loss exceedance distributions from other perils that RMS has modeled extensively.

“From this, we now provide loss exceedance probability (EP) distributions for each cyber loss process, with reference accumulation scenarios benchmarked to key return periods from the EP curve. These are combined into a total loss EP curve from all causes. RMS has been expanding on these scenarios in recent years, coming up with new situations that could occur in the future and incorporating a rapidly growing wealth of data on cyberattacks that have occurred. Knowing how these real-life incidents have played out helps our cyber modeling team to assign probabilities to those scenarios so insurers can more confidently assign their capital and price the business.”
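To make the EP idea concrete, here is a short Monte Carlo sketch: it simulates annual losses for two hypothetical cyber loss processes, combines them, and reads losses off the empirical curve at chosen return periods. The frequencies, severities and units are invented for illustration and bear no relation to the RMS model parameters.

import math
import random

def poisson(lam):
    """Sample a Poisson-distributed event count (Knuth's algorithm)."""
    threshold = math.exp(-lam)
    count, product = 0, random.random()
    while product > threshold:
        count += 1
        product *= random.random()
    return count

def simulate_annual_losses(n_years, annual_rate, log_mu, log_sigma=1.0):
    """Total annual loss for one loss process: Poisson frequency, lognormal severities."""
    return [sum(random.lognormvariate(log_mu, log_sigma) for _ in range(poisson(annual_rate)))
            for _ in range(n_years)]

def loss_at_return_period(annual_losses, return_period):
    """Empirical annual loss exceeded, on average, once every `return_period` years."""
    ranked = sorted(annual_losses, reverse=True)
    return ranked[max(0, len(ranked) // return_period - 1)]

random.seed(42)
years = 10_000
ransomware = simulate_annual_losses(years, annual_rate=0.8, log_mu=2.0)   # arbitrary loss units
cloud_outage = simulate_annual_losses(years, annual_rate=0.1, log_mu=4.0)
total_by_year = [a + b for a, b in zip(ransomware, cloud_outage)]
for rp in (10, 100, 250):
    print(rp, round(loss_at_return_period(total_by_year, rp), 1))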

With the ability to model cyber on a probabilistic basis, insurers can assign capital to their portfolios of risk more accurately, and it is hoped this will facilitate the growth of both the cyber insurance and reinsurance markets.

Taking out the peaks

As the cyber (re)insurance market develops, the need for mechanisms to transfer extreme risks will grow. This is where the capital markets could potentially play a role. There are plenty of challenges in structuring an instrument such as a catastrophe bond to cover cyber risk; however, the existence of probabilistic cyber models takes that prospect one step closer to reality.

In 2016, Credit Suisse was able to transfer its operational risk exposures to the capital markets via the Operational Re catastrophe bond, which was fronted by insurer Zurich. Among the perils covered were cyberattack and rogue trading scenarios. Certainly, investors in insurance-linked securities (ILS) have the appetite to diversify away from peak zone natural catastrophe perils.

ILS investors have the appetite to diversify away from peak zone natural catastrophe perils

“On a high level, absolutely you could transfer cyber risk to the capital markets,” thinks Ben Brookes, managing director of capital and resilience solutions at RMS. “All the dynamics you would expect are there. It’s a potentially large systemic risk and potentially challenging to hold that risk in concentration as an insurance company. There is the opportunity to cede that risk into a much broader pool of investment risk where you could argue there is much more diversification.

“One question is: how much diversification is there across mainstream asset classes?” he continues. “What would the impact be on the mainstream financial markets if a major cloud provider went down for a period of time, for instance? For cyber ILS to be successful, some work would need to be put into that to understand the diversification benefit, and you’d need to be able to demonstrate that to ILS funds in order to get them comfortable.

“It could be an insured, for example, a business highly dependent on the cloud, rather than an insurance or reinsurance company, looking to cede the risk. Particularly a large organization, with a sizable exposure that cannot secure the capacity it needs in the traditional market as it is at present,” says Brookes.

“The isolation and packaging of that cause of loss could enable you to design something that seems a little bit like a parametric cyber bond, and to do that relatively soon,” he believes.

“We’re at a point where we’ve got a good handle on the risk of cloud provider failure or data exfiltration at various different levels. You could envisage building an index around that, for instance the aggregate number of records leaked across the Fortune 500 in the U.S. And then we can model that — and that’s something that can be done in relatively short order.”


Getting physical

There are only a handful of instances where a cyber intrusion has caused substantial physical damage. These are well known and include an attack on a German steel mill and the Stuxnet virus, which targeted an Iranian uranium enrichment facility. In spite of this, many experts believe the potential for physical damage resulting from a cyberattack is growing.

“There are three instances globally where cyber has been used to cause physical damage,” says Julian Enoizi, CEO of Pool Re. “The damage caused was quite significant, but there was no attribution toward those being terrorist events. But that doesn’t mean that if the physical ISIL caliphate gets squeezed they wouldn’t resort to cyber as a weapon in the future.”

In an EXPOSURE article last year on the vulnerabilities inherent in the Internet of Things, following the 2016 Mirai DDoS attack, we explored how similar viruses could be used to compromise smart thermostats, causing them to overheat and start fires. Because there is so little data and significant potential for systemic risk, (re)insurers have been reluctant to offer meaningful coverage for cyber physical exposures.

They are also concerned that the traditional “air-gapping” defense used by energy and utilities firms to protect supervisory control and data acquisition (SCADA) systems could more easily be overcome in a world where everything has an Internet connection.

Until now. In March this year, the U.K.’s terrorism insurance backstop Pool Re announced it had secured £2.1 billion of retrocession cover, which included — for the first time — cyber terrorism. “We identified the gap in our cover about two-and-a-half years ago that led us to start working with academia and government departments to find out whether there was an exposure to a cyber terrorism event that could cause physical damage,” says Enoizi.

“While it was clear there was no imminent threat, we wanted to be able to future-proof the product and make sure there were no gaps in it,” he continues. “So, we did the studies and have been working hard on getting the insurance and reinsurance market comfortable with that.”

Even after two years of research and discussions with reinsurers and brokers, it was a challenge to secure capacity from all the usual sources, reveals Enoizi. “Pool Re buys the largest reinsurance program for terrorism in the world. And there are certain reinsurance markets who would not participate in this placement because of the addition of a cyber trigger. Some markets withdrew their participation.”

This does suggest the capital markets could be the natural home for such an exposure in the future. “It is clear that state-based actors are increasingly mounting some of the largest cyberattacks,” says RMS’s Muir-Wood. “It would be interesting to test the capital markets just to see what their appetite is for taking on this kind of risk. They have definitely got a bit bolder than they were five years ago, but this remains a frontier area of the risk landscape.”


Brazil: Modeling the world’s future breadbasket

How a crop modeling collaboration with IRB Brasil Re could help bridge the protection gap and build a more resilient agricultural base for the future in Brazil

Brazil is currently the world’s second largest corn exporter, and is set to overtake the U.S. as the globe’s biggest soybean exporter, with the U.S. Department of Agriculture (USDA) predicting a record Brazilian soybean crop of 115 million metric tons in its outlook for 2018.

Yet this agricultural powerhouse — responsible for around a quarter of Brazil’s GDP — remains largely underinsured, according to Victor Roldán, vice president and head of Caribbean and Latin America at RMS. It is a situation that must be addressed, he argues, given the sector’s importance to the country’s economy and the growing weather extremes farmers must contend with under climate change conditions.

The effects of climate change over the next 25 years could lead to further heavy crop losses

“Natural perils are identified as the industry’s main risk,” he says. “Major droughts or excess rain have been big drivers of losses for the sector, and their frequency and severity will increase under future climate change conditions. From 2014 to 2017, El Niño affected Brazil, bringing some of the largest droughts to some areas of the country and excess rain to others.

“There is a need to structure more effective and attractive insurance products to protect the farmers,” he continues. “For this we need better analytics, a better understanding of the perils, exposure and vulnerability.”

Worst drought in 80 years

The worst drought in 80 years reached its height in 2015, with farmers in Sao Paulo losing up to a third of their crops due to the dry weather. Production of soy shrank by 17 percent between 2013 and 2014 while around a fifth of the state’s citrus crops died. Meanwhile, heavy rain and flash floods in the south of the country also detrimentally impacted agricultural output.

The effects of climate change over the next 25 years could lead to further heavy crop losses, according to a study carried out by Brazil’s Secretariat of Strategic Issues (SAE). It found that some of the country’s main crops could suffer a serious decline in the areas already under cultivation, anticipating a decline of up to 39 percent in the soybean crop. This could translate into significant financial losses, since the soybean crop currently brings in around US$20 billion in export earnings annually.

IRB Brasil Re has been the leader in the agricultural reinsurance sector of the country for decades and has more than 70 years of agricultural claims data. Today agricultural risks represent its second-largest business line after property. However, insurance penetration remains low in the agricultural sector, and IRB has been seeking ways in which to encourage take-up among farmers.

The 2015 drought was a turning point, explains Roldán. “As the largest reinsurance player in Brazil, IRB needed to address in a more systematic way the recorded 16.3 percent increase in claims. The increase was due to the drought in the Midwestern region, which adversely affected corn, soybean and coffee crops, and, separately, to an increase in rainfall above the historical average in the Southern region, which caused damage to crops.”

Building a probabilistic crop model

A better crop-weather modeling approach and risk analytics of crop perils will help the market to better understand their risks and drive growth in crop insurance penetration. IRB is partnering with RMS to develop the first fully probabilistic hybrid crop model for the agricultural insurance sector in Brazil, which it is planning to roll out to its cedants. The model will assess crop risks linked with weather drivers, such as drought, excess rainfall, temperature variation, hail events, strong wind and other natural hazards that impact crop yield variability. The model will be suited for different crop insurance products such as named perils (hail, frost, etc.), Multiple-Peril Crop Insurance (MPCI) and revenue covers, and will also include livestock and forestry.

“Major droughts or excess of rain have been big drivers of losses for the sector, but also climate change is a worrying trend” — Victor Roldán, RMS

“Weather-driven impacts on crop production are complex perils to model given the natural variability in space and time, the localized nature of the hazards and the complex vulnerability response depending on the intensity, but also on the timing of occurrence,” explains Olivier Bode, manager, global agricultural risk at RMS.

“For instance, plant vulnerability not only depends on the intensity of the stress but also on the timing of the occurrence, and the crop phenology or growth stage, which in turn depends on the planting date and the selected variety along with the local weather and soil conditions,” he continues. “Thus, exposure information is critical as you need to know which variety the farmer is selecting and its corresponding planting date to make sure you’re representing correctly the impacts that might occur during a growing season. The hybrid crop model developed by RMS for IRB has explicit modules that account for variety specific responses and dynamic representation of crop growth stages.”

The model will rely on more than historical data. “That’s the major advantage of using a probabilistic crop-weather modeling approach,” says Bode. “Typically, insurers are looking at historical yield data to compute actuarial losses and they don’t go beyond that. A probabilistic framework allows insurers to go beyond the short historical yield record, adding value by coupling longer weather time series with crop models. It also allows you to capture possible future events that are not recorded in past weather data, for example drought events that might span several years, flood occurrences extending over larger or new areas, and climate change-related impacts. This allows you to calculate exceedance probability losses at different return periods for each crop and for specific scenarios.”
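As a minimal, hypothetical sketch of that idea, the Python snippet below turns simulated yields into indemnities under a simple MPCI-style yield guarantee and reads off losses at two return periods. The yield distribution, coverage level and price are invented for illustration; the IRB/RMS model works from explicit weather and crop-growth simulation rather than a plain normal distribution.

import random

def mpci_indemnity(actual_yield, expected_yield, coverage_level, price):
    """Indemnity under a simple yield guarantee: pay the shortfall below the guaranteed yield."""
    guarantee = coverage_level * expected_yield
    return max(0.0, guarantee - actual_yield) * price

random.seed(1)
expected_yield = 3.2   # tonnes per hectare, an illustrative soybean figure
simulated_yields = [max(0.0, random.gauss(expected_yield, 0.6)) for _ in range(10_000)]
losses = sorted((mpci_indemnity(y, expected_yield, coverage_level=0.7, price=350)
                 for y in simulated_yields), reverse=True)

# Empirical loss per hectare at the 1-in-20 and 1-in-100 year return periods
print(round(losses[len(losses) // 20 - 1], 2), round(losses[len(losses) // 100 - 1], 2))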

There is also significant potential to roll out the model to other geographies in the future, with Colombia currently looking like the obvious next step and opportunity. “The El Niño weather phenomenon affects all of Latin America; it decreases rains by more than 60 percent during the rainy seasons in many countries,” explains Roldán. “Like Brazil, Colombia is a very biologically diverse country and features a variety of ecosystems. Currently, most of the country has under-utilized agricultural land.”

Colombia is already a key player worldwide in two products: coffee and cut flowers. But the country has signed a number of free trade agreements that will give its producers more access to foreign markets. “So, the expansion of agribusiness insurance is urgently needed in Colombia,” says Roldán.

 


Readying the insurance industry for a “moonshot”

There is growing acceptance that trying to squeeze more efficiency out of existing systems and processes is folly in an industry that must make fundamental changes. But are insurance and reinsurance companies ready for the cultural and digital change this necessitates?

In an article in Wired magazine, Google X lab director Eric “Astro” Teller (whose business card describes him as “Captain of Moonshots”) suggested that it might actually be easier to make something 10 times better than 10 percent better. Squeezing out a further 10 percent in efficiency typically involves tinkering with existing flawed processes and systems. It’s not always necessary to take a “moonshot,” but making something 10 times better involves taking bold, innovative steps and tearing up the rule book where necessary.

IBM also invoked the term “moonshot” in describing how it foresaw the impact of cloud computing on the future of healthcare, specifically the hunt for a cure for cancer. IBM argued a new architectural strategy — one based on open platforms and designed to cope with rampant data growth and the need for flexibility — was required in order to take cancer research to the next level.

But is the 330-year-old insurance industry — with its legacy systems, an embedded culture and significant operational pressures — ready for such a radical approach? And should those companies that are not ready, prepare to be disrupted?

In the London and Lloyd’s market, where the cost of doing business remains extremely high, there are fears that business could migrate to more efficient, modern competitor hubs, such as Zurich, Bermuda and Singapore.

“The high cost of doing business is something that has been directly recognized by [Lloyd’s CEO] Inga Beale amongst others; and it’s something that has been explicitly highlighted by the rating agencies in their reports on the market,” observes Mike van Slooten, head of market analytics at Aon Benfield. “There is a consensus building that things really do have to change.”

The influx of alternative capacity, a rapidly evolving risk landscape — with business risks increasingly esoteric — a persistently low interest rate environment and high levels of competition have stretched balance sheets in recent years. In addition, the struggle to keep up with the explosion of data and the opportunities this presents, and the need to overhaul legacy systems, is challenging the industry as never before.

“You’ve got a situation where the market as a whole is struggling to meet its ROE targets,” says van Slooten. “We’re getting to a stage where pretty much everyone has to accept the pricing that’s on offer. One company might be better at risk selection than another — but what really differentiates companies in this market is the expense ratio, and you see a huge disparity across the industry.

“Some very large, successful organizations have proved they can run at a 25 percent expense ratio and for other smaller organizations it is north of 40 percent, and in this market, that’s a very big differential,” he continues. “Without cost being brought out of the system there’s a lot of pressure there, and that’s where these M&A deals are coming from. Insight is going to remain at a premium going forward, however, a lot of the form-filling and processing that goes on behind the scenes has got to be overhauled.”

“Efficiency needs to be partnered with business agility,” says Jon Godfray, chief operating officer at Barbican Insurance Group. Making a process 10 times faster will not achieve the “moonshot” an organization needs if it is not married to the ability to act quickly on insight and make informed decisions. “If we weren’t nimble and fast, we would struggle to survive. A nimble business is absolutely key in this space. Things that took five years to develop five years ago are now taking two. Everything is moving at a faster pace.”

As a medium-sized Lloyd’s insurance group, Barbican recognizes the need to remain nimble and to adapt its business model as the industry evolves. However, large incumbents are also upping their game. “I spent some years at a very large insurer and it was like a massive oil tanker … you decided in January where you wanted to be in December, because it took you four months to turn the wheel,” says Godfray.

“Large organizations have got a lot better at being adaptable,” he continues. “Communication lines are shorter and technology plays a big part. This means the nimble advantage we have had is reducing, and we must therefore work even faster and perform better. Organizations need to remain flexible and nimble, and need to be able to embrace the increasingly stringent regulatory climate we’re in.”

Creating a culture of innovation

Automation and the efficiencies to be gained by speeding up previously clunky and expensive processes will enable organizations to compete more effectively. “But not all organizations need to be pioneers in order to leverage new technology to their advantage,” adds Godfray. “We see ourselves as a second-level early adopter. We’d love to be at the forefront of everything, but there are others with deeper pockets who can do that.”

“However, we can be an early adopter of technology that can make a difference and be brave enough to close an avenue if it isn’t working,” he continues. “Moving on from investments that don’t appear to be working is something a lot of big organizations struggle with. We have a great arrangement with our investor where if we start something and we don’t like it, we stop it and we move on.”

The drive for efficiency is not all about technology. There is a growing recognition that culture and process are critical to the change underway in the industry. Attracting the right talent, enabling bold decisions and investments to be made, and responding appropriately to rapidly changing customer needs and expectations all rest on the ability of large organizations to think and act more nimbly.

And at the end of the day, survival is all about making tactical decisions that enhance an organization’s bottom line, Godfray believes. “The winners of the future will have decent P&Ls. If you’re not making money, you’re not going to be a winner. Organizations that are consistently struggling will find it harder and harder as the operating environment becomes less and less forgiving, and they will gradually be consolidated into other companies.”

Much of the disruptive change within the industry to date has occurred in general insurance, where the Internet of Things (IoT), artificial intelligence and product innovation are just some of the developments underway. As we move into an era of the connected home, wearable devices and autonomous vehicles, insurers are in a better position both to analyze individuals and to feed information back to them in order to empower them and reduce risk.

But even within personal lines there has not been a remarkable product revolution yet, thinks Anthony Beilin, head of innovation and startup engagement at Aviva. “The same can be said for disruption of the entire value chain. People have attacked various parts and a lot of the focus so far has been on distribution and the front-end customer portal. Maybe over the next 10 years, traditional intermediaries will be replaced with new apps and platforms, but that’s just a move from one partner to another.”

Innovation is not just about digitization, says Beilin. While it is important for any (re)insurance company to consistently improve its digital offering, true success and efficiencies will be found in redesigning the value chain, including the products on offer. “It isn’t just taking what was a paper experience onto the Internet, then taking what was on the Internet onto the mobile and taking a mobile experience into a chatbot … that isn’t innovation.

“What we really need to think about is: what does protecting people’s future look like in 50 years’ time? Do people own cars? Do people even drive cars? What are the experiences that people will really worry about?” he explains. “How can we rethink what is essentially a hedged insurance contract to provide a more holistic experience, whether it’s using AI to manage your finances or using technology to protect your health, that’s where the radical transformation will come.”

Beilin acknowledges that collaboration will be necessary. With a background in launching startups he understands the necessary and complementary characteristics of niche players versus large incumbents.

“It is an agreed principle that the bigger the company, the harder it is to make change,” says Beilin. “When you start talking about innovating it runs contrary to the mantra of what big businesses do, which is to set up processes and systems to ensure a minimum level of delivery. Innovation, on the other hand, is about taking the seed of an idea and developing it into something new, and it’s not a natural fit with the day-to-day operations of any business.”

This is not just a problem for the insurance industry. Beilin points to the disruption brought about in the traditional media space by Netflix, Facebook and other social media platforms. “Quite frankly startups are more nimble, they have more hunger, dynamism and more to lose,” he says. “If they go bankrupt, they don’t get paid. The challenge for them is in scaling it to multiple customers.”

This is where investments like Aviva’s Digital Garage come in. “We’re trying to be a partner for them,” says Beilin. “Collaboration is the key in anything. If you look at the success we’re going to achieve,  it’s not going to be in isolation. We need different capabilities to succeed in a future state. We’ve got some extremely creative and talented people on staff, but of course we’ll never have everyone. We need different capabilities and skills so we need to make sure we’re interoperable and open to working with partners wherever possible.”


Achieving 10X: A platform-centric approach

Together with increasing speed and agility and initiatives to drive down the transactional cost of the business, technology is seen as a savior for the way it enables better risk selection, pricing and capital allocation. Analytics that fuses the back office, where the data lives, with the front office, where the decision-makers are, is imperative.

According to 93 percent of insurance CEOs surveyed by PwC in 2015, data mining and analysis is the most strategically important digital technology for their business. Many (re)insurance company CIOs have taken the plunge and moved parts of their business into the Cloud, particularly those technologies that are optimized to leverage its elasticity and scalability, in order to enhance their analytical capabilities. 

When it comes to analytics, simply moving exposure data, contract data, and existing actuarial and probabilistic models into Cloud architecture will not enable companies to redesign their entire workflow, explains Shaheen Razzaq, director, software products at RMS. 

“Legacy systems were not designed to scale to the level needed,” he adds. “We are now in a world dealing with huge amounts of data and even more sophisticated models and analytics. We need scalable, high-performing technologies. And to truly leverage these technologies, we need to redesign our systems from the ground up.” He argues that what is needed is a platform-centric approach, designed to be supported by the Cloud, that delivers the scale, performance and insurance-specific capabilities the industry needs to achieve its moonshot moment. Clearly RMS(one)®, a big data and analytics platform purpose-built for the insurance industry, is one solution available.


 


The peril of ignoring the tail

Drawing on several new data sources and gaining a number of new insights from recent earthquakes on how different fault segments might interact in future earthquakes, Version 17 of the RMS North America Earthquake Models sees the frequency of larger events increasing, making for a fatter tail.  EXPOSURE asks what this means for (re)insurers from a pricing and exposure management perspective.

Recent major earthquakes, including the M9.0 Tohoku Earthquake in Japan in 2011 and the Canterbury Earthquake Sequence in New Zealand (2010-2011), have offered new insight into the complexities and interdependencies of losses that occur following major events. This insight, as well as other data sources, was incorporated into the latest seismic hazard maps released by the U.S. Geological Survey (USGS).

In addition to engaging with USGS on its 2014 update, RMS went on to invest more than 100 person-years of work in implementing the main findings of this update as well as comprehensively enhancing and updating all components in its North America Earthquake Models (NAEQ). The update reflects the deep complexities inherent in the USGS model and confirms the adage that “earthquake is the quintessential tail risk.” Among the changes to the RMS NAEQ models was the recognition that some faults can interconnect, creating correlations of risk that were not previously appreciated.

Lessons from Kaikoura

While there is still a lot of uncertainty surrounding tail risk, the new data sets provided by USGS and others have improved the understanding of events with a longer return period. “Global earthquakes are happening all of the time, not all large, not all in areas with high exposures,” explains Renee Lee, director, product management at RMS. “Instrumentation has become more advanced and coverage has expanded such that scientists now know more about earthquakes than they did eight years ago when NAEQ was last released in Version 9.0.”

This includes understanding about how faults creep and release energy, how faults can interconnect, and how ground motions attenuate through soil layers and over large distances. “Soil plays a very important role in the earthquake risk modeling picture,” says Lee. “Soil deposits can amplify ground motions, which can potentially magnify the building’s response leading to severe damage.”

The 2016 M7.8 earthquake in Kaikoura, on New Zealand’s South Island, is a good example of a complex rupture where fault segments connected in more ways than had previously been realized. In Kaikoura, at least six fault segments were involved, where the rupture “jumped” from one fault segment to the next, producing a single larger earthquake.

“The Kaikoura quake was interesting in that we did have some complex energy release moving from fault to fault,” says Glenn Pomeroy, CEO of the California Earthquake Authority (CEA). “We can’t hide our heads in the sand and pretend that scientific awareness doesn’t exist. The probability has increased for a very large, but very infrequent, event, and we need to determine how to manage that risk.”

San Andreas correlations

Looking at California, the updated models include events that extend from the north of San Francisco to the south of Palm Springs, correlating exposures along the length of the San Andreas fault. While the prospect of a major earthquake impacting both northern and southern California is considered extremely remote, it will nevertheless affect how reinsurers seek to diversify different types of quake risk within their book of business.

“In the past, earthquake risk models have considered Los Angeles as being independent of San Francisco,” says Paul Nunn, head of catastrophe risk modeling at SCOR. “Now we have to consider that these cities could have losses at the same time (following a full rupture of the San Andreas Fault).

In Kaikoura, at least six fault segments were involved, where the rupture “jumped” from one fault segment to the next

“However, it doesn’t make that much difference in the sense that these events are so far out in the tail ... and we’re not selling much coverage beyond the 1-in-500-year or 1-in-1,000-year return period. The programs we’ve sold will already have been exhausted long before you get to that level of severity.”

While the contribution of tail events to return period losses is significant, as Nunn explains, this could be more of an issue for insurance companies than (re)insurers, from a capitalization standpoint. “From a primary insurance perspective, the bigger the magnitude and event footprint, the more separate claims you have to manage. So, part of the challenge is operational — in terms of mobilizing loss adjusters and claims handlers — but primary insurers also have the risk that losses from tail events could go beyond the (re)insurance program they have bought.

“It’s less of a challenge from the perspective of global (re)insurers, because most of the risk we take is on a loss limited basis — we sell layers of coverage,” he continues. “Saying that, pricing for the top layers should always reflect the prospect of major events in the tail and the uncertainty associated with that.”

He adds: “The magnitude of the Tohoku earthquake event is a good illustration of the inherent uncertainties in earthquake science and wasn’t represented in modeled scenarios at that time.”

While U.S. regulation stipulates that carriers writing quake business should capitalize to the 1-in-200-year event level, in Canada capital requirements are more conservative in an effort to better account for tail risk. “So, Canadian insurance companies should have less overhang out of the top of their (re)insurance programs,” says Nunn.
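Nunn’s “layers of coverage” point reduces to a simple payout function. The sketch below shows a generic excess-of-loss layer calculation with arbitrary attachment and limit figures; it is illustrative only and does not represent any SCOR program or regulatory calculation.

def layer_recovery(gross_loss, attachment, limit):
    """Recovery from an excess-of-loss layer: the slice of loss above the attachment, capped at the limit."""
    return min(limit, max(0.0, gross_loss - attachment))

# Hypothetical US$200 million xs US$300 million layer responding to three event sizes
for loss in (250e6, 400e6, 900e6):
    print(loss, layer_recovery(loss, attachment=300e6, limit=200e6))
# 250m -> 0; 400m -> 100m; 900m -> 200m (the layer is exhausted well before the extreme tail)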

Need for post-event funding

For the CEA, the updated earthquake models could reinvigorate discussions around the need for a mechanism to raise additional claims-paying capacity following a major earthquake. Set up after the Northridge Earthquake in 1994, the CEA is a not-for-profit, publicly managed and privately funded earthquake pool.

“It is pretty challenging for a stand-alone entity to take on large tail risk all by itself,” says Pomeroy. “We have, from time to time, looked at the possibility of creating some sort of post-event risk-transfer mechanism.

“A few years ago, for instance, we had a proposal in front of the U.S. Congress that would have created the ability for the CEA to have done some post-event borrowing if we needed to pay for additional claims,” he continues. “It would have put the U.S. government in the position of guaranteeing our debt. The proposal didn’t get signed into law, but it is one example of how you could create an additional claim-paying capacity for that very large, very infrequent event.”

“(Re)insurers will be considering how to adjust the balance between the LA and San Francisco business they’re writing” — Paul Nunn, SCOR

The CEA leverages both traditional and non-traditional risk-transfer mechanisms. “Risk transfer is important. No one entity can take it on alone,” says Pomeroy. “Through risk transfer from insurer to (re)insurer the risk is spread broadly through the entrance of the capital markets as another source for claim-paying capability and another way of diversifying the concentration of the risk.

“We manage our exposure very carefully by staying within our risk-transfer guidelines,” he continues. “When we look at spreading our risk, we look at spreading it through a large number of (re)insurance companies from 15 countries around the world. And we know the (re)insurers have their own strict guidelines on how big their California quake exposure should be.”

The prospect of a higher frequency of larger events producing a “fatter” tail also raises the possibility of an overall reduction in average annual loss (AAL) for (re)insurance portfolios, a factor that is likely to add to pricing pressure as the industry approaches the key January 1 renewal date, predicts Nunn. “The AAL for Los Angeles coming down in the models will impact the industry in the sense that it will affect pricing and how much probable maximum loss people think they’ve got. Most carriers are busy digesting the changes and carrying out due diligence on the new model updates.

“Although the eye-catching change is the possibility of the ‘big one,’ the bigger immediate impact on the industry is what’s happening at lower return periods where we’re selling a lot of coverage,” he says. “LA was a big driver of risk in the California quake portfolio and that’s coming down somewhat, while the risk in San Francisco is going up. So (re)insurers will be considering how to adjust the balance between the LA and San Francisco business they’re writing.”
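To make the metrics in play concrete, the minimal Python sketch below shows how an average annual loss and a return-period loss can be read from a simulated year-loss table. The loss distribution, simulation count and return periods are purely illustrative assumptions, not output from any RMS model or carrier portfolio.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical year-loss table: simulated annual portfolio losses (US$ millions).
# A lognormal draw stands in for catastrophe model output purely for illustration.
n_years = 100_000
annual_losses = rng.lognormal(mean=2.0, sigma=1.5, size=n_years)

# Average annual loss (AAL) is simply the mean of the simulated annual losses.
aal = annual_losses.mean()

# A return-period loss (e.g., 1-in-250) is the corresponding quantile of the
# annual loss distribution: the loss exceeded with probability 1/250 per year.
def return_period_loss(losses, return_period):
    exceedance_prob = 1.0 / return_period
    return np.quantile(losses, 1.0 - exceedance_prob)

for rp in (100, 250, 500, 1000):
    print(f"1-in-{rp} loss: {return_period_loss(annual_losses, rp):,.1f}m")
print(f"AAL: {aal:,.1f}m")
```

Viewed this way, a model update can trim losses at the return periods where most coverage is sold, lowering the AAL, even while it fattens the far tail that drives capital requirements.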

 


Quantifying the resilience dividend

New opportunities arise for risk capital providers and city planners as the resilience movement gets analytical. EXPOSURE explores the potential.

A hundred years ago, a seven-and-a-half-mile seawall was built to protect San Francisco from Mother Nature. It gave the city’s planning department the confidence to develop today’s commercially and culturally rich downtown.

But that iconic waterfront is under threat. The aging seawall has serious seismic vulnerability. Almost $80 billion of San Francisco property is exposed to sea level rise.

To ensure his city’s long-term resilience, Mayor Ed Lee commissioned a plan to design and fund the rebuild of the seawall. A cost of $8 million for the feasibility study last year and $40 million for the preliminary design this year is just the beginning. With an estimated price tag of up to $5 billion, the stakes are high. Getting it wrong is not an option. But getting it right won’t be easy.

San Francisco is no outlier. Investing in resilience is in vogue. Citizens expect their city officials to understand the risks their city faces and to deal with them. The science is there, so citizens want to see their city planning and investing for a robust, resilient city looking fifty or a hundred years ahead. The frequency and severity of natural catastrophes continue to rise. The threat of terror continues to evolve. Reducing damage and disruption when the worst happens has become an imperative across the political spectrum.

Uncertainty around various macro trends complicates the narrative: sea level rise, coastal development, urban densification, fiscal constraints, “disaster deductibles.” Careful planning is required. An informed understanding of how the right intervention leads to a meaningful reduction in risk is higher than ever before on the City Hall agenda.

This has various implications for risk capital providers. Opportunities are emerging to write more profitable business in catastrophe-exposed areas. Municipal buyers are looking for new products that link risk transfer and risk reduction or deliver more than just cash when disaster strikes.

The innovators will win, thinks John Seo, co-founder and managing principal of Fermat Capital Management. “Considerable time and thought must be invested on what to do with funds, both pre- and post-event.

“All municipalities function on a relatively fixed annual budget. Risk transfer smooths the costs of catastrophe risk, which lessens the disruption on ongoing spending and programs. Ideally, risk transfer comes with a plan for what to do with the funds received from a risk transfer payout. That plan is just as valuable, if not more valuable, than the payout itself.”

Resisting a shock in New Orleans

This innovative approach to resilience has become central to New Orleans under Mayor Mitch Landrieu. Partnering with utilities and reinsurance experts, the city examined its drinking water, sanitation and rainwater evacuation facilities to determine their vulnerability to major storms. This analysis provided the basis for investments to ensure these facilities could withstand a shock and continue operating effectively.

“In New Orleans, the city’s pumps are a critical piece of infrastructure. So, the question was: can you create a better nexus between an engineering company with manpower and thought-power to help keep those pumps going, to prepare them in advance of a catastrophe, and align insurance contracts and risk so we are continuing service delivery?” explains Elizabeth Yee, vice president of city solutions at 100 Resilient Cities.

The aim is to focus on disaster response and business continuity, in addition to risk financing. “If there’s an earthquake it’s great the city might receive $10 million to help repair the airport, but what they really need is an airport that is up and running, not just $10 million,” says Yee. “So, there needs to be a way to structure insurance contracts so they better help continue service delivery, as opposed to just providing money.”

There is also the need to reflect the impact of strengthened infrastructure when modeling and pricing the risk. But this isn’t always an easy journey.

In the city of Miami Beach, Mayor Philip Levine decided to raise its roads, so the barrier island’s thoroughfares stay open even in a flood. While the roads remain dry, this intervention has brought some unwelcome consequences.

City residents and business owners are concerned that the runoff will flood adjacent properties. Irrespective of where the water from the streets goes, it is no longer clear whether in-force insurance policies would pay out in the event of flood damage. The ground floor is no longer technically the ground floor; because it now sits below street level, it counts as a basement, as one local restaurateur found out when Allstate denied his $15,000 claim last year.

“That’s an example of the kind of highly nuanced problem government agencies are grappling with all over the world,” explains Daniel Stander, global managing director at RMS. “There are often no quick and easy answers. Economic analysis is essential. Get it wrong and well-intentioned intervention can actually increase the risk — and the cost of insurance with it.

“The interventions you put in place have to reduce the risk in the eyes of the market,” he continues. “If you want to get the credit for your resilience investments, you need to make sure you understand your risk as the market does, and then reduce your risk in its eyes. Get it right, and communities and economies thrive. Get it wrong, and whole neighborhoods become uninsurable, unaffordable, unlivable.”

Retrofitting shelters in Berkeley

Through its partnership with 100 Resilient Cities, RMS is helping a growing number of cities determine which resilience interventions will make the biggest difference.

Knowing that a major Hayward fault rupture would displace up to 12,000 households, with up to 4,000 seeking temporary shelter, the city of Berkeley engaged RMS to ascertain whether the city’s earthquake shelters would withstand the most probable events on the fault. A citywide analysis highlighted that the shelters perform, on average, worse than the surrounding buildings from which residents would flee. The RMS analysis also found that a $17 million seismic retrofit investment plan is substantially more cost-effective and environmentally friendly than rebuilding or repairing structures after an earthquake.
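The cost-effectiveness argument behind such a retrofit can be framed as a simple expected-value comparison. The sketch below uses entirely hypothetical probabilities and loss figures (only the $17 million retrofit cost comes from the article), ignores discounting, and is not the RMS Berkeley analysis.

```python
# Illustrative benefit-cost comparison for a seismic retrofit decision.
# All figures except the retrofit cost are hypothetical placeholders.

retrofit_cost = 17e6            # up-front retrofit investment (US$)
annual_event_prob = 1 / 100     # assumed annual probability of a damaging rupture
loss_without_retrofit = 60e6    # assumed repair/rebuild cost if shelters fail
loss_with_retrofit = 10e6       # assumed residual loss after strengthening
horizon_years = 50              # planning horizon (no discounting applied)

expected_loss_avoided = (
    annual_event_prob * (loss_without_retrofit - loss_with_retrofit) * horizon_years
)
benefit_cost_ratio = expected_loss_avoided / retrofit_cost
print(f"Expected loss avoided over {horizon_years} years: ${expected_loss_avoided/1e6:.0f}m")
print(f"Benefit-cost ratio: {benefit_cost_ratio:.2f}")
```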

“We’ve encouraged our chief resilience officers who are new to a city to learn about their exposures,” explains Yee. “From that baseline understanding, they can then work with someone like RMS to carry out more specific analysis. The work that RMS did with Berkeley helped them to better understand the economic risk posed by an earthquake, and ensured the city was able to secure funding to upgrade earthquake shelters for its residents.”

Rewarding resilience

In parts of the world where the state or national government acts as (re)insurer-of-last-resort, stepping in to cover the cost of a catastrophe, there may be a lack of incentive to improve city resilience, warns Yee. “Many of the residents in my neighborhood have elevated our homes, because we had fish in our yards after Hurricane Sandy,” she says. “But some of our neighbors have decided to wait until the ‘next one’ because there’s this attitude that FEMA (the Federal Emergency Management Agency) will just pay them back for any damage that occurs. We need to change the regulatory framework so that good behavior is incentivized and rewarded.”

“You don’t have to go to emerging markets to find plenty of exposure that is not covered by insurance”— Daniel Stander, RMS

In the U.S., FEMA has suggested the introduction of a “disaster deductible.” This would require recipients of FEMA public assistance funds to expend a predetermined amount of their own funds on emergency management and disaster costs before they receive federal funding. Critically, it is hoped the proposed disaster deductible could “incentivize risk reduction efforts, mitigate future disaster impacts and lower overall recovery costs.”

City resilience framework

The City Resilience Framework, developed by Arup with support from the Rockefeller Foundation, helps clarify the primary factors contributing to resilient cities.  

Resilient cities are more insurable cities, points out Stander. “There are constraints on how much risk can be underwritten by the market in a given city or county. Those constraints bite hardest in high-hazard, high-exposure locations.”

“So, despite an overcapitalized market, there is significant underinsurance,” explains Stander. “You don’t have to go to emerging markets to find plenty of exposure that is not covered by insurance.”

Insurers need not fear that cities’ resilience investments will be to the detriment of premium income. “The insurance industry wants risk to be at an appropriate level,” says Stander. “There are parts of the world where the risk is so high, the industry is rightly reluctant to touch it. Informal neighborhoods throughout South America and South Asia are so poorly constructed they’re practically uninsurable. The insurance industry likes resilience interventions that keep risk insurable at a rate which is both affordable and profitable.”

“Besides, it’s not like you can suddenly make Miami zero-risk,” he adds. “But what you can do as a custodian of a city’s economy is prioritize and communicate resilience interventions that simultaneously reduce rates for citizens and attract private insurance markets. And as a capital provider you can structure products that reward resilient thinking, which help cities monetize their investments in resilience.”

Movements like Rockefeller Foundation‒pioneered 100 Resilient Cities are both responding to and driving this urgency. There is a real and present need for action to meet growing threats.

In San Francisco, investments in resilience are being made now. The city is beyond strategy formulation and on to implementation mode. Shovel-ready projects are required to stem the impacts of 66 inches of sea level rise by 2100. For San Francisco and hundreds of cities and regions around the globe, resilience is a serious business.


Quantifying the economic impact of sea level rise in San Francisco 

In May 2016, RMS published the findings of an analysis into the likely economic impact of sea level rise (SLR) in San Francisco, with the aim to inform the city’s action plan. It found that by the year 2100, $77 billion of property would be at risk from a one-in-100-year extreme storm surge event and that $55 billion of property in low-lying coastal zones could be permanently inundated in the absence of intervention.
The city’s Sea Level Rise Action Plan, which incorporated RMS findings, enabled San Francisco’s mayor to invest $8 million in assessing the feasibility of retrofitting the city’s seawall. The city subsequently commissioned a $40 million contract to design that retrofit program. 


 


A burgeoning opportunity

As traditional (re)insurers hunt for opportunity outside of property catastrophe classes, new probabilistic casualty catastrophe models are becoming available. At the same time, as catastrophe risks are becoming increasingly “manufactured” or human-made, so casualty classes have the potential to be the source of claims after a large “natural” catastrophe.

Just as the growing sophistication of property catastrophe models has enabled industry innovation, there is growing excitement that new tools available to casualty (re)insurers could help to expand the market. Through improved evaluation of casualty clash exposures, (re)insurers will be better able to understand, price and manage their exposures, as well as design new products that cater to underserved areas.

However, the casualty market must switch from pursuing a purely defensive strategy. “There is an ever-growing list of exclusions in liability insurance and interest in the product is declining with the proliferation of these exclusions,” explains Dr. Robert Reville, president and CEO of Praedicat, the world’s first liability catastrophe modeling company. “There is a real growth opportunity for the industry to deal with these exclusions and recognize where they can confidently write more business.

“Industry practitioners look at what’s happened in property — where modeling has led to a lot of new product ideas, including capital market solutions, and a lot of innovation — and casualty insurers are hungry for that sort of innovation, for the same sort of transformation in liability that happened in property,” he adds.

Perils — particularly emerging risks that underwriters have struggled to price, manage and understand — have typically been excluded from casualty products. This includes electromagnetic fields (EMFs), such as those emanating from broadcast antennas and cell phones. Cover for such exposures is restricted, particularly for the U.S. market, where it is often excluded entirely. Some carriers will not offer any cover at all if the client has even a remote exposure to EMF risks. Yet are they being over-apprehensive about the risk?

The fear that leads to an over-application of exclusions is very tangible. “The latency of the disease development process — or the way a product might be used, with more people becoming exposed over time — causes there to be a build-up of risk that may result in catastrophe,” Reville continues. “Insurers want to be relevant to insuring innovation in product, but they have to come to terms with the latency and the potential for a liability catastrophe that might emerge from it.”

Unique nature of casualty catastrophe

It is a misconception that casualty is not a catastrophe class of business. Reville points out that the industry’s US$100 billion-plus loss relating to asbestos claims is arguably its biggest-ever catastrophe. Within the Lloyd’s market the overwhelming nature of APH (asbestos, pollution and health) liabilities contributed to the market’s downward spiral in the late 1980s, only brought under control through the formation of the run-off entity Equitas, now owned and managed by Warren Buffett’s Berkshire Hathaway.

As the APH claims crisis demonstrated, casualty catastrophes differ from property catastrophes in that they are a “two-tailed loss.” There is the “tail loss” both have in common, which describes the low-probability, high-severity characteristics — or high return period — of a major event. But in addition, casualty classes of business are “long-tail” in nature. This means that a policy written in 2017 may not experience a claim until 20 years later, providing an additional challenge from a modeling and reserving perspective.

“Casualty insurers are hungry for that sort of innovation, for the same sort of transformation in liability that happened in property” — Robert Reville, Praedicat

Another big difference between casualty clash and property catastrophe from a modeling perspective is that the past is not a good indication of future claims. “By the time asbestos litigation had really taken off, it was already a banned product in the U.S., so it was not as though asbestos claims were any use in trying to figure out where the next environmental disaster or next product liability was going to be,” says Reville. “So, we needed a forward-looking approach to identify where there could be new sources of litigation.”

With the world becoming both more interconnected and more litigious, there is every expectation that future casualty catastrophe losses could be much greater and impact multiple classes of business. “The reality is there’s serial aggregation and systemic risk within casualty business, and our answer to that has generally been that it’s too difficult to quantify,” says Nancy Bewlay, chief underwriting officer, global casualty, at XL Catlin. “But the world is changing. We now have technology advances and data collection capabilities we never had before, and public information that can be used in the underwriting process.

“Take the Takata airbag recall,” she continues. “In 2016, they had to recall 100 million airbags worldwide. It affected all the major motor manufacturers, who then faced the accumulation potential not only of third-party liability claims, but also product liability and product recall. Everything starts to accumulate and combine within that one industry, and when you look at the economic footprint of that throughout the supply chain there’s a massive potential for a casualty catastrophe when you see how everything is interconnected.”

RMS chief research officer Robert Muir-Wood explains: “Another area where we can expect an expansion of modeling applications concerns casualty lines picking up losses from more conventional property catastrophes. This could occur when the cause of a catastrophe can be argued to have ‘non-natural’ origins, and particularly where there are secondary ‘cascade’ consequences of a catastrophe — such as a dam failing after a big earthquake or for claims on ‘professional lines’ coverages of builders and architects — once it is clear that standard property insurance lines will not compensate for all the building damage.”

“This could be prevalent in regions with low property catastrophe insurance penetration, such as in California, where just one in ten homeowners has earthquake cover. In the largest catastrophes, we could expect claims to be made against a wide range of casualty lines. The big innovation around property catastrophe in particular was to employ high-resolution GIS [geographic information systems] data to identify the location of all the risk. We need to apply similar location data to casualty coverages, so that we can estimate the combined consequences of a property/casualty clash catastrophe.”

One active instance, cited by Muir-Wood, of this shift from property to casualty coverages concerns earthquakes in Oklahoma. “There are large amounts of wastewater left over from fracking, and the cheapest way of disposing of it is to pump it down deep boreholes. But this process has been triggering earthquakes, and these earthquakes have started getting quite big — the largest so far, in September 2016, had a magnitude of M5.8.

“At present the damage to buildings caused by these earthquakes is being picked up by property insurers,” he continues. “But what you will see over time are lawsuits to try and pass the costs back to the operators of the wells themselves. Working with Praedicat, RMS has done some modeling work on how these operators can assess the risk cost of adding a new disposal well. Clearly the larger the earthquake, the less likely it is to occur. However, the costs add up: our modeling shows that an earthquake bigger than M6 right under Oklahoma City could cause more than US$10 billion of damage.”
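The trade-off Muir-Wood describes, rarer but far costlier as magnitude rises, can be tallied as an expected annual risk cost. The annual occurrence rates and damage figures in the sketch below are invented for illustration and are not the RMS/Praedicat modeling results.

```python
# Toy expected annual risk cost for induced seismicity near a disposal well.
# Magnitudes, annual rates and damage estimates are all hypothetical.

scenarios = [
    # (magnitude, assumed annual rate, assumed damage in US$)
    (4.5, 0.20,   50e6),
    (5.0, 0.05,  400e6),
    (5.8, 0.01,  2e9),
    (6.2, 0.002, 10e9),
]

# Expected annual cost is the rate-weighted sum of damages across scenarios.
expected_annual_cost = sum(rate * damage for _, rate, damage in scenarios)
print(f"Expected annual risk cost: ${expected_annual_cost/1e6:.0f}m")

# The rarest, largest events dominate the tail of the loss distribution even
# though they contribute only modestly to the annual expectation.
```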

Muir-Wood adds: “The challenge is that casualty insurance tends to cover many potential sources of liability in the contract, and the operators of the wells and, we believe, their insurers are not currently identifying this particular — and potentially catastrophic — source of future claims. There’s the potential for a really big loss that would eventually fall onto the liability writers of these deep wells ... and they are not currently pricing for this risk, or managing their portfolios of casualty lines.”

A modeled class of business

According to Reville, the explosion of data and development of data science tools have been key to the development of casualty catastrophe modeling. The opportunity to develop probabilistic modeling for casualty classes of business was born in the mid-2000s when Reville was senior economist at the RAND Corporation.

At that time, RAND was using data from the RMS® Probabilistic Terrorism Model to help inform the U.S. Congress in its decision on the renewal of the Terrorism Risk Insurance Act (TRIA). Separately, it had written a paper on the scope and scale of asbestos litigation and its potential future course.

“As we were working on these two things it occurred to us that here was this US$100 billion loss — this asbestos problem — and adjacently within property catastrophe insurance there was this developed form of analytics that was helping insurers solve a similar problem. So, we decided to work together to try and figure out if there was a way of solving the problem on the liability side as well,” adds Reville.

Eventually Praedicat was spun out of the initial project as its own brand, launching its first probabilistic liability catastrophe model in summer 2016. “The industry has evolved a lot over the past five years, in part driven by Solvency II and heightened interest from the regulators and rating agencies,” says Reville. “There is a greater level of concern around the issue, and the ability to apply technologies to understand risk in new ways has evolved a lot.”

There are obvious benefits to (re)insurers from a pricing and exposure management perspective. “The opportunity is changing the way we underwrite,” says Bewlay. “Historically, we underwrote by exclusion with a view to limiting our maximum loss potential. We couldn’t get a clear understanding of our portfolio because we didn’t have enough meaningful, statistical and credible data.”

“We feel they are not being proactive enough because ... there’s the potential for a really big loss that would fall onto the liability writers of these deep wells”— Robert Muir-Wood, RMS

Then there are the exciting opportunities for growth in a market where there is intense competition and downward pressure on rates. “Now you can take a view on the ‘what-if’ scenario and ask: how much loss can I handle and what’s the probability of that happening?” she continues. “So, you can take on managed risk. Through the modeling you can better understand your industry classes and what could happen within your portfolio, and can be slightly more opportunistic in areas where previously you may have been extremely cautious.”

Not only does this expand the potential range of casualty insurance and reinsurance products, it should allow the industry to better support developments in burgeoning industries. “Cyber is a classic example,” says Bewlay. “If you can start to model the effects of a cyber loss you might decide you’re OK providing cyber in personal lines for individual homeowners in addition to providing cyber in a traditional business or technology environment.

“You would start to model all three of these scenarios and what your potential market share would be to a particular event, and how that would impact your portfolio,” she continues. “If you can answer those questions utilizing your classic underwriting and actuarial techniques, a bit of predictive modeling in there — this is the blend of art and science — you can start taking opportunities that possibly you couldn’t before.”


The Future of (Re)Insurance: Evolution of the Insurer DNA

The (re)insurance industry is at a tipping point. Rapid technological change, disruption through new, more efficient forms of capital and an evolving risk landscape are challenging industry incumbents like never before. Inevitably, as EXPOSURE reports, the winners will be those who find ways to harmonize analytics, technology, industry innovation, and modeling.

There is much talk of disruptive innovation in the insurance industry. In personal lines insurance, disintermediation, the rise of aggregator websites and the Internet of Things (IoT) – such as connected car, home, and wearable devices – promise to transform traditional products and services. In the commercial insurance and reinsurance space, disruptive technological change has been less obvious, but behind the scenes the industry is undergoing some fundamental changes.

The tipping point

The ‘Uber’ moment has yet to arrive in reinsurance, according to Michael Steel, global head of solutions at RMS. “The change we’re seeing in the industry is constant. We’re seeing disruption throughout the entire insurance journey. It’s not the case that the industry is suffering from a short-term correction and then the market will go back to the way it has done business previously. The industry is under huge competitive pressures and the change we’re seeing is permanent and it will be continuous over time.”

Experts feel the industry is now at a tipping point. Huge competitive pressures, rising expense ratios, an evolving risk landscape and rapid technological advances are forcing change upon an industry that has traditionally been considered somewhat of a laggard. And the revolution, when it comes, will be a quick one, thinks Rupert Swallow, co-founder and CEO of Capsicum Re.

“WE’RE SEEING DISRUPTION THROUGHOUT THE ENTIRE INSURANCE JOURNEY”

— MICHAEL STEEL, RMS

Other sectors have plenty of cautionary tales on what happens when businesses fail to adapt to a changing world, he explains. “Kodak was a business that in 1998 had 120,000 employees and printed 95 percent of the world’s photographs. Two years later, that company was bankrupt as digital cameras built their presence in the marketplace. When the tipping point is reached, the change is radical and fast and fundamental.”

While it is impossible to predict exactly how the industry will evolve going forward, it is clear that tomorrow’s leading (re)insurance companies will share certain attributes. This includes a strong appetite to harness data and invest in new technology and analytics capabilities, the drive to differentiate and design new products and services, and the ability to collaborate.  According to Eric Yau, general manager of software at RMS, the goal of an analytic-driven organization is to leverage the right technologies to bring data, workflow and business analytics together to continuously drive more informed, timely and collaborative decision making across the enterprise.

“New technologies play a key role and while there are many choices with the rise of insurtech firms, history shows us that success is achieved only when the proper due diligence is done to really understand and assess how these technologies enable the longer term business strategy, goals and objectives,” says Yau.

Yau says one of the most important ingredients to success is the ability to effectively blend the right team of technologists, data scientists and domain experts who can work together to understand and deliver upon these key objectives.

The most successful companies will also look to attract and retain the best talent, with succession planning that puts a strong emphasis on bringing Millennials up through the ranks. “There is a huge difference between the way Millennials look at the workplace and live their lives, versus industry professionals born in the 1960s or 1970s - the two generations are completely different,” says Swallow. “Those guys [Millennials] would no sooner write a cheque to pay for something than fly to the moon.”

Case for collaboration

If (re)insurers drag their heels in embracing and investing in new technology and analytics capabilities, disruption could well come from outside the industry. Back in 2015, Lloyd’s CEO Inga Beale warned that insurers were in danger of being “Uber-ized” as technology allows companies from Google to Walmart to undermine the sector’s role of managing risk.

Her concerns are well founded, with Google launching a price comparison site in the U.S. and Rakuten and Alibaba, Japan and China’s answers to Amazon respectively, selling a range of insurance products on their platforms.

“No area of the market is off-limits to well-organized technology companies that are increasingly encroaching everywhere,” says Rob Procter, CEO of Securis Investment Partners. “Why wouldn’t Google write insurance… particularly given what they are doing with autonomous vehicles? They may not be insurance experts but these technology firms are driving the advances in terms of volumes of data, data manipulation, and speed of data processing.”

Procter makes the point that the reinsurance industry has already been disrupted by the influx of third-party capital into the ILS space over the past 10 to 15 years. Collateralized products such as catastrophe bonds, sidecars and non-traditional reinsurance have fundamentally altered the reinsurance cycle and exposed the industry’s inefficiencies like never before.

“We’ve been innovators in this industry because we came in ten or 15 years ago, and we’ve changed the way the industry is structured and is capitalized and how the capital connects with the customer,” he says. “But more change is required to bring down expenses and to take out what are massive friction costs, which in turn will allow reinsurance solutions to be priced competitively in situations where they are not currently.

“It’s astounding that 70 percent of the world’s catastrophe losses are still uninsured,” he adds. “That statistic has remained unchanged for the last 20 years. If this industry was more efficient it would be able to deliver solutions that work to close that gap.”

Collaboration is the key to leveraging technology – or insurtech – expertise and getting closer to the original risk. There are numerous examples of tie-ups between (re)insurance industry incumbents and tech firms. Others have set up innovation garages or bought their way into innovation, acquiring or backing niche start-up firms. Silicon Valley, Israel’s Silicon Wadi, India’s tech capital Bangalore and Shanghai in China are now among the favored destinations for scouting visits by insurance chief innovation officers.

One example of a strategic collaboration is the MGA Attune, set up last year by AIG, Hamilton Insurance Group, and affiliates of Two Sigma Investments. Through the partnership, AIG gained access to Two Sigma’s vast technology and data-science capabilities to grow its market share in the U.S. small to mid-sized commercial insurance space.

“The challenge for the industry is to remain relevant to our customers,” says Steel. “Those that fail to adapt will get left behind. To succeed you’re going to need greater information about the underlying risk, the ability to package the risk in a different way, to select the appropriate risks, differentiate more, and construct better portfolios.”

Investment in technology in and of itself is not the solution, thinks Swallow. He thinks there has been too much focus on process and not enough on product design. “Insurtech is an amazing opportunity but a lot of people seem to spend time looking at the fulfilment of the product – what ‘Chily’ [Swallow’s business partner and industry guru Grahame Chilton] would call ‘plumbing’.

“In our industry, there is still so much attention on the ‘plumbing’ and the fact that the plumbing doesn’t work, that insurtech isn’t yet really focused on compliance, regulation of product, which is where all the real gains can be found, just as they have been in the capital markets,” adds Swallow.

Taking out the friction

Blockchain, however, states Swallow, is “plumbing on steroids”. “Blockchain is nothing but pure, unadulterated disintermediation. My understanding is that if certain events happen at the beginning of the chain, then there is a defined outcome that actually happens without any human intervention at the other end of the chain.”

In January, Aegon, Allianz, Munich Re, Swiss Re, and Zurich launched the Blockchain Insurance Industry Initiative, a “$5 billion opportunity” according to PwC. The feasibility study will explore the potential of distributed ledger technologies to better serve clients through faster, more convenient and secure services.

“BLOCKCHAIN FOR THE REINSURANCE SPACE IS AN EFFICIENCY TOOL. AND IF WE ALL GET MORE EFFICIENT, YOU ARE ABLE TO INCREASE INSURABILITY BECAUSE YOUR PRICES COME DOWN”

— KURT KARL, SWISS RE

Blockchain offers huge potential to reduce some of the significant administrative burdens in the industry, thinks Kurt Karl, chief economist at Swiss Re. “Blockchain for the reinsurance space is an efficiency tool. And if we all get more efficient, you are able to increase insurability because your prices come down, and you can have more affordable reinsurance and therefore more affordable insurance. So I think we all win if it’s a cost saving for the industry.”

Collaboration will enable those with scale to behave like nimble start-ups, explains Karl. “We like scale. We’re large. I’ll be blunt about that,” he says. “For the reinsurance space, what we do is to leverage our size to differentiate ourselves. With size, we’re able to invest in all these new technologies and then understand them well enough to have a dialogue with our clients. The nimbleness doesn’t come from small insurers; the nimbleness comes from insurance tech start-ups.”

He gives the example of Lemonade, the peer-to-peer start-up insurer that launched last summer, selling discounted homeowners’ insurance in New York. Working off the premise that insurance customers lack trust in the industry, Lemonade’s business model is based around returning premium to customers when claims are not made. In its second round of capital raising, Lemonade secured funding from XL Group’s venture fund, also a reinsurance partner of the innovative new firm. The firm is also able to offer faster, more efficient claims processing.

“Lemonade’s [business model] is all about efficiency and the cost saving,” says Karl. “But it’s also clearly of benefit to the client, which is a lot more appealing than a long, drawn-out claims process.”

Tearing up the rule book

By collecting and utilizing data from customers and third parties, personal lines insurers are now able to offer more customized products and, in many circumstances, improve the underlying risk. Customers can win discounts for protecting their homes and other assets, maintaining a healthy lifestyle and driving safely. In a world where products are increasingly designed with the digital native in mind, drivers can pay-as-they-go and property owners can access cheaper home insurance via peer-to-peer models.

Reinsurers may be one step removed from this seismic shift in how the original risk is perceived and underwritten, but just as personal lines insurers are tearing up the rule book, so too are their risk partners. It is over 300 years since the first marine and fire insurance policies were written. In that time (re)insurance has expanded significantly with a range of property, casualty, and specialty products.

However, the wordings contained in standard (re)insurance policies, the involvement of a broker in placing the business and the face-to-face transactional nature of the business – particularly within the London market – have not altered significantly over the past three centuries. Some are questioning whether these traditional indemnity products are the right solution for all classes of risk.

“We think people are often insuring cyber against the wrong things,” says Dane Douetil, group CEO of Minova Insurance. “They probably buy too much cover in some places and not nearly enough in areas where they don’t really understand they’ve got a risk. So we’re starting from the other way around, which is actually providing analysis about where their risks are and then creating the policy to cover it.”

“There has been more innovation in intangible type risks, far more in the last five to ten years than probably people give credit for. Whether you’re talking about cyber, product recall, new forms of business interruption, intellectual property or the huge growth in mergers and acquisition coverages against warranty and indemnity claims – there’s been a lot of development in all of those areas and none of that existed ten years ago.”

Closing the gap

Access to new data sources along with the ability to interpret and utilize that information will be a key instrument in improving the speed of settlement and offering products that are fit for purpose and reflect today’s risk landscape. “We’ve been working on a product that just takes all the information available from airlines, about delays and how often they happen,” says Karl. “And of course you can price off that; you don’t need the loss history, all you need is the probability of the loss, how often does the plane have a five-hour delay?”

“All the travel underwriters then need to do is price it ‘X’, and have a little margin built-in, and then they’re able to offer a nice new product to consumers who get some compensation for the frustration of sitting there on the tarmac.”
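Karl’s airline example is, in essence, frequency-based parametric pricing: observe how often the trigger occurs, multiply by the fixed payout and add a margin. The sketch below walks through that arithmetic with made-up inputs; it is not an actual Swiss Re or travel-insurer pricing model.

```python
# Illustrative parametric flight-delay pricing (all inputs hypothetical).

payout = 200.00          # fixed compensation paid if the trigger is met (US$)
prob_trigger = 0.004     # observed frequency of a 5-hour-plus delay per ticket
expense_loading = 0.25   # margin for expenses and profit, as a share of premium

# Expected loss is just frequency times the fixed payout; the premium grosses
# this up so that the loading is retained after paying expected claims.
expected_loss = prob_trigger * payout
premium = expected_loss / (1 - expense_loading)

print(f"Expected loss per ticket: ${expected_loss:.2f}")
print(f"Premium charged:          ${premium:.2f}")
```

Because the payout is triggered by the delay itself rather than by an adjusted claim, no loss history or claims handling is needed, which is what keeps such products cheap to administer.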

With more esoteric lines of business such as cyber, parametric products could be one solution to providing meaningful coverage for a rapidly-evolving corporate risk. “The corporates of course want indemnity protection, but that’s extremely difficult to do,” says Karl. “I think there will be some of that but also some parametric, because it’s often a fixed payout that’s capped and is dependent upon the metric, as opposed to indemnity, which could well end up being the full value of the company. Because you can potentially have a company destroyed by a cyber-attack at this point.”

One issue to overcome with parametric products is the basis risk aspect. This is the risk that an insured suffers a significant loss of income, but its cover is not triggered. However, as data and risk management improves, the concerns surrounding basis risk should reduce.

Improving the underlying risk

The evolution of the cyber (re)insurance market also points to a new opportunity in a data-rich age: pre-loss services. By tapping into a wealth of claims and third-party data sources, successful (re)insurers of the future will be in an even stronger position to help their insureds become resilient and incident-ready. In cyber, these services are already part of the package and include security consultancy, breach-response services and simulated cyber attacks to test the fortitude of corporate networks and raise awareness among staff. “We’ve heard about the three ‘Vs’ when harnessing data – velocity, variety, and volume – in our industry we need to add a fourth, veracity,” says Yau. “When making decisions around which risks to write, our clients need to have allocated the right capital to back that decision or show regulators what parameters fed that decision.”

“WE DO A DISSERVICE TO OUR INDUSTRY BY SAYING THAT WE’RE NOT INNOVATORS, THAT WE’RE STUCK IN THE PAST”

— DANE DOUETIL, MINOVA INSURANCE

IoT is not just an instrument for personal lines. Just as insurance companies are utilizing data collected from connected devices to analyze individual risks and feedback information to improve the risk, (re)insurers also have an opportunity to utilize third-party data. “GPS sensors on containers can allow insurers to monitor cargo as it flows around the world – there is a use for this technology to help mitigate and manage the risk on the front end of the business,” states Steel.

Information is only powerful if it is analyzed effectively and available in real-time as transactional and pricing decisions are made, thinks RMS’ Steel. “The industry is getting better at using analytics and ensuring the output of analytics is fed directly into the hands of key business decision makers.”

“It’s about using things like portfolio optimization, which even ten years ago would have been difficult,” he adds. “As you’re using the technologies that are available now you’re creating more efficient capital structures and better, more efficient business models.”

Minova’s Douetil thinks the industry is stepping up to the plate. “Insurance is effectively the oil that lubricates the economy,” he says. “Without insurance, as we saw with the World Trade Center disaster and other catastrophes, the whole economy could come to a grinding halt pretty quickly if you take the ‘oil’ away.”

“That oil has to continually adapt and be innovative in terms of being able to serve the wider economy,” he continues. “But I think we do a disservice to our industry by saying that we’re not innovators, that we’re stuck in the past. I just think about how much this business has changed over the years.”

“It can change more, without a doubt, and there is no doubt that the communication capabilities that we have now mean there will be a shortening of the distribution chain,” he adds. “That’s already happening quite dramatically and in the personal lines market, obviously even more rapidly.”


Managing the next financial shock

EXPOSURE reports on how a pilot project to stress test banks’ exposure to drought could hold the key to future economic resilience.

There is a growing recognition that environmental stress testing is a crucial instrument to ensure a sustainable financial system. In December 2016, the Task Force on Climate-related Financial Disclosures (TCFD) released its recommendations for effective disclosure of climate-related financial risks.

“This represents an important effort by the private sector to improve transparency around climate-related financial risks and opportunities,” said Michael Bloomberg, chair of the TCFD. “Climate change is not only an environmental problem, but a business one as well. We need business leaders to join us to help spread these recommendations across their industries in order to help make markets more efficient and economies more stable, resilient and sustainable.”

Why drought?

Drought is a significant potential source of shock to the global financial system. There is a common misconception that sustained lack of water is primarily a problem for agriculture and food production. In Europe alone, it is estimated that around 40 percent of total water extraction is used for industry and energy production (cooling in power plants) and 15 percent for public water supply. The main water consumption sectors are irrigation, utilities and manufacturing.

The macro-economic impact of a prolonged or systemic drought could therefore be severe, and is currently the focus of a joint project between RMS and a number of leading financial institutions and development agencies to stress test lending portfolios to see how they would respond to environmental risk.

“ONLY BY BRINGING TOGETHER MINISTERIAL LEVEL GOVERNMENT OFFICIALS WITH LEADERS IN COMMERCE CAN WE ADDRESS THE WORLD’S BIGGEST ISSUES”

— DANIEL STANDER, RMS

“Practically every industry in the world has some reliance on water availability in some shape or form,” states Stephen Moss, director, capital markets at RMS. “And, as we’ve seen, as environmental impacts become more frequent and severe, so there is a growing awareness that water — as a key future resource — is starting to become more acute.”

“So the questions are: do we understand how a lack of water could impact specific industries and how that could then flow down the line to all the industrial activities that rely on the availability of water? And then how does that impact on the broader economy?” he continues. “We live in a very interconnected world and as a result, the impact of drought on one industry sector or one geographic region can have a material impact on adjacent industries or regions, regardless of whether they themselves are impacted by that phenomenon or not.”

This interconnectivity is at the heart of why a hazard such as drought could become a major systemic threat for the global financial system, explains RMS scientist, Dr. Navin Peiris. “You could have an event or drought occurring in the U.S. and any reduction in production of goods and services could impact global supply chains and draw in other regions due to the fact the world is so interconnected.”

The ability to model how drought is likely to impact banks’ loan default rates will enable financial institutions to measure and control the risk more accurately. And if banks adjust their own risk management practices accordingly, there should be a positive knock-on effect that ripples down, as banks are motivated to encourage better water conservation behaviors amongst their corporate borrowers, explains Moss.

“The expectation would be that in the same way that an insurance company incorporates the risk of having to pay out on a large natural event, a bank should also be incorporating that into their overall risk assessment of a corporate when providing a loan – and including that incremental element in the pricing,” he says. “And just as insureds are motivated to defend themselves against flood or to put sprinklers in the factories in return for a lower premium, if you could provide financial incentives to borrowers through lower loan costs, businesses would then be encouraged to improve their resilience to water shortage.”
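One way to picture that “incremental element” is the standard expected-loss decomposition used in loan pricing, with a drought stress applied to the probability of default. The uplift factor and loan parameters below are hypothetical placeholders, not outputs of the drought stress-testing tool described in this article.

```python
# Illustrative drought adjustment to a corporate loan spread.
# PD = probability of default, LGD = loss given default, EAD = exposure at default.
# The drought uplift to PD is a hypothetical stress factor, not a model output.

baseline_pd = 0.015          # annual probability of default without drought stress
drought_pd_uplift = 1.4      # assumed multiplier on PD under a severe-drought scenario
lgd = 0.45                   # assumed loss given default
ead = 10_000_000             # loan exposure at default (US$)

expected_loss_base = baseline_pd * lgd * ead
expected_loss_drought = baseline_pd * drought_pd_uplift * lgd * ead

# The incremental expected loss, expressed in basis points of exposure,
# is the element Moss suggests should flow into the loan's pricing.
incremental_bps = (expected_loss_drought - expected_loss_base) / ead * 10_000
print(f"Baseline expected loss:  ${expected_loss_base:,.0f}")
print(f"Drought-stressed loss:   ${expected_loss_drought:,.0f}")
print(f"Pricing increment:       {incremental_bps:.1f} bps")
```

In this framing, a borrower that invests in water resilience would attract a smaller uplift and, in turn, a cheaper loan.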

A critical stress test

In May 2016, the Natural Capital Finance Alliance, which is made up of the Global Canopy Programme (GCP) and the United Nations Environment Programme Finance Initiative, teamed up with the Emerging Markets Dialogue on Finance (EMDF) of Deutsche Gesellschaft für Internationale Zusammenarbeit (GIZ) GmbH and several leading financial institutions to launch a project to pilot scenario modeling.

“THERE IS A GROWING AWARENESS THAT WATER — AS A KEY FUTURE RESOURCE — IS STARTING TO BECOME MORE ACUTE”

— STEPHEN MOSS, RMS

Funded by the German Federal Ministry for Economic Cooperation and Development (BMZ), RMS was appointed to develop a first-of-its-kind drought model. The aim is to help financial institutions and wider economies become more resilient to extreme droughts, as Yannick Motz, head of the emerging markets dialogue on finance, GIZ, explains.

“GIZ has been working with financial institutions and regulators from G20 economies to integrate environmental indicators into lending and investment decisions, product development and risk management. Particularly in the past few years, we have experienced a growing awareness in the financial sector for climate-related risks.”

The Dust Bowl – The first distinct drought (1930–1931) of the ‘dust bowl’ years affected much of the northeastern and western U.S.

“The lack of practicable methodologies and tools that adequately quantify, price and assess such risks, however, still impedes financial institutions in fully addressing and integrating them into their decision-making processes,” he continues. “Striving to contribute to filling this gap, GIZ and NCFA initiated this pilot project with the objective to develop an open-source tool that allows banks to assess the potential impact of drought events on the performance of their corporate loan portfolio.”

It is a groundbreaking project between key stakeholders across public and private sectors, according to RMS managing director Daniel Stander. “There are certain things in this world that you can only get done at a Davos level. You need to bring ministerial-level government officials and members of commerce together. It’s only that kind of combination that is going to address the world’s biggest issues. At RMS, experience has taught us that models don’t just solve problems. With the right level of support, they can make markets and change behaviors as well. This initiative is a good example of that.”

RMS adapted well-established frameworks from the insurance sector to build – in a consortium complemented by the Universities of Cambridge and Oxford – a tool for banks to stress test the impact of drought. The model was built in close collaboration with several financial institutions, including the Industrial and Commercial Bank of China (ICBC), Caixa Econômica Federal, Itaú and Santander in Brazil, Banorte, Banamex and Trust Funds for Rural Development (FIRA) in Mexico, UBS in Switzerland and Citigroup in the U.S.

“Some of the largest losses we saw in some of our scenarios were not necessarily as a result of an industry sector not having access to water, but because other industry sectors didn’t have access to water, so demand dropped significantly and those companies were therefore not able to sell their wares. This was particularly true for petrochemical businesses that are heavily reliant on the health of the broader economy,” explains Moss. “So, this model is a broad framework that incorporates domestic interconnectivity and trade, as well as global macroeconomic effects.”
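The interconnectivity Moss describes is the kind of effect input-output economics is designed to capture. The toy Leontief calculation below, for an invented two-sector economy, shows how a drought-driven drop in demand for one sector’s goods depresses output in another sector that never loses access to water. It illustrates the general technique only; the consortium’s framework is far richer.

```python
import numpy as np

# Toy two-sector Leontief input-output economy (manufacturing, petrochemicals).
# A[i, j] is the input from sector i needed per unit of output of sector j.
# All coefficients and demand figures are hypothetical.
A = np.array([[0.20, 0.30],
              [0.25, 0.10]])
final_demand = np.array([100.0, 80.0])   # baseline final demand (arbitrary units)

identity = np.eye(2)
baseline_output = np.linalg.solve(identity - A, final_demand)

# Drought scenario: water shortages cut final demand for manufacturing by 20%,
# for example because its customers cannot operate at full capacity.
shocked_demand = final_demand * np.array([0.80, 1.00])
shocked_output = np.linalg.solve(identity - A, shocked_demand)

for name, base, shock in zip(("manufacturing", "petrochemicals"),
                             baseline_output, shocked_output):
    print(f"{name:15s} output falls {100 * (1 - shock / base):.1f}%")
```

Even in this two-sector toy, petrochemical output falls despite its own demand being untouched, which is the demand-side knock-on Moss highlights.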

There is significant scope to apply this approach to modeling other major threats and potential sources of global economic shock, including natural, manmade and emerging perils. “The know-how we’ve applied on this project can be used to evaluate the potential impacts of other stresses,” explains Peiris. “Drought is just one environmental risk facing the financial services industry. This approach can be replicated to measure the potential impact of other systemic risks on macro and micro economic scales.”


The day a botnet took down the internet

The Dyn distributed denial of service (DDoS) attack in October 2016 highlighted security flaws inherent in the Internet of Things (IoT). EXPOSURE asks what this means for businesses and insurers as the world becomes increasingly connected.

A decade ago, Internet connections were largely limited to desktop computers, laptops, tablets, and smart phones. Since then there has been an explosion of devices with IP addresses, including baby monitors, connected home appliances, motor vehicles, security cameras, webcams, ‘Fitbits’ and other wearables. Gartner predicts there will be 20.8 billion things connected to the Internet by 2020.

In a hyper-connected world, governments, corporates, insurers and banks need to better understand the potential for systemic and catastrophic risk arising from a cyber attack seeking to exploit IoT vulnerabilities. With few actual examples of how such attacks could play out, realistic disaster scenarios and cyber modeling are essential tools by which (re)insurers can manage their aggregate exposures and stress test their portfolios.

“IF MALICIOUS ACTORS WANTED TO, THEY WOULD ATTACK CORE SERVICES ON THE INTERNET AND I THINK WE’D BE SEEING A NEAR GLOBAL OUTAGE”

— KEN MUNRO, PEN TEST PARTNERS

Many IoT devices currently on the market were not designed with strict IT security in mind. Ethical hackers have demonstrated how everything from cars to children’s toys can be compromised. These connected devices are often an organization’s weakest link. The cyber criminals responsible for the 2013 Target data breach are understood to have gained access to the retailer’s systems and the credit card details of over 40 million customers via the organization’s heating, ventilation and air conditioning (HVAC) system.

The assault on DNS hosting firm Dyn in October 2016, which brought down multiple websites including Twitter, Netflix, Amazon, Spotify, Reddit, and CNN in Europe and the U.S., was another wake-up call. The DDoS attack was perpetrated using the Mirai virus to compromise IoT systems. Like a parasite, the malware gained control of an estimated 100,000 devices, using them to bombard and overwhelm Dyn’s infrastructure.

This is just the tip of the iceberg, according to Ken Munro, partner, Pen Test Partners. “My first thought [following the Dyn attack] was ‘you ain’t seen nothing yet’. That particular incident was probably using the top end of a terabit of data per second, and that’s nothing. We’ve already seen a botnet that is several orders of magnitude larger than that. If malicious actors wanted to, they would attack core services on the Internet and I think we’d be seeing a near global outage.”

In the rush to bring new IoT devices to market, IT security has been somewhat of an afterthought, thinks Munro. The situation is starting to change, though, with consumer watchdogs in Norway, the Netherlands and the U.S. taking action. However, there is a significant legacy problem to overcome and it will be several years before current security weaknesses are tackled in a meaningful way.

“I’ve still got our first baby monitor from 10 years ago,” he points out. “The Mirai botnet should have been impossible, but it wasn’t because a whole bunch of security camera manufacturers did a really cheap job. IT security wasn’t on their radar. They were thinking about keeping people’s homes secure without even considering that the device itself might actually be the problem.”

In attempting to understand the future impact of such attacks, it is important to gain a better understanding of motivation. For cyber criminals, DDoS attacks using IoT botnets could be linked to extortion attempts or to diverting the attention of IT professionals away from other activities. For state-sponsored actors, the purpose could be more sinister, with the intent to cause widespread disruption, and potentially physical damage and bodily harm.

Insurers stress-test “silent” cyber

It is the latter scenario that is of growing concern to risk and insurance managers. Lloyd’s, for instance, has asked syndicates to create at least three internal “plausible but extreme” cyber attack scenarios as stress-tests for cyber catastrophe losses. It has asked them to calculate their total gross aggregate exposure to each scenario across all classes, including “silent” cyber.
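Mechanically, that request boils down to summing, for each scenario, an assumed damage factor times the sum insured across every class of business, silent classes included. The book, scenario names and factors below are hypothetical, intended only to show the shape of the calculation, not any syndicate’s actual stress test.

```python
# Illustrative gross aggregate exposure per cyber scenario, including "silent" classes.
# Sums insured and damage factors are hypothetical.

book = {
    # class of business: total sum insured (US$ millions)
    "affirmative cyber": 1_500,
    "property":          8_000,   # potentially exposed via silent cyber
    "marine & cargo":    2_500,
    "D&O":               1_200,
}

scenarios = {
    # scenario name: assumed share of each class's sum insured lost
    "power grid attack": {"affirmative cyber": 0.10, "property": 0.04,
                          "marine & cargo": 0.01, "D&O": 0.02},
    "cloud outage":      {"affirmative cyber": 0.15, "property": 0.00,
                          "marine & cargo": 0.00, "D&O": 0.03},
}

for name, factors in scenarios.items():
    # Gross aggregate exposure: sum of (sum insured x damage factor) across classes.
    gross = sum(book[cls] * factors.get(cls, 0.0) for cls in book)
    print(f"{name:18s} gross aggregate exposure: {gross:,.0f}m")
```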

AIG is also considering how a major cyber attack could impact its book of business. “We are looking at it, not only from our own ERM perspective, but also to understand what probable maximum losses there could be as we start to introduce other products and are able to attach cyber to traditional property and casualty policies,” explains Mark Camillo, head of cyber at AIG. “We look at different types of scenarios and how they would impact a book.”

AIG and a number of Lloyd’s insurers have expanded their cyber offerings to include cover for non-damage business interruption and physical damage and bodily harm arising from a cyber incident. Some carriers – including FM Global – are explicitly including cyber in their traditional suite of products. Others have yet to include explicit wording on how traditional products would respond to a cyber incident.

“WE’RE RELEASING A NUMBER OF CYBER-PHYSICAL ATTACK SCENARIOS THAT CAUSE LOSSES TO TRADITIONAL PROPERTY INSURANCE”

— ANDREW COBURN, RMS

“I don’t know if the market will move towards exclusions or including affirmative cyber coverage within property and casualty to give insureds a choice as to how they want to purchase it,” states Camillo. “What will change is that there is going to have to be some sort of due diligence to ensure cyber exposures are coded properly and carriers are taking that into consideration in capital requirements for these types of attacks.”

In addition to markets such as Lloyd’s, there is growing scrutiny from insurance industry regulators, including the Prudential Regulation Authority in the U.K., on how a major cyber event could impact the insurance industry and its capital buffers. They are putting pressure on those carriers that are currently silent on how their traditional products would respond, to make it clear whether cyber-triggered events would be covered under conventional policies.

“The reinsurance market is certainly concerned about, and constantly looking at the potential for, catastrophic events that could happen across a portfolio,” says William Henriques, senior managing director and co-head of the Cyber Practice Group at Aon Benfield. “That has not stopped them from writing cyber reinsurance and there’s enough capacity out there. But as the market grows and gets to $10 billion, and reinsurers keep supporting that growth, they are going to be watching that accumulation and potential for catastrophic risk and managing that.”

Catastrophic cyber scenarios

In December 2015 and again in December 2016, parts of Ukraine’s power grid were taken down. WIRED magazine noted that many parts of the U.S. grid were less secure than Ukraine’s and would take longer to reboot. The attacks were eerily similar to a fictitious scenario published by Cambridge University’s Centre for Risk Studies in partnership with Lloyd’s in 2015. ‘Business Blackout’ considered the impact of a cyber attack on the U.S. power grid, estimating that the total economic impact of the 1-in-200 scenario (a loss level with a 0.5 percent chance of being exceeded in any given year) would be US$243 billion, rising to US$1 trillion in its most extreme form.

It is not beyond the realms of possibility for a Mirai-style virus targeting smart thermostats to be used to achieve such a blackout, thinks Pen Test Partners’ Ken Munro. “You could turn them all on and off at the same time and create huge power spikes on the electricity grid. If you turn it on and off and on again quickly, you’ll knock out the grid – then we would see some really serious consequences.”
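The arithmetic behind that concern can be sketched with purely illustrative figures. The botnet size and per-device load below are assumptions made for the sake of the example, not numbers from Pen Test Partners, but they show how synchronized switching could add up to a grid-scale step change in demand.

```python
# Back-of-envelope sketch of a synchronized thermostat switching attack.
# Both input figures are illustrative assumptions.
compromised_thermostats = 2_000_000   # hypothetical botnet size
kw_per_heating_circuit = 3.0          # assumed resistive heating load per home, kW

# Switching every compromised unit on (or off) in the same instant produces a
# step change in demand of roughly:
load_swing_gw = compromised_thermostats * kw_per_heating_circuit / 1_000_000
print(f"Synchronized switching step: ~{load_swing_gw:.1f} GW")

# Grid operators typically hold reserve sized to the loss of their largest
# single generating unit; a repeated, coordinated swing larger than that
# reserve is what could push frequency outside safe limits.
```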

Smart thermostats could be compromised in other ways, for instance by targeting food and pharmaceutical facilities with the aim of spoiling goods. There is a commonly held belief that the industrial control systems and supervisory control and data acquisition systems (ICS/SCADA) used by energy and utility companies are immune to cyber attacks because they are disconnected from the Internet, a protective measure known as “air gapping”. Smart thermostats and other connected devices could render that defense obsolete.

In its latest Cyber Accumulation Management System (CAMS v2.0), RMS considers how silent cyber exposures could impact accumulation risk in the event of major cyber attacks on operational technology, using the Ukrainian power grid attack as an example. “We’re releasing a number of cyber-physical attack scenarios that cause losses to traditional property insurance,” explains Andrew Coburn, senior vice president at RMS and a founder and member of the executive team of the Cambridge Centre for Risk Studies.

“We’re working with our clients on trying to figure out what level of stress test they should be running,” he explains. “The CAMS system we’ve released is about running large numbers of scenarios, and we’re extending that to look at silent cover, things in conventional insurance policies that could potentially be triggered by a cyber attack, such as fires and explosions.”

Multiple lines of business could be impacted by a cyber event, thinks Coburn, including nearly all property classes as well as aviation and aerospace. “We have just developed some scenarios for marine and cargo insurance, offshore energy lines of business, industrial property, large numbers of general liability and professional lines, and, quite importantly, financial institutions professional indemnity, D&O and specialty lines.”

“The IoT is a key element of the systemic potential of cyber attacks,” he says. “Most of the systemic risk is about looking at your tail risk. Insurers need to look at how much capital they need to support each line of business, how much reinsurance they need to buy and how they structure their risk capital.”
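A minimal sketch of that tail-risk view, using invented loss distributions rather than RMS CAMS output, might look like the following: simulate annual cyber losses by line of business, then read off a 1-in-200 loss level for each line and for the portfolio as a whole.

```python
# Hypothetical tail-risk summary by line of business. The loss distributions
# are invented for illustration and do not come from any vendor model.
import numpy as np

rng = np.random.default_rng(0)
n_years = 100_000

# Simulated annual cyber-related losses (US$m) by line of business
losses = {
    "property":        rng.lognormal(mean=2.0, sigma=1.2, size=n_years),
    "marine_cargo":    rng.lognormal(mean=1.0, sigma=1.0, size=n_years),
    "offshore_energy": rng.lognormal(mean=1.5, sigma=1.4, size=n_years),
}

# Element-wise sum gives the aggregate loss in each simulated year
# (lines are drawn independently in this toy example).
portfolio = sum(losses.values())

for line, annual in losses.items():
    var_995 = np.percentile(annual, 99.5)        # 1-in-200 annual loss
    tvar_995 = annual[annual >= var_995].mean()  # average loss beyond that level
    print(f"{line}: 1-in-200 loss US${var_995:.0f}m, "
          f"tail average US${tvar_995:.0f}m")

print(f"portfolio 1-in-200 loss: US${np.percentile(portfolio, 99.5):.0f}m")
```

How much capital each line needs, and how much reinsurance to buy, then becomes a question of how far into that tail an insurer wants to be protected.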


Scenarios added to RMS CAMS v2.0

Cyber-Induced Fires in Commercial Office Buildings

Hackers exploit vulnerabilities in the smart battery management system of a common brand of laptop, sending their lithium-ion batteries into a thermal runaway state. The attack is coordinated to occur on one night. A small proportion of infected laptops that are left on charge overnight overheat and catch fire, and some unattended fires in commercial office buildings spread to cause major losses. Insurers face claims for a large number of fires in their commercial property and homeowners’ portfolios.

Cyber-Enabled Marine Cargo Theft from Port

Cyber criminals gain access to a port management system in use at several major ports. They identify high-value cargo shipments and systematically switch and steal containers passing through the ports over many months. When the thefts are finally discovered, the hackers scramble the data in the system, halting port operations for several days. Insurers face claims for cargo loss and business interruption in their marine lines.

ICS-Triggered Fires in Industrial Processing Plants

External saboteurs gain access to the process control network of large processing plants, and spoof the thermostats of the industrial control systems (ICS), causing heat-sensitive processes to overheat and ignite flammable materials in storage facilities. Insurers face sizeable claims for fire and explosions in a number of major industrial facilities in their large accounts and facultative portfolio.

PCS-Triggered Explosions on Oil Rigs

A disgruntled employee gains access to a Network Operations Centre (NOC) controlling a field of oil rigs, and manipulates several of the Platform Control Systems (PCS) to cause structural misalignment of wellheads, damage to several rigs, oil and gas release, and fires. At least one platform suffers a catastrophic explosion. Insurers face significant claims for damage to multiple production facilities in their offshore energy book.

Regional Power Outage from Cyber Attack on U.S. Power Generation

A well-resourced cyber team infiltrates malware into the control systems of U.S. power generation companies, desynchronizing certain types of generators. Sufficient generators are damaged to cause a cascading regional power outage that is complex to repair. Restoration of power to 90 percent of customers takes two weeks. Insurers face claims in many lines of business, including large commercial accounts, energy, homeowners and specialty lines. The scenario was published as the Lloyd’s Emerging Risk Report ‘Business Blackout’ by the Cambridge Centre for Risk Studies and was released in RMS CAMS v1.1.

Regional Power Outage from Cyber Attack on UK Power Distribution

A nation-state plants rogue ‘Trojan horse’ hardware in electricity distribution substations; the devices are activated remotely to curtail power distribution and cause intermittent rolling blackouts over a multi-week campaign. Insurers face claims in many lines of business, including large commercial accounts, energy, homeowners and specialty lines. The scenario was published as ‘Integrated Infrastructure’ by the Cambridge Centre for Risk Studies and was released in RMS CAMS v1.1.