Readying the insurance industry for a “moonshot”

There is growing acceptance that trying to squeeze more efficiency out of existing systems and processes is folly in an industry that must make fundamental changes. But are insurance and reinsurance companies ready for the cultural and digital change this necessitates?

In an article in Wired magazine, Google X lab director Eric “Astro” Teller (whose business card describes him as “Captain of Moonshots”) suggested that it might actually be easier to make something 10 times better than 10 percent better. Squeezing out a further 10 percent in efficiency typically involves tinkering with existing flawed processes and systems. Making something 10 times better, by contrast, involves taking bold, innovative steps and tearing up the rule book where necessary. It does not always require a literal “moonshot,” but it does require a fundamentally different approach.

IBM has also embraced the term, using “moonshot” to describe how it foresees the Cloud shaping the future of healthcare, specifically the hunt for a cure for cancer. IBM argued that a new architectural strategy — one based on open platforms and designed to cope with rampant data growth and the need for flexibility — was required to take cancer research to the next level.

But is the 330-year-old insurance industry — with its legacy systems, entrenched culture and significant operational pressures — ready for such a radical approach? And should those companies that are not ready prepare to be disrupted?

In the London and Lloyd’s market, where the cost of doing business remains extremely high, there are fears that business could migrate to more efficient, modern competitor hubs, such as Zurich, Bermuda and Singapore.

“The high cost of doing business is something that has been directly recognized by [Lloyd’s CEO] Inga Beale amongst others; and it’s something that has been explicitly highlighted by the rating agencies in their reports on the market,” observes Mike van Slooten, head of market analytics at Aon Benfield. “There is a consensus building that things really do have to change.”

The influx of alternative capacity, a rapidly evolving risk landscape — with business risks increasingly esoteric — a persistently low interest rate environment and high levels of competition have stretched balance sheets in recent years. In addition, the struggle to keep up with the explosion of data and the opportunities it presents, and the need to overhaul legacy systems, are challenging the industry as never before.

“You’ve got a situation where the market as a whole is struggling to meet its ROE targets,” says van Slooten. “We’re getting to a stage where pretty much everyone has to accept the pricing that’s on offer. One company might be better at risk selection than another — but what really differentiates companies in this market is the expense ratio, and you see a huge disparity across the industry.

“Some very large, successful organizations have proved they can run at a 25 percent expense ratio and for other smaller organizations it is north of 40 percent, and in this market, that’s a very big differential,” he continues. “Without cost being brought out of the system there’s a lot of pressure there, and that’s where these M&A deals are coming from. Insight is going to remain at a premium going forward; however, a lot of the form-filling and processing that goes on behind the scenes has got to be overhauled.”

“Efficiency needs to be partnered with business agility,” says Jon Godfray, chief operating officer at Barbican Insurance Group. Making a process 10 times faster will not achieve the “moonshot” an organization needs if it is not married to the ability to act quickly on insight and make informed decisions. “If we weren’t nimble and fast, we would struggle to survive. A nimble business is absolutely key in this space. Things that took five years to develop five years ago are now taking two. Everything is moving at a faster pace.”

As a medium-sized Lloyd’s insurance group, Barbican recognizes the need to remain nimble and to adapt its business model as the industry evolves. However, large incumbents are also upping their game. “I spent some years at a very large insurer and it was like a massive oil tanker … you decided in January where you wanted to be in December, because it took you four months to turn the wheel,” says Godfray.

“Large organizations have got a lot better at being adaptable,” he continues. “Communication lines are shorter and technology plays a big part. This means the nimble advantage we have had is reducing, and we must therefore work even faster and perform better. Organizations need to remain flexible and nimble, and need to be able to embrace the increasingly stringent regulatory climate we’re in.”

Creating a culture of innovation

Automation and the efficiencies to be gained by speeding up previously clunky and expensive processes will enable organizations to compete more effectively. But not all organizations need to be pioneers in order to leverage new technology to their advantage. “We see ourselves as a second-level early adopter,” adds Godfray. “We’d love to be at the forefront of everything, but there are others with deeper pockets who can do that.”

“However, we can be an early adopter of technology that can make a difference and be brave enough to close an avenue if it isn’t working,” he continues. “Moving on from investments that don’t appear to be working is something a lot of big organizations struggle with. We have a great arrangement with our investor where if we start something and we don’t like it, we stop it and we move on.”

The drive for efficiency is not all about technology. There is a growing recognition that culture and process are critical to the change underway in the industry. Attracting the right talent, enabling bold decisions and investments to be made, and responding appropriately to rapidly changing customer needs and expectations all rest on the ability of large organizations to think and act more nimbly.

And at the end of the day, survival is all about making tactical decisions that enhance an organization’s bottom line, Godfray believes. “The winners of the future will have decent P&Ls. If you’re not making money, you’re not going to be a winner. Organizations that are consistently struggling will find it harder and harder as the operating environment becomes less and less forgiving, and they will gradually be consolidated into other companies.”

Much of the disruptive change that has already occurred within the industry has been in general insurance, where the Internet of Things (IoT), artificial intelligence and product innovation are just some of the developments underway. As we move into an era of the connected home, wearable devices and autonomous vehicles, insurers are in a better position both to analyze individuals and to feed information back to them in order to empower them and reduce risk.

But even within personal lines there has not been a remarkable product revolution yet, thinks Anthony Beilin, head of innovation and startup engagement at Aviva. “The same can be said for disruption of the entire value chain. People have attacked various parts and a lot of the focus so far has been on distribution and the front-end customer portal. Maybe over the next 10 years, traditional intermediaries will be replaced with new apps and platforms, but that’s just a move from one partner to another.”

Innovation is not just about digitization, says Beilin. While it is important for any (re)insurance company to consistently improve its digital offering, true success and efficiencies will be found in redesigning the value chain, including the products on offer. “It isn’t just taking what was a paper experience onto the Internet, then taking what was on the Internet onto the mobile and taking a mobile experience into a chatbot … that isn’t innovation.

“What we really need to think about is: what does protecting people’s future look like in 50 years’ time? Do people own cars? Do people even drive cars? What are the experiences that people will really worry about?” he explains. “How can we rethink what is essentially a hedged insurance contract to provide a more holistic experience, whether it’s using AI to manage your finances or using technology to protect your health? That’s where the radical transformation will come.”

Beilin acknowledges that collaboration will be necessary. With a background in launching startups, he understands the necessary and complementary characteristics of niche players versus large incumbents.

“It is an agreed principle that the bigger the company, the harder it is to make change,” says Beilin. “When you start talking about innovating, it runs contrary to the mantra of what big businesses do, which is to set up processes and systems to ensure a minimum level of delivery. Innovation, on the other hand, is about taking the seed of an idea and developing it into something new, and it’s not a natural fit with the day-to-day operations of any business.”

This is not just a problem for the insurance industry. Beilin points to the disruption brought about in the traditional media space by Netflix, Facebook and other social media platforms. “Quite frankly, startups are more nimble; they have more hunger, dynamism and more to lose,” he says. “If they go bankrupt, they don’t get paid. The challenge for them is in scaling it to multiple customers.”

This is where investments like Aviva’s Digital Garage come in. “We’re trying to be a partner for them,” says Beilin. “Collaboration is the key in anything. If you look at the success we’re going to achieve,  it’s not going to be in isolation. We need different capabilities to succeed in a future state. We’ve got some extremely creative and talented people on staff, but of course we’ll never have everyone. We need different capabilities and skills so we need to make sure we’re interoperable and open to working with partners wherever possible.”


Achieving 10X: A platform-centric approach

Alongside initiatives to increase speed and agility and to drive down the transactional cost of the business, technology — and how it enables better risk selection, pricing and capital allocation — is seen as a savior. Analytics is imperative, as is fusing the back office, where the data lives, with the front office, where the decision-makers are.

According to 93 percent of insurance CEOs surveyed by PwC in 2015, data mining and analysis is the most strategically important digital technology for their business. Many (re)insurance company CIOs have taken the plunge and moved parts of their business into the Cloud, particularly those technologies that are optimized to leverage its elasticity and scalability, in order to enhance their analytical capabilities. 

When it comes to analytics, simply moving exposure data, contract data, and existing actuarial and probabilistic models into Cloud architecture will not enable companies to redesign their entire workflow, explains Shaheen Razzaq, director, software products at RMS. 

“Legacy systems were not designed to scale to the level needed,” he adds. “We are now in a world dealing with huge amounts of data and even more sophisticated models and analytics. We need scalable and performing technologies. And to truly leverage these technologies, we need to redesign our systems from the ground up.” He argues that what is needed is a platform-centric approach, designed to be supported by the Cloud, to deliver the scale, performance and insurance-specific capabilities the industry needs to achieve its moonshot moment. RMS(one)®, a big data and analytics platform purpose-built for the insurance industry, is one solution available.


 


The peril of ignoring the tail

Drawing on several new data sources and new insights from recent earthquakes into how different fault segments might interact in future events, Version 17 of the RMS North America Earthquake Models increases the modeled frequency of larger events, making for a fatter tail. EXPOSURE asks what this means for (re)insurers from a pricing and exposure management perspective.

Recent major earthquakes, including the M9.0 Tohoku Earthquake in Japan in 2011 and the Canterbury Earthquake Sequence in New Zealand (2010-2011), have offered new insight into the complexities and interdependencies of losses that occur following major events. This insight, as well as other data sources, was incorporated into the latest seismic hazard maps released by the U.S. Geological Survey (USGS).

In addition to engaging with USGS on its 2014 update, RMS went on to invest more than 100 person-years of work in implementing the main findings of this update as well as comprehensively enhancing and updating all components in its North America Earthquake Models (NAEQ). The update reflects the deep complexities inherent in the USGS model and confirms the adage that “earthquake is the quintessential tail risk.” Among the changes to the RMS NAEQ models was the recognition that some faults can interconnect, creating correlations of risk that were not previously appreciated.

Lessons from Kaikoura

While there is still a lot of uncertainty surrounding tail risk, the new data sets provided by USGS and others have improved the understanding of events with a longer return period. “Global earthquakes are happening all of the time, not all large, not all in areas with high exposures,” explains Renee Lee, director, product management at RMS. “Instrumentation has become more advanced and coverage has expanded such that scientists now know more about earthquakes than they did eight years ago when NAEQ was last released in Version 9.0.”

This includes understanding of how faults creep and release energy, how faults can interconnect, and how ground motions attenuate through soil layers and over large distances. “Soil plays a very important role in the earthquake risk modeling picture,” says Lee. “Soil deposits can amplify ground motions, which can potentially magnify the building’s response, leading to severe damage.”

The 2016 M7.8 earthquake in Kaikoura, on New Zealand’s South Island, is a good example of a complex rupture where fault segments connected in more ways than had previously been realized. In Kaikoura, at least six fault segments were involved, where the rupture “jumped” from one fault segment to the next, producing a single larger earthquake.

“The Kaikoura quake was interesting in that we did have some complex energy release moving from fault to fault,” says Glenn Pomeroy, CEO of the California Earthquake Authority (CEA). “We can’t hide our heads in the sand and pretend that scientific awareness doesn’t exist. The probability has increased for a very large, but very infrequent, event, and we need to determine how to manage that risk.”

San Andreas correlations

Looking at California, the updated models include events that extend from the north of San Francisco to the south of Palm Springs, correlating exposures along the length of the San Andreas fault. While the prospect of a major earthquake impacting both northern and southern California is considered extremely remote, it will nevertheless affect how reinsurers seek to diversify different types of quake risk within their book of business.

“In the past, earthquake risk models have considered Los Angeles as being independent of San Francisco,” says Paul Nunn, head of catastrophe risk modeling at SCOR. “Now we have to consider that these cities could have losses at the same time (following a full rupture of the San Andreas Fault).

In Kaikoura, at least six fault segments were involved, where the rupture “jumped” from one fault segment to the next

“However, it doesn’t make that much difference in the sense that these events are so far out in the tail ... and we’re not selling much coverage beyond the 1-in-500-year or 1-in-1,000-year return period. The programs we’ve sold will already have been exhausted long before you get to that level of severity.”

While the contribution of tail events to return period losses is significant, as Nunn explains, this could be more of an issue for primary insurers than for reinsurers from a capitalization standpoint. “From a primary insurance perspective, the bigger the magnitude and event footprint, the more separate claims you have to manage. So, part of the challenge is operational — in terms of mobilizing loss adjusters and claims handlers — but primary insurers also have the risk that losses from tail events could go beyond the (re)insurance program they have bought.

“It’s less of a challenge from the perspective of global (re)insurers, because most of the risk we take is on a loss limited basis — we sell layers of coverage,” he continues. “Saying that, pricing for the top layers should always reflect the prospect of major events in the tail and the uncertainty associated with that.”

He adds: “The magnitude of the Tohoku earthquake event is a good illustration of the inherent uncertainties in earthquake science and wasn’t represented in modeled scenarios at that time.”

While U.S. regulation stipulates that carriers writing quake business should capitalize to the 1-in-200-year event level, in Canada capital requirements are more conservative in an effort to better account for tail risk. “So, Canadian insurance companies should have less overhang out of the top of their (re)insurance programs,” says Nunn.

Need for post-event funding

For the CEA, the updated earthquake models could reinvigorate discussions around the need for a mechanism to raise additional claims-paying capacity following a major earthquake. Set up after the Northridge Earthquake in 1994, the CEA is a not-for-profit, publicly managed and privately funded earthquake pool.

“It is pretty challenging for a stand-alone entity to take on large tail risk all by itself,” says Pomeroy. “We have, from time to time, looked at the possibility of creating some sort of post-event risk-transfer mechanism.

“A few years ago, for instance, we had a proposal in front of the U.S. Congress that would have created the ability for the CEA to have done some post-event borrowing if we needed to pay for additional claims,” he continues. “It would have put the U.S. government in the position of guaranteeing our debt. The proposal didn’t get signed into law, but it is one example of how you could create an additional claim-paying capacity for that very large, very infrequent event.”

“(Re)insurers will be considering how to adjust the balance between the LA and San Francisco business they’re writing” — Paul Nunn, SCOR

The CEA leverages both traditional and non-traditional risk-transfer mechanisms. “Risk transfer is important. No one entity can take it on alone,” says Pomeroy. “Through risk transfer from insurer to (re)insurer, the risk is spread broadly, with the capital markets entering as another source of claims-paying capability and another way of diversifying the concentration of the risk.

“We manage our exposure very carefully by staying within our risk-transfer guidelines,” he continues. “When we look at spreading our risk, we look at spreading it through a large number of (re)insurance companies from 15 countries around the world. And we know the (re)insurers have their own strict guidelines on how big their California quake exposure should be.”

The prospect of a higher frequency of larger events producing a “fatter” tail also raises the possibility of an overall reduction in average annual loss (AAL) for (re)insurance portfolios, a factor that is likely to add to pricing pressure as the industry approaches the key January 1 renewal date, predicts Nunn. “The AAL for Los Angeles coming down in the models will impact the industry in the sense that it will affect pricing and how much probable maximum loss people think they’ve got. Most carriers are busy digesting the changes and carrying out due diligence on the new model updates.

“Although the eye-catching change is the possibility of the ‘big one,’ the bigger immediate impact on the industry is what’s happening at lower return periods where we’re selling a lot of coverage,” he says. “LA was a big driver of risk in the California quake portfolio and that’s coming down somewhat, while the risk in San Francisco is going up. So (re)insurers will be considering how to adjust the balance between the LA and San Francisco business they’re writing.”
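For readers less familiar with these metrics, the relationship between average annual loss and return-period losses can be illustrated with a simple calculation on a simulated year-loss table. The sketch below is a minimal illustration in Python; the event frequencies and loss figures are entirely invented and are not drawn from any RMS model.

```python
import numpy as np

# Hypothetical year-loss table: one simulated annual loss (in USD billions)
# per modeled year. All figures are invented for illustration only.
rng = np.random.default_rng(42)
n_years = 100_000

# Frequent moderate losses plus rare, very large "tail" events
background = rng.gamma(shape=0.5, scale=2.0, size=n_years)
tail_year = rng.random(n_years) < 1 / 500            # roughly 1-in-500-year events
annual_loss = background + tail_year * rng.uniform(50, 150, n_years)

# Average annual loss (AAL): the mean loss across all simulated years
aal = annual_loss.mean()

# Return-period losses: the 1-in-250-year loss is exceeded in only 1/250
# of simulated years, i.e., the 99.6th percentile of annual losses
rp_250 = np.quantile(annual_loss, 1 - 1 / 250)
rp_1000 = np.quantile(annual_loss, 1 - 1 / 1000)

print(f"AAL:             {aal:.2f}bn")
print(f"1-in-250-year:   {rp_250:.2f}bn")
print(f"1-in-1,000-year: {rp_1000:.2f}bn")
```

The point of the sketch is that a model update which fattens the tail pushes up the high return-period figures that drive reinsurance program design and capital requirements, even while changes at shorter return periods can pull the AAL, and hence pricing, in the opposite direction.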

 


Quantifying the resilience dividend

New opportunities arise for risk capital providers and city planners as the resilience movement gets analytical. EXPOSURE explores the potential.

A hundred years ago, a seven-and-a-half-mile seawall was built to protect San Francisco from Mother Nature. It gave the city’s planning department the confidence to develop today’s commercially and culturally rich downtown.

But that iconic waterfront is under threat. The aging seawall has serious seismic vulnerability. Almost $80 billion of San Francisco property is exposed to sea level rise.

To ensure his city’s long-term resilience, Mayor Ed Lee commissioned a plan to design and fund the rebuild of the seawall. A cost of $8 million for the feasibility study last year and $40 million for the preliminary design this year is just the beginning. With an estimated price tag of up to $5 billion, the stakes are high. Getting it wrong is not an option. But getting it right won’t be easy.

San Francisco is no outlier. Investing in resilience is in vogue. Citizens expect their city officials to understand the risks faced and deal with them. The science is there, so citizens want to see their city planning and investing for a robust, resilient city looking fifty or a hundred years ahead. The frequency and severity of natural catastrophes continues to rise. The threat of terror continues to evolve. Reducing damage and disruption when the worst happens has become an imperative across the political spectrum.

Uncertainty around various macro trends complicates the narrative: sea level rise, coastal development, urban densification, fiscal constraints, “disaster deductibles.” Careful planning is required. Understanding how the right intervention leads to a meaningful reduction in risk is higher on the City Hall agenda than ever before.

This has various implications for risk capital providers. Opportunities are emerging to write more profitable business in catastrophe-exposed areas. Municipal buyers are looking for new products that link risk transfer and risk reduction or deliver more than just cash when disaster strikes.

The innovators will win, thinks John Seo, co-founder and managing principal of Fermat Capital Management. “Considerable time and thought must be invested on what to do with funds, both pre- and post-event.

“All municipalities function on a relatively fixed annual budget. Risk transfer smooths the costs of catastrophe risk, which lessens the disruption on ongoing spending and programs. Ideally, risk transfer comes with a plan for what to do with the funds received from a risk transfer payout. That plan is just as valuable, if not more valuable, than the payout itself.”

Resisting a shock in New Orleans

This innovative approach to resilience has become central to New Orleans under Mayor Mitch Landrieu. Partnering with utilities and reinsurance experts, the city examined its drinking water, sanitation and rainwater evacuation facilities to determine their vulnerability to major storms. This analysis provided the basis for investments to ensure these facilities could withstand a shock and continue operating effectively.

“In New Orleans, the city’s pumps are a critical piece of infrastructure. So, the question was: can you create a better nexus between an engineering company with manpower and thought-power to help keep those pumps going, to prepare them in advance of a catastrophe, and align insurance contracts and risk so we are continuing service delivery?” explains Elizabeth Yee, vice president of city solutions at 100 Resilient Cities.

The aim is to focus on disaster response and business continuity, in addition to risk financing. “If there’s an earthquake it’s great the city might receive $10 million to help repair the airport, but what they really need is an airport that is up and running, not just $10 million,” says Yee. “So, there needs to be a way to structure insurance contracts so they better help continue service delivery, as opposed to just providing money.”

There is also the need to reflect the impact of strengthened infrastructure when modeling and pricing the risk. But this isn’t always an easy journey.

In the city of Miami Beach, Mayor Philip Levine decided to raise its roads, so the barrier island’s thoroughfares stay open even in a flood. While the roads remain dry, this intervention has brought some unwelcome consequences.

City residents and business owners are concerned that the runoff will flood adjacent properties. Irrespective of where the water from the streets goes, it is no longer clear whether in-force insurance policies would pay out in the event of flood damage. The ground floor is no longer technically the ground floor: because it now sits below street level, it counts as a basement, as one local restaurateur found out when Allstate denied his $15,000 claim last year.

“That’s an example of the kind of highly nuanced problem government agencies are grappling with all over the world,” explains Daniel Stander, global managing director at RMS. “There are often no quick and easy answers. Economic analysis is essential. Get it wrong and well-intentioned intervention can actually increase the risk — and the cost of insurance with it.

“The interventions you put in place have to reduce the risk in the eyes of the market,” he continues. “If you want to get the credit for your resilience investments, you need to make sure you understand your risk as the market does, and then reduce your risk in its eyes. Get it right, and communities and economies thrive. Get it wrong, and whole neighborhoods become uninsurable, unaffordable, unlivable.”

Retrofitting shelters in Berkeley

Through its partnership with 100 Resilient Cities, RMS is helping a growing number of cities determine which resilience interventions will make the biggest difference.

Knowing that a major Hayward fault rupture would displace up to 12,000 households, with up to 4,000 seeking temporary shelter, the city of Berkeley engaged RMS to ascertain whether the city’s earthquake shelters would withstand the most probable events on the fault. A citywide analysis highlighted that the shelters perform, on average, worse than the surrounding buildings from which residents would flee. The RMS analysis also found that a $17 million seismic retrofit investment plan is substantially more cost-effective and environmentally friendly than rebuilding or repairing structures after an earthquake.

“We’ve encouraged our chief resilience officers who are new to a city to learn about their exposures,” explains Yee. “From that baseline understanding, they can then work with someone like RMS to carry out more specific analysis. The work that RMS did with Berkeley helped them to better understand the economic risk posed by an earthquake, and ensured the city was able to secure funding to upgrade earthquake shelters for its residents.”

Rewarding resilience

In parts of the world where the state or national government acts as (re)insurer-of-last-resort, stepping in to cover the cost of a catastrophe, there may be a lack of incentive to improve city resilience, warns Yee. “Many of the residents in my neighborhood have elevated our homes, because we had fish in our yards after Hurricane Sandy,” she says. “But some of our neighbors have decided to wait until the ‘next one’ because there’s this attitude that FEMA (the Federal Emergency Management Agency) will just pay them back for any damage that occurs. We need to change the regulatory framework so that good behavior is incentivized and rewarded.”

“You don’t have to go to emerging markets to find plenty of exposure that is not covered by insurance” — Daniel Stander, RMS

In the U.S., FEMA has suggested the introduction of a “disaster deductible.” This would require recipients of FEMA public assistance funds to expend a predetermined amount of their own funds on emergency management and disaster costs before they receive federal funding. Critically, it is hoped the proposed disaster deductible could “incentivize risk reduction efforts, mitigate future disaster impacts and lower overall recovery costs.”

City resilience framework

The City Resilience Framework, developed by Arup with support from the Rockefeller Foundation, helps clarify the primary factors contributing to resilient cities.  

Resilient cities are more insurable cities, points out Stander. “There are constraints on how much risk can be underwritten by the market in a given city or county. Those constraints bite hardest in high-hazard, high-exposure locations.”

“So, despite an overcapitalized market, there is significant underinsurance,” explains Stander. “You don’t have to go to emerging markets to find plenty of exposure that is not covered by insurance.”

Insurers need not fear that cities’ resilience investments will be to the detriment of premium income. “The insurance industry wants risk to be at an appropriate level,” says Stander. “There are parts of the world where the risk is so high, the industry is rightly reluctant to touch it. Informal neighborhoods throughout South America and South Asia are so poorly constructed they’re practically uninsurable. The insurance industry likes resilience interventions that keep risk insurable at a rate which is both affordable and profitable.”

“Besides, it’s not like you can suddenly make Miami zero-risk,” he adds. “But what you can do as a custodian of a city’s economy is prioritize and communicate resilience interventions that simultaneously reduce rates for citizens and attract private insurance markets. And as a capital provider you can structure products that reward resilient thinking, which help cities monetize their investments in resilience.”

Movements like 100 Resilient Cities, pioneered by the Rockefeller Foundation, are both responding to and driving this urgency. There is a real and present need for action to meet growing threats.

In San Francisco, investments in resilience are being made now. The city is beyond strategy formulation and on to implementation mode. Shovel-ready projects are required to stem the impacts of 66 inches of sea level rise by 2100. For San Francisco and hundreds of cities and regions around the globe, resilience is a serious business.


Quantifying the economic impact of sea level rise in San Francisco 

In May 2016, RMS published the findings of an analysis into the likely economic impact of sea level rise (SLR) in San Francisco, with the aim of informing the city’s action plan. It found that by the year 2100, $77 billion of property would be at risk from a one-in-100-year extreme storm surge event and that $55 billion of property in low-lying coastal zones could be permanently inundated in the absence of intervention.
The city’s Sea Level Rise Action Plan, which incorporated RMS findings, enabled San Francisco’s mayor to invest $8 million in assessing the feasibility of retrofitting the city’s seawall. The city subsequently commissioned a $40 million contract to design that retrofit program. 


 


A burgeoning opportunity

As traditional (re)insurers hunt for opportunity outside of property catastrophe classes, new probabilistic casualty catastrophe models are becoming available. At the same time, as catastrophe risks are becoming increasingly “manufactured” or human-made, so casualty classes have the potential to be the source of claims after a large “natural” catastrophe.

Just as the growing sophistication of property catastrophe models has enabled industry innovation, there is growing excitement that new tools available to casualty (re)insurers could help to expand the market. Through improved evaluation of casualty clash exposures, reinsurers will be better able to understand, price and manage these risks, as well as design new products that cater to underserved areas.

However, the casualty market must move beyond a purely defensive strategy. “There is an ever-growing list of exclusions in liability insurance and interest in the product is declining with the proliferation of these exclusions,” explains Dr. Robert Reville, president and CEO of Praedicat, the world’s first liability catastrophe modeling company. “There is a real growth opportunity for the industry to deal with these exclusions and recognize where they can confidently write more business.

“Industry practitioners look at what’s happened in property — where modeling has led to a lot of new product ideas, including capital market solutions, and a lot of innovation — and casualty insurers are hungry for that sort of innovation, for the same sort of transformation in liability that happened in property,” he adds.

Perils — particularly emerging risks that underwriters have struggled to price, manage and understand — have typically been excluded from casualty products. This includes electromagnetic fields (EMFs), such as those emanating from broadcast antennas and cell phones. Cover for such exposures is restricted, particularly for the U.S. market, where it is often excluded entirely. Some carriers will not offer any cover at all if the client has even a remote exposure to EMF risks. Yet are they being over-apprehensive about the risk?

The fear that leads to an overapplication of exclusions is very tangible. “The latency of the disease development process — or the way a product might be used, with more people becoming exposed over time — causes there to be a build-up of risk that may result in catastrophe,” Reville continues. “Insurers want to be relevant to insuring innovation in product, but they have to come to terms with the latency and the potential for a liability catastrophe that might emerge from it.”

Unique nature of casualty catastrophe

It is a misconception that casualty is not a catastrophe class of business. Reville points out that the industry’s US$100 billion-plus loss relating to asbestos claims is arguably its biggest-ever catastrophe. Within the Lloyd’s market the overwhelming nature of APH (asbestos, pollution and health) liabilities contributed to the market’s downward spiral in the late 1980s, only brought under control through the formation of the run-off entity Equitas, now owned and managed by Warren Buffett’s Berkshire Hathaway.

As the APH claims crisis demonstrated, casualty catastrophes differ from property catastrophes in that they are a “two-tailed loss.” There is the “tail loss” both have in common, which describes the low-probability, high-severity characteristics — or high return period — of a major event. But in addition, casualty classes of business are “long-tail” in nature. This means that a policy written in 2017 may not experience a claim until 20 years later, providing an additional challenge from a modeling and reserving perspective.

“Casualty insurers are hungry for that sort of innovation, for the same sort of transformation in liability that happened in property” — Robert Reville, Praedicat

Another big difference between casualty clash and property catastrophe from a modeling perspective is that the past is not a good indication of future claims. “By the time asbestos litigation had really taken off, it was already a banned product in the U.S., so it was not as though asbestos claims were any use in trying to figure out where the next environmental disaster or next product liability was going to be,” says Reville. “So, we needed a forward-looking approach to identify where there could be new sources of litigation.”

With the world becoming both more interconnected and more litigious, there is every expectation that future casualty catastrophe losses could be much greater and impact multiple classes of business. “The reality is there’s serial aggregation and systemic risk within casualty business, and our answer to that has generally been that it’s too difficult to quantify,” says Nancy Bewlay, chief underwriting officer, global casualty, at XL Catlin. “But the world is changing. We now have technology advances and data collection capabilities we never had before, and public information that can be used in the underwriting process.

“Take the Takata airbag recall,” she continues. “In 2016, they had to recall 100 million airbags worldwide. It affected all the major motor manufacturers, who then faced the accumulation potential not only of third-party liability claims, but also product liability and product recall. Everything starts to accumulate and combine within that one industry, and when you look at the economic footprint of that throughout the supply chain there’s a massive potential for a casualty catastrophe when you see how everything is interconnected.”

RMS chief research officer Robert Muir-Wood explains: “Another area where we can expect an expansion of modeling applications concerns casualty lines picking up losses from more conventional property catastrophes. This could occur when the cause of a catastrophe can be argued to have ‘non-natural’ origins, and particularly where there are secondary ‘cascade’ consequences of a catastrophe — such as a dam failing after a big earthquake or for claims on ‘professional lines’ coverages of builders and architects — once it is clear that standard property insurance lines will not compensate for all the building damage.”

“This could be prevalent in regions with low property catastrophe insurance penetration, such as in California, where just one in ten homeowners has earthquake cover. In the largest catastrophes, we could expect claims to be made against a wide range of casualty lines. The big innovation around property catastrophe in particular was to employ high-resolution GIS [geographic information systems] data to identify the location of all the risk. We need to apply similar location data to casualty coverages, so that we can estimate the combined consequences of a property/casualty clash catastrophe.”

One active instance of this shift from property to casualty coverages, cited by Muir-Wood, concerns earthquakes in Oklahoma. “There are large amounts of wastewater left over from fracking, and the cheapest way of disposing of it is to pump it down deep boreholes. But this process has been triggering earthquakes, and these earthquakes have started getting quite big — the largest so far, in September 2016, had a magnitude of M5.8.

“At present the damage to buildings caused by these earthquakes is being picked up by property insurers,” he continues. “But what you will see over time are lawsuits to try and pass the costs back to the operators of the wells themselves. Working with Praedicat, RMS has done some modeling work on how these operators can assess the risk cost of adding a new disposal well. Clearly the larger the earthquake, the less likely it is to occur. However, the costs add up: our modeling shows that an earthquake bigger than M6 right under Oklahoma City could cause more than US$10 billion of damage.”
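The arithmetic behind Muir-Wood’s point, that rarer but larger earthquakes dominate the annualized risk cost of a disposal well, can be sketched by summing frequency times consequence across magnitude bins. The bins, frequencies and loss figures below are invented purely for illustration; they are not RMS or Praedicat model output.

```python
# Hypothetical annualized risk cost of induced seismicity near one disposal well.
# Each entry: (magnitude bin, assumed annual frequency, assumed loss in USD millions).
# All numbers are invented for illustration only.
magnitude_bins = [
    ("M4.0-4.9", 0.200,      5),
    ("M5.0-5.9", 0.020,    400),
    ("M6.0+",    0.002, 10_000),   # rare, but dominates the expected cost
]

annual_risk_cost = sum(freq * loss for _, freq, loss in magnitude_bins)

for label, freq, loss in magnitude_bins:
    print(f"{label}: expected cost {freq * loss:7.1f}m per year")
print(f"Total annualized risk cost: {annual_risk_cost:.1f}m per year")
```

Even on these invented numbers, the largest and least likely bin contributes the bulk of the expected annual cost, which is precisely the tail exposure Muir-Wood suggests liability writers are not yet pricing for.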

Muir-Wood adds: “The challenge is that casualty insurance tends to cover many potential sources of liability in the contract, and the operators of the wells, and we believe their insurers, are not currently identifying this particular — and potentially catastrophic — source of future claims. There’s the potential for a really big loss that would eventually fall onto the liability writers of these deep wells ... and they are not currently pricing for this risk, or managing their portfolios of casualty lines.”

A modeled class of business

According to Reville, the explosion of data and development of data science tools have been key to the development of casualty catastrophe modeling. The opportunity to develop probabilistic modeling for casualty classes of business was born in the mid-2000s when Reville was senior economist at the RAND Corporation.

At that time, RAND was using data from the RMS® Probabilistic Terrorism Model to help inform the U.S. Congress in its decision on the renewal of the Terrorism Risk Insurance Act (TRIA). Separately, it had written a paper on the scope and scale of asbestos litigation and its potential future course.

“As we were working on these two things it occurred to us that here was this US$100 billion loss — this asbestos problem — and adjacently within property catastrophe insurance there was this developed form of analytics that was helping insurers solve a similar problem. So, we decided to work together to try and figure out if there was a way of solving the problem on the liability side as well,” adds Reville.

Eventually Praedicat was spun out of the initial project as its own brand, launching its first probabilistic liability catastrophe model in summer 2016. “The industry has evolved a lot over the past five years, in part driven by Solvency II and heightened interest from the regulators and rating agencies,” says Reville. “There is a greater level of concern around the issue, and the ability to apply technologies to understand risk in new ways has evolved a lot.”

There are obvious benefits to (re)insurers from a pricing and exposure management perspective. “The opportunity is changing the way we underwrite,” says Bewlay. “Historically, we underwrote by exclusion with a view to limiting our maximum loss potential. We couldn’t get a clear understanding of our portfolio because we didn’t have enough meaningful, statistical and credible data.”

“We feel they are not being proactive enough because ... there’s the potential for a really big loss that would fall onto the liability writers of these deep wells” — Robert Muir-Wood, RMS

Then there are the exciting opportunities for growth in a market where there is intense competition and downward pressure on rates. “Now you can take a view on the ‘what-if’ scenario and ask: how much loss can I handle and what’s the probability of that happening?” she continues. “So, you can take on managed risk. Through the modeling you can better understand your industry classes and what could happen within your portfolio, and can be slightly more opportunistic in areas where previously you may have been extremely cautious.”

Not only does this expand the potential range of casualty insurance and reinsurance products, it should allow the industry to better support developments in burgeoning industries. “Cyber is a classic example,” says Bewlay. “If you can start to model the effects of a cyber loss you might decide you’re OK providing cyber in personal lines for individual homeowners in addition to providing cyber in a traditional business or technology environment.

“You would start to model all three of these scenarios and what your potential market share would be to a particular event, and how that would impact your portfolio,” she continues. “If you can answer those questions utilizing your classic underwriting and actuarial techniques, a bit of predictive modeling in there — this is the blend of art and science — you can start taking opportunities that possibly you couldn’t before.”


The Future of (Re)Insurance: Evolution of the Insurer DNA

The (re)insurance industry is at a tipping point. Rapid technological change, disruption through new, more efficient forms of capital and an evolving risk landscape are challenging industry incumbents like never before. Inevitably, as EXPOSURE reports, the winners will be those who find ways to harmonize analytics, technology, industry innovation and modeling.

There is much talk of disruptive innovation in the insurance industry. In personal lines insurance, disintermediation, the rise of aggregator websites and the Internet of Things (IoT) – such as connected car, home, and wearable devices – promise to transform traditional products and services. In the commercial insurance and reinsurance space, disruptive technological change has been less obvious, but behind the scenes the industry is undergoing some fundamental changes.

The tipping point

The ‘Uber’ moment has yet to arrive in reinsurance, according to Michael Steel, global head of solutions at RMS. “The change we’re seeing in the industry is constant. We’re seeing disruption throughout the entire insurance journey. It’s not the case that the industry is suffering from a short-term correction and then the market will go back to the way it has done business previously. The industry is under huge competitive pressures and the change we’re seeing is permanent and it will be continuous over time.”

Experts feel the industry is now at a tipping point. Huge competitive pressures, rising expense ratios, an evolving risk landscape and rapid technological advances are forcing change upon an industry that has traditionally been considered somewhat of a laggard. And the revolution, when it comes, will be a quick one, thinks Rupert Swallow, co-founder and CEO of Capsicum Re.

“WE’RE SEEING DISRUPTION THROUGHOUT THE ENTIRE INSURANCE JOURNEY”

— MICHAEL STEEL, RMS

Other sectors have plenty of cautionary tales on what happens when businesses fail to adapt to a changing world, he explains. “Kodak was a business that in 1998 had 120,000 employees and printed 95 percent of the world’s photographs. Two years later, that company was bankrupt as digital cameras built their presence in the marketplace. When the tipping point is reached, the change is radical and fast and fundamental.”

While it is impossible to predict exactly how the industry will evolve going forward, it is clear that tomorrow’s leading (re)insurance companies will share certain attributes. This includes a strong appetite to harness data and invest in new technology and analytics capabilities, the drive to differentiate and design new products and services, and the ability to collaborate.  According to Eric Yau, general manager of software at RMS, the goal of an analytic-driven organization is to leverage the right technologies to bring data, workflow and business analytics together to continuously drive more informed, timely and collaborative decision making across the enterprise.

“New technologies play a key role and while there are many choices with the rise of insurtech firms, history shows us that success is achieved only when the proper due diligence is done to really understand and assess how these technologies enable the longer term business strategy, goals and objectives,” says Yau.

Yau says one of the most important ingredients to success is the ability to effectively blend the right team of technologists, data scientists and domain experts who can work together to understand and deliver upon these key objectives.

The most successful companies will also look to attract and retain the best talent, with succession planning that puts a strong emphasis on bringing Millennials up through the ranks. “There is a huge difference between the way Millennials look at the workplace and live their lives, versus industry professionals born in the 1960s or 1970s – the two generations are completely different,” says Swallow. “Those guys [Millennials] would no sooner write a cheque to pay for something than fly to the moon.”

Case for collaboration

If (re)insurers drag their heels in embracing and investing in new technology and analytics capabilities, disruption could well come from outside the industry. Back in 2015, Lloyd’s CEO Inga Beale warned that insurers were in danger of being “Uber-ized” as technology allows companies from Google to Walmart to undermine the sector’s role of managing risk.

Her concerns are well founded, with Google launching a price comparison site in the U.S. and Rakuten and Alibaba, Japan and China’s answers to Amazon respectively, selling a range of insurance products on their platforms.

“No area of the market is off-limits to well-organized technology companies that are increasingly encroaching everywhere,” says Rob Procter, CEO of Securis Investment Partners. “Why wouldn’t Google write insurance… particularly given what they are doing with autonomous vehicles? They may not be insurance experts but these technology firms are driving the advances in terms of volumes of data, data manipulation, and speed of data processing.”

Procter makes the point that the reinsurance industry has already been disrupted by the influx of third-party capital into the insurance-linked securities (ILS) space over the past 10 to 15 years. Collateralized products such as catastrophe bonds, sidecars and non-traditional reinsurance have fundamentally altered the reinsurance cycle and exposed the industry’s inefficiencies like never before.

“We’ve been innovators in this industry because we came in ten or 15 years ago, and we’ve changed the way the industry is structured and is capitalized and how the capital connects with the customer,” he says. “But more change is required to bring down expenses and to take out what are massive friction costs, which in turn will allow reinsurance solutions to be priced competitively in situations where they are not currently.

“It’s astounding that 70 percent of the world’s catastrophe losses are still uninsured,” he adds. “That statistic has remained unchanged for the last 20 years. If this industry was more efficient it would be able to deliver solutions that work to close that gap.”

Collaboration is the key to leveraging technology – or insurtech – expertise and getting closer to the original risk. There are numerous examples of tie-ups between (re)insurance industry incumbents and tech firms. Others have set up innovation garages or bought their way into innovation, acquiring or backing niche start-up firms. Silicon Valley, Israel’s Silicon Wadi, India’s tech capital Bangalore and Shanghai in China are now among the favored destinations for scouting visits by insurance chief innovation officers.

One example of a strategic collaboration is the managing general agent (MGA) Attune, set up last year by AIG, Hamilton Insurance Group, and affiliates of Two Sigma Investments. Through the partnership, AIG gained access to Two Sigma’s vast technology and data-science capabilities to grow its market share in the U.S. small to mid-sized commercial insurance space.

“The challenge for the industry is to remain relevant to our customers,” says Steel. “Those that fail to adapt will get left behind. To succeed you’re going to need greater information about the underlying risk, the ability to package the risk in a different way, to select the appropriate risks, differentiate more, and construct better portfolios.”

Investment in technology in and of itself is not the solution, thinks Swallow, who believes there has been too much focus on process and not enough on product design. “Insurtech is an amazing opportunity but a lot of people seem to spend time looking at the fulfillment of the product – what ‘Chily’ [Swallow’s business partner and industry guru Grahame Chilton] would call ‘plumbing’.

“In our industry, there is still so much attention on the ‘plumbing’ and the fact that the plumbing doesn’t work, that insurtech isn’t yet really focused on compliance, regulation of product, which is where all the real gains can be found, just as they have been in the capital markets,” adds Swallow.

Taking out the friction

Blockchain, however, states Swallow, is “plumbing on steroids.” “Blockchain is nothing but pure, unadulterated disintermediation. My understanding is that if certain events happen at the beginning of the chain, then there is a defined outcome that actually happens without any human intervention at the other end of the chain.”

In January, Aegon, Allianz, Munich Re, Swiss Re, and Zurich launched the Blockchain Insurance Industry Initiative, a “$5 billion opportunity” according to PwC. The feasibility study will explore the potential of distributed ledger technologies to better serve clients through faster, more convenient and secure services.

“BLOCKCHAIN FOR THE REINSURANCE SPACE IS AN EFFICIENCY TOOL. AND IF WE ALL GET MORE EFFICIENT, YOU ARE ABLE TO INCREASE INSURABILITY BECAUSE YOUR PRICES COME DOWN”

— KURT KARL, SWISS RE

Blockchain offers huge potential to reduce some of the significant administrative burdens in the industry, thinks Kurt Karl, chief economist at Swiss Re. “Blockchain for the reinsurance space is an efficiency tool. And if we all get more efficient, you are able to increase insurability because your prices come down, and you can have more affordable reinsurance and therefore more affordable insurance. So I think we all win if it’s a cost saving for the industry.”

Collaboration will enable those with scale to behave like nimble start-ups, explains Karl. “We like scale. We’re large. I’ll be blunt about that,” he says. “For the reinsurance space, what we do is to leverage our size to differentiate ourselves. With size, we’re able to invest in all these new technologies and then understand them well enough to have a dialogue with our clients. The nimbleness doesn’t come from small insurers; the nimbleness comes from insurance tech start-ups.”

He gives the example of Lemonade, the peer-to-peer start-up insurer that launched last summer, selling discounted homeowners’ insurance in New York. Working off the premise that insurance customers lack trust in the industry, Lemonade has based its business model around returning premium to customers when claims are not made. In its second round of capital raising, Lemonade secured funding from XL Group’s venture fund, which is also a reinsurance partner of the innovative new firm. The firm is also able to offer faster, more efficient claims processing.

“Lemonade’s [business model] is all about efficiency and the cost saving,” says Karl. “But it’s also clearly of benefit to the client, which is a lot more appealing than a long, drawn-out claims process.”

Tearing up the rule book

By collecting and utilizing data from customers and third parties, personal lines insurers are now able to offer more customized products and, in many circumstances, improve the underlying risk. Customers can win discounts for protecting their homes and other assets, maintaining a healthy lifestyle and driving safely. In a world where products are increasingly designed with the digital native in mind, drivers can pay-as-they-go and property owners can access cheaper home insurance via peer-to-peer models.

Reinsurers may be one step removed from this seismic shift in how the original risk is perceived and underwritten, but just as personal lines insurers are tearing up the rule book, so too are their risk partners. It is over 300 years since the first marine and fire insurance policies were written. In that time (re)insurance has expanded significantly with a range of property, casualty, and specialty products.

However, the wordings contained in standard (re)insurance policies, the involvement of a broker in placing the business and the face-to-face transactional nature of the business – particularly within the London market – have not altered significantly over the past three centuries. Some are questioning whether these traditional indemnity products are the right solution for all classes of risk.

“We think people are often insuring cyber against the wrong things,” says Dane Douetil, group CEO of Minova Insurance. “They probably buy too much cover in some places and not nearly enough in areas where they don’t really understand they’ve got a risk. So we’re starting from the other way around, which is actually providing analysis about where their risks are and then creating the policy to cover it.”

“There has been more innovation in intangible type risks, far more in the last five to ten years than probably people give credit for. Whether you’re talking about cyber, product recall, new forms of business interruption, intellectual property or the huge growth in mergers and acquisition coverages against warranty and indemnity claims – there’s been a lot of development in all of those areas and none of that existed ten years ago.”

Closing the gap

Access to new data sources along with the ability to interpret and utilize that information will be a key instrument in improving the speed of settlement and offering products that are fit for purpose and reflect today’s risk landscape. “We’ve been working on a product that just takes all the information available from airlines, about delays and how often they happen,” says Karl. “And of course you can price off that; you don’t need the loss history, all you need is the probability of the loss, how often does the plane have a five-hour delay?”

“All the travel underwriters then need to do is price it ‘X’, and have a little margin built-in, and then they’re able to offer a nice new product to consumers who get some compensation for the frustration of sitting there on the tarmac.”
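Karl’s flight-delay product amounts to pricing a parametric cover directly off the probability of the trigger, plus a loading for expenses and margin. The sketch below shows that calculation; the delay probability, payout and loading are assumptions chosen purely for illustration and do not describe any actual product.

```python
# Hypothetical parametric flight-delay cover: a fixed payout if the flight is
# delayed by five hours or more. All figures are assumptions for illustration.
p_trigger = 0.004        # assumed probability of a 5-hour-plus delay on this route
payout = 300.0           # fixed compensation paid to the traveler, in USD
loading = 0.30           # margin for expenses and profit

pure_premium = p_trigger * payout              # expected payout cost: 1.20
gross_premium = pure_premium * (1 + loading)   # price charged: 1.56

print(f"Pure premium:  {pure_premium:.2f} USD")
print(f"Gross premium: {gross_premium:.2f} USD")
```

Because the payout depends only on the observed delay rather than on the traveler’s actual loss, no loss history or claims adjustment is needed, which is also where the basis risk discussed below comes from.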

With more esoteric lines of business such as cyber, parametric products could be one solution to providing meaningful coverage for a rapidly evolving corporate risk. “The corporates of course want indemnity protection, but that’s extremely difficult to do,” says Karl. “I think there will be some of that but also some parametric, because it’s often a fixed payout that’s capped and is dependent upon the metric, as opposed to indemnity, which could well end up being the full value of the company. Because you can potentially have a company destroyed by a cyber attack at this point.”

One issue to overcome with parametric products is the basis risk aspect. This is the risk that an insured suffers a significant loss of income, but its cover is not triggered. However, as data and risk management improves, the concerns surrounding basis risk should reduce.
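
A simple illustration of that mismatch, using invented loss and index figures: in the first event the insured suffers a loss but the index stays just below the trigger, so nothing is paid; in the second the trigger is breached without any corresponding loss.

```python
# Minimal sketch of basis risk in a parametric cover: the insured's actual loss
# and the index-triggered payout can diverge. All data below is purely illustrative.

events = [
    # (actual_loss, index_value)
    (500_000, 4.8),   # large loss, but index just below trigger -> no payout
    (0,       5.4),   # no loss, but index breaches trigger -> payout anyway
    (250_000, 6.1),   # loss and trigger aligned
]

TRIGGER = 5.0          # index level at which the cover pays
FIXED_PAYOUT = 300_000

for actual_loss, index_value in events:
    payout = FIXED_PAYOUT if index_value >= TRIGGER else 0
    shortfall = max(actual_loss - payout, 0)  # uncovered loss borne by the insured
    print(f"loss={actual_loss:>9,}  index={index_value}  payout={payout:>9,}  uncovered={shortfall:>9,}")
```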

Improving the underlying risk

The evolution of the cyber (re)insurance market also points to a new opportunity in a data-rich age: pre-loss services. By tapping into a wealth of claims and third-party data sources, successful (re)insurers of the future will be in an even stronger position to help their insureds become resilient and incident-ready. In cyber, these services are already part of the package and include security consultancy, breach-response services and simulated cyber attacks to test the fortitude of corporate networks and raise awareness among staff. “We’ve heard about the three ‘Vs’ when harnessing data – velocity, variety, and volume. In our industry we need to add a fourth: veracity,” says Yau. “When making decisions around which risks to write, our clients need to have allocated the right capital to back that decision or show regulators what parameters fed that decision.”

“WE DO A DISSERVICE TO OUR INDUSTRY BY SAYING THAT WE’RE NOT INNOVATORS, THAT WE’RE STUCK IN THE PAST”

— DANE DOUETIL, MINOVA INSURANCE

IoT is not just an instrument for personal lines. Just as insurance companies are utilizing data collected from connected devices to analyze individual risks and feed back information to improve the risk, (re)insurers also have an opportunity to utilize third-party data. “GPS sensors on containers can allow insurers to monitor cargo as it flows around the world – there is a use for this technology to help mitigate and manage the risk on the front end of the business,” states Steel.

Information is only powerful if it is analyzed effectively and available in real-time as transactional and pricing decisions are made, thinks RMS’ Steel. “The industry is getting better at using analytics and ensuring the output of analytics is fed directly into the hands of key business decision makers.”

“It’s about using things like portfolio optimization, which even ten years ago would have been difficult,” he adds. “As you’re using the technologies that are available now you’re creating more efficient capital structures and better, more efficient business models.”
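
As a rough illustration of the idea Steel alludes to (not RMS’s methodology), the sketch below allocates a premium budget across lines of business to maximize the average underwriting result while keeping the worst-scenario loss within a capital budget. The loss ratios, budgets and expense assumption are all invented.

```python
# Toy portfolio optimization: brute-force search over premium allocations,
# maximizing the average underwriting result subject to a tail-loss constraint.
# All inputs are illustrative assumptions only.

from itertools import product

# Illustrative loss ratios per line under four scenarios.
loss_ratios = {
    "property":  [0.45, 0.55, 1.50, 0.50],
    "casualty":  [0.55, 0.65, 0.60, 1.10],
    "specialty": [0.40, 0.95, 0.55, 0.75],
}
premium_budget = 100.0
expenses = 0.30 * premium_budget
tail_budget = 45.0  # worst acceptable single-scenario underwriting loss

best = None
for w in product(range(11), repeat=3):          # allocation weights in 10% steps
    if sum(w) != 10:
        continue
    premiums = dict(zip(loss_ratios, (premium_budget * x / 10 for x in w)))
    results = []
    for s in range(4):
        losses = sum(premiums[line] * loss_ratios[line][s] for line in loss_ratios)
        results.append(premium_budget - losses - expenses)
    if -min(results) <= tail_budget:            # capital constraint
        candidate = (sum(results) / len(results), premiums)
        if best is None or candidate[0] > best[0]:
            best = candidate

print(f"Best average result {best[0]:.2f} with allocation {best[1]}")
```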

Minova’s Douetil thinks the industry is stepping up to the plate. “Insurance is effectively the oil that lubricates the economy,” he says. “Without insurance, as we saw with the World Trade Center disaster and other catastrophes, the whole economy could come to a grinding halt pretty quickly if you take the ‘oil’ away.”

“That oil has to continually adapt and be innovative in terms of being able to serve the wider economy,” he continues. “But I think we do a disservice to our industry by saying that we’re not innovators, that we’re stuck in the past. I just think about how much this business has changed over the years.”

“It can change more, without a doubt, and there is no doubt that the communication capabilities that we have now mean there will be a shortening of the distribution chain,” he adds. “That’s already happening quite dramatically and in the personal lines market, obviously even more rapidly.”


Managing the next financial shock

EXPOSURE reports on how a pilot project to stress test banks’ exposure to drought could hold the key to future economic resilience.

There is a growing recognition that environmental stress testing is a crucial instrument to ensure a sustainable financial system. In December 2016, the Task Force on Climate-related Financial Disclosures (TCFD) released its recommendations for effective disclosure of climate-related financial risks.

“This represents an important effort by the private sector to improve transparency around climate-related financial risks and opportunities,” said Michael Bloomberg, chair of the TCFD. “Climate change is not only an environmental problem, but a business one as well. We need business leaders to join us to help spread these recommendations across their industries in order to help make markets more efficient and economies more stable, resilient and sustainable.”

Why drought?

Drought is a significant potential source of shock to the global financial system. There is a common misconception that sustained lack of water is primarily a problem for agriculture and food production. In Europe alone, it is estimated that around 40 percent of total water extraction is used for industry and energy production (cooling in power plants) and 15 percent for public water supply. The main water consumption sectors are irrigation, utilities and manufacturing.

The macro-economic impact of a prolonged or systemic drought could therefore be severe, and is currently the focus of a joint project between RMS and a number of leading financial institutions and development agencies to stress test lending portfolios to see how they would respond to environmental risk.

“ONLY BY BRINGING TOGETHER MINISTERIAL LEVEL GOVERNMENT OFFICIALS WITH LEADERS IN COMMERCE CAN WE ADDRESS THE WORLD’S BIGGEST ISSUES”

— DANIEL STANDER, RMS

“Practically every industry in the world has some reliance on water availability in some shape or form,” states Stephen Moss, director, capital markets at RMS. “And, as we’ve seen, as environmental impacts become more frequent and severe, so there is a growing awareness that water — as a key future resource — is starting to become more acute.”

“So the questions are: do we understand how a lack of water could impact specific industries and how that could then flow down the line to all the industrial activities that rely on the availability of water? And then how does that impact on the broader economy?” he continues. “We live in a very interconnected world and as a result, the impact of drought on one industry sector or one geographic region can have a material impact on adjacent industries or regions, regardless of whether they themselves are impacted by that phenomenon or not.”

This interconnectivity is at the heart of why a hazard such as drought could become a major systemic threat for the global financial system, explains RMS scientist, Dr. Navin Peiris. “You could have an event or drought occurring in the U.S. and any reduction in production of goods and services could impact global supply chains and draw in other regions due to the fact the world is so interconnected.”

The ability to model how drought is likely to impact banks’ loan default rates will enable financial institutions to accurately measure and control the risk. If banks adjust their own risk management practices accordingly and are motivated to encourage better water conservation behaviors amongst their corporate borrowers, there should be a positive knock-on effect that ripples down, explains Moss.

“The expectation would be that in the same way that an insurance company incorporates the risk of having to pay out on a large natural event, a bank should also be incorporating that into their overall risk assessment of a corporate when providing a loan, and including that incremental element in the pricing,” he says. “And just as insureds are motivated to defend themselves against flood or to put sprinklers in the factories in return for a lower premium, if you could provide financial incentives to borrowers through lower loan costs, businesses would then be encouraged to improve their resilience to water shortage.”
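
A minimal sketch of the kind of adjustment Moss describes, assuming a simple expected-loss view of credit pricing; the default probabilities, loss-given-default and cost figures are illustrative and are not outputs of the RMS/NCFA tool.

```python
# Minimal sketch of pricing a drought-related increment into a corporate loan rate,
# assuming a simple expected-loss view of credit risk. All inputs are illustrative.

def loan_rate(pd_annual: float, lgd: float, funding_cost: float = 0.02,
              margin: float = 0.005) -> float:
    """Annual loan rate = funding cost + expected credit loss (PD x LGD) + margin."""
    return funding_cost + pd_annual * lgd + margin

baseline_pd = 0.010   # borrower's probability of default in a normal year
drought_pd = 0.016    # stressed PD once drought exposure is modeled
lgd = 0.45            # loss given default

print(f"Baseline rate: {loan_rate(baseline_pd, lgd):.3%}")
print(f"Drought-adjusted rate: {loan_rate(drought_pd, lgd):.3%}")
# A borrower that invests in water resilience could be offered something closer
# to the baseline rate -- the financial incentive Moss describes.
```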

A critical stress test

In May 2016, the Natural Capital Finance Alliance, which is made up of the Global Canopy Programme (GCP) and the United Nations Environment Programme Finance Initiative, teamed up with Deutsche Gesellschaft für Internationale Zusammenarbeit (GIZ) GmbH Emerging Markets Dialogue on Finance (EMDF) and several leading financial institutions to launch a project to pilot scenario modeling.

“THERE IS A GROWING AWARENESS THAT WATER — AS A KEY FUTURE RESOURCE — IS STARTING TO BECOME MORE ACUTE”

— STEPHEN MOSS, RMS

Funded by the German Federal Ministry for Economic Cooperation and Development (BMZ), RMS was appointed to develop a first-of-its-kind drought model. The aim is to help financial institutions and wider economies become more resilient to extreme droughts, as Yannick Motz, head of the emerging markets dialogue on finance, GIZ, explains.

“GIZ has been working with financial institutions and regulators from G20 economies to integrate environmental indicators into lending and investment decisions, product development and risk management. Particularly in the past few years, we have experienced a growing awareness in the financial sector of climate-related risks.”

The Dust Bowl – the first distinct drought (1930–1931) of the ‘dust bowl’ years affected much of the northeastern and western U.S.

“The lack of practicable methodologies and tools that adequately quantify, price and assess such risks, however, still impedes financial institutions in fully addressing and integrating them into their decision-making processes,” he continues. “Striving to contribute to filling this gap, GIZ and NCFA initiated this pilot project with the objective to develop an open-source tool that allows banks to assess the potential impact of drought events on the performance of their corporate loan portfolio.”

It is a groundbreaking project between key stakeholders across public and private sectors, according to RMS managing director Daniel Stander. “There are certain things in this world that you can only get done at a Davos level. You need to bring ministerial-level government officials and members of commerce together. It’s only that kind of combination that is going to address the world’s biggest issues. At RMS, experience has taught us that models don’t just solve problems. With the right level of support, they can make markets and change behaviors as well. This initiative is a good example of that.”

RMS adapted well-established frameworks from the insurance sector to build – in a consortium complemented by the Universities of Cambridge and Oxford – a tool for banks to stress test the impact of drought. The model was built in close collaboration with several financial institutions, including the Industrial and Commercial Bank of China (ICBC), Caixa Econômica Federal, Itaú and Santander in Brazil, Banorte, Banamex and Trust Funds for Rural Development (FIRA) in Mexico, UBS in Switzerland and Citigroup in the U.S.

“Some of the largest losses we saw in some of our scenarios were not necessarily as a result of an industry sector not having access to water, but because other industry sectors didn’t have access to water, so demand dropped significantly and those companies were therefore not able to sell their wares. This was particularly true for petrochemical businesses that are heavily reliant on the health of the broader economy,” explains Moss. “So, this model is a broad framework that incorporates domestic interconnectivity and trade, as well as global macroeconomic effects.”
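
The toy calculation below illustrates the interconnectivity point, not the RMS framework itself: a sector with only a small direct water shortfall can still suffer a larger total loss once the demand it relies on from harder-hit sectors is counted. All sectors, shares and shock sizes are invented.

```python
# Illustrative sketch of indirect demand effects dominating a direct water shortage,
# using a single round of demand propagation. Figures are invented for illustration.

# Fraction of each sector's revenue that depends on demand from other sectors.
demand_links = {
    "petrochemicals": {"manufacturing": 0.5, "agriculture": 0.2},
    "manufacturing":  {"agriculture": 0.3, "petrochemicals": 0.1},
    "agriculture":    {},
}

# Direct output loss from the drought itself (water unavailable to the sector).
direct_shock = {"petrochemicals": 0.02, "manufacturing": 0.10, "agriculture": 0.25}

for sector, links in demand_links.items():
    indirect = sum(share * direct_shock[customer] for customer, share in links.items())
    total = direct_shock[sector] + indirect
    print(f"{sector:>15}: direct {direct_shock[sector]:.0%}, total {total:.1%}")
```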

There is significant scope to apply this approach to modeling other major threats and potential sources of global economic shock, including natural, manmade and emerging perils. “The know-how we’ve applied on this project can be used to evaluate the potential impacts of other stresses,” explains Peiris. “Drought is just one environmental risk facing the financial services industry. This approach can be replicated to measure the potential impact of other systemic risks on macro and micro economic scales.”


The day a botnet took down the internet

The Dyn distributed denial of service (DDoS) attack in October 2016 highlighted security flaws inherent in the Internet of Things (IoT). EXPOSURE asks what this means for businesses and insurers as the world becomes increasingly connected.

A decade ago, Internet connections were largely limited to desktop computers, laptops, tablets, and smart phones. Since then there has been an explosion of devices with IP addresses, including baby monitors, connected home appliances, motor vehicles, security cameras, webcams, ‘Fitbits’ and other wearables. Gartner predicts there will be 20.8 billion things connected to the Internet by 2020.

In a hyper-connected world, governments, corporates, insurers and banks need to better understand the potential for systemic and catastrophic risk arising from a cyber attack seeking to exploit IoT vulnerabilities. With few actual examples of how such attacks could play out, realistic disaster scenarios and cyber modeling are essential tools by which (re)insurers can manage their aggregate exposures and stress test their portfolios.

“IF MALICIOUS ACTORS WANTED TO, THEY WOULD ATTACK CORE SERVICES ON THE INTERNET AND I THINK WE’D BE SEEING A NEAR GLOBAL OUTAGE”

— KEN MUNRO, PEN TEST PARTNERS

Many IoT devices currently on the market were not designed with strict IT security in mind. Ethical hackers have demonstrated how everything from cars to children’s toys can be compromised. These connected devices are often an organization’s weakest link. The cyber criminals responsible for the 2013 Target data breach are understood to have gained access to the retailer’s systems and the credit card details of over 40 million customers via the organization’s heating, ventilation and air conditioning (HVAC) system.

The assault on DNS hosting firm Dyn in October 2016, which brought down multiple websites including Twitter, Netflix, Amazon, Spotify, Reddit, and CNN in Europe and the U.S., was another wake-up call. The DDoS attack was perpetrated using the Mirai malware to compromise IoT systems. Like a parasite, the malware gained control of an estimated 100,000 devices, using them to bombard and overwhelm Dyn’s infrastructure.

This is just the tip of the iceberg, according to Ken Munro, partner, Pen Test Partners. “My first thought [following the Dyn attack] was ‘you ain’t seen nothing yet’. That particular incident was probably using the top end of a terabyte of data per second, and that’s nothing. We’ve already seen a botnet that is several orders of magnitude larger than that. If malicious actors wanted to, they would attack core services on the Internet and I think we’d be seeing a near global outage.”

In the rush to bring new IoT devices to market, IT security has been somewhat of an afterthought, thinks Munro. The situation is starting to change, though, with consumer watchdogs in Norway, the Netherlands and the U.S. taking action. However, there is a significant legacy problem to overcome and it will be several years before current security weaknesses are tackled in a meaningful way.

“I’ve still got our first baby monitor from 10 years ago,” he points out. “The Mirai botnet should have been impossible, but it wasn’t because a whole bunch of security camera manufacturers did a really cheap job. IT security wasn’t on their radar. They were thinking about keeping people’s homes secure without even considering that the device itself might actually be the problem.”

In attempting to understand the future impact of such attacks, it is important to gain a better understanding of motivation. For cyber criminals, DDoS attacks using IoT botnets could be linked to extortion attempts or to diverting the attention of IT professionals away from other activities. For state-sponsored actors, the purpose could be more sinister, with the intent to cause widespread disruption, and potentially physical damage and bodily harm.

Insurers stress-test “silent” cyber

It is the latter scenario that is of growing concern to risk and insurance managers. Lloyd’s, for instance, has asked syndicates to create at least three internal “plausible but extreme” cyber attack scenarios as stress-tests for cyber catastrophe losses. It has asked them to calculate their total gross aggregate exposure to each scenario across all classes, including “silent” cyber.
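
In practice that aggregation is a bookkeeping exercise across the whole portfolio. The sketch below shows the shape of it with invented policies: for each scenario, sum the gross limits of every exposed class and flag how much of the total sits on non-affirmative (“silent”) wordings.

```python
# Minimal sketch of scenario-level gross aggregate exposure across classes,
# tagging "silent" (non-affirmative) cyber. Policies, limits and scenario names
# are invented for illustration.

policies = [
    # (class, gross_limit, affirmative_cyber, exposed_scenarios)
    ("cyber",        10_000_000, True,  {"power_blackout", "cloud_outage"}),
    ("property",     25_000_000, False, {"power_blackout"}),     # silent exposure
    ("marine_cargo",  8_000_000, False, {"port_system_hack"}),   # silent exposure
    ("casualty",     15_000_000, True,  {"cloud_outage"}),
]

scenarios = {"power_blackout", "cloud_outage", "port_system_hack"}

for scenario in sorted(scenarios):
    total = sum(limit for _, limit, _, exposed in policies if scenario in exposed)
    silent = sum(limit for _, limit, affirmative, exposed in policies
                 if scenario in exposed and not affirmative)
    print(f"{scenario:>17}: gross {total:>12,}  of which silent {silent:>12,}")
```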

AIG is also considering how a major cyber attack could impact its book of business. “We are looking at it, not only from our own ERM perspective, but also to understand what probable maximum losses there could be as we start to introduce other products and are able to attach cyber to traditional property and casualty policies,” explains Mark Camillo, head of cyber at AIG. “We look at different types of scenarios and how they would impact a book.”

AIG and a number of Lloyd’s insurers have expanded their cyber offerings to include cover for non-damage business interruption and physical damage and bodily harm arising from a cyber incident. Some carriers – including FM Global – are explicitly including cyber in their traditional suite of products. Others have yet to include explicit wording on how traditional products would respond to a cyber incident.

“WE’RE RELEASING A NUMBER OF CYBER-PHYSICAL ATTACK SCENARIOS THAT CAUSE LOSSES TO TRADITIONAL PROPERTY INSURANCE”

— ANDREW COBURN, RMS

“I don’t know if the market will move towards exclusions or including affirmative cyber coverage within property and casualty to give insureds a choice as to how they want to purchase it,” states Camillo. “What will change is that there is going to have to be some sort of due diligence to ensure cyber exposures are coded properly and carriers are taking that into consideration in capital requirements for these types of attacks.”

In addition to markets such as Lloyd’s, there is growing scrutiny from insurance industry regulators, including the Prudential Regulation Authority in the U.K., on how a major cyber event could impact the insurance industry and its capital buffers. They are putting pressure on those carriers that are currently silent on how their traditional products would respond, to make it clear whether cyber-triggered events would be covered under conventional policies.

“The reinsurance market is certainly concerned about, and constantly looking at the potential for, catastrophic events that could happen across a portfolio,” says William Henriques, senior managing director and co-head of the Cyber Practice Group at Aon Benfield. “That has not stopped them from writing cyber reinsurance and there’s enough capacity out there. But as the market grows and gets to $10 billion, and reinsurers keep supporting that growth, they are going to be watching that accumulation and potential for catastrophic risk and managing that.”

Catastrophic cyber scenarios

In December 2015 and again in December 2016, parts of Ukraine’s power grid were taken down. WIRED magazine noted that many parts of the U.S. grid were less secure than Ukraine’s and would take longer to reboot. It was eerily similar to a fictitious scenario published by Cambridge University’s Centre for Risk Studies in partnership with Lloyd’s in 2015. ‘Business Blackout’ considered the impact of a cyber attack on the U.S. power grid, estimating total economic impact from the 1-in-200 scenario would be $243 billion, rising to $1 trillion in its most extreme form.

It is not beyond the realms of possibility for Mirai-style malware targeting smart thermostats to be used to achieve such a blackout, thinks Pen Test Partners’ Ken Munro. “You could simultaneously turn them all on and off at the same time and create huge power spikes on the electricity grid. If you turn it on and off and on again quickly, you’ll knock out the grid – then we would see some really serious consequences.”
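
Some back-of-the-envelope arithmetic shows why synchronized switching matters; the device count and per-device load below are assumptions chosen purely for scale, not estimates of any real botnet.

```python
# Back-of-the-envelope illustration of Munro's point: the demand swing from a botnet
# of compromised heating devices. Device count and wattage are assumptions for scale.

compromised_devices = 1_000_000   # thermostats switching electric heating loads
load_per_device_kw = 3.0          # typical electric heater behind each thermostat

swing_gw = compromised_devices * load_per_device_kw / 1_000_000
print(f"Synchronized switching moves ~{swing_gw:.1f} GW of demand at once")
# Grid operators balance supply and demand continuously; a step change of several
# gigawatts, repeated rapidly, is the kind of shock that can trip protective systems.
```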

Smart thermostats could be compromised in other ways, for instance by targeting food and pharmaceutical facilities with the aim of spoiling goods. There is a commonly held belief that the industrial control systems and supervisory control and data acquisition (ICS/SCADA) systems used by energy and utility companies are immune to cyber attacks because they are disconnected from the Internet, a protective measure known as “air gapping”. Smart thermostats and other connected devices could render that defense obsolete.

In its latest Cyber Accumulation Management System (CAMS v2.0), RMS considers how silent cyber exposures could impact accumulation risk in the event of major cyber attacks on operations technology, using the Ukrainian power grid attack as an example. “We’re releasing a number of cyber-physical attack scenarios that cause losses to traditional property insurance,” explains Andrew Coburn, senior vice president at RMS and a founder and member of the executive team of the Cambridge Centre for Risk Studies.

“We’re working with our clients on trying to figure out what level of stress test should be running,” he explains. “The CAMS system we’ve released is about running large numbers of scenarios and we’re extending that to look at silent cover, things in conventional insurance policies that could potentially be triggered by a cyber attack, such as fires and explosions.”

Multiple lines of business could be impacted by a cyber event, thinks Coburn, including nearly all property classes as well as aviation and aerospace. “We have just developed some scenarios for marine and cargo insurance, offshore energy lines of business, industrial property, large numbers of general liability and professional lines, and, quite importantly, financial institutions professional indemnity, D&O and specialty lines.”

“The IoT is a key element of the systemic potential of cyber attacks,” he says. “Most of the systemic risk is about looking at your tail risk. Insurers need to look at how much capital they need to support each line of business, how much reinsurance they need to buy and how they structure their risk capital.”
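
As an illustration of that tail-risk view (not RMS’s CAMS methodology), the sketch below derives an indicated capital figure per line of business from simulated scenario losses using a simple tail value at risk measure; the loss distributions are random placeholders.

```python
# Minimal sketch of a per-line tail-risk view: indicated capital taken as TVaR at the
# 90th percentile over simulated annual losses. Loss figures are random placeholders.

import random

random.seed(0)
lines = ["property", "marine", "energy", "financial_lines"]

def tvar(losses, level=0.90):
    """Average of losses at or beyond the given percentile (tail value at risk)."""
    ranked = sorted(losses)
    tail = ranked[int(level * len(ranked)):]
    return sum(tail) / len(tail)

for line in lines:
    # Simulate 1,000 annual cyber-loss outcomes for the line (illustrative only).
    losses = [random.expovariate(1 / 5_000_000) for _ in range(1_000)]
    print(f"{line:>15}: indicated capital ~ {tvar(losses):>13,.0f}")
```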


Scenarios added to RMS CAMS v2.0

Cyber-Induced Fires in Commercial Office Buildings

Hackers exploit vulnerabilities in the smart battery management system of a common brand of laptop, sending their lithium-ion batteries into a thermal runaway state. The attack is coordinated to occur on one night. A small proportion of infected laptops that are left on charge overnight overheat and catch fire, and some unattended fires in commercial office buildings spread to cause major losses. Insurers face claims for a large number of fires in their commercial property and homeowners’ portfolios.

Cyber-Enabled Marine Cargo Theft from Port

Cyber criminals gain access to a port management system in use at several major ports. They identify high value cargo shipments and systematically switch and steal containers passing through the ports over many months. When the process of theft is finally discovered, the hackers scramble the data in the system, preventing the ports from operating for several days. Insurers face claims for cargo loss and business interruption in their marine lines.

ICS-Triggered Fires in Industrial Processing Plants

External saboteurs gain access to the process control network of large processing plants, and spoof the thermostats of the industrial control systems (ICS), causing heat-sensitive processes to overheat and ignite flammable materials in storage facilities. Insurers face sizeable claims for fire and explosions in a number of major industrial facilities in their large accounts and facultative portfolio.

PCS-Triggered Explosions on Oil Rigs

A disgruntled employee gains access to a Network Operations Centre (NOC) controlling a field of oil rigs, and manipulates several of the Platform Control Systems (PCS) to cause structural misalignment of well heads, damage to several rigs, oil and gas release, and fires. At least one platform has a catastrophic explosion. Insurers face significant claims for damage to multiple production facilities in their offshore energy book.

Regional Power Outage from Cyber Attack on U.S. Power Generation

A well-resourced cyber team infiltrates malware into the control systems of U.S. power generating companies that creates desynchronization in certain types of generators. Sufficient generators are damaged to cause a cascading regional power outage that is complex to repair. Restoration of power to 90 percent of customers takes two weeks. Insurers face claims in many lines of business, including large commercial accounts, energy, homeowners and specialty lines. The scenario is published as a Lloyd’s Emerging Risk Report ‘Business Blackout’ by Cambridge Centre for Risk Studies and was released in RMS CAMS v1.1.

Regional Power Outage from Cyber Attack on UK Power Distribution

A nation-state plants ‘Trojan Horse’ rogue hardware in electricity distribution substations, which are activated remotely to curtail power distribution and cause rolling blackouts intermittently over a multi-week campaign. Insurers face claims in many lines of business, including large commercial accounts, energy, homeowners and specialty lines. The scenario is published as ‘Integrated Infrastructure’ by Cambridge Centre for Risk Studies, and was released in RMS CAMS v1.1.