Monthly Archives: September 2016

Integrating Catastrophe Models Under Solvency II

For natural catastrophe risk, ensuring capital adequacy and maintaining an effective risk management framework under Solvency II requires the use of an internal model and the integration of sophisticated nat cat models into the process. But what are the benefits of using an internal model, and how can integrated cat models help a (re)insurer assess cat risk under the new regulatory regime?

Internal Model Versus the Standard Formula

Under Pillar I of the Directive, insurers are required to calculate their Solvency Capital Requirement (SCR), which is used to demonstrate to supervisors, policyholders, and shareholders that they have an adequate level of financial resources to absorb significant losses.

Companies have a choice between using the Standard Formula and an internal model (or partial internal model) when calculating their SCR, and many favor internal models despite the significant resources and regulatory approval they require. Internal models are more risk-sensitive and can more closely capture the true risk profile of a business by taking into account risks that are not always appropriately covered by the Standard Formula, often resulting in reduced capital requirements.

Catastrophe Risk is a Key Driver for Capital Under Solvency II

Rising insured losses from global natural catastrophes, driven by factors such as economic growth, increasing property values, rising population density, and greater insurance penetration (often in high-risk regions), all demonstrate the value of embedding a cat model into the internal model process.

Because of significant differences in data granularity between the Standard Formula and an internal model, the two approaches can produce markedly different solvency capital figures, with the internal model potentially yielding a lower SCR for the cat component.
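To make the contrast concrete, here is a minimal, purely illustrative Python sketch (not RMS code, and not the actual Standard Formula calibration). It compares a hypothetical factor-based charge with an internal-model-style estimate taken as the 99.5% Value-at-Risk of a simulated annual aggregate loss distribution, the one-year confidence level Solvency II prescribes for the SCR. The premium, factor, and loss distribution are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative Standard Formula-style charge: a fixed catastrophe factor
# applied to an exposure measure such as gross written premium.
# Both numbers are hypothetical, not the real Solvency II calibration.
gross_premium = 50_000_000   # EUR, assumed
cat_factor = 1.75            # assumed factor
scr_standard = gross_premium * cat_factor

# Illustrative internal-model-style estimate: simulate an annual aggregate
# cat loss distribution (a toy Poisson-frequency / lognormal-severity model
# standing in for full cat model output) and take the 99.5th percentile,
# i.e. the 1-in-200-year annual loss used for the Solvency II SCR.
n_years = 100_000
annual_losses = np.array([
    rng.lognormal(mean=15.0, sigma=1.2, size=rng.poisson(0.8)).sum()
    for _ in range(n_years)
])
scr_internal = np.percentile(annual_losses, 99.5)

print(f"Standard Formula-style cat charge:  {scr_standard:,.0f}")
print(f"Internal model 99.5% VaR (cat SCR): {scr_internal:,.0f}")
```

In practice the internal-model figure would come from the full event loss output of a vendor cat model, net of reinsurance and allowing for diversification, rather than from a toy distribution.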

The application of Solvency II is, however, not only about capital estimation; it also relates to effective risk management processes embedded throughout an organization. Implementing cat models fully within the internal model process, as opposed to relying solely on cat model loss output, can bring significant improvements to risk management. Cat models provide an opportunity to improve exposure data quality and allow model users to fully understand the benefits of complex risk mitigation structures and diversification. By better reflecting a company’s risk profile, they can reveal its potential exposure to cat risk and support better governance and strategic management decisions.

Managing Cat Risk Using Cat Models

A challenging aspect of bringing cat models in-house and integrating them into the internal model process is the selection of the “right” model and the “right” method to evaluate a company’s cat exposure. Catastrophe model vendors are therefore obliged to help users understand underlying assumptions and their inherent uncertainties, and provide them with the means of justifying model selection and appropriateness.

RMS supports insurers in fulfilling these requirements, offering model users deep insight into the underlying data, assumptions, and model validation, so they have complete confidence in model strengths and limitations. With that knowledge, insurers can understand, take ownership of, and implement their own view of risk, and then demonstrate it to make more informed strategic decisions, as required by the Own Risk and Solvency Assessment (ORSA), which lies at the heart of Solvency II.

The Cure for Catastrophe?

On August 24, 2016 – just a few weeks ago – an earthquake hit a remote area of the Apennine mountains of central Italy in the middle of the night. Fewer than 3000 people lived in the vicinity of the strongest shaking. But nearly 1 in 10 of those died when the buildings in which they were sleeping collapsed.

This disaster, like almost all disasters, was squarely man-made. Manufactured by what we build and where we build it; or in more subtle ways – by failing to anticipate what will one day inevitably happen.

Italy has some of the richest and best researched disaster history of any country, going back more than a thousand years. The band of earthquakes that runs through the Apennines is well mapped – pretty much this exact same earthquake happened in 1639. If you were identifying the highest risk locations in Italy, these villages would be on your shortlist. So in the year 2016, 300 people dying in a well-anticipated, moderate-sized earthquake, in a rich and highly-developed country, is no longer excusable.

Half the primary school in the town of Amatrice collapsed in the August 24 earthquake. Very fortunately, it being the middle of the night, no children were in class. Four years before, €700,000 had been spent to make the school “earthquake proof.” An investigation is now underway to see why this proofing failed so spectacularly. If only Italy were as good at building disaster resilience as it is at mobilizing disaster response: some 7,000 emergency responders arrived after the earthquake, more than twice the number of people living in the affected villages.

The unnatural disaster

When we look back through history and investigate disasters closely, we find that many other “natural disasters” were, in their different ways, also man-made.

The city of Saint-Pierre on the island of Martinique was once known as the “little Paris of the Caribbean.” In 1900 it had a population of 26,000, with tree-lined streets of balconied two- and three-story houses. From the start of 1902 it was clear that the neighbouring volcano of Mont Pelée was heading towards an eruption. The island’s governor convened a panel of experts who concluded that Saint-Pierre was at no risk because the valleys beneath the volcano would guide the products of any eruption directly into the sea. As the tremors increased, the Governor brought his family to Saint-Pierre to show the city was safe, and therefore, like all but one of the city’s inhabitants, died when the eruption blasted sideways out of the volcano. There are parallels here with the story of the 20,000 people drowned in the 2011 Japanese tsunami, many of whom had assumed they would be protected by concrete tsunami walls and therefore did not bother to escape while they still had time. We should distrust simple notions of where is safe when they rest only on some untested theory.

Sometimes the disaster reflects the unforeseen consequence of some manmade intervention. In spring 1965, the U.S. Army Corps of Engineers completed the construction of a broad shipping canal – known as the Mississippi River Gulf Outlet (“Mr Go”) – linking New Orleans with the Gulf of Mexico. Within three months, a storm surge flood driven by the strong easterly winds ahead of Hurricane Betsy was funnelled up Mr Go into the heart of the city. Without Mr Go the city would not have flooded. Four decades later Hurricane Katrina performed the same trick on New Orleans again, only this time the storm surge was three feet higher. The flooding was exacerbated when thin concrete walls lining drainage canals fell over without being overtopped. Channels meant for pumping water out of the city reversed their intended function and became the means by which the city was inundated.

These were fundamental engineering and policy failures, for which many vulnerable people paid the price.

RiskTech   

My new book, “The Cure for Catastrophe,” challenges us to think differently about disasters. To understand how risk is generated before the disaster happens. To learn from countries like Holland, which over the centuries has mastered its ever-threatening flood catastrophes by fostering a culture of disaster resilience.

Today we can harness powerful computer technology to help anticipate and reduce disasters. Catastrophe models, originally developed to price and manage insurance portfolios, are being converted into tools to model metrics on human casualties or livelihoods as well as monetary losses. And based on these measurements we can identify where to focus our investments in disaster reduction.
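As a rough illustration of how such casualty metrics might guide investment, the short Python sketch below (all district names, event rates, and casualty counts are hypothetical) weights each modeled event by its annual rate to obtain expected annual casualties per district, then ranks districts so that resilience spending can be directed where the modeled benefit is greatest.

```python
from collections import defaultdict

# Hypothetical event table: (district, annual_rate, modeled_casualties),
# standing in for the casualty output of a catastrophe model.
events = [
    ("District A", 0.010, 1200),
    ("District A", 0.002, 5400),
    ("District B", 0.008,  300),
    ("District C", 0.001, 9000),
]

# Expected annual casualties per district = sum of rate * casualties.
expected = defaultdict(float)
for district, rate, casualties in events:
    expected[district] += rate * casualties

# Rank districts to suggest where retrofit or resilience spending
# would avert the most modeled casualties per year.
for district, value in sorted(expected.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{district}: {value:.1f} expected casualties per year")
```

The same ranking logic applies equally to monetary loss or livelihood metrics; only the event table changes.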

In 2015 the Tokyo City government was the first to announce it aims to halve its earthquake casualties and measure progress by using the results of a catastrophe model. The frontline towns of Italy should likewise have their risks modeled and independently audited, so that we can see if they are making progress in saving future lives before they suffer their next inevitable earthquake.

 

The Cure for Catastrophe is published by Oneworld (UK) and Basic Books (US)

The Rise and Stall of Terrorism Insurance

In the 15 years since the terrorist attacks of September 11, 2001, partnerships between the public sector and private industry have yielded more effective security and better public awareness about the threat of terrorism. We may never come to terms with the sheer scale of human loss from that day, or from the hundreds of attacks that continue every year. But we have achieved greater resilience in the face of the ongoing realities of terrorism – except when it comes to planning for the catastrophic costs of rebuilding in its aftermath.

Terrorism insurance is facing a structural crisis: hundreds of terrorist attacks occur annually, but actual insurance payouts have been negligible. The economic costs of terrorism have skyrocketed, but demand for terrorism coverage has remained relatively flat. And despite a proliferation of catastrophe bonds and other forms of alternative capital flooding into the property insurance market, relatively little terrorism risk has been transferred to the capital markets. If terrorism insurance – and the insurers who provide it – are to remain relevant, they must embrace the new tools and data available to them to create more relevant products, more innovative coverages, and new risk transfer mechanisms that address today’s threat landscape.

The September 11, 2001 attacks rank among the largest insurance losses in history at $44 billion, placing them alongside severe catastrophes such as Hurricane Katrina ($70 billion), the Tohoku earthquake and tsunami ($38 billion), and Hurricane Andrew ($25 billion).

But unlike natural catastrophes, whose damages span hundreds of kilometers, most of the 9/11 damages in New York were concentrated in an area of just 16 acres. Such extreme concentration of loss caused a crisis in the insurance marketplace and highlighted the difficulty of insuring against such a peril.

Following the September 11 attacks, most insurers excluded terrorism from their policies, forcing the U.S. government to step in and provide a backstop through the Terrorism Risk Insurance Act (2002). Terrorism insurance has become more cost effective as insurer capacity for terrorism risk has increased. Today an estimated 40 insurers provide it on a stand-alone basis, and many others bundle it with standard property insurance contracts.

But despite better data on threat groups, more sophisticated terrorism modeling tools, and increased transparency into the counter-terrorism environment, terrorism insurance hasn’t changed all that much in the past 15 years. The contractual coverage is the same – usually distinguishing between conventional and CBRN (chemical, biological, radiological, and nuclear) attacks. And terrorism insurance take-up remains minimal where attacks occur most frequently, in the Middle East and Africa, highlighting what policymakers refer to as a widening “protection gap.”

Closing this gap – through new products, coverages, and risk transfer schemes – will enable greater resilience following an attack and promote a more comprehensive understanding of the global terrorism risk landscape.

What can you learn from Exposure?

Many RMS team members are active bloggers and speakers, and contribute to articles, but to capture even more of our expertise and insight, and to reflect the breadth of our activities, we have created a new magazine called Exposure, which is now ready for you to download.

The aim of Exposure magazine is to bring together topics of special interest to catastrophe and risk management professionals, and to recognize the vast ground that individuals involved in risk management must cover today. There is also a theme we believe unites the risk community: the belief that “risk is opportunity.” The articles within Exposure reflect a market seeking to avoid surprises, improve business performance, and innovate to create new opportunities for growth.

In the foreword to Exposure, Hemant Shah, CEO of RMS, also reflects on an “inflection point” in the industry, driven by a mix of globalization, changing market structures, technology, data, and analytics, offering a chance for the industry to innovate and increase its relevance. Exposure mixes articles examining perils and regions with pieces on industry issues and on what is coming up for our industry.

On perils and regions, Exposure looks at opportunities in U.S. and European flood, the effect that extra-tropical transitioning has on typhoons in Japan, the impact of secondary hazards such as liquefaction, and the earthquake sequence that hit the low-seismicity Canterbury region of New Zealand in 2010 and 2011.

The magazine also tackles issues around Solvency II, the emergence of the “grey swan” event, why reinsurers are opting to buy or build their own insurance-linked securities fund management capabilities, and wraps up with Robert Muir-Wood, chief research officer for RMS, explaining how insurers can help drive the resilience analytics revolution.

Please download your copy now.

Fire Weather

Fires can start at all times and places, but how a fire spreads is principally down to the weather.

This week, 350 years ago, fire broke out at Thomas Farriner’s bakery on Pudding Lane, a small alleyway running down to the river from the City of London, at the quietest time of the week: around 1 a.m. on Sunday morning, September 2, 1666. London had been experiencing a drought and the thatched roofs of the houses were tinder dry. At 4 a.m. the Lord Mayor, roused from his sleep, decided the blaze was easily manageable. It was already too late, however. By 7 a.m. the roofs of some 300 houses were burning and, fanned by strong easterly winds, the fire was spreading fast towards the west. Within three days it had consumed 13,000 houses and left 70,000 homeless.

In the city’s reconstruction only brick and tile houses were permitted, severely reducing the potential for repeat conflagrations. Within a few years the first fire insurers appeared, growing their business as fear outran the risk.

Yet big city fires had by no means gone away, and the wooden cities of northern Europe were primed to burn. The 1728 Copenhagen fire destroyed 28% of the city, while the 1795 fire left 6,000 homeless. A quarter of the city of Helsinki burned down in November 1808. The 1842 fire that destroyed Hamburg left 20,000 homeless. The center of Bergen, Norway, burnt down in 1855 and then again in January 1916.

Wind and fire

By the start of the 20th century, improvements in fire-fighting had reduced the chance of a great city fire taking hold, but not when there were strong winds, as in the 1916 Bergen, Norway, fire, which broke out in the middle of an intense windstorm with hurricane-force gusts. In February 1941 the fire that burnt out the historic center of Santander on the coast of northern Spain was driven by an intense windstorm, equivalent to the October 1987 storm in the U.K. And then there is the firestorm that destroyed Yokohama and Tokyo after the 1923 earthquake, driven by 50-mile-per-hour winds on the outer edge of a typhoon, in which an estimated 140,000 died over a few hours.

Wind and fire in the wooden city are a deadly combination. Above a certain wind speed, a fire becomes an uncontrollable firestorm. The 1991 Oakland Hills fire flared up late in the morning, also on a Sunday, and then surged out of the hills into the city, driven by hot, dry, 60-mile-per-hour Diablo winds from the east, jumping an eight-lane highway and overwhelming the ability of the fire crews to hold the line, until the wind eventually turned and the fire blew back over its own embers. The fire consumed 2,800 houses, spreading so fast that 25 people died. On February 7, 2009, a strong northwesterly wind drew baking air out of Australia’s interior and fires took off across the state of Victoria. Fallen power cables sparked a fire whose embers, blown by 60-mile-per-hour winds, flashed from one woodland to another, overwhelming several small towns so fast that 173 people died before they could escape.

Most recently we have seen firestorms in Canada. Again, there is nothing new about the phenomenon: the Matheson fire of 1916 destroyed 49 Ontario towns and killed 244 people along a fire front some 60 km wide. It was a firestorm fanned by gale-force winds that destroyed one third of the town of Slave Lake, Alberta, in 2011, and it is fortunate that the roads were broad and straight enough to allow people to escape the fires that raged into Fort McMurray in 2016.

There is no remedy for a firestorm blown on gale-force winds. And wooden properties close to drought-ridden forests are at very high risk, whether from South Lake Tahoe to Berkeley in California or from Canberra to Christchurch in Australia and New Zealand. That is why urban fire needs to stay on the agenda of catastrophe risk management: a wind-driven conflagration can blow deep into any timber city, and insurers need to manage their exposure concentrations.