
Integrating Catastrophe Models Under Solvency II

For natural catastrophe risk, ensuring capital adequacy and maintaining an effective risk management framework under Solvency II typically means using an internal model and integrating sophisticated nat cat models into the process. But what are the benefits of using an internal model, and how can integrated cat models help a (re)insurer assess cat risk under the new regulatory regime?

Internal Model Versus the Standard Formula

Under Pillar I of the Directive, insurers are required to calculate their Solvency Capital Requirement (SCR), calibrated to a 99.5% Value-at-Risk of basic own funds over a one-year horizon, which is used to demonstrate to supervisors, policyholders, and shareholders that they have an adequate level of financial resources to absorb significant losses.

Companies have a choice between using the Standard Formula or an internal model (or partial internal model) when calculating their SCR, and many favor internal models despite the significant resources and regulatory approval they require. Internal models are more risk-sensitive and can capture the true risk profile of a business more closely by taking into account risks that are not always appropriately covered by the Standard Formula, which can result in reduced capital requirements.
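To make the mechanics concrete, here is a minimal Python sketch, using entirely hypothetical numbers, of how the cat component of an internal-model SCR is read off cat model output: the 99.5% Value-at-Risk of the simulated annual aggregate loss distribution (the 1-in-200-year outcome), set against a simple factor-times-sum-insured charge for comparison. This is not any vendor's method or the actual Standard Formula calibration; it only illustrates why the two routes can give different answers.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical cat-model output: simulated annual aggregate losses (EUR m).
# In practice these would come from the model's year loss table.
n_years = 100_000
annual_losses = rng.lognormal(mean=2.5, sigma=1.2, size=n_years)

# Solvency II calibrates the SCR to a 99.5% VaR over one year,
# i.e. roughly a 1-in-200-year loss; here shown net of the mean loss.
var_99_5 = np.percentile(annual_losses, 99.5)
modelled_cat_scr = var_99_5 - annual_losses.mean()

# Illustrative factor-based charge (NOT the real Standard Formula calibration):
# a flat percentage applied to total sum insured in the exposed zone.
total_sum_insured = 5_000.0   # EUR m, hypothetical portfolio
factor = 0.012                # hypothetical catastrophe charge factor
factor_based_charge = factor * total_sum_insured

print(f"Modelled cat SCR (99.5% VaR less mean): {modelled_cat_scr:,.1f}m")
print(f"Illustrative factor-based charge:       {factor_based_charge:,.1f}m")
```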

Catastrophe Risk is a Key Driver for Capital Under Solvency II

Rising insured losses from natural catastrophes worldwide, driven by factors such as economic growth, increasing property values, rising population density, and growing insurance penetration (often in high-risk regions), demonstrate the value of embedding a cat model in the internal model process.

Because data granularity differs significantly between the Standard Formula and an internal model, the two approaches can produce very different solvency capital figures, with the cat component of the SCR often lower under an internal model.

The application of Solvency II is, however, not only about capital estimation; it also concerns effective risk management processes embedded throughout an organization. Implementing cat models fully into the internal model process, rather than relying only on cat model loss output, can bring significant improvements to risk management. Cat models provide an opportunity to improve exposure data quality and allow model users to fully understand the benefits of complex risk mitigation structures and diversification. By better reflecting a company's risk profile, they can reveal a company's potential exposure to cat risk and support better governance and strategic management decisions.
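One place where diversification becomes tangible in the numbers is correlation-based aggregation: under the Standard Formula, module-level capital charges are combined as the square root of the correlation-weighted sum of products, so the aggregate is smaller than the straight sum whenever correlations sit below one. The short Python sketch below illustrates the size of that diversification credit using made-up charges and a made-up correlation matrix; it is not the regulatory calibration.

```python
import numpy as np

# Hypothetical standalone capital charges per peril (EUR m) -- illustrative only.
scr = np.array([120.0, 80.0, 45.0])      # e.g. windstorm, flood, other
labels = ["windstorm", "flood", "other"]

# Hypothetical correlation matrix between the charges (not regulatory values).
corr = np.array([
    [1.00, 0.25, 0.25],
    [0.25, 1.00, 0.25],
    [0.25, 0.25, 1.00],
])

# Correlation-based aggregation: sqrt(s' C s), the form used to combine
# Standard Formula modules.
aggregate = float(np.sqrt(scr @ corr @ scr))
undiversified = scr.sum()

print(f"Sum of standalone charges: {undiversified:,.1f}m")
print(f"Aggregated charge:         {aggregate:,.1f}m")
print(f"Diversification benefit:   {undiversified - aggregate:,.1f}m")
```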

Managing Cat Risk Using Cat Models

A challenging aspect of bringing cat models in-house and integrating them into the internal model process is the selection of the "right" model and the "right" method to evaluate a company's cat exposure. Catastrophe model vendors are therefore obliged to help users understand underlying assumptions and their inherent uncertainties, and provide them with the means of justifying model selection and appropriateness.

Insurers have benefited from RMS support in fulfilling these requirements: RMS offers model users deep insight into the underlying data, assumptions, and model validation, so they have full confidence in model strengths and limitations. With this knowledge, insurers can understand, take ownership of, and implement their own view of risk, and then demonstrate it to make more informed strategic decisions, as required by the Own Risk and Solvency Assessment (ORSA), which lies at the heart of Solvency II.

The Cure for Catastrophe?

On August 24, 2016 – just a few weeks ago – an earthquake hit a remote area of the Apennine mountains of central Italy in the middle of the night. Fewer than 3000 people lived in the vicinity of the strongest shaking. But nearly 1 in 10 of those died when the buildings in which they were sleeping collapsed.

This disaster, like almost all disasters, was squarely man-made. Manufactured by what we build and where we build it; or in more subtle ways – by failing to anticipate what will one day inevitably happen.

Italy has some of the richest and best researched disaster history of any country, going back more than a thousand years. The band of earthquakes that runs through the Apennines is well mapped – pretty much this exact same earthquake happened in 1639. If you were identifying the highest risk locations in Italy, these villages would be on your shortlist. So in the year 2016, 300 people dying in a well-anticipated, moderate-sized earthquake, in a rich and highly-developed country, is no longer excusable.

Half the primary school in the town of Amatrice collapsed in the August 24th earthquake. Very fortunately, it being the middle of the night, no children were in class. Four years before, €700,000 had been spent to make the school "earthquake proof." An investigation is now underway to see why this proofing failed so spectacularly. If only Italy were as good at building disaster resilience as at mobilizing disaster response: some 7,000 emergency responders arrived after the earthquake – more than twice the number of people who lived in the affected villages.

The unnatural disaster

When we look back through history and investigate past disasters closely, we find that many other "natural disasters" were, in their different ways, also man-made.

The city of Saint-Pierre on the island of Martinique was once known as the "little Paris of the Caribbean." In 1900 it had a population of 26,000, with tree-lined streets of balconied two- and three-story houses. From the start of 1902 it was clear the neighbouring volcano of Mont Pelée was heading towards an eruption. The island's governor convened a panel of experts who concluded Saint-Pierre was at no risk because the valleys beneath the volcano would guide the products of any eruption directly into the sea. As the tremors increased, the Governor brought his family to Saint-Pierre to show the city was safe, and, like all but one of the city's inhabitants, died when the eruption blasted sideways out of the volcano. There are some parallels here with the story of the 20,000 people drowned in the 2011 Japanese tsunami, many of whom had assumed they would be protected by concrete tsunami walls and therefore did not try to escape while they still had time. We should distrust simple notions of where is safe when they rest only on some untested theory.

Sometimes the disaster reflects the unforeseen consequence of some manmade intervention. In spring 1965, the U.S. Army Corps of Engineers completed the construction of a broad shipping canal – known as the Mississippi River Gulf Outlet ("Mr Go") – linking New Orleans with the Gulf of Mexico. Within three months, a storm surge flood driven by the strong easterly winds ahead of Hurricane Betsy was funnelled up Mr Go into the heart of the city. Without Mr Go the city would not have flooded. Four decades later Hurricane Katrina performed the same trick on New Orleans, only this time the storm surge was three feet higher. The flooding was exacerbated when thin concrete walls lining drainage canals fell over without being overtopped. Channels meant for pumping water out of the city reversed their intended function and became the means by which the city was inundated.

These were fundamental engineering and policy failures, for which many vulnerable people paid the price.

RiskTech   

My new book, "The Cure for Catastrophe," challenges us to think differently about disasters. To understand how risk is generated before the disaster happens. To learn from countries like Holland, which over the centuries mastered their ever-threatening flood catastrophes by fostering a culture of disaster resilience.

Today we can harness powerful computer technology to help anticipate and reduce disasters. Catastrophe models, originally developed to price and manage insurance portfolios, are being converted into tools that model human casualties and impacts on livelihoods as well as monetary losses. Based on these metrics, we can identify where to focus our investments in disaster reduction.
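As a rough illustration of how such metrics can steer investment, the hypothetical Python sketch below takes a toy event table (annual rate and estimated casualties per event, by district) and ranks districts by average annual casualties: the same arithmetic a catastrophe model applies to monetary loss, repointed at people. The district names and numbers are invented.

```python
# Toy "event loss table" with casualties in place of monetary loss.
# Each entry: (district, annual event rate, estimated casualties if the event occurs)
events = [
    ("district_a", 0.005, 2_000),   # rare but deadly
    ("district_a", 0.050, 50),
    ("district_b", 0.010, 400),
    ("district_c", 0.002, 5_000),
    ("district_c", 0.100, 10),
]

# Average annual casualties per district = sum(rate * consequence).
aac = {}
for district, rate, casualties in events:
    aac[district] = aac.get(district, 0.0) + rate * casualties

# Rank districts to prioritise retrofit / resilience spending.
for district, value in sorted(aac.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{district}: {value:.1f} expected casualties per year")
```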

In 2015 the Tokyo City government was the first to announce it aims to halve its earthquake casualties and measure progress by using the results of a catastrophe model. The frontline towns of Italy should likewise have their risks modeled and independently audited, so that we can see if they are making progress in saving future lives before they suffer their next inevitable earthquake.

 

The Cure for Catastrophe is published by Oneworld (UK) and Basic Books (US)

Opportunities and Challenges ahead for Vietnam: Lessons Learned from Thailand

Earlier this month I gave a presentation at the 13th Asia Insurance Review conference in Ho Chi Minh City, Vietnam. It was a very worthwhile event that gave good insights into this young insurance market, and it was great to be in Ho Chi Minh City—a place that immediately captured me with its charm.


Bangkok, Thailand during the 2011 floods. Photo by Petty Officer 1st Class Jennifer Villalovos.

Vietnam shares striking similarities with Thailand, both from a peril and an exposure perspective. And, for Vietnam to become more resilient, it could make sense to learn from Thailand's recent natural catastrophe (NatCat) experiences and understand why some of those events were particularly painful in the absence of good exposure data.

NatCat and Exposure similarities between Thailand and Vietnam 

Flood profile: Vietnam shows a similar flood profile to Thailand, with significant flooding every year. Vietnam's Mekong Delta, responsible for half of the country's rice production, is especially susceptible to flooding.
Coastline: Both coastlines are similar in length[1] and are similarly exposed to storm surge and tsunami.[2]
Tsunami and tourism: Thailand and its tourism industry were severely affected by the 2004 Indian Ocean tsunami. Vietnam's coastline and its tourism hotspots (e.g., Da Nang) show similar exposure to tsunami, potentially originating from the Manila Arc.[2]
GDP growth: Thailand's rapid GDP growth and the accompanying exposure growth in the decade prior to the 2011 floods caught many by surprise. Vietnam has been growing even faster over the last ten years,[3] and exposure data quality (completeness and accuracy) has not necessarily kept up with this development.
Industrialization and global supply chain relevance: Many underestimated the significance of Thailand in the global supply chain; for example, in 2011 about a quarter of all hard disk drives were produced in Thailand. Vietnam is currently undergoing the same rapid industrialization. For example, Samsung opened yet another multi-billion-dollar industrial facility in Vietnam, propelling the country to the forefront of mobile phone production and increasing its significance to the global supply chain.

Implications for the Insurance Industry

In light of these similarities and the strong impact that global warming will have on Vietnam[4], regulators and (re)insurers are now facing several challenges and opportunities:

Modeling of perils and the technical writing of business need to be at the forefront of every executive's mind for any mid- to long-term business plan. While this is not something that can be implemented overnight, the first steps have been taken, and it's just a matter of time before the market gets there.

But to get there as quickly and efficiently as possible, another crucial stepping stone must be put in place: improving exposure data quality in Vietnam. Better exposure insights in Thailand would almost certainly have led to a better understanding of exposure accumulations and could have made a significant difference after the floods, resulting in less financial and reputational damage to many (re)insurers.
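A minimal sketch of what "understanding exposure accumulations" means in practice: aggregate total sums insured by zone and flag the concentrations. The zones, values, and tolerance below are invented for illustration; a real accumulation analysis would work on geocoded, location-level exposure data.

```python
from collections import defaultdict

# Hypothetical location-level exposure records:
# (province/zone, line of business, sum insured in USD m)
locations = [
    ("Ho Chi Minh City", "industrial", 350.0),
    ("Ho Chi Minh City", "commercial", 120.0),
    ("Da Nang",          "commercial",  80.0),
    ("Bac Ninh",         "industrial", 500.0),   # e.g. an electronics manufacturing cluster
    ("Mekong Delta",     "agriculture", 60.0),
]

# Accumulate total insured value per zone.
accumulation = defaultdict(float)
for zone, _, tiv in locations:
    accumulation[zone] += tiv

# Flag zones whose accumulation exceeds a chosen tolerance.
tolerance = 300.0  # USD m, hypothetical risk appetite per zone
for zone, tiv in sorted(accumulation.items(), key=lambda kv: kv[1], reverse=True):
    flag = "  <-- exceeds tolerance" if tiv > tolerance else ""
    print(f"{zone}: {tiv:,.0f}m{flag}")
```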

As insurance veterans know, it's not a question of if a large-scale NatCat event will happen in Vietnam, but a question of when. And while it's not possible to fully eliminate the element of surprise in NatCat events, the severity of these surprises can be reduced by having better exposure data and exposure management in place.

This is where the real opportunity and challenge lies for Vietnam: getting better exposure insights to be able to mitigate risks. Ultimately, any (re)insurer wants to be in a confident position when someone poses this question: “Do you understand your exposures in Vietnam?”

RMS recognizes the importance of improving the quality and management of exposure data: Over the past twelve months, RMS has released exposure data sets for Vietnam and many other territories in the Asia-Pacific. To find out more about the RMS® Asia Exposure data sets, please e-mail asia-exposure@rms.com.  

[1] Source: https://en.wikipedia.org/wiki/List_of_countries_by_length_of_coastline
[2] Please refer to the RMS® Global Tsunami Scenario Catalog and the RMS® report on Coastlines at Risk of Giant Earthquakes & Their Mega-Tsunami, 2015
[3] The World Bank: http://data.worldbank.org/country/vietnam, last accessed: 1 July 2015
[4] Vietnam ranks among the five countries to be most affected by global warming, World Bank Country Profile 2011: http://sdwebx.worldbank.org/climateportalb/doc/GFDRRCountryProfiles/wb_gfdrr_climate_change_country_profile_for_VNM.pdf

Lessons Hidden In A Quiet Windstorm Season

Wind gusts in excess of 100 mph hit remote parts of Scotland earlier this month as a strong jet stream brought windstorms Elon and Felix to Europe. The storms are some of the strongest so far this winter; however, widespread severe damage is not expected because the winds struck mainly remote areas.

These storms are characteristic of what has largely been an unspectacular 2014/15 Europe windstorm season. In fact the most chaotic thing to cross the North Atlantic this winter and impact our shores has probably been the Black Friday sales.

This absence of a significantly damaging windstorm in Europe follows what was an active winter in 2013/14, albeit one with no individual standout events. More detail on the characteristics of that season is outlined in RMS' 2013-2014 Winter Storms in Europe report.

There’s a temptation to say there is nothing to learn from this year’s winter storm season. Look closer, however, and there are lessons that can help the industry prepare for more extreme seasons.

What have we learnt?

This season was unusual in that a series of wind, flood, and surge events accumulated to drive losses. This contrasts with previous seasons, when losses were generally dominated by a single peril: either a knockout windstorm or inland flood.

This combination of loss drivers poses a challenge for the (re)insurance industry, as it can be difficult to break out the source of claims and distinguish wind from flood losses, which can complicate claim payments, particularly if flood is excluded or sub-limited.

The clustering of heavy rainfall that led to persistent flooding put a focus on the terms and conditions of reinsurance contracts, in particular the hours clause: the time period over which losses can be counted as a single event.
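To make the hours clause concrete, the Python sketch below groups a toy series of timestamped claims into reinsurance "events" using a fixed window (72 consecutive hours is a common windstorm convention, though the actual period is set by the contract wording) and sums the loss per event. Everything here is hypothetical and simplifies real wordings, which typically let the reinsured choose when each window starts.

```python
from datetime import datetime, timedelta

# Toy claim records: (timestamp, loss in EUR m) from a run of wind/flood activity.
claims = [
    (datetime(2015, 1, 9, 6),  12.0),
    (datetime(2015, 1, 10, 2),  8.0),
    (datetime(2015, 1, 12, 20), 3.0),
    (datetime(2015, 1, 20, 14), 9.0),
    (datetime(2015, 1, 22, 5),  4.0),
]

HOURS_CLAUSE = timedelta(hours=72)  # common windstorm convention; the treaty wording governs

# Simplified grouping: open a new "event" when a claim falls outside the current window.
events, current, window_start = [], [], None
for ts, loss in sorted(claims):
    if window_start is None or ts - window_start > HOURS_CLAUSE:
        if current:
            events.append(current)
        current, window_start = [], ts
    current.append((ts, loss))
if current:
    events.append(current)

for i, event in enumerate(events, start=1):
    total = sum(loss for _, loss in event)
    print(f"Event {i}: {len(event)} claims, total loss {total:.1f}m")
```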

The season also brought home the challenges of understanding loss correlation across perils, as well as the need for high-resolution inland flood modeling tools. (Re)insurers need to understand flood risk consistently and at high resolution across Europe, including loss correlation across river basins and the impact of flood-specific financial terms, such as the hours clause.

Unremarkable as it was, the season has highlighted many challenges that the industry needs to be able to evaluate before the next “extreme” season comes our way.