Integrating Catastrophe Models Under Solvency II

In terms of natural catastrophe risk, ensuring capital adequacy and maintaining an effective risk management framework under Solvency II requires the use of an internal model and the integration of sophisticated nat cat models into the process. But what are the benefits of using an internal model, and how can integrated cat models help a (re)insurer assess cat risk under the new regulatory regime?

Internal Model Versus the Standard Formula

Under Pillar I of the Directive, insurers are required to calculate their Solvency Capital Requirement (SCR), which is used to demonstrate to supervisors, policyholders, and shareholders that they have an adequate level of financial resources to absorb significant losses.

Companies have a choice between using the Standard Formula or an internal model (or partial internal model) when calculating their SCR. Many favor internal models, despite the significant resources and regulatory approval they require: internal models are more risk-sensitive and can more closely capture the true risk profile of a business by accounting for risks that are not always appropriately covered by the Standard Formula, often resulting in reduced capital requirements.

Catastrophe Risk is a Key Driver for Capital Under Solvency II

Rising insured losses from global natural catastrophes – driven by factors such as economic growth, increasing property values, rising population density, and growing insurance penetration, often in high-risk regions – all demonstrate the value of embedding a cat model into the internal model process.

Because of the significant differences in data granularity between the Standard Formula and an internal model, the two approaches can produce markedly different solvency capital figures, with the cat component of the SCR often lower under an internal model.
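To make the capital calculation concrete: Solvency II calibrates the SCR to the 99.5% Value-at-Risk of the one-year loss distribution, i.e. a 1-in-200-year loss. A minimal sketch, using invented lognormal losses purely for illustration (a real internal model would draw from a cat model's event set):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical simulated annual catastrophe losses (in millions).
# A heavy-tailed lognormal stands in for real cat model output.
annual_losses = rng.lognormal(mean=2.0, sigma=1.2, size=100_000)

# Solvency II calibrates the SCR to the 99.5% one-year VaR,
# i.e. the 1-in-200-year loss.
scr_cat = np.quantile(annual_losses, 0.995)

print(f"1-in-200-year cat loss: {scr_cat:.1f}m")
```

The same quantile computed from a Standard Formula factor-based approach and from granular modeled losses can differ substantially, which is the point the paragraph above makes.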

The application of Solvency II is, however, not only about capital estimation; it also concerns effective risk management processes embedded throughout an organization. Implementing cat models fully into the internal model process, rather than relying solely on cat model loss output, can bring significant improvements to risk management. Cat models provide an opportunity to improve exposure data quality and allow model users to fully understand the benefits of complex risk mitigation structures and diversification. By better reflecting a company’s risk profile, they can reveal a company’s potential exposure to cat risk and support better governance and strategic management decisions.

Managing Cat Risk Using Cat Models

A challenging aspect of bringing cat models in-house and integrating them into the internal model process is selecting the “right” model and the “right” method to evaluate a company’s cat exposure. Catastrophe model vendors are therefore obliged to help users understand underlying assumptions and their inherent uncertainties, and to provide the means of justifying model selection and appropriateness.

Insurers have benefited from RMS support in fulfilling these requirements, which offers model users deep insight into the underlying data, assumptions, and model validation, ensuring they have complete confidence in model strengths and limitations. With the knowledge that RMS provides, insurers can understand, take ownership of, and implement their own view of risk, and then demonstrate it to make more informed strategic decisions as required by the Own Risk and Solvency Assessment (ORSA), which lies at the heart of Solvency II.

The Cure for Catastrophe?

On August 24, 2016 – just a few weeks ago – an earthquake hit a remote area of the Apennine mountains of central Italy in the middle of the night. Fewer than 3000 people lived in the vicinity of the strongest shaking. But nearly 1 in 10 of those died when the buildings in which they were sleeping collapsed.

This disaster, like almost all disasters, was squarely man-made. Manufactured by what we build and where we build it; or in more subtle ways – by failing to anticipate what will one day inevitably happen.

Italy has some of the richest and best researched disaster history of any country, going back more than a thousand years. The band of earthquakes that runs through the Apennines is well mapped – pretty much this exact same earthquake happened in 1639. If you were identifying the highest risk locations in Italy, these villages would be on your shortlist. So in the year 2016, 300 people dying in a well-anticipated, moderate-sized earthquake, in a rich and highly-developed country, is no longer excusable.

Half the primary school in the town of Amatrice collapsed in the August 24th earthquake. Very fortunately, it being the middle of the night, no children were in class. Four years before, €700,000 had been spent to make the school “earthquake proof.” An investigation is now underway to see why this proofing failed so spectacularly. If only Italy were as good at building disaster resilience as at mobilizing disaster response: some 7000 emergency responders arrived after the earthquake – more than twice the number of people living in the affected villages.

The unnatural disaster

When we look back through history and investigate past disasters closely, we find that many other “natural disasters” were, in their different ways, also man-made.

The city of Saint-Pierre on the island of Martinique was once known as the “little Paris of the Caribbean.” In 1900 it had a population of 26,000, with tree-lined streets of balconied two- and three-story houses. From the start of 1902 it was clear the neighboring volcano of Mont Pelée was heading towards an eruption. The island’s governor convened a panel of experts who concluded Saint-Pierre was at no risk because the valleys beneath the volcano would guide the products of any eruption directly into the sea. As the tremors increased, the Governor brought his family to Saint-Pierre to show the city was safe; reassured, the inhabitants stayed, and all but one of them died when the eruption blasted sideways out of the volcano. There are some parallels here with the story of those 20,000 people drowned in the 2011 Japanese tsunami, many of whom had assumed they would be protected by concrete tsunami walls and therefore did not bother to escape while they still had time. We should distrust simple notions of where is safe, based only on some untested theory.

Sometimes the disaster reflects the unforeseen consequence of some man-made intervention. In spring 1965, the U.S. Army Corps of Engineers completed the construction of a broad shipping canal – known as the Mississippi River Gulf Outlet (“Mr Go”) – linking New Orleans with the Gulf of Mexico. Within three months, a storm surge flood driven by the strong easterly winds ahead of Hurricane Betsy was funneled up Mr Go into the heart of the city. Without Mr Go the city would not have flooded. Four decades later Hurricane Katrina performed this same trick on New Orleans again, only this time the storm surge was three feet higher. The flooding was exacerbated when thin concrete walls lining drainage canals fell over without being overtopped. Channels meant for pumping water out of the city reversed their intended function and became the means by which the city was inundated.

These were fundamental engineering and policy failures, for which many vulnerable people paid the price.

RiskTech

My new book, “The Cure for Catastrophe,” challenges us to think differently about disasters. To understand how risk is generated before the disaster happens. To learn from countries, like Holland, which over the centuries mastered their ever-threatening flood catastrophes, through fostering a culture of disaster resilience.

Today we can harness powerful computer technology to help anticipate and reduce disasters. Catastrophe models, originally developed to price and manage insurance portfolios, are being converted into tools to model metrics on human casualties or livelihoods as well as monetary losses. And based on these measurements we can identify where to focus our investments in disaster reduction.

In 2015 the Tokyo City government was the first to announce it aims to halve its earthquake casualties and measure progress by using the results of a catastrophe model. The frontline towns of Italy should likewise have their risks modeled and independently audited, so that we can see if they are making progress in saving future lives before they suffer their next inevitable earthquake.


The Cure for Catastrophe is published by Oneworld (UK) and Basic Books (US)

The Rise and Stall of Terrorism Insurance

In the 15 years since the terrorist attacks of September 11, 2001, partnerships between the public sector and private industries have yielded more effective security and better public awareness about the threat of terrorism. We may never come to terms with the sheer scale of human loss from that day, or from the hundreds of attacks that continue every year. But we have achieved greater resilience in the face of the ongoing realities of terrorism – except when it comes to recovering from the catastrophic costs of rebuilding in its aftermath.

Terrorism insurance is facing a structural crisis: hundreds of terrorist attacks occur annually, but actual insurance payouts have been negligible. The economic costs of terrorism have skyrocketed, but demand for terrorism coverage has remained relatively flat. And despite a proliferation of catastrophe bonds and other forms of alternative capital flooding into the property insurance market, relatively little terrorism risk has been transferred to the capital markets. If terrorism insurance – and the insurers who provide it – are to remain relevant, they must embrace the new tools and data available to them to create more relevant products, more innovative coverages, and new risk transfer mechanisms that address today’s threat landscape.

The September 11, 2001 attacks rank among the largest insurance losses in history at $44 billion, putting them alongside catastrophes with severe losses such as Hurricane Katrina ($70 billion), the Tohoku earthquake and tsunami ($38 billion), and Hurricane Andrew ($25 billion).

But unlike natural catastrophes, whose damages span hundreds of kilometers, most of the 9/11 damages in New York were concentrated in an area of just 16 acres. Such extreme concentration of loss caused a crisis in the insurance marketplace and highlighted the difficulty of insuring against such a peril.

Following the September 11 attacks, most insurers excluded terrorism from their policies, forcing the U.S. government to step in and provide a backstop through the Terrorism Risk Insurance Act (2002). Terrorism insurance has since become cost effective as insurer capacity for terrorism risk has increased. Today an estimated 40 insurers provide it on a stand-alone basis, and many others bundle it with standard property insurance contracts.

But despite better data on threat groups, more sophisticated terrorism modeling tools, and increased transparency into the counter-terrorism environment, terrorism insurance hasn’t changed all that much in the past 15 years. The contractual coverage is the same – usually distinguishing between conventional and CBRN (chemical, biological, radiological, and nuclear) attacks. And terrorism insurance take-up remains minimal where attacks occur most frequently, in the Middle East and Africa, highlighting what policymakers refer to as an increasing “protection gap.”

Closing this gap – through new products, coverages, and risk transfer schemes – will enable greater resilience following an attack and promote a more comprehensive understanding of the global terrorism risk landscape.

What can you learn from Exposure?

Many RMS team members are active bloggers and speakers, and contribute to articles, but to capture even more of our expertise and insight, and to reflect the breadth of our activities, we have created a new magazine called Exposure, which is ready for you to download.

The aim of Exposure magazine is to bring together topics of special interest to catastrophe and risk management professionals, and to recognize the vast ground that individuals involved in risk management must cover today. One theme, we believe, unites the risk community: the belief that “risk is opportunity.” The articles within Exposure reflect a market seeking to avoid surprises, improve business performance, and innovate to create new opportunities for growth.

In the foreword to Exposure, Hemant Shah, CEO of RMS, also reflects on an “inflection point” in the industry – a mix of globalization, changing market structures, technology, and data and analytics – offering a chance for the industry to innovate and increase its relevance. Exposure features a mix of articles examining perils and regions, industry issues, and what’s coming up for our industry.

Within perils and regions, Exposure looks at opportunities for U.S. and European flood, the effect extra-tropical transitioning has on typhoons in Japan, and the impact of secondary hazards such as liquefaction, as seen in the earthquake sequence that hit the low-seismicity area of Canterbury, New Zealand in 2010 and 2011.

The magazine also tackles issues around Solvency II, the emergence of the “grey swan” event, why reinsurers are opting to buy or build their own insurance-linked securities fund management capabilities, and wraps up with Robert Muir-Wood, chief research officer for RMS, explaining how insurers can help drive the resilience analytics revolution.

Please download your copy now.

Fire Weather

Fires can start at all times and places, but how a fire spreads is principally down to the weather.

This week, 350 years ago, the fire at Thomas Farriner’s bakery on Pudding Lane, a small alleyway running down to the river from the City of London, broke out at the quietest time of the week, around 1am on Sunday morning September 2, 1666. London had been experiencing a drought and the thatched roofs of the houses were tinder dry. At 4 am the Lord Mayor, roused from his sleep, decided the blaze was easily manageable. It was already too late, however. By 7am the roofs of some 300 houses were burning and fanned by strong easterly winds the fire was spreading fast towards the west. Within three days the fire had consumed 13,000 houses and left 70,000 homeless.

In the city’s reconstruction only brick and tiles houses were permitted, severely reducing the potential for repeat conflagrations. Within a few years there were the first fire insurers, growing their business as fear outran the risk.

Yet big city fires had by no means gone away and the wooden cities of northern Europe were primed to burn. The 1728 Copenhagen fire destroyed 28% of the city while the 1795 fire left 6000 homeless. A quarter of the city of Helsinki burned down in November 1808. The 1842 fire that destroyed Hamburg left 20,000 homeless. The center of the city of Bergen Norway burnt down in 1855 and then again in January 1916.

Wind and fire

By the start of the 20th century, improvements in fire-fighting had reduced the chance that a great city fire would take hold – but not when there were strong winds. The 1916 Bergen, Norway fire broke out in the middle of an intense windstorm with hurricane-force gusts. In February 1941 the fire that burnt out the historic center of Santander on the coast of northern Spain was driven by an intense windstorm, equivalent to the October 1987 storm in the U.K. And then there is the firestorm that destroyed Yokohama and Tokyo after the 1923 earthquake, driven by 50 miles per hour winds on the outer edge of a typhoon, in which, over a few hours, an estimated 140,000 died.

Wind and fire in the wooden city are a deadly combination. Above a certain wind speed, the fire becomes an uncontrollable firestorm. The 1991 Oakland Hills fire flared up late one Sunday morning and then surged out of the mountains into the city, driven by hot, dry 60 miles per hour Diablo Winds from the east, jumping an eight-lane highway and overwhelming the ability of the fire crews to hold the line, until the wind eventually turned and the fire blew back over its own embers. The fire consumed 2800 houses, spreading so fast that 25 died. On February 7, 2009, a strong northwesterly wind drew baking air out of Australia’s interior and fires took off across the state of Victoria. Fallen power cables sparked a fire whose embers, blown by 60 miles per hour winds, flashed from one woodland to another, overwhelming several small towns so fast that 173 died before they could escape.

Most recently we have seen firestorms in Canada. Again, there is nothing new about the phenomenon: the Matheson fires in 1919 destroyed 49 Ontario towns and killed 244 people along a fire front that extended 60km wide. It was a firestorm fanned by gale-force winds that destroyed one third of the city of Slave Lake, Alberta, in 2011, and it is fortunate that the roads were broad and straight enough to allow people to escape the fires that raged into Fort McMurray in summer 2016.

There is no remedy for a firestorm blown on gale-force winds. And wooden properties close to drought-ridden forests are at very high risk – from South Lake Tahoe to Berkeley in California, and from Canberra to Christchurch. Which is why urban fire needs to stay on the agenda of catastrophe risk management. A wind-driven conflagration can blow deep into any timber city, and insurers need to manage their exposure concentrations.

Launching a New Journal for Terrorism and Cyber Insurance

Natural hazard science is commonly studied at college, and to some level in the insurance industry’s further education and training courses. But this is not the case with terrorism risk. Even if insurance professionals learn about terrorism in the course of their daily business, as they move into other positions, their successors may begin with hardly any technical familiarity with terrorism risk. It is not surprising therefore that, even fifteen years after 9/11, knowledge and understanding of terrorism insurance risk modeling across the industry is still relatively low.

There is no shortage of literature on terrorism, but much has a qualitative geopolitical and international relations focus, and little is directly relevant to terrorism insurance underwriting or risk management.

As a step towards redressing the imbalance in available terrorism literature, a new online journal, The Journal of Terrorism and Cyber Insurance, has been established; its launch is to coincide with the fifteenth anniversary of 9/11. The journal has been welcomed and supported by global terrorism insurance pools, and its launch will be publicized at the annual terrorism pools congress in Canberra, Australia, on October 7, 2016.

Originally conceived as a journal of terrorism insurance, coverage has been extended to include cyber risk, recognizing the increasing insurance industry concerns over cyber terrorism and the burgeoning insurance market in cyber risk. The aim of the open access journal is to raise the industry’s level of knowledge and understanding of terrorism risk. By increasing information transparency for this subject the editorial board hopes to facilitate the growth of the terrorism insurance market, which serves the risk management requirements of the wider international community. The first issue is a solid step in this direction, and will include articles on the ISIS attacks in Paris in November 2015; terrorism insurance in France and Australia; parametric terrorism insurance triggers; non-conventional threats; the clean-up costs of anthrax, and the terrorist use of drones.

The four founding editors of the journal have extensive knowledge of the field. The managing editor is Rachel Anne Carter, who has terrorism insurance administrative experience with both OECD and U.K. Pool Re. Dr. Raveem Ismail, specialty terrorism underwriter at Ariel Re, brings to the editorial board detailed direct terrorism and political risk underwriting knowledge. Padraig Belton is a writer with extensive political risk expertise, having served as a correspondent in the Middle East and Pakistan. As chief architect of the RMS terrorism model, I will bring terrorism risk modeling expertise to the team and have the responsibility and pleasure to review all article submissions. I look forward to sharing insights from the journal with subscribers to this blog.

No More Guessing Games for Marine Insurers

Huge ports mean huge amounts of cargo. Huge amounts of cargo mean huge accumulations of risk.

As a guiding principle about where marine insurers are exposed to the highest potential losses, it seems reasonable enough. But in fact, as RMS research has proven this week, this proposition may be a bit misleading. Surprisingly, a port’s size and its catastrophe loss potential are not strongly correlated.

Take the Port of Plaquemines, LA, just southeast of New Orleans. It is neither well known nor big in comparison with other ports around the world. Yet it carries the third-highest catastrophe loss potential of any port in the world: our analysis revealed its 500-year marine cargo loss from hurricane would be $1.5 billion.

Plaquemines is not an isolated case. Other smaller ports feature in our ranking: Pascagoula, MS in the United States ranks sixth with a potential $1 billion marine cargo loss due to storm surge and hurricane, Bremerhaven in Germany ranks fourth at $1 billion, and Le Havre in France ranks tenth at $0.7 billion.

Asia-Pacific ports featured less frequently, but worryingly one Asian port topped the list: Nagoya, Japan was number one ($2.3 billion potential loss), with Guangzhou, China a close second ($2 billion). Our analysis modeled the risk posed by earthquake, wind, and storm surge perils at a 500-year return period across 150 ports – the top ten results are shown below.

Ports at Risk for Highest Loss
(500 year estimated catastrophe loss for earthquake, wind, and storm surge perils)

Rank  Port                      Estimated Marine Cargo Loss (USD billions)
1     Nagoya, Japan             2.3
2     Guangzhou, China          2.0
3     Plaquemines, LA, U.S.     1.5
4     Bremerhaven, Germany      1.0
5     New Orleans, LA, U.S.     1.0
6     Pascagoula, MS, U.S.      1.0
7     Beaumont, TX, U.S.        0.9
8     Baton Rouge, LA, U.S.     0.8
9     Houston, TX, U.S.         0.8
10    Le Havre, France          0.7

* Losses rounded to one decimal place.

Our analysis demonstrates what we at RMS have long suspected: outdated marine risk modeling tools and incomplete data obscure many high-risk locations, big and small. These ports are risky because of the natural perils they face and the cargos which transit through them, as well as the precise way those cargos are stored. But many in the marine sector don’t have these comprehensive insights. Instead they have to make do with a guessing game in determining catastrophe risk and port accumulations. And with the advanced analytics available in 2016 this is no longer good enough.

Big Port or Small – Risk Can Now Be Determined

Back to that seemingly simple proposition about the relationship between port size and insurance risk which I began this blog with. As the table above demonstrates, smaller ports can also present a huge risk.

But the bigger ships and bigger ports brought about by containerization have led, overall, to a bigger risk exposure for marine insurers. Not least because larger vessels have rendered many river ports inaccessible forcing shippers to rely on seaside ports, which are more vulnerable to hurricanes, typhoons, and storm surge.

The value of global catastrophe-exposed cargo is already huge and is likely to keep growing. But the right tools, which use the most precise data, can reveal where the risk of insurance loss is greatest. Leveraging these tools, (re)insurers can avoid dangerous cargo accumulations and underwrite with greater confidence.

Which means that, at last, the guessing game can stop.

Our ranking of high risk ports used the new RMS Marine Cargo Model™, with geospatial analysis of thousands of square kilometers of satellite imagery across ports in 43 countries. RMS’ exposure development team used a proprietary technique for allocating risk exposure across large, complex terminals to assess the ports’ exposure and highlight the risk of port aggregations. The model took into account:

  • Cargo type (e.g. autos, bulk grains, electronics, specie)
  • Precise storage location (e.g. coastal, estuarine, waterside or within dock complex)
  • Storage type (e.g. open air, warehouse, container — stacked or ground level)
  • Dwell time (which can vary due to port automation, labor relations and import/export ratios)
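The accumulation problem described above can be sketched in a few lines. This is a minimal illustration with invented port names and cargo values, not the RMS methodology: it simply sums insured cargo value per port and flags any port whose total breaches a hypothetical risk appetite limit.

```python
from collections import defaultdict

# Hypothetical cargo exposure records: (port, storage type, value in USD millions).
exposures = [
    ("Nagoya", "container_stacked", 420.0),
    ("Nagoya", "warehouse", 310.0),
    ("Plaquemines", "open_air", 260.0),
    ("Plaquemines", "warehouse", 120.0),
    ("Le Havre", "container_ground", 180.0),
]

# Accumulate total insured cargo value per port.
accumulation = defaultdict(float)
for port, storage, value in exposures:
    accumulation[port] += value

# Flag ports whose accumulation breaches a (hypothetical) appetite limit.
LIMIT = 300.0
flagged = {port: total for port, total in accumulation.items() if total > LIMIT}
print(flagged)  # Nagoya and Plaquemines exceed the limit
```

A real accumulation engine would key on precise storage locations and dwell times rather than port name alone, which is exactly why the factors in the list above matter.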

Insurance-Linked Securities in Asia – Looking Out for the Tipping Point

We were at a conference in Singapore, pushing to develop a market that doesn’t yet really exist. Grounds, you might think, for frustration.

And yet my RMS capital markets colleague, Jin Shah, and I were upbeat and, in truth, a little excited.

So often we end up at ILS conferences talking to the same audiences about the same topics. But this was different. The inaugural ILS Asia Conference organized by Artemis.bm, the de facto bulletin-board for the ILS industry, had 170 industry experts and practitioners from the region gathered in the Raffles Hotel ballroom.

The aim of the event was to demonstrate the ILS industry’s commitment to building a global footprint and developing expertise in the asset class among Asia’s investors and reinsurers. This conference was exciting because we can see the Asia insurance industry will approach a tipping point in the next decade or so, resulting in increased appetite for Asian ILS instruments on both sides. Let me explain how.

An Insurance Market Which Has Not Yet Matured

Currently, in many Asian countries the insurance market is still developing, and the concept of insurance as a social and economic “good” is not yet culturally normalized. In addition, mandatory insurance outside of auto/motor is, in some places, almost non-existent, with individuals looking instinctively to family and other social networks to provide a financial safety net.

Because of these factors, combined with generally lower levels of disposable income, property insurance penetration in particular is comparatively low in Asia. The region therefore contributes only a small amount to reinsurers’ portfolios and capital loads, so reinsurers don’t yet need to transfer that risk to the capital markets as they do in core, concentrated regions such as the U.S., Japan, and Europe. The economics of ILS in Asia are challenging to say the least and, in some cases, make fully collateralized products “non-starters” from a competitive point of view.

Growing Populations and Changing Demographics

But that’s the current environment. The future growth of the middle classes, particularly in China and India, will fuel increasing demand for all forms of insurance as more people choose to protect their assets against damage and loss. Given the sheer size of these populations and their rates of growth, it is not inconceivable that within ten years these markets could represent a similar level of risk concentration to (re)insurers as the U.S., Europe, or Japan.

And that’s the tipping point.

In certain Asian countries, the ILS sector is already developed. For a number of years, Australian insurers have been tapping the capital markets as a strategic element of their outwards protection. Japanese risk has been a core part of the risk available in both the cat bond and collateralized re markets. Outside of these more mature markets, last year China Re issued their Panda Re cat bond which, whilst only a $50 million dip-of-a-toe in the water, showed that ILS funds were keen to accept China risk and paved the way for larger issuances in the future.

And with the social, demographic, and economic changes of the years ahead, Asia will provide a natural hunting ground for ILS funds keen to leverage their broad and diversified capital base to support the local insurance market’s continued growth.

Sensing this future tipping point too, the Artemis conference was attended by more than 25 industry stalwarts who’d travelled from London, Bermuda, New York, San Francisco, Japan, and Australia to bring the conversation to new audiences. ILS investors are clearly looking to this region to diversify their own portfolios, both as a risk management measure and with an eye to the rapid growth occurring in the region – and the opportunities it presents.

Searching for Clues After the Ecuador Earthquake

Reconnaissance work is built into the earthquake modeler’s job description – the backpack is always packed and ready. Large earthquakes are thankfully infrequent, but when they do occur, there is much to be learned from studying their impact, and this knowledge helps to improve risk models.

An RMS reconnaissance team recently visited Ecuador. Close to 7pm local time on April 16, 2016, an Mw7.8 earthquake struck between the small towns of Muisne and Pedernales on the northwestern coast of Ecuador. Two smaller earthquakes, an Mw5.8 and an Mw6.2, have since struck the area on July 11, 2016, fortunately without significant damage.

April’s earthquake was the strongest recorded in the country since 1979 and, at the time of writing, the strongest earthquake experienced globally so far in 2016. The earthquake caused more than 650 fatalities, more than 17,600 injuries, and damage to more than 10,000 buildings.

Two weeks after the earthquake, an RMS reconnaissance team of engineers started their work, visiting five cities across the affected region: Guayaquil, Manta, Bahía de Caráquez, Pedernales, and Portoviejo. Pedernales was the most affected, experiencing the highest damage levels due to its proximity to the epicenter, approximately 40km to the north of the city.

Sharing a Common Vulnerability

The majority of buildings in the affected region were constructed using the same structural system: reinforced concrete (RC) frames with unreinforced masonry (URM) infill. This type of structural system relies on RC beams and columns to resist earthquake shaking, with the walls filled in with unreinforced masonry blocks. The system was common across occupancies, from residential, industrial, and commercial properties to hospitals, office buildings, government buildings, and high-rise condominiums.

URM infill is particularly susceptible to damage during earthquakes, and for this reason it is prohibited by many countries with high seismic hazard. But even though Ecuador’s building code was updated in 2015, URM infill walls are still permitted in construction, and are even used in high-end residential and commercial properties.

Without reinforcing steel or adequate connection to the surrounding frame, the URM often cracks and crumbles during strong earthquake shaking. In some cases, damaged URM on the exterior of buildings falls outward, posing safety risks to people below. And for URM that falls inward, besides posing a safety risk, it often causes damage to interior finishes, mechanical equipment, and contents.

Across the five cities, observed shaking ranged from Modified Mercalli Intensity (MMI) 7.0 to 9.0. At MMI 7.0, the damage equated to light to moderate damage of URM infill walls, and mostly minimal damage to RC frames with isolated instances of moderate-to-heavy damage or collapse. MMI 9.0, which, based on RMS observations, occurred in limited areas, meant moderate to heavy damage of URM infill walls and slight to severe damage or collapse of RC frames.

While failure of URM infill was the most common damage pattern observed, there were instances of partial and even complete structural collapse. Collapse was often caused, at least in part, by poor construction materials and by building configurations, such as vertical irregularities, that concentrated damage in particular areas of a building.

Disruption to Business and Public Services

The RMS team also examined disruption to business and public services caused by the earthquake. A school in Portoviejo will likely be out of service for more than six months, and a police station in Pedernales will likely require more than a year of repair work. The disruption observed by the RMS team was principally due to direct damage to buildings and contents. However, there was some disruption to lifeline utilities such as electricity and water in the affected region, and this undoubtedly impacted some businesses.

RMS engineers also visited four public hospitals and clinics, with damage ranging from light to complete collapse. The entire second floor of a clinic in Portoviejo collapsed. A staff doctor told RMS that the floor was empty at the time and all occupants, including patients, evacuated safely.

Tourism was disrupted, with a few hotels experiencing partial or complete collapse. In some cases, even lightly damaged and unaffected hotels were closed as they were within cordoned-off zones in Manta or Portoviejo.

Tuna is an important export product for Ecuador. Two processing plants visited by the team sustained minor structural damage, with unanchored machinery requiring repositioning and recalibration. One plant reached 100% capacity just 16 days after the earthquake; another, in Manta, reached 85% capacity about 17 days after the earthquake, with full capacity expected within one month.

The Need for Risk Differentiation

Occupancy, construction class, year built, and other building characteristics influence the vulnerability of buildings and, consequently, the damage they sustain during earthquakes. Vulnerability is so important in calculating damage from earthquakes that RMS model developers go to great lengths to ensure that each country’s particular engineering and construction practices are accurately captured by the models. This approach enables the models to differentiate risk across thousands of different factors.

Insurance penetration in Ecuador is still relatively low for commercial buildings and privately owned or financed homes, but higher among homes with government-backed mortgages, as these require insurance. The knowledge gained from reconnaissance work is fundamental to our understanding of earthquake risk and informs future updates to RMS models. Better models will improve the insurance industry’s understanding and management of earthquake risk as insurance penetration increases both in Ecuador and around the world.

How U.S. inland flood became a “peak” peril

This article by Jeff Waters, meteorologist and product manager at RMS, first appeared in Carrier Management.

As the journey towards a private flood insurance market progresses, (re)insurers can learn a lot from the recent U.S. flood events to help develop profitable flood risk management strategies.

Flood is the most pervasive and frequent peril in the U.S. Yet, despite having the world’s highest non-life premium volume and one of the highest insurance penetration rates, a significant protection gap still exists in the U.S. for this peril.

It is well-known that U.S. flood risk is primarily driven by tropical cyclone-related events, with storm surge being the main cause. In the last decade alone, flooding from tropical cyclones has caused more than $40 billion (2015 USD) in insured losses and contributed to today’s massive $23 billion National Flood Insurance Program (NFIP) deficit: 13 of the top 15 flood events, as measured by total NFIP payouts, were related to storm surge-driven coastal flooding from tropical cyclones.

Inland flooding, however, should not be overlooked. It too can contribute a material portion of overall U.S. flood risk, as seen recently in the Southern Gulf, South Carolina, and West Virginia, areas impacted by major loss-causing events. These catastrophes caused billions in economic and insured losses while demonstrating the widespread impact of precipitation-driven fluvial (riverine) and pluvial (surface water) flooding. It is these types of flooding events that should be accounted for and well understood by (re)insurers looking to enter the private flood insurance market.

It hasn’t just rained; it has poured

In the past 15 months the U.S. has suffered several record-breaking or significant rainfall-induced inland flood events ….

To read the article in full, please click here.