Tag Archives: Natural Catastrophe

Earthquake Hazard: What Has New Zealand’s Kaikoura Earthquake Taught Us So Far?

The northeastern end of the South Island is a tectonically complex region, with the plate motion primarily accommodated through a series of crustal faults. On November 14, as the Kaikoura earthquake began, multiple faults ruptured together, culminating in a Mw 7.8 event (as reported by GNS Science).

The last two weeks have been busy for earthquake modelers. The paradox of our trade is that while we exist to help avoid the damage this natural phenomenon causes, the only way we can fully understand this hazard is to see it in action so that we can refine our understanding and check that our science provides the best view of risk. Since November 14 we have been looking at what Kaikoura tells us about our latest, high-definition New Zealand Earthquake model, which was designed to handle such complex events.

Multiple-Segment Ruptures

With the Kaikoura earthquake’s epicenter at the southern end of the faults identified, the rupture process moved from south to north along this series of interlinked faults (see graphic below). Multi-fault rupture is not unique to this event: the same process occurred during the 2010 Mw 7.1 Darfield earthquake. Such ruptures are important to consider in risk modeling because they produce events of larger magnitude, and therefore affect a larger area, than individual faults would on their own.

Map showing the faults identified by GNS Science as experiencing surface fault rupture in the Kaikoura earthquake.
Source: http://info.geonet.org.nz/display/quake/2016/11/16/Ruptured+land%3A+observations+from+the+air

In keeping with the latest scientific thinking, the New Zealand Earthquake HD Model provides an expanded suite of events that represent complex ruptures along multiple faults. For now, these are included only for high-slip fault segments in regions with exposure concentrations, but their addition increases the robustness of the tail of the Exceedance Probability (EP) curve, meaning clients get a better view of the risk of the most damaging, but lower-probability, events.
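To make the tail-of-the-EP-curve point concrete, here is a minimal sketch of how an exceedance probability curve is assembled from an event set, and how adding a rare, high-loss multi-fault rupture changes only the tail. The event rates and losses are invented for illustration; this is not RMS code or data.

```python
import numpy as np

def exceedance_curve(rates, losses):
    """Annual exceedance probability of each loss level, assuming events
    arrive as independent Poisson processes with the given annual rates."""
    order = np.argsort(losses)[::-1]               # sort events by loss, largest first
    cum_rate = np.cumsum(np.array(rates)[order])   # rate of exceeding each loss level
    aep = 1.0 - np.exp(-cum_rate)                  # Poisson -> annual exceedance prob.
    return np.array(losses)[order], aep

# Hypothetical single-fault event set: annual rates and losses in $bn
rates  = [0.01, 0.005, 0.002]
losses = [2.0, 5.0, 9.0]

# Adding one rare multi-fault rupture (invented numbers) extends the tail
loss_lvls, aep = exceedance_curve(rates + [0.0005], losses + [20.0])
for l, p in zip(loss_lvls, aep):
    print(f"P(annual loss >= ${l:.0f}bn) = {p:.4%}")
```

Only the largest-loss end of the curve moves when the multi-fault event is added, which is exactly where such scenarios matter.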

Landsliding and Liquefaction

While most property damage has been caused directly by shaking, infrastructure has been heavily impacted by landsliding and, to a lesser extent, liquefaction. Landslides and slumps have occurred across the region, most notably over Highway 1, an arterial route. The infrastructure impacts of the Kaikoura earthquake are a likely dress rehearsal for the expected event on the Alpine Fault. This major fault runs 600 km along the western coast of the South Island and is expected to produce an Mw 8+ event with a probability of 30% in the next 50 years, according to GNS Science.
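As a side note on how a statement like "30% in the next 50 years" translates into model terms: under a simple Poisson occurrence assumption (my illustration, not necessarily how GNS derives its figure), it corresponds to an annual rate of about 0.007, i.e. a return period of roughly 140 years.

```python
import math

p50 = 0.30                            # probability of an Mw 8+ Alpine Fault event in 50 years
rate = -math.log(1.0 - p50) / 50.0    # annual Poisson rate implied by that probability
print(f"annual rate   ~ {rate:.4f}")              # ~0.0071 per year
print(f"return period ~ {1.0 / rate:.0f} years")  # ~140 years
print(f"1-year prob   ~ {1.0 - math.exp(-rate):.3%}")
```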

As many as 80,000–100,000 landslides have been reported in the upper South Island, with some creating temporary dams across rivers and, in some cases, temporary lakes (see below). These dams can fail catastrophically, sending a sudden surge of water down the river.



Examples of rivers blocked by landslides photographed by GNS Science researchers.

Source: http://info.geonet.org.nz/display/quake/2016/11/18/Landslides+and+Landslide+dams+caused+by+the+Kaikoura+Earthquake

Liquefaction occurred in discrete areas across the region impacted by the Kaikoura earthquake. The Port of Wellington experienced both lateral and vertical deformation, likely due to liquefaction processes in reclaimed land. There have been reports of liquefaction near the upper South Island towns of Blenheim, Seddon, and Ward, but liquefaction will not be a driver of loss in the Kaikoura event to the extent it was in the Christchurch earthquake sequence.

RMS’ New Zealand Earthquake HD Model includes a new liquefaction component that was derived using the immense amount of new borehole data collected after the Canterbury Earthquake Sequence in 2010-2011. This new methodology considers additional parameters, such as depth to the groundwater table and soil-strength characteristics, that lead to better estimates of lateral and vertical displacement. The HD model is the first probabilistic model with a landslide susceptibility component for New Zealand.
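The role of those parameters is easiest to see in a toy screening calculation. The sketch below is purely illustrative – the thresholds and weights are invented and are not the model's methodology: shallower groundwater and weaker soils push a site toward higher liquefaction susceptibility.

```python
def liquefaction_susceptibility(groundwater_depth_m, soil_strength_spt_n):
    """Toy susceptibility score in [0, 1]; higher = more susceptible.
    Weights and thresholds are invented for illustration only."""
    # A shallow water table (< ~10 m) is a precondition for liquefaction
    water_factor = max(0.0, 1.0 - groundwater_depth_m / 10.0)
    # Loose sands (low SPT blow count, < ~30) are the most vulnerable soils
    strength_factor = max(0.0, 1.0 - soil_strength_spt_n / 30.0)
    return water_factor * strength_factor

# Reclaimed harbor land: shallow water table, loose fill -> high score
print(liquefaction_susceptibility(1.5, 8))   # ~0.62
# Stiff soils with a deep water table -> negligible score
print(liquefaction_susceptibility(9.0, 25))  # ~0.02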

Tsunami

The Kaikoura earthquake generated tsunami waves that were observed at 2.5 m in Kaikoura, 1 m in Christchurch, and 0.5 m in Wellington. The waves arrived in Kaikoura significantly earlier than in Christchurch and Wellington, indicating that the tsunami was generated near Kaikoura. They were likely generated by offshore faulting, but may also be associated with submarine landsliding. Fortunately, the scale of the tsunami waves did not produce significant damage. RMS’ latest New Zealand Earthquake HD Model captures tsunami risk due to local ocean-bottom deformation caused by fault rupture, and is the first model in the New Zealand market to do this using a fully hydrodynamic model.

Next Generation Earthquake Modeling at RMS

Thankfully, the Kaikoura earthquake seems to have produced less damage than we might have seen had it struck a more heavily populated area of New Zealand with greater exposures – for detail on damage, please see my other blog on this event.

But what Kaikoura has told us is that our latest HD model offers an advanced view of risk. Released only in September 2016, it was designed to handle such a complex event as the Kaikoura earthquake, featuring multiple-segment ruptures, a new liquefaction model at very high resolution, and the first landslide susceptibility model for New Zealand.

New Zealand’s Kaikoura Earthquake: What Have We Learned So Far About Damage?

The Kaikoura earthquake of November 14 occurred in a relatively low-population region of New Zealand, situated between Christchurch and Wellington. The largest town close to the epicentral region is Blenheim, with a population near 30,000.

Early damage reports indicate there has been structural damage in the northern part of the South Island as well as to numerous buildings in Wellington. While most of this has been caused directly by shaking, infrastructure and ports across the affected region have been heavily impacted by landsliding and, to a lesser extent, liquefaction. Landslides and slumps have occurred across the northeastern area of the South Island, most notably over Highway 1, severing land routes to Kaikoura – a popular tourist destination.

The picture of damage is still unfolding as access to badly affected areas improves. At RMS we have been comparing what we have learned from this earthquake to the view of risk provided by our new, high-definition New Zealand Earthquake model, which is designed to improve damage assessment and loss quantification at location-level resolution.

No Damage to Full Damage

The earthquake shook a relatively low-population area of the South Island and, while it was felt keenly in Christchurch, there have been no reports of significant damage in the city. The earthquake ruptured approximately 150 km along the coast, propagating north towards Wellington. The capital experienced ground shaking intensities at the threshold for damage, producing façade and internal, non-structural damage in the central business district. Although the shaking intensities were close to those experienced during the Cook Strait sequence in 2013, which mostly affected short and mid-rise structures, the longer duration and different frequency content of the larger-magnitude Kaikoura event caused more damage to taller structures, which have longer natural periods.
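As a rough illustration of the natural-period point (a standard engineering rule of thumb, not an RMS figure): a frame building's fundamental period is roughly 0.1 seconds per story, so taller buildings respond most strongly to the longer-period energy that large, long-duration earthquakes carry.

```python
def natural_period_s(stories):
    """Rule-of-thumb fundamental period for a mid-rise frame building:
    roughly 0.1 s per story (a simplified engineering heuristic)."""
    return 0.1 * stories

for n in (3, 9, 20):
    print(f"{n:>2}-story building: T ~ {natural_period_s(n):.1f} s")
```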

From: Wellington City Council

Within Wellington, cordons are currently in place around a few buildings in the CBD (see above) as engineers carry out more detailed inspections. Some are being demolished or are set to be, including a nine-story structure on Molesworth Street and three city council buildings. It should be noted that most of the damage has been to buildings on reclaimed land close to the harbor where ground motions were likely amplified by the underlying sediments.

From: http://www.stuff.co.nz/national/86505695/quakehit-wellington-building-at-risk-of-collapse-holds-up-overnight; The building on Molesworth street before the earthquake (L) and after on November 16 (R).

Isolated instances of total damage within an area of otherwise minor damage demonstrate why RMS is moving to the new HD financial modeling framework. The RMS RiskLink approach applies a low mean damage ratio across the area, whereas RMS HD damage functions allow for zero or total loss – as well as a distribution in between, which is sampled for each event at each location. The HD financial modeling framework is thereby able to capture a more realistic pattern of gross losses.
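A minimal sketch of the difference, assuming a zero-one-inflated Beta distribution (my parameterization for illustration, not the RMS implementation): a mean-damage-ratio approach assigns every location the same fractional loss, while an HD-style damage function samples a full distribution per event and per location, including explicit probabilities of zero and total loss.

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_damage_ratio(mdr, p_zero=0.70, p_total=0.02, n=1):
    """Sample damage ratios with point masses at 0 and 1 and a Beta
    distribution in between (toy parameterization for illustration)."""
    u = rng.random(n)
    out = np.zeros(n)
    out[u >= 1.0 - p_total] = 1.0                 # total losses
    mid = (u >= p_zero) & (u < 1.0 - p_total)     # partial losses
    # Tune the Beta so the overall mean roughly matches the target MDR
    mid_mean = (mdr - p_total) / (1.0 - p_zero - p_total)
    mid_mean = min(max(mid_mean, 0.01), 0.99)
    a = 2.0
    b = a * (1.0 - mid_mean) / mid_mean
    out[mid] = rng.beta(a, b, mid.sum())
    return out

samples = sample_damage_ratio(mdr=0.05, n=100_000)
print(f"mean damage ratio:     {samples.mean():.3f}")    # ~0.05, like a flat MDR
print(f"share with zero loss:  {(samples == 0).mean():.1%}")
print(f"share with total loss: {(samples == 1).mean():.1%}")
```

Both approaches produce the same mean, but only the sampled distribution reproduces the observed mix of untouched buildings and occasional total losses.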

Business Interruption

The Kaikoura earthquake will produce business interruption losses from a variety of causes, such as direct property or contents damage, relocation costs, or loss of access to essential services (e.g., power and water utilities, information technology) that cripple operations in otherwise structurally sound buildings. How quickly businesses are able to recover depends on how quickly these utilities are restored. Extensive landslide damage to roads means access to Kaikoura itself will be restricted for months. The New Zealand government has announced financial assistance packages for small businesses to help them through the critical period immediately after the earthquake. Similar assistance was provided to businesses in Christchurch after the Canterbury Earthquake Sequence in 2010-2011.

That earthquake sequence and others around the world have provided valuable insights into business interruption, allowing our New Zealand Earthquake HD model to better capture these impacts. For example, during the Canterbury events, lifelines were found to be repaired much more quickly in urban areas than in rural areas, and areas susceptible to liquefaction were associated with longer downtimes due to greater damage to underground services. The new business interruption model provides a more accurate assessment of these risks by accounting for the influence of both property and contents damage as well as lifeline downtime.

It remains to be seen how significant any supply chain or contingent business interruption losses will be. Landslide damage to the main road and rail route from Christchurch to the inter-island ferry terminal at Picton has disrupted supply routes across the South Island. Alternative, longer routes with less capacity are available.

Next Generation Earthquake Modeling at RMS

RMS designed the update to its New Zealand Earthquake High Definition (HD) model, released in September 2016, to enhance location-level damage assessment and improve the gross loss quantification with a more realistic HD financial methodology. The model update was validated with billions of dollars of claims data from the 2010-11 Canterbury Earthquake Sequence.

Scientific and industry lessons learned following damaging earthquakes such as last month’s in Kaikoura, and the earlier events in Christchurch, increase the sophistication and realism of our understanding of earthquake risk, allowing communities and businesses to shift and adapt – and so become more resilient to future catastrophic events.

Shrugging Off a Hurricane: A Three Hundred Year Old Culture of Disaster Resilience

If a global prize was to be awarded to the city or country that achieves the peak of disaster resilience, Bermuda might be a fitting first winner.

This October’s Hurricane Nicole made direct landfall on the island. The eyewall tracked over Bermuda with maximum measured windspeeds close to 120 mph. Nonetheless, there were no casualties. The damage tally principally comprised fallen trees, roadway debris, some smashed boats, and many downed utility poles. The airport reopened within 24 hours, and the island’s ferries were operating the following day.

Bermuda’s performance through Nicole was exemplary. What’s behind that?

Since its foundation in 1609, when 150 colonists and crew were shipwrecked on the island, Bermuda has grown accustomed to its situation at the heart of hurricane alley. Comprising 21 square miles of reef and lithified dunes, sitting out in the Atlantic some 650 miles east of Cape Hatteras, the island is hit by a hurricane on average once every six or seven years. Mostly these are glancing blows, but once or twice a century Bermuda sustains a direct hit at Category 3 or 4 intensity. Hurricane Fabian in 2003 was the worst of the recent storms, causing $300 million of damage (estimated to be worth $650 million today, accounting for higher prices and greater property exposure). The cost of the damage from Hurricane Gonzalo in 2014 was about half this amount.

How did Bermuda’s indigenous building style come to adopt such a high standard of wind resistance? It seems to go back to a run of four hurricanes at the beginning of the 18th century. First, in September 1712, a hurricane persisted for eight hours, destroying the majority of the island’s wooden buildings. Then twice in 1713, and again more strongly in 1715, hurricane winds ruined the newly rebuilt churches. One hurricane can seem like an exception; four becomes a trend. In response, houses were constructed with walls of massive reef limestone blocks, covered by roofs tiled with thick slabs of coral stone: traditional house styles that have been sustained ever since.

The frequency of hurricanes has helped stress test the building stock, and ensure the traditional construction styles have been sustained. More recently there has been a robust and well-policed building code to ensure adequate wind resistance for all new construction on the island.

Yet resilience is more than strong buildings. It also requires hardened infrastructure, and that is where Bermuda has some room for improvement. The island is still dependent on overhead power lines, and 90 percent of its 27,000 houses lost power in Hurricane Nicole – although half of these had been reconnected by the following morning and the remainder through that day. Mobile phone and cable networks were also back in operation over a similar timescale. Experience of recent hurricanes has ensured an adequate stockpile of cable and poles.

Expert Eyes on the Island

It helps that there is an international reinsurance industry on the island, with many specialists in the science of hurricanes and the physics and engineering of building performance on hand to scrutinize the application of improved resilience. Almost every building is insured, giving underwriters oversight of building standards. Most importantly, the very functioning of global reinsurance depends on uninterrupted connection with the rest of the world, and on ensuring that on-island staff are not distracted by having to attend to their families’ welfare.

Bermuda’s experience during Nicole would merit the platinum standard of resilience adopted by the best businesses: that all functions can be restored within 72 hours of a disaster. The Bermuda Business Development Agency and the Association of Bermuda Insurers and Reinsurers were fulsome in their praise for how the island had withstood the hurricane. The strong and widely owned culture of preparedness reflects the experience of recent storms like Gonzalo and Fabian.

Stephen Weinstein, general counsel at RenaissanceRe, commented “It’s remarkable that one day after a major hurricane strike, Bermuda is open for business, helping finance disaster risk worldwide, and poised to welcome back business visitors and vacationers alike.”

In early 2017, RMS will issue an update to Bermuda wind vulnerability in the version 17 software release as part of a broader update to the 33 islands and territories covered by the North Atlantic Hurricane Models. Updates to Bermuda vulnerability will consider past hurricane observations and the latest building code research.

The Cure for Catastrophe?

On August 24, 2016 – just a few weeks ago – an earthquake hit a remote area of the Apennine mountains of central Italy in the middle of the night. Fewer than 3,000 people lived in the vicinity of the strongest shaking, but nearly 1 in 10 of those died when the buildings in which they were sleeping collapsed.

This disaster, like almost all disasters, was squarely man-made. Manufactured by what we build and where we build it; or in more subtle ways – by failing to anticipate what will one day inevitably happen.

Italy has some of the richest and best-researched disaster history of any country, going back more than a thousand years. The band of earthquakes that runs through the Apennines is well mapped – pretty much this exact earthquake happened in 1639. If you were identifying the highest-risk locations in Italy, these villages would be on your shortlist. So in 2016, 300 people dying in a well-anticipated, moderate-sized earthquake, in a rich and highly developed country, is no longer excusable.

Half the primary school in the town of Amatrice collapsed in the August 24 earthquake. Very fortunately, it being the middle of the night, no children were in class. Four years before, €700,000 had been spent to make the school “earthquake proof.” An investigation is now underway to see why this proofing failed so spectacularly. If only Italy were as good at building disaster resilience as at mobilizing disaster response: some 7,000 emergency responders arrived after the earthquake – more than twice the number of people living in the affected villages.

The unnatural disaster

When we look back through history and investigate disasters closely, we find that many other “natural disasters” were, in their different ways, also man-made.

The city of Saint-Pierre on the island of Martinique was once known as the “little Paris of the Caribbean.” In 1900 it had a population of 26,000, with tree-lined streets of balconied two- and three-story houses. From the start of 1902 it was clear the neighbouring volcano of Mont Pelée was heading towards an eruption. The island’s governor convened a panel of experts, who concluded Saint-Pierre was at no risk because the valleys beneath the volcano would guide the products of any eruption directly into the sea. As the tremors increased, the governor brought his family to Saint-Pierre to show the city was safe; he, his family, and all but one of the city’s inhabitants died when the eruption blasted sideways out of the volcano. There are some parallels here with the story of the 20,000 people drowned in the 2011 Japanese tsunami, many of whom had assumed they would be protected by concrete tsunami walls and therefore did not try to escape while they still had time. We should distrust simple notions of where is safe when they are based only on some untested theory.

Sometimes the disaster reflects the unforeseen consequence of some man-made intervention. In spring 1965, the U.S. Army Corps of Engineers completed the construction of a broad shipping canal – known as the Mississippi River Gulf Outlet (“Mr Go”) – linking New Orleans with the Gulf of Mexico. Within three months, a storm surge flood driven by the strong easterly winds ahead of Hurricane Betsy was funnelled up Mr Go into the heart of the city. Without Mr Go the city would not have flooded. Four decades later, Hurricane Katrina performed the same trick on New Orleans, only this time the storm surge was three feet higher. The flooding was exacerbated when thin concrete walls lining drainage canals fell over without being overtopped. Channels meant for pumping water out of the city reversed their intended function and became the means by which the city was inundated.

These were fundamental engineering and policy failures, for which many vulnerable people paid the price.


My new book, “The Cure for Catastrophe,” challenges us to think differently about disasters. To understand how risk is generated before the disaster happens. To learn from countries like Holland, which over the centuries have mastered their ever-threatening flood catastrophes by fostering a culture of disaster resilience.

Today we can harness powerful computer technology to help anticipate and reduce disasters. Catastrophe models, originally developed to price and manage insurance portfolios, are being converted into tools to model metrics on human casualties or livelihoods as well as monetary losses. And based on these measurements we can identify where to focus our investments in disaster reduction.

In 2015 the Tokyo City government was the first to announce it aims to halve its earthquake casualties and measure progress by using the results of a catastrophe model. The frontline towns of Italy should likewise have their risks modeled and independently audited, so that we can see if they are making progress in saving future lives before they suffer their next inevitable earthquake.


The Cure for Catastrophe is published by Oneworld (UK) and Basic Books (US)

How U.S. inland flood became a “peak” peril

This article by Jeff Waters, meteorologist and product manager at RMS, first appeared in Carrier Management.

As the journey towards a private flood insurance market progresses, (re)insurers can learn a lot from the recent U.S. flood events to help develop profitable flood risk management strategies.

Flood is the most pervasive and frequent peril in the U.S. Yet, although the country has the world’s highest non-life premium volume and one of the highest insurance penetration rates, a significant protection gap still exists for this peril.

It is well known that U.S. flood risk is primarily driven by tropical cyclone-related events, with storm surge being the main cause. In the last decade alone, flooding from tropical cyclones has caused more than $40 billion (2015 USD) in insured losses and contributed to today’s massive $23 billion National Flood Insurance Program (NFIP) deficit: 13 of the top 15 flood events, as determined by total NFIP payouts, were related to storm surge-driven coastal flooding from tropical cyclones.

Inland flooding, however, should not be overlooked. It too can contribute a material portion of overall U.S. flood risk, as seen recently in the Southern Gulf, South Carolina, and West Virginia, all impacted by major loss-causing events. These catastrophes caused billions in economic and insured losses while demonstrating the widespread impact of precipitation-driven fluvial (riverine) or pluvial (surface water) flooding. It is these types of flooding events that should be accounted for and well understood by (re)insurers looking to enter the private flood insurance market.

It hasn’t just rained; it has poured

In the past 15 months the U.S. has suffered several record-breaking or significant rainfall-induced inland flood events …

To read the article in full, please click here.

A Perennial Debate: Disaster Planning versus Disaster Response

In May we saw a historic first: the World Humanitarian Summit. Held in Istanbul, it was attended by representatives of 177 states. One UN chief summarised its mission thus: “a once-in-a-generation opportunity to set in motion an ambitious and far-reaching agenda to change the way that we alleviate, and most importantly prevent, the suffering of the world’s most vulnerable people.”

And in that sentence we find one of the enduring tensions within the disaster field: between “prevention” and “alleviation.” Between on the one hand reducing disaster risk through resilience-building investments, and on the other reducing suffering and loss through emergency response.

But in a world of constrained political budgets, where should we concentrate our energies and resources: disaster risk reduction or disaster response?

How to Close the Resilience Gap

The Istanbul summit saw a new global network launched to engage business in crisis situations through “pre-positioning supplies, meeting humanitarian needs and providing resources, knowledge and expertise to disaster prevention.” It is, of course, prudent to have stockpiles of humanitarian supplies strategically placed.

But is the dialogue still too focused on response? Could we not have hoped to see a greater emphasis on driving the disaster-resilient behaviours and investments, which reduce the reliance on emergency response in the first place?

Politics & Priorities

“Cost-effectiveness” is a concept with which humanitarian aid and governmental agencies have struggled over many years. But when it comes to building resilience, it is in fact possible to cost-justify the best course of action. After all, the insurance industry, piqued by the dual surprise of Hurricane Andrew and then the Northridge earthquake, has been using stochastic models to quantify and reduce catastrophe risk since the mid-1990s.

Unfortunately risk/reward analyses are rarely straightforward in practice. This is less a failing of the models to accurately characterise complex phenomena, though that certainly is a challenge. It’s more a question of politics.

It is harder for any government to argue that spending scarce public funds on building resilience in advance of a possible disaster is money well spent. By contrast, when disaster strikes and human suffering is writ large across the media, then there is a pressing political imperative to intervene. As a result many agencies sadly allocate more funds to disaster response than to disaster prevention, even though the analytics mostly suggest the opposite would be more beneficial.

A New, Ambitious form of Public Private Partnership

But there are signs that across the different strata of government the mood is changing. The cities of San Francisco and Berkeley, for example, have begun to use catastrophe models to quantify the cost of inaction and thereby drive risk-reducing investments. For San Francisco the focus has been on protecting the city’s economic and social wealth from future sea level rise. In Berkeley, resilience models have been deployed to shore up critical infrastructure against the threat of earthquakes.

In May, RMS held the first international workshop on how resilience analytics can be used to manage urban resilience. Attended by public officials from several continents, the workshop generated a very high level of engagement in the topic.

The role of resilience analytics in helping to design, implement, and measure resilience strategies was emphasized by Arnoldo Kramer, the first Chief Resilience Officer (CRO) of Mexico City, the largest city in the western hemisphere. The workshop discussion went further than explaining how these models can be used to quantify the potential, risk-adjusted return on investment from resilience initiatives. The group stressed the role of resilience metrics in helping cities finance capital investments in new, protective infrastructure.

Stimulated by commitments under the Sendai Framework to work more closely with the private sector, lower income regions are also increasingly benefiting from such techniques – not just to inform disaster response, but also to finance the reduction of disaster risk in the first place. Indeed there are encouraging signs that these two different worlds are beginning to understand each other better. At the inaugural working group meeting of the Insurance Development Forum in Singapore last month there was a productive dialogue between the UN Development Programme and the risk transfer industry. It was clear that both sides wanted action, not just words.

Such initiatives can only serve to accelerate the incorporation of resilience analytics into existing disaster risk reduction programmes. This may be a once-in-a-generation opportunity to address the shameful gap between the economic costs of natural disasters and the fraction of those costs that are insured.

We cannot prevent natural disasters from happening. But neither can we continue to afford to spend billions of dollars picking up the pieces when they strike. I am hopeful that we will take this opportunity to bring resilience analytics into under-served societies, making them tougher, more resilient, so that when catastrophe strikes, the impact is lessened and societies can bounce back far more readily.

Lessons Learned from Winter Windstorm Season in Europe

The 2013–2014 winter windstorm season in Europe will be remembered for being particularly active, bringing persistent unsettled weather to the region, and with it some exceptional meteorological features. The insurance industry will have much to learn from this winter.

Past extreme windstorms, such as Daria, Herta, Vivian, and Wiebke in 1990, each caused significant losses in Europe. In contrast, the individual storms of 2013–2014 caused relatively low levels of loss. While not extreme on a single-event basis, the accumulated activity and loss across the season was notable, primarily due to the specific characteristics of the jet stream.

A stronger-than-usual jet stream off the U.S. Eastern Seaboard was caused by very cold polar air over Canada and warmer-than-normal sea-surface temperatures in the sub-tropical West Atlantic and Caribbean Sea. This jet stream then weakened significantly over the East Atlantic.

Therefore, the majority of systems were mature and wet when they reached Europe. These storms, while associated with steep pressure gradients, brought only moderate peak gust wind speeds onshore, mainly to the U.K. and Ireland. In contrast, the storms that hit Europe in 1990 were mostly still in their development phase under a strong jet stream as they passed over the continent.

The 2013–2014 storms were also very wet, and many parts of the U.K. experienced record-breaking rainfall, resulting in significant inland flooding. Again, individual storms were not uniquely severe, but the impact was cumulative, especially as the soil progressively saturated.

Not all events this winter season weakened before impact. Windstorms Christian and Xaver were exceptions: they only became mature storms after crossing the British Isles into the North Sea, and were more damaging as a result.

Christian impacted Germany, Denmark, and Sweden with strong winds. RMS engineers visited the region and observed that the majority of building damage consisted of the usual tile uplift along roof edges. Fallen trees were observed, but in most cases there was sufficient clearance to prevent them from causing building damage.

Xaver brought a significant storm surge to northern Europe, although coastal defenses mostly withstood the storm. Xaver, as well as some of this year’s other events, demonstrated the importance of understanding tides when assessing surge hazard, as many events coincided with some of the highest tides of the year. The size of a storm-induced surge is much smaller than the local tidal range; consequently, if these events had occurred a few days earlier or later, the astronomical tide would have been lower, significantly reducing the peak water level.
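The tide-timing point can be illustrated with a toy calculation (all amplitudes and periods below are invented for illustration): add the same surge time series to a semidiurnal tide modulated by a spring-neap envelope, shift the surge peak by a few days, and the peak total water level changes markedly even though the surge itself is unchanged.

```python
import numpy as np

hours = np.arange(0.0, 24 * 28, 0.5)     # four weeks, half-hourly

# Toy astronomical tide: semidiurnal (12.42 h) constituent modulated by
# a spring-neap envelope (~14.8 days); amplitudes in metres, invented.
envelope = 1.0 + 0.8 * np.cos(2 * np.pi * hours / (24 * 14.8))
tide = envelope * np.cos(2 * np.pi * hours / 12.42)

def peak_water_level(surge_peak_hour, surge_height=1.5, duration=12.0):
    """Peak of tide + a Gaussian-shaped surge centred on surge_peak_hour."""
    surge = surge_height * np.exp(-((hours - surge_peak_hour) / duration) ** 2)
    return (tide + surge).max()

print(f"surge at spring tide:    {peak_water_level(24.0):.2f} m")
print(f"same surge 3 days later: {peak_water_level(24.0 + 72.0):.2f} m")
```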


Wind, flood, and coastal surge are three components of this variable peril that can make the difference between unsettled and extreme weather. This highlights the importance of modeling the complete life cycle of windstorms, the background climate, and antecedent conditions to fully understand the potential hazard.

This season has also raised questions about the variability of windstorm activity in Europe, how much we understand this variability, and what we can do to better understand it in the future. While this winter season was active, we have been in a lull of storm activity for about 20 years.

Given the uncertainty that surrounds our ability to predict the future of this damaging peril, perhaps for now we are best positioned to learn lessons from the past. This past winter provided a unique opportunity, compared to the more extreme events that have dominated the recent historical record.

RMS has prepared a detailed report on the 2013–2014 Europe windstorm season, which analyzes the events that occurred and their insurance and modeling considerations. To access the full report, visit RMS publications.

2014 Atlantic Hurricane Season Outlook: Are the Tides Beginning to Turn?

The 2014 Atlantic Hurricane Season officially kicked off this week (June 1), running through November 30. Coming off a hurricane season with the lowest number of hurricanes in the Atlantic Basin since 1983, will 2014 follow suit as a less active season? If so, is the Atlantic Basin officially signaling a shift out of an active phase of hurricane activity? Or will we revert back to the above-average hurricane numbers and intensities we’ve grown accustomed to over most of the last 20 years? And regardless of the season’s severity, what should be done to prepare?

Forecasting the 2014 Hurricane Season

Most forecasts to date, including those of Colorado State University and the National Oceanic and Atmospheric Administration (NOAA), are calling for an average to below-average season in terms of the number of named storms (8–13), hurricanes (3–6), and major hurricanes (0–3). The same holds true for the overall intensity forecasts, where projected seasonal values of Accumulated Cyclone Energy (ACE) range from just 55 to 84, compared to the average overall seasonal ACE of 101.8.
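For context, ACE is computed from each storm's six-hourly maximum sustained winds: the squares of the wind speeds (in knots, counted while the system is at tropical-storm strength or above) are summed and scaled by 10⁻⁴. A minimal sketch with invented wind records:

```python
def storm_ace(six_hourly_winds_kt):
    """ACE contribution of one storm: sum of squared six-hourly maximum
    sustained winds (knots), counted while >= 35 kt, scaled by 1e-4."""
    return sum(v ** 2 for v in six_hourly_winds_kt if v >= 35) * 1e-4

# Invented example storms (knots at six-hour intervals)
short_weak_storm = [35, 40, 45, 40, 35, 30]
long_major_storm = [35, 50, 75, 100, 115, 100, 70, 45]

season_ace = storm_ace(short_weak_storm) + storm_ace(long_major_storm)
print(f"season ACE ~ {season_ace:.1f}")   # long, intense storms dominate the total
```

Because ACE is quadratic in wind speed and accumulates over a storm's lifetime, a single long-lived major hurricane can outweigh several short-lived weak storms.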

So what’s driving this outlook? Most forecasting organizations are attributing it to two major atmospheric drivers that have been known to suppress hurricane activity: the strong likelihood of an El Niño event developing this summer into the peak part of the season from July through October, and below-average sea surface temperatures in the Atlantic Basin’s Main Development Region (MDR).

Model forecasts for El Niño/La Niña conditions in 2014. El Niño and La Niña conditions occur when sea surface temperatures in the equatorial central Pacific are 0.5°C warmer than average and 0.5°C cooler than average, respectively.

El Niño conditions create stronger-than-normal upper-level winds, which inhibit storms from forming and maintaining a favorable structure for intensification. Similarly, below-average ocean temperatures in the MDR essentially reduce the energy available to fuel storms, making it difficult for them to develop and intensify.

However, low activity does not always translate into fewer landfalling hurricanes, and all it takes is one landfalling event to cause catastrophic losses. For example, 1992 was a strong El Niño year, yet Hurricane Andrew made landfall in Florida as a Category 5 storm, eventually becoming the fourth most intense U.S. landfalling hurricane recorded and the fourth costliest U.S. Atlantic hurricane. Of course, while a landfalling storm like Andrew may have occurred during the last significant El Niño year, there’s no guarantee it will happen this season. The U.S. has not experienced a major landfalling hurricane since Hurricane Wilma in October 2005. This eight-year drought is the longest in recorded U.S. history.

Preparing for Hurricane Season

Whether or not the 2014 Atlantic hurricane season is active, it is imperative to monitor and prepare for impending storms effectively to help reduce the effects of a hurricane disaster.

The NOAA National Hurricane Center provides several tips and educational guides for improving hurricane awareness, including forecasting tools that assess the potential impacts of landfalling hurricanes. This year, NOAA also offers an experimental mapping tool, as well as other new tools, to help communities understand their potential storm surge flood threat.

The RMS Event Response team provides real-time updates for all Atlantic hurricanes, among other global hazards, 24 hours a day, seven days a week. Similarly, when it comes to preparation, along with the essentials, such as bottled water, canned foods, and battery-powered flashlights, consider purchasing these ten items.

Are you ready for the 2014 Atlantic Hurricane season?

Understanding the Potential Impact of the Next Catastrophic European Flood

Over the past year, Europe has repeatedly suffered from significant flooding.

Most recently, the Balkans experienced widespread devastation in May due to some of the region’s heaviest precipitation on record. Three months’ worth of rain fell in just three days. The subsequent flooding was so severe that entire towns were submerged. While it is too soon to estimate the full impact, the economic and humanitarian costs will be high.

This event follows one of the stormiest and wettest winters on record for the U.K. Remote locations bore the worst of it, and for now, the U.K. government and insurance industry appear to have largely escaped a sizeable bill, at least on the scale of previous flood events.

The events come just one year after the costliest natural catastrophe of 2013 for the insurance industry, when flooding inundated Central and Eastern Europe in late May and early June. The event caused around $20 billion (€12 billion) in economic losses, of which it is estimated that approximately 20 percent was insured.

As with the more recent Balkans and U.K. events, the May 2013 flooding followed a period of extreme rainfall; consequently, groundwater and soil moisture levels were saturated. As more rain fell in late May and early June, the precipitation had nowhere to go except to flow through catchments into the river network as runoff. The Danube, Elbe, and other rivers overflowed, resulting in significant flooding across Germany and the Czech Republic, and, to lesser extents, Austria, Switzerland, Poland, Slovakia, Hungary, Croatia, and Serbia.

Each of these events highlighted the importance of understanding the impact of precipitation, whether from a short, intense period of rainfall, prolonged wet conditions, or a combination of the two. In each case, to evaluate flood risk, it is vital to understand how antecedent wetness conditions influence subsequent flooding.
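A toy bucket model makes the mechanism plain (the storage capacity and rainfall figures below are invented): once the soil's storage is nearly full, the same storm produces far more runoff.

```python
def runoff_mm(rainfall_mm, soil_moisture_mm, capacity_mm=100.0):
    """Toy bucket model: rain first fills the remaining soil storage;
    anything beyond capacity becomes runoff. Invented numbers."""
    space = max(0.0, capacity_mm - soil_moisture_mm)
    infiltration = min(rainfall_mm, space)
    return rainfall_mm - infiltration

# The same 80 mm storm, very different outcomes:
print(runoff_mm(80, soil_moisture_mm=20))   # dry catchment: 0 mm of runoff
print(runoff_mm(80, soil_moisture_mm=90))   # saturated catchment: 70 mm of runoff
```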

In 2002, Central Europe was similarly inundated by severe flooding, producing economic losses of over $28 billion (€17 billion). Both events were triggered by similar meteorological phenomena, Genoa-type lows. However, the antecedent conditions in 2002 were dry compared to those in 2013, while the precipitation that triggered the eventual flooding was more severe in 2002 than in 2013.

Both events had significant impacts, but what would happen if we combined the worst features of both to create a “perfect storm” type of flood event?

Combining the antecedent wetness of spring 2013 with the extreme precipitation of the August 2002 event, RMS researchers estimated how severe this “perfect flood” could be. Results of this study show a substantial increase in peak flow (more than 50 percent on average) for both the Elbe and Danube rivers.

Elbe River flood hazard map for a “perfect flood event,” Riesa, Germany

In certain locations, this scenario would be characterized by a flood extent (shown above for the area surrounding Riesa, Germany) about 2.5 times that observed in 2002. But given the marked non-linearity between hazard and damage, RMS research estimates that the increased losses could aggregate to a total economic loss of approximately four times the 2002 losses. While this is a theoretical scenario, it is also an entirely realistic one.
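That non-linearity is worth unpacking: damage grows convexly with flood depth, so a hazard that is 2.5 times larger in extent (and correspondingly deeper) can plausibly multiply losses by a factor of four. The sketch below is one plausible way the arithmetic works, using an invented depth-damage curve and invented depths, not the RMS methodology.

```python
def damage_ratio(depth_m):
    """Toy convex depth-damage curve (invented): slow initial growth,
    saturating toward total loss at ~6 m of inundation."""
    return min(1.0, (depth_m / 6.0) ** 1.5)

# Invented 2002-style event: modest depths across a smaller area
depths_2002 = [0.5, 1.0, 1.5, 2.0]
# "Perfect flood": deeper water over a wider extent
depths_perfect = [1.5, 2.0, 2.5, 3.0, 3.5, 1.0, 0.5]

loss_2002 = sum(damage_ratio(d) for d in depths_2002)
loss_perfect = sum(damage_ratio(d) for d in depths_perfect)
print(f"loss multiple: {loss_perfect / loss_2002:.1f}x")  # grows faster than extent
```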

The events that have occurred since May 2013 are a stark reminder that flood is a peril from which much can be lost.

After the 2002 flooding, flood defenses were improved in some locations, such as Prague, resulting in less severe flooding. However, because both the flood hazard itself and the physical environment change over time, Europe’s flood risk must be continually and holistically assessed to ensure that we are prepared for when, not if, a similar event occurs again.

One Year Later: What We Learned from the Moore Tornadoes

This week marks the one-year anniversary of the severe weather outbreak that brought high winds, hail, and tornadoes to half of all U.S. states. The most damaging event in the outbreak was the Moore, Oklahoma tornado of May 20, 2013. Rated at the maximum intensity of EF5, it had maximum sustained wind speeds of up to 210 mph and was the deadliest and most damaging tornado of the year for both Oklahoma and the U.S., causing roughly $2 billion in insured losses.

As we reflect upon the events that have taken place in Moore, the following can be discerned:

  • Understanding severe weather risks is key: According to the RMS U.S. Severe Convective Storm Model in RiskLink 13.1, the annual likelihood of a severe weather event causing at least $1 billion in insured losses in the U.S. is 92 percent, meaning such an event is almost certain to occur each year. For reference, from a loss perspective, the $2 billion 2013 Moore tornado loss represented a 1-in-50-year event in Oklahoma, or an event with a 2 percent chance of occurring in a given year (see the sketch after this list). Similarly, a 1-in-100-year event, or an event with a 1 percent chance of occurring in a given year, would cause $4 billion or more in insured losses for Oklahoma. Events in excess of the 1-in-100-year return period would be driven by large, destructive tornadoes hitting more concentrated urban environments, such as a direct hit on Oklahoma City. Probabilistic severe storm models provide more perspective on these types of risks, and can better prepare the industry for the “big ones.”
  • What grabs the headlines doesn’t cause the most damage: Although tornadoes get all the news coverage and are often catastrophic, hail drives roughly 60 percent of the average annual loss in convective storms. This is mainly driven by the much higher frequency of hailstorms compared to tornadoes. Hailstorms also have a much larger average footprint size.
  • Tornado Alley isn’t the only risky place: Tornado Alley drives roughly 32 percent of the average annual loss for severe convective storms in the U.S., while the Upper Midwest drives 24 percent, Texas drives 16 percent, and the Southeast drives 12 percent.
  • Buildings in affected areas need continued upgrades: For example, the Moore city council approved 12 changes to the residential building code after the Moore tornado, including mandates for continuous plywood bracing and wind-resistant garages (often the first point of failure during weak to moderate winds).
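The return-period arithmetic in the first bullet follows from a Poisson occurrence assumption: a 1-in-50-year loss level has roughly a 2 percent annual exceedance probability, and a 92 percent annual chance of at least one $1 billion event implies an arrival rate of about 2.5 such events per year. A minimal sketch of the conversions:

```python
import math

def annual_prob(return_period_years):
    """Annual exceedance probability for a given return period,
    assuming Poisson event occurrence."""
    return 1.0 - math.exp(-1.0 / return_period_years)

print(f"1-in-50-year event:  {annual_prob(50):.1%} per year")   # ~2.0%
print(f"1-in-100-year event: {annual_prob(100):.1%} per year")  # ~1.0%

# A 92% annual chance of >= one $1bn severe weather event implies
# an underlying rate of roughly 2.5 such events per year:
rate = -math.log(1.0 - 0.92)
print(f"implied annual rate of $1bn+ events: {rate:.1f}")
```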

While we can never predict exactly when severe weather will occur, it’s imperative for communities, businesses, and individuals to understand its potential impact. Doing so will help people and industries exposed to severe weather be better prepared for the next big event.

Are you located in one of the regions affected by last May’s outbreak, or in another risk-prone area? Have you been affected by any recent severe weather events? If so, what did you learn, and what changes were made in your region to safeguard the community, businesses, and homes? Please share your experience in the comment section.

Jeff Waters also contributed to this post.