Author Archives: Adrian Mark

About Adrian Mark

Senior Manager, Model Product Strategy, RMS
As a member of RMS' model solutions team, Adrian works to guide more informed usage of catastrophe models and enhance understanding of model uncertainty. This requires interaction with the market, as well as other important stakeholders such as regulators and rating agencies, to help RMS develop tools that capture the evolving needs of the risk management industry. Based in London, his primary focus is on supporting the RMS European modeling suite and facilitating client understanding around the open modeling capabilities of RMS(one). Adrian holds a BS in meteorology and oceanography from the University of East Anglia and an MS in engineering in the coastal environment from the University of Southampton.

Lessons Hidden In A Quiet Windstorm Season

Wind gusts in excess of 100 mph hit remote parts of Scotland earlier this month as a strong jet stream brought windstorms Elon and Felix to Europe. The storms are some of the strongest so far this winter; however, widespread severe damage is not expected because the winds struck mainly remote areas.

These storms are characteristic of what has largely been an unspectacular 2014/15 Europe windstorm season. In fact, the most chaotic thing to cross the North Atlantic this winter and impact our shores has probably been the Black Friday sales.

This absence of a significantly damaging windstorm in Europe follows on from what was an active winter in 2013/14, but one which contained no individual standout events. More details of the characteristics of that season are outlined in RMS' 2013-2014 Winter Storms in Europe report.

There’s a temptation to say there is nothing to learn from this year’s winter storm season. Look closer, however, and there are lessons that can help the industry prepare for more extreme seasons.

What have we learnt?

This season was unusual in that a series of wind, flood, and surge events accumulated to drive losses. This contrasts to previous seasons when losses have generally been dominated by a single peril—either a knockout windstorm or inland flood.

This combination of loss drivers poses a challenge for the (re)insurance industry, as it can be difficult to break out the source of claims and distinguish wind from flood losses, which can complicate claim payments, particularly if flood is excluded or sub-limited.

The clustering of heavy rainfall that led to persistent flooding put a focus on the terms and conditions of reinsurance contracts, in particular the hours clause: the time period over which losses can be counted as a single event.
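The mechanics of an hours clause can be sketched in a few lines of code. The snippet below is purely illustrative: the 72-hour window, the claim values, and the "pick the worst window" rule are hypothetical choices made here for demonstration, not terms from any actual contract (real wordings vary, and the window length differs by peril and treaty).

```python
from datetime import datetime, timedelta

def max_event_loss(claims, hours=72):
    """Largest total loss in any window of `hours`, starting at a claim.

    Illustrative hours-clause aggregation: all claims falling within one
    window count as a single "event" for recovery purposes.
    """
    window = timedelta(hours=hours)
    claims = sorted(claims)  # (timestamp, loss) tuples, ordered in time
    best = 0.0
    for i, (start, _) in enumerate(claims):
        total = sum(loss for t, loss in claims[i:] if t - start <= window)
        best = max(best, total)
    return best

# Hypothetical claims (in arbitrary currency units) from a stormy fortnight:
claims = [
    (datetime(2014, 1, 1, 6), 10.0),
    (datetime(2014, 1, 2, 18), 25.0),
    (datetime(2014, 1, 6, 9), 40.0),  # falls outside a 72h window of the first two
]
print(max_event_loss(claims))             # the two early claims group, but 40 alone is larger
print(max_event_loss(claims, hours=200))  # a longer clause sweeps all three into one event
```

The point of the sketch is the sensitivity: lengthening the window from 72 to 200 hours nearly doubles the recoverable "event" loss, which is exactly why persistent clustered rainfall puts the hours clause under scrutiny.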

The season also brought home the challenges of understanding loss correlation across perils, as well as the need to have high-resolution inland flood modeling tools. (Re)insurers need to understand flood risk consistently at a high resolution across Europe, while understanding loss correlation across river basins and the impact of flood specific financial terms, such as the hours clause.

Unremarkable as it was, the season has highlighted many challenges that the industry needs to be able to evaluate before the next “extreme” season comes our way.

What to Expect This 2014-2015 Europe Winter Windstorm Season

Could rain in Sulawesi mean a gale in Surrey, some 12,000 miles away? While these occurrences may sound distinct and uncorrelated, the wet weather in Indonesia is likely to have played some role in the persistent stormy weather experienced across northern Europe last winter.

Weather events are clearly connected in different parts of the world. The events of last winter are discussed in RMS’ 2013-2014 Winter Storms in Europe report, which provides an in-depth analysis of the main 2013-2014 winter storm events and why it is difficult to predict European windstorm hazard due to many factors, including the influence of distant climate anomalies from across the globe.

Can we predict seasonal windstorm activity during the 2014-2015 Europe winter windstorm season?

As we enter the 2014-2015 Europe winter windstorm season, (re)insurers are wondering what to expect.

Many consider current weather forecasting tools beyond a week to be as useful as the unique “weather forecasting stone” that I came across on a recent vacation.

I am not so cynical; while weather forecasting models may have missed storms in the past and the outputs of long-range forecasts still contain uncertainty, they have progressed significantly in recent years.

In addition, our understanding of the climatic drivers that strongly influence our weather, such as the North Atlantic Oscillation (NAO), the El Niño Southern Oscillation (ENSO), and the Quasi-Biennial Oscillation (QBO), is constantly improving. As we learn more about these phenomena, forecasts will improve, as will our ability to identify trends and likely outcomes.

What can we expect this season?

The Indian Ocean Dipole is an oscillation in sea surface temperatures between the eastern and western Indian Ocean. It has trended positively since the beginning of the year to a neutral phase and is forecast to remain neutral into 2015. Indonesia is historically wet during a negative phase, so we are unlikely to observe the same pattern that was characteristic of winter 2013-2014.

Current forecasts indicate that we will observe a weak central-Pacific El Niño this winter. Historically speaking, this has led to colder winter temperatures over northern Europe, with a blocking system drawing cooler air from the north and northeast.

The influence of ENSO on the jet stream is less well-defined but potentially indicates that storms will be steered along a more southerly track. Lastly, the QBO is currently in a strong easterly phase, which tends to weaken the polar vortex as well as westerlies over the Atlantic.

Big losses can occur during low-activity seasons

Climatic features like NAO, ENSO, and QBO are indicators of potential trends in activity. While they provide some insight, (re)insurers are unlikely to use them to inform their underwriting strategy.

And, knowing that a season may have low overall winter storm activity does not remove the risk of having a significant windstorm event. For example, Windstorm Klaus occurred during a period of low winter storm activity in 2009 and devastated large parts of southern Europe, causing $3.4 billion in insured losses.

Given this uncertainty around what could occur, catastrophe models remain the best tool available for the (re)insurance industry to evaluate risk and prepare for potential impacts. While they don’t aim to forecast exactly what will happen this winter, they help us understand potential worst-case scenarios, and inform appropriate strategies to manage the exposure.

When Did Windstorms Become So Wet?

Looking back to the start of the European windstorm season, my colleague Brian Owens pondered whether the insurance industry would experience a windfall or a windy fall. Well, a week into February, I think all observers would agree that this has been a very active season.

As the industry continues to count the cost of the succession of systems that have assaulted our shores, it is apparent that the accumulated losses over the season will make this a year from which much can be learned.

The storms impacting northern Europe have frequently brought damaging winds to coastal areas, occasionally exceeding 90 mph in the most exposed areas.

However, the driving jet stream has typically been very strong to the west but tapered off in the northeast Atlantic. This has caused systems to explosively deepen and mature before reaching the U.K. and Ireland, but then decay as they approached these shores. Consequently, the long storm paths have driven higher waves and storm surges, but that late decay, even for extremely deep cyclones, has meant less damaging winds. This has thus far spared Atlantic-facing countries from extreme wind losses.

But as the season has developed, the main story hasn’t been storm gusts. Anyone living in or visiting the U.K. this winter can testify that it has been exceedingly wet. Not just from excessive rainfall, but from repeated coastal inundations from storm surges combined with high tides as well. Consequently inland and coastal flooding has been significant, dominating our attention.


The persistent rainfall since December has caused river catchments such as the River Severn and Somerset Levels to swell, particularly across southern England and Wales. Groundwater reservoirs and soils are also saturated, leading to pluvial and groundwater flooding.

However, perhaps most interesting this season has been the surge-driven coastal flooding. Storm surges occur when strong winds force the underlying water toward the coast. As the surge develops, water levels are influenced by the shape of the coastline and tidal interactions, both of which can act to amplify surge heights and resulting coastal flooding.

While property damage has not yet reached the scale of prior major flood incidents in the U.K., this series of events highlights the importance of evaluating the complete flood cycle, from the initiating precipitation and antecedent conditions to the final mode of flooding, as seen during the 2012 U.K. flooding.

With tidal ranges as large as 15 m in the U.K., the timing of the surge is vital for determining the scale of the hazard. Surges that impact a region at high (spring) tide pose the most risk for flooding. The storms impacting northern Europe this winter have consistently coincided with some of the highest tides of the year.
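The surge component in tide-gauge charts like the one below is simply the residual between the observed water level and the astronomically predicted tide, and the hazard peaks when a large residual lands on a high tide. A minimal sketch, using hypothetical gauge readings (the values in metres are invented for illustration, not from any actual gauge):

```python
# Hypothetical tide-gauge readings (metres) over five observation times.
predicted = [1.2, 2.8, 4.1, 2.9, 1.0]  # astronomical tide prediction
observed  = [1.5, 3.4, 5.2, 3.6, 1.4]  # actual recorded water level

# Surge residual = observed minus predicted, at each reading.
surge = [round(o - p, 2) for o, p in zip(observed, predicted)]
print(surge)

# The worst coastal-flood hazard is a large residual coinciding with
# high tide; here we find the reading with the biggest surge and note
# the tide level it arrived on.
peak = max(zip(surge, predicted))
print(peak)  # (largest surge, tide at that moment)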

Level of surge (green), relative to actual (blue) and predicted (red) storm-tide


Beginning with Windstorm Xaver in December, the U.K. east coast and coastal locations in Germany were given their sternest test since the devastating 1953 and 1962 events. Fortunately coastal defenses have been improved since those historical floods and the subsequent flooding was not significant.

Numerous systems have continued to arrive through January, with southeast England and Wales, Ireland and northern France particularly affected. As recently as last week, Windstorms Petra and Ruth brought yet more coastal damage and flooding, and the risk of more flooding remains high this week.

As with the wind and inland flood impacts of each individual storm, the coastal damage may not be viewed as significant in isolation. Consequently, specific storms from this season may not stick in the memory like 87J has. But the accumulating damage and cost of this continuous series of events has made this a season to remember.

It has also posed a question around how we as an industry evaluate our wind and flood risk. Do we evaluate these perils in isolation, or do we consider the correlation between them in the winter months? It is a question that may become more prominent as the future of flood insurance in the U.K. evolves.

A Tale of Two Storms

“Horror and confusion seized upon all, whether on shore or at sea: no pen can describe it; no tongue can express it; no thought conceive it…”

Those were the words of Daniel Defoe in "The Storm", which he published the year following the great 1703 windstorm, an event that saw its 310th anniversary on December 7. This event truly was a great storm, estimated to be one of the strongest windstorms to impact the UK.

RMS performed an innovative footprint reconstruction and estimates that wind speeds up to 110 mph were experienced across an area the size of Greater London. These speeds are 30-40 mph stronger than those brought to the UK recently by windstorm Christian and are comparable to a category 2 hurricane. Such speeds can cause considerable damage, particularly to inadequately designed and constructed properties.

January also sees the 175th anniversary of the Irish "Oiche na Gaoithe Moire", or "The Night of the Big Wind" for those who don't speak Gaelic.

Reports of the precise meteorological characteristics of this storm are unclear, but analyses of the event estimate that wind gusts in excess of 115 mph occurred and maximum mean wind speeds could have reached 80 mph. At the time it was considered the greatest storm in living memory to hit Ireland and its intensity may not have been rivaled since.

However, other than an interesting history lesson, is there anything valuable to note from these events from an insurance industry perspective?

Both events were severe European windstorms that caused significant widespread damage, and both would cause significant losses if they recurred today.

Hubert Lamb's unique study analyzing historic European windstorms over a period of 500 years places these events in the top grade of severity, at numbers 4 and 6 in his severity index. RMS estimates that a recurrence of the 1703 storm would cause an insured loss in excess of £10 billion ($16 billion).

A feature of both events at the time was the extensive and widespread damage to roofs. The 1703 event left tiles and slates littering the streets of London and the 1839 event caused parts of Dublin to look like a “sacked city”.

Roof damage was in part due to poor construction, lack of maintenance and inadequate design for the wind speeds experienced. This is a significant consideration today. Across Europe, design codes in relation to wind damage vary significantly and are a key source of uncertainty when modeling wind vulnerability.

Similar risks and construction types can perform quite differently comparing the north and south of the UK or Ireland. Properties further north experience higher wind speeds more frequently and are generally better prepared. Historically adopted construction practices and older buildings that pre-date many of the building codes and design guidance existing today further complicate the issue.

Another feature of both events was the extent of severe damage, which led to inflated repair costs due to the demand for materials and labor. These were early examples of what we now refer to as post-event loss amplification (PLA). From an insurance perspective, we consider inflated "economic" costs (i.e., temporary shortages of materials and labor) and also inflation of claims due to relaxed claims-processing procedures after an event.

While events today exhibit different forms of PLA compared to historical events, it is clear that PLA has potentially always been an issue after large events, so we need to continue studying this phenomenon to understand possible future costs. For example, many companies now establish mitigating measures, such as pre-event contracts guaranteeing services, should an event occur.
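One common way to represent PLA in a loss model is as a multiplier that grows with the size of the industry event, reflecting demand surge for labor and materials. The sketch below is a hypothetical step function with invented thresholds and factors, chosen only to show the shape of the idea; it is not an RMS parameterization.

```python
def pla_factor(event_loss):
    """Hypothetical demand-surge step function: the bigger the industry
    event loss (in currency units), the scarcer labor and materials
    become, and the more each repair costs."""
    if event_loss < 5e9:
        return 1.00   # small event: no measurable amplification
    if event_loss < 2e10:
        return 1.10   # mid-size event: repair costs inflate ~10%
    return 1.25       # very large event: severe shortages, ~25%

def amplified(event_loss):
    """Ground-up event loss scaled by its own amplification factor."""
    return event_loss * pla_factor(event_loss)

print(amplified(1e9))   # below the first threshold: unchanged
print(amplified(1e10))  # mid-size event: amplified by 10%
print(amplified(5e10))  # extreme event: amplified by 25%
```

Real PLA treatments are considerably richer (region-dependent, time-dependent, split between economic and claims-handling inflation), but even this crude step function makes the key property visible: amplification compounds precisely on the largest, tail-driving events.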

For 300 years we have observed common factors across windstorms in Europe and there are lessons to learn from each event. However, the key to being prepared in the future is to:

  • Monitor changing trends
  • Maintain an accurate and up-to-date representation of exposure at risk
  • Understand how losses behave when events occur

Rumbling Below the Waves

Which of the following would you say has the shortest odds?

a) Someone getting injured by a firework
b) A meteor landing on a house
  c) Someone being struck by lightning
  d) A tsunami striking the east coast of Japan
e) A person being on a plane with a drunken pilot

Disturbingly, option e) has the shortest odds, at 117 to 1. You may be surprised to hear that option d) is next: the earthquake that led to the 2011 Japan tsunami had an annual probability of approximately 600 to 1 (the other odds: a) 19,500 to 1; b) 182 trillion to 1; c) 576,000 to 1).
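Odds quoted as "X to 1" against convert to probabilities as 1/(X+1), which makes the ranking above easy to check. A quick sketch using the figures quoted (the dictionary keys are just labels chosen here):

```python
# "X to 1" odds against, as quoted in the text above.
odds = {
    "drunken pilot": 117,
    "Japan tsunami (annual)": 600,
    "firework injury": 19_500,
    "lightning strike": 576_000,
    "meteor on house": 182_000_000_000_000,
}

# Probability of an event at "X to 1" against is 1 / (X + 1).
prob = {name: 1 / (x + 1) for name, x in odds.items()}

# Rank scenarios from most to least likely.
ranked = sorted(prob, key=prob.get, reverse=True)
print(ranked)
```

Running this confirms the ordering in the text: the drunken pilot is the most likely scenario, with the tsunami-generating earthquake a perhaps uncomfortable second.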

Tsunamis can be devastating when they occur, as we saw when the Indian Ocean tsunami hit in 2004 and more recently with the Tohoku, Japan tsunami in 2011. But before these events, tsunami risk wasn't high on many (re)insurers' agendas.

It’s one of those risks that many would place in the upper left green section of the frequency / severity risk map below, requiring periodic attention.

The first step in managing risk is to identify and categorize it.


Some other natural catastrophe risks, such as earthquake, fall near this region but generally garner immediate attention. So what made tsunami different?

Tsunamis have a particularly low frequency, especially when only considering events that have impacted developed regions. In addition, limited data availability and complexities associated with modeling this hazard meant that it was a risk that the industry was aware of but didn’t necessarily evaluate.

The lack of data was highlighted by the Tohoku event. An earthquake of the magnitude observed was not anticipated for the subduction zone off the east coast of Japan. The maximum projected earthquake magnitude was 8.3, with accompanying expected tsunami heights not as large as those experienced. Sea walls built along northeast Japan’s coastal towns, such as in Minamisanriku, Miyagi, weren’t designed to protect against the tsunami that occurred.

This devastating event brought tsunami risk into sharp focus but the questions we must now ask are:

  • Where will the next tsunami-generating great earthquake be?
  • How can we manage this risk?

An interesting conundrum surrounding the first question is the number of very large earthquakes that we have observed recently.

Before the 2004 Indian Ocean tsunami, the last earthquake greater than magnitude 8.6 occurred 40 years earlier (the Mw 8.7 Rat Islands, Alaska, earthquake of 1965). However, since 2004 we have observed five earthquakes equal to or greater than Mw 8.6. It's unclear whether we have underestimated the potential for large earthquakes or are just observing a random clustering of large events.

As research continues into the frequency and occurrence of these events, perhaps the best approach is to focus on understanding hazard hot spots. Most devastating tsunamis are generated by earthquakes in subduction zones and incidentally, subduction zones are where most great earthquakes (Mw 8.7+) have been observed.

Since 1900, all observed great earthquakes have been generated on shallow subduction zone "megathrust" faults. It is therefore vital to understand where these earthquakes occur and the potential associated tsunami scenarios.

Looking back to our risk map, risks with this frequency / severity combination may not stop (re)insurers providing cover. However, assessing scenarios will help them understand their tail risk and manage potential accumulations, which may lead to stricter underwriting guidelines and policy terms in high-risk zones.

Typhoon Haiyan recently highlighted the devastation caused by coastal inundation. On that occasion the inundation came from storm surge, yet the city of Tacloban is vulnerable to tsunamis of even greater height, as noted by Robert Muir-Wood in his recent post.

November 25th marked the 180th anniversary of another great earthquake to strike the region of Sumatra, Indonesia; the 1833 event occurred just south of the location of the devastating 2004 earthquake and also caused a significant tsunami.

These events remind us that while tsunamis may be an infrequent hazard, coastal inundation can be devastating; such events have occurred in the past and will occur again in the future.

Although the industry may not know when the next event will occur, tools like accumulation scenarios can help (re)insurers explore the risk, understand where their exposure to tsunami is greatest, and evaluate how best to manage it.

What Lies Beneath?

As we approach the first anniversary of Superstorm Sandy, I’ve been reflecting on my own experience of the event.

Living in New York at the time, I sat tight in my apartment as the storm headed toward the New Jersey coastline. A meteorologist at heart, I watched with concern and fascination as the disaster unfolded on TV, until my power cut out.

The following morning, with no power and most of lower Manhattan shut down, I took a walk downtown to explore the impact of the storm.

I passed many downed trees and the signs of flood inundation from the surge were clear to see.


Downed trees in the Village on Houston Street, NYC, after Sandy.

As I walked down Broad Street in the financial district, a very noticeable consequence of the flooding could be smelled in the air and observed across the ground. An oily sheen covered the street as basement oil tanks in commercial buildings in the area had flooded and leaked, their contents subsequently spread by the floodwaters.

Bentley parked in Tribeca. The back seat shows signs that the whole car had flooded.

In the year after Sandy, this contamination issue has also been observed in other flood events.

After the significant summer flooding that impacted central Europe, RMS sent a reconnaissance team to inspect the damage. Basement-level heating tanks leaking oil were commonly observed, adding replacement and decontamination costs to the cleanup.

Contamination on a much larger scale occurred three months later, in the devastating Colorado floods. During this event, floodwaters reached oil and gas wells in the region, prompting concerns over contamination and significant potential environmental and financial costs.

Water pumping in the financial district, NYC, after Superstorm Sandy.

While the physical damage and business interruption from flood events are significant, each of these events highlights how important the issue of contamination can be. Contaminated properties will take longer and cost more to repair but the negative environmental and health consequences can also be significant both in their impact and cost.

Contamination coverage may not be included in all property insurance policies, but where it is provided, it could represent an unexpected additional cost from these events. However, it is the potential liability cost associated with this hazard that should perhaps be of most concern to the insurance industry.

Various forms of advice exist surrounding how to design properties to protect them against flood damage, but there is no guarantee that a risk will be compliant with a proposed guideline. The onus must be on the insurance industry to fully understand the risks they are providing coverage for.

Contamination poses an issue for the insurance industry, as modeling this risk would be very complex. The mode of damage and probability of occurrence will be difficult to represent, and the combination of policy terms stretches beyond the realms of existing solutions.

It has been widely noted in recent years that a proportion of global insured losses is not modeled, whether because the peril itself is unmodeled or because a component of the loss from a modeled peril is.

Looking to the future, the industry will need tools that have the potential to evaluate all sources of risk: the exposures, the relevant policy terms, and the non-modeled sources of loss.

While the industry may not be able to avoid surprises in the future, such as a large contamination loss, with improved technology (re)insurers should at least be equipped with the tools to explore such potential surprises.

Diving into Flood Risk

Central Europe is still recovering from the massive flooding that followed one of the wettest months of May in recorded history for this region. In late May and early June, a period of intense rainfall caused major river systems, including the Danube and Elbe, which were already flowing above normal levels, to swell rapidly and burst their banks. Further showers and thunderstorms raised river levels higher, bringing localized flash floods.

Flooding near the source of precipitation occurred rapidly but destruction spread as the flood wave propagated downstream over the following week, impacting numerous cities over a vast area. Dike overtopping and breaching was common during this event, with significant on-floodplain flooding. Large areas of Germany, Austria, and the Czech Republic were seriously affected, while Switzerland, Poland, Slovakia, Hungary, Croatia, and Serbia were also impacted to lesser extents.


June 10, 2013: flooding and overflow of the banks of the Elbe along its lower course at Havelberg in Saxony-Anhalt. (Source: CC BY-SA 3.0)

According to Aon Benfield, the Central Europe floods were the costliest economic disaster during the first half of 2013—and the costliest to insurers, with expected payouts of US$5.3 billion (£3.5 billion) or more, with Germany experiencing the greatest loss.

This was a very different type of event to last year’s 2012 U.K. floods, where the sheer persistence of rainfall throughout the year saturated the soil and raised groundwater levels. Consequently, numerous small-scale, off-floodplain floods were observed, with surface water or pluvial flooding composing a significant component of the total insured loss.

Although most individual event losses were not notable, 2012's estimated accumulated loss reached US$1.8 billion (£1.2 billion), the second-largest U.K. flood-related insured loss after the 2007 floods.

The contrasting nature of the past two years of attritional and catastrophic European floods raises the question: How can the insurance industry effectively manage flood risk in the face of such a widespread and variable peril? And, what role should the government play in mitigating the risk?

In part, government spending on flood defenses helps to manage the hazard, as demonstrated in Prague this year, where new defenses successfully protected vulnerable locations. However, budgets for such schemes are finite, meaning some regions will remain vulnerable, and defense schemes are often targeted towards protecting against flooding on the floodplain or at the coast, which will not protect against the type of inland flooding observed in the U.K. in 2012.

For properties that remain in vulnerable locations, affordable insurance is their only means of protection, but the frequency and severity of the hazard at such locations make it challenging for the insurance industry to offer affordable cover.

There are positive initiatives on the horizon, such as the proposed Flood Re scheme in the U.K. (a government- and industry-backed flood pool), which is planned to replace the existing Statement of Principles (a voluntary commitment by the U.K. insurance industry). Making such schemes work requires a comprehensive evaluation of the hazard; a piecemeal approach will not suffice.

As demonstrated in the U.K. in 2012, it is necessary to consider all sources of flooding, on and off the floodplain, including surface water flooding. The sheer scale of the 2013 Central European floods also showed that this peril can't be viewed through the constraints of geographical boundaries; rather, large-scale catchments must be assessed.

In 2015 RMS will release a new and expanded pan-European inland flood model on RMS(one). The model will cover 13 at-risk countries, including those affected in 2012 and 2013, to comprehensively and consistently evaluate the risk from all sources of inland flooding, considering all underlying aspects of the hazard, at a high resolution across Europe. The scale and resolution of this model will, for the first time, enable the (re)insurance industry to assess European flood risk in a coherent manner, with the level of detail required to manage this dynamic and extreme peril.