The Blizzard of 2016: The Historical Significance of Winter Storm Jonas

Many of us in the Northeastern U.S. remember the Blizzard of 1996 as a crippling winter weather event that dumped multiple feet of snow across the major cities along the I-95 corridor: Washington, D.C., Philadelphia, New York City, and Boston.

Twenty years later, another historic winter storm has joined the 1996 event in the record books, this time in a more socially connected world where winter storms are given names that bear a resemblance to a popular boy band (thanks to The Weather Channel’s naming of Winter Storm Jonas).

Blowing and drifting snow on cars parked in an unplowed street in Hoboken, New Jersey at the height of the storm.
Credit: Jeff Waters, Meteorologist and resident of Hoboken from RMS

The Blizzard of 1996 saw (in today’s terms) an economic loss of $4.6 billion as well as insured losses of $900 million for the affected states, which corresponds to a loss return period within the RMS U.S. and Canada Winterstorm Model of roughly 10 years. Many of the same drivers of loss in the 1996 event were evident during Winter Storm Jonas, including business interruption caused by halted public transportation services in cities like Washington, D.C., and New York City, as well as coastal flooding along the New Jersey shorelines.

The Blizzard of 2016 dropped as much as 40 inches of snow in some regions of the Mid-Atlantic, causing wide-scale disruption for approximately 85 million people, including over 300,000 power outages, the cancellation of almost 12,000 flights, and a death toll of at least 48 people. The table below summarizes snowfall amounts from this year’s storm in three major Northeast cities, compared with similar historic winter storms. These snowfall ranges correspond to a 10–25 year hazard return period for the affected areas, according to RMS.

U.S. City        Blizzard of ’16   Blizzard of ’96   Snowpocalypse ’10   February 2006
Philadelphia     22.4in/567mm      30.7in/780mm      28.5in/724mm        12.0in/305mm
Baltimore        29.2in/742mm      26.6in/676mm      25.0in/635mm        13.1in/333mm
New York City    26.8in/681mm      20.5in/521mm      20.9in/531mm        26.9in/683mm

*Data from the National Weather Service
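For readers less used to return-period language, the conversion to probabilities is straightforward. As an illustrative sketch (not RMS model output), the annual exceedance probability of a T-year event is 1/T, and the chance of at least one such event over a longer horizon follows if years are assumed independent:

```python
def exceedance_prob(return_period_years, horizon_years):
    """Chance of at least one event at or above the given return-period level
    occurring within the horizon, assuming independent years."""
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** horizon_years

# A "1-in-25-year" snowfall has roughly a one-in-three chance of recurring
# at least once within any given decade:
print(round(exceedance_prob(25, 10), 2))  # 0.34
```

This is why a 10–25 year return period event, while notable, should not be treated as a once-in-a-career surprise.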

Another comparison between the 1996 and 2016 events comes from NOAA, which has ranked the two storms on the Northeast Snowfall Impact Scale (NESIS); the scale takes into account population and societal impacts in addition to meteorological measurements.

There were warning signs in the week leading up to the event that a major nor’easter would creep up the east coast and “bomb out” just offshore. This refers to a meteorological term known as “bombogenesis” in which the pressure in the center of the storm drops rapidly, further intensifying the storm and allowing for heavier snow bands to move inland at snowfall rates of 1-3 inches per hour.
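The "bomb out" threshold has a standard quantitative form. As a hedged sketch based on the classic Sanders and Gyakum (1980) definition (not anything specific to this storm), a cyclone undergoes bombogenesis when its central pressure falls by at least 24 hPa in 24 hours, with the threshold scaled by latitude:

```python
import math

def is_bomb_cyclone(pressure_drop_hpa, hours, latitude_deg):
    """Apply the classic Sanders & Gyakum (1980) 'bomb' criterion:
    a central pressure fall of at least 24 hPa in 24 hours,
    with the threshold scaled by sin(latitude) / sin(60 deg)."""
    rate_per_24h = pressure_drop_hpa * 24.0 / hours
    threshold = 24.0 * math.sin(math.radians(latitude_deg)) / math.sin(math.radians(60.0))
    return rate_per_24h >= threshold

# A hypothetical storm near 38N deepening 30 hPa in 24 hours easily
# clears the latitude-adjusted threshold (about 17 hPa at 38N):
print(is_bomb_cyclone(30.0, 24, 38.0))  # True
```

The numbers here are illustrative; the point is that "bombing out" is a defined rate of deepening, not just forecaster shorthand.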

One of the things that will make the Blizzard of 2016 one to remember is the snowfall rates of more than 1” per hour that affected several major cities for an extended period of time. According to The Weather Channel, “New York’s LaGuardia airport had 14 hours of 1 to 3 inch per hour snowfall rates from 6 a.m. Saturday until 8 p.m. Saturday.”

Another unique aspect of the storm was the reports of a meteorological phenomenon known as “thundersnow,” which according to the National Severe Storms Laboratory (NSSL), “can be found where there is relatively strong instability and abundant moisture above the surface.”

The phenomenon known as “thundersnow” was reported during Winter Storm Jonas and was captured by NASA astronaut Scott Kelly in this picture of lightning taken from a window on the International Space Station.
Credit: NASA/Scott Kelly via Twitter (@StationCDRKelly)

Time will tell how the Blizzard of 2016 compares to historic storms like the Blizzard of 1996 from an economic loss perspective, but the similarities in terms of unique weather phenomena as well as the heavy snowfall amounts across the major Northeastern U.S. cities will keep Jonas in the conversation.

An event like Winter Storm Jonas, coupled with the crippling Boston snowstorms of last year, makes it very important for those of us in the catastrophe risk space to understand the drivers and quantitative impacts of such storms. Fortunately, weather forecasting capabilities have improved substantially since the Blizzard of 1996, but it’s important to further understand the threats that winter storms pose from an insured loss perspective. Please reach out to Sales@rms.com if you are interested in learning more about RMS and our suite of winter storm models.

Just How Unlucky Was Britain to Suffer Desmond, Eva, and Frank in a Single December?

Usually, it’s natural disasters occurring elsewhere in the world that make headlines in Britain, not the other way around. But you’d have to have been hiding under a rock to have missed the devastation wrought by flooding in the U.K. last month, thanks to the triple whammy of storms Desmond, Eva, and Frank. Initial analysis from the Association of British Insurers suggests that the damage done could be in the region of £1.3bn.

But just how unlucky was the U.K. to suffer not just one, or two, but three big storms in a single December, and for those storms to interact in such a way as to produce the chaos that followed?

First it’s worth pointing out that floods in the U.K. are—as is usually the case elsewhere—subject to important seasonal variation (see chart below). The winter months bring the highest number of events, and December does in fact come out (slightly) on top, especially for flooding events of the sort seen last month, which tend to follow heavy rainfall leading to soil saturation (November 2015 received about twice the average climatological rainfall for November in the U.K.).

Source: RMS

The reason this matters is that, when soil is sodden following an extended period of heavy rain, further rainfall runs off the surface more easily, exacerbating the risk of pluvial flooding. The water then follows natural and artificial drains until it reaches the nearest river network, where it can accumulate, potentially triggering river or “fluvial” flooding. Sustained saturation can also raise the water table, causing what is known as groundwater flooding. This cumulative phenomenon means that—as we saw in December—flooding can persist for a significant amount of time, leading to several flood events in close succession.

A flood CAT model that properly captures these sorts of interactions between rainfall events and hydrological systems will allow not just for an assessment of the likelihood of a single severe event, but also a better understanding of the compounding factors that can lead to the sort of flooding seen in the U.K. last month. And based on our latest RMS pan-Europe flood model, the chances of having three rainstorms lead to major inland flooding over a single December are far from negligible.

Source: RMS Europe Flood Model

The chart above shows the probability of one, two, three, and four flood events in the month of December. On average, one in every two Decembers in the U.K. has at least one flood event, and one in every three has exactly one. Around every eight years there is a December with two flood events, and a cluster of three flood events happens roughly once every quarter-century.
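These frequencies can be compared against the simplest possible baseline: treating December flood events as independent Poisson arrivals. This is an illustrative sketch, not the RMS model (which captures dependence between events); the gap between an independence baseline and the modeled figures is itself one signature of clustering.

```python
import math

def prob_at_least(k, lam):
    """P(N >= k) for N ~ Poisson(lam), via 1 minus the CDF up to k - 1."""
    cdf = sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k))
    return 1.0 - cdf

# Calibrate the December event rate so that half of all Decembers
# see at least one flood event: lam = ln(2) per December.
lam = math.log(2.0)

for k in (1, 2, 3):
    p = prob_at_least(k, lam)
    print(f"P(>= {k} events in a December) = {p:.3f} (about 1-in-{1 / p:.0f})")
```

Under pure independence, three-event Decembers come out somewhat rarer than the modeled once-per-quarter-century figure, which is consistent with the observation that flood events tend to arrive in clusters.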

Now, this does not mean that flooding on the scale just witnessed happens on average every 25 years—just that this is the average period for seeing three flood events in one December. Even if it did, it wouldn’t mean that the U.K. can rest on its laurels until 2041… this is just a statistical average. It is quite possible for clusters to hit several years in a row—a so-called “flood-rich period”.

This gets to the real nub of the issue. The question of how often this sort of flooding takes place in the U.K. is almost by-the-by. The point is that it isn’t as rare as hen’s teeth, and so the U.K. needs to be prepared. And what was most shocking about December wasn’t the flooding itself so much as the sheer lack of resilience on display. A media storm has understandably been whipped up regarding the level of investment in flood walls and so on, but protective infrastructure is only part of the equation. What is needed is not just flood walls (which can actually be counterproductive on their own), but a wider culture of resilience. This includes flood warning systems, regular evacuation drills, citizens having personal plans in place (such as being ready to move furniture to upper levels in the case of an alert) and, critically, the ability to respond and recover should the defences fail and the worst happen (which is always a possibility). The U.K. is the world’s sixth richest country—it has the resources to cope with flood events of this magnitude… whether they happen every five, ten, or 25 years.

Can Flood Walls Reduce Resilience?

In early December 2015, Storm Desmond hit, bringing an “atmospheric river” to the northwest of England, with its headwaters snaking back to the Caribbean. It broke the U.K.’s 24-hour rainfall record, with 341.1mm of rain recorded in Cumbria.

Just three weeks later, while a great anticyclone remained locked in place over central Europe and the atmospheric flows had shifted south by only 150km, Storm Eva arrived. The English counties of Lancashire and Yorkshire were drenched on December 26, and the media was once more overwhelmed with flood scenes—streets of Victorian-era houses inundated by 30-40cm of slow-moving water.

Journalists soon turned their attention to the failure of flood protections in the affected regions. In one interview in Carlisle, a beleaguered Environment Agency representative commended their defenses for not having failed—even when they had been overtopped. If the defenses had failed, maybe the water would not have ponded for so long.

The call for “resilience”

The call has gone out worldwide for improved “resilience” against disasters. As outlined by the UN Secretary General’s Climate Resilience Initiative, resilience is defined as the ability to “Anticipate, Absorb and Reshape” or “A2R”.

How did the U.K.’s flood defenses match up to these criteria in December? Well, as for the two “A”s in A2R, the residents of Carlisle did not anticipate any danger, thanks to the £38 million spent on flood defenses since the last time Carlisle had a “1 in 200 year” flood in January 2005 (which hit 1,900 properties). And the only thing the houses of Carlisle were absorbing on the first weekend in December was the flood water seeping deep into their plaster, electricals, and furnishings. As for “reshaping”, beyond the political recriminations, now is the time for some serious thinking about what constitutes resilience in the face of floods.

A flood wall is not the same as resilience. Resilience is about the capacity to recover quickly from difficulties, to bounce back from adversity. Organizations such as the UK’s Environment Agency may be good at building flood defenses, but not so proficient at cultivating resilience.

A flood wall can certainly be part of a culture of resilience—but only when accompanied by regular evacuation drills, a flood warning system, and recognition that, despite the flood wall, people still live in a flood zone. Because flood walls effectively remove the lesser, more frequent floods, the small risk reminders go away.

A growing reliance on the protection provided by flood walls may even cause people to stop believing that they live in a flood plain at all, and think that the risk has gone to zero, whether this is in New Orleans, Central London or Carlisle.

Even when protected by a flood wall, residents of river flood plains should be incentivized, through grants and reduced insurance rates, to make their houses resistant to water: tiling walls and floors and raising electrical fittings. They should have plans in place—such as being ready to carry their furniture to an upper floor in the event of an alert—as one day, in all probability, their houses will flood.

Given the U.K.’s recent experience, we should be asking: are people becoming more resilient to their flood risks? It sometimes seems that the more flood walls we build, the less resilient we become.

Insurers Need a “Dual Horizon” View of Risk To Manage Climate Change

Last week, as a participant on the Joint OECD/Geneva Association Roundtable on Climate Change and the Insurance Sector, I had the opportunity to outline the (re)insurance industry’s critical role in the financial management of climate catastrophe events.

Source: COP21 Make It Work/Sciences Po

The meeting, held in Paris during The 21st Conference of the Parties (COP21) to the United Nations Framework Convention on Climate Change, gave rise to a thought-provoking discussion of the many ways in which the insurance industry will need to engage with the challenges of climate change and in particular extend its “risk horizon.”

A next generation perspective of risk

For centuries we have assumed that sea level and climate stay the same. But now we must prepare for a world of constant change. A good way to start is by developing dual horizons—today and a generation away—for how we think about risk.

Today the focus of the insurance industry is short-term. Most contracts are for a single year, and securitizations might run for three years, but no one is looking beyond five years–what at RMS we call the “Medium Term.”

But as our world continues to warm and the catastrophe risk landscape evolves, we need a “next generation perspective” of risk: an additional forward-looking perspective focused 15–35 years in the future.

Today’s (re)insurers should expect to develop plans for how they would function in a world where there is an explicit cost of carbon and more intense catastrophes from droughts to floods. Everything we build today, from city center high rises to coastal infrastructure, will still exist but in a more extreme catastrophe event environment. Already the U.K. and French insurance regulators are starting to ask questions of their supervised firms as to how their businesses would function in such a future.

In this next-generation perspective insurers will have to play an increased societal role.  Today, property owners assume that insurance will always be available. In our future world, that may become an unreasonable expectation. When determining where, and at what elevation, people can build in the flood plain, we should consider the risk over the whole future lifetime of the property, not simply the risk when the property is built.

This will require us to develop two defined datums: one for the current location of the 100-year flood zone, and a second “Next Generation” datum showing where the 100-year flood hazard is expected to be 30+ years in the future. As highlighted by the December 2015 floods in Carlisle, northern England, flood protection already needs to consider how climate change is shifting the probabilities. When a building is constructed above the Next Generation flood datum, a lifetime’s insurability may be guaranteed. These dual-horizon datums will need to be objectively and independently defined, and insurers will want to be involved in determining what gets built and where.

The role of the catastrophe modelers

Since 2006, RMS has acknowledged that it is no longer safe to assume that the activity of any catastrophe peril is best defined by the average of the past fifty or one hundred years of history. What then becomes the basis for determining activities and severities? We have committed more than ten person-years of research to exploring what gives us the best perspective on current activity, with a focus on Atlantic hurricanes. However, we will need to apply the same thinking to all the climate perils.

All states of the climate contain a wide spectrum of extremes. If the climate changes, we can expect that spectrum of extremes to change. In a climate hazard catastrophe model we want to know the best representation of that activity, including the uncertainty in that estimate.

Our value to our clients comes from our true independence. This value also extends beyond the insurance industry, to providing a neutral perspective on risk to rating agencies and governments. RMS models are used by both insurers and reinsurers. They are employed for issuing securitizations and for portfolio management by investors in cat bonds. In every risk transaction, the party taking the risk will be more pessimistic than the party giving up the risk. We have a key role to play in providing a neutral science-based perspective.

Why We’re Getting Better at Changing the World

Photo by Blue Origin

Let’s start by celebrating goal achievement.  Last week, Jeff Bezos’ Blue Origin rocket successfully took off—and then returned back to Earth.  Mission accomplished. But, from 100 kilometers above the Earth’s surface, looking back at our planet, what’s the state of our global goal achievement?  Do we even have goals?

I ask this as over 50,000 delegates descend on Paris this week for the 21st Conference of the Parties to the United Nations Framework Convention on Climate Change. Or more simply, COP21.

COP21 offers the unique prospect of 196 countries achieving a (sort of) legally binding agreement on climate change: to keep global warming below 2°C by reducing greenhouse gases. But what would a “good goal” for COP21 look like and would it ever be achieved?

Set your goal. Measure it. Achieve it.

It’s easy to get cynical about achieving BHAGs (Big Hairy Audacious Goals), but maybe a template for success has emerged. In 2000, the United Nations agreed to eight BHAGs through its global Millennium Development Goals. Goal number four was to reduce child mortality by two-thirds within 15 years across 138 developing countries.

A 50% reduction in child mortality in 15 years

In 2010, Hans Rosling’s celebrated TED talk outlined a two-million reduction in annual deaths of children under the age of five within a decade, down to 8.1 million per year. By the end of 2015, we will be below six million deaths per year, almost half the level of 15 years earlier. That’s still too high, and many countries will miss the two-thirds reduction target, but it is nonetheless a huge improvement.


Source: IHME

As Rosling points out, the Millennium Goals were strong due to measurable targets. Clear targets, at individual country level, have driven the ability to lobby for increases in financial resources for clean water, immunization and antibiotics, motivated by strong partnerships and innovations in service delivery.

A goal for climate change

Rio in 1992 is remembered for establishing climate change as being caused by humans, and more specifically, primarily the responsibility of the industrialized countries.

Source: Wikipedia/EU Edgar Database

France, as host, wants to build on the momentum of Rio, but also learn lessons from past summit failures.  So, much of the COP21 framework has been defined and negotiated in advance. Francois Hollande has already agreed with top-polluter China on a mechanism to monitor cuts every five years.

The squabbling that characterized Copenhagen in 2009 will be minimized, and high expectations set for those attending, such as encouraging heads of state to arrive at the start, rather than jetting in at the end.

Frequent communication with every participating country has been critical. As conference chair, Laurent Fabius, the French foreign minister, told the FT last week: “Negotiators sometimes hold firm positions that only ministers can unlock. I know their bosses – I see them all the time. We talk often, it helps.”


Infographic: Who has pledged an INDC so far, and what percentage of the world’s emissions is covered. Credit: Rosamund Pearce, Carbon Brief, based on EU data

Paris will pull out all the stops to get an agreement, but will we be willing to accept the short-term costs and constraints to slow down climate change?  The answer is probably yes.

So what are the lessons for COP21?

As Jeff Bezos’ Blue Origin rocket recycling shows, individual changes in our daily behavior, such as recycling and energy conservation, can affect climate change. Ultimately, though, change needs to be government-led, especially around energy generation and emissions control. Whether world leaders are ready to be held legally accountable for missing their climate goals remains an open question.

Nonetheless, as the Millennium Goals show, clear and well-defined targets, measured annually (“are we there yet?”), create the momentum to drive change forward. The success of COP21 will be defined by whether the emerging sub-goals are specific, measurable, achievable, realistic, and time-bound. Now that would be smart.

Disasters Without Borders

On November 24 and 25, 2015 the first Scientific Symposium was held in London to discuss science for policy and operations for the new “Disaster Risk Management Knowledge Centre.” The Centre was launched by the European Commission in September this year to help member states respond to emergencies and to prevent and reduce the impact of disasters. The Centre will offer EU countries technical and scientific advice, provide an online repository of relevant research results, and create a network of crisis management laboratories. RMS was the only catastrophe modeler invited to present to the meeting.

Jointly organized by the UK Met Office and the European Commission, the symposium exposed some of the tensions between what countries can do on their own and where they require a supranational European institution. The British government contingents were particularly keen to show their leadership. The UK Cabinet Office co-ordinates inputs across government departments and agencies to manage a national risk register, identifying the likelihood and potential impact of a wide range of threats facing the country: from an Icelandic volcanic eruption to a storm surge flood to a terrorist incident. The office of the Chief Government Scientist then leads the response to the latest disaster, reporting directly to the Prime Minister.

These were not responsibilities the UK would ever consider transferring to a new European institution, because they go right to the heart of the function of a government—to protect the people and the national interest. However, no single country can provide total management of events that run across borders, in particular when it is the country upstream that controls what heads your way, as with water storage dams. For this, a Europe-wide agency will be vital. The Centre will be most useful for the smaller European countries, which are unable to sustain research across the full range of hazards or to monitor activity around the clock. However, do not expect the larger countries to share all their disaster intelligence.

Where does RMS fit into this picture? As described at the London symposium, probabilistic models will increasingly be key to making sense of potential disaster impacts and for ensuring organizations don’t become fixated on planning against a single historical scenario. RMS has more experience of probabilistic modeling than any other European science or government agency, in particular in areas such as the modeling of floods and flood defenses or for multi-hazard problems.

Two ideas with the potential for RMS leadership were picked up at the symposium. First, for an intervention such as a new flood defense, the results of the probabilistic model are used to define the “benefits”—the future losses that will not happen. A versatile model is required in which the user can explore the influence of a particular flood defense or even see the potential influence of climate change. Second, we can expect to see a move towards the risk auditing of countries and cities, to show their progress in reducing disaster casualties and impacts, in particular as part of their Sendai commitments. We know that risk cannot be defined from only a few years of disaster data—the outcomes are far too volatile. Progress will need to be measured through consistent modeling. Catastrophe modeling will become a critical tool to facilitate “risk-based government”: from measuring financial resilience to targeting investment in the most impactful risk reduction.

Tianjin Is A Wake-Up Call For The Marine Industry

“Unacceptable”  “Poor”  “Failed”

Such was the assessment of Ed Noonan, Chairman and CEO of Validus Holdings, on the state of marine cargo modeling, according to a recent report in Insurance Day.


China Stringer Network/Reuters

The pointed criticism came in the wake of the August 12, 2015 explosions at the Port of Tianjin, which caused an estimated $1.6–$3.3 billion in cargo damages. It was the second time in three years that the cargo industry had been “surprised”—Superstorm Sandy being the other occasion, delivering a hefty $3 billion in marine loss. Noonan was unequivocal on the cargo market’s need to markedly increase its investment in understanding lines of risk in ports.

Noonan has a point. Catastrophe modeling has traditionally focused on stationary buildings, and marine cargo has been treated as somewhat of an afterthought. Accumulation management for cargo usually involves coding the exposure as warehouse contents, positioning it at a single coordinate (often the port centroid), and running it through a model designed to estimate damage to commercial and residential structures.

This approach is inaccurate for several reasons: first, ports are large and often fragmented. Tianjin, for example, consists of nine separate areas spanning more than 30 kilometers along the coast of Bohai Bay. Proper cargo modeling must correctly account for the geographic distribution of exposure. For storm surge models, whose output is highly sensitive to exposure positioning, this is particularly important.

Second, modeling cargo as “contents” fails to distinguish between vulnerable and resistive cargo. The same wind speed that destroys a cargo container full of electronics might barely make a dent in a concrete silo full of barley.

Finally, cargo tends to be more salvageable than general contents. Since cargo often consists of homogeneous products that are carefully packaged for individual sale, more effort is made to salvage it after it has been subjected to damaging forces.

The RMS Marine Cargo Model, scheduled for release in 2016, will address this modeling problem. The model will provide a cargo vulnerability scheme for 80 countries, cargo industry exposure databases (IEDs) for ten key global ports, and shape files outlining important points of exposure accumulation including free ports and auto storage lots.

The Tianjin port explosions killed 173 and injured almost 800. They left thousands homeless, burned 8,000 cars, and left a giant crater where dozens of prosperous businesses had previously been. The cargo industry should use the event as a catalyst to achieve a more robust understanding of its exposure, how it accumulates, and how vulnerable it might be to future losses.

The Paris Attack Explained: 7 Points

The coordinated armed assaults and suicide bombings in Paris on November 13, 2015 were unprecedented in size and scale. The attacks, which killed more than 125 people and left more than 350 injured, exposed France’s vulnerability to political armed violence and alerted the rest of Europe to the salafi-jihadist threat within its borders.

The Eiffel Tower was lit in the colors of the French flag in a tribute to the victims. Source: Reuters

Here are seven points we found noteworthy about these attacks:

1. Tragic but not surprising

Though tragic, the Paris attacks do not come as a complete surprise to the counterterrorism risk community. The terrorism threat in France is higher than in several other Western European countries, and there have been several other terrorist attacks in France in the last 18 months. These include the attack on December 20, 2014 in Tours; the armed assault at the ‘Charlie Hebdo’ offices in Paris on January 7, 2015; the shootings in Montrouge on January 8, 2015; the hostage siege at a Jewish supermarket in Paris on January 9, 2015; and an attack against three French soldiers in the city of Nice on February 3, 2015. On August 21, 2015, there was also a terrorist attack on the Amsterdam-to-Paris high-speed Thalys (TGV) train service.

What is surprising is the magnitude and scale of the assaults. These attacks were very ambitious: divided into three distinct groups, the militants were able to execute simultaneous strikes on six locations. Simultaneous attacks are very effective because they cause a significant number of casualties before the security services have time to respond. The attacks were also very well coordinated and involved a variety of attack devices, reflecting a sophistication that can only come from some level of military training and expertise as well as centralized control.

2. A well-coordinated attack of unprecedented magnitude and scale

In the first series of attacks, three bombs were detonated near the Stade de France, where a soccer match between France and Germany was taking place. These bombings killed five people, and all three explosions were suicide bombings. One of the attackers had a ticket to the game and attempted to enter the stadium, where he was discovered wearing a suicide bomb vest; he blew himself up upon detection. The second suicide bomber killed himself outside the stadium a few minutes later, while a third detonated explosives at a nearby McDonald’s.

At around the same time, gunmen reportedly armed with AK-47 assault rifles opened fire on a tightly packed Southeast Asian restaurant in a drive-by shooting, killing more than 10 people. Later in the evening there were two other drive-by shootings in different parts of the city, which resulted in the deaths of 23 people. Another suicide bomber detonated his explosives at a cafe along the Boulevard Voltaire, killing himself and injuring 15 customers.

The worst violence occurred at the Bataclan Theater, where four militants took hostages during a concert by an American rock group. Witnesses reported that the attackers launched grenades at people trapped in the theater. All the assailants were reported dead after French police raided the building: three blew themselves up with suicide belts rather than be arrested as the police closed in, while the fourth was shot and killed by the French authorities. More than 80 people are believed to have been killed at the theater.

3. Chosen strategy offers greatest impact

The suicide armed attacks or sieges witnessed at the Bataclan Theater involved a group opening fire on a gathering of people in order to kill as many as possible. As with the Mumbai attacks in 2008, the attackers’ ability to roam and sustain the assault, combined with their willingness to die in the onslaught, makes such attacks more difficult to combat. From the terrorists’ perspective, these assaults offer a number of advantages: greater target discrimination, flexibility during the operation, and the opportunity to cause large numbers of casualties and generate extensive worldwide media exposure.

It is possible that, following the success of Friday’s Paris attacks, suicide armed assaults and bomb attacks will become an even more attractive tactic for terrorist groups to replicate. Such attacks typically target people in crowded areas that lie outside security perimeter checks, such as those of an airport or a national stadium. Probable targets are landmark buildings with a large civilian presence.

4. Use of TATP explosives indicates high levels of experience

Also of interest is the terrorists’ use of triacetone triperoxide (TATP) explosives in the suicide bomb vests deployed at the Stade de France and the Bataclan Theater. TATP is essentially a mixture of hydrogen peroxide and acetone with sulfuric, nitric, or hydrochloric acid. These are chemicals readily available in neighborhood stores. However, TATP is highly unstable and very sensitive to heat and shock; more often than not it will detonate before the desired time. Given the high level of precision and coordination needed to orchestrate these attacks, an experienced bomb maker had to be involved in creating suicide bomb vests stable enough to be used in these operations.

5. Longstanding ethnic tensions fueled

The Islamic State (IS) has claimed responsibility for the catastrophic attacks in the French capital. While these claims have not been officially authenticated, the suicide operations and the synchronized nature of the attacks are consistent with the modus operandi of Salafi-jihadi militant groups such as IS and al-Qaida.

France's military involvement in the Middle East, such as its recent bombing campaigns against IS positions in Syria and Iraq, justifies its targeting in the eyes of the Salafi-jihadi community. Both IS- and al-Qaida-linked groups have previously threatened reprisal attacks against France for its military intervention in the region. On the domestic side, the fact that one of the suicide bombers was a Syrian refugee will further fuel longstanding ethnic tensions in the country. France continues to struggle with poor integration and the perceived marginalization of its large Muslim population. Domestic policies such as the deeply unpopular headscarf ban have contributed to the feelings of victimization voiced by some sections of the French Muslim community.

6. Homegrown terrorists pose a threat

Compounding the threat landscape are indications that many French individuals have traveled to countries such as Syria and Libya to receive paramilitary training. The experience of other Western European countries facing their own homegrown terrorist threats has shown that individuals with foreign training and combat experience can act as lightning rods for local radicalized individuals and provide additional impetus to orchestrate attacks in their homeland. According to the French authorities, around 400 French citizens are believed to be fighting with extremists in Syria, making the French among the largest Western contingents of foreign fighters there.

7. Potential for subsequent attacks

The November 13, 2015 attacks in Paris are the deadliest in Europe since the 2004 train bombings in Madrid, Spain, which killed 191 people and injured more than 1,800.

With regard to the terrorism risk landscape in France, while the suicide bombers are all dead, the drive-by shooters remain at large. Moreover, despite several arrests in Belgium of individuals allegedly linked to the Paris attacks, it is still unclear whether these detentions have broken up the terrorist network that supported them. Thus, in the short term, subsequent attacks in France or neighboring countries cannot be discounted.

Are (Re)insurers Really Able To Plan For That Rainy Day?

Many (re)insurers may be taken aback by the level of claims arising from floods in the French Riviera on October 3, 2015. The reason? A large proportion of the affected homes and businesses they insure in the area are nowhere near a river or floodplain, so many models failed to identify the possibility of their inundation by rainfall and flash floods.

Effective flood modeling must begin with precipitation (rain and snowfall), since river-gauge-based modeling of inland flood risk cannot cope with extreme peaks of precipitation intensity. Further, a credible flood model must incorporate risk factors as well as the hazard: the nature of the ground, such as its saturation level due to antecedent conditions, and the extent of flood defenses. Omitting such critical factors can cause risk to be dramatically miscalculated.

A not so sunny Côte d’Azur

This was clearly apparent to the RMS event reconnaissance team who visited the affected areas of southern France immediately after the floods.

“High-water marks for fluvial flooding from the rivers Brague and Riou de l’Argentiere were at levels over two meters, but flash floodwaters reached heights in excess of one meter in areas well away from the rivers and their floodplains,” reported the team.

This caused significant damage to many more ground-floor properties than would have been expected, including structural damage to foundations and scour caused by fast-moving debris. Damage to vehicles parked in underground car parks was extensive, as many filled with rainwater. Vehicles inundated by more than 0.5 meters of water were written off, all as a result of an event that many insurers had not modeled.

The Nice floods show clearly that European flood modeling must be taken to a new level. It is essential that modelers capture the entire temporal precipitation process that leads to floods. Antecedent conditions, primarily the capacity of the soil to absorb water, must be considered, since a little additional rainfall may trigger saturation, causing "saturation excess overland flow" (runoff). This in turn can lead to losses such as those assessed by our event reconnaissance team in Nice.

Our modeling team believes that to achieve this new level of understanding, models must be based on continuous hydrological simulations with a fine time-step discretization: they must simulate the intensity of rainfall over time and place at a high level of granularity. Our analyses indicate that models not based on continuous precipitation modeling could miss up to 50% of the losses that occur off floodplains, leading to serious underestimation of technical pricing for primary and reinsurance contracts.
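The role of antecedent conditions can be illustrated with a deliberately minimal sketch: a single-bucket soil-moisture store stepped through a rainfall series at a fixed time step. This is a toy model with hypothetical parameters, not the RMS hydrological model, but it shows why the same storm produces dramatically different runoff depending on how wet the soil already is.

```python
# Illustrative single-bucket soil-moisture model (hypothetical
# parameters, not the RMS model): rainfall arriving once the soil
# store is full becomes saturation-excess overland flow (runoff).

def simulate_runoff(rainfall_mm, capacity_mm=100.0, drainage_mm=2.0):
    """Continuous simulation over a rainfall series at a fixed step.

    rainfall_mm -- rainfall per time step (e.g. hourly), in mm
    capacity_mm -- soil storage capacity before saturation, in mm
    drainage_mm -- water drained or evaporated from the store per step
    Returns the runoff generated at each step.
    """
    store = 0.0
    runoff = []
    for rain in rainfall_mm:
        store += rain
        excess = max(0.0, store - capacity_mm)  # saturation excess
        store = max(0.0, store - excess - drainage_mm)
        runoff.append(excess)
    return runoff

# The same 120 mm storm, with different antecedent conditions:
dry_spell = [0.0] * 48 + [20.0] * 6  # storm lands on dry soil
wet_spell = [5.0] * 48 + [20.0] * 6  # storm follows 48 wet hours
print(sum(simulate_runoff(dry_spell)))  # 10.0 mm: most rain soaks in
print(sum(simulate_runoff(wet_spell)))  # 154.0 mm: the soil is saturated
```

A model that only saw the storm itself, without the antecedent wet spell, would miss most of the runoff in the second case, which is exactly the kind of off-floodplain loss described above.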

What’s in a model?

When building a flood model, starting from precipitation is fundamental to reproducing, and therefore modeling, realistic spatial correlation patterns between river basins, cities, and other concentrations of risk, which are driven by the correlation of the underlying precipitation fields. Such modeling of rainfall may also identify the potential for damage from fluvial events.

But credible defenses must also be included in the model. The small, poorly defended river Brague burst its banks due to rainfall, demolishing small structures in the town of Biot. Only a rainfall-based model that considers established defenses can capture this type of damage.

Simulated precipitation forms the foundation of RMS inland flood models, which enables representation of both fluvial and pluvial flood risk. Since flood losses are often driven by events outside major river flood plains, such an approach, coupled with an advanced defense model, is the only way to garner a satisfactory view of risk. Visits by our event reconnaissance teams further allow RMS to integrate the latest flood data into models, for example as point validation for hazard and vulnerability.

Sluggish growth in European insurance markets presents a challenge for many (re)insurers. Broad underwriting of flood risk presents an opportunity, but demands appropriate modeling solutions. RMS flood products provide just that, by ensuring that the potential for significant loss is well understood, and managed appropriately.

European Windstorm: Such A Peculiarly Uncertain Risk for Solvency II

Europe’s windstorm season is upon us. As always, the risk is particularly uncertain, and with Solvency II due smack in the middle of the season, there is greater imperative to really understand the uncertainty surrounding the peril—and manage windstorm risk actively. Business can benefit, too: new modeling tools to explore uncertainty could help (re)insurers to better assess how much risk they can assume, without loading their solvency capital.

Spikes and Lulls

The variability of European windstorm seasons can be seen in the record of recent years. 2014-15 was quiet until storms Mike and Niklas hit Germany in March 2015, right at the end of the season. Though insured losses were moderate[1], had the storms' tracks been different, losses could have been far more severe.

In contrast, 2013-14 was busy. The intense rainfall brought by some storms caused significant inland flooding, though wind losses overall were moderate, since most storms matured before hitting the UK. The exceptions were Christian (known as St. Jude in Britain) and Xaver, both of which caused large wind losses in the UK. These two storms were outliers in a general lull in European windstorm activity that has lasted about 20 years.

During this quieter period of activity, the average annual European windstorm loss has fallen by roughly 35% in Western Europe, but it is not safe to presume a “new normal” is upon us. Spiky losses like Niklas could occur any year, and maybe in clusters, so it is no time for complacency.

Under Pressure

The unpredictable nature of European windstorm activity clashes with the demands of Solvency II, putting increased pressure on (re)insurance companies to get to grips with model uncertainties. Under the new regime, they must validate modeled losses against historical loss data. Unfortunately, companies' claims records rarely reach back more than twenty years: too little loss information to validate a European windstorm model, especially given the recent lull, which has left the industry with scant recent claims data. That exacerbates the challenge for companies building their own view based only on their own claims.
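Why twenty years is too little can be shown with a simple Monte Carlo sketch. Here annual windstorm losses are drawn from an assumed heavy-tailed (lognormal) distribution with hypothetical parameters; the point is only how widely a 20-year average annual loss scatters around the true long-term mean.

```python
import random
import statistics

# Illustrative only: annual losses drawn from a hypothetical
# heavy-tailed lognormal distribution; true long-term mean of
# lognormvariate(0, 1.5) is exp(1.5**2 / 2), roughly 3.08.
random.seed(42)

def sample_mean_loss(years):
    """Empirical average annual loss from a 'years'-long record."""
    return statistics.mean(
        random.lognormvariate(0.0, 1.5) for _ in range(years)
    )

# One thousand alternative 20-year claims histories:
estimates = [sample_mean_loss(20) for _ in range(1000)]
print(min(estimates), max(estimates))
# The 20-year estimates scatter widely around the true mean,
# often off by a factor of two or more in either direction.
```

A company validating a windstorm model against any single one of these 20-year records could easily reject a correct model, or accept a badly wrong one, which is why longer historical wind records and alternative model views matter.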

In March we released an updated RMS Europe Windstorm model that reflects both recent and historic wind history. The model includes the most up-to-date long-term historical wind record, going back 50 years, and incorporates improved spatial correlation of hazard across countries together with an enhanced vulnerability regionalization, which is crucial for risk carriers with regional or pan-European portfolios. For Solvency II validation, it also includes an additional view based on storm activity in the past 25 years. Pleasingly, we're hearing from our clients that the updated model is proving successful for Solvency II validation as well as for risk selection and pricing, allowing informed growth in an uncertain market.

Making Sense of Clustering

Windstorm clustering, the tendency for cyclones to arrive one after another, like taxis, is another complication when dealing with Solvency II. It adds to the uncertainties surrounding capital allocations for catastrophic events, especially given the current lack of detailed understanding of the phenomenon and the limited amount of available data. To chip away at that uncertainty, we have been leading industry discussion on European windstorm clustering risk, collecting new observational datasets, and developing new modeling methods. We plan to present a new view on clustering, backed by scientific publications, in 2016. These insights will inform a forthcoming RMS clustered view, which at this stage will be offered as an additional view in the model rather than becoming our reference view of risk. We will continue to research clustering uncertainty, which may lead us to revise our position should a solid validation of a particular view of risk be achieved.
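The statistical effect of clustering can be sketched as overdispersion in annual storm counts. The toy model below, with hypothetical rates, compares independent (Poisson) arrivals against a Poisson-gamma mixture, in which some seasons are randomly more "active" than others; this is one common way to represent clustering, not the RMS methodology.

```python
import math
import random

# Illustrative sketch with hypothetical parameters: clustered
# windstorm seasons modeled as a Poisson-gamma (negative binomial)
# mixture versus plain Poisson arrivals.
random.seed(1)

def poisson(lam):
    """Knuth's algorithm for a Poisson-distributed sample."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def clustered_count(mean=4.0, dispersion=2.0):
    """Annual storm count when the Poisson rate itself varies:
    gamma-distributed rate => some seasons are far more active."""
    rate = random.gammavariate(dispersion, mean / dispersion)
    return poisson(rate)

independent = [poisson(4.0) for _ in range(10000)]
clustered = [clustered_count() for _ in range(10000)]

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Same average count, but the clustered process produces more
# extreme seasons (higher variance), which drives tail risk.
print(var(independent), var(clustered))
```

Both processes average about four storms per season, but the clustered one has roughly triple the variance under these parameters, so a capital model calibrated on independent arrivals would understate the chance of a Niklas-style season arriving in a cluster.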

Ongoing Learning

The scientific community is still learning what drives an active European storm season. Some patterns and correlations are now better understood, but even with powerful analytics and the most complete datasets possible, we still cannot yet forecast season activity. However, our recent model update allows (re)insurers to maintain an up-to-date view, and to gain a deeper comprehension of the variability and uncertainty of managing this challenging peril. That knowledge is key not only to meeting the requirements of Solvency II, but also to increasing risk portfolios without attracting the need for additional capital.

[1] Currently estimated by PERILS at 895 million euros, in line with the RMS loss estimate of April 2015.