A Perennial Debate: Disaster Planning versus Disaster Response

In May we saw a historic first: the World Humanitarian Summit. Held in Istanbul, it was attended by representatives of 177 states. One UN chief summarised its mission thus: “a once-in-a-generation opportunity to set in motion an ambitious and far-reaching agenda to change the way that we alleviate, and most importantly prevent, the suffering of the world’s most vulnerable people.”

And in that sentence we find one of the enduring tensions within the disaster field: between “prevention” and “alleviation.” Between on the one hand reducing disaster risk through resilience-building investments, and on the other reducing suffering and loss through emergency response.

But in a world of constrained political budgets, where should we concentrate our energies and resources: disaster risk reduction or disaster response?

How to Close the Resilience Gap

The Istanbul summit saw a new global network launched to engage business in crisis situations through “pre-positioning supplies, meeting humanitarian needs and providing resources, knowledge and expertise to disaster prevention.” It is, of course, prudent to have stockpiles of humanitarian supplies strategically placed.

But is the dialogue still too focused on response? Could we not have hoped to see a greater emphasis on driving the disaster-resilient behaviours and investments that reduce reliance on emergency response in the first place?

Politics & Priorities

“Cost-effectiveness” is a concept with which humanitarian aid and governmental agencies have struggled over many years. But when it comes to building resilience, it is in fact possible to cost-justify the best course of action. After all, the insurance industry, piqued by the dual surprise of Hurricane Andrew and then the Northridge earthquake, has been using stochastic models to quantify and reduce catastrophe risk since the mid-1990s.
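
To make that concrete, here is a minimal sketch of the logic inside a stochastic catastrophe model: simulate many years of activity from an event set, then read off the average annual loss and an exceedance probability. The event set, probabilities, and losses below are invented for illustration and are vastly smaller than any real model’s.

```python
import random

random.seed(42)

# Hypothetical stochastic event set: each event has an annual occurrence
# probability and the insured loss it would cause ($bn). Real event sets
# contain tens of thousands of events derived from hazard science.
EVENTS = [
    {"p": 0.02, "loss": 25.0},  # rare, severe (e.g., a major landfall)
    {"p": 0.10, "loss": 5.0},   # moderate
    {"p": 0.50, "loss": 0.5},   # frequent, small
]

def simulate_year(events):
    """One simulated year: each event independently occurs or not."""
    return sum(e["loss"] for e in events if random.random() < e["p"])

years = [simulate_year(EVENTS) for _ in range(100_000)]
aal = sum(years) / len(years)                     # average annual loss
ep_10 = sum(y >= 10 for y in years) / len(years)  # P(annual loss >= $10bn)
print(f"AAL ~ ${aal:.2f}bn/yr; P(annual loss >= $10bn) ~ {ep_10:.2%}")
```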

Unfortunately, risk/reward analyses are rarely straightforward in practice. This is less a failing of the models to characterise complex phenomena accurately, though that is certainly a challenge, than a question of politics.

It is harder for any government to argue that spending scarce public funds on building resilience in advance of a possible disaster is money well spent. By contrast, when disaster strikes and human suffering is writ large across the media, there is a pressing political imperative to intervene. As a result, many agencies sadly allocate more funds to disaster response than to disaster prevention, even though the analytics mostly suggest the opposite would be more beneficial.

A New, Ambitious Form of Public-Private Partnership

But there are signs that across the different strata of government the mood is changing. The cities of San Francisco and Berkeley, for example, have begun to use catastrophe models to quantify the cost of inaction and thereby drive risk-reducing investments. For San Francisco the focus has been on protecting the city’s economic and social wealth from future sea level rise. In Berkeley, resilience models have been deployed to shore up critical infrastructure against the threat of earthquakes.

In May, RMS held the first international workshop on how resilience analytics can be used to manage urban resilience. Attended by public officials from several continents, the workshop generated a high level of engagement with the topic.

The role of resilience analytics in helping to design, implement, and measure resilience strategies was emphasized by Arnoldo Kramer, the first Chief Resilience Officer (CRO) of Mexico City, the largest city in the western hemisphere. The workshop discussion went further than explaining how these models can be used to quantify the potential, risk-adjusted return on investment from resilience initiatives: the group also stressed the role of resilience metrics in helping cities finance capital investments in new, protective infrastructure.
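
As an illustration of the kind of risk-adjusted return-on-investment arithmetic discussed at the workshop, the sketch below compares a hypothetical resilience investment against the present value of the modeled losses it avoids. Every figure is invented.

```python
# Hypothetical risk-adjusted return on a resilience investment.
# All numbers below are invented for illustration.
capital_cost = 120.0   # $m, up-front cost of protective infrastructure
horizon_years = 30     # useful life of the asset
discount_rate = 0.04   # annual discount rate

aal_before = 18.0      # $m/yr, modeled average annual loss without project
aal_after = 7.0        # $m/yr, modeled average annual loss with project

# Present value of the stream of avoided losses over the asset's lifetime.
avoided = aal_before - aal_after
pv_benefit = sum(avoided / (1 + discount_rate) ** t
                 for t in range(1, horizon_years + 1))

bcr = pv_benefit / capital_cost  # benefit-cost ratio
print(f"PV of avoided losses: ${pv_benefit:.0f}m; benefit-cost ratio: {bcr:.2f}")
```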

Stimulated by commitments under the Sendai Framework to work more closely with the private sector, lower income regions are also increasingly benefiting from such techniques – not just to inform disaster response, but also to finance the reduction of disaster risk in the first place. Indeed there are encouraging signs that these two different worlds are beginning to understand each other better. At the inaugural working group meeting of the Insurance Development Forum in Singapore last month there was a productive dialogue between the UN Development Programme and the risk transfer industry. It was clear that both sides wanted action, not just words.

Such initiatives can only serve to accelerate the incorporation of resilience analytics into existing disaster risk reduction programmes. This may be a once-in-a-generation opportunity to address the shameful gap between the economic costs of natural disasters and the fraction of those costs that are insured.

We cannot prevent natural disasters from happening. But neither can we continue to afford to spend billions of dollars picking up the pieces when they strike. I am hopeful that we will take this opportunity to bring resilience analytics into under-served societies, making them tougher, more resilient, so that when catastrophe strikes, the impact is lessened and societies can bounce back far more readily.

Using Insurance Claims Data to Drive Resilience

When disaster strikes for homeowners and businesses the insurance industry is a source of funds to pick up the pieces and carry on. In that way the industry provides an immediate benefit to society. But can insurers play an extended role in helping to reduce the risks for which they provide cover, to make society more resilient to the next disaster?

Insurers collect far more detailed and precise information on property damage than any other public- or private-sector organisation. Such claims data can provide deep insights into what determines damage – whether it’s the vulnerability of a particular building type or the fine-scale structure of flood hazard.

While the data derived from claims experience helps insurers to price and manage their risk, it has not been possible to apply this data to reduce the potential for damage itself – but that is changing.

At a recent Organisation for Economic Co-operation and Development meeting in Paris on flood risk insurance we discussed new initiatives in Norway, France and Australia that harness and apply insurers’ claims experience to inform urban resilience strategies.

Norwegian Claims Data Reduces Flood Risk

In Norway the costs of catastrophes are pooled across private insurance companies, making it the norm for insurers to share their claims data with the Natural Perils Pool. Norwegian insurers have collaborated to make the sharing process more efficient, agreeing in 2008 a standardized approach to address-level exposure and claims classifications covering all private, commercial and public buildings. Once the classifications were consistent, it became clear that almost 70% of flood claims were driven by urban flooding from heavy rainfall.

Starting with a pilot of ten municipalities, including the capital, Oslo, a group funded by the Norwegian finance and insurance sector took this address-level data to the city authorities to show exactly where losses were concentrated, so that city engineers could identify and implement remedial actions, whether larger storm drains or flood walls. As a result, flood claims are falling.
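
A minimal sketch of the kind of address-level aggregation involved is below; the records and field names are invented, and the real Norwegian classification scheme is far richer than this.

```python
from collections import defaultdict

# Invented address-level claims records, loosely in the spirit of the
# standardized classifications the Norwegian pool agreed in 2008.
claims = [
    {"address": "Storgata 1, Oslo",   "cause": "urban_rainfall", "paid": 140_000},
    {"address": "Storgata 1, Oslo",   "cause": "urban_rainfall", "paid": 90_000},
    {"address": "Elvevei 12, Bergen", "cause": "river_flood",    "paid": 60_000},
    {"address": "Storgata 1, Oslo",   "cause": "urban_rainfall", "paid": 110_000},
]

# Aggregate paid losses by address to surface repeat-loss hotspots that a
# city engineer could target with larger storm drains or flood walls.
by_address = defaultdict(lambda: {"paid": 0, "count": 0})
for c in claims:
    by_address[c["address"]]["paid"] += c["paid"]
    by_address[c["address"]]["count"] += 1

for addr, agg in sorted(by_address.items(), key=lambda kv: kv[1]["paid"],
                        reverse=True):
    if agg["count"] >= 2:  # flag addresses with repeated claims
        print(f"{addr}: {agg['count']} claims, NOK {agg['paid']:,} paid")
```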

French Observatory Applies Lessons Learned from Claims Data

Another example comes from France, where natural catastrophe losses are refunded through the national ‘Cat Nat System’. Property insureds pay an extra 12% premium to be covered. All the claims data generated in this process now passes to the national Observatory of Natural Risks, set up after Storm Xynthia in 2010. This unit uses the data to perform forensic investigations into what can be learned from the claims, then works with municipalities on how to apply these lessons to reduce future losses. The French claims experience is not as comprehensive as Norway’s, because data is only collected when the state declares a ‘Cat Nat event’, which excludes some of the smaller, local losses that fail to reach the threshold of political attention.

Australian Insurers Forced Council to Act on Their Claims Data

In Australia, sharing claims data with a city council was the result of a provocative action by insurers frustrated by the political pressure to offer universal flood insurance following the major floods of 2011. Roma, a town in Queensland, had been inundated five times in six years; insurers mapped and published the addresses of the properties that had been repeatedly flooded and refused to renew the cover unless action was taken. The insurers’ campaign achieved its goal, pressuring the local council to fund flood alleviation measures across the town.

These examples highlight how insurers can help cities identify where their investments will accomplish the most cost-effective risk reduction. All that’s needed is an appetite to find ways to process and deliver claims data in a format that provides the key insights city bosses need, without compromising confidentiality or privacy.

This is another exciting application in the burgeoning new field of resilience analytics.

The Rising Cost of Hurricanes – and America’s Ability to Pay

Future hurricanes are going to cost the U.S. more money and, if we don’t act to address this, will leave the government struggling to cope. That is the finding of a recent Congressional Budget Office (CBO) report, which put it starkly:

“…over time, the costs associated with hurricane damage will increase more rapidly than the economy will grow. Consequently, hurricane damage will rise as a share of gross domestic product (GDP)…”

The CBO identified two core drivers of the escalating costs: climate change, which accounts for just under half of the potential increase in hurricane damage, and coastal development, which accounts for just over half. The four variables with the most impact were identified as:

  • changes in sea levels for different U.S. states;
  • changes in the frequency of hurricanes of various intensities;
  • population growth in coastal areas; and
  • per capita income in coastal areas.

Using Catastrophe Models to Calculate the Future Cost of Hurricanes

To inform the CBO’s research and its construction of a range of possible hurricane scenarios based on future changes in the four key variables, RMS hurricane and storm surge risk experts provided the CBO with data from the RMS North Atlantic Hurricane Model and Storm Surge Model.

Through RMS’ previous work with the Risky Business Initiative, we were able to provide state-specific “damage functions”, which were used to translate possible future hurricane events, state-specific sea levels, and current property exposure into expected damage. While we usually produce loss estimates for catastrophes, we did not provide the CBO with estimated losses ourselves; rather, we built a tool so the CBO could “own” its own assumptions about changes in all the factors, a critical aspect of the CBO’s need to remain impartial and objective.
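
To picture the mechanics, here is a toy state-level damage function; the functional form, coefficients, and exposure figures are invented assumptions, not the damage functions RMS provided to the CBO.

```python
def expected_damage(wind_mph, surge_ft, sea_level_rise_ft, exposure_bn):
    """Toy state-level damage function: translate a hurricane event and a
    sea-level assumption into expected damage ($bn). The functional form
    and coefficients are invented for illustration."""
    # Wind damage rises steeply once winds exceed a ~75 mph threshold.
    wind_ratio = max(0.0, (wind_mph - 75) / 100) ** 2
    # Surge damage scales with water depth, amplified by sea level rise.
    surge_ratio = 0.01 * (surge_ft + sea_level_rise_ft)
    damage_ratio = min(1.0, wind_ratio + surge_ratio)
    return damage_ratio * exposure_bn

# Same storm under two sets of assumptions the analyst "owns":
today = expected_damage(wind_mph=120, surge_ft=8, sea_level_rise_ft=0.0,
                        exposure_bn=500)
future = expected_damage(wind_mph=120, surge_ft=8, sea_level_rise_ft=2.5,
                         exposure_bn=650)  # higher seas, more coastal exposure
print(f"Expected damage: ${today:.0f}bn today vs ${future:.0f}bn in future")
```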

Solutions to Increase Coastal Urban Resilience

The CBO’s report includes suggested policies that could decrease the pressure on federal spending. The policies range from global initiatives to limit greenhouse gas emissions to more direct mechanisms that could shift costs to state and local governments and private entities, as well as investment in structural changes to reduce vulnerabilities. Such approaches bring to the forefront the role of local resilience in tackling a global problem.

However, therein lies the challenge. Many of the options open to society to increase resiliency against catastrophes could have a less positive effect on the economy. It’s an issue that has been central to the wider debate about reducing the impacts of climate change. Limiting greenhouse gas emissions has direct effects on the oil and gas industry. Likewise, curbing coastal development impacts developers and local economies. It has led states such as North Carolina to ban the use of future sea-level-rise projections as the basis for policies on coastal development.

Overcoming Political Resistance

Creating resiliency in U.S. towns and communities needs to be a multi-faceted effort. While initiatives to fortify the building stock and curb global climate change and sea level rise are moving ahead, there is strong resistance from the political arena. To overcome that resistance, solutions must be found to transition the economy to new forms of energy and to adapt the current workforce to the jobs of the future. City leaders and developers should partner on sustainable urban growth initiatives, to ease fears that coastal cities will wither and die under new coastal use restrictions.

Initiating these conversations will go a long way toward removing the barriers to success in curbing greenhouse gas emissions and limiting coastal growth. With an already polarised political debate on climate change, this CBO report may provoke further controversy about how to deal with the factors behind the increase in future hurricane damage costs. One conclusion, though, is inescapable: catastrophe losses are going up and we will all be footing the bill.

This post was co-authored by Paul Wilson and Matthew Nielsen.

Matthew Nielsen

Senior Director of Global Governmental and Regulatory Affairs, RMS

Matthew Nielsen leads Governmental and Regulatory Affairs. He is responsible for maintaining relationships with regulators, legislators, and rating agencies on behalf of the company to establish open channels of communication around RMS models and solutions. Matthew is a meteorologist and geographer with extensive experience in North American catastrophe risk. In his prior role at RMS, he was responsible for developing the RMS climate peril models for the Americas, including the severe convective storm, winter storm, flood, and hurricane models. He has conducted field reconnaissance for major catastrophes including Hurricanes Katrina and Sandy. Before joining RMS, Matthew conducted remote sensing in satellite meteorology research at the Cooperative Institute for Research in the Atmosphere (CIRA). He holds a BS in physics from Ripon College, where he won the Henry Knop Award in Physics, and an MS in atmospheric science from Colorado State University. Matthew is a member of the American Meteorological Society (AMS), the International Society of Catastrophe Managers (ISCM), and the American Association of Geographers (AAG).

Euro 2016: France inundated by fans and floods

This week the final knockout rounds of Euro 2016 take place in France. Sadly, England has long since left the country and the tournament, largely due to some inept displays. But even more miserable than England’s performance was the weather at the start of the tournament, which caused concern in the capital as intense precipitation on top of an already saturated France led to severe flooding.

Some areas of the country experienced the worst flooding they have seen in a century, with the floods across eastern and central France declared a natural disaster by French President François Hollande. River levels in the Seine were at their highest in nearly 35 years, impacting Paris and leading three of the capital’s best-known museums — the Louvre, the Grand Palais, and the Musée d’Orsay — to close their doors to the public as staff moved priceless works of art to the safety of higher floors.

Source: The Guardian

There were also concerns about how the flooding could impact the tournament. However, as you can see in the image below, which represents the RMS 1,000-year inland flood hazard extent, neither of the two stadia located in France’s capital (yellow markers) was at any real risk of flooding. The same can’t be said for the fan zone adjacent to the Eiffel Tower (red marker): continued intense rainfall would have increased the flood severity, leaving the 90,000 or so fans there in need of their waders.

Stade de France and Parc des Princes (yellow markers); Paris Fan Zone (red marker)

Paris wasn’t the only location in France to be impacted by the floods; further south, the town of Nemours saw severe flooding as the River Loing burst its banks. While devastating to the local community, flooding of this severity is to be expected in the town: the RMS Europe Inland Flood maps show such flooding for events in excess of the 50-year return period, and as the image of the 200-year flood extent below demonstrates, the flooding could have been even more severe.

Rue de Paris, Nemours (yellow marker) and Château-Musée de Nemours (red marker)

The flooding in Nemours is a good example of why it is so important to understand the standard of protection offered by local flood defenses in order to fully understand flood risk. The RMS Europe Inland Flood models and maps explicitly represent the impact of flood defenses and provide some noteworthy insights into the potential exposure at risk if the standard of protection is not maintained or local flood defenses are overtopped.

Rue de Paris, Nemours. Source: The Guardian

If we remove all flood defenses and consider a 100-year return period level of flood hazard across France, the RMS analyses estimate that over €600 billion of insured exposure is at risk of flood damage. However, approximately 40 percent of this exposure is protected against such levels of hazard by local flood defenses.

Source: Château-Musée de Nemours

And in the largest exposure concentrations, such as Paris and its surrounding area, the importance of local defenses is even more prominent. Looking at a similar 100 year return period level of flood hazard in this region, almost €60 billion of insured exposure would be at risk of flooding, but approximately 90 percent of that exposure is protected against this level of hazard.
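
A back-of-envelope check of those two figures is below, using the exposure values and protected shares quoted above; the arithmetic simply separates exposure that sits behind defenses from exposure that does not.

```python
# The exposure values and protected shares are those quoted above; the
# arithmetic splits exposure by whether it sits behind flood defenses.
regions = {
    "France (all)": {"exposure_bn": 600.0, "protected_share": 0.40},
    "Paris region": {"exposure_bn": 60.0,  "protected_share": 0.90},
}

for name, r in regions.items():
    if_defenses_hold = r["exposure_bn"] * (1 - r["protected_share"])
    if_defenses_fail = r["exposure_bn"]
    print(f"{name}: EUR {if_defenses_hold:.0f}bn exposed to the 100-year "
          f"flood if defenses hold; EUR {if_defenses_fail:.0f}bn if they fail")
```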

Flood can be thought of as a polar peril: if you’re within the extent of a flood event the costs are high, but if you’re on the edge you’re safe. For this reason, an understanding of the impact of flood defenses is vital, because if they breach or are overtopped, the losses can be high. Knowing where exposure is protected allows you to write business smartly in higher-risk zones. But understanding the hazard should defenses fail is also vital, enabling a more informed understanding of severe flood risk and its associated uncertainties.

This post was co-authored by Rachael Whitford and Adrian Mark.

Cat Losses, The Atlantic Basin, & Technology

Technology can be a powerful ally in the battle to successfully assess and manage risk. From new, high-definition models to fully hosted solutions that shrink costs and timeframes, risk professionals now have access to the tools they need to successfully manage their portfolios.

Advances both in the collection of data and computational strength have enabled more precise and comprehensive analytics than were previously possible, thus allowing a more complete and accurate risk profile.

The more you know about risks and exposures, the better they can be managed. Unmanaged or under-managed, risks and exposures can become problems, and even turn tragic or fatal.

Global insured losses from catastrophes totaled $37 billion in 2015, according to Swiss Re’s most recent Sigma study. The 2015 figure, at just over half the inflation-adjusted previous ten-year average of $62 billion in insured catastrophe losses, was substantially tied to a quiet Atlantic hurricane season.

“The relatively low level of losses was largely due to another benign hurricane season in the US. El Niño in 2015 contributed to weather patterns deviating from average climate norms,” said the Swiss Re report.

(Re)insurers’ financial results for the past two years have been dotted with the phrase “benign catastrophe losses,” demonstrating how they have benefitted from quiet Atlantic storm conditions producing below-average claims activity.

That period of below-average catastrophe losses for (re)insurers may be coming to an end as researchers and forecasters are pointing toward a more active Atlantic hurricane season for 2016.

When (not if) catastrophe losses return to their ten-year average, that’s an extra $25 billion across somebody’s balance sheet. What might the 2016 Atlantic hurricane season hold for the U.S. and those who insure it?

With ports lining the U.S. coast from Texas to New York, even one landfall could wreak havoc on marine activities and infrastructure as the country moves into the winter holiday and heating oil seasons.

More Active Season?

While 2015 saw only 11 named storms with just four hurricanes, early indications suggest that the 2016 season will exceed those totals.

An April 14 update from the Climate Prediction Center of the National Oceanic and Atmospheric Administration (NOAA) said that the current El Niño conditions, known to inhibit hurricane activity, are likely to abate.

El Niño is dissipating and NOAA’s Climate Prediction Center is forecasting a 70 percent chance that La Niña—which favors more hurricane activity—will be present during the peak months of hurricane season, August through October.

“Nearly all models predict further weakening of El Niño, with a transition to ENSO-neutral likely during late spring or early summer 2016. Then, the chance of La Niña increases during the late summer or early fall,” the Center said in its update.

The Colorado State University Tropical Meteorology Project issued a forecast that included an estimated 12 named storms and five hurricanes, again greater than observed 2015 totals.

The Weather Company’s Professional Division issued a report stating the 2016 Atlantic hurricane season would be the most active since 2012. The report forecasts 14 named storms, eight hurricanes, and three major hurricanes, more than the 30-year historical average of 12 named storms, six hurricanes, and three major hurricanes, according to The Weather Channel.

Most recently, NOAA followed its earlier report on El Niño with its annual Atlantic hurricane forecast, stating that this year’s hurricane season will see closer-to-normal activity after three slow years.

“A near-normal prediction for this season suggests we could see more hurricane activity than we’ve seen in the last three years, which were below normal,” said Gerry Bell, Ph.D., lead seasonal hurricane forecaster with NOAA’s Climate Prediction Center.

The NOAA forecast predicts a 70% likelihood of 10 to 16 named storms, of which 4 to 8 could become hurricanes, including 1 to 4 major hurricanes (Category 3, 4, or 5). A near-normal season is most likely, with a 45% chance; there is also a 30% chance of an above-normal season and a 25% chance of a below-normal season.

Another ominous harbinger was the formation of Tropical Storm Colin on June 5—the earliest third storm on record in the Atlantic basin. Colin then made landfall on June 6 along Florida’s Big Bend with maximum sustained winds of 50 mph—the first named storm to make landfall in Florida since Andrea in 2013.

Earlier this year, Hurricane Alex became only the second hurricane on record to form in the month of January, sweeping through the Azores as a tropical storm.

Prepare for the Worst

The insurance sector has been substantially re-shaped since the last large catastrophe loss—by M&A and the influx of new capital—meaning new people, new relationships, even new claims procedures and personnel.

It’s an entirely new landscape, entirely untested. How will it respond when a catastrophe hits and claims and losses mount?

From first responders to catastrophe modelers, one piece of advice never changes—be prepared.

That means understanding your exposures and accumulations and owning your own view of risk.

You can’t control or avoid catastrophes, but you can manage and mitigate their effects. Being prepared is the first step.

The Orlando Shootings – and What They Tell Us About the Evolving Threat from Islamic State

This month saw what the President of the United States described as “the most deadly shooting in American history”: the killing of 49 innocent people at an Orlando nightclub, carried out by a man suspected of having leanings towards radical Islamist ideology.

Although information is still emerging, there are some clear threads and patterns which link this attack to the increasing activity surrounding the so-called Islamic State (IS).

1. Assaults Using an Automatic Rifle Becoming More Common

Among those committed to terrorizing the population, there appears to be a growing tendency to use automatic weapons. Off-the-shelf military weapons are inherently more reliable than improvised explosive devices (IEDs). In contrast to the atrocity in Orlando, a 2007 plot against the Tiger Tiger nightclub in London failed because its IED failed to detonate.

2. The Increase in “Lone Wolf” Attacks as a Response to Surveillance

A “lone wolf” attack has been defined as a single individual or a group of two to three people driven to hateful actions based on a particular set of beliefs without a larger group’s knowledge or support. The FBI believes that most U.S. domestic attacks are carried out by lone actors to promote their own grievances and agendas.

Militants involved in such attacks are home-grown “self-starters” who are inspired by the jihadi movement but may have little or no actual connection to these groups. Instead, many use the internet and social networking tools to find propaganda and research attack methods.

Mass interception of communications (as revealed by Edward Snowden’s leaks of National Security Agency files), particularly in North America, has raised the chances of terrorist conspiracies being detected. This has led to a move away from plots involving multiple attackers. There has been a corresponding rise in the United States in the risk of lone-actor attacks, which have a comparatively small chance of being found out and stopped.

3. Attacks Inspired by Islamic State

The Orlando terrorist contacted police by cellphone around the time of the attack to announce his allegiance to IS. There are strong indications that he was deeply influenced by the group, even if he had no direct contact with it. As IS concedes territory it controls in Iraq and Syria, it is looking to organize or inspire atrocities overseas. There are two likely reasons for this. First, striking on foreign soil helps to divert attention from its losses in the Middle East, retaining credibility and an aura of potency. Second, jihadi operations overseas are designed to deter further attacks by Western forces on IS strongholds in Iraq and Syria.

4. Targeting of Venues Which Extremists Claim Symbolize Values They Decry

Bars and nightclubs may feature in the attack plans of terrorist organizations because they hold high concentrations of people in public, accessible venues. Such locations are also targets for extremists who view them as representing Western lifestyles of which they disapprove.

5. Increased Attacks over Ramadan

The murders in Orlando happened a week after the start of the holy month of Ramadan. Radical Islamic militants tend to increase their operations during this period. A recording released online by IS spokesman Abu Mohammed al-Adnani claimed that any martyrdom operation during Ramadan will bring greater “rewards.” An increase in the tempo of Islamist terrorist activity would thus not be unexpected.

The increasing proliferation of extremism and global attacks is concerning. Our modeling team closely monitors the evolving risk landscape. By examining all attacks to capture greater insight into the workings and thinking of the terrorist groups, including targeting preferences and weapon selection, we can continue to offer terrorism models that enable our clients to deepen their understanding of terrorism risk and strengthen their terrorism risk management.

This post was co-authored by Weimeng Yeo and Gordon Woo. 

Weimeng Yeo

Principal Modeler, Model Development
Weimeng Yeo is a principal modeler on the Model Development team at Risk Management Solutions (RMS), and is a key member of the team responsible for the development of RMS’ terrorism modeling solutions. Prior to his tenure at RMS, Weimeng worked at the International Center for Political Violence and Terrorism Research at the Institute of Defense and Strategic Studies in Singapore. He received his bachelor’s degree in Political Science from Colby College in Maine and a Master’s degree in International Affairs from Georgetown University in Washington DC at the School of Foreign Service.

Is this the year that breaks the streak?

Sports fans around the world have witnessed impressive winning streaks throughout history. After capturing two consecutive UEFA European Championships (2008, 2012) and a World Cup championship (2010), the Spanish National Football Team entered the 2014 World Cup in Brazil as the top-ranked squad in international competition. The dominant Spaniards were among the international sportsbooks’ favorites to bring home the trophy once again.

Instead, surprising defeats at the hands of the Netherlands and Chile eliminated Spain at the group stage. Spain’s streak of dominance came to a sudden end, marking the earliest World Cup exit for a defending champion since 1950.

From a meteorological perspective, the United States is currently riding its own streak: ten Atlantic hurricane seasons without a major hurricane (category 3 or above) making landfall, the longest such stretch in recorded history. With another hurricane season upon us, many will be keeping a keen eye on the Atlantic this summer to see if this impressive streak will continue.

Global forecasting groups, such as Colorado State University and Tropical Storm Risk, have issued their tropical storm and hurricane activity forecasts for the 2016 Atlantic hurricane season. Christopher Allen of the RMS Event Response team has authored an excellent summary of their forecasts in the RMS 2016 North Atlantic Hurricane Season Outlook published this week on RMS.com.

You can also listen to my summary of the season’s forecasts in my conversation with AM Best TV’s John Weber. Most forecasts predict anywhere between near-average and above-average activity in the Atlantic basin, reflecting conflicting signals in the key indicators that influence hurricane formation.

Will we have increased hurricane activity?

One factor that may support increased hurricane activity this season is the anticipated state of the El Niño-Southern Oscillation, or ENSO. As reported on this blog several months ago, many ENSO forecasts project a transition out of last year’s historic El Niño phase into a La Niña phase, which is historically more favorable for hurricane development. Wind shear, detrimental to tropical cyclone formation, typically is reduced in the Atlantic basin during La Niña phases of ENSO.
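
For readers who want the mechanics, the sketch below classifies ENSO phase from NINO3.4 sea surface temperature anomalies using the widely used ±0.5°C threshold; the anomaly values are invented, and operational definitions (such as NOAA’s Oceanic Niño Index) additionally require the threshold to persist across several overlapping three-month seasons.

```python
# Classify ENSO phase from NINO3.4 SST anomalies (degrees C) using the
# widely used +/-0.5 C threshold. The anomaly values are invented, and
# operational definitions also require persistence across several
# overlapping three-month seasons.
anomalies = {"MJJ": 0.9, "JJA": 0.4, "JAS": -0.2, "ASO": -0.6, "SON": -0.8}

def enso_phase(anom_c, threshold=0.5):
    if anom_c >= threshold:
        return "El Nino"
    if anom_c <= -threshold:
        return "La Nina"
    return "ENSO-neutral"

for season, anom in anomalies.items():
    print(f"{season}: {anom:+.1f} C -> {enso_phase(anom)}")
```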

Mid-May 2016 observations and model forecasts of ENSO, based on the NINO3.4 index, through March 2017. Positive values correspond with El Niño, while negative values correspond with La Niña. Source: International Research Institute for Climate and Society

Conversely, some forecasts predict a cooling of Atlantic sea surface temperatures (SSTs), which would oppose any support provided by a forecasted La Niña and reduce the potential for an active hurricane season. This cooling has been driven by a lengthened positive phase of the North Atlantic Oscillation (NAO), which causes stronger than normal trade winds in the tropical North Atlantic and upwelling of deeper cold ocean water near the surface.

February-April 2016 sea level pressure anomalies in the North Atlantic Ocean (hPa, anomalies with respect to 1981-2010 climatology). Anomalously high pressure evident in the Azores and the mid-latitude North Atlantic signals a positive phase of the NAO. Source: National Centers for Environmental Prediction Monthly Reanalysis (Kalnay, E. and Coauthors, 1996: The NCEP/NCAR Reanalysis 40-year Project. Bull. Amer. Meteor. Soc., 77, 437-471).

The Atlantic Multidecadal Oscillation may also be transitioning into a prolonged phase detrimental to tropical cyclone development, a theory often mentioned on this blog, although one that is still debated in the scientific community.

Taken together, La Niña conditions and cooling Atlantic SSTs exert conflicting influences on Atlantic tropical cyclone development. Moreover, the forecasts contain key caveats that will ultimately determine this season’s activity:

  • Although a transition into a La Niña phase is widely anticipated, a late arrival would limit its ability to support development in the basin.
  • Further, forecasts of Atlantic sea surface temperature during August and September, the peak of hurricane season, remain conflicted.

Does the season’s early storm activity signify more activity?

Forecasts predicting above-average basin activity are understandable, given the early activity observed prior to the season’s official start. Tropical Storms Bonnie and Colin both formed before the second week in June, bringing heavy rainfall to South Carolina and the Gulf coast of Florida, respectively. Bonnie and Colin followed Hurricane Alex, the first January hurricane since 1938.

Bonnie’s formation marked the first time since 2012 that two named storms developed before June 1, the official start of hurricane season. The 2012 season ended with 19 total named storms, the third-most on record, including Superstorm Sandy, which caused more than $18 billion in insured losses.

Would the industry be prepared for the next major hurricane landfall? According to Fitch, the answer is yes: insurers and reinsurers in 18 coastal U.S. states would be equipped to handle one major event this season, although this resiliency has not recently been tested. More worrying, though, is the prospect of a large tail event or even multiple landfalling events, which the right combination of oceanic and atmospheric influences could support.

With the hurricane season now officially underway, we will watch, wait and see how the season’s activity unfolds over the next few months. What is certain, though, is that streaks are made to be broken. It’s just a matter of when.

Mandatory reporting of cyber-attacks would improve understanding of cyber risk

The recent call by the Association of British Insurers (ABI) for the U.K. government to mandate the reporting of cyber-attacks is another welcome attempt to improve the collective learning opportunities presented by the continuous stream of cyber events. Every attack provides new data that can be fed into the probabilistic models that help build resilience against this growing corporate peril – so long as we are able to find out about those attacks. Initiatives like this, which lead to the sharing of valuable information and insights, are therefore paramount.

Reporting cyber-attacks is already mandatory in most U.S. states, where laws require companies to notify their customers and regulators as soon as they suffer a security breach. In 2018 a similar EU law, the European Network and Information Security Directive, will make it mandatory for certain firms to provide alerts of cyber incidents.

However, having more information on data breaches still provides only part of the picture required to fully understand cyber as a peril.

Current security breach notification laws, where they exist, do not require companies to report the many other types of cyber-attack that are increasingly being used to target organizations. Cyber extortion, for example, is a growing trend. Firms typically choose not to report this type of attack to limit damage to their corporate reputation.

Historical attacks not a good indicator of the future

While having access to data on historical cyber breaches is valuable, the threat is constantly evolving, such that previous attacks have rarely been a good indicator of future events. Even a small shift in the balance between the capabilities of hackers and cyber defenses could lead to a significant shift in the frequency and severity of cyber attacks.

Staying on top of the myriad threat actors and their motivations and resources, as well as having a broad view of the range of viable attack methods that exist today, is crucial to understanding and managing cyber risk. But it is challenging to do.

As a first step to help insurers better understand their existing cyber loss potential, RMS recently launched its Cyber Accumulation Management System. This tool provides insurers with a framework to organize and structure their data, identify their accumulations and correlated risks, and stress test their portfolios against a range of cyber loss methods. This capability enables insurers to understand the potential size of cyber catastrophes and set their risk appetite to safely grow capacity for this line of business.
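
As a simplified illustration of what accumulation analysis involves (not a representation of the RMS system itself), the sketch below groups insured limits by a shared dependency and stresses the largest concentration under an assumed outage scenario; all policies and parameters are invented.

```python
from collections import defaultdict

# Illustrative accumulation check, not the RMS system itself: group insured
# limits by a shared dependency (here, a hypothetical cloud provider) and
# stress the largest concentration. All policies and parameters are invented.
portfolio = [
    {"insured": "RetailCo", "limit_m": 50, "cloud": "ProviderA"},
    {"insured": "BankCo",   "limit_m": 80, "cloud": "ProviderA"},
    {"insured": "MediaCo",  "limit_m": 30, "cloud": "ProviderB"},
]

# Accumulate limits by provider: a correlated single point of failure.
accumulation = defaultdict(float)
for policy in portfolio:
    accumulation[policy["cloud"]] += policy["limit_m"]

# Stress test: assume an outage at the largest accumulation triggers
# business-interruption claims at 40% of limit (an assumed severity).
provider, exposed = max(accumulation.items(), key=lambda kv: kv[1])
scenario_loss = 0.40 * exposed
print(f"Largest accumulation: {provider}, ${exposed:.0f}m of limits; "
      f"assumed outage scenario loss ~ ${scenario_loss:.0f}m")
```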

Cyber-attacks are an increasingly significant threat to the global economy. New cyber risk management solutions, combined with initiatives such as mandatory reporting, will help the insurance industry to continue to play its crucial role in ensuring the resiliency of our economy.

For more information, contact the RMS cyber team at cyberrisk@rms.com.

Exceedance 2016: Welcome Back to Miami!

We are back in sunny Miami, FL for Exceedance 2016 and ready for a week of engaging sessions, invigorating discussion, and plenty of networking opportunities.

If you’re joining us again here at the Fontainebleau Hotel, meet us in the Fleur De Lis Ballroom tonight at 5:30 p.m. for the Welcome Reception. If you were unable to make it this year, follow the highlights as we share on Twitter and LinkedIn, and here on the RMS Blog for #Exceedance news, insights, and photos.

Over the course of three days we have more than 60 sessions across six different tracks, so there is no shortage of thought-provoking content and discussions to be had. Download the mobile app to help you manage your schedule and maximize your week.

Here are a few highlights as you plan out your week:

This year, our lineup of keynote speakers includes:

  • Professor Bruce Hoffman, terrorism and security expert
  • Tim Jarvis, environmental scientist, author, adventurer
  • Matt Olsen, president and co-founder, IronNet Cybersecurity
  • Hemant Shah, RMS Co-founder and CEO
  • Robert Muir-Wood, RMS Chief Research Officer
  • Emily Paterson, RMS Event Response Lead
  • Mark Powell, VP and Founder, HWind
  • Emily Grover-Kopec, VP, Model Product Strategy
  • Arno Hilberts, Senior Director, Global Flood Models
  • Shree Khare, Senior Director, Asia Models
  • Chris Folkman, Director, Marine and Terrorism Models
  • Tom Harvey, Product Manager, Cyber Models

The Lab: During breakfast and lunch, be sure to stop by The Lab to meet RMS experts and learn the latest about RiskAssessor, RiskLink® version 16, Hosting Plus, and much more. Looking for some hands-on exercise? Join us to assemble 50 partially built bikes for donation to several Miami-based charities.

“EP” – The Exceedance Party: This year’s EP will be a vision in white, inspired by retro Miami and Fontainebleau’s heyday. Join us in Glimmer Ballroom to show off your dance moves to a five-piece band and DJ while enjoying specialty cocktails, lively conversations, delicious bites, a candy bar, photo booth, and more!

We’re excited to see you in Miami and are looking forward to a great week ahead!

Calculating the cost of “Loss and Damage”

The idea that rich, industrialized countries should be liable for paying compensation to poorer, developing ones damaged by climate change is one that has been disputed endlessly at recent international climate conferences.

The fear among rich countries is that they would be signing a future blank check. And the legal headaches in working out the amount of compensation don’t bear thinking about when there are likely to be arguments about whether vulnerable states have done enough to protect themselves.

The question of who pays the compensation bill may prove intractable for some years to come. But the scientific models already exist to make the working out of that bill more transparent.

Some context: in the early years of climate negotiations there was a single focus, on mitigating (or limiting) greenhouse gas emissions. Through the 1990s it became clear that atmospheric carbon dioxide was growing just as quickly as before, so a second mission was added: “adaptation” to the effects of climate change.

Now we have a third concept, “Loss and Damage,” which recognizes that no amount of mitigation or adaptation will fully protect us from damage that can’t be stopped and losses that can’t be recovered.

Sufficient self-protection?

The Loss and Damage concept was originally developed by the Alliance of Small Island States, whose members see themselves on the front line of potential impacts from climate change, in particular around sea-level rise. By some projections, at least four of the small island countries (Kiribati, Tuvalu, the Marshall Islands, and the Maldives) will be submerged by the end of this century.

Countries in such a predicament seeking compensation for their loss and damage will have to answer a difficult question: did they do enough to adapt to rising temperatures before asking other countries to help cover the costs? Rich countries will not look kindly on countries they deem to have done too little.

If money were no object, then adaptation strategies might seem limitless and nothing in the loss and damage world need be inevitable. Take sea level rise, for example. Even now in the South China Sea, the Chinese government, armed with strategic will and giant dredgers, is pumping millions of tons of sand so that submerged reefs can be turned into garrison-town islands. New Orleans, much of which lies below sea level, is protected by a $14 billion flood wall.

But, clearly, adaptation is expensive and so the most effective strategies may be beyond the reach of poorer countries.

Calculating the cost with models

Through successive international conferences on climate change, the legal and financial implications of loss and damage have been the subject of diplomatic wrangling, as richer and poorer nations argue about who is going to foot the bill.

But we can conceptualize a scientific mechanism for tallying what that bill should be. It would need a combination of models to discriminate between costs that would have happened anyway and those that are the responsibility of climate change.

First, we could use “attribution climate models”, which run two versions of future climate change: one based on the atmosphere as it actually is in 2016, and another that “re-writes history” and supposes there has been no increase in greenhouse gases since 1950.

By running these two models for thousands of simulation years, we can see the difference in the number of times a particular climate extreme occurs; that difference suggests how much of the extreme is down to greenhouse gas emissions. After this, we would need catastrophe models to estimate how much adaptation could have reduced the loss and damage. An illustration (worked through in the sketch after this list):

  • A future extreme weather event might cause $100 billion damage.
  • Attribution studies show that the event has become twice as likely because of climate change.
  • Catastrophe models show the cost of the damage could have been halved with proper adaptation.
  • So the official loss and damage could be declared as $25 billion.
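
Reduced to arithmetic, the illustration works out as follows; the inputs are the hypothetical figures from the bullets above, not outputs of any real model.

```python
# Worked version of the illustration above; all inputs are the
# hypothetical figures from the bullets, not outputs of a real model.
event_damage_bn = 100.0  # total damage from the extreme event
prob_ratio = 2.0         # event made twice as likely by climate change
adaptation_factor = 0.5  # proper adaptation would have halved the damage

# Fraction of the event's likelihood attributable to emissions:
# (p_with - p_without) / p_with = 1 - 1/prob_ratio = 0.5
attributable_fraction = 1 - 1 / prob_ratio

loss_and_damage_bn = event_damage_bn * attributable_fraction * adaptation_factor
print(f"Declared loss and damage: ${loss_and_damage_bn:.0f}bn")  # -> $25bn
```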

While hardly a straightforward accounting device, this is clearly a mechanism—albeit an impressively sophisticated one—that could be developed to calculate the bill for loss and damage due to climate change.

Leaving only the rather thorny question of who pays for it.