Tag Archives: Resilience

The Cure for Catastrophe?

On August 24, 2016 – just a few weeks ago – an earthquake hit a remote area of the Apennine mountains of central Italy in the middle of the night. Fewer than 3000 people lived in the vicinity of the strongest shaking. But nearly 1 in 10 of those died when the buildings in which they were sleeping collapsed.

This disaster, like almost all disasters, was squarely man-made: manufactured by what we build and where we build it, or, more subtly, by our failure to anticipate what will one day inevitably happen.

Italy has some of the richest and best researched disaster history of any country, going back more than a thousand years. The band of earthquakes that runs through the Apennines is well mapped – pretty much this exact same earthquake happened in 1639. If you were identifying the highest risk locations in Italy, these villages would be on your shortlist. So in the year 2016, 300 people dying in a well-anticipated, moderate-sized earthquake, in a rich and highly-developed country, is no longer excusable.

Half the primary school in the town of Amatrice collapsed in the August 24th earthquake. Very fortunately, it being the middle of the night, no children were in class. Four years before, €700,000 had been spent to make the school “earthquake proof.” An investigation is now underway to see why this proofing failed so spectacularly. If only Italy were as good at building disaster resilience as it is at mobilizing disaster response: some 7,000 emergency responders arrived after the earthquake – more than twice the number of people who lived in the affected villages.

The unnatural disaster

When we look back through history and investigate disasters closely, we find that many other “natural disasters” were, in their different ways, also man-made.

The city of Saint-Pierre on the island of Martinique was once known as the “little Paris of the Caribbean.” In 1900 it had a population of 26,000, with tree-lined streets of balconied two- and three-story houses. From the start of 1902 it was clear the neighbouring volcano of Mont Pelée was heading towards an eruption. The island’s governor convened a panel of experts who concluded Saint-Pierre was at no risk because the valleys beneath the volcano would guide the products of any eruption directly into the sea. As the tremors increased, the Governor brought his family to Saint-Pierre to show the city was safe, and thereby, like all but a handful of the city’s inhabitants, died when the eruption blasted sideways out of the volcano. There are some parallels here with the story of the 20,000 people drowned in the 2011 Japanese tsunami, many of whom had assumed they would be protected by concrete tsunami walls and therefore did not bother to escape while they still had time. We should distrust simple notions of where is safe, based only on some untested theory.

Sometimes the disaster reflects the unforeseen consequence of some man-made intervention. In spring 1965, the U.S. Army Corps of Engineers completed the construction of a broad shipping canal – known as the Mississippi River Gulf Outlet (“Mr Go”) – linking New Orleans with the Gulf of Mexico. Within three months, a storm surge flood driven by the strong easterly winds ahead of Hurricane Betsy was funnelled up Mr Go into the heart of the city. Without Mr Go the city would not have flooded. Four decades later Hurricane Katrina performed the same trick on New Orleans, only this time the storm surge was three feet higher. The flooding was exacerbated when thin concrete walls lining drainage canals fell over without being overtopped. Channels meant for pumping water out of the city reversed their intended function and became the means by which the city was inundated.

These were fundamental engineering and policy failures, for which many vulnerable people paid the price.

RiskTech

My new book, “The Cure for Catastrophe,” challenges us to think differently about disasters. To understand how risk is generated before the disaster happens. To learn from countries, like Holland, which over the centuries mastered their ever-threatening flood catastrophes, through fostering a culture of disaster resilience.

Today we can harness powerful computer technology to help anticipate and reduce disasters. Catastrophe models, originally developed to price and manage insurance portfolios, are being converted into tools that measure human casualties and impacts on livelihoods as well as monetary losses. Based on these measurements we can identify where to focus our investments in disaster reduction.
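The shift from monetary to humanitarian metrics is mostly a change of output, not of machinery: a catastrophe model produces a table of simulated events, each with an annual rate and one or more impact measures. A minimal sketch, using an invented three-event loss table (all names, rates, and figures are illustrative, not the output of any RMS model):

```python
from dataclasses import dataclass

@dataclass
class Event:
    name: str
    rate: float        # expected occurrences per year
    casualties: int    # modeled human impact
    loss_usd: float    # modeled monetary loss

# Hypothetical event loss table
events = [
    Event("M6.2 Apennines quake", 0.005, 300, 4.0e9),
    Event("M7.0 Apennines quake", 0.001, 2500, 2.0e10),
    Event("Moderate urban flood", 0.02, 5, 5.0e8),
]

def exceedance_rate(events, threshold, metric):
    """Annual rate of events whose chosen metric exceeds the threshold."""
    return sum(e.rate for e in events if getattr(e, metric) > threshold)

def average_annual(events, metric):
    """Expected annual total of the chosen metric (rate-weighted sum)."""
    return sum(e.rate * getattr(e, metric) for e in events)

print(exceedance_rate(events, 100, "casualties"))  # rate of 100+ casualty events
print(average_annual(events, "casualties"))        # expected casualties per year
```

The same `exceedance_rate` call with `metric="loss_usd"` reproduces the insurance industry's familiar loss exceedance curve; pointing it at casualties is what turns a pricing tool into a disaster-reduction one.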

In 2015 the Tokyo metropolitan government became the first to announce that it aims to halve its earthquake casualties, measuring progress using the results of a catastrophe model. The frontline towns of Italy should likewise have their risks modeled and independently audited, so that we can see whether they are making progress in saving future lives before they suffer their next inevitable earthquake.

 

The Cure for Catastrophe is published by Oneworld (UK) and Basic Books (US)

The Rise and Stall of Terrorism Insurance

In the 15 years since the terrorist attacks of September 11, 2001, partnerships between the public sector and private industries have yielded more effective security and better public awareness about the threat of terrorism. We may never come to terms with the sheer volume of human loss from that day, or from the hundreds of attacks that continue every year. But we have achieved greater resilience in the face of the ongoing realities of terrorism – except when it comes to recovering from the catastrophic costs of rebuilding in its aftermath.

Terrorism insurance is facing a structural crisis: hundreds of terrorist attacks occur annually, but actual insurance payouts have been negligible. The economic costs of terrorism have skyrocketed, but demand for terrorism coverage has remained relatively flat. And despite a proliferation of catastrophe bonds and other forms of alternative capital flooding into the property insurance market, relatively little terrorism risk has been transferred to the capital markets. If terrorism insurance – and the insurers who provide it – are to remain relevant, they must embrace the new tools and data available to them to create more relevant products, more innovative coverages, and new risk transfer mechanisms that address today’s threat landscape.

The September 11, 2001 attacks rank among the largest insurance losses in history at $44 billion, placing them alongside catastrophes with severe losses such as Hurricane Katrina ($70 billion), the Tohoku earthquake and tsunami ($38 billion), and Hurricane Andrew ($25 billion).

But unlike natural catastrophes, whose damages span hundreds of kilometers, most of the 9/11 damages in New York were concentrated in an area of just 16 acres. Such extreme concentration of loss caused a crisis in the insurance marketplace and highlighted the difficulty of insuring against such a peril.

Following the September 11 attacks, most insurers excluded terrorism from their policies, forcing the U.S. government to step in and provide a backstop through the Terrorism Risk Insurance Act (2002). Terrorism insurance has since become cost-effective as insurer capacity for terrorism risk has increased. Today there are an estimated 40 insurers providing it on a stand-alone basis, and it is bundled with standard property insurance contracts by many others.

But despite better data on threat groups, more sophisticated terrorism modeling tools, and increased transparency into the counter-terrorism environment, terrorism insurance hasn’t changed all that much in the past 15 years. The contractual coverage is the same – usually distinguishing between conventional and CBRN (chemical, biological, radiological, and nuclear) attacks. And terrorism insurance take-up remains minimal where attacks occur most frequently, in the Middle East and Africa, highlighting what policymakers refer to as a widening “protection gap.”

Closing this gap – through new products, coverages, and risk transfer schemes – will enable greater resilience following an attack and promote a more comprehensive understanding of the global terrorism risk landscape.

A Perennial Debate: Disaster Planning versus Disaster Response

In May we saw a historic first: the World Humanitarian Summit. Held in Istanbul, it was attended by representatives of 177 states. One UN chief summarised its mission thus: “a once-in-a-generation opportunity to set in motion an ambitious and far-reaching agenda to change the way that we alleviate, and most importantly prevent, the suffering of the world’s most vulnerable people.”

And in that sentence we find one of the enduring tensions within the disaster field: between “prevention” and “alleviation.” Between on the one hand reducing disaster risk through resilience-building investments, and on the other reducing suffering and loss through emergency response.

But in a world of constrained political budgets, where should we concentrate our energies and resources: disaster risk reduction or disaster response?

How to Close the Resilience Gap

The Istanbul summit saw a new global network launched to engage business in crisis situations through “pre-positioning supplies, meeting humanitarian needs and providing resources, knowledge and expertise to disaster prevention.” It is, of course, prudent to have stockpiles of humanitarian supplies strategically placed.

But is the dialogue still too focused on response? Could we not have hoped to see a greater emphasis on driving the disaster-resilient behaviours and investments, which reduce the reliance on emergency response in the first place?

Politics & Priorities

“Cost-effectiveness” is a concept with which humanitarian aid and governmental agencies have struggled over many years. But when it comes to building resilience, it is in fact possible to cost-justify the best course of action. After all, the insurance industry, piqued by the dual surprise of Hurricane Andrew and then the Northridge earthquake, has been using stochastic models to quantify and reduce catastrophe risk since the mid-1990s.
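In principle the cost-justification is a discounted cash-flow comparison: the upfront cost of a resilience investment set against the present value of the reduction in modeled expected annual loss. A hedged sketch, with illustrative figures and a simple fixed discount rate (real analyses would use full loss distributions and agency-specific discounting):

```python
def present_value(annual_saving, discount_rate, years):
    """Present value of a constant annual saving over a fixed horizon."""
    return sum(annual_saving / (1 + discount_rate) ** t
               for t in range(1, years + 1))

def benefit_cost_ratio(eal_before, eal_after, upfront_cost,
                       discount_rate=0.04, horizon=50):
    """Ratio > 1 means the modeled risk reduction pays for the investment."""
    annual_saving = eal_before - eal_after
    return present_value(annual_saving, discount_rate, horizon) / upfront_cost

# e.g. a retrofit cuts expected annual loss from $10M to $4M for $60M upfront
bcr = benefit_cost_ratio(10e6, 4e6, 60e6)
print(bcr)  # > 1: the investment is cost-justified under these assumptions
```

The stochastic catastrophe models mentioned above supply the `eal_before` and `eal_after` inputs; the politics discussed next is about who finds such a ratio persuasive.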

Unfortunately risk/reward analyses are rarely straightforward in practice. This is less a failing of the models to accurately characterise complex phenomena, though that certainly is a challenge. It’s more a question of politics.

It is harder for any government to argue that spending scarce public funds on building resilience in advance of a possible disaster is money well spent. By contrast, when disaster strikes and human suffering is writ large across the media, then there is a pressing political imperative to intervene. As a result many agencies sadly allocate more funds to disaster response than to disaster prevention, even though the analytics mostly suggest the opposite would be more beneficial.

A New, Ambitious form of Public Private Partnership

But there are signs that across the different strata of government the mood is changing. The cities of San Francisco and Berkeley, for example, have begun to use catastrophe models to quantify the cost of inaction and thereby drive risk-reducing investments. For San Francisco the focus has been on protecting the city’s economic and social wealth from future sea level rise. In Berkeley, resilience models have been deployed to shore up critical infrastructure against the threat of earthquakes.

In May, RMS held the first international workshop on how resilience analytics can be used to manage urban resilience. Attended by public officials from several continents, the workshop generated a very high level of engagement with the topic.

The role of resilience analytics to help design, implement, and measure resilience strategies was emphasized by Arnoldo Kramer, the first Chief Resilience Officer (CRO) of the largest city in the western hemisphere, Mexico City. The workshop discussion went further than just explaining how these models can be used to quantify the potential, risk-adjusted return on investment from resilience initiatives. The group stressed the role of resilience metrics in helping cities finance capital investments in new, protective infrastructure.

Stimulated by commitments under the Sendai Framework to work more closely with the private sector, lower income regions are also increasingly benefiting from such techniques – not just to inform disaster response, but also to finance the reduction of disaster risk in the first place. Indeed there are encouraging signs that these two different worlds are beginning to understand each other better. At the inaugural working group meeting of the Insurance Development Forum in Singapore last month there was a productive dialogue between the UN Development Programme and the risk transfer industry. It was clear that both sides wanted action, not just words.

Such initiatives can only serve to accelerate the incorporation of resilience analytics into existing disaster risk reduction programmes. This may be a once-in-a-generation opportunity to address the shameful gap between the economic costs of natural disasters and the fraction of those costs that are insured.

We cannot prevent natural disasters from happening. But neither can we continue to afford to spend billions of dollars picking up the pieces when they strike. I am hopeful that we will take this opportunity to bring resilience analytics into under-served societies, making them tougher, more resilient, so that when catastrophe strikes, the impact is lessened and societies can bounce back far more readily.

Using Insurance Claims Data to Drive Resilience

When disaster strikes for homeowners and businesses the insurance industry is a source of funds to pick up the pieces and carry on. In that way the industry provides an immediate benefit to society. But can insurers play an extended role in helping to reduce the risks for which they provide cover, to make society more resilient to the next disaster?

Insurers collect far more detailed and precise information on property damage than any other public sector or private organisation. Such claims data can provide deep insights into what determines damage – whether it’s the vulnerability of a particular building type or the fine scale structure of flood hazard.

While the data derived from claims experience helps insurers to price and manage their risk, it has not been possible to apply this data to reduce the potential for damage itself – but that is changing.

At a recent Organisation for Economic Co-operation and Development meeting in Paris on flood risk insurance we discussed new initiatives in Norway, France and Australia that harness and apply insurers’ claims experience to inform urban resilience strategies.

Norwegian Claims Data Reduces Flood Risk

In Norway the costs of catastrophes are pooled across private insurance companies, making it the norm for insurers to share their claims data with the Natural Perils Pool. Norwegian insurers have collaborated to make the sharing process more efficient, agreeing a standardized approach in 2008 to address-level exposure and claims classifications covering all private, commercial and public buildings. Once the classifications were consistent it became clear that almost 70% of flood claims were driven by urban flooding from heavy rainfall.

Starting with a pilot of ten municipalities, including the capital Oslo, a group funded by the Norwegian finance and insurance sector took this address-level data to the city authorities to show exactly where losses were concentrated, so that the city engineer could identify and implement remedial actions: whether through larger storm drains or flood walls. As a result flood claims are being reduced.
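The analytical core of such a programme is simple claims aggregation: group address-level claims, find the repeat offenders, and rank them by total payout. A minimal sketch (the addresses, causes, and amounts are invented, and this is not the Norwegian pool's actual data schema):

```python
from collections import defaultdict

# Hypothetical claims records: (address, cause of loss, amount paid in NOK)
claims = [
    ("Storgata 1, Oslo", "urban_rainfall", 250_000),
    ("Storgata 1, Oslo", "urban_rainfall", 310_000),
    ("Elvegata 9, Oslo", "river_flood", 120_000),
    ("Storgata 1, Oslo", "urban_rainfall", 90_000),
]

def loss_concentrations(claims, min_claims=2):
    """Addresses with repeated claims, ranked by total paid —
    the candidates a city engineer would target for remedial works."""
    totals = defaultdict(lambda: [0, 0.0])  # address -> [count, total paid]
    for address, _cause, paid in claims:
        totals[address][0] += 1
        totals[address][1] += paid
    hotspots = {addr: (n, total) for addr, (n, total) in totals.items()
                if n >= min_claims}
    return sorted(hotspots.items(), key=lambda kv: kv[1][1], reverse=True)

print(loss_concentrations(claims))
```

In practice the hard part is not the aggregation but the consistent address-level classification agreed in 2008, without which claims from different insurers could not be pooled at all.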

French Observatory Applies Lessons Learned from Claims Data

Another example is from France, where natural catastrophe losses are refunded through the national ‘Cat Nat’ system. Property insureds pay an extra 12% premium to be covered. All the claims data generated in this process now gets passed to the national Observatory of Natural Risks, set up after Storm Xynthia in 2010. This unit employs the data in forensic investigations to identify what can be learnt from the claims, and then works with municipalities to see how to apply these lessons to reduce future losses. The French claims experience is not as comprehensive as Norway’s because data only gets collected when the state declares a ‘Cat Nat’ event – which excludes some of the smaller, local losses that fail to reach the threshold of political attention.

Australian Insurers Forced Council to Act on Their Claims Data

In Australia, sharing claims data with a city council was the result of a provocative action by insurers frustrated by the political pressure to offer universal flood insurance following the major floods in 2011. Roma, a town in Queensland, had been inundated five times in six years – insurers mapped and published the addresses of the properties that had been repeatedly flooded and refused to renew their insurance cover unless action was taken. The insurers’ campaign achieved its goal, pressuring the local council to fund flood alleviation measures across the town.

These examples highlight how insurers can help cities identify where their investments will accomplish the most cost-effective risk reduction. All that’s needed is an appetite to find ways to process and deliver claims data in a format that provides the key insights that city bosses need, without compromising concerns around confidentiality or privacy.

This is another exciting application in the burgeoning new field of resilience analytics.

The Rising Cost of Hurricanes – and America’s Ability to Pay

Future hurricanes are going to cost the U.S. more money and, if we don’t act to address this, it will leave the government struggling to cope. That is the finding of a recent Congressional Budget Office (CBO) report which put it starkly:

“…over time, the costs associated with hurricane damage will increase more rapidly than the economy will grow. Consequently, hurricane damage will rise as a share of gross domestic product (GDP)…”

The CBO identified two core drivers of the escalating costs: climate change, which accounts for just under half of the potential increase in hurricane damages, and coastal development, which accounts for just over half. The four variables with the most impact were identified as:

  • changes in sea levels for different U.S. states;
  • changes in the frequency of hurricanes of various intensities;
  • population growth in coastal areas; and
  • per capita income in coastal areas.
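One way to see how these four variables interact is a toy decomposition: damage grows with the hazard (storm frequency, sea level) and with coastal exposure (population, income), while the denominator, GDP, grows with the whole economy. All growth rates and the base share below are illustrative assumptions, not the CBO's figures:

```python
def projected_damage_share(base_share, years, freq_growth=0.01,
                           sea_level_factor=1.3, pop_growth=0.012,
                           income_growth=0.018, gdp_growth=0.02):
    """Toy projection of hurricane damage as a share of GDP.
    The share rises whenever hazard * exposure outpaces the economy."""
    hazard = (1 + freq_growth) ** years * sea_level_factor
    exposure = (1 + pop_growth) ** years * (1 + income_growth) ** years
    economy = (1 + gdp_growth) ** years
    return base_share * hazard * exposure / economy

today = 0.0016                               # illustrative: 0.16% of GDP
in_60 = projected_damage_share(today, 60)    # share six decades out
print(in_60)
```

Even with modest per-variable growth rates, compounding in the numerator makes the share of GDP rise markedly, which is the CBO's central point.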

Using Catastrophe Models to Calculate the Future Cost of Hurricanes

To inform the CBO’s research and creation of a range of possible hurricane scenarios based on future changes to the four key variables, RMS hurricane and storm surge risk experts provided the CBO with data from the RMS North Atlantic Hurricane Model and Storm Surge Model.

Through RMS’ previous work with the Risky Business Initiative we were able to provide state-specific “damage functions,” which were used to translate possible future hurricane events, state-specific sea levels, and current property exposure into expected damage. While we usually produce loss estimates for catastrophes, we didn’t provide the CBO with estimated losses ourselves – rather, we built a tool so the CBO could “own” its own assumptions about changes in all the factors – a critical aspect of the CBO’s need to remain impartial and objective.
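A damage function of the kind described can be sketched as follows; the curves, thresholds, and numbers here are invented for illustration and are not the RMS damage functions provided to the CBO:

```python
def wind_damage_ratio(peak_gust_mph):
    """Toy vulnerability curve: share of property value lost to wind."""
    if peak_gust_mph < 75:
        return 0.0
    return min(1.0, ((peak_gust_mph - 75) / 100.0) ** 2)

def surge_damage_ratio(surge_depth_m, sea_level_rise_m=0.0):
    """Toy curve: surge damage scales with inundation depth,
    and future sea level rise adds to that depth directly."""
    depth = surge_depth_m + sea_level_rise_m
    return min(1.0, max(0.0, depth / 3.0))

def expected_damage(exposure_usd, peak_gust_mph, surge_depth_m, sea_level_rise_m):
    """Translate one hurricane event plus exposure into expected damage."""
    ratio = min(1.0, wind_damage_ratio(peak_gust_mph)
                + surge_damage_ratio(surge_depth_m, sea_level_rise_m))
    return exposure_usd * ratio

# Same storm, same exposure: sea level rise alone raises the expected damage
now = expected_damage(1e9, 125, 1.0, 0.0)
future = expected_damage(1e9, 125, 1.0, 0.5)
```

The design point this illustrates is the one in the text: handing the CBO the functions, rather than finished loss numbers, lets the CBO vary the sea level and frequency inputs itself and so own its assumptions.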

Solutions to Increase Coastal Urban Resilience

The CBO’s report includes suggested policies that could decrease the pressure on federal spending. The policies range from global initiatives to limit greenhouse gas emissions to more direct mechanisms that could shift costs to state and local governments and private entities, as well as investment in structural changes to reduce vulnerabilities. Such approaches bring to the forefront the role of local resilience in tackling a global problem.

However, therein lies the challenge. Many of the options open to society to increase resiliency against catastrophes could have a less positive effect on the economy. It’s an issue that has been central to the wider debate about reducing the impacts of climate change. Limiting greenhouse gas emissions has direct effects on the oil and gas industry. Likewise, curbing coastal development impacts developers and local economies. It has led states such as North Carolina to ban the use of future sea level rise projections as the basis for policies on coastal development.

Overcoming Political Resistance

Creating resiliency in U.S. towns and communities needs to be a multi-faceted effort. While initiatives to fortify the building stock and curb global climate change and sea level rise are moving ahead, there is strong resistance from the political arena. To overcome the resistance, solutions to transition the economy to new forms of energy must be found, as well as ways to adapt the current workforce to the jobs of the future. City leaders and developers should partner on sustainable initiatives for urban growth, to ease fears that coastal cities will wither and die under new coastal use restrictions.

Initiating these conversations will go a long way to removing the barriers to success in curbing greenhouse gas emissions and limiting coastal growth. With an already polarised political debate on climate change this CBO report may provoke further controversy about how to deal with the factors behind the increase in future hurricane damage costs. Though one conclusion is inescapable: catastrophe losses are going up and we will all be footing the bill.

This post was co-authored by Paul Wilson and Matthew Nielsen.

Matthew Nielsen

Senior Director of Global Governmental and Regulatory Affairs, RMS

Matthew Nielsen leads Governmental and Regulatory Affairs. He is responsible for maintaining relationships with regulators, legislators, and rating agencies on behalf of the company to establish open channels of communication around RMS models and solutions. Matthew is a meteorologist and geographer with extensive experience in North American catastrophe risk. In his prior role at RMS, he was responsible for developing the RMS climate peril models for the Americas, including the severe convective storm, winter storm, flood, and hurricane models. He has conducted field reconnaissance for major catastrophes including Hurricanes Katrina and Sandy. Before joining RMS, Matthew conducted remote sensing in satellite meteorology research at the Cooperative Institute for Research in the Atmosphere (CIRA). He holds a BS in physics from Ripon College, where he won the Henry Knop Award in Physics, and an MS in atmospheric science from Colorado State University. Matthew is a member of the American Meteorological Society (AMS), the International Society of Catastrophe Managers (ISCM), and the American Association of Geographers (AAG).

Exceedance 2016: Welcome Back to Miami!

We are back in sunny Miami, FL for Exceedance 2016 and ready for a week of engaging sessions, invigorating discussion, and plenty of networking opportunities.

If you’re joining us again here at the Fontainebleau Hotel, meet us in the Fleur De Lis Ballroom tonight at 5:30 p.m. for the Welcome Reception. If you were unable to make it this year, follow the highlights as we share on Twitter and LinkedIn, and here on the RMS Blog for #Exceedance news, insights, and photos.

Over the course of three days we have more than 60 sessions across six different tracks, so there is no shortage of thought-provoking content and discussions to be had. Download the mobile app to help you manage your schedule and maximize your week.

Here are a few highlights as you plan out your week:

This year, our lineup of keynote speakers includes:

  • Professor Bruce Hoffman, terrorism and security expert
  • Tim Jarvis, environmental scientist, author, adventurer
  • Matt Olsen, President and Co-founder, IronNet Cybersecurity
  • Hemant Shah, RMS Co-founder and CEO
  • Robert Muir-Wood, RMS Chief Research Officer
  • Emily Paterson, RMS Event Response Lead
  • Mark Powell, VP and Founder, HWind
  • Emily Grover-Kopec, VP, Model Product Strategy
  • Arno Hilberts, Senior Director, Global Flood Models
  • Shree Khare, Senior Director, Asia Models
  • Chris Folkman, Director, Marine and Terrorism Models
  • Tom Harvey, Product Manager, Cyber Models

The Lab: During breakfast and lunch, be sure to stop by The Lab to meet RMS experts and learn the latest about RiskAssessor, RiskLink® version 16, Hosting Plus, and much more. Looking for some hands-on exercise? Join us to assemble 50 partially built bikes for donation to several Miami-based charities.

“EP” – The Exceedance Party: This year’s EP will be a vision in white, inspired by retro Miami and Fontainebleau’s heyday. Join us in Glimmer Ballroom to show off your dance moves to a five-piece band and DJ while enjoying specialty cocktails, lively conversations, delicious bites, a candy bar, photo booth, and more!

We’re excited to see you in Miami and are looking forward to a great week ahead!

Can Flood Walls Reduce Resilience?

In early December 2015 Storm Desmond hit, bringing an “atmospheric river” to the northwest of England, its headwaters snaking back to the Caribbean. It broke the U.K.’s 24-hour rainfall record, with 341.1mm of rain recorded in Cumbria.

Just three weeks later, while a great anticyclone remained locked in place over central Europe and the atmospheric flows had only shifted south by 150km, Storm Eva arrived. The English counties of Lancashire and Yorkshire were drenched during December 26th, and the media was once more overwhelmed with flood scenes—streets of Victorian-era houses inundated by 30-40cm of slow-moving water.

Journalists soon turned their attention to the failure of flood protections in the affected regions. In one interview in Carlisle, a beleaguered Environment Agency representative commended their defenses for not having failed—even when they had been overtopped. If the defenses had failed, maybe the water would not have ponded for so long.

The call for “resilience”?

The call has gone out worldwide for improved “resilience” against disasters. As outlined by the UN Secretary General’s Climate Resilience Initiative, resilience is defined as the ability to “Anticipate, Absorb and Reshape” or “A2R”.

How did the U.K.’s flood defenses match up to these criteria in December? Well, as for the two “A”s in A2R, the residents of Carlisle did not anticipate any danger, thanks to the £38 million spent on flood defenses since the last time Carlisle had a “1 in 200 year” flood in January 2005 (which hit 1,900 properties). And the only thing the houses of Carlisle were absorbing on the first weekend in December was the flood water seeping deep into their plaster, electricals, and furnishings. As for “reshaping”, beyond the political recriminations, now is the time for some serious thinking about what constitutes resilience in the face of floods.

A flood wall is not the same as resilience. Resilience is about the capacity to recover quickly from difficulties, to bounce back from adversity. Organizations such as the UK’s Environment Agency may be good at building flood defenses, but not so proficient at cultivating resilience.

A flood wall can certainly be part of a culture of resilience—but only when accompanied by regular evacuation drills, a flood warning system, and recognition that despite the flood wall, people still live in a flood zone. Because flood walls effectively remove the lesser, more frequent floods, the small risk reminders go away.

A growing reliance on the protection provided by flood walls may even cause people to stop believing that they live in a flood plain at all, and think that the risk has gone to zero, whether this is in New Orleans, Central London or Carlisle.

Even when protected by a flood wall, residents of river flood plains should be incentivized, through grants and reduced insurance rates, to make their houses resistant to water: tiling walls and floors and raising electrical fittings. They should have plans in place—such as being ready to carry their furniture to an upper floor in the event of an alert—as one day, in all probability, their houses will flood.

Given the U.K.’s recent experience we should be asking: are people becoming more resilient to their flood risks? It sometimes seems that the more we build flood walls, the less resilient we become.

Risk, Models, and Innovations: It’s All Interconnected

A few themes came through loud and clear during this morning’s keynote sessions at Exceedance 2015.

RMS’ commitment to modeling innovation was unmistakable. As RMS co-founder and CEO Hemant Shah highlighted on stage, RMS worked hard and met our commitment to release RiskLink version 15 on March 31, taking extra measures to ensure the quality of the product.

Over the past five years, RMS has released 210 model upgrades and 35 new models. With a 30% increase in model development resources over the last two years and 10 HD models in various stages of research and development, RMS has the most robust model pipeline in its history.

As Paul Wilson explained, HD models are all about providing better clarity into the risk. They represent more precisely the way physical damage results in a (re)insurance loss, with a more rigorous treatment of the propagation of uncertainty through the model, and are designed to handle losses as closely as possible to the way claims occur in real life.
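The point about propagating uncertainty can be illustrated with a toy Monte Carlo simulation: because policy terms such as deductibles and limits are nonlinear, applying them to a distribution of possible damage gives a different answer than applying them to the mean damage. All figures below are invented, and this is a sketch of the general principle, not of the HD model framework itself:

```python
import random

def insured_loss(ground_up, deductible, limit):
    """Apply simple policy terms to a ground-up damage amount."""
    return min(max(ground_up - deductible, 0.0), limit)

def simulate_event_loss(mean_damage, cv, deductible, limit, n=100_000, seed=1):
    """Propagate damage uncertainty through policy terms by simulation.
    Terms are applied per sample, not to the mean, capturing their nonlinearity."""
    rng = random.Random(seed)
    sigma = cv * mean_damage
    total = 0.0
    for _ in range(n):
        damage = max(0.0, rng.gauss(mean_damage, sigma))
        total += insured_loss(damage, deductible, limit)
    return total / n

# Applying the terms to the mean damage alone misses the upside tail
# that breaches the deductible, understating the expected insured loss.
naive = insured_loss(1_000_000, 800_000, 2_000_000)
mc = simulate_event_loss(1_000_000, 0.5, 800_000, 2_000_000)
print(naive, mc)
```

The gap between the two numbers is exactly the kind of error a model makes when uncertainty is collapsed too early in the calculation chain.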

HD models are the cornerstone of the work RMS is doing in model development right now. HD models represent the intersection of RMS research, science and technology. With HD models we are not limited by software – we can approach the challenge of modeling risk in exciting new ways.

And it’s more than just the models – RMS is committed to transparency, engagement, and collaboration.

RMS’ commitment to RMS(one) was also clear. Learning from the lessons of the past year, RMS is developing an open platform that’s not just about enabling RMS to build its own models. It’s an exposure and risk management platform that enables clients and partners to build models. It’s about analytics, dynamic risk management, and more.

RMS(one) will be released, judiciously and fully matured, in stages over the next 15 months, starting with a model evaluation environment for our first HD model, Europe Flood, in autumn 2015.

And, Hemant emphasized that starting later this calendar year, RMS will open the platform to its clients and partners with the Platform Development Kit (PDK).

In addition, RMS(one) pricing will be built around three core principles:

  • Simple, predictable packages
  • In most cases, no additional fees for clients who simply want continuity in their RMS modeling relationships
  • Clearly differentiated high-value packages at compelling prices for those who wish to benefit from RMS(one) beyond its replacement as a superior modeling utility to RiskLink

The overall goal of RMS’ commitment to modeling and technology innovation is to capitalize on a growing and ever-changing global (re)insurance market, ultimately building a more resilient global society. RMS is working with industry clients and partners to do so by understanding emerging risks, identifying new opportunities to insure more risk, developing new risk transfer products, and creating new ways of measuring risk.

As Ben Brookes said, we only have to look at the recent events in Nepal to understand that there are huge opportunities – and needs – to improve resilience and the management of risk. RMS’ work on MetroCat, a catastrophe bond designed specifically to protect the New York MTA’s infrastructure against storm surge, showed the huge potential for developing alternative methods of risk transfer to improve resilience.

And during his session, Daniel Stander pointed out that only 1.9% of the global economy is insured. As the world’s means of production shifts from assets to systems, RMS is working to understand systems of risk, starting with marine, supply chain, and cyber risk, and tackling tough questions such as:

  • What are the choke points in the global shipping network, and how do they respond under stress?
  • How do various events create ripple effects that impact the global supply chain? For example, why did the Tohoku earthquake and tsunami in Japan cause a shortage of iPads in Australia, halt production at BMW in Germany, and fuel a manufacturing boom in Guangzhou?
  • How do we measure cyber risk when technology has become so critical that it is systemically important to the global economy?
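The first of these questions lends itself to a simple illustration. One common way to look for choke points in a transport network is to count how often each node sits on the shortest route between other pairs of nodes (a form of betweenness centrality). The sketch below uses a tiny, entirely hypothetical port graph; the port names and connections are illustrative, not RMS data.

```python
from collections import deque

# Hypothetical toy port network (names and links are illustrative only)
graph = {
    "Shanghai": ["Singapore"],
    "Singapore": ["Shanghai", "Suez", "Rotterdam"],
    "Suez": ["Singapore", "Rotterdam"],
    "Rotterdam": ["Singapore", "Suez", "NewYork"],
    "NewYork": ["Rotterdam"],
}

def shortest_path(g, src, dst):
    """BFS shortest path between two ports."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in g[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

def choke_scores(g):
    """Count how often each port appears as an intermediate stop
    on shortest routes between all other port pairs."""
    scores = {p: 0 for p in g}
    ports = list(g)
    for i, a in enumerate(ports):
        for b in ports[i + 1:]:
            path = shortest_path(g, a, b)
            if path:
                for p in path[1:-1]:
                    scores[p] += 1
    return scores

scores = choke_scores(graph)
print(scores)  # Singapore and Rotterdam dominate: remove either and routes break
```

In this toy example, Singapore and Rotterdam carry every indirect route, so stress on either (a port closure, say) would disrupt most of the network; real analyses would weight the graph by cargo volume and route capacity.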


Leaving the keynotes, a clear theme rang true: as the world becomes more interconnected, it is the intersection of innovation in science and technology that will enable us to scale and solve global problems head on.

Measuring Disaster Risk for Global UN Goals

A dispiriting part of the aftermath of a disaster is hearing about the staggering number of deaths and seemingly insurmountable economic losses. Many of the disaster risk reduction programs that implement disaster prevention and preparedness capabilities are helping to create more resilient communities. These worthwhile programs require ongoing financing, and their success must be measured and evaluated to continue to justify the allocation of limited funds.

Two global UN frameworks are being renewed this year, and both will run for 15 years. This is the first time explicit numerical targets have been set around disaster risk, so there is now a more pressing need to measure the progress of disaster risk reduction programs and ensure the goals are being achieved.

The most obvious way to measure the progress of a country’s disaster risk reduction would be to observe the number of deaths and economic losses from disasters.

However, as the insurance industry learned in the early 1990s, this approach presents serious data-sampling problems. A few years or even decades of catastrophe experience do not reveal the true level of risk in a country or region, because catastrophes have a huge and volatile range of outcomes. An evaluation based purely on observed deaths or losses can give a misleading impression of success or failure if a country or region happened to avoid (or to suffer) severe disaster events during the period measured.

A good example is the 2010 Haiti earthquake, which claimed more than 200,000 lives and cost more than $13 billion. Yet for more than 100 years prior to this devastating event, earthquakes in Haiti had claimed fewer than 10 lives.

Haiti shows that it is simply not possible to determine the true level of risk from 15 years of observations for a single country. Even in worldwide data, a handful of events dominate disaster mortality, making progress impossible to measure from experience alone.

Global disaster-related mortality rate (per million global population), 1980–2013 (From Setting, measuring and monitoring targets for disaster risk reduction: recommendations for post-2015 international policy frameworks. Source: adapted from www.emdat.be)

A more reliable way to measure the progress of disaster risk reduction programs is to use probabilistic methods, which draw on a far more extensive range of possibilities by simulating tens of thousands of catastrophic events. These simulations can then be combined with data on exposure and vulnerability to produce metrics of specific interest for disaster risk reduction, such as houses damaged or lives lost. Such metrics can be used to:

  • Measure disaster risk in a village, city, or country and how it changes over time
  • Analyze the cost-benefit of mitigation measures:
    • For a region: For example, the average annual savings in lives due to a flood defense or earthquake early warning system
    • For a location: For example, choosing which building has the biggest reduction in risk if retrofitted
  • Quantify the impact of climate change and how these risks are expected to vary over time

In the long term, probabilistic catastrophe modeling will be an important way to ensure improved measurement and, therefore, management of disaster risk, particularly in countries and regions at greatest risk.

The immediate focus should be on educating government bodies and NGOs about the value of probabilistic methods. For the 15-year frameworks being renewed this year, serious consideration should be given to how a useful and practical probabilistic method of measuring progress in disaster risk reduction could be implemented, for example by using hazard maps. See here for further recommendations: http://www.preventionweb.net/english/professional/publications/v.php?id=39649

2015 is an important year for measuring disaster risk: let’s get involved.

RMS and 100 Resilient Cities at the Clinton Global Initiative

I’ve just returned from the Clinton Global Initiative (CGI) annual meeting in New York. Every September, political, corporate, and non-profit leaders from around the world gather to discuss pressing challenges, form partnerships, and make commitments to action. It was inspiring to see the tremendous work already being done and the new commitments being made to address a diverse and wide range of issues, from containing the Ebola epidemic, to increasing access to education, to combatting climate change, and helping Haiti develop a self-sustaining economy.

One prevailing theme at the event this year was the importance of cross-sector partnerships in successfully tackling such complex issues. Not surprisingly, data crunched by the CGI team on commitments made over the past 10 years shows that partnerships succeed at a markedly higher rate than go-it-alone approaches.

In this spirit, we announced an RMS commitment last week to partner with the Rockefeller Foundation’s 100 Resilient Cities initiative to help increase the resilience of cities around the world. We will be making our RMS(one) platform and our catastrophe models available to cities in the 100RC network so that they can better understand their exposures, assess risk from catastrophic events and climate change, and prioritize investments in mitigating and managing that risk.

As the saying goes, “if you can measure it, you can manage it.” From our 25 years of experience helping the insurance industry better measure and then manage catastrophe risk, we believe there is a largely untapped opportunity for the public sector to similarly leverage exposure management and catastrophe modeling technology to establish more informed policies for managing risk and increasing resilience in cities throughout the world, both in developed and emerging economies.

It was also clear this week that the conversation in corporate boardrooms is increasingly moving from being focused solely on the financial bottom line to also having a positive impact on the world in a way that is strategically aligned with the core mission of the business.

Our partnership with 100RC, along with the partnerships with the UNISDR and the World Bank that we announced this summer, is another step in our own version of this journey. Through our direct philanthropic support of Build Change and its admirable work to improve construction practices in developing countries, and through leveraging our technology and the expertise of our colleagues to help the public sector, we are aligning all of our activities in support of our core mission: to increase the resilience of our society.

Many of our clients have shared with us that they are on similar journeys, building on traditional support for local organizations to implement more strategic programs with broader impact and employee engagement. In particular, the insurance industry is uniquely positioned to understand the value of proactively investing in mitigation and in increasing resilience, instead of waiting until a tragedy has occurred and all that can be done is to support humanitarian response efforts.

With this common frame of reference, we look forward to increasingly partnering with our clients in the coming years not just to help them manage their own risk but to collectively help increase resilience around the world.