Tag Archives: catastrophe modeling

Prudential Regulation Authority on the Challenges Facing Cyber Insurers

Most firms lack clear strategies and appetites for managing cyber risk, with a shortage of cyber domain knowledge noted as a key area of concern. So said the Prudential Regulation Authority, the arm of the Bank of England which oversees the insurance industry, in a letter to CEOs last week.

This letter followed a lengthy consultation with a range of stakeholders, including RMS, and identified several key areas where insurance firms could and should improve their cyber risk management practices. It focussed on the two distinct types of cyber risk: affirmative and silent.

Affirmative cover is explicit cyber coverage, either offered as a stand-alone policy or as an endorsement to more traditional lines of business. Silent risk is where cover is provided “inadvertently” through a policy that was typically never designed for it. But this isn’t the only source of silent risk: it can also leak into policies where existing exclusions are not completely exhaustive. A good example is policies with NMA 2914 applied, which excludes cyber losses except where a cyber-attack causes physical damage (e.g. by fire or explosion).

The proliferation of this silent risk across the market is highlighted as one of the key areas of concern by the PRA. It believes this risk is not only material but likely to increase over time, with the potential to cause losses across a wide range of classes – a sentiment we at RMS would certainly echo.

The PRA intervention shines a welcome spotlight and adds to the growing pressure on firms to do more to improve their cyber risk management practices. These challenges facing the market have been an issue for some time, but how do we help the industry address them?

The PRA suggests firms with cyber exposure should have a clearly defined strategy and risk appetite owned by the board and risk management practices that include quantitative and qualitative elements.

At RMS our cyber modeling has focussed on providing precisely this insight, helping many of the largest cyber writers to quantify both their silent and affirmative cyber risk, thus allowing them to focus on growing cyber premiums.

If you would like to know more about the RMS Cyber Accumulation Management System (released February 2016), please contact cyberrisk@rms.com.

Shrugging Off a Hurricane: A Three Hundred Year Old Culture of Disaster Resilience

If a global prize was to be awarded to the city or country that achieves the peak of disaster resilience, Bermuda might be a fitting first winner.

This October’s Hurricane Nicole made direct landfall on the island. The eyewall tracked over Bermuda with maximum measured windspeeds close to 120 mph. Nonetheless there were no casualties. The damage was principally fallen trees, roadway debris, some smashed boats, and many downed utility poles. The airport reopened within 24 hours, with the island’s ferries operating the following day.

Bermuda’s performance through Nicole was exemplary. What’s behind that?

Since its foundation in 1609, when 150 colonists and crew were shipwrecked on the island, Bermuda has got used to its situation at the heart of hurricane alley. The island comprises 21 square miles of reef and lithified dunes, sitting out in the Atlantic some 650 miles east of Cape Hatteras, and a hurricane hits it on average once every six or seven years. Mostly these are glancing blows, but once or twice a century Bermuda sustains a direct hit at Category 3 or 4 intensity. Hurricane Fabian in 2003 was the worst of the recent storms, causing $300 million of damage (equivalent to an estimated $650 million today, accounting for higher prices and greater property exposure). The cost of the damage from Hurricane Gonzalo in 2014 was about half this amount.

How did Bermuda’s indigenous building style come to adopt such a high standard of wind resistance? It seems to go back to a run of four hurricanes at the beginning of the 18th century. First, in September 1712, a hurricane persisted for eight hours, destroying the majority of wooden buildings. Then twice in 1713, and again more strongly in 1715, hurricane winds ruined the newly rebuilt churches. One hurricane can seem like an exception; four becomes a trend. In response, houses were constructed with walls of massive reef limestone blocks, covered by roofs tiled with thick slabs of coral stone: traditional house styles that have been sustained ever since.

The frequency of hurricanes has helped stress test the building stock, and ensure the traditional construction styles have been sustained. More recently there has been a robust and well-policed building code to ensure adequate wind resistance for all new construction on the island.

Yet resilience is more than strong buildings. It also requires hardened infrastructure, and that is where Bermuda has some room for improvement. The island still depends on overhead power lines, and 90 percent of its 27,000 houses lost power in Hurricane Nicole – although half of these had been reconnected by the following morning and the remainder through that day. Mobile phone and cable networks were also back in operation over a similar timescale. Experience of recent hurricanes has ensured an adequate stockpile of cable and poles.

Expert Eyes on the Island

It helps that there is an international reinsurance industry on the island, with many specialists in the science of hurricanes and the physics and engineering of building performance on hand to scrutinize the application of improved resilience. Almost every building is insured, giving underwriters oversight of building standards. Most importantly, the very functioning of global reinsurance depends on uninterrupted connection with the rest of the world – and on ensuring that on-island staff are not distracted by having to attend to their families’ welfare.

Bermuda’s experience during Nicole would merit the platinum standard of resilience adopted by the best businesses: that all functions can be restored within 72 hours of a disaster. The Bermuda Business Development Agency and the Association of Bermuda Insurers and Reinsurers were fulsome in their praise for how the island had withstood the hurricane. The strong and widely-owned culture of preparedness reflects the experience of recent storms like Gonzalo and Fabian.

Stephen Weinstein, general counsel at RenaissanceRe, commented “It’s remarkable that one day after a major hurricane strike, Bermuda is open for business, helping finance disaster risk worldwide, and poised to welcome back business visitors and vacationers alike.”

In early 2017, RMS will issue an update to Bermuda wind vulnerability in the version 17 software release as part of a broader update to the 33 islands and territories covered by the North Atlantic Hurricane Models. Updates to Bermuda vulnerability will consider past hurricane observations and the latest building code research.

Terrorism Insurance Under a Trump Presidency

It is likely that very few of the 60 million U.S. citizens who voted for Donald Trump did so because of his stance on terrorism insurance – if only because terrorism insurance is too arcane an issue to have come up in the presidential debates. However, many of the nation’s wavering voters may have been swayed by his pledge to make America safer from the scourge of terrorism. Under his presidency, border security will surely be tightened – even if no frontier wall is ever built – and changes made to entry decisions for Syrian Muslim refugees into the United States.

Reauthorization of TRIA – Talks Start in 2018

On January 12, 2015, the Terrorism Risk Insurance Program Reauthorization Act of 2015 was signed into law by President Obama. This third extension of the original 2002 Terrorism Risk Insurance Act (TRIA) will sunset at the end of 2020, coinciding with the end of the first term of the Trump presidency. In the drafting of the 2015 reauthorization bill, detailed consideration was given by the House Financial Services Committee to alternative wordings that would have reduced the coverage provided by the U.S. government insurance backstop. One such alternative would have focused U.S. government involvement in the terrorism insurance market on covering terrorism losses from extreme attacks using weapons of mass destruction. When the future of terrorism risk insurance is raised once more on Capitol Hill in 2018, the Republican White House and Congress are likely to seek to further extend the private terrorism insurance market – though I consider this contingent on President Trump keeping his pledge to keep America safe until then.

Balancing Civil Liberties in the Face of Reducing Terrorism Risk

In the democracies of the western alliance, the balance between keeping people safe from terrorism and preserving civil liberty is a much-debated issue. After the July 2005 London Transport bombings, the head of the British security service, MI5, warned that ‘there needs to be a debate on whether some erosion of civil liberties may be necessary to improve the chances of our citizens not being blown apart as they go about their daily lives’. On a national scale across America, a similar debate was prevalent during the 2016 U.S. presidential election. It may seem that in this instance, the champion of civil liberties, minority rights, and political correctness lost to the conservative advocate of oppressive counter-terrorism action and profiling of terrorist suspects.

Regardless of who occupies the White House, however, terrorist plots against the U.S. will persist, and terrorists must be stopped before they move to their attack targets. Success in interdicting these plots depends crucially on intelligence gathered from electronic surveillance. It is well documented that more intrusive surveillance can increase the chances of lone wolf plots being stopped. And President-elect Trump has already affirmed his readiness to authorize more surveillance. He can claim a public mandate for this: for America to be great again, it has to be safe again – even from lone wolf terrorist plots. After the Orlando nightclub attack on June 12, 2016, perpetrated by the radicalized son of an Afghan immigrant, Donald Trump said that ‘we cannot afford to be politically correct anymore’. And in fighting global Islamist extremism vigorously, he may be able to count on President Putin’s support. While the two leaders differ on geopolitics, their mutual respect may be maintained through abrasive counter-terrorism action.

When Michael Chertoff was appointed Secretary of Homeland Security, President George W. Bush told him not to let 9/11 happen again – and he didn’t. President-elect Trump will expect a similarly impressive clean sheet. On a more personal level, he also has a special interest in increased security against terrorist attacks. His own real estate empire includes some notable potential terrorist targets, including high-profile landmark buildings bearing his name. While security at the New York Stock Exchange is too tight for it to be an easy target, the Trump Building on Wall Street, by contrast, has easy public access. There are numerous opportunities for terrorist target substitution.

New Zealand Earthquake – Early Perspectives

On Monday 14 November 2016, Dr Robert Muir-Wood, RMS chief research officer, earthquake expert, and specialist in catastrophe risk management, made the following observations about the earthquake near Amberley:

“The November 13 earthquake was assigned a magnitude 7.8 by the United States Geological Survey. That makes it more than fifty times bigger than the February 2011 earthquake which occurred directly beneath Christchurch. However, it was still around forty times smaller than the Great Tohoku earthquake off the northeast coast of Japan in March 2011.”

“Although it was significantly bigger than the Christchurch earthquake, the source of the earthquake was further from major exposure concentrations. The northeast coast of South Island has a very low population and the earthquake occurred in the middle of the night when there was little traffic on the coast road. Characteristic of such an earthquake in steep mountainous terrain, there have been thousands of landslides, some of which have blocked streams and rivers – there is now a risk of flooding downstream when these “dams” break.

“In the capital city, Wellington, liquefaction and slumping on man-made ground around the port has damaged some quays and made it impossible for the ferry that runs between North and South Island to dock. The most spectacular damage has come from massive landslides blocking the main coast road, Highway 1, which is the overland connection from the ferry port opposite Wellington down to Christchurch. This will take months or even years to repair. Therefore it appears the biggest consequences of the earthquake will be logistical, with particular implications for any commercial activity in Christchurch that is dependent on overland supplies from the north. As long as the main highway remains closed, ferries may have to ship supplies down to Lyttelton, the main port of Christchurch.”

“The earthquake appears to have occurred principally along the complex fault system in the north-eastern part of the South Island, where the plate tectonic motion between the Pacific and Australian plates transfers from subduction along the Hikurangi Subduction Zone to strike-slip along the Alpine Fault System. Faults in this area strike predominantly northeast-southwest and show a combination of thrust and strike-slip motion. From its epicenter the rupture unzipped towards the northeast for about 100-140 km of the roughly 200 km to the capital city Wellington.”

“Given the way the rupture spread to the northeast there is some potential for a follow-on major earthquake on one of the faults running beneath Wellington. The chances of a follow-on major earthquake are highest in the first few days after a big earthquake, and tail off exponentially. Aftershocks are expected to continue to be felt for months.”

“These events occurred on multiple fault segments in close proximity to one another. The technology to model this type of complex rupture is now available in the latest RMS high-definition New Zealand Earthquake Model (2016), where fault segments may now interconnect under certain conditions.”

The Rise and Stall of Terrorism Insurance

In the 15 years since the terrorist attacks of September 11, 2001, partnerships between the public sector and private industries have yielded more effective security and better public awareness about the threat of terrorism. We may never come to terms with the sheer volume of human loss from that day and from the hundreds of attacks that continue every year. But we have achieved greater resilience in the face of the ongoing realities of terrorism – except when it comes to recovering from the catastrophic costs of rebuilding in its aftermath.

Terrorism insurance is facing a structural crisis: hundreds of terrorist attacks occur annually, but actual insurance payouts have been negligible. The economic costs of terrorism have skyrocketed, but demand for terrorism coverage has remained relatively flat. And despite a proliferation of catastrophe bonds and other forms of alternative capital flooding into the property insurance market, relatively little terrorism risk has been transferred to the capital markets. If terrorism insurance – and the insurers who provide it – are to remain relevant, they must embrace the new tools and data available to them to create more relevant products, more innovative coverages, and new risk transfer mechanisms that address today’s threat landscape.

The September 11, 2001 attacks rank among the largest insurance losses in history at $44 billion, putting them among catastrophes with severe losses such as Hurricane Katrina ($70 billion), the Tohoku earthquake and tsunami ($38 billion), and Hurricane Andrew ($25 billion).

But unlike natural catastrophes, whose damages span hundreds of kilometers, most of the 9/11 damages in New York were concentrated in an area of just 16 acres. Such extreme concentration of loss caused a crisis in the insurance marketplace and highlighted the difficulty of insuring against such a peril.

Following the September 11 attacks, most insurers excluded terrorism from their policies, forcing the U.S. government to step in and provide a backstop through the Terrorism Risk Insurance Act (2002). Terrorism insurance has become more cost effective as insurer capacity for terrorism risk has increased. Today there are an estimated 40 insurers providing it on a stand-alone basis, and it is bundled with standard property insurance contracts by many others.

But despite better data on threat groups, more sophisticated terrorism modeling tools, and increased transparency into the counter-terrorism environment, terrorism insurance hasn’t changed all that much in the past 15 years. The contractual coverage is the same – usually distinguishing between conventional and CBRN (chemical, biological, radiological, and nuclear) attacks. And terrorism insurance take-up remains minimal where attacks occur most frequently, in the Middle East and Africa, highlighting what policymakers refer to as an increasing “protection gap.”

Closing this gap – through new products, coverages, and risk transfer schemes – will enable greater resilience following an attack and promote a more comprehensive understanding of the global terrorism risk landscape.

How U.S. Inland Flood Became a “Peak” Peril

This article by Jeff Waters, meteorologist and product manager at RMS, first appeared in Carrier Management.

As the journey towards a private flood insurance market progresses, (re)insurers can learn a lot from the recent U.S. flood events to help develop profitable flood risk management strategies.

Flood is the most pervasive and frequent peril in the U.S. Yet, despite having the world’s highest non-life premium volume and one of the highest insurance penetration rates, a significant protection gap still exists in the U.S. for this peril.

It is well-known that U.S. flood risk is primarily driven by tropical cyclone-related events, with storm surge being the main cause. In the last decade alone, flooding from tropical cyclones has caused more than $40 billion (2015 USD) in insured losses and contributed to today’s massive $23 billion National Flood Insurance Program (NFIP) deficit: 13 out of the top 15 flood events, determined by total NFIP payouts, were related to storm surge-driven coastal flooding from tropical cyclones.

Inland flooding, however, should not be overlooked. It too can contribute a material portion of overall U.S. flood risk, as seen recently in the Southern Gulf, South Carolina, and West Virginia – all impacted by major loss-causing events. These catastrophes caused billions in economic and insured losses while demonstrating the widespread impact of precipitation-driven fluvial (riverine) or pluvial (surface water) flooding. It is these types of flooding events that should be accounted for and well understood by (re)insurers looking to enter the private flood insurance market.

It hasn’t just rained; it has poured

In the past 15 months the U.S. has suffered several record-breaking or significant rainfall-induced inland flood events ….

To read the article in full, please click here.

A Perennial Debate: Disaster Planning versus Disaster Response

In May we saw a historic first: the World Humanitarian Summit. Held in Istanbul, representatives of 177 states attended. One UN chief summarised its mission thus: “a once-in-a-generation opportunity to set in motion an ambitious and far-reaching agenda to change the way that we alleviate, and most importantly prevent, the suffering of the world’s most vulnerable people.”

And in that sentence we find one of the enduring tensions within the disaster field: between “prevention” and “alleviation.” Between on the one hand reducing disaster risk through resilience-building investments, and on the other reducing suffering and loss through emergency response.

But in a world of constrained political budgets, where should we concentrate our energies and resources: disaster risk reduction or disaster response?

How to Close the Resilience Gap

The Istanbul summit saw a new global network launched to engage business in crisis situations through “pre-positioning supplies, meeting humanitarian needs and providing resources, knowledge and expertise to disaster prevention.” It is, of course, prudent to have stockpiles of humanitarian supplies strategically placed.

But is the dialogue still too focused on response? Could we not have hoped to see a greater emphasis on driving the disaster-resilient behaviours and investments which reduce the reliance on emergency response in the first place?

Politics & Priorities

“Cost-effectiveness” is a concept with which humanitarian aid and governmental agencies have struggled over many years. But when it comes to building resilience, it is in fact possible to cost-justify the best course of action. After all, the insurance industry, piqued by the dual surprise of Hurricane Andrew and then the Northridge earthquake, has been using stochastic models to quantify and reduce catastrophe risk since the mid-1990s.

Unfortunately risk/reward analyses are rarely straightforward in practice. This is less a failing of the models to accurately characterise complex phenomena, though that certainly is a challenge. It’s more a question of politics.

It is harder for any government to argue that spending scarce public funds on building resilience in advance of a possible disaster is money well spent. By contrast, when disaster strikes and human suffering is writ large across the media, then there is a pressing political imperative to intervene. As a result many agencies sadly allocate more funds to disaster response than to disaster prevention, even though the analytics mostly suggest the opposite would be more beneficial.

A New, Ambitious Form of Public-Private Partnership

But there are signs that across the different strata of government the mood is changing. The cities of San Francisco and Berkeley, for example, have begun to use catastrophe models to quantify the cost of inaction and thereby drive risk-reducing investments. For San Francisco the focus has been on protecting the city’s economic and social wealth from future sea level rise. In Berkeley, resilience models have been deployed to shore up critical infrastructure against the threat of earthquakes.

In May, RMS held the first international workshop on how resilience analytics can be used to manage urban resilience. Attended by public officials from several continents, the workshop saw very high engagement in the topic.

The role of resilience analytics to help design, implement, and measure resilience strategies was emphasized by Arnoldo Kramer, the first Chief Resilience Officer (CRO) of the largest city in the western hemisphere, Mexico City. The workshop discussion went further than just explaining how these models can be used to quantify the potential, risk-adjusted return on investment from resilience initiatives. The group stressed the role of resilience metrics in helping cities finance capital investments in new, protective infrastructure.

Stimulated by commitments under the Sendai Framework to work more closely with the private sector, lower income regions are also increasingly benefiting from such techniques – not just to inform disaster response, but also to finance the reduction of disaster risk in the first place. Indeed there are encouraging signs that these two different worlds are beginning to understand each other better. At the inaugural working group meeting of the Insurance Development Forum in Singapore last month there was a productive dialogue between the UN Development Programme and the risk transfer industry. It was clear that both sides wanted action, not just words.

Such initiatives can only serve to accelerate the incorporation of resilience analytics into existing disaster risk reduction programmes. This may be a once-in-a-generation opportunity to address the shameful gap between the economic costs of natural disasters and the fraction of those costs that are insured.

We cannot prevent natural disasters from happening. But neither can we continue to afford to spend billions of dollars picking up the pieces when they strike. I am hopeful that we will take this opportunity to bring resilience analytics into under-served societies, making them tougher, more resilient, so that when catastrophe strikes, the impact is lessened and societies can bounce back far more readily.

Using Insurance Claims Data to Drive Resilience

When disaster strikes for homeowners and businesses the insurance industry is a source of funds to pick up the pieces and carry on. In that way the industry provides an immediate benefit to society. But can insurers play an extended role in helping to reduce the risks for which they provide cover, to make society more resilient to the next disaster?

Insurers collect far more detailed and precise information on property damage than any other public or private organisation. Such claims data can provide deep insights into what determines damage – whether it’s the vulnerability of a particular building type or the fine-scale structure of flood hazard.

While the data derived from claims experience helps insurers to price and manage their risk, it has not been possible to apply this data to reduce the potential for damage itself – but that is changing.

At a recent Organisation for Economic Co-operation and Development meeting in Paris on flood risk insurance we discussed new initiatives in Norway, France and Australia that harness and apply insurers’ claims experience to inform urban resilience strategies.

Norway Claims Data Improves Flood Risk Management

In Norway the costs of catastrophes are pooled across private insurance companies, making it the norm for insurers to share their claims data with the Natural Perils Pool. Norwegian insurers have collaborated to make the sharing process more efficient, agreeing a standardized approach in 2008 to address-level exposure and claims classifications covering all private, commercial and public buildings. Once the classifications were consistent it became clear that almost 70% of flood claims were driven by urban flooding from heavy rainfall.

Starting with a pilot of ten municipalities, including the capital Oslo, a group funded by the Norwegian finance and insurance sector took this address-level data to the city authorities to show exactly where losses were concentrated, so that the city engineer could identify and implement remedial actions: whether through larger storm drains or flood walls. As a result flood claims are being reduced.

French Observatory Applies Lessons Learned from Claims Data

Another example is from France, where natural catastrophe losses are refunded through the national ‘Cat Nat System’. Property insureds pay an extra 12% premium to be covered. All the claims data generated in this process now gets passed to the national Observatory of Natural Risks, set up after Storm Xynthia in 2010. This unit employs the data to perform forensic investigations to identify what can be learnt about the claims and then works with municipalities to see how to apply these lessons to reduce future losses. The French claims experience is not as comprehensive as in Norway because such data only gets collected when the state declares there has been a ‘Cat Nat event’ – which excludes some of the smaller and local losses that fail to reach the threshold of political attention.

Australian Insurers Forced Council to Act on Their Claims Data

In Australia, sharing claims data with a city council was the result of a provocative action by insurers frustrated by the political pressure to offer universal flood insurance following the major floods in 2011. Roma, a town in Queensland, had been inundated five times in six years – insurers mapped and published the addresses of the properties that had been repeatedly flooded and refused to renew the insurance cover unless action was taken. The insurers’ campaign achieved its goal, pressuring the local council to fund flood alleviation measures across the town.

These examples highlight how insurers can help cities identify where their investments will accomplish the most cost-effective risk reduction. All that’s needed is an appetite to find ways to process and deliver claims data in a format that provides the key insights that city bosses need, without compromising concerns around confidentiality or privacy.

This is another exciting application in the burgeoning new field of resilience analytics.

Calculating the cost of “Loss and Damage”

The idea that rich, industrialized countries should be liable for paying compensation to poorer, developing ones damaged by climate change is one that has been disputed endlessly at recent international climate conferences.

The fear among rich countries is that they would be signing a future blank check. And the legal headaches in working out the amount of compensation don’t bear thinking about when there are likely to be arguments about whether vulnerable states have done enough to protect themselves.

The question of who pays the compensation bill may prove intractable for some years to come. But the scientific models already exist to make the working out of that bill more transparent.

Some context: in the early years of climate negotiations there was a single focus—on mitigating (or limiting) greenhouse gas emissions. Through the 1990s it became clear that atmospheric carbon dioxide was growing just as quickly as before, so a second mission was added: “adaptation” to the effects of climate change.

Now we have a third concept: “Loss and Damage” which recognizes that no amount of mitigation or adaptation will fully protect us from damages that can’t be stopped and losses that can’t be recovered.

Sufficient self-protection?

The Loss and Damage concept was originally developed by the Association of Small Island States, whose members saw themselves on the front line of potential impacts from climate change, in particular around sea-level rise. By some projections at least four of the small island countries (Kiribati, Tuvalu, the Marshall Islands, and the Maldives) will be submerged by the end of this century.

Countries in such a predicament seeking compensation for their loss and damage will have to answer a difficult question: did they do enough to adapt to rising temperatures before asking other countries to help cover the costs? Rich countries will not look kindly on countries they deem to have done too little.

If money were no object, then adaptation strategies might seem limitless and nothing in the loss and damage world need be inevitable. Take sea level rise, for example. Even now in the South China Sea we see the Chinese government, armed with strategic will and giant dredgers, pumping millions of tons of sand so that submerged reefs can be turned into garrison town islands. New Orleans—a city that is 90% below sea level—is protected by a $14 billion flood wall.

But, clearly, adaptation is expensive and so the most effective strategies may be beyond the reach of poorer countries.

Calculating the cost with models

Through successive international conferences on climate change the legal and financial implications of loss and damage have seen diplomatic wrangling as richer and poorer nations argue about who’s going to foot the bill.

But we can conceptualize a scientific mechanism for tallying what that bill should be. It would need a combination of models to discriminate between costs that would have happened anyway and those that are the responsibility of climate change.

Firstly, we could use “attribution climate models” which run two versions of future climate change: one model is based on the atmosphere as it actually is in 2016 while the other “re-writes history” and supposes there’s been no increase in greenhouse gases since 1950.

By running these two models for thousands of simulation years we can see the difference in the number of times a particular climate extreme might happen. And the difference between them suggests how much that extreme is down to greenhouse gas emissions. After this we will need to model how much adaptation could have reduced loss and damage. An illustration:

  • A future extreme weather event might cause $100 billion damage.
  • Attribution studies show that the event has become twice as likely because of climate change.
  • Catastrophe models show the cost of the damage could have been halved with proper adaptation.
  • So the official loss and damage could be declared as $25 billion: half of the loss is attributable to climate change, and proper adaptation could have avoided half of that.
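The arithmetic of the illustration above can be sketched in a few lines. This is a toy calculation using the illustrative figures from the bullets; the function name and its inputs are our own labels, not part of any real attribution or catastrophe model:

```python
def loss_and_damage(event_loss, prob_ratio, adaptation_factor):
    """Estimate the official loss-and-damage bill for one event.

    event_loss        -- total damage from the event (USD)
    prob_ratio        -- how many times more likely the event has become
                         because of climate change (>= 1)
    adaptation_factor -- fraction of the damage that would have remained
                         even with proper adaptation (0..1)
    """
    # If the event is prob_ratio times more likely, the fraction of its
    # occurrence attributable to climate change is 1 - 1/prob_ratio.
    attributable_fraction = 1.0 - 1.0 / prob_ratio
    attributable_loss = event_loss * attributable_fraction
    # Only the damage that adaptation could not have prevented counts.
    return attributable_loss * adaptation_factor

# The illustration: a $100 billion event, twice as likely, damage halvable.
bill = loss_and_damage(100e9, prob_ratio=2.0, adaptation_factor=0.5)
print(f"${bill / 1e9:.0f} billion")  # → $25 billion
```

Note that if climate change has not changed the event's likelihood at all (prob_ratio of 1), the attributable fraction is zero and no loss-and-damage bill arises, which is the behaviour the mechanism requires.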

While hardly a straightforward accounting device it’s clear that this is a mechanism—albeit an impressively sophisticated one—that could be developed to calculate the bill for loss and damage due to climate change.

Leaving only the rather thorny question of who pays for it.

Learning More About Catastrophe Risk From History

In my invited presentation on October 22, 2015 at the UK Institute and Faculty of Actuaries GIRO conference in Liverpool, I discussed how modeling of extreme events can be smarter, from a counterfactual perspective.

A counterfactual perspective enables you to consider what has not yet happened, but could, would, or might have under differing circumstances. By adopting this approach, the risk community can reassess historical catastrophe events to glean insights into previously unanticipated future catastrophes, and so reduce catastrophe “surprises.”

The statistical foundation of typical disaster risk analysis is actual loss experience. The past cannot be changed and is therefore traditionally treated by insurers as fixed. The general attitude is: why consider varying what happened in the past? From a scientific perspective, however, actual history is just one realization of what might have happened, given the randomness and chaotic dynamics of nature. The stochastic analysis of the past, used by catastrophe models, is an exploratory exercise in counterfactual history, considering alternative possible scenarios.

Using a stochastic approach to modeling can reveal major surprises that may be lurking in alternative realizations of historical experience. To quote Philip Roth, the eminent American writer: “History, harmless history, where everything unexpected in its own time is chronicled on the page as inevitable. The terror of the unforeseen is what the science of history hides.”  All manner of unforeseen surprising catastrophes have been close to occurring, but ultimately did not materialize, and hence are completely absent from the historical record.
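The idea can be illustrated with a toy simulation (the loss figures and the lognormal severity perturbation are entirely hypothetical, standing in for a real stochastic catastrophe model): perturb the severity of a handful of "historical" events and count how often an alternative history contains a loss worse than the actual record.

```python
import random

# Toy counterfactual experiment. Each simulated "alternative history"
# perturbs the severity of the historical events; we count how often it
# contains a loss exceeding the worst actually observed.

historical_losses = [1.2, 0.4, 3.5, 0.9, 2.1]  # hypothetical losses, $bn
actual_worst = max(historical_losses)

random.seed(42)
n_sims, surprises = 10_000, 0
for _ in range(n_sims):
    # Scale each event's loss by a random factor, standing in for the
    # many ways the event could have unfolded differently.
    alternative = [loss * random.lognormvariate(0.0, 0.5)
                   for loss in historical_losses]
    if max(alternative) > actual_worst:
        surprises += 1

print(f"{100 * surprises / n_sims:.0f}% of alternative histories "
      f"contain a loss worse than the actual record")
```

Even in this crude sketch, a substantial share of the counterfactual histories turn out worse than the one that actually occurred – which is precisely why a record free of "surprises" is weak evidence that none were possible.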

Examples can be drawn from all natural and man-made hazards, covering insurance risks on land, sea, and air. A new domain of application is cyber risk: new surprise cyber attack scenarios can be envisaged with previous accidental causes of instrumentation failure being substituted by control system hacking.

The past cannot be changed—but I firmly believe that counterfactual disaster analysis can change the future and be a very useful analytical tool for underwriting management. I’d be interested to hear your thoughts on the subject.