Je Suis Charlie 2020

“Trump is a joke: why satire makes sense when politics doesn’t.” This was the title of a presentation at the McCourtney Institute for Democracy at Penn State on February 20, 2018. One of the cherished freedoms of democracy is satire. Since Saturday Night Live first aired on NBC in October 1975, all presidents from Ford to Trump, and other prominent public figures, have been targets of merciless satire. Censorship of satire is an affront to democracy. The murder of satirists is terrorism.

Five years ago, on Wednesday, January 7, 2015, the Paris office of the satirical weekly, Charlie Hebdo, was attacked by two Jihadi brothers, Cherif and Said Kouachi, armed with AK-47s. This was the day of the week when the editorial committee met. Of the dozen deaths, eight were Charlie Hebdo staff members, including the fearless editor Stéphane Charbonnier, known as Charb. He had declared he would “prefer to die standing rather than live on his knees.” President Hollande condemned this terrorist attack as the most serious in France in more than forty years. Indeed, this event was referred to as France’s 9/11: it was like killing Voltaire.

Memorial at the Place de la Republique in Paris. Image credit: Author’s own

A measure of the singular nature of this event was the global response it triggered: not only heavily armed police security in Paris, but also a popular demonstration of millions throughout France, joined by world political leaders, expressing international solidarity against terrorism. The popular slogan “Je Suis Charlie”, coined spontaneously at a French style magazine, echoed around the world.

At the start of a new decade, democracies are still threatened with acts of terrorist violence, aimed at coercing changes in public policy. These threats can emerge from the far right as well as from Islamists. The hate speech peddled by terrorist groups needs to be vigorously countered by the free press, including satirical publications.

As one of the most notable terrorist attacks in the Western world since 9/11, the attack on Charlie Hebdo has been instructive in affirming the basic principles of terrorism risk modeling. An overarching principle is that terrorist operations follow the path of least resistance. Charlie Hebdo was a prime target because of the global media publicity associated with the brutal assassination of the editorial committee. According to ISIS doctrine, half of Jihad is media. Charlie Hebdo was also a soft target, with weak security compared with alternative political, economic or military targets.

Another important principle underlying terrorist operations is that too many terrorists spoil the plot. The surveillance profile of a plot is minimized by having as few operatives as possible. Brothers, like the Kouachis (and the Tsarnaev Boston marathon bombers), also have the advantage of a more compact counterintelligence footprint than terrorists from different families. 

Since the great majority of terrorist plots against Western democracies are interdicted by the security and law enforcement services, a counterfactual perspective on plots is essential for terrorism risk modeling. This perspective of reimagining history is universal, and applies to any peril, natural or man-made[1]. Carl von Clausewitz, the Prussian master of military strategy, noted that perfecting the art of warfare entails knowing not only what has occurred, but also everything that could have occurred.

On January 7, 2015, the editorial committee of Charlie Hebdo planned to have lunch after their meeting at a local bistro, Les Canailles (Little Rascals). A large table was routinely reserved for them on Wednesdays. This bistro had open public access, unlike the Charlie Hebdo office, so it would have been easy for the two shooters to storm in firing their AK-47s. Through target substitution, this bistro might well have been attacked rather than the Charlie Hebdo office.

Counterfactual thinking such as this is not mere hindsight; it can provide valuable foresight into the future. The vulnerability of Parisian restaurants and cafés was evident. Months later, on Friday, November 13, 2015, six Parisian restaurants were attacked by ISIS terrorists, along with the Stade de France sports stadium and the Bataclan theater.  

Knowledge of history is a precious resource for terrorism risk modelers. Most attacks have either happened before – or might have happened before. For insurers as well as civic authorities and businesses, strategic surprise can be avoided by reimagining history.
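To illustrate the counterfactual idea in miniature, here is a hedged sketch of how a historical event set might be augmented with downward counterfactuals, that is, plausible ways each event could have turned out worse. The events, losses, rates and weights below are hypothetical, invented for illustration, and are not outputs of any RMS model.

```python
# Illustrative sketch only: blending realized history with downward counterfactuals.
# All names, losses, rates and weights are hypothetical.

from dataclasses import dataclass

@dataclass
class Event:
    name: str
    loss: float          # realized loss (arbitrary units)
    annual_rate: float   # assumed annual frequency of this kind of event

@dataclass
class Counterfactual:
    description: str
    loss: float          # loss had history unfolded differently
    weight: float        # judgment-based relative likelihood versus what occurred

history = [
    (Event("office attack", 10.0, 0.05),
     [Counterfactual("attack on the nearby bistro instead", 25.0, 0.3)]),
    (Event("interdicted plot", 0.0, 0.10),
     [Counterfactual("plot not interdicted", 40.0, 0.2)]),
]

def expected_loss(events):
    """Annualized expected loss, blending what happened with what could have happened."""
    total = 0.0
    for event, variants in events:
        residual = max(0.0, 1.0 - sum(v.weight for v in variants))
        blended = residual * event.loss + sum(v.weight * v.loss for v in variants)
        total += event.annual_rate * blended
    return total

print(f"Historical-only view : {sum(e.annual_rate * e.loss for e, _ in history):.2f}")
print(f"Counterfactual view  : {expected_loss(history):.2f}")
```

The counterfactual view is systematically higher than the historical-only view, which is the point: near misses carry risk information that a purely historical loss record conceals.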


[1] Woo G. (2019). Downward counterfactual search for extreme events. Frontiers in Earth Science. https://doi.org/10.3389/feart.2019.00340

RMS Impact Trek 2020: Share Your Expertise and Make a Difference

If you are part of the risk management industry, you are acutely aware of the impact catastrophes have. Because of that understanding, many risk professionals actively help communities in need post-disaster: through donations, by working with organizations to promote resilience, or through on-the-ground assistance.

We are fortunate to say that this is true for everyone at RMS. Our values embrace a stronger understanding of risk, building resiliency, and making a positive impact on the lives of communities suffering after disasters. One of the ways we live our values is through our annual RMS Impact Trek, where both RMS employees and our clients work together with the social enterprise Build Change in some of the world’s most catastrophe-prone areas (see the 2019 Impact Trek recap video here).

And for the fifth year in a row, we’re again sponsoring representatives from our clients to join RMS employees and Build Change and share their skills to build stronger communities. If you are an RMS client, we have recently invited your organization to participate in our annual RMS Impact Trek.

Continue reading

The Storm Surge and the Tsunami

The core idea behind catastrophe modeling is that the architecture of risk quantification is the same whatever the peril. While a hurricane is not an earthquake, building a hurricane catastrophe model has elements in common with building an earthquake catastrophe model. Stochastic event occurrence, the hazard footprint, the damage mechanism, clustering, and post-event loss amplification are all shared concepts.

While disciplines on the university campus may retain their nineteenth-century segregations, in catastrophe modeling we are “ecumenical” about the driver of loss: whether it is wind, hail, vibration, flood, cyber, a virus or a terrorist attack. The track of a hurricane and the track of a fault rupture; the contagion of influenza and the contagion of NotPetya malware; the topographic controls of flooding and the topographic controls of wildfire. Exploring the parallels can be illuminating.
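To make the shared architecture concrete, here is a minimal, illustrative sketch (not RMS model code): a stochastic event set, a hazard footprint, a peril-specific vulnerability curve and a financial roll-up, with the peril appearing only in the details. All rates, footprints and curves are invented for illustration.

```python
# Illustrative peril-agnostic catastrophe model loop; all inputs are hypothetical.
import random

def simulate_year(event_set, exposure, rng):
    """One stochastic year: draw events, apply hazard and damage, sum the loss."""
    year_loss = 0.0
    for event in event_set:
        if rng.random() < event["annual_rate"]:                         # stochastic occurrence
            for site in exposure:
                hazard = event["footprint"].get(site["location"], 0.0)  # hazard intensity at site
                damage_ratio = site["vulnerability"](hazard)            # peril-specific curve
                year_loss += damage_ratio * site["value"]               # financial loss
    return year_loss

# Hypothetical inputs: one windstorm-like and one earthquake-like event share the pipeline.
exposure = [
    {"location": "A", "value": 100.0, "vulnerability": lambda h: min(1.0, 0.01 * h)},
    {"location": "B", "value": 250.0, "vulnerability": lambda h: min(1.0, 0.02 * h)},
]
event_set = [
    {"annual_rate": 0.10, "footprint": {"A": 30.0, "B": 5.0}},   # e.g. gust speeds
    {"annual_rate": 0.02, "footprint": {"A": 10.0, "B": 40.0}},  # e.g. ground motion
]

rng = random.Random(42)
losses = sorted(simulate_year(event_set, exposure, rng) for _ in range(10_000))
print("Mean annual loss:", sum(losses) / len(losses))
print("99th percentile :", losses[int(0.99 * len(losses))])
```

Swapping the footprint and vulnerability inputs turns the same loop from a wind model into an earthquake model, which is the “ecumenical” point.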

This is why it is interesting to discover historical figures who, like catastrophe modelers, have looked sideways across the catastrophe disciplines. One such figure is the Anglo-Greek writer Lafcadio Hearn (unless you are from Japan, where he is known as Koizumi Yakumo).

Continue reading

White Island Risk Mitigation: A Role for Insurance

Safety from volcanic eruptions is heavily influenced by economic factors. Those who earn their livelihood from farming around a volcano may be reluctant to evacuate, and those who operate tourist excursions may be reluctant to suspend them. This may have been the case with White Island Tours, which holds an exclusive license to land tourists on the privately owned island in New Zealand’s Bay of Plenty, named White Island by Captain James Cook. Tourists averse to seasickness have also been able to reach the island by helicopter with Volcanic Air.

With around 10,000 customers per year, each paying up to several hundred New Zealand dollars (US$0.66 per NZ dollar) for a tour, White Island Tours has been a substantial business, but one whose financial viability would have required that as few trips as possible were cancelled because of the volcano risk.

After 22 years of business operation, the volcano erupted on December 9, 2019. There were 47 tourists on White Island; most were killed or seriously injured. Thirty-eight of the tourists were from the cruise liner Ovation of the Seas, operated by Royal Caribbean Cruises Ltd., which denies any responsibility for the excursions; these were advertised with the statement that White Island is one of the most active volcanoes in the world. The terms and conditions of cruise tickets require that any lawsuit be filed in Miami. The U.S. courts will thus decide what liability Royal Caribbean Cruises Ltd. had in vetting White Island Tours.

Continue reading

Newcastle: Thirtieth Anniversary of Australia’s Largest Earthquake Loss. But What If…?

Over the past 15 years, we have witnessed some of the largest earthquakes ever recorded, with catastrophic impacts around the globe. But looking back 30 years to 1989, we saw two smaller, but still significant, earthquakes. The first was the M6.9 Loma Prieta event that hit the San Francisco Bay Area in October, an earthquake familiar to many due to its proximity to the city and its level of destruction. However, fewer are aware of the other notable earthquake that year. December 28, 1989, is a memorable date for many Australians, as it marks the country’s most damaging earthquake in recorded history, and it remains one of Australia’s costliest natural catastrophes to date.

Despite its moderate magnitude, the M5.4 Newcastle earthquake caused widespread ground shaking, with insured losses of just under $1 billion AUD (US$690 million) at the time of the event (ICA, 2012), a loss which, RMS estimates, would exceed $5 billion AUD if the earthquake were repeated today.

Continue reading

Twenty Years After Storms Anatol, Lothar and Martin: Memories From the End of the Millennium

Twenty years ago, while the planet was preparing for the transition to the year 2000 and trying to solve the Y2K bug, the (re)insurance industry in Europe was caught by surprise by windstorm Lothar. Even today, 1999 remains a historic windstorm year, with catastrophic storms Anatol (December 3), Lothar (December 26) and Martin (December 28) all happening within a period of less than a month.

Lothar tracked across northern France, southern Belgium and central Germany and into Poland; Martin tracked through southern Europe – affecting France, Spain, Switzerland and Italy. Between them, Lothar and Martin killed 140 people and caused over €14.2 billion in economic losses, approximately €7.7 billion of which was insured. If the three events happened today, they would cost the (re)insurance industry approximately €20 billion (US$23.3 billion).

At the time, I was still living in Geneva with my parents. I remember waking up the day after Christmas to see fallen trees in our garden and find that our telephone line had been cut. It was very dramatic, and since then, no other windstorm has caused that kind of damage in this region.

In commemoration of the twentieth anniversary of windstorms Anatol, Lothar and Martin, I have asked my colleagues at RMS to share their experience of the storms.

Continue reading

Technology: The Springboard to Innovative Treaty Underwriting

Cautious optimism surrounds the January 1, 2020 reinsurance renewals, with expectations that the anticipated hardening of rates might be realized – to a modest degree at least.

Reinsurance underwriters who can harness technology to conquer historic risk assessment challenges – including robust marginal impact analytics – and create the space for innovation can build customer relationships that are resilient to future market rate oscillations.
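By way of illustration, here is a minimal sketch of the marginal impact idea: judging a candidate treaty by the change it induces in a portfolio tail metric rather than on a standalone basis. The loss simulation, treaty layer and metric choice are hypothetical, not drawn from any reinsurer’s book or from RMS analytics.

```python
# Illustrative marginal-impact calculation on simulated annual losses; all inputs hypothetical.
import random

def tvar(losses, q=0.99):
    """Tail value-at-risk: average of the worst (1 - q) share of simulated years."""
    ordered = sorted(losses, reverse=True)
    k = max(1, int(len(ordered) * (1 - q)))
    return sum(ordered[:k]) / k

rng = random.Random(7)
n_years = 20_000

# Existing portfolio: sum of a few independent exponential loss sources per year.
portfolio = [sum(rng.expovariate(1 / 5.0) for _ in range(3)) for _ in range(n_years)]

# Candidate treaty: recoveries from a hypothetical 10 xs 20 layer on a new cedant.
candidate = [min(10.0, max(0.0, rng.expovariate(1 / 8.0) - 20.0)) for _ in range(n_years)]
combined = [p + c for p, c in zip(portfolio, candidate)]

standalone_tvar = tvar(candidate)
marginal_tvar = tvar(combined) - tvar(portfolio)
print(f"Standalone TVaR of treaty : {standalone_tvar:.2f}")
print(f"Marginal TVaR to portfolio: {marginal_tvar:.2f}")
```

The gap between the standalone and marginal figures is what technology-enabled analytics lets an underwriter price explicitly, treaty by treaty, at the point of sale.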

The capital influx to reinsurance markets, triggered by low market returns globally, has led to increased limits and more generous terms being offered without commensurate increases in rates. This trend can only last for so long before having dire effects on reinsurer profitability. 

Profitability in the primary insurance markets has been helped by innovation, with new product offerings linked to enhanced risk assessment techniques such as telematics. But while the insurtech wave has spawned hundreds of companies and ideas focused on primary insurers, progress in “reinsure-tech” has been limited, due primarily to the current soft market. These market conditions have constrained the resources available for speculative investments and have limited reinsurers’ ability to pursue potential upside in the fast-moving tech space.

Almost ironically, in response to the market conditions, companies have instituted cautious underwriting approaches still rooted in low-fidelity risk assessment techniques, which haven’t evolved to capitalize on the technological advances made since the market softened at the start of the decade.

Continue reading

Toward a Science of Cyber Risk

Why a “Science”? (Part One)

A new article, The Science of Cyber Risk: A Research Agenda, has just been published in Science. A free, non-paywalled version of this paper is available here. Written by a diverse team of 19 authors, including myself, it presents a concise argument for interdisciplinary research to establish a scientific basis for risk analysis and management in the cyber security domain.

As a leading provider of cyber risk models for the (re)insurance industry, RMS is committed to advancing the state of the art in the science of cyber risk. The proposed six-category research agenda is of keen interest to RMS, and we recommend this Science journal article to anyone who shares our interest in solving the hard problems.

In this, the first of three blog posts, I’ll explore why we need a “science” and what difference it will make. The next two posts will feature case studies in interdisciplinary collaboration, including lessons from past successes and failures.

Continue reading

Data Engineering for Risk Analytics with Risk Data Open Standard

This article was originally published by DZone

What Is Risk Analytics?

The picture below on the left shows the extensive flooding at industrial parks north of Bangkok, Thailand. Western Digital had 60 percent of its total hard drive production coming from the country, and floods disrupted production facilities at multiple sites, dramatically affecting a major global supply chain. The picture on the right shows flooding on the New York subway from Hurricane Sandy, which caused widespread disruption and nearly US$70 billion of losses across the northeastern U.S.

In both examples, the analysis of risk should help not only with physical protection measures, such as stronger buildings through improved building codes or better defenses, but also with the financial protection available through recovery. Providing financial protection is the job of the financial services and insurance industries. Improving our understanding of, and practices in, risk analytics as a field is one of the most interesting problems in big data these days, given the increasing set of risks we have to watch for.

Flooding at industrial parks north of Bangkok, Thailand in 2011 (left) and flooded subway stations in New York after Hurricane Sandy in 2012 (right) Image credit: Wikimedia/Flickr

How Does Risk Analytics Work?

Obviously, the risk landscape is vast. It stretches from “natural” events – such as severe hurricanes, typhoons and earthquakes – to “human-generated” disasters, such as cyberattacks, terrorism and so on.

The initial steps of risk analytics start with understanding the exposure – that is, the risks to which a given asset, individual, etc. is exposed. Understanding exposure means detailing the events that lead to damage and the related losses that could result from those events. Formulas get more complicated from here. There is a busy highway of data surrounding this field. Data engineers, data scientists, and others involved in risk analytics work to predict, model, select, and price risk to calculate how to provide effective protection.
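As a toy illustration of those initial steps, the following sketch computes an expected annual loss from a hypothetical exposure and event set. All values are made up; real risk analytics layers far more sophistication on top of this.

```python
# Illustrative expected-annual-loss calculation from exposure and events; numbers are hypothetical.

exposure = [
    {"asset": "factory",   "value": 50_000_000},
    {"asset": "warehouse", "value": 20_000_000},
]

# Each event: annual probability of occurrence and the damage ratio it inflicts per asset.
events = [
    {"name": "river flood", "annual_prob": 0.02, "damage": {"factory": 0.40, "warehouse": 0.10}},
    {"name": "windstorm",   "annual_prob": 0.10, "damage": {"factory": 0.05, "warehouse": 0.08}},
]

expected_annual_loss = sum(
    ev["annual_prob"] * ev["damage"].get(site["asset"], 0.0) * site["value"]
    for ev in events
    for site in exposure
)
print(f"Expected annual loss: {expected_annual_loss:,.0f}")
```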

Data Engineering for Risk Analytics

Let’s look at property-focused risks. In this instance, risk analytics starts with an understanding of how a property – such as a commercial or a residential building – is exposed to risk. The kinds of events that could pose a risk, and the associated losses that could result from those events, depend on many variables.

The problem is that in today’s enterprise, if you want to work with exposure data, you have to work with multiple siloed systems, each with its own data formats and representations. These systems do not speak the same language. For a user to get a complete picture, they need to go across these systems and constantly translate and transform data between them. As a data engineer, how do you provide a unified view of data across all systems? For instance, how can you enable a risk analyst to understand all kinds of perils – from hurricane and hailstorm to storm surge – and then roll this all up so you can guarantee the coverage on these losses?
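To make the data engineering problem concrete, here is a small, hypothetical sketch of such a translation layer: two siloed systems describing the same location in different shapes, normalized into one common record. The field names and the currency conversion are assumptions for illustration; they are not taken from the EDM or the Risk Data Open Standard.

```python
# Illustrative normalization of exposure records from two siloed systems; schemas are invented.

def from_system_a(row):
    # System A: flat, CSV-style record.
    return {
        "location_id": row["LOC_ID"],
        "peril": row["PERIL_CODE"].lower(),
        "insured_value": float(row["TIV_USD"]),
    }

def from_system_b(doc):
    # System B: nested, JSON-style document with different names and units.
    return {
        "location_id": doc["site"]["id"],
        "peril": doc["coverage"]["peril"],
        "insured_value": doc["coverage"]["limit_keur"] * 1_000 * 1.08,  # assumed EUR->USD rate
    }

records = [
    from_system_a({"LOC_ID": "L1", "PERIL_CODE": "WIND", "TIV_USD": "2500000"}),
    from_system_b({"site": {"id": "L1"}, "coverage": {"peril": "flood", "limit_keur": 1800}}),
]

# With a common shape, rolling up across perils for one location becomes trivial.
total_by_location = {}
for rec in records:
    total_by_location[rec["location_id"]] = (
        total_by_location.get(rec["location_id"], 0.0) + rec["insured_value"]
    )
print(total_by_location)
```

Every pairwise translation like this has to be written, versioned and maintained by hand, which is exactly the burden a shared open standard is meant to remove.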

There are also a number of standards used by the insurance industry to integrate, transfer, and exchange this type of information. The most popular of these formats is the Exposure Data Model (EDM). However, EDMs and some of their less popular counterparts (Catastrophe Exposure Database Exchange – CEDE, and Open Exposure Data – OED) have not aged well and have not kept up with industry needs:

  • These older standards are property-centric; risk analytics requires an accommodation and understanding of new risks, such as cyberattacks, liability risks, and supply chain risk.
  • These older standards are proprietary, designed for single systems; they do not take into account the needs of other systems and, for example, cannot support new predictive risk models.
  • These standards don’t provide the right containers for high-fidelity data portability – the exposure data formats do not usually represent the losses, reference data, and settings used to produce the loss information, which is what allows for data integrity.
  • These standards are not extensible. Versioning and dependencies on specific product formats (such as database formats specific to version X of SQL Server) constantly make data portability harder.

This creates a huge data engineering challenge. If you can’t exchange information with high fidelity, forget getting reliable insights. As anyone dealing with data will say: garbage in, garbage out!

For any data engineer dealing with risk analytics, there is great news. A new open standard, several years in the works, is designed to remove the shortcomings of the EDM and other similar formats: the Risk Data Open Standard (RDOS). It is designed to simplify data engineering, and in particular to simplify integrating data between systems that deal with exposure and loss data. It isn’t just RMS working to invent and validate this standard in isolation. A steering committee of thought leaders from influential companies is working on validating the Risk Data OS.

The Risk Data OS will allow us to work on risk analytics much more effectively. This is how we can better understand the type of protection we need to create to help mitigate climate change and other natural or human-made disasters. You can find details on the Risk Data OS here. If you are interested in the Risk Data OS, have feedback, or would like to help us define this standard, you can email the Risk Data OS steering committee by clicking here.

Nine Years After Darfield: When an Earthquake Drives a New Model – Part Two

The Liquefaction Model

The 2010 M7.1 Darfield earthquake in New Zealand started a sequence of events – the Canterbury Earthquake Sequence (CES) – that propagated eastward in the Canterbury region over several years. Since the City of Christchurch is built on alluvial sediments where the water table is very shallow, several of the larger events created widespread liquefaction within the city and surrounding areas. These ground deformations caused a significant number of buildings with shallow foundations to settle, tilt and deform.

Prior to these New Zealand earthquakes, liquefaction had been observed, but not on this scale in a built-up area in a developed country. As in previous well-studied liquefaction events (e.g. 1964 Niigata), this was a unique opportunity to examine liquefaction severity and building responses. Christchurch was referred to as a “liquefaction laboratory”, with the multiple events causing different levels of shaking across the city. However, we had not previously seen suburbs of insured buildings damaged by liquefaction.

Continue reading