Category Archives: Security and Privacy

Prudential Regulation Authority on the Challenges Facing Cyber Insurers

Most firms lack clear strategies and appetites for managing cyber risk, with a shortage of cyber domain knowledge noted as a key area of concern. So said the Prudential Regulation Authority, the arm of the Bank of England which oversees the insurance industry, in a letter to CEOs last week.

This letter followed a lengthy consultation with a range of stakeholders, including RMS, and identified several key areas where insurance firms could and should improve their cyber risk management practices. It focussed on the two distinct types of cyber risk: affirmative and silent.

Affirmative cover is explicit cyber coverage, either offered as a stand-alone policy or as an endorsement to more traditional lines of business. Silent risk arises where cover is provided “inadvertently” through a policy that was typically never designed for it. Inadvertent cover is not the only source of silent risk, however: it can also leak into policies where existing exclusions are not completely exhaustive. A good example is policies with NMA 2914 applied, which excludes cyber losses except for cases where a cyber-attack causes physical damage (e.g. by fire or explosion).
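
To make the distinction concrete, here is a minimal sketch of how a book of policies might be tagged by cyber exposure type. The policy fields and the NMA 2914 carve-back logic are simplified assumptions for illustration, not an RMS or market-standard implementation.

```python
# Illustrative sketch only: the policy fields and classification rules are
# simplified assumptions, not an RMS or market-standard implementation.

from dataclasses import dataclass

@dataclass
class Policy:
    line_of_business: str       # e.g. "cyber", "property", "casualty"
    explicit_cyber_cover: bool  # stand-alone cyber policy or cyber endorsement
    exclusions: set             # exclusion clauses applied, e.g. {"NMA2914"}

def cyber_exposure_type(policy: Policy) -> str:
    """Tag a policy's cyber exposure as affirmative or silent."""
    if policy.explicit_cyber_cover:
        return "affirmative"              # cyber cover written deliberately
    if "NMA2914" in policy.exclusions:
        # NMA 2914 excludes cyber losses but carves back physical damage
        # caused by a cyber attack (e.g. fire or explosion), so some
        # silent exposure can remain even with the exclusion applied.
        return "silent (carve-back only)"
    return "silent"                       # no cyber wording at all

book = [
    Policy("cyber", True, set()),
    Policy("property", False, {"NMA2914"}),
    Policy("casualty", False, set()),
]
for p in book:
    print(p.line_of_business, "->", cyber_exposure_type(p))
```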

The proliferation of this silent risk across the market is highlighted as one of the key areas of concern by the PRA. It believes this risk is not only material but also likely to increase over time, with the potential to cause losses across a wide range of classes, a sentiment we at RMS would certainly echo.

The PRA intervention shines a welcome spotlight and adds to the growing pressure on firms to do more to improve their cyber risk management practices. These challenges have faced the market for some time, but how do we help the industry address them?

The PRA suggests firms with cyber exposure should have a clearly defined strategy and risk appetite owned by the board, together with risk management practices that include both quantitative and qualitative elements.

At RMS our cyber modeling has focussed on providing precisely this insight, helping many of the largest cyber writers to quantify both their silent and affirmative cyber risk, thus allowing them to focus on growing cyber premiums.

If you would like to know more about the RMS Cyber Accumulation Management System (released February 2016), please contact cyberrisk@rms.com.

The Changing Landscape of Cyber Threats

The cyber risk landscape is constantly changing. In the last few weeks alone we’ve seen potentially game-changing events: the release of U.S. National Security Agency hacking tools through the Shadow Brokers auction, and one of the most significant distributed denial of service (DDoS) attacks ever seen, in which millions of Internet of Things devices were hijacked to target a major piece of Internet infrastructure, taking hundreds of websites offline. In this blog I’ll discuss the constant ebb and flow of attack versus defense through the lens of the five cyber loss methods currently modeled by RMS.

Data Breaches

The loss of 500 million records in a single cyberattack represents the largest data breach event in history – so far, at least. The recent Yahoo hack, and the potential impact on the proposed Verizon takeover, has sent another stark reminder to industry executives of the dangers surrounding data breaches.

It may have been the biggest single hack ever in terms of records lost, but it’s hardly an isolated one. The leak of the Panama Papers was significant not only in size but in its global political fall-out, as politicians were implicated in secret offshore funds, culminating in the resignation of the Icelandic prime minister.

Governments and public agencies themselves have also been targeted in the U.S., Mexico, and the Philippines, for example. One of the most significant breaches affected Turkey, with the release of nearly 50 million records from the country’s General Directorate of Population and Citizenship Affairs, which included addresses, birth dates, and most troublingly, national ID numbers.

These individual large events fit the pattern observed so far in 2016: fewer data breaches, but of higher severity.

Denial of Service Attacks

2016 has been another active period for DDoS attacks. Going into the year we’d seen signs of a downward trend, but this was spectacularly reversed in the first quarter, which saw 19 attacks greater than 100 gigabits per second. The gaming and software industries continue to be the most heavily impacted. Furthermore, we’re seeing a growing number of companies attacked repeatedly: on average, each targeted company was attacked 29 times, with one company being attacked 283 times!

Frequency aside, the increasing complexity of attacks is most disturbing. In the first quarter of 2016, 59% were “multi-vector” attacks, which require unique mitigation controls for each attack vector, as seen in the recent DDoS attack on Dyn, the DNS provider. If this trend continues we can expect existing defenses to become less effective against DDoS, and disruption to increase.

Cloud Provider Failure

With the leading cloud providers continuing to achieve double- and even triple-digit year-on-year growth, the clear trend of companies moving their services to the cloud continues apace. Though annual downtime has been decreasing overall, 2016 has seen several small but significant failures, including an Amazon Web Services outage in Australia, Salesforce outages in both the U.S. and Europe, and a Verizon issue that impacted, among others, JetBlue Airways. As companies become increasingly reliant on the cloud, the accumulation of both business interruption and data loss risk becomes ever more severe.

Financial Transaction Theft

Perhaps the most audacious cyber-attack of the past year saw almost US$100 million stolen from Bangladesh’s central bank and transferred to accounts in the Philippines and Sri Lanka. Even more shocking, the money was stolen from the bank’s account at the U.S. Federal Reserve and was transferred using standard SWIFT financial transaction messages.

The largest cyber heist ever could have been even larger but for a misspelling; it was this typo that caught the attention of the Federal Reserve Bank of New York. The perpetrators had attempted to withdraw $950 million over 35 separate transactions. A similar attack, exploiting a vulnerability in the SWIFT messaging system, led to another multi-million dollar theft from a Ukrainian bank.

Perhaps more than any other sector, the interconnected nature of modern financial services leaves the industry open to large scale systemic cyber losses.

Cyber Extortion

Ransomware attacks are continuing to become more frequent and more complex in 2016. One alarming pattern is the increased targeting of healthcare institutions: multiple hospitals in California and Kentucky in the U.S., as well as in Germany, have been attacked. In one particularly unethical incident, Hollywood Presbyterian Medical Center had to pay around $17,000 to regain access to its systems.

The more sophisticated software now being used to perpetrate attacks is starting to pay dividends for the hacking groups. The “Jigsaw” malware, for example, threatens to delete an increasing number of files after every hour of nonpayment. Encryption-based malware has become the norm, and targeted, business-focused malware is growing, as evidenced by the “SamSam” scheme, which targets unpatched server software.

Incorporating Into the RMS Cyber Model

RMS is continuing to monitor the broad spectrum of cyber-attacks that are impacting thousands of companies every month. During a recent online seminar, the RMS cyber team shared the key trends outlined in this blog and discussed their impact on cyber insurers. Through the RMS Cyber Accumulation Management System, RMS continues to incorporate these insights into our modeling to provide the most comprehensive and accurate view of cyber risk.

Mandatory Reporting of Cyber-Attacks Would Improve Understanding of Cyber Risk

The recent call by the Association of British Insurers (ABI) for the U.K. government to mandate the reporting of cyber-attacks is another welcome attempt to improve the collective learning opportunities presented by the continuous stream of cyber events. Every attack provides new data that can be fed into the probabilistic models that help build resilience against this growing corporate peril, so long as we are able to find out about those attacks. Initiatives like this, which lead to the sharing of valuable information and insights, are therefore paramount.

Reporting cyber attacks is already mandatory in most U.S. states, where laws require companies to notify their customers and regulators as soon as they suffer a security breach. In 2018 a similar EU law, the Network and Information Security (NIS) Directive, will make it mandatory for certain firms to provide alerts of cyber incidents.

However, having more information on data breaches still provides only part of the picture required to fully understand cyber as a peril.

Current security breach notification laws, where they exist, do not require companies to report the many other types of cyber-attack that are increasingly being used to target organizations. Cyber extortion, for example, is a growing trend. Firms typically choose not to report this type of attack to limit damage to their corporate reputation.

Historical Attacks Are Not a Good Indicator of the Future

While having access to data on historical cyber breaches is valuable, the threat is constantly evolving, such that previous attacks have rarely been a good indicator of future events. Even a small shift in the balance between the capabilities of hackers and cyber defenses could lead to a significant shift in the frequency and severity of cyber attacks.

Staying on top of the myriad threat actors, their motivations and resources, as well as maintaining a broad view of the range of viable attack methods that exist today, is crucial to understanding and managing cyber risk. But it is challenging to do.

As a first step to help insurers better understand their existing cyber risk loss potential, RMS recently launched its Cyber Accumulation Management System. This tool provides insurers with a framework to organize and structure their data, identify their accumulations and correlated risk, and stress test their portfolios against a range of cyber loss methods. Having this capability enables insurers to understand the potential size of cyber catastrophes and set their risk appetite to safely grow capacity for this line of business.
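
To illustrate the mechanics of stress testing a portfolio against a range of loss methods, here is a deliberately simplified sketch. The scenario names, footprints, and damage factors below are invented for illustration and are not taken from the RMS model.

```python
# Hypothetical sketch of scenario-based stress testing. Scenario footprints
# and damage factors are invented; a real model would derive them from
# detailed exposure and threat analysis.

portfolio = [
    {"insured": "RetailCo",  "sector": "retail",  "limit": 50e6, "cloud": "ProviderA"},
    {"insured": "BankCorp",  "sector": "finance", "limit": 80e6, "cloud": "ProviderB"},
    {"insured": "HealthInc", "sector": "health",  "limit": 30e6, "cloud": "ProviderA"},
]

# Each scenario defines which policies it touches and how hard it hits them.
scenarios = {
    "mass data breach (retail)":   lambda p: 0.40 * p["limit"] if p["sector"] == "retail" else 0.0,
    "ProviderA cloud outage":      lambda p: 0.25 * p["limit"] if p["cloud"] == "ProviderA" else 0.0,
    "financial transaction theft": lambda p: 0.50 * p["limit"] if p["sector"] == "finance" else 0.0,
}

for name, loss_fn in scenarios.items():
    gross_loss = sum(loss_fn(p) for p in portfolio)
    print(f"{name}: stressed gross loss = ${gross_loss/1e6:.1f}M")
```

Even in this toy form, the exercise surfaces the key insight: a single shared dependency, such as a common cloud provider, can drive correlated losses across otherwise unrelated insureds.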

Cyber attacks are an increasingly significant threat to the global economy. New cyber risk management solutions, combined with initiatives such as mandatory reporting, will help the insurance industry continue to play its crucial role in ensuring the resiliency of our economy.

For more information, contact the RMS cyber team at cyberrisk@rms.com.

Unlocking the Potential of Cyber Insurance

The cyber insurance market presents insurers with an attractive growth opportunity. It also presents a significant challenge to overcome. Cyber coverage constitutes the largest genuinely new class of business developed by the insurance industry in at least a generation, and its potential, even at the conservative end of the scale, can be measured in tens of billions of US dollars.

However, with limited tools to measure the threat, carriers have been understandably reluctant to throw too much capital at the risk. With warnings about the systemic nature of the threat reverberating through the press to boardrooms, the industry has so far approached the risk with caution and coverage has been limited.

Yet the need for insurance solutions to help corporates manage the cyber threat is real and pressing. In the wake of losses such as Target’s $67 million settlement with Visa over a breach of customer payment data, and an estimated annual global cost of cybercrime of $445 billion, companies are eager to offload what they rightly see as a large and looming financial risk.

Industry Concerned by Systemic Nature of Cyber

We recently surveyed 40 RMS clients already writing cyber, including insurers, reinsurers, and brokers, to gain an understanding of their concerns. They had a number of common challenges.

Firstly, due to the dynamic and emerging nature of the peril, it is difficult to quantify just how big and systemic a potential cyber catastrophe might be. In addition, with so many different attack methods available to cyber criminals, even knowing where an attack will come from is difficult.

Another common challenge was uncertainty over how cyber attacks could impact non-affirmative cyber policies, the so-called silent exposure. With limited precedent for how cyber-related losses would trigger these policies, there is uncertainty around the impact of a cyber catastrophe.

Lastly, the lack of a common data standard or a mechanism for understanding aggregations of risk poses a further challenge, hindering companies in understanding their capital implications, setting risk appetites, and meeting their regulatory reporting obligations.

A Response to the Problem

We have tackled our clients’ cyber risk management concerns by developing a cyber accumulation management solution, built on three core elements.

  1. A data standard for the industry

    Our Cyber Exposure Data Schema was developed in conjunction with the Centre for Risk Studies at the University of Cambridge, with support from leading market companies. It provides an approach to standardising cyber data as a distinct peril. It copes with both affirmative and silent cyber coverage, and allows risk to be tracked and transferred by providing a consistent framework for data capture, storage, and analysis. Critically, it is open source, model-agnostic, and extensible.

  2. Five loss scenarios to stress test portfolios

    The new RMS cyber loss process models assess actual books of business against multiple realistic loss scenarios, testing various levels of severity for the top five cyber threats identified by our industry development partners at Cambridge. Running analyses shows underwriters how loss events would interact with their exposure, and isolates the key drivers of risk, allowing an informed, independent view of cyber to be formed.

  3. A Cyber Accumulation Management System

    The accumulation engine is the framework for generating loss projections. Its analytical capabilities enable companies to report exposure aggregates by coverage type and potential loss characteristics, to a previously unthinkable level of granularity. It highlights accumulations and correlations, giving insurers, reinsurers, and brokers the tools necessary to answer questions regarding portfolio optimization, capacity, and capital requirements, while delivering answers to regulatory demands (a simplified sketch of this kind of aggregation follows below).
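
As a rough illustration of what such accumulation reporting involves, here is a minimal sketch. The exposure fields loosely echo the idea of a standardized cyber exposure record, but the specific field names and categories are assumptions for illustration, not the actual Cyber Exposure Data Schema.

```python
# Simplified, hypothetical sketch of accumulation reporting. Field names and
# coverage categories are invented, not the Cyber Exposure Data Schema.

from collections import defaultdict

exposures = [
    {"insured": "AcmeCo",  "coverage": "data_breach",           "type": "affirmative", "limit": 10e6},
    {"insured": "AcmeCo",  "coverage": "business_interruption", "type": "silent",      "limit": 25e6},
    {"insured": "Globex",  "coverage": "data_breach",           "type": "affirmative", "limit": 15e6},
    {"insured": "Initech", "coverage": "cyber_extortion",       "type": "affirmative", "limit": 5e6},
]

# Aggregate limits by (coverage, affirmative/silent) to surface accumulations.
aggregates = defaultdict(float)
for e in exposures:
    aggregates[(e["coverage"], e["type"])] += e["limit"]

for (coverage, kind), total in sorted(aggregates.items()):
    print(f"{coverage:22s} {kind:12s} ${total/1e6:.0f}M")
```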

Together these three components comprise a complete cyber risk management solution that solves the key, real-world challenges facing the insurance industry today. We have created a new standard for the capture and management of cyber exposure data, along with mechanisms to get a handle on both affirmative and silent cyber risks while meeting reporting requirements. All of this delivers the insights needed to unlock the capital necessary to meet ultimate insureds’ demands for cyber cover, and to allow the insurance sector to grow confidently into this exciting new line of business.

Learning More About Catastrophe Risk From History

In my invited presentation on October 22, 2015 at the UK Institute and Faculty of Actuaries GIRO conference in Liverpool, I discussed how modeling of extreme events can be smarter, from a counterfactual perspective.

A counterfactual perspective enables you to consider what has not yet happened, but could, would, or might have under differing circumstances. By adopting this approach, the risk community can reassess historical catastrophe events to glean insights into previously unanticipated future catastrophes, and so reduce catastrophe “surprises.”

The statistical foundation of typical disaster risk analysis is actual loss experience. The past cannot be changed and is therefore traditionally treated by insurers as fixed; the general attitude is: why consider varying what happened in the past? From a scientific perspective, however, actual history is just one realization of what might have happened, given the randomness and chaotic dynamics of nature. The stochastic analysis of the past, used by catastrophe models, is an exploratory exercise in counterfactual history, considering alternative possible scenarios.
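
A toy Monte Carlo illustration of this idea: treat each historical near-miss as an event that could have materialized with some probability, and simulate many alternative realizations of the same period. All probabilities and losses below are invented for illustration.

```python
# Toy illustration of the counterfactual approach: each near-miss in a
# historical catalog "could have" produced a loss; we simulate alternative
# realizations of history. All numbers are invented.

import random

# (probability the event materializes, loss in $M if it does)
near_misses = [(0.3, 200), (0.1, 1500), (0.5, 50), (0.05, 4000)]

def simulate_one_history():
    """One alternative realization of the same historical period."""
    return sum(loss for p, loss in near_misses if random.random() < p)

random.seed(42)
losses = sorted(simulate_one_history() for _ in range(10_000))
print("median alternative-history loss: ${}M".format(losses[len(losses) // 2]))
print("99th percentile loss:            ${}M".format(losses[int(0.99 * len(losses))]))
# The actual historical record is just one draw from this distribution;
# the tail percentiles reveal the "surprises" that nearly happened.
```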

Using a stochastic approach to modeling can reveal major surprises that may be lurking in alternative realizations of historical experience. To quote Philip Roth, the eminent American writer: “History, harmless history, where everything unexpected in its own time is chronicled on the page as inevitable. The terror of the unforeseen is what the science of history hides.”  All manner of unforeseen surprising catastrophes have been close to occurring, but ultimately did not materialize, and hence are completely absent from the historical record.

Examples can be drawn from all natural and man-made hazards, covering insurance risks on land, sea, and air. A new domain of application is cyber risk: new surprise cyber attack scenarios can be envisaged with previous accidental causes of instrumentation failure being substituted by control system hacking.

The past cannot be changed—but I firmly believe that counterfactual disaster analysis can change the future and be a very useful analytical tool for underwriting management. I’d be interested to hear your thoughts on the subject.

New Risks in Our Interconnected World

Heraclitus taught us more than 2,500 years ago that the only constant is change. And one of the biggest changes in our lifetime is that everything is interconnected. Today, global business is about networks of connections continents apart.

In the past, insurers were called on to protect discrete things: homes, buildings and belongings. While that’s still very much the case, globalization and the rise of the information economy means we are also being called upon to protect things like trading relationships, digital assets, and intellectual property.

Technological progress has led to a seismic change in how we do business. There are many factors driving this change: the rise of new powers like China and India, individual attitudes, and even the climate. But globalization and technology aren’t just symbiotic bedfellows; together they are the force stimulating the greatest change in our societies and economies.

The number, size, and types of networks are growing and will continue to do so. Understanding globalization and modeling interconnectedness is, in my opinion, the key challenge for the next era of risk modeling. I will discuss examples that merit particular attention in future blogs, including:

  • Marine risks: More than 90% of the world’s trade is carried by sea. Seaborne trade has quadrupled in my lifetime and shows no sign of relenting. To manage cargo, hull, and the related marine sublines well, the industry needs to better understand the architecture and the behavior of the global shipping network.
  • Corporate and Government risks: Corporations and public entities are increasingly exposed to networked risks: physical, virtual or in between. The global supply chain, for example, is vulnerable to shocks and disruptions. There are no local events anymore. What can corporations and government entities do to better understand the risks presented by their relationships with critical third parties? What can the insurance industry and the capital markets do to provide CBI coverage responsibly?
  • Cyber risks: This is an area where interconnectedness is crucial. More of the world’s GDP is tied up in digital networks than in cargo. As Dr. Gordon Woo often says, the cyber threat is persistent and universal. There are a million cyber attacks every minute. How can insurers awash with capital deploy it more confidently to meet a strong demand for cyber coverage?

Globalization is real, extreme, and relentless. Until the Industrial Revolution, the pace of change was very slow. Sure, empires rose and fell. Yes, natural disasters redefined the terrain.

But until relatively recently, virtually all the world’s population worked in agriculture—and only a tiny fraction of the global population were rulers, religious leaders or merchants. So, while the world may actually be less globalized than we perceive it to be, it is undeniable that it is much flatter than it was.

As the world continues to evolve and the megacities of Asia modernize, the risk transfer market could grow tenfold. As emerging economies shift away from reliance on government backstops towards a culture of looking to private market solutions, the amount of risk transferred will increase significantly. The question for the insurance industry is whether it is ready to seize the opportunity.

The number, size, and types of networks are growing and will only continue to do so. Protecting this new interconnected world is our biggest challenge—and the biggest opportunity to lead.

Managing Cyber Catastrophes With Catastrophe Models

My colleague Andrew Coburn recently co-authored an article on cyber risk with Simon Ruffle and Sarah Pryor, both researchers at the Cambridge University Centre for Risk Studies.

This is a timely article considering the cyber attacks in the past year on big U.S. corporations. Target, Home Depot, JPMorgan and, most recently, Sony Pictures have all had to deal with unauthorized security breaches.

This isn’t the first time Sony has experienced a virtual assault. In 2011, the PlayStation Network suffered one of the biggest security breaches in recent memory, which is reported to have cost the company in excess of $171 million.

Cyber attacks can be costly and insurers are hesitant to offer commercial cyber attack coverage because the risk is not well understood.

Andrew and his co-authors contend that insurers’ chief concern is not individual loss events, such as the targeted security penetrations we’ve seen recently at Sony and JPMorgan, but whether such losses are manageable across a whole portfolio of policies.

The biggest challenge in evaluating cyber risk is its inherent systemic complexity and interconnectivity. The internet, the technology companies that run on it, and the enterprises they serve are inextricably intertwined; shocks to one part of a network can quickly cascade and affect the whole system.
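
A toy sketch makes this cascade dynamic concrete: model the market as a dependency graph and propagate a shock through it. The graph and propagation rule below are invented for illustration; real accumulation modeling is, of course, far richer.

```python
# Toy sketch of cascading failure on a dependency network. The graph and
# the all-or-nothing propagation rule are invented for illustration.

from collections import deque

# edges: X -> Y means "if X fails, Y is disrupted"
dependents = {
    "cloud_provider": ["retailer", "bank", "saas_vendor"],
    "saas_vendor": ["retailer", "logistics_firm"],
    "bank": ["logistics_firm"],
}

def cascade(initial_failure):
    """Breadth-first propagation of a shock through the network."""
    failed, queue = {initial_failure}, deque([initial_failure])
    while queue:
        node = queue.popleft()
        for dep in dependents.get(node, []):
            if dep not in failed:
                failed.add(dep)
                queue.append(dep)
    return failed

print(cascade("cloud_provider"))
# A single shock to shared infrastructure disrupts most of the network,
# which is why portfolio-level accumulation matters more than single risks.
```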

Can catastrophe-modelling methodologies provide the solution? Read the full article in The Actuary here.

Amlin on Open Modeling and Superior Underwriting

Daniel Stander (Managing Director, RMS) in conversation with JB Crozet (Head of Group Underwriting Modeling, Amlin).

Daniel Stander, RMS and JB Crozet, Amlin

Daniel Stander: Amlin has been an RMS client for many years. How involved do you feel you’ve been as RMS has designed, developed and prepares to launch RMS(one)?

JB Crozet: Amlin has been an RMS client for over a decade now. We are very committed to the RMS(one) Early Access Program and it’s been very rewarding to be close to RMS on what is obviously such an important initiative for them, and for the market. We liked what we heard and saw when RMS first explained their vision to us back in 2011. The RMS(one) capabilities sounded compelling and we wanted to understand them better, rather than build our own platform. We know how costly and risky those kinds of internal IT projects can be.

My team was trained on Beta 3, and feedback from those involved was positive. We gave an overview of Beta 3 to all our underwriters and actuaries at our 4th Catastrophe Modeling Annual Briefing; there was a lot of energy and enthusiasm in the room. My team has now been trained on Beta 4 and we look forward to gathering feedback on their experience, and sharing this with RMS. We’re on a journey at Amlin and we’re on a journey with RMS. RMS(one) is the next phase of that journey.

DS: In what ways do you think Amlin will derive value from RMS(one)? Does it have the potential to pull your biggest lever of value creation: improving your loss ratio?

JBC: In a prolonged soft market, Amlin is rightly focused on controlling its loss ratio with disciplined underwriting. We think about RMS(one) in this context. With RMS(one), there is a real opportunity for superior performance through improved underwriting – both in the overall underwriting strategy and in individual underwriting decisions. This is equally true of our outwards risk transfer as it is of our net retained portfolio of risks.

It’s a big part of my role in the Group Underwriting function to equip our underwriters with the tools they need at the point of sale to empower their decision-making. The transformational speed and depth of the analytics coming out of RMS(one) will surface insights that result in superior, data-driven decision-making. The impact over time of consistently making better decisions is not trivial.

DS: Transparency is key here: not just transparency of cost and performance, but transparency into the RMS reference view. How do you think about RMS(one) in this context?

JBC: RMS(one) takes the concept of transparency to a new level. RMS’ documentation has always been market leading. The ability to customize models by making adjustments to model assumptions – to drop in alternative rate sets, to scale losses, to adjust vulnerability functions – well, that gives us a far better understanding of the uncertainty inherent in these models. We can much more easily stress test the models’ assumptions and use the RMS reference view with greater intelligence.

RMS(one) is truly “open”. The fact that RMS(one) is architected to run non-RMS models, and that RMS has extended the hand of partnership to vendors of alternative views, is game-changing. The idea that Amlin could bring in an auxiliary vended view of risk (from, say, EQE) is today totally impractical, given the operational challenges associated with such a change. RMS(one) removes these barriers and effectively gives us more freedom to work with other experts who might be able to help us hone our “house view” of risk.

DS: What is the attitude to the “cloud” in the market?

JBC: Once you understand that the RMS cloud is at least as secure and reliable as existing data centre solutions, traditional concerns about the cloud become a non-issue. At Amlin we have high standards and we are confident that RMS can meet or exceed them.

It’s worth remembering, though, that the cloud is central to the value one can derive from RMS(one); it’s not some optional extra. Once you realize that, you see it’s not just “fit for purpose”, it’s actually what the industry needs.

RMS is giving us choices we’ve never had before. Whether it’s detailed flood models for central Europe and North America, or high-definition pan-Asian models for tropical cyclone, rainfall, and flood. Whether it’s the ability to scale up compute resources on demand, or the ability to choose how fast we want model results based on a clear tariff. We wouldn’t be able to derive this broader value from RMS if we were still working with the hardware and software capabilities our industry has been used to.