Tag Archives: risk analytics

Data Engineering for Risk Analytics with Risk Data Open Standard

This article was originally published by DZone

What Is Risk Analytics?

The picture below on the left shows the extensive flooding at industrial parks north of Bangkok, Thailand. Western Digital produced 60 percent of its total hard drive output in the country, and the floods disrupted production facilities at multiple sites, dramatically affecting a major global supply chain. The picture on the right shows flooding on the New York Subway from Hurricane Sandy, which caused widespread disruption and nearly US$70 billion of losses across the northeastern U.S.

In both examples, the analysis of risk should help not only with physical protection measures, such as stronger buildings through improved building codes or better defenses, but also with the protection available through financial recovery. Providing financial protection is the job of the financial services and insurance industries. Given the increasing set of risks we have to watch for, improving our understanding of and practices in risk analytics is one of the most interesting problems in big data today.

Flooding at industrial parks north of Bangkok, Thailand in 2011 (left) and flooded subway stations in New York after Hurricane Sandy in 2012 (right). Image credit: Wikimedia/Flickr

How Does Risk Analytics Work?

Obviously, the risk landscape is vast. It stretches from “natural” events, such as severe hurricanes, typhoons, and earthquakes, to “human-generated” disasters, such as cyberattacks, terrorism, and so on.

The initial steps of risk analytics start with understanding the exposure – the risks that a given asset, individual, or organization is exposed to. Understanding exposure means detailing the events that could lead to damage and the losses that could result from those events. The formulas get more complicated from here, and there is a busy highway of data surrounding this field. Data engineers, data scientists, and others involved in risk analytics work to predict, model, select, and price risk to calculate how to provide effective protection.
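To make those first formulas concrete, here is a minimal sketch in Python of the simplest such calculation – an expected annual loss, obtained by weighting each modeled event's loss by how often the event is expected to occur. The event names, rates, and loss figures are invented for illustration and are not output from any real model:

```python
# Minimal sketch: expected annual loss (EAL) for a single asset, approximated
# as the sum over modeled events of (annual rate of the event) x (loss it causes).
# All numbers below are illustrative placeholders, not real model output.

events = [
    # (event, annual rate of occurrence, estimated loss in USD if it occurs)
    ("river_flood", 0.02, 5_000_000),
    ("storm_surge", 0.01, 8_000_000),
    ("severe_hail", 0.10, 250_000),
]

expected_annual_loss = sum(rate * loss for _, rate, loss in events)
print(f"Expected annual loss: ${expected_annual_loss:,.0f}")
```

Real catastrophe models layer uncertainty, correlation, and financial terms on top of this, but the starting point is the same: events, their likelihood, and the losses they would cause.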

Data Engineering for Risk Analytics

Let’s look at property-focused risks. In this instance, risk analytics starts with an understanding of how a property – such as a commercial or residential building – is exposed to risk. The kinds of events that could pose a risk, and the associated losses that could result from those events, depend on many variables.

The problem is that in today’s enterprise, if you want to work with exposure data, you have to work with multiple siloed systems, each with its own data formats and representations. These systems do not speak the same language. To get a complete picture, a user needs to go across these systems and constantly translate and transform data between them. As a data engineer, how do you provide a unified view of data across all systems? For instance, how can you enable a risk analyst to understand all kinds of perils – from a hurricane or hailstorm to storm surge – and then roll this all up so you can guarantee the coverage on these losses?
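As a rough illustration of that translation work, here is a hypothetical sketch in Python – the field names and peril codes are invented and are not the actual EDM, CEDE, or OED schemas – in which records from two siloed systems are normalized into one common shape and then rolled up by peril:

```python
from collections import defaultdict

# Hypothetical records from two siloed systems, each with its own vocabulary.
system_a_rows = [
    {"peril_cd": "HU", "gross_loss_usd": 1_200_000},
    {"peril_cd": "SS", "gross_loss_usd": 450_000},
]
system_b_rows = [
    {"hazard": "hurricane", "loss": 800_000},
    {"hazard": "hail", "loss": 95_000},
]

# Translate system A's peril codes onto a shared set of peril names.
PERIL_MAP_A = {"HU": "hurricane", "SS": "storm_surge"}

def normalize_a(row):
    return {"peril": PERIL_MAP_A[row["peril_cd"]], "loss": row["gross_loss_usd"]}

def normalize_b(row):
    return {"peril": row["hazard"], "loss": row["loss"]}

# Roll everything up by peril so an analyst sees one unified view across systems.
totals = defaultdict(float)
normalized = [normalize_a(r) for r in system_a_rows] + [normalize_b(r) for r in system_b_rows]
for record in normalized:
    totals[record["peril"]] += record["loss"]

for peril, loss in sorted(totals.items()):
    print(f"{peril}: ${loss:,.0f}")
```

Every new silo means another mapping like this, which is exactly the overhead a shared standard is meant to remove.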

There are also a number of standards used by the insurance industry to integrate, transfer, and exchange this type of information. The most popular of these formats is the Exposure Data Model (EDM). However, EDMs and some of their less popular counterparts (Catastrophe Exposure Database Exchange – CEDE, and Open Exposure Data – OED) have not aged well and have not kept up with the industry’s needs:

  • These older standards are property centric; risk analytics requires an understanding of, and accommodation for, new risks, such as cyberattacks, liability risks, and supply chain risk.
  • These older standards are proprietary and designed for single systems, without taking into account the needs of other systems; for example, they cannot support new predictive risk models.
  • These standards don’t provide the containment needed for high-fidelity data portability – the exposure data formats do not usually carry the losses, reference data, and settings used to produce the loss information, which is what allows data integrity to be maintained.
  • These standards are not extensible. Versioning and dependencies on specific product formats (such as database formats specific to version X of SQL Server, etc.) constantly make data portability harder (see the sketch below for what a more portable, self-contained container could look like).

This creates a huge data engineering challenge. If you can’t exchange information with high fidelity, forget getting reliable insights. As anyone dealing with data will say: garbage in, garbage out!
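
As a sketch of what better containment and extensibility could look like – the structure and field names below are invented for illustration and are not taken from the EDM or from the Risk Data OS – a portable exchange package would declare its own schema version and carry the exposure, the losses, and the analysis settings that produced them together, independent of any particular database product:

```python
import json

# Hypothetical, self-describing package: a schema version plus exposure, loss
# results, and the settings used to produce them, all in one portable payload.
portable_package = {
    "schema_version": "1.0",
    "exposure": [
        {"location_id": "LOC-001", "peril": "hurricane", "insured_value": 10_000_000},
    ],
    "losses": [
        {"location_id": "LOC-001", "event_id": "EVT-123", "gross_loss": 1_200_000},
    ],
    "analysis_settings": {"model": "example_hurricane_model", "currency": "USD"},
}

# Plain JSON keeps the package independent of any one database format or version.
serialized = json.dumps(portable_package, indent=2)
restored = json.loads(serialized)
assert restored["schema_version"] == "1.0"
```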

For any data engineer dealing with risk analytics, there is great news: a new open standard, several years in the making, is designed to remove the shortcomings of the EDM and other similar formats. The Risk Data Open Standard (RDOS) is designed to simplify data engineering – in particular, integrating data between systems that deal with exposure and loss data. And it isn’t just RMS working to invent and validate this standard in isolation: a steering committee of thought leaders from influential companies is working on validating the Risk Data OS.

The Risk Data OS will allow us to work on risk analytics much more effectively. This is how we can better understand the type of protection we need to create to help mitigate the effects of climate change and other natural or human-made disasters. You can find details on the Risk Data OS here. If you are interested in the Risk Data OS, have feedback, or would like to help us define this standard, you can email the Risk Data OS steering committee by clicking here.

EXPOSURE Magazine: Taking Cloud Adoption to the Core

This is a taster of an article published in the latest edition of EXPOSURE magazine. For the full article click here or visit the EXPOSURE website.

With the main benefits of Cloud computing now well established, EXPOSURE explored why insurance and reinsurance companies have shown some reluctance to move core services onto a Cloud-based infrastructure.

While a growing number of insurance and reinsurance companies are using Cloud services (such as those offered by Amazon Web Services, Microsoft Azure and Google Cloud) for nonessential office and support functions, most have been reluctant to consider Cloud for their mission-critical infrastructure. Simply moving a legacy offering and placing it on a new Cloud platform offers a potentially better user interface, but it’s not really transforming the process.

EXPOSURE also asked whether now is the time for market-leading (re)insurers to make that leap and really transform how they do business, embrace the new and different, and take comfort in what other industries have been able to do.


From Insurance Data Management to Data Enlightenment

This RMS article was previously published in Property Casualty 360

The effective use of data is so important to every insurance business — especially as big data and analytics are seen as a “silver bullet” for transformation. But to get on this transformative journey, your approach to data in your business may have to change. The traditional view of data focuses mainly on data collection and storage: how to collect, store, access and arrange the data, with rules and procedures to achieve this.

There is a tendency to separate data from analytics. If you think of data analytics, the image may be of the hard-pressed team of analysts and IT specialists, working to tight deadlines, “mining the data” to deliver the core reports that the business needs.

If any of the above rings true, you may need to change your mindset. First, for data collection and storage, the cloud has revolutionized the way data is stored, accessed and managed, offering high capacity and high availability, typically on a pay-as-you-use basis. Historically, this is where much of the investment went. But with the cloud, that burden has lifted: businesses no longer need to become experts in data storage or to plan, build and manage data centers, which were seen as critical in-house infrastructure in the past.


Join Us for Must-See Keynotes and Up Close Sessions at Exceedance 2018

With Exceedance 2018 coming May 14 – 17, we have lined up an interesting group of keynote speakers who will be onstage to provide their insights, ideas and inspiration. This year’s topics include:

Day 1: Earth, Wind and Fire

Our day one general session will focus on how we have leveraged lessons learned to bring new advances in model science and technology that enable (re)insurers to better manage and capitalize on catastrophe risk.


Towards a Resilient Insurance Market

This blog was originally published on InsurTech Gateway by Hambro Perks; click here for the original blog.

 

It is a fascinating time to work in the risk analytics business.

Traditional risks are changing, with much of this change being driven by technology. From the challenges posed by autonomous vehicles to the rapid digitization of the “smart home”, with automatic detection of threats such as fire and theft, systems are getting smarter and risks are changing.

Other types of traditional risk, however, still offer tremendous opportunities — last year’s storms in the U.S. have shown that even in one of the world’s most established insurance markets, uninsured losses are still a major problem. Barely half of the losses from Harvey, Irma and Maria were insured — the rest lies with the uninsured victims of the disasters or, if they are lucky, with the federal government, which will help them rebuild.

This “protection gap” between those who have adequate insurance and those who do not represents both a huge societal challenge and a massive opportunity for the insurance market.


Data Analytics: Fueling the Future of Insurance

Look around and you see the financial services industry being transformed by a newfound ability to tap into a vast amount of data, right at its fingertips. Where business decisions were once reliant on intuition and experience, and transactions underpinned by the strength of relationships, data analytics now drives everything from credit rating to complaint handling, from social media-driven marketing to employee performance monitoring.


How to Accelerate the Understanding of Disaster Risk

RMS is delighted to be playing an integral role at the United Nations’ Global Platform for Disaster Risk Reduction in Cancun next week. This is the first time that government stakeholders from all 193 member countries have come together on this subject since the Sendai Framework for Disaster Risk Reduction was adopted in March 2015. Cancun looks forward to welcoming some 5,000 participants.
