EXPOSURE Magazine Snapshots: Water Security – Managing the Next Financial Shock

This is a taster of an article published by RMS in the second edition of EXPOSURE magazine.  Click here and download your full copy now.

18 Apr 2017

EXPOSURE magazine reported on how a pilot project to stress test banks’ exposure to drought could hold the key to future economic resilience, as recognition grows that environmental stress testing is a crucial instrument to ensure a sustainable financial system.

In May 2016, the Natural Capital Finance Alliance (NCFA), which is made up of the Global Canopy Programme (GCP) and the United Nations Environment Programme Finance Initiative, teamed up with Deutsche Gesellschaft für Internationale Zusammenarbeit (GIZ) GmbH Emerging Markets Dialogue on Finance (EMDF) and several leading financial institutions to launch a project to pilot scenario modeling.

Funded by the German Federal Ministry for Economic Cooperation and Development (BMZ), RMS was appointed to develop a first-of-its-kind drought model. The aim is to help financial institutions and wider economies become more resilient to extreme droughts, as Yannick Motz, head of the Emerging Markets Dialogue on Finance at GIZ, explains.

“GIZ has been working with financial institutions and regulators from G20 economies to integrate environmental indicators into lending and investment decisions, product development and risk management.”

But Why Drought?

Drought is a significant potential source of shock to the global financial system. There is a common misconception that sustained lack of water is primarily a problem for agriculture and food production, but in Europe alone, an estimated 40 percent of total water extraction is used for industry and energy production, such as cooling in power plants, and 15 percent for public water supply.

Motz adds: “Particularly in the past few years, we have experienced a growing awareness in the financial sector of climate-related risks. The lack of practicable methodologies and tools that adequately quantify, price and assess such risks, however, still impedes financial institutions in fully addressing and integrating them into their decision-making processes.

“Striving to contribute to filling this gap, GIZ and NCFA initiated this pilot project with the objective to develop an open-source tool that allows banks to assess the potential impact of drought events on the performance of their corporate loan portfolio.”

Defining the Problem

Stephen Moss, director, capital markets at RMS, and RMS scientist Dr. Navin Peiris explain how drought affects the global economy and how a drought stress-test will help build resilience for financial institutions:

Water Availability Links Every Industry:  Stephen Moss believes practically every industry in the world relies on water availability in some shape or form.  “With environmental impacts becoming more frequent and severe, awareness of water as a key future resource is becoming more acute,” adds Moss.

“So, the questions are, do we understand how a lack of water could impact specific industries and how that could then flow down the line to all the industrial activities that rely on the availability of water? And then how does that impact on the broader economy?”

Interconnected World:  Dr. Navin Peiris acknowledges that the highly interconnected world we live in means the impact of drought on one industry sector or one geographic region can have a material impact on adjacent industries or regions, whether or not they are directly affected by the drought themselves. This interconnectivity is at the heart of why a hazard such as drought could become a major systemic threat to the global financial system.

“You could have an event or drought occurring in the U.S., and any reduction in production of goods and services could impact global supply chains and draw in other regions, because the world is so interconnected,” comments Peiris.

Encouraging Water Conservation Behaviors:  The ability to model how drought is likely to impact banks’ loan default rates will enable financial institutions to measure and control the risk accurately. And if banks are motivated to encourage better water conservation behaviors among their corporate borrowers, adjusting their own risk management practices should have a positive knock-on effect that ripples down, explains Moss.

“Similar to how an insurance company incorporates the risk of having to pay out on a large natural event, a bank should also be incorporating that into their overall risk assessment of a corporate when providing a loan – and including that incremental element in the pricing,” he says. “And just as insureds are motivated to defend themselves against flood or to put sprinklers in the factories in return for a lower premium, if you could provide financial incentives to borrowers through lower loan costs, businesses would then be encouraged to improve their resilience to water shortage.”
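
To make Moss’s analogy concrete, here is a minimal sketch of the kind of calculation such a stress test implies: scaling a borrower’s probability of default (PD) under a drought scenario and measuring the incremental expected loss. The multiplier and loan figures below are illustrative assumptions, not outputs of the RMS drought model.

    def expected_loss(pd, lgd, ead):
        """Standard credit-risk expected loss: PD x LGD x EAD."""
        return pd * lgd * ead

    # Hypothetical water-dependent borrower (figures are illustrative only).
    baseline_pd = 0.02        # annual probability of default
    lgd = 0.45                # loss given default
    ead = 10_000_000          # exposure at default, in dollars

    # Assumed drought scenario: stressed revenues raise the default probability.
    drought_pd_multiplier = 2.5
    stressed_pd = min(1.0, baseline_pd * drought_pd_multiplier)

    baseline_el = expected_loss(baseline_pd, lgd, ead)   # $90,000
    stressed_el = expected_loss(stressed_pd, lgd, ead)   # $225,000
    print(f"incremental pricing element: ${stressed_el - baseline_el:,.0f}")

In Moss’s framing, that incremental element is what a bank could reflect in loan pricing, and what a borrower could reduce by investing in resilience.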

Read the full article in EXPOSURE to find out more about the new drought stress-test.


Stephen Moss: Modeling Drought Reveals Surprising Range of Impacts
Stephen Moss, director, capital markets at RMS, said droughts affect far more than agriculture, with impacts reaching financial portfolios and supply chains. Moss spoke with A.M. BestTV at the Exceedance 2017 conference.


From Real-time Earthquake Forecasts to Operational Earthquake Forecasting – A New Opportunity for Earthquake Risk Management?

Jochen Wössner, lead modeler, RMS Model Development

Delphine Fitzenz, principal modeler, RMS Model Development

Earthquake forecasting is in the spotlight again as an unresolved challenge for earth scientists, with the world tragically reminded of this by the deadly impacts of recent earthquakes in Ecuador and Italy. Questions constantly arise. For instance, when and where will the next strong shaking occur, and what can we do to be better prepared for the sequence of earthquakes that would follow the main shock? What actions and procedures need to be in place to mitigate the societal and economic consequences of future earthquakes?

The United States Geological Survey (USGS) started a series of workshops on “Potential Uses of Operational Earthquake Forecasting” (OEF) to understand what type of earthquake forecasting would provide the best information for a range of stakeholders and use cases. This included delivering information relevant to the public, official earthquake advisory councils, emergency management, post-earthquake building inspection, zoning and building codes, oil and gas regulation, the insurance industry, and capital markets. With the strong ties RMS has with the USGS, we were invited to the ongoing workshop series and contributed to the outline of potential products the USGS may provide in the future. These can act as the basis for new solutions for the market, as we outline below.

Operational Earthquake Forecasting: What Do Seismologists Propose?

The aim of Operational Earthquake Forecasting (OEF) is to disseminate authoritative information about time-dependent earthquake probabilities on short timescales ranging from hours to months. Given the large uncertainty in the model forecasts, there is considerable debate among earth scientists as to whether this effort is appropriate at all – especially during an earthquake sequence, when the pressure to disseminate information becomes intense.

Our current RMS models provide average annual earthquake probabilities for most regions around the world, although we know that these probabilities constantly fluctuate due to earthquake clustering on all timescales. OEF applications can provide daily to multi-year forecasts based on existing clustering models that update earthquake probabilities on a regular schedule or whenever an event occurs.
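
To illustrate the clustering idea, here is a minimal sketch of the Omori-Utsu law, an empirical aftershock decay relationship commonly used in such clustering models; the parameter values are hypothetical, not those of any RMS or USGS forecast.

    def omori_rate(t_days, k=50.0, c=0.05, p=1.1):
        """Expected aftershocks per day, t_days after a mainshock (Omori-Utsu law).
        k, c, p are illustrative; real forecasts fit them to each sequence."""
        return k / (t_days + c) ** p

    def expected_events(t1, t2, steps=10_000):
        """Numerically integrate the rate between t1 and t2 days after the mainshock."""
        dt = (t2 - t1) / steps
        return sum(omori_rate(t1 + (i + 0.5) * dt) * dt for i in range(steps))

    print(f"rate at day 1:  {omori_rate(1):5.1f} events/day")
    print(f"rate at day 30: {omori_rate(30):5.1f} events/day")
    print(f"expected events in week 1: {expected_events(0.01, 7):.0f}")

Updating such a rate whenever a new event occurs, and converting rates into probabilities of exceeding a shaking threshold, is the essence of what an OEF feed would disseminate.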

How Much Can We Trust Short-Term Earthquake Forecasting?

A vast amount of research focused on providing short-term earthquake forecasts (for a month or less) has been triggered by the Collaboratory for the Study of Earthquake Predictability (CSEP), spearheaded by scientists of the Southern California Earthquake Center (SCEC). The challenge is that the forecast probabilities are very small. They may increase by factors of 1,000 yet remain very small, jumping, for example, from one-in-a-million to one-in-a-thousand. Only in the case of an aftershock sequence would this climb above a 10 percent chance for a short period, and even then with considerable uncertainty between different models. Challenging as the task is, developments over the last 20 years have increased confidence, and such models are already implemented in some countries, such as New Zealand and Italy.
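
A quick back-of-the-envelope illustration of why large probability gains still leave small absolute probabilities (the numbers are purely illustrative):

    baseline = 1e-6               # assumed weekly chance of strong shaking
    gain = 1000                   # a large clustering-driven probability gain
    elevated = baseline * gain
    print(f"baseline: {baseline:.6%}  elevated: {elevated:.4%}")
    # baseline: 0.000100%  elevated: 0.1000% -- a 1,000x gain, still 1-in-1,000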

How Can We Use OEF and What Do We Require?

RMS is dedicated to exploring new solutions that can fill market needs. OEF could underpin such solutions if paired with reliable, authoritative, consistent, timely, and transparent feeds of information, and that potential can translate into innovations in understanding and managing earthquake risk in the near future.

About Jochen Wössner:

Lead Modeler, RMS Model Development

Jochen Wössner works on earthquake source and deformation modeling with a focus on earthquake models for Asia. He joined RMS in 2014 from ETH Zurich, following the release of the European Seismic Hazard Model as part of the European Commission SHARE project, which he led as project manager and senior scientist. Within the Collaboratory for the Study of Earthquake Predictability (CSEP), he has developed and contributed to real-time forecasting experiments, especially for Italy.

About Delphine Fitzenz:

Principal Modeler, RMS Model Development

Delphine Fitzenz works on earthquake source modeling for risk products, with a particular emphasis on spatio-temporal patterns of large earthquakes.  Delphine joined RMS in 2012 after over ten years in academia, and works to bring the risk and the earthquake science communities closer together through articles and by organizing special sessions at conferences.

These include the Annual Meeting of the Seismological Society of America (2015 and 2016), an invited talk at the Ninth International Workshop on Statistical Seismology in Potsdam, Germany in 2015, on “How Much Spatio-Temporal Clustering Should One Build Into a Risk Model?” and an invitation to “Workshop One: Potential Uses of Operational Earthquake Forecasting (OEF) System” in California.

Has That Oilfield Caused My Earthquake?

“Some six months have passed since the magnitude (Mw) 6.7 earthquake struck Los Angeles County, with an epicenter close to the coast in Long Beach. Total economic loss estimates are more than $30 billion. Among the affected homeowners, earthquake insurance take-up rates were pitifully low – around 14 percent. And even then, the punitive deductibles contained in their policies mean that homeowners may only recover 20 percent of their repair bills. So, there is a lot of uninsured loss looking for compensation. Now there are billboards with pictures of smiling lawyers inviting disgruntled homeowners to join class action lawsuits directed at several oilfield operators located close to the fault. For there is enough of an argument to suggest that this earthquake was triggered by human activities.”

This is not a wild hypothesis with little chance of establishing liability, or the lawyers would not be investing in the opportunity. There are currently three thousand active oil wells in Los Angeles County; there is even an oil derrick on the grounds of Beverly Hills High School. Los Angeles County is second only to its northerly neighbor, Kern County, in current levels of California oil production. In 2013, the U.S. Geological Survey (USGS) estimated there were 900 million barrels of oil still to be extracted from the coastal Wilmington Field, which extends for around six miles (10 km) through the Long Beach area, from Carson to Belmont Shore.

Beverly Hills High School. Picture credit: Sarah Craig for Faces of Fracking / Flickr

However, the Los Angeles oil boom dates back to the 1920s, when most of the large fields were first discovered. Two seismologists at the USGS have now searched back through the records of earthquakes and oil field production – and arrived at a startling conclusion: many of the earthquakes during this period appear to have been triggered by neighboring oil field production.

The Mw4.9 earthquake of June 22, 1920 had a shallow source that caused significant damage in a small area just a mile to the west of Inglewood. Local exploration wells releasing oil and gas pressures had been drilled at this location in the months before the earthquake.

A Mw4.3 earthquake in July 1929 at Whittier, some four miles (6 km) southwest of downtown Los Angeles, had a source close to the Santa Fe Springs oil field, one of the top producers through the 1920s, which had been drilled deeper and had seen a production boom in the months leading up to the earthquake.

A Mw5 earthquake occurred close to Santa Monica on August 31, 1930, in the vicinity of the Playa del Rey oilfield at Venice, California, a field first identified in December 1929 with production ramping up to four million barrels over the second half of 1930.

The epicenter of the Mw6.4 1933 Long Beach earthquake, on the Newport-Inglewood Fault, was in the footprint of the Huntington Beach oilfield at the southern end of this 47-mile-long (75 km) fault.

As for a mechanism: the Groningen gas field in the Netherlands shows how earthquakes can be triggered simply by the extraction of oil and gas, as reductions in load and compaction cause faults to break.

More Deep Waste Water Disposal Wells in California than Oklahoma

Today many of the Los Angeles oilfields are managed through secondary recovery – pumping water into the reservoir to flush out the oil. That introduces an additional potential mechanism for generating earthquakes – raising deep fluid pressures – as currently experienced in Oklahoma. And Oklahoma is not even the number one U.S. state for deep waste water disposal: between 2010 and 2013 there were 9,900 active deep waste water disposal wells in California, against 8,600 in Oklahoma. And the California wells tend to be deeper.

More than 75 percent of the state’s oil production and more than 80 percent of all injection wells are in Kern County in central California, which is also where the largest earthquake in the region over the past century occurred: the Mw7.3 event on the White Wolf Fault in 1952. In 2005, there was an abrupt increase in the rates of waste water injection close to the White Wolf Fault, followed by an unprecedented swarm of four earthquakes over magnitude 4 on the same day in September 2005. The injection and the seismicity were linked in a 2016 research paper by Caltech and University of Southern California seismologists. One neighboring well, delivering 57,000 cubic meters of waste water each month, had been started just five months before the earthquake swarm broke out. The seismologists found a smoking gun: a pattern of smaller shocks migrating from the site of the well to the location of the earthquake cluster.

To summarize: we know that raising fluid pressures at depth can cause earthquakes, as is the case in Oklahoma and also in Kern County, CA. We know there is circumstantial evidence of a connection between specific damaging earthquakes and oil extraction in southern California in the 1920s and 1930s. And wherever the next major earthquake strikes in southern or central California, there is a reasonable probability that an actively managed oilfield or waste water disposal well will be in the vicinity.

Whoever is holding the liability cover for that operator may need some deep pockets.

EXPOSURE Magazine Snapshots: A New Way of Learning

This is a taster of an article published by RMS in the second edition of EXPOSURE magazine.  Click here and download your full copy now.

7 Apr 2017

In EXPOSURE magazine, we delved into the algorithmic depths of machine learning to better understand the data potential it offers the insurance industry. In the article, Peter Hahn, head of predictive analytics at Zurich North America, illustrated how pattern recognition sits at the core of current machine learning. How do machines learn? Hahn compares it to how a child is taught to differentiate between similar animals: a machine “learns” by viewing numerous different, clearly tagged pictures of the animals, again and again.

Hahn comments: “Over time, the machine intuitively forms a pattern recognition that allows it to tell a tiger from, say, a leopard. You can’t predefine a set of rules to categorize every animal, but through pattern recognition you learn what the differences are.”

Hahn adds that pattern recognition is already a part of how underwriters assess a risk. “A decision-making process will obviously involve traditional, codified analytical processes, but it will also include sophisticated pattern recognition based on their experiences of similar companies operating in similar fields with similar constraints. They essentially know what this type of risk ‘looks like’ intuitively.”
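
As a minimal illustration of the labeled-examples idea Hahn describes (a toy sketch, not any insurer’s actual tooling), a classifier can be fit to clearly tagged feature vectors and then asked to label a new, untagged example:

    from sklearn.ensemble import RandomForestClassifier

    # Toy labeled data: each picture is reduced to two made-up features,
    # (stripe_density, spot_density); the labels are the "tags" on the pictures.
    features = [
        [0.90, 0.10], [0.80, 0.20], [0.85, 0.15],   # tigers: stripes, few spots
        [0.10, 0.90], [0.20, 0.80], [0.15, 0.85],   # leopards: spots, few stripes
    ]
    labels = ["tiger", "tiger", "tiger", "leopard", "leopard", "leopard"]

    # "Learning" means fitting patterns that separate the tagged examples;
    # no explicit tiger-versus-leopard rules are ever written down.
    model = RandomForestClassifier(n_estimators=50, random_state=0)
    model.fit(features, labels)

    # A new, untagged animal: mostly striped, lightly spotted.
    print(model.predict([[0.75, 0.25]])[0])   # prints: tiger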

The Potential of Machine Learning

EXPOSURE magazine asked Christos Mitas, vice president of model development at RMS, how he sees machine learning being used. Mitas opened the discussion saying: “We are now operating in a world where data is expanding exponentially, and machine learning is one tool that will help us to harness it.”

Here are three areas where Mitas believes machine learning will make an impact:

Cyber Risk Modeling: Mitas adds: “Where machine learning can play an important role here is in helping us tackle the complexity of this risk. Being able to collect and more effectively digest the immense volumes of data harvested from numerous online sources and datasets will yield a significant advantage.”

Image Processing: “With developments in machine learning, for example, we might be able to introduce new data sources into our processing capabilities and make it a faster and more automated data management process to access images in the aftermath of a disaster. Further, we might be able to apply machine learning algorithms to analyze building damage post event to support speedier loss assessment processes.”

Natural Language Processing: “Advances here could also help tremendously in claims processing and exposure management,” Mitas adds, “where you have to consume reams of reports, images and facts rather than structured data. That is where algorithms can really deliver a different scale of potential solutions.”

For the full article and more insight for the insurance industry, click here and download your full copy of EXPOSURE magazine now.

For more information on RMS(one)®, a big data and analytics platform built from the ground-up for the insurance industry, and solutions such as Risk Modeler and Exposure Manager, please click here.

EXPOSURE Magazine Snapshots: Evolution of the Insurer DNA

This is a taster of an article published in the second edition of EXPOSURE magazine.  Click here and download your full copy now.

6 Apr 2017

Many in (re)insurance recognize that the industry is at a tipping point. Rapid technological change, disruption through new, more efficient forms of capital, and an evolving risk landscape are challenging industry incumbents like never before. EXPOSURE magazine reported that, inevitably, the winners will be those who find ways to harmonize analytics, technology, industry innovation, and modeling.

“Disruptive innovation” is increasingly obvious in areas such as personal lines insurance, with disintermediation, the rise of aggregator websites and the Internet of Things (IoT).  In the commercial insurance and reinsurance space, disruptive technological change has been less obvious, but behind the scenes the industry is undergoing some fundamental changes.

The tipping point, the “Uber moment,” has yet to arrive in reinsurance, according to Michael Steel, global head of solutions at RMS. “The change we’re seeing in the industry is constant. We’re seeing disruption throughout the entire insurance journey. It’s not the case that the industry is suffering a short-term correction after which the market will go back to the way it has done business previously. The industry is under huge competitive pressures, and the change we’re seeing is permanent and will be continuous over time.”

While it is impossible to predict exactly how the industry will evolve, it is evident that tomorrow’s leading (re)insurance companies will share certain attributes. These include a strong appetite to harness data and invest in new technology and analytics capabilities, the drive to differentiate and design new products and services, and the ability to collaborate. According to Eric Yau, general manager of software at RMS, the goal of an analytics-driven organization is to leverage the right technologies to bring data, workflow and business analytics together to continuously drive more informed, timely and collaborative decision-making across the enterprise.

“New technologies play a key role, and while there are many choices with the rise of insurtech firms, history shows us that success is achieved only when the proper due diligence is done to really understand and assess how these technologies enable the longer-term business strategy, goals and objectives,” says Yau. He also believes that one of the most important ingredients of success is the ability to blend the right team of technologists, data scientists and domain experts, who can work together to understand and deliver on these key objectives.

Looking for Success in this New World

Which factors will help companies stand out and compete in the future?  EXPOSURE asked industry experts for their views on the attributes that winning companies will share:

The Race for Millennial Talent:  The most successful companies will look to attract and retain the best talent, says Rupert Swallow, co-founder and CEO of Capsicum Re, with succession planning that puts a strong emphasis on bringing Millennials up through the ranks. “There is a huge difference between the way Millennials look at the workplace and live their lives, versus industry professionals born in the 1960s or 1970s — the two generations are completely different,” says Swallow. “Those guys [Millennials] would no sooner write a check to pay for something than fly to the moon.”

Collaboration is the Key: There are numerous examples of tie-ups between (re)insurance industry incumbents and tech firms to leverage technology – or insurtech – expertise and get closer to the original risk. One example of a strategic collaboration is Attune, an MGA set up last year by AIG, Hamilton Insurance Group, and affiliates of Two Sigma Investments. Through the partnership, AIG gained access to Two Sigma’s vast technology and data-science capabilities to grow its market share in the U.S. small to mid-sized commercial insurance space.

Blockchain:  Blockchain offers huge potential to reduce some of the significant administrative burdens in the industry, thinks Kurt Karl, chief economist at Swiss Re. “Blockchain for the reinsurance space is an efficiency tool. And if we all get more efficient, you are able to increase insurability because your prices come down, and you can have more affordable reinsurance and therefore more affordable insurance. So I think we all win if it’s a cost saving for the industry.”

“The challenge for the industry is to remain relevant to our customers,” says RMS’ Michael Steel. “Those that fail to adapt will get left behind. To succeed you’re going to need greater information about the underlying risk, the ability to package the risk in a different way, to select the appropriate risks, differentiate more, and construct better portfolios.”

For the full article and more insight for the insurance industry, click here and download your full copy of EXPOSURE magazine now.

Watch Video: Eric Yau – Managing Risk is an Interconnected Process

Eric Yau, general manager, software business unit at RMS, said those managing risk should keep in mind that risk selection is part of an overall process that affects capacity and portfolio strategy. Yau spoke with A.M. BestTV at the Exceedance 2017 conference.

For more information on RMS(one)®, a big data and analytics platform built from the ground-up for the insurance industry, and solutions such as Risk Modeler and Exposure Manager, please click here.

EXPOSURE Magazine Snapshots: The Analytics Driven Organization

This is a taster of an article published in the second edition of EXPOSURE magazine.  Click here and download your full copy now.

5 Apr 2017

Farhana Alarakhiya, vice president, products at RMS, writes… In my recent article in EXPOSURE magazine, I was interested in exploring how firms in the insurance sector can move towards building a more analytics-driven organization.  Being analytics-driven translates to being an agile business, and in a turbulent market landscape, building underwriting agility is becoming critical to business survival.

There is no doubt we have seen revolutionary technological advances and an explosion of new digital data sources, which have reinvented the core disciplines of insurers over the past 15 years. Many (re)insurers also see big data and analytics (BD&A) as a “silver bullet” that can provide competitive advantage and address their current market challenges.

Like other industries that continue to invest heavily in BD&A to secure their position and open a new chapter of growth, the insurance sector is ramping up investment in open BD&A platforms such as RMS(one)®, which is purpose-built for the insurance industry. But although there is a real buzz around BD&A, what may be lacking is a big data strategy specifically for evolving pricing, underwriting and risk selection, areas which offer huge potential gains for firms.

With the opportunity to gain transformational agility in analytics now within reach, we need to be conscious of how to avoid DRIP (being data rich but information poor), with too much focus on data capture, management, and structures at the expense of creating usable insights that can be fed to the people at the point of impact. Nor is regulation the barrier to success: many other regulated business areas have transformed themselves and gained agility through effective analytics.

Please read the full article in EXPOSURE magazine to discover more about the three main lessons insurers can learn from businesses that have their BD&A recipe just right, but here’s a short summary:

Lesson #1 – Delivering Analytics to the Point of Impact

Insurers commonly rely on back-office processes for analytics, but that model would not work for, say, a frontline healthcare worker. Data analysts are rare in healthcare because analytics are designed around the worker’s role, to support their delivery directly. By contrast, a portfolio manager in the insurance sector typically works in tandem with an analyst just to get relevant data, let alone insight, which compromises their ability to perform effectively.

Lesson #2 – Ensuring Usability

It is vital to recognize the workflow of an analytics user and to give due consideration to the veracity of the data provided, so as to reduce uncertainty. Looking at our healthcare example, analytics tools used by doctors to diagnose a patient’s condition use standardized information – age, sex, weight, height, ethnicity, address – and the patient’s symptoms.

Doctors are provided not with a defined prognosis but with a set of potential diagnoses, each accompanied by a probability score and its sources. Imagine this level of analytical capability provided in real time at the point of underwriting, where the underwriter not only has access to the right set of analytics but also has a clear understanding of the other options and the underlying assumptions.
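
A minimal sketch of what such a ranked, probability-scored output could look like in code; the structure, names and numbers are hypothetical, not any particular product’s API.

    from dataclasses import dataclass

    @dataclass
    class Option:
        name: str            # candidate diagnosis, or underwriting action
        probability: float   # the model's confidence score
        sources: list        # the evidence behind the score

    def rank_options(options):
        """Present alternatives with scores and sources, not a single verdict."""
        return sorted(options, key=lambda o: o.probability, reverse=True)

    candidates = [
        Option("condition A", 0.62, ["symptom match", "age/sex prior"]),
        Option("condition B", 0.27, ["partial symptom match"]),
        Option("condition C", 0.11, ["rare, but consistent with history"]),
    ]
    for o in rank_options(candidates):
        print(f"{o.name}: {o.probability:.0%} (sources: {', '.join(o.sources)})")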

Lesson #3 – Integration into the Common Workflow

To achieve data nirvana, BD&A output needs to integrate naturally into daily business-as-usual operations. When analytics are embedded directly into the daily workflow, there is a far higher success rate of them being put to effective use. In customer service technology, for example, all the required systems are integrated directly into the customer agents’ software for a holistic view of the customer. Platforms built and designed with open architecture allow legacy systems and your intellectual-property-intensive processes to be integrated, giving users access to analytics and the ability to derive insights as part of the daily workflow for every risk they write.

This is a taster of an article published in the second edition of EXPOSURE magazine.  Click here and download your full copy now.

Watch Video: Farhana Alarakhiya – The Data Challenge Is Getting It to the Right People

Farhana Alarakhiya, vice president, products at RMS, said insurers are responding to the allure of big data, but must focus on turning voluminous data into meaningful insights. Alarakhiya spoke with A.M. BestTV at the Exceedance 2017 conference.

For more information on RMS(one)®, a big data and analytics platform built from the ground-up for the insurance industry, and solutions such as Risk Modeler and Exposure Manager, please click here.

“Computers Do the Calculating to Allow People to Transform the World.”

The quote above is from Conrad Wolfram, the renowned British mathematician, well known as an advocate for the advancement of mathematics teaching.  He argues that teaching students how to calculate using computers is more effective and more liberating than teaching calculation by hand. In his talk at TEDGlobal 2010, he describes his four-step process to solve a math problem:

  1. Pose the right question
  2. Formulate the question mathematically
  3. Compute the answer
  4. Verify that the computation answered the question

Currently, Wolfram believes 80 percent of math education focuses on step three – computation – and teaching people how to compute by hand.  Instead, he proposes “…we ought to use computers to do step three, so students can spend much more effort on learning how to do steps one, two and four – conceptualizing problems, applying them.”
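
As a toy illustration of handing step three to the machine (a hypothetical word problem, solved here with the sympy library):

    import sympy as sp

    # Step 1 - pose the question: when does a ball thrown upward at
    #          20 m/s return to the ground?
    # Step 2 - formulate it: height h(t) = 20t - 4.9t^2; solve h(t) = 0 for t > 0.
    t = sp.symbols("t", positive=True)
    height = 20 * t - sp.Rational(49, 10) * t**2

    # Step 3 - computation, handed entirely to the machine.
    (answer,) = sp.solve(sp.Eq(height, 0), t)

    # Step 4 - verify that the computation answered the question.
    assert height.subs(t, answer) == 0
    print(f"returns to ground after {answer} s = {float(answer):.2f} s")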

The proper development and utilization of modern computer systems, including hardware and software advances, should enable Wolfram’s vision to come true, with users moving their time away from calculation and process – step three issues – and toward conceptualizing problems and applying the solutions effectively. Through my recent hands-on experience, I am confident that Risk Modeler, powered by RMS(one)®, will truly allow risk analysts to shift that time allocation. I shared my experience of Risk Modeler on the keynote stage at Exceedance, and I invite you to watch the video below.

4 Apr 2017


In 1999, a Legend Was Born

Many of you reading this may not realize it, but RiskLink® celebrates its 18th birthday this September. RiskLink was born in 1999, and for some of you, RiskLink started its cat modeling career before you did. I can remember using RiskLink back then, and it is a testament to the quality of the product that it remains the predominant catastrophe modeling software. I’ve grown up with it, as many of us have, and with that kind of longevity and familiarity it is no wonder that few people even question the elongated process involved in completing an analysis using this bedrock of catastrophe management.

That process is a lot of work: file management, model profile management, financial perspective re-maps, system restrictions. Wolfram’s 80 percent figure looks like a reasonable analogy – up to 80 percent of your natural catastrophe modeling time can be spent on this process.

We’ll celebrate 18 successful years of RiskLink, but the market is shifting to embrace big data analytics, and the timing is right for Risk Modeler. Risk Modeler is built specifically to work with large amounts of data, removing the procedural, tactical component of your work and moving it to an efficient, speedy system.

How Would You Use Your Process Time?

Reallocating that process time allows you to spend more of it using your experience and intuition to conceptualize, understand and guide your business more effectively. You can start to ask and answer questions that anticipate the business’s needs. You can spend more time proactively working on change management with your key stakeholders. You can work more directly with the underwriting teams to understand and differentiate risks more thoroughly.

Risk Modeler is an efficient interface between your insight and experience and the analytical power of cloud-based computing. It allows you to simply ask a question, and it delivers the answer.   Mr. Wolfram reminds us that, “…math is not equal to calculating. Math is a much broader subject than calculating…I think of calculating, in a sense, as the machinery of math. It’s the chore. It’s the thing you’d like to avoid if you can, like to get a machine to do.”

Modeling is more than process; process is the chore of risk modeling, and I am excited that Risk Modeler is the system capable of completing that chore for you. You can now unleash your energy, creativity, insight, and experience on improving your company and your industry, and on helping make the world more resilient.

For more information on RMS(one), a big data and analytics platform built from the ground-up for the insurance industry, and solutions such as Risk Modeler and Exposure Manager, please click here.

“Specialty Lines Are All About Opportunity”

When we started RMS almost 28 years ago, the Specialty and E&S markets were among the first to embrace the benefits of catastrophe modeling. In many ways, I learned about (re)insurance at the knee of the specialty markets, which gave me an appreciation of the sophisticated and myriad ways this market provides coverage and underwrites business. What struck me then, as it does today, is the entrepreneurial nature of the specialty lines. Underwriter-driven and close to their markets, they are constantly on their toes, looking to identify new opportunities to offer innovative coverage and to write profitable business.

Last week, during our annual client conference Exceedance 2017, I welcomed almost 900 participants across clients, partners, and our RMS experts and client teams.  During the three-day conference, we convened a Specialty Roundtable with a cross-section of our clients from the U.S. specialty lines market to discuss the priorities for this dynamic sector of the industry.

The discussion was lively and ran across several themes, from identifying new opportunities in today’s market, to the benefits of well-informed stakeholders, to competing on data in a market increasingly driven by agile, real-time analytics.

Here are some highlights from our discussions:

Baking A Bigger Pie

“Specialty lines are all about opportunity,” said one participant, and from the discussion it became clear that the protection gap is not confined to emerging markets. Even in the U.S., penetration and coverage for earthquake and flood risk are limited relative to the underlying exposures. Another participant stressed the need to move beyond the status quo, stating: “It’s not about competing in a zero-sum game; we need to expand the market.” And although it was recognized that the current market has its challenges, one participant remarked that “…within every market there is opportunity.” We also discussed how new RMS models for earthquake and flood can help firms gain new insights to better identify and write quality business, expanding the pie.

Educating the Market

Another imperative, one that came through loud and clear during the discussion, is the importance of a well-informed market – not just the underwriters, but also upstream with the producers and buyers. The group felt that there continues to be too much focus on using average annual loss (AAL) as the basis for placing individual accounts, with an insufficient understanding of the standard deviations, tail correlations, and contributory metrics. This is particularly the case for earthquake risk, the quintessential tail-risk peril. With the April release of the updated RMS North America Earthquake Model, we’re giving clients a more complete view of the risk, with the “tails” of the exceedance probability (EP) curve playing an even more important role than in the past. We discussed steps RMS is taking to inform key stakeholders, and we will continue to do more to proactively educate the entire market.
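
For readers less familiar with the metrics being contrasted, here is a minimal sketch of how AAL and points on an EP curve fall out of a table of simulated annual losses; the loss distribution is a toy assumption, not output from any RMS model.

    import numpy as np

    # Toy table of simulated annual losses, in $M (hypothetical distribution).
    rng = np.random.default_rng(42)
    losses = rng.lognormal(mean=0.0, sigma=2.0, size=10_000)

    # AAL is simply the mean annual loss -- it says nothing about the tail.
    print(f"AAL: {losses.mean():.1f}")

    # The EP curve gives the probability of exceeding each loss level; the
    # 1-in-N-year loss is the (1 - 1/N) quantile of the annual losses.
    for rp in (10, 100, 250):
        loss_at_rp = np.quantile(losses, 1 - 1 / rp)
        ep = (losses > loss_at_rp).mean()
        print(f"1-in-{rp}-year loss: {loss_at_rp:6.1f}  (EP = {ep:.3f})")

Two portfolios can share an AAL yet have very different tails, which is why placing accounts on AAL alone understates tail-risk perils such as earthquake.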

The Analytic Enterprise

Analytic agility was a constant theme throughout our discussion, with one participant remarking: “With analytics, you’re either on the bus, or off the bus.” It was agreed that there is no half-way measure in adopting analytics. All participants emphasized the central role of the underwriter in the risk-decision process, applying their experience and judgement, supported by analytics, to make sound decisions. However, there was much discussion of how underwriting, portfolio management and analytics need to become increasingly agile and more tightly coupled. Statements such as “I need real-time” were made, underlining why up-to-date portfolio information is critical to proactively managing a book. The importance of dynamic insight was also emphasized: “…underwriting works in lock-step with the portfolio; you can’t look at one without the other, particularly in this market.” And the need to empower underwriters with analytics will only grow, since “…you can never have enough data,” with the market now “more data-driven than ever.”

30 Mar 2017

Having had hands-on access to the RMS(one) solutions in The Lab at Exceedance, several of the group were pleased to note its apparent benefits, from new exposure management and analytic applications to more flexible representations of contracts and financial structures.  “We’re seeing a lot of contracts that are getting more complex, and RMS(one) using CDL (contract definition language) will help.”

It was motivating for me to hear the excitement for our new data and analytics platform, with one member saying “after seeing RMS(one), I’m excited that I can be more innovative.”

Continuing to stay close to what is important for our clients and their markets is a strategic priority for RMS.  At a time of great change across the industry, agility is instrumental to mitigating risks and seizing new opportunities, and the specialty markets are at the forefront.

Day Four at Exceedance 2017

Thursday in New Orleans, and there was still much to see and learn on the final morning of Exceedance.

Attendees were taking advantage of all there was to offer in The Lab, including connecting with RMS experts for product demonstrations and training for the latest Version 17 and Risk Modeler developments.

23 Mar 2017

As mentioned in yesterday’s blog, attendance has been exceptional at Exceedance, and some track sessions have been so popular that Thursday’s agenda was updated to repeat several of them: High Definition Modeling capabilities, Version 17 RMS® North America Earthquake and RMS® North Atlantic Hurricane model changes, RMS(one)® solutions, the RMS roadmap and future solutions, and the U.S. flood market.

A Personal Message from Hemant

The RMS Exceedance Party (EP) Was the Place to Be!

Those who attended the EP Wednesday night at Generations Hall were treated to quite a party! Along with three separate spaces – each with its own New Orleans theme – many grooved to the music of Rockin’ Dopsie, Jr. and the Zydeco Twisters.


Mr. Dopsie (or perhaps it’s “Mr. Rockin’”) had the dance floor alive with revelers moving to the beat of local zydeco as well as hits from the past. The night was capped off with Café Du Monde serving their world-famous beignets.

A Final Note on Exceedance 2017

As Exceedance 2017 comes to a successful conclusion, all of us here at RMS want to thank those who came from around the globe to be in attendance.

This was truly “your” conference, and we hope you found value in listening to our keynote speakers on the big stage, as well as learning more about our exciting updates and new solutions that will enable you to own your view of risk, provide the flexibility you need to make decisions, operate more cost-effectively, and create resilience.

As we move beyond this year’s Exceedance, RMS is ready to meet its commitments as we remain on track for a full schedule of delivery throughout 2017!

Day Three at Exceedance 2017

It’s Wednesday, which meant another full day of sessions, presentations, The Lab, a networking event, and more, happening here in New Orleans.

22 Mar 2017

Attendance has been exceptional at Exceedance, and some track sessions have been so popular that we are repeating a few of them. For those of you here in New Orleans, the sessions will repeat on Thursday morning, starting at 10 a.m. Be sure to check the Exceedance app for details.

The main theme of the morning’s general session was a demonstration of how RMS is working to help clients explore and manage new and emerging perils, as well as applying RMS model expertise to long-standing lines. Speakers included Mike Steel, Christos Mitas, Robert Reville, Steve Jewson, and Andrew Coburn.

Wednesday Highlights

A few of the highlights of the day’s sessions included:

  • Christos Mitas took us deep into what he described as the unique and exceptional world of cyber terror and cyber risk modeling, with insights that included the upcoming (April 2017) launch of the RMS Cyber Accumulation Management System (CAMS) v2.0.
  • Robert Reville from Praedicat explored product stewardship and product liability risk, explaining the causes of liability accumulation, how the risk of major technological innovation is not known, and how risk accumulation can go on for years.
  • Steve Jewson transported us to India and China, presenting new agricultural risk models – including drought models for four countries. Agricultural risk is one of the top concerns for our clients in Asia-Pacific and Latin America, offering the market exciting growth opportunities.
  • Andrew Coburn from RMS and Dr. Hjörtur Thráinsson from Munich Re jointly presented the RMS strategy of a single data standard for all lines and classes of insured exposure, as well as opportunities to generate exposure analytics for more business lines, a single client, or a single location.

Our Second Theme – Resilience – Personified.

The afternoon general session focused on resilience, and the exceptional work happening here in New Orleans over the past several years. Paul Wilson began by acknowledging the accomplishments of Build Change, a partner organization to RMS that continues to build resilience in emerging nations.

He then walked the crowd through a brief history of New Orleans – a city that has been built, and rebuilt, on its experience with hurricanes – before introducing keynote speakers Tanya Harris-Glasow of the Make It Right Foundation and Jeff Herbert, chief resilience officer for New Orleans. The success of the city following Hurricane Katrina stems from the efforts of innovators like them, and their stories of strength, perseverance, teamwork, and inspiration truly personify the theme of resilience.

The session continued with Dr. Robert Muir-Wood’s discussion on risk modeling and resilience in Louisiana, and concluded with remarks from RMS President Mike Pritula, who spoke on a variety of topics including his commitment to concentrate on RMS clients, and the challenge of embracing the inevitable change that technology is bringing to the catastrophe modeling community.

The Lab is the Hot Spot in New Orleans!

Customer feedback about The Lab continues to be extremely positive – with a lot of great conversations, product demos, and training sessions focusing on the latest developments from Version 17 and Risk Modeler to help customers choose the best routes for adopting new solutions for 2017 and beyond.

Get Your Mojo Rising at the EP Tonight!

Last night’s well-attended masquerade in The Lab is now a happy memory. And as far as we can tell, everyone removed their masks in time for the first general session this morning.


But if you’re here at Exceedance, our legendary “EP” is coming up tonight – offering three tastes of New Orleans in one unique location: Generations Hall, built in the 1820s and originally a sugar refinery.

With three themes, Jazz Night Club, Mardi Gras, and Louisiana Cajun, be ready to put on your dancing shoes and show us your voodoo.

Thursday is our final day in New Orleans, so please check back tomorrow for highlights and a message from Hemant!