Tag Archives: Risk Management

Irma’s Landfall in Florida Could Have Been Worse

19:00 UTC  Monday, September 11

Tom Sabbatelli, hurricane risk expert – RMS

Irma is a hurricane no more: following a Category 3 strength second Florida landfall near Marco Island, south of Naples, the storm’s intensity faded rapidly as it traveled to the north and west overnight. Irma now maintains only tropical storm strength as it crosses northern Florida. Expect further commentary on the storm’s rapid decay from my colleagues in the RMS HWind team later today.

Irma’s landfall and overnight track have allowed RMS to narrow yesterday’s stochastic event selection. A landfall eliminates offshore track scenarios that produce lower levels of onshore wind but higher levels of storm surge hazard along Florida’s west coast. Irma’s passage to the east of Tampa reduces the risk of significant storm surge levels and loss near Tampa Bay.
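Conceptually, that narrowing is a filtering exercise over the stochastic event set. Here is a minimal sketch of the idea in Python; the event attributes are hypothetical and do not reflect the actual RMS event-set schema:

```python
# A conceptual sketch of narrowing a stochastic event selection after a
# landfall is observed. Event attributes are hypothetical, not the RMS
# event-set schema.
from dataclasses import dataclass

@dataclass
class StochasticEvent:
    event_id: int
    makes_landfall: bool         # offshore bypass tracks are False
    passes_east_of_tampa: bool   # tracks east of Tampa reduce Bay surge

candidate_events = [
    StochasticEvent(101, True, True),    # Marco Island-like track
    StochasticEvent(102, False, False),  # offshore, surge-heavy scenario
    StochasticEvent(103, True, False),   # west-coast-hugging track
]

# Observed: a Florida landfall with the track passing east of Tampa,
# so offshore and west-of-Tampa scenarios drop out of the selection.
selected = [e for e in candidate_events
            if e.makes_landfall and e.passes_east_of_tampa]
print([e.event_id for e in selected])    # -> [101]
```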

Initial damage reports indicate that damage may not be as severe as once feared, despite sizable roof damage in Naples and the Florida Keys.

As per the RMS Event Response process, our attention shifts from the regular, reactive stochastic event selections to a comprehensive interrogation of all causes of loss, both modeled and unmodeled. During Hurricane Harvey, my colleague Michael Young explained that one of the biggest challenges we face is collecting enough wind observations to create a complete picture of a storm’s wind field. We face the same challenge with Irma, even though our RMS HWind team has been collecting and processing data every three hours. More data will become available in the coming days, which will enhance the accuracy of our wind field and surge modeling.

As part of our event response service, we have the following activities underway:

  • Reconnaissance will help us narrow the uncertainty around what could be a significant contribution to loss from storm surge and flooding. These efforts do not need to wait for boots to hit the ground, though; we have teams already scouring high-resolution satellite images to detail the exact extent of the flood waters and the underlying exposure at risk.
  • Our RMS Field Recon team, fresh off their trips to Texas following Hurricane Harvey, are reactivating their reconnaissance plans. In the days ahead, their visits to cities across Florida will help reveal the depth of the flood waters and the extent of wind damage observed throughout the state.

You may have already seen that our colleague, Victor Roldan, has been documenting his experience riding out the storm from his home in Miami. Victor has reported that basements are flooded along Brickell Avenue, an area that was hard hit in Wilma, but wind damage is minimal.

Following a hurricane, power outages are a major driver of heightened business interruption and of post-event loss amplification, both of which are possible in an event of this magnitude. As many as seven million customers in Florida may have been without power at one time, almost triple the peak outage observed during Hurricane Matthew.

Caribbean Impacts

Let’s not forget about the Caribbean islands left in Irma’s wake, still cleaning up and attempting to restore power and telecommunications several days after the storm’s initial impact. For instance, as many as one-quarter of customers in Puerto Rico remain without power four days after Irma’s passage. This prolonged restoration could compound insured losses across the island. During their comprehensive review of the event’s lifecycle, RMS modelers will refine projections of the insured loss across all Caribbean islands, which is expected to contribute materially to the total industry loss.

As the Event Response team transitions away from producing real-time event updates, it can draw these critical observations from many existing key data sources. Ultimately, these insights will inform our official insured industry loss estimate, targeted for publication in approximately two weeks’ time.

Recent Attacks Illustrate the Principles of Terrorism Risk Modeling

Some fifteen years after terrorism risk modeling began in the wake of 9/11, it is still suggested that the vagaries of human behavior render terrorism risk an impossible modeling challenge, and the core principles underlying terrorism risk modeling remain not widely appreciated. Terrorism risk modeling, as it has developed and evolved from an RMS perspective, is unique in being based on solid principles, which are as crucial to it as the laws of physics are to natural hazard modeling. The recent high-profile terrorist attacks in London, Stockholm, and Paris adhere to many of these principles.

Continue reading

EXPOSURE Magazine Snapshots: Water Security – Managing the Next Financial Shock

This is a taster of an article published by RMS in the second edition of EXPOSURE magazine.  Click here and download your full copy now.


EXPOSURE magazine reported on how a pilot project to stress test banks’ exposure to drought could hold the key to future economic resilience, as recognition grows that environmental stress testing is a crucial instrument to ensure a sustainable financial system.

Continue reading

From Real-time Earthquake Forecasts to Operational Earthquake Forecasting – A New Opportunity for Earthquake Risk Management?

Jochen Wössner, lead modeler, RMS Model Development

Delphine Fitzenz, principal modeler, RMS Model Development

Earthquake forecasting is in the spotlight again as an unresolved challenge for earth scientists, with the world tragically reminded of this by the deadly impacts of recent earthquakes in Ecuador and Italy. Questions constantly arise. For instance, when and where will the next strong shaking occur, and what can we do to be better prepared for the sequence of earthquakes that would follow the main shock? What actions and procedures need to be in place to mitigate the societal and economic consequences of future earthquakes?

Continue reading

Has That Oilfield Caused My Earthquake?

“Some six months have passed since the magnitude (Mw) 6.7 earthquake struck Los Angeles County, with an epicenter close to the coast in Long Beach. Total economic loss estimates are more than $30 billion. Among the affected homeowners, earthquake insurance take-up rates were pitifully low – around 14 percent. And even then, the punitive deductibles contained in their policies mean that homeowners may recover only 20 percent of their repair bills. So, there is a lot of uninsured loss looking for compensation. Now there are billboards with pictures of smiling lawyers inviting disgruntled homeowners to become part of class action lawsuits directed at several oilfield operators located close to the fault. For there is enough of an argument to suggest that this earthquake was triggered by human activities.”

This is not a wild hypothesis with little chance of establishing liability, or the lawyers would not be investing in the opportunity. There are currently three thousand active oil wells in Los Angeles County. There is even an oil derrick on the grounds of Beverly Hills High School. Los Angeles County is second only to its northerly neighbor Kern County in terms of current levels of oil production in California. In 2013, the U.S. Geological Survey (USGS) estimated there were 900 million barrels of oil still to be extracted from the coastal Wilmington Field, which extends for some six miles (10 km) around Long Beach, from Carson to Belmont Shore.

Beverly Hills High School. Picture credit: Sarah Craig for Faces of Fracking / Flickr

However, the Los Angeles oil boom was back in the 1920s, when most of the large fields were first discovered. Two seismologists at the USGS have now searched back through the records of earthquakes and oil field production – and arrived at a startling conclusion: many of the earthquakes during this period appear to have been triggered by neighboring oil field production.

The Mw4.9 earthquake of June 22, 1920, had a shallow source that caused significant damage in a small area just a mile to the west of Inglewood. Local exploration wells releasing oil and gas pressures had been drilled at this location in the months before the earthquake.

A Mw4.3 earthquake in July 1929 at Whittier, some four miles (6 km) southwest of downtown Los Angeles, had a source close to the Santa Fe Springs oil field, one of the top producers through the 1920s; the field had been drilled deeper and had a production boom in the months leading up to the earthquake.

A Mw5 earthquake occurred close to Santa Monica on August 31, 1930, in the vicinity of the Playa del Rey oilfield at Venice, California, a field first identified in December 1929 with production ramping up to four million barrels over the second half of 1930.

The epicenter of the Mw6.4 1933 Long Beach earthquake, on the Newport-Inglewood Fault, was in the footprint of the Huntington Beach oilfield at the southern end of this 47-mile-long (75 km) fault.

As for a mechanism: the Groningen gas field in the Netherlands shows how earthquakes can be triggered simply by the extraction of oil and gas, as reductions in load and compaction cause faults to break.

More Deep Waste Water Disposal Wells in California than Oklahoma

Today many of the Los Angeles oilfields are being managed through secondary recovery – pumping water into the reservoir to flush out the oil. That gives us an additional potential mechanism for generating earthquakes – raising deep fluid pressures – as currently experienced in Oklahoma. And Oklahoma is not even the number one U.S. state for deep waste water disposal: between 2010 and 2013 there were 9,900 active deep waste water disposal wells in California, compared with 8,600 in Oklahoma. And the California wells tend to be deeper.

More than 75 percent of the state’s oil production and more than 80 percent of all injection wells are in Kern County, central California, which happens to be close to the site of the largest earthquake in the region over the past century: the Mw7.3 event of 1952 on the White Wolf Fault. In 2005, there was an abrupt increase in the rates of waste water injection close to the White Wolf Fault, followed by an unprecedented swarm of four earthquakes over magnitude 4 on the same day in September 2005. The injection and the seismicity were linked in a research paper by Caltech and University of Southern California seismologists published in 2016. One neighboring well, delivering 57,000 cubic meters of waste water each month, had started up just five months before the earthquake swarm broke out. The seismologists found a smoking gun: a pattern of smaller shocks migrating from the site of the well to the location of the earthquake cluster.

To summarize: we know that raising fluid pressures at depth can cause earthquakes, as is the case in Oklahoma and also in Kern County, CA. We know there is circumstantial evidence for a connection between specific damaging earthquakes and oil extraction in southern California in the 1920s and 1930s. Wherever the next major earthquake strikes in southern or central California, there is a reasonable probability that an actively managed oilfield or waste water well will be in the vicinity.

Whoever is holding the liability cover for that operator may need some deep pockets.

EXPOSURE Magazine Snapshots: A New Way of Learning

This is a taster of an article published by RMS in the second edition of EXPOSURE magazine.  Click here and download your full copy now.


In EXPOSURE magazine, we delved into the algorithmic depths of machine learning to better understand the data potential that it offers the insurance industry. In the article, Peter Hahn, head of predictive analytics at Zurich North America, illustrated how pattern recognition sits at the core of current machine learning. How do machines learn? Peter compares it to how a child is taught to differentiate between similar animals: a machine would “learn” by repeatedly viewing numerous pictures of the animals, each clearly tagged.

Hahn comments, “Over time, the machine intuitively forms a pattern recognition that allows them to tell a tiger from, say, a leopard. You can’t predefine a set of rules to categorize every animal, but through pattern recognition you learn what the differences are.”

Hahn adds that pattern recognition is already a part of how underwriters assess a risk. “A decision-making process will obviously involve traditional, codified analytical processes, but it will also include sophisticated pattern recognition based on their experiences of similar companies operating in similar fields with similar constraints. They essentially know what this type of risk ‘looks like’ intuitively.”
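To make the tagged-examples idea concrete, here is a minimal sketch of supervised pattern recognition in Python. scikit-learn’s bundled digits dataset stands in for the tagged animal photos – an assumption made purely for brevity, not anything from the article:

```python
# A minimal sketch of supervised pattern recognition. scikit-learn's
# bundled digits dataset stands in for the tagged animal photos.
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

digits = load_digits()                        # labeled 8x8 grayscale images
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

# No rules are predefined for telling the classes apart; the model
# infers the distinguishing patterns from the tagged examples alone.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```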

The Potential of Machine Learning

EXPOSURE magazine asked Christos Mitas, vice president of model development at RMS, how he sees machine learning being used. Mitas opened the discussion saying, “We are now operating in a world where that data is expanding exponentially, and machine learning is one tool that will help us to harness that.”

Here are three areas where Mitas believes machine learning will make an impact:

Cyber Risk Modeling: Mitas adds, “Where machine learning can play an important role here is in helping us tackle the complexity of this risk. Being able to collect and digest more effectively the immense volumes of data which have been harvested from numerous online sources and datasets will yield a significant advantage.”

Image Processing: “With developments in machine learning, for example, we might be able to introduce new data sources into our processing capabilities and make it a faster and more automated data management process to access images in the aftermath of a disaster. Further, we might be able to apply machine learning algorithms to analyze building damage post event to support speedier loss assessment processes.”

Natural Language Processing: “Advances here could also help tremendously in claims processing and exposure management,” Mitas adds, “where you have to consume reams of reports, images and facts rather than structured data. That is where algorithms can really deliver a different scale of potential solutions.”
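As a toy illustration of the gap Mitas describes between unstructured reports and structured data, here is a minimal sketch in Python. Plain regular expressions stand in for a real natural language processing pipeline, and the claim note and field names are hypothetical:

```python
# A toy sketch of pulling structured fields out of a free-text claim note.
# Regular expressions stand in for a real NLP pipeline; the note and the
# field names are hypothetical.
import re

note = ("Insured reports roof damage at 123 Palm Ave, Naples FL after "
        "Hurricane Irma. Estimated repair cost $42,500. Water intrusion "
        "in two rooms.")

claim = {
    "peril": re.search(r"Hurricane \w+", note).group(),
    "damage": "roof" if "roof damage" in note.lower() else "unknown",
    "estimate_usd": float(
        re.search(r"\$([\d,]+)", note).group(1).replace(",", "")),
}
print(claim)  # {'peril': 'Hurricane Irma', 'damage': 'roof', 'estimate_usd': 42500.0}
```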

For the full article and more insight for the insurance industry, click here and download your full copy of EXPOSURE magazine now.

For more information on RMS(one)®, a big data and analytics platform built from the ground up for the insurance industry, and solutions such as Risk Modeler and Exposure Manager, please click here.

EXPOSURE Magazine Snapshots: Evolution of the Insurer DNA

This is a taster of an article published in the second edition of EXPOSURE magazine.  Click here and download your full copy now.

Many in (re)insurance recognize that the industry is at a tipping point. Rapid technological change, disruption through new, more efficient forms of capital, and an evolving risk landscape are challenging industry incumbents like never before. EXPOSURE magazine reported that inevitably the winners will be those who find ways to harmonize analytics, technology, industry innovation, and modeling.

“Disruptive innovation” is increasingly obvious in areas such as personal lines insurance, with disintermediation, the rise of aggregator websites and the Internet of Things (IoT).  In the commercial insurance and reinsurance space, disruptive technological change has been less obvious, but behind the scenes the industry is undergoing some fundamental changes.

The tipping point – the “Uber moment” – has yet to arrive in reinsurance, according to Michael Steel, global head of solutions at RMS. “The change we’re seeing in the industry is constant. We’re seeing disruption throughout the entire insurance journey. It’s not the case that the industry is suffering from a short-term correction and then the market will go back to the way it has done business previously. The industry is under huge competitive pressures and the change we’re seeing is permanent and it will be continuous over time.”

While it is impossible to predict exactly how the industry will evolve, it is evident that tomorrow’s leading (re)insurance companies will share certain attributes. These include a strong appetite to harness data and invest in new technology and analytics capabilities, the drive to differentiate and design new products and services, and the ability to collaborate. According to Eric Yau, general manager of software at RMS, the goal of an analytics-driven organization is to leverage the right technologies to bring data, workflow, and business analytics together to continuously drive more informed, timely, and collaborative decision making across the enterprise.

“New technologies play a key role, and while there are many choices with the rise of insurtech firms, history shows us that success is achieved only when the proper due diligence is done to really understand and assess how these technologies enable the longer-term business strategy, goals and objectives,” says Yau. Yau also believes that one of the most important ingredients of success is the ability to effectively blend the right team of technologists, data scientists, and domain experts who can work together to understand and deliver upon these key objectives.

Looking for Success in this New World

Which factors will help companies stand out and compete in the future?  EXPOSURE asked industry experts for their views on the attributes that winning companies will share:

The Race for Millennial Talent: The most successful companies will look to attract and retain the best talent, says Rupert Swallow, co-founder and CEO of Capsicum Re, with succession planning that puts a strong emphasis on bringing Millennials up through the ranks. “There is a huge difference between the way Millennials look at the workplace and live their lives, versus industry professionals born in the 1960s or 1970s — the two generations are completely different,” says Swallow. “Those guys [Millennials] would no sooner write a check to pay for something than fly to the moon.”

Collaboration is the Key: There are numerous examples of tie-ups between (re)insurance industry incumbents and tech firms to leverage technology – or insurtech – expertise and get closer to the original risk. One example of a strategic collaboration is the MGA Attune, set up last year by AIG, Hamilton Insurance Group, and affiliates of Two Sigma Investments. Through the partnership, AIG gained access to Two Sigma’s vast technology and data-science capabilities to grow its market share in the U.S. small to mid-sized commercial insurance space.

Blockchain:  Blockchain offers huge potential to reduce some of the significant administrative burdens in the industry, thinks Kurt Karl, chief economist at Swiss Re. “Blockchain for the reinsurance space is an efficiency tool. And if we all get more efficient, you are able to increase insurability because your prices come down, and you can have more affordable reinsurance and therefore more affordable insurance. So I think we all win if it’s a cost saving for the industry.”

“The challenge for the industry is to remain relevant to our customers,” says RMS’ Michael Steel. “Those that fail to adapt will get left behind. To succeed you’re going to need greater information about the underlying risk, the ability to package the risk in a different way, to select the appropriate risks, differentiate more, and construct better portfolios.”

For the full article and more insight for the insurance industry, click here and download your full copy of EXPOSURE magazine now.

Watch Video: Eric Yau – Managing Risk is an Interconnected Process

Eric Yau, general manager, software business unit at RMS, said those managing risk should keep in mind that risk selection is part of an overall process that affects capacity and portfolio strategy. Yau spoke with A.M. BestTV at the Exceedance 2017 conference.

For more information on RMS(one)®, a big data and analytics platform built from the ground up for the insurance industry, and solutions such as Risk Modeler and Exposure Manager, please click here.

EXPOSURE Magazine Snapshots: The Analytics Driven Organization

This is a taster of an article published in the second edition of EXPOSURE magazine.  Click here and download your full copy now.

Farhana Alarakhiya, vice president, products at RMS, writes… In my recent article in EXPOSURE magazine, I was interested in exploring how firms in the insurance sector can move towards building a more analytics-driven organization.  Being analytics-driven translates to being an agile business, and in a turbulent market landscape, building underwriting agility is becoming critical to business survival.

There is no doubt we have seen revolutionary technological advances and an explosion of new digital data sources, which have reinvented the core disciplines of insurers over the past 15 years. Many (re)insurers also see big data and analytics (BD&A) as a “silver bullet” that will provide competitive advantage and address their current market challenges.

Like other industries that continue to invest heavily in BD&A to secure their position and open a new chapter of growth, the insurance sector is ramping up investment in open BD&A platforms such as RMS(one)®, which is purpose-built for the insurance industry. But although there is a real buzz around BD&A, what may be lacking is a big data strategy specifically for evolving pricing, underwriting, and risk selection – areas which offer huge potential gains for firms.

With the opportunity for our industry to gain transformational agility in analytics now within reach, we need to be conscious of how to avoid DRIP – being data rich but information poor – with too much focus on data capture, management, and structures at the expense of creating usable insights that can be fed to the people at the point of impact. Nor is regulation the barrier to success: many other regulated business areas have transformed their business and gained agility through effective analytics.

Please read the full article in EXPOSURE magazine to discover the three main lessons insurers can learn from businesses that have their BD&A recipe just right, but here’s a short summary:

Lesson #1 – Delivering Analytics to the Point of Impact

Being reliant on back-office processes for analytics is common for insurers, but it wouldn’t work for a frontline healthcare worker, for example. Data analysts are rare in that sector because a healthcare worker has analytics designed around their role, supporting their delivery of care. By contrast, a portfolio manager in the insurance sector typically works in tandem with an analyst just to get relevant data, let alone insight, which compromises their ability to perform effectively.

Lesson #2 – Ensuring Usability

Recognizing the workflow of an analytics user, and giving due consideration to the veracity of the data provided in order to reduce uncertainty, is vital. Looking at our healthcare example, analytics tools used by doctors to diagnose a patient’s condition use standardized information – age, sex, weight, height, ethnicity, address – and the patient’s symptoms.

They are provided not with a defined prognosis but with a set of potential diagnoses, each accompanied by a probability score and its sources. Imagine this level of analytical capability provided in real time at the point of underwriting, where the underwriter not only has access to the right set of analytics but also has a clear understanding of the other options and underlying assumptions.
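To make the pattern concrete, here is a minimal sketch in Python of ranked, probability-scored predictions. scikit-learn’s toy iris dataset stands in for patient (or submission) attributes – an illustration of the output shape described above, not an RMS implementation:

```python
# A minimal sketch of ranked, probability-scored predictions - the
# "set of options with scores" pattern described above. The iris toy
# dataset stands in for patient or submission attributes.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

iris = load_iris()
clf = LogisticRegression(max_iter=1000).fit(iris.data, iris.target)

case = iris.data[:1]                      # one new case to assess
probs = clf.predict_proba(case)[0]        # a score per possible class
ranked = sorted(zip(iris.target_names, probs),
                key=lambda t: t[1], reverse=True)
for name, p in ranked:
    print(f"{name}: {p:.1%}")             # ranked options, not one verdict
```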

Lesson #3 – Integration into the Common Workflow

To achieve data nirvana, BD&A output needs to integrate naturally into daily business-as-usual operations. When analytics are embedded directly into the daily workflow, there is a far higher success rate of them being put to effective use. With customer service technology, all the required systems are integrated directly into the customer agents’ software for a holistic view of the customer. Platforms built and designed with open architecture allow legacy systems, or your specific intellectual-property-intensive processes, to be integrated, giving users access to analytics that deliver insights as part of the daily workflow for every risk they write.

This is a taster of an article published in the second edition of EXPOSURE magazine.  Click here and download your full copy now.

Watch Video: Farhana Alarakhiya – The Data Challenge Is Getting It to the Right People

Farhana Alarakhiya, vice president, products at RMS, said insurers are responding to the allure of big data, but must focus on turning voluminous data into meaningful insights. Alarakhiya spoke with A.M. BestTV at the Exceedance 2017 conference.

For more information on RMS(one)®, a big data and analytics platform built from the ground up for the insurance industry, and solutions such as Risk Modeler and Exposure Manager, please click here.

“Computers Do the Calculating to Allow People to Transform the World.”

The quote above is from Conrad Wolfram, the renowned British mathematician, well known as an advocate for the advancement of mathematics teaching.  He argues that teaching students how to calculate using computers is more effective and more liberating than teaching calculation by hand. In his talk at TEDGlobal 2010, he describes his four-step process to solve a math problem:

  1. Pose the right question
  2. Formulate the question mathematically
  3. Compute the answer
  4. Verify that the computation answered the question
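As a small worked illustration of the four steps in Python (the example question is ours, chosen for brevity; sympy does the step-three calculating):

```python
# A small worked example of Wolfram's four steps, using sympy.
# The question itself is ours, chosen for brevity.
import sympy as sp

# Step 1 - pose the right question:
#   "How long does a ball dropped from 20 m take to hit the ground?"
# Step 2 - formulate it mathematically: 20 - (1/2) * g * t^2 = 0
t = sp.symbols("t", positive=True)
g = sp.Rational(981, 100)                  # 9.81 m/s^2

# Step 3 - let the computer do the calculating:
t_ground = sp.solve(sp.Eq(20 - g * t**2 / 2, 0), t)[0]
print(f"t ≈ {float(t_ground):.2f} s")      # ≈ 2.02 s

# Step 4 - verify the computation answered the question:
assert abs(20 - float(g) * float(t_ground) ** 2 / 2) < 1e-9
```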

Currently, Wolfram believes 80 percent of math education focuses on step three – computation – and teaching people how to compute by hand.  Instead, he proposes “…we ought to use computers to do step three, so students can spend much more effort on learning how to do steps one, two and four – conceptualizing problems, applying them.”

The proper development and utilization of modern computer systems, including hardware and software advances, should enable Wolfram’s vision to come true, with users moving their allocated time away from calculations and process – step three issues – and toward conceptualizing problems and applying the solutions effectively. And through my recent hands-on experience, I am confident that Risk Modeler, powered by RMS(one)®, will truly allow risk analysts to change that time allocation. I shared my experience of Risk Modeler on the keynote stage at Exceedance, and I invite you to watch the video below.

[Video: Risk Modeler keynote at Exceedance]
In 1999, a Legend Was Born

Many of you reading this may not realize it, but RiskLink® celebrates its 18th birthday this September. RiskLink was born in 1999, and for some of you, RiskLink started its cat modeling career before you did. I can remember using RiskLink back then, and it is a testament to the quality of the product that it is still the predominant catastrophe modeling software. I’ve grown up with it – many of us have – and with that kind of longevity and familiarity, it is no wonder that few people even question the elongated process involved in completing an analysis using this bedrock of catastrophe management.

That process of getting to your analysis is a lot of work: file management, model profile management, financial perspective re-maps, system restrictions. Wolfram’s figure looks reasonable here too – up to 80 percent of your natural catastrophe modeling time can be spent on this process.

We’ll celebrate 18 successful years of RiskLink, but the market is shifting to embrace big data analytics – which makes the timing right for Risk Modeler. Risk Modeler is built specifically to work with large amounts of data, removing the procedural, tactical component of your work and moving it to an efficient and speedy system.

How Would You Use Your Process Time?

This reallocation of process allows you to spend more time using your experience and intuition to conceptualize, understand and guide your business more effectively.  You can start to ask and answer questions that anticipate the business’ needs.  You can spend more time proactively working on change management with your key stakeholders. You can work more directly with the underwriting teams to understand and differentiate risks more thoroughly.

Risk Modeler is an efficient interface between your insight and experience and the analytical power of cloud-based computing. It allows you to simply ask a question, and it delivers the answer.   Mr. Wolfram reminds us that, “…math is not equal to calculating. Math is a much broader subject than calculating…I think of calculating, in a sense, as the machinery of math. It’s the chore. It’s the thing you’d like to avoid if you can, like to get a machine to do.”

Modeling is more than process; process is the chore of risk modeling. I am excited that Risk Modeler is the system capable of completing that chore for you. You can now unleash your energy, creativity, insight, and experience on improving your company and your industry, and on helping make the world more resilient.

For more information on RMS(one), a big data and analytics platform built from the ground up for the insurance industry, and solutions such as Risk Modeler and Exposure Manager, please click here.

“Specialty Lines Are All About Opportunity”

When we started RMS almost 28 years ago, the Specialty and E&S markets were among the first to embrace the benefits of catastrophe modeling. In many ways, I learned about (re)insurance at the knee of the specialty markets, which gave me an appreciation of the sophisticated and myriad ways this market provides coverage and underwrites business. What struck me then, as it does today, is the entrepreneurial nature of the specialty lines. Underwriter-driven and close to their markets, they are constantly on their toes, identifying new opportunities to offer innovative coverage and to write profitable business.

Last week, during our annual client conference Exceedance 2017, I welcomed almost 900 participants across clients, partners, and our RMS experts and client teams.  During the three-day conference, we convened a Specialty Roundtable with a cross-section of our clients from the U.S. specialty lines market to discuss the priorities for this dynamic sector of the industry.

The discussion was lively and ran across several themes, from identifying new opportunities in today’s market, to the benefits of well-informed stakeholders, to competing on data in a market increasingly driven by agile, real-time analytics.

Here are some highlights from our discussions:

Baking A Bigger Pie

“Specialty lines are all about opportunity,” said one participant, and from the discussion it became clear that the protection gap isn’t just in emerging markets. Even in the U.S., penetration and coverage for earthquake and flood risk is limited relative to the underlying exposures. Another participant stressed the need to move beyond the status quo, stating, “It’s not about competing in a zero-sum game; we need to expand the market.” But although it was recognized that the current market has its challenges, one participant remarked that “…within every market there is opportunity.” We also discussed how new RMS models for earthquake and flood can help firms gain new insights to better identify and write quality business to expand the pie.

Educating the Market

Another imperative, one that came through loud and clear during the discussion, is the importance of a well-informed market.  Not just for the underwriters, but also upstream with the producers and buyers.   The group felt that there continues to be too much focus on using average annual loss (AAL) as the basis for placing individual accounts, with an insufficient understanding of the standard deviations, tail-correlations, and contributory metrics.  This is particularly the case for earthquake risk, which is the quintessential tail-risk peril.  With the April release of the updated RMS North America Earthquake Model, we’re giving clients a more complete view of the risk, with the ‘tails’ of the exceedance probability (EP) curve playing an even more important role than in the past.  We discussed steps RMS is taking to inform key stakeholders, and we will continue to do more to be proactive and educate the entire market.
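For readers newer to these metrics, here is a minimal sketch of how AAL and an EP curve fall out of simulated annual losses, and why two books with the same AAL can carry very different tail risk. The numbers are synthetic and purely illustrative:

```python
# A minimal sketch of how AAL and an exceedance probability (EP) curve
# are derived from simulated annual losses. The loss distribution here
# is synthetic and purely illustrative.
import numpy as np

rng = np.random.default_rng(42)
n_years = 100_000

# Synthetic tail-heavy annual losses ($M): most years zero, rare big events
event_years = rng.random(n_years) < 0.05          # 5% of years see a loss
losses = np.where(event_years, rng.lognormal(3.0, 1.5, n_years), 0.0)

aal = losses.mean()                               # average annual loss
print(f"AAL: ${aal:.1f}M")

# EP curve: the loss exceeded at a given annual probability (return period)
for rp in (100, 250, 500):
    ep_loss = np.quantile(losses, 1 - 1 / rp)
    print(f"1-in-{rp}-year loss: ${ep_loss:.0f}M")

# Two accounts can share an AAL yet differ wildly in the tail -
# which is why placing on AAL alone can misprice tail-risk perils.
```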

The Analytic Enterprise

Analytic agility was a constant theme throughout our discussion, with one participant remarking, “With analytics, you’re either on the bus, or off the bus.” It was agreed that there is no half-way measure in adopting analytics. All participants emphasized the central role of the underwriter in the risk-decision process, applying their experience and judgement, supported by analytics, to make sound decisions. However, there was much discussion that underwriting, portfolio management, and analytics need to be increasingly agile and more tightly coupled. Statements such as “I need real-time” were made, underscoring why up-to-date portfolio information is critical to proactively managing a book. The importance of dynamic insight was emphasized: “…underwriting works in lock-step with the portfolio; you can’t look at one without the other, particularly in this market.” And the need to empower underwriters with analytics will only grow, as “…you can never have enough data,” with the market now “more data-driven than ever.”


Having had hands-on access to the RMS(one) solutions in The Lab at Exceedance, several of the group were pleased to note the platform’s apparent benefits, from new exposure management and analytic applications to more flexible representations of contracts and financial structures. “We’re seeing a lot of contracts that are getting more complex, and RMS(one) using CDL (contract definition language) will help.”

It was motivating for me to hear the excitement for our new data and analytics platform, with one member saying, “after seeing RMS(one), I’m excited that I can be more innovative.”

Continuing to stay close to what is important for our clients and their markets is a strategic priority for RMS.  At a time of great change across the industry, agility is instrumental to mitigating risks and seizing new opportunities, and the specialty markets are at the forefront.