
NIGEL ALLEN
May 10, 2018
Getting Wildfire Under Control

The extreme conditions of 2017 demonstrated the need for much greater data resolution on wildfire in North America. The 2017 California wildfire season was record-breaking on virtually every front. Some 1.25 million acres were torched by over 9,000 wildfire events during the period, with October to December seeing some of the most devastating fires ever recorded in the region*. From an insurance perspective, according to the California Department of Insurance, as of January 31, 2018, insurers had received almost 45,000 claims relating to losses in the region of US$11.8 billion. These losses included damage or total loss to over 30,000 homes and 4,300 businesses. On a countrywide level, the total was over 66,000 wildfires that burned some 9.8 million acres across North America, according to the National Interagency Fire Center. This compares to 2016 when there were 65,575 wildfires and 5.4 million acres burned. Caught off Guard “2017 took us by surprise,” says Tania Schoennagel, research scientist at the University of Colorado, Boulder. “Unlike conditions now [March 2018], 2017 winter and early spring were moist with decent snowpack and no significant drought recorded.” Yet despite seemingly benign conditions, it rapidly became the third-largest wildfire year since 1960, she explains. “This was primarily due to rapid warming and drying in the late spring and summer of 2017, with parts of the West witnessing some of the driest and warmest periods on record during the summer and remarkably into the late fall. “Additionally, moist conditions in early spring promoted build-up of fine fuels which burn more easily when hot and dry,” continues Schoennagel. “This combination rapidly set up conditions conducive to burning that continued longer than usual, making for a big fire year.” While Southern California has experienced major wildfire activity in recent years, until 2017 Northern California had only experienced “minor-to-moderate” events, according to Mark Bove, research meteorologist, risk accumulation, Munich Reinsurance America, Inc. “In fact, the region had not seen a major, damaging fire outbreak since the Oakland Hills firestorm in 1991, a US$1.7 billion loss at the time,” he explains. “Since then, large damaging fires have repeatedly scorched parts of Southern California, and as a result much of the industry has focused on wildfire risk in that region due to the higher frequency and severity of recent events. “Although the frequency of large, damaging fires may be lower in Northern California than in the southern half of the state,” he adds, “the Wine Country fires vividly illustrated not only that extreme loss events are possible in both locales, but that loss magnitudes can be larger in Northern California. A US$11 billion wildfire loss in Napa and Sonoma counties may not have been on the radar screen for the insurance industry prior to 2017, but such losses are now.” Smoke on the Horizon Looking ahead, it seems increasingly likely that such events will grow in severity and frequency as climate-related conditions create drier, more fire-conducive environments in North America. “Since 1985, more than 50 percent of the increase in the area burned by wildfire in the forests of the Western U.S. has been attributed to anthropogenic climate change,” states Schoennagel.
“Further warming is expected, in the range of 2 to 4 degrees Fahrenheit in the next few decades, which will spark ever more wildfires, perhaps beyond the ability of many Western communities to cope.” “Climate change is causing California and the American Southwest to be warmer and drier, leading to an expansion of the fire season in the region,” says Bove. “In addition, warmer temperatures increase the rate of evapotranspiration in plants and evaporation of soil moisture. This means that drought conditions return to California faster today than in the past, increasing the fire risk.” “Even though there is data on thousands of historical fires … it is of insufficient quantity and resolution to reliably determine the frequency of fires” Mark Bove Munich Reinsurance America While he believes there is still a degree of uncertainty as to whether the frequency and severity of wildfires in North America has actually changed over the past few decades, there is no doubt that exposure levels are increasing and will continue to do so. “The risk of a wildfire impacting a densely populated area has increased dramatically,” states Bove. “Most of the increase in wildfire risk comes from socioeconomic factors, like the continued development of residential communities along the wildland-urban interface and the increasing value and quantity of both real estate and personal property.” Breaches in the Data Yet while the threat of wildfire is increasing, the ability to accurately quantify that increased exposure potential is limited by a lack of granular historical data, both on a countrywide basis and even in highly exposed fire regions such as California, to accurately determine the probability of an event occurring. “Even though there is data on thousands of historical fires over the past half-century,” says Bove, “it is of insufficient quantity and resolution to reliably determine the frequency of fires at all locations across the U.S. “This is particularly true in states and regions where wildfires are less common, but still holds true in high-risk states like California,” he continues. “This lack of data, as well as the fact that the wildfire risk can be dramatically different on the opposite ends of a city, postcode or even a single street, makes it difficult to determine risk-adequate rates.” According to Max Moritz, Cooperative Extension specialist in fire at the University of California, current approaches to fire mapping and modeling are also based too much on fire-specific data. “A lot of the risk data we have comes from a bottom-up view of the fire risk itself. Methodologies are usually based on the Rothermel Fire Spread equation, which looks at spread rates, flame length, heat release, et cetera. But often we’re ignoring critical data such as wind patterns, ignition loads, vulnerability characteristics, spatial relationships, as well as longer-term climate patterns, the length of the fire season and the emergence of fire-weather corridors.” Ground-level data is also lacking, he believes. “Without very localized data you’re not factoring in things like the unique landscape characteristics of particular areas that can make them less prone to fire risk even in high-risk areas.” Further, data on mitigation measures at the individual community and property level is in short supply. 
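Moritz's reference above to the Rothermel Fire Spread equation can be made concrete with a short sketch. The Python snippet below implements only the headline rate-of-spread formula; in practice each term is derived from a standardized fuel model and local fuel moisture, and the inputs shown here are illustrative placeholders rather than calibrated values.

```python
# Minimal sketch of the headline Rothermel (1972) surface fire spread equation.
# Real implementations derive each term from a fuel model; every input below
# is an illustrative placeholder, not calibrated data.

def rothermel_spread_rate(reaction_intensity,        # I_R, Btu/ft^2/min
                          propagating_flux_ratio,    # xi, dimensionless
                          wind_factor,               # phi_w, dimensionless
                          slope_factor,              # phi_s, dimensionless
                          bulk_density,              # rho_b, lb/ft^3
                          effective_heating_number,  # epsilon, dimensionless
                          heat_of_preignition):      # Q_ig, Btu/lb
    """R (ft/min) = I_R * xi * (1 + phi_w + phi_s) / (rho_b * epsilon * Q_ig)."""
    heat_source = reaction_intensity * propagating_flux_ratio * (1.0 + wind_factor + slope_factor)
    heat_sink = bulk_density * effective_heating_number * heat_of_preignition
    return heat_source / heat_sink

# Hypothetical inputs for a dry, grass-like fuel under moderate wind.
R = rothermel_spread_rate(
    reaction_intensity=6000.0,
    propagating_flux_ratio=0.04,
    wind_factor=5.0,
    slope_factor=0.5,
    bulk_density=0.03,
    effective_heating_number=0.9,
    heat_of_preignition=250.0,
)
print(f"Spread rate: {R:.1f} ft/min")
```

Even in this toy form the structure is visible: the wind and slope factors sit in the numerator, which is why wind-driven events can accelerate so dramatically, and why the wind, climate and landscape data Moritz highlights matter so much.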
“Currently, (re)insurers commonly receive data around the construction, occupancy and age of a given risk,” explains Bove, “information that is critical for the assessment of a wind or earthquake risk.” However, the information needed to properly assess wildfire risk is typically not captured, such as whether the roof covering or siding is combustible. Bove says it is important to know if soffits and vents are open-air or protected by a metal covering, for instance. “Information about a home’s upkeep and surrounding environment is critical as well,” he adds. At Ground Level While wildfire may not be as data-intensive as a peril such as flood, it is almost as demanding, especially on computational capacity. It requires simulating stochastic or scenario events all the way from ignition through to spread, creating realistic footprints that can capture what the risk is and the physical mechanisms that contribute to its spread into populated environments. The RMS® North America Wildfire HD Model capitalizes on this expanded computational capacity and improved data sets to bring probabilistic capabilities to bear on the peril for the first time across the entirety of the contiguous U.S. and Canada. Using a high-resolution simulation grid, the model provides a clear understanding of factors such as the vegetation levels, the density of buildings, the vulnerability of individual structures and the extent of defensible space. The model also utilizes weather data based on re-analysis of historical weather observations to create a distribution of conditions from which to simulate stochastic years. That means that for a given location, the model can generate a weather time series that includes wind speed and direction, temperature, moisture levels, et cetera. As wildfire risk is set to increase in frequency and severity due to a number of factors ranging from climate change to expansion of the wildland-urban interface caused by urban development in fire-prone areas, the industry now has to live with that changing reality and understand how it alters the risk landscape. On the Wind Embers have long been recognized as a key factor in fire spread, either advancing the main burn or igniting spot fires some distance from the originating source. Yet despite this, current wildfire models do not effectively factor in ember travel, according to Max Moritz, from the University of California. “Post-fire studies show that the vast majority of buildings in the U.S. burn from the inside out due to embers entering the property through exposed vents and other entry points,” he says. “However, most of the fire spread models available today struggle to precisely recreate the fire parameters and are ineffective at modeling ember travel.” During the Tubbs Fire, the most destructive wildfire event in California’s history, embers carried on extreme ‘Diablo’ winds sparked ignitions up to two kilometers from the flame front. The rapid transport of embers not only created a faster-moving fire, with Tubbs covering some 30 to 40 kilometers within hours of initial ignition, but also sparked devastating ignitions in areas believed to be at zero risk of fire, such as Coffey Park, Santa Rosa. This highly built-up area experienced an urban conflagration due to ember-fueled ignitions. “Embers can fly long distances and ignite fires far away from their source,” explains Markus Steuer, consultant, corporate underwriting at Munich Re.
“In the case of the Tubbs Fire they jumped over a freeway and ignited the fire in Coffey Park, where more than 1,000 homes were destroyed. This spot fire was not connected to the main fire. In risk models or hazard maps this has to be considered. Firebrands can fly over natural or man-made fire breaks and damage can occur at some distance away from the densely vegetated areas.” For the first time, the RMS North America Wildfire HD Model enables the explicit simulation of ember transport and accumulation, allowing users to detail the impact of embers beyond the fire perimeters. The simulation capabilities extend beyond the traditional fuel-based fire simulations, and enable users to capture the extent to which large accumulations of firebrands and embers can be lofted beyond the perimeters of the fire itself and spark ignitions in dense residential and commercial areas. As was shown in the Tubbs Fire, areas not previously considered at threat of wildfire were exposed by the ember transport. The introduction of ember simulation capability allows the industry to quantify the complete wildfire risk appropriately across North America wildfire portfolios.
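The ember-transport behavior described above lends itself to a simple Monte Carlo illustration. The sketch below is not the RMS simulation approach; it merely samples hypothetical ember travel distances under an assumed wind speed and counts how many land well beyond the fire perimeter and ignite spot fires, to show why ember lofting can expose areas thought to be at zero risk.

```python
# Illustrative Monte Carlo sketch of ember ("firebrand") transport: embers
# lofted from a fire perimeter travel downwind and may ignite spot fires in a
# built-up area some distance away. The lognormal travel distances, ignition
# probability and wind dependence are assumptions for illustration only.

import random

def simulate_spot_ignitions(n_embers, wind_speed_ms, ignition_prob, seed=42):
    """Return sampled travel distances (km) and the count of embers that both
    land more than 1 km from the perimeter and ignite a receptive fuel bed."""
    rng = random.Random(seed)
    distances_km = []
    long_range_ignitions = 0
    for _ in range(n_embers):
        # Stronger winds shift the travel-distance distribution outward.
        mu = 0.1 * wind_speed_ms          # location parameter of log-distance
        d = rng.lognormvariate(mu, 0.6)   # distance in hundreds of meters
        d_km = d / 10.0
        distances_km.append(d_km)
        if d_km > 1.0 and rng.random() < ignition_prob:
            long_range_ignitions += 1
    return distances_km, long_range_ignitions

# Hypothetical 'Diablo wind' scenario: 20,000 embers, 20 m/s winds.
dists, ignitions = simulate_spot_ignitions(n_embers=20_000, wind_speed_ms=20.0, ignition_prob=0.02)
print(f"Embers travelling > 2 km: {sum(d > 2.0 for d in dists)}")
print(f"Spot ignitions beyond 1 km of the perimeter: {ignitions}")
```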

Helen Yates
May 10, 2018
Where Tsunami Warnings Are Etched in Stone

EXPOSURE looks back at the 2011 Tohoku event and other significant events that have shaped scientific knowledge and understanding of earthquake risk incorporated into the RMS® Japan Earthquake and Tsunami HD Model Hundreds of ancient markers dot the coastline of Japan, some over 600 years old, as a reminder of the danger of tsunami. Today, a new project to construct a 12.5-meter-high seawall stretching nearly 400 kilometers along Japan’s northeast coast is another reminder. Japan is a highly seismically active country and was well prepared for earthquakes and tsunami ahead of the Tohoku Earthquake in 2011. It had strict building codes, protective tsunami barriers, early-warning systems and disaster-response plans. But it was the sheer magnitude, scale and devastation caused by the Tohoku Earthquake and Tsunami that made it stand out from the many thousands of earthquakes that had come before it in modern times. What had not been foreseen in government planning was that an earthquake of this magnitude could occur, nor that it could produce such a sizable tsunami. The Tohoku Earthquake was a magnitude 9.0 event — off the charts as far as the Japanese historical record for earthquakes was concerned. A violent change in the ocean bottom triggered an immense tsunami with waves of up to 40 meters that tore across the northeast coast of the main island of Honshu, traveling up to 10 kilometers inland in the Sendai area. The tsunami breached sea walls and claimed almost everything in its path, taking 16,000 lives (a further 2,000 remain missing, presumed dead) and causing economic losses of US$235 billion. However, while the historical record proved inadequate preparation for the Tohoku event, the geological record shows that events of that magnitude had occurred before records began, explains Mohsen Rahnama, chief risk modeling officer at RMS. “Since the Tohoku event, there’s been a shift … to moving further back in time using a more full consideration of the geological record” Mohsen Rahnama RMS “If you go back in the geological record to 869 in the Tohoku region, there is evidence for a potentially similarly scaled tsunami,” he explains. “Since the Tohoku event, there’s been a shift in the government assessments moving away from a focus on what happened historically to a more full consideration of the geological record.” The geological record, which includes tsunami deposits in coastal lakes and across the Sendai and Ishinomaki plains, shows there were large earthquakes and associated tsunami in A.D. 869, 1611 and 1896. The findings of this research point to the importance of having a fully probabilistic tsunami model at a very high resolution. Rahnama continues: “The Tohoku event really was the ‘perfect’ tsunami hitting the largest exposure concentration at risk to tsunami in Japan. The new RMS tsunami model for Japan includes tsunami events similar to and in a few cases larger than were observed in 2011. Because the exposure in the region is still being rebuilt, the model cannot produce tsunami events with this scale of loss in Tohoku at this time.” Incorporating Secondary Perils RMS has incorporated the lessons from the Tohoku Earthquake and other major earthquakes that have occurred within its model. There have been several large earthquakes around the world, and they all inform thinking about the largest events, particularly how the ground motions they produce are modeled. Crucially, it includes a fully probabilistic tsunami model that is integrated with the earthquake stochastic event set. 
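The stochastic event set mentioned above is the backbone of any probabilistic catastrophe model: each simulated event carries an annual rate and a modeled loss, and these are aggregated into an exceedance-probability curve. A minimal sketch follows, using an entirely hypothetical three-event table rather than real model output.

```python
# Minimal sketch of turning a stochastic event loss table into an occurrence
# exceedance-probability (EP) curve. The three-event table is hypothetical;
# a production model uses many thousands of simulated events.

import math

# (annual rate, modelled portfolio loss in US$ billions) per stochastic event
event_loss_table = [
    (0.0050, 35.0),   # M9-class subduction event with major tsunami
    (0.0200, 12.0),   # large crustal event, shaking-dominated loss
    (0.1000, 1.5),    # moderate regional event
]

def occurrence_exceedance_prob(threshold, elt):
    """P(at least one event in a year with loss >= threshold), assuming
    independent Poisson occurrence of each event."""
    total_rate = sum(rate for rate, loss in elt if loss >= threshold)
    return 1.0 - math.exp(-total_rate)

for threshold in (1.0, 10.0, 30.0):
    p = occurrence_exceedance_prob(threshold, event_loss_table)
    print(f"Loss >= US${threshold:>4.0f}bn : annual prob {p:.3%} (~1-in-{1/p:,.0f} years)")
```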
On understanding strong ground shaking, information across events is utilized. For example, petrochemical facilities around the world are built with relatively consistent construction practices. This means that examination of the damage experienced by these types of facilities in Chile and Japan can inform the understanding of the performance of these facilities in other parts of the world with similar seismic hazard. The Maule Earthquake in Chile in 2010, the Canterbury sequence of earthquakes in New Zealand in 2010 and 2011, and the more recent Kumamoto Earthquakes in Japan in 2016, have added considerably to the data sets. Most notably they have informed scientific understanding of the nature of secondary earthquake perils, including tsunami, fire following earthquake, landslides and liquefaction. The 2016 Kumamoto Earthquake sequence triggered extensive landsliding. The sequence included five events in the range of magnitude 5.5 to 7.0 and caused severe damage in Kumamoto and Oita Prefectures from ground shaking, landsliding, liquefaction and fire following earthquake. “Liquefaction is in the model as a secondary peril. RMS has redesigned and recalibrated the liquefaction model for Japan. The new model directly calculates damage due to vertical deformation due to liquefaction processes,” says Chesley Williams, senior director, product management at RMS. “While the 1964 Niigata Earthquake with its tipped apartment buildings showed that liquefaction damages can be severe in Japan, on a countrywide basis the earthquake risk is driven by the shaking, tsunami and fire following, followed by liquefaction and landslide. For individual exposures, the key driver of the earthquake risk is very site specific, highlighting the importance of high-resolution modeling in Japan.” The RMS model accounts for the clustering of large events on the Nankai Trough. This is an important advancement as an examination of the historical record shows that events on the Nankai Trough have either occurred as full rupturing events (e.g., 1707 Hoei Earthquake) or as pairs of events (e.g., 1944 and 1946 and two events in 1854). This is different from aftershocks, explains Williams. “Clustered events are events on different sources that would have happened in the long-term earthquake record, and the occurrence of one event impacts the timing of the other events. This is a subtle but important distinction. We can model event clustering on the Nankai Trough due to the comprehensive event record informed by both historical events and the geologic record.” The Tohoku event resulted in insurance losses of US$30 billion to US$40 billion, the costliest earthquake event for the insurance industry in history. While the news media focused on the extreme tsunami, the largest proportion of the insurance claims emanated from damage wrought by the strong ground shaking. Interestingly, likely due to cultural constraints, only a relatively low amount of post-event loss amplification was observed. “In general for very large catastrophes, claims costs can exceed the normal cost of settlement due to a unique set of economic, social and operational factors,” says Williams. “Materials and labor become more expensive and claims leakage can be more of an issue, so there are a number of factors that kick in that are captured by the RMS post-event loss amplification modeling. 
The Japan model explicitly models post-event loss amplification but limits the impacts to be consistent with the observations in recent events in Japan.” Supply chain disruption and contingent business interruption were significant sources of loss following the Tohoku event. This was exacerbated by the level seven meltdown at the Fukushima nuclear power plant that resulted in evacuations, exclusion zones and rolling blackouts. “We sent reconnaissance teams to Japan after the event to understand the characteristics of damage and to undertake case studies for business interruption,” says Williams. “We visited large industrial facilities and talked to them about their downtime, their material requirement and their access to energy sources to better understand what had impacted their ability to get back up and running.” Recent events have re-emphasized that there are significant differences in business interruption by occupancy. “For example, a semiconductor facility is likely going to have a longer downtime than a cement factory,” says Williams. “The recent events have highlighted the impacts on business interruption for certain occupancies by damage to supply sources. These contingent business interruptions are complex, so examination of the case studies investigated in Japan was instrumental for informing the model.” Rebuilding in the seven years since the Tohoku Tsunami struck has been an exercise in resilient infrastructure. With nearly half a million people left homeless, there has been intense rebuilding to restore services, industry and residential property. US$12 billion has been spent on seawalls alone, replacing the 4-meter breakwaters with 12.5-meter-high tsunami barriers. An endless convoy of trucks has been moving topsoil from the hills to the coastline in order to raise the land by over 10 meters in places. Most cities have decided to elevate by several meters, with a focus on rebuilding commercial premises in exposed areas. Some towns have forbidden the construction of homes in flat areas nearest the coasts and relocated residents to higher ground. Tokyo-Yokohama: The World’s Most Exposed Metropolis The Japanese metropolis of Tokyo-Yokohama has the world’s greatest GDP at risk from natural catastrophes. Home to 38 million residents, it has potential for significant economic losses from multiple perils, but particularly earthquakes. According to Swiss Re it is the riskiest metropolitan area in the world. A combination of strict building codes, land use plans and disaster preparedness has significantly reduced the city’s vulnerability in recent decades. Despite the devastation caused by the tsunami, very few casualties (around 100) were related to partial or complete building collapse resulting from ground shaking during the magnitude 9.0 Tohoku Earthquake.
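The effect of the Nankai Trough clustering that Williams describes can be illustrated with a toy simulation. The rate and pairing probability below are hypothetical, not the calibrated RMS clustering model; the point is simply that allowing one rupture to raise the chance of a second, nearby rupture produces far more "paired-event" years than treating the two as independent.

```python
# Toy contrast between independent occurrence and a simple paired-rupture
# (clustered) rule for great Nankai Trough earthquakes. Rates are hypothetical.

import random

def prob_two_events_close(n_years, clustered, seed=7):
    """Fraction of simulated years in which two great earthquakes occur close
    together, under independent occurrence vs. a simple clustering rule."""
    rng = random.Random(seed)
    rate = 1.0 / 120.0        # assumed annual rate of a first great rupture
    p_pair = 0.5              # if clustered: chance the twin segment follows soon after
    hits = 0
    for _ in range(n_years):
        if rng.random() >= rate:
            continue          # no initiating event this year
        if clustered:
            second = rng.random() < p_pair
        else:
            second = rng.random() < rate   # independent: same small chance again
        if second:
            hits += 1
    return hits / n_years

n = 2_000_000
print(f"Independent: {prob_two_events_close(n, clustered=False):.5%} of years see a pair")
print(f"Clustered  : {prob_two_events_close(n, clustered=True):.5%} of years see a pair")
```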

Helen Yates
September 04, 2017
A Burgeoning Opportunity

As traditional (re)insurers hunt for opportunity outside of property catastrophe classes, new probabilistic casualty catastrophe models are becoming available. At the same time, as catastrophe risks are becoming increasingly “manufactured” or human-made, so casualty classes have the potential to be the source of claims after a large “natural” catastrophe. Just as the growing sophistication of property catastrophe models has enabled industry innovation, there is growing excitement that new tools available to casualty (re)insurers could help to expand the market. By improved evaluation of casualty clash exposures, reinsurers will be better able to understand, price and manage their exposures, as well as design new products that cater to underserved areas. However, the casualty market must switch from pursuing a purely defensive strategy. “There is an ever-growing list of exclusions in liability insurance and interest in the product is declining with the proliferation of these exclusions,” explains Dr. Robert Reville, president and CEO of Praedicat, the world’s first liability catastrophe modeling company. “There is a real growth opportunity for the industry to deal with these exclusions and recognize where they can confidently write more business. “Industry practitioners look at what’s happened in property — where modeling has led to a lot of new product ideas, including capital market solutions, and a lot of innovation — and casualty insurers are hungry for that sort of innovation, for the same sort of transformation in liability that happened in property,” he adds. Perils — particularly emerging risks that underwriters have struggled to price, manage and understand — have typically been excluded from casualty products. This includes electromagnetic fields (EMFs), such as those emanating from broadcast antennas and cell phones. Cover for such exposures is restricted, particularly for the U.S. market, where it is often excluded entirely. Some carriers will not offer any cover at all if the client has even a remote exposure to EMF risks. Yet are they being over-apprehensive about the risk? The fear that leads to an over application of exclusions is very tangible. “The latency of the disease development process — or the way a product might be used, with more people becoming exposed over time — causes there to be a build-up of risk that may result in catastrophe,” Reville continues. “Insurers want to be relevant to insuring innovation in product, but they have to come to terms with the latency and the potential for a liability catastrophe that might emerge from it.” Unique Nature of Casualty Catastrophe It is a misconception that casualty is not a catastrophe class of business. Reville points out that the industry’s US$100 billion-plus loss relating to asbestos claims is arguably its biggest-ever catastrophe. Within the Lloyd’s market the overwhelming nature of APH (asbestos, pollution and health) liabilities contributed to the market’s downward spiral in the late 1980s, only brought under control through the formation of the run-off entity Equitas, now owned and managed by Warren Buffett’s Berkshire Hathaway. As the APH claims crisis demonstrated, casualty catastrophes differ from property catastrophes in that they are a “two-tailed loss.” There is the “tail loss” both have in common, which describes the high frequency, low probability characteristics — or high return period — of a major event. But in addition, casualty classes of business are “long-tail” in nature. 
This means that a policy written in 2017 may not experience a claim until 20 years later, providing an additional challenge from a modeling and reserving perspective. “Casualty insurers are hungry for that sort of innovation, for the same sort of transformation in liability that happened in property” Robert Reville Praedicat Another big difference between casualty clash and property catastrophe from a modeling perspective is that the past is not a good indication of future claims. “By the time asbestos litigation had really taken off, it was already a banned product in the U.S., so it was not as though asbestos claims were any use in trying to figure out where the next environmental disaster or next product liability was going to be,” says Reville. “So, we needed a forward-looking approach to identify where there could be new sources of litigation.” With the world becoming both more interconnected and more litigious, there is every expectation that future casualty catastrophe losses could be much greater and impact multiple classes of business. “The reality is there’s serial aggregation and systemic risk within casualty business, and our answer to that has generally been that it’s too difficult to quantify,” says Nancy Bewlay, chief underwriting officer, global casualty, at XL Catlin. “But the world is changing. We now have technology advances and data collection capabilities we never had before, and public information that can be used in the underwriting process. “Take the Takata airbag recall,” she continues. “In 2016, they had to recall 100 million airbags worldwide. It affected all the major motor manufacturers, who then faced the accumulation potential not only of third-party liability claims, but also product liability and product recall. Everything starts to accumulate and combine within that one industry, and when you look at the economic footprint of that throughout the supply chain there’s a massive potential for a casualty catastrophe when you see how everything is interconnected.” RMS chief research officer Robert Muir-Wood explains: “Another area where we can expect an expansion of modeling applications concerns casualty lines picking up losses from more conventional property catastrophes. This could occur when the cause of a catastrophe can be argued to have ‘non-natural’ origins, and particularly where there are secondary ‘cascade’ consequences of a catastrophe — such as a dam failing after a big earthquake or for claims on ‘professional lines’ coverages of builders and architects — once it is clear that standard property insurance lines will not compensate for all the building damage.” “This could be prevalent in regions with low property catastrophe insurance penetration, such as in California, where just one in ten homeowners has earthquake cover. In the largest catastrophes, we could expect claims to be made against a wide range of casualty lines. The big innovation around property catastrophe in particular was to employ high-resolution GIS [geographic information systems] data to identify the location of all the risk. We need to apply similar location data to casualty coverages, so that we can estimate the combined consequences of a property/casualty clash catastrophe.” One active instance, cited by Muir-Wood, of this shift from property to casualty coverages concerns earthquakes in Oklahoma. “There are large amounts of wastewater left over from fracking, and the cheapest way of disposing of it is to pump it down deep boreholes.
But this process has been triggering earthquakes, and these earthquakes have started getting quite big — the largest so far in September 2016 had a magnitude of M5.8. “At present the damage to buildings caused by these earthquakes is being picked up by property insurers,” he continues. “But what you will see over time are lawsuits to try and pass the costs back to the operators of the wells themselves. Working with Praedicat, RMS has done some modeling work on how these operators can assess the risk cost of adding a new disposal well. Clearly the larger the earthquake, the less likely it is to occur. However, the costs add up: our modeling shows that an earthquake bigger than M6 right under Oklahoma City could cause more than US$10 billion of damage.” Muir-Wood adds: “The challenge is that casualty insurance tends to cover many potential sources of liability in the contract and the operators of the wells, and we believe their insurers are not currently identifying this particular — and potentially catastrophic —source of future claims. There’s the potential for a really big loss that would eventually fall onto the liability writers of these deep wells … and they are not currently pricing for this risk, or managing their portfolios of casualty lines.” A Modeled Class of Business According to Reville, the explosion of data and development of data science tools have been key to the development of casualty catastrophe modeling. The opportunity to develop probabilistic modeling for casualty classes of business was born in the mid-2000s when Reville was senior economist at the RAND Corporation. At that time, RAND was using data from the RMS® Probabilistic Terrorism Model to help inform the U.S. Congress in its decision on the renewal of the Terrorism Risk Insurance Act (TRIA). Separately, it had written a paper on the scope and scale of asbestos litigation and its potential future course. “As we were working on these two things it occurred to us that here was this US$100 billion loss — this asbestos problem — and adjacently within property catastrophe insurance there was this developed form of analytics that was helping insurers solve a similar problem. So, we decided to work together to try and figure out if there was a way of solving the problem on the liability side as well,” adds Reville. Eventually Praedicat was spun out of the initial project as its own brand, launching its first probabilistic liability catastrophe model in summer 2016. “The industry has evolved a lot over the past five years, in part driven by Solvency II and heightened interest from the regulators and rating agencies,” says Reville. “There is a greater level of concern around the issue, and the ability to apply technologies to understand risk in new ways has evolved a lot.” There are obvious benefits to (re)insurers from a pricing and exposure management perspective. “The opportunity is changing the way we underwrite,” says Bewlay. “Historically, we underwrote by exclusion with a view to limiting our maximum loss potential. We couldn’t get a clear understanding of our portfolio because we weren’t able to. We didn’t have enough meaningful, statistical and credible data.” “We feel they are not being proactive enough because … there’s the potential for a really big loss that would fall onto the liability writers of these deep wells” Robert Muir-Wood RMS Then there are the exciting opportunities for growth in a market where there is intense competition and downward pressure on rates. 
“Now you can take a view on the ‘what-if’ scenario and ask: how much loss can I handle and what’s the probability of that happening?” she continues. “So, you can take on managed risk. Through the modeling you can better understand your industry classes and what could happen within your portfolio, and can be slightly more opportunistic in areas where previously you may have been extremely cautious.” Not only does this expand the potential range of casualty insurance and reinsurance products, it should allow the industry to better support developments in burgeoning industries. “Cyber is a classic example,” says Bewlay. “If you can start to model the effects of a cyber loss you might decide you’re OK providing cyber in personal lines for individual homeowners in addition to providing cyber in a traditional business or technology environment. “You would start to model all three of these scenarios and what your potential market share would be to a particular event, and how that would impact your portfolio,” she continues. “If you can answer those questions utilizing your classic underwriting and actuarial techniques, a bit of predictive modeling in there — this is the blend of art and science — you can start taking opportunities that possibly you couldn’t before.”
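The "what-if" framing Bewlay describes reduces to a simple calculation once liability scenarios carry probabilities and modeled losses. The sketch below uses three invented scenarios and assumes they occur independently of one another, which real casualty clash modeling would not; it is only meant to show the shape of the question "how much loss can I handle, and what is the probability of that happening?"

```python
# Illustrative "what-if" aggregation over a set of modelled liability
# catastrophe scenarios. Scenario names, probabilities and losses are invented.

import itertools

# (scenario name, annual probability, loss to the casualty portfolio, US$m)
scenarios = [
    ("mass product recall across one industry", 0.020, 400.0),
    ("latent bodily-injury litigation wave",    0.005, 900.0),
    ("systemic cyber liability event",          0.010, 650.0),
]

def prob_annual_loss_exceeds(threshold, scenario_set):
    """Exact probability that combined scenario losses in a year exceed the
    threshold, assuming the scenarios occur independently of one another."""
    prob = 0.0
    for outcome in itertools.product([0, 1], repeat=len(scenario_set)):
        p, loss = 1.0, 0.0
        for happened, (_, p_i, loss_i) in zip(outcome, scenario_set):
            p *= p_i if happened else (1.0 - p_i)
            loss += loss_i if happened else 0.0
        if loss > threshold:
            prob += p
    return prob

for retention in (300.0, 800.0, 1200.0):
    p = prob_annual_loss_exceeds(retention, scenarios)
    print(f"P(annual casualty cat loss > US${retention:,.0f}m) = {p:.3%}")
```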

NIGEL ALLEN
September 04, 2017
Breaching the Flood Insurance Barrier

With many short-term reauthorizations of the National Flood Insurance Program, EXPOSURE considers how the private insurance market can bolster its presence in the U.S. flood arena and overcome some of the challenges it faces. According to Federal Emergency Management Agency (FEMA), as of June 30, 2017, the National Flood Insurance Program (NFIP) had around five million policies in force, representing a total in-force written premium exceeding US$3.5 billion and an overall exposure of about US$1.25 trillion. Florida alone accounts for over a third of those policies, with over 1.7 million in force in the state, representing premiums of just under US$1 billion. However, with the RMS Exposure Source Database estimating approximately 85 million residential properties alone in the U.S., the NFIP only encompasses a small fraction of the overall number of properties exposed to flood, considering floods can occur throughout the country. Factors limiting the reach of the program have been well documented: the restrictive scope of NFIP policies, the fact that mandatory coverage applies only to special flood hazard plains, the challenges involved in securing elevation certificates, the cost and resource demands of conducting on-site inspections, the poor claims performance of the NFIP, and perhaps most significant the refusal by many property owners to recognize the threat posed by flooding. At the time of writing, the NFIP is once again being put to the test as Hurricane Harvey generates catastrophic floods across Texas. As the affected regions battle against these unprecedented conditions, it is highly likely that the resulting major losses will add further impetus to the push for a more substantive private flood insurance market. The Private Market Potential While the private insurance sector shoulders some of the flood coverage, it is a drop in the ocean, with RMS estimating the number of private flood policies to be around 200,000. According to Dan Alpay, line underwriter for flood and household at Hiscox London Market, private insurers represent around US$300 to US$400 million of premium — although he adds that much of this is in “big- ticket policies” where flood has been included as part of an all-risks policy. “In terms of stand-alone flood policies,” he says, “the private market probably only represents about US$100 million in premiums — much of which has been generated in the last few years, with the opening up of the flood market following the introduction of the Biggert-Waters Flood Insurance Reform Act of 2012 and the Homeowner Flood Insurance Affordability Act of 2014.” But it is clear therefore that the U.S. flood market represents one of the largest untapped insurance opportunities in the developed world, with trillions of dollars of property value at risk across the country. “It is extremely rare to have such a huge potential market like this,” says Alpay, “and we are not talking about a risk that the market does not understand. It is U.S. catastrophe business, which is a sector that the private market has extensive experience in. And while most insurers have not provided specific cover for U.S. flood before, they have been providing flood policies in many other countries for many years, so have a clear understanding of the peril characteristics. And I would also say that much of the experience gained on the U.S. wind side is transferable to the flood sector.” Yet while the potential may be colossal, the barriers to entry are also significant. 
First and foremost, there is the challenge of going head-to-head with the NFIP itself. While there is concerted effort on the part of the U.S. government to facilitate a greater private insurer presence in the flood market as part of its reauthorization, the program has presided over the sector for almost 50 years and competing for those policies will be no easy task. “The main problem is changing consumer behavior,” believes Alpay. “How do we get consumers who have been buying policies through the NFIP since 1968 to appreciate the value of a private market product and trust that it will pay out in the event of a loss? While you may be able to offer a product that on paper is much more comprehensive and provides a better deal for the insured, many will still view it as risky given their inherent trust in the government.” For many companies, the aim is not to compete with the program, but rather to source opportunities beyond the flood zones, accessing the potential that exists outside of the mandatory purchase requirements. But to do this, property owners who are currently not located in these zones need to understand that they are actually in an at-risk area and need to consider purchasing flood cover. This can be particularly challenging in locations where homeowners have never experienced a damaging flood event. Another market opportunity lies in providing coverage for large industrial facilities and high-value commercial properties, according to Pete Dailey, vice president of product management at RMS. “Many businesses already purchase NFIP policies,” he explains, “in fact those with federally insured mortgages and locations in high-risk flood zones are required to do so. “However,” he continues, “most businesses with low-to-moderate flood risk are unaware that their business policy excludes flood damage to the building, its contents and losses due to business interruption. Even those with NFIP coverage have a US$500,000 limit and could benefit from an excess policy. Insurers eager to expand their books by offering new product options to the commercial lines will facilitate further expansion of the private market.” Assessing the Flood Level But to be able to effectively target this market, insurers must first be able to ascertain what the flood exposure levels really are. The current FEMA flood mapping database spans 20,000 individual plains. However, much of this data is out of date, reflecting limited resources, which, coupled with a lack of consistency in how areas have been mapped using different contractors, means their risk assessment value is severely limited. While a proposal to use private flood mapping studies instead of FEMA maps is being considered, the basic process of maintaining flood plain data is an immense problem given the scale. With the U.S. exposed to flood in virtually every location, this makes it a high-resolution peril, meaning there is a long list of attributes and inter-dependent dynamic factors influencing what flood risk in a particular area might be. With 100 years of scientific research, the physics of flooding itself is well understood, the issue has been generating the data and creating the model at sufficient resolution to encompass all of the relevant factors from an insurance perspective. In fact, to manage the scope of the data required to release the RMS U.S. 
Flood Hazard Maps for a small number of return periods required the firm to build a supercomputer, capitalizing on immense cloud-based technology to store and manage the colossal streams of information effectively. With such data now available, insurers are in a much better position to generate functional underwriting maps – FEMA maps were never drawn up for underwriting purposes. The new hazard maps provide actual gradient and depth of flooding data, to get away from the ‘in’ or ‘out’ discussion, allowing insurers to provide detail, such as if a property is exposed to two to three feet of flooding at a 1-in-100 return period. No Clear Picture Another hindrance to establishing a clear flood picture is the lack of a systematic database of the country’s flood defense network. RMS estimates that the total network encompasses some 100,000 miles of flood defenses; however, FEMA’s levee network accounts for only approximately 10 percent of this. Without the ability to model existing flood defenses accurately, higher-frequency, lower-risk events are overestimated. To help counter this lack of defense data, RMS developed the capability within its U.S. Inland Flood HD Model to identify the likelihood of such measures being present and, in turn, assess the potential protection levels. Data shortage is also limiting the potential product spectrum. If an insurer is not able to demonstrate to a ratings agency or regulator what the relationship between different sources of flood risk (such as storm surge and river flooding) is for a given portfolio, then it could reduce the range of flood products it can offer. Insurers also need the tools and the data to differentiate the more complicated financial relationships, exclusions and coverage options relative to the nature of the events that could occur. Launching into the Sector In May 2016, Hiscox London Market launched its FloodPlus product into the U.S. homeowners sector, following the deregulation of the market. Distributed through wholesale brokers in the U.S., the policy is designed to offer higher limits and a wider scope than the NFIP. “We initially based our product on the NFIP policy with slightly greater coverage,” Alpay explains, “but we soon realized that to firmly establish ourselves in the market we had to deliver a policy of sufficient value to encourage consumers to shift from the NFIP to the private market. “As we were building the product and setting the limits,” he continues, “we also looked at how to price it effectively given the lack of granular flood information. We sourced a lot of data from external vendors in addition to proprietary modeling which we developed ourselves, which enabled us to build our own pricing system. What that enabled us to do was to reduce the process time involved in buying and activating a policy from up to 30 days under the NFIP system to a matter of minutes under FloodPlus.” This sort of competitive edge will help incentivize NFIP policyholders to make a switch. “We also conducted extensive market research through our coverholders,” he adds, “speaking to agents operating within the NFIP system to establish what worked and what didn’t, as well as how claims were handled.” “We soon realized that to firmly establish ourselves … we had to deliver a policy of sufficient value to encourage consumers to shift from the NFIP to the private market” Dan Alpay Hiscox London Market Since launch, the product has been amended on three occasions in response to customer demand.
“For example, initially the product offered actual cash value on contents in line with the NFIP product,” he adds. “However, after some agent feedback, we got comfortable with the idea of providing replacement cost settlement, and we were able to introduce this as an additional option which has proved successful.” To date, coverholder demand for the product has outstripped supply, he says. “For the process to work efficiently, we have to integrate the FloodPlus system into the coverholder’s document issuance system. So, given the IT integration process involved plus the education regarding the benefits of the product, it can’t be introduced too quickly if it is to be done properly.” Nevertheless, growing recognition of the risk and the need for coverage is encouraging to those seeking entry into this emerging market. A Market in the Making The development of a private U.S. flood insurance market is still in its infancy, but the wave of momentum is building. Lack of relevant data, particularly in relation to loss history, is certainly dampening the private sector’s ability to gain market traction. However, as more data becomes available, modeling capabilities improve, and insurer products gain consumer trust by demonstrating their value in the midst of a flood event, the market’s potential will really begin to flow. “Most private insurers,” concludes Alpay, “are looking at the U.S. flood market as a great opportunity to innovate, to deliver better products than those currently available, and ultimately to give the average consumer more coverage options than they have today, creating an environment better for everyone involved.” The same can be said for the commercial and industrial lines of business where stakeholders are actively searching for cost savings and improved risk management. Climate Complications As the private flood market emerges, so too does the debate over how flood risk will adjust to a changing climate. “The consensus today among climate scientists is that climate change is real and that global temperatures are indeed on the rise,” says Pete Dailey, vice president of product management at RMS. “Since warmer air holds more moisture, the natural conclusion is that flood events will become more common and more severe. Unfortunately, precipitation is not expected to increase uniformly in time or space, making it difficult to predict where flood risk would change in a dramatic way.” Further, there are competing factors that make the picture uncertain. “For example,” he explains, “a warmer environment can lead to reduced winter snowpack, and, in turn, reduced springtime melting. Thus, in regions susceptible to springtime flooding, holding all else constant, warming could potentially lead to reduced flood losses.” For insurers, these complications can make risk selection and portfolio management more complex. “While the financial implications of climate change are uncertain,” he concludes, “insurers and catastrophe modelers will surely benefit from climate change research and byproducts like better flood hazard data, higher resolution modeling and improved analytics being developed by the climate science community.”
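The hazard maps described earlier express flood risk as a depth and gradient at a given return period rather than a binary "in or out" flag. A minimal sketch of how an underwriter might turn that kind of output into a loss view follows; the property, depths and depth-damage curve are hypothetical placeholders, not RMS U.S. Inland Flood HD Model output.

```python
# Minimal sketch: flood depth by return period at a location, run through a
# hypothetical residential depth-damage curve to give an indicative loss view.

# Modelled flood depth (feet) at an illustrative property, by return period (years)
depth_by_return_period = {10: 0.0, 25: 0.5, 50: 1.5, 100: 2.5, 250: 4.0, 500: 5.5}

# Hypothetical depth-damage curve: fraction of building value damaged at a depth
depth_damage_points = [(0.0, 0.00), (1.0, 0.10), (2.0, 0.22), (4.0, 0.40), (6.0, 0.55)]

def damage_ratio(depth_ft):
    """Linear interpolation on the depth-damage curve."""
    pts = depth_damage_points
    if depth_ft <= pts[0][0]:
        return pts[0][1]
    for (d0, r0), (d1, r1) in zip(pts, pts[1:]):
        if depth_ft <= d1:
            return r0 + (r1 - r0) * (depth_ft - d0) / (d1 - d0)
    return pts[-1][1]

building_value = 350_000  # US$
for rp, depth in sorted(depth_by_return_period.items()):
    loss = damage_ratio(depth) * building_value
    print(f"1-in-{rp:<3d}: depth {depth:.1f} ft -> est. ground-up loss US${loss:,.0f}")
```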

EDITOR
September 04, 2017
What One Thing Would Help Reinsurers Get Closer to the Original Risk?

In each edition of EXPOSURE we ask three experts for their opinion on how they would tackle a major risk and insurance challenge. This issue, we consider how (re)insurers can gain more insight into the original risk, and in so doing, remove frictional costs. As our experts Kieran Angelini-Hurll, Will Curran and Luzi Hitz note, more insight does not necessarily mean disintermediation.
Kieran Angelini-Hurll CEO, Reinsurance at Ed
The reduction of frictional costs would certainly help. At present, there are too many frictional costs between the reinsurer and the original risk. The limited amount of data available to reinsurers on the original risks is also preventing them from getting a clear picture. A combination of new technology and a new approach from brokers can change this. First, the technology. A trading platform which reduces frictional costs by driving down overheads will bridge the gap between reinsurance capital and the insured. However, this platform can only work if it provides the data which will allow reinsurers to better understand this risk. Arguably, the development of such a platform could be achieved by any broker with the requisite size, relevance and understanding of reinsurers’ appetites. Brokers reluctant to share data will watch their business migrate to more disruptive players. However, for most, their business models do not allow for it. Their size stymies innovation and they have become dependent on income derived from the sale of data, market-derived income and ‘facilitization’. These costs prevent reinsurers from getting closer to the risk. They are also unsustainable, a fact that technology will prove. A trading platform which has the potential to reduce costs for all parties, streamline the throughput of data, and make this information readily and freely available could profoundly alter the market. Brokers that continue to add costs and maintain their reluctance to share data will be forced to evolve or watch their business migrate to leaner, more disruptive players. Brokers that are committed to marrying reinsurance capital with risk, regardless of its location, and that deploy technology can help overcome the barriers put in place by current market practices and bring reinsurers closer to the original risk.
Will Curran Head of Reinsurance, Tokio Marine Kiln, London
More and more, our customers are looking to us as their risk partners, with the expectation that we will offer far more than a transactional risk transfer product. They are looking for pre-loss services, risk advisory and engineering services, modeling and analytical capabilities and access to our network of external experts, in addition to more traditional risk transfer. As a result of offering these capabilities, we are getting closer to the original risk, through our discussions with cedants and brokers, and our specialist approach to underwriting. Traditional carriers are able to differentiate by going beyond vanilla risk transfer. The long-term success of reinsurers needs to be built on being more than purely a transactional player. To a large extent, this has been driven by the influx of non-traditional capital into the sector. Whereas these alternative property catastrophe reinsurance providers are offering a purely transactional product, often using parametric or industry-loss triggers to simplify the claims process in their favor, traditional carriers are able to differentiate by going beyond vanilla risk transfer.
Demand for risk advice and pre-loss services is particularly high within specialist and emerging risk classes of business. Cyber is a perfect example of this, where we work closely with our corporate and insurance clients to help them improve their resilience to cyber-attack and to plan their response in the event of a breach. Going forward, successful reinsurance companies will be those that invest time and resources in becoming true risk partners. In an interconnected and increasingly complex world, where there is a growing list of underinsured exposures, risk financing is just one among many service offerings in the toolkit of specialist reinsurers.
Luzi Hitz CEO, PERILS AG
The nature of reinsurance means the reinsurer is inherently further away from the underlying risk than most other links in the value chain. The risk is introduced by the original insured, and is transferred into the primary market before reaching the reinsurer – a process normally facilitated by reinsurance intermediaries. I am wary of efforts to shortcut or circumvent this established multi-link chain to reduce the distance between reinsurer and the underlying risk. The reinsurer in many cases lacks the granular insight found earlier in the process that is required to access the risk directly. What we need is a more cooperative relationship between reinsurer and insurer in developing risk transfer products. Too often the reinsurers act purely as capital providers in the chain, removed from the source risk, viewing it almost as an abstract concept within the overall portfolio. The focus should be on how to bring all parties to the risk closer together. By collaborating on the development of insurance products, not only will it help create greater alignment of interest based on a better understanding of the risk relationship, but also prove beneficial to the entire insurance food chain. It will make the process more efficient and cost effective, and hopefully see the risk owners securing the protection they want. In addition, it is much more likely to stimulate product innovation and growth, which is badly needed in many mature markets. The focus in my opinion should not be on how to bring the reinsurer closer to the risk, but rather on how to bring all parties to the risk closer together. What I am saying is not new, and it is certainly something which many larger reinsurers have been striving to achieve for years. And while there is evidence of this more collaborative approach between insurers and reinsurers gaining traction, there is still a very long way to go.

NIGEL ALLEN
September 04, 2017
The Lay of The Land

China has made strong progress in developing agricultural insurance and aims to continually improve. As farming practices evolve, and new capabilities and processes enhance productivity, how can agricultural insurance in China keep pace with trending market needs? EXPOSURE investigates. The People’s Republic of China is a country of immense scale. Covering some 9.6 million square kilometers (3.7 million square miles), just two percent smaller than the U.S., the region spans five distinct climate areas with a diverse topography extending from the lowlands to the east and south to the immense heights of the Tibetan Plateau. Arable land accounts for approximately 135 million hectares (521,238 square miles), close to four times the size of Germany, feeding a population of 1.3 billion people. In total, over 1,200 crop varieties are cultivated, ranging from rice and corn to sugar cane and goji berries. In terms of livestock, some 20 species covering over 740 breeds are found across China; while it hosts over 20,000 aquatic breeds, including 3,800 types of fish.1 A Productive Approach With per capita land area less than half of the global average, maintaining agricultural output is a central function of the Chinese government, and agricultural strategy has formed the primary focus of the country’s “No. 1 Document” for the last 14 years. To encourage greater efficiency, the central government has sought to modernize methods and promote large-scale production, including the creation of more agriculture cooperatives, including a doubling of agricultural machinery cooperatives encouraging mechanization over the last four years.2 According to the Ministry of Agriculture, by the end of May 2015 there were 1.393 million registered farming cooperatives, up 22.4 percent from 2014 — a year that saw the government increase its funding for these specialized entities by 7.5 percent to ¥2 billion (US$0.3 billion). Changes in land allocation are also dramatically altering the landscape. In April 2017, the minister of agriculture, Han Changfu, announced plans to assign agricultural production areas to two key functions over the next three years, with 900 million mu (60 million hectares) for primary grain products, such as rice and wheat, and 238 million mu (16 million hectares) for five other key products, including cotton, rapeseed and natural rubber. Productivity levels are also being boosted by enhanced farming techniques and higher-yield crops, with new varieties of crop including high-yield wheat and “super rice” increasing annual tonnage. Food grain production has risen from 446 million tons in 1990 to 621 million tons in 2015.3 The year 2016 saw a 0.8 percent decline — the first in 12 years — but structural changes were a contributory factor. Insurance Penetration China is one of the most exposed regions in the world to natural catastrophes. Historically, China has repeatedly experienced droughts with different levels of spatial extent of damage to crops, including severe widespread droughts in 1965, 2000 and 2007. Frequent flooding also occurs, but with development of flood mitigation schemes, flooding of crop areas is on a downward trend. China has, however, borne the brunt of one the costliest natural catastrophes to date in 2017, according to Aon Benfield,4 with July floods along the Yangtze River basin causing economic losses topping US$6.4 billion. 
The 2016 summer floods caused some US$28 billion in losses along the river;5 while flooding in northeastern China caused a further US$4.7 billion in damage. Add drought losses of US$6 billion and the annual weather-related losses stood at US$38.7 billion.6 However, insured losses are a fraction of that figure, with only US$1.1 billion of those losses insured. “Often companies not only do not know where their exposures are, but also what the specific policy requirements for that particular region are in relation to terms and conditions” Laurent Marescot RMS The region represents the world’s second largest agricultural insurance market, which has grown from a premium volume of US$100 million in 2006 to more than US$6 billion in 2016. However, government subsidies — at both central and local level — underpin the majority of the market. In 2014, the premium subsidy level ranged between 65 percent and 80 percent depending on the region and the type of insurance. Most of the insured are small acreage farms, for which crop insurance is based on a named peril but includes multiple peril cover (drought, flood, extreme winds and hail, freeze and typhoon). Loss assessment is generally performed by surveyors from the government, insurers and an individual who represents farmers within a village. Subsidized insurance is limited to specific crop varieties and breeds and primarily covers only direct material costs, which significantly lowers its appeal to the farming community. One negative impact of current multi-peril crop insurance is the cost of operations, which reduces the impact of subsidies. “Currently, the penetration of crop insurance in terms of the insured area is at about 70 percent,” says Mael He, head of agriculture, China, at Swiss Re. “However, the coverage is limited and the sum insured is low. The penetration is only 0.66 percent in terms of premium to agricultural GDP. As further implementation of land transfer in different provinces and changes in supply chain policy take place, livestock, crop yield and revenue insurance will be further developed.” As He points out, changing farming practices warrant new types of insurance. “For the cooperatives, their insurance needs are very different compared to those of small household farmers. Considering their main income is from farm production, they need insurance cover on yield or event-price-related agricultural insurance products, instead of cover for just production costs in all perils.” At Ground Level Given low penetration levels and limited coverage, China’s agricultural market is clearly primed for growth. However, a major hindering factor is access to relevant data to inform meaningful insurance decisions. For many insurers, the time series of insurance claims is short, as government-subsidized agriculture insurance only started in 2007, according to Laurent Marescot, senior director, market and product specialists at RMS. “This is a very limited data set upon which to forecast potential losses,” says Marescot. “Given current climate developments and changing weather patterns, it is highly unlikely that during that period we have experienced the most devastating events that we are likely to see. It is hard to get any real understanding of a potential 1-in-100 loss from such data.” Major changes in agricultural practices also limit the value of the data. “Today’s farming techniques are markedly different from 10 years ago,” states Marescot.
“For example, there is rapid annual growth in total agricultural machinery power in China, which implies significant improvement in labor and land productivity.” Insurers rely primarily on data from agriculture and finance departments, says He. “These government departments can provide good levels of data to help insurance companies understand the risk for the current insurance coverage. However, obtaining data for cash crops or niche species is challenging.” “You also have to recognize the complexities in the data,” Marescot believes. “We accessed over 6,000 data files with government information for crops, livestock and forestry to calibrate our China Agricultural Model (CAM). Crop yield data is available from the 1980s, but in most cases it has to be calculated from the sown area. The data also needs to be processed to resolve inconsistencies and possibly de-trended, which is a fairly complex process. In addition, the correlation between crop yield and loss is not strong, as loss claims are made at a village level and usually involve negotiation.” A Clear Picture Without the right level of data, international companies operating in these territories may not have a clear picture of their risk profile. “Often companies have a limited view not only of where their exposures are, but also of what the specific policy requirements for that particular province are in relation to terms and conditions,” says Marescot. “These are complex, as they vary significantly from one line of business and province to the next.” A further level of complexity stems from the fact that not only can data be hard to source, but in many instances it is not reported on the same basis from province to province. This means that significant resources must be devoted to homogenizing information from multiple data streams. “We’ve devoted a lot of effort to ensuring the homogenization of all data underpinning the CAM,” Marescot explains. “We’ve also translated the information and policy requirements from Mandarin into English. This means that users can either enter their own policy conditions into the model or rely upon the database itself. In addition, the model is able to disaggregate low-resolution exposure to higher-resolution information, using planted-area data. All this has been of significant value to our clients.” A simple sketch of this kind of exposure disaggregation appears at the end of this article. The CAM covers all three lines of agricultural insurance — crop, livestock and forestry. A total of 12 crops are modeled individually, with over 60 other crop types represented in the model. For livestock, the CAM covers four main perils: disease, epidemics, natural disasters and accident/fire for cattle, swine, sheep and poultry. The Technology Age As efforts to modernize farming practices continue, new technologies are being brought to bear on monitoring crops, mapping supply and improving risk management. “More farmers are using new technology, such as apps, to track the growing conditions of crops and livestock, and are also opening this up to end consumers so that they can monitor this online and in real time,” He says. “Some companies are also trying to use blockchain technology to track the movements of crops and livestock based on consumer interest; for instance, from a piglet to the pork to the dumpling being consumed.” He adds: “3S technology — geographic information systems, remote sensing and global positioning systems — is commonly used in China for agriculture claims assessments. 
Using a smartphone app linked to remote-control CCTV on livestock farms is also very common. These digital approaches are helping farmers better manage risk.” Insurer Ping An is now using drones for claims assessment. There is no doubt that as farming practices in China evolve, the potential to generate much richer information from new data streams will facilitate the development of new products better designed to meet on-the-ground requirements. He concludes: “China can become the biggest agricultural insurance market in the next 10 years. … As the Chinese agricultural industry becomes more professional, risk management and loss assessment experience from international markets and professional farm practices could prove valuable to the Chinese market.” References: 1. Ministry of Agriculture of the People’s Republic of China 2. Cheng Fang, “Development of Agricultural Mechanization in China,” Food and Agriculture Organization of the United Nations, https://forum2017.iamo.de/microsites/forum2017.iamo.de/fileadmin/presentations/B5_Fang.pdf 3. Ministry of Agriculture of the People’s Republic of China 4. Aon Benfield, “Global Catastrophe Recap: First Half of 2017,” July 2017, http://thoughtleadership.aonbenfield.com/Documents/201707-if-1h-global-recap.pdf 5. Aon Benfield, “2016 Annual Global Climate and Catastrophe Report,” http://thoughtleadership.aonbenfield.com/Documents/20170117-ab-ifannualclimate-catastrophe-report.pdf 6. Ibid. The Disaster Plan In April 2017, China announced the launch of an expansive disaster insurance program spanning approximately 200 counties in the country’s primary grain-producing regions, including Hebei and Anhui. The program introduces a new form of agriculture insurance designed to provide compensation for crop losses resulting from natural catastrophes, with cover extending to costs including land fees, fertilizers and crop-related materials. China’s commitment to providing robust disaster cover was also demonstrated in 2016, when Swiss Re announced it had entered into a reinsurance protection scheme with the government of Heilongjiang Province and the Sunlight Agriculture Mutual Insurance Company of China — the first instance of the Chinese government capitalizing on a commercial program to provide cover for natural disasters. The coverage provides compensation to farming families for harm to life and damage to property, as well as income loss resulting from floods, excessive rain, drought and low temperatures. It determines insurance payouts based on triggers from satellite and meteorological data. Speaking at the launch, Swiss Re president for China John Chen said: “It is one of the top priorities of the government bodies in China to better manage natural catastrophe risks, and it has been the desire of the insurance companies in the market to play a bigger role in this sector. We are pleased to bridge the cooperation with an innovative solution and would look forward to replicating the solutions for other provinces in China.”  
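The payout mechanics of index-based schemes like the one described above can be illustrated with a brief sketch. The thresholds, payout shares and field names below are purely hypothetical and are not the actual terms of the Heilongjiang program; the intent is only to show how satellite and meteorological triggers might translate into a payout.

```python
# Minimal sketch of a parametric (index-based) payout trigger. All thresholds,
# payout shares and field names are hypothetical illustrations only.

from dataclasses import dataclass

@dataclass
class IndexReading:
    cumulative_rainfall_mm: float   # seasonal rainfall from meteorological data
    vegetation_index: float         # e.g., a satellite-derived greenness index (0-1)
    min_temperature_c: float        # lowest recorded temperature over the cover period

def parametric_payout(reading: IndexReading, limit: float) -> float:
    """Return a payout as a share of the limit based on which triggers are breached."""
    payout_share = 0.0
    if reading.cumulative_rainfall_mm < 150:   # drought trigger (hypothetical)
        payout_share = max(payout_share, 0.5)
    if reading.cumulative_rainfall_mm > 700:   # excess-rain / flood trigger (hypothetical)
        payout_share = max(payout_share, 0.7)
    if reading.vegetation_index < 0.3:         # poor crop condition seen from satellite
        payout_share = max(payout_share, 0.6)
    if reading.min_temperature_c < -2.0:       # low-temperature trigger (hypothetical)
        payout_share = max(payout_share, 0.4)
    return payout_share * limit

# Example: a dry, cold season breaches the drought and temperature triggers.
print(parametric_payout(IndexReading(120.0, 0.45, -5.0), limit=1_000_000))  # 500000.0
```

One appeal of triggers of this kind is speed: because the payout is calculated directly from index data, compensation can be released without waiting for on-the-ground loss adjustment.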
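As forward-referenced earlier, the following is a minimal sketch of how low-resolution exposure reported at province level might be spread to higher resolution using planted-area weights, in the spirit of the approach Marescot describes. This is an illustration under stated assumptions, not the CAM’s actual implementation; all figures and region names are hypothetical.

```python
# Sketch: disaggregate a province-level sum insured to county level in proportion
# to hypothetical planted area. Figures and county names are illustrative only.

province_sum_insured = 100_000_000.0  # total sum insured reported at province level

# Hypothetical planted area (hectares) by county within the province
planted_area_ha = {"County A": 250_000, "County B": 150_000, "County C": 100_000}

total_area = sum(planted_area_ha.values())
county_exposure = {
    county: province_sum_insured * area / total_area
    for county, area in planted_area_ha.items()
}

for county, exposure in county_exposure.items():
    print(f"{county}: {exposure:,.0f}")
# County A: 50,000,000 / County B: 30,000,000 / County C: 20,000,000
```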

EDITOR
September 04, 2017
Efficiency Breeds Value

Insurers must harness data, technology and human capital if they are to operate more efficiently and profitably in the current environment, but as AXIS Capital’s Albert Benchimol tells EXPOSURE, offering better value to clients may be a better long-term motive for becoming more efficient. Efficiency is a top priority for insurers the world over as they bid to increase margins, reduce costs and protect profitability in the competitive heat of the enduring soft market. But according to AXIS Capital president and CEO Albert Benchimol, there is a broader, more important and longer-term challenge that must also be addressed through the ongoing efficiency drive: value for money. “When I think of value, I think of helping our clients and partners succeed in their own endeavors. This means providing quick and responsive service, creative policy structures that address our customers’ coverage needs, best-in-class claims handling and trusting our people to pursue their own entrepreneurial goals,” says Benchimol. “While any one insurance policy may in itself offer good value, when aggregated, insurance is not necessarily seen as good value by clients. Our industry as a whole needs to deliver a better value proposition — and that means that all participants in the value chain will need to become much more efficient.” According to Benchimol — who prior to being appointed CEO of AXIS in 2012 served as the Bermuda-based insurance group’s CFO and also held senior executive positions at Partner Re, Reliance Group and Bank of Montreal — the days of paying out US$0.55-$0.60 in claims for every dollar of premium paid are over. “We need to start framing our challenge as delivering a 70 percent-plus loss ratio within a low 90s combined ratio,” he asserts. “Every player in the value chain needs to adopt efficiency-enhancing technology to lower our costs and pass those savings on to the customer.” With a surfeit of capital making it unlikely the insurance industry will return to its traditional cyclical nature any time soon, Benchimol says these changes have to be adopted for the long term. “Insurers have to evaluate their portfolios and product offerings to match customer needs with marketplace realities. We will need to develop new products to meet emerging demand; offer better value in the eyes of insureds; apply data, analytics and technology to all facets of our business; and become much more efficient,” he explains. Embracing Technology The continued adoption and smarter use of data will be central to achieving this goal. “We’ve only begun to scratch the surface of what data we can access and insights we can leverage to make better, faster decisions throughout the risk transfer value chain,” Benchimol says. “If we use technology to better align our operations and costs with our customers’ needs and expectations, we will create and open up new markets because potential insureds will see more value in the insurance product.” “I admire companies that constantly challenge themselves and that are driven by data to make informed decisions — companies that don’t rest on their laurels and don’t accept the status quo” Technology, data and analytics have already brought improved efficiencies to the insurance market. This has allowed insurers to focus their efforts on targeted markets and develop applications to deliver improved, customized purchasing experiences and increase client satisfaction and engagement, Benchimol notes. 
The introduction of data modeling, he adds, has also played a key role in improving economic protection, making it easier for (re)insurance providers to evaluate risks and enter new markets, thereby increasing the amount of capacity available to protect insureds. “While this can sometimes raise pricing pressures, it has the positive benefit of bringing more affordable capacity to potential customers. This has been most pronounced in the development of catastrophe models in underinsured emerging markets, where capital hasn’t always been available in the past,” he says. The introduction of models made these markets more attractive to capital providers which, in turn, has made developing custom insurance products more cost-effective and affordable for both insurers and their clients, Benchimol explains. However, there is no doubt the insurance industry has more to do if it is not only to improve its own profitability and offerings to customers, but also to stave off competition from external threats, such as disruptive innovators in the FinTech and InsurTech spheres. Strategic Evolution “The industry’s inefficiencies and generally low level of customer satisfaction make it relatively easy prey for disruption,” Benchimol admits. However, he believes that the regulated and highly capital-intensive nature of insurance is such that established domain leaders will continue to thrive if they are prepared to beat innovators at their own game. “We need to move relatively quickly, as laggards may have a difficult time catching up,” he warns. “In order to thrive in the disruptive market economy, market leaders must take intelligent risks. This isn’t easy, but is absolutely necessary,” Benchimol says. “I admire companies that constantly challenge themselves and that are driven by data to make informed decisions — companies that don’t rest on their laurels and don’t accept the status quo.” “We need to start framing our challenge as delivering a 70 percent-plus loss ratio within a low 90s combined ratio” Against the backdrop of a rapidly evolving market and transformed business environment, AXIS took stock of its business at the start of 2016, evaluating its key strengths and reflecting on the opportunities and challenges in its path. What followed was an important strategic evolution. “Over the course of the year we implemented a series of strategic initiatives across the business to drive long-term growth and ensure we deliver the most value to our clients, employees and shareholders,” Benchimol says. “This led us to sharpen our focus on specialty risk, where we believe we have particular expertise. We implemented new initiatives to even further enhance the quality of our underwriting. We invested more in our data and analytics capabilities, expanded the focus in key markets where we feel we have the greatest relevance, and took action to acquire firms that allow us to expand our leadership in specialty insurance, such as our acquisition of specialty aviation insurer and reinsurer Aviabel and our recent offer to acquire Novae.” Another highlight for AXIS in 2016 was the launch of Harrington Re, co-founded with the Blackstone Group. “At AXIS, our focus on innovation also extends to how we look at alternative funding sources and our relationship with third-party capital, which centers on matching the right risk with the right capital,” Benchimol explains. 
“We currently have a number of alternative capital sources that complement our balance sheet and enable us to deliver enhanced capacity and tailored solutions to our clients and brokers.” Benchimol believes a significant competitive advantage for AXIS is that it is still small enough to be agile and responsive to customers’ needs, yet large enough to take advantage of its global capabilities and resources in order to help clients manage their risks. But like many of his competitors, Benchimol knows future success will be heavily reliant on how well AXIS melds human expertise with the use of data and technology. “We need to combine our ingenuity, innovation and values with the strength, speed and intelligence offered by technology, data and analytics. The ability to combine these two great forces — the art and science of insurance — is what will define the insurer of the future,” Benchimol states. The key, he believes, is to empower staff to make informed, data-driven decisions. “The human elements that are critical to success in the insurance industry are, among others: knowledge, creativity, service and commitment to our clients and partners. We need to operate within a framework that utilizes technology to provide a more efficient customer experience and is underpinned by enhanced data and analytics capabilities that allow us to make informed, intelligent decisions on behalf of our clients.” However, Benchimol insists insurers must embrace change while holding on to the traditional principles that underpinned insurance in the analog age and must continue to underpin it in the future. “We must harness technology for good causes, while remaining true to the core values and universal strengths of our industry — a passion for helping people when they are down, a creativity in structuring products, and the commitment to keeping the promise we make to our clients to help them mitigate risks and ensure the security of their assets,” he says. “We must not forget these critical elements that comprise the heart of the insurance industry.”
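As a back-of-the-envelope illustration of the loss-ratio and combined-ratio arithmetic Benchimol refers to, the sketch below assumes a hypothetical 22-cent expense load per dollar of premium; the point is simply that paying out 70 cents or more in claims only works inside a low-90s combined ratio if the rest of the value chain costs roughly 20 cents.

```python
# Illustration of the loss-ratio / combined-ratio arithmetic. Figures are
# hypothetical and chosen only to show how the two ratios relate.

premium = 1.00       # one dollar of premium
claims_paid = 0.70   # a "70 percent-plus" loss ratio: ~70 cents of claims per dollar
expenses = 0.22      # assumed acquisition and operating costs per dollar of premium

loss_ratio = claims_paid / premium
expense_ratio = expenses / premium
combined_ratio = loss_ratio + expense_ratio

print(f"Loss ratio: {loss_ratio:.0%}")          # 70%
print(f"Expense ratio: {expense_ratio:.0%}")    # 22%
print(f"Combined ratio: {combined_ratio:.0%}")  # 92% -- a "low 90s" combined ratio
```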
