Helen Yates
A Burgeoning Opportunity
September 04, 2017

As traditional (re)insurers hunt for opportunity outside of property catastrophe classes, new probabilistic casualty catastrophe models are becoming available. At the same time, as catastrophe risks are becoming increasingly “manufactured” or human-made, so casualty classes have the potential to be the source of claims after a large “natural” catastrophe. Just as the growing sophistication of property catastrophe models has enabled industry innovation, there is growing excitement that new tools available to casualty (re)insurers could help to expand the market. Through improved evaluation of casualty clash exposures, reinsurers will be better able to understand, price and manage their exposures, as well as design new products that cater to underserved areas. However, the casualty market must switch from pursuing a purely defensive strategy.

“There is an ever-growing list of exclusions in liability insurance and interest in the product is declining with the proliferation of these exclusions,” explains Dr. Robert Reville, president and CEO of Praedicat, the world’s first liability catastrophe modeling company. “There is a real growth opportunity for the industry to deal with these exclusions and recognize where they can confidently write more business.

“Industry practitioners look at what’s happened in property — where modeling has led to a lot of new product ideas, including capital market solutions, and a lot of innovation — and casualty insurers are hungry for that sort of innovation, for the same sort of transformation in liability that happened in property,” he adds.

Perils — particularly emerging risks that underwriters have struggled to price, manage and understand — have typically been excluded from casualty products. This includes electromagnetic fields (EMFs), such as those emanating from broadcast antennas and cell phones. Cover for such exposures is restricted, particularly for the U.S. market, where it is often excluded entirely.
Some carriers will not offer any cover at all if the client has even a remote exposure to EMF risks. Yet are they being over-apprehensive about the risk? The fear that leads to an over-application of exclusions is very tangible.

“The latency of the disease development process — or the way a product might be used, with more people becoming exposed over time — causes there to be a build-up of risk that may result in catastrophe,” Reville continues. “Insurers want to be relevant to insuring innovation in product, but they have to come to terms with the latency and the potential for a liability catastrophe that might emerge from it.”

Unique nature of casualty catastrophe

It is a misconception that casualty is not a catastrophe class of business. Reville points out that the industry’s US$100 billion-plus loss relating to asbestos claims is arguably its biggest-ever catastrophe. Within the Lloyd’s market, the overwhelming nature of APH (asbestos, pollution and health) liabilities contributed to the market’s downward spiral in the late 1980s, only brought under control through the formation of the run-off entity Equitas, now owned and managed by Warren Buffett’s Berkshire Hathaway.

As the APH claims crisis demonstrated, casualty catastrophes differ from property catastrophes in that they are a “two-tailed loss.” There is the “tail loss” both have in common, which describes the low-probability, high-severity characteristics — or high return period — of a major event. But in addition, casualty classes of business are “long-tail” in nature. This means that a policy written in 2017 may not experience a claim until 20 years later, presenting an additional challenge from a modeling and reserving perspective.
“Casualty insurers are hungry for that sort of innovation, for the same sort of transformation in liability that happened in property”
Robert Reville, Praedicat

Another big difference between casualty clash and property catastrophe from a modeling perspective is that the past is not a good indication of future claims. “By the time asbestos litigation had really taken off, it was already a banned product in the U.S., so it was not as though asbestos claims were any use in trying to figure out where the next environmental disaster or next product liability was going to be,” says Reville. “So, we needed a forward-looking approach to identify where there could be new sources of litigation.”

With the world becoming both more interconnected and more litigious, there is every expectation that future casualty catastrophe losses could be much greater and impact multiple classes of business. “The reality is there’s serial aggregation and systemic risk within casualty business, and our answer to that has generally been that it’s too difficult to quantify,” says Nancy Bewlay, chief underwriting officer, global casualty, at XL Catlin. “But the world is changing. We now have technology advances and data collection capabilities we never had before, and public information that can be used in the underwriting process.

“Take the Takata airbag recall,” she continues. “In 2016, they had to recall 100 million airbags worldwide. It affected all the major motor manufacturers, who then faced the accumulation potential not only of third-party liability claims, but also product liability and product recall.
Everything starts to accumulate and combine within that one industry, and when you look at the economic footprint of that throughout the supply chain there’s a massive potential for a casualty catastrophe when you see how everything is interconnected.”

RMS chief research officer Robert Muir-Wood explains: “Another area where we can expect an expansion of modeling applications concerns casualty lines picking up losses from more conventional property catastrophes. This could occur when the cause of a catastrophe can be argued to have ‘non-natural’ origins, and particularly where there are secondary ‘cascade’ consequences of a catastrophe — such as a dam failing after a big earthquake or for claims on ‘professional lines’ coverages of builders and architects — once it is clear that standard property insurance lines will not compensate for all the building damage.

“This could be prevalent in regions with low property catastrophe insurance penetration, such as California, where just one in ten homeowners has earthquake cover. In the largest catastrophes, we could expect claims to be made against a wide range of casualty lines. The big innovation around property catastrophe in particular was to employ high-resolution GIS [geographic information systems] data to identify the location of all the risk. We need to apply similar location data to casualty coverages, so that we can estimate the combined consequences of a property/casualty clash catastrophe.”

One active instance, cited by Muir-Wood, of this shift from property to casualty coverages concerns earthquakes in Oklahoma. “There are large amounts of wastewater left over from fracking, and the cheapest way of disposing of it is to pump it down deep boreholes. But this process has been triggering earthquakes, and these earthquakes have started getting quite big — the largest so far, in September 2016, had a magnitude of M5.8.
“At present the damage to buildings caused by these earthquakes is being picked up by property insurers,” he continues. “But what you will see over time are lawsuits to try and pass the costs back to the operators of the wells themselves. Working with Praedicat, RMS has done some modeling work on how these operators can assess the risk cost of adding a new disposal well. Clearly, the larger the earthquake, the less likely it is to occur. However, the costs add up: our modeling shows that an earthquake bigger than M6 right under Oklahoma City could cause more than US$10 billion of damage.”

Muir-Wood adds: “The challenge is that casualty insurance tends to cover many potential sources of liability in the contract, and the operators of the wells and, we believe, their insurers are not currently identifying this particular — and potentially catastrophic — source of future claims. There’s the potential for a really big loss that would eventually fall onto the liability writers of these deep wells … and they are not currently pricing for this risk, or managing their portfolios of casualty lines.”

A modeled class of business

According to Reville, the explosion of data and the development of data science tools have been key to the development of casualty catastrophe modeling. The opportunity to develop probabilistic modeling for casualty classes of business was born in the mid-2000s, when Reville was senior economist at the RAND Corporation. At that time, RAND was using data from the RMS® Probabilistic Terrorism Model to help inform the U.S. Congress in its decision on the renewal of the Terrorism Risk Insurance Act (TRIA). Separately, it had written a paper on the scope and scale of asbestos litigation and its potential future course.
“As we were working on these two things it occurred to us that here was this US$100 billion loss — this asbestos problem — and adjacently within property catastrophe insurance there was this developed form of analytics that was helping insurers solve a similar problem. So, we decided to work together to try and figure out if there was a way of solving the problem on the liability side as well,” adds Reville.

Eventually Praedicat was spun out of the initial project as its own brand, launching its first probabilistic liability catastrophe model in summer 2016. “The industry has evolved a lot over the past five years, in part driven by Solvency II and heightened interest from the regulators and rating agencies,” says Reville. “There is a greater level of concern around the issue, and the ability to apply technologies to understand risk in new ways has evolved a lot.”

There are obvious benefits to (re)insurers from a pricing and exposure management perspective. “The opportunity is changing the way we underwrite,” says Bewlay. “Historically, we underwrote by exclusion with a view to limiting our maximum loss potential. We couldn’t get a clear understanding of our portfolio because we didn’t have enough meaningful, statistical and credible data.”

“We feel they are not being proactive enough because … there’s the potential for a really big loss that would fall onto the liability writers of these deep wells”
Robert Muir-Wood, RMS

Then there are the exciting opportunities for growth in a market where there is intense competition and downward pressure on rates. “Now you can take a view on the ‘what-if’ scenario and ask: how much loss can I handle and what’s the probability of that happening?” she continues. “So, you can take on managed risk.
Through the modeling you can better understand your industry classes and what could happen within your portfolio, and can be slightly more opportunistic in areas where previously you may have been extremely cautious.”

Not only does this expand the potential range of casualty insurance and reinsurance products, it should allow the industry to better support developments in burgeoning industries. “Cyber is a classic example,” says Bewlay. “If you can start to model the effects of a cyber loss you might decide you’re OK providing cyber in personal lines for individual homeowners in addition to providing cyber in a traditional business or technology environment.

“You would start to model all three of these scenarios, what your potential share of the market loss would be for a particular event, and how that would impact your portfolio,” she continues. “If you can answer those questions utilizing your classic underwriting and actuarial techniques, with a bit of predictive modeling in there — this is the blend of art and science — you can start taking opportunities that possibly you couldn’t before.”
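The risk-cost logic behind the Oklahoma disposal-well example (rarer, larger earthquakes weighed against their far larger losses) reduces to a probability-weighted sum over modeled events. Below is a minimal sketch in Python; the magnitudes, annual rates and loss figures are hypothetical placeholders, not output of the RMS/Praedicat modeling described in the article.

```python
# Illustrative sketch of an annual expected loss (AEL) calculation from a
# table of modeled events. All figures below are invented for illustration.

events = [
    # (magnitude, annual occurrence rate, estimated loss in US$ billions)
    (4.5, 0.50, 0.1),
    (5.0, 0.20, 0.5),
    (5.8, 0.05, 2.0),
    (6.2, 0.01, 10.0),  # the article notes >M6 under Oklahoma City could exceed US$10bn
]

def annual_expected_loss(event_table):
    """Sum of (annual occurrence rate x loss) across modeled events."""
    return sum(rate * loss for _, rate, loss in event_table)

print(f"Annual expected loss: US${annual_expected_loss(events):.2f}bn")
```

The same probability-weighted structure underlies the “how much loss can I handle and what’s the probability of that happening?” question Bewlay raises, with the event table replaced by a full stochastic event set.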

NIGEL ALLEN
Breaching the Flood Insurance Barrier
September 04, 2017

With many short-term reauthorizations of the National Flood Insurance Program, EXPOSURE considers how the private insurance market can bolster its presence in the U.S. flood arena and overcome some of the challenges it faces.

According to the Federal Emergency Management Agency (FEMA), as of June 30, 2017, the National Flood Insurance Program (NFIP) had around five million policies in force, representing a total in-force written premium exceeding US$3.5 billion and an overall exposure of about US$1.25 trillion. Florida alone accounts for over a third of those policies, with over 1.7 million in force in the state, representing premiums of just under US$1 billion. However, with the RMS Exposure Source Database estimating approximately 85 million residential properties alone in the U.S., the NFIP encompasses only a small fraction of the overall number of properties exposed to flood, considering floods can occur throughout the country.

Factors limiting the reach of the program have been well documented: the restrictive scope of NFIP policies, the fact that mandatory coverage applies only to special flood hazard areas, the challenges involved in securing elevation certificates, the cost and resource demands of conducting on-site inspections, the poor claims performance of the NFIP and, perhaps most significant, the refusal by many property owners to recognize the threat posed by flooding.

At the time of writing, the NFIP is once again being put to the test as Hurricane Harvey generates catastrophic floods across Texas. As the affected regions battle against these unprecedented conditions, it is highly likely that the resulting major losses will add further impetus to the push for a more substantive private flood insurance market.

The private market potential

While the private insurance sector shoulders some of the flood coverage, it is a drop in the ocean, with RMS estimating the number of private flood policies to be around 200,000.
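A quick back-of-envelope check makes the penetration gap concrete, using the figures above (around five million NFIP policies against an estimated 85 million residential properties; since NFIP policies also include commercial and contents-only covers, the residential share is, if anything, an upper bound):

```python
# Back-of-envelope penetration check using the article's figures.
nfip_policies = 5_000_000               # NFIP policies in force (FEMA, June 30, 2017)
us_residential_properties = 85_000_000  # RMS Exposure Source Database estimate

penetration = nfip_policies / us_residential_properties
print(f"NFIP reach: roughly {penetration:.1%} of U.S. residential properties")
# prints: NFIP reach: roughly 5.9% of U.S. residential properties
```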
According to Dan Alpay, line underwriter for flood and household at Hiscox London Market, private insurers represent around US$300 million to US$400 million of premium — although he adds that much of this is in “big-ticket policies” where flood has been included as part of an all-risks policy. “In terms of stand-alone flood policies,” he says, “the private market probably only represents about US$100 million in premiums — much of which has been generated in the last few years, with the opening up of the flood market following the introduction of the Biggert-Waters Flood Insurance Reform Act of 2012 and the Homeowner Flood Insurance Affordability Act of 2014.”

It is clear, therefore, that the U.S. flood market represents one of the largest untapped insurance opportunities in the developed world, with trillions of dollars of property value at risk across the country. “It is extremely rare to have such a huge potential market like this,” says Alpay, “and we are not talking about a risk that the market does not understand. It is U.S. catastrophe business, which is a sector that the private market has extensive experience in. And while most insurers have not provided specific cover for U.S. flood before, they have been providing flood policies in many other countries for many years, so have a clear understanding of the peril characteristics. And I would also say that much of the experience gained on the U.S. wind side is transferable to the flood sector.”

Yet while the potential may be colossal, the barriers to entry are also significant. First and foremost, there is the challenge of going head-to-head with the NFIP itself. While there is concerted effort on the part of the U.S. government to facilitate a greater private insurer presence in the flood market as part of its reauthorization, the program has presided over the sector for almost 50 years, and competing for those policies will be no easy task. “The main problem is changing consumer behavior,” believes Alpay.
“How do we get consumers who have been buying policies through the NFIP since 1968 to appreciate the value of a private market product and trust that it will pay out in the event of a loss? While you may be able to offer a product that on paper is much more comprehensive and provides a better deal for the insured, many will still view it as risky given their inherent trust in the government.”

For many companies, the aim is not to compete with the program, but rather to source opportunities beyond the flood zones, accessing the potential that exists outside of the mandatory purchase requirements. But to do this, property owners who are currently not located in these zones need to understand that they are actually in an at-risk area and need to consider purchasing flood cover. This can be particularly challenging in locations where homeowners have never experienced a damaging flood event.

Another market opportunity lies in providing coverage for large industrial facilities and high-value commercial properties, according to Pete Dailey, vice president of product management at RMS. “Many businesses already purchase NFIP policies,” he explains. “In fact, those with federally insured mortgages and locations in high-risk flood zones are required to do so.

“However,” he continues, “most businesses with low-to-moderate flood risk are unaware that their business policy excludes flood damage to the building, its contents and losses due to business interruption. Even those with NFIP coverage have a US$500,000 limit and could benefit from an excess policy. Insurers eager to expand their books by offering new product options to the commercial lines will facilitate further expansion of the private market.”

Assessing the flood level

But to be able to effectively target this market, insurers must first be able to ascertain what the flood exposure levels really are. The current FEMA flood mapping database spans 20,000 individual floodplains.
However, much of this data is out of date, reflecting limited resources, which, coupled with a lack of consistency in how areas have been mapped by different contractors, means its risk assessment value is severely limited. While a proposal to use private flood mapping studies instead of FEMA maps is being considered, the basic process of maintaining floodplain data is an immense problem given the scale.

With the U.S. exposed to flood in virtually every location, this makes it a high-resolution peril, meaning there is a long list of attributes and interdependent dynamic factors influencing what the flood risk in a particular area might be. With 100 years of scientific research behind it, the physics of flooding itself is well understood; the challenge has been generating the data and creating a model at sufficient resolution to encompass all of the relevant factors from an insurance perspective. In fact, to manage the scope of the data required to release the RMS U.S. Flood Hazard Maps for a small number of return periods, the firm had to build a supercomputer, capitalizing on immense cloud-based technology to store and manage the colossal streams of information effectively.

With such data now available, insurers are in a much better position to generate functional underwriting maps — FEMA maps were never drawn up for underwriting purposes. The new hazard maps provide actual gradient and depth of flooding data, getting away from the “in” or “out” discussion and allowing insurers to work at a finer level of detail — for example, whether a property is exposed to two to three feet of flooding at a 1-in-100 return period.

No clear picture

Another hindrance to establishing a clear flood picture is the lack of a systematic database of the country’s flood defense network. RMS estimates that the total network encompasses some 100,000 miles of flood defenses; however, FEMA’s levee network accounts for only approximately 10 percent of this.
Without the ability to model existing flood defenses accurately, losses from higher-frequency, lower-severity events are overestimated. To help counter this lack of defense data, RMS developed the capability within its U.S. Inland Flood HD Model to identify the likelihood of such measures being present and, in turn, assess the potential protection levels.

Data shortage is also limiting the potential product spectrum. If an insurer is not able to demonstrate to a ratings agency or regulator the relationship between different sources of flood risk (such as storm surge and river flooding) for a given portfolio, then it could reduce the range of flood products it can offer. Insurers also need the tools and the data to differentiate the more complicated financial relationships, exclusions and coverage options relative to the nature of the events that could occur.

Launching into the sector

In May 2016, Hiscox London Market launched its FloodPlus product into the U.S. homeowners sector, following the deregulation of the market. Distributed through wholesale brokers in the U.S., the policy is designed to offer higher limits and a wider scope than the NFIP.

“We initially based our product on the NFIP policy with slightly greater coverage,” Alpay explains, “but we soon realized that to firmly establish ourselves in the market we had to deliver a policy of sufficient value to encourage consumers to shift from the NFIP to the private market.

“As we were building the product and setting the limits,” he continues, “we also looked at how to price it effectively given the lack of granular flood information. We sourced a lot of data from external vendors in addition to proprietary modeling we developed ourselves, which enabled us to build our own pricing system.
What that enabled us to do was to reduce the process time involved in buying and activating a policy from up to 30 days under the NFIP system to a matter of minutes under FloodPlus.” This sort of competitive edge will help incentivize NFIP policyholders to make a switch. “We also conducted extensive market research through our coverholders,” he adds, “speaking to agents operating within the NFIP system to establish what worked and what didn’t, as well as how claims were handled.”

“We soon realized that to firmly establish ourselves … we had to deliver a policy of sufficient value to encourage consumers to shift from the NFIP to the private market”
Dan Alpay, Hiscox London Market

Since launch, the product has been amended on three occasions in response to customer demand. “For example, initially the product offered actual cash value on contents in line with the NFIP product,” he adds. “However, after some agent feedback, we got comfortable with the idea of providing replacement cost settlement, and we were able to introduce this as an additional option, which has proved successful.”

To date, coverholder demand for the product has outstripped supply, he says. “For the process to work efficiently, we have to integrate the FloodPlus system into the coverholder’s document issuance system. So, given the IT integration process involved, plus the education regarding the benefits of the product, it can’t be introduced too quickly if it is to be done properly.” Nevertheless, growing recognition of the risk and the need for coverage is encouraging to those seeking entry into this emerging market.

A market in the making

The development of a private U.S. flood insurance market is still in its infancy, but the wave of momentum is building. A lack of relevant data, particularly in relation to loss history, is certainly dampening the private sector’s ability to gain market traction.
However, as more data becomes available, modeling capabilities improve, and insurer products gain consumer trust by demonstrating their value in the midst of a flood event, the market’s potential will really begin to flow. “Most private insurers,” concludes Alpay, “are looking at the U.S. flood market as a great opportunity to innovate, to deliver better products than those currently available, and ultimately to give the average consumer more coverage options than they have today, creating an environment better for everyone involved.” The same can be said for the commercial and industrial lines of business, where stakeholders are actively searching for cost savings and improved risk management.

Climate complications

As the private flood market emerges, so too does the debate over how flood risk will adjust to a changing climate. “The consensus today among climate scientists is that climate change is real and that global temperatures are indeed on the rise,” says Pete Dailey, vice president of product management at RMS. “Since warmer air holds more moisture, the natural conclusion is that flood events will become more common and more severe. Unfortunately, precipitation is not expected to increase uniformly in time or space, making it difficult to predict where flood risk would change in a dramatic way.”

Further, there are competing factors that make the picture uncertain. “For example,” he explains, “a warmer environment can lead to reduced winter snowpack and, in turn, reduced springtime melting. Thus, in regions susceptible to springtime flooding, holding all else constant, warming could potentially lead to reduced flood losses.” For insurers, these complications can make risk selection and portfolio management more complex.
“While the financial implications of climate change are uncertain,” he concludes, “insurers and catastrophe modelers will surely benefit from climate change research and byproducts like better flood hazard data, higher resolution modeling and improved analytics being developed by the climate science community.”
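The shift described earlier, from an “in or out” flood-zone flag to gradient and depth data (for example, two to three feet of flooding at a 1-in-100 return period), lends itself to depth-based underwriting. A minimal sketch follows, assuming a hypothetical depth-damage curve and illustrative depths; neither is taken from the RMS U.S. Flood Hazard Maps.

```python
# Illustrative sketch: converting a modeled flood depth at a return period
# into a damage ratio via a depth-damage curve. All values are hypothetical.
import bisect

# Modeled flood depth (feet) at a property for a set of return periods.
depth_at_rp = {10: 0.0, 50: 1.0, 100: 2.5, 250: 4.0}

# Simple depth-damage curve: (depth in feet, fraction of building value damaged).
damage_curve = [(0.0, 0.00), (1.0, 0.10), (3.0, 0.35), (6.0, 0.60)]

def damage_ratio(depth_ft):
    """Linearly interpolate the damage fraction for a given flood depth."""
    xs = [d for d, _ in damage_curve]
    ys = [r for _, r in damage_curve]
    if depth_ft <= xs[0]:
        return ys[0]
    if depth_ft >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_right(xs, depth_ft)
    x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
    return y0 + (y1 - y0) * (depth_ft - x0) / (x1 - x0)

# e.g. the 1-in-100 depth, roughly the "two to three feet" case in the article:
print(f"Damage ratio at 100-year depth: {damage_ratio(depth_at_rp[100]):.2f}")
```

Pricing a policy then becomes a matter of integrating such depth-conditioned damage over the full set of return periods, rather than applying a single zone-based rate.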

EDITOR
What One Thing Would Help Reinsurers Get Closer to the Original Risk?
September 04, 2017

In each edition of EXPOSURE we ask three experts for their opinion on how they would tackle a major risk and insurance challenge. This issue, we consider how (re)insurers can gain more insight into the original risk and, in so doing, remove frictional costs. As our experts Kieran Angelini-Hurll, Will Curran and Luzi Hitz note, more insight does not necessarily mean disintermediation.

Kieran Angelini-Hurll, CEO, Reinsurance at Ed

The reduction of frictional costs would certainly help. At present, there are too many frictional costs between the reinsurer and the original risk. The limited amount of data available to reinsurers on the original risks is also preventing them from getting a clear picture. A combination of new technology and a new approach from brokers can change this.

First, the technology. A trading platform that reduces frictional costs by driving down overheads will bridge the gap between reinsurance capital and the insured. However, this platform can only work if it provides the data that will allow reinsurers to better understand the risk. Arguably, the development of such a platform could be achieved by any broker with the requisite size, relevance and understanding of reinsurers’ appetites.

Brokers reluctant to share data will watch their business migrate to more disruptive players

However, for most, their business models do not allow for it. Their size stymies innovation, and they have become dependent on income derived from the sale of data, market-derived income and “facilitization.” These costs prevent reinsurers from getting closer to the risk. They are also unsustainable, a fact that technology will prove. A trading platform that has the potential to reduce costs for all parties, streamline the throughput of data, and make this information readily and freely available could profoundly alter the market.
Brokers that continue to add costs and maintain their reluctance to share data will be forced to evolve, or watch their business migrate to leaner, more disruptive players. Brokers that are committed to marrying reinsurance capital with risk, regardless of its location, and that deploy technology can help overcome the barriers put in place by current market practices and bring reinsurers closer to the original risk.

Will Curran, Head of Reinsurance, Tokio Marine Kiln, London

More and more, our customers are looking to us as their risk partners, with the expectation that we will offer far more than a transactional risk transfer product. They are looking for pre-loss services, risk advisory and engineering services, modeling and analytical capabilities, and access to our network of external experts, in addition to more traditional risk transfer. As a result of offering these capabilities, we are getting closer to the original risk, through our discussions with cedants and brokers, and our specialist approach to underwriting.

Traditional carriers are able to differentiate by going beyond vanilla risk transfer

The long-term success of reinsurers needs to be built on offering more than being purely a transactional player. To a large extent, this has been driven by the influx of non-traditional capital into the sector. Whereas these alternative property catastrophe reinsurance providers are offering a purely transactional product, often using parametric or industry-loss triggers to simplify the claims process in their favor, traditional carriers are able to differentiate by going beyond vanilla risk transfer.

Demand for risk advice and pre-loss services is particularly high within specialist and emerging risk classes of business. Cyber is a perfect example of this, where we work closely with our corporate and insurance clients to help them improve their resilience to cyber-attack and to plan their response in the event of a breach.
Going forward, successful reinsurance companies will be those that invest time and resources in becoming true risk partners. In an interconnected and increasingly complex world, where there is a growing list of underinsured exposures, risk financing is just one among many service offerings in the toolkit of specialist reinsurers.

Luzi Hitz, CEO, PERILS AG

The nature of reinsurance means the reinsurer is inherently further away from the underlying risk than most other links in the value chain. The risk is introduced by the original insured and is transferred into the primary market before reaching the reinsurer – a process normally facilitated by reinsurance intermediaries. I am wary of efforts to shortcut or circumvent this established multi-link chain to reduce the distance between reinsurer and the underlying risk. The reinsurer in many cases lacks the granular insight found earlier in the process that is required to access the risk directly.

What we need is a more cooperative relationship between reinsurer and insurer in developing risk transfer products. Too often the reinsurers act purely as capital providers in the chain, far removed from the source risk, viewing it almost as an abstract concept within the overall portfolio.

The focus should be on how to bring all parties to the risk closer together

By collaborating on the development of insurance products, not only will it help create greater alignment of interest based on a better understanding of the risk relationship, but it will also prove beneficial to the entire insurance food chain. It will make the process more efficient and cost-effective, and hopefully see the risk owners securing the protection they want. In addition, it is much more likely to stimulate product innovation and growth, which is badly needed in many mature markets. The focus in my opinion should not be on how to bring the reinsurer closer to the risk, but rather on how to bring all parties to the risk closer together.
What I am saying is not new, and it is certainly something which many larger reinsurers have been striving to achieve for years. And while there is evidence of this more collaborative approach between insurers and reinsurers gaining traction, there is still a very long way to go.  

NIGEL ALLEN
The Lay of The Land
September 04, 2017

China has made strong progress in developing agricultural insurance and aims to continually improve. As farming practices evolve, and new capabilities and processes enhance productivity, how can agricultural insurance in China keep pace with trending market needs? EXPOSURE investigates.

The People’s Republic of China is a country of immense scale. Covering some 9.6 million square kilometers (3.7 million square miles), just two percent smaller than the U.S., the region spans five distinct climate areas with a diverse topography extending from the lowlands to the east and south to the immense heights of the Tibetan Plateau. Arable land accounts for approximately 135 million hectares (521,238 square miles), close to four times the size of Germany, feeding a population of 1.3 billion people. In total, over 1,200 crop varieties are cultivated, ranging from rice and corn to sugar cane and goji berries. In terms of livestock, some 20 species covering over 740 breeds are found across China, while it hosts over 20,000 aquatic breeds, including 3,800 types of fish.1

A productive approach

With per capita land area less than half of the global average, maintaining agricultural output is a central function of the Chinese government, and agricultural strategy has formed the primary focus of the country’s “No. 1 Document” for the last 14 years. To encourage greater efficiency, the central government has sought to modernize methods and promote large-scale production, including the creation of more agricultural cooperatives; the number of agricultural machinery cooperatives, which encourage mechanization, has doubled over the last four years.2 According to the Ministry of Agriculture, by the end of May 2015 there were 1.393 million registered farming cooperatives, up 22.4 percent from 2014 — a year that saw the government increase its funding for these specialized entities by 7.5 percent to ¥2 billion (US$0.3 billion).

Changes in land allocation are also dramatically altering the landscape.
In April 2017, the minister of agriculture, Han Changfu, announced plans to assign agricultural production areas to two key functions over the next three years, with 900 million mu (60 million hectares) for primary grain products, such as rice and wheat, and 238 million mu (16 million hectares) for five other key products, including cotton, rapeseed and natural rubber. Productivity levels are also being boosted by enhanced farming techniques and higher-yield crops, with new crop varieties, including high-yield wheat and “super rice,” increasing annual tonnage. Food grain production has risen from 446 million tons in 1990 to 621 million tons in 2015.3 The year 2016 saw a 0.8 percent decline — the first in 12 years — but structural changes were a contributory factor. Insurance penetration China is one of the most exposed regions in the world to natural catastrophes. Historically, China has repeatedly experienced droughts of varying spatial extent and severity of crop damage, including severe widespread droughts in 1965, 2000 and 2007. Frequent flooding also occurs, but with the development of flood mitigation schemes, flooding of crop areas is on a downward trend. However, China has borne the brunt of one of the costliest natural catastrophes of 2017 to date, according to Aon Benfield,4 with July floods along the Yangtze River basin causing economic losses topping US$6.4 billion. The 2016 summer floods caused some US$28 billion in losses along the river,5 while flooding in northeastern China caused a further US$4.7 billion in damage. Add drought losses of US$6 billion and the annual weather-related losses stood at US$38.7 billion.6 However, insured losses were a fraction of that figure, with only US$1.1 billion of those losses insured. 
“Often companies not only do not know where their exposures are, but also what the specific policy requirements for that particular region are in relation to terms and conditions” Laurent Marescot RMS The region represents the world’s second-largest agricultural insurance market, which has grown from a premium volume of US$100 million in 2006 to more than US$6 billion in 2016. However, government subsidies — at both central and local levels — underpin the majority of the market. In 2014, the premium subsidy level ranged between 65 percent and 80 percent, depending on the region and the type of insurance. Most of the insured are small-acreage farms, for which crop insurance is based on named perils but includes multiple-peril cover (drought, flood, extreme winds and hail, freeze and typhoon). Loss assessment is generally performed by surveyors from the government, insurers and an individual who represents farmers within a village. Subsidized insurance is limited to specific crop varieties and breeds and primarily covers only direct material costs, which significantly lowers its appeal to the farming community. One drawback of current multi-peril crop insurance is its high operational cost, which reduces the impact of subsidies. “Currently, the penetration of crop insurance in terms of the insured area is at about 70 percent,” says Mael He, head of agriculture, China, at Swiss Re. “However, the coverage is limited and the sum insured is low. The penetration is only 0.66 percent in terms of premium to agricultural GDP. As further implementation of land transfer in different provinces and changes in supply chain policy take place, livestock, crop yield and revenue insurance will be further developed.” As He points out, changing farming practices warrant new types of insurance. “For the cooperatives, their insurance needs are very different compared to those of small household farmers. 
Considering their main income is from farm production, they need insurance cover on yield or event-price-related agricultural insurance products, instead of cover for just production costs in all perils.” At ground level Given low penetration levels and limited coverage, China’s agricultural market is clearly primed for growth. However, a major hindering factor is access to relevant data to inform meaningful insurance decisions. For many insurers, the time series of insurance claims is short: government-subsidized agriculture insurance only started in 2007, according to Laurent Marescot, senior director, market and product specialists at RMS. “This is a very limited data set upon which to forecast potential losses,” says Marescot. “Given current climate developments and changing weather patterns, it is highly unlikely that during that period we have experienced the most devastating events that we are likely to see. It is hard to get any real understanding of a potential 1-in-100 loss from such data.” Major changes in agricultural practices also limit the value of the data. “Today’s farming techniques are markedly different from 10 years ago,” states Marescot. “For example, there is a rapid annual growth rate of total agricultural machinery power in China, which implies significant improvement in labor and land productivity.” Insurers are primarily reliant on data from agriculture and finance departments for information, says He. “These government departments can provide good levels of data to help insurance companies understand the risk for the current insurance coverage. However, obtaining data for cash crops or niche species is challenging.” “You also have to recognize the complexities in the data,” Marescot believes. “We accessed over 6,000 data files with government information for crops, livestock and forestry to calibrate our China Agricultural Model (CAM). Crop yield data is available from the 1980s, but in most cases it has to be calculated from the sown area. 
The data also needs to be processed to resolve inconsistencies and possibly de-trended, which is a fairly complex process. In addition, the correlation between crop yield and loss is not great, as loss claims are made at a village level and usually involve negotiation.” A clear picture Without the right level of data, international companies operating in these territories may not have a clear picture of their risk profile. “Often companies have a limited view not only of where their exposures are, but also of the specific policy requirements for that particular province in relation to terms and conditions,” says Marescot. “These are complex, as they vary significantly from one line of business and province to the next.” A further level of complexity stems from the fact that not only can data be hard to source, but in many instances it is not reported on the same basis from province to province. This means that significant resource must be devoted to homogenizing information from multiple different data streams. “We’ve devoted a lot of effort to ensuring the homogenization of all data underpinning the CAM,” Marescot explains. “We’ve also translated the information and policy requirements from Mandarin into English. This means that users can either enter their own policy conditions into the model or rely upon the database itself. In addition, the model is able to disaggregate low-resolution exposure to higher-resolution information, using planted area data. All this has been of significant value to our clients.” The CAM covers all three lines of agricultural insurance — crop, livestock and forestry. A total of 12 crops are modeled individually, with over 60 other crop types represented in the model. For livestock, CAM covers four main perils: disease, epidemics, natural disasters and accident/fire for cattle, swine, sheep and poultry. 
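The de-trending step Marescot describes can be illustrated with a short sketch. The yield figures below are invented for illustration, and the approach shown (a simple linear fit) is only one of several possible de-trending choices; it is not drawn from the CAM itself.

```python
import numpy as np

# Hypothetical yield series (tons per hectare): improving technology adds an
# upward trend that would distort loss estimates if left in the data.
years = np.arange(1980, 2016)
rng = np.random.default_rng(7)
yields = 3.0 + 0.04 * (years - 1980) + rng.normal(0.0, 0.25, years.size)

# Fit a linear trend, then remove it, re-expressing every year's yield at
# the technology level of the most recent year.
slope, intercept = np.polyfit(years, yields, 1)
trend = slope * years + intercept
detrended = yields - trend + trend[-1]

# What remains is the weather-driven variability around today's expected
# yield, which is the quantity a catastrophe model needs to calibrate against.
```

In practice the trend is rarely this clean: changes in crop variety, mechanization and reporting basis can each introduce their own breaks, which is part of why Marescot calls the process "fairly complex."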
The technology age As efforts to modernize farming practices continue, new technologies are being brought to bear on monitoring crops, mapping supply and improving risk management. “More farmers are using new technology, such as apps, to track the growing conditions of crops and livestock and are also opening this to end consumers so that they can also monitor this online and in real time,” He says. “There are also some companies trying to use blockchain technology to track the movements of crops and livestock based on consumer interest; for instance, from a piglet to the pork to the dumpling being consumed.” He says, “3S technology — geographic information sciences, remote sensing and global positioning systems — is commonly used in China for agriculture claims assessments. Using a smartphone app linked to remote control CCTV in livestock farms is also very common. These digital approaches are helping farmers better manage risk.” Insurer Ping An is now using drones for claims assessment. There is no doubt that as farming practices in China evolve, the potential to generate much greater information from new data streams will facilitate the development of new products better designed to meet on-the-ground requirements. He concludes: “China can become the biggest agricultural insurance market in the next 10 years. … As the Chinese agricultural industry becomes more professional, risk management and loss assessment experience from international markets and professional farm practices could prove valuable to the Chinese market.” References: 1. Ministry of Agriculture of the People’s Republic of China 2. Cheng Fang, “Development of Agricultural Mechanization in China,” Food and Agriculture Organization of the United Nations, https://forum2017.iamo.de/microsites/forum2017.iamo.de/fileadmin/presentations/B5_Fang.pdf 3. Ministry of Agriculture of the People’s Republic of China 4. 
Aon Benfield, “Global Catastrophe Recap: First Half of 2017,” July 2017, http://thoughtleadership.aonbenfield.com/Documents/201707-if-1h-global-recap.pdf 5. Aon Benfield, “2016 Annual Global Climate and Catastrophe Report,” http://thoughtleadership.aonbenfield.com/Documents/20170117-ab-ifannualclimate-catastrophe-report.pdf 6. Ibid. The disaster plan In April 2017, China announced the launch of an expansive disaster insurance program spanning approximately 200 counties in the country’s primary grain producing regions, including Hebei and Anhui.  The program introduces a new form of agriculture insurance designed to provide compensation for losses to crop yields resulting from natural catastrophes, including land fees, fertilizers and crop-related materials. China’s commitment to providing robust disaster cover was also demonstrated in 2016, when Swiss Re announced it had entered into a reinsurance protection scheme with the government of Heilongjiang Province and the Sunlight Agriculture Mutual Insurance Company of China — the first instance of the Chinese government capitalizing on a commercial program to provide cover for natural disasters. The coverage provides compensation to farming families for both harm to life and damage to property as well as income loss resulting from floods, excessive rain, drought and low temperatures. It determines insurance payouts based on triggers from satellite and meteorological data. Speaking at the launch, Swiss Re president for China John Chen said: “It is one of the top priorities of the government bodies in China to better manage natural catastrophe risks, and it has been the desire of the insurance companies in the market to play a bigger role in this sector. We are pleased to bridge the cooperation with an innovative solution and would look forward to replicating the solutions for other provinces in China.”  

EDITOR
Albert Benchimol
Efficiency Breeds Value
September 04, 2017

Insurers must harness data, technology and human capital if they are to operate more efficiently and profitably in the current environment, but as AXIS Capital’s Albert Benchimol tells EXPOSURE, offering better value to clients may be a better long-term motive for becoming more efficient. Efficiency is a top priority for insurers the world over as they bid to increase margins, reduce costs and protect profitability in the competitive heat of the enduring soft market. But according to AXIS Capital president and CEO Albert Benchimol, there is a broader, more important and longer-term challenge that must also be addressed through the ongoing efficiency drive: value for money. “When I think of value, I think of helping our clients and partners succeed in their own endeavors. This means providing quick and responsive service, creative policy structures that address our customers’ coverage needs, best-in-class claims handling and trusting our people to pursue their own entrepreneurial goals,” says Benchimol. “While any one insurance policy may in itself offer good value, when aggregated, insurance is not necessarily seen as good value by clients. Our industry as a whole needs to deliver a better value proposition — and that means that all participants in the value chain will need to become much more efficient.” According to Benchimol — who prior to being appointed CEO of AXIS in 2012 served as the Bermuda-based insurance group’s CFO and also held senior executive positions at Partner Re, Reliance Group and Bank of Montreal — the days of paying out US$0.55-$0.60 in claims for every dollar of premium paid are over. “We need to start framing our challenge as delivering a 70 percent-plus loss ratio within a low 90s combined ratio,” he asserts. 
“Every player in the value chain needs to adopt efficiency-enhancing technology to lower our costs and pass those savings on to the customer.” With a surfeit of capital making it unlikely the insurance industry will return to its traditional cyclical nature any time soon, Benchimol says these changes have to be adopted for the long term. “Insurers have to evaluate their portfolios and product offerings to match customer needs with marketplace realities. We will need to develop new products to meet emerging demand; offer better value in the eyes of insureds; apply data, analytics and technology to all facets of our business; and become much more efficient,” he explains. Embracing technology The continued adoption and smarter use of data will be central to achieving this goal. “We’ve only begun to scratch the surface of what data we can access and insights we can leverage to make better, faster decisions throughout the risk transfer value chain,” Benchimol says. “If we use technology to better align our operations and costs with our customers’ needs and expectations, we will create and open up new markets because potential insureds will see more value in the insurance product.” “I admire companies that constantly challenge themselves and that are driven by data to make informed decisions — companies that don’t rest on their laurels and don’t accept the status quo” Technology, data and analytics have already brought improved efficiencies to the insurance market. This has allowed insurers to focus their efforts on targeted markets and develop applications to deliver improved, customized purchasing experiences and increase client satisfaction and engagement, Benchimol notes. The introduction of data modeling, he adds, has also played a key role in improving economic protection, making it easier for (re)insurance providers to evaluate risks and enter new markets, thereby increasing the amount of capacity available to protect insureds. 
“While this can sometimes raise pricing pressures, it has a positive benefit of bringing more affordable capacity to potential customers. This has been most pronounced in the development of catastrophe models in underinsured emerging markets, where capital hasn’t always been available in the past,” he says. The introduction of models made these markets more attractive to capital providers which, in turn, made developing custom insurance products more cost-effective and affordable for both insurers and their clients, Benchimol explains. However, there is no doubt the insurance industry has more to do if it is not only to improve its own profitability and offerings to customers, but also to stave off competition from external threats, such as disruptive innovators in the FinTech and InsurTech spheres. Strategic evolution “The industry’s inefficiencies and generally low level of customer satisfaction make it relatively easy prey for disruption,” Benchimol admits. However, he believes that the regulated and highly capital-intensive nature of insurance is such that established domain leaders will continue to thrive if they are prepared to beat innovators at their own game. “We need to move relatively quickly, as laggards may have a difficult time catching up,” he warns. “In order to thrive in the disruptive market economy, market leaders must take intelligent risks. This isn’t easy, but is absolutely necessary,” Benchimol says. “I admire companies that constantly challenge themselves and that are driven by data to make informed decisions — companies that don’t rest on their laurels and don’t accept the status quo.” “We need to start framing our challenge as delivering a 70 percent-plus loss ratio within a low 90s combined ratio” Against the backdrop of a rapidly evolving market and transformed business environment, AXIS took stock of its business at the start of 2016, evaluating its key strengths and reflecting on the opportunities and challenges in its path. 
What followed was an important strategic evolution. “Over the course of the year we implemented a series of strategic initiatives across the business to drive long-term growth and ensure we deliver the most value to our clients, employees and shareholders,” Benchimol says. “This led us to sharpen our focus on specialty risk, where we believe we have particular expertise. We implemented new initiatives to even further enhance the quality of our underwriting. We invested more in our data and analytics capabilities, expanded the focus in key markets where we feel we have the greatest relevance, and took action to acquire firms that allow us to expand our leadership in specialty insurance, such as our acquisition of specialty aviation insurer and reinsurer Aviabel and our recent offer to acquire Novae.” Another highlight for AXIS in 2016 was the launch of Harrington Re, co-founded with the Blackstone Group. “At AXIS, our focus on innovation also extends to how we look at alternative funding sources and our relationship with third-party capital, which centers on matching the right risk with the right capital,” Benchimol explains. “We currently have a number of alternative capital sources that complement our balance sheet and enable us to deliver enhanced capacity and tailored solutions to our clients and brokers.” Benchimol believes a significant competitive advantage for AXIS is that it is still small enough to be agile and responsive to customers’ needs, yet large enough to take advantage of its global capabilities and resources in order to help clients manage their risks. But like many of his competitors, Benchimol knows future success will be heavily reliant on how well AXIS melds human expertise with the use of data and technology. “We need to combine our ingenuity, innovation and values with the strength, speed and intelligence offered by technology, data and analytics. 
The ability to combine these two great forces — the art and science of insurance — is what will define the insurer of the future,” Benchimol states. The key, he believes, is to empower staff to make informed, data-driven decisions. “The human elements that are critical to success in the insurance industry are, among others: knowledge, creativity, service and commitment to our clients and partners. We need to operate within a framework that utilizes technology to provide a more efficient customer experience and is underpinned by enhanced data and analytics capabilities that allow us to make informed, intelligent decisions on behalf of our clients.” However, Benchimol insists insurers must embrace change while holding on to the traditional principles that underpinned insurance in the analog age, as these same principles must continue to do so into the future. “We must harness technology for good causes, while remaining true to the core values and universal strengths of our industry — a passion for helping people when they are down, a creativity in structuring products, and the commitment to keeping the promise we make to our clients to help them mitigate risks and ensure the security of their assets,” he says. “We must not forget these critical elements that comprise the heart of the insurance industry.”

NIGEL ALLEN
Quantum_computer_core
Quantum Leap
September 04, 2017

Much hype surrounds quantum processing. This is perhaps unsurprising given that it could create computing systems thousands (or millions, depending on the study) of times more powerful than current classical computing frameworks. The power locked within quantum mechanics has been recognized by scientists for decades, but it is only in recent years that its conceptual potential has jumped the theoretical boundary and started to take form in the real world. Since that leap, the “quantum race” has begun in earnest, with China, Russia, Germany and the U.S. out in front. Technology heavyweights such as IBM, Microsoft and Google are breaking new quantum ground each month, striving to move these processing capabilities from the laboratory into the commercial sphere. But before getting swept up in this quantum rush, let’s look at the mechanics of this processing potential. The quantum framework Classical computers are built upon a binary framework of “bits” (binary digits) of information that can exist in one of two definite states — zero or one, or “on or off.” Such systems process information in a linear, sequential fashion, similar to how the human brain solves problems. In a quantum computer, bits are replaced by “qubits” (quantum bits), which can operate in multiple states — zero, one or any state in between (referred to as quantum superposition). This means they can store much more complex data. If a bit can be thought of as a single note that starts and finishes, then a qubit is the sound of a huge orchestra playing continuously. What this state enables — largely in theory, but increasingly in practice — is the ability to process information at an exponentially faster rate. This is based on the interaction between the qubits. “Quantum entanglement” means that rather than operating as individual pieces of information, all the qubits within the system operate as a single entity. 
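The superposition and entanglement described above can be sketched with plain linear algebra. This is a minimal state-vector illustration, not any vendor's quantum SDK; the gate names follow standard quantum computing conventions.

```python
import numpy as np

# A classical bit is 0 or 1; a qubit is a 2-component complex state vector.
zero = np.array([1, 0], dtype=complex)   # the |0> state
one = np.array([0, 1], dtype=complex)    # the |1> state

# A Hadamard gate puts a qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
superposed = H @ zero
probabilities = np.abs(superposed) ** 2  # 50/50 chance of measuring 0 or 1

# Entanglement: applying a CNOT gate to (superposed qubit, |0>) yields the
# Bell state (|00> + |11>) / sqrt(2). The two qubits no longer have
# independent descriptions.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(superposed, zero)
print(np.round(np.abs(bell) ** 2, 3))    # only |00> and |11> carry probability
```

Measuring either qubit of the resulting Bell state instantly determines the other, which is the "single entity" behavior described above; classical simulation like this scales exponentially with qubit count, which is precisely why real quantum hardware is needed.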
From a computational perspective, this creates an environment where multiple computations encompassing exceptional amounts of data can be performed virtually simultaneously. Further, this beehive-like state of collective activity means that when new information is introduced, its impact is instantly transferred to all qubits within the system. Getting up to processing speed To deliver the levels of interaction necessary to capitalize on quantum power requires a system with multiple qubits. And this is the big challenge. Quantum information is incredibly brittle. Creating a system that can contain and maintain these highly complex systems with sufficient controls to support analytical endeavors at a commercially viable level is a colossal task. In March, IBM announced IBM Q — part of its ongoing efforts to create a commercially available universal quantum computing system. This included two different processors: a 16-qubit processor to allow developers and programmers to run quantum algorithms; and a 17-qubit commercial processor prototype — its most powerful quantum unit to date. At the launch, Arvind Krishna, senior vice president and director of IBM Research and Hybrid Cloud, said: “The significant engineering improvements announced today will allow IBM to scale future processors to include 50 or more qubits, and demonstrate computational capabilities beyond today’s classical computing systems.” “a major challenge is the simple fact that when building such systems, few components are available off-the-shelf” Matthew Griffin 311 Institute IBM also devised a new metric for measuring key aspects of quantum systems called “Quantum Volume.” These cover qubit quality, potential system error rates and levels of circuit connectivity. According to Matthew Griffin, CEO of innovation consultants the 311 Institute, a major challenge is the simple fact that when building such systems, few components are available off-the-shelf or are anywhere near maturity. 
“From compute to memory to networking and data storage,” he says, “companies are having to engineer a completely new technology stack. For example, using these new platforms, companies will be able to process huge volumes of information at near instantaneous speeds, but even today’s best and fastest networking and storage technologies will struggle to keep up with the workloads.” In response, he adds that firms are looking at “building out DNA and atomic scale storage platforms that can scale to any size almost instantaneously,” with Microsoft aiming to have an operational system by 2020. “Other challenges include the operating temperature of the platforms,” Griffin continues. “Today, these must be kept as close to absolute zero (minus 273.15 degrees Celsius) as possible to maintain a high degree of processing accuracy. One day, it’s hoped that these platforms will be able to operate at, or near, room temperature. And then there’s the ‘fitness’ of the software stack — after all, very few, if any, software stacks today can handle anything like the demands that quantum computing will put onto them.” Putting quantum computing to use One area where quantum computing has major potential is in optimization challenges. These involve the ability to analyze immense data sets to establish the best possible solutions to achieve a particular outcome. And this is where quantum processing could offer the greatest benefit to the insurance arena — through improved risk analysis. “From an insurance perspective,” Griffin says, “some opportunities will revolve around the ability to analyze more data, faster, to extrapolate better risk projections. 
This could allow dynamic pricing, but also help better model systemic risk patterns that are an increasing by-product of today’s world, for example, in cyber security, healthcare and the internet of things, to name but a fraction of the opportunities.” Steve Jewson, senior vice president of model development at RMS, adds: “Insurance risk assessment is about considering many different possibilities, and quantum computers may be well suited for that task once they reach a sufficient level of maturity.” However, he is wary of overplaying the quantum potential. “Quantum computers hold the promise of being superfast,” he says, “but probably only for certain specific tasks. They may well not change 90 percent of what we do. But for the other 10 percent, they could really have an impact. “I see quantum computing as having the potential to be like GPUs [graphics processing units] — very good at certain specific calculations. GPUs turned out to be fantastically fast for flood risk assessment, and have revolutionized that field in the last 10 years. Quantum computers have the potential to revolutionize certain specific areas of insurance in the same way.” On the insurance horizon? It will be at least five years before quantum computing starts making a meaningful difference to businesses or society in general — and from an insurance perspective that horizon is probably much further off. “Many insurers are still battling the day-to-day challenges of digital transformation,” Griffin points out, “and the fact of the matter is that quantum computing … still comes some way down the priority list.” “In the next five years,” says Jewson, “progress in insurance tech will be about artificial intelligence and machine learning, using GPUs, collecting data in smart ways and using the cloud to its full potential. Beyond that, it could be about quantum computing.” According to Griffin, however, the insurance community should be seeking to understand the quantum realm. 
“I would suggest they explore this technology, talk to people within the quantum computing ecosystem and their peers in other industries, such as financial services, who are gently ‘prodding the bear.’ Being informed about the benefits and the pitfalls of a new technology is the first step in creating a well thought through strategy to embrace it, or not, as the case may be.” Cracking the code Any new technology brings its own risks — but for quantum computing those risks take on a whole new meaning. A major concern is the potential for quantum computers, given their astronomical processing power, to be able to bypass most of today’s data encryption codes.  “Once ‘true’ quantum computers hit the 1,000 to 2,000 qubit mark, they will increasingly be able to be used to crack at least 70 percent of all of today’s encryption standards,” warns Griffin, “and I don’t need to spell out what that means in the hands of a cybercriminal.” Companies are already working to pre-empt this catastrophic data breach scenario, however. For example, PwC announced in June that it had “joined forces” with the Russian Quantum Center to develop commercial quantum information security systems. “As companies apply existing and emerging technologies more aggressively in the push to digitize their operating models,” said Igor Lotakov, country managing partner at PwC Russia, following the announcement, “the need to create efficient cyber security strategies based on the latest breakthroughs has become paramount. If companies fail to earn digital trust, they risk losing their clients.”

Helen Yates
Golden Gate
The Peril of Ignoring The Tail
September 04, 2017

Drawing on several new data sources and on insights from recent earthquakes into how different fault segments might interact in future events, Version 17 of the RMS North America Earthquake Models sees the frequency of larger events increasing, making for a fatter tail. EXPOSURE asks what this means for (re)insurers from a pricing and exposure management perspective. Recent major earthquakes, including the M9.0 Tohoku Earthquake in Japan in 2011 and the Canterbury Earthquake Sequence in New Zealand (2010-2011), have offered new insight into the complexities and interdependencies of losses that occur following major events. This insight, as well as other data sources, was incorporated into the latest seismic hazard maps released by the U.S. Geological Survey (USGS). In addition to engaging with USGS on its 2014 update, RMS went on to invest more than 100 person-years of work in implementing the main findings of this update as well as comprehensively enhancing and updating all components in its North America Earthquake Models (NAEQ). The update reflects the deep complexities inherent in the USGS model and confirms the adage that “earthquake is the quintessential tail risk.” Among the changes to the RMS NAEQ models was the recognition that some faults can interconnect, creating correlations of risk that were not previously appreciated. Lessons from Kaikoura While there is still a lot of uncertainty surrounding tail risk, the new data sets provided by USGS and others have improved the understanding of events with a longer return period. “Global earthquakes are happening all of the time, not all large, not all in areas with high exposures,” explains Renee Lee, director, product management at RMS. 
“Instrumentation has become more advanced and coverage has expanded such that scientists now know more about earthquakes than they did eight years ago when NAEQ was last released in Version 9.0.” This includes understanding of how faults creep and release energy, how faults can interconnect, and how ground motions attenuate through soil layers and over large distances. “Soil plays a very important role in the earthquake risk modeling picture,” says Lee. “Soil deposits can amplify ground motions, which can potentially magnify a building’s response, leading to severe damage.” The 2016 M7.8 earthquake in Kaikoura, on New Zealand’s South Island, is a good example of a complex rupture where fault segments connected in more ways than had previously been realized. In Kaikoura, at least six fault segments were involved, where the rupture “jumped” from one fault segment to the next, producing a single larger earthquake. “The Kaikoura quake was interesting in that we did have some complex energy release moving from fault to fault,” says Glenn Pomeroy, CEO of the California Earthquake Authority (CEA). “We can’t hide our heads in the sand and pretend that scientific awareness doesn’t exist. The probability has increased for a very large, but very infrequent, event, and we need to determine how to manage that risk.” San Andreas correlations Looking at California, the updated models include events that extend from the north of San Francisco to the south of Palm Springs, correlating exposures along the length of the San Andreas fault. While the prospect of a major earthquake impacting both northern and southern California is considered extremely remote, it will nevertheless affect how reinsurers seek to diversify different types of quake risk within their book of business. “In the past, earthquake risk models have considered Los Angeles as being independent of San Francisco,” says Paul Nunn, head of catastrophe risk modeling at SCOR. 
“Now we have to consider that these cities could have losses at the same time (following a full rupture of the San Andreas Fault).

“However, it doesn’t make that much difference in the sense that these events are so far out in the tail … and we’re not selling much coverage beyond the 1-in-500-year or 1-in-1,000-year return period. The programs we’ve sold will already have been exhausted long before you get to that level of severity.”

While the contribution of tail events to return period losses is significant, as Nunn explains, this could be more of an issue for insurance companies than (re)insurers from a capitalization standpoint. “From a primary insurance perspective, the bigger the magnitude and event footprint, the more separate claims you have to manage. So, part of the challenge is operational — in terms of mobilizing loss adjusters and claims handlers — but primary insurers also have the risk that losses from tail events could go beyond the (re)insurance program they have bought.

“It’s less of a challenge from the perspective of global (re)insurers, because most of the risk we take is on a loss-limited basis — we sell layers of coverage,” he continues. “Saying that, pricing for the top layers should always reflect the prospect of major events in the tail and the uncertainty associated with that.”

He adds: “The magnitude of the Tohoku earthquake event is a good illustration of the inherent uncertainties in earthquake science and wasn’t represented in modeled scenarios at that time.”

While U.S. regulation stipulates that carriers writing quake business should capitalize to the 1-in-200-year event level, in Canada capital requirements are more conservative in an effort to better account for tail risk. “So, Canadian insurance companies should have less overhang out of the top of their (re)insurance programs,” says Nunn.
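Nunn’s point about writing risk “on a loss limited basis” can be made concrete. A reinsurer selling an excess-of-loss layer recovers nothing below the attachment point and pays nothing beyond exhaustion, so tail events past a certain severity add no further loss to that layer. A minimal Python sketch; the function and the figures are purely illustrative, not from SCOR or the RMS model:

```python
def layer_loss(gross_loss, attachment, limit):
    """Ceded loss to an excess-of-loss layer: the slice of the gross
    loss between attachment and attachment + limit."""
    return max(0.0, min(gross_loss, attachment + limit) - attachment)

# A hypothetical $500M xs $1B layer is exhausted once the gross loss
# reaches $1.5B; a $2.5B tail event costs the layer no more than a
# $1.5B event does.
for gross in (0.8e9, 1.2e9, 2.5e9):
    print(f"gross {gross:.1e} -> ceded {layer_loss(gross, 1.0e9, 0.5e9):.1e}")
```

This is why, as Nunn notes, the capped layer writer’s concern with the far tail is mainly one of pricing the top layers, while a primary insurer retains whatever falls above its purchased program.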
Need for post-event funding

For the CEA, the updated earthquake models could reinvigorate discussions around the need for a mechanism to raise additional claims-paying capacity following a major earthquake. Set up after the Northridge Earthquake in 1994, the CEA is a not-for-profit, publicly managed and privately funded earthquake pool.

“It is pretty challenging for a stand-alone entity to take on large tail risk all by itself,” says Pomeroy. “We have, from time to time, looked at the possibility of creating some sort of post-event risk-transfer mechanism.

“A few years ago, for instance, we had a proposal in front of the U.S. Congress that would have created the ability for the CEA to do some post-event borrowing if we needed to pay for additional claims,” he continues. “It would have put the U.S. government in the position of guaranteeing our debt. The proposal didn’t get signed into law, but it is one example of how you could create additional claims-paying capacity for that very large, very infrequent event.”

The CEA leverages both traditional and non-traditional risk-transfer mechanisms. “Risk transfer is important. No one entity can take it on alone,” says Pomeroy. “Through risk transfer from insurer to (re)insurer, the risk is spread broadly, with the capital markets entering as another source of claims-paying capacity and another way of diversifying the concentration of the risk.

“We manage our exposure very carefully by staying within our risk-transfer guidelines,” he continues. “When we look at spreading our risk, we look at spreading it through a large number of (re)insurance companies from 15 countries around the world.
And we know the (re)insurers have their own strict guidelines on how big their California quake exposure should be.”

The prospect of a higher frequency of larger events producing a “fatter” tail also brings the prospect of an overall reduction in average annual loss (AAL) for (re)insurance portfolios, a factor likely to add to pricing pressure as the industry approaches the key January 1 renewal date, predicts Nunn.

“The AAL for Los Angeles coming down in the models will impact the industry in the sense that it will affect pricing and how much probable maximum loss people think they’ve got. Most carriers are busy digesting the changes and carrying out due diligence on the new model updates.

“Although the eye-catching change is the possibility of the ‘big one,’ the bigger immediate impact on the industry is what’s happening at lower return periods, where we’re selling a lot of coverage,” he says. “LA was a big driver of risk in the California quake portfolio and that’s coming down somewhat, while the risk in San Francisco is going up. So (re)insurers will be considering how to adjust the balance between the LA and San Francisco business they’re writing.”
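The tension Nunn describes, where tail frequency rises while AAL falls, follows from the shape of the exceedance-probability (EP) curve: AAL is the area under the curve of loss against annual exceedance probability, and the far tail contributes little area even when its probabilities grow. A toy sketch, assuming an entirely made-up EP curve with no relation to actual model output:

```python
# Toy EP curve: (annual exceedance probability, loss in $bn).
# Illustrative numbers only, not from the RMS NAEQ models.
ep_curve = [
    (0.10, 0.5),
    (0.01, 5.0),
    (0.002, 20.0),   # 1-in-500-year
    (0.001, 40.0),   # 1-in-1,000-year
]

def aal(curve):
    """Approximate AAL by trapezoidal integration of loss over
    exceedance probability, truncating beyond the last point."""
    total = 0.0
    pts = [(1.0, 0.0)] + curve  # loss taken as zero at p = 1
    for (p1, l1), (p2, l2) in zip(pts, pts[1:]):
        total += (p1 - p2) * (l1 + l2) / 2.0
    return total

print(f"AAL ~ ${aal(ep_curve):.3f}bn")
```

With these toy figures the segment beyond the 1-in-500-year point adds only about $0.03bn to an AAL of roughly $0.60bn, which is why changes at the lower return periods, where most coverage is sold, matter more to pricing than the headline “big one.”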
