Nigel Allen, September 05, 2018
A Risk-Driven Business

Following Tower Insurance’s switch to risk-based pricing in New Zealand, EXPOSURE examines how recent market developments may herald a more fundamental industry shift.

The ramifications of the Christchurch earthquakes of 2010-11 continue to reverberate through the New Zealand insurance market. The country’s Earthquake Commission (EQC), which provides government-backed natural disaster insurance, is forecast to have paid around NZ$11 billion (US$7.3 billion) by the time it settles its final claim.

The devastating losses exposed significant shortfalls in the country’s insurance market. These included major deficiencies in insurer data, gaps in portfolio management and expansive policy wordings that left carriers exposed to numerous unexpected losses. Since then, much has changed. Policy terms have been tightened, restrictions have been introduced on coverage and concerted efforts have been made to bolster databases.

On July 1, 2019, the EQC increased the cap limit on the government-mandated residential cover it provides to all householders from NZ$100,000 (US$66,000), a figure set in 1993, to NZ$150,000. This is a significant increase, but it remains well below both the average house price in New Zealand as of December 2017, which stood at NZ$669,565, and the average rebuild cost of NZ$350,000. The EQC has also removed contents coverage.

More recently, however, one development has taken place that has the potential to have a much more profound impact on the market.

Risk-Based Pricing

In March 2018, New Zealand insurer Tower Insurance announced a move to risk-based pricing for home insurance. The aim is to ensure premium levels are commensurate with individual property risk profiles, with those in highly exposed areas experiencing a price rise on the earthquake component of their coverage.

Describing the shift as a “fairer and more equitable way of pricing risk,” Tower CEO Richard Harding says this was the “right thing to do” both for the “long-term benefit of New Zealand” and for customers, with risk-based pricing “the fairest way to distribute the costs we face as an insurer.”

The move has generated much media coverage, with stories highlighting instances of triple-digit percentage hikes in earthquake-prone regions such as Wellington. Yet what has generated significantly fewer column inches has been the marginal declines available to the vast majority of households in less seismically active regions, as the high-risk earthquake burden on their premiums is reduced.

A key factor in Tower’s decision was the increasing quality and granularity of the underwriting data at its disposal. “Tower has always focused on the quality of its data and has invested heavily in ensuring it has the highest-resolution information available,” says Michael Drayton, senior risk modeler for RMS, based in New Zealand.

In fact, in the aftermath of the Christchurch earthquakes, RMS worked with Tower as it rebuilt its New Zealand High-Definition (HD) Earthquake Model, in part because of the caliber of Tower’s data. Prior to the earthquakes, claims data was in very short supply, given that there had been few previous events with large-scale impacts on highly built-up areas.

“On the vulnerability side,” Drayton explains, “we had virtually no local claims data to build our damage functions. Our previous model had used comparisons of building performance in other earthquake-exposed regions.
After Christchurch, we suddenly had access to billions of dollars of claims information.”

RMS sourced data from numerous parties, including EQC, Tower and geoscience research firm GNS Science, as it reconstructed the model from this swell of information. “RMS had a model that had served the market well for many years,” Drayton explains. “On the hazard side, the fundamentals remained the same — the highest hazard is along the plate boundary, which runs offshore along the east coast of North Island, traversing over to the western edge of South Island. But we had now gathered new information on fault lines, activity rates, magnitudes and subduction zones. We also updated our ground motion prediction equations.”

One of the most high-profile model developments was the advanced liquefaction module. “The 2010-11 earthquakes generated probably the most extensive liquefaction in a built-up area seen in a developed country. With the new information, we were now able to capture the risk at much higher gradients and in much greater resolution,” says Drayton.

This data surge enabled RMS to construct its New Zealand Earthquake HD Model on a variable-resolution grid set at a far more localized level. In turn, this has helped give Tower sufficient confidence in the granularity and accuracy of its data at the property level to adopt risk-based pricing.

The Ripple Effects

As homeowners received their renewal notices, the reality of risk-based pricing started to sink in. Tower is the third-largest insurer for domestic household, contents and private motor cover in New Zealand and faces stiff competition. Over 70 percent of the market is in the hands of two players, with IAG holding around 47 percent and Suncorp approximately 25 percent.

News reports also suggested movement from the larger players. AMI and State, both owned by IAG, announced that three-quarters of their policyholders — those at heightened risk of earthquake, landslide or flood — will see an average annual premium increase of NZ$91 (US$60); the remaining quarter, at lower risk, will see decreases averaging NZ$54 per year. A handful of households could see increases or decreases of up to NZ$1,000. According to the news website Stuff, IAG has not changed premiums for its NZI policyholders; NZI sells house insurance policies through brokers.

“One interesting dynamic is that a small number of start-ups are now entering the market with the same risk-based pricing stance taken by Tower,” Drayton points out. “These are companies with new purpose-built IT systems that are small and nimble and able to target niche sectors.”

“It’s certainly a development to watch closely,” he continues, “as it raises the potential for larger players to be selected against if they are unable to respond effectively. It will be interesting to see if the rate of these new entrants increases.”

The move from IAG suggests risk-based pricing will extend beyond the earthquake component of cover to flood-related elements. “Flood is not a reinsurance peril for New Zealand, but it is an attritional one,” Drayton adds. “Then there is the issue of rising sea levels and the potential for coastal flooding, which is a major cause for concern. So, the risk-based pricing shift is feeding into climate change discussions too.”
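Drayton’s point about larger players being “selected against” can be made concrete with a toy simulation. The sketch below, with all figures invented, compares a hypothetical insurer charging one flat market-average premium against a rival that prices each home to its modeled expected loss; customers simply take the cheaper quote.

```python
# Toy adverse-selection simulation (all numbers invented for illustration).
# One insurer charges a flat premium based on the market-average expected
# loss; a rival quotes each home its own modeled expected loss. Customers
# choose whichever quote is cheaper.
import random

random.seed(1)
LOADING = 1.2  # expense/profit loading applied by both insurers

homes = [random.uniform(100, 2000) for _ in range(10_000)]  # expected annual loss per home, NZ$
flat_premium = LOADING * sum(homes) / len(homes)

# A home stays on the flat rate only when that quote beats its risk-based
# quote, i.e. when its own risk is above the market average.
flat_book = [h for h in homes if flat_premium < LOADING * h]

avg_risk_kept = sum(flat_book) / len(flat_book)
print(f"flat premium charged:           NZ${flat_premium:,.0f}")
print(f"avg expected loss of book kept: NZ${avg_risk_kept:,.0f}")
# The flat-rate insurer retains only the higher-risk homes, so its average
# cost now exceeds the average it priced for: it has been selected against.
```

In a real market the migration is slower and noisier, but the direction of the effect is the same.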
A Fundamental Shift

Policyholders in risk-exposed areas such as Wellington were almost totally unaware of how much higher their insurance premiums should have been given their property exposure, having been largely shielded from the risk reality of earthquakes in recent years. The move to risk-based pricing will change that.

Drayton agrees that recent developments are opening the eyes of homeowners. “There is a growing realization that New Zealand’s insurance market has operated very differently from other insurance markets, and that this is now changing.”

One major marketwide development in recent years has been the move from full replacement cover to fixed sums insured in household policies. “This has a lot of people worried they might not be covered,” he explains. “Whereas before, people simply assumed that in the event of a big loss the insurer would cover it all, now they’re slowly realizing it no longer works like that. This will require a lot of policyholder education and will take time.”

At a more foundational level, current market dynamics call into question the fundamental role of insurance, exposing the insurer’s conflicted position as both a facilitator of risk pooling and a profit-making enterprise. When investment returns outweighed underwriting profit, cross-subsidization did not appear to be a big issue. Current dynamics, however, mean the operating model is squarely focused on underwriting returns, which favors risk-based pricing.

Cross-subsidization is the basis upon which the EQC is built, but is it fair? Twenty cents in every NZ$100 (US$66) of home or contents fire insurance premium, up to a maximum of NZ$100,000 insured, is passed on to the EQC. While to date there has been limited government response to risk-based pricing, the government is monitoring the situation closely, given the broader implications.

Looking globally, RMS chief research officer Robert Muir-Wood, writing in an RMS blog, also raises the question of whether “flat-rated” schemes, like the French cat nat scheme, will survive now that it has become clear how to use risk models to calculate the wide differentials in the underlying cost of the risk. He asks whether such schemes “are established in the name of ‘solidarity’ or ignorance.”

While there is no evidence yet, current developments raise the potential for certain risks to become uninsurable. Increasingly granular data, combined with the drive for greater profitability, may cause a downward spiral in a market built on a shared burden.

Drayton adds: “Potential uninsurability has more to do with land-use planning and building consent regimes, and insurers shouldn’t be paying the price for poor planning decisions. Ironically, earthquake loading codes are very sophisticated and have evolved to recognize the fine gradations in earthquake risk provided by localized data. In fact, they are so refined that structural engineers remark that they are too nuanced and need to be simpler. But if you are building in a high-risk area, it’s not just designing for the hazard, it is also managing the potential financial risk.”

He concludes: “The market shifts we are seeing today pose a multitude of questions and few clear answers. However, the only constant running through all these discussions is that they are all data driven.”

Making the Move

Key to understanding the rationale behind the shift to risk-based pricing is the broader economic context of New Zealand, says Tower CEO Richard Harding. “The New Zealand economy is comparatively small,” he explains, “and we face a range of unique climatic and geological risks. If we don’t plan for and mitigate these risks, there is a chance that reinsurers will charge insurers more or restrict cover.
“Before this happens, we need to educate the community, government, councils and regulators, and by moving toward risk-based pricing, we’re putting a signal into the market to drive social change through these organizations.

“These signals will help demonstrate to councils and government that more needs to be done to plan for and mitigate natural disasters and climate change.”

Harding feels that this risk-based pricing shift is a natural market evolution. “When you look at global trends, this is happening around the world. So, given that we face a number of large risks here in New Zealand, in some respects, it’s surprising it hasn’t happened sooner,” he says.

While some parties have raised concerns that there may be a fall in insurance uptake in highly exposed regions, Harding does not believe this will be the case. “For the average home, insurance may be more expensive than it currently is, but it won’t be unattainable,” he states.

Moving forward, he says that Tower is working to extend its risk-based pricing approach beyond the earthquake component of its cover, stating that the firm “is actively pursuing risk-based pricing for flood and other natural perils, and over the long term we would expect other insurers to follow in our footsteps.”

In terms of the potential wider implications if this occurs, Harding says that such a development would compel government, councils and other organizations to change how they view risk in their planning processes. “I think it will start to drive customers to consider risk more holistically and take this into account when they build and buy homes,” he concludes.
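To make the pricing mechanics concrete: under risk-based pricing, the earthquake component of a home premium is tied to the modeled expected loss at that address, with the EQC levy sitting alongside. The sketch below is purely illustrative, not Tower’s actual rating algorithm; the loading and AAL figures are invented, while the levy formula (20 cents per NZ$100, capped at NZ$100,000 of cover) is as described above.

```python
# Illustrative sketch of risk-based pricing mechanics, not any insurer's
# actual rating algorithm. AAL figures and the loading are invented.

def earthquake_premium(modeled_aal_nzd, loading=1.35):
    """Risk-based earthquake component: property-level modeled average
    annual loss (AAL), grossed up by an expense/profit loading."""
    return modeled_aal_nzd * loading

def eqc_levy(sum_insured_nzd):
    """EQC levy as described in the article: 20 cents per NZ$100 of cover,
    up to a maximum of NZ$100,000 insured (so at most NZ$200)."""
    return min(sum_insured_nzd, 100_000) * 0.20 / 100

# Two hypothetical homes with the same sum insured but different modeled risk:
for label, aal in (("Wellington hillside", 900.0), ("low-seismicity region", 120.0)):
    total = earthquake_premium(aal) + eqc_levy(750_000)
    print(f"{label}: earthquake component plus levy ~ NZ${total:,.0f}")
```

Priced this way, the earthquake component simply scales with modeled risk at the address, which is why highly exposed Wellington properties saw steep rises while most other households saw marginal declines.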

Helen Yates, September 04, 2017
The Peril of Ignoring The Tail

Drawing on several new data sources and gaining a number of new insights from recent earthquakes on how different fault segments might interact in future events, Version 17 of the RMS North America Earthquake Models sees the frequency of larger events increasing, making for a fatter tail. EXPOSURE asks what this means for (re)insurers from a pricing and exposure management perspective.

Recent major earthquakes, including the M9.0 Tohoku Earthquake in Japan in 2011 and the Canterbury Earthquake Sequence in New Zealand (2010-2011), have offered new insight into the complexities and interdependencies of losses that occur following major events. This insight, as well as other data sources, was incorporated into the latest seismic hazard maps released by the U.S. Geological Survey (USGS).

In addition to engaging with the USGS on its 2014 update, RMS went on to invest more than 100 person-years of work in implementing the main findings of this update and in comprehensively enhancing and updating all components of its North America Earthquake Models (NAEQ). The update reflects the deep complexities inherent in the USGS model and confirms the adage that “earthquake is the quintessential tail risk.” Among the changes to the RMS NAEQ models was the recognition that some faults can interconnect, creating correlations of risk that were not previously appreciated.

Lessons from Kaikoura

While there is still a lot of uncertainty surrounding tail risk, the new data sets provided by the USGS and others have improved the understanding of events with longer return periods. “Global earthquakes are happening all of the time, not all large, not all in areas with high exposures,” explains Renee Lee, director, product management at RMS. “Instrumentation has become more advanced and coverage has expanded such that scientists now know more about earthquakes than they did eight years ago when NAEQ was last released in Version 9.0.”

This includes understanding of how faults creep and release energy, how faults can interconnect, and how ground motions attenuate through soil layers and over large distances. “Soil plays a very important role in the earthquake risk modeling picture,” says Lee. “Soil deposits can amplify ground motions, which can potentially magnify the building’s response, leading to severe damage.”

The 2016 M7.8 earthquake in Kaikoura, on New Zealand’s South Island, is a good example of a complex rupture where fault segments connected in more ways than had previously been realized. In Kaikoura, at least six fault segments were involved, with the rupture “jumping” from one fault segment to the next, producing a single larger earthquake.

“The Kaikoura quake was interesting in that we did have some complex energy release moving from fault to fault,” says Glenn Pomeroy, CEO of the California Earthquake Authority (CEA). “We can’t hide our heads in the sand and pretend that scientific awareness doesn’t exist. The probability has increased for a very large, but very infrequent, event, and we need to determine how to manage that risk.”

San Andreas Correlations

Looking at California, the updated models include events that extend from the north of San Francisco to the south of Palm Springs, correlating exposures along the length of the San Andreas Fault. While the prospect of a major earthquake impacting both northern and southern California is considered extremely remote, it will nevertheless affect how reinsurers seek to diversify different types of quake risk within their book of business.
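Why newly correlated events fatten the tail is easy to see with a toy occurrence exceedance calculation. In this view, the return period of a loss level is the reciprocal of the combined annual rate of all events exceeding it; the event loss table below is invented purely for illustration and comes from no actual model.

```python
# Toy occurrence exceedance-probability calculation from an invented event
# loss table. Event rates and losses are made up for illustration only.

events = [
    # (description, annual occurrence rate, loss in $bn)
    ("LA-area event",                      0.010,  20.0),
    ("SF-area event",                      0.008,  25.0),
    ("full San Andreas rupture (LA + SF)", 0.0005, 90.0),  # newly modeled correlation
]

def return_period(events, threshold_bn):
    """Return period (years) of exceeding a loss threshold, assuming Poisson
    occurrences: exceedance rate = sum of rates of events above threshold."""
    rate = sum(r for _, r, loss in events if loss > threshold_bn)
    return float("inf") if rate == 0 else 1.0 / rate

for thr in (15.0, 50.0):
    print(f"loss > ${thr:.0f}bn: ~1-in-{return_period(events, thr):,.0f} years")
```

Removing the correlated full-rupture event would make losses above $50 billion impossible in this toy table; adding it creates a roughly 1-in-2,000-year point in the far tail while barely changing the roughly 1-in-54-year frequency of more moderate losses. That is the sense in which the tail gets fatter.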
“In the past, earthquake risk models have considered Los Angeles as being independent of San Francisco,” says Paul Nunn, head of catastrophe risk modeling at SCOR. “Now we have to consider that these cities could have losses at the same time (following a full rupture of the San Andreas Fault).

“However, it doesn’t make that much difference in the sense that these events are so far out in the tail … and we’re not selling much coverage beyond the 1-in-500-year or 1-in-1,000-year return period. The programs we’ve sold will already have been exhausted long before you get to that level of severity.”

While the contribution of tail events to return period losses is significant, as Nunn explains, this could be more of an issue for insurance companies than for (re)insurers from a capitalization standpoint. “From a primary insurance perspective, the bigger the magnitude and event footprint, the more separate claims you have to manage. So, part of the challenge is operational — in terms of mobilizing loss adjusters and claims handlers — but primary insurers also have the risk that losses from tail events could go beyond the (re)insurance program they have bought.

“It’s less of a challenge from the perspective of global (re)insurers, because most of the risk we take is on a loss-limited basis — we sell layers of coverage,” he continues. “Saying that, pricing for the top layers should always reflect the prospect of major events in the tail and the uncertainty associated with that.”

He adds: “The magnitude of the Tohoku earthquake is a good illustration of the inherent uncertainties in earthquake science; it wasn’t represented in modeled scenarios at that time.”

While U.S. regulation stipulates that carriers writing quake business should capitalize to the 1-in-200-year event level, in Canada capital requirements are more conservative in an effort to better account for tail risk. “So, Canadian insurance companies should have less overhang out of the top of their (re)insurance programs,” says Nunn.

Need for Post-Event Funding

For the CEA, the updated earthquake models could reinvigorate discussions around the need for a mechanism to raise additional claims-paying capacity following a major earthquake. Set up after the Northridge Earthquake in 1994, the CEA is a not-for-profit, publicly managed and privately funded earthquake pool.

“It is pretty challenging for a stand-alone entity to take on large tail risk all by itself,” says Pomeroy. “We have, from time to time, looked at the possibility of creating some sort of post-event risk-transfer mechanism.

“A few years ago, for instance, we had a proposal in front of the U.S. Congress that would have created the ability for the CEA to do some post-event borrowing if we needed to pay additional claims,” he continues. “It would have put the U.S. government in the position of guaranteeing our debt. The proposal didn’t get signed into law, but it is one example of how you could create additional claims-paying capacity for that very large, very infrequent event.”

The CEA leverages both traditional and non-traditional risk-transfer mechanisms. “Risk transfer is important. No one entity can take it on alone,” says Pomeroy.
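The loss-limited layers Nunn describes, and the risk that a primary insurer’s tail losses outrun the program it has bought, can be made concrete with a minimal sketch. All figures are invented:

```python
# Hedged sketch of a reinsurance layer payout: coverage attaches above a
# retention (attachment point) and is capped by the layer limit. This is a
# generic illustration, not any specific program; all figures are invented.

def layer_loss(gross_loss, attachment, limit):
    """Loss ceded to a layer of `limit` excess of `attachment`."""
    return min(max(gross_loss - attachment, 0.0), limit)

# A hypothetical primary insurer buying $500m xs $100m of earthquake cover:
for gross in (80.0, 300.0, 900.0):  # gross losses in $m
    ceded = layer_loss(gross, attachment=100.0, limit=500.0)
    print(f"gross ${gross:>5.0f}m -> ceded ${ceded:>5.0f}m, retained ${gross - ceded:>5.0f}m")
```

The last case shows the overhang: a tail event larger than the program leaves everything above exhaustion with the primary insurer, while a reinsurer writing only the layer has its downside capped at the limit.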
Pomeroy continues: “Through risk transfer from insurer to (re)insurer, the risk is spread broadly, with the entrance of the capital markets providing another source of claims-paying capability and another way of diversifying the concentration of the risk.

“We manage our exposure very carefully by staying within our risk-transfer guidelines. When we look at spreading our risk, we look at spreading it through a large number of (re)insurance companies from 15 countries around the world. And we know the (re)insurers have their own strict guidelines on how big their California quake exposure should be.”

The prospect of a higher frequency of larger events producing a “fatter” tail comes alongside an overall reduction in average annual loss (AAL) for (re)insurance portfolios, a factor that is likely to add to pricing pressure as the industry approaches the key January 1 renewal date, predicts Nunn.

“The AAL for Los Angeles coming down in the models will impact the industry in the sense that it will affect pricing and how much probable maximum loss people think they’ve got. Most carriers are busy digesting the changes and carrying out due diligence on the new model updates.

“Although the eye-catching change is the possibility of the ‘big one,’ the bigger immediate impact on the industry is what’s happening at lower return periods, where we’re selling a lot of coverage,” he says. “LA was a big driver of risk in the California quake portfolio and that’s coming down somewhat, while the risk in San Francisco is going up. So (re)insurers will be considering how to adjust the balance between the LA and San Francisco business they’re writing.”
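How a fatter tail can coexist with a lower AAL follows directly from the definition: AAL is the rate-weighted sum of event losses, so a rare new tail event contributes little to it, while lower rates or severities for frequent events subtract more. A minimal sketch with invented numbers:

```python
# AAL from a toy event loss table: AAL = sum(rate_i * mean_loss_i).
# All rates and losses are invented; they only illustrate how adding a rare
# tail event can coexist with a lower overall AAL.

base    = [(0.020, 10.0), (0.010, 18.0)]                  # (annual rate, loss $bn)
revised = [(0.015,  9.0), (0.008, 16.0), (0.0005, 90.0)]  # frequent risk down, rare "big one" added

def aal(elt):
    return sum(rate * loss for rate, loss in elt)

print(f"base AAL:    {aal(base):.3f} $bn/year")     # 0.380
print(f"revised AAL: {aal(revised):.3f} $bn/year")  # 0.308, lower despite the fatter tail
```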

Megan Arnold, July 25, 2016
Learning From New Zealand to Avoid Surprises

Secondary hazards, such as liquefaction, and the earthquake sequencing that hit the low-seismicity area of Canterbury, New Zealand, in 2010 and 2011 contributed significantly to the overall loss figures, explains RMS seismology expert Megan Arnold.

The phenomenon of “loss creep” has long been an issue associated with major catastrophes, and slight revisions in expected losses are to be expected. However, when unanticipated losses occur and an insurance or reinsurance company radically revises its loss figures upwards, there can be a detrimental impact on the business. While catastrophe models and exposure management tools have evolved considerably, every major catastrophe is a necessary learning experience. This includes the 2010 and 2011 Canterbury earthquake sequence in New Zealand.

Figure 1. Example of liquefaction that caused significant damage to buildings during the Canterbury earthquake sequence. The photo was taken during the RMS reconnaissance trip to Christchurch after the February 22, 2011, earthquake.

The magnitude 7.1 earthquake in September 2010 occurred on a previously unknown fault in Canterbury, an area thought to have low seismic hazard, and caused surprisingly widespread damage, but no loss of life. It started a sequence of 17 loss-causing earthquakes in the region, lasting over a year. It was the magnitude 6.3 event, right beneath the city of Christchurch on February 22, 2011, that proved deadly. Many buildings that had been damaged and weakened in earlier quakes were reduced to rubble, and 185 people died.

In addition to this low-seismicity area suddenly experiencing earthquake shake damage, the main unanticipated losses came from the unprecedented amount of liquefaction, in which saturated or partially saturated soil substantially loses strength, causing it to behave like a liquid. This phenomenon produced so much damage that thousands of residential homes in the region were found to be situated on land too susceptible to liquefaction for repairs or rebuilding. They were subsequently designated within the government red zone and demolished.

The impact of repeated events and the large amount of liquefaction created progressive damage during the 2010-2011 Canterbury earthquake sequence, significantly confusing the loss picture and prolonging the loss adjusting and claims settlement process. The New Zealand Earthquake Commission (EQC) and private insurers are still settling outstanding Canterbury earthquake claims five years later.

The 2010-2011 earthquakes presented an important opportunity to learn more about the behavior of liquefaction. The Natural Hazards Research Platform, EQC and many local agencies in New Zealand funded the collection of liquefaction observation data across Christchurch. This extensive, high-quality data reveals several key observations, including:

- The spatial extent of the observed liquefaction during the February 22, 2011, M6.3 event corresponds well to the shallow groundwater zones in the area of Christchurch, but not to where the groundwater is deeper. The observations confirm that groundwater depth is an important factor in predicting liquefaction initiation.
- There is significant spatial variation in the liquefaction-related ground displacements over short distances. To account for these large differences in severity, the modeling methods need to map liquefaction severity parameters for localized variations where possible.
- Two primary failure mechanisms cause the severe ground displacements: predominantly vertical deformation, as well as more laterally induced ground displacement.
- The Christchurch liquefaction data shows a probable correlation between ground displacement severity and damage. Lateral deformation is found to be more damaging than vertical displacement.

Figure 2. Example of lateral spreading that caused severe damage to buildings and infrastructure in Christchurch during the February 22, 2011, earthquake. The photo was taken during the RMS reconnaissance trip to Christchurch.

Learning from the earthquakes using observational data and our own research, RMS incorporates four innovations in liquefaction loss modeling into the RMS® New Zealand Earthquake HD Model to help firms better predict the occurrence and severity of liquefaction:

Innovation 1: New geospatial methods that map groundwater-well data and near-surface groundwater depth to better determine regions of high liquefaction susceptibility across the country, including low-seismicity areas.

Innovation 2: New geospatial methods that use site-specific liquefaction borehole data to create maps that delineate liquefaction initiation potential and severity parameters.

Innovation 3: New methods of predicting where liquefaction could result in horizontal displacement.

Innovation 4: New analysis of empirical building fragility to liquefaction, based on the Christchurch observation data and insurance claims.

These important enhancements to the model’s liquefaction loss component offer a more precise tool with which to gauge the likely impact of this secondary earthquake hazard on a book of business, enabling firms to predict loss from liquefaction at a more granular scale. The developments have improved how RMS earthquake models determine the spatial pattern of liquefaction initiation, the liquefaction severity at the ground surface (if initiated) and the expected building response to liquefaction-induced ground displacements.
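As a purely illustrative distillation of those observations, and not the HD model’s actual methodology, a first-pass susceptibility screen might combine groundwater depth, shaking intensity and lateral-spreading potential. All thresholds below are invented:

```python
# Hypothetical first-pass liquefaction screen, loosely inspired by the
# observations above. Thresholds are invented; a real model works from
# mapped groundwater and borehole data at much finer resolution.

def liquefaction_screen(groundwater_depth_m, pga_g, near_free_face=False):
    """Rough susceptibility class from groundwater depth and peak ground
    acceleration (PGA); flags lateral-spreading potential near a free face
    such as a river bank, the more damaging mechanism seen in Christchurch."""
    if groundwater_depth_m > 10.0 or pga_g < 0.1:
        return "negligible"
    severity = "high" if groundwater_depth_m < 3.0 and pga_g >= 0.3 else "moderate"
    if near_free_face:
        severity += " (lateral-spreading potential)"
    return severity

print(liquefaction_screen(1.5, 0.45, near_free_face=True))  # high (lateral-spreading potential)
print(liquefaction_screen(12.0, 0.45))                      # negligible
```

Innovations 1 and 2 above replace this kind of crude thresholding with geospatially mapped groundwater and borehole data, which is what allows initiation and severity to be predicted at a localized scale.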
