Police had attempted to evacuate some communities, and the army was on standby. Warnings of a ‘catastrophic’ North Sea storm surge on January 13 had led the UK Environment Agency to issue its highest level of flood warning, ‘severe’, denoting a danger to life, along parts of the East Coast. And yet the flooding did not materialize.
Water levels were 1.2m lower along the Lincolnshire coast than in the last big storm surge flood, in December 2013, and 0.9m lower around the Norfolk towns of Great Yarmouth and Lowestoft. Predicting the future in such complex situations, even very near-term, always has the potential to make fools of the experts. But public agencies, knowing the political fallout of missing a catastrophe, are under pressure to adopt the precautionary principle and take action. Imagine the headlines, and the ministerial responses, if there had been no warnings, followed by loss of life.
Interestingly, most of those who had been told to evacuate as this storm approached chose to stay in their homes. One police force in Essex knocked on 2,000 doors, yet only 140 of those people registered at an evacuation centre. Why did the others ignore the warnings and stay put? Media reports suggest that many felt this was another false alarm.
The precautionary principle might seem prudent, but a false alarm forecast can encourage people to ignore future warnings. Recent years offer numerous examples of the consequences.
The Lessons of History
Following a 2006 Mw8.3 earthquake offshore from the Kurile Islands, tsunami evacuation warnings were issued all along the Pacific coast of northern Japan, where the tsunami that did arrive was harmless. For many people, that experience weakened the imperative to evacuate after feeling three minutes of shaking in the March 2011 Mw9 earthquake, following which some 20,000 people were drowned by the tsunami. Mindful of what happened in 2004 and 2011, many countries around the Pacific and Indian Oceans today ‘over-issue’ tsunami warnings.
For the inhabitants of New Orleans, the evacuation order issued in advance of Hurricane Ivan in September 2004 (when one third of the city’s population moved out, while the storm veered away) left many sceptical about the mandatory evacuation ordered in advance of Hurricane Katrina in August 2005 (after which around 1,500 people drowned).
Agencies whose job it is to forecast disaster know only too well what happens if they fail to issue a warning as a risk looms. The long-term consequences of false alarms, however, are rarely made explicit. While risk models to calculate those consequences are not yet available, a simple hypothetical calculation illustrates the basic principles of how such a model might work:
the chance of a dangerous storm surge in the next 20 years is 10 percent, for a given community;
if this happens, then let’s say 5,000 people would be at grave risk;
because of a recent ‘false’ alarm, one percent of those residents will ignore evacuation orders;
thus the potential loss of life attributed to the false alarm is five people.
Now repeat with real data.
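The hypothetical calculation above is just an expected-value product, and can be sketched in a few lines of code. The function and parameter names here are illustrative assumptions, not part of any actual forecasting-agency or RMS model:

```python
def false_alarm_expected_loss(p_event, population_at_risk, ignore_fraction):
    """Hypothetical expected lives lost attributable to one false alarm.

    p_event            -- chance of the dangerous event over the horizon (e.g. 20 years)
    population_at_risk -- people at grave risk if the event occurs
    ignore_fraction    -- share of residents who, after the false alarm,
                          would ignore a future evacuation order
    """
    return p_event * population_at_risk * ignore_fraction


# The illustrative figures from the text: 10% chance, 5,000 people at risk,
# 1% of residents ignoring future orders.
loss = false_alarm_expected_loss(p_event=0.10,
                                 population_at_risk=5_000,
                                 ignore_fraction=0.01)
print(loss)  # → 5.0
```

A real model would replace each placeholder with data: hazard probabilities from surge or seismic catalogues, exposure counts from census data, and an ignore fraction estimated from post-event surveys of evacuation compliance.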
Forecasting agencies need a false-alarm risk model to help balance their decisions about when to issue severe warnings. There is an understandable instinct to be over-cautious in the short term, but when measured in terms of future lives lost, disaster warnings need to be carefully rationed. And that rationing requires political support, as well as public education.
[Note: RMS models storm surge in the U.K. where the risk is highest along England’s East Coast – the area affected by flood warnings on January 13. Surge risk is complex, and the RMS Europe Windstorm Model™ calculates surge losses caused by extra-tropical cyclones considering factors such as tidal state, coastal defenses, and saltwater contamination.]
Robert Muir-Wood works to enhance approaches to natural catastrophe modeling, identify models for new areas of risk, and explore expanded applications for catastrophe modeling. Robert has more than 25 years of experience developing probabilistic catastrophe models. He was lead author for the 2007 IPCC Fourth Assessment Report and 2011 IPCC Special Report on Extremes, and is Chair of the OECD panel on the Financial Consequences of Large Scale Catastrophes.
He is the author of seven books, most recently ‘The Cure for Catastrophe: How We Can Stop Manufacturing Natural Disasters’, as well as numerous research papers and articles in scientific and industry publications, and frequent blogs. He holds a degree in natural sciences and a PhD, both from Cambridge University, and is a Visiting Professor at the Institute for Risk and Disaster Reduction at University College London.