New algorithms aim to provide emergency response managers with real-time data that could save lives.
The novel coronavirus pandemic has thrust emergency response modelling into the spotlight like never before. We are learning that recommendations need to be fast enough to stay ahead of a crisis, but they also need to be accurate and useful for effective disaster response.
Models are always imperfect. They can be whipsawed by the quality of their data and the sudden shifts that are impossible to predict in a complex situation. And models that help create an emergency preparedness plan for the future also need to be supplemented by algorithms that can inform disaster response in real time.
Researchers at North Carolina State, the U.S. Army, and Texas A&M are developing algorithms that can give first responders and emergency response managers real-time insights during disasters, both natural and manmade.
A huge number of American civilians and non-combat personnel live and work in South Korea. If conflict ever erupted with North Korea, it would trigger America’s largest ever non-combatant evacuation operation (NEO).
“It will be on a scale that we have never seen before,” says David Maxwell, senior fellow at the Foundation for Defense of Democracies. A 30-year Army veteran, Maxwell counts North Korea among his areas of expertise. There have been smaller NEOs around the world, he says, “but the scale we are talking about is in the hundreds of thousands (of evacuees).”
NEOs are how the State Department spirits American citizens away from war zones and disasters. Declared and run by ambassadors, the emergency response itself is carried out by the military. In South Korea, the Army takes point.
An emergency evacuation plan on the scale of a South Korean NEO is staggering. Among the logistical concerns are acquiring, mobilizing, and keeping ground transport running; setting up—and stocking—staging areas; and moving people to the airports and seaports that will take them to safety. Military logisticians and planners have built, and currently use, models to design their NEO playbook. A pair of researchers and veterans from NCSU and the U.S. Army have designed a new model that uses traditional algorithms not only to plan, but also to adjust those plans in real time.
The intent is not to replace or improve upon the current models, says NC State’s Brandon McConnell, assistant research professor in industrial and systems engineering.
“It really is about providing something different, a different type of capability for the commander to help visualize decision trade-offs and understand them.”
Built with lead author and Army Captain John Kearby, an NC State grad and instructor at the U.S. Military Academy at West Point, the algorithm crunches a number of variables to deliver a schedule for the NEO.
Those inputs include how many evacuees there are and their location by assigned assembly point, how many assets are available—like trains, buses, and helicopters—and their transit times. The schedule the algorithm creates is designed to complete the NEO in the least amount of time possible. But what makes it unique is that this schedule can adjust as things inevitably change.
If any of those variables is updated with new information, the schedule updates as well. The model needs to answer questions quickly and accurately: the trade-offs behind any decision could ripple across the whole operation. The model aims to let logisticians and commanders see the results of their actions in real time.
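The loop described above—feed in evacuee counts, assets, and transit times; generate a schedule; re-run whenever an input changes—can be sketched with a simple greedy heuristic. This is an illustrative toy, not Kearby and McConnell's actual model: the data, asset names, and the dispatch-the-first-free-asset rule are all assumptions.

```python
import heapq

def schedule_evacuation(evacuees_by_point, assets):
    """Greedy makespan heuristic: always dispatch the asset that frees up first.

    evacuees_by_point: {assembly_point: evacuee_count}
    assets: list of (name, capacity, round_trip_hours)
    Returns (hours_to_finish, trip_log).
    """
    # Min-heap of (time_asset_is_next_free, name, capacity, round_trip_hours)
    free_at = [(0.0, name, cap, rt) for name, cap, rt in assets]
    heapq.heapify(free_at)
    log, finish = [], 0.0
    for point, remaining in evacuees_by_point.items():
        while remaining > 0:
            t, name, cap, rt = heapq.heappop(free_at)
            carried = min(cap, remaining)
            remaining -= carried
            done = t + rt
            log.append((name, point, carried, done))
            finish = max(finish, done)
            heapq.heappush(free_at, (done, name, cap, rt))
    return finish, log

# Initial plan: two buses clear 100 evacuees in one round trip each.
plan = schedule_evacuation({"Assembly-A": 100},
                           [("bus-1", 50, 2.0), ("bus-2", 50, 2.0)])
print(plan[0])  # 2.0 hours

# A bus breaks down mid-operation: update the inputs and simply re-run.
replan = schedule_evacuation({"Assembly-A": 100}, [("bus-1", 50, 2.0)])
print(replan[0])  # 4.0 hours
```

The re-run is the point: because the schedule is recomputed from current inputs rather than patched by hand, a commander can immediately see the cost of losing an asset or gaining one.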
“That’s what it’s about in the environment,” McConnell says. “These things don’t come for free.” Kearby and McConnell’s model can trace the impact of those trade-offs down to the level of an individual truck.
That level of granularity is important, McConnell says. A key challenge was figuring out how much detail was needed. Too little, and the decisions would not be as informed as possible. Too much, and the results could be too slow to develop and the model too cumbersome to help.
“It really seems like this model can very much contribute to decision making,” Maxwell, the Korea expert, says. Issues—traffic jams, road closures, foreign governments’ actions—are inevitable.
There are still some gaps in what can be modeled, Maxwell notes. Among them are how other governments may impact the NEO—say, China outbidding the U.S. for private contractors’ vehicles to shuttle its own evacuees. The actions of Korean citizens looking for safety would be difficult to account for as well.
A user interface that is easy to use and understand will also be necessary to make the model field-ready, McConnell says.
Kearby and McConnell’s approach used the plans for a South Korea NEO, but it could apply to other situations. Emergency responses to disasters of every stripe are fluid situations.
The Deluge: Emergency Response in a Flood
Few natural disasters have as broad an impact area as flooding.
“Generally speaking, (floods) … can happen almost anywhere,” says Tim Frazier, professor and faculty director of the Emergency and Disaster Management program at Georgetown University. “It’s also the one where virtually every year we see the highest losses in dollar values for any natural disaster that we face.”
Models that are based on the physics of water and the topography of an area can help predict where flooding may start to occur. But they become less useful mid-flood, with waters and pressure rising and time washing away.
Ali Mostafavi, an assistant professor of civil and environmental engineering at Texas A&M, has designed an algorithm built to help predict where flood waters will flow next in real time. By taking data from a city’s flood gauges and combining it with the design of the city’s drainage systems, Mostafavi’s algorithm can model where flood waters will head in a city.
“You have an overflow in node 5 (of the drainage system),” Mostafavi says, as an example. The algorithm could tell you that “three hours from now node 7 has a 90% likelihood of an overflow as well.” It doesn’t predict the future, Mostafavi says, but provides a probabilistic framework to shape emergency response.
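Mostafavi's actual formulation isn't spelled out here, but the kind of probabilistic propagation he describes—an observed overflow at one drainage node raising the odds at nodes downstream, with a time lag—can be sketched as a toy walk over the network graph. The node names and probabilities echo his example; the graph, the lag handling, and the noisy-OR combination rule are assumptions for illustration.

```python
from collections import deque

def propagate_overflow(downstream, observed):
    """Walk the drainage graph outward from nodes already overflowing.

    downstream: {node: [(child, cond_overflow_prob, lag_hours), ...]}
    observed:   set of nodes currently observed to be overflowing
    Returns {node: (overflow_probability, est_hours_from_now)} for the rest.
    """
    prob = {n: 1.0 for n in observed}  # observed overflows are certain
    eta = {n: 0.0 for n in observed}
    queue = deque(observed)
    while queue:
        node = queue.popleft()
        for child, p_cond, lag in downstream.get(node, []):
            contrib = prob[node] * p_cond
            # Noisy-OR: treat multiple upstream overflow sources as independent.
            new_p = 1.0 - (1.0 - prob.get(child, 0.0)) * (1.0 - contrib)
            if new_p > prob.get(child, 0.0) + 1e-9:
                prob[child] = new_p
                eta[child] = min(eta.get(child, eta[node] + lag), eta[node] + lag)
                queue.append(child)
    return {n: (prob[n], eta[n]) for n in prob if n not in observed}

# Overflow observed at node 5; one downstream link to node 7.
net = {"node-5": [("node-7", 0.9, 3.0)]}
est = propagate_overflow(net, {"node-5"})
print(round(est["node-7"][0], 2), est["node-7"][1])  # 0.9 3.0
```

As in Mostafavi's framing, the output is not a forecast of what will happen but a probability attached to a lead time—enough to decide where to stage boats and crews before the water arrives.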
Such info could help determine where limited rescue resources should be allocated and how many may be needed, Frazier says. It could also help officials better warn the public.
Mostafavi used Harris County, Texas, and the flood-prone city of Houston to build his model. With a robust system of flood gauges, Houston was especially well-suited to providing data for the model. The algorithm drew on history: it was fed data from the Memorial Day flood of 2015 and Hurricane Harvey in 2017. With that grounding, the algorithm’s predictions matched the historical record of the 2016 Tax Day flood’s waters with 83% accuracy.
He is currently working on a deep learning iteration that is even more accurate and may be better suited to overcoming one of the model’s main challenges: lack of data.
“That’s the reason we started with using the probabilistic model,” Mostafavi says. Such models are less data dependent. But the crucial data he does need is dependent on a city’s flood monitoring infrastructure (including that the gauges are actually working properly).
“Especially in some resource limited areas, flood gauges are more scarce,” Mostafavi says. He is looking at potential workarounds that the deep learning version could use to fill in the gap, like monitoring social media posts as an ad hoc tracking system.
Like Kearby and McConnell’s model, Mostafavi’s is not meant to replace physics-based models mainly used for planning. The goal is to provide a different tool set — one optimized for real-time decision making — to supplement the models already in use.
“The thing we worry about in emergency management more than anything is saving lives,” Georgetown’s Frazier says. “If we have a model that tells us where the flooding is going to be, that’s a huge advantage.”