Call it IBM's Math-To-The-Rescue Program. Big Blue this week said its researchers had created specialized algorithms to help model and manage natural disasters such as wildfires, floods and diseases.
The idea is to use high-level mathematical techniques, which IBM calls stochastic programming, to speed up and simplify complex tasks such as determining the fastest route to deliver packages, detecting fraud in health insurance claims, automating complex risk decisions for international financial institutions, scheduling supply-chain and production operations at a manufacturing plant to maximize efficiency, or detecting patterns in medical data for new insights and breakthroughs.
More than 197 million people were affected by natural disasters in 2007, and despite the impact of these events, government and relief agencies still don't have a cohesive system to facilitate communication and manage staff deployment, distribution of supplies and other critical resources, IBM said.
The deployment of resources during a natural disaster, whether water, food, machines or people, requires complex planning and scheduling and the ability to adapt to constantly changing scenarios, often involving large numbers of resources, unique location-based requirements and varying staffing levels.
Government agencies use different systems to estimate their program needs, including preparedness resource planning, yet no one system has been able to adapt to the increasing complexity of natural disaster management, IBM said.
Stochastic programming offers great modeling power and flexibility, but it comes at a cost: heavy processing time. Recently, however, stochastic programming has benefited from the development of more efficient algorithms and faster computer processors. This means that rather than predicting a single, limited future through forecasting, decisions can be made that hold up across a wide range of probable scenarios, IBM said in a release.
The model can solve these unforeseen challenges, mostly within an hour, and scales well enough to promise graceful handling of even larger models in the future. IBM scientists developed a large-scale strategic budgeting framework based on stochastic algorithms for managing natural disaster events, with a focus on better preparedness for uncertain future disaster scenarios. The underlying optimization models and algorithms were initially prototyped on a large, unnamed US government program, where the key problem was how to efficiently deploy a large number of critical resources across a range of disaster scenarios. The same models could be applied to manage floods or famines in India, or natural disasters anywhere in the world, IBM said.
A fully developed, customized and implemented model could significantly strengthen a country's approach to disaster risk reduction and disaster management.
"We are creating a set of intellectual properties and software assets that can be employed to gauge and improve levels of preparedness to tackle unforeseen natural disasters," says Dr Gyana Parija, senior researcher and optimization expert at IBM India Research Laboratory. "Most real-world problems involve uncertainty, and this has been the inspiration for us to tackle challenges in natural disaster management."
In the case of flooding, for example, the stochastic programming model would combine various flood scenarios, resource supply capabilities at different dispatch locations, and the fixed and variable costs of deploying flood-management resources, balancing several risk measures. By assigning probabilities to the factors driving outcomes, the model outlines how limited resources can meet tomorrow's unknown demands or liabilities. In this way, the risks and rewards of different tradeoffs can be explored, IBM said.
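To make the idea concrete, here is a minimal, illustrative sketch of that kind of scenario-based planning: a two-stage model in which supplies must be stocked before the flood's severity is known, and unmet demand in each scenario incurs a penalty. All of the scenario probabilities, demand figures and costs below are invented for illustration; they are not IBM's model or data.

```python
# Two-stage stochastic planning sketch: choose a stock level before the
# flood scenario is revealed, then pay a penalty for any unmet demand.
# All numbers are hypothetical, chosen only to illustrate the technique.

# Scenarios as (probability, units of relief supplies demanded).
scenarios = [(0.6, 100), (0.3, 300), (0.1, 800)]

STOCK_COST = 2.0      # cost per unit stocked in advance (first-stage decision)
SHORTFALL_COST = 9.0  # cost per unit of unmet demand (second-stage recourse)

def expected_cost(stock):
    """First-stage stocking cost plus probability-weighted shortfall cost."""
    recourse = sum(p * SHORTFALL_COST * max(d - stock, 0)
                   for p, d in scenarios)
    return STOCK_COST * stock + recourse

# The objective is piecewise linear in the stock level, so an optimum lies
# at one of the scenario demand levels; a small search over them suffices.
best = min((d for _, d in scenarios), key=expected_cost)
print(best, expected_cost(best))  # stocking 300 units beats planning for
                                  # the single most likely scenario (100)
```

Note the payoff of planning across scenarios rather than for one forecast: stocking for the most likely scenario alone (100 units) carries a higher expected cost than the stochastic optimum, which hedges against the rarer, severe floods.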
IBM scientists from the company's research labs in New York and India combined their work with expertise from IBM Global Business Services, government bodies, relief agencies and strategic planning firms to develop the algorithms.