Warfighters’ missions rely on a virtual network of sensors and communications systems for greater battlefield awareness than at any time in history.
At the same time, demands for actionable information have spiked as warfighters at every level – whether at the planning table or on patrol – are called upon to make well-informed decisions.
To maximize mission effectiveness and enhance national security, the Department of Defense is now challenged to more efficiently fuse, analyze and disseminate the massive volumes of data this network produces.
Current DoD systems and processes for handling and analyzing information cannot be efficiently or effectively scaled to meet this challenge. The volume and characteristics of the data, and the range of applications for data analysis, require a fundamentally new approach to data science and analysis, and to incorporating results into mission planning on timelines consistent with operational tempo.
“The sheer volume of information creates a background clutter …,” said DARPA Acting Director Kaigham J. Gabriel. “Let me put this in some context. The Atlantic Ocean is roughly 350 million cubic kilometers in volume, or nearly 100 billion, billion gallons of water. If each gallon of water represented a byte or character, the Atlantic Ocean would be able to store, just barely, all the data generated by the world in 2010. Looking for a specific message or page in a document would be the equivalent of searching the Atlantic Ocean for a single 55-gallon drum.”
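The scale analogy above can be checked with a quick back-of-the-envelope conversion, assuming the standard factor of roughly 264 US gallons per cubic meter:

```python
# Sanity check of the ocean-scale analogy in the quote:
# convert ~350 million cubic kilometers to US gallons and
# compare with "nearly 100 billion, billion" (1e20).

ATLANTIC_VOLUME_KM3 = 350e6    # ~350 million cubic kilometers
M3_PER_KM3 = 1e9               # cubic meters per cubic kilometer
GALLONS_PER_M3 = 264.172       # US gallons per cubic meter

gallons = ATLANTIC_VOLUME_KM3 * M3_PER_KM3 * GALLONS_PER_M3
print(f"{gallons:.2e} gallons")   # prints 9.25e+19 gallons
```

At one byte per gallon, that works out to roughly 9 × 10^19 bytes, or on the order of 90 exabytes — consistent with the quote’s “nearly 100 billion, billion gallons.”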
Recognizing the challenges presented by this volume of data, DARPA began the XDATA program to develop computational techniques and software tools for processing and analyzing the vast amount of mission-oriented information generated by Defense activities. As part of this exploration, XDATA aims to address the need for scalable algorithms for processing and visualizing imperfect and incomplete data. Because DoD users vary widely, XDATA also anticipates the creation of human-computer interaction tools that can be easily customized for different missions.
To support large-scale data processing in a wide range of potential settings, XDATA plans to release open-source software toolkits that enable collaboration among the applied mathematics, computer science and data visualization communities.
“It’s a great time to leverage recent commercial and academic advances in processing large amounts of data for analysis,” said Chris White, DARPA program manager. “We are calling on all technical communities with expertise in this area to help us ensure our men and women in uniform have the benefit of the best information we can provide.”
To increase awareness of the XDATA program and attract potential researchers, DARPA is planning a Proposers’ Day workshop for April 2012. This workshop will introduce the research community to the effort, explain the mechanics of a DARPA research program, and encourage collaborative arrangements among potential performers. The meeting is in support of the XDATA Broad Agency Announcement. More information regarding both is available here: http://go.usa.gov/Et7.
The XDATA program was announced on March 29, 2012, by the White House Office of Science and Technology Policy and the National Coordination Office for Networking and Information Technology Research and Development as part of President Barack Obama’s “Big Data” initiative. DARPA’s XDATA program supports a whole-of-government effort to coordinate management of big data technology and better use the volumes of data collected by federal agencies.