
August 15, 2012

Digital processors limited by power; What’s the UPSIDE?

Artist’s concept: Through UPSIDE, sensor data is analyzed by an array of self-organizing devices. The array processes the data using inference, automatically sorting elements of the image based on their similarities and dissimilarities. Sensor data flows into the array; target identification and tracking are the output.

Today’s Defense missions rely on a massive amount of sensor data collected by intelligence, surveillance and reconnaissance (ISR) platforms.

Not only has the volume of sensor data increased exponentially, but the complexity of the analysis required for applications such as target identification and tracking has also grown dramatically.

The digital processors used for ISR data analysis are constrained by their power requirements, which can restrict the speed and type of analysis that can be performed. A new, ultra-low-power processing method may enable faster, mission-critical analysis of ISR data.

The Unconventional Processing of Signals for Intelligent Data Exploitation (UPSIDE) program seeks to break the status quo of digital processing with methods of video and imagery analysis based on the physics of nanoscale devices. UPSIDE processing will be non-digital and fundamentally different from current digital processors, free of the power and speed limitations associated with them.

Instead of traditional complementary metal-oxide-semiconductor (CMOS)-based electronics, UPSIDE envisions arrays of physics-based devices (nanoscale oscillators may be one example) performing the “processing.” These arrays would self-organize and adapt to inputs, meaning they would not need to be programmed the way digital processors are. Unlike traditional digital processors, which operate by executing specific instructions to compute, the UPSIDE arrays are envisioned to rely on a higher-level computational element based on probabilistic inference embedded within a digital system.
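To make the self-organization idea concrete, here is a rough software sketch. The model below is an assumption for illustration only, a generic Kuramoto-style network of coupled oscillators whose coupling strengths are set by how similar two sensor readings are; it is not a description of any actual UPSIDE hardware. Oscillators driven by similar inputs fall into phase with one another, grouping the data without an explicit program:

```python
# Illustrative sketch only: a generic Kuramoto-style coupled-oscillator model,
# not UPSIDE hardware. Coupling strengths come from how similar two input
# values are, so oscillators driven by similar sensor readings drift into a
# shared phase, "sorting" the inputs into clusters without an explicit program.
import numpy as np

rng = np.random.default_rng(0)

# Toy "sensor" inputs: two groups of similar values plus one outlier.
inputs = np.array([0.10, 0.12, 0.11, 0.85, 0.88, 0.50])
n = len(inputs)

# Similar inputs couple strongly; dissimilar inputs barely couple at all.
similarity = np.exp(-((inputs[:, None] - inputs[None, :]) ** 2) / 0.01)
np.fill_diagonal(similarity, 0.0)

phases = rng.uniform(0.0, 2.0 * np.pi, n)   # random initial oscillator phases
dt, k = 0.05, 2.0                           # time step and coupling gain

for _ in range(400):
    # Each oscillator is pulled toward the phases of its strongly coupled neighbors.
    pull = (similarity * np.sin(phases[None, :] - phases[:, None])).sum(axis=1)
    phases = (phases + dt * k * pull) % (2.0 * np.pi)

print(np.round(phases, 2))  # oscillators 0-2 share one phase, 3-4 another
```

After a few hundred update steps the first three oscillators settle on one shared phase and the next two on another, while the outlier keeps its own, the kind of similarity-based sorting described in the artist’s concept above.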

Probabilistic inference is the fundamental computational model for the UPSIDE program. An inference process uses energy minimization over a probability distribution to find the object that is the most likely interpretation of the sensor data. It can be implemented directly, at approximate precision, by traditional semiconductors as well as by new kinds of emerging devices.
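As a minimal sketch of what inference by energy minimization means in practice (an illustrative Ising-style denoising model written in ordinary software, assumed here for exposition and not the program’s actual formulation), the most likely interpretation of noisy sensor data is the labeling that minimizes an energy balancing agreement with the observations against local smoothness:

```python
# Minimal sketch of inference as energy minimization: a generic Ising-style
# denoising model (assumed for illustration, not UPSIDE's actual formulation).
# The "most likely" clean image is the labeling that minimizes an energy
# trading off agreement with the noisy observation against local smoothness.
import numpy as np

rng = np.random.default_rng(1)

# Ground-truth binary image (a small square) observed through 15% pixel noise.
truth = np.zeros((16, 16), dtype=int)
truth[4:12, 4:12] = 1
noisy = np.where(rng.random(truth.shape) < 0.15, 1 - truth, truth)

def energy(label, data, beta=2.0):
    """Data term: each pixel disagreeing with the observation costs 1.
    Smoothness term: each pair of disagreeing neighbors costs beta."""
    data_cost = np.sum(label != data)
    smooth_cost = (np.sum(label[1:, :] != label[:-1, :])
                   + np.sum(label[:, 1:] != label[:, :-1]))
    return data_cost + beta * smooth_cost

# Greedy single-pixel flips stand in for the minimization.
estimate = noisy.copy()
for _ in range(5):
    for i in range(estimate.shape[0]):
        for j in range(estimate.shape[1]):
            flipped = estimate.copy()
            flipped[i, j] = 1 - flipped[i, j]
            if energy(flipped, noisy) < energy(estimate, noisy):
                estimate = flipped

print("errors before inference:", int(np.sum(noisy != truth)))
print("errors after inference: ", int(np.sum(estimate != truth)))
```

Greedy flips do the minimizing in this toy version; the premise of UPSIDE is that physical devices could settle into such low-energy states directly, at approximate precision, rather than computing them instruction by instruction.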

“Redefining the fundamental computation as inference could unlock processing speeds and power efficiency for visual data sets that are not currently possible,” explained Dan Hammerstrom, DARPA program manager. “DARPA hopes that this type of technology will not only yield faster video and image analysis, but also lend itself to being scaled for increasingly smaller platforms.”

An interdisciplinary approach is expected, as interested performer teams must address three tasks set forth in the UPSIDE solicitation. Task 1 forms the foundation for the program and involves the development of the computational model and the image-processing application that will be used for demonstration and benchmarking. Tasks 2 and 3 build on the results of Task 1 to demonstrate the inference module implemented in mixed-signal CMOS (Task 2) and in non-CMOS emerging nanoscale devices (Task 3). Successfully addressing all three tasks will require close collaboration within the proposer’s team and will be an important aspect of any successful UPSIDE effort.

“Leveraging the physics of devices to perform computations is not a new idea, but it is one that has never been fully realized,” added Hammerstrom. “However, digital processors can no longer keep up with the requirements of the Defense mission. We are reaching a critical mass in terms of our understanding of the required algorithms, of probabilistic inference and its role in sensor data processing, and the sophistication of new kinds of emerging devices. At DARPA, we believe that the time has come to fund the development of systems based on these ideas and take computational capabilities to the next level.”

 



