The U.S. Air Force Research Laboratory has teamed up with IBM to build a supercomputing platform designed to facilitate information discovery and neural network learning.
The brain-inspired supercomputer will run on the IBM TrueNorth Neurosynaptic System’s 64-chip array and will have sensory processing and pattern recognition functionalities equivalent to 16 billion synapses and 64 million neurons, IBM said Friday.
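The headline figures follow from TrueNorth's widely published per-chip specifications (4,096 cores per chip, each with 256 neurons and a 256-by-256 crossbar of synapses); as a rough sanity check, the 64-chip totals can be derived directly, keeping in mind that the per-chip numbers here come from IBM's earlier TrueNorth publications, not from this announcement:

```python
# Sanity check on the 64-chip array's totals, assuming TrueNorth's
# published per-chip layout: 4,096 cores per chip, 256 neurons per core,
# and a 256 x 256 synaptic crossbar per core.
cores_per_chip = 4096
neurons_per_core = 256
synapses_per_core = 256 * 256   # each of 256 neurons can connect to 256 axons
chips = 64

neurons = chips * cores_per_chip * neurons_per_core
synapses = chips * cores_per_chip * synapses_per_core

print(f"{neurons:,} neurons")    # 67,108,864 -- the "64 million" in the article
print(f"{synapses:,} synapses")  # 17,179,869,184 -- the "16 billion"
```

The round numbers in the announcement are binary-style approximations: 67,108,864 is 64 × 2^20, and 17.18 billion is 16 × 2^30.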
The new scalable neurosynaptic system's processor component will consume energy equivalent to that of a 10-watt light bulb.
TrueNorth works to convert images, audio, video and text derived from various sensors into symbols. IBM developed the chip in collaboration with Cornell University under the Defense Advanced Research Projects Agency's Systems of Neuromorphic Adaptive Plastic Scalable Electronics program.
Under the partnership, AFRL aims for the system to combine the "left-brain" symbolic processing of conventional computers with "right-brain" sensory perception, and to support both model and data parallelism.
Dharmendra Modha, an IBM fellow and chief scientist for brain-inspired computing at IBM Research, said the evolution of the TrueNorth system reflects the company's efforts to innovate in artificial intelligence hardware.
“Over the last six years, IBM has expanded the number of neurons per system from 256 to more than 64 million – an 800 percent annual increase over six years,” he added.
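The compound annual growth behind Modha's figure can be checked directly; the per-year factor below is derived from the endpoints he cites (256 neurons to more than 64 million over six years), and works out to roughly 8x per year, which is presumably the basis of the quoted percentage:

```python
# Derive the compound annual scale-up from the endpoints in Modha's quote.
# The 256 and 64 million figures are from the article; the annual factor
# is computed here, not stated by IBM.
start_neurons = 256          # first TrueNorth system
end_neurons = 64_000_000     # the 64-chip array described here
years = 6

total_factor = end_neurons / start_neurons        # overall scale-up
annual_factor = total_factor ** (1 / years)       # compound growth per year
annual_increase_pct = (annual_factor - 1) * 100

print(f"total scale-up: {total_factor:,.0f}x")                       # 250,000x
print(f"annual: {annual_factor:.1f}x (~{annual_increase_pct:.0f}%)")  # ~7.9x
```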