Recent News

Neuromorphic Computing Will Revolutionize the Edge – EE Times Europe

Biomimicry, the science-slash-art of copying natural structures, is not a new idea. For decades, we have been trying to copy biological brains to make efficient computers, only slightly deterred by the fact that we don't know exactly how biological intelligence works. Armed with our best guesses, we developed models of the neuron and spiking neural networks inspired by the human brain, and we are now trying to implement them in silicon. Silicon imitations generally use simplified versions of the neuron, but they can still offer distinct advantages for edge applications that need fast, energy-efficient processing to make decisions.
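To make the idea of a simplified silicon neuron concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) model, one of the most common simplifications used in spiking neural networks. The time constant, threshold, and constant input current below are illustrative assumptions, not parameters taken from any particular neuromorphic chip.

```python
import numpy as np

def lif_neuron(input_current, dt=1e-3, tau=20e-3, v_thresh=1.0, v_reset=0.0):
    """Simulate one leaky integrate-and-fire neuron; return membrane trace and spikes."""
    v = v_reset
    membrane, spikes = [], []
    for i_t in input_current:
        # Leak toward rest and integrate the incoming current.
        v += dt / tau * (-v + i_t)
        if v >= v_thresh:          # Fire when the threshold is crossed...
            spikes.append(1)
            v = v_reset            # ...then reset the membrane potential.
        else:
            spikes.append(0)
        membrane.append(v)
    return np.array(membrane), np.array(spikes)

# Drive the neuron with a constant input for 200 ms (dt = 1 ms) and count output spikes.
current = np.full(200, 1.5)
_, out_spikes = lif_neuron(current)
print(f"{out_spikes.sum()} spikes in {len(current)} ms")
```

The point of the simplification is that each timestep needs only an add, a compare, and an occasional reset, which is what makes dense arrays of such neurons practical to build in silicon.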

Neuromorphic Computing

ABI Research reports that 4.6 billion sensors will ship in 2027, embedded in smart-home devices, robots, and appliances, up from 1.8 billion in 2021. These additional sensors will support existing and new functions, resulting in a surge of sensor data that will need to be processed. While the vast majority of smart-home devices and appliances will feature internet connections by 2027, the cloud may not be the best place to process this data: hosting and processing it there costs money, adds latency, and raises privacy concerns.

The best bet for processing sensor data in real time, very close to the sensor, may well be neuromorphic computing. Demonstrations of neuromorphic computing systems have proven the technology's value for ultra-fast, ultra-low-power decision-making at the edge. Biomimicry in computing, and neuromorphic computing in particular, is poised to bring a whole new level of intelligence to edge devices, making it feasible to add decision-making power to devices with strict energy and latency budgets. As spiking networks and specialized hardware continue to mature, the effects will become even more pronounced.

Neuromorphic computing's competitor, deep learning (the paradigm that powers most mainstream AI today), is developing fast. Today, it is easily possible to run small deep-learning applications, such as keyword spotting and basic image processing, on a sub-US$1 microcontroller. Neuromorphic concepts take this a step further, squeezing into minuscule energy budgets. Will these technologies compete or coexist at the edge? The most likely medium-term scenario is coexistence: with millions of use cases at the edge there are millions of niches, and some will suit neuromorphic computing better for technical or commercial reasons.
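One intuition for why spiking approaches can undercut the energy budget of conventional deep learning is event-driven sparsity: a dense layer performs a multiply-accumulate for every input-output pair, while a spiking layer only accumulates weights for inputs that actually fired. The sketch below illustrates the operation-count gap; the layer sizes and the 5% spike rate are illustrative assumptions, not measurements from any device.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 256, 64
weights = rng.standard_normal((n_in, n_out))

# Dense path: every input contributes, so the MAC count is fixed.
dense_input = rng.standard_normal(n_in)
dense_out = dense_input @ weights
dense_ops = n_in * n_out

# Spiking path: only active inputs (spikes) trigger weight accumulations.
spikes = rng.random(n_in) < 0.05            # ~5% of inputs fire this timestep
spiking_out = weights[spikes].sum(axis=0)   # accumulate rows for firing inputs
spiking_ops = int(spikes.sum()) * n_out

print(f"dense ops:   {dense_ops}")
print(f"spiking ops: {spiking_ops} (~{spiking_ops / dense_ops:.1%} of dense)")
```

When activity is sparse, the work, and hence the energy, scales with the number of spikes rather than with the size of the layer, which is the advantage neuromorphic hardware is designed to exploit.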

While the demise of Moore's Law has been partly circumvented by accelerated, domain-specific computing, striking a balance between architectural flexibility and performance remains tricky, especially for quickly evolving workloads like AI. Taking our cue from the most efficient computer ever known, the human brain, and using the results of millions of years of evolution as a starting point feels like a safe bet.

