Neuromorphic spiking sensors and neural networks for low-power edge applications
Time: Monday, December 6th, 10:30am - 12:00pm PST
Description: Neuromorphic spiking sensors, including the event-driven Dynamic Audio Sensor (DAS) and the Dynamic Vision Sensor (DVS) event camera, are inspired by the functionality of the biological cochlea and retina. The asynchronous outputs of these event-driven sensors can enable always-on sensing with lower system-level response latency than conventional sampled sensors, for Internet of Things (IoT) and Brain-Machine Interface (BMI) applications. Recent developments in deep networks, spiking networks, analog computing, and novel non-volatile memory devices have led to very low-power neuromorphic systems that combine these sensors and networks for these application domains. Using supervised and unsupervised learning methods from the deep learning and neuroscience fields, these systems and their hardware equivalents can be configured to achieve high accuracy and low latency on benchmark tasks. The novel memory devices reduce the power spent on off-chip memory access and can support different learning algorithms useful for adaptive neuromorphic systems. Analog computing methods reduce the energy required to implement matrix operations and nonlinearities compared with digital systems.
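As a rough illustration of the asynchronous output format described above, the sketch below bins DVS-style events into a frame for downstream processing. Each event is modeled as an (x, y, timestamp, polarity) tuple; the sensor size, event values, and function name here are illustrative assumptions, not a real device API.

```python
import numpy as np

# Hypothetical sensor resolution for illustration only.
WIDTH, HEIGHT = 8, 8

def events_to_frame(events, width=WIDTH, height=HEIGHT):
    """Accumulate (x, y, timestamp, polarity) events into a signed 2D frame.

    ON events (polarity=1) add +1 at the pixel, OFF events add -1.
    This simple time-window binning converts the asynchronous event
    stream into a frame usable by a conventional network.
    """
    frame = np.zeros((height, width), dtype=np.int32)
    for x, y, t, polarity in events:
        frame[y, x] += 1 if polarity else -1
    return frame

# Synthetic events: an edge sweeping across row 3, one ON event per pixel.
events = [(col, 3, 1000 + 100 * col, 1) for col in range(WIDTH)]
frame = events_to_frame(events)
print(frame[3])  # row 3 holds one accumulated ON event per pixel
```

In a real pipeline the binning window would be chosen by time or event count; the key point is that pixels with no activity contribute no events, which is what enables the always-on, low-power operation described above.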
This tutorial will describe advances in the design of neuromorphic sensors, bio-inspired network architectures and algorithms, and hardware implementations that can also be applied to spiking sensor outputs. We will present examples of the use of neuromorphic systems in low-latency, low-power application domains.
The main topics of the tutorial include:
1) An overview of neuromorphic spiking sensors and processing
2) Neural network accelerators applied to these sensory data
3) Architectures for large-scale neuromorphic systems, such as crossbar, island-style, and time-multiplexed crossbar
4) Applications of adaptive neuromorphic systems, such as brain-machine interfaces (BMI) and the Internet of Things (IoT)
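The analog matrix operations mentioned in the description map naturally onto the crossbar architectures listed in topic 3: applying voltages to the row lines of a resistive crossbar and summing currents on the column lines computes a matrix-vector product via Ohm's and Kirchhoff's laws. The NumPy model below is an idealized sketch of this principle, assuming ideal devices (no wire resistance, device variability, or noise); the sizes and variable names are illustrative.

```python
import numpy as np

# Idealized crossbar matrix-vector multiply: column current j is
# I[j] = sum_i V[i] * G[i, j], i.e. I = G^T @ V, where G holds the
# programmed device conductances (the network weights).
rng = np.random.default_rng(0)
G = rng.uniform(0.0, 1.0, size=(4, 3))   # conductances: 4 row lines x 3 column lines
V = rng.uniform(0.0, 0.5, size=4)        # input voltages applied to the row lines

I = G.T @ V   # analog summation happens "for free" on the column wires
print(I.shape)  # one summed current per column line
```

The energy advantage claimed above comes from performing the multiply-accumulate in the physics of the array itself, rather than fetching weights from off-chip memory and multiplying digitally.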