Porting Deep Spiking Q-Networks to neuromorphic chip Loihi

Mahmoud Akl, Yulia Sandamirskaya, Florian Walter, Alois Knoll

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

7 Scopus citations

Abstract

Deep neural networks (DNNs) set the benchmark in many perception and control tasks. Spiking versions of DNNs, implemented on neuromorphic hardware, can enable orders-of-magnitude lower power consumption and low latency during inference. In this paper, we explore the behavior and generalization capability of spiking, quantized spiking, and hardware implementations of deep Q-networks in two classical reinforcement learning tasks. We found that spiking neural networks perform slightly worse than their non-spiking counterparts, but that the performance degradation from quantization and on-chip implementation can be avoided. We conclude that, since hardware implementation brings lower power consumption and low latency, the neuromorphic approach is a promising avenue for deep Q-learning. Furthermore, online learning, which neuromorphic chips enable, can be used to compensate for the performance decrease in environments with parameter variations.
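The abstract does not specify how the trained deep Q-network is converted to a spiking network, so the following is only a minimal illustrative sketch of the general idea it describes: evaluating a trained Q-network as a rate-coded spiking network over several timesteps. Everything here is an assumption, not the authors' method: the leaky integrate-and-fire dynamics with a decay factor of 0.9, the constant-current input encoding, the non-spiking integrator readout for Q-values, the CartPole-like layer sizes, and the random stand-in weights.

```python
import numpy as np

def lif_layer_step(v, i_in, v_th=1.0, v_reset=0.0):
    """One Euler step of a layer of leaky integrate-and-fire neurons.
    Returns updated membrane potentials and binary spike outputs.
    Decay factor and threshold are illustrative assumptions."""
    v = 0.9 * v + i_in                     # leaky integration
    spikes = (v >= v_th).astype(float)     # fire where threshold crossed
    v = np.where(spikes > 0, v_reset, v)   # reset neurons that fired
    return v, spikes

def spiking_q_values(obs, w1, w2, timesteps=100):
    """Approximate Q-values with a rate-coded spiking forward pass.

    obs: observation vector (e.g. 4-dim for CartPole)
    w1, w2: weights of a (hypothetically) trained Q-network
    The output layer is a non-spiking integrator, so its accumulated
    input divided by the number of timesteps serves as the Q-estimate.
    """
    v_hidden = np.zeros(w1.shape[0])
    q_accum = np.zeros(w2.shape[0])
    for _ in range(timesteps):
        # constant-current encoding of the observation at every step
        v_hidden, s_hidden = lif_layer_step(v_hidden, w1 @ obs)
        q_accum += w2 @ s_hidden
    return q_accum / timesteps

# Hypothetical usage with random weights standing in for trained ones:
rng = np.random.default_rng(0)
w1 = rng.normal(size=(64, 4))   # 4 observation dims -> 64 hidden LIF neurons
w2 = rng.normal(size=(2, 64))   # 64 hidden neurons -> 2 actions
obs = np.array([0.0, 0.1, -0.05, 0.0])
action = int(np.argmax(spiking_q_values(obs, w1, w2)))
```

With this kind of conversion, accuracy depends on the number of simulation timesteps: more timesteps give a finer rate code and Q-estimates closer to the original network, at the cost of latency, which is one reason hardware results can differ from the non-spiking baseline.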

Original language: English
Title of host publication: ICONS 2021 - Proceedings of International Conference on Neuromorphic Systems 2021
Publisher: Association for Computing Machinery
ISBN (Electronic): 9781450386913
DOIs
State: Published - 27 Jul 2021
Event: 2021 International Conference on Neuromorphic Systems, ICONS 2021 - Virtual, Online, United States
Duration: 27 Jul 2021 → 29 Jul 2021

Publication series

Name: ACM International Conference Proceeding Series

Conference

Conference: 2021 International Conference on Neuromorphic Systems, ICONS 2021
Country/Territory: United States
City: Virtual, Online
Period: 27/07/21 → 29/07/21

Keywords

  • Spiking neural networks
  • neuromorphic hardware
  • reinforcement learning
