Exploiting Inhomogeneities of Subthreshold Transistors as Populations of Spiking Neurons

Etienne Mueller, Daniel Auge, Alois Knoll

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review


As machine learning applications become increasingly powerful and are deployed on a growing number of devices, the need for energy-efficient implementations is rising. To meet this demand, a promising field of research is the adoption of spiking neural networks in combination with neuromorphic hardware, where energy is consumed only when information is processed. The approach that maximizes energy efficiency, an analog layout with transistors operating in subthreshold mode, suffers from inhomogeneities such as device mismatch, which make it challenging to create the uniform threshold required for spiking neurons. Furthermore, previous work has mainly focused on spiking feedforward or convolutional networks, as neural networks based on rectified linear units translate well to rate-coded spiking neurons. Consequently, processing continuous sequential data remains challenging, since recurrent networks built on long short-term memory or gated recurrent unit cells use sigmoid and tanh activation functions. We show how these two disadvantages can compensate for each other: a population of spiking neurons with normally distributed thresholds can reliably represent the sigmoid and tanh activation functions. Building on this finding, we present a novel method for converting a long short-term memory network to a spiking neural network. Although computationally expensive in a simulation environment, this approach offers a significant opportunity for energy reduction and hardware feasibility, as it leverages the often unwanted process variance as a design feature.
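The core observation of the abstract can be illustrated with a minimal sketch (not the authors' implementation): for a population of simple threshold units whose thresholds are drawn from a normal distribution, the fraction of units that fire for a given input equals the empirical normal CDF, which closely tracks the logistic sigmoid when the standard deviation is chosen near 1.702 (the classic probit-logit approximation). The population size and scale below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumption: 10,000 threshold units with normally distributed
# thresholds; sigma ~ 1.702 makes the normal CDF approximate the logistic sigmoid.
n_neurons = 10_000
thresholds = rng.normal(loc=0.0, scale=1.702, size=n_neurons)

def population_activation(x):
    """Fraction of the population whose threshold is exceeded by input x."""
    return np.mean(x >= thresholds)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

xs = np.linspace(-5.0, 5.0, 11)
pop = np.array([population_activation(x) for x in xs])
ref = sigmoid(xs)
max_err = np.max(np.abs(pop - ref))
print(f"max |population - sigmoid| = {max_err:.3f}")
```

Since tanh(x) = 2·sigmoid(2x) − 1, the same population mechanism covers both activation functions used in LSTM and GRU cells.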

Original language: English
Title of host publication: Lecture Notes on Data Engineering and Communications Technologies
Publisher: Springer Science and Business Media Deutschland GmbH
Number of pages: 10
State: Published - 2023

Publication series

Name: Lecture Notes on Data Engineering and Communications Technologies
ISSN (Print): 2367-4512
ISSN (Electronic): 2367-4520


  • Conversion
  • Long short-term memory
  • Spiking neural networks
  • Subthreshold analog neuromorphic hardware


