TY - CHAP
T1 - Exploiting Inhomogeneities of Subthreshold Transistors as Populations of Spiking Neurons
AU - Mueller, Etienne
AU - Auge, Daniel
AU - Knoll, Alois
N1 - Publisher Copyright:
© 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2023
Y1 - 2023
N2 - As machine learning applications become increasingly powerful and are deployed to a growing number of appliances, the need for energy-efficient implementations is rising. To meet this demand, a promising field of research is the adoption of spiking neural networks in conjunction with neuromorphic hardware, as energy is consumed only when information is processed. The approach that maximizes energy efficiency, an analog layout with transistors operating in subthreshold mode, suffers from inhomogeneities such as device mismatch, which make it challenging to create the uniform threshold necessary for spiking neurons. Furthermore, previous work has mainly focused on spiking feedforward or convolutional networks, as neural networks based on rectified linear units translate well to rate-coded spiking neurons. Consequently, the processing of continuous sequential data remains challenging, as networks with long short-term memory or gated recurrent units as recurrent cells use sigmoid and tanh as activation functions. We show how these two disadvantages can compensate for each other: a population of spiking neurons with a normally distributed threshold can reliably represent the sigmoid and tanh activation functions. Building on this finding, we present a novel method for converting a long short-term memory recurrent network to a spiking neural network. Although computationally expensive in a simulation environment, this approach offers a significant opportunity for energy reduction and hardware feasibility, as it leverages the often unwanted process variance as a design feature.
AB - As machine learning applications become increasingly powerful and are deployed to a growing number of appliances, the need for energy-efficient implementations is rising. To meet this demand, a promising field of research is the adoption of spiking neural networks in conjunction with neuromorphic hardware, as energy is consumed only when information is processed. The approach that maximizes energy efficiency, an analog layout with transistors operating in subthreshold mode, suffers from inhomogeneities such as device mismatch, which make it challenging to create the uniform threshold necessary for spiking neurons. Furthermore, previous work has mainly focused on spiking feedforward or convolutional networks, as neural networks based on rectified linear units translate well to rate-coded spiking neurons. Consequently, the processing of continuous sequential data remains challenging, as networks with long short-term memory or gated recurrent units as recurrent cells use sigmoid and tanh as activation functions. We show how these two disadvantages can compensate for each other: a population of spiking neurons with a normally distributed threshold can reliably represent the sigmoid and tanh activation functions. Building on this finding, we present a novel method for converting a long short-term memory recurrent network to a spiking neural network. Although computationally expensive in a simulation environment, this approach offers a significant opportunity for energy reduction and hardware feasibility, as it leverages the often unwanted process variance as a design feature.
KW - Conversion
KW - Long short-term memory
KW - Spiking neural networks
KW - Subthreshold analog neuromorphic hardware
UR - http://www.scopus.com/inward/record.url?scp=85147852052&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-20738-9_55
DO - 10.1007/978-3-031-20738-9_55
M3 - Chapter
AN - SCOPUS:85147852052
T3 - Lecture Notes on Data Engineering and Communications Technologies
SP - 483
EP - 492
BT - Lecture Notes on Data Engineering and Communications Technologies
PB - Springer Science and Business Media Deutschland GmbH
ER -