TY - GEN
T1 - Spiking Transformer Networks
T2 - 7th International Conference on Systems and Informatics, ICSAI 2021
AU - Mueller, Etienne
AU - Studenyak, Viktor
AU - Auge, Daniel
AU - Knoll, Alois
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021
Y1 - 2021
N2 - Machine learning applications are steadily increasing in performance while also being deployed on a growing number of devices with limited energy resources. To minimize this trade-off, researchers are continually looking for more energy-efficient solutions. A promising field is the use of spiking neural networks in combination with neuromorphic hardware, which can significantly reduce energy consumption because energy is consumed only while information is being processed. However, because their learning algorithms lag behind those of conventional neural networks trained with backpropagation, few applications exist today. The highest levels of accuracy are achieved by converting networks trained with backpropagation into spiking networks. For fully connected and convolutional architectures, the converted spiking networks show nearly the same performance as the originals, whereas the conversion of recurrent networks has proven challenging. Recent progress with transformer networks could change this: this type of network not only consists of modules that can easily be converted, but also achieves the best accuracy levels on a variety of machine learning tasks. In this work, we present a method to convert the transformer architecture into networks of spiking neurons. With only minimal conversion loss, our approach processes sequential data with very high accuracy while offering potential reductions in energy consumption.
AB - Machine learning applications are steadily increasing in performance while also being deployed on a growing number of devices with limited energy resources. To minimize this trade-off, researchers are continually looking for more energy-efficient solutions. A promising field is the use of spiking neural networks in combination with neuromorphic hardware, which can significantly reduce energy consumption because energy is consumed only while information is being processed. However, because their learning algorithms lag behind those of conventional neural networks trained with backpropagation, few applications exist today. The highest levels of accuracy are achieved by converting networks trained with backpropagation into spiking networks. For fully connected and convolutional architectures, the converted spiking networks show nearly the same performance as the originals, whereas the conversion of recurrent networks has proven challenging. Recent progress with transformer networks could change this: this type of network not only consists of modules that can easily be converted, but also achieves the best accuracy levels on a variety of machine learning tasks. In this work, we present a method to convert the transformer architecture into networks of spiking neurons. With only minimal conversion loss, our approach processes sequential data with very high accuracy while offering potential reductions in energy consumption.
UR - http://www.scopus.com/inward/record.url?scp=85124972504&partnerID=8YFLogxK
U2 - 10.1109/ICSAI53574.2021.9664146
DO - 10.1109/ICSAI53574.2021.9664146
M3 - Conference contribution
AN - SCOPUS:85124972504
T3 - ICSAI 2021 - 7th International Conference on Systems and Informatics
BT - ICSAI 2021 - 7th International Conference on Systems and Informatics
A2 - Yang, Jianxi
A2 - Li, Kenli
A2 - Tu, Wanqing
A2 - Xiao, Zheng
A2 - Wang, Libo
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 13 November 2021 through 15 November 2021
ER -