TY - GEN
T1 - Robustness of Neuromorphic Computing with RRAM-based Crossbars and Optical Neural Networks
AU - Zhang, Grace Li
AU - Li, Bing
AU - Zhu, Ying
AU - Wang, Tianchen
AU - Shi, Yiyu
AU - Yin, Xunzhao
AU - Zhuo, Cheng
AU - Gu, Huaxi
AU - Ho, Tsung Yi
AU - Schlichtmann, Ulf
N1 - Publisher Copyright:
© 2021 Association for Computing Machinery.
PY - 2021/1/18
Y1 - 2021/1/18
N2 - RRAM-based crossbars and optical neural networks are attractive platforms for accelerating neuromorphic computing. However, both accelerators suffer from hardware uncertainties such as process variations. If left unaddressed, these uncertainties can significantly degrade the inference accuracy of these computing platforms. In this paper, we present a statistical training method in which weights under process variations and noise are modeled as statistical random variables. To incorporate these statistical weights into training, the computations in neural networks are modified accordingly. For optical neural networks, we modify the cost function during software training to reduce the effects of process variations and thermal imbalance. In addition, the residual effects of process variations are extracted and calibrated during hardware test, and thermal variations on devices are compensated in advance. Simulation results demonstrate that the inference accuracy under hardware uncertainties can be improved significantly for both platforms.
AB - RRAM-based crossbars and optical neural networks are attractive platforms for accelerating neuromorphic computing. However, both accelerators suffer from hardware uncertainties such as process variations. If left unaddressed, these uncertainties can significantly degrade the inference accuracy of these computing platforms. In this paper, we present a statistical training method in which weights under process variations and noise are modeled as statistical random variables. To incorporate these statistical weights into training, the computations in neural networks are modified accordingly. For optical neural networks, we modify the cost function during software training to reduce the effects of process variations and thermal imbalance. In addition, the residual effects of process variations are extracted and calibrated during hardware test, and thermal variations on devices are compensated in advance. Simulation results demonstrate that the inference accuracy under hardware uncertainties can be improved significantly for both platforms.
UR - http://www.scopus.com/inward/record.url?scp=85100544751&partnerID=8YFLogxK
U2 - 10.1145/3394885.3431634
DO - 10.1145/3394885.3431634
M3 - Conference contribution
AN - SCOPUS:85100544751
T3 - Proceedings of the Asia and South Pacific Design Automation Conference, ASP-DAC
SP - 853
EP - 858
BT - Proceedings of the 26th Asia and South Pacific Design Automation Conference, ASP-DAC 2021
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 26th Asia and South Pacific Design Automation Conference, ASP-DAC 2021
Y2 - 18 January 2021 through 21 January 2021
ER -