Targeting DNN Inference Via Efficient Utilization of Heterogeneous Precision DNN Accelerators

Ourania Spantidi, Georgios Zervakis, Sami Alsalamin, Isai Roman-Ballesteros, Jörg Henkel, Hussam Amrouch, Iraklis Anagnostopoulos

Publication: Contribution to journal › Article › peer review

8 citations (Scopus)

Abstract

Modern applications increasingly rely on the simultaneous execution of multiple DNNs, and Heterogeneous DNN Accelerators (HDAs) have emerged as a solution to this trend. In this work, we propose, implement, and evaluate low-precision Neural Processing Units (NPUs) that serve as building blocks for constructing HDAs, targeting the efficient deployment of multi-DNN workloads. We further design and evaluate HDAs that increase overall throughput while reducing energy consumption during NN inference. At design time, we implement HDAs inspired by the big.LITTLE computing paradigm, combining 8-bit NPUs with lower-precision NPUs. Additionally, an NN-to-NPU scheduling methodology decides at run time how to map each executed NN to a suitable NPU, based on an accuracy-drop threshold. Our hardware/software co-design reduces the energy consumption and response time of NNs by 29% and 10%, respectively, compared to state-of-the-art homogeneous architectures, with a negligible accuracy drop of merely 0.5%. Similar to traditional CPU big.LITTLE, our asymmetric NPU design can open new doors for novel DNN accelerator architectures, given its profound role in increasing the efficiency of DNNs with minimal accuracy loss.
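The abstract does not expose implementation details, so the sketch below is only a rough illustration of how a threshold-based NN-to-NPU mapping decision could look in a big.LITTLE-style HDA. All names (NPU, NNRequest, ACCURACY_DROP_THRESHOLD), the profiled per-precision accuracy/energy numbers, and the greedy selection policy are assumptions for illustration, not the authors' actual methodology.

```python
# Illustrative sketch only: a minimal threshold-based NN-to-NPU scheduler,
# assuming per-NN profiling data (expected accuracy drop and energy on each
# NPU precision). Names, numbers, and policy are hypothetical.
from dataclasses import dataclass

ACCURACY_DROP_THRESHOLD = 0.5  # percentage points, echoing the reported 0.5% drop


@dataclass
class NPU:
    name: str          # e.g., "big-8bit" or "LITTLE-4bit"
    bit_width: int
    busy: bool = False


@dataclass
class NNRequest:
    name: str
    accuracy_drop: dict  # bit_width -> expected accuracy drop (%) on that NPU
    energy: dict         # bit_width -> expected inference energy (mJ)


def schedule(nn: NNRequest, npus: list[NPU]) -> NPU | None:
    """Map the NN to the lowest-energy idle NPU whose expected accuracy
    drop stays within the threshold; fall back to an 8-bit NPU otherwise."""
    candidates = [
        npu for npu in npus
        if not npu.busy
        and nn.accuracy_drop.get(npu.bit_width, float("inf")) <= ACCURACY_DROP_THRESHOLD
    ]
    if not candidates:
        # Fall back to any idle full-precision (8-bit) NPU.
        candidates = [npu for npu in npus if not npu.busy and npu.bit_width == 8]
    if not candidates:
        return None  # no idle NPU: the request would be queued
    chosen = min(candidates, key=lambda npu: nn.energy[npu.bit_width])
    chosen.busy = True
    return chosen


# Example: a big.LITTLE-style HDA with one 8-bit and one 4-bit NPU.
hda = [NPU("big-8bit", 8), NPU("LITTLE-4bit", 4)]
resnet = NNRequest("ResNet-20", accuracy_drop={8: 0.0, 4: 0.4}, energy={8: 5.0, 4: 2.1})
print(schedule(resnet, hda).name)  # -> LITTLE-4bit (0.4% drop is within the 0.5% threshold)
```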

Original language: English
Pages (from - to): 112-125
Number of pages: 14
Journal: IEEE Transactions on Emerging Topics in Computing
Volume: 11
Issue number: 1
DOIs
Publication status: Published - 1 Jan. 2023
Externally published: Yes
