A Masked Hardware Accelerator for Feed-Forward Neural Networks With Fixed-Point Arithmetic

Manuel Brosch, Matthias Probst, Matthias Glaser, Georg Sigl

Research output: Contribution to journal › Article › peer-review


Neural network (NN) execution on resource-constrained edge devices is increasing. Commonly, hardware accelerators are introduced in small devices to support the execution of NNs. However, an attacker can often gain physical access to edge devices, so side-channel attacks pose a threat of extracting valuable information about the NN. To keep the network secret and protect it from extraction, countermeasures are required. In this article, we propose a masked hardware accelerator for feed-forward NNs that utilizes fixed-point arithmetic and is protected against side-channel analysis (SCA). We use an existing arithmetic masking scheme and improve it to prevent incorrect results. Moreover, we transfer the scheme to the hardware layer by utilizing the glitch-extended probing model and demonstrate the security of the individual modules. To exhibit the effectiveness of the masked design, we implement it on an FPGA and measure the power consumption. The results show that with two million measurements, no secret information is leaked by means of a t-test. In addition, we compare our accelerator with the masked software implementation and other hardware designs. The comparison indicates that our accelerator is up to 38 times faster than software and improves the throughput by a factor of about 4.1 compared to other masked hardware accelerators.
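As a rough illustration of the kind of arithmetic masking the abstract refers to, the sketch below splits a fixed-point value into two additive shares modulo 2^n, so that neither share alone correlates with the secret; linear operations can then be computed share-wise. The word width, function names, and two-share structure are assumptions for illustration only, not the paper's actual scheme.

```python
import secrets

WIDTH = 16          # assumed fixed-point word width
MOD = 1 << WIDTH    # arithmetic is performed modulo 2^WIDTH

def mask(x):
    """Split x into two arithmetic shares: x == (s0 + s1) mod 2^WIDTH.

    Each share on its own is uniformly random and reveals nothing about x.
    """
    r = secrets.randbelow(MOD)
    return ((x - r) % MOD, r)

def masked_add(a, b):
    """Add two masked values share-wise, never recombining the secrets."""
    return ((a[0] + b[0]) % MOD, (a[1] + b[1]) % MOD)

def unmask(shares):
    """Recombine the shares to recover the plain value."""
    return (shares[0] + shares[1]) % MOD

x, y = 1234, 567
sx, sy = mask(x), mask(y)
assert unmask(masked_add(sx, sy)) == (x + y) % MOD
```

Non-linear operations (e.g., activation functions) cannot be computed share-wise this simply; handling them securely, and doing so without glitch-induced leakage in hardware, is where masked accelerator designs require dedicated gadgets.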

Original language: English
Pages (from-to): 231-244
Number of pages: 14
Journal: IEEE Transactions on Very Large Scale Integration (VLSI) Systems
Issue number: 2
State: Published - 1 Feb 2024


Keywords

  • Countermeasure
  • hardware
  • masking
  • neural network (NN) accelerator
  • side-channel analysis (SCA)


