Abstract
Formal verification of neural networks is a challenging problem due to the complexity and nonlinearity of neural networks. It has been shown that polynomial zonotopes can tightly enclose the output set of a neural network. Unfortunately, this tight enclosure comes at the cost of a more complex set representation, rendering subsequent operations, such as computing interval bounds and checking intersections, expensive to compute. To address this issue, we present a novel approach that restructures a polynomial zonotope so that it tightly encloses the original polynomial zonotope while drastically reducing its complexity. The restructuring is achieved by relaxing the exponents of the dependent factors of polynomial zonotopes and finding an appropriate approximation error. We demonstrate the applicability of our approach on output sets of neural networks, where we obtain tighter results in various subsequent operations, such as order reduction, zonotope enclosure, and range bounding.
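To make the kind of "subsequent operation" mentioned above concrete, the sketch below illustrates two standard operations on a polynomial zonotope (not the paper's exponent-relaxation algorithm itself): enclosing it with a zonotope by bounding each monomial's range, and then computing interval bounds. The representation `(c, G, E)` and the function names are assumptions chosen for this illustration, following the common convention that each dependent factor ranges over [-1, 1].

```python
import numpy as np

def zonotope_enclosure(c, G, E):
    """Enclose the polynomial zonotope
        { c + sum_i (prod_k a_k^E[k, i]) * G[:, i] : a_k in [-1, 1] }
    with a zonotope. A monomial whose exponents are all even ranges over
    [0, 1]; any other monomial ranges over [-1, 1]."""
    c_out = c.astype(float).copy()
    gens = []
    for i in range(G.shape[1]):
        if np.all(E[:, i] % 2 == 0):
            # monomial in [0, 1]: rewrite as 1/2 + (1/2) * b with b in [-1, 1]
            c_out += 0.5 * G[:, i]
            gens.append(0.5 * G[:, i])
        else:
            gens.append(G[:, i])
    return c_out, np.column_stack(gens)

def interval_bounds(c, G):
    """Tight interval hull of a zonotope: c +/- sum of |generators|."""
    r = np.sum(np.abs(G), axis=1)
    return c - r, c + r

# Example: x = a1, y = a2^2, so the exact range is [-1,1] x [0,1].
c = np.array([0.0, 0.0])
G = np.array([[1.0, 0.0],
              [0.0, 1.0]])
E = np.array([[1, 0],
              [0, 2]])
cz, Gz = zonotope_enclosure(c, G, E)
lb, ub = interval_bounds(cz, Gz)  # lb = [-1, 0], ub = [1, 1]
```

The enclosure is exact here because the monomials are independent; for entangled monomials the zonotope enclosure over-approximates, which is precisely where a restructuring that preserves dependencies pays off.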
| Original language | English |
|---|---|
| Pages (from-to) | 21304-21311 |
| Number of pages | 8 |
| Journal | Proceedings of the AAAI Conference on Artificial Intelligence |
| Volume | 38 |
| Issue number | 19 |
| DOIs | |
| State | Published - 25 Mar 2024 |
| Event | 38th AAAI Conference on Artificial Intelligence, AAAI 2024, Vancouver, Canada, 20–27 Feb 2024 |
Title: Exponent Relaxation of Polynomial Zonotopes and Its Applications in Formal Neural Network Verification