Abstract
Recurrent neural networks (RNNs) are an integral part of modern machine learning and are well suited to tasks on sequential data. However, long sequences remain a challenge for these models due to the well-known exploding/vanishing gradient problem. In this work, we build on recent approaches that interpret the gradient problem as an instability of the underlying dynamical system. We extend previous approaches to systems with top-down feedback, which is abundant in biological neural networks. We prove that the resulting system is stable for arbitrary depth and width and confirm this empirically. We further show that its performance is on par with long short-term memory (LSTM) models and related approaches on standard benchmarks.
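To make the dynamical-systems view of the gradient problem concrete, the sketch below (not taken from the paper) shows how the backpropagated gradient of a vanilla tanh RNN is a product of per-step Jacobians, so its norm shrinks or grows roughly exponentially with sequence length depending on the recurrent weight scale. The hidden size, sequence length, and weight scales are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 32, 200  # hidden size and sequence length (illustrative choices)

def gradient_norm(scale):
    """Spectral norm of d h_T / d h_0 for the toy recurrence h_{t+1} = tanh(W h_t),
    accumulated as the product of per-step Jacobians J_t = diag(1 - h_{t+1}^2) W."""
    W = scale * rng.standard_normal((n, n)) / np.sqrt(n)
    h = rng.standard_normal(n)
    J = np.eye(n)
    for _ in range(T):
        h = np.tanh(W @ h)
        J = (np.diag(1.0 - h**2) @ W) @ J  # chain rule through one time step
    return np.linalg.norm(J, 2)

# Small weight scales make the product vanish, large ones make it explode.
for scale in (0.5, 1.0, 1.5):
    print(f"weight scale {scale}: ||dh_T/dh_0|| ~ {gradient_norm(scale):.3e}")
```

This is only the standard unstable baseline; the paper's contribution is a recurrent system with top-down feedback for which such Jacobian products provably remain stable.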
| Original language | English |
| --- | --- |
| Pages (from-to) | 880-894 |
| Number of pages | 15 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 189 |
| State | Published - 2022 |
| Externally published | Yes |
| Event | 14th Asian Conference on Machine Learning, ACML 2022 - Hyderabad, India (12 Dec 2022 → 14 Dec 2022) |
Keywords
- Dynamical Systems
- Gradient Stability
- Recurrent Neural Networks