TY - GEN
T1 - Convolutional Neural Networks with Analytically Determined Filters
AU - Kissel, Matthias
AU - Diepold, Klaus
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - In this paper, we propose a new training algorithm for Convolutional Neural Networks (CNNs) based on well-known training methods for neural networks with random weights. Our algorithm analytically determines the filters of the convolutional layers by solving a least squares problem using the Moore-Penrose generalized inverse. The resulting algorithm does not suffer from convergence issues, and the training time is drastically reduced compared to traditional CNN training using gradient descent. We validate our algorithm on several standard datasets (MNIST, FashionMNIST and CIFAR10) and show that CNNs trained with our method outperform previous approaches with random or unsupervisedly learned filters in terms of test prediction accuracy. Moreover, our approach is up to 25 times faster than training CNNs of equivalent architecture using a gradient-descent-based algorithm.
KW - Convolutional Neural Network
KW - Efficient Training
KW - Gradient-Free Training
KW - Pseudo-Inverse
UR - http://www.scopus.com/inward/record.url?scp=85140796267&partnerID=8YFLogxK
U2 - 10.1109/IJCNN55064.2022.9891906
DO - 10.1109/IJCNN55064.2022.9891906
M3 - Conference contribution
AN - SCOPUS:85140796267
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - 2022 International Joint Conference on Neural Networks, IJCNN 2022 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2022 International Joint Conference on Neural Networks, IJCNN 2022
Y2 - 18 July 2022 through 23 July 2022
ER -