Abstract
In this paper, the theory of probably approximately correct (PAC) learning is applied to Discrete-Time Cellular Neural Networks (DTCNNs). The Vapnik-Chervonenkis dimension of the DTCNN is determined. Considering two different operation modes of the network, an upper bound on the sample size required for reliable generalization of the DTCNN architecture is given.
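The paper's specific bound is not reproduced in this entry, but the relationship between VC dimension and sample size can be illustrated with the classical PAC bound of Blumer et al. (1989); the VC dimension value used below is purely hypothetical:

```python
import math

def pac_sample_bound(vc_dim, epsilon, delta):
    """Classical PAC sample-size upper bound (Blumer et al., 1989):
    m >= (4/eps) * (d * ln(12/eps) + ln(2/delta)) examples suffice so
    that, with probability at least 1 - delta, any consistent hypothesis
    from a class of VC dimension d has error at most eps."""
    return math.ceil(
        (4.0 / epsilon)
        * (vc_dim * math.log(12.0 / epsilon) + math.log(2.0 / delta))
    )

# Hypothetical example: a DTCNN hypothesis class with VC dimension 50,
# accuracy parameter eps = 0.1, confidence parameter delta = 0.05.
m = pac_sample_bound(vc_dim=50, epsilon=0.1, delta=0.05)
print(m)
```

Note that the bound grows linearly in the VC dimension, which is why determining the VC dimension of the DTCNN is the key step toward the sample-size result.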
Original language | English |
---|---|
Pages | 159-164 |
Number of pages | 6 |
Publication status | Published - 1994 |
Event | Proceedings of the 3rd IEEE International Workshop on Cellular Neural Networks and their Applications (CNNA-94) - Rome, Italy Duration: 18 Dec 1994 → 21 Dec 1994 |
Conference
Conference | Proceedings of the 3rd IEEE International Workshop on Cellular Neural Networks and their Applications (CNNA-94) |
---|---|
Location | Rome, Italy |
Period | 18/12/94 → 21/12/94 |