Analyzing the Sample Complexity of Self-Supervised Image Reconstruction Methods

Tobit Klug, Dogukan Atik, Reinhard Heckel

Publication: Contribution to journal › Conference article › Peer-reviewed

3 citations (Scopus)

Abstract

Supervised training of deep neural networks on pairs of clean images and noisy measurements achieves state-of-the-art performance for many image reconstruction tasks, but such training pairs are difficult to collect. Self-supervised methods enable training on noisy measurements only, without clean images. In this work, we investigate the cost of self-supervised training in terms of sample complexity for a class of self-supervised methods that enable the computation of unbiased estimates of the gradients of the supervised loss, including noise2noise methods. We show analytically that a model trained with such self-supervised training is as good as the same model trained in a supervised fashion, but self-supervised training requires more examples than supervised training. We then study self-supervised denoising and accelerated MRI empirically and characterize the cost of self-supervised training in terms of the number of additional samples required. We find that the performance gap between self-supervised and supervised training vanishes as a function of the number of training examples, at a problem-dependent rate, as predicted by our theory.
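
To illustrate the class of methods the abstract refers to, below is a minimal sketch (not the authors' code) of noise2noise-style self-supervised training for denoising: the network is trained to map one noisy observation of an image to a second, independent noisy observation of the same image. Because the second noise realization is zero-mean and independent of the input, the gradient of this loss is, in expectation, an unbiased estimate of the gradient of the supervised loss against the clean image (up to an additive constant). All names here (Denoiser, noise2noise_step) are illustrative placeholders, and the synthetic Gaussian-noise data is an assumption for the example only.

```python
# Minimal illustrative sketch of noise2noise-style self-supervised training.
import torch
import torch.nn as nn

class Denoiser(nn.Module):
    """Toy convolutional denoiser used only for illustration."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

def noise2noise_step(model, optimizer, y1, y2):
    """One self-supervised step on a pair of independent noisy
    measurements (y1, y2) of the same underlying clean image."""
    optimizer.zero_grad()
    # Supervised training would use the clean image in place of y2;
    # using y2 instead gives an unbiased gradient estimate in expectation.
    loss = ((model(y1) - y2) ** 2).mean()
    loss.backward()
    optimizer.step()
    return loss.item()

# Usage with synthetic data: two independent Gaussian-noise copies of an image.
model = Denoiser()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
clean = torch.rand(8, 1, 32, 32)              # stand-in clean images
y1 = clean + 0.1 * torch.randn_like(clean)    # first noisy observation
y2 = clean + 0.1 * torch.randn_like(clean)    # second, independent noisy observation
print(noise2noise_step(model, optimizer, y1, y2))
```

The paper's sample-complexity question concerns how many more such noisy pairs are needed, relative to clean/noisy supervised pairs, to reach the same reconstruction quality.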

Original language: English
Journal: Advances in Neural Information Processing Systems
Volume: 36
Publication status: Published - 2023
Event: 37th Conference on Neural Information Processing Systems, NeurIPS 2023 - New Orleans, United States
Duration: 10 Dec 2023 - 16 Dec 2023
