Reviving autoencoder pretraining

You Xie, Nils Thuerey

Research output: Contribution to journal › Article › peer-review

Abstract

The pressing need for pretraining algorithms has been diminished by numerous advances in regularization, architectures, and optimizers. Despite this trend, we revisit the classic idea of unsupervised autoencoder pretraining and propose a modified variant that relies on a full reverse pass trained in conjunction with a given training task. This yields networks that are as invertible as possible and share mutual information across all constrained layers. We additionally establish links between singular value decomposition (SVD) and pretraining, and show how the SVD can be leveraged to gain insights into the learned structures. Most importantly, we demonstrate that our approach yields improved performance for a wide variety of relevant learning and transfer tasks, ranging from fully connected networks through residual neural networks to generative adversarial networks. Our results demonstrate that unsupervised pretraining has not lost its practical relevance in today's deep learning environment.
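
The abstract's central idea is to train a full reverse pass jointly with the original task, so that the shared layers become as invertible as possible. The following is a minimal PyTorch sketch of such a joint objective; the MLP layer sizes, tanh activations, weight sharing via transposed matrices, and the loss weight `lam` are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ReversibleMLP(nn.Module):
    """Sketch of a classifier whose linear layers are reused, transposed,
    for a full reverse pass (weights are shared between both directions)."""

    def __init__(self, sizes=(784, 256, 64, 10)):  # assumed sizes
        super().__init__()
        self.layers = nn.ModuleList(
            nn.Linear(a, b) for a, b in zip(sizes[:-1], sizes[1:])
        )

    def forward(self, x):
        # Standard forward pass for the given training task.
        for i, layer in enumerate(self.layers):
            x = layer(x)
            if i < len(self.layers) - 1:
                x = torch.tanh(x)
        return x

    def reverse(self, y):
        # Reverse pass through the same layers with transposed weights:
        # forward computes y = x W^T + b, so here we compute (y - b) W.
        for layer in reversed(self.layers):
            y = torch.tanh(F.linear(y - layer.bias, layer.weight.t()))
        return y

model = ReversibleMLP()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

def train_step(x, labels, lam=0.1):  # lam is an assumed loss weight
    logits = model(x)
    rec = model.reverse(logits)
    # Joint objective: task loss plus reconstruction loss of the reverse pass.
    loss = F.cross_entropy(logits, labels) + lam * F.mse_loss(rec, x)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

Because the reverse pass reuses the forward weights rather than introducing a separate decoder, the reconstruction term pushes each constrained layer toward (approximate) orthogonality, which is where the paper's connection to the singular value decomposition enters.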

Original language: English
Pages (from-to): 4587-4619
Number of pages: 33
Journal: Neural Computing and Applications
Volume: 35
Issue number: 6
DOIs
State: Published - Feb 2023

Keywords

  • Greedy layer-wise pretraining
  • Orthogonality
  • Transfer learning
  • Unsupervised pretraining
