Delineating the effective use of self-supervised learning in single-cell genomics

Till Richter, Mojtaba Bahrami, Yufan Xia, David S. Fischer, Fabian J. Theis

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

Self-supervised learning (SSL) has emerged as a powerful method for extracting meaningful representations from vast, unlabelled datasets, transforming computer vision and natural language processing. In single-cell genomics (SCG), representation learning offers insights into complex biological data, especially with emerging foundation models. However, identifying scenarios in SCG where SSL outperforms traditional learning methods remains a nuanced challenge. Furthermore, selecting the most effective pretext tasks within the SSL framework for SCG is a critical yet unresolved question. Here we address this gap by adapting and benchmarking SSL methods in SCG, including masked autoencoders with multiple masking strategies and contrastive learning methods. Models trained on over 20 million cells were examined across multiple downstream tasks, including cell-type prediction, gene-expression reconstruction, cross-modality prediction and data integration. Our empirical analyses underscore the nuanced role of SSL, namely in transfer learning scenarios that leverage auxiliary data or analyse unseen datasets. Masked autoencoders outperform contrastive methods in SCG, diverging from computer vision trends. Moreover, our findings reveal the notable capabilities of SSL in zero-shot settings and its potential in cross-modality prediction and data integration. In summary, we study SSL methods in SCG on fully connected networks and benchmark their utility across key representation learning scenarios.
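To make the masked-autoencoder pretext task described in the abstract concrete, below is a minimal sketch, not the authors' implementation, of a fully connected masked autoencoder for single-cell gene expression with a random gene-masking strategy. All names, layer sizes and the masking ratio are illustrative assumptions; the paper benchmarks several masking strategies beyond this random one.

```python
# Minimal sketch (illustrative, not the paper's code): a fully connected
# masked autoencoder for gene-expression vectors. A random subset of genes
# is zeroed out per cell, and the model is trained to reconstruct them.
import torch
import torch.nn as nn

class MaskedAutoencoder(nn.Module):
    def __init__(self, n_genes: int, latent_dim: int = 64):
        super().__init__()
        # Fully connected encoder/decoder, matching the network family
        # studied in the paper (sizes here are assumptions).
        self.encoder = nn.Sequential(
            nn.Linear(n_genes, 512), nn.ReLU(),
            nn.Linear(512, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 512), nn.ReLU(),
            nn.Linear(512, n_genes),
        )

    def forward(self, x: torch.Tensor, mask_ratio: float = 0.5):
        # Randomly mask a fraction of genes in each cell.
        mask = torch.rand_like(x) < mask_ratio
        x_masked = x.masked_fill(mask, 0.0)
        recon = self.decoder(self.encoder(x_masked))
        # Reconstruction loss is computed only on the masked genes.
        loss = ((recon - x) ** 2)[mask].mean()
        return loss, recon

# Usage on a toy batch of 8 cells with 2,000 genes:
model = MaskedAutoencoder(n_genes=2000)
x = torch.rand(8, 2000)  # stand-in for normalized expression values
loss, _ = model(x)
loss.backward()
```

After pretraining, the encoder's latent representation would serve the downstream tasks named above (for example, cell-type prediction or data integration), either fine-tuned or frozen in a zero-shot setting.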

Original language: English
Article number: 6611
Journal: Nature Machine Intelligence
DOIs
State: Accepted/In press - 2024
