Abstract

Single-cell transcriptomics has enabled the study of cellular heterogeneity in response to perturbations at the resolution of individual cells. However, scaling high-throughput screens (HTSs) to measure cellular responses to many drugs remains a challenge due to technical limitations and, more importantly, the cost of such multiplexed experiments. Transferring information from routinely performed bulk RNA HTSs is therefore required to enrich single-cell data meaningfully. We introduce chemCPA, a new encoder-decoder architecture for studying the perturbational effects of unseen drugs. We combine the model with an architecture surgery for transfer learning and demonstrate how training on existing bulk RNA HTS datasets can improve generalisation performance. Better generalisation reduces the need for extensive and costly screens at single-cell resolution. We envision that our proposed method will facilitate more efficient experiment designs through its ability to generate in-silico hypotheses, ultimately accelerating drug discovery.
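
To illustrate the kind of encoder-decoder model described in the abstract, the sketch below shows a minimal perturbation autoencoder in PyTorch, in which a learned drug embedding, scaled by dose, is added to the latent representation of a control cell before decoding. The module names, layer sizes, and the additive composition in latent space are illustrative assumptions for this sketch, not the published chemCPA implementation.

```python
# Minimal, illustrative sketch of a perturbation encoder-decoder
# (an assumption-based example, not the published chemCPA code).
import torch
import torch.nn as nn


class PerturbationAutoencoder(nn.Module):
    def __init__(self, n_genes: int, n_drugs: int, latent_dim: int = 64):
        super().__init__()
        # Encoder maps a gene-expression profile to a latent "basal" state.
        self.encoder = nn.Sequential(
            nn.Linear(n_genes, 256), nn.ReLU(), nn.Linear(256, latent_dim)
        )
        # Learned embedding per drug; a dose scalar scales its effect.
        self.drug_embedding = nn.Embedding(n_drugs, latent_dim)
        # Decoder maps the perturbed latent state back to gene expression.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(), nn.Linear(256, n_genes)
        )

    def forward(self, x, drug_idx, dose):
        z_basal = self.encoder(x)
        # Compose the basal state with a dose-scaled drug effect in latent space.
        z_perturbed = z_basal + dose.unsqueeze(-1) * self.drug_embedding(drug_idx)
        return self.decoder(z_perturbed)


# Example usage with random tensors (shapes only; no biological meaning).
model = PerturbationAutoencoder(n_genes=2000, n_drugs=100)
x = torch.randn(8, 2000)                # control expression profiles
drug_idx = torch.randint(0, 100, (8,))  # drug identities
dose = torch.rand(8)                    # normalised doses
x_pred = model(x, drug_idx, dose)       # predicted perturbed expression
```

In the transfer-learning setting mentioned above, a model of this form could first be trained on bulk RNA HTS data and then have its input and output layers adapted to the single-cell gene set while the shared latent modules are retained, in the spirit of architecture surgery; these details are again only an assumption of the sketch.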

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 35 - 36th Conference on Neural Information Processing Systems, NeurIPS 2022
Editors: S. Koyejo, S. Mohamed, A. Agarwal, D. Belgrave, K. Cho, A. Oh
Publisher: Neural information processing systems foundation
ISBN (electronic): 9781713871088
Publication status: Published - 2022
Event: 36th Conference on Neural Information Processing Systems, NeurIPS 2022 - New Orleans, United States
Duration: 28 Nov 2022 - 9 Dec 2022

Publication series

Name: Advances in Neural Information Processing Systems
Volume: 35
ISSN (Print): 1049-5258

Conference

Conference: 36th Conference on Neural Information Processing Systems, NeurIPS 2022
Country/Territory: United States
City: New Orleans
Period: 28/11/22 - 9/12/22
