Modeling positional effects of regulatory sequences with spline transformations increases prediction accuracy of deep neural networks

Ziga Avsec, Mohammadamin Barekatain, Jun Cheng, Julien Gagneur

Research output: Contribution to journal › Article › peer-review


Abstract

Motivation: Regulatory sequences are not solely defined by their nucleic acid sequence but also by their relative distances to genomic landmarks such as the transcription start site, exon boundaries or the polyadenylation site. Deep learning has become the approach of choice for modeling regulatory sequences because of its strength in learning complex sequence features. However, modeling relative distances to genomic landmarks in deep neural networks has not been addressed.

Results: Here we developed spline transformation, a neural network module based on splines to flexibly and robustly model distances. Modeling distances to various genomic landmarks with spline transformations significantly increased state-of-the-art prediction accuracy of in vivo RNA-binding protein binding sites for 120 out of 123 proteins. We also developed a deep neural network for predicting human splice branchpoints based on spline transformations that outperformed the current best, already distance-based, machine learning model. Compared to piecewise linear transformation, as obtained by composition of rectified linear units, spline transformation yields higher prediction accuracy as well as faster and more robust training. As spline transformation can be applied to further quantities beyond distances, such as methylation or conservation, we foresee it as a versatile component in the genomics deep learning toolbox.
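
The abstract describes spline transformation only at a high level. As an illustration, below is a minimal NumPy sketch of the underlying idea: a distance to a genomic landmark is scaled, expanded into B-spline basis functions, and mapped to a positional effect through a learnable linear combination of those basis functions. The knot placement, distance scaling, number of basis functions and names (`bspline_basis`, `spline_transform`) are assumptions made for illustration and do not reproduce the authors' implementation.

```python
import numpy as np

def bspline_basis(x, knots, degree=3):
    """Evaluate B-spline basis functions at 1-D positions x (Cox-de Boor recursion).

    Returns an array of shape (len(x), len(knots) - degree - 1).
    """
    x = np.asarray(x, dtype=float)
    t = np.asarray(knots, dtype=float)
    # Degree-0 basis: indicator function of each knot interval.
    B = np.array([(x >= t[i]) & (x < t[i + 1]) for i in range(len(t) - 1)],
                 dtype=float).T
    # Make the basis right-continuous at the last knot by assigning x == t[-1]
    # to the rightmost non-degenerate interval.
    last = np.nonzero(np.diff(t) > 0)[0].max()
    B[x == t[-1], last] = 1.0
    # Raise the degree one step at a time.
    for k in range(1, degree + 1):
        Bk = np.zeros((len(x), len(t) - k - 1))
        for i in range(len(t) - k - 1):
            d1 = t[i + k] - t[i]
            d2 = t[i + k + 1] - t[i + 1]
            left = (x - t[i]) / d1 * B[:, i] if d1 > 0 else 0.0
            right = (t[i + k + 1] - x) / d2 * B[:, i + 1] if d2 > 0 else 0.0
            Bk[:, i] = left + right
        B = Bk
    return B

# Hypothetical spline transformation: a learnable linear combination of
# B-spline basis functions evaluated on the scaled distance. In a real network
# the weights would be trained jointly with the rest of the model.
DEGREE = 3
INNER_KNOTS = np.linspace(0.0, 1.0, 8)                        # assumed knot placement
KNOTS = np.concatenate([np.zeros(DEGREE), INNER_KNOTS, np.ones(DEGREE)])
N_BASIS = len(KNOTS) - DEGREE - 1

rng = np.random.default_rng(0)
weights = rng.normal(scale=0.1, size=N_BASIS)                 # learnable parameters

def spline_transform(distances, d_max=10_000.0):
    """Map raw distances (in bp) to a smooth, learnable positional effect."""
    d = np.clip(np.asarray(distances, dtype=float) / d_max, 0.0, 1.0)  # assumed scaling
    return bspline_basis(d, KNOTS, DEGREE) @ weights

# Example: positional effects for a few distances to some genomic landmark.
print(spline_transform([10, 250, 1_000, 9_500]))
```

Because the basis functions are smooth and local, the learned positional effect varies smoothly with distance, which relates to the faster and more robust training the abstract reports relative to piecewise linear transformations built from rectified linear units.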

Original language: English
Pages (from-to): 1261-1269
Number of pages: 9
Journal: Bioinformatics
Volume: 34
Issue number: 8
DOIs
State: Published - 15 Apr 2018
