Assessing the role of evolutionary information for enhancing protein language model embeddings

Kyra Erckert, Burkhard Rost

Research output: Contribution to journal › Article › peer-review


Abstract

Embeddings from protein Language Models (pLMs) are replacing evolutionary information from multiple sequence alignments (MSAs) as the most successful input for protein prediction. Is this because embeddings capture evolutionary information? We tested several approaches that explicitly incorporate evolutionary information into embeddings across a range of protein prediction tasks. While older pLMs (SeqVec, ProtBert) improved significantly through MSAs, the more recent pLM ProtT5 did not benefit. For most tasks, pLM-based methods outperformed MSA-based methods, and combining both even decreased performance for some tasks (e.g., intrinsic disorder prediction). We highlight the effectiveness of pLM-based methods and find limited benefit from integrating MSAs.
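One straightforward way to combine the two input types, in the spirit of the experiments described above, is to concatenate per-residue pLM embeddings with per-residue MSA-derived features (e.g., a position-specific scoring matrix) and feed the result to a small prediction head. The sketch below is illustrative only and not the paper's architecture; the dimensions (1024-dimensional ProtT5 embeddings, 20-column PSSM, 3-state secondary structure output) and the class name EmbeddingPlusMSAHead are assumptions made for the example.

```python
import torch
import torch.nn as nn

# Assumed dimensions: ProtT5 yields 1024-d per-residue embeddings;
# an MSA-derived PSSM adds 20 features per residue (one per amino acid).
EMB_DIM = 1024
PSSM_DIM = 20
NUM_CLASSES = 3  # e.g., 3-state secondary structure (helix, strand, other)


class EmbeddingPlusMSAHead(nn.Module):
    """Per-residue classifier over concatenated pLM embeddings and MSA features (illustrative)."""

    def __init__(self, emb_dim=EMB_DIM, msa_dim=PSSM_DIM, hidden=32, n_classes=NUM_CLASSES):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(emb_dim + msa_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, embeddings, msa_features):
        # embeddings:   (batch, length, emb_dim)  from a pLM such as ProtT5
        # msa_features: (batch, length, msa_dim)  e.g., a position-specific scoring matrix
        x = torch.cat([embeddings, msa_features], dim=-1)
        return self.net(x)  # (batch, length, n_classes) per-residue logits


# Toy usage with random tensors standing in for real inputs.
model = EmbeddingPlusMSAHead()
emb = torch.randn(1, 120, EMB_DIM)    # one protein of 120 residues
pssm = torch.randn(1, 120, PSSM_DIM)
logits = model(emb, pssm)
print(logits.shape)  # torch.Size([1, 120, 3])
```

Concatenation keeps the two feature sources separate until the first layer, so either input can be zeroed out or dropped to compare embedding-only, MSA-only, and combined setups on the same head.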

Original language: English
Article number: 20692
Journal: Scientific Reports
Volume: 14
Issue number: 1
DOIs
State: Published - Dec 2024

Keywords

  • Embeddings
  • Evolutionary information
  • Machine learning
  • Multiple sequence alignments
  • Protein language models
  • Protein structure prediction
  • Secondary structure

