From BERT's Point of View: Revealing the Prevailing Contextual Differences

Carolin M. Schuster, Simon Hegelich

Publication: Contribution to book/report › Conference paper › Peer-reviewed

1 citation (Scopus)

Abstract

Though successfully applied in research and industry, large pretrained language models of the BERT family are not yet fully understood. While much research in the field of BERTology has tested whether specific knowledge can be extracted from layer activations, we invert the popular probing design to analyze the prevailing differences and clusters in BERT's high-dimensional space. By extracting coarse features from masked token representations and predicting them with probing models that have access to only partial information, we can apprehend the variation from 'BERT's point of view'. By applying our new methodology to different datasets, we show how much of the differences can be described by syntax, but furthermore how they are to a great extent shaped by the most simple positional information.

Original language: English
Title: ACL 2022 - 60th Annual Meeting of the Association for Computational Linguistics, Findings of ACL 2022
Editors: Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Publisher: Association for Computational Linguistics (ACL)
Pages: 1120-1138
Number of pages: 19
ISBN (electronic): 9781955917254
Publication status: Published - 2022
Event: 60th Annual Meeting of the Association for Computational Linguistics, ACL 2022 - Dublin, Ireland
Duration: 22 May 2022 to 27 May 2022

Publication series

Name: Proceedings of the Annual Meeting of the Association for Computational Linguistics
ISSN (print): 0736-587X

Conference

Conference: 60th Annual Meeting of the Association for Computational Linguistics, ACL 2022
Country/Territory: Ireland
City: Dublin
Period: 22/05/22 to 27/05/22
