Proceedings of the 28th International Conference on Computational Linguistics 2020
DOI: 10.18653/v1/2020.coling-main.355
CxGBERT: BERT meets Construction Grammar

Abstract: While lexico-semantic elements no doubt capture a large amount of linguistic information, it has been argued that they do not capture all information contained in text. This assumption is central to constructionist approaches to language, which argue that language consists of constructions, learned pairings of a form and a function or meaning that are either frequent or have a meaning that cannot be predicted from their component parts. BERT's training objectives give it access to a tremendous amount of lexico-se…

Cited by 13 publications (9 citation statements)
References 79 publications
“…So far, relatively few papers approached LM probing from a construction grammar perspective. Madabushi et al. (2020) probed for BERT's knowledge of constructions via a sentence pair classification task of predicting whether two sentences share the same construction. Their probe was based on data from Dunn (2017), who used an unsupervised algorithm to extract plausible constructions from corpora based on association strength.…”
Section: Linguistic Probing of LMs
confidence: 99%
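As a rough illustration of this kind of probe, the sketch below frames same-construction prediction as a BERT sentence-pair classification task. It assumes the Hugging Face transformers library, a bert-base-uncased checkpoint, and made-up example pairs rather than the Dunn (2017) data; it is a minimal sketch of the task setup, not the authors' implementation.

import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# Hypothetical sentence pairs; label 1 = same construction, 0 = different.
pairs = [
    ("She sneezed the napkin off the table.",
     "He pushed the plate across the counter.", 1),   # both caused-motion
    ("She sneezed the napkin off the table.",
     "The meeting lasted two hours.", 0),
]

for sent_a, sent_b, label in pairs:
    # BERT receives the pair as a single sequence: [CLS] A [SEP] B [SEP]
    inputs = tokenizer(sent_a, sent_b, return_tensors="pt")
    outputs = model(**inputs, labels=torch.tensor([label]))
    # During probing, outputs.loss would be minimized by fine-tuning;
    # the logits score the same-construction vs. different-construction classes.
    print(outputs.logits.softmax(dim=-1), outputs.loss.item())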
“…Other studies have approached the question by modifying BERT's training scheme. Tayyar Madabushi et al. (2020) train a BERT variant where the next sentence prediction task has been replaced with a same construction prediction task and find mixed results on downstream tasks. Levine et al. (2020) train a BERT variant by adding a new supersense prediction task, wherein the masked token's WordNet supersense is to be predicted, and find performance gains on a variety of meaning-related tasks, which shows that BERT's representations do not perfectly capture word senses.…”
Section: Word Sense BERTology
confidence: 99%
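The sentence-level objective described above can be pictured by reusing BERT's next-sentence-prediction head for a same-construction label. The sketch below, again assuming the Hugging Face transformers library and illustrative sentences, is only a schematic of that idea, not the CxGBERT training pipeline.

import torch
from transformers import BertTokenizer, BertForPreTraining

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForPreTraining.from_pretrained("bert-base-uncased")

# Hypothetical pair of sentences assumed to instantiate the same construction.
sent_a = "She sneezed the napkin off the table."
sent_b = "He pushed the plate across the counter."
inputs = tokenizer(sent_a, sent_b, return_tensors="pt")

# Reinterpret the NSP label as "same construction": BertForPreTraining uses
# 0 for a positive pair, so 0 here means the two sentences share a construction.
same_construction_label = torch.tensor([0])

# Masked-LM labels (normally derived from random token masking) are omitted;
# only the sentence-level head is illustrated here.
outputs = model(**inputs, next_sentence_label=same_construction_label)
print(outputs.seq_relationship_logits)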
“…A constructional approach to language focuses on symbolic form-meaning mappings that are potentially idiomatic. Previous work on computational CxG has explored how to discover potential constructions (Wible and Tsao, 2010; Forsberg et al., 2014; Dunn, 2017), the process of construction learning, and whether constructional information is implicitly encoded in models like BERT (Tayyar Madabushi et al., 2020).…”
Section: Exposure and Convergence
confidence: 99%