2020
DOI: 10.1075/cal.27.09bud

Putting connections centre stage in diachronic Construction Grammar

Abstract: Construction Grammar conceptualizes language as a hierarchically organized network of constructions, defined as conventional pairings of form and meaning. Importantly, constructions are interlinked: vertical links connect lower-level constructions with their higher-level parents; horizontal links connect sister constructions on the same level. While the importance of vertical connections is well-established, horizontal connections have received only little attention in the theoretical literature so far. The po…

Cited by 16 publications (7 citation statements)
References 45 publications
“…Cognitive CxG, meanwhile, has developed a broad inventory of empirical methods to study the synchronic and diachronic use of constructions and draw inferences about their representation in speakers' minds. In particular, proponents of the framework draw on an ever expanding set of corpus-based methods, including measures of frequency, dispersion and association (Gries 2008), distributional semantic methods (Hilpert & Perek 2015) and (most recently) artificial neural networks (Budts & Petré 2020). These corpus approaches have been increasingly complemented by experimental paradigms, such as acceptability judgments (Gries & Wulff 2009), sorting tasks (Perek 2012), artificial language learning (Casenhiser & Goldberg 2005) and priming (Ungerer 2021).…”
Section: Methods
confidence: 99%
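The corpus-based measures listed in the quotation above (frequency, dispersion, association) can be made concrete with a toy example. The sketch below computes pointwise mutual information, one common association measure between a word and a construction; the function name and all counts are invented for illustration, not taken from the cited studies:

```python
import math

def pmi(f_xy: int, f_x: int, f_y: int, n: int) -> float:
    """Pointwise mutual information between a word x and a construction y.

    f_xy: co-occurrence count, f_x/f_y: marginal counts, n: corpus size.
    """
    return math.log2((f_xy * n) / (f_x * f_y))

# Hypothetical counts: "give" occurs 120 times in the ditransitive
# construction, 400 times overall; the construction occurs 900 times
# in a 1,000,000-word corpus.
print(round(pmi(120, 400, 900, 1_000_000), 2))  # → 8.38
```

A positive PMI indicates that the word and the construction co-occur more often than chance would predict, i.e. that they are attracted to each other.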
“…But nevertheless, ANNs may still constrain the way in which symbolic networks are constructed, for example by providing estimates of the connection strength between patterns that can then be represented with a symbolic architecture. This is illustrated by Budts and Petré's (2020) study, who provide one of the yet rare applications of ANNs in (Diachronic) Construction Grammar. Training their model on corpus data between 1580 and 1700, the authors simulate how the distributional profile of periphrastic do became increasingly similar to those of modal auxiliaries like will, can and may.…”
Section: Areas For Further Research
confidence: 99%
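The idea that the distributional profile of periphrastic *do* grew more similar to that of the modals can be illustrated with cosine similarity between context-count vectors. The vectors and context labels below are invented toy data, not Budts and Petré's actual measurements or method:

```python
import numpy as np

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity between two context-count vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical counts of three shared syntactic contexts
# (e.g. "_ + bare infinitive", "inversion", "negation with not"):
do_1600 = np.array([5.0, 1.0, 1.0])
do_1700 = np.array([9.0, 6.0, 7.0])
will    = np.array([8.0, 7.0, 6.0])

# The later profile of "do" sits closer to the modal:
print(cosine(do_1600, will) < cosine(do_1700, will))  # → True
```

A diachronic study would track such similarities across many time slices; the two snapshots here only show the direction of the drift.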
“…For this reason, predictive DSMs yield better-quality vectors (Baroni et al 2014) and are gradually replacing count-based DSMs (whose vectors are both long and sparse) in the detection of shifts in constructional meaning. The efficiency of neural language models in cognitive/constructional semantics has been exemplified by Budts (2020), Budts & Peter Petré (2020), Fonteyn (2021), Fonteyn & Manjavacas (2021), and Desagulier (2022).…”
Section: Data-driven Semantics
confidence: 99%
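The remark that count-based vectors are "both long and sparse" can be seen in a minimal count-based DSM built over a three-sentence toy corpus. The corpus and the sentence-level co-occurrence window are invented for illustration:

```python
import numpy as np

# Minimal count-based DSM: each row is a word's co-occurrence vector over
# the whole vocabulary -- as the vocabulary grows, these vectors become
# long and mostly zero (sparse).
corpus = [["the", "king", "ruled"], ["the", "queen", "ruled"], ["a", "dog", "barked"]]
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

M = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for c in sent[:i] + sent[i + 1:]:  # all other words in the sentence
            M[idx[w], idx[c]] += 1

sparsity = (M == 0).mean()
print(f"{len(vocab)}-dimensional vectors, {sparsity:.0%} zero entries")
```

Predictive (neural) DSMs instead learn short, dense vectors, which is why the quoted passage reports them yielding better-quality representations.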
“…Word embeddings are derived fully automatically from the distribution of words in a large corpus and implicitly encode semantic properties (like near-synonymity) as well as syntactic properties (like part of speech) (e.g. Mikolov et al. 2013; Hamilton et al. 2016; Budts and Petré 2020 for neural embeddings; Hilpert and Perek 2015 for non-neural embeddings; Dubossarsky et al. 2017; Tahmasebi et al. 2018 for a comparison between various types of embeddings). If the embeddings of all the words in a sentence are placed right next to each other, the sentence representation forms a 2D-grid not unlike a picture.…”
Section: A Sentence As a Picture
confidence: 99%
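The "sentence as a picture" idea in the quotation, stacking word embeddings side by side into a 2D grid, takes only a few lines to sketch. Random vectors stand in here for trained embeddings, and the dimensionality is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 50  # embedding dimensionality (hypothetical)

# Stand-in embeddings; a real model would supply trained vectors.
embeddings = {w: rng.normal(size=dim) for w in ["she", "may", "leave", "now"]}

sentence = ["she", "may", "leave", "now"]
grid = np.stack([embeddings[w] for w in sentence])  # one row per word

print(grid.shape)  # → (4, 50): a 2D "picture" of the sentence
```

Such a grid can then be fed to image-style architectures (e.g. convolutional networks), which is what makes the picture analogy more than a metaphor.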