Proceedings of the 28th International Conference on Computational Linguistics 2020
DOI: 10.18653/v1/2020.coling-main.538
Free the Plural: Unrestricted Split-Antecedent Anaphora Resolution

Abstract: Now that the performance of coreference resolvers on the simpler forms of anaphoric reference has greatly improved, more attention is devoted to more complex aspects of anaphora. One limitation of virtually all coreference resolution models is the focus on single-antecedent anaphors. Plural anaphors with multiple antecedents, so-called split-antecedent anaphors (as in John met Mary. They went to the movies), have not been widely studied, because they are not annotated in ONTONOTES and are relatively infrequent i…

Cited by 9 publications (31 citation statements); references 32 publications (51 reference statements).
“…In this work, we propose an alternative: an extended version of LEA (Moosavi and Strube, 2016) that jointly evaluates single- and split-antecedent anaphors by explicitly representing plural entities. Yu et al. (2020a) introduced the first system to resolve all split-antecedent anaphors annotated in the ARRAU corpus. Their work focuses on the data-sparsity problem: split-antecedent anaphora resolution is aided by four auxiliary corpora created from a crowdsourced corpus and from other anaphoric annotations in the ARRAU corpus.…”
Section: Other Aspects of Anaphoric Interpretation
confidence: 99%
“…Thanks in part to the latest developments in deep neural network architectures and contextual word embeddings (e.g., ELMo (Peters et al., 2018) and BERT (Devlin et al., 2019)), the performance of models for single-antecedent anaphora resolution has greatly improved (Wiseman et al., 2016; Clark and Manning, 2016b; Lee et al., 2017; Kantor and Globerson, 2019; Joshi et al., 2020). Attention has therefore recently turned to more complex cases of anaphora, such as anaphora requiring some form of commonsense knowledge, as in the Winograd Schema Challenge (Rahman and Ng, 2012; Peng et al., 2015; Liu et al., 2017; Sakaguchi et al., 2020); pronominal anaphors that cannot be resolved purely using gender (Webster et al., 2018); bridging reference (Hou, 2020; Yu and Poesio, 2020); discourse deixis (Kolhatkar and Hirst, 2014; Marasović et al., 2017); and, finally, split-antecedent anaphora (Zhou and Choi, 2018; Yu et al., 2020a), plural anaphoric reference in which the two antecedents are not part of a single noun phrase.…”
Section: Introduction
confidence: 99%
“…In addition, more research has been carried out on aspects of anaphoric interpretation that go beyond identity anaphora but are covered by datasets such as ARRAU (Poesio et al., 2018; Uryupina et al., 2020). These include, e.g., bridging reference (Clark, 1977; Hou et al., 2018; Hou, 2020; Yu and Poesio, 2020; Kobayashi and Ng, 2021), discourse deixis (Webber, 1991; Marasović et al., 2017), or split-antecedent anaphora (Eschenbach et al., 1989; Vala et al., 2016; Zhou and Choi, 2018; Yu et al., 2020b).…” (* Work done when the author was a student at CMU.)
Section: Introduction
confidence: 99%