Proceedings of the 25th Conference on Computational Natural Language Learning 2021
DOI: 10.18653/v1/2021.conll-1.40
Tackling Zero Pronoun Resolution and Non-Zero Coreference Resolution Jointly

Abstract: Zero pronoun resolution aims at recognizing dropped pronouns and identifying their anaphoric mentions, while non-zero coreference resolution targets clustering mentions that refer to the same entity. Existing efforts often deal with the two problems separately despite their close essential correlations. In this paper, we investigate the possibility of jointly solving zero pronoun resolution and coreference resolution via a novel end-to-end neural model. Specifically, we design a gap-masked self-attenti…
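The abstract is truncated, but the named "gap-masked self-attention" suggests attention in which placeholder ("gap") positions for dropped pronouns are masked so ordinary tokens do not attend to them. The following is a minimal toy sketch of that idea in numpy, under that assumption; the function name, masking direction, and all details are illustrative and may differ from the paper's actual mechanism.

```python
import numpy as np

def gap_masked_self_attention(x, gap_positions):
    """Toy single-head self-attention in which 'gap' (zero-pronoun
    placeholder) positions are blocked as attention keys, so ordinary
    tokens never attend to the inserted gap tokens.

    x: (seq_len, d) token representations
    gap_positions: iterable of indices of gap placeholder tokens
    Returns (output, attention_weights).
    """
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)              # (seq_len, seq_len)
    key_mask = np.zeros(scores.shape[1], dtype=bool)
    key_mask[list(gap_positions)] = True
    scores[:, key_mask] = -1e9                 # block attention *to* gaps
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x, weights
```

After the softmax, the columns of the attention matrix that correspond to gap positions are effectively zero, so gap placeholders contribute nothing to the contextualized representations of the surrounding tokens.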

Citations: cited by 6 publications (4 citation statements).
References: 21 publications (33 reference statements).
“…Omitted pronouns in these languages, also called zero pronouns, are increasingly important in computational linguistics (e.g. Chen et al., 2021; Iida et al., 2006, 2015; Kong et al., 2019). This paper formalizes the notion of Topic Chains, introduced by Tsao (1977), and demonstrates that people omit pronouns when a certain kind of discourse salience is high.…”
Section: Introduction
confidence: 94%
“…In the field of MCR, notable attention has been directed towards research on two types of languages. One prominent line of investigation concerns pro-drop languages, such as Chinese (Kong and Ng, 2013; Song et al., 2020; Chen et al., 2021; Zhang et al., 2022), Italian (Iida and Poesio, 2011) and Arabic (Aloraini and Poesio, 2020, 2021). Another research direction involves the study of morphologically rich languages, such as German and Arabic (Roesiger and Kuhn, 2016; Aloraini and Poesio, 2021).…”
Section: Related Work
confidence: 99%
“…In recent years, resolving zero pronouns through computational approaches has been studied broadly in other pro-drop languages such as Japanese, Chinese and Korean, using rule-based models, machine learning, or deep learning: for instance, encoding zero pronouns and their candidate antecedents with LSTM, Attention or BERT (Chen et al., 2021), zero pronoun resolution with an attention-based neural network (Yin, Zhang, Zhang, Liu, & Wang, 2018), and ranking rules combined with machine learning (Isozaki & Hirao, 2003). To the best of our knowledge, Thai zero pronoun resolution with a computational approach has not yet been extensively explored.…”
Section: Translated by Google Translate
confidence: 99%
“…Their model yielded the best performance and surpassed the existing Chinese zero pronoun resolution baseline systems. The compelling point of this study is the attention mechanism, which improves performance by helping the model focus on the informative parts of the context that represent zero pronouns, whereas previous deep neural network methods encoded zero pronouns into the semantic vector space via additional elements and underutilized the context of zero pronouns. Another study focusing on Chinese zero pronoun resolution was done by Chen et al. (2021). The researchers investigated the possibility of solving zero pronoun resolution and coreference resolution jointly via a neural model, while most existing works at the time tried to solve these two problems separately.…”
confidence: 99%