Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-1118

Rewarding Coreference Resolvers for Being Consistent with World Knowledge

Abstract: Unresolved coreference is a bottleneck for relation extraction, and high-quality coreference resolvers may produce an output that makes it a lot easier to extract knowledge triples. We show how to improve coreference resolvers by forwarding their input to a relation extraction system and reward the resolvers for producing triples that are found in knowledge bases. Since relation extraction systems can rely on different forms of supervision and be biased in different ways, we obtain the best performance, improv…
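The abstract describes a reward signal rather than a new architecture: the resolver's decisions are scored by whether the triples a downstream relation extractor produces can be found in a knowledge base. The Python sketch below illustrates that loop under loose assumptions; it is not the authors' implementation, and resolve_coreference, extract_triples, and KNOWLEDGE_BASE are hypothetical stand-ins for the paper's neural resolver, relation extraction system, and knowledge base.

import random

# Hypothetical knowledge base of (subject, relation, object) triples.
KNOWLEDGE_BASE = {
    ("Marie Curie", "born_in", "Warsaw"),
    ("Marie Curie", "field", "physics"),
}

def resolve_coreference(text):
    """Stand-in resolver: replaces the pronoun with a sampled antecedent.
    A real system would be a neural coreference resolver whose antecedent
    choice is the action being rewarded."""
    antecedent = random.choice(["Marie Curie", "Warsaw"])
    return text.replace("She", antecedent), antecedent

def extract_triples(resolved_text):
    """Stand-in relation extractor: pattern-matches a single relation."""
    triples = []
    if " was born in " in resolved_text:
        subj, rest = resolved_text.split(" was born in ", 1)
        triples.append((subj, "born_in", rest.rstrip(". ")))
    return triples

def kb_reward(triples):
    """Reward = fraction of extracted triples present in the knowledge base."""
    if not triples:
        return 0.0
    return sum(t in KNOWLEDGE_BASE for t in triples) / len(triples)

# Toy loop: sample a resolution, extract triples, score them against the KB.
# In the paper this reward drives a reinforcement-learning update of the
# resolver; here it is only printed, to keep the sketch self-contained.
text = "She was born in Warsaw."
for step in range(3):
    resolved, antecedent = resolve_coreference(text)
    reward = kb_reward(extract_triples(resolved))
    print(f"step={step} antecedent={antecedent!r} reward={reward:.2f}")

In the paper, this kind of reward would drive a policy-gradient update of the resolver's parameters rather than a simple print statement.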

Cited by 11 publications (15 citation statements). References: 22 publications.
“…Precision vs. recall: Precision often increases after finetuning whereas recall decreases. Similar effects have been reported for knowledge-base grounding of coreference resolvers (Aralikatte et al. 2019).…”
Section: Error Analysis (supporting)
confidence: 82%
“…Previous work has augmented coreference resolvers with syntax information (Wiseman, Rush, and Shieber 2016; Clark and Manning 2016a,b), external world knowledge (Rahman and Ng 2011; Emami et al. 2018; Aralikatte et al. 2019), and a variety of other linguistic features (Ng 2007; Haghighi and Klein 2009; Zhang, Song, and Song 2019). Similarly, Ponzetto and Strube (2006a,b) used features from SRL and external sources for a non-neural coreference resolver.…”
Section: Related Work: Augmented Coreference Resolution (mentioning)
confidence: 99%
“…One example of such data would be elevation maps [49], which can be integrated into MapLUR in a way similar to Experiment 4, where we provided the CNN with additional map image layers. Beyond this, there is a wide variety of methods to provide deep learning methods with additional information which holds great potential to further improve our results [4,39].…”
Section: Discussion (mentioning)
confidence: 99%
“…Considering CR approaches that use external knowledge, Aralikatte et al. (2019) solved the CR task by incorporating knowledge or information in reinforcement learning models. Emami et al. (2018) solved the binary-choice coreference resolution task by leveraging information retrieval results from search engines.…”
Section: Related Work (mentioning)
confidence: 99%