Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/d15-1086

Extracting Relations between Non-Standard Entities using Distant Supervision and Imitation Learning

Abstract: Distantly supervised approaches have become popular in recent years as they allow training relation extractors without text-bound annotation, using instead known relations from a knowledge base and a large textual corpus from an appropriate domain. While state-of-the-art distant supervision approaches use off-the-shelf named entity recognition and classification (NERC) systems to identify relation arguments, discrepancies in domain or genre between the data used for NERC training and the intended domain for the …
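The core idea described in the abstract, generating training data by aligning knowledge-base relations with a raw corpus instead of using text-bound annotation, can be sketched as follows. This is a minimal illustration only; the knowledge-base tuples, sentences, and function names are hypothetical and not taken from the paper.

# Minimal sketch of distant-supervision labelling (hypothetical data and names,
# not the paper's implementation): any sentence mentioning both arguments of a
# known knowledge-base relation is treated as a positive training example.

kb_relations = [
    ("Aspirin", "treats", "headache"),
    ("Metformin", "treats", "type 2 diabetes"),
]

corpus = [
    "Aspirin is commonly taken to relieve a headache.",
    "Metformin lowers blood sugar in patients with type 2 diabetes.",
    "Aspirin was first synthesised in 1897.",
]

def distant_label(sentences, relations):
    """Return (sentence, subject, object, relation) tuples where both
    relation arguments co-occur in the sentence."""
    examples = []
    for sent in sentences:
        lowered = sent.lower()
        for subj, rel, obj in relations:
            if subj.lower() in lowered and obj.lower() in lowered:
                examples.append((sent, subj, obj, rel))
    return examples

if __name__ == "__main__":
    for ex in distant_label(corpus, kb_relations):
        print(ex)

Only the third sentence is left unlabelled, since it mentions a single relation argument; the first two become (noisily) labelled training examples without any manual annotation.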

Cited by 19 publications (14 citation statements), 2016–2023
References 19 publications
“…Entity types for relation extraction. Several studies have integrated entity type information into relation extraction, either coarse-grained (Hoffmann et al., 2011; Zhou et al., 2005) or fine-grained (Liu et al., 2014; Du et al., 2015; Augenstein et al., 2015; Vlachos and Clark, 2014; Ling and Weld, 2012) entity types. In contrast to most of this work, but similar to , we do not incorporate binary entity type values, but probabilistic outputs.…”
Section: Related Work (mentioning)
confidence: 99%
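The distinction the citing authors draw, feeding a relation classifier the NERC system's probability distribution over types rather than a hard binary type assignment, can be illustrated with a small sketch. The type inventory, names, and probability values below are hypothetical and only show the shape of the two encodings.

# Hypothetical sketch: entity type information for a relation classifier,
# encoded either as binary (one-hot) values or as probabilistic NERC outputs.

TYPE_INVENTORY = ["person", "organisation", "location", "drug"]

def binary_type_features(predicted_type):
    # Hard NERC decision: a single predicted type, uncertainty is discarded.
    return [1.0 if t == predicted_type else 0.0 for t in TYPE_INVENTORY]

def probabilistic_type_features(type_distribution):
    # Soft NERC output: the full distribution over types is kept as features.
    return [type_distribution.get(t, 0.0) for t in TYPE_INVENTORY]

if __name__ == "__main__":
    print(binary_type_features("drug"))
    print(probabilistic_type_features(
        {"drug": 0.55, "organisation": 0.30, "person": 0.15}))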
“…However, such a "pipeline" design ignores the dependencies between different subtasks and may suffer from error propagation between the tasks. Recent studies try to integrate entity extraction with relation extraction by performing global sequence labeling for both entities and relations [24, 32, 1], incorporating type constraints between relations and their arguments [44], or modeling factor graphs [47]. However, these methods require human-annotated corpora (cleaned and general) for model training and rely on existing entity detectors to provide entity mentions.…”
Section: Related Work (mentioning)
confidence: 99%
“…This assumes seeds are unambiguous and sufficiently frequent in the corpus, which requires careful seed selection by human [2]. Table 1: A study of type label noise on the three experiment datasets (NYT [43], Wiki-KBP [12], BioInfer [39]); (1): % entity mentions with multiple sibling entity types (e.g., actor, singer) in the given entity type hierarchy; (2): % relation mentions with multiple relation types.…”
Section: Introduction (mentioning)
confidence: 99%
“…Imitation learning (Schaal, 1999; Abbeel and Ng, 2004; Ross et al., 2011) is a popular instance of learning from demonstration where the algorithm observes a human expert perform a series of actions to accomplish the task and learns a policy that "imitates" the expert with the purpose of generalizing to unseen data. Imitation learning is increasingly being used in NLP (Vlachos and Clark, 2014; Augenstein et al., 2015; Beck et al., 2016; Goodman et al., 2016a,b). However, all these models focus on learning respective NLP models from the final supervision, e.g.…”
Section: Related Work (mentioning)
confidence: 99%
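As a rough illustration of the imitation-learning setup described in that excerpt, an expert labels the states the learner actually visits and the learner is retrained on the aggregated demonstrations, here is a minimal DAgger-style loop on a toy problem. The toy states, expert rule, and table-lookup learner are hypothetical and are not the paper's actual system.

import random

# Toy DAgger-style imitation learning sketch (hypothetical, not the paper's system).

STATES = list(range(10))

def expert_policy(state):
    """Hypothetical expert: go LEFT for small states, RIGHT otherwise."""
    return "LEFT" if state < 5 else "RIGHT"

def train(demonstrations):
    """Learn a trivial policy: the most frequent expert action per state."""
    by_state = {}
    for state, action in demonstrations:
        by_state.setdefault(state, []).append(action)
    return {s: max(set(a), key=a.count) for s, a in by_state.items()}

def rollout(policy, n_steps=20):
    """Visit states while acting with the current learned policy."""
    visited = []
    for _ in range(n_steps):
        state = random.choice(STATES)
        policy.get(state, random.choice(["LEFT", "RIGHT"]))  # learner acts
        visited.append(state)
    return visited

demonstrations = []           # aggregated (state, expert action) pairs
policy = {}
for iteration in range(3):    # DAgger-style iterations
    for state in rollout(policy):
        demonstrations.append((state, expert_policy(state)))  # expert labels visited states
    policy = train(demonstrations)

print(policy)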