The World Wide Web Conference 2019
DOI: 10.1145/3308558.3313573

Learning Dual Retrieval Module for Semi-supervised Relation Extraction

Abstract: Relation extraction is an important task in structuring the content of text data, and it becomes especially challenging when learning with weak supervision, where only a limited number of labeled sentences are given and a large number of unlabeled sentences are available. Most existing work exploits unlabeled data based on the ideas of self-training (i.e., bootstrapping a model) and multi-view learning (e.g., ensembling multiple model variants). However, these methods either suffer from the issue of semantic drift, or…
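
The abstract contrasts the paper's approach with plain self-training, i.e. bootstrapping a model on its own confident predictions. As a point of reference, a minimal self-training loop over feature vectors might look like the sketch below; all names and the threshold are hypothetical, and this illustrates the baseline idea, not the paper's method:

```python
# Minimal self-training (bootstrapping) sketch for semi-supervised
# classification. Illustrative only; not the paper's DualRE method.
import numpy as np
from sklearn.linear_model import LogisticRegression

def self_train(X_labeled, y_labeled, X_unlabeled, rounds=5, threshold=0.9):
    """Grow the training set with the model's own high-confidence labels."""
    model = LogisticRegression(max_iter=1000)
    X, y = X_labeled, y_labeled
    for _ in range(rounds):
        model.fit(X, y)
        if len(X_unlabeled) == 0:
            break
        probs = model.predict_proba(X_unlabeled)
        keep = probs.max(axis=1) >= threshold   # trust only confident predictions
        if not keep.any():                      # nothing confident left to add
            break
        pseudo_y = model.classes_[probs[keep].argmax(axis=1)]
        X = np.vstack([X, X_unlabeled[keep]])
        y = np.concatenate([y, pseudo_y])
        X_unlabeled = X_unlabeled[~keep]        # consume the pseudo-labeled rows
    return model
```

Because the model's early mistakes are fed back into its own training set, this naive loop is prone to exactly the semantic drift the abstract mentions.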

Cited by 49 publications (54 citation statements)
References 36 publications
Citing publications span 2020–2024.

“…The model is optimized jointly by a perturbation-based loss and the training loss. • DualRE (Lin et al., 2019) leverages sentence retrieval as a dual task for relation extraction.…”
Section: Baselines and Evaluation Metrics (mentioning)
Confidence: 99%
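
The excerpt above summarizes the paper's core idea: pair the relation predictor with a sentence retriever and treat retrieval as a dual task. The following schematic is one reading of that loop under assumed interfaces; fit_predictor, fit_retriever, and the agreement-based selection rule are illustrative stand-ins, while the actual DualRE modules, objectives, and selection criteria are defined in the paper:

```python
# Schematic dual prediction/retrieval loop in the spirit of the excerpt.
# All interfaces here are assumptions for illustration.
from typing import Callable, List, Tuple

Sentence = str
Relation = str
Pair = Tuple[Sentence, Relation]

def dual_loop(
    fit_predictor: Callable[[List[Pair]], Callable[[Sentence], Relation]],
    fit_retriever: Callable[[List[Pair]], Callable[[Relation], List[Sentence]]],
    labeled: List[Pair],
    unlabeled: List[Sentence],
    rounds: int = 5,
) -> Callable[[Sentence], Relation]:
    predict = fit_predictor(labeled)            # module 1: sentence -> relation
    for _ in range(rounds):
        retrieve = fit_retriever(labeled)       # module 2: relation -> sentences
        # Promote an unlabeled sentence only when the two views agree:
        # the predictor labels it r AND the retriever returns it for r.
        agreed = [(s, predict(s)) for s in unlabeled
                  if s in retrieve(predict(s))]
        if not agreed:
            break
        labeled = labeled + agreed
        promoted = {s for s, _ in agreed}
        unlabeled = [s for s in unlabeled if s not in promoted]
        predict = fit_predictor(labeled)        # retrain on the grown set
    return predict
```

The retriever supplies a second view of the unlabeled data, so a pseudo-label is promoted only when both modules endorse it; presumably this dual check is what counters the semantic drift the abstract attributes to plain self-training.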
“…However, it is labor-intensive to obtain large amounts of manual annotations on a corpus. Low-resource relation extraction methods have gained attention recently (Levy et al., 2017; Tarvainen and Valpola, 2017; Lin et al., 2019; Li and Qian, 2020; Hu et al., 2020, 2021), since these methods require less labeled data, and deep neural networks can expand the limited labeled information by exploiting unlabeled data to iteratively improve performance. One major method is the self-training approach proposed by Rosenberg et al. (2005).…”
Section: Related Work (mentioning)
Confidence: 99%
“…Long-tailed IE: RE (Zeng et al., 2014; Peng et al., 2017; Song et al., 2018; Lin et al., 2019; Nan et al., 2020; Qu et al., 2020; Guo et al., 2020; Zhang et al., 2021c; Zheng et al., 2021; Ye et al., 2021; Bai et al., 2021), NER (Lample et al., 2016; Chiu and Nichols, 2016), and ED (Nguyen and Grishman, 2015; Huang et al., 2018) are mainstream IE tasks in NLP. For long-tailed IE, recent models (Lei et al., 2018; Zhang et al., 2019) approach the problem from the perspective of causal inference.…”
Section: Related Work (mentioning)
Confidence: 99%