Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), 2021
DOI: 10.18653/v1/2021.acl-long.484

SENT: Sentence-level Distant Relation Extraction via Negative Training

Abstract: Distant supervision for relation extraction provides uniform bag labels for each sentence inside the bag, while accurate sentence labels are important for downstream applications that need the exact relation type. Directly using bag labels for sentence-level training will introduce much noise, thus severely degrading performance. In this work, we propose the use of negative training (NT), in which a model is trained using complementary labels indicating that "the instance does not belong to these complementary labels." …
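To make the NT objective concrete, the sketch below trains against a randomly sampled complementary label: rather than maximizing the probability of the (possibly noisy) bag-derived label, it minimizes the probability of a label the instance is asserted not to have. This is a minimal illustration assuming a standard PyTorch classifier; the function name and the uniform single-label sampling are assumptions, not the paper's exact training recipe.

```python
import torch
import torch.nn.functional as F

def negative_training_loss(logits, labels, num_classes):
    # Negative training (NT) sketch: for each instance, sample one
    # complementary label (any class other than the given, possibly
    # noisy, label) and push its probability toward zero:
    #   loss = -log(1 - p_complementary)
    probs = F.softmax(logits, dim=-1)  # (batch, num_classes)
    # A random offset in [1, num_classes) guarantees comp_labels != labels.
    offsets = torch.randint(1, num_classes, labels.shape, device=labels.device)
    comp_labels = (labels + offsets) % num_classes
    p_comp = probs.gather(1, comp_labels.unsqueeze(1)).squeeze(1)
    return -torch.log(1.0 - p_comp + 1e-8).mean()
```

The appeal, per the abstract, is that a uniformly sampled complementary label is rarely the true label, so each negative update injects far less noise than a positive update with a wrong bag label.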

Cited by 18 publications (19 citation statements)
References 35 publications
“…Supervised and distantly supervised relation extraction are oriented toward predefined relation types. Researchers have explored different network architectures (Zhang et al., 2018), training strategies (Ma et al., 2021), and external information (Zhang et al., 2019). […] (Fader et al., 2011) and clustering (Zhao et al., 2021) are used to deal with relations without pre-specified schemas. Different from them, we consider a more general scenario in which known and unknown relations are mixed in the input.…”
Section: Relation Extraction (mentioning)
confidence: 99%
“…Different optimization objectives, such as large-margin loss (Lin and Xu, 2019) and Gaussian mixture loss (Yan et al., 2020), are adopted to learn more discriminative representations that facilitate anomaly detection. Recently, Zhang et al. (2021) proposed learning an adaptive decision boundary (ADB) that serves as the basis for judging outliers.…”
Section: Classification With Rejection Option (mentioning)
confidence: 99%
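To make the rejection mechanism concrete, below is a minimal sketch of decision-boundary-based open-class detection in the spirit of ADB: each known class keeps a centroid and a radius, and an input whose feature falls outside every class ball is rejected as an outlier. The Euclidean centroid-plus-radius formulation and the function name are illustrative assumptions; the actual method learns the boundaries end-to-end during training.

```python
import numpy as np

def predict_with_rejection(feature, centroids, radii):
    # centroids: (K, d) per-class feature centroids (assumed learned);
    # radii: (K,) per-class decision-boundary radii (assumed learned).
    # Returns the nearest class if the feature lies inside that class's
    # boundary, otherwise -1 to mark the input as unknown/outlier.
    dists = np.linalg.norm(centroids - feature, axis=1)
    k = int(np.argmin(dists))
    return k if dists[k] <= radii[k] else -1
```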
“…of models to meet the requirement of diverse responses instead of MLE, such as MMI (Li et al., 2016), AdaLabel (Wang et al., 2021), and IAT (Zhou et al., 2021). Besides, several studies (Kulikov et al., 2019; Holtzman et al., 2020) proposed more advanced decoding strategies to alleviate the problem of generic responses.…”
Section: Introduction (mentioning)
confidence: 99%
“…However, inspired by negative training (Kim et al., 2019; Ma et al., 2021), we argue that it is also necessary to tell the dialogue model what not to say. To alleviate the problem of generic responses, He and Glass (2020) negatively update the parameters when identifying high-frequency responses.…”
Section: Introduction (mentioning)
confidence: 99%
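As a rough illustration of such a negative update, the sketch below negates the usual maximum-likelihood loss on a response flagged as generic, so the parameter update lowers rather than raises that response's likelihood. It assumes a Hugging Face-style causal LM whose forward pass returns a cross-entropy `.loss`; the function name and the upstream flagging of high-frequency responses are hypothetical, not He and Glass's exact procedure.

```python
import torch

def negative_update_step(model, optimizer, bad_response_ids):
    # bad_response_ids: (batch, seq_len) token ids of responses flagged
    # as high-frequency/generic. Negating the usual MLE loss lowers the
    # likelihood the model assigns to these responses.
    outputs = model(input_ids=bad_response_ids, labels=bad_response_ids)
    loss = -outputs.loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return outputs.loss.item()  # positive NLL, for logging
```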