2020
DOI: 10.1609/aaai.v34i04.6009
Weakly Supervised Sequence Tagging from Noisy Rules

Abstract: We propose a framework for training sequence tagging models with weak supervision consisting of multiple heuristic rules of unknown accuracy. In addition to supporting rules that vote on tags in the output sequence, we introduce a new type of weak supervision, called linking rules, that vote on how sequence elements should be grouped into spans with the same tag. These rules are an alternative to candidate span generators that require significantly more human effort. To estimate the accuracies of the rules and…
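
To make the two rule types concrete: a tagging rule emits one vote per token, while a linking rule emits one vote per adjacent token pair. The sketch below is a minimal, hypothetical illustration of that interface (made-up rule names, tags, and gazetteer; not the authors' implementation).

```python
# Hypothetical sketch: one tagging rule and one linking rule voting on a sentence.
# ABSTAIN means the rule offers no opinion for that position.
ABSTAIN = None

def gazetteer_rule(tokens, gazetteer=("Barack", "Obama")):
    """Tagging rule: vote PER for tokens found in a small name gazetteer."""
    return ["PER" if tok in gazetteer else ABSTAIN for tok in tokens]

def capitalization_link_rule(tokens):
    """Linking rule: one vote per adjacent pair, saying whether the two tokens
    should share the same tag (here: both are capitalized)."""
    return [tokens[i][0].isupper() and tokens[i + 1][0].isupper()
            for i in range(len(tokens) - 1)]

tokens = ["Barack", "Obama", "visited", "Paris", "."]
print(gazetteer_rule(tokens))            # ['PER', 'PER', None, None, None]
print(capitalization_link_rule(tokens))  # [True, False, False, False]
```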

Cited by 83 publications (112 citation statements)
References 32 publications
“…There are also some works that focus on redefining NER as a different problem to reduce the need for hand-labeled training data. For example, the linking rules of Safranchik et al. (2020) recognize entities through votes on whether adjacent elements in a sequence belong to the same class; Lin et al. propose "entity triggers", an effective proxy for human explanation, to encourage label-efficient learning of NER models.…”
Section: Related Work
confidence: 99%
“…The approach most closely related to this paper is Safranchik et al. (2020), which describes a similar weak supervision framework for sequence labelling based on an extension of HMMs called linked hidden Markov models. The authors introduce a new type of noisy rule, the linking rule, to determine how sequence elements should be grouped into spans with the same tag.…”
Section: Weak Supervision
confidence: 99%
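
One way to see why linking rules help at decoding time: a "link" vote between positions i and i+1 can be treated as evidence that the two positions share a tag. The toy decoder below is a sketch under assumed names and scores; it is not the linked-HMM estimation procedure from the paper (which also learns rule accuracies), it merely adds a bonus to same-tag transitions wherever a link was voted.

```python
import numpy as np

TAGS = ["O", "PER"]

def viterbi_with_links(emission_scores, link_votes, stay_bonus=2.0):
    """Toy Viterbi where a linking vote between positions t-1 and t adds a
    bonus to same-tag transitions, favoring keeping the two tokens in one span.
    emission_scores: (T, K) log-scores per token and tag (e.g. from tagging-rule votes).
    link_votes: T-1 booleans from a linking rule."""
    T, K = emission_scores.shape
    dp = emission_scores[0].copy()
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        step = np.zeros((K, K))                      # flat base transition scores
        if link_votes[t - 1]:
            step[np.arange(K), np.arange(K)] += stay_bonus
        scores = dp[:, None] + step                  # scores[i, j]: prev tag i -> tag j
        back[t] = scores.argmax(axis=0)
        dp = scores.max(axis=0) + emission_scores[t]
    path = [int(dp.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return [TAGS[k] for k in reversed(path)]

# Token-level log-scores for ["Barack", "Obama", "visited"]; "Obama" is ambiguous on its own.
emissions = np.log(np.array([[0.2, 0.8],
                             [0.6, 0.4],
                             [0.9, 0.1]]))
print(viterbi_with_links(emissions, link_votes=[True, False]))   # ['PER', 'PER', 'O']
print(viterbi_with_links(emissions, link_votes=[False, False]))  # ['PER', 'O', 'O']
```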
“…TagRuler implements atomic rules designed specifically for span annotation tasks. Following previous work on rule-based labeling models [9,19], we focus on capturing semantic, syntactic, and entity-type information of the spans. 2.2.1 Contextual similarity rules with neural embeddings.…”
Section: Synthesizer
confidence: 99%
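
As a rough illustration of what a contextual similarity rule could look like (toy vectors, anchor, and threshold are invented for this example; this is not TagRuler's actual implementation), a rule can vote a label for a span whenever its embedding is close, by cosine similarity, to the embedding of a span the annotator already labeled:

```python
import numpy as np

# Toy stand-in for a neural encoder: a fixed lookup table of 3-d vectors.
# In a real system these would come from a contextual embedding model.
TOY_VECTORS = {
    "london": np.array([0.9, 0.1, 0.0]),
    "paris":  np.array([0.8, 0.2, 0.1]),
    "runs":   np.array([0.0, 0.1, 0.9]),
}

def embed(token):
    return TOY_VECTORS.get(token.lower(), np.zeros(3))

def similarity_rule(token, anchor="London", label="LOC", threshold=0.95):
    """Vote `label` when the token's embedding is close to the anchor span's
    embedding by cosine similarity; abstain (None) otherwise."""
    a, b = embed(token), embed(anchor)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:
        return None  # abstain on tokens the encoder has never seen
    return label if float(a @ b) / denom >= threshold else None

print([similarity_rule(t) for t in ["Paris", "runs", "Madrid"]])  # ['LOC', None, None]
```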