Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
DOI: 10.18653/v1/2020.emnlp-main.516
Simple and Effective Few-Shot Named Entity Recognition with Structured Nearest Neighbor Learning

Abstract: We present a simple few-shot named entity recognition (NER) system based on nearest neighbor learning and structured inference. Our system uses a supervised NER model trained on the source domain as a feature extractor. Across several test domains, we show that a nearest neighbor classifier in this feature space is far more effective than the standard meta-learning approaches. We further propose a cheap but effective method to capture the label dependencies between entity tags without expensive CRF training. W…
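The abstract's core idea, a nearest neighbor classifier over token features extracted by a source-domain NER model, can be sketched as follows. This is a minimal illustration: the 2-d features, tag set, and distance choice are toy assumptions, not taken from the paper.

```python
import numpy as np

def nn_classify(query_feats, support_feats, support_labels):
    """Tag each query token with the label of its nearest support token
    (squared Euclidean distance in the extracted feature space)."""
    dists = (
        np.sum(query_feats ** 2, axis=1, keepdims=True)
        - 2.0 * query_feats @ support_feats.T
        + np.sum(support_feats ** 2, axis=1)
    )
    return [support_labels[i] for i in dists.argmin(axis=1)]

# Toy 2-d "features"; in the paper these would come from a supervised
# NER encoder trained on the source domain.
support = np.array([[0.0, 1.0], [1.0, 0.0]])
labels = ["O", "B-PER"]
query = np.array([[0.9, 0.1], [0.1, 0.8]])
print(nn_classify(query, support, labels))  # ['B-PER', 'O']
```

No episodic meta-training is needed here: the support set is just the few labeled target-domain tokens, which is what makes the approach cheap to apply.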

Cited by 116 publications (155 citation statements)
References 12 publications
“…In order to consider label dependencies that are essential in slot tagging tasks (Huang et al, 2015), Hou et al (2020) proposed a collapsed dependency transfer (CDT) mechanism by simulating transition scores for the target domain from transition probabilities among BIO labels in the source domain, outperforming previous methods on slot filling by a large margin. Yang and Katiyar (2020) further explored the transition probability by evenly distributing the collapsed transition scores to the target domain to maintain a valid distribution. However, this simulation is noisy and the difference between the source and target domains can result in biased transition probabilities.…”
Section: Metric Learning in Language Understanding
confidence: 99%
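The even redistribution of collapsed transition scores described in this statement can be illustrated for a single row of the transition matrix. The probabilities and entity types below are invented for illustration; only the splitting scheme reflects the cited idea.

```python
def expand_o_row(p_o_to_o, p_o_to_b, entity_types):
    """Split the collapsed O -> B probability mass evenly over the
    concrete B-X tags of the target domain, so the expanded row
    remains a valid distribution."""
    share = p_o_to_b / len(entity_types)
    row = {"O": p_o_to_o}
    for t in entity_types:
        row[f"B-{t}"] = share
    return row

# Collapsed source-domain estimate: from O, stay O with 0.8 or start
# some entity with 0.2; the target domain has two entity types.
row = expand_o_row(0.8, 0.2, ["PER", "LOC"])
print(row)  # {'O': 0.8, 'B-PER': 0.1, 'B-LOC': 0.1}
```

Because each collapsed probability is divided equally among the concrete tags it expands to, every row still sums to one; the criticism quoted above is that these evenly split scores inherit any source/target mismatch in the original estimates.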
“…Few-shot Sequence Labeling In recent years, several works (Fritzler et al, 2019; Hou et al, 2020; Yang and Katiyar, 2020) have addressed few-shot named entity recognition with sequence labeling methods. Fritzler et al (2019) applied a vanilla CRF directly in the few-shot scenario.…”
Section: Related Work
confidence: 99%
“…Hou et al (2020) introduced a collapsed dependency transfer (CDT) mechanism into the CRF, which learns label dependency patterns over a set of task-agnostic abstract labels and uses these patterns as transition scores for novel labels. Yang and Katiyar (2020) train their model on the training data in a standard supervised manner and then use prototypical networks and CDT for prediction at inference time. Unlike these methods, which learn the transition scores by optimization, we build a network that generates the transition scores from the label prototypes.…”
Section: Related Work
confidence: 99%
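The prototype-based transition generation mentioned in this statement could, for instance, take a bilinear form over pairs of label prototypes. The shapes, random weights, and the bilinear parameterization below are assumptions for illustration, not the cited work's exact network.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_labels = 4, 3
prototypes = rng.normal(size=(n_labels, d))  # one vector per label
W = rng.normal(size=(d, d))                  # weights of the generating network
# score(i -> j) computed as a bilinear function of the two prototypes,
# so new labels get transition scores without re-optimizing a CRF
transitions = prototypes @ W @ prototypes.T
print(transitions.shape)  # (3, 3)
```

The appeal of generating rather than learning transitions is that adding a novel label only requires its prototype, not retraining.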
“…NER has been addressed in several works. In (Yang and Katiyar, 2020), the task of interest consists of recognizing one class of named entities, for tag-set extension or domain transfer. In our work, we extend the N-way K-shot setting to structured prediction.…”
Section: Related Work
confidence: 99%