2020
DOI: 10.1609/aaai.v34i05.6237

Zero-Resource Cross-Lingual Named Entity Recognition

Abstract: Recently, neural methods have achieved state-of-the-art (SOTA) results in Named Entity Recognition (NER) tasks for many languages without the need for manually crafted features. However, these models still require manually annotated training data, which is not available for many languages. In this paper, we propose an unsupervised cross-lingual NER model that can transfer NER knowledge from one language to another in a completely unsupervised way without relying on any bilingual dictionary or parallel data. Ou…
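The abstract only describes the transfer setting, but the general recipe of zero-shot cross-lingual tagging is easy to picture. Below is a minimal, hedged sketch (not the paper's specific architecture): a tagger is trained on labeled source-language token embeddings and then applied to target-language embeddings projected into the same space. The toy data, the orthogonal projection W, and the logistic-regression tagger are all illustrative assumptions; in practice the embeddings would be pretrained and the alignment learned without supervision (e.g. adversarially).

```python
# Minimal sketch of zero-shot cross-lingual NER transfer (illustration only,
# not the exact model from the paper). Assumes source-language token
# embeddings with NER labels, an alignment matrix W that maps target-language
# embeddings into the source space, and unlabeled target tokens.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
dim, n_src, n_tgt = 50, 500, 20

# Toy stand-ins: real use would load pretrained monolingual embeddings.
src_X = rng.normal(size=(n_src, dim))      # source-language token vectors
src_y = rng.integers(0, 3, size=n_src)     # toy tags: 0=O, 1=B-PER, 2=B-LOC

# Orthogonal alignment matrix (here random; in practice learned unsupervised).
W, _ = np.linalg.qr(rng.normal(size=(dim, dim)))
tgt_X = rng.normal(size=(n_tgt, dim))      # target-language token vectors

# 1) Train a tagger on the labeled source language only.
tagger = LogisticRegression(max_iter=1000).fit(src_X, src_y)

# 2) Project target tokens into the source space and predict: no target
#    labels, bilingual dictionary, or parallel data are used at this step.
tgt_pred = tagger.predict(tgt_X @ W)
print(tgt_pred[:10])
```

The point mirrored from the abstract is that step 2 touches no target-language supervision of any kind; all learning happens on the source side.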

Cited by 35 publications (34 citation statements)
References 19 publications (29 reference statements)
“…Experimental results on three datasets show that our proposed approach sets a new state-of-the-art result in various evaluation metrics. In the future, we will apply our models on more tasks, such as information retrieval applications (Huang and Hu, 2009; Huang et al., 2003; Yin et al., 2013; Huang et al., 2005), sentiment analysis (Liu et al., 2007; Yu et al., 2012), learning from imbalanced or unlabeled datasets (Liu et al., 2006; Bari et al., 2019; Bari et al., 2020), and automatic chart question answering (Kim et al., 2020).…”
Section: Discussion (mentioning)
confidence: 99%
“…Following most existing works (Lample et al., 2016; Lin and Lu, 2018; Bari et al., 2019), Word Processor is proposed to achieve basic understandings over the words in the entity mentions. Given an input entity mention X_w = (t_1, ..., t_K) with K tokens, each token t_k is represented as [w_k; c_k].…”
Section: Word Processor (mentioning)
confidence: 99%
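The quoted passage represents each token t_k as the concatenation [w_k; c_k] of a word embedding and a character-level vector. The sketch below shows one common way to build such a representation, assuming a character BiLSTM for c_k; the vocabulary sizes, dimensions, and the BiLSTM itself are assumptions for illustration, not details taken from the cited work.

```python
# Hedged sketch of the [w_k; c_k] token representation quoted above: each
# token is the concatenation of a word embedding w_k and a character-level
# vector c_k (here the final hidden states of a character BiLSTM).
import torch
import torch.nn as nn

class TokenRepr(nn.Module):
    def __init__(self, n_words=1000, n_chars=100, w_dim=100, c_dim=25):
        super().__init__()
        self.word_emb = nn.Embedding(n_words, w_dim)
        self.char_emb = nn.Embedding(n_chars, c_dim)
        self.char_rnn = nn.LSTM(c_dim, c_dim, bidirectional=True, batch_first=True)

    def forward(self, word_ids, char_ids):
        # word_ids: (K,) token indices; char_ids: (K, L) character indices per token
        w = self.word_emb(word_ids)                  # (K, w_dim)
        _, (h, _) = self.char_rnn(self.char_emb(char_ids))
        c = torch.cat([h[0], h[1]], dim=-1)          # (K, 2*c_dim)
        return torch.cat([w, c], dim=-1)             # (K, w_dim + 2*c_dim) = [w_k; c_k]

K, L = 6, 12                                         # 6 tokens, 12 characters each
repr_layer = TokenRepr()
x = repr_layer(torch.randint(0, 1000, (K,)), torch.randint(0, 100, (K, L)))
print(x.shape)  # torch.Size([6, 150])
```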
“…In existing studies, researchers have begun to improve low-resource word representations via knowledge transfer from high-resource languages using bilingual lexicons [3]. Bari [4] proposed an unsupervised cross-lingual NER model that can transfer NER knowledge from one language to another in a completely unsupervised way, without relying on any bilingual dictionary or parallel data. Xie [5] proposed a method that finds translations based on bilingual word embeddings, enabling unsupervised transfer of named entity recognition models across languages.…”
Section: Introduction (mentioning)
confidence: 99%
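The approach attributed to Xie [5] above translates words through a bilingual embedding space. As a rough illustration (not a reproduction of [5]), the sketch below picks, for each source word, the nearest target word by cosine similarity and copies its NER label, yielding silver target-language training data; the toy vocabularies and random vectors are assumptions.

```python
# Rough sketch of word-by-word translation via bilingual word embeddings:
# nearest-neighbor lookup in a shared space, with NER labels copied over.
import numpy as np

rng = np.random.default_rng(1)
src_vocab = ["Obama", "visited", "Berlin"]
tgt_vocab = ["Obama_de", "besuchte", "Berlin_de", "und"]

# Toy aligned embeddings; in practice these would come from a bilingual
# embedding method trained without parallel corpora.
src_vecs = rng.normal(size=(len(src_vocab), 32))
tgt_vecs = rng.normal(size=(len(tgt_vocab), 32))

def translate(word):
    # Return the target word closest to `word` by cosine similarity.
    v = src_vecs[src_vocab.index(word)]
    sims = (tgt_vecs @ v) / (np.linalg.norm(tgt_vecs, axis=1) * np.linalg.norm(v))
    return tgt_vocab[int(np.argmax(sims))]

# Translate a labeled source sentence word by word, copying the tags to
# produce silver training data in the target language.
sentence = [("Obama", "B-PER"), ("visited", "O"), ("Berlin", "B-LOC")]
silver = [(translate(w), tag) for w, tag in sentence]
print(silver)
```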