2014
DOI: 10.1162/tacl_a_00197

A Joint Model for Entity Analysis: Coreference, Typing, and Linking

Abstract: We present a joint model of three core tasks in the entity analysis stack: coreference resolution (within-document clustering), named entity recognition (coarse semantic typing), and entity linking (matching to Wikipedia entities). Our model is formally a structured conditional random field. Unary factors encode local features from strong baselines for each task. We then add binary and ternary factors to capture cross-task interactions, such as the constraint that coreferent mentions have the same semantic typ…
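The factor design described in the abstract can be illustrated with a toy brute-force sketch: unary factors score each task's local decision, and a cross-task binary factor rewards coreferent mentions that share a semantic type. All mentions, scores, and domains below are invented for illustration; the paper's actual model uses learned features and approximate inference over a much larger space, not enumeration.

```python
from itertools import product

TYPES = ["PERSON", "ORG"]

# Unary log-scores for the semantic type of two mentions (invented numbers).
unary = [
    {"PERSON": 1.0, "ORG": 0.2},  # mention 0: "Barack Obama" (clearly a person)
    {"PERSON": 0.4, "ORG": 0.5},  # mention 1: "Obama" (locally ambiguous)
]

COREFERENT = True      # suppose coreference links mention 0 and mention 1
AGREEMENT_BONUS = 2.0  # binary factor: coreferent mentions share a type

def score(assignment):
    # Sum of unary factors plus the cross-task agreement factor.
    s = sum(unary[i][t] for i, t in enumerate(assignment))
    if COREFERENT and assignment[0] == assignment[1]:
        s += AGREEMENT_BONUS
    return s

# Brute-force MAP inference over the tiny joint space.
best = max(product(TYPES, repeat=2), key=score)
print(best)  # → ('PERSON', 'PERSON'): the binary factor pulls mention 1 toward PERSON
```

Without the agreement factor, mention 1 would be typed ORG on its unary score alone; the cross-task factor is what propagates the confident decision from mention 0.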

Cited by 206 publications (180 citation statements)
References 21 publications
“…We also report the entity extraction results on the OntoNotes 5.0 test set in Table 4. We compare our models with the existing feature-based models Ratinov and Roth (2009) and Durrett and Klein (2014), which both employ heavy feature engineering to bring in external knowledge. BiLSTM-CNN (Chiu and Nichols, 2016) employs a hybrid BiLSTM and Convolutional neural network (CNN) architecture and incorporates rich lexicon features derived from SENNA and DBPedia.…”
Section: Results
confidence: 99%
“…Regarding the selection of the set of tasks, our work is closest to (Durrett and Klein 2014; Singh et al. 2013). Durrett and Klein (2014) combine coreference resolution, entity linking (sometimes referred to as Wikification) and mention detection.…”
Section: Related Work
confidence: 99%
“…Entity Linking Durrett and Klein (2014) is the work that is closest to our approach, although not neural. In their approach they model interactions between the MD, CG and ED tasks jointly.…”
Section: Related Work
confidence: 99%
“…• Entity disambiguation (ED): (typically) a mix of useful coreference and coherence features together with a classifier determines the entity link. Durrett and Klein (2014) were the first to propose jointly modelling MD, CG and ED in a graphical model, and could show that each of those steps is interdependent and benefits from a joint objective. Other approaches only model MD and ED jointly (Nguyen et al., 2016; Kolitsas et al., 2018), thus these architectures depend on a CG step after mention detection.…”
Section: Introduction
confidence: 99%
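The MD → CG → ED pipeline described in the citation above can be sketched as three toy stages. Everything here is hypothetical: the mention detector, the alias table, and the disambiguation rule are invented stand-ins for the learned components a real system would use.

```python
def mention_detection(text):
    # Toy MD: treat capitalized tokens as mentions (a real system uses a tagger).
    return [w for w in text.split() if w[0].isupper()]

def candidate_generation(mention):
    # Toy CG: a hand-written alias table mapping surface forms to entity candidates.
    table = {"Obama": ["Barack_Obama", "Michelle_Obama"]}
    return table.get(mention, [])

def entity_disambiguation(mention, candidates):
    # Toy ED: pick the first candidate (a real system scores candidates in context).
    return candidates[0] if candidates else None

text = "Obama spoke today"
links = {m: entity_disambiguation(m, candidate_generation(m))
         for m in mention_detection(text)}
print(links)  # → {'Obama': 'Barack_Obama'}
```

The pipeline's weakness, as the quoted passage notes, is that each stage commits to its output before the next runs; the joint model of Durrett and Klein (2014) instead lets the ED and coreference evidence flow back into typing and mention decisions.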