2016
DOI: 10.48550/arxiv.1603.01360
Preprint
Neural Architectures for Named Entity Recognition

Cited by 239 publications (359 citation statements)
References 0 publications
“…The downside of such models is the requirement of extensive feature engineering. Another method for NER is to use DL models (Ma and Hovy, 2016; Lample et al., 2016). These models not only select text spans containing named entities but also extract quality entity representations, which can be used as input for NEN.…”
Section: Related Work (mentioning)
confidence: 99%
“…Conditional Random Field. To model the label consistency, i.e., the emotion transfer in a dialogue, a linear-chain conditional random field is employed to yield the final emotion tag of each utterance. Following the description in (Lample et al., 2016), for an input set of utterances U = (u_1, u_2, ..., u_n) and a sequence of tag predictions y = (y_1, y_2, ..., y_n), with y_i ∈ {1, ..., K} (K is the number of emotion tags), the score of the sequence is defined as:…”
Section: Consistency Modeling (mentioning)
confidence: 99%
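The score formula itself is cut off in the excerpt above; for reference, the form given in Lample et al. (2016) sums the network's per-position tag scores and learned transition scores between adjacent tags (the input is written X there rather than U, as in the excerpt):

```latex
% Sequence score as defined in Lample et al. (2016).
% P_{i, y_i}: score of tag y_i at position i, output by the network;
% A_{y_i, y_{i+1}}: learned transition score from tag y_i to tag y_{i+1},
% with y_0 and y_{n+1} denoting the special start and end tags.
s(X, y) = \sum_{i=0}^{n} A_{y_i, y_{i+1}} + \sum_{i=1}^{n} P_{i, y_i}
```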
“…This system achieved encouraging results and demonstrated the feasibility of using deep learning methods to extract medication information from raw clinical texts. Several other research papers have proposed BiLSTM with conditional random fields for their NER model [18], [19]. However, this type of approach requires working with words and phrases that have meaning in their sequence.…”
Section: Related Work (mentioning)
confidence: 99%
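To make the BiLSTM-CRF pattern referenced in these excerpts concrete, here is a minimal sketch of the scoring side of such a tagger in PyTorch. The class and parameter names (BiLSTMTagger, embed_dim, hidden_dim) and the toy dimensions are illustrative assumptions, not taken from any cited system; start/end transitions and the forward algorithm needed for training are omitted for brevity.

```python
# Minimal BiLSTM-CRF scoring sketch (assumed names, not any cited paper's code):
# a BiLSTM emits per-token tag scores, and a learned transition matrix scores
# adjacent tag pairs; a tag sequence's score is the sum of both.
import torch
import torch.nn as nn


class BiLSTMTagger(nn.Module):
    def __init__(self, vocab_size, num_tags, embed_dim=100, hidden_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim // 2,
                              bidirectional=True, batch_first=True)
        self.emit = nn.Linear(hidden_dim, num_tags)  # per-token tag scores
        # trans[i, j]: score of transitioning from tag i to tag j
        self.trans = nn.Parameter(torch.zeros(num_tags, num_tags))

    def emissions(self, token_ids):
        # token_ids: (batch, seq_len) -> (batch, seq_len, num_tags)
        h, _ = self.bilstm(self.embed(token_ids))
        return self.emit(h)

    def sequence_score(self, token_ids, tags):
        # Score of a given tag sequence: emission scores of the chosen tags
        # plus transition scores between consecutive tags.
        e = self.emissions(token_ids)                              # (batch, seq, tags)
        emit_score = e.gather(2, tags.unsqueeze(-1)).squeeze(-1).sum(dim=1)
        trans_score = self.trans[tags[:, :-1], tags[:, 1:]].sum(dim=1)
        return emit_score + trans_score


if __name__ == "__main__":
    model = BiLSTMTagger(vocab_size=1000, num_tags=5)
    tokens = torch.randint(0, 1000, (2, 7))    # toy batch: 2 sentences, 7 tokens
    tags = torch.randint(0, 5, (2, 7))         # toy gold tag sequences
    print(model.sequence_score(tokens, tags))  # one score per sentence
```

In a full implementation, training maximizes the log-likelihood of the gold sequence, which additionally requires the forward algorithm to compute the log partition function over all tag sequences, and decoding is done with Viterbi.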