2015
DOI: 10.15265/iy-2015-035

Clinical Natural Language Processing in 2014: Foundational Methods Supporting Efficient Healthcare

Abstract: Summary. Objective: To summarize recent research and present a selection of the best papers published in 2014 in the field of clinical Natural Language Processing (NLP). Method: A systematic review of the literature was performed by the two section editors of the IMIA Yearbook NLP section by searching bibliographic databases with a focus on NLP efforts applied to clinical texts or aimed at a clinical outcome. A shortlist of candidate best papers was first selected by the section editors before being peer-reviewe…

Cited by 27 publications (18 citation statements, published 2016–2023) | References 33 publications
“…For example, between 2007 and 2018, the number of PubMed records with "free text" or "unstructured text" more than tripled [2]. Advances in natural language processing and machine learning, and access to de-identified clinical datasets, have contributed to this increase [3].…”
Section: Introduction (mentioning)
confidence: 99%
“…The review by Meystre et al. [5] presents an excellent overview of health-related text processing and its applications up to 2007. The research presented in the present manuscript includes, for social media and clinical records, advances in fundamental NLP methods such as classification, concept extraction, and normalization published since 2008, mostly omitting what has been included in similar recent reviews [6][7][8], except when required to complete and highlight recent advances. For each data source, after reviewing advances in fundamental methods, we reviewed specific applications that capture the patient's perspective for specific conditions, treatments, or phenotypes.…”
Section: Introduction (mentioning)
confidence: 99%
“…For the applications, we built on the 2016 review by Demner-Fushman and Elhadad [7] and highlighted major achievements from 2013 to 2016 that are relevant to the patient's perspective focus of this review. The search and selection criteria used were similar to the ones used by Névéol and Zweigenbaum [8], from January 1st, 2013 through December 31st, 2016, resulting in 464 papers. A total of 62 papers focusing on clinical records were selected from this set.…”
Section: Introduction (mentioning)
confidence: 99%
“…We omit discussion of basic research recently reviewed by Névéol and Zweigenbaum [3], which is nonetheless still needed and ongoing. Some examples include exciting new approaches proposed in the context of community challenges: 2012 i2b2 event and time extraction [4], 2014 i2b2/UTHealth modeling of risk factors for heart disease [5], ShARe SemEval 2014 recognition and normalization of disorders [6,7], and the ShARe SemEval 2015 disorder and template filling shared tasks [8].…”
Section: Introduction (mentioning)
confidence: 99%
“…Since Névéol and Zweigenbaum provide information about methods [3], we only mention here that the methods in the included papers range from regular expressions, which dominate research in social-media text processing, to event extraction in a supervised setting. More recently, there has been steady progress in incorporating the principles of distributional semantics into NLP pipelines, and a shift towards more semantic parsing; however, more work in these areas is needed.…”
Section: Introduction (mentioning)
confidence: 99%