Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining 2016
DOI: 10.1145/2939672.2939823

Multi-layer Representation Learning for Medical Concepts

Abstract: Learning efficient representations for concepts has been proven to be an important basis for many applications such as machine translation or document classification. Proper representations of medical concepts such as diagnosis, medication, procedure codes and visits will have broad applications in healthcare analytics. However, in Electronic Health Records (EHR) the visit sequences of patients include multiple concepts (diagnosis, procedure, and medication codes) per visit. This structure provides two types o…
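
The idea in the abstract, that diagnosis, procedure, and medication codes occurring in the same visit should end up with nearby vector representations, can be illustrated with a much simpler count-based sketch than the paper's neural model. Everything below (the toy visits, the code names, and the PPMI-plus-SVD recipe) is an assumption made for illustration, not the method the paper proposes.

```python
# Count-based sketch: embed medical codes from within-visit co-occurrence.
# NOT the paper's model; just an illustration that codes sharing visits
# should receive similar vectors. Visits and codes below are toy examples.
import numpy as np
from itertools import combinations

# Each visit is a set of medical codes (diagnoses, medications, procedures).
visits = [
    ["dx:401.9", "rx:lisinopril", "proc:93000"],                 # hypertension-style visit
    ["dx:401.9", "dx:272.4", "rx:atorvastatin"],                 # hypertension + hyperlipidemia
    ["dx:250.00", "rx:metformin", "proc:82947"],                 # diabetes-style visit
    ["dx:250.00", "dx:401.9", "rx:metformin", "rx:lisinopril"],  # mixed visit
]

codes = sorted({c for v in visits for c in v})
idx = {c: i for i, c in enumerate(codes)}
n = len(codes)

# Co-occurrence counts: two codes co-occur if they appear in the same visit.
cooc = np.zeros((n, n))
for v in visits:
    for a, b in combinations(set(v), 2):
        cooc[idx[a], idx[b]] += 1
        cooc[idx[b], idx[a]] += 1

# Positive PMI, followed by a truncated SVD, gives low-dimensional code vectors.
total = cooc.sum()
row = cooc.sum(axis=1, keepdims=True)
with np.errstate(divide="ignore", invalid="ignore"):
    pmi = np.log((cooc * total) / (row @ row.T))
ppmi = np.where(np.isfinite(pmi) & (pmi > 0), pmi, 0.0)

dim = 3  # tiny embedding dimension for a tiny toy vocabulary
U, S, _ = np.linalg.svd(ppmi)
embeddings = U[:, :dim] * np.sqrt(S[:dim])

for c in codes:
    print(c, np.round(embeddings[idx[c]], 3))
```

In this toy example, codes that repeatedly share visits (such as the hypertension code and lisinopril) should end up with more similar vectors than codes that never co-occur, which is the intuition the abstract builds on.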

Cited by 380 publications (303 citation statements)
References 23 publications

“…There are two key differences between all these close previous work [3,4,14,25] and ours. First, unlike these past approaches, we tackle the readability of word senses and polysemy problems by learning document representations that leverage semantics inventoried in both text corpora and knowledge resources through fine-grained elements including words and concepts in a joint learning process.…”
Section: Neural Approaches Empowered By Knowledge Resources For…
Citation type: contrasting
Confidence: 64%

“…The second and recent line of work aims at refining word embedding using relational constraints to better discriminate word senses by simultaneously learning the concept representations and inferring word senses, and accordingly tackling the polysemy issue [3,4,14,25]. Mancini et al [14] simultaneously learn embeddings for both words and their senses via a semantic network based on the CBOW architecture.…”
Section: Neural Approaches Empowered By Knowledge Resources For…
Citation type: mentioning
Confidence: 99%
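
For readers unfamiliar with the CBOW architecture referenced in the statement above, the sketch below shows its basic objective: predict a centre token from the average of its context embeddings. It deliberately leaves out the joint word-and-sense modelling attributed to Mancini et al. [14]; the toy vocabulary, embedding size, and learning rate are assumptions for illustration only.

```python
# Minimal CBOW training step in NumPy with a full softmax (fine for a toy vocabulary).
import numpy as np

rng = np.random.default_rng(0)
vocab = ["acute", "renal", "failure", "chronic", "kidney"]
V, D = len(vocab), 8                        # vocabulary size, embedding dimension
W_in = rng.normal(scale=0.1, size=(V, D))   # input (context) embeddings
W_out = rng.normal(scale=0.1, size=(V, D))  # output (target) embeddings
lr = 0.1

# One toy training pair from the phrase "acute renal failure":
# predict the centre token "renal" (index 1) from its two neighbours.
context_ids, target_id = [0, 2], 1

for step in range(100):
    h = W_in[context_ids].mean(axis=0)      # averaged context vector, shape (D,)
    scores = W_out @ h                      # logits over the vocabulary, shape (V,)
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()
    loss = -np.log(probs[target_id])        # cross-entropy for the target token

    d_scores = probs.copy()
    d_scores[target_id] -= 1.0              # gradient of the loss w.r.t. the logits
    grad_h = W_out.T @ d_scores             # gradient w.r.t. the averaged context
    W_out -= lr * np.outer(d_scores, h)     # update target embeddings
    W_in[context_ids] -= lr * grad_h / len(context_ids)  # spread update over the contexts

print("final loss:", round(float(loss), 4))
```

The loss decreases as the context embeddings and the target embedding align, which is the mechanism that lets CBOW-style models place related words (and, in [14], senses) close together.
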
“…kidney failure). Models that utilize such representation have shown higher prediction performance than previous models that do not [20,21].…”
Section: Introduction
Citation type: mentioning
Confidence: 99%

“…Recently, there has been considerable attention in the application of neural networks to represent medical concepts as multi-dimensional and continuous vectors [20,21]. A process called contextual embedding, commonly used in natural language processing, maps each word from a corpus of text to a hyper-dimensional space where similar words in terms of meaning and/or distributed usage would be located nearby (e.g., short cosine distance).…”
Section: Introduction
Citation type: mentioning
Confidence: 99%
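
The "located nearby (e.g., short cosine distance)" notion in the statement above can be made concrete with a small sketch: given embedding vectors for a few medical concepts, rank neighbours by cosine similarity. The concept names and the four-dimensional vectors are made-up placeholders, not the output of any trained model.

```python
# Rank medical-concept neighbours by cosine similarity (1 - cosine distance).
import numpy as np

concepts = ["kidney failure", "renal insufficiency", "hypertension", "fractured wrist"]
# Hypothetical 4-dimensional embeddings; a real model would use hundreds of dimensions.
E = np.array([
    [0.9, 0.1, 0.0, 0.2],
    [0.8, 0.2, 0.1, 0.1],
    [0.4, 0.7, 0.1, 0.0],
    [0.0, 0.1, 0.9, 0.3],
])

def cosine_sim(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

query = 0  # "kidney failure"
sims = [(concepts[j], cosine_sim(E[query], E[j]))
        for j in range(len(concepts)) if j != query]
for name, s in sorted(sims, key=lambda t: -t[1]):
    print(f"{concepts[query]} ~ {name}: similarity = {s:.3f}, cosine distance = {1 - s:.3f}")
```

With these toy vectors, "renal insufficiency" ranks closest to "kidney failure" and "fractured wrist" ranks farthest, mirroring the nearest-neighbour behaviour the citing authors describe.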