Proceedings of the 29th ACM International Conference on Information & Knowledge Management (CIKM 2020)
DOI: 10.1145/3340531.3411864
LSAN: Modeling Long-term Dependencies and Short-term Correlations with Hierarchical Attention for Risk Prediction

Abstract: Risk prediction using electronic health records (EHR) is a challenging data mining task due to the two-level hierarchical structure of EHR data. EHR data consist of a set of time-ordered visits, and within each visit, there is a set of unordered diagnosis codes. Existing approaches focus on modeling temporal visits with deep neural network (DNN) techniques. However, they ignore the importance of modeling diagnosis codes within visits, and a lot of task-unrelated information within visits usually leads to unsat…

Cited by 28 publications (9 citation statements). References 23 publications.
“…For instance, GRAM [7], KAME [25], and G-BERT [30] leveraged the attention mechanism to integrate domain knowledge into disease or medication code representations for better performance. Retain [8], Dipole [24], Timeline [3], and LSAN [35] all introduced attention mechanisms to model disease progression by considering the dependencies among visits, providing some interpretable insights. In addition, GCT [10] was equipped with an advanced attention network, i.e., the Transformer [33], to build correlations between medical codes within each visit based on automatically learned attention weights.…”
Section: Attention Mechanism in Health Informatics
confidence: 99%
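The statement above describes attention mechanisms that weight a patient's visit history to model disease progression. A minimal NumPy sketch of the core operation, scaled dot-product attention over visit embeddings, is shown below; the `visit_attention` function, the embedding dimensions, and the task-specific query vector are illustrative assumptions, not the exact formulation of any cited model.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def visit_attention(visits, query):
    """Attention-weighted summary of a patient's visit embeddings.

    visits: (T, d) array, one embedding per time-ordered visit.
    query:  (d,) array, e.g. a task-specific context vector (hypothetical).
    Returns the (d,) patient representation and the (T,) attention weights.
    """
    # scaled dot-product scores, one per visit
    scores = visits @ query / np.sqrt(visits.shape[1])
    weights = softmax(scores)
    # weighted sum of visit embeddings
    return weights @ visits, weights

# toy example: 4 visits, 8-dimensional embeddings
rng = np.random.default_rng(0)
visits = rng.normal(size=(4, 8))
query = rng.normal(size=8)
rep, w = visit_attention(visits, query)
```

The attention weights `w` sum to one, so they can be inspected to see which visits drive the prediction — the source of the "interpretable insights" the statement mentions.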
“…1, EHR systems in hospitals accumulate complex temporal and heterogeneous sequences. Existing studies in health informatics have widely utilized the temporal sequential records from EHRs to solve healthcare problems such as predicting disease progression [9], [29], [40], [35], medication recommendation [38], [31], [18], and clinical trial recruitment [5], [37]. However, most studies, such as T-LSTM [4], MNN [29], and LSAN [35], mainly focused on modeling the temporal dependencies across multiple visits of a homogeneous sequence, such as the historical disease sequence.…”
Section: Sequence Modeling in Health Informatics
confidence: 99%
“…• LSAN [27]: an attention-based model that uses a Transformer to capture global information and a CNN to capture local information. • MTL Models: we develop the MTL version of each of the aforementioned neural network-based models by employing task-specific attention and a decoder over the output generated by these models.…”
confidence: 99%
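The citing authors summarize LSAN as a two-pathway design: a Transformer branch for global (long-term) dependencies and a CNN branch for local (short-term) correlations. A minimal NumPy sketch of that idea is below — `self_attention`, the window-averaging `local_conv`, and the concatenation in `fuse` are simplified stand-ins (no learned weights), not the paper's actual architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    """Global pathway: every visit attends to every other visit."""
    d = x.shape[1]
    attn = softmax(x @ x.T / np.sqrt(d))  # (T, T) attention matrix
    return attn @ x                       # (T, d) globally mixed features

def local_conv(x, width=3):
    """Local pathway: average neighbouring visits in a sliding window
    (a stand-in for a learned 1-D convolution)."""
    T, _ = x.shape
    pad = width // 2
    padded = np.pad(x, ((pad, pad), (0, 0)))
    return np.stack([padded[t:t + width].mean(axis=0) for t in range(T)])

def fuse(x):
    """Concatenate global and local features per visit (illustrative
    fusion; the paper's actual combination may differ)."""
    return np.concatenate([self_attention(x), local_conv(x)], axis=1)

# toy example: 5 visits, 8-dimensional embeddings
visits = np.random.default_rng(1).normal(size=(5, 8))
features = fuse(visits)  # (5, 16) per-visit features
```

Concatenation keeps the two views separable, so a downstream predictor can weigh long-term trends against recent events.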