Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence 2019
DOI: 10.24963/ijcai.2019/607
ATTAIN: Attention-based Time-Aware LSTM Networks for Disease Progression Modeling

Abstract: Modeling patient disease progression using Electronic Health Records (EHRs) is critical to assisting clinical decision making. Long Short-Term Memory (LSTM) networks handle sequential data such as EHRs effectively, but they face two major limitations when applied to EHRs: their predictions are hard to interpret, and they ignore the irregular time intervals between consecutive events. To tackle these limitations, we propose attention-based time-aware LSTM networks (ATTAIN) to improve the interp…
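The abstract's core idea, decaying the LSTM cell state according to the elapsed time between visits and then attending over the hidden states for interpretability, can be sketched as follows. This is a toy illustration only: the decay function, the gating layout, and the dot-product attention are assumptions for the sketch, not the paper's exact formulation.

```python
import numpy as np

def time_decay(delta_t):
    # Monotonically decreasing weight for the elapsed time between events
    # (hypothetical choice; the paper's exact decay function may differ).
    return 1.0 / np.log(np.e + delta_t)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TimeAwareLSTMCell:
    """Toy LSTM cell that discounts the previous cell state by elapsed time."""

    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        k = input_dim + hidden_dim
        # One weight matrix and bias per gate: input, forget, output, candidate.
        self.W = {g: rng.normal(0.0, 0.1, (k, hidden_dim)) for g in "ifoc"}
        self.b = {g: np.zeros(hidden_dim) for g in "ifoc"}

    def step(self, x, h_prev, c_prev, delta_t):
        # Decay the long-term memory by the time gap before the usual gating,
        # so events after a long interval contribute less carried-over state.
        c_prev = c_prev * time_decay(delta_t)
        z = np.concatenate([x, h_prev])
        i = sigmoid(z @ self.W["i"] + self.b["i"])
        f = sigmoid(z @ self.W["f"] + self.b["f"])
        o = sigmoid(z @ self.W["o"] + self.b["o"])
        c_hat = np.tanh(z @ self.W["c"] + self.b["c"])
        c = f * c_prev + i * c_hat
        h = o * np.tanh(c)
        return h, c

def attention_pool(hidden_states):
    # Simplified dot-product attention over all hidden states, with the last
    # state as the query; the weights give a per-visit importance score that
    # can be inspected for interpretability.
    H = np.stack(hidden_states)             # (T, hidden_dim)
    scores = H @ H[-1]                      # (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ H, weights

# Usage: run over visits with irregular gaps (in days) between them.
cell = TimeAwareLSTMCell(input_dim=4, hidden_dim=8)
h, c = np.zeros(8), np.zeros(8)
states = []
for dt in [1.0, 30.0, 2.0]:
    h, c = cell.step(np.ones(4), h, c, dt)
    states.append(h)
context, visit_weights = attention_pool(states)
```

The context vector would then feed a classifier head; `visit_weights` is what makes the prediction inspectable, since it exposes which visits drove the output.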

Cited by 66 publications (35 citation statements). References 18 publications.
“…We built 2 cohorts as simulations for both the real-world application scenario ( early prediction ) and the model evaluation (called next-visit prediction in many previous studies [ 27 , 32 ]). In early prediction, we could not foresee when the exacerbation would happen but could only evaluate the future risk at each visit.…”
Section: Methods (mentioning)
confidence: 99%
“…One of the most popular architectures is recurrent neural networks (RNNs), which make predictions according to the sequence of historical events. Dozens of successes have been achieved in applying deep learning to disease predictions [ 24 ], mostly using variants of RNNs with distinct network components, for example, by adding an attention mechanism to evaluate the weights of each variable [ 25 - 29 ] or by using special configurations to tackle the problem of time decays [ 23 , 25 , 27 , 30 - 32 ]. Typical prediction tasks include the prediction of diabetes mellitus [ 23 ], Parkinson disease [ 29 , 33 ], chronic heart failure [ 26 ], sepsis [ 34 ], mortality, and readmission [ 25 ].…”
Section: Introduction (mentioning)
confidence: 99%
“…It also uses a similar time decay function as RETAIN; we implemented it ourselves using TensorFlow.
ATTAIN [32]: The proposed TSANN model but with the second attention layer removed; prediction is based on the final state of the LSTM.
TSANN-I: Apply the time-encoding method from Song et al [39] on TSANN-I.…”
Section: LR A (mentioning)
confidence: 99%