Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence 2018
DOI: 10.24963/ijcai.2018/339

Structured Inference for Recurrent Hidden Semi-markov Model

Abstract: Segmentation and labeling of high-dimensional time series is an important yet challenging task in a number of applications, such as behavior understanding and medical diagnosis. Recent advances in modeling the nonlinear dynamics of such time series suggest incorporating recurrent neural networks into hidden Markov models. However, this makes the inference procedure much more complicated, often intractable, especially for the discrete variables of segmentation and l…

Cited by 32 publications (17 citation statements)
References 4 publications
“…The most widely-used models for discrete sequence generation are hidden Markov models (HMM) and their time-dependent generalisation, hidden semi-Markov models (HSMM) (Yu, 2015). In particular, HMMs and HSMMs are standard tools in a wide range of applications concerned with e.g., speech recognition (Liu et al, 2018;Zen et al, 2004;Deng et al, 2006) and activity recognition (Duong et al, 2005). Furthermore, they have often been used for the analysis of neuronal activity (Tokdar et al, 2010) and human behavior in general (Eldar et al, 2011).…”
Section: Sequence Recognition in Machine Learning
confidence: 99%
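As a minimal illustration of the HMM machinery referenced in the statement above, the following sketch computes the likelihood of a discrete observation sequence with the classic forward algorithm. The two-state parameters here are invented for the example and are not taken from any cited work.

```python
# Forward algorithm for a discrete-emission HMM (illustrative sketch).
# pi: initial state probabilities, A: state transition matrix,
# B: emission probabilities B[state][symbol].

def forward(obs, pi, A, B):
    """Return P(obs) under the HMM (pi, A, B) via the forward recursion."""
    n_states = len(pi)
    # Initialization: alpha_1(s) = pi(s) * B(s, o_1)
    alpha = [pi[s] * B[s][obs[0]] for s in range(n_states)]
    # Recursion: alpha_t(s) = sum_s' alpha_{t-1}(s') A(s', s) * B(s, o_t)
    for t in range(1, len(obs)):
        alpha = [
            sum(alpha[sp] * A[sp][s] for sp in range(n_states)) * B[s][obs[t]]
            for s in range(n_states)
        ]
    # Termination: P(obs) = sum_s alpha_T(s)
    return sum(alpha)

# Made-up two-state, two-symbol parameters for demonstration only.
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.9, 0.1], [0.2, 0.8]]
likelihood = forward([0, 1, 0], pi, A, B)
```

An HSMM differs from this sketch mainly in that each state additionally carries an explicit duration distribution, so the recursion sums over how long the current segment has lasted rather than assuming geometric dwell times.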
“…proposed by [21], [22], and the authors of [23] relax the discrete latents and durations of a recurrent hidden semi-Markov Model. Applications of (Gumbel-based) discrete latent variable models as described above include (among others) planning [24], syntactic parsing [25], text modelling [26], speech modelling [27], [28], paraphrase generation [29], recommender systems [30], drug-drug interaction modelling [31], and event modelling [32].…”
Section: Discrete Latent Variable Models
confidence: 99%
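The Gumbel-based relaxation mentioned in the statement above can be sketched generically as follows. This is a plain Gumbel-softmax sample; the temperature `tau` and the logits are illustrative assumptions, not the exact relaxation scheme of the cited works.

```python
# Gumbel-softmax relaxation of a categorical latent variable (sketch).
import math
import random

def gumbel_softmax(logits, tau=0.5):
    """Draw a 'soft' one-hot sample: a differentiable relaxation of argmax."""
    # Gumbel(0, 1) noise: -log(-log(U)) with U ~ Uniform(0, 1)
    g = [-math.log(-math.log(random.random())) for _ in logits]
    # Perturb logits with Gumbel noise and apply a tempered softmax.
    z = [(l + gi) / tau for l, gi in zip(logits, g)]
    m = max(z)  # subtract the max for numerical stability
    e = [math.exp(zi - m) for zi in z]
    s = sum(e)
    return [ei / s for ei in e]

# Components are positive and sum to 1; a low tau pushes toward one-hot.
sample = gumbel_softmax([1.0, 0.0, -1.0], tau=0.5)
```

The point of the relaxation is that the sample is a smooth function of the logits, so gradients can flow through otherwise discrete choices such as a segment label or a duration.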
“…[Qiu et al, 2020] introduces a novel inference technique that accounts for multimodality. Other works employ the idea of switching regimes incorporated with deep learning, such as [Johnson et al, 2016, Farnoosh et al, 2020, Dai et al, 2016, Liu et al, 2018]. These models assume the Markov assumption on the state…” (Figure 1 of the citing paper shows the architecture of the Switching Recurrent Kalman Network.)
Section: Related Work
confidence: 99%