2022
DOI: 10.48550/arxiv.2212.12749
Preprint

Deep Latent State Space Models for Time-Series Generation

Cited by 1 publication (1 citation statement)
References: 0 publications
“…These models have been especially promising for long sequences, which are challenging for architectures such as Transformers [75] and have required custom approaches to adapt to higher-dimensional data [20,47] or long sequences [13,76]. Deep SSMs have shown state-of-the-art performance in a number of domains, including time series data [30,77,79], audio [27], visual data [53], text [17,50,51], and medical data [70]. A number of methods have also been proposed to simplify the parameterization of the S4 architecture [31,34,68], make the parameterization more numerically stable [27], or improve the initialization [32].…”
Section: A Related Work
confidence: 99%
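
For context on what the deep SSM layers discussed above compute, the following is a minimal sketch of the discretized linear state space recurrence that underlies S4-style models. The function name, toy dimensions, and random parameters are illustrative assumptions for demonstration, not the implementation from the cited papers.

import numpy as np

def ssm_scan(A_bar, B_bar, C, u):
    # Hypothetical minimal SSM scan, not from the cited papers.
    # State update:  x_{k+1} = A_bar @ x_k + B_bar * u_k
    # Readout:       y_k     = C @ x_{k+1}
    n = A_bar.shape[0]
    x = np.zeros(n)
    ys = []
    for u_k in u:
        x = A_bar @ x + B_bar * u_k
        ys.append(C @ x)
    return np.array(ys)

# Toy usage: a (likely) stable random SSM applied to a long input signal,
# far longer than a typical attention window.
rng = np.random.default_rng(0)
n = 16                                          # state dimension (illustrative)
A_bar = 0.95 * np.eye(n) + 0.01 * rng.standard_normal((n, n))
B_bar = rng.standard_normal(n)
C = rng.standard_normal(n)
u = rng.standard_normal(4096)                   # long 1-D input sequence
y = ssm_scan(A_bar, B_bar, C, u)
print(y.shape)                                  # (4096,)

In practice, S4-style models do not run this sequential loop directly; they exploit structured parameterizations of A_bar to evaluate the recurrence as a convolution, which is what makes them efficient on long sequences.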