Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics 2019
DOI: 10.18653/v1/p19-1152

Learning Representations from Imperfect Time Series Data via Tensor Rank Regularization

Abstract: There has been an increased interest in multimodal language processing including multimodal dialog, question answering, sentiment analysis, and speech recognition. However, naturally occurring multimodal data is often imperfect as a result of imperfect modalities, missing entries or noise corruption. To address these concerns, we present a regularization method based on tensor rank minimization. Our method is based on the observation that high-dimensional multimodal time series data often exhibit correlations …

Cited by 49 publications (11 citation statements). References 44 publications.
“…There exist various exciting recent works on improved multimodal fusion techniques (Liang et al., 2019a; Pham et al., 2019; Baltrušaitis et al., 2019). In addition to the simple feature and modality concatenations, we plan to explore some of these promising tensor-based multimodal fusion networks (Liu et al., 2018; Liang et al., 2019b; Tsai et al., 2019) for more robust intent classification on the AMIE dataset as future work.…”
Section: Discussion (mentioning)
confidence: 99%
“…Low-rank imputation: Complete multimodal data exhibits correlations between different modalities, leading to a low-rank data matrix. However, incomplete data breaks these correlations and increases the tensor rank [19], [20]. To capture multimodal correlations, previous works project the data into a common space by exploiting low-rankness.…”
Section: Imputation Methods (mentioning)
confidence: 99%
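The claim in the statement above can be illustrated numerically: a complete matrix built from correlated features has low rank, while zero-filling missing entries destroys those correlations and raises the rank. A minimal NumPy sketch on synthetic data (not the cited method; dimensions and the 30% missing rate are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated "multimodal" data: a rank-2 matrix
# (rows = time steps, columns = concatenated modality features).
U = rng.standard_normal((20, 2))
V = rng.standard_normal((2, 12))
X = U @ V
print(np.linalg.matrix_rank(X))  # 2: correlations keep the rank low

# Simulate missing entries by zeroing out a random 30% of the matrix.
mask = rng.random(X.shape) < 0.3
X_missing = np.where(mask, 0.0, X)
# The zeroed entries break the cross-modal correlations,
# so the rank jumps well above 2 (typically to full rank).
print(np.linalg.matrix_rank(X_missing))
```

This is exactly the gap that low-rank imputation exploits: the completed matrix should be driven back toward the low rank of the underlying complete data.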
“…Besides the nuclear norm, Fan et al. [23] also minimized the tensor tubal rank to deal with various missing patterns. Furthermore, Liang et al. [20] combined the strength of non-linear functions to learn complex correlations in tensor rank minimization. However, these methods are usually computationally expensive for big data [24].…”
Section: Imputation Methods (mentioning)
confidence: 99%
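As background to the nuclear-norm line of work these citations discuss, the standard computational primitive is singular value thresholding, the proximal operator of the nuclear norm. The sketch below shows only that primitive on a random matrix (it is not the algorithm of Fan et al. or Liang et al., and the threshold `tau=1.0` is an arbitrary choice); the cost of the full SVD at every step is also why such methods get expensive on big data:

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: shrink each singular value by tau.

    This is the proximal operator of the nuclear norm, the basic
    step in nuclear-norm-based low-rank matrix completion.
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return U @ np.diag(s_shrunk) @ Vt

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 6))
B = svt(A, tau=1.0)

# Shrinking singular values lowers the nuclear norm (their sum),
# which is the convex surrogate for rank being minimized.
print(np.linalg.svd(B, compute_uv=False).sum()
      < np.linalg.svd(A, compute_uv=False).sum())
```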
“…The optimization objectives for the core tensor are presented in Eqs. (8) and (9). The optimization objectives for the factor matrix are presented in Eq.…”
Section: Stochastic Optimization (mentioning)
confidence: 99%
“…Here, during the learning process, each modality corresponds to a feature, and feature alignment involves fusion. Tensors are a common form of feature fusion for multimodal learning [7], [8], [9], [10]. Unfortunately, tensors can be difficult to process in practice.…”
(mentioning)
confidence: 99%
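The tensor fusion the statement above refers to is, in its simplest form, an outer product of per-modality feature vectors. A minimal sketch with made-up two- and three-dimensional features (the vectors and the appended constant 1, which preserves unimodal terms alongside interactions, follow the common tensor-fusion recipe but are assumptions, not the cited papers' exact models):

```python
import numpy as np

# Hypothetical per-modality feature vectors.
language = np.array([0.2, -1.0, 0.5])
acoustic = np.array([1.5, 0.3])

# Append a constant 1 to each vector so the fused tensor contains
# the original unimodal features as well as all pairwise products.
l = np.concatenate([language, [1.0]])   # shape (4,)
a = np.concatenate([acoustic, [1.0]])   # shape (3,)

fused = np.outer(l, a)                  # shape (4, 3)
print(fused.shape)
```

The fused representation grows multiplicatively, as (d1+1)(d2+1)… entries across modalities, which is the "difficult to process in practice" problem and the motivation for low-rank factorizations of the fusion tensor.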