2019
DOI: 10.1109/tnnls.2018.2873655
Feature Extraction for Incomplete Data Via Low-Rank Tensor Decomposition With Feature Regularization

Abstract: This is a repository copy of Feature extraction for incomplete data via low-rank tensor decomposition with feature regularization.
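The title describes the core technique: a low-rank tensor decomposition fitted to data with missing entries, combined with a feature-oriented regularizer. Below is a minimal, hypothetical sketch of that general idea, not the authors' algorithm: a rank-R CP decomposition fitted to a 3-way tensor by EM-style imputation plus alternating least squares, with a plain ridge penalty standing in for the feature regularization. The rank R, the penalty weight lam, and the iteration count are illustrative assumptions.

```python
# Sketch only: generic masked CP decomposition with a ridge penalty,
# not the regularizer or update rules proposed in the paper.
import numpy as np

def cp_als_incomplete(X, mask, R=5, lam=1e-2, n_iters=50, seed=0):
    """X: 3-way array; mask: 1 where X is observed, 0 where it is missing."""
    rng = np.random.default_rng(seed)
    dims = X.shape
    A = [rng.standard_normal((d, R)) for d in dims]      # factor matrices

    def reconstruct(A):
        # Full tensor from CP factors: X_hat[i,j,k] = sum_r A0[i,r] A1[j,r] A2[k,r]
        return np.einsum('ir,jr,kr->ijk', A[0], A[1], A[2])

    for _ in range(n_iters):
        # E-step: fill the missing entries with the current model estimate
        X_fill = mask * X + (1 - mask) * reconstruct(A)
        # M-step: one ridge-regularized ALS sweep over the three modes
        for n in range(3):
            others = [A[m] for m in range(3) if m != n]
            # Khatri-Rao product of the two remaining factor matrices
            KR = np.einsum('ir,jr->ijr', others[0], others[1]).reshape(-1, R)
            Xn = np.moveaxis(X_fill, n, 0).reshape(dims[n], -1)  # mode-n unfolding
            G = KR.T @ KR + lam * np.eye(R)
            A[n] = np.linalg.solve(G, (Xn @ KR).T).T
    return A, reconstruct(A)

# Example: recover a synthetic low-rank tensor with 40% of its entries missing;
# the rows of the first factor matrix can then serve as features for mode 1.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true = np.einsum('ir,jr,kr->ijk', rng.standard_normal((30, 3)),
                     rng.standard_normal((20, 3)), rng.standard_normal((10, 3)))
    mask = (rng.random(true.shape) > 0.4).astype(float)
    factors, X_hat = cp_als_incomplete(true * mask, mask, R=3)
    err = np.linalg.norm((1 - mask) * (X_hat - true)) / np.linalg.norm((1 - mask) * true)
    print(f"relative error on the missing entries: {err:.3f}")
```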

Cited by 60 publications (14 citation statements)
References: 51 publications
“…TCMs utilize information not only from one-dimensional EEG signals or simultaneously acquired two-dimensional EEG signals, but also from EEG signals of other trials, which are multi-dimensional. In the analysis of EEG data, TCMs can also incorporate the multiway nature of the data, similar to applications in image processing and social network data [14][15][16][17][18][19]. Solé-Casals et al [7] proposed to reconstruct EEG signals with missing data using TCMs in a brain-computer interface (BCI) context.…”
Section: Introduction (mentioning)
confidence: 99%
“…Tensor decomposition is a powerful computational technique for extracting valuable information from tensorial data (Shi et al. 2018; Zhou, Lu, and Cheung 2019). Taking advantage of this, tensor decomposition-based approaches can handle multiple time series (TS) simultaneously and achieve good forecasting performance (Dunlavy, Kolda, and Acar 2011; Li et al. 2015; Tan et al. 2016; Bhanu et al. 2018; Faloutsos et al. 2018).…”
Section: Introduction (mentioning)
confidence: 99%
“…Several methods have been proposed for tensor decomposition [37][38][39]. In [37], NMF is applied to variational Bayesian matrix factorization, where each observed entry is assumed to follow a beta distribution.…”
Section: Introduction (mentioning)
confidence: 99%
“…In [37], NMF is applied to variational Bayesian matrix factorization, where each observed entry is assumed to follow a beta distribution. Shi et al. [38] proposed tensor decomposition with variance maximization for feature extraction. In [39], pairwise similarity information is incorporated into Tucker tensor decomposition.…”
Section: Introduction (mentioning)
confidence: 99%