2022
DOI: 10.1002/int.22957
SelfMatch: Robust semisupervised time‐series classification with self‐distillation

Abstract: Over the years, a number of semisupervised deep-learning algorithms have been proposed for time-series classification (TSC). In semisupervised deep learning, from the point of view of representation hierarchy, semantic information extracted at lower levels is the basis of that extracted at higher levels. The authors ask whether high-level semantic information, once extracted, is also helpful for capturing low-level semantic information. This paper studies this problem and proposes a robust semisupervised model with s…
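The abstract's core idea is self-distillation: predictions derived from high-level (deep) representations act as a teacher signal for predictions from low-level (shallow) representations within the same network. The paper's architecture is not reproduced here, but the distillation loss itself can be sketched with NumPy; the function name, the temperature parameter tau, and the teacher/student roles are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax over logits.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def self_distillation_loss(shallow_logits, deep_logits, tau=2.0):
    """KL(teacher || student) averaged over a batch: softened predictions
    from the deep (high-level) branch guide the shallow (low-level) branch."""
    teacher = softmax(deep_logits / tau)
    student = softmax(shallow_logits / tau)
    kl = np.sum(teacher * (np.log(teacher + 1e-12) - np.log(student + 1e-12)), axis=-1)
    return float(np.mean(kl))

# When both branches agree exactly, the distillation loss is zero.
logits = np.array([[2.0, 0.5, -1.0]])
assert abs(self_distillation_loss(logits, logits)) < 1e-9
```

In a real model the two logit tensors would come from classifier heads attached at different depths, and this term would be added to the supervised loss on the labeled subset.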

Cited by 97 publications (16 citation statements)
References 52 publications
“…ABCN2 was used in an intelligent tutoring system (Woolf, 2010) to teach students arguing strategies (Guid et al, 2019). The intelligent system by Xing et al (2022b) uses feature extraction in semi-supervised learning and self-distillation; a similar feature extractor was used in the federated distillation learning system (Xing et al, 2022a). The last two methods are specific to time-series classification and do not use explicit constructive operators.…”
Section: Standalone Feature Construction
Confidence: 99%
“…improved. For example, in [28], a deep-learning feature extractor for time-series data is designed for relation extraction, and the clustering performance improved significantly. Nonetheless, given the unknown data distribution in practical problems, it is difficult to determine which clustering algorithm will yield better results.…”
Section: Uncorrected Author Proof
Confidence: 99%
“…Moreover, to alleviate the high dimensionality and sparsity of tag information in practical scenarios, the authors in [24] developed two novel heterogeneous knowledge-distillation methods, at the feature level and the label level, to build relations between a user-oriented autoencoder and an item-oriented autoencoder. In the recent literature, several works proposing self-distillation [25,26] have also emerged. The work in [27] introduced a weighting mechanism that dynamically assigns lower weights to uncertain samples and showed promising results.…”
Section: Recommender System
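The weighting mechanism cited above down-weights uncertain samples during training. A common way to realize this, sketched here as an assumption rather than the method of [27], is to weight each sample by the confidence of its own prediction, e.g. the maximum softmax probability:

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax over logits.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def confidence_weights(logits):
    """Per-sample weight = max predicted class probability, so samples the
    model is unsure about contribute less to the (weighted) loss."""
    p = softmax(logits)
    return p.max(axis=-1)

# A sharply peaked prediction receives a higher weight than a flat one.
confident = np.array([[8.0, 0.0, 0.0]])
uncertain = np.array([[0.1, 0.0, 0.0]])
assert confidence_weights(confident)[0] > confidence_weights(uncertain)[0]
```

These weights would multiply the per-sample loss terms before averaging; thresholded variants (ignoring samples below a fixed confidence) are also common in semisupervised pipelines.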