2020
DOI: 10.1016/j.patcog.2020.107525

Mutual information based feature subset selection in multivariate time series classification



Cited by 44 publications (20 citation statements) · References 39 publications
Citation types: 0 supporting, 17 mentioning, 0 contrasting
“…To avoid this, some studies have used a k-nearest neighbour (k-NN) approach to calculate the MI, which avoids the need to estimate the probability distribution function and can therefore be applied to the original multidimensional feature subset [11,16]. In a very recent development [12], the authors propose a filter method for feature subset selection in which a score function assesses the relevance of each feature subset. Mutual information based on a k-NN strategy is used to measure the information shared between the two time series.…”
Section: Mutual Information Based Methods
Citation type: mentioning · confidence: 99%
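For context, the k-NN route to mutual information that these statements describe is typically the Kraskov-Stögbauer-Grassberger (KSG) estimator, which sidesteps density estimation entirely and so extends naturally to multidimensional feature subsets. Below is a minimal sketch of that estimator; the function name and the strict-inequality tolerance are illustrative, and this is not presented as the exact scoring function of [12].

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mutual_information(x, y, k=3):
    """KSG k-NN estimate of I(X; Y) in nats.

    x: (n, d_x) array, y: (n, d_y) array. Both variables may be
    multidimensional, so a whole feature subset (or a pair of time
    series laid out as vectors) can be passed directly, with no
    probability density estimation step.
    """
    n = len(x)
    z = np.hstack([x, y])

    # Distance to the k-th nearest neighbour in the joint space
    # under the max-norm (the self-match at column 0 is skipped).
    eps = cKDTree(z).query(z, k=k + 1, p=np.inf)[0][:, -1]

    # Count marginal points strictly inside each eps-ball;
    # the query point itself is subtracted out.
    tree_x, tree_y = cKDTree(x), cKDTree(y)
    nx = np.array([len(tree_x.query_ball_point(x[i], eps[i] * (1 - 1e-10), p=np.inf)) - 1
                   for i in range(n)])
    ny = np.array([len(tree_y.query_ball_point(y[i], eps[i] * (1 - 1e-10), p=np.inf)) - 1
                   for i in range(n)])

    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))
```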
“…It is worth comparing these results with the feature subset selection technique proposed by Ircio et al. [12], which uses mutual information based on k-NN and is therefore directly comparable. The authors show that, on average, accuracy can be maintained or improved on 8 of the 16 datasets for DTW_D and on 4 of the 16 datasets for DTW_I [12]. In our evaluations, MSTS improves or maintains the accuracy on 11 of the 19 datasets for DTW and on 12 of the 19 datasets for MiniRocket.…”
Section: CFS for Time Series
Citation type: mentioning · confidence: 98%
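DTW_D and DTW_I in the statement above are the dependent and independent variants of multivariate dynamic time warping: DTW_D warps all dimensions along a single shared path, while DTW_I warps each dimension separately and sums the per-dimension distances. A minimal sketch of the distinction, assuming squared Euclidean as the inner distance (function names are illustrative):

```python
import numpy as np

def dtw(a, b, dist):
    """Classic O(len(a) * len(b)) dynamic-programming DTW."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i, j] = dist(a[i - 1], b[j - 1]) + min(
                D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def dtw_d(x, y):
    """Dependent DTW: one warping path over all dimensions jointly.
    x, y: (length, n_dims) arrays."""
    return dtw(x, y, lambda u, v: np.sum((u - v) ** 2))

def dtw_i(x, y):
    """Independent DTW: warp each dimension separately, then sum."""
    return sum(dtw(x[:, d], y[:, d], lambda u, v: (u - v) ** 2)
               for d in range(x.shape[1]))
```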
“…[24][25][26] The filter method [24] is independent of the classifier used [27]. It first scores all features of the data set and then selects the optimal feature subset after sorting them. The wrapper method repeatedly applies the chosen learning algorithm to search the space of feature subsets.…”
Section: Introduction
Citation type: mentioning · confidence: 99%
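The filter/wrapper contrast drawn in this statement can be made concrete. The sketch below uses scikit-learn's k-NN-based mutual information scorer as a stand-in filter criterion and a greedy forward search as the wrapper; both functions are illustrative rather than any cited method.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def filter_select(X, y, n_keep):
    """Filter: rank features by a classifier-independent score
    (here mutual information) and keep the top n_keep."""
    scores = mutual_info_classif(X, y)
    return np.argsort(scores)[::-1][:n_keep]

def wrapper_select(X, y, clf=None):
    """Greedy forward wrapper: repeatedly add the feature whose
    inclusion most improves cross-validated accuracy of clf."""
    clf = clf or KNeighborsClassifier()
    remaining, chosen, best = set(range(X.shape[1])), [], -np.inf
    improved = True
    while improved and remaining:
        improved = False
        for f in sorted(remaining):
            acc = cross_val_score(clf, X[:, chosen + [f]], y, cv=3).mean()
            if acc > best:
                best, pick, improved = acc, f, True
        if improved:
            chosen.append(pick)
            remaining.remove(pick)
    return chosen
```

The wrapper re-trains the classifier for every candidate subset, which is why filter methods are usually far cheaper but blind to the downstream learner.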
“…In addition, compared to mutual information based methods, which can only work with discrete or categorical features [15,16], the proposed method is applicable to both numerical (discrete and continuous) and categorical features. The proposed method is also closely related to two recent advanced topics in the feature selection field: multi-label feature selection [17,18,19] and multivariate time series feature selection [20,21]. The fundamental issue addressed under both research topics is the development of methods that can deal with features and responses that must be represented in matrix form.…”
Section: Introduction
Citation type: mentioning · confidence: 99%
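The limitation mentioned for classical mutual information methods stems from the plug-in estimate over a contingency table, which presupposes categorical (or pre-discretised) features. A minimal sketch of that plug-in estimator, with an illustrative function name:

```python
import numpy as np

def discrete_mi(x, y):
    """Plug-in MI estimate for two categorical variables,
    computed from their empirical joint distribution (in nats)."""
    xs, x_idx = np.unique(x, return_inverse=True)
    ys, y_idx = np.unique(y, return_inverse=True)

    # Contingency table of co-occurrence counts.
    joint = np.zeros((len(xs), len(ys)))
    np.add.at(joint, (x_idx, y_idx), 1)

    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)

    nz = p_xy > 0  # skip zero cells where the log is undefined
    return np.sum(p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz]))
```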