2018
DOI: 10.3390/s18092936

Discrimination between Alternative Herbal Medicines from Different Categories with the Electronic Nose

Abstract: As alternative herbal medicines soar in popularity around the world, a fast and convenient means of classifying and evaluating them is needed. In this work, an electronic nose system with seven classification algorithms is used to discriminate between 12 categories of herbal medicines. The results show that these herbal medicines can be successfully classified, with support vector machine (SVM) and linear discriminant analysis (LDA) outperforming the other algorithms in terms of ac…
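As a minimal, hypothetical sketch of the kind of comparison the abstract describes (not the paper's actual pipeline), the snippet below trains SVM and LDA classifiers on placeholder electronic-nose feature vectors with scikit-learn; the data, feature count, and hyperparameters are assumptions.

```python
# Minimal sketch: comparing SVM and LDA classifiers on e-nose feature vectors.
# X and y are synthetic placeholders, not the paper's dataset.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 16))      # placeholder: 120 samples x 16 sensor features
y = rng.integers(0, 12, size=120)   # placeholder: 12 herbal-medicine categories

for name, clf in [("SVM", SVC(kernel="rbf", C=10.0)),
                  ("LDA", LinearDiscriminantAnalysis())]:
    model = make_pipeline(StandardScaler(), clf)
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")
```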

Cited by 21 publications (23 citation statements) · References 40 publications
“…Several feature engineering methods were applied: (1) SMOTEENN (synthetic minority oversampling technique and edited nearest neighbors) [48]: the method performs oversampling with SMOTE and cleaning with ENN to deal with imbalanced classes. In this study, a 1:1 (positive:negative cases) balanced dataset and a 1:3 imbalanced dataset were created. (2) SMOTEENN + PCA (principal component analysis) [49–51]: after enlarging the dataset, PCA was applied to reduce the dimensionality of the features; it uses singular value decomposition (SVD) to find the orthogonal principal components and a low-dimensional representation of the data.…”
Section: Feature Processing (mentioning)
confidence: 99%
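As a hedged illustration of the resampling-plus-PCA step described in the statement above (not the cited paper's actual code), the sketch below combines imbalanced-learn's SMOTEENN with scikit-learn's PCA; the dataset, class ratio, and variance threshold are placeholders.

```python
# Sketch of SMOTEENN resampling followed by PCA, assuming the imbalanced-learn
# and scikit-learn packages; the data here is synthetic.
import numpy as np
from imblearn.combine import SMOTEENN
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 50))                  # placeholder feature matrix
y = (rng.random(300) < 0.25).astype(int)        # imbalanced labels (~1:3 positives to negatives)

# Oversample the minority class with SMOTE, then clean noisy samples with ENN.
X_res, y_res = SMOTEENN(random_state=0).fit_resample(X, y)

# PCA (SVD under the hood) projects the enlarged dataset onto orthogonal
# components that retain 95% of the variance.
X_low = PCA(n_components=0.95).fit_transform(X_res)
print(X_res.shape, X_low.shape)
```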
“…First, we extracted the 160 temporal features (based on angular velocity and angular acceleration) used in our previously developed MLHMs [25], comprising 20 features per signal channel: the maximum values, the minimum values, the integral values, the integrals of absolute values, three sets of maximum and minimum values of the exponential moving average of the signal derivatives with different smoothing factors [38], [39], and the extrema information (the numbers of all positive and negative extrema and the second to fifth positive/negative extrema values). These features were extracted because they take both signal intensities and time histories into consideration and proved effective in accurately predicting whole-brain strain; specific details and reasons are given in our previous study [25].…”
Section: B. Head Impact Kinematics Features (mentioning)
confidence: 99%
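As a rough, hypothetical sketch of this per-channel temporal feature extraction (a small subset of the 160-feature set described in the statement above, not a reproduction of the cited study's code), the snippet below computes several of the named quantities for a single kinematics channel; the smoothing factor and sampling setup are assumptions.

```python
# Hypothetical per-channel temporal features: max/min, integral, integral of
# absolute value, extrema of the exponentially smoothed derivative, and counts
# of positive/negative extrema.
import numpy as np

def channel_features(x: np.ndarray, dt: float, alpha: float = 0.1) -> dict:
    deriv = np.gradient(x, dt)            # signal derivative
    ema = np.empty_like(deriv)            # exponential moving average of the derivative
    ema[0] = deriv[0]
    for i in range(1, len(deriv)):
        ema[i] = alpha * deriv[i] + (1 - alpha) * ema[i - 1]

    slope_sign_change = np.diff(np.sign(np.diff(x)))   # slope sign changes mark extrema
    return {
        "max": x.max(),
        "min": x.min(),
        "integral": np.trapz(x, dx=dt),
        "integral_abs": np.trapz(np.abs(x), dx=dt),
        "ema_deriv_max": ema.max(),
        "ema_deriv_min": ema.min(),
        "n_pos_extrema": int(np.sum(slope_sign_change < 0)),   # local maxima
        "n_neg_extrema": int(np.sum(slope_sign_change > 0)),   # local minima
    }

# Example on a synthetic angular-velocity channel sampled at 1 kHz.
t = np.linspace(0.0, 0.1, 100)
print(channel_features(np.sin(60 * t) * np.exp(-20 * t), dt=t[1] - t[0]))
```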
“…To tackle the high dimensionality, it is crucial to select the optimal set of features that covers most of the information. Calculating the correlation between features is a classical way to do so [178,179]. Perera et al. [180] chose a bio-inspired technique, which allowed robust classification even with a small number of samples.…”
Section: A Model of Neurons: Toward Spike-Based Neuromorphic Approaches (mentioning)
confidence: 99%
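As a hedged sketch of the correlation-based selection mentioned in the statement above (not the specific methods of refs. [178,179] or [180]), the snippet below keeps one feature from each highly correlated pair; the threshold and data are placeholders.

```python
# Sketch: drop one feature from each pair whose absolute Pearson correlation
# exceeds a chosen threshold (threshold and data are placeholders).
import numpy as np

def select_by_correlation(X: np.ndarray, threshold: float = 0.95) -> list[int]:
    corr = np.abs(np.corrcoef(X, rowvar=False))   # feature-by-feature correlation matrix
    keep: list[int] = []
    for j in range(X.shape[1]):
        # Keep feature j only if it is not too correlated with any already kept feature.
        if all(corr[j, k] < threshold for k in keep):
            keep.append(j)
    return keep

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))
X[:, 5] = X[:, 4] + 0.01 * rng.normal(size=200)   # make two features nearly identical
kept = select_by_correlation(X)
print(len(kept), "features kept; feature 5 kept:", 5 in kept)
```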