2011
DOI: 10.4304/jcp.6.8.1707-1714
Study of Emotion Recognition Based on Surface Electromyography and Improved Least Squares Support Vector Machine

Abstract: In order to improve human-computer interaction (HCI), computers need to recognize and respond properly to their user’s emotional state. This paper introduces an emotional pattern recognition method based on the Least Squares Support Vector Machine (LS-SVM). The experiment uses the wavelet transform to analyze the Surface Electromyography (EMG) signal and extracts the maximum and minimum of the wavelet coefficients at every level. We then construct the coefficients into eigenvectors and input them into the improved Least…
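The feature-extraction step the abstract describes — taking the maximum and minimum of the wavelet coefficients at every decomposition level — can be sketched as follows. The Haar wavelet, the three-level depth, the toy signal, and all function names here are illustrative assumptions, not the paper's actual implementation:

```python
import math

def haar_dwt(signal):
    """One level of the Haar wavelet transform.
    Returns (approximation, detail) coefficient lists."""
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def wavelet_minmax_features(signal, levels=3):
    """Decompose `signal` to `levels` levels and collect the (max, min)
    of the detail coefficients at each level, plus the (max, min) of the
    final approximation -- one plausible eigenvector construction."""
    features = []
    approx = list(signal)
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        features.append((max(detail), min(detail)))
    features.append((max(approx), min(approx)))
    return features

# Toy stand-in for a 16-sample window of an sEMG recording.
emg_window = [1, 3, 2, 8, 5, 5, 1, 9, 0, 2, 4, 4, 7, 1, 3, 3]
feature_vector = wavelet_minmax_features(emg_window, levels=3)
```

A 16-sample window with three levels yields four (max, min) pairs: one per detail level plus one for the final approximation. In practice a mother wavelet suited to sEMG (e.g. a Daubechies family member) would likely replace the Haar basis.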

Cited by 14 publications (12 citation statements)
References 17 publications
“…In this method, equality constraints are used to find the optimal solution by solving a set of linear equations instead of solving a quadratic optimization problem [81]. This classifier has already been examined for facial EMG classification, and promising results were reported in [34], [39] and [42]. The LS-SVM model used in this paper is formed using an RBF kernel, where the regularization parameter γ and kernel width parameter σ² are set at 10 and 0.2 respectively, while the multiclass LS-SVM is trained and encoded by the one-versus-all scheme.…”
Section: Least-Squares Support Vector Machines (LS-SVMs)
confidence: 99%
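The quoted point — that LS-SVM training reduces to solving a linear system rather than a quadratic program — can be sketched as follows. The toy dataset, the plain Gaussian-elimination solver, the binary (rather than one-versus-all) setup, and the exact RBF convention exp(−‖x−z‖²/2σ²) are illustrative assumptions; only γ = 10 and σ² = 0.2 come from the quote:

```python
import math

def rbf(x, z, sigma2=0.2):
    """RBF kernel; the 2*sigma2 scaling convention is an assumption."""
    return math.exp(-sum((a - b) ** 2 for a, b in zip(x, z)) / (2.0 * sigma2))

def solve_linear(A, b):
    """Gaussian elimination with partial pivoting and back-substitution."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def train_ls_svm(X, y, gamma=10.0, sigma2=0.2):
    """Solve the LS-SVM KKT system
        [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1],
    where Omega_ij = y_i * y_j * K(x_i, x_j). One linear solve, no QP."""
    n = len(X)
    A = [[0.0] * (n + 1) for _ in range(n + 1)]
    rhs = [0.0] * (n + 1)
    for i in range(n):
        A[0][i + 1] = y[i]
        A[i + 1][0] = y[i]
        rhs[i + 1] = 1.0
        for j in range(n):
            A[i + 1][j + 1] = y[i] * y[j] * rbf(X[i], X[j], sigma2)
        A[i + 1][i + 1] += 1.0 / gamma
    sol = solve_linear(A, rhs)
    b, alpha = sol[0], sol[1:]

    def predict(x):
        s = b + sum(alpha[i] * y[i] * rbf(x, X[i], sigma2) for i in range(n))
        return 1 if s >= 0 else -1

    return predict
```

For a multiclass problem the one-versus-all scheme mentioned in the quote would train one such binary machine per class and pick the class whose decision value is largest.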
“…A BP neural network was employed; solving linear equations to achieve the classification greatly reduces the training time, and the recognition rate can reach more than 83.33%. In [35], the author proposed emotional pattern recognition using the Least Squares Support Vector Machine (LS-SVM). For this, the wavelet transform is used, which offers multi-scale decomposition levels.…”
Section: EMG
confidence: 99%
“…That is to say, the distance from any data point lying within the profile to α is always lower than R. Following closely the derivation of [22]-[24], [26], to obtain the solution of the minimal hypersphere approximate covering with an appropriate width q of the kernel function, a gradient dynamical system associated with the trained kernel function R²(x) can be constructed as follows:

dx/dt = −∇R²(x). (1)…”
Section: A Phase I Partition of SVs
confidence: 99%
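The idea behind that dynamical system — following the negative gradient of the trained sphere function R²(x) until each point settles into a local minimum, so that points sharing a minimum share a cluster — can be illustrated with a toy sketch. The two centers, the β weights, the kernel width Q, the step size, and the simplified R²(x) = const − 2Σⱼ βⱼK(xⱼ, x) form are all invented for illustration; this is not the cited paper's FSCL implementation:

```python
import math

# Hypothetical "trained" sphere function parameters (illustrative only).
CENTERS = [(0.0, 0.0), (4.0, 4.0)]
BETAS = [1.0, 1.0]
Q = 1.0  # Gaussian kernel width

def k(x, z):
    """Gaussian kernel K(x, z) = exp(-Q * ||x - z||^2)."""
    return math.exp(-Q * sum((a - b) ** 2 for a, b in zip(x, z)))

def grad_R2(x):
    """Gradient of R^2(x) = const - 2 * sum_j beta_j * K(x_j, x):
    grad = sum_j 4 * Q * beta_j * K(x_j, x) * (x - x_j)."""
    g = [0.0, 0.0]
    for c, beta in zip(CENTERS, BETAS):
        w = 4.0 * Q * beta * k(x, c)
        g[0] += w * (x[0] - c[0])
        g[1] += w * (x[1] - c[1])
    return g

def descend(x, step=0.1, iters=200):
    """Steepest descent on R^2(x); returns the local minimum reached,
    rounded so points converging together compare equal."""
    x = list(x)
    for _ in range(iters):
        g = grad_R2(x)
        x = [x[0] - step * g[0], x[1] - step * g[1]]
    return tuple(round(v, 1) for v in x)
```

Points started near (0, 0) descend to one minimum and points near (4, 4) to the other, so comparing the reached minima yields the partition labels the quoted passage describes.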
“…Input: the dataset X, the Gaussian kernel width q and the penalty term C. Output: labels for all the data points. 1: collect V for q by solving the dual problem. To analyze the time complexity of the proposed method, let N be the number of data points in the dataset, N_SV the number of SVs, l the average number of iterations for each data point to locate its corresponding local minimum via the steepest-descent process [22], N_SEV the number of SEVs, and m the sample rate. The time costs of the three phases are O(lN_SV), O(mN²_SEV) and O((N − N_SV)N_SEV) respectively. Therefore, the time complexity of FSCL is O(lN_SV + N·N_SEV).…”
Section: Algorithm 1 FSCL(X, q, C)
confidence: 99%