2016 Computing in Cardiology Conference (CinC)
DOI: 10.22489/cinc.2016.172-318

Heart Sound Classification from Wavelet Decomposed Signal Using Morphological and Statistical Features


Cited by 9 publications (7 citation statements)
References 10 publications
“…In order to demonstrate the spectral peaks in the PCG, a sub-frame of a sampled PCG signal is considered. Using Burg's method, an AR(12) spectrum is constructed and shown in Fig. … for mitral valve prolapse (MVP).…”
Section: Burg's Spectrum Based Formant Extraction
confidence: 99%
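
The excerpt above describes fitting an AR(12) model to a PCG sub-frame with Burg's method and reading spectral peaks off the resulting spectrum. The following is a minimal NumPy sketch of that idea, not code from either paper: the frame length, sampling rate, and peak read-out are placeholder assumptions; only the order-12 Burg fit is taken from the excerpt.

```python
import numpy as np

def burg_ar(x, order):
    """Burg's method: estimate AR coefficients a[1..p] and driving-noise power e
    for the model x[n] ~ -sum_k a[k] * x[n-k] + w[n]."""
    x = np.asarray(x, dtype=float)
    f = x.copy()                      # forward prediction errors
    b = x.copy()                      # backward prediction errors
    a = np.zeros(order)
    e = np.dot(x, x) / len(x)         # order-0 prediction error power
    for m in range(order):
        fm, bm = f[1:], b[:-1]
        k = -2.0 * np.dot(fm, bm) / (np.dot(fm, fm) + np.dot(bm, bm))
        a_prev = a[:m].copy()
        a[:m] = a_prev + k * a_prev[::-1]    # Levinson-style coefficient update
        a[m] = k
        f, b = fm + k * bm, bm + k * fm      # prediction errors for the next order
        e *= 1.0 - k * k
    return a, e

def ar_spectrum(a, e, fs, n_freqs=512):
    """AR power spectrum S(f) = e / |1 + sum_k a[k] exp(-j*2*pi*f*k/fs)|^2."""
    freqs = np.linspace(0.0, fs / 2.0, n_freqs)
    phases = np.exp(-2j * np.pi * np.outer(freqs / fs, np.arange(1, len(a) + 1)))
    return freqs, e / np.abs(1.0 + phases @ a) ** 2

# Hypothetical usage on one PCG sub-frame sampled at 2 kHz (placeholder data).
fs = 2000
frame = np.random.randn(400)          # stands in for a 200 ms PCG sub-frame
a, e = burg_ar(frame, order=12)       # AR(12), as in the excerpt
freqs, psd = ar_spectrum(a, e, fs)
print(freqs[np.argmax(psd)])          # location of the dominant spectral peak
```
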
“…Apart from the various statistical measures, we consider six operations, namely mean absolute deviation (MAD), 1st quartile, 3rd quartile, interquartile range (IQR), skewness and kurtosis. Considering two formants, these operations provide 12 parameters for each case of the formants (magnitude (F^{12}_{Mag,Stats}), frequency (F^{12}_{Freq,Stats}) and phase (F^{12}_{Pha,Stats})). If we closely observe Fig.…”
Section: E. Formation Of the Feature Dictionary
confidence: 99%
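
The six operations listed in the excerpt (MAD, 1st and 3rd quartiles, IQR, skewness, kurtosis) map directly onto standard NumPy/SciPy calls. The sketch below applies them to one formant track; the function name and the placeholder data are illustrative assumptions, not the citing paper's code.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def six_stats(track):
    """The six statistics named in the excerpt, computed for one formant
    track (its magnitude, frequency, or phase values per frame)."""
    track = np.asarray(track, dtype=float)
    q1, q3 = np.percentile(track, [25, 75])
    return {
        "mad": np.mean(np.abs(track - np.mean(track))),  # mean absolute deviation
        "q1": q1,                                        # 1st quartile
        "q3": q3,                                        # 3rd quartile
        "iqr": q3 - q1,                                  # interquartile range
        "skewness": skew(track),
        "kurtosis": kurtosis(track),
    }

# Two formants x six statistics give the 12 parameters per case
# (magnitude, frequency, or phase) mentioned in the excerpt.
formant_mag = np.abs(np.random.randn(2, 50))   # placeholder: 2 formants, 50 frames
features = [v for track in formant_mag for v in six_stats(track).values()]
print(len(features))                           # 12
```
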
“…During the CinC 2016 conference, most of the participants used time-frequency features, in particular wavelet decomposition coefficients and Mel-Frequency Cepstral Coefficients (MFCCs). [5][6][7] Linear predictive coding coefficients have also been used. 7 In terms of classifiers, competitors mostly used Neural Networks, 5,8,9 Support Vector Machines 10,11 and Random Forest techniques.…”
Section: Prior Art
confidence: 99%
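
The survey above names wavelet decomposition coefficients as one of the most common feature sources among CinC 2016 entries, which is also the route the cited paper's title indicates. As a hedged illustration only, the sketch below summarizes the sub-bands of a discrete wavelet decomposition with PyWavelets; the wavelet family ('db4'), the decomposition level, and the per-band statistics are assumptions rather than any entrant's actual settings.

```python
import numpy as np
import pywt

def wavelet_features(pcg, wavelet="db4", level=5):
    """Decompose a PCG segment with the DWT and summarize each sub-band.
    Wavelet, level, and summary statistics are illustrative choices."""
    coeffs = pywt.wavedec(pcg, wavelet, level=level)   # [cA_L, cD_L, ..., cD_1]
    feats = []
    for band in coeffs:
        feats.extend([np.mean(np.abs(band)),           # average sub-band magnitude
                      np.std(band),                    # sub-band spread
                      np.sum(band ** 2) / len(band)])  # mean sub-band energy
    return np.asarray(feats)

segment = np.random.randn(4000)          # placeholder for a 2 s PCG segment at 2 kHz
print(wavelet_features(segment).shape)   # (level + 1) * 3 = 18 features
```
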
“…Numerous automated classification methods, such as Artificial Neural Networks (ANN), Hidden Markov Model (HMM), K-Nearest Neighbor (KNN), Support Vector Machine (SVM), and Deep Neural Networks (DNN), have been implemented [8], [10], [11]. In [12], the authors utilized a discrete wavelet transform to derive forty-two features. These features were then used to test a support vector machine classifier employing five distinct kernel functions.…”
Section: Introduction
confidence: 99%
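
The excerpt describes evaluating a support vector machine with several kernel functions on wavelet-derived features. A minimal scikit-learn sketch of that experimental pattern follows; the toy 42-feature matrix and the four built-in kernels shown are assumptions, since the excerpt mentions five kernels without naming them.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data: 42 wavelet-derived features per recording with
# binary normal/abnormal labels, echoing the excerpt's description.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 42))
y = rng.integers(0, 2, size=200)

# Compare kernels by cross-validated accuracy; four of scikit-learn's
# built-in kernels stand in for the five kernels mentioned in the excerpt.
for kernel in ["linear", "poly", "rbf", "sigmoid"]:
    clf = make_pipeline(StandardScaler(), SVC(kernel=kernel))
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{kernel:>8s} accuracy: {scores.mean():.3f}")
```
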