2017
DOI: 10.1117/12.2263172
Feature extraction for deep neural networks based on decision boundaries

Cited by 4 publications (3 citation statements)
References 11 publications
“…The traditional method of addressing missing training samples is to expand the existing labeled set by rotating, translating, mirror-transforming, and adding Gaussian noise to the original labeled samples, thereby generating new labeled samples and increasing the sample count. However, from an information standpoint, although data expansion increases sample diversity, the amount of essential information beneficial to training the network does not increase [18,19] (Figs. 2 and 3).…”
Section: Using Transfer Learning To Solve the Problem of Missing Training Samples
Confidence: 99%
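The expansion techniques named in the excerpt above (rotation, mirroring, translation, additive Gaussian noise) can be sketched as a minimal augmentation routine. This is an illustrative example only, not the cited paper's code; the function name `augment`, the one-pixel shift, and the noise scale of 0.05 are arbitrary choices.

```python
import numpy as np

def augment(image, rng):
    """Generate simple label-preserving variants of a 2-D image:
    a rotation, a mirror flip, a translation, and additive Gaussian
    noise. Illustrative sketch of the traditional sample-expansion
    approach; all parameters are arbitrary."""
    return [
        np.rot90(image),                              # 90-degree rotation
        np.fliplr(image),                             # mirror transformation
        np.roll(image, shift=1, axis=0),              # one-pixel translation
        image + rng.normal(0.0, 0.05, image.shape),   # Gaussian noise
    ]

rng = np.random.default_rng(0)
img = rng.random((8, 8))
variants = augment(img, rng)
print(len(variants))  # four new samples from one original
```

Each variant keeps the original label, which is exactly why, as the excerpt notes, the expanded set adds diversity but little genuinely new information.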
“…Feature extraction is one of the fundamental prerequisite processes in machine-learning classification and pattern-recognition applications. As such, there has been extensive research effort into developing accurate feature extraction techniques [1–3]. Feature extraction is the process of deriving a useful set of attributes from raw time-series data while removing its redundant information.…”
Section: Introduction
Confidence: 99%
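The definition in the excerpt above, deriving a compact set of attributes from a raw time series, can be illustrated with a few generic hand-crafted statistics. This is an assumed, minimal sketch, not the feature set of the cited work; the choice of five statistics is arbitrary.

```python
import numpy as np

def extract_features(series):
    """Reduce a raw 1-D time series to a small attribute vector.
    A generic hand-crafted sketch, not a specific published method."""
    s = np.asarray(series, dtype=float)
    return np.array([
        s.mean(),                    # central tendency
        s.std(),                     # spread
        s.min(),                     # range, lower bound
        s.max(),                     # range, upper bound
        np.abs(np.diff(s)).mean(),   # mean absolute first difference
    ])

x = np.sin(np.linspace(0, 2 * np.pi, 100))
feats = extract_features(x)
print(feats.shape)  # (5,)
```

A classifier then sees this 5-dimensional vector instead of the 100 raw samples, which is the redundancy reduction the excerpt describes.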
“…With the surge of deep learning, neural networks can model multiple events and learn richer representations, giving them the potential to fit better models of nonlinear data [15–17]. With multiple layers, deep neural networks (DNNs) [18,19] perform well on decision-boundary and feature-engineering problems by exploiting massive amounts of data [20]. In recent years, the deep neural network hidden Markov model (DNN-HMM) has been proposed as a hybrid architecture and widely applied to acoustic modeling [21–23].…”
Section: Introduction
Confidence: 99%
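The claim above, that multiple layers let a network represent nonlinear decision boundaries, can be made concrete with the smallest classic case: XOR, which no single linear separator can solve but a two-unit hidden layer can. The weights below are hand-chosen for illustration and are not from any cited work.

```python
import numpy as np

def mlp_predict(x):
    """Tiny fixed-weight network with one hidden layer of two ReLU
    units. It realises XOR, a nonlinear decision boundary that a
    single linear classifier cannot represent. Weights are
    hand-chosen for illustration."""
    relu = lambda z: np.maximum(z, 0.0)
    h1 = relu(x[0] + x[1] - 0.5)   # fires when at least one input is 1
    h2 = relu(x[0] + x[1] - 1.5)   # fires only when both inputs are 1
    out = h1 - 3.0 * h2            # subtracting h2 carves out the (1,1) corner
    return int(out > 0.25)

preds = [mlp_predict(np.array(p, dtype=float))
         for p in [(0, 0), (0, 1), (1, 0), (1, 1)]]
print(preds)  # [0, 1, 1, 0]
```

Stacking more such layers is what lets DNNs shape the far more intricate boundaries the excerpt refers to.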