2023
DOI: 10.3390/math11061360

An Efficient Optimization Technique for Training Deep Neural Networks

Abstract: Deep learning is a sub-branch of artificial intelligence that acquires knowledge by training a neural network. It has many applications in banking, the automobile industry, agriculture, and healthcare. Deep learning has played a significant role in solving complex tasks such as image classification, natural language processing, and object detection. Optimizers, in turn, play an intrinsic role in training deep learning models. Recent studies have pr…

Cited by 41 publications (10 citation statements)
References 28 publications
“…Machine learning models have increasingly incorporated statistical features from EEG data to enhance classification and diagnostic capabilities. Deep Neural Networks (DNN) [20, 21], with their capacity for feature learning, have shown promise in deciphering complex patterns from Hjorth parameters and similar statistics. The hierarchical nature of DNNs allows them to distill high-level abstractions from raw data, which is particularly beneficial for identifying subtle neurological differences between various cognitive tasks or pathological states.…”
Section: Results
confidence: 99%
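The statement above feeds Hjorth parameters into a DNN without showing how they are obtained. As a rough Python sketch (the hjorth_parameters helper, the channel count, and the synthetic signal are illustrative assumptions, not taken from the cited work), they can be computed directly from a signal's variance and the variance of its derivatives:

```python
import numpy as np

def hjorth_parameters(x: np.ndarray) -> tuple[float, float, float]:
    """Hjorth parameters of a 1-D signal.

    activity   = var(x)
    mobility   = sqrt(var(x') / var(x))
    complexity = mobility(x') / mobility(x)
    """
    dx = np.diff(x)
    ddx = np.diff(dx)
    var_x, var_dx, var_ddx = np.var(x), np.var(dx), np.var(ddx)
    activity = var_x
    mobility = np.sqrt(var_dx / var_x)
    complexity = np.sqrt(var_ddx / var_dx) / mobility
    return activity, mobility, complexity

# Synthetic stand-in for one EEG epoch: 8 channels x 1000 samples.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((8, 1000))

# Stack the three parameters per channel into one flat feature vector,
# the kind of input a feed-forward DNN classifier would consume.
features = np.array([hjorth_parameters(ch) for ch in eeg]).ravel()
print(features.shape)  # (24,) = 3 parameters x 8 channels
```

The resulting per-channel statistics, rather than the raw time series, would then be the input layer of the DNN described in the quoted passage.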
“…The proposal also involves modeling a neural network (NN) [51] to classify primary and non-primary objects in an image. For an efficient model [52], a good network and carefully designed features [53] are required. Regarding feature design, metrics affecting model performance need to be included in the training.…”
Section: Methodology and Major Contributions
confidence: 99%
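The cited proposal pairs a designed feature set with an NN classifier for primary versus non-primary objects; the excerpt gives no code, so the following is only a minimal PyTorch sketch under assumed dimensions (the 16-feature input, hidden width, and two-class head are placeholders, not values from the source):

```python
import torch
import torch.nn as nn

# Minimal feed-forward classifier over a precomputed feature vector.
class PrimaryObjectClassifier(nn.Module):
    def __init__(self, num_features: int = 16, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_features, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 2),  # logits for primary vs. non-primary
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = PrimaryObjectClassifier()
logits = model(torch.randn(4, 16))  # batch of 4 feature vectors
print(logits.shape)                 # torch.Size([4, 2])
```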
“…The choice of optimizer can be influenced by the particulars of the training task and the kind of data, since each optimizer has its own advantages and uses [65]. For example, Adagrad or RMSprop may be favored for tasks with sparse data, while Adam is frequently used due to its broad applicability across a variety of situations.…”
Section: Model Optimizer
confidence: 99%
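Since the passage frames the optimizer as a per-task choice among Adam, Adagrad, and RMSprop, a minimal PyTorch sketch of swapping between them may help (the toy linear model, learning rates, and dummy loss are illustrative assumptions, not settings from the paper):

```python
import torch

# Toy stand-in model; any nn.Module would do.
model = torch.nn.Linear(10, 2)

# The quoted point: the optimizer is an interchangeable training choice.
# All three are standard torch.optim classes; learning rates are untuned examples.
optimizers = {
    "adam":    torch.optim.Adam(model.parameters(), lr=1e-3),     # broad default
    "adagrad": torch.optim.Adagrad(model.parameters(), lr=1e-2),  # sparse features
    "rmsprop": torch.optim.RMSprop(model.parameters(), lr=1e-3),  # non-stationary objectives
}

opt = optimizers["adam"]  # selected per task and data characteristics
loss = model(torch.randn(8, 10)).pow(2).mean()  # dummy loss for one step
opt.zero_grad()
loss.backward()
opt.step()
```

Only the dictionary lookup changes when a different optimizer is preferred; the training loop itself stays the same, which is what makes the comparison in the quoted passage straightforward to run.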