2012
DOI: 10.9790/3021-0205990993
Pattern Recognition System using MLP Neural Networks

Abstract: Pattern recognition can be used to recognize and/or locate specific objects in an image. Pattern recognition approaches are based on the analysis of statistical parameters computed using image processing tools. The parameters may be compared with standard pattern parameters to identify the pattern, or a neural network may be trained on the statistical parameters to identify a given pattern. In the presented work, a neural network approach has been worked out for identifying a pattern. The neural approach …

Cited by 7 publications (5 citation statements)
References 9 publications (1 reference statement)
“…It is challenging to manually find the most suitable hyperparameter combination for the dataset. At present, hyperparameter optimization mainly depends on mathematical understanding of the algorithm, empirical judgment, and trial and error (Chauhan and Dhingra, 2012; Assi et al., 2018; Itano et al., 2018; Feurer et al., 2019). However, these empirical methods have a high chance of missing the optimal hyperparameter combination for the dataset and of building a bloated network structure that requires too many computing resources. BO is suitable for multidimensional, high-cost evaluation problems and has been widely used in maximum likelihood hyperparameter optimization (Feurer et al., 2019).…”
Section: Tuning the DNN Hyperparameters Using Bayesian Optimization A...
confidence: 99%
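The automated search this statement motivates can be sketched compactly. The snippet below uses random search rather than full Bayesian optimization, and the objective function is a toy stand-in for a real validation error; the function names, hyperparameter ranges, and formula are all illustrative assumptions, not part of any cited method.

```python
import random

def validation_error(hidden_units, learning_rate):
    # Toy stand-in for the validation error of a trained MLP; a real
    # pipeline would train a model and evaluate it on held-out data.
    return (hidden_units - 64) ** 2 / 1000 + (learning_rate - 0.01) ** 2 * 100

def random_search(n_trials=200, seed=0):
    """Sample hyperparameter combinations and keep the best one found."""
    rng = random.Random(seed)
    best_err, best_cfg = None, None
    for _ in range(n_trials):
        cfg = {
            "hidden_units": rng.randint(8, 256),
            "learning_rate": rng.uniform(1e-4, 0.1),
        }
        err = validation_error(**cfg)
        if best_err is None or err < best_err:
            best_err, best_cfg = err, cfg
    return best_err, best_cfg

best_err, best_cfg = random_search()
```

Even this naive strategy replaces manual trial and error with a reproducible loop; Bayesian optimization improves on it by using earlier evaluations to choose where to sample next, which matters when each evaluation (a full training run) is expensive.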
“…However, no one dataset ensures that the site structures of different regions are covered. Researchers (Chauhan and Dhingra, 2012; Assi et al., 2018; Itano et al., 2018) rely on mathematical understanding of the algorithm, empirical judgment, and trial and error to determine network architectures, which incurs high computational costs. Researchers have proposed a data-driven method to limit the initial calculation space and utilized a population-based algorithm to enhance the computational efficiency of nonlinear parameter combinations (Guo et al., 2021; Luo et al., 2022).…”
Section: Introduction
confidence: 99%
“…Here, we exploit the obtained classes, from original-by-original SI and intra-layer UPL tasks, as ground truths of the fingerprinted smartphones to classify "original" or "shared images" into m classes. Generally, ANNs, inspired by the biological form of the human neural system, have proven their effectiveness in classification tasks [57]. They are very flexible in learning features and can solve non-linear problems.…”
Section: Social-by-original Smartphone Identification and Inter-layer...
confidence: 99%
“…The multilayer perceptron [10,11] is the network most often applied in practice to neural network pattern recognition, owing to its relatively simple algorithmic implementation, the availability of advanced training methods, and its parallel computing capabilities. A perceptron-type ANN is also called a multilayer feed-forward neural network because no feedback signals exist: signals propagate from the ANN input to its output through a succession of neural layers with usually sigmoidal activation functions.…”
Section: Problem Of Pattern Recognition By Artificial Neural Network
confidence: 99%
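The feed-forward propagation this statement describes can be sketched in a few lines: input signals pass through successive layers with sigmoidal activations and never loop back. This is a minimal NumPy sketch of inference only (no training); the layer sizes and random weights are illustrative assumptions, not taken from the cited paper.

```python
import numpy as np

def sigmoid(x):
    # Sigmoidal activation, the function the statement names as typical.
    return 1.0 / (1.0 + np.exp(-x))

def mlp_forward(x, weights, biases):
    """Propagate inputs strictly forward, layer by layer: no feedback paths."""
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(a @ W + b)
    return a

rng = np.random.default_rng(0)
# Illustrative 3-4-2 network; sizes chosen arbitrarily for the sketch.
weights = [rng.standard_normal((3, 4)), rng.standard_normal((4, 2))]
biases = [np.zeros(4), np.zeros(2)]
out = mlp_forward(rng.standard_normal((5, 3)), weights, biases)
```

Because each layer's output depends only on the previous layer, the whole pass is a single left-to-right composition of matrix products and activations, which is what makes the architecture simple to implement and to parallelize.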