2010
DOI: 10.1016/j.eswa.2009.05.007
Optimization of optical lens-controlled scanning electron microscopic resolution using generalized regression neural network and genetic algorithm

Cited by 16 publications (8 citation statements) · References 10 publications
“…By its very concept, a GA can be considered a more efficient way of "tweaking" settings to find an optimal configuration, which makes it well suited to optimizing the settings of electron microscopy. For example, Kim et al. [198] suggested that a combination of a generalized regression neural network (GRNN) and a GA can be used to search for the best SEM resolution. A prediction model of the SEM's resolution was built using a GRNN, and the GA was then employed to enhance the prediction model.…”
Section: Applications of Genetic Algorithms (mentioning)
Confidence: 99%
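The GA-over-surrogate idea described above can be sketched in a few lines. This is a minimal illustration, not the authors' code: `toy_resolution` is a made-up stand-in for the GRNN prediction model, and all parameter names and values are assumptions.

```python
import random

def toy_resolution(settings):
    """Hypothetical surrogate model (stand-in for the paper's GRNN):
    predicted resolution score for two normalized lens settings,
    peaking at the fictitious optimum (0.3, 0.7)."""
    x, y = settings
    return -(x - 0.3) ** 2 - (y - 0.7) ** 2

def ga_optimize(fitness, pop_size=30, generations=60, mut=0.1, seed=0):
    """Simple elitist GA over [0, 1]^2: selection, averaging crossover,
    Gaussian mutation."""
    rng = random.Random(seed)
    pop = [[rng.random(), rng.random()] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]                       # selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [(ai + bi) / 2 for ai, bi in zip(a, b)]  # crossover
            child = [min(1.0, max(0.0, c + rng.gauss(0, mut)))
                     for c in child]                         # mutation
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = ga_optimize(toy_resolution)
```

Under these assumptions the GA converges near the surrogate's optimum; in the paper's setting, the fitness call would instead query the trained GRNN.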
“…The method is suitable for regression problems where an assumption of linearity is not justified. A GRNN consists of four layers: an input layer, a pattern layer, a summation layer, and an output layer (Kim et al. 2010). Each unit in the input layer corresponds to an individual observed parameter.…”
Section: Generalized Regression Neural Network (mentioning)
Confidence: 99%
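The four-layer structure can be sketched as a single forward pass. This is a minimal NumPy sketch of a standard GRNN, not the cited authors' implementation; all names are illustrative.

```python
import numpy as np

def grnn_predict(X_train, y_train, x, sigma):
    """Minimal GRNN forward pass.

    Input layer:     the query vector x, one unit per observed parameter.
    Pattern layer:   one Gaussian unit per training sample.
    Summation layer: S neuron (target-weighted sum) and D neuron (plain sum).
    Output layer:    the ratio S / D.
    """
    d2 = np.sum((X_train - x) ** 2, axis=1)      # squared distances to patterns
    k = np.exp(-d2 / (2.0 * sigma ** 2))         # pattern-layer activations
    s = float(np.dot(y_train, k))                # S-summation neuron
    d = float(np.sum(k))                         # D-summation neuron
    return s / d                                 # output layer
```

Because every training sample becomes a pattern unit, the network needs no iterative training; only the spread σ has to be chosen.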
“…The output layer divides the output of each S-summation neuron by that of each D-summation neuron. Therefore, a predicted value ŷ(x) for an unknown input vector x can be expressed as (Kim et al. 2010)…”
Section: Generalized Regression Neural Network (mentioning)
Confidence: 99%
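The equation itself is truncated in the excerpt. The standard GRNN estimator that this passage describes (Specht's formulation, with the S-summation neuron as the numerator and the D-summation neuron as the denominator) is:

```latex
\[
\hat{y}(\mathbf{x}) =
\frac{\sum_{i=1}^{n} y_i \exp\!\left(-\dfrac{D_i^2}{2\sigma^2}\right)}
     {\sum_{i=1}^{n} \exp\!\left(-\dfrac{D_i^2}{2\sigma^2}\right)},
\qquad
D_i^2 = (\mathbf{x}-\mathbf{x}_i)^{\top}(\mathbf{x}-\mathbf{x}_i),
\]
```

where the x_i, y_i are the training samples and σ is the spread; the exact notation in Kim et al. (2010) may differ.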
“…Each unit in the pattern layer is connected to two units (the S- and D-summation neurons) in the summation layer. The S neuron computes the sum of the weighted outputs of the pattern layer, while the D neuron computes the sum of the unweighted outputs of the pattern-layer units (Kim et al., 2010; Haidar et al., 2011). When using a GRNN, the spread (smoothing factor, σ), which affects the degree of generalization of the network, must be determined first.…”
Section: Generalized Regression Neural Network (GRNN) (mentioning)
Confidence: 99%
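The effect of the spread σ on generalization can be seen directly on toy data (the data here are made up for illustration, not from the paper): a small σ makes the network nearly interpolate the training targets, while a large σ smooths the output toward their global mean.

```python
import numpy as np

# Toy 1-D training set (illustrative only).
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0.0, 1.0, 0.0, 1.0])

def grnn(x, sigma):
    d2 = np.sum((X - x) ** 2, axis=1)        # pattern-layer distances
    k = np.exp(-d2 / (2.0 * sigma ** 2))     # Gaussian activations
    return float(np.dot(y, k) / np.sum(k))   # S / D

print(grnn(np.array([1.0]), 0.05))   # close to the training target 1.0
print(grnn(np.array([1.0]), 100.0))  # close to the mean of y, 0.5
```

Choosing σ is thus a bias–variance trade-off, which is why it must be determined (e.g. by cross-validation) before the network is used.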