2007
DOI: 10.1134/s1064226907050014

A comparative study of neural networks for input resistance computation of electrically thin and thick rectangular microstrip antennas

Cited by 5 publications (6 citation statements)
References 51 publications
“…However, in [51,52], more outputs were calculated by using multiple ANFIS. In general, in the literature, each different parameter of each different MSA was computed by using a different individual ANN [53][54][55] or ANFIS model [56][57][58][59][60]. Single neural models were proposed in [61,62] for simultaneously calculating the resonant frequencies of the rectangular, circular, and triangular MSAs.…”
Section: Introduction
Citation type: mentioning
confidence: 99%
“…Neurons in the input layer act only as buffers, distributing the input signals $x_i$ to the neurons in the hidden layer. Each neuron in the hidden layer sums its input signals $x_i$ after weighting them with the strengths of the respective connections $w_{ji}$ from the input layer, and computes its output $y_j$ as a function $f$ of the sum, namely $y_j = f\big(\sum_i w_{ji} x_i\big)$, where $f$ can be a simple threshold function or a sigmoid or hyperbolic tangent function [23]. The output of neurons in the output layer is computed likewise.…”
Section: ANNs
Citation type: mentioning
confidence: 99%
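The excerpt above describes the standard feed-forward computation of a hidden neuron, $y_j = f\big(\sum_i w_{ji} x_i\big)$. As a minimal sketch of that computation (not code from the cited paper), assuming NumPy and a sigmoid or hyperbolic tangent choice for $f$, with all function and variable names illustrative:

```python
import numpy as np

def sigmoid(s):
    """Sigmoid activation, one common choice for f."""
    return 1.0 / (1.0 + np.exp(-s))

def hidden_layer_output(x, W, f=sigmoid):
    """Compute y_j = f(sum_i w_ji * x_i) for every hidden neuron j.

    x : (n_in,)          input signals x_i, passed through unchanged
                         by the buffering input layer
    W : (n_hidden, n_in) connection strengths w_ji
    f : activation function, e.g. a sigmoid or np.tanh
    """
    return f(W @ x)  # each row of W produces one weighted sum

# Example: 3 inputs feeding 4 hidden neurons with random weights.
rng = np.random.default_rng(0)
x = np.array([0.5, -1.2, 0.3])
W = rng.normal(size=(4, 3))
print(hidden_layer_output(x, W))           # sigmoid hidden outputs
print(hidden_layer_output(x, W, np.tanh))  # same sums, tanh activation
```

The output layer applies the same weighted-sum-then-activation step to the hidden outputs, which is why the excerpt notes it is "computed likewise."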
“…Multilayer perceptrons (MLPs) have been applied successfully to solve difficult and diverse problems by training them in a supervised manner with the highly popular error back-propagation algorithm [21]. Each neuron computes its output $y_j = f\big(\sum_i w_{ji} x_i\big)$, where $f$ can be a simple threshold function or a sigmoid or hyperbolic tangent function [22]. The output of neurons in the output layer is computed similarly.…”
Section: Artificial Neural Network
Citation type: mentioning
confidence: 99%
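This excerpt refers to supervised training with the error back-propagation algorithm [21]. Below is a minimal sketch of one back-propagation update for a one-hidden-layer MLP with sigmoid activations and squared-error loss; it is a generic textbook version under those assumptions, not the cited paper's implementation, and all names are illustrative:

```python
import numpy as np

def train_step(x, t, W1, W2, lr=0.1):
    """One error back-propagation update (in place) for a 1-hidden-layer MLP.

    x  : (n_in,)         input signals
    t  : (n_out,)        target outputs
    W1 : (n_hid, n_in)   input-to-hidden weights w_ji
    W2 : (n_out, n_hid)  hidden-to-output weights
    lr : gradient-descent learning rate
    Returns the squared error before the update.
    """
    sig = lambda s: 1.0 / (1.0 + np.exp(-s))
    # Forward pass: each layer applies f to its weighted input sums.
    y1 = sig(W1 @ x)   # hidden outputs y_j
    y2 = sig(W2 @ y1)  # network outputs
    # Backward pass: propagate the output error toward the input layer,
    # using the sigmoid derivative y * (1 - y).
    d2 = (y2 - t) * y2 * (1.0 - y2)        # output-layer deltas
    d1 = (W2.T @ d2) * y1 * (1.0 - y1)     # hidden-layer deltas
    # Gradient-descent weight updates.
    W2 -= lr * np.outer(d2, y1)
    W1 -= lr * np.outer(d1, x)
    return 0.5 * np.sum((y2 - t) ** 2)

# Example: fit a single input/target pair; the error shrinks over updates.
rng = np.random.default_rng(1)
W1, W2 = rng.normal(size=(4, 2)), rng.normal(size=(1, 4))
for _ in range(2000):
    err = train_step(np.array([0.2, 0.8]), np.array([0.6]), W1, W2)
print(err)
```

In the antenna-modeling context of the cited paper, such a network would map geometric and electrical parameters of the microstrip antenna to the quantity of interest (here, input resistance), with training data supplied by measurements or electromagnetic simulation.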