2004
DOI: 10.1109/tsmcb.2003.811767
High-Order Neural Network Structure Selection for Function Approximation Applications Using Genetic Algorithms

Abstract: Neural network literature for function approximation is by now sufficiently rich. In its complete form, the problem entails both parametric (i.e., weights determination) and structural learning (i.e., structure selection). The majority of works deal with parametric uncertainty assuming knowledge of the appropriate neural structure. In this paper we present an algorithmic approach to determine the structure of High Order Neural Networks (HONNs), to solve function approximation problems. The method is based on a…

Cited by 33 publications (15 citation statements). References 16 publications.
“…Boozarjomehry [109] presented a hybrid method based on genetic algorithm and Lindenmayer system [114,115] for the auto-design of NNs. Rovithakis [116] proposed an approach based on genetic algorithms and a stable update law to determine the structure of high-order NNs, for solving function approximation problems.…”
Section: Neural Network Structure Determination (mentioning)
Confidence: 99%
“…However, there are hybrid approaches that use Genetic Algorithms to determine the architecture of a high-order ANN [1], minimizing the chances that the network fails to approximate the function. On the other hand, inserting a new step into the approximation process increases the computational cost.…”
Section: Introduction (unclassified)
“…Instead, input and output data of many systems can be observed easily. Therefore, data approximation becomes one of the most important tasks, which refers to fitting sample data to functions and models (e.g., approximators) [4]. Neuronet was proved to be a universal approximator by various researchers [5]- [8].…”
Section: Introduction (mentioning)
Confidence: 99%