1996
DOI: 10.1007/bf00180404

Preliminary screening of neural network configurations for bioreactor applications

Abstract: A methodology and a computer code have been devised to perform a preliminary analysis of six types of neural networks commonly employed for bioreactor problems. Both static and time-varying data can be analysed, and the values of the parameters and/or sampling times can be chosen according to the system behavior. The results help to select a suitable network configuration for detailed training and application. This is illustrated for a fed-batch fermentation to produce recombinant β-galactosidase.
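
As a rough illustration of the screening idea described in the abstract (not the paper's actual COMPARE code), the following Python sketch trains a few small candidate feedforward networks on the same data and ranks them by a simple performance index, here validation RMSE. The candidate sizes, the toy data, and the use of scikit-learn are assumptions made only for illustration.

```python
# Hypothetical sketch of preliminary network screening by a performance index.
# Not the COMPARE code; candidates, data and the RMSE index are illustrative only.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Toy data: inputs = [time, feed rate], output = a noisy product concentration.
X = rng.uniform(0.0, 1.0, size=(200, 2))
y = np.sin(3.0 * X[:, 0]) * X[:, 1] + 0.05 * rng.normal(size=200)

X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)

# Candidate feedforward configurations (single hidden layer of varying size).
candidates = {
    "mlp_2": (2,),
    "mlp_4": (4,),
    "mlp_8": (8,),
    "mlp_16": (16,),
}

scores = {}
for name, hidden in candidates.items():
    net = MLPRegressor(hidden_layer_sizes=hidden, max_iter=3000, random_state=0)
    net.fit(X_tr, y_tr)
    rmse = np.sqrt(np.mean((net.predict(X_va) - y_va) ** 2))
    scores[name] = rmse

# Rank the configurations; the best few would go on to detailed training.
for name, rmse in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{name:8s} validation RMSE = {rmse:.4f}")
```

In the spirit of the abstract, only the best-ranked one or two configurations would then be carried forward for detailed training and application.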

Cited by 6 publications (4 citation statements)
References 10 publications

“…These differences reinforce the main tenet of the COMPARE library (Patnaik 1996), that in real situations there is no direct choice of an optimal neural configuration; therefore it is important to define the performance index (or a weighted combination of indexes) and then search for the best network. Given that the main theme of this study is the maximization of PHB concentration, an Elman network with four input neurons, two recurrent neurons, four hidden neurons and two output neurons is the best, followed by a Hopfield network with the same numbers of neurons.…”
Section: Application and Discussion
Mentioning confidence: 95%
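
For readers unfamiliar with the Elman architecture quoted above, the following is a minimal numpy sketch of a forward pass with the quoted layer sizes (4 inputs, 2 recurrent/context units, 4 hidden units, 2 outputs). The exact recurrent wiring used in the citing study is not given here, so feeding the context units back from the hidden layer is an assumption.

```python
# Illustrative forward pass of a small Elman-style network (untrained weights).
# The context-unit wiring is an assumption, not taken from the cited study.
import numpy as np

rng = np.random.default_rng(1)
n_in, n_ctx, n_hid, n_out = 4, 2, 4, 2

# Randomly initialised weights; a real application would train these.
W_ih = rng.normal(scale=0.5, size=(n_hid, n_in))   # input   -> hidden
W_ch = rng.normal(scale=0.5, size=(n_hid, n_ctx))  # context -> hidden
W_hc = rng.normal(scale=0.5, size=(n_ctx, n_hid))  # hidden  -> context (feedback)
W_ho = rng.normal(scale=0.5, size=(n_out, n_hid))  # hidden  -> output

def run_sequence(inputs):
    """Propagate a sequence of input vectors, carrying the context state forward."""
    context = np.zeros(n_ctx)
    outputs = []
    for x in inputs:
        hidden = np.tanh(W_ih @ x + W_ch @ context)
        context = np.tanh(W_hc @ hidden)  # delayed state used at the next time step
        outputs.append(W_ho @ hidden)
    return np.array(outputs)

# Example: a short sequence of 4-dimensional process measurements.
seq = rng.uniform(size=(5, n_in))
print(run_sequence(seq))
```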
“…The bioreactor was represented by each of seven ANN configurations from the COMPARE library (Patnaik 1996). Given a set of data, COMPARE does a preliminary screening with the ANNs in the library and generates performance indices that help to decide which network is the most promising.…”
Section: Neural Optimization of PHB Production
Mentioning confidence: 99%
“…As an aid to this, Patnaik [33] developed a library of networks used commonly for microbial processes; inherent in this library was a set of rules to screen competing networks and select a few promising ones for detailed studies. Both the library and the screening rules may be updated.…”
Section: Introduction of the Kinetic Models
Mentioning confidence: 99%
“…The final network architecture was determined through multiple runs to test various architectures. These included multilayer perceptrons with both one and two hidden layers, which have been shown to be effective for bioreactor problems ( , ). It was found that networks with two hidden layers provided improved results without loss of generality, which agrees with previous findings for fermentation reactors ().…”
Section: Methods
Mentioning confidence: 99%
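
The remark about one versus two hidden layers can be illustrated with a small, purely hypothetical comparison on toy data; the layer sizes and the logistic growth curve below are assumptions and are not taken from the cited work.

```python
# Hypothetical comparison of one- versus two-hidden-layer perceptrons on toy data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
# Toy fermentation-like curve: biomass versus time with mild noise.
t = rng.uniform(0.0, 10.0, size=(300, 1))
y = 1.0 / (1.0 + np.exp(-(t[:, 0] - 5.0))) + 0.02 * rng.normal(size=300)

t_tr, t_va, y_tr, y_va = train_test_split(t, y, test_size=0.3, random_state=0)

for label, hidden in [("one hidden layer (8)", (8,)),
                      ("two hidden layers (8, 4)", (8, 4))]:
    net = MLPRegressor(hidden_layer_sizes=hidden, max_iter=5000, random_state=0)
    net.fit(t_tr, y_tr)
    rmse = np.sqrt(np.mean((net.predict(t_va) - y_va) ** 2))
    print(f"{label}: held-out RMSE = {rmse:.4f}")
```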