1993
DOI: 10.1007/978-3-642-78486-6_88

Simulation Neuronaler Netze auf Massiv Parallelen Rechnern (Simulation of Neural Networks on Massively Parallel Computers)

Cited by 14 publications (15 citation statements, published 1996–2023); references 2 publications.
“…volume and surface area of the grain phase per unit reference volume) from a given training set of preclassified data (Tewari, 1997; Wei et al., 1998; Mattfeldt et al., 1999). From the rich choice of available neural network types, we decided to use classical multilayer feedforward networks with backpropagation (MLFF-BP networks) and networks based on learning vector quantization (LVQ networks) (Zell, 1994; Kohonen, 1997) (Fig. 3).…”
Section: Methods (mentioning)
Confidence: 99%
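The MLFF-BP family named in this excerpt can be illustrated with a minimal sketch: one hidden layer, sigmoid activations, and plain gradient-descent backpropagation. The layer sizes, learning rate, and squared-error loss below are illustrative assumptions, not the configuration used in the cited study:

```
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_mlff_bp(X, y, n_hidden=8, lr=0.1, epochs=1000, seed=0):
    """One-hidden-layer feedforward net trained with plain backpropagation."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        # forward pass
        h = sigmoid(X @ W1 + b1)           # hidden activations
        out = sigmoid(h @ W2 + b2)         # network output in (0, 1)
        # backward pass (squared-error loss, sigmoid derivatives)
        d_out = (out - y[:, None]) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        # gradient-descent weight updates, averaged over the batch
        W2 -= lr * h.T @ d_out / len(X)
        b2 -= lr * d_out.mean(axis=0)
        W1 -= lr * X.T @ d_h / len(X)
        b1 -= lr * d_h.mean(axis=0)
    return W1, b1, W2, b2
```

Training on a preclassified set, as the excerpt describes, means `X` holds the input features and `y` the known class labels.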
“…In these studies initialization of LVQ networks was deterministic, hence there was no need for repetition of runs. For the studies with MLFF-BP networks, we used the resilient backpropagation algorithm (Riedmiller & Braun, 1992, 1993), implemented within the Stuttgart Neural Network Simulator (Zell, 1994). For LVQ studies we used the LVQPAK software (Kohonen, 1997).…”
Section: Methods (mentioning)
Confidence: 99%
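Resilient backpropagation (Riedmiller & Braun, 1993), cited above, replaces the global learning rate with a per-weight step size that adapts to the sign of the gradient while ignoring its magnitude. A minimal sketch of one such update follows, in the iRprop- variant; the growth and shrink factors 1.2 and 0.5 are the usual defaults, but this is a generic illustration rather than the exact SNNS implementation:

```
import numpy as np

def rprop_update(w, grad, prev_grad, step,
                 eta_plus=1.2, eta_minus=0.5,
                 step_min=1e-6, step_max=50.0):
    """One iRprop- style update; returns new weights, gradient memory, steps."""
    sign_change = grad * prev_grad
    # gradient kept its sign: grow the per-weight step size
    step = np.where(sign_change > 0, np.minimum(step * eta_plus, step_max), step)
    # gradient flipped sign: shrink the step and forget the gradient
    step = np.where(sign_change < 0, np.maximum(step * eta_minus, step_min), step)
    grad = np.where(sign_change < 0, 0.0, grad)
    # move each weight against the sign of its (possibly zeroed) gradient
    w = w - np.sign(grad) * step
    return w, grad, step
```

Because only the sign of the gradient is used, the update is insensitive to the widely varying gradient magnitudes that slow down plain backpropagation.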
“…From the many available ANN types, a classical multilayer feedforward network with backpropagation (MLFF-BP) and learning vector quantization (LVQ) [25, 26] were used, the latter in two software packages, the SNNS and LVQPAK implementations. The MLFF-BP networks are the most popular ANNs by far, already implemented in various academic and commercial software packages.…”
Section: Methods (mentioning)
Confidence: 99%
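The LVQ networks mentioned in these excerpts classify by nearest codebook vector. A minimal sketch of the LVQ1 training rule follows, with an illustrative learning rate and epoch count; the cited studies used the LVQPAK and SNNS implementations rather than code like this:

```
import numpy as np

def lvq1_train(X, labels, codebook, codebook_labels,
               lr=0.05, epochs=30, seed=0):
    """Basic LVQ1: pull the winning codebook vector toward a correctly
    classified sample and push it away from a misclassified one."""
    rng = np.random.default_rng(seed)
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            x = X[i]
            # nearest codebook vector by squared Euclidean distance
            win = np.argmin(((codebook - x) ** 2).sum(axis=1))
            direction = 1.0 if codebook_labels[win] == labels[i] else -1.0
            codebook[win] += direction * lr * (x - codebook[win])
    return codebook
```

Note that training only moves existing codebook vectors, which is why a deterministic initialization, as in the excerpt above, removes the need for repeated runs.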
“…Neural networks have been employed for many classification tasks in Bioinformatics and Computational Biology. EvoDNN is a framework that employs an Evolutionary Algorithm (EA) to evolve the weights, biases and AFs of a deep heterogeneous feed-forward neural network (FNN) [87] (a neural network (NN) that consists of many hidden layers, often more than five, is called a deep neural network). EvoDNN extends our earlier framework EvoNN [70], which evolved a simple, single hidden layer FNN.…”
Section: Motivation (mentioning)
Confidence: 99%
“…Finally, gradient descent methods struggle to train deep neural networks [29], owing to gradient vanishing and explosion [40], saddle points, and other effects, which affect not only deep FNNs [87] but also recurrent networks [64]. However, the depth of the network has no impact on neural networks trained by EAs, because EAs are a zero-order, population-based method.…”
Section: Motivation (mentioning)
Confidence: 99%
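The zero-order, population-based training that this excerpt credits EAs with can be sketched as a simple evolutionary loop over flattened weight vectors; no gradients are computed, so vanishing or exploding gradients never enter. This is a generic truncation-selection scheme with Gaussian mutation, not the EvoDNN algorithm itself (which also evolves biases and activation functions); the population size, mutation scale, and larger-is-better `fitness` callback are assumptions:

```
import numpy as np

def evolve_weights(fitness, dim, pop_size=50, generations=200,
                   sigma=0.1, seed=0):
    """Zero-order (gradient-free) evolution of a flat weight vector:
    truncation selection plus Gaussian mutation, with elitism."""
    rng = np.random.default_rng(seed)
    pop = rng.normal(size=(pop_size, dim))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        # keep the top quarter (fitness is assumed larger-is-better)
        elite = pop[np.argsort(scores)[-pop_size // 4:]]
        # refill the population with mutated copies of elite parents
        parents = elite[rng.integers(len(elite), size=pop_size)]
        pop = parents + sigma * rng.normal(size=(pop_size, dim))
        pop[:len(elite)] = elite  # elitism: carry the best forward unchanged
    return pop[np.argmax([fitness(ind) for ind in pop])]
```

Because selection depends only on the fitness score of each whole weight vector, network depth changes the dimension of the search space but never the mechanics of the update, which is the point the excerpt makes.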