Proceedings, 11th IAPR International Conference on Pattern Recognition, Vol. II, Conference B: Pattern Recognition Methodology and Systems
DOI: 10.1109/icpr.1992.201708
Feedforward neural networks with random weights

Cited by 336 publications (211 citation statements)
References 3 publications
“…In the first step we collect social and academic data from a set of computer science students. Using such dataset, we designed a classifier with reject option where a Feedforward Neural Network with Random Weights (FNNRW) [Schmidt et al 1992] was used as a base learner.…”
Section: Methods (mentioning, confidence: 99%)
“…Predecessors of these models were proposed in a number of early works on feedforward architectures, e.g. in [42,43]. A more mature version of RW-FNNs, called Random Vector Functional-Link (RVFL) networks were introduced in [44,45].…”
Section: Related Work (mentioning, confidence: 99%)
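The Related Work excerpt contrasts these networks with Random Vector Functional-Link (RVFL) networks, which additionally feed the raw inputs directly to the linear readout alongside the random hidden features. A hedged sketch of that variant, under the same illustrative assumptions as the previous block:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_rvfl(X, y, n_hidden=100):
    W = rng.normal(size=(X.shape[1], n_hidden))   # random, untrained hidden weights
    b = rng.normal(size=n_hidden)
    D = np.hstack([np.tanh(X @ W + b), X])        # hidden features plus direct input links
    beta, *_ = np.linalg.lstsq(D, y, rcond=None)  # one linear readout over both parts
    return W, b, beta

def predict_rvfl(X, W, b, beta):
    return np.hstack([np.tanh(X @ W + b), X]) @ beta
```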
“…Only the outputs of the unit are connected to the units of next layer. Therefore there is no feedback in the system [7]. …”
Section: Feed Forward Neural Network (mentioning, confidence: 99%)