2019
DOI: 10.1016/j.ins.2019.04.055
Deep stacked stochastic configuration networks for lifelong learning of non-stationary data streams

Abstract: The concept of stochastic configuration networks (SCNs) offers a solid framework for fast implementation of feedforward neural networks through randomized learning. Unlike conventional randomized approaches, SCNs provide an avenue to select an appropriate scope of random parameters to ensure the universal approximation property. In this paper, a deep version of stochastic configuration networks, namely the deep stacked stochastic configuration network (DSSCN), is proposed for modeling non-stationary data streams. As …
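To make the randomized-learning idea in the abstract concrete, here is a minimal sketch of SCN-style incremental construction, assuming a sigmoid hidden layer, a pool of random candidate nodes, and a widening scope grid. The acceptance test below is a simplified residual-reduction score, not the paper's exact supervisory inequality, and all parameter names (build_scn, scope_grid, tol) are illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def build_scn(X, y, max_nodes=25, candidates=50,
              scope_grid=(1.0, 5.0, 10.0, 50.0), tol=1e-3, seed=0):
    """Incrementally add random hidden nodes until the residual is small."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    H = np.empty((n, 0))          # outputs of the hidden nodes added so far
    beta = np.empty(0)            # output weights (refit after every new node)
    residual = y.astype(float).copy()
    for _ in range(max_nodes):
        best_score, best_h = -np.inf, None
        for lam in scope_grid:                    # widen the random scope
            for _ in range(candidates):
                w = rng.uniform(-lam, lam, size=d)
                b = rng.uniform(-lam, lam)
                h = sigmoid(X @ w + b)
                score = (h @ residual) ** 2 / (h @ h)   # residual reduction
                if score > best_score:
                    best_score, best_h = score, h
            if best_score > tol:                  # a useful node was found
                break
        H = np.column_stack([H, best_h])
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)    # least-squares refit
        residual = y - H @ beta
        if np.linalg.norm(residual) < tol:
            break
    return H, beta
```

The point the abstract emphasises is visible in the scope loop: random weights are not drawn from a fixed range but from a scope that is enlarged until a candidate node measurably reduces the residual, which is what underpins the universal approximation guarantee of SCNs.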

Cited by 57 publications (16 citation statements)
References: 49 publications
“…This is done by associating each data entity with a rumour index, which reflects its current state of being influenced by harmful information. Measuring falsehood is essentially a problem of probability modelling, which compresses detective cues into a probability value, as studied in Bayesian statistics, frequentist statistics, and machine learning [1,21]. Several detective cues have been investigated [4,8,22]:…”
Section: Streaming Social Signals
confidence: 99%
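For readers unfamiliar with the idea of compressing detective cues into a single probability, the following sketch shows one conventional way to do it with a logistic model. The cue names and weights are invented for illustration and are not taken from the cited works.

```python
import math

def rumour_index(cues, weights, bias=0.0):
    """Map a dict of cue scores to a probability-valued rumour index in (0, 1)."""
    z = bias + sum(weights[name] * value for name, value in cues.items())
    return 1.0 / (1.0 + math.exp(-z))

# Example: three hypothetical cues for one social-media post.
cues = {"source_credibility": -0.8, "propagation_burst": 1.5, "stance_conflict": 0.9}
weights = {"source_credibility": 1.2, "propagation_burst": 0.7, "stance_conflict": 0.5}
print(rumour_index(cues, weights))   # approximately 0.63
```

In practice the weights would be learned, by Bayesian or frequentist estimation or a trained classifier, which is exactly the probability-modelling problem the excerpt refers to.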
“…This section discusses experimental study of DEVFNN in five popular real-world and synthetic data stream problems: electricity-pricing, weather, SEA, hyperplane, SUSY, and indoor RFID localization problem from our own project. DEVFNN is compared against prominent data stream algorithms in the literature: pENsemble [32], pENsemble+ [34], DSSCN [33], gClass [35]. Comparison with pENsemble and pENsemble+ are presented to illustrate the advantages of deep stacked network structure against ensemble structure, while DSSCN represents a deep algorithm created by the concept of random shift rather than the feature augmentation approach.…”
Section: Numerical Study
confidence: 99%
“…A novel incremental DNN, namely Deep Evolving Fuzzy Neural Network (DEVFNN), is proposed in this paper. DEVFNN features a fully elastic structure where not only its fuzzy rule can be autonomously evolved but also the depth of network structure can be adapted in the fully automatic manner [33]. This property is capable of handling dynamic variations of data streams but also delivering continuous improvement of predictive performance.…”
Section: Introduction
confidence: 99%
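The elastic-structure idea can be illustrated with a small, purely conceptual sketch of a drift-triggered growth loop. This is not the DEVFNN algorithm itself; the sliding-window drift test and the growth rules below are assumptions made only to show the flavour of rule and depth adaptation.

```python
from collections import deque

class EvolvingStack:
    """Toy evolving structure: add rules on mild drift, add layers on strong drift."""

    def __init__(self, drift_threshold=0.15, window=200):
        self.layers = [[]]                  # each layer holds a list of "rules"
        self.errors = deque(maxlen=window)  # sliding window of recent errors
        self.drift_threshold = drift_threshold

    def update(self, error):
        """Feed the latest prediction error; grow the structure when it degrades."""
        self.errors.append(error)
        if len(self.errors) < self.errors.maxlen:
            return
        half = self.errors.maxlen // 2
        old = sum(list(self.errors)[:half]) / half
        new = sum(list(self.errors)[half:]) / half
        if new - old > self.drift_threshold:   # error rose markedly: assume drift
            self.layers.append([])             # deepen the stack
            self.errors.clear()
        elif new > old:                         # mild degradation
            self.layers[-1].append("new_rule")  # grow the top layer instead
```

A real system would use a statistical drift detector and retrain the affected components, but the sketch shows where depth adaptation differs from the fixed-depth networks the excerpt contrasts it with.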
“…In such scenarios, the fixed learner obtained by training deep networks on pre-collected samples always cannot finely adapt the varying environment, and the over-fitting issue then inevitably appears in the testing stage. Thus it is also a critical issue to make deep learning strategy automatically adaptable to open and dynamic environments [42,43] . How to improve the generalization capability of a deep learning approach should be the most significant issue for attaining this "learn to adapt" purpose.…”
Section: Automatic Adapting To Dynamic Environments
confidence: 99%