2020
DOI: 10.1007/978-3-030-43722-0_39

Neuro-Evolutionary Transfer Learning Through Structural Adaptation

Cited by 10 publications (5 citation statements); References 11 publications.
“…The proposed work utilizes components from the Evolutionary Exploration of Augmenting Memory Models (EXAMM) algorithm [31] as its core, applying its mutation, crossover and training operations in online scenarios. EXAMM is a distributed NE algorithm that evolves progressively larger RNNs for large-scale, multivariate, real-world TSF [14,15]. EXAMM evolves RNN architectures consisting of varying recurrent connections and memory cells through a series of mutation and crossover (reproduction) operations.…”
Section: Methods
confidence: 99%
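The mutation and crossover (reproduction) operations described in the statement above can be illustrated with a minimal sketch. The graph encoding, `mutate`, and `crossover` below are hypothetical simplifications for illustration, not EXAMM's actual genome representation or operators:

```python
import random

def mutate(genome, rng):
    """Structural mutation: add either a new node or a new edge
    to a simple graph genome (hypothetical encoding)."""
    g = {"nodes": list(genome["nodes"]), "edges": set(genome["edges"])}
    if rng.random() < 0.5:
        g["nodes"].append(max(g["nodes"]) + 1)  # grow the network by one node
    else:
        a, b = rng.sample(g["nodes"], 2)        # connect two existing nodes
        g["edges"].add((a, b))
    return g

def crossover(p1, p2, rng):
    """Reproduction: union the parents' nodes and keep each
    candidate edge with some probability."""
    nodes = sorted(set(p1["nodes"]) | set(p2["nodes"]))
    edges = {e for e in (p1["edges"] | p2["edges"]) if rng.random() < 0.75}
    return {"nodes": nodes, "edges": edges}

rng = random.Random(42)
parent_a = {"nodes": [0, 1, 2], "edges": {(0, 2)}}
parent_b = mutate(parent_a, rng)
child = crossover(parent_a, parent_b, rng)
```

In an actual NE loop these operators would be applied repeatedly, with each candidate network trained briefly and selected on validation fitness; the sketch only shows the structural step.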
“…Generated offspring inherit their weights from their parents, which can significantly reduce the time needed for their training and evaluation [25]. It has been shown that EXAMM can swiftly adapt RNNs in transfer learning scenarios, even when the input and output data streams are changed [14,15], which served as a preliminary motivation and justification for adapting and evolving RNNs for TSF in online scenarios.…”
Section: Methods
confidence: 99%
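The weight-inheritance idea in the statement above (offspring reusing trained parent weights so they need less retraining) can be sketched minimally. The edge-keyed weight map and the `inherit_weights` helper are hypothetical, not EXAMM's actual data structures:

```python
import random

def inherit_weights(child_edges, parent_weights, rng, sigma=0.1):
    """Offspring reuse trained parent weights for edges shared with the
    parent; edges new to the child get small random initial weights
    (a simplified Lamarckian-style inheritance)."""
    return {
        edge: parent_weights.get(edge, rng.gauss(0.0, sigma))
        for edge in child_edges
    }

rng = random.Random(0)
parent_weights = {(0, 1): 0.57, (1, 2): -0.31}   # trained parent weights
child_edges = [(0, 1), (1, 2), (0, 2)]           # child added edge (0, 2)
child_weights = inherit_weights(child_edges, parent_weights, rng)
```

Because only the newly added connection starts from a random value, the child network begins close to its parent's trained behavior, which is what shortens its training and evaluation time.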
“…We chose eRNN for our TVA-E process due to its demonstrated ability to make accurate and reliable predictions using limited amounts of data in comparison with alternative options [14][15][16][17]. However, a myriad of alternative prediction techniques could be easily incorporated into our process to account for tactic volatility.…”
Section: Proposed Tactic Volatility Aware Process
confidence: 99%
“…The crucial aspect of the SLM approach is the geometric semantic mutation. ElSaid et al [36] proposed an approach based on [35], called the network-aware adaptive structure transfer learning strategy, with the goal of improving training time for deep RNNs. The authors used statistical information about the topology of the "source RNN" and its weight distributions.…”
Section: E. DL Architecture: Deep Belief Network
confidence: 99%
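The use of source-network weight-distribution statistics described above can be illustrated with a small sketch. The `stats_init` helper is hypothetical: it draws new weights from a normal distribution fit to the source network's weights, which is a simplified reading of the network-aware strategy, not the cited authors' implementation:

```python
import random
import statistics

def stats_init(source_weights, n_new, rng):
    """Initialize n_new weights for a target network by sampling from a
    normal fit (mean, stdev) of the source network's weight distribution."""
    mu = statistics.mean(source_weights)
    sd = statistics.stdev(source_weights)   # requires >= 2 source weights
    return [rng.gauss(mu, sd) for _ in range(n_new)]

rng = random.Random(7)
source = [0.12, -0.05, 0.33, 0.08, -0.21]  # weights from a trained source RNN
new_weights = stats_init(source, 4, rng)
```

The intuition is that starting target-network weights in the same range the source network converged to should shorten training relative to a generic random initialization.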