2009
DOI: 10.1007/s11063-009-9096-2
Structural Properties of Recurrent Neural Networks

Abstract: In this article we investigate the impact of the adaptive learning process of recurrent neural networks (RNN) on the structural properties of the derived graphs. A trained fully connected RNN can be converted to a graph by defining edges between pairs of nodes having significant weights. We measured structural properties of the derived graphs, such as characteristic path lengths, clustering coefficients and degree distributions. The results imply that a trained RNN has significantly larger clustering coefficient …
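The procedure described in the abstract — thresholding trained weights into edges, then measuring characteristic path length, clustering coefficient, and degree distribution — can be sketched as follows. This is a minimal illustration, not the paper's method: the random stand-in weight matrix, the threshold of 1.0, and the symmetrization into an undirected graph are all assumptions made here for the example.

```python
import numpy as np
from collections import deque

# Stand-in for a trained fully connected RNN weight matrix
# (the paper would use actual learned weights; this is random).
rng = np.random.default_rng(0)
n = 30
W = rng.normal(size=(n, n))

# Keep an edge only where the weight is "significant" (threshold is
# an illustrative choice, not taken from the paper).
A = np.abs(W) > 1.0
np.fill_diagonal(A, False)   # ignore self-loops
A = A | A.T                  # symmetrize -> undirected adjacency

# Degree distribution
deg = A.sum(axis=1)

# Clustering coefficient per node: triangles through the node divided
# by the number of possible neighbour pairs.
Ai = A.astype(int)
tri = np.diagonal(Ai @ Ai @ Ai) / 2          # closed 3-walks / 2 = triangles
denom = deg * (deg - 1) / 2
cc = np.divide(tri, denom, out=np.zeros(n), where=denom > 0)
avg_clustering = cc.mean()

# Characteristic path length: mean shortest-path length over reachable
# node pairs, computed by BFS from every node.
def bfs_lengths(src):
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in np.flatnonzero(A[u]):
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

lengths = [d for s in range(n) for d in bfs_lengths(s).values() if d > 0]
char_path_length = sum(lengths) / len(lengths)
```

The paper's comparison then rests on contrasting these statistics before and after training; with real trained weights, the abstract reports a significantly larger clustering coefficient than for a comparable random graph.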

Cited by 4 publications (3 citation statements)
References 17 publications (21 reference statements)
“…Each approach has advantages and limitations for modelling human cognition, and there are emerging lines of research that use the modern tools of network science to study the structure of neural networks (e.g. [24][25][26][27][28][29]). Our paper aims to highlight how modern network science can help advance our understanding of cognition, rather than to directly compare the network science and connectionism approaches.…”
Section: Spiral Of Representation: Defining Cognitive Representationsmentioning
confidence: 99%
“…On the other hand, Connectionism primarily aims to model a process (e.g., learning), which is implemented as incremental modifications of a network-like structure (e.g., edge weights) to fine-tune the output and performance of the model. Each approach has advantages and limitations for modeling human cognition, and there are even emerging lines of research that use the modern tools of Network Science to study the structure of neural networks (e.g., Annunziato, Bertini, De Felice, & Pizzuti, 2007; Dobnikar & Šter, 2009; Li, 2008; Simard, Nadeau, & Kröger, 2005; Tang, Xi, & Ma, 2006; Torres, Muñoz, Marro, & Garrido, 2004). Our paper aims to highlight how modern network science can help advance our understanding of cognition, rather than to directly compare the Network Science and Connectionism approaches.…”
Section: Spiral Of Representation: Defining Cognitive Representationsmentioning
confidence: 99%
“…Later, neural network parsers were enumerated by Miikkulainen [45] and Lane and Henderson [36]. [10], and Dobnikar and Šter [16]. Kolen and Kremer [33] used the concept of recurrent neural networks as an advancement over feed-forward neural networks.…”
Section: Neural Network and Recurrent Neural Networkmentioning
confidence: 99%