2003
DOI: 10.1590/s0104-65002003000100005
Symbolic processing in neural networks

Abstract: In this paper we show that programming languages can be translated into recurrent (analog, rational weighted)

Cited by 15 publications (6 citation statements)
References 6 publications
“…First we present the neural network architecture able to sustain symbolic computation (more details in Neto et al., 1998, 2003). The chosen analog recurrent neural net model is a discrete-time dynamic system, x(t+1) = φ(x(t), u(t)), with initial state x(0) = x_0, where t denotes time, x_i(t) denotes the activity (firing frequency) of neuron i at time t, within a population of N interconnected neurons, and u_k(t) denotes the value of input channel k at time t, within a set of M input channels.…”
Section: Neural Symbolic Computation
confidence: 99%
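The update rule quoted above can be sketched concretely. The excerpt only specifies the general form x(t+1) = φ(x(t), u(t)); the sketch below additionally assumes an affine combination W·x(t) + U·u(t) + b passed through a saturated-linear activation σ, a common choice in analyses of analog recurrent nets, so W, U, b, and σ are illustrative assumptions rather than details from the paper:

```python
def sigma(v):
    # Saturated-linear activation (an assumption; the excerpt does not
    # fix the form of phi).
    return min(1.0, max(0.0, v))

def step(x, u, W, U, b):
    # One discrete-time update, componentwise:
    # x_i(t+1) = sigma(sum_j W[i][j] x_j(t) + sum_k U[i][k] u_k(t) + b_i)
    N = len(x)
    return [sigma(sum(W[i][j] * x[j] for j in range(N))
                  + sum(U[i][k] * u[k] for k in range(len(u)))
                  + b[i])
            for i in range(N)]

# Toy instance: N = 2 neurons, M = 1 input channel, x(0) = (0, 0).
W = [[0.0, 0.5], [0.5, 0.0]]
U = [[1.0], [0.0]]
b = [0.0, 0.0]
x = [0.0, 0.0]
for t in range(3):
    x = step(x, [1.0], W, U, b)   # constant input on the single channel
# x now holds x(3); neuron 0 has saturated at 1.0
```

States and weights are rational-valued, matching the "analog, rational weighted" model named in the abstract.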
“…If provided a high-level description of an algorithm A, is it possible to automatically create a neural network that computes the function described by A? Our previous works (Neto et al., 1998, 2003, 2006) show that it is possible to answer this question with a simple discrete-time network model. Related work on symbolic processing in neural networks can be found in (Gruau et al., 1995; Siegelmann, 1999; Carnell et al., 2007; Herz et al., 2006).…”
Section: Introduction
confidence: 99%
“…Programmable neural networks (Verona et al., 1991; Neto et al., 2003; Eliasmith and Stewart, 2011; Bošnjak et al., 2017; Katz et al., 2019; Davis et al., 2021) comprise one potential approach to building such systems. These are neural networks whose dynamics can emulate execution of human-authored source code.…”
Section: Introduction
confidence: 99%
“…One approach to this problem aims to “compile” symbolic source code into a set of equivalent neural network weights, such that running the resulting network dynamics effectively emulates execution of the source code. We refer to such models as “programmable neural networks.” Examples from the past several decades include (Verona et al., 1991; Gruau et al., 1995; Neto et al., 2003; Eliasmith and Stewart, 2011), which often use local representation (i.e., one neuron represents one program variable) and/or static weights that do not change after “compilation” time. More recent approaches often use modern deep learning tools and gradient-based optimization to obtain the weights from training examples (Graves et al., 2016; Reed and De Freitas, 2016; Bošnjak et al., 2017), and employ model architectures that are “hybrid” (not purely neural) or otherwise biologically implausible.…”
Section: Introduction
confidence: 99%
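The local-representation idea described above (one neuron per program variable, weights fixed at "compilation" time) can be illustrated with a toy sketch. This is a hypothetical example of the general scheme, not the compilation method of any cited paper: a weighted assignment such as z ← 0.5·x + 0.5·y is turned into one static weight row, and "running" the network applies that row to the current state.

```python
def compile_assignment(target, coeffs, num_vars):
    # "Compile" target <- sum(coeffs[v] * value_of(v)) into one static
    # weight row over the variable neurons (hypothetical scheme).
    row = [0.0] * num_vars
    for v, c in coeffs.items():
        row[v] = c
    return target, row

def run(state, target, row):
    # Applying the compiled row updates the target neuron from the
    # current values of the other variable neurons.
    new = list(state)
    new[target] = sum(row[j] * state[j] for j in range(len(state)))
    return new

# Variables x, y, z live in neurons 0, 1, 2 (local representation).
target, row = compile_assignment(2, {0: 0.5, 1: 0.5}, 3)
state = run([0.6, 0.2, 0.0], target, row)   # z becomes ~0.4
```

The weights never change after `compile_assignment` returns, which is the "static weights" property the excerpt contrasts with gradient-trained approaches.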
“…This work implements a compiler and a simulator based on the earlier paper Turing Universality of Neural Nets (Revisited) [4]. Similar ideas are given in [3], [7] and [5], but they are based on higher-level languages.…”
Section: Introduction
confidence: 99%