Published: 2000
DOI: 10.1016/s0893-6080(00)00027-7
Mutual information of sparsely coded associative memory with self-control and ternary neurons

Cited by 20 publications (16 citation statements)
References 26 publications
“…In order to measure the retrieval quality of the recall process, we use the mutual information function [5,6,13,14]. In general, it measures the average amount of information that can be received by the user by observing the signal at the output of a channel [15,16].…”
Section: The Model
confidence: 99%
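The mutual information invoked in this statement is the standard channel quantity. As a minimal sketch, assuming the stored pattern bit ξ is treated as the channel input and the retrieved neuron state σ as the channel output (an identification made here only for illustration, not taken from the excerpt):

\[
I(\sigma;\xi) \;=\; H(\sigma) - H(\sigma\mid\xi)
\;=\; \sum_{\sigma,\xi} p(\sigma,\xi)\,\log_2\frac{p(\sigma,\xi)}{p(\sigma)\,p(\xi)} .
\]

For ternary neurons both variables would range over three states, so p(σ,ξ) is a 3×3 table whose entries would typically be summarized by order parameters such as the retrieval overlap and the neural activity.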
“…To solve this problem, quite recently a self-control mechanism has been introduced in the dynamics of networks for so-called diluted architectures [5]. This self-control mechanism introduces a time-dependent threshold in the transfer function [5,6]. It is determined as a function of both the cross-talk noise and the activity of the stored patterns in the network, and adapts itself in the course of the recall process.…”
Section: Introduction
confidence: 99%
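To make the self-control idea concrete, here is a minimal Python sketch of a recall dynamics in which the firing threshold is not a user-specified constant but is recomputed at every step from the storage load and the instantaneous network activity. The particular choice θ_t = sqrt(-2 ln a) · sqrt(α q_t) is only an assumed stand-in for the cross-talk-noise-based rule described in the excerpt; the exact functional form used in [5,6] may differ.

```python
import numpy as np

# Illustrative sketch, not the cited papers' exact update rule: a sparsely
# coded binary network whose threshold adapts itself during recall instead of
# being hand-tuned by the user.
rng = np.random.default_rng(0)

N, P, a = 2000, 50, 0.05      # neurons, stored patterns, pattern activity
alpha = P / N                  # storage load

# Sparse binary patterns xi in {0, 1} with mean activity a.
xi = (rng.random((P, N)) < a).astype(float)

# Covariance-rule couplings for sparse patterns, no self-coupling.
J = (xi - a).T @ (xi - a) / (a * (1.0 - a) * N)
np.fill_diagonal(J, 0.0)

# Start from a corrupted version of pattern 0 (5% of the bits flipped).
sigma = xi[0].copy()
flip = rng.random(N) < 0.05
sigma[flip] = 1.0 - sigma[flip]

for t in range(15):
    h = J @ sigma                            # local fields
    # Self-control: the threshold is a function of the pattern activity a and
    # of an estimate of the cross-talk noise level (here ~ sqrt(alpha * current
    # network activity)), so it adapts in the course of the recall process.
    theta_t = np.sqrt(-2.0 * np.log(a)) * np.sqrt(alpha * sigma.mean())
    sigma = (h > theta_t).astype(float)      # parallel threshold update

# Retrieval overlap with pattern 0 and final network activity.
m = ((xi[0] - a) * sigma).sum() / (a * (1.0 - a) * N)
print(f"overlap m = {m:.3f}, activity = {sigma.mean():.3f}")
```

The design point illustrated here is that no threshold value has to be supplied by the user; it is derived from a and from the network's own state at each time step.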
“…It is known that an important property required for efficient design is an autonomous function independent from, for example, user-specified parameters [4]. The use of many user-specified parameters requires a user to have rich prior knowledge, which often does not exist for complex real-world problems.…”
Section: B. Constructive Approach
confidence: 99%
“…It is determined as a function of both the cross-talk noise and the activity of the stored patterns in the network, and adapts itself in the course of the recall process. It furthermore allows optimal retrieval performance to be reached both in the absence and in the presence of synaptic noise [5,6,7,8]. These diluted architectures contain no common ancestor nodes, in contrast with feedforward architectures.…”
Section: Introduction
confidence: 99%