2008
DOI: 10.1109/mci.2008.926615
Hypernetworks: A Molecular Evolutionary Architecture for Cognitive Learning and Memory

Abstract: Recent interest in human-level intelligence suggests a rethink of the role of machine learning in computational intelligence. We argue that without cognitive learning the goal of achieving human-level synthetic intelligence is far from completion. Here we review the principles underlying human learning and memory, and identify three of them, i.e., continuity, glocality, and compositionality, as the most fundamental to human-level machine learning. We then propose the recently-developed hypernetwork model as a …

Cited by 81 publications (56 citation statements); references 55 publications.
“…The properties of the hypernetwork model are summarized as three aspects: glocality, compositionality and self association based on randomness and recall [5].…”
Section: Hypernetwork Model
confidence: 99%
“…(8) More explanations on the derivative of the log-likelihood are shown in [5]. Therefore, the log-likelihood of the hypernetwork can be maximized by decreasing the difference of the hyperedges from a given data set.…”
Section: Hypernetwork Model
confidence: 99%
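The quoted statement describes learning as reducing the mismatch between a population of hyperedges and the data. The following is a minimal sketch of that idea only; the function names, the fixed edge order k, and the resample-on-mismatch rule are illustrative assumptions, not the authors' actual gradient-based procedure from [5].

```python
import random

def sample_hyperedges(example, k, n):
    """Sample n random hyperedges (k-subsets of feature indices paired
    with their values) from a single binary example."""
    idx = range(len(example))
    return [frozenset((i, example[i]) for i in random.sample(idx, k))
            for _ in range(n)]

def train(data, k=3, n_per_example=20, epochs=5):
    """Build a hyperedge library, then resample edges that mismatch a
    randomly drawn example, moving the library distribution toward the
    data (a crude surrogate for maximizing the log-likelihood)."""
    library = [e for x in data for e in sample_hyperedges(x, k, n_per_example)]
    for _ in range(epochs):
        for j, edge in enumerate(library):
            x = random.choice(data)
            # An edge "matches" x if every (index, value) pair agrees.
            if not all(x[i] == v for i, v in edge):
                library[j] = sample_hyperedges(x, k, 1)[0]  # replace mismatched edge
    return library

def score(library, x):
    """Count matching hyperedges: a simple associative-recall score."""
    return sum(all(x[i] == v for i, v in e) for e in library)
```

After training on two complementary patterns, a stored pattern collects many more matching hyperedges than a corrupted one, which is the recall behavior the quoted statements attribute to the model.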
“…These techniques include probabilistic graphical models, such as Bayesian networks [5], Markov random fields [6], and Markov logic networks [7]. More recent models include deep belief networks [8] and random hypergraph structures [9] for parallel associative memory.…”
Section: Introduction
confidence: 99%