1999
DOI: 10.1142/s0218213099000191

Information Blending in Virtual Associative Networks: A New Paradigm for Sensor Integration

Abstract: Research reported in this article is motivated, in part, by current U.S. military programs aimed at the development of efficient data integration and sensor management methods capable of handling large sensor suites and achieving robust target recognition performance in real-time scenarios. Modern sensor systems have shown good recognition abilities against a few isolated targets. However, these capabilities decline steeply when multiple sensors are acting against large target groups under realistic conditions…

Cited by 6 publications (3 citation statements)
References 4 publications

“…The self-adaptive resource optimization framework (Yufik, 1998b, 2002; Yufik and Malhotra, 1999; Yufik and Sheridan, 2002) offers a simple account of cognitive processes, highlighting the crucial role of Markov blanket induction in neuronal systems, as a pivotal optimization mechanism.…”
Section: Discussion and Suggestions for Further Research
confidence: 99%
“…The computational architecture of associative cortices readily affords self-partitioning in associative networks allowing near-optimal behavior. In Gnostron, the partitioning quality is defined by a simple criterion: choose a particular optimization algorithm and compare results obtained before (baseline) and after partitioning into packets [a stripped down, proof-of-concept system for target recognition obtained close to two orders of magnitude complexity reduction with acceptably small error amplitude (Malhotra and Yufik, 1999; Yufik and Malhotra, 1999)]. Figure 13 generalizes the gnostron proposal.…”
Section: Machine Situational Understanding
confidence: 99%
“…It has been shown that grouping by packets (i.e., via forming associative networks partitioned into internally cohesive and externally weakly coupled neuronal groups) is near-optimal, i.e., maximizes rewards while minimizing the number of unproductive allocations. Packet mechanism was shown to yield near-optimal results and orders of magnitude reduction in the amount of computation in large-scale resource optimization tasks (e.g., target recognition) [9], [10].…”
Section: Gnostron: A Framework for Machine Understanding
confidence: 99%
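
Both quoted passages above describe the same packet mechanism: partition an association graph into internally cohesive, weakly coupled groups, then allocate resources packet-by-packet instead of over the whole network, and judge partitioning quality by comparing the optimization cost before (baseline) and after partitioning. The sketch below is a minimal Python illustration of that idea, not the paper's actual algorithm; the greedy single-link grouping, the threshold value, and names such as partition_into_packets and pairings_examined are assumptions introduced here, and the pairing count is only a rough proxy for the complexity reduction the citing papers report.

```python
# Minimal sketch: partition a weighted association graph into "packets"
# (internally cohesive, weakly coupled groups) and compare how many candidate
# pairings an exhaustive allocator inspects with and without the partition.
from itertools import combinations


def partition_into_packets(weights, threshold=0.5):
    """Greedy single-link grouping: put two nodes in the same packet whenever
    the association weight between them exceeds `threshold` (a hypothetical
    heuristic standing in for the paper's partitioning procedure)."""
    packets = {node: {node} for node in weights}            # each node starts alone
    for a, b in combinations(weights, 2):
        if weights[a].get(b, 0.0) > threshold and packets[a] is not packets[b]:
            merged = packets[a] | packets[b]                 # union the two packets
            for node in merged:
                packets[node] = merged
    # Deduplicate: list each packet once.
    return [sorted(p) for p in {frozenset(p) for p in packets.values()}]


def pairings_examined(groups):
    """Cost proxy: an exhaustive allocator inspects every within-group pair."""
    return sum(len(g) * (len(g) - 1) // 2 for g in groups)


if __name__ == "__main__":
    # Toy symmetric association weights between six sensor/target hypotheses.
    edges = {("s1", "s2"): 0.9, ("s1", "s3"): 0.8, ("s2", "s3"): 0.7,
             ("s2", "s4"): 0.1, ("s4", "s5"): 0.85, ("s5", "s6"): 0.9}
    nodes = ["s1", "s2", "s3", "s4", "s5", "s6"]
    weights = {n: {} for n in nodes}
    for (a, b), w in edges.items():                          # fill both directions
        weights[a][b] = w
        weights[b][a] = w

    packets = partition_into_packets(weights, threshold=0.5)
    baseline = pairings_examined([nodes])                    # one undivided group
    partitioned = pairings_examined(packets)
    print("packets:", packets)
    print(f"pairings examined: baseline={baseline}, partitioned={partitioned}")
```

On this toy graph the undivided baseline inspects 15 candidate pairings while the two packets together inspect 6, mirroring on a trivially small scale the before/after comparison used as the partitioning-quality criterion in the Gnostron quote; the two-orders-of-magnitude reduction reported in the cited work applies to far larger, realistic sensor-target graphs.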