1997
DOI: 10.1109/5.554206
Sensor fusion potential exploitation-innovative architectures and illustrative applications

Cited by 385 publications (216 citation statements)
References 12 publications
“…The confusion regarding the term "data fusion" occurs because of the two main data fusion cultures within the data fusion community (Dasarathy, 1997), based on the use of the term "data" in its most general form as "information" that could be provided by sensors, humans, reports, etc.…”
Section: Multisensor Data Fusion (mentioning, confidence: 99%)
“…Dasarathy proposed an alternative categorization of data fusion systems according to the level of abstraction of the information at the input and output of the fusion system [6]. Three different levels of abstraction are defined: (1) data; (2) features; and (3) decisions about the class of the phenomena being recognized.…”
Section: Dasarathy's Input-Output Model (mentioning, confidence: 99%)
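The input/output categorization described in this citation statement can be made concrete as a small lookup. The sketch below assumes the five conventionally cited Dasarathy classes (DAI-DAO, DAI-FEO, FEI-FEO, FEI-DEO, DEI-DEO); the example systems in the comments are illustrative, not taken from the paper.

```python
# Hedged sketch of Dasarathy's input/output categorization of fusion systems.
# Levels of abstraction, in increasing order: data < feature < decision.
DASARATHY_CLASSES = {
    ("data", "data"): "DAI-DAO",         # e.g. pixel-level image fusion
    ("data", "feature"): "DAI-FEO",      # e.g. feature extraction from raw signals
    ("feature", "feature"): "FEI-FEO",   # e.g. combining feature vectors
    ("feature", "decision"): "FEI-DEO",  # e.g. classification from fused features
    ("decision", "decision"): "DEI-DEO", # e.g. voting over classifier outputs
}

def classify_fusion_system(input_level: str, output_level: str) -> str:
    """Return the Dasarathy label for a fusion system, given the abstraction
    level of its input and of its output."""
    try:
        return DASARATHY_CLASSES[(input_level, output_level)]
    except KeyError:
        raise ValueError(
            f"no Dasarathy class for {input_level!r} -> {output_level!r}")

print(classify_fusion_system("feature", "decision"))  # FEI-DEO
```

The point of the model, as the statement notes, is that a system is classified not by where it sits in a processing chain but by the abstraction level of what it consumes and what it produces.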
“…Another important work on a flexible architecture for sensor fusion is proposed in [5]. The author proposes six categories based on three levels of JDL, but used as processing input/output modes.…”
Section: Sensor Fusion Architectures (mentioning, confidence: 99%)
“…The centroid of each extracted object is calculated and sent to the Decision Maker to obtain information such as the angle (θ) relative to the robot and the estimated distance (d) of each object. At the same time, the acquired data from all the distance sensors are separated and sent to the corresponding Processor S_i, where i ∈ [1,5], and an interpolation function is applied to obtain the distance from the digitized value of each sensor. Finally, all extracted information is sent to the Fusion Processor in order to generate a knowledge representation of each object in the scene.…”
Section: System Architecture (mentioning, confidence: 99%)
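The pipeline in this last excerpt (per-sensor interpolation of raw readings into distances, then fusion with the camera's per-object angle and distance estimate) can be sketched as follows. The calibration table, the five fixed sensor bearings, and the averaging fusion rule are all illustrative assumptions, not values from the cited paper.

```python
from bisect import bisect_left

# Hypothetical calibration table mapping raw digitized sensor readings to
# distances (cm); in the real system this would come from sensor calibration.
RAW = [50, 100, 200, 400, 800]
DIST = [150.0, 80.0, 40.0, 20.0, 10.0]

def interpolate_distance(raw: float) -> float:
    """Piecewise-linear interpolation of distance from a raw reading
    (the role of each Processor S_i in the excerpt)."""
    if raw <= RAW[0]:
        return DIST[0]
    if raw >= RAW[-1]:
        return DIST[-1]
    i = bisect_left(RAW, raw)
    frac = (raw - RAW[i - 1]) / (RAW[i] - RAW[i - 1])
    return DIST[i - 1] + frac * (DIST[i] - DIST[i - 1])

def fuse(camera_angle_deg: float, camera_dist: float, sensor_raws: list) -> dict:
    """Fusion Processor sketch: combine the camera's estimate with the
    distance sensor closest in bearing, using a simple average (assumed)."""
    # Assumption: each of the 5 sensors covers a fixed bearing on the robot.
    bearings = [-60.0, -30.0, 0.0, 30.0, 60.0]
    i = min(range(5), key=lambda k: abs(bearings[k] - camera_angle_deg))
    d_sensor = interpolate_distance(sensor_raws[i])
    return {"theta": camera_angle_deg,
            "distance": 0.5 * (camera_dist + d_sensor)}

# One object seen by the camera at 25 degrees, ~42 cm away; the sensor
# nearest that bearing reads a raw value of 200 (interpolates to 40 cm).
obj = fuse(camera_angle_deg=25.0, camera_dist=42.0,
           sensor_raws=[60, 90, 210, 200, 700])
print(obj)  # {'theta': 25.0, 'distance': 41.0}
```

This matches the shape of the described architecture: independent per-sensor processing first, then a fusion step that produces one knowledge representation per object.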