2022
DOI: 10.1109/mim.2022.9693406
Multi-Sensor Measurement and Data Fusion

Cited by 29 publications (8 citation statements)
References 14 publications
“…These distributed sensors can complement each other since they have different observing angles and different types of information, such as position, shape, and mobility measures about the various stationary and moving objects. This sensing capability can be further enhanced by leveraging the recent advances in multi-modal data fusion [9]. As a result, it is becoming more feasible to acquire high-fidelity sensing information about the surrounding environment in nearly real-time, which is a key enabler for the envisioned digital twin.…”
Section: Today's Technology Advances Lead to Real-Time Digital Twins
confidence: 99%
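The complementarity described in the statement above is easiest to see in a toy estimation setting. Below is a minimal, hypothetical sketch of multi-sensor fusion, assuming independent position estimates with known variances; the sensor names and numbers are invented, and inverse-variance weighting is simply a standard way to combine such estimates, not claimed to be the method of [9].

import numpy as np

def fuse_position_estimates(estimates, variances):
    # Inverse-variance weighted fusion of independent position estimates:
    # more certain sensors (smaller variance) get proportionally more weight.
    estimates = np.asarray(estimates, dtype=float)      # shape (n_sensors, dim)
    weights = 1.0 / np.asarray(variances, dtype=float)  # shape (n_sensors,)
    fused = (weights[:, None] * estimates).sum(axis=0) / weights.sum()
    fused_var = 1.0 / weights.sum()  # variance of the fused estimate
    return fused, fused_var

# Hypothetical 2-D position reports for one object from three sensors.
camera = [10.2, 5.1]
radar = [10.5, 4.8]
lidar = [10.3, 5.0]
position, variance = fuse_position_estimates([camera, radar, lidar],
                                             [0.50, 0.20, 0.10])
print(position, variance)

The fused variance is smaller than any single sensor's, which is the sense in which distributed sensors "complement each other" in the quoted statement.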
“…[246] Notably, contextual enhancement is essential for environmental perception in night-vision applications, particularly under the low-illumination constraints of the dark night. [247] To address this, colorization [248] and data fusion [249] of the target objects are useful translational tools for visualizing thermal depictions as colorized visible images that reflect the overall features of the targeted objects. Liu et al. [247] proposed a two-step unsupervised smart-sensing image-translation neural network for infrared-to-visible (IR2VI) translation of nighttime thermal imagery: thermal infrared images are first translated to gray-scale visible images (GVI), a step denoted IR-GVI, and then translated to color visible images, denoted CVI.…”
Section: Applications
confidence: 99%
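To make the two-step IR2VI structure concrete, here is a minimal PyTorch sketch of the composition IR -> GVI -> CVI. The TranslationStage module is a toy stand-in invented for illustration; the actual network of Liu et al. [247] is not specified in the quoted text, so only the two-stage staging is taken from it.

import torch
import torch.nn as nn

class TranslationStage(nn.Module):
    # A toy encoder-decoder standing in for one translation stage.
    def __init__(self, in_channels, out_channels):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, out_channels, kernel_size=3, padding=1),
            nn.Sigmoid(),  # keep outputs in [0, 1] like normalized pixels
        )

    def forward(self, x):
        return self.net(x)

# Stage 1: thermal infrared (1 channel) -> gray-scale visible (IR-GVI).
ir_to_gvi = TranslationStage(1, 1)
# Stage 2: gray-scale visible -> color visible, 3 channels (GVI-CVI).
gvi_to_cvi = TranslationStage(1, 3)

ir_frame = torch.rand(1, 1, 64, 64)  # placeholder nighttime thermal frame
cvi_frame = gvi_to_cvi(ir_to_gvi(ir_frame))
print(cvi_frame.shape)  # torch.Size([1, 3, 64, 64])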
“…Feature-level fusion is designed to obtain more discriminative features for downstream tasks. The solution takes symbolic representations as sources and combines them to obtain a more accurate joint representation [51].
Section: Data Fusion Models in Intelligent Monitoring Systems
confidence: 99%
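As a concrete illustration of combining per-sensor representations at the feature level, the sketch below simply concatenates feature vectors into one joint representation. The modality names and feature values are hypothetical, and concatenation is only one common choice, not necessarily the scheme of [51].

def fuse_features(feature_vectors):
    # Feature-level fusion by concatenation: per-sensor feature vectors are
    # joined into one representation before the downstream task sees them.
    joint = []
    for vec in feature_vectors:
        joint.extend(float(v) for v in vec)
    return joint

# Hypothetical features from two modalities observing the same scene.
visible_features = [0.8, 0.1, 0.3]  # e.g. from a camera branch
thermal_features = [0.4, 0.9]       # e.g. from a thermal branch
joint = fuse_features([visible_features, thermal_features])
print(joint)  # length-5 vector fed to a classifier in a real system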
“…10(a). The three types of inputs include data/information sources, supporting information, and a priori external knowledge [51]. The output of the block is the merged result.…”
Section: Fig. 9 Structural Diagram of the Information and Measurement ...
confidence: 99%
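A minimal sketch of such a fusion block, assuming scalar Gaussian quantities: the measurements play the role of data/information sources, their variances act as supporting information, and the prior mean and variance stand in for a priori external knowledge. The sequential Kalman-style update is an illustrative choice, not the block defined in [51], and all numbers are invented.

def fusion_block(measurements, measurement_vars, prior_mean, prior_var):
    # Merge the three input types named in the text: data/information
    # sources (measurements), supporting information (their variances),
    # and a priori external knowledge (the Gaussian prior).
    mean, var = prior_mean, prior_var
    for z, r in zip(measurements, measurement_vars):
        gain = var / (var + r)           # Kalman-style scalar gain
        mean = mean + gain * (z - mean)  # pull the estimate toward z
        var = (1.0 - gain) * var         # merged result is more certain
    return mean, var  # the merged result output by the block

# Invented numbers: two temperature readings fused with a weak prior.
merged_mean, merged_var = fusion_block([21.4, 20.9], [0.5, 0.3],
                                       prior_mean=20.0, prior_var=2.0)
print(merged_mean, merged_var)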