2017
DOI: 10.1117/12.2266946
Detection and localization of underground networks by fusion of electromagnetic signal and GPR images

Abstract: In this paper, we propose a new approach to the post-processing of multi-sensor detection based on knowledge representation and data fusion provided by several technologies. The aim is to improve the detection and localization of underground networks. This work is part of the G4M project, led by ENGIE LAB CRIGEN, the objective of which is the design of a versatile device for reliable detection and localization of underground networks. The objective of this work, which is at the core of the G4M project, fo…

Cited by 3 publications (1 citation statement)
References 7 publications (6 reference statements)
“…Moreover, the authors in [15] use a Bayesian mapping model to integrate knowledge extracted from sensors' raw data and available statutory records to infer underground network data including water pipes. To enhance the detection of underground networks, [16] fuse the data collected from different radars. In [17], the authors apply deep neural networks to detect the position of manhole covers from high-resolution images.…”
Section: Introduction
confidence: 99%