2015
DOI: 10.1016/j.actaastro.2014.12.010

Relative pose estimation of satellites using PMD-/CCD-sensor data fusion

Cited by 42 publications (16 citation statements)
References 20 publications
“…The detected edges are marked in orange. The PMD image can be interpreted as a 3D image; instead of a 2D line fit, a 3D plane fit has been performed to estimate the pose from the given sensor data (Tzschichholz et al., 2015). For more details on camera-based navigation experiments done on EPOS, we refer to Klionovska & Benninghoff (2016); Tzschichholz (2014); Tzschichholz et al. (2011, 2015).…”
Section: Camera Based Navigation
confidence: 99%
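The excerpt above only names the technique. The following is a minimal sketch, not taken from the cited paper, of what a least-squares 3D plane fit to PMD depth points can look like; the function name, the synthetic data, and the use of NumPy/SVD are illustrative assumptions rather than the implementation of Tzschichholz et al. (2015).

```python
import numpy as np

def fit_plane_svd(points):
    """Least-squares plane fit to Nx3 points (e.g. from a PMD depth image).

    Returns (centroid, unit_normal). Hypothetical helper, not the
    implementation from Tzschichholz et al. (2015).
    """
    points = np.asarray(points, dtype=float)
    centroid = points.mean(axis=0)
    # SVD of the centered points: the right singular vector with the smallest
    # singular value is the direction of least variance, i.e. the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return centroid, normal / np.linalg.norm(normal)

# Synthetic data standing in for PMD measurements of a flat surface patch.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    xy = rng.uniform(-0.5, 0.5, size=(200, 2))
    z = 0.1 * xy[:, 0] - 0.2 * xy[:, 1] + 1.5 + 0.005 * rng.standard_normal(200)
    pts = np.column_stack([xy, z])
    centroid, normal = fit_plane_svd(pts)
    print("centroid:", centroid, "normal:", normal)
```

The fitted plane's orientation and offset then constrain the relative pose in a way a 2D line fit cannot, which is the point the excerpt makes about using the depth channel of the PMD image.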
“…A spacecraft pose estimation algorithm is tested in [26] which processes real-time PMD time-of-flight (ToF) camera frames to produce a six-degrees-of-freedom pose estimate by 3D feature detection and feature matching. A new pose estimation method for satellites is presented in [27], fusing a PMD time-of-flight (ToF) camera and a CCD sensor to exploit the complementary advantages of the two sensors; it is tested on the European Proximity Operations Simulator (EPOS).…”
Section: Related Work
confidence: 99%
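For context only: the statement above describes recovering a six-degrees-of-freedom pose from matched 3D features. A minimal, generic sketch of that step is the Kabsch/Horn rigid alignment below; it is an assumption about the general technique, not the algorithm of [26] or [27], and the function name is hypothetical.

```python
import numpy as np

def rigid_pose_from_matches(model_pts, observed_pts):
    """Kabsch/Horn-style rigid transform: returns R (3x3), t (3,) such that
    observed ~= R @ model + t. Assumes the Nx3 correspondences are already
    matched; a generic sketch, not the method from [26] or [27].
    """
    P = np.asarray(model_pts, dtype=float)
    Q = np.asarray(observed_pts, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)               # 3x3 cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```

Feature detection and matching must already have produced the correspondences; the alignment itself is then a closed-form least-squares problem.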
“…An Improved Particle Filter (IPF) is designed to estimate the system motion parameters by processing the measurements from a sonar sensor and a camera. In [18], laser scanners and a stereo vision camera are used to estimate the pose of moving objects in terrestrial and space applications. Based on the data from the Laser Camera System (LCS), [19] presents a closed-loop integrated sensor fusion approach, which consists of a Kalman filter and an Iterative Closest Point (ICP) algorithm.…”
Section: Literature Review
confidence: 99%
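As a rough illustration of the closed-loop fusion idea attributed to [19]: a constant-velocity Kalman filter whose update consumes a translation fix (here a synthetic stand-in for an ICP alignment result) and whose prediction would, in a real pipeline, seed the next ICP run. All matrices, noise levels, and names below are assumed for the sketch and are not taken from the cited work.

```python
import numpy as np

def make_cv_filter(dt, q=1e-4, r=1e-3):
    """Constant-velocity Kalman filter matrices for the translational state
    x = [px, py, pz, vx, vy, vz]. Noise levels q, r are assumed values."""
    F = np.eye(6)
    F[:3, 3:] = dt * np.eye(3)                    # position integrates velocity
    H = np.hstack([np.eye(3), np.zeros((3, 3))])  # the ICP fix measures position only
    return F, H, q * np.eye(6), r * np.eye(3)

def kf_step(x, P, z, F, H, Q, R):
    """One predict/update cycle; z is the translation returned by the ICP stage."""
    x, P = F @ x, F @ P @ F.T + Q                 # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)                       # correct with the ICP fix
    P = (np.eye(6) - K @ H) @ P
    return x, P

# Closed-loop skeleton: in a real pipeline the predicted position would seed
# the next ICP alignment; here a noisy synthetic measurement stands in for it.
if __name__ == "__main__":
    F, H, Q, R = make_cv_filter(dt=0.1)
    x, P = np.zeros(6), np.eye(6)
    rng = np.random.default_rng(1)
    for k in range(50):
        true_pos = np.array([0.05 * k, 0.0, 10.0 - 0.02 * k])
        z = true_pos + 0.01 * rng.standard_normal(3)   # stand-in for ICP output
        x, P = kf_step(x, P, z, F, H, Q, R)
    print("estimated position:", x[:3], "estimated velocity:", x[3:])
```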