2012
DOI: 10.5194/isprsarchives-xxxix-b5-363-2012
Calibration and Epipolar Geometry of Generic Heterogenous Camera Systems

Abstract: The application of perspective camera systems in photogrammetry and computer vision is state of the art. In recent years, non-perspective and especially omnidirectional camera systems have increasingly been used in close-range photogrammetry tasks. In general, the perspective camera model, i.e. the pinhole model, cannot be applied to non-perspective camera systems. However, several camera models for different omnidirectional camera systems have been proposed in the literature. Using different types of cameras in a het…

Cited by 3 publications (4 citation statements)
References 16 publications
“…Another important use-case is non-standard optics (omnidirectional, telecentric, catadioptric, etc.). In precision DM, generic and model-free approaches have been shown to reduce measurement errors even for paraxial imaging, so the topic has attracted a lot of attention (Sturm and Ramalingam, 2004; Grossberg and Nayar, 2005; Wang et al., 2005; Kannala and Brandt, 2006; Barreto et al., 2008; Bothe et al., 2010; Luber, 2010; Ramalingam et al., 2010; Rosebrock and Wahl, 2012; Xiang et al., 2013; Pak, 2016b; Pak, 2016c; Prinzler et al., 2018; Schöps et al., 2020; Gauchan et al., 2021; Uhlig and Heizmann, 2021).…”
Section: Camera Models
confidence: 99%
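For intuition, a fully generic ("model-free") camera model in the sense of, e.g., Grossberg and Nayar (2005) can be pictured as a per-pixel lookup table of viewing rays. The following minimal Python sketch illustrates that idea under assumed data (image size, focal length, pinhole initialization are all hypothetical); it is not code from any of the cited works.

```python
import numpy as np

# A generic ("model-free") camera model as a per-pixel ray table: for every
# pixel (v, u) we store a unit viewing direction in the camera frame.
# Calibration fills this table; unprojection becomes a lookup instead of a
# closed-form (e.g. pinhole) formula.
H, W = 480, 640

# Hypothetical initialization from a pinhole model with focal length f,
# just so the table is populated; a real generic calibration would
# estimate every ray independently.
f, cx, cy = 500.0, W / 2.0, H / 2.0
vs, us = np.mgrid[0:H, 0:W]
dirs = np.stack([(us - cx) / f, (vs - cy) / f, np.ones((H, W))], axis=-1)
ray_table = dirs / np.linalg.norm(dirs, axis=-1, keepdims=True)

# Unprojecting a pixel is now just a table lookup:
print(ray_table[240, 320])  # viewing ray through the image center
```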
“…One generic projection model has been introduced by Luber and Reulke (Luber and Reulke, 2010), namely a polynomial mapping of the inclination angle θ to the camera-chip radius r. Generally, any of the above generic radius-r-to-radius-r_d distortion models can also be utilized as generic θ-to-r projection models. In (Luber and Reulke, 2010) and (Luber et al., 2012), several of these models are used as projection models and evaluated. Also, a method of calibrating such models is presented.…”
Section: Varying the Projection Model
confidence: 99%
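As a concrete illustration of such a polynomial θ-to-r projection model, here is a minimal Python sketch. The coefficient values and the degree of the polynomial are assumptions for demonstration only, not the calibrated parameters from Luber and Reulke (2010).

```python
import numpy as np

def project_radius(theta, coeffs):
    """Generic polynomial projection model: maps the inclination angle
    theta (radians, measured from the optical axis) to the radial
    distance r on the sensor, r = a1*theta + a2*theta**2 + ... + an*theta**n.
    `coeffs` holds (a1, ..., an)."""
    theta = np.asarray(theta, dtype=float)
    powers = np.arange(1, len(coeffs) + 1)
    return np.sum([a * theta**p for a, p in zip(coeffs, powers)], axis=0)

# Hypothetical coefficients: a1 is close to a focal length in pixels; the
# higher-order terms model the deviation from the pinhole projection.
coeffs = (800.0, 0.0, -35.0, 0.0, 1.2)
print(project_radius(np.deg2rad([0.0, 30.0, 60.0]), coeffs))
```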
“…In general, the inverse of a projection model has to be determined numerically, as exact solutions are either expensive or analytically not possible. Also note that the calibration of the cameras for this paper has been done in the fashion of (Luber et al., 2012); refer to that paper for the details.…”
Section: Stereo Accuracy Evaluation with Generic Projection Models
confidence: 99%
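The numerical inversion mentioned above can be carried out, for example, with a few Newton iterations on the polynomial model. This sketch reuses the hypothetical parameterization from the previous block and is not the procedure used in the cited papers.

```python
import numpy as np

def unproject_radius(r, coeffs, theta0=0.1, iters=20):
    """Numerically invert the polynomial projection r(theta) via Newton's
    method: solve r(theta) - r = 0 for theta. Assumes r(theta) is monotonic
    over the field of view, which holds for usual projection functions."""
    powers = np.arange(1, len(coeffs) + 1)
    theta = float(theta0)
    for _ in range(iters):
        f = sum(a * theta**p for a, p in zip(coeffs, powers)) - r
        df = sum(a * p * theta**(p - 1) for a, p in zip(coeffs, powers))
        theta -= f / df
    return theta

coeffs = (800.0, 0.0, -35.0, 0.0, 1.2)  # hypothetical, as above
theta = unproject_radius(400.0, coeffs)
print(theta)  # inclination angle whose projected radius is ~400 px
```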
“…Due to their generally smaller size and lower price, fisheye cameras nowadays are widely used in mobile phones (Sahin, 2016) and in applications including video surveillance (Kim et al., 2016), unmanned aerial vehicle mapping (Gurtner et al., 2009), automatic parking systems (Li and Hai, 2011) and robotic vision (Delibasis et al., 2015). The main weakness of fisheye cameras is that the fisheye image has serious distortions: aberrations are introduced and resolution losses occur when the image is corrected (Luber et al., 2012). Many correction algorithms have been developed to recover the “natural look” of a fisheye image (Abraham and Förstner, 2005; Wei et al., 2012; Campos et al., 2018; Yin et al., 2018; Peng et al., 2020).…”
Section: Introduction
confidence: 99%
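To make the resolution-loss remark concrete, here is a minimal sketch that corrects an equidistant fisheye image (r = f·θ) into a perspective view (r = f·tan θ). The equidistant model, the focal length, and the nearest-neighbour lookup are assumptions for illustration; a real pipeline would use a calibrated model and proper interpolation.

```python
import numpy as np

def fisheye_to_perspective(img, f, out_size):
    """Remap an equidistant fisheye image (r_fish = f * theta) to a
    perspective image (r_persp = f * tan(theta)) by inverse mapping with
    nearest-neighbour lookup. Because tan(theta) grows much faster than
    theta, pixels near the border get stretched over a larger area --
    the resolution loss noted in the quote above."""
    h_out, w_out = out_size
    h_in, w_in = img.shape[:2]
    cx_in, cy_in = w_in / 2.0, h_in / 2.0
    cx_out, cy_out = w_out / 2.0, h_out / 2.0

    ys, xs = np.mgrid[0:h_out, 0:w_out]
    dx, dy = xs - cx_out, ys - cy_out
    r_persp = np.hypot(dx, dy)
    theta = np.arctan(r_persp / f)      # invert r_persp = f * tan(theta)
    r_fish = f * theta                  # forward equidistant model
    scale = np.divide(r_fish, r_persp, out=np.zeros_like(r_fish),
                      where=r_persp > 0)
    u = np.clip(np.round(cx_in + dx * scale), 0, w_in - 1).astype(int)
    v = np.clip(np.round(cy_in + dy * scale), 0, h_in - 1).astype(int)
    return img[v, u]

# Usage with a synthetic grayscale image (focal length in pixels assumed):
img = np.random.randint(0, 255, (480, 640), dtype=np.uint8)
out = fisheye_to_perspective(img, f=300.0, out_size=(480, 640))
```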