2016 IEEE International Conference on Image Processing (ICIP)
DOI: 10.1109/icip.2016.7533210
Motion estimation for fisheye video sequences combining perspective projection with camera calibration information

Cited by 10 publications (8 citation statements)
References 18 publications
“…Consequently, when dealing with an FOV of more than 180°, we have to consider the peripheral fisheye regions where θ exceeds 90° separately. To that end, we introduce an ultra wide-angle compensation procedure which handles the problem of faulty mappings [29]. Before discussing this proposed extension in detail, we first define r_max and r_180°, which are obtained by solving the equisolid fisheye projection function (5) for θ = FOV/2 · π/180 and θ = π/2, respectively.…”
Section: Ultra Wide-angle Compensation
Mentioning confidence: 99%
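The radii r_max and r_180° described in the excerpt above follow directly from the equisolid-angle projection r(θ) = 2f·sin(θ/2). A minimal sketch of that computation (the focal length f and the FOV value below are illustrative assumptions, not values from the paper):

```python
import math

def equisolid_radius(theta, f):
    """Equisolid-angle fisheye projection: radial image distance
    for incident angle theta (radians) and focal length f."""
    return 2.0 * f * math.sin(theta / 2.0)

# Illustrative parameters (assumed, not from the paper)
f = 1.0          # focal length
fov_deg = 220.0  # an FOV exceeding 180 degrees

# r_max: radius at the full half-FOV angle, theta = FOV/2 * pi/180
r_max = equisolid_radius(fov_deg / 2.0 * math.pi / 180.0, f)

# r_180: radius of the 180-degree circle, theta = pi/2
r_180 = equisolid_radius(math.pi / 2.0, f)
```

With f = 1, r_180 evaluates to 2·sin(π/4) = √2, and any FOV above 180° yields r_max > r_180, which is exactly the peripheral region the compensation procedure must treat separately.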
“…Beyond the RGB or RGB-D image space, there has also been research on motion estimation for special image spaces such as fisheye video sequences [16] and omnidirectional images [17]. Eichenseer et al [16] proposed a hybrid block-based motion estimation method for real-world fisheye videos by combining perspective projection with considerations about the fisheye image structure. Simone et al [17] presented an extension of block-based motion estimation for panoramic omnidirectional image sequences that accounts for the spherical geometry of the imaging system.…”
Section: Related Work
Mentioning confidence: 99%
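The excerpt above summarizes [16] as combining perspective projection with the fisheye image structure. One way to sketch the projection change this implies, assuming the equisolid model r = 2f·sin(θ/2) for the fisheye image and an ideal pinhole model r = f·tan(θ) for the perspective view (the function name and the shared focal length are assumptions for illustration, not the paper's exact algorithm):

```python
import math

def fisheye_to_perspective(r_fish, f):
    """Map an equisolid fisheye radial distance to the radius the same
    incident ray would have under an ideal pinhole projection with the
    same focal length f. Only defined for theta < pi/2 (rays in front
    of the image plane)."""
    # Invert the equisolid projection to recover the incident angle
    theta = 2.0 * math.asin(r_fish / (2.0 * f))
    if theta >= math.pi / 2.0:
        raise ValueError("ray at or beyond 90 degrees: pinhole model undefined")
    # Re-project under the pinhole model
    return f * math.tan(theta)
```

Near the image center the two models nearly agree, while toward the periphery the perspective radius grows much faster; rays at θ ≥ 90° have no pinhole image at all, which is why [16] and the ultra wide-angle compensation of [29] handle that region separately.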
“…In this paper, to the best of our knowledge, we tackle the problem of ego-motion classification for compound eye cameras using a convolutional neural network (CNN) for the first time; the goal is to classify the motion of the compound eye camera given two consecutive emulated high-resolution compound images. Ego-motion classification focuses on the movement of the camera itself, in contrast to previous camera motion estimation algorithms that aim to estimate the transition between two scenes [12,13,14,15,16,17]. Ego-motion classification for compound eye cameras is an important contribution to the robotics community, since knowing the moving direction of a robot is critical for problems such as localization and navigation.…”
Section: Introduction
Mentioning confidence: 99%
“…Besides, Ahmmed et al [8] introduce the elastic model into HEVC to better describe object motion in fisheye cameras. Moreover, Eichenseer et al [9] propose a re-mapping method to handle extreme cases in which the field of view is larger than 180 degrees. However, these motion models, which focus on fisheye cameras, cannot efficiently handle the cube map used for 360-degree videos.…”
Section: Introduction
Mentioning confidence: 99%