2019
DOI: 10.48550/arxiv.1912.05131
Preprint

DeepMeshFlow: Content Adaptive Mesh Deformation for Robust Image Registration

Nianjin Ye, Chuan Wang, Shuaicheng Liu, et al.

Abstract: Image alignment by mesh warps, such as meshflow, is a fundamental task that has been widely applied in various vision applications (e.g., multi-frame HDR/denoising, video stabilization). Traditional mesh warp methods detect and match image features, so the quality of alignment depends heavily on the quality of those features. However, image features are not robust in low-texture and low-light scenes. Deep homography methods, on the other hand, are free from this problem by learning deep fea…
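To make the abstract's point concrete, below is a minimal sketch of the traditional feature-based mesh alignment pipeline it contrasts against. This is not the authors' implementation; the function names, grid size, and search radius are illustrative assumptions, using OpenCV and NumPy. Vertices far from any feature match (e.g., in low-texture regions) simply receive no reliable motion, which is the failure mode the paper targets.

```python
# Minimal sketch of feature-based mesh alignment (illustrative, not DeepMeshFlow).
import cv2
import numpy as np

def sparse_matches(img_a, img_b, max_feats=1000):
    """Detect and match ORB features; return source points and their motion vectors."""
    orb = cv2.ORB_create(max_feats)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)
    if des_a is None or des_b is None:
        return np.zeros((0, 2)), np.zeros((0, 2))
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_a, des_b)
    pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches])
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches])
    return pts_a, pts_b - pts_a

def mesh_motion(img_a, img_b, grid=(16, 16), radius=60.0):
    """Assign each mesh vertex the median motion of feature matches within
    `radius` pixels. Vertices with no nearby matches keep zero motion, which
    is why alignment quality hinges on feature quality."""
    h, w = img_a.shape[:2]
    pts, motions = sparse_matches(img_a, img_b)
    ys = np.linspace(0, h - 1, grid[0] + 1)
    xs = np.linspace(0, w - 1, grid[1] + 1)
    field = np.zeros((grid[0] + 1, grid[1] + 1, 2))
    if len(pts) == 0:
        return field
    for i, y in enumerate(ys):
        for j, x in enumerate(xs):
            d = np.linalg.norm(pts - np.array([x, y]), axis=1)
            near = motions[d < radius]
            if len(near) > 0:
                field[i, j] = np.median(near, axis=0)  # robust local estimate
    return field
```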

Cited by 2 publications (2 citation statements)
References 25 publications
“…DeTone et al 25 and Nguyen et al 26 presented supervised and unsupervised learning methods to estimate global homography models, respectively. Ye et al 27 introduced a deep meshflow model, but suffered from high training cost.…”
Section: Related Work
confidence: 99%
“…Issues of parallax due to content at different depths can be better addressed by mesh-based warping [32,28,30] or pixel-wise dense optical flow [17,60,50,51,57,20]. […] proposed deep meshflow [66] to make mesh estimation more robust on different scenes. Due to the sparsity of the mesh, image contents can be better retained while warping.…”
Section: Related Work
confidence: 99%