2020
DOI: 10.1007/978-3-030-58586-0_18
Cross-Identity Motion Transfer for Arbitrary Objects Through Pose-Attentive Video Reassembling

Cited by 7 publications (8 citation statements) | References 23 publications
“…The model-free approaches [30,31,19,32,34] do not rely on pre-trained third-party models and extend the model-based methods to arbitrary objects. Aliaksandr et al [30] proposed a model-free motion transfer model, Monkey-Net, that can apply motion transfer to arbitrary objects with an unsupervised keypoint detector trained by a reconstruction loss [18].…”
Section: Related Work
confidence: 99%
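The unsupervised keypoint detectors cited in this excerpt typically recover keypoint coordinates as a differentiable soft-argmax over predicted heatmaps. A minimal NumPy sketch of that one operation follows; the `soft_argmax` helper and the 2D single-channel heatmap are illustrative assumptions, not the cited papers' actual models:

```python
import numpy as np

def soft_argmax(heatmap):
    """Expected (row, col) coordinate under the softmax of a 2D score map.

    heatmap: (H, W) array of unnormalized scores, e.g. one channel of a
    keypoint detector's output. Differentiable, unlike a hard argmax.
    """
    H, W = heatmap.shape
    # Softmax over all spatial positions (shifted for numerical stability).
    p = np.exp(heatmap - heatmap.max())
    p /= p.sum()
    # Expected coordinates under the resulting spatial distribution.
    ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    return float((p * ys).sum()), float((p * xs).sum())
```

A sharp peak in the heatmap yields coordinates at that peak, while a diffuse map yields a weighted average; training such a detector end-to-end with a frame-reconstruction loss is what makes the keypoints "unsupervised".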
“…A generator module is utilized to produce the final result from the warped source image features. Subin et al [19] proposed a pose attention mechanism with an unsupervised keypoint detector to model motion. Recently, Aliaksandr et al [32] improved FOMM into MRAA with an advanced motion model and a background motion model.…”
Section: Related Work
confidence: 99%
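The pose attention mechanism mentioned in this excerpt can be sketched, under assumptions, as scaled dot-product attention that reassembles appearance features from source frames weighted by pose similarity. The `pose_attention` function, its shapes, and the pose/appearance split below are hypothetical illustrations, not the architecture of [19]:

```python
import numpy as np

def pose_attention(query, keys, values):
    """Attend over source frames by pose similarity.

    query:  (d,)   pose embedding of the driving frame
    keys:   (N, d) pose embeddings of N source frames
    values: (N, c) appearance features of the same source frames
    Returns a (c,) appearance feature reassembled from the sources.
    """
    # Scaled dot-product attention scores (one score per source frame).
    scores = keys @ query / np.sqrt(query.shape[0])
    # Softmax over source frames (shifted for numerical stability).
    w = np.exp(scores - scores.max())
    w /= w.sum()
    # Weighted sum of appearance features.
    return w @ values
```

The design intuition matches the excerpt: motion comes from the pose query, while identity and appearance are reassembled from the source video's own frames.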
“…To overcome the huge per-subject data requirements and improve the generalization of personalized methods, recent works mainly focus on general-purpose methods [5,12,15,25,39,45,50]. This type of method aims to learn a model that can be adapted to generate unseen persons.…”
Section: Introduction
confidence: 99%