Differential Motion Evolution for Fine-Grained Motion Deformation in Unsupervised Image Animation

9 Oct 2021 · Peirong Liu, Rui Wang, Xuefei Cao, Yipin Zhou, Ashish Shah, Ser-Nam Lim

Image animation is the task of transferring the motion of a driving video to a given object in a source image. While great progress has recently been made in unsupervised motion transfer, which requires no labeled data or domain priors, many current unsupervised approaches still struggle to capture motion deformations when large motion/view discrepancies occur between the source and driving domains. Under such conditions, there is simply not enough information to estimate the motion field properly. We introduce DiME (Differential Motion Evolution), an end-to-end unsupervised motion transfer framework that integrates differential refinement into motion estimation. Our key findings are twofold: (1) modeling motion transfer as an ordinary differential equation (ODE) helps regularize the motion field, and (2) utilizing the source image itself allows us to inpaint occluded/missing regions arising from large motion changes. We also propose a natural extension of the ODE idea: whenever multiple views of the source object are available, DiME can easily leverage them by modeling one ODE per view. Extensive experiments across 9 benchmarks show that DiME outperforms the state of the art by a significant margin and generalizes much better to unseen objects.
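The core idea of evolving a motion field under an ODE can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual implementation: the field shapes, the `velocity_fn` callable (standing in for a learned refinement network), the step count, and the forward-Euler scheme are all assumptions made for the example.

```python
import numpy as np

def evolve_motion_field(phi0, velocity_fn, n_steps=10, t1=1.0):
    """Integrate d(phi)/dt = velocity_fn(phi, t) from t=0 to t=t1
    with forward Euler.

    phi0        -- initial coarse motion field, shape (H, W, 2)
                   (dx, dy displacement per pixel; hypothetical layout)
    velocity_fn -- callable (phi, t) -> update direction, same shape as phi;
                   in DiME this role would be played by a learned network
    """
    dt = t1 / n_steps
    phi = phi0.copy()
    for step in range(n_steps):
        t = step * dt
        # Euler step: small differential refinement of the motion field
        phi = phi + dt * velocity_fn(phi, t)
    return phi
```

With a toy velocity field such as `lambda phi, t: -phi`, the integration smoothly contracts the displacement field toward zero, illustrating how small differential updates regularize the estimate instead of predicting the full deformation in one shot. The multi-view extension in the abstract would amount to running one such integration per available view.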
