Motion Synthesis

28 papers with code • 1 benchmark • 1 dataset

Motion synthesis is the task of generating plausible, natural-looking motion sequences, most often human or character motion, with applications in animation, games, human-computer interaction, virtual and augmented reality, and social robotics.

Most implemented papers

On human motion prediction using recurrent neural networks

una-dinosauria/human-motion-prediction CVPR 2017

Human motion modelling is a classical problem at the intersection of graphics and computer vision, with applications spanning human-computer interaction, motion synthesis, and motion prediction for virtual and augmented reality.

HP-GAN: Probabilistic 3D human motion prediction via GAN

ebarsoum/hpgan 27 Nov 2017

Our model, which we call HP-GAN, learns a probability density function of future human poses conditioned on previous poses.
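
The key idea, a generator conditioned on the observed pose history with diversity coming from a sampled noise vector, can be sketched roughly as follows. This is a minimal illustration in the spirit of that description, not the authors' implementation; all layer sizes, sequence lengths, and the critic design are assumptions.

```python
# Minimal conditional-GAN sketch for probabilistic pose prediction (assumed sizes).
import torch
import torch.nn as nn

POSE_DIM, PREV_LEN, FUT_LEN, Z_DIM = 63, 10, 20, 128  # illustrative dimensions

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(PREV_LEN * POSE_DIM + Z_DIM, 512), nn.ReLU(),
            nn.Linear(512, 512), nn.ReLU(),
            nn.Linear(512, FUT_LEN * POSE_DIM),
        )

    def forward(self, prev, z):
        # prev: (B, PREV_LEN, POSE_DIM), z: (B, Z_DIM) -> (B, FUT_LEN, POSE_DIM)
        x = torch.cat([prev.flatten(1), z], dim=1)
        return self.net(x).view(-1, FUT_LEN, POSE_DIM)

class Critic(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear((PREV_LEN + FUT_LEN) * POSE_DIM, 512), nn.ReLU(),
            nn.Linear(512, 1),
        )

    def forward(self, prev, fut):
        # Scores how plausible a (past, future) pair looks.
        return self.net(torch.cat([prev.flatten(1), fut.flatten(1)], dim=1))

# Different futures for the same past are obtained by drawing different z:
G = Generator()
prev = torch.randn(4, PREV_LEN, POSE_DIM)
futures = [G(prev, torch.randn(4, Z_DIM)) for _ in range(3)]  # 3 plausible futures
```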

MoGlow: Probabilistic and controllable motion synthesis using normalising flows

chaiyujin/glow-pytorch 16 May 2019

Data-driven modelling and synthesis of motion is an active research area with applications that include animation, games, and social robotics.
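
The "normalising flows" in the title refer to invertible networks trained by exact maximum likelihood. Purely as an illustration of that general machinery, here is a single conditional affine-coupling step; this is not MoGlow's actual architecture, and all dimensions are assumed.

```python
# One conditional affine-coupling step: half of the pose vector is rescaled and
# shifted using parameters predicted from the other half plus conditioning
# information (e.g. past frames or a control signal). The step is invertible,
# so the model admits exact likelihoods and exact sampling.
import torch
import torch.nn as nn

POSE_DIM, COND_DIM = 64, 32   # assumed sizes; POSE_DIM must be even here

class ConditionalCoupling(nn.Module):
    def __init__(self):
        super().__init__()
        half = POSE_DIM // 2
        self.net = nn.Sequential(nn.Linear(half + COND_DIM, 256), nn.ReLU(),
                                 nn.Linear(256, 2 * half))

    def forward(self, x, cond):
        x1, x2 = x.chunk(2, dim=-1)
        log_s, t = self.net(torch.cat([x1, cond], dim=-1)).chunk(2, dim=-1)
        y2 = x2 * torch.exp(log_s) + t       # invertible given x1 and cond
        log_det = log_s.sum(dim=-1)          # contributes to the exact log-likelihood
        return torch.cat([x1, y2], dim=-1), log_det

    def inverse(self, y, cond):
        y1, y2 = y.chunk(2, dim=-1)
        log_s, t = self.net(torch.cat([y1, cond], dim=-1)).chunk(2, dim=-1)
        x2 = (y2 - t) * torch.exp(-log_s)
        return torch.cat([y1, x2], dim=-1)

layer = ConditionalCoupling()
x, cond = torch.randn(4, POSE_DIM), torch.randn(4, COND_DIM)
z, log_det = layer(x, cond)
x_rec = layer.inverse(z, cond)               # recovers x up to float error
```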

Multi-View Motion Synthesis via Applying Rotated Dual-Pixel Blur Kernels

Abdullah-Abuolaim/defocus-deblurring-dual-pixel 15 Nov 2021

In this work, we follow the trend of rendering the NIMAT effect by introducing a modification to the blur synthesis procedure in portrait mode.

Auto-Conditioned Recurrent Networks for Extended Complex Human Motion Synthesis

papagina/auto_conditioned_rnn_motion ICLR 2018

We present a real-time method for synthesizing highly complex human motions using a novel training regime we call the auto-conditioned Recurrent Neural Network (acRNN).
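
Auto-conditioning is commonly described as periodically feeding the network its own predictions instead of ground-truth frames during training, so that the errors it accumulates at synthesis time look familiar. A rough sketch under that reading follows; the dimensions, cell choice, and schedule are assumptions, not the paper's code.

```python
# Sketch of an auto-conditioned RNN rollout: alternate blocks of ground-truth-fed
# steps and self-fed steps while unrolling the recurrent cell over a motion clip.
import torch
import torch.nn as nn

POSE_DIM, HIDDEN = 63, 512
GT_LEN, SELF_LEN = 5, 5   # assumed schedule: 5 ground-truth steps, then 5 self-fed steps

rnn = nn.LSTMCell(POSE_DIM, HIDDEN)
readout = nn.Linear(HIDDEN, POSE_DIM)

def rollout(gt_seq):
    """gt_seq: (T, B, POSE_DIM) ground-truth motion. Returns predictions (T, B, POSE_DIM)."""
    T, B, _ = gt_seq.shape
    h, c = torch.zeros(B, HIDDEN), torch.zeros(B, HIDDEN)
    prev, preds = gt_seq[0], []
    for t in range(T):
        h, c = rnn(prev, (h, c))
        pred = readout(h)                 # predicted next frame
        preds.append(pred)
        # Auto-conditioning: after GT_LEN ground-truth-fed steps, feed the model's
        # own output back in for SELF_LEN steps, then repeat.
        in_self_block = (t % (GT_LEN + SELF_LEN)) >= GT_LEN
        prev = pred if in_self_block else gt_seq[min(t + 1, T - 1)]
    return torch.stack(preds)

preds = rollout(torch.randn(40, 2, POSE_DIM))  # a training loss would compare preds to the shifted ground truth
```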

A Neural Temporal Model for Human Motion Prediction

cr7anand/neural_temporal_models CVPR 2019

We propose novel neural temporal models for predicting and synthesizing human motion, achieving state-of-the-art performance in modeling long-term motion trajectories while remaining competitive with prior work in short-term prediction and requiring significantly less computation.

CARL: Controllable Agent with Reinforcement Learning for Quadruped Locomotion

inventec-ai-center/carl-siggraph2020 7 May 2020

Motion synthesis in a dynamic environment has been a long-standing problem for character animation.

Unpaired Motion Style Transfer from Video to Animation

DeepMotionEditing/deep-motion-editing 12 May 2020

In this paper, we present a novel data-driven framework for motion style transfer, which learns from an unpaired collection of motions with style labels, and enables transferring motion styles not observed during training.
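
One common way to realise unpaired style transfer is to factor a motion into a content code and a style code and recombine them at decode time. The sketch below follows that general recipe only; the layer choices and pooling are assumptions, not the authors' network.

```python
# Content/style factorisation sketch: a recurrent content encoder, a pooled style
# encoder, and a decoder that consumes the content features concatenated with the
# style code, so a motion can be re-synthesised in another clip's style.
import torch
import torch.nn as nn

POSE_DIM, CONTENT_DIM, STYLE_DIM = 63, 128, 16

content_enc = nn.GRU(POSE_DIM, CONTENT_DIM, batch_first=True)
style_enc = nn.Sequential(nn.Linear(POSE_DIM, 64), nn.ReLU(), nn.Linear(64, STYLE_DIM))
decoder = nn.GRU(CONTENT_DIM + STYLE_DIM, POSE_DIM, batch_first=True)

def stylize(content_motion, style_motion):
    """Re-synthesize content_motion in the style of style_motion. Both: (B, T, POSE_DIM)."""
    content, _ = content_enc(content_motion)              # (B, T, CONTENT_DIM)
    style = style_enc(style_motion.mean(dim=1))           # (B, STYLE_DIM), pooled over time
    style = style.unsqueeze(1).expand(-1, content.size(1), -1)
    out, _ = decoder(torch.cat([content, style], dim=-1)) # (B, T, POSE_DIM)
    return out

motion_a = torch.randn(2, 60, POSE_DIM)   # content clip
motion_b = torch.randn(2, 60, POSE_DIM)   # style clip (e.g. carrying a different style label)
restyled = stylize(motion_a, motion_b)
```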

Skeleton-Aware Networks for Deep Motion Retargeting

DeepMotionEditing/deep-motion-editing 12 May 2020

Our operators form the building blocks of a new deep motion processing framework that embeds the motion into a common latent space, shared by a collection of homeomorphic skeletons.
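
The shared-latent-space idea can be illustrated abstractly: each skeleton gets its own encoder and decoder around one common code, so retargeting amounts to encoding with the source skeleton's encoder and decoding with the target's decoder. The toy sketch below uses plain per-skeleton MLPs as stand-ins and does not reproduce the paper's skeleton-aware convolution and pooling operators.

```python
# Toy illustration of cross-skeleton retargeting through a shared latent space.
import torch
import torch.nn as nn

LATENT = 128

def make_autoencoder(num_joints, joint_dim=4):   # e.g. a per-joint rotation representation
    enc = nn.Sequential(nn.Linear(num_joints * joint_dim, 256), nn.ReLU(),
                        nn.Linear(256, LATENT))
    dec = nn.Sequential(nn.Linear(LATENT, 256), nn.ReLU(),
                        nn.Linear(256, num_joints * joint_dim))
    return enc, dec

enc_a, dec_a = make_autoencoder(num_joints=22)   # source skeleton
enc_b, dec_b = make_autoencoder(num_joints=25)   # target (homeomorphic) skeleton

def retarget(pose_a):
    """pose_a: (B, 22 * 4) pose on skeleton A -> pose on skeleton B."""
    z = enc_a(pose_a)        # code in the shared latent space
    return dec_b(z)          # (B, 25 * 4)

print(retarget(torch.randn(8, 22 * 4)).shape)    # torch.Size([8, 100])
```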

Style-Controllable Speech-Driven Gesture Synthesis Using Normalising Flows

simonalexanderson/StyleGestures Computer Graphics Forum 2020

In interactive scenarios, systems for generating natural animations on the fly are key to achieving believable and relatable characters.