Search Results for author: Guy Tevet

Found 7 papers, 6 papers with code

MAS: Multi-view Ancestral Sampling for 3D motion generation using 2D diffusion

no code implementations · 23 Oct 2023 · Roy Kapon, Guy Tevet, Daniel Cohen-Or, Amit H. Bermano

We introduce Multi-view Ancestral Sampling (MAS), a method for 3D motion generation, using 2D diffusion models that were trained on motions obtained from in-the-wild videos.

Denoising
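
The sampling idea lends itself to a short sketch: several noisy 2D motions, one per camera view, are denoised in parallel, and a triangulated 3D motion keeps the views consistent at each ancestral step. The `model2d`, `triangulate`, and `project` callables and the `alphas_cumprod` schedule below are hypothetical placeholders, not the authors' implementation.

```python
import torch

def mas_sample(model2d, cameras, triangulate, project, alphas_cumprod,
               seq_len=60, joints=22):
    """Multi-view ancestral sampling, sketched: denoise one 2D motion per
    view in parallel and enforce consistency through a shared 3D motion."""
    num_steps = len(alphas_cumprod)                     # diffusion steps (1-D tensor schedule)
    x = torch.randn(len(cameras), seq_len, joints, 2)   # one noisy 2D motion per view
    for t in reversed(range(num_steps)):
        x0_views = model2d(x, t)                        # per-view clean-motion estimates
        motion3d = triangulate(x0_views, cameras)       # lift the views to a single 3D motion
        x0 = project(motion3d, cameras)                 # re-project so all views agree
        if t == 0:
            break
        a = alphas_cumprod[t - 1]
        # Re-noise the consistent estimate to level t-1 and continue sampling.
        x = a.sqrt() * x0 + (1 - a).sqrt() * torch.randn_like(x)
    return motion3d
```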

Human Motion Diffusion as a Generative Prior

2 code implementations · 2 Mar 2023 · Yonatan Shafir, Guy Tevet, Roy Kapon, Amit H. Bermano

We evaluate the composition methods using an off-the-shelf motion diffusion model, and further compare the results to dedicated models trained for these specific tasks.

Denoising · Motion Synthesis

Single Motion Diffusion

1 code implementation · 12 Feb 2023 · Sigal Raab, Inbal Leibovitch, Guy Tevet, Moab Arar, Amit H. Bermano, Daniel Cohen-Or

We harness the power of diffusion models and present a denoising network explicitly designed for the task of learning from a single input motion.

Denoising · Style Transfer
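
What learning from a single input motion could look like as a diffusion objective is sketched below; `denoiser`, the motion tensor shape, and the training hyper-parameters are placeholders rather than the paper's architecture.

```python
import torch
import torch.nn.functional as F

def train_on_single_motion(denoiser, motion, alphas_cumprod, steps=10_000, lr=1e-4):
    """Sketch: fit a denoising network to one motion by noising it to random
    diffusion levels and regressing the clean signal back."""
    opt = torch.optim.Adam(denoiser.parameters(), lr=lr)
    num_levels = len(alphas_cumprod)                         # 1-D tensor noise schedule
    for _ in range(steps):
        t = torch.randint(0, num_levels, (1,))
        a = alphas_cumprod[t]
        noise = torch.randn_like(motion)
        noisy = a.sqrt() * motion + (1 - a).sqrt() * noise   # forward diffusion
        pred = denoiser(noisy, t)                            # predict the clean motion
        loss = F.mse_loss(pred, motion)
        opt.zero_grad()
        loss.backward()
        opt.step()
```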

Human Motion Diffusion Model

1 code implementation · 29 Sep 2022 · Guy Tevet, Sigal Raab, Brian Gordon, Yonatan Shafir, Daniel Cohen-Or, Amit H. Bermano

In this paper, we introduce Motion Diffusion Model (MDM), a carefully adapted classifier-free diffusion-based generative model for the human motion domain.

Motion Synthesis
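
Since the abstract highlights a classifier-free formulation, a generic classifier-free guidance step is sketched below; the `model` signature (predicting the clean motion from a noisy one, a timestep, and an optional text embedding) and the guidance scale are assumptions, not MDM's actual code.

```python
def cfg_denoise_step(model, x_t, t, text_emb, guidance_scale=2.5):
    """Classifier-free guidance, sketched: blend conditional and
    unconditional clean-motion predictions at one diffusion step."""
    x0_cond = model(x_t, t, text_emb)   # prediction with the text condition
    x0_uncond = model(x_t, t, None)     # prediction with the condition dropped
    # Push the estimate toward the conditional prediction.
    return x0_uncond + guidance_scale * (x0_cond - x0_uncond)
```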

MotionCLIP: Exposing Human Motion Generation to CLIP Space

1 code implementation · 15 Mar 2022 · Guy Tevet, Brian Gordon, Amir Hertz, Amit H. Bermano, Daniel Cohen-Or

MotionCLIP gains its unique power by aligning its latent space with that of the Contrastive Language-Image Pre-training (CLIP) model.

Disentanglement · Motion Interpolation
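
One way to read "aligning its latent space with CLIP" is a cosine-similarity objective between a motion auto-encoder's code and the CLIP embeddings of its caption and rendered frames; the sketch below is illustrative, and the paper's exact losses may differ.

```python
import torch.nn.functional as F

def clip_alignment_loss(motion_latent, clip_text_emb, clip_image_emb):
    """Sketch of a CLIP-space alignment loss for a motion latent code."""
    z = F.normalize(motion_latent, dim=-1)
    t = F.normalize(clip_text_emb, dim=-1)
    v = F.normalize(clip_image_emb, dim=-1)
    text_loss = 1 - (z * t).sum(dim=-1).mean()    # pull the code toward the caption's CLIP embedding
    image_loss = 1 - (z * v).sum(dim=-1).mean()   # and toward the rendered frames' CLIP embedding
    return text_loss + image_loss
```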

Evaluating the Evaluation of Diversity in Natural Language Generation

1 code implementation EACL 2021 Guy Tevet, Jonathan Berant

Despite growing interest in natural language generation (NLG) models that produce diverse outputs, there is currently no principled method for evaluating the diversity of an NLG system.

Text Generation
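
For context, one widely used diversity score of the kind such evaluations study is distinct-n, the fraction of unique n-grams over a set of generations; the sketch below is illustrative only and is not the method proposed in the paper.

```python
from collections import Counter

def distinct_n(texts, n=2):
    """Distinct-n: unique n-grams divided by total n-grams across a set of
    generated texts (higher means more lexical diversity)."""
    ngrams = Counter()
    for text in texts:
        tokens = text.split()
        ngrams.update(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))
    total = sum(ngrams.values())
    return len(ngrams) / total if total else 0.0
```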

Evaluating Text GANs as Language Models

1 code implementation NAACL 2019 Guy Tevet, Gavriel Habib, Vered Shwartz, Jonathan Berant

Generative Adversarial Networks (GANs) are a promising approach for text generation that, unlike traditional language models (LMs), does not suffer from the problem of "exposure bias".

Text Generation
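
Evaluating a generator "as a language model" typically means scoring held-out text with the probabilities it assigns; a hedged sketch follows, assuming a hypothetical `model` that returns next-token logits for a prefix, which is not the paper's exact procedure.

```python
import math
import torch

def perplexity(model, token_ids):
    """Sketch: perplexity of a generator treated as a language model,
    accumulated over next-token log-probabilities of a held-out sequence."""
    nll = 0.0
    for i in range(1, len(token_ids)):
        logits = model(torch.tensor(token_ids[:i]))        # logits per prefix position
        log_probs = torch.log_softmax(logits[-1], dim=-1)  # distribution over the next token
        nll -= log_probs[token_ids[i]].item()
    return math.exp(nll / (len(token_ids) - 1))
```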
