Search Results for author: Yael Vinker

Found 7 papers, 4 papers with code

Attend-and-Excite: Attention-Based Semantic Guidance for Text-to-Image Diffusion Models

1 code implementation • 31 Jan 2023 • Hila Chefer, Yuval Alaluf, Yael Vinker, Lior Wolf, Daniel Cohen-Or

Recent text-to-image generative models have demonstrated an unparalleled ability to generate diverse and creative imagery guided by a target text prompt.

Generative Semantic Nursing

CLIPascene: Scene Sketching with Different Types and Levels of Abstraction

no code implementations • 30 Nov 2022 • Yael Vinker, Yuval Alaluf, Daniel Cohen-Or, Ariel Shamir

In this paper, we present a method for converting a given scene image into a sketch using different types and multiple levels of abstraction.

Disentanglement

CLIPasso: Semantically-Aware Object Sketching

1 code implementation • 11 Feb 2022 • Yael Vinker, Ehsan Pajouheshgar, Jessica Y. Bo, Roman Christian Bachmann, Amit Haim Bermano, Daniel Cohen-Or, Amir Zamir, Ariel Shamir

Abstraction is at the heart of sketching due to the simple and minimal nature of line drawings.

Unpaired Learning for High Dynamic Range Image Tone Mapping

no code implementations • ICCV 2021 • Yael Vinker, Inbar Huberman-Spiegelglas, Raanan Fattal

In this paper, we describe a new tone-mapping approach guided by the distinct goal of producing low dynamic range (LDR) renditions that best reproduce the visual characteristics of native LDR images.

Image Manipulation • Tone Mapping

Image Shape Manipulation from a Single Augmented Training Sample

1 code implementation • 2 Jul 2020 • Yael Vinker, Eliahu Horwitz, Nir Zabari, Yedid Hoshen

In this paper, we present DeepSIM, a generative model for conditional image manipulation based on a single image.

Image Manipulation • Image-to-Image Translation • +1

Training End-to-end Single Image Generators without GANs

no code implementations • 7 Apr 2020 • Yael Vinker, Nir Zabari, Yedid Hoshen

We present AugurOne, a novel approach for training single image generative models.

Image Generation
