no code implementations • ECCV 2020 • Oren Katzir, Dani Lischinski, Daniel Cohen-Or
We mitigate this by descending into the deeper layers of a pre-trained network, where the features carry more semantic information, and applying the translation between these deep features.
no code implementations • 26 Oct 2023 • Oren Katzir, Or Patashnik, Daniel Cohen-Or, Dani Lischinski
Score Distillation Sampling (SDS) has emerged as the de facto approach for text-to-content generation in non-image domains.
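To make the SDS idea concrete, here is a minimal toy sketch of one SDS gradient step. Everything here is an assumption for illustration: the `denoise` function is a stand-in for a pretrained diffusion model's noise predictor, `render` is the identity, and the timestep weighting `w(t)` is fixed to 1 — this is not the paper's implementation.

```python
import numpy as np

def sds_grad(theta, render, denoise, y, t, sigma, rng):
    """One Score Distillation Sampling gradient step (toy sketch).

    theta   : parameters of the differentiable content representation
    render  : maps theta -> image x (here: identity, so dx/dtheta = 1)
    denoise : toy stand-in for a pretrained diffusion model's
              noise predictor eps_hat(x_t, y, t)
    """
    x = render(theta)
    eps = rng.standard_normal(x.shape)   # sampled noise
    x_t = x + sigma * eps                # noised rendering
    eps_hat = denoise(x_t, y, t)         # predicted noise given prompt y
    # SDS: grad = w(t) * (eps_hat - eps) * dx/dtheta, skipping the
    # diffusion model's Jacobian; w(t) fixed to 1 in this sketch.
    return eps_hat - eps

# Toy usage: a "denoiser" that pulls samples toward a target y
rng = np.random.default_rng(0)
theta = np.zeros(4)
y = np.ones(4)
denoise = lambda x_t, y, t: x_t - y      # toy surrogate, not a real model
g = sds_grad(theta, lambda p: p, denoise, y, t=0.5, sigma=0.1, rng=rng)
theta -= 0.1 * g                         # gradient descent on theta
```

The key design point SDS inherits from DreamFusion-style pipelines is that the diffusion model is never backpropagated through; only the residual between predicted and sampled noise flows into the content parameters.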
no code implementations • 3 Apr 2022 • Oren Katzir, Dani Lischinski, Daniel Cohen-Or
We introduce an unsupervised technique for encoding point clouds into a canonical shape representation, by disentangling shape and pose.
no code implementations • 11 Feb 2022 • Oren Katzir, Vicky Perepelook, Dani Lischinski, Daniel Cohen-Or
Truncation is widely used in generative models for improving the quality of the generated samples, at the expense of reducing their diversity.
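The quality-diversity trade-off mentioned here comes from the standard truncation trick (as used in StyleGAN-family generators), which pulls sampled latent codes toward their mean. A minimal sketch of that standard trick, not this paper's method:

```python
import numpy as np

def truncate(w, w_avg, psi=0.7):
    """StyleGAN-style truncation: pull latent codes toward the mean.

    psi < 1 trades diversity for sample quality; psi = 1 is no truncation.
    """
    return w_avg + psi * (w - w_avg)

rng = np.random.default_rng(0)
w_avg = rng.standard_normal(512)         # running mean of latent codes
w = rng.standard_normal((8, 512))        # a batch of sampled latents
w_trunc = truncate(w, w_avg, psi=0.5)

# Truncated codes lie strictly closer to the mean, hence less diverse
assert np.linalg.norm(w_trunc - w_avg) < np.linalg.norm(w - w_avg)
```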
no code implementations • 4 Jun 2019 • Oren Katzir, Dani Lischinski, Daniel Cohen-Or
Our translation is performed in a cascaded, deep-to-shallow fashion along the deep feature hierarchy: we first translate between the deepest layers, which encode the higher-level semantic content of the image, and then proceed to translate the shallower layers, conditioned on the deeper ones.
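The cascade described above can be sketched as follows. This is a toy illustration under loud assumptions: random linear maps stand in for the learned per-layer translators, and the feature dimensions are arbitrary — it only shows the deep-to-shallow conditioning structure, not the paper's networks.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_translator(dim, cond_dim):
    """Toy per-layer translator: maps a feature vector from domain A to
    domain B, conditioned on the already-translated deeper features."""
    W = rng.standard_normal((dim, dim)) * 0.1
    C = rng.standard_normal((dim, cond_dim)) * 0.1
    return lambda f, cond: W @ f + C @ cond

dims = [16, 32, 64]                      # deepest -> shallowest layer sizes
translators = [make_translator(d, c)
               for d, c in zip(dims, [1] + dims[:-1])]

feats = [rng.standard_normal(d) for d in dims]  # source-domain features

# Cascade: translate the deepest (most semantic) layer first, then
# condition each shallower translation on the translated deeper layer.
cond = np.zeros(1)
translated = []
for f, t in zip(feats, translators):
    cond = t(f, cond)
    translated.append(cond)
```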
1 code implementation • ICCV 2019 • Nadav Schor, Oren Katzir, Hao Zhang, Daniel Cohen-Or
Data-driven generative modeling has made remarkable progress by leveraging the power of deep neural networks.
no code implementations • 21 May 2018 • Jinming Cao, Oren Katzir, Peng Jiang, Dani Lischinski, Danny Cohen-Or, Changhe Tu, Yangyan Li
The key idea is that by learning to separately extract the common and the domain-specific features, one can synthesize additional target-domain data with supervision, thereby boosting domain adaptation performance.
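The recombination idea can be sketched in a toy form. Here the "disentanglement" is just splitting a feature vector in half — a loud simplification standing in for a learned encoder — but it shows how source common features (which carry labels) combine with target domain-specific features to yield supervised target-like samples:

```python
import numpy as np

rng = np.random.default_rng(0)

def split(x):
    """Toy disentanglement: first half = label-bearing common part,
    second half = domain-specific part (a learned encoder in practice)."""
    return x[:4], x[4:]

def combine(common, specific):
    return np.concatenate([common, specific])

src = rng.standard_normal((10, 8))       # labeled source features
src_labels = rng.integers(0, 3, size=10)
tgt = rng.standard_normal((10, 8))       # unlabeled target features

# Synthesize labeled target-like data: source common part (keeps the
# label) + target domain-specific part (keeps the target appearance).
synth = np.stack([combine(split(s)[0], split(t)[1])
                  for s, t in zip(src, tgt)])
synth_labels = src_labels                # labels travel with the common part
```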