Search Results for author: Ugo Tanielian

Found 18 papers, 1 paper with code

Learning disconnected manifolds: a no GAN's land

no code implementations • ICML 2020 • Ugo Tanielian, Thibaut Issenhuth, Elvis Dohmatob, Jeremie Mary

Typical architectures of Generative Adversarial Networks make use of a unimodal latent/input distribution transformed by a continuous generator.

3DGEN: A GAN-based approach for generating novel 3D models from image data

no code implementations • 13 Dec 2023 • Antoine Schnepf, Flavian Vasile, Ugo Tanielian

The recent advances in text and image synthesis show great promise for the future of generative models in creative fields.

Image Generation • Object Reconstruction

AdBooster: Personalized Ad Creative Generation using Stable Diffusion Outpainting

no code implementations • 8 Sep 2023 • Veronika Shilova, Ludovic Dos Santos, Flavian Vasile, Gaëtan Racic, Ugo Tanielian

In digital advertising, the selection of the optimal item (recommendation) and its best creative presentation (creative optimization) have traditionally been considered separate disciplines.

Data Augmentation

Unveiling the Latent Space Geometry of Push-Forward Generative Models

no code implementations • 21 Jul 2022 • Thibaut Issenhuth, Ugo Tanielian, Jérémie Mary, David Picard

We investigate the relationship between the performance of these models and the geometry of their latent space.

Face Model

Lessons from the AdKDD'21 Privacy-Preserving ML Challenge

no code implementations • 31 Jan 2022 • Eustache Diemert, Romain Fabre, Alexandre Gilotte, Fei Jia, Basile Leparmentier, Jérémie Mary, Zhonghua Qu, Ugo Tanielian, Hui Yang

Designing data sharing mechanisms providing performance and strong privacy guarantees is a hot topic for the Online Advertising industry.

Privacy Preserving

Optimal 1-Wasserstein Distance for WGANs

no code implementations • 8 Jan 2022 • Arthur Stéphanovitch, Ugo Tanielian, Benoît Cadre, Nicolas Klutchnikoff, Gérard Biau

The mathematical forces at work behind Generative Adversarial Networks raise challenging theoretical issues.

EdiBERT, a generative model for image editing

1 code implementation • 30 Nov 2021 • Thibaut Issenhuth, Ugo Tanielian, Jérémie Mary, David Picard

Advances in computer vision are pushing the limits of image manipulation, with generative models sampling detailed images on various tasks.

Image Denoising • Image Manipulation

Latent reweighting, an almost free improvement for GANs

no code implementations • 19 Oct 2021 • Thibaut Issenhuth, Ugo Tanielian, David Picard, Jeremie Mary

Standard formulations of GANs, where a continuous function deforms a connected latent space, have been shown to be misspecified when fitting different classes of images.

What Users Want? WARHOL: A Generative Model for Recommendation

no code implementations • 2 Sep 2021 • Jules Samaran, Ugo Tanielian, Romain Beaumont, Flavian Vasile

Current recommendation approaches help online merchants predict, for each visiting user, which subset of their existing products is the most relevant.

Learning Disconnected Manifolds: Avoiding The No Gan's Land by Latent Rejection

no code implementations • 1 Jan 2021 • Thibaut Issenhuth, Ugo Tanielian, David Picard, Jeremie Mary

Standard formulations of GANs, where a continuous function deforms a connected latent space, have been shown to be misspecified when fitting disconnected manifolds.

Wasserstein Learning of Determinantal Point Processes

no code implementations • NeurIPS Workshop LMCA 2020 • Lucas Anquetil, Mike Gartrell, Alain Rakotomamonjy, Ugo Tanielian, Clément Calauzènes

Through an evaluation on a real-world dataset, we show that our Wasserstein learning approach provides significantly improved predictive performance on a generative task compared to DPPs trained using MLE.

Point Processes

Approximating Lipschitz continuous functions with GroupSort neural networks

no code implementations • 9 Jun 2020 • Ugo Tanielian, Maxime Sangnier, Gérard Biau

Recent advances in adversarial attacks and Wasserstein GANs have advocated for the use of neural networks with restricted Lipschitz constants.

Learning disconnected manifolds: a no GANs land

no code implementations • 8 Jun 2020 • Ugo Tanielian, Thibaut Issenhuth, Elvis Dohmatob, Jeremie Mary

Typical architectures of Generative Adversarial Networks make use of a unimodal latent distribution transformed by a continuous generator.

Some Theoretical Insights into Wasserstein GANs

no code implementations • 4 Jun 2020 • Gérard Biau, Maxime Sangnier, Ugo Tanielian

Generative Adversarial Networks (GANs) have been successful in producing outstanding results in areas as diverse as image, video, and text generation.

Text Generation

Relaxed Softmax for learning from Positive and Unlabeled data

no code implementations • 17 Sep 2019 • Ugo Tanielian, Flavian Vasile

In recent years, the softmax model and its fast approximations have become the de-facto loss functions for deep neural networks when dealing with multi-class prediction.

Density Estimation • Language Modelling

Distributionally Robust Counterfactual Risk Minimization

no code implementations • 14 Jun 2019 • Louis Faury, Ugo Tanielian, Flavian Vasile, Elena Smirnova, Elvis Dohmatob

This manuscript introduces the idea of using Distributionally Robust Optimization (DRO) for the Counterfactual Risk Minimization (CRM) problem.

Counterfactual • Decision Making

Partially Mutual Exclusive Softmax for Positive and Unlabeled data

no code implementations • ICLR 2019 • Ugo Tanielian, Flavian Vasile, Mike Gartrell

This is often the case for applications such as language modeling, next event prediction and matrix factorization, where many of the potential outcomes are not mutually exclusive, but are more likely to be independent conditionally on the state.

Language Modelling

Adversarial Training of Word2Vec for Basket Completion

no code implementations • 22 May 2018 • Ugo Tanielian, Mike Gartrell, Flavian Vasile

In recent years, the Word2Vec model trained with the Negative Sampling loss function has shown state-of-the-art results in a number of machine learning tasks, including language modeling tasks, such as word analogy and word similarity, and in recommendation tasks, through Prod2Vec, an extension that applies to modeling user shopping activity and user preferences.

Language Modelling • Word Similarity
