Search Results for author: Youssef Tamaazousti

Found 11 papers, 1 paper with code

On the Hidden Negative Transfer in Sequential Transfer Learning for Domain Adaptation from News to Tweets

no code implementations EACL (AdaptNLP) 2021 Sara Meftah, Nasredine Semmar, Youssef Tamaazousti, Hassane Essafi, Fatiha Sadat

Transfer learning has been shown to be a powerful tool for Natural Language Processing (NLP), outperforming the standard supervised learning paradigm by benefiting from pre-learned knowledge.

Chunking Domain Adaptation +5

Neural Supervised Domain Adaptation by Augmenting Pre-trained Models with Random Units

no code implementations 9 Jun 2021 Sara Meftah, Nasredine Semmar, Youssef Tamaazousti, Hassane Essafi, Fatiha Sadat

In the standard fine-tuning scheme of TL, a model is initially pre-trained on a source domain and subsequently fine-tuned on a target domain and, therefore, source and target domains are trained using the same architecture.

Chunking Domain Adaptation +5
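The standard fine-tuning scheme the abstract refers to can be illustrated with a minimal numpy sketch (this is an illustration of plain sequential transfer with a shared architecture, not the paper's random-unit augmentation; all data and the `sgd_fit` helper are made up for the example):

```python
import numpy as np

rng = np.random.default_rng(0)

def sgd_fit(X, y, w, lr=0.1, steps=200):
    """Plain full-batch gradient descent on mean-squared error."""
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

# Source domain: plentiful labeled data.
X_src = rng.normal(size=(500, 3))
y_src = X_src @ np.array([1.0, -2.0, 0.5]) + 0.01 * rng.normal(size=500)

# Target domain: a related task with little labeled data.
X_tgt = rng.normal(size=(20, 3))
y_tgt = X_tgt @ np.array([1.2, -1.8, 0.4]) + 0.01 * rng.normal(size=20)

# Sequential transfer: pre-train on the source, then fine-tune the
# SAME parameters (same architecture) on the target.
w_pre = sgd_fit(X_src, y_src, np.zeros(3))
w_ft = sgd_fit(X_tgt, y_tgt, w_pre, lr=0.05, steps=10)
```

Because source and target tasks are close, the pre-trained weights start near the target optimum, so a few fine-tuning steps beat training from scratch on the small target set.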

Multi-Task Supervised Pretraining for Neural Domain Adaptation

no code implementations WS 2020 Sara Meftah, Nasredine Semmar, Mohamed-Ayoub Tahiri, Youssef Tamaazousti, Hassane Essafi, Fatiha Sadat

Two prevalent transfer learning approaches are used in recent works to improve neural network performance on domains with small amounts of annotated data: multi-task learning, which trains the task of interest jointly with related auxiliary tasks to exploit their underlying similarities, and mono-task fine-tuning, where the model's weights are initialized with pretrained weights from a large-scale labeled source domain and then fine-tuned on labeled data from the target domain (the domain of interest).

Domain Adaptation Multi-Task Learning
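The multi-task side of the contrast above can be sketched with hard parameter sharing: one shared encoder trained by alternating gradient steps between the task of interest and an auxiliary task. This is a generic toy setup (synthetic data, linear encoder and heads), not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two related tasks built on the same underlying features.
X = rng.normal(size=(200, 4))
W_true = rng.normal(size=(4, 3))
H_true = X @ W_true
y = {"aux": H_true @ np.array([1.0, 0.5, -1.0]),    # auxiliary task
     "main": H_true @ np.array([-0.5, 1.0, 0.2])}   # task of interest

# Hard parameter sharing: one shared encoder, one linear head per task.
W = 0.1 * rng.normal(size=(4, 3))
heads = {"aux": 0.1 * rng.normal(size=3),
         "main": 0.1 * rng.normal(size=3)}

def task_loss(task):
    err = (X @ W) @ heads[task] - y[task]
    return float(np.mean(err ** 2))

loss_before = task_loss("main")
lr = 0.01
for step in range(500):
    task = "aux" if step % 2 == 0 else "main"       # alternate between tasks
    h = X @ W
    err = h @ heads[task] - y[task]
    grad_head = 2 * h.T @ err / len(err)
    grad_W = 2 * X.T @ np.outer(err, heads[task]) / len(err)
    heads[task] = heads[task] - lr * grad_head      # task-specific update
    W = W - lr * grad_W                             # shared update
loss_after = task_loss("main")
```

Updates from the auxiliary task flow into the shared encoder `W`, so the main task benefits from the auxiliary supervision even on its own small loss.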

Where is the Fake? Patch-Wise Supervised GANs for Texture Inpainting

no code implementations 6 Nov 2019 Ahmed Ben Saad, Youssef Tamaazousti, Josselin Kherroubi, Alexis He

We tackle the problem of texture inpainting where the input images are textures with missing values along with masks that indicate the zones that should be generated.

Image Inpainting
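The problem setup in the snippet above — a texture with missing values plus a mask marking the zones to generate — uses a composition step that is standard across inpainting methods. A minimal sketch (the "generator" here is just random noise standing in for a model output):

```python
import numpy as np

def compose(image, generated, mask):
    """Keep known pixels from the input; fill masked (missing) pixels
    with the generator's output. mask == 1 marks missing values."""
    return (1.0 - mask) * image + mask * generated

rng = np.random.default_rng(2)
texture = rng.uniform(size=(8, 8, 1))
mask = np.zeros((8, 8, 1))
mask[2:5, 2:5] = 1.0                     # the zone that should be generated
fake = rng.uniform(size=(8, 8, 1))       # stand-in for a generator output
out = compose(texture, fake, mask)
```

Only the masked zone is replaced; known pixels pass through untouched, which is what lets a discriminator judge "where the fake is" patch by patch.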

Exploration de l'apprentissage par transfert pour l'analyse de textes des réseaux sociaux (Exploring neural transfer learning for social media text analysis)

no code implementations JEPTALNRECITAL 2019 Sara Meftah, Nasredine Semmar, Youssef Tamaazousti, Hassane Essafi, Fatiha Sadat

Transfer learning denotes the ability of a neural model trained on one task to generalize sufficiently and correctly to produce relevant results on another, closely related but different task.

Transfer Learning

How to make a pizza: Learning a compositional layer-based GAN model

no code implementations CVPR 2019 Dim P. Papadopoulos, Youssef Tamaazousti, Ferda Ofli, Ingmar Weber, Antonio Torralba

From a visual perspective, every instruction step can be seen as a way to change the visual appearance of the dish by adding extra objects (e.g., adding an ingredient) or changing the appearance of the existing ones (e.g., cooking the dish).

Generative Adversarial Network
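The "adding extra objects" step described above amounts to alpha-compositing an ingredient layer over the current dish image. A toy numpy sketch of that compositing operator (random layers and masks here stand in for the GAN's predicted ingredient appearance and placement):

```python
import numpy as np

def composite(base, layer, alpha):
    """Alpha-composite one 'ingredient' layer over the current dish."""
    return alpha * layer + (1.0 - alpha) * base

rng = np.random.default_rng(3)
dish = np.full((4, 4, 3), 0.8)                     # plain base (e.g. dough)
for _ in range(3):                                 # three instruction steps
    layer = rng.uniform(size=(4, 4, 3))            # ingredient appearance
    alpha = (rng.uniform(size=(4, 4, 1)) > 0.6).astype(float)  # placement mask
    dish = composite(dish, layer, alpha)
```

Because each step is a convex combination of images in [0, 1], the composed dish stays a valid image, and the operator can be applied (or inverted) one layer at a time.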

Deep Multi-class Adversarial Specularity Removal

no code implementations 4 Apr 2019 John Lin, Mohamed El Amine Seddik, Mohamed Tamaazousti, Youssef Tamaazousti, Adrien Bartoli

We propose a novel learning approach, in the form of a fully-convolutional neural network (CNN), which automatically and consistently removes specular highlights from a single image by generating its diffuse component.
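For intuition about the diffuse/specular decomposition the network learns, here is a classical non-learned baseline on a synthetic dichromatic image: under roughly white illumination the specular term is equal across RGB, so subtracting the per-pixel channel minimum cancels it (this is a textbook "specular-free image" trick, not the paper's CNN):

```python
import numpy as np

def specular_free(img):
    """Subtract the per-pixel channel minimum: cancels a channel-uniform
    specular term, at the cost of also dimming the diffuse colors."""
    return img - img.min(axis=-1, keepdims=True)

rng = np.random.default_rng(4)
diffuse = rng.uniform(0.0, 0.5, size=(8, 8, 3))    # colored diffuse component
specular = np.zeros((8, 8, 1))
specular[2:4, 2:4] = 0.4                           # a white highlight blob
img = diffuse + specular                           # dichromatic composition

recovered = specular_free(img)
```

The highlight vanishes exactly in `recovered`, but so does the darkest diffuse channel at every pixel — which is precisely why a learned approach that regresses the true diffuse component is attractive.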

Learning Finer-class Networks for Universal Representations

no code implementations 4 Oct 2018 Julien Girard, Youssef Tamaazousti, Hervé Le Borgne, Céline Hudelot

This raises the question of how "universal" the original representation really is, that is, how directly it adapts to many different target tasks.

MuCaLe-Net: Multi Categorical-Level Networks to Generate More Discriminating Features

no code implementations CVPR 2017 Youssef Tamaazousti, Hervé Le Borgne, Céline Hudelot

In a transfer-learning scheme, the intermediate layers of a pre-trained CNN are employed as universal image representation to tackle many visual classification problems.

General Classification Image Classification +1
