Search Results for author: Marc Masana

Found 15 papers, 8 papers with code

On the importance of cross-task features for class-incremental learning

1 code implementation · 22 Jun 2021 · Albin Soutif-Cormerais, Marc Masana, Joost Van de Weijer, Bartłomiej Twardowski

We also define a new forgetting measure for class-incremental learning, and see that forgetting is not the principal cause of low performance.

class-incremental learning · Incremental Learning · +1
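The paper above defines its own forgetting measure; purely as an illustration, the sketch below computes a commonly used generic forgetting metric (the gap between a task's best accuracy and its accuracy after training on the final task), which is not the measure proposed in the paper.

```python
import numpy as np

def average_forgetting(acc):
    """Generic forgetting metric (not the measure defined in the paper above).

    acc[t, i] = accuracy on task i measured after training on task t, for a
    sequence of T tasks (T x T matrix; entries with i > t are unused).
    Forgetting of task i is the gap between the best accuracy it ever reached
    and its accuracy after the final task; the score averages this over tasks.
    """
    acc = np.asarray(acc, dtype=float)
    T = acc.shape[0]
    gaps = [acc[:T - 1, i].max() - acc[T - 1, i] for i in range(T - 1)]
    return float(np.mean(gaps))

# Toy example with 3 tasks: accuracy on earlier tasks degrades over time.
acc = [[0.90, 0.00, 0.00],
       [0.70, 0.85, 0.00],
       [0.60, 0.75, 0.88]]
print(average_forgetting(acc))  # ((0.90 - 0.60) + (0.85 - 0.75)) / 2 = 0.20
```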

Class-incremental learning: survey and performance evaluation on image classification

1 code implementation · 28 Oct 2020 · Marc Masana, Xialei Liu, Bartłomiej Twardowski, Mikel Menta, Andrew D. Bagdanov, Joost Van de Weijer

For future learning systems incremental learning is desirable, because it allows for: efficient resource usage by eliminating the need to retrain from scratch at the arrival of new data; reduced memory usage by preventing or limiting the amount of data required to be stored -- also important when privacy limitations are imposed; and learning that more closely resembles human learning.

class-incremental learning · General Classification · +2

Disentanglement of Color and Shape Representations for Continual Learning

no code implementations · 13 Jul 2020 · David Berga, Marc Masana, Joost Van de Weijer

We hypothesize that disentangled feature representations suffer less from catastrophic forgetting.

Continual Learning

On Class Orderings for Incremental Learning

no code implementations · 4 Jul 2020 · Marc Masana, Bartłomiej Twardowski, Joost Van de Weijer

The influence of class orderings in the evaluation of incremental learning has received very little attention.

Incremental Learning

Ternary Feature Masks: zero-forgetting for task-incremental learning

no code implementations · 23 Jan 2020 · Marc Masana, Tinne Tuytelaars, Joost Van de Weijer

To allow already learned features to adapt to the current task without changing the behavior of these features for previous tasks, we introduce task-specific feature normalization.

Continual Learning · Incremental Learning
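The abstract above mentions task-specific feature normalization. As a rough sketch of the general idea (a simplification, not the paper's Ternary Feature Masks implementation), each task can keep its own normalization parameters over shared features, so training the parameters for a new task does not alter how those features are normalized for earlier tasks; all names below are illustrative.

```python
import torch
import torch.nn as nn

class TaskSpecificNorm(nn.Module):
    """Illustrative sketch: one set of normalization parameters per task.

    Each task owns its own BatchNorm statistics and affine scale/shift, so
    updating them for the current task leaves the normalization applied to
    previous tasks untouched.
    """
    def __init__(self, num_features, num_tasks):
        super().__init__()
        self.norms = nn.ModuleList(
            [nn.BatchNorm1d(num_features) for _ in range(num_tasks)]
        )

    def forward(self, x, task_id):
        return self.norms[task_id](x)

# Usage sketch: the same shared features are normalized per task.
feats = torch.randn(8, 64)           # batch of 8 feature vectors
norm = TaskSpecificNorm(64, num_tasks=3)
out_task0 = norm(feats, task_id=0)   # uses task 0's parameters
out_task2 = norm(feats, task_id=2)   # uses task 2's parameters
```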

A continual learning survey: Defying forgetting in classification tasks

1 code implementation · 18 Sep 2019 · Matthias De Lange, Rahaf Aljundi, Marc Masana, Sarah Parisot, Xu Jia, Ales Leonardis, Gregory Slabaugh, Tinne Tuytelaars

Artificial neural networks thrive in solving the classification problem for a particular rigid task, acquiring knowledge through generalized learning behaviour from a distinct training phase.

Continual Learning · General Classification · +1

LIUM-CVC Submissions for WMT18 Multimodal Translation Task

no code implementations · WS 2018 · Ozan Caglayan, Adrien Bardet, Fethi Bougares, Loïc Barrault, Kai Wang, Marc Masana, Luis Herranz, Joost Van de Weijer

This paper describes the multimodal Neural Machine Translation systems developed by LIUM and CVC for WMT18 Shared Task on Multimodal Translation.

Machine Translation · Translation

Metric Learning for Novelty and Anomaly Detection

1 code implementation · 16 Aug 2018 · Marc Masana, Idoia Ruiz, Joan Serrat, Joost Van de Weijer, Antonio M. Lopez

When neural networks process images which do not resemble the distribution seen during training, so-called out-of-distribution images, they often make wrong predictions, and do so too confidently.

Anomaly Detection · Metric Learning · +2
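One generic way to turn a metric-learning embedding into a novelty or out-of-distribution score is to measure each test sample's distance to the nearest class centroid of the in-distribution training data; large distances suggest out-of-distribution inputs. The sketch below shows only that generic idea and is not claimed to be the exact scoring rule of the paper above.

```python
import numpy as np

def ood_scores(train_embs, train_labels, test_embs):
    """Score test embeddings by distance to the nearest in-distribution
    class centroid (illustrative only; not the paper's exact method)."""
    centroids = np.stack([train_embs[train_labels == c].mean(axis=0)
                          for c in np.unique(train_labels)])
    # Euclidean distance from every test embedding to every centroid.
    dists = np.linalg.norm(test_embs[:, None, :] - centroids[None, :, :], axis=-1)
    return dists.min(axis=1)  # higher score = more likely out-of-distribution

# Toy usage with random vectors standing in for metric-learning embeddings.
rng = np.random.default_rng(0)
train_embs = rng.normal(size=(100, 16))
train_labels = rng.integers(0, 5, size=100)
test_embs = rng.normal(size=(10, 16))
print(ood_scores(train_embs, train_labels, test_embs))
```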

Context Proposals for Saliency Detection

no code implementations · 27 Jun 2018 · Aymen Azaza, Joost Van de Weijer, Ali Douik, Marc Masana

Therefore, we extend object proposal methods with context proposals, which allow us to incorporate the immediate context in the saliency computation.

RGB Salient Object Detection · Saliency Prediction · +1

Rotate your Networks: Better Weight Consolidation and Less Catastrophic Forgetting

2 code implementations · 8 Feb 2018 · Xialei Liu, Marc Masana, Luis Herranz, Joost Van de Weijer, Antonio M. Lopez, Andrew D. Bagdanov

In this paper we propose an approach to avoiding catastrophic forgetting in sequential task learning scenarios.

Domain-adaptive deep network compression

2 code implementations · ICCV 2017 · Marc Masana, Joost Van de Weijer, Luis Herranz, Andrew D. Bagdanov, Jose M. Alvarez

We show that domain transfer leads to large shifts in network activations and that it is desirable to take this into account when compressing.

Low-rank compression
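For context, the sketch below shows a plain truncated-SVD low-rank factorization of a fully connected layer's weight matrix. This baseline ignores activation statistics entirely, which is precisely the aspect the abstract above argues compression should account for after domain transfer; it is shown only as an illustration of low-rank compression, not as the paper's domain-adaptive method.

```python
import numpy as np

def low_rank_factorize(W, rank):
    """Factor a weight matrix W (out_dim x in_dim) into A (out_dim x rank)
    and B (rank x in_dim) with W ≈ A @ B via truncated SVD.

    Note: plain SVD does not consider how activations shift under domain
    transfer; it serves here only as a generic low-rank baseline.
    """
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * s[:rank]   # absorb singular values into the left factor
    B = Vt[:rank, :]
    return A, B

W = np.random.randn(256, 512)
A, B = low_rank_factorize(W, rank=64)
rel_err = np.linalg.norm(W - A @ B) / np.linalg.norm(W)
print(A.shape, B.shape, f"relative error {rel_err:.3f}")
```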

LIUM-CVC Submissions for WMT17 Multimodal Translation Task

no code implementations · WS 2017 · Ozan Caglayan, Walid Aransa, Adrien Bardet, Mercedes García-Martínez, Fethi Bougares, Loïc Barrault, Marc Masana, Luis Herranz, Joost Van de Weijer

This paper describes the monomodal and multimodal Neural Machine Translation systems developed by LIUM and CVC for WMT17 Shared Task on Multimodal Translation.

Machine Translation · Translation

On-the-fly Network Pruning for Object Detection

no code implementations · 11 May 2016 · Marc Masana, Joost Van de Weijer, Andrew D. Bagdanov

Object detection with deep neural networks is often performed by passing a few thousand candidate bounding boxes through a deep neural network for each image.

Network Pruning · Object Detection
