Search Results for author: Julio Hurtado

Found 13 papers, 5 papers with code

In-context Interference in Chat-based Large Language Models

no code implementations 22 Sep 2023 Eric Nuertey Coleman, Julio Hurtado, Vincenzo Lomonaco

However, a limitation of this scenario is that users cannot modify the model's internal knowledge; the only way to add or change that knowledge is to state it explicitly during the current interaction.

A Comprehensive Empirical Evaluation on Online Continual Learning

1 code implementation 20 Aug 2023 Albin Soutif-Cormerais, Antonio Carta, Andrea Cossu, Julio Hurtado, Hamed Hemati, Vincenzo Lomonaco, Joost Van de Weijer

Online continual learning aims to get closer to a live learning experience by learning directly on a stream of data with temporally shifting distribution and by storing a minimum amount of data from that stream.

Continual Learning Image Classification
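The snippet above describes learning directly from a stream while storing only a minimal amount of data. One standard way to keep a fixed-size, uniform sample of a stream (a common baseline buffer in online continual learning, not necessarily the policy evaluated in this paper) is reservoir sampling; all names below are illustrative:

```python
import random

def reservoir_update(buffer, item, seen, capacity):
    """Classic reservoir sampling: maintain a uniform random sample of the
    stream seen so far inside a fixed-size memory."""
    if len(buffer) < capacity:
        buffer.append(item)                 # memory not full yet: just store
    else:
        j = random.randrange(seen + 1)      # uniform index over all items seen
        if j < capacity:
            buffer[j] = item                # replace with prob. capacity/(seen+1)
    return buffer

# Simulate a stream; the memory never exceeds its capacity.
random.seed(0)
buffer = []
for seen, item in enumerate(range(1000)):
    reservoir_update(buffer, item, seen, capacity=50)
```

Because every stream element survives with equal probability, the buffer remains an unbiased sample even as the distribution drifts.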

Studying Generalization on Memory-Based Methods in Continual Learning

no code implementations 16 Jun 2023 Felipe del Rio, Julio Hurtado, Cristian Buc, Alvaro Soto, Vincenzo Lomonaco

One of the objectives of Continual Learning is to learn new concepts continually over a stream of experiences while avoiding catastrophic forgetting.

Continual Learning Out-of-Distribution Generalization

Continual Learning for Predictive Maintenance: Overview and Challenges

no code implementations 29 Jan 2023 Julio Hurtado, Dario Salvati, Rudy Semola, Mattia Bosio, Vincenzo Lomonaco

In this work, we present a brief introduction to predictive maintenance, non-stationary environments, and continual learning, together with an extensive review of the current state of applying continual learning to real-world applications, and to predictive maintenance in particular.

Continual Learning

Class-Incremental Learning with Repetition

1 code implementation 26 Jan 2023 Hamed Hemati, Andrea Cossu, Antonio Carta, Julio Hurtado, Lorenzo Pellegrini, Davide Bacciu, Vincenzo Lomonaco, Damian Borth

We propose two stochastic stream generators that produce a wide range of CIR streams starting from a single dataset and a few interpretable control parameters.

Class-Incremental Learning +1
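The abstract mentions stochastic stream generators that build class-incremental-with-repetition (CIR) streams from a single dataset using a few interpretable control parameters. A hypothetical sketch of that idea, with `p_repeat` as an assumed repetition parameter (the names are invented for illustration, not taken from the paper):

```python
import random

def generate_cir_stream(classes, n_experiences, p_repeat, classes_per_exp, seed=0):
    """Sketch of a stochastic CIR stream generator: each experience mixes
    brand-new classes with previously seen ones, controlled by p_repeat."""
    rng = random.Random(seed)
    unseen, seen = list(classes), []
    stream = []
    for _ in range(n_experiences):
        exp = []
        for _ in range(classes_per_exp):
            if seen and (not unseen or rng.random() < p_repeat):
                exp.append(rng.choice(seen))             # repeated class
            else:
                c = unseen.pop(rng.randrange(len(unseen)))
                exp.append(c)                            # first occurrence
                seen.append(c)
        stream.append(exp)
    return stream

stream = generate_cir_stream(range(10), n_experiences=5,
                             p_repeat=0.5, classes_per_exp=3)
```

Setting `p_repeat=0` recovers a classic class-incremental stream, while `p_repeat=1` keeps revisiting the first classes, so a single knob spans the whole CIR spectrum.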

How Relevant is Selective Memory Population in Lifelong Language Learning?

no code implementations 3 Oct 2022 Vladimir Araujo, Helena Balabin, Julio Hurtado, Alvaro Soto, Marie-Francine Moens

Lifelong language learning seeks to have models continuously learn multiple tasks in sequential order without suffering from catastrophic forgetting.

Question Answering Text Classification +1

A Study on the Predictability of Sample Learning Consistency

no code implementations 7 Jul 2022 Alain Raymond-Saez, Julio Hurtado, Alvaro Soto

Curriculum Learning is a powerful training method that allows for faster and better training in some settings.

Populating Memory in Continual Learning with Consistency Aware Sampling

no code implementations 4 Jul 2022 Julio Hurtado, Alain Raymond-Saez, Vladimir Araujo, Vincenzo Lomonaco, Alvaro Soto, Davide Bacciu

Based on these insights, we propose CAWS (Consistency AWare Sampling), an original storage policy that leverages a learning consistency score (C-Score) to populate the memory with elements that are easy to learn and representative of previous tasks.

Continual Learning
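The CAWS description above ranks candidate examples by a learning-consistency score (C-Score) and stores easy, representative ones in memory. A hypothetical sketch of such a storage policy, with all function and field names invented for illustration:

```python
def populate_memory(examples, c_scores, capacity):
    """Sketch in the spirit of CAWS: rank candidates by a consistency score
    (higher = easier to learn) and keep the top-scoring examples, spreading
    the memory budget evenly across tasks."""
    tasks = {}
    for ex, score in zip(examples, c_scores):
        tasks.setdefault(ex["task"], []).append((score, ex))
    per_task = capacity // max(len(tasks), 1)
    memory = []
    for cands in tasks.values():
        cands.sort(key=lambda t: t[0], reverse=True)   # easiest first
        memory.extend(ex for _, ex in cands[:per_task])
    return memory

examples = [{"task": t, "x": i} for t in (0, 1) for i in range(5)]
scores = [0.1, 0.9, 0.5, 0.8, 0.2, 0.7, 0.3, 0.6, 0.4, 0.95]
mem = populate_memory(examples, scores, capacity=4)
```

The even per-task split keeps the memory representative of previous tasks rather than dominated by whichever task has the easiest samples.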

Entropy-based Stability-Plasticity for Lifelong Learning

1 code implementation 18 Apr 2022 Vladimir Araujo, Julio Hurtado, Alvaro Soto, Marie-Francine Moens

The ability to continuously learn remains elusive for deep learning models.

Optimizing Reusable Knowledge for Continual Learning via Metalearning

1 code implementation NeurIPS 2021 Julio Hurtado, Alain Raymond-Saez, Alvaro Soto

On the other hand, a set of trainable masks provides the key mechanism for selectively choosing the relevant weights from the knowledge base (KB) to solve each task.

Continual Learning
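The snippet above describes per-task trainable masks that pick out relevant weights from a shared knowledge base (KB). A minimal sketch of that selection mechanism, assuming a dense KB matrix and a binarized mask (the names are illustrative, not the paper's API):

```python
import numpy as np

def masked_forward(x, kb_weights, task_mask):
    """Apply a per-task binary mask over a shared KB weight matrix, so each
    task only uses the KB columns its mask keeps."""
    selected = kb_weights * task_mask      # zero out weights irrelevant to the task
    return x @ selected

rng = np.random.default_rng(0)
kb = rng.standard_normal((4, 6))                 # shared knowledge base
mask_t = (rng.random(6) > 0.5).astype(float)     # learned per-task mask, binarized
x = rng.standard_normal((2, 4))
out = masked_forward(x, kb, mask_t)
```

During training, a real implementation would keep the mask continuous (e.g. sigmoid-gated) so it stays differentiable, binarizing only at inference; the sketch shows just the selection step.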

Catching the Long Tail in Deep Neural Networks

no code implementations 1 Jan 2021 Julio Hurtado, Alain Raymond, Alvaro Soto

As a working hypothesis, we speculate that during learning some weights focus on mining patterns from frequent examples while others are in charge of memorizing rare long-tail samples.

