Search Results for author: Lukas Mauch

Found 17 papers, 7 papers with code

SAFT: Towards Out-of-Distribution Generalization in Fine-Tuning

no code implementations · 3 Jul 2024 · Bac Nguyen, Stefan Uhlich, Fabien Cardinaux, Lukas Mauch, Marzieh Edraki, Aaron Courville

While a pre-trained vision-language model like CLIP has demonstrated remarkable zero-shot performance, further adapting the model to downstream tasks degrades its performance on out-of-distribution (OOD) data.

Few-Shot Learning · General Knowledge · +2

LLM meets Vision-Language Models for Zero-Shot One-Class Classification

no code implementations · 31 Mar 2024 · Yassir Bendou, Giulia Lioi, Bastien Pasdeloup, Lukas Mauch, Ghouthi Boukli Hacene, Fabien Cardinaux, Vincent Gripon

We propose a realistic benchmark where negative query samples are drawn from the same original dataset as positive ones, including a granularity-controlled version of iNaturalist, where negative samples are at a fixed distance in the taxonomy tree from the positive ones.

Classification · One-Class Classification

A Novel Benchmark for Few-Shot Semantic Segmentation in the Era of Foundation Models

1 code implementation · 20 Jan 2024 · Reda Bensaid, Vincent Gripon, François Leduc-Primeau, Lukas Mauch, Ghouthi Boukli Hacene, Fabien Cardinaux

In recent years, the rapid evolution of computer vision has seen the emergence of various foundation models, each tailored to specific data types and tasks.

Few-Shot Semantic Segmentation · Segmentation · +1

Inferring Latent Class Statistics from Text for Robust Visual Few-Shot Learning

1 code implementation · 24 Nov 2023 · Yassir Bendou, Vincent Gripon, Bastien Pasdeloup, Giulia Lioi, Lukas Mauch, Fabien Cardinaux, Ghouthi Boukli Hacene

In this paper, we present a novel approach that leverages text-derived statistics to predict the mean and covariance of the visual feature distribution for each class.

Few-Shot Learning

Order-Preserving GFlowNets

1 code implementation · 30 Sep 2023 · Yihang Chen, Lukas Mauch

To address these issues, we propose Order-Preserving GFlowNets (OP-GFNs), which sample with probabilities in proportion to a learned reward function that is consistent with a provided (partial) order on the candidates, thus eliminating the need for an explicit formulation of the reward function.

Neural Architecture Search

DBsurf: A Discrepancy Based Method for Discrete Stochastic Gradient Estimation

no code implementations · 7 Sep 2023 · Pau Mulet Arabi, Alec Flowers, Lukas Mauch, Fabien Cardinaux

Computing gradients of an expectation with respect to the distributional parameters of a discrete distribution is a problem arising in many fields of science and engineering.

Benchmarking · Neural Architecture Search
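The problem the DBsurf abstract describes, namely the gradient of an expectation with respect to the parameters of a discrete distribution, is classically attacked with the score-function (REINFORCE) estimator. As a point of reference only (this is the standard baseline, not DBsurf's discrepancy-based method, and `score_function_grad` is an illustrative name), a minimal sketch for a Bernoulli variable with a logit parameter:

```python
import numpy as np

rng = np.random.default_rng(0)

def score_function_grad(theta, f, n_samples=100_000):
    """Estimate d/dtheta E_{z ~ Bernoulli(sigmoid(theta))}[f(z)]
    via the score-function identity E[f(z) * d log p(z)/dtheta]."""
    p = 1.0 / (1.0 + np.exp(-theta))          # success probability
    z = (rng.random(n_samples) < p).astype(float)
    # For a Bernoulli with logit theta: d log p(z)/dtheta = z - p
    return np.mean(f(z) * (z - p))

# Sanity check with f(z) = z: E[f] = p, so the true gradient is p * (1 - p).
theta = 0.5
p = 1.0 / (1.0 + np.exp(-theta))
est = score_function_grad(theta, lambda z: z)
print(est, p * (1 - p))  # the two values should be close
```

The estimator is unbiased but high-variance, which is exactly the weakness that discrepancy-based and variance-reduced estimators aim to fix.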

Efficient Training of Deep Equilibrium Models

1 code implementation · 23 Apr 2023 · Bac Nguyen, Lukas Mauch

Deep equilibrium models (DEQs) have proven to be very powerful for learning data representations.
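A DEQ's forward pass solves for a fixed point z* = f(z*, x) rather than stacking a finite number of layers. A minimal sketch under the assumption of a simple tanh layer and naive fixed-point iteration (the paper's efficient training method itself is not shown; `deq_forward` is an illustrative name):

```python
import numpy as np

def deq_forward(W, U, x, n_iter=100, tol=1e-8):
    """Find the equilibrium z* = tanh(W @ z* + U @ x) by naive
    fixed-point iteration (a DEQ forward pass, in sketch form)."""
    z = np.zeros(W.shape[0])
    for _ in range(n_iter):
        z_next = np.tanh(W @ z + U @ x)
        if np.linalg.norm(z_next - z) < tol:
            break
        z = z_next
    return z

rng = np.random.default_rng(0)
d = 8
# Scale W down so the layer map is a contraction and iteration converges.
W = 0.1 * rng.standard_normal((d, d))
U = rng.standard_normal((d, d))
x = rng.standard_normal(d)

z_star = deq_forward(W, U, x)
# z_star satisfies the fixed-point equation up to the tolerance:
assert np.allclose(z_star, np.tanh(W @ z_star + U @ x), atol=1e-6)
```

In practice, DEQ implementations replace this naive iteration with faster root-finding solvers and differentiate through the equilibrium implicitly instead of backpropagating through the iterations.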

Neural Network Libraries: A Deep Learning Framework Designed from Engineers' Perspectives

1 code implementation · 12 Feb 2021 · Takuya Narihira, Javier Alonso Garcia, Fabien Cardinaux, Akio Hayakawa, Masato Ishii, Kazunori Iwaki, Thomas Kemp, Yoshiyuki Kobayashi, Lukas Mauch, Akira Nakamura, Yukio Obuchi, Andrew Shin, Kenji Suzuki, Stephen Tiedmann, Stefan Uhlich, Takuya Yashima, Kazuki Yoshiyama

While there exists a plethora of deep learning tools and frameworks, the fast-growing complexity of the field brings new demands and challenges, such as more flexible network design, fast computation in distributed settings, and compatibility between different tools.

Efficient Sampling for Predictor-Based Neural Architecture Search

no code implementations · 24 Nov 2020 · Lukas Mauch, Stephen Tiedemann, Javier Alonso Garcia, Bac Nguyen Cong, Kazuki Yoshiyama, Fabien Cardinaux, Thomas Kemp

In predictor-based NAS, a proxy (a learned performance predictor) is usually computed for all DNNs in the search space, and the DNNs that maximize the proxy are picked as candidates for optimization.

Neural Architecture Search

Deep Neural Network inference with reduced word length

no code implementations · 23 Oct 2018 · Lukas Mauch, Bin Yang

Deep neural networks (DNN) are powerful models for many pattern recognition tasks, yet their high computational complexity and memory requirement limit them to applications on high-performance computing platforms.
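Reducing the word length of a DNN generally means quantizing its weights to low-bit integers. A minimal sketch of uniform symmetric quantization, shown only as the standard technique and not necessarily the paper's exact scheme (`quantize` is an illustrative name):

```python
import numpy as np

def quantize(w, n_bits):
    """Uniform symmetric quantization of a weight tensor to n_bits:
    store small integers plus one shared floating-point scale."""
    q_max = 2 ** (n_bits - 1) - 1            # e.g. 127 for 8 bits
    scale = np.max(np.abs(w)) / q_max
    q = np.clip(np.round(w / scale), -q_max, q_max)
    return q.astype(np.int32), scale

rng = np.random.default_rng(0)
w = rng.standard_normal(1000)

q, scale = quantize(w, 8)
w_hat = q * scale                            # dequantized weights
print(np.max(np.abs(w - w_hat)))             # rounding error is at most scale / 2
```

Shorter word lengths shrink memory and enable integer arithmetic, at the cost of a larger quantization error per weight.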

Neural Network Ensembles to Real-time Identification of Plug-level Appliance Measurements

no code implementations · 20 Feb 2018 · Karim Said Barsim, Lukas Mauch, Bin Yang

The problem of identifying end-use electrical appliances from their individual consumption profiles, known as the appliance identification problem, is a primary stage in both Non-Intrusive Load Monitoring (NILM) and automated plug-wise metering.

Non-Intrusive Load Monitoring
