no code implementations • 27 Nov 2023 • Edouard Yvinec, Arnaud Dapogny, Kevin Bailly
However, such techniques suffer from a lack of adaptability to the target devices, as hardware typically only supports specific bit widths.
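For context, a minimal sketch of the standard symmetric uniform quantizer such work builds on (illustrative names, not the paper's code); the entry's point is that the bit width `bits` is dictated by what the target hardware supports:

```python
import torch

def quantize_uniform(w: torch.Tensor, bits: int = 8) -> torch.Tensor:
    """Symmetric, per-tensor uniform quantization of weights to `bits` bits."""
    qmax = 2 ** (bits - 1) - 1                    # e.g. 127 for 8-bit
    scale = w.abs().max() / qmax                  # per-tensor scale factor
    q = torch.clamp(torch.round(w / scale), -qmax, qmax)
    return q * scale                              # "fake-quantized" weights
```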
no code implementations • 17 Nov 2023 • Rémi Ouazan Reboul, Edouard Yvinec, Arnaud Dapogny, Kevin Bailly
To solve this problem, a popular solution is DNN pruning, and more specifically structured pruning, where coherent computational blocks (e.g. channels for convolutional networks) are removed. As an exhaustive search of the space of pruned sub-models is intractable in practice, channels are typically removed iteratively, based on an importance estimation heuristic.
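As an illustration of one such iteration, here is a minimal sketch of structured channel pruning with the common L1-norm importance heuristic (a generic heuristic, not necessarily the one studied in this paper):

```python
import torch
import torch.nn as nn

def prune_channels(conv: nn.Conv2d, keep_ratio: float = 0.5) -> nn.Conv2d:
    """Remove output channels of a conv layer, keeping the most important ones
    according to the L1 norm of their filters."""
    importance = conv.weight.detach().abs().sum(dim=(1, 2, 3))  # one score per output channel
    n_keep = max(1, int(keep_ratio * conv.out_channels))
    keep = torch.topk(importance, n_keep).indices.sort().values
    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    pruned.weight.data = conv.weight.data[keep]
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[keep]
    return pruned
```

In a full network, the following layer's input channels must be pruned accordingly; this sketch only shows the per-layer step.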
no code implementations • 29 Sep 2023 • Edouard Yvinec, Arnaud Dapogny, Kevin Bailly
However, this has led to an increase in memory footprint, to the point where simply loading a model on commodity devices such as mobile phones can be challenging.
no code implementations • 11 Sep 2023 • Eden Belouadah, Arnaud Dapogny, Kevin Bailly
The main challenge of incremental learning is catastrophic forgetting, the inability of neural networks to retain past knowledge when learning new tasks.
no code implementations • 15 Aug 2023 • Edouard Yvinec, Arnaud Dapogny, Kevin Bailly
GPTQ essentially consists of learning the rounding operation using a small calibration set.
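A heavily simplified sketch of the idea of learned rounding over a calibration batch (AdaRound-flavoured gradient descent for readability; the actual GPTQ solver is a greedy, second-order, column-by-column update, not gradient descent):

```python
import torch

def learn_rounding(w, x_calib, scale, steps=200, lr=1e-2):
    """Learn, per weight, whether to round down or up so as to minimise the
    layer's reconstruction error on a small calibration batch."""
    w_floor = torch.floor(w / scale)
    v = torch.zeros_like(w, requires_grad=True)      # soft rounding offset in [0, 1]
    opt = torch.optim.Adam([v], lr=lr)
    y_ref = (x_calib @ w.t()).detach()               # full-precision layer output
    for _ in range(steps):
        w_q = (w_floor + torch.sigmoid(v)) * scale   # soft-quantized weights
        loss = ((x_calib @ w_q.t()) - y_ref).pow(2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    return (w_floor + (torch.sigmoid(v) > 0.5).float()) * scale  # hard rounding
```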
no code implementations • 10 Aug 2023 • Edouard Yvinec, Arnaud Dapogny, Kevin Bailly
However, optimizing the exponent parameter and the weight values remains a challenging and novel problem that could not be solved with previous post-training optimization techniques, which only learn to round weight values up or down in order to preserve the predictive function.
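As a hedged sketch of what a power-exponent quantizer can look like (here the exponent `a` is fixed by hand, whereas the entry's point is precisely that it should be optimized jointly with the weights, post training):

```python
import torch

def power_quantize(x, bits=8, a=0.5):
    """Non-uniform quantization through a power transform: quantize |x|**a
    uniformly, then invert the transform."""
    qmax = 2 ** (bits - 1) - 1
    s = x.sign()
    t = x.abs().pow(a)                        # compress the dynamic range
    scale = t.max().clamp(min=1e-8) / qmax
    q = torch.clamp(torch.round(t / scale), 0, qmax)
    return s * (q * scale).pow(1.0 / a)       # expand back
```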
no code implementations • 9 Aug 2023 • Edouard Yvinec, Arnaud Dapogny, Kevin Bailly, Xavier Fischer
In this work, we propose to investigate DNN layer importance, i.e. to estimate the sensitivity of the accuracy w.r.t. each layer.
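One generic way to make this concrete is a sensitivity sweep that perturbs a single layer at a time and records the accuracy drop (`perturb` and `evaluate` are hypothetical user-supplied callables; this is a plain ablation loop, not the estimator the paper proposes):

```python
import torch

@torch.no_grad()
def layer_sensitivity(model, layer_names, perturb, evaluate):
    """Score each layer by the accuracy drop observed when only that layer
    is perturbed (e.g. quantized or zeroed)."""
    baseline = evaluate(model)
    scores = {}
    for name in layer_names:
        saved = {k: v.clone() for k, v in model.state_dict().items()
                 if k.startswith(name)}
        perturb(model, name)                        # e.g. quantize this layer's weights
        scores[name] = baseline - evaluate(model)   # accuracy drop = sensitivity
        model.load_state_dict({**model.state_dict(), **saved})
    return scores
```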
no code implementations • 30 Jun 2023 • Edouard Yvinec, Arnaud Dapogny, Kevin Bailly
We show experimentally that our approach significantly improves the performance of ternary quantization across a variety of scenarios in DFQ, PTQ and QAT, and provides strong insights to pave the way for future research in deep neural network quantization.
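For reference, the threshold-based ternary quantizer that such work typically starts from (in the spirit of Ternary Weight Networks; shown only to fix notation, not as the paper's method):

```python
import torch

def ternarize(w, t=0.05):
    """Map weights to {-s, 0, +s}: zero out small weights, then scale the rest."""
    mask = (w.abs() > t * w.abs().max()).float()          # keep only large weights
    s = (w.abs() * mask).sum() / mask.sum().clamp(min=1)  # per-tensor scale
    return s * w.sign() * mask
```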
no code implementations • 21 Mar 2023 • Gauthier Tallec, Edouard Yvinec, Arnaud Dapogny, Kevin Bailly
The rising performance of deep neural networks is often empirically attributed to an increase in the available computational power, which allows complex models to be trained upon large amounts of annotated data.
no code implementations • 6 Mar 2023 • Gauthier Tallec, Arnaud Dapogny, Kevin Bailly
However, applying label smoothing as-is may aggravate the pre-existing, imbalance-induced under-confidence issue and degrade performance.
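For reference, standard label smoothing mixes the one-hot target with the uniform distribution; the entry's point is that applying it uniformly can worsen under-confidence on imbalanced classes. A minimal sketch:

```python
import torch
import torch.nn.functional as F

def smoothed_cross_entropy(logits, targets, eps=0.1):
    """Cross-entropy against targets softened towards the uniform distribution."""
    n = logits.size(-1)
    log_p = F.log_softmax(logits, dim=-1)
    one_hot = F.one_hot(targets, n).float()
    soft = one_hot * (1 - eps) + eps / n       # smoothed target distribution
    return -(soft * log_p).sum(dim=-1).mean()
```

Recent PyTorch versions expose the same loss directly via `F.cross_entropy(logits, targets, label_smoothing=eps)`.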
no code implementations • 24 Jan 2023 • Edouard Yvinec, Arnaud Dapogny, Matthieu Cord, Kevin Bailly
In this paper, we identify the uniformity of the quantization operator as a limitation of existing approaches, and propose a data-free non-uniform method.
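To contrast with the uniform quantizer identified as the limitation, here is a classic non-uniform alternative: a k-means codebook over weight values (a textbook approach, not the paper's data-free method):

```python
import torch

def kmeans_quantize(w, bits=3, iters=20):
    """Cluster weight values into 2**bits centroids and snap each weight
    to its nearest centroid (non-uniform codebook quantization)."""
    k = 2 ** bits
    flat = w.flatten()
    # initialise centroids on quantiles so every cluster starts populated
    centroids = torch.quantile(flat, torch.linspace(0, 1, k, dtype=flat.dtype))
    for _ in range(iters):
        assign = (flat[:, None] - centroids[None, :]).abs().argmin(dim=1)
        for j in range(k):
            sel = flat[assign == j]
            if sel.numel() > 0:
                centroids[j] = sel.mean()
    assign = (flat[:, None] - centroids[None, :]).abs().argmin(dim=1)
    return centroids[assign].reshape(w.shape)
```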
no code implementations • 8 Jul 2022 • Edouard Yvinec, Arnaud Dapogny, Matthieu Cord, Kevin Bailly
The leap in performance in state-of-the-art computer vision methods is attributed to the development of deep neural networks.
no code implementations • 28 Mar 2022 • Edouard Yvinec, Arnaud Dapogny, Matthieu Cord, Kevin Bailly
Computationally expensive neural networks are ubiquitous in computer vision, and solutions for efficient inference have drawn growing attention in the machine learning community.
1 code implementation • 28 Mar 2022 • Edouard Yvinec, Arnaud Dapogny, Kevin Bailly
Batch-Normalization (BN) layers have become fundamental components in the ever more complex deep neural network architectures.
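For reference, the standard conv+BN folding identity that such analyses revolve around: w' = w * gamma / sqrt(var + eps) and b' = (b - mean) * gamma / sqrt(var + eps) + beta. A minimal sketch, assuming `groups=1` and default dilation:

```python
import torch
import torch.nn as nn

@torch.no_grad()
def fold_bn(conv: nn.Conv2d, bn: nn.BatchNorm2d) -> nn.Conv2d:
    """Fold a BatchNorm layer into the preceding convolution at inference time."""
    g = bn.weight / torch.sqrt(bn.running_var + bn.eps)   # gamma / sqrt(var + eps)
    fused = nn.Conv2d(conv.in_channels, conv.out_channels, conv.kernel_size,
                      stride=conv.stride, padding=conv.padding, bias=True)
    fused.weight.copy_(conv.weight * g.reshape(-1, 1, 1, 1))
    b = conv.bias if conv.bias is not None else torch.zeros_like(bn.running_mean)
    fused.bias.copy_((b - bn.running_mean) * g + bn.bias)
    return fused
```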
no code implementations • 23 Mar 2022 • Gauthier Tallec, Edouard Yvinec, Arnaud Dapogny, Kevin Bailly
Action Unit (AU) Detection is the branch of affective computing that aims at recognizing unitary facial muscular movements.
no code implementations • 1 Feb 2022 • Gauthier Tallec, Arnaud Dapogny, Kevin Bailly
MONET uses a differentiable order selection to jointly learn task-wise modules with their optimal chaining order.
no code implementations • NeurIPS 2021 • Edouard Yvinec, Arnaud Dapogny, Matthieu Cord, Kevin Bailly
Deep Neural Networks (DNNs) are ubiquitous in today's computer vision landscape, despite involving considerable computational costs.
no code implementations • 30 Sep 2021 • Edouard Yvinec, Arnaud Dapogny, Matthieu Cord, Kevin Bailly
Pruning Deep Neural Networks (DNNs) is a prominent field of study aimed at accelerating inference runtime.
no code implementations • 31 May 2021 • Edouard Yvinec, Arnaud Dapogny, Matthieu Cord, Kevin Bailly
Deep Neural Networks (DNNs) are ubiquitous in today's computer vision landscape, despite involving considerable computational costs.
no code implementations • 15 Oct 2020 • Estephe Arnaud, Arnaud Dapogny, Kevin Bailly
Thus, the exogenous information is used twice, in a throwable fashion: first as a conditioning variable for the target task, and second to create invariance within the endogenous representation.
Facial Expression Recognition (FER)
no code implementations • 21 Oct 2019 • Estephe Arnaud, Arnaud Dapogny, Kevin Bailly
Face alignment consists of aligning a shape model on a face image.
no code implementations • 7 Jul 2019 • Estephe Arnaud, Arnaud Dapogny, Kevin Bailly
Face alignment consists of aligning a shape model on a face in an image.
no code implementations • ICCV 2015 • Arnaud Dapogny, Kevin Bailly, Severine Dubuisson
Facial expression can be seen as the dynamic variation of one's appearance over time.
Facial Expression Recognition (FER)