no code implementations • 27 Nov 2023 • Edouard Yvinec, Arnaud Dapogny, Kevin Bailly
However, such techniques suffer from a lack of adaptability to the target devices, as hardware typically only supports specific bit widths.
no code implementations • 17 Nov 2023 • Rémi Ouazan Reboul, Edouard Yvinec, Arnaud Dapogny, Kevin Bailly
To solve this problem, a popular solution is DNN pruning, and more specifically structured pruning, where coherent computational blocks (e.g. channels for convolutional networks) are removed. Since an exhaustive search of the space of pruned sub-models is intractable in practice, channels are typically removed iteratively, based on an importance-estimation heuristic.
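The iterative scheme described above can be sketched in a few lines. This is a minimal illustration, not the paper's method: the per-channel L2-norm importance criterion and the toy weights are assumptions chosen for clarity.

```python
# Illustrative sketch of iterative structured (channel) pruning.
# Importance heuristic assumed here: L2 norm of the channel's weights.

def channel_importance(channel):
    """L2 norm of a channel's weights (a common importance heuristic)."""
    return sum(w * w for w in channel) ** 0.5

def prune_channels(channels, keep_ratio):
    """Iteratively remove the least important channel until only the
    target fraction of channels remains."""
    kept = list(channels)
    target = max(1, int(len(channels) * keep_ratio))
    while len(kept) > target:
        # drop the channel with the smallest importance score
        kept.remove(min(kept, key=channel_importance))
    return kept

# Toy "layer" with four channels of two weights each
channels = [[0.9, -1.1], [0.01, 0.02], [0.5, 0.4], [0.001, 0.0]]
pruned = prune_channels(channels, keep_ratio=0.5)
```

In practice the importance score would be recomputed after each removal step (e.g. after a short fine-tuning pass), which is what makes the iterative scheme preferable to one-shot ranking.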
no code implementations • 29 Sep 2023 • Edouard Yvinec, Arnaud Dapogny, Kevin Bailly
However, this led to an increase in the memory footprint, to a point where it can be challenging to simply load a model on commodity devices such as mobile phones.
no code implementations • 15 Aug 2023 • Edouard Yvinec, Arnaud Dapogny, Kevin Bailly
GPTQ essentially consists in learning the rounding operation using a small calibration set.
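The idea of learning the rounding operation from a calibration set can be illustrated with a deliberately tiny brute-force version: for each weight, choose floor or ceil so that the quantized layer best reproduces the full-precision outputs on the calibration inputs. This is a hedged sketch of the general principle only; GPTQ itself uses a far more efficient Hessian-based solver, and the layer, scale, and data below are made up for illustration.

```python
# Illustrative sketch: choose per-weight rounding direction (floor vs ceil)
# to minimize output error on a small calibration set. Brute force only
# works for a handful of weights; real methods solve this approximately.
import itertools
import math

def layer_output(weights, x):
    """Dot product standing in for a linear layer."""
    return sum(w * xi for w, xi in zip(weights, x))

def learn_rounding(weights, calib_inputs, scale):
    """Pick floor/ceil rounding per weight to minimize squared output
    error over the calibration inputs."""
    candidates = [(math.floor(w / scale), math.ceil(w / scale)) for w in weights]
    best, best_err = None, float("inf")
    for choice in itertools.product(*candidates):
        q = [c * scale for c in choice]
        err = sum((layer_output(weights, x) - layer_output(q, x)) ** 2
                  for x in calib_inputs)
        if err < best_err:
            best, best_err = q, err
    return best

w = [0.34, -0.57, 0.91]                      # full-precision weights (toy)
calib = [[1.0, 0.5, -0.2], [0.3, -1.0, 0.7]]  # tiny calibration set (toy)
q = learn_rounding(w, calib, scale=0.25)
```

Note that round-to-nearest is one of the candidate assignments, so the learned rounding can never do worse than it on the calibration set.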
no code implementations • 10 Aug 2023 • Edouard Yvinec, Arnaud Dapogny, Kevin Bailly
However, jointly optimizing the exponent parameter and the weight values remains a challenging and novel problem, which cannot be solved with previous post-training optimization techniques, as these only learn whether to round weight values up or down in order to preserve the predictive function.
no code implementations • 9 Aug 2023 • Edouard Yvinec, Arnaud Dapogny, Kevin Bailly, Xavier Fischer
In this work, we propose to investigate DNN layer importance, i.e. to estimate the sensitivity of the accuracy w.r.t.
no code implementations • 30 Jun 2023 • Edouard Yvinec, Arnaud Dapogny, Kevin Bailly
We show experimentally that our approach significantly improves the performance of ternary quantization across a variety of scenarios in DFQ, PTQ, and QAT, and provides strong insights that pave the way for future research in deep neural network quantization.
no code implementations • 21 Mar 2023 • Gauthier Tallec, Edouard Yvinec, Arnaud Dapogny, Kevin Bailly
The rising performance of deep neural networks is often empirically attributed to an increase in the available computational power, which allows complex models to be trained upon large amounts of annotated data.
no code implementations • 24 Jan 2023 • Edouard Yvinec, Arnaud Dapogny, Matthieu Cord, Kevin Bailly
In this paper, we identify the uniformity of the quantization operator as a limitation of existing approaches, and propose a data-free non-uniform method.
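The limitation mentioned above can be seen with a toy nearest-level quantizer: a uniform quantizer spaces its levels evenly, while a non-uniform one can concentrate levels where the weights actually cluster (typically near zero). The level sets and weights below are illustrative assumptions, not the paper's design.

```python
# Illustration: uniform vs non-uniform quantization levels.
# Non-uniform levels placed densely near zero fit bell-shaped weight
# distributions better, at equal bit width (5 levels in both cases).

def quantize(x, levels):
    """Map x to the nearest available quantization level."""
    return min(levels, key=lambda level: abs(level - x))

uniform_levels = [-1.0, -0.5, 0.0, 0.5, 1.0]       # evenly spaced
nonuniform_levels = [-1.0, -0.25, 0.0, 0.25, 1.0]  # denser near zero

weights = [0.2, -0.15, 0.05, 0.8]  # toy weights concentrated near zero
err_u = sum((w - quantize(w, uniform_levels)) ** 2 for w in weights)
err_n = sum((w - quantize(w, nonuniform_levels)) ** 2 for w in weights)
```

With these weights the non-uniform levels yield a lower squared reconstruction error, which is the intuition behind preferring a non-uniform quantization operator.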
no code implementations • 8 Jul 2022 • Edouard Yvinec, Arnaud Dapogny, Matthieu Cord, Kevin Bailly
The leap in performance in state-of-the-art computer vision methods is attributed to the development of deep neural networks.
no code implementations • 28 Mar 2022 • Edouard Yvinec, Arnaud Dapogny, Matthieu Cord, Kevin Bailly
Computationally expensive neural networks are ubiquitous in computer vision, and solutions for efficient inference have drawn growing attention in the machine learning community.
1 code implementation • 28 Mar 2022 • Edouard Yvinec, Arnaud Dapogny, Kevin Bailly
Batch-Normalization (BN) layers have become fundamental components of ever more complex deep neural network architectures.
no code implementations • 23 Mar 2022 • Gauthier Tallec, Edouard Yvinec, Arnaud Dapogny, Kevin Bailly
Action Unit (AU) Detection is the branch of affective computing that aims at recognizing unitary facial muscular movements.
no code implementations • NeurIPS 2021 • Edouard Yvinec, Arnaud Dapogny, Matthieu Cord, Kevin Bailly
Deep Neural Networks (DNNs) are ubiquitous in today's computer vision landscape, despite involving considerable computational costs.
no code implementations • 30 Sep 2021 • Edouard Yvinec, Arnaud Dapogny, Matthieu Cord, Kevin Bailly
Pruning Deep Neural Networks (DNNs) is a prominent field of study with the goal of accelerating inference runtime.
no code implementations • 31 May 2021 • Edouard Yvinec, Arnaud Dapogny, Matthieu Cord, Kevin Bailly
Deep Neural Networks (DNNs) are ubiquitous in today's computer vision landscape, despite involving considerable computational costs.
no code implementations • 15 Apr 2020 • Edouard Yvinec, Arnaud Dapogny, Kévin Bailly
In this paper, we introduce a deep, end-to-end trainable ensemble of heatmap-based weak predictors for 2D/3D gaze estimation.