Search Results for author: Marius Preda

Found 6 papers, 2 papers with code

End-to-end deep meta modelling to calibrate and optimize energy consumption and comfort

1 code implementation • 1 Feb 2021 • Max Cohen, Sylvain Le Corff, Maurice Charbit, Marius Preda, Gilles Nozière

Parameters are estimated with the CMA-ES algorithm, a derivative-free optimization procedure, by comparing the predictions of the metamodel with real data obtained from sensors.
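The derivative-free calibration loop described above can be sketched as follows. This is a minimal stand-in, not the paper's implementation: it replaces full CMA-ES with a simplified (1+1) evolution strategy, and the linear "energy model" and its parameters are purely illustrative.

```python
import numpy as np

def calibrate(simulate, sensor_data, x0, sigma=0.5, iters=200, seed=0):
    """Derivative-free calibration: a simplified (1+1) evolution strategy
    standing in for CMA-ES. `simulate(params)` returns model predictions;
    candidates are scored by squared error against the sensor readings."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    best = np.sum((simulate(x) - sensor_data) ** 2)
    for _ in range(iters):
        cand = x + sigma * rng.standard_normal(x.shape)  # sample one offspring
        err = np.sum((simulate(cand) - sensor_data) ** 2)
        if err < best:                  # keep the improvement, widen the search
            x, best, sigma = cand, err, sigma * 1.1
        else:                           # shrink the step size on failure
            sigma *= 0.95
    return x, best

# Toy example: recover the parameters of a linear model from synthetic data.
t = np.linspace(0.0, 1.0, 50)
data = 2.0 * t - 1.0
params, err = calibrate(lambda p: p[0] * t + p[1], data, x0=[0.0, 0.0])
```

In the full method the score would compare metamodel predictions against real sensor data, and CMA-ES would additionally adapt a full covariance matrix rather than a single step size.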

Multiobjective Optimization

One-Cycle Pruning: Pruning ConvNets Under a Tight Training Budget

1 code implementation • 5 Jul 2021 • Nathan Hubens, Matei Mancas, Bernard Gosselin, Marius Preda, Titus Zaharia

Most of the time, sparsity is introduced using a three-stage pipeline: 1) train the model to convergence, 2) prune the model according to some criterion, 3) fine-tune the pruned model to recover performance.
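Stage 2 of the three-stage pipeline is typically global magnitude pruning. A minimal NumPy sketch of that stage (the criterion and array shapes are assumptions; the paper's contribution is precisely to compress this pipeline into one training cycle):

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude weights so that a `sparsity`
    fraction of the entries becomes zero (stage 2 of the pipeline)."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)            # number of weights to remove
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold       # keep weights above the threshold
    return weights * mask

rng = np.random.default_rng(0)
w = rng.standard_normal((8, 8))
pruned = magnitude_prune(w, sparsity=0.5)    # half the entries set to zero
```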

End-to-end deep metamodeling to calibrate and optimize energy loads

no code implementations • 19 Jun 2020 • Max Cohen, Maurice Charbit, Sylvain Le Corff, Marius Preda, Gilles Nozière

Finally, the optimal settings to minimize the energy loads while maintaining a target thermal comfort and air quality are obtained using a multi-objective optimization procedure.
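The core of such a multi-objective procedure is selecting non-dominated trade-offs. A minimal sketch of Pareto-front filtering, with hypothetical (energy load, thermal discomfort) scores where both objectives are minimized (the actual paper optimizes over building settings with a learned metamodel):

```python
def pareto_front(points):
    """Return the non-dominated points when all objectives are minimized.
    A point is dominated if some other point is <= in every objective
    and differs in at least one."""
    front = []
    for i, p in enumerate(points):
        dominated = any(
            all(q[k] <= p[k] for k in range(len(p))) and q != p
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical candidate settings scored as (energy load, discomfort).
candidates = [(5.0, 1.0), (3.0, 2.0), (4.0, 1.5), (6.0, 0.5), (5.0, 2.0)]
front = pareto_front(candidates)  # (5.0, 2.0) is dominated and dropped
```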

Management

An Experimental Study of the Impact of Pre-training on the Pruning of a Convolutional Neural Network

no code implementations • 15 Dec 2021 • Nathan Hubens, Matei Mancas, Bernard Gosselin, Marius Preda, Titus Zaharia

Neural networks usually involve a large number of parameters, which correspond to the weights of the network.

Improve Convolutional Neural Network Pruning by Maximizing Filter Variety

no code implementations • 11 Mar 2022 • Nathan Hubens, Matei Mancas, Bernard Gosselin, Marius Preda, Titus Zaharia

This technique ensures that the selection criterion focuses on redundant filters while retaining rare ones, thus maximizing the variety of the remaining filters.
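One simple way to target redundant filters is to score each filter by its cosine similarity to its nearest neighbour and prune the most redundant ones first. This is an illustrative sketch of that idea, not the paper's exact criterion; the filter shapes and the near-duplicate construction are assumptions.

```python
import numpy as np

def redundancy_aware_prune(filters, n_prune):
    """Prune the filters most similar to another filter, keeping rare
    (dissimilar) ones to preserve variety. `filters` has shape
    (n_filters, fan_in); returns the sorted indices of kept filters."""
    f = filters / np.linalg.norm(filters, axis=1, keepdims=True)
    sim = f @ f.T                         # pairwise cosine similarity
    np.fill_diagonal(sim, -np.inf)        # ignore self-similarity
    redundancy = sim.max(axis=1)          # similarity to nearest neighbour
    order = np.argsort(redundancy)        # least redundant first
    return np.sort(order[: len(filters) - n_prune])

rng = np.random.default_rng(1)
base = rng.standard_normal((4, 9))
dup = base[0] + 0.01 * rng.standard_normal(9)   # near-duplicate of filter 0
filters = np.vstack([base, dup])
keep = redundancy_aware_prune(filters, n_prune=1)  # drops one of the pair
```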

Network Pruning

Induced Feature Selection by Structured Pruning

no code implementations • 20 Mar 2023 • Nathan Hubens, Victor Delvigne, Matei Mancas, Bernard Gosselin, Marius Preda, Titus Zaharia

The advent of sparsity-inducing techniques in neural networks has been of great help in recent years.

feature selection
