Search Results for author: Bertrand Charpentier

Found 21 papers, 13 papers with code

Predicting Probabilities of Error to Combine Quantization and Early Exiting: QuEE

no code implementations20 Jun 2024 Florence Regol, Joud Chataoui, Bertrand Charpentier, Mark Coates, Pablo Piantanida, Stephan Günnemann

Machine learning models can solve complex tasks but often require significant computational resources during inference.

Quantization
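The excerpt above only gives the motivation. As a rough illustration of the general idea of gating an early exit on a predicted probability of error, here is a toy PyTorch sketch; the architecture, gate, and threshold are hypothetical and not the QuEE method itself:

```python
import torch
import torch.nn as nn

class EarlyExitNet(nn.Module):
    """Toy two-block classifier with one intermediate exit.

    A small gate predicts the probability that the early exit is wrong;
    inference stops early only when that predicted error is low.
    (Illustrative sketch only -- not the QuEE architecture.)
    """
    def __init__(self, dim=32, n_classes=10):
        super().__init__()
        self.block1 = nn.Sequential(nn.Linear(dim, dim), nn.ReLU())
        self.exit1 = nn.Linear(dim, n_classes)                        # early classifier
        self.gate1 = nn.Sequential(nn.Linear(dim, 1), nn.Sigmoid())   # predicted P(error)
        self.block2 = nn.Sequential(nn.Linear(dim, dim), nn.ReLU())
        self.exit2 = nn.Linear(dim, n_classes)                        # final classifier

    @torch.no_grad()
    def forward(self, x, max_error=0.1):
        h = self.block1(x)
        p_err = self.gate1(h)                 # per-sample predicted error probability
        early = self.exit1(h)
        final = self.exit2(self.block2(h))
        # Use the early prediction where the gate is confident enough,
        # otherwise run the rest of the network.
        return torch.where(p_err < max_error, early, final)

x = torch.randn(4, 32)
print(EarlyExitNet()(x).shape)   # torch.Size([4, 10])
```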

Uncertainty for Active Learning on Graphs

no code implementations2 May 2024 Dominik Fuchsgruber, Tom Wollschläger, Bertrand Charpentier, Antonio Oroz, Stephan Günnemann

Uncertainty Sampling is an Active Learning strategy that aims to improve the data efficiency of machine learning models by iteratively acquiring labels of data points with the highest uncertainty.

Active Learning, Node Classification
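A minimal sketch of the uncertainty-sampling loop described above, using scikit-learn, predictive entropy as the acquisition score, and a synthetic dataset standing in for the paper's graph setting:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
labeled = [int(np.where(y == c)[0][0]) for c in np.unique(y)]   # one seed label per class
unlabeled = [i for i in range(len(X)) if i not in labeled]

model = LogisticRegression(max_iter=1000)
for _ in range(20):                                    # acquisition rounds
    model.fit(X[labeled], y[labeled])
    probs = model.predict_proba(X[unlabeled])
    entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)
    pick = unlabeled[int(np.argmax(entropy))]          # most uncertain unlabeled point
    labeled.append(pick)                               # "query the oracle" for its label
    unlabeled.remove(pick)

print(f"labels acquired: {len(labeled)}, "
      f"accuracy on remaining pool: {model.score(X[unlabeled], y[unlabeled]):.2f}")
```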

Structurally Prune Anything: Any Architecture, Any Framework, Any Time

no code implementations3 Mar 2024 Xun Wang, John Rachwan, Stephan Günnemann, Bertrand Charpentier

However, existing pruning methods adapt poorly to different architectures, frameworks, and pruning criteria because of the diverse patterns that couple parameters (such as residual connections and group convolutions), the diversity of deep learning frameworks, and the various stages at which pruning can be performed.

Network Pruning

Adversarial Training for Graph Neural Networks: Pitfalls, Solutions, and New Directions

no code implementations NeurIPS 2023 Lukas Gosch, Simon Geisler, Daniel Sturm, Bertrand Charpentier, Daniel Zügner, Stephan Günnemann

Including these contributions, we demonstrate that adversarial training is a state-of-the-art defense against adversarial structure perturbations.

Graph Learning

Uncertainty Estimation for Molecules: Desiderata and Methods

1 code implementation20 Jun 2023 Tom Wollschläger, Nicholas Gao, Bertrand Charpentier, Mohamed Amine Ketata, Stephan Günnemann

Graph Neural Networks (GNNs) are promising surrogates for quantum mechanical calculations as they achieve unprecedentedly low errors on collections of molecular dynamics (MD) trajectories.

Computational chemistry

Accuracy is not the only Metric that matters: Estimating the Energy Consumption of Deep Learning Models

1 code implementation3 Apr 2023 Johannes Getzner, Bertrand Charpentier, Stephan Günnemann

Modern machine learning models have started to consume incredible amounts of energy, thus incurring large carbon footprints (Strubell et al., 2019).

Training, Architecture, and Prior for Deterministic Uncertainty Methods

1 code implementation10 Mar 2023 Bertrand Charpentier, Chenxiang Zhang, Stephan Günnemann

Accurate and efficient uncertainty estimation is crucial for building reliable Machine Learning (ML) models capable of providing calibrated uncertainty estimates, generalizing, and detecting Out-Of-Distribution (OOD) datasets.

On the Robustness and Anomaly Detection of Sparse Neural Networks

no code implementations9 Jul 2022 Morgane Ayle, Bertrand Charpentier, John Rachwan, Daniel Zügner, Simon Geisler, Stephan Günnemann

The robustness and anomaly detection capability of neural networks are crucial topics for their safe adoption in the real world.

Anomaly Detection

Disentangling Epistemic and Aleatoric Uncertainty in Reinforcement Learning

no code implementations3 Jun 2022 Bertrand Charpentier, Ransalu Senanayake, Mykel Kochenderfer, Stephan Günnemann

Characterizing aleatoric and epistemic uncertainty can be used to speed up learning in a training environment, improve generalization to similar testing environments, and flag unfamiliar behavior in anomalous testing environments.

Reinforcement Learning +1
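One standard way to separate the two kinds of uncertainty, shown here as a generic illustration rather than the paper's RL-specific method, is the entropy decomposition over an ensemble of predictive distributions: total predictive entropy splits into expected entropy (aleatoric) plus mutual information (epistemic).

```python
import numpy as np

def uncertainty_decomposition(probs):
    """Split predictive uncertainty for one input into aleatoric and epistemic parts.

    probs: array of shape (n_models, n_classes) -- class probabilities from an
    ensemble (or posterior samples). Uses the standard entropy decomposition:
    total = aleatoric + epistemic (mutual information).
    """
    mean = probs.mean(axis=0)
    total = -np.sum(mean * np.log(mean + 1e-12))                          # H[E_theta p(y|x)]
    aleatoric = -np.mean(np.sum(probs * np.log(probs + 1e-12), axis=1))   # E_theta H[p(y|x)]
    epistemic = total - aleatoric                                         # mutual information
    return total, aleatoric, epistemic

# Models agree on a noisy prediction: mostly aleatoric uncertainty.
print(uncertainty_decomposition(np.array([[0.5, 0.5], [0.5, 0.5]])))
# Models disagree confidently: mostly epistemic uncertainty.
print(uncertainty_decomposition(np.array([[0.9, 0.1], [0.1, 0.9]])))
```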

Differentiable DAG Sampling

1 code implementation ICLR 2022 Bertrand Charpentier, Simon Kibler, Stephan Günnemann

To this end, DP-DAG samples a DAG by successively (1) sampling a linear ordering of the nodes and (2) sampling edges consistent with the sampled linear ordering.

Variational Inference
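The two sampling steps quoted above can be illustrated with a small NumPy sketch; independent Bernoulli edge probabilities and a uniform random ordering stand in for the paper's learned, differentiable parameterization:

```python
import numpy as np

def sample_dag(edge_probs, rng):
    """Sample a DAG: (1) draw a linear ordering of the nodes,
    (2) keep an edge i -> j only if i precedes j in that ordering.

    edge_probs: (n, n) matrix of Bernoulli edge probabilities (a stand-in
    for a learned parameterization).
    """
    n = edge_probs.shape[0]
    order = rng.permutation(n)                         # step (1): linear ordering
    rank = np.empty(n, dtype=int)
    rank[order] = np.arange(n)                         # position of each node in the ordering
    candidates = rng.random((n, n)) < edge_probs       # candidate edges
    consistent = rank[:, None] < rank[None, :]         # i comes before j in the ordering
    return candidates & consistent                     # step (2): acyclic by construction

rng = np.random.default_rng(0)
dag = sample_dag(np.full((4, 4), 0.5), rng)
print(dag.astype(int))
```

Because every kept edge points from an earlier node to a later node in the sampled ordering, the resulting graph can never contain a cycle.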

On Out-of-distribution Detection with Energy-based Models

1 code implementation3 Jul 2021 Sven Elflein, Bertrand Charpentier, Daniel Zügner, Stephan Günnemann

Several density estimation methods have been shown to fail to detect out-of-distribution (OOD) samples by assigning higher likelihoods to anomalous data.

Density Estimation, Out-of-Distribution Detection +1
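For context on the energy-based scores studied in this line of work, a commonly used baseline OOD score is the negative log-sum-exp of a classifier's logits. The snippet below shows that score in isolation and is not meant to reproduce the paper's models or experiments:

```python
import torch

def energy_score(logits, temperature=1.0):
    """Energy-based OOD score: lower energy ~ more in-distribution.

    logits: (batch, n_classes) raw classifier outputs.
    """
    return -temperature * torch.logsumexp(logits / temperature, dim=1)

in_dist = torch.tensor([[8.0, 0.1, 0.2]])    # confident prediction -> low energy
ood = torch.tensor([[0.3, 0.2, 0.1]])        # flat logits -> higher energy
print(energy_score(in_dist), energy_score(ood))
```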

Posterior Network: Uncertainty Estimation without OOD Samples via Density-Based Pseudo-Counts

1 code implementation NeurIPS 2020 Bertrand Charpentier, Daniel Zügner, Stephan Günnemann

The posterior distributions learned by PostNet accurately reflect uncertainty for in- and out-of-distribution data -- without requiring access to OOD data at training time.

Out of Distribution (OOD) Detection
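A simplified sketch of the density-based pseudo-counts named in the title, assuming Dirichlet concentrations of the form prior + N_c * p(z | c) and using a per-class Gaussian in place of the per-class normalizing flows of the actual model:

```python
import numpy as np
from scipy.stats import multivariate_normal

def posterior_pseudo_counts(z, class_means, class_counts, prior=1.0):
    """Dirichlet concentration parameters from density-based pseudo-counts.

    alpha_c = prior + N_c * p(z | c), with a per-class Gaussian standing in
    for the per-class normalizing flows of the actual model.
    """
    densities = np.array([
        multivariate_normal(mean=m, cov=np.eye(len(m))).pdf(z) for m in class_means
    ])
    return prior + class_counts * densities

means = [np.zeros(2), np.full(2, 3.0)]      # latent class centers (toy values)
counts = np.array([100.0, 100.0])           # training examples per class

z_in = np.array([0.0, 0.0])                 # near the class-0 training density
z_ood = np.array([8.0, 8.0])                # far from every class density
for z in (z_in, z_ood):
    alpha = posterior_pseudo_counts(z, means, counts)
    print(alpha, "-> total evidence:", alpha.sum())
```

Far from the training densities the pseudo-counts vanish and the concentration falls back to the prior, which is what yields low evidence (high uncertainty) on OOD inputs.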

Learning Graph Representations by Dendrograms

no code implementations13 Jul 2018 Thomas Bonald, Bertrand Charpentier

Hierarchical graph clustering is a common technique to reveal the multi-scale structure of complex networks.

Clustering, Graph Clustering

Hierarchical Graph Clustering using Node Pair Sampling

3 code implementations5 Jun 2018 Thomas Bonald, Bertrand Charpentier, Alexis Galland, Alexandre Hollocou

We present a novel hierarchical graph clustering algorithm inspired by modularity-based clustering techniques.

Clustering, Graph Clustering
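A compact sketch of the node-pair-sampling idea: clusters are merged greedily by the ratio p(a) p(b) / p(a, b), where p(a, b) is the probability of sampling an edge between clusters a and b and p(a) the probability of sampling a node of a. This is a simplified O(n^3) illustration, not the paper's (more efficient) reference implementation:

```python
import numpy as np

def hierarchical_cluster(adjacency):
    """Greedy agglomeration driven by node-pair sampling probabilities.

    Distance between clusters a, b: p(a) * p(b) / p(a, b). Repeatedly merges
    the closest pair and records each merge as one step of a dendrogram.
    """
    A = adjacency.astype(float)
    w = A.sum()                              # total edge weight (both directions)
    p_pair = A / w                           # node-pair sampling probabilities
    p_node = p_pair.sum(axis=1)              # node sampling probabilities (degree fractions)
    clusters = {i: [i] for i in range(len(A))}
    dendrogram = []
    while len(clusters) > 1:
        ids = list(clusters)
        best, best_d = None, np.inf
        for x in range(len(ids)):
            for y in range(x + 1, len(ids)):
                a, b = ids[x], ids[y]
                if p_pair[a, b] > 0:
                    d = p_node[a] * p_node[b] / p_pair[a, b]
                    if d < best_d:
                        best, best_d = (a, b), d
        if best is None:                     # remaining clusters are disconnected
            break
        a, b = best
        dendrogram.append((clusters[a], clusters[b], best_d))
        # Merge b into a: pair and node probabilities simply add up.
        p_pair[a, :] += p_pair[b, :]
        p_pair[:, a] += p_pair[:, b]
        p_node[a] += p_node[b]
        clusters[a] = clusters[a] + clusters[b]
        del clusters[b]
    return dendrogram

# Two triangles joined by a single edge: the triangles merge first, the bridge last.
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1
for merge in hierarchical_cluster(A):
    print(merge)
```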
