Search Results for author: Mario Marchand

Found 20 papers, 8 papers with code

Algorithm-Dependent Bounds for Representation Learning of Multi-Source Domain Adaptation

1 code implementation • 4 Apr 2023 • Qi Chen, Mario Marchand

We further provide algorithm-dependent generalization bounds for these two settings, where the generalization is characterized by the mutual information between the parameters and the data.

Domain Adaptation • Generalization Bounds • +1
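
For orientation, bounds of this kind typically follow the information-theoretic template of Xu and Raginsky (2017); the sketch below shows that generic form, not necessarily the exact statement derived in the paper. If the loss is $\sigma$-sub-Gaussian and $W$ denotes the parameters learned from an $n$-sample training set $S$, then

$$\Bigl|\,\mathbb{E}\bigl[R(W) - \hat{R}_S(W)\bigr]\Bigr| \;\le\; \sqrt{\frac{2\sigma^2}{n}\, I(W; S)},$$

so the less the learned parameters reveal about the particular training sample (smaller mutual information $I(W;S)$), the smaller the expected gap between the population risk $R$ and the empirical risk $\hat{R}_S$.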

Generalization Properties of Decision Trees on Real-valued and Categorical Features

no code implementations • 18 Oct 2022 • Jean-Samuel Leboeuf, Frédéric LeBlanc, Mario Marchand

Furthermore, we show that the VC dimension of a binary tree structure with $L_T$ leaves on examples of $\ell$ real-valued features is in $O(L_T \log(L_T\ell))$.

Partial Order in Chaos: Consensus on Feature Attributions in the Rashomon Set

1 code implementation • 26 Oct 2021 • Gabriel Laberge, Yann Pequignot, Alexandre Mathieu, Foutse Khomh, Mario Marchand

In this work, instead of aiming at reducing the under-specification of model explanations, we fully embrace it and extract logical statements about feature attributions that are consistent across all models with good empirical performance (i.e., all models in the Rashomon Set).

Additive models • Feature Importance • +1

Decision trees as partitioning machines to characterize their generalization properties

1 code implementation • NeurIPS 2020 • Jean-Samuel Leboeuf, Frédéric LeBlanc, Mario Marchand

We introduce the notion of partitioning function, and we relate it to the growth function and to the VC dimension.

Modelling Biological Assays with Adaptive Deep Kernel Learning

no code implementations • 25 Sep 2019 • Prudencio Tossou, Basile Dura, Daniel Cohen, Mario Marchand, François Laviolette, Alexandre Lacoste

Due to the significant costs of data generation, many prediction tasks within drug discovery are by nature few-shot regression (FSR) problems, including accurate modelling of biological assays.

Drug Discovery

Adaptive Deep Kernel Learning

no code implementations • 28 May 2019 • Prudencio Tossou, Basile Dura, François Laviolette, Mario Marchand, Alexandre Lacoste

Deep kernel learning provides an elegant and principled framework for combining the structural properties of deep learning algorithms with the flexibility of kernel methods.

Benchmarking • Drug Discovery • +2
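
As a rough illustration of the deep-kernel idea itself (an ordinary kernel applied to features produced by a neural network), and not of the adaptive model proposed in these two papers, here is a minimal NumPy sketch with a fixed toy feature extractor and a GP-style predictive mean:

import numpy as np

def mlp_features(X, W1, W2):
    # Toy feature extractor: a two-layer tanh MLP. In deep kernel learning
    # these weights are learned jointly with the kernel hyperparameters;
    # here they are fixed random matrices for brevity.
    return np.tanh(np.tanh(X @ W1) @ W2)

def rbf_kernel(A, B, lengthscale=1.0):
    # RBF kernel evaluated on the (deep) feature representations.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(50, 4)), rng.normal(size=50)
X_test = rng.normal(size=(5, 4))
W1, W2 = rng.normal(size=(4, 16)), rng.normal(size=(16, 8))

# "Deep kernel": a standard RBF kernel applied to network features.
Phi_train = mlp_features(X_train, W1, W2)
Phi_test = mlp_features(X_test, W1, W2)
K = rbf_kernel(Phi_train, Phi_train)
k_star = rbf_kernel(Phi_test, Phi_train)

# GP-regression-style posterior mean, assuming noise variance 0.1.
alpha = np.linalg.solve(K + 0.1 * np.eye(len(X_train)), y_train)
y_pred = k_star @ alpha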

Large scale modeling of antimicrobial resistance with interpretable classifiers

1 code implementation • 3 Dec 2016 • Alexandre Drouin, Frédéric Raymond, Gaël Letarte St-Pierre, Mario Marchand, Jacques Corbeil, François Laviolette

Antimicrobial resistance is an important public health concern that has implications in the practice of medicine worldwide.

Efficient Learning of Ensembles with QuadBoost

no code implementations • 8 Jun 2015 • Louis Fortier-Dubois, François Laviolette, Mario Marchand, Louis-Emile Robitaille, Jean-Francis Roy

We first present a general risk bound for ensembles that depends on the $L_p$ norm of the weighted combination of voters, which can be selected from a continuous set.

Domain-Adversarial Training of Neural Networks

35 code implementations • 28 May 2015 • Yaroslav Ganin, Evgeniya Ustinova, Hana Ajakan, Pascal Germain, Hugo Larochelle, François Laviolette, Mario Marchand, Victor Lempitsky

Our approach is directly inspired by the theory on domain adaptation suggesting that, for effective domain transfer to be achieved, predictions must be made based on features that cannot discriminate between the training (source) and test (target) domains.

Domain Generalization • General Classification • +5
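
The central mechanism in these implementations is a gradient reversal layer placed between a shared feature extractor and a domain classifier: the layer is the identity in the forward pass and flips the sign of the gradient in the backward pass, so the features are simultaneously trained to predict the label and to confuse the domain classifier. A minimal, schematic PyTorch sketch of that idea (toy dimensions and random data; the implementations linked above differ in architecture and training schedule):

import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    # Identity forward; gradient multiplied by -lambda on the way back,
    # so the feature extractor is pushed to make domains indistinguishable.
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None

features = nn.Sequential(nn.Linear(20, 64), nn.ReLU())  # shared feature extractor
label_head = nn.Linear(64, 2)                           # task classifier
domain_head = nn.Linear(64, 2)                          # domain discriminator

x = torch.randn(32, 20)                                 # toy batch
y = torch.randint(0, 2, (32,))                          # task labels (source examples only, in practice)
d = torch.randint(0, 2, (32,))                          # domain labels: source vs. target

h = features(x)
loss = nn.functional.cross_entropy(label_head(h), y) \
     + nn.functional.cross_entropy(domain_head(GradReverse.apply(h, 1.0)), d)
loss.backward()  # features receive the task gradient minus the domain gradient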

Domain-Adversarial Neural Networks

1 code implementation • 15 Dec 2014 • Hana Ajakan, Pascal Germain, Hugo Larochelle, François Laviolette, Mario Marchand

We propose a training objective that implements this idea in the context of a neural network, whose hidden layer is trained to be predictive of the classification task, but uninformative as to the domain of the input.

Denoising • Domain Adaptation • +3

On the String Kernel Pre-Image Problem with Applications in Drug Discovery

no code implementations • 3 Dec 2014 • Sébastien Giguère, Amélie Rolland, François Laviolette, Mario Marchand

This work uses a recent result on combinatorial optimization of linear predictors based on string kernels to develop, for the pre-image, a low complexity upper bound valid for many string kernels.

Combinatorial Optimization • Drug Discovery • +1

Multilabel Structured Output Learning with Random Spanning Trees of Max-Margin Markov Networks

no code implementations • NeurIPS 2014 • Mario Marchand, Hongyu Su, Emilie Morvant, Juho Rousu, John S. Shawe-Taylor

We show that the usual score function for conditional Markov networks can be written as the expectation over the scores of their spanning trees.

Sequential Model-Based Ensemble Optimization

no code implementations • 4 Feb 2014 • Alexandre Lacoste, Hugo Larochelle, François Laviolette, Mario Marchand

One of the most tedious tasks in the application of machine learning is model selection, i.e., hyperparameter selection.

Model Selection

A Note on Improved Loss Bounds for Multiple Kernel Learning

no code implementations • 30 Jun 2011 • Zakria Hussain, John Shawe-Taylor, Mario Marchand

In this paper, we correct an upper bound, presented in~\cite{hs-11}, on the generalisation error of classifiers learned through multiple kernel learning.

From PAC-Bayes Bounds to KL Regularization

no code implementations • NeurIPS 2009 • Pascal Germain, Alexandre Lacasse, Mario Marchand, Sara Shanian, François Laviolette

We show that standard $\ell_p$-regularized objective functions currently used, such as ridge regression and $\ell_p$-regularized boosting, are obtained from a relaxation of the KL divergence between the quasi-uniform posterior and the uniform prior.

regression
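
A minimal illustration of the flavour of this connection, using Gaussian rather than the quasi-uniform distributions studied in the paper: take a posterior $Q_w = \mathcal{N}(w, I)$ and a prior $P = \mathcal{N}(0, I)$ in a PAC-Bayes bound of the usual "empirical risk plus KL" form. Since

$$\mathrm{KL}\bigl(\mathcal{N}(w, I)\,\|\,\mathcal{N}(0, I)\bigr) \;=\; \tfrac{1}{2}\,\|w\|_2^2,$$

minimizing $\hat{R}_S(Q_w) + \frac{1}{\lambda n}\,\mathrm{KL}(Q_w \| P)$ over $w$ amounts (up to the smoothing of the empirical risk by $Q_w$) to minimizing the ridge-style objective $\hat{R}_S(w) + \frac{1}{2\lambda n}\,\|w\|_2^2$.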
