Search Results for author: Stefano Favaro

Found 25 papers, 4 papers with code

Improved prediction of future user activity in online A/B testing

no code implementations • 5 Feb 2024 • Lorenzo Masoero, Mario Beraha, Thomas Richardson, Stefano Favaro

In online randomized experiments or A/B tests, accurate predictions of participant inclusion rates are of paramount importance.

A Nonparametric Bayes Approach to Online Activity Prediction

no code implementations • 26 Jan 2024 • Mario Beraha, Lorenzo Masoero, Stefano Favaro, Thomas S. Richardson

We derive closed-form expressions for the number of new users expected in a given period, and a simple Monte Carlo algorithm targeting the posterior distribution of the number of days needed to attain a desired number of users; the latter is important for experimental planning.

Activity Prediction
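The Monte Carlo idea in the abstract above can be illustrated with a much simpler stand-in. Assuming, purely for illustration and not as the paper's nonparametric model, that new users arrive as a homogeneous Poisson process with a known daily rate, repeated simulation approximates the distribution of the number of days needed to reach a target user count:

```python
import numpy as np

def days_to_target(rate, target, n_sims=10_000, rng=None):
    """Monte Carlo draws of the number of days needed to accumulate
    `target` new users, assuming new users arrive as Poisson(rate) per day.
    Illustrative only: the paper targets the posterior distribution under a
    Bayesian nonparametric model, not a fixed known rate."""
    rng = np.random.default_rng(rng)
    draws = np.empty(n_sims, dtype=int)
    for i in range(n_sims):
        total, days = 0, 0
        while total < target:
            total += rng.poisson(rate)  # new users arriving on this day
            days += 1
        draws[i] = days
    return draws
```

Quantiles of the returned draws then answer planning questions such as "how many days until we can be 95% sure of having enough users?".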

Frequency and cardinality recovery from sketched data: a novel approach bridging Bayesian and frequentist views

no code implementations • 27 Sep 2023 • Mario Beraha, Stefano Favaro, Matteo Sesia

We study how to recover the frequency of a symbol in a large discrete data set, using only a compressed representation, or sketch, of those data obtained via random hashing.

Quantitative CLTs in Deep Neural Networks

no code implementations • 12 Jul 2023 • Stefano Favaro, Boris Hanin, Domenico Marinucci, Ivan Nourdin, Giovanni Peccati

We study the distribution of a fully connected neural network with random Gaussian weights and biases in which the hidden layer widths are proportional to a large constant $n$.
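The random networks studied here can be sampled directly. The sketch below draws one forward pass of a fully connected network with i.i.d. Gaussian weights and biases, with weight variance scaled by 1/fan-in so pre-activations stay O(1) as widths grow; it illustrates the wide-network regime (hidden widths proportional to a large n), not the paper's exact normalization:

```python
import numpy as np

def random_gaussian_net(x, widths, rng=None):
    """One forward pass of a fully connected network whose weights and
    biases are i.i.d. Gaussian.  Weight std 1/sqrt(fan_in) keeps the
    pre-activations at a width-independent scale; as the widths grow,
    the output distribution approaches a Gaussian limit."""
    rng = np.random.default_rng(rng)
    h = np.asarray(x, dtype=float)
    for width in widths:
        W = rng.normal(0.0, 1.0 / np.sqrt(h.shape[0]), size=(width, h.shape[0]))
        b = rng.normal(0.0, 1.0, size=width)
        h = np.tanh(W @ h + b)  # nonlinearity chosen for illustration
    return h
```

Drawing many such networks at increasing widths gives an empirical view of the central-limit behavior that the paper quantifies with explicit rates.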


Infinitely wide limits for deep Stable neural networks: sub-linear, linear and super-linear activation functions

no code implementations • 8 Apr 2023 • Alberto Bordino, Stefano Favaro, Sandra Fortini

As a novelty with respect to previous works, our results rely on the use of a generalized central limit theorem for heavy-tailed distributions, which allows for an interesting unified treatment of infinitely wide limits for deep Stable NNs.

Non-asymptotic approximations of Gaussian neural networks via second-order Poincaré inequalities

no code implementations • 8 Apr 2023 • Alberto Bordino, Stefano Favaro, Sandra Fortini

There is a growing interest in large-width asymptotic properties of Gaussian neural networks (NNs), namely NNs whose weights are initialized according to Gaussian distributions.

Conformal Frequency Estimation using Discrete Sketched Data with Coverage for Distinct Queries

1 code implementation • 9 Nov 2022 • Matteo Sesia, Stefano Favaro, Edgar Dobriban

This paper develops conformal inference methods to construct a confidence interval for the frequency of a queried object in a very large discrete data set, based on a sketch with a lower memory footprint.


Bayesian nonparametric estimation of coverage probabilities and distinct counts from sketched data

no code implementations • 5 Sep 2022 • Stefano Favaro, Matteo Sesia

The estimation of coverage probabilities, and in particular of the missing mass, is a classical statistical problem with applications in numerous scientific fields.

Data Compression

Large-width asymptotics for ReLU neural networks with $α$-Stable initializations

no code implementations • 16 Jun 2022 • Stefano Favaro, Sandra Fortini, Stefano Peluchetti

As a difference with respect to the Gaussian setting, our result shows that the choice of the activation function affects the scaling of the NN, that is: to achieve the infinitely wide $\alpha$-Stable process, the ReLU activation requires an additional logarithmic term in the scaling with respect to sub-linear activations.


Conformal Frequency Estimation with Sketched Data

1 code implementation • 8 Apr 2022 • Matteo Sesia, Stefano Favaro

A flexible conformal inference method is developed to construct confidence intervals for the frequencies of queried objects in very large data sets, based on a much smaller sketch of those data.
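The basic conformal construction behind abstracts like this one can be sketched generically. The toy below is plain split-conformal inference with absolute-error scores on a calibration set of (estimate, true frequency) pairs; it is not the sketch-specific method developed in the paper, only an illustration of how conformal calibration turns point estimates into finite-sample-valid intervals:

```python
import numpy as np

def conformal_interval(cal_est, cal_true, query_est, alpha=0.1):
    """Split-conformal interval for a new frequency estimate.
    Scores are absolute calibration errors; the interval is the estimate
    plus/minus the conformal quantile of those scores, giving marginal
    coverage of at least 1 - alpha under exchangeability."""
    scores = np.abs(np.asarray(cal_est, float) - np.asarray(cal_true, float))
    n = len(scores)
    # conformal quantile index with the (n + 1) finite-sample correction
    k = int(np.ceil((n + 1) * (1 - alpha)))
    q = np.sort(scores)[min(k, n) - 1]
    return query_est - q, query_est + q
```

With 100 calibration pairs and alpha = 0.1, the interval half-width is the 91st smallest calibration error, which is what guarantees the coverage level.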


Strong posterior contraction rates via Wasserstein dynamics

no code implementations • 21 Mar 2022 • Emanuele Dolera, Stefano Favaro, Edoardo Mainini

In Bayesian statistics, posterior contraction rates (PCRs) quantify the speed at which the posterior distribution concentrates on arbitrarily small neighborhoods of a true model, in a suitable way, as the sample size goes to infinity.

Density Estimation

The power of private likelihood-ratio tests for goodness-of-fit in frequency tables

no code implementations • 20 Sep 2021 • Emanuele Dolera, Stefano Favaro

This is obtained through a Bahadur-Rao large deviation expansion for the power of the private LR test, bringing out a critical quantity, as a function of the sample size, the dimension of the table and $(\varepsilon,\delta)$, that determines a loss in the power of the test.

Deep Stable neural networks: large-width asymptotics and convergence rates

no code implementations • 2 Aug 2021 • Stefano Favaro, Sandra Fortini, Stefano Peluchetti

Then, we establish sup-norm convergence rates of the rescaled deep Stable NN to the Stable SP, under a "joint growth" and a "sequential growth" of the width over the NN's layers.

Bayesian Inference

Large-width functional asymptotics for deep Gaussian neural networks

no code implementations • ICLR 2021 • Daniele Bracale, Stefano Favaro, Sandra Fortini, Stefano Peluchetti

In this paper, we consider fully connected feed-forward deep neural networks where weights and biases are independent and identically distributed according to Gaussian distributions.

Gaussian Processes

Learning-augmented count-min sketches via Bayesian nonparametrics

no code implementations • 8 Feb 2021 • Emanuele Dolera, Stefano Favaro, Stefano Peluchetti

Under this more general framework, we apply the arguments of the "Bayesian" proof of the CMS-DP, suitably adapted to the PYP prior, in order to compute the posterior distribution of a point query, given the hashed data.

A Bayesian nonparametric approach to count-min sketch under power-law data streams

no code implementations • 7 Feb 2021 • Emanuele Dolera, Stefano Favaro, Stefano Peluchetti

The count-min sketch (CMS) is a randomized data structure that provides estimates of tokens' frequencies in a large data stream using a compressed representation of the data by random hashing.
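The data structure described above is a textbook one and easy to reproduce. The minimal version below keeps d hash rows of width w and estimates a token's frequency as the minimum of the d counters it hashes to, which upper-bounds the true count; it is the classical baseline the paper builds its Bayesian analysis on, not the paper's estimator itself:

```python
import numpy as np

class CountMinSketch:
    """Minimal count-min sketch.  Each of `depth` rows hashes a token to
    one of `width` counters; querying returns the minimum counter across
    rows, an upper bound on the true frequency."""

    def __init__(self, width=1024, depth=4, seed=0):
        self.width, self.depth = width, depth
        self.table = np.zeros((depth, width), dtype=np.int64)
        self.seeds = range(seed, seed + depth)

    def _index(self, token, seed):
        # one cheap hash per row, differentiated by the row seed
        return hash((seed, token)) % self.width

    def add(self, token, count=1):
        for row, seed in enumerate(self.seeds):
            self.table[row, self._index(token, seed)] += count

    def query(self, token):
        return min(self.table[row, self._index(token, seed)]
                   for row, seed in enumerate(self.seeds))
```

Collisions only ever inflate counters, which is why the minimum across rows is a valid upper bound; the Bayesian treatment in the paper replaces this worst-case bound with a posterior over the true frequency.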

Infinite-channel deep stable convolutional neural networks

no code implementations • 7 Feb 2021 • Daniele Bracale, Stefano Favaro, Sandra Fortini, Stefano Peluchetti

The interplay between infinite-width neural networks (NNs) and classes of Gaussian processes (GPs) has been well known since the seminal work of Neal (1996).

Gaussian Processes

Doubly infinite residual neural networks: a diffusion process approach

no code implementations • 7 Jul 2020 • Stefano Peluchetti, Stefano Favaro

Our results highlight a limited expressive power of doubly infinite ResNets when the unscaled network's parameters are i.i.d.

Gaussian Processes

Stable behaviour of infinitely wide deep neural networks

1 code implementation • 1 Mar 2020 • Stefano Favaro, Sandra Fortini, Stefano Peluchetti

We consider fully connected feed-forward deep neural networks (NNs) where weights and biases are independent and identically distributed as symmetric centered stable distributions.

Gaussian Processes
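Sampling the symmetric stable initializations studied above is standard via the Chambers-Mallows-Stuck method. The sketch below draws symmetric alpha-stable variates in pure NumPy; it illustrates the kind of heavy-tailed initialization the paper considers (alpha = 2 recovers a Gaussian up to scaling, alpha = 1 the Cauchy), not the paper's network construction:

```python
import numpy as np

def symmetric_stable(alpha, size, rng=None):
    """Chambers-Mallows-Stuck sampler for symmetric alpha-stable variates:
    with U ~ Uniform(-pi/2, pi/2) and W ~ Exp(1),
        X = sin(alpha*U) / cos(U)**(1/alpha)
            * (cos((1 - alpha)*U) / W) ** ((1 - alpha) / alpha)
    is symmetric alpha-stable (0 < alpha <= 2)."""
    rng = np.random.default_rng(rng)
    U = rng.uniform(-np.pi / 2, np.pi / 2, size)
    W = rng.exponential(1.0, size)
    return (np.sin(alpha * U) / np.cos(U) ** (1 / alpha)
            * (np.cos((1 - alpha) * U) / W) ** ((1 - alpha) / alpha))
```

For alpha = 1 the second factor degenerates to 1 and the formula reduces to tan(U), a standard Cauchy draw, which is a quick sanity check on the implementation.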

Genomic variety prediction via Bayesian nonparametrics

no code implementations • AABI Symposium 2019 • Lorenzo Masoero, Federico Camerlenghi, Stefano Favaro, Tamara Broderick

We consider the case where scientists have already conducted a pilot study to reveal some variants in a genome and are contemplating a follow-up study.

Experimental Design

Nonparametric Bayesian multi-armed bandits for single cell experiment design

1 code implementation • 11 Oct 2019 • Federico Camerlenghi, Bianca Dumitrascu, Federico Ferrari, Barbara E. Engelhardt, Stefano Favaro

The problem of maximizing cell type discovery under budget constraints is a fundamental challenge for the collection and analysis of single-cell RNA-sequencing (scRNA-seq) data.


Infinitely deep neural networks as diffusion processes

no code implementations • 27 May 2019 • Stefano Peluchetti, Stefano Favaro

When the parameters are independently and identically distributed at initialization, neural networks exhibit undesirable properties that emerge as the number of layers increases, e.g. a vanishing dependency on the input and a concentration on restrictive families of functions, including constant functions.

On consistent estimation of the missing mass

no code implementations • 25 Jun 2018 • Fadhel Ayed, Marco Battiston, Federico Camerlenghi, Stefano Favaro

Given $n$ samples from a population of individuals belonging to different types with unknown proportions, how do we estimate the probability of discovering a new type at the $(n+1)$-th draw?

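The quantity in the question above is the missing mass, and its classical estimator is Good-Turing: the fraction of types observed exactly once. A minimal version, shown here as the standard baseline whose consistency properties are the subject of the paper:

```python
from collections import Counter

def good_turing_missing_mass(samples):
    """Good-Turing estimate of the missing mass: n1 / n, where n1 is the
    number of types seen exactly once among n samples.  Estimates the
    probability that the (n+1)-th draw is a previously unseen type."""
    counts = Counter(samples)
    n = len(samples)
    n1 = sum(1 for c in counts.values() if c == 1)
    return n1 / n
```

For example, in the sample a, a, b, b, c only "c" appears once, so the estimated probability of seeing a new type next is 1/5.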

A characterization of product-form exchangeable feature probability functions

no code implementations • 7 Jul 2016 • Marco Battiston, Stefano Favaro, Daniel M. Roy, Yee Whye Teh

We characterize the class of exchangeable feature allocations assigning probability $V_{n, k}\prod_{l=1}^{k}W_{m_{l}}U_{n-m_{l}}$ to a feature allocation of $n$ individuals, displaying $k$ features with counts $(m_{1},\ldots, m_{k})$ for these features.

A marginal sampler for $σ$-Stable Poisson-Kingman mixture models

no code implementations • 16 Jul 2014 • María Lomelí, Stefano Favaro, Yee Whye Teh

We investigate the class of $\sigma$-stable Poisson-Kingman random probability measures (RPMs) in the context of Bayesian nonparametric mixture modeling.

Clustering • Density Estimation
