no code implementations • 22 Dec 2023 • Stefano Peluchetti
The scope of this paper is generative modeling through diffusion processes.
1 code implementation • 3 Apr 2023 • Stefano Peluchetti
The dynamic Schrödinger bridge problem seeks a stochastic process that defines a transport between two target probability measures, while optimally satisfying the criteria of being closest, in terms of Kullback-Leibler divergence, to a reference process.
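In symbols (a standard formulation of the problem, not notation taken from this paper), with $Q$ the reference process and $\mu_0$, $\mu_1$ the two target measures, the dynamic Schrödinger bridge problem reads:

```latex
P^\star \;=\; \operatorname*{arg\,min}_{P \,:\, P_0 = \mu_0,\; P_1 = \mu_1} \; \mathrm{KL}\!\left(P \,\middle\|\, Q\right)
```

where the minimization runs over path measures $P$ whose time-$0$ and time-$1$ marginals match the targets.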
no code implementations • 16 Jun 2022 • Stefano Favaro, Sandra Fortini, Stefano Peluchetti
In contrast to the Gaussian setting, our result shows that the choice of activation function affects the scaling of the NN, that is: to achieve the infinitely wide $\alpha$-Stable process, the ReLU activation requires an additional logarithmic term in the scaling relative to sub-linear activations.
no code implementations • 2 Aug 2021 • Stefano Favaro, Sandra Fortini, Stefano Peluchetti
Then, we establish sup-norm convergence rates of the rescaled deep Stable NN to the Stable SP, under both a "joint growth" and a "sequential growth" of the width over the NN's layers.
no code implementations • 7 Jun 2021 • Stefano Massaroli, Michael Poli, Stefano Peluchetti, Jinkyoo Park, Atsushi Yamashita, Hajime Asama
We systematically develop a learning-based treatment of stochastic optimal control (SOC), relying on direct optimization of parametric control policies.
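The paper's method is not detailed in this excerpt; as a toy illustration of the general idea, the sketch below directly optimizes a parametric policy (here a single feedback gain, a hypothetical stand-in for a richer policy class) by Monte Carlo evaluation of a quadratic control cost on a simple 1-D SDE, with grid search standing in for gradient-based optimization:

```python
import numpy as np

def rollout_cost(gain, n_paths=2000, n_steps=100, dt=0.01, sigma=0.5):
    """Monte Carlo estimate of a quadratic control cost for the linear
    feedback policy u(x) = -gain * x on the SDE dX = u dt + sigma dW."""
    rng = np.random.default_rng(0)   # common random numbers across gains
    x = np.ones(n_paths)             # all paths start at x0 = 1
    cost = np.zeros(n_paths)
    for _ in range(n_steps):
        u = -gain * x
        cost += (x ** 2 + 0.1 * u ** 2) * dt   # running state + control cost
        x += u * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
    return cost.mean()

# Direct policy search: evaluate the parametric policy on a grid of gains
# and keep the cheapest one.
gains = np.linspace(0.0, 5.0, 26)
best_gain = min(gains, key=rollout_cost)
```

Any nonzero feedback gain should beat the uncontrolled policy here, since the running cost penalizes deviation of the state from zero.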
no code implementations • ICLR 2021 • Daniele Bracale, Stefano Favaro, Sandra Fortini, Stefano Peluchetti
In this paper, we consider fully connected feed-forward deep neural networks where weights and biases are independent and identically distributed according to Gaussian distributions.
no code implementations • 8 Feb 2021 • Emanuele Dolera, Stefano Favaro, Stefano Peluchetti
Under this more general framework, we apply the arguments of the "Bayesian" proof of the CMS-DP, suitably adapted to the PYP prior, in order to compute the posterior distribution of a point query, given the hashed data.
no code implementations • 7 Feb 2021 • Emanuele Dolera, Stefano Favaro, Stefano Peluchetti
The count-min sketch (CMS) is a randomized data structure that provides estimates of tokens' frequencies in a large data stream using a compressed representation of the data by random hashing.
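A minimal sketch of the classic CMS data structure (the generic algorithm, not this paper's Bayesian treatment): each token is hashed into one counter per row, updates increment those counters, and a frequency query returns the minimum counter across rows, which overestimates but never underestimates the true count.

```python
import hashlib

class CountMinSketch:
    """Count-min sketch: depth rows of width counters, one hash per row.
    query() returns an overestimate of a token's true frequency."""

    def __init__(self, width=512, depth=4):
        self.width, self.depth = width, depth
        self.table = [[0] * width for _ in range(depth)]

    def _index(self, row, token):
        # One independent-looking hash per row via a per-row salt.
        h = hashlib.blake2b(token.encode(), salt=row.to_bytes(8, "little"))
        return int.from_bytes(h.digest()[:8], "little") % self.width

    def update(self, token, count=1):
        for row in range(self.depth):
            self.table[row][self._index(row, token)] += count

    def query(self, token):
        return min(self.table[row][self._index(row, token)]
                   for row in range(self.depth))
```

Memory is `width * depth` counters regardless of stream length, which is the compression the abstract refers to.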
no code implementations • 7 Feb 2021 • Daniele Bracale, Stefano Favaro, Sandra Fortini, Stefano Peluchetti
The interplay between infinite-width neural networks (NNs) and classes of Gaussian processes (GPs) is well known since the seminal work of Neal (1996).
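The NN–GP interplay can be checked numerically; the following sketch (a generic illustration under the usual $1/\sqrt{\text{width}}$ readout scaling, not code from the paper) samples many random one-hidden-layer ReLU networks at a fixed input and confirms the output distribution is approximately centered Gaussian with the variance predicted by the limiting kernel, which for ReLU at a unit-norm input is $\lVert x\rVert^2/2 = 0.5$:

```python
import numpy as np

rng = np.random.default_rng(0)

def wide_relu_net(x, width, n_samples):
    """Outputs of n_samples independent random networks
    f(x) = width**-0.5 * v @ relu(W x) with i.i.d. N(0,1) parameters."""
    W = rng.standard_normal((n_samples, width, x.shape[0]))
    v = rng.standard_normal((n_samples, width))
    h = np.maximum(W @ x, 0.0)                  # hidden activations
    return (v * h).sum(axis=1) / np.sqrt(width)

x = np.array([1.0, 0.0])                        # unit-norm input
f = wide_relu_net(x, width=512, n_samples=4000)

# As width grows, f(x) across random networks approaches N(0, ||x||^2 / 2).
print(f.mean(), f.var())
```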
no code implementations • 7 Jul 2020 • Stefano Peluchetti, Stefano Favaro
Our results highlight a limited expressive power of doubly infinite ResNets when the unscaled network's parameters are i.i.d.
1 code implementation • 1 Mar 2020 • Stefano Favaro, Sandra Fortini, Stefano Peluchetti
We consider fully connected feed-forward deep neural networks (NNs) where weights and biases are independent and identically distributed as symmetric centered stable distributions.
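As a hedged illustration of the setting (assuming SciPy's `levy_stable` for sampling; the variable names and the single-hidden-layer architecture are illustrative, not the paper's), one can draw all parameters i.i.d. from a symmetric $\alpha$-stable law and normalize the readout by $\text{width}^{1/\alpha}$, the stable analogue of the Gaussian $\text{width}^{1/2}$ scaling:

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(0)

# Symmetric alpha-stable parameters (beta = 0); alpha = 2 recovers Gaussian.
alpha, width, d = 1.5, 1000, 3

W = levy_stable.rvs(alpha, 0.0, size=(width, d), random_state=rng)
b = levy_stable.rvs(alpha, 0.0, size=width, random_state=rng)
v = levy_stable.rvs(alpha, 0.0, size=width, random_state=rng)

x = np.ones(d)
h = np.tanh(W @ x + b)
# The 1/alpha exponent (rather than 1/2) is the stable-law normalization
# that keeps the readout non-degenerate as width grows.
f = (v @ h) / width ** (1.0 / alpha)
```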
no code implementations • 3 Oct 2019 • Tiago Ramalho, Thierry Sousbie, Stefano Peluchetti
Recent algorithms with state-of-the-art few-shot classification results start their procedure by computing data features output by a large pretrained model.
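The specific algorithm is not given in this excerpt; a common baseline of this kind (a generic nearest-centroid classifier over precomputed features, in the spirit of prototypical networks) can be sketched as:

```python
import numpy as np

def nearest_centroid_predict(support_feats, support_labels, query_feats):
    """Few-shot classification over precomputed features: average the
    support features per class into a centroid, then assign each query
    to the class of its nearest centroid (Euclidean distance)."""
    classes = np.unique(support_labels)
    centroids = np.stack([support_feats[support_labels == c].mean(axis=0)
                          for c in classes])
    dists = np.linalg.norm(query_feats[:, None, :] - centroids[None], axis=-1)
    return classes[dists.argmin(axis=1)]
```

The pretrained model only supplies `support_feats` and `query_feats`; no gradient steps are taken at test time.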
no code implementations • 27 May 2019 • Stefano Peluchetti, Stefano Favaro
When the parameters are independently and identically distributed (initialized), neural networks exhibit undesirable properties that emerge as the number of layers increases, e.g. a vanishing dependency on the input and a concentration on restrictive families of functions, including constant functions.
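The vanishing dependency on the input is easy to observe numerically; the sketch below (a generic demonstration, not the paper's construction) pushes two essentially orthogonal inputs through a deep i.i.d.-initialized ReLU network with He scaling and tracks how their hidden representations become nearly parallel:

```python
import numpy as np

rng = np.random.default_rng(0)

n, depth = 500, 50
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)   # a second, essentially orthogonal input

h1, h2 = x1, x2
for _ in range(depth):
    W = rng.standard_normal((n, n)) * np.sqrt(2.0 / n)  # He-scaled i.i.d.
    h1 = np.maximum(W @ h1, 0.0)
    h2 = np.maximum(W @ h2, 0.0)

cos = h1 @ h2 / (np.linalg.norm(h1) * np.linalg.norm(h2))
# The two distinct inputs map to nearly parallel representations:
print(f"cosine similarity after {depth} ReLU layers: {cos:.3f}")
```

The cosine similarity climbs toward 1 with depth, i.e. the network's output barely distinguishes the two inputs, which is the degeneracy the abstract describes.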