Search Results for author: Giulio Franzese

Found 15 papers, 6 papers with code

RFMI: Estimating Mutual Information on Rectified Flow for Text-to-Image Alignment

no code implementations • 18 Mar 2025 • Chao Wang, Giulio Franzese, Alessandro Finamore, Pietro Michiardi

First, we introduce RFMI, a novel Mutual Information (MI) estimator for RF models that uses the pre-trained model itself for the MI estimation.

Attribute

INFO-SEDD: Continuous Time Markov Chains as Scalable Information Metrics Estimators

no code implementations • 26 Feb 2025 • Alberto Foresti, Giulio Franzese, Pietro Michiardi

In this work, we introduce INFO-SEDD, a novel method for estimating information-theoretic quantities of discrete data, including mutual information and entropy.

In Praise of Stubbornness: An Empirical Case for Cognitive-Dissonance Aware Continual Update of Knowledge in LLMs

1 code implementation • 5 Feb 2025 • Simone Clemente, Zied Ben Houidi, Alexis Huet, Dario Rossi, Giulio Franzese, Pietro Michiardi

Unlike humans, who naturally resist contradictory information, these models indiscriminately accept contradictions, leading to devastating interference, destroying up to 80% of unrelated knowledge even when learning as few as 10-100 contradicting facts.

Latent Abstractions in Generative Diffusion Models

no code implementations • 4 Oct 2024 • Giulio Franzese, Mattia Martini, Giulio Corallo, Paolo Papotti, Pietro Michiardi

In this work, we study how diffusion-based generative models produce high-dimensional data, such as images, by implicitly relying on a low-dimensional set of latent abstractions that guide the generative process.

Information Theoretic Text-to-Image Alignment

no code implementations • 31 May 2024 • Chao Wang, Giulio Franzese, Alessandro Finamore, Massimo Gallo, Pietro Michiardi

Diffusion models for Text-to-Image (T2I) conditional generation have recently achieved tremendous success.

Denoising, Image Generation

SΩI: Score-based O-INFORMATION Estimation

1 code implementation • 8 Feb 2024 • Mustapha Bounoua, Giulio Franzese, Pietro Michiardi

The analysis of scientific data and complex multivariate systems requires information quantities that capture relationships among multiple random variables.
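
For reference, and independent of this paper's notation, the O-information of n random variables is a standard quantity from the multivariate information-theory literature:

\[
\Omega(X^n) \;=\; (n-2)\,H(X^n) \;+\; \sum_{i=1}^{n}\Big[H(X_i) - H\big(X^n_{\setminus i}\big)\Big],
\]

where H is the (differential) entropy and X^n_{\setminus i} denotes all variables except X_i; a positive value indicates redundancy-dominated interactions, a negative value synergy-dominated ones. The score-based estimator proposed in the paper is not reproduced here.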

MINDE: Mutual Information Neural Diffusion Estimation

1 code implementation • 13 Oct 2023 • Giulio Franzese, Mustapha Bounoua, Pietro Michiardi

In this work we present a new method for the estimation of Mutual Information (MI) between random variables.

Mutual Information Estimation
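
As background for the MINDE entry above (a schematic sketch, not the paper's exact derivation), diffusion-based MI estimation typically combines the definition of MI as a KL divergence with a score-based representation of the KL between two processes sharing the same forward diffusion:

\[
I(X;Y) \;=\; D_{\mathrm{KL}}\big(p(x,y)\,\|\,p(x)\,p(y)\big)
\;\approx\; \frac{1}{2}\int_0^T g(t)^2\,\mathbb{E}\Big[\big\|s_\theta(x_t, t \mid y) - s_\theta(x_t, t)\big\|^2\Big]\,\mathrm{d}t,
\]

where s_\theta(\cdot, t \mid y) and s_\theta(\cdot, t) are conditional and unconditional score networks and g(t) is the diffusion coefficient; the precise weighting and conditioning variant used by MINDE may differ.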

Multi-modal Latent Diffusion

1 code implementation • 7 Jun 2023 • Mustapha Bounoua, Giulio Franzese, Pietro Michiardi

Multi-modal data-sets are ubiquitous in modern applications, and multi-modal Variational Autoencoders are a popular family of models that aim to learn a joint representation of the different modalities.

Multimodal Generation

One-Line-of-Code Data Mollification Improves Optimization of Likelihood-based Generative Models

1 code implementation • NeurIPS 2023 • Ba-Hien Tran, Giulio Franzese, Pietro Michiardi, Maurizio Filippone

Generative Models (GMs) have attracted considerable attention due to their tremendous success in various domains, such as computer vision, where they are capable of generating impressively realistic-looking images.

Density Estimation
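
To illustrate the idea in the title (a minimal sketch with an assumed linear annealing schedule, not the paper's exact recipe), data mollification perturbs each training batch with Gaussian noise whose magnitude decays to zero over the course of training:

import torch

def mollify(x: torch.Tensor, step: int, total_steps: int) -> torch.Tensor:
    # Annealed Gaussian perturbation of a training batch.
    # The linear schedule below is an assumption for illustration only.
    sigma = max(0.0, 1.0 - step / total_steps)  # noise level, annealed to 0
    return x + sigma * torch.randn_like(x)

# Inside an existing training loop the change is essentially one line:
#   x = mollify(x, step, total_steps)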

Continuous-Time Functional Diffusion Processes

1 code implementation • NeurIPS 2023 • Giulio Franzese, Giulio Corallo, Simone Rossi, Markus Heinonen, Maurizio Filippone, Pietro Michiardi

We introduce Functional Diffusion Processes (FDPs), which generalize score-based diffusion models to infinite-dimensional function spaces.

Image Generation

How Much is Enough? A Study on Diffusion Times in Score-based Generative Models

no code implementations • 10 Jun 2022 • Giulio Franzese, Simone Rossi, Lixuan Yang, Alessandro Finamore, Dario Rossi, Maurizio Filippone, Pietro Michiardi

Score-based diffusion models are a class of generative models whose dynamics are described by stochastic differential equations that map noise into data.

Computational Efficiency
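
For context (standard notation, not specific to this paper), the forward dynamics of a score-based diffusion model run for a diffusion time T are

\[
\mathrm{d}x_t \;=\; f(x_t, t)\,\mathrm{d}t \;+\; g(t)\,\mathrm{d}W_t, \qquad t \in [0, T],
\]

with drift f, diffusion coefficient g, and Wiener process W_t. The "how much is enough" question of the title concerns how large T must be for the terminal distribution to serve as a usable noise prior, which is where the computational-efficiency trade-off arises.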

Revisiting the Effects of Stochasticity for Hamiltonian Samplers

no code implementations • 30 Jun 2021 • Giulio Franzese, Dimitrios Milios, Maurizio Filippone, Pietro Michiardi

We revisit the theoretical properties of Hamiltonian stochastic differential equations (SDEs) for Bayesian posterior sampling, and we study the two types of errors that arise from numerical SDE simulation: the discretization error and the error due to noisy gradient estimates in the context of data subsampling.

Numerical Integration
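
As background (one common parameterization; the paper's conventions may differ), a Hamiltonian SDE for sampling from a posterior with potential U(\theta) augments the parameters \theta with a momentum r:

\[
\mathrm{d}\theta_t = M^{-1} r_t\,\mathrm{d}t, \qquad
\mathrm{d}r_t = -\nabla U(\theta_t)\,\mathrm{d}t \;-\; \gamma M^{-1} r_t\,\mathrm{d}t \;+\; \sqrt{2\gamma}\,\mathrm{d}W_t,
\]

with mass matrix M and friction \gamma. Discretizing this SDE introduces the first error source mentioned in the abstract, and replacing \nabla U with a minibatch estimate introduces the second.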

Isotropic SGD: a Practical Approach to Bayesian Posterior Sampling

no code implementations • 9 Jun 2020 • Giulio Franzese, Rosa Candela, Dimitrios Milios, Maurizio Filippone, Pietro Michiardi

In this work we define a unified mathematical framework to deepen our understanding of the role of stochastic gradient (SG) noise in the behavior of stochastic gradient Markov chain Monte Carlo (SGMCMC) algorithms.
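
For orientation (the textbook SGMCMC baseline, not the Isotropic SGD scheme itself), stochastic gradient Langevin dynamics updates the parameters with a minibatch gradient estimate \nabla\tilde{U} plus injected Gaussian noise:

\[
\theta_{k+1} \;=\; \theta_k \;-\; \eta\,\nabla\tilde{U}(\theta_k) \;+\; \sqrt{2\eta}\,\xi_k, \qquad \xi_k \sim \mathcal{N}(0, I),
\]

and the question studied in this line of work is how the noise already present in \nabla\tilde{U} shapes the distribution such samplers actually reach.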

Sparsification as a Remedy for Staleness in Distributed Asynchronous SGD

no code implementations • 21 Oct 2019 • Rosa Candela, Giulio Franzese, Maurizio Filippone, Pietro Michiardi

Large scale machine learning is increasingly relying on distributed optimization, whereby several machines contribute to the training process of a statistical model.

Distributed Optimization
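
As a concrete illustration (top-k sparsification is one standard operator; the scheme analyzed in the paper may differ), a worker in distributed asynchronous SGD can transmit only the largest-magnitude gradient entries, shrinking the messages whose delayed arrival causes staleness:

import torch

def topk_sparsify(grad: torch.Tensor, k: int):
    # Keep only the k largest-magnitude entries of a flattened gradient and
    # return (indices, values), i.e. the sparse message a worker would send
    # instead of the dense gradient.
    flat = grad.flatten()
    _, idx = torch.topk(flat.abs(), k)
    return idx, flat[idx]

# Hypothetical usage inside a worker step (send_to_server is assumed, not a real API):
#   idx, vals = topk_sparsify(loss_grad, k=1024)
#   send_to_server(idx, vals)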

Deep Information Networks

no code implementations • 6 Mar 2018 • Giulio Franzese, Monica Visintin

We describe a novel classifier with a tree structure, designed using information theory concepts.
