no code implementations • 28 Jul 2022 • Luca Becchetti, Arthur Carvalho Walraven da Cunha, Andrea Clementi, Francesco d'Amore, Hicham Lesfari, Emanuele Natale, Luca Trevisan
Given random variables $X_1, \dots, X_n$, we wish to approximate any point $z \in [-1, 1]$ as the sum of a suitable subset $X_{i_1(z)}, \dots, X_{i_s(z)}$ of them, up to error $\varepsilon$.
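As a minimal illustration of the problem statement, the sketch below draws $n$ samples and brute-forces a subset whose sum lands within $\varepsilon$ of a target $z$. The uniform distribution, the parameter values, and the exhaustive search are assumptions for illustration; the paper's contribution is an existence result, not this algorithm.

```python
import random

def approximate_by_subset(xs, z, eps):
    """Brute-force search for a subset of xs whose sum is within eps of z.

    Returns the chosen indices, or None if no subset qualifies.
    (Exponential-time illustration only, not the paper's method.)
    """
    n = len(xs)
    for mask in range(1 << n):
        s = sum(xs[i] for i in range(n) if mask & (1 << i))
        if abs(s - z) <= eps:
            return [i for i in range(n) if mask & (1 << i)]
    return None

random.seed(0)
n, eps = 16, 0.01
# i.i.d. uniform samples on [-1, 1] (an assumed distribution)
xs = [random.uniform(-1.0, 1.0) for _ in range(n)]
subset = approximate_by_subset(xs, z=0.5, eps=eps)
```

With $2^{16}$ candidate subsets and a narrow error window, a qualifying subset exists with overwhelming probability, which is the phenomenon the paper quantifies.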
1 code implementation • 15 Dec 2021 • Francesco d'Amore, Daniel Mitropolsky, Pierluigi Crescenzi, Emanuele Natale, Christos H. Papadimitriou
We revisit the planning problem in the blocks world, and we implement a known heuristic for this task.
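The specific heuristic is not given in this snippet; as a hedged stand-in, the sketch below implements the classic "misplaced blocks" heuristic, which counts blocks resting on the wrong support relative to the goal configuration. The state encoding and the heuristic choice are assumptions for illustration.

```python
def misplaced_blocks(state, goal):
    """Count blocks whose supporting object differs from the goal.

    States map each block to what it rests on ("table" or another block).
    A classic blocks-world heuristic, used here only as an illustrative
    stand-in for the heuristic implemented in the paper.
    """
    return sum(1 for block, below in state.items() if goal.get(block) != below)

# Goal: A on B, B on C, C on the table. Current: C sits on A instead.
state = {"A": "table", "B": "table", "C": "A"}
goal = {"A": "B", "B": "C", "C": "table"}
h = misplaced_blocks(state, goal)  # all three blocks are misplaced
```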
no code implementations • ICLR 2022 • Arthur da Cunha, Emanuele Natale, Laurent Viennot
The lottery ticket hypothesis states that a randomly initialized neural network contains a small subnetwork that, when trained in isolation, can compete with the performance of the original network.
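For context, the pruning step behind typical lottery-ticket experiments can be sketched as one-shot magnitude pruning: keep only the largest-magnitude weights. This is an illustrative standard technique, not the construction studied in the paper.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Return a boolean mask keeping the largest-magnitude weights.

    Zeroes out the smallest-magnitude `sparsity` fraction; a standard
    one-shot pruning step (illustrative, not the authors' construction).
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return np.ones(weights.shape, dtype=bool)
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    return np.abs(weights) > threshold

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
mask = magnitude_prune(w, sparsity=0.75)  # 4 of 16 weights survive
```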
no code implementations • 24 Oct 2019 • Hossein Baktash, Emanuele Natale, Laurent Viennot
There has recently been growing interest in evaluating neural networks locally on computationally limited devices, in order to exploit their effectiveness across many applications. That effectiveness, however, has come with a considerable increase in the size of modern neural networks, which is a major drawback in such computationally limited settings.
1 code implementation • 26 Nov 2018 • Luca Becchetti, Andrea Clementi, Emanuele Natale, Francesco Pasquale, Luca Trevisan
It follows from the Marcus-Spielman-Srivastava proof of the Kadison-Singer conjecture that if $G=(V, E)$ is a $\Delta$-regular dense expander then there is an edge-induced subgraph $H=(V, E_H)$ of $G$ of constant maximum degree which is also an expander.
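The claim can be illustrated numerically: a dense expander (here the complete graph $K_n$, an $(n-1)$-regular expander) retains a positive spectral gap after keeping only a sparse random subset of its edges. The randomized sampling below is an assumption for illustration; it does not bound the maximum degree and is not the Marcus-Spielman-Srivastava-based existence argument.

```python
import numpy as np

def spectral_gap(adj):
    """Return 1 - lambda_2 of the normalized adjacency D^{-1/2} A D^{-1/2}."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(deg, 1))
    m = adj * np.outer(d_inv_sqrt, d_inv_sqrt)
    eig = np.sort(np.linalg.eigvalsh(m))
    return 1.0 - eig[-2]

rng = np.random.default_rng(1)
n = 60
G = np.ones((n, n)) - np.eye(n)        # K_n: a dense (n-1)-regular expander
keep = np.triu(rng.random((n, n)) < 10.0 / n, 1)
H = (keep + keep.T) * G                # random edge-induced subgraph, ~10 avg degree
```

For $K_n$ the normalized second eigenvalue is $-1/(n-1)$, so its gap slightly exceeds $1$; the sparse subgraph keeps a clearly positive gap despite having a constant average degree.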
Distributed, Parallel, and Cluster Computing