1 code implementation • 17 Sep 2024 • Raffaele Marino
In this manuscript, I present an analysis of the performance of the OpenAI O1-preview model in solving random K-SAT instances for $K \in \{2, 3, 4\}$ as a function of $\alpha = M/N$, where $M$ is the number of clauses and $N$ is the number of variables of the satisfiability problem.
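For concreteness, a minimal sketch of how a random K-SAT instance at a given clause density $\alpha = M/N$ can be generated; the variable count, density, and seed below are illustrative, not the paper's experimental settings:

```python
import random

def random_ksat_instance(n_vars, alpha, k, seed=0):
    """Generate a random K-SAT formula with M = alpha * N clauses.

    Each clause picks K distinct variables uniformly at random and
    negates each literal with probability 1/2. Literals are encoded
    as +v / -v for variable v in 1..n_vars (DIMACS-style).
    """
    rng = random.Random(seed)
    n_clauses = int(alpha * n_vars)
    formula = []
    for _ in range(n_clauses):
        variables = rng.sample(range(1, n_vars + 1), k)
        clause = [v if rng.random() < 0.5 else -v for v in variables]
        formula.append(clause)
    return formula

# Example: a random 3-SAT instance near the known satisfiability threshold alpha ~ 4.27
instance = random_ksat_instance(n_vars=100, alpha=4.2, k=3)
print(len(instance), "clauses, first clause:", instance[0])
```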
1 code implementation • 24 Jun 2024 • Raffaele Marino, Lorenzo Buffoni, Lorenzo Chicchi, Francesca Di Patti, Diego Febbe, Lorenzo Giambagli, Duccio Fanelli
The Wilson-Cowan model for metapopulation, a Neural Mass Network Model, treats different subcortical regions of the brain as connected nodes, with connections representing various types of structural, functional, or effective neuronal connectivity between these regions.
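As a hedged illustration of that setup, a toy integration of Wilson-Cowan excitatory/inhibitory dynamics coupled over a network of regions; the coupling matrix, gains, and time constants are placeholders rather than the parameters used in the paper:

```python
import numpy as np

def wilson_cowan_network(W, steps=2000, dt=0.01, tau_e=1.0, tau_i=1.0):
    """Integrate Wilson-Cowan excitatory/inhibitory dynamics on a network.

    W is an (n, n) coupling matrix between regions (nodes). Each node
    carries an excitatory (E) and inhibitory (I) population; excitatory
    activity is exchanged along the network links.
    """
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
    n = W.shape[0]
    E = np.full(n, 0.1)
    I = np.full(n, 0.1)
    for _ in range(steps):
        coupling = W @ E                      # input from connected regions
        dE = (-E + sigmoid(12 * E - 10 * I + coupling)) / tau_e
        dI = (-I + sigmoid(10 * E - 2 * I)) / tau_i
        E, I = E + dt * dE, I + dt * dI       # forward Euler step
    return E, I

# Example: a small random symmetric connectome of 5 regions
rng = np.random.default_rng(0)
W = rng.uniform(0, 0.5, size=(5, 5)); W = (W + W.T) / 2; np.fill_diagonal(W, 0)
print(wilson_cowan_network(W)[0])
```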
no code implementations • 3 Jun 2024 • Lorenzo Chicchi, Lorenzo Buffoni, Diego Febbe, Lorenzo Giambagli, Raffaele Marino, Duccio Fanelli
Working with high-dimensional data is common practice in the field of machine learning.
no code implementations • 13 Mar 2024 • Carlo Nicolini, Jacopo Staiano, Bruno Lepri, Raffaele Marino
A substantial gap persists in understanding the reasons behind the exceptional performance of the Transformer architecture in NLP.
no code implementations • 13 Mar 2024 • Raffaele Marino, Lorenzo Buffoni, Bogdan Zavalnij
This manuscript provides a comprehensive review of the Maximum Clique Problem, a computational problem that involves finding subsets of vertices in a graph that are all pairwise adjacent.
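To make the definition concrete, a small sketch: checking that a vertex subset is a clique (every pair adjacent) and growing one greedily. The adjacency structure and the degree-ordered heuristic are illustrative, not the algorithms reviewed in the manuscript:

```python
from itertools import combinations

def is_clique(adj, subset):
    """Return True if every pair of vertices in `subset` is adjacent.

    `adj` maps each vertex to the set of its neighbours.
    """
    return all(v in adj[u] for u, v in combinations(subset, 2))

def greedy_clique(adj):
    """Grow a clique greedily by highest degree (a heuristic, not exact)."""
    clique = []
    for v in sorted(adj, key=lambda u: len(adj[u]), reverse=True):
        if all(v in adj[u] for u in clique):
            clique.append(v)
    return clique

# Example: a 5-vertex graph containing the triangle {0, 1, 2}
adj = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1}, 3: {1, 4}, 4: {3}}
print(is_clique(adj, [0, 1, 2]), greedy_clique(adj))
```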
no code implementations • 22 Dec 2023 • Raffaele Marino, Lorenzo Buffoni, Lorenzo Chicchi, Lorenzo Giambagli, Duccio Fanelli
EODECA (Engineered Ordinary Differential Equations as Classification Algorithm) is a novel approach at the intersection of machine learning and dynamical systems theory, presenting a unique framework for classification tasks [1].
no code implementations • 12 Dec 2023 • Lorenzo Chicchi, Lorenzo Giambagli, Lorenzo Buffoni, Raffaele Marino, Duccio Fanelli
This paper presents a novel approach to advancing artificial intelligence (AI) through the development of the Complex Recurrent Spectral Network ($\mathbb{C}$-RSN), an innovative variant of the Recurrent Spectral Network (RSN) model.
1 code implementation • 17 Nov 2023 • Raffaele Marino, Lorenzo Giambagli, Lorenzo Chicchi, Lorenzo Buffoni, Duccio Fanelli
A novel approach to supervised classification is presented, which sits at the intersection of machine learning and dynamical systems theory.
1 code implementation • 11 Sep 2023 • Maria Chiara Angelini, Angelo Giorgio Cavaliere, Raffaele Marino, Federico Ricci-Tersenghi
Is Stochastic Gradient Descent (SGD) substantially different from Metropolis Monte Carlo dynamics?
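A toy side-by-side of the two dynamics the question contrasts: noisy gradient descent versus a Metropolis accept/reject rule on the same loss landscape. The quadratic loss, step sizes, and temperature are illustrative choices, not the paper's setting:

```python
import numpy as np

rng = np.random.default_rng(0)
loss = lambda x: 0.5 * np.sum(x**2)          # toy loss landscape
grad = lambda x: x                           # its exact gradient

def sgd(x, lr=0.05, noise=0.1, steps=1000):
    """Noisy gradient descent: gradient step plus Gaussian 'mini-batch' noise."""
    for _ in range(steps):
        x = x - lr * (grad(x) + noise * rng.standard_normal(x.shape))
    return x

def metropolis(x, beta=10.0, step=0.1, steps=1000):
    """Metropolis Monte Carlo: propose a move, accept with prob min(1, e^{-beta dE})."""
    for _ in range(steps):
        proposal = x + step * rng.standard_normal(x.shape)
        delta = loss(proposal) - loss(x)
        if delta < 0 or rng.random() < np.exp(-beta * delta):
            x = proposal
    return x

x0 = rng.standard_normal(10)
print("SGD loss:", loss(sgd(x0.copy())), " Metropolis loss:", loss(metropolis(x0.copy())))
```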
no code implementations • 10 May 2023 • Raffaele Marino, Federico Ricci-Tersenghi
This work presents a systematic attempt at understanding the role of the mini-batch size in training two-layer neural networks.
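A minimal sketch of the experiment's basic ingredient: a two-layer network trained with mini-batch SGD at several batch sizes on synthetic data. The architecture, learning rate, and data are placeholders, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 20))                      # synthetic inputs
y = (X @ rng.standard_normal(20) > 0).astype(float)      # synthetic binary labels

def train_two_layer(X, y, batch_size, hidden=32, lr=0.1, epochs=20):
    """Train a two-layer (one hidden layer) network with mini-batch SGD."""
    n, d = X.shape
    W1 = rng.standard_normal((d, hidden)) / np.sqrt(d)
    W2 = rng.standard_normal(hidden) / np.sqrt(hidden)
    for _ in range(epochs):
        order = rng.permutation(n)
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            xb, yb = X[idx], y[idx]
            h = np.tanh(xb @ W1)                          # hidden layer
            p = 1 / (1 + np.exp(-(h @ W2)))               # sigmoid output
            err = p - yb                                  # gradient of logistic loss
            W2 -= lr * (h.T @ err) / len(idx)
            W1 -= lr * (xb.T @ (np.outer(err, W2) * (1 - h**2))) / len(idx)
    p = 1 / (1 + np.exp(-(np.tanh(X @ W1) @ W2)))
    return ((p > 0.5) == y).mean()

for bs in (1, 32, 1000):                                  # from pure SGD to full batch
    print(f"batch size {bs:4d}: train accuracy {train_two_layer(X, y, bs):.3f}")
```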
no code implementations • 26 Nov 2021 • Masoud Mohseni, Daniel Eppens, Johan Strumpfer, Raffaele Marino, Vasil Denchev, Alan K. Ho, Sergei V. Isakov, Sergio Boixo, Federico Ricci-Tersenghi, Hartmut Neven
In particular, for 90% of random 4-SAT instances we find solutions that are inaccessible to the best specialized deterministic algorithm, known as Survey Propagation (SP), with an order-of-magnitude improvement in the quality of solutions for the hardest 10% of instances.
no code implementations • 10 Dec 2020 • Raffaele Marino
This paper presents a new algorithm for computing approximate solutions in $\Theta(N)$ for the Maximum Exact 3-Satisfiability (MAX-E-$3$-SAT) problem by using deep learning methodology.
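For reference, the objective such a solver maximizes can be evaluated in a single linear pass over the clauses. The sketch below counts satisfied clauses of an exact-3-SAT formula under a candidate assignment; the random-assignment baseline is illustrative and is not the paper's deep-learning algorithm:

```python
import random

def satisfied_clauses(formula, assignment):
    """Count clauses with at least one true literal.

    `formula` is a list of 3-literal clauses (+v / -v, variables 1..N);
    `assignment` maps each variable to True/False. One pass: Theta(M).
    """
    def literal_true(lit):
        value = assignment[abs(lit)]
        return value if lit > 0 else not value
    return sum(any(literal_true(l) for l in clause) for clause in formula)

# Example: a tiny exact-3-SAT formula and a random assignment as baseline
formula = [[1, -2, 3], [-1, 2, -3], [2, 3, -1]]
rng = random.Random(0)
assignment = {v: rng.random() < 0.5 for v in (1, 2, 3)}
print(satisfied_clauses(formula, assignment), "of", len(formula), "clauses satisfied")
```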
no code implementations • 9 Dec 2020 • Nicolas Macris, Raffaele Marino
The main idea is to construct a deep network trained on samples of the discrete stochastic differential equations underlying Kolmogorov's equation.
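A minimal sketch of that training idea on a toy problem, assuming a scalar SDE and a generic regressor in place of the paper's architecture: sample Euler-Maruyama endpoints from random initial conditions and fit a network mapping $x_0$ to the terminal payoff, which approximates $u(T, x) = \mathbb{E}[g(X_T) \mid X_0 = x]$:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def sample_sde_endpoint(x0, T=1.0, n_steps=50, mu=-0.5, sigma=0.3):
    """Euler-Maruyama endpoint of dX = mu*X dt + sigma dW, started at x0."""
    dt = T / n_steps
    x = x0.copy()
    for _ in range(n_steps):
        x = x + mu * x * dt + sigma * np.sqrt(dt) * rng.standard_normal(x.shape)
    return x

g = lambda x: np.maximum(x, 0.0)                  # toy terminal payoff

# Training data: initial points x0, each with one noisy Monte Carlo label g(X_T)
x0 = rng.uniform(-2, 2, size=5000)
labels = g(sample_sde_endpoint(x0))

net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
net.fit(x0.reshape(-1, 1), labels)                # regression averages out the noise

# The fitted network approximates u(T, x) = E[g(X_T) | X_0 = x]
print(net.predict(np.array([[-1.0], [0.0], [1.0]])))
```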
no code implementations • 20 Aug 2015 • Raffaele Marino, Giorgio Parisi, Federico Ricci-Tersenghi
Discrete combinatorial optimization has a central role in many scientific disciplines; however, for hard problems we lack linear-time algorithms that would allow us to solve very large instances.