no code implementations • 16 Feb 2024 • Tommaso Salvatori, Beren Millidge, Yuhang Song, Rafal Bogacz, Thomas Lukasiewicz
This problem can be easily solved by computing \emph{similarities} in an embedding space instead of the pixel space.
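The idea of comparing items by similarity in an embedding space rather than in pixel space can be sketched as follows; the linear "encoder" `W` here is a hypothetical stand-in for whatever learned embedding the paper uses:

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two flattened vectors.
    a, b = a.ravel(), b.ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
img_a = rng.random((8, 8))
img_b = img_a + 0.05 * rng.standard_normal((8, 8))  # slightly perturbed copy

# Hypothetical linear "encoder" standing in for a learned embedding.
W = rng.standard_normal((16, 64))
emb_a, emb_b = W @ img_a.ravel(), W @ img_b.ravel()

pixel_sim = cosine_similarity(img_a, img_b)   # similarity in pixel space
embed_sim = cosine_similarity(emb_a, emb_b)   # similarity in embedding space
```

In practice the encoder is a trained network, so the embedding-space similarity reflects semantic rather than raw pixel agreement.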
no code implementations • 16 Feb 2024 • Alexander Ororbia, Ankur Mali, Adam Kohan, Beren Millidge, Tommaso Salvatori
As a result, it accommodates hardware and scientific modeling, e.g., learning with physical systems and non-differentiable behavior.
no code implementations • 6 Dec 2023 • Karl J. Friston, Tommaso Salvatori, Takuya Isomura, Alexander Tschantz, Alex Kiefer, Tim Verbelen, Magnus Koudahl, Aswin Paul, Thomas Parr, Adeel Razi, Brett Kagan, Christopher L. Buckley, Maxwell J. D. Ramstead
First, we simulate the aforementioned in vitro experiments, in which neuronal cultures spontaneously learn to play Pong, by implementing nested, free energy minimising processes.
no code implementations • 17 Nov 2023 • Karl J. Friston, Lancelot Da Costa, Alexander Tschantz, Alex Kiefer, Tommaso Salvatori, Victorita Neacsu, Magnus Koudahl, Conor Heins, Noor Sajid, Dimitrije Markovic, Thomas Parr, Tim Verbelen, Christopher L Buckley
This paper concerns structure learning or discovery of discrete generative models.
no code implementations • 15 Aug 2023 • Tommaso Salvatori, Ankur Mali, Christopher L. Buckley, Thomas Lukasiewicz, Rajesh P. N. Rao, Karl Friston, Alexander Ororbia
Artificial intelligence (AI) is rapidly becoming one of the key technologies of this century.
no code implementations • 27 Jun 2023 • Tommaso Salvatori, Luca Pinchetti, Amine M'Charrak, Beren Millidge, Thomas Lukasiewicz
Bayesian inference models observations: what can be inferred about a variable y if we observe a related variable x?
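The inference question above can be made concrete with a minimal discrete Bayes-rule computation; the variables and probabilities here are illustrative, not taken from the paper:

```python
# Infer y ("rain" vs "dry") after observing a related variable x (wet grass).
p_y = {"rain": 0.3, "dry": 0.7}          # prior over y
p_x_given_y = {"rain": 0.9, "dry": 0.2}  # likelihood p(x = wet | y)

# Evidence p(x) via the law of total probability.
p_x = sum(p_y[y] * p_x_given_y[y] for y in p_y)

# Posterior p(y | x) via Bayes' rule.
posterior = {y: p_y[y] * p_x_given_y[y] / p_x for y in p_y}
```

Observing x shifts belief about y away from the prior toward the hypothesis that better explains the observation.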
2 code implementations • NeurIPS 2023 • Simon Frieder, Luca Pinchetti, Alexis Chevalier, Ryan-Rhys Griffiths, Tommaso Salvatori, Thomas Lukasiewicz, Philipp Christian Petersen, Julius Berner
We investigate the mathematical capabilities of two iterations of ChatGPT (released 9-January-2023 and 30-January-2023) and of GPT-4 by testing them on publicly available datasets, as well as hand-crafted ones, using a novel methodology.
no code implementations • 9 Dec 2022 • Billy Byiringiro, Tommaso Salvatori, Thomas Lukasiewicz
Predictive coding is a message-passing framework initially developed to model information processing in the brain, and now also a topic of research in machine learning due to some of its interesting properties.
Graph Representation Learning • Out-of-Distribution Generalization
no code implementations • 16 Nov 2022 • Tommaso Salvatori, Yuhang Song, Yordan Yordanov, Beren Millidge, Zhenghua Xu, Lei Sha, Cornelius Emde, Rafal Bogacz, Thomas Lukasiewicz
Predictive coding networks are neuroscience-inspired models with roots in both Bayesian statistics and neuroscience.
no code implementations • 7 Nov 2022 • Luca Pinchetti, Tommaso Salvatori, Yordan Yordanov, Beren Millidge, Yuhang Song, Thomas Lukasiewicz
A large amount of recent research has the far-reaching goal of finding training methods for deep neural networks that can serve as alternatives to backpropagation (BP).
1 code implementation • 8 Oct 2022 • Lei Sha, Yuhang Song, Yordan Yordanov, Tommaso Salvatori, Thomas Lukasiewicz
Transformers have become an indispensable module for text generation models since their great success in machine translation.
1 code implementation • 21 Jul 2022 • Beren Millidge, Yuhang Song, Tommaso Salvatori, Thomas Lukasiewicz, Rafal Bogacz
In this paper, we provide a comprehensive theoretical analysis of the properties of PCNs trained with prospective configuration.
1 code implementation • 31 May 2022 • Beren Millidge, Yuhang Song, Tommaso Salvatori, Thomas Lukasiewicz, Rafal Bogacz
How the brain performs credit assignment is a fundamental unsolved problem in neuroscience.
no code implementations • 18 Feb 2022 • Beren Millidge, Tommaso Salvatori, Yuhang Song, Rafal Bogacz, Thomas Lukasiewicz
The backpropagation of error algorithm used to train deep neural networks has been fundamental to the successes of deep learning.
1 code implementation • 9 Feb 2022 • Beren Millidge, Tommaso Salvatori, Yuhang Song, Thomas Lukasiewicz, Rafal Bogacz
A large number of neural network models of associative memory have been proposed in the literature.
no code implementations • 31 Jan 2022 • Tommaso Salvatori, Luca Pinchetti, Beren Millidge, Yuhang Song, TianYi Bao, Rafal Bogacz, Thomas Lukasiewicz
Training with backpropagation (BP) in standard deep learning consists of two main steps: a forward pass that maps a data point to its prediction, and a backward pass that propagates the error of this prediction back through the network.
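The two steps described above can be sketched for a tiny two-layer network; this is a generic NumPy illustration of BP, not the paper's proposed method:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(4)   # one data point
t = np.array([1.0])          # its target
W1 = rng.standard_normal((3, 4))
W2 = rng.standard_normal((1, 3))

# Forward pass: map the data point to its prediction.
h = np.tanh(W1 @ x)
y = W2 @ h
loss = 0.5 * float((y - t) @ (y - t))

# Backward pass: propagate the prediction error back through the network.
dy = y - t                           # error at the output
dW2 = np.outer(dy, h)
dh = W2.T @ dy
dW1 = np.outer(dh * (1 - h**2), x)   # tanh'(z) = 1 - tanh(z)^2

# One small gradient step on both weight matrices.
lr = 0.01
W1 -= lr * dW1
W2 -= lr * dW2
loss_after = 0.5 * float((W2 @ np.tanh(W1 @ x) - t) @ (W2 @ np.tanh(W1 @ x) - t))
```

The backward pass reuses the forward activations, which is exactly the global, sequential dependence that alternatives such as predictive coding aim to relax.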
no code implementations • NeurIPS 2021 • Tommaso Salvatori, Yuhang Song, Yujian Hong, Simon Frieder, Lei Sha, Zhenghua Xu, Rafal Bogacz, Thomas Lukasiewicz
We conclude by discussing the potential impact of this work on the neuroscience community, showing that our model provides a plausible framework for studying learning and memory retrieval in the brain, as it closely mimics the behavior of the hippocampus as a memory index and generative model.
no code implementations • 8 Mar 2021 • Tommaso Salvatori, Yuhang Song, Thomas Lukasiewicz, Rafal Bogacz, Zhenghua Xu
Recent works prove that these methods can approximate BP up to a certain margin on multilayer perceptrons (MLPs), and asymptotically on any other complex model, and that zero-divergence inference learning (Z-IL), a variant of PC, is able to exactly implement BP on MLPs.
no code implementations • 5 Mar 2021 • Tommaso Salvatori, Yuhang Song, Thomas Lukasiewicz, Rafal Bogacz, Zhenghua Xu
Predictive coding networks (PCNs) are an influential model for information processing in the brain.
1 code implementation • NeurIPS 2020 • Ralph Abboud, İsmail İlkan Ceylan, Thomas Lukasiewicz, Tommaso Salvatori
Knowledge base completion (KBC) aims to automatically infer missing facts by exploiting information already present in a knowledge base (KB).
Ranked #1 on Link Prediction on FB-AUTO