1 code implementation • 22 Nov 2023 • Christian Donner, Anuj Mishra, Hideaki Shimazaki
Learning and forecasting stochastic time series is essential in various scientific fields.
no code implementations • 25 Aug 2023 • Ulises Rodríguez-Domínguez, Hideaki Shimazaki
Neurons in living organisms work cooperatively and efficiently to process incoming sensory information, often exhibiting sparse and widespread population activity that involves structured higher-order interactions.
no code implementations • 23 Jun 2020 • Hideaki Shimazaki
This article reviews how organisms learn and recognize the world through the dynamics of neural networks, viewed from the perspective of Bayesian inference, and introduces a view in which these dynamics are described by laws governing the entropy of neural activity, a paradigm we call the thermodynamics of the Bayesian brain.
no code implementations • 28 Feb 2019 • Hideaki Shimazaki
We start by constructing a hierarchical model of the world as an internal model in the brain, and review standard machine learning methods that infer causes by approximately learning the model under the maximum likelihood principle.
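The cause-inference-by-maximum-likelihood idea can be illustrated with a minimal sketch (not the paper's actual model): a toy mixture in which a hidden "cause" selects one of two Bernoulli sources, fitted by expectation-maximization, which alternates inferring the cause and updating parameters by maximum likelihood. All names and numbers below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical generative model: on each trial a hidden cause z picks one of
# two Bernoulli sources; we observe the number of 1s in 20 binary draws.
true_p = np.array([0.2, 0.8])
z = rng.integers(0, 2, size=500)
heads = rng.binomial(20, true_p[z])

# EM: alternately infer the hidden cause (E-step) and update the source
# probabilities and mixing weights by maximum likelihood (M-step).
p = np.array([0.4, 0.6])   # initial guess for the source probabilities
pi = np.array([0.5, 0.5])  # initial mixing weights
for _ in range(100):
    # E-step: posterior responsibility of each cause for each trial
    loglik = (heads[:, None] * np.log(p)
              + (20 - heads[:, None]) * np.log(1 - p) + np.log(pi))
    loglik -= loglik.max(axis=1, keepdims=True)  # for numerical stability
    resp = np.exp(loglik)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: maximum-likelihood update given the soft assignments
    pi = resp.mean(axis=0)
    p = (resp * heads[:, None]).sum(axis=0) / (resp.sum(axis=0) * 20)

print(np.sort(p))  # estimates should land near the true 0.2 and 0.8
```

The E-step is the "inference of causes" and the M-step is the approximate maximum-likelihood learning step; richer hierarchical models replace the mixture with deeper latent structure but keep this alternation.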
no code implementations • 22 Jan 2019 • Jimmy Gaudreault, Arunabh Saxena, Hideaki Shimazaki
Sequences of correlated binary patterns can represent many kinds of time-series data, including text, movies, and biological signals.
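As a minimal illustration of the binary-pattern representation (an assumption for exposition, not the paper's method), each time bin of a small population can be encoded as one of 2^N binary patterns, whose empirical probabilities are then easy to tabulate:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: 3 binary units observed over 1000 time bins, each unit
# active independently with probability 0.3; each row is one binary pattern.
X = (rng.random((1000, 3)) < 0.3).astype(int)

# Encode each pattern as an integer 0..7 (e.g. [1,0,1] -> 5) and estimate
# the probability of each of the 8 possible patterns from the sequence.
codes = X @ np.array([4, 2, 1])
counts = np.bincount(codes, minlength=8)
p_hat = counts / counts.sum()

print(p_hat)  # the silent pattern 000 is the most frequent at these rates
```

Real data differ from this independent toy case precisely in their correlations: pattern probabilities deviate from the product of single-unit rates, which is what models of correlated binary patterns capture.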
no code implementations • 24 Jul 2018 • Jimmy Gaudreault, Hideaki Shimazaki
Here we report that, in all examined stimulus conditions, pairwise interactions contribute to increasing the sparseness and fluctuations of the activity.
no code implementations • 16 Dec 2013 • Hideaki Shimazaki
In this study, we develop a parametric method for simultaneously estimating the stimulus and spike-history effects on ensemble activity from single-trial data, even when the neurons exhibit dynamics that are largely unrelated to these effects.
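To make "stimulus and spike-history effects" concrete, here is a minimal sketch (not the paper's state-space method) of a single-neuron logistic GLM whose spiking probability depends on a stimulus covariate and on the spike in the previous bin, fitted by maximum likelihood with Newton's method. The simulation parameters and one-lag history term are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

T = 2000
stim = rng.standard_normal(T)        # hypothetical stimulus covariate
true_w = np.array([-2.0, 0.8, 1.2])  # bias, stimulus gain, history gain

# Simulate a binary spike train whose spiking probability depends on the
# current stimulus and on whether the neuron spiked in the previous bin.
spikes = np.zeros(T, dtype=int)
for t in range(1, T):
    eta = true_w[0] + true_w[1] * stim[t] + true_w[2] * spikes[t - 1]
    spikes[t] = rng.random() < 1.0 / (1.0 + np.exp(-eta))

# Design matrix: bias, stimulus, and previous-bin spike (history effect).
X = np.column_stack([np.ones(T - 1), stim[1:], spikes[:-1]])
y = spikes[1:]

# Maximum-likelihood fit of the logistic GLM by Newton's method.
w = np.zeros(3)
for _ in range(50):
    mu = 1.0 / (1.0 + np.exp(-X @ w))          # predicted spike probability
    grad = X.T @ (y - mu)                      # log-likelihood gradient
    H = X.T @ (X * (mu * (1 - mu))[:, None])   # observed Fisher information
    w += np.linalg.solve(H, grad)

print(w)  # estimates should land near true_w under this simulation
```

The paper's setting is harder: ensemble (multi-neuron) activity with additional dynamics unrelated to stimulus and history, which a plain GLM like this cannot absorb; that is what motivates the state-space treatment.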