1 code implementation • 7 May 2020 • Manuel Razo-Mejia, Sarah Marzen, Griffin Chure, Rachel Taubman, Muir Morrison, Rob Phillips
We then predict the information processing capacity of the genetic circuit for a suite of biophysical parameters such as protein copy number and protein-DNA affinity.
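The information processing capacity here is a channel capacity, which for a discretized input-output channel can be computed with the Blahut-Arimoto algorithm. Below is a minimal numpy sketch of that computation; the toy two-input, three-output channel is illustrative only, not the paper's gene-expression model.

```python
import numpy as np

def blahut_arimoto(P, tol=1e-10, max_iter=10_000):
    """Capacity (in bits) of a discrete channel P[x, y] = P(y | x)."""
    n_x = P.shape[0]
    p = np.full(n_x, 1.0 / n_x)          # start from a uniform input distribution
    for _ in range(max_iter):
        q = p @ P                        # output marginal q(y)
        # d[x] = KL( P(y|x) || q(y) ), skipping zero-probability outputs
        d = np.array([np.sum(P[x][P[x] > 0] *
                             np.log2(P[x][P[x] > 0] / q[P[x] > 0]))
                      for x in range(n_x)])
        p_new = p * np.exp2(d)           # Blahut-Arimoto multiplicative update
        p_new /= p_new.sum()
        if np.abs(p_new - p).max() < tol:
            break
        p = p_new
    return float(p @ d), p               # mutual information at the final p

# Hypothetical toy channel: two inducer levels -> three coarse-grained
# protein-expression bins (made-up numbers, for illustration only).
P = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.3, 0.6]])
C, p_opt = blahut_arimoto(P)
print(f"capacity ≈ {C:.3f} bits, optimal input distribution {np.round(p_opt, 3)}")
```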
no code implementations • 26 Jun 2017 • Sarah Marzen
Recurrent networks are often trained to better memorize their input, in the hope that such training will improve the network's ability to predict.
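A minimal PyTorch sketch of a memorization objective (the task, architecture, and hyperparameters are illustrative stand-ins, not the paper's setup): a small RNN is trained to reproduce its input from k steps ago, and we then measure how much of that input it can recall.

```python
import torch
import torch.nn as nn

# Illustrative sketch: train an RNN to output the input from k steps ago.
torch.manual_seed(0)
k, T, H = 5, 400, 32
x = torch.randint(0, 2, (1, T, 1)).float()        # random binary stream

rnn = nn.RNN(input_size=1, hidden_size=H, batch_first=True)
mem_head = nn.Linear(H, 1)                        # decodes x[t-k] from h[t]
opt = torch.optim.Adam(list(rnn.parameters()) + list(mem_head.parameters()), lr=1e-2)
bce = nn.BCEWithLogitsLoss()

for step in range(500):
    h, _ = rnn(x)
    loss = bce(mem_head(h[:, k:]), x[:, :-k])     # memorization loss
    opt.zero_grad()
    loss.backward()
    opt.step()

with torch.no_grad():
    acc = ((mem_head(rnn(x)[0][:, k:]) > 0).float() == x[:, :-k]).float().mean()
print(f"recall of input from {k} steps ago: {acc:.2%}")
```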
no code implementations • 23 Apr 2016 • Alireza Goudarzi, Sarah Marzen, Peter Banda, Guy Feldman, Christof Teuscher, Darko Stefanovic
Recurrent neural networks (RNNs) are simple dynamical systems whose computational power has been attributed to their short-term memory.
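Short-term memory in such networks is commonly quantified by linear memory capacity: how well delayed inputs can be linearly reconstructed from the current state. A numpy sketch under standard echo-state assumptions (reservoir size, spectral scaling, and delay range are illustrative choices, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, washout, max_delay = 100, 5000, 200, 40

# Random reservoir scaled below unit spectral radius (echo state property).
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.uniform(-1, 1, size=N)

u = rng.uniform(-1, 1, size=T)          # i.i.d. input stream
X = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    X[t] = x

# Memory capacity: sum over delays of the squared correlation between the
# delayed input and its best linear reconstruction from the current state.
mc = 0.0
S = X[washout:]
for k in range(1, max_delay + 1):
    y = u[washout - k : T - k]
    w, *_ = np.linalg.lstsq(S, y, rcond=None)
    mc += np.corrcoef(S @ w, y)[0, 1] ** 2
print(f"estimated linear memory capacity ≈ {mc:.1f}")
```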
no code implementations • 1 Apr 2015 • James P. Crutchfield, Sarah Marzen
We introduce a simple analysis of the structural complexity of infinite-memory processes built from random samples of stationary, ergodic finite-memory component processes.
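A toy version of the construction (our example, not the paper's): pool realizations of biased coins whose bias is drawn afresh for each realization. Each component is finite-memory (here memoryless), yet the pooled mixture has infinite memory, which shows up as sublinear excess growth in its block entropy.

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples, max_L = 200_000, 10

def block_entropy(L):
    """Monte Carlo estimate of H(L), in bits, for a mixture of biased coins.
    Each realization draws its own bias p ~ Uniform(0, 1) (a toy prior)."""
    p = rng.uniform(size=n_samples)
    blocks = (rng.uniform(size=(n_samples, L)) < p[:, None]).astype(int)
    codes = blocks @ (1 << np.arange(L))          # encode each block as an integer
    counts = np.bincount(codes, minlength=1 << L)
    q = counts[counts > 0] / n_samples
    return -np.sum(q * np.log2(q))

H = [block_entropy(L) for L in range(1, max_L + 1)]
# For this toy mixture, H(L) - L*h grows roughly like (1/2) * log2(L),
# i.e., the excess entropy diverges even though every component is memoryless.
print(np.round(H, 3))
```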
no code implementations • 30 Dec 2014 • James P. Crutchfield, Ryan G. James, Sarah Marzen, Dowman P. Varn
We recount recent history behind building compact models of nonlinear, complex processes and identifying their relevant macroscopic patterns or "macrostates".
no code implementations • 9 Dec 2014 • Sarah Marzen, James P. Crutchfield
Predictive rate-distortion analysis suffers from the curse of dimensionality: clustering arbitrarily long pasts to retain information about arbitrarily long futures requires resources that typically grow exponentially with length.
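One escape from this curse is that many pasts are predictively equivalent. A small self-contained sketch using the golden-mean process as a stand-in example (not the paper's case studies): all valid length-L pasts collapse into just two clusters with identical next-symbol distributions, i.e., two causal states.

```python
from itertools import product

# Golden-mean process: binary, no two consecutive 1s. Its epsilon-machine has
# two states: A emits 0 -> A or 1 -> B (prob 1/2 each); B emits 0 -> A (prob 1).
TRANS = {('A', 0): ('A', 0.5), ('A', 1): ('B', 0.5), ('B', 0): ('A', 1.0)}
STATIONARY = {'A': 2 / 3, 'B': 1 / 3}

def prob_next_one(past):
    """P(next symbol = 1 | past), by filtering the past through the machine."""
    belief = dict(STATIONARY)
    for s in past:
        new = {'A': 0.0, 'B': 0.0}
        for state, w in belief.items():
            if (state, s) in TRANS:
                nxt, pr = TRANS[(state, s)]
                new[nxt] += w * pr
        z = new['A'] + new['B']
        if z == 0.0:
            return None                  # past contains "11": forbidden
        belief = {k: v / z for k, v in new.items()}
    return round(belief['A'] * 0.5, 12)  # only state A can emit a 1

L = 12
clusters = {}
for past in product((0, 1), repeat=L):
    p1 = prob_next_one(past)
    if p1 is not None:
        clusters[p1] = clusters.get(p1, 0) + 1
print(f"{2**L} candidate pasts -> {len(clusters)} predictive clusters: {sorted(clusters)}")
```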
no code implementations • ICLR 2020 • Sarah Marzen, James P. Crutchfield
The inference of models, prediction of future symbols, and entropy rate estimation of discrete-time, discrete-event processes are well-worn ground.
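For the discrete-time case the abstract calls well-worn, a standard plug-in estimator is the block-entropy difference h ≈ H(L) − H(L−1). A numpy sketch on a simple two-state Markov chain (the chain and its parameters are our example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-state Markov chain with transition matrix P[i, j] = P(next=j | cur=i).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
T = 500_000
x = np.zeros(T, dtype=np.int64)
for t in range(1, T):
    x[t] = rng.random() < P[x[t - 1], 1]

def block_entropy(x, L):
    """Plug-in estimate of H(L), the entropy of length-L blocks, in bits."""
    codes = np.zeros(len(x) - L + 1, dtype=np.int64)
    for i in range(L):
        codes = codes * 2 + x[i : len(x) - L + 1 + i]
    counts = np.bincount(codes)
    q = counts[counts > 0] / counts.sum()
    return -np.sum(q * np.log2(q))

for L in (2, 4, 6):
    print(f"L={L}: h ≈ {block_entropy(x, L) - block_entropy(x, L - 1):.4f} bits/symbol")

# Exact rate for comparison: h = sum_i pi[i] * H(P[i, :]).
pi = np.array([0.8, 0.2])                       # stationary distribution of P
H_row = -np.sum(P * np.log2(P), axis=1)
print(f"exact h = {pi @ H_row:.4f} bits/symbol")
```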
no code implementations • 30 Mar 2020 • Daniel Levenstein, Veronica A. Alvarez, Asohan Amarasingham, Habiba Azab, Zhe Sage Chen, Richard C. Gerkin, Andrea Hasenstaub, Ramakrishnan Iyer, Renaud B. Jolivet, Sarah Marzen, Joseph D. Monaco, Astrid A. Prinz, Salma Quraishi, Fidel Santamaria, Sabyasachi Shivkumar, Matthew F. Singh, Roger Traub, Horacio G. Rotstein, Farzan Nadim, A. David Redish
In recent years, the field of neuroscience has gone through rapid experimental advances and a significant increase in the use of quantitative and computational methods.
no code implementations • 29 Apr 2024 • Sarah Marzen
We propose a new computational-level objective function for theoretical biology and theoretical neuroscience that combines: reinforcement learning, the study of learning with feedback via rewards; rate-distortion theory, the branch of information theory that deals with compressing signals so as to retain relevant information; and computational mechanics, the study of the minimal sufficient statistics of prediction, also known as causal states.
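Stated as a formula, one plausible shape for such a combined objective (our hedged paraphrase, not necessarily the paper's exact form) trades expected reward against the rate at which an internal state S compresses the observed past:

```latex
% Hedged paraphrase, not the paper's exact objective: choose a stochastic
% encoder p(s | past) of the observed past into an internal state S that
% maximizes expected reward minus an information (rate) cost.
\max_{p(s \mid \overleftarrow{x})} \;
  \mathbb{E}\!\left[\, R \,\right] \;-\; \beta \, I(\overleftarrow{X};\, S)
```

In the limit of a vanishing rate penalty, the optimal representation for pure prediction is known from predictive rate-distortion work to recover the causal states, which is one way the three ingredients fit together.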