Search Results for author: Sarah Marzen

Found 8 papers, 1 paper with code

First-principles prediction of the information processing capacity of a simple genetic circuit

1 code implementation • 7 May 2020 • Manuel Razo-Mejia, Sarah Marzen, Griffin Chure, Rachel Taubman, Muir Morrison, Rob Phillips

We then predict the information processing capacity of the genetic circuit for a suite of biophysical parameters such as protein copy number and protein-DNA affinity.
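The "information processing capacity" in this paper is a channel capacity: the maximal mutual information between the input signal and the gene-expression output. As an illustrative sketch only (not the paper's code — the binary channel below is a stand-in for the circuit's actual input-output distribution), the capacity of any discrete channel can be computed with the Blahut-Arimoto algorithm:

```python
import numpy as np

def blahut_arimoto(P_y_given_x, tol=1e-9, max_iter=1000):
    """Capacity (in bits) of a discrete channel P(y|x) via Blahut-Arimoto."""
    n_x = P_y_given_x.shape[0]
    p_x = np.full(n_x, 1.0 / n_x)  # start from the uniform input distribution
    for _ in range(max_iter):
        # posterior q(x|y) ∝ p(x) P(y|x)
        q = p_x[:, None] * P_y_given_x
        q /= q.sum(axis=0, keepdims=True)
        # update p(x) ∝ exp( sum_y P(y|x) log q(x|y) )
        log_r = (P_y_given_x * np.log(q + 1e-300)).sum(axis=1)
        p_new = np.exp(log_r)
        p_new /= p_new.sum()
        if np.max(np.abs(p_new - p_x)) < tol:
            p_x = p_new
            break
        p_x = p_new
    # capacity = I(X;Y) at the optimizing input distribution
    joint = p_x[:, None] * P_y_given_x
    p_y = joint.sum(axis=0)
    mask = joint > 0
    return (joint[mask] * np.log2(joint[mask] / (p_x[:, None] * p_y[None, :])[mask])).sum()

# Toy example: binary symmetric channel with flip probability 0.1
C = blahut_arimoto(np.array([[0.9, 0.1], [0.1, 0.9]]))
```

For the binary symmetric channel the result can be checked analytically against 1 − H(0.1) ≈ 0.531 bits; for a real circuit the channel matrix would come from the measured or predicted distribution of expression levels given inducer concentration.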


no code implementations • ICLR 2020 • Sarah Marzen, James P. Crutchfield

The inference of models, prediction of future symbols, and entropy rate estimation of discrete-time, discrete-event processes is well-worn ground.

Time Series

The difference between memory and prediction in linear recurrent networks

no code implementations • 26 Jun 2017 • Sarah Marzen

Recurrent networks are trained to memorize their input better, often in the hopes that such training will increase the ability of the network to predict.
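The memory side of this tradeoff can be probed numerically. A minimal sketch, assuming a random linear echo-state network driven by white noise (all parameters below are illustrative choices, not the paper's setup): train linear readouts to reconstruct delayed inputs from the current state and measure the captured variance at short versus long delays:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear recurrent network: x_{t+1} = W x_t + v u_t
n, T = 50, 20000
W = rng.standard_normal((n, n)) / np.sqrt(n)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius 0.9 for stability
v = rng.standard_normal(n)

u = rng.standard_normal(T)          # white-noise input
X = np.zeros((T, n))
for t in range(T - 1):
    X[t + 1] = W @ X[t] + v * u[t]

def readout_r2(target, states):
    """Fraction of target variance captured by the best linear readout."""
    w, *_ = np.linalg.lstsq(states, target, rcond=None)
    return 1 - np.mean((target - states @ w) ** 2) / np.var(target)

burn = 300  # discard transient
m_recent = readout_r2(u[burn - 1:T - 1], X[burn:T])      # reconstruct u_{t-1}
m_distant = readout_r2(u[burn - 200:T - 200], X[burn:T])  # reconstruct u_{t-200}
```

With white-noise input, recent inputs are recovered almost perfectly while inputs delayed well beyond the network size are essentially lost; the paper's point is that optimizing this memory curve is not the same as optimizing prediction unless the input process itself has predictable structure.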

Memory and Information Processing in Recurrent Neural Networks

no code implementations • 23 Apr 2016 • Alireza Goudarzi, Sarah Marzen, Peter Banda, Guy Feldman, Christof Teuscher, Darko Stefanovic

Recurrent neural networks (RNN) are simple dynamical systems whose computational power has been attributed to their short-term memory.

Signatures of Infinity: Nonergodicity and Resource Scaling in Prediction, Complexity, and Learning

no code implementations • 1 Apr 2015 • James P. Crutchfield, Sarah Marzen

We introduce a simple analysis of the structural complexity of infinite-memory processes built from random samples of stationary, ergodic finite-memory component processes.

Understanding and Designing Complex Systems: Response to "A framework for optimal high-level descriptions in science and engineering---preliminary report"

no code implementations • 30 Dec 2014 • James P. Crutchfield, Ryan G. James, Sarah Marzen, Dowman P. Varn

We recount recent history behind building compact models of nonlinear, complex processes and identifying their relevant macroscopic patterns or "macrostates".

Circumventing the Curse of Dimensionality in Prediction: Causal Rate-Distortion for Infinite-Order Markov Processes

no code implementations • 9 Dec 2014 • Sarah Marzen, James P. Crutchfield

Predictive rate-distortion analysis suffers from the curse of dimensionality: clustering arbitrarily long pasts to retain information about arbitrarily long futures requires resources that typically grow exponentially with length.
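The blow-up the abstract describes is easy to see numerically: even for a simple binary Markov chain with one forbidden word (the "golden mean" process — an illustrative stand-in here, since the paper treats general infinite-order Markov processes), the number of distinct length-L pasts that a naive clustering must handle grows exponentially in L:

```python
import numpy as np

rng = np.random.default_rng(1)

# Golden-mean process: after a 1, the next symbol is always 0 (no "11" ever occurs).
T = 200000
s = np.zeros(T, dtype=int)
for t in range(1, T):
    s[t] = 0 if s[t - 1] == 1 else rng.integers(0, 2)

def n_pasts(L):
    """Number of distinct length-L past words occurring in the sample."""
    return len({tuple(s[t - L:t]) for t in range(L, T)})

# Grows like the Fibonacci numbers, i.e. exponentially in L: 3, 8, 55, 377, ...
counts = [n_pasts(L) for L in (2, 4, 8, 12)]
```

Clustering pasts at this resolution is exactly the resource cost that the paper's causal rate-distortion approach avoids, by working with the process's causal states instead of raw past words.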
