no code implementations • 25 Mar 2023 • Sarah E. Marzen, Paul M. Riechers, James P. Crutchfield
One conclusion is that large probabilistic state machines -- specifically, large $\epsilon$-machines -- are key to generating challenging and structurally unbiased stimuli for ground-truthing recurrent neural network architectures.
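To make the stimulus-generation idea concrete, here is a minimal sketch (my own illustration, not the paper's code) of sampling binary sequences from a small $\epsilon$-machine -- the "Even Process", whose two causal states force 1s to occur in even-length blocks. Sequences like these can serve as structured training stimuli for a recurrent network; the machine definition and function names below are assumptions for this sketch.

```python
import random

# Sketch of an epsilon-machine as labeled transitions:
# state -> list of (probability, emitted symbol, next state).
# The Even Process: from A, emit 0 and stay, or emit 1 and go to B;
# from B, emit 1 and return to A, so runs of 1s have even length.
EVEN_PROCESS = {
    "A": [(0.5, 0, "A"), (0.5, 1, "B")],
    "B": [(1.0, 1, "A")],
}

def sample(machine, length, start="A", rng=random):
    """Emit a symbol sequence by walking the machine's labeled transitions."""
    state, out = start, []
    for _ in range(length):
        r, cum = rng.random(), 0.0
        for p, symbol, nxt in machine[state]:
            cum += p
            if r < cum:
                out.append(symbol)
                state = nxt
                break
    return out

seq = sample(EVEN_PROCESS, 1000)
```

Because the generator is a unifilar hidden Markov model, the statistics of the output (e.g. its entropy rate and statistical complexity) are known exactly, which is what makes such stimuli usable as ground truth.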
no code implementations • 27 Feb 2017 • Sarah E. Marzen, James P. Crutchfield
Scientific explanation often requires inferring maximally predictive features from a given data set.
no code implementations • 18 Apr 2015 • Sarah E. Marzen, Michael R. DeWeese, James P. Crutchfield
A first step towards that larger goal is to develop information measures for individual output processes, including information generation (entropy rate), stored information (statistical complexity), predictable information (excess entropy), and active information accumulation (bound information rate).
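As a concrete instance of one such measure, here is a minimal sketch (my own illustration, not the authors' code) of the entropy rate of a unifilar hidden Markov model such as an $\epsilon$-machine: the per-state transition entropy averaged over the stationary state distribution. The transition matrix and helper names below are assumptions for this sketch.

```python
import numpy as np

# T[i, j] = probability of moving from causal state i to state j.
# For a unifilar machine, each transition emits a distinct symbol, so the
# symbol entropy rate equals the state-transition entropy rate.
T = np.array([[0.5, 0.5],
              [1.0, 0.0]])  # the Even Process's two causal states

def stationary(T, iters=1000):
    """Stationary distribution via power iteration."""
    pi = np.full(T.shape[0], 1.0 / T.shape[0])
    for _ in range(iters):
        pi = pi @ T
    return pi

def entropy_rate(T):
    """h = -sum_i pi_i sum_j T_ij log2 T_ij, with 0 log 0 taken as 0."""
    pi = stationary(T)
    logs = np.where(T > 0, np.log2(np.where(T > 0, T, 1.0)), 0.0)
    return float(-(pi[:, None] * T * logs).sum())

h = entropy_rate(T)  # for the Even Process: 2/3 bits per symbol
```

The stationary distribution here is $(2/3, 1/3)$; only state A is stochastic (1 bit per visit), giving an entropy rate of $2/3$ bits per symbol. Statistical complexity, by contrast, is the entropy of the stationary state distribution itself.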