1 code implementation • NeurIPS 2018 • Fabian Sinz, Alexander S. Ecker, Paul Fahey, Edgar Walker, Erick Cobos, Emmanouil Froudarakis, Dimitri Yatsenko, Zachary Pitkow, Jacob Reimer, Andreas Tolias
However, in many cases this approach requires that the model be able to generalize to stimulus statistics it was not trained on, such as band-limited noise and other parameterized stimuli.
no code implementations • NeurIPS 2012 • Zachary Pitkow
This paper shows how sparse, high-dimensional probability distributions could be represented by neurons with exponential compression.
no code implementations • NeurIPS 2011 • Zachary Pitkow, Yashar Ahmadian, Ken D. Miller
On the contrary, here we show that many probability distributions have marginals that cannot be reached by belief propagation using any set of model parameters or any learning algorithm.