1 code implementation • 9 May 2020 • Simone Brugiapaglia, Matthew Liu, Paul Tupper
Finally, we demonstrate our theory with computational experiments in which we explore the effect of different input encodings on the ability of algorithms to generalize to novel inputs.
no code implementations • 4 Jan 2018 • Paul Tupper, Paul Smolensky, Pyeong Whan Cho
Gradient Symbolic Computation is proposed as a means of solving discrete global optimization problems using a neurally plausible continuous stochastic dynamical system.
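The general idea of solving a discrete problem with continuous stochastic dynamics can be illustrated with a minimal sketch. This is not the paper's Gradient Symbolic Computation dynamics; the toy energy function, the quantization penalty, and the annealing schedule below are all illustrative assumptions.

```python
import numpy as np

# Hedged sketch: Langevin-style noisy gradient descent on a continuous
# relaxation of a discrete problem. All details (objective, schedules)
# are made-up placeholders, not the authors' actual method.

rng = np.random.default_rng(0)

# Toy discrete problem: choose x in {0,1}^n minimizing a quadratic energy.
n = 8
Q = rng.standard_normal((n, n))
Q = (Q + Q.T) / 2  # symmetric coupling matrix

def energy(x):
    return x @ Q @ x

def grad(x):
    return 2 * Q @ x

# Relax to continuous x in [0,1]^n. A penalty lam * sum(x * (1 - x)),
# whose strength lam grows over time, pushes the state toward the
# discrete corners {0,1}^n while noise helps escape local minima.
x = rng.uniform(0, 1, n)
dt = 0.01
for t in range(2000):
    lam = 0.01 * t                                   # growing commitment strength
    noise_scale = max(0.5 - t / 4000, 0.0)           # annealed noise
    noise = np.sqrt(2 * dt) * noise_scale * rng.standard_normal(n)
    g = grad(x) + lam * (1 - 2 * x)                  # gradient of energy + penalty
    x = np.clip(x - dt * g + noise, 0.0, 1.0)

x_discrete = np.round(x)  # read off a discrete answer at the end
print(x_discrete, energy(x_discrete))
```

The design point is that the search happens in a continuous state space (amenable to neurally plausible dynamics), with discreteness enforced only gradually through the penalty term.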
no code implementations • 20 Apr 2017 • Benjamin Goodman, Paul Tupper
In spoken languages, speakers divide the space of phonetic possibilities into regions corresponding to different phonemes.
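The partition of a phonetic space into phoneme regions can be pictured as a Voronoi tessellation around category prototypes. The following is an illustrative sketch only, assuming a 2-D formant space (F1, F2) and made-up centroid values; it is not the model from the paper.

```python
import numpy as np

# Hedged illustration: phoneme categories as a Voronoi partition of a
# 2-D phonetic space. Centroids are hypothetical (F1, F2) values in Hz,
# chosen for illustration only.

centroids = {
    "i": np.array([300.0, 2300.0]),
    "a": np.array([750.0, 1200.0]),
    "u": np.array([320.0, 800.0]),
}

def classify(point):
    """Assign a phonetic point to the phoneme with the nearest centroid."""
    return min(centroids, key=lambda p: np.linalg.norm(point - centroids[p]))

print(classify(np.array([700.0, 1300.0])))  # prints "a" (nearest centroid)
```

Each centroid's region is the set of points closer to it than to any other centroid, which is exactly how a nearest-prototype view carves the phonetic space into phonemes.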
no code implementations • 12 May 2016 • Paul Tupper, Bobak Shahriari
We propose a novel framework for the analysis of learning algorithms that allows us to say when such algorithms can and cannot generalize certain patterns from training data to test data.