no code implementations • 22 Oct 2021 • Simon Alford, Anshula Gandhi, Akshay Rangamani, Andrzej Banburski, Tony Wang, Sylee Dandekar, John Chin, Tomaso Poggio, Peter Chin
More specifically, we extend existing execution-guided program synthesis approaches with deductive reasoning based on function inverse semantics to enable a neural-guided bidirectional search algorithm.
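The flavor of this approach can be illustrated with a toy sketch (my own construction, not the paper's actual system or DSL): a forward, execution-guided search expands values reachable from the inputs, while a backward, deductive search derives subgoals from the target using inverse semantics — here `double` is invertible only when the target value is even. The search succeeds when the two frontiers meet.

```python
def inverse_double(y):
    # Inverse semantics of "double": only defined when y is even.
    return y // 2 if y % 2 == 0 else None

def bidirectional_search(inputs, target, max_depth=4):
    # Forward frontier: values reachable by executing ops on the inputs.
    forward = set(inputs)
    # Backward frontier: subgoals derived from the target via inverses.
    backward = {target}
    for _ in range(max_depth):
        if forward & backward:
            return True  # the two searches meet in the middle
        # Execution-guided expansion with two toy ops: +1 and double.
        forward |= {v + 1 for v in forward} | {v * 2 for v in forward}
        # Deductive expansion: apply inverse semantics to each subgoal.
        new_goals = {g - 1 for g in backward}
        new_goals |= {inv for g in backward
                      if (inv := inverse_double(g)) is not None}
        backward |= new_goals
    return bool(forward & backward)

print(bidirectional_search([3], 14))  # -> True (e.g. 3 -> +1 -> double -> +1 -> double)
```

In the paper's setting the ops are DSL functions and a neural model guides which expansions to try; the meet-in-the-middle structure is the part this sketch shows.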
no code implementations • NeurIPS Workshop LMCA 2020 • Andrzej Banburski, Anshula Gandhi, Simon Alford, Sylee Dandekar, Sang Chin, Tomaso A. Poggio
We argue that this can be achieved by a modular system: one that can adapt to solving different problems by changing only the modules chosen and the order in which those modules are applied to the problem.
no code implementations • 25 Mar 2020 • Jeremy Kepner, Simon Alford, Vijay Gadepally, Michael Jones, Lauren Milechin, Albert Reuther, Ryan Robinett, Sid Samsi
The Sparse Deep Neural Network (DNN) Challenge draws upon prior challenges from machine learning, high performance computing, and visual analytics to create a challenge that is reflective of emerging sparse AI systems.
no code implementations • 2 Sep 2019 • Jeremy Kepner, Simon Alford, Vijay Gadepally, Michael Jones, Lauren Milechin, Ryan Robinett, Sid Samsi
The Sparse DNN Challenge is based on a mathematically well-defined DNN inference computation and can be implemented in any programming environment.
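The structure of that computation — repeated sparse matrix products interleaved with a ReLU nonlinearity — can be sketched as below. This is only an illustration of the product-plus-ReLU pattern; the challenge's official specification fixes the actual network sizes, bias values, and any activation bounds.

```python
import numpy as np
from scipy import sparse

def relu(x):
    return np.maximum(x, 0.0)

def sparse_dnn_forward(Y0, weights, biases):
    # One inference pass: for each layer, a product with a sparse
    # weight matrix, a bias addition, and a ReLU nonlinearity.
    Y = Y0  # dense activations, shape (batch, features)
    for W, b in zip(weights, biases):
        Y = relu(Y @ W + b)  # Y @ W is a dense-times-sparse product
    return Y

# Tiny demo: one sparse layer that doubles activations, with a
# negative bias on the last feature that ReLU then clips to zero.
W = sparse.csr_matrix(2.0 * np.eye(3))
b = np.array([0.0, 0.0, -5.0])
out = sparse_dnn_forward(np.array([[1.0, 2.0, 0.0]]), [W], [b])
```

Because the computation is just linear algebra plus an elementwise max, it can be reproduced in any environment with a sparse matrix type, which is the portability the challenge relies on.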
no code implementations • 30 Sep 2018 • Simon Alford, Ryan Robinett, Lauren Milechin, Jeremy Kepner
We test pruning-based topologies, which are derived from an initially dense network whose connections are pruned, as well as RadiX-Nets, a class of network topologies with proven connectivity and sparsity properties.
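A minimal sketch of the pruning-based construction (the function name and thresholding rule here are illustrative, not the paper's exact procedure): starting from a dense weight matrix, zero out the smallest-magnitude fraction of weights to obtain a sparse topology.

```python
import numpy as np

def magnitude_prune(W, sparsity):
    # Zero out roughly the smallest-magnitude `sparsity` fraction of
    # weights, turning an initially dense matrix into a sparse one.
    k = int(W.size * sparsity)
    if k == 0:
        return W.copy()
    # k-th smallest absolute value becomes the pruning threshold.
    threshold = np.partition(np.abs(W).ravel(), k - 1)[k - 1]
    pruned = W.copy()
    # Note: ties at the threshold are also removed, so the achieved
    # sparsity can slightly exceed the requested fraction.
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

W = np.array([[1.0, -2.0, 3.0],
              [-4.0, 5.0, 6.0]])
sparse_W = magnitude_prune(W, 0.5)  # removes the 3 smallest-|w| entries
```

RadiX-Nets, by contrast, are constructed sparse from the start, with connectivity guarantees by design rather than inherited from a trained dense network.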