no code implementations • 14 Dec 2022 • Gaurav Iyer, Boris Hanin, David Rolnick
Training a neural network requires choosing a suitable learning rate, which involves a trade-off between speed and effectiveness of convergence.
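As a toy illustration of that trade-off (a sketch, not from the paper), gradient descent on the one-dimensional quadratic f(w) = w² converges slowly when the learning rate is too small, quickly when it is well chosen, and diverges when it is too large:

```python
# Toy sketch (not from the paper): gradient descent on f(w) = w^2,
# whose gradient is 2w, under three learning rates.
def gd(lr, steps=20, w=1.0):
    for _ in range(steps):
        w -= lr * 2 * w
    return w

for lr in (0.01, 0.4, 1.1):  # too small, well chosen, too large
    print(f"lr={lr}: |w| after 20 steps = {abs(gd(lr)):.3g}")
```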
no code implementations • 22 Nov 2022 • Alexandre Duval, Victor Schmidt, Santiago Miret, Yoshua Bengio, Alex Hernández-García, David Rolnick
Catalyst materials play a crucial role in the electrochemical reactions involved in a great number of industrial processes key to the transition to low-carbon energy, such as renewable energy storage and electrofuel synthesis.
no code implementations • 24 Oct 2022 • Setareh Cohan, Nam Hee Kim, David Rolnick, Michiel Van de Panne
Empirically, we find that the region density increases only moderately throughout training, as measured along fixed trajectories coming from the final policy.
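A minimal sketch of the kind of measurement involved, assuming a small random ReLU network: the number of linear regions crossed along a segment can be estimated by sampling densely and counting changes in the ReLU activation pattern (dense sampling can miss very thin regions, so this is a lower bound):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-hidden-layer ReLU network on R^4; only the hidden
# activations matter for identifying the linear region.
W1, b1 = rng.normal(size=(32, 4)), rng.normal(size=32)
W2, b2 = rng.normal(size=(32, 32)), rng.normal(size=32)

def activation_pattern(x):
    h1 = W1 @ x + b1
    h2 = W2 @ np.maximum(h1, 0) + b2
    return tuple(np.concatenate([h1 > 0, h2 > 0]))

def regions_along(a, b, samples=10_000):
    """Estimate the number of linear regions crossed on the segment a -> b."""
    ts = np.linspace(0.0, 1.0, samples)
    patterns = [activation_pattern(a + t * (b - a)) for t in ts]
    return 1 + sum(p != q for p, q in zip(patterns, patterns[1:]))

a, b = rng.normal(size=4), rng.normal(size=4)
print("regions crossed:", regions_along(a, b))
```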
1 code implementation • 24 Aug 2022 • Alexandra Sasha Luccioni, David Rolnick
We find that many of the classes are ill-defined or overlapping, and that 12% of the images are incorrectly labeled, with some classes having >90% of images incorrect.
1 code implementation • 8 Aug 2022 • Paula Harder, Venkatesh Ramesh, Alex Hernandez-Garcia, Qidong Yang, Prasanna Sattigeri, Daniela Szwarcman, Campbell Watson, David Rolnick
In order to conserve physical quantities, we develop methods that guarantee physical constraints are satisfied by a deep learning downscaling model while also improving its performance according to traditional metrics.
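One family of such methods enforces the constraint exactly with a renormalization layer at the output. The sketch below (illustrative, not the paper's exact layer) rescales each high-resolution patch so that it averages exactly to the corresponding low-resolution value, conserving the total:

```python
import numpy as np

def enforce_mass_conservation(hi_res, lo_res, k=2):
    """Rescale each k x k patch of hi_res so its mean equals the lo_res pixel."""
    H, W = lo_res.shape
    patches = hi_res.reshape(H, k, W, k)
    patch_means = patches.mean(axis=(1, 3), keepdims=True)
    patches = patches * (lo_res.reshape(H, 1, W, 1) / patch_means)
    return patches.reshape(H * k, W * k)

lo = np.random.rand(4, 4) + 0.1   # low-res field (e.g. precipitation)
hi = np.random.rand(8, 8) + 0.1   # raw model output at 2x resolution
hi = enforce_mass_conservation(hi, lo)
# Each 2x2 patch of hi now averages exactly to the matching lo pixel.
assert np.allclose(hi.reshape(4, 2, 4, 2).mean(axis=(1, 3)), lo)
```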
1 code implementation • 9 Jun 2022 • Giancarlo Kerg, Sarthak Mittal, David Rolnick, Yoshua Bengio, Blake Richards, Guillaume Lajoie
Recent work has explored how forcing relational representations to remain distinct from sensory representations, as appears to be the case in the brain, can help artificial systems.
1 code implementation • 4 Feb 2022 • Gabriel Tseng, Hannah Kerner, David Rolnick
When developing algorithms for data-sparse regions, a natural approach is to use transfer learning from data-rich regions.
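A minimal sketch of that approach in PyTorch, with stand-in data and shapes (all names here are hypothetical): pretrain on the data-rich region, then freeze the backbone and fine-tune only the head on the data-sparse region:

```python
import torch, torch.nn as nn

backbone = nn.Sequential(nn.Linear(12, 64), nn.ReLU(), nn.Linear(64, 64), nn.ReLU())
head = nn.Linear(64, 2)
model = nn.Sequential(backbone, head)

def train(model, params, X, y, steps=200, lr=1e-3):
    opt = torch.optim.Adam(params, lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = nn.functional.cross_entropy(model(X), y)
        loss.backward()
        opt.step()

# Stand-in data: many labels in the data-rich region, few in the sparse one.
X_rich, y_rich = torch.randn(1000, 12), torch.randint(0, 2, (1000,))
X_sparse, y_sparse = torch.randn(50, 12), torch.randint(0, 2, (50,))

train(model, model.parameters(), X_rich, y_rich)     # pretrain on data-rich region
for p in backbone.parameters():                      # freeze the backbone
    p.requires_grad_(False)
train(model, head.parameters(), X_sparse, y_sparse)  # fine-tune the head only
```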
1 code implementation • 29 Nov 2021 • Salva Rühling Cachay, Venkatesh Ramesh, Jason N. S. Cole, Howard Barker, David Rolnick
Numerical simulations of Earth's weather and climate require substantial amounts of computation.
1 code implementation • NeurIPS 2021 • Sever Topan, David Rolnick, Xujie Si
Many experts argue that the future of artificial intelligence is limited by the field's ability to integrate symbolic logical reasoning into deep learning architectures.
1 code implementation • ICLR 2021 • Priya L. Donti, David Rolnick, J. Zico Kolter
Large optimization problems with hard constraints arise in many settings, yet classical solvers are often prohibitively slow, motivating the use of deep networks as cheap "approximate solvers."
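A hedged sketch of the completion/correction idea for a toy problem min f(y) s.t. Ay = b, Gy ≤ h: solve the equalities exactly for some variables given the network's partial output, then reduce inequality violations by gradient descent through the completion (problem data here are illustrative, and the paper's method differs in detail):

```python
import torch

torch.manual_seed(0)
A, b = torch.randn(2, 5), torch.randn(2)   # equality constraints A y = b
G, h = torch.randn(3, 5), torch.randn(3)   # inequality constraints G y <= h

def complete(z):
    # Given 3 free variables z (the network's partial output), solve the
    # equality constraints exactly for the remaining 2 variables.
    A_free, A_dep = A[:, :3], A[:, 3:]
    y_dep = torch.linalg.solve(A_dep, b - A_free @ z)
    return torch.cat([z, y_dep])

def correct(z, steps=100, lr=0.1):
    # Reduce inequality violation by gradient descent in z-space; because we
    # differentiate through complete(), equalities stay satisfied throughout.
    z = z.clone().requires_grad_(True)
    for _ in range(steps):
        violation = torch.relu(G @ complete(z) - h).pow(2).sum()
        if violation.item() == 0:
            break
        (g,) = torch.autograd.grad(violation, z)
        z = (z - lr * g).detach().requires_grad_(True)
    return z.detach()

z0 = torch.randn(3)                  # stand-in for a network's partial output
y = complete(correct(z0))
print("max equality residual:", (A @ y - b).abs().max().item())
print("max inequality violation:", torch.relu(G @ y - h).max().item())
```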
no code implementations • 21 Mar 2021 • Charles A. Kantor, Marta Skreta, Brice Rauby, Léonard Boussioux, Emmanuel Jehanno, Alexandra Luccioni, David Rolnick, Hugues Talbot
Fine-grained classification aims to distinguish items that share a similar global appearance but differ in minute details.
no code implementations • ICLR 2022 • Boris Hanin, Ryan Jeong, David Rolnick
Assessing the complexity of functions computed by a neural network helps us understand how the network will learn and generalize.
no code implementations • ICML 2020 • David Rolnick, Konrad P. Kording
It has been widely assumed that a neural network cannot be recovered from its outputs, as the network depends on its parameters in a highly nonlinear way.
no code implementations • 25 Sep 2019 • David Rolnick, Konrad P. Kording
The output of a neural network depends on its parameters in a highly nonlinear way, and it is widely assumed that a network's parameters cannot be identified from its outputs.
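A toy numerical illustration of the observation underlying this line of work (not the full algorithm): a ReLU network's directional derivative is piecewise constant and jumps where some neuron's preactivation crosses zero, so boundary points, and from them the network's hyperplanes, can be located from black-box queries alone:

```python
import numpy as np

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(8, 3)), rng.normal(size=8)
w2 = rng.normal(size=8)

def net(x):  # treated as a black box R^3 -> R
    return w2 @ np.maximum(W1 @ x + b1, 0)

def slope(t, a, d, eps=1e-6):
    # Finite-difference directional derivative along the line a + t*d.
    return (net(a + (t + eps) * d) - net(a + t * d)) / eps

a, d = rng.normal(size=3), rng.normal(size=3)
lo, hi = 0.0, 1.0
if not np.isclose(slope(lo, a, d), slope(hi, a, d)):
    for _ in range(30):  # bisect to locate a slope jump (a region boundary)
        mid = (lo + hi) / 2
        if np.isclose(slope(lo, a, d), slope(mid, a, d)):
            lo = mid
        else:
            hi = mid
    x_star = a + lo * d
    # One preactivation should be near zero at the recovered boundary point.
    print("preactivations at boundary point:", W1 @ x_star + b1)
```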
3 code implementations • 10 Jun 2019 • David Rolnick, Priya L. Donti, Lynn H. Kaack, Kelly Kochanski, Alexandre Lacoste, Kris Sankaran, Andrew Slavin Ross, Nikola Milojevic-Dupont, Natasha Jaques, Anna Waldman-Brown, Alexandra Luccioni, Tegan Maharaj, Evan D. Sherwin, S. Karthik Mukkavilli, Konrad P. Kording, Carla Gomes, Andrew Y. Ng, Demis Hassabis, John C. Platt, Felix Creutzig, Jennifer Chayes, Yoshua Bengio
Climate change is one of the greatest challenges facing humanity, and we, as machine learning experts, may wonder how we can help.
no code implementations • NeurIPS 2019 • Boris Hanin, David Rolnick
The success of deep networks has been attributed in part to their expressivity: per parameter, deep networks can approximate a richer class of functions than shallow networks.
no code implementations • 25 Jan 2019 • Boris Hanin, David Rolnick
It is well-known that the expressivity of a neural network depends on its architecture, with deeper networks expressing more complex functions.
no code implementations • CVPR 2019 • Yaron Meirovitch, Lu Mi, Hayk Saribekyan, Alexander Matveev, David Rolnick, Nir Shavit
Pixel-accurate tracking of objects is a key element in many computer vision applications, often solved either by tracking individual objects one at a time or by instance segmentation followed by object matching.
no code implementations • ICLR 2019 • David Rolnick, Arun Ahuja, Jonathan Schwarz, Timothy P. Lillicrap, Greg Wayne
We examine catastrophic forgetting in the context of reinforcement learning, in a setting where an agent is exposed to tasks in a sequence.
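A minimal sketch of the replay ingredient, using a hypothetical ReplayMixer helper: gradient updates are computed on a mixture of fresh experience and a buffer of past experience, which counteracts forgetting (the paper's method additionally uses off-policy corrections and behavioral cloning, omitted here):

```python
import random
from collections import deque

class ReplayMixer:
    """Mix fresh transitions with replayed ones from a bounded buffer."""

    def __init__(self, capacity=100_000, replay_fraction=0.5):
        self.buffer = deque(maxlen=capacity)
        self.replay_fraction = replay_fraction

    def add(self, transitions):
        self.buffer.extend(transitions)

    def mixed_batch(self, fresh, batch_size=64):
        n_replay = int(batch_size * self.replay_fraction)
        replayed = random.sample(self.buffer, min(n_replay, len(self.buffer)))
        n_fresh = batch_size - len(replayed)
        return random.sample(fresh, min(n_fresh, len(fresh))) + replayed
```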
no code implementations • ICLR 2019 • Ari S. Benjamin, David Rolnick, Konrad Kording
To optimize a neural network one often thinks of optimizing its parameters, but it is ultimately a matter of optimizing the function that maps inputs to outputs.
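A small sketch contrasting the two views (shapes and the perturbation are illustrative): the L2 distance between two checkpoints in parameter space versus the root-mean-square distance between their outputs on a batch of probe inputs:

```python
import torch, torch.nn as nn

torch.manual_seed(0)
net_a = nn.Sequential(nn.Linear(10, 50), nn.ReLU(), nn.Linear(50, 1))
net_b = nn.Sequential(nn.Linear(10, 50), nn.ReLU(), nn.Linear(50, 1))
net_b.load_state_dict(net_a.state_dict())
with torch.no_grad():
    for p in net_b.parameters():
        p.add_(0.1 * torch.randn_like(p))  # perturb one copy in parameter space

X = torch.randn(1000, 10)                  # probe inputs
with torch.no_grad():
    param_dist = sum((pa - pb).pow(2).sum() for pa, pb in
                     zip(net_a.parameters(), net_b.parameters())).sqrt()
    func_dist = (net_a(X) - net_b(X)).pow(2).mean().sqrt()
print(f"parameter-space L2: {param_dist:.3f}, function-space RMS: {func_dist:.3f}")
```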
no code implementations • NeurIPS 2018 • Boris Hanin, David Rolnick
We identify and study two common failure modes for early training in deep ReLU nets.
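A hedged sketch of one such failure mode, exploding or vanishing activation magnitudes: with weight variance c/fan-in, the mean squared activation of a deep ReLU stack is multiplied by roughly c/2 per layer, so it is stable only near the critical choice c = 2 (He initialization):

```python
import numpy as np

rng = np.random.default_rng(0)
depth, width = 50, 100
x0 = rng.normal(size=width)

for c in (1.5, 2.0, 2.5):  # weight variance = c / fan_in
    x = x0.copy()
    for _ in range(depth):
        W = rng.normal(scale=np.sqrt(c / width), size=(width, width))
        x = np.maximum(W @ x, 0)
    print(f"var = {c}/fan_in: mean sq activation ~ {np.mean(x**2):.2e}")
```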
no code implementations • ICLR 2018 • David Rolnick, Andreas Veit, Serge Belongie, Nir Shavit
Deep neural networks trained on large supervised datasets have led to impressive results in image classification and other tasks.
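A minimal sketch of a uniform label-noise setup of the kind studied in this line of work (a hypothetical helper, not the paper's exact protocol): each label is replaced, with probability p, by a class drawn uniformly at random:

```python
import numpy as np

def corrupt_labels(y, num_classes, p, rng):
    """Replace each label, with probability p, by a uniformly random class."""
    y = y.copy()
    mask = rng.random(len(y)) < p
    y[mask] = rng.integers(0, num_classes, size=mask.sum())
    return y

rng = np.random.default_rng(0)
y_clean = rng.integers(0, 10, size=1000)
y_noisy = corrupt_labels(y_clean, num_classes=10, p=0.9, rng=rng)
print("fraction of labels changed:", np.mean(y_noisy != y_clean))
```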
no code implementations • 30 May 2017 • David Rolnick, Yaron Meirovitch, Toufiq Parag, Hanspeter Pfister, Viren Jain, Jeff W. Lichtman, Edward S. Boyden, Nir Shavit
Deep learning algorithms for connectomics rely upon localized classification, rather than overall morphology.
no code implementations • ICLR 2018 • David Rolnick, Max Tegmark
It is well-known that neural networks are universal approximators, but that deeper networks tend in practice to be more powerful than shallower ones.
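As a rough statement of the headline separation, paraphrased from memory rather than quoted (see the paper for the precise theorem), for the product function:

```latex
p(x) = x_1 x_2 \cdots x_n:\qquad
\underbrace{2^{n}\ \text{neurons are required}}_{\text{single hidden layer}}
\qquad\text{vs.}\qquad
\underbrace{O(n)\ \text{neurons suffice}}_{\text{deep network}}
```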
no code implementations • 7 Dec 2016 • Yaron Meirovitch, Alexander Matveev, Hayk Saribekyan, David Budden, David Rolnick, Gergely Odor, Seymour Knowles-Barley, Thouis Raymond Jones, Hanspeter Pfister, Jeff William Lichtman, Nir Shavit
The field of connectomics faces unprecedented "big data" challenges.
no code implementations • 29 Aug 2016 • Henry W. Lin, Max Tegmark, David Rolnick
We show how the success of deep learning could depend not only on mathematics but also on physics: although well-known mathematical theorems guarantee that neural networks can approximate arbitrary functions well, the class of functions of practical interest can frequently be approximated through "cheap learning" with exponentially fewer parameters than generic ones.
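The "cheap learning" construction rests in part on a four-neuron multiplication gate. The sketch below checks the underlying identity numerically, using softplus as the nonlinearity (its second derivative at 0 is 1/4; any smooth activation with nonzero second derivative at 0 works):

```python
import numpy as np

def softplus(u):
    return np.log1p(np.exp(u))

def approx_product(x, y, lam=1e-2, sigma=softplus, d2=0.25):
    # Four evaluations of sigma at scaled sums/differences; a Taylor expansion
    # shows the combination equals 4 * d2 * lam^2 * x * y + O(lam^4).
    s = (sigma(lam * (x + y)) + sigma(-lam * (x + y))
         - sigma(lam * (x - y)) - sigma(-lam * (x - y)))
    return s / (4 * d2 * lam**2)

x, y = 1.3, -0.7
print(approx_product(x, y), "vs exact", x * y)  # agrees to ~lam^2 error
```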