Search Results for author: Amitabh Basu

Found 8 papers, 2 papers with code

Neural networks with linear threshold activations: structure and algorithms

no code implementations • 15 Nov 2021 • Sammy Khalife, Hongyu Cheng, Amitabh Basu

We precisely characterize the class of functions representable by such neural networks and show that two hidden layers are necessary and sufficient to represent any function in this class.
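A linear threshold unit outputs 1 when its weighted input sum meets a threshold and 0 otherwise. As an illustrative sketch (the specific network and weights below are a textbook-style example, not taken from the paper), a small threshold network can compute XOR, a function no single threshold unit can represent:

```python
def threshold(z):
    """Linear threshold activation: 1 if z >= 0, else 0."""
    return 1.0 if z >= 0 else 0.0

def xor_net(x1, x2):
    """XOR via a network of linear threshold units (hypothetical example)."""
    s = float(x1) + float(x2)
    h1 = threshold(s - 1.0)        # fires when x1 + x2 >= 1
    h2 = threshold(s - 2.0)        # fires when x1 + x2 >= 2
    # Output unit fires when exactly one input is 1.
    return int(threshold(h1 - h2 - 1.0))
```

Here `xor_net(0, 0)` and `xor_net(1, 1)` return 0 while the mixed inputs return 1; the paper's question is which functions such networks can represent in general and how many layers that requires.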

Towards Lower Bounds on the Depth of ReLU Neural Networks

1 code implementation • NeurIPS 2021 • Christoph Hertrich, Amitabh Basu, Marco Di Summa, Martin Skutella

We contribute to a better understanding of the class of functions that is represented by a neural network with ReLU activations and a given architecture.
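Functions computed by ReLU networks are continuous and piecewise linear, and a classic identity shows how the maximum of two numbers fits in a shallow network while maxima of more inputs invite deeper compositions. A minimal sketch (the helper names are illustrative, not from the paper):

```python
def relu(z):
    """ReLU activation."""
    return max(z, 0.0)

def max2(x, y):
    # max(x, y) = y + relu(x - y): a single hidden ReLU suffices.
    return y + relu(x - y)

def max4(a, b, c, d):
    # A balanced tree of max2 gates gives a deeper ReLU network;
    # whether this extra depth is unavoidable is the kind of
    # lower-bound question the paper investigates.
    return max2(max2(a, b), max2(c, d))
```

For example, `max4(1.0, 7.0, 3.0, 2.0)` evaluates to `7.0` through two levels of `max2`.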

Enumerating integer points in polytopes with bounded subdeterminants

no code implementations • 19 Feb 2021 • Amitabh Basu, Hongyi Jiang

We show that one can enumerate the vertices of the convex hull of integer points in polytopes whose constraint matrices have bounded and nonzero subdeterminants, in time polynomial in the dimension and encoding size of the polytope.
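To make the setting concrete, here is a brute-force sketch that enumerates integer points of a polytope {x : Ax ≤ b} over a bounding box. This is for illustration only; the paper's contribution is a polynomial-time algorithm under bounded-subdeterminant assumptions, which this naive search does not implement:

```python
import itertools
import numpy as np

def integer_points(A, b, box):
    """Enumerate integer points x in [-box, box]^n satisfying A @ x <= b.

    Naive exponential-time search over the box, shown only to fix ideas;
    it does not use the subdeterminant structure the paper exploits."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    n = A.shape[1]
    points = []
    for x in itertools.product(range(-box, box + 1), repeat=n):
        if np.all(A @ np.array(x, dtype=float) <= b):
            points.append(x)
    return points

# Example: the square {0 <= x <= 2, 0 <= y <= 2} written as Ax <= b.
A = [[-1, 0], [1, 0], [0, -1], [0, 1]]
b = [0, 2, 0, 2]
pts = integer_points(A, b, box=3)  # the 9 lattice points of the square
```

The convex-hull vertices of these points are what the paper's algorithm enumerates efficiently.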

Combinatorics; Optimization and Control; MSC classes 90C10, 90C57, 90C60

Lower bounds over Boolean inputs for deep neural networks with ReLU gates

no code implementations • 8 Nov 2017 • Anirbit Mukherjee, Amitabh Basu

We use the method of sign-rank to show lower bounds, exponential in the dimension, for ReLU circuits that end in an LTF gate and have depth up to $O(n^{\xi})$ with $\xi < \frac{1}{8}$, under some restrictions on the weights in the bottommost layer.

Understanding Deep Neural Networks with Rectified Linear Units

no code implementations • ICLR 2018 • Raman Arora, Amitabh Basu, Poorya Mianjy, Anirbit Mukherjee

In this paper we investigate the family of functions representable by deep neural networks (DNN) with rectified linear units (ReLU).
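A standard example in this literature of how depth increases the number of linear pieces a ReLU network can produce is the composed triangle ("sawtooth") map: each composition doubles the piece count. A hedged sketch (function names are illustrative):

```python
def relu(z):
    """ReLU activation."""
    return max(z, 0.0)

def tent(x):
    # Triangle map on [0, 1] with 2 linear pieces, written as a
    # two-unit ReLU layer: 2*relu(x) - 4*relu(x - 0.5).
    return 2 * relu(x) - 4 * relu(x - 0.5)

def sawtooth(x, depth):
    # Composing tent with itself k times yields a piecewise linear
    # function with 2^k pieces, a standard witness that added depth
    # can buy exponentially more linear regions.
    for _ in range(depth):
        x = tent(x)
    return x
```

For instance, `tent(0.25)` is `0.5`, and `sawtooth(0.25, 2)` composes the map twice to reach `1.0`.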
