Search Results for author: Vaggos Chatziafratis

Found 8 papers, 1 paper with code

Expressivity of Neural Networks via Chaotic Itineraries beyond Sharkovsky's Theorem

no code implementations 19 Oct 2021 Clayton Sanford, Vaggos Chatziafratis

Given a target function $f$, how large must a neural network be in order to approximate $f$?

From Trees to Continuous Embeddings and Back: Hyperbolic Hierarchical Clustering

2 code implementations NeurIPS 2020 Ines Chami, Albert Gu, Vaggos Chatziafratis, Christopher Ré

Recently, Dasgupta reframed HC as a discrete optimization problem by introducing a global cost function measuring the quality of a given tree.
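As a rough illustration of the kind of global cost function Dasgupta introduced, here is a minimal sketch that scores a binary merge tree by summing, over every pair of leaves, their similarity times the size of the subtree rooted at their lowest common ancestor (the function name and the toy similarity values are ours, not from the paper):

```python
def dasgupta_cost(tree, sim):
    """Dasgupta-style cost of a hierarchical clustering tree.

    tree: nested tuples of leaf indices, e.g. ((0, 1), (2, 3)).
    sim: dict mapping frozenset({i, j}) -> similarity w_ij.
    Good trees separate similar points late (low in the tree),
    so lower cost means a better hierarchy.
    """
    def leaves(t):
        return [t] if isinstance(t, int) else leaves(t[0]) + leaves(t[1])

    def cost(t):
        if isinstance(t, int):
            return 0.0
        left, right = leaves(t[0]), leaves(t[1])
        n = len(left) + len(right)
        # Pairs split at this node pay w_ij * |subtree at this node|.
        split = n * sum(sim.get(frozenset((i, j)), 0.0)
                        for i in left for j in right)
        return split + cost(t[0]) + cost(t[1])

    return cost(tree)

# Toy instance: 0,1 are similar, 2,3 are similar, cross pairs are not.
sim = {frozenset((0, 1)): 1.0, frozenset((2, 3)): 1.0,
       frozenset((0, 2)): 0.1, frozenset((0, 3)): 0.1,
       frozenset((1, 2)): 0.1, frozenset((1, 3)): 0.1}
good = dasgupta_cost(((0, 1), (2, 3)), sim)  # merges similar pairs first
bad = dasgupta_cost(((0, 2), (1, 3)), sim)   # merges dissimilar pairs first
```

On this toy instance the tree that merges the similar pairs first achieves the lower cost, matching the intuition the cost function is designed to capture.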

Better Depth-Width Trade-offs for Neural Networks through the lens of Dynamical Systems

no code implementations ICML 2020 Vaggos Chatziafratis, Sai Ganesh Nagarajan, Ioannis Panageas

The expressivity of neural networks as a function of their depth, width and type of activation units has been an important question in deep learning theory.


Depth-Width Trade-offs for ReLU Networks via Sharkovsky's Theorem

no code implementations ICLR 2020 Vaggos Chatziafratis, Sai Ganesh Nagarajan, Ioannis Panageas, Xiao Wang

Motivated by our observation that the triangle waves used in Telgarsky's work contain points of period 3 (a period that is special in that it implies chaotic behavior, by the celebrated Li-Yorke result), we proceed to give general lower bounds on the width needed to represent periodic functions as a function of the depth.
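The depth-vs-width phenomenon behind these lower bounds can be seen numerically with the standard triangle ("tent") wave: each composition doubles the number of oscillations, so depth produces exponentially many linear pieces that a shallow network would need exponential width to match. A small sketch, counting level-crossings on a grid (the helper names here are ours):

```python
def tent(x):
    # Triangle wave on [0, 1]; representable by a tiny one-layer ReLU net.
    return 2 * x if x <= 0.5 else 2 * (1 - x)

def compose(f, k):
    # k-fold composition f(f(...f(x)...)), i.e. a depth-k network.
    def g(x):
        for _ in range(k):
            x = f(x)
        return x
    return g

def crossings(f, n=10000):
    # Count crossings of the level 1/2 on a fine grid: a proxy for the
    # number of oscillations (linear pieces) of f.
    ys = [f(i / n) > 0.5 for i in range(n + 1)]
    return sum(1 for a, b in zip(ys, ys[1:]) if a != b)

# Oscillations double with each extra level of composition: 2, 4, 8, ...
counts = [crossings(compose(tent, k)) for k in range(1, 6)]
```

The tent map's k-fold composite is piecewise linear with 2^k monotone pieces, each sweeping through the level 1/2 exactly once, which is why the count doubles at every step.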

Adversarially Robust Low Dimensional Representations

no code implementations 29 Nov 2019 Pranjal Awasthi, Vaggos Chatziafratis, Xue Chen, Aravindan Vijayaraghavan

In particular, our adversarially robust PCA primitive leads to computationally efficient and robust algorithms for both unsupervised and supervised learning problems such as clustering and learning adversarially robust classifiers.

Hierarchical Clustering better than Average-Linkage

no code implementations 7 Aug 2018 Moses Charikar, Vaggos Chatziafratis, Rad Niazadeh

Hierarchical Clustering (HC) is a widely studied problem in exploratory data analysis, usually tackled by simple agglomerative procedures like average-linkage, single-linkage or complete-linkage.

On the Computational Power of Online Gradient Descent

no code implementations 3 Jul 2018 Vaggos Chatziafratis, Tim Roughgarden, Joshua R. Wang

We prove that the evolution of weight vectors in online gradient descent can encode arbitrary polynomial-space computations, even in very simple learning settings.

Hierarchical Clustering with Structural Constraints

no code implementations ICML 2018 Vaggos Chatziafratis, Rad Niazadeh, Moses Charikar

For many real-world applications, we would like to exploit prior information about the data that imposes constraints on the clustering hierarchy, and is not captured by the set of features available to the algorithm.
