Search Results for author: Hrayr Harutyunyan

Found 15 papers, 11 papers with code

Fast structure learning with modular regularization

3 code implementations · NeurIPS 2019 · Greg Ver Steeg, Hrayr Harutyunyan, Daniel Moyer, Aram Galstyan

We also use our approach for estimating covariance structure for a number of real-world datasets and show that it consistently outperforms state-of-the-art estimators at a fraction of the computational cost.

Disentangled Representations via Synergy Minimization

1 code implementation · 10 Oct 2017 · Greg Ver Steeg, Rob Brekelmans, Hrayr Harutyunyan, Aram Galstyan

Scientists often seek simplified representations of complex systems to facilitate prediction and understanding.

Efficient Covariance Estimation from Temporal Data

2 code implementations · 30 May 2019 · Hrayr Harutyunyan, Daniel Moyer, Hrant Khachatrian, Greg Ver Steeg, Aram Galstyan

Estimating the covariance structure of multivariate time series is a fundamental problem with a wide range of real-world applications -- from financial modeling to fMRI analysis.
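As context for what such estimators compute, here is a minimal NumPy sketch of the standard sample covariance of a multivariate time series. This is only an illustrative baseline, not the paper's method; the data shape and all variable names are assumptions for the example.

```python
import numpy as np

# Illustrative baseline only: the paper proposes a more sophisticated
# estimator for temporal data; this sketch shows the plain sample
# covariance that such methods aim to improve on. Rows are time steps,
# columns are variables (both sizes chosen arbitrarily here).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))  # 200 time steps, 5 variables

# Center each variable, then form the (unbiased) sample covariance.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (X.shape[0] - 1)

# Matches NumPy's built-in estimator; result is a symmetric 5x5 matrix.
assert np.allclose(cov, np.cov(X, rowvar=False))
```

The difficulty the paper addresses is that when the number of variables is large relative to the number of time steps, this plain estimator becomes unreliable, which is why structured or regularized estimators are needed.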

Time Series · Time Series Analysis

Estimating informativeness of samples with Smooth Unique Information

1 code implementation · ICLR 2021 · Hrayr Harutyunyan, Alessandro Achille, Giovanni Paolini, Orchid Majumder, Avinash Ravichandran, Rahul Bhotika, Stefano Soatto

We define a notion of information that an individual sample provides to the training of a neural network, and we specialize it to measure both how much a sample informs the final weights and how much it informs the function computed by the weights.

Informativeness

Information-theoretic generalization bounds for black-box learning algorithms

1 code implementation · NeurIPS 2021 · Hrayr Harutyunyan, Maxim Raginsky, Greg Ver Steeg, Aram Galstyan

We derive information-theoretic generalization bounds for supervised learning algorithms based on the information contained in predictions rather than in the output of the training algorithm.
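The quantity that generalization bounds of this kind control is the gap between training and test performance. The toy sketch below, with an assumed nearest-centroid classifier and synthetic data, just computes that gap; the paper's actual bounds involve information-theoretic quantities not shown here.

```python
import numpy as np

# Illustrative only: compute the train/test generalization gap for a
# trivial nearest-centroid classifier on synthetic two-class data.
# Everything here (data, model, sizes) is an assumption for the example.
rng = np.random.default_rng(1)

def make_data(n):
    y = rng.integers(0, 2, n)
    X = rng.standard_normal((n, 2)) + 2.0 * y[:, None]  # shifted classes
    return X, y

X_tr, y_tr = make_data(100)
X_te, y_te = make_data(100)

# "Training" = compute one centroid per class from the training set.
centroids = np.stack([X_tr[y_tr == c].mean(axis=0) for c in (0, 1)])

def predict(X):
    # Assign each point to the nearest centroid.
    dists = ((X[:, None, :] - centroids[None]) ** 2).sum(axis=-1)
    return np.argmin(dists, axis=1)

train_acc = (predict(X_tr) == y_tr).mean()
test_acc = (predict(X_te) == y_te).mean()
gap = train_acc - test_acc  # the generalization gap that bounds control
```

Basing the bound on the model's predictions rather than its weights, as the paper does, is what makes it applicable to "black-box" learners whose internal parameters are inaccessible or very high-dimensional.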

Generalization Bounds

Formal limitations of sample-wise information-theoretic generalization bounds

no code implementations · 13 May 2022 · Hrayr Harutyunyan, Greg Ver Steeg, Aram Galstyan

Remarkably, PAC-Bayes, single-draw, and expected squared generalization gap bounds that depend on information in pairs of examples do exist.

Generalization Bounds

Identifying and Disentangling Spurious Features in Pretrained Image Representations

no code implementations · 22 Jun 2023 · Rafayel Darbinyan, Hrayr Harutyunyan, Aram H. Markosyan, Hrant Khachatrian

Neural networks rely on spurious correlations in their predictions, resulting in degraded performance when these correlations do not hold.

On information captured by neural networks: connections with memorization and generalization

1 code implementation · 28 Jun 2023 · Hrayr Harutyunyan

Despite the popularity and success of deep learning, there is limited understanding of when, how, and why neural networks generalize to unseen examples.

Informativeness · Knowledge Distillation · +1
