Search Results for author: Sung Min Park

Found 9 papers, 6 papers with code

FFCV: Accelerating Training by Removing Data Bottlenecks

2 code implementations · CVPR 2023 · Guillaume Leclerc, Andrew Ilyas, Logan Engstrom, Sung Min Park, Hadi Salman, Aleksander Madry

For example, we are able to train an ImageNet ResNet-50 model to 75% accuracy in only 20 minutes on a single machine.

TRAK: Attributing Model Behavior at Scale

2 code implementations · 24 Mar 2023 · Sung Min Park, Kristian Georgiev, Andrew Ilyas, Guillaume Leclerc, Aleksander Madry

Computationally tractable methods can struggle to accurately attribute model predictions in non-convex settings (e.g., deep neural networks), while methods that are effective in such regimes require training thousands of models, which makes them impractical for large models or datasets.
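The core idea behind scalable gradient-based attribution, in the spirit of TRAK, can be sketched for a linear model: compute per-example loss gradients at the trained parameters, randomly project them to a low dimension, and score train-test pairs through a regularized kernel. The toy sketch below on synthetic data is our own illustration of this idea, not the TRAK implementation; all names and parameter choices are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 200, 20, 8

# Synthetic, near-separable binary classification data.
w_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = (X @ w_true + 0.5 * rng.normal(size=n) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Train logistic regression with plain gradient descent.
w = np.zeros(d)
for _ in range(500):
    w -= 0.1 * (X.T @ (sigmoid(X @ w) - y)) / n

# Per-example loss gradients at the trained parameters, shape (n, d).
grads = (sigmoid(X @ w) - y)[:, None] * X

# Random projection to k dims approximately preserves inner products
# (Johnson-Lindenstrauss); this is what keeps the approach tractable.
P = rng.normal(size=(d, k)) / np.sqrt(k)
phi = grads @ P                                  # (n, k)

# Influence-style score of each training example for one test example.
x_test = rng.normal(size=d)
g_test = (sigmoid(x_test @ w) - 1.0) * x_test    # gradient assuming label 1
phi_test = g_test @ P
K = phi.T @ phi + 1e-3 * np.eye(k)               # regularized k x k kernel
scores = phi @ np.linalg.solve(K, phi_test)      # one score per train point
```

Because the projection dimension k is small, the kernel solve touches only a k x k matrix, regardless of model size.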

ModelDiff: A Framework for Comparing Learning Algorithms

1 code implementation · 22 Nov 2022 · Harshay Shah, Sung Min Park, Andrew Ilyas, Aleksander Madry

We study the problem of (learning) algorithm comparison, where the goal is to find differences between models trained with two different learning algorithms.


A Data-Based Perspective on Transfer Learning

1 code implementation · CVPR 2023 · Saachi Jain, Hadi Salman, Alaa Khaddaj, Eric Wong, Sung Min Park, Aleksander Madry

It is commonly believed that in transfer learning, including more pre-training data translates into better performance.


Datamodels: Predicting Predictions from Training Data

1 code implementation · 1 Feb 2022 · Andrew Ilyas, Sung Min Park, Logan Engstrom, Guillaume Leclerc, Aleksander Madry

We present a conceptual framework, datamodeling, for analyzing the behavior of a model class in terms of the training data.
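A datamodel, at its simplest, is a linear surrogate that predicts a model's output on a fixed example from the indicator vector of which training points were included. A minimal sketch, assuming a toy nearest-centroid "model class" and our own parameter choices (this is an illustration of the framework, not the paper's experimental setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: two Gaussian classes in 2D.
n = 40
X = np.vstack([rng.normal(-1.0, 1.0, (n // 2, 2)),
               rng.normal(+1.0, 1.0, (n // 2, 2))])
y = np.array([-1] * (n // 2) + [+1] * (n // 2))
x_test = np.array([0.2, 0.2])   # the prediction we want to attribute

def margin(subset):
    """Train a nearest-centroid classifier on `subset`; return its signed
    margin on x_test (positive means it predicts class +1)."""
    mu_neg = X[subset & (y == -1)].mean(axis=0)
    mu_pos = X[subset & (y == +1)].mean(axis=0)
    return np.linalg.norm(x_test - mu_neg) - np.linalg.norm(x_test - mu_pos)

# Sample random 50% training subsets; keep those containing both classes.
m = 2000
S = rng.random((m, n)) < 0.5
S = S[S[:, y == -1].any(axis=1) & S[:, y == +1].any(axis=1)]
outputs = np.array([margin(s) for s in S])

# Datamodel: a linear surrogate from inclusion indicators to the output.
A = np.hstack([S.astype(float), np.ones((len(S), 1))])   # bias column
coef, *_ = np.linalg.lstsq(A, outputs, rcond=None)
weights = coef[:n]   # estimated influence of each training example
```

The fitted `weights` rank training examples by how much their inclusion moves the prediction on `x_test`, which is the object the framework studies at scale.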

On Distinctive Properties of Universal Perturbations

no code implementations · 31 Dec 2021 · Sung Min Park, Kuo-An Wei, Kai Xiao, Jerry Li, Aleksander Madry

We identify properties of universal adversarial perturbations (UAPs) that distinguish them from standard adversarial perturbations.
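Unlike a standard adversarial perturbation, a universal one is a single norm-bounded vector applied to every input. A minimal illustration of this input-agnostic, class-targeted behavior for a linear model, where the optimal l-infinity perturbation has a closed form (our toy construction, not the paper's deep-network experiments):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, eps = 500, 30, 0.5

# Linear binary classifier: predict sign(x @ w).
w = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = np.sign(X @ w)

# A universal perturbation is ONE delta, bounded in l_inf norm, added to
# every input. For a linear score x @ w, the delta that maximally pushes
# all scores toward class +1 is simply eps * sign(w).
delta = eps * np.sign(w)

scores = (X + delta) @ w
# Fraction of class -1 inputs that the single delta flips to class +1.
flip_rate = np.mean((y == -1) & (scores > 0)) / np.mean(y == -1)
```

The same `delta` flips most class -1 inputs at once, illustrating how universal perturbations act more like a targeted class direction than a per-input attack.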

Non-robust Features through the Lens of Universal Perturbations

no code implementations · 1 Jan 2021 · Sung Min Park, Kuo-An Wei, Kai Yuanqing Xiao, Jerry Li, Aleksander Madry

We study universal adversarial perturbations and demonstrate that the standard picture of adversarial perturbations arising from non-robust features is more nuanced.

Sparse PCA from Sparse Linear Regression

no code implementations · NeurIPS 2018 · Guy Bresler, Sung Min Park, Madalina Persu

Sparse Principal Component Analysis (SPCA) and Sparse Linear Regression (SLR) have a wide range of applications and have attracted a tremendous amount of attention in the last two decades as canonical examples of statistical problems in high dimension.
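The reduction can be sketched in a toy spiked-covariance setting: regress one coordinate in the spike's support on the remaining coordinates using any sparse linear regression solver, and the nonzero coefficients reveal the rest of the support. The sketch below is our own illustration (ISTA-based Lasso as the SLR black box; all parameters are assumptions, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, s = 400, 50, 5

# Spiked covariance model with a sparse spike v (support = first s coords).
v = np.zeros(d)
v[:s] = 1.0 / np.sqrt(s)
theta = 2.0
z = rng.normal(size=(n, 1))
X = np.sqrt(theta) * z * v + rng.normal(size=(n, d))

# Reduction idea: sparse-regress coordinate 0 (in the support) on the rest;
# coordinates correlated with it through the spike get nonzero coefficients.
y_col = X[:, 0]
A = np.delete(X, 0, axis=1)

# Lasso via ISTA (proximal gradient) -- a stand-in for any SLR solver.
lam = 0.05
L = np.linalg.norm(A, 2) ** 2 / n     # Lipschitz constant of the gradient
beta = np.zeros(d - 1)
for _ in range(500):
    beta = beta - (A.T @ (A @ beta - y_col) / n) / L          # gradient step
    beta = np.sign(beta) * np.maximum(np.abs(beta) - lam / L, 0.0)  # prox
```

After the solve, the large entries of `beta` concentrate on the remaining support coordinates (indices 0..3 of `A`, i.e., original coordinates 1..4).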

