Search Results for author: Cyrus Rashtchian

Found 18 papers, 7 papers with code

A Closer Look at Accuracy vs. Robustness

1 code implementation • NeurIPS 2020 • Yao-Yuan Yang, Cyrus Rashtchian, Hongyang Zhang, Ruslan Salakhutdinov, Kamalika Chaudhuri

Current methods for training robust networks lead to a drop in test accuracy, which has led prior works to posit that a robustness-accuracy tradeoff may be inevitable in deep learning.

Explainable $k$-Means and $k$-Medians Clustering

3 code implementations • 28 Feb 2020 • Sanjoy Dasgupta, Nave Frost, Michal Moshkovitz, Cyrus Rashtchian

In terms of negative results, we show, first, that popular top-down decision tree algorithms may lead to clusterings with arbitrarily large cost, and second, that any tree-induced clustering must in general incur an $\Omega(\log k)$ approximation factor compared to the optimal clustering.

Clustering
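The "tree-induced clustering" studied here assigns points to clusters via axis-aligned threshold cuts. A minimal sketch of the idea (my own toy data and cost computation, not the authors' algorithm; on adversarial instances the paper shows the cost ratio can be as large as $\Omega(\log k)$):

```python
# Toy 2-D data: two well-separated clusters (a made-up example).
points = [(0.0, 0.0), (0.2, 0.1), (0.1, 0.3),   # cluster A
          (5.0, 5.0), (5.2, 4.9), (4.8, 5.1)]   # cluster B

def centroid(pts):
    n = len(pts)
    return tuple(sum(c) / n for c in zip(*pts))

def kmeans_cost(clusters):
    """Sum of squared distances to each cluster's centroid."""
    cost = 0.0
    for pts in clusters:
        cx, cy = centroid(pts)
        cost += sum((x - cx) ** 2 + (y - cy) ** 2 for x, y in pts)
    return cost

def tree_clustering(pts, axis, threshold):
    """One axis-aligned cut = a depth-1 decision tree with 2 leaves."""
    left  = [p for p in pts if p[axis] <= threshold]
    right = [p for p in pts if p[axis] > threshold]
    return [c for c in (left, right) if c]

ref_cost  = kmeans_cost([points[:3], points[3:]])        # reference partition
tree_cost = kmeans_cost(tree_clustering(points, 0, 2.5)) # explainable partition
print(ref_cost, tree_cost)  # here a single cut recovers the reference partition
```

On this easy instance the explainable clustering matches the reference cost exactly; the negative results concern instances where no small tree can do so.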

ExKMC: Expanding Explainable $k$-Means Clustering

2 code implementations • 3 Jun 2020 • Nave Frost, Michal Moshkovitz, Cyrus Rashtchian

To allow flexibility, we develop a new explainable $k$-means clustering algorithm, ExKMC, that takes an additional parameter $k' \geq k$ and outputs a decision tree with $k'$ leaves.

Clustering
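The mechanics of a tree with $k' > k$ leaves can be sketched in a few lines: build $k'$ leaves with threshold cuts, then label each leaf with whichever of the $k$ reference centers minimizes its cost. This is my own minimal reconstruction of the leaf-to-center assignment idea, not the ExKMC implementation:

```python
from collections import defaultdict

centers = [(0.0, 0.0), (6.0, 6.0)]          # k = 2 reference centers
points  = [(0.1, 0.2), (0.3, 0.1), (5.9, 6.1), (6.2, 5.8),
           (0.2, 6.0), (6.1, 0.1)]          # includes two "stray" points

def leaf_id(p):
    """Depth-2 tree: threshold on x, then on y -> k' = 4 leaves."""
    return (p[0] > 3.0, p[1] > 3.0)

def sqdist(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q))

# Group points by leaf, then label each leaf with its best center.
leaves = defaultdict(list)
for p in points:
    leaves[leaf_id(p)].append(p)

cost = 0.0
for leaf, pts in leaves.items():
    best = min(centers, key=lambda c: sum(sqdist(p, c) for p in pts))
    cost += sum(sqdist(p, best) for p in pts)
print(f"{len(leaves)} leaves mapped onto {len(centers)} centers, cost {cost:.2f}")
```

Allowing more leaves than clusters is what lets the tree trade explainability for cost: each extra leaf can re-route a region of space to a better center.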

Benchmarking Robustness to Adversarial Image Obfuscations

1 code implementation • NeurIPS 2023 • Florian Stimberg, Ayan Chakrabarti, Chun-Ta Lu, Hussein Hazimeh, Otilia Stretcu, Wei Qiao, Yintao Liu, Merve Kaya, Cyrus Rashtchian, Ariel Fuxman, Mehmet Tek, Sven Gowal

We evaluate 33 pretrained models on the benchmark and train models with different augmentations, architectures, and training methods on subsets of the obfuscations to measure generalization.

Benchmarking

Unsupervised Embedding of Hierarchical Structure in Euclidean Space

1 code implementation • 30 Oct 2020 • Jinyu Zhao, Yi Hao, Cyrus Rashtchian

To learn the embedding, we revisit using a variational autoencoder with a Gaussian mixture prior, and we show that rescaling the latent space embedding and then applying Ward's linkage-based algorithm leads to improved results for both dendrogram purity and the Moseley-Wang cost function.

Clustering
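Ward's linkage merges, at each step, the pair of clusters whose union increases within-cluster variance the least. A from-scratch sketch of that criterion on toy 1-D points (libraries such as SciPy provide this as `scipy.cluster.hierarchy.ward`; the embedding-rescaling step from the paper is not shown):

```python
def centroid(pts):
    n = len(pts)
    return tuple(sum(c) / n for c in zip(*pts))

def ward_cost(a, b):
    """Increase in within-cluster variance if clusters a and b merge."""
    ca, cb = centroid(a), centroid(b)
    d2 = sum((x - y) ** 2 for x, y in zip(ca, cb))
    return len(a) * len(b) / (len(a) + len(b)) * d2

def ward_dendrogram(points):
    clusters = [[p] for p in points]
    merges = []
    while len(clusters) > 1:
        # Pick the pair whose merge raises total variance the least.
        i, j = min(
            ((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
            key=lambda ij: ward_cost(clusters[ij[0]], clusters[ij[1]]),
        )
        merges.append((clusters[i], clusters[j]))
        merged = clusters[i] + clusters[j]
        clusters = [c for k, c in enumerate(clusters) if k not in (i, j)] + [merged]
    return merges

pts = [(0.0,), (0.1,), (1.0,), (1.1,), (5.0,)]
merges = ward_dendrogram(pts)
print(merges[0])  # the two closest points merge first
```

The sequence of merges is the dendrogram whose quality the paper measures via dendrogram purity and the Moseley-Wang cost.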

LSF-Join: Locality Sensitive Filtering for Distributed All-Pairs Set Similarity Under Skew

no code implementations • 6 Mar 2020 • Cyrus Rashtchian, Aneesh Sharma, David P. Woodruff

Theoretically, we show that LSF-Join efficiently finds most close pairs, even for small similarity thresholds and for skewed input sets.

Recommendation Systems
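The general filtering idea can be illustrated with MinHash: only pairs whose signatures collide become candidates, so exact similarity is computed for far fewer than all pairs. This is my own illustration of generic LSH-style filtering, not the distributed LSF-Join algorithm itself:

```python
import random

random.seed(0)

sets = {
    "a": {1, 2, 3, 4, 5},
    "b": {1, 2, 3, 4, 6},   # similar to "a" (Jaccard 2/3)
    "c": {7, 8, 9},         # dissimilar to both
}

def jaccard(s, t):
    return len(s & t) / len(s | t)

def minhash(s, perms):
    """One signature entry per random permutation of the universe."""
    return tuple(min(perm[x] for x in s) for perm in perms)

universe = sorted(set().union(*sets.values()))
perms = []
for _ in range(12):
    order = universe[:]
    random.shuffle(order)
    perms.append({x: r for r, x in enumerate(order)})

# Filter: only pairs colliding on at least one signature entry become
# candidates; the exact similarity check runs just on those.
sigs = {k: minhash(s, perms) for k, s in sets.items()}
keys = sorted(sets)
candidates = [
    (u, v) for i, u in enumerate(keys) for v in keys[i + 1:]
    if any(a == b for a, b in zip(sigs[u], sigs[v]))
]
close = [(u, v) for u, v in candidates if jaccard(sets[u], sets[v]) >= 0.5]
print(close)
```

For random permutations, two sets collide on a signature entry with probability exactly their Jaccard similarity, which is why the filter keeps most close pairs while discarding most distant ones.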

Vector-Matrix-Vector Queries for Solving Linear Algebra, Statistics, and Graph Problems

no code implementations • 24 Jun 2020 • Cyrus Rashtchian, David P. Woodruff, Hanlin Zhu

We consider the general problem of learning about a matrix through vector-matrix-vector queries.
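The query model is easy to state concretely: an oracle returns $u^\top M v$, and querying standard basis vectors $e_i^\top M e_j$ reads off single entries. A toy illustration of the model only (the paper's point is to beat this naive $n^2$-query baseline for structured problems):

```python
# Hidden matrix, known to the learner only through the oracle below.
M = [[3, 1], [4, 1]]

def query(u, v):
    """Oracle returning u^T M v."""
    return sum(u[i] * M[i][j] * v[j]
               for i in range(len(M)) for j in range(len(M[0])))

def e(i, n):
    """Standard basis vector e_i in R^n."""
    return [1 if k == i else 0 for k in range(n)]

n = 2
recovered = [[query(e(i, n), e(j, n)) for j in range(n)] for i in range(n)]
print(recovered)  # n^2 queries recover M exactly
```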

Explainable $k$-Means and $k$-Medians Clustering

no code implementations • ICML 2020 • Michal Moshkovitz, Sanjoy Dasgupta, Cyrus Rashtchian, Nave Frost

In terms of negative results, we show that popular top-down decision tree algorithms may lead to clusterings with arbitrarily large cost, and we prove that any explainable clustering must incur an $\Omega(\log k)$ approximation compared to the optimal clustering.

Clustering

Trace Reconstruction Problems in Computational Biology

no code implementations • 12 Oct 2020 • Vinnu Bhardwaj, Pavel A. Pevzner, Cyrus Rashtchian, Yana Safonova

The problem of reconstructing a string from its error-prone copies, the trace reconstruction problem, was introduced by Vladimir Levenshtein two decades ago.

Retrieval

Approximate Trace Reconstruction

no code implementations • 12 Dec 2020 • Sami Davies, Miklos Z. Racz, Cyrus Rashtchian, Benjamin G. Schiffer

In the usual trace reconstruction problem, the goal is to exactly reconstruct an unknown string of length $n$ after it passes through a deletion channel many times independently, producing a set of traces (i.e., random subsequences of the string).
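The deletion channel in this setup is simple to simulate: each symbol survives independently, so every trace is a random subsequence of the unknown string. A minimal sketch (toy parameters of my choosing):

```python
import random

random.seed(1)

def trace(s, q=0.3):
    """One pass through a deletion channel: drop each symbol w.p. q."""
    return "".join(c for c in s if random.random() >= q)

secret = "ACGTACGTAC"
traces = [trace(secret) for _ in range(5)]
print(traces)  # each trace is a subsequence of the secret string
```

Reconstruction then asks how many such traces are needed to recover `secret` exactly; the approximate variant below relaxes "exactly" to "up to small edit distance."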

Average-Case Communication Complexity of Statistical Problems

no code implementations • 3 Jul 2021 • Cyrus Rashtchian, David P. Woodruff, Peng Ye, Hanlin Zhu

Our motivation is to understand the statistical-computational trade-offs in streaming, sketching, and query-based models.

Lower Bounds on the Total Variation Distance Between Mixtures of Two Gaussians

no code implementations • 2 Sep 2021 • Sami Davies, Arya Mazumdar, Soumyabrata Pal, Cyrus Rashtchian

Mixtures of high dimensional Gaussian distributions have been studied extensively in statistics and learning theory.

Learning Theory
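In one dimension the total variation distance between two such mixtures can simply be computed numerically as $\mathrm{TV}(p,q) = \tfrac12 \int |p - q|$. A toy computation with parameters of my choosing (the paper proves lower bounds, in particular in high dimensions, rather than computing integrals):

```python
import math

def gauss(x, mu, sigma):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) \
        / (sigma * math.sqrt(2 * math.pi))

def mixture(x, params):
    """Density of a Gaussian mixture given (weight, mean, std) triples."""
    return sum(w * gauss(x, mu, s) for w, mu, s in params)

p = [(0.5, -1.0, 1.0), (0.5, 1.0, 1.0)]
q = [(0.5, -1.2, 1.0), (0.5, 1.2, 1.0)]

# Riemann sum over [-10, 10], wide enough to capture almost all mass.
dx = 0.001
tv = 0.5 * sum(abs(mixture(x * dx, p) - mixture(x * dx, q)) * dx
               for x in range(-10000, 10000))
print(f"TV ~ {tv:.4f}")
```

Even a visible shift in the component means yields a fairly small TV distance here, which hints at why distinguishing nearby mixtures from samples is statistically delicate.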

A Theoretical View on Sparsely Activated Networks

no code implementations • 8 Aug 2022 • Cenk Baykal, Nishanth Dikkala, Rina Panigrahy, Cyrus Rashtchian, Xin Wang

After representing LSH-based sparse networks with our model, we prove that sparse networks can match the approximation power of dense networks on Lipschitz functions.
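The routing idea behind LSH-based sparse networks can be sketched with SimHash: a random-hyperplane code maps each input to one expert block, so only that block's weights are touched. This is an illustration of the general mechanism the paper models, not its construction:

```python
import random

random.seed(0)
DIM, BITS = 4, 2                      # 2**BITS = 4 expert blocks

hyperplanes = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(BITS)]
experts = {b: [random.gauss(0, 1) for _ in range(DIM)]
           for b in range(2 ** BITS)}

def bucket(x):
    """SimHash: one bit per hyperplane, from the sign of the projection."""
    bits = 0
    for h in hyperplanes:
        proj = sum(hi * xi for hi, xi in zip(h, x))
        bits = (bits << 1) | (proj > 0)
    return bits

def sparse_forward(x):
    b = bucket(x)                     # choose a single expert
    w = experts[b]                    # the only weights evaluated
    return b, sum(wi * xi for wi, xi in zip(w, x))

x = [0.5, -1.0, 0.25, 2.0]
print(sparse_forward(x))  # (bucket id, that expert's scalar output)
```

Nearby inputs tend to fall in the same bucket, which is what lets such sparse networks approximate Lipschitz functions despite activating only a fraction of their parameters.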

Substance or Style: What Does Your Image Embedding Know?

no code implementations • 10 Jul 2023 • Cyrus Rashtchian, Charles Herrmann, Chun-Sung Ferng, Ayan Chakrabarti, Dilip Krishnan, Deqing Sun, Da-Cheng Juan, Andrew Tomkins

We find that image-text models (CLIP and ALIGN) are better at recognizing new examples of style transfer than masking-based models (CAN and MAE).

Style Transfer

DreamSync: Aligning Text-to-Image Generation with Image Understanding Feedback

no code implementations • 29 Nov 2023 • Jiao Sun, Deqing Fu, Yushi Hu, Su Wang, Royi Rassin, Da-Cheng Juan, Dana Alon, Charles Herrmann, Sjoerd van Steenkiste, Ranjay Krishna, Cyrus Rashtchian

Then, it uses two VLMs to select the best generation: a Visual Question Answering model that measures the alignment of generated images to the text, and another that measures the generation's aesthetic quality.

Question Answering • Text-to-Image Generation • +1
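The two-scorer selection step can be sketched as: generate several candidates, score each for text alignment and for aesthetics, and keep the best. The scorer functions below are stand-in stubs with made-up values, not real VLMs, and DreamSync additionally feeds the selected images back for fine-tuning:

```python
candidates = ["image_0", "image_1", "image_2"]

def alignment_score(image):      # stub for the VQA-based faithfulness score
    return {"image_0": 0.6, "image_1": 0.9, "image_2": 0.8}[image]

def aesthetic_score(image):      # stub for the aesthetic-quality score
    return {"image_0": 0.9, "image_1": 0.7, "image_2": 0.8}[image]

def select_best(images, min_align=0.75):
    """Filter by alignment first, then rank by (alignment, aesthetics)."""
    passing = [im for im in images if alignment_score(im) >= min_align]
    if not passing:
        return None
    return max(passing, key=lambda im: (alignment_score(im), aesthetic_score(im)))

print(select_best(candidates))  # image_1 wins on alignment
```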
