Search Results for author: David Tse

Found 22 papers, 10 papers with code

Group-Structured Adversarial Training

no code implementations18 Jun 2021 Farzan Farnia, Amirali Aghazadeh, James Zou, David Tse

Methods for training models that are robust to perturbations of the input data have received great attention in the machine learning literature.

Ebb-and-Flow Protocols: A Resolution of the Availability-Finality Dilemma

2 code implementations10 Sep 2020 Joachim Neu, Ertem Nusret Tas, David Tse

To resolve this availability-finality dilemma, we formulate a new class of flexible consensus protocols, ebb-and-flow protocols, which support a full dynamically available ledger in conjunction with a finalized prefix ledger.

Cryptography and Security Distributed, Parallel, and Cluster Computing
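The two-ledger structure described in the abstract can be illustrated with a toy model: the finalized ledger must always remain a prefix of the dynamically available ledger, and finality only ever advances. This is a minimal sketch of that invariant (class and method names are mine, not the protocol's):

```python
class EbbAndFlowLedgers:
    """Toy model of the ebb-and-flow structure from the abstract: a
    dynamically available ledger plus a finalized ledger that is always
    a prefix of it. Illustrative only, not the actual protocol."""

    def __init__(self):
        self.available = []   # full dynamically available ledger
        self.finalized = 0    # length of the finalized prefix

    def append(self, block):
        self.available.append(block)

    def finalize(self, upto):
        # Finality can only advance, and never past the available tip.
        self.finalized = max(self.finalized, min(upto, len(self.available)))

    def finalized_prefix(self):
        return self.available[: self.finalized]
```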

Ultra Fast Medoid Identification via Correlated Sequential Halving

1 code implementation NeurIPS 2019 Tavor Baharav, David Tse

Four to five orders of magnitude gains over exact computation are obtained on real data, in terms of both number of distance computations needed and wall clock time.

Boomerang: Redundancy Improves Latency and Throughput in Payment Networks

1 code implementation4 Oct 2019 Vivek Bagaria, Joachim Neu, David Tse

Funds are forwarded using Boomerang contracts, which allow Alice to revert the transfer iff she has learned Bob's secret.

Cryptography and Security Distributed, Parallel, and Cluster Computing Information Theory Networking and Internet Architecture
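The revert condition in the abstract, where Alice can undo the transfer iff she has learned Bob's secret, is a hashlock-style construction. A toy sketch of that condition (class and method names are illustrative, not from the paper):

```python
import hashlib


class BoomerangContract:
    """Toy model of the hashlock idea described in the abstract:
    the sender may revert the transfer only by presenting the
    recipient's secret (the preimage of a published hash).
    Illustrative only."""

    def __init__(self, amount, secret_hash):
        self.amount = amount
        self.secret_hash = secret_hash
        self.reverted = False

    def revert(self, secret):
        # The revert succeeds iff the presented secret matches the hash.
        if hashlib.sha256(secret).hexdigest() == self.secret_hash:
            self.reverted = True
        return self.reverted
```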

Prism: Scaling Bitcoin by 10,000x

2 code implementations25 Sep 2019 Lei Yang, Vivek Bagaria, Gerui Wang, Mohammad Alizadeh, David Tse, Giulia Fanti, Pramod Viswanath

Bitcoin is the first fully decentralized permissionless blockchain protocol and achieves a high level of security: the ledger it maintains has guaranteed liveness and consistency properties as long as the adversary has less compute power than the honest nodes.

Distributed, Parallel, and Cluster Computing Cryptography and Security Networking and Internet Architecture

Deconstructing Generative Adversarial Networks

no code implementations27 Jan 2019 Banghua Zhu, Jiantao Jiao, David Tse

Generalization: given a population target for GANs, we develop a systematic principle, projection under admissible distance, to construct GANs that meet the population requirement using finite samples.

Porcupine Neural Networks: Approximating Neural Network Landscapes

no code implementations NeurIPS 2018 Soheil Feizi, Hamid Javadi, Jesse Zhang, David Tse

Neural networks have been used prominently in several machine learning and statistics applications.

Generalizable Adversarial Training via Spectral Normalization

1 code implementation ICLR 2019 Farzan Farnia, Jesse M. Zhang, David Tse

A significant portion of this gap can be attributed to the decrease in generalization performance due to adversarial training.

A Convex Duality Framework for GANs

no code implementations NeurIPS 2018 Farzan Farnia, David Tse

For a convex set $\mathcal{F}$, this duality framework interprets the original GAN formulation as finding the generative model with minimum JS-divergence to the distributions penalized to match the moments of the data distribution, with the moments specified by the discriminators in $\mathcal{F}$.

Hidden Hamiltonian Cycle Recovery via Linear Programming

no code implementations15 Apr 2018 Vivek Bagaria, Jian Ding, David Tse, Yihong Wu, Jiaming Xu

Represented as bicolored multi-graphs, these extreme points are further decomposed into simpler "blossom-type" structures for the large deviation analysis and counting arguments.

Traveling Salesman Problem

A Spectral Approach to Generalization and Optimization in Neural Networks

no code implementations ICLR 2018 Farzan Farnia, Jesse Zhang, David Tse

The recent success of deep neural networks stems from their ability to generalize well on real data; however, Zhang et al. have observed that neural networks can easily overfit random labels.

Tensor Biclustering

1 code implementation NeurIPS 2017 Soheil Feizi, Hamid Javadi, David Tse

Consider a dataset where data is collected on multiple features of multiple individuals over multiple times.

NeuralFDR: Learning Discovery Thresholds from Hypothesis Features

1 code implementation NeurIPS 2017 Fei Xia, Martin J. Zhang, James Zou, David Tse

For example, in genetic association studies, each hypothesis tests the correlation between a variant and the trait.
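NeuralFDR learns a per-hypothesis discovery threshold from features; the classical feature-agnostic baseline it generalizes is the Benjamini-Hochberg procedure, sketched here (a minimal illustration, not the paper's method):

```python
def benjamini_hochberg(p_values, alpha=0.1):
    """Classical Benjamini-Hochberg procedure: the feature-agnostic
    baseline that NeuralFDR generalizes by learning the threshold as a
    function of hypothesis features. Returns indices of discoveries."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        # Find the largest rank whose p-value clears the BH line.
        if p_values[i] <= alpha * rank / m:
            k = rank
    return sorted(order[:k])
```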

Medoids in almost linear time via multi-armed bandits

1 code implementation2 Nov 2017 Vivek Bagaria, Govinda M. Kamath, Vasilis Ntranos, Martin J. Zhang, David Tse

Computing the medoid of a large number of points in high-dimensional space is an increasingly common operation in many data science problems.

Multi-Armed Bandits
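The bandit view treats each point as an arm whose "loss" is its average distance to the rest, so the medoid search becomes a best-arm identification problem. A rough sequential-halving-style sketch (parameters and structure are my assumptions, not the paper's algorithm):

```python
import numpy as np


def medoid_sequential_halving(X, budget_per_round=100, rng=None):
    """Approximate the medoid (the point minimizing average distance to
    all others) by treating each point as a bandit arm and halving the
    candidate set each round. Illustrative sketch, not the authors'
    implementation."""
    rng = np.random.default_rng(rng)
    n = len(X)
    candidates = np.arange(n)
    while len(candidates) > 1:
        # Estimate each candidate's mean distance from a random sample
        # of reference points, shared across all candidates.
        refs = rng.integers(0, n, size=budget_per_round)
        est = np.array([
            np.linalg.norm(X[c] - X[refs], axis=1).mean()
            for c in candidates
        ])
        # Keep the better half of the candidates.
        keep = np.argsort(est)[: max(1, len(candidates) // 2)]
        candidates = candidates[keep]
    return int(candidates[0])
```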

Understanding GANs: the LQG Setting

no code implementations ICLR 2018 Soheil Feizi, Farzan Farnia, Tony Ginart, David Tse

Generative Adversarial Networks (GANs) have become a popular method to learn a probability model from data.

Porcupine Neural Networks: (Almost) All Local Optima are Global

1 code implementation5 Oct 2017 Soheil Feizi, Hamid Javadi, Jesse Zhang, David Tse

Neural networks have been used prominently in several machine learning and statistics applications.

Time-Sensitive Bandit Learning and Satisficing Thompson Sampling

no code implementations28 Apr 2017 Daniel Russo, David Tse, Benjamin Van Roy

We propose satisficing Thompson sampling -- a variation of Thompson sampling -- and establish a strong discounted regret bound for this new algorithm.
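Satisficing Thompson sampling is a modification of standard Thompson sampling; for reference, here is a minimal Bernoulli Thompson sampling loop with Beta posteriors (the baseline only — the satisficing modification is the paper's contribution and is not reproduced here):

```python
import random


def thompson_sampling(pull, n_arms, horizon, rng=None):
    """Standard Bernoulli Thompson sampling with Beta(1, 1) priors.
    Shown only as the baseline that satisficing Thompson sampling
    modifies; `pull(arm)` returns a 0/1 reward."""
    rng = rng or random.Random()
    successes = [0] * n_arms
    failures = [0] * n_arms
    total = 0
    for _ in range(horizon):
        # Sample a mean for each arm from its Beta posterior and
        # play the arm with the largest sample.
        samples = [rng.betavariate(successes[a] + 1, failures[a] + 1)
                   for a in range(n_arms)]
        arm = max(range(n_arms), key=samples.__getitem__)
        reward = pull(arm)
        total += reward
        if reward:
            successes[arm] += 1
        else:
            failures[arm] += 1
    return total
```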

Maximally Correlated Principal Component Analysis

no code implementations17 Feb 2017 Soheil Feizi, David Tse

For jointly Gaussian variables we show that the covariance matrix corresponding to the identity (or the negative of the identity) transformations majorizes covariance matrices of non-identity functions.

Dimensionality Reduction

A Minimax Approach to Supervised Learning

1 code implementation NeurIPS 2016 Farzan Farnia, David Tse

Given a task of predicting $Y$ from $X$, a loss function $L$, and a set of probability distributions $\Gamma$ on $(X, Y)$, what is the optimal decision rule minimizing the worst-case expected loss over $\Gamma$?
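The question above is the standard minimax decision problem; restating the snippet in symbols, with $\delta$ denoting a decision rule:

```latex
\delta^* \in \arg\min_{\delta} \; \max_{P \in \Gamma} \; \mathbb{E}_P\!\left[ L\bigl(Y, \delta(X)\bigr) \right]
```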

Community Recovery in Graphs with Locality

no code implementations11 Feb 2016 Yuxin Chen, Govinda Kamath, Changho Suh, David Tse

Motivated by applications in domains such as social networks and computational biology, we study the problem of community recovery in graphs with locality.

Discrete Rényi Classifiers

no code implementations NeurIPS 2015 Meisam Razaviyayn, Farzan Farnia, David Tse

We prove that, for a given set of marginals, the minimum Hirschfeld-Gebelein-Rényi (HGR) correlation principle introduced in [1] leads to a randomized classification rule whose misclassification rate is no larger than twice that of the optimal classifier.

Feature Selection General Classification

An Adaptive Successive Cancellation List Decoder for Polar Codes with Cyclic Redundancy Check

no code implementations15 Aug 2012 Bin Li, Hui Shen, David Tse

In this letter, we propose an adaptive SC (Successive Cancellation)-List decoder for polar codes with CRC.

Information Theory
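The "adaptive" part of the decoder is a list-size escalation loop: decode with a small list, and only when no surviving path passes the CRC, double the list size and retry. A generic sketch of that loop (`scl_decode` and `crc_check` are caller-supplied placeholders, not a real polar decoder):

```python
def adaptive_scl_decode(llrs, scl_decode, crc_check, max_list_size=32):
    """Adaptive successive-cancellation list decoding: start with a
    list of size 1 and double the list size whenever no candidate on
    the list passes the CRC. Illustrative sketch; `scl_decode(llrs,
    list_size)` should return a list of candidate codewords and
    `crc_check(candidate)` a boolean."""
    list_size = 1
    while list_size <= max_list_size:
        for candidate in scl_decode(llrs, list_size=list_size):
            if crc_check(candidate):
                return candidate
        list_size *= 2  # every path failed the CRC: retry with a larger list
    return None  # declare a decoding failure
```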
