1 code implementation • 24 Jun 2024 • Ashwinee Panda, Berivan Isik, Xiangyu Qi, Sanmi Koyejo, Tsachy Weissman, Prateek Mittal
The resulting effects, such as catastrophic forgetting of earlier tasks, make it challenging to obtain good performance on multiple tasks at the same time.
1 code implementation • 22 Jun 2023 • Berivan Isik, Francesco Pase, Deniz Gunduz, Sanmi Koyejo, Tsachy Weissman, Michele Zorzi
The high communication cost of sending model updates from the clients to the server is a significant bottleneck for scalable federated learning (FL).
no code implementations • 20 Dec 2022 • Evgenya Pergament, Pulkit Tandon, Oren Rippel, Lubomir Bourdev, Alexander G. Anderson, Bruno Olshausen, Tsachy Weissman, Sachin Katti, Kedar Tatwawadi
The contributions of this work are threefold: (1) we introduce a web tool which allows scalable collection of fine-grained perceptual importance by having users interactively paint spatio-temporal maps over encoded videos; (2) we use this tool to collect a dataset of 178 videos (14443 frames in total) with human-annotated spatio-temporal importance maps; and (3) we use our curated dataset to train a lightweight machine learning model which can predict these spatio-temporal importance regions.
no code implementations • 5 Nov 2022 • Wei Zhang, Yanjun Han, Zhengyuan Zhou, Aaron Flores, Tsachy Weissman
In the past four years, a particularly important development in the digital advertising industry is the shift from second-price auctions to first-price auctions for online display ads.
1 code implementation • 30 Sep 2022 • Berivan Isik, Francesco Pase, Deniz Gunduz, Tsachy Weissman, Michele Zorzi
At the end of the training, the final model is a sparse network with random weights -- or a subnetwork inside the dense random network.
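A minimal sketch of one way to realize this idea of a learned subnetwork inside a frozen random network: keep the random weights fixed, train a real-valued score per weight, and threshold the scores into a binary mask with a straight-through gradient. The class name, initialization, and threshold below are illustrative assumptions, not the paper's exact construction.

import torch
import torch.nn as nn

class MaskedLinear(nn.Module):
    """Frozen random weights; only the per-weight scores are trained."""
    def __init__(self, in_features, out_features):
        super().__init__()
        # Random weights are fixed at initialization and never updated.
        self.weight = nn.Parameter(0.1 * torch.randn(out_features, in_features),
                                   requires_grad=False)
        # Real-valued scores decide which random weights join the subnetwork.
        self.scores = nn.Parameter(0.1 * torch.randn(out_features, in_features))

    def forward(self, x):
        probs = torch.sigmoid(self.scores)
        hard = (probs > 0.5).float()
        # Straight-through trick: binary mask in the forward pass,
        # gradient of the soft probabilities in the backward pass.
        mask = hard + probs - probs.detach()
        return x @ (self.weight * mask).t()

# Toy usage: gradients reach the scores, never the frozen weights.
layer = MaskedLinear(8, 4)
layer(torch.randn(2, 8)).sum().backward()
print(layer.scores.grad is not None, layer.weight.grad)  # True None

In such a scheme, only the binary mask (plus a seed for regenerating the random weights) needs to be stored or communicated.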
1 code implementation • 8 May 2022 • Evgenya Pergament, Pulkit Tandon, Kedar Tatwawadi, Oren Rippel, Lubomir Bourdev, Bruno Olshausen, Tsachy Weissman, Sachin Katti, Alexander G. Anderson
We use this tool to collect data in the wild (10 videos, 17 users) and utilize the obtained importance maps in the context of x264 coding to demonstrate, through a subjective study, that the tool can indeed be used to generate videos which, at the same bitrate, look perceptually better and are 1.9 times more likely to be preferred by viewers.
no code implementations • 7 Feb 2022 • Berivan Isik, Tsachy Weissman
In this sense, the utility of the data for learning is essentially maintained, while reducing storage and privacy leakage by quantifiable amounts.
1 code implementation • 26 Jun 2021 • Pulkit Tandon, Shubham Chandak, Pat Pataranutaporn, Yimeng Liu, Anesu M. Mapuranga, Pattie Maes, Tsachy Weissman, Misha Sra
Video represents the majority of internet traffic today, driving a continual race between the generation of higher quality content, transmission of larger file sizes, and the development of network infrastructure.
1 code implementation • 16 Feb 2021 • Berivan Isik, Tsachy Weissman, Albert No
We study the neural network (NN) compression problem, viewing the tension between the compression ratio and NN performance through the lens of rate-distortion theory.
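For reference, the classical rate-distortion function that frames this trade-off between compression ratio and performance is

$$ R(D) \;=\; \min_{p(\hat{x}\mid x)\,:\;\mathbb{E}[d(X,\hat{X})] \le D} I(X;\hat{X}), $$

the minimum rate (in bits per symbol) achievable at expected distortion at most $D$; the notation here is the textbook one, not taken from the paper.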
no code implementations • 15 Feb 2021 • Berivan Isik, Kristy Choi, Xin Zheng, Tsachy Weissman, Stefano Ermon, H.-S. Philip Wong, Armin Alaghi
Compression and efficient storage of neural network (NN) parameters is critical for applications that run on resource-constrained devices.
1 code implementation • 7 Nov 2020 • Roshan Prabhakar, Shubham Chandak, Carina Chiu, Renee Liang, Huong Nguyen, Kedar Tatwawadi, Tsachy Weissman
COVID-19 has made video communication one of the most important modes of information exchange.
no code implementations • NeurIPS Workshop DL-IG 2020 • Berivan Isik, Kristy Choi, Xin Zheng, H.-S. Philip Wong, Stefano Ermon, Tsachy Weissman, Armin Alaghi
Efficient compression and storage of neural network (NN) parameters is critical for resource-constrained, downstream machine learning applications.
no code implementations • 9 Jul 2020 • Yanjun Han, Zhengyuan Zhou, Aaron Flores, Erik Ordentlich, Tsachy Weissman
In this paper, we take an online learning angle and address the fundamental problem of learning to bid in repeated first-price auctions, where both the bidder's private valuations and other bidders' bids can be arbitrary.
no code implementations • 22 Mar 2020 • Yanjun Han, Zhengyuan Zhou, Tsachy Weissman
In this paper, we develop the first learning algorithm that achieves a near-optimal $\widetilde{O}(\sqrt{T})$ regret bound, by exploiting two structural properties of first-price auctions, i.e., the specific feedback structure and payoff function.
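For concreteness, the usual regret notion in this setting, with private value $v_t$, bid $b_t$, and highest competing bid $m_t$ in round $t$ (the bidder wins and pays her own bid when $b_t \ge m_t$), is

$$ R(T) \;=\; \sum_{t=1}^{T} (v_t - m_t)^{+} \;-\; \sum_{t=1}^{T} (v_t - b_t)\,\mathbf{1}\{b_t \ge m_t\}, $$

i.e., the gap to the best bids in hindsight; the notation is generic and may differ from the paper's.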
1 code implementation • 1 Nov 2019 • Shubham Chandak, Kedar Tatwawadi, Chengtao Wen, Lingyun Wang, Juan Aparicio, Tsachy Weissman
Time series data compression is emerging as an important problem with the growth in IoT devices and sensors.
1 code implementation • 19 Nov 2018 • Kristy Choi, Kedar Tatwawadi, Aditya Grover, Tsachy Weissman, Stefano Ermon
For reliable transmission across a noisy communication channel, classical results from information theory show that it is asymptotically optimal to separate out the source and channel coding processes.
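Roughly stated, the separation result in question says that for a memoryless source $S$ and a memoryless channel of capacity $C$ (one source symbol per channel use),

$$ H(S) < C \;\Longrightarrow\; \text{reliable transmission is achievable with separately designed source and channel codes}, \qquad H(S) > C \;\Longrightarrow\; \text{no joint scheme is reliable}, $$

so joint design offers no asymptotic gain; this is the textbook statement, paraphrased rather than quoted from the paper.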
no code implementations • 25 Oct 2018 • Ashutosh Bhown, Soham Mukherjee, Sean Yang, Shubham Chandak, Irena Fischer-Hwang, Kedar Tatwawadi, Judith Fan, Tsachy Weissman
The images, results, and additional data are available at https://compression.stanford.edu/human-compression
no code implementations • 27 Sep 2018 • Kristy Choi, Kedar Tatwawadi, Tsachy Weissman, Stefano Ermon
For reliable transmission across a noisy communication channel, classical results from information theory show that it is asymptotically optimal to separate out the source and channel coding processes.
no code implementations • 23 Feb 2018 • Yanjun Han, Jiantao Jiao, Tsachy Weissman
We present \emph{Local Moment Matching (LMM)}, a unified methodology for symmetric functional estimation and distribution estimation under Wasserstein distance.
no code implementations • NeurIPS 2018 • Yanjun Han, Jiantao Jiao, Chuan-Zheng Lee, Tsachy Weissman, Yihong Wu, Tiancheng Yu
For estimating the Shannon entropy of a distribution on $S$ elements with independent samples, [Paninski 2004] showed that the sample complexity is sublinear in $S$, and [Valiant and Valiant 2011] showed that consistent estimation of Shannon entropy is possible if and only if the sample size $n$ far exceeds $\frac{S}{\log S}$.
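For comparison, the naive plug-in (MLE) estimator is the entropy of the empirical distribution,

$$ \hat{H}_{\text{plug-in}} \;=\; -\sum_{i=1}^{S} \hat{p}_i \log \hat{p}_i, \qquad \hat{p}_i = \frac{n_i}{n}, $$

where $n_i$ counts occurrences of symbol $i$ among the $n$ samples; it is consistent only when $n \gg S$, which the refined estimators above improve to $n \gg \frac{S}{\log S}$.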
no code implementations • 19 Dec 2017 • Dmitri S. Pavlichin, Jiantao Jiao, Tsachy Weissman
We propose an efficient algorithm for approximate computation of the profile maximum likelihood (PML), a variant of maximum likelihood maximizing the probability of observing a sufficient statistic rather than the empirical sample.
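To make the sufficient statistic concrete: the profile of a sample is the multiset of symbol multiplicities, forgetting which symbol carried which count. The short Python sketch below merely computes a profile; it illustrates the object PML works with, not the paper's approximation algorithm.

# The "profile" of a sample is the sorted multiset of multiplicities,
# i.e., how many symbols appeared once, twice, and so on.
from collections import Counter

def profile(sample):
    """Return the multiset of multiplicities, ignoring symbol identities."""
    return sorted(Counter(sample).values(), reverse=True)

# Example: 'a' appears 3 times, 'b' twice, 'c' once -> profile [3, 2, 1].
print(profile(["a", "b", "a", "c", "b", "a"]))  # [3, 2, 1]

PML then seeks the distribution maximizing the probability of observing this profile, rather than the probability of the sample itself.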
no code implementations • 5 Jul 2017 • Jiantao Jiao, Yanjun Han, Irena Fischer-Hwang, Tsachy Weissman
We show through case studies that it is easier to estimate the fundamental limits of data processing than to construct explicit algorithms to achieve those limits.
no code implementations • 3 Nov 2016 • Sihan Li, Jiantao Jiao, Yanjun Han, Tsachy Weissman
We show that with or without nonlinearities, by adding shortcuts that have depth two, the condition number of the Hessian of the loss function at the zero initial point is depth-invariant, which makes training very deep models no more difficult than shallow ones.
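As a minimal, assumed illustration of "shortcuts that have depth two": an identity connection skipping over exactly two layers, as in standard residual blocks. The module below is illustrative and not the paper's exact experimental setup.

import torch
import torch.nn as nn

class DepthTwoResidualBlock(nn.Module):
    """Computes x + fc2(act(fc1(x))): an identity shortcut of depth two."""
    def __init__(self, dim, nonlinear=True):
        super().__init__()
        self.fc1 = nn.Linear(dim, dim)
        self.fc2 = nn.Linear(dim, dim)
        self.act = nn.ReLU() if nonlinear else nn.Identity()

    def forward(self, x):
        # The identity shortcut spans exactly two layers (fc1 then fc2).
        return x + self.fc2(self.act(self.fc1(x)))

# The paper's claim concerns the Hessian's condition number at the zero
# initial point of deep stacks of such blocks, with or without the ReLU.
model = nn.Sequential(*[DepthTwoResidualBlock(16) for _ in range(50)])
print(model(torch.randn(4, 16)).shape)  # torch.Size([4, 16])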
no code implementations • 26 Sep 2014 • Jiantao Jiao, Kartik Venkat, Yanjun Han, Tsachy Weissman
In a nutshell, a message of this recent work is that, for a wide class of functionals, the performance of these essentially optimal estimators with $n$ samples is comparable to that of the MLE with $n \ln n$ samples.
3 code implementations • 11 Jan 2012 • Jiantao Jiao, Haim H. Permuter, Lei Zhao, Young-Han Kim, Tsachy Weissman
Four estimators of the directed information rate between a pair of jointly stationary ergodic finite-alphabet processes are proposed, based on universal probability assignments.
Information Theory
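For reference, the directed information from $X^n$ to $Y^n$, whose per-symbol rate the four estimators target, is Massey's quantity

$$ I(X^n \to Y^n) \;=\; \sum_{i=1}^{n} I(X^i; Y_i \mid Y^{i-1}), $$

with the directed information rate defined as $\lim_{n\to\infty} \frac{1}{n} I(X^n \to Y^n)$ when the limit exists; roughly, the estimators plug sequential (universal) probability assignments into expressions of this form.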