Search Results for author: Eric Lei

Found 13 papers, 3 papers with code

Approaching Rate-Distortion Limits in Neural Compression with Lattice Transform Coding

no code implementations • 12 Mar 2024 • Eric Lei, Hamed Hassani, Shirin Saeedi Bidokhti

On general vector sources, LTC improves upon standard neural compressors in one-shot coding performance.

Quantization
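For intuition on the lattice step, here is a minimal sketch of nearest-point quantization onto the D_n lattice (the Conway-Sloane rounding rule), the kind of vector quantizer that a lattice transform coder uses in place of per-coordinate rounding; the latent vector and lattice choice below are illustrative stand-ins, not the paper's setup.

```python
import numpy as np

def closest_point_Dn(y):
    """Nearest point of the D_n lattice (integer vectors with even sum) to y:
    round componentwise; if the rounded sum is odd, re-round the coordinate
    with the largest rounding error the other way (Conway-Sloane rule)."""
    f = np.rint(y)
    if int(f.sum()) % 2 == 0:
        return f
    idx = np.argmax(np.abs(y - f))
    f[idx] += np.sign(y[idx] - f[idx]) if y[idx] != f[idx] else 1.0
    return f

# Illustrative latent vector (in practice this would be an encoder output).
z = np.array([0.4, -1.2, 2.7, 0.6])
print(closest_point_Dn(z))   # lattice quantization
print(np.rint(z))            # per-coordinate integer rounding, for contrast
```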

Score-Based Methods for Discrete Optimization in Deep Learning

no code implementations • 15 Oct 2023 • Eric Lei, Arman Adibi, Hamed Hassani

One class of these problems involves objective functions that depend on neural networks but optimization variables that are discrete.
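As a toy illustration of this setting only (not the paper's score-based algorithm), the sketch below runs a greedy local search over binary variables, using the gradient of a small neural objective as a score for which coordinate to flip; the network and flip rule are placeholders.

```python
import torch
import torch.nn as nn

# Toy neural objective over binary vectors x in {0, 1}^n (placeholder network).
n = 16
net = nn.Sequential(nn.Linear(n, 32), nn.ReLU(), nn.Linear(32, 1))

x = torch.randint(0, 2, (n,)).float()
for _ in range(50):
    x_req = x.clone().requires_grad_(True)
    obj = net(x_req).sum()
    obj.backward()
    # Score each flip by the linearized change in the objective:
    # flipping x_i changes it by roughly grad_i * (1 - 2 * x_i).
    scores = x_req.grad * (1.0 - 2.0 * x)
    i = torch.argmax(scores)
    if scores[i] <= 0:   # no flip is predicted to improve the objective
        break
    x[i] = 1.0 - x[i]

print("greedy, gradient-scored local optimum:", x.int().tolist())
```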

WrappingNet: Mesh Autoencoder via Deep Sphere Deformation

no code implementations • 29 Aug 2023 • Eric Lei, Muhammad Asad Lodhi, Jiahao Pang, Junghyun Ahn, Dong Tian

There have been recent efforts to learn more meaningful representations via fixed-length codewords from mesh data, since a mesh serves as a more complete model of the underlying 3D shape than a point cloud.

Text + Sketch: Image Compression at Ultra Low Rates

1 code implementation • 4 Jul 2023 • Eric Lei, Yiğit Berkay Uslu, Hamed Hassani, Shirin Saeedi Bidokhti

Recent advances in text-to-image generative models provide the ability to generate high-quality images from short text descriptions.

Image Compression
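The idea can be sketched as caption-then-regenerate: transmit only a short text description and reconstruct a perceptually plausible image with an off-the-shelf text-to-image model. The Hugging Face checkpoints below are illustrative stand-ins, not the pipeline released with the paper, and the sketch channel from the paper's title is omitted.

```python
import torch
from transformers import BlipProcessor, BlipForConditionalGeneration
from diffusers import StableDiffusionPipeline
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# "Encoder": caption the image, so only a few dozen bytes of text are sent.
processor = BlipProcessor.from_pretrained("Salesforce/blip-image-captioning-base")
captioner = BlipForConditionalGeneration.from_pretrained(
    "Salesforce/blip-image-captioning-base").to(device)
image = Image.open("input.png").convert("RGB")
inputs = processor(image, return_tensors="pt").to(device)
caption = processor.decode(captioner.generate(**inputs)[0], skip_special_tokens=True)

# "Decoder": regenerate an image from the caption alone.
pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5").to(device)
pipe(caption).images[0].save("reconstruction.png")
```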

On a Relation Between the Rate-Distortion Function and Optimal Transport

no code implementations • 1 Jul 2023 • Eric Lei, Hamed Hassani, Shirin Saeedi Bidokhti

We discuss a relationship between rate-distortion and optimal transport (OT) theory, even though they seem to be unrelated at first glance.

Quantization, Relation
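For reference, the two objects being related, in their textbook forms (not the paper's precise statement):

```latex
% Rate-distortion function of a source X under distortion measure d:
R(D) \;=\; \min_{p(\hat{x}\mid x)\,:\,\mathbb{E}[d(X,\hat{X})] \le D} I(X;\hat{X})

% Optimal transport (Monge-Kantorovich) cost between distributions P and Q:
W_c(P, Q) \;=\; \min_{\pi \in \Pi(P, Q)} \mathbb{E}_{(X,Y)\sim\pi}\bigl[\, c(X, Y) \,\bigr]
```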

Federated Neural Compression Under Heterogeneous Data

no code implementations • 25 May 2023 • Eric Lei, Hamed Hassani, Shirin Saeedi Bidokhti

We discuss a federated learned compression problem, where the goal is to learn a compressor from real-world data that is scattered across clients and may be statistically heterogeneous, yet shares a common underlying representation.

Personalized Federated Learning
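As a rough sketch of the setup, the snippet below trains a shared autoencoder-style compressor with plain FedAvg across clients; it is a generic baseline, not the personalization scheme studied in the paper, and the model, loss, and client data are placeholders.

```python
import copy
import torch
import torch.nn as nn

# Placeholder compressor: a tiny autoencoder standing in for a learned codec.
def make_model():
    return nn.Sequential(nn.Linear(64, 8), nn.ReLU(), nn.Linear(8, 64))

def local_update(global_model, data, epochs=1, lr=1e-3):
    model = copy.deepcopy(global_model)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for x in data:                                      # batches from this client
            loss = nn.functional.mse_loss(model(x), x)      # reconstruction loss
            opt.zero_grad(); loss.backward(); opt.step()
    return model.state_dict()

# FedAvg: average client weights into the global compressor each round.
global_model = make_model()
clients = [[torch.randn(32, 64)] for _ in range(4)]         # heterogeneous data in practice
for _ in range(5):
    states = [local_update(global_model, d) for d in clients]
    avg = {k: torch.stack([s[k] for s in states]).mean(0) for k in states[0]}
    global_model.load_state_dict(avg)
```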

Neural Estimation of the Rate-Distortion Function With Applications to Operational Source Coding

1 code implementation • 4 Apr 2022 • Eric Lei, Hamed Hassani, Shirin Saeedi Bidokhti

Motivated by the empirical success of deep neural network (DNN) compressors on large, real-world data, we investigate methods to estimate the rate-distortion function on such data, which would allow DNN compressors to be compared against the theoretical optimum.

Data Compression
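For small discrete sources, the rate-distortion function can be computed exactly with the classical Blahut-Arimoto algorithm, which makes a useful sanity check; this is not the neural estimator proposed in the paper, which targets large, continuous real-world data.

```python
import numpy as np

def blahut_arimoto(p_x, dist, beta, iters=200):
    """Blahut-Arimoto for a discrete source: returns one (rate, distortion)
    point on the R(D) curve for Lagrange multiplier beta (rate in nats)."""
    q = np.full(dist.shape[1], 1.0 / dist.shape[1])      # reproduction marginal
    for _ in range(iters):
        w = q * np.exp(-beta * dist)                      # unnormalized p(x_hat | x)
        w /= w.sum(axis=1, keepdims=True)
        q = p_x @ w                                       # update the marginal
    D = np.sum(p_x[:, None] * w * dist)
    R = np.sum(p_x[:, None] * w * np.log(w / q[None, :]))
    return R, D

# Binary source with Hamming distortion; sweeping beta traces out R(D).
p_x = np.array([0.5, 0.5])
dist = 1.0 - np.eye(2)
print(blahut_arimoto(p_x, dist, beta=3.0))
```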

Robust Graph Neural Networks via Probabilistic Lipschitz Constraints

no code implementations • 14 Dec 2021 • Raghu Arghal, Eric Lei, Shirin Saeedi Bidokhti

This allows the same computationally efficient algorithm to be used on sampled constraints, which provides PAC-style guarantees on the stability of the GNN using results from scenario optimization.

Out-of-Distribution Robustness in Deep Learning Compression

no code implementations • 13 Oct 2021 • Eric Lei, Hamed Hassani, Shirin Saeedi Bidokhti

In recent years, deep neural network (DNN) compression systems have proved to be highly effective for designing source codes for many natural sources.

CSI-Based Multi-Antenna and Multi-Point Indoor Positioning Using Probability Fusion

no code implementations • 6 Sep 2020 • Emre Gönültaş, Eric Lei, Jack Langerman, Howard Huang, Christoph Studer

Channel state information (CSI)-based fingerprinting via neural networks (NNs) is a promising approach to enable accurate indoor and outdoor positioning of user equipments (UEs), even under challenging propagation conditions.

Outdoor Positioning
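The fingerprinting idea reduces to regression from CSI-derived features to coordinates. The sketch below uses a placeholder MLP on synthetic data and omits the multi-antenna, multi-point probability fusion described in the title; dimensions are illustrative.

```python
import torch
import torch.nn as nn

# Generic CSI fingerprinting regressor: CSI features in, (x, y) position out.
model = nn.Sequential(
    nn.Linear(128, 256), nn.ReLU(),
    nn.Linear(256, 64), nn.ReLU(),
    nn.Linear(64, 2),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

csi_features = torch.randn(512, 128)   # stand-in for preprocessed CSI fingerprints
positions = torch.randn(512, 2)        # ground-truth UE coordinates
for _ in range(100):
    loss = nn.functional.mse_loss(model(csi_features), positions)
    opt.zero_grad(); loss.backward(); opt.step()
```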

Siamese Neural Networks for Wireless Positioning and Channel Charting

no code implementations • 29 Sep 2019 • Eric Lei, Oscar Castañeda, Olav Tirkkonen, Tom Goldstein, Christoph Studer

In this paper, we propose a unified architecture based on Siamese networks that can be used for supervised UE positioning and unsupervised channel charting.

Dimensionality Reduction
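A minimal Siamese setup applies one shared-weight network to both elements of a pair and penalizes the mismatch between the embedding distance and a target pairwise distance; the architecture, loss, and data below are generic stand-ins, not the paper's exact design.

```python
import torch
import torch.nn as nn

# One embedding network with shared weights, applied to both inputs of a pair.
embed = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 2))
opt = torch.optim.Adam(embed.parameters(), lr=1e-3)

x1, x2 = torch.randn(256, 64), torch.randn(256, 64)   # CSI feature pairs (stand-ins)
target_dist = torch.rand(256)                          # target pairwise distances

for _ in range(100):
    d = torch.norm(embed(x1) - embed(x2), dim=1)       # distance in the 2-D chart
    loss = nn.functional.mse_loss(d, target_dist)      # match the target distances
    opt.zero_grad(); loss.backward(); opt.step()
```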

Characterization of Hemodynamic Signal by Learning Multi-View Relationships

no code implementations • 17 Sep 2017 • Eric Lei, Kyle Miller, Michael R. Pinsky, Artur Dubrawski

We aim to investigate the usefulness of nonlinear multi-view relations to characterize multi-view data in an explainable manner.

Clustering
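For contrast with the nonlinear relations studied here, the classical linear tool for relating two views is canonical correlation analysis; the snippet below is only a baseline illustration, with synthetic stand-in "views".

```python
import numpy as np
from sklearn.cross_decomposition import CCA

# Two synthetic views of the same 200 samples (stand-ins for hemodynamic channels).
rng = np.random.default_rng(0)
shared = rng.normal(size=(200, 2))
view_a = shared @ rng.normal(size=(2, 10)) + 0.1 * rng.normal(size=(200, 10))
view_b = shared @ rng.normal(size=(2, 8)) + 0.1 * rng.normal(size=(200, 8))

# Linear CCA finds maximally correlated directions across the two views.
cca = CCA(n_components=2)
a_c, b_c = cca.fit_transform(view_a, view_b)
corr = [np.corrcoef(a_c[:, i], b_c[:, i])[0, 1] for i in range(2)]
print("canonical correlations:", np.round(corr, 3))
```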
