no code implementations • 5 Mar 2024 • Jiarui Xu, Shashank Jere, Yifei Song, Yi-Hung Kao, Lizhong Zheng, Lingjia Liu
At the air interface, multiple-input multiple-output (MIMO) and its variants such as multi-user MIMO (MU-MIMO) and massive/full-dimension MIMO have been key enablers across successive generations of cellular networks with evolving complexity and design challenges.
1 code implementation • 6 Feb 2024 • J. Jon Ryu, Xiangxiang Xu, H. S. Melihcan Erol, Yuheng Bu, Lizhong Zheng, Gregory W. Wornell
Computing eigenvalue decomposition (EVD) of a given linear operator, or finding its leading eigenvalues and eigenfunctions, is a fundamental task in many machine learning and scientific computing problems.
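As a point of reference for the task the paper addresses, the textbook baseline for finding a leading eigenpair is power iteration. The sketch below is that classical baseline on a small symmetric matrix, not the paper's method (which targets general linear operators with learned eigenfunctions); the matrix and iteration count are illustrative choices.

```python
import numpy as np

def power_iteration(A, num_iters=200):
    # Classical power iteration: repeatedly apply A and renormalize.
    v = np.ones(A.shape[0]) / np.sqrt(A.shape[0])  # deterministic start
    for _ in range(num_iters):
        w = A @ v
        v = w / np.linalg.norm(w)
    lam = v @ A @ v  # Rayleigh quotient estimate of the leading eigenvalue
    return lam, v

A = np.array([[2.0, 1.0], [1.0, 2.0]])  # eigenvalues are 3 and 1
lam, v = power_iteration(A)
```

The iterate converges to the eigenvector of the largest-magnitude eigenvalue; learning-based EVD methods aim to recover the same leading subspace when A is only accessible through operator applications.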
no code implementations • 14 Nov 2023 • Jiarui Xu, Karim Said, Lizhong Zheng, Lingjia Liu
Orthogonal time frequency space (OTFS) is a promising modulation scheme for wireless communication in high-mobility scenarios.
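For context, OTFS places symbols on a delay-Doppler grid and maps them to the time-frequency grid with the inverse symplectic finite Fourier transform (ISFFT); the receiver applies the SFFT to return to the delay-Doppler domain. The sketch below shows only this transform pair via 2-D FFTs; the grid sizes, axis conventions, and QPSK payload are illustrative assumptions, and none of the paper's channel modeling is reproduced.

```python
import numpy as np

def isfft(x_dd):
    # Delay-Doppler -> time-frequency: IFFT along the Doppler axis,
    # FFT along the delay axis (one common convention).
    return np.fft.fft(np.fft.ifft(x_dd, axis=0), axis=1)

def sfft(x_tf):
    # Time-frequency -> delay-Doppler: exact inverse of the mapping above.
    return np.fft.fft(np.fft.ifft(x_tf, axis=1), axis=0)

rng = np.random.default_rng(0)
N, M = 8, 16  # Doppler and delay bins (illustrative sizes)
bits = rng.integers(0, 2, (N, M, 2))
qpsk = ((2 * bits[..., 0] - 1) + 1j * (2 * bits[..., 1] - 1)) / np.sqrt(2)

x_tf = isfft(qpsk)       # transmit-side mapping
recovered = sfft(x_tf)   # receive-side mapping (noiseless, no channel)
```

In a noiseless loopback the round trip is exact, which is the sanity check any OTFS modem implementation starts from.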
no code implementations • 8 Oct 2023 • Shashank Jere, Karim Said, Lizhong Zheng, Lingjia Liu
With this groundwork, we incorporate the available domain knowledge in the form of the statistics of the wireless channel directly into the weights of the ESN model.
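For readers unfamiliar with echo state networks, the sketch below is a generic ESN: randomly generated, untrained reservoir weights with only a linear readout fit by least squares, here on a simple delay-recall task. The paper's contribution of setting the reservoir weights from wireless channel statistics is not reproduced; all sizes and the spectral-radius choice are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n_res, T = 50, 500
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))   # input weights (fixed, random)
W = rng.uniform(-0.5, 0.5, (n_res, n_res))  # recurrent weights (fixed, random)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

u = rng.uniform(-1, 1, T)      # input sequence
target = np.roll(u, 2)         # task: recall the input from two steps ago
states = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(T):
    x = np.tanh(W_in[:, 0] * u[t] + W @ x)  # reservoir update
    states[t] = x

# Only the readout is trained (least squares), after discarding transients.
W_out, *_ = np.linalg.lstsq(states[100:], target[100:], rcond=None)
mse = np.mean((states[100:] @ W_out - target[100:]) ** 2)
```

The point of the architecture is that training touches only `W_out`; a domain-informed choice of `W_in` and `W`, as in the paper, replaces the purely random draws used here.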
1 code implementation • 18 Sep 2023 • Xiangxiang Xu, Lizhong Zheng
We present a novel framework for learning system design based on neural feature extractors.
no code implementations • 4 Aug 2023 • Shashank Jere, Lizhong Zheng, Karim Said, Lingjia Liu
Our work provides a clear signal processing-based interpretation of RC models and a theoretical justification for randomly generating, rather than training, RC's recurrent weights.
no code implementations • 22 May 2023 • Lianjun Li, Sai Sree Rayala, Jiarui Xu, Lizhong Zheng, Lingjia Liu
In this paper, we introduce StructNet-CE, a novel real-time online learning framework for MIMO-OFDM channel estimation, which only utilizes over-the-air (OTA) pilot symbols for online training and converges within one OFDM subframe.
no code implementations • 4 Jan 2023 • Xiangxiang Xu, Lizhong Zheng
We study kernel methods in machine learning from the perspective of feature subspace.
no code implementations • 20 Dec 2022 • Yajie Bao, Yang Li, Shao-Lun Huang, Lin Zhang, Lizhong Zheng, Amir Zamir, Leonidas Guibas
Task transfer learning is a popular technique in image processing applications that uses pre-trained models to reduce the supervision cost of related tasks.
no code implementations • 1 Nov 2022 • Erixhen Sula, Lizhong Zheng
We focus on a semi-supervised case to learn the model from labeled and unlabeled samples.
no code implementations • 17 Aug 2022 • Jiarui Xu, Lianjun Li, Lizhong Zheng, Lingjia Liu
The DF mechanism further enhances detection performance by dynamically tracking the channel changes through detected data symbols.
no code implementations • NeurIPS 2021 • Xinyi Tong, Xiangxiang Xu, Shao-Lun Huang, Lizhong Zheng
Current transfer learning algorithm designs mainly focus on the similarities between source and target tasks, while the impacts of the sample sizes of these tasks are often not sufficiently addressed.
no code implementations • 3 Oct 2021 • Jiarui Xu, Zhou Zhou, Lianjun Li, Lizhong Zheng, Lingjia Liu
The binary classifier enables the efficient utilization of the precious online training symbols and allows an easy extension to high-order modulations without a substantial increase in complexity.
no code implementations • 1 Dec 2020 • Zhou Zhou, Shashank Jere, Lizhong Zheng, Lingjia Liu
In this paper, we explore neural network-based strategies for performing symbol detection in a MIMO-OFDM system.
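The conventional benchmark that learning-based MIMO detectors are measured against is maximum-likelihood (ML) detection by exhaustive search, whose cost grows exponentially with the number of antennas. The sketch below is that baseline for a tiny 2x2 real-valued BPSK system, not the paper's neural network; the channel matrix and noise level are illustrative.

```python
import numpy as np
from itertools import product

def ml_detect(H, y, constellation):
    # Exhaustive ML search: try every transmit vector, keep the one
    # minimizing the residual ||y - H s||^2.
    best, best_err = None, np.inf
    for s in product(constellation, repeat=H.shape[1]):
        s = np.array(s)
        err = np.linalg.norm(y - H @ s) ** 2
        if err < best_err:
            best, best_err = s, err
    return best

rng = np.random.default_rng(2)
H = np.array([[1.0, 0.2], [0.3, 1.0]])       # known, well-conditioned channel
s_true = np.array([1.0, -1.0])               # BPSK transmit vector
y = H @ s_true + 0.01 * rng.normal(size=2)   # received signal, low noise
s_hat = ml_detect(H, y, [-1.0, 1.0])
```

The search visits |constellation|^(num antennas) candidates, which is why neural detectors that approximate ML performance at lower complexity are of practical interest.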
no code implementations • NeurIPS Workshop LMCA 2020 • Zhou Zhou, Shashank Jere, Lizhong Zheng, Lingjia Liu
In this paper, we investigate a neural network-based learning approach towards solving an integer-constrained programming problem using very limited training.
no code implementations • 20 Nov 2019 • Shao-Lun Huang, Anuran Makur, Gregory W. Wornell, Lizhong Zheng
We consider the problem of identifying universal low-dimensional features from high-dimensional data for inference tasks in settings involving learning.
no code implementations • 8 Oct 2019 • Shao-Lun Huang, Xiangxiang Xu, Lizhong Zheng
In this paper, we propose an information-theoretic approach to design the functional representations to extract the hidden common structure shared by a set of random variables.
no code implementations • 16 May 2019 • Shao-Lun Huang, Xiangxiang Xu, Lizhong Zheng, Gregory W. Wornell
It is commonly believed that the hidden layers of deep neural networks (DNNs) attempt to extract informative features for learning tasks.
no code implementations • 22 Nov 2018 • Lichen Wang, Jiaxiang Wu, Shao-Lun Huang, Lizhong Zheng, Xiangxiang Xu, Lin Zhang, Junzhou Huang
We further generalize the framework to handle more than two modalities and missing modalities.
no code implementations • 10 Oct 2018 • David Qiu, Anuran Makur, Lizhong Zheng
In this paper, we present a local information theoretic approach to explicitly learn probabilistic clustering of a discrete random variable.
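To illustrate what a probabilistic (soft) clustering of a discrete variable looks like, the sketch below groups symbols of X by the similarity of their conditional distributions p(y|x), using a simple soft k-means-style update. This is a generic illustration under an assumed toy joint pmf, not the paper's local information-theoretic algorithm; the temperature 20 and initialization are arbitrary choices.

```python
import numpy as np

P = np.array([[0.20, 0.05],   # toy joint pmf p(x, y); rows index x, columns y
              [0.18, 0.07],
              [0.04, 0.21],
              [0.05, 0.20]])
cond = P / P.sum(axis=1, keepdims=True)  # conditional rows p(y | x)

centroids = cond[[0, 2]]  # initialize two clusters from two symbols
for _ in range(10):
    # Squared distance from each conditional row to each centroid
    d = ((cond[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    q = np.exp(-20 * d)                  # soft assignment q(c | x)
    q /= q.sum(axis=1, keepdims=True)
    centroids = (q.T @ cond) / q.sum(axis=0)[:, None]

hard = q.argmax(axis=1)  # hardened cluster labels for inspection
```

The output `q` is an explicit conditional distribution over clusters for each symbol, which is the sense in which the clustering is probabilistic rather than a hard partition.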