no code implementations • 8 Jan 2018 • Yu Cheng, Angus Wong, Kevin Hung, Zhizhong Li, Weitong Li, Jun Zhang
That is, the odor datasets grow dynamically, with both the number of training samples and the number of classes increasing over time.
no code implementations • 29 Nov 2017 • Zhe Zhu, Ehab AlBadawy, Ashirbani Saha, Jun Zhang, Michael R. Harowicz, Maciej A. Mazurowski
Results: The best AUC performance for distinguishing molecular subtypes was 0.65 (95% CI: [0.57, 0.71]) and was achieved by the off-the-shelf deep features approach.
no code implementations • 28 Nov 2017 • Zhe Zhu, Michael Harowicz, Jun Zhang, Ashirbani Saha, Lars J. Grimm, E. Shelley Hwang, Maciej A. Mazurowski
In the first approach, we adopted the transfer learning strategy, in which a network pre-trained on a large dataset of natural images is fine-tuned with our DCIS images.
no code implementations • 14 Oct 2017 • Tong Chen, Lin Wu, Yang Wang, Jun Zhang, Hongxu Chen, Xue Li
Inspired by temporal point processes, in this paper we present a deep prediction method based on two recurrent neural networks (RNNs) that jointly model each user's continuous browsing history and asynchronous event sequences in the context of inter-user behavioral mutual infectivity.
no code implementations • 17 Aug 2017 • Congbo Cai, Yiqing Zeng, Chao Wang, Shuhui Cai, Jun Zhang, Zhong Chen, Xinghao Ding, Jianhui Zhong
After the ResNet was trained, it was applied to reconstruct the T2 mapping from simulation and in vivo human brain data.
no code implementations • 17 Feb 2017 • Hamid Hamraz, Marco A. Contreras, Jun Zhang
Existing segmentation procedures typically detect more than 90% of overstory trees, yet they barely detect 60% of understory trees because of the occlusion effect of higher canopy layers.
no code implementations • 31 Dec 2016 • Hamid Hamraz, Marco A. Contreras, Jun Zhang
This paper presents a tree segmentation approach for multi-story stands that stratifies the point cloud into canopy layers and segments individual tree crowns within each layer using a digital surface model based tree segmentation method.
no code implementations • 15 Aug 2016 • Yuanlong Li, Han Hu, Yonggang Wen, Jun Zhang
Finally, using the power consumption data from a real data center, we show that the proposed LTW can improve the classification accuracy of DTW from about 84% to 90%.
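The DTW baseline referenced above can be sketched as a 1-nearest-neighbour classifier over dynamic-time-warping distances; this is a generic illustration of the baseline being improved, not the paper's proposed LTW method, and the sequences are toy values.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    # cost[i, j] = minimal cumulative cost of aligning a[:i] with b[:j]
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def classify_1nn(query, labelled):
    """1-nearest-neighbour classification: return the label of the
    labelled (sequence, label) pair closest to the query under DTW."""
    return min(labelled, key=lambda s: dtw_distance(query, s[0]))[1]
```

Learning a warping (as LTW does) would replace the fixed `abs` ground cost with a trainable one; the nearest-neighbour wrapper stays the same.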
no code implementations • 17 May 2017 • Xu Tian, Jun Zhang, Zejun Ma, Yi He, Juan Wei
The system that combines frame retaining with frame stacking can reduce the time consumption of both training and decoding.
no code implementations • 20 Apr 2017 • Tong Chen, Lin Wu, Xue Li, Jun Zhang, Hongzhi Yin, Yang Wang
The proposed model embeds soft attention into the recurrence to simultaneously pool out distinct features with particular focus and produce hidden representations that capture the contextual variations of relevant posts over time.
no code implementations • 21 Mar 2017 • Xu Tian, Jun Zhang, Zejun Ma, Yi He, Juan Wei, Peihao Wu, Wenchang Situ, Shuai Li, Yang Zhang
Within this competitive framework, LSTM models with more than 7 layers are successfully trained on Shenma voice search data in Mandarin, and they outperform deep LSTM models trained by the conventional approach.
no code implementations • 3 Mar 2017 • Xu Tian, Jun Zhang, Zejun Ma, Yi He, Juan Wei
As training data grows rapidly, large-scale parallel training on multi-GPU clusters is now widely applied to neural network model learning. We present a new approach that applies the exponential moving average method to large-scale parallel training of neural network models.
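The exponential moving average of model parameters can be sketched as below; the decay value and the toy parameter vectors are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def ema_update(shadow, params, decay=0.99):
    """One EMA step: shadow <- decay * shadow + (1 - decay) * params.

    `shadow` holds the averaged weights used for evaluation; `params`
    are the raw weights produced by the (possibly parallel) optimizer.
    """
    return decay * shadow + (1.0 - decay) * params

# Toy run: the raw weights jump around, the EMA trails them smoothly.
shadow = np.zeros(3)
for step_params in [np.ones(3), 2 * np.ones(3), np.ones(3)]:
    shadow = ema_update(shadow, step_params, decay=0.9)
```

In a parallel setting the same update would be applied to the weights gathered from the worker GPUs after each synchronization round.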
no code implementations • 1 Jan 2017 • Hamid Hamraz, Marco A. Contreras, Jun Zhang
This paper presents a non-parametric approach for segmenting trees from airborne LiDAR data in deciduous forests.
no code implementations • 30 Jul 2015 • Wei-Ya Ren, Shuo-Hao Li, Qiang Guo, Guo-Hui Li, Jun Zhang
A novel agglomerative clustering method is proposed by utilizing the path integral to define the affinity measure.
no code implementations • 3 Jun 2015 • Tianyi Liu, Shuangsang Fang, Yuehui Zhao, Peng Wang, Jun Zhang
Deep learning refers to a rapidly developing branch of machine learning that is based on learning multiple levels of representation.
no code implementations • 3 Sep 2014 • Xinsheng Lai, Yuren Zhou, Jun He, Jun Zhang
We also show that GSEMO achieves a $(2\ln n)$-approximation ratio for the MLST problem in expected polynomial time in $n$ and $k$.
no code implementations • 4 Aug 2014 • Yue-Jiao Gong, Jun Zhang
This mechanism helps the system respond instantly to the current traffic condition of the roundabout, thereby improving real-time responsiveness.
no code implementations • 5 Jul 2018 • Jun Zhang, Ashirbani Saha, Brian J. Soher, Maciej A. Mazurowski
Then, based on the segmentation results, a subject-specific piecewise linear mapping function was applied between the anchor points to normalize the same type of tissue in different patients into the same intensity ranges.
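The subject-specific piecewise linear mapping between anchor points can be sketched with `numpy.interp`; the anchor intensities below are hypothetical values chosen for illustration, not those used in the paper.

```python
import numpy as np

def normalize_intensities(image, subject_anchors, reference_anchors):
    """Map this subject's anchor intensities onto shared reference
    intensities, interpolating linearly between consecutive anchors."""
    return np.interp(image, subject_anchors, reference_anchors)

# Hypothetical anchors: background, mid-range tissue, brightest tissue.
subject_anchors   = [0.0, 120.0, 400.0]   # intensities found in this scan
reference_anchors = [0.0, 0.5, 1.0]       # common target scale
scan = np.array([0.0, 60.0, 120.0, 260.0, 400.0])
normalized = normalize_intensities(scan, subject_anchors, reference_anchors)
```

Each tissue type thus lands in the same intensity range across patients, as the abstract describes, while intensities between anchors are rescaled linearly.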
no code implementations • 18 May 2016 • Yuyi Mao, Jun Zhang, Khaled B. Letaief
Simulation results are presented to verify the theoretical analysis and to validate the effectiveness of the proposed algorithm.
Information Theory
no code implementations • 2 Sep 2018 • Guangxu Zhu, Dongzhu Liu, Yuqing Du, Changsheng You, Jun Zhang, Kaibin Huang
Accordingly, a new research area, called edge learning, emerges, which crosses and revolutionizes two disciplines: wireless communication and machine learning.
no code implementations • 5 Dec 2018 • Dongzhu Liu, Guangxu Zhu, Jun Zhang, Kaibin Huang
To solve the problem, a new retransmission protocol called data-importance aware automatic-repeat-request (importance ARQ) is proposed.
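A much-simplified reading of the importance-ARQ idea is that the retransmission decision depends on both channel quality and data importance; the entropy-based importance measure and the SNR-threshold rule below are assumptions for illustration, not the protocol's actual design.

```python
import math

def importance(label_probs):
    """Entropy of a model's predictive distribution for a sample: an
    assumed proxy for how much the sample matters to learning."""
    return -sum(p * math.log(p) for p in label_probs if p > 0)

def should_retransmit(snr, label_probs, snr_target=1.0):
    """Simplified importance-ARQ rule: request a retransmission only
    when the received SNR falls short of a quality target that grows
    with the sample's importance."""
    return snr < snr_target * (1.0 + importance(label_probs))
```

Under this rule, uncertain (high-entropy) samples demand a cleaner channel before being accepted, while confidently classified samples tolerate more noise.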
no code implementations • 18 Dec 2018 • Yifei Shen, Yuanming Shi, Jun Zhang, Khaled B. Letaief
To further address the task mismatch problem, we develop a transfer learning method via self-imitation in LORM, named LORM-TL, which can quickly adapt a pre-trained machine learning model to the new task with only a few additional unlabeled training samples.
no code implementations • EMNLP 2018 • Wayne Xiong, Lingfeng Wu, Jun Zhang, Andreas Stolcke
We propose to generalize language models for conversational speech recognition to allow them to operate across utterance boundaries and speaker changes, thereby capturing conversation-level phenomena such as adjacency pairs, lexical entrainment, and topical coherence.
no code implementations • 4 Jan 2019 • Rongrong Lin, Haizhang Zhang, Jun Zhang
We explore a generic definition of RKBS and the reproducing kernel for RKBS that is independent of construction.
no code implementations • 8 Apr 2019 • Leonid Polterovich, Daniel Rosen, Karina Samvelyan, Jun Zhang
The theory of persistence modules is an emerging field of algebraic topology which originated in topological data analysis.
Algebraic Topology; Classical Analysis and ODEs; Symplectic Geometry (MSC: 55U99, 58Cxx, 53Dxx)
no code implementations • 26 Apr 2019 • Khaled B. Letaief, Wei Chen, Yuanming Shi, Jun Zhang, Ying-Jun Angela Zhang
The recent upsurge of diversified mobile applications, especially those supported by Artificial Intelligence (AI), is spurring heated discussions on the future evolution of wireless communications.
no code implementations • 17 Nov 2018 • Yifei Shen, Yuanming Shi, Jun Zhang, Khaled B. Letaief
A unique advantage of the proposed method is that it can tackle the task mismatch issue with a few additional unlabeled training samples, which is especially important when transferring to large-size problems.
no code implementations • 2 Jun 2019 • Jun Zhang, Khaled B. Letaief
The Internet of Vehicles (IoV) is an emerging paradigm, driven by recent advancements in vehicular communications and networking.
Networking and Internet Architecture; Signal Processing
no code implementations • 7 Jun 2019 • Jun Zhang, Yao-Kun Lei, Xing Che, Zhen Zhang, Yi Isaac Yang, Yi Qin Gao
In this paper we first analyze the inductive bias underlying the data scattered across complex free energy landscapes (FEL), and exploit it to train deep neural networks that yield reduced and clustered representations of the FEL.
no code implementations • 5 Aug 2019 • Yinghui Zhang, Xiaojuan Deng, Jun Zhang, Hongwei Li
In this paper, a simple cutting-off strategy is introduced into the augmented Lagrangian based algorithms for minimizing the Euler's elastica energy, which leads to easy parameter tuning and fast convergence.
no code implementations • 25 Sep 2019 • Hao Fu, Liheng Bian, Jun Zhang
The conventional high-level sensing techniques require high-fidelity images as input to extract target features, which are produced by either complex imaging hardware or high-complexity reconstruction algorithms.
no code implementations • 26 Sep 2019 • Huapeng Wu, Zhengxia Zou, Jie Gui, Wen-Jun Zeng, Jieping Ye, Jun Zhang, Hongyi Liu, Zhihui Wei
In this paper, we make a thorough investigation of the attention mechanisms in an SR model and shed light on how simple and effective improvements on these ideas advance the state of the art.
no code implementations • 4 Nov 2019 • Shigang Liu, Jun Zhang, Yang Xiang, Wanlei Zhou, Dongxi Xiang
However, previous studies usually focused on different classifiers and overlooked the class imbalance problem in real-world biomedical datasets.
no code implementations • 10 Dec 2019 • Takuya Yoshioka, Igor Abramovski, Cem Aksoylar, Zhuo Chen, Moshe David, Dimitrios Dimitriadis, Yifan Gong, Ilya Gurvich, Xuedong Huang, Yan Huang, Aviv Hurvitz, Li Jiang, Sharon Koubi, Eyal Krupka, Ido Leichter, Changliang Liu, Partha Parthasarathy, Alon Vinnikov, Lingfeng Wu, Xiong Xiao, Wayne Xiong, Huaming Wang, Zhenghao Wang, Jun Zhang, Yong Zhao, Tianyan Zhou
This increases marginally to 1.6% when 50% of the attendees are unknown to the system.
no code implementations • 28 Jan 2020 • Jialin Dong, Jun Zhang, Yuanming Shi, Jessie Hui Wang
In this paper, we develop multi-armed bandit approaches for more efficient detection via coordinate descent, which make a delicate trade-off between exploration and exploitation in coordinate selection.
no code implementations • 24 Feb 2020 • Xiangyu Yang, Sheng Hua, Yuanming Shi, Hao Wang, Jun Zhang, Khaled B. Letaief
By exploiting the inherent connection between task selection and the group-sparsity structure of the transmit beamforming vector, we reformulate the optimization as a group sparse beamforming problem.
no code implementations • 22 Feb 2020 • Yuanming Shi, Kai Yang, Tao Jiang, Jun Zhang, Khaled B. Letaief
By pushing inference and training processes of AI models to edge nodes, edge AI has emerged as a promising alternative.
no code implementations • 23 Feb 2020 • Qingjian Lin, Weicheng Cai, Lin Yang, Jun-Jie Wang, Jun Zhang, Ming Li
Our diarization system includes multiple modules, namely voice activity detection (VAD), segmentation, speaker embedding extraction, similarity scoring, clustering, resegmentation and overlap detection.
no code implementations • 18 Mar 2020 • Xinjie Feng, Hongxun Yao, Yuankai Qi, Jun Zhang, Shengping Zhang
Different from previous transformer-based models [56, 34], which only use the transformer decoder to decode the convolutional attention, the proposed method uses convolutional feature maps as the word-embedding input to the transformer.
no code implementations • 10 Aug 2018 • Xiao Chen, Chaoran Li, Derui Wang, Sheng Wen, Jun Zhang, Surya Nepal, Yang Xiang, Kui Ren
In contrast to existing works, the adversarial examples crafted by our method can also deceive recent machine learning based detectors that rely on semantic features such as control-flow-graph.
Cryptography and Security
no code implementations • 25 Apr 2020 • Jun Zhang, Yao-Kun Lei, Zhen Zhang, Junhan Chang, Maodong Li, Xu Han, Lijiang Yang, Yi Isaac Yang, Yi Qin Gao
Deep learning is transforming many areas in science, and it has great potential in modeling molecular systems.
no code implementations • LREC 2020 • Xiaochang Gong, Qin Zhao, Jun Zhang, Ruibin Mao, Ruifeng Xu
Thus, the detection and processing of sarcasm is important to social media analysis. However, most existing sarcasm datasets are in English, and there is still a lack of an authoritative Chinese sarcasm dataset.
no code implementations • LREC 2020 • Chaofa Yuan, Yu-Han Liu, Rongdi Yin, Jun Zhang, Qinling Zhu, Ruibin Mao, Ruifeng Xu
Based on a high-quality annotation guideline and an effective quality control strategy, a corpus with 8,314 target-level sentiment annotations is constructed on 6,336 paragraphs from Chinese financial news text.
no code implementations • ACL 2020 • Zhiquan Ye, Yuxia Geng, Jiaoyan Chen, Jingmin Chen, Xiaoxiao Xu, SuHang Zheng, Feng Wang, Jun Zhang, Huajun Chen
In this situation, transferring from seen classes to unseen classes is extremely hard.
no code implementations • 11 Aug 2020 • Masaki Ikuta, Jun Zhang
We showed that the WGAN-based method was effective in preserving image texture.
no code implementations • 11 Aug 2020 • Liwei Hu, Wenyong Wang, Yu Xiang, Jun Zhang
Motivated by the problems of existing approaches and inspired by the success of generative adversarial networks (GANs) in computer vision, we prove an optimal discriminator theorem: when dealing with nonlinear sparse FFD regression and generation, the optimal discriminator of a GAN is a radial basis function neural network (RBFNN).
no code implementations • 15 Sep 2020 • Jun Zhang, Kuan Tian, Pei Dong, Haocheng Shen, Kezhou Yan, Jianhua Yao, Junzhou Huang, Xiao Han
Recently, artificial intelligence (AI) has been used in various disease diagnosis to improve diagnostic accuracy and reliability, but the interpretation of diagnosis results is still an open problem.
no code implementations • 23 Oct 2020 • Xiaogang Zhu, Shigang Liu, Xian Li, Sheng Wen, Jun Zhang, Camtepe Seyit, Yang Xiang
Fuzzing is one of the most effective techniques to identify potential software vulnerabilities.
no code implementations • 13 Nov 2020 • Jun Zhang, Yao-Kun Lei, Zhen Zhang, Xu Han, Maodong Li, Lijiang Yang, Yi Isaac Yang, Yi Qin Gao
Combining reinforcement learning (RL) and molecular dynamics (MD) simulations, we propose a machine-learning approach (RL$^\ddag$) to automatically unravel chemical reaction mechanisms.
no code implementations • 3 Nov 2020 • Mingkun Huang, Jun Zhang, Meng Cai, Yang Zhang, Jiali Yao, Yongbin You, Yi He, Zejun Ma
In this work, we analyze the cause of the huge gradient variance in RNN-T training and propose a new \textit{normalized jointer network} to overcome it.
Automatic Speech Recognition (ASR)
no code implementations • 3 Nov 2020 • Mingkun Huang, Meng Cai, Jun Zhang, Yang Zhang, Yongbin You, Yi He, Zejun Ma
In this work we propose an inference technique, asynchronous revision, to unify streaming and non-streaming speech recognition models.
no code implementations • 24 Apr 2018 • Yuantian Miao, Zichan Ruan, Lei Pan, Yu Wang, Jun Zhang, Yang Xiang
Network traffic analytics technology is a cornerstone for cyber security systems.
Cryptography and Security
no code implementations • 18 Nov 2019 • Bin Li, Shuseng Wang, Jun Zhang, Xainbin Cao, Chenglin Zhao
Massive multiple-input multiple-output (MIMO) radar, enabled by millimeter-wave virtual MIMO techniques, holds great promise for high-resolution automotive sensing and target detection in unmanned ground/aerial vehicles (UGA/UAV).
no code implementations • 26 Apr 2020 • Ye Xue, Yifei Shen, Vincent Lau, Jun Zhang, Khaled B. Letaief
Specifically, we propose a novel $\ell_3$-norm-based formulation to recover the data without channel estimation.
no code implementations • 14 May 2020 • Jingsheng Yu, Jun Zhang
We propose a new method to define trading algorithms in market design environments.
no code implementations • 19 May 2020 • Jun Zhang
The core is a traditional and useful solution concept in economic theory.
no code implementations • 2 Dec 2020 • Jun Zhang, Yuan Sun
In addition, we investigate the dependence of the greybody factor and the sparsity of Hawking radiation on the conformal parameters.
General Relativity and Quantum Cosmology
no code implementations • 11 Dec 2020 • Jie Gu, Feng Wang, Qinghui Sun, Zhiquan Ye, Xiaoxiao Xu, Jingmin Chen, Jun Zhang
In this work, we focus on developing a universal user representation model.
no code implementations • 22 Dec 2020 • Jun Zhang, Yao-Kun Lei, Yaqiang Zhou, Yi Isaac Yang, Yi Qin Gao
Deep learning is changing many areas in molecular physics, and it has shown great potential to deliver new solutions to challenging molecular modeling problems.
no code implementations • 26 Mar 2021 • Lumin Liu, Jun Zhang, Shenghui Song, Khaled B. Letaief
Hierarchical FL, with a client-edge-cloud aggregation hierarchy, can effectively leverage both the cloud server's access to many clients' data and the edge servers' closeness to the clients to achieve a high communication efficiency.
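The client-edge-cloud aggregation hierarchy can be sketched as two nested weighted averages over parameter vectors; this FedAvg-style simplification (weights proportional to sample counts) is an assumption for illustration, not the paper's full algorithm.

```python
import numpy as np

def weighted_average(models, sizes):
    """FedAvg-style aggregation: average parameter vectors weighted by
    the number of samples each model was trained on."""
    sizes = np.asarray(sizes, dtype=float)
    stacked = np.stack(models)
    return (stacked * sizes[:, None]).sum(axis=0) / sizes.sum()

def hierarchical_aggregate(edge_groups):
    """edge_groups: list of (client_models, client_sizes), one entry per
    edge server. Each edge first aggregates its own clients; the cloud
    then aggregates the edge models, weighted by each edge's total
    sample count."""
    edge_models, edge_sizes = [], []
    for client_models, client_sizes in edge_groups:
        edge_models.append(weighted_average(client_models, client_sizes))
        edge_sizes.append(sum(client_sizes))
    return weighted_average(edge_models, edge_sizes)
```

The communication saving comes from running several cheap client-edge rounds for every expensive edge-cloud round; the sketch shows only the aggregation arithmetic.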
no code implementations • 10 Mar 2021 • Zheng-Ping Li, Jun-Tian Ye, Xin Huang, Peng-Yu Jiang, Yuan Cao, Yu Hong, Chao Yu, Jun Zhang, Qiang Zhang, Cheng-Zhi Peng, Feihu Xu, Jian-Wei Pan
Long-range active imaging has widespread applications in remote sensing and target recognition.
no code implementations • 12 Sep 2019 • Federico Echenique, Antonio Miralles, Jun Zhang
We propose a pseudo-market solution to resource allocation problems subject to constraints.
no code implementations • 6 Apr 2021 • Xuyang Chang, Liheng Bian, Jun Zhang
In computational phase imaging, phase retrieval (PR) is required to reconstruct both amplitude and phase in complex space from intensity-only measurements.
no code implementations • 8 Apr 2021 • Daoyu Li, Liheng Bian, Jun Zhang
Recovering these sharp video frames from a single blurred image is nontrivial, due to not only its strong ill-posedness, but also various types of complex motion in reality such as rotation and motion in depth.
no code implementations • 17 Feb 2021 • Xinyu Bian, Yuyi Mao, Jun Zhang
Massive machine-type communication (mMTC) has been regarded as one of the most important use scenarios in the fifth generation (5G) and beyond wireless networks, which demands scalable access for a large number of devices.
no code implementations • 2 Apr 2021 • Lu Huang, Jingyu Sun, Yufeng Tang, JunFeng Hou, Jinkun Chen, Jun Zhang, Zejun Ma
This work describes an encoder pre-training procedure that uses frame-wise labels to improve the training of the streaming recurrent neural network transducer (RNN-T) model.
no code implementations • 26 Apr 2021 • Yuchang Sun, Jiawei Shao, Yuyi Mao, Jessie Hui Wang, Jun Zhang
Federated edge learning (FEEL) has emerged as an effective approach to reduce the large communication latency in Cloud-based machine learning solutions, while preserving data privacy.
no code implementations • 26 Apr 2021 • Zhefeng Qiao, Xianghao Yu, Jun Zhang, Khaled B. Letaief
Federated learning (FL) is a promising and powerful approach for training deep learning models without sharing the raw data of clients.
no code implementations • 26 Apr 2021 • Xinyu Bian, Yuyi Mao, Jun Zhang
In this paper, we propose a turbo receiver for joint activity detection and data decoding in grant-free massive random access, which iterates between a detector and a belief propagation (BP)-based channel decoder.
no code implementations • 13 May 2021 • XiaoYu Zhang, Chao Chen, Yi Xie, Xiaofeng Chen, Jun Zhang, Yang Xiang
This survey presents the most recent findings of privacy attacks and defenses appeared in cloud-based neural network services.
no code implementations • 31 May 2021 • Xuyang Chang, Liheng Bian, Shaowei Jiang, Guoan Zheng, Jun Zhang
Complex-domain imaging has emerged as a valuable technique for investigating weak-scattered samples.
no code implementations • 9 Jun 2021 • Lintao Peng, Liheng Bian, Tiexin Liu, Jun Zhang
In this work, we report an agile wide-field imaging framework with selective high resolution that requires only two detectors.
no code implementations • 11 Jun 2021 • Bo Li, Yifei Shen, Yezhen Wang, Wenzhen Zhu, Colorado J. Reed, Jun Zhang, Dongsheng Li, Kurt Keutzer, Han Zhao
IIB significantly outperforms IRM on synthetic datasets where pseudo-invariant features and geometric skews occur, showing the effectiveness of the proposed formulation in overcoming the failure modes of IRM.
no code implementations • 13 Jun 2021 • Huapeng Wu, Jie Gui, Jun Zhang, James T. Kwok, Zhihui Wei
Recently, deep convolutional neural network methods have achieved excellent performance in image super-resolution (SR), but they cannot be easily applied to embedded devices due to their large memory cost.
no code implementations • 13 Jun 2021 • Huapeng Wu, Jie Gui, Jun Zhang, James T. Kwok, Zhihui Wei
Recently, convolutional neural network (CNN) based image super-resolution (SR) methods have achieved significant performance improvement.
no code implementations • 12 Jul 2021 • Xinyu Bian, Yuyi Mao, Jun Zhang
In particular, the common sparsity pattern in the received pilot and data signal has been ignored in most existing studies, and auxiliary information of channel decoding has not been utilized for user activity detection.
no code implementations • 15 Jul 2021 • Yufeng Xia, Jun Zhang, Zhiqiang Gong, Tingsong Jiang, Wen Yao
Deep Ensemble is widely considered the state-of-the-art method which can estimate the uncertainty with higher quality, but it is very expensive to train and test.
no code implementations • 24 Jul 2021 • Xueke Zheng, Runze Cai, Shuixin Xiao, Yu Qiu, Jun Zhang, Mian Li
A real-world application to the estimation of the vertical wheel force in a full vehicle system is conducted to demonstrate the effectiveness of the proposed method.
no code implementations • 3 Aug 2021 • Yifan Ma, Yifei Shen, Xianghao Yu, Jun Zhang, S. H. Song, Khaled B. Letaief
Channel estimation and beamforming play critical roles in frequency-division duplexing (FDD) massive multiple-input multiple-output (MIMO) systems.
no code implementations • 19 Aug 2021 • Guohao Peng, Yufeng Yue, Jun Zhang, Zhenyu Wu, Xiaoyu Tang, Danwei Wang
(2) By exploiting the interpretability of the local weighting scheme, a semantic constrained initialization is proposed so that the local attention can be reinforced by semantic priors.
no code implementations • 24 Aug 2021 • Haiyan Liu, Liheng Bian, Jun Zhang
We envision that this image-free segmentation technique can be widely applied on various resource-limited platforms, such as UAVs and unmanned vehicles, that require real-time sensing.
no code implementations • 30 Aug 2021 • Xinjie Zhang, Jiawei Shao, Yuyi Mao, Jun Zhang
Device-edge co-inference, which partitions a deep neural network between a resource-constrained mobile device and an edge server, recently emerges as a promising paradigm to support intelligent mobile applications.
no code implementations • 1 Sep 2021 • Sen yang, Feng Luo, Jun Zhang, Xiyue Wang
Mitotic count is the most important morphological feature of breast cancer grading.
no code implementations • 1 Oct 2021 • Yifan Ma, Yifei Shen, Xianghao Yu, Jun Zhang, S. H. Song, Khaled B. Letaief
Furthermore, such networks will vary dynamically in a significant way, which makes it intractable to develop comprehensive analytical models.
no code implementations • ICCV 2021 • Guohao Peng, Jun Zhang, Heshan Li, Danwei Wang
The core of visual place recognition (VPR) lies in how to identify task-relevant visual cues and embed them into discriminative representations.
no code implementations • EMNLP 2021 • Xiaoya Li, Jiwei Li, Xiaofei Sun, Chun Fan, Tianwei Zhang, Fei Wu, Yuxian Meng, Jun Zhang
Out-of-Distribution (OOD) detection is an important problem in natural language processing (NLP).
no code implementations • NeurIPS Workshop DL-IG 2020 • Tian Han, Jun Zhang, Ying Nian Wu
This paper reviews the em-projections in information geometry and the recent understanding of the variational auto-encoder, explains that they share a common formulation as joint minimization of the Kullback-Leibler divergence between two manifolds of probability distributions, and shows that the joint minimization can be implemented by alternating projections or alternating gradient descent.
no code implementations • 9 Dec 2021 • Yuchang Sun, Jiawei Shao, Yuyi Mao, Jun Zhang
Federated edge learning (FEEL) has drawn much attention as a privacy-preserving distributed learning framework for mobile edge networks.
no code implementations • 20 Dec 2021 • Yuchang Sun, Jiawei Shao, Yuyi Mao, Jessie Hui Wang, Jun Zhang
By exploiting the low-latency communication among edge servers for efficient model sharing, SD-FEEL can incorporate more training data, while enjoying much lower latency compared with conventional federated learning.
no code implementations • 8 Jan 2022 • Xinrui Zhan, Liheng Bian, Chunli Zhu, Jun Zhang
While the network is training at a high sampling rate, the modulation patterns and corresponding weights are updated iteratively, which produces optimal ranked encoding series when converged.
no code implementations • 15 Jan 2022 • Meng Xu, Youchen Wang, Bin Xu, Jun Zhang, Jian Ren, Stefan Poslad, Pengfei Xu
Camera localization, together with that of the objects within its field of view, could benefit many computer vision fields, such as autonomous driving, robot navigation, and augmented reality (AR).
no code implementations • 18 Jan 2022 • Zhen Gao, Minghui Wu, Chun Hu, Feifei Gao, Guanghui Wen, Dezhi Zheng, Jun Zhang
To this end, by modeling the key transmission modules as an end-to-end (E2E) neural network, this paper proposes a data-driven deep learning (DL)-based unified hybrid beamforming framework for both the time division duplex (TDD) and frequency division duplex (FDD) systems with implicit channel state information (CSI).
no code implementations • 25 Jan 2022 • Yuchang Sun, Jiawei Shao, Songze Li, Yuyi Mao, Jun Zhang
Federated learning (FL) has attracted much attention as a privacy-preserving distributed machine learning framework, where many clients collaboratively train a machine learning model by exchanging model updates with a parameter server instead of sharing their raw data.
no code implementations • 9 Feb 2022 • Chen Shen, Yi Liu, Wenzhi Fan, Bin Wang, Shixue Wen, Yao Tian, Jun Zhang, Jingsheng Yang, Zejun Ma
For Track 1, we propose several approaches to empower the clustering-based speaker diarization system to handle overlapped speech.
no code implementations • 17 Feb 2022 • Xiangjie Kong, Jun Zhang, Da Zhang, Yi Bu, Ying Ding, Feng Xia
Under this consideration, our paper presents and analyzes the causal factors that are crucial for scholars' academic success.
no code implementations • 4 Mar 2022 • Wenhua Zhang, Jun Zhang
Connective nuclei may look very different from each other, while some of them share a similar shape with epithelial nuclei.
no code implementations • 14 Mar 2022 • Lumin Liu, Jun Zhang, S. H. Song, Khaled B. Letaief
Federated Distillation (FD) is a recently proposed alternative to enable communication-efficient and robust FL, which achieves orders of magnitude reduction of the communication overhead compared with FedAvg and is flexible to handle heterogeneous models at the clients.
no code implementations • 22 Mar 2022 • Chi Liu, Huajie Chen, Tianqing Zhu, Jun Zhang, Wanlei Zhou
To evaluate the attack efficacy, we crafted heterogeneous security scenarios where the detectors were embedded with different levels of defense and the attackers' background knowledge of data varies.
no code implementations • 29 Mar 2022 • Kunyuan Li, Jun Zhang, Jun Gao, Meibin Qi
In this paper, we propose a self-supervised learning framework for light field depth estimation.
no code implementations • 5 Apr 2022 • Qi Zhong, Leo Yu Zhang, Shengshan Hu, Longxiang Gao, Jun Zhang, Yong Xiang
Fine-tuning attacks are effective in removing the embedded watermarks in deep learning models.
no code implementations • 7 Apr 2022 • Siteng Chen, Jinxi Xiang, Xiyue Wang, Jun Zhang, Sen yang, Junzhou Huang, Wei Yang, Junhua Zheng, Xiao Han
MC-TMB algorithm also exhibited good generalization on the external validation cohort with an AUC of 0.732 (0.683-0.761), and better performance when compared to other methods.
no code implementations • 29 Apr 2022 • Yuting Gao, Jinfeng Liu, Zihan Xu, Jun Zhang, Ke Li, Rongrong Ji, Chunhua Shen
Large-scale vision-language pre-training has achieved promising results on downstream tasks.
no code implementations • 27 May 2022 • Bowen Zhao, Wei-neng Chen, Feng-Feng Wei, Ximeng Liu, Qingqi Pei, Jun Zhang
Specifically, PEGA enables users to outsource COPs to a cloud server that holds a competitive GA and approximates the optimal solution in a privacy-preserving manner.
no code implementations • 11 Jun 2022 • Zijian Li, Jiawei Shao, Yuyi Mao, Jessie Hui Wang, Jun Zhang
A combination of the local private dataset and synthetic dataset with confident pseudo labels leads to nearly identical data distributions among clients, which improves the consistency among local models and benefits the global aggregation.
no code implementations • 15 Jun 2022 • Rongkang Dong, Yuyi Mao, Jun Zhang
In this paper, we propose an early exit prediction mechanism to reduce the on-device computation overhead in a device-edge co-inference system supported by early-exit networks.
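An early-exit network of the kind this prediction mechanism targets can be sketched as a cascade of stages with confidence-thresholded exit heads; the softmax-confidence rule, the threshold value, and the toy stage/head functions are illustrative assumptions, and the paper's prediction mechanism itself is not shown.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D logit vector."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def early_exit_infer(x, stages, exits, threshold=0.9):
    """Run stages in order; after each stage, an exit head produces
    logits. Stop at the first exit whose top softmax probability clears
    the threshold (always stopping at the last), returning
    (prediction, exit_index)."""
    h = x
    for i, (stage, head) in enumerate(zip(stages, exits)):
        h = stage(h)
        probs = softmax(head(h))
        if probs.max() >= threshold or i == len(stages) - 1:
            return int(probs.argmax()), i
```

Samples that exit early skip the remaining stages entirely, which is where the on-device computation saving comes from.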
no code implementations • 23 Jun 2022 • Yu Xiang, Guangbo Zhang, Liwei Hu, Jun Zhang, Wenyong Wang
The geometrical shape of an airfoil, together with the corresponding flight conditions, is a crucial factor for aerodynamic performance prediction.
no code implementations • 10 Jul 2022 • Lin Li, Chao Chen, Lei Pan, Yonghang Tai, Jun Zhang, Yang Xiang
It reduces the success rate of rPPG spoofing attacks in user authentication to 0.05.
no code implementations • Proceedings of the 2014 ACM SIGMOD International Conference on Management of Data 2014 • Jun Zhang, Graham Cormode, Cecilia M. Procopiuc, Divesh Srivastava, Xiaokui Xiao
Given a dataset D, PRIVBAYES first constructs a Bayesian network N, which (i) provides a succinct model of the correlations among the attributes in D and (ii) allows us to approximate the distribution of data in D using a set P of low-dimensional marginals of D. After that, PRIVBAYES injects noise into each marginal in P to ensure differential privacy, and then uses the noisy marginals and the Bayesian network to construct an approximation of the data distribution in D. Finally, PRIVBAYES samples tuples from the approximate distribution to construct a synthetic dataset, and then releases the synthetic data.
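The noise-injection and sampling steps of this pipeline can be sketched on a single one-dimensional marginal; the Bayesian-network construction is omitted and the epsilon and noise scale are arbitrary choices, so this is a heavily reduced illustration rather than the PRIVBAYES method itself.

```python
import numpy as np

def noisy_marginal(values, categories, epsilon, rng):
    """Histogram of a categorical attribute with Laplace noise added to
    each count (the standard Laplace mechanism), then clipped at zero
    and renormalized into a probability distribution."""
    counts = np.array([np.sum(values == c) for c in categories], dtype=float)
    counts += rng.laplace(scale=2.0 / epsilon, size=len(counts))
    counts = np.clip(counts, 0.0, None)
    return counts / counts.sum()

def synthesize(probs, categories, n, rng):
    """Sample n synthetic records from the noisy marginal."""
    return rng.choice(categories, size=n, p=probs)

rng = np.random.default_rng(0)
data = np.array(["a"] * 80 + ["b"] * 20)
probs = noisy_marginal(data, ["a", "b"], epsilon=1.0, rng=rng)
synthetic = synthesize(probs, ["a", "b"], n=100, rng=rng)
```

The full method applies this noise-then-sample pattern to a set of low-dimensional conditional marginals tied together by the Bayesian network, rather than to a single marginal.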
no code implementations • 3 Sep 2022 • Yifan Ma, Xianghao Yu, Jun Zhang, S. H. Song, Khaled B. Letaief
In limited feedback multi-user multiple-input multiple-output (MU-MIMO) cellular networks, users send quantized information about the channel conditions to the associated base station (BS) for downlink beamforming.
no code implementations • 27 Sep 2022 • Chengzhi Lin, AnCong Wu, Junwei Liang, Jun Zhang, Wenhang Ge, Wei-Shi Zheng, Chunhua Shen
To address this problem, we propose a Text-Adaptive Multiple Visual Prototype Matching model, which automatically captures multiple prototypes to describe a video by adaptive aggregation of video token features.
no code implementations • 6 Oct 2022 • Jiawei Shao, Yuchang Sun, Songze Li, Jun Zhang
Federated learning (FL) strives to enable collaborative training of machine learning models without centrally collecting clients' private data.
no code implementations • 27 Oct 2022 • Jun Zhang, Ping Li, Wei Wang
Recent advances in neural networks have been successfully applied to many tasks in online recommendation applications.
no code implementations • 8 Nov 2022 • Yuchang Sun, Jiawei Shao, Yuyi Mao, Songze Li, Jun Zhang
During training, the server computes gradients on the global coded dataset to compensate for the missing model updates of the straggling devices.
no code implementations • 15 Nov 2022 • Wentao Yu, Hengtao He, Xianghao Yu, Shenghui Song, Jun Zhang, Khaled B. Letaief
Reliability is of paramount importance for the physical layer of wireless systems due to its decisive impact on end-to-end performance.
no code implementations • 28 Nov 2022 • Yifan Ma, Wentao Yu, Xianghao Yu, Jun Zhang, Shenghui Song, Khaled B. Letaief
In this paper, we propose a lightweight and flexible deep learning-based CSI feedback approach by capitalizing on deep equilibrium models.
no code implementations • 28 Dec 2022 • Liheng Bian, Haoze Song, Lintao Peng, Xuyang Chang, Xi Yang, Roarke Horstmeyer, Lin Ye, Tong Qin, Dezhi Zheng, Jun Zhang
Benefiting from its single-photon sensitivity, single-photon avalanche diode (SPAD) array has been widely applied in various fields such as fluorescence lifetime imaging and quantum computing.
no code implementations • 3 Jan 2023 • Yandong Shi, Lixiang Lian, Yuanming Shi, Zixin Wang, Yong Zhou, Liqun Fu, Lin Bai, Jun Zhang, Wei Zhang
The sixth generation (6G) wireless systems are envisioned to enable the paradigm shift from "connected things" to "connected intelligence", featured by ultra high density, large-scale, dynamic heterogeneity, diversified functional requirements and machine learning capabilities, which leads to a growing need for highly efficient intelligent algorithms.
no code implementations • 12 Jan 2023 • Siteng Chen, Xiyue Wang, Jun Zhang, Liren Jiang, Ning Zhang, Feng Gao, Wei Yang, Jinxi Xiang, Sen Yang, Junhua Zheng, Xiao Han
The OSrisk for the prediction of 5-year survival status achieved an AUC of 0.784 (0.746-0.819) in the TCGA cohort, which was further verified in the independent General cohort and the CPTAC cohort, with AUCs of 0.774 (0.723-0.820) and 0.702 (0.632-0.765), respectively.
no code implementations • 16 Dec 2022 • Liheng Bian, Xinrui Zhan, Xuyang Chang, Daoyu Li, Rong Yan, Yinuo Zhang, Haowen Ruan, Jun Zhang
In the proposed framework of single-pixel detection, the optical field from a target is first scattered by an optical diffuser and then two-dimensionally modulated by a spatial light modulator.
no code implementations • 13 Feb 2023 • Fei Kong, Xiyue Wang, Jinxi Xiang, Sen Yang, Xinran Wang, Meng Yue, Jun Zhang, Junhan Zhao, Xiao Han, Yuhan Dong, Biyue Zhu, Fang Wang, Yueping Liu
We assessed the effectiveness of FACL in cancer diagnosis and Gleason grading tasks using 19,461 whole-slide images of prostate cancer from multiple centers.
no code implementations • 24 Feb 2023 • Xuefeng Wang, Xinran Li, Jiawei Shao, Jun Zhang
Learning communication strategies in cooperative multi-agent reinforcement learning (MARL) has recently attracted intensive attention.
Multi-agent Reinforcement Learning • Reinforcement Learning +2
no code implementations • 16 Mar 2023 • Yupeng Huang, Hong Zhang, Siyuan Jiang, Dajiong Yue, Xiaohan Lin, Jun Zhang, Yi Qin Gao
In this study, we take the advantage of both traditional and machine-learning based methods, and present a method Deep Site and Docking Pose (DSDP) to improve the performance of blind docking.
no code implementations • 22 Mar 2023 • Bowen Zhao, Wei-Neng Chen, Xiaoguo Li, Ximeng Liu, Qingqi Pei, Jun Zhang
To this end, in this paper, we discuss three typical optimization paradigms (i.e., centralized optimization, distributed optimization, and data-driven optimization) to characterize optimization modes of evolutionary computation and propose BOOM to sort out privacy concerns in evolutionary computation.
no code implementations • 12 Apr 2023 • Feng-Feng Wei, Wei-Neng Chen, Xiao-Qi Guo, Bowen Zhao, Sang-Woon Jeon, Jun Zhang
Inspired by this, this paper intends to introduce crowdsourcing into evolutionary computation (EC) to propose a crowdsourcing-based evolutionary computation (CEC) paradigm for distributed optimization.
no code implementations • 12 Apr 2023 • Wei-Neng Chen, Feng-Feng Wei, Tian-Fang Zhao, Kay Chen Tan, Jun Zhang
Based on this taxonomy, existing studies on DEC are reviewed in terms of purpose, parallel structure of the algorithm, parallel model for implementation, and the implementation environment.
no code implementations • 12 Apr 2023 • Xinyu Bian, Yuyi Mao, Jun Zhang
Specifically, by jointly leveraging the user activity correlation between adjacent transmission blocks and the historical channel estimation results, we first develop an activity-correlation-aware receiver for grant-free massive RA systems with retransmission based on the correlated approximate message passing (AMP) algorithm.
no code implementations • 19 Apr 2023 • Jingjin Li, Chao Chen, Lei Pan, Mostafa Rahimi Azghadi, Hossein Ghodosi, Jun Zhang
The privacy issues include technical information stealing and policy-level privacy breaches.
Automatic Speech Recognition (ASR) +2
no code implementations • 2 May 2023 • Wenqiang Sun, Sen Li, Yuchang Sun, Jun Zhang
Federated learning (FL) attempts to train a global model by aggregating local models from distributed devices under the coordination of a central server.
no code implementations • 2 May 2023 • Jun Zhang, Xiaohan Lin, Weinan E, Yi Qin Gao
Multiscale molecular modeling is widely applied in scientific research of molecular properties over large time and length scales.
no code implementations • 13 May 2023 • Tailin Zhou, Zehong Lin, Jun Zhang, Danny H. K. Tsang
Based on these findings from our loss landscape visualization and loss decomposition, we propose utilizing iterative moving averaging (IMA) on the global model at the late training phase to reduce its deviation from the expected minimum, while constraining client exploration to limit the maximum distance between the global and client models.
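The IMA step itself is simple: once the late training phase begins, the server keeps a running average of the global models produced by successive rounds. Below is a toy FedAvg sketch under assumed hyperparameters (two synthetic non-IID clients, one local gradient step per round, and a fixed round at which averaging starts); the client-exploration constraint is omitted for brevity.

```python
import numpy as np

def federated_round(global_w, client_data, lr=0.1):
    """One FedAvg round on a toy least-squares problem: each client takes a
    gradient step from the global model; the server averages the results."""
    updates = []
    for X, y in client_data:
        grad = 2 * X.T @ (X @ global_w - y) / len(y)
        updates.append(global_w - lr * grad)
    return np.mean(updates, axis=0)

rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0])
# Two clients with differently distributed (non-IID) inputs
clients = []
for shift in (-1.0, 1.0):
    X = rng.normal(shift, 1.0, size=(100, 2))
    y = X @ w_true + rng.normal(0, 0.1, size=100)
    clients.append((X, y))

w = np.zeros(2)
start_ima, w_ima, k = 40, None, 0   # assumed start of the late phase
for t in range(60):
    w = federated_round(w, clients)
    if t >= start_ima:               # iterative moving average of globals
        k += 1
        w_ima = w if w_ima is None else w_ima + (w - w_ima) / k
```

Averaging only late-phase iterates smooths out the oscillation of the global model around the minimum instead of blending in poor early-phase models.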
no code implementations • 21 May 2023 • Xinyu Bian, Yuyi Mao, Jun Zhang
Most existing studies on joint activity detection and channel estimation for grant-free massive random access (RA) systems assume perfect synchronization among all active users, which is hard to achieve in practice.
no code implementations • 10 May 2023 • Xiaorui Bai, Wenyong Wang, Jun Zhang, Yueqing Wang, Yu Xiang
Flow field segmentation and classification help researchers to understand vortex structure and thus turbulent flow.
no code implementations • 25 May 2023 • Liheng Bian, Daoyu Li, Shuoguang Wang, Chunyang Teng, Huteng Liu, Hanwen Xu, Xuyang Chang, Guoqiang Zhao, Shiyong Li, Jun Zhang
These elements are then sampled based on the ranking, building the experimentally optimal sparse sampling strategy that reduces the cost of antenna array by up to one order of magnitude.
no code implementations • 26 May 2023 • Yuchang Sun, Zehong Lin, Yuyi Mao, Shi Jin, Jun Zhang
In this paper, we propose a probabilistic device scheduling framework for over-the-air FL, named PO-FL, to mitigate the negative impact of channel noise, where each device is scheduled according to a certain probability and its model update is reweighted using this probability in aggregation.
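The reweighting rule is what keeps the aggregation unbiased: a device scheduled with probability p_i has its update scaled by w_i / p_i, so the expected aggregate equals the full weighted sum over all devices. A sketch with made-up updates, weights, and probabilities (the channel-noise modeling of over-the-air FL is omitted):

```python
import numpy as np

rng = np.random.default_rng(0)

def po_fl_aggregate(updates, weights, probs, rng):
    """Schedule device i with probability probs[i]; reweight a scheduled
    device's update by weights[i] / probs[i] so that the aggregate is
    unbiased: E[result] = sum_i weights[i] * updates[i]."""
    agg = np.zeros_like(updates[0])
    for u, w, p in zip(updates, weights, probs):
        if rng.random() < p:
            agg += (w / p) * u
    return agg

updates = [np.array([1.0, 0.0]), np.array([0.0, 2.0]), np.array([1.0, 1.0])]
weights = [0.5, 0.3, 0.2]        # e.g., proportional to local dataset sizes
probs   = [0.9, 0.5, 0.7]        # hypothetical scheduling probabilities

# Monte Carlo check of unbiasedness
mean = np.mean([po_fl_aggregate(updates, weights, probs, rng)
                for _ in range(20000)], axis=0)
expected = sum(w * u for w, u in zip(weights, updates))
```

The scheduling probabilities themselves are then the quantity to optimize, trading off participation cost against the variance introduced by the 1/p scaling.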
no code implementations • 27 May 2023 • Linhao Dong, Zhecheng An, Peihao Wu, Jun Zhang, Lu Lu, Zejun Ma
We also observe the cross-modal representation extracted by CIF-PT obtains better performance than other neural interfaces for the tasks of SLU, including the dominant speech representation learned from self-supervised pre-training.
no code implementations • 7 Jun 2023 • Lu Huang, Boyu Li, Jun Zhang, Lu Lu, Zejun Ma
Domain adaptation using text-only corpus is challenging in end-to-end (E2E) speech recognition.
no code implementations • 6 Jul 2023 • Yifei Shen, Jiawei Shao, Xinjie Zhang, Zehong Lin, Hao Pan, Dongsheng Li, Jun Zhang, Khaled B. Letaief
The evolution of wireless networks gravitates towards connected intelligence, a concept that envisions seamless interconnectivity among humans, objects, and intelligence in a hyper-connected cyber-physical world.
no code implementations • 20 Jul 2023 • Jiawei Shao, Zijian Li, Wenqiang Sun, Tailin Zhou, Yuchang Sun, Lumin Liu, Zehong Lin, Yuyi Mao, Jun Zhang
Without data centralization, FL allows clients to share local information in a privacy-preserving manner.
no code implementations • 7 Aug 2023 • Lumin Liu, Jun Zhang, Shenghui Song, Khaled B. Letaief
To improve communication efficiency and achieve a better privacy-utility trade-off, we propose a communication-efficient FL training algorithm with differential privacy guarantee.
no code implementations • 9 Aug 2023 • Zijian Li, Yuchang Sun, Jiawei Shao, Yuyi Mao, Jessie Hui Wang, Jun Zhang
For better privacy preservation, we propose a hard feature augmentation method to transfer real features towards the decision boundary, with which the synthetic data not only improve the model generalization but also erase the information of real features.
no code implementations • 12 Aug 2023 • Yongcong Chen, Ting Zeng, Jun Zhang
At present, mainstream artificial intelligence generally adopts the technical path of "attention mechanism + deep learning" + "reinforcement learning".
no code implementations • 27 Aug 2023 • Chen Shen, Jun Zhang, Xinggong Liang, Zeyi Hao, Kehan Li, Fan Wang, Zhenyuan Wang, Chunfeng Lian
Forensic pathology is critical in analyzing death manner and time from the microscopic aspect to assist in the establishment of reliable factual bases for criminal investigation.
no code implementations • 30 Aug 2023 • Zijian Li, Zehong Lin, Jiawei Shao, Yuyi Mao, Jun Zhang
However, devices often have non-independent and identically distributed (non-IID) data, meaning their local data distributions can vary significantly.
no code implementations • 14 Sep 2023 • Jiaheng Wei, Yanjun Zhang, Leo Yu Zhang, Chao Chen, Shirui Pan, Kok-Leong Ong, Jun Zhang, Yang Xiang
For the first time, we show the feasibility of a client-side adversary with limited knowledge being able to recover the training samples from the aggregated global model.
no code implementations • 18 Sep 2023 • Wentao Yu, Yifan Ma, Hengtao He, Shenghui Song, Jun Zhang, Khaled B. Letaief
Ultra-massive multiple-input multiple-output (UMMIMO) is a cutting-edge technology that promises to revolutionize wireless networks by providing an unprecedentedly high spectral and energy efficiency.
no code implementations • 20 Sep 2023 • Kuan Tian, Yonghang Guan, Jinxi Xiang, Jun Zhang, Xiao Han, Wei Yang
First, to solve the problem of inconsistency of codec caused by the uncertainty of floating point calculations across platforms, we design a calibration transmitting system to guarantee the consistent quantization of entropy parameters between the encoding and decoding stages.
no code implementations • 23 Sep 2023 • Peiwen Jiang, Chao-Kai Wen, Xinping Yi, Xiao Li, Shi Jin, Jun Zhang
Foundation models (FMs), including large language models, have become increasingly popular due to their wide-ranging applicability and ability to understand human-like semantics.
no code implementations • 29 Sep 2023 • Tailin Zhou, Jun Zhang, Danny H. K. Tsang
Empirically, reducing data heterogeneity makes the connectivity on different paths more similar, forming more low-error overlaps between client and global modes.
no code implementations • 16 Oct 2023 • Kuan Tian, Yonghang Guan, Jinxi Xiang, Jun Zhang, Xiao Han, Wei Yang
Due to the absence of autoregressive modeling and optical flow alignment, we can design an extremely minimalist framework that can greatly benefit computational efficiency.
no code implementations • 16 Oct 2023 • Jun Zhang, Lipeng Zhu, Chao Wang, Shutao Li
On the other hand, tensor nuclear norm (TNN)-based approaches have recently been demonstrated to be more efficient at preserving high-dimensional low-rank structures in tensor recovery.
no code implementations • 7 Nov 2023 • Yao Zhang, Zhiwen Yu, Jun Zhang, Liang Wang, Tom H. Luan, Bin Guo, Chau Yuen
Nevertheless, existing MARL algorithms ignore effective information aggregation, which is fundamental to improving the learning capacity of decentralized agents.
no code implementations • 14 Nov 2023 • Wentao Yu, Hengtao He, Xianghao Yu, Shenghui Song, Jun Zhang, Ross D. Murch, Khaled B. Letaief
Holographic MIMO (HMIMO) has recently been recognized as a promising enabler for future 6G systems through the use of an ultra-massive number of antennas in a compact space to exploit the propagation characteristics of the electromagnetic (EM) channel.
no code implementations • 15 Nov 2023 • Jin Qiu, Lu Huang, Boyu Li, Jun Zhang, Lu Lu, Zejun Ma
Deep biasing for the Transducer can improve the recognition performance of rare words or contextual entities, which is essential in practical applications, especially for streaming Automatic Speech Recognition (ASR).
Automatic Speech Recognition (ASR) +1
no code implementations • 21 Nov 2023 • Zhen Chen, Yuhao Zhai, Jun Zhang, Jinqiao Wang
Specifically, we propose an efficient multi-scale surgical temporal action (MS-STA) module, which integrates visual features with spatial and temporal knowledge of surgical actions at the cost of 2D networks.
no code implementations • 1 Dec 2023 • Yuyi Mao, Xianghao Yu, Kaibin Huang, Ying-Jun Angela Zhang, Jun Zhang
Guided by these principles, we then explore energy-efficient design methodologies for the three critical tasks in edge AI systems, including training data acquisition, edge training, and edge inference.
no code implementations • 16 Dec 2023 • Wentao Yu, Hengtao He, Xianghao Yu, Shenghui Song, Jun Zhang, Ross D. Murch, Khaled B. Letaief
In this paper, we address the fundamental challenge of designing a low-complexity Bayes-optimal channel estimator in near-field HMIMO systems operating in unknown EM environments.
no code implementations • 18 Dec 2023 • Jun Zhang, Shuyang Jiang, Jiangtao Feng, Lin Zheng, Lingpeng Kong
Given that orthogonal memory compresses global information, we further dissect the context to amplify fine-grained local information.
no code implementations • 21 Dec 2023 • Ruoxiao Cao, Hengtao He, Xianghao Yu, Shenghui Song, Kaibin Huang, Jun Zhang, Yi Gong, Khaled B. Letaief
To address the joint channel estimation and cooperative localization problem for near-field UM-MIMO systems, we propose a variational Newtonized near-field channel estimation (VNNCE) algorithm and a Gaussian fusion cooperative localization (GFCL) algorithm.
no code implementations • 31 Dec 2023 • Sihao Yuan, Xu Han, Jun Zhang, Zhaoxin Xie, Cheng Fan, Yunlong Xiao, Yi Qin Gao, Yi Isaac Yang
We applied this approach to study a Claisen rearrangement reaction and a Carbonyl insertion reaction catalyzed by Manganese.
no code implementations • 24 Jan 2024 • Yuchang Sun, Marios Kountouris, Jun Zhang
We show that the generalization performance of a client can be improved only by collaborating with other clients that have more training data and similar data distribution.
1 code implementation • 25 Jan 2024 • Jian Kuang, Wenjing Li, Fang Li, Jun Zhang, Zhongcheng Wu
Distracted driver activity recognition plays a critical role in risk aversion, and is particularly beneficial in intelligent transportation systems.
no code implementations • 29 Jan 2024 • Wenqiang Sun, Teng Li, Zehong Lin, Jun Zhang
Recently, text-to-image diffusion models have demonstrated impressive ability to generate high-quality images conditioned on the textual input.
no code implementations • 9 Feb 2024 • Zhuoran Zheng, Jun Zhang
In endoscopic imaging, the recorded images are prone to exposure abnormalities, so maintaining high-quality images is important to assist healthcare professionals in performing decision-making.
no code implementations • 8 Feb 2024 • Yasas Supeksala, Dinh C. Nguyen, Ming Ding, Thilina Ranbaduge, Calson Chua, Jun Zhang, Jun Li, H. Vincent Poor
In this light, it is crucial for learning processes to utilize information that is distributed across, or owned by, different entities.
no code implementations • 15 Feb 2024 • Tailin Zhou, Jiadong Yu, Jun Zhang, Danny H. K. Tsang
This paper investigates resource allocation to provide heterogeneous users with customized virtual reality (VR) services in a mobile edge computing (MEC) system.
no code implementations • 17 Feb 2024 • Xiaolu Wang, Zijian Li, Shi Jin, Jun Zhang
Federated learning (FL) is an emerging distributed training paradigm that aims to learn a common global model without exchanging or transferring the data that are stored locally at different clients.
no code implementations • 28 Feb 2024 • Xinyu Bian, Yuyi Mao, Jun Zhang
Grant-free random access (RA) has been recognized as a promising solution to support massive connectivity due to the removal of the uplink grant request procedures.
no code implementations • 4 Mar 2024 • Hongshu Guo, Yining Ma, Zeyuan Ma, Jiacheng Chen, Xinglin Zhang, Zhiguang Cao, Jun Zhang, Yue-Jiao Gong
As a proof-of-principle study, we apply this framework to a group of Differential Evolution algorithms.
no code implementations • 13 Mar 2024 • Xinjie Zhang, Shenyuan Gao, Zhening Liu, Jiawei Shao, Xingtong Ge, Dailan He, Tongda Xu, Yan Wang, Jun Zhang
Existing learning-based stereo image codecs adopt sophisticated transformations with simple entropy models derived from single-image codecs to encode latent representations.
no code implementations • 15 Mar 2024 • Xiaohang Yu, Zhengxian Yang, Shi Pan, Yuqi Han, Haoxiang Wang, Jun Zhang, Shi Yan, Borong Lin, Lei Yang, Tao Yu, Lu Fang
We have built a custom mobile multi-camera large-space dense light field capture system, which provides a series of high-quality and sufficiently dense light field images for various scenarios.
no code implementations • 15 Mar 2024 • Yuhao Liu, Xinyu Bian, Yizhou Xu, Tianqi Hou, Wenjie Wang, Yuyi Mao, Jun Zhang
In order to control the inter-cell interference for a multi-cell multi-user multiple-input multiple-output network, we consider the precoder design for coordinated multi-point with downlink coherent joint transmission.
no code implementations • 28 Mar 2024 • Xinyu Bian, Yuhao Liu, Yizhou Xu, Tianqi Hou, Wenjie Wang, Yuyi Mao, Jun Zhang
Simulation results demonstrate the effectiveness of our proposed decentralized precoding scheme, which achieves performance similar to the optimal centralized precoding scheme.
no code implementations • 30 Mar 2024 • Jingwen Tong, Zhenzhen Chen, Liqun Fu, Jun Zhang, Zhu Han
To address the challenges posed by system and data heterogeneities in the FL process, we study a goal-directed client selection problem based on the model analytics framework by selecting a subset of clients for the model training.
no code implementations • 7 Apr 2024 • Xingtong Ge, Jixiang Luo, Xinjie Zhang, Tongda Xu, Guo Lu, Dailan He, Jing Geng, Yan Wang, Jun Zhang, Hongwei Qin
Prior research on deep video compression (DVC) for machine tasks typically necessitates training a unique codec for each specific task, mandating a dedicated decoder per task.
no code implementations • 6 Apr 2024 • Liqun Fu, Jingwen Tong, Tongtong Lin, Jun Zhang
Since the learned objective model is typically non-convex and challenging to solve in real time, we leverage Lyapunov optimization to decouple the long-term average constraint and apply the primal-dual method to solve the decoupled resource allocation problem.
1 code implementation • 3 Dec 2022 • Jiahao Li, Zhourun Wu, Wenhao Lin, Jiawei Luo, Jun Zhang, Qingcai Chen, Junjie Chen
Although many feature extraction methods have been proposed to improve the performance of enhancer identification, they cannot learn position-related multiscale contextual information from raw DNA sequences.
2 code implementations • 27 Nov 2022 • Zhenhao Shuai, Hongbo Liu, Zhaolin Wan, Wei-Jie Yu, Jun Zhang
One of the key settings in SANE is the search space defined by cells and organs self-adapted to different DNN types.
1 code implementation • 14 Oct 2023 • Shuyang Jiang, Jun Zhang, Jiangtao Feng, Lin Zheng, Lingpeng Kong
Furthermore, we marry AMLP with popular NAR models, deriving a highly efficient NAR-AMLP architecture with linear time and space complexity.
1 code implementation • 28 Nov 2023 • Biao Xu, Haijun Fu, Shasha Huang, Shihua Ma, Yaoxu Xiong, Jun Zhang, Xuepeng Xiang, Wenyu Lu, Ji-Jung Kai, Shijun Zhao
Interstitial diffusion is a pivotal process that governs the phase stability and irradiation response of materials in non-equilibrium conditions.
1 code implementation • 29 Aug 2021 • Xiaoya Li, Jiwei Li, Xiaofei Sun, Chun Fan, Tianwei Zhang, Fei Wu, Yuxian Meng, Jun Zhang
For a task with $k$ training labels, $k$Folden induces $k$ sub-models, each of which is trained on a subset with $k-1$ categories, with the remaining category masked as unknown to the sub-model.
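The leave-one-label-out construction can be sketched directly; the helper name and the toy sentiment data below are illustrative, not from the paper.

```python
def kfolden_splits(examples, labels):
    """For a task with k labels, build k training subsets: sub-model i is
    trained only on examples whose label is not labels[i], so labels[i]
    plays the role of the masked 'unknown' class for that sub-model."""
    return [
        (left_out, [(x, y) for x, y in examples if y != left_out])
        for left_out in labels
    ]

data = [("great film", "pos"), ("dull plot", "neg"), ("it exists", "neu")]
splits = kfolden_splits(data, ["pos", "neg", "neu"])
```

At test time, an input whose true label was masked from a sub-model should receive low confidence from it, which is what makes the ensemble useful for out-of-distribution detection.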
1 code implementation • 21 May 2023 • Hongru Li, Wentao Yu, Hengtao He, Jiawei Shao, Shenghui Song, Jun Zhang, Khaled B. Letaief
Task-oriented communication is an emerging paradigm for next-generation communication networks, which extracts and transmits task-relevant information, instead of raw data, for downstream applications.
1 code implementation • 20 Nov 2023 • Lei Geng, Xu Yan, Ziqiang Cao, Juntao Li, Wenjie Li, Sujian Li, Xinjie Zhou, Yang Yang, Jun Zhang
We achieve a biomedical multilingual corpus by incorporating three granularity knowledge alignments (entity, fact, and passage levels) into monolingual corpora.
1 code implementation • 12 May 2019 • Jun Zhang, Tong Zheng, Shengping Zhang, Meng Wang
First, the contextual net with a center-surround architecture extracts local contextual features from image patches, and generates initial illuminant estimates and the corresponding color corrected patches.
1 code implementation • 4 Apr 2021 • He Wang, Yifei Shen, Ziyuan Wang, Dongsheng Li, Jun Zhang, Khaled B. Letaief, Jie Lu
In this paper, we investigate the decentralized statistical inference problem, where a network of agents cooperatively recover a (structured) vector from private noisy samples without centralized coordination.
1 code implementation • 22 Jun 2021 • Zhiqiang Gong, Weien Zhou, Jun Zhang, Wei Peng, Wen Yao
To solve this problem, this work develops a novel physics-informed deep reversible regression model for temperature field reconstruction of heat-source systems (TFR-HSS), which can better reconstruct the temperature field with limited monitoring points in an unsupervised manner.
1 code implementation • 22 Jul 2021 • Wen Yao, Xiaohu Zheng, Jun Zhang, Ning Wang, Guijian Tang
Based on the adaptive aPC, a semi-supervised deep adaptive arbitrary polynomial chaos expansion (Deep aPCE) method is proposed to reduce the training data cost and improve the surrogate model accuracy.
1 code implementation • Findings (EMNLP) 2021 • Jun Zhang, Yan Yang, Chencai Chen, Liang He, Zhou Yu
Recommendation dialogs require the system to build a social bond with users to gain trust and develop affinity in order to increase the chance of a successful recommendation.
1 code implementation • 26 Feb 2020 • Junjia Liu, Jiaying Shou, Zhuang Fu, Hangfei Zhou, Rongli Xie, Jun Zhang, Jian Fei, Yanna Zhao
Moreover, recent reinforcement learning methods are data-inefficient and cannot be directly deployed to the robot without simulation.
1 code implementation • 21 Jun 2023 • Yuchang Sun, Yuyi Mao, Jun Zhang
Federated learning (FL) is a promising framework for privacy-preserving collaborative learning, where model training tasks are distributed to clients and only the model updates need to be collected at a server.
1 code implementation • 20 Mar 2021 • Xianqi Chen, Xiaoyu Zhao, Zhiqiang Gong, Jun Zhang, Weien Zhou, Xiaoqian Chen, Wen Yao
Thermal issue is of great importance during layout design of heat source components in systems engineering, especially for high functional-density products.
1 code implementation • 17 Nov 2022 • Tailin Zhou, Jun Zhang, Danny H. K. Tsang
This enables client models to be updated in a shared feature space with consistent classifiers during local training.
1 code implementation • 14 Feb 2023 • Hengtao He, Xianghao Yu, Jun Zhang, Shenghui Song, Khaled B. Letaief
As one of the core technologies for 5G systems, massive multiple-input multiple-output (MIMO) introduces dramatic capacity improvements along with very high beamforming and spatial multiplexing gains.
1 code implementation • 18 Apr 2023 • Weiqi Xu, Li Ling, Yiping Xie, Jun Zhang, John Folkesson
In this paper, a canonical transformation method consisting of intensity correction and slant range correction is proposed to decrease the above distortion.
1 code implementation • 24 Feb 2020 • Yifei Shen, Ye Xue, Jun Zhang, Khaled B. Letaief, Vincent Lau
Dictionary learning is a classic representation learning method that has been widely applied in signal processing and data analytics.
1 code implementation • 19 Dec 2023 • Fengli Xu, Jun Zhang, Chen Gao, Jie Feng, Yong Li
Urban environments, characterized by their complex, multi-layered networks encompassing physical, social, economic, and environmental dimensions, face significant challenges in the face of rapid urbanization.
1 code implementation • 4 Apr 2023 • Jiawei Shao, Fangzhao Wu, Jun Zhang
While federated learning is promising for privacy-preserving collaborative learning without revealing local data, it remains vulnerable to white-box attacks and struggles to adapt to heterogeneous clients.
1 code implementation • 25 Dec 2023 • Xinran Li, Jun Zhang
Following this, agents utilize attention mechanisms in the second stage to selectively generate messages personalized for the receivers.
1 code implementation • 17 Apr 2020 • Wei Peng, Weien Zhou, Jun Zhang, Wen Yao
Physics-Informed Neural Networks (PINNs) can be regarded as general-purpose PDE solvers, but it might be slow to train PINNs on particular problems, and there is no theoretical guarantee of corresponding error bounds.
1 code implementation • 9 Jul 2020 • Kunyuan Li, Jun Zhang, Rui Sun, Xu-Dong Zhang, Jun Gao
Based on the observation that an oriented line and its neighboring pixels in an EPI share a similar linear structure, we propose an end-to-end fully convolutional network (FCN) to estimate the depth value of the intersection point on the horizontal and vertical EPIs.
1 code implementation • 28 Feb 2024 • Xinjie Zhang, Ren Yang, Dailan He, Xingtong Ge, Tongda Xu, Yan Wang, Hongwei Qin, Jun Zhang
Implicit neural representations (INRs) have emerged as a promising approach for video storage and processing, showing remarkable versatility across various video tasks.