1 code implementation • ACL 2022 • Xu Han, Guoyang Zeng, Weilin Zhao, Zhiyuan Liu, Zhengyan Zhang, Jie Zhou, Jun Zhang, Jia Chao, Maosong Sun
In recent years, large-scale pre-trained language models (PLMs) containing billions of parameters have achieved promising results on various NLP tasks.
1 code implementation • Findings (EMNLP) 2021 • Jun Zhang, Yan Yang, Chencai Chen, Liang He, Zhou Yu
Recommendation dialogs require the system to build a social bond with users to gain trust and develop affinity in order to increase the chance of a successful recommendation.
no code implementations • EMNLP 2021 • Xiaoya Li, Jiwei Li, Xiaofei Sun, Chun Fan, Tianwei Zhang, Fei Wu, Yuxian Meng, Jun Zhang
Out-of-Distribution (OOD) detection is an important problem in natural language processing (NLP).
Out-of-Distribution (OOD) Detection
no code implementations • 20 Sep 2023 • Kuan Tian, Yonghang Guan, Jinxi Xiang, Jun Zhang, Xiao Han, Wei Yang
First, to solve the problem of codec inconsistency caused by the uncertainty of floating-point calculations across platforms, we design a calibration transmitting system to guarantee consistent quantization of entropy parameters between the encoding and decoding stages.
no code implementations • 18 Sep 2023 • Wentao Yu, Yifan Ma, Hengtao He, Shenghui Song, Jun Zhang, Khaled B. Letaief
Ultra-massive multiple-input multiple-output (UM-MIMO) is a cutting-edge technology that promises to revolutionize wireless networks by providing an unprecedentedly high spectral and energy efficiency.
no code implementations • 15 Sep 2023 • Jun Zhang, Jue Wang, Huan Li, Lidan Shou, Ke Chen, Gang Chen, Sharad Mehrotra
This approach is characterized by a two-stage process: drafting and verification.
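As a rough illustration of such a two-stage draft-then-verify loop (a generic sketch only, not the paper's actual algorithm; `draft_next`, `target_next`, and the greedy acceptance rule are illustrative assumptions):

```python
from typing import Callable, List, Sequence

# Generic sketch of a two-stage "draft then verify" decoding step (illustrative only).
# draft_next: cheap proposal model; target_next: accurate model used for verification.
def draft_and_verify(prefix: List[int],
                     draft_next: Callable[[Sequence[int]], int],
                     target_next: Callable[[Sequence[int]], int],
                     k: int = 4) -> List[int]:
    # Stage 1 (drafting): the cheap model proposes k candidate tokens.
    draft = list(prefix)
    for _ in range(k):
        draft.append(draft_next(draft))

    # Stage 2 (verification): the target model re-checks each drafted token and
    # keeps the longest agreeing prefix, then substitutes its own correction.
    out = list(prefix)
    for tok in draft[len(prefix):]:
        ref = target_next(out)
        if tok == ref:
            out.append(tok)      # accept the drafted token
        else:
            out.append(ref)      # reject it and stop at the target model's token
            break
    return out

# Toy usage: the target model counts up by 1; the draft model guesses wrong after a while.
print(draft_and_verify([0, 1, 2],
                       draft_next=lambda s: s[-1] + (1 if len(s) < 5 else 2),
                       target_next=lambda s: s[-1] + 1))   # -> [0, 1, 2, 3, 4, 5]
```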
1 code implementation • 14 Sep 2023 • Jiaheng Wei, Yanjun Zhang, Leo Yu Zhang, Chao Chen, Shirui Pan, Kok-Leong Ong, Jun Zhang, Yang Xiang
For the first time, we show the feasibility of a client-side adversary with limited knowledge being able to recover the training samples from the aggregated global model.
1 code implementation • 2 Sep 2023 • Jun Zhang, Huayang Zhuge, Yiyao Liu, Guohao Peng, Zhenyu Wu, Haoyuan Zhang, Qiyang Lyu, Heshan Li, Chunyang Zhao, Dogan Kircali, Sanat Mharolkar, Xun Yang, Su Yi, Yuanzhe Wang, Danwei Wang
5) Considered both middle- and large-scale outdoor environments, i.e., the 6 trajectories range from 246 m to 6.95 km.
no code implementations • 30 Aug 2023 • Zijian Li, Zehong Lin, Jiawei Shao, Yuyi Mao, Jun Zhang
However, devices often have non-independent and identically distributed (non-IID) data, meaning their local data distributions can vary significantly.
no code implementations • 27 Aug 2023 • Chen Shen, Jun Zhang, Xinggong Liang, Zeyi Hao, Kehan Li, Fan Wang, Zhenyuan Wang, Chunfeng Lian
Forensic pathology is critical in analyzing the manner and time of death from the microscopic aspect to assist in establishing reliable factual bases for criminal investigation.
1 code implementation • 15 Aug 2023 • Yue Lv, Jinxi Xiang, Jun Zhang, Wenming Yang, Xiao Han, Wei Yang
We thus introduce a dynamic gating network on top of the low-rank adaptation method, in order to decide which decoder layer should employ adaptation.
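A minimal sketch of gating a low-rank adaptation branch per layer is given below; the module layout, the gate design, and the name `GatedLoRALinear` are assumptions for illustration, not the paper's architecture:

```python
import torch
import torch.nn as nn

# Sketch of a low-rank adaptation branch controlled by a learned gate, so each layer can
# decide how much adaptation to apply. Shapes and the gate design are illustrative.
class GatedLoRALinear(nn.Module):
    def __init__(self, in_dim: int, out_dim: int, rank: int = 8):
        super().__init__()
        self.base = nn.Linear(in_dim, out_dim)            # stands in for a frozen pretrained layer
        self.base.weight.requires_grad_(False)
        self.base.bias.requires_grad_(False)
        self.down = nn.Linear(in_dim, rank, bias=False)   # low-rank factor A
        self.up = nn.Linear(rank, out_dim, bias=False)    # low-rank factor B
        self.gate = nn.Linear(in_dim, 1)                  # tiny gating network

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # per-sample gate in [0, 1], computed from mean-pooled features
        g = torch.sigmoid(self.gate(x.mean(dim=1, keepdim=True)))
        return self.base(x) + g * self.up(self.down(x))

layer = GatedLoRALinear(64, 64)
print(layer(torch.randn(2, 10, 64)).shape)   # torch.Size([2, 10, 64])
```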
no code implementations • 13 Aug 2023 • Hu Ye, Jun Zhang, Sibo Liu, Xiao Han, Wei Yang
Despite the simplicity of our method, an IP-Adapter with only 22M parameters can achieve performance comparable to, or even better than, a fully fine-tuned image prompt model.
no code implementations • 12 Aug 2023 • Yongcong Chen, Ting Zeng, Jun Zhang
At present, mainstream artificial intelligence generally adopts the technical path of "attention mechanism + deep learning" + "reinforcement learning".
no code implementations • 9 Aug 2023 • Zijian Li, Yuchang Sun, Jiawei Shao, Yuyi Mao, Jessie Hui Wang, Jun Zhang
For better privacy preservation, we propose a hard feature augmentation method to transfer real features towards the decision boundary, with which the synthetic data not only improve the model generalization but also erase the information of real features.
no code implementations • 7 Aug 2023 • Lumin Liu, Jun Zhang, Shenghui Song, Khaled B. Letaief
To improve communication efficiency and achieve a better privacy-utility trade-off, we propose a communication-efficient FL training algorithm with differential privacy guarantee.
1 code implementation • 20 Jul 2023 • Chenxin An, Shansan Gong, Ming Zhong, Mukai Li, Jun Zhang, Lingpeng Kong, Xipeng Qiu
Recently, there has been growing interest in extending the context length of instruction-following models in order to effectively process single-turn long input (e.g., summarizing a paper) and conversations with more extensive histories.
no code implementations • 20 Jul 2023 • Jiawei Shao, Zijian Li, Wenqiang Sun, Tailin Zhou, Yuchang Sun, Lumin Liu, Zehong Lin, Jun Zhang
First, we present a new taxonomy of FL methods in terms of the sharing methods, which includes three categories of shared information: model sharing, synthetic data sharing, and knowledge sharing.
no code implementations • 6 Jul 2023 • Yifei Shen, Jiawei Shao, Xinjie Zhang, Zehong Lin, Hao Pan, Dongsheng Li, Jun Zhang, Khaled B. Letaief
The evolution of wireless networks gravitates towards connected intelligence, a concept that envisions seamless interconnectivity among humans, objects, and intelligence in a hyper-connected cyber-physical world.
no code implementations • 21 Jun 2023 • Yuchang Sun, Yuyi Mao, Jun Zhang
Federated learning (FL) is a promising framework for privacy-preserving collaborative learning, where model training tasks are distributed to clients and only the model updates need to be collected at a server.
no code implementations • 7 Jun 2023 • Lu Huang, Boyu Li, Jun Zhang, Lu Lu, Zejun Ma
Domain adaptation using a text-only corpus is challenging in end-to-end (E2E) speech recognition.
no code implementations • 27 May 2023 • Linhao Dong, Zhecheng An, Peihao Wu, Jun Zhang, Lu Lu, Zejun Ma
We also observe that the cross-modal representation extracted by CIF-PT achieves better performance than other neural interfaces for SLU tasks, including the dominant speech representation learned from self-supervised pre-training.
no code implementations • 26 May 2023 • Yuchang Sun, Zehong Lin, Yuyi Mao, Shi Jin, Jun Zhang
In this paper, we propose a probabilistic device scheduling framework for over-the-air FL, named PO-FL, to mitigate the negative impact of channel noise, where each device is scheduled according to a certain probability and its model update is reweighted using this probability in aggregation.
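The reweighting idea can be illustrated with a toy aggregation routine that assumes the scheduling probabilities are given; this is a schematic sketch, not the PO-FL implementation:

```python
import random

# Toy aggregation with probabilistic scheduling: device i participates with probability
# p_i and its update is divided by p_i (and the number of devices) so the aggregate is
# unbiased in expectation. Variable names are illustrative, not the PO-FL code.
def aggregate(updates, probs):
    """updates: list of per-device update vectors; probs: their scheduling probabilities."""
    n, dim = len(updates), len(updates[0])
    agg = [0.0] * dim
    for u, p in zip(updates, probs):
        if random.random() < p:               # this device is scheduled in the current round
            for j in range(dim):
                agg[j] += u[j] / (p * n)      # reweight by 1/p_i to keep the estimate unbiased
    return agg

random.seed(0)
print(aggregate([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]], [0.9, 0.5, 0.7]))
```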
no code implementations • 25 May 2023 • Liheng Bian, Daoyu Li, Shuoguang Wang, Chunyang Teng, Huteng Liu, Hanwen Xu, Xuyang Chang, Guoqiang Zhao, Shiyong Li, Jun Zhang
These elements are then sampled based on the ranking, building the experimentally optimal sparse sampling strategy that reduces the cost of the antenna array by up to one order of magnitude.
no code implementations • 21 May 2023 • Xinyu Bian, Yuyi Mao, Jun Zhang
Most existing studies on joint activity detection and channel estimation for grant-free massive random access (RA) systems assume perfect synchronization among all active users, which is hard to achieve in practice.
no code implementations • 21 May 2023 • Hongru Li, Wentao Yu, Hengtao He, Jiawei Shao, Shenghui Song, Jun Zhang, Khaled B. Letaief
Task-oriented communication is an emerging paradigm for next-generation communication networks, which extracts and transmits task-relevant information, instead of raw data, for downstream applications.
no code implementations • 13 May 2023 • Tailin Zhou, Zehong Lin, Jun Zhang, Danny H. K. Tsang
To further understand this phenomenon, we decompose the expected prediction error of the global model into five factors related to client models.
no code implementations • 10 May 2023 • Xiaorui Bai, Wenyong Wang, Jun Zhang, Yueqing Wang, Yu Xiang
Flow field segmentation and classification help researchers to understand vortex structure and thus turbulent flow.
no code implementations • 2 May 2023 • Jun Zhang, Xiaohan Lin, Weinan E, Yi Qin Gao
Multiscale molecular modeling is widely applied in scientific research of molecular properties over large time and length scales.
no code implementations • 2 May 2023 • Wenqiang Sun, Sen Li, Yuchang Sun, Jun Zhang
Federated learning (FL) attempts to train a global model by aggregating local models from distributed devices under the coordination of a central server.
no code implementations • 19 Apr 2023 • Jingjin Li, Chao Chen, Lei Pan, Mostafa Rahimi Azghadi, Hossein Ghodosi, Jun Zhang
The privacy issues include information stealing on the technical side and privacy breaches on the policy side.
Automatic Speech Recognition (ASR)
no code implementations • 18 Apr 2023 • Weiqi Xu, Li Ling, Yiping Xie, Jun Zhang, John Folkesson
In this paper, a canonical transformation method consisting of intensity correction and slant range correction is proposed to decrease the above distortion.
no code implementations • 12 Apr 2023 • Xinyu Bian, Yuyi Mao, Jun Zhang
Specifically, by jointly leveraging the user activity correlation between adjacent transmission blocks and the historical channel estimation results, we first develop an activity-correlation-aware receiver for grant-free massive RA systems with retransmission based on the correlated approximate message passing (AMP) algorithm.
no code implementations • 12 Apr 2023 • Wei-neng Chen, Feng-Feng Wei, Tian-Fang Zhao, Kay Chen Tan, Jun Zhang
Based on this taxonomy, existing studies on DEC are reviewed in terms of purpose, parallel structure of the algorithm, parallel model for implementation, and the implementation environment.
no code implementations • 12 Apr 2023 • Feng-Feng Wei, Wei-neng Chen, Xiao-Qi Guo, Bowen Zhao, Sang-Woon Jeon, Jun Zhang
Inspired by this, we introduce crowdsourcing into evolutionary computation (EC) and propose a crowdsourcing-based evolutionary computation (CEC) paradigm for distributed optimization.
no code implementations • 4 Apr 2023 • Jiawei Shao, Fangzhao Wu, Jun Zhang
While federated learning is promising for privacy-preserving collaborative learning without revealing local data, it remains vulnerable to white-box attacks and struggles to adapt to heterogeneous clients.
1 code implementation • CVPR 2023 • Shenyuan Gao, Chunluan Zhou, Jun Zhang
Compared with previous two-stream trackers, the recent one-stream tracking pipeline, which allows earlier interaction between the template and search region, has achieved a remarkable performance gain.
no code implementations • 22 Mar 2023 • Bowen Zhao, Wei-neng Chen, Xiaoguo Li, Ximeng Liu, Qingqi Pei, Jun Zhang
To this end, in this paper, we discuss three typical optimization paradigms (i.e., centralized optimization, distributed optimization, and data-driven optimization) to characterize optimization modes of evolutionary computation and propose BOOM to sort out privacy concerns in evolutionary computation.
1 code implementation • 21 Mar 2023 • Xinjie Zhang, Jiawei Shao, Jun Zhang
This has inspired a distributed coding architecture aiming at reducing the encoding complexity.
no code implementations • 16 Mar 2023 • Yupeng Huang, Hong Zhang, Siyuan Jiang, Dajiong Yue, Xiaohan Lin, Jun Zhang, Yi Qin Gao
In this study, we take advantage of both traditional and machine-learning-based methods and present Deep Site and Docking Pose (DSDP), a method to improve the performance of blind docking.
no code implementations • 11 Mar 2023 • Simon Graham, Quoc Dang Vu, Mostafa Jahanifar, Martin Weigert, Uwe Schmidt, Wenhua Zhang, Jun Zhang, Sen yang, Jinxi Xiang, Xiyue Wang, Josef Lorenz Rumberger, Elias Baumann, Peter Hirsch, Lihao Liu, Chenyang Hong, Angelica I. Aviles-Rivero, Ayushi Jain, Heeyoung Ahn, Yiyu Hong, Hussam Azzuni, Min Xu, Mohammad Yaqub, Marie-Claire Blache, Benoît Piégu, Bertrand Vernay, Tim Scherr, Moritz Böhland, Katharina Löffler, Jiachen Li, Weiqin Ying, Chixin Wang, Dagmar Kainmueller, Carola-Bibiane Schönlieb, Shuolin Liu, Dhairya Talsania, Yughender Meda, Prakash Mishra, Muhammad Ridzuan, Oliver Neumann, Marcel P. Schilling, Markus Reischl, Ralf Mikut, Banban Huang, Hsiang-Chin Chien, Ching-Ping Wang, Chia-Yen Lee, Hong-Kun Lin, Zaiyi Liu, Xipeng Pan, Chu Han, Jijun Cheng, Muhammad Dawood, Srijay Deshpande, Raja Muhammad Saad Bashir, Adam Shephard, Pedro Costa, João D. Nunes, Aurélio Campilho, Jaime S. Cardoso, Hrishikesh P S, Densen Puthussery, Devika R G, Jiji C V, Ye Zhang, Zijie Fang, Zhifan Lin, Yongbing Zhang, Chunhui Lin, Liukun Zhang, Lijian Mao, Min Wu, Vi Thi-Tuong Vo, Soo-Hyung Kim, Taebum Lee, Satoshi Kondo, Satoshi Kasai, Pranay Dumbhare, Vedant Phuse, Yash Dubey, Ankush Jamthikar, Trinh Thi Le Vuong, Jin Tae Kwak, Dorsa Ziaei, Hyun Jung, Tianyi Miao, David Snead, Shan E Ahmed Raza, Fayyaz Minhas, Nasir M. Rajpoot
Nuclear detection, segmentation and morphometric profiling are essential in helping us further understand the relationship between histology and patient outcome.
no code implementations • 24 Feb 2023 • Xuefeng Wang, Xinran Li, Jiawei Shao, Jun Zhang
Learning communication strategies in cooperative multi-agent reinforcement learning (MARL) has recently attracted intensive attention.
Multi-agent Reinforcement Learning
Reinforcement Learning
no code implementations • 14 Feb 2023 • Hengtao He, Xianghao Yu, Jun Zhang, Shenghui Song, Khaled B. Letaief
As one of the core technologies for 5G systems, massive multiple-input multiple-output (MIMO) introduces dramatic capacity improvements along with very high beamforming and spatial multiplexing gains.
no code implementations • 13 Feb 2023 • Zeqiang Lai, Ying Fu, Jun Zhang
The features of RGB reference images are then processed by a multi-stage alignment module to explicitly align the features of RGB reference with the LR HSI.
no code implementations • 13 Feb 2023 • Fei Kong, Jinxi Xiang, Xiyue Wang, Xinran Wang, Meng Yue, Jun Zhang, Sen Yang, Junhan Zhao, Xiao Han, Yuhan Dong, Yueping Liu
Finally, FCL brings a robust, accurate, low-cost AI training model to biomedical research, effectively protecting medical data privacy.
1 code implementation • 9 Feb 2023 • Mukai Li, Shansan Gong, Jiangtao Feng, Yiheng Xu, Jun Zhang, Zhiyong Wu, Lingpeng Kong
Based on EVALM, we scale up the size of examples efficiently in both instruction tuning and in-context learning to explore the boundary of the benefits from more annotated data.
1 code implementation • 24 Jan 2023 • Xinjie Zhang, Jiawei Shao, Jun Zhang
Multi-view image compression plays a critical role in 3D-related applications.
no code implementations • 12 Jan 2023 • Siteng Chen, Xiyue Wang, Jun Zhang, Liren Jiang, Ning Zhang, Feng Gao, Wei Yang, Jinxi Xiang, Sen Yang, Junhua Zheng, Xiao Han
The OSrisk for the prediction of 5-year survival status achieved an AUC of 0.784 (0.746-0.819) in the TCGA cohort, which was further verified in the independent General cohort and the CPTAC cohort, with AUCs of 0.774 (0.723-0.820) and 0.702 (0.632-0.765), respectively.
no code implementations • 3 Jan 2023 • Yandong Shi, Lixiang Lian, Yuanming Shi, Zixin Wang, Yong Zhou, Liqun Fu, Lin Bai, Jun Zhang, Wei Zhang
The sixth generation (6G) wireless systems are envisioned to enable the paradigm shift from "connected things" to "connected intelligence", featured by ultra high density, large-scale, dynamic heterogeneity, diversified functional requirements and machine learning capabilities, which leads to a growing need for highly efficient intelligent algorithms.
no code implementations • 28 Dec 2022 • Liheng Bian, Haoze Song, Lintao Peng, Xuyang Chang, Xi Yang, Roarke Horstmeyer, Lin Ye, Tong Qin, Dezhi Zheng, Jun Zhang
Benefiting from its single-photon sensitivity, single-photon avalanche diode (SPAD) array has been widely applied in various fields such as fluorescence lifetime imaging and quantum computing.
no code implementations • 16 Dec 2022 • Liheng Bian, Xinrui Zhan, Xuyang Chang, Daoyu Li, Rong Yan, Yinuo Zhang, Haowen Ruan, Jun Zhang
In the proposed framework of single-pixel detection, the optical field from a target is first scattered by an optical diffuser and then two-dimensionally modulated by a spatial light modulator.
1 code implementation • 4 Dec 2022 • Boxuan Zhao, Jun Zhang, Deheng Ye, Jian Cao, Xiao Han, Qiang Fu, Wei Yang
Most of the existing methods rely on a multiple instance learning framework that requires densely sampling local patches at high magnification.
1 code implementation • 3 Dec 2022 • Jiahao Li, Zhourun Wu, Wenhao Lin, Jiawei Luo, Jun Zhang, Qingcai Chen, Junjie Chen
Although many feature extraction methods have been proposed to improve the performance of enhancer identification, they cannot learn position-related multiscale contextual information from raw DNA sequences.
1 code implementation • 29 Nov 2022 • Wentao Yu, Yifei Shen, Hengtao He, Xianghao Yu, Shenghui Song, Jun Zhang, Khaled B. Letaief
For practical usage, the proposed framework is further extended to wideband THz UM-MIMO systems with beam squint effect.
no code implementations • 28 Nov 2022 • Yifan Ma, Wentao Yu, Xianghao Yu, Jun Zhang, Shenghui Song, Khaled B. Letaief
In this paper, we propose a lightweight and flexible deep learning-based CSI feedback approach by capitalizing on deep equilibrium models.
2 code implementations • 27 Nov 2022 • Zhenhao Shuai, Hongbo Liu, Zhaolin Wan, Wei-Jie Yu, Jun Zhang
One of the key settings in SANE is the search space defined by cells and organs self-adapted to different DNN types.
1 code implementation • 25 Nov 2022 • Jiawei Shao, Xinjie Zhang, Jun Zhang
With the development of artificial intelligence (AI) techniques and the increasing popularity of camera-equipped devices, many edge video analytics applications are emerging, calling for the deployment of computation-intensive AI models at the network edge.
no code implementations • 17 Nov 2022 • Tailin Zhou, Jun Zhang, Danny H. K. Tsang
However, we discover that these approaches fall short of the expected performance because they ignore a vicious cycle between classifier divergence and feature-mapping inconsistency across clients, such that client models are updated in inconsistent feature spaces with diverged classifiers.
no code implementations • 15 Nov 2022 • Wentao Yu, Hengtao He, Xianghao Yu, Shenghui Song, Jun Zhang, Khaled B. Letaief
Reliability is of paramount importance for the physical layer of wireless systems due to its decisive impact on end-to-end performance.
no code implementations • 8 Nov 2022 • Yuchang Sun, Jiawei Shao, Yuyi Mao, Songze Li, Jun Zhang
During training, the server computes gradients on the global coded dataset to compensate for the missing model updates of the straggling devices.
no code implementations • 27 Oct 2022 • Jun Zhang, Ping Li, Wei Wang
Recent advances in neural networks have been successfully applied to many tasks in online recommendation applications.
1 code implementation • 14 Oct 2022 • Jun Zhang, Shuyang Jiang, Jiangtao Feng, Lin Zheng, Lingpeng Kong
In this paper, we propose Comprehensive Attention Benchmark (CAB) under a fine-grained attention taxonomy with four distinguishable attention patterns, namely, noncausal self, causal self, noncausal cross, and causal cross attentions.
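The four patterns can be pictured as boolean attention masks, as in the schematic below (an illustration of the taxonomy only, not the benchmark code; the causal cross case assumes aligned query/key positions):

```python
import torch

# Boolean masks for the four attention patterns (True = key position visible to the query).
def attention_mask(q_len: int, kv_len: int, causal: bool) -> torch.Tensor:
    mask = torch.ones(q_len, kv_len)
    if causal:
        mask = torch.tril(mask)   # query i may only attend to key positions 0..i
    return mask.bool()

n, m = 4, 6
noncausal_self  = attention_mask(n, n, causal=False)   # e.g. encoder self-attention
causal_self     = attention_mask(n, n, causal=True)    # e.g. decoder self-attention
noncausal_cross = attention_mask(n, m, causal=False)   # e.g. encoder-decoder cross-attention
causal_cross    = attention_mask(n, m, causal=True)    # cross-attention under a causal constraint
print(causal_self.int())
```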
1 code implementation • 7 Oct 2022 • Jiangtao Feng, Yi Zhou, Jun Zhang, Xian Qian, Liwei Wu, Zhexi Zhang, Yanming Liu, Mingxuan Wang, Lei Li, Hao Zhou
PARAGEN is a PyTorch-based NLP toolkit for further development on parallel generation.
no code implementations • 6 Oct 2022 • Jiawei Shao, Yuchang Sun, Songze Li, Jun Zhang
Federated learning (FL) strives to enable collaborative training of machine learning models without centrally collecting clients' private data.
7 code implementations • 5 Oct 2022 • Silvio Giancola, Anthony Cioppa, Adrien Deliège, Floriane Magera, Vladimir Somers, Le Kang, Xin Zhou, Olivier Barnich, Christophe De Vleeschouwer, Alexandre Alahi, Bernard Ghanem, Marc Van Droogenbroeck, Abdulrahman Darwish, Adrien Maglo, Albert Clapés, Andreas Luyts, Andrei Boiarov, Artur Xarles, Astrid Orcesi, Avijit Shah, Baoyu Fan, Bharath Comandur, Chen Chen, Chen Zhang, Chen Zhao, Chengzhi Lin, Cheuk-Yiu Chan, Chun Chuen Hui, Dengjie Li, Fan Yang, Fan Liang, Fang Da, Feng Yan, Fufu Yu, Guanshuo Wang, H. Anthony Chan, He Zhu, Hongwei Kan, Jiaming Chu, Jianming Hu, Jianyang Gu, Jin Chen, João V. B. Soares, Jonas Theiner, Jorge De Corte, José Henrique Brito, Jun Zhang, Junjie Li, Junwei Liang, Leqi Shen, Lin Ma, Lingchi Chen, Miguel Santos Marques, Mike Azatov, Nikita Kasatkin, Ning Wang, Qiong Jia, Quoc Cuong Pham, Ralph Ewerth, Ran Song, RenGang Li, Rikke Gade, Ruben Debien, Runze Zhang, Sangrok Lee, Sergio Escalera, Shan Jiang, Shigeyuki Odashima, Shimin Chen, Shoichi Masui, Shouhong Ding, Sin-wai Chan, Siyu Chen, Tallal El-Shabrawy, Tao He, Thomas B. Moeslund, Wan-Chi Siu, Wei zhang, Wei Li, Xiangwei Wang, Xiao Tan, Xiaochuan Li, Xiaolin Wei, Xiaoqing Ye, Xing Liu, Xinying Wang, Yandong Guo, YaQian Zhao, Yi Yu, YingYing Li, Yue He, Yujie Zhong, Zhenhua Guo, Zhiheng Li
The SoccerNet 2022 challenges were the second annual video understanding challenges organized by the SoccerNet team.
no code implementations • 27 Sep 2022 • Chengzhi Lin, AnCong Wu, Junwei Liang, Jun Zhang, Wenhang Ge, Wei-Shi Zheng, Chunhua Shen
To address this problem, we propose a Text-Adaptive Multiple Visual Prototype Matching model, which automatically captures multiple prototypes to describe a video by adaptive aggregation of video token features.
1 code implementation • 26 Sep 2022 • Junwei Liang, Enwei Zhang, Jun Zhang, Chunhua Shen
We study the task of robust feature representations, aiming to generalize well on multiple datasets for action recognition.
1 code implementation • 13 Sep 2022 • Sen Yang, Tao Shen, Yuqi Fang, Xiyue Wang, Jun Zhang, Wei Yang, Junzhou Huang, Xiao Han
The high-content image-based assay is commonly leveraged for identifying the phenotypic impact of genetic perturbations in the biology field.
no code implementations • 3 Sep 2022 • Yifan Ma, Xianghao Yu, Jun Zhang, S. H. Song, Khaled B. Letaief
In limited feedback multi-user multiple-input multiple-output (MU-MIMO) cellular networks, users send quantized information about the channel conditions to the associated base station (BS) for downlink beamforming.
2 code implementations • 20 Aug 2022 • Jun Zhang, Sirui Liu, Mengyun Chen, Haotian Chu, Min Wang, Zidong Wang, Jialiang Yu, Ningxi Ni, Fan Yu, Diqing Chen, Yi Isaac Yang, Boxin Xue, Lijiang Yang, Yuan Liu, Yi Qin Gao
Data-driven predictive methods which can efficiently and accurately transform protein sequences into biologically active structures are highly valuable for scientific research and therapeutical development.
no code implementations • 10 Jul 2022 • Lin Li, Chao Chen, Lei Pan, Yonghang Tai, Jun Zhang, Yang Xiang
It reduces the success rate of rPPG spoofing attacks in user authentication to 0.05.
2 code implementations • 24 Jun 2022 • Sirui Liu, Jun Zhang, Haotian Chu, Min Wang, Boxin Xue, Ningxi Ni, Jialiang Yu, Yuhao Xie, Zhenyu Chen, Mengyun Chen, Yuan Liu, Piya Patra, Fan Xu, Jie Chen, Zidong Wang, Lijiang Yang, Fan Yu, Lei Chen, Yi Qin Gao
We additionally provide the benchmark training procedure for a SOTA protein structure prediction model on this dataset.
no code implementations • 23 Jun 2022 • Yu Xiang, Guangbo Zhang, Liwei Hu, Jun Zhang, Wenyong Wang
The geometrical shape of airfoils, together with the corresponding flight conditions, is a crucial factor for aerodynamic performance prediction.
no code implementations • 15 Jun 2022 • Rongkang Dong, Yuyi Mao, Jun Zhang
In this paper, we propose an early exit prediction mechanism to reduce the on-device computation overhead in a device-edge co-inference system supported by early-exit networks.
no code implementations • 11 Jun 2022 • Zijian Li, Jiawei Shao, Yuyi Mao, Jessie Hui Wang, Jun Zhang
A combination of the local private dataset and synthetic dataset with confident pseudo labels leads to nearly identical data distributions among clients, which improves the consistency among local models and benefits the global aggregation.
1 code implementation • 8 Jun 2022 • Bo Li, Yifei Shen, Jingkang Yang, Yezhen Wang, Jiawei Ren, Tong Che, Jun Zhang, Ziwei Liu
It is motivated by an empirical finding that transformer-based models trained with empirical risk minimization (ERM) outperform CNN-based models employing state-of-the-art (SOTA) DG algorithms on multiple DG datasets.
Ranked #9 on Domain Generalization on DomainNet (using extra training data)
1 code implementation • IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops 2022 • Junwei Liang, He Zhu, Enwei Zhang, Jun Zhang
Distracted driver actions can be dangerous and cause severe accidents.
no code implementations • 27 May 2022 • Bowen Zhao, Wei-neng Chen, Feng-Feng Wei, Ximeng Liu, Qingqi Pei, Jun Zhang
Specifically, PEGA enables users to outsource COPs to the cloud server, which holds a competitive GA and approximates the optimal solution in a privacy-preserving manner.
1 code implementation • 10 May 2022 • Wentao Yu, Yifei Shen, Hengtao He, Xianghao Yu, Jun Zhang, Khaled B. Letaief
We draw inspiration from fixed-point theory to develop an efficient deep learning-based channel estimator with adaptive complexity and a linear convergence guarantee.
no code implementations • 29 Apr 2022 • Yuting Gao, Jinfeng Liu, Zihan Xu, Jun Zhang, Ke Li, Rongrong Ji, Chunhua Shen
Large-scale vision-language pre-training has achieved promising results on downstream tasks.
no code implementations • 7 Apr 2022 • Siteng Chen, Jinxi Xiang, Xiyue Wang, Jun Zhang, Sen Yang, Junzhou Huang, Wei Yang, Junhua Zheng, Xiao Han
The MC-TMB algorithm also exhibited good generalization on the external validation cohort, with an AUC of 0.732 (0.683-0.761), and better performance when compared to other methods.
no code implementations • 5 Apr 2022 • Qi Zhong, Leo Yu Zhang, Shengshan Hu, Longxiang Gao, Jun Zhang, Yong Xiang
Fine-tuning attacks are effective in removing the embedded watermarks in deep learning models.
no code implementations • 29 Mar 2022 • Kunyuan Li, Jun Zhang, Jun Gao, Meibin Qi
In this paper, we propose a self-supervised learning framework for light field depth estimation.
no code implementations • 22 Mar 2022 • Chi Liu, Huajie Chen, Tianqing Zhu, Jun Zhang, Wanlei Zhou
To evaluate the attack efficacy, we crafted heterogeneous security scenarios where the detectors were embedded with different levels of defense and the attackers' background knowledge of data varied.
1 code implementation • 21 Mar 2022 • Yifei Shen, Jun Zhang, S. H. Song, Khaled B. Letaief
For design guidelines, we propose a unified framework that is applicable to general design problems in wireless networks, which includes graph modeling, neural architecture design, and theory-guided performance enhancement.
no code implementations • 14 Mar 2022 • Lumin Liu, Jun Zhang, S. H. Song, Khaled B. Letaief
Federated Distillation (FD) is a recently proposed alternative to enable communication-efficient and robust FL, which achieves orders of magnitude reduction of the communication overhead compared with FedAvg and is flexible to handle heterogeneous models at the clients.
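A back-of-the-envelope comparison of per-round uploads illustrates where the savings come from; the sizes below are made-up examples, not measurements from the paper:

```python
# Per-round upload sizes: FedAvg sends the full parameter vector, while Federated
# Distillation sends only a small table of per-class average logits ("knowledge").
num_params, num_classes = 1_000_000, 10

fedavg_payload = num_params                 # floats uploaded by each client under FedAvg
fd_payload = num_classes * num_classes      # a (class x logit-dimension) table under FD
print(f"FedAvg uploads {fedavg_payload:,} floats; FD uploads {fd_payload:,} floats "
      f"({fedavg_payload // fd_payload:,}x fewer in this toy setting)")
```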
no code implementations • 4 Mar 2022 • Wenhua Zhang, Jun Zhang
Connective nuclei may look very different from each other while some of them share a similar shape with the epithelial ones.
no code implementations • 17 Feb 2022 • Xiangjie Kong, Jun Zhang, Da Zhang, Yi Bu, Ying Ding, Feng Xia
Under this consideration, our paper presents and analyzes the causal factors that are crucial for scholars' academic success.
no code implementations • 9 Feb 2022 • Chen Shen, Yi Liu, Wenzhi Fan, Bin Wang, Shixue Wen, Yao Tian, Jun Zhang, Jingsheng Yang, Zejun Ma
For Track 1, we propose several approaches to empower the clustering-based speaker diarization system to handle overlapped speech.
no code implementations • 25 Jan 2022 • Yuchang Sun, Jiawei Shao, Songze Li, Yuyi Mao, Jun Zhang
Federated learning (FL) has attracted much attention as a privacy-preserving distributed machine learning framework, where many clients collaboratively train a machine learning model by exchanging model updates with a parameter server instead of sharing their raw data.
no code implementations • 18 Jan 2022 • Zhen Gao, Minghui Wu, Chun Hu, Feifei Gao, Guanghui Wen, Dezhi Zheng, Jun Zhang
To this end, by modeling the key transmission modules as an end-to-end (E2E) neural network, this paper proposes a data-driven deep learning (DL)-based unified hybrid beamforming framework for both the time division duplex (TDD) and frequency division duplex (FDD) systems with implicit channel state information (CSI).
no code implementations • 15 Jan 2022 • Meng Xu, Youchen Wang, Bin Xu, Jun Zhang, Jian Ren, Stefan Poslad, Pengfei Xu
Localization of a camera, and of the objects within its field of view, could benefit many computer vision fields, such as autonomous driving, robot navigation, and augmented reality (AR).
no code implementations • 8 Jan 2022 • Xinrui Zhan, Liheng Bian, Chunli Zhu, Jun Zhang
While the network is training at a high sampling rate, the modulation patterns and corresponding weights are updated iteratively, which produces optimal ranked encoding series when converged.
1 code implementation • CVPR 2022 • Yonghang Guan, Jun Zhang, Kuan Tian, Sen Yang, Pei Dong, Jinxi Xiang, Wei Yang, Junzhou Huang, Yuyao Zhang, Xiao Han
In this paper, we propose a hierarchical global-to-local clustering strategy to build a Node-Aligned GCN (NAGCN) to represent WSI with rich local structural information as well as global distribution.
no code implementations • 20 Dec 2021 • Yuchang Sun, Jiawei Shao, Yuyi Mao, Jessie Hui Wang, Jun Zhang
By exploiting the low-latency communication among edge servers for efficient model sharing, SD-FEEL can incorporate more training data, while enjoying much lower latency compared with conventional federated learning.
no code implementations • 9 Dec 2021 • Yuchang Sun, Jiawei Shao, Yuyi Mao, Jun Zhang
Federated edge learning (FEEL) has drawn much attention as a privacy-preserving distributed learning framework for mobile edge networks.
no code implementations • 1 Oct 2021 • Yifan Ma, Yifei Shen, Xianghao Yu, Jun Zhang, S. H. Song, Khaled B. Letaief
Furthermore, such networks will vary dynamically in a significant way, which makes it intractable to develop comprehensive analytical models.
no code implementations • 1 Sep 2021 • Sen Yang, Feng Luo, Jun Zhang, Xiyue Wang
Mitotic count is the most important morphological feature of breast cancer grading.
2 code implementations • 1 Sep 2021 • Jiawei Shao, Yuyi Mao, Jun Zhang
To enable low-latency cooperative inference, we propose a learning-based communication scheme that optimizes local feature extraction and distributed feature encoding in a task-oriented manner, i.e., to remove data redundancy and transmit information that is essential for the downstream inference task rather than reconstructing the data samples at the edge server.
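Schematically, the split looks like the toy example below, where a small on-device encoder compresses the input into a few feature values and only the classification head runs at the edge server; layer sizes and names are illustrative assumptions, not the paper's architecture:

```python
import torch
import torch.nn as nn

# Schematic split for device-edge co-inference: the device extracts and compresses
# task-relevant features, and only those few values are sent to the edge server.
device_encoder = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 8))   # runs on the device
edge_classifier = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 10))   # runs at the edge server

x = torch.randn(1, 784)              # raw sample never leaves the device
feature = device_encoder(x)          # only 8 values per sample are transmitted
logits = edge_classifier(feature)    # downstream inference at the edge server
print(feature.shape, logits.shape)   # torch.Size([1, 8]) torch.Size([1, 10])
```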
no code implementations • 30 Aug 2021 • Xinjie Zhang, Jiawei Shao, Yuyi Mao, Jun Zhang
Device-edge co-inference, which partitions a deep neural network between a resource-constrained mobile device and an edge server, recently emerges as a promising paradigm to support intelligent mobile applications.
1 code implementation • 29 Aug 2021 • Xiaoya Li, Jiwei Li, Xiaofei Sun, Chun Fan, Tianwei Zhang, Fei Wu, Yuxian Meng, Jun Zhang
For a task with $k$ training labels, $k$Folden induces $k$ sub-models, each of which is trained on a subset with $k-1$ categories with the left category masked unknown to the sub-model.
Out-of-Distribution (OOD) Detection
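The leave-one-label-out construction described above can be sketched as follows (illustrative only, not the released kFolden code):

```python
# Leave-one-label-out construction: for k labels, build k training subsets, each omitting
# one label so the corresponding sub-model never sees that class during training.
def build_folds(dataset, labels):
    """dataset: list of (text, label) pairs; labels: the k label names."""
    return {held_out: [(x, y) for x, y in dataset if y != held_out] for held_out in labels}

data = [("a", "sports"), ("b", "politics"), ("c", "tech"), ("d", "sports")]
for held_out, subset in build_folds(data, ["sports", "politics", "tech"]).items():
    print(held_out, "->", [y for _, y in subset])
```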
no code implementations • 24 Aug 2021 • Haiyan Liu, Liheng Bian, Jun Zhang
We envision that this image-free segmentation technique can be widely applied in various resource-limited platforms, such as UAVs and unmanned vehicles, that require real-time sensing.
no code implementations • 19 Aug 2021 • Guohao Peng, Yufeng Yue, Jun Zhang, Zhenyu Wu, Xiaoyu Tang, Danwei Wang
(2) By exploiting the interpretability of the local weighting scheme, a semantic constrained initialization is proposed so that the local attention can be reinforced by semantic priors.
1 code implementation • 17 Aug 2021 • Yifei Shen, Yongji Wu, Yao Zhang, Caihua Shan, Jun Zhang, Khaled B. Letaief, Dongsheng Li
In this paper, we endeavor to obtain a better understanding of GCN-based CF methods via the lens of graph signal processing.
Ranked #5 on Collaborative Filtering on Gowalla
no code implementations • 3 Aug 2021 • Yifan Ma, Yifei Shen, Xianghao Yu, Jun Zhang, S. H. Song, Khaled B. Letaief
Channel estimation and beamforming play critical roles in frequency-division duplexing (FDD) massive multiple-input multiple-output (MIMO) systems.
no code implementations • 24 Jul 2021 • Xueke Zheng, Runze Cai, Shuixin Xiao, Yu Qiu, Jun Zhang, Mian Li
A real-world application to the estimation of the vertical wheel force in a full vehicle system is conducted to demonstrate the effectiveness of the proposed method.
1 code implementation • 22 Jul 2021 • Wen Yao, Xiaohu Zheng, Jun Zhang, Ning Wang, Guijian Tang
Based on the adaptive aPC, a semi-supervised deep adaptive arbitrary polynomial chaos expansion (Deep aPCE) method is proposed to reduce the training data cost and improve the surrogate model accuracy.
no code implementations • 15 Jul 2021 • Yufeng Xia, Jun Zhang, Zhiqiang Gong, Tingsong Jiang, Wen Yao
Deep Ensemble is widely considered the state-of-the-art method for estimating uncertainty with high quality, but it is very expensive to train and test.
no code implementations • 12 Jul 2021 • Xinyu Bian, Yuyi Mao, Jun Zhang
In particular, the common sparsity pattern in the received pilot and data signal has been ignored in most existing studies, and auxiliary information of channel decoding has not been utilized for user activity detection.
1 code implementation • 9 Jul 2021 • Wei Peng, Jun Zhang, Weien Zhou, Xiaoyu Zhao, Wen Yao, Xiaoqian Chen
Physics Informed Neural Network (PINN) is a scientific computing framework used to solve both forward and inverse problems modeled by Partial Differential Equations (PDEs).
1 code implementation • 22 Jun 2021 • Zhiqiang Gong, Weien Zhou, Jun Zhang, Wei Peng, Wen Yao
To solve this problem, this work develops a novel physics-informed deep reversible regression model for temperature field reconstruction of heat-source systems (TFR-HSS), which can better reconstruct the temperature field with limited monitoring points in an unsupervised manner.
1 code implementation • CVPR 2021 • Jiaxing Chen, Xinyang Jiang, Fudong Wang, Jun Zhang, Feng Zheng, Xing Sun, Wei-Shi Zheng
In this paper, rather than relying on texture based information, we propose to improve the robustness of person ReID against clothing texture by exploiting the information of a person's 3D shape.
Ranked #4 on Person Re-Identification on PRCC
no code implementations • 13 Jun 2021 • Huapeng Wu, Jie Gui, Jun Zhang, James T. Kwok, Zhihui Wei
Recently, deep convolutional neural network methods have achieved excellent performance in image super-resolution (SR), but they cannot be easily applied to embedded devices due to large memory cost.
no code implementations • 13 Jun 2021 • Huapeng Wu, Jie Gui, Jun Zhang, James T. Kwok, Zhihui Wei
Recently, convolutional neural network (CNN) based image super-resolution (SR) methods have achieved significant performance improvement.
no code implementations • 11 Jun 2021 • Bo Li, Yifei Shen, Yezhen Wang, Wenzhen Zhu, Colorado J. Reed, Jun Zhang, Dongsheng Li, Kurt Keutzer, Han Zhao
IIB significantly outperforms IRM on synthetic datasets where pseudo-invariant features and geometric skews occur, showing the effectiveness of the proposed formulation in overcoming failure modes of IRM.
no code implementations • 9 Jun 2021 • Lintao Peng, Liheng Bian, Tiexin Liu, Jun Zhang
In this work, we report an agile wide-field imaging framework with selective high resolution that requires only two detectors.
1 code implementation • 7 Jun 2021 • Jie Gui, Xiaofeng Cong, Yuan Cao, Wenqi Ren, Jun Zhang, Jing Zhang, Jiuxin Cao, DaCheng Tao
With the development of convolutional neural networks, hundreds of deep learning based dehazing methods have been proposed.
no code implementations • 31 May 2021 • Xuyang Chang, Liheng Bian, Shaowei Jiang, Guoan Zheng, Jun Zhang
Complex-domain imaging has emerged as a valuable technique for investigating weak-scattered samples.
no code implementations • 13 May 2021 • XiaoYu Zhang, Chao Chen, Yi Xie, Xiaofeng Chen, Jun Zhang, Yang Xiang
This survey presents the most recent findings on privacy attacks and defenses that have appeared in cloud-based neural network services.
no code implementations • 26 Apr 2021 • Zhefeng Qiao, Xianghao Yu, Jun Zhang, Khaled B. Letaief
Federated learning (FL) is a promising and powerful approach for training deep learning models without sharing the raw data of clients.
no code implementations • 26 Apr 2021 • Xinyu Bian, Yuyi Mao, Jun Zhang
In this paper, we propose a turbo receiver for joint activity detection and data decoding in grant-free massive random access, which iterates between a detector and a belief propagation (BP)-based channel decoder.
no code implementations • 26 Apr 2021 • Yuchang Sun, Jiawei Shao, Yuyi Mao, Jessie Hui Wang, Jun Zhang
Federated edge learning (FEEL) has emerged as an effective approach to reduce the large communication latency in Cloud-based machine learning solutions, while preserving data privacy.
no code implementations • 8 Apr 2021 • Daoyu Li, Liheng Bian, Jun Zhang
Recovering these sharp video frames from a single blurred image is nontrivial, due to not only its strong ill-posedness, but also various types of complex motion in reality such as rotation and motion in depth.
no code implementations • 6 Apr 2021 • Xuyang Chang, Liheng Bian, Jun Zhang
In computational phase imaging, phase retrieval (PR) is required to reconstruct both amplitude and phase in complex space from intensity-only measurements.
1 code implementation • 4 Apr 2021 • He Wang, Yifei Shen, Ziyuan Wang, Dongsheng Li, Jun Zhang, Khaled B. Letaief, Jie Lu
In this paper, we investigate the decentralized statistical inference problem, where a network of agents cooperatively recover a (structured) vector from private noisy samples without centralized coordination.
no code implementations • 2 Apr 2021 • Lu Huang, Jingyu Sun, Yufeng Tang, JunFeng Hou, Jinkun Chen, Jun Zhang, Zejun Ma
This work describes an encoder pre-training procedure using frame-wise labels to improve the training of the streaming recurrent neural network transducer (RNN-T) model.
no code implementations • 26 Mar 2021 • Lumin Liu, Jun Zhang, Shenghui Song, Khaled B. Letaief
Hierarchical FL, with a client-edge-cloud aggregation hierarchy, can effectively leverage both the cloud server's access to many clients' data and the edge servers' closeness to the clients to achieve a high communication efficiency.
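A schematic of the two-level aggregation is shown below, with plain lists of floats standing in for model parameters and equal weights used for brevity (a real system would typically weight by local data size):

```python
# Client-edge-cloud aggregation hierarchy: client models are averaged at their edge
# server, and edge models are averaged at the cloud. Purely illustrative.
def average(models):
    # element-wise mean of parameter vectors
    return [sum(vals) / len(models) for vals in zip(*models)]

def hierarchical_aggregate(edge_groups):
    """edge_groups: one entry per edge server, each a list of client parameter vectors."""
    edge_models = [average(clients) for clients in edge_groups]  # client -> edge aggregation
    return average(edge_models)                                  # edge -> cloud aggregation

print(hierarchical_aggregate([[[1.0, 2.0], [3.0, 4.0]],   # clients under edge server 1
                              [[5.0, 6.0]]]))              # clients under edge server 2
```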
1 code implementation • 20 Mar 2021 • Xianqi Chen, Xiaoyu Zhao, Zhiqiang Gong, Jun Zhang, Weien Zhou, Xiaoqian Chen, Wen Yao
The thermal issue is of great importance during the layout design of heat-source components in systems engineering, especially for high-functional-density products.
no code implementations • 10 Mar 2021 • Zheng-Ping Li, Jun-Tian Ye, Xin Huang, Peng-Yu Jiang, Yuan Cao, Yu Hong, Chao Yu, Jun Zhang, Qiang Zhang, Cheng-Zhi Peng, Feihu Xu, Jian-Wei Pan
Long-range active imaging has widespread applications in remote sensing and target recognition.
1 code implementation • ICCV 2021 • Guanyu Cai, Jun Zhang, Xinyang Jiang, Yifei Gong, Lianghua He, Fufu Yu, Pai Peng, Xiaowei Guo, Feiyue Huang, Xing Sun
However, the performance of existing methods suffers in real life since the user is likely to provide an incomplete description of an image, which often leads to results filled with false positives that fit the incomplete description.
no code implementations • 17 Feb 2021 • Xinyu Bian, Yuyi Mao, Jun Zhang
Massive machine-type communication (mMTC) has been regarded as one of the most important use scenarios in the fifth generation (5G) and beyond wireless networks, which demands scalable access for a large number of devices.
2 code implementations • 16 Feb 2021 • Yuantian Miao, Chao Chen, Lei Pan, Qing-Long Han, Jun Zhang, Yang Xiang
Stealing attacks against controlled information, along with the increasing number of information leakage incidents, have become an emerging cyber security threat in recent years.
1 code implementation • 8 Feb 2021 • Jiawei Shao, Yuyi Mao, Jun Zhang
Extensive experiments evidence that the proposed task-oriented communication system achieves a better rate-distortion tradeoff than baseline methods and significantly reduces the feature transmission latency in dynamic channel conditions.
2 code implementations • 8 Jan 2021 • Chenyang Gao, Guanyu Cai, Xinyang Jiang, Feng Zheng, Jun Zhang, Yifei Gong, Pai Peng, Xiaowei Guo, Xing Sun
Secondly, a BERT with locality-constrained attention is proposed to obtain representations of descriptions at different scales.
Ranked #12 on Text based Person Retrieval on CUHK-PEDES
no code implementations • ICCV 2021 • Guohao Peng, Jun Zhang, Heshan Li, Danwei Wang
The core of visual place recognition (VPR) lies in how to identify task-relevant visual cues and embed them into discriminative representations.
no code implementations • 22 Dec 2020 • Jun Zhang, Yaqiang Zhou, Yao-Kun Lei, Yi Isaac Yang, Yi Qin Gao
Deep learning is changing many areas in molecular physics, and it has shown great potential to deliver new solutions to challenging molecular modeling problems.
no code implementations • 11 Dec 2020 • Jie Gu, Feng Wang, Qinghui Sun, Zhiquan Ye, Xiaoxiao Xu, Jingmin Chen, Jun Zhang
In this work, we focus on developing a universal user representation model.
no code implementations • 2 Dec 2020 • Jun Zhang, Yuan Sun
Besides, we investigate the dependence of the greybody factor and the sparsity of Hawking radiation on the conformal parameters.
General Relativity and Quantum Cosmology
no code implementations • 13 Nov 2020 • Jun Zhang, Yao-Kun Lei, Zhen Zhang, Xu Han, Maodong Li, Lijiang Yang, Yi Isaac Yang, Yi Qin Gao
Combining reinforcement learning (RL) and molecular dynamics (MD) simulations, we propose a machine-learning approach (RL$^\ddag$) to automatically unravel chemical reaction mechanisms.
no code implementations • 3 Nov 2020 • Mingkun Huang, Jun Zhang, Meng Cai, Yang Zhang, Jiali Yao, Yongbin You, Yi He, Zejun Ma
In this work, we analyze the cause of the huge gradient variance in RNN-T training and propose a new normalized jointer network to overcome it.
Automatic Speech Recognition (ASR)
no code implementations • 3 Nov 2020 • Mingkun Huang, Meng Cai, Jun Zhang, Yang Zhang, Yongbin You, Yi He, Zejun Ma
In this work we propose an inference technique, asynchronous revision, to unify streaming and non-streaming speech recognition models.
1 code implementation • 27 Oct 2020 • Jiawei Shao, Haowei Zhang, Yuyi Mao, Jun Zhang
The recent advancements of three-dimensional (3D) data acquisition devices have spurred a new breed of applications that rely on point cloud data processing.
Distributed, Parallel, and Cluster Computing
no code implementations • 23 Oct 2020 • Xiaogang Zhu, Shigang Liu, Xian Li, Sheng Wen, Jun Zhang, Camtepe Seyit, Yang Xiang
Fuzzing is one of the most effective techniques for identifying potential software vulnerabilities.
no code implementations • NeurIPS Workshop DL-IG 2020 • Tian Han, Jun Zhang, Ying Nian Wu
This paper reviews the em-projections in information geometry and the recent understanding of variational auto-encoder, and explains that they share a common formulation as joint minimization of the Kullback-Leibler divergence between two manifolds of probability distributions, and the joint minimization can be implemented by alternating projections or alternating gradient descent.
1 code implementation • 14 Oct 2020 • Jun Zhang, Chen Gao, Depeng Jin, Yong Li
Group-buying recommendation for social e-commerce, which recommends an item list when users want to launch a group, plays an important role in the group success ratio and sales.
no code implementations • 15 Sep 2020 • Jun Zhang, Kuan Tian, Pei Dong, Haocheng Shen, Kezhou Yan, Jianhua Yao, Junzhou Huang, Xiao Han
Recently, artificial intelligence (AI) has been used in the diagnosis of various diseases to improve diagnostic accuracy and reliability, but the interpretation of diagnosis results is still an open problem.
1 code implementation • ECCV 2020 • Shizhen Zhao, Changxin Gao, Jun Zhang, Hao Cheng, Chuchu Han, Xinyang Jiang, Xiaowei Guo, Wei-Shi Zheng, Nong Sang, Xing Sun
In the conventional person Re-ID setting, it is widely assumed that cropped person images are for each individual.
no code implementations • 11 Aug 2020 • Masaki Ikuta, Jun Zhang
We showed that the WGAN-based method was effective to preserve image texture.
no code implementations • 11 Aug 2020 • Liwei Hu, Wenyong Wang, Yu Xiang, Jun Zhang
Motivated by the problems of existing approaches and inspired by the success of generative adversarial networks (GANs) in the field of computer vision, we prove an optimal discriminator theorem: the optimal discriminator of a GAN is a radial basis function neural network (RBFNN) when dealing with nonlinear sparse FFD regression and generation.
2 code implementations • 28 Jul 2020 • Jun Zhang, Mina Henein, Robert Mahony, Viorela Ila
The problem of tracking self-motion as well as motion of objects in the scene using information from a camera is known as multi-body visual odometry and is a challenging task.
1 code implementation • 15 Jul 2020 • Yifei Shen, Yuanming Shi, Jun Zhang, Khaled B. Letaief
In this paper, we propose to apply graph neural networks (GNNs) to solve large-scale radio resource management problems, supported by effective neural network architecture design and theoretical analysis.
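As a toy illustration of why a GNN fits this setting, the sketch below runs one message-passing step over a random interference graph and maps node embeddings to power levels; the layer design and feature choices are assumptions for illustration, not the paper's architecture:

```python
import torch

# One message-passing step over a random interference graph, mapping node (link)
# embeddings to transmit power levels. Purely illustrative.
def gnn_layer(h, adj, w_self, w_neigh):
    # mean-aggregate neighbor features and combine with each node's own feature
    deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
    neigh = adj @ h / deg
    return torch.relu(h @ w_self + neigh @ w_neigh)

n_links, d = 5, 4
h = torch.randn(n_links, d)                         # per-link features (e.g., channel gains)
adj = (torch.rand(n_links, n_links) > 0.5).float()  # interference graph adjacency
out = gnn_layer(h, adj, torch.randn(d, d), torch.randn(d, d))
power = torch.sigmoid(out.mean(dim=1))              # per-link power levels in [0, 1]
print(power)
```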
1 code implementation • 9 Jul 2020 • Kunyuan Li, Jun Zhang, Rui Sun, Xu-Dong Zhang, Jun Gao
Based on the observation that an oriented line and its neighboring pixels in an EPI share a similar linear structure, we propose an end-to-end fully convolutional network (FCN) to estimate the depth value of the intersection point on the horizontal and vertical EPIs.
no code implementations • ACL 2020 • Zhiquan Ye, Yuxia Geng, Jiaoyan Chen, Jingmin Chen, Xiaoxiao Xu, SuHang Zheng, Feng Wang, Jun Zhang, Huajun Chen
In this situation, transferring from seen classes to unseen classes is extremely hard.
1 code implementation • 3 Jun 2020 • Jiawei Shao, Jun Zhang
The recent breakthrough in artificial intelligence (AI), especially deep neural networks (DNNs), has affected every branch of science and technology.
1 code implementation • 3 Jun 2020 • Shicong Liu, Zhen Gao, Jun Zhang, Marco Di Renzo, Mohamed-Slim Alouini
Integrating large intelligent reflecting surfaces (IRS) into millimeter-wave (mmWave) massive multi-input-multi-output (MIMO) has been a promising approach for improved coverage and throughput.
1 code implementation • 22 May 2020 • Jun Zhang, Mina Henein, Robert Mahony, Viorela Ila
Combining Simultaneous Localisation and Mapping (SLAM) estimation and dynamic scene modelling can highly benefit robot autonomy in dynamic environments.
Robotics
no code implementations • 19 May 2020 • Jun Zhang
The core is a traditional and useful solution concept in economic theory.
no code implementations • 14 May 2020 • Jingsheng Yu, Jun Zhang
We propose a new method to define trading algorithms in market design environments.
no code implementations • LREC 2020 • Chaofa Yuan, Yu-Han Liu, Rongdi Yin, Jun Zhang, Qinling Zhu, Ruibin Mao, Ruifeng Xu
Based on a high-quality annotation guideline and an effective quality control strategy, a corpus with 8,314 target-level sentiment annotations is constructed on 6,336 paragraphs from Chinese financial news text.
no code implementations • LREC 2020 • Xiaochang Gong, Qin Zhao, Jun Zhang, Ruibin Mao, Ruifeng Xu
Thus, the detection and processing of sarcasm is important to social media analysis. However, most existing sarcasm datasets are in English, and there is still a lack of an authoritative Chinese sarcasm dataset.
no code implementations • 26 Apr 2020 • Ye Xue, Yifei Shen, Vincent Lau, Jun Zhang, Khaled B. Letaief
Specifically, we propose a novel $\ell_3$-norm-based formulation to recover the data without channel estimation.
no code implementations • 25 Apr 2020 • Jun Zhang, Yao-Kun Lei, Zhen Zhang, Junhan Chang, Maodong Li, Xu Han, Lijiang Yang, Yi Isaac Yang, Yi Qin Gao
Deep learning is transforming many areas in science, and it has great potential in modeling molecular systems.
1 code implementation • 17 Apr 2020 • Wei Peng, Weien Zhou, Jun Zhang, Wen Yao
Physics-Informed Neural Networks (PINNs) can be regarded as general-purpose PDE solvers, but it might be slow to train PINNs on particular problems, and there is no theoretical guarantee of corresponding error bounds.
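For readers unfamiliar with PINNs, the sketch below trains a tiny network against an ODE residual plus an initial-condition penalty; the problem and hyperparameters are illustrative choices, not the setups studied in the paper:

```python
import torch

# Tiny PINN-style example for the ODE u'(x) = u(x) with u(0) = 1 on [0, 1].
net = torch.nn.Sequential(torch.nn.Linear(1, 16), torch.nn.Tanh(), torch.nn.Linear(16, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-2)

for step in range(500):
    x = torch.rand(32, 1, requires_grad=True)                 # collocation points
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    residual = (du - u).pow(2).mean()                         # equation residual u' - u
    boundary = (net(torch.zeros(1, 1)) - 1.0).pow(2).mean()   # initial condition u(0) = 1
    loss = residual + boundary
    opt.zero_grad()
    loss.backward()
    opt.step()

print(float(net(torch.ones(1, 1))))   # should approach e ≈ 2.718 with enough training
```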
no code implementations • 18 Mar 2020 • Xinjie Feng, Hongxun Yao, Yuankai Qi, Jun Zhang, Shengping Zhang
Different from previous transformer-based models [56, 34], which just use the decoder of the transformer to decode the convolutional attention, the proposed method uses convolutional feature maps as word embeddings input to the transformer.
1 code implementation • ECCV 2020 • Chuang Niu, Jun Zhang, Ge Wang, Jimin Liang
To train the GATCluster in a completely unsupervised manner, we design four self-learning tasks with the constraints of transformation invariance, separability maximization, entropy analysis, and attention mapping.