Search Results for author: Ke He

Found 4 papers, 2 papers with code

Learning based signal detection for MIMO systems with unknown noise statistics

1 code implementation • 21 Jan 2021 • Ke He, Le He, Lisheng Fan, Yansha Deng, George K. Karagiannidis, Arumugam Nallanathan

Existing detection methods have mainly focused on specific noise models and are therefore not robust when the noise statistics are unknown.
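As context for the noise-model assumption mentioned above, here is a minimal illustrative sketch (not the paper's learning-based detector): an exhaustive maximum-likelihood MIMO detector whose Euclidean-distance metric is optimal only under Gaussian noise, exercised under heavy-tailed noise. The system size, constellation, and Student-t noise are assumptions made purely for the example.

```python
# Illustrative baseline only: conventional ML detection assuming y = H x + Gaussian noise.
import itertools
import numpy as np

def ml_detect(y, H, constellation):
    """Exhaustive maximum-likelihood detection with a Euclidean-distance metric."""
    best_x, best_metric = None, np.inf
    for symbols in itertools.product(constellation, repeat=H.shape[1]):
        x = np.array(symbols)
        metric = np.linalg.norm(y - H @ x) ** 2   # optimal metric only for Gaussian noise
        if metric < best_metric:
            best_x, best_metric = x, metric
    return best_x

# Hypothetical 2x2 MIMO link with BPSK symbols; heavy-tailed (non-Gaussian) noise
# violates the Gaussian assumption behind the Euclidean metric.
rng = np.random.default_rng(0)
H = rng.standard_normal((2, 2))
x_true = np.array([1.0, -1.0])
y = H @ x_true + 0.5 * rng.standard_t(df=2, size=2)
print(ml_detect(y, H, constellation=[-1.0, 1.0]))
```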

Towards Optimally Efficient Search with Deep Learning for Large-Scale MIMO Systems

1 code implementation • 7 Jan 2021 • Le He, Ke He, Lisheng Fan, Xianfu Lei, Arumugam Nallanathan, George K. Karagiannidis

This indicates that the proposed algorithm reaches nearly optimal efficiency in practical scenarios and is therefore applicable to large-scale systems.

Explicit Connection Distillation

no code implementations • 1 Jan 2021 • Lujun Li, Yikai Wang, Anbang Yao, Yi Qian, Xiao Zhou, Ke He

In this paper, we present Explicit Connection Distillation (ECD), a new KD framework that addresses knowledge distillation from the novel perspective of bridging dense intermediate feature connections between a student network and a corresponding teacher generated automatically during training. Knowledge transfer is achieved via direct cross-network layer-to-layer gradient propagation, without the need to define complex distillation losses or to assume that a pre-trained teacher model is available.

Knowledge Distillation • Transfer Learning
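The following is a loose, hypothetical sketch of the cross-network connection idea described in the ECD abstract above, not the authors' implementation: a student feature map is routed into a later teacher stage, so the ordinary task loss back-propagates through the teacher's layers directly into the student, with no separate distillation loss. The module names and tensor shapes are assumptions made for illustration.

```python
# Hypothetical sketch of cross-network layer-to-layer gradient propagation (not ECD itself).
import torch
import torch.nn as nn
import torch.nn.functional as F

class Teacher(nn.Module):
    def __init__(self):
        super().__init__()
        self.stage1 = nn.Conv2d(3, 16, 3, padding=1)
        self.stage2 = nn.Conv2d(16, 16, 3, padding=1)
        self.head = nn.Linear(16, 10)

    def forward_from_feature(self, feat):
        # Continue the teacher forward pass from an injected (student) feature map.
        h = F.relu(self.stage2(feat))
        return self.head(h.mean(dim=(2, 3)))

class Student(nn.Module):
    def __init__(self):
        super().__init__()
        self.stage1 = nn.Conv2d(3, 16, 3, padding=1)

    def forward(self, x):
        return F.relu(self.stage1(x))

teacher, student = Teacher(), Student()
images = torch.randn(4, 3, 32, 32)
labels = torch.randint(0, 10, (4,))

# The student feature enters the teacher mid-network; gradients from the ordinary
# cross-entropy task loss flow through teacher.stage2/head into student.stage1.
logits = teacher.forward_from_feature(student(images))
loss = F.cross_entropy(logits, labels)
loss.backward()
print(student.stage1.weight.grad is not None)  # True: the student receives gradients
```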

FeCaffe: FPGA-enabled Caffe with OpenCL for Deep Learning Training and Inference on Intel Stratix 10

no code implementations • 18 Nov 2019 • Ke He, Bo Liu, Yu Zhang, Andrew Ling, Dian Gu

In this paper, we first propose FeCaffe, i.e. FPGA-enabled Caffe, a hierarchical software and hardware design methodology based on Caffe that enables FPGAs to support mainline deep learning development features, e.g. training and inference with Caffe.
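As background on the kind of host-side OpenCL offload that an FPGA-enabled framework like FeCaffe builds on, here is a minimal, hypothetical sketch using pyopencl: a simple element-wise ReLU kernel dispatched to whatever OpenCL device is available (an Intel Stratix 10 board in FeCaffe's setting). This is not FeCaffe's code; the kernel, names, and sizes are assumptions for illustration.

```python
# Hypothetical OpenCL offload sketch (host side), not part of FeCaffe.
import numpy as np
import pyopencl as cl

KERNEL_SRC = """
__kernel void relu(__global const float *x, __global float *y) {
    int i = get_global_id(0);
    y[i] = x[i] > 0.0f ? x[i] : 0.0f;
}
"""

ctx = cl.create_some_context()            # picks an available OpenCL device
queue = cl.CommandQueue(ctx)
prog = cl.Program(ctx, KERNEL_SRC).build()

x = np.random.randn(1024).astype(np.float32)
mf = cl.mem_flags
x_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=x)
y_buf = cl.Buffer(ctx, mf.WRITE_ONLY, x.nbytes)

prog.relu(queue, x.shape, None, x_buf, y_buf)   # enqueue the kernel
y = np.empty_like(x)
cl.enqueue_copy(queue, y, y_buf)                # read the result back to the host
assert np.allclose(y, np.maximum(x, 0.0))
```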
