Search Results for author: Zhongnan Qu

Found 11 papers, 5 papers with code

Efficient Large Language Models: A Survey

3 code implementations · 6 Dec 2023 · Zhongwei Wan, Xin Wang, Che Liu, Samiul Alam, Yu Zheng, Jiachen Liu, Zhongnan Qu, Shen Yan, Yi Zhu, Quanlu Zhang, Mosharaf Chowdhury, Mi Zhang

Large Language Models (LLMs) have demonstrated remarkable capabilities in important tasks such as natural language understanding, language generation, and complex reasoning and have the potential to make a substantial impact on our society.

Natural Language Understanding · Text Generation

Enabling Deep Learning on Edge Devices

no code implementations · 6 Oct 2022 · Zhongnan Qu

To deploy DNNs on edge devices, we need to reduce the size of DNNs, i.e., we target a better trade-off between resource consumption and model accuracy.

DRESS: Dynamic REal-time Sparse Subnets

no code implementations · 1 Jul 2022 · Zhongnan Qu, Syed Shakib Sarwar, Xin Dong, Yuecheng Li, Ekin Sumbul, Barbara De Salvo

The limited and dynamically varied resources on edge devices motivate us to deploy an optimized deep neural network that can adapt its sub-networks to fit in different resource constraints.
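One way to read "sub-networks that fit different resource constraints" is as nested sparsity: each sparser subnet reuses a subset of the weights kept by the denser ones, so a single weight tensor serves every constraint level. The sketch below illustrates that idea with a simple magnitude-ranking heuristic; the function name, parameters, and selection rule are assumptions for illustration, not the paper's actual method.

```python
import numpy as np

def nested_sparse_masks(weights, sparsities):
    """Build nested binary masks over one weight tensor (illustrative sketch,
    not DRESS itself): rank weights once by magnitude, then each sparsity
    level keeps a prefix of that ranking, so sparser masks are subsets of
    denser ones and all subnets share the same underlying weights."""
    order = np.argsort(np.abs(weights).ravel())[::-1]  # magnitudes, descending
    masks = []
    for s in sorted(sparsities):                       # densest subnet first
        k = int(round((1 - s) * weights.size))         # number of weights kept
        mask = np.zeros(weights.size, dtype=bool)
        mask[order[:k]] = True
        masks.append(mask.reshape(weights.shape))
    return masks
```

Because every mask keeps a prefix of the same ordering, switching between subnets at run time needs no extra storage beyond the masks themselves.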

SplitNets: Designing Neural Architectures for Efficient Distributed Computing on Head-Mounted Systems

no code implementations · CVPR 2022 · Xin Dong, Barbara De Salvo, Meng Li, Chiao Liu, Zhongnan Qu, H. T. Kung, Ziyun Li

We design deep neural networks (DNNs) and corresponding network splittings that distribute a DNN's workload between camera sensors and a centralized aggregator on head-mounted devices, meeting system performance targets for inference accuracy and latency under the given hardware resource constraints.

3D Classification · Distributed Computing +1
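Choosing where to split a sequential DNN between on-sensor compute and an aggregator can be framed as minimizing end-to-end latency: front-end compute, plus transmitting the intermediate activation, plus back-end compute. The toy model below sketches that trade-off; the latency model, function name, and parameters are illustrative assumptions, not SplitNets' actual formulation (which co-designs the architecture itself).

```python
def best_split(layer_flops, act_bytes, sensor_speed, agg_speed, bandwidth):
    """Pick the layer index at which to split a sequential DNN between a
    sensor and an aggregator, minimizing a toy end-to-end latency:
    sensor compute + link transfer of the crossing activation + aggregator
    compute. act_bytes[i] is the size of the tensor after layer i
    (act_bytes[0] is the raw input), so len(act_bytes) == len(layer_flops)+1."""
    best = None
    for i in range(len(layer_flops) + 1):
        t_sensor = sum(layer_flops[:i]) / sensor_speed
        t_link = act_bytes[i] / bandwidth
        t_agg = sum(layer_flops[i:]) / agg_speed
        total = t_sensor + t_link + t_agg
        if best is None or total < best[1]:
            best = (i, total)
    return best
```

With a slow sensor, a fast aggregator, and a narrow link, the optimum typically sits at the first layer that shrinks the activation enough to be cheap to transmit.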

Measuring what Really Matters: Optimizing Neural Networks for TinyML

2 code implementations · 21 Apr 2021 · Lennart Heim, Andreas Biri, Zhongnan Qu, Lothar Thiele

With the surge of inexpensive computational and memory resources, neural networks (NNs) have experienced an unprecedented growth in architectural and computational complexity.

Benchmarking

RSKDD-Net: Random Sample-based Keypoint Detector and Descriptor

1 code implementation · NeurIPS 2020 · Fan Lu, Guang Chen, Yinlong Liu, Zhongnan Qu, Alois Knoll

To tackle the information loss of random sampling, we exploit a novel random dilation cluster strategy to enlarge the receptive field of each sampled point and an attention mechanism to aggregate the positions and features of neighbor points.

Point Cloud Registration · Saliency Prediction
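The "random dilation cluster" idea — enlarge a sampled point's receptive field by gathering a dilated candidate neighborhood and then randomly subsampling it — can be sketched in a few lines. The function below is an illustrative reading of that strategy; names and parameters are assumptions, and the paper's exact neighborhood construction may differ.

```python
import numpy as np

def random_dilation_cluster(points, query, k, dilation, rng):
    """Illustrative sketch of a random dilation cluster: collect the
    k * dilation nearest neighbors of `query`, then randomly keep k of
    them. Compared with plain kNN, the kept neighbors spread over a
    wider region, enlarging the receptive field of the sampled point."""
    d2 = np.sum((points - query) ** 2, axis=1)      # squared distances
    cand = np.argsort(d2)[: k * dilation]           # dilated candidate set
    keep = rng.choice(cand, size=k, replace=False)  # random subsample
    return points[keep]
```

With `dilation=1` this reduces to ordinary kNN; larger values trade local density for coverage, which is what compensates for information lost to random sampling.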

Deep Partial Updating: Towards Communication Efficient Updating for On-device Inference

no code implementations · 6 Jul 2020 · Zhongnan Qu, Cong Liu, Lothar Thiele

Emerging edge intelligence applications require the server to retrain and update deep neural networks deployed on remote edge nodes to leverage newly collected data samples.
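The communication saving comes from sending only a subset of retrained weights to the device instead of the full model. The sketch below uses a simple largest-change heuristic to pick which weights to transmit as (index, value) pairs; this is an illustration of partial updating in general, not the paper's actual (loss-aware) selection criterion.

```python
import numpy as np

def partial_update(old_w, new_w, budget):
    """Communication sketch of deep partial updating (heuristic, not the
    paper's selection metric): the server sends only the `budget` weights
    whose retrained values moved the most, as (flat index, value) pairs,
    and the edge device patches its resident copy in place."""
    delta = np.abs(new_w - old_w).ravel()
    idx = np.argsort(delta)[-budget:]                 # largest updates
    payload = [(int(i), float(new_w.ravel()[i])) for i in idx]
    patched = old_w.copy().ravel()
    for i, v in payload:                              # device-side patch
        patched[i] = v
    return payload, patched.reshape(old_w.shape)
```

For a network with millions of weights, a payload of a few thousand pairs is orders of magnitude smaller than shipping the whole updated model.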

Event-based Robotic Grasping Detection with Neuromorphic Vision Sensor and Event-Stream Dataset

1 code implementation · 28 Apr 2020 · Bin Li, Hu Cao, Zhongnan Qu, Yingbai Hu, Zhenke Wang, Zichen Liang

Based on the Event-Stream dataset, we develop a deep neural network for grasping detection that treats the angle learning problem as classification instead of regression.

Robotic Grasping
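Treating angle learning as classification means discretizing the grasp orientation into bins and predicting a bin label rather than a continuous value. The helpers below sketch that discretization; the bin count and the choice of a half-circle period are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def angle_to_class(angle_rad, n_bins=18):
    """Map a grasp angle to an orientation bin (illustrative sketch).
    Grasp orientation has period pi (a gripper rotated by 180 degrees
    grasps identically), so [0, pi) is split into n_bins classes."""
    a = angle_rad % np.pi
    return int(a // (np.pi / n_bins))

def class_to_angle(cls, n_bins=18):
    """Recover the center angle of a predicted bin."""
    return (cls + 0.5) * np.pi / n_bins
```

The payoff of the classification view is that a standard cross-entropy loss applies directly, and the round trip through a bin loses at most half a bin width of angular precision.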

Adaptive Loss-aware Quantization for Multi-bit Networks

1 code implementation · CVPR 2020 · Zhongnan Qu, Zimu Zhou, Yun Cheng, Lothar Thiele

We investigate the compression of deep neural networks by quantizing their weights and activations into multiple binary bases, known as multi-bit networks (MBNs), which accelerate the inference and reduce the storage for the deployment on low-resource mobile and embedded platforms.

Quantization
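A multi-bit network approximates a weight tensor as a weighted sum of binary bases, w ≈ Σᵢ αᵢ Bᵢ with Bᵢ ∈ {−1, +1}, so inference reduces to cheap sign operations and a few scalings. The sketch below shows the generic greedy residual-binarization construction of such bases; it is not ALQ's adaptive loss-aware scheme, only an illustration of the MBN representation the abstract refers to.

```python
import numpy as np

def multibit_quantize(w, n_bits):
    """Greedy residual binarization (generic MBN sketch, not ALQ itself):
    at each step, take the sign of the current residual as a binary basis
    B_i and the mean absolute residual as its optimal scale alpha_i, then
    subtract alpha_i * B_i and repeat on what remains."""
    residual = w.astype(float).copy()
    alphas, bases = [], []
    for _ in range(n_bits):
        b = np.where(residual >= 0, 1.0, -1.0)  # binary basis in {-1, +1}
        a = float(np.mean(np.abs(residual)))    # least-squares scale for b
        alphas.append(a)
        bases.append(b)
        residual -= a * b
    approx = sum(a * b for a, b in zip(alphas, bases))
    return alphas, bases, approx
```

Each extra bit adds one basis and strictly shrinks the reconstruction residual, which is the storage/accuracy knob a deployment can turn per layer.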

Deep Anticipation: Light Weight Intelligent Mobile Sensing in IoT by Recurrent Architecture

no code implementations · 6 Dec 2017 · Guang Chen, Shu Liu, Kejia Ren, Zhongnan Qu, Changhong Fu, Gereon Hinz, Alois Knoll

However, mobile sensing perception brings new challenges in how to efficiently analyze and intelligently interpret the deluge of IoT data in mission-critical services.
