3 code implementations • 6 Dec 2023 • Zhongwei Wan, Xin Wang, Che Liu, Samiul Alam, Yu Zheng, Jiachen Liu, Zhongnan Qu, Shen Yan, Yi Zhu, Quanlu Zhang, Mosharaf Chowdhury, Mi Zhang
Large Language Models (LLMs) have demonstrated remarkable capabilities in important tasks such as natural language understanding, language generation, and complex reasoning, and they have the potential to make a substantial impact on our society.
no code implementations • 6 Oct 2022 • Zhongnan Qu
To deploy DNNs on edge devices, we need to reduce the size of DNNs, i.e., we target a better trade-off between resource consumption and model accuracy.
no code implementations • 1 Jul 2022 • Zhongnan Qu, Syed Shakib Sarwar, Xin Dong, Yuecheng Li, Ekin Sumbul, Barbara De Salvo
The limited and dynamically varied resources on edge devices motivate us to deploy an optimized deep neural network that can adapt its sub-networks to fit in different resource constraints.
no code implementations • 25 Jun 2022 • Zhongnan Qu, Zimu Zhou, Yongxin Tong, Lothar Thiele
Data collected by IoT devices are often private and have a large diversity across users.
no code implementations • CVPR 2022 • Xin Dong, Barbara De Salvo, Meng Li, Chiao Liu, Zhongnan Qu, H. T. Kung, Ziyun Li
We design deep neural networks (DNNs) and corresponding network splittings that distribute the DNN workload between camera sensors and a centralized aggregator on head-mounted devices, meeting system performance targets in inference accuracy and latency under the given hardware resource constraints.
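The split-point idea above can be sketched as a simple latency model: for each candidate split of a layer-sequential DNN, estimate sensor compute, feature-transmission, and aggregator compute time, then pick the cheapest split. All function and parameter names here are illustrative assumptions, not the paper's actual method or API.

```python
def best_split(input_bytes, layer_flops, layer_out_bytes,
               sensor_fps, agg_fps, link_bps):
    """Sketch: choose where to split a DNN between a camera sensor and an
    aggregator by minimizing estimated end-to-end latency.

    input_bytes     -- size of the raw input if everything runs remotely
    layer_flops     -- per-layer compute cost (FLOPs)
    layer_out_bytes -- per-layer output feature size (bytes)
    sensor_fps      -- sensor throughput (FLOPs/s)
    agg_fps         -- aggregator throughput (FLOPs/s)
    link_bps        -- link bandwidth (bytes/s)
    """
    best_s, best_t = None, float("inf")
    # Split point s: layers [0, s) run on the sensor, [s, end) on the aggregator.
    for s in range(len(layer_flops) + 1):
        t_sensor = sum(layer_flops[:s]) / sensor_fps
        sent = input_bytes if s == 0 else layer_out_bytes[s - 1]
        t_comm = sent / link_bps
        t_agg = sum(layer_flops[s:]) / agg_fps
        t = t_sensor + t_comm + t_agg
        if t < best_t:
            best_s, best_t = s, t
    return best_s, best_t
```

With a slow link and a bottleneck layer that shrinks the feature map, the search naturally places the split right after that layer, which is the intuition behind distributing DNN workload to the sensors.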
2 code implementations • 21 Apr 2021 • Lennart Heim, Andreas Biri, Zhongnan Qu, Lothar Thiele
With the surge of inexpensive computational and memory resources, neural networks (NNs) have experienced an unprecedented growth in architectural and computational complexity.
1 code implementation • NeurIPS 2020 • Fan Lu, Guang Chen, Yinlong Liu, Zhongnan Qu, Alois Knoll
To tackle the information loss of random sampling, we exploit a novel random dilation cluster strategy to enlarge the receptive field of each sampled point and an attention mechanism to aggregate the positions and features of neighbor points.
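The "random dilation" idea can be sketched as follows: instead of taking the k nearest neighbors of a sampled point, gather the k × d nearest and randomly keep k of them, which enlarges the receptive field without increasing the neighbor count. This is a hedged sketch of the general dilated-neighborhood technique; the function name and parameters are illustrative, not the paper's implementation.

```python
import numpy as np

def random_dilated_neighbors(points, center_idx, k=8, dilation=4, rng=None):
    """Return k neighbor indices for points[center_idx], randomly drawn
    from its k * dilation nearest neighbors (excluding the point itself).

    points -- (N, 3) array of point coordinates, N > k * dilation
    """
    rng = np.random.default_rng() if rng is None else rng
    dists = np.linalg.norm(points - points[center_idx], axis=1)
    # Index 0 of argsort is the center point itself; skip it.
    candidates = np.argsort(dists)[1:k * dilation + 1]
    return rng.choice(candidates, size=k, replace=False)
```

The sampled neighborhood spans roughly `dilation` times the spatial extent of a plain kNN neighborhood, at the same aggregation cost.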
no code implementations • 6 Jul 2020 • Zhongnan Qu, Cong Liu, Lothar Thiele
Emerging edge intelligence applications require the server to retrain and update deep neural networks deployed on remote edge nodes to leverage newly collected data samples.
1 code implementation • 28 Apr 2020 • Bin Li, Hu Cao, Zhongnan Qu, Yingbai Hu, Zhenke Wang, Zichen Liang
Based on the Event-Stream dataset, we develop a deep neural network for grasping detection that treats the angle learning problem as classification instead of regression.
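Treating grasp angle as classification means discretizing the orientation range into bins and predicting a bin label. A minimal sketch of that discretization, assuming 18 bins over [0, π) (grasp orientations are symmetric under 180° rotation); the bin count and function names are illustrative, not taken from the paper:

```python
import math

def angle_to_class(theta, num_bins=18):
    """Map a grasp angle (radians) to one of num_bins orientation classes.
    Angles are folded into [0, pi) because a grasp rotated by 180 degrees
    is the same grasp."""
    theta = theta % math.pi
    return int(theta / (math.pi / num_bins)) % num_bins

def class_to_angle(c, num_bins=18):
    """Return the bin-center angle (radians) for class c."""
    return (c + 0.5) * (math.pi / num_bins)
```

The round-trip error is bounded by half the bin width (π/36 ≈ 5° here), and the network can be trained with an ordinary cross-entropy loss over the bin labels.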
1 code implementation • CVPR 2020 • Zhongnan Qu, Zimu Zhou, Yun Cheng, Lothar Thiele
We investigate the compression of deep neural networks by quantizing their weights and activations into multiple binary bases, known as multi-bit networks (MBNs), which accelerate the inference and reduce the storage for the deployment on low-resource mobile and embedded platforms.
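The multi-bit idea can be illustrated with the standard greedy residual binarization: approximate a weight tensor as W ≈ Σᵢ αᵢ Bᵢ with each Bᵢ ∈ {−1, +1} and scalar scales αᵢ, fitting each basis to the residual left by the previous ones. This is a generic sketch of multi-bit quantization, not the paper's specific algorithm:

```python
import numpy as np

def multibit_quantize(w, num_bits=3):
    """Greedily approximate w with scaled binary bases:
    w ~ sum_i alphas[i] * bases[i], bases[i] in {-1, +1}."""
    residual = np.asarray(w, dtype=np.float64).copy()
    alphas, bases = [], []
    for _ in range(num_bits):
        b = np.sign(residual)
        b[b == 0] = 1.0                 # treat exact zeros as +1
        alpha = np.mean(np.abs(residual))  # L2-optimal scale for this basis
        alphas.append(alpha)
        bases.append(b)
        residual -= alpha * b           # next basis fits what is left over
    return np.array(alphas), np.stack(bases)

def reconstruct(alphas, bases):
    """Rebuild the full-precision approximation from the binary bases."""
    return np.tensordot(alphas, bases, axes=1)
```

Because each Bᵢ is a ±1 mask, the inner products at inference time reduce to bitwise operations and popcounts, which is where the speedup and storage savings on embedded platforms come from; adding bases trades accuracy against cost.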
no code implementations • 6 Dec 2017 • Guang Chen, Shu Liu, Kejia Ren, Zhongnan Qu, Changhong Fu, Gereon Hinz, Alois Knoll
However, mobile sensing perception brings new challenges for efficiently analyzing and intelligently interpreting the deluge of IoT data in mission-critical services.