no code implementations • 20 Feb 2025 • Zhongyi Zhou, Yichen Zhu, Minjie Zhu, Junjie Wen, Ning Liu, Zhiyuan Xu, Weibin Meng, Ran Cheng, Yaxin Peng, Chaomin Shen, Feifei Feng
Humans possess a unified cognitive ability to perceive, comprehend, and interact with the physical world.
no code implementations • 17 Feb 2025 • Juantao Zhong, Daoyuan Wu, Ye Liu, Maoyi Xie, Yang Liu, Yi Li, Ning Liu
DeFi (Decentralized Finance) is one of the most important applications of today's cryptocurrencies and smart contracts.
1 code implementation • 10 Jan 2025 • Yucheng Ding, Yangwenjian Tan, Xiangyu Liu, Chaoyue Niu, Fandong Meng, Jie Zhou, Ning Liu, Fan Wu, Guihai Chen
In many practical natural language applications, user data are highly sensitive, requiring anonymous uploads of text data from mobile devices to the cloud without user identifiers.
no code implementations • 18 Dec 2024 • Kun Wu, Chengkai Hou, Jiaming Liu, Zhengping Che, Xiaozhu Ju, Zhuqin Yang, Meng Li, Yinuo Zhao, Zhiyuan Xu, Guang Yang, Shichao Fan, Xinhua Wang, Fei Liao, Zhen Zhao, Guangyu Li, Zhao Jin, Lecheng Wang, Jilei Mao, Ning Liu, Pei Ren, Qiang Zhang, Yaoxu Lyu, Mengzhen Liu, Jingyang He, Yulin Luo, Zeyu Gao, Chenxuan Li, Chenyang Gu, Yankai Fu, Di Wu, Xingyu Wang, Sixiang Chen, Zhenyu Wang, Pengju An, Siyuan Qian, Shanghang Zhang, Jian Tang
To the best of our knowledge, RoboMIND is the largest multi-embodiment teleoperation dataset collected on a unified platform, providing large-scale and high-quality robotic training data.
1 code implementation • 31 Oct 2024 • Xinwang Chen, Ning Liu, Yichen Zhu, Feifei Feng, Jian Tang
Transformer-based Diffusion Probabilistic Models (DPMs) have shown more potential than CNN-based DPMs, yet their extensive computational requirements hinder widespread practical applications.
no code implementations • 3 Oct 2024 • Ning Liu, Lu Zhang, Tian Gao, Yue Yu
Specifically, we introduce DisentangO, a novel hyper-neural operator architecture designed to unveil and disentangle the latent physical factors of variation embedded within the black-box neural operator parameters.
no code implementations • 26 Sep 2024 • Shiqi Sun, Yantao Lu, Ning Liu, Bo Jiang, Jinchao Chen, Ying Zhang
The redundant parameters can be pruned by our proposed importance score evaluation function, Alternative Evaluation (AlterEva), which is based on the observation of the loss changes when certain modality parameters are activated and deactivated.
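As a rough illustration of this activation/deactivation idea, the sketch below scores a modality branch of a toy two-branch network by the loss change observed when that branch's parameters are zeroed out; the model, branch names, and data are placeholders, not the paper's actual AlterEva implementation.

```python
# Hypothetical sketch: score a modality branch by the loss change observed when
# its parameters are deactivated (zeroed). Illustrative only, not AlterEva itself.
import copy
import torch
import torch.nn as nn

def branch_importance(model, branch_name, inputs, targets, loss_fn):
    """Importance of one branch = |loss(deactivated) - loss(active)|."""
    with torch.no_grad():
        base_loss = loss_fn(model(*inputs), targets).item()
        pruned = copy.deepcopy(model)
        for name, p in pruned.named_parameters():
            if name.startswith(branch_name):      # deactivate this modality branch
                p.zero_()
        off_loss = loss_fn(pruned(*inputs), targets).item()
    return abs(off_loss - base_loss)

class TwoModalityNet(nn.Module):                  # toy camera + lidar fusion model
    def __init__(self):
        super().__init__()
        self.camera = nn.Linear(16, 8)
        self.lidar = nn.Linear(16, 8)
        self.head = nn.Linear(16, 3)
    def forward(self, cam, lid):
        return self.head(torch.cat([self.camera(cam), self.lidar(lid)], dim=-1))

model = TwoModalityNet()
cam, lid = torch.randn(4, 16), torch.randn(4, 16)
targets = torch.randint(0, 3, (4,))
score = branch_importance(model, "camera", (cam, lid), targets, nn.CrossEntropyLoss())
print(f"camera-branch importance: {score:.4f}")
```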
no code implementations • 19 Sep 2024 • Junjie Wen, Yichen Zhu, Jinming Li, Minjie Zhu, Kun Wu, Zhiyuan Xu, Ning Liu, Ran Cheng, Chaomin Shen, Yaxin Peng, Feifei Feng, Jian Tang
Vision-Language-Action (VLA) models have shown remarkable potential in visuomotor control and instruction comprehension through end-to-end learning processes.
no code implementations • 14 Aug 2024 • Yue Yu, Ning Liu, Fei Lu, Tian Gao, Siavash Jafarzadeh, Stewart Silling
In this work, we propose a novel neural operator architecture based on the attention mechanism, which we coin Nonlocal Attention Operator (NAO), and explore its capability towards developing a foundation physical model.
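For orientation, the minimal layer below shows how an attention map can act as a learned, data-dependent nonlocal kernel over sampled points of a function; the sizes and class name are illustrative assumptions and do not reproduce the actual NAO architecture.

```python
# Minimal sketch: attention weights as a data-dependent nonlocal kernel over points.
import torch
import torch.nn as nn

class AttentionKernelLayer(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, u):                        # u: (batch, n_points, dim)
        q, k, v = self.q(u), self.k(u), self.v(u)
        # attention weights play the role of a learned nonlocal kernel
        kernel = torch.softmax(q @ k.transpose(-2, -1) / u.shape[-1] ** 0.5, dim=-1)
        return kernel @ v                        # nonlocal aggregation over points

u = torch.randn(2, 64, 32)                       # function values sampled at 64 points
print(AttentionKernelLayer(32)(u).shape)         # torch.Size([2, 64, 32])
```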
no code implementations • 11 Aug 2024 • Dingyi Rong, Wenzhuo Zheng, Bozitao Zhong, Zhouhan Lin, Liang Hong, Ning Liu
Accurate prediction of enzyme function is crucial for elucidating biological mechanisms and driving innovation across various sectors.
no code implementations • 3 Jul 2024 • Ning Liu, Siavash Jafarzadeh, Brian Y. Lattimer, Shuna Ni, Jim Lua, Yue Yu
Our framework features a two-phase training strategy: (1) utilizing the large-in-amount while less accurate synthetic data for supervised pretraining, and (2) finetuning the phase-1 model with limited experimental data.
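A skeleton of such a two-phase strategy might look like the following; the model, data loaders, and hyperparameters are placeholders rather than the paper's settings.

```python
# Two-phase skeleton: supervised pretraining on abundant synthetic data, then
# finetuning on scarce experimental data with a smaller learning rate.
import torch
import torch.nn as nn

def run_epochs(model, loader, optimizer, loss_fn, epochs):
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            optimizer.step()

model = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 1))
loss_fn = nn.MSELoss()

# Phase 1: pretrain on large-in-amount, less accurate synthetic data.
synthetic = [(torch.randn(32, 8), torch.randn(32, 1)) for _ in range(100)]
run_epochs(model, synthetic, torch.optim.Adam(model.parameters(), lr=1e-3),
           loss_fn, epochs=10)

# Phase 2: finetune on limited experimental data.
experimental = [(torch.randn(32, 8), torch.randn(32, 1)) for _ in range(5)]
run_epochs(model, experimental, torch.optim.Adam(model.parameters(), lr=1e-4),
           loss_fn, epochs=30)
```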
no code implementations • 28 Jun 2024 • Jinming Li, Yichen Zhu, Zhiyuan Xu, Jindong Gu, Minjie Zhu, Xin Liu, Ning Liu, Yaxin Peng, Feifei Feng, Jian Tang
It is fundamentally challenging for robots to serve as useful assistants in human environments because this requires addressing a spectrum of sub-problems across robotics, including perception, language understanding, reasoning, and planning.
no code implementations • 8 Jun 2024 • Xunguang Wang, Daoyuan Wu, Zhenlan Ji, Zongjie Li, Pingchuan Ma, Shuai Wang, Yingjiu Li, Yang Liu, Ning Liu, Juergen Rahmel
Jailbreaking is an emerging adversarial attack that bypasses the safety alignment deployed in off-the-shelf large language models (LLMs) and has evolved into multiple categories: human-based, optimization-based, generation-based, and the recent indirect and multilingual jailbreaks.
no code implementations • 13 May 2024 • Ning Liu, Xuxiao Li, Manoj R. Rajanna, Edward W. Reutzel, Brady Sawyer, Prahalada Rao, Jim Lua, Nam Phan, Yue Yu
In Laser Powder Bed Fusion (L-PBF) based additive manufacturing (AM), a digital twin (DT) can predict the current and future states of the melt pool and the resulting defects corresponding to the input laser parameters, evolve itself by assimilating in-situ sensor data, and optimize the laser parameters to mitigate defect formation.
no code implementations • 8 Apr 2024 • Zhengyang Zhao, Haitao Yuan, Nan Jiang, Minxiao Chen, Ning Liu, Zengxiang Li
Accurate traffic prediction is a challenging task in intelligent transportation due to the spatial-temporal characteristics of road networks.
1 code implementation • 18 Mar 2024 • Tengchuan Kou, Xiaohong Liu, ZiCheng Zhang, Chunyi Li, HaoNing Wu, Xiongkuo Min, Guangtao Zhai, Ning Liu
Based on T2VQA-DB, we propose a novel transformer-based model for subjective-aligned Text-to-Video Quality Assessment (T2VQA).
no code implementations • 11 Mar 2024 • Leo Chen, Benjamin Boardley, Ping Hu, Yiru Wang, Yifan Pu, Xin Jin, Yongqiang Yao, Ruihao Gong, Bo Li, Gao Huang, Xianglong Liu, Zifu Wan, Xinwang Chen, Ning Liu, Ziyi Zhang, Dongping Liu, Ruijie Shan, Zhengping Che, Fachao Zhang, Xiaofeng Mou, Jian Tang, Maxim Chuprov, Ivan Malofeev, Alexander Goncharenko, Andrey Shcherbin, Arseny Yanchenko, Sergey Alyamkin, Xiao Hu, George K. Thiruvathukal, Yung Hsiang Lu
This article describes the 2023 IEEE Low-Power Computer Vision Challenge (LPCVC).
1 code implementation • 10 Mar 2024 • Minjie Zhu, Yichen Zhu, Xin Liu, Ning Liu, Zhiyuan Xu, Chaomin Shen, Yaxin Peng, Zhicai Ou, Feifei Feng, Jian Tang
Multimodal Large Language Models (MLLMs) have showcased impressive skills in tasks related to visual understanding and reasoning.
Ranked #183 on Visual Question Answering on MM-Vet
no code implementations • 24 Feb 2024 • Daoyuan Wu, Shuai Wang, Yang Liu, Ning Liu
Our key insight is that, regardless of the jailbreak strategy employed, the attack eventually needs to include a harmful prompt (e.g., "how to make a bomb") in the prompt sent to the LLM, and we found that existing LLMs can effectively recognize such harmful prompts as violating their safety policies.
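A minimal sketch of this screening idea, assuming only a generic `call_llm` callable (a placeholder for whatever chat-completion client is available, not a specific API); the stub at the end only illustrates the control flow.

```python
# Ask an LLM whether an incoming prompt contains a harmful request before answering it.
CHECK_TEMPLATE = (
    "You are a content safety checker. Does the following user prompt contain "
    "a request that violates safety policies? Answer with exactly YES or NO.\n\n"
    "Prompt:\n{prompt}"
)

def is_harmful(user_prompt: str, call_llm) -> bool:
    verdict = call_llm(CHECK_TEMPLATE.format(prompt=user_prompt))
    return verdict.strip().upper().startswith("YES")

def guarded_answer(user_prompt: str, call_llm) -> str:
    if is_harmful(user_prompt, call_llm):
        return "Sorry, I can't help with that request."
    return call_llm(user_prompt)

# Stub LLM for demonstration only: flags prompts mentioning "bomb".
stub_llm = lambda p: "YES" if "bomb" in p.lower() else "NO"
print(guarded_answer("how to make a bomb", stub_llm))   # prints the refusal
```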
no code implementations • 31 Jan 2024 • Dong Chen, Ning Liu, Yichen Zhu, Zhengping Che, Rui Ma, Fachao Zhang, Xiaofeng Mou, Yi Chang, Jian Tang
Instead of a simple combination of pruning and SD, EPSD enables the pruned network to favor SD by keeping more distillable weights before training to ensure better distillation of the pruned network.
no code implementations • 17 Jan 2024 • Kun Wu, Ning Liu, Zhen Zhao, Di Qiu, Jinming Li, Zhengping Che, Zhiyuan Xu, Qinru Qiu, Jian Tang
High-quality segments from the failed data are used to expand the training dataset.
no code implementations • 13 Jan 2024 • Jie Tian, Jixin Hou, Zihao Wu, Peng Shu, Zhengliang Liu, Yujie Xiang, Beikang Gu, Nicholas Filla, Yiwei Li, Ning Liu, Xianyan Chen, Keke Tang, Tianming Liu, Xianqiao Wang
This study is a pioneering endeavor to investigate the capabilities of Large Language Models (LLMs) in addressing conceptual questions within the domain of mechanical engineering with a focus on mechanics.
no code implementations • 11 Jan 2024 • Siavash Jafarzadeh, Stewart Silling, Ning Liu, Zhongqiang Zhang, Yue Yu
In this work, we introduce a novel integral neural operator architecture called the Peridynamic Neural Operator (PNO) that learns a nonlocal constitutive law from data.
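For context, a prototypical linear nonlocal operator of the peridynamic type acts on a field $u$ through an integral over a horizon $B_\delta(x)$; the kernel $K$ is the kind of object a nonlocal constitutive-law model would learn (the notation here is generic, not necessarily the paper's):
$$\mathcal{L}_\delta[u](x) = \int_{B_\delta(x)} K(x, y)\,\bigl(u(y) - u(x)\bigr)\,\mathrm{d}y.$$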
1 code implementation • 4 Jan 2024 • Yichen Zhu, Minjie Zhu, Ning Liu, Zhicai Ou, Xiaofeng Mou, Jian Tang
In this paper, we introduce LLaVA-$\phi$ (LLaVA-Phi), an efficient multi-modal assistant that harnesses the power of the recently advanced small language model, Phi-2, to facilitate multi-modal dialogues.
Ranked #212 on Visual Question Answering on MM-Vet
1 code implementation • 27 Dec 2023 • Weijun Chen, Heyuan Wang, Ye Tian, Shijie Guan, Ning Liu
Additionally, adopting a frequency-based perspective can effectively mitigate the influence of noise within MTS data, which helps capture more genuine dependencies.
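As a toy illustration of this frequency-based perspective, the sketch below applies a simple low-pass filter in the Fourier domain to a noisy multivariate series; the cutoff is an arbitrary choice, not the paper's mechanism.

```python
# Low-pass filtering of a multivariate time series (MTS) in the Fourier domain.
import numpy as np

def lowpass_denoise(x: np.ndarray, keep_ratio: float = 0.1) -> np.ndarray:
    """x: (timesteps, variables). Zero out all but the lowest frequencies."""
    spec = np.fft.rfft(x, axis=0)
    cutoff = max(1, int(keep_ratio * spec.shape[0]))
    spec[cutoff:] = 0.0                           # drop high-frequency components
    return np.fft.irfft(spec, n=x.shape[0], axis=0)

t = np.linspace(0, 4 * np.pi, 256)
clean = np.stack([np.sin(t), np.cos(2 * t)], axis=1)        # two correlated series
noisy = clean + 0.3 * np.random.randn(*clean.shape)
denoised = lowpass_denoise(noisy)
print("error before/after:",
      np.mean((noisy - clean) ** 2).round(3),
      np.mean((denoised - clean) ** 2).round(3))
```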
1 code implementation • 18 Dec 2023 • Ning Liu, Yiming Fan, Xianyi Zeng, Milan Klöwer, Lu Zhang, Yue Yu
In this work, we introduce conservation law-encoded neural operators (clawNOs), a suite of NOs that endow inference with automatic satisfaction of such conservation laws.
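A representative conservation law of the kind such operators are meant to satisfy is the continuity (mass conservation) equation; the specific laws encoded by clawNOs depend on the application:
$$\frac{\partial \rho}{\partial t} + \nabla \cdot (\rho\, \mathbf{v}) = 0.$$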
1 code implementation • 4 Dec 2023 • Lei Wang, Jiabang He, Shenshen Li, Ning Liu, Ee-Peng Lim
Fine-grained object attributes and behaviors that do not exist in the image may still be generated, yet they are not measured by current evaluation methods.
no code implementations • 30 Nov 2023 • Dan Song, Xinwei Fu, Ning Liu, Weizhi Nie, Wenhui Li, Lanjun Wang, You Yang, AnAn Liu
Consequently, this paper aims to improve the confidence with view selection and hierarchical prompts.
no code implementations • 13 Nov 2023 • Fanlong Zeng, Wensheng Gan, Yongheng Wang, Ning Liu, Philip S. Yu
Understanding and assessing this intelligence is a complex task.
1 code implementation • 22 Oct 2023 • Zhuo Wang, Wei Zhang, Ning Liu, Jianyong Wang
Rule-based models, e.g., decision trees, are widely used in scenarios demanding high model interpretability for their transparent inner structures and good model expressivity.
1 code implementation • 23 Aug 2023 • Zhenyu Li, Sunqi Fan, Yu Gu, Xiuxing Li, Zhichao Duan, Bowen Dong, Ning Liu, Jianyong Wang
Knowledge base question answering (KBQA) is a critical yet challenging task due to the vast number of entities within knowledge bases and the diversity of natural language questions posed by users.
no code implementations • 15 Aug 2023 • Wenzhuo Zheng, Junhao Zhao, Xiaohong Liu, Yongyang Pan, Zhenghao Gan, Haozhe Han, Ning Liu
Our work addresses the problem of combining parametric face models with multi-view 3D face reconstruction, and explores a FLAME-based multi-view training and testing framework as a contribution to the field of 3D face reconstruction.
1 code implementation • 9 Aug 2023 • Tengchuan Kou, Xiaohong Liu, Wei Sun, Jun Jia, Xiongkuo Min, Guangtao Zhai, Ning Liu
Indeed, most existing quality assessment models evaluate video quality as a whole without specifically taking the subjective experience of video stability into consideration.
1 code implementation • 25 Jul 2023 • Hongzuo Xu, Yijie Wang, Guansong Pang, Songlei Jian, Ning Liu, Yongjun Wang
anomaly contamination.
Tasks: Semi-supervised Anomaly Detection, Supervised Anomaly Detection
1 code implementation • 5 Jun 2023 • Jiabang He, Yi Hu, Lei Wang, Xing Xu, Ning Liu, Hui Liu, Heng Tao Shen
Results from the experiments demonstrate that there is a significant performance gap between the in-distribution (ID) and OOD settings for document images, and that fine-grained analysis of distribution shifts can reveal the brittle nature of existing pre-trained VDU models and OOD generalization algorithms.
no code implementations • 2 Jun 2023 • Zhuo Wang, Rongzhen Li, Bowen Dong, Jie Wang, Xiuxing Li, Ning Liu, Chenhui Mao, Wei Zhang, Liling Dong, Jing Gao, Jianyong Wang
In this paper, we explore the potential of LLMs such as GPT-4 to outperform traditional AI tools in dementia diagnosis.
2 code implementations • 25 May 2023 • Hongzuo Xu, Yijie Wang, Juhui Wei, Songlei Jian, Yizhou Li, Ning Liu
Due to the unsupervised nature of anomaly detection, the key to fueling deep models is finding supervisory signals.
1 code implementation • 5 May 2023 • Lei Wang, Yi Hu, Jiabang He, Xing Xu, Ning Liu, Hui Liu, Heng Tao Shen
To address these issues, we propose a novel method termed T-SciQ that aims at teaching science question answering with LLM signals.
no code implementations • 6 Apr 2023 • Zhichao Duan, Xiuxing Li, Zhengyan Zhang, Zhenyu Li, Ning Liu, Jianyong Wang
As a popular topic in natural language processing, extractive question answering (extractive QA) has gained extensive attention in the past few years.
no code implementations • 23 Mar 2023 • Yaomin Huang, Ning Liu, Zhengping Che, Zhiyuan Xu, Chaomin Shen, Yaxin Peng, Guixu Zhang, Xinmei Liu, Feifei Feng, Jian Tang
CP$^3$ is elaborately designed to leverage the characteristics of point clouds and PNNs in order to enable 2D channel pruning methods for PNNs.
no code implementations • 23 Mar 2023 • Mingze Wei, Yaomin Huang, Zhiyuan Xu, Ning Liu, Zhengping Che, Xinyu Zhang, Chaomin Shen, Feifei Feng, Chun Shan, Jian Tang
Our work significantly outperforms the state-of-the-art for three-finger robotic hands.
1 code implementation • ICCV 2023 • Jiabang He, Lei Wang, Yi Hu, Ning Liu, Hui Liu, Xing Xu, Heng Tao Shen
To this end, we propose a simple but effective in-context learning framework called ICL-D3IE, which enables LLMs to perform DIE with different types of demonstration examples.
no code implementations • 9 Mar 2023 • Ning Liu, Benjamin Grimmer
We consider feasibility and constrained optimization problems defined over smooth and/or strongly convex sets.
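For orientation only, a classic baseline for convex feasibility problems of this flavor is alternating projections onto the constraint sets; the sketch below uses a ball and a halfspace and is not the method proposed in the paper.

```python
# Alternating projections for a toy convex feasibility problem.
import numpy as np

def project_ball(x, center, radius):
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= radius else center + radius * d / n

def project_halfspace(x, a, b):                  # {y : a @ y <= b}
    v = a @ x - b
    return x if v <= 0 else x - v * a / (a @ a)

x = np.array([5.0, 5.0])
center, radius = np.zeros(2), 2.0
a, b = np.array([1.0, 1.0]), 1.0
for _ in range(50):                              # alternate until (near) feasibility
    x = project_halfspace(project_ball(x, center, radius), a, b)
print(x, np.linalg.norm(x) <= radius + 1e-6, a @ x <= b + 1e-6)
```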
no code implementations • CVPR 2023 • Yaomin Huang, Ning Liu, Zhengping Che, Zhiyuan Xu, Chaomin Shen, Yaxin Peng, Guixu Zhang, Xinmei Liu, Feifei Feng, Jian Tang
Directly applying 2D CNN channel pruning methods to PNNs undermines their performance because of the different representations of 2D images and 3D point clouds, as well as the disparity in network architectures.
no code implementations • CVPR 2023 • Yichen Zhu, Qiqi Zhou, Ning Liu, Zhiyuan Xu, Zhicai Ou, Xiaofeng Mou, Jian Tang
Unlike existing works that struggle to balance the trade-off between inference speed and SOD performance, in this paper, we propose a novel Scale-aware Knowledge Distillation (ScaleKD), which transfers knowledge of a complex teacher model to a compact student model.
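For reference, a generic knowledge-distillation loss (softened teacher logits plus ground-truth supervision) is sketched below; the temperature, weighting, and any scale-aware components specific to ScaleKD are omitted or assumed.

```python
# Generic knowledge-distillation loss: student mimics softened teacher logits.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.5):
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                    F.softmax(teacher_logits / T, dim=-1),
                    reduction="batchmean") * (T * T)
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1 - alpha) * hard

s = torch.randn(8, 10, requires_grad=True)       # student logits
t = torch.randn(8, 10)                           # teacher logits
y = torch.randint(0, 10, (8,))
print(kd_loss(s, t, y))
```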
no code implementations • 29 Dec 2022 • Ning Liu, Yue Yu, Huaiqian You, Neeraj Tatikola
Neural operators, which emerge as implicit solution operators of hidden governing equations, have recently become popular tools for learning responses of complex real-world physical systems.
1 code implementation • 27 Nov 2022 • Lei Wang, Jiabang He, Xing Xu, Ning Liu, Hui Liu
In this paper, we propose a new model architecture with alignment-enriched tuning (dubbed AETNet) upon pre-trained document image models, to adapt downstream tasks with the joint task-specific supervised and alignment-aware contrastive objective.
no code implementations • 29 Aug 2022 • Rui Ma, Ning Liu, Jingsong Yuan, Huafeng Yang, Jiandong Zhang
Traditional recommendation systems mainly focus on modeling user interests.
no code implementations • 26 Aug 2022 • Lichen Jia, Bowen Tang, Chenggang Wu, Zhe Wang, Zihan Jiang, Yuanming Lai, Yan Kang, Ning Liu, Jingfeng Zhang
The binary code similarity detection (BCSD) method measures the similarity of two binary executables.
1 code implementation • 12 Jul 2022 • Xiuxing Li, Zhenyu Li, Zhengyan Zhang, Ning Liu, Haitao Yuan, Wei Zhang, Zhiyuan Liu, Jianyong Wang
In this paper, we endeavor to solve the problem of few-shot entity linking, which only requires a minimal amount of in-domain labeled data and is more practical in real situations.
no code implementations • 21 Jun 2022 • Shaoyi Huang, Ning Liu, Yueying Liang, Hongwu Peng, Hongjia Li, Dongkuan Xu, Mimi Xie, Caiwen Ding
On MRPC, we obtain a 4.6-point higher score than the SOTA at the same overall pruning ratio of 0.5.
no code implementations • 9 Jun 2022 • Yu Fan, ZiCheng Zhang, Wei Sun, Xiongkuo Min, Wei Lu, Tao Wang, Ning Liu, Guangtao Zhai
Point cloud is one of the most widely used digital formats of 3D models, the visual quality of which is quite sensitive to distortions such as downsampling, noise, and compression.
no code implementations • ACL 2022 • Sanjeev Kumar Karn, Ning Liu, Hinrich Schuetze, Oladimeji Farri
A cascade of tasks is required to automatically generate an abstractive summary of the typical information-rich radiology report.
no code implementations • 4 Dec 2021 • Ziqiang Wang, Yimao Sun, Qun Wan, Lei Xie, Ning Liu
Emitter localization is widely applied in the military and civilian fields.
1 code implementation • NeurIPS 2021 • Geng Yuan, Xiaolong Ma, Wei Niu, Zhengang Li, Zhenglun Kong, Ning Liu, Yifan Gong, Zheng Zhan, Chaoyang He, Qing Jin, Siyue Wang, Minghai Qin, Bin Ren, Yanzhi Wang, Sijia Liu, Xue Lin
Systematic evaluations of accuracy, training speed, and memory footprint are conducted, where the proposed MEST framework consistently outperforms representative SOTA works.
no code implementations • 21 Oct 2021 • Wenzheng Hu, Zhengping Che, Ning Liu, Mingyang Li, Jian Tang, ChangShui Zhang, Jianqiang Wang
Deep convolutional neural networks are often overkill, with high parametric and computational redundancy in many application scenarios, and an increasing number of works have explored model pruning to obtain lightweight and efficient networks.
2 code implementations • NeurIPS 2021 • Zhuo Wang, Wei Zhang, Ning Liu, Jianyong Wang
Rule-based models, e.g., decision trees, are widely used in scenarios demanding high model interpretability for their transparent inner structures and good model expressivity.
2 code implementations • NeurIPS 2021 • Xiaolong Ma, Geng Yuan, Xuan Shen, Tianlong Chen, Xuxi Chen, Xiaohan Chen, Ning Liu, Minghai Qin, Sijia Liu, Zhangyang Wang, Yanzhi Wang
Based on our analysis, we summarize a guideline for parameter settings with regard to specific architecture characteristics, which we hope will catalyze research progress on the topic of the lottery ticket hypothesis.
no code implementations • 16 Jun 2021 • Geng Yuan, Zhiheng Liao, Xiaolong Ma, Yuxuan Cai, Zhenglun Kong, Xuan Shen, Jingyan Fu, Zhengang Li, Chengming Zhang, Hongwu Peng, Ning Liu, Ao Ren, Jinhui Wang, Yanzhi Wang
More importantly, our method does not require extra hardware cost compared to the traditional two-column mapping scheme.
1 code implementation • 19 Apr 2021 • Hongzuo Xu, Yijie Wang, Songlei Jian, Zhenyu Huang, Ning Liu, Yongjun Wang, Fei Li
We obtain an optimal attention-guided embedding space with expanded high-level information and rich semantics, and thus outlying behaviors of the queried outlier can be better unfolded.
no code implementations • 13 Apr 2021 • Ning Liu, Songlei Jian, Dongsheng Li, Yiming Zhang, Zhiquan Lai, Hongzuo Xu
Graph neural networks (GNNs) have been proven to be mature enough for handling graph-structured data on node-level graph representation learning tasks.
no code implementations • 22 Mar 2021 • Wei Zhang, Yunfeng Zhang, Ning Liu, Kai Ren, Pengfei Wang
This paper studies how to improve the generalization performance and learning speed of the navigation agents trained with deep reinforcement learning (DRL).
no code implementations • 19 Feb 2021 • Ning Liu, Geng Yuan, Zhengping Che, Xuan Shen, Xiaolong Ma, Qing Jin, Jian Ren, Jian Tang, Sijia Liu, Yanzhi Wang
In deep model compression, the recent finding "Lottery Ticket Hypothesis" (LTH) (Frankle & Carbin, 2018) pointed out that there could exist a winning ticket (i.e., a properly pruned sub-network together with the original weight initialization) that can achieve performance competitive with the original dense network.
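A minimal sketch of the winning-ticket recipe referenced here: train a dense model, prune by weight magnitude, and rewind the surviving weights to their original initialization before retraining (real LTH experiments typically prune iteratively; the layer and pruning ratio below are toy choices).

```python
# Lottery-ticket sketch: magnitude pruning followed by rewinding to the original init.
import copy
import torch
import torch.nn as nn

model = nn.Linear(20, 2)
init_state = copy.deepcopy(model.state_dict())   # save the original initialization

# (train the dense model here; omitted)

# Prune: keep the largest 20% of weights by magnitude.
w = model.weight.detach().abs()
threshold = w.flatten().kthvalue(int(0.8 * w.numel())).values
mask = (w > threshold).float()

# Rewind: restore the original initialization, then apply the mask.
model.load_state_dict(init_state)
with torch.no_grad():
    model.weight.mul_(mask)

print("remaining weights:", int(mask.sum().item()), "of", mask.numel())
# (retrain the sparse sub-network here, keeping masked weights at zero)
```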
no code implementations • 1 Jan 2021 • Zhuo Wang, Wei Zhang, Ning Liu, Jianyong Wang
Rule-based models, e.g., decision trees, are widely used in scenarios demanding high model interpretability for their transparent inner structures and good model expressivity.
no code implementations • ICCV 2021 • Ye Chen, Jinxian Liu, Bingbing Ni, Hang Wang, Jiancheng Yang, Ning Liu, Teng Li, Qi Tian
Then the destroyed shape and the normal shape are sent into a point cloud network to get representations, which are employed to segment points that belong to distorted parts and further reconstruct them to restore the shape to normal.
no code implementations • 4 Nov 2020 • Yushuo Guan, Ning Liu, Pengyu Zhao, Zhengping Che, Kaigui Bian, Yanzhi Wang, Jian Tang
The convolutional neural network has achieved great success in fulfilling computer vision tasks despite large computational overhead, which hinders efficient deployment.
no code implementations • 20 Oct 2020 • Wei Zhang, Ning Liu, Yunfeng Zhang
Deep reinforcement learning (DRL) demonstrates great potential in mapless navigation domain.
no code implementations • 25 Apr 2020 • Ning Liu, Zhanchun Tu
We investigate the momentum distribution and dynamical structure factor of a weakly interacting Bose gas subject to a time-dependent periodic modulation, within the Bogoliubov treatment.
Quantum Gases
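For reference, the standard (static) Bogoliubov treatment of a weakly interacting Bose gas yields the quasiparticle dispersion below, with free-particle energy $\epsilon_k$, coupling constant $g$, and condensate density $n$; the time-dependent modulation studied here goes beyond this static result:
$$E_k = \sqrt{\epsilon_k\,(\epsilon_k + 2 g n)}, \qquad \epsilon_k = \frac{\hbar^2 k^2}{2m}.$$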
1 code implementation • 10 Dec 2019 • Zhuo Wang, Wei Zhang, Ning Liu, Jianyong Wang
In this paper, we propose a new hierarchical rule-based model for classification tasks, named Concept Rule Sets (CRS), which has both a strong expressive ability and a transparent inner structure.
no code implementations • 4 Nov 2019 • Hongjia Li, Sheng Lin, Ning Liu, Caiwen Ding, Yanzhi Wang
Deep neural networks (DNNs) have been extended into medical fields and have triggered a revolution in some medical applications by extracting complex features and achieving high accuracy and performance.
no code implementations • 29 Sep 2019 • Caiwen Ding, Shuo Wang, Ning Liu, Kaidi Xu, Yanzhi Wang, Yun Liang
To achieve real-time, highly-efficient implementations on FPGA, we present the detailed hardware implementation of block circulant matrices on CONV layers and develop an efficient processing element (PE) structure supporting the heterogeneous weight quantization, CONV dataflow and pipelining techniques, design optimization, and a template-based automatic synthesis framework to optimally exploit hardware resource.
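As background on why block-circulant weights map well to FFT hardware, a circulant matrix-vector product reduces to element-wise multiplication in the Fourier domain; the sizes below are arbitrary toy values.

```python
# Circulant matrix-vector product via FFT, compared against the dense computation.
import numpy as np

def circulant_matvec(first_col: np.ndarray, x: np.ndarray) -> np.ndarray:
    """Compute C @ x where C is the circulant matrix with the given first column."""
    return np.real(np.fft.ifft(np.fft.fft(first_col) * np.fft.fft(x)))

n = 8
c = np.random.randn(n)                           # defines the whole n x n block
x = np.random.randn(n)

# Reference: build the dense circulant matrix explicitly and compare.
C = np.stack([np.roll(c, j) for j in range(n)], axis=1)
print(np.allclose(C @ x, circulant_matvec(c, x)))   # True
```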
2 code implementations • CVPR 2020 • Lifeng Huang, Chengying Gao, Yuyin Zhou, Cihang Xie, Alan Yuille, Changqing Zou, Ning Liu
In this paper, we study physical adversarial attacks on object detectors in the wild.
no code implementations • 9 Aug 2019 • Qi He, Qijun Zhao, Ning Liu, Peng Chen, Zhihe Zhang, Rong Hou
We are going to release our database and model in the public domain to promote the research on automatic animal identification and particularly on the technique for protecting red pandas.
no code implementations • 22 Jul 2019 • Ruizhe Cai, Ao Ren, Olivia Chen, Ning Liu, Caiwen Ding, Xuehai Qian, Jie Han, Wenhui Luo, Nobuyuki Yoshikawa, Yanzhi Wang
Further, prior work has investigated the application of SC to DNNs and illustrated its suitability, as SC is more compatible with approximate computation.
no code implementations • 6 Jul 2019 • Ning Liu, Xiaolong Ma, Zhiyuan Xu, Yanzhi Wang, Jian Tang, Jieping Ye
This work proposes AutoCompress, an automatic structured pruning framework with the following key performance improvements: (i) effectively incorporate the combination of structured pruning schemes in the automatic process; (ii) adopt the state-of-the-art ADMM-based structured weight pruning as the core algorithm, and propose an innovative additional purification step for further weight reduction without accuracy loss; and (iii) develop an effective heuristic search method enhanced by experience-based guided search, replacing the prior deep reinforcement learning technique, which has an underlying incompatibility with the target pruning problem.
no code implementations • 2019 IEEE 35th International Conference on Data Engineering (ICDE) 2019 • Ning Liu, Pan Lu, Wei Zhang, Jianyong Wang
To address the above issues, we propose novel Knowledge-aware Deep Dual Networks (K-DDN) for the text-based mortality prediction task.
1 code implementation • CVPR 2019 • Ning Liu, Yongchao Long, Changqing Zou, Qun Niu, Li Pan, Hefeng Wu
We propose an attention-injective deformable convolutional network called ADCrowdNet for crowd understanding that can address the accuracy degradation problem of highly congested noisy scenes.
Ranked #2 on Crowd Counting on TRANCOS
1 code implementation • 29 Jul 2018 • Tianyun Zhang, Shaokai Ye, Kaiqi Zhang, Xiaolong Ma, Ning Liu, Linfeng Zhang, Jian Tang, Kaisheng Ma, Xue Lin, Makan Fardad, Yanzhi Wang
Without loss of accuracy on the AlexNet model, we achieve 2.58X and 3.65X average measured speedup on two GPUs, clearly outperforming the prior work.
no code implementations • 28 Mar 2018 • Caiwen Ding, Ao Ren, Geng Yuan, Xiaolong Ma, Jiayu Li, Ning Liu, Bo Yuan, Yanzhi Wang
For FPGA implementations of deep convolutional neural networks (DCNNs), we achieve at least 152X and 72X improvements in performance and energy efficiency, respectively, using the SWM-based framework, compared with the IBM TrueNorth processor baseline under the same accuracy constraints on the MNIST, SVHN, and CIFAR-10 datasets.
no code implementations • 2 Feb 2018 • Ruizhe Cai, Ao Ren, Ning Liu, Caiwen Ding, Luhao Wang, Xuehai Qian, Massoud Pedram, Yanzhi Wang
In this paper, we propose VIBNN, an FPGA-based hardware accelerator design for variational inference on BNNs.
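A sketch of the reparameterization trick at the heart of variational inference for Bayesian neural networks, which such an accelerator must support: each weight is sampled as $w = \mu + \sigma\,\epsilon$ so the Gaussian posterior parameters remain differentiable (shapes below are toy values, not the paper's hardware design).

```python
# Reparameterized weight sampling for variational BNN inference.
import torch

mu = torch.zeros(4, 3, requires_grad=True)       # posterior means
rho = torch.zeros(4, 3, requires_grad=True)      # unconstrained scale parameters
sigma = torch.nn.functional.softplus(rho)        # ensure positive std-dev
eps = torch.randn_like(mu)                       # noise from N(0, I)
w = mu + sigma * eps                             # one Monte Carlo weight sample

x = torch.randn(5, 3)
out = x @ w.t()                                  # forward pass with sampled weights
out.sum().backward()                             # gradients flow back to mu and rho
print(mu.grad.shape, rho.grad.shape)
```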
no code implementations • 28 Jan 2018 • Ning Liu, Ying Liu, Brent Logan, Zhiyuan Xu, Jian Tang, Yanzhi Wang
This paper presents the first deep reinforcement learning (DRL) framework to estimate the optimal Dynamic Treatment Regimes from observational medical data.
no code implementations • 13 Dec 2017 • Sheng Lin, Ning Liu, Mahdi Nazemi, Hongjia Li, Caiwen Ding, Yanzhi Wang, Massoud Pedram
The large model size of DNNs, while providing excellent accuracy, also burdens the embedded platforms with intensive computation and storage.
no code implementations • 29 Aug 2017 • Caiwen Ding, Siyu Liao, Yanzhi Wang, Zhe Li, Ning Liu, Youwei Zhuo, Chao Wang, Xuehai Qian, Yu Bai, Geng Yuan, Xiaolong Ma, Yi-Peng Zhang, Jian Tang, Qinru Qiu, Xue Lin, Bo Yuan
As the size of DNNs continues to grow, it is critical to improve the energy efficiency and performance while maintaining accuracy.
no code implementations • 13 Mar 2017 • Ning Liu, Zhe Li, Zhiyuan Xu, Jielong Xu, Sheng Lin, Qinru Qiu, Jian Tang, Yanzhi Wang
Automatic decision-making approaches, such as reinforcement learning (RL), have been applied to (partially) solve the resource allocation problem adaptively in the cloud computing system.