Search Results for author: Linfeng Zhang

Found 47 papers, 24 papers with code

Revisiting Data Augmentation in Model Compression: An Empirical and Comprehensive Study

no code implementations22 May 2023 Muzhou Yu, Linfeng Zhang, Kaisheng Ma

In this paper, we revisit the use of data augmentation in model compression and present a comprehensive study of the relation between model size and the optimal data augmentation policy.

Data Augmentation Knowledge Distillation +2

CORSD: Class-Oriented Relational Self Distillation

no code implementations28 Apr 2023 Muzhou Yu, Sia Huat Tan, Kailu Wu, Runpei Dong, Linfeng Zhang, Kaisheng Ma

Knowledge distillation is an effective model compression method, but it has some limitations: (1) feature-based distillation methods focus only on distilling the feature map and fail to transfer the relations among data examples; (2) relational distillation methods are either limited to handcrafted relation-extraction functions, such as the L2 norm, or weak at modeling inter- and intra-class relations.

Knowledge Distillation Model Compression +1

Uni-QSAR: an Auto-ML Tool for Molecular Property Prediction

no code implementations24 Apr 2023 Zhifeng Gao, Xiaohong Ji, Guojiang Zhao, Hongshuai Wang, Hang Zheng, Guolin Ke, Linfeng Zhang

Recently, deep learning based quantitative structure-activity relationship (QSAR) models have shown performance surpassing traditional methods on property prediction tasks in drug discovery.

Drug Discovery Model Selection +2

Highly Accurate Quantum Chemical Property Prediction with Uni-Mol+

1 code implementation16 Mar 2023 Shuqi Lu, Zhifeng Gao, Di He, Linfeng Zhang, Guolin Ke

We observed that the quality of the optimized conformation is crucial for QC property prediction performance.

Benchmarking Graph Regression

Autoencoders as Cross-Modal Teachers: Can Pretrained 2D Image Transformers Help 3D Representation Learning?

2 code implementations16 Dec 2022 Runpei Dong, Zekun Qi, Linfeng Zhang, Junbo Zhang, Jianjian Sun, Zheng Ge, Li Yi, Kaisheng Ma

The success of deep learning heavily relies on large-scale data with comprehensive labels, which is more expensive and time-consuming to fetch in 3D compared to 2D images or natural languages.

Few-Shot 3D Point Cloud Classification Knowledge Distillation +1

A deep variational free energy approach to dense hydrogen

1 code implementation13 Sep 2022 Hao Xie, Zi-Hang Li, Han Wang, Linfeng Zhang, Lei Wang

We present a deep generative model-based variational free energy approach to the equations of state of dense hydrogen.

Uni-Mol: A Universal 3D Molecular Representation Learning Framework

1 code implementation ChemRxiv 2022 Gengmo Zhou, Zhifeng Gao, Qiankun Ding, Hang Zheng, Hongteng Xu, Zhewei Wei, Linfeng Zhang, Guolin Ke

Uni-Mol is composed of two models with the same SE(3)-equivariant transformer architecture: a molecular pretraining model trained on 209M molecular conformations, and a pocket pretraining model trained on 3M candidate protein pocket data.

 Ranked #1 on Molecular Property Prediction on QM9 (using extra training data)

3D Geometry Prediction Pose Prediction +1

DPA-1: Pretraining of Attention-based Deep Potential Model for Molecular Simulation

1 code implementation17 Aug 2022 Duo Zhang, Hangrui Bi, Fu-Zhi Dai, Wanrun Jiang, Linfeng Zhang, Han Wang

Machine learning assisted modeling of the inter-atomic potential energy surface (PES) is revolutionizing the field of molecular simulation.

Contrastive Deep Supervision

1 code implementation12 Jul 2022 Linfeng Zhang, Xin Chen, Junbo Zhang, Runpei Dong, Kaisheng Ma

The success of deep learning is usually accompanied by the growth in neural network depth.

Contrastive Learning Fine-Grained Image Classification +3

DeePKS+ABACUS as a Bridge between Expensive Quantum Mechanical Models and Machine Learning Potentials

no code implementations21 Jun 2022 Wenfei Li, Qi Ou, Yixiao Chen, Yu Cao, Renxi Liu, Chunyi Zhang, Daye Zheng, Chun Cai, Xifan Wu, Han Wang, Mohan Chen, Linfeng Zhang

However, for high-level QM methods, such as density functional theory (DFT) at the meta-GGA level and/or with exact exchange, quantum Monte Carlo, etc., generating a sufficient amount of data for training a ML potential has remained computationally challenging due to their high cost.

Efficient Neural Network

PointDistiller: Structured Knowledge Distillation Towards Efficient and Compact 3D Detection

1 code implementation CVPR 2023 Linfeng Zhang, Runpei Dong, Hung-Shuo Tai, Kaisheng Ma

The remarkable breakthroughs in point cloud representation learning have boosted their usage in real-world applications such as self-driving cars and virtual reality.

3D Object Detection Knowledge Distillation +4

Cyber Risk Assessment for Capital Management

no code implementations17 May 2022 Wing Fung Chong, Runhuan Feng, Hins Hu, Linfeng Zhang

Cyber risk is an omnipresent risk in the increasingly digitized world that is known to be difficult to quantify and assess.

Management

Wavelet Knowledge Distillation: Towards Efficient Image-to-Image Translation

no code implementations CVPR 2022 Linfeng Zhang, Xin Chen, Xiaobing Tu, Pengfei Wan, Ning Xu, Kaisheng Ma

Instead of directly distilling the generated images of teachers, wavelet knowledge distillation first decomposes the images into different frequency bands with discrete wavelet transformation and then only distills the high frequency bands.

Image-to-Image Translation Knowledge Distillation +1
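The snippet above describes an algorithm concretely enough to sketch. Below is a minimal numpy illustration of the idea, not the authors' implementation: a one-level Haar decomposition splits an image into a low-frequency band and three high-frequency bands, and the distillation loss compares only the high-frequency bands. The function names and the L1 choice are illustrative assumptions.

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2D Haar transform: returns (LL, (LH, HL, HH)) bands."""
    # Horizontal (row-direction) averaging and differencing.
    lo = (img[:, 0::2] + img[:, 1::2]) / 2.0
    hi = (img[:, 0::2] - img[:, 1::2]) / 2.0
    # Vertical (column-direction) averaging and differencing.
    ll = (lo[0::2, :] + lo[1::2, :]) / 2.0
    lh = (lo[0::2, :] - lo[1::2, :]) / 2.0
    hl = (hi[0::2, :] + hi[1::2, :]) / 2.0
    hh = (hi[0::2, :] - hi[1::2, :]) / 2.0
    return ll, (lh, hl, hh)

def high_freq_distill_loss(student_img, teacher_img):
    """L1 loss over the high-frequency bands only; the low-frequency
    (LL) band is deliberately ignored, following the paper's core idea."""
    _, s_high = haar_dwt2(student_img)
    _, t_high = haar_dwt2(teacher_img)
    return sum(np.abs(s - t).mean() for s, t in zip(s_high, t_high))
```

A consequence of the design: a constant brightness shift between student and teacher lives entirely in the LL band, so it contributes nothing to this loss.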

$m^\ast$ of two-dimensional electron gas: a neural canonical transformation study

1 code implementation10 Jan 2022 Hao Xie, Linfeng Zhang, Lei Wang

The quasiparticle effective mass $m^\ast$ of interacting electrons is a fundamental quantity in the Fermi liquid theory.

Finding the Task-Optimal Low-Bit Sub-Distribution in Deep Neural Networks

1 code implementation30 Dec 2021 Runpei Dong, Zhanhong Tan, Mengdi Wu, Linfeng Zhang, Kaisheng Ma

Besides, an efficient deployment flow for the mobile CPU is developed, achieving up to 7.46$\times$ inference acceleration on an octa-core ARM CPU.

Image Classification Model Compression +3

Not All Regions are Worthy to be Distilled: Region-aware Knowledge Distillation Towards Efficient Image-to-Image Translation

no code implementations29 Sep 2021 Linfeng Zhang, Kaisheng Ma

To tackle this challenge, in this paper, we propose Region-aware Knowledge Distillation which first localizes the crucial regions in the images with attention mechanism.

Contrastive Learning Image-to-Image Translation +2

Ab-initio study of interacting fermions at finite temperature with neural canonical transformation

1 code implementation18 May 2021 Hao Xie, Linfeng Zhang, Lei Wang

The variational density matrix is parametrized by a permutation equivariant many-body unitary transformation together with a discrete probabilistic model.

The Phase Diagram of a Deep Potential Water Model

no code implementations9 Feb 2021 Linfeng Zhang, Han Wang, Roberto Car, Weinan E

Using the Deep Potential methodology, we construct a model that reproduces accurately the potential energy surface of the SCAN approximation of density functional theory for water, from low temperature and pressure to about 2400 K and 50 GPa, excluding the vapor stability region.

Chemical Physics

Improve Object Detection with Feature-based Knowledge Distillation: Towards Accurate and Efficient Detectors

1 code implementation ICLR 2021 Linfeng Zhang, Kaisheng Ma

In this paper, we suggest that the failure of knowledge distillation on object detection is mainly caused by two reasons: (1) the imbalance between pixels of foreground and background and (2) lack of distillation on the relation between different pixels.

Image Classification Knowledge Distillation +3
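The first failure cause named in the snippet, the pixel imbalance between foreground and background, suggests a weighted feature-distillation loss. The following is a hedged numpy sketch of that general idea; the mask construction, the `fg_weight` value, and the function names are assumptions for illustration, not the paper's exact attention mechanism.

```python
import numpy as np

def masked_feature_distill(student_feat, teacher_feat, fg_mask, fg_weight=10.0):
    """Per-pixel L2 feature distillation in which foreground pixels
    (fg_mask == 1) are up-weighted to counter the foreground/background
    pixel imbalance. Shapes: (C, H, W) features, (H, W) binary mask."""
    weights = np.where(fg_mask == 1, fg_weight, 1.0)           # (H, W)
    sq_err = ((student_feat - teacher_feat) ** 2).sum(axis=0)  # (H, W)
    return (weights * sq_err).sum() / weights.sum()
```

With `fg_weight > 1`, an identical feature error costs more on a foreground pixel than on a background one, so the student is pushed hardest where objects actually are.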

Isotope effects on molecular structures and electronic properties of liquid water via deep potential molecular dynamics based on SCAN functional

no code implementations9 Dec 2020 Jianhang Xu, Chunyi Zhang, Linfeng Zhang, Mohan Chen, Biswajit Santra, Xifan Wu

Feynman path-integral deep potential molecular dynamics (PI-DPMD) calculations have been employed to study both light (H$_2$O) and heavy water (D$_2$O) within the isothermal-isobaric ensemble.

Chemical Physics Computational Physics

Pandemic risk management: resources contingency planning and allocation

no code implementations6 Dec 2020 Xiaowei Chen, Wing Fung Chong, Runhuan Feng, Linfeng Zhang

Repeated history of pandemics, such as SARS, H1N1, Ebola, Zika, and COVID-19, has shown that pandemic risk is inevitable.

Management

Task-Oriented Feature Distillation

1 code implementation NeurIPS 2020 Linfeng Zhang, Yukang Shi, Zuoqiang Shi, Kaisheng Ma, Chenglong Bao

Moreover, an orthogonal loss is applied to the feature resizing layer in TOFD to improve the performance of knowledge distillation.

3D Classification General Classification +2
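The orthogonal loss on the feature resizing layer mentioned above can be sketched as a standard soft-orthogonality penalty. This is a minimal numpy illustration of such a penalty, assumed here to be of the common $\|WW^\top - I\|_F^2$ form; the exact formulation used in TOFD may differ.

```python
import numpy as np

def orthogonal_loss(W):
    """Frobenius-norm penalty pushing the rows of W towards an
    orthonormal set: || W W^T - I ||_F^2. W has shape (out, in)."""
    out_dim = W.shape[0]
    gram = W @ W.T                      # (out, out) row Gram matrix
    return float(((gram - np.eye(out_dim)) ** 2).sum())
```

The loss vanishes exactly when the rows of `W` are orthonormal, which keeps the resizing layer close to an isometry so it does not distort the distilled features.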

DeePKS: a comprehensive data-driven approach towards chemically accurate density functional theory

no code implementations1 Aug 2020 Yixiao Chen, Linfeng Zhang, Han Wang, E Weinan

We propose a general machine learning-based framework for building an accurate and widely-applicable energy functional within the framework of generalized Kohn-Sham density functional theory.

BIG-bench Machine Learning

Deep Potential generation scheme and simulation protocol for the Li10GeP2S12-type superionic conductors

no code implementations5 Jun 2020 Jianxing Huang, Linfeng Zhang, Han Wang, Jinbao Zhao, Jun Cheng, Weinan E

It has been a challenge to accurately simulate Li-ion diffusion processes in battery materials at room temperature using ab initio molecular dynamics (AIMD) due to its high computational cost.

Computational Physics Materials Science Chemical Physics

Integrating Machine Learning with Physics-Based Modeling

no code implementations4 Jun 2020 Weinan E, Jiequn Han, Linfeng Zhang

Machine learning is poised as a very powerful tool that can drastically improve our ability to carry out scientific research.

BIG-bench Machine Learning

Auxiliary Training: Towards Accurate and Robust Models

no code implementations CVPR 2020 Linfeng Zhang, Muzhou Yu, Tong Chen, Zuoqiang Shi, Chenglong Bao, Kaisheng Ma

In the training stage, a novel distillation method named input-aware self distillation is proposed to facilitate the primary classifier to learn the robust information from auxiliary classifiers.

Pushing the limit of molecular dynamics with ab initio accuracy to 100 million atoms with machine learning

1 code implementation1 May 2020 Weile Jia, Han Wang, Mohan Chen, Denghui Lu, Lin Lin, Roberto Car, Weinan E, Linfeng Zhang

For 35 years, ab initio molecular dynamics (AIMD) has been the method of choice for modeling complex atomistic phenomena from first principles.

Computational Physics

Deep Density: circumventing the Kohn-Sham equations via symmetry preserving neural networks

no code implementations27 Nov 2019 Leonardo Zepeda-Núñez, Yixiao Chen, Jiefu Zhang, Weile Jia, Linfeng Zhang, Lin Lin

By directly targeting the self-consistent electron density, we demonstrate that the adapted network architecture, called the Deep Density, can effectively represent the electron density as the linear combination of contributions from many local clusters.

Translation

DP-GEN: A concurrent learning platform for the generation of reliable deep learning based potential energy models

1 code implementation28 Oct 2019 Yuzhi Zhang, Haidi Wang, WeiJie Chen, Jinzhe Zeng, Linfeng Zhang, Han Wang, Weinan E

Materials 3, 023804] and is capable of generating uniformly accurate deep learning based PES models in a way that minimizes human intervention and the computational cost for data generation and model training.

Computational Physics

Neural Canonical Transformation with Symplectic Flows

1 code implementation30 Sep 2019 Shuo-Hui Li, Chen-Xiao Dong, Linfeng Zhang, Lei Wang

We construct flexible and powerful canonical transformations as generative models using symplectic neural networks.

Density Estimation

Non-Structured DNN Weight Pruning -- Is It Beneficial in Any Platform?

no code implementations3 Jul 2019 Xiaolong Ma, Sheng Lin, Shaokai Ye, Zhezhi He, Linfeng Zhang, Geng Yuan, Sia Huat Tan, Zhengang Li, Deliang Fan, Xuehai Qian, Xue Lin, Kaisheng Ma, Yanzhi Wang

Based on the proposed comparison framework, with the same accuracy and quantization, the results show that non-structured pruning is not competitive in terms of either storage or computation efficiency.

Model Compression Quantization

Deep neural network for Wannier function centers

no code implementations27 Jun 2019 Linfeng Zhang, Mohan Chen, Xifan Wu, Han Wang, Weinan E, Roberto Car

We introduce a deep neural network (DNN) model that assigns the position of the centers of the electronic charge in each atomic configuration on a molecular dynamics trajectory.

Computational Physics Materials Science Chemical Physics

Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation

1 code implementation ICCV 2019 Linfeng Zhang, Jiebo Song, Anni Gao, Jingwei Chen, Chenglong Bao, Kaisheng Ma

Different from traditional knowledge distillation - a knowledge transformation methodology among networks, which forces student neural networks to approximate the softmax layer outputs of pre-trained teacher neural networks, the proposed self distillation framework distills knowledge within network itself.

Knowledge Distillation
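The contrast the snippet draws, distilling within a network rather than from a pre-trained teacher, can be illustrated with the usual temperature-softened KL objective, where the deepest classifier plays the teacher for a shallower exit. A hedged numpy sketch under those assumptions (temperature value and function names are illustrative):

```python
import numpy as np

def softmax(z, T=1.0):
    """Numerically stable softmax at temperature T."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def self_distill_loss(shallow_logits, deep_logits, T=3.0):
    """KL(deep || shallow) at temperature T: the deepest classifier of
    the network acts as the 'teacher' for one of its own shallow exits,
    so no separate pre-trained teacher network is needed."""
    p_t = softmax(deep_logits, T)     # teacher: deepest classifier
    p_s = softmax(shallow_logits, T)  # student: shallow classifier
    return float((p_t * (np.log(p_t) - np.log(p_s))).sum())
```

The loss is zero when a shallow exit already matches the deepest classifier's softened predictions, and positive otherwise.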

Monge-Ampère Flow for Generative Modeling

no code implementations ICLR 2019 Linfeng Zhang, Weinan E, Lei Wang

We present a deep generative model, named Monge-Ampère flow, which builds on continuous-time gradient flow arising from the Monge-Ampère equation in optimal transport theory.

Density Estimation

Active Learning of Uniformly Accurate Inter-atomic Potentials for Materials Simulation

no code implementations28 Oct 2018 Linfeng Zhang, De-Ye Lin, Han Wang, Roberto Car, Weinan E

An active learning procedure called Deep Potential Generator (DP-GEN) is proposed for the construction of accurate and transferable machine learning-based models of the potential energy surface (PES) for the molecular modeling of materials.

Active Learning BIG-bench Machine Learning

Monge-Ampère Flow for Generative Modeling

1 code implementation26 Sep 2018 Linfeng Zhang, Weinan E, Lei Wang

We present a deep generative model, named Monge-Ampère flow, which builds on continuous-time gradient flow arising from the Monge-Ampère equation in optimal transport theory.

Density Estimation
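The continuous-time gradient flow mentioned in the snippet can be summarized by the standard change-of-variables identity specialized to a gradient field; this is a hedged sketch of the general construction, not a reproduction of the paper's exact equations.

```latex
% Gradient flow generated by a learned scalar potential \varphi:
\frac{\mathrm{d}x}{\mathrm{d}t} = \nabla \varphi(x),
\qquad
\frac{\mathrm{d}\ln p(x,t)}{\mathrm{d}t} = -\nabla^{2} \varphi(x).
% Integrating the second ODE along a sample trajectory yields the
% log-likelihood, enabling both density estimation and sampling.
```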

StructADMM: A Systematic, High-Efficiency Framework of Structured Weight Pruning for DNNs

1 code implementation29 Jul 2018 Tianyun Zhang, Shaokai Ye, Kaiqi Zhang, Xiaolong Ma, Ning Liu, Linfeng Zhang, Jian Tang, Kaisheng Ma, Xue Lin, Makan Fardad, Yanzhi Wang

Without loss of accuracy on the AlexNet model, we achieve 2.58X and 3.65X average measured speedup on two GPUs, clearly outperforming the prior work.

Model Compression
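The structured-sparsity constraint at the heart of the method above can be illustrated by its simplest building block: zeroing whole filters by magnitude. In the ADMM formulation, the projection step onto the structured-sparsity set is essentially this kind of magnitude-based zeroing; the sketch below is a hedged numpy illustration (function name, norm choice, and keep ratio are assumptions), not the paper's full alternating optimization.

```python
import numpy as np

def prune_filters(W, keep_ratio=0.5):
    """Structured (filter-wise) pruning sketch: zero out entire output
    filters of a conv weight tensor (out, in, kH, kW), keeping only the
    filters with the largest L2 norms. Whole filters go to zero, so the
    result maps onto dense hardware-friendly computation."""
    out_ch = W.shape[0]
    norms = np.sqrt((W.reshape(out_ch, -1) ** 2).sum(axis=1))
    n_keep = max(1, int(round(out_ch * keep_ratio)))
    keep = np.argsort(norms)[-n_keep:]          # indices of strongest filters
    mask = np.zeros(out_ch, dtype=bool)
    mask[keep] = True
    W_pruned = W.copy()
    W_pruned[~mask] = 0.0
    return W_pruned, mask
```

Because the zeros form whole filters rather than scattered weights, the pruned layer can simply be made narrower, which is the storage/computation advantage structured pruning has over non-structured pruning.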

Solving Many-Electron Schrödinger Equation Using Deep Neural Networks

no code implementations18 Jul 2018 Jiequn Han, Linfeng Zhang, Weinan E

We introduce a new family of trial wave-functions based on deep neural networks to solve the many-electron Schrödinger equation.

Computational Physics Chemical Physics

End-to-end Symmetry Preserving Inter-atomic Potential Energy Model for Finite and Extended Systems

1 code implementation NeurIPS 2018 Linfeng Zhang, Jiequn Han, Han Wang, Wissam A. Saidi, Roberto Car, Weinan E

Machine learning models are changing the paradigm of molecular modeling, which is a fundamental tool for material science, chemistry, and computational biology.

Computational Physics Materials Science Chemical Physics

DeePMD-kit: A deep learning package for many-body potential energy representation and molecular dynamics

2 code implementations11 Dec 2017 Han Wang, Linfeng Zhang, Jiequn Han, Weinan E

Here we describe DeePMD-kit, a package written in Python/C++ that has been designed to minimize the effort required to build deep learning based representations of potential energy and force fields and to perform molecular dynamics.

Reinforced dynamics for enhanced sampling in large atomic and molecular systems

no code implementations10 Dec 2017 Linfeng Zhang, Han Wang, Weinan E

Like metadynamics, it allows for an efficient exploration of the configuration space by adding an adaptively computed biasing potential to the original dynamics.

Efficient Exploration reinforcement-learning +1

Deep Potential Molecular Dynamics: a scalable model with the accuracy of quantum mechanics

3 code implementations30 Jul 2017 Linfeng Zhang, Jiequn Han, Han Wang, Roberto Car, Weinan E

We introduce a scheme for molecular simulations, the Deep Potential Molecular Dynamics (DeePMD) method, based on a many-body potential and interatomic forces generated by a carefully crafted deep neural network trained with ab initio data.

Deep Potential: a general representation of a many-body potential energy surface

1 code implementation5 Jul 2017 Jiequn Han, Linfeng Zhang, Roberto Car, Weinan E

When tested on a wide variety of examples, Deep Potential is able to reproduce the original model, whether empirical or quantum mechanics based, within chemical accuracy.

Computational Physics
