Search Results for author: Di Liu

Found 18 papers, 7 papers with code

You Only Search Once: On Lightweight Differentiable Architecture Search for Resource-Constrained Embedded Platforms

1 code implementation • 30 Aug 2022 • Xiangzhong Luo, Di Liu, Hao Kong, Shuo Huai, Hui Chen, Weichen Liu

Benefiting from its search efficiency, differentiable neural architecture search (NAS) has emerged as the dominant approach for automatically designing competitive deep neural networks (DNNs).

Autonomous Vehicles Neural Architecture Search

DeepRecon: Joint 2D Cardiac Segmentation and 3D Volume Reconstruction via A Structure-Specific Generative Method

no code implementations • 14 Jun 2022 • Qi Chang, Zhennan Yan, Mu Zhou, Di Liu, Khalid Sawalha, Meng Ye, Qilong Zhangli, Mikael Kanski, Subhi Al Aref, Leon Axel, Dimitris Metaxas

Joint 2D cardiac segmentation and 3D volume reconstruction are fundamental to building statistical cardiac anatomy models and understanding functional mechanisms from motion patterns.

3D Reconstruction 3D Shape Reconstruction +4

Multi-scale frequency separation network for image deblurring

no code implementations • 1 Jun 2022 • Yanni Zhang, Qiang Li, Miao Qi, Di Liu, Jun Kong, Jianzhong Wang

MSFS-Net introduces a frequency separation module (FSM) into an encoder-decoder network architecture to capture the low- and high-frequency information of an image at multiple scales.

Contrastive Learning Deblurring +1
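The low/high-frequency split described for MSFS-Net can be sketched as a blur-and-subtract decomposition. This is a minimal single-scale illustration; the box-blur kernel and the `frequency_separation` helper are assumptions for illustration, not the paper's exact FSM design.

```python
import numpy as np

def frequency_separation(image, kernel_size=5):
    """Split a grayscale image into low- and high-frequency components.

    A box blur approximates the low-frequency content; the residual
    (image - low) carries edges and texture, i.e. the high frequencies.
    """
    pad = kernel_size // 2
    padded = np.pad(image.astype(float), pad, mode="edge")
    low = np.empty_like(image, dtype=float)
    h, w = image.shape
    for i in range(h):
        for j in range(w):
            low[i, j] = padded[i:i + kernel_size, j:j + kernel_size].mean()
    high = image.astype(float) - low
    return low, high

img = np.arange(64, dtype=float).reshape(8, 8)
low, high = frequency_separation(img)
# By construction, summing the two bands recovers the input exactly.
assert np.allclose(low + high, img)
```

MSFS-Net applies such a separation at multiple scales inside an encoder-decoder; only one scale is shown here.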

Causality Inspired Representation Learning for Domain Generalization

1 code implementation • CVPR 2022 • Fangrui Lv, Jian Liang, Shuang Li, Bin Zang, Chi Harold Liu, Ziteng Wang, Di Liu

Specifically, we assume that each input is constructed from a mix of causal factors (whose relationship with the label is invariant across domains) and non-causal factors (category-independent), and only the former cause the classification judgments.

Domain Generalization Representation Learning

A Data-scalable Transformer for Medical Image Segmentation: Architecture, Model Efficiency, and Benchmark

2 code implementations • 28 Feb 2022 • Yunhe Gao, Mu Zhou, Di Liu, Zhennan Yan, Shaoting Zhang, Dimitris N. Metaxas

However, existing vision Transformers struggle to learn with limited medical data and are unable to generalize on diverse medical image tasks.

Image Segmentation Inductive Bias +2

Improving Robustness of Convolutional Neural Networks Using Element-Wise Activation Scaling

1 code implementation • 24 Feb 2022 • Zhi-Yuan Zhang, Di Liu

Recent works reveal that re-calibrating the intermediate activations of adversarial examples can improve the adversarial robustness of a CNN model.

Adversarial Robustness
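The re-calibration idea above can be sketched as multiplying an intermediate feature map element-wise by a scaling tensor of the same shape. The `scale_activation` helper and the concrete scale values are illustrative assumptions; how the paper learns the scaling factors is not shown.

```python
import numpy as np

def scale_activation(activation, scale):
    """Element-wise re-calibration of an intermediate activation.

    `scale` has the same shape as `activation`; entries below 1 suppress
    a channel's response, entries above 1 amplify it.
    """
    assert activation.shape == scale.shape
    return activation * scale

act = np.array([[1.0, -2.0],
                [3.0,  0.5]])
scale = np.array([[0.5, 1.0],
                  [2.0, 1.0]])
out = scale_activation(act, scale)
```

In a real model the scaling would be applied to a chosen intermediate layer during the forward pass, with the scale tensor trained to counteract adversarial perturbations.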

Pareto Domain Adaptation

1 code implementation • NeurIPS 2021 • Fangrui Lv, Jian Liang, Kaixiong Gong, Shuang Li, Chi Harold Liu, Han Li, Di Liu, Guoren Wang

Domain adaptation (DA) attempts to transfer knowledge from a labeled source domain to an unlabeled target domain that follows a different distribution from the source.

Domain Adaptation Image Classification +1

Incentive Compatible Pareto Alignment for Multi-Source Large Graphs

1 code implementation • 6 Dec 2021 • Jian Liang, Fangrui Lv, Di Liu, Zehui Dai, Xu Tian, Shuang Li, Fei Wang, Han Li

Challenges of the problem include 1) how to align large-scale entities between sources to share information and 2) how to mitigate negative transfer from jointly learning multi-source data.

HSCoNAS: Hardware-Software Co-Design of Efficient DNNs via Neural Architecture Search

no code implementations • 11 Mar 2021 • Xiangzhong Luo, Di Liu, Shuo Huai, Weichen Liu

In this paper, we present a novel multi-objective hardware-aware neural architecture search (NAS) framework, namely HSCoNAS, to automate the design of deep neural networks (DNNs) with high accuracy and low latency on target hardware.

Neural Architecture Search

A zero density estimate and fractional imaginary parts of zeros for $\mathrm{GL}_2$ $L$-functions

no code implementations • 2 Mar 2021 • Olivia Beckwith, Di Liu, Jesse Thorner, Alexandru Zaharescu

We prove an analogue of Selberg's zero density estimate for $\zeta(s)$ that holds for any $\mathrm{GL}_2$ $L$-function.

Number Theory

Forbidden Dark Matter Annihilations into Standard Model Particles

no code implementations • 22 Dec 2020 • Raffaele Tito D'Agnolo, Di Liu, Joshua T. Ruderman, Po-Jen Wang

We present kinematically forbidden dark matter annihilations into Standard Model leptons.

High Energy Physics - Phenomenology

Bringing AI To Edge: From Deep Learning's Perspective

no code implementations • 25 Nov 2020 • Di Liu, Hao Kong, Xiangzhong Luo, Weichen Liu, Ravi Subramaniam

To bridge the gap, a plethora of deep learning techniques and optimization methods have been proposed in the past years: lightweight deep learning models, network compression, and efficient neural architecture search.

Edge-computing Model Compression +1

Identifying Transition States of Chemical Kinetic Systems using Network Embedding Techniques

no code implementations • 29 Oct 2020 • Paula Mercurio, Di Liu

Using random walk sampling methods for feature learning on networks, we develop a method for generating low-dimensional node embeddings for directed graphs and identifying transition states of stochastic chemical reacting systems.

Dimensionality Reduction Network Embedding
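The random-walk sampling step described above can be sketched for a directed graph as follows. The `random_walks` helper and the toy edge list are illustrative assumptions; in a DeepWalk/node2vec-style pipeline the resulting walk corpus would be fed to a skip-gram model to produce the low-dimensional node embeddings.

```python
import random
from collections import defaultdict

def random_walks(edges, walk_length=5, walks_per_node=2, seed=0):
    """Sample truncated random walks on a directed graph.

    Each walk follows outgoing edges only and stops early at a node
    with no successors (e.g. an absorbing state of the kinetic system).
    """
    rng = random.Random(seed)
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
    nodes = sorted({n for e in edges for n in e})
    walks = []
    for node in nodes:
        for _ in range(walks_per_node):
            walk = [node]
            while len(walk) < walk_length and adj[walk[-1]]:
                walk.append(rng.choice(adj[walk[-1]]))
            walks.append(walk)
    return walks

edges = [("A", "B"), ("B", "C"), ("C", "A"), ("C", "D")]
walks = random_walks(edges)
# Every consecutive pair in every walk is a directed edge of the graph.
assert all((a, b) in set(edges) for w in walks for a, b in zip(w, w[1:]))
```

Node "D" here has no outgoing edges, so walks starting there terminate immediately, mimicking an absorbing transition state.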

The Function Transformation Omics - Funomics

no code implementations • 17 Aug 2018 • Yongshuai Jiang, Jing Xu, Simeng Hu, Di Liu, Linna Zhao, Xu Zhou

Function transformation, such as f(x, y) or f(x, y, z), transforms two, three, or more input/observation variables (in biology, typically the observed/measured values of biomarkers, biological characteristics, or other indicators) into a new output variable (a new characteristic or indicator).
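A function transformation of the form f(x, y) can be sketched with a familiar example: body mass index combines two observed variables (weight and height) into a single new indicator. BMI is our illustrative choice here, not an example taken from the paper.

```python
def bmi(weight_kg, height_m):
    """f(x, y) = x / y**2: transforms two observed variables
    (weight in kg, height in m) into one derived indicator."""
    return weight_kg / height_m ** 2

value = bmi(70.0, 1.75)  # a single new "characteristic" from two measurements
```

The same pattern extends to f(x, y, z) and beyond: any function of several measured biomarkers defines a new composite indicator.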
