no code implementations • COLING 2022 • Xiaofeng Qi, Chao Li, Zhongping Liang, Jigang Liu, Cheng Zhang, Yuanxin Wei, Lin Yuan, Guang Yang, Lanxiao Huang, Min Li
This paper introduces a generative system for in-battle real-time commentary in mobile MOBA games.
no code implementations • 27 Feb 2023 • Sadaf Khan, Zhengyuan Shi, Min Li, Qiang Xu
Circuit representation learning is a promising research direction in the electronic design automation (EDA) field.
1 code implementation • 20 Feb 2023 • Chang Chen, Min Li, Zhihua Wu, Dianhai Yu, Chao Yang
In this paper, we propose TA-MoE, a topology-aware routing strategy for large-scale MoE training designed from a model-system co-design perspective, which can dynamically adjust the MoE dispatch pattern according to the network topology.
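A minimal NumPy sketch of the topology-aware dispatch idea only, not the TA-MoE implementation itself; the communication-cost table, the `alpha` knob, and the top-1 routing are illustrative assumptions.

```python
import numpy as np

def topology_aware_dispatch(gate_logits, expert_rank, comm_cost, src_rank, alpha=1.0):
    """Bias MoE gating toward topologically close experts (illustrative sketch).

    gate_logits : (tokens, num_experts) raw router scores
    expert_rank : (num_experts,) rank hosting each expert
    comm_cost   : (num_ranks, num_ranks) assumed relative communication cost
    src_rank    : rank holding the tokens
    alpha       : strength of the topology penalty (hypothetical knob)
    """
    penalty = comm_cost[src_rank, expert_rank]            # cost to reach each expert
    biased = gate_logits - alpha * penalty                # cheaper experts score higher
    probs = np.exp(biased - biased.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    return probs.argmax(axis=1), probs                    # top-1 dispatch and soft weights

# toy example: 2 nodes x 2 ranks each, intra-node traffic cheaper than inter-node
cost = np.array([[0, 1, 4, 4],
                 [1, 0, 4, 4],
                 [4, 4, 0, 1],
                 [4, 4, 1, 0]], dtype=float)
logits = np.random.randn(8, 4)
experts_on_rank = np.array([0, 1, 2, 3])
dest, weights = topology_aware_dispatch(logits, experts_on_rank, cost, src_rank=0)
```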
no code implementations • 7 Nov 2022 • Zhengkun Tian, Hongyu Xiang, Min Li, Feifei Lin, Ke Ding, Guanglu Wan
To reduce the peak latency, we propose a simple and novel method named peak-first regularization, which uses a frame-wise knowledge distillation function to force the probability distribution of the CTC model to shift left along the time axis, rather than directly modifying the computation of the CTC loss and its gradients.
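A rough PyTorch sketch of one way a frame-wise, left-shifting distillation term could be written; the exact formulation of peak-first regularization may differ, and the weighting factor `lam` in the usage note is hypothetical.

```python
import torch
import torch.nn.functional as F

def peak_first_regularizer(logits):
    """Frame-wise self-distillation that encourages left-shifted CTC peaks.

    logits: (batch, time, vocab) raw CTC outputs.
    Frame t is trained to match the detached distribution at frame t+1,
    which nudges output spikes toward earlier frames.
    """
    log_p = F.log_softmax(logits[:, :-1, :], dim=-1)        # frames 0..T-2
    with torch.no_grad():
        target = F.softmax(logits[:, 1:, :], dim=-1)        # detached frames 1..T-1
    return F.kl_div(log_p, target, reduction="batchmean")

# usage (hypothetical weighting): total_loss = ctc_loss + lam * peak_first_regularizer(logits)
```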
no code implementations • 9 Oct 2022 • Shichao Kan, Yixiong Liang, Min Li, Yigang Cen, Jianxin Wang, Zhihai He
To address this challenge, in this paper, we introduce a new method called coded residual transform (CRT) for deep metric learning to significantly improve its generalization capability.
no code implementations • 5 Sep 2022 • Min Li, Laurent Kneip
We apply our method to two distinct camera resectioning algorithms, and demonstrate highly efficient and reliable geometric trim fitting.
no code implementations • 2 Sep 2022 • Zhengyuan Shi, Min Li, Sadaf Khan, Hui-Ling Zhen, Mingxuan Yuan, Qiang Xu
In this paper, we propose SATformer, a novel Transformer-based solution for Boolean satisfiability (SAT) solving.
no code implementations • 6 Aug 2022 • Meng Wang, Chuqi Lei, Jianxin Wang, Yaohang Li, Min Li
In conclusion, TripHLApan is a powerful tool for predicting peptide binding to HLA-I and HLA-II molecules for the synthesis of tumor vaccines.
1 code implementation • 2 Jul 2022 • Huimin Zhu, Renyi Zhou, Jing Tang, Min Li
The rational design of novel molecules with desired bioactivity is a critical but challenging task in drug discovery, especially when treating a novel target family or understudied targets.
1 code implementation • 7 Jun 2022 • Zhengyuan Shi, Min Li, Sadaf Khan, Liuzheng Wang, Naixing Wang, Yu Huang, Qiang Xu
Unlike previous learning-based solutions that formulate the TPI task as a supervised-learning problem, we train a novel DRL agent, instantiated as the combination of a graph neural network (GNN) and a Deep Q-Learning network (DQN), to maximize the test coverage improvement.
no code implementations • 27 May 2022 • Min Li, Zhengyuan Shi, Qiuxia Lai, Sadaf Khan, Shaowei Cai, Qiang Xu
Based on this observation, we approximate the SAT solving procedure with a conditional generative model, leveraging a novel directed acyclic graph neural network (DAGNN) with two polarity prototypes for conditional SAT modeling.
no code implementations • 29 Apr 2022 • Shu-Mei Qin, Min Li, Tao Xu, Shao-Qun Dong
This work aims to provide an effective deep learning framework to predict the vector-soliton solutions of the coupled nonlinear equations and their interactions.
no code implementations • 16 Apr 2022 • Ying Wang, Min Li, Deirel Paz-Linares, Maria L. Bringas Vega, Pedro A. Valdés-Sosa
Kernel smoothing is the most fundamental technique for data density and regression estimation.
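For reference, the two textbook kernel-smoothing estimators this entry builds on, kernel density estimation and Nadaraya-Watson regression, in plain NumPy; this is classical material, not the estimator proposed in the paper.

```python
import numpy as np

def gaussian_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)

def kde(x_grid, samples, h):
    """Kernel density estimate at x_grid from 1-D samples with bandwidth h."""
    u = (x_grid[:, None] - samples[None, :]) / h
    return gaussian_kernel(u).mean(axis=1) / h

def nadaraya_watson(x_grid, x, y, h):
    """Kernel regression estimate of E[y | x] on x_grid."""
    w = gaussian_kernel((x_grid[:, None] - x[None, :]) / h)
    return (w * y).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(0)
x = rng.uniform(0, 6, 200)
y = np.sin(x) + 0.3 * rng.standard_normal(200)
grid = np.linspace(0, 6, 100)
density = kde(grid, x, h=0.3)
fit = nadaraya_watson(grid, x, y, h=0.3)
```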
1 code implementation • 26 Nov 2021 • Min Li, Sadaf Khan, Zhengyuan Shi, Naixing Wang, Yu Huang, Qiang Xu
We propose DeepGate, a novel representation learning solution that effectively embeds both logic function and structural information of a circuit as vectors on each gate.
1 code implementation • 26 Nov 2021 • Min Li, Zhengyuan Shi, Zezhong Wang, Weiwei Zhang, Yu Huang, Qiang Xu
The proposed GA-guided XORNets also allow reducing the number of control bits, and the total testing time decreases by 20.78% on average and up to 47.09% compared to the existing design, without sacrificing test coverage.
no code implementations • ICLR 2022 • Minhao Liu, Ailing Zeng, Qiuxia Lai, Ruiyuan Gao, Min Li, Jing Qin, Qiang Xu
In this work, we propose a novel tree-structured wavelet neural network for time series signal analysis, namely T-WaveNet, by taking advantage of an inherent property of various types of signals, known as the dominant frequency range.
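A toy stand-in for the idea of a data-driven wavelet decomposition tree: recursively split the sub-band that carries more energy using a Haar filter bank. T-WaveNet itself learns the filters and the tree structure; this sketch only illustrates the concept of following the dominant frequency range.

```python
import numpy as np

def haar_split(x):
    """One level of Haar analysis: (approximation, detail) at half length."""
    x = x[: len(x) // 2 * 2]
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def energy_guided_tree(x, depth=3, min_len=8):
    """Recursively expand the higher-energy sub-band into a decomposition tree."""
    leaves = []

    def recurse(sig, level, tag):
        if level == depth or len(sig) < min_len:
            leaves.append((tag, sig))
            return
        a, d = haar_split(sig)
        if np.sum(a**2) >= np.sum(d**2):        # follow the dominant frequency range
            leaves.append((tag + "D", d))
            recurse(a, level + 1, tag + "A")
        else:
            leaves.append((tag + "A", a))
            recurse(d, level + 1, tag + "D")

    recurse(np.asarray(x, dtype=float), 0, "")
    return leaves

bands = energy_guided_tree(np.sin(np.linspace(0, 20 * np.pi, 256)) + 0.1 * np.random.randn(256))
```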
no code implementations • 7 Sep 2021 • Ruwen Bai, Min Li, Bo Meng, Fengfa Li, Miao Jiang, Junxing Ren, Degang Sun
Graph convolutional networks (GCNs) have emerged as dominant methods for skeleton-based action recognition.
1 code implementation • 27 Aug 2021 • Yang Yang, Min Li, Bo Meng, Junxing Ren, Degang Sun, Zihao Huang
On the basis of SALT and SDR loss, we propose SALT-Net, which explicitly exploits task-aligned point-set features for accurate detection results.
no code implementations • 12 Jun 2021 • Yifan Wu, Min Zeng, Ying Yu, Min Li
The label-wise attention mechanism is widely used in automatic ICD coding because it can assign weights to every word in full Electronic Medical Records (EMR) for different ICD codes.
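A generic PyTorch sketch of CAML-style label-wise attention, i.e. the mechanism referred to above, not the specific model proposed in the paper.

```python
import torch
import torch.nn as nn

class LabelWiseAttention(nn.Module):
    """One attention query per ICD code over the token representations of an EMR."""

    def __init__(self, hidden_dim, num_labels):
        super().__init__()
        self.query = nn.Linear(hidden_dim, num_labels, bias=False)  # per-label attention queries
        self.score = nn.Parameter(torch.randn(num_labels, hidden_dim))
        self.bias = nn.Parameter(torch.zeros(num_labels))

    def forward(self, H):                                   # H: (batch, tokens, hidden)
        attn = torch.softmax(self.query(H), dim=1)          # (batch, tokens, labels)
        label_doc = torch.einsum("btl,bth->blh", attn, H)   # label-specific document vectors
        logits = (label_doc * self.score).sum(-1) + self.bias
        return logits                                       # (batch, labels)

model = LabelWiseAttention(hidden_dim=128, num_labels=50)
logits = model(torch.randn(4, 300, 128))
```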
no code implementations • 30 May 2021 • Xiongshi Deng, Min Li, Shaobo Deng, Lei Wang
In the second stage, XGBoost-MOGA searches for an optimal gene subset based on the group of most relevant genes using a multi-objective optimization genetic algorithm.
no code implementations • 30 May 2021 • Xiongshi Deng, Min Li, Lei Wang, Qikang Wan
Feature selection is a preprocessing step which plays a crucial role in the domain of machine learning and data mining.
no code implementations • NeurIPS 2021 • Yu Li, Min Li, Qiuxia Lai, Yannan Liu, Qiang Xu
To be specific, we first build a similarity graph on test instances and training samples, and we conduct graph-based semi-supervised learning to extract contextual features.
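A minimal NumPy sketch of graph-based semi-supervised learning on a k-NN similarity graph via label propagation; the graph construction, bandwidth, and clamping scheme here are generic assumptions, not the paper's exact procedure.

```python
import numpy as np

def label_propagation(X_train, y_train, X_test, n_classes, k=10, n_iter=50):
    """Propagate training labels over a k-NN similarity graph joining train and test points."""
    X = np.vstack([X_train, X_test])
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    sigma2 = np.median(d2) + 1e-12
    W = np.exp(-d2 / sigma2)
    # keep only the k strongest neighbours per node (symmetrised)
    idx = np.argsort(-W, axis=1)[:, 1:k + 1]
    mask = np.zeros_like(W, dtype=bool)
    rows = np.repeat(np.arange(len(X)), k)
    mask[rows, idx.ravel()] = True
    W = np.where(mask | mask.T, W, 0.0)
    P = W / W.sum(axis=1, keepdims=True)

    F = np.zeros((len(X), n_classes))
    F[np.arange(len(X_train)), y_train] = 1.0
    for _ in range(n_iter):
        F = P @ F
        F[:len(X_train)] = 0.0
        F[np.arange(len(X_train)), y_train] = 1.0     # clamp labelled nodes
    return F[len(X_train):]                            # soft labels / context for test points
```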
no code implementations • 10 May 2021 • Min Li, Yu Li, Ye Tian, Li Jiang, Qiang Xu
This paper presents AppealNet, a novel edge/cloud collaborative architecture that runs deep learning (DL) tasks more efficiently than state-of-the-art solutions.
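A hedged sketch of the general two-head edge/cloud pattern, where a small "appeal" head decides whether to defer to a larger cloud model; the architecture, threshold, and `cloud_fn` callback are illustrative assumptions, not AppealNet's actual design.

```python
import torch
import torch.nn as nn

class TwoHeadEdgeModel(nn.Module):
    """Small edge model with a classifier head plus an 'appeal' head that estimates
    whether the cheap edge prediction should be deferred to the cloud."""

    def __init__(self, backbone, feat_dim, num_classes):
        super().__init__()
        self.backbone = backbone
        self.classifier = nn.Linear(feat_dim, num_classes)
        self.appeal = nn.Linear(feat_dim, 1)          # probability the edge result is unreliable

    def forward(self, x):
        z = self.backbone(x)
        return self.classifier(z), torch.sigmoid(self.appeal(z))

def edge_or_cloud(model, x, cloud_fn, threshold=0.5):
    """Single-sample dispatch: keep the edge answer unless the appeal head objects."""
    logits, p_appeal = model(x)
    if p_appeal.item() > threshold:
        return cloud_fn(x)                            # pay for the larger cloud model
    return logits.argmax(dim=-1)

backbone = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 64), nn.ReLU())
model = TwoHeadEdgeModel(backbone, feat_dim=64, num_classes=10)
logits, p = model(torch.randn(1, 1, 28, 28))
```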
no code implementations • 29 Apr 2021 • Yang Yang, Min Li, Bo Meng, Zihao Huang, Junxing Ren, Degang Sun
We also propose a new metric to measure the similarity between two groups of extreme points, namely, Extreme Intersection over Union (EIoU), and incorporate this EIoU as a new regression loss.
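The snippet does not spell out how EIoU is computed; one plausible proxy, shown below, measures the IoU of the axis-aligned boxes spanned by each group of four extreme points. Treat it as an illustration rather than the paper's definition.

```python
import numpy as np

def box_from_extremes(pts):
    """pts: (4, 2) top/left/bottom/right extreme points -> (x1, y1, x2, y2)."""
    return pts[:, 0].min(), pts[:, 1].min(), pts[:, 0].max(), pts[:, 1].max()

def extreme_iou(pts_a, pts_b):
    """IoU of the boxes spanned by two groups of extreme points (illustrative proxy)."""
    ax1, ay1, ax2, ay2 = box_from_extremes(pts_a)
    bx1, by1, bx2, by2 = box_from_extremes(pts_b)
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union > 0 else 0.0
```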
no code implementations • 21 Apr 2021 • Yunyan Hong, Ailing Zeng, Min Li, Cewu Lu, Li Jiang, Qiang Xu
Video action recognition (VAR) is a primary task of video understanding, and untrimmed videos are more common in real-life scenes.
1 code implementation • 29 Jan 2021 • Yifan Wu, Min Gao, Min Zeng, Feiyang Chen, Min Li, Jie Zhang
Therefore, we hope to develop a novel supervised learning method to learn the PPAs and DDAs effectively and thereby improve the prediction performance of the specific task of DPI.
no code implementations • 17 Nov 2020 • Hongru Wang, Min Li, Zimo Zhou, Gabriel Pui Cheong Fung, Kam-Fai Wong
In this paper, we publish the first Cantonese knowledge-driven Dialogue Dataset for REStaurant (KddRES) in Hong Kong, which grounds the information in multi-turn conversations to one specific restaurant.
no code implementations • 20 Oct 2020 • Min Li, Chuanfu Xiao, Chao Yang
A mode-wise flexible Tucker decomposition algorithm is proposed to enable switching between different solvers for the factor matrices and core tensor, and a machine-learning-based adaptive solver selector is applied to automatically cope with variations in both the input data and the hardware.
no code implementations • 21 Sep 2020 • Yu Li, Min Li, Bo Luo, Ye Tian, Qiang Xu
The key to enabling such lightweight checking is that the smaller neural network only needs to produce approximate results for the initial task without sacrificing fault coverage much.
no code implementations • 3 May 2020 • Chunshan Liu, Min Li, Lou Zhao, Philip Whiting, Stephen V. Hanly, Iain B. Collings, MinJian Zhao
Millimetre wave (mmWave) beam tracking is a challenging task because tracking algorithms are required to provide consistently high accuracy with a low probability of losing track and minimal overhead.
no code implementations • 6 Apr 2020 • Chuanfu Xiao, Chao Yang, Min Li
In this paper, we propose a new class of truncated HOSVD algorithms based on alternating least squares (ALS) for efficiently computing the low multilinear rank approximation of tensors.
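For context, a textbook HOOI (alternating truncated SVD) sketch in NumPy for low multilinear rank approximation, i.e. the baseline that ALS-based truncated HOSVD algorithms build on; it is not the new algorithm class proposed in the paper.

```python
import numpy as np

def unfold(T, mode):
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_product(T, M, mode):
    return np.moveaxis(np.tensordot(M, T, axes=(1, mode)), 0, mode)

def hooi(T, ranks, n_iter=10):
    """Higher-order orthogonal iteration for a dense tensor T with target multilinear ranks."""
    # initialise factors with the leading singular vectors of each unfolding (truncated HOSVD)
    U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
         for n, r in enumerate(ranks)]
    for _ in range(n_iter):
        for n in range(T.ndim):
            G = T
            for m in range(T.ndim):
                if m != n:
                    G = mode_product(G, U[m].T, m)      # project onto all other factor spaces
            U[n] = np.linalg.svd(unfold(G, n), full_matrices=False)[0][:, :ranks[n]]
    core = T
    for n in range(T.ndim):
        core = mode_product(core, U[n].T, n)
    return core, U

T = np.random.randn(20, 25, 30)
core, factors = hooi(T, ranks=(5, 5, 5))
```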
no code implementations • 18 Jan 2020 • Min Li, Zhenglong Zhou, Zhe Wu, Boxin Shi, Changyu Diao, Ping Tan
From a single viewpoint, we use a set of photometric stereo images to identify surface points with the same distance to the camera.
no code implementations • 3 Jan 2020 • Xiaobo Zhou, Shihao Yan, Min Li, Jun Li, Feng Shu
This work, for the first time, considers confidential data collection in the context of unmanned aerial vehicle (UAV) wireless networks, where the scheduled ground sensor node (SN) intends to transmit confidential information to the UAV without being intercepted by other unscheduled ground SNs.
no code implementations • 6 Dec 2018 • Bo Luo, Min Li, Yu Li, Qiang Xu
Machine learning systems based on deep neural networks (DNNs) have gained mainstream adoption in many applications.
1 code implementation • CONLL 2018 • Min Li, Marina Danilevsky, Sara Noeman, Yunyao Li
Phonetic similarity algorithms identify words and phrases with similar pronunciation, and they are used in many natural language processing tasks.
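As a concrete example of a phonetic similarity key, here is a simplified Soundex implementation (h and w are treated like vowels); it is a classical baseline for illustration only, not the algorithm introduced in the paper.

```python
def soundex(word):
    """Simplified Soundex code: words that sound alike map to the same 4-character key."""
    codes = {**dict.fromkeys("bfpv", "1"), **dict.fromkeys("cgjkqsxz", "2"),
             **dict.fromkeys("dt", "3"), "l": "4",
             **dict.fromkeys("mn", "5"), "r": "6"}
    word = word.lower()
    first = word[0].upper()
    result = []
    prev = codes.get(word[0], "")
    for c in word[1:]:
        code = codes.get(c, "")
        if code and code != prev:      # skip vowels and collapse adjacent duplicates
            result.append(code)
        prev = code
    return (first + "".join(result) + "000")[:4]

assert soundex("Robert") == soundex("Rupert") == "R163"
```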
no code implementations • NeurIPS 2015 • Tom Goldstein, Min Li, Xiaoming Yuan
The alternating direction method of multipliers (ADMM) is an important tool for solving complex optimization problems, but it involves minimization sub-steps that are often difficult to solve efficiently.
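To make the method concrete, a standard exact-update ADMM for the lasso in NumPy; this is the textbook form whose sub-steps the paper is concerned with, not the paper's own variant.

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso(A, b, lam, rho=1.0, n_iter=200):
    """ADMM for: minimize 0.5*||Ax - b||^2 + lam*||z||_1  subject to  x = z."""
    m, n = A.shape
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    Atb = A.T @ b
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))       # factor once, reuse every iteration
    for _ in range(n_iter):
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))  # x-update (least squares)
        z = soft_threshold(x + u, lam / rho)                               # z-update (prox of l1)
        u = u + x - z                                                      # dual ascent
    return z

A = np.random.randn(100, 30)
x_true = np.random.randn(30) * (np.random.rand(30) < 0.2)
b = A @ x_true + 0.01 * np.random.randn(100)
x_hat = admm_lasso(A, b, lam=0.1)
```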
no code implementations • 17 Nov 2015 • Min Li, Sudeep Gaddam, Xiaolin Li, Yinan Zhao, Jingzhe Ma, Jian Ge
In this paper, we apply Deep Learning techniques to detect the broad absorption bump.
1 code implementation • 2 May 2013 • Tom Goldstein, Min Li, Xiaoming Yuan, Ernie Esser, Richard Baraniuk
The Primal-Dual hybrid gradient (PDHG) method is a powerful optimization scheme that breaks complex problems into simple sub-steps.
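A vanilla PDHG instance in NumPy, applied to 1-D total-variation denoising, to make the "simple sub-steps" concrete; the step sizes and the choice of problem are illustrative and not taken from the paper (whose focus is on adaptive variants of the scheme).

```python
import numpy as np

def pdhg_tv1d(b, lam, n_iter=300):
    """PDHG for 1-D total-variation denoising:
        minimize 0.5*||x - b||^2 + lam*||D x||_1,
    with D the forward-difference operator."""
    n = len(b)
    D = lambda x: np.diff(x)                                             # forward differences
    Dt = lambda y: np.concatenate(([-y[0]], -np.diff(y), [y[-1]]))       # adjoint of D
    tau = sigma = 0.4                                                    # tau*sigma*||D||^2 < 1
    x = b.copy(); x_bar = b.copy(); y = np.zeros(n - 1)
    for _ in range(n_iter):
        y = np.clip(y + sigma * D(x_bar), -lam, lam)       # dual prox: project onto [-lam, lam]
        x_new = (x - tau * Dt(y) + tau * b) / (1 + tau)    # primal prox of 0.5*||x - b||^2
        x_bar = 2 * x_new - x                              # extrapolation step
        x = x_new
    return x

signal = np.repeat([0.0, 1.0, 0.2], 50) + 0.1 * np.random.randn(150)
denoised = pdhg_tv1d(signal, lam=0.5)
```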