1 code implementation • 16 Jan 2025 • Zhaocheng Liu, Quan Tu, Wen Ye, Yu Xiao, Zhishou Zhang, Hengfu Cui, Yalun Zhu, Qiang Ju, Shizheng Li, Jian Xie
By feeding medical records into our patient simulator to generate patient responses, we conduct extensive experiments to explore the relationship between "inquiry" and "diagnosis" in the consultation process.
no code implementations • 27 Jun 2024 • Zhongxiang Fan, Zhaocheng Liu, Jian Liang, Dongying Kong, Han Li, Peng Jiang, Shuang Li, Kun Gai
MEDA minimizes overfitting by reducing the dependency of the embedding layer on subsequent training data or the Multi-Layer Perceptron (MLP) layers, and achieves data augmentation through training the MLP with varied embedding spaces.
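The core idea described above, training the downstream predictor against a freshly initialized embedding space in each epoch, can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the predictor is a single linear layer standing in for the MLP, and all sizes and data are toy values.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fresh_embeddings(num_ids, dim):
    # A newly initialized table = a new "embedding space" for this epoch.
    return rng.normal(0.0, 0.1, size=(num_ids, dim))

def run_epoch(emb, w, ids, labels, lr=0.1):
    # Both the per-epoch embeddings and the shared predictor are updated;
    # only the predictor weights `w` persist across epochs.
    for i, y in zip(ids, labels):
        p = sigmoid(emb[i] @ w)
        emb[i] -= lr * (p - y) * w
        w -= lr * (p - y) * emb[i]
    return w

ids = np.array([0, 1, 2, 3] * 4)
labels = np.array([1, 0, 1, 0] * 4)
dim = 8
w = np.zeros(dim)
for epoch in range(3):
    emb = fresh_embeddings(4, dim)  # re-initialized every epoch: the MEDA-style variation
    w = run_epoch(emb, w, ids, labels)
```

Because the predictor never sees the same embedding space twice, it cannot memorize one particular embedding of the ids, which is the overfitting-reduction effect the abstract describes.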
1 code implementation • 7 May 2024 • Jian Jia, Yipei Wang, Yan Li, Honggang Chen, Xuehan Bai, Zhaocheng Liu, Jian Liang, Quan Chen, Han Li, Peng Jiang, Kun Gai
Contemporary recommendation systems predominantly rely on ID embedding to capture latent associations among users and items.
no code implementations • 21 Dec 2023 • Wenbin Hu, Fernando Acero, Eleftherios Triantafyllidis, Zhaocheng Liu, Zhibin Li
We present a modular framework designed to enable a robot hand-arm system to learn how to catch flying objects, a task that requires fast, reactive, and accurately timed robot motions.
no code implementations • 30 Jun 2023 • Eleftherios Triantafyllidis, Fernando Acero, Zhaocheng Liu, Zhibin Li
In this work, we present a Hybrid Hierarchical Learning framework, the Robotic Manipulation Network (ROMAN), to address the challenge of solving multiple complex tasks over long time horizons in robotic manipulation.
no code implementations • 31 May 2023 • Zhaocheng Liu, Zhongxiang Fan, Jian Liang, Dongying Kong, Han Li
However, it is still unknown whether a multi-epoch training paradigm could achieve better results, as the best performance is usually achieved by one-epoch training.
no code implementations • 12 Apr 2023 • Qiang Liu, Zhaocheng Liu, Zhenxi Zhu, Shu Wu, Liang Wang
However, none of the existing multi-interest recommendation models consider the Out-Of-Distribution (OOD) generalization problem, in which the interest distribution may change.
1 code implementation • 3 Sep 2022 • Yingtao Luo, Zhaocheng Liu, Qiang Liu
Unstable correlations between procedures and diagnoses that exist in the training distribution can cause spurious correlations between historical EHRs and future diagnoses.
no code implementations • 14 Jul 2022 • Zhaocheng Liu, Yingtao Luo, Di Zeng, Qiang Liu, Daqing Chang, Dongying Kong, Zhi Chen
Modeling users' dynamic preferences from historical behaviors lies at the core of modern recommender systems.
1 code implementation • 1 Jun 2022 • Yi Guo, Zhaocheng Liu, Jianchao Tan, Chao Liao, Sen yang, Lei Yuan, Dongying Kong, Zhi Chen, Ji Liu
When training finishes, some gates are exactly zero while others are close to one. This property is particularly favorable for practical hot-start training in industry, because removing the features corresponding to exact-zero gates causes no change in model performance.
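The "no damage" claim follows directly from the gate mechanism: a feature whose gate is exactly zero contributes nothing to the gated score, so pruning it leaves predictions unchanged. A minimal numeric sketch (gate values and feature names are hypothetical, not from the paper):

```python
import numpy as np

# Hypothetical gate values after training: they polarize toward {0, 1}.
gates = np.array([1.0, 0.0, 0.97, 0.0, 1.0])
names = ["user_id", "hour", "item_cat", "os", "city"]
x = np.array([0.3, 1.2, -0.5, 0.8, 2.0])  # one sample's feature values

keep = gates > 0.0  # exact-zero gates mark removable features
kept_names = [n for n, k in zip(names, keep) if k]

# Score with all features vs. score after pruning zero-gate features:
full_score = (gates * x).sum()
pruned_score = (gates[keep] * x[keep]).sum()
```

Since the dropped terms are exactly `0 * x_i`, `full_score` and `pruned_score` coincide, which is why the hot-start model can be restarted on the pruned feature set without a performance gap.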
no code implementations • 13 May 2022 • Zhaocheng Liu, Luis Herranz, Fei Yang, Saiping Zhang, Shuai Wan, Marta Mrak, Marc Górriz Blanch
Neural video compression has emerged as a novel paradigm combining trainable multilayer neural networks and machine learning, achieving competitive rate-distortion (RD) performance, but it remains impractical due to heavy neural architectures with large memory and computational demands.
no code implementations • 15 Aug 2021 • Qiang Liu, Yanqiao Zhu, Zhaocheng Liu, Yufeng Zhang, Shu Wu
To train high-performing models at minimal annotation cost, active learning is proposed to select and label the most informative samples, yet it is still challenging to measure the informativeness of the samples used in DNNs.
no code implementations • 24 Feb 2021 • Qiang Liu, Zhaocheng Liu, Haoli Zhang, Yuntian Chen, Jun Zhu
Accordingly, we can design an automatic feature crossing method to find feature interactions in DNN, and use them as cross features in LR.
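Why cross features make an interpretable linear model more expressive can be seen on the classic XOR example: plain LR cannot separate XOR, but adding the product feature x1*x2 makes it linearly separable. This sketch only illustrates the "cross features in LR" half of the sentence; the paper's automatic extraction of crossings from a DNN is not reproduced here.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_lr(X, y, lr=0.5, steps=20000):
    # Plain batch gradient descent on the logistic loss.
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(steps):
        p = sigmoid(X @ w + b)
        w -= lr * X.T @ (p - y) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)  # XOR: not linearly separable in x1, x2

# Append the cross feature x1*x2; the data becomes linearly separable.
Xc = np.hstack([X, (X[:, 0] * X[:, 1])[:, None]])
w, b = fit_lr(Xc, y)
pred = (sigmoid(Xc @ w + b) > 0.5).astype(int)
```

The fitted weights remain globally interpretable: each coefficient, including the one on the cross feature, has a direct additive effect on the log-odds.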
2 code implementations • 8 Feb 2021 • Yingtao Luo, Qiang Liu, Zhaocheng Liu
The next location recommendation is at the core of various location-based applications.
Ranked #1 on point-of-interest recommendation on Gowalla.
no code implementations • 22 Aug 2020 • Zhaocheng Liu, Qiang Liu, Haoli Zhang, Yuntian Chen
Simple classifiers, e.g., Logistic Regression (LR), are globally interpretable, but not powerful enough to model complex nonlinear interactions among features in tabular data.
no code implementations • 23 Jul 2020 • Qiang Liu, Zhaocheng Liu, Xiaofang Zhu, Yeliang Xiu
In this paper, inspired by piece-wise linear interpretability in DNN, we introduce the linearly separable regions of samples to the problem of active learning, and propose a novel Deep Active learning approach by Model Interpretability (DAMI).
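The "linearly separable regions" mentioned above come from the piece-wise linear structure of ReLU networks: the on/off pattern of the ReLU units identifies which linear region a sample falls into. A toy sketch of that idea (the layer sizes and weights are illustrative, not DAMI's actual architecture):

```python
import numpy as np

rng = np.random.default_rng(0)
W, b = rng.normal(size=(3, 2)), rng.normal(size=3)  # toy 2-input, 3-unit ReLU layer

def region_code(x):
    # The activation pattern (which units are "on") labels the linear
    # region of the piece-wise linear network containing x.
    return tuple(int(h > 0) for h in (W @ x + b))

code_a = region_code(np.array([0.0, 0.0]))
code_b = region_code(np.array([0.0, 1e-6]))  # a tiny perturbation stays in the same region
```

Samples sharing a region code lie on the same local linear piece of the network, which is the unit DAMI reasons about when selecting samples to label.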
no code implementations • 17 Jul 2020 • Qiang Liu, Haoli Zhang, Zhaocheng Liu
Moreover, we have also conducted experiments on a typical task of graph embedding, i.e., community detection, and the proposed UCMF model outperforms several representative graph embedding models.
no code implementations • 30 Jun 2020 • Dayu Zhu, Zhaocheng Liu, Lakshmi Raju, Andrew S. Kim, Wenshan Cai
Flat optics foresees a new era of ultra-compact optical devices, where metasurfaces serve as the foundation.
no code implementations • 27 Apr 2020 • Qiang Liu, Zhaocheng Liu, Haoli Zhang
When dealing with continuous numeric features, we usually adopt feature discretization.
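Feature discretization in its simplest form maps a continuous value to a categorical bin id. A minimal equal-width-binning sketch (one of several common strategies; the paper does not prescribe this particular one):

```python
import numpy as np

def discretize(x, n_bins=4):
    # Equal-width binning: split [min, max] into n_bins intervals and
    # return each value's bin id in [0, n_bins - 1].
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    return np.digitize(x, edges[1:-1])  # interior edges only

x = np.array([0.1, 0.4, 2.5, 7.9, 9.3])
bins = discretize(x)
```

The resulting bin ids can then be treated like any other categorical feature, e.g., fed through an embedding layer.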
no code implementations • ICLR 2020 • Zhaocheng Liu, Qiang Liu, Haoli Zhang, Jun Zhu
In recent years, substantial progress has been made on graph convolutional networks (GCN).
1 code implementation • 1 Jan 2020 • Feng Yu, Zhaocheng Liu, Qiang Liu, Haoli Zhang, Shu Wu, Liang Wang
IM is an efficient and exact implementation of high-order FM, whose time complexity grows linearly with the order of interactions and the number of feature fields.
Ranked #2 on Click-Through Rate Prediction on KKBox.
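For the second-order case, the exactness of a linear-time FM computation rests on a well-known algebraic identity: the sum over all pairwise factored interactions can be rewritten as a difference of squares, turning an O(n²k) double loop into an O(nk) pass. A sketch of that identity (toy sizes; IM's extension to higher orders is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 6, 4                  # n feature fields, k-dimensional factor vectors
x = rng.normal(size=n)       # feature values
V = rng.normal(size=(n, k))  # one factor vector per field

# Brute force: sum over all pairs i < j of <v_i, v_j> * x_i * x_j  -> O(n^2 k)
brute = sum((V[i] @ V[j]) * x[i] * x[j]
            for i in range(n) for j in range(i + 1, n))

# Linear-time identity: 0.5 * (||sum_i v_i x_i||^2 - sum_i ||v_i x_i||^2)  -> O(n k)
s = V.T @ x
fast = 0.5 * (s @ s - ((V ** 2).T @ (x ** 2)).sum())
```

Both expressions are exactly equal, which is what "efficient and exact" means for this family of models.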
no code implementations • CIKM 2020 • Feng Yu, Zhaocheng Liu, Qiang Liu, Haoli Zhang, Shu Wu, Liang Wang
IM is an efficient and exact implementation of high-order FM, whose time complexity grows linearly with the order of interactions and the number of feature fields.
no code implementations • 25 Sep 2019 • Zhaocheng Liu, Qiang Liu, Haoli Zhang
Automatic feature generation is a major topic of automated machine learning.
no code implementations • 23 Apr 2019 • Zhaocheng Liu, Guangxue Yin
This is because there is a gap between offline AUC and online CPM.
no code implementations • 25 May 2018 • Zhaocheng Liu, Dayu Zhu, Sean P. Rodrigues, Kyu-Tae Lee, Wenshan Cai
The advent of two-dimensional metamaterials in recent years has ushered in a revolutionary means to manipulate the behavior of light on the nanoscale.
no code implementations • 13 Oct 2017 • Zhaocheng Liu, Sean P. Rodrigues, Wenshan Cai
The deep learning framework is witnessing expansive growth into diverse applications such as biological systems, human cognition, robotics, and the social sciences, thanks to its immense ability to extract essential features from complicated systems.