Search Results for author: Dehua Cheng

Found 8 papers, 1 paper with code

Towards Automated Neural Interaction Discovery for Click-Through Rate Prediction

no code implementations · 29 Jun 2020 · Qingquan Song, Dehua Cheng, Hanning Zhou, Jiyan Yang, Yuandong Tian, Xia Hu

Click-Through Rate (CTR) prediction is one of the most important machine learning tasks in recommender systems, driving personalized experience for billions of consumers.

Tasks: Click-Through Rate Prediction · Learning-To-Rank · +2

Feature Interaction Interpretability: A Case for Explaining Ad-Recommendation Systems via Neural Interaction Detection

1 code implementation · ICLR 2020 · Michael Tsang, Dehua Cheng, Hanpeng Liu, Xue Feng, Eric Zhou, Yan Liu

Recommendation is a prevalent application of machine learning that affects many users; therefore, it is important for recommender models to be accurate and interpretable.

Tasks: Image Classification · Recommendation Systems

Detecting Statistical Interactions from Neural Network Weights

no code implementations · ICLR 2018 · Michael Tsang, Dehua Cheng, Yan Liu

Interpreting neural networks is a crucial and challenging task in machine learning.

SPALS: Fast Alternating Least Squares via Implicit Leverage Scores Sampling

no code implementations · NeurIPS 2016 · Dehua Cheng, Richard Peng, Yan Liu, Ioakeim Perros

In this paper, we show ways of sampling intermediate steps of alternating minimization algorithms for computing low-rank tensor CP decompositions, leading to the sparse alternating least squares (SPALS) method.
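For context, the intermediate step being sampled is the least-squares factor update of CP-ALS. A minimal NumPy sketch of the exact dense update (not the paper's leverage-score sampling; `khatri_rao` and `als_update_A` are illustrative helper names, not from the paper):

```python
import numpy as np

def khatri_rao(B, C):
    # Column-wise Kronecker product of B (J x R) and C (K x R) -> (J*K, R).
    R = B.shape[1]
    return np.einsum('jr,kr->jkr', B, C).reshape(-1, R)

def als_update_A(X, B, C):
    # One exact ALS step for the first factor of a 3-way CP decomposition:
    #   A = X_(1) (C (x) B) [(B^T B) * (C^T C)]^+
    # where (x) is the Khatri-Rao product and * the Hadamard product.
    # SPALS avoids forming the full Khatri-Rao product by row sampling.
    I, J, K = X.shape
    X1 = X.reshape(I, J * K)            # mode-1 unfolding of the tensor
    KR = khatri_rao(B, C)               # tall matrix, shape (J*K, R)
    G = (B.T @ B) * (C.T @ C)           # R x R Gram matrix of KR
    return X1 @ KR @ np.linalg.pinv(G)
```

The Gram identity `KR.T @ KR == (B.T @ B) * (C.T @ C)` is what keeps the small `R x R` solve cheap; the expensive part SPALS targets is the product with the tall Khatri-Rao matrix.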

On Bochner's and Polya's Characterizations of Positive-Definite Kernels and the Respective Random Feature Maps

no code implementations · 27 Oct 2016 · Jie Chen, Dehua Cheng, Yan Liu

A well-known construction of such functions comes from Bochner's characterization, which connects a positive-definite function with a probability distribution.

Tasks: Gaussian Processes
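As a concrete illustration of a Bochner-based random feature map, here is the standard random Fourier feature construction for the Gaussian kernel (a textbook sketch, not necessarily the construction studied in this paper): frequencies are drawn from the kernel's spectral density, here a standard normal.

```python
import numpy as np

def random_fourier_features(X, D, rng):
    # Bochner's theorem: the Gaussian kernel k(x, y) = exp(-||x - y||^2 / 2)
    # is the Fourier transform of a Gaussian spectral density, so sampling
    # frequencies w ~ N(0, I) and random phases b gives a feature map z
    # with z(x) . z(y) ~= k(x, y) in expectation.
    n, d = X.shape
    W = rng.standard_normal((d, D))      # spectral (frequency) samples
    b = rng.uniform(0.0, 2 * np.pi, D)   # random phase shifts
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)
```

With `D` in the thousands, inner products of the features approximate the kernel to within roughly `1/sqrt(D)` Monte Carlo error.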

Spectral Sparsification of Random-Walk Matrix Polynomials

no code implementations · 12 Feb 2015 · Dehua Cheng, Yu Cheng, Yan Liu, Richard Peng, Shang-Hua Teng

Our work is particularly motivated by algorithmic problems in speeding up the classic Newton's method, such as computing the inverse square root of the precision matrix of a Gaussian random field, as well as computing the $q$th root (for $q \geq 1$) of the transition matrix of a time-reversible Markov model.
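For reference, the dense Newton-type iteration in question can be sketched with the classic Denman–Beavers iteration for the matrix inverse square root (a minimal dense illustration; the paper's contribution is the sparsification that avoids the cubic per-step cost of these inverses):

```python
import numpy as np

def inverse_sqrt(A, iters=30):
    # Denman–Beavers coupled Newton iteration for symmetric positive-definite A:
    #   Y_{k+1} = (Y_k + Z_k^{-1}) / 2,   Z_{k+1} = (Z_k + Y_k^{-1}) / 2,
    # with Y_0 = A, Z_0 = I.  Then Y_k -> A^{1/2} and Z_k -> A^{-1/2}.
    # Each step costs a dense matrix inverse, i.e. O(n^3) per iteration.
    Y, Z = A.copy(), np.eye(A.shape[0])
    for _ in range(iters):
        # Tuple assignment evaluates both right-hand sides with the old Y, Z.
        Y, Z = (Y + np.linalg.inv(Z)) / 2, (Z + np.linalg.inv(Y)) / 2
    return Z
```

The iteration converges quadratically for positive-definite inputs, which is why a handful of steps suffice once each step can be made cheap.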
