no code implementations • 14 Nov 2023 • Wei Wen, Kuang-Hung Liu, Igor Fedorov, Xin Zhang, Hang Yin, Weiwei Chu, Kaveh Hassani, Mengying Sun, Jiang Liu, Xu Wang, Lin Jiang, Yuxin Chen, Buyun Zhang, Xi Liu, Dehua Cheng, Zhengxing Chen, Guang Zhao, Fangqiu Han, Jiyan Yang, Yuchen Hao, Liang Xiong, Wen-Yen Chen
In industrial systems, such as the ranking systems at Meta, it is unclear whether NAS algorithms from the literature can outperform production baselines, because of: (1) scale - Meta ranking systems serve billions of users; (2) strong baselines - the baselines are production models optimized by hundreds to thousands of world-class engineers for years since the rise of deep learning; (3) dynamic baselines - engineers may establish new and stronger baselines during the NAS search; and (4) efficiency - the search pipeline must yield results quickly, in alignment with the productionization life cycle.
2 code implementations • 14 Jul 2022 • Tunhou Zhang, Dehua Cheng, Yuchen He, Zhengxing Chen, Xiaoliang Dai, Liang Xiong, Feng Yan, Hai Li, Yiran Chen, Wei Wen
To overcome the data multi-modality and architecture heterogeneity challenges in the recommendation domain, NASRec establishes a large supernet (i.e., the search space) to search for full architectures.
no code implementations • 29 Jun 2020 • Qingquan Song, Dehua Cheng, Hanning Zhou, Jiyan Yang, Yuandong Tian, Xia Hu
Click-Through Rate (CTR) prediction is one of the most important machine learning tasks in recommender systems, driving personalized experiences for billions of consumers.
1 code implementation • ICLR 2020 • Michael Tsang, Dehua Cheng, Hanpeng Liu, Xue Feng, Eric Zhou, Yan Liu
Recommendation is a prevalent application of machine learning that affects many users; therefore, it is important for recommender models to be accurate and interpretable.
no code implementations • ICLR 2018 • Michael Tsang, Dehua Cheng, Yan Liu
Interpreting neural networks is a crucial and challenging task in machine learning.
no code implementations • NeurIPS 2016 • Dehua Cheng, Richard Peng, Yan Liu, Ioakeim Perros
In this paper, we show ways of sampling intermediate steps of alternating minimization algorithms for computing low-rank tensor CP decompositions, leading to the sparse alternating least squares (SPALS) method.
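As a rough illustration of the sampled-ALS idea, here is a toy numpy sketch: one ALS update for a single CP factor is a least-squares problem against a Khatri-Rao design matrix, and that problem can be solved on a subsample of its rows. Uniform sampling stands in for the leverage-score sampling SPALS actually uses, and the helper name `khatri_rao` is ours.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy noiseless rank-r CP tensor of shape n x n x n.
n, r = 30, 4
A, B, C = (rng.standard_normal((n, r)) for _ in range(3))
T = np.einsum('ir,jr,kr->ijk', A, B, C)

def khatri_rao(X, Y):
    """All row-wise elementwise products: (n*m) x r, row j*m+k = X[j] * Y[k]."""
    return (X[:, None, :] * Y[None, :, :]).reshape(-1, X.shape[1])

# One ALS update for factor A solves min_A || T_(1) - A @ KR.T ||_F,
# where T_(1) is the mode-1 unfolding and KR the Khatri-Rao design
# matrix. Instead of using all n*n rows of KR, sample a subset
# (uniformly here; SPALS samples by leverage scores, which are cheap
# to estimate for Khatri-Rao products) and solve the smaller problem.
T1 = T.reshape(n, -1)                  # mode-1 unfolding, n x (n*n)
KR = khatri_rao(B, C)                  # (n*n) x r, so T1 == A @ KR.T
s = 200                                # number of sampled rows
idx = rng.choice(KR.shape[0], size=s, replace=False)
sol, *_ = np.linalg.lstsq(KR[idx], T1[:, idx].T, rcond=None)
A_hat = sol.T                          # updated n x r factor

# The tensor is noiseless, so the sampled update recovers A exactly.
print(np.allclose(A_hat, A))           # → True
```

In the noisy case the sampled solution only approximates the full update, and the quality of uniform sampling degrades on skewed data, which is why SPALS's leverage-score sampling matters.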
no code implementations • 27 Oct 2016 • Jie Chen, Dehua Cheng, Yan Liu
A well-known construction of such functions comes from Bochner's characterization, which connects a positive-definite function with a probability distribution.
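Bochner's characterization is what underlies random Fourier features: for the Gaussian (RBF) kernel the associated probability distribution is itself Gaussian, so the kernel can be approximated by Monte Carlo sampling in the frequency domain. A minimal sketch of that general construction (it illustrates the Bochner connection, not necessarily the specific functions studied in the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Bochner: a continuous shift-invariant positive-definite function
# k(x - y) is the Fourier transform of a probability distribution.
# For k(d) = exp(-gamma * ||d||^2), that distribution is N(0, 2*gamma*I),
# so sampling frequencies from it yields a randomized feature map
# whose inner product approximates the kernel.
d, D = 5, 20000          # input dimension, number of random features
gamma = 0.5

W = rng.normal(scale=np.sqrt(2 * gamma), size=(D, d))  # frequencies
b = rng.uniform(0, 2 * np.pi, size=D)                  # random phases

def features(x):
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

x, y = rng.standard_normal(d), rng.standard_normal(d)
exact = np.exp(-gamma * np.sum((x - y) ** 2))
approx = features(x) @ features(y)
print(abs(exact - approx))   # small; Monte Carlo error shrinks like 1/sqrt(D)
```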
no code implementations • 12 Feb 2015 • Dehua Cheng, Yu Cheng, Yan Liu, Richard Peng, Shang-Hua Teng
Our work is particularly motivated by the algorithmic problems for speeding up the classic Newton's method in applications such as computing the inverse square-root of the precision matrix of a Gaussian random field, as well as computing the $q$th-root transition (for $q\geq1$) in a time-reversible Markov model.
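For intuition about the target quantity, the classic dense Newton–Schulz iteration computes an inverse matrix square root using only matrix multiplications. The paper's contribution is making Newton-style iterations scalable for large sparse SDDM matrices, which this toy dense sketch does not attempt:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a small symmetric positive definite matrix A.
n = 8
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)

# Coupled Newton–Schulz iteration: Y -> (A/c)^{1/2}, Z -> (A/c)^{-1/2}.
# Scaling by the spectral norm c puts the eigenvalues of A/c in (0, 1],
# which guarantees convergence; convergence is then quadratic.
c = np.linalg.norm(A, 2)
Y, Z = A / c, np.eye(n)
for _ in range(30):
    T = 0.5 * (3 * np.eye(n) - Z @ Y)
    Y, Z = Y @ T, T @ Z

inv_sqrt_A = Z / np.sqrt(c)      # undo the scaling: A^{-1/2}
print(np.allclose(inv_sqrt_A @ A @ inv_sqrt_A, np.eye(n), atol=1e-8))
```

Each step costs dense matrix multiplications, which is exactly what becomes prohibitive at the scale of large Gaussian random fields; hence the interest in sparse, nearly-linear-time alternatives.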
no code implementations • 23 Oct 2014 • Dehua Cheng, Xinran He, Yan Liu
Topic models have achieved significant success in analyzing large-scale text corpora.
no code implementations • 20 Oct 2014 • Dehua Cheng, Yu Cheng, Yan Liu, Richard Peng, Shang-Hua Teng
random samples for $n$-dimensional Gaussian random fields with SDDM precision matrices.
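The standard dense baseline for such sampling goes through a Cholesky factorization of the precision matrix; a minimal sketch (the paper targets doing this scalably for large sparse SDDM matrices, which this dense toy version does not attempt):

```python
import numpy as np

rng = np.random.default_rng(0)

# A small SDDM matrix (symmetric, diagonally dominant, nonpositive
# off-diagonals): a cycle-graph Laplacian plus the identity.
Lap = np.array([[ 2., -1.,  0., -1.],
                [-1.,  2., -1.,  0.],
                [ 0., -1.,  2., -1.],
                [-1.,  0., -1.,  2.]])
S = Lap + np.eye(4)               # SDDM precision matrix
n = S.shape[0]

# Factor S = L @ L.T. Then x = L^{-T} z with z ~ N(0, I) has
# covariance L^{-T} L^{-1} = S^{-1}, i.e. x ~ N(0, S^{-1}).
L = np.linalg.cholesky(S)
Z = rng.standard_normal((200_000, n))
X = np.linalg.solve(L.T, Z.T).T   # each row is one sample

emp_cov = X.T @ X / len(X)
print(np.max(np.abs(emp_cov - np.linalg.inv(S))))  # small sampling error
```

The empirical covariance of the samples matches S^{-1} up to Monte Carlo error; the cost bottleneck at scale is the Cholesky factorization, which can lose sparsity badly on large graphs.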