7 code implementations • 27 Feb 2024 • Jiaqi Zhai, Lucy Liao, Xing Liu, Yueming Wang, Rui Li, Xuan Cao, Leon Gao, Zhaojie Gong, Fangda Gu, Michael He, Yinghai Lu, Yu Shi
Large-scale recommendation systems are characterized by their reliance on high-cardinality, heterogeneous features and by the need to handle tens of billions of user actions every day.
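To make the feature side concrete, below is a minimal sketch (not the paper's architecture) of one common way to feed high-cardinality categorical IDs to a model: hashing them into a fixed-size embedding table. The bucket count, dimension, and use of PyTorch are illustrative assumptions.

```python
import torch
import torch.nn as nn

class HashedEmbedding(nn.Module):
    """Map arbitrarily high-cardinality IDs into a fixed-size embedding table
    via the hashing trick. Illustrative sketch only, not the paper's model."""

    def __init__(self, num_buckets: int = 1_000_000, dim: int = 64):
        super().__init__()
        self.num_buckets = num_buckets
        self.table = nn.Embedding(num_buckets, dim)

    def forward(self, raw_ids: torch.Tensor) -> torch.Tensor:
        # Hash raw IDs (e.g., item or user IDs) into the bucket range.
        buckets = raw_ids % self.num_buckets
        return self.table(buckets)

# Example: embed a batch of item IDs drawn from a huge ID space.
ids = torch.randint(0, 10**9, (32,))
emb = HashedEmbedding()(ids)   # shape: (32, 64)
```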
Ranked #1 for Recommendation Systems on MovieLens 20M (HR@10, full corpus)
1 code implementation • 31 Mar 2022 • Neelay Junnarkar, He Yin, Fangda Gu, Murat Arcak, Peter Seiler
We propose a parameterization of a nonlinear dynamic controller based on the recurrent equilibrium network, a generalization of the recurrent neural network.
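For intuition, here is a minimal sketch of the structure such a model combines: a linear state update coupled with an equilibrium (implicit) layer solved at each step. The toy dimensions, plain fixed-point iteration, and random weights are illustrative assumptions, not the controller parameterization proposed in the paper.

```python
import numpy as np

def ren_step(x, u, A, B1, B2, C1, D11, D12, n_iter=50):
    """One step of a recurrent equilibrium network (illustrative sketch).

    The hidden signal w solves the implicit equation
        w = tanh(D11 @ w + C1 @ x + D12 @ u),
    approximated here by simple fixed-point iteration.
    """
    w = np.zeros(D11.shape[0])
    for _ in range(n_iter):
        w = np.tanh(D11 @ w + C1 @ x + D12 @ u)
    # Linear state update driven by the equilibrium solution.
    return A @ x + B1 @ w + B2 @ u

# Toy sizes: 4 states, 3 equilibrium units, 2 inputs.
rng = np.random.default_rng(0)
A, B1, B2 = rng.normal(size=(4, 4)) * 0.1, rng.normal(size=(4, 3)), rng.normal(size=(4, 2))
C1, D11, D12 = rng.normal(size=(3, 4)), rng.normal(size=(3, 3)) * 0.1, rng.normal(size=(3, 2))
x_next = ren_step(np.zeros(4), np.ones(2), A, B1, B2, C1, D11, D12)
```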
1 code implementation • 8 Sep 2021 • Fangda Gu, He Yin, Laurent El Ghaoui, Murat Arcak, Peter Seiler, Ming Jin
Neural network controllers have become popular in control tasks thanks to their flexibility and expressivity.
1 code implementation • NeurIPS 2020 • Fangda Gu, Heng Chang, Wenwu Zhu, Somayeh Sojoudi, Laurent El Ghaoui
Graph Neural Networks (GNNs) are widely used deep learning models that learn meaningful representations from graph-structured data.
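The implicit formulation in this line of work defines node embeddings as the solution of a fixed-point equation rather than the output of a fixed number of propagation layers. A minimal NumPy sketch of that idea, using naive iteration, toy dimensions, and random weights, and omitting the well-posedness conditions such models require:

```python
import numpy as np

def implicit_gnn_embeddings(A_hat, U, W, B, n_iter=100):
    """Sketch of an implicit GNN layer: node embeddings X are defined as the
    fixed point of X = relu(W @ X @ A_hat + B @ U), found here by naive
    iteration. Illustrative only; convergence conditions are omitted."""
    relu = lambda Z: np.maximum(Z, 0.0)
    X = np.zeros((W.shape[0], A_hat.shape[0]))   # (hidden_dim, num_nodes)
    for _ in range(n_iter):
        X = relu(W @ X @ A_hat + B @ U)
    return X

# Toy graph: 5 nodes, 3 input features per node, 8 hidden dimensions.
rng = np.random.default_rng(0)
A_hat = rng.random((5, 5)); A_hat /= A_hat.sum(0, keepdims=True)   # normalized adjacency
U = rng.normal(size=(3, 5))                                        # node features
W = rng.normal(size=(8, 8)) * 0.1
B = rng.normal(size=(8, 3))
X = implicit_gnn_embeddings(A_hat, U, W, B)   # (8, 5) node embeddings
```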
no code implementations • 17 Aug 2019 • Laurent El Ghaoui, Fangda Gu, Bertrand Travacca, Armin Askari, Alicia Y. Tsai
Implicit deep learning prediction rules generalize the recursive rules of feedforward neural networks.
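Concretely, the implicit prediction rule takes the form y(u) = C x + D u, where the state x solves the equilibrium equation x = phi(A x + B u) instead of being computed layer by layer. A minimal sketch with ReLU as phi and plain fixed-point iteration as the solver (both the solver and the toy sizes are assumptions for illustration):

```python
import numpy as np

def implicit_predict(u, A, B, C, D, n_iter=100):
    """Implicit prediction rule: solve x = relu(A @ x + B @ u) for the state x,
    then output y = C @ x + D @ u. Naive fixed-point iteration for illustration."""
    x = np.zeros(A.shape[0])
    for _ in range(n_iter):
        x = np.maximum(A @ x + B @ u, 0.0)
    return C @ x + D @ u

# Toy sizes: 6-dim state, 4-dim input, 2-dim output.
rng = np.random.default_rng(0)
A = rng.normal(size=(6, 6)) * 0.1   # scaled down so the iteration converges
B, C, D = rng.normal(size=(6, 4)), rng.normal(size=(2, 6)), rng.normal(size=(2, 4))
y = implicit_predict(rng.normal(size=4), A, B, C, D)
```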
1 code implementation • 20 Nov 2018 • Fangda Gu, Armin Askari, Laurent El Ghaoui
In this paper, we introduce a new class of lifted models, Fenchel lifted networks, which enjoy the same benefits as previous lifted models without suffering a degradation in performance relative to classical networks.
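For context, "lifted" training replaces the nested composition of layers with explicit per-layer activation variables tied to the weights by penalty terms, which decouples the layers during optimization. A schematic sketch of such a lifted objective; the squared penalties below are a generic stand-in, not the Fenchel-conjugate-based terms the paper derives:

```python
import numpy as np

def lifted_objective(Ws, Xs, y, lam=1.0):
    """Schematic lifted training objective: activations X_1..X_{L-1} are free
    variables, coupled to the weights through per-layer penalties instead of
    an exact nested composition. Illustrative sketch only."""
    relu = lambda Z: np.maximum(Z, 0.0)
    penalty = sum(
        np.sum((Xs[l + 1] - relu(Ws[l] @ Xs[l])) ** 2)
        for l in range(len(Ws) - 1)
    )
    loss = np.sum((Ws[-1] @ Xs[-1] - y) ** 2)   # final-layer fit to the targets
    return loss + lam * penalty

# Toy 3-4-2 network on 5 samples: X0 is the data, X1 is a free activation block.
rng = np.random.default_rng(0)
Ws = [rng.normal(size=(4, 3)), rng.normal(size=(2, 4))]
Xs = [rng.normal(size=(3, 5)), rng.normal(size=(4, 5))]
y = rng.normal(size=(2, 5))
val = lifted_objective(Ws, Xs, y)
```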
no code implementations • 11 Jun 2018 • Siyuan Li, Fangda Gu, Guangxiang Zhu, Chongjie Zhang
Transfer learning can greatly speed up reinforcement learning for a new task by leveraging policies of relevant tasks.
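As a toy illustration of leveraging policies from relevant tasks, here is a sketch of probabilistic policy reuse, where the agent mixes actions from a previously learned source policy with its own exploratory policy for the new task. The mixing rule and interfaces are assumptions for illustration, not the method proposed in the paper.

```python
import random

def reuse_action(state, source_policy, target_policy, reuse_prob=0.5):
    """With probability reuse_prob, act according to a previously learned
    source policy; otherwise follow the policy being learned for the new
    task. Illustrative sketch of policy reuse, not the paper's algorithm."""
    if random.random() < reuse_prob:
        return source_policy(state)
    return target_policy(state)

# Example with trivial stand-in policies over actions {0, 1}.
source_policy = lambda s: 0
target_policy = lambda s: random.choice([0, 1])
action = reuse_action(state=None, source_policy=source_policy, target_policy=target_policy)
```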