2 code implementations • 13 Jul 2022 • Aurko Roy, Rohan Anil, Guangda Lai, Benjamin Lee, Jeffrey Zhao, Shuyuan Zhang, Shibo Wang, Ye Zhang, Shen Wu, Rigel Swavely, Tao Yu, Phuong Dao, Christopher Fifty, Zhifeng Chen, Yonghui Wu
Transformer models have recently emerged as foundational models in natural language processing, and as a byproduct there is significant recent interest and investment in scaling them.
Ranked #5 on Language Modelling on C4
no code implementations • 2 Oct 2021 • Steven DiSilvio, Yu Luo, Anthony Ozerov
We develop a strategy, the "bottom-feeder", which estimates value by observing orders sent by other agents, and find that it limits the success of fundamentalists.
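The snippet gives no implementation details, so the following is only a minimal sketch of what an order-observing agent of this kind could look like. The class name, methods, and the exponential-moving-average estimator are all illustrative assumptions, not the authors' strategy.

```python
# Hypothetical sketch of a "bottom-feeder"-style agent: it never computes a
# fundamental value itself, only infers one from orders sent by other agents.
# The EMA estimator and all names here are illustrative assumptions.

class BottomFeeder:
    def __init__(self, alpha: float = 0.1):
        self.alpha = alpha          # smoothing factor for the value estimate
        self.value_estimate = None  # inferred value; None until an order is seen

    def observe_order(self, price: float) -> None:
        """Update the inferred value from another agent's order price."""
        if self.value_estimate is None:
            self.value_estimate = price
        else:
            # exponential moving average over observed order prices
            self.value_estimate = (1 - self.alpha) * self.value_estimate \
                                  + self.alpha * price

    def quote(self, spread: float = 0.01):
        """Return a (bid, ask) pair around the inferred value, if any."""
        if self.value_estimate is None:
            return None  # no information observed yet; stay out of the market
        return (self.value_estimate * (1 - spread),   # bid
                self.value_estimate * (1 + spread))   # ask
```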
no code implementations • NeurIPS 2020 • Jiayang Li, Jing Yu, Yu Nie, Zhaoran Wang
In this paper, we provide a unified framework for learning and intervention in games.
no code implementations • 12 Nov 2019 • Ziyu Zhang, Feipeng Da, Yi Yu
To mitigate the distribution mismatch between model data and real faces, different point sampling methods are used in the training and test phases.
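The snippet does not name the sampling methods; purely to illustrate using different samplers at train and test time, here is a sketch with uniform random sampling for training and farthest-point sampling for testing. Both choices are assumptions, not the paper's methods.

```python
import numpy as np

def random_sample(points: np.ndarray, k: int, rng=None) -> np.ndarray:
    """Training-time sampler (assumed): uniform random subset of the cloud."""
    rng = rng or np.random.default_rng()
    idx = rng.choice(len(points), size=k, replace=False)
    return points[idx]

def farthest_point_sample(points: np.ndarray, k: int) -> np.ndarray:
    """Test-time sampler (assumed): greedy farthest-point sampling,
    which covers the cloud more evenly than a random subset."""
    chosen = [0]                          # start from an arbitrary point
    dist = np.linalg.norm(points - points[0], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(dist))        # point farthest from the chosen set
        chosen.append(nxt)
        dist = np.minimum(dist, np.linalg.norm(points - points[nxt], axis=1))
    return points[chosen]

# Usage: draw 1024 points per scan, with a different sampler per phase.
cloud = np.random.rand(10000, 3)          # stand-in for a 3D face scan
train_pts = random_sample(cloud, 1024)
test_pts = farthest_point_sample(cloud, 1024)
```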
no code implementations • 9 Nov 2019 • Tong Wang, Fujie Jin, Yu Hu, Yuan Cheng
The prediction model and the interpretable insights can help fundraisers better promote their campaigns, and can potentially enable crowdfunding platforms to give all fundraisers more timely feedback.
no code implementations • 7 Oct 2014 • Qi Yu, Xinran He, Yan Liu
Existing group anomaly detection approaches rely on the assumption that the groups are known, which is rarely true in real-world social media applications.
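As a toy illustration of why the known-groups assumption matters, the sketch below first infers groups by clustering and only then scores them at the group level. This cluster-then-score pipeline is an assumption for illustration, not the paper's method.

```python
import numpy as np
from sklearn.cluster import KMeans

def score_inferred_groups(X: np.ndarray, n_groups: int = 5):
    """Toy pipeline (assumed, not the paper's method): infer groups by
    clustering, then flag groups whose internal spread is unusually large."""
    labels = KMeans(n_clusters=n_groups, n_init=10).fit_predict(X)
    scores = {}
    for g in range(n_groups):
        members = X[labels == g]
        # group-level anomaly score: mean distance to the group centroid
        centroid = members.mean(axis=0)
        scores[g] = float(np.linalg.norm(members - centroid, axis=1).mean())
    return labels, scores
```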