2 code implementations • 13 Jul 2022 • Aurko Roy, Rohan Anil, Guangda Lai, Benjamin Lee, Jeffrey Zhao, Shuyuan Zhang, Shibo Wang, Ye Zhang, Shen Wu, Rigel Swavely, Tao Yu, Phuong Dao, Christopher Fifty, Zhifeng Chen, Yonghui Wu
Transformer models have recently emerged as foundational models in natural language processing, and as a result there has been significant recent interest and investment in scaling them.
Ranked #5 on Language Modelling on C4
We develop a strategy, the "bottom-feeder", which estimates value by observing orders sent by other agents, and find that it limits the success of fundamentalists.
To mitigate the distribution mismatch between model data and real faces, different point sampling methods are used in the training and test phases.
The prediction model and its interpretable insights can help fundraisers better promote their campaigns and can potentially enable crowdfunding platforms to provide more timely feedback to all fundraisers.
Existing group anomaly detection approaches rely on the assumption that the groups are known, which rarely holds in real-world social media applications.