no code implementations • 18 Feb 2022 • Bingbin Liu, Daniel Hsu, Pradeep Ravikumar, Andrej Risteski
This lens is undoubtedly interesting, but it suffers from the lack of a "canonical" set of downstream tasks to focus on -- in practice, the issue is usually resolved by competing on the benchmark dataset du jour.
no code implementations • ICLR 2022 • Bingbin Liu, Elan Rosenfeld, Pradeep Ravikumar, Andrej Risteski
Noise-contrastive estimation (NCE) is a statistically consistent method for learning unnormalized probabilistic models.
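As a rough illustration of the setup (a minimal sketch only, not the paper's analysis): NCE reduces learning an unnormalized model to binary classification between data samples and samples from a known noise distribution, where the classifier's logit is the log-ratio of the unnormalized model density and the noise density. The function and variable names below are purely illustrative.

```python
import torch
import torch.nn.functional as F

def nce_loss(log_unnorm_model, log_noise, data, noise):
    """Sketch of the NCE objective.

    log_unnorm_model: callable returning the (unnormalized) model log-density.
    log_noise: callable returning the noise log-density.
    data, noise: tensors of samples drawn from the data and noise distributions.
    """
    # Logit of "this sample came from the data" is log p_model(x) - log p_noise(x).
    logit_data = log_unnorm_model(data) - log_noise(data)
    logit_noise = log_unnorm_model(noise) - log_noise(noise)
    # Label data samples as 1 and noise samples as 0.
    loss = F.binary_cross_entropy_with_logits(
        logit_data, torch.ones_like(logit_data)
    ) + F.binary_cross_entropy_with_logits(
        logit_noise, torch.zeros_like(logit_noise)
    )
    return loss
```

Minimizing this loss over the model parameters is what makes NCE a statistically consistent estimator for unnormalized models.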
no code implementations • 3 Mar 2021 • Bingbin Liu, Pradeep Ravikumar, Andrej Risteski
Contrastive learning is a family of self-supervised methods where a model is trained to solve a classification task constructed from unlabeled data.
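For concreteness, one common way to construct such a classification task from unlabeled data is an InfoNCE-style objective, sketched below (an illustrative example under standard assumptions, not necessarily the exact formulation analyzed in the paper): two augmented views of the same example form a positive pair, and the model must identify the matching view among the batch.

```python
import torch
import torch.nn.functional as F

def info_nce_loss(z1, z2, temperature=0.1):
    """z1, z2: [batch, dim] embeddings of two augmented views of the same
    unlabeled examples. Each example's positive is its other view; the
    remaining examples in the batch act as negatives."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature                    # pairwise similarities
    labels = torch.arange(z1.size(0), device=z1.device)   # matching index = "class"
    return F.cross_entropy(logits, labels)
```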
no code implementations • NeurIPS 2020 • Arun Suggala, Bingbin Liu, Pradeep Ravikumar
Through extensive empirical evaluation, we show that our learning algorithms outperform traditional additive boosting algorithms, as well as existing greedy learning techniques for DNNs.
1 code implementation • 20 Feb 2020 • Bingbin Liu, Ehsan Adeli, Zhangjie Cao, Kuan-Hui Lee, Abhijeet Shenoi, Adrien Gaidon, Juan Carlos Niebles
In addition, we introduce a new dataset designed specifically for autonomous-driving scenarios in areas with dense pedestrian populations: the Stanford-TRI Intent Prediction (STIP) dataset.
no code implementations • ECCV 2018 • Bingbin Liu, Serena Yeung, Edward Chou, De-An Huang, Li Fei-Fei, Juan Carlos Niebles
A major challenge in computer vision is scaling activity understanding to the long tail of complex activities without having to collect large quantities of data for new actions.
1 code implementation • NeurIPS 2018 • Jun-Ting Hsieh, Bingbin Liu, De-An Huang, Li Fei-Fei, Juan Carlos Niebles
Our goal is to predict future video frames given a sequence of input frames.
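A generic sketch of this prediction setup follows (purely illustrative and assumed; it is not the paper's architecture): encode the observed frames, roll a recurrent state over them, then decode future frames autoregressively.

```python
import torch
import torch.nn as nn

class FramePredictor(nn.Module):
    """Minimal recurrent frame predictor: encode past frames, run a GRU over
    the sequence, and decode the hidden state into future frames."""
    def __init__(self, frame_dim, hidden_dim=256):
        super().__init__()
        self.encode = nn.Linear(frame_dim, hidden_dim)
        self.rnn = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.decode = nn.Linear(hidden_dim, frame_dim)

    def forward(self, frames, horizon=1):
        # frames: [batch, time, frame_dim], each frame flattened to a vector.
        _, h = self.rnn(self.encode(frames))
        preds, x = [], frames[:, -1]
        for _ in range(horizon):
            # Feed the last prediction back in to roll the state forward.
            out, h = self.rnn(self.encode(x).unsqueeze(1), h)
            x = self.decode(out.squeeze(1))
            preds.append(x)
        return torch.stack(preds, dim=1)
```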