no code implementations • 18 Mar 2023 • Wenwen Tong, Jiangwei Xie, Tianyu Li, Hanming Deng, Xiangwei Geng, Ruoyi Zhou, Dingchen Yang, Bo Dai, Lewei Lu, Hongyang Li
The proposed data augmentation approach contributes to gains of 1.7% and 1.4% in detection accuracy on Waymo and nuScenes, respectively.
no code implementations • 14 Mar 2023 • Zhaoyang Lyu, Jinyi Wang, Yuwei An, Ya Zhang, Dahua Lin, Bo Dai
In this work, we design a novel sparse latent point diffusion model for mesh generation.
no code implementations • 13 Mar 2023 • Chaofan Zheng, Xinyu Lyu, Lianli Gao, Bo Dai, Jingkuan Song
Current Scene Graph Generation (SGG) methods explore contextual information to predict relationships among entity pairs.
no code implementations • 31 Jan 2023 • Yilun Du, Mengjiao Yang, Bo Dai, Hanjun Dai, Ofir Nachum, Joshua B. Tenenbaum, Dale Schuurmans, Pieter Abbeel
The proposed policy-as-video formulation can further represent environments with different state and action spaces in a unified space of images, which, for example, enables learning and generalization across a variety of robot manipulation tasks.
no code implementations • 30 Jan 2023 • Anyi Rao, Xuekun Jiang, Yuwei Guo, Linning Xu, Lei Yang, Libiao Jin, Dahua Lin, Bo Dai
Amateurs working on mini-films and short-form videos usually spend a great deal of time and effort on the complicated, multi-round process of setting and adjusting scenes, plots, and cameras to deliver satisfying video shots.
no code implementations • 16 Jan 2023 • Jincheng Mei, Wesley Chung, Valentin Thomas, Bo Dai, Csaba Szepesvari, Dale Schuurmans
Instead, the analysis reveals that the primary effect of the value baseline is to reduce the aggressiveness of the updates rather than their variance.
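As a rough illustration of the quantity at stake, the sketch below contrasts a softmax policy-gradient update with and without a value baseline on a hypothetical three-armed bandit; all numbers (arm means, seed) are made up for the example.

```python
import numpy as np

# Hypothetical 3-armed bandit: the value baseline shrinks the magnitude of
# the softmax policy-gradient update without changing its expectation.
rng = np.random.default_rng(0)
theta = np.zeros(3)                      # softmax policy parameters
true_means = np.array([1.0, 2.0, 3.0])   # assumed arm rewards

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

pi = softmax(theta)
a = rng.choice(3, p=pi)                  # sample an action
r = true_means[a] + rng.normal()         # noisy reward

grad_logpi = -pi.copy()
grad_logpi[a] += 1.0                     # gradient of log pi(a) for softmax

baseline = pi @ true_means               # value of the current policy
update_no_baseline = r * grad_logpi
update_with_baseline = (r - baseline) * grad_logpi

print(np.linalg.norm(update_no_baseline), np.linalg.norm(update_with_baseline))
```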
no code implementations • 19 Dec 2022 • Yushi Lan, Chen Change Loy, Bo Dai
The neural radiance field (NeRF) has shown promising results in preserving the fine details of objects and scenes.
no code implementations • 17 Dec 2022 • Tongzheng Ren, Chenjun Xiao, Tianjun Zhang, Na Li, Zhaoran Wang, Sujay Sanghavi, Dale Schuurmans, Bo Dai
Theoretically, we establish the sample complexity of the proposed approach in the online and offline settings.
Model-based Reinforcement Learning, Reinforcement Learning
1 code implementation • 14 Dec 2022 • Zhuoqian Yang, Shikai Li, Wayne Wu, Bo Dai
We present 3DHumanGAN, a 3D-aware generative adversarial network (GAN) that synthesizes images of full-body humans with consistent appearance under different view angles and body poses.
no code implementations • 14 Dec 2022 • Yushi Lan, Xuyi Meng, Shuai Yang, Chen Change Loy, Bo Dai
In this paper, we study the challenging problem of 3D GAN inversion where a latent code is predicted given a single face image to faithfully recover its 3D shapes and detailed textures.
no code implementations • 30 Nov 2022 • Haoran Sun, Lijun Yu, Bo Dai, Dale Schuurmans, Hanjun Dai
Score-based modeling through stochastic differential equations (SDEs) has provided a new perspective on diffusion models, and demonstrated superior performance on continuous data.
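For context, the continuous-time formulation this line of work builds on is the standard score-SDE pair below (a background sketch in the usual notation, not the paper's discrete-data extension):

```latex
\mathrm{d}x = f(x, t)\,\mathrm{d}t + g(t)\,\mathrm{d}w
\quad\text{(forward diffusion)},
\qquad
\mathrm{d}x = \bigl[f(x, t) - g(t)^2\,\nabla_x \log p_t(x)\bigr]\mathrm{d}t + g(t)\,\mathrm{d}\bar{w}
\quad\text{(reverse-time sampling)}.
```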
no code implementations • 14 Nov 2022 • Hanjun Dai, Yuan Xue, Niao He, Bethany Wang, Na Li, Dale Schuurmans, Bo Dai
In real-world decision-making, uncertainty is important yet difficult to handle.
no code implementations • 3 Nov 2022 • Jonathan N. Lee, George Tucker, Ofir Nachum, Bo Dai, Emma Brunskill
We propose the first model selection algorithm for offline RL that achieves minimax rate-optimal oracle inequalities up to logarithmic factors.
no code implementations • 22 Oct 2022 • Zikai Wei, Bo Dai, Dahua Lin
Modeling and characterizing multiple factors is perhaps the most important step in achieving excess returns over market benchmarks.
no code implementations • 17 Oct 2022 • Zikai Wei, Xinge Zhu, Bo Dai, Dahua Lin
To accurately predict trajectories in multi-agent settings, e.g., team games, it is important to effectively model the interactions among agents.
no code implementations • 17 Oct 2022 • Anyi Rao, Xuekun Jiang, Sichen Wang, Yuwei Guo, Zihao Liu, Bo Dai, Long Pang, Xiaoyu Wu, Dahua Lin, Libiao Jin
The ability to choose an appropriate camera view among multiple cameras plays a vital role in TV show delivery.
no code implementations • 20 Sep 2022 • Ceyuan Yang, Yujun Shen, Yinghao Xu, Deli Zhao, Bo Dai, Bolei Zhou
Two capacity adjusting schemes are developed for training GANs under different data regimes: i) given a sufficient amount of training data, the discriminator benefits from a progressively increased learning capacity, and ii) when the training data is limited, gradually decreasing the layer width mitigates the over-fitting issue of the discriminator.
no code implementations • 19 Aug 2022 • Tongzheng Ren, Tianjun Zhang, Lisa Lee, Joseph E. Gonzalez, Dale Schuurmans, Bo Dai
Representation learning often plays a critical role in reinforcement learning by managing the curse of dimensionality.
1 code implementation • 22 Jul 2022 • Yidi Shao, Chen Change Loy, Bo Dai
Consequently, in this paper we propose a novel Transformer-based method, dubbed as Transformer with Implicit Edges (TIE), to capture the rich semantics of particle interactions in an edge-free manner.
1 code implementation • 20 Jul 2022 • Davide Moltisanti, Jinyi Wu, Bo Dai, Chen Change Loy
Estimating human keypoints from these videos is difficult due to the complexity of the dance, as well as the recording setup with multiple moving cameras.
1 code implementation • 20 Jul 2022 • Junzhe Zhang, Daxuan Ren, Zhongang Cai, Chai Kiat Yeo, Bo Dai, Chen Change Loy
Reconstruction is achieved by searching for a latent space in the 3D GAN that best resembles the target mesh in accordance with the single view observation.
no code implementations • 14 Jul 2022 • Tianjun Zhang, Tongzheng Ren, Mengjiao Yang, Joseph E. Gonzalez, Dale Schuurmans, Bo Dai
It is common to address the curse of dimensionality in Markov decision processes (MDPs) by exploiting low-rank representations.
no code implementations • 29 Jun 2022 • Haoran Sun, Hanjun Dai, Bo Dai, Haomin Zhou, Dale Schuurmans
It is known that gradient-based MCMC samplers for continuous spaces, such as Langevin Monte Carlo (LMC), can be derived as particle versions of a gradient flow that minimizes KL divergence on a Wasserstein manifold.
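A minimal numpy sketch of the LMC sampler mentioned here, targeting a standard Gaussian whose score is simply -x; the step size and chain length are illustrative choices, not values from the paper.

```python
import numpy as np

# Unadjusted Langevin Monte Carlo on a 2D standard Gaussian target.
rng = np.random.default_rng(0)
step = 0.1
x = np.zeros(2)
samples = []
for _ in range(5000):
    grad_log_p = -x                                   # score of N(0, I)
    x = x + step * grad_log_p + np.sqrt(2 * step) * rng.normal(size=2)
    samples.append(x)
samples = np.array(samples)
print(samples.mean(axis=0), samples.var(axis=0))      # ~0 mean, ~unit variance
```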
1 code implementation • 30 May 2022 • Jinyi Wang, Zhaoyang Lyu, Dahua Lin, Bo Dai, Hongfei Fu
In this paper, we propose a novel purification approach, referred to as guided diffusion model for purification (GDMP), to help protect classifiers from adversarial attacks.
no code implementations • CVPR 2022 • Jingbo Wang, Yu Rong, Jingyuan Liu, Sijie Yan, Dahua Lin, Bo Dai
The ability to synthesize long-term human motion sequences in real-world scenes can facilitate numerous applications.
1 code implementation • 25 May 2022 • Zhaoyang Lyu, Xudong Xu, Ceyuan Yang, Dahua Lin, Bo Dai
By modeling the reverse process of gradually diffusing the data distribution into a Gaussian distribution, generating a sample in DDPMs can be regarded as iteratively denoising a randomly sampled Gaussian noise.
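Concretely, the iterative denoising referred to is the standard DDPM ancestral-sampling step (a background reminder, not the paper's specific accelerated variant):

```latex
x_{t-1} \;=\; \frac{1}{\sqrt{\alpha_t}}\Bigl(x_t - \frac{1 - \alpha_t}{\sqrt{1 - \bar{\alpha}_t}}\,\epsilon_\theta(x_t, t)\Bigr) + \sigma_t z,
\qquad z \sim \mathcal{N}(0, I),\quad x_T \sim \mathcal{N}(0, I).
```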
1 code implementation • CVPR 2022 • Yanbo Xu, Yueqin Yin, Liming Jiang, Qianyi Wu, Chengyao Zheng, Chen Change Loy, Bo Dai, Wayne Wu
In this study, we highlight the importance of interaction in a dual-space GAN for more controllable editing.
1 code implementation • CVPR 2022 • Xian Liu, Qianyi Wu, Hang Zhou, Yinghao Xu, Rui Qian, Xinyi Lin, Xiaowei Zhou, Wayne Wu, Bo Dai, Bolei Zhou
To enhance the quality of synthesized gestures, we develop a contrastive learning strategy based on audio-text alignment for better audio representations.
Ranked #2 on Gesture Generation on TED Gesture Dataset
1 code implementation • 16 Mar 2022 • Ailing Zeng, Xuan Ju, Lei Yang, Ruiyuan Gao, Xizhou Zhu, Bo Dai, Qiang Xu
This paper proposes DeciWatch, a simple baseline framework for video-based 2D/3D human pose estimation that achieves a 10x efficiency improvement over existing works without any performance degradation.
Ranked #1 on 2D Human Pose Estimation on JHMDB (2D poses only)
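The efficiency gain comes from estimating poses only on sparsely sampled frames and recovering the rest. The sketch below illustrates that sample-then-recover idea with a hypothetical `estimate_pose` stand-in and plain linear interpolation, whereas DeciWatch itself uses a learned recovery network.

```python
import numpy as np

T, K = 100, 17                        # frames, keypoints

def estimate_pose(frame_idx):
    # Placeholder for any per-frame 2D pose estimator.
    return np.random.default_rng(int(frame_idx)).normal(size=(K, 2))

stride = 10                           # "watch" only every 10th frame
sampled_idx = np.arange(0, T, stride)
sampled_poses = np.stack([estimate_pose(i) for i in sampled_idx])   # (T/10, K, 2)

all_idx = np.arange(T)
recovered = np.empty((T, K, 2))
for k in range(K):
    for d in range(2):
        recovered[:, k, d] = np.interp(all_idx, sampled_idx, sampled_poses[:, k, d])
print(recovered.shape)                # poses for every frame, ~10x fewer estimator calls
```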
no code implementations • 10 Feb 2022 • Dylan Slack, Yinlam Chow, Bo Dai, Nevan Wichers
However, we find that these techniques are not well equipped for safe policy learning because they ignore negative experiences (e.g., unsafe or unsuccessful ones) and focus only on positive experiences, which harms their ability to generalize to new tasks safely.
no code implementations • 23 Dec 2021 • Jonathan N. Lee, George Tucker, Ofir Nachum, Bo Dai
We formalize the problem in the contextual bandit setting with linear model classes by identifying three sources of error that any model selection algorithm should optimally trade-off in order to be competitive: (1) approximation error, (2) statistical complexity, and (3) coverage.
no code implementations • CVPR 2022 • Yinghao Xu, Fangyun Wei, Xiao Sun, Ceyuan Yang, Yujun Shen, Bo Dai, Bolei Zhou, Stephen Lin
Typically in recent work, the pseudo-labels are obtained by training a model on the labeled data, and then using confident predictions from the model to teach itself.
no code implementations • 10 Dec 2021 • Yuanbo Xiangli, Linning Xu, Xingang Pan, Nanxuan Zhao, Anyi Rao, Christian Theobalt, Bo Dai, Dahua Lin
The wide span of viewing positions within these scenes yields multi-scale renderings with very different levels of detail, which poses great challenges to neural radiance fields and biases them towards compromised results.
1 code implementation • 2 Dec 2021 • Chong Zhou, Chen Change Loy, Bo Dai
Contrastive Language-Image Pre-training (CLIP) has made a remarkable breakthrough in open-vocabulary zero-shot image recognition.
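For background, the zero-shot recognition mechanism in question scores an image against text prompts by embedding similarity. The sketch below is purely schematic: `encode_image` and `encode_text` are hypothetical stand-ins (random unit vectors) for the actual pretrained CLIP encoders.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_image(image):
    # Placeholder for the pretrained CLIP image encoder.
    v = rng.normal(size=512)
    return v / np.linalg.norm(v)

def encode_text(prompt):
    # Placeholder for the pretrained CLIP text encoder.
    v = rng.normal(size=512)
    return v / np.linalg.norm(v)

classes = ["a photo of a cat", "a photo of a dog", "a photo of a car"]
text_emb = np.stack([encode_text(c) for c in classes])
img_emb = encode_image("query.jpg")
scores = text_emb @ img_emb            # cosine similarity (embeddings are unit norm)
print(classes[int(np.argmax(scores))]) # zero-shot prediction
```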
no code implementations • NeurIPS 2021 • Ruoxi Sun, Hanjun Dai, Li Li, Steven Kearnes, Bo Dai
In this paper, we propose a framework that unifies sequence- and graph-based methods as energy-based models (EBMs) with different energy functions.
no code implementations • ICLR 2022 • Hanjun Dai, Yuan Xue, Zia Syed, Dale Schuurmans, Bo Dai
Stochastic dual dynamic programming (SDDP) is a state-of-the-art method for solving multi-stage stochastic optimization, widely used for modeling real-world process optimization tasks.
no code implementations • 22 Nov 2021 • Tongzheng Ren, Tianjun Zhang, Csaba Szepesvári, Bo Dai
Representation learning lies at the heart of the empirical success of deep learning for dealing with the curse of dimensionality.
2 code implementations • NeurIPS 2021 • Liming Jiang, Bo Dai, Wayne Wu, Chen Change Loy
Generative adversarial networks (GANs) typically require ample data for training in order to synthesize high-fidelity images.
1 code implementation • NeurIPS 2021 • Xudong Xu, Xingang Pan, Dahua Lin, Bo Dai
In this paper, we propose Generative Occupancy Fields (GOF), a novel model based on generative radiance fields that can learn compact object surfaces without impeding its training convergence.
1 code implementation • NeurIPS 2021 • Xingang Pan, Xudong Xu, Chen Change Loy, Christian Theobalt, Bo Dai
Motivated by the observation that a 3D object should look realistic from multiple viewpoints, these methods introduce a multi-view constraint as regularization to learn valid 3D radiance fields from 2D images.
no code implementations • NeurIPS 2021 • Jincheng Mei, Bo Dai, Chenjun Xiao, Csaba Szepesvari, Dale Schuurmans
We study the effect of stochasticity in on-policy policy optimization, and make the following four contributions.
1 code implementation • 28 Oct 2021 • Hongyu Ren, Hanjun Dai, Bo Dai, Xinyun Chen, Denny Zhou, Jure Leskovec, Dale Schuurmans
There are two important reasoning tasks on KGs: (1) single-hop knowledge graph completion, which involves predicting individual links in the KG; and (2) multi-hop reasoning, where the goal is to predict which KG entities satisfy a given logical query.
no code implementations • 29 Sep 2021 • Junzhe Zhang, Daxuan Ren, Zhongang Cai, Chai Kiat Yeo, Bo Dai, Chen Change Loy
Reconstruction is achieved by searching for a latent space in the 3D GAN that best resembles the target mesh in accordance with the single view observation.
no code implementations • ICLR 2022 • Chenjun Xiao, Bo Dai, Jincheng Mei, Oscar A Ramirez, Ramki Gummadi, Chris Harris, Dale Schuurmans
To better understand the utility of deep models in RL we present an analysis of recursive value estimation using overparameterized linear representations that provides useful, transferable findings.
no code implementations • 29 Sep 2021 • Dylan Z Slack, Yinlam Chow, Bo Dai, Nevan Wichers
Though many reinforcement learning (RL) problems involve learning policies in settings where safety constraints are difficult to specify and rewards are sparse, current methods struggle to rapidly and safely acquire successful policies.
no code implementations • 29 Sep 2021 • Yidi Shao, Chen Change Loy, Bo Dai
However, they force particles to interact with all neighbors without selection, and they fall short in capturing material semantics for different particles, leading to unsatisfactory performance, especially in generalization.
1 code implementation • NeurIPS 2021 • Hongyu Ren, Hanjun Dai, Zihang Dai, Mengjiao Yang, Jure Leskovec, Dale Schuurmans, Bo Dai
However, the key limitation of transformers is their quadratic memory and time complexity $\mathcal{O}(L^2)$ with respect to the sequence length in attention layers, which restricts application in extremely long sequences.
Ranked #2 on Language Modelling on Wiki-40B
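To make the $\mathcal{O}(L^2)$ bottleneck concrete, the toy sketch below materializes the full attention-score matrix for a single head; doubling the sequence length quadruples its memory. Shapes only, with random inputs and no learned weights.

```python
import numpy as np

L, d = 4096, 64
rng = np.random.default_rng(0)
Q = rng.normal(size=(L, d)).astype(np.float32)
K = rng.normal(size=(L, d)).astype(np.float32)
V = rng.normal(size=(L, d)).astype(np.float32)

scores = Q @ K.T / np.sqrt(d)                          # (L, L) score matrix
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)         # row-wise softmax
out = weights @ V                                      # (L, d) outputs
print(scores.nbytes / 2**20, "MiB just for the attention matrix")
```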
no code implementations • 9 Jul 2021 • Hao Sun, Ziping Xu, Meng Fang, Zhenghao Peng, Jiadong Guo, Bo Dai, Bolei Zhou
Safe exploration is crucial for the real-world application of reinforcement learning (RL).
no code implementations • 18 Jun 2021 • Chenjun Xiao, Ilbin Lee, Bo Dai, Dale Schuurmans, Csaba Szepesvari
In high-stakes applications, active experimentation may be considered too risky, and thus data are often collected passively.
1 code implementation • 3 Jun 2021 • Xiao Zhang, Dongrui Wu, Haoyi Xiong, Bo Dai
Unlike the conventional wisdom in statistical learning theory, the test error of a deep neural network (DNN) often demonstrates double descent: as the model complexity increases, it first follows a classical U-shaped curve and then shows a second descent.
no code implementations • CVPR 2021 • Jingbo Wang, Sijie Yan, Bo Dai, Dahua Lin
In this paper, we revisit human motion synthesis, a task useful in various real-world applications.
no code implementations • 13 May 2021 • Jincheng Mei, Yue Gao, Bo Dai, Csaba Szepesvari, Dale Schuurmans
Classical global convergence results for first-order methods rely on uniform smoothness and the Łojasiewicz inequality.
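As a reminder, the two standard conditions named here take their usual textbook forms below; the paper's contribution is non-uniform variants in which the constants may depend on the iterate.

```latex
f(y) \le f(x) + \langle \nabla f(x),\, y - x \rangle + \tfrac{L}{2}\|y - x\|^2
\quad \text{(uniform smoothness)},
\qquad
\tfrac{1}{2}\|\nabla f(x)\|^2 \ge \mu \bigl(f(x) - f^\ast\bigr)
\quad \text{(Łojasiewicz / PL inequality)}.
```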
3 code implementations • CVPR 2022 • Haodong Duan, Yue Zhao, Kai Chen, Dahua Lin, Bo Dai
In this work, we propose PoseC3D, a new approach to skeleton-based action recognition, which relies on a 3D heatmap stack instead of a graph sequence as the base representation of human skeletons.
Ranked #1 on Skeleton Based Action Recognition on NTU RGB+D
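The 3D heatmap stack mentioned here can be pictured as one Gaussian per joint per frame, stacked over time. The sketch below builds such a volume from random keypoints; the resolution and sigma are illustrative, not the paper's settings.

```python
import numpy as np

T, K, H, W, sigma = 8, 17, 56, 56, 2.0
rng = np.random.default_rng(0)
keypoints = rng.uniform([0, 0], [W, H], size=(T, K, 2))   # (x, y) per joint per frame

ys, xs = np.mgrid[0:H, 0:W]
volume = np.zeros((T, K, H, W), dtype=np.float32)
for t in range(T):
    for k in range(K):
        x0, y0 = keypoints[t, k]
        # Gaussian heatmap centered at the joint location.
        volume[t, k] = np.exp(-((xs - x0) ** 2 + (ys - y0) ** 2) / (2 * sigma ** 2))
print(volume.shape)   # (T, K, H, W) volume fed to a 3D-CNN instead of a joint graph
```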
no code implementations • CVPR 2021 • Junzhe Zhang, Xinyi Chen, Zhongang Cai, Liang Pan, Haiyu Zhao, Shuai Yi, Chai Kiat Yeo, Bo Dai, Chen Change Loy
In contrast to previous fully supervised approaches, in this paper we present ShapeInversion, which introduces Generative Adversarial Network (GAN) inversion to shape completion for the first time.
no code implementations • CVPR 2021 • Xudong Xu, Hang Zhou, Ziwei Liu, Bo Dai, Xiaogang Wang, Dahua Lin
Moreover, combined with binaural recordings, our method is able to further boost the performance of binaural audio generation under supervised settings.
no code implementations • 6 Apr 2021 • Chenjun Xiao, Yifan Wu, Tor Lattimore, Bo Dai, Jincheng Mei, Lihong Li, Csaba Szepesvari, Dale Schuurmans
First, we introduce a class of confidence-adjusted index algorithms that unifies optimistic and pessimistic principles in a common framework, which enables a general analysis.
no code implementations • NeurIPS 2021 • Tongzheng Ren, Jialian Li, Bo Dai, Simon S. Du, Sujay Sanghavi
To the best of our knowledge, these are the \emph{first} set of nearly horizon-free bounds for episodic time-homogeneous offline tabular MDP and linear MDP with anchor points.
no code implementations • 14 Mar 2021 • Zhiqiang Hu, Roy Ka-Wei Lee, Lei Wang, Ee-Peng Lim, Bo Dai
Authorship attribution (AA), which is the task of finding the owner of a given text, is an important and widely studied research topic with many applications.
no code implementations • NeurIPS 2020 • Zhuangdi Zhu, Kaixiang Lin, Bo Dai, Jiayu Zhou
To further accelerate the learning procedure, we regulate the policy update with an inverse action model, which assists distribution matching from the perspective of mode-covering.
1 code implementation • EMNLP 2021 • Haoming Jiang, Bo Dai, Mengjiao Yang, Tuo Zhao, Wei Wei
An ideal environment for evaluating dialog systems, also known as the Turing test, needs to involve human interaction, which is usually not affordable for large-scale experiments.
no code implementations • ICCV 2021 • Linning Xu, Yuanbo Xiangli, Anyi Rao, Nanxuan Zhao, Bo Dai, Ziwei Liu, Dahua Lin
City modeling is the foundation for computational urban planning, navigation, and entertainment.
no code implementations • 1 Jan 2021 • Hao Sun, Ziping Xu, Meng Fang, Yuhang Song, Jiechao Xiong, Bo Dai, Zhengyou Zhang, Bolei Zhou
Despite the remarkable progress made by the policy gradient algorithms in reinforcement learning (RL), sub-optimal policies usually result from the local exploration property of the policy gradient update.
no code implementations • 24 Dec 2020 • Xiyu Yan, Xun Chen, Yu Chen, Bo Dai, Heng Lin, Tao Li, Ke Han, Kaixiang Ni, Fusang Wang, Shaobo Wang, Qibin Zheng, Xinning Zeng
The PandaX-III experiment uses a high-pressure gaseous time projection chamber to search for the neutrinoless double beta decay of $^{136}$Xe.
Anomaly Detection, High Energy Physics - Experiment, Instrumentation and Detectors
1 code implementation • ICCV 2021 • Liming Jiang, Bo Dai, Wayne Wu, Chen Change Loy
In this study, we show that narrowing gaps in the frequency domain can further improve image reconstruction and synthesis quality.
1 code implementation • 12 Dec 2020 • Mengjiao Yang, Bo Dai, Ofir Nachum, George Tucker, Dale Schuurmans
More importantly, we show how the belief distribution estimated by BayesDICE may be used to rank policies with respect to any arbitrary downstream policy selection metric, and we empirically demonstrate that this selection procedure significantly outperforms existing approaches, such as ranking policies according to mean or high-confidence lower bound value estimates.
no code implementations • NeurIPS 2020 • Jincheng Mei, Chenjun Xiao, Bo Dai, Lihong Li, Csaba Szepesvari, Dale Schuurmans
Both findings are based on an analysis of convergence rates using the Non-uniform Łojasiewicz (NŁ) inequalities.
no code implementations • NeurIPS 2020 • Yujia Xie, Hanjun Dai, Minshuo Chen, Bo Dai, Tuo Zhao, Hongyuan Zha, Wei Wei, Tomas Pfister
Finding the k largest or smallest elements from a collection of scores, i.e., the top-k operation, is an important model component widely used in information retrieval, machine learning, and data mining.
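For reference, the hard top-k operation being smoothed is the familiar one below; the paper's contribution is a differentiable surrogate, which this sketch does not attempt to reproduce.

```python
import numpy as np

scores = np.array([0.3, 2.1, -0.5, 1.7, 0.9])
k = 2
topk_idx = np.argpartition(scores, -k)[-k:]           # indices of the k largest (unordered)
topk_idx = topk_idx[np.argsort(scores[topk_idx])[::-1]]  # sort them by score, descending
print(topk_idx, scores[topk_idx])                     # [1 3] [2.1 1.7]
```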
no code implementations • NeurIPS 2020 • Luofeng Liao, You-Lin Chen, Zhuoran Yang, Bo Dai, Mladen Kolar, Zhaoran Wang
We study estimation in a class of generalized SEMs where the object of interest is defined as the solution to a linear operator equation.
no code implementations • NeurIPS 2020 • Hanjun Dai, Rishabh Singh, Bo Dai, Charles Sutton, Dale Schuurmans
In this paper we propose ALOE, a new algorithm for learning conditional and unconditional EBMs for discrete structured data, where parameter gradients are estimated using a learned sampler that mimics local search.
1 code implementation • ICLR 2021 • Xingang Pan, Bo Dai, Ziwei Liu, Chen Change Loy, Ping Luo
Through our investigation, we found that such a pre-trained GAN indeed contains rich 3D knowledge and thus can be used to recover 3D shape from a single 2D image in an unsupervised manner.
1 code implementation • EMNLP 2020 • Yuyang Nie, Yuanhe Tian, Xiang Wan, Yan Song, Bo Dai
In particular, we obtain the augmented semantic information from a large-scale corpus, and propose an attentive semantic augmentation module and a gate module to encode and aggregate such information, respectively.
Ranked #4 on Chinese Named Entity Recognition on Weibo NER
no code implementations • NeurIPS 2020 • Bo Dai, Ofir Nachum, Yinlam Chow, Lihong Li, Csaba Szepesvári, Dale Schuurmans
We study high-confidence behavior-agnostic off-policy evaluation in reinforcement learning, where the goal is to estimate a confidence interval on a target policy's value, given only access to a static experience dataset collected by unknown behavior policies.
no code implementations • NeurIPS Workshop LMCA 2020 • Yujia Xie, Hanjun Dai, Minshuo Chen, Bo Dai, Tuo Zhao, Hongyuan Zha, Wei Wei, Tomas Pfister
The top-$k$ operation, i.e., finding the $k$ largest or smallest elements from a collection of scores, is an important model component, which is widely used in information retrieval, machine learning, and data mining.
no code implementations • 13 Aug 2020 • Yuyan Wang, Zhe Zhao, Bo Dai, Christopher Fifty, Dong Lin, Lichan Hong, Ed H. Chi
A delicate balance between multi-task generalization and multi-objective optimization is therefore needed for finding a better trade-off between efficiency and generalization.
no code implementations • 14 Jul 2020 • Ruoxi Sun, Hanjun Dai, Li Li, Steven Kearnes, Bo Dai
Retrosynthesis -- the process of identifying a set of reactants to synthesize a target molecule -- is of vital importance to material design and drug discovery.
Ranked #1 on Single-step retrosynthesis on USPTO-50k
no code implementations • NeurIPS 2020 • Mengjiao Yang, Ofir Nachum, Bo Dai, Lihong Li, Dale Schuurmans
The recently proposed distribution correction estimation (DICE) family of estimators has advanced the state of the art in off-policy evaluation from behavior-agnostic data.
no code implementations • 2 Jul 2020 • Luofeng Liao, You-Lin Chen, Zhuoran Yang, Bo Dai, Zhaoran Wang, Mladen Kolar
We study estimation in a class of generalized SEMs where the object of interest is defined as the solution to a linear operator equation.
1 code implementation • 29 Jun 2020 • Yinghao Xu, Ceyuan Yang, Ziwei Liu, Bo Dai, Bolei Zhou
Recent attempts for unsupervised landmark learning leverage synthesized image pairs that are similar in appearance but different in poses.
1 code implementation • ICML 2020 • Hanjun Dai, Azade Nazi, Yujia Li, Bo Dai, Dale Schuurmans
Based on this, we develop a novel autoregressive model, named BiGG, that utilizes this sparsity to avoid generating the full adjacency matrix, and importantly reduces the graph generation time complexity to $O((n + m)\log n)$.
1 code implementation • 28 Jun 2020 • Ceyuan Yang, Yinghao Xu, Bo Dai, Bolei Zhou
Visual tempo, which describes how fast an action goes, has shown its potential in supervised action recognition.
no code implementations • 11 Jun 2020 • Hao Sun, Ziping Xu, Yuhang Song, Meng Fang, Jiechao Xiong, Bo Dai, Bolei Zhou
However, PG algorithms rely on exploiting the learned value function through local first-order updates, which results in limited sample efficiency.
1 code implementation • 21 May 2020 • Hao Sun, Zhenghao Peng, Bo Dai, Jian Guo, Dahua Lin, Bolei Zhou
In problem-solving, we humans can come up with multiple novel solutions to the same problem.
no code implementations • CVPR 2020 • Dian Shao, Yue Zhao, Bo Dai, Dahua Lin
Current methods for action recognition primarily rely on deep convolutional networks to derive feature embeddings of visual and motion features.
2 code implementations • 27 Apr 2020 • Hao Sun, Xinyu Pan, Bo Dai, Dahua Lin, Bolei Zhou
Solving the Goal-Conditioned Reward Sparse (GCRS) task is a challenging reinforcement learning problem due to the sparsity of reward signals.
no code implementations • CVPR 2020 • Dian Shao, Yue Zhao, Bo Dai, Dahua Lin
To take action recognition to a new level, we develop FineGym, a new dataset built on top of gymnastic videos.
3 code implementations • CVPR 2020 • Ceyuan Yang, Yinghao Xu, Jianping Shi, Bo Dai, Bolei Zhou
Previous works often capture the visual tempo through sampling raw videos at multiple rates and constructing an input-level frame pyramid, which usually requires a costly multi-branch network to handle.
Ranked #87 on Action Recognition on Something-Something V2
2 code implementations • CVPR 2020 • Xiaohang Zhan, Xingang Pan, Bo Dai, Ziwei Liu, Dahua Lin, Chen Change Loy
This is achieved via Partial Completion Network (PCNet)-mask (M) and -content (C), which learn to recover fractions of object masks and contents, respectively, in a self-supervised manner.
1 code implementation • 1 Apr 2020 • Zhuangdi Zhu, Kaixiang Lin, Bo Dai, Jiayu Zhou
SAIL bridges the advantages of IL and RL to reduce the sample complexity substantially, by effectively exploiting suboptimal demonstrations and efficiently exploring the environment to surpass the demonstrated performance.
1 code implementation • ECCV 2020 • Xingang Pan, Xiaohang Zhan, Bo Dai, Dahua Lin, Chen Change Loy, Ping Luo
Learning a good image prior is a long-term goal for image restoration and manipulation.
1 code implementation • ICML 2020 • Mengjiao Yang, Bo Dai, Hanjun Dai, Dale Schuurmans
Recently there has been growing interest in modeling sets with exchangeability such as point clouds.
1 code implementation • ICML 2020 • Junfeng Wen, Bo Dai, Lihong Li, Dale Schuurmans
We consider the problem of approximating the stationary distribution of an ergodic Markov chain given a set of sampled transitions.
1 code implementation • ICLR 2020 • Ruiyi Zhang, Bo Dai, Lihong Li, Dale Schuurmans
An important problem that arises in reinforcement learning and Monte Carlo methods is estimating quantities defined by the stationary distribution of a Markov chain.
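As background for the quantity being estimated, the sketch below computes the stationary distribution of a small, fully known ergodic chain by power iteration; the paper's setting is harder because only sampled transitions are available.

```python
import numpy as np

# Row-stochastic transition matrix of a 3-state ergodic Markov chain.
P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.7, 0.1],
              [0.1, 0.3, 0.6]])
mu = np.ones(3) / 3
for _ in range(1000):
    mu = mu @ P                       # repeatedly push the distribution through P
print(mu, np.allclose(mu, mu @ P))    # fixed point: mu P = mu
```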
no code implementations • 16 Feb 2020 • Yujia Xie, Hanjun Dai, Minshuo Chen, Bo Dai, Tuo Zhao, Hongyuan Zha, Wei Wei, Tomas Pfister
The top-k operation, i.e., finding the k largest or smallest elements from a collection of scores, is an important model component, which is widely used in information retrieval, machine learning, and data mining.
2 code implementations • ICLR 2020 • Yuanbo Xiangli, Yubin Deng, Bo Dai, Chen Change Loy, Dahua Lin
While generative adversarial networks (GAN) have been widely adopted in various topics, in this paper we generalize the standard GAN to a new perspective by treating realness as a random variable that can be estimated from multiple angles.
1 code implementation • 7 Jan 2020 • Ofir Nachum, Bo Dai
We review basic concepts of convex duality, focusing on the very general and supremely useful Fenchel-Rockafellar duality.
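The two identities that review centers on are the standard ones below (usual textbook statements, included as a pointer rather than the note's exact presentation):

```latex
f^\ast(y) = \sup_{x}\ \langle x, y \rangle - f(x)
\quad \text{(Fenchel conjugate)},
\qquad
\min_{x}\ f(x) + g(Ax) = \max_{y}\ -f^\ast(A^{\top} y) - g^\ast(-y)
\quad \text{(Fenchel--Rockafellar duality, under suitable regularity conditions)}.
```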
1 code implementation • NeurIPS 2019 • Hanjun Dai, Chengtao Li, Connor W. Coley, Bo Dai, Le Song
Retrosynthesis is one of the fundamental problems in organic chemistry.
Ranked #8 on Single-step retrosynthesis on USPTO-50k
no code implementations • 4 Dec 2019 • Ofir Nachum, Bo Dai, Ilya Kostrikov, Yin-Lam Chow, Lihong Li, Dale Schuurmans
In many real-world applications of reinforcement learning (RL), interactions with the environment are limited due to cost or feasibility.
no code implementations • 3 Dec 2019 • Patrick H. Chen, Wei Wei, Cho-Jui Hsieh, Bo Dai
In this paper, we propose a new method to overcome catastrophic forgetting by adding generative regularization to Bayesian inference framework.
1 code implementation • NeurIPS 2019 • Dieterich Lawson, George Tucker, Bo Dai, Rajesh Ranganath
Motivated by this, we consider the sampler-induced distribution as the model of interest and maximize the likelihood of this model.
no code implementations • 25 Sep 2019 • Hao Sun, Bo Dai, Jiankai Sun, Zhenghao Peng, Guodong Xu, Dahua Lin, Bolei Zhou
In this work, we incorporate social influence into the reinforcement learning scheme, enabling the agents to learn both from the environment and from their peers.
no code implementations • ICCV 2019 • Xudong Xu, Bo Dai, Dahua Lin
Sounds provide rich semantics, complementary to visual data, for many tasks.
2 code implementations • NeurIPS 2019 • Ofir Nachum, Yin-Lam Chow, Bo Dai, Lihong Li
In contrast to previous approaches, our algorithm is agnostic to knowledge of the behavior policy (or policies) used to generate the dataset.
no code implementations • NeurIPS 2018 • Harsh Shrivastava, Eugene Bart, Bob Price, Hanjun Dai, Bo Dai, Srinivas Aluru
We propose a new approach, called cooperative neural networks (CoNN), which uses a set of cooperatively trained neural networks to capture latent representations that exploit prior given independence structure.
1 code implementation • NeurIPS 2019 • Bo Dai, Zhen Liu, Hanjun Dai, Niao He, Arthur Gretton, Le Song, Dale Schuurmans
We present an efficient algorithm for maximum likelihood estimation (MLE) of exponential family models, with a general parametrization of the energy function that includes neural networks.
2 code implementations • ICLR 2019 • Hongyang Li, Bo Dai, Shaoshuai Shi, Wanli Ouyang, Xiaogang Wang
We argue that the reliable set could guide the feature learning of the less reliable set during training - in spirit of student mimicking teacher behavior and thus pushing towards a more compact class centroid in the feature space.
Ranked #147 on Object Detection on COCO test-dev
no code implementations • ICLR Workshop DeepGenStruct 2019 • Dieterich Lawson, George Tucker, Bo Dai, Rajesh Ranganath
The success of enriching the variational family with auxiliary latent variables motivates applying the same techniques to the generative model.
no code implementations • ICLR Workshop DeepGenStruct 2019 • Zhehui Chen, Haoming Jiang, Yuyang Shi, Bo Dai, Tuo Zhao
From the perspective of generative learning, our proposed method can be viewed as learning a deep generative model for generating adversarial samples, which is adaptive to the robust classification.
1 code implementation • ICLR 2020 • Binghong Chen, Bo Dai, Qinjie Lin, Guo Ye, Han Liu, Le Song
We propose a meta path planning algorithm named \emph{Neural Exploration-Exploitation Trees~(NEXT)} for learning from prior experience for solving new path planning problems in high dimensional continuous state and action spaces.
1 code implementation • NeurIPS 2019 • Albert Shaw, Wei Wei, Weiyang Liu, Le Song, Bo Dai
Neural Architecture Search (NAS) has been quite successful in constructing state-of-the-art models on a variety of tasks.
1 code implementation • NeurIPS 2018 • Bo Dai, Hanjun Dai, Niao He, Weiyang Liu, Zhen Liu, Jianshu Chen, Lin Xiao, Le Song
This flexible function class couples the variational distribution with the original parameters in the graphical models, allowing end-to-end learning of the graphical models by back-propagation through the variational distribution.
no code implementations • NeurIPS 2018 • Yingxiang Yang, Bo Dai, Negar Kiyavash, Niao He
Approximate Bayesian computation (ABC) is an important methodology for Bayesian inference when the likelihood function is intractable.
1 code implementation • 6 Nov 2018 • Bo Dai, Hanjun Dai, Arthur Gretton, Le Song, Dale Schuurmans, Niao He
We investigate penalized maximum log-likelihood estimation for exponential family distributions whose natural parameter resides in a reproducing kernel Hilbert space.
no code implementations • 3 Nov 2018 • Haoming Jiang, Zhehui Chen, Yuyang Shi, Bo Dai, Tuo Zhao
Adversarial training provides a principled approach for training robust neural networks.
1 code implementation • NeurIPS 2018 • Bo Dai, Sanja Fidler, Dahua Lin
Mainstream captioning models often follow a sequential structure to generate captions, leading to issues such as introduction of irrelevant semantics, lack of diversity in the generated captions, and inadequate generalization performance.
2 code implementations • ECCV 2018 • Hongyang Li, Xiaoyang Guo, Bo Dai, Wanli Ouyang, Xiaogang Wang
Motivated by the routing to make higher capsule have agreement with lower capsule, we extend the mechanism as a compensation for the rapid loss of information in nearby layers.
no code implementations • ECCV 2018 • Yilei Xiong, Bo Dai, Dahua Lin
We present an efficient framework that can generate a coherent paragraph to describe a given video.
no code implementations • ECCV 2018 • Bo Dai, Deming Ye, Dahua Lin
Taking advantage of this, we visually reveal the internal dynamics in the process of caption generation, as well as the connections between input visual domain and output linguistic domain.
no code implementations • 22 Jul 2018 • Yisen Wang, Bo Dai, Lingkai Kong, Sarah Monazam Erfani, James Bailey, Hongyuan Zha
Learning nonlinear dynamics from diffusion data is a challenging problem since the individuals observed may be different at different time points, generally following an aggregate behaviour.
no code implementations • ICML 2018 • Hanjun Dai, Zornitsa Kozareva, Bo Dai, Alex Smola, Le Song
Many graph analytics problems can be solved via iterative algorithms where the solutions are often characterized by a set of steady-state conditions.
4 code implementations • NeurIPS 2018 • Weiyang Liu, Rongmei Lin, Zhen Liu, Lixin Liu, Zhiding Yu, Bo Dai, Le Song
In light of this intuition, we reduce the redundancy regularization problem to generic energy minimization, and propose a minimum hyperspherical energy (MHE) objective as generic regularization for neural networks.
1 code implementation • CVPR 2018 • Weiyang Liu, Zhen Liu, Zhiding Yu, Bo Dai, Rongmei Lin, Yisen Wang, James M. Rehg, Le Song
Inner product-based convolution has been a central component of convolutional neural networks (CNNs) and the key to learning visual representations.
1 code implementation • ICLR 2018 • Hanjun Dai, Yingtao Tian, Bo Dai, Steven Skiena, Le Song
Deep generative models have been enjoying success in modeling continuous data.
no code implementations • ICML 2018 • Bo Dai, Albert Shaw, Lihong Li, Lin Xiao, Niao He, Zhen Liu, Jianshu Chen, Le Song
When function approximation is used, solving the Bellman optimality equation with stability guarantees has remained a major open problem in reinforcement learning for decades.
no code implementations • ICLR 2018 • Bo Dai, Albert Shaw, Niao He, Lihong Li, Le Song
This paper proposes a new actor-critic-style algorithm called Dual Actor-Critic or Dual-AC.
no code implementations • NeurIPS 2017 • Weiyang Liu, Yan-Ming Zhang, Xingguo Li, Zhiding Yu, Bo Dai, Tuo Zhao, Le Song
In light of such challenges, we propose hyperspherical convolution (SphereConv), a novel learning framework that gives angular representations on hyperspheres.
no code implementations • ICML 2018 • Weiyang Liu, Bo Dai, Xingguo Li, Zhen Liu, James M. Rehg, Le Song
We propose an active teacher model that can actively query the learner (i.e., make the learner take exams) to estimate the learner's status and provably guide the learner to achieve faster convergence.
no code implementations • NeurIPS 2017 • Bo Dai, Dahua Lin
Specifically, via two constraints formulated on top of a reference model, the proposed method can encourage distinctiveness, while maintaining the overall quality of the generated captions.
2 code implementations • ICML 2017 • Weiyang Liu, Bo Dai, Ahmad Humayun, Charlene Tay, Chen Yu, Linda B. Smith, James M. Rehg, Le Song
Different from traditional machine teaching which views the learners as batch algorithms, we study a new paradigm where the learner uses an iterative algorithm and a teacher can feed examples sequentially and intelligently based on the current performance of the learner.
1 code implementation • CVPR 2017 • Bo Dai, Yuqi Zhang, Dahua Lin
Relationships among objects play a crucial role in image understanding.
1 code implementation • ICCV 2017 • Bo Dai, Sanja Fidler, Raquel Urtasun, Dahua Lin
Despite the substantial progress in recent years, image captioning techniques are still far from perfect. Sentences produced by existing methods, e.g., those based on RNNs, are often overly rigid and lacking in variability.
2 code implementations • ICML 2017 • Bo Dai, Ruiqi Guo, Sanjiv Kumar, Niao He, Le Song
Learning-based binary hashing has become a powerful paradigm for fast search and retrieval in massive databases.
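For orientation, the sketch below shows the learning-free baseline that learned hashing improves upon: random sign projections (LSH-style) followed by Hamming-distance retrieval. All sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d, bits, n = 128, 32, 10000
database = rng.normal(size=(n, d))
W = rng.normal(size=(d, bits))                  # random projection matrix

def hash_codes(X):
    # Sign of the projection -> binary codes in {0, 1}.
    return (X @ W > 0).astype(np.uint8)

db_codes = hash_codes(database)
query = rng.normal(size=(1, d))
q_code = hash_codes(query)
hamming = (db_codes != q_code).sum(axis=1)      # distance in code space
print(np.argsort(hamming)[:5])                  # candidate nearest neighbors
```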
no code implementations • 15 Jul 2016 • Bo Dai, Niao He, Yunpeng Pan, Byron Boots, Le Song
In such problems, each sample $x$ itself is associated with a conditional distribution $p(z|x)$ represented by samples $\{z_i\}_{i=1}^M$, and the goal is to learn a function $f$ that links these conditional distributions to target values $y$.
1 code implementation • 17 Mar 2016 • Hanjun Dai, Bo Dai, Le Song
Kernel classifiers and regressors designed for structured data, such as sequences, trees and graphs, have significantly advanced a number of interdisciplinary areas such as computational biology and drug design.
no code implementations • 9 Jun 2015 • Bo Dai, Niao He, Hanjun Dai, Le Song
Bayesian methods are appealing in their flexibility in modeling complex data and ability in capturing uncertainty in parameters.
1 code implementation • NeurIPS 2014 • Bo Dai, Bo Xie, Niao He, YIngyu Liang, Anant Raj, Maria-Florina Balcan, Le Song
The general perception is that kernel methods are not scalable, and neural nets are the methods of choice for nonlinear learning problems.
no code implementations • 3 Feb 2014 • Gang Niu, Bo Dai, Marthinus Christoffel du Plessis, Masashi Sugiyama
Given a hypothesis space, the large volume principle by Vladimir Vapnik prioritizes equivalence classes according to their volume in the hypothesis space.
no code implementations • NeurIPS 2013 • Le Song, Bo Dai
Kernel embedding of distributions has led to many recent advances in machine learning.
no code implementations • 13 Nov 2013 • Le Song, Animashree Anandkumar, Bo Dai, Bo Xie
We establish that the sample complexity for the proposed method is quadratic in the number of latent components and is a low order polynomial in the other relevant parameters.