Search Results for author: Shuhan Tan

Found 7 papers, 2 papers with code

Language Conditioned Traffic Generation

1 code implementation · 16 Jul 2023 · Shuhan Tan, Boris Ivanovic, Xinshuo Weng, Marco Pavone, Philipp Kraehenbuehl

In this work, we turn to language as a source of supervision for dynamic traffic scene generation.

Tasks: Language Modelling, Large Language Model, +1

SceneGen: Learning to Generate Realistic Traffic Scenes

No code implementations · CVPR 2021 · Shuhan Tan, Kelvin Wong, Shenlong Wang, Sivabalan Manivasagam, Mengye Ren, Raquel Urtasun

Existing methods typically insert actors into the scene according to a set of hand-crafted heuristics, and are limited in their ability to model the true complexity and diversity of real traffic scenes, inducing a content gap between synthesized and real traffic scenes.

Improving the Fairness of Deep Generative Models without Retraining

1 code implementation · 9 Dec 2020 · Shuhan Tan, Yujun Shen, Bolei Zhou

Generative Adversarial Networks (GANs) advance face synthesis by learning the underlying distribution of observed data.

Tasks: Face Generation, Face Recognition, +2

LiDARsim: Realistic LiDAR Simulation by Leveraging the Real World

No code implementations · CVPR 2020 · Sivabalan Manivasagam, Shenlong Wang, Kelvin Wong, Wenyuan Zeng, Mikita Sazanovich, Shuhan Tan, Bin Yang, Wei-Chiu Ma, Raquel Urtasun

We first utilize ray casting over the 3D scene, then use a deep neural network to produce deviations from the physics-based simulation, yielding realistic LiDAR point clouds.
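The two-stage idea in the snippet above can be sketched as follows. This is a toy illustration, not the paper's implementation: the ground-plane ray cast stands in for the physics-based simulation, and a fixed linear map stands in for the learned deviation network.

```python
import numpy as np

def raycast_depths(ray_dirs, plane_z=-2.0):
    """Toy physics step: intersect downward rays from a sensor at the
    origin with a flat ground plane at z = plane_z."""
    t = plane_z / ray_dirs[:, 2]  # distance along each ray to the plane
    return t                      # ideal (noise-free) depths

def learned_deviation(depths, weights):
    """Stand-in for the deep network: a fixed linear map on per-ray
    features, predicting a depth correction toward realistic returns."""
    feats = np.stack([depths, np.ones_like(depths)], axis=1)  # (N, 2)
    return feats @ weights                                    # (N,)

rng = np.random.default_rng(0)
# 100 random unit ray directions that point downward (negative z)
dirs = np.stack([rng.normal(size=100), rng.normal(size=100),
                 -np.abs(rng.normal(size=100)) - 0.5], axis=1)
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)

ideal = raycast_depths(dirs)
realistic = ideal + learned_deviation(ideal, weights=np.array([0.01, -0.05]))
```

The key design point the snippet describes is that the network only predicts residual deviations from physics, rather than point clouds from scratch.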

Generalized Domain Adaptation with Covariate and Label Shift CO-ALignment

No code implementations · 25 Sep 2019 · Shuhan Tan, Xingchao Peng, Kate Saenko

In this paper, we explore the task of Generalized Domain Adaptation (GDA): how to transfer knowledge across domains in the presence of both covariate and label shift?

Tasks: Domain Adaptation, Transfer Learning
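The two shifts named in the abstract can be made concrete with a toy example (an assumed formulation, not the paper's algorithm): covariate shift changes p(x), label shift changes p(y), and the label-shift part can be corrected by reweighting source samples with the class-prior ratio p_t(y)/p_s(y).

```python
import numpy as np

rng = np.random.default_rng(1)

# Source domain: balanced classes, class-conditional features at 0 / 2
y_s = rng.integers(0, 2, size=1000)
x_s = rng.normal(loc=2.0 * y_s, scale=1.0)

# Target domain: imbalanced classes (label shift) and shifted
# features (covariate shift)
y_t = (rng.random(size=1000) < 0.8).astype(int)  # p_t(y=1) = 0.8 vs 0.5
x_t = rng.normal(loc=2.0 * y_t + 0.5, scale=1.0)

# Label-shift correction: importance weights w(y) = p_t(y) / p_s(y)
p_s = np.bincount(y_s, minlength=2) / len(y_s)
p_t = np.array([0.2, 0.8])  # assume target priors are known/estimated
w_label = p_t[y_s] / p_s[y_s]

# After reweighting, the effective source class balance matches the target
reweighted_prior = np.sum(w_label * (y_s == 1)) / np.sum(w_label)
```

Correcting covariate shift would additionally require a density ratio p_t(x)/p_s(x); handling both shifts jointly is the setting the paper calls GDA.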

Weakly Supervised Open-set Domain Adaptation by Dual-domain Collaboration

No code implementations · CVPR 2019 · Shuhan Tan, Jiening Jiao, Wei-Shi Zheng

Thus, it is meaningful to let partially labeled domains learn from each other to classify all the unlabeled samples in each domain under an open-set setting.

Tasks: Domain Adaptation, Transfer Learning
