1 code implementation • 16 Oct 2024 • Jiaqi Han, Minkai Xu, Aaron Lou, Haotian Ye, Stefano Ermon
In this work, we propose geometric trajectory diffusion models (GeoTDM), the first diffusion model for modeling the temporal distribution of 3D geometric trajectories.
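The key technical move is to treat the whole trajectory, rather than a single frame, as the variable being diffused. A minimal sketch of the forward (noising) process under that view, using a standard DDPM-style linear schedule; the shapes and names below are illustrative, not GeoTDM's actual code:

```python
import torch

# Illustrative linear beta schedule (not GeoTDM's actual schedule).
T = 1000
betas = torch.linspace(1e-4, 2e-2, T)
alphas_cumprod = torch.cumprod(1.0 - betas, dim=0)

def noise_trajectory(traj, t):
    """Forward-noise an entire trajectory tensor of shape
    (num_frames, num_nodes, 3) at diffusion step t."""
    a = alphas_cumprod[t]
    eps = torch.randn_like(traj)
    return a.sqrt() * traj + (1.0 - a).sqrt() * eps, eps

traj = torch.randn(20, 5, 3)  # 20 frames of 5 particles in 3D
noisy, eps = noise_trajectory(traj, t=500)
```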
1 code implementation • 19 Jan 2024 • Minkai Xu, Jiaqi Han, Aaron Lou, Jean Kossaifi, Arvind Ramanathan, Kamyar Azizzadenesheli, Jure Leskovec, Stefano Ermon, Anima Anandkumar
Comprehensive experiments in multiple domains, including particle simulations, human motion capture, and molecular dynamics, demonstrate that EGNO significantly outperforms existing methods, thanks to its equivariant temporal modeling.
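Equivariance here means that rotating (or translating) the input trajectory rotates the output identically, f(Rx) = R f(x). A numerical check of that property on a toy coordinate update (not the actual EGNO operator):

```python
import numpy as np

def toy_equivariant_update(x, h):
    """Toy E(3)-equivariant update: move each point toward the centroid,
    weighted by its (rotation-invariant) distance from the centroid."""
    centroid = x.mean(axis=0, keepdims=True)
    dist = np.linalg.norm(x - centroid, axis=1, keepdims=True)
    return x + h * dist * (centroid - x)

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))   # random orthogonal matrix

rotate_then_update = toy_equivariant_update(x @ Q.T, 0.1)
update_then_rotate = toy_equivariant_update(x, 0.1) @ Q.T
assert np.allclose(rotate_then_update, update_then_rotate)  # f(Rx) = R f(x)
```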
1 code implementation • CVPR 2024 • Bram Wallace, Meihua Dang, Rafael Rafailov, Linqi Zhou, Aaron Lou, Senthil Purushwalkam, Stefano Ermon, Caiming Xiong, Shafiq Joty, Nikhil Naik
Large language models (LLMs) are fine-tuned on human comparison data with Reinforcement Learning from Human Feedback (RLHF) to better align them with users' preferences.
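This work adapts direct preference optimization to diffusion models. For orientation, a sketch of the standard DPO objective on (preferred, dispreferred) pairs; in the diffusion setting the exact log-likelihoods below are replaced by a denoising-based bound, which this sketch omits:

```python
import torch
import torch.nn.functional as F

def dpo_loss(logp_w, logp_l, ref_logp_w, ref_logp_l, beta=0.1):
    """Standard DPO objective.

    logp_w, logp_l         : policy log-probs of preferred / dispreferred samples
    ref_logp_w, ref_logp_l : same quantities under the frozen reference model
    """
    margin = (logp_w - ref_logp_w) - (logp_l - ref_logp_l)
    return -F.logsigmoid(beta * margin).mean()
```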
2 code implementations • 25 Oct 2023 • Aaron Lou, Chenlin Meng, Stefano Ermon
Experimentally, we test our Score Entropy Discrete Diffusion models (SEDD) on standard language modeling tasks.
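SEDD models ratios of the data distribution, p_t(y)/p_t(x), and trains them with a score-entropy loss. A hedged sketch of that loss assuming the true ratios `a` are handed to us (the paper actually trains a denoising form in which the ratios are implied by the forward process):

```python
import torch

def score_entropy(s, a):
    """Score entropy between predicted ratios s > 0 and true ratios a >= 0,
    summed over candidate transitions y != x. The constant a*(log a - 1)
    makes the loss nonnegative and exactly zero at s = a."""
    log_s = s.clamp_min(1e-12).log()
    const = a * (a.clamp_min(1e-12).log() - 1.0)
    return (s - a * log_s + const).sum(-1).mean()

s = torch.rand(8, 50) + 0.5   # model outputs for 50 candidate transitions
a = torch.rand(8, 50)         # "true" ratios, given here for illustration
loss = score_entropy(s, a)
```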
2 code implementations • 29 Sep 2023 • Linqi Zhou, Aaron Lou, Samar Khanna, Stefano Ermon
However, for many applications such as image editing, the model input comes from a distribution that is not random noise.
1 code implementation • 10 Apr 2023 • Aaron Lou, Stefano Ermon
To incorporate data constraints in a principled manner, we present Reflected Diffusion Models, which instead reverse a reflected stochastic differential equation evolving on the support of the data.
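Concretely, sampling from such a model replaces the usual Euler-Maruyama step with a reflected one: take the step, then fold any coordinates that left the domain back inside. A minimal sketch on the unit hypercube, with `drift` standing in for the learned reverse drift:

```python
import numpy as np

def reflect_unit_cube(x):
    """Fold coordinates back into [0, 1] by reflecting at the walls."""
    x = np.mod(x, 2.0)
    return np.where(x > 1.0, 2.0 - x, x)

def reflected_em_step(x, drift, dt, rng):
    """One Euler-Maruyama step of an SDE reflected on [0, 1]^d."""
    x_new = x + drift(x) * dt + np.sqrt(dt) * rng.normal(size=x.shape)
    return reflect_unit_cube(x_new)

rng = np.random.default_rng(0)
x = rng.uniform(size=(4, 3))
x = reflected_em_step(x, drift=lambda z: 0.5 - z, dt=0.01, rng=rng)
assert np.all((0.0 <= x) & (x <= 1.0))   # sample never leaves the support
```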
Ranked #1 on Image Generation on CIFAR-10 (Inception score metric)
2 code implementations • NeurIPS 2021 • Tolga Birdal, Aaron Lou, Leonidas Guibas, Umut Şimşekli
Disobeying the classical wisdom of statistical learning theory, modern deep neural networks generalize well even though they typically contain millions of parameters.
no code implementations • 29 Sep 2021 • Aaron Lou, Maximilian Nickel, Mustafa Mukadam, Brandon Amos
We present Deep Riemannian Manifolds, a new class of neural network parameterized Riemannian manifolds that can represent and learn complex geometric structures.
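One way to parameterize such a manifold is to let a network output the metric tensor directly, constrained to be symmetric positive-definite. A sketch of that construction (the class name and architecture are hypothetical, not the paper's code):

```python
import torch
import torch.nn as nn

class NeuralMetric(nn.Module):
    """Hypothetical learned Riemannian metric G(x) = A(x) A(x)^T + eps*I.

    The quadratic-plus-identity form guarantees that G(x) is symmetric
    positive-definite at every point, as a metric tensor must be."""
    def __init__(self, dim, hidden=64, eps=1e-3):
        super().__init__()
        self.dim, self.eps = dim, eps
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim * dim)
        )

    def forward(self, x):
        A = self.net(x).view(-1, self.dim, self.dim)
        return A @ A.transpose(-1, -2) + self.eps * torch.eye(self.dim)

metric = NeuralMetric(dim=2)
x = torch.randn(4, 2)
G = metric(x)                                      # one SPD matrix per point
v = torch.randn(4, 2)                              # tangent vectors
sq_norms = torch.einsum('bi,bij,bj->b', v, G, v)   # <v, v>_G, all positive
```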
1 code implementation • NeurIPS 2021 • Isay Katsman, Aaron Lou, Derek Lim, Qingxuan Jiang, Ser-Nam Lim, Christopher De Sa
Tractably modelling distributions over manifolds has long been an important goal in the natural sciences.
3 code implementations • NeurIPS 2020 • Aaron Lou, Derek Lim, Isay Katsman, Leo Huang, Qingxuan Jiang, Ser-Nam Lim, Christopher De Sa
To better conform to data geometry, recent deep generative modelling techniques adapt Euclidean constructions to non-Euclidean spaces.
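A typical such adaptation replaces Euclidean addition x + v with the manifold's exponential map, which follows the geodesic from x in direction v. A sketch on the unit sphere S^2:

```python
import numpy as np

def sphere_exp(x, v):
    """Exponential map on the unit sphere: follow the geodesic from x in
    tangent direction v (v orthogonal to x) for length ||v||."""
    n = np.linalg.norm(v)
    if n < 1e-12:
        return x
    return np.cos(n) * x + np.sin(n) * (v / n)

x = np.array([0.0, 0.0, 1.0])   # point on S^2
v = np.array([0.5, 0.0, 0.0])   # tangent vector at x
y = sphere_exp(x, v)
assert np.isclose(np.linalg.norm(y), 1.0)   # stays on the manifold
```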
2 code implementations • ICML 2020 • Aaron Lou, Isay Katsman, Qingxuan Jiang, Serge Belongie, Ser-Nam Lim, Christopher De Sa
Recent advances in deep representation learning on Riemannian manifolds extend classical deep learning operations to better capture the geometry of the manifold.
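A central such operation is the Fréchet mean, the point minimizing the sum of squared geodesic distances to the data; the paper's contribution is differentiating through its computation. A sketch of the forward computation alone, via Riemannian gradient descent on the sphere:

```python
import numpy as np

def sphere_log(x, y):
    """Log map on the unit sphere: the tangent vector at x pointing to y."""
    c = np.clip(np.dot(x, y), -1.0, 1.0)
    theta = np.arccos(c)
    if theta < 1e-12:
        return np.zeros_like(x)
    u = y - c * x
    return theta * u / np.linalg.norm(u)

def sphere_exp(x, v):
    """Exponential map on the unit sphere (geodesic step from x along v)."""
    n = np.linalg.norm(v)
    return x if n < 1e-12 else np.cos(n) * x + np.sin(n) * (v / n)

def frechet_mean(points, iters=100, lr=0.5):
    """Riemannian gradient descent for the Fréchet mean on the sphere."""
    mu = points[0]
    for _ in range(iters):
        grad = np.mean([sphere_log(mu, p) for p in points], axis=0)
        mu = sphere_exp(mu, lr * grad)
    return mu

rng = np.random.default_rng(0)
pts = rng.normal(size=(10, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)   # points on S^2
mu = frechet_mean(pts)
```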
no code implementations • 4 Dec 2018 • Horace He, Aaron Lou, Qingxuan Jiang, Isay Katsman, Serge Belongie, Ser-Nam Lim
Research has shown that widely used deep neural networks are vulnerable to carefully crafted adversarial perturbations.
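The canonical example of such a crafted perturbation is the Fast Gradient Sign Method; it is background for this paper, not the paper's own method. A minimal sketch with a toy classifier:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def fgsm(model, x, y, eps=8 / 255):
    """Fast Gradient Sign Method: a one-step adversarial perturbation
    bounded in L-infinity norm by eps (Goodfellow et al., 2015)."""
    x = x.clone().requires_grad_(True)
    F.cross_entropy(model(x), y).backward()
    return (x + eps * x.grad.sign()).clamp(0.0, 1.0).detach()

model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 8 * 8, 10))  # toy classifier
x = torch.rand(2, 3, 8, 8)
y = torch.tensor([3, 7])
x_adv = fgsm(model, x, y)   # visually similar input, shifted prediction
```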