no code implementations • 15 Mar 2025 • Sebastian Pineda Arango, Pedro Mercado, Shubham Kapoor, Abdul Fatir Ansari, Lorenzo Stella, Huibin Shen, Hugo Senetaire, Caner Turkmen, Oleksandr Shchur, Danielle C. Maddix, Michael Bohlke-Schneider, Yuyang Wang, Syama Sundar Rangapuram
Covariates provide valuable information on external factors that influence time series and are critical in many real-world time series forecasting tasks.
no code implementations • 6 Dec 2024 • Luca Masserano, Abdul Fatir Ansari, Boran Han, Xiyuan Zhang, Christos Faloutsos, Michael W. Mahoney, Andrew Gordon Wilson, Youngsuk Park, Syama Rangapuram, Danielle C. Maddix, Yuyang Wang
To address this question, we develop WaveToken, a wavelet-based tokenizer that allows models to learn complex representations directly in the space of time-localized frequencies.
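The idea of learning in the space of time-localized frequencies can be illustrated with a wavelet decomposition. Below is a minimal numpy sketch of a Haar discrete wavelet transform — a simplified stand-in, not the actual WaveToken tokenizer, whose design is described in the paper:

```python
import numpy as np

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform: splits a
    length-2n signal into n approximation (low-pass) and n detail
    (high-pass) coefficients."""
    x = np.asarray(signal, dtype=float)
    evens, odds = x[0::2], x[1::2]
    approx = (evens + odds) / np.sqrt(2)  # coarse trend
    detail = (evens - odds) / np.sqrt(2)  # localized fluctuations
    return approx, detail

def wavelet_coefficients(signal, levels):
    """Recursively decompose a series into time-localized frequency bands."""
    coeffs = []
    approx = np.asarray(signal, dtype=float)
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        coeffs.append(detail)
    coeffs.append(approx)  # final coarse approximation
    return coeffs

series = np.sin(np.linspace(0, 8 * np.pi, 64)) + 0.1 * np.arange(64)
bands = wavelet_coefficients(series, levels=3)
print([len(b) for b in bands])  # [32, 16, 8, 8]
```

Each band captures fluctuations at a different time scale, which is the kind of representation a wavelet-based tokenizer exposes to the downstream model.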
no code implementations • 2 Dec 2024 • Chaoran Cheng, Boran Han, Danielle C. Maddix, Abdul Fatir Ansari, Andrew Stuart, Michael W. Mahoney, Yuyang Wang
Generative models that satisfy hard constraints are crucial in many scientific and engineering applications where physical laws or system requirements must be strictly respected.
1 code implementation • 19 Jul 2024 • Matthias Karlbauer, Danielle C. Maddix, Abdul Fatir Ansari, Boran Han, Gaurav Gupta, Yuyang Wang, Andrew Stuart, Michael W. Mahoney
Remarkable progress in the development of Deep Learning Weather Prediction (DLWP) models positions them to become competitive with traditional numerical weather prediction (NWP) models.
6 code implementations • 12 Mar 2024 • Abdul Fatir Ansari, Lorenzo Stella, Caner Turkmen, Xiyuan Zhang, Pedro Mercado, Huibin Shen, Oleksandr Shchur, Syama Sundar Rangapuram, Sebastian Pineda Arango, Shubham Kapoor, Jasper Zschiegner, Danielle C. Maddix, Hao Wang, Michael W. Mahoney, Kari Torkkola, Andrew Gordon Wilson, Michael Bohlke-Schneider, Yuyang Wang
We introduce Chronos, a simple yet effective framework for pretrained probabilistic time series models.
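Chronos frames forecasting as a language-modeling problem by converting real-valued series into discrete tokens. The sketch below shows one plausible scale-then-quantize scheme (mean-absolute scaling plus uniform binning) in numpy; the bin layout and parameter names here are illustrative assumptions, not the exact Chronos configuration:

```python
import numpy as np

def tokenize(series, num_bins=100, low=-10.0, high=10.0):
    """Scale a series by its mean absolute value, then quantize the
    scaled values into integer token IDs via uniform bins."""
    x = np.asarray(series, dtype=float)
    scale = float(np.mean(np.abs(x)))
    if scale == 0.0:
        scale = 1.0
    edges = np.linspace(low, high, num_bins - 1)  # interior bin edges
    tokens = np.digitize(x / scale, edges)        # IDs in [0, num_bins - 1]
    return tokens, scale

def detokenize(tokens, scale, num_bins=100, low=-10.0, high=10.0):
    """Map token IDs back to bin-center values and undo the scaling."""
    edges = np.linspace(low, high, num_bins - 1)
    centers = np.concatenate(([low], (edges[:-1] + edges[1:]) / 2, [high]))
    return centers[tokens] * scale

series = np.array([1.0, 2.0, 3.0, 2.0])
tokens, scale = tokenize(series)
recon = detokenize(tokens, scale)
```

Once tokenized, the series can be fed to an off-the-shelf sequence model, and sampled token trajectories are mapped back to values to form probabilistic forecasts.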
1 code implementation • NeurIPS 2023 • Marcel Kollovieh, Abdul Fatir Ansari, Michael Bohlke-Schneider, Jasper Zschiegner, Hao Wang, Yuyang Wang
Prior works on time series diffusion models have primarily focused on developing conditional models tailored to specific forecasting or imputation tasks.
2 code implementations • 7 Mar 2023 • Alvin Heng, Abdul Fatir Ansari, Harold Soh
We present Flow-Guided Density Ratio Learning (FDRL), a simple and scalable approach to generative modeling which builds on the stale (time-independent) approximation of the gradient flow of entropy-regularized f-divergences introduced in recent work.
1 code implementation • 26 Jan 2023 • Abdul Fatir Ansari, Alvin Heng, Andre Lim, Harold Soh
Learning accurate predictive models of real-world dynamic phenomena (e.g., climate, biological systems) remains a challenging task.

1 code implementation • NeurIPS 2021 • Abdul Fatir Ansari, Konstantinos Benidis, Richard Kurle, Ali Caner Turkmen, Harold Soh, Alexander J. Smola, Yuyang Wang, Tim Januschowski
We propose the Recurrent Explicit Duration Switching Dynamical System (RED-SDS), a flexible model that is capable of identifying both state- and time-dependent switching dynamics.
1 code implementation • ICLR 2021 • Abdul Fatir Ansari, Ming Liang Ang, Harold Soh
We introduce Discriminator Gradient Flow (DGflow), a new technique that improves generated samples via the gradient flow of entropy-regularized f-divergences between the real and the generated data distributions.
Ranked #1 on Text Generation on One Billion Word
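The refinement idea behind a discretized gradient flow can be sketched as a noisy gradient-ascent update on a critic's score. The toy example below uses an analytic quadratic "discriminator" in place of a trained network, and the step sizes are illustrative assumptions rather than values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
target = np.array([2.0, -1.0])

def disc_grad(x):
    """Gradient of a toy 'discriminator' d(x) = -||x - target||^2.
    In practice this would be the gradient field of a trained critic."""
    return -2.0 * (x - target)

def refine(samples, steps=200, eta=0.01, gamma=0.05):
    """Discretized gradient flow: drift toward higher discriminator
    score plus Gaussian noise (a Langevin-style update)."""
    x = np.array(samples, dtype=float)
    for _ in range(steps):
        x = x + eta * disc_grad(x) + np.sqrt(2 * eta * gamma) * rng.normal(size=x.shape)
    return x

init = rng.normal(size=(256, 2))  # crude "generated" samples
out = refine(init)
print(out.mean(axis=0))           # close to target after refinement
```

The same drift-plus-noise structure underlies sample refinement via entropy-regularized f-divergence flows; the choice of f-divergence changes the drift term.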
1 code implementation • CVPR 2020 • Abdul Fatir Ansari, Jonathan Scarlett, Harold Soh
In this paper, we formulate the problem of learning an IGM as minimizing the expected distance between characteristic functions.
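A distance between characteristic functions can be estimated directly from samples. The numpy sketch below computes a Monte Carlo estimate of a weighted L2 distance between empirical characteristic functions; the Gaussian frequency distribution and function names are illustrative choices, not the paper's exact objective:

```python
import numpy as np

rng = np.random.default_rng(0)

def ecf(samples, freqs):
    """Empirical characteristic function E[exp(i t.X)] evaluated at
    each frequency row t. samples: (n, d), freqs: (m, d) -> (m,) complex."""
    return np.exp(1j * samples @ freqs.T).mean(axis=0)

def cf_distance(x, y, num_freqs=512, scale=1.0):
    """Monte Carlo estimate of the weighted L2 distance between two
    characteristic functions; the Gaussian weight over frequencies is
    realized by sampling t ~ N(0, scale^2 I)."""
    d = x.shape[1]
    freqs = rng.normal(scale=scale, size=(num_freqs, d))
    diff = ecf(x, freqs) - ecf(y, freqs)
    return np.mean(np.abs(diff) ** 2)

a = rng.normal(size=(2000, 2))
b = rng.normal(size=(2000, 2))           # same distribution as a
c = rng.normal(loc=3.0, size=(2000, 2))  # shifted distribution
print(cf_distance(a, b) < cf_distance(a, c))  # True
```

Because the characteristic function uniquely determines a distribution, driving this distance to zero matches the generator's distribution to the data distribution.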
2 code implementations • 12 Sep 2018 • Abdul Fatir Ansari, Harold Soh
We address the problem of unsupervised disentanglement of latent representations learnt via deep generative models.