Search Results for author: Xiaoyong Jin

Found 8 papers, 5 papers with code

PreDiff: Precipitation Nowcasting with Latent Diffusion Models

1 code implementation • NeurIPS 2023 • Zhihan Gao, Xingjian Shi, Boran Han, Hao Wang, Xiaoyong Jin, Danielle Maddix, Yi Zhu, Mu Li, Yuyang Wang

We conduct empirical studies on two datasets: N-body MNIST, a synthetic dataset with chaotic behavior, and SEVIR, a real-world precipitation nowcasting dataset.

Denoising • Earth Observation

First De-Trend then Attend: Rethinking Attention for Time-Series Forecasting

1 code implementation • 15 Dec 2022 • Xiyuan Zhang, Xiaoyong Jin, Karthick Gopalswamy, Gaurav Gupta, Youngsuk Park, Xingjian Shi, Hao Wang, Danielle C. Maddix, Yuyang Wang

In recent years, Transformer-based models have gained considerable popularity and demonstrated promising results in long-term time-series forecasting.

Time Series • Time Series Forecasting

Domain Adaptation for Time Series Forecasting via Attention Sharing

1 code implementation • 13 Feb 2021 • Xiaoyong Jin, Youngsuk Park, Danielle C. Maddix, Hao Wang, Yuyang Wang

Recently, deep neural networks have gained increasing popularity in the field of time series forecasting.

Domain Adaptation • Time Series • +1

Inter-Series Attention Model for COVID-19 Forecasting

1 code implementation • 25 Oct 2020 • Xiaoyong Jin, Yu-Xiang Wang, Xifeng Yan

The COVID-19 pandemic has had an unprecedented impact all over the world since early 2020.

Time Series • Time Series Analysis

Enhancing the Locality and Breaking the Memory Bottleneck of Transformer on Time Series Forecasting

2 code implementations • NeurIPS 2019 • Shiyang Li, Xiaoyong Jin, Yao Xuan, Xiyou Zhou, Wenhu Chen, Yu-Xiang Wang, Xifeng Yan

Time series forecasting is an important problem across many domains, including predictions of solar plant energy output, electricity consumption, and traffic congestion.


Time Series • Time Series Forecasting
