Search Results for author: Ziqing Ma

Found 8 papers, 4 papers with code

FusionSF: Fuse Heterogeneous Modalities in a Vector Quantized Framework for Robust Solar Power Forecasting

no code implementations • 8 Feb 2024 • Ziqing Ma, Wenwei Wang, Tian Zhou, Chao Chen, Bingqing Peng, Liang Sun, Rong Jin

Current research predominantly relies on historical solar power data or numerical weather prediction in a single-modality format, ignoring the complementary information provided by different modalities.

SaDI: A Self-adaptive Decomposed Interpretable Framework for Electric Load Forecasting under Extreme Events

no code implementations • 14 Jun 2023 • Hengbo Liu, Ziqing Ma, Linxiao Yang, Tian Zhou, Rui Xia, Yi Wang, Qingsong Wen, Liang Sun

In this paper, we propose a novel forecasting framework, named the Self-adaptive Decomposed Interpretable framework (SaDI), which ensembles long-term trend, short-term trend, and periodic modeling to capture the temporal characteristics of different components.

Load Forecasting, Management
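The three-way decomposition the SaDI abstract describes can be illustrated with a toy NumPy sketch: a wide moving average for the long-term trend, per-phase averaging of the detrended signal for the periodic component, and the remainder as the short-term part. The window sizes, the averaging scheme, and the function name are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def decompose(series, long_window=24, period=12):
    """Toy three-way decomposition (illustrative, not SaDI itself):
    long-term trend + periodic component + short-term remainder."""
    series = np.asarray(series, dtype=float)
    # Long-term trend: wide centered moving average (edges padded).
    pad = long_window // 2
    padded = np.pad(series, pad, mode="edge")
    kernel = np.ones(long_window) / long_window
    long_trend = np.convolve(padded, kernel, mode="same")[pad:pad + len(series)]
    detrended = series - long_trend
    # Periodic component: mean of the detrended signal at each phase of the period.
    phases = np.arange(len(series)) % period
    periodic = np.array([detrended[phases == p].mean() for p in phases])
    # Short-term component: whatever the other two do not explain.
    short = detrended - periodic
    return long_trend, periodic, short
```

By construction the three components sum back to the original series exactly, which is what makes such a decomposition interpretable: each component can be modeled and inspected separately.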

TreeDRNet: A Robust Deep Model for Long Term Time Series Forecasting

no code implementations • 24 Jun 2022 • Tian Zhou, Jianqing Zhu, Xue Wang, Ziqing Ma, Qingsong Wen, Liang Sun, Rong Jin

Various deep learning models, especially some recent Transformer-based approaches, have greatly improved the state-of-the-art performance for long-term time series forecasting. However, these Transformer-based models suffer severe performance deterioration with prolonged input length, which prevents them from using extended historical information. Moreover, these methods tend to handle complex examples in long-term forecasting with increased model complexity, which often leads to a significant increase in computation and less robust performance (e.g., overfitting).

Computational Efficiency, Feature Selection +2

FiLM: Frequency improved Legendre Memory Model for Long-term Time Series Forecasting

2 code implementations • 18 May 2022 • Tian Zhou, Ziqing Ma, Xue Wang, Qingsong Wen, Liang Sun, Tao Yao, Wotao Yin, Rong Jin

Recent studies have shown that deep learning models such as RNNs and Transformers have brought significant performance gains for long-term forecasting of time series because they effectively utilize historical information.

Dimensionality Reduction, Time Series +1

Transformers in Time Series: A Survey

10 code implementations • 15 Feb 2022 • Qingsong Wen, Tian Zhou, Chaoli Zhang, Weiqi Chen, Ziqing Ma, Junchi Yan, Liang Sun

From the perspective of network structure, we summarize the adaptations and modifications that have been made to Transformers in order to accommodate the challenges in time series analysis.

Anomaly Detection, Time Series +1

FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting

3 code implementations • 30 Jan 2022 • Tian Zhou, Ziqing Ma, Qingsong Wen, Xue Wang, Liang Sun, Rong Jin

Although Transformer-based methods have significantly improved state-of-the-art results for long-term series forecasting, they are not only computationally expensive but, more importantly, unable to capture the global view of time series (e.g., overall trend).

Time Series, Time Series Analysis
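The "frequency enhanced" idea named in the FEDformer title, representing a series compactly through a small set of Fourier modes, can be sketched in a few lines of NumPy. The random mode selection and the function name below are toy assumptions for illustration; the paper's learned frequency-domain blocks are more involved.

```python
import numpy as np

def frequency_sketch(x, n_modes=8, seed=0):
    """Keep a random subset of Fourier modes of a 1-D series and invert
    back to the time domain. A toy analogue of operating on a sparse
    frequency representation (illustrative, not the paper's method)."""
    spectrum = np.fft.rfft(x)                      # one-sided spectrum
    rng = np.random.default_rng(seed)
    keep = rng.choice(len(spectrum),
                      size=min(n_modes, len(spectrum)),
                      replace=False)
    sparse = np.zeros_like(spectrum)               # complex zeros
    sparse[keep] = spectrum[keep]                  # retain chosen modes only
    return np.fft.irfft(sparse, n=len(x))
```

Because most energy in many real series concentrates in few frequencies, a sparse set of modes can preserve the global shape (the "overall trend" view the snippet mentions) at much lower cost than attending over every time step.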

Hybrid Attention Networks for Flow and Pressure Forecasting in Water Distribution Systems

no code implementations • 13 Apr 2020 • Ziqing Ma, Shuming Liu, Guancheng Guo, Xipeng Yu

Specifically, a hybrid spatial attention mechanism that employs inputs along temporal and spatial axes is proposed.

Anomaly Detection, Decision Making +2
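The spatial attention mechanism this abstract mentions, weighting sensors using inputs along both temporal and spatial axes, can be sketched as follows. The scoring rule (similarity of each sensor's series to the cross-sensor mean) and all names here are hypothetical stand-ins, not the paper's hybrid mechanism.

```python
import numpy as np

def spatial_attention(x):
    """Toy spatial attention for multi-sensor series (illustrative only).
    x has shape (time, sensors); returns a softmax-weighted combination
    of sensors plus the attention weights."""
    # Score each sensor by the inner product of its temporal profile
    # with the cross-sensor mean profile.
    mean_profile = x.mean(axis=1, keepdims=True)   # (time, 1)
    scores = (x * mean_profile).sum(axis=0)        # (sensors,)
    # Softmax over the spatial axis (max-subtracted for stability).
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return x @ weights, weights                    # (time,), (sensors,)
```

The point of such a mechanism in a water distribution network is that flow and pressure at one node are informed unevenly by the other nodes, and learned (here: heuristic) weights make that dependence explicit.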
