no code implementations • 8 Feb 2024 • Peisong Niu, Tian Zhou, Xue Wang, Liang Sun, Rong Jin
Time series forecasting is essential for many practical applications, with the adoption of transformer-based models on the rise due to their impressive performance in NLP and CV.
no code implementations • 8 Feb 2024 • Ziqing Ma, Wenwei Wang, Tian Zhou, Chao Chen, Bingqing Peng, Liang Sun, Rong Jin
Current research predominantly relies on historical solar power data or numerical weather prediction in a single-modality format, ignoring the complementary information provided in different modalities.
no code implementations • 8 Feb 2024 • Yanjun Zhao, Tian Zhou, Chao Chen, Liang Sun, Yi Qian, Rong Jin
Time series analysis is vital for numerous applications, and transformers have become increasingly prominent in this domain.
Computational Efficiency Multivariate Time Series Forecasting +2
1 code implementation • 6 Dec 2023 • Chao Chen, Tian Zhou, Yanjun Zhao, Hui Liu, Liang Sun, Rong Jin
Moreover, we approximate the sparse regression process using a blend of a two-layer MLP and an extensive codebook.
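The idea of approximating a sparse regression with a two-layer MLP and a large codebook might be sketched as follows (a toy illustration only; the layer sizes, the softmax blending, and all function names here are assumptions, not the paper's actual architecture):

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def mlp_codebook_forward(x, W1, b1, W2, b2, codebook):
    """Approximate sparse regression: a two-layer MLP scores each
    codebook entry, and the output is the softmax-weighted blend
    of those entries."""
    h = relu(x @ W1 + b1)                       # hidden layer
    scores = h @ W2 + b2                        # one score per codebook entry
    weights = np.exp(scores - scores.max(-1, keepdims=True))
    weights /= weights.sum(-1, keepdims=True)   # softmax over entries
    return weights @ codebook                   # blend of codebook vectors

rng = np.random.default_rng(0)
d_in, d_hid, n_codes, d_out = 8, 16, 32, 4
x = rng.normal(size=(5, d_in))
W1 = rng.normal(size=(d_in, d_hid)); b1 = np.zeros(d_hid)
W2 = rng.normal(size=(d_hid, n_codes)); b2 = np.zeros(n_codes)
codebook = rng.normal(size=(n_codes, d_out))
y = mlp_codebook_forward(x, W1, b1, W2, b2, codebook)
print(y.shape)  # (5, 4)
```

In this sketch the codebook plays the role of a learned dictionary: the MLP only has to select and mix entries rather than regress the output directly.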
Ranked #5 on Traffic Prediction on BJTaxi
no code implementations • 28 Nov 2023 • Yifan Zhang, Xue Wang, Tian Zhou, Kun Yuan, Zhang Zhang, Liang Wang, Rong Jin, Tieniu Tan
We demonstrate the effectiveness of \abbr through comprehensive experiments on multiple OOD detection benchmarks; extensive empirical studies show that \abbr significantly improves OOD detection performance over state-of-the-art methods.
1 code implementation • 24 Nov 2023 • Tian Zhou, Peisong Niu, Xue Wang, Liang Sun, Rong Jin
Despite the impressive achievements of pre-trained models in the fields of natural language processing (NLP) and computer vision (CV), progress in the domain of time series analysis has been limited.
2 code implementations • 17 Jun 2023 • Yiyuan Yang, Chaoli Zhang, Tian Zhou, Qingsong Wen, Liang Sun
On the other hand, contrastive learning aims to find a representation that can clearly distinguish any instance from the others, which offers a more natural and promising basis for time series anomaly detection.
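A minimal sketch of such a contrastive objective (an illustrative InfoNCE-style loss in NumPy; the temperature, pairing scheme, and function name are assumptions, not the paper's exact formulation):

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.1):
    """Contrastive loss: row i of z1 should match row i of z2 (a positive
    pair) while remaining distinguishable from every other row."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature            # pairwise similarities
    logits -= logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))         # positives on the diagonal

rng = np.random.default_rng(1)
anchor = rng.normal(size=(4, 8))
positive = anchor + 0.01 * rng.normal(size=(4, 8))  # slightly perturbed views
loss = info_nce_loss(anchor, positive)
print(round(float(loss), 4))
```

The intuition for anomaly detection: normal instances form tight clusters under such a representation, so anomalies stand out as instances far from every cluster.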
1 code implementation • 14 Jun 2023 • Yanjun Zhao, Ziqing Ma, Tian Zhou, Liang Sun, Mengni Ye, Yi Qian
On the other hand, the long input sequence usually leads to large model size and high time complexity.
no code implementations • 14 Jun 2023 • Hengbo Liu, Ziqing Ma, Linxiao Yang, Tian Zhou, Rui Xia, Yi Wang, Qingsong Wen, Liang Sun
In this paper, we propose a novel forecasting framework, named Self-adaptive Decomposed Interpretable framework (SaDI), which ensembles long-term trend, short-term trend, and period modeling to capture temporal characteristics in different components.
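The decomposition idea can be illustrated with a simple moving-average split (a toy sketch only; the window size, per-phase averaging, and function names are assumptions, not SaDI's actual components):

```python
import numpy as np

def decompose(y, period, long_window):
    """Split a series into a long-term trend (wide moving average),
    a periodic component (per-phase mean of the detrended series),
    and a short-term residual; the three sum back to the input."""
    pad = long_window // 2
    padded = np.pad(y, pad, mode="edge")
    kernel = np.ones(long_window) / long_window
    trend = np.convolve(padded, kernel, mode="valid")[: len(y)]
    detrended = y - trend
    seasonal = np.array([detrended[p::period].mean() for p in range(period)])
    seasonal_full = np.tile(seasonal, len(y) // period + 1)[: len(y)]
    residual = detrended - seasonal_full
    return trend, seasonal_full, residual

t = np.arange(200)
y = 0.05 * t + np.sin(2 * np.pi * t / 20)       # linear trend + period-20 cycle
trend, seasonal, resid = decompose(y, period=20, long_window=40)
print(np.allclose(trend + seasonal + resid, y))  # True: exact reconstruction
```

Modeling each component separately is what makes this style of framework interpretable: each forecast can be attributed to trend, seasonality, or short-term dynamics.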
1 code implementation • 20 May 2023 • Wang Xue, Tian Zhou, Qingsong Wen, Jinyang Gao, Bolin Ding, Rong Jin
In this work, we design a special Transformer, i.e., the Channel Aligned Robust Blend Transformer (CARD for short), that addresses key shortcomings of channel-independent (CI) Transformers in time series forecasting.
no code implementations • 11 May 2023 • Ming Jin, Guangsi Shi, Yuan-Fang Li, Qingsong Wen, Bo Xiong, Tian Zhou, Shirui Pan
In this paper, we establish a theoretical framework that unravels the expressive power of spectral-temporal GNNs.
3 code implementations • 23 Feb 2023 • Tian Zhou, Peisong Niu, Xue Wang, Liang Sun, Rong Jin
The main challenge blocking the development of pre-trained models for time series analysis is the lack of a large amount of data for training.
1 code implementation • 18 Oct 2022 • Chaoli Zhang, Tian Zhou, Qingsong Wen, Liang Sun
Time series anomaly detection is a challenging problem due to complex temporal dependencies and limited labeled data.
no code implementations • 24 Jun 2022 • Tian Zhou, Jianqing Zhu, Xue Wang, Ziqing Ma, Qingsong Wen, Liang Sun, Rong Jin
Various deep learning models, especially some of the latest Transformer-based approaches, have greatly improved the state-of-the-art performance for long-term time series forecasting. However, those Transformer-based models suffer severe performance deterioration with prolonged input length, which prohibits them from using extended historical information. Moreover, these methods tend to handle complex examples in long-term forecasting with increased model complexity, which often leads to a significant increase in computation and less robust performance (e.g., overfitting).
2 code implementations • 18 May 2022 • Tian Zhou, Ziqing Ma, Xue Wang, Qingsong Wen, Liang Sun, Tao Yao, Wotao Yin, Rong Jin
Recent studies have shown that deep learning models such as RNNs and Transformers have brought significant performance gains for long-term forecasting of time series because they effectively utilize historical information.
Ranked #3 on Time Series Forecasting on ETTh2 (96) Univariate
10 code implementations • 15 Feb 2022 • Qingsong Wen, Tian Zhou, Chaoli Zhang, Weiqi Chen, Ziqing Ma, Junchi Yan, Liang Sun
From the perspective of network structure, we summarize the adaptations and modifications that have been made to Transformers in order to accommodate the challenges in time series analysis.
3 code implementations • 30 Jan 2022 • Tian Zhou, Ziqing Ma, Qingsong Wen, Xue Wang, Liang Sun, Rong Jin
Although Transformer-based methods have significantly improved state-of-the-art results for long-term series forecasting, they are not only computationally expensive but, more importantly, unable to capture the global view of time series (e.g., overall trend).
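The frequency-domain view of a series can be caricatured as keeping only its strongest Fourier modes (a rough sketch of the intuition; the actual model learns which modes to mix inside attention, which this toy filter does not do):

```python
import numpy as np

def keep_top_modes(y, k):
    """Global-view compression: FFT the series, zero all but the k
    largest-magnitude frequency modes, and transform back."""
    spec = np.fft.rfft(y)
    weakest = np.argsort(np.abs(spec))[:-k]   # indices of the weakest modes
    spec[weakest] = 0.0
    return np.fft.irfft(spec, n=len(y))

t = np.arange(256)
clean = np.sin(2 * np.pi * t / 32)
noisy = clean + 0.1 * np.random.default_rng(2).normal(size=256)
smooth = keep_top_modes(noisy, k=4)
# the few retained modes recover the dominant global oscillation
print(np.corrcoef(smooth, clean)[0, 1] > 0.95)  # True
```

A handful of low-frequency coefficients summarizes the whole horizon at once, which is why a frequency representation captures global structure (such as overall trend) that point-wise attention struggles with.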
no code implementations • 15 Jul 2021 • Hao He, Tian Zhou, Lihua Ren, Niklas Karlsson, Aaron Flores
For the Verizon Media Demand Side Platform (DSP), forecasting of ad campaign performance not only feeds key information to the optimization server, allowing the system to operate in a high-performance mode, but also produces actionable insights for advertisers.
no code implementations • 12 Jul 2021 • Tian Zhou, Hao He, Shengjun Pan, Niklas Karlsson, Bharatbhushan Shetty, Brendan Kitts, Djordje Gligorijevic, San Gultekin, Tingyu Mao, Junwei Pan, Jianlong Zhang, Aaron Flores
Since 2019, most ad exchanges and sell-side platforms (SSPs) in the online advertising industry have shifted from second- to first-price auctions.
no code implementations • 19 Sep 2020 • Shengjun Pan, Brendan Kitts, Tian Zhou, Hao He, Bharatbhushan Shetty, Aaron Flores, Djordje Gligorijevic, Junwei Pan, Tingyu Mao, San Gultekin, Jianlong Zhang
We found that bid shading, in general, can deliver significant value to advertisers, reducing price per impression to about 55% of the unshaded cost.
no code implementations • 2 Sep 2020 • Djordje Gligorijevic, Tian Zhou, Bharatbhushan Shetty, Brendan Kitts, Shengjun Pan, Junwei Pan, Aaron Flores
Online auctions play a central role in online advertising, and are one of the main reasons for the industry's scalability and growth.
2 code implementations • 17 Feb 2020 • Wei Deng, Junwei Pan, Tian Zhou, Deguang Kong, Aaron Flores, Guang Lin
To address the issue of significantly increased serving delay and high memory usage for ad serving in production, this paper presents DeepLight: a framework to accelerate CTR predictions in three aspects: 1) accelerate model inference via explicitly searching informative feature interactions in the shallow component; 2) prune redundant layers and parameters at the intra-layer and inter-layer level in the DNN component; 3) promote the sparsity of the embedding layer to preserve the most discriminant signals.
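The third aspect, promoting embedding sparsity, can be sketched as magnitude pruning (an illustrative sketch only; DeepLight's actual pruning schedule and criteria differ, and the function name is an assumption):

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude entries, keeping only the
    largest (most discriminant) signals; returns a pruned copy."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)             # number of entries to drop
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

rng = np.random.default_rng(3)
emb = rng.normal(size=(1000, 16))             # toy embedding table
sparse_emb = magnitude_prune(emb, sparsity=0.9)
print((sparse_emb == 0).mean())               # ~0.9 of entries zeroed
```

A table stored in sparse form at 90% sparsity needs only a fraction of the original memory and multiply-accumulates at serving time, which is the practical point of this aspect.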
Ranked #7 on Click-Through Rate Prediction on Avazu