Search Results for author: Tian Zhou

Found 22 papers, 11 papers with code

Attention as Robust Representation for Time Series Forecasting

no code implementations • 8 Feb 2024 • Peisong Niu, Tian Zhou, Xue Wang, Liang Sun, Rong Jin

Time series forecasting is essential for many practical applications, with the adoption of transformer-based models on the rise due to their impressive performance in NLP and CV.

Multivariate Time Series Forecasting Time Series

FusionSF: Fuse Heterogeneous Modalities in a Vector Quantized Framework for Robust Solar Power Forecasting

no code implementations • 8 Feb 2024 • Ziqing Ma, Wenwei Wang, Tian Zhou, Chao Chen, Bingqing Peng, Liang Sun, Rong Jin

Current research predominantly relies on historical solar power data or numerical weather prediction in a single-modality format, ignoring the complementary information provided in different modalities.

SVQ: Sparse Vector Quantization for Spatiotemporal Forecasting

1 code implementation • 6 Dec 2023 • Chao Chen, Tian Zhou, Yanjun Zhao, Hui Liu, Liang Sun, Rong Jin

Moreover, we approximate the sparse regression process using a blend of a two-layer MLP and an extensive codebook.

Computational Efficiency Quantization +6
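The snippet above describes approximating a sparse regression with a two-layer MLP and a large codebook. A minimal numpy sketch of that idea, with all sizes (feature dim, hidden dim, codebook size, top-k) chosen for illustration rather than taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: feature dim, hidden dim, codebook size.
D, H, K = 16, 32, 256

# Two-layer MLP that predicts approximate sparse codebook coefficients.
W1 = rng.normal(0, 0.1, (D, H))
W2 = rng.normal(0, 0.1, (H, K))
codebook = rng.normal(0, 1.0, (K, D))

def sparse_vq(x, topk=8):
    """Approximate sparse regression onto the codebook:
    an MLP predicts coefficients, then only the top-k are kept."""
    h = np.maximum(x @ W1, 0.0)           # ReLU hidden layer
    coeffs = h @ W2                       # dense coefficients over K codes
    # Sparsify: zero out all but the top-k magnitudes.
    drop = np.argsort(np.abs(coeffs))[:-topk]
    coeffs[drop] = 0.0
    return coeffs @ codebook              # sparse combination of codes

x = rng.normal(size=D)
y = sparse_vq(x)
print(y.shape)  # (16,)
```

The MLP replaces an iterative sparse solver with a single forward pass, which is where the computational saving comes from.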

Model-free Test Time Adaptation for Out-Of-Distribution Detection

no code implementations • 28 Nov 2023 • Yifan Zhang, Xue Wang, Tian Zhou, Kun Yuan, Zhang Zhang, Liang Wang, Rong Jin, Tieniu Tan

We demonstrate the effectiveness of \abbr through comprehensive experiments on multiple OOD detection benchmarks; extensive empirical studies show that it significantly improves OOD detection performance over state-of-the-art methods.

Decision Making Out-of-Distribution Detection +2

One Fits All: Universal Time Series Analysis by Pretrained LM and Specially Designed Adaptors

1 code implementation • 24 Nov 2023 • Tian Zhou, Peisong Niu, Xue Wang, Liang Sun, Rong Jin

Despite the impressive achievements of pre-trained models in the fields of natural language processing (NLP) and computer vision (CV), progress in the domain of time series analysis has been limited.

Anomaly Detection Few-Shot Learning +2

DCdetector: Dual Attention Contrastive Representation Learning for Time Series Anomaly Detection

2 code implementations • 17 Jun 2023 • Yiyuan Yang, Chaoli Zhang, Tian Zhou, Qingsong Wen, Liang Sun

Contrastive learning, on the other hand, aims to find a representation that clearly distinguishes any instance from the others, which offers a more natural and promising basis for time series anomaly detection.

Anomaly Detection Contrastive Learning +3
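The snippet above states the contrastive objective: learn a representation under which each instance is distinguishable from all others. A minimal numpy sketch of that principle via an InfoNCE-style instance-discrimination loss; this illustrates the general contrastive idea, not DCdetector's dual-attention architecture, and the temperature and sizes are assumptions:

```python
import numpy as np

def info_nce(anchors, positives, temperature=0.1):
    """Instance-discrimination loss: each anchor should match its own
    positive view and be distinguishable from every other instance."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature               # pairwise similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    # Diagonal entries are the "correct" anchor/positive pairs.
    return -np.log(np.diag(probs)).mean()

rng = np.random.default_rng(1)
z = rng.normal(size=(8, 4))
loss_aligned = info_nce(z, z + 0.01 * rng.normal(size=(8, 4)))
loss_random = info_nce(z, rng.normal(size=(8, 4)))
print(loss_aligned < loss_random)  # aligned views give a lower loss
```

Low loss on well-matched views and high loss on mismatched ones is what makes the learned similarity usable as an anomaly score.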

SaDI: A Self-adaptive Decomposed Interpretable Framework for Electric Load Forecasting under Extreme Events

no code implementations • 14 Jun 2023 • Hengbo Liu, Ziqing Ma, Linxiao Yang, Tian Zhou, Rui Xia, Yi Wang, Qingsong Wen, Liang Sun

In this paper, we propose a novel forecasting framework, named Self-adaptive Decomposed Interpretable framework (SaDI), which ensembles long-term trend, short-term trend, and period modeling to capture temporal characteristics in different components.

Load Forecasting Management
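The snippet above describes splitting a load series into long-term trend, short-term trend, and periodic components. A minimal sketch of one common way to do such a decomposition, using centered moving averages; the window sizes are hypothetical, not SaDI's actual settings:

```python
import numpy as np

def moving_average(x, window):
    """Centered moving average with edge padding, same length as x."""
    pad = window // 2
    xp = np.pad(x, (pad, window - 1 - pad), mode="edge")
    kernel = np.ones(window) / window
    return np.convolve(xp, kernel, mode="valid")

def decompose(load, long_win=48, short_win=6):
    """Split a series into a long-term trend, a short-term trend,
    and a residual that carries the periodic/irregular part."""
    long_trend = moving_average(load, long_win)
    short_trend = moving_average(load - long_trend, short_win)
    residual = load - long_trend - short_trend
    return long_trend, short_trend, residual

t = np.arange(240)
load = 100 + 0.1 * t + 10 * np.sin(2 * np.pi * t / 24)  # toy hourly load
lt, st, res = decompose(load)
# Components add back up to the original series by construction.
print(np.allclose(lt + st + res, load))  # True
```

Modeling each component separately is what makes the overall forecast interpretable: each sub-model answers for one temporal scale.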

CARD: Channel Aligned Robust Blend Transformer for Time Series Forecasting

1 code implementation • 20 May 2023 • Wang Xue, Tian Zhou, Qingsong Wen, Jinyang Gao, Bolin Ding, Rong Jin

In this work, we design a special Transformer, i.e., the Channel Aligned Robust Blend Transformer (CARD for short), that addresses key shortcomings of CI-type Transformers in time series forecasting.

Time Series Time Series Forecasting

One Fits All: Power General Time Series Analysis by Pretrained LM

3 code implementations • 23 Feb 2023 • Tian Zhou, Peisong Niu, Xue Wang, Liang Sun, Rong Jin

The main challenge that blocks the development of pre-trained models for time series analysis is the lack of a large amount of data for training.

Anomaly Detection Few-Shot Learning +2

TFAD: A Decomposition Time Series Anomaly Detection Architecture with Time-Frequency Analysis

1 code implementation • 18 Oct 2022 • Chaoli Zhang, Tian Zhou, Qingsong Wen, Liang Sun

Time series anomaly detection is a challenging problem due to the complex temporal dependencies and the limited label data.

Anomaly Detection Data Augmentation +2

TreeDRNet: A Robust Deep Model for Long Term Time Series Forecasting

no code implementations • 24 Jun 2022 • Tian Zhou, Jianqing Zhu, Xue Wang, Ziqing Ma, Qingsong Wen, Liang Sun, Rong Jin

Various deep learning models, especially some of the latest Transformer-based approaches, have greatly improved the state-of-the-art performance for long-term time series forecasting. However, those Transformer-based models suffer severe performance deterioration with prolonged input length, which prohibits them from using extended historical information. Moreover, these methods tend to handle complex examples in long-term forecasting with increased model complexity, which often leads to a significant increase in computation and less robust performance (e.g., overfitting).

Computational Efficiency feature selection +2

FiLM: Frequency improved Legendre Memory Model for Long-term Time Series Forecasting

2 code implementations • 18 May 2022 • Tian Zhou, Ziqing Ma, Xue Wang, Qingsong Wen, Liang Sun, Tao Yao, Wotao Yin, Rong Jin

Recent studies have shown that deep learning models such as RNNs and Transformers have brought significant performance gains for long-term forecasting of time series because they effectively utilize historical information.

Dimensionality Reduction Time Series +1

Transformers in Time Series: A Survey

10 code implementations • 15 Feb 2022 • Qingsong Wen, Tian Zhou, Chaoli Zhang, Weiqi Chen, Ziqing Ma, Junchi Yan, Liang Sun

From the perspective of network structure, we summarize the adaptations and modifications that have been made to Transformers in order to accommodate the challenges in time series analysis.

Anomaly Detection Time Series +1

FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting

3 code implementations • 30 Jan 2022 • Tian Zhou, Ziqing Ma, Qingsong Wen, Xue Wang, Liang Sun, Rong Jin

Although Transformer-based methods have significantly improved state-of-the-art results for long-term series forecasting, they are not only computationally expensive but, more importantly, unable to capture the global view of time series (e.g., the overall trend).

Time Series Time Series Analysis
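The snippet above motivates capturing a global view of the series; the paper's title points to a frequency-enhanced approach. A minimal sketch of the frequency-domain idea, compressing a series to a random subset of Fourier modes that still reflects its global shape (trend and dominant periodicities). This only illustrates the random mode-selection idea in isolation, not FEDformer's attention mechanism, and the mode count is an assumption:

```python
import numpy as np

def frequency_compress(x, n_modes=8, seed=0):
    """Keep only a random subset of Fourier modes: a compact
    frequency-domain summary of the series' global structure."""
    spec = np.fft.rfft(x)
    rng = np.random.default_rng(seed)
    keep = rng.choice(len(spec), size=min(n_modes, len(spec)), replace=False)
    mask = np.zeros(len(spec), dtype=bool)
    mask[keep] = True
    spec[~mask] = 0.0                 # discard all other modes
    return np.fft.irfft(spec, n=len(x))

t = np.arange(256)
x = 0.05 * t + np.sin(2 * np.pi * t / 32) \
    + 0.1 * np.random.default_rng(2).normal(size=256)
x_hat = frequency_compress(x, n_modes=16)
print(x_hat.shape)  # (256,)
```

Operating on a fixed, small number of modes is also what keeps the cost linear in sequence length rather than quadratic.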

Mid-flight Forecasting for CPA Lines in Online Advertising

no code implementations • 15 Jul 2021 • Hao He, Tian Zhou, Lihua Ren, Niklas Karlsson, Aaron Flores

For the Verizon Media Demand Side Platform (DSP), forecasting of ad campaign performance not only feeds key information to the optimization server, allowing the system to operate in a high-performance mode, but also produces actionable insights for advertisers.

Management

An Efficient Deep Distribution Network for Bid Shading in First-Price Auctions

no code implementations • 12 Jul 2021 • Tian Zhou, Hao He, Shengjun Pan, Niklas Karlsson, Bharatbhushan Shetty, Brendan Kitts, Djordje Gligorijevic, San Gultekin, Tingyu Mao, Junwei Pan, Jianlong Zhang, Aaron Flores

Since 2019, most ad exchanges and sell-side platforms (SSPs) in the online advertising industry have shifted from second- to first-price auctions.

Bid Shading by Win-Rate Estimation and Surplus Maximization

no code implementations • 19 Sep 2020 • Shengjun Pan, Brendan Kitts, Tian Zhou, Hao He, Bharatbhushan Shetty, Aaron Flores, Djordje Gligorijevic, Junwei Pan, Tingyu Mao, San Gultekin, Jianlong Zhang

We found that bid shading, in general, can deliver significant value to advertisers, reducing price per impression to about 55% of the unshaded cost.

Attribute
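Per the title above, the approach shades bids by estimating a win-rate curve and maximizing surplus. A minimal sketch of the surplus-maximization step: given a win-rate curve w(b), choose the bid b that maximizes expected surplus (value − b)·w(b). The logistic win-rate curve and its parameters below are assumptions for illustration, not the paper's estimated model:

```python
import numpy as np

def shade_bid(value, win_rate, grid=None):
    """Choose the bid that maximizes expected surplus
    (value - bid) * P(win | bid) in a first-price auction."""
    if grid is None:
        grid = np.linspace(0.0, value, 1001)
    surplus = (value - grid) * win_rate(grid)
    return grid[np.argmax(surplus)]

# Hypothetical logistic win-rate curve (midpoint and slope assumed).
win_rate = lambda b: 1.0 / (1.0 + np.exp(-(b - 5.0) / 0.8))

value = 10.0
bid = shade_bid(value, win_rate)
print(0 < bid < value)  # the shaded bid sits strictly below the value
```

Bidding the full value would yield zero surplus on a win, so the maximizer always lands below the value, which is exactly the "shading" that produces the cost reduction described in the snippet.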

Bid Shading in The Brave New World of First-Price Auctions

no code implementations • 2 Sep 2020 • Djordje Gligorijevic, Tian Zhou, Bharatbhushan Shetty, Brendan Kitts, Shengjun Pan, Junwei Pan, Aaron Flores

Online auctions play a central role in online advertising, and are one of the main reasons for the industry's scalability and growth.

DeepLight: Deep Lightweight Feature Interactions for Accelerating CTR Predictions in Ad Serving

2 code implementations • 17 Feb 2020 • Wei Deng, Junwei Pan, Tian Zhou, Deguang Kong, Aaron Flores, Guang Lin

To address the issue of significantly increased serving delay and high memory usage for ad serving in production, this paper presents DeepLight, a framework to accelerate CTR predictions in three aspects: 1) accelerate model inference by explicitly searching for informative feature interactions in the shallow component; 2) prune redundant layers and parameters at the intra-layer and inter-layer level in the DNN component; 3) promote the sparsity of the embedding layer to preserve the most discriminant signals.

Click-Through Rate Prediction
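The snippet above lists pruning parameters and sparsifying the embedding layer among DeepLight's acceleration strategies. A minimal sketch of one simple variant of that idea: magnitude-based pruning, which zeroes the smallest-magnitude entries and keeps only the most discriminant signals. This is a generic stand-in, not DeepLight's exact pruning schedule, and the table size and sparsity level are assumptions:

```python
import numpy as np

def prune_by_magnitude(weights, sparsity=0.9):
    """Zero out the smallest-magnitude entries so that roughly
    `sparsity` of the matrix becomes zero."""
    flat = np.abs(weights).ravel()
    k = int(len(flat) * sparsity)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

rng = np.random.default_rng(3)
embedding = rng.normal(size=(1000, 16))           # hypothetical embedding table
sparse_emb = prune_by_magnitude(embedding, sparsity=0.9)
frac_zero = (sparse_emb == 0).mean()
print(frac_zero >= 0.9)  # at least 90% of entries removed
```

A sparse embedding table shrinks memory and lets serving skip multiplications by zero, which is where the latency reduction comes from.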
