Conditional Temporal Neural Processes with Covariance Loss

We introduce a novel loss function, Covariance Loss, which is conceptually equivalent to conditional neural processes and takes the form of a regularizer, making it applicable to many kinds of neural networks. With the proposed loss, mappings from input variables to target variables are shaped not only by the mean activations and mean dependencies of input and target variables, but also by the dependencies among the target variables themselves. This property makes the resulting neural networks more robust to noisy observations and able to recapture missing dependencies from prior information. To demonstrate the validity of the proposed loss, we conduct extensive experiments on real-world datasets with state-of-the-art models and discuss the benefits and drawbacks of the proposed Covariance Loss.
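The abstract describes Covariance Loss as a regularizer that makes learned representations reflect the dependency structure of the targets, not just their means. The paper's exact formulation is not given on this page; below is a minimal PyTorch sketch of one plausible reading, in which the batch Gram matrix of the model's final-layer features is matched to the Gram matrix of the (centered) targets. The function name `covariance_loss`, the normalization choices, and the weighting term `lam` are illustrative assumptions, not the authors' code.

```python
import torch

def covariance_loss(features: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
    """Sketch of a covariance-matching regularizer (assumed formulation).

    Penalizes the discrepancy between the covariance structure of the
    model's final-layer features and that of the target variables, so
    that the learned mapping reflects dependencies among the targets.

    features: (B, D_f) final-layer representations for a batch
    targets:  (B, D_y) corresponding target variables
    """
    # Center both tensors over the batch dimension.
    f = features - features.mean(dim=0, keepdim=True)
    y = targets - targets.mean(dim=0, keepdim=True)
    # Sample-by-sample Gram matrices, shape (B, B): entry (i, j)
    # measures how similarly samples i and j behave.
    gram_f = (f @ f.t()) / f.shape[1]
    gram_y = (y @ y.t()) / y.shape[1]
    # Frobenius-norm mismatch between the two covariance structures.
    return ((gram_f - gram_y) ** 2).mean()

# Assumed usage: added to an ordinary prediction loss as a regularizer.
# pred, hidden = model(x)
# loss = torch.nn.functional.mse_loss(pred, y) + lam * covariance_loss(hidden, y)
```

Under this reading, the regularizer only adds a batch-level term on top of the base model's loss, which is consistent with the abstract's claim that it is applicable to many kinds of neural networks.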


Results from the Paper


| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| Traffic Prediction | METR-LA | GWNET-Cov | MAE @ 12 step | 3.50 | #13 |
| Traffic Prediction | METR-LA | GWNET-Cov | MAE @ 3 step | 2.69 | #5 |
| Traffic Prediction | PEMS-BAY | GWNET-Cov | MAE @ 12 step | 1.91 | #8 |
| Traffic Prediction | PEMS-BAY | GWNET-Cov | RMSE | 4.40 | #5 |
| Time Series Forecasting | PeMSD7 | STGCN-Cov | 9 steps MAE | 3.51 | #1 |

Methods


No methods listed for this paper.