Search Results for author: Lingkai Kong

Found 6 papers, 4 papers with code

CAMul: Calibrated and Accurate Multi-view Time-Series Forecasting

1 code implementation • 15 Sep 2021 • Harshavardhan Kamarthi, Lingkai Kong, Alexander Rodríguez, Chao Zhang, B. Aditya Prakash

We apply CAMul to multiple domains with varied sources and modalities and show that it outperforms other state-of-the-art probabilistic forecasting models by over 25% in accuracy and calibration.

Decision Making • Probabilistic Time Series Forecasting • +1

When in Doubt: Neural Non-Parametric Uncertainty Quantification for Epidemic Forecasting

1 code implementation • 7 Jun 2021 • Harshavardhan Kamarthi, Lingkai Kong, Alexander Rodríguez, Chao Zhang, B. Aditya Prakash

We model the forecasting task as a probabilistic generative process and propose a functional neural process model called EPIFNP, which directly models the probability density of the forecast value.

Time Series • Time Series Forecasting
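
Directly modelling the predictive density usually means the network outputs distribution parameters and is trained with a negative log-likelihood loss. The sketch below is a minimal toy in that spirit, not the EpiFNP model itself: a Gaussian predictive density with assumed mean/log-scale parameters fitted by NLL on made-up data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: forecast target y from a scalar feature x, with input-dependent noise.
x = rng.uniform(-2, 2, size=200)
y = 1.5 * x + rng.normal(scale=0.3 + 0.2 * np.abs(x))

# Predictive density p(y|x) = N(mu(x), sigma(x)^2) with a tiny parametric model,
# trained by minimizing the Gaussian negative log-likelihood.
params = np.zeros(4)  # [w_mu, b_mu, w_logsig, b_logsig] -- hypothetical parameters

def nll(p):
    mu = p[0] * x + p[1]
    sig = np.exp(p[2] * np.abs(x) + p[3])
    return np.mean(0.5 * ((y - mu) / sig) ** 2 + np.log(sig))

def grad(p, h=1e-5):
    # Finite-difference gradient keeps the sketch dependency-free.
    g = np.zeros_like(p)
    for i in range(len(p)):
        d = np.zeros_like(p)
        d[i] = h
        g[i] = (nll(p + d) - nll(p - d)) / (2 * h)
    return g

for _ in range(2000):
    params -= 0.05 * grad(params)

mu_new = params[0] * 1.0 + params[1]
sig_new = np.exp(params[2] * 1.0 + params[3])
print(f"forecast at x=1.0: mean={mu_new:.2f}, std={sig_new:.2f}")
```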

Calibrated Language Model Fine-Tuning for In- and Out-of-Distribution Data

1 code implementation • EMNLP 2020 • Lingkai Kong, Haoming Jiang, Yuchen Zhuang, Jie Lyu, Tuo Zhao, Chao Zhang

Fine-tuned pre-trained language models can suffer from severe miscalibration for both in-distribution and out-of-distribution (OOD) data due to over-parameterization.

Language Modelling • Text Classification
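
Miscalibration of this kind is commonly quantified with expected calibration error (ECE). Below is a minimal NumPy sketch of the standard ECE computation, with made-up `confidences` and `correct` arrays; it is not the paper's released code.

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Standard ECE: bin predictions by confidence and compare
    average confidence to empirical accuracy within each bin."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            gap = abs(confidences[mask].mean() - correct[mask].mean())
            ece += mask.mean() * gap  # weight by bin frequency
    return ece

# Toy usage with made-up predictions: an over-confident model yields a large ECE.
conf = np.array([0.95, 0.90, 0.85, 0.99, 0.80])
hit = np.array([1, 0, 1, 0, 1])
print(expected_calibration_error(conf, hit))
```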

SDE-Net: Equipping Deep Neural Networks with Uncertainty Estimates

2 code implementations • ICML 2020 • Lingkai Kong, Jimeng Sun, Chao Zhang

We propose a new method for quantifying uncertainties of DNNs from a dynamical system perspective.
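
Viewing the forward pass as a stochastic differential equation suggests an Euler–Maruyama rollout with a drift term driving the prediction and a diffusion term injecting noise, so that repeated noisy passes give a predictive spread. The toy sketch below illustrates that idea with assumed stand-in functions (`drift`, `diffusion`); it is not the released SDE-Net implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def drift(x, w):
    # Toy drift network: one tanh layer (stand-in for a learned drift net).
    return np.tanh(x @ w)

def diffusion(x):
    # Toy state-dependent noise scale (stand-in for a learned diffusion net).
    return 0.1 * (1.0 + np.abs(x).mean(axis=-1, keepdims=True))

def sde_forward(x0, w, steps=10, dt=0.1, n_samples=20):
    # Euler-Maruyama integration; sampling many noisy rollouts gives
    # a mean prediction and a standard deviation read as uncertainty.
    outs = []
    for _ in range(n_samples):
        x = x0.copy()
        for _ in range(steps):
            noise = rng.standard_normal(x.shape)
            x = x + drift(x, w) * dt + diffusion(x) * np.sqrt(dt) * noise
        outs.append(x)
    outs = np.stack(outs)
    return outs.mean(axis=0), outs.std(axis=0)

x0 = rng.standard_normal((4, 8))        # hypothetical batch of features
w = rng.standard_normal((8, 8)) * 0.5   # hypothetical drift-net weights
mean, std = sde_forward(x0, w)
print(mean.shape, std.mean())
```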

Stochasticity of Deterministic Gradient Descent: Large Learning Rate for Multiscale Objective Function

no code implementations • NeurIPS 2020 • Lingkai Kong, Molei Tao

This article suggests that deterministic Gradient Descent, which does not use any stochastic gradient approximation, can still exhibit stochastic behaviors.
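
As a toy illustration of that claim (my own sketch, not the paper's experiment): run plain gradient descent on the multiscale objective f(x) = x^2/2 + eps*cos(x/eps). When the learning rate resolves the small scale eps, the iterates settle into a nearby local minimum; when the learning rate is much larger than eps, the fully deterministic iterates bounce around in a noise-like fashion.

```python
import numpy as np

# Multiscale objective: smooth macro term plus a fast oscillation of scale eps.
eps = 1e-2
f = lambda x: 0.5 * x**2 + eps * np.cos(x / eps)
grad = lambda x: x - np.sin(x / eps)  # exact (deterministic) gradient

def gd(x0, lr, steps=2000):
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] - lr * grad(xs[-1]))
    return np.array(xs)

small = gd(1.0, lr=1e-3)   # lr resolves the fast scale: converges quietly
large = gd(1.0, lr=0.1)    # lr >> eps: deterministic iterates look stochastic

print("small-lr spread of last 500 iterates:", small[-500:].std())
print("large-lr spread of last 500 iterates:", large[-500:].std())
```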

Learning Deep Hidden Nonlinear Dynamics from Aggregate Data

no code implementations • 22 Jul 2018 • Yisen Wang, Bo Dai, Lingkai Kong, Sarah Monazam Erfani, James Bailey, Hongyuan Zha

Learning nonlinear dynamics from diffusion data is challenging because the individuals observed may differ at each time point, so the data only reflect aggregate behaviour rather than individual trajectories.
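
To make the aggregate-data setting concrete, the toy sketch below (an illustration of the setting only, not the paper's method) draws a fresh, unpaired population sample at each time point from an assumed exponential-decay dynamic and recovers the decay rate by matching population means.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Aggregate" observations: at each time we see a new, unpaired population sample,
# so trajectories cannot be matched across time. Hidden dynamics: x' = -a*x.
a_true = 0.8
times = [0.0, 0.5, 1.0, 1.5]
populations = [rng.normal(loc=3.0 * np.exp(-a_true * t), scale=0.3, size=500)
               for t in times]

# Fit the rate a by matching population means at each time (moment matching),
# a crude stand-in for distribution-level learning of the hidden dynamics.
def loss(a):
    return sum((p.mean() - 3.0 * np.exp(-a * t)) ** 2
               for t, p in zip(times, populations))

grid = np.linspace(0.1, 2.0, 400)
a_hat = grid[np.argmin([loss(a) for a in grid])]
print(f"recovered decay rate: {a_hat:.2f} (true {a_true})")
```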
