Search Results for author: Xuan-Hong Dang

Found 9 papers, 3 papers with code

Modality-aware Transformer for Financial Time series Forecasting

no code implementations · 2 Oct 2023 · Hajar Emami, Xuan-Hong Dang, Yousaf Shah, Petros Zerfos

In practice, the key challenge lies in constructing a reliable time series forecasting model capable of harnessing data from diverse sources and extracting valuable insights to predict the target time series accurately.

Feature Importance · Time Series · +1

Maximal Domain Independent Representations Improve Transfer Learning

no code implementations · 1 Jun 2023 · Adrian Shuai Li, Elisa Bertino, Xuan-Hong Dang, Ankush Singla, Yuhai Tu, Mark N Wegman

To address this shortcoming, we developed a new algorithm that imposes a stronger constraint, minimizing the DDRep with a KL divergence loss, in order to create the maximal DIRep and enhance transfer learning performance.
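To illustrate the idea, here is a minimal sketch of one way such a KL penalty can be formed, assuming the domain-dependent representation (DDRep) is parameterized as a diagonal Gaussian and penalized toward a standard-normal prior; the function name and parameterization are illustrative assumptions, not the paper's exact loss.

```python
import numpy as np

def kl_to_standard_normal(mu, log_var):
    """KL divergence between a diagonal Gaussian N(mu, exp(log_var))
    and the standard normal N(0, I), summed over dimensions.
    Driving this toward zero pushes the representation toward an
    uninformative prior, i.e. it shrinks the DDRep so that more
    information is forced into the domain-independent DIRep."""
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)

# A DDRep whose posterior matches the prior carries no domain information:
zero_penalty = kl_to_standard_normal(np.zeros(4), np.zeros(4))

# Any deviation from the prior incurs a positive penalty:
penalty = kl_to_standard_normal(np.array([0.5, -0.5]), np.zeros(2))
```

In training, a term like this would be added to the task loss so that the optimizer trades off predictive accuracy against the amount of domain-specific information retained.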

Domain Adaptation Transfer Learning

AutoAI-TS: AutoAI for Time Series Forecasting

no code implementations · 24 Feb 2021 · Syed Yousaf Shah, Dhaval Patel, Long Vu, Xuan-Hong Dang, Bei Chen, Peter Kirchner, Horst Samulowitz, David Wood, Gregory Bramble, Wesley M. Gifford, Giridhar Ganapavarapu, Roman Vaculin, Petros Zerfos

We present AutoAI for Time Series Forecasting (AutoAI-TS), which provides users with a zero-configuration (zero-conf) system to efficiently train, optimize, and choose the best forecasting model among various classes of models for a given dataset.
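The zero-conf idea can be sketched in a few lines: fit every candidate model on a training split and keep the one with the lowest holdout error, with no user configuration. The candidate classes, names, and error metric below are toy assumptions for illustration, not the AutoAI-TS API.

```python
import numpy as np

class LastValue:
    """Naive forecaster: repeat the last observed value."""
    def fit(self, y): self.last = y[-1]
    def predict(self, horizon): return np.full(horizon, self.last)

class HistoricMean:
    """Naive forecaster: repeat the historical mean."""
    def fit(self, y): self.mu = float(np.mean(y))
    def predict(self, horizon): return np.full(horizon, self.mu)

def mae(pred, actual):
    return float(np.mean(np.abs(pred - actual)))

def select_best(candidates, train, holdout):
    """Fit every candidate on the training split and keep the one
    with the lowest holdout error -- no user configuration needed."""
    best_name, best_err = None, float("inf")
    for name, model in candidates.items():
        model.fit(train)
        err = mae(model.predict(len(holdout)), holdout)
        if err < best_err:
            best_name, best_err = name, err
    return best_name, best_err

train = np.arange(10.0)                  # upward-trending toy series
holdout = np.array([10.0, 11.0, 12.0])
best, err = select_best(
    {"last_value": LastValue(), "mean": HistoricMean()}, train, holdout
)
```

On this trending toy series the last-value forecaster wins; a real system would search far richer model classes and hyperparameters, but the selection loop has the same shape.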

Benchmarking · BIG-bench Machine Learning · +3

"The Squawk Bot": Joint Learning of Time Series and Text Data Modalities for Automated Financial Information Filtering

no code implementations · 20 Dec 2019 · Xuan-Hong Dang, Syed Yousaf Shah, Petros Zerfos

In this work, we address the following problem: given a numerical time series and a general corpus of textual stories collected over the same period, promptly discover a succinct set of textual stories associated with that time series.

Time Series · Time Series Analysis

seq2graph: Discovering Dynamic Dependencies from Multivariate Time Series with Multi-level Attention

no code implementations · 7 Dec 2018 · Xuan-Hong Dang, Syed Yousaf Shah, Petros Zerfos

We introduce a key component, a dual-purpose recurrent neural network, that decodes information in the temporal domain to discover lagged dependencies within each time series, and encodes that information into a set of vectors which, collected from all component time series, form the informative inputs for discovering inter-dependencies.
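The intra-series half of this idea can be sketched as attention over a series' lagged hidden states: the attention weights expose which lags the model relies on, and the weighted summary is the vector passed on for inter-series comparison. This is a minimal sketch under that assumption; the scoring function and shapes are illustrative, not seq2graph's exact architecture.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax."""
    e = np.exp(x - x.max())
    return e / e.sum()

def lagged_attention(hidden_states, query):
    """Score each lagged hidden state of one series against a query
    vector. The attention weights reveal which lags matter (the
    lagged, intra-series dependencies); the weighted summary is the
    vector contributed toward discovering inter-series dependencies.

    hidden_states: (T, d) RNN states over T lags; query: (d,)."""
    scores = hidden_states @ query        # (T,) one score per lag
    weights = softmax(scores)             # attention over lags
    summary = weights @ hidden_states     # (d,) summary vector
    return weights, summary

rng = np.random.default_rng(0)
H = rng.normal(size=(6, 4))               # toy states for 6 lags
q = rng.normal(size=4)
weights, summary = lagged_attention(H, q)
```

Stacking the summary vectors from all component series and attending over them again is the natural second level, which is what yields the inter-series dependency structure.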

Clustering · Management · +2
