no code implementations • 7 Jun 2024 • Haibei Zhu, Yousef El-Laham, Elizabeth Fons, Svitlana Vyetrenko
Effective utilization of time series data is often constrained by the scarcity of data that reflects complex dynamics, especially under distributional shifts.
no code implementations • 25 Apr 2024 • Elizabeth Fons, Rachneet Kaur, Soham Palande, Zhen Zeng, Tucker Balch, Manuela Veloso, Svitlana Vyetrenko
Large Language Models (LLMs) offer the potential for automatic time series analysis and reporting, a critical task across many domains, including healthcare, finance, climate, and energy.
no code implementations • 29 Dec 2023 • Vamsi K. Potluru, Daniel Borrajo, Andrea Coletta, Niccolò Dalmasso, Yousef El-Laham, Elizabeth Fons, Mohsen Ghassemi, Sriram Gopalakrishnan, Vikesh Gosai, Eleonora Kreačić, Ganapathy Mani, Saheed Obitayo, Deepak Paramanand, Natraj Raman, Mikhail Solonin, Srijan Sood, Svitlana Vyetrenko, Haibei Zhu, Manuela Veloso, Tucker Balch
Synthetic data has made tremendous strides in various commercial settings including finance, healthcare, and virtual reality.
no code implementations • 20 Dec 2023 • Yousef El-Laham, Elizabeth Fons, Dillon Daudert, Svitlana Vyetrenko
Evaluations across diverse regression tasks show that UMAP Mixup is competitive with or outperforms other Mixup variants, showing promise as an effective tool for enhancing the generalization performance of deep learning models.
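For context, the sketch below shows vanilla Mixup applied to a regression batch. It is the standard baseline that Mixup variants are compared against, not the paper's UMAP Mixup, which (per the name) is assumed to perform the mixing on a UMAP-learned manifold rather than directly in input space.

```python
# Minimal sketch of vanilla Mixup for a regression batch (NumPy only).
import numpy as np

def mixup_batch(x, y, alpha=0.2, rng=None):
    """Convexly combine random pairs of (input, target) examples."""
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)              # mixing coefficient ~ Beta(alpha, alpha)
    perm = rng.permutation(len(x))            # random partner for each example
    x_mix = lam * x + (1.0 - lam) * x[perm]   # interpolate inputs
    y_mix = lam * y + (1.0 - lam) * y[perm]   # interpolate regression targets
    return x_mix, y_mix

# Example: augment a toy regression batch.
x = np.random.randn(32, 10)
y = np.random.randn(32, 1)
x_aug, y_aug = mixup_batch(x, y)
```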
no code implementations • 28 Sep 2023 • Tom Bamford, Andrea Coletta, Elizabeth Fons, Sriram Gopalakrishnan, Svitlana Vyetrenko, Tucker Balch, Manuela Veloso
Moreover, the storage, computational time, and retrieval complexity required to search the time-series space are often non-trivial.
no code implementations • 3 Jul 2023 • Tom Bamford, Elizabeth Fons, Yousef El-Laham, Svitlana Vyetrenko
Time series imputation remains a significant challenge across many fields due to the wide variability in the types of data being modelled.
no code implementations • 12 Jun 2023 • Yousef El-Laham, Niccolò Dalmasso, Elizabeth Fons, Svitlana Vyetrenko
This work introduces a novel probabilistic deep learning technique called deep Gaussian mixture ensembles (DGMEs), which enables accurate quantification of both epistemic and aleatoric uncertainty.
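As a rough illustration of the uncertainty split mentioned here, the sketch below applies the generic deep-ensemble decomposition: each member outputs a predictive mean and variance, the average member variance is read as aleatoric uncertainty, and the disagreement between member means as epistemic uncertainty. This is a standard approximation for illustration only, not the paper's DGME formulation, which fits full Gaussian mixtures.

```python
# Minimal sketch: decomposing predictive uncertainty from an ensemble of
# probabilistic regressors (generic deep-ensemble decomposition).
import numpy as np

def decompose_uncertainty(member_means, member_vars):
    """member_means, member_vars: arrays of shape (n_members, n_points)."""
    aleatoric = member_vars.mean(axis=0)   # average predicted noise variance
    epistemic = member_means.var(axis=0)   # disagreement between members
    total = aleatoric + epistemic          # variance of the equal-weight mixture
    return total, aleatoric, epistemic

# Example with a 5-member ensemble evaluated on 100 test points.
means = np.random.randn(5, 100)
vars_ = np.abs(np.random.randn(5, 100)) + 0.1
total, alea, epi = decompose_uncertainty(means, vars_)
```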
no code implementations • 11 Aug 2022 • Elizabeth Fons, Alejandro Sztrajman, Yousef El-Laham, Alexandros Iosifidis, Svitlana Vyetrenko
We show how these networks can be leveraged for the imputation of time series, with applications to both univariate and multivariate data.
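Assuming the networks in question are implicit neural representations that map a time coordinate to a value, the sketch below shows the basic imputation pattern: fit a small MLP on the observed (timestamp, value) pairs and query it at the missing timestamps. The architecture and hyperparameters here are illustrative assumptions, not the paper's.

```python
# Minimal sketch: imputing a univariate series with an implicit neural
# representation f(t) -> x fitted on observed points only (PyTorch).
import torch
import torch.nn as nn

torch.manual_seed(0)
t = torch.linspace(0, 1, 200).unsqueeze(1)              # time coordinates
x = torch.sin(8 * torch.pi * t) + 0.05 * torch.randn_like(t)
observed = torch.rand(200) > 0.3                         # ~70% of points observed

net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(),
                    nn.Linear(64, 64), nn.Tanh(),
                    nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(2000):                                    # fit on observed points only
    opt.zero_grad()
    loss = ((net(t[observed]) - x[observed]) ** 2).mean()
    loss.backward()
    opt.step()

x_imputed = net(t[~observed]).detach()                   # values at missing timestamps
```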
no code implementations • 16 Feb 2021 • Elizabeth Fons, Paula Dawson, Xiao-jun Zeng, John Keane, Alexandros Iosifidis
Data augmentation methods have been shown to be a fundamental technique to improve generalization in tasks such as image, text and audio classification.
1 code implementation • 28 Oct 2020 • Elizabeth Fons, Paula Dawson, Xiao-jun Zeng, John Keane, Alexandros Iosifidis
Data augmentation methods in combination with deep neural networks have been used extensively in computer vision on classification tasks, achieving great success; however, their use in time series classification is still at an early stage.
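For illustration, the sketch below shows two common time series augmentations of the kind such studies evaluate (jittering and magnitude scaling); the specific set of transformations examined in the paper may differ.

```python
# Minimal sketch of two common time series augmentations (NumPy only).
import numpy as np

def jitter(x, sigma=0.03, rng=None):
    """Add small Gaussian noise to each time step."""
    rng = rng or np.random.default_rng()
    return x + rng.normal(0.0, sigma, size=x.shape)

def scale(x, sigma=0.1, rng=None):
    """Multiply the whole series by a random factor close to 1."""
    rng = rng or np.random.default_rng()
    return x * rng.normal(1.0, sigma)

series = np.cumsum(np.random.randn(250))   # toy price-like series
augmented = [jitter(series), scale(series)]
```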
no code implementations • 28 Oct 2020 • Elizabeth Fons, Paula Dawson, Xiao-jun Zeng, John Keane, Alexandros Iosifidis
In this paper we show that using transfer learning can help with this task, by pre-training a model to extract universal features on the full universe of stocks of the S&P 500 index and then transferring it to another model to directly learn a trading rule.
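The sketch below shows the general transfer-learning pattern this entry describes: a feature extractor is pre-trained on a large universe of stocks, then frozen and reused with a small task-specific head that learns the trading signal. Layer sizes, the lookback window, and the pre-training objective are illustrative assumptions, not the paper's exact setup.

```python
# Minimal sketch: frozen pre-trained encoder + small head learning a trading rule.
import torch
import torch.nn as nn

lookback = 60                                        # days of returns per sample

encoder = nn.Sequential(nn.Linear(lookback, 128), nn.ReLU(),
                        nn.Linear(128, 32), nn.ReLU())

# Stage 1 (not shown): pre-train `encoder` on a universal task, e.g. next-day
# return prediction across all index constituents, then freeze its weights.
for p in encoder.parameters():
    p.requires_grad = False

# Stage 2: learn a trading rule on top of the frozen features.
head = nn.Sequential(nn.Linear(32, 1), nn.Tanh())    # position in [-1, 1]
opt = torch.optim.Adam(head.parameters(), lr=1e-3)

returns = torch.randn(512, lookback)                 # toy batch of return windows
next_ret = torch.randn(512, 1)                       # toy next-day returns

opt.zero_grad()
position = head(encoder(returns))
loss = -(position * next_ret).mean()                 # maximize average P&L
loss.backward()
opt.step()
```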
no code implementations • 28 Feb 2019 • Elizabeth Fons, Paula Dawson, Jeffrey Yau, Xiao-jun Zeng, John Keane
The financial crisis of 2008 generated interest in more transparent, rules-based strategies for portfolio construction, with smart beta strategies emerging as a trend among institutional investors.