no code implementations • 16 Feb 2024 • Chin-Chia Michael Yeh, Yujie Fan, Xin Dai, Vivian Lai, Prince Osei Aboagye, Junpeng Wang, Huiyuan Chen, Yan Zheng, Zhongfang Zhuang, Liang Wang, Wei Zhang
All-Multi-Layer Perceptron (all-MLP) mixer models have been shown to be effective for time series forecasting problems.
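To make the all-MLP mixer idea concrete, here is a minimal sketch of one mixer block for a multivariate time series window. The function names, layer widths, and the ReLU activation are illustrative assumptions, not details from the paper: a mixer block alternates an MLP applied across the time axis (token mixing) with an MLP applied across the channel axis (channel mixing), each with a residual connection.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, w1, w2):
    # Two-layer MLP; ReLU is used here for simplicity (an assumption).
    return np.maximum(x @ w1, 0.0) @ w2

def mixer_block(x, w_time, v_time, w_feat, v_feat):
    # x: (time_steps, channels). Token mixing operates across time,
    # channel mixing across features; residuals keep the shape fixed.
    x = x + mlp(x.T, w_time, v_time).T   # mix along the time axis
    x = x + mlp(x, w_feat, v_feat)       # mix along the channel axis
    return x

T, C, H = 24, 3, 16                      # window length, channels, hidden width
x = rng.standard_normal((T, C))
w_time, v_time = rng.standard_normal((T, H)), rng.standard_normal((H, T))
w_feat, v_feat = rng.standard_normal((C, H)), rng.standard_normal((H, C))
y = mixer_block(x, w_time, v_time, w_feat, v_feat)
```

Because both mixing steps are shape-preserving, blocks like this can be stacked, with a final linear head mapping the mixed window to forecast horizons.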
no code implementations • 2 Jan 2024 • Prince Aboagye, Yan Zheng, Junpeng Wang, Uday Singh Saini, Xin Dai, Michael Yeh, Yujie Fan, Zhongfang Zhuang, Shubham Jain, Liang Wang, Wei Zhang
The emergence of pre-trained models has significantly impacted Natural Language Processing (NLP) and Computer Vision, and is now extending to relational datasets.
no code implementations • 5 Nov 2023 • Chin-Chia Michael Yeh, Huiyuan Chen, Xin Dai, Yan Zheng, Yujie Fan, Vivian Lai, Junpeng Wang, Audrey Der, Zhongfang Zhuang, Liang Wang, Wei Zhang
To facilitate this investigation, we introduce a CTSR benchmark dataset that comprises time series data from a variety of domains, such as motion, power demand, and traffic.
no code implementations • 5 Nov 2023 • Chin-Chia Michael Yeh, Huiyuan Chen, Yujie Fan, Xin Dai, Yan Zheng, Vivian Lai, Junpeng Wang, Zhongfang Zhuang, Liang Wang, Wei Zhang, Eamonn Keogh
The ego-networks of all subsequences collectively form a time series subsequence graph, and we introduce an algorithm to efficiently construct this graph.
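One simple way to realize such a subsequence graph is to slide a window over the series, z-normalize each subsequence, and connect each one to its k nearest neighbors while excluding trivially overlapping self-matches. The function below is a hedged sketch of that construction; the exclusion-zone width, distance measure, and function name are assumptions, not the paper's algorithm.

```python
import numpy as np

def subsequence_graph(ts, m, k):
    """Connect each length-m subsequence to its k nearest neighbors
    (z-normalized Euclidean distance), skipping overlapping windows."""
    n = len(ts) - m + 1
    subs = np.lib.stride_tricks.sliding_window_view(ts, m).astype(float)
    subs = (subs - subs.mean(axis=1, keepdims=True)) / (
        subs.std(axis=1, keepdims=True) + 1e-8)
    d = np.linalg.norm(subs[:, None, :] - subs[None, :, :], axis=2)
    # Exclusion zone: ignore near-identical overlapping windows.
    for i in range(n):
        lo, hi = max(0, i - m // 2), min(n, i + m // 2 + 1)
        d[i, lo:hi] = np.inf
    # Each adjacency list is the ego-network of subsequence i.
    return {i: list(np.argsort(d[i])[:k]) for i in range(n)}

ts = np.sin(np.linspace(0, 8 * np.pi, 128)) \
     + 0.05 * np.random.default_rng(1).standard_normal(128)
g = subsequence_graph(ts, m=16, k=3)
```

The brute-force pairwise distance here is O(n^2 m); the efficiency of the paper's algorithm is precisely in avoiding that cost, which this sketch does not attempt.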
1 code implementation • 20 Oct 2023 • Dongyu Zhang, Liang Wang, Xin Dai, Shubham Jain, Junpeng Wang, Yujie Fan, Chin-Chia Michael Yeh, Yan Zheng, Zhongfang Zhuang, Wei Zhang
FATA-Trans is field- and time-aware for sequential tabular data.
no code implementations • 17 Oct 2023 • Yoshio Ebihara, Xin Dai, Victor Magron, Dimitri Peaucelle, Sophie Tarbouriech
By following a standard procedure using multipliers that capture the behavior of ReLUs, we first reduce the upper bound computation problem of the local Lipschitz constant into a semidefinite programming problem (SDP).
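For context on what such bounds improve upon: the classical baseline upper bound on the Lipschitz constant of a ReLU network is the product of the layers' spectral norms, which is valid because ReLU itself is 1-Lipschitz. The snippet below computes that crude baseline; it is not the paper's SDP-based local bound, which is tighter precisely because it exploits the multiplier structure of the ReLUs.

```python
import numpy as np

def spectral_norm_lipschitz_bound(weights):
    """Crude global Lipschitz upper bound for a ReLU network:
    the product of the layers' spectral norms (largest singular values).
    The SDP-based bound in the paper is local and tighter; this is the
    classical baseline it is compared against."""
    bound = 1.0
    for w in weights:
        bound *= np.linalg.norm(w, ord=2)
    return bound

rng = np.random.default_rng(0)
ws = [rng.standard_normal((8, 4)), rng.standard_normal((2, 8))]  # toy 2-layer net
b = spectral_norm_lipschitz_bound(ws)
```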
no code implementations • 5 Oct 2023 • Chin-Chia Michael Yeh, Xin Dai, Yan Zheng, Junpeng Wang, Huiyuan Chen, Yujie Fan, Audrey Der, Zhongfang Zhuang, Liang Wang, Wei Zhang
In this paper, we investigate the application of MTL to the time series classification (TSC) problem.
no code implementations • 5 Oct 2023 • Chin-Chia Michael Yeh, Huiyuan Chen, Xin Dai, Yan Zheng, Junpeng Wang, Vivian Lai, Yujie Fan, Audrey Der, Zhongfang Zhuang, Liang Wang, Wei Zhang, Jeff M. Phillips
A Content-based Time Series Retrieval (CTSR) system is an information retrieval system that lets users interact with time series data originating from multiple domains, such as finance, healthcare, and manufacturing.
no code implementations • 5 Oct 2023 • Chin-Chia Michael Yeh, Xin Dai, Huiyuan Chen, Yan Zheng, Yujie Fan, Audrey Der, Vivian Lai, Zhongfang Zhuang, Junpeng Wang, Liang Wang, Wei Zhang
A foundation model is a machine learning model trained on a large and diverse set of data, typically using self-supervised learning-based pre-training techniques, that can be adapted to various downstream tasks.
no code implementations • 2 Jun 2023 • Xin Dai, Yujie Fan, Zhongfang Zhuang, Shubham Jain, Chin-Chia Michael Yeh, Junpeng Wang, Liang Wang, Yan Zheng, Prince Osei Aboagye, Wei Zhang
Pre-training large models is prevalent and growing alongside the ever-increasing volume of user-generated content in many machine learning application categories.
no code implementations • 24 Mar 2023 • Yiran Li, Junpeng Wang, Xin Dai, Liang Wang, Chin-Chia Michael Yeh, Yan Zheng, Wei Zhang, Kwan-Liu Ma
Multi-head self-attention is then applied to the sequence to learn the attention between patches.
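A minimal sketch of multi-head self-attention over a patch sequence is shown below. The dimensions, weight names, and the single projection per Q/K/V are illustrative assumptions; the mechanism is the standard scaled dot-product attention computed per head and concatenated.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def multi_head_self_attention(x, wq, wk, wv, wo, heads):
    # x: (num_patches, d_model); wq/wk/wv/wo: (d_model, d_model).
    n, d = x.shape
    dh = d // heads
    # Project, then split the model dimension into per-head slices.
    q = (x @ wq).reshape(n, heads, dh).transpose(1, 0, 2)  # (heads, n, dh)
    k = (x @ wk).reshape(n, heads, dh).transpose(1, 0, 2)
    v = (x @ wv).reshape(n, heads, dh).transpose(1, 0, 2)
    att = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(dh))  # (heads, n, n)
    out = (att @ v).transpose(1, 0, 2).reshape(n, d)       # concat heads
    return out @ wo

rng = np.random.default_rng(0)
n_patches, d_model, heads = 10, 16, 4
x = rng.standard_normal((n_patches, d_model))
wq, wk, wv, wo = (rng.standard_normal((d_model, d_model)) for _ in range(4))
y = multi_head_self_attention(x, wq, wk, wv, wo, heads)
```

Each row of the `(heads, n, n)` attention tensor is a distribution over patches, so every output patch embedding is a learned weighted mixture of all patches.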
1 code implementation • 6 Dec 2022 • Yao Su, Xin Dai, Lifang He, Xiangnan Kong
Recent research on deformable image registration mainly focuses on improving registration accuracy through multi-stage alignment methods, where the source image is repeatedly deformed in stages by the same neural network until it is well aligned with the target image.