no code implementations • 13 Mar 2025 • Jiarui Sun, Chin-Chia Michael Yeh, Yujie Fan, Xin Dai, Xiran Fan, Zhimeng Jiang, Uday Singh Saini, Vivian Lai, Junpeng Wang, Huiyuan Chen, Zhongfang Zhuang, Yan Zheng, Girish Chowdhary
However, its practical application in large-scale settings is limited by quadratic time and space complexity ($O(N^2)$) with respect to the number of entities $N$.
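For intuition, the toy sketch below (illustrative sizes, not the paper's model) shows why scoring every pair of entities materializes an $N \times N$ matrix, which is the source of the quadratic time and space cost.

```python
# Illustrative only: all-to-all entity interaction scores form an N x N matrix,
# so both compute and memory grow as O(N^2) in the number of entities.
import numpy as np

N, d = 2_000, 64                                   # hypothetical entity count / feature dim
X = np.random.randn(N, d).astype(np.float32)

scores = X @ X.T / np.sqrt(d)                      # N x N score matrix (~16 MB here; ~400 MB at N=10k)
weights = np.exp(scores - scores.max(axis=1, keepdims=True))
weights /= weights.sum(axis=1, keepdims=True)
out = weights @ X                                  # aggregation is likewise O(N^2 * d)
```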
no code implementations • 28 Feb 2025 • Chin-Chia Michael Yeh, Xiran Fan, Zhimeng Jiang, Yujie Fan, Huiyuan Chen, Uday Singh Saini, Vivian Lai, Xin Dai, Junpeng Wang, Zhongfang Zhuang, Liang Wang, Yan Zheng
SparseTSF, a recently introduced competitive univariate forecasting model, harnesses periodicity to achieve compactness by concentrating on cross-period dynamics, thereby extending the Pareto frontier with respect to model size and predictive performance.
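As a rough illustration of the cross-period idea, the sketch below groups time points by their phase within a known period and applies a single shared linear map per phase. It is a minimal reading of the SparseTSF design, not the authors' exact implementation.

```python
# Minimal cross-period linear forecaster in the spirit of SparseTSF
# (illustrative only; consult the SparseTSF paper/code for the exact model).
import torch
import torch.nn as nn

class CrossPeriodLinear(nn.Module):
    def __init__(self, lookback: int, horizon: int, period: int):
        super().__init__()
        assert lookback % period == 0 and horizon % period == 0
        self.period = period
        # One linear map shared across all phases of the period.
        self.linear = nn.Linear(lookback // period, horizon // period)

    def forward(self, x):                                   # x: (batch, lookback)
        b, L = x.shape
        # Group points that share the same phase within the period.
        x = x.view(b, L // self.period, self.period).transpose(1, 2)   # (b, period, L/period)
        y = self.linear(x)                                             # (b, period, H/period)
        return y.transpose(1, 2).reshape(b, -1)                        # (b, horizon)

model = CrossPeriodLinear(lookback=336, horizon=96, period=24)
print(model(torch.randn(8, 336)).shape)                     # torch.Size([8, 96])
```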
no code implementations • 14 Sep 2024 • Chin-Chia Michael Yeh, Audrey Der, Uday Singh Saini, Vivian Lai, Yan Zheng, Junpeng Wang, Xin Dai, Zhongfang Zhuang, Yujie Fan, Huiyuan Chen, Prince Osei Aboagye, Liang Wang, Wei zhang, Eamonn Keogh
This paper delves into the problem of anomaly detection in multidimensional time series, a common occurrence in real-world applications.
no code implementations • 6 Sep 2024 • Liang Wang, Shubham Jain, Yingtong Dou, Junpeng Wang, Chin-Chia Michael Yeh, Yujie Fan, Prince Aboagye, Yan Zheng, Xin Dai, Zhongfang Zhuang, Uday Singh Saini, Wei zhang
Our findings underscore the significance of individual user tastes in the context of online product rating prediction and the robustness of our approach across different model architectures.
no code implementations • 4 Sep 2024 • Takanori Fujiwara, Kostiantyn Kucher, Junpeng Wang, Rafael M. Martins, Andreas Kerren, Anders Ynnerman
Research in ML4VIS investigates how to use machine learning (ML) techniques to generate visualizations, and the field is rapidly growing with high societal impact.
no code implementations • 15 Aug 2024 • Audrey Der, Chin-Chia Michael Yeh, Xin Dai, Huiyuan Chen, Yan Zheng, Yujie Fan, Zhongfang Zhuang, Vivian Lai, Junpeng Wang, Liang Wang, Wei zhang, Eamonn Keogh
Self-supervised Pretrained Models (PTMs) have demonstrated remarkable performance in computer vision and natural language processing tasks.
no code implementations • 19 Jul 2024 • Guan Li, Yang Liu, Guihua Shan, Shiyu Cheng, Weiqun Cao, Junpeng Wang, Ko-Chih Wang
Through experiments conducted on real-world simulations and comparisons with state-of-the-art deep learning-based approaches, we demonstrate the efficacy of our solution.
no code implementations • 16 Feb 2024 • Chin-Chia Michael Yeh, Yujie Fan, Xin Dai, Uday Singh Saini, Vivian Lai, Prince Osei Aboagye, Junpeng Wang, Huiyuan Chen, Yan Zheng, Zhongfang Zhuang, Liang Wang, Wei zhang
Spatial-temporal forecasting systems play a crucial role in addressing numerous real-world challenges.
Ranked #6 on Traffic Prediction on LargeST
no code implementations • 16 Jan 2024 • Audrey Der, Chin-Chia Michael Yeh, Yan Zheng, Junpeng Wang, Zhongfang Zhuang, Liang Wang, Wei zhang, Eamonn J. Keogh
In this work, we introduce a domain-agnostic counterfactual explanation technique to produce explanations for time series anomalies.
no code implementations • 2 Jan 2024 • Prince Aboagye, Yan Zheng, Junpeng Wang, Uday Singh Saini, Xin Dai, Michael Yeh, Yujie Fan, Zhongfang Zhuang, Shubham Jain, Liang Wang, Wei zhang
The emergence of pre-trained models has significantly impacted fields ranging from Natural Language Processing (NLP) and Computer Vision to relational datasets.
no code implementations • 5 Nov 2023 • Chin-Chia Michael Yeh, Huiyuan Chen, Yujie Fan, Xin Dai, Yan Zheng, Vivian Lai, Junpeng Wang, Zhongfang Zhuang, Liang Wang, Wei zhang, Eamonn Keogh
The ego-networks of all subsequences collectively form a time series subsequence graph, and we introduce an algorithm to efficiently construct this graph.
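A minimal sketch of one plausible construction is shown below, assuming the graph is a k-nearest-neighbor graph over z-normalized sliding-window subsequences; the paper's algorithm is designed to build such a graph efficiently, unlike this brute-force version.

```python
# Hedged sketch: build a k-NN graph over z-normalized sliding-window subsequences;
# each node's neighbor set is its "ego-network".  Illustration only, not the
# paper's exact construction algorithm.
import numpy as np

def subsequence_knn_graph(ts, m, k):
    # Extract all length-m subsequences and z-normalize each one.
    subs = np.lib.stride_tricks.sliding_window_view(ts, m).astype(float)
    subs = (subs - subs.mean(axis=1, keepdims=True)) / (subs.std(axis=1, keepdims=True) + 1e-8)
    # Pairwise Euclidean distances (O(n^2); a real system would use a faster index).
    d = np.linalg.norm(subs[:, None, :] - subs[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                 # exclude self-matches
    return np.argsort(d, axis=1)[:, :k]         # ego-network (k neighbors) per subsequence

ts = np.sin(np.linspace(0, 20, 500)) + 0.1 * np.random.randn(500)
print(subsequence_knn_graph(ts, m=32, k=5).shape)   # (469, 5)
```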
no code implementations • 5 Nov 2023 • Audrey Der, Chin-Chia Michael Yeh, Yan Zheng, Junpeng Wang, Huiyuan Chen, Zhongfang Zhuang, Liang Wang, Wei zhang, Eamonn Keogh
As a result, unmodified data mining tools can obtain near-identical performance on the synthesized time series as on the original time series.
no code implementations • 5 Nov 2023 • Chin-Chia Michael Yeh, Huiyuan Chen, Xin Dai, Yan Zheng, Yujie Fan, Vivian Lai, Junpeng Wang, Audrey Der, Zhongfang Zhuang, Liang Wang, Wei zhang
To facilitate this investigation, we introduce a CTSR benchmark dataset that comprises time series data from a variety of domains, such as motion, power demand, and traffic.
no code implementations • 5 Nov 2023 • Chin-Chia Michael Yeh, Yan Zheng, Menghai Pan, Huiyuan Chen, Zhongfang Zhuang, Junpeng Wang, Liang Wang, Wei zhang, Jeff M. Phillips, Eamonn Keogh
In this work, we propose a sketch for discord mining among multi-dimensional time series.
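For reference, a brute-force baseline for multi-dimensional discord mining is sketched below: the discord is the subsequence whose nearest-neighbor distance, summed over dimensions, is largest. The paper's contribution is a sketch-based approximation of this computation, which the toy code does not reproduce.

```python
# Brute-force illustration of a multi-dimensional discord: the subsequence whose
# nearest-neighbor distance (summed over dimensions) is largest.
import numpy as np

def discord_index(X, m):                        # X: (dims, length)
    n = X.shape[1] - m + 1
    nn_dist = np.full(n, np.inf)
    for i in range(n):
        best = np.inf
        for j in range(n):
            if abs(i - j) < m:                  # skip trivial (overlapping) matches
                continue
            d = sum(np.linalg.norm(X[k, i:i+m] - X[k, j:j+m]) for k in range(X.shape[0]))
            best = min(best, d)
        nn_dist[i] = best
    return int(np.argmax(nn_dist)), nn_dist

X = np.random.randn(3, 300)
X[:, 150:170] += 5                              # implant an anomaly
print(discord_index(X, m=20)[0])                # likely lands near index 150
```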
no code implementations • 2 Nov 2023 • Yiran Li, Junpeng Wang, Prince Aboagye, Michael Yeh, Yan Zheng, Liang Wang, Wei zhang, Kwan-Liu Ma
On the one hand, by visually examining the captions automatically generated by language-image models for an image dataset, we gain deeper insights into the semantic underpinnings of the visual content, unearthing data biases that may be entrenched within the dataset.
1 code implementation • 20 Oct 2023 • Dongyu Zhang, Liang Wang, Xin Dai, Shubham Jain, Junpeng Wang, Yujie Fan, Chin-Chia Michael Yeh, Yan Zheng, Zhongfang Zhuang, Wei zhang
FATA-Trans is field- and time-aware for sequential tabular data.
no code implementations • 5 Oct 2023 • Chin-Chia Michael Yeh, Xin Dai, Huiyuan Chen, Yan Zheng, Yujie Fan, Audrey Der, Vivian Lai, Zhongfang Zhuang, Junpeng Wang, Liang Wang, Wei zhang
A foundation model is a machine learning model trained on a large and diverse set of data, typically using self-supervised learning-based pre-training techniques, that can be adapted to various downstream tasks.
no code implementations • 5 Oct 2023 • Chin-Chia Michael Yeh, Xin Dai, Yan Zheng, Junpeng Wang, Huiyuan Chen, Yujie Fan, Audrey Der, Zhongfang Zhuang, Liang Wang, Wei zhang
In this paper, we investigate the application of MTL to the time series classification (TSC) problem.
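A common MTL setup for TSC is sketched below, assuming a shared encoder with one classification head per task; the paper's specific architectures and task definitions may differ.

```python
# Minimal multi-task learning sketch for time series classification:
# a shared 1-D convolutional encoder with one classification head per task.
import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, hidden, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten())

    def forward(self, x):                       # x: (batch, 1, length)
        return self.net(x)                      # (batch, hidden)

class MultiTaskTSC(nn.Module):
    def __init__(self, num_classes_per_task, hidden=64):
        super().__init__()
        self.encoder = SharedEncoder(hidden)
        self.heads = nn.ModuleList([nn.Linear(hidden, c) for c in num_classes_per_task])

    def forward(self, x, task_id):
        return self.heads[task_id](self.encoder(x))

model = MultiTaskTSC(num_classes_per_task=[5, 3])
logits = model(torch.randn(8, 1, 128), task_id=0)   # (8, 5)
```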
no code implementations • 5 Oct 2023 • Chin-Chia Michael Yeh, Huiyuan Chen, Xin Dai, Yan Zheng, Junpeng Wang, Vivian Lai, Yujie Fan, Audrey Der, Zhongfang Zhuang, Liang Wang, Wei zhang, Jeff M. Phillips
A Content-based Time Series Retrieval (CTSR) system is an information retrieval system that lets users interact with time series from multiple domains, such as finance, healthcare, and manufacturing.
no code implementations • 2 Aug 2023 • Yan Zheng, Junpeng Wang, Chin-Chia Michael Yeh, Yujie Fan, Huiyuan Chen, Liang Wang, Wei zhang
The tool helps users discover nuanced features of data entities, perform feature denoising/injection during embedding training, and generate embeddings for unseen entities.
no code implementations • 18 Jul 2023 • Huiyuan Chen, Chin-Chia Michael Yeh, Yujie Fan, Yan Zheng, Junpeng Wang, Vivian Lai, Mahashweta Das, Hao Yang
Graph Neural Networks (GNNs) have achieved impressive performance in collaborative filtering.
no code implementations • 15 Jul 2023 • Junpeng Wang, Shixia Liu, Wei zhang
The past decade has witnessed a plethora of works that leverage the power of visualization (VIS) to interpret machine learning (ML) models.
no code implementations • 2 Jun 2023 • Xin Dai, Yujie Fan, Zhongfang Zhuang, Shubham Jain, Chin-Chia Michael Yeh, Junpeng Wang, Liang Wang, Yan Zheng, Prince Osei Aboagye, Wei zhang
Pre-training large models has become prevalent and continues to grow with the ever-increasing volume of user-generated content across many machine learning application categories.
no code implementations • 30 May 2023 • Junpeng Wang, Mengke Ge, Bo Ding, Qi Xu, Song Chen, Yi Kang
As one of the feasible processing-in-memory (PIM) architectures, the 3D-stacked-DRAM-based PIM (DRAM-PIM) architecture enables large-capacity memory and low-cost memory access, making it a promising solution for DNN accelerators with better performance and energy efficiency.
no code implementations • 24 Mar 2023 • Yiran Li, Junpeng Wang, Xin Dai, Liang Wang, Chin-Chia Michael Yeh, Yan Zheng, Wei zhang, Kwan-Liu Ma
Multi-head self-attention is then applied to the sequence to learn the attention between patches.
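The patch-then-attend pattern described here can be sketched generically as below; this is not the paper's model, and the dimensions are illustrative.

```python
# Generic sketch: split an input into patches, embed each patch, and apply
# multi-head self-attention over the resulting token sequence.
import torch
import torch.nn as nn

patch_len, d_model, n_heads = 16, 64, 4
x = torch.randn(8, 256)                                           # batch of length-256 inputs
patches = x.unfold(dimension=1, size=patch_len, step=patch_len)   # (8, 16, 16)

embed = nn.Linear(patch_len, d_model)
attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

tokens = embed(patches)                                           # (8, 16, 64)
out, attn_weights = attn(tokens, tokens, tokens)                  # attn_weights: (8, 16, 16)
```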
no code implementations • 6 Mar 2023 • Yiran Li, Junpeng Wang, Takanori Fujiwara, Kwan-Liu Ma
Adversarial attacks on a convolutional neural network (CNN) -- injecting human-imperceptible perturbations into an input image -- could fool a high-performance CNN into making incorrect predictions.
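As a concrete instance of such an attack, the sketch below applies the standard fast gradient sign method (FGSM); the paper visualizes and analyzes attacks of this kind rather than prescribing FGSM specifically.

```python
# FGSM sketch: one gradient-sign step perturbs the input in the direction that
# increases the classification loss, while staying within a small epsilon budget.
import torch
import torch.nn.functional as F
import torchvision.models as models

model = models.resnet18().eval()                 # randomly initialized here; use trained weights in practice
image = torch.rand(1, 3, 224, 224, requires_grad=True)
label = torch.tensor([0])

loss = F.cross_entropy(model(image), label)
loss.backward()

epsilon = 8 / 255
adversarial = (image + epsilon * image.grad.sign()).clamp(0, 1).detach()
```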
no code implementations • 11 Dec 2022 • Bo Ding, Jinglei Huang, Junpeng Wang, Qi Xu, Song Chen, Yi Kang
To better solve the problems in the automation process of FPGA-PDRS and narrow the gap between algorithm and application, in this paper we propose a complete workflow comprising three parts: pre-processing to generate the list of candidate shapes for each task module according to its resource requirements, an exploration process to search for a solution to task-module partitioning, scheduling, and floorplanning, and post-optimization to improve the success rate of the floorplan.
no code implementations • 9 Dec 2022 • Audrey Der, Chin-Chia Michael Yeh, Renjie Wu, Junpeng Wang, Yan Zheng, Zhongfang Zhuang, Liang Wang, Wei zhang, Eamonn Keogh
PRCIS is a distance measure for long time series, which exploits recent progress in our ability to summarize time series with dictionaries.
no code implementations • AMTA 2022 • Prince O Aboagye, Yan Zheng, Michael Yeh, Junpeng Wang, Zhongfang Zhuang, Huiyuan Chen, Liang Wang, Wei zhang, Jeff Phillips
Optimal Transport (OT) provides a useful geometric framework to estimate the permutation matrix under unsupervised cross-lingual word embedding (CLWE) models that pose the alignment task as a Wasserstein-Procrustes problem.
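A toy sketch of the Wasserstein-Procrustes idea is given below, alternating an entropic optimal-transport step with a closed-form orthogonal Procrustes step; the hyper-parameters and normalization are illustrative, not the paper's.

```python
# Toy Wasserstein-Procrustes: alternate (1) Sinkhorn OT to estimate a soft
# correspondence between the two vocabularies and (2) SVD-based orthogonal
# Procrustes to fit the mapping.  Illustration only.
import numpy as np

def sinkhorn(C, reg=0.05, iters=200):
    """Entropic OT with uniform marginals; returns a soft permutation matrix."""
    K = np.exp(-C / (reg * C.max()))
    u, v = np.ones(C.shape[0]), np.ones(C.shape[1])
    for _ in range(iters):
        u = 1.0 / (K @ v + 1e-12)
        v = 1.0 / (K.T @ u + 1e-12)
    return u[:, None] * K * v[None, :]

def wasserstein_procrustes(X, Y, steps=10):
    W = np.eye(X.shape[1])                        # orthogonal map: source -> target space
    for _ in range(steps):
        C = (((X @ W)[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        P = sinkhorn(C)                           # soft word-to-word correspondence
        U, _, Vt = np.linalg.svd(X.T @ (P @ Y))   # closed-form orthogonal Procrustes
        W = U @ Vt
    return W, P

X, Y = np.random.randn(300, 50), np.random.randn(300, 50)
W, P = wasserstein_procrustes(X, Y)
```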
no code implementations • 11 Aug 2022 • Chin-Chia Michael Yeh, Mengting Gu, Yan Zheng, Huiyuan Chen, Javid Ebrahimi, Zhongfang Zhuang, Junpeng Wang, Liang Wang, Wei zhang
Graph neural networks (GNNs) are deep learning models designed specifically for graph data, and they typically rely on node features as the input to the first layer.
no code implementations • 19 Jan 2022 • Junpeng Wang, Liang Wang, Yan Zheng, Chin-Chia Michael Yeh, Shubham Jain, Wei zhang
With these metrics, one can easily identify meta-features with the most complementary behaviors in two classifiers, and use them to better ensemble the classifiers.
no code implementations • 24 Dec 2021 • Chin-Chia Michael Yeh, Yan Zheng, Junpeng Wang, Huiyuan Chen, Zhongfang Zhuang, Wei zhang, Eamonn Keogh
The matrix profile is an effective data mining tool that provides similarity join functionality for time series data.
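For readers unfamiliar with the matrix profile, its brute-force definition is sketched below: for each subsequence, the z-normalized Euclidean distance to its nearest non-overlapping neighbor. Practical algorithms such as STOMP and SCRIMP compute this far more efficiently than this O(n^2) loop.

```python
# Naive matrix profile: nearest-neighbor distance of every z-normalized subsequence,
# excluding trivial (overlapping) matches.
import numpy as np

def naive_matrix_profile(ts, m):
    subs = np.lib.stride_tricks.sliding_window_view(ts, m).astype(float)
    subs = (subs - subs.mean(axis=1, keepdims=True)) / (subs.std(axis=1, keepdims=True) + 1e-8)
    n = len(subs)
    profile = np.full(n, np.inf)
    for i in range(n):
        d = np.linalg.norm(subs - subs[i], axis=1)
        d[max(0, i - m + 1): i + m] = np.inf      # exclusion zone around trivial matches
        profile[i] = d.min()
    return profile

ts = np.r_[np.sin(np.linspace(0, 30, 600)), np.random.randn(50)]
mp = naive_matrix_profile(ts, m=50)
print(int(np.argmax(mp)))                          # highest value marks the most anomalous region
```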
no code implementations • 29 Sep 2021 • Chin-Chia Michael Yeh, Mengting Gu, Yan Zheng, Huiyuan Chen, Javid Ebrahimi, Zhongfang Zhuang, Junpeng Wang, Liang Wang, Wei zhang
When applying such networks to graphs without node features, one can either extract simple graph-based node features (e.g., node degree) or learn the input node representations (i.e., embeddings) while training the network.
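The two options mentioned here can be sketched with a minimal one-layer message-passing network that takes either the node degree or a learned embedding as its input (illustrative only, not the paper's models).

```python
# Option (a): hand-crafted degree feature.  Option (b): learnable embedding table.
# Both feed a simple one-round neighbor-aggregation layer.
import torch
import torch.nn as nn

class SimpleGNN(nn.Module):
    def __init__(self, num_nodes, dim=16, use_learned_embeddings=True):
        super().__init__()
        self.use_learned = use_learned_embeddings
        self.emb = nn.Embedding(num_nodes, dim)           # option (b): learned node inputs
        in_dim = dim if use_learned_embeddings else 1     # option (a): degree scalar
        self.lin = nn.Linear(in_dim, dim)

    def forward(self, adj):                               # adj: dense (num_nodes, num_nodes)
        h = self.emb.weight if self.use_learned else adj.sum(dim=1, keepdim=True)
        h = torch.relu(self.lin(h))
        return adj @ h                                    # one round of neighbor aggregation

adj = (torch.rand(50, 50) > 0.9).float()
adj = ((adj + adj.T) > 0).float()
out = SimpleGNN(num_nodes=50, use_learned_embeddings=False)(adj)
```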
no code implementations • 21 Sep 2021 • Chin-Chia Michael Yeh, Zhongfang Zhuang, Junpeng Wang, Yan Zheng, Javid Ebrahimi, Ryan Mercer, Liang Wang, Wei zhang
In this work, we study the problem of multivariate time series prediction for estimating transaction metrics associated with entities in the payment transaction database.
1 code implementation • NeurIPS 2021 • Prince Osei Aboagye, Jeff Phillips, Yan Zheng, Chin-Chia Michael Yeh, Junpeng Wang, Wei zhang, Liang Wang, Hao Yang
Learning a good transfer function to map the word vectors from two languages into a shared cross-lingual word vector space plays a crucial role in cross-lingual NLP.
1 code implementation • 6 Apr 2021 • Archit Rathore, Sunipa Dev, Jeff M. Phillips, Vivek Srikumar, Yan Zheng, Chin-Chia Michael Yeh, Junpeng Wang, Wei zhang, Bei Wang
To aid this, we present Visualization of Embedding Representations for deBiasing system ("VERB"), an open-source web-based visualization tool that helps the users gain a technical understanding and visual intuition of the inner workings of debiasing techniques, with a focus on their geometric properties.
no code implementations • 5 Nov 2020 • Chin-Chia Michael Yeh, Zhongfang Zhuang, Yan Zheng, Liang Wang, Junpeng Wang, Wei zhang
In this work, we approach this problem from a multi-modal learning perspective, using not only the merchant time series data but also merchant-to-merchant relationship information (i.e., affinity) to verify the self-reported business type (i.e., merchant category) of a given merchant.
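One way such a multi-modal setup could look is sketched below, assuming a shared time series encoder whose outputs for a merchant and its affine neighbors are fused for category classification; the names and dimensions are hypothetical, not the paper's architecture.

```python
# Hypothetical fusion sketch: encode the merchant's own time series, aggregate the
# encodings of related (affine) merchants, and classify the concatenation against
# the self-reported category.
import torch
import torch.nn as nn

class MerchantVerifier(nn.Module):
    def __init__(self, hidden=32, num_categories=20):
        super().__init__()
        self.ts_encoder = nn.GRU(1, hidden, batch_first=True)
        self.classifier = nn.Linear(2 * hidden, num_categories)

    def forward(self, ts, neighbor_ts):
        # ts: (batch, length, 1); neighbor_ts: (batch, n_neighbors, length, 1)
        _, h_self = self.ts_encoder(ts)                                 # (1, batch, hidden)
        b, k, L, _ = neighbor_ts.shape
        _, h_nbr = self.ts_encoder(neighbor_ts.reshape(b * k, L, 1))
        h_nbr = h_nbr.reshape(b, k, -1).mean(dim=1)                     # affinity aggregation
        return self.classifier(torch.cat([h_self.squeeze(0), h_nbr], dim=-1))

model = MerchantVerifier()
logits = model(torch.randn(4, 90, 1), torch.randn(4, 5, 90, 1))         # (4, 20)
```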
no code implementations • 8 Sep 2020 • Guan Li, Junpeng Wang, Han-Wei Shen, Kaixin Chen, Guihua Shan, Zhonghua Lu
It considers the importance of convolutional filters through both instability and sensitivity, and allows users to interactively create pruning plans according to a desired goal on model size or accuracy.
no code implementations • 25 Jul 2020 • Zhongfang Zhuang, Chin-Chia Michael Yeh, Liang Wang, Wei zhang, Junpeng Wang
New challenges have surfaced in monitoring and guaranteeing the integrity of payment processing systems.
no code implementations • 1 Aug 2019 • Wenbin He, Junpeng Wang, Hanqi Guo, Ko-Chih Wang, Han-Wei Shen, Mukund Raj, Youssef S. G. Nashed, Tom Peterka
We propose InSituNet, a deep learning based surrogate model to support parameter space exploration for ensemble simulations that are visualized in situ.