Search Results for author: Emadeldeen Eldele

Found 12 papers, 10 with code

TSLANet: Rethinking Transformers for Time Series Representation Learning

1 code implementation · 12 Apr 2024 · Emadeldeen Eldele, Mohamed Ragab, Zhenghua Chen, Min Wu, XiaoLi Li

Time series data, characterized by its intrinsic long and short-range dependencies, poses a unique challenge across analytical applications.

Anomaly Detection · Computational Efficiency · +4

Source-Free Domain Adaptation with Temporal Imputation for Time Series Data

1 code implementation · 14 Jul 2023 · Mohamed Ragab, Emadeldeen Eldele, Min Wu, Chuan-Sheng Foo, XiaoLi Li, Zhenghua Chen

The existing SFDA methods that are mainly designed for visual applications may fail to handle the temporal dynamics in time series, leading to impaired adaptation performance.

Imputation · Source-Free Domain Adaptation · +1

Label-efficient Time Series Representation Learning: A Review

no code implementations · 13 Feb 2023 · Emadeldeen Eldele, Mohamed Ragab, Zhenghua Chen, Min Wu, Chee-Keong Kwoh, XiaoLi Li

The scarcity of labeled data is one of the main challenges of applying deep learning models on time series data in the real world.

Representation Learning · Self-Supervised Learning · +3

Contrastive Domain Adaptation for Time-Series via Temporal Mixup

1 code implementation · 3 Dec 2022 · Emadeldeen Eldele, Mohamed Ragab, Zhenghua Chen, Min Wu, Chee-Keong Kwoh, XiaoLi Li

Specifically, we propose a novel temporal mixup strategy to generate two intermediate augmented views for the source and target domains.

Contrastive Learning · Time Series · +2
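As a rough illustration of the temporal mixup idea above, each domain's intermediate view can be built as a convex combination of source and target batches. This is a minimal sketch; the function name, tensor shapes, and fixed mixing ratio `lam` are assumptions, not the paper's exact formulation:

```python
import numpy as np

def temporal_mixup(x_src, x_trg, lam=0.9):
    """Create two intermediate augmented views by mixing the source
    and target time-series batches (illustrative sketch only).

    x_src, x_trg: arrays of shape (batch, channels, timesteps).
    lam: how strongly each view is dominated by its own domain.
    """
    src_view = lam * x_src + (1.0 - lam) * x_trg  # source-dominant view
    trg_view = lam * x_trg + (1.0 - lam) * x_src  # target-dominant view
    return src_view, trg_view
```

With `lam` close to 1, each mixed view stays close to its own domain while carrying a trace of the other, which is the property the cross-domain contrastive objective exploits.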

Self-supervised Contrastive Representation Learning for Semi-supervised Time-Series Classification

2 code implementations · 13 Aug 2022 · Emadeldeen Eldele, Mohamed Ragab, Zhenghua Chen, Min Wu, Chee-Keong Kwoh, XiaoLi Li, Cuntai Guan

Specifically, we propose time-series-specific weak and strong augmentations and use their views to learn robust temporal relations in the proposed temporal contrasting module, in addition to learning discriminative representations via the proposed contextual contrasting module.

Contrastive Learning · Data Augmentation · +5
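Weak/strong augmentation pairs for time series are commonly realized as jitter-and-scale (weak) versus permutation-and-jitter (strong). A minimal sketch along those lines, in which the noise levels, segment count, and function names are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def weak_augment(x, scale_sigma=0.1, noise_sigma=0.05):
    """Weak view: jitter-and-scale -- a small per-sample rescaling
    plus additive Gaussian noise. x: (batch, channels, timesteps)."""
    factor = rng.normal(1.0, scale_sigma, size=(x.shape[0], 1, 1))
    return x * factor + rng.normal(0.0, noise_sigma, size=x.shape)

def strong_augment(x, n_segments=4, noise_sigma=0.05):
    """Strong view: permutation-and-jitter -- shuffle time segments
    to disrupt temporal order, then add Gaussian noise."""
    t = x.shape[2]
    segments = np.array_split(np.arange(t), n_segments)
    order = rng.permutation(n_segments)
    permuted = np.concatenate([segments[i] for i in order])
    return x[:, :, permuted] + rng.normal(0.0, noise_sigma, size=x.shape)
```

The asymmetry is deliberate: the strong view breaks temporal order so the temporal contrasting module must learn relations that survive it, while the weak view preserves order and keeps the task solvable.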

ADATIME: A Benchmarking Suite for Domain Adaptation on Time Series Data

1 code implementation · 15 Mar 2022 · Mohamed Ragab, Emadeldeen Eldele, Wee Ling Tan, Chuan-Sheng Foo, Zhenghua Chen, Min Wu, Chee-Keong Kwoh, XiaoLi Li

Our evaluation includes adapting state-of-the-art visual domain adaptation methods to time series data, as well as recent methods developed specifically for time series.

Benchmarking · Time Series · +2

Self-supervised Autoregressive Domain Adaptation for Time Series Data

1 code implementation · 29 Nov 2021 · Mohamed Ragab, Emadeldeen Eldele, Zhenghua Chen, Min Wu, Chee-Keong Kwoh, XiaoLi Li

Second, we propose a novel autoregressive domain adaptation technique that incorporates temporal dependency of both source and target features during domain alignment.

Self-Supervised Learning · Time Series · +2

A Systematic Evaluation of Domain Adaptation Algorithms On Time Series Data

no code implementations · 29 Sep 2021 · Mohamed Ragab, Emadeldeen Eldele, Wee Ling Tan, Chuan-Sheng Foo, Zhenghua Chen, Min Wu, Chee Kwoh, XiaoLi Li

Our evaluation includes adaptations of state-of-the-art visual domain adaptation methods to time series data, in addition to recent methods developed specifically for time series.

Benchmarking · Model Selection · +3

ADAST: Attentive Cross-domain EEG-based Sleep Staging Framework with Iterative Self-Training

1 code implementation · 9 Jul 2021 · Emadeldeen Eldele, Mohamed Ragab, Zhenghua Chen, Min Wu, Chee-Keong Kwoh, XiaoLi Li, Cuntai Guan

Second, we design an iterative self-training strategy to improve the classification performance on the target domain via target domain pseudo labels.

Automatic Sleep Stage Classification · Domain Adaptation · +2

Time-Series Representation Learning via Temporal and Contextual Contrasting

1 code implementation · 26 Jun 2021 · Emadeldeen Eldele, Mohamed Ragab, Zhenghua Chen, Min Wu, Chee Keong Kwoh, XiaoLi Li, Cuntai Guan

In this paper, we propose an unsupervised Time-Series representation learning framework via Temporal and Contextual Contrasting (TS-TCC), to learn time-series representation from unlabeled data.

Automatic Sleep Stage Classification · Contrastive Learning · +9

An Attention-Based Deep Learning Approach for Sleep Stage Classification With Single-Channel EEG

1 code implementation · 28 Apr 2021 · Emadeldeen Eldele, Zhenghua Chen, Chengyu Liu, Min Wu, Chee-Keong Kwoh, XiaoLi Li, Cuntai Guan

The MRCNN extracts low- and high-frequency features, and the AFR improves the quality of the extracted features by modeling the inter-dependencies between them.

Automatic Sleep Stage Classification · EEG · +1
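Modeling inter-dependencies between feature channels, as the AFR does, is often implemented in squeeze-and-excitation style: pool each channel, pass the summary through a small bottleneck, and gate the channels with the result. A minimal sketch of that idea; the function name, weight shapes, and the SE formulation itself are assumptions here, not necessarily the paper's exact module:

```python
import numpy as np

def recalibrate_features(features, w_reduce, w_expand):
    """SE-style adaptive feature recalibration (illustrative sketch).

    features: (batch, channels, timesteps) feature maps.
    w_reduce: (channels, channels // r) bottleneck weights.
    w_expand: (channels // r, channels) expansion weights.
    """
    z = features.mean(axis=2)                     # squeeze: global average over time
    h = np.maximum(z @ w_reduce, 0.0)             # bottleneck + ReLU
    gate = 1.0 / (1.0 + np.exp(-(h @ w_expand)))  # per-channel sigmoid gate in (0, 1)
    return features * gate[:, :, None]            # rescale each channel
```

Because the gate depends on the pooled input itself, informative channels can be emphasized and noisy ones suppressed on a per-sample basis.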
