CLOCS: Contrastive Learning of Cardiac Signals Across Space, Time, and Patients

27 May 2020 · Dani Kiyasseh, Tingting Zhu, David A. Clifton

The healthcare industry generates troves of unlabelled physiological data. This data can be exploited via contrastive learning, a self-supervised pre-training method that encourages representations of instances to be similar to one another...

No code implementations yet.
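
Since no official implementation is linked, the following is a minimal sketch of the NT-Xent contrastive loss listed under the methods below, showing how representations of two views of the same instance are encouraged to be similar while other instances are pushed apart. The tensor names, batch size, and temperature are illustrative assumptions, not the paper's settings.

```python
# Minimal sketch of the NT-Xent (normalized temperature-scaled cross-entropy) loss.
# Illustrative only: names, batch size, and temperature are assumptions.
import torch
import torch.nn.functional as F


def nt_xent_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """z1, z2: (N, D) projections of two views of the same N instances."""
    n = z1.shape[0]
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2N, D), unit norm
    sim = (z @ z.T) / temperature                        # scaled cosine similarities
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float("-inf"))           # exclude self-similarity
    # For anchor i, the positive is the other view of the same instance (i + n or i - n).
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)]).to(z.device)
    return F.cross_entropy(sim, targets)


# Toy usage: projections of two augmented views of 8 instances.
z1, z2 = torch.randn(8, 32), torch.randn(8, 32)
print(nt_xent_loss(z1, z2).item())
```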


Methods used in the Paper


METHOD                      TYPE
Batch Normalization         Normalization
1x1 Convolution             Convolutions
Residual Connection         Skip Connections
Bottleneck Residual Block   Skip Connection Blocks
Max Pooling                 Pooling Operations
Residual Block              Skip Connection Blocks
Dense Connections           Feedforward Networks
Kaiming Initialization      Initialization
Average Pooling             Pooling Operations
Convolution                 Convolutions
Global Average Pooling      Pooling Operations
Random Resized Crop         Image Data Augmentation
ResNet                      Convolutional Neural Networks
ReLU                        Activation Functions
Feedforward Network         Feedforward Networks
Random Gaussian Blur        Image Data Augmentation
ColorJitter                 Image Data Augmentation
NT-Xent                     Loss Functions
SimCLR                      Self-Supervised Learning