Search Results for author: Rahul Rahaman

Found 6 papers, 3 papers with code

C2F-TCN: A Framework for Semi and Fully Supervised Temporal Action Segmentation

no code implementations · 20 Dec 2022 · Dipika Singhania, Rahul Rahaman, Angela Yao

For the task of temporal action segmentation, we propose an encoder-decoder-style architecture named C2F-TCN featuring a "coarse-to-fine" ensemble of decoder outputs.

Action Segmentation · Representation Learning · +1
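The "coarse-to-fine" ensembling idea can be sketched as follows: frame-wise class probabilities predicted at several temporal resolutions are upsampled to full length and averaged. This is an illustrative toy version only; the function names, nearest-neighbour upsampling, and uniform averaging are assumptions, not the paper's exact scheme.

```python
import numpy as np

def upsample(probs: np.ndarray, length: int) -> np.ndarray:
    """Nearest-neighbour upsampling of (T, C) frame-wise probabilities."""
    idx = (np.arange(length) * probs.shape[0]) // length
    return probs[idx]

def coarse_to_fine_ensemble(decoder_outputs, length):
    """Upsample each decoder stage's output to full length, then average."""
    stacked = np.stack([upsample(p, length) for p in decoder_outputs])
    return stacked.mean(axis=0)  # (length, num_classes)

# toy usage: three decoder stages at resolutions 4, 8, 16 for a 16-frame clip
rng = np.random.default_rng(0)
outs = [rng.dirichlet(np.ones(3), size=t) for t in (4, 8, 16)]
ensemble = coarse_to_fine_ensemble(outs, 16)
```

Averaging distributions from coarser stages tends to smooth over-fragmented predictions, which is the motivation for combining resolutions in the first place.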

A Generalized & Robust Framework For Timestamp Supervision in Temporal Action Segmentation

no code implementations · 20 Jul 2022 · Rahul Rahaman, Dipika Singhania, Alexandre Thiery, Angela Yao

In temporal action segmentation, timestamp supervision requires only a handful of labelled frames per video sequence.

Action Segmentation · TAG
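To make the timestamp-supervision setting concrete, a naive pseudo-labelling baseline assigns every frame the class of its nearest annotated timestamp. This sketch only illustrates the setting; it is not the paper's framework, and the function name and inputs are hypothetical.

```python
import numpy as np

def nearest_timestamp_labels(timestamps, classes, num_frames):
    """Assign each frame the class of the nearest annotated timestamp.

    timestamps: frame indices with a single label each (one per action)
    classes:    class id for each timestamp
    """
    ts = np.asarray(timestamps)
    frames = np.arange(num_frames)
    nearest = np.abs(frames[:, None] - ts[None, :]).argmin(axis=1)
    return np.asarray(classes)[nearest]

# toy usage: three labelled frames in a 30-frame video
labels = nearest_timestamp_labels([3, 12, 25], [0, 2, 1], 30)
```

Such a baseline mislabels frames near action boundaries, which is why timestamp-supervised methods typically refine the pseudo-labels rather than use them directly.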

Iterative Contrast-Classify For Semi-supervised Temporal Action Segmentation

1 code implementation · 2 Dec 2021 · Dipika Singhania, Rahul Rahaman, Angela Yao

Our method hinges on unsupervised representation learning, which, for temporal action segmentation, poses unique challenges.

Action Segmentation · Representation Learning · +2

Coarse to Fine Multi-Resolution Temporal Convolutional Network

1 code implementation · 23 May 2021 · Dipika Singhania, Rahul Rahaman, Angela Yao

In this work, we propose a novel temporal encoder-decoder to tackle the problem of sequence fragmentation.

Action Segmentation · Segmentation · +2

Pretrained equivariant features improve unsupervised landmark discovery

no code implementations · 7 Apr 2021 · Rahul Rahaman, Atin Ghosh, Alexandre H. Thiery

Locating semantically meaningful landmark points is a crucial component of a large number of computer vision pipelines.

Uncertainty Quantification and Deep Ensembles

1 code implementation · NeurIPS 2021 · Rahul Rahaman, Alexandre H. Thiery

In fact, we show that standard ensembling methods, when used in conjunction with modern techniques such as mixup regularization, can lead to less calibrated models.

Data Augmentation · Uncertainty Quantification
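The claim above concerns the calibration of deep ensembles, which is commonly measured with the expected calibration error (ECE). The sketch below shows how averaging the softmax outputs of several models is scored with a binned ECE on toy data; the synthetic "models" and bin count are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def ece(conf, correct, n_bins=10):
    """Expected calibration error: accuracy-vs-confidence gap, bin-weighted."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    total = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            total += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return total

rng = np.random.default_rng(1)
labels = rng.integers(0, 3, size=500)

def member_probs():
    # toy stand-in for one trained model: noisy logits biased toward the truth
    logits = rng.normal(size=(500, 3))
    logits[np.arange(500), labels] += 2.0
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# deep ensemble: average the softmax outputs of independently trained members
ensemble = np.mean([member_probs() for _ in range(5)], axis=0)
conf = ensemble.max(axis=1)
correct = ensemble.argmax(axis=1) == labels
score = ece(conf, correct)
```

Comparing this score for a single member against the ensemble (and against mixup-trained members) is the kind of measurement the paper's finding is based on.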
