Pose Tracking

62 papers with code • 3 benchmarks • 10 datasets

Pose Tracking is the task of estimating multi-person human poses in videos and assigning unique instance IDs for each keypoint across frames. Accurate estimation of human keypoint-trajectories is useful for human action recognition, human interaction understanding, motion capture and animation.

Source: LightTrack: A Generic Framework for Online Top-Down Human Pose Tracking
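The ID-assignment step described above can be illustrated with a toy sketch: a greedy matcher that links each pose in the current frame to the nearest pose in the previous frame by mean keypoint distance, reusing that instance's ID, and assigns a fresh ID otherwise. This is a minimal illustration of the idea, not any particular paper's method (real trackers typically use Hungarian matching, pose similarity such as OKS, and re-identification features); the function name and threshold are hypothetical.

```python
import numpy as np

def track_ids(prev_poses, prev_ids, curr_poses, next_id, max_dist=50.0):
    """Greedily match current poses to previous-frame poses by mean
    keypoint distance. A matched pose inherits the previous instance ID;
    an unmatched pose gets a fresh ID. Poses are (K, 2) keypoint arrays."""
    ids, used = [], set()
    for pose in curr_poses:
        best, best_d = None, max_dist
        for j, prev in enumerate(prev_poses):
            if j in used:
                continue
            d = np.linalg.norm(pose - prev, axis=1).mean()
            if d < best_d:
                best, best_d = j, d
        if best is None:
            ids.append(next_id)
            next_id += 1
        else:
            used.add(best)
            ids.append(prev_ids[best])
    return ids, next_id
```

For example, two people who swap order in the detection list between frames still keep their original IDs, because matching is by spatial distance rather than list position.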



Most implemented papers

Deep High-Resolution Representation Learning for Human Pose Estimation

leoxiaobin/deep-high-resolution-net.pytorch CVPR 2019

We start from a high-resolution subnetwork as the first stage, gradually add high-to-low resolution subnetworks one by one to form more stages, and connect the multi-resolution subnetworks in parallel.
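The parallel multi-resolution design can be sketched as repeated information exchange between branches: each branch adds in the other branch's features resampled to its own scale. The numpy toy below illustrates only this exchange idea; actual HRNet uses strided convolutions for downsampling and learned bilinear upsampling, so the pooling/repeat operations here are simplifying stand-ins.

```python
import numpy as np

def downsample(x):
    # 2x2 average pooling as a stand-in for a strided convolution
    h, w, c = x.shape
    return x.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

def upsample(x):
    # nearest-neighbour upsampling as a stand-in for bilinear upsampling
    return x.repeat(2, axis=0).repeat(2, axis=1)

def exchange(high, low):
    """HRNet-style fusion of two parallel branches: each branch sums its
    own features with the other branch's features resampled to its scale,
    so both resolutions are maintained throughout the network."""
    return high + upsample(low), low + downsample(high)

high = np.random.rand(8, 8, 4)  # high-resolution branch features (H, W, C)
low = np.random.rand(4, 4, 4)   # half-resolution branch features
new_high, new_low = exchange(high, low)
```

Because the exchange preserves each branch's resolution, the high-resolution representation is kept throughout rather than recovered from a low-resolution bottleneck, which is the key property for precise keypoint heatmaps.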

Simple Baselines for Human Pose Estimation and Tracking

leoxiaobin/pose.pytorch ECCV 2018

There has been significant progress on pose estimation and increasing interest in pose tracking in recent years.

BlazePose: On-device Real-time Body Pose tracking

google/mediapipe 17 Jun 2020

We present BlazePose, a lightweight convolutional neural network architecture for human pose estimation that is tailored for real-time inference on mobile devices.

Event-based Camera Pose Tracking using a Generative Event Model

uzh-rpg/rpg_image_reconstruction_from_events 7 Oct 2015

Event-based vision sensors mimic the operation of the biological retina and represent a major paradigm shift from traditional cameras.

PoseTrack: Joint Multi-Person Pose Estimation and Tracking

umariqb/posetrack-cvpr2017 CVPR 2017

In this work, we introduce the challenging problem of joint multi-person pose estimation and tracking of an unknown number of persons in unconstrained videos.

Capturing Hand Motion with an RGB-D Sensor, Fusing a Generative Model with Salient Points

cvlabbonn/hands_3d_motion_viewer 3 Apr 2017

In this work, we propose a framework for hand tracking that can capture the motion of two interacting hands using only a single, inexpensive RGB-D camera.

PoseTrack: A Benchmark for Human Pose Estimation and Tracking

open-mmlab/mmpose CVPR 2018

In this work, we aim to further advance the state of the art by establishing "PoseTrack", a new large-scale benchmark for video-based human pose estimation and articulated tracking, and bringing together the community of researchers working on visual human analysis.

Multigrid Predictive Filter Flow for Unsupervised Learning on Videos

aimerykong/predictive-filter-flow 2 Apr 2019

We introduce multigrid Predictive Filter Flow (mgPFF), a framework for unsupervised learning on videos.

LightTrack: A Generic Framework for Online Top-Down Human Pose Tracking

Guanghan/lighttrack 7 May 2019

To the best of our knowledge, this is the first paper to propose an online human pose tracking framework in a top-down fashion.

6-PACK: Category-level 6D Pose Tracker with Anchor-Based Keypoints

j96w/6-PACK 23 Oct 2019

We present 6-PACK, a deep learning approach to category-level 6D object pose tracking on RGB-D data.