Real-Time Human Pose Tracking from Range Data


Varun Ganapathi Christian Plagemann Daphne Koller Sebastian Thrun

ECCV 2012


Abstract

Tracking human pose in real-time is a difficult problem with many interesting applications. Existing solutions suffer from a variety of problems, especially when confronted with unusual human poses. In this paper, we derive an algorithm for tracking human pose in real-time from depth sequences based on MAP inference in a probabilistic temporal model. The key idea is to extend the iterative closest points (ICP) objective by modeling the constraint that the observed subject cannot enter free space, the area of space in front of the true range measurements. Our primary contribution is an extension to the articulated ICP algorithm that can efficiently enforce this constraint. Our experiments show that including this term improves tracking accuracy significantly. The resulting filter runs at 125 frames per second using a single desktop CPU core. We provide extensive experimental results on challenging real-world data, which show that the algorithm outperforms the previous state-of-the-art trackers both in computational efficiency and accuracy.
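To illustrate the key idea from the abstract, here is a minimal toy sketch (not the paper's actual formulation) of how a free-space term can be added to an ICP-style energy. The scene is reduced to a 1D depth image: each model point has a pixel and a depth, the data term is the usual squared distance to the measurement, and a hinge penalty fires only when a model point lies *in front of* the measured surface, i.e. in space the camera verifiably saw through. The function name `pose_energy` and the weight `lam` are hypothetical choices for this sketch.

```python
def pose_energy(model, depth, lam=10.0):
    """Toy ICP-style energy with a free-space penalty.

    model: list of (pixel, z) pairs, the model points projected into the image.
    depth: list of measured depths, one per pixel.
    lam:   weight of the free-space term (hypothetical; the paper derives
           its constraint within a probabilistic temporal model).
    """
    data = 0.0
    free = 0.0
    for px, z in model:
        d = depth[px]
        data += (z - d) ** 2       # standard ICP data term: pull toward measurement
        if z < d:                  # point lies in observed free space
            free += (d - z) ** 2   # asymmetric penalty: the camera saw past it
    return data + lam * free

# Two candidate poses at equal distance from the surface: one occluded
# behind it (plausible), one floating in front of it (in free space).
depth = [2.0, 2.0, 2.0]
behind = [(i, 2.5) for i in range(3)]
front = [(i, 1.5) for i in range(3)]

# Plain ICP (lam=0) cannot tell them apart; the free-space term can.
print(pose_energy(behind, depth, lam=0.0) == pose_energy(front, depth, lam=0.0))
print(pose_energy(front, depth, lam=10.0) > pose_energy(behind, depth, lam=10.0))
```

Without the free-space term the two candidates are symmetric and indistinguishable; the extra term breaks the tie, which is the intuition behind the accuracy gains reported above.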

Paper

Real-Time Human Pose Tracking from Range Data (PDF, 2.4 MB)

Video

Processing the EVAL dataset: MP4 Video (15.9 MB)

Evaluation Data Set

3D point clouds from a depth camera, 3D marker positions from a Vicon motion capture system, and the estimated ground-truth body skeleton (3D joint positions): main_eccv_data.cpp, data (2.3 GB)


BibTeX

@inproceedings{ganapathi12realtime,
  title = {Real-Time Human Pose Tracking from Range Data},
  author = {Varun Ganapathi and Christian Plagemann and Daphne Koller and Sebastian Thrun},
  booktitle = {ECCV},
  year = {2012}
}