Realtime and Robust Hand Tracking from Depth
Top 1% of 2014 papers by citations
Abstract
We present a realtime hand tracking system using a depth sensor. It tracks a fully articulated hand under large viewpoint changes in realtime (25 FPS on a desktop, without using a GPU) and with high accuracy (error below 10 mm). To our knowledge, it is the first system that achieves such robustness, accuracy, and speed simultaneously, as verified on challenging real data. Our system is built on several novel techniques. We model the hand simply as a set of spheres and define a fast cost function; both are critical for realtime performance. We propose a hybrid method that combines gradient-based and stochastic optimization to achieve fast convergence and good accuracy. We also present new finger detection and hand initialization methods that greatly enhance the robustness of tracking.
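The sphere-based hand model admits a very cheap model-to-data cost. A minimal sketch of the idea, assuming (as an illustration, not the paper's exact formulation) a cost that sums squared distances from each depth point to the surface of its nearest sphere:

```python
import numpy as np

def sphere_model_cost(points, centers, radii):
    """Toy sphere-model cost: sum of squared distances from each depth
    point to the surface of its nearest sphere.

    points  -- (N, 3) array of 3D points from the depth sensor
    centers -- (M, 3) array of sphere centers of the hand model
    radii   -- (M,)  array of sphere radii

    Illustrative stand-in only; the paper's actual cost has further terms.
    """
    # Pairwise point-to-center distances, shape (N, M).
    d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    # Unsigned distance from each point to each sphere's surface.
    surface = np.abs(d - radii[None, :])
    # Each point is charged only for its nearest sphere.
    return float(np.sum(np.min(surface, axis=1) ** 2))
```

Because the distance to a sphere's surface is just `|‖p − c‖ − r|`, evaluating this cost is a handful of vectorized operations per sphere, which is what makes many evaluations per frame affordable.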
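The hybrid optimization idea, combining a stochastic search phase with gradient-based refinement, can be sketched on a generic cost function. This toy version (our own simplification; the function name, parameters, and the use of plain finite-difference descent are assumptions, not the paper's method) samples candidates around the current estimate, keeps the best, and then refines it with gradient steps:

```python
import numpy as np

def hybrid_minimize(cost, x0, n_samples=64, n_grad_steps=50, step=0.1,
                    sigma=0.5, seed=0):
    """Toy hybrid optimizer: a stochastic sampling phase escapes poor
    local basins, then finite-difference gradient descent refines the
    best candidate. Illustrative only."""
    rng = np.random.default_rng(seed)
    # Stochastic phase: sample candidates around the initial guess.
    candidates = x0 + sigma * rng.standard_normal((n_samples, np.size(x0)))
    x = min(list(candidates) + [np.asarray(x0, dtype=float)], key=cost).copy()
    # Gradient phase: numerical gradient descent from the best candidate.
    h = 1e-5
    for _ in range(n_grad_steps):
        g = np.zeros_like(x)
        for i in range(x.size):
            e = np.zeros_like(x)
            e[i] = h
            g[i] = (cost(x + e) - cost(x - e)) / (2.0 * h)
        x -= step * g
    return x
```

The stochastic phase provides robustness to the non-convexity of an articulated-hand cost, while the gradient phase provides the fast local convergence that pure sampling lacks.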