Robust Articulated‐ICP for Real‐Time Hand Tracking
Top 1% of 2015 papers

Abstract
We present a robust method for capturing articulated hand motions in real time using a single depth camera. Our system is based on a real-time registration process that accurately reconstructs hand poses by fitting a 3D articulated hand model to depth images. We register the hand model using depth, silhouette, and temporal information. To effectively map low-quality depth maps to realistic hand poses, we regularize the registration with kinematic and temporal priors, as well as a data-driven prior built from a database of realistic hand poses. We present a principled way of integrating such priors into our registration optimization to enable robust tracking without severely restricting the freedom of motion. A core technical contribution is a new method for computing tracking correspondences that directly models occlusions typical of single-camera setups. To ensure reproducibility of our results and facilitate future research, we fully disclose the source code of our implementation.
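The abstract describes a per-frame optimization that combines data terms (depth and silhouette correspondences) with kinematic, temporal, and data-driven pose priors. The following is a minimal sketch of such a combined energy; the weights, the quadratic joint-limit penalty, and the PCA-subspace pose prior are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def registration_energy(theta, theta_prev, data_residual, pca_mean, pca_basis,
                        joint_limits, w_data=1.0, w_kin=0.5, w_temp=0.1, w_pose=0.2):
    """Weighted sum of squared terms minimized at each frame (illustrative).

    theta          : current pose parameters (joint angles), shape (D,)
    theta_prev     : pose estimated for the previous frame, shape (D,)
    data_residual  : precomputed depth/silhouette correspondence residuals
    pca_mean/basis : mean pose and orthonormal basis of a learned pose subspace
    joint_limits   : (lower, upper) bounds on the joint angles
    """
    # Data term: residuals from depth/silhouette correspondences.
    e_data = np.sum(data_residual**2)
    # Kinematic prior: quadratic penalty for violating joint limits.
    lo, hi = joint_limits
    e_kin = np.sum(np.maximum(lo - theta, 0.0)**2 + np.maximum(theta - hi, 0.0)**2)
    # Temporal prior: penalize deviation from the previous frame's pose.
    e_temp = np.sum((theta - theta_prev)**2)
    # Data-driven prior: distance to the low-dimensional pose subspace.
    coeffs = pca_basis.T @ (theta - pca_mean)
    proj = pca_mean + pca_basis @ coeffs
    e_pose = np.sum((theta - proj)**2)
    return w_data*e_data + w_kin*e_kin + w_temp*e_temp + w_pose*e_pose
```

In a full tracker, an energy of this shape would be minimized per frame (e.g. by Gauss-Newton), with the correspondences and residuals recomputed at each iteration.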