Gerard Pons-Moll, Andreas Baak, Thomas Helten, Meinard Müller, Hans-Peter Seidel, and Bodo Rosenhahn
In this work, we present an approach to fuse video with orientation data obtained from extended inertial sensors to improve and stabilize full-body human motion capture. Even though video data is a strong cue for motion analysis, tracking artifacts occur frequently due to ambiguities in the images, rapid motions, occlusions, or noise. As a complementary data source, inertial sensors allow for drift-free estimation of limb orientations even under fast motions. However, accurate position information cannot be obtained in continuous operation. Therefore, we propose a hybrid tracker that combines video with a small number of inertial units to compensate for the drawbacks of each sensor type: on the one hand, we obtain drift-free and accurate position information from video data and, on the other hand, we obtain accurate limb orientations and good performance under fast motions from inertial sensors. In several experiments we demonstrate the increased performance and stability of our human motion tracker.
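To illustrate the complementary roles of the two sensor types described above, the following is a minimal sketch (not the authors' implementation): video constrains limb position while an inertial unit constrains limb orientation, and both cues enter a single weighted cost. The function names, weights, and toy measurements are illustrative assumptions only.

```python
# Minimal sketch of a hybrid video + inertial cost for one limb.
# All names, weights, and measurements here are illustrative assumptions,
# not the tracker described in the paper.
import numpy as np

def angle_between(R_a, R_b):
    """Geodesic distance (radians) between two rotation matrices."""
    cos_theta = (np.trace(R_a.T @ R_b) - 1.0) / 2.0
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

def hybrid_cost(position, orientation, video_position, imu_orientation,
                w_video=1.0, w_imu=0.5):
    """Weighted cost: video constrains position, the IMU constrains orientation."""
    e_video = np.sum((position - video_position) ** 2)        # squared position error
    e_imu = angle_between(orientation, imu_orientation) ** 2  # squared angular error
    return w_video * e_video + w_imu * e_imu

# Toy usage: evaluate a candidate limb pose against both sensor cues.
R_candidate = np.eye(3)
R_imu = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])   # 90 deg rotation about z, stand-in for an IMU reading
p_candidate = np.array([0.1, 1.2, 0.4])
p_video = np.array([0.0, 1.2, 0.4])
print(hybrid_cost(p_candidate, R_candidate, p_video, R_imu))
```

In a full tracker such a term would be summed over all instrumented limbs and minimized jointly with the video-based pose estimation; the sketch only shows how the two error terms can be balanced.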
The indoor motion capture dataset (MPI08) used in the CVPR 2010 paper is freely available for your own tests and experiments.
The data is provided for research purposes only. If you use this data, please acknowledge the effort that went into data collection by citing the corresponding papers "Multisensor Fusion for 3D Full-Body Motion Capture" and "Analyzing and Evaluating Markerless Motion Tracking Using Inertial Sensors".
The dataset consists of: