Occlusion Handling for the
Integration of Virtual Objects into Video
VISAPP 2012 - International Conference on Computer Vision Theory and Applications
Abstract
This paper demonstrates how to effectively exploit occlusion and reappearance information of feature points in structure and motion recovery from video.
When features are temporarily occluded by foreground objects, their tracks are discontinued.
If such features reappear after the occlusion, they are reconnected to the correct previously discontinued trajectory during sequential camera and scene estimation.
Combining optical flow for features in consecutive frames with SIFT matching for wide-baseline feature reconnection provides accurate and stable feature tracking.
The knowledge of the occluded parts of a reconnected feature track drives a segmentation algorithm that automatically crops the foreground image regions.
The resulting segmentation is an important step toward scene understanding and significantly eases the integration of virtual objects into video.
The presented approach enables integrated virtual objects to be automatically occluded by the foreground regions of the video.
Demonstrations show very realistic results in augmented reality.
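The wide-baseline reconnection step described above can be sketched as a descriptor-matching problem: when a new feature appears, its descriptor is compared against the stored descriptors of all discontinued trajectories, and an unambiguous nearest neighbor (Lowe-style ratio test) links the feature back to its track. The snippet below is a minimal, simplified illustration of that idea in plain Python; the descriptors, the ratio threshold of 0.8, and the function name are assumptions for the sketch, not the paper's implementation (which uses SIFT descriptors).

```python
import math

def reconnect_feature(new_desc, lost_descs, ratio=0.8):
    """Match a reappearing feature descriptor against the descriptors of
    previously discontinued trajectories using a nearest-neighbor ratio
    test. Returns the index of the matched trajectory, or None when the
    match is ambiguous. Toy stand-in for SIFT matching in the paper."""
    if len(lost_descs) < 2:
        return None  # the ratio test needs at least two candidates

    def dist(a, b):
        # Euclidean distance between two descriptor vectors
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    # Sort candidate trajectories by descriptor distance
    ranked = sorted((dist(new_desc, d), i) for i, d in enumerate(lost_descs))
    best, second = ranked[0], ranked[1]
    # Accept only if the best match is clearly better than the runner-up
    if best[0] < ratio * second[0]:
        return best[1]
    return None

# Toy 3-D "descriptors" of two discontinued trajectories
lost = [(0.0, 0.0, 1.0), (5.0, 5.0, 0.0)]
print(reconnect_feature((0.1, 0.0, 1.0), lost))   # clearly closest to track 0
print(reconnect_feature((2.5, 2.5, 0.5), lost))   # equidistant: ambiguous, None
```

In practice one would use 128-dimensional SIFT descriptors and an efficient nearest-neighbor search rather than the brute-force loop above; the ratio test itself is what rejects ambiguous reconnections after long occlusions.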
[Figures: input video frame and augmented result]
Video download (24 MB):
VISAPP2012.avi
Presentation:
Full paper presentation at VISAPP 2012 (PDF)
Selected to appear, in extended and revised form, in:
Communications in Computer and Information Science (CCIS),
Springer
Computer Vision, Imaging and Computer Graphics. Theory and Applications
International Joint Conference, VISIGRAPP 2012,
Revised Selected Papers
Related Work:
Feature Trajectory Retrieval with Application to
Accurate Structure and Motion Recovery