Abstract:
Interactivity is one of the key challenges for immersive applications such as gaming. Manufacturers have been working towards interfaces driven by a handheld device (e.g. a Wiimote) or controlled by a camera with a subsequent computer vision module. Both approaches have unique advantages, but neither localizes users in the scene with adequate accuracy.
We therefore propose to combine a range camera with accurate range sensors to enable the interpretation of movements.
This paper describes a platform that uses a range camera to acquire the silhouettes of users, regardless of illumination, and to improve pose recovery with range information after several image processing steps. In addition, to circumvent the difficult calibration process required to map range values to physical distances, we complete the system with several laser range sensors. These sensors are located in a horizontal plane and measure distances to within a few centimeters. We combine all these measurements into a localization map, used to locate users in the scene at negligible computational cost. Our method fills a gap in 3D applications that require absolute positions.
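As a minimal sketch of the localization idea described above (not the authors' actual implementation): each laser sensor in the horizontal plane has a known pose, so a distance reading can be projected to a 2D hit point in scene coordinates, and the hit points accumulated into a coarse occupancy grid that serves as the localization map. The sensor poses, cell size, and function names below are all illustrative assumptions.

```python
import math

def hit_point(sensor_x, sensor_y, heading_rad, distance):
    """Project a range reading into 2D scene coordinates.

    The sensor pose (position and heading in the horizontal plane)
    is assumed to be known from installation; only the distance is measured.
    """
    return (sensor_x + distance * math.cos(heading_rad),
            sensor_y + distance * math.sin(heading_rad))

def localization_map(readings, cell=0.05):
    """Accumulate hit points into a coarse occupancy grid.

    `readings` is a list of (sensor_x, sensor_y, heading_rad, distance)
    tuples; `cell` is the grid resolution in meters (5 cm here, an
    assumed value). Cells hit by several sensors score higher, which is
    where a user is most likely standing.
    """
    grid = {}
    for sx, sy, heading, d in readings:
        x, y = hit_point(sx, sy, heading, d)
        key = (round(x / cell), round(y / cell))
        grid[key] = grid.get(key, 0) + 1
    return grid

# Two hypothetical sensors facing each other both detect an obstacle
# at scene position (1, 0); their readings fall into the same grid cell.
readings = [(0.0, 0.0, 0.0, 1.0), (2.0, 0.0, math.pi, 1.0)]
grid = localization_map(readings)
```

Because the grid update is a handful of arithmetic operations per reading, this kind of fusion runs at negligible computational cost, consistent with the claim above.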