Jan Fischer's Homepage - EGVE 2007 Paper Details

 Using Time-of-Flight Range Data for Occlusion Handling in Augmented Reality

Abstract

One of the main problems of monoscopic video see-through augmented reality (AR) is the lack of reliable depth information. This makes it difficult to correctly represent complex spatial interactions between real and virtual objects, e.g., when rendering shadows. The most obvious graphical artifact is the incorrect display of the occlusion of virtual models by real objects. Since the graphical models are rendered opaquely over the camera image, they always appear to occlude all objects in the real environment, regardless of the actual spatial relationship. In this paper, we propose to utilize a new type of hardware in order to solve some of the basic challenges of AR rendering. We introduce a time-of-flight range sensor into AR, which produces a 2D map of the distances to real objects in the environment. The distance map is registered with high resolution color images delivered by a digital video camera. When displaying the virtual models in AR, the distance map is used to decide whether the camera image or the virtual object is visible at any position. This way, the occlusion of virtual models by real objects can be correctly represented. Preliminary results obtained with our approach show that useful occlusion handling based on time-of-flight range data is possible.
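
As a rough illustration of the per-pixel decision described in the abstract, the sketch below compares the registered time-of-flight distance map against the depth of the rendered virtual model and shows the virtual fragment only where it lies in front of the measured real surface. This is a minimal NumPy sketch under assumed array layouts; the function name and parameters are illustrative, not the paper's actual implementation.

import numpy as np

def composite_with_occlusion(camera_rgb, virtual_rgb, virtual_depth, tof_depth):
    """Per-pixel occlusion test (illustrative sketch).

    camera_rgb, virtual_rgb : (H, W, 3) color images
    virtual_depth           : (H, W) depth of the rendered virtual model
    tof_depth               : (H, W) registered time-of-flight distance map
    """
    # The virtual model is visible wherever its depth is smaller than the
    # measured distance to the real scene; elsewhere the camera image shows.
    virtual_in_front = virtual_depth < tof_depth
    return np.where(virtual_in_front[..., None], virtual_rgb, camera_rgb)

In a real AR pipeline this comparison would typically run per fragment on the GPU, with the distance map bound as a depth texture, but the pixel-wise logic is the same.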

Bibtex

@INPROCEEDINGS{Fischer-2007-UsingTimeOfFlight,
   AUTHOR = {J. Fischer and B. Huhle and A. Schilling},
   TITLE = {{Using Time-of-Flight Range Data for Occlusion Handling in Augmented Reality}},
   BOOKTITLE = {{Eurographics Symposium on Virtual Environments (EGVE)}},
   YEAR = {2007},
   PAGES = {109--116},
   LOCATION = {Weimar}
}

Documents

Download paper (original copyright by Eurographics) [PDF]
Download presentation slides [PDF]

Example Images