Occlusion in Collaborative Augmented Environments


Project start: 1998
Funding: Austrian Science Foundation under contract no. P-12074-MAT 
Contact: Anton Fuhrmann, Gerd Hesina, François Faure, Michael Gervautz

Description

Augmented environments superimpose computer-generated enhancements on the real world. Such environments are well suited for collaboration between multiple users. To improve the quality and consistency of the augmentation, the occlusion of real objects by computer-generated objects and vice versa has to be handled correctly. We present methods for doing this for a tracked user's body and for other real objects, and for reducing the irritating artifacts caused by misalignments. Our method is based on simulating the occlusion of virtual objects by a representation of the user modeled as kinematic chains of articulated solids. Registration and modeling errors of this representation are reduced by smoothing the border between the virtual world and the occluding real object. An implementation in our augmented environment and the resulting improvements are presented.

Application

We use tracked articulated objects for occlusion. The primary application of this is the occlusion generated by participants' bodies moving in front of virtual objects.
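
To illustrate how such a tracked articulated phantom could be organized, the following C++ sketch models a limb as a kinematic chain of rigid segments whose world transforms are propagated from tracker-driven local poses each frame. All names here (Mat4, PhantomSegment, updateChain) are illustrative assumptions, not the project's actual code.

#include <array>
#include <vector>

// Simple row-major 4x4 homogeneous transform.
struct Mat4 {
    std::array<float, 16> m{};
};

Mat4 identity() {
    Mat4 r;
    for (int i = 0; i < 4; ++i) r.m[i * 4 + i] = 1.0f;
    return r;
}

// Compose two transforms: result = a * b.
Mat4 mul(const Mat4& a, const Mat4& b) {
    Mat4 r;
    for (int row = 0; row < 4; ++row)
        for (int col = 0; col < 4; ++col) {
            float s = 0.0f;
            for (int k = 0; k < 4; ++k)
                s += a.m[row * 4 + k] * b.m[k * 4 + col];
            r.m[row * 4 + col] = s;
        }
    return r;
}

// One rigid segment of the phantom (e.g. upper arm, forearm, hand).
struct PhantomSegment {
    int parent;   // index of parent segment, -1 for the root
    Mat4 local;   // tracked pose relative to the parent joint
    Mat4 world;   // accumulated world transform, filled in by updateChain()
    // ... plus a reference to the digitized solid approximating this body part
};

// Propagate tracked joint poses down the chain so every segment gets a world
// transform for rendering its phantom geometry. Segments must be ordered so
// that parents appear before their children.
void updateChain(std::vector<PhantomSegment>& chain) {
    for (auto& seg : chain) {
        if (seg.parent < 0)
            seg.world = seg.local;                         // root: tracker pose is already world space
        else
            seg.world = mul(chain[seg.parent].world, seg.local);
    }
}

int main() {
    // Three-segment arm: shoulder (root) -> elbow -> wrist.
    std::vector<PhantomSegment> arm(3);
    arm[0] = {-1, identity(), identity()};
    arm[1] = { 0, identity(), identity()};
    arm[2] = { 1, identity(), identity()};

    // Each frame: copy the latest tracker poses into the local transforms,
    // then propagate them so the phantom solids follow the real arm.
    updateChain(arm);
    return 0;
}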

Problems

One of the main advantages of using an augmented environment for collaboration, as opposed to an immersive setup, is the direct interaction of participants in reality. While collaborators in an immersive setup always have to rely on more or less satisfying representations of each other, ranging from disembodied hands or heads to complete bodies visualized in plausible poses, users of an augmented scenario are always able to see each other and the interface devices they are using directly. This combination of reality and virtuality leads to the problem of correct occlusion between real and virtual objects, a problem which of course does not exist in an immersive environment.

Approach

Since we are using a tracking system that does not supply geometric information about the occluding real objects, we have to acquire this information in advance. This can easily be done by modeling or digitizing sufficiently precise representations of the objects offline and placing them inside the virtual scene. We call these mockups of real objects "phantoms". By rendering them in an appropriate way, the occlusion of virtual objects by real objects can be simulated.
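
As a concrete illustration of what rendering the phantoms "in an appropriate way" can look like, the following sketch shows the common depth-only technique in plain OpenGL: the phantom writes only to the depth buffer, so it stays invisible while still hiding any virtual geometry behind it. This is a sketch of the general technique under the stated assumptions, not the project's actual rendering code; drawPhantoms() and drawVirtualScene() are hypothetical placeholders.

#include <GL/gl.h>

// Hypothetical placeholders for the application's own geometry submission:
// the phantom mockups (already placed at their tracked poses) and the
// purely virtual objects of the augmented scene.
void drawPhantoms()     { /* submit phantom geometry here */ }
void drawVirtualScene() { /* submit virtual objects here  */ }

void renderAugmentedFrame()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glEnable(GL_DEPTH_TEST);

    // Pass 1: draw the phantoms into the depth buffer only. Disabling the
    // color channels keeps them invisible (the real object is seen through
    // the display), but their depth values now cover the pixels occupied by
    // real objects.
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_TRUE);
    drawPhantoms();

    // Pass 2: draw the virtual objects normally. Fragments lying behind a
    // phantom fail the depth test and are discarded, so virtual geometry is
    // correctly hidden by the real objects in front of it.
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    drawVirtualScene();
}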

Publications

Occlusion in Collaborative Augmented Environments
Submitted to EGVE'99
A. Fuhrmann, G. Hesina, F. Faure, M. Gervautz


Movies

"Phantom" avatar rendered visible
(QuickTime, 1MB)
Articulated arm hiding virtual object
(QuickTime, 1.2MB)
Moving head occluding virtual object
(QuickTime, 1.4MB)
Virtual object intersecting real head
(QuickTime, 830KB)
Virtual penguin (disguised as a chicken) occluding & getting occluded by user's arm
(QuickTime, 1MB)

Page maintained by Gerd Hesina & A. L. Fuhrmann