Consistent Illumination for Mixed Reality Scenarios

Stanford Buddha illuminated in real time, with color-bleeding effects cast from the user's red folder.

English Abstract:

We present an interactive real-time augmented reality system with realistic illumination of integrated virtual objects. A mirror sphere (light probe) is used to capture the lighting environment at the location where the virtual object is to be rendered. To this end, we developed an algorithm that performs the lighting calculation for the object in real time, including soft diffuse shading and color-bleeding effects from the real world onto the virtual object.

German Abstract:

We present a system for the interactive integration and realistic illumination of virtual objects in real scenes. The environment is captured with a mirror sphere located at the real-world position where the virtual object is to be drawn. For this purpose, an algorithm was developed that performs the necessary lighting calculations in real time and integrates the object into the real scene with soft shadows and indirect illumination.

This is joint work with Fraunhofer HHI - Image Processing Department, Berlin.

Description

In augmented reality applications, simple patterns (markers) are often used to calculate the position and perspective of a certain part of the real environment with respect to the camera. The marker used in our case is visible as a black square in Fig. 3. With this information one can project arbitrary virtual objects onto the real scene so that they match the marker in size and orientation. This technique only provides the position and perspective warping of the virtual object; it does not deliver any information about the lighting conditions of the environment that the object is to be placed into.
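
For illustration, marker trackers typically report a 4x4 rigid transform from marker space to camera space, which is enough to place the object's vertices over the marker. The following is a minimal sketch of that step; the function and variable names are our own, not taken from the project code:

    import numpy as np

    def transform_to_camera_space(vertices, marker_pose):
        # Apply a 4x4 marker pose (rotation + translation estimated by
        # the tracker) to object-space vertices, giving camera-space
        # positions. 'vertices' is an (N, 3) array; 'marker_pose' maps
        # marker/object space to camera space.
        homogeneous = np.hstack([vertices, np.ones((len(vertices), 1))])
        camera_space = homogeneous @ marker_pose.T  # row vectors, hence .T
        return camera_space[:, :3]

    # Example: a pose that places the object 0.5 m in front of the camera.
    pose = np.eye(4)
    pose[2, 3] = 0.5
    verts = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0]])
    print(transform_to_camera_space(verts, pose))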

We mounted a mirror sphere onto the marker and placed both in a scene captured by a video camera. Dynamic illumination (as the marker is interactively moved) is estimated from the video signal and used for realistic real-time integration of virtual objects into the video. The user can interact by moving the marker, and with it the integrated virtual object in the augmented reality video. The illumination is recomputed for every frame the camera captures. To achieve interactive real-time performance, we have developed a graphics-hardware-accelerated approach.
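
The estimation step builds on the standard light-probe interpretation of the mirror-sphere image: each pixel on the sphere's silhouette corresponds to one reflected world direction. The sketch below assumes an approximately orthographic view of the sphere (the usual light-probe approximation when the sphere is small in the image); the exact mapping used by the system may differ:

    import numpy as np

    def probe_pixel_to_direction(u, v):
        # Map normalized mirror-ball image coordinates u, v in [-1, 1]
        # (the sphere silhouette is the unit circle) to the reflected
        # world direction, assuming an orthographic camera looking
        # down -z at the sphere.
        r2 = u * u + v * v
        if r2 > 1.0:
            raise ValueError("pixel lies outside the sphere silhouette")
        normal = np.array([u, v, np.sqrt(1.0 - r2)])   # sphere normal here
        view = np.array([0.0, 0.0, -1.0])              # incoming camera ray
        # Mirror reflection: r = d - 2 (d . n) n
        return view - 2.0 * np.dot(view, normal) * normal

    # The sphere centre reflects straight back toward the camera:
    print(probe_pixel_to_direction(0.0, 0.0))          # -> [0. 0. 1.]

The radiance sampled along these directions is what gets resampled and low-pass filtered into the low-resolution illumination cube map described below.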

In a preprocessing step we compute the virtual object's self-occlusion and the form factors with respect to an environment map for each vertex. This information is stored in a low-resolution cube map per vertex. The form factor cube maps of all vertices are unfolded and assembled into a single large texture. The illumination information estimated from the video is resampled and low-pass filtered into a low-resolution cube map. The form factor cube maps and the illumination cube map are multiplied on a per-vertex basis, and the results are summed to obtain the diffuse illumination for each vertex.
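
The per-frame core of the method is thus a multiply-and-sum over cube-map texels. The following CPU sketch shows the arithmetic; in the actual system this runs on the GPU with the form-factor maps packed into one large texture, and the array shapes and names here are our own assumptions:

    import numpy as np

    def diffuse_per_vertex(form_factors, env_cube):
        # form_factors: (V, 6, R, R) precomputed per-vertex form factors;
        #   each texel is assumed to already fold in cosine weight, texel
        #   solid angle, and self-occlusion visibility (0 where blocked).
        # env_cube: (6, R, R, 3) low-resolution RGB environment cube map
        #   estimated from the current mirror-sphere video frame.
        # Returns (V, 3): RGB diffuse illumination per vertex.
        return np.einsum('vfij,fijc->vc', form_factors, env_cube)

    # Toy example: 2 vertices, 4x4 faces, uniform white environment.
    V, R = 2, 4
    ff = np.full((V, 6, R, R), 1.0 / (6 * R * R))  # form factors sum to 1
    env = np.ones((6, R, R, 3))
    print(diffuse_per_vertex(ff, env))             # -> [[1. 1. 1.] [1. 1. 1.]]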

Our approach works well for small virtual objects moving around in real environments. The real environment acts as a large area light source, which casts soft shadows on our diffusely lit virtual objects. The prototype is implemented almost completely on the GPU, which allows real-time frame rates for virtual objects with tens of thousands of vertices.

The current implementation uses a single camera both to capture the image of the mirror sphere and to insert the virtual object into the real scene. In the future we plan to use a higher-resolution camera with a zoom lens to capture the image of the mirror sphere, and a separate camera or a video or optical see-through HMD, to allow users to walk freely around an augmented environment with consistently illuminated virtual objects. The approach is not limited to augmented reality scenarios; it could also be applied to lighting objects in complex virtual scenes with preprocessed lighting.

Contact

  • Sebastian Heymann
    Bauhaus Universität, Weimar
    heimie (at) selective.de
    www.uni-weimar.de/~heymann
  • Prof. Dr. Bernd Fröhlich
Chair of Virtual Reality Systems, Faculty of Media
    Bauhaus Universität, Weimar
    bernd.froehlich (at) medien.uni-weimar.de
    www.uni-weimar.de/medien/vr
  • Dr. Aljoscha Smolic
    Image Processing Department
    Fraunhofer HHI, Berlin
    Aljoscha.Smolic (at) hhi.fraunhofer.de
    iphome.hhi.de/smolic/