Immersive Group-to-Group Telepresence

Best Paper Award IEEE Virtual Reality 2013

 

Abstract

We present a novel immersive telepresence system that allows distributed groups of users to meet in a shared virtual 3D world. Our approach is based on two coupled projection-based multi-user setups, each providing multiple users with perspectively correct stereoscopic images. At each site, the users and their local interaction space are continuously captured using a cluster of registered depth and color cameras. The captured 3D information is transferred to the other site, where the remote participants are virtually reconstructed. We explore the use of these virtual user representations in various interaction scenarios in which local and remote users are face-to-face, side-by-side or decoupled.
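Registering the depth cameras at each site means expressing every sensor-local point cloud in one shared coordinate system. The publications below describe a more elaborate volumetric calibration; as a simpler illustration, the following sketch applies a per-sensor rigid 4x4 transform (the function name, the example transform, and NumPy usage are illustrative assumptions, not the system's actual code):

```python
import numpy as np

def to_joint_frame(points, sensor_to_world):
    """Map a sensor-local point cloud (N x 3) into the shared joint
    coordinate system using a 4x4 rigid transform (illustrative only)."""
    n = points.shape[0]
    homogeneous = np.hstack([points, np.ones((n, 1))])  # N x 4
    transformed = homogeneous @ sensor_to_world.T       # N x 4
    return transformed[:, :3]

# Hypothetical sensor pose: offset 1 m along x, no rotation.
T = np.eye(4)
T[0, 3] = 1.0
cloud = np.array([[0.0, 0.0, 2.0]])   # one point, 2 m in front of the sensor
print(to_joint_frame(cloud, T))       # → [[1. 0. 2.]]
```

With one such transform per camera, the clouds from all sensors of a cluster can be fused before streaming them to the remote site.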

Initial experiments with distributed user groups indicate that pointing and tracing gestures were mutually understood, regardless of whether they were performed by local or remote participants. Our users were excited about the new possibilities of jointly exploring a virtual city, where they relied on a world-in-miniature metaphor for mutual awareness of their respective locations.

 

This work was supported in part by the European Union under grant AAT-285681 (project VR-HYPERSPACE), the German Federal Ministry of Education and Research (BMBF) under grant 03IP704 (project Intelligentes Lernen), and the Thuringian Ministry of Education and Cultural Affairs (TKM) under grant B514-08028 (project Visual Analytics in Engineering).

 

The New Scientist featured our work in the article "Virtual traveller: Beam a live, 3D you into the world".

See also the German New Scientist article "Holodeck im Bauhaus".

 

Publications

  • Beck, S., Kunert, A., Kulik, A., Froehlich, B.
    Immersive Group-to-Group Telepresence (Best Paper Award IEEE Virtual Reality 2013)
    IEEE Transactions on Visualization and Computer Graphics, 19(4):616-625, March 2013 (Proceedings of IEEE Virtual Reality 2013, Orlando, Florida).
    [preprint] [video (41MB)] [video on vimeo]

  • Liu, Y., Beck, S., Wang, R., Li, J., Xu, H., Yao, S., Tong, X., Froehlich, B.
    Hybrid Lossless-Lossy Compression for Real-Time Depth-Sensor Streams in 3D Telepresence Applications
    In Proceedings of Advances in Multimedia Information Processing -- PCM 2015: 16th Pacific-Rim Conference on Multimedia, Gwangju, South Korea, September 16-18, 2015, Part I.
    [abstract and preprint]
  • Beck, S., Froehlich, B.
    Volumetric Calibration and Registration of Multiple RGBD-Sensors into a Joint Coordinate System.
    In Proceedings of IEEE Symposium on 3D User Interfaces (3DUI), Arles, France, pp. 89-96, March 2015. DOI=10.1109/3DUI.2015.7131731
    [abstract, preprint and video]
  • Beck, S., Froehlich, B.
    Volumetric Calibration and Registration of RGBD-Sensors.
    Virtual Reality (VR), 2015 IEEE, Arles, France, pp. 151-152, March 2015. DOI=10.1109/VR.2015.7223340
    [preprint]