Project 4D-SPACE: 4D Scene and Performance Analysis in Collaborative Virtual Environments

Project Real-Time Avatars for 3D Telepresence in Unity

Prof. Dr. Bernd Fröhlich
M.Sc. Adrian Kreskowski
M.Sc. Gareth Rendle

Degree   Study Programme                      Examination Regulations   ECTS
M.Sc.    Computer Science for Digital Media   PV18 and lower            15
M.Sc.    Computer Science for Digital Media
M.Sc.    Computer Science and Media           all                       15
M.Sc.    Human-Computer Interaction           PV17 and lower            15
M.Sc.    Human-Computer Interaction           PV19                      12/18


Collaborative virtual reality systems, such as our immersive group-to-group telepresence system [1], allow multiple users to interact in a shared virtual environment. The inclusion of realistic user representations (known as volumetric avatars) can enhance collaboration between distributed parties by facilitating expressive gestural communication. As well as offering myriad possibilities for more effective remote working, such systems can also be leveraged to analyze human actions and interactions. For example, researchers may want to study social interaction in realistic situations, but require strict control over the situation that a real-life setting may not afford [2]. An experiment that takes place in virtual reality can provide that control while maintaining the plausibility of the situation, and therefore of the experimental findings. In creative fields, the possibility of creating realistic virtual user representations gives physical performers such as actors and dancers the opportunity to evaluate their movements with richer information than a simple video stream provides.

Image: Local and remote users meeting in a shared virtual environment. The remote users are captured by multiple RGBD cameras (e.g. Azure Kinect), 3D-reconstructed as volumetric avatars in real time, and sent over the network to the local users. The goal of our project is to create an environment to record, replay, re-experience, and retrospectively analyze interaction between parties in telepresence meetings.

To support retrospective analysis of action and interaction, it is essential that user sessions in virtual environments can be recorded and subsequently replayed for exploration, annotation, and coding. In this project, we aim to develop a tool for 4D scene and performance analysis in collaborative environments. The software will be able to capture and replay multi-modal interaction between users in a virtual environment, as well as dynamic performances recorded in our lab space. Continuous information about users' position and orientation should be recorded, as well as audio streams for speech and conversation analysis. When realistic user representations such as volumetric avatars are required, these should also be encoded in a manner that allows reconstruction at the original quality level.
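To give a concrete flavour of what such a recording could look like, here is a minimal C++ sketch of a timestamped pose stream with a replay lookup. All names and the data layout are our own illustration, not the project's actual code; a real recorder would handle many streams (poses, audio, volumetric frames) concurrently.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical sample type: one tracked pose with a capture timestamp.
struct PoseSample {
    int64_t timestampUs;   // microseconds since session start
    uint32_t userId;       // which tracked user this sample belongs to
    float position[3];     // head position in the shared world frame
    float orientation[4];  // orientation as a quaternion (x, y, z, w)
};

// Minimal append-only recorder: each data stream is kept as its own
// timestamped sequence so that streams can be aligned during replay.
class PoseRecorder {
public:
    void record(uint32_t userId, const float pos[3], const float quat[4],
                int64_t timestampUs) {
        PoseSample s{timestampUs, userId,
                     {pos[0], pos[1], pos[2]},
                     {quat[0], quat[1], quat[2], quat[3]}};
        samples_.push_back(s);
    }

    // Replay helper: latest sample at or before the query time,
    // or nullptr if the query precedes the first sample.
    const PoseSample* sampleAt(int64_t queryUs) const {
        const PoseSample* best = nullptr;
        for (const auto& s : samples_) {
            if (s.timestampUs <= queryUs) best = &s;
            else break;  // samples are appended in time order
        }
        return best;
    }

    std::size_t size() const { return samples_.size(); }

private:
    std::vector<PoseSample> samples_;
};
```

On replay, querying every stream with the same session clock yields a consistent snapshot of the scene; interpolation between neighbouring samples can then smooth the reconstructed motion.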

The main challenges in this project are recording and synchronizing a plethora of different data streams, and storing them in a compact format that preserves the quality of the live reconstruction and allows the performance to be replayed on-demand for analysis and annotation purposes.
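As a toy illustration of this compactness-versus-quality trade-off, the sketch below quantizes pose coordinates from 32-bit floats to 16-bit integers, halving their storage at a bounded loss of precision. This is a hypothetical example, not the codec the project will use; real volumetric streams would additionally pass through standard compression libraries.

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>

// Quantize a coordinate in [-range, range] metres to a 16-bit integer.
// Values outside the range are clamped. The worst-case reconstruction
// error is half a quantization step, i.e. range / 32767 / 2.
int16_t quantize(float value, float range) {
    float clamped = std::fmax(-range, std::fmin(range, value));
    return static_cast<int16_t>(std::lround(clamped / range * 32767.0f));
}

// Map the 16-bit value back to the original coordinate range.
float dequantize(int16_t q, float range) {
    return static_cast<float>(q) / 32767.0f * range;
}
```

A position of three floats (12 bytes) shrinks to three 16-bit values (6 bytes); applied to every sample of a long session, such schemes combine with delta encoding and general-purpose compressors to keep recordings replayable at near-original quality.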

Do you have an affinity for real-time systems, and Unity in particular, feel confident programming in C++, and enjoy asynchronous and concurrent programming? Great! Do you want to learn about standard compression libraries, or even explore state-of-the-art compression papers, to tackle the challenge of compressing large data streams in real time? Perfect! If at least one of these two descriptions fits you, we look forward to welcoming you to our project!


As well as a willingness to work in a team, and enthusiasm for learning about and developing rendering and compression techniques on cutting-edge hardware, you should have the following competencies:

  • Solid C++ skills, both conceptual and practical (STL, C++14 or higher standards)
  • Experience in the field of real-time computer graphics
  • Basic analysis and linear algebra skills

If you are in doubt as to whether you fulfil the requirements, or if you have any further questions regarding the project, we are happy to discuss them with you during the project fair on the 11th of October. You can find us in our BigBlueButton room for the entire duration of the fair, between 5 pm and 7 pm.


The final assessment of your work will be based on each team member's contributions to the project, including:

  • Active participation in the project during and in between weekly meetings
  • Design, implementation and evaluation of tools for the 4D-SPACE project in the form of Unity modules
  • Intermediate talks
  • Intermediate and final project presentations
  • Documentation in the form of a short paper

If you are excited about avatars and virtual reality, do not hesitate to come to our room during the project fair!


[1] Beck, S., Kunert, A., Kulik, A., & Fröhlich, B. (2013). Immersive Group-to-Group Telepresence. IEEE Transactions on Visualization and Computer Graphics, 19(4), 616–625 (Proceedings of IEEE Virtual Reality 2013, Orlando, Florida).

[2] de la Rosa, S., & Breidt, M. (2018). Virtual reality: A new track in psychological research. British Journal of Psychology, 109(3), 427–430.