
4D-SPACE: 4D Scene and Performance Analysis in Collaborative Virtual Environments (Detail View)

Basic data
Course type: Project
Credit hours (SWS): 10
Course number: 421210000
Max. participants: 4
Semester: WiSe 2021/22
Frequency: one-time
Hyperlink: http://www.uni-weimar.de/vr
Further links: https://www.uni-weimar.de/en/media/chairs/computer-science-department/vr/teaching/ws-202122/project-4d-space/
Language: English


Assigned persons
Fröhlich, Bernd, Prof. Dr. rer. nat.
Kreskowski, Adrian, M.Sc.
Rendle, Gareth
Programmes of study
Degree | Programme (examination regulations version) | Credit points
Bachelor | Medieninformatik (B.Sc.), PV 29 | 15
Master | Computer Science and Media (M.Sc.), PV 11 | 15
Bachelor | Medieninformatik (B.Sc.), PV 11 | 15
Master | Human-Computer Interaction (M.Sc.), PV 15 | 15
Master | Computer Science for Digital Media (M.Sc.), PV 17 | 15
Bachelor | Medieninformatik (B.Sc.), PV 17 | 15
Bachelor | Medieninformatik (B.Sc.), PV 16 | 15
Master | Human-Computer Interaction (M.Sc.), PV 17 | 15
Master | Computer Science for Digital Media (M.Sc.), PV 18 | 15
Master | Human-Computer Interaction (M.Sc.), PV 19 | 12/18
Master | Computer Science for Digital Media (M.Sc.), PV 2020 | 12
Assigned institutions
Systeme der Virtuellen Realität
Fakultät Medien
Content
Description

4D-SPACE: 4D Scene and Performance Analysis in Collaborative Virtual Environments

Collaborative virtual reality systems, such as our immersive group-to-group telepresence system [1], allow multiple users to interact in a shared virtual environment. Collaboration between distributed parties and in particular gestural communication can be facilitated by including realistic user representations (volumetric avatars). Such systems can be leveraged to analyse human actions and interactions. For example, researchers may want to study social interaction in realistic situations, but desire a strict control over the situation that a real-life setting may not afford [2]. An experiment that takes place in virtual reality can provide that control, while maintaining the plausibility of the situation. In creative fields, the possibility to create realistic virtual user representations gives physical performers like actors and dancers the opportunity to evaluate their movements with richer information than that provided by a simple video stream.


To support retrospective analysis of action and interaction, it is essential that user sessions in virtual environments can be recorded and subsequently replayed for exploration, annotation, and coding. In this project, we aim to develop a tool for 4D scene and performance analysis in collaborative environments. The software will be able to capture and replay multi-modal interaction between users in a virtual environment, as well as dynamic performances recorded in our lab space. Continuous information about users' position and orientation should be recorded, as well as audio streams for speech and conversation analysis. When realistic user representations, such as volumetric avatars, are required, these should also be encoded in a manner that allows reconstruction at the original quality level.
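Replaying several independently recorded streams requires ordering all samples on a common timeline. As a minimal sketch of this idea (the struct and function names are illustrative assumptions, not part of the actual 4D-SPACE software), a heap-based k-way merge can interleave per-device pose streams by capture timestamp:

```cpp
#include <cassert>
#include <cstdint>
#include <functional>
#include <queue>
#include <tuple>
#include <vector>

// Hypothetical sample: one timestamped pose update from a single
// tracked source (head, hands, ...).
struct PoseSample {
    std::uint64_t timestamp_us;  // capture time in microseconds
    int stream_id;               // which device/stream produced it
    float position[3];
    float orientation[4];        // quaternion (w, x, y, z)
};

// Merge several independently recorded streams into one globally
// time-ordered sequence for synchronized replay. Each input stream
// is assumed to be sorted by timestamp already.
std::vector<PoseSample> merge_streams(
    const std::vector<std::vector<PoseSample>>& streams) {
    // Min-heap entry: (timestamp, stream index, position in stream)
    using Entry = std::tuple<std::uint64_t, std::size_t, std::size_t>;
    std::priority_queue<Entry, std::vector<Entry>, std::greater<Entry>> heap;
    for (std::size_t s = 0; s < streams.size(); ++s)
        if (!streams[s].empty())
            heap.emplace(streams[s][0].timestamp_us, s, 0);

    std::vector<PoseSample> merged;
    while (!heap.empty()) {
        auto [ts, s, i] = heap.top();
        heap.pop();
        (void)ts;  // the timestamp serves only as the ordering key
        merged.push_back(streams[s][i]);
        if (i + 1 < streams[s].size())
            heap.emplace(streams[s][i + 1].timestamp_us, s, i + 1);
    }
    return merged;
}
```

Merging lazily via a heap (rather than concatenating and sorting) keeps memory bounded when streams are read from disk incrementally, which matters for long recorded sessions.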


The main challenges in this project are recording and synchronizing a plethora of different data streams, and storing them in a compact format that preserves the quality of the live reconstruction and allows the performance to be replayed on-demand for analysis and annotation purposes.
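One simple building block for such a compact format is quantizing continuous pose data before entropy coding. The sketch below (the tracking-volume range and 16-bit precision are assumptions for illustration, not values from the project) maps a position coordinate in metres to a 16-bit integer, halving the storage of a float per component while keeping sub-millimetre error:

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>

// Assumed 10 m tracking volume, centred on the origin.
constexpr float kMin = -5.0f, kMax = 5.0f;

// Map a coordinate in [kMin, kMax] to the full int16 range.
std::int16_t quantize(float v) {
    float t = (v - kMin) / (kMax - kMin);  // normalize to [0, 1]
    return static_cast<std::int16_t>(std::lround(t * 65535.0f) - 32768);
}

// Inverse mapping; reconstruction error is at most half a
// quantization step, i.e. about 0.08 mm for this range.
float dequantize(std::int16_t q) {
    float t = (static_cast<int>(q) + 32768) / 65535.0f;
    return kMin + t * (kMax - kMin);
}
```

In practice the quantized values would additionally be delta-encoded between frames and passed to a general-purpose or state-of-the-art compressor, which is exactly the design space this project explores.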


Do you have an affinity for real-time systems, in particular Unity, feel confident programming in C++, and are interested in asynchronous and concurrent programming? Great! Do you want to learn about standard compression libraries, or even explore state-of-the-art compression papers to tackle the challenge of real-time compression of large data streams? Perfect! If at least one of these two descriptions fits you, we look forward to welcoming you to our project!

Literature

[1] Beck, S., Kunert, A., Kulik, A., Froehlich, B. (2013). Immersive Group-to-Group Telepresence. IEEE Transactions on Visualization and Computer Graphics, 19(4):616–625, March 2013 (Proceedings of IEEE Virtual Reality 2013, Orlando, Florida).

[2] de la Rosa, S., Breidt, M. (2018). Virtual reality: A new track in psychological research. British Journal of Psychology, 109(3), 427–430.

Remarks

Time and place: t.b.a.

Prerequisites

Solid C++ skills (STL, C++14 or later standards), experience in real-time computer graphics

Target group

B.Sc. Medieninformatik

M.Sc. Computer Science for Digital Media / Computer Science and Media

M.Sc. Human-Computer Interaction


Structure tree
No classification in the course catalogue available. This course is from semester WiSe 2021/22; current semester: SoSe 2022.
