GMU:Home Made Bioelectronics/Marah Doleh/Documentation of progress



==CONCEPT==
In this project, I plan to create an interactive installation that reflects the relationship between architectural space and music, with an emphasis on the emotional aspect of this experience.
The installation consists of a musical instrument, VR glasses, and a heart-rate (HR) sensor. The participant may play any music they want and change what they are playing whenever they wish throughout the experience.
The interaction starts once the participant begins playing and an emotional change is detected; this triggers an audio-reactive environment to appear. The environment keeps changing in response to the emotional changes during the musical performance, which will (probably) in turn affect the participant's emotions. In this way the participant is influenced by the music, which changes their emotions, which influence the virtual space, which leads to another emotional change, which changes the music, and so on. This closed feedback loop is an embodiment of the relationship between architecture, music, and emotion.
==TECHNICAL ASPECT==
The installation is divided into three main parts: an emotion detector (HR sensor + Arduino + Python), music (piano or any other instrument), and an audio-reactive environment (built in TouchDesigner and viewed through VR glasses).
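As a rough illustration of how these three parts could talk to each other, here is a minimal Python sketch of the planned data flow, not the final implementation: the Arduino streams heart-rate values over serial, and Python forwards them to TouchDesigner via OSC. The serial port name, baud rate, message format, and OSC address are assumptions for this example.

<syntaxhighlight lang="python">
# Minimal sketch of the pipeline: Arduino (HR sensor) -> Python -> TouchDesigner (OSC).
# Assumptions: the Arduino prints one heart-rate value per line over serial, and
# TouchDesigner listens with an OSC In CHOP/DAT on port 7000.

import serial                                   # pyserial
from pythonosc.udp_client import SimpleUDPClient

SERIAL_PORT = "/dev/ttyUSB0"                    # placeholder, e.g. "COM3" on Windows
BAUD_RATE = 9600
TD_HOST, TD_PORT = "127.0.0.1", 7000            # where TouchDesigner listens for OSC

def main():
    arduino = serial.Serial(SERIAL_PORT, BAUD_RATE, timeout=1)
    td = SimpleUDPClient(TD_HOST, TD_PORT)

    while True:
        line = arduino.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue                            # timeout or empty line, keep waiting
        try:
            bpm = float(line)                   # one heart-rate value per line
        except ValueError:
            continue                            # ignore malformed lines
        td.send_message("/heart_rate", bpm)     # appears as a channel in an OSC In CHOP

if __name__ == "__main__":
    main()
</syntaxhighlight>

In TouchDesigner the value then shows up as a channel of an OSC In CHOP and can drive any parameter of the environment.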
==AESTHETIC ASPECT==
Although the software to be used and the general vision of the visual outcome are predetermined (an interactive architectural space that is changed by emotions and music), the exact design is not yet clear; it will be developed during this semester in the TouchDesigner course I am currently taking.


==What has been done so far:==


  • Testing receiving data from Arduino in TouchDesigner, using an ultrasonic distance sensor (only for this test)

/Code on arduino
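The Arduino code for this test is linked above. On the TouchDesigner side, one way to pick up the values is a Serial DAT with a callbacks DAT like the sketch below, assuming the Arduino prints one distance reading per line; the operator names are placeholders.

<syntaxhighlight lang="python">
# Callbacks DAT attached to a Serial DAT in TouchDesigner.
# Assumption: the Arduino prints one ultrasonic distance value (in cm) per line.
# Each parsed value is written into a Constant CHOP named 'distance' so the
# rest of the network can use it.

def onReceive(dat, rowIndex, message, bytes):
    line = message.strip()
    if not line:
        return
    try:
        value = float(line)
    except ValueError:
        return                                  # ignore lines that are not a number
    op('distance').par.value0 = value           # expose the reading as a CHOP channel
    return
</syntaxhighlight>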

  • Manipulating a 3D object using data received from Arduino

Links used for this step: Feedback in Touchdesigner, TOUCHDESIGNER & ARDUINO
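One way the received value was mapped onto a 3D object (sketched here with placeholder operator names) is a CHOP Execute DAT that watches the sensor channel and writes it into the transform of a Geometry COMP:

<syntaxhighlight lang="python">
# CHOP Execute DAT watching the CHOP that carries the sensor value
# (e.g. a Null CHOP fed by the serial input). Placeholder names: the watched
# channel holds a distance in cm, the object is a Geometry COMP called 'geo1'.

def onValueChange(channel, sampleIndex, val, prev):
    t = max(0.0, min(val / 200.0, 1.0))     # assume a 0-200 cm sensor range, map to 0-1

    geo = op('geo1')
    geo.par.scale = 0.5 + t                 # uniform scale between 0.5 and 1.5
    geo.par.ry = t * 360.0                  # rotate around Y with the sensor value
    return
</syntaxhighlight>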

  • Audio-reactive 3D model

Links used for this step: Instancing in Touchdesigner, Touchdesigner audio analysis
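Following the linked tutorials, the audio reactivity combines audio analysis with instancing. The sketch below (a Script CHOP callback with assumed operator names) shows the idea: an Audio Spectrum CHOP is split into bands, and each band drives the scale of one instance.

<syntaxhighlight lang="python">
# Script CHOP callback: turn an audio spectrum into one scale value per instance.
# Assumptions: 'audiospect1' is an Audio Spectrum CHOP (fed by an Audio Device In
# or Audio File In CHOP), and the Geometry COMP instances from this Script CHOP.

NUM_INSTANCES = 64

def onCook(scriptOp):
    scriptOp.clear()

    spectrum = op('audiospect1')[0]               # first channel of the spectrum CHOP
    samples = spectrum.vals
    per_band = max(1, len(samples) // NUM_INSTANCES)

    scriptOp.numSamples = NUM_INSTANCES
    scale = scriptOp.appendChan('scale')

    vals = []
    for i in range(NUM_INSTANCES):
        band = samples[i * per_band:(i + 1) * per_band]
        level = sum(band) / len(band) if band else 0.0
        vals.append(1.0 + level * 5.0)            # base size plus audio-driven boost
    scale.vals = vals
    return
</syntaxhighlight>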

==What's next?==

Arduino

  • Send pulse data to TD
  • Send the pulse data to the emotion recognition code (a sketch of this fan-out follows below)
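A possible shape for this step (a sketch with placeholder names, building on the serial reading shown earlier): the same pulse sample is forwarded to TouchDesigner and pushed into a queue that the emotion recognition code consumes.

<syntaxhighlight lang="python">
# Sketch of the planned fan-out: one pulse sample goes both to TouchDesigner (OSC)
# and to the emotion recognition code (via a queue). Port names and OSC addresses
# are assumptions.

import queue
import serial
from pythonosc.udp_client import SimpleUDPClient

pulse_queue = queue.Queue()                       # read by the emotion recognition part
td = SimpleUDPClient("127.0.0.1", 7000)

def read_pulse(port="/dev/ttyUSB0", baud=9600):
    arduino = serial.Serial(port, baud, timeout=1)
    while True:
        line = arduino.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue
        try:
            sample = float(line)
        except ValueError:
            continue
        td.send_message("/pulse", sample)         # raw pulse straight to TouchDesigner
        pulse_queue.put(sample)                   # same sample for the emotion recognition
</syntaxhighlight>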

Python | Emotion recognition

  • Emotion recognition
  • Send the result to TD (see the sketch below)
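The recognition model itself is still open; as a placeholder, the sketch below maps the heart rate to a very coarse arousal level and sends the result to TouchDesigner over OSC. The thresholds, labels, and addresses are assumptions for illustration, not the final method.

<syntaxhighlight lang="python">
# Placeholder "emotion recognition": classify the current heart rate into a coarse
# arousal level and send it to TouchDesigner. A real model (e.g. using HRV features)
# would replace classify(); thresholds and OSC addresses are assumptions.

from pythonosc.udp_client import SimpleUDPClient

td = SimpleUDPClient("127.0.0.1", 7000)

LEVELS = ["calm", "neutral", "excited"]           # one scene per level in TouchDesigner

def classify(bpm):
    """Very coarse arousal estimate from heart rate alone."""
    if bpm < 70:
        return 0        # calm
    elif bpm < 100:
        return 1        # neutral
    return 2            # excited

def process(bpm):
    level = classify(bpm)
    td.send_message("/emotion/index", level)          # drives the scene switch in TD
    td.send_message("/emotion/label", LEVELS[level])  # for debugging / display
</syntaxhighlight>

process() would be called for every sample (or window of samples) coming from the serial-reading part above.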

TouchDesigner

  • Test other forms of 3D (point clouds, more complex instancing, ...)
  • Try other sets of reactions to music
  • Create multiple scenes for different emotions
  • Intersection with Arduino: react to the pulse input
  • Intersection with Python: trigger scene changes from the emotional input (see the sketch below)
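For the last point, one possible setup (again a sketch, with placeholder operator names) is an OSC In DAT in TouchDesigner that receives the emotion index from the Python side and switches between the pre-built scenes, for example by setting the index of a Switch TOP:

<syntaxhighlight lang="python">
# Callbacks DAT attached to an OSC In DAT in TouchDesigner.
# Assumption: the Python side sends the emotion index on '/emotion/index'
# and each emotion has its own scene feeding a Switch TOP called 'scene_switch'.

def onReceiveOSC(dat, rowIndex, message, bytes, timeStamp, address, args, peer):
    if address == '/emotion/index' and args:
        op('scene_switch').par.index = int(args[0])   # jump to the matching scene
    return
</syntaxhighlight>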