GMU:Critical VR Lab I/Paulina M Chwala

From Medien Wiki
INTERACTION WITH UNITY

I was very interested in creating an interactive VR experience. Unfortunately, during the pandemic it is not possible to use the university rooms, so I tried to develop a basic interactive installation in Unity instead. The interaction is based on the position of the user: when the person comes closer to the sensor, the display changes accordingly. I want to create a simple experience, a situation that would not be possible in the real world.


______________________________________________________________________________________

IMPLEMENTATION

The installation is based on an Arduino ultrasonic sensor, a Max MSP loop, and Unity real-time rendering. I created a room with glowing spheres in Unity, which is displayed on a laptop monitor. In front of the monitor, the ultrasonic sensor is installed. Each time the user approaches the screen, the installation reacts with sound and movement of the spheres.


data loop: ultrasonic sensor -> Max MSP (sound) -> OSC protocol -> Unity (image)


The Arduino ultrasonic sensor detects the position of the user.


[[File:Arduino loop1.jpeg]]
The Arduino code collects the data.


[[File:Max loop2.jpeg]]
The data is received, unpacked and transformed in the Max MSP loop, which creates sound according to the data.
 
[[File:Unity33.jpeg]]
The data is sent from Max MSP to Unity via the OSC protocol; external C# plugins in Unity receive the data and transform objects in the scene.
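
To give an idea of what such a plugin does on the Unity side, here is a minimal, self-contained C# sketch. It is an illustration only, not the actual plugin used in the installation: it listens on a UDP port for an OSC message carrying one float (for example a /distance value sent from Max MSP with udpsend) and maps that value onto the spheres in the scene. The address /distance, port 9000, the sensor range and the mapping are assumptions made for this example.

<syntaxhighlight lang="csharp">
// DistanceOscReceiver.cs -- rough sketch, not the plugin used in the installation.
// Listens for an OSC message like "/distance <float>" from Max MSP (udpsend)
// and maps the value onto the height of the spheres in the scene.
using System;
using System.Net;
using System.Net.Sockets;
using UnityEngine;

public class DistanceOscReceiver : MonoBehaviour
{
    public int port = 9000;           // assumption: must match the port set on [udpsend] in Max
    public Transform[] spheres;       // the glowing spheres to move
    public float maxDistance = 200f;  // assumption: sensor range in cm

    UdpClient udp;
    IPEndPoint remote;
    float latestDistance;

    void Start()
    {
        udp = new UdpClient(port);
        remote = new IPEndPoint(IPAddress.Any, 0);
    }

    void Update()
    {
        // Read every UDP packet that arrived since the last frame.
        while (udp != null && udp.Available > 0)
        {
            byte[] packet = udp.Receive(ref remote);
            float value;
            if (TryReadFirstFloat(packet, out value))
                latestDistance = value;
        }

        if (spheres == null) return;

        // Simple reaction: the closer the user, the lower the spheres.
        float t = Mathf.Clamp01(latestDistance / maxDistance);
        foreach (Transform s in spheres)
        {
            Vector3 p = s.localPosition;
            p.y = Mathf.Lerp(0.2f, 3f, t);
            s.localPosition = p;
        }
    }

    // Very small OSC parser: skips the address and type-tag strings
    // (both padded to 4-byte boundaries) and reads the first float argument.
    static bool TryReadFirstFloat(byte[] data, out float value)
    {
        value = 0f;
        int pos = SkipPaddedString(data, 0);   // address, e.g. "/distance"
        pos = SkipPaddedString(data, pos);     // type tags, e.g. ",f"
        if (pos + 4 > data.Length) return false;
        // OSC floats are big-endian; reverse for a little-endian host.
        byte[] b = { data[pos + 3], data[pos + 2], data[pos + 1], data[pos] };
        value = BitConverter.ToSingle(b, 0);
        return true;
    }

    static int SkipPaddedString(byte[] data, int start)
    {
        int end = start;
        while (end < data.Length && data[end] != 0) end++;
        return (end + 4) & ~3;                 // jump past the null padding
    }

    void OnDestroy()
    {
        if (udp != null) udp.Close();
    }
}
</syntaxhighlight>

Attached to an empty GameObject with the sphere transforms assigned in the Inspector, a component like this would react each frame to the latest distance value coming from the Max MSP loop.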
 
______________________________________________________________________________________
 
UNITY SCENE
 
I wanted to create a dark game scene that is lit up by its environment. I also wanted to play with Unity modules. For that, I created a Shader Graph material. I also used the Particle System to make glowing modules.
[[File:SCHADER111.jpeg]]
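
The Shader Graph material and the Particle System are configured in the Unity editor rather than in code, so there is no script behind them to show. Purely as an illustration, a similar glow can also be driven from a C# component through a material's emission property (the "_EmissionColor" name below applies to the Standard shader; a Shader Graph material would expose its own property name):

<syntaxhighlight lang="csharp">
// GlowPulse.cs -- illustrative sketch only; the glow in the actual scene
// comes from a Shader Graph material and the Particle System.
using UnityEngine;

public class GlowPulse : MonoBehaviour
{
    public Color glowColor = Color.cyan;
    public float speed = 1f;          // pulses per second
    public float maxIntensity = 2f;

    Material mat;

    void Start()
    {
        // Accessing .material creates a per-object instance, so only this object glows.
        mat = GetComponent<Renderer>().material;
        mat.EnableKeyword("_EMISSION");
    }

    void Update()
    {
        // Pulse the emission colour between 0 and maxIntensity over time.
        float intensity = (Mathf.Sin(Time.time * speed * Mathf.PI * 2f) * 0.5f + 0.5f) * maxIntensity;
        mat.SetColor("_EmissionColor", glowColor * intensity);
    }
}
</syntaxhighlight>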
