GMU:My Computer, Max, and I/Chia-Yun Kuo

From Medien Wiki
[[File:0.jpg|800px]]
[[File:plant_2_kuo.jpg|700px]]


the difference between vision and tactile sensation in the VR world.


[[File:1_kuo.jpg|800px]]
[[File:plant_3_kuo.jpg|700px]]


[[File:2_kuo.jpg|800px]]


possible connections between sensor and body: arm movement, breath, microphone


--->
I am interested in self-generated graphics and in the similarities between algorithmic and biological structures. I have therefore created a self-growing fractal in Unity, in which selected materials are applied to selected elements randomly. Each play creates a different fractal, and I would like to build an installation in which viewers can start the play themselves and watch the resulting graphics through Google Cardboard glasses.
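The idea of a fractal that grows differently on every play can be sketched as a recursive structure where each element picks a random material. This is only an illustrative Python sketch of the algorithm; the actual project is a Unity (C#) scene, and the material names here are hypothetical placeholders:

```python
import random

MATERIALS = ["moss", "bark", "crystal", "leaf"]  # hypothetical material palette

def grow(depth, branches=3):
    """Recursively grow a fractal tree. Every element is assigned a random
    material, so each play produces a different fractal, as in the Unity scene."""
    node = {"material": random.choice(MATERIALS), "children": []}
    if depth > 0:
        for _ in range(branches):
            node["children"].append(grow(depth - 1, branches))
    return node

def count_elements(node):
    """Total number of elements in the grown fractal."""
    return 1 + sum(count_elements(c) for c in node["children"])

tree = grow(depth=3)
print(count_elements(tree))  # 1 + 3 + 9 + 27 = 40 elements
```

Because the material choice is random per element, two runs with the same depth still look different, which is what makes each "play" unique.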


question: can users really tell that the movement of objects in the virtual-reality world is caused by them?


[[File:3_kuo.jpg|800px]]
[[File:4_kuo.jpg|800px]]


possible connections between sensor and body: the distance between the user and the object
[[File:concept_kuo_plant.jpg|600px]]


sensor: ultrasonic distance sensor
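One simple way to connect the distance sensor to the fractal is to map measured distance onto a visual parameter such as rotation speed. A minimal sketch of such a mapping, assuming a hypothetical sensor range of 20–200 cm and a rotation speed in degrees per second (the actual range depends on the sensor and the scene):

```python
def distance_to_speed(distance_cm, min_d=20.0, max_d=200.0,
                      min_speed=0.0, max_speed=90.0):
    """Map sensor distance (cm) to a rotation speed (deg/s):
    the closer the user stands, the faster the fractal rotates.
    Readings outside the sensor range are clamped."""
    d = max(min_d, min(max_d, distance_cm))
    t = (max_d - d) / (max_d - min_d)  # 1.0 when close, 0.0 when far
    return min_speed + t * (max_speed - min_speed)

print(distance_to_speed(20))   # 90.0 (closest -> fastest)
print(distance_to_speed(200))  # 0.0  (farthest -> still)
```

The same linear mapping could be built directly in Max/MSP with a scale object before sending the value on to Unity.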


--->


question: how can the user distinguish an object in the real world from one in the virtual world, given the concept of the project?
The installation will consist of a Google Cardboard, a game controller, an organism (a small greenhouse or simply a plant), and possibly headphones.




https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Connecting_Max_to_the_World/unity%2Bosc_scripts
 
The relationship between Max/MSP and Unity will not only be the switch that starts the plays, but also getting data from an organism to change the rotation speed of the fractal in Unity, and perhaps generating sounds from the plant's signal.

Latest revision as of 08:46, 20 June 2019
