GMU:My Computer, Max, and I/Chia-Yun Kuo

From Medien Wiki
[[File:0_kuo.jpg|800px]]
[[File:plant_2_kuo.jpg|700px]]


[[File:3_kuo_yun.jpg|800px]]
[[File:4_kuo.jpg|800px]]


A possible connection between sensor and body: the distance between the user and the object
[[File:plant_3_kuo.jpg|700px]]


Sensor: ultrasonic distance sensor
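For context, an ultrasonic distance sensor measures the round-trip time of a sound pulse and converts it to a distance. A minimal sketch of that conversion (assuming a typical HC-SR04-style module; the function name is illustrative):

```python
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 °C


def echo_to_distance_m(echo_duration_s: float) -> float:
    """Convert an ultrasonic echo round-trip time (seconds) to distance (metres).

    The pulse travels to the object and back, so the one-way
    distance is half of duration times the speed of sound.
    """
    return echo_duration_s * SPEED_OF_SOUND_M_S / 2.0
```

A 10 ms echo corresponds to an object about 1.7 m away, which is roughly the scale of the user-to-object distances mentioned above.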


--->


Question 1: how can the user distinguish between an object in the real world and one in the VR world?
I am interested in self-generated graphics and in the similarities between the characteristics of algorithms and biology. I have therefore created a self-growing fractal in Unity, in which selected materials are applied to selected elements randomly. Each play creates a different fractal, and I would like to build an installation where viewers can start the play on their own and view the resulting graphics through Google Cardboard glasses.
Question 2: VR or AR?
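The self-growing fractal with randomly assigned materials can be sketched, outside Unity, as a simple recursion. The material names, branching factor, and nested-dict structure below are illustrative assumptions, not the actual Unity project:

```python
import random

# Hypothetical material names standing in for the Unity materials.
MATERIALS = ["moss", "bark", "leaf", "crystal"]


def grow_fractal(depth, branches=3, rng=None):
    """Recursively grow a fractal tree; each element gets a random material.

    Returns a nested dict: {"material": ..., "children": [...]}.
    A fresh random seed on each play yields a different fractal,
    mirroring the "each play creates a different fractal" idea.
    """
    rng = rng or random.Random()
    node = {"material": rng.choice(MATERIALS), "children": []}
    if depth > 0:
        node["children"] = [grow_fractal(depth - 1, branches, rng)
                            for _ in range(branches)]
    return node


def count_elements(node):
    """Count all elements in the grown fractal."""
    return 1 + sum(count_elements(c) for c in node["children"])
```

With a fixed seed the same fractal regrows identically; without one, every play produces a new variation, which is the behaviour described above.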






[[File:concept_kuo_plant.jpg|600px]]






/////////////////////////////
The installation will consist of a Google Cardboard, a game controller, an organism (a small greenhouse or simply a plant), and possibly headphones.


Topic:
Surveillance of digital data


[[File:1_kuo_yun.jpg|800px]]


[[File:2_kuo.jpg|800px]]
The relationship between Max/MSP and Unity will not only be the switch that starts the play but will also involve getting data from an organism to change the rotation speed of the fractal in Unity, possibly accompanied by sounds generated from the plant's signals.
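The step from a raw organism/sensor reading to a rotation speed could be a clamped linear scaling. All ranges below are illustrative assumptions, not measured values from the installation:

```python
def signal_to_rotation_speed(signal, sig_min=0.0, sig_max=1023.0,
                             speed_min=5.0, speed_max=90.0):
    """Map a raw plant-sensor reading to a rotation speed (degrees/second).

    The reading is clamped to [sig_min, sig_max] so noise or spikes
    cannot push the fractal's rotation outside a sensible range.
    """
    signal = max(sig_min, min(sig_max, signal))
    t = (signal - sig_min) / (sig_max - sig_min)
    return speed_min + t * (speed_max - speed_min)
```

The same shape of mapping would work for driving a sound parameter in Max/MSP from the plant signal.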
Possible connections between the sensor and the body:
arm-muscle detection, breath, a microphone
 
--->
 
Question 1: can the user really tell that the movements of objects in the virtual-reality world are caused by them?
Question 2: when the user is connected to the sensor, what is the tactile sensation?
 
https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Connecting_Max_to_the_World/unity%2Bosc_scripts
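The linked page covers connecting Max to Unity via OSC. For illustration, a single-float OSC 1.0 message of the kind exchanged over UDP between Max and Unity can be packed by hand; the address `/fractal/speed` is a hypothetical example:

```python
import struct


def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b


def osc_float_message(address: str, value: float) -> bytes:
    """Pack an OSC message carrying one float:
    padded address, padded ',f' type-tag string, big-endian 32-bit float."""
    return osc_pad(address.encode()) + osc_pad(b",f") + struct.pack(">f", value)
```

The resulting bytes could be sent with a plain UDP socket to whatever port the Unity OSC receiver (or Max's udpreceive) listens on.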

Latest revision as of 08:46, 20 June 2019
