My project explores the interaction between an object and sound. I connect the object to an Arduino and turn the Arduino into a touch sensor. First the sensor readings are collected in Processing; from there, messages are sent to Pure Data through OSC ([[OSC|Open Sound Control]]).
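An OSC message is simply an address pattern, a type-tag string, and big-endian arguments, each padded to a multiple of four bytes, sent here over UDP. Below is a minimal sketch of packing and sending one such message; the `/gesture` address and port 9000 are my own placeholders, not values fixed by the project (in Pd, one way to receive this is `[netreceive -u -b 9000]` into `[oscparse]`):

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """Pad with NULs to a multiple of 4 bytes, as the OSC spec requires."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *ints: int) -> bytes:
    """Build a binary OSC message carrying int32 arguments."""
    packet = osc_pad(address.encode())                    # address pattern
    packet += osc_pad(("," + "i" * len(ints)).encode())   # type-tag string
    for v in ints:
        packet += struct.pack(">i", v)                    # big-endian int32
    return packet

# Send a gesture number to Pure Data listening on UDP port 9000.
msg = osc_message("/gesture", 2)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("127.0.0.1", 9000))
```

In the actual project, Processing's oscP5 library does this packing, but the wire format is the same.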
My concept comes from an online project called "Touché for Arduino: Advanced touch sensing". In my project I combine Processing and Pure Data, using Processing as a medium to send data from the Arduino to Pure Data. In Processing, the information from the Arduino is classified; I then use the classified information in Pure Data to define different actions. In [[Pure Data|Pd]], a dataflow programming environment, I will create a Gem object and use the data to move it. In the end it will work like a maze game.
First I build a circuit on a breadboard myself to turn the Arduino into a touch sensor.
When I touch the "touch sensor", different data arrives from the Arduino in Processing. From this data I define different gestures; in my case there are four.
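In the Touché-style approach, each touch produces a characteristic frequency-sweep response curve, and a gesture is recognized by comparing the incoming curve against one stored template per gesture and picking the closest match. A rough sketch of that idea follows; the template names and values are made up for illustration, not taken from the project:

```python
def classify(reading, templates):
    """Return the name of the stored template closest to the reading,
    using summed squared distance over the sweep curve."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda name: dist(reading, templates[name]))

# Hypothetical templates: one averaged sweep curve per gesture.
templates = {
    "no touch":   [10, 12, 11, 10],
    "one finger": [30, 45, 40, 28],
    "grab":       [60, 80, 75, 55],
}
print(classify([32, 44, 41, 30], templates))  # closest to "one finger"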
I want to use the gestures to control a Gem ([[Tracking Motion Detection#Gem|Graphics Environment for Multimedia]], an OpenGL extension for [[Pure Data|Pd]]) object in Pure Data, so I name them "Go right", "Go up", "Go left" and "Go down".
In Processing, each gesture is remembered as a number.