GMU:Keeping Track/assignment four

From Medien Wiki


Please read the [http://www.palindrome.de/d2/C13.pdf whitepaper] by Robert Wechsler.
<br clear="all" />


== Moritz and Frederic ==
 
==== Documentary about WHAT WE DID ====


 
==='''Granular meets OSC~'''===
 
The Pure Data patch converts EyeCon data into sound:
in EyeCon there are three fields (Activity, CenterX, Top) and two lines (Position) that send separate data via the OSC protocol to Pure Data. In Pure Data there are two "scenes" that generate different soundscapes. By crossing the two lines in EyeCon (touching the "buttons" on the ground) the two scenes can be switched. The first scene is based on the Designing Sound patch "granular2" (granular synthesis) by Andy Farnell. Activity, CenterX and Top from the fields in EyeCon influence grain start, grain duration and grain pitch.
The second scene combines several oscillators to generate a polyphonic sine wave. Activity changes the frequency and the Top value changes the volume of the main oscillator.
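The mapping described above can be sketched in a few lines of plain Python — a hypothetical illustration, not the actual patch: three normalized EyeCon field values are scaled to grain parameters, and a rising edge on a line toggles between the two scenes. All parameter ranges here are illustrative assumptions.

```python
def scale(value, lo, hi):
    """Linearly map a normalized 0..1 value into the range [lo, hi]."""
    return lo + max(0.0, min(1.0, value)) * (hi - lo)

def map_fields_to_grains(activity, center_x, top):
    """Scene 1: field values control grain start, duration and pitch.
    The target ranges are assumptions, not the real patch's values."""
    return {
        "grain_start": scale(activity, 0.0, 1.0),       # position in buffer
        "grain_duration_ms": scale(center_x, 10, 200),  # grain length
        "grain_pitch": scale(top, 0.5, 2.0),            # playback ratio
    }

class SceneSwitcher:
    """Crossing a line (a rising-edge trigger) flips between the two scenes."""
    def __init__(self):
        self.scene = 1
        self._was_triggered = False

    def update(self, line_triggered):
        # Toggle only on the transition from untriggered to triggered,
        # so standing on the line does not flip scenes every frame.
        if line_triggered and not self._was_triggered:
            self.scene = 2 if self.scene == 1 else 1
        self._was_triggered = line_triggered
        return self.scene
```

In the real installation these values arrive over OSC from EyeCon and drive the Pure Data patch; here they are plain function arguments.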


<videoflash>dQe09QA6TGc|450|360</videoflash>
 


[[Media: GranularMeetsOSC.zip]]
<br clear="all" />


==Ana, Laura, Gesine and Marianne==
 
=== stringsound ===

==== Description ====
The installation consists of strings that are stretched horizontally about two meters above the floor. A video camera is set up to capture the area, and EyeCon grabs the video image. Visitors are invited to enter the area and to touch and “play” with the strings. It is built as a self-explanatory installation in which the participants discover the purpose of the construction on their own: they perceive a sound coming from a string as soon as it is touched.


==== Software ====
We are working with the motion-sensing software EyeCon, using the elements line, static and field. A background sound is added to the installation that changes slightly when somebody enters the playing area. A line element is attached to each string, and as soon as somebody touches a string, its line element is triggered and plays the corresponding sound. Each string in the room plays a specific sound and is made of a different material.
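The trigger logic just described can be sketched as follows — a hypothetical illustration in Python, not part of the actual EyeCon setup. The sound names and the `play` callback are assumptions; the point is that a sound fires only on the rising edge of a line, so holding a string does not retrigger its sample every frame.

```python
class StringTrigger:
    def __init__(self, sounds, play):
        self.sounds = sounds  # e.g. {"line1": "pluck.wav"} (names assumed)
        self.play = play      # callback that starts a sample
        self.prev = {name: False for name in sounds}

    def update(self, states):
        """states: {line_name: bool} for one video frame.
        Play a sound only when a line goes from untriggered to triggered."""
        for name, active in states.items():
            if active and not self.prev.get(name, False):
                self.play(self.sounds[name])
            self.prev[name] = active
```

A quick usage example: feeding three consecutive frames in which a visitor touches first one string and then another plays each sample exactly once.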


==== Concept ====
The term “handicapped people” covers a large number of people with different abilities, and as it is not further specified, we thought about people who have difficulty moving. The goal is to sensitize the tactile act of touching, giving these people a better feeling of what movement involves, how it can lead to something and maybe change something. The visitors find themselves in a sensory experience in which every move results directly in action. Because of this, they can become more aware of their body and gain better control over it.


==== Difficulties ====
EyeCon works with simple Boolean data: a line is either triggered or it is not.
If the strings are not properly arranged in the room, visitors will accidentally trigger a line with their body even if they do not touch the string with their hand. A possible solution for this issue would be a setup with more than one video camera; in that case, every camera would only be responsible for two strings/lines.




<videoflash>hvSax3QrzoA|500|305</videoflash>
<br clear="all" />


== Liana ==
===virtual choir===
The main idea is to help handicapped people who are not able to speak. The goal is to create voices that sound like a choir and to give them the feeling of singing by moving their bodies, especially the hands. I imagined being in a church, joining the choir just by raising my hands and creating voices with that gesture. I am using eight different tones of male and female voices; the male voices are lower than the female voices.
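One way to realize the mapping above can be sketched as follows — a hypothetical Python illustration, not the actual Pure Data patch. A normalized hand height (0 = floor, 1 = top of the camera frame) selects one of the eight choir tones, lower tones first; the file names are assumptions.

```python
# Eight choir voice samples, ordered from lowest to highest tone.
# These names are placeholders, not the files used in the real patch.
VOICES = [f"choir_tone_{i}.wav" for i in range(1, 9)]

def voice_for_height(height):
    """Map a normalized 0..1 hand height to one of the eight choir tones."""
    h = max(0.0, min(1.0, height))
    # Split the 0..1 range into eight equal bands, one per voice.
    index = min(int(h * len(VOICES)), len(VOICES) - 1)
    return VOICES[index]
```

For example, a hand near the floor selects the lowest tone and a fully raised hand selects the highest.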


<videoflash>beomcz_62ic|500|305</videoflash>


[[Media:Choir.pd]]
<gallery>
File:Choirpatch.jpg|puredata patch screenshot
File:Movementandvoices.jpg|movement
</gallery>
 
<br clear="all" />
 


==Smooth Music==
<gallery>
File:smooth music patch.jpg|patch for smooth music
</gallery>
<br clear="all" />


== Henning: Step Sequencer ==

The EyeCon file consists of one field (movement) and three lines (triggers). The tracked movement is mapped to the bpm of a Pure Data step sequencer: when a tracked person moves fast, the tempo is high, and vice versa. The pattern played by the step sequencer can be changed. The trigger lines start the playback of a single sample.<br>
When I used the patch and played around a bit, it was a challenge to move at the right tempo. If you move too fast, the beat plays too fast, but once you get used to it, the sound should swing with your movement. Hitting the trigger lines while moving is a bit tricky, but this could be improved by putting them in a different position.
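The movement-to-tempo mapping can be sketched like this — a hypothetical Python illustration, not the actual patch. The movement value from the EyeCon field is assumed to be normalized to 0..1, and the bpm range is an assumption.

```python
BPM_MIN, BPM_MAX = 60, 180  # assumed tempo range, not the patch's values

def movement_to_bpm(movement):
    """Fast movement -> high tempo, little movement -> slow tempo."""
    m = max(0.0, min(1.0, movement))
    return BPM_MIN + m * (BPM_MAX - BPM_MIN)

def step_interval_ms(bpm, steps_per_beat=4):
    """Milliseconds between sequencer steps (16th notes by default)."""
    return 60000.0 / (bpm * steps_per_beat)
```

At moderate movement (0.5) this gives 120 bpm, i.e. a sequencer step every 125 ms with four steps per beat.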


[[Media:step sequencer.zip]]
<br clear="all" />


== Jörg ==
===''EYEDRUM''===
Just imagine you are a drum, filled with little balls


[[Media:EYEDRUM.zip]]
[[Category:Dokumentation]]