GMU:Keeping Track/assignment four: Difference between revisions

From Medien Wiki

Revision as of 12:44, 1 February 2011

The task is to create a sonification of movement using either Pure Data or EyeCon, or both connected via OSC. This sonification should have 3 parts, and the user should be able to switch between those parts by holding still for a few seconds.
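The "hold still to switch" behaviour described above can be sketched in code. This is a hypothetical illustration, not any group's actual patch: it assumes a normalized activity value (like EyeCon's Activity field, 0.0 = no motion) arriving as a stream of samples, and advances to the next part once activity stays below a threshold for a few seconds.

```python
import time

class PartSwitcher:
    """Cycle through sonification parts when the user holds still.

    Hypothetical sketch: `activity` is assumed to be a normalized
    motion value (0.0 = no motion), e.g. received from EyeCon via OSC.
    """

    def __init__(self, n_parts=3, threshold=0.05, hold_seconds=3.0):
        self.n_parts = n_parts
        self.threshold = threshold
        self.hold_seconds = hold_seconds
        self.part = 0
        self._still_since = None

    def update(self, activity, now=None):
        """Feed one activity sample; return the current part index."""
        if now is None:
            now = time.monotonic()
        if activity < self.threshold:
            if self._still_since is None:
                self._still_since = now
            elif now - self._still_since >= self.hold_seconds:
                self.part = (self.part + 1) % self.n_parts
                self._still_since = None  # require a fresh hold for the next switch
        else:
            self._still_since = None      # any movement resets the timer
        return self.part
```

Passing an explicit `now` timestamp keeps the logic testable; in a live setup the default wall clock would be used.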

The target group is handicapped people.

Please read the whitepaper by Robert Wechsler.

Moritz and Frederic

Granular meets OSC~

A Pure Data patch converts EyeCon data into sound: in EyeCon there are three Fields (Activity, CenterX, Top) and two Lines (Position) that send separate data streams via the OSC protocol to Pd. In Pure Data there are two "scenes" that generate different soundscapes. By crossing the two lines in EyeCon (touching the "buttons" on the ground), the user can switch between the two scenes. The first scene is based on the Designing Sound patch "granular2" (granular synthesis) by Andy Farnell; Activity, CenterX and Top from the EyeCon Fields influence grain start, grain duration and grain pitch. The second scene combines several oscillators to generate a polyphonic sine texture: Activity changes the frequency and Top changes the volume of the main oscillator.
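The field-to-parameter mappings described above amount to linear scalings of the incoming control values. The sketch below illustrates that idea; the numeric output ranges are placeholders, since the actual values used in the patch are not documented.

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map an EyeCon control value into a synthesis range."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = min(max(t, 0.0), 1.0)  # clamp to the valid range
    return out_lo + t * (out_hi - out_lo)

# Hypothetical ranges -- the patch's real values are not documented.
def granular_params(activity, center_x, top):
    """Scene 1: the three Fields drive granular2-style parameters."""
    return {
        "grainstart":    scale(activity, 0.0, 1.0, 0.0, 1.0),     # buffer position
        "grainduration": scale(center_x, 0.0, 1.0, 20.0, 200.0),  # milliseconds
        "grainpitch":    scale(top,      0.0, 1.0, 0.5, 2.0),     # playback ratio
    }

def sine_params(activity, top):
    """Scene 2: Activity sets frequency, Top sets the main oscillator's volume."""
    return {
        "frequency": scale(activity, 0.0, 1.0, 110.0, 880.0),  # Hz
        "volume":    scale(top,      0.0, 1.0, 0.0, 1.0),
    }
```

In the actual patch this scaling would be done with Pd objects such as multiplies and adds on the unpacked OSC values; the Python version only makes the mapping explicit.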

<videoflash type>dQe09QA6TGc</videoflash>


Media: GranularMeetsOSC.zip

Ana, Laura, Gesine and Marianne

stringsound

Description
The installation consists of strings stretched horizontally about two meters above the floor. A video camera is set up to capture the area and EyeCon grabs the video image. Visitors are invited to enter the area and to touch and “play” the strings. It is built as a self-explanatory installation in which the participants discover the meaning of the construction on their own: they perceive a sound coming from a string as soon as it is touched.

Software
We are working with the motion-sensing software EyeCon, using the elements line, static and field. A background sound is added to the installation that changes slightly when somebody enters the playing area. A line element is attached to each string, and as soon as somebody touches the string, the line element is triggered and plays the corresponding sound. Each string in the room is made of a different material and plays a specific sound.
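The per-string trigger logic can be sketched as follows. This is a hypothetical illustration (the line names and sound files are invented): each EyeCon line element reports a Boolean (crossed / not crossed), and a sound is started only on the rising edge so that a continuously touched string does not retrigger on every frame.

```python
class StringTriggers:
    """Map Boolean line-element readings to the sounds of the strings.

    Hypothetical sketch: `sounds` maps a line name to the sound that
    the corresponding string should play, e.g. {"line1": "metal.wav"}.
    """

    def __init__(self, sounds):
        self.sounds = sounds
        self.state = {name: False for name in sounds}

    def update(self, readings):
        """Take {line_name: bool} readings; return the sounds to start now."""
        to_play = []
        for name, crossed in readings.items():
            if crossed and not self.state[name]:  # rising edge only
                to_play.append(self.sounds[name])
            self.state[name] = crossed
        return to_play
```

In the installation this edge detection would live in EyeCon or the receiving patch; the sketch just makes the trigger-once behaviour explicit.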

Concept
The term “handicapped people” covers a large number of people with different skills, and as it is not further specified, we thought about people who have difficulty moving. The goal is to sensitize the tactile act of touching, to give these people a better feeling of what movement involves, how it can lead to something and perhaps change something. The visitor finds himself or herself in a sensory experience in which every move results directly in an action. This makes it possible to become more aware of the body and to gain better control over it.

Difficulties
EyeCon works with simple Boolean data: a line is either triggered or it is not. If the strings are not properly arranged through the room, visitors will accidentally trigger a line with their body even if they do not touch the string with their hand. A possible solution for this issue would be a setup with more than one video camera; each camera would then be responsible for only two strings/lines.



liana

virtual choir

The main idea is to help handicapped people who are not able to speak. The goal is to create voices that sound like a choir and to give them the feeling of singing by moving their bodies, especially their hands. I imagined being in a church, singing in the choir just by raising our hands and creating voices with that gesture. I am using eight different tones of male and female voices.
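The voice selection described above can be sketched as a simple mapping from hand height to one of the eight voice tones. This is a hypothetical illustration (the file names and the equal-band split are invented, and the patch itself is in Pure Data, not Python): a normalized Top value from the tracking selects which recorded voice to play.

```python
# Hypothetical voice samples, ordered from low to high register.
VOICES = [
    "bass1.wav", "bass2.wav", "tenor1.wav", "tenor2.wav",
    "alto1.wav", "alto2.wav", "soprano1.wav", "soprano2.wav",
]

def voice_for_height(top):
    """Pick a choir voice from the normalized hand height (0.0 low, 1.0 raised)."""
    top = min(max(top, 0.0), 1.0)                     # clamp to [0, 1]
    index = min(int(top * len(VOICES)), len(VOICES) - 1)
    return VOICES[index]
```

Splitting the height range into equal bands is only one possible design; the bands could also be weighted so that small hand movements near the top produce finer pitch steps.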

<videoflash type>beomcz_62ic</videoflash>


Media: Choir.pd