GMU:In Sync/Projects/AUDIOVIDEOCUTUP


Idea

In the beginning there was Tim Vets' autocutup patch, which is based on the literary cut-up technique popularized by William S. Burroughs.
Autocutup analyses an audio file and slices it, i.e. it cuts the audio into several parts by detecting attacks. The slices are then played back in random order.
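
To illustrate the idea outside the patch, here is a minimal Python sketch (not Tim Vets' patch itself, only the same principle): it slices an audio file at detected attacks and writes the slices back out in random order. The libraries (librosa, soundfile, numpy) and the file names are just assumptions for the example.

<source lang="python">
import numpy as np
import librosa
import soundfile as sf

# Load the audio file (the file name is just an example).
y, sr = librosa.load("input.wav", sr=None, mono=True)

# Detect attacks (onsets) and use them as slice boundaries.
onsets = librosa.onset.onset_detect(y=y, sr=sr, units="samples")
boundaries = np.concatenate(([0], onsets, [len(y)]))

# Cut the audio into slices between consecutive attacks.
slices = [y[a:b] for a, b in zip(boundaries[:-1], boundaries[1:]) if b > a]

# "Play back" the slices in random order - here by writing them to a new file.
np.random.shuffle(slices)
sf.write("cutup.wav", np.concatenate(slices), sr)
</source>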

Making slices is a common method in music production: you can separate the different parts of a drum sample and rearrange them into new rhythmical structures.

In this project I try to work with slices that combine audio and video.

Exercise

I started by adding video to Tim Vets' original autocutup patch.

To have more control over the slices I added MIDI to the patch, so that the slices can be played with a keyboard, for example. I decided to use Ableton Live to send rhythmical structures to the patch to make it more musical.
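
As a sketch of the MIDI side (again in Python, with the mido and sounddevice libraries as assumptions, not the actual patch): every incoming note-on picks one slice, so a keyboard or Ableton Live can play the sliced material rhythmically.

<source lang="python">
import mido
import numpy as np
import librosa
import sounddevice as sd

# Slice the material as in the sketch above.
y, sr = librosa.load("input.wav", sr=None, mono=True)
onsets = librosa.onset.onset_detect(y=y, sr=sr, units="samples")
boundaries = np.concatenate(([0], onsets, [len(y)]))
slices = [y[a:b] for a, b in zip(boundaries[:-1], boundaries[1:]) if b > a]

# Every note-on triggers one slice, so rhythmical structures sent as MIDI
# (from a keyboard or from Ableton Live) play the sliced material.
with mido.open_input() as port:               # default MIDI input port
    for msg in port:
        if msg.type == "note_on" and msg.velocity > 0:
            index = msg.note % len(slices)    # simple wrap-around note-to-slice mapping
            sd.play(slices[index], sr)        # trigger the slice immediately
</source>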

The next step was to work with live video material recorded by a webcam. Now I can record myself and the slices are played back in real time.

For the presentation I decided to have a prepared track in Ableton Live with beats and harmonies, so there is a kind of rhythmical preset which triggers the slices. I can also trigger the start of a recording phrase (automatically 4 seconds of recording time), which feeds the patch with new material for new slices.
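
Such a recording phrase could look roughly like this in Python (a sketch with OpenCV and sounddevice as assumed libraries, not the patch itself): record about 4 seconds of webcam video and microphone audio, keep both, and then cut them at the same positions.

<source lang="python">
import time
import cv2
import sounddevice as sd

SR = 44100           # audio sample rate
RECORD_SECONDS = 4   # length of one recording phrase

# Start the 4-second audio recording in the background ...
audio = sd.rec(int(RECORD_SECONDS * SR), samplerate=SR, channels=1)

# ... and grab webcam frames for the same time span.
cap = cv2.VideoCapture(0)
frames, t0 = [], time.time()
while time.time() - t0 < RECORD_SECONDS:
    ok, frame = cap.read()
    if ok:
        frames.append(frame)
cap.release()
sd.wait()            # wait until the audio recording has finished

audio = audio[:, 0]
fps = len(frames) / RECORD_SECONDS   # effective frame rate of this recording

# 'audio' and 'frames' now hold the new material; cutting both at the same
# attack positions gives matching audio and video slices.
</source>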

... but better: watch.  :) <videoflash type=vimeo>39588791|650|370</videoflash>

Problems

This lecture was called “In Sync”. When you watch the documentation video you might notice that not everything is in sync, or that it is often not really clear whether the audio and video are actually synchronised. There are several reasons for this:

I recorded the live video of myself with an HD camera. The sliced material was recorded with an HD webcam. The sliced result was captured with an Epiphan frame grabber on a second computer, and the sliced audio was recorded with QuickTime on that same computer. The record buttons were not all pressed at the same time, so the documentation technique was not the best: it was quite hard to put the several recordings back together afterwards. Re-syncing material that was synchronised during the performance but recorded out of sync is not the best way to do it.

Other “normal” problems are, for example, the much coarser temporal resolution of video (framerate) compared to audio (samplerate).
I also still have to solve the problem of the click sounds you sometimes hear when a slice is triggered.
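
Both points can at least be sketched: a video slice can only start on a whole frame, so the audio slice boundaries have to be rounded to the nearest frame, and clicks like these typically appear when a slice starts or ends on a non-zero sample value, which a very short fade-in/fade-out removes. The 5 ms fade length and the frame rate below are just example values, not the patch's settings.

<source lang="python">
import numpy as np

SR = 44100   # audio sample rate
FPS = 25     # video frame rate (example value)

def sample_to_frame(sample_index):
    # One video frame covers SR / FPS audio samples, so a slice boundary
    # found in the audio has to be rounded to the nearest frame.
    return int(round(sample_index * FPS / SR))

def declick(slice_audio, fade_ms=5):
    # A short fade-in/fade-out keeps a slice from starting or ending with
    # a jump in the waveform, which would be heard as a click.
    fade = min(int(SR * fade_ms / 1000), len(slice_audio) // 2)
    out = slice_audio.astype(float)
    if fade > 0:
        ramp = np.linspace(0.0, 1.0, fade)
        out[:fade] *= ramp
        out[-fade:] *= ramp[::-1]
    return out
</source>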

Patch

Looking Forward

I would like to make the patch independent of Ableton Live, and I would like to have more control and more possibilities to work freely with the slices.

I would like to use the audiovideocutup patch for performance, for more theatrical use. I would like to play with language. And I am looking for stronger content within the patch, or in the use of sliced video and sound.