Luise Krumbein

From Medien Wiki
''(rethinking title)''
== PULSE — distorting live images ==


== Motion ==






''(video)''
[[File:Keep Moving.jpg|800px]]


[[File:manipulate webcam image_7.png|400px]]


''(concept)''
Real-time video manipulation, 3 min
Things that move trigger our attention and curiosity, and we immediately perceive them as living matter; likewise, when there is a sudden noise, we prick up our ears.


Combining these expressive triggers can lead to an alarming outcome; this project, however, aims to visualise a living organism that can only be seen when moving and is captured by sound.
Something that does not move from time to time is quickly perceived as dead. How is our environment shaped? Is it alive, livable, boring? Are cars living organisms?
A soundless video shows the resulting sequence of single snapshots.


In the resulting video, other motions appear, different from the original ones. Our world consists of movement, of changes in spatial relations. Humans are able to perceive movement through motion perception, which accompanies us in everyday life and is essential for finding our way in the world.
Video recordings of public places are made at various locations in Weimar using a laptop and its webcam. The video feed is embedded in Max/MSP, which manipulates the recordings in real time. The setup filters out everything that doesn't move; things that do move become visible. The top-right part of the patch calculates the difference between consecutive frames of the video, filtering out non-moving objects and highlighting changes in the video feed. The bottom-left part takes snapshots, triggered by loud noises. The resulting video, displayed at the bottom right, is a sequence of images created from the individual snapshots captured by sound.
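The frame-differencing step described here can be sketched outside Max/MSP. Below is a minimal NumPy version; the function name, array shapes, and threshold value are illustrative assumptions, not part of the original patch:

```python
import numpy as np

def motion_mask(prev_frame, curr_frame, threshold=25):
    """Return a boolean mask of pixels that changed between two frames.

    Both frames are uint8 greyscale arrays of equal shape. Static areas
    produce small differences and are filtered out; moving objects
    exceed the threshold and become visible.
    """
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

# Two toy 4x4 "frames": one pixel jumps in brightness (a moving object).
prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[1, 2] = 200

mask = motion_mask(prev, curr)
print(mask.sum())  # one changed pixel
```

In the patch, the same idea is realised with Jitter objects on the live webcam matrix rather than on stored arrays.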
However, we see not only real movement but also apparent movement: the perception of movement in objects that are not actually moving in the physical sense. It refers to stroboscopic movement, the perception of motion when viewing a sequence of slightly varied individual images.


This setup encourages viewers to move in front of the camera, to show an emotion, to make a sound, to speak out loud.
{{#ev:vimeo|572240468}}
A normal pulse is regular in rhythm and strength. Sometimes it is higher, sometimes lower, depending on the situation and state of health. You rarely see it, and only feel it from time to time. Apart from the visualisation of an EKG, what could it look like? How can this vital rhythm be visualised?


<gallery>
File:Theater.png
File:Ilmpark.png
File:Koriat.png
File:Wieland.png
</gallery>


''(more screenshots, different outcomes, sketch in gallery view)''
[[File:Pulse Interaction.jpg|400px]]


''(technical solution)''




----
----


== DOCUMENTATION ==
'''Technical Solution'''
 
The patch is available here: [[:File:210616_Keep Moving.maxpat]]
 
[[File:Screenshot 2021-06-17 at 00.05.59.png|800px]]
 
The lower display window shows the entire webcam video. The upper window shows the current snapshot until the next one is taken. For motion tracking, ''jit.rgb2luma'' is often used to identify a moving object; this command caught my attention. Using ''jit.unpack'' and ''jit.pack'', the colours are changed to bright yellow for the background and bright red for the moving object. The microphone trigger for taking a snapshot is set very low; even a big bite into an apple can set it off.
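The luma-based masking and recolouring can be sketched in Python as well. This is a sketch under assumptions: ''jit.rgb2luma'' is approximated with the standard Rec. 601 luma weights, and the exact yellow/red values, threshold, and the RMS-based microphone trigger are illustrative, not taken from the patch:

```python
import numpy as np

# Rec. 601 luma weights, roughly what jit.rgb2luma computes.
LUMA = np.array([0.299, 0.587, 0.114])

def recolour(prev_rgb, curr_rgb, threshold=30):
    """Paint moving pixels bright red and static ones bright yellow.

    Frames are RGB arrays of shape (H, W, 3). The luma difference
    between frames decides which pixels count as 'moving'.
    """
    luma_prev = prev_rgb @ LUMA
    luma_curr = curr_rgb @ LUMA
    moving = np.abs(luma_curr - luma_prev) > threshold

    out = np.empty_like(curr_rgb, dtype=np.uint8)
    out[:] = (255, 255, 0)       # bright yellow background
    out[moving] = (255, 0, 0)    # bright red moving object
    return out

def loud_enough(samples, trigger=0.05):
    """Deliberately low snapshot trigger on the RMS of an audio buffer."""
    return float(np.sqrt(np.mean(np.square(samples)))) > trigger

# Toy frames: one pixel brightens sharply between frames.
prev = np.zeros((2, 2, 3), dtype=np.uint8)
curr = prev.copy()
curr[0, 1] = (250, 250, 250)

out = recolour(prev, curr)
print(out[0, 1], out[0, 0])  # red moving pixel, yellow background
```

A very low `trigger` value mirrors the patch's sensitive microphone threshold, where even a bite into an apple fires a snapshot.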
 
----
 
'''Documentation'''


[[/Process/]]

Latest revision as of 20:24, 13 July 2021

References

Tutorials