Luise Krumbein

From Medien Wiki
{{#ev:youtube|Mha9aTXIeDU}}


== '''FINAL PROJECT''': pulse — distorting live images ==


(video)
[[File:manipulate webcam image_7.png|400px]]


A normal pulse is regular in rhythm and strength. Sometimes it is higher, sometimes lower, depending on the situation and the person's health. It is rarely seen and only felt from time to time. Apart from the visualisation of an EKG, what could it look like? How can this vital rhythm be visualised?
Things that move catch our attention and curiosity; we immediately perceive them as living matter. And when there is a sudden noise, we prick up our ears.


[[File:Pulse Interaction.jpg|400px]]


I would like to use an Arduino pulse sensor to create an interactive installation. The setup consists of the pulse sensor, a webcam and a screen. Participants connect to Max simply by clipping the Arduino sensor to their finger. The screen then shows a pulsating, distorted image of the participant: the pulse sensor visually stretches the webcam image, which moves in the rhythm of the pulse and creates a new result.


A more complex version would be the interaction of two pulses: two Arduino pulse sensors are connected to Max, and one distorted webcam image shows the combination of the pulses of two people. This idea could be developed further during the semester.
{{#ev:youtube|S-nIkN0xIuk}}


How does it work? An Arduino pulse sensor measures the pulse at the fingertip; this kind of sensor is usually used to measure the radial pulse. In Max, a patch must be created that converts the incoming Arduino data into parameters that distort the image.
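The conversion step itself would live in the Max patch, but the idea can be sketched in ordinary code. The following Python sketch is only an illustration, not the actual patch: the sensor range (300–800 of the 10-bit ADC) and the function name are assumptions, and a real sensor would need tuning.

```python
def pulse_to_stretch(samples, lo=300, hi=800, max_stretch=0.5):
    """Map raw 10-bit pulse-sensor readings (0-1023) to a stretch
    factor in [0, max_stretch] that could drive the image distortion.
    The useful range lo..hi is an assumption, not a measured value."""
    out = []
    for s in samples:
        s = min(max(s, lo), hi)         # clamp to the useful range
        norm = (s - lo) / (hi - lo)     # normalise to 0.0 .. 1.0
        out.append(norm * max_stretch)  # amount to stretch the image by
    return out
```

Each beat then shows up as a rising and falling stretch value, so the webcam image pulses in the rhythm of the heart.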


Some links:
* webcam interaction [https://www.youtube.com/watch?v=j0mMU5Pu5EE&list=PLvpdUWhGx43-OttXG4lp11-UohI3wPeqV&index=4]
* distorting images [https://www.youtube.com/watch?v=Rgcvnm-lcF8&list=PLvpdUWhGx43-OttXG4lp11-UohI3wPeqV&index=10&t=101s]

Combining these two expressive triggers can lead to an alarming outcome. This project, however, aims to visualise a living organism that can only be seen when it moves and is captured by sound. Finger snapping, hand clapping, whistling, drumming, hitting, screaming, stamping, clanging, etc. are either visible in the image or not. The sequence of single snapshots is shown in a soundless video. As a result, the brain interprets the new sequence differently from the original webcam video, and new movements appear.

Humans perceive movement through motion perception. Our world consists of movement, of changes in spatial relations. Motion perception accompanies us in everyday life and is essential for finding our way in the world. However, we see not only real movement but also apparent movement: the perception of movement in objects that are not actually moving in the physical sense. It refers to stroboscopic movement, the perception of motion when viewing a sequence of slightly varied individual images.

This setup encourages the participant to move in front of the camera, to show an emotion, to make a sound, to speak out loud.

'''Next step: going outside, capturing people/cars/objects/animals/plants that move and are caught by the random sounds of the location.'''

''Please find on the [[/Process/]] page the patch of my previous project idea regarding the pulse.''
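The sound trigger described above can be sketched as a simple threshold with re-arming, so that one loud event yields one snapshot rather than a burst of them. This Python sketch is an assumption about the logic, not a transcription of the Max patch:

```python
def snapshot_indices(levels, threshold=0.05):
    """Return the indices of frames where a snapshot is taken:
    whenever the microphone level rises above a (deliberately low)
    threshold, one snapshot fires; the trigger then re-arms only
    after the level has dropped below the threshold again."""
    taken, armed = [], True
    for i, level in enumerate(levels):
        if armed and level >= threshold:
            taken.append(i)   # loud moment: grab a snapshot
            armed = False     # ignore the rest of this sound
        elif level < threshold:
            armed = True      # silence again: re-arm the trigger
    return taken
```

Playing the chosen snapshots back in sequence, without the sound, produces the soundless video in which the new movements appear.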
 
----
 
'''Technical Solution'''
 
Please find here the patch  [[:File:210616_Keep Moving.maxpat]]
 
[[File:Screenshot 2021-06-17 at 00.05.59.png|800px]]
 
The lower display window shows the entire webcam video. The upper window shows the current snapshot until the next one is taken. For motion tracking, ''jit.rgb2luma'' is often used to identify a moving object; this object caught my attention. With ''jit.unpack'' and ''jit.pack'' the colours are changed: bright yellow for the background and bright red for the moving object. The microphone trigger for taking a snapshot is set very low. Even a big bite into an apple can set it off.
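As an illustration of what this colour step does, here is a plain-Python sketch of the same idea: compute one luma value per pixel (''jit.rgb2luma'' produces a comparable weighted sum; the exact Rec. 601 weights used here are an assumption), compare it with the previous frame, and paint unchanged pixels bright yellow and changed pixels bright red.

```python
def rgb2luma(pixel):
    """Weighted luma of an (r, g, b) pixel (Rec. 601 weights)."""
    r, g, b = pixel
    return 0.299 * r + 0.587 * g + 0.114 * b

def recolor(frame, prev_frame, thresh=30):
    """Bright yellow background, bright red wherever the luma changed
    between two frames, i.e. wherever something moved."""
    YELLOW, RED = (255, 255, 0), (255, 0, 0)
    return [[RED if abs(rgb2luma(p) - rgb2luma(q)) > thresh else YELLOW
             for p, q in zip(row, prev_row)]
            for row, prev_row in zip(frame, prev_frame)]
```

Frames here are nested lists of (r, g, b) tuples; in the patch the same comparison happens per cell of a Jitter matrix.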


----


== DOCUMENTATION ==


[[/Process/]]

Revision as of 06:39, 17 June 2021



[[/References/]]

[[/Tutorials/]]