Luise Krumbein
== '''FINAL PROJECT''': pulse — distorting live images ==
=== Motion ===


'''10.06.2021''' exploring effects


An update on the pixel manipulation:


[[File:Keep Moving.jpg|800px]]
<gallery>
File:manipulate webcam image_1.png
File:manipulate webcam image_2.png
</gallery>
Video manipulation in real-time, 3 min


I was not so happy with the pixel manipulation. Online, I found an effect that I really like. It is used for motion tracking: if the object is still, you don't see it at all; if it moves slowly, you can barely make out its outline; the faster it moves, the more of it becomes visible.
Things that move trigger our attention and curiosity. We immediately perceive them as living matter, while something that never moves is quickly perceived as dead. How is our environment shaped? Is it alive, livable, boring? Are cars living organisms?


I simply changed the colors of the image with jit.unpack and jit.pack. To capture a snapshot, I used the microphone, which is triggered by snapping fingers or clapping hands. Every snapshot should be saved automatically, but somehow this didn't work, so for now it has to be saved manually.
Video recordings of public places were made at various locations in Weimar using a laptop and its webcam. The video is embedded in Max/MSP, which manipulates the recording in real-time. The setup filters out everything that doesn't move; things that do move become visible. The top right of the patch calculates the difference between the frames of the video, which filters out the non-moving objects and highlights changes in the video feed. The bottom left takes snapshots, triggered by loud noises. The resulting video is displayed at the bottom right: a sequence of images created from the individual snapshots captured by sound.
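
For anyone reading along without Max: below is a minimal sketch of the same pipeline in Python with OpenCV and NumPy (my own stand-in, not the patch itself). It computes the difference between consecutive webcam frames, thresholds it into a motion mask, tints the still background bright yellow and the moving parts bright red, and uses a key press in place of the microphone trigger for saving a snapshot.

<pre>
# Hypothetical Python/OpenCV analogue of the Max patch: frame differencing
# plus the jit.unpack/jit.pack-style recoloring. Press "s" to simulate the
# sound-triggered snapshot; press "q" to quit.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                       # default webcam
ok, prev = cap.read()
if not ok:
    raise RuntimeError("no webcam frame")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
snap_count = 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev_gray)         # change between frames
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    prev_gray = gray

    out = np.zeros_like(frame)
    out[:] = (0, 255, 255)                      # BGR: bright yellow background
    out[mask > 0] = (0, 0, 255)                 # bright red where motion occurs

    cv2.imshow("motion", out)
    key = cv2.waitKey(1) & 0xFF
    if key == ord("s"):                         # stands in for the mic trigger
        cv2.imwrite(f"snapshot_{snap_count:03d}.png", out)
        snap_count += 1
    elif key == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
</pre>
The threshold of 25 is an arbitrary starting point; it controls how much change between frames counts as "moving".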


Here is the patch [[:File:210609_Moving Fast.maxpat]] and some screenshots
{{#ev:vimeo|572240468}}


<gallery>
File:manipulate webcam image_3.png
File:Theater.png
File:manipulate webcam image_4.png
File:Ilmpark.png
File:manipulate webcam image_5.png
File:Koriat.png
File:manipulate webcam image_7.png
File:Wieland.png
</gallery>


----
'''04.06.2021''' trying to manipulate the webcam image pixel by pixel
Please find the patch here: [[:File:210604_Project.maxpat]]
I can't figure out how to create a new image / video from the edited cellblock (the edited pixels).
Do you have a tutorial or references for that? I am wondering whether a command that manipulates the whole image, not individual pixels, would be interesting. Probably that's why you recommended jit.gl.pix, but I can't make it work.
[[File:manipulate webcam image.png|400px]]
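
In case a plain-code analogy helps with the cellblock question: in Python with NumPy and OpenCV (an assumed stand-in, not Jitter code), a frame is simply an array of pixels, so any edit to the array already is the new image; there is no separate creation step. A minimal sketch:

<pre>
# Hypothetical stand-in for "new image from edited pixels": a webcam frame
# is a NumPy array, so editing cells of the array yields a new image directly.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)
ok, frame = cap.read()                   # frame is a (height, width, 3) uint8 array
cap.release()
if not ok:
    raise RuntimeError("no webcam frame")

edited = frame.copy()
edited[::10, :] = (255, 255, 255)        # per-pixel edit: white scanlines every 10 rows
edited[:, :, 2] = 255 - edited[:, :, 2]  # whole-image edit: invert the red channel

cv2.imwrite("edited_frame.png", edited)  # the edited pixels are the new image
</pre>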




Here are some reference videos I found online
* Turning my face into a drum [https://www.youtube.com/watch?v=cOa-RuN8cO8&list=PLvpdUWhGx43-OttXG4lp11-UohI3wPeqV&index=21&t=122s]
* Creating Images with jit.matrix [https://www.youtube.com/watch?v=M7VrjJJFQbM&list=PLvpdUWhGx43-OttXG4lp11-UohI3wPeqV&index=22]
* RGB Glitch Effect [https://www.youtube.com/watch?v=sO5NaTjBvL8]


----


'''03.06.2021''' sensor arrived / trial and error
'''Technical Solution'''
 
I connected the pulse sensor to Max and ran into some trouble along the way.
 
[[File:Screenshot 2021-06-03 at 11.16.31.png|400px]]
 
As the sensor is sensitive to light, I came up with the idea of putting it inside a box. It seems to make a difference, but I am not sure.
 
[[File:Pulse Sensor to Max_1.jpg|400px]]
 
I used similar objects from the "sensing physical parameters" patch and an Arduino code I found online.
 
[[File:Pulse Sensor to Max_2.jpg|400px]]
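
For debugging the serial link outside of Max, a few lines of Python with pyserial can show whether the sensor data arrives at all. This sketch assumes the Arduino code prints one analog reading (0-1023) per line at 115200 baud; the port name is a placeholder and differs per machine.

<pre>
# Hypothetical serial monitor for the pulse sensor, assuming the Arduino
# prints one analogRead() value (0-1023) per line. Requires pyserial.
import serial

PORT = "/dev/tty.usbmodem14101"   # placeholder; use your Arduino's port
ser = serial.Serial(PORT, 115200, timeout=1)

for _ in range(200):              # read a couple of seconds of samples
    line = ser.readline().decode(errors="ignore").strip()
    if line.isdigit():
        value = int(line)         # raw pulse-sensor reading
        print(value, "#" * (value // 16))  # crude text "waveform"

ser.close()
</pre>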


I have no clue how to connect the input from the sensor to the webcam image. I am searching online, but I haven't found good references or tutorials on how to distort a live image with a specific data input. I would be very happy about any hint.
Please find the patch here: [[:File:210616_Keep Moving.maxpat]]
During the research, I found a method that uses the webcam itself to measure your heart rate (photoplethysmography). I think this would be way more interesting!! [https://hackaday.com/2020/12/25/webcam-heart-rate-monitor-brings-photoplethysmography-to-your-pc/] The webcam would then be the sensor and create the visuals as well.


[[File:PPG_forehead.png|400px]]
[[File:Screenshot 2021-06-17 at 00.05.59.png|800px]]
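
To illustrate how the webcam-as-sensor idea works in principle, here is a hedged sketch in Python with OpenCV and NumPy (my own reconstruction, not the code from the linked article): the mean green value of a skin region fluctuates slightly with every heartbeat, and the strongest frequency of that signal between 0.7 and 4 Hz (42-240 bpm) is an estimate of the pulse. It assumes a fixed forehead-sized region and a camera that really delivers about 30 frames per second.

<pre>
# Hypothetical webcam PPG sketch: track the mean green value of a fixed
# region, then read the heart rate off the FFT peak.
import cv2
import numpy as np

FPS, SECONDS = 30, 15                  # assumes the camera delivers ~30 fps
cap = cv2.VideoCapture(0)
signal = []

for _ in range(FPS * SECONDS):
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]
    roi = frame[h // 8 : h // 4, w // 3 : 2 * w // 3]   # assumed forehead region
    signal.append(roi[:, :, 1].mean())                  # mean of the green channel

cap.release()
sig = np.asarray(signal) - np.mean(signal)              # remove the DC offset
freqs = np.fft.rfftfreq(len(sig), d=1.0 / FPS)          # frequency axis in Hz
spectrum = np.abs(np.fft.rfft(sig))
band = (freqs > 0.7) & (freqs < 4.0)                    # plausible pulse range
bpm = 60.0 * freqs[band][np.argmax(spectrum[band])]
print(f"estimated heart rate: {bpm:.0f} bpm")
</pre>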


 
The lower display window shows the entire webcam video. The upper window shows the current snapshot until the next one is taken. For motion tracking, ''jit.rgb2luma'' is often used to identify a moving object; this command caught my attention. With ''jit.unpack'' and ''jit.pack'', the colors are changed to bright yellow for the background and bright red for the moving object. The microphone trigger for taking a snapshot is set very low; even a big bite into an apple can set it off.
What I am imagining right now, but don't know where to start, is the following: the visitor comes closer to the screen and sees themselves in the webcam image. The webcam identifies the pulse and recognizes the human as well (who ideally turns red-ish, at 50% opacity; maybe by using the cellblock object?). The pixels of the human stretch to the rhythm of the pulse.
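
As a possible starting point for the "pixels stretch to the rhythm of the pulse" idea, here is a minimal sketch in Python with OpenCV (in Max this would be jit objects instead). The heart rate is hard-coded as a placeholder and would later come from the sensor or the PPG estimate; the image is rescaled by a factor that oscillates at that rate.

<pre>
# Hypothetical "pixels stretch to the pulse" sketch: scale the live image
# with a sine wave at a (here hard-coded) heart rate of 70 bpm.
import time
import cv2
import numpy as np

BPM = 70.0                              # placeholder; replace with sensor data
cap = cv2.VideoCapture(0)
t0 = time.time()

while True:
    ok, frame = cap.read()
    if not ok:
        break
    phase = 2 * np.pi * (BPM / 60.0) * (time.time() - t0)
    scale = 1.0 + 0.05 * np.sin(phase)  # +/-5% stretch in the pulse rhythm
    h, w = frame.shape[:2]
    stretched = cv2.resize(frame, None, fx=scale, fy=scale)
    # crop or pad back to the original size so the window does not jump around
    canvas = np.zeros_like(frame)
    sh, sw = stretched.shape[:2]
    y, x = (sh - h) // 2, (sw - w) // 2
    if scale >= 1.0:
        canvas = stretched[y : y + h, x : x + w]
    else:
        canvas[-y : -y + sh, -x : -x + sw] = stretched
    cv2.imshow("pulse", canvas)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
</pre>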


----


'''20.05.2021''' A normal pulse is regular in rhythm and strength. Sometimes it's higher, sometimes lower, depending on the situation and health condition. You rarely see it and only feel it from time to time. Apart from the visualisation of an EKG, what could it look like? How can this vital rhythm be visualised?
'''Documentation'''
 
[[File:Pulse Interaction.jpg|400px]]
 
I would like to use an Arduino pulse sensor to create an interactive installation. The setup contains the pulse sensor and a screen. The user can easily connect to Max by clipping the Arduino sensor to their finger. The webcam is connected, and the screen shows a pulsating and distorted image of the participant: the pulse sensor visually stretches the webcam image, moves it in the rhythm of the pulse, and creates a new result.
 
A more complex version would be the interaction of two pulses: two Arduino pulse sensors are connected to Max, and one distorted webcam image shows the combination of the pulses of two people. This idea could be developed further during the semester.
 
How does it work: an Arduino pulse sensor measures the pulse at the finger (usually this kind of sensor is used to measure the radial pulse). In Max, a patch must be created in which the data from the Arduino is converted into parameters that distort the image.
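
That conversion step is essentially a linear mapping, roughly what a [scale] object does in Max. A tiny sketch in Python, with example numbers as placeholders:

<pre>
# Hypothetical mapping from raw Arduino readings (0-1023) to a distortion
# parameter, the Python equivalent of a Max "scale 0 1023 0. 0.1" object.
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi]."""
    t = (value - in_lo) / float(in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

reading = 612                                   # example raw sensor value
amount = scale(reading, 0, 1023, 0.0, 0.1)      # -> stretch factor offset
print(f"distortion amount: {amount:.3f}")       # 0.060 for this reading
</pre>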
 
some links:
* webcam interaction [https://www.youtube.com/watch?v=j0mMU5Pu5EE&list=PLvpdUWhGx43-OttXG4lp11-UohI3wPeqV&index=4 ]
* distorting images [https://www.youtube.com/watch?v=Rgcvnm-lcF8&list=PLvpdUWhGx43-OttXG4lp11-UohI3wPeqV&index=10&t=101s]
 
----


== Process ==
[[/Process/]]
[[/Trials/]]


[[/Research/]]
[[/References/]]


[[/Homework/]]
[[/Tutorials/]]
