Luise Krumbein: Difference between revisions

From Medien Wiki


== '''Project Idea 1''' ==
== Motion ==
First thoughts; please also take a look at Johannes Schneemann's project idea
 
 


What does the space we share online look like?
[[File:Keep Moving.jpg|800px]]


Participants in an online meeting share a virtual space without sharing the same circumstances. What circumstances influence a meeting in reality? Someone might be cold or hot, the surroundings might be noisy, a washing machine might be running in the background, or it might be late at night or early in the morning for someone joining from another time zone...


The shared virtual space would be represented by an additional member of the BBB conference displaying an interactive moving image, generated from data collected from all participants at the time of the meeting. For five participants, the parameters are: body temperature, room temperature, pulse, time, and noise.
Video manipulation in real time, 3 min
This non-visual data is translated into visual images; the room we share online gains another dimension.
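The translation of non-visual data into visual parameters could be sketched in code. The following Python sketch is purely illustrative: the parameter ranges, names, and mappings are my assumptions, not part of the actual concept.

```python
# Hypothetical sketch: map one participant's non-visual data to visual
# parameters for a shared moving image. All ranges and mappings below
# are assumptions for illustration.

def normalize(value, lo, hi):
    """Clamp and scale a sensor reading into the range 0.0..1.0."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def visual_params(body_temp, room_temp, pulse, hour, noise_db):
    """Translate one participant's circumstances into drawing parameters."""
    return {
        "hue":        normalize(body_temp, 35.0, 39.0),  # warmth -> color
        "brightness": normalize(room_temp, 15.0, 30.0),
        "speed":      normalize(pulse, 50, 120),         # pulse -> motion speed
        "phase":      hour / 24.0,                       # time of day -> cycle phase
        "turbulence": normalize(noise_db, 30, 90),       # noise -> distortion
    }

p = visual_params(body_temp=36.8, room_temp=22.0, pulse=72, hour=21, noise_db=55)
print(round(p["speed"], 3))  # 0.314
```

Per-participant dictionaries like this could then be averaged or layered to drive the generated image for the whole meeting.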


I have to admit that it won't be technically feasible in this short time (real-time data exchange, embedding in BBB, interactive visualisation).
Things that move trigger our attention and curiosity. We immediately perceive them as living matter. Something that does not move now and then is quickly perceived as dead. How is our environment shaped? Is it alive, livable, boring? Are cars living organisms?


Video recordings of public places are made at various locations in Weimar using a laptop and its webcam. The video is embedded in Max/MSP, which manipulates the recordings in real time. The setup filters out everything that doesn't move, so only moving things become visible. The top right calculates the difference between consecutive frames of the video, filtering out non-moving objects and highlighting changes in the video feed. The bottom left takes snapshots, triggered by loud noises. The resulting video, a sequence of the individual snapshots captured by sound, is displayed at the bottom right.
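The frame-differencing step described above can be sketched outside Max/MSP. The following Python/NumPy sketch is a stand-in for the jitter patch, using synthetic frames instead of a live webcam feed; it shows the core idea that subtracting consecutive frames leaves only the pixels that changed.

```python
import numpy as np

def motion_mask(prev_frame, next_frame, threshold=25):
    """Return a boolean mask marking pixels that changed between frames."""
    # Cast to a signed type so the subtraction cannot wrap around.
    diff = np.abs(next_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

# A static 8x8 scene with one bright "object" that moves one pixel right.
frame_a = np.zeros((8, 8), dtype=np.uint8)
frame_b = np.zeros((8, 8), dtype=np.uint8)
frame_a[3, 2] = 200   # object position in frame A
frame_b[3, 3] = 200   # object position in frame B

mask = motion_mask(frame_a, frame_b)
print(mask.sum())  # 2: the pixel the object left and the pixel it moved to
```

Everything static cancels out in the subtraction; only the two changed pixels survive the threshold, which is exactly why the non-moving background disappears in the installation.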


== '''Homework 3''' ==
{{#ev:vimeo|572240468}}


Sensing physical parameters  [[:File:Luise Krumbein_homework 3.maxpat]]
<gallery>
File:Theater.png
File:Ilmpark.png
File:Koriat.png
File:Wieland.png
</gallery>


Video: https://www.youtube.com/watch?v=5iw1E1od4O8


[[File:Luise Krumbein_homework 3_screenshot.png|400px]]




== '''Homework 2''' ==
----


1. Sound to number [[:File:Luise Krumbein homework 2_part 1.maxpat]]
'''Technical Solution'''


[[File:Luise Krumbein homework 2_part 1_screenshot.png|400px]]
Please find the patch here: [[:File:210616_Keep Moving.maxpat]]


[[File:Luise Krumbein homework 2_part 1_screenrecord.mov]]
[[File:Screenshot 2021-06-17 at 00.05.59.png|800px]]


The lower display window shows the entire webcam video. The upper window shows the current snapshot until the next one is taken. For motion tracking, ''jit.rgb2luma'' is often used to identify a moving object. This command caught my attention. Using ''jit.unpack'' and ''jit.pack'', the colors are changed to bright yellow for the background and bright red for the moving object. The microphone trigger threshold for taking a snapshot is set very low; even a big bite into an apple can set it off.
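The recoloring step can likewise be sketched in NumPy. One simplification to note: in the patch the moving object is identified via frame differences, while here a plain luma threshold on a single frame stands in for it. The BT.601 weights match ''jit.rgb2luma'''s default, but the threshold and the exact colors are illustrative assumptions.

```python
import numpy as np

def rgb_to_luma(rgb):
    """ITU-R BT.601 luma, the default weighting of jit.rgb2luma."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b

def recolor(rgb, luma_threshold=128):
    """Bright red where luma exceeds the threshold, bright yellow elsewhere."""
    luma = rgb_to_luma(rgb)
    out = np.zeros_like(rgb)
    moving = luma > luma_threshold      # stand-in for the motion mask
    out[moving] = (255, 0, 0)           # "moving object" -> bright red
    out[~moving] = (255, 255, 0)        # background -> bright yellow
    return out

img = np.zeros((4, 4, 3), dtype=np.uint8)
img[1, 1] = (250, 250, 250)             # one bright "moving" pixel
result = recolor(img)
```

Splitting the image into per-channel planes and reassembling them is what ''jit.unpack''/''jit.pack'' do in the patch; the boolean indexing above plays the same role of rewriting each plane before repacking.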


2. Video analysis [[:File:Luise Krumbein_homework 2_part 2.maxpat]]
----


[[File:Luise Krumbein homework 2_part 2_screenshot.png|400px]]
'''Documentation'''


[[/Process/]]


== '''Homework 1''' ==
[[/References/]]
First patch [[:File:210422_Luise Krumbein.maxpat]]


Projects from other students:  F.Z. Aygüler [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler]
[[/Tutorials/]]
Wei Ru Tai [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai]

Latest revision as of 20:24, 13 July 2021
