== Motion ==

[[File:Keep Moving.jpg|800px]]

Video manipulation in real time, 3 min

Things that move trigger our attention and curiosity. We immediately perceive them as living matter, while something that does not move from time to time is quickly perceived as dead. How is our environment shaped? Is it alive, livable, boring? Are cars living organisms?

Video recordings of public places were made at various locations in Weimar using a laptop and its webcam. The video is embedded in Max/MSP, which manipulates the recordings in real time. The setup filters out everything that doesn't move; things that do move become visible. The top right window calculates the difference between the frames of the video, filtering out the non-moving objects and highlighting changes in the video feed. The bottom left takes snapshots, triggered by loud noises. The resulting video, displayed at the bottom right, is a sequence of images created from the individual snapshots captured by sound.

{{#ev:vimeo|572240468}}

<gallery>
File:Theater.png
File:Ilmpark.png
File:Koriat.png
File:Wieland.png
</gallery>
----

'''Technical Solution'''

Please find the patch here: [[:File:210616_Keep Moving.maxpat]]

[[File:Screenshot 2021-06-17 at 00.05.59.png|800px]]

The lower display window shows the entire webcam video; the upper window shows the current snapshot until the next one is taken. For motion tracking, ''jit.rgb2luma'' is often used to identify a moving object, and this command caught my attention. With ''jit.unpack'' and ''jit.pack'' the colours are changed: bright yellow for the background, bright red for the moving object. The microphone trigger for taking a snapshot is set very low; even a big bite into an apple can set it off.
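The patch does all of this in Max/MSP; as a rough illustration of the same frame-differencing and recolouring idea outside Max, here is a minimal Python/OpenCV sketch (an illustration only, not part of the patch; the threshold value is an assumption):

<syntaxhighlight lang="python">
# Minimal frame-differencing sketch: non-moving pixels are filtered out,
# moving pixels stand out. The yellow/red recolouring mimics what
# jit.unpack/jit.pack do in the Max patch.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                  # default webcam
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # like jit.rgb2luma
    diff = cv2.absdiff(gray, prev_gray)              # difference between frames
    moving = diff > 25                               # assumed motion threshold

    out = np.zeros_like(frame)
    out[:] = (0, 255, 255)      # bright yellow background (BGR)
    out[moving] = (0, 0, 255)   # bright red for moving pixels (BGR)
    cv2.imshow("motion", out)

    prev_gray = gray
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
</syntaxhighlight>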
----

== '''FINAL PROJECT''': pulse — distorting live images ==

'''03.07.2021''' sensor arrived / trial and error

I connected the pulse sensor to Max and ran into some trouble along the way. As the sensor is sensitive to light, I came up with the idea of putting it inside a box. It seems to make a difference, but I am not sure.

[[File:Pulse Sensor to Max_1.jpg|400px]]

I used the "sensing physical parameters" patch to display the incoming numbers.
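The same raw stream can also be checked outside Max. A minimal sketch with pyserial, assuming the Arduino prints one analogRead() value per line at 9600 baud (the port name and the line format are assumptions, not the actual setup):

<syntaxhighlight lang="python">
# Hypothetical check of the raw pulse sensor stream, outside Max.
import serial

with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:  # port name varies
    while True:
        line = port.readline().strip()
        if not line:
            continue                 # read timed out, try again
        try:
            value = int(line)        # raw 0..1023 pulse sensor reading
        except ValueError:
            continue                 # skip partial or garbled lines
        print(value)
</syntaxhighlight>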
[[File:Pulse Sensor to Max_2.jpg|400px]]

I have no clue how to connect the input from the sensor to the webcam image. While searching online, I came across the method of using the webcam itself as the tool to measure the heart rate. I think this would be far more interesting!
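A minimal sketch of how this webcam method could work (an assumption, not a tested setup): the skin gets minutely brighter with every heartbeat, so the mean green value of a skin region oscillates at the pulse rate, and the dominant frequency of that signal gives the heart rate. The fixed "forehead box" below is a placeholder for proper face tracking:

<syntaxhighlight lang="python">
# Rough webcam PPG sketch: track the mean green value of a skin region
# over ~10 s, then find the dominant frequency in the plausible pulse band.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)
fps = cap.get(cv2.CAP_PROP_FPS) or 30.0     # fall back if the camera reports 0
signal = []

while len(signal) < 300:                    # roughly 10 s of frames
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]
    roi = frame[h // 4 : h // 2, w // 3 : 2 * w // 3]   # crude forehead box
    signal.append(roi[:, :, 1].mean())      # mean of the green channel
cap.release()

sig = np.asarray(signal) - np.mean(signal)
freqs = np.fft.rfftfreq(len(sig), d=1.0 / fps)
spectrum = np.abs(np.fft.rfft(sig))
band = (freqs > 0.7) & (freqs < 4.0)        # plausible pulse range: 42..240 bpm
bpm = 60.0 * freqs[band][np.argmax(spectrum[band])]
print(f"estimated pulse: {bpm:.0f} bpm")
</syntaxhighlight>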
[[File:PPG_forehead.png|400px]]

'''27.05.2021''' moodboard / waiting for the sensor to arrive

Since I want to set up a user-friendly installation and failed to build a pulse sensor with the equipment I have at home, I ordered one online.

[[File:260521_moodboard.png|400px]]

More links:
* heart beat with Arduino and Processing [https://www.youtube.com/watch?v=2_c0yE9QHNI&list=PLvpdUWhGx43-OttXG4lp11-UohI3wPeqV&index=14&t=319s]
* the heart, National Geographic [https://www.youtube.com/watch?v=GMBSU-2GK3E&list=PLvpdUWhGx43-OttXG4lp11-UohI3wPeqV&index=13]
* the human heart [https://www.youtube.com/watch?v=ebzbKa32kuk&list=PLvpdUWhGx43-OttXG4lp11-UohI3wPeqV&index=16]
* heart sounds (Herzgeräusche) [https://www.heilpraktikerausbildung24.de/herzgeraeusche-ansehen-und-anhoeren]

----

'''20.05.2021''' A normal pulse is regular in rhythm and strength. Sometimes it is higher, sometimes lower, depending on the situation and state of health. You rarely see it and only feel it from time to time. Apart from the visualisation of an ECG, what could it look like? How can this vital rhythm be visualised?

[[File:Pulse Interaction.jpg|400px]]

I would like to use an Arduino pulse sensor to create an interactive installation. The setup consists of the pulse sensor and a screen. The user can easily connect to Max by clipping the sensor to a finger. The webcam is connected, and the screen shows a pulsating, distorted image of the participant: the pulse sensor visually stretches the webcam image, moves it in the rhythm of the pulse and creates a new result.

A more complex version would be the interaction of two pulses: two Arduino pulse sensors are connected to Max, and one distorted webcam image shows the combination of the pulses of two people. This idea could be developed further during the semester.

How does it work? An Arduino pulse sensor measures the pulse at the finger (usually this kind of sensor is used to measure the radial pulse). In Max, a patch must be created in which the data from the Arduino is converted into parameters that distort the image.
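A minimal sketch of this conversion step, assuming the simplest possible mapping (pulse value to zoom factor); the sensor input is faked with a sine wave so the sketch runs without hardware, where the real value would come from the Arduino over serial:

<syntaxhighlight lang="python">
# Sketch of the mapping stage only: a pulse reading arrives as a number
# and is turned into a zoom factor that makes the webcam image "beat".
import math
import time
import cv2

cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # fake pulse signal at ~60 bpm; a real patch would read the Arduino here
    pulse = 0.5 + 0.5 * math.sin(2 * math.pi * time.time())
    zoom = 1.0 + 0.08 * pulse               # map pulse to a gentle zoom
    h, w = frame.shape[:2]
    M = cv2.getRotationMatrix2D((w / 2, h / 2), 0, zoom)  # scale about centre
    out = cv2.warpAffine(frame, M, (w, h))
    cv2.imshow("pulse zoom", out)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
</syntaxhighlight>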
Some links:
* webcam interaction [https://www.youtube.com/watch?v=j0mMU5Pu5EE&list=PLvpdUWhGx43-OttXG4lp11-UohI3wPeqV&index=4]
* distorting images [https://www.youtube.com/watch?v=Rgcvnm-lcF8&list=PLvpdUWhGx43-OttXG4lp11-UohI3wPeqV&index=10&t=101s]

----

== '''PROJECT THOUGHTS''' ==

Please take a look at Johannes Schneemann's project idea 1.

This is a short summary of my thoughts, but I would like to develop my final project idea further.

What does the space we share online look like?

Participants in an online meeting share a virtual space without sharing the same circumstances. What circumstances influence a meeting in reality? Someone might be cold or hot, it might be very noisy, the washing machine might be running in the background, it might be late at night or early in the morning because of a time difference...

The shared virtual space would be represented by an additional member of the BBB conference displaying an interactive moving image, generated from data collected from all participants at the time of the meeting. For five participants, the parameters are: body temperature, room temperature, pulse, time and noise.

This non-visual data is translated into visual images; the room we share online gains another dimension.

I have to admit that it won't be technically feasible in this short time (real-time data exchange, embedding in BBB, interactive visualisation).

----

== '''Homework 3''' ==

Sensing physical parameters [[:File:Luise Krumbein_homework 3.maxpat]]

Video: https://www.youtube.com/watch?v=5iw1E1od4O8

[[File:Luise Krumbein_homework 3_screenshot.png|400px]]

== '''Homework 2''' ==

1. Sound to number [[:File:Luise Krumbein homework 2_part 1.maxpat]]

[[File:Luise Krumbein homework 2_part 1_screenshot.png|400px]]

[[File:Luise Krumbein homework 2_part 1_screenrecord.mov]]

2. Video analysis [[:File:Luise Krumbein_homework 2_part 2.maxpat]]

[[File:Luise Krumbein homework 2_part 2_screenshot.png|400px]]

== '''Homework 1''' ==

First patch [[:File:210422_Luise Krumbein.maxpat]]

Projects from other students: F.Z. Aygüler [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler], Wei Ru Tai [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai]

----

'''Documentation'''

[[/Process/]]

[[/References/]]

[[/Tutorials/]]