GMU:I and my Max/Mojjo Krenz

From Medien Wiki
===Project===
'''Chicken Tracker'''
 
We humans communicate: at work, with friends and colleagues, and with relatives. To interpret the messages communicated to us, we use our senses. In direct verbal communication, our auditory perception is primarily engaged. At first glance one might think the acoustic channel is clear and unmistakable. What may be true in individual cases, however, reveals itself in everyday life as a diverse, complex act. Human communication and perception are multi-layered. Besides the acoustic channel, we use optical, olfactory and tactile stimuli to communicate with our fellow human beings. What happens to our understanding of our counterparts when visual communication is restricted, when feeling and smelling are eliminated entirely, and when eye contact, facial expressions and gestures are distorted by the Internet?

 
[[File:photo_2021-01-14 17.16.54.jpeg|400px]]
 
The current pandemic and the associated lockdown limit verbal, interpersonal exchange to two elementary channels. Acoustic communication and the visual communication made possible by video telephony have come to the fore in our everyday lives. However, contact via the medium of the Internet offers little room for expressing empathy and feelings. For this purpose I have invented the "Chicken Tracker".
As a support tool for the usual video chat, the "Chicken Tracker" is meant to visually amplify a feeling.
The basic idea is simple. Participant A starts talking and initially appears completely realistic in the video chat. If A talks himself into a rage, the volume of his acoustic output rises steadily. Through the "Chicken Tracker", the amplified signal in turn colours his skin increasingly red, until A appears completely red. Once in a rage, the volume of A's output keeps climbing until it collapses, and A turns into a chicken for as long as the volume stays above a threshold X.
 
The whole idea is about loosening up in online discussions and not taking yourself too seriously.
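The volume-to-red mapping described above can be sketched outside Max. A minimal Python sketch, assuming a normalized volume level in 0.0–1.0; the threshold value and the names `redness` and `CHICKEN_THRESHOLD` are illustrative, not the patch's actual parameters:

```python
# Illustrative sketch, not the actual Max patch.
CHICKEN_THRESHOLD = 0.8  # assumed value for the threshold X

def redness(volume: float) -> float:
    """Map a normalized volume (0.0-1.0) to a red-tint amount (0.0-1.0)."""
    return max(0.0, min(1.0, volume))

def is_chicken(volume: float) -> bool:
    """A turns into a chicken while the volume stays above the threshold X."""
    return volume > CHICKEN_THRESHOLD

def tint_pixel(rgb, volume):
    """Blend one RGB pixel towards pure red, proportionally to the volume."""
    r, g, b = rgb
    t = redness(volume)
    return (round(r + (255 - r) * t), round(g * (1 - t)), round(b * (1 - t)))
```

At volume 0.0 a pixel passes through unchanged; at volume 1.0 every pixel becomes pure red, matching the "completely red" stage before the chicken takes over.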
 
[[File:photo_2021-01-14 17.15.54.jpeg|400px]]
 
===Development===
The project has several stages of development:
 
====Change certain colours in Max video using grab====
<gallery>
File:Bildschirmfoto 2021-01-18 um 17.15.33.png|patch exchanging bright / dark colours by a matrix.
</gallery>


*[[:File: GREENSCREEN.maxpat]]
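The bright/dark exchange the patch performs on a matrix of the video can be sketched in plain Python. This is a rough stand-in, not the patch's exact logic; the pixel format (nested lists of RGB tuples), the luminance weights and the cutoff are assumptions:

```python
# Rough Python stand-in for the bright/dark colour exchange in the patch.
def luminance(rgb):
    """Approximate perceived brightness (Rec. 601 weights)."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def swap_by_brightness(frame, bright=(255, 255, 255), dark=(0, 0, 0), cutoff=128):
    """Replace bright pixels with one colour and dark pixels with another,
    roughly what the patch does cell by cell on a Jitter matrix."""
    return [[bright if luminance(px) >= cutoff else dark for px in row]
            for row in frame]
```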
====Tracking human skin color by RGB code====
*[[:File:GREENSCREEN-2.maxpat]]
<gallery>
File:Bildschirmfoto 2021-01-15 um 09.44.05.png|Source: https://colorswall.com/palette/2513/
</gallery>


First attempt: run two videos in one patch, with the goal of blending them into each other.
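As a stand-in for the palette-based tracking in GREENSCREEN-2.maxpat, the idea of classifying pixels by their RGB values can be sketched with a common rule-of-thumb skin test. This heuristic is an assumption for illustration, not the patch's actual colour ranges:

```python
# Common RGB skin-tone heuristic; NOT the exact ranges used in the patch.
def looks_like_skin(rgb):
    """True when an RGB pixel falls in a typical skin-tone range."""
    r, g, b = rgb
    return (r > 95 and g > 40 and b > 20
            and max(rgb) - min(rgb) > 15
            and abs(r - g) > 15 and r > g and r > b)

def skin_mask(frame):
    """1 where a pixel passes the skin test, else 0."""
    return [[1 if looks_like_skin(px) else 0 for px in row] for row in frame]
```

The mask is what a later stage would tint red: only pixels flagged as skin change colour, the rest of the frame stays untouched.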
====Changing color by sound====
*test 1 [[:File:Max-test1.maxpat]]
*test 2 [[:File:Max-test2 .maxpat]]


[[:File:VIDEO_PLAY1.maxpat]]
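The sound-to-colour link tried in the test patches can be sketched as two steps: measure the loudness of an audio block, then scale it to a colour channel value. The gain and the 0–255 range are assumptions for illustration:

```python
import math

# Sketch of the sound-to-colour link; gain and scaling are assumptions.
def rms(samples):
    """Root-mean-square level of one block of samples in [-1.0, 1.0]."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def level_to_channel(level, gain=4.0):
    """Scale an RMS level to a 0-255 colour channel value, clipped at the top."""
    return min(255, round(level * gain * 255))
```

In the patch the equivalent of `rms` is the level coming off the mic input, and the channel value would feed the red component of the video tint.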
====Changing skin color to red by volume====


https://www.studioroosegaarde.net/info
====Building two interconnected patches====
https://www.studioroosegaarde.net/project/space-waste-lab
*using the OSC protocol
*alternatively „jit.net.send/recv“
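For the interconnection, here is a minimal sketch of sending one value between patches over OSC, using only the Python standard library. In Max this would typically be the [udpsend]/[udpreceive] objects (or jit.net.send/recv as noted above); the address `/chicken/volume` and the port are invented examples:

```python
import socket
import struct

# Minimal OSC-over-UDP sketch; address and port are invented examples.
def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a multiple of 4 bytes, as OSC requires."""
    data += b"\x00"
    return data + b"\x00" * (-len(data) % 4)

def osc_message(address: str, value: float) -> bytes:
    """Build an OSC message carrying one float32 argument."""
    return (osc_pad(address.encode("ascii"))
            + osc_pad(b",f")                 # type tag: one float
            + struct.pack(">f", value))      # big-endian float32

def send_volume(level: float, host="127.0.0.1", port=7400):
    """Fire one OSC packet at the receiving patch."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(osc_message("/chicken/volume", level), (host, port))
```

A [udpreceive 7400] object in the second patch would then unpack `/chicken/volume` and drive its own colour stage from the received level.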




Semester Project.


In times of endless Zoom/BBB/etc. meetings, it's easy to lose patience. Some might need to see themselves visibly turning red. The idea of my project is to change the skin colour of the user inside a Zoom meeting. The parameter controlling the colour change is the volume of speech. Basically, you turn yourself red on screen while screaming.
===Exercises===
"Greenscreen"


The project has different development stages.  
*[[:File: GREENSCREEN.maxpat]]
*[[:File:GREENSCREEN-2.maxpat]]


1. Changing specific skin colours into red in a Max video, using the volume input of the mic.
2. Running two videos in one patch, with the goal of blending them into each other.


3. Changing specific skin colours into different, floating shades of red, using the volume input of the mic.
*[[:File:VIDEO_PLAY1.maxpat]]


4. Connecting the Max/MSP video to Zoom, implementing the skin-changing ability for both parties.
*https://www.studioroosegaarde.net/info
*https://www.studioroosegaarde.net/project/space-waste-lab

Latest revision as of 15:34, 6 April 2021