GMU:Max and I/Aliya Sayfart

From Medien Wiki
==Motion to sound==
===idea===
The aim of the project is to play sounds by moving the body. I would like to use three webcams that run under different tracking systems. The movement is captured in Max/MSP and converted into numbers; these numbers are then converted into sounds, or they trigger the sounds.


Here is a little insight from outside.


                                                                    ''Idea''
[[File:71F3113C-0DE0-4032-985D-5BAC0C799F2D.jpeg|400px]]


The initial idea is to use the face as an instrument with the help of the webcam. For this, the face-detection tool cv.jit.faces is to be used.
*1: cam: face tracking
*2: cam: motion tracking, legs
*3: cam: motion tracking, arms/body
*4: screens


[[File:BodyFace.jpg|400px]]
===process===
Fig. 1
The first step is to find out how to get three webcams running in Max at the same time. For this I used jit.qt.grab (macOS) or jit.dx.grab (Windows) and checked whether I can have three of these objects running simultaneously for the three different inputs.


As shown in Fig. 1 (Body), the webcam image is divided into four quadrants. When the face is detected in quadrant Q1, a C note is heard; when the face is detected in Q2, the note G is played, and so on.
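The quadrant logic above can be sketched outside of Max as a small Python function. This is only an illustration of the mapping, not part of the patch; the notes for Q3 and Q4 (here E and B) are assumptions, since the text only fixes Q1 = C and Q2 = G.

```python
# MIDI note numbers: C4 = 60, G4 = 67; E4/B4 for Q3/Q4 are placeholders.
QUADRANT_NOTES = {1: 60, 2: 67, 3: 64, 4: 71}

def quadrant(x, y, width, height):
    """Return 1..4 for the quadrant containing point (x, y).

    Q1 = top-left, Q2 = top-right, Q3 = bottom-left, Q4 = bottom-right.
    """
    right = x >= width / 2
    bottom = y >= height / 2
    return 1 + (1 if right else 0) + (2 if bottom else 0)

def note_for_face(x, y, width=640, height=480):
    """Map a detected face centre to the MIDI note of its quadrant."""
    return QUADRANT_NOTES[quadrant(x, y, width, height)]
```

In the patch, the face position coming from the tracker would feed this mapping, and the resulting note number would trigger the sound.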
Now we come to the point where I start to play around with different options for tracking.
Face tracking: done with FaceOSC


Fig. 1 Face
For this I need to install FaceOSC and connect it to Max with the object udpreceive 8338; the number is the port that FaceOSC sends on, and you can change it if you like. Then you need the OSC-route object, after which you can specify that you want to route, for example, the mouth or the left eyebrow. If something in the routed area changes, it produces a bang which triggers the sound.
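Outside of Max, the same udpreceive-plus-OSC-route chain can be sketched in Python using only the standard library. The address /gesture/mouth/height is an assumption about FaceOSC's output; the minimal parser below handles only float and int arguments, which is enough for FaceOSC-style messages.

```python
import socket
import struct

def _padded_string(data, offset):
    """Read a null-terminated OSC string padded to a multiple of 4 bytes."""
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    offset = (end + 4) & ~3  # skip the padding nulls
    return s, offset

def parse_osc(data):
    """Parse one OSC message with float ('f') and int ('i') arguments."""
    address, offset = _padded_string(data, 0)
    typetags, offset = _padded_string(data, offset)
    args = []
    for tag in typetags.lstrip(","):
        if tag == "f":
            args.append(struct.unpack_from(">f", data, offset)[0])
            offset += 4
        elif tag == "i":
            args.append(struct.unpack_from(">i", data, offset)[0])
            offset += 4
    return address, args

def listen(port=8338):
    """Receive OSC messages on the FaceOSC port, like udpreceive 8338."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        data, _ = sock.recvfrom(4096)
        address, args = parse_osc(data)
        print(address, args)  # route on the address, e.g. the mouth values
```

The listen() loop stands in for udpreceive; branching on the parsed address is the equivalent of OSC-route picking out the mouth or eyebrow values.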
This could be an extra tool you can switch to. When pressing, for example, the Enter key, the program zooms in and the order of the notes in the quadrants changes randomly.
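The random reordering on a key press can be sketched as a plain Python function; the quadrant numbers and MIDI note values here are just example data.

```python
import random

def reshuffle_quadrants(quadrant_notes, seed=None):
    """Return a new quadrant -> note mapping with the same notes in a
    random order, as described for the Enter-key zoom tool."""
    rng = random.Random(seed)
    notes = list(quadrant_notes.values())
    rng.shuffle(notes)
    return dict(zip(quadrant_notes.keys(), notes))
```

Each press of the key would call this once, so the same face position suddenly plays a different note.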


 
==Patch==
[[File:closeopen1.jpg|400px]]
===Motion tracking===
close/open eyes
At first I used the jit.change object, which is connected to jit.rgb2luma; together these two objects convert the RGB values to luminosity, which is then sent to the jit.matrix object.
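What the jit.rgb2luma and jit.change pair does can be sketched in plain Python: convert each RGB cell to a luminance value, then report the cells whose luminance changed beyond a threshold between two frames. This is only a toy stand-in for the Jitter objects, operating on nested lists instead of Jitter matrices.

```python
def rgb_to_luma(frame):
    """Convert a frame of (r, g, b) tuples (0..255) to luminance values,
    roughly what jit.rgb2luma does (Rec. 601 luma weights)."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in frame]

def changed_cells(prev_luma, cur_luma, threshold=10.0):
    """Return (row, col) positions whose luminance changed by more than
    threshold -- a rough stand-in for jit.change's change detection."""
    hits = []
    for y, (prev_row, cur_row) in enumerate(zip(prev_luma, cur_luma)):
        for x, (p, c) in enumerate(zip(prev_row, cur_row)):
            if abs(c - p) > threshold:
                hits.append((y, x))
    return hits
```

In the patch, the regions reported as changed are what end up triggering sounds.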
 
You can switch to this when it gets boring. With closed eyes, a pitch from, for example, D to F-sharp is played, and with open eyes from F-sharp to C. The variants could also alternate one after another.
 
 
[[File:roundsound.jpg|400px]]
 
This would be a simpler alternative. The notes would get fixed positions in the quadrants and be activated when a face is detected.

Latest revision as of 15:06, 27 April 2020
