All the things stated above will be added here as soon as the project is finished.
'''ReadMe'''
Please note that to use the patch, you also need to download the Max externals, VST plug-ins and audio files. All of these are available for free.
Also note that one of the externals you need only works with Max 7 or below; Max 8 (and above) won't work yet (at least that was still the case on 5 October 2020).
== '''Introduction''' ==
[[File:SetupCorrect.jpg|500px]]
[[File:Speaker Bild.jpg|200px]] [[File:Kinect2.jpg|200px]] [[File:Kinect.jpg|200px]]
As you can see in the pictures above, the Kinect camera was mounted on the ceiling, facing downwards.
== '''Video Documentation''' ==
00:17 - 10:00 Introduction
10:00 - 18:40 Demonstration
18:40 - End Explanation
https://youtu.be/70AmTmC-R4E
== '''Patch Walkthrough and Setup Tutorial''' ==
I will add a video about the patch very soon!
There's a quick one I made for the main project module that you can already watch, but it's only in German.
You can find it here: https://youtu.be/z4eN5nxDMtQ
The patch is, of course, the heart of the whole installation. It processes the live video input from the Kinect, tracks the movement of the recipient visible in the video, and connects the tracking data to the sound parameters and the spatialisation of the sounds. It also samples the sound material and routes the sound to the different speakers.
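The mapping idea can be illustrated outside of Max with a small sketch. This is only a hedged Python illustration of scaling a normalized tracked position (x, y) to sound and spatialisation parameters; the parameter names and ranges here are assumptions for the example, not values taken from the actual patch.

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi],
    similar in spirit to Max's [scale] object."""
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

def position_to_params(x, y):
    """Map a normalized (0..1) tracked position to example parameters.
    Both parameter names and ranges are hypothetical."""
    return {
        "azimuth_deg": scale(x, 0.0, 1.0, -180.0, 180.0),   # pan around the listener
        "grain_size_ms": scale(y, 0.0, 1.0, 10.0, 500.0),   # example sampler control
    }
```

Standing in the middle of the room (x = 0.5) would then place the sound straight ahead, while moving along the other axis sweeps the second parameter through its range.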
The "front page" of the patch is divided into two parts. At the top of the patch you can see a blue headline section containing all the other subpatchers and explanations of them. You can think of this as an index or table of contents.
[[File:Bildschirmfoto 2020-09-17 um 15.41.47.png|300px]]
Right below this table of contents you can find the tracking process patch. That's the part of the main patch that collects the Kinect video input, tracks the movement of the recipient in the video, and sends the tracking data to wherever it's further processed in the rest of the patch.
At the beginning you can find the "freenect.jit.grab" object. This is an external that collects the live video input from the Kinect. It only works with the first version of the Kinect for Xbox 360, model 1414, and only with Max 7 or below.
[[File:Bildschirmfoto 2020-10-14 um 13.22.11.png|300px]]
The freenect object can be operated with different input messages. First, as you can see, it needs a "qmetro" object to collect the video input. The freenect object can be switched on and off with "open" and "close" message boxes. You can choose whether to work with the RGB or infrared video output via the "format RGB" or "format ir" messages. With a "tilt" message you can adjust the angle of the Kinect camera module.
First, the live video output of the freenect object is run through a subpatcher called "p normalize". The Kinect automatically detects how far the objects visible in the video are from the Kinect itself. Here in the patcher you can choose how "far" the Kinect is supposed to look, i.e. adjust the maximum distance within which the Kinect is supposed to scan objects.
[[File:Bildschirmfoto 2020-10-14 um 13.40.51.png|300px]]
This normalisation patcher is taken from a Kinect tutorial; some parts were added to it to adjust the normalisation value with a send object. You can find the source under tutorial number 2 below.
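For readers without Max at hand, the principle of this normalisation step can be sketched in plain Python: everything beyond an adjustable maximum distance is clipped to background, and nearer readings are rescaled so that closer objects appear brighter. This is only a sketch of the idea, not the actual subpatcher; the function name and the unit of the depth values are assumptions.

```python
def normalize_depth(frame, max_dist):
    """Clip depth readings beyond max_dist and rescale the rest to 0..1.

    frame: flat list of depth readings (assumed millimetres here);
    max_dist: adjustable "how far should the Kinect look" value.
    """
    out = []
    for d in frame:
        if d > max_dist:
            out.append(0.0)                  # too far away: treated as background
        else:
            out.append(1.0 - d / max_dist)   # nearer objects become brighter
    return out
```

Lowering `max_dist` is what makes only the closest object (the recipient's head, seen from above) survive into the tracking stage.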
After the normalisation, the signal runs through multiple filters and into a "jit.findbounds" object. The objects that interact with "jit.findbounds" can be found in the help patch of the object itself. This process determines the position of the object that's visible in the video in relation to the video frame.
[[File:Bildschirmfoto 2020-10-14 um 14.26.03.png|300px]]
So, if you mount the camera on the ceiling and shoot from above, the normalisation will only allow the Kinect to record the object that's closest to it, in this case the head of the person in the room. The "jit.findbounds" object then only sees the head in the video and determines its position in real time. This way you can track the position of the recipient in the room.
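What "jit.findbounds" does can be approximated in a few lines of Python: scan a brightness matrix for pixels above a threshold and report their bounding box, whose centre is the tracked head position. This is a simplified sketch of the principle, not the external's actual implementation.

```python
def find_bounds(frame, threshold):
    """Return (min_x, min_y, max_x, max_y) of pixels at or above threshold,
    roughly what jit.findbounds reports, or None if nothing is bright enough.

    frame: 2D list of brightness values in 0..1 (rows of pixels).
    """
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            if v >= threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (min(xs), min(ys), max(xs), max(ys))

def center(bounds):
    """Centre of the bounding box: the tracked position in frame coordinates."""
    x0, y0, x1, y1 = bounds
    return ((x0 + x1) / 2, (y0 + y1) / 2)
```

Dividing the centre coordinates by the frame width and height would give the normalized (0..1) position that the rest of the patch maps onto sound parameters.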
== '''Resources''' ==
'''SOUND MATERIAL'''
'''ICST Ambisonics Plugins free download''' https://www.zhdk.ch/forschung/icst/icst-ambisonics-plugins-7870
'''Argot Lunar plugin''' http://mourednik.github.io/argotlunar/
'''TUTORIALS'''
1: '''Color Tracking in Max MSP''' https://www.youtube.com/watch?v=t0OncCG4hMw&list=PLG-tSxIO2Jkjj0BthZ_y0GRkWRjAvL1Uo&index=4&t=310s (Part 1 of 3)
2: '''Kinect Input and normalisation''' https://www.youtube.com/watch?v=ro3OwWnjfDk&list=PLG-tSxIO2Jkjj0BthZ_y0GRkWRjAvL1Uo&index=5
== '''Conclusion and future works''' ==