GMU:Computer's Cut/Kristin Jakubek


(1) Bicycle Supercut with Image Recognition

Password: bike


To create this supercut of bicycles in film and television I used the Python script for 'object detection' we developed together during our course. By running the MobileNetSSD Caffe model, which includes, amongst others, the CLASS 'bicycle', the pre-trained neural network is able to apply image recognition to the supplied video inputs and indeed detect 'bicycles'.
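The course script itself is not reproduced here, so the snippet below is only a minimal sketch of that detection step, assuming the standard MobileNetSSD deploy files and a hypothetical input file name; it loads the Caffe model with OpenCV's DNN module and prints every frame in which a 'bicycle' is detected.

# Minimal sketch of the detection step (not the exact course script).
# Assumptions: the standard MobileNetSSD deploy files sit in the working
# directory and 'bicycle_scene.mp4' is a hypothetical input video.
import cv2

CLASSES = ["background", "aeroplane", "bicycle", "bird", "boat", "bottle",
           "bus", "car", "cat", "chair", "cow", "diningtable", "dog",
           "horse", "motorbike", "person", "pottedplant", "sheep",
           "sofa", "train", "tvmonitor"]

net = cv2.dnn.readNetFromCaffe("MobileNetSSD_deploy.prototxt",
                               "MobileNetSSD_deploy.caffemodel")
cap = cv2.VideoCapture("bicycle_scene.mp4")

frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # the network expects a 300x300 input blob
    blob = cv2.dnn.blobFromImage(cv2.resize(frame, (300, 300)),
                                 0.007843, (300, 300), 127.5)
    net.setInput(blob)
    detections = net.forward()
    for i in range(detections.shape[2]):
        confidence = float(detections[0, 0, i, 2])
        class_id = int(detections[0, 0, i, 1])
        if confidence > 0.4 and CLASSES[class_id] == "bicycle":
            print(f"frame {frame_idx}: bicycle ({confidence:.2f})")
    frame_idx += 1
cap.release()

Frames flagged this way can then be kept for the supercut, while everything in between is cut away.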

Instantly, when reading the CLASS 'bicycle' provided by the model, I had to think of a pivotal scene from the anime 'The Girl Who Leapt Through Time', in which the main character dies because her bike's brakes fail. I was curious to see how the computer would 'see' this scene. As humans we can deduce that the bike is present - yet sometimes only partially in frame - for almost all of the shots of this short sequence. But when the sequence is run through the image recognition script, only brief appearances of the 'bicycle' are detected, and the scene subsequently loses its narrative and meaning. I remained curious to see how the machine would see other scenes of film and television history that came to mind specifically as featuring, being defined by or being memorable because of the bicycles in them. The result is a short trip through film history, from one of the first real motion picture films ever made ('Workers Leaving the Lumière Factory in Lyon') to other classics and personal favourites. Do you recognise them?

[Screenshots, 24 and 26 January 2022]

(2) Video Editing with Subtitles

  1. Download the video from YouTube (e.g. with the noTube converter)
  2. Download the YouTube subtitles for the same video (e.g. with the downsub converter)
  3. Create a project file in Aegisub (simply load the video into it, save and close)
  4. Use the .srt file of the subtitles with the 'findword' Python script to generate a .xml file with information on the timecodes of a certain, selected word (a sketch of this step follows the list)
    1. Open the Mac Terminal > change into the 'findword' directory with the command 'cd' by manually dragging the findword folder into the Terminal (easiest if both the subtitle file and the video are in the 'findword' folder, otherwise the full path must be given) > use the command python3 find_word.py [-h] [-i INPUTFILE] [-o OUTPUTFILE] [-w WORD] [-c] [-v], e.g. python3 find_word.py -i "Life.srt" -o "Life.xml" -w work -c -v
  5. Insert the generated .xml into the full code of the Aegisub project file by opening that file via e.g. Visual Studio Code (a sketch of this step also follows the list)
    1. Insert the generated .xml: under playlist id="playlist0" replace the entry producer="chain1" (*"chain1" needs to be adapted/named as such in the generated .xml file for successful insertion)
  6. Open the Aegisub file in the program again to see the edited video
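The find_word.py script used in step 4 is part of the course material and is not reproduced here. As a rough, hedged sketch of what that step does, the snippet below scans a standard .srt file for subtitle blocks that contain the chosen word and collects their timecodes; the file name 'Life.srt' and the word 'work' are taken from the example above, everything else is an assumption. The real script additionally writes these timecodes out as a .xml file, which is what gets pasted into the project in step 5.

# Hedged sketch of the 'findword' step, not the actual course script:
# collect the start/end timecodes of every subtitle block containing a word.
import re

def find_word(srt_path, word):
    text = open(srt_path, encoding="utf-8").read()
    # standard .srt blocks: index line, "HH:MM:SS,mmm --> HH:MM:SS,mmm", text lines
    blocks = re.split(r"\n\s*\n", text.strip())
    hits = []
    for block in blocks:
        lines = block.splitlines()
        if len(lines) < 3:
            continue
        start, end = [t.strip() for t in lines[1].split("-->")]
        subtitle = " ".join(lines[2:])
        if word.lower() in subtitle.lower():
            hits.append((start, end, subtitle))
    return hits

if __name__ == "__main__":
    for start, end, subtitle in find_word("Life.srt", "work"):
        print(start, end, subtitle)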
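Step 5 itself is done by hand in Visual Studio Code. Purely as an illustration of what that paste amounts to, here is a scripted sketch of the same operation: it assumes the project file saved in step 3 is an XML document (called project.xml here, a hypothetical name), that the generated Life.xml is well-formed and contains <entry> elements with in/out timecodes, and that, as noted above, every entry has to reference the producer "chain1".

# Scripted sketch of the manual paste in step 5 (illustration only).
# Assumptions: project.xml is the project file from step 3, Life.xml is the
# output of find_word.py and contains <entry> elements with timecodes.
import xml.etree.ElementTree as ET

project = ET.parse("project.xml")
generated = ET.parse("Life.xml")

# locate the main playlist of the project
playlist = project.getroot().find(".//playlist[@id='playlist0']")

# replace the original entry (the full clip) with the generated word-by-word cuts
for entry in list(playlist.findall("entry")):
    playlist.remove(entry)
for entry in generated.getroot().iter("entry"):
    entry.set("producer", "chain1")  # every cut must point at the source clip
    playlist.append(entry)

project.write("project_edited.xml")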

07.11.2021 - Computer's Cut via Subtitles of 'Doug & Mike Starn Interview: The Invisible Architecture of Life'


Resulting video-edit: https://vimeo.com/643262108 Password: work


Comment: I had not watched this interview in its original entirety. Instead, when clicking through it I always coincidentally caught the artists talking about 'work'. So I used the word 'work' in the hope of filtering out the artists' working approach, i.e. of getting an idea of their work exclusively via the short spoken fragments that are directly about 'work'.


File source: https://www.youtube.com/watch?v=AcOV75yuPF4&t=423s&ab_channel=LouisianaChannel
Video via: noTube
Subtitles via: downsub

(3) Video Editing with Sound Using PD

[Screenshot: PD patch for the MIDI file]


[Screenshot: PD patch for GEM video playback]


Assignment 13.12.2021



Password: PD