
Stairs

Pik Ying Lo, Media Art and Design (M.F.A.), WS 2018


Inspiration

The project is inspired by two old films. The first one is Pièce Touchée (1989) by Martin Arnold, an analysis of a pre-existing film, the 1954 picture The Human Jungle by Joseph M. Newman. Arnold extracted 18 seconds from that film as his raw material and reorganised the frames into a 16-minute piece. He stretched and twisted the original's narrative, producing a backwards-and-forwards movement.

It is surprising that a new concept can be evoked by destruction and decomposition. Usually, the raw material is much longer than the final version; here, however, Arnold created a new film simply by reordering the sequence of the frames, challenging the concepts of space and time in film.

The second film is Vertigo (Alfred Hitchcock, 1958). In one scene, the actress runs into the church and up the bell tower; the actor wants to follow her but is halted on the steps by his acrophobia. The scene ends with the actress falling from the tower. The filming location was the tower's staircase, which was assembled inside a studio.

The space of the staircase creates a feeling of infinity: it forms a spiral within which people can only keep walking, never able to escape.


Idea

The project is presented as a short video showing people climbing up the stairs while some of them walk down. The purpose is to reflect, repeat and delay the movements in order to reconstruct a new sequence of the frames.

The audience can interact with the video by making sound. The received sound input then reorganises the sequence of the frames, making them flow in a new order.


Implementation

The raw material was filmed in the main building of the Bauhaus University Weimar.

Date: 8 December 2018
Duration: 16 seconds

The project is built with Pure Data, which is a real-time graphical programming environment for audio, video and graphical processing.

First, “gemwin” is used to access the window manager and create a rendering window. Then, “gemhead” is used to connect Gem objects to the window manager; video 1 and video 2 are displayed in separate render chains, distinguished by the number added after the object.

I considered three ways to manipulate the sequence of the frames. Option 1 is random movement; however, it could not satisfy the purpose of interactive engagement. Option 2 is mouse movement in the Gem window; however, the movement is limited. Option 3 is audio input: with “adc~”, the microphone receives the sound input, and the detected volume is then used to manipulate the frames. The clip has 469 frames in total, so the detected volume is mapped to a frame number between 1 and 469.
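
As a rough illustration of this mapping, the sketch below shows the volume-to-frame conversion in Python rather than in the actual Pd objects; the 0–100 level range, the clamping and the function name are assumptions, not taken from the patch.

 TOTAL_FRAMES = 469  # number of frames in the clip
 
 def level_to_frame(level, level_max=100.0):
     """Map a detected input level (assumed to lie in 0..level_max) to a frame number 1..469."""
     level = max(0.0, min(level, level_max))               # clamp the measured level
     return int(level / level_max * (TOTAL_FRAMES - 1)) + 1
 
 # a quiet input selects an early frame, a loud one a late frame
 print(level_to_frame(5.0))    # -> 24
 print(level_to_frame(95.0))   # -> 445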

Video 2, on the other hand, runs in reverse order by counting the frames down. When the detected volume exceeds 60, video 2, which is flipped horizontally, is displayed instead. This breaks the original sequence of the frames and forms a new order, evoking an interaction between the audience and the film.
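
The switching between the two videos can be sketched in the same spirit; this is Python pseudologic rather than the Pd patch itself. The threshold value of 60 comes from the text above, while the function name, the countdown counter and the return format are assumptions.

 THRESHOLD = 60.0      # switching threshold mentioned above
 TOTAL_FRAMES = 469
 
 def pick_source(level, frame_from_level, countdown):
     """Decide what to render for the current pass.
 
     Below the threshold, video 1 is shown at the frame chosen from the
     detected level; at or above it, video 2 is shown instead, counting its
     frames down (reverse playback) and flipped horizontally.
     Returns (source, frame_number, flip_horizontally).
     """
     if level < THRESHOLD:
         return "video1", frame_from_level, False
     return "video2", max(1, min(countdown, TOTAL_FRAMES)), True
 
 # a loud input switches to the reversed, mirrored second video
 print(pick_source(level=72.0, frame_from_level=340, countdown=210))
 # -> ('video2', 210, True)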


Patch

[[File:final_patch.png|500px]]

Work

[[File:grab_03.jpeg|200px]] [[File:grab_01.jpeg|200px]] [[File:grab_02.jpeg|200px]] [[File:grab_04.jpeg|200px]]

Reference