GMU:Körper Raum Stadt/Anastasiia Nakaliuzhna

From the beginning I knew that I wanted to make a project with data visualization. After some research I noticed that many visualizations, besides representing a huge amount of information, also create the illusion of a living organism. When I see a visualization with organic shapes that move not randomly but in response to particular information they receive from outside, it somehow reminds me of a human reaction. Basically, our movements are reactions to everything that is going on around us: our brain collects all the data from our surroundings and creates reactions to it.

I was thinking about how much we are influenced by our surroundings, by the space we are in and the people we are with, and came to the thought that we could also have an influence on a space. What if a particular space could show us its reaction, could show its feelings about us being there?

So my first project idea was to choose a room and, using a few different sensors, make a real-time visualization that represents the «feelings» of that space.

Due to a lack of time and of conceptual meaning, I decided to drop the idea of using several sensors and to focus on one instead. I wanted my project to reflect my personality and to mean something to me. I spent some time analyzing myself and trying to find something that would be important and interesting to me personally. I realized that I have always been interested in people's emotions. One of the things I would really love to learn is how to read people by their faces. In addition, I would describe myself as a very emotional person, and my face in particular expresses more emotions and feelings than I can describe with words. That is why I came to the idea of visualizing emotions.

One semester is not enough to cover the whole emotional spectrum, so I decided to limit the scope and make a project prototype that visualizes a person's smile.

Since art is supposed to evoke emotions, why not make art react to emotions? People observe art and react to it, the art reacts to people's reaction and changes, which makes people react even more, and so on. That creates a reaction loop, similar to human behavior: when you smile at a person, this person usually smiles back. In that way, just by reacting to an art piece, you create an interaction with it.

How does it work?

The first step is to create the visual graphics. For this I used openprocessing.org: I found a project similar to what I wanted and modified it. The link to the original project: https://www.openprocessing.org/sketch/521050
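The original openprocessing sketch is more elaborate, but the basic idea can be illustrated with a minimal Processing example like this: an organic, noise-driven shape whose movement is controlled by a single parameter (here called <code>energy</code>) that will later come from the smile detection.
<pre>
// Minimal illustration of the idea (not the original openprocessing code):
// an organic blob whose radius is modulated by Perlin noise, and whose
// liveliness is controlled by one parameter driven later by the smile value.

float energy = 0.2;  // 0 = calm, 1 = excited; placeholder value for now

void setup() {
  size(600, 600);
  noFill();
  stroke(255);
}

void draw() {
  background(0);
  translate(width / 2, height / 2);

  // Draw a closed curve; higher "energy" makes it wobble faster and stronger.
  beginShape();
  for (float a = 0; a < TWO_PI; a += 0.05) {
    float n = noise(cos(a) + 1, sin(a) + 1, frameCount * 0.01 * (0.5 + energy));
    float r = 150 + n * 120 * energy;
    vertex(r * cos(a), r * sin(a));
  }
  endShape(CLOSE);
}
</pre>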

To make the visualization depend on the smile, I needed smile recognition. For that I needed a camera, Processing and some example code. I used smile recognition code (from here: http://www.magicandlove.com/blog/2011/05/04/smile-detection-in-processing-mac-osx/) and connected it with my visualization in Processing.
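The detection itself is done by the example code linked above; a stripped-down illustration of how the camera, the detection and the visualization fit together could look like this (<code>detectSmile()</code> is only a placeholder standing in for the linked smile recognition code):
<pre>
import processing.video.*;

Capture cam;
float smileValue = 0;  // number produced by the smile detection

void setup() {
  size(600, 600);
  cam = new Capture(this, 320, 240);
  cam.start();
}

void draw() {
  if (cam.available()) {
    cam.read();
    // Placeholder: in the real sketch this value comes from the
    // smile recognition example linked above.
    smileValue = detectSmile(cam);
  }
  background(0);
  // ... draw the organic shape here, using smileValue to drive its energy ...
}

// Hypothetical stand-in for the detector from the linked example code.
float detectSmile(PImage frame) {
  return 0;
}
</pre>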

Whether you smile or not, Processing gets a number, and depending on this number the visualization changes.
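For example, the number can be turned into the three states shown in the screenshots below with simple thresholds (the values here are placeholders and depend on the detector), or mapped continuously onto the shape's parameters:
<pre>
// Placeholder thresholds; the real values depend on the detector output.
if (smileValue < 0.2) {
  energy = 0.1;   // no smile: calm, slowly moving shape
} else if (smileValue < 0.6) {
  energy = 0.5;   // small smile: more movement
} else {
  energy = 1.0;   // smile: the shape reacts strongly
}
</pre>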

Without a smile:
<br>
[[File:Smile_screen_1.png|500px]]
<br>

With a small smile:
<br>
[[File:Smile_screen_2.png|500px]]
<br>

With a smile:
<br>
[[File:Smile_screen_3.png|500px]]
<br>