GMU:(In)Visible Networks/Esra Demirel

NATURE INSPIRED

You didn't come into this world. You came out of it,
like a wave from the ocean. You are not a stranger here.
Alan Watts

We mostly use a craft process to develop our works: we first shape our ideas with craft materials, in a craft way, and with digitalization the work becomes more computer-based. As computers have become more accessible, we increasingly prefer digital tools to realize our ideas directly. Following this idea, the aim of this project is to create reflections of nature inside a digital environment. Thinking in terms of nature inspiration, the project reflects the four elements of nature: earth, fire, water and air.

When we talk about nature, it is impossible to leave out the human itself; the human is a part of nature. The common environment of those four elements is air, and this leads us to the visual data. Air is the shared environment that touches earth, fire and water. On the other hand, air can be shown in very different ways. Here I experiment with it in a very simple, craft-like way: if you are inside a dark room and light comes into that room, you can see the air particles in a very poetic way. The visualization we receive from Processing refers to those light beams.
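The "dust particles in a beam of light" look can be imitated in Processing with semi-transparent shapes over a dark background. The following is only an illustrative sketch of that idea; it is not part of the project code and all values are arbitrary:

// Illustrative sketch: dust drifting through a beam of light in a dark room.
// Not part of the project code; all numbers are arbitrary.
int numParticles = 200;
PVector[] particles = new PVector[numParticles];

void setup() {
  size(800, 600);
  for (int i = 0; i < numParticles; i++) {
    particles[i] = new PVector(random(width), random(height));
  }
}

void draw() {
  background(0); // the dark room
  noStroke();
  fill(255, 255, 220, 20); // a soft diagonal "beam" of light
  quad(width*0.4, 0, width*0.6, 0, width*0.9, height, width*0.5, height);
  for (int i = 0; i < numParticles; i++) {
    PVector p = particles[i];
    p.x += random(-1, 1);      // small horizontal jitter
    p.y += random(-0.5, 1);    // slow downward drift
    if (p.y > height) p.y = 0; // wrap around at the bottom
    fill(255, random(30, 120));
    ellipse(p.x, p.y, 2, 2);
  }
}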

In this project we use both human and non-human actors. Humans are tracked inside the performance platform, and with non-human actors we translate the human into the digital platform. We use invisible actors, such as the OSC protocol inside the Processing program, to transform human actions. As in Actor-Network Theory (ANT), human and non-human actors are bound to each other and form a whole. The project's main idea is to merge human and non-human actors inside the same conceptual framework. In this way it becomes possible to create a wholeness in which the actor network is treated impartially (Law, 2009).

Nature inspired network ideal graph

The human body is tracked by the Captury system and, according to the user's movements, the environment changes on the video wall.
In this graphic I wanted to connect all the input and output data and the tools that I need in order to reach my output. The whole conceptual idea is also contained in this graphic: how the parts connect to each other (from conceptual frame to final result), which environments are used, and in what way they connect to each other. (For example, by hovering your mouse over the wires you can see that the silhouette of the human and the inspiration from nature are connected to each other through interaction.) You can reach the final graphic from the link: Ideal graphic.

Network.jpg





TOOLS

The Captury system is used in this project in order to bring nature and the human together in a digital environment. People are tracked with the Captury system, and the visualization of this data is created with Processing. After the visual data has been generated in Processing, the video wall is used to show it in real time.
While Captury is tracking people, the OSC protocol is used for communication between the Captury system and Processing in order to create the visualization inside Processing.

For more information about the OSC protocol, you can check the website OSC communication.

import oscP5.*;
import netP5.*;
OscP5 oscP5;
NetAddress myRemoteLocation; // IP of Computer running captury, Port configured in software
long lastMillis=0; // used to periodically send subscribe messages
ArrayList <PVector> joints=new ArrayList <PVector>(); // used to buffer joint positions between oscEvent callbacks and "draw"  calls
void setup() {
  fullScreen();
  //size(800, 600);
  oscP5 = new OscP5(this, 1065);
  myRemoteLocation = new NetAddress("141.54.159.160", 1065); // our computer...
}

void draw() {
  // periodically ask the Captury server to send us data.
  if (millis()-lastMillis > 5000) {
    // The format is as follows:
    // "/subscribe/<Actor_name>/<skeleton_data_type>/<joint_name>/<data_format>"
    // Most of the placeholders can be replaced by a "*" for "everything".
    // Unfortunately, if you subscribe to too many things, a bug in the Captury will lead to malformed OSC bundles that in turn crash oscP5.
    OscMessage myMessage = new OscMessage("/subscribe/*/blender/Head/vector"); // get the position ("vector") of the "Head" joint of all actors ("*") in mm
    oscP5.send(myMessage, myRemoteLocation);
    lastMillis = millis(); // remember when the subscribe message was last sent
  }
  strokeWeight(4);
  frameRate(5);
  fill(101, 50);
  stroke(255, 30);
  background(0);
  // go through the list of joints and draw them all
  for (int i = 0; i < joints.size(); i++) {
    point(joints.get(i).x/5+width/2, joints.get(i).y/5+height/2);
    line(random(600), random(600), joints.get(i).x/5+width/2, joints.get(i).y/5+height/2);
  }
  joints.clear();
}

/* incoming osc messages are forwarded to the oscEvent method. */
void oscEvent(OscMessage theOscMessage) {
  println(theOscMessage); // debug output
  // only use packets that contain position data as three floats
  if (theOscMessage.checkTypetag("fff")) {
    joints.add(new PVector(theOscMessage.get(0).floatValue(), theOscMessage.get(1).floatValue(), theOscMessage.get(2).floatValue()));
    println(theOscMessage.get(0).floatValue());
    println(theOscMessage.get(1).floatValue());
    println(theOscMessage.get(2).floatValue());
  }
}


This Processing sketch shows how the OSC protocol is used and how our draw function works with the data it receives. First of all, the oscP5 and netP5 libraries are imported into the sketch; in order to use the OSC protocol we have to import those libraries. Each message received from Captury carries three float values: the x, y and z coordinates of the tracked person's joint position.
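If the Captury system is not available, the communication chain can be tested by sending fake joint positions from a second Processing sketch to the same port. This test sender is only a sketch of the idea and is not part of the project; it assumes both sketches run on the same machine (hence 127.0.0.1) and uses an arbitrary address pattern, since the receiver above only checks for the "fff" type tag:

// Hypothetical test sender (not part of the project): sends fake joint positions
// with type tag "fff" so the receiving sketch above can be tested without Captury.
import oscP5.*;
import netP5.*;

OscP5 oscP5;
NetAddress receiver;

void setup() {
  size(200, 200);
  oscP5 = new OscP5(this, 12000);               // local port of this test sketch (arbitrary)
  receiver = new NetAddress("127.0.0.1", 1065); // port the receiving sketch listens on
}

void draw() {
  // one fake "joint position" per frame, in millimetres like the Captury data
  OscMessage msg = new OscMessage("/test/blender/Head/vector");
  msg.add(random(-2000, 2000)); // x in mm
  msg.add(random(-2000, 2000)); // y in mm
  msg.add(random(-2000, 2000)); // z in mm
  oscP5.send(msg, receiver);
}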


VIDEO IMPORT IN PROCESSING

After creating the video with After Effects, it is supposed to be used as the background of the video wall representation.
The sketch below shows how video import is possible while we are using the OSC protocol inside Processing.

In order to play the video, you should put it inside the sketch folder: create a subfolder called "data" inside the sketch folder, put the video into this subfolder, and save the sketch.
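A possible folder layout (the sketch name "NatureInspired" is only an example; the important part is the "data" subfolder next to the .pde file):

NatureInspired/
  NatureInspired.pde
  data/
    Natural.mp4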




import oscP5.*;
import netP5.*;
import processing.video.*;

Movie myMovie;
OscP5 oscP5;
NetAddress myRemoteLocation; // IP of computer running Captury, port configured in the software
long lastMillis = 0; // used to periodically send subscribe messages
ArrayList<PVector> joints = new ArrayList<PVector>(); // used to buffer joint positions between oscEvent callbacks and "draw" calls

void setup() {
  fullScreen();
  //size(800, 600);
  oscP5 = new OscP5(this, 1065);
  myRemoteLocation = new NetAddress("141.54.159.160", 1065); // our computer...
  myMovie = new Movie(this, "Natural.mp4");
  myMovie.loop();
}

void draw() {
  // draw the current video frame as the background
  tint(255);
  image(myMovie, 0, 0);
  // periodically ask the Captury server to send us data.
  if (millis()-lastMillis > 5000) {
    // The format is as follows:
    // "/subscribe/<Actor_name>/<skeleton_data_type>/<joint_name>/<data_format>"
    // Most of the placeholders can be replaced by a "*" for "everything".
    // Unfortunately, if you subscribe to too many things, a bug in the Captury will lead to malformed OSC bundles that in turn crash oscP5.
    OscMessage myMessage = new OscMessage("/subscribe/*/blender/Head/vector"); // get the position ("vector") of the "Head" joint of all actors ("*") in mm
    oscP5.send(myMessage, myRemoteLocation);
    lastMillis = millis(); // remember when the subscribe message was last sent
  }
  // go through the list of joints and draw them all
  for (int i = 0; i < joints.size(); i++) {
    point(joints.get(i).x/5+width/2, joints.get(i).y/5+height/2);
  }
  joints.clear();
}

// Called every time a new frame is available to read
void movieEvent(Movie m) {
  m.read();
}

/* incoming osc messages are forwarded to the oscEvent method. */
void oscEvent(OscMessage theOscMessage) {
  println(theOscMessage); // debug output
  // only use packets that contain position data as three floats
  if (theOscMessage.checkTypetag("fff")) {
    joints.add(new PVector(theOscMessage.get(0).floatValue(), theOscMessage.get(1).floatValue(), theOscMessage.get(2).floatValue()));
    println(theOscMessage.get(0).floatValue());
    println(theOscMessage.get(1).floatValue());
    println(theOscMessage.get(2).floatValue());
  }
}
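One detail to keep in mind (an assumption about the intended look, not something stated in the project documentation): image(myMovie, 0, 0) draws the video at its native resolution, so on the video wall it may not fill the whole screen. Stretching it to the sketch size is a one-line change:

image(myMovie, 0, 0, width, height); // stretch the background video to fill the whole sketch window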

PROCESS OF WORK

This is how we developed the project. This video shows the Captury system tracking people as a skeleton inside the Linux Captury program.
In the end we managed to fix the bugs and to show the tracked people as light beams.
This image shows how the OSC protocol is running: it listens for messages from our computer and sends messages to our computer, so it provides the communication from one computer to another.




AS A RESULT

Nature Inspired Visualization - Output from Processing


Tracked person in real time on the video wall, as a light-beam reflection of nature


It represents the four elements of nature (earth, fire, water, air). This video is the background behind the light beams, so that the four elements of nature remain visible while the light beams connect the human to the digital environment within that natural conceptual frame.




References

Cressman, Darryl. "A Brief Overview of Actor-Network Theory: Punctualization, Heterogeneous Engineering & Translation." CT Lab / Centre for Policy Research on Science & Technology (CPROST), School of Communication, Simon Fraser University, April 2009. http://faculty.georgetown.edu/irvinem/theory/Cressman-ABriefOverviewofANT.pdf

Law, J. "Actor-Network Theory (ANT)." In Learning Theories, March 23, 2007. https://www.learning-theories.com/actor-network-theory-ant.html