GMU:Sensor Hacklab/Rachel Smith


Project Overview


Origami2.png Umbrella2.png Origami1.png


In this course I wanted to develop an interactive object for use with the performance platform, the idea being that the tracking system senses human movement and sends this data to the object, which in turn responds. Eventually I want to build a wall/curtain structure which will hang between two people and act as a medium of communication. I am particularly interested in the small, subconscious movements humans make while interacting, and will concentrate on eye gaze as my sensory input.

The wall will be made up of units, each an origami structure with the ability to move individually in response to the eye movements of the interactors. In this course I have experimented with various ways of achieving this movement and also with various sensory inputs.


Outline of the installation in the performance platform (12 surrounding cameras for tracking): one continuous origami structure

Rssketch1.png

Units of origami

Rssketch2.png

Link to main project


Technical Implementation

For the sensor I used an eye-tracking Processing sketch developed in a previous course, tweaked to send messages to the Arduino when the viewer's gaze hits certain targets on the screen. Each target sets off a different motor. The two programs communicate over the serial port.
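The Arduino side of that serial link could look roughly like the sketch below. This is only an illustration of the idea, not the code linked further down: the one-character-per-target protocol, the 9600 baud rate and the servo pins are assumptions.

 // Minimal sketch of the receiving side, assuming the Processing eye tracker
 // sends a single character ('1', '2' or '3') over serial whenever the gaze
 // hits one of three on-screen targets. Pins and angles are placeholders.
 #include <Servo.h>
 
 Servo servo1;
 Servo servo2;
 Servo servo3;
 
 void setup() {
   Serial.begin(9600);        // must match the baud rate in the Processing sketch
   servo1.attach(9);
   servo2.attach(10);
   servo3.attach(11);
 }
 
 void loop() {
   if (Serial.available() > 0) {
     char target = Serial.read();
     // each target sets off a different motor
     if (target == '1') servo1.write(90);
     if (target == '2') servo2.write(90);
     if (target == '3') servo3.write(90);
     delay(500);              // hold the position briefly, then relax
     servo1.write(0);
     servo2.write(0);
     servo3.write(0);
   }
 }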


Link to video of the eye-tracking program


*Link to Processing code 
*Link to Arduino code 

Shrs5.png

Shrs4.png

Shrs2.png

Prototyping Experiments

Experimenting with spaghetti as a sensor: its variable resistance, wired into a voltage divider circuit, sends values to the Arduino analogue input pin.
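A minimal readout for such a divider might look like the following; the analogue pin (A0), the baud rate and the sampling interval are assumptions made for illustration.

 // Reading a resistive sensor (here: spaghetti) wired as one half of a
 // voltage divider into an Arduino analogue input. A0 is assumed.
 const int sensorPin = A0;
 
 void setup() {
   Serial.begin(9600);
 }
 
 void loop() {
   int value = analogRead(sensorPin);   // 0-1023, depends on the divider ratio
   Serial.println(value);               // watch the values in the serial monitor
   delay(100);
 }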

Shrs1.png

Here is an experiment with 'Muscle Wire', which shrinks when certain currents are applied. I tried sewing it into the paper to see what kinds of movement could happen. It resulted in subtle, slow movements that were too slight for this project, so I looked again at servo motors.
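For reference, muscle wire usually draws more current than an Arduino pin can supply, so it is normally switched through an external transistor; the sketch below is only a rough illustration of that approach, with the MOSFET pin, duty cycle and timings guessed rather than taken from the experiment.

 // Rough illustration of pulsing muscle wire through a MOSFET on a PWM pin.
 // The pin only switches the transistor gate; the wire itself is powered from
 // an external supply. Pin number, duty cycle and timing are assumptions.
 const int wirePin = 3;        // PWM-capable pin driving the MOSFET gate
 
 void setup() {
   pinMode(wirePin, OUTPUT);
 }
 
 void loop() {
   analogWrite(wirePin, 100);  // partial duty cycle so the wire heats gently
   delay(2000);                // wire contracts slowly while heated
   analogWrite(wirePin, 0);    // cut the current so the wire cools and relaxes
   delay(4000);
 }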

Shrs3.png