<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Rachel</id>
	<title>Medien Wiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Rachel"/>
	<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/Special:Contributions/Rachel"/>
	<updated>2026-05-15T20:41:45Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.39.6</generator>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Appropriation_within_Digital_Worlds&amp;diff=97272</id>
		<title>GMU:Appropriation within Digital Worlds</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Appropriation_within_Digital_Worlds&amp;diff=97272"/>
		<updated>2018-05-15T15:55:48Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Lecturer: [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/Jörg_Brinkmann Jörg Brinkmann]&amp;lt;br&amp;gt;&lt;br /&gt;
Credits: 6 ECTS, 4 SWS&amp;lt;br&amp;gt;&lt;br /&gt;
Date: Thursdays, 17:00 - 20:30&amp;lt;br&amp;gt; &lt;br /&gt;
Venue: Performance Platform, Digital Bauhaus Lab (Room 001)&amp;lt;br&amp;gt;&lt;br /&gt;
First meeting: Thursday, April 12th 2018, 17:00&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br style=&amp;quot;clear: both&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;DESCRIPTION&#039;&#039;&#039;&amp;lt;br&amp;gt;&lt;br /&gt;
Appropriation in art is the use of pre-existing objects or images with little or no transformation applied to them. &lt;br /&gt;
The use of appropriation has played a significant role in the history of the arts (literary, visual, musical and performing arts). &lt;br /&gt;
In the visual arts, to appropriate means to adopt, borrow, recycle or sample aspects of human-made visual culture. A notable example is the Readymades of Marcel Duchamp. &lt;br /&gt;
&lt;br /&gt;
But when is appropriation a homage, when is it art, and when is it just plain plagiarism? And what are the effects of technology on this ongoing appropriation? Now that the boundaries of authenticity and originality are more blurred than ever, artists (indeed anyone) can recycle and re-upload images, text and audio material more quickly and easily than ever before. Sampling, remixing and mashups proliferate online, even allowing people to adopt a social media profile that appropriates or parodies a well-known persona.&lt;br /&gt;
&lt;br /&gt;
In this course we will look at contemporary artistic strategies of appropriation. We will discuss art forms like Post-Internet art, which embraces meme culture, and music styles like Vapourwave, which appropriates 1980s and 1990s styles of mood music. &lt;br /&gt;
&lt;br /&gt;
We will investigate, question and challenge modern and historical concepts related to the topic and create artistic works that can be presented in the form of live performances, video works or installations. The course supports and exercises independent, &lt;br /&gt;
self-motivated work. Together, we will create an environment in which students can produce and discuss their own subjects related to the matter.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;CRITERIA FOR PASSING&#039;&#039;&#039;&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
In order to participate successfully, you will have to develop and document your own project on the GMU Wiki. Regular attendance and active participation are also mandatory.  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;LANGUAGE&#039;&#039;&#039;&amp;lt;br&amp;gt;&lt;br /&gt;
The course will be held in English, unless all participants speak German.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== &#039;&#039;&#039;STUDENT WORKS&#039;&#039;&#039; ==&lt;br /&gt;
*[[/Ulas Yener|Ulas Yener]]&lt;br /&gt;
*[[/Suriñe G.-Pozuelo Iglesias|Suriñe G.-Pozuelo Iglesias]]&lt;br /&gt;
*[[/Lise Epinat|Lise Epinat]]&lt;br /&gt;
*[[/Holly Kempton|Holly Kempton]]&lt;br /&gt;
*[[Ruo-Jin Yen]]&lt;br /&gt;
*[[Eduardo Moreno]]&lt;br /&gt;
*[[Grace Quintero]]&lt;br /&gt;
*[[Sarah Pacheco Alvim]]&lt;br /&gt;
*[[Nicha Boonyawairote]]&lt;br /&gt;
*[[Marie Ebel]]&lt;br /&gt;
&lt;br /&gt;
== &#039;&#039;&#039;12.04.2018&#039;&#039;&#039; ==&lt;br /&gt;
Introduction&lt;br /&gt;
&lt;br /&gt;
== &#039;&#039;&#039;19.04.2018&#039;&#039;&#039; ==&lt;br /&gt;
Lecture&lt;br /&gt;
&lt;br /&gt;
== &#039;&#039;&#039;26.04.2018&#039;&#039;&#039; ==&lt;br /&gt;
Student presentation&lt;br /&gt;
&lt;br /&gt;
== &#039;&#039;&#039;03.05.2018&#039;&#039;&#039; ==&lt;br /&gt;
Workshop&lt;br /&gt;
&lt;br /&gt;
== &#039;&#039;&#039;17.05.2018&#039;&#039;&#039; ==&lt;br /&gt;
&#039;&#039;&#039;Consultations (Marienstr. 5, Room 208)&#039;&#039;&#039; &amp;lt;br&amp;gt;&lt;br /&gt;
10:00 - 10:30 (Suriñe G.-Pozuelo Iglesias) &amp;lt;br&amp;gt;&lt;br /&gt;
10:30 - 11:00 (put your name here) &amp;lt;br&amp;gt;&lt;br /&gt;
11:00 - 11:30 Lise Epinat &amp;lt;br&amp;gt;&lt;br /&gt;
11:30 - 12:00 Forat Elalfy &amp;lt;br&amp;gt;&lt;br /&gt;
– &amp;lt;br&amp;gt;&lt;br /&gt;
13:00 - 13:30 Nicha Boonyawairote &amp;lt;br&amp;gt;&lt;br /&gt;
13:30 - 14:00 Marie Ebel &amp;lt;br&amp;gt;&lt;br /&gt;
14:00 - 14:30 Eduardo Moreno &amp;lt;br&amp;gt;&lt;br /&gt;
14:30 - 15:00 Grace Quintero &amp;lt;br&amp;gt;&lt;br /&gt;
– &amp;lt;br&amp;gt;&lt;br /&gt;
16:00 - 16:30 (put your name here) &amp;lt;br&amp;gt;&lt;br /&gt;
16:30 - 17:00 Sarah Alvim &amp;lt;br&amp;gt;&lt;br /&gt;
17:00 - 17:30 (put your name here) &amp;lt;br&amp;gt;&lt;br /&gt;
17:30 - 18:00 (put your name here) &amp;lt;br&amp;gt;&lt;br /&gt;
18:00 - 18:30 Rachel &amp;lt;br&amp;gt;&lt;br /&gt;
18:30 - 19:00 Fabian Krzich &amp;lt;br&amp;gt; &amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== &#039;&#039;&#039;24.05.2018&#039;&#039;&#039; ==&lt;br /&gt;
Lecture&lt;br /&gt;
&lt;br /&gt;
== &#039;&#039;&#039;31.05.2018&#039;&#039;&#039; ==&lt;br /&gt;
Student presentation&lt;br /&gt;
&lt;br /&gt;
== &#039;&#039;&#039;07.06.2018&#039;&#039;&#039; ==&lt;br /&gt;
Workshop&lt;br /&gt;
&lt;br /&gt;
== &#039;&#039;&#039;14.06.2018&#039;&#039;&#039; ==&lt;br /&gt;
&#039;&#039;&#039;Consultations (Marienstr. 5, Room 208)&#039;&#039;&#039; &amp;lt;br&amp;gt;&lt;br /&gt;
11:00 - 11:30 (put your name here) &amp;lt;br&amp;gt;&lt;br /&gt;
11:30 - 12:00 (put your name here) &amp;lt;br&amp;gt;&lt;br /&gt;
– &amp;lt;br&amp;gt;&lt;br /&gt;
13:00 - 13:30 (put your name here) &amp;lt;br&amp;gt;&lt;br /&gt;
13:30 - 14:00 (put your name here) &amp;lt;br&amp;gt;&lt;br /&gt;
14:00 - 14:30 (put your name here) &amp;lt;br&amp;gt;&lt;br /&gt;
14:30 - 15:00 (put your name here)&lt;br /&gt;
&lt;br /&gt;
== &#039;&#039;&#039;21.06.2018&#039;&#039;&#039; ==&lt;br /&gt;
Presentation of your concepts&lt;br /&gt;
&lt;br /&gt;
== &#039;&#039;&#039;28.06.2018&#039;&#039;&#039; ==&lt;br /&gt;
Lab time: working on your projects&lt;br /&gt;
&lt;br /&gt;
== &#039;&#039;&#039;05.07.2018&#039;&#039;&#039; ==&lt;br /&gt;
Lab time: working on your projects&lt;br /&gt;
&lt;br /&gt;
== &#039;&#039;&#039;12.07.2018&#039;&#039;&#039; ==&lt;br /&gt;
Summaery Exhibition&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== &#039;&#039;&#039;APPROPRIATION TOOLBOX&#039;&#039;&#039; ==&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Software&#039;&#039;&#039;&amp;lt;br&amp;gt;&lt;br /&gt;
Audacity - free, open source, cross-platform audio software for multi-track recording and editing &amp;lt;br&amp;gt;&lt;br /&gt;
https://www.audacityteam.org&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
OpenShot - free, open-source, cross-platform video editor &amp;lt;br&amp;gt;&lt;br /&gt;
https://www.openshot.org &amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
GIMP - GNU Image Manipulation Program &amp;lt;br&amp;gt;&lt;br /&gt;
https://www.gimp.org&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Open Broadcaster Software - free and open-source streaming and recording program &amp;lt;br&amp;gt;&lt;br /&gt;
https://obsproject.com&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
JDownloader - free, open-source download management tool&amp;lt;br&amp;gt;&lt;br /&gt;
http://jdownloader.org&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
HandBrake – The open source video transcoder &amp;lt;br&amp;gt;&lt;br /&gt;
https://handbrake.fr &amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Infovox iVox: The most natural sounding Text to Speech voices yet for Mac OS X &amp;lt;br&amp;gt;&lt;br /&gt;
http://www.assistiveware.com/product/infovox-ivox &amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Sources&#039;&#039;&#039;&amp;lt;br&amp;gt; &lt;br /&gt;
Internet Archive is a non-profit library of millions of free books, movies, software, music, websites, and more &amp;lt;br&amp;gt;&lt;br /&gt;
https://archive.org &amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
UbuWeb – All avant-garde. All the time. &amp;lt;br&amp;gt;&lt;br /&gt;
http://ubuweb.com &amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Freesound: collaborative database of creative-commons licensed sound for musicians and sound lovers &amp;lt;br&amp;gt;&lt;br /&gt;
https://freesound.org &amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
YouTube – video-sharing website… &amp;lt;br&amp;gt;&lt;br /&gt;
https://www.youtube.com &amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Know Your Meme: Internet Meme Database &amp;lt;br&amp;gt;&lt;br /&gt;
http://knowyourmeme.com &amp;lt;br&amp;gt;&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Artists_Lab/Rachel_Smith&amp;diff=91205</id>
		<title>GMU:Artists Lab/Rachel Smith</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Artists_Lab/Rachel_Smith&amp;diff=91205"/>
		<updated>2017-05-07T17:23:57Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;Intercepting Neural Networks&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Convolutional neural networks have a strong ability to recognise and describe images. They are behind Google’s ‘Deep Dream’ and ‘Style Transfer’, and are also used in practical applications such as driverless cars and text-to-speech. Images are abstracted through many ‘hidden’ layers of the network in order to extract high-level information (such as the presence of leaves, eyes, etc.) as easily as possible. When inspected by human eyes, these layers bear little or no resemblance to the original image and, in later stages, are reduced to completely abstract forms. What happens when these layers are reinterpreted in a human way? How will the network function with subjective intruders?&lt;br /&gt;
&lt;br /&gt;
In this project I will intercept the hidden layers of an image classifying neural network, extract several images, draw them myself and reinsert them into the process.&lt;br /&gt;
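A minimal sketch of the interception idea, assuming a single hand-written convolution standing in for one hidden layer of a real trained network; the kernel, image and redraw rule here are illustrative toys, not the actual pipeline used in the project:&lt;br /&gt;

```python
# Toy stand-in for one hidden layer of a CNN: a single 2D convolution.
def convolve2d(image, kernel):
    """Valid 2D convolution over a grayscale image given as lists of lists."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for y in range(len(image) - kh + 1):
        row = []
        for x in range(len(image[0]) - kw + 1):
            s = 0.0
            for j in range(kh):
                for i in range(kw):
                    s += image[y + j][x + i] * kernel[j][i]
            row.append(s)
        out.append(row)
    return out

def intercept(feature_map, redraw):
    """Replace a hidden feature map with a human reinterpretation of it."""
    return [[redraw(v) for v in row] for row in feature_map]

edge_kernel = [[1, -1], [1, -1]]          # toy vertical-edge detector
image = [[0, 0, 1, 1] for _ in range(4)]  # tiny image with one vertical edge
hidden = convolve2d(image, edge_kernel)   # the hidden layer to be intercepted
# "Redrawing by hand" is simulated here by thresholding; the redrawn map
# would then be fed onward in place of the original activations.
redrawn = intercept(hidden, lambda v: 1.0 if abs(v) > 1 else 0.0)
```

In the actual project the extracted layer is drawn by hand and reinserted; the threshold above only marks where that substitution would happen.&lt;br /&gt;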
                                                    &lt;br /&gt;
&lt;br /&gt;
[[File:Jason Yosinksi.png|400px]][[File:Gene Kogan.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Jason Yosinski, Gene Kogan&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Jason_Yosinksi.png&amp;diff=91203</id>
		<title>File:Jason Yosinksi.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Jason_Yosinksi.png&amp;diff=91203"/>
		<updated>2017-05-07T17:20:18Z</updated>

		<summary type="html">&lt;p&gt;Rachel: Rachel uploaded a new version of File:Jason Yosinksi.png&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Jason_Yosinksi.jpg&amp;diff=91202</id>
		<title>File:Jason Yosinksi.jpg</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Jason_Yosinksi.jpg&amp;diff=91202"/>
		<updated>2017-05-07T17:17:13Z</updated>

		<summary type="html">&lt;p&gt;Rachel: File uploaded with MsUpload&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Artists_Lab/Rachel_Smith&amp;diff=91201</id>
		<title>GMU:Artists Lab/Rachel Smith</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Artists_Lab/Rachel_Smith&amp;diff=91201"/>
		<updated>2017-05-07T17:12:40Z</updated>

		<summary type="html">&lt;p&gt;Rachel: Created page with &amp;quot;&amp;#039;&amp;#039;&amp;#039;Intercepting Neural Networks&amp;#039;&amp;#039;&amp;#039;  Convolutional neural networks have a strong ability to recognise and describe images. They are behind Google’s ‘Deep Dream’, ‘Style...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;Intercepting Neural Networks&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Convolutional neural networks have a strong ability to recognise and describe images. They are behind Google’s ‘Deep Dream’ and ‘Style Transfer’, and are also used in practical applications such as driverless cars and text-to-speech. Images are abstracted through many ‘hidden’ layers of the network in order to extract high-level information (such as the presence of leaves, eyes, etc.) as easily as possible. When inspected by human eyes, these layers bear little or no resemblance to the original image and, in later stages, are reduced to completely abstract forms. What happens when these layers are reinterpreted in a human way? How will the network function with subjective intruders?&lt;br /&gt;
&lt;br /&gt;
In this project I will intercept the hidden layers of an image classifying neural network, extract several images, draw them myself and reinsert them into the process.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;gallery mode=&amp;quot;packed-hover&amp;quot;&amp;gt;&lt;br /&gt;
Image:Gene Kogan.png| Gene Kogan&lt;br /&gt;
Image:Jason Yosinksi.png| Jason Yosinski&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Jason_Yosinksi.png&amp;diff=91200</id>
		<title>File:Jason Yosinksi.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Jason_Yosinksi.png&amp;diff=91200"/>
		<updated>2017-05-07T17:00:55Z</updated>

		<summary type="html">&lt;p&gt;Rachel: File uploaded with MsUpload&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Gene_Kogan.png&amp;diff=91199</id>
		<title>File:Gene Kogan.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Gene_Kogan.png&amp;diff=91199"/>
		<updated>2017-05-07T17:00:53Z</updated>

		<summary type="html">&lt;p&gt;Rachel: File uploaded with MsUpload&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Artists_Lab_SS17&amp;diff=91198</id>
		<title>GMU:Artists Lab SS17</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Artists_Lab_SS17&amp;diff=91198"/>
		<updated>2017-05-07T16:59:45Z</updated>

		<summary type="html">&lt;p&gt;Rachel: /* Students */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;-----&lt;br /&gt;
= Artists Lab =&lt;br /&gt;
== Application ==&lt;br /&gt;
If you want to take part please apply with the normal application procedure of our faculty and let me know that you want to join our class: ursula.damm [at] uni-weimar.de &amp;lt;br&amp;gt;&lt;br /&gt;
For prospective students we are available for consultations on &#039;&#039;&#039;Tuesday 04.04. from 9.00 to 12.00 at Marienstrasse 5, room 304 (3rd floor)&#039;&#039;&#039; &amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;u&amp;gt;Location:&amp;lt;/u&amp;gt; &#039;&#039;&#039;Room 204, Marienstrasse 7b&#039;&#039;&#039;&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;u&amp;gt;First Meeting:&amp;lt;/u&amp;gt; &#039;&#039;&#039;Thursday 06.04.2017, 09:15&#039;&#039;&#039; (please note that on Thursday, April 13th we are not available, but you are welcome to meet with us on Thursday 6.4.)&lt;br /&gt;
The module is very open ;-)&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;u&amp;gt;Regular meetings:&amp;lt;/u&amp;gt; Thursday 09:15 to 12:30&lt;br /&gt;
&amp;lt;br&amp;gt;Participation at the Presentation on Media Art &#039;&#039;&#039;Tuesday 17.30 Uhr at room 204, Marienstrasse 7b&#039;&#039;&#039; is mandatory&lt;br /&gt;
&lt;br /&gt;
== Description ==&lt;br /&gt;
&lt;br /&gt;
The artists lab is a project module for students who wish to have an intensive exchange on contemporary artistic approaches to art, media, interactivity and matter. GMU offers our labs (Performance Platform and DIY Biolab) to develop your ideas in suitable environments. &amp;lt;br&amp;gt;The module expects you to be self-motivated and self-organized. It offers a culture of discussion. You should consider participating if you wish to work with the [https://www.uni-weimar.de/de/kunst-und-gestaltung/professuren/media-environments/laboratories/dbl-platform/ Performance Platform] or our [https://www.uni-weimar.de/de/kunst-und-gestaltung/professuren/media-environments/laboratories/diy-biolab/ DIY Biolab]. Part of the module is a lecture on media art on Tuesday evenings at 5.30 pm. People from other study programs are also very welcome; just be aware that we do not offer the strong guidance that might be necessary if you are getting familiar with media art for the first time. &lt;br /&gt;
If you are interested in working in our laboratories, please choose the relevant Fachmodule offered by my co-workers. &lt;br /&gt;
&lt;br /&gt;
The project provides a framework for student undertakings in which experiments become the foundation for artistic forms of expression. Attending the Fachmodule offered by the professorship is recommended.&lt;br /&gt;
&lt;br /&gt;
== Exhibitions ==&lt;br /&gt;
GMU is invited to three shows in late 2017: &amp;lt;br&amp;gt;&lt;br /&gt;
We will participate with biologically inspired games and game settings at the Next Level Festival in Duesseldorf&lt;br /&gt;
* [http://www.next-level.org/start/#/alle/ Next Level] &lt;br /&gt;
We are developing artworks and interventions at two major locations in Kaohsiung in cooperation with [http://www.goethe.de/ges/umw/prj/kuk/fot/mal/deindex.htm Mali Wu]: &lt;br /&gt;
* [http://pwbgis.kcg.gov.tw/zhongdu1/index_en.html Kaohsiung Jhongdou Wetlands Park]&amp;lt;br&amp;gt;&lt;br /&gt;
We were asked to contribute and work in close cooperation with the Phyletische Museum Jena; please contact your teacher for further information.&lt;br /&gt;
*  [[Image:1_Weimar_Jena_Duft.jpg|100px|Dem Geruch auf der Spur]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Lecture on Media Art on Tuesdays, 5.30 pm, room 204==&lt;br /&gt;
&lt;br /&gt;
* 11.4.2017: Interactive Art 1 (Robots, Braitenberg)&lt;br /&gt;
* 18.4.2017: EAT&lt;br /&gt;
* 25.4.2017: Interactive Art 2 (Artists…)&lt;br /&gt;
* 2.5.2017: Algorithms&lt;br /&gt;
* 9.5.2017: Media Performance 1 &lt;br /&gt;
* 16.5.2017: Media Performance 2&lt;br /&gt;
* 23.5.2017: Artificial Life &lt;br /&gt;
* 30.5.2017: Joe Davis + early Bioart&lt;br /&gt;
* 6.6.2017: Bioart 1&lt;br /&gt;
* 13.6.2017: Bioart 2&lt;br /&gt;
* 20.6.2017: Image to Sound 1&lt;br /&gt;
* 27.6.2017: Image to Sound 2&lt;br /&gt;
* 4.7.2017: Agents and Networks&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Presentations==&lt;br /&gt;
* please insert a date for your presentation&lt;br /&gt;
&lt;br /&gt;
== Materials ==&lt;br /&gt;
* Valentino Braitenberg, Vehicles: Experiments in Synthetic Psychology ISBN 978-0262521123&lt;br /&gt;
* Tilman Baumgärtel, Games: Computerspiele von KünstlerInnen, hardware medien kunstverein ISBN 978-3936919776&lt;br /&gt;
&lt;br /&gt;
== Artists ==&lt;br /&gt;
* [http://www.ortner-ortner.com/en/haus-rucker-co Haus Rucker]&lt;br /&gt;
* [http://www.spatialagency.net/database/ant.farm Ant Farm]&lt;br /&gt;
* [http://raumlabor.net/ Raumlabor]&lt;br /&gt;
* [http://www.auditory-seismology.org/version2004/ Florian Dombois Auditory Seismology]&lt;br /&gt;
* [https://vimeo.com/yunchulkim Yunchul Kim]&lt;br /&gt;
* [http://www.f-john.de/index.php?id=36 Turing Tables Franz John]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Loie_Fuller Loie Fuller]&lt;br /&gt;
* [http://deweyhagborg.com/ Heather Dewey-Hagborg]&lt;br /&gt;
* [http://www.dwbowen.com/ David Bowen]&lt;br /&gt;
* [http://wwwwwwwww.jodi.org/ Jodi.org]&lt;br /&gt;
* [http://www.carliergebauer.com/artists/aernout_mik Aernout Mik]&lt;br /&gt;
* [http://www.harunfarocki.de/home.html Harun Farocki]&lt;br /&gt;
* [http://h--a.org/en/project/back-to-the-future-forward-to-the-past-three-time-travel-experiments/ Hoerner Antlfinger Time Travel Experiment]&lt;br /&gt;
&lt;br /&gt;
[[Category:SS17]]&lt;br /&gt;
[[Category:Projektmodul]]&lt;br /&gt;
[[Category:Ursula Damm]]&lt;br /&gt;
&lt;br /&gt;
== Students ==&lt;br /&gt;
&lt;br /&gt;
* [[Christian Döller]] &lt;br /&gt;
* [[David Howlett]]&lt;br /&gt;
* [[Sarah H]]&lt;br /&gt;
* [[Tobias Zimmer]]&lt;br /&gt;
* [[/Rachel Smith|Rachel Smith]]&lt;br /&gt;
* [[Cagdas Biber]]&lt;br /&gt;
* [[Justina]] &lt;br /&gt;
* [[Maike Effenberg]]&lt;br /&gt;
* [[Chen Mu]]&lt;br /&gt;
* [[Wisanu Phu Artdun &#039;Q&#039;]]&lt;br /&gt;
* [[Yasemin Yagcii]]&lt;br /&gt;
* [[Ioannis Oriwol]] &lt;br /&gt;
* [[Grit Lieder]] &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Meandering_through_Space/Rachel_Smith&amp;diff=88266</id>
		<title>GMU:Meandering through Space/Rachel Smith</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Meandering_through_Space/Rachel_Smith&amp;diff=88266"/>
		<updated>2016-11-22T21:13:26Z</updated>

		<summary type="html">&lt;p&gt;Rachel: Created page with &amp;quot;&amp;#039;&amp;#039;&amp;#039;Turtle Drawings from Wikipedia inputs&amp;#039;&amp;#039;&amp;#039;  Turtle, Cat  The cat looks quite a lot like a cat (!)  450px450px]   Do...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;Turtle Drawings from Wikipedia inputs&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Turtle, Cat&lt;br /&gt;
&lt;br /&gt;
The cat looks quite a lot like a cat (!)&lt;br /&gt;
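The exact text-to-drawing mapping is not documented here, so the following is only a guessed sketch of the general technique: each letter of an input text (e.g. a Wikipedia article) is turned deterministically into a turtle heading and step. The `text_to_path` function and its letter-to-angle rule are hypothetical:&lt;br /&gt;

```python
# Hypothetical sketch: derive turtle moves deterministically from text.
def text_to_path(text, step=10):
    """Map each letter of the text to an (angle, distance) turtle move."""
    moves = []
    for ch in text.lower():
        if ch.isalpha():
            # spread the 26 letters evenly over a full circle of headings
            angle = (ord(ch) - ord("a")) * (360 / 26)
            moves.append((round(angle, 2), step))
    return moves

# The resulting list could be replayed with the stdlib turtle module:
#   for angle, dist in path: t.setheading(angle); t.forward(dist)
path = text_to_path("Cat")
```

Identical words then always produce identical traces, which is why near-anagram inputs like Dog/God yield visibly related drawings under such a scheme.&lt;br /&gt;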
&lt;br /&gt;
[[File:Turtle Screenshot.png|450px]][[File:Cat Screenshot.png|450px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Dog, God&lt;br /&gt;
&lt;br /&gt;
[[File:Dog Screenshot.png|450px]] [[File:God Screenshot.png|450px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Britney Spears, Rachel Smith&lt;br /&gt;
&lt;br /&gt;
[[File:Britney Screenshot.png|450px]] [[File:Rachel Screensho.png|450px]]&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Dog_Screenshot.png&amp;diff=88265</id>
		<title>File:Dog Screenshot.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Dog_Screenshot.png&amp;diff=88265"/>
		<updated>2016-11-22T21:04:04Z</updated>

		<summary type="html">&lt;p&gt;Rachel: File uploaded with MsUpload&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Cat_Screenshot.png&amp;diff=88264</id>
		<title>File:Cat Screenshot.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Cat_Screenshot.png&amp;diff=88264"/>
		<updated>2016-11-22T21:03:51Z</updated>

		<summary type="html">&lt;p&gt;Rachel: File uploaded with MsUpload&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Britney_Screenshot.png&amp;diff=88263</id>
		<title>File:Britney Screenshot.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Britney_Screenshot.png&amp;diff=88263"/>
		<updated>2016-11-22T21:03:50Z</updated>

		<summary type="html">&lt;p&gt;Rachel: File uploaded with MsUpload&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:God_Screenshot.png&amp;diff=88262</id>
		<title>File:God Screenshot.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:God_Screenshot.png&amp;diff=88262"/>
		<updated>2016-11-22T21:03:37Z</updated>

		<summary type="html">&lt;p&gt;Rachel: File uploaded with MsUpload&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Rachel_Screensho.png&amp;diff=88261</id>
		<title>File:Rachel Screensho.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Rachel_Screensho.png&amp;diff=88261"/>
		<updated>2016-11-22T21:02:44Z</updated>

		<summary type="html">&lt;p&gt;Rachel: File uploaded with MsUpload&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Turtle_Screenshot.png&amp;diff=88260</id>
		<title>File:Turtle Screenshot.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Turtle_Screenshot.png&amp;diff=88260"/>
		<updated>2016-11-22T21:02:42Z</updated>

		<summary type="html">&lt;p&gt;Rachel: File uploaded with MsUpload&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Meandering_through_Space&amp;diff=88259</id>
		<title>GMU:Meandering through Space</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Meandering_through_Space&amp;diff=88259"/>
		<updated>2016-11-22T21:00:59Z</updated>

		<summary type="html">&lt;p&gt;Rachel: /* Participants */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Image:Meandering-through-space-teaser.png|thumb|left|300px|Martin Schneider: [https://github.com/bitcraftlab/meandering-code meandering code] (2014)]] &amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Meandering through Space&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[:Category:Werkmodul|Werkmodul]]/[[:Category:Fachmodul|Fachmodul]]&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;Lecturer:&#039;&#039; [[Martin Schneider]]&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;Credits:&#039;&#039; 6 [[ECTS]], 4 [[SWS]]&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;Date:&#039;&#039; Thursday 17:00 - 20:30 &amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;Venue:&#039;&#039; [[Performance Plattform]], Digital Bauhaus Lab (Room 001)&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;First meeting:&#039;&#039; Thursday, 20 October 2016, 17:00&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br style=&amp;quot;clear:both&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Participants ==&lt;br /&gt;
&lt;br /&gt;
{{Note|Please create a subpage with your name and add a link below}}&lt;br /&gt;
&lt;br /&gt;
* [[/Martin Schneider|Martin Schneider]]&lt;br /&gt;
&lt;br /&gt;
* [[/Kei Kitamura|Kei Kitamura]]&lt;br /&gt;
&lt;br /&gt;
* [[/Jessica Hüttig|Jessica Hüttig]]&lt;br /&gt;
&lt;br /&gt;
* [[/Rachel Smith|Rachel Smith]]&lt;br /&gt;
&lt;br /&gt;
== Description ==&lt;br /&gt;
&lt;br /&gt;
The interaction of water with geological formations leads to convoluted, meandering forms that have inspired decorative and mystical design practices since primeval times. The labyrinth, meandering around a central point, is a cultural meme that has been copied and reinterpreted for thousands of years, be it in ritual motion, in dance or as graffiti. In this module we follow these meandering trajectories and learn how to programmatically design curves, meanders and labyrinths. The module culminates in the collaborative design of a meander to be presented on the Performance Platform.&lt;br /&gt;
&lt;br /&gt;
The following skills are taught:&lt;br /&gt;
&lt;br /&gt;
* Programming with Processing&lt;br /&gt;
* Controlling the video wall and the sound system of the Performance Platform&lt;br /&gt;
* Interfaces for recording and designing trajectories&lt;br /&gt;
* Curves, trajectories and movement in space&lt;br /&gt;
* Theory and practice of meanders and labyrinths&lt;br /&gt;
* Programmatic design of lines and curves&lt;br /&gt;
* Space-filling curves and fractal mathematics&lt;br /&gt;
* Generative systems&lt;br /&gt;
&lt;br /&gt;
This module requires no prior programming skills. It teaches basic programming skills that will enable you to take advanced modules in the future.&lt;br /&gt;
&lt;br /&gt;
== Recommendation ==&lt;br /&gt;
&lt;br /&gt;
The module is organised in close cooperation with the Professur Elektroakustische Komposition und Klanggestaltung and the Professur Experimentelles Radio.&lt;br /&gt;
It is recommended to combine this course with &amp;quot;Sounds in Motion&amp;quot; (EKK) or with &amp;quot;Big Data / Archiv 2&amp;quot; (RADIO). The course is also aimed at media studies students taking the seminar &amp;quot;Experimentalkulturen&amp;quot; and the project module &amp;quot;Experimente, Artefakte und ihre Performance&amp;quot; at the Professur Gestaltung medialer Umgebungen, as well as students of Medienarchitektur.&lt;br /&gt;
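The programmatic design of space-filling curves mentioned above can be previewed with a short example. The course itself uses Processing; this illustrative plain-Python version generates a Hilbert curve (a classic space-filling meander) as turtle commands via an L-system:&lt;br /&gt;

```python
# Order-n Hilbert curve via an L-system: expand the axiom, then read
# the string as turtle commands (F = forward, + = left, - = right;
# all turns are 90 degrees).
RULES = {"A": "+BF-AFA-FB+", "B": "-AF+BFB+FA-"}

def expand(axiom, depth):
    """Rewrite the axiom string depth times using the L-system rules."""
    s = axiom
    for _ in range(depth):
        s = "".join(RULES.get(c, c) for c in s)
    return s

def to_moves(program):
    """Translate the expanded string into a list of turtle commands."""
    names = {"F": "forward", "+": "left", "-": "right"}
    return [names[c] for c in program if c in names]

# An order-2 curve: 4**2 - 1 = 15 forward segments meandering through
# every cell of a 4x4 grid.
moves = to_moves(expand("A", 2))
```

Raising the depth makes the same short program fill the page ever more densely, which is exactly the kind of rule-driven drawing the module explores.&lt;br /&gt;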
&lt;br /&gt;
== Language ==&lt;br /&gt;
&lt;br /&gt;
The course will be held in English, unless all participants speak German.&lt;br /&gt;
&lt;br /&gt;
== Eligible Participants ==&lt;br /&gt;
&lt;br /&gt;
Undergraduates and graduates enrolled in the faculties of D, M and A.&lt;br /&gt;
&lt;br /&gt;
== Application ==&lt;br /&gt;
&lt;br /&gt;
You can apply for this class via email.&amp;lt;br&amp;gt;&lt;br /&gt;
Make sure to use the email subject &#039;&#039;&amp;quot;Meandering through Space /// Application&amp;quot;&#039;&#039; and include all information listed below.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;To:&#039;&#039;&#039; [[User:ms|Martin Schneider]]&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Subject:&#039;&#039;&#039; Meandering through Space /// Application&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Content:&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
# Name, Surname  &lt;br /&gt;
# Your study program (including name of study + master or bachelor)&lt;br /&gt;
# Your study semester (as of SS 2016)&lt;br /&gt;
# Matriculation number (Matrikelnummer)&lt;br /&gt;
# Valid email address @uni-weimar.de &lt;br /&gt;
# &#039;&#039;&#039;Provide a short and precise instruction that explains how to draw a curve that completely fills a page&#039;&#039;&#039;&lt;br /&gt;
# &#039;&#039;&#039;Follow your instruction and attach the resulting page as an image&#039;&#039;&#039;&lt;br /&gt;
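For the curious: one classic way to fill a page with a single curve is a space-filling curve such as the Hilbert curve. A small Python sketch (purely illustrative, not part of the application) generates its drawing instructions with an L-system:

```python
# Hilbert curve via an L-system: a space-filling curve that, in the limit,
# completely fills a square page.
def hilbert_moves(order):
    """Expand the L-system and return turtle moves
    ("F" = forward, "+" = turn left, "-" = turn right)."""
    rules = {"A": "+BF-AFA-FB+", "B": "-AF+BFB+FA-"}
    s = "A"
    for _ in range(order):
        s = "".join(rules.get(c, c) for c in s)
    return [c for c in s if c in "F+-"]

def hilbert_points(order):
    """Trace the moves on a grid and return the visited lattice points."""
    x, y, dx, dy = 0, 0, 1, 0
    pts = [(x, y)]
    for m in hilbert_moves(order):
        if m == "F":
            x, y = x + dx, y + dy
            pts.append((x, y))
        elif m == "+":
            dx, dy = -dy, dx   # rotate heading 90 degrees left
        else:
            dx, dy = dy, -dx   # rotate heading 90 degrees right

    return pts

# An order-n curve visits every cell of a 2**n x 2**n grid exactly once.
```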
&lt;br /&gt;
== Schedule ==&lt;br /&gt;
{{note|&#039;&#039;&#039;Click the links to learn more about the topics of the course.&#039;&#039;&#039;&amp;lt;br&amp;gt;&lt;br /&gt;
The schedule may still be subject to change.&lt;br /&gt;
}}&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
! Theme !! Topic !! Date&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;3&amp;quot; | [[/Part1|Meandering Curves]]&lt;br /&gt;
| [[/Part1#Introduction|Introduction]]&lt;br /&gt;
| 20. Oct 2016&lt;br /&gt;
|-&lt;br /&gt;
| [[/Part1#Collaborative Drawing|Collaborative Drawing]]&lt;br /&gt;
| 27. Oct 2016&lt;br /&gt;
|-&lt;br /&gt;
| [[/Part1#Random Walks|Random Walks]]&lt;br /&gt;
| 03. Nov 2016&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;4&amp;quot; | [[/Part2|Meandering Code]]&lt;br /&gt;
|-&lt;br /&gt;
| [[/Part2#From Code to Curves|From Code to Curves]]&lt;br /&gt;
| 10. Nov 2016&lt;br /&gt;
|-&lt;br /&gt;
| [[/Part2#Processing Curves|Processing Curves]]&lt;br /&gt;
| 17. Nov 2016&lt;br /&gt;
|- &lt;br /&gt;
| [[/Part2#Transforming Curves|Transforming Curves]]&lt;br /&gt;
| 24. Nov 2016&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;5&amp;quot; | [[/Part3|Meandering Machines]]&lt;br /&gt;
|-&lt;br /&gt;
| [[/Part3#Braitenberg Vehicles|Braitenberg Vehicles]]&lt;br /&gt;
| 01. Dec 2016&lt;br /&gt;
|-&lt;br /&gt;
| [[/Part3#Drawing Homeostats|Drawing Homeostats]]&lt;br /&gt;
| 08. Dec 2016&lt;br /&gt;
|-&lt;br /&gt;
| [[/Part3#Reactive Curves|Reactive Curves]]&lt;br /&gt;
| 15. Dec 2016&lt;br /&gt;
|-&lt;br /&gt;
| [[/Part3#Swarm Based Curves|Swarm Based Curves]]&lt;br /&gt;
| 05. Jan 2017&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;4&amp;quot; | [[/Part4|Meandering Media]]&lt;br /&gt;
|-&lt;br /&gt;
| [[/Part4#Meandering Sound|Meandering Sound]]&lt;br /&gt;
| 12. Jan 2017&lt;br /&gt;
|-&lt;br /&gt;
| [[/Part4#Meandering Video|Meandering Video]]&lt;br /&gt;
| 19. Jan 2017&lt;br /&gt;
|-&lt;br /&gt;
| [[/Part4#The Meandering Mind|The Meandering Mind]]&lt;br /&gt;
| 26. Jan 2017&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Evaluation ==&lt;br /&gt;
* 50 % Kursbegleitende Aufgaben, Experimente und Sketche&lt;br /&gt;
* 30 % Dokumentation (davon 10 % Mitarbeit im Medien-Wiki)&lt;br /&gt;
* 20 % Kollaboratives Design (Meandering Bauhaus)&lt;br /&gt;
&lt;br /&gt;
== Prerequisites ==&lt;br /&gt;
No previous programming skills are required.&lt;br /&gt;
&lt;br /&gt;
== Literature ==&lt;br /&gt;
&lt;br /&gt;
* Tim Ingold: &#039;&#039;Lines: a brief history&#039;&#039; (2007) — ISBN 978-0415424271 (hint: [https://www.google.de/#q=Lines:+a+brief+history google it])&lt;br /&gt;
* Jeffrey Ventrella: [https://archive.org/stream/BrainfillingCurves-AFractalBestiary/BrainFilling Brain-Filling Curves] (2012)&lt;br /&gt;
* Valentino Braitenberg: &#039;&#039;Vehikel. Experimente mit kybernetischen Wesen&#039;&#039; (1993) — ISBN 978-3499195310&lt;br /&gt;
&lt;br /&gt;
[[Category:WS16]]&lt;br /&gt;
[[Category:Werkmodul]]&lt;br /&gt;
[[Category:Fachmodul]]&lt;br /&gt;
[[Category:Martin Schneider]]&lt;br /&gt;
[[Category:Processing]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!--&lt;br /&gt;
&lt;br /&gt;
[[Category:Programming for Beginners]]&lt;br /&gt;
[[Category:Meander]]&lt;br /&gt;
[[Category:Labyrinth]]&lt;br /&gt;
[[Category:Digital Bauhaus Lab]]&lt;br /&gt;
[[Category:Video Wall]]&lt;br /&gt;
[[Category:Spatial Audio]]&lt;br /&gt;
--&amp;gt;&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=86451</id>
		<title>GMU:Sensor Hacklab/Rachel Smith</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=86451"/>
		<updated>2016-08-02T12:49:37Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;Project Overview&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Origami2.png|255px]]&lt;br /&gt;
[[File:Umbrella2.png|300px]]&lt;br /&gt;
[[File:Origami1.png|300px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In this course I wanted to develop an interactive object for use with the performance platform, the idea being that the tracking system would sense human movement and send this data to the object which, in turn, would respond. Eventually, I want to build a wall/curtain structure which will hang in between two people and act as a medium of communication. I am particularly interested in the small and subconscious movements made by humans while interacting and will concentrate on eye gaze as my sensory input.&lt;br /&gt;
&lt;br /&gt;
The wall will be made up of units, each an origami structure with the ability to move individually in response to the eye movements of the interactors. In this course I have experimented with various ways of achieving this movement and also with various sensory inputs.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Outline of the installation in the performance platform (surrounded by 12 cameras for tracking)&lt;br /&gt;
One continuous origami structure&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch1.png]]&lt;br /&gt;
&lt;br /&gt;
Units of origami&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch2.png]]&lt;br /&gt;
&lt;br /&gt;
[https://www.uni-weimar.de/medien/wiki/GMU:Human_and_Nonhuman_Performances_II_SS16/Rachel_Smith Link to main project]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Technical Implementation&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
For the sensor I used an eye-tracking Processing sketch developed in a previous course, tweaked to send messages to the Arduino when your eyes hit certain targets on the screen. Each target sets off a different motor. The two programs communicate over the serial port.&lt;br /&gt;
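As a purely hypothetical illustration of the idea (not the original code), the target-hit logic might look like this in Python; the target positions, hit radius and one-byte serial message format are all my assumptions:

```python
# Hypothetical sketch of the eye-tracking logic: map a gaze position to the
# on-screen target it falls on, so each target can trigger a different motor.
# Target layout and hit radius are illustrative assumptions.
TARGETS = {0: (160, 120), 1: (480, 120), 2: (160, 360), 3: (480, 360)}
HIT_RADIUS = 60  # pixels

def target_hit(gaze_x, gaze_y):
    """Return the index of the target the gaze falls on, or None."""
    for index, (tx, ty) in TARGETS.items():
        if (gaze_x - tx) ** 2 + (gaze_y - ty) ** 2 <= HIT_RADIUS ** 2:
            return index
    return None

def serial_message(index):
    """One byte per target: the kind of message written to the serial
    port for the Arduino side to dispatch to a motor."""
    return bytes([index])
```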
&lt;br /&gt;
[https://vimeo.com/163158806 Link to video of eye tracking programme]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs5.png|600px]]&lt;br /&gt;
 *[[/Link to processing code /]]&lt;br /&gt;
&lt;br /&gt;
 *[[/Link to Arduino code /]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
As an alternative sensor, I also made a spaghetti sensor, using the variable resistance of cooked spaghetti as one leg of a voltage divider. With the divider attached to an analogue Arduino input, I was able to get a motor to respond to my hand moving around in the bowl.&lt;br /&gt;
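The electronics behind this can be sketched as follows; the supply voltage, fixed resistor value and angle mapping are illustrative assumptions, not values measured from the actual build:

```python
# Sketch of the spaghetti sensor as a voltage divider read by an Arduino
# analogue input. The spaghetti resistance varies as a hand moves through
# the bowl; the fixed 10 kOhm leg and 5 V supply are assumed values.
V_SUPPLY = 5.0       # volts
R_FIXED = 10_000.0   # ohms, fixed leg of the divider (assumed value)

def divider_voltage(r_spaghetti):
    """Voltage at the divider midpoint for a given spaghetti resistance."""
    return V_SUPPLY * R_FIXED / (R_FIXED + r_spaghetti)

def adc_reading(voltage):
    """10-bit Arduino ADC: 0..1023 over 0..5 V."""
    return round(voltage / V_SUPPLY * 1023)

def servo_angle(reading):
    """Map an ADC reading to a 0..180 degree servo angle."""
    return reading * 180 // 1023
```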
&lt;br /&gt;
[https://vimeo.com/170165232 Video here]&lt;br /&gt;
&lt;br /&gt;
I then used the spaghetti sensor in another project ([https://www.uni-weimar.de/medien/wiki/GMU:Digital_Puppetry_Lab/Group_Leif/Rachel/Kei see here]), using OSC to send these analogue signals to Max/MSP and Unity.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs1.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for spaghetti sensor&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic1.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Prototyping Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs2.png| 300px]]&lt;br /&gt;
[[File:Rssh12.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:Rssh10.png|200px]]&lt;br /&gt;
[[File:Rssh11.png|200px]]&lt;br /&gt;
[[File:Shrs4.png| 400px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/169906515 Video showing the development of servo motion]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=tGhcTwIJ4Es Link to a nice (if you mute it) youtube tutorial about how to fold this type of origami]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Using Muscle Wire&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Here is an experiment with &#039;muscle wire&#039;, which contracts when certain currents are applied. I tried sewing it into the paper to see what kinds of movement could happen. It produced subtle, slow movements which were too slight for this project. Muscle wire also turned out to be quite complicated to use: it is easy to burn out by applying too much current, or an appropriate current for too long, leaving it unresponsive. I think it could be an interesting material to work with in the future, but it needs a lot more research and experimentation. For these reasons I went back to working with servo motors.&lt;br /&gt;
&lt;br /&gt;
[http://makezine.com/2012/01/31/skill-builder-working-with-shape-memory-alloy/ Link to a Make article about using muscle wire, I found this useful]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs3.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for muscle wire&lt;br /&gt;
&lt;br /&gt;
Here the transistor is needed to supply the muscle wire with enough current while keeping the current drawn from the Arduino pin low, preventing the Arduino from being damaged.&lt;br /&gt;
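The sizing logic for such a driver stage can be sketched as follows; all component values and the transistor gain here are illustrative assumptions, not values read off the schematic:

```python
# Why the transistor: muscle wire may need hundreds of mA, far above the
# roughly 40 mA an Arduino pin can safely source. The pin only switches the
# transistor base; the wire's current comes from the supply instead.
# All numbers below are illustrative assumptions.
def base_resistor(v_pin=5.0, v_be=0.7, load_current=0.4, gain=100, overdrive=5):
    """Base resistor (ohms) that drives the transistor into saturation.

    overdrive pushes more base current than the bare minimum so the
    transistor switches fully on.
    """
    i_base = load_current / gain * overdrive   # required base current
    return (v_pin - v_be) / i_base

def pin_current_ok(i_base, pin_limit=0.040):
    """Check the base current stays within the pin's safe limit."""
    return i_base <= pin_limit
```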
&lt;br /&gt;
[[File:Rsschematic2.png|385px]]&lt;br /&gt;
[[File:schematic2rs.png|300px]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Expanding Circles Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Here is another experiment with mechanisms for opening and closing the origami balls. This time I made expanding circles by fitting together sections of squares, the idea being to place a motor in the middle to twist the structures. In the end this turned out to be much more complicated than the umbrella idea, so I discontinued it, but I really liked the variety of shapes that could be made just from squares.&lt;br /&gt;
&lt;br /&gt;
[[File:expandingcirlces.png|600px]]&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/177207484 Video of this here]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fitting together umbrellas, servos and origami&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
I needed to find a way of translating the servo&#039;s circular motion into push-pull motion to control the umbrella structure. I made little wooden attachments for the servo propeller which I thought could push directly against the bottom of the internal umbrella structure, moving it up and down in different patterns. It turned out that this approach put too much force on the servo and caused it to stop turning. I then attached a wooden stick between the propeller and the bottom &#039;pushing&#039; part of the umbrella. This worked well but didn&#039;t allow enough pushing range, due to the size of the propeller. I solved this by putting one of my wooden propeller adaptations on and connecting the umbrella to it. I found that with a large oval shape a far bigger range could be achieved.&lt;br /&gt;
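The geometry behind the bigger range can be sketched with a simple crank model (radii and angles here are illustrative):

```python
import math

# A servo horn of radius r driving a push rod acts like a crank: with the
# horn pointing down at 0 degrees and up at 180, the vertical travel of the
# crank pin is 2*r, so a larger (oval) adapter directly enlarges the
# umbrella's push-pull range. Only the vertical component is tracked here.
def push_height(radius, angle_deg):
    """Vertical position of the crank pin for a given servo angle."""
    return -radius * math.cos(math.radians(angle_deg))

def travel(radius):
    """Total push-pull range over a 0..180 degree servo sweep."""
    heights = [push_height(radius, a) for a in range(0, 181)]
    return max(heights) - min(heights)
```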
&lt;br /&gt;
[[File:Umbrella8.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella9.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella5.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella6.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella7.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl1.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl2.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:shrs8.png]]&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/177210267 Video of servo moving umbrella mechanism]&lt;br /&gt;
&lt;br /&gt;
Here I managed to get the servo and umbrella mechanism working with the spaghetti sensor, but when I attached the umbrella spokes to the origami, the tension became too much for the motor. To get around this problem I decided to reduce the tension in the paper by using thinner paper and smaller folds. I have started to fold the structure underneath (which is taking an incredibly long time) and will test this soon.&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl3.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl5.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl6.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl7.jpg]]&lt;br /&gt;
&lt;br /&gt;
The full realisation of this project will be documented [https://www.uni-weimar.de/medien/wiki/GMU:Human_and_Nonhuman_Performances_II_SS16/Rachel_Smith here] when complete.&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=86450</id>
		<title>GMU:Sensor Hacklab/Rachel Smith</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=86450"/>
		<updated>2016-08-02T12:48:07Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;Project Overview&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Origami2.png|255px]]&lt;br /&gt;
[[File:Umbrella2.png|300px]]&lt;br /&gt;
[[File:Origami1.png|300px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In this course I wanted to develop an interactive object for use with the performance platform, the idea being that the tracking system would sense human movement and send this data to the object which, in turn, would respond. Eventually, I want to build a wall/curtain structure which will hang in between two people and act as a medium of communication. I am particularly interested in the small and subconscious movements made by humans while interacting and will concentrate on eye gaze as my sensory input.&lt;br /&gt;
&lt;br /&gt;
The wall will be made up of units, each an origami structure with the ability to move individually in response to the eye movements of the interactors. In this course I have experimented with various ways of achieving this movement and also with various sensory inputs.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Outline of the installation in the performance platform (surrounded by 12 cameras for tracking)&lt;br /&gt;
One continuous origami structure&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch1.png]]&lt;br /&gt;
&lt;br /&gt;
Units of origami&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch2.png]]&lt;br /&gt;
&lt;br /&gt;
[https://www.uni-weimar.de/medien/wiki/GMU:Human_and_Nonhuman_Performances_II_SS16/Rachel_Smith Link to main project]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Technical Implementation&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
For the sensor I used an eye-tracking Processing sketch developed in a previous course, tweaked to send messages to the Arduino when your eyes hit certain targets on the screen. Each target sets off a different motor. The two programs communicate over the serial port.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/163158806 Link to video of eye tracking programme]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs5.png|600px]]&lt;br /&gt;
 *[[/Link to processing code /]]&lt;br /&gt;
&lt;br /&gt;
 *[[/Link to Arduino code /]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
As an alternative sensor, I also made a spaghetti sensor, using the variable resistance of cooked spaghetti as one leg of a voltage divider. With the divider attached to an analogue Arduino input, I was able to get a motor to respond to my hand moving around in the bowl.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/170165232 Video here]&lt;br /&gt;
&lt;br /&gt;
I then used the spaghetti sensor in another project ([https://www.uni-weimar.de/medien/wiki/GMU:Digital_Puppetry_Lab/Group_Leif/Rachel/Kei see here]), using OSC to send these analogue signals to Max/MSP and Unity.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs1.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for spaghetti sensor&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic1.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Prototyping Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs2.png| 300px]]&lt;br /&gt;
[[File:Rssh12.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:Rssh10.png|200px]]&lt;br /&gt;
[[File:Rssh11.png|200px]]&lt;br /&gt;
[[File:Shrs4.png| 400px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/169906515 Video showing the development of servo motion]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=tGhcTwIJ4Es Link to a nice (if you mute it) youtube tutorial about how to fold this type of origami]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Using Muscle Wire&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Here is an experiment with &#039;muscle wire&#039;, which contracts when certain currents are applied. I tried sewing it into the paper to see what kinds of movement could happen. It produced subtle, slow movements which were too slight for this project. Muscle wire also turned out to be quite complicated to use: it is easy to burn out by applying too much current, or an appropriate current for too long, leaving it unresponsive. I think it could be an interesting material to work with in the future, but it needs a lot more research and experimentation. For these reasons I went back to working with servo motors.&lt;br /&gt;
&lt;br /&gt;
[http://makezine.com/2012/01/31/skill-builder-working-with-shape-memory-alloy/ Link to a Make article about using muscle wire, I found this useful]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs3.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for muscle wire&lt;br /&gt;
&lt;br /&gt;
Here the transistor is needed to supply the muscle wire with enough current while keeping the current drawn from the Arduino pin low, preventing the Arduino from being damaged.&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic2.png|385px]]&lt;br /&gt;
[[File:schematic2rs.png|300px]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Expanding Circles Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Here is another experiment with mechanisms for opening and closing the origami balls. This time I made expanding circles by fitting together sections of squares, the idea being to place a motor in the middle to twist the structures. In the end this turned out to be much more complicated than the umbrella idea, so I discontinued it, but I really liked the variety of shapes that could be made just from squares.&lt;br /&gt;
&lt;br /&gt;
[[File:expandingcirlces.png|600px]]&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/177207484 Video of this here]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fitting together umbrellas, servos and origami&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
I needed to find a way of translating the servo&#039;s circular motion into push-pull motion to control the umbrella structure. I made little wooden attachments for the servo propeller which I thought could push directly against the bottom of the internal umbrella structure, moving it up and down in different patterns. It turned out that this approach put too much force on the servo and caused it to stop turning. I then attached a wooden stick between the propeller and the bottom &#039;pushing&#039; part of the umbrella. This worked well but didn&#039;t allow enough pushing range, due to the size of the propeller. I solved this by putting one of my wooden propeller adaptations on and connecting the umbrella to it. I found that with a large oval shape a far bigger range could be achieved.&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella8.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella9.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella5.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella6.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella7.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl1.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl2.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:shrs8.png]]&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/177210267 Video of servo moving umbrella mechanism]&lt;br /&gt;
&lt;br /&gt;
Here I managed to get the servo and umbrella mechanism working with the spaghetti sensor, but when I attached the umbrella spokes to the origami, the tension became too much for the motor. To get around this problem I decided to reduce the tension in the paper by using thinner paper and smaller folds. I have started to fold the structure underneath (which is taking an incredibly long time) and will test this soon.&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl3.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl5.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl6.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl7.jpg]]&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Shrs8.png&amp;diff=86449</id>
		<title>File:Shrs8.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Shrs8.png&amp;diff=86449"/>
		<updated>2016-08-02T12:44:33Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=86448</id>
		<title>GMU:Sensor Hacklab/Rachel Smith</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=86448"/>
		<updated>2016-08-02T12:44:15Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;Project Overview&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Origami2.png|255px]]&lt;br /&gt;
[[File:Umbrella2.png|300px]]&lt;br /&gt;
[[File:Origami1.png|300px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In this course I wanted to develop an interactive object for use with the performance platform, the idea being that the tracking system would sense human movement and send this data to the object which, in turn, would respond. Eventually, I want to build a wall/curtain structure which will hang in between two people and act as a medium of communication. I am particularly interested in the small and subconscious movements made by humans while interacting and will concentrate on eye gaze as my sensory input.&lt;br /&gt;
&lt;br /&gt;
The wall will be made up of units, each an origami structure with the ability to move individually in response to the eye movements of the interactors. In this course I have experimented with various ways of achieving this movement and also with various sensory inputs.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Outline of the installation in the performance platform (surrounded by 12 cameras for tracking)&lt;br /&gt;
One continuous origami structure&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch1.png]]&lt;br /&gt;
&lt;br /&gt;
Units of origami&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch2.png]]&lt;br /&gt;
&lt;br /&gt;
[https://www.uni-weimar.de/medien/wiki/GMU:Human_and_Nonhuman_Performances_II_SS16/Rachel_Smith Link to main project]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Technical Implementation&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
For the sensor I used an eye-tracking Processing sketch developed in a previous course, tweaked to send messages to the Arduino when your eyes hit certain targets on the screen. Each target sets off a different motor. The two programs communicate over the serial port.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/163158806 Link to video of eye tracking programme]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs5.png|600px]]&lt;br /&gt;
 *[[/Link to processing code /]]&lt;br /&gt;
&lt;br /&gt;
 *[[/Link to Arduino code /]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
As an alternative sensor, I also made a spaghetti sensor, using the variable resistance of cooked spaghetti as one leg of a voltage divider. With the divider attached to an analogue Arduino input, I was able to get a motor to respond to my hand moving around in the bowl.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/170165232 Video here]&lt;br /&gt;
&lt;br /&gt;
I then used the spaghetti sensor in another project ([https://www.uni-weimar.de/medien/wiki/GMU:Digital_Puppetry_Lab/Group_Leif/Rachel/Kei see here]), using OSC to send these analogue signals to Max/MSP and Unity.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs1.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for spaghetti sensor&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic1.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Prototyping Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs2.png| 300px]]&lt;br /&gt;
[[File:Rssh12.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:Rssh10.png|200px]]&lt;br /&gt;
[[File:Rssh11.png|200px]]&lt;br /&gt;
[[File:Shrs4.png| 400px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/169906515 Video showing the development of servo motion]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=tGhcTwIJ4Es Link to a nice (if you mute it) youtube tutorial about how to fold this type of origami]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Using Muscle Wire&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Here is an experiment with &#039;muscle wire&#039;, which contracts when certain currents are applied. I tried sewing it into the paper to see what kinds of movement could happen. It produced subtle, slow movements which were too slight for this project. Muscle wire also turned out to be quite complicated to use: it is easy to burn out by applying too much current, or an appropriate current for too long, leaving it unresponsive. I think it could be an interesting material to work with in the future, but it needs a lot more research and experimentation. For these reasons I went back to working with servo motors.&lt;br /&gt;
&lt;br /&gt;
[http://makezine.com/2012/01/31/skill-builder-working-with-shape-memory-alloy/ Link to a Make article about using muscle wire, I found this useful]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs3.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for muscle wire&lt;br /&gt;
&lt;br /&gt;
Here the transistor is needed to supply the muscle wire with enough current while keeping the current drawn from the Arduino pin low, preventing the Arduino from being damaged.&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic2.png|385px]]&lt;br /&gt;
[[File:schematic2rs.png|300px]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Expanding Circles Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Here is another experiment with mechanisms for opening and closing the origami balls. This time I made expanding circles by fitting together sections of squares, the idea being to place a motor in the middle to twist the structures. In the end this turned out to be much more complicated than the umbrella idea, so I discontinued it, but I really liked the variety of shapes that could be made just from squares.&lt;br /&gt;
&lt;br /&gt;
[[File:expandingcirlces.png|600px]]&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/177207484 Video of this here]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fitting together umbrellas, servos and origami&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
I needed to find a way of translating the servo&#039;s circular motion into push-pull motion to control the umbrella structure. I made little wooden attachments for the servo propeller which I thought could push directly against the bottom of the internal umbrella structure, moving it up and down in different patterns. It turned out that this approach put too much force on the servo and caused it to stop turning. I then attached a wooden stick between the propeller and the bottom &#039;pushing&#039; part of the umbrella. This worked well but didn&#039;t allow enough pushing range, due to the size of the propeller. I solved this by putting one of my wooden propeller adaptations on and connecting the umbrella to it. I found that with a large oval shape a far bigger range could be achieved.&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella8.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella9.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella5.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella6.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella7.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl1.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl2.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:shrs8.png]]&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/177210267 Video of servo moving umbrella mechanism]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl3.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl5.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl6.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl7.jpg]]&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=86447</id>
		<title>GMU:Sensor Hacklab/Rachel Smith</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=86447"/>
		<updated>2016-08-02T12:39:48Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;Project Overview&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Origami2.png|255px]]&lt;br /&gt;
[[File:Umbrella2.png|300px]]&lt;br /&gt;
[[File:Origami1.png|300px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In this course I wanted to develop an interactive object for use with the performance platform, the idea being that the tracking system would sense human movement and send this data to the object which, in turn, would respond. Eventually, I want to build a wall/curtain structure which will hang in between two people and act as a medium of communication. I am particularly interested in the small and subconscious movements made by humans while interacting and will concentrate on eye gaze as my sensory input.&lt;br /&gt;
&lt;br /&gt;
The wall will be made up of units, each an origami structure with the ability to move individually in response to the eye movements of the interactors. In this course I have experimented with various ways of achieving this movement and also with various sensory inputs.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Outline of the installation in the performance platform (surrounded by 12 cameras for tracking)&lt;br /&gt;
One continuous origami structure&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch1.png]]&lt;br /&gt;
&lt;br /&gt;
Units of origami&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch2.png]]&lt;br /&gt;
&lt;br /&gt;
[https://www.uni-weimar.de/medien/wiki/GMU:Human_and_Nonhuman_Performances_II_SS16/Rachel_Smith Link to main project]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Technical Implementation&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
For the sensor I used an eye-tracking Processing sketch developed in a previous course, tweaked to send messages to the Arduino when your eyes hit certain targets on the screen. Each target sets off a different motor. The two programs communicate over the serial port.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/163158806 Link to video of eye tracking programme]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs5.png|600px]]&lt;br /&gt;
 *[[/Link to processing code /]]&lt;br /&gt;
&lt;br /&gt;
 *[[/Link to Arduino code /]]&lt;br /&gt;
&lt;br /&gt;
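The target logic described above can be sketched as follows. This is a minimal Python illustration only: the actual project used a Processing sketch, and the target coordinates and function names here are invented for the example.

```python
# Hypothetical sketch of the target-hit logic: check whether the tracked
# gaze point lies inside one of several on-screen target rectangles and,
# on a hit, report that target's index so the matching motor can be driven.

TARGETS = [
    (100, 100, 120, 120),   # x, y, width, height of target 0
    (400, 100, 120, 120),   # target 1
    (250, 300, 120, 120),   # target 2
]

def hit_target(gaze_x, gaze_y):
    """Return the index of the target containing the gaze point, else None."""
    for i, (tx, ty, tw, th) in enumerate(TARGETS):
        inside_x = gaze_x >= tx and tx + tw >= gaze_x
        inside_y = gaze_y >= ty and ty + th >= gaze_y
        if inside_x and inside_y:
            return i
    return None

# In the real sketch the index would be written to the serial port on a hit,
# and the Arduino would switch on the corresponding motor.
```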
&lt;br /&gt;
As an alternative sensor, I also made a spaghetti sensor, using the variable resistance of cooked spaghetti as one leg of a voltage divider. Attached to an analogue Arduino input, the sensor let me make a motor respond to my hand moving around in the bowl.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/170165232 Video here]&lt;br /&gt;
&lt;br /&gt;
I then used the spaghetti sensor in another project ([https://www.uni-weimar.de/medien/wiki/GMU:Digital_Puppetry_Lab/Group_Leif/Rachel/Kei see here]), using OSC to send the analogue signals to Max/MSP and Unity.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs1.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for spaghetti sensor&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic1.png]]&lt;br /&gt;
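The divider in the schematic can be modelled roughly as below. This is a Python sketch with assumed component values (the fixed resistor size is a guess, and real spaghetti resistance varies with moisture and contact), not measurements from the project.

```python
# Rough model of the spaghetti voltage divider: the cooked spaghetti acts as
# a variable resistor in series with a fixed resistor, and the Arduino's
# 10-bit ADC reads the voltage across the fixed leg.

V_IN = 5.0          # Arduino supply voltage
R_FIXED = 10_000.0  # fixed resistor in ohms (assumed value)

def adc_reading(r_spaghetti):
    """10-bit ADC value for a given spaghetti resistance in ohms."""
    v_out = V_IN * R_FIXED / (R_FIXED + r_spaghetti)
    return round(1023 * v_out / V_IN)

def to_servo_angle(reading):
    """Map a 0..1023 reading onto a 0..180 degree servo angle,
    like Arduino's map(reading, 0, 1023, 0, 180)."""
    return reading * 180 // 1023
```

Lower spaghetti resistance (a hand pressing the strands together) raises the reading, which can then be mapped straight onto a servo angle.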
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Prototyping Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs2.png| 300px]]&lt;br /&gt;
[[File:Rssh12.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:Rssh10.png|200px]]&lt;br /&gt;
[[File:Rssh11.png|200px]]&lt;br /&gt;
[[File:Shrs4.png| 400px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/169906515 Video showing the development of servo motion]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=tGhcTwIJ4Es Link to a nice (if you mute it) YouTube tutorial on how to fold this type of origami]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Using Muscle Wire&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Here is an experiment with &#039;muscle wire&#039; (a shape-memory alloy), which contracts when certain currents are applied. I tried sewing it into the paper to see what kinds of movement could happen. It produced subtle, slow movements which were too slight for this project. Muscle wire also turned out to be quite complicated to use: it is easy to burn out by applying too much current, or an appropriate current for too long, which leaves it unresponsive. I think it could be an interesting material to work with in the future, but it needs a lot more research and experimentation. For these reasons I went back to working with servo motors.&lt;br /&gt;
&lt;br /&gt;
[http://makezine.com/2012/01/31/skill-builder-working-with-shape-memory-alloy/ Link to a Make article about using muscle wire, I found this useful]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs3.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for muscle wire&lt;br /&gt;
&lt;br /&gt;
Here a transistor is needed to supply the muscle wire with enough current while drawing only a small control current from the Arduino pin, protecting the board from damage.&lt;br /&gt;
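A back-of-envelope sizing of that base resistor might look like this. All the figures are assumed for illustration (transistor gain, muscle-wire current, base-emitter drop and safety margin), not values taken from the project.

```python
# Hypothetical base-resistor sizing for the muscle-wire transistor: the wire
# draws far more current than an Arduino pin can source, so an NPN transistor
# switches it, and the base resistor limits what the pin has to supply.

def base_resistor(v_pin=5.0, v_be=0.7, beta=100.0, i_load=0.4, margin=5.0):
    """Base resistor in ohms that lets the pin drive the transistor
    into saturation, with a safety margin on the base current."""
    i_base = margin * i_load / beta   # base current incl. margin
    return (v_pin - v_be) / i_base
```

With these assumed numbers the result lands around a couple of hundred ohms; in practice one would pick the nearest standard value below it.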
&lt;br /&gt;
[[File:Rsschematic2.png|385px]]&lt;br /&gt;
[[File:schematic2rs.png|300px]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Expanding Circles Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Here is another experiment with mechanisms for opening and closing the origami balls. This time I made expanding circles by fitting together sections of squares, the idea being to place a motor in the middle to twist the structures. In the end this turned out to be much more complicated than the umbrella idea, so I discontinued it, but I really liked the variety of shapes made just from squares.&lt;br /&gt;
&lt;br /&gt;
[[File:expandingcirlces.png|600px]]&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/177207484 Video of this here]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fitting Together Umbrellas, Servos and Origami&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
I needed a way of translating the servo&#039;s circular motion into push-pull motion to control the umbrella structure. I made little wooden attachments for the servo propeller which I thought could push directly against the bottom of the internal umbrella structure, moving it up and down in different patterns. It turned out that this approach put too much force on the servo and caused it to stop turning. Then I attached a wooden stick between the propeller and the bottom &#039;pushing&#039; part of the umbrella. This worked well but, due to the size of the propeller, didn&#039;t allow enough pushing range. I solved this by fitting one of my wooden propeller adaptations and connecting the umbrella to that. With a large oval shape a far bigger range could be achieved.&lt;br /&gt;
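The push-pull range can be reasoned about as simple crank geometry. This is a sketch under assumed dimensions; the project's actual parts were not measured, and the function names are invented for the example.

```python
# Treat the wooden adaptor on the servo horn as a crank: the vertical travel
# of the umbrella's pushing rod is set by the crank radius, which is why a
# larger oval adaptor gave a far bigger range.

import math

def pushrod_height(radius, angle_deg):
    """Vertical offset of the crank pin for a given servo angle."""
    return radius * math.sin(math.radians(angle_deg))

def stroke(radius):
    """Total push-pull travel over a half turn of the crank."""
    return 2 * radius
```

Doubling the effective radius of the adaptor doubles the stroke, matching the observation that the large oval shape achieved a far bigger range.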
&lt;br /&gt;
[[File:Umbrella8.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella9.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella5.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella6.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella7.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl1.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl2.png]]&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/177210267 Video of servo moving umbrella mechanism]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl3.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl5.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl6.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl7.jpg]]&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Schematic2rs.png&amp;diff=86446</id>
		<title>File:Schematic2rs.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Schematic2rs.png&amp;diff=86446"/>
		<updated>2016-08-02T12:34:57Z</updated>

		<summary type="html">&lt;p&gt;Rachel: uploaded a new version of &amp;amp;quot;File:Schematic2rs.png&amp;amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Schematic2rs.png&amp;diff=86445</id>
		<title>File:Schematic2rs.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Schematic2rs.png&amp;diff=86445"/>
		<updated>2016-08-02T12:33:56Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=86444</id>
		<title>GMU:Sensor Hacklab/Rachel Smith</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=86444"/>
		<updated>2016-08-02T12:33:02Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;Project Overview&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Origami2.png|255px]]&lt;br /&gt;
[[File:Umbrella2.png|300px]]&lt;br /&gt;
[[File:Origami1.png|300px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In this course I wanted to develop an interactive object for use with the performance platform, the idea being that the tracking system would sense human movement and send this data to the object which, in turn, would respond. Eventually, I want to build a wall/curtain structure which will hang in between two people and act as a medium of communication. I am particularly interested in the small and subconscious movements made by humans while interacting and will concentrate on eye gaze as my sensory input.&lt;br /&gt;
&lt;br /&gt;
The wall will be made up of units, each an origami structure with the ability to move individually in response to the eye movements of the interactors. In this course I have experimented with various ways of achieving this movement and also with various sensory inputs.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Outline of the installation in the performance platform (surrounded by 12 tracking cameras)&lt;br /&gt;
One continuous origami structure&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch1.png]]&lt;br /&gt;
&lt;br /&gt;
Units of origami&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch2.png]]&lt;br /&gt;
&lt;br /&gt;
[https://www.uni-weimar.de/medien/wiki/GMU:Human_and_Nonhuman_Performances_II_SS16/Rachel_Smith Link to main project]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Technical Implementation&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
For the sensor I used an eye-tracking Processing sketch, developed in a previous course, and tweaked it to send messages to the Arduino when your eyes hit certain targets on the screen. Each target sets off a different motor. The two programmes communicate over the serial port.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/163158806 Link to video of eye tracking programme]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs5.png|600px]]&lt;br /&gt;
 *[[/Link to processing code /]]&lt;br /&gt;
&lt;br /&gt;
 *[[/Link to Arduino code /]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
As an alternative sensor, I also made a spaghetti sensor, using the variable resistance of cooked spaghetti as one leg of a voltage divider. Attached to an analogue Arduino input, the sensor let me make a motor respond to my hand moving around in the bowl.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/170165232 Video here]&lt;br /&gt;
&lt;br /&gt;
I then used the spaghetti sensor in another project ([https://www.uni-weimar.de/medien/wiki/GMU:Digital_Puppetry_Lab/Group_Leif/Rachel/Kei see here]), using OSC to send the analogue signals to Max/MSP and Unity.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs1.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for spaghetti sensor&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic1.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Prototyping Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs2.png| 300px]]&lt;br /&gt;
[[File:Rssh12.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:Rssh10.png|200px]]&lt;br /&gt;
[[File:Rssh11.png|200px]]&lt;br /&gt;
[[File:Shrs4.png| 400px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/169906515 Video showing the development of servo motion]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=tGhcTwIJ4Es Link to a nice (if you mute it) YouTube tutorial on how to fold this type of origami]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Using Muscle Wire&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Here is an experiment with &#039;muscle wire&#039; (a shape-memory alloy), which contracts when certain currents are applied. I tried sewing it into the paper to see what kinds of movement could happen. It produced subtle, slow movements which were too slight for this project. Muscle wire also turned out to be quite complicated to use: it is easy to burn out by applying too much current, or an appropriate current for too long, which leaves it unresponsive. I think it could be an interesting material to work with in the future, but it needs a lot more research and experimentation. For these reasons I went back to working with servo motors.&lt;br /&gt;
&lt;br /&gt;
[http://makezine.com/2012/01/31/skill-builder-working-with-shape-memory-alloy/ Link to a Make article about using muscle wire, I found this useful]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs3.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for muscle wire&lt;br /&gt;
&lt;br /&gt;
Here a transistor is needed to supply the muscle wire with enough current while drawing only a small control current from the Arduino pin, protecting the board from damage.&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic2.png]]&lt;br /&gt;
[[File:schematic2rs.png]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Expanding Circles Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Here is another experiment with mechanisms for opening and closing the origami balls. This time I made expanding circles by fitting together sections of squares, the idea being to place a motor in the middle to twist the structures. In the end this turned out to be much more complicated than the umbrella idea, so I discontinued it, but I really liked the variety of shapes made just from squares.&lt;br /&gt;
&lt;br /&gt;
[[File:expandingcirlces.png|600px]]&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/177207484 Video of this here]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fitting Together Umbrellas, Servos and Origami&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
I needed a way of translating the servo&#039;s circular motion into push-pull motion to control the umbrella structure. I made little wooden attachments for the servo propeller which I thought could push directly against the bottom of the internal umbrella structure, moving it up and down in different patterns. It turned out that this approach put too much force on the servo and caused it to stop turning. Then I attached a wooden stick between the propeller and the bottom &#039;pushing&#039; part of the umbrella. This worked well but, due to the size of the propeller, didn&#039;t allow enough pushing range. I solved this by fitting one of my wooden propeller adaptations and connecting the umbrella to that. With a large oval shape a far bigger range could be achieved.&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella8.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella9.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella5.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella6.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella7.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl1.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl2.png]]&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/177210267 Video of servo moving umbrella mechanism]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl3.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl5.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl6.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl7.jpg]]&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=86443</id>
		<title>GMU:Sensor Hacklab/Rachel Smith</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=86443"/>
		<updated>2016-08-02T12:32:24Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;Project Overview&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Origami2.png|255px]]&lt;br /&gt;
[[File:Umbrella2.png|300px]]&lt;br /&gt;
[[File:Origami1.png|300px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In this course I wanted to develop an interactive object for use with the performance platform, the idea being that the tracking system would sense human movement and send this data to the object which, in turn, would respond. Eventually, I want to build a wall/curtain structure which will hang in between two people and act as a medium of communication. I am particularly interested in the small and subconscious movements made by humans while interacting and will concentrate on eye gaze as my sensory input.&lt;br /&gt;
&lt;br /&gt;
The wall will be made up of units, each an origami structure with the ability to move individually in response to the eye movements of the interactors. In this course I have experimented with various ways of achieving this movement and also with various sensory inputs.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Outline of the installation in the performance platform (surrounded by 12 tracking cameras)&lt;br /&gt;
One continuous origami structure&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch1.png]]&lt;br /&gt;
&lt;br /&gt;
Units of origami&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch2.png]]&lt;br /&gt;
&lt;br /&gt;
[https://www.uni-weimar.de/medien/wiki/GMU:Human_and_Nonhuman_Performances_II_SS16/Rachel_Smith Link to main project]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Technical Implementation&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
For the sensor I used an eye-tracking Processing sketch, developed in a previous course, and tweaked it to send messages to the Arduino when your eyes hit certain targets on the screen. Each target sets off a different motor. The two programmes communicate over the serial port.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/163158806 Link to video of eye tracking programme]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs5.png|600px]]&lt;br /&gt;
 *[[/Link to processing code /]]&lt;br /&gt;
&lt;br /&gt;
 *[[/Link to Arduino code /]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
As an alternative sensor, I also made a spaghetti sensor, using the variable resistance of cooked spaghetti as one leg of a voltage divider. Attached to an analogue Arduino input, the sensor let me make a motor respond to my hand moving around in the bowl.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/170165232 Video here]&lt;br /&gt;
&lt;br /&gt;
I then used the spaghetti sensor in another project ([https://www.uni-weimar.de/medien/wiki/GMU:Digital_Puppetry_Lab/Group_Leif/Rachel/Kei see here]), using OSC to send the analogue signals to Max/MSP and Unity.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs1.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for spaghetti sensor&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic1.png]]&lt;br /&gt;
[[File:schematic2rs.png]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Prototyping Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs2.png| 300px]]&lt;br /&gt;
[[File:Rssh12.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:Rssh10.png|200px]]&lt;br /&gt;
[[File:Rssh11.png|200px]]&lt;br /&gt;
[[File:Shrs4.png| 400px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/169906515 Video showing the development of servo motion]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=tGhcTwIJ4Es Link to a nice (if you mute it) YouTube tutorial on how to fold this type of origami]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Using Muscle Wire&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Here is an experiment with &#039;muscle wire&#039; (a shape-memory alloy), which contracts when certain currents are applied. I tried sewing it into the paper to see what kinds of movement could happen. It produced subtle, slow movements which were too slight for this project. Muscle wire also turned out to be quite complicated to use: it is easy to burn out by applying too much current, or an appropriate current for too long, which leaves it unresponsive. I think it could be an interesting material to work with in the future, but it needs a lot more research and experimentation. For these reasons I went back to working with servo motors.&lt;br /&gt;
&lt;br /&gt;
[http://makezine.com/2012/01/31/skill-builder-working-with-shape-memory-alloy/ Link to a Make article about using muscle wire, I found this useful]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs3.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for muscle wire&lt;br /&gt;
&lt;br /&gt;
Here a transistor is needed to supply the muscle wire with enough current while drawing only a small control current from the Arduino pin, protecting the board from damage.&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic2.png]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Expanding Circles Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Here is another experiment with mechanisms for opening and closing the origami balls. This time I made expanding circles by fitting together sections of squares, the idea being to place a motor in the middle to twist the structures. In the end this turned out to be much more complicated than the umbrella idea, so I discontinued it, but I really liked the variety of shapes made just from squares.&lt;br /&gt;
&lt;br /&gt;
[[File:expandingcirlces.png|600px]]&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/177207484 Video of this here]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fitting Together Umbrellas, Servos and Origami&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
I needed a way of translating the servo&#039;s circular motion into push-pull motion to control the umbrella structure. I made little wooden attachments for the servo propeller which I thought could push directly against the bottom of the internal umbrella structure, moving it up and down in different patterns. It turned out that this approach put too much force on the servo and caused it to stop turning. Then I attached a wooden stick between the propeller and the bottom &#039;pushing&#039; part of the umbrella. This worked well but, due to the size of the propeller, didn&#039;t allow enough pushing range. I solved this by fitting one of my wooden propeller adaptations and connecting the umbrella to that. With a large oval shape a far bigger range could be achieved.&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella8.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella9.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella5.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella6.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella7.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl1.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl2.png]]&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/177210267 Video of servo moving umbrella mechanism]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl3.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl5.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl6.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl7.jpg]]&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=86442</id>
		<title>GMU:Sensor Hacklab/Rachel Smith</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=86442"/>
		<updated>2016-08-02T10:14:01Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;Project Overview&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Origami2.png|255px]]&lt;br /&gt;
[[File:Umbrella2.png|300px]]&lt;br /&gt;
[[File:Origami1.png|300px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In this course I wanted to develop an interactive object for use with the performance platform, the idea being that the tracking system would sense human movement and send this data to the object which, in turn, would respond. Eventually, I want to build a wall/curtain structure which will hang in between two people and act as a medium of communication. I am particularly interested in the small and subconscious movements made by humans while interacting and will concentrate on eye gaze as my sensory input.&lt;br /&gt;
&lt;br /&gt;
The wall will be made up of units, each an origami structure with the ability to move individually in response to the eye movements of the interactors. In this course I have experimented with various ways of achieving this movement and also with various sensory inputs.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Outline of the installation in the performance platform (surrounded by 12 tracking cameras)&lt;br /&gt;
One continuous origami structure&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch1.png]]&lt;br /&gt;
&lt;br /&gt;
Units of origami&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch2.png]]&lt;br /&gt;
&lt;br /&gt;
[https://www.uni-weimar.de/medien/wiki/GMU:Human_and_Nonhuman_Performances_II_SS16/Rachel_Smith Link to main project]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Technical Implementation&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
For the sensor I used an eye-tracking Processing sketch, developed in a previous course, and tweaked it to send messages to the Arduino when your eyes hit certain targets on the screen. Each target sets off a different motor. The two programmes communicate over the serial port.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/163158806 Link to video of eye tracking programme]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs5.png|600px]]&lt;br /&gt;
 *[[/Link to processing code /]]&lt;br /&gt;
&lt;br /&gt;
 *[[/Link to Arduino code /]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
As an alternative sensor, I also made a spaghetti sensor, using the variable resistance of cooked spaghetti as one leg of a voltage divider. Attached to an analogue Arduino input, the sensor let me make a motor respond to my hand moving around in the bowl.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/170165232 Video here]&lt;br /&gt;
&lt;br /&gt;
I then used the spaghetti sensor in another project ([https://www.uni-weimar.de/medien/wiki/GMU:Digital_Puppetry_Lab/Group_Leif/Rachel/Kei see here]), using OSC to send the analogue signals to Max/MSP and Unity.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs1.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for spaghetti sensor&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic1.png]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Prototyping Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs2.png| 300px]]&lt;br /&gt;
[[File:Rssh12.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:Rssh10.png|200px]]&lt;br /&gt;
[[File:Rssh11.png|200px]]&lt;br /&gt;
[[File:Shrs4.png| 400px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/169906515 Video showing the development of servo motion]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=tGhcTwIJ4Es Link to a nice (if you mute it) YouTube tutorial on how to fold this type of origami]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Using Muscle Wire&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Here is an experiment with &#039;muscle wire&#039; (a shape-memory alloy), which contracts when certain currents are applied. I tried sewing it into the paper to see what kinds of movement could happen. It produced subtle, slow movements which were too slight for this project. Muscle wire also turned out to be quite complicated to use: it is easy to burn out by applying too much current, or an appropriate current for too long, which leaves it unresponsive. I think it could be an interesting material to work with in the future, but it needs a lot more research and experimentation. For these reasons I went back to working with servo motors.&lt;br /&gt;
&lt;br /&gt;
[http://makezine.com/2012/01/31/skill-builder-working-with-shape-memory-alloy/ Link to a Make article about using muscle wire, I found this useful]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs3.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for muscle wire&lt;br /&gt;
&lt;br /&gt;
Here a transistor is needed to supply the muscle wire with enough current while drawing only a small control current from the Arduino pin, protecting the board from damage.&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic2.png]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Expanding Circles Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Here is another experiment with mechanisms for opening and closing the origami balls. This time I made expanding circles by fitting together sections of squares, the idea being to place a motor in the middle to twist the structures. In the end this turned out to be much more complicated than the umbrella idea, so I discontinued it, but I really liked the variety of shapes made just from squares.&lt;br /&gt;
&lt;br /&gt;
[[File:expandingcirlces.png|600px]]&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/177207484 Video of this here]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fitting Together Umbrellas, Servos and Origami&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
I needed a way of translating the servo&#039;s circular motion into push-pull motion to control the umbrella structure. I made little wooden attachments for the servo propeller which I thought could push directly against the bottom of the internal umbrella structure, moving it up and down in different patterns. It turned out that this approach put too much force on the servo and caused it to stop turning. Then I attached a wooden stick between the propeller and the bottom &#039;pushing&#039; part of the umbrella. This worked well but, due to the size of the propeller, didn&#039;t allow enough pushing range. I solved this by fitting one of my wooden propeller adaptations and connecting the umbrella to that. With a large oval shape a far bigger range could be achieved.&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella8.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella9.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella5.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella6.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella7.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl1.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl2.png]]&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/177210267 Video of servo moving umbrella mechanism]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl3.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl5.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl6.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl7.jpg]]&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=86441</id>
		<title>GMU:Sensor Hacklab/Rachel Smith</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=86441"/>
		<updated>2016-08-02T09:47:35Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;Project Overview&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Origami2.png|255px]]&lt;br /&gt;
[[File:Umbrella2.png|300px]]&lt;br /&gt;
[[File:Origami1.png|300px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In this course I wanted to develop an interactive object for use with the performance platform, the idea being that the tracking system would sense human movement and send this data to the object which, in turn, would respond. Eventually, I want to build a wall/curtain structure which will hang in between two people and act as a medium of communication. I am particularly interested in the small and subconscious movements made by humans while interacting and will concentrate on eye gaze as my sensory input.&lt;br /&gt;
&lt;br /&gt;
The wall will be made up of units, each an origami structure with the ability to move individually in response to the eye movements of the interactors. In this course I have experimented with various ways of achieving this movement and also with various sensory inputs.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Outline of the installation in the performance platform (surrounded by 12 tracking cameras)&lt;br /&gt;
One continuous origami structure&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch1.png]]&lt;br /&gt;
&lt;br /&gt;
Units of origami&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch2.png]]&lt;br /&gt;
&lt;br /&gt;
[https://www.uni-weimar.de/medien/wiki/GMU:Human_and_Nonhuman_Performances_II_SS16/Rachel_Smith Link to main project]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Technical Implementation&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
For the sensor I used an eye-tracking Processing sketch, developed in a previous course, and tweaked it to send messages to the Arduino when your eyes hit certain targets on the screen. Each target sets off a different motor. The two programmes communicate over the serial port.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/163158806 Link to video of eye tracking programme]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs5.png|600px]]&lt;br /&gt;
* [[/Link to processing code/]]&lt;br /&gt;
&lt;br /&gt;
* [[/Link to Arduino code/]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
As an alternative sensor, I also made a spaghetti sensor by using the variable resistance inherent in cooked spaghetti as part of a voltage divider. Attached to an analogue Arduino input, I was able to get a motor to respond to my hand moving around in the bowl.&lt;br /&gt;
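The spaghetti sensor works as one leg of a voltage divider read by an analogue pin. A small sketch of the arithmetic, assuming a 5 V supply, the Arduino's 10-bit ADC and a hypothetical 10 kΩ fixed resistor:

```python
# Voltage divider: supply -> spaghetti (variable R) -> A0 -> fixed R -> GND.
# The analogue pin reads the voltage across the fixed resistor.

V_IN = 5.0        # supply voltage (V)
R_FIXED = 10_000  # fixed resistor to ground (ohms); value is an assumption
ADC_MAX = 1023    # Arduino 10-bit ADC full scale

def divider_vout(r_spaghetti):
    """Voltage at the analogue pin for a given spaghetti resistance."""
    return V_IN * R_FIXED / (R_FIXED + r_spaghetti)

def adc_reading(r_spaghetti):
    """The integer value analogRead() would report."""
    return round(divider_vout(r_spaghetti) / V_IN * ADC_MAX)

def spaghetti_resistance(reading):
    """Invert a reading back to an estimated spaghetti resistance."""
    return R_FIXED * (ADC_MAX / reading - 1)
```

Moving a hand in the bowl changes the spaghetti's resistance, which shifts the reading and can be mapped onto motor positions.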
&lt;br /&gt;
[https://vimeo.com/170165232 Video here]&lt;br /&gt;
&lt;br /&gt;
I then used the spaghetti sensor in another project ([https://www.uni-weimar.de/medien/wiki/GMU:Digital_Puppetry_Lab/Group_Leif/Rachel/Kei see here]), using OSC to send the analogue signals to Max/MSP and Unity.&lt;br /&gt;
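OSC messages are small binary packets, usually sent over UDP. As a sketch of what sending one analogue value to Max/MSP or Unity involves (the address "/spaghetti" and the port are made up for illustration), a single-float message can be packed with only the standard library:

```python
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, value: float) -> bytes:
    """Encode an OSC message: address, ',f' type tag, big-endian float."""
    return osc_pad(address.encode()) + osc_pad(b",f") + struct.pack(">f", value)

msg = osc_message("/spaghetti", 0.5)

# Sending it is one UDP datagram (host and port are assumptions):
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(msg, ("127.0.0.1", 8000))
```

Max/MSP ([udpreceive] with an OSC parser) and Unity OSC libraries unpack the same byte layout on the receiving end.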
&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs1.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for spaghetti sensor&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic1.png]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Prototyping Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs2.png| 300px]]&lt;br /&gt;
[[File:Rssh12.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:Rssh10.png|200px]]&lt;br /&gt;
[[File:Rssh11.png|200px]]&lt;br /&gt;
[[File:Shrs4.png| 400px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/169906515 Video showing the development of servo motion]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=tGhcTwIJ4Es Link to a nice (if you mute it) YouTube tutorial on how to fold this type of origami]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Using Muscle Wire&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Here is an experiment with &#039;Muscle Wire&#039;, a shape-memory alloy that contracts when a current heats it. I tried sewing it into the paper to see what kinds of movement were possible. The resulting movements were subtle and slow, too slight for this project. Muscle wire also turned out to be quite complicated to use: it is easy to burn out by applying too much current, or an appropriate current for too long, leaving it unresponsive. It could be an interesting material for future work, but it needs a lot more research and experimentation. For these reasons I went back to working with servo motors.&lt;br /&gt;
&lt;br /&gt;
[http://makezine.com/2012/01/31/skill-builder-working-with-shape-memory-alloy/ Link to a Make article about using muscle wire, I found this useful]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs3.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for muscle wire&lt;br /&gt;
&lt;br /&gt;
Here the transistor is needed to supply the muscle wire with enough current while keeping the current drawn from the Arduino pin low enough not to damage the board.&lt;br /&gt;
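The base resistor for such a switching transistor follows from the wire current and the transistor's gain. All component values below are assumptions for illustration, not measurements from the actual circuit:

```python
# Sizing the base resistor for a BJT that switches the muscle wire.
# Values are illustrative assumptions, not taken from the schematic.

V_PIN = 5.0    # Arduino output pin voltage (V)
V_BE = 0.7     # typical base-emitter drop of a BJT (V)
H_FE = 100     # assumed current gain of the transistor
OVERDRIVE = 5  # extra base drive to keep the transistor saturated

def base_resistor(load_current):
    """Base resistor (ohms) for a given muscle-wire current (A)."""
    i_base = load_current / H_FE * OVERDRIVE
    return (V_PIN - V_BE) / i_base

r_b = base_resistor(0.2)  # e.g. 200 mA through the wire -> 430 ohms
```

The point of the calculation is that the pin only has to source a few milliamps of base current while the transistor carries the full wire current from the supply.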
&lt;br /&gt;
[[File:Rsschematic2.png]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Expanding Circles Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Here is another experiment with mechanisms for opening and closing the origami balls. This time I made expanding circles by fitting together sections of squares, the idea being to place a motor in the middle to twist the structures. In the end this turned out to be much more complicated than the umbrella idea, so I discontinued it, but I really liked the variety of shapes that could be made just from squares.&lt;br /&gt;
&lt;br /&gt;
[[File:expandingcirlces.png|600px]]&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/177207484 Video of this here]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fitting together umbrellas, servos and origami&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
I needed to find a way of translating the circular motion of the servo into the push-pull motion needed to control the umbrella structure. I made little wooden attachments for the servo propeller, which I thought could push directly against the bottom of the internal umbrella structure, moving it up and down in different patterns. This approach put too much force on the servo and caused it to stop turning. I then attached a wooden stick between the propeller and the bottom &#039;pushing&#039; part of the umbrella. This worked well but, because of the propeller&#039;s size, did not allow a large enough range of movement. I solved this by fitting one of my wooden propeller adaptations and connecting the umbrella to that; with a large oval shape a far bigger range could be achieved.&lt;br /&gt;
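The range problem above is basic crank geometry: a push point at radius r on the servo horn travels a total stroke of about 2r, so the larger oval adapter directly increases the travel. A toy calculation (the radii are illustrative, not measured):

```python
import math

def pusher_height(radius_mm, angle_deg):
    """Vertical position of the push point for a servo horn of the
    given radius at the given rotation angle (scotch-yoke style)."""
    return radius_mm * math.sin(math.radians(angle_deg))

def stroke(radius_mm):
    """Total up-down travel over a full rotation: max - min = 2 * radius."""
    heights = [pusher_height(radius_mm, a) for a in range(360)]
    return max(heights) - min(heights)

# A small stock horn vs. a larger oval adapter: doubling the effective
# radius doubles how far the umbrella mechanism can be pushed.
small = stroke(10)  # ~20 mm of travel
large = stroke(25)  # ~50 mm of travel
```

This is why the oval attachment helped: it moves the connection point further from the servo axis.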
&lt;br /&gt;
[[File:Umbrella8.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella9.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella5.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella6.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella7.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl1.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl2.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl3.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl5.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl6.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl7.jpg]]&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=86440</id>
		<title>GMU:Sensor Hacklab/Rachel Smith</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=86440"/>
		<updated>2016-08-02T09:45:51Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;Project Overview&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Origami2.png|255px]]&lt;br /&gt;
[[File:Umbrella2.png|300px]]&lt;br /&gt;
[[File:Origami1.png|300px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In this course I wanted to develop an interactive object for use with the performance platform, the idea being that the tracking system would sense human movement and send this data to the object which, in turn, would respond. Eventually, I want to build a wall/curtain structure which will hang in between two people and act as a medium of communication. I am particularly interested in the small and subconscious movements made by humans while interacting and will concentrate on eye gaze as my sensory input.&lt;br /&gt;
&lt;br /&gt;
The wall will be made up of units, each an origami structure with the ability to move individually in response to the eye movements of the interactors. In this course I have experimented with various ways of achieving this movement and also with various sensory inputs.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Outline of the installation in the performance platform (surrounded by 12 tracking cameras)&lt;br /&gt;
One continuous origami structure&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch1.png]]&lt;br /&gt;
&lt;br /&gt;
Units of origami&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch2.png]]&lt;br /&gt;
&lt;br /&gt;
[https://www.uni-weimar.de/medien/wiki/GMU:Human_and_Nonhuman_Performances_II_SS16/Rachel_Smith Link to main project]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Technical Implementation&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
For the sensor I used an eye-tracking Processing sketch developed in a previous course, tweaked to send messages to the Arduino whenever your eyes hit certain targets on the screen. Each target sets off a different motor. The two programs communicate over the serial port.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/163158806 Link to video of eye tracking programme]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs5.png|600px]]&lt;br /&gt;
* [[/Link to processing code/]]&lt;br /&gt;
&lt;br /&gt;
* [[/Link to Arduino code/]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
As an alternative sensor, I also made a spaghetti sensor by using the variable resistance inherent in cooked spaghetti as part of a voltage divider. Attached to an analogue Arduino input, I was able to get a motor to respond to my hand moving around in the bowl.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/170165232 Video here]&lt;br /&gt;
&lt;br /&gt;
I then used the spaghetti sensor in another project ([https://www.uni-weimar.de/medien/wiki/GMU:Digital_Puppetry_Lab/Group_Leif/Rachel/Kei see here]), using OSC to send the analogue signals to Max/MSP and Unity.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs1.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for spaghetti sensor&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic1.png]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Prototyping Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs2.png| 300px]]&lt;br /&gt;
[[File:Rssh12.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:Rssh10.png|200px]]&lt;br /&gt;
[[File:Rssh11.png|200px]]&lt;br /&gt;
[[File:Shrs4.png| 400px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/169906515 Video showing the development of servo motion]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=tGhcTwIJ4Es Link to a nice (if you mute it) YouTube tutorial on how to fold this type of origami]&lt;br /&gt;
&lt;br /&gt;
Here is an experiment with &#039;Muscle Wire&#039;, a shape-memory alloy that contracts when a current heats it. I tried sewing it into the paper to see what kinds of movement were possible. The resulting movements were subtle and slow, too slight for this project. Muscle wire also turned out to be quite complicated to use: it is easy to burn out by applying too much current, or an appropriate current for too long, leaving it unresponsive. It could be an interesting material for future work, but it needs a lot more research and experimentation. For these reasons I went back to working with servo motors.&lt;br /&gt;
&lt;br /&gt;
[http://makezine.com/2012/01/31/skill-builder-working-with-shape-memory-alloy/ Link to a Make article about using muscle wire, I found this useful]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs3.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for muscle wire&lt;br /&gt;
&lt;br /&gt;
Here the transistor is needed to supply the muscle wire with enough current while keeping the current drawn from the Arduino pin low enough not to damage the board.&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic2.png]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Expanding Circles Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Here is another experiment with mechanisms for opening and closing the origami balls. This time I made expanding circles by fitting together sections of squares, the idea being to place a motor in the middle to twist the structures. In the end this turned out to be much more complicated than the umbrella idea, so I discontinued it, but I really liked the variety of shapes that could be made just from squares.&lt;br /&gt;
&lt;br /&gt;
[[File:expandingcirlces.png|600px]]&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/177207484 Video of this here]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fitting together umbrellas, servos and origami&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
I needed to find a way of translating the circular motion of the servo into the push-pull motion needed to control the umbrella structure. I made little wooden attachments for the servo propeller, which I thought could push directly against the bottom of the internal umbrella structure, moving it up and down in different patterns. This approach put too much force on the servo and caused it to stop turning. I then attached a wooden stick between the propeller and the bottom &#039;pushing&#039; part of the umbrella. This worked well but, because of the propeller&#039;s size, did not allow a large enough range of movement. I solved this by fitting one of my wooden propeller adaptations and connecting the umbrella to that; with a large oval shape a far bigger range could be achieved.&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella8.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella9.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella5.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella6.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella7.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl1.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl2.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl3.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl5.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl6.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl7.jpg]]&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Expandingcirlces.png&amp;diff=86439</id>
		<title>File:Expandingcirlces.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Expandingcirlces.png&amp;diff=86439"/>
		<updated>2016-08-02T09:44:54Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=86438</id>
		<title>GMU:Sensor Hacklab/Rachel Smith</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=86438"/>
		<updated>2016-08-02T09:44:08Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;Project Overview&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Origami2.png|255px]]&lt;br /&gt;
[[File:Umbrella2.png|300px]]&lt;br /&gt;
[[File:Origami1.png|300px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In this course I wanted to develop an interactive object for use with the performance platform, the idea being that the tracking system would sense human movement and send this data to the object which, in turn, would respond. Eventually, I want to build a wall/curtain structure which will hang in between two people and act as a medium of communication. I am particularly interested in the small and subconscious movements made by humans while interacting and will concentrate on eye gaze as my sensory input.&lt;br /&gt;
&lt;br /&gt;
The wall will be made up of units, each an origami structure with the ability to move individually in response to the eye movements of the interactors. In this course I have experimented with various ways of achieving this movement and also with various sensory inputs.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Outline of the installation in the performance platform (surrounded by 12 tracking cameras)&lt;br /&gt;
One continuous origami structure&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch1.png]]&lt;br /&gt;
&lt;br /&gt;
Units of origami&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch2.png]]&lt;br /&gt;
&lt;br /&gt;
[https://www.uni-weimar.de/medien/wiki/GMU:Human_and_Nonhuman_Performances_II_SS16/Rachel_Smith Link to main project]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Technical Implementation&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
For the sensor I used an eye-tracking Processing sketch developed in a previous course, tweaked to send messages to the Arduino whenever your eyes hit certain targets on the screen. Each target sets off a different motor. The two programs communicate over the serial port.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/163158806 Link to video of eye tracking programme]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs5.png|600px]]&lt;br /&gt;
* [[/Link to processing code/]]&lt;br /&gt;
&lt;br /&gt;
* [[/Link to Arduino code/]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
As an alternative sensor, I also made a spaghetti sensor by using the variable resistance inherent in cooked spaghetti as part of a voltage divider. Attached to an analogue Arduino input, I was able to get a motor to respond to my hand moving around in the bowl.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/170165232 Video here]&lt;br /&gt;
&lt;br /&gt;
I then used the spaghetti sensor in another project ([https://www.uni-weimar.de/medien/wiki/GMU:Digital_Puppetry_Lab/Group_Leif/Rachel/Kei see here]), using OSC to send the analogue signals to Max/MSP and Unity.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs1.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for spaghetti sensor&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic1.png]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Prototyping Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs2.png| 300px]]&lt;br /&gt;
[[File:Rssh12.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:Rssh10.png|200px]]&lt;br /&gt;
[[File:Rssh11.png|200px]]&lt;br /&gt;
[[File:Shrs4.png| 400px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/169906515 Video showing the development of servo motion]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=tGhcTwIJ4Es Link to a nice (if you mute it) YouTube tutorial on how to fold this type of origami]&lt;br /&gt;
&lt;br /&gt;
Here is an experiment with &#039;Muscle Wire&#039;, a shape-memory alloy that contracts when a current heats it. I tried sewing it into the paper to see what kinds of movement were possible. The resulting movements were subtle and slow, too slight for this project. Muscle wire also turned out to be quite complicated to use: it is easy to burn out by applying too much current, or an appropriate current for too long, leaving it unresponsive. It could be an interesting material for future work, but it needs a lot more research and experimentation. For these reasons I went back to working with servo motors.&lt;br /&gt;
&lt;br /&gt;
[http://makezine.com/2012/01/31/skill-builder-working-with-shape-memory-alloy/ Link to a Make article about using muscle wire, I found this useful]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs3.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for muscle wire&lt;br /&gt;
&lt;br /&gt;
Here the transistor is needed to supply the muscle wire with enough current while keeping the current drawn from the Arduino pin low enough not to damage the board.&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic2.png]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Expanding Circles Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Here is another experiment with mechanisms for opening and closing the origami balls. This time I made expanding circles by fitting together sections of squares, the idea being to place a motor in the middle to twist the structures. In the end this turned out to be much more complicated than the umbrella idea, so I discontinued it, but I really liked the variety of shapes that could be made just from squares.&lt;br /&gt;
&lt;br /&gt;
[[File:expandingcirlces.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fitting together umbrellas, servos and origami&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
I needed to find a way of translating the circular motion of the servo into the push-pull motion needed to control the umbrella structure. I made little wooden attachments for the servo propeller, which I thought could push directly against the bottom of the internal umbrella structure, moving it up and down in different patterns. This approach put too much force on the servo and caused it to stop turning. I then attached a wooden stick between the propeller and the bottom &#039;pushing&#039; part of the umbrella. This worked well but, because of the propeller&#039;s size, did not allow a large enough range of movement. I solved this by fitting one of my wooden propeller adaptations and connecting the umbrella to that; with a large oval shape a far bigger range could be achieved.&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella8.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella9.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella5.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella6.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella7.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl1.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl2.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl3.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl5.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl6.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl7.jpg]]&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Expandingcircles.png&amp;diff=86437</id>
		<title>File:Expandingcircles.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Expandingcircles.png&amp;diff=86437"/>
		<updated>2016-08-02T09:38:11Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=85807</id>
		<title>GMU:Sensor Hacklab/Rachel Smith</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=85807"/>
		<updated>2016-07-29T22:05:39Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;Project Overview&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Origami2.png|255px]]&lt;br /&gt;
[[File:Umbrella2.png|300px]]&lt;br /&gt;
[[File:Origami1.png|300px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In this course I wanted to develop an interactive object for use with the performance platform, the idea being that the tracking system would sense human movement and send this data to the object which, in turn, would respond. Eventually, I want to build a wall/curtain structure which will hang in between two people and act as a medium of communication. I am particularly interested in the small and subconscious movements made by humans while interacting and will concentrate on eye gaze as my sensory input.&lt;br /&gt;
&lt;br /&gt;
The wall will be made up of units, each an origami structure with the ability to move individually in response to the eye movements of the interactors. In this course I have experimented with various ways of achieving this movement and also with various sensory inputs.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Outline of the installation in the performance platform (surrounded by 12 tracking cameras)&lt;br /&gt;
One continuous origami structure&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch1.png]]&lt;br /&gt;
&lt;br /&gt;
Units of origami&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch2.png]]&lt;br /&gt;
&lt;br /&gt;
[https://www.uni-weimar.de/medien/wiki/GMU:Human_and_Nonhuman_Performances_II_SS16/Rachel_Smith Link to main project]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Technical Implementation&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
For the sensor I used an eye-tracking Processing sketch developed in a previous course, tweaked to send messages to the Arduino whenever your eyes hit certain targets on the screen. Each target sets off a different motor. The two programs communicate over the serial port.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/163158806 Link to video of eye tracking programme]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs5.png|600px]]&lt;br /&gt;
* [[/Link to processing code/]]&lt;br /&gt;
&lt;br /&gt;
* [[/Link to Arduino code/]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
As an alternative sensor, I also made a spaghetti sensor by using the variable resistance inherent in cooked spaghetti as part of a voltage divider. Attached to an analogue Arduino input, I was able to get a motor to respond to my hand moving around in the bowl.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/170165232 Video here]&lt;br /&gt;
&lt;br /&gt;
I then used the spaghetti sensor in another project ([https://www.uni-weimar.de/medien/wiki/GMU:Digital_Puppetry_Lab/Group_Leif/Rachel/Kei see here]), using OSC to send the analogue signals to Max/MSP and Unity.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs1.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for spaghetti sensor&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic1.png]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Prototyping Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs2.png| 300px]]&lt;br /&gt;
[[File:Rssh12.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:Rssh10.png|200px]]&lt;br /&gt;
[[File:Rssh11.png|200px]]&lt;br /&gt;
[[File:Shrs4.png| 400px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/169906515 Video showing the development of servo motion]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=tGhcTwIJ4Es Link to a nice (if you mute it) YouTube tutorial on how to fold this type of origami]&lt;br /&gt;
&lt;br /&gt;
Here is an experiment with &#039;Muscle Wire&#039;, a shape-memory alloy that contracts when a current heats it. I tried sewing it into the paper to see what kinds of movement were possible. The resulting movements were subtle and slow, too slight for this project. Muscle wire also turned out to be quite complicated to use: it is easy to burn out by applying too much current, or an appropriate current for too long, leaving it unresponsive. It could be an interesting material for future work, but it needs a lot more research and experimentation. For these reasons I went back to working with servo motors.&lt;br /&gt;
&lt;br /&gt;
[http://makezine.com/2012/01/31/skill-builder-working-with-shape-memory-alloy/ Link to a Make article about using muscle wire, I found this useful]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs3.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for muscle wire&lt;br /&gt;
&lt;br /&gt;
Here the transistor is needed to supply the muscle wire with enough current while keeping the current drawn from the Arduino pin low enough not to damage the board.&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic2.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fitting together umbrellas, servos and origami&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
I needed to find a way of translating the circular motion of the servo into the push-pull motion needed to control the umbrella structure. I made little wooden attachments for the servo propeller, which I thought could push directly against the bottom of the internal umbrella structure, moving it up and down in different patterns. This approach put too much force on the servo and caused it to stop turning. I then attached a wooden stick between the propeller and the bottom &#039;pushing&#039; part of the umbrella. This worked well but, because of the propeller&#039;s size, did not allow a large enough range of movement. I solved this by fitting one of my wooden propeller adaptations and connecting the umbrella to that; with a large oval shape a far bigger range could be achieved.&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella8.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella9.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella5.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella6.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella7.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl1.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl2.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl3.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl5.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl6.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl7.jpg]]&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=85806</id>
		<title>GMU:Sensor Hacklab/Rachel Smith</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=85806"/>
		<updated>2016-07-29T21:39:15Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;Project Overview&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Origami2.png|255px]]&lt;br /&gt;
[[File:Umbrella2.png|300px]]&lt;br /&gt;
[[File:Origami1.png|300px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In this course I wanted to develop an interactive object for use with the performance platform, the idea being that the tracking system would sense human movement and send this data to the object which, in turn, would respond. Eventually, I want to build a wall/curtain structure which will hang in between two people and act as a medium of communication. I am particularly interested in the small and subconscious movements made by humans while interacting and will concentrate on eye gaze as my sensory input.&lt;br /&gt;
&lt;br /&gt;
The wall will be made up of units, each an origami structure with the ability to move individually in response to the eye movements of the interactors. In this course I have experimented with various ways of achieving this movement and also with various sensory inputs.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Outline of the installation in the performance platform (surrounded by 12 tracking cameras)&lt;br /&gt;
One continuous origami structure&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch1.png]]&lt;br /&gt;
&lt;br /&gt;
Units of origami&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch2.png]]&lt;br /&gt;
&lt;br /&gt;
[https://www.uni-weimar.de/medien/wiki/GMU:Human_and_Nonhuman_Performances_II_SS16/Rachel_Smith Link to main project]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Technical Implementation&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
For the sensor I used an eye-tracking Processing sketch developed in a previous course, tweaked to send messages to the Arduino whenever your eyes hit certain targets on the screen. Each target sets off a different motor. The two programs communicate over the serial port.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/163158806 Link to video of eye tracking programme]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs5.png|600px]]&lt;br /&gt;
* [[/Link to processing code/]]&lt;br /&gt;
&lt;br /&gt;
* [[/Link to Arduino code/]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
As an alternative sensor, I also made a spaghetti sensor by using the variable resistance inherent in cooked spaghetti as part of a voltage divider. Attached to an analogue Arduino input, I was able to get a motor to respond to my hand moving around in the bowl.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/170165232 Video here]&lt;br /&gt;
&lt;br /&gt;
I then used the spaghetti sensor in another project ([https://www.uni-weimar.de/medien/wiki/GMU:Digital_Puppetry_Lab/Group_Leif/Rachel/Kei see here]), using OSC to send the analogue signals to Max/MSP and Unity.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs1.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for spaghetti sensor&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic1.png]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Prototyping Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs2.png| 300px]]&lt;br /&gt;
[[File:Rssh12.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:Rssh10.png|200px]]&lt;br /&gt;
[[File:Rssh11.png|200px]]&lt;br /&gt;
[[File:Shrs4.png| 400px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/169906515 Video showing the development of servo motion]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=tGhcTwIJ4Es Link to a nice (if you mute it) YouTube tutorial on how to fold this type of origami]&lt;br /&gt;
&lt;br /&gt;
Here is an experiment with &#039;muscle wire&#039;, which contracts when certain currents are applied. I tried sewing it into the paper to see what kinds of movement were possible. The result was subtle, slow movement, too slight for this project. Muscle wire also turned out to be quite complicated to use: it is easy to burn out by applying too much current, or an appropriate current for too long, after which it becomes unresponsive. It could be an interesting material to work with in the future, but it needs a lot more research and experimentation. For these reasons I went back to working with servo motors.&lt;br /&gt;
&lt;br /&gt;
[http://makezine.com/2012/01/31/skill-builder-working-with-shape-memory-alloy/ Link to a Make article about using muscle wire, I found this useful]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs3.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for muscle wire&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic2.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fitting together umbrellas, servos and origami&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
I needed a way of translating the servo&#039;s circular motion into the push-pull motion that controls the umbrella structure. I first made little wooden attachments for the servo propeller, thinking they could push directly against the bottom of the internal umbrella structure and move it up and down in different patterns. It turned out that this approach put too much force on the servo and caused it to stop turning. Next I attached a wooden stick between the propeller and the bottom &#039;pushing&#039; part of the umbrella. This worked well but, because of the propeller&#039;s size, didn&#039;t allow enough range of push. I solved this by fitting one of my wooden propeller adaptors and connecting the umbrella to that: with a large oval shape, a far bigger range could be achieved.&lt;br /&gt;
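The circular-to-linear conversion described above is essentially a crank and push-rod, so the stroke can be estimated from standard crank kinematics. A rough model follows; the 15 mm crank radius and 40 mm rod length are assumptions, not dimensions measured from the build.&lt;br /&gt;

```python
# Rough crank/push-rod model of the servo-to-umbrella linkage.
# Dimensions here (15 mm crank, 40 mm rod) are assumed examples.
import math

def pushrod_height(radius, rod_length, angle_deg):
    """Vertical position of the rod's far end for a crank of the given
    radius turned to angle_deg: r*cos(a) + sqrt(L^2 - (r*sin(a))^2)."""
    a = math.radians(angle_deg)
    return radius * math.cos(a) + math.sqrt(
        rod_length ** 2 - (radius * math.sin(a)) ** 2
    )

def travel(radius, rod_length):
    """Total stroke over a half turn: exactly twice the crank radius."""
    return pushrod_height(radius, rod_length, 0) - pushrod_height(
        radius, rod_length, 180
    )
```

This matches the finding above: a larger oval adaptor gives a bigger effective crank radius, and the push-pull range grows in direct proportion.&lt;br /&gt;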
&lt;br /&gt;
[[File:Umbrella8.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella9.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella5.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella6.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella7.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl1.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl2.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl3.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl5.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl6.jpg]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl7.jpg]]&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=85805</id>
		<title>GMU:Sensor Hacklab/Rachel Smith</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=85805"/>
		<updated>2016-07-29T21:37:20Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;Project Overview&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Origami2.png|255px]]&lt;br /&gt;
[[File:Umbrella2.png|300px]]&lt;br /&gt;
[[File:Origami1.png|300px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In this course I wanted to develop an interactive object for use with the performance platform, the idea being that the tracking system would sense human movement and send this data to the object which, in turn, would respond. Eventually, I want to build a wall/curtain structure which will hang in between two people and act as a medium of communication. I am particularly interested in the small and subconscious movements made by humans while interacting and will concentrate on eye gaze as my sensory input.&lt;br /&gt;
&lt;br /&gt;
The wall will be made up of units, each an origami structure with the ability to move individually in response to the eye movements of the interactors. In this course I have experimented with various ways of achieving this movement and also with various sensory inputs.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Outline of the installation in the performance platform (surrounded by 12 tracking cameras)&lt;br /&gt;
One continuous origami structure&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch1.png]]&lt;br /&gt;
&lt;br /&gt;
Units of origami&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch2.png]]&lt;br /&gt;
&lt;br /&gt;
[https://www.uni-weimar.de/medien/wiki/GMU:Human_and_Nonhuman_Performances_II_SS16/Rachel_Smith Link to main project]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Technical Implementation&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
For the sensor I used an eye-tracking Processing sketch developed in a previous course, tweaked to send messages to the Arduino when your gaze hits certain targets on the screen. Each target sets off a different motor. The two programs communicate over the serial port.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/163158806 Link to a video of the eye-tracking program]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs5.png|600px]]&lt;br /&gt;
 *[[/Link to processing code /]]&lt;br /&gt;
&lt;br /&gt;
 *[[/Link to Arduino code /]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
As an alternative sensor, I also made a spaghetti sensor, using the variable resistance inherent in cooked spaghetti as part of a voltage divider. With the divider attached to an analogue Arduino input, I was able to get a motor to respond to my hand moving around in the bowl.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/170165232 Video here]&lt;br /&gt;
&lt;br /&gt;
I then used the spaghetti sensor in another project ([https://www.uni-weimar.de/medien/wiki/GMU:Digital_Puppetry_Lab/Group_Leif/Rachel/Kei see here]), using OSC to send these analogue signals to Max/MSP and Unity.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs1.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for spaghetti sensor&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic1.png]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Prototyping Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs2.png| 300px]]&lt;br /&gt;
[[File:Rssh12.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:Rssh10.png|200px]]&lt;br /&gt;
[[File:Rssh11.png|200px]]&lt;br /&gt;
[[File:Shrs4.png| 400px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/169906515 Video showing the development of servo motion]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=tGhcTwIJ4Es Link to a nice (if you mute it) YouTube tutorial on how to fold this type of origami]&lt;br /&gt;
&lt;br /&gt;
Here is an experiment with &#039;muscle wire&#039;, which contracts when certain currents are applied. I tried sewing it into the paper to see what kinds of movement were possible. The result was subtle, slow movement, too slight for this project. Muscle wire also turned out to be quite complicated to use: it is easy to burn out by applying too much current, or an appropriate current for too long, after which it becomes unresponsive. It could be an interesting material to work with in the future, but it needs a lot more research and experimentation. For these reasons I went back to working with servo motors.&lt;br /&gt;
&lt;br /&gt;
[http://makezine.com/2012/01/31/skill-builder-working-with-shape-memory-alloy/ Link to a Make article about using muscle wire, I found this useful]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs3.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for muscle wire&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic2.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fitting together umbrellas, servos and origami&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
I needed a way of translating the servo&#039;s circular motion into the push-pull motion that controls the umbrella structure. I first made little wooden attachments for the servo propeller, thinking they could push directly against the bottom of the internal umbrella structure and move it up and down in different patterns. It turned out that this approach put too much force on the servo and caused it to stop turning. Next I attached a wooden stick between the propeller and the bottom &#039;pushing&#039; part of the umbrella. This worked well but, because of the propeller&#039;s size, didn&#039;t allow enough range of push. I solved this by fitting one of my wooden propeller adaptors and connecting the umbrella to that: with a large oval shape, a far bigger range could be achieved.&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella8.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella9.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella5.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella6.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella7.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl1.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl2.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl3.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl5.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl6.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:sensorhl7.png]]&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Sensorhl7.jpg&amp;diff=85804</id>
		<title>File:Sensorhl7.jpg</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Sensorhl7.jpg&amp;diff=85804"/>
		<updated>2016-07-29T21:35:24Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Sensorhl6.jpg&amp;diff=85803</id>
		<title>File:Sensorhl6.jpg</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Sensorhl6.jpg&amp;diff=85803"/>
		<updated>2016-07-29T21:35:07Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Sensorhl5.jpg&amp;diff=85802</id>
		<title>File:Sensorhl5.jpg</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Sensorhl5.jpg&amp;diff=85802"/>
		<updated>2016-07-29T21:34:35Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Sensorhl4.png&amp;diff=85801</id>
		<title>File:Sensorhl4.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Sensorhl4.png&amp;diff=85801"/>
		<updated>2016-07-29T21:34:12Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Sensorhl3.png&amp;diff=85800</id>
		<title>File:Sensorhl3.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Sensorhl3.png&amp;diff=85800"/>
		<updated>2016-07-29T21:33:55Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Sensorhl2.png&amp;diff=85799</id>
		<title>File:Sensorhl2.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Sensorhl2.png&amp;diff=85799"/>
		<updated>2016-07-29T21:33:36Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Sensorhl1.png&amp;diff=85798</id>
		<title>File:Sensorhl1.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Sensorhl1.png&amp;diff=85798"/>
		<updated>2016-07-29T21:33:12Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=85797</id>
		<title>GMU:Sensor Hacklab/Rachel Smith</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=85797"/>
		<updated>2016-07-29T20:46:27Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;Project Overview&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Origami2.png|255px]]&lt;br /&gt;
[[File:Umbrella2.png|300px]]&lt;br /&gt;
[[File:Origami1.png|300px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In this course I wanted to develop an interactive object for use with the performance platform, the idea being that the tracking system would sense human movement and send this data to the object which, in turn, would respond. Eventually, I want to build a wall/curtain structure which will hang in between two people and act as a medium of communication. I am particularly interested in the small and subconscious movements made by humans while interacting and will concentrate on eye gaze as my sensory input.&lt;br /&gt;
&lt;br /&gt;
The wall will be made up of units, each an origami structure with the ability to move individually in response to the eye movements of the interactors. In this course I have experimented with various ways of achieving this movement and also with various sensory inputs.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Outline of the installation in the performance platform (surrounded by 12 tracking cameras)&lt;br /&gt;
One continuous origami structure&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch1.png]]&lt;br /&gt;
&lt;br /&gt;
Units of origami&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch2.png]]&lt;br /&gt;
&lt;br /&gt;
[https://www.uni-weimar.de/medien/wiki/GMU:Human_and_Nonhuman_Performances_II_SS16/Rachel_Smith Link to main project]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Technical Implementation&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
For the sensor I used an eye-tracking Processing sketch developed in a previous course, tweaked to send messages to the Arduino when your gaze hits certain targets on the screen. Each target sets off a different motor. The two programs communicate over the serial port.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/163158806 Link to a video of the eye-tracking program]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs5.png|600px]]&lt;br /&gt;
 *[[/Link to processing code /]]&lt;br /&gt;
&lt;br /&gt;
 *[[/Link to Arduino code /]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
As an alternative sensor, I also made a spaghetti sensor, using the variable resistance inherent in cooked spaghetti as part of a voltage divider. With the divider attached to an analogue Arduino input, I was able to get a motor to respond to my hand moving around in the bowl.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/170165232 Video here]&lt;br /&gt;
&lt;br /&gt;
I then used the spaghetti sensor in another project ([https://www.uni-weimar.de/medien/wiki/GMU:Digital_Puppetry_Lab/Group_Leif/Rachel/Kei see here]), using OSC to send these analogue signals to Max/MSP and Unity.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs1.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for spaghetti sensor&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic1.png]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Prototyping Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs2.png| 300px]]&lt;br /&gt;
[[File:Rssh12.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:Rssh10.png|200px]]&lt;br /&gt;
[[File:Rssh11.png|200px]]&lt;br /&gt;
[[File:Shrs4.png| 400px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/169906515 Video showing the development of servo motion]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=tGhcTwIJ4Es Link to a nice (if you mute it) YouTube tutorial on how to fold this type of origami]&lt;br /&gt;
&lt;br /&gt;
Here is an experiment with &#039;muscle wire&#039;, which contracts when certain currents are applied. I tried sewing it into the paper to see what kinds of movement were possible. The result was subtle, slow movement, too slight for this project. Muscle wire also turned out to be quite complicated to use: it is easy to burn out by applying too much current, or an appropriate current for too long, after which it becomes unresponsive. It could be an interesting material to work with in the future, but it needs a lot more research and experimentation. For these reasons I went back to working with servo motors.&lt;br /&gt;
&lt;br /&gt;
[http://makezine.com/2012/01/31/skill-builder-working-with-shape-memory-alloy/ Link to a Make article about using muscle wire, I found this useful]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs3.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for muscle wire&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic2.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fitting together umbrellas, servos and origami&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
I needed a way of translating the servo&#039;s circular motion into the push-pull motion that controls the umbrella structure. I first made little wooden attachments for the servo propeller, thinking they could push directly against the bottom of the internal umbrella structure and move it up and down in different patterns. It turned out that this approach put too much force on the servo and caused it to stop turning. Next I attached a wooden stick between the propeller and the bottom &#039;pushing&#039; part of the umbrella. This worked well but, because of the propeller&#039;s size, didn&#039;t allow enough range of push. I solved this by fitting one of my wooden propeller adaptors and connecting the umbrella to that: with a large oval shape, a far bigger range could be achieved.&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella8.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella9.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella5.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella6.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella7.png]]&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=85796</id>
		<title>GMU:Sensor Hacklab/Rachel Smith</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=85796"/>
		<updated>2016-07-29T20:37:17Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;Project Overview&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Origami2.png|255px]]&lt;br /&gt;
[[File:Umbrella2.png|300px]]&lt;br /&gt;
[[File:Origami1.png|300px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In this course I wanted to develop an interactive object for use with the performance platform, the idea being that the tracking system would sense human movement and send this data to the object which, in turn, would respond. Eventually, I want to build a wall/curtain structure which will hang in between two people and act as a medium of communication. I am particularly interested in the small and subconscious movements made by humans while interacting and will concentrate on eye gaze as my sensory input.&lt;br /&gt;
&lt;br /&gt;
The wall will be made up of units, each an origami structure with the ability to move individually in response to the eye movements of the interactors. In this course I have experimented with various ways of achieving this movement and also with various sensory inputs.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Outline of the installation in the performance platform (surrounded by 12 tracking cameras)&lt;br /&gt;
One continuous origami structure&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch1.png]]&lt;br /&gt;
&lt;br /&gt;
Units of origami&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch2.png]]&lt;br /&gt;
&lt;br /&gt;
[https://www.uni-weimar.de/medien/wiki/GMU:Human_and_Nonhuman_Performances_II_SS16/Rachel_Smith Link to main project]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Technical Implementation&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
For the sensor I used an eye-tracking Processing sketch developed in a previous course, tweaked to send messages to the Arduino when your gaze hits certain targets on the screen. Each target sets off a different motor. The two programs communicate over the serial port.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/163158806 Link to a video of the eye-tracking program]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs5.png|600px]]&lt;br /&gt;
 *[[/Link to processing code /]]&lt;br /&gt;
&lt;br /&gt;
 *[[/Link to Arduino code /]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
As an alternative sensor, I also made a spaghetti sensor, using the variable resistance inherent in cooked spaghetti as part of a voltage divider. With the divider attached to an analogue Arduino input, I was able to get a motor to respond to my hand moving around in the bowl.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/170165232 Video here]&lt;br /&gt;
&lt;br /&gt;
I then used the spaghetti sensor in another project ([https://www.uni-weimar.de/medien/wiki/GMU:Digital_Puppetry_Lab/Group_Leif/Rachel/Kei see here]), using OSC to send these analogue signals to Max/MSP and Unity.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs1.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for spaghetti sensor&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic1.png]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Prototyping Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs2.png| 300px]]&lt;br /&gt;
[[File:Rssh12.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:Rssh10.png|200px]]&lt;br /&gt;
[[File:Rssh11.png|200px]]&lt;br /&gt;
[[File:Shrs4.png| 400px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/169906515 Video showing the development of servo motion]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=tGhcTwIJ4Es Link to a nice (if you mute it) YouTube tutorial on how to fold this type of origami]&lt;br /&gt;
&lt;br /&gt;
Here is an experiment with &#039;muscle wire&#039;, which contracts when certain currents are applied. I tried sewing it into the paper to see what kinds of movement were possible. The result was subtle, slow movement, too slight for this project. Muscle wire also turned out to be quite complicated to use: it is easy to burn out by applying too much current, or an appropriate current for too long, after which it becomes unresponsive. It could be an interesting material to work with in the future, but it needs a lot more research and experimentation. For these reasons I went back to working with servo motors.&lt;br /&gt;
&lt;br /&gt;
[http://makezine.com/2012/01/31/skill-builder-working-with-shape-memory-alloy/ Link to a Make article about using muscle wire, I found this useful]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs3.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for muscle wire&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic2.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fitting together umbrellas, servos and origami&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella8.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella9.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella5.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella6.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella7.png]]&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=85795</id>
		<title>GMU:Sensor Hacklab/Rachel Smith</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=85795"/>
		<updated>2016-07-29T20:22:12Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;Project Overview&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Origami2.png|255px]]&lt;br /&gt;
[[File:Umbrella2.png|300px]]&lt;br /&gt;
[[File:Origami1.png|300px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In this course I wanted to develop an interactive object for use with the performance platform, the idea being that the tracking system would sense human movement and send this data to the object which, in turn, would respond. Eventually, I want to build a wall/curtain structure which will hang in between two people and act as a medium of communication. I am particularly interested in the small and subconscious movements made by humans while interacting and will concentrate on eye gaze as my sensory input.&lt;br /&gt;
&lt;br /&gt;
The wall will be made up of units, each an origami structure with the ability to move individually in response to the eye movements of the interactors. In this course I have experimented with various ways of achieving this movement and also with various sensory inputs.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Outline of the installation in the performance platform (surrounded by 12 tracking cameras)&lt;br /&gt;
One continuous origami structure&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch1.png]]&lt;br /&gt;
&lt;br /&gt;
Units of origami&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch2.png]]&lt;br /&gt;
&lt;br /&gt;
[https://www.uni-weimar.de/medien/wiki/GMU:Human_and_Nonhuman_Performances_II_SS16/Rachel_Smith Link to main project]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Technical Implementation&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
For the sensor I used an eye-tracking Processing sketch developed in a previous course, tweaked to send messages to the Arduino when your gaze hits certain targets on the screen. Each target sets off a different motor. The two programs communicate over the serial port.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/163158806 Link to a video of the eye-tracking program]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs5.png|600px]]&lt;br /&gt;
 *[[/Link to processing code /]]&lt;br /&gt;
&lt;br /&gt;
 *[[/Link to Arduino code /]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
As an alternative sensor, I also made a spaghetti sensor, using the variable resistance inherent in cooked spaghetti as part of a voltage divider. With the divider attached to an analogue Arduino input, I was able to get a motor to respond to my hand moving around in the bowl.&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/170165232 Video here]&lt;br /&gt;
&lt;br /&gt;
I then used the spaghetti sensor in another project ([https://www.uni-weimar.de/medien/wiki/GMU:Digital_Puppetry_Lab/Group_Leif/Rachel/Kei see here]), using OSC to send these analogue signals to Max/MSP and Unity.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs1.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for spaghetti sensor&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic1.png]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Prototyping Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs2.png| 300px]]&lt;br /&gt;
[[File:Rssh12.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:Rssh10.png|200px]]&lt;br /&gt;
[[File:Rssh11.png|200px]]&lt;br /&gt;
[[File:Shrs4.png| 400px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/169906515 Video showing the development of servo motion]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=tGhcTwIJ4Es Link to a nice (if you mute it) YouTube tutorial on how to fold this type of origami]&lt;br /&gt;
&lt;br /&gt;
Here is an experiment with &#039;muscle wire&#039;, which contracts when certain currents are applied. I tried sewing it into the paper to see what kinds of movement were possible. The result was subtle, slow movement, too slight for this project, so I looked again at servo motors.&lt;br /&gt;
&lt;br /&gt;
[http://makezine.com/2012/01/31/skill-builder-working-with-shape-memory-alloy/ Link to a Make article about using muscle wire, I found this useful]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs3.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for muscle wire&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic2.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fitting together umbrellas, servos and origami&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella8.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella9.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella5.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella6.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella7.png]]&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=85794</id>
		<title>GMU:Sensor Hacklab/Rachel Smith</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Sensor_Hacklab/Rachel_Smith&amp;diff=85794"/>
		<updated>2016-07-29T19:51:02Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;Project Overview&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Origami2.png|255px]]&lt;br /&gt;
[[File:Umbrella2.png|300px]]&lt;br /&gt;
[[File:Origami1.png|300px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In this course I wanted to develop an interactive object for use with the performance platform, the idea being that the tracking system would sense human movement and send this data to the object which, in turn, would respond. Eventually, I want to build a wall/curtain structure which will hang in between two people and act as a medium of communication. I am particularly interested in the small and subconscious movements made by humans while interacting and will concentrate on eye gaze as my sensory input.&lt;br /&gt;
&lt;br /&gt;
The wall will be made up of units, each an origami structure with the ability to move individually in response to the eye movements of the interactors. In this course I have experimented with various ways of achieving this movement and also with various sensory inputs.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Outline of the installation in the performance platform (surrounded by 12 tracking cameras)&lt;br /&gt;
One continuous origami structure&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch1.png]]&lt;br /&gt;
&lt;br /&gt;
Units of origami&lt;br /&gt;
&lt;br /&gt;
[[File:Rssketch2.png]]&lt;br /&gt;
&lt;br /&gt;
[https://www.uni-weimar.de/medien/wiki/GMU:Human_and_Nonhuman_Performances_II_SS16/Rachel_Smith Link to main project]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Technical Implementation&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
For the sensor I used an eye-tracking Processing sketch, developed in a previous course, and tweaked it to send messages to an Arduino when your eyes hit certain targets on the screen. Each target sets off a different motor. The communication between the two programmes runs over the serial port.&lt;br /&gt;
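The target logic can be sketched as a plain function. The actual project sends a motor index from Processing to the Arduino over serial; the target coordinates below are illustrative assumptions, since the page does not list them.

```python
# Hypothetical sketch of the target-hit logic described above. In the
# real setup the Processing sketch writes the matching index to the
# serial port and the Arduino switches on the corresponding motor;
# here only the mapping itself is shown, without hardware.

# Screen targets as (x, y, width, height); these positions are made up.
TARGETS = [
    (100, 100, 80, 80),   # target 0 -> motor 0
    (300, 100, 80, 80),   # target 1 -> motor 1
    (500, 100, 80, 80),   # target 2 -> motor 2
]

def motor_for_gaze(gx, gy):
    """Return the motor index whose target the gaze point hits, or None."""
    for i, (x, y, w, h) in enumerate(TARGETS):
        if x <= gx < x + w and y <= gy < y + h:
            return i
    return None
```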
&lt;br /&gt;
[https://vimeo.com/163158806 Link to video of eye tracking programme]&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs5.png|600px]]&lt;br /&gt;
*[[/Link to processing code/]]&lt;br /&gt;
&lt;br /&gt;
*[[/Link to Arduino code/]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
As an alternative sensor, I also made a spaghetti sensor by using the variable resistance inherent in cooked spaghetti as part of a voltage divider. Attached to an analogue Arduino input, I was able to get a motor to respond to my hand moving around in the bowl.&lt;br /&gt;
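The divider can be sketched numerically. The 5 V supply matches a typical Arduino, but the fixed resistor value and the exact servo mapping are assumptions, since the schematic values are only in the image below.

```python
# Minimal numeric sketch of the spaghetti voltage divider.
# Vout = Vin * R_fixed / (R_spaghetti + R_fixed); the Arduino's 10-bit
# ADC reports Vout on a 0-1023 scale, which is then remapped to a
# servo angle. R_FIXED is an assumed value, not from the schematic.

VIN = 5.0          # supply voltage (volts)
R_FIXED = 10000.0  # fixed resistor in the divider (ohms, assumed)

def divider_vout(r_spaghetti):
    """Voltage at the divider midpoint for a given spaghetti resistance."""
    return VIN * R_FIXED / (r_spaghetti + R_FIXED)

def adc_reading(vout):
    """10-bit ADC value the Arduino would report for vout."""
    return int(vout / VIN * 1023)

def servo_angle(adc):
    """Remap a 0-1023 reading to 0-180 degrees,
    like Arduino's map(adc, 0, 1023, 0, 180)."""
    return adc * 180 // 1023
```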
&lt;br /&gt;
[https://vimeo.com/170165232 Video here]&lt;br /&gt;
&lt;br /&gt;
I then used the spaghetti sensor in another project [https://www.uni-weimar.de/medien/wiki/GMU:Digital_Puppetry_Lab/Group_Leif/Rachel/Kei see here], using OSC to send these analogue signals to Max/MSP and Unity.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs1.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for spaghetti sensor&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic1.png]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Prototyping Experiments&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs2.png| 300px]]&lt;br /&gt;
[[File:Rssh12.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:Rssh10.png|200px]]&lt;br /&gt;
[[File:Rssh11.png|200px]]&lt;br /&gt;
[[File:Shrs4.png| 400px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[https://vimeo.com/169906515 Video showing the development of servo motion]&lt;br /&gt;
&lt;br /&gt;
Here is an experiment with &#039;Muscle Wire&#039;, which contracts when certain currents are applied. I tried sewing it into the paper to see what kinds of movement could happen. It produced subtle, slow movements that were too slight for this project, so I looked again at servo motors.&lt;br /&gt;
&lt;br /&gt;
[[File:Shrs3.png]]&lt;br /&gt;
&lt;br /&gt;
Schematic for muscle wire&lt;br /&gt;
&lt;br /&gt;
[[File:Rsschematic2.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Fitting together umbrellas, servos and origami&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella8.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella9.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella4.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella5.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella6.png]]&lt;br /&gt;
&lt;br /&gt;
[[File:Umbrella7.png]]&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Tutorials/Performance_Platform/Recognizing_Gestures_with_Wekinator&amp;diff=85638</id>
		<title>GMU:Tutorials/Performance Platform/Recognizing Gestures with Wekinator</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Tutorials/Performance_Platform/Recognizing_Gestures_with_Wekinator&amp;diff=85638"/>
		<updated>2016-07-29T12:48:59Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&#039;&#039;&#039;What is Wekinator?&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Wekinator is an open-source machine-learning programme with a simple interface that lets you map any OSC input to any OSC output. Have a look at these quick tutorial videos made by its creator, Rebecca Fiebrink.&lt;br /&gt;
&lt;br /&gt;
http://www.wekinator.org/videos/&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Facial Expression Recognition (using a webcam, not the performance platform)&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
First, download Wekinator and openFrameworks (you will also need to download an IDE such as Xcode in order to run openFrameworks).&lt;br /&gt;
&lt;br /&gt;
Instructions for downloading and setting up both of those can be found here:&lt;br /&gt;
http://www.wekinator.org/downloads/&lt;br /&gt;
http://openframeworks.cc/download/&lt;br /&gt;
&lt;br /&gt;
For facial gestures, you can use FaceOSC, which runs on openFrameworks and can be downloaded here:&lt;br /&gt;
https://github.com/kylemcdonald/ofxFaceTracker/releases&lt;br /&gt;
&lt;br /&gt;
This programme can be used with a webcam and tracks the position data of facial features, sending the values as OSC.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Tutorialpic1.png|600px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Now, open Wekinator and listen for incoming OSC messages from FaceOSC. It sends 66 pairs of xy values, i.e. 132 values each frame. Set Wekinator to listen on port 8338 for the input message /raw, and choose 132 as the number of inputs. For this example I have chosen 1 output with 3 classes under a classifier output. (You can choose anything, depending on what sort of output you want.) This output will assign one of three images to three different facial expressions.&lt;br /&gt;
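As a quick illustration of where the 132 comes from: FaceOSC flattens its 66 (x, y) landmark pairs into a single list of floats. A minimal sketch (the landmark values here are dummies):

```python
# FaceOSC's /raw message is 66 facial landmarks, each an (x, y) pair,
# flattened into one list of 132 floats per frame. Dummy values stand
# in for real tracked coordinates.

landmarks = [(float(i), float(i) + 0.5) for i in range(66)]  # fake points

raw = [coord for point in landmarks for coord in point]  # 132 floats
```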
&lt;br /&gt;
&lt;br /&gt;
[[File:Tutorialpic2.png|600px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In Wekinator, click next. You will now come to a screen with a dropdown menu next to outputs. Select 1, make your desired facial expression and press start recording. Try recording around 20 examples and then select 2 in the dropdown menu. Now make your second desired facial expression and record again. Do this again for option 3. Now press Train. When training is finished, the status circle will turn green and you can press run. You should notice that when you make one of your three chosen facial expressions, the dropdown menu automatically changes to 1, 2 or 3. If this is happening for the wrong expressions, you may need to add more examples. If this is the case, stop running and repeat the previous steps. Keep adding examples until you get the accuracy you want.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Tutorialpic3.png|600px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Now for an output! Here is a simple output example I made in Processing for one classifier with three classes. Just load the sketch with three images and change &amp;quot;happy.jpg&amp;quot; etc. to match your file names.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Tutorialpic5.png|600px]]&lt;br /&gt;
[[File:Tutorialpic4.png|600px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
IMPORTANT: Wekinator sends values as floats, not ints, so if you are writing an output sketch you will have to convert.&lt;br /&gt;
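A minimal sketch of that conversion, assuming the three-class setup above (the file names are just placeholders):

```python
# Wekinator sends the class label as a float (e.g. 2.0), so the output
# sketch must round and convert it before using it as a key. File
# names are placeholders for whatever images you loaded.

IMAGES = {1: "happy.jpg", 2: "sad.jpg", 3: "surprised.jpg"}

def image_for_output(osc_value):
    """Round the incoming float to the nearest class and pick an image."""
    label = int(round(osc_value))
    return IMAGES.get(label)  # None if the label is out of range
```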
&lt;br /&gt;
If you have, for example, chosen three pictures of your dog, you could achieve something like this.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Tutorialpic6.png|600px]]&lt;br /&gt;
[[File:Tutorialpic7.png|600px]]&lt;br /&gt;
[[File:Tutorialpic8.png|600px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Gesture Recognition using the Performance Platform&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
First, download Wekinator (on your laptop; it may be installed on one of the machines in the lab in time).&lt;br /&gt;
&lt;br /&gt;
Instructions for downloading and setting up can be found here:&lt;br /&gt;
http://www.wekinator.org/downloads/&lt;br /&gt;
&lt;br /&gt;
Open Captury in the performance platform and check for calibration etc. (See other tutorials) &lt;br /&gt;
&lt;br /&gt;
To access the OSC messages from Captury you need to subscribe. You can do this with Processing by tweaking the oscP5 send/receive example. We added the line exit(); at the end of setup so that, once the connection is established, the programme automatically quits and frees up the port it used. The remote location is &amp;quot;kosmos.medien.uni-weimar.de&amp;quot; for the DBL, and the OSC message needs to be &amp;quot;/subscribe/name_of_your_skeleton/blender/Root/vector&amp;quot; (or Spine, Neck etc. instead of Root, depending on what you want to track).&lt;br /&gt;
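The subscription is just a single argument-free OSC message sent over UDP, so it can be sketched without any library. The host name and message path are the ones given above; the destination port is not stated in the text, so the send itself is left as a comment with a placeholder.

```python
# Dependency-free sketch of the subscription packet. An OSC message
# with no arguments is the address string plus the type tag string
# "," - each UTF-8 encoded, null-terminated, and padded to a multiple
# of 4 bytes, per the OSC 1.0 spec.

def osc_string(s):
    """Encode a string the OSC way: null-terminated, padded to 4 bytes."""
    b = s.encode("utf-8")
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address):
    """Build an argument-free OSC message (type tag string is ',')."""
    return osc_string(address) + osc_string(",")

packet = osc_message("/subscribe/name_of_your_skeleton/blender/Root/vector")

# To send it over UDP (PORT is a placeholder, not from the tutorial):
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(packet, ("kosmos.medien.uni-weimar.de", PORT))
```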
&lt;br /&gt;
&lt;br /&gt;
!!THE SKELETON MUST BE PRESENT WHEN YOU SUBSCRIBE AND STAY PRESENT ON THE SCREEN, OTHERWISE THE SUBSCRIPTION GETS LOST AND YOU HAVE TO SET IT UP AGAIN!!&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Tutorialpic9.png|600px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
For simplicity, you can download an example output programme from the Wekinator website. For this tutorial I will use one for Processing called ‘Simple continuous colour control’, which you can download here:&lt;br /&gt;
http://www.wekinator.org/examples/#Processing_animation_audio&lt;br /&gt;
&lt;br /&gt;
This works with one continuous parameter instead of classifiers, so it can, for example, measure how extreme a gesture is. You could also use a simple classifier like the one used in the facial expression recognition part above.&lt;br /&gt;
&lt;br /&gt;
For this tutorial I will look at the gesture of raising the right arm. I am using the playback recording of LiQianqian’s skeleton, so I tell Wekinator to listen for /LiQianqian/blender/RightForeArm/vector&lt;br /&gt;
The listening port for Wekinator is 1065, since Captury is set to send to this port, and the number of inputs is 3, since each message is a vector (x, y and z coordinates).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Tutorialpic11.png|600px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Once you have set up Wekinator, click on the button with a picture of a pen under ‘Edit’ and select just vector-3. This focuses only on the up-down movement of the arm, which is the important part of this gesture, and avoids confusing Wekinator with data about where the person is in space.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Tutorialpic10.png|600px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
With the simple continuous colour control sketch and Wekinator open, you should see that moving the sliding bar of values in Wekinator causes the sketch to change colour. Now it is time to record some examples.&lt;br /&gt;
&lt;br /&gt;
Find a place in the playback video where the right forearm is low, set the Wekinator control bar to 0 and press start recording in Wekinator. You don’t need many examples; fewer than 20 is fine. Now find a place where the arms are high, set the Wekinator bar to a high level and record again. In my example I have set the highest level to 0.7, so that high arms mean ‘blue’ and low arms mean ‘red’. Now find a place where the right arm is at a medium height and put the bar somewhere in the middle. Record again. Now you can hit ‘train’ and, when the status bar turns green, ‘run’. You should see Wekinator filling in all the gaps and making a smooth transition through the colours as the arms move. If not, you can add more examples!&lt;br /&gt;
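What the trained model ends up doing can be sketched as a simple interpolation. The red/blue endpoints and the 0.7 ceiling come from the description above, but the linear blend is an assumption about how the demo sketch shades between them.

```python
# Sketch of the continuous mapping: Wekinator smooths the recorded
# examples into one parameter, and the colour-control sketch turns it
# into a colour. Assumed: a plain linear blend from red (low arm) to
# blue (high arm), clamped at the 0.7 ceiling used in the example.

def lerp(a, b, t):
    """Linear interpolation between a and b for t in [0, 1]."""
    return a + (b - a) * t

def colour_for(value, ceiling=0.7):
    """Map a Wekinator output in [0, ceiling] to an (r, g, b) tuple."""
    t = max(0.0, min(value / ceiling, 1.0))  # clamp to [0, 1]
    return (int(lerp(255, 0, t)), 0, int(lerp(0, 255, t)))
```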
&lt;br /&gt;
Remember - don’t let the skeleton move off the screen!&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Gesture Recognition with a live skeleton&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
As above, subscribe to the desired part of your skeleton by sending a message like /subscribe/Christian/blender/RightHand/vector&lt;br /&gt;
&lt;br /&gt;
Make sure you set the same message, without subscribe, in Wekinator as the incoming message, i.e. /Christian/blender/RightHand/vector&lt;br /&gt;
&lt;br /&gt;
In the same way as above, record examples for different stages of the gesture and press train. Now you should be able to control the colour of the output programme live by moving your arms up and down.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
This was a very simple example but Wekinator can be used with any OSC input/output. There are also patches available to control Ableton. Have fun!&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Tutorialpic11.png&amp;diff=85636</id>
		<title>File:Tutorialpic11.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Tutorialpic11.png&amp;diff=85636"/>
		<updated>2016-07-29T12:39:27Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Tutorialpic10.png&amp;diff=85635</id>
		<title>File:Tutorialpic10.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Tutorialpic10.png&amp;diff=85635"/>
		<updated>2016-07-29T12:39:12Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Tutorialpic9.png&amp;diff=85634</id>
		<title>File:Tutorialpic9.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Tutorialpic9.png&amp;diff=85634"/>
		<updated>2016-07-29T12:38:48Z</updated>

		<summary type="html">&lt;p&gt;Rachel: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Tutorials/Performance_Platform/Recognizing_Gestures_with_Wekinator&amp;diff=85385</id>
		<title>GMU:Tutorials/Performance Platform/Recognizing Gestures with Wekinator</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Tutorials/Performance_Platform/Recognizing_Gestures_with_Wekinator&amp;diff=85385"/>
		<updated>2016-07-20T14:22:53Z</updated>

		<summary type="html">&lt;p&gt;Rachel: Created page with &amp;quot;First Download Wekinator and openFrameworks (you will need to also download an IDE such as Xcode in order to run openFrameworks)  Instructions for downloading and setting up both...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;First Download Wekinator and openFrameworks (you will need to also download an IDE such as Xcode in order to run openFrameworks)&lt;br /&gt;
&lt;br /&gt;
Instructions for downloading and setting up both of those can be found here:&lt;br /&gt;
http://www.wekinator.org/downloads/&lt;br /&gt;
http://openframeworks.cc/download/&lt;br /&gt;
&lt;br /&gt;
For Facial Gestures, you can use FaceOSC which runs on openFrameworks and can be downloaded here&lt;br /&gt;
https://github.com/kylemcdonald/ofxFaceTracker/releases&lt;br /&gt;
&lt;br /&gt;
This programme can be used with a webcam and tracks the position data of facial features, sending the values as OSC.&lt;br /&gt;
&lt;br /&gt;
[[File:Tutorialpic1.png]]&lt;br /&gt;
&lt;br /&gt;
Now, open Wekinator and listen for incoming OSC messages from FaceOSC. It sends 66 pairs of xy values, i.e. 132 values each frame. Set Wekinator to listen on port 8338 for the input message /raw, and choose 132 as the number of inputs. For this example I have chosen 1 output with 3 classes under a classifier output. (You can choose anything, depending on what sort of output you want.) This output will assign one of three images to three different facial expressions.&lt;br /&gt;
&lt;br /&gt;
[[File:Tutorialpic2.png]]&lt;br /&gt;
&lt;br /&gt;
In Wekinator, click next. You will now come to a screen with a dropdown menu next to outputs. Select 1, make your desired facial expression and press start recording. Try recording around 20 examples and then select 2 in the dropdown menu. Now make your second desired facial expression and record again. Do this again for option 3. Now press Train. When training is finished, the status circle will turn green and you can press run. You should notice that when you make one of your three chosen facial expressions, the dropdown menu automatically changes to 1, 2 or 3. If this is happening for the wrong expressions, you may need to add more examples. If this is the case, stop running and repeat the previous steps. Keep adding examples until you get the accuracy you want.&lt;br /&gt;
&lt;br /&gt;
[[File:Tutorialpic3.png]]&lt;br /&gt;
&lt;br /&gt;
Now for an output! Here is a simple output example I made in Processing for one classifier with three classes. Just load the sketch with three images and change &amp;quot;happy.jpg&amp;quot; etc. to match your file names.&lt;br /&gt;
&lt;br /&gt;
[[File:Tutorialpic5.png]]&lt;br /&gt;
[[File:Tutorialpic4.png]]&lt;br /&gt;
&lt;br /&gt;
IMPORTANT: Wekinator sends values as floats, not ints, so if you are writing an output sketch you will have to convert.&lt;br /&gt;
&lt;br /&gt;
If you have, for example, chosen three pictures of your dog, you could achieve something like this.&lt;br /&gt;
&lt;br /&gt;
[[File:Tutorialpic6.png]]&lt;br /&gt;
[[File:Tutorialpic7.png]]&lt;br /&gt;
[[File:Tutorialpic8.png]]&lt;/div&gt;</summary>
		<author><name>Rachel</name></author>
	</entry>
</feed>