<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Li+lingling</id>
	<title>Medien Wiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Li+lingling"/>
	<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/Special:Contributions/Li_lingling"/>
	<updated>2026-04-04T07:23:48Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.39.6</generator>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Processing%2Bpuredata.zip&amp;diff=55491</id>
		<title>File:Processing+puredata.zip</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Processing%2Bpuredata.zip&amp;diff=55491"/>
		<updated>2013-03-31T21:09:53Z</updated>

		<summary type="html">&lt;p&gt;Li lingling: uploaded a new version of &amp;amp;quot;File:Processing+puredata.zip&amp;amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Li lingling</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Processing%2Bpuredata.zip&amp;diff=55471</id>
		<title>File:Processing+puredata.zip</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Processing%2Bpuredata.zip&amp;diff=55471"/>
		<updated>2013-03-31T20:42:37Z</updated>

		<summary type="html">&lt;p&gt;Li lingling: uploaded a new version of &amp;amp;quot;File:Processing+puredata.zip&amp;amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Li lingling</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Touch_sensor4.jpg&amp;diff=55470</id>
		<title>File:Touch sensor4.jpg</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Touch_sensor4.jpg&amp;diff=55470"/>
		<updated>2013-03-31T20:40:50Z</updated>

		<summary type="html">&lt;p&gt;Li lingling: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Li lingling</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Dataflow_I_WS12/Li_Lingling&amp;diff=55464</id>
		<title>GMU:Dataflow I WS12/Li Lingling</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Dataflow_I_WS12/Li_Lingling&amp;diff=55464"/>
		<updated>2013-03-31T20:33:42Z</updated>

		<summary type="html">&lt;p&gt;Li lingling: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Touch Sensor ==&lt;br /&gt;
I work on the interaction between objects and sound. I will connect the object to an Arduino and turn the Arduino into a touch sensor. &lt;br /&gt;
First I will collect the information in Processing. Then the messages will be sent to Pure Data through OSC.&lt;br /&gt;
==Description==&lt;br /&gt;
My concept comes from a project I found online called &amp;quot;Touche for Arduino: Advanced touch sensing&amp;quot;. In my project I combine Processing and Pure Data, using Processing as a medium to send data from the Arduino to Pure Data. In Processing, the information from the Arduino is classified. Then I use the classified information in Pure Data to define different actions. In Pure Data I will create a Gem object and then use the data to move it. In the end it will work like a &amp;quot;maze&amp;quot; game.&lt;br /&gt;
[[File:touch sensor3.jpg]]&lt;br /&gt;
==Working Process==&lt;br /&gt;
First I build a circuit on a breadboard myself to turn the Arduino into a touch sensor. &lt;br /&gt;
[[File:touch sensor2.jpg]]&lt;br /&gt;
[[File:touch sensor1.jpg]]&lt;br /&gt;
In my project I can get different data from the Arduino into Processing when I touch the &amp;quot;touch sensor&amp;quot;. Then I define different gestures. In my case I have four gestures.&lt;br /&gt;
Gesture 1: No touching.&lt;br /&gt;
Gesture 2: One finger touching.&lt;br /&gt;
Gesture 3: Grab.&lt;br /&gt;
Gesture 4: Put a finger in the water.&lt;br /&gt;
[[File:touch sensor4.jpg]]&lt;br /&gt;
Because I want to use the different gestures to control the Gem object in Pure Data, I name them &amp;quot;Go right&amp;quot;, &amp;quot;Go up&amp;quot;, &amp;quot;Go left&amp;quot; and &amp;quot;Go down&amp;quot;.&lt;br /&gt;
&amp;lt;videoflash type=&amp;quot;vimeo&amp;quot;&amp;gt;63046775&amp;lt;/videoflash&amp;gt; &lt;br /&gt;
In Processing each gesture is stored as a number.&lt;br /&gt;
Gesture 1 is &amp;quot;0&amp;quot;.&lt;br /&gt;
Gesture 2 is &amp;quot;1&amp;quot;.&lt;br /&gt;
Gesture 3 is &amp;quot;2&amp;quot;.&lt;br /&gt;
Gesture 4 is &amp;quot;3&amp;quot;.&lt;br /&gt;
Then I send the gesture number to Pure Data through OSC, so in Pure Data I map each number to a different movement.&lt;br /&gt;
&amp;quot;0&amp;quot; is &amp;quot;Go left&amp;quot;.&lt;br /&gt;
&amp;quot;1&amp;quot; is &amp;quot;Go up&amp;quot;.&lt;br /&gt;
&amp;quot;2&amp;quot; is &amp;quot;Go right&amp;quot;.&lt;br /&gt;
&amp;quot;3&amp;quot; is &amp;quot;Go down&amp;quot;.&lt;br /&gt;
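The step above (sending one gesture number per OSC message) can be sketched in plain Python; this is only an illustrative sketch, since the project itself uses Processing, and the address "/gesture" and port 9000 are assumptions rather than values from the original patch.&lt;br /&gt;

```python
# Minimal sketch of packing a single-int OSC message over UDP,
# as a gesture number might travel from Processing to Pure Data.
# Address "/gesture" and port 9000 are assumed, not from the patch.
import socket
import struct

def osc_pad(raw):
    # OSC strings are null-terminated, then padded to a 4-byte boundary.
    raw = raw + b"\x00"
    while len(raw) % 4:
        raw = raw + b"\x00"
    return raw

def osc_int_message(address, value):
    # Type tag ",i" declares one 32-bit big-endian integer argument.
    return osc_pad(address.encode()) + osc_pad(b",i") + struct.pack(">i", value)

def send_gesture(value, host="127.0.0.1", port=9000):
    # One datagram per gesture change; Pure Data would listen on this port.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(osc_int_message("/gesture", value), (host, port))
    sock.close()
```

In Pure Data the matching receiver would route on the same address and feed the integer into the movement logic.&lt;br /&gt;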
&amp;lt;videoflash type=&amp;quot;vimeo&amp;quot;&amp;gt;63048325&amp;lt;/videoflash&amp;gt; &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:processing+puredata.zip]]&lt;/div&gt;</summary>
		<author><name>Li lingling</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Dataflow_I_WS12/Li_Lingling&amp;diff=55463</id>
		<title>GMU:Dataflow I WS12/Li Lingling</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Dataflow_I_WS12/Li_Lingling&amp;diff=55463"/>
		<updated>2013-03-31T20:31:52Z</updated>

		<summary type="html">&lt;p&gt;Li lingling: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Touch Sensor ==&lt;br /&gt;
I work on the interaction between objects and sound. I will connect the object to an Arduino and turn the Arduino into a touch sensor. &lt;br /&gt;
First I will collect the information in Processing. Then the messages will be sent to Pure Data through OSC.&lt;br /&gt;
==Description==&lt;br /&gt;
My concept comes from a project I found online called &amp;quot;Touche for Arduino: Advanced touch sensing&amp;quot;. In my project I combine Processing and Pure Data, using Processing as a medium to send data from the Arduino to Pure Data. In Processing, the information from the Arduino is classified. Then I use the classified information in Pure Data to define different actions. In Pure Data I will create a Gem object and then use the data to move it. In the end it will work like a &amp;quot;maze&amp;quot; game.&lt;br /&gt;
[[File:touch sensor3.jpg]]&lt;br /&gt;
==Working Process==&lt;br /&gt;
First I build a circuit on a breadboard myself to turn the Arduino into a touch sensor. &lt;br /&gt;
[[File:touch sensor2.jpg]]&lt;br /&gt;
[[File:touch sensor1.jpg]]&lt;br /&gt;
In my project I can get different data from the Arduino into Processing when I touch the &amp;quot;touch sensor&amp;quot;. Then I define different gestures. In my case I have four gestures.&lt;br /&gt;
Gesture 1: No touching.&lt;br /&gt;
Gesture 2: One finger touching.&lt;br /&gt;
Gesture 3: Grab.&lt;br /&gt;
Gesture 4: Put a finger in the water.&lt;br /&gt;
[[File:touch sensor4.jpg]]&lt;br /&gt;
Because I want to use the different gestures to control the Gem object in Pure Data, I name them &amp;quot;Go right&amp;quot;, &amp;quot;Go up&amp;quot;, &amp;quot;Go left&amp;quot; and &amp;quot;Go down&amp;quot;.&lt;br /&gt;
In Processing each gesture is stored as a number.&lt;br /&gt;
Gesture 1 is &amp;quot;0&amp;quot;.&lt;br /&gt;
Gesture 2 is &amp;quot;1&amp;quot;.&lt;br /&gt;
Gesture 3 is &amp;quot;2&amp;quot;.&lt;br /&gt;
Gesture 4 is &amp;quot;3&amp;quot;.&lt;br /&gt;
Then I send the gesture number to Pure Data through OSC, so in Pure Data I map each number to a different movement.&lt;br /&gt;
&amp;quot;0&amp;quot; is &amp;quot;Go left&amp;quot;.&lt;br /&gt;
&amp;quot;1&amp;quot; is &amp;quot;Go up&amp;quot;.&lt;br /&gt;
&amp;quot;2&amp;quot; is &amp;quot;Go right&amp;quot;.&lt;br /&gt;
&amp;quot;3&amp;quot; is &amp;quot;Go down&amp;quot;.&lt;br /&gt;
&amp;lt;videoflash type=&amp;quot;vimeo&amp;quot;&amp;gt;63048325&amp;lt;/videoflash&amp;gt; &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;videoflash type=&amp;quot;vimeo&amp;quot;&amp;gt;63046775&amp;lt;/videoflash&amp;gt; &lt;br /&gt;
[[File:processing+puredata.zip]]&lt;/div&gt;</summary>
		<author><name>Li lingling</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Dataflow_I_WS12/Li_Lingling&amp;diff=55461</id>
		<title>GMU:Dataflow I WS12/Li Lingling</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Dataflow_I_WS12/Li_Lingling&amp;diff=55461"/>
		<updated>2013-03-31T20:30:39Z</updated>

		<summary type="html">&lt;p&gt;Li lingling: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Touch Sensor ==&lt;br /&gt;
I work on the interaction between objects and sound. I will connect the object to an Arduino and turn the Arduino into a touch sensor. &lt;br /&gt;
First I will collect the information in Processing. Then the messages will be sent to Pure Data through OSC.&lt;br /&gt;
==Description==&lt;br /&gt;
My concept comes from a project I found online called &amp;quot;Touche for Arduino: Advanced touch sensing&amp;quot;. In my project I combine Processing and Pure Data, using Processing as a medium to send data from the Arduino to Pure Data. In Processing, the information from the Arduino is classified. Then I use the classified information in Pure Data to define different actions. In Pure Data I will create a Gem object and then use the data to move it. In the end it will work like a &amp;quot;maze&amp;quot; game.&lt;br /&gt;
[[File:touch sensor3.jpg]]&lt;br /&gt;
==Working Process==&lt;br /&gt;
First I build a circuit on a breadboard myself to turn the Arduino into a touch sensor. &lt;br /&gt;
[[File:touch sensor2.jpg]]&lt;br /&gt;
[[File:touch sensor1.jpg]]&lt;br /&gt;
In my project I can get different data from the Arduino into Processing when I touch the &amp;quot;touch sensor&amp;quot;. Then I define different gestures. In my case I have four gestures. &lt;br /&gt;
Gesture 1: No touching.&lt;br /&gt;
Gesture 2: One finger touching.&lt;br /&gt;
Gesture 3: Grab.&lt;br /&gt;
Gesture 4: Put a finger in the water.&lt;br /&gt;
[[File:touch sensor4.jpg]]&lt;br /&gt;
Because I want to use the different gestures to control the Gem object in Pure Data, I name them &amp;quot;Go right&amp;quot;, &amp;quot;Go up&amp;quot;, &amp;quot;Go left&amp;quot; and &amp;quot;Go down&amp;quot;.&lt;br /&gt;
In Processing each gesture is stored as a number.&lt;br /&gt;
Gesture 1 is &amp;quot;0&amp;quot;.&lt;br /&gt;
Gesture 2 is &amp;quot;1&amp;quot;.&lt;br /&gt;
Gesture 3 is &amp;quot;2&amp;quot;.&lt;br /&gt;
Gesture 4 is &amp;quot;3&amp;quot;.&lt;br /&gt;
Then I send the gesture number to Pure Data through OSC, so in Pure Data I map each number to a different movement.&lt;br /&gt;
&amp;quot;0&amp;quot; is &amp;quot;Go left&amp;quot;.&lt;br /&gt;
&amp;quot;1&amp;quot; is &amp;quot;Go up&amp;quot;.&lt;br /&gt;
&amp;quot;2&amp;quot; is &amp;quot;Go right&amp;quot;.&lt;br /&gt;
&amp;quot;3&amp;quot; is &amp;quot;Go down&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;videoflash type=&amp;quot;vimeo&amp;quot;&amp;gt;63046775&amp;lt;/videoflash&amp;gt; &lt;br /&gt;
[[File:processing+puredata.zip]]&lt;/div&gt;</summary>
		<author><name>Li lingling</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Dataflow_I_WS12/Li_Lingling&amp;diff=55453</id>
		<title>GMU:Dataflow I WS12/Li Lingling</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Dataflow_I_WS12/Li_Lingling&amp;diff=55453"/>
		<updated>2013-03-31T20:05:52Z</updated>

		<summary type="html">&lt;p&gt;Li lingling: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Touch Sensor ==&lt;br /&gt;
I work on the interaction between objects and sound. I will connect the object to an Arduino and turn the Arduino into a touch sensor. &lt;br /&gt;
First I will collect the information in Processing. Then the messages will be sent to Pure Data through OSC.&lt;br /&gt;
==Description==&lt;br /&gt;
My concept comes from a project I found online called &amp;quot;Touche for Arduino: Advanced touch sensing&amp;quot;. In my project I combine Processing and Pure Data, using Processing as a medium to send data from the Arduino to Pure Data. In Processing, the information from the Arduino is classified. Then I use the classified information in Pure Data to define different actions. In Pure Data I will create a Gem object and then use the data to move it. In the end it will work like a &amp;quot;maze&amp;quot; game.&lt;br /&gt;
[[File:touch sensor3.jpg]]&lt;br /&gt;
==Working Process==&lt;br /&gt;
First I build a circuit on a breadboard myself to turn the Arduino into a touch sensor. &lt;br /&gt;
[[File:touch sensor2.jpg]]&lt;br /&gt;
[[File:touch sensor1.jpg]]&lt;br /&gt;
In my project I can get different data from the Arduino into Processing when I touch the &amp;quot;touch sensor&amp;quot;. Then I define different gestures. In my case the information is divided into four categories.&lt;br /&gt;
I want to use the different gestures to control the Gem object in Pure Data. &lt;br /&gt;
&amp;lt;videoflash type=&amp;quot;vimeo&amp;quot;&amp;gt;63046775&amp;lt;/videoflash&amp;gt; &lt;br /&gt;
[[File:processing+puredata.zip]]&lt;/div&gt;</summary>
		<author><name>Li lingling</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Touch_sensor3.jpg&amp;diff=55452</id>
		<title>File:Touch sensor3.jpg</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Touch_sensor3.jpg&amp;diff=55452"/>
		<updated>2013-03-31T20:05:29Z</updated>

		<summary type="html">&lt;p&gt;Li lingling: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Li lingling</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Dataflow_I_WS12/Li_Lingling&amp;diff=55448</id>
		<title>GMU:Dataflow I WS12/Li Lingling</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Dataflow_I_WS12/Li_Lingling&amp;diff=55448"/>
		<updated>2013-03-31T20:02:23Z</updated>

		<summary type="html">&lt;p&gt;Li lingling: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Touch Sensor ==&lt;br /&gt;
I work on the interaction between objects and sound. I will connect the object to an Arduino and turn the Arduino into a touch sensor. &lt;br /&gt;
First I will collect the information in Processing. Then the messages will be sent to Pure Data through OSC.&lt;br /&gt;
==Description==&lt;br /&gt;
My concept comes from a project I found online called &amp;quot;Touche for Arduino: Advanced touch sensing&amp;quot;. In my project I combine Processing and Pure Data, using Processing as a medium to send data from the Arduino to Pure Data. In Processing, the information from the Arduino is classified. Then I use the classified information in Pure Data to define different actions. In Pure Data I will create a Gem object and then use the data to move it. In the end it will work like a &amp;quot;maze&amp;quot; game.&lt;br /&gt;
 &lt;br /&gt;
==Working Process==&lt;br /&gt;
First I build a circuit on a breadboard myself to turn the Arduino into a touch sensor. &lt;br /&gt;
[[File:touch sensor2.jpg]]&lt;br /&gt;
[[File:touch sensor1.jpg]]&lt;br /&gt;
In my project I can get different data from the Arduino into Processing when I touch the &amp;quot;touch sensor&amp;quot;. Then I define different gestures. In my case the information is divided into four categories.&lt;br /&gt;
I want to use the different gestures to control the Gem object in Pure Data. &lt;br /&gt;
&amp;lt;videoflash type=&amp;quot;vimeo&amp;quot;&amp;gt;63046775&amp;lt;/videoflash&amp;gt; &lt;br /&gt;
[[File:processing+puredata.zip]]&lt;/div&gt;</summary>
		<author><name>Li lingling</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Dataflow_I_WS12/Li_Lingling&amp;diff=55446</id>
		<title>GMU:Dataflow I WS12/Li Lingling</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Dataflow_I_WS12/Li_Lingling&amp;diff=55446"/>
		<updated>2013-03-31T20:01:52Z</updated>

		<summary type="html">&lt;p&gt;Li lingling: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Touch Sensor ==&lt;br /&gt;
I work on the interaction between objects and sound. I will connect the object to an Arduino and turn the Arduino into a touch sensor. &lt;br /&gt;
First I will collect the information in Processing. Then the messages will be sent to Pure Data through OSC.&lt;br /&gt;
==Description==&lt;br /&gt;
My concept comes from a project I found online called &amp;quot;Touche for Arduino: Advanced touch sensing&amp;quot;. In my project I combine Processing and Pure Data, using Processing as a medium to send data from the Arduino to Pure Data. In Processing, the information from the Arduino is classified. Then I use the classified information in Pure Data to define different actions. In Pure Data I will create a Gem object and then use the data to move it. In the end it will work like a &amp;quot;maze&amp;quot; game.&lt;br /&gt;
 &lt;br /&gt;
==Working Process==&lt;br /&gt;
First I build a circuit on a breadboard myself to turn the Arduino into a touch sensor. &lt;br /&gt;
[[File:touch sensor2.jpg]]&lt;br /&gt;
[[File:touch sensor1.jpg]]&lt;br /&gt;
In my project I can get different data from the Arduino into Processing when I touch the &amp;quot;touch sensor&amp;quot;. Then I define different gestures. In my case the information is divided into four categories.&lt;br /&gt;
I want to use the different gestures to control the Gem object in Pure Data. &lt;br /&gt;
&amp;lt;videoflash type=&amp;quot;vimeo&amp;quot;&amp;gt;53115436&amp;lt;/videoflash&amp;gt; &lt;br /&gt;
[[File:processing+puredata.zip]]&lt;/div&gt;</summary>
		<author><name>Li lingling</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Dataflow_I_WS12/Li_Lingling&amp;diff=55427</id>
		<title>GMU:Dataflow I WS12/Li Lingling</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Dataflow_I_WS12/Li_Lingling&amp;diff=55427"/>
		<updated>2013-03-31T19:35:15Z</updated>

		<summary type="html">&lt;p&gt;Li lingling: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Touch Sensor ==&lt;br /&gt;
I work on the interaction between objects and sound. I will connect the object to an Arduino and turn the Arduino into a touch sensor. &lt;br /&gt;
First I will collect the information in Processing. Then the messages will be sent to Pure Data through OSC.&lt;br /&gt;
==Description==&lt;br /&gt;
My concept comes from a project I found online called &amp;quot;Touche for Arduino: Advanced touch sensing&amp;quot;. Then I built a circuit myself to turn the Arduino into a touch sensor. &lt;br /&gt;
[[File:touch sensor2.jpg]]&lt;br /&gt;
[[File:touch sensor1.jpg]]&lt;br /&gt;
[[File:processing+puredata.zip]]&lt;/div&gt;</summary>
		<author><name>Li lingling</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Dataflow_I_WS12/Li_Lingling&amp;diff=55413</id>
		<title>GMU:Dataflow I WS12/Li Lingling</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Dataflow_I_WS12/Li_Lingling&amp;diff=55413"/>
		<updated>2013-03-31T19:24:22Z</updated>

		<summary type="html">&lt;p&gt;Li lingling: /* Description */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Touch Sensor ==&lt;br /&gt;
I work on the interaction between objects and sound. I will connect the object to an Arduino and turn the Arduino into a touch sensor. &lt;br /&gt;
First I will collect the information in Processing. Then the messages will be sent to Pure Data through OSC.&lt;br /&gt;
==Description==&lt;br /&gt;
[[File:touch sensor2.jpg]]&lt;br /&gt;
[[File:touch sensor1.jpg]]&lt;br /&gt;
[[File:processing+puredata.zip]]&lt;/div&gt;</summary>
		<author><name>Li lingling</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Touch_sensor1.jpg&amp;diff=55411</id>
		<title>File:Touch sensor1.jpg</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Touch_sensor1.jpg&amp;diff=55411"/>
		<updated>2013-03-31T19:23:39Z</updated>

		<summary type="html">&lt;p&gt;Li lingling: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Li lingling</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Touch_sensor2.jpg&amp;diff=55410</id>
		<title>File:Touch sensor2.jpg</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Touch_sensor2.jpg&amp;diff=55410"/>
		<updated>2013-03-31T19:22:39Z</updated>

		<summary type="html">&lt;p&gt;Li lingling: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Li lingling</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Dataflow_I_WS12/Li_Lingling&amp;diff=55407</id>
		<title>GMU:Dataflow I WS12/Li Lingling</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Dataflow_I_WS12/Li_Lingling&amp;diff=55407"/>
		<updated>2013-03-31T19:18:32Z</updated>

		<summary type="html">&lt;p&gt;Li lingling: /* Arduino sensor */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Touch Sensor ==&lt;br /&gt;
I work on the interaction between objects and sound. I will connect the object to an Arduino and turn the Arduino into a touch sensor. &lt;br /&gt;
First I will collect the information in Processing. Then the messages will be sent to Pure Data through OSC.&lt;br /&gt;
==Description==&lt;br /&gt;
[[File:touch sensor1.jpg]]&lt;br /&gt;
[[File:processing+puredata.zip]]&lt;/div&gt;</summary>
		<author><name>Li lingling</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Montepulciano/li_lingling&amp;diff=46910</id>
		<title>GMU:Montepulciano/li lingling</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Montepulciano/li_lingling&amp;diff=46910"/>
		<updated>2012-11-08T22:53:10Z</updated>

		<summary type="html">&lt;p&gt;Li lingling: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;videoflash type=&amp;quot;vimeo&amp;quot;&amp;gt;53115436&amp;lt;/videoflash&amp;gt;&lt;br /&gt;
&amp;lt;videoflash type=&amp;quot;vimeo&amp;quot;&amp;gt;53115394&amp;lt;/videoflash&amp;gt;&lt;br /&gt;
&amp;lt;videoflash type=&amp;quot;vimeo&amp;quot;&amp;gt;53116988&amp;lt;/videoflash&amp;gt;&lt;/div&gt;</summary>
		<author><name>Li lingling</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Montepulciano/li_lingling&amp;diff=46909</id>
		<title>GMU:Montepulciano/li lingling</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Montepulciano/li_lingling&amp;diff=46909"/>
		<updated>2012-11-08T22:38:47Z</updated>

		<summary type="html">&lt;p&gt;Li lingling: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;videoflash type=&amp;quot;vimeo&amp;quot;&amp;gt;53115436&amp;lt;/videoflash&amp;gt;&lt;br /&gt;
&amp;lt;videoflash type=&amp;quot;vimeo&amp;quot;&amp;gt;53115394&amp;lt;/videoflash&amp;gt;&lt;/div&gt;</summary>
		<author><name>Li lingling</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Montepulciano/li_lingling&amp;diff=46908</id>
		<title>GMU:Montepulciano/li lingling</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Montepulciano/li_lingling&amp;diff=46908"/>
		<updated>2012-11-08T22:37:25Z</updated>

		<summary type="html">&lt;p&gt;Li lingling: Created page with &amp;quot;&amp;lt;videoflash type=&amp;quot;vimeo&amp;quot;&amp;gt;53115436&amp;lt;/videoflash&amp;gt;&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;videoflash type=&amp;quot;vimeo&amp;quot;&amp;gt;53115436&amp;lt;/videoflash&amp;gt;&lt;/div&gt;</summary>
		<author><name>Li lingling</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Montepulciano&amp;diff=46906</id>
		<title>GMU:Montepulciano</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Montepulciano&amp;diff=46906"/>
		<updated>2012-11-08T22:13:16Z</updated>

		<summary type="html">&lt;p&gt;Li lingling: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[:Category:Werkmodul|Werkmodul]]/[[:Category:Fachmodul|Fachmodul]]&amp;lt;br /&amp;gt;&lt;br /&gt;
&#039;&#039;Lecturers:&#039;&#039; [[Max Neupert]]&amp;lt;br /&amp;gt;&lt;br /&gt;
&#039;&#039;Credits:&#039;&#039; 6 [[ECTS]], 4 [[SWS]]&amp;lt;br /&amp;gt;&lt;br /&gt;
&#039;&#039;Date:&#039;&#039; Thursday, 13:30 until 16:45 h &amp;lt;br /&amp;gt;&lt;br /&gt;
&#039;&#039;Venue:&#039;&#039; [[Marienstraße 7b]], Room 204&amp;lt;br /&amp;gt;&lt;br /&gt;
&#039;&#039;First meeting:&#039;&#039; 19.4.&lt;br /&gt;
&lt;br /&gt;
==Description==&lt;br /&gt;
The module focuses on the impact of algorithms on urban space.&lt;br /&gt;
Part of the class is an excursion to Montepulciano, Italy, 3.-10.6. There we&#039;ll be working together with students of the Academy of Media Arts Cologne on our projects. Georg Trogemann and Lasse Scherffig of the KHM, as well as Ursula Damm and Bernhard Hopfengärtner, will be joining us.&lt;br /&gt;
&lt;br /&gt;
===German description===&lt;br /&gt;
Das Modul beschäftigt sich mit Algorithmen und ihrer Auswirkung auf den Stadtraum. &lt;br /&gt;
&lt;br /&gt;
Teil des Moduls ist eine Exkursion nach Montepulciano, Italien vom 3.-10.6. Dort werden wir gemeinsam mit Studierenden der Kunsthochschule für Medien, Köln an Projekten arbeiten. Georg Trogemann und Lasse Scherffig von der KHM, sowie Ursula Damm und Bernhard Hopfengärtner sind ebenfalls mit dabei.&lt;br /&gt;
&lt;br /&gt;
==Assignments==&lt;br /&gt;
* [[/Ludocity|Games and the city]]&lt;br /&gt;
* [[/Julia/]]&lt;br /&gt;
* [[/li lingling/]]&lt;br /&gt;
&lt;br /&gt;
==Assignment==&lt;br /&gt;
Active participation, presentation, artistic examination, documentation, edits in the wiki.&lt;br /&gt;
&lt;br /&gt;
==Eligible participants==&lt;br /&gt;
Graduates enrolled in the Faculties of Media and Gestaltung, and in the MediaArchitecture program&lt;br /&gt;
&lt;br /&gt;
==Syllabus==&lt;br /&gt;
Termine des Semesters&lt;br /&gt;
# 19.4. First meeting with Bernhard Hopfengärtner&lt;br /&gt;
# 26.4. Ludic City&lt;br /&gt;
# 3.5.&lt;br /&gt;
# 10.5. cancelled (Tampere)&lt;br /&gt;
# 17.5. cancelled (Christi Himmelfahrt)&lt;br /&gt;
# 24.5. Maps and finding a way in the city&lt;br /&gt;
# 31.5.&lt;br /&gt;
# 2.-10.6. Excursion&lt;br /&gt;
# 14.6.&lt;br /&gt;
# 21.6. cancelled (Istanbul)&lt;br /&gt;
# 28.6.&lt;br /&gt;
# 5.7.&lt;br /&gt;
# 12.7.&lt;br /&gt;
&lt;br /&gt;
==Literature==&lt;br /&gt;
[http://www.uni-weimar.de/cms/universitaet/zentrale-einrichtungen/universitaetsbibliothek/recherche/semesterapparate/semesterapparate-medien/neu-medien-semesterapparate/ms-2012-35.html Semesterapparat]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
* Busse, Klaus-Peter: Vom Bild zum Ort: Mapping lernen ISBN 978-3-8334-7031-8&lt;br /&gt;
* Bianchi, Paolo: Atlas : 6. Juni - 11. Juli 1997 Offenes Kulturhaus Linz ISBN 3-85132-162-6&lt;br /&gt;
* Thompson, Nato: Experimental geography : radical approaches to landscape, cartography, and urbanism ISBN 978-0-09-163658-6&lt;br /&gt;
* Harmon, Katharine A.: The map as art ISBN 978-1-568-98762-0&lt;br /&gt;
* Harmon, Katharine A.: You are here ISBN 1-568-98430-8&lt;br /&gt;
* Harzinski, Kris: From Here to There: A Curious Collection from the Hand Drawn Map ISBN 978-1568988825&lt;br /&gt;
* Jacobs, Frank: Strange Maps: An Atlas of Cartographic Curiosities ISBN 978-0142005255&lt;br /&gt;
* Koch, Tom: Disease Maps: Epidemics on the Ground ISBN 9780226449357&lt;br /&gt;
* MacEachren, Alan M.: How maps work : representation, visualization, and design ISBN 0-89862-589-0&lt;br /&gt;
* Möntmann, Nina: Mapping a City ISBN 3-7757-1442-1&lt;br /&gt;
* Ramm, Frederik: OpenStreetMap : die freie Weltkarte nutzen und mitgestalten ISBN 978-1-906860-11-0&lt;br /&gt;
* Rosenberg, Daniel: Cartographies of time ISBN 978-1-568-98763-7&lt;br /&gt;
* Schalansky, Judith: Atlas der abgelegenen Inseln ISBN 978-3-86648-117-6&lt;br /&gt;
* Die Sehnsucht des Kartografen : Franz Ackermann ISBN 3-00-012475-6&lt;br /&gt;
* Turchi, Peter : Maps of the Imagination: The Writer as Cartographer ISBN 978-1595340412&lt;br /&gt;
* Caquard, Sébastien: Mapping environmental issues in the city : arts and cartography cross perspectives ISBN 978-3-642-22440-9&lt;br /&gt;
* Journal of Aesthetics and Protest Press: An Atlas of Radical Cartography ISBN 978-0979137723&lt;br /&gt;
* Casey, Edward: Earth-mapping: artists reshaping landscape ISBN 0-8166-4332-6&lt;br /&gt;
&lt;br /&gt;
==Links==&lt;br /&gt;
* [https://mg.medien.uni-weimar.de/mailman/listinfo/montepulciano Mailinglist]&lt;br /&gt;
* [http://www.palazzoricci.com Palazzo Ricci, Montepulciano]&lt;br /&gt;
* Gaspar Battha: Allure de Wonderland&lt;br /&gt;
* Max Neupert: [http://medienkunst.burg-halle.de/km/proj_max2.html Perspektive]&lt;br /&gt;
* [http://www.grandin.com Temple Grandin]&lt;br /&gt;
* [http://www.julia-stoschek-collection.net/typo3temp/pics/8be560e791.jpg Jon Kessler: Heavenʼs Gate]&lt;br /&gt;
&lt;br /&gt;
==Map==&lt;br /&gt;
{{&lt;br /&gt;
#display_map:&lt;br /&gt;
43.092101, 11.78148&lt;br /&gt;
| service=openlayers&lt;br /&gt;
| layers=osm-mapnik,osmarender,osm-cyclemap,osm-oepnv,bing&lt;br /&gt;
| zoom=14&lt;br /&gt;
}}&lt;/div&gt;</summary>
		<author><name>Li lingling</name></author>
	</entry>
</feed>