<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Tiji3841</id>
	<title>Medien Wiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Tiji3841"/>
	<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/Special:Contributions/Tiji3841"/>
	<updated>2026-04-04T14:33:01Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.39.6</generator>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction/Etienne_Colin&amp;diff=104375</id>
		<title>GMU:Performance Platform Introduction/Etienne Colin</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction/Etienne_Colin&amp;diff=104375"/>
		<updated>2019-02-14T10:46:08Z</updated>

		<summary type="html">&lt;p&gt;Tiji3841: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
My name is Etienne COLIN. I am currently pursuing a Master&#039;s degree in Image and Sound Sciences, Arts and Techniques, with a specialisation in sound engineering, at Aix-Marseille University (France).&lt;br /&gt;
&lt;br /&gt;
The Performance Platform of the Bauhaus University was a great opportunity for me to discover new working methods in immersive and interactive media. I had the opportunity to be introduced to Max, Pure Data and OSC.&lt;br /&gt;
&lt;br /&gt;
I was eager to familiarize myself with the state-of-the-art equipment of the Performance Platform in order to experiment with interactions between a performer and a video, and especially with their potential regarding sound.&lt;br /&gt;
The aim of my experiment was to generate and control sounds through body movements, using the Captury Live tracking system.&lt;br /&gt;
&lt;br /&gt;
The experiment was a real challenge, because I had to quickly familiarize myself with how Max and the Captury Live system work, and with OSC data, which I had never dealt with before.&lt;br /&gt;
&lt;br /&gt;
Most of the difficulties I encountered were due to my limited knowledge of how Max works, and especially of how to generate sound: I did not want to use recorded sounds, but rather to generate them directly in the Max patch.&lt;br /&gt;
As a result, I could only generate sounds from basic oscillators or noise generators. Because the possible controls on these sources were very limited, I decided that the body&#039;s movements would instead act on the effects applied to these basic signals.&lt;br /&gt;
&lt;br /&gt;
I thus established that the Root position (x coordinate) in the space would control the carrier frequency of a ring modulation, and that the Head position (x coordinate) would control the center frequency of a notch filter. In that way, when the performer moves on the platform, frequency modulation and a sweep effect result.&lt;br /&gt;
Besides, the elevation of the left arm (y coordinate) controls the speed of a metronome that triggers a fixed frequency shaped by an envelope generator. The elevation of the right arm (y coordinate) controls the feedback of a delay applied to the same source: when the arm is down there is no delay (feedback = 0); when the arm is fully raised the feedback goes up to 0.7. Combining the interactions of both arms was meant to feel playful for the performer, who literally conducts the &amp;quot;instrument&amp;quot; like a conductor.&lt;br /&gt;
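The mapping above can be sketched in a few lines of Python. This is an illustrative sketch, not the actual Max patch: the coordinate ranges and frequency ranges are assumptions (only the 0.7 feedback ceiling is stated in the text), and the function names are hypothetical.&lt;br /&gt;

```python
# Sketch of the body-to-sound parameter mapping described above.
# Coordinate and frequency ranges are illustrative assumptions;
# only the delay-feedback ceiling of 0.7 comes from the text.

def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

def map_tracking_to_params(root_x, head_x, left_arm_y, right_arm_y):
    """Turn tracked joint coordinates (assumed ranges in metres)
    into the four synthesis parameters used in the patch."""
    return {
        # Root x position drives the ring-modulation carrier frequency
        "ring_carrier_hz": scale(root_x, -2.0, 2.0, 100.0, 1000.0),
        # Head x position sweeps the notch-filter center frequency
        "notch_center_hz": scale(head_x, -2.0, 2.0, 200.0, 4000.0),
        # Raising the left arm speeds up the metronome (shorter interval)
        "metro_interval_ms": scale(left_arm_y, 0.0, 2.0, 1000.0, 100.0),
        # Right arm down: feedback 0; fully raised: feedback 0.7
        "delay_feedback": scale(right_arm_y, 0.0, 2.0, 0.0, 0.7),
    }
```

In the real setup these values would arrive per frame as OSC messages from the tracking system and be routed to the corresponding effect parameters in the patch.&lt;br /&gt;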
&lt;br /&gt;
[[File:subscriber_EColin_1.jpg]]&lt;br /&gt;
&lt;br /&gt;
I must admit that I was frankly disappointed with the final result, but I still want to go further in this direction, because the possibilities offered by Max are really impressive and have made me want to devote myself seriously to it.&lt;/div&gt;</summary>
		<author><name>Tiji3841</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction/Etienne_Colin&amp;diff=104374</id>
		<title>GMU:Performance Platform Introduction/Etienne Colin</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction/Etienne_Colin&amp;diff=104374"/>
		<updated>2019-02-14T10:43:59Z</updated>

		<summary type="html">&lt;p&gt;Tiji3841: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
My name is Etienne COLIN. I am currently pursuing a Master&#039;s degree in Image and Sound Sciences, Arts and Techniques, with a specialisation in sound engineering, at Aix-Marseille University (France).&lt;br /&gt;
&lt;br /&gt;
The Performance Platform of the Bauhaus University was a great opportunity for me to discover new working methods in immersive and interactive media. I had the opportunity to be introduced to Max, Pure Data and OSC.&lt;br /&gt;
&lt;br /&gt;
I was eager to familiarize myself with the state-of-the-art equipment of the Performance Platform in order to experiment with interactions between a performer and a video, and especially with their potential regarding sound.&lt;br /&gt;
The aim of my experiment was to generate and control sounds through body movements, using the Captury Live tracking system.&lt;br /&gt;
&lt;br /&gt;
The experiment was a real challenge, because I had to quickly familiarize myself with how Max and the Captury Live system work, and with OSC data, which I had never dealt with before.&lt;br /&gt;
&lt;br /&gt;
Most of the difficulties I encountered were due to my limited knowledge of how Max works, and especially of how to generate sound: I did not want to use recorded sounds, but rather to generate them directly in the Max patch.&lt;br /&gt;
As a result, I could only generate sounds from basic oscillators or noise generators. Because the possible controls on these sources were very limited, I decided that the body&#039;s movements would instead act on the effects applied to these basic signals.&lt;br /&gt;
&lt;br /&gt;
I thus established that the Root position (x coordinate) in the space would control the carrier frequency of a ring modulation, and that the Head position (x coordinate) would control the center frequency of a notch filter. In that way, when the performer moves on the platform, frequency modulation and a sweep effect result.&lt;br /&gt;
Besides, the elevation of the left arm (y coordinate) controls the speed of a metronome that triggers a fixed frequency shaped by an envelope generator. The elevation of the right arm (y coordinate) controls the feedback of a delay applied to the same source: when the arm is down there is no delay (feedback = 0); when the arm is fully raised the feedback goes up to 0.7. Combining the interactions of both arms was meant to feel playful for the performer, who literally conducts the &amp;quot;instrument&amp;quot; like a conductor.&lt;br /&gt;
&lt;br /&gt;
[[File:subscriber_EColin_1.jpg]]&lt;br /&gt;
&lt;br /&gt;
I must admit that I was frankly disappointed with the final result, but I still want to go further in this direction, because the possibilities offered by Max are really impressive and have made me want to devote myself seriously to it.&lt;/div&gt;</summary>
		<author><name>Tiji3841</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Subscriber_EColin_1.jpg&amp;diff=104373</id>
		<title>File:Subscriber EColin 1.jpg</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Subscriber_EColin_1.jpg&amp;diff=104373"/>
		<updated>2019-02-14T10:35:03Z</updated>

		<summary type="html">&lt;p&gt;Tiji3841: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Tiji3841</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction/Etienne_Colin&amp;diff=104372</id>
		<title>GMU:Performance Platform Introduction/Etienne Colin</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction/Etienne_Colin&amp;diff=104372"/>
		<updated>2019-02-14T10:34:18Z</updated>

		<summary type="html">&lt;p&gt;Tiji3841: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
My name is Etienne COLIN. I am currently pursuing a Master&#039;s degree in Image and Sound Sciences, Arts and Techniques, with a specialisation in sound engineering, at Aix-Marseille University (France).&lt;br /&gt;
&lt;br /&gt;
The Performance Platform of the Bauhaus University was a great opportunity for me to discover new working methods in immersive and interactive media. I had the opportunity to be introduced to Max, Pure Data and OSC.&lt;br /&gt;
&lt;br /&gt;
I was eager to familiarize myself with the state-of-the-art equipment of the Performance Platform in order to experiment with interactions between a performer and a video, and especially with their potential regarding sound.&lt;br /&gt;
The aim of my experiment was to generate and control sounds through body movements, using the Captury Live tracking system.&lt;br /&gt;
&lt;br /&gt;
The experiment was a real challenge, because I had to quickly familiarize myself with how Max and the Captury Live system work, and with OSC data, which I had never dealt with before.&lt;br /&gt;
&lt;br /&gt;
Most of the difficulties I encountered were due to my limited knowledge of how Max works, and especially of how to generate sound: I did not want to use recorded sounds, but rather to generate them directly in the Max patch.&lt;br /&gt;
As a result, I could only generate sounds from basic oscillators or noise generators. Because the possible controls on these sources were very limited, I decided that the body&#039;s movements would instead act on the effects applied to these basic signals.&lt;br /&gt;
&lt;br /&gt;
I thus established that the Root position (x coordinate) in the space would control the carrier frequency of a ring modulation, and that the Head position (x coordinate) would control the center frequency of a notch filter. In that way, when the performer moves on the platform, frequency modulation and a sweep effect result.&lt;br /&gt;
&lt;br /&gt;
[[File:subscriber_EColin_1.jpg]]&lt;br /&gt;
&lt;br /&gt;
The elevation of the left arm (y coordinate) controls the speed of a metronome that triggers a fixed frequency shaped by an envelope generator. The elevation of the right arm (y coordinate) controls the feedback of a delay applied to the same source: when the arm is down there is no delay (feedback = 0); when the arm is fully raised the feedback goes up to 0.7. Combining the interactions of both arms was meant to feel playful for the performer, who literally conducts the &amp;quot;instrument&amp;quot; like a conductor.&lt;br /&gt;
&lt;br /&gt;
I must admit that I was frankly disappointed with the final result, but I still want to go further in this direction, because the possibilities offered by Max are really impressive and have made me want to devote myself seriously to it.&lt;/div&gt;</summary>
		<author><name>Tiji3841</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction/Etienne_Colin&amp;diff=104371</id>
		<title>GMU:Performance Platform Introduction/Etienne Colin</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction/Etienne_Colin&amp;diff=104371"/>
		<updated>2019-02-14T10:32:08Z</updated>

		<summary type="html">&lt;p&gt;Tiji3841: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
My name is Etienne COLIN. I am currently pursuing a Master&#039;s degree in Image and Sound Sciences, Arts and Techniques, with a specialisation in sound engineering, at Aix-Marseille University (France).&lt;br /&gt;
&lt;br /&gt;
The Performance Platform of the Bauhaus University was a great opportunity for me to discover new working methods in immersive and interactive media. I had the opportunity to be introduced to Max, Pure Data and OSC.&lt;br /&gt;
&lt;br /&gt;
I was eager to familiarize myself with the state-of-the-art equipment of the Performance Platform in order to experiment with interactions between a performer and a video, and especially with their potential regarding sound.&lt;br /&gt;
The aim of my experiment was to generate and control sounds through body movements, using the Captury Live tracking system.&lt;br /&gt;
&lt;br /&gt;
The experiment was a real challenge, because I had to quickly familiarize myself with how Max and the Captury Live system work, and with OSC data, which I had never dealt with before.&lt;br /&gt;
&lt;br /&gt;
Most of the difficulties I encountered were due to my limited knowledge of how Max works, and especially of how to generate sound: I did not want to use recorded sounds, but rather to generate them directly in the Max patch.&lt;br /&gt;
As a result, I could only generate sounds from basic oscillators or noise generators. Because the possible controls on these sources were very limited, I decided that the body&#039;s movements would instead act on the effects applied to these basic signals.&lt;br /&gt;
&lt;br /&gt;
I thus established that the Root position (x coordinate) in the space would control the carrier frequency of a ring modulation, and that the Head position (x coordinate) would control the center frequency of a notch filter. In that way, when the performer moves on the platform, frequency modulation and a sweep effect result.&lt;br /&gt;
&lt;br /&gt;
[[File:subscriber_EColin_1.png]]&lt;br /&gt;
&lt;br /&gt;
The elevation of the left arm (y coordinate) controls the speed of a metronome that triggers a fixed frequency shaped by an envelope generator. The elevation of the right arm (y coordinate) controls the feedback of a delay applied to the same source: when the arm is down there is no delay (feedback = 0); when the arm is fully raised the feedback goes up to 0.7. Combining the interactions of both arms was meant to feel playful for the performer, who literally conducts the &amp;quot;instrument&amp;quot; like a conductor.&lt;br /&gt;
[[File:Delay.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
I must admit that I was frankly disappointed with the final result, but I still want to go further in this direction, because the possibilities offered by Max are really impressive and have made me want to devote myself seriously to it.&lt;/div&gt;</summary>
		<author><name>Tiji3841</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Subscriber_EColin_1.png&amp;diff=104370</id>
		<title>File:Subscriber EColin 1.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Subscriber_EColin_1.png&amp;diff=104370"/>
		<updated>2019-02-14T10:30:02Z</updated>

		<summary type="html">&lt;p&gt;Tiji3841: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
&lt;br /&gt;
== Copyright status: ==&lt;br /&gt;
&lt;br /&gt;
== Source: ==&lt;/div&gt;</summary>
		<author><name>Tiji3841</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction/Etienne_Colin&amp;diff=104369</id>
		<title>GMU:Performance Platform Introduction/Etienne Colin</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction/Etienne_Colin&amp;diff=104369"/>
		<updated>2019-02-14T10:29:38Z</updated>

		<summary type="html">&lt;p&gt;Tiji3841: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
My name is Etienne COLIN. I am currently pursuing a Master&#039;s degree in Image and Sound Sciences, Arts and Techniques, with a specialisation in sound engineering, at Aix-Marseille University (France).&lt;br /&gt;
&lt;br /&gt;
The Performance Platform of the Bauhaus University was a great opportunity for me to discover new working methods in immersive and interactive media. I had the opportunity to be introduced to Max, Pure Data and OSC.&lt;br /&gt;
&lt;br /&gt;
I was eager to familiarize myself with the state-of-the-art equipment of the Performance Platform in order to experiment with interactions between a performer and a video, and especially with their potential regarding sound.&lt;br /&gt;
The aim of my experiment was to generate and control sounds through body movements, using the Captury Live tracking system.&lt;br /&gt;
&lt;br /&gt;
The experiment was a real challenge, because I had to quickly familiarize myself with how Max and the Captury Live system work, and with OSC data, which I had never dealt with before.&lt;br /&gt;
&lt;br /&gt;
Most of the difficulties I encountered were due to my limited knowledge of how Max works, and especially of how to generate sound: I did not want to use recorded sounds, but rather to generate them directly in the Max patch.&lt;br /&gt;
As a result, I could only generate sounds from basic oscillators or noise generators. Because the possible controls on these sources were very limited, I decided that the body&#039;s movements would instead act on the effects applied to these basic signals.&lt;br /&gt;
&lt;br /&gt;
I thus established that the Root position (x coordinate) in the space would control the carrier frequency of a ring modulation, and that the Head position (x coordinate) would control the center frequency of a notch filter. In that way, when the performer moves on the platform, frequency modulation and a sweep effect result.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The elevation of the left arm (y coordinate) controls the speed of a metronome that triggers a fixed frequency shaped by an envelope generator. The elevation of the right arm (y coordinate) controls the feedback of a delay applied to the same source: when the arm is down there is no delay (feedback = 0); when the arm is fully raised the feedback goes up to 0.7. Combining the interactions of both arms was meant to feel playful for the performer, who literally conducts the &amp;quot;instrument&amp;quot; like a conductor.&lt;br /&gt;
&lt;br /&gt;
[[Image:subscriber_EColin_1.png]]&lt;br /&gt;
&lt;br /&gt;
I must admit that I was frankly disappointed with the final result, but I still want to go further in this direction, because the possibilities offered by Max are really impressive and have made me want to devote myself seriously to it.&lt;/div&gt;</summary>
		<author><name>Tiji3841</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction/Etienne_Colin&amp;diff=104368</id>
		<title>GMU:Performance Platform Introduction/Etienne Colin</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction/Etienne_Colin&amp;diff=104368"/>
		<updated>2019-02-14T10:28:13Z</updated>

		<summary type="html">&lt;p&gt;Tiji3841: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
My name is Etienne COLIN. I am currently pursuing a Master&#039;s degree in Image and Sound Sciences, Arts and Techniques, with a specialisation in sound engineering, at Aix-Marseille University (France).&lt;br /&gt;
&lt;br /&gt;
The Performance Platform of the Bauhaus University was a great opportunity for me to discover new working methods in immersive and interactive media. I had the opportunity to be introduced to Max, Pure Data and OSC.&lt;br /&gt;
&lt;br /&gt;
I was eager to familiarize myself with the state-of-the-art equipment of the Performance Platform in order to experiment with interactions between a performer and a video, and especially with their potential regarding sound.&lt;br /&gt;
The aim of my experiment was to generate and control sounds through body movements, using the Captury Live tracking system.&lt;br /&gt;
&lt;br /&gt;
The experiment was a real challenge, because I had to quickly familiarize myself with how Max and the Captury Live system work, and with OSC data, which I had never dealt with before.&lt;br /&gt;
&lt;br /&gt;
Most of the difficulties I encountered were due to my limited knowledge of how Max works, and especially of how to generate sound: I did not want to use recorded sounds, but rather to generate them directly in the Max patch.&lt;br /&gt;
As a result, I could only generate sounds from basic oscillators or noise generators. Because the possible controls on these sources were very limited, I decided that the body&#039;s movements would instead act on the effects applied to these basic signals.&lt;br /&gt;
&lt;br /&gt;
I thus established that the Root position (x coordinate) in the space would control the carrier frequency of a ring modulation, and that the Head position (x coordinate) would control the center frequency of a notch filter. In that way, when the performer moves on the platform, frequency modulation and a sweep effect result.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The elevation of the left arm (y coordinate) controls the speed of a metronome that triggers a fixed frequency shaped by an envelope generator. The elevation of the right arm (y coordinate) controls the feedback of a delay applied to the same source: when the arm is down there is no delay (feedback = 0); when the arm is fully raised the feedback goes up to 0.7. Combining the interactions of both arms was meant to feel playful for the performer, who literally conducts the &amp;quot;instrument&amp;quot; like a conductor.&lt;br /&gt;
&lt;br /&gt;
[[File:subscriber_EColin_1.png]]&lt;br /&gt;
&lt;br /&gt;
I must admit that I was frankly disappointed with the final result, but I still want to go further in this direction, because the possibilities offered by Max are really impressive and have made me want to devote myself seriously to it.&lt;/div&gt;</summary>
		<author><name>Tiji3841</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction/Etienne_Colin&amp;diff=99056</id>
		<title>GMU:Performance Platform Introduction/Etienne Colin</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction/Etienne_Colin&amp;diff=99056"/>
		<updated>2018-10-12T15:12:48Z</updated>

		<summary type="html">&lt;p&gt;Tiji3841: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
My name is Etienne COLIN. I am currently pursuing a Master&#039;s degree in Image and Sound Sciences, Arts and Techniques, with a specialisation in sound engineering, at Aix-Marseille University (France).&lt;br /&gt;
&lt;br /&gt;
The Performance Platform of the Bauhaus University is a great opportunity to discover new working methods in immersive and interactive media. I would really like to improve my knowledge by learning how to use the software that goes with it, and also to be introduced to Max, Pure Data and OSC.&lt;br /&gt;
&lt;br /&gt;
I am eager to familiarize myself with the state-of-the-art equipment of the Performance Platform in order to experiment with interactions between a performer and a video, and especially with their potential regarding sound.&lt;/div&gt;</summary>
		<author><name>Tiji3841</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction/Etienne_Colin&amp;diff=99055</id>
		<title>GMU:Performance Platform Introduction/Etienne Colin</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction/Etienne_Colin&amp;diff=99055"/>
		<updated>2018-10-12T15:11:58Z</updated>

		<summary type="html">&lt;p&gt;Tiji3841: /* Etienne COLIN */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;My name is Etienne COLIN. I am currently pursuing a Master&#039;s degree in Image and Sound Sciences, Arts and Techniques, with a specialisation in sound engineering, at Aix-Marseille University (France).&lt;br /&gt;
&lt;br /&gt;
The Performance Platform of the Bauhaus University is a great opportunity to discover new working methods in immersive and interactive media. I would really like to improve my knowledge by learning how to use the software that goes with it, and also to be introduced to Max, Pure Data and OSC.&lt;br /&gt;
&lt;br /&gt;
I am eager to familiarize myself with the state-of-the-art equipment of the Performance Platform in order to experiment with interactions between a performer and a video, and especially with their potential regarding sound.&lt;/div&gt;</summary>
		<author><name>Tiji3841</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction/Etienne_Colin&amp;diff=99054</id>
		<title>GMU:Performance Platform Introduction/Etienne Colin</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction/Etienne_Colin&amp;diff=99054"/>
		<updated>2018-10-12T15:10:34Z</updated>

		<summary type="html">&lt;p&gt;Tiji3841: Created page with &amp;quot;My name is Etienne COLIN, I am currently studying in a Master&amp;#039;s degree in Image and Sound Sciences, Arts and Techniques with a specialisation in sound engineering at the Aix-M...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;My name is Etienne COLIN. I am currently pursuing a Master&#039;s degree in Image and Sound Sciences, Arts and Techniques, with a specialisation in sound engineering, at Aix-Marseille University (France).&lt;br /&gt;
&lt;br /&gt;
The Performance Platform of the Bauhaus University is a great opportunity to discover new working methods in immersive and interactive media. I would really like to improve my knowledge by learning how to use the software that goes with it, and also to be introduced to Max, Pure Data and OSC.&lt;br /&gt;
&lt;br /&gt;
I am eager to familiarize myself with the state-of-the-art equipment of the Performance Platform in order to experiment with interactions between a performer and a video, and especially with their potential regarding sound.&lt;/div&gt;</summary>
		<author><name>Tiji3841</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction&amp;diff=99053</id>
		<title>GMU:Performance Platform Introduction</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction&amp;diff=99053"/>
		<updated>2018-10-12T14:57:05Z</updated>

		<summary type="html">&lt;p&gt;Tiji3841: /* Student works */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Lecturer: [[GMU:Max Neupert|Max Neupert]], [[Jörg Brinkmann]]&amp;lt;br&amp;gt;&lt;br /&gt;
Credits: 6 ECTS, 4 SWS&amp;lt;br&amp;gt;&lt;br /&gt;
Venue: [[GMU:Performance Platform|Performance Platform]], Digital Bauhaus Lab (Room 001)&amp;lt;br&amp;gt;&lt;br /&gt;
First meeting: Friday, 12.10.18, 10:00&lt;br /&gt;
&lt;br /&gt;
==Dates==&lt;br /&gt;
Block1: 02.11.2018, 03.11.2018, 10:00 - 18:00 &amp;lt;br&amp;gt; &lt;br /&gt;
Block2: 23.11.2018, 24.11.2018, 10:00 - 18:00 &amp;lt;br&amp;gt; &lt;br /&gt;
Block3: 14.12.2018, 15.12.2018, 10:00 - 18:00 &amp;lt;br&amp;gt; &lt;br /&gt;
Block4: 18.01.2019, 19.01.2019, 10:00 - 18:00 &lt;br /&gt;
&lt;br /&gt;
&amp;lt;br style=&amp;quot;clear: both&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Description==&lt;br /&gt;
The Interactive Performance Platform, an innovative laboratory for artistic research, offers access to various technologies, such as markerless multi-person tracking, a high-speed camera for long-term recording, a 12.2-channel audio system, a 4 x 4 tiled video wall and a VR workstation. Within the course, students will be introduced to these technologies.&lt;br /&gt;
&lt;br /&gt;
Workshop students will be encouraged to combine their acquired knowledge to create individual works.&lt;br /&gt;
&lt;br /&gt;
==Criteria for passing==&lt;br /&gt;
In order to pass, you will have to develop and document your own project, which can take the form of a performance, a video work or an installation, for example. Regular attendance and participation are also mandatory.&lt;br /&gt;
&lt;br /&gt;
==Language==&lt;br /&gt;
The course will be held in English, unless all participants speak German.&lt;br /&gt;
&lt;br /&gt;
== Student works==&lt;br /&gt;
*[[/Jörg Brinkmann/]]&lt;br /&gt;
*[[/Max Neupert/]]&lt;br /&gt;
*[[/Etienne Colin/]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:WS18]]&lt;br /&gt;
[[Category:Werkmodul]]&lt;br /&gt;
[[Category:Fachmodul]]&lt;br /&gt;
[[Category:Jörg Brinkmann]]&lt;br /&gt;
[[Category:Max Neupert]]&lt;/div&gt;</summary>
		<author><name>Tiji3841</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction&amp;diff=99052</id>
		<title>GMU:Performance Platform Introduction</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Performance_Platform_Introduction&amp;diff=99052"/>
		<updated>2018-10-12T14:39:03Z</updated>

		<summary type="html">&lt;p&gt;Tiji3841: /* Student works */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Lecturer: [[GMU:Max Neupert|Max Neupert]], [[Jörg Brinkmann]]&amp;lt;br&amp;gt;&lt;br /&gt;
Credits: 6 ECTS, 4 SWS&amp;lt;br&amp;gt;&lt;br /&gt;
Venue: [[GMU:Performance Platform|Performance Platform]], Digital Bauhaus Lab (Room 001)&amp;lt;br&amp;gt;&lt;br /&gt;
First meeting: Friday, 12.10.18, 10:00&lt;br /&gt;
&lt;br /&gt;
==Dates==&lt;br /&gt;
Block1: 02.11.2018, 03.11.2018, 10:00 - 18:00 &amp;lt;br&amp;gt; &lt;br /&gt;
Block2: 23.11.2018, 24.11.2018, 10:00 - 18:00 &amp;lt;br&amp;gt; &lt;br /&gt;
Block3: 14.12.2018, 15.12.2018, 10:00 - 18:00 &amp;lt;br&amp;gt; &lt;br /&gt;
Block4: 18.01.2019, 19.01.2019, 10:00 - 18:00 &lt;br /&gt;
&lt;br /&gt;
&amp;lt;br style=&amp;quot;clear: both&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Description==&lt;br /&gt;
The Interactive Performance Platform, an innovative laboratory for artistic research, offers access to various technologies, such as markerless multi-person tracking, a high-speed camera for long-term recording, a 12.2-channel audio system, a 4 x 4 tiled video wall and a VR workstation. Within the course, students will be introduced to these technologies.&lt;br /&gt;
&lt;br /&gt;
Workshop students will be encouraged to combine their acquired knowledge to create individual works.&lt;br /&gt;
&lt;br /&gt;
==Criteria for passing==&lt;br /&gt;
In order to pass, you will have to develop and document your own project, which can take the form of a performance, a video work or an installation, for example. Regular attendance and participation are also mandatory.&lt;br /&gt;
&lt;br /&gt;
==Language==&lt;br /&gt;
The course will be held in English, unless all participants speak German.&lt;br /&gt;
&lt;br /&gt;
== Student works==&lt;br /&gt;
*[[/Jörg Brinkmann/]]&lt;br /&gt;
*[[/Max Neupert/]]&lt;br /&gt;
*[[/Etienne Colin/]]&lt;br /&gt;
&lt;br /&gt;
[[Category:WS18]]&lt;br /&gt;
[[Category:Werkmodul]]&lt;br /&gt;
[[Category:Fachmodul]]&lt;br /&gt;
[[Category:Jörg Brinkmann]]&lt;br /&gt;
[[Category:Max Neupert]]&lt;/div&gt;</summary>
		<author><name>Tiji3841</name></author>
	</entry>
</feed>