<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Lalalolo</id>
	<title>Medien Wiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Lalalolo"/>
	<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/Special:Contributions/Lalalolo"/>
	<updated>2026-04-13T10:14:52Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.39.6</generator>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Media_Art_Strategies/Leon-Etienne_K%C3%BChr/Project&amp;diff=131285</id>
		<title>GMU:Media Art Strategies/Leon-Etienne Kühr/Project</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Media_Art_Strategies/Leon-Etienne_K%C3%BChr/Project&amp;diff=131285"/>
		<updated>2022-09-14T10:44:06Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Kurzbeschreibung */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Media_Art_Strategies/Leon-Etienne_Kühr/Project_old old project about telepresence]&lt;br /&gt;
&lt;br /&gt;
=Präsenz (Mensch + Prothese)=&lt;br /&gt;
Design for an installation by Leon-Etienne Kühr and Clemens Hornemann&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
==Short Description==&lt;br /&gt;
The installation &amp;quot;Präsenz (Mensch + Prothese)&amp;quot; (&amp;quot;Presence (Human + Prosthesis)&amp;quot;) deals with the inseparability of physical and digital presence in the present day.&lt;br /&gt;
&lt;br /&gt;
A smartphone installed in the centre of the room measures the signals and waves that we and our digital devices emit. In a water basin, the measured data are rendered as interference patterns and waves that react, both momentarily and over the long term, to the changes in the room: human presence made visible as a measurable quantity.&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Prototypes==&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_Skizze_new.jpg|351px]]&lt;br /&gt;
[[File:PMP_Function_b.jpg|351px]]&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_O2.JPG|351px]] &lt;br /&gt;
[[File:PMP_O4b.jpg|351px]]&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_P1.jpg|285px]] &lt;br /&gt;
[[File:PMP_P2.JPG|285px]] &lt;br /&gt;
[[File:PMP_P4.jpg|127px]]&lt;br /&gt;
&lt;br /&gt;
----&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Media_Art_Strategies/Leon-Etienne_K%C3%BChr/Project&amp;diff=131284</id>
		<title>GMU:Media Art Strategies/Leon-Etienne Kühr/Project</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Media_Art_Strategies/Leon-Etienne_K%C3%BChr/Project&amp;diff=131284"/>
		<updated>2022-09-14T10:43:30Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Kurzbeschreibung */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Media_Art_Strategies/Leon-Etienne_Kühr/Project_old old project about telepresence]&lt;br /&gt;
&lt;br /&gt;
=Präsenz (Mensch + Prothese)=&lt;br /&gt;
Design for an installation by Leon-Etienne Kühr and Clemens Hornemann&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
==Short Description==&lt;br /&gt;
The installation &amp;quot;Präsenz (Mensch + Prothese)&amp;quot; (&amp;quot;Presence (Human + Prosthesis)&amp;quot;) deals with the inseparability of physical and digital presence in the present day.&lt;br /&gt;
&lt;br /&gt;
A smartphone installed in the centre of the room measures the signals and waves that we and our digital devices emit. In a basin full of black paint, the measured data are rendered as interference patterns and waves that react, both momentarily and over the long term, to the changes in the room: human presence made visible as a measurable quantity.&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Prototypes==&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_Skizze_new.jpg|351px]]&lt;br /&gt;
[[File:PMP_Function_b.jpg|351px]]&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_O2.JPG|351px]] &lt;br /&gt;
[[File:PMP_O4b.jpg|351px]]&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_P1.jpg|285px]] &lt;br /&gt;
[[File:PMP_P2.JPG|285px]] &lt;br /&gt;
[[File:PMP_P4.jpg|127px]]&lt;br /&gt;
&lt;br /&gt;
----&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Media_Art_Strategies/Leon-Etienne_K%C3%BChr/Project&amp;diff=131283</id>
		<title>GMU:Media Art Strategies/Leon-Etienne Kühr/Project</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Media_Art_Strategies/Leon-Etienne_K%C3%BChr/Project&amp;diff=131283"/>
		<updated>2022-09-14T10:43:13Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Installation */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Media_Art_Strategies/Leon-Etienne_Kühr/Project_old old project about telepresence]&lt;br /&gt;
&lt;br /&gt;
=Präsenz (Mensch + Prothese)=&lt;br /&gt;
Design for an installation by Leon-Etienne Kühr and Clemens Hornemann&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
==Short Description==&lt;br /&gt;
The installation &amp;quot;Präsenz (Mensch + Prothese)&amp;quot; (&amp;quot;Presence (Human + Prosthesis)&amp;quot;) deals with the inseparability of physical and digital presence in the present day.&lt;br /&gt;
&lt;br /&gt;
A smartphone installed in the centre of the room measures the signals and waves that we and our digital devices emit. In a basin full of black paint, the measured data are rendered as interference patterns and waves that react, both momentarily and over the long term, to the changes in the room: human presence made visible as a measurable quantity.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_Render_front.png]]&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Prototypes==&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_Skizze_new.jpg|351px]]&lt;br /&gt;
[[File:PMP_Function_b.jpg|351px]]&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_O2.JPG|351px]] &lt;br /&gt;
[[File:PMP_O4b.jpg|351px]]&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_P1.jpg|285px]] &lt;br /&gt;
[[File:PMP_P2.JPG|285px]] &lt;br /&gt;
[[File:PMP_P4.jpg|127px]]&lt;br /&gt;
&lt;br /&gt;
----&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Media_Art_Strategies/Leon-Etienne_K%C3%BChr/Project&amp;diff=131282</id>
		<title>GMU:Media Art Strategies/Leon-Etienne Kühr/Project</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Media_Art_Strategies/Leon-Etienne_K%C3%BChr/Project&amp;diff=131282"/>
		<updated>2022-09-14T10:42:59Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Präsenz  (Mensch + Prothese) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Media_Art_Strategies/Leon-Etienne_Kühr/Project_old old project about telepresence]&lt;br /&gt;
&lt;br /&gt;
=Präsenz (Mensch + Prothese)=&lt;br /&gt;
Design for an installation by Leon-Etienne Kühr and Clemens Hornemann&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
==Short Description==&lt;br /&gt;
The installation &amp;quot;Präsenz (Mensch + Prothese)&amp;quot; (&amp;quot;Presence (Human + Prosthesis)&amp;quot;) deals with the inseparability of physical and digital presence in the present day.&lt;br /&gt;
&lt;br /&gt;
A smartphone installed in the centre of the room measures the signals and waves that we and our digital devices emit. In a basin full of black paint, the measured data are rendered as interference patterns and waves that react, both momentarily and over the long term, to the changes in the room: human presence made visible as a measurable quantity.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_Render_front.png]]&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Installation==&lt;br /&gt;
[[File:PMP_Input_Output.jpg|351px]] [[File:PMP_Render_frosch.png|351px]]&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_Render_top.png|351px]] [[File:PMP_Render_top_farbe.png|351px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_GR1.jpg|350px]] [[File:PMP_GR2.jpg|351px]]&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_GR4.jpg|350px]] [[File:PMP_GR5.jpg|351px]]&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_Sensoren.png|351px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Prototypes==&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_Skizze_new.jpg|351px]]&lt;br /&gt;
[[File:PMP_Function_b.jpg|351px]]&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_O2.JPG|351px]] &lt;br /&gt;
[[File:PMP_O4b.jpg|351px]]&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_P1.jpg|285px]] &lt;br /&gt;
[[File:PMP_P2.JPG|285px]] &lt;br /&gt;
[[File:PMP_P4.jpg|127px]]&lt;br /&gt;
&lt;br /&gt;
----&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Media_Art_Strategies/Leon-Etienne_K%C3%BChr/Project&amp;diff=131281</id>
		<title>GMU:Media Art Strategies/Leon-Etienne Kühr/Project</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Media_Art_Strategies/Leon-Etienne_K%C3%BChr/Project&amp;diff=131281"/>
		<updated>2022-09-14T10:42:11Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Einflüsse anderer KünstlerInnen */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Media_Art_Strategies/Leon-Etienne_Kühr/Project_old old project about telepresence]&lt;br /&gt;
&lt;br /&gt;
=Präsenz (Mensch + Prothese)=&lt;br /&gt;
Design for an installation by Leon-Etienne Kühr and Clemens Hornemann&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
{{#ev:youtube|reuBmgCcBew|735}}&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
==Short Description==&lt;br /&gt;
The installation &amp;quot;Präsenz (Mensch + Prothese)&amp;quot; (&amp;quot;Presence (Human + Prosthesis)&amp;quot;) deals with the inseparability of physical and digital presence in the present day.&lt;br /&gt;
&lt;br /&gt;
A smartphone installed in the centre of the room measures the signals and waves that we and our digital devices emit. In a basin full of black paint, the measured data are rendered as interference patterns and waves that react, both momentarily and over the long term, to the changes in the room: human presence made visible as a measurable quantity.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_Render_front.png]]&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Background==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The smartphone:&lt;br /&gt;
&lt;br /&gt;
Modern smartphones offer a wide range of sensory interfaces for perceiving and interacting with their surroundings. They are also the [https://gs.statcounter.com/platform-market-share/desktop-mobile-tablet/worldwide most widely used] and arguably most intuitive human-machine interface.&lt;br /&gt;
&lt;br /&gt;
Thanks to this sensory variety, they can connect not only to other endpoints of the internet but also to their physical environment. They measure physical quantities such as temperature, sound, gravitational acceleration and light intensity as well as various digital interfaces such as Bluetooth, WLAN and the cellular network. Over the years, most of these sensors became standard equipment of the devices in order to serve specific applications, for instance a proximity sensor that detects whether the smartphone is currently held to the ear. Their use, however, is no longer limited to these functions: the device itself collects sensor data, and third-party applications can access the sensors as well.&lt;br /&gt;
&lt;br /&gt;
Some of these interfaces to the outside world, such as WLAN or Bluetooth, work continuously and autonomously in the background, for example to search for networks in the devices' surroundings. The Corona-Warn-App illustrates this: it uses the Bluetooth standard to infer the presence of other people from the smartphones nearby. Conversely, this means that in society's perception the smartphone has become an essential part of our human presence. As an individual object, the smartphone is inseparably linked to our personality.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
On the entity of the human being:&lt;br /&gt;
&lt;br /&gt;
According to, among others, the French philosopher Bernard Stiegler (1952-2020), the human being is a blind, incomplete, prematurely born creature. In Greek mythology, following Plato, this is depicted in the story of Epimetheus, who had no gifts, no qualities left for the human when the living beings were created. Only through Prometheus' theft of fire and wisdom did the human become an entity, a seeing being.&lt;br /&gt;
&lt;br /&gt;
»To become one, he had to negate what he was by adding a prosthesis to his body.« (Roberto Calasso, b. 1941)&lt;br /&gt;
&lt;br /&gt;
In the question of human identity, consciousness and temporal objects therefore play an undeniable role. From the hand axe via writing and photography to the telephone and the internet: prostheses that constitute the cross-generational, collective memory and thus human history itself; tools, objects, things.&lt;br /&gt;
&lt;br /&gt;
While Plato already criticised writing as a problematic prosthesis for thought, the relationship between human and prosthesis (and its critique) has only gained in complexity over the centuries of cultural history. Today our tools, objects, machines and consumer goods have long ceased to be mere memory aids or extensions and improvements of human organs and limbs; they are all of that and much more: parts of complex social systems. Our prostheses have developed into constructs with a momentum of their own that are, for the individual, ever harder to survey, yet remain inseparably and indispensably linked to our lives.&lt;br /&gt;
&lt;br /&gt;
This complex, mutually elastic relationship between human and prosthesis has become particularly visible once more in the last two decades. Smartphones, our ever-present universal prostheses between the analogue and the digital, have fundamentally changed our everyday life and our society. More and more, the prostheses have thus also become part of us.&lt;br /&gt;
&lt;br /&gt;
In our work we want to examine, and let others examine, this entity of presence made up of human and prosthesis; not from an anthropological point of view, but as neutrally as possible, from the perspective of our prostheses looking at us. A basin full of paint becomes the container of our identity/presence, made up of waves that represent a level of reality other than the visible one.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Installation==&lt;br /&gt;
[[File:PMP_Input_Output.jpg|351px]] [[File:PMP_Render_frosch.png|351px]]&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_Render_top.png|351px]] [[File:PMP_Render_top_farbe.png|351px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
A metal basin measuring one by one metre stands in the middle of the room. At eye level, connected to the basin by a cable, a smartphone hovers. The basin is filled with black, slightly viscous paint. In its initial state, the surface of the paint is smooth and calm. When people enter the room, the installation reacts.&lt;br /&gt;
&lt;br /&gt;
The viewers are perceptible to the installed smartphone through all kinds of sensorially measurable waves: light/shadow, sounds, air movement, heat, etc. The viewers' technical devices, their smartphones, also emit signals such as Bluetooth, UMTS or WLAN without interruption.&lt;br /&gt;
&lt;br /&gt;
All of these signals and waves are measured by the sensors of the installation's smartphone, captured in a purpose-built app, mapped onto different audio frequencies and control signals, and passed on to a Raspberry Pi. From there the audio signals run through an amplifier and feed eight subwoofer constructions underneath the basin. The control signals additionally regulate four turbines in the paint.&lt;br /&gt;
&lt;br /&gt;
The signals arriving at the installation cause a reaction in the basin. The subwoofer constructions generate interference and wave patterns via a membrane; the turbines generate even stronger waves and movements. With few visitors and few input signals there are subtle reactions: interference patterns run through the basin. With many people and many signals, such as loud noise or camera flashes, the reactions intensify, and the turbines spin up to the point where the paint splashes out of its basin into the room. The presence of the people manifests itself in the space.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_GR1.jpg|350px]] [[File:PMP_GR2.jpg|351px]]&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_GR4.jpg|350px]] [[File:PMP_GR5.jpg|351px]]&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
&lt;br /&gt;
For the installation, the data of the one- as well as multi-dimensional sensors and of interfaces such as WLAN and Bluetooth are collected and sonified over eight audio channels. In addition, three propellers are actuated during strong changes in order to influence the paint surface more strongly.&lt;br /&gt;
&lt;br /&gt;
To minimise background noise and sensor errors, a mean over a number of previous values is first computed per data channel. This ensures that both short-term and long-term changes are represented.&lt;br /&gt;
&lt;br /&gt;
The audio channels are not each assigned a fixed data channel, so each loudspeaker construction does not render just one data channel; instead, the general perception of the room is mixed and rendered across all constructions.&lt;br /&gt;
&lt;br /&gt;
In addition to simply mapping the data onto suitable frequency spectra, a PID controller (proportional-integral-derivative controller) is used. &lt;br /&gt;
This control mechanism uses the feedback of the sensors to try to restore an equilibrium through counter-movement. Unlike in the industrial use of this controller, it is used here to maintain a balance between short-term and long-term inputs.&lt;br /&gt;
&lt;br /&gt;
The data made available through a sensor API are continuously collected and processed, which yields an adequate representation of the presence measurable by the smartphone.&lt;br /&gt;
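The smoothing and balancing steps described above can be sketched roughly as follows. This is a minimal, hypothetical Python illustration (the installation itself runs as a phone app feeding a Raspberry Pi); all names, window sizes and gain values are invented for the example:

```python
from collections import deque

class PID:
    """Minimal PID controller: its output counteracts deviations from a
    setpoint, balancing short-term and long-term input changes."""
    def __init__(self, kp, ki, kd, setpoint=0.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measurement, dt=1.0):
        error = self.setpoint - measurement
        self.integral += error * dt                   # long-term component
        derivative = (error - self.prev_error) / dt   # short-term component
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def smoothed(samples, window=8):
    """Rolling mean over the last `window` samples of one data channel,
    suppressing sensor noise while still tracking slow drifts."""
    buf = deque(maxlen=window)
    out = []
    for s in samples:
        buf.append(s)
        out.append(sum(buf) / len(buf))
    return out
```

The smoothed channel value could then drive both the audio mapping and, via the PID output, the turbine control signals.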
&lt;br /&gt;
&lt;br /&gt;
Table of available sensors:&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_Sensoren.png|351px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Prototypes==&lt;br /&gt;
On the way to the finished installation, several different prototypes and studies on generating waves were built; we also experimented with various materials.&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_Skizze_new.jpg|351px]]&lt;br /&gt;
[[File:PMP_Function_b.jpg|351px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Among other things, a prototype with an exciter (structure-borne sound transducer) on a wooden board was built as a floating &amp;quot;interference island&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_O2.JPG|351px]] &lt;br /&gt;
[[File:PMP_O4b.jpg|351px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In the current prototype, the transmission of the sound waves is realised with the help of latex membranes, which are also intended to be used in the final installation.&lt;br /&gt;
&lt;br /&gt;
[[File:PMP_P1.jpg|285px]] &lt;br /&gt;
[[File:PMP_P2.JPG|285px]] &lt;br /&gt;
[[File:PMP_P4.jpg|127px]]&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Realisation==&lt;br /&gt;
&lt;br /&gt;
Due to the Corona situation, it has so far proven difficult to find a suitable setting for presenting our installation, especially since the work lives from the interaction with its viewers. Our goal is nevertheless to secure a suitable exhibition venue as well as funding or financial subsidies by the summer, if possible, so that we can finally realise the installation.&lt;br /&gt;
&lt;br /&gt;
For the finished installation and the exhibition venue, we particularly have to consider the existing lighting situation, since mirroring and reflections have a great influence on how the waves in the black paint are perceived. The floor should also be white, or at least covered with white paper, so that the paint can manifest itself outside the basin; here, care must be taken to protect the exhibition venue from damage by the paint. Transport logistics, delivery times and the required working time of at least two to three days to build the installation should also be taken into account.&lt;br /&gt;
&lt;br /&gt;
We have recorded the material costs in the following table:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Cost breakdown&#039;&#039;&#039;: [https://docs.google.com/spreadsheets/d/1448o1PZdCmi9e1ZzCmdhUpYzGk5ClCqz45YITzwZX_w/edit?usp=sharing Price calculation table]&lt;/div&gt;
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Media_Art_Strategies/Leon-Etienne_K%C3%BChr&amp;diff=131280</id>
		<title>GMU:Media Art Strategies/Leon-Etienne Kühr</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Media_Art_Strategies/Leon-Etienne_K%C3%BChr&amp;diff=131280"/>
		<updated>2022-09-14T10:41:26Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* /Project/ */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Project  ==&lt;br /&gt;
together with [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Media_Art_Strategies/Clemens_Hornemann Clemens Hornemann]&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
=== Task 1 ===&lt;br /&gt;
It is hard for me to pick one favourite artwork as they all pursue very different strategies. &lt;br /&gt;
Julian Oliver’s &#039;&#039;Men in Grey&#039;&#039; captures the &#039;&#039;Zeitgeist&#039;&#039; of the uncontrollable rise of WiFi within our lives, exposing us to our technological life, defined by groups and organizations that are beyond our own reach, without fear-mongering or shaming us for our digital life. While the &#039;&#039;Men in Grey&#039;&#039; exposed one concerning facet of our daily life, the &#039;&#039;Yes Men&#039;&#039; manage to confront us with the grim reality of a capitalist economy in various ways, and their ability and relentlessness in repeating variations of their media-activist stunts makes abundantly clear that change takes time, as well as the will and courage of the few individuals who are in power.&lt;br /&gt;
&lt;br /&gt;
Besides the politically motivated works, I find great pleasure in simple works like David Bowen’s &#039;&#039;Tele-present water&#039;&#039;. While simple in concept, it focuses on the relation between our perceived world and the world of a buoy station at an unknown location, highlighting that the ocean is, was and always will be moving relentlessly, whether we are there to perceive it or not.&lt;br /&gt;
&lt;br /&gt;
Even though its technical implementation is marvelous, what intrigues me most about Random International’s &#039;&#039;Rain Room&#039;&#039; is its ability to convey to the viewer an alternate reality in which it is not the rain that is repelled by us, but we who repel the rain, and how this simple change might alter our behaviour and perception of the space around us.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[:File:media-art-strategies_telegarden_slide.pdf|Slide Telegarden (Ken Goldberg &amp;amp; Joseph Santarromana)]]&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Colloquium&amp;diff=128675</id>
		<title>GMU:Colloquium</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Colloquium&amp;diff=128675"/>
		<updated>2022-01-10T06:04:44Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Here are the thesis projects from Bachelor, Master, and PhD students in winter semester 21/22&lt;br /&gt;
&lt;br /&gt;
*[[/Jenny Brockmann/]]&lt;br /&gt;
*[[/Isabella Lee Aturo/]]&lt;br /&gt;
*[[/Ksenia Tajsic/]]&lt;br /&gt;
*[[/Joel Schäfer/]]&lt;br /&gt;
*[[/Yasemin Yagci/]]&lt;br /&gt;
*[[/Rico Graupner/]]&lt;br /&gt;
*[[/Jörg Brinkmann/]]&lt;br /&gt;
*[[/Daniel Wolter/]]&lt;br /&gt;
*[[/Gabriel Moses/]]&lt;br /&gt;
*[[/Christina Schinzel/]]&lt;br /&gt;
*[[/Sandra Anhalt/]]&lt;br /&gt;
*[[/Vaclav Harsa/]]&lt;br /&gt;
*[[/Mindaugas Gapsevicius/]]&lt;br /&gt;
*[[/Sarah Alvim/]]&lt;br /&gt;
*[[/Benazir Basauri/]]&lt;br /&gt;
*[[/Lennart Oberlies/]]&lt;br /&gt;
*[[/Beatrice Perlato/]]&lt;br /&gt;
*[[/Quan Zhou/]]&lt;br /&gt;
*[[/Leon-Etienne Kühr/]]&lt;br /&gt;
== Attendance Timetable ==&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
| &#039;&#039;&#039;Week&#039;&#039;&#039;&lt;br /&gt;
| &#039;&#039;&#039;Date&#039;&#039;&#039;&lt;br /&gt;
| &#039;&#039;&#039;Presenter&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
| 1&lt;br /&gt;
| 25. October 21&lt;br /&gt;
| N.N.&lt;br /&gt;
|-&lt;br /&gt;
| 2&lt;br /&gt;
|2. November 21&lt;br /&gt;
| Beatrice Perlato&lt;br /&gt;
|-&lt;br /&gt;
| 3&lt;br /&gt;
| 08. November 21&lt;br /&gt;
| Yasemin Yagci&lt;br /&gt;
|-&lt;br /&gt;
| 4&lt;br /&gt;
| 15. November 21&lt;br /&gt;
| Christina Schinzel&lt;br /&gt;
|- &lt;br /&gt;
| 5&lt;br /&gt;
|22. November 21&lt;br /&gt;
| Isabella Lee Arturo&lt;br /&gt;
|-&lt;br /&gt;
|7&lt;br /&gt;
|29. November  21&lt;br /&gt;
| Ksenija T.&lt;br /&gt;
|-&lt;br /&gt;
|8&lt;br /&gt;
|6. December 21&lt;br /&gt;
| Yasemin Yagci, Sarah Alvim&lt;br /&gt;
|-&lt;br /&gt;
| 9&lt;br /&gt;
|13. December 21&lt;br /&gt;
| Joel&lt;br /&gt;
|-&lt;br /&gt;
|10&lt;br /&gt;
| 20. December 21&lt;br /&gt;
| Holiday&lt;br /&gt;
|-&lt;br /&gt;
|11&lt;br /&gt;
| 3. January 2022&lt;br /&gt;
| N.N.&lt;br /&gt;
|-&lt;br /&gt;
| 12&lt;br /&gt;
|10. January 2022&lt;br /&gt;
| N.N.&lt;br /&gt;
|-&lt;br /&gt;
| 13&lt;br /&gt;
|17. January 2022&lt;br /&gt;
| Sarah Alvim&lt;br /&gt;
|-&lt;br /&gt;
|14&lt;br /&gt;
| 24. January 2022&lt;br /&gt;
| Quan Zhou&lt;br /&gt;
|-&lt;br /&gt;
|15&lt;br /&gt;
| 31. January 2022&lt;br /&gt;
| Joel (short), Ksenija (short)&lt;br /&gt;
|-&lt;br /&gt;
|16&lt;br /&gt;
| 07. February 2022 &lt;br /&gt;
| Leon&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:WS21]]&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124460</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124460"/>
		<updated>2021-05-20T17:44:29Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* GOL-Midi Semesterprojekt */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semester Project=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project about generating sound using cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB for an Arduino Micro shield which controls the simulation. The state of the simulation is then sent as MIDI over the USB port of the Arduino, so the board can be used as a MIDI controller. For the design of the PCB I also wanted to take a procedural approach, keeping it simple while taking advantage of all the available layers. The custom PCB controls the speed and the rules used in the one-dimensional automaton. To visualize the automaton, an 8x8 LED matrix is used.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs used are two potentiometers and a button. For the LED matrix, a preassembled 8x8 matrix with a MAX7219 controller is mounted through the back so that it shines through the PCB. Thus five holes have to be added for the five pins of the MAX7219.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro look of old handheld game consoles in combination with a texture generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old GameBoy and began to draw outlines for the edge-cuts and silkscreen layers. After importing the general shape, I laid out all the mounting holes and pins and began to wire everything on the back side of the PCB. I left a window in the mask layer where the LED matrix will be mounted, and surrounded every component with a symbolic outline in the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was used with random inputs and the infamous rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components. It was used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To efficiently simulate the 8-bit automaton, the state of a cell and its neighbours is interpreted as a 3-bit number that is then looked up in two arrays: one for the rule deciding whether a cell is created, and one for whether the cell should die. The rules of the automaton can be changed using the shield.&lt;br /&gt;
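As a rough sketch of this lookup scheme, here is a hypothetical Python model of the firmware logic (the device itself runs Arduino code); it assumes the eight cells wrap around at the edges, which the original text does not specify:

```python
def step(cells, birth, death):
    """One generation of the 8-cell automaton.

    `cells` is a list of 8 ints (0 or 1); `birth` and `death` are
    8-entry lookup tables indexed by the 3-bit neighbourhood
    (left, self, right). `birth` is consulted for empty cells,
    `death` for living ones.
    """
    n = len(cells)
    nxt = []
    for i in range(n):
        # Pack the neighbourhood into a 3-bit index, wrapping at the edges.
        idx = (cells[(i - 1) % n] * 4) + (cells[i] * 2) + cells[(i + 1) % n]
        if cells[i] == 0:
            nxt.append(1 if birth[idx] else 0)
        else:
            nxt.append(0 if death[idx] else 1)
    return nxt
```

With `birth` and `death` tables derived from a Wolfram rule number, repeated calls to `step` reproduce the familiar one-dimensional patterns.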
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation via MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) was used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For the mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to the notes. The speed, i.e. the interval time, is controlled by the shield.&lt;br /&gt;
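The state-to-notes mapping might look roughly like this. This is a hypothetical Python sketch of the mapping step only (on the device, the resulting events are sent with MIDIUSB); the scale, root note and bit-to-degree assignment are invented for the example:

```python
# Semitone offsets of one C major octave; a hypothetical hardcoded scale.
C_MAJOR = [0, 2, 4, 5, 7, 9, 11, 12]

def state_to_chord(state, root=60):
    """Interpret `state` (0-255) bitwise: bit i sounds scale degree i
    above the root MIDI note (60 = middle C)."""
    notes = []
    for i in range(8):
        if state % 2 == 1:          # bit i of the automaton state is set
            notes.append(root + C_MAJOR[i])
        state //= 2
    return notes
```

Sending the resulting note numbers one after another instead of all at once would give the arpeggio mode.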
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB, the following parts are necessary:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 1 button (6 mm)&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:PCB Arts -L.E.K Build.jpg|176px]][[File:PCB Arts -L.E.K Build 2.jpg|559px]]&lt;br /&gt;
&lt;br /&gt;
Due to an error I made on the left potentiometer (a missing connection), I was not able to follow my planned design and had to improvise.&lt;br /&gt;
To keep the same functionality as planned, I replaced the remaining potentiometer with another button, giving me two buttons to work with.&lt;br /&gt;
I sadly had to use cables for the LED matrix and parts of the buttons, because my GND and 5V+ nets were all connected to each other.&lt;br /&gt;
The combination of traces and cables, however, works as intended.&lt;br /&gt;
&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
==Usage==&lt;br /&gt;
To use the build, just plug it in using a micro-USB cable and enable it in the DAW as a MIDI device (named Arduino Micro).&lt;br /&gt;
Since the build only has two buttons (left and right) covering many functions, both buttons have multiple uses and combinations:&lt;br /&gt;
* &#039;&#039;&#039;Click Left&#039;&#039;&#039; (Down, Up under 200ms): &#039;&#039;&#039;Next Chord&#039;&#039;&#039;&lt;br /&gt;
* &#039;&#039;&#039;Hold Right &amp;amp; Click Left&#039;&#039;&#039; (Hold first): &#039;&#039;&#039;Previous Chord&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;Click Right&#039;&#039;&#039; (Down, Up under 200ms): &#039;&#039;&#039;Next Cellular Automata Rule&#039;&#039;&#039; (0-255 wraps around)&lt;br /&gt;
* &#039;&#039;&#039;Hold Left &amp;amp; Click Right&#039;&#039;&#039; (Hold first): &#039;&#039;&#039;Previous Cellular Automata Rule&#039;&#039;&#039; (0-255 wraps around)&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;Hold Left&#039;&#039;&#039; (Longer than 500ms): &#039;&#039;&#039;Increase BPM&#039;&#039;&#039;&lt;br /&gt;
* &#039;&#039;&#039;Hold Right&#039;&#039;&#039; (Longer than 500ms): &#039;&#039;&#039;Decrease BPM&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;Hold Left &amp;amp; Hold Right&#039;&#039;&#039; (Longer than 500ms): &#039;&#039;&#039;Start/Stop playing&#039;&#039;&#039;&lt;br /&gt;
* &#039;&#039;&#039;Hold Left &amp;amp; Hold Right&#039;&#039;&#039; (Longer than 2000ms): &#039;&#039;&#039;Reset everything&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
{{#ev:youtube|7doCJ39owTw|735}}&lt;br /&gt;
&lt;br /&gt;
-----&lt;br /&gt;
&lt;br /&gt;
==Further Reading==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124459</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124459"/>
		<updated>2021-05-20T17:43:02Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Demonstration */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semesterprojekt=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project about generating sound using cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB for an Arduino Micro shield that controls the simulation. The state of the simulation is then sent as MIDI over the micro-USB port of the Arduino, so the device can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the available layers. The custom PCB controls the speed and the rules used in the one-dimensional automaton. An 8x8 LED matrix visualizes the automaton.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs are two potentiometers and a button. For the LED matrix, a preassembled 8x8 module with a MAX7219 controller is mounted through the back so that it shines through the PCB. Five holes therefore have to be added for the five pins of the MAX7219.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro-look of old hand-held game-consoles in combination with a texture that was generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old Game Boy and drew outlines for the edge-cuts and silkscreen layers. After importing the general shape, I laid out all the mounting holes and pins and began to route everything on the back side of the PCB. I left a window in the mask layer where the LED matrix will be mounted, and surrounded every component with a symbolic outline in the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was run with random input and the infamous rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components, and used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To efficiently simulate the 8-bit automaton, the state of each cell and its neighbours is interpreted as a 3-bit number that is looked up in two arrays: one deciding whether a dead cell is created, the other whether a living cell dies. The rules of the automaton can be changed using the shield.&lt;br /&gt;
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation over MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) is used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to a tone. The speed, i.e. the length of the interval, is controlled by the shield.&lt;br /&gt;
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB, the following parts are necessary:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 1 button (6 mm)&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:PCB Arts -L.E.K Build.jpg|176px]][[File:PCB Arts -L.E.K Build 2.jpg|559px]]&lt;br /&gt;
&lt;br /&gt;
Due to an error I made on the left potentiometer (a missing connection), I was not able to follow my planned design and had to improvise.&lt;br /&gt;
To keep the same functionality as planned, I replaced the remaining potentiometer with another button, giving me two buttons to work with.&lt;br /&gt;
I sadly had to use cables for the LED matrix and parts of the buttons, because my GND and 5V+ nets were all connected to each other.&lt;br /&gt;
The combination of traces and cables, however, works as intended.&lt;br /&gt;
&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
==Usage==&lt;br /&gt;
To use the build, just plug it in using a micro-USB cable and enable it in the DAW as a MIDI device (named Arduino Micro).&lt;br /&gt;
Since the build only has two buttons (left and right) covering many functions, both buttons have multiple uses and combinations:&lt;br /&gt;
* &#039;&#039;&#039;Click Left&#039;&#039;&#039; (Down, Up under 200ms): &#039;&#039;&#039;Next Chord&#039;&#039;&#039;&lt;br /&gt;
* &#039;&#039;&#039;Hold Right &amp;amp; Click Left&#039;&#039;&#039; (Hold first): &#039;&#039;&#039;Previous Chord&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;Click Right&#039;&#039;&#039; (Down, Up under 200ms): &#039;&#039;&#039;Next Cellular Automata Rule&#039;&#039;&#039; (0-255 wraps around)&lt;br /&gt;
* &#039;&#039;&#039;Hold Left &amp;amp; Click Right&#039;&#039;&#039; (Hold first): &#039;&#039;&#039;Previous Cellular Automata Rule&#039;&#039;&#039; (0-255 wraps around)&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;Hold Left&#039;&#039;&#039; (Longer than 500ms): &#039;&#039;&#039;Increase BPM&#039;&#039;&#039;&lt;br /&gt;
* &#039;&#039;&#039;Hold Right&#039;&#039;&#039; (Longer than 500ms): &#039;&#039;&#039;Decrease BPM&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;Hold Left &amp;amp; Hold Right&#039;&#039;&#039; (Longer than 500ms): &#039;&#039;&#039;Start/Stop playing&#039;&#039;&#039;&lt;br /&gt;
* &#039;&#039;&#039;Hold Left &amp;amp; Hold Right&#039;&#039;&#039; (Longer than 2000ms): &#039;&#039;&#039;Reset everything&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
{{#ev:youtube|7doCJ39owTw|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124458</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124458"/>
		<updated>2021-05-20T17:42:39Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Demonstration */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semesterprojekt=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project about generating sound using cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB for an Arduino Micro shield that controls the simulation. The state of the simulation is then sent as MIDI over the micro-USB port of the Arduino, so the device can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the available layers. The custom PCB controls the speed and the rules used in the one-dimensional automaton. An 8x8 LED matrix visualizes the automaton.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs are two potentiometers and a button. For the LED matrix, a preassembled 8x8 module with a MAX7219 controller is mounted through the back so that it shines through the PCB. Five holes therefore have to be added for the five pins of the MAX7219.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro-look of old hand-held game-consoles in combination with a texture that was generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old Game Boy and drew outlines for the edge-cuts and silkscreen layers. After importing the general shape, I laid out all the mounting holes and pins and began to route everything on the back side of the PCB. I left a window in the mask layer where the LED matrix will be mounted, and surrounded every component with a symbolic outline in the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was run with random input and the infamous rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components, and used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To efficiently simulate the 8-bit automaton, the state of each cell and its neighbours is interpreted as a 3-bit number that is looked up in two arrays: one deciding whether a dead cell is created, the other whether a living cell dies. The rules of the automaton can be changed using the shield.&lt;br /&gt;
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation over MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) is used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to a tone. The speed, i.e. the length of the interval, is controlled by the shield.&lt;br /&gt;
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB, the following parts are necessary:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 1 button (6 mm)&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:PCB Arts -L.E.K Build.jpg|176px]][[File:PCB Arts -L.E.K Build 2.jpg|559px]]&lt;br /&gt;
&lt;br /&gt;
Due to an error I made on the left potentiometer (a missing connection), I was not able to follow my planned design and had to improvise.&lt;br /&gt;
To keep the same functionality as planned, I replaced the remaining potentiometer with another button, giving me two buttons to work with.&lt;br /&gt;
I sadly had to use cables for the LED matrix and parts of the buttons, because my GND and 5V+ nets were all connected to each other.&lt;br /&gt;
The combination of traces and cables, however, works as intended.&lt;br /&gt;
&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
To use the build, just plug it in using a micro-USB cable and enable it in the DAW as a MIDI device (named Arduino Micro).&lt;br /&gt;
Since the build only has two buttons (left and right) covering many functions, both buttons have multiple uses and combinations:&lt;br /&gt;
* &#039;&#039;&#039;Click Left&#039;&#039;&#039; (Down, Up under 200ms): &#039;&#039;&#039;Next Chord&#039;&#039;&#039;&lt;br /&gt;
* &#039;&#039;&#039;Hold Right &amp;amp; Click Left&#039;&#039;&#039; (Hold first): &#039;&#039;&#039;Previous Chord&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;Click Right&#039;&#039;&#039; (Down, Up under 200ms): &#039;&#039;&#039;Next Cellular Automata Rule&#039;&#039;&#039; (0-255 wraps around)&lt;br /&gt;
* &#039;&#039;&#039;Hold Left &amp;amp; Click Right&#039;&#039;&#039; (Hold first): &#039;&#039;&#039;Previous Cellular Automata Rule&#039;&#039;&#039; (0-255 wraps around)&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;Hold Left&#039;&#039;&#039; (Longer than 500ms): &#039;&#039;&#039;Increase BPM&#039;&#039;&#039;&lt;br /&gt;
* &#039;&#039;&#039;Hold Right&#039;&#039;&#039; (Longer than 500ms): &#039;&#039;&#039;Decrease BPM&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;Hold Left &amp;amp; Hold Right&#039;&#039;&#039; (Longer than 500ms): &#039;&#039;&#039;Start/Stop playing&#039;&#039;&#039;&lt;br /&gt;
* &#039;&#039;&#039;Hold Left &amp;amp; Hold Right&#039;&#039;&#039; (Longer than 2000ms): &#039;&#039;&#039;Reset everything&#039;&#039;&#039;&lt;br /&gt;
{{#ev:youtube|7doCJ39owTw|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124457</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124457"/>
		<updated>2021-05-20T17:40:38Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Demonstration */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semesterprojekt=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project about generating sound using cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB for an Arduino Micro shield that controls the simulation. The state of the simulation is then sent as MIDI over the micro-USB port of the Arduino, so the device can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the available layers. The custom PCB controls the speed and the rules used in the one-dimensional automaton. An 8x8 LED matrix visualizes the automaton.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs are two potentiometers and a button. For the LED matrix, a preassembled 8x8 module with a MAX7219 controller is mounted through the back so that it shines through the PCB. Five holes therefore have to be added for the five pins of the MAX7219.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro-look of old hand-held game-consoles in combination with a texture that was generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old Game Boy and drew outlines for the edge-cuts and silkscreen layers. After importing the general shape, I laid out all the mounting holes and pins and began to route everything on the back side of the PCB. I left a window in the mask layer where the LED matrix will be mounted, and surrounded every component with a symbolic outline in the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was run with random input and the infamous rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components, and used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To efficiently simulate the 8-bit automaton, the state of each cell and its neighbours is interpreted as a 3-bit number that is looked up in two arrays: one deciding whether a dead cell is created, the other whether a living cell dies. The rules of the automaton can be changed using the shield.&lt;br /&gt;
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation over MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) is used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to a tone. The speed, i.e. the length of the interval, is controlled by the shield.&lt;br /&gt;
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB, the following parts are necessary:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 1 button (6 mm)&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:PCB Arts -L.E.K Build.jpg|176px]][[File:PCB Arts -L.E.K Build 2.jpg|559px]]&lt;br /&gt;
&lt;br /&gt;
Due to an error I made on the left potentiometer (a missing connection), I was not able to follow my planned design and had to improvise.&lt;br /&gt;
To keep the same functionality as planned, I replaced the remaining potentiometer with another button, giving me two buttons to work with.&lt;br /&gt;
I sadly had to use cables for the LED matrix and parts of the buttons, because my GND and 5V+ nets were all connected to each other.&lt;br /&gt;
The combination of traces and cables, however, works as intended.&lt;br /&gt;
&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
Since the build only has two buttons (left and right) covering many functions, both buttons have multiple uses and combinations:&lt;br /&gt;
* Click Left (Down, Up under 200ms): Next Chord&lt;br /&gt;
* Hold Right &amp;amp; Click Left (Hold first): Previous Chord&lt;br /&gt;
&lt;br /&gt;
* Click Right (Down, Up under 200ms): Next Cellular Automata Rule (0-255 wraps around)&lt;br /&gt;
* Hold Left &amp;amp; Click Right (Hold first): Previous Cellular Automata Rule (0-255 wraps around)&lt;br /&gt;
&lt;br /&gt;
* Hold Left (Longer than 500ms): Increase BPM&lt;br /&gt;
* Hold Right (Longer than 500ms): Decrease BPM&lt;br /&gt;
&lt;br /&gt;
* Hold Left &amp;amp; Hold Right (Longer than 500ms): Start/Stop playing&lt;br /&gt;
* Hold Left &amp;amp; Hold Right (Longer than 2000ms): Reset everything&lt;br /&gt;
{{#ev:youtube|7doCJ39owTw|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124456</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124456"/>
		<updated>2021-05-20T17:40:18Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Demonstration */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semesterprojekt=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project about generating sound using cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB for an Arduino Micro shield that controls the simulation. The state of the simulation is then sent as MIDI over the micro-USB port of the Arduino, so the device can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the available layers. The custom PCB controls the speed and the rules used in the one-dimensional automaton. An 8x8 LED matrix visualizes the automaton.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs are two potentiometers and a button. For the LED matrix, a preassembled 8x8 module with a MAX7219 controller is mounted through the back so that it shines through the PCB. Five holes therefore have to be added for the five pins of the MAX7219.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro-look of old hand-held game-consoles in combination with a texture that was generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old Game Boy and drew outlines for the edge-cuts and silkscreen layers. After importing the general shape, I laid out all the mounting holes and pins and began to route everything on the back side of the PCB. I left a window in the mask layer where the LED matrix will be mounted, and surrounded every component with a symbolic outline in the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was run with random input and the infamous rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components, and used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To efficiently simulate the 8-bit automaton, the state of each cell and its neighbours is interpreted as a 3-bit number that is looked up in two arrays: one deciding whether a dead cell is created, the other whether a living cell dies. The rules of the automaton can be changed using the shield.&lt;br /&gt;
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation over MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) is used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to a tone. The speed, i.e. the length of the interval, is controlled by the shield.&lt;br /&gt;
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB, the following parts are necessary:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 1 button (6 mm)&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:PCB Arts -L.E.K Build.jpg|176px]][[File:PCB Arts -L.E.K Build 2.jpg|559px]]&lt;br /&gt;
&lt;br /&gt;
Due to an error I made on the left potentiometer (a missing connection), I was not able to follow my planned design and had to improvise.&lt;br /&gt;
To keep the same functionality as planned, I replaced the remaining potentiometer with another button, giving me two buttons to work with.&lt;br /&gt;
I sadly had to use cables for the LED matrix and parts of the buttons, because my GND and 5V+ nets were all connected to each other.&lt;br /&gt;
The combination of traces and cables, however, works as intended.&lt;br /&gt;
&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
Since the build only has two buttons (left and right) covering many functions, both buttons have multiple uses and combinations:&lt;br /&gt;
* Click Left (Down, Up under 200ms): Next Chord&lt;br /&gt;
* Hold Right &amp;amp; Click Left (Hold first): Previous Chord&lt;br /&gt;
* Click Right (Down, Up under 200ms): Next Cellular Automata Rule (0-255 wraps around)&lt;br /&gt;
* Hold Left &amp;amp; Click Right (Hold first): Previous Cellular Automata Rule (0-255 wraps around)&lt;br /&gt;
* Hold Left (Longer than 500ms): Increase BPM&lt;br /&gt;
* Hold Right (Longer than 500ms): Decrease BPM&lt;br /&gt;
* Hold Left &amp;amp; Hold Right (Longer than 500ms): Start/Stop playing&lt;br /&gt;
* Hold Left &amp;amp; Hold Right (Longer than 2000ms): Reset everything&lt;br /&gt;
{{#ev:youtube|7doCJ39owTw|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124455</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124455"/>
		<updated>2021-05-20T17:39:44Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Demonstration */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semesterprojekt=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project about generating sound using cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB for an Arduino Micro shield that controls the simulation. The state of the simulation is then sent as MIDI over the micro-USB port of the Arduino, so the device can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the available layers. The custom PCB controls the speed and the rules used in the one-dimensional automaton. An 8x8 LED matrix visualizes the automaton.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs are two potentiometers and a button. For the LED matrix, a preassembled 8x8 module with a MAX7219 controller is mounted through the back so that it shines through the PCB. Five holes therefore have to be added for the five pins of the MAX7219.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro-look of old hand-held game-consoles in combination with a texture that was generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old Game Boy and drew outlines for the edge-cuts and silkscreen layers. After importing the general shape, I laid out all the mounting holes and pins and began to route everything on the back side of the PCB. I left a window in the mask layer where the LED matrix will be mounted, and surrounded every component with a symbolic outline in the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was run with random inputs and the infamous rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components. It was used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
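One generation of such a texture can be sketched as follows (a minimal stand-alone C++ sketch of a single automaton step with Wolfram rule numbering, not the original generation script; grid size and seeding are left to the caller):

```cpp
// One generation of an elementary cellular automaton over n cells.
// rule encodes the next state for each 3-cell neighbourhood
// (Wolfram numbering); rule 30 was used for the texture.
void caStep(const int* cells, int* next, int n, int rule) {
    for (int i = 0; i != n; ++i) {
        // the neighbourhood wraps around at the edges
        int left  = cells[(i + n - 1) % n];
        int self  = cells[i];
        int right = cells[(i + 1) % n];
        int idx = left * 4 + self * 2 + right;   // 0..7
        next[i] = (rule >> idx) % 2;             // look up the rule bit
    }
}
```

Calling this repeatedly on a randomly seeded row and stacking the rows produces the kind of bitmap shown above.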
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To efficiently simulate the 8-bit automaton, the state of each cell and its neighbours is interpreted as a 3-bit number that is then looked up in two arrays: one for the rules determining whether a cell is created, and one for whether the cell should die. Rules for the automaton can be changed using the shield.&lt;br /&gt;
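The two-table lookup can be sketched like this (an illustrative C++ sketch, not the actual firmware; the example table values encode rule 30):

```cpp
// Two lookup tables indexed by the 3-bit neighbourhood (left, self, right).
// birth[] decides whether a dead cell becomes alive, death[] whether a
// live cell dies; together they encode one rule, and the shield rewrites
// them to change rules. The values below encode rule 30 as an example.
bool birth[8] = {false, true,  true,  true,  true,  false, false, false};
bool death[8] = {false, false, false, false, false, false, true,  true};

// Advance the 8-cell automaton packed into one byte (bit i = cell i).
unsigned stepAutomaton(unsigned state) {
    unsigned next = 0;
    unsigned pow = 1;                    // 2^i, built up without left shifts
    for (int i = 0; i != 8; ++i, pow *= 2) {
        int left  = (state >> ((i + 7) % 8)) % 2;
        int self  = (state >> i) % 2;
        int right = (state >> ((i + 1) % 8)) % 2;
        int idx = left * 4 + self * 2 + right;
        bool alive = self ? !death[idx] : birth[idx];
        if (alive) next += pow;
    }
    return next;
}
```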
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation as MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) was used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to the tones. The speed, i.e. the interval time, is controlled by the shield.&lt;br /&gt;
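The mapping from automaton state to notes might look like this (an illustrative sketch only; the scale, root note and function name are assumptions, not the actual firmware):

```cpp
// C major scale steps within one octave (illustrative; the real scales
// are hardcoded in the firmware). Root 60 is middle C.
const int SCALE[8] = {0, 2, 4, 5, 7, 9, 11, 12};
const int ROOT = 60;

// Each set bit of the automaton state selects one note of the scale.
// Sent together the notes form a chord; iterated with a delay, an
// arpeggio. Returns the number of notes written into notes[] (max 8).
int stateToNotes(unsigned state, int* notes) {
    int count = 0;
    for (int i = 0; i != 8; ++i)
        if ((state >> i) % 2 == 1)
            notes[count++] = ROOT + SCALE[i];
    return count;
}
```

On the device each resulting note would then be packed into a MIDIUSB event and sent over USB (the library provides MidiUSB.sendMIDI() and MidiUSB.flush() for this).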
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB the following parts are necessary:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 1 push button (6 mm)&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:PCB Arts -L.E.K Build.jpg|176px]][[File:PCB Arts -L.E.K Build 2.jpg|559px]]&lt;br /&gt;
&lt;br /&gt;
Due to an error I made on the left potentiometer (a missing connection), I was not able to follow my planned design and had to improvise.&lt;br /&gt;
To keep the same functionality as planned, I replaced the remaining potentiometer with another button, giving me two buttons to work with.&lt;br /&gt;
I sadly had to use cables for the LED matrix and parts of the buttons, because my GND and 5V+ traces were all connected to each other.&lt;br /&gt;
The combination of traces and cables, however, works as intended.&lt;br /&gt;
&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
Since the build only has two buttons for many functions, both buttons have multiple uses and combinations:&lt;br /&gt;
* Click Left Button (Down, Up under 200ms): Next Chord&lt;br /&gt;
* Hold Right Button &amp;amp; Click Left Button (Hold first): Previous Chord&lt;br /&gt;
* Click Right Button (Down, Up under 200ms): Next Cellular Automata Rule (0-255 wraps around)&lt;br /&gt;
* Hold Left Button &amp;amp; Click Right Button (Hold first): Previous Cellular Automata Rule (0-255 wraps around)&lt;br /&gt;
* Hold Left Button (Longer than 500ms): Increase BPM&lt;br /&gt;
* Hold Right Button (Longer than 500ms): Decrease BPM&lt;br /&gt;
* Hold Left Button &amp;amp; Hold Right Button (Longer than 500ms): Start/Stop playing&lt;br /&gt;
* Hold Left Button &amp;amp; Hold Right Button (Longer than 2000ms): Reset everything&lt;br /&gt;
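The click/hold distinction above can be implemented by timing each press (a sketch of the threshold logic only; the real firmware additionally tracks both buttons to detect the combinations):

```cpp
enum Press { CLICK, HOLD, LONG_HOLD, NONE };

// Classify a completed press by how long the button was held, using the
// thresholds from the list above: under 200 ms is a click, 500 ms and
// longer a hold, 2000 ms and longer the long hold used for the reset
// combination. Presses between 200 and 500 ms are ignored.
Press classifyPress(unsigned long downMs, unsigned long upMs) {
    unsigned long held = upMs - downMs;
    if (held >= 2000) return LONG_HOLD;
    if (held >= 500)  return HOLD;
    if (held >= 200)  return NONE;
    return CLICK;
}
```

On the Arduino, downMs and upMs would come from millis() at the press and release edges.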
{{#ev:youtube|7doCJ39owTw|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading:==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124454</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124454"/>
		<updated>2021-05-20T17:33:14Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Build */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semester Project=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project about generating sound using cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB for an Arduino Micro shield which controls the simulation. The state of the simulation is then sent as MIDI over the USB port of the Arduino, so the device can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the layers available. The custom PCB controls the speed and the rules used in the one-dimensional automaton. To visualize the automaton, an 8x8 LED matrix is used.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs used are two potentiometers and a button. For the LED matrix, a preassembled 8x8 matrix with a MAX7219 controller is mounted through the back so that it shines through the PCB. Five holes therefore have to be added for the five pins of the MAX7219.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro-look of old hand-held game-consoles in combination with a texture that was generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old Game Boy and began to draw outlines for the edge-cuts and silkscreen layer. After importing the general shape, I laid out all the mounting holes and pins and began to wire everything on the back side of the PCB. I left a window in the mask layer where the LED matrix is going to be mounted, and surrounded every component with a symbolic outline in the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was run with random inputs and the infamous rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components. It was used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To efficiently simulate the 8-bit automaton, the state of each cell and its neighbours is interpreted as a 3-bit number that is then looked up in two arrays: one for the rules determining whether a cell is created, and one for whether the cell should die. Rules for the automaton can be changed using the shield.&lt;br /&gt;
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation as MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) was used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to the tones. The speed, i.e. the interval time, is controlled by the shield.&lt;br /&gt;
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB the following parts are necessary:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 1 push button (6 mm)&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:PCB Arts -L.E.K Build.jpg|176px]][[File:PCB Arts -L.E.K Build 2.jpg|559px]]&lt;br /&gt;
&lt;br /&gt;
Due to an error I made on the left potentiometer (a missing connection), I was not able to follow my planned design and had to improvise.&lt;br /&gt;
To keep the same functionality as planned, I replaced the remaining potentiometer with another button, giving me two buttons to work with.&lt;br /&gt;
I sadly had to use cables for the LED matrix and parts of the buttons, because my GND and 5V+ traces were all connected to each other.&lt;br /&gt;
The combination of traces and cables, however, works as intended.&lt;br /&gt;
&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
{{#ev:youtube|7doCJ39owTw|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading:==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124453</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124453"/>
		<updated>2021-05-20T17:33:01Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Build */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semester Project=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project about generating sound using cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB for an Arduino Micro shield which controls the simulation. The state of the simulation is then sent as MIDI over the USB port of the Arduino, so the device can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the layers available. The custom PCB controls the speed and the rules used in the one-dimensional automaton. To visualize the automaton, an 8x8 LED matrix is used.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs used are two potentiometers and a button. For the LED matrix, a preassembled 8x8 matrix with a MAX7219 controller is mounted through the back so that it shines through the PCB. Five holes therefore have to be added for the five pins of the MAX7219.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro-look of old hand-held game-consoles in combination with a texture that was generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old Game Boy and began to draw outlines for the edge-cuts and silkscreen layer. After importing the general shape, I laid out all the mounting holes and pins and began to wire everything on the back side of the PCB. I left a window in the mask layer where the LED matrix is going to be mounted, and surrounded every component with a symbolic outline in the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was run with random inputs and the infamous rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components. It was used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To efficiently simulate the 8-bit automaton, the state of each cell and its neighbours is interpreted as a 3-bit number that is then looked up in two arrays: one for the rules determining whether a cell is created, and one for whether the cell should die. Rules for the automaton can be changed using the shield.&lt;br /&gt;
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation as MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) was used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to the tones. The speed, i.e. the interval time, is controlled by the shield.&lt;br /&gt;
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB the following parts are necessary:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 1 push button (6 mm)&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:PCB Arts -L.E.K Build.jpg|176px]][[File:PCB Arts -L.E.K Build 2.jpg|559px]]&lt;br /&gt;
&lt;br /&gt;
Due to an error I made on the left potentiometer (a missing connection), I was not able to follow my planned design and had to improvise.&lt;br /&gt;
To keep the same functionality as planned, I replaced the remaining potentiometer with another button, giving me two buttons to work with.&lt;br /&gt;
I sadly had to use cables for the LED matrix and parts of the buttons, because my GND and 5V+ traces were all connected to each other.&lt;br /&gt;
The combination of traces and cables, however, works as intended.&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
{{#ev:youtube|7doCJ39owTw|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading:==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124452</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124452"/>
		<updated>2021-05-20T17:32:04Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Build */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semester Project=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project about generating sound using cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB for an Arduino Micro shield which controls the simulation. The state of the simulation is then sent as MIDI over the USB port of the Arduino, so the device can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the layers available. The custom PCB controls the speed and the rules used in the one-dimensional automaton. To visualize the automaton, an 8x8 LED matrix is used.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs used are two potentiometers and a button. For the LED matrix, a preassembled 8x8 matrix with a MAX7219 controller is mounted through the back so that it shines through the PCB. Five holes therefore have to be added for the five pins of the MAX7219.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro-look of old hand-held game-consoles in combination with a texture that was generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old Game Boy and began to draw outlines for the edge-cuts and silkscreen layer. After importing the general shape, I laid out all the mounting holes and pins and began to wire everything on the back side of the PCB. I left a window in the mask layer where the LED matrix is going to be mounted, and surrounded every component with a symbolic outline in the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was run with random inputs and the infamous rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components. It was used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To efficiently simulate the 8-bit automaton, the state of each cell and its neighbours is interpreted as a 3-bit number that is then looked up in two arrays: one for the rules determining whether a cell is created, and one for whether the cell should die. Rules for the automaton can be changed using the shield.&lt;br /&gt;
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation as MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) was used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to the tones. The speed, i.e. the interval time, is controlled by the shield.&lt;br /&gt;
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB the following parts are necessary:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 1 push button (6 mm)&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:PCB Arts -L.E.K Build.jpg|232px]][[File:PCB Arts -L.E.K Build 2.jpg|503px]]&lt;br /&gt;
&lt;br /&gt;
Due to an error I made on the left potentiometer (a missing connection), I was not able to follow my planned design and had to improvise.&lt;br /&gt;
To keep the same functionality as planned, I replaced the remaining potentiometer with another button, giving me two buttons to work with.&lt;br /&gt;
I sadly had to use cables for the LED matrix and parts of the buttons, because my GND and 5V+ traces were all connected to each other.&lt;br /&gt;
The combination of traces and cables, however, works as intended.&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
{{#ev:youtube|7doCJ39owTw|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading:==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124451</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124451"/>
		<updated>2021-05-20T17:31:04Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Build */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semester Project=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project about generating sound using cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB for an Arduino Micro shield which controls the simulation. The state of the simulation is then sent as MIDI over the USB port of the Arduino, so the device can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the layers available. The custom PCB controls the speed and the rules used in the one-dimensional automaton. To visualize the automaton, an 8x8 LED matrix is used.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs used are two potentiometers and a button. For the LED matrix, a preassembled 8x8 matrix with a MAX7219 controller is mounted through the back so that it shines through the PCB. Five holes therefore have to be added for the five pins of the MAX7219.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro-look of old hand-held game-consoles in combination with a texture that was generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old Game Boy and began to draw outlines for the edge-cuts and silkscreen layer. After importing the general shape, I laid out all the mounting holes and pins and began to wire everything on the back side of the PCB. I left a window in the mask layer where the LED matrix is going to be mounted, and surrounded every component with a symbolic outline in the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was run with random inputs and the infamous rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components. It was used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To efficiently simulate the 8-bit automaton, the state of each cell and its neighbours is interpreted as a 3-bit number that is then looked up in two arrays: one for the rules determining whether a cell is created, and one for whether the cell should die. Rules for the automaton can be changed using the shield.&lt;br /&gt;
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation as MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) was used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to the tones. The speed, i.e. the interval time, is controlled by the shield.&lt;br /&gt;
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB the following parts are necessary:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 1 push button (6 mm)&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:PCB Arts -L.E.K Build.jpg|220px]][[File:PCB Arts -L.E.K Build 2.jpg|510px]]&lt;br /&gt;
&lt;br /&gt;
Due to an error I made on the left potentiometer (a missing connection), I was not able to follow my planned design and had to improvise.&lt;br /&gt;
To keep the same functionality as planned, I replaced the remaining potentiometer with another button, giving me two buttons to work with.&lt;br /&gt;
I sadly had to use cables for the LED matrix and parts of the buttons, because my GND and 5V+ traces were all connected to each other.&lt;br /&gt;
The combination of traces and cables, however, works as intended.&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
{{#ev:youtube|7doCJ39owTw|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading:==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124450</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124450"/>
		<updated>2021-05-20T17:30:57Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Build */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semester Project=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project about generating sound using cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB for an Arduino Micro shield which controls the simulation. The state of the simulation is then sent as MIDI over the USB port of the Arduino, so the device can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the layers available. The custom PCB controls the speed and the rules used in the one-dimensional automaton. To visualize the automaton, an 8x8 LED matrix is used.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs used are two potentiometers and a button. For the LED matrix, a preassembled 8x8 matrix with a MAX7219 controller is mounted through the back so that it shines through the PCB. Five holes therefore have to be added for the five pins of the MAX7219.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro-look of old hand-held game-consoles in combination with a texture that was generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old Game Boy and began to draw outlines for the edge-cuts and silkscreen layer. After importing the general shape, I laid out all the mounting holes and pins and began to wire everything on the back side of the PCB. I left a window in the mask layer where the LED matrix is going to be mounted, and surrounded every component with a symbolic outline in the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was run with random inputs and the infamous rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components. It was used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To efficiently simulate the 8-bit automaton, the state of each cell and its neighbours is interpreted as a 3-bit number that is then looked up in two arrays: one for the rules determining whether a cell is created, and one for whether the cell should die. Rules for the automaton can be changed using the shield.&lt;br /&gt;
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation as MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) was used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to the tones. The speed, i.e. the interval time, is controlled by the shield.&lt;br /&gt;
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB the following parts are necessary:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 1 push button (6 mm)&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:PCB Arts -L.E.K Build.jpg|220px]][[File:PCB Arts -L.E.K Build 2.jpg|520px]]&lt;br /&gt;
&lt;br /&gt;
Due to an error I made on the left potentiometer (a missing connection), I was not able to follow my planned design and had to improvise.&lt;br /&gt;
To keep the same functionality as planned, I replaced the remaining potentiometer with another button, giving me two buttons to work with.&lt;br /&gt;
I sadly had to use cables for the LED matrix and parts of the buttons, because my GND and 5V+ traces were all connected to each other.&lt;br /&gt;
The combination of traces and cables, however, works as intended.&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
{{#ev:youtube|7doCJ39owTw|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading:==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124449</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124449"/>
		<updated>2021-05-20T17:30:49Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Build */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semester Project=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project about generating sound using cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB for an Arduino Micro shield which controls the simulation. The state of the simulation is then sent as MIDI over the USB port of the Arduino, so the device can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the layers available. The custom PCB controls the speed and the rules used in the one-dimensional automaton. To visualize the automaton, an 8x8 LED matrix is used.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs used are two potentiometers and a button. For the LED matrix, a preassembled 8x8 matrix with a MAX7219 controller is mounted from the back so that it shines through the PCB. Five holes therefore have to be added for the five pins of the MAX7219.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro look of old handheld game consoles in combination with a texture generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old Game Boy and drew outlines for the edge-cut and silkscreen layers. After importing the general shape, I laid out all the mounting holes and pins and wired everything on the back side of the PCB. I left a window in the mask layer where the LED matrix is mounted, and surrounded every component with a symbolic outline on the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was run with a random initial state and the infamous Rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components. It was used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
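The texture generation step can be sketched in a few lines (a minimal Python sketch, not the script actually used; the width, height and seeding are assumptions):&lt;br /&gt;

```python
import random

def rule30_texture(width=64, height=64, seed=None):
    """Generate a one-dimensional cellular automaton texture with Rule 30.

    Each row is the next generation of the row above; the new state of a
    cell depends on the 3-bit neighbourhood (left, self, right).
    """
    rng = random.Random(seed)
    row = [rng.randint(0, 1) for _ in range(width)]  # random initial state
    rows = [row]
    for _ in range(height - 1):
        nxt = []
        for i in range(width):
            # 3-bit neighbourhood as a number 0..7 (edges wrap around)
            pattern = row[i - 1] * 4 + row[i] * 2 + row[(i + 1) % width]
            nxt.append((30 >> pattern) % 2)  # Rule 30: bit `pattern` of 30
        rows.append(nxt)
        row = nxt
    return rows
```

The rows can then be written out as a black-and-white bitmap, upscaled and imported into Inkscape.&lt;br /&gt;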
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To simulate the 8-bit automaton efficiently, the state of each cell and its neighbours is interpreted as a 3-bit number that is looked up in two arrays: one for the rules that decide whether a cell is created, and one for whether the cell should die. The rules of the automaton can be changed using the shield.&lt;br /&gt;
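The two-array lookup could look roughly like this (a Python sketch of the idea, not the actual Arduino code; the table values shown encode Rule 30 and are only an example):&lt;br /&gt;

```python
# Lookup tables indexed by the 3-bit neighbourhood (left, self, right).
# BORN is consulted for dead cells, DIES for living cells.
# These example values reproduce Rule 30; the shield can change them.
BORN = [0, 1, 0, 0, 1, 0, 0, 0]
DIES = [0, 0, 0, 0, 0, 0, 1, 1]

def step(cells):
    """Advance the eight-cell automaton by one generation (edges wrap)."""
    n = len(cells)
    out = []
    for i in range(n):
        pattern = cells[i - 1] * 4 + cells[i] * 2 + cells[(i + 1) % n]
        if cells[i]:
            out.append(0 if DIES[pattern] else 1)
        else:
            out.append(1 if BORN[pattern] else 0)
    return out
```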
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation as MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) was used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to a tone. The speed, i.e. the length of the interval, is controlled by the shield.&lt;br /&gt;
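One possible shape of the state-to-chord mapping (Python for illustration only; the scale, base note and chord construction here are assumptions, and the actual firmware sends the resulting notes via the MIDIUSB library):&lt;br /&gt;

```python
# A major scale hardcoded as semitone offsets from a base MIDI note
# (assumed values; the real scales are hardcoded in the firmware).
SCALE = [0, 2, 4, 5, 7, 9, 11]
BASE_NOTE = 48  # MIDI note number for C3

def state_to_notes(state, chord_size=3):
    """Interpret the eight cells as an 8-bit number and build a chord.

    The value 0..255 is folded onto a scale degree; the returned MIDI
    note numbers can be sent all at once or one by one as an arpeggio.
    """
    value = 0
    for cell in state:  # most significant cell first
        value = value * 2 + cell
    degree = value % len(SCALE)
    notes = []
    for k in range(chord_size):  # stack thirds: degree, degree+2, degree+4
        octave, idx = divmod(degree + 2 * k, len(SCALE))
        notes.append(BASE_NOTE + 12 * octave + SCALE[idx])
    return notes
```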
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB, the following parts are needed:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 1 push button (6 mm)&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:PCB Arts -L.E.K Build.jpg|240px]][[File:PCB Arts -L.E.K Build 2.jpg|520px]]&lt;br /&gt;
&lt;br /&gt;
Due to an error on the left potentiometer (a missing connection), I was not able to follow my planned design and had to improvise.&lt;br /&gt;
To keep the same functionality as planned, I replaced the remaining potentiometer with another button, giving me two buttons to work with.&lt;br /&gt;
Sadly, I had to use cables for the LED matrix and for parts of the buttons, because my GND and 5V+ nets were all connected to each other.&lt;br /&gt;
The combination of copper traces and cables, however, works as intended.&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
{{#ev:youtube|7doCJ39owTw|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading:==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124448</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124448"/>
		<updated>2021-05-20T17:30:42Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Build */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semesterprojekt=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project on generating sound with cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB as an Arduino Micro shield that controls the simulation. The state of the simulation is then sent as MIDI over the USB port of the Arduino, so the board can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the available layers. The custom PCB controls the speed and the rules used in the one-dimensional automaton. To visualize the automaton, an 8x8 LED matrix is used.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs used are two potentiometers and a button. For the LED matrix, a preassembled 8x8 matrix with a MAX7219 controller is mounted from the back so that it shines through the PCB. Five holes therefore have to be added for the five pins of the MAX7219.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro look of old handheld game consoles in combination with a texture generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old Game Boy and drew outlines for the edge-cut and silkscreen layers. After importing the general shape, I laid out all the mounting holes and pins and wired everything on the back side of the PCB. I left a window in the mask layer where the LED matrix is mounted, and surrounded every component with a symbolic outline on the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was run with a random initial state and the infamous Rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components. It was used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To simulate the 8-bit automaton efficiently, the state of each cell and its neighbours is interpreted as a 3-bit number that is looked up in two arrays: one for the rules that decide whether a cell is created, and one for whether the cell should die. The rules of the automaton can be changed using the shield.&lt;br /&gt;
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation as MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) was used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to a tone. The speed, i.e. the length of the interval, is controlled by the shield.&lt;br /&gt;
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB, the following parts are needed:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 1 push button (6 mm)&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:PCB Arts -L.E.K Build.jpg|240px]][[File:PCB Arts -L.E.K Build 2.jpg|520px]]&lt;br /&gt;
Due to an error on the left potentiometer (a missing connection), I was not able to follow my planned design and had to improvise.&lt;br /&gt;
To keep the same functionality as planned, I replaced the remaining potentiometer with another button, giving me two buttons to work with.&lt;br /&gt;
Sadly, I had to use cables for the LED matrix and for parts of the buttons, because my GND and 5V+ nets were all connected to each other.&lt;br /&gt;
The combination of copper traces and cables, however, works as intended.&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
{{#ev:youtube|7doCJ39owTw|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading:==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124447</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124447"/>
		<updated>2021-05-20T17:30:31Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Build */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semesterprojekt=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project on generating sound with cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB as an Arduino Micro shield that controls the simulation. The state of the simulation is then sent as MIDI over the USB port of the Arduino, so the board can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the available layers. The custom PCB controls the speed and the rules used in the one-dimensional automaton. To visualize the automaton, an 8x8 LED matrix is used.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs used are two potentiometers and a button. For the LED matrix, a preassembled 8x8 matrix with a MAX7219 controller is mounted from the back so that it shines through the PCB. Five holes therefore have to be added for the five pins of the MAX7219.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro look of old handheld game consoles in combination with a texture generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old Game Boy and drew outlines for the edge-cut and silkscreen layers. After importing the general shape, I laid out all the mounting holes and pins and wired everything on the back side of the PCB. I left a window in the mask layer where the LED matrix is mounted, and surrounded every component with a symbolic outline on the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was run with a random initial state and the infamous Rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components. It was used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To simulate the 8-bit automaton efficiently, the state of each cell and its neighbours is interpreted as a 3-bit number that is looked up in two arrays: one for the rules that decide whether a cell is created, and one for whether the cell should die. The rules of the automaton can be changed using the shield.&lt;br /&gt;
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation as MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) was used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to a tone. The speed, i.e. the length of the interval, is controlled by the shield.&lt;br /&gt;
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB, the following parts are needed:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 1 push button (6 mm)&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:PCB Arts -L.E.K Build.jpg|240px]][[File:PCB Arts -L.E.K Build 2.jpg|520px]]&lt;br /&gt;
&lt;br /&gt;
Due to an error on the left potentiometer (a missing connection), I was not able to follow my planned design and had to improvise.&lt;br /&gt;
To keep the same functionality as planned, I replaced the remaining potentiometer with another button, giving me two buttons to work with.&lt;br /&gt;
Sadly, I had to use cables for the LED matrix and for parts of the buttons, because my GND and 5V+ nets were all connected to each other.&lt;br /&gt;
The combination of copper traces and cables, however, works as intended.&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
{{#ev:youtube|7doCJ39owTw|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading:==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124446</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124446"/>
		<updated>2021-05-20T17:27:49Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Build */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semesterprojekt=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project on generating sound with cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB as an Arduino Micro shield that controls the simulation. The state of the simulation is then sent as MIDI over the USB port of the Arduino, so the board can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the available layers. The custom PCB controls the speed and the rules used in the one-dimensional automaton. To visualize the automaton, an 8x8 LED matrix is used.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs used are two potentiometers and a button. For the LED matrix, a preassembled 8x8 matrix with a MAX7219 controller is mounted from the back so that it shines through the PCB. Five holes therefore have to be added for the five pins of the MAX7219.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro look of old handheld game consoles in combination with a texture generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old Game Boy and drew outlines for the edge-cut and silkscreen layers. After importing the general shape, I laid out all the mounting holes and pins and wired everything on the back side of the PCB. I left a window in the mask layer where the LED matrix is mounted, and surrounded every component with a symbolic outline on the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was run with a random initial state and the infamous Rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components. It was used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To simulate the 8-bit automaton efficiently, the state of each cell and its neighbours is interpreted as a 3-bit number that is looked up in two arrays: one for the rules that decide whether a cell is created, and one for whether the cell should die. The rules of the automaton can be changed using the shield.&lt;br /&gt;
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation as MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) was used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to a tone. The speed, i.e. the length of the interval, is controlled by the shield.&lt;br /&gt;
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB, the following parts are needed:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 1 push button (6 mm)&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:PCB Arts -L.E.K Build.jpg|253px]][[File:PCB Arts -L.E.K Build 2.jpg|508px]]&lt;br /&gt;
&lt;br /&gt;
Due to an error on the left potentiometer (a missing connection), I was not able to follow my planned design and had to improvise.&lt;br /&gt;
To keep the same functionality as planned, I replaced the remaining potentiometer with another button, giving me two buttons to work with.&lt;br /&gt;
Sadly, I had to use cables for the LED matrix and for parts of the buttons, because my GND and 5V+ nets were all connected to each other.&lt;br /&gt;
The combination of copper traces and cables, however, works as intended.&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
{{#ev:youtube|7doCJ39owTw|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading:==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:PCB_Arts_-L.E.K_Build_2.jpg&amp;diff=124445</id>
		<title>File:PCB Arts -L.E.K Build 2.jpg</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:PCB_Arts_-L.E.K_Build_2.jpg&amp;diff=124445"/>
		<updated>2021-05-20T17:26:40Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: File uploaded with MsUpload&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124444</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124444"/>
		<updated>2021-05-20T17:23:57Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Build */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semesterprojekt=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project on generating sound with cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB as an Arduino Micro shield that controls the simulation. The state of the simulation is then sent as MIDI over the USB port of the Arduino, so the board can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the available layers. The custom PCB controls the speed and the rules used in the one-dimensional automaton. To visualize the automaton, an 8x8 LED matrix is used.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs used are two potentiometers and a button. For the LED matrix, a preassembled 8x8 matrix with a MAX7219 controller is mounted from the back so that it shines through the PCB. Five holes therefore have to be added for the five pins of the MAX7219.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro look of old handheld game consoles in combination with a texture generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old Game Boy and drew outlines for the edge-cut and silkscreen layers. After importing the general shape, I laid out all the mounting holes and pins and wired everything on the back side of the PCB. I left a window in the mask layer where the LED matrix is mounted, and surrounded every component with a symbolic outline on the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was run with a random initial state and the infamous Rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components. It was used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To simulate the 8-bit automaton efficiently, the state of each cell and its neighbours is interpreted as a 3-bit number that is looked up in two arrays: one for the rules that decide whether a cell is created, and one for whether the cell should die. The rules of the automaton can be changed using the shield.&lt;br /&gt;
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation as MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) was used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to a tone. The speed, i.e. the length of the interval, is controlled by the shield.&lt;br /&gt;
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB, the following parts are needed:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 1 push button (6 mm)&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:PCB Arts -L.E.K Build.jpg|200px]]&lt;br /&gt;
&lt;br /&gt;
Due to an error on the left potentiometer (a missing connection), I was not able to follow my planned design and had to improvise.&lt;br /&gt;
To keep the same functionality as planned, I replaced the remaining potentiometer with another button, giving me two buttons to work with.&lt;br /&gt;
Sadly, I had to use cables for the LED matrix and for parts of the buttons, because my GND and 5V+ nets were all connected to each other.&lt;br /&gt;
The combination of copper traces and cables, however, works as intended.&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
{{#ev:youtube|7doCJ39owTw|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading:==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124443</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124443"/>
		<updated>2021-05-20T17:23:42Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Build */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semesterprojekt=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project on generating sound with cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB as an Arduino Micro shield that controls the simulation. The state of the simulation is then sent as MIDI over the USB port of the Arduino, so the board can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the available layers. The custom PCB controls the speed and the rules used in the one-dimensional automaton. To visualize the automaton, an 8x8 LED matrix is used.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs used are two potentiometers and a button. For the LED matrix, a preassembled 8x8 matrix with a MAX7219 controller is mounted from the back so that it shines through the PCB. Five holes therefore have to be added for the five pins of the MAX7219.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro look of old handheld game consoles in combination with a texture generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old Game Boy and drew outlines for the edge-cut and silkscreen layers. After importing the general shape, I laid out all the mounting holes and pins and wired everything on the back side of the PCB. I left a window in the mask layer where the LED matrix is mounted, and surrounded every component with a symbolic outline on the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was run with a random initial state and the infamous Rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components. It was used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To simulate the 8-bit automaton efficiently, the state of each cell and its neighbours is interpreted as a 3-bit number that is looked up in two arrays: one for the rules that decide whether a cell is created, and one for whether the cell should die. The rules of the automaton can be changed using the shield.&lt;br /&gt;
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation as MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) was used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to a tone. The speed, i.e. the length of the interval, is controlled by the shield.&lt;br /&gt;
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB, the following parts are needed:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 1 push button (6 mm)&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:PCB Arts -L.E.K Build.jpg|200px]]&lt;br /&gt;
Due to an error on the left potentiometer (a missing connection), I was not able to follow my planned design and had to improvise.&lt;br /&gt;
To keep the same functionality as planned, I replaced the remaining potentiometer with another button, giving me two buttons to work with.&lt;br /&gt;
Sadly, I had to use cables for the LED matrix and for parts of the buttons, because my GND and 5V+ nets were all connected to each other.&lt;br /&gt;
The combination of copper traces and cables, however, works as intended.&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
{{#ev:youtube|7doCJ39owTw|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading:==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124442</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124442"/>
		<updated>2021-05-20T17:19:40Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Build */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semesterprojekt=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project about generating sound using cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB for an Arduino Micro shield which controls the simulation. The state of the simulation is then sent as MIDI over the USB port of the Arduino, so the board can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the available layers. The custom PCB controls the speed and the rules used in the one-dimensional automaton. To visualize the automaton, an 8x8 LED matrix is used.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs used are two potentiometers and a button. For the LED matrix, a preassembled 8x8 matrix with a MAX7219 controller is mounted from the back so that it shines through the PCB. Thus five holes have to be added for the five pins of the MAX7219 module.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro-look of old hand-held game-consoles in combination with a texture that was generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old GameBoy and began to draw outlines for the edge-cut and silkscreen layers. After importing the general shape, I laid out all the mounting holes and pins and began to wire everything on the back side of the PCB. I left a window in the mask layer where the LED matrix is going to be mounted, and surrounded every component with a symbolic outline on the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was used with random input and the infamous rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components. It was used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To simulate the 8-bit automaton efficiently, the state of each cell and its neighbours is interpreted as a 3-bit number that is then looked up in two arrays: one holding the rules for whether a cell is created, and one for whether the cell should die. The rules of the automaton can be changed using the shield.&lt;br /&gt;
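As a minimal sketch of this two-array lookup (Python, not the original firmware; the born/die split of rule 30 shown here is an assumption based on the description), with the 3-bit index built from the left neighbour, the cell itself and the right neighbour:

```python
# Sketch of the two-array rule lookup described above. Cells live in
# the bits of an 8-bit integer; edges wrap around.
def step(state, born, die):
    """Advance the 8-cell automaton one generation."""
    new = 0
    for i in range(8):
        left  = (state // (2 ** ((i + 1) % 8))) % 2
        mid   = (state // (2 ** i)) % 2
        right = (state // (2 ** ((i - 1) % 8))) % 2
        idx = left * 4 + mid * 2 + right      # 3-bit neighbourhood index
        alive = mid
        if alive == 0 and born[idx]:          # dead cell is created
            alive = 1
        elif alive == 1 and die[idx]:         # live cell dies
            alive = 0
        new += alive * (2 ** i)
    return new

# Rule 30 split into the two tables: born[idx] is consulted for dead
# cells, die[idx] for live cells (index bit 1 is the cell itself).
BORN = [0, 1, 0, 0, 1, 0, 0, 0]
DIE  = [0, 0, 0, 0, 0, 0, 1, 1]
```

A single live centre cell (state 16) grows to three live cells (state 56) after one step, matching the familiar rule-30 triangle.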
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation over MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) was used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to the tones. The speed, i.e. the length of the interval, is controlled by the shield.&lt;br /&gt;
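The chord mapping can be sketched like this (Python for illustration; the scale table and the triad construction are assumptions, since the actual hardcoded chords are not documented here):

```python
# Hypothetical sketch of the state-to-chord mapping described above.
# SCALE and the triad construction are illustrative assumptions.
SCALE = [60, 62, 64, 65, 67, 69, 71, 72]  # C major, as MIDI note numbers

def state_to_chord(state):
    """Map the 8-bit automaton state to a triad on the selected scale."""
    root = state % 8                      # fold the state onto 8 degrees
    return [SCALE[(root + d) % 8] for d in (0, 2, 4)]
```

The resulting notes can then be sent all at once (a chord) or one after another (an arpeggio) as MIDIUSB note-on events.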
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB the following parts are necessary:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 6 mm push button&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:PCB Arts -L.E.K Build.jpg|200px]]&lt;br /&gt;
&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
{{#ev:youtube|7doCJ39owTw|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading:==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124441</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124441"/>
		<updated>2021-05-20T17:19:31Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Build */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semesterprojekt=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project about generating sound using cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB for an Arduino Micro shield which controls the simulation. The state of the simulation is then sent as MIDI over the USB port of the Arduino, so the board can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the available layers. The custom PCB controls the speed and the rules used in the one-dimensional automaton. To visualize the automaton, an 8x8 LED matrix is used.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs used are two potentiometers and a button. For the LED matrix, a preassembled 8x8 matrix with a MAX7219 controller is mounted from the back so that it shines through the PCB. Thus five holes have to be added for the five pins of the MAX7219 module.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro-look of old hand-held game-consoles in combination with a texture that was generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old GameBoy and began to draw outlines for the edge-cut and silkscreen layers. After importing the general shape, I laid out all the mounting holes and pins and began to wire everything on the back side of the PCB. I left a window in the mask layer where the LED matrix is going to be mounted, and surrounded every component with a symbolic outline on the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was used with random input and the infamous rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components. It was used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To simulate the 8-bit automaton efficiently, the state of each cell and its neighbours is interpreted as a 3-bit number that is then looked up in two arrays: one holding the rules for whether a cell is created, and one for whether the cell should die. The rules of the automaton can be changed using the shield.&lt;br /&gt;
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation over MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) was used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to the tones. The speed, i.e. the length of the interval, is controlled by the shield.&lt;br /&gt;
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB the following parts are necessary:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 6 mm push button&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:PCB Arts -L.E.K Build.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
{{#ev:youtube|7doCJ39owTw|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading:==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:PCB_Arts_-L.E.K_Build.jpg&amp;diff=124440</id>
		<title>File:PCB Arts -L.E.K Build.jpg</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:PCB_Arts_-L.E.K_Build.jpg&amp;diff=124440"/>
		<updated>2021-05-20T17:19:18Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: File uploaded with MsUpload&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124439</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124439"/>
		<updated>2021-05-20T17:18:56Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Build */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semesterprojekt=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project about generating sound using cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB for an Arduino Micro shield which controls the simulation. The state of the simulation is then sent as MIDI over the USB port of the Arduino, so the board can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the available layers. The custom PCB controls the speed and the rules used in the one-dimensional automaton. To visualize the automaton, an 8x8 LED matrix is used.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs used are two potentiometers and a button. For the LED matrix, a preassembled 8x8 matrix with a MAX7219 controller is mounted from the back so that it shines through the PCB. Thus five holes have to be added for the five pins of the MAX7219 module.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro-look of old hand-held game-consoles in combination with a texture that was generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old GameBoy and began to draw outlines for the edge-cut and silkscreen layers. After importing the general shape, I laid out all the mounting holes and pins and began to wire everything on the back side of the PCB. I left a window in the mask layer where the LED matrix is going to be mounted, and surrounded every component with a symbolic outline on the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was used with random input and the infamous rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components. It was used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To simulate the 8-bit automaton efficiently, the state of each cell and its neighbours is interpreted as a 3-bit number that is then looked up in two arrays: one holding the rules for whether a cell is created, and one for whether the cell should die. The rules of the automaton can be changed using the shield.&lt;br /&gt;
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation over MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) was used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to the tones. The speed, i.e. the length of the interval, is controlled by the shield.&lt;br /&gt;
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB the following parts are necessary:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 6 mm push button&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
{{#ev:youtube|7doCJ39owTw|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading:==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124438</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124438"/>
		<updated>2021-05-20T17:11:45Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Demonstration */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semesterprojekt=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project about generating sound using cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB for an Arduino Micro shield which controls the simulation. The state of the simulation is then sent as MIDI over the USB port of the Arduino, so the board can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the available layers. The custom PCB controls the speed and the rules used in the one-dimensional automaton. To visualize the automaton, an 8x8 LED matrix is used.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs used are two potentiometers and a button. For the LED matrix, a preassembled 8x8 matrix with a MAX7219 controller is mounted from the back so that it shines through the PCB. Thus five holes have to be added for the five pins of the MAX7219 module.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro-look of old hand-held game-consoles in combination with a texture that was generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old GameBoy and began to draw outlines for the edge-cut and silkscreen layers. After importing the general shape, I laid out all the mounting holes and pins and began to wire everything on the back side of the PCB. I left a window in the mask layer where the LED matrix is going to be mounted, and surrounded every component with a symbolic outline on the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was used with random input and the infamous rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components. It was used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To simulate the 8-bit automaton efficiently, the state of each cell and its neighbours is interpreted as a 3-bit number that is then looked up in two arrays: one holding the rules for whether a cell is created, and one for whether the cell should die. The rules of the automaton can be changed using the shield.&lt;br /&gt;
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation over MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) was used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to the tones. The speed, i.e. the length of the interval, is controlled by the shield.&lt;br /&gt;
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB the following parts are necessary:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 6 mm push button&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
{{#ev:youtube|7doCJ39owTw|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading:==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124437</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124437"/>
		<updated>2021-05-20T17:06:22Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Build */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semesterprojekt=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project about generating sound using cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB for an Arduino Micro shield which controls the simulation. The state of the simulation is then sent as MIDI over the USB port of the Arduino, so the board can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the available layers. The custom PCB controls the speed and the rules used in the one-dimensional automaton. To visualize the automaton, an 8x8 LED matrix is used.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs used are two potentiometers and a button. For the LED matrix, a preassembled 8x8 matrix with a MAX7219 controller is mounted from the back so that it shines through the PCB. Thus five holes have to be added for the five pins of the MAX7219 module.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro-look of old hand-held game-consoles in combination with a texture that was generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old GameBoy and began to draw outlines for the edge-cut and silkscreen layers. After importing the general shape, I laid out all the mounting holes and pins and began to wire everything on the back side of the PCB. I left a window in the mask layer where the LED matrix is going to be mounted, and surrounded every component with a symbolic outline on the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was used with random input and the infamous rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components. It was used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To simulate the 8-bit automaton efficiently, the state of each cell and its neighbours is interpreted as a 3-bit number that is then looked up in two arrays: one holding the rules for whether a cell is created, and one for whether the cell should die. The rules of the automaton can be changed using the shield.&lt;br /&gt;
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation over MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) was used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to the tones. The speed, i.e. the length of the interval, is controlled by the shield.&lt;br /&gt;
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB the following parts are necessary:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 6 mm push button&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
(GIFs, click to play)&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
{{#ev:youtube|hLPOWT7qm5U|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading:==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124436</id>
		<title>GMU:PCB Arts/L.-E. Kühr/GOL-Midi</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:PCB_Arts/L.-E._K%C3%BChr/GOL-Midi&amp;diff=124436"/>
		<updated>2021-05-20T17:04:19Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* GOL-Midi Semesterprojekt */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=GOL-Midi Semesterprojekt=&lt;br /&gt;
==Concept==&lt;br /&gt;
In this project I wanted to revisit my first-semester project about generating sound using cellular automata. Instead of synthesizing the sound directly from the state of the cellular automaton, I wanted to take advantage of the MIDI capabilities of the ATmega32U4 microcontroller and design a custom PCB for an Arduino Micro shield which controls the simulation. The state of the simulation is then sent as MIDI over the USB port of the Arduino, so the board can be used as a MIDI controller. For the design of the PCB I wanted to take a procedural approach as well, keeping it simple while taking advantage of all the available layers. The custom PCB controls the speed and the rules used in the one-dimensional automaton. To visualize the automaton, an 8x8 LED matrix is used.&lt;br /&gt;
&lt;br /&gt;
[[File:all_rules.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Schematic==&lt;br /&gt;
To keep the controls simple, the only inputs used are two potentiometers and a button. For the LED matrix, a preassembled 8x8 matrix with a MAX7219 controller is mounted from the back so that it shines through the PCB. Thus five holes have to be added for the five pins of the MAX7219 module.&lt;br /&gt;
[[File:Schematic.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
==PCB==&lt;br /&gt;
For the general PCB shape I wanted to use the retro-look of old hand-held game-consoles in combination with a texture that was generated by cellular automata.&lt;br /&gt;
After a rough layout in PCBnew, I went to Inkscape, traced an old GameBoy and began to draw outlines for the edge-cut and silkscreen layers. After importing the general shape, I laid out all the mounting holes and pins and began to wire everything on the back side of the PCB. I left a window in the mask layer where the LED matrix is going to be mounted, and surrounded every component with a symbolic outline on the silkscreen layer.&lt;br /&gt;
[[File:pcb_new.PNG|800px]]&lt;br /&gt;
&lt;br /&gt;
===Generation===&lt;br /&gt;
To fill the general outline with texture, a one-dimensional cellular automaton was used with random input and the infamous rule 30. I upscaled the resulting texture and imported it into Inkscape. The texture was then masked to exclude the immediate surroundings of the components. It was used once for the silkscreen (just around the screen) and once for the front copper on the rest of the PCB.&lt;br /&gt;
[[File:rule30.png|600px]]&lt;br /&gt;
&lt;br /&gt;
==Final Design==&lt;br /&gt;
[[File:3d-back.png|800px]]&lt;br /&gt;
[[File:3d-front.png|800px]]&lt;br /&gt;
&lt;br /&gt;
==Programming==&lt;br /&gt;
===Simulation===&lt;br /&gt;
To simulate the 8-bit automaton efficiently, the state of each cell and its neighbours is interpreted as a 3-bit number that is then looked up in two arrays: one holding the rules for whether a cell is created, and one for whether the cell should die. The rules of the automaton can be changed using the shield.&lt;br /&gt;
&lt;br /&gt;
===MIDI===&lt;br /&gt;
To send the state of the simulation over MIDI, the MIDIUSB library (https://github.com/arduino-libraries/MIDIUSB) was used. The state of the simulation is mapped to chords of a selected scale that are hardcoded in memory and can be sent all at once or as an arpeggio. For this mapping, the state of the automaton is interpreted as an 8-bit number that is then mapped to the tones. The speed, i.e. the length of the interval, is controlled by the shield.&lt;br /&gt;
&lt;br /&gt;
==Parts==&lt;br /&gt;
Besides the PCB the following parts are necessary:&lt;br /&gt;
* 2 potentiometers&lt;br /&gt;
* 6 mm push button&lt;br /&gt;
* MAX7219 8x8 LED matrix module&lt;br /&gt;
* Arduino Micro&lt;br /&gt;
&lt;br /&gt;
==Build==&lt;br /&gt;
[[File:ezgif-6-ba2d047897d0.gif|360px]]&lt;br /&gt;
[[File:ezgif-6-46dbab99ab2a.gif|360px]]&lt;br /&gt;
&lt;br /&gt;
==Demonstration==&lt;br /&gt;
{{#ev:youtube|hLPOWT7qm5U|735}}&lt;br /&gt;
&lt;br /&gt;
==Further Reading:==&lt;br /&gt;
===Elementary Cellular Automaton===&lt;br /&gt;
* http://mathworld.wolfram.com/ElementaryCellularAutomaton.html&lt;br /&gt;
* http://atlas.wolfram.com/01/01/rulelist.html&lt;br /&gt;
* https://plato.stanford.edu/entries/cellular-automata/supplement.html&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Ezgif-6-46dbab99ab2a.gif&amp;diff=124435</id>
		<title>File:Ezgif-6-46dbab99ab2a.gif</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Ezgif-6-46dbab99ab2a.gif&amp;diff=124435"/>
		<updated>2021-05-20T17:03:45Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: File uploaded with MsUpload&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Ezgif-6-ba2d047897d0.gif&amp;diff=124434</id>
		<title>File:Ezgif-6-ba2d047897d0.gif</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:Ezgif-6-ba2d047897d0.gif&amp;diff=124434"/>
		<updated>2021-05-20T17:03:44Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: File uploaded with MsUpload&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124169</id>
		<title>GMU:Critical VR Lab II - Lab Work/L.-E. Kuehr</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124169"/>
		<updated>2021-05-13T05:39:39Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Video */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:ImageCompute-header.png|700px]]&lt;br /&gt;
=color%20=&lt;br /&gt;
==Context==&lt;br /&gt;
In this project I wanted to revisit the idea of treating digital images as spatial pseudo-random distributions of color entities (pixels in 2D, voxels in 3D).&lt;br /&gt;
As digital color is mostly stored as a triplet of color channels (red, green and blue), it can easily be mapped into a three-dimensional space. We can further take advantage of the perception of true three-dimensionality offered by a VR headset. Having the image represented as a cloud of voxels in a digital environment, we can then use the VR controllers to manipulate the image in this three-dimensional domain with our hands, pushing and pulling parts of the image cloud and remapping it back onto the two-dimensional canvas. This technique might provide a more intuitive approach to manipulating images and their colors.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Concept==&lt;br /&gt;
Digital images can be treated as pseudo-random distributions of color. If these are interpreted as three-dimensional point clouds, it is possible to move beyond the planar image: adding depth creates new ways to interpret the image by exploring its color distribution in its purest form. Once different color metrics are used to calculate the three-dimensional position of every point, complex structures and patterns emerge: colors shift, strands form and clusters of color move from position to position.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Approach==&lt;br /&gt;
For every pixel in the image a colored voxel is spawned. The position of each voxel is then determined by one of several mapping functions that map the pixel (x, y, r, g, b) to a position in space (x, y, z). In the default mapping every voxel lies on a plane (z = 0) and its position in the plane is directly determined by its position in the original two-dimensional image.&lt;br /&gt;
[[File:ImageCloud-Default-Mapping.png|Default mapping|360px]]&lt;br /&gt;
[[File:ImageCloud-Simple2.gif|RGB Space|360px]]&lt;br /&gt;
&lt;br /&gt;
Since the virtual space is three-dimensional, the third dimension or depth (the z-axis in this case) can be used to convey additional information, revealing a new perspective on the image. The red channel could, for example, be mapped to the z-coordinate of every voxel. But we aren&#039;t limited to the z-axis alone: since every digital image represents a subset of the RGB space, we can also map every pixel to its position in the RGB space (represented as a cube).&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
File:ImageCloud-Approach1.png|Original Image&lt;br /&gt;
File:ImageCloud-Approach2.png|Red Color Channel Depth&lt;br /&gt;
File:ImageCloud-Approach3.png|RGB Color Space&lt;br /&gt;
File:ImageCloud-Approach4.png|HSV Color Space&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Mapping===&lt;br /&gt;
Since there are further three-dimensional representations of digital color, such as HSV (or HSL) and YCbCr, each with its own geometric interpretation (HSV a cylinder, YCbCr a cube), and since every single channel can be mapped to the z-axis, the following mappings were implemented:&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
|-&lt;br /&gt;
! Name&lt;br /&gt;
! X-Axis&lt;br /&gt;
! Y-Axis&lt;br /&gt;
! Z-Axis&lt;br /&gt;
|-&lt;br /&gt;
| Red Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Red&lt;br /&gt;
|-&lt;br /&gt;
| Green Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Green&lt;br /&gt;
|-&lt;br /&gt;
| Blue Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| RGB Space&lt;br /&gt;
| Red&lt;br /&gt;
| Green&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| Hue Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Hue&lt;br /&gt;
|-&lt;br /&gt;
| Saturation Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Saturation&lt;br /&gt;
|-&lt;br /&gt;
| Value Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| HSV Space (Cylinder)&lt;br /&gt;
| Hue&lt;br /&gt;
| Saturation&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| Y Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
|-&lt;br /&gt;
| Cb Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cb&lt;br /&gt;
|-&lt;br /&gt;
| Cr Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cr&lt;br /&gt;
|-&lt;br /&gt;
| YCbCr Space&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
| Cb&lt;br /&gt;
| Cr&lt;br /&gt;
|}&lt;br /&gt;
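The table above can be read as a family of mapping functions from a pixel (x, y, r, g, b) to a position (x, y, z). A minimal Python sketch of a few of them follows; the function names and the assumption of unit-range channel values are my own, and the project itself runs these mappings in a Unity compute shader rather than in Python:

```python
import colorsys
import math

def map_default(x, y, r, g, b):
    # Default mapping: keep the image position, flatten onto the z = 0 plane.
    return (x, y, 0.0)

def map_red_depth(x, y, r, g, b):
    # Red Color Channel Depth: image position in x/y, red channel as depth.
    return (x, y, r)

def map_rgb_space(x, y, r, g, b):
    # RGB Space: the pixel's color becomes its position in the RGB cube.
    return (r, g, b)

def map_hsv_cylinder(x, y, r, g, b):
    # HSV Space (Cylinder): hue as angle, saturation as radius, value as height.
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    angle = 2.0 * math.pi * h
    return (s * math.cos(angle), s * math.sin(angle), v)
```

The "Original / Original / Channel" rows of the table all follow the `map_red_depth` pattern with a different channel in the third slot.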
&lt;br /&gt;
===Implementation===&lt;br /&gt;
Depending on the resolution of the original image, the number of voxels that have to be rendered and updated can easily run into the millions. Since the resolution of VR headsets is still limited to around 2K, it is sufficient to rescale the input image to about 1024*1024 = 1048576 pixels. This is still too many to render and update every voxel as an independent object in real time. A solution is to use instancing on the GPU, passing only the color and position of the instances as a buffer. When using direct instanced drawing in Unity, calculating the bounds of the object correctly is important to support native features like view-frustum culling.&lt;br /&gt;
To calculate the position and color buffers in real time, a compute shader is used, which can easily run millions of independent calculations in parallel per frame. Compute-shader inputs can be changed at runtime, enabling transitions between different resolutions, mappings and positions.&lt;br /&gt;
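The compute-shader pass is, in essence, one independent calculation per pixel. The following CPU-side Python analogue of filling the position and color buffers is only illustrative (the helper names are mine; the real project writes to GPU buffers from a Unity compute shader):

```python
def fill_buffers(pixels, width, mapping):
    # pixels: flat list of (r, g, b) tuples, row-major, length = width * height.
    # mapping: a function (x, y, r, g, b) -> (x, y, z), e.g. one row of the table.
    positions = []
    colors = []
    for i, (r, g, b) in enumerate(pixels):
        x, y = i % width, i // width
        positions.append(mapping(x, y, r, g, b))
        colors.append((r, g, b))
    return positions, colors

# Each iteration depends only on its own pixel, which is exactly why the same
# computation parallelizes to one GPU thread per voxel in a compute shader.
```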
&lt;br /&gt;
===Transitions===&lt;br /&gt;
Because the compute shader can run every frame, transitions between different states can be calculated. In this case a simple particle-based system was used, where every particle (voxel) has a position, velocity and acceleration.&lt;br /&gt;
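Such a transition can be sketched as a per-particle integration step that accelerates each voxel toward its target position, with damping so it settles; the spring and damping constants below are invented for illustration and are not taken from the project:

```python
def step_particle(pos, vel, target, dt=0.016, stiffness=40.0, damping=8.0):
    # Spring-damper integration: accelerate toward the target position and
    # damp the velocity so the particle settles instead of oscillating forever.
    acc = tuple(stiffness * (t - p) - damping * v
                for p, v, t in zip(pos, vel, target))
    vel = tuple(v + a * dt for v, a in zip(vel, acc))
    pos = tuple(p + v * dt for p, v in zip(pos, vel))
    return pos, vel
```

Running this step every frame, each voxel drifts from its current mapping to the next one, which is what produces the visible transition.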
&lt;br /&gt;
[[File:ImageCloud-Transition1.mp4|Mapping Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition2.mp4|Image Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition3.mp4|Resolution Transition|245px]]&lt;br /&gt;
&lt;br /&gt;
Examples of transitions between mappings (left), images (middle) and resolutions (right).&lt;br /&gt;
{{#ev:youtube|cA7D-l4XI84|735}}&lt;br /&gt;
&lt;br /&gt;
===Manipulation===&lt;br /&gt;
Since VR headsets were rare this semester, I tried to use the [https://www.ultraleap.com/product/leap-motion-controller/ Leap Motion controller] to capture the movement of the hands and manipulate the image with it.&lt;br /&gt;
While I was able to set up and calibrate the controller in Unity, the captured motion was very inconsistent and jittery, and the range of hand movement was limited. Since the camera had to be static, the added value of the three-dimensionality was limited; using a VR headset in combination with the Leap controller might solve this problem.&lt;br /&gt;
[[File:ImageCloud-Manipulation1.mp4|Hand Manipulation|735px]]&lt;br /&gt;
&lt;br /&gt;
Another problem with manipulating the voxels is that their particle representations are not part of the physics world and cannot collide with other objects. Every collision of every particle with every object has to be calculated in the compute shader. While I was able to implement a simple interaction with the center of one hand, I failed to implement collision against simple geometric forms in the compute shader that was efficient enough to run every frame. It is possible, but GPU collision is a difficult task that takes a lot more work than expected.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
===Video===&lt;br /&gt;
The uniforms of the compute shader can be updated at runtime. Using the Unity Video Player, the input image can be the current frame of a video. If the video is updated every frame, the particles get a new target position every frame, creating a moving cloud of particles that forms ever-changing patterns of movement.&lt;br /&gt;
{{#ev:youtube|ZSVnI2nV1V4|735}}&lt;br /&gt;
&lt;br /&gt;
==Media==&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_56_28.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_57_26.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_58_03.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_59_21.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_23 (1).png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_44.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_01_11.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 12.05.2021 14_03_36.png|360px]]&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124168</id>
		<title>GMU:Critical VR Lab II - Lab Work/L.-E. Kuehr</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124168"/>
		<updated>2021-05-13T05:24:16Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Concept */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:ImageCompute-header.png|700px]]&lt;br /&gt;
=color%20=&lt;br /&gt;
==Context==&lt;br /&gt;
In this project I wanted to revisit the idea of treating digital images as spatial pseudo-random distributions of color entities (pixels in 2D, voxels in 3D).&lt;br /&gt;
As digital color is mostly stored as a triplet of color channels (red, green and blue), it can easily be mapped into three-dimensional space. We can further take advantage of the perception of true three-dimensionality offered by a VR headset. With the image represented as a cloud of voxels in a digital environment, we can use the VR controllers to manipulate the image in this three-dimensional domain with our hands, pushing and pulling parts of the image cloud and remapping the cloud back onto the two-dimensional canvas. This technique might provide a more intuitive approach to manipulating images and their colors.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Concept==&lt;br /&gt;
Digital images can be treated as pseudo-random distributions of color. If these are interpreted as three-dimensional point clouds, it is possible to move beyond the planar image: adding depth creates new ways to interpret the image by exploring its color distribution in its purest form. Once different color metrics are used to calculate the three-dimensional position of every point, complex structures and patterns emerge: colors shift, strands form and clusters of color move from position to position.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Approach==&lt;br /&gt;
For every pixel in the image a colored voxel is spawned. The position of each voxel is then determined by one of several mapping functions that map the pixel (x, y, r, g, b) to a position in space (x, y, z). In the default mapping every voxel lies on a plane (z = 0) and its position in the plane is directly determined by its position in the original two-dimensional image.&lt;br /&gt;
[[File:ImageCloud-Default-Mapping.png|Default mapping|360px]]&lt;br /&gt;
[[File:ImageCloud-Simple2.gif|RGB Space|360px]]&lt;br /&gt;
&lt;br /&gt;
Since the virtual space is three-dimensional, the third dimension or depth (the z-axis in this case) can be used to convey additional information, revealing a new perspective on the image. The red channel could, for example, be mapped to the z-coordinate of every voxel. But we aren&#039;t limited to the z-axis alone: since every digital image represents a subset of the RGB space, we can also map every pixel to its position in the RGB space (represented as a cube).&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
File:ImageCloud-Approach1.png|Original Image&lt;br /&gt;
File:ImageCloud-Approach2.png|Red Color Channel Depth&lt;br /&gt;
File:ImageCloud-Approach3.png|RGB Color Space&lt;br /&gt;
File:ImageCloud-Approach4.png|HSV Color Space&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Mapping===&lt;br /&gt;
Since there are further three-dimensional representations of digital color, such as HSV (or HSL) and YCbCr, each with its own geometric interpretation (HSV a cylinder, YCbCr a cube), and since every single channel can be mapped to the z-axis, the following mappings were implemented:&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
|-&lt;br /&gt;
! Name&lt;br /&gt;
! X-Axis&lt;br /&gt;
! Y-Axis&lt;br /&gt;
! Z-Axis&lt;br /&gt;
|-&lt;br /&gt;
| Red Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Red&lt;br /&gt;
|-&lt;br /&gt;
| Green Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Green&lt;br /&gt;
|-&lt;br /&gt;
| Blue Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| RGB Space&lt;br /&gt;
| Red&lt;br /&gt;
| Green&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| Hue Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Hue&lt;br /&gt;
|-&lt;br /&gt;
| Saturation Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Saturation&lt;br /&gt;
|-&lt;br /&gt;
| Value Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| HSV Space (Cylinder)&lt;br /&gt;
| Hue&lt;br /&gt;
| Saturation&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| Y Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
|-&lt;br /&gt;
| Cb Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cb&lt;br /&gt;
|-&lt;br /&gt;
| Cr Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cr&lt;br /&gt;
|-&lt;br /&gt;
| YCbCr Space&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
| Cb&lt;br /&gt;
| Cr&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
===Implementation===&lt;br /&gt;
Depending on the resolution of the original image, the number of voxels that have to be rendered and updated can easily run into the millions. Since the resolution of VR headsets is still limited to around 2K, it is sufficient to rescale the input image to about 1024*1024 = 1048576 pixels. This is still too many to render and update every voxel as an independent object in real time. A solution is to use instancing on the GPU, passing only the color and position of the instances as a buffer. When using direct instanced drawing in Unity, calculating the bounds of the object correctly is important to support native features like view-frustum culling.&lt;br /&gt;
To calculate the position and color buffers in real time, a compute shader is used, which can easily run millions of independent calculations in parallel per frame. Compute-shader inputs can be changed at runtime, enabling transitions between different resolutions, mappings and positions.&lt;br /&gt;
&lt;br /&gt;
===Transitions===&lt;br /&gt;
Because the compute shader can run every frame, transitions between different states can be calculated. In this case a simple particle-based system was used, where every particle (voxel) has a position, velocity and acceleration.&lt;br /&gt;
&lt;br /&gt;
[[File:ImageCloud-Transition1.mp4|Mapping Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition2.mp4|Image Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition3.mp4|Resolution Transition|245px]]&lt;br /&gt;
&lt;br /&gt;
Examples of transitions between mappings (left), images (middle) and resolutions (right).&lt;br /&gt;
{{#ev:youtube|cA7D-l4XI84|735}}&lt;br /&gt;
&lt;br /&gt;
===Manipulation===&lt;br /&gt;
Since VR headsets were rare this semester, I tried to use the [https://www.ultraleap.com/product/leap-motion-controller/ Leap Motion controller] to capture the movement of the hands and manipulate the image with it.&lt;br /&gt;
While I was able to set up and calibrate the controller in Unity, the captured motion was very inconsistent and jittery, and the range of hand movement was limited. Since the camera had to be static, the added value of the three-dimensionality was limited; using a VR headset in combination with the Leap controller might solve this problem.&lt;br /&gt;
[[File:ImageCloud-Manipulation1.mp4|Hand Manipulation|735px]]&lt;br /&gt;
&lt;br /&gt;
Another problem with manipulating the voxels is that their particle representations are not part of the physics world and cannot collide with other objects. Every collision of every particle with every object has to be calculated in the compute shader. While I was able to implement a simple interaction with the center of one hand, I failed to implement collision against simple geometric forms in the compute shader that was efficient enough to run every frame. It is possible, but GPU collision is a difficult task that takes a lot more work than expected.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
===Video===&lt;br /&gt;
The uniforms of the compute shader can be updated at runtime. Using the Unity Video Player, the input image can be the current frame of a video. If the video is updated every frame, the particles get a new target position every frame, creating a moving cloud of particles that forms ever-changing patterns of movement.&lt;br /&gt;
{{#ev:youtube|ztNLUFrtVVQ|360}}&lt;br /&gt;
{{#ev:youtube|4QoyVeXbwGY|360}}&lt;br /&gt;
{{#ev:youtube|kzz4ly4o89M|360}}&lt;br /&gt;
{{#ev:youtube|9kwDKTb-t1w|360}}&lt;br /&gt;
&lt;br /&gt;
==Media==&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_56_28.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_57_26.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_58_03.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_59_21.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_23 (1).png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_44.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_01_11.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 12.05.2021 14_03_36.png|360px]]&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124167</id>
		<title>GMU:Critical VR Lab II - Lab Work/L.-E. Kuehr</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124167"/>
		<updated>2021-05-13T04:23:19Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Video */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:ImageCompute-header.png|700px]]&lt;br /&gt;
=color%20=&lt;br /&gt;
==Context==&lt;br /&gt;
In this project I wanted to revisit the idea of treating digital images as spatial pseudo-random distributions of color entities (pixels in 2D, voxels in 3D).&lt;br /&gt;
As digital color is mostly stored as a triplet of color channels (red, green and blue), it can easily be mapped into three-dimensional space. We can further take advantage of the perception of true three-dimensionality offered by a VR headset. With the image represented as a cloud of voxels in a digital environment, we can use the VR controllers to manipulate the image in this three-dimensional domain with our hands, pushing and pulling parts of the image cloud and remapping the cloud back onto the two-dimensional canvas. This technique might provide a more intuitive approach to manipulating images and their colors.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Concept==&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Approach==&lt;br /&gt;
For every pixel in the image a colored voxel is spawned. The position of each voxel is then determined by one of several mapping functions that map the pixel (x, y, r, g, b) to a position in space (x, y, z). In the default mapping every voxel lies on a plane (z = 0) and its position in the plane is directly determined by its position in the original two-dimensional image.&lt;br /&gt;
[[File:ImageCloud-Default-Mapping.png|Default mapping|360px]]&lt;br /&gt;
[[File:ImageCloud-Simple2.gif|RGB Space|360px]]&lt;br /&gt;
&lt;br /&gt;
Since the virtual space is three-dimensional, the third dimension or depth (the z-axis in this case) can be used to convey additional information, revealing a new perspective on the image. The red channel could, for example, be mapped to the z-coordinate of every voxel. But we aren&#039;t limited to the z-axis alone: since every digital image represents a subset of the RGB space, we can also map every pixel to its position in the RGB space (represented as a cube).&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
File:ImageCloud-Approach1.png|Original Image&lt;br /&gt;
File:ImageCloud-Approach2.png|Red Color Channel Depth&lt;br /&gt;
File:ImageCloud-Approach3.png|RGB Color Space&lt;br /&gt;
File:ImageCloud-Approach4.png|HSV Color Space&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Mapping===&lt;br /&gt;
Since there are further three-dimensional representations of digital color, such as HSV (or HSL) and YCbCr, each with its own geometric interpretation (HSV a cylinder, YCbCr a cube), and since every single channel can be mapped to the z-axis, the following mappings were implemented:&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
|-&lt;br /&gt;
! Name&lt;br /&gt;
! X-Axis&lt;br /&gt;
! Y-Axis&lt;br /&gt;
! Z-Axis&lt;br /&gt;
|-&lt;br /&gt;
| Red Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Red&lt;br /&gt;
|-&lt;br /&gt;
| Green Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Green&lt;br /&gt;
|-&lt;br /&gt;
| Blue Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| RGB Space&lt;br /&gt;
| Red&lt;br /&gt;
| Green&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| Hue Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Hue&lt;br /&gt;
|-&lt;br /&gt;
| Saturation Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Saturation&lt;br /&gt;
|-&lt;br /&gt;
| Value Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| HSV Space (Cylinder)&lt;br /&gt;
| Hue&lt;br /&gt;
| Saturation&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| Y Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
|-&lt;br /&gt;
| Cb Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cb&lt;br /&gt;
|-&lt;br /&gt;
| Cr Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cr&lt;br /&gt;
|-&lt;br /&gt;
| YCbCr Space&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
| Cb&lt;br /&gt;
| Cr&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
===Implementation===&lt;br /&gt;
Depending on the resolution of the original image, the number of voxels that have to be rendered and updated can easily run into the millions. Since the resolution of VR headsets is still limited to around 2K, it is sufficient to rescale the input image to about 1024*1024 = 1048576 pixels. This is still too many to render and update every voxel as an independent object in real time. A solution is to use instancing on the GPU, passing only the color and position of the instances as a buffer. When using direct instanced drawing in Unity, calculating the bounds of the object correctly is important to support native features like view-frustum culling.&lt;br /&gt;
To calculate the position and color buffers in real time, a compute shader is used, which can easily run millions of independent calculations in parallel per frame. Compute-shader inputs can be changed at runtime, enabling transitions between different resolutions, mappings and positions.&lt;br /&gt;
&lt;br /&gt;
===Transitions===&lt;br /&gt;
Because the compute shader can run every frame, transitions between different states can be calculated. In this case a simple particle-based system was used, where every particle (voxel) has a position, velocity and acceleration.&lt;br /&gt;
&lt;br /&gt;
[[File:ImageCloud-Transition1.mp4|Mapping Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition2.mp4|Image Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition3.mp4|Resolution Transition|245px]]&lt;br /&gt;
&lt;br /&gt;
Examples of transitions between mappings (left), images (middle) and resolutions (right).&lt;br /&gt;
{{#ev:youtube|cA7D-l4XI84|735}}&lt;br /&gt;
&lt;br /&gt;
===Manipulation===&lt;br /&gt;
Since VR headsets were rare this semester, I tried to use the [https://www.ultraleap.com/product/leap-motion-controller/ Leap Motion controller] to capture the movement of the hands and manipulate the image with it.&lt;br /&gt;
While I was able to set up and calibrate the controller in Unity, the captured motion was very inconsistent and jittery, and the range of hand movement was limited. Since the camera had to be static, the added value of the three-dimensionality was limited; using a VR headset in combination with the Leap controller might solve this problem.&lt;br /&gt;
[[File:ImageCloud-Manipulation1.mp4|Hand Manipulation|735px]]&lt;br /&gt;
&lt;br /&gt;
Another problem with manipulating the voxels is that their particle representations are not part of the physics world and cannot collide with other objects. Every collision of every particle with every object has to be calculated in the compute shader. While I was able to implement a simple interaction with the center of one hand, I failed to implement collision against simple geometric forms in the compute shader that was efficient enough to run every frame. It is possible, but GPU collision is a difficult task that takes a lot more work than expected.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
===Video===&lt;br /&gt;
The uniforms of the compute shader can be updated at runtime. Using the Unity Video Player, the input image can be the current frame of a video. If the video is updated every frame, the particles get a new target position every frame, creating a moving cloud of particles that forms ever-changing patterns of movement.&lt;br /&gt;
{{#ev:youtube|ztNLUFrtVVQ|360}}&lt;br /&gt;
{{#ev:youtube|4QoyVeXbwGY|360}}&lt;br /&gt;
{{#ev:youtube|kzz4ly4o89M|360}}&lt;br /&gt;
{{#ev:youtube|9kwDKTb-t1w|360}}&lt;br /&gt;
&lt;br /&gt;
==Media==&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_56_28.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_57_26.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_58_03.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_59_21.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_23 (1).png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_44.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_01_11.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 12.05.2021 14_03_36.png|360px]]&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124166</id>
		<title>GMU:Critical VR Lab II - Lab Work/L.-E. Kuehr</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124166"/>
		<updated>2021-05-13T04:12:42Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:ImageCompute-header.png|700px]]&lt;br /&gt;
=color%20=&lt;br /&gt;
==Context==&lt;br /&gt;
In this project I wanted to revisit the idea of treating digital images as spatial pseudo-random distributions of color entities (pixels in 2D, voxels in 3D).&lt;br /&gt;
As digital color is mostly stored as a triplet of color channels (red, green and blue), it can easily be mapped into three-dimensional space. We can further take advantage of the perception of true three-dimensionality offered by a VR headset. With the image represented as a cloud of voxels in a digital environment, we can use the VR controllers to manipulate the image in this three-dimensional domain with our hands, pushing and pulling parts of the image cloud and remapping the cloud back onto the two-dimensional canvas. This technique might provide a more intuitive approach to manipulating images and their colors.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Concept==&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Approach==&lt;br /&gt;
For every pixel in the image a colored voxel is spawned. The position of each voxel is then determined by one of several mapping functions that map the pixel (x, y, r, g, b) to a position in space (x, y, z). In the default mapping every voxel lies on a plane (z = 0) and its position in the plane is directly determined by its position in the original two-dimensional image.&lt;br /&gt;
[[File:ImageCloud-Default-Mapping.png|Default mapping|360px]]&lt;br /&gt;
[[File:ImageCloud-Simple2.gif|RGB Space|360px]]&lt;br /&gt;
&lt;br /&gt;
Since the virtual space is three-dimensional, the third dimension or depth (the z-axis in this case) can be used to convey additional information, revealing a new perspective on the image. The red channel could, for example, be mapped to the z-coordinate of every voxel. But we aren&#039;t limited to the z-axis alone: since every digital image represents a subset of the RGB space, we can also map every pixel to its position in the RGB space (represented as a cube).&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
File:ImageCloud-Approach1.png|Original Image&lt;br /&gt;
File:ImageCloud-Approach2.png|Red Color Channel Depth&lt;br /&gt;
File:ImageCloud-Approach3.png|RGB Color Space&lt;br /&gt;
File:ImageCloud-Approach4.png|HSV Color Space&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Mapping===&lt;br /&gt;
Since there are further three-dimensional representations of digital color, such as HSV (or HSL) and YCbCr, each with its own geometric interpretation (HSV a cylinder, YCbCr a cube), and since every single channel can be mapped to the z-axis, the following mappings were implemented:&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
|-&lt;br /&gt;
! Name&lt;br /&gt;
! X-Axis&lt;br /&gt;
! Y-Axis&lt;br /&gt;
! Z-Axis&lt;br /&gt;
|-&lt;br /&gt;
| Red Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Red&lt;br /&gt;
|-&lt;br /&gt;
| Green Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Green&lt;br /&gt;
|-&lt;br /&gt;
| Blue Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| RGB Space&lt;br /&gt;
| Red&lt;br /&gt;
| Green&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| Hue Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Hue&lt;br /&gt;
|-&lt;br /&gt;
| Saturation Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Saturation&lt;br /&gt;
|-&lt;br /&gt;
| Value Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| HSV Space (Cylinder)&lt;br /&gt;
| Hue&lt;br /&gt;
| Saturation&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| Y Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
|-&lt;br /&gt;
| Cb Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cb&lt;br /&gt;
|-&lt;br /&gt;
| Cr Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cr&lt;br /&gt;
|-&lt;br /&gt;
| YCbCr Space&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
| Cb&lt;br /&gt;
| Cr&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
===Implementation===&lt;br /&gt;
Depending on the resolution of the original image, the number of voxels that have to be rendered and updated can easily run into the millions. Since the resolution of VR headsets is still limited to around 2K, it is sufficient to rescale the input image to about 1024*1024 = 1048576 pixels. This is still too many to render and update every voxel as an independent object in real time. A solution is to use instancing on the GPU, passing only the color and position of the instances as a buffer. When using direct instanced drawing in Unity, calculating the bounds of the object correctly is important to support native features like view-frustum culling.&lt;br /&gt;
To calculate the position and color buffers in real time, a compute shader is used, which can easily run millions of independent calculations in parallel per frame. Compute-shader inputs can be changed at runtime, enabling transitions between different resolutions, mappings and positions.&lt;br /&gt;
&lt;br /&gt;
===Transitions===&lt;br /&gt;
Because the compute shader can run every frame, transitions between different states can be calculated. In this case a simple particle-based system was used, in which every particle (voxel) has a position, velocity and acceleration.&lt;br /&gt;
&lt;br /&gt;
[[File:ImageCloud-Transition1.mp4|Mapping Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition2.mp4|Image Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition3.mp4|Resolution Transition|245px]]&lt;br /&gt;
&lt;br /&gt;
Examples of transitions between mappings (left), images (middle) and resolutions (right).&lt;br /&gt;
{{#ev:youtube|cA7D-l4XI84|735}}&lt;br /&gt;
&lt;br /&gt;
===Manipulation===&lt;br /&gt;
Since VR headsets were rare this semester, I tried to use the [https://www.ultraleap.com/product/leap-motion-controller/ Leap Motion controller] to capture the movement of the hands and manipulate the image with it.&lt;br /&gt;
While I was able to set up and calibrate the controller in Unity, the captured motion was very inconsistent and jittery, and the range of hand movement was limited. Since the camera had to be static, the added value of the three-dimensionality was limited; using a VR headset in combination with the Leap controller might solve this problem.&lt;br /&gt;
[[File:ImageCloud-Manipulation1.mp4|Hand Manipulation|735px]]&lt;br /&gt;
&lt;br /&gt;
Another problem with manipulating the voxels is that the particle representations of the voxels are not part of the physics world and cannot collide with other objects. Every collision of every particle with every object has to be calculated in the compute shader. While I was able to implement a simple interaction with the center of one hand, I failed to implement a collision routine for simple geometric forms in the compute shader that was efficient enough to run every frame. It is possible, but GPU collision is a difficult task that takes a lot more work than expected.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
===Video===&lt;br /&gt;
The uniforms of the compute shader can be updated at runtime. Using the Unity Video Player, the input image can be the current frame of a video. If the video is updated every frame, the particles receive a new target position each frame, creating a moving cloud of particles that forms ever-emerging patterns of movement.&lt;br /&gt;
{{#ev:youtube|ztNLUFrtVVQ|735}}&lt;br /&gt;
{{#ev:youtube|4QoyVeXbwGY|735}}&lt;br /&gt;
{{#ev:youtube|kzz4ly4o89M|735}}&lt;br /&gt;
&lt;br /&gt;
==Media==&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_56_28.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_57_26.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_58_03.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_59_21.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_23 (1).png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_44.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_01_11.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 12.05.2021 14_03_36.png|360px]]&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124165</id>
		<title>GMU:Critical VR Lab II - Lab Work/L.-E. Kuehr</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124165"/>
		<updated>2021-05-13T04:12:10Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Video */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:ImageCompute-header.png|700px]]&lt;br /&gt;
=color%20=&lt;br /&gt;
==Context==&lt;br /&gt;
In this project I wanted to revisit the idea of treating digital images as spatial pseudo-random distributions of color entities (pixels in 2D, voxels in 3D).&lt;br /&gt;
As digital color is mostly stored as a triplet of color channels (red, green and blue), it can easily be mapped into a three-dimensional space. We can further take advantage of the perception of true three-dimensionality that a VR headset offers. With the image represented as a cloud of voxels in a digital environment, we can then use the VR controllers to manipulate the image in this three-dimensional domain with our hands, pushing and pulling parts of the image cloud and remapping the cloud onto the two-dimensional canvas. This technique might provide a more intuitive approach to manipulating images and their colors.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Concept==&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Approach==&lt;br /&gt;
For every pixel in the image a colored voxel is spawned. The position of the voxel is then determined by one of several mapping functions that map the pixel (x, y, r, g, b) to a position in space (x, y, z). In the default mapping every voxel lies on a plane (z = 0), and its position in the plane is directly determined by its position in the original two-dimensional image.&lt;br /&gt;
[[File:ImageCloud-Default-Mapping.png|Default mapping|360px]]&lt;br /&gt;
[[File:ImageCloud-Simple2.gif|RGB Space|360px]]&lt;br /&gt;
&lt;br /&gt;
Since the virtual space is three-dimensional, the third dimension, or depth (the z-axis in this case), can be used to convey additional information, thus revealing a new perspective on the image. The red channel could, for example, be mapped to the z-coordinate of every voxel. But we aren&#039;t limited to the z-axis: since every digital image represents a subset of the RGB space, we can also map every pixel to a position in the RGB space (represented as a cube).&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
File:ImageCloud-Approach1.png|Original Image&lt;br /&gt;
File:ImageCloud-Approach2.png|Red Color Channel Depth&lt;br /&gt;
File:ImageCloud-Approach3.png|RGB Color Space&lt;br /&gt;
File:ImageCloud-Approach4.png|HSV Color Space&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
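The mapping functions described above can be sketched in a few lines. The following Python function is a purely illustrative stand-in (its name, signature and the normalization of all inputs to [0, 1] are assumptions, not the project's actual shader code); the mapping names mirror the table below:

```python
import colorsys
import math

def map_pixel(x, y, r, g, b, mapping="default"):
    """Map one pixel (x, y, r, g, b) to a 3D position (x, y, z).

    Illustrative sketch; all inputs are assumed normalized to [0, 1].
    """
    if mapping == "default":      # flat plane: image layout, z = 0
        return (x, y, 0.0)
    if mapping == "red_depth":    # keep image layout, red channel as depth
        return (x, y, r)
    if mapping == "rgb_space":    # position = the color itself in the RGB cube
        return (r, g, b)
    if mapping == "hsv_space":    # cylinder: hue = angle, saturation = radius
        h, s, v = colorsys.rgb_to_hsv(r, g, b)
        angle = 2.0 * math.pi * h
        return (s * math.cos(angle), s * math.sin(angle), v)
    raise ValueError(mapping)
```

The channel-depth mappings for green, blue, HSV and YCbCr channels follow the same pattern with the respective channel in place of `r`.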
&lt;br /&gt;
===Mapping===&lt;br /&gt;
Since there are further three-dimensional representations of digital color, such as HSV (or HSL) and YCbCr, each with its own geometric interpretation (HSV a cylinder, YCbCr a cube), and every single channel can be mapped to the z-axis, the following mappings were implemented:&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
|-&lt;br /&gt;
! Name&lt;br /&gt;
! X-Axis&lt;br /&gt;
! Y-Axis&lt;br /&gt;
! Z-Axis&lt;br /&gt;
|-&lt;br /&gt;
| Red Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Red&lt;br /&gt;
|-&lt;br /&gt;
| Green Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Green&lt;br /&gt;
|-&lt;br /&gt;
| Blue Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| RGB Space&lt;br /&gt;
| Red&lt;br /&gt;
| Green&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| Hue Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Hue&lt;br /&gt;
|-&lt;br /&gt;
| Saturation Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Saturation&lt;br /&gt;
|-&lt;br /&gt;
| Value Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| HSV Space (Cylinder)&lt;br /&gt;
| Hue&lt;br /&gt;
| Saturation&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| Y Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
|-&lt;br /&gt;
| Cb Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cb&lt;br /&gt;
|-&lt;br /&gt;
| Cr Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cr&lt;br /&gt;
|-&lt;br /&gt;
| YCbCr Space&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
| Cb&lt;br /&gt;
| Cr&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
===Implementation===&lt;br /&gt;
Depending on the resolution of the original image, the number of voxels that have to be rendered and updated can easily run into the millions. Since the resolution of VR headsets is still limited to around 2K, it is sufficient to rescale the input image to about 1024*1024 = 1048576 pixels. This is still too many to render and update every voxel as an independent object in real time. A solution is instancing on the GPU, passing only the color and position of the instances as a buffer. When using direct instanced drawing in Unity, calculating the bounds of the object is important to support native features like view-frustum culling.&lt;br /&gt;
To calculate the position and color buffers in real time, a compute shader is used, which can easily run millions of independent calculations in parallel per frame. Compute-shader inputs can be changed at runtime, enabling transitions between different resolutions, mappings and positions.&lt;br /&gt;
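As a CPU-side analogy to that compute-shader pass, the whole position buffer can be computed in one vectorized call. This NumPy sketch (function and mapping names are assumptions, not from the project) shows the per-pixel independence that makes the GPU version trivial to parallelize:

```python
import numpy as np

def position_buffer(image, mapping="rgb_space"):
    """Compute one (N, 3) position per pixel for the instancing buffer.

    `image` is an (H, W, 3) float array in [0, 1]. Illustrative NumPy
    stand-in for the compute shader; every output row is independent,
    which is what lets a GPU update millions of voxels per frame.
    """
    h, w, _ = image.shape
    if mapping == "rgb_space":            # position = color in the RGB cube
        return image.reshape(-1, 3).copy()
    if mapping == "red_depth":            # image layout, red channel as depth
        ys, xs = np.mgrid[0:h, 0:w]
        return np.stack(
            [xs.ravel() / w, ys.ravel() / h, image[..., 0].ravel()], axis=1)
    raise ValueError(mapping)
```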
&lt;br /&gt;
===Transitions===&lt;br /&gt;
Because the compute shader can run every frame, transitions between different states can be calculated. In this case a simple particle-based system was used, in which every particle (voxel) has a position, velocity and acceleration.&lt;br /&gt;
&lt;br /&gt;
[[File:ImageCloud-Transition1.mp4|Mapping Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition2.mp4|Image Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition3.mp4|Resolution Transition|245px]]&lt;br /&gt;
&lt;br /&gt;
Examples of transitions between mappings (left), images (middle) and resolutions (right).&lt;br /&gt;
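The per-particle update behind these transitions amounts to a few lines of numerical integration. A minimal per-axis sketch, assuming a damped-spring acceleration toward the target position (the constants and names are illustrative, not the project's):

```python
def step(pos, vel, target, dt=1.0 / 90, stiffness=40.0, damping=8.0):
    """One Euler step of a particle pulled toward its target position.

    Each voxel keeps position and velocity; the acceleration acts as a
    damped spring toward the target, so changing the target (new mapping,
    new image, new resolution) makes the cloud flow into the new state.
    """
    new_pos, new_vel = [], []
    for p, v, t in zip(pos, vel, target):
        a = stiffness * (t - p) - damping * v   # acceleration toward target
        v = v + a * dt                          # integrate velocity
        p = p + v * dt                          # integrate position
        new_pos.append(p)
        new_vel.append(v)
    return new_pos, new_vel
```

Run once per frame for every particle, this converges smoothly on the new mapping instead of snapping to it.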
&lt;br /&gt;
===Manipulation===&lt;br /&gt;
Since VR headsets were rare this semester, I tried to use the [https://www.ultraleap.com/product/leap-motion-controller/ Leap Motion controller] to capture the movement of the hands and manipulate the image with it.&lt;br /&gt;
While I was able to set up and calibrate the controller in Unity, the captured motion was very inconsistent and jittery, and the range of hand movement was limited. Since the camera had to be static, the added value of the three-dimensionality was limited; using a VR headset in combination with the Leap controller might solve this problem.&lt;br /&gt;
[[File:ImageCloud-Manipulation1.mp4|Hand Manipulation|735px]]&lt;br /&gt;
&lt;br /&gt;
Another problem with manipulating the voxels is that the particle representations of the voxels are not part of the physics world and cannot collide with other objects. Every collision of every particle with every object has to be calculated in the compute shader. While I was able to implement a simple interaction with the center of one hand, I failed to implement a collision routine for simple geometric forms in the compute shader that was efficient enough to run every frame. It is possible, but GPU collision is a difficult task that takes a lot more work than expected.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
===Video===&lt;br /&gt;
The uniforms of the compute shader can be updated at runtime. Using the Unity Video Player, the input image can be the current frame of a video. If the video is updated every frame, the particles receive a new target position each frame, creating a moving cloud of particles that forms ever-emerging patterns of movement.&lt;br /&gt;
{{#ev:youtube|ztNLUFrtVVQ|735}}&lt;br /&gt;
{{#ev:youtube|4QoyVeXbwGY|735}}&lt;br /&gt;
{{#ev:youtube|kzz4ly4o89M|735}}&lt;br /&gt;
&lt;br /&gt;
==Media==&lt;br /&gt;
{{#ev:youtube|cA7D-l4XI84|735}}&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_56_28.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_57_26.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_58_03.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_59_21.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_23 (1).png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_44.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_01_11.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 12.05.2021 14_03_36.png|360px]]&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124164</id>
		<title>GMU:Critical VR Lab II - Lab Work/L.-E. Kuehr</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124164"/>
		<updated>2021-05-13T04:11:09Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Approach */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:ImageCompute-header.png|700px]]&lt;br /&gt;
=color%20=&lt;br /&gt;
==Context==&lt;br /&gt;
In this project I wanted to revisit the idea of treating digital images as spatial pseudo-random distributions of color entities (pixels in 2D, voxels in 3D).&lt;br /&gt;
As digital color is mostly stored as a triplet of color channels (red, green and blue), it can easily be mapped into a three-dimensional space. We can further take advantage of the perception of true three-dimensionality that a VR headset offers. With the image represented as a cloud of voxels in a digital environment, we can then use the VR controllers to manipulate the image in this three-dimensional domain with our hands, pushing and pulling parts of the image cloud and remapping the cloud onto the two-dimensional canvas. This technique might provide a more intuitive approach to manipulating images and their colors.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Concept==&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Approach==&lt;br /&gt;
For every pixel in the image a colored voxel is spawned. The position of the voxel is then determined by one of several mapping functions that map the pixel (x, y, r, g, b) to a position in space (x, y, z). In the default mapping every voxel lies on a plane (z = 0), and its position in the plane is directly determined by its position in the original two-dimensional image.&lt;br /&gt;
[[File:ImageCloud-Default-Mapping.png|Default mapping|360px]]&lt;br /&gt;
[[File:ImageCloud-Simple2.gif|RGB Space|360px]]&lt;br /&gt;
&lt;br /&gt;
Since the virtual space is three-dimensional, the third dimension, or depth (the z-axis in this case), can be used to convey additional information, thus revealing a new perspective on the image. The red channel could, for example, be mapped to the z-coordinate of every voxel. But we aren&#039;t limited to the z-axis: since every digital image represents a subset of the RGB space, we can also map every pixel to a position in the RGB space (represented as a cube).&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
File:ImageCloud-Approach1.png|Original Image&lt;br /&gt;
File:ImageCloud-Approach2.png|Red Color Channel Depth&lt;br /&gt;
File:ImageCloud-Approach3.png|RGB Color Space&lt;br /&gt;
File:ImageCloud-Approach4.png|HSV Color Space&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Mapping===&lt;br /&gt;
Since there are further three-dimensional representations of digital color, such as HSV (or HSL) and YCbCr, each with its own geometric interpretation (HSV a cylinder, YCbCr a cube), and every single channel can be mapped to the z-axis, the following mappings were implemented:&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
|-&lt;br /&gt;
! Name&lt;br /&gt;
! X-Axis&lt;br /&gt;
! Y-Axis&lt;br /&gt;
! Z-Axis&lt;br /&gt;
|-&lt;br /&gt;
| Red Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Red&lt;br /&gt;
|-&lt;br /&gt;
| Green Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Green&lt;br /&gt;
|-&lt;br /&gt;
| Blue Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| RGB Space&lt;br /&gt;
| Red&lt;br /&gt;
| Green&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| Hue Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Hue&lt;br /&gt;
|-&lt;br /&gt;
| Saturation Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Saturation&lt;br /&gt;
|-&lt;br /&gt;
| Value Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| HSV Space (Cylinder)&lt;br /&gt;
| Hue&lt;br /&gt;
| Saturation&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| Y Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
|-&lt;br /&gt;
| Cb Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cb&lt;br /&gt;
|-&lt;br /&gt;
| Cr Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cr&lt;br /&gt;
|-&lt;br /&gt;
| YCbCr Space&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
| Cb&lt;br /&gt;
| Cr&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
===Implementation===&lt;br /&gt;
Depending on the resolution of the original image, the number of voxels that have to be rendered and updated can easily run into the millions. Since the resolution of VR headsets is still limited to around 2K, it is sufficient to rescale the input image to about 1024*1024 = 1048576 pixels. This is still too many to render and update every voxel as an independent object in real time. A solution is instancing on the GPU, passing only the color and position of the instances as a buffer. When using direct instanced drawing in Unity, calculating the bounds of the object is important to support native features like view-frustum culling.&lt;br /&gt;
To calculate the position and color buffers in real time, a compute shader is used, which can easily run millions of independent calculations in parallel per frame. Compute-shader inputs can be changed at runtime, enabling transitions between different resolutions, mappings and positions.&lt;br /&gt;
&lt;br /&gt;
===Transitions===&lt;br /&gt;
Because the compute shader can run every frame, transitions between different states can be calculated. In this case a simple particle-based system was used, in which every particle (voxel) has a position, velocity and acceleration.&lt;br /&gt;
&lt;br /&gt;
[[File:ImageCloud-Transition1.mp4|Mapping Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition2.mp4|Image Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition3.mp4|Resolution Transition|245px]]&lt;br /&gt;
&lt;br /&gt;
Examples of transitions between mappings (left), images (middle) and resolutions (right).&lt;br /&gt;
&lt;br /&gt;
===Manipulation===&lt;br /&gt;
Since VR headsets were rare this semester, I tried to use the [https://www.ultraleap.com/product/leap-motion-controller/ Leap Motion controller] to capture the movement of the hands and manipulate the image with it.&lt;br /&gt;
While I was able to set up and calibrate the controller in Unity, the captured motion was very inconsistent and jittery, and the range of hand movement was limited. Since the camera had to be static, the added value of the three-dimensionality was limited; using a VR headset in combination with the Leap controller might solve this problem.&lt;br /&gt;
[[File:ImageCloud-Manipulation1.mp4|Hand Manipulation|735px]]&lt;br /&gt;
&lt;br /&gt;
Another problem with manipulating the voxels is that the particle representations of the voxels are not part of the physics world and cannot collide with other objects. Every collision of every particle with every object has to be calculated in the compute shader. While I was able to implement a simple interaction with the center of one hand, I failed to implement a collision routine for simple geometric forms in the compute shader that was efficient enough to run every frame. It is possible, but GPU collision is a difficult task that takes a lot more work than expected.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
===Video===&lt;br /&gt;
The uniforms of the compute shader can be updated at runtime. Using the Unity Video Player, the input image can be the current frame of a video. If the video is updated every frame, the particles receive a new target position each frame, creating a moving cloud of particles that forms ever-emerging patterns of movement.&lt;br /&gt;
&lt;br /&gt;
==Media==&lt;br /&gt;
{{#ev:youtube|cA7D-l4XI84|735}}&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_56_28.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_57_26.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_58_03.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_59_21.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_23 (1).png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_44.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_01_11.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 12.05.2021 14_03_36.png|360px]]&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124157</id>
		<title>GMU:Critical VR Lab II - Lab Work/L.-E. Kuehr</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124157"/>
		<updated>2021-05-12T18:14:24Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Manipulation */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:ImageCompute-header.png|700px]]&lt;br /&gt;
=color%20=&lt;br /&gt;
==Context==&lt;br /&gt;
In this project I wanted to revisit the idea of treating digital images as spatial pseudo-random distributions of color entities (pixels in 2D, voxels in 3D).&lt;br /&gt;
As digital color is mostly stored as a triplet of color channels (red, green and blue), it can easily be mapped into a three-dimensional space. We can further take advantage of the perception of true three-dimensionality that a VR headset offers. With the image represented as a cloud of voxels in a digital environment, we can then use the VR controllers to manipulate the image in this three-dimensional domain with our hands, pushing and pulling parts of the image cloud and remapping the cloud onto the two-dimensional canvas. This technique might provide a more intuitive approach to manipulating images and their colors.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Concept==&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Approach==&lt;br /&gt;
For every pixel in the image a colored voxel is spawned. The position of the voxel is then determined by one of several mapping functions that map the pixel (x, y, r, g, b) to a position in space (x, y, z). In the default mapping every voxel lies on a plane (z = 0), and its position in the plane is directly determined by its position in the original two-dimensional image.&lt;br /&gt;
[[File:ImageCloud-Default-Mapping.png|Default mapping|360px]]&lt;br /&gt;
[[File:ImageCloud-Simple2.gif|RGB Space|360px]]&lt;br /&gt;
&lt;br /&gt;
Since the virtual space is three-dimensional, the third dimension, or depth (the z-axis in this case), can be used to convey additional information, thus revealing a new perspective on the image. The red channel could, for example, be mapped to the z-coordinate of every voxel. But we aren&#039;t limited to the z-axis: since every digital image represents a subset of the RGB space, we can also map every pixel to a position in the RGB space (represented as a cube).&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
File:ImageCloud-Approach1.png|Original Image&lt;br /&gt;
File:ImageCloud-Approach2.png|Red Color Channel Depth&lt;br /&gt;
File:ImageCloud-Approach3.png|RGB Color Space&lt;br /&gt;
File:ImageCloud-Approach4.png|HSV Color Space&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Mapping===&lt;br /&gt;
Since there are further three-dimensional representations of digital color, such as HSV (or HSL) and YCbCr, each with its own geometric interpretation (HSV a cylinder, YCbCr a cube), and every single channel can be mapped to the z-axis, the following mappings were implemented:&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
|-&lt;br /&gt;
! Name&lt;br /&gt;
! X-Axis&lt;br /&gt;
! Y-Axis&lt;br /&gt;
! Z-Axis&lt;br /&gt;
|-&lt;br /&gt;
| Red Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Red&lt;br /&gt;
|-&lt;br /&gt;
| Green Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Green&lt;br /&gt;
|-&lt;br /&gt;
| Blue Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| RGB Space&lt;br /&gt;
| Red&lt;br /&gt;
| Green&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| Hue Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Hue&lt;br /&gt;
|-&lt;br /&gt;
| Saturation Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Saturation&lt;br /&gt;
|-&lt;br /&gt;
| Value Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| HSV Space (Cylinder)&lt;br /&gt;
| Hue&lt;br /&gt;
| Saturation&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| Y Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
|-&lt;br /&gt;
| Cb Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cb&lt;br /&gt;
|-&lt;br /&gt;
| Cr Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cr&lt;br /&gt;
|-&lt;br /&gt;
| YCbCr Space&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
| Cb&lt;br /&gt;
| Cr&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
===Implementation===&lt;br /&gt;
Depending on the resolution of the original image, the number of voxels that have to be rendered and updated can easily run into the millions. Since the resolution of VR headsets is still limited to around 2K, it is sufficient to rescale the input image to about 1024*1024 = 1048576 pixels. This is still too many to render and update every voxel as an independent object in real time. A solution is instancing on the GPU, passing only the color and position of the instances as a buffer. When using direct instanced drawing in Unity, calculating the bounds of the object is important to support native features like view-frustum culling.&lt;br /&gt;
To calculate the position and color buffers in real time, a compute shader is used, which can easily run millions of independent calculations in parallel per frame. Compute-shader inputs can be changed at runtime, enabling transitions between different resolutions, mappings and positions.&lt;br /&gt;
&lt;br /&gt;
===Transitions===&lt;br /&gt;
Because the compute shader can run every frame, transitions between different states can be calculated. In this case a simple particle-based system was used, in which every particle (voxel) has a position, velocity and acceleration.&lt;br /&gt;
&lt;br /&gt;
[[File:ImageCloud-Transition1.mp4|Mapping Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition2.mp4|Image Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition3.mp4|Resolution Transition|245px]]&lt;br /&gt;
&lt;br /&gt;
Examples of transitions between mappings (left), images (middle) and resolutions (right).&lt;br /&gt;
&lt;br /&gt;
===Manipulation===&lt;br /&gt;
Since VR headsets were rare this semester, I tried to use the [https://www.ultraleap.com/product/leap-motion-controller/ Leap Motion controller] to capture the movement of the hands and manipulate the image with it.&lt;br /&gt;
While I was able to set up and calibrate the controller in Unity, the captured motion was very inconsistent and jittery, and the range of hand movement was limited. Since the camera had to be static, the added value of the three-dimensionality was limited; using a VR headset in combination with the Leap controller might solve this problem.&lt;br /&gt;
[[File:ImageCloud-Manipulation1.mp4|Hand Manipulation|735px]]&lt;br /&gt;
&lt;br /&gt;
Another problem with manipulating the voxels is that the particle representations of the voxels are not part of the physics world and cannot collide with other objects. Every collision of every particle with every object has to be calculated in the compute shader. While I was able to implement a simple interaction with the center of one hand, I failed to implement a collision routine for simple geometric forms in the compute shader that was efficient enough to run every frame. It is possible, but GPU collision is a difficult task that takes a lot more work than expected.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Media==&lt;br /&gt;
{{#ev:youtube|cA7D-l4XI84|735}}&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_56_28.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_57_26.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_58_03.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_59_21.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_23 (1).png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_44.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_01_11.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 12.05.2021 14_03_36.png|360px]]&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124156</id>
		<title>GMU:Critical VR Lab II - Lab Work/L.-E. Kuehr</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124156"/>
		<updated>2021-05-12T18:13:26Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:ImageCompute-header.png|700px]]&lt;br /&gt;
=color%20=&lt;br /&gt;
==Context==&lt;br /&gt;
In this project I wanted to revisit the idea of treating digital images as spatial pseudo-random distributions of color entities (pixels in 2D, voxels in 3D).&lt;br /&gt;
As digital color is mostly stored as a triplet of color channels (red, green and blue), it can easily be mapped into a three-dimensional space. We can further take advantage of the perception of true three-dimensionality that a VR headset offers. With the image represented as a cloud of voxels in a digital environment, we can then use the VR controllers to manipulate the image in this three-dimensional domain with our hands, pushing and pulling parts of the image cloud and remapping the cloud onto the two-dimensional canvas. This technique might provide a more intuitive approach to manipulating images and their colors.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Concept==&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Approach==&lt;br /&gt;
For every pixel in the image a colored voxel is spawned. The position of the voxel is then determined by one of several mapping functions that map the pixel (x, y, r, g, b) to a position in space (x, y, z). In the default mapping every voxel lies on a plane (z = 0), and its position in the plane is directly determined by its position in the original two-dimensional image.&lt;br /&gt;
[[File:ImageCloud-Default-Mapping.png|Default mapping|360px]]&lt;br /&gt;
[[File:ImageCloud-Simple2.gif|RGB Space|360px]]&lt;br /&gt;
&lt;br /&gt;
Since the virtual space is three-dimensional, the third dimension, or depth (the z-axis in this case), can be used to convey additional information, thus revealing a new perspective on the image. The red channel could, for example, be mapped to the z-coordinate of every voxel. But we aren&#039;t limited to the z-axis: since every digital image represents a subset of the RGB space, we can also map every pixel to a position in the RGB space (represented as a cube).&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
File:ImageCloud-Approach1.png|Original Image&lt;br /&gt;
File:ImageCloud-Approach2.png|Red Color Channel Depth&lt;br /&gt;
File:ImageCloud-Approach3.png|RGB Color Space&lt;br /&gt;
File:ImageCloud-Approach4.png|HSV Color Space&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Mapping===&lt;br /&gt;
Since there are further three-dimensional representations of digital color, such as HSV (or HSL) and YCbCr, each with its own geometric interpretation (HSV a cylinder, YCbCr a cube), and since every single channel can be mapped to the z-axis, the following mappings were implemented:&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
|-&lt;br /&gt;
! Name&lt;br /&gt;
! X-Axis&lt;br /&gt;
! Y-Axis&lt;br /&gt;
! Z-Axis&lt;br /&gt;
|-&lt;br /&gt;
| Red Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Red&lt;br /&gt;
|-&lt;br /&gt;
| Green Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Green&lt;br /&gt;
|-&lt;br /&gt;
| Blue Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| RGB Space&lt;br /&gt;
| Red&lt;br /&gt;
| Green&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| Hue Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Hue&lt;br /&gt;
|-&lt;br /&gt;
| Saturation Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Saturation&lt;br /&gt;
|-&lt;br /&gt;
| Value Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| HSV Space (Cylinder)&lt;br /&gt;
| Hue&lt;br /&gt;
| Saturation&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| Y Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
|-&lt;br /&gt;
| Cb Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cb&lt;br /&gt;
|-&lt;br /&gt;
| Cr Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cr&lt;br /&gt;
|-&lt;br /&gt;
| YCbCr Space&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
| Cb&lt;br /&gt;
| Cr&lt;br /&gt;
|}&lt;br /&gt;
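The cylinder interpretation of HSV treats hue as an angle. One way such a mapping might be realized (a hedged Python sketch using the standard-library colorsys module; the function name and the cylinder parameterization are assumptions, not necessarily how the project lays out its HSV axes):

```python
import colorsys
import math

def hsv_cylinder_mapping(x, y, r, g, b):
    # Convert RGB (each channel in [0, 1]) to HSV, then place the voxel in a
    # cylinder: hue is the angle, saturation the radius, value the height.
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    angle = 2.0 * math.pi * h
    return (s * math.cos(angle), s * math.sin(angle), v)

# Pure red sits at angle 0 on the cylinder wall at full height.
print(hsv_cylinder_mapping(0, 0, 1.0, 0.0, 0.0))  # (1.0, 0.0, 1.0)
```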
&lt;br /&gt;
===Implementation===&lt;br /&gt;
Depending on the resolution of the original image, the number of voxels that have to be rendered and updated can easily run into the millions. Since the resolution of VR headsets is still limited to around 2K, it is sufficient to rescale the input image to about 1024 × 1024 = 1,048,576 pixels. This is still too many to render and update as independent objects in real time. A solution is GPU instancing, passing just the color and position of each instance as a buffer. When using direct instanced drawing in Unity, calculating the bounds of the object correctly is very important to support native features like view-frustum culling.&lt;br /&gt;
To calculate the position and color buffers in real time, a compute shader is used, which can easily perform millions of independent calculations in parallel each frame. Compute-shader inputs can be changed at runtime, enabling transitions between different resolutions, mappings and positions.&lt;br /&gt;
&lt;br /&gt;
===Transitions===&lt;br /&gt;
Because the compute shader can run every frame, transitions between different states can be calculated. In this case a simple particle-based system was used, in which every particle (voxel) has a position, velocity and acceleration.&lt;br /&gt;
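A minimal CPU sketch of such a transition, assuming each particle is driven toward its target position by a spring-damper force (the stiffness and damping constants are hypothetical, and the real system runs per-voxel in the compute shader):

```python
def step_particle(pos, vel, target, dt=1/60, stiffness=20.0, damping=6.0):
    # Spring-damper update per axis: the particle accelerates toward its
    # target voxel position, so a mapping change becomes a smooth transition.
    new_pos, new_vel = [], []
    for p, v, t in zip(pos, vel, target):
        a = stiffness * (t - p) - damping * v
        v = v + a * dt
        new_vel.append(v)
        new_pos.append(p + v * dt)
    return new_pos, new_vel

pos, vel = [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]
target = [1.0, 0.5, 0.25]  # hypothetical destination in the RGB cube
for _ in range(300):  # ~5 seconds at 60 fps
    pos, vel = step_particle(pos, vel, target)
print(all(abs(p - t) < 1e-3 for p, t in zip(pos, target)))  # True
```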
&lt;br /&gt;
[[File:ImageCloud-Transition1.mp4|Mapping Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition2.mp4|Image Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition3.mp4|Resolution Transition|245px]]&lt;br /&gt;
&lt;br /&gt;
Examples of transitions between mappings (left), images (middle) and resolutions (right).&lt;br /&gt;
&lt;br /&gt;
===Manipulation===&lt;br /&gt;
Since VR headsets were rare this semester, I tried to use the [https://www.ultraleap.com/product/leap-motion-controller/ Leap Motion controller] to capture the movement of the hands and manipulate the image with it.&lt;br /&gt;
While I was able to set up and calibrate the controller in Unity, the captured motion was very inconsistent and jittery, and the range of hand movement was limited. Since the camera had to be static, the added value of the three-dimensionality was limited; using a VR headset in combination with the Leap Motion controller might solve this problem.&lt;br /&gt;
[[File:ImageCloud-Manipulation1.mp4|Hand Manipulation|735px]]&lt;br /&gt;
Another problem with manipulating the voxels is that their particle representations are not part of the physics world and cannot collide with other objects. Every collision of every particle with every object has to be calculated in the compute shader. While I was able to implement a simple interaction with the center of one hand, I failed to implement collision with simple geometric shapes in the compute shader that was efficient enough to run every frame. It is possible, but GPU collision is a difficult task that takes far more work than expected.&lt;br /&gt;
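For illustration only, the per-particle response to one simple shape (a sphere, e.g. around the palm) might be sketched like this in Python; the function name is hypothetical, and the real version would have to run for every particle inside the compute shader:

```python
import math

def resolve_sphere_collision(p, center, radius):
    # If the particle is inside the sphere, project it back out to the surface.
    dx, dy, dz = p[0] - center[0], p[1] - center[1], p[2] - center[2]
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dist >= radius or dist == 0.0:
        return p  # outside the sphere (or exactly at its center): unchanged
    scale = radius / dist
    return (center[0] + dx * scale,
            center[1] + dy * scale,
            center[2] + dz * scale)

# A particle 0.5 units from the center of a unit sphere is pushed to the surface.
print(resolve_sphere_collision((0.5, 0.0, 0.0), (0.0, 0.0, 0.0), 1.0))  # (1.0, 0.0, 0.0)
```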
----&lt;br /&gt;
&lt;br /&gt;
==Media==&lt;br /&gt;
{{#ev:youtube|cA7D-l4XI84|735}}&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_56_28.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_57_26.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_58_03.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_59_21.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_23 (1).png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_44.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_01_11.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 12.05.2021 14_03_36.png|360px]]&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124155</id>
		<title>GMU:Critical VR Lab II - Lab Work/L.-E. Kuehr</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124155"/>
		<updated>2021-05-12T18:01:06Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Manipulation */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:ImageCompute-header.png|700px]]&lt;br /&gt;
=color%20=&lt;br /&gt;
==Context==&lt;br /&gt;
In this project I wanted to revisit the idea of treating digital images as spatial pseudo-random distributions of color entities (pixels in 2D, voxels in 3D).&lt;br /&gt;
Since digital color is usually stored as a triplet of color channels (red, green and blue), it can easily be mapped into three-dimensional space. We can further take advantage of the perception of true three-dimensionality that a VR headset offers. With the image represented as a cloud of voxels in a digital environment, we can use the VR controllers to manipulate the image in this three-dimensional domain with our hands, pushing and pulling parts of the image cloud and then remapping the cloud back onto the two-dimensional canvas. This technique might provide a more intuitive approach to manipulating images and their colors.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Concept==&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Approach==&lt;br /&gt;
For every pixel in the image a colored voxel is spawned. The position of the voxel is then determined by one of several mapping functions that map the pixel (x, y, r, g, b) to a position in space (x, y, z). In the default mapping every voxel lies on a plane (z = 0), and its position in that plane is directly determined by the pixel&#039;s position in the original two-dimensional image.&lt;br /&gt;
[[File:ImageCloud-Default-Mapping.png|Default mapping|360px]]&lt;br /&gt;
[[File:ImageCloud-Simple2.gif|RGB Space|360px]]&lt;br /&gt;
&lt;br /&gt;
Since the virtual space is three-dimensional, the third dimension or depth (the z-axis in this case) can be used to convey additional information, revealing a new perspective on the image. The red channel could, for example, be mapped to the z-coordinate of every voxel. But we aren&#039;t limited to using only the z-axis: since every digital image represents a subset of the RGB space, we can also map every pixel to a position in that space (represented as a cube).&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
File:ImageCloud-Approach1.png|Original Image&lt;br /&gt;
File:ImageCloud-Approach2.png|Red Color Channel Depth&lt;br /&gt;
File:ImageCloud-Approach3.png|RGB Color Space&lt;br /&gt;
File:ImageCloud-Approach4.png|HSV Color Space&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Mapping===&lt;br /&gt;
Since there are further three-dimensional representations of digital color, such as HSV (or HSL) and YCbCr, each with its own geometric interpretation (HSV a cylinder, YCbCr a cube), and since every single channel can be mapped to the z-axis, the following mappings were implemented:&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
|-&lt;br /&gt;
! Name&lt;br /&gt;
! X-Axis&lt;br /&gt;
! Y-Axis&lt;br /&gt;
! Z-Axis&lt;br /&gt;
|-&lt;br /&gt;
| Red Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Red&lt;br /&gt;
|-&lt;br /&gt;
| Green Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Green&lt;br /&gt;
|-&lt;br /&gt;
| Blue Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| RGB Space&lt;br /&gt;
| Red&lt;br /&gt;
| Green&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| Hue Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Hue&lt;br /&gt;
|-&lt;br /&gt;
| Saturation Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Saturation&lt;br /&gt;
|-&lt;br /&gt;
| Value Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| HSV Space (Cylinder)&lt;br /&gt;
| Hue&lt;br /&gt;
| Saturation&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| Y Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
|-&lt;br /&gt;
| Cb Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cb&lt;br /&gt;
|-&lt;br /&gt;
| Cr Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cr&lt;br /&gt;
|-&lt;br /&gt;
| YCbCr Space&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
| Cb&lt;br /&gt;
| Cr&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
===Implementation===&lt;br /&gt;
Depending on the resolution of the original image, the number of voxels that have to be rendered and updated can easily run into the millions. Since the resolution of VR headsets is still limited to around 2K, it is sufficient to rescale the input image to about 1024 × 1024 = 1,048,576 pixels. This is still too many to render and update as independent objects in real time. A solution is GPU instancing, passing just the color and position of each instance as a buffer. When using direct instanced drawing in Unity, calculating the bounds of the object correctly is very important to support native features like view-frustum culling.&lt;br /&gt;
To calculate the position and color buffers in real time, a compute shader is used, which can easily perform millions of independent calculations in parallel each frame. Compute-shader inputs can be changed at runtime, enabling transitions between different resolutions, mappings and positions.&lt;br /&gt;
&lt;br /&gt;
===Transitions===&lt;br /&gt;
Because the compute shader can run every frame, transitions between different states can be calculated. In this case a simple particle-based system was used, in which every particle (voxel) has a position, velocity and acceleration.&lt;br /&gt;
&lt;br /&gt;
[[File:ImageCloud-Transition1.mp4|Mapping Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition2.mp4|Image Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition3.mp4|Resolution Transition|245px]]&lt;br /&gt;
&lt;br /&gt;
Examples of transitions between mappings (left), images (middle) and resolutions (right).&lt;br /&gt;
&lt;br /&gt;
===Manipulation===&lt;br /&gt;
Since VR headsets were rare this semester, I tried to use the [https://www.ultraleap.com/product/leap-motion-controller/ Leap Motion controller] to capture the movement of the hands and manipulate the image with it.&lt;br /&gt;
While I was able to set up and calibrate the controller in Unity, the captured motion was very inconsistent and jittery, and the range of hand movement was limited. Since the camera had to be static, the added value of the three-dimensionality was limited; using a VR headset in combination with the Leap Motion controller might solve this problem.&lt;br /&gt;
[[File:ImageCloud-Manipulation1.mp4|Hand Manipulation|735px]]&lt;br /&gt;
Another problem with manipulating the voxels is that their particle representations are not part of the physics world and cannot collide with other objects. Every collision of every particle with every object has to be calculated in the compute shader. While I was able to implement a simple interaction with the center of one hand, I failed to implement collision with simple geometric shapes in the compute shader that was efficient enough to run every frame. It is possible, but GPU collision is a difficult task that takes far more work than expected.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Outlook==&lt;br /&gt;
{{#ev:youtube|cA7D-l4XI84|735}}&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Media==&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_56_28.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_57_26.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_58_03.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_59_21.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_23 (1).png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_44.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_01_11.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 12.05.2021 14_03_36.png|360px]]&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124154</id>
		<title>GMU:Critical VR Lab II - Lab Work/L.-E. Kuehr</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124154"/>
		<updated>2021-05-12T18:00:20Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Manipulation */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:ImageCompute-header.png|700px]]&lt;br /&gt;
=color%20=&lt;br /&gt;
==Context==&lt;br /&gt;
In this project I wanted to revisit the idea of treating digital images as spatial pseudo-random distributions of color entities (pixels in 2D, voxels in 3D).&lt;br /&gt;
Since digital color is usually stored as a triplet of color channels (red, green and blue), it can easily be mapped into three-dimensional space. We can further take advantage of the perception of true three-dimensionality that a VR headset offers. With the image represented as a cloud of voxels in a digital environment, we can use the VR controllers to manipulate the image in this three-dimensional domain with our hands, pushing and pulling parts of the image cloud and then remapping the cloud back onto the two-dimensional canvas. This technique might provide a more intuitive approach to manipulating images and their colors.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Concept==&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Approach==&lt;br /&gt;
For every pixel in the image a colored voxel is spawned. The position of the voxel is then determined by one of several mapping functions that map the pixel (x, y, r, g, b) to a position in space (x, y, z). In the default mapping every voxel lies on a plane (z = 0), and its position in that plane is directly determined by the pixel&#039;s position in the original two-dimensional image.&lt;br /&gt;
[[File:ImageCloud-Default-Mapping.png|Default mapping|360px]]&lt;br /&gt;
[[File:ImageCloud-Simple2.gif|RGB Space|360px]]&lt;br /&gt;
&lt;br /&gt;
Since the virtual space is three-dimensional, the third dimension or depth (the z-axis in this case) can be used to convey additional information, revealing a new perspective on the image. The red channel could, for example, be mapped to the z-coordinate of every voxel. But we aren&#039;t limited to using only the z-axis: since every digital image represents a subset of the RGB space, we can also map every pixel to a position in that space (represented as a cube).&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
File:ImageCloud-Approach1.png|Original Image&lt;br /&gt;
File:ImageCloud-Approach2.png|Red Color Channel Depth&lt;br /&gt;
File:ImageCloud-Approach3.png|RGB Color Space&lt;br /&gt;
File:ImageCloud-Approach4.png|HSV Color Space&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Mapping===&lt;br /&gt;
Since there are further three-dimensional representations of digital color, such as HSV (or HSL) and YCbCr, each with its own geometric interpretation (HSV a cylinder, YCbCr a cube), and since every single channel can be mapped to the z-axis, the following mappings were implemented:&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
|-&lt;br /&gt;
! Name&lt;br /&gt;
! X-Axis&lt;br /&gt;
! Y-Axis&lt;br /&gt;
! Z-Axis&lt;br /&gt;
|-&lt;br /&gt;
| Red Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Red&lt;br /&gt;
|-&lt;br /&gt;
| Green Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Green&lt;br /&gt;
|-&lt;br /&gt;
| Blue Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| RGB Space&lt;br /&gt;
| Red&lt;br /&gt;
| Green&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| Hue Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Hue&lt;br /&gt;
|-&lt;br /&gt;
| Saturation Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Saturation&lt;br /&gt;
|-&lt;br /&gt;
| Value Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| HSV Space (Cylinder)&lt;br /&gt;
| Hue&lt;br /&gt;
| Saturation&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| Y Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
|-&lt;br /&gt;
| Cb Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cb&lt;br /&gt;
|-&lt;br /&gt;
| Cr Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cr&lt;br /&gt;
|-&lt;br /&gt;
| YCbCr Space&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
| Cb&lt;br /&gt;
| Cr&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
===Implementation===&lt;br /&gt;
Depending on the resolution of the original image, the number of voxels that have to be rendered and updated can easily run into the millions. Since the resolution of VR headsets is still limited to around 2K, it is sufficient to rescale the input image to about 1024 × 1024 = 1,048,576 pixels. This is still too many to render and update as independent objects in real time. A solution is GPU instancing, passing just the color and position of each instance as a buffer. When using direct instanced drawing in Unity, calculating the bounds of the object correctly is very important to support native features like view-frustum culling.&lt;br /&gt;
To calculate the position and color buffers in real time, a compute shader is used, which can easily perform millions of independent calculations in parallel each frame. Compute-shader inputs can be changed at runtime, enabling transitions between different resolutions, mappings and positions.&lt;br /&gt;
&lt;br /&gt;
===Transitions===&lt;br /&gt;
Because the compute shader can run every frame, transitions between different states can be calculated. In this case a simple particle-based system was used, in which every particle (voxel) has a position, velocity and acceleration.&lt;br /&gt;
&lt;br /&gt;
[[File:ImageCloud-Transition1.mp4|Mapping Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition2.mp4|Image Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition3.mp4|Resolution Transition|245px]]&lt;br /&gt;
&lt;br /&gt;
Examples of transitions between mappings (left), images (middle) and resolutions (right).&lt;br /&gt;
&lt;br /&gt;
===Manipulation===&lt;br /&gt;
Since VR headsets were rare this semester, I tried to use the [https://www.ultraleap.com/product/leap-motion-controller/ Leap Motion controller] to capture the movement of the hands and manipulate the image with it.&lt;br /&gt;
While I was able to set up and calibrate the controller in Unity, the captured motion was very inconsistent and jittery, and the range of hand movement was limited. Since the camera had to be static, the added value of the three-dimensionality was limited; using a VR headset in combination with the Leap Motion controller might solve this problem.&lt;br /&gt;
[[File:ImageCloud-Manipulation1.mp4|none|1000px]]&lt;br /&gt;
Another problem with manipulating the voxels is that their particle representations are not part of the physics world and cannot collide with other objects. Every collision of every particle with every object has to be calculated in the compute shader. While I was able to implement a simple interaction with the center of one hand, I failed to implement collision with simple geometric shapes in the compute shader that was efficient enough to run every frame. It is possible, but GPU collision is a difficult task that takes far more work than expected.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Outlook==&lt;br /&gt;
{{#ev:youtube|cA7D-l4XI84|735}}&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Media==&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_56_28.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_57_26.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_58_03.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_59_21.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_23 (1).png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_44.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_01_11.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 12.05.2021 14_03_36.png|360px]]&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124153</id>
		<title>GMU:Critical VR Lab II - Lab Work/L.-E. Kuehr</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124153"/>
		<updated>2021-05-12T17:59:57Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Manipulation */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:ImageCompute-header.png|700px]]&lt;br /&gt;
=color%20=&lt;br /&gt;
==Context==&lt;br /&gt;
In this project I wanted to revisit the idea of treating digital images as spatial pseudo-random distributions of color entities (pixels in 2D, voxels in 3D).&lt;br /&gt;
Since digital color is usually stored as a triplet of color channels (red, green and blue), it can easily be mapped into three-dimensional space. We can further take advantage of the perception of true three-dimensionality that a VR headset offers. With the image represented as a cloud of voxels in a digital environment, we can use the VR controllers to manipulate the image in this three-dimensional domain with our hands, pushing and pulling parts of the image cloud and then remapping the cloud back onto the two-dimensional canvas. This technique might provide a more intuitive approach to manipulating images and their colors.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Concept==&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Approach==&lt;br /&gt;
For every pixel in the image a colored voxel is spawned. The position of the voxel is then determined by one of several mapping functions that map the pixel (x, y, r, g, b) to a position in space (x, y, z). In the default mapping every voxel lies on a plane (z = 0), and its position in that plane is directly determined by the pixel&#039;s position in the original two-dimensional image.&lt;br /&gt;
[[File:ImageCloud-Default-Mapping.png|Default mapping|360px]]&lt;br /&gt;
[[File:ImageCloud-Simple2.gif|RGB Space|360px]]&lt;br /&gt;
&lt;br /&gt;
Since the virtual space is three-dimensional, the third dimension or depth (the z-axis in this case) can be used to convey additional information, revealing a new perspective on the image. The red channel could, for example, be mapped to the z-coordinate of every voxel. But we aren&#039;t limited to using only the z-axis: since every digital image represents a subset of the RGB space, we can also map every pixel to a position in that space (represented as a cube).&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
File:ImageCloud-Approach1.png|Original Image&lt;br /&gt;
File:ImageCloud-Approach2.png|Red Color Channel Depth&lt;br /&gt;
File:ImageCloud-Approach3.png|RGB Color Space&lt;br /&gt;
File:ImageCloud-Approach4.png|HSV Color Space&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Mapping===&lt;br /&gt;
Since there are further three-dimensional representations of digital color, such as HSV (or HSL) and YCbCr, each with its own geometric interpretation (HSV a cylinder, YCbCr a cube), and since every single channel can be mapped to the z-axis, the following mappings were implemented:&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
|-&lt;br /&gt;
! Name&lt;br /&gt;
! X-Axis&lt;br /&gt;
! Y-Axis&lt;br /&gt;
! Z-Axis&lt;br /&gt;
|-&lt;br /&gt;
| Red Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Red&lt;br /&gt;
|-&lt;br /&gt;
| Green Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Green&lt;br /&gt;
|-&lt;br /&gt;
| Blue Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| RGB Space&lt;br /&gt;
| Red&lt;br /&gt;
| Green&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| Hue Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Hue&lt;br /&gt;
|-&lt;br /&gt;
| Saturation Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Saturation&lt;br /&gt;
|-&lt;br /&gt;
| Value Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| HSV Space (Cylinder)&lt;br /&gt;
| Hue&lt;br /&gt;
| Saturation&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| Y Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
|-&lt;br /&gt;
| Cb Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cb&lt;br /&gt;
|-&lt;br /&gt;
| Cr Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cr&lt;br /&gt;
|-&lt;br /&gt;
| YCbCr Space&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
| Cb&lt;br /&gt;
| Cr&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
===Implementation===&lt;br /&gt;
Depending on the resolution of the original image, the number of voxels that have to be rendered and updated can easily run into the millions. Since the resolution of VR headsets is still limited to around 2K, it is sufficient to rescale the input image to about 1024 × 1024 = 1,048,576 pixels. This is still too many to render and update as independent objects in real time. A solution is GPU instancing, passing just the color and position of each instance as a buffer. When using direct instanced drawing in Unity, calculating the bounds of the object correctly is very important to support native features like view-frustum culling.&lt;br /&gt;
To calculate the position and color buffers in real time, a compute shader is used, which can easily perform millions of independent calculations in parallel each frame. Compute-shader inputs can be changed at runtime, enabling transitions between different resolutions, mappings and positions.&lt;br /&gt;
&lt;br /&gt;
===Transitions===&lt;br /&gt;
Because the compute shader can run every frame, transitions between different states can be calculated. In this case a simple particle-based system was used, in which every particle (voxel) has a position, velocity and acceleration.&lt;br /&gt;
&lt;br /&gt;
[[File:ImageCloud-Transition1.mp4|Mapping Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition2.mp4|Image Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition3.mp4|Resolution Transition|245px]]&lt;br /&gt;
&lt;br /&gt;
Examples of transitions between mappings (left), images (middle) and resolutions (right).&lt;br /&gt;
&lt;br /&gt;
===Manipulation===&lt;br /&gt;
Since VR headsets were rare this semester, I tried to use the [https://www.ultraleap.com/product/leap-motion-controller/ Leap Motion controller] to capture the movement of the hands and manipulate the image with it.&lt;br /&gt;
While I was able to set up and calibrate the controller in Unity, the captured motion was very inconsistent and jittery, and the range of hand movement was limited. Since the camera had to be static, the added value of the three-dimensionality was limited; using a VR headset in combination with the Leap Motion controller might solve this problem.&lt;br /&gt;
[[File:ImageCloud-Manipulation1.mp4|none|735px]]&lt;br /&gt;
Another problem with manipulating the voxels is that their particle representations are not part of the physics world and cannot collide with other objects. Every collision of every particle with every object has to be calculated in the compute shader. While I was able to implement a simple interaction with the center of one hand, I failed to implement collision with simple geometric shapes in the compute shader that was efficient enough to run every frame. It is possible, but GPU collision is a difficult task that takes far more work than expected.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Outlook==&lt;br /&gt;
{{#ev:youtube|cA7D-l4XI84|735}}&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Media==&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_56_28.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_57_26.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_58_03.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_59_21.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_23 (1).png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_44.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_01_11.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 12.05.2021 14_03_36.png|360px]]&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124152</id>
		<title>GMU:Critical VR Lab II - Lab Work/L.-E. Kuehr</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124152"/>
		<updated>2021-05-12T17:59:18Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Outlook */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:ImageCompute-header.png|700px]]&lt;br /&gt;
=color%20=&lt;br /&gt;
==Context==&lt;br /&gt;
In this project I wanted to revisit the idea of treating digital images as spatial pseudo-random distributions of color entities (pixels in 2D, voxels in 3D).&lt;br /&gt;
Since digital color is usually stored as a triplet of color channels (red, green and blue), it can easily be mapped into three-dimensional space. We can further take advantage of the perception of true three-dimensionality that a VR headset offers. With the image represented as a cloud of voxels in a digital environment, we can use the VR controllers to manipulate the image in this three-dimensional domain with our hands, pushing and pulling parts of the image cloud and then remapping the cloud back onto the two-dimensional canvas. This technique might provide a more intuitive approach to manipulating images and their colors.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Concept==&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Approach==&lt;br /&gt;
For every pixel in the image a colored voxel is spawned. The position of each voxel is then determined by one of several mapping functions that map the pixel (x, y, r, g, b) to a position in space (x, y, z). In the default mapping every voxel lies on a plane (z = 0), and its position in the plane is directly determined by its position in the original two-dimensional image.&lt;br /&gt;
[[File:ImageCloud-Default-Mapping.png|Default mapping|360px]]&lt;br /&gt;
[[File:ImageCloud-Simple2.gif|RGB Space|360px]]&lt;br /&gt;
&lt;br /&gt;
Since the virtual space is three-dimensional, the third dimension, or depth (the z-axis in this case), can be used to convey additional information, revealing a new perspective on the image. The red channel, for example, could be mapped to the z-coordinate of every voxel. But we aren&#039;t limited to the z-axis: since every digital image represents a subset of the RGB space, we can also map every pixel to its position in the RGB cube.&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
File:ImageCloud-Approach1.png|Original Image&lt;br /&gt;
File:ImageCloud-Approach2.png|Red Color Channel Depth&lt;br /&gt;
File:ImageCloud-Approach3.png|RGB Color Space&lt;br /&gt;
File:ImageCloud-Approach4.png|HSV Color Space&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Mapping===&lt;br /&gt;
Beyond RGB there are further three-dimensional representations of digital color, such as HSV (or HSL) and YCbCr, each with its own geometric interpretation (HSV a cylinder, YCbCr a cube). Since every single channel can also be mapped to the z-axis, the following mappings were implemented:&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
|-&lt;br /&gt;
! Name&lt;br /&gt;
! X-Axis&lt;br /&gt;
! Y-Axis&lt;br /&gt;
! Z-Axis&lt;br /&gt;
|-&lt;br /&gt;
| Red Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Red&lt;br /&gt;
|-&lt;br /&gt;
| Green Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Green&lt;br /&gt;
|-&lt;br /&gt;
| Blue Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| RGB Space&lt;br /&gt;
| Red&lt;br /&gt;
| Green&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| Hue Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Hue&lt;br /&gt;
|-&lt;br /&gt;
| Saturation Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Saturation&lt;br /&gt;
|-&lt;br /&gt;
| Value Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| HSV Space (Cylinder)&lt;br /&gt;
| Hue&lt;br /&gt;
| Saturation&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| Y Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
|-&lt;br /&gt;
| Cb Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cb&lt;br /&gt;
|-&lt;br /&gt;
| Cr Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cr&lt;br /&gt;
|-&lt;br /&gt;
| YCbCr Space&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
| Cb&lt;br /&gt;
| Cr&lt;br /&gt;
|}&lt;br /&gt;
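A few rows of the table above can be sketched as plain functions (a hypothetical Python stand-in for the mapping step; the names and the normalized 0..1 ranges are assumptions, and the actual implementation runs in a compute shader):

```python
import colorsys

# Each mapping takes normalized image coordinates (x, y) and normalized
# color (r, g, b), and returns an (X, Y, Z) position for the voxel.
def red_channel_depth(x, y, r, g, b):
    return (x, y, r)            # image plane, pushed along z by the red value

def rgb_space(x, y, r, g, b):
    return (r, g, b)            # pixel placed at its color in the RGB cube

def hsv_space(x, y, r, g, b):
    return colorsys.rgb_to_hsv(r, g, b)   # hue, saturation, value as axes

MAPPINGS = {
    "Red Color Channel Depth": red_channel_depth,
    "RGB Space": rgb_space,
    "HSV Space (Cylinder)": hsv_space,
}

position = MAPPINGS["RGB Space"](0.5, 0.5, 1.0, 0.0, 0.0)  # (1.0, 0.0, 0.0)
```

Switching mappings is then just a matter of swapping which function is applied to every pixel.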
&lt;br /&gt;
===Implementation===&lt;br /&gt;
Depending on the resolution of the original image, the number of voxels that have to be rendered and updated can easily run into the millions. Since the resolution of VR headsets is still limited to around 2K, it is sufficient to rescale the input image to about 1024*1024 = 1048576 pixels. This is still far too many to render and update as independent objects in real time. The solution is GPU instancing, passing only the color and position of the instances as buffers. When using direct instanced drawing in Unity, correctly calculating the bounds of the object is important to support native features like view-frustum culling.&lt;br /&gt;
To calculate the position and color buffers in real time, a compute shader is used, which can easily perform millions of independent calculations in parallel each frame. Compute-shader inputs can be changed at runtime, enabling transitions between different resolutions, mappings and positions.&lt;br /&gt;
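The rescaling step can be sketched as follows (a hypothetical Python stand-in; the budget of 1024*1024 voxels comes from the text, while the function name and uniform-scaling choice are assumptions):

```python
import math

def target_resolution(width, height, budget=1024 * 1024):
    """Scale (width, height) down uniformly so the voxel count stays within budget."""
    scale = min(1.0, math.sqrt(budget / (width * height)))
    return (int(width * scale), int(height * scale))

# A 4000x3000 photo is reduced before spawning one voxel per pixel:
print(target_resolution(4000, 3000))
```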
&lt;br /&gt;
===Transitions===&lt;br /&gt;
Because the compute shader can run every frame, transitions between different states can be calculated. Here a simple particle-based system is used in which every particle (voxel) has a position, velocity and acceleration.&lt;br /&gt;
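Per particle, such a transition update might look like this (a hypothetical single-particle Python stand-in for what the compute shader evaluates in parallel; the spring and damping constants are assumptions):

```python
def step(position, velocity, target, dt, stiffness=4.0, damping=2.0):
    """Advance one particle: spring pull toward its target position, with damping."""
    acceleration = tuple(stiffness * (t - p) - damping * v
                         for p, v, t in zip(position, velocity, target))
    velocity = tuple(v + a * dt for v, a in zip(velocity, acceleration))
    position = tuple(p + v * dt for p, v in zip(position, velocity))
    return position, velocity

# Pulling a voxel from the image plane toward its spot in the RGB cube:
p, v = (0.2, 0.7, 0.0), (0.0, 0.0, 0.0)
for _ in range(1000):
    p, v = step(p, v, target=(1.0, 0.0, 0.0), dt=0.01)
```

After enough frames the voxel settles at its target, which is what makes the mapping, image and resolution transitions in the videos look continuous.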
&lt;br /&gt;
[[File:ImageCloud-Transition1.mp4|Mapping Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition2.mp4|Image Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition3.mp4|Resolution Transition|245px]]&lt;br /&gt;
&lt;br /&gt;
Examples of transitions between mappings (left), images (middle) and resolutions (right).&lt;br /&gt;
&lt;br /&gt;
===Manipulation===&lt;br /&gt;
Since VR headsets were rare this semester, I tried to use the [https://www.ultraleap.com/product/leap-motion-controller/ Leap controller] to capture the movement of the hands and manipulate the image with it.&lt;br /&gt;
While I was able to set up and calibrate the controller in Unity, the captured motion was very inconsistent and jittery, and the range of hand movement was limited. Since the camera had to remain static, the added value of the three-dimensionality was limited; using a VR headset in combination with the Leap controller might solve this problem.&lt;br /&gt;
[[File:ImageCloud-Manipulation1.mp4|none|500px]]&lt;br /&gt;
Another problem with manipulating the voxels is that their particle representations are not part of the physics world and cannot collide with other objects, so every collision of every particle with every object has to be calculated in the compute shader. While I was able to implement a simple interaction with the center of one hand, I failed to implement collision with simple geometric forms in the compute shader that was efficient enough to run every frame. It is possible, but GPU collision is a difficult task that takes far more work than expected.&lt;br /&gt;
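The per-particle collision test the compute shader would need can be illustrated like this (a hypothetical Python stand-in, not the failed shader version; a single sphere, e.g. the palm of the hand, is assumed as the collider):

```python
import math

def collide_sphere(point, center, radius):
    """If the point lies inside the sphere, project it onto the surface."""
    d = tuple(p - c for p, c in zip(point, center))
    dist = math.sqrt(sum(x * x for x in d))
    # push is 1.0 outside the sphere and radius/dist inside it
    push = max(1.0, radius / max(dist, 1e-6))
    return tuple(c + x * push for c, x in zip(center, d))

# A voxel inside the sphere is pushed to its surface:
print(collide_sphere((0.5, 0.0, 0.0), (0.0, 0.0, 0.0), 1.0))  # (1.0, 0.0, 0.0)
```

Even this branch-free form has to be evaluated once per particle per collider per frame, which hints at why efficient GPU collision for arbitrary geometry turned out to be so much work.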
----&lt;br /&gt;
&lt;br /&gt;
==Outlook==&lt;br /&gt;
{{#ev:youtube|cA7D-l4XI84|735}}&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Media==&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_56_28.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_57_26.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_58_03.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_59_21.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_23 (1).png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_44.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_01_11.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 12.05.2021 14_03_36.png|360px]]&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124151</id>
		<title>GMU:Critical VR Lab II - Lab Work/L.-E. Kuehr</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124151"/>
		<updated>2021-05-12T17:56:59Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Outlook */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:ImageCompute-header.png|700px]]&lt;br /&gt;
=color%20=&lt;br /&gt;
==Context==&lt;br /&gt;
In this project I wanted to revisit the idea of treating digital images as spatial pseudo-random distributions of color-entities (pixels in 2d, voxels in 3d).&lt;br /&gt;
As digital color is mostly stored as a triplet of color channels (red, green and blue), it can easily be mapped into three-dimensional space. A VR headset lets us go further and perceive this space with true depth. With the image represented as a cloud of voxels in the virtual environment, we can use the VR controllers to manipulate the image in this three-dimensional domain with our hands, pushing and pulling parts of the image cloud and then remapping it to the two-dimensional canvas. This technique might provide a more intuitive approach to manipulating images and their colors.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Concept==&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Approach==&lt;br /&gt;
For every pixel in the image a colored voxel is spawned. The position of each voxel is then determined by one of several mapping functions that map the pixel (x, y, r, g, b) to a position in space (x, y, z). In the default mapping every voxel lies on a plane (z = 0), and its position in the plane is directly determined by its position in the original two-dimensional image.&lt;br /&gt;
[[File:ImageCloud-Default-Mapping.png|Default mapping|360px]]&lt;br /&gt;
[[File:ImageCloud-Simple2.gif|RGB Space|360px]]&lt;br /&gt;
&lt;br /&gt;
Since the virtual space is three-dimensional, the third dimension, or depth (the z-axis in this case), can be used to convey additional information, revealing a new perspective on the image. The red channel, for example, could be mapped to the z-coordinate of every voxel. But we aren&#039;t limited to the z-axis: since every digital image represents a subset of the RGB space, we can also map every pixel to its position in the RGB cube.&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
File:ImageCloud-Approach1.png|Original Image&lt;br /&gt;
File:ImageCloud-Approach2.png|Red Color Channel Depth&lt;br /&gt;
File:ImageCloud-Approach3.png|RGB Color Space&lt;br /&gt;
File:ImageCloud-Approach4.png|HSV Color Space&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Mapping===&lt;br /&gt;
Beyond RGB there are further three-dimensional representations of digital color, such as HSV (or HSL) and YCbCr, each with its own geometric interpretation (HSV a cylinder, YCbCr a cube). Since every single channel can also be mapped to the z-axis, the following mappings were implemented:&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
|-&lt;br /&gt;
! Name&lt;br /&gt;
! X-Axis&lt;br /&gt;
! Y-Axis&lt;br /&gt;
! Z-Axis&lt;br /&gt;
|-&lt;br /&gt;
| Red Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Red&lt;br /&gt;
|-&lt;br /&gt;
| Green Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Green&lt;br /&gt;
|-&lt;br /&gt;
| Blue Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| RGB Space&lt;br /&gt;
| Red&lt;br /&gt;
| Green&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| Hue Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Hue&lt;br /&gt;
|-&lt;br /&gt;
| Saturation Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Saturation&lt;br /&gt;
|-&lt;br /&gt;
| Value Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| HSV Space (Cylinder)&lt;br /&gt;
| Hue&lt;br /&gt;
| Saturation&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| Y Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
|-&lt;br /&gt;
| Cb Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cb&lt;br /&gt;
|-&lt;br /&gt;
| Cr Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cr&lt;br /&gt;
|-&lt;br /&gt;
| YCbCr Space&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
| Cb&lt;br /&gt;
| Cr&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
===Implementation===&lt;br /&gt;
Depending on the resolution of the original image, the number of voxels that have to be rendered and updated can easily run into the millions. Since the resolution of VR headsets is still limited to around 2K, it is sufficient to rescale the input image to about 1024*1024 = 1048576 pixels. This is still far too many to render and update as independent objects in real time. The solution is GPU instancing, passing only the color and position of the instances as buffers. When using direct instanced drawing in Unity, correctly calculating the bounds of the object is important to support native features like view-frustum culling.&lt;br /&gt;
To calculate the position and color buffers in real time, a compute shader is used, which can easily perform millions of independent calculations in parallel each frame. Compute-shader inputs can be changed at runtime, enabling transitions between different resolutions, mappings and positions.&lt;br /&gt;
&lt;br /&gt;
===Transitions===&lt;br /&gt;
Because the compute shader can run every frame, transitions between different states can be calculated. Here a simple particle-based system is used in which every particle (voxel) has a position, velocity and acceleration.&lt;br /&gt;
&lt;br /&gt;
[[File:ImageCloud-Transition1.mp4|Mapping Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition2.mp4|Image Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition3.mp4|Resolution Transition|245px]]&lt;br /&gt;
&lt;br /&gt;
Examples of transitions between mappings (left), images (middle) and resolutions (right).&lt;br /&gt;
&lt;br /&gt;
===Manipulation===&lt;br /&gt;
Since VR headsets were rare this semester, I tried to use the [https://www.ultraleap.com/product/leap-motion-controller/ Leap controller] to capture the movement of the hands and manipulate the image with it.&lt;br /&gt;
While I was able to set up and calibrate the controller in Unity, the captured motion was very inconsistent and jittery, and the range of hand movement was limited. Since the camera had to remain static, the added value of the three-dimensionality was limited; using a VR headset in combination with the Leap controller might solve this problem.&lt;br /&gt;
[[File:ImageCloud-Manipulation1.mp4|none|500px]]&lt;br /&gt;
Another problem with manipulating the voxels is that their particle representations are not part of the physics world and cannot collide with other objects, so every collision of every particle with every object has to be calculated in the compute shader. While I was able to implement a simple interaction with the center of one hand, I failed to implement collision with simple geometric forms in the compute shader that was efficient enough to run every frame. It is possible, but GPU collision is a difficult task that takes far more work than expected.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Outlook==&lt;br /&gt;
&amp;lt;youtube&amp;gt;cA7D-l4XI84&amp;lt;/youtube&amp;gt;&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Media==&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_56_28.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_57_26.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_58_03.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_59_21.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_23 (1).png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_44.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_01_11.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 12.05.2021 14_03_36.png|360px]]&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:ImageCloud-Simple2.gif&amp;diff=124150</id>
		<title>File:ImageCloud-Simple2.gif</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:ImageCloud-Simple2.gif&amp;diff=124150"/>
		<updated>2021-05-12T17:54:02Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: Lalalolo uploaded a new version of File:ImageCloud-Simple2.gif&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:ImageCloud-Simple2.gif&amp;diff=124149</id>
		<title>File:ImageCloud-Simple2.gif</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:ImageCloud-Simple2.gif&amp;diff=124149"/>
		<updated>2021-05-12T17:50:10Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: Lalalolo uploaded a new version of File:ImageCloud-Simple2.gif&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124148</id>
		<title>GMU:Critical VR Lab II - Lab Work/L.-E. Kuehr</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Critical_VR_Lab_II_-_Lab_Work/L.-E._Kuehr&amp;diff=124148"/>
		<updated>2021-05-12T17:15:52Z</updated>

		<summary type="html">&lt;p&gt;Lalalolo: /* Approach */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:ImageCompute-header.png|700px]]&lt;br /&gt;
=color%20=&lt;br /&gt;
==Context==&lt;br /&gt;
In this project I wanted to revisit the idea of treating digital images as spatial pseudo-random distributions of color-entities (pixels in 2d, voxels in 3d).&lt;br /&gt;
As digital color is mostly stored as a triplet of color channels (red, green and blue), it can easily be mapped into three-dimensional space. A VR headset lets us go further and perceive this space with true depth. With the image represented as a cloud of voxels in the virtual environment, we can use the VR controllers to manipulate the image in this three-dimensional domain with our hands, pushing and pulling parts of the image cloud and then remapping it to the two-dimensional canvas. This technique might provide a more intuitive approach to manipulating images and their colors.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Concept==&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Approach==&lt;br /&gt;
For every pixel in the image a colored voxel is spawned. The position of each voxel is then determined by one of several mapping functions that map the pixel (x, y, r, g, b) to a position in space (x, y, z). In the default mapping every voxel lies on a plane (z = 0), and its position in the plane is directly determined by its position in the original two-dimensional image.&lt;br /&gt;
[[File:ImageCloud-Default-Mapping.png|Default mapping|360px]]&lt;br /&gt;
[[File:ImageCloud-Simple2.gif|RGB Space|360px]]&lt;br /&gt;
&lt;br /&gt;
Since the virtual space is three-dimensional, the third dimension, or depth (the z-axis in this case), can be used to convey additional information, revealing a new perspective on the image. The red channel, for example, could be mapped to the z-coordinate of every voxel. But we aren&#039;t limited to the z-axis: since every digital image represents a subset of the RGB space, we can also map every pixel to its position in the RGB cube.&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
File:ImageCloud-Approach1.png|Original Image&lt;br /&gt;
File:ImageCloud-Approach2.png|Red Color Channel Depth&lt;br /&gt;
File:ImageCloud-Approach3.png|RGB Color Space&lt;br /&gt;
File:ImageCloud-Approach4.png|HSV Color Space&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Mapping===&lt;br /&gt;
Beyond RGB there are further three-dimensional representations of digital color, such as HSV (or HSL) and YCbCr, each with its own geometric interpretation (HSV a cylinder, YCbCr a cube). Since every single channel can also be mapped to the z-axis, the following mappings were implemented:&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; &lt;br /&gt;
|-&lt;br /&gt;
! Name&lt;br /&gt;
! X-Axis&lt;br /&gt;
! Y-Axis&lt;br /&gt;
! Z-Axis&lt;br /&gt;
|-&lt;br /&gt;
| Red Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Red&lt;br /&gt;
|-&lt;br /&gt;
| Green Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Green&lt;br /&gt;
|-&lt;br /&gt;
| Blue Color Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| RGB Space&lt;br /&gt;
| Red&lt;br /&gt;
| Green&lt;br /&gt;
| Blue&lt;br /&gt;
|-&lt;br /&gt;
| Hue Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Hue&lt;br /&gt;
|-&lt;br /&gt;
| Saturation Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Saturation&lt;br /&gt;
|-&lt;br /&gt;
| Value Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| HSV Space (Cylinder)&lt;br /&gt;
| Hue&lt;br /&gt;
| Saturation&lt;br /&gt;
| Value&lt;br /&gt;
|-&lt;br /&gt;
| Y Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
|-&lt;br /&gt;
| Cb Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cb&lt;br /&gt;
|-&lt;br /&gt;
| Cr Channel Depth&lt;br /&gt;
| Original&lt;br /&gt;
| Original&lt;br /&gt;
| Cr&lt;br /&gt;
|-&lt;br /&gt;
| YCbCr Space&lt;br /&gt;
| Y (luma component)&lt;br /&gt;
| Cb&lt;br /&gt;
| Cr&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
===Implementation===&lt;br /&gt;
Depending on the resolution of the original image, the number of voxels that have to be rendered and updated can easily run into the millions. Since the resolution of VR headsets is still limited to around 2K, it is sufficient to rescale the input image to about 1024*1024 = 1048576 pixels. This is still far too many to render and update as independent objects in real time. The solution is GPU instancing, passing only the color and position of the instances as buffers. When using direct instanced drawing in Unity, correctly calculating the bounds of the object is important to support native features like view-frustum culling.&lt;br /&gt;
To calculate the position and color buffers in real time, a compute shader is used, which can easily perform millions of independent calculations in parallel each frame. Compute-shader inputs can be changed at runtime, enabling transitions between different resolutions, mappings and positions.&lt;br /&gt;
&lt;br /&gt;
===Transitions===&lt;br /&gt;
Because the compute shader can run every frame, transitions between different states can be calculated. Here a simple particle-based system is used in which every particle (voxel) has a position, velocity and acceleration.&lt;br /&gt;
&lt;br /&gt;
[[File:ImageCloud-Transition1.mp4|Mapping Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition2.mp4|Image Transition|245px]]&lt;br /&gt;
[[File:ImageCloud-Transition3.mp4|Resolution Transition|245px]]&lt;br /&gt;
&lt;br /&gt;
Examples of transitions between mappings (left), images (middle) and resolutions (right).&lt;br /&gt;
&lt;br /&gt;
===Manipulation===&lt;br /&gt;
Since VR headsets were rare this semester, I tried to use the [https://www.ultraleap.com/product/leap-motion-controller/ Leap controller] to capture the movement of the hands and manipulate the image with it.&lt;br /&gt;
While I was able to set up and calibrate the controller in Unity, the captured motion was very inconsistent and jittery, and the range of hand movement was limited. Since the camera had to remain static, the added value of the three-dimensionality was limited; using a VR headset in combination with the Leap controller might solve this problem.&lt;br /&gt;
[[File:ImageCloud-Manipulation1.mp4|none|500px]]&lt;br /&gt;
Another problem with manipulating the voxels is that their particle representations are not part of the physics world and cannot collide with other objects, so every collision of every particle with every object has to be calculated in the compute shader. While I was able to implement a simple interaction with the center of one hand, I failed to implement collision with simple geometric forms in the compute shader that was efficient enough to run every frame. It is possible, but GPU collision is a difficult task that takes far more work than expected.&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Outlook==&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
==Media==&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_56_28.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_57_26.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_58_03.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 08_59_21.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_23 (1).png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_00_44.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 11.05.2021 09_01_11.png|360px]]&lt;br /&gt;
[[File:Creating a Graph 12.05.2021 14_03_36.png|360px]]&lt;/div&gt;</summary>
		<author><name>Lalalolo</name></author>
	</entry>
</feed>