<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Js</id>
	<title>Medien Wiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Js"/>
	<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/Special:Contributions/Js"/>
	<updated>2026-04-28T10:05:27Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.39.6</generator>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:BioArt_Forum/Lichens_workshop&amp;diff=132186</id>
		<title>GMU:BioArt Forum/Lichens workshop</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:BioArt_Forum/Lichens_workshop&amp;diff=132186"/>
		<updated>2022-11-08T08:48:45Z</updated>

		<summary type="html">&lt;p&gt;Js: added links&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;BioArt Forum: Lichens workshop with Dr. Jan Vondrak &lt;br /&gt;
&lt;br /&gt;
Meeting point: Marienstrasse 7b, room 204&lt;br /&gt;
Max participants: 20&lt;br /&gt;
&lt;br /&gt;
* Saturday, 12.11.2022, 10.00 - 18.00&lt;br /&gt;
* Sunday, 13.11.2022, 10.00 - 15.00&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Dr. Jan Vondrak is an academic assistant at the [https://botanika.prf.jcu.cz/english/ University of South Bohemia, Faculty of Biological Sciences, Department of Botany in České Budějovice (Budweis), CZ]. He is one of the few lichen specialists in the world and an experienced mycologist.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
==EN==&lt;br /&gt;
(scroll down for the version in German)&lt;br /&gt;
&lt;br /&gt;
For our first workshop of the Environmental Nature Labs we have invited the very experienced mushroom and lichen researcher Dr. Jan Vondrak from the [https://botanika.prf.jcu.cz/english/ botanical institute of the University of South Bohemia in Budweis] to Weimar. There are few such proven connoisseurs of lichens; most of them come from the Czech Republic, for reasons that will be explained later. Jan will go on two excursions with the participants through the city and outskirts of Weimar and present the lichens living on natural and built surfaces and their ecology. Bringing a strong magnifying glass and an umbrella is recommended; however, we also provide a set of magnifying glasses on loan. There is also the opportunity to learn about the study, identification and preservation of these organisms in the Environmental Biolaboratory (microscopy, color tests, etc.).&lt;br /&gt;
&lt;br /&gt;
Lichens are inconspicuous, often regarded as a nuisance, but so widespread that they make an enormous contribution to the regulation of CO2 and nitrogen oxides in the atmosphere. The workshop is also about gaining a different perspective: looking at the colonization of lichens in gardens, on house walls and paths in a positive light, and forgoing their chemical or physical removal.&lt;br /&gt;
&lt;br /&gt;
In addition, lichens are partnerships of two organisms that have come together to mutual benefit: a fungus and a green alga or a cyanobacterium (blue-green alga). The fungus and its partner live together on a substrate (rock, humus, bark). The fungus receives energy-rich carbohydrates from the photosynthesizing partner; in return, the fungus supplies water and mineral salts from the substrate and forms pigments that protect the alga or bacterium from excessive UV radiation. Lichens grow on bark, leaves, rocks, dry stone walls or on the ground. Compared to mosses and flowering plants they grow very slowly, some only a few millimeters to a centimeter per year, and they can live for hundreds to thousands of years, which makes them among the longest-lived creatures on earth. Like mosses, they can survive longer periods of drought &amp;quot;in their sleep&amp;quot; and resume photosynthesis and growth when humidity increases. Like mosses, they are regarded as pioneer organisms: they occur in extreme locations where few or no other plants survive, for example on barren, nutrient-poor rocks or on walls and masonry. Lichens are not parasites and do not harm the trees they grow on; they can even protect a tree from bacteria and fungi. They do not penetrate the living part of the tree (the cambium), but sit on the outermost layer, the bark.&lt;br /&gt;
&lt;br /&gt;
==DE==&lt;br /&gt;
(English above)&lt;br /&gt;
&lt;br /&gt;
Wir haben für unseren ersten Workshop der Environmental Nature Labs den sehr erfahrenen Pilz- und Flechtenforscher Dr. Jan Vondrak vom [https://botanika.prf.jcu.cz/english/ botanischen Institut der Universität Budweis] nach Weimar eingeladen. Es gibt nur wenige so ausgewiesene Kenner von Flechten wie ihn; die meisten stammen aus Tschechien, die Gründe dafür werden noch erläutert. Jan wird mit den Teilnehmern zwei Exkursionen im Stadt- und Randgebiet von Weimar unternehmen und dort auf Natur- und Bauflächen lebende Flechten und ihre Ökologie vorstellen. Die Mitnahme einer starken Lupe und von Regenschirmen wird empfohlen. Wir stellen jedoch auch ein Set von Lupen leihweise zur Verfügung. Es besteht außerdem die Möglichkeit, etwas über die Untersuchung, Identifizierung und Bewahrung dieser Lebewesen im Environmental-Biolabor zu erfahren (Mikroskopie, Farbtests etc.).&lt;br /&gt;
Flechten sind unauffällig, oft lästig, aber derart verbreitet, dass sie einen enorm hohen Beitrag zur Regulation von CO2 und Stickoxiden in der Atmosphäre leisten. Es geht auch darum, einen anderen Blick zu bekommen, die Ansiedlung von Flechten in Gärten, an Hausmauern und auf Wegen positiv zu betrachten und auf ihre chemische oder physikalische Beseitigung zu verzichten.&lt;br /&gt;
&lt;br /&gt;
Flechten sind Partnerwesen aus zwei Organismen, die sich segensreich zusammengeschlossen haben: einem Pilz und einer Grün- oder Blaualge (Cyanobakterium). Der Pilz und sein Partner leben zusammen auf einem Substrat (Gestein, Humus, Rinde). Der Pilz erhält von dem Partner, der Fotosynthese betreibt, energiereiche Kohlenhydrate; im Gegenzug liefert der Pilz Wasser und Mineralsalze aus dem Untergrund und bildet Pigmente, die die Algen bzw. Bakterien vor zu starker UV-Strahlung schützen. Flechten wachsen an Rinde, Blättern, Felsen, Trockenmauern oder auf dem Boden. Im Vergleich zu Moosen und Blütenpflanzen wachsen sie sehr langsam, manche nur wenige Millimeter bis einen Zentimeter pro Jahr, und sie können mehrere hundert bis tausend Jahre alt werden. Damit gehören sie zu den langlebigsten Lebewesen der Erde. Ebenso wie Moose können sie längere Trockenperioden „im Schlaf“ überleben und bei steigender Feuchtigkeit wieder mit der Fotosynthese beginnen und weiterwachsen. Sie gelten wie Moose als Pionierorganismen, das heißt, sie kommen an extremen Standorten vor, wo wenige bis keine anderen Pflanzen überleben, zum Beispiel auf kargen, nährstoffarmen Felsen oder an Mauern und Wänden. Flechten sind keine Parasiten und schaden dem Baum nicht; sie können ihn sogar vor Bakterien und Pilzen schützen. Sie dringen nicht in den lebendigen Teil des Baumes (das Kambium) vor, sondern sitzen auf der äußersten Schicht, der Rinde.&lt;br /&gt;
&lt;br /&gt;
==Photos==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
File:k_DSC02317.jpeg&lt;br /&gt;
File:k_DSC02322.jpeg&lt;br /&gt;
File:k_DSC02327.jpeg&lt;br /&gt;
File:k_DSC02350.jpeg&lt;br /&gt;
File:k_DSC02390.jpeg&lt;br /&gt;
File:k_DSC02398.jpeg&lt;br /&gt;
File:k_DSC02400.jpeg&lt;br /&gt;
File:k_DSC02481.jpeg&lt;br /&gt;
File:k_DSC02631.jpeg&lt;br /&gt;
File:k_IMG_7190.jpeg&lt;br /&gt;
File:k_IMG_7376.jpeg&lt;br /&gt;
File:k_IMG_7385.jpeg&lt;br /&gt;
File:k_IMG_7401.jpeg&lt;br /&gt;
File:k_IMG_7460.jpeg&lt;br /&gt;
File:k_IMG_7656.jpeg&lt;br /&gt;
File:k_IMG_8308.jpeg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:BioArt_Forum/Lichens_workshop&amp;diff=132185</id>
		<title>GMU:BioArt Forum/Lichens workshop</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:BioArt_Forum/Lichens_workshop&amp;diff=132185"/>
		<updated>2022-11-08T08:41:48Z</updated>

		<summary type="html">&lt;p&gt;Js: polished DE version&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;BioArt Forum: Lichens workshop with Dr. Jan Vondrak &lt;br /&gt;
&lt;br /&gt;
Meeting point: Marienstrasse 7b, room 204&lt;br /&gt;
Max participants: 20&lt;br /&gt;
&lt;br /&gt;
* Saturday, 12.11.2022, 10.00 - 18.00&lt;br /&gt;
* Sunday, 13.11.2022, 10.00 - 15.00&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Dr. Jan Vondrak is an academic assistant at the University of South Bohemia, Faculty of Biological Sciences, Department of Botany in České Budějovice (Budweis), CZ. He is one of the few lichen specialists in the world and an experienced mycologist.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
==EN==&lt;br /&gt;
(scroll down for the version in German)&lt;br /&gt;
&lt;br /&gt;
For our first workshop of the Environmental Nature Labs we have invited the very experienced mushroom and lichen researcher Dr. Jan Vondrak from the botanical institute of the University of South Bohemia in Budweis to Weimar. There are few such proven connoisseurs of lichens; most of them come from the Czech Republic, for reasons that will be explained later. Jan will go on two excursions with the participants through the city and outskirts of Weimar and present the lichens living on natural and built surfaces and their ecology. Bringing a strong magnifying glass and an umbrella is recommended; however, we also provide a set of magnifying glasses on loan. There is also the opportunity to learn about the study, identification and preservation of these organisms in the Environmental Biolaboratory (microscopy, color tests, etc.).&lt;br /&gt;
&lt;br /&gt;
Lichens are inconspicuous, often regarded as a nuisance, but so widespread that they make an enormous contribution to the regulation of CO2 and nitrogen oxides in the atmosphere. The workshop is also about gaining a different perspective: looking at the colonization of lichens in gardens, on house walls and paths in a positive light, and forgoing their chemical or physical removal.&lt;br /&gt;
&lt;br /&gt;
In addition, lichens are partnerships of two organisms that have come together to mutual benefit: a fungus and a green alga or a cyanobacterium (blue-green alga). The fungus and its partner live together on a substrate (rock, humus, bark). The fungus receives energy-rich carbohydrates from the photosynthesizing partner; in return, the fungus supplies water and mineral salts from the substrate and forms pigments that protect the alga or bacterium from excessive UV radiation. Lichens grow on bark, leaves, rocks, dry stone walls or on the ground. Compared to mosses and flowering plants they grow very slowly, some only a few millimeters to a centimeter per year, and they can live for hundreds to thousands of years, which makes them among the longest-lived creatures on earth. Like mosses, they can survive longer periods of drought &amp;quot;in their sleep&amp;quot; and resume photosynthesis and growth when humidity increases. Like mosses, they are regarded as pioneer organisms: they occur in extreme locations where few or no other plants survive, for example on barren, nutrient-poor rocks or on walls and masonry. Lichens are not parasites and do not harm the trees they grow on; they can even protect a tree from bacteria and fungi. They do not penetrate the living part of the tree (the cambium), but sit on the outermost layer, the bark.&lt;br /&gt;
&lt;br /&gt;
==DE==&lt;br /&gt;
(English above)&lt;br /&gt;
&lt;br /&gt;
Wir haben für unseren ersten Workshop der Environmental Nature Labs den sehr erfahrenen Pilz- und Flechtenforscher Dr. Jan Vondrak vom botanischen Institut der Universität Budweis nach Weimar eingeladen. Es gibt nur wenige so ausgewiesene Kenner von Flechten wie ihn; die meisten stammen aus Tschechien, die Gründe dafür werden noch erläutert. Jan wird mit den Teilnehmern zwei Exkursionen im Stadt- und Randgebiet von Weimar unternehmen und dort auf Natur- und Bauflächen lebende Flechten und ihre Ökologie vorstellen. Die Mitnahme einer starken Lupe und von Regenschirmen wird empfohlen. Wir stellen jedoch auch ein Set von Lupen leihweise zur Verfügung. Es besteht außerdem die Möglichkeit, etwas über die Untersuchung, Identifizierung und Bewahrung dieser Lebewesen im Environmental-Biolabor zu erfahren (Mikroskopie, Farbtests etc.).&lt;br /&gt;
Flechten sind unauffällig, oft lästig, aber derart verbreitet, dass sie einen enorm hohen Beitrag zur Regulation von CO2 und Stickoxiden in der Atmosphäre leisten. Es geht auch darum, einen anderen Blick zu bekommen, die Ansiedlung von Flechten in Gärten, an Hausmauern und auf Wegen positiv zu betrachten und auf ihre chemische oder physikalische Beseitigung zu verzichten.&lt;br /&gt;
&lt;br /&gt;
Flechten sind Partnerwesen aus zwei Organismen, die sich segensreich zusammengeschlossen haben: einem Pilz und einer Grün- oder Blaualge (Cyanobakterium). Der Pilz und sein Partner leben zusammen auf einem Substrat (Gestein, Humus, Rinde). Der Pilz erhält von dem Partner, der Fotosynthese betreibt, energiereiche Kohlenhydrate; im Gegenzug liefert der Pilz Wasser und Mineralsalze aus dem Untergrund und bildet Pigmente, die die Algen bzw. Bakterien vor zu starker UV-Strahlung schützen. Flechten wachsen an Rinde, Blättern, Felsen, Trockenmauern oder auf dem Boden. Im Vergleich zu Moosen und Blütenpflanzen wachsen sie sehr langsam, manche nur wenige Millimeter bis einen Zentimeter pro Jahr, und sie können mehrere hundert bis tausend Jahre alt werden. Damit gehören sie zu den langlebigsten Lebewesen der Erde. Ebenso wie Moose können sie längere Trockenperioden „im Schlaf“ überleben und bei steigender Feuchtigkeit wieder mit der Fotosynthese beginnen und weiterwachsen. Sie gelten wie Moose als Pionierorganismen, das heißt, sie kommen an extremen Standorten vor, wo wenige bis keine anderen Pflanzen überleben, zum Beispiel auf kargen, nährstoffarmen Felsen oder an Mauern und Wänden. Flechten sind keine Parasiten und schaden dem Baum nicht; sie können ihn sogar vor Bakterien und Pilzen schützen. Sie dringen nicht in den lebendigen Teil des Baumes (das Kambium) vor, sondern sitzen auf der äußersten Schicht, der Rinde.&lt;br /&gt;
&lt;br /&gt;
==Photos==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
File:k_DSC02317.jpeg&lt;br /&gt;
File:k_DSC02322.jpeg&lt;br /&gt;
File:k_DSC02327.jpeg&lt;br /&gt;
File:k_DSC02350.jpeg&lt;br /&gt;
File:k_DSC02390.jpeg&lt;br /&gt;
File:k_DSC02398.jpeg&lt;br /&gt;
File:k_DSC02400.jpeg&lt;br /&gt;
File:k_DSC02481.jpeg&lt;br /&gt;
File:k_DSC02631.jpeg&lt;br /&gt;
File:k_IMG_7190.jpeg&lt;br /&gt;
File:k_IMG_7376.jpeg&lt;br /&gt;
File:k_IMG_7385.jpeg&lt;br /&gt;
File:k_IMG_7401.jpeg&lt;br /&gt;
File:k_IMG_7460.jpeg&lt;br /&gt;
File:k_IMG_7656.jpeg&lt;br /&gt;
File:k_IMG_8308.jpeg&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125749</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125749"/>
		<updated>2021-06-17T13:11:00Z</updated>

		<summary type="html">&lt;p&gt;Js: /* implementation notes */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It orbits the earth approximately every 90 minutes, and its crew can thus look down on a vast planet. Humans, on the other hand, rarely look up and grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification whenever it is above the place of exhibition: a handbell is rung and the names of the people currently in outer space are read out loud.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set in relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered.&lt;br /&gt;
&lt;br /&gt;
= implementation notes =&lt;br /&gt;
The underlying Max patch for tracking the ISS and playing audio looks like this:&lt;br /&gt;
[[File:ISS_overhead_patch.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead_2021-06-14.maxpat|Max patch]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing its [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too large an error.&lt;br /&gt;
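Outside of Max, the flat-earth distance check described above can be sketched in a few lines of Python. The 100 km example value for "considered overhead" and the cosine scaling of longitude degrees are illustrative assumptions, not values taken from the patch:

```python
import math

KM_PER_DEG = 111.32  # approximate km per degree of latitude

WEIMAR = (50.979444, 11.329722)  # city center coordinates used above

def flat_earth_distance_km(lat1, lon1, lat2, lon2):
    """Euclidean ground distance, pretending the earth is flat.

    Longitude degrees shrink towards the poles, so they are scaled by
    the cosine of the mean latitude. Only sensible for small distances.
    """
    dy = (lat2 - lat1) * KM_PER_DEG
    dx = (lon2 - lon1) * KM_PER_DEG * math.cos(math.radians((lat1 + lat2) / 2))
    return math.hypot(dx, dy)

def iss_overhead(iss_lat, iss_lon, radius_km=100.0):
    """True if the ISS ground track is within radius_km of Weimar.

    radius_km=100.0 is a hypothetical threshold, not the patch's value.
    """
    return radius_km >= flat_earth_distance_km(iss_lat, iss_lon, *WEIMAR)
```

In the installation, the two coordinates come from open-notify's JSON response; its iss_position field reports latitude and longitude as strings, so they need a float() conversion first.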
&lt;br /&gt;
The Max patch was [https://docs.cycling74.com/max8/vignettes/standalone_building exported to a standalone application]. It now runs on a small single-board computer to reduce the complexity of the setup. The data about the ISS crew is scraped from the internet using custom-written software.&lt;br /&gt;
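The custom crew-scraping software is not published here. One plausible shape for that step, using open-notify's companion astros.json endpoint rather than scraping an HTML page (an assumption for illustration, not the original implementation), could look like this:

```python
import json
import urllib.request

ASTROS_URL = "http://api.open-notify.org/astros.json"

def crew_names(payload, craft="ISS"):
    """Pick the names of people aboard the given craft out of an
    astros.json-style payload: {"people": [{"name": ..., "craft": ...}]}.
    """
    return [person["name"] for person in payload.get("people", [])
            if person["craft"] == craft]

def fetch_crew_names(url=ASTROS_URL):
    """Fetch the current crew list over the network."""
    with urllib.request.urlopen(url, timeout=10) as response:
        return crew_names(json.load(response))
```

The returned names are what the installation reads out loud when the ISS passes overhead.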
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally, ignoring material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
= references =&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
* [https://www.howmanypeopleareinspacerightnow.com/ how many people are in space right now?]&lt;br /&gt;
* [[Max and I, Max and Me class notebook]]&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125748</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125748"/>
		<updated>2021-06-17T13:10:24Z</updated>

		<summary type="html">&lt;p&gt;Js: /* implementation notes */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It orbits the earth approximately every 90 minutes, and its crew can thus look down on a vast planet. Humans, on the other hand, rarely look up and grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification whenever it is above the place of exhibition: a handbell is rung and the names of the people currently in outer space are read out loud.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set in relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered.&lt;br /&gt;
&lt;br /&gt;
= implementation notes =&lt;br /&gt;
The underlying Max patch for tracking the ISS and playing audio looks like this:&lt;br /&gt;
[[File:ISS_overhead_patch.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead_2021-06-14.maxpat|Max patch]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing its [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too large an error.&lt;br /&gt;
&lt;br /&gt;
The Max patch was [https://docs.cycling74.com/max8/vignettes/standalone_building exported to a standalone application]. It now runs on a small single-board computer to reduce the complexity of the setup. The data about the ISS crew is scraped from the internet using custom-written software.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally, ignoring material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
= references =&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
* [https://www.howmanypeopleareinspacerightnow.com/ how many people are in space right now?]&lt;br /&gt;
* [[Max and I, Max and Me class notebook]]&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125747</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125747"/>
		<updated>2021-06-17T13:10:07Z</updated>

		<summary type="html">&lt;p&gt;Js: /* implementation notes */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It orbits the earth approximately every 90 minutes, and its crew can thus look down on a vast planet. Humans, on the other hand, rarely look up and grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification whenever it is above the place of exhibition: a handbell is rung and the names of the people currently in outer space are read out loud.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set in relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered.&lt;br /&gt;
&lt;br /&gt;
= implementation notes =&lt;br /&gt;
The underlying Max patch for tracking the ISS and playing audio looks like this:&lt;br /&gt;
[[File:ISS_overhead_patch.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead_2021-06-14.maxpat]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing its [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too large an error.&lt;br /&gt;
&lt;br /&gt;
The Max patch was [https://docs.cycling74.com/max8/vignettes/standalone_building exported to a standalone application]. It now runs on a small single-board computer to reduce the complexity of the setup. The data about the ISS crew is scraped from the internet using custom-written software.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally, ignoring material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
= references =&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
* [https://www.howmanypeopleareinspacerightnow.com/ how many people are in space right now?]&lt;br /&gt;
* [[Max and I, Max and Me class notebook]]&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:ISS_overhead_2021-06-14.maxpat&amp;diff=125746</id>
		<title>File:ISS overhead 2021-06-14.maxpat</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:ISS_overhead_2021-06-14.maxpat&amp;diff=125746"/>
		<updated>2021-06-17T13:09:46Z</updated>

		<summary type="html">&lt;p&gt;Js: File uploaded with MsUpload&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125745</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125745"/>
		<updated>2021-06-17T13:07:04Z</updated>

		<summary type="html">&lt;p&gt;Js: /* implementation notes */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It orbits the earth approximately every 90 minutes, and its crew can thus look down on a vast planet. Humans, on the other hand, rarely look up and grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification whenever it is above the place of exhibition: a handbell is rung and the names of the people currently in outer space are read out loud.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set in relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered.&lt;br /&gt;
&lt;br /&gt;
= implementation notes =&lt;br /&gt;
The underlying Max patch for tracking the ISS and playing audio looks like this:&lt;br /&gt;
[[File:ISS_overhead_patch.png|400px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing its [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too large an error.&lt;br /&gt;
&lt;br /&gt;
The Max patch was [https://docs.cycling74.com/max8/vignettes/standalone_building exported to a standalone application]. It now runs on a small single-board computer to reduce the complexity of the setup. The data about the ISS crew is scraped from the internet using custom-written software.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally, ignoring material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
= references =&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
* [https://www.howmanypeopleareinspacerightnow.com/ how many people are in space right now?]&lt;br /&gt;
* [[Max and I, Max and Me class notebook]]&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:ISS_overhead_patch.png&amp;diff=125744</id>
		<title>File:ISS overhead patch.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:ISS_overhead_patch.png&amp;diff=125744"/>
		<updated>2021-06-17T13:06:06Z</updated>

		<summary type="html">&lt;p&gt;Js: File uploaded with MsUpload&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125743</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125743"/>
		<updated>2021-06-17T12:22:24Z</updated>

		<summary type="html">&lt;p&gt;Js: /* final project: ISS overhead */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and can thus look down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition: a handbell is rung and the names of the people currently in outer space are read out loud.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, like a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set into relation to the place of exhibition (e.g. Weimar). If the ISS is at a place which can be considered overhead, the notification is triggered.&lt;br /&gt;
&lt;br /&gt;
= implementation notes =&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
&lt;br /&gt;
The Max patch was [https://docs.cycling74.com/max8/vignettes/standalone_building exported to a standalone application]. It now runs on a small single-board computer to reduce the complexity of the setup. The data about the ISS crew is scraped from the internet using custom-written software.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally ignoring the material reality is feasible, within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
= references =&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
* [https://www.howmanypeopleareinspacerightnow.com/ how many people are in space right now?]&lt;br /&gt;
* [[Max and I, Max and Me class notebook]]&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125742</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125742"/>
		<updated>2021-06-17T12:21:56Z</updated>

		<summary type="html">&lt;p&gt;Js: /* project: distance communication */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and can thus look down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition: a handbell is rung and the names of the people currently in outer space are read out loud.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, like a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set into relation to the place of exhibition (e.g. Weimar). If the ISS is at a place which can be considered overhead, the notification is triggered.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
&lt;br /&gt;
The Max patch was [https://docs.cycling74.com/max8/vignettes/standalone_building exported to a standalone application]. It now runs on a small single-board computer to reduce the complexity of the setup. The data about the ISS crew is scraped from the internet using custom-written software.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally ignoring the material reality is feasible, within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
* [https://www.howmanypeopleareinspacerightnow.com/ how many people are in space right now?]&lt;br /&gt;
* [[Max and I, Max and Me class notebook]]&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=Max_and_I,_Max_and_Me_class_notebook&amp;diff=125741</id>
		<title>Max and I, Max and Me class notebook</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=Max_and_I,_Max_and_Me_class_notebook&amp;diff=125741"/>
		<updated>2021-06-17T12:21:53Z</updated>

		<summary type="html">&lt;p&gt;Js: /* project: i see you and you see me */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please make your own first patch and upload it onto the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin Theremin] to get to know the workflow with Max. Move your mouse cursor around to control the pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
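The control mapping of the patch can be sketched in plain code. Screen size and frequency range below are illustrative assumptions, not values taken from the patch.&lt;br /&gt;

```python
# Sketch of the theremin's control mapping: mouse x picks the pitch,
# mouse y the volume. Screen size and frequency range are assumptions.
def mouse_to_theremin(x, y, width=1920, height=1080):
    freq = 110.0 + (x / width) * (880.0 - 110.0)  # A2 up to A5
    vol = 1.0 - (y / height)                      # louder towards the top
    return freq, vol
```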
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little artwork is a first attempt, and also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
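The trigger chain above can be sketched as a rough Python model of the [past] behaviour: bang once when the peak amplitude rises above a threshold, then re-arm only after the level has dropped back under it. The threshold value is an illustrative assumption.&lt;br /&gt;

```python
# Rough model of the screamie trigger: like Max's [past], bang once on a
# rising edge past the threshold, re-arm after the level drops again.
class ScreamTrigger:
    def __init__(self, threshold=0.8):  # illustrative threshold
        self.threshold = threshold
        self.armed = True

    def feed(self, peak):
        """Return True exactly once per scream (rising edge)."""
        if self.armed and peak > self.threshold:
            self.armed = False
            return True  # bang: take the selfie
        if not peak > self.threshold:
            self.armed = True  # level dropped, re-arm for the next scream
        return False
```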
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly different from the background for this simple approach to work. The patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
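The "rest is math" step can be sketched as follows: jit.findbounds reports a bounding box for pixels near the picked colour, which the patch reduces to a centre point. The colour tolerance below is an illustrative assumption.&lt;br /&gt;

```python
# Sketch of the tracker math. The tolerance is an assumed value; it
# mirrors the value range that jit.findbounds searches.
def colour_matches(pixel, target, tol=0.1):
    """True when every RGB channel is within tol of the target (0.0..1.0)."""
    return all(tol >= abs(p - t) for p, t in zip(pixel, target))

def bbox_center(min_x, min_y, max_x, max_y):
    """Centre point of the bounding box, for further processing."""
    return ((min_x + max_x) / 2.0, (min_y + max_y) / 2.0)
```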
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon more detailed examination. It was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues, or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) to sense the individual. These data streams will be combined to produce a visual output using Jitter. The result aims to represent the other humans in a manner that emphasizes personality rather than emulating the technicality of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max, but none yielded convergence of the phases of the oscillators. The project was therefore cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync, even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners; the blinking rhythm, their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small form factor PC) runs both Max and video conference software to display &amp;quot;the other&amp;quot;. The Max patch and &amp;quot;the other&amp;quot; are displayed on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If the audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site somewhere else.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Simulating the equation system over time (implemented in Python) shows the expected behavior.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
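A minimal Python sketch of the two-oscillator, Kuramoto-style model referenced above: each phase advances with its natural frequency plus a sinusoidal coupling term. The frequencies and coupling strength below are illustrative assumptions, not the values from the installation.&lt;br /&gt;

```python
import math

# Minimal sketch: two phase oscillators with Kuramoto-style coupling.
# Natural frequencies w1, w2 and coupling k are illustrative assumptions.
def simulate(w1=1.0, w2=1.2, k=0.5, dt=0.01, steps=20000):
    p1, p2 = 0.0, math.pi  # start in anti-phase
    for _ in range(steps):
        d1 = w1 + k * math.sin(p2 - p1)
        d2 = w2 + k * math.sin(p1 - p2)
        p1 += dt * d1
        p2 += dt * d2
    return (p2 - p1) % (2.0 * math.pi)  # phase difference after locking
```

With these parameters the frequency detuning (0.2) is well inside the locking range (2k = 1.0), so the phase difference settles at arcsin(0.2) instead of drifting.&lt;br /&gt;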
&lt;br /&gt;
Max implementation of the Kuramoto–Daido model:&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125740</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125740"/>
		<updated>2021-06-17T12:21:22Z</updated>

		<summary type="html">&lt;p&gt;Js: /* project: i see you and you see me */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and can thus look down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition: a handbell is rung and the names of the people currently in outer space are read out loud.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, like a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set into relation to the place of exhibition (e.g. Weimar). If the ISS is at a place which can be considered overhead, the notification is triggered.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
&lt;br /&gt;
The Max patch was [https://docs.cycling74.com/max8/vignettes/standalone_building exported to a standalone application]. It now runs on a small single-board computer to reduce the complexity of the setup. The data about the ISS crew is scraped from the internet using custom-written software.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally ignoring the material reality is feasible, within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
* [https://www.howmanypeopleareinspacerightnow.com/ how many people are in space right now?]&lt;br /&gt;
* [[Max and I, Max and Me class notebook]]&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max, but none yielded convergence of the phases of the oscillators. The project was therefore cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync, even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners; the blinking rhythm, their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small form factor PC) runs both Max and video conference software to display &amp;quot;the other&amp;quot;. The Max patch and &amp;quot;the other&amp;quot; are displayed on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If the audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site somewhere else.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Simulating the equation system over time (implemented in Python) shows the expected behavior.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Max implementation of the Kuramoto–Daido model:&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=Max_and_I,_Max_and_Me_class_notebook&amp;diff=125739</id>
		<title>Max and I, Max and Me class notebook</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=Max_and_I,_Max_and_Me_class_notebook&amp;diff=125739"/>
		<updated>2021-06-17T12:21:18Z</updated>

		<summary type="html">&lt;p&gt;Js: /* project: video theremin */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please make your own first patch and upload it onto the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin Theremin] to get to know the workflow with Max. Move your mouse cursor around to control the pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little artwork is a first attempt, and also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly different from the background for this simple approach to work. The patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
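The bounding-box math that jit.findbounds performs can be sketched outside of Max. The following Python sketch is illustrative only (the function name, tolerance value, and array layout are assumptions, not taken from the patch): it finds all pixels within a tolerance of the picked colour and derives the bounding box and its centre.&lt;br /&gt;

```python
import numpy as np

def track_color(frame, target_rgb, tol=0.1):
    """Find the bounding box and centre of pixels close to target_rgb.

    frame: H x W x 3 float array with values in 0.0 .. 1.0
    target_rgb: 3-tuple of floats, e.g. the colour picked with suckah
    tol: per-channel tolerance, akin to the min/max range fed to jit.findbounds
    """
    target = np.asarray(target_rgb, dtype=float)
    # mask of pixels whose every channel lies within [target - tol, target + tol]
    mask = np.all(np.abs(frame - target) <= tol, axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # colour not found in this frame
    bbox = (int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max()))
    center = ((bbox[0] + bbox[2]) / 2.0, (bbox[1] + bbox[3]) / 2.0)
    return bbox, center
```

The centre point is what drives the sound parameters later in the video theremin.&lt;br /&gt;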
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon more detailed examination and was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context and gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) for sensing the individual. These data streams will be combined into a visual output using Jitter. The result aims to represent the other humans in a way that emphasizes personality rather than emulating the technicality of face-to-face exchanges.&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=Max_and_I,_Max_and_Me_class_notebook&amp;diff=125738</id>
		<title>Max and I, Max and Me class notebook</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=Max_and_I,_Max_and_Me_class_notebook&amp;diff=125738"/>
		<updated>2021-06-17T12:17:52Z</updated>

		<summary type="html">&lt;p&gt;Js: /* project: color based movement tracker */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= process: other peoples work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please build your own first patch and upload it onto the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin theremin] to get to know the Max workflow. Move the mouse cursor around to control pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
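At its core the patch is just two mappings from mouse position to synthesis parameters. A minimal Python sketch of one plausible mapping (the screen size and frequency range are illustrative assumptions, not values from the actual patch):&lt;br /&gt;

```python
def mouse_to_theremin(x, y, width=1440, height=900, f_lo=110.0, f_hi=880.0):
    """Map a mouse position to (frequency in Hz, amplitude 0..1).

    width/height and the two-octave range f_lo..f_hi are hypothetical.
    """
    fx = min(max(x / width, 0.0), 1.0)   # clamp to the screen
    fy = min(max(y / height, 0.0), 1.0)
    freq = f_lo * (f_hi / f_lo) ** fx    # exponential mapping reads as linear pitch
    amp = 1.0 - fy                       # top of the screen = loudest
    return freq, amp
```

In Max, mousestate supplies x/y, the frequency drives cycle~, and the amplitude scales the signal before ezdac~.&lt;br /&gt;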
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little piece is a first attempt, and also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly different from the background for this simple approach to work. The patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125737</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125737"/>
		<updated>2021-06-17T12:16:21Z</updated>

		<summary type="html">&lt;p&gt;Js: /* project: video theremin */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and thus looks down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can still reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition: a handbell is rung and the names of the people currently in outer space are read out loud.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, like a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set into relation to the place of exhibition (e.g. Weimar). If the ISS is at a place which can be considered overhead, the notification is triggered.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
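A minimal Python sketch of that flat-earth distance check (the &amp;quot;overhead&amp;quot; radius is a hypothetical choice for illustration, and the open-notify JSON is reduced to a plain dict rather than a live request):&lt;br /&gt;

```python
import math

WEIMAR = (50.979444, 11.329722)  # city-centre reference point
KM_PER_DEG_LAT = 111.32          # approx. length of one degree of latitude

def flat_distance_km(lat, lon, ref=WEIMAR):
    """Euclidean distance on a locally flat earth, ignoring WGS 84.

    Acceptable near the reference point, where the curvature error is small.
    """
    dlat = (lat - ref[0]) * KM_PER_DEG_LAT
    # a degree of longitude shrinks with the cosine of the latitude
    dlon = (lon - ref[1]) * KM_PER_DEG_LAT * math.cos(math.radians(ref[0]))
    return math.hypot(dlat, dlon)

def iss_overhead(iss_json, radius_km=500.0):
    """radius_km is a hypothetical threshold for 'overhead'."""
    pos = iss_json["iss_position"]  # open-notify delivers coordinates as strings
    return flat_distance_km(float(pos["latitude"]), float(pos["longitude"])) <= radius_km
```

In the installation, the equivalent of iss_overhead returning true is what triggers the bang for the notification.&lt;br /&gt;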
&lt;br /&gt;
The Max patch was [https://docs.cycling74.com/max8/vignettes/standalone_building exported to a standalone application]. It is now run on a small computer board to reduce the complexity of the setup. The data about the ISS crew is scraped from the internet using custom-written software.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally ignoring the material reality is feasible on very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
* [https://www.howmanypeopleareinspacerightnow.com/ how many people are in space right now?]&lt;br /&gt;
* [[Max and I, Max and Me class notebook]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon more detailed examination and was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context and gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) for sensing the individual. These data streams will be combined into a visual output using Jitter. The result aims to represent the other humans in a way that emphasizes personality rather than emulating the technicality of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max, but none yielded a convergence of the oscillator phases. The project was therefore cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync, even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners, the blinking rhythms their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via light-dependent resistors. A small computer (laptop or small form factor PC) runs both Max and video conference software to display &amp;quot;the other&amp;quot;. The Max patch and &amp;quot;the other&amp;quot; are displayed on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If the audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site somewhere else.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Observing the equation system over time shows the expected behavior (when implemented in Python).&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
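For reference, a minimal Python sketch of the same two-oscillator Kuramoto&#8211;Daido system (parameter values and the function name are illustrative; phase locking is expected when the frequency mismatch is smaller than twice the coupling strength K):&lt;br /&gt;

```python
import math

def kuramoto_pair(omega1, omega2, K, theta1=0.0, theta2=2.0, dt=0.01, steps=20000):
    """Two coupled phase oscillators (the two 'lovers'), Euler-integrated.

    d(theta1)/dt = omega1 + K * sin(theta2 - theta1)
    d(theta2)/dt = omega2 + K * sin(theta1 - theta2)

    Returns the final phase difference wrapped to (-pi, pi].
    """
    for _ in range(steps):
        d1 = omega1 + K * math.sin(theta2 - theta1)
        d2 = omega2 + K * math.sin(theta1 - theta2)
        theta1 += d1 * dt
        theta2 += d2 * dt
    diff = (theta1 - theta2) % (2.0 * math.pi)
    return diff - 2.0 * math.pi if diff > math.pi else diff
```

When locking occurs, the phase difference settles where sin(diff) = (omega1 - omega2) / (2K); with too weak a coupling the phases drift apart, which matches the behaviour that led to the project being cancelled.&lt;br /&gt;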
&lt;br /&gt;
Max implementation of the Kuramoto-Daido model&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125736</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125736"/>
		<updated>2021-06-17T12:16:06Z</updated>

		<summary type="html">&lt;p&gt;Js: /* project: color based movement tracker */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and thus looks down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can still reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition: a handbell is rung and the names of the people currently in outer space are read out loud.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, like a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set into relation to the place of exhibition (e.g. Weimar). If the ISS is at a place which can be considered overhead, the notification is triggered.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
&lt;br /&gt;
The Max patch was [https://docs.cycling74.com/max8/vignettes/standalone_building exported to a standalone application]. It is now run on a small computer board to reduce the complexity of the setup. The data about the ISS crew is scraped from the internet using custom-written software.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally ignoring the material reality is feasible on very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
* [https://www.howmanypeopleareinspacerightnow.com/ how many people are in space right now?]&lt;br /&gt;
* [[Max and I, Max and Me class notebook]]&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon more detailed examination and was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context and gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) for sensing the individual. These data streams will be combined into a visual output using Jitter. The result aims to represent the other humans in a way that emphasizes personality rather than emulating the technicality of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max, but none yielded a convergence of the oscillator phases. The project was therefore cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync, even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners, the blinking rhythms their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via light-dependent resistors. A small computer (laptop or small form factor PC) runs both Max and video conference software to display &amp;quot;the other&amp;quot;. The Max patch and &amp;quot;the other&amp;quot; are displayed on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If the audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site somewhere else.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Observing the equation system over time shows the expected behavior (when implemented in Python).&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Max implementation of the Kuramoto-Daido model&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=Max_and_I,_Max_and_Me_class_notebook&amp;diff=125735</id>
		<title>Max and I, Max and Me class notebook</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=Max_and_I,_Max_and_Me_class_notebook&amp;diff=125735"/>
		<updated>2021-06-17T12:16:02Z</updated>

		<summary type="html">&lt;p&gt;Js: /* process: retrieve images from a camera */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= process: other peoples work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please build your own first patch and upload it onto the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin theremin] to get to know the Max workflow. Move the mouse cursor around to control pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little piece is a first attempt, and also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly different from the background for this simple approach to work. This patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
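The math can be sketched outside Max as well. A minimal Python illustration of the two computations involved (the tolerance value is an illustrative assumption, not taken from the patch):

```python
def color_range(rgb, tol=0.1):
    """Build a min/max color range around a picked RGB value, as the
    vexpr objects do in the patch; channels are floats in 0.0..1.0.
    The tolerance of 0.1 is an illustrative assumption."""
    lo = [max(0.0, c - tol) for c in rgb]
    hi = [min(1.0, c + tol) for c in rgb]
    return lo, hi

def bbox_center(x_min, y_min, x_max, y_max):
    """Center point of the bounding box reported by jit.findbounds."""
    return ((x_min + x_max) / 2.0, (y_min + y_max) / 2.0)
```

The range feeds jit.findbounds; the midpoint of the reported bounds is the tracked position.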
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125734</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125734"/>
		<updated>2021-06-17T12:15:34Z</updated>

		<summary type="html">&lt;p&gt;Js: /* process: record and save audio and video data */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and thus can look down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition: a handbell is rung and the names of the people currently in outer space are read out loud.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set into relation to the place of exhibition (e.g. Weimar). If the ISS is at a place which can be considered overhead, the notification is triggered.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
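The flat-earth distance check can be sketched in a few lines of Python. The conversion factor and the overhead threshold are illustrative assumptions, not values taken from the patch:

```python
import math

# Reference point: city center of Weimar (coordinates from the text above)
WEIMAR_LAT, WEIMAR_LON = 50.979444, 11.329722

KM_PER_DEG_LAT = 111.32  # rough mean value, ignoring the WGS 84 ellipsoid

def ground_distance_km(lat, lon):
    """Euclidean ground distance to Weimar on an assumed flat earth.
    Only valid for small distances, as noted above."""
    dlat = (lat - WEIMAR_LAT) * KM_PER_DEG_LAT
    # a degree of longitude shrinks with the cosine of the latitude
    dlon = (lon - WEIMAR_LON) * KM_PER_DEG_LAT * math.cos(math.radians(WEIMAR_LAT))
    return math.hypot(dlat, dlon)

def iss_overhead(lat, lon, threshold_km=100.0):
    """True if the ISS subpoint is within threshold_km of Weimar.
    The 100 km threshold is an assumption for illustration."""
    return ground_distance_km(lat, lon) <= threshold_km
```

Feeding the latitude/longitude pair from the open-notify.org JSON response into `iss_overhead` yields the trigger condition for the notification.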
&lt;br /&gt;
The Max patch was [https://docs.cycling74.com/max8/vignettes/standalone_building exported to a standalone application]. It now runs on a small computer board to reduce the complexity of the setup. The data about the ISS crew is scraped from the internet using custom-written software.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally, ignoring material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
* [https://www.howmanypeopleareinspacerightnow.com/ how many people are in space right now?]&lt;br /&gt;
* [[Max and I, Max and Me class notebook]]&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly different from the background for this simple approach to work. This patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon more detailed examination. It was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues, or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an arduino or similar microcontroller (probably ESP8266) for sensing the individual. These data streams will be combined to produce a visual output using jitter. This result aims to represent the other humans in a manner to emphasize personality and not the emulation of the technicality in face to face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives don&#039;t take place in the same environment anymore and are open to different influences. It takes effort to stay in sync even though a distance may be bridged by modern communication means. Two blinking LEDs represent the partners, the blinking rhythm their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small form factor PC) runs both Max and a video conference software to display &amp;quot;the other&amp;quot;. The Max patch and the &amp;quot;other&amp;quot; are displayed on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If this audio level exceeds a given threshold, the blinking of one LED is triggered / reset to introduce a disturbance in the process. The installation may be presented with both parts in the same room or with one part off-site somewhere else.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
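For two oscillators the coupled phase equations take a simple form. A minimal Python sketch using Euler integration (the frequencies, coupling strength, and step size are illustrative assumptions, not the values used in the installation):

```python
import math

def simulate(theta1=0.0, theta2=2.5, w1=1.0, w2=1.2, K=2.0, dt=0.01, steps=5000):
    """Two symmetrically coupled Kuramoto oscillators:
    dtheta_i/dt = w_i + K * sin(theta_j - theta_i).
    With K large enough relative to |w2 - w1| the phases lock."""
    for _ in range(steps):
        d1 = w1 + K * math.sin(theta2 - theta1)
        d2 = w2 + K * math.sin(theta1 - theta2)
        theta1 += d1 * dt
        theta2 += d2 * dt
    return theta1, theta2
```

Once locked, the phase difference settles at asin((w2 - w1) / (2 K)), which is the convergence the Max implementations failed to reproduce.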
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Observing the equation system over time shows the expected behavior (when implemented in Python).&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Max implementation of the Kuramoto–Daido model:&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=Max_and_I,_Max_and_Me_class_notebook&amp;diff=125733</id>
		<title>Max and I, Max and Me class notebook</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=Max_and_I,_Max_and_Me_class_notebook&amp;diff=125733"/>
		<updated>2021-06-17T12:15:31Z</updated>

		<summary type="html">&lt;p&gt;Js: /* project: theremin */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= process: other peoples work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please do your own first patch and upload onto the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin Theremin] to get to know the workflow in Max. Move your mouse cursor around to control the pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little art piece is a first attempt, and also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125732</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125732"/>
		<updated>2021-06-17T12:15:17Z</updated>

		<summary type="html">&lt;p&gt;Js: /* project: screamie */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and thus can look down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition: a handbell is rung and the names of the people currently in outer space are read out loud.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set into relation to the place of exhibition (e.g. Weimar). If the ISS is at a place which can be considered overhead, the notification is triggered.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
&lt;br /&gt;
The Max patch was [https://docs.cycling74.com/max8/vignettes/standalone_building exported to a standalone application]. It now runs on a small computer board to reduce the complexity of the setup. The data about the ISS crew is scraped from the internet using custom-written software.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally, ignoring material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
* [https://www.howmanypeopleareinspacerightnow.com/ how many people are in space right now?]&lt;br /&gt;
* [[Max and I, Max and Me class notebook]]&lt;br /&gt;
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly different from the background for this simple approach to work. This patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon more detailed examination. It was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues, or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) to sense the individual. These data streams will be combined to produce a visual output using Jitter. The result aims to represent the other humans in a manner that emphasizes personality rather than emulating the technicality of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives don&#039;t take place in the same environment anymore and are open to different influences. It takes effort to stay in sync even though a distance may be bridged by modern communication means. Two blinking LEDs represent the partners, the blinking rhythm their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small form factor PC) runs both Max and a video conference software to display &amp;quot;the other&amp;quot;. The Max patch and the &amp;quot;other&amp;quot; are displayed on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If this audio level exceeds a given threshold, the blinking of one LED is triggered / reset to introduce a disturbance in the process. The installation may be presented with both parts in the same room or with one part off-site somewhere else.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Observing the equation system over time shows the expected behavior (when implemented in Python).&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Max implementation of the Kuramoto–Daido model:&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125731</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125731"/>
		<updated>2021-06-17T12:13:12Z</updated>

		<summary type="html">&lt;p&gt;Js: /* process: retrieve images from a camera */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and thus looks down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition. A handbell is rung and the names of the people currently in outer space are read out loud.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set in relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, it is assumed that the world is flat, and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth introduces only a small error.&lt;br /&gt;
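The flat-earth shortcut amounts to a few lines of Python; the &amp;quot;overhead&amp;quot; radius below is an illustrative value, not the one used in the installation:&lt;br /&gt;

```python
import math

KM_PER_DEG_LAT = 111.32            # one degree of latitude, roughly
WEIMAR = (50.979444, 11.329722)    # city center of Weimar

def flat_distance_km(lat1, lon1, lat2, lon2):
    """Euclidean ground distance, ignoring WGS 84 and the curvature
    of the earth -- adequate only for small distances."""
    dy = (lat2 - lat1) * KM_PER_DEG_LAT
    # a degree of longitude shrinks with the cosine of the latitude
    mid_lat = math.radians((lat1 + lat2) / 2)
    dx = (lon2 - lon1) * KM_PER_DEG_LAT * math.cos(mid_lat)
    return math.hypot(dx, dy)

def iss_overhead(iss_lat, iss_lon, radius_km=500.0):
    """True if an ISS position (e.g. parsed from the open-notify
    JSON) lies within radius_km of the exhibition place."""
    return radius_km >= flat_distance_km(*WEIMAR, iss_lat, iss_lon)
```

Within a few hundred kilometres of Weimar the error of this approximation stays far below the chosen radius.&lt;br /&gt;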
&lt;br /&gt;
The Max patch was [https://docs.cycling74.com/max8/vignettes/standalone_building exported to a standalone application]. It is now run on a small computer board to reduce the complexity of the setup. The data about the ISS crew is scraped from the internet using custom-written software.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally ignoring the material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
* [https://www.howmanypeopleareinspacerightnow.com/ how many people are in space right now?]&lt;br /&gt;
* [[Max and I, Max and Me class notebook]]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little piece is a first shot. It is also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
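The trigger logic can be mimicked outside Max as well; a small Python analogue of the [past] stage (the 0.7 threshold is made up for the example):&lt;br /&gt;

```python
class Past:
    """Software analogue of the Max [past] object: bang once when the
    input rises above the threshold, re-arm once it falls back below."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.armed = True

    def update(self, level):
        if self.armed and level > self.threshold:
            self.armed = False
            return True        # "bang": grab the selfie
        if self.threshold >= level:
            self.armed = True  # quiet again, allow the next trigger
        return False

trigger = Past(0.7)
bangs = [trigger.update(v) for v in [0.1, 0.2, 0.9, 0.95, 0.3, 0.8]]
# bangs -> [False, False, True, False, False, True]
```

A sustained scream produces only one photo; the level has to drop below the threshold before a new one can be triggered.&lt;br /&gt;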
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly different from the background for this simple approach to work. This patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
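The math is roughly the following, sketched in Python over a plain list-of-rows frame (the tolerance value is an assumption for illustration, not how jit.findbounds is configured in the patch):&lt;br /&gt;

```python
def find_bounds(frame, target_rgb, tol=30):
    """Bounding box and center of all pixels within tol per channel
    of target_rgb. frame is a list of rows of (r, g, b) tuples.
    Returns ((x_min, y_min, x_max, y_max), (cx, cy)) or None."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            if (tol >= abs(r - target_rgb[0])
                    and tol >= abs(g - target_rgb[1])
                    and tol >= abs(b - target_rgb[2])):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    box = (min(xs), min(ys), max(xs), max(ys))
    center = ((box[0] + box[2]) / 2, (box[1] + box[3]) / 2)
    return box, center

# demo: a 10x10 black frame with a red block at columns 3..6, rows 2..4
frame = [[(0, 0, 0)] * 10 for _ in range(10)]
for y in range(2, 5):
    for x in range(3, 7):
        frame[y][x] = (255, 0, 0)
box, center = find_bounds(frame, (255, 0, 0))
```

The center of the bounding box is what feeds the subsequent math, analogous to the output of jit.findbounds in the patch.&lt;br /&gt;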
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon more detailed examination. It was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues, or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) for sensing the individual. These data streams will be combined to produce a visual output using Jitter. The result aims to represent the other humans in a way that emphasizes personality rather than emulating the technicalities of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync, even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners, the blinking rhythm their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small form factor PC) runs both Max and a video conference software to display &amp;quot;the other&amp;quot;. The Max patch and the &amp;quot;other&amp;quot; are displayed on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If this audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site somewhere else.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Simulating the equation system in Python shows the expected behavior over time.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Max implementation of the Kuramoto–Daido model:&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). In H. Araki (ed.), International Symposium on Mathematical Problems in Theoretical Physics, Lecture Notes in Physics 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=Max_and_I,_Max_and_Me_class_notebook&amp;diff=125730</id>
		<title>Max and I, Max and Me class notebook</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=Max_and_I,_Max_and_Me_class_notebook&amp;diff=125730"/>
		<updated>2021-06-17T12:13:10Z</updated>

		<summary type="html">&lt;p&gt;Js: /* project: theremin */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please do your own first patch and upload onto the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin Theremin] to get to know the workflow in Max. Move your mouse cursor around to control the pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125729</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125729"/>
		<updated>2021-06-17T12:12:49Z</updated>

		<summary type="html">&lt;p&gt;Js: /* project: theremin */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and thus looks down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition. A handbell is rung and the names of the people currently in outer space are read out loud.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set in relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, it is assumed that the world is flat, and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth introduces only a small error.&lt;br /&gt;
&lt;br /&gt;
The Max patch was [https://docs.cycling74.com/max8/vignettes/standalone_building exported to a standalone application]. It is now run on a small computer board to reduce the complexity of the setup. The data about the ISS crew is scraped from the internet using custom-written software.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally ignoring the material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
* [https://www.howmanypeopleareinspacerightnow.com/ how many people are in space right now?]&lt;br /&gt;
* [[Max and I, Max and Me class notebook]]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little piece is a first shot. It is also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly different from the background for this simple approach to work. This patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon more detailed examination. It was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues, or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) for sensing the individual. These data streams will be combined to produce a visual output using Jitter. The result aims to represent the other humans in a way that emphasizes personality rather than emulating the technicalities of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync, even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners, the blinking rhythm their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small form factor PC) runs both Max and a video conference software to display &amp;quot;the other&amp;quot;. The Max patch and the &amp;quot;other&amp;quot; are displayed on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If this audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site somewhere else.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Simulating the equation system in Python shows the expected behavior over time.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Max implementation of the Kuramoto–Daido model:&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). In H. Araki (ed.), International Symposium on Mathematical Problems in Theoretical Physics, Lecture Notes in Physics 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=Max_and_I,_Max_and_Me_class_notebook&amp;diff=125728</id>
		<title>Max and I, Max and Me class notebook</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=Max_and_I,_Max_and_Me_class_notebook&amp;diff=125728"/>
		<updated>2021-06-17T12:12:46Z</updated>

		<summary type="html">&lt;p&gt;Js: /* process: other peoples work to investigate */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please do your own first patch and upload onto the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin Theremin] to get to know the workflow in Max. Move your mouse cursor around to control the pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=Max_and_I,_Max_and_Me_class_notebook&amp;diff=125727</id>
		<title>Max and I, Max and Me class notebook</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=Max_and_I,_Max_and_Me_class_notebook&amp;diff=125727"/>
		<updated>2021-06-17T12:11:32Z</updated>

		<summary type="html">&lt;p&gt;Js: Created page with &amp;quot;= process: other peoples work to investigate = * [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXP...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125726</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125726"/>
		<updated>2021-06-17T12:11:17Z</updated>

		<summary type="html">&lt;p&gt;Js: /* process: other peoples work to investigate */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and can thus look down on a vast planet. Humans, on the other hand, rarely look up and grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition: a handbell is rung and the names of the people currently in outer space are read out loud.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set in relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered. &lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
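As a minimal sketch, the flat-earth distance check described above might look like the following (hypothetical Python, not the actual Max patch; the 100 km overhead radius is an assumed parameter):&lt;br /&gt;

```python
# Hypothetical sketch of the flat-earth distance check; not the actual patch.
# Assumes the latitude/longitude of the ISS have already been fetched
# (e.g. from the open-notify.org JSON API).
import math

WEIMAR = (50.979444, 11.329722)   # reference point: Weimar city center
KM_PER_DEG_LAT = 111.32           # rough length of one degree of latitude

def flat_earth_distance_km(lat, lon, ref=WEIMAR):
    """Euclidean distance on a locally flat earth, in kilometres."""
    dlat = (lat - ref[0]) * KM_PER_DEG_LAT
    # a degree of longitude shrinks with the cosine of the latitude
    dlon = (lon - ref[1]) * KM_PER_DEG_LAT * math.cos(math.radians(ref[0]))
    return math.hypot(dlat, dlon)

def is_overhead(lat, lon, radius_km=100.0):
    """True when the ISS ground track is within radius_km of the reference."""
    return flat_earth_distance_km(lat, lon) <= radius_km
```

The longitude correction keeps the flat-earth error small around Weimar; far from the reference point the approximation breaks down, which is exactly the trade-off described above.&lt;br /&gt;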
&lt;br /&gt;
The Max patch was [https://docs.cycling74.com/max8/vignettes/standalone_building exported to a standalone application]. It now runs on a small computer board to reduce the complexity of the setup. The data about the ISS crew is scraped from the internet using custom-written software.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally ignoring the material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
* [https://www.howmanypeopleareinspacerightnow.com/ how many people are in space right now?]&lt;br /&gt;
* [[Max and I, Max and Me class notebook]]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please create your own first patch and upload it to the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin Theremin] to get to know the Max workflow. Move your mouse cursor around to control pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little artwork is a first shot. It is also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly&lt;br /&gt;
different from the background for this simple approach to work. The patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
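That math can be sketched in plain Python (a hypothetical illustration; the patch itself does this with vexpr and jit.findbounds):&lt;br /&gt;

```python
# Hypothetical sketch of the color-tracking math, not the actual Max patch.
# frame is a list of rows, each row a list of (r, g, b) tuples (0..255).
def track_color(frame, target, tol=30):
    """Return ((x0, y0, x1, y1), (cx, cy)) for pixels near target, or None."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, pixel in enumerate(row):
            # a pixel matches when every channel is within +/- tol of the target
            if all(abs(c - t) <= tol for c, t in zip(pixel, target)):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # target color not present in this frame
    bbox = (min(xs), min(ys), max(xs), max(ys))
    center = ((bbox[0] + bbox[2]) / 2, (bbox[1] + bbox[3]) / 2)
    return bbox, center
```

The tolerance plays the same role as the value range given to jit.findbounds: it decides how different from the background the marker color has to be.&lt;br /&gt;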
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon more detailed examination. It was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues, or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift this perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) for sensing the individual. These data streams will be combined to produce a visual output using Jitter. The result aims to represent the other humans in a manner that emphasizes personality rather than emulating the technicality of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when the partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync, even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners, the blinking rhythms their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small-form-factor PC) runs both Max and video conferencing software to display &amp;quot;the other&amp;quot;. The Max patch and &amp;quot;the other&amp;quot; are shown on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If the audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site. &lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Simulating the equation system over time (implemented in Python) shows the expected behavior.&lt;br /&gt;
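A minimal Python sketch of such a two-oscillator simulation (illustrative parameters and a simple Euler integration; not the original course code):&lt;br /&gt;

```python
# Hypothetical sketch: two phase oscillators with Kuramoto-style sinusoidal
# coupling, integrated with explicit Euler steps. Parameters are illustrative.
import math

def phase_difference(omega1=1.0, omega2=1.2, k=0.5, dt=0.01, steps=20000):
    """Simulate two coupled oscillators and return their final phase difference."""
    th1, th2 = 0.0, math.pi / 2   # start clearly out of phase
    for _ in range(steps):
        dth1 = omega1 + k * math.sin(th2 - th1)
        dth2 = omega2 + k * math.sin(th1 - th2)
        th1 += dth1 * dt
        th2 += dth2 * dt
    # wrap the difference into (-pi, pi]
    return (th2 - th1 + math.pi) % (2 * math.pi) - math.pi
```

With these parameters the locking condition |omega2 - omega1| &amp;lt; 2k holds, so the phases converge to a constant offset of asin((omega2 - omega1) / (2k)) instead of drifting apart.&lt;br /&gt;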
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Max implementation of the Kuramoto-Daido model&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125725</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125725"/>
		<updated>2021-06-17T12:10:52Z</updated>

		<summary type="html">&lt;p&gt;Js: /* references */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and can thus look down on a vast planet. Humans, on the other hand, rarely look up and grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition: a handbell is rung and the names of the people currently in outer space are read out loud.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set in relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered. &lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
&lt;br /&gt;
The Max patch was [https://docs.cycling74.com/max8/vignettes/standalone_building exported to a standalone application]. It now runs on a small computer board to reduce the complexity of the setup. The data about the ISS crew is scraped from the internet using custom-written software.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally ignoring the material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
* [https://www.howmanypeopleareinspacerightnow.com/ how many people are in space right now?]&lt;br /&gt;
* [[Max and I, Max and Me class notebook]]&lt;br /&gt;
&lt;br /&gt;
= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please create your own first patch and upload it to the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin Theremin] to get to know the Max workflow. Move your mouse cursor around to control pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little artwork is a first shot. It is also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly&lt;br /&gt;
different from the background for this simple approach to work. The patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon more detailed examination. It was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues, or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift this perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) for sensing the individual. These data streams will be combined to produce a visual output using Jitter. The result aims to represent the other humans in a manner that emphasizes personality rather than emulating the technicality of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when the partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync, even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners, the blinking rhythms their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small-form-factor PC) runs both Max and video conferencing software to display &amp;quot;the other&amp;quot;. The Max patch and &amp;quot;the other&amp;quot; are shown on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If the audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site. &lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Simulating the equation system over time (implemented in Python) shows the expected behavior.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Max implementation of the Kuramoto-Daido model&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125722</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125722"/>
		<updated>2021-06-17T11:56:41Z</updated>

		<summary type="html">&lt;p&gt;Js: /* implementation notes */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and can thus look down on a vast planet. Humans, on the other hand, rarely look up and grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition: a handbell is rung and the names of the people currently in outer space are read out loud.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set in relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
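The flat-earth shortcut described above can be sketched in a few lines of Python. The constants, function names, and the trigger radius below are illustrative assumptions, not values taken from the patch:

```python
import math

# Reference point: Weimar city center (from the notes above).
WEIMAR_LAT, WEIMAR_LON = 50.979444, 11.329722

# Rough km per degree of latitude; the longitude scale shrinks
# with cos(latitude). Both constants are illustrative.
KM_PER_DEG = 111.32

def flat_earth_distance_km(lat, lon, ref_lat=WEIMAR_LAT, ref_lon=WEIMAR_LON):
    """Euclidean distance on a locally flat earth, in kilometres."""
    dy = (lat - ref_lat) * KM_PER_DEG
    dx = (lon - ref_lon) * KM_PER_DEG * math.cos(math.radians(ref_lat))
    return math.hypot(dx, dy)

def is_overhead(lat, lon, radius_km=100.0):
    """Fire the notification when the sub-satellite point is within radius_km."""
    return flat_earth_distance_km(lat, lon) <= radius_km
```

The latitude/longitude fed in would come from the JSON that open-notify.org returns; the farther the reference point is from the equator, the more the cosine correction matters.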
&lt;br /&gt;
The Max patch was [https://docs.cycling74.com/max8/vignettes/standalone_building exported to a standalone application]. It now runs on a small computer board to reduce the complexity of the setup. The data about the ISS crew is scraped from the internet using custom-written software.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally, ignoring material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
* [https://www.howmanypeopleareinspacerightnow.com/ how many people are in space right now?]&lt;br /&gt;
&lt;br /&gt;
= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please do your own first patch and upload onto the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin Theremin] to get to know the workflow with Max. Move your mouse cursor around to control the pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little piece is a first shot, and also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
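The trigger chain above (peakamp~ feeding past) can be sketched in Python: track the peak amplitude over a short rolling window and fire once when it first rises above a threshold. The class name, threshold, and window size are illustrative assumptions, not values from the patch:

```python
from collections import deque

class ScreamTrigger:
    """Fire once when the rolling peak amplitude crosses a threshold."""

    def __init__(self, threshold=0.8, window=10):
        self.threshold = threshold
        self.recent = deque(maxlen=window)  # rolling window of amplitude readings
        self.armed = True                   # fire only on an upward crossing

    def feed(self, amplitude):
        """Feed one amplitude reading; return True when the selfie should be taken."""
        self.recent.append(amplitude)
        peak = max(self.recent)
        if peak >= self.threshold and self.armed:
            self.armed = False              # don't retrigger while still loud
            return True
        if peak < self.threshold:
            self.armed = True               # re-arm once things quiet down
        return False
```

The re-arming mirrors what past does in Max: it bangs on the upward crossing and stays quiet until the input drops below the threshold again.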
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly&lt;br /&gt;
different from the background for this simple approach to work. The patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
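The "math" part, essentially what jit.findbounds does in the patch, can be sketched in Python with NumPy. The function name and tolerance are illustrative assumptions:

```python
import numpy as np

def find_color_bounds(image, target_rgb, tolerance=0.1):
    """Bounding box and centre of the pixels near target_rgb.

    image: (height, width, 3) float array with values in 0.0 .. 1.0.
    Returns ((x_min, y_min), (x_max, y_max), (cx, cy)), or None if no pixel matches.
    """
    target = np.asarray(target_rgb, dtype=float)
    # A pixel matches when every channel is within the tolerance band.
    mask = np.all(np.abs(image - target) <= tolerance, axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    x_min, x_max = int(xs.min()), int(xs.max())
    y_min, y_max = int(ys.min()), int(ys.max())
    centre = ((x_min + x_max) / 2.0, (y_min + y_max) / 2.0)
    return (x_min, y_min), (x_max, y_max), centre
```

This also shows why the object color must stand out: if background pixels fall inside the tolerance band, they stretch the bounding box and pull the centre point away from the object.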
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon more detailed examination. It was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues, or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) for sensing the individual. These data streams will be combined to produce visual output using Jitter. The result aims to represent the other humans in a manner that emphasizes personality rather than emulating the technicality of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners, the blinking rhythm their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small-form-factor PC) runs both Max and a video conference software to display &amp;quot;the other&amp;quot;. The Max patch and &amp;quot;the other&amp;quot; are displayed on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If this audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site somewhere else.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
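The two coupled oscillators can be sketched as a small Euler integration in Python. The step size, initial phases, and parameter values below are illustrative assumptions, not values from the patch or the notes:

```python
import math

def simulate_two_oscillators(omega1, omega2, coupling, dt=0.01, steps=5000):
    """Euler integration of two Kuramoto-coupled phase oscillators.

    dtheta_i/dt = omega_i + K * sin(theta_j - theta_i)

    Returns the final phase difference, wrapped to (-pi, pi].
    """
    theta1, theta2 = 0.0, 2.0  # arbitrary initial phases
    for _ in range(steps):
        d1 = omega1 + coupling * math.sin(theta2 - theta1)
        d2 = omega2 + coupling * math.sin(theta1 - theta2)
        theta1 += d1 * dt
        theta2 += d2 * dt
    return (theta2 - theta1 + math.pi) % (2 * math.pi) - math.pi
```

With two oscillators the model predicts phase locking whenever the frequency mismatch satisfies |omega1 - omega2| &lt; 2K; the locked phase difference settles at asin((omega2 - omega1) / (2K)), which gives a concrete criterion to check any Max implementation against.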
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Observing the equation system over time (implemented in Python) shows the expected behavior.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Max implementation of the Kuramoto-Daido model&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125720</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125720"/>
		<updated>2021-06-17T11:52:11Z</updated>

		<summary type="html">&lt;p&gt;Js: /* references */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and can thus look down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition: a handbell is rung and the names of the people currently in outer space are read out loud.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set in relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
&lt;br /&gt;
The Max patch was [https://docs.cycling74.com/max8/vignettes/standalone_building exported to a standalone application]. It is now run on a small computer board to reduce the complexity of the setup.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally, ignoring material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
* [https://www.howmanypeopleareinspacerightnow.com/ how many people are in space right now?]&lt;br /&gt;
&lt;br /&gt;
= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please do your own first patch and upload onto the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin Theremin] to get to know the workflow with Max. Move your mouse cursor around to control the pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little piece is a first shot, and also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly&lt;br /&gt;
different from the background for this simple approach to work. The patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon more detailed examination. It was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues, or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) for sensing the individual. These data streams will be combined to produce visual output using Jitter. The result aims to represent the other humans in a manner that emphasizes personality rather than emulating the technicality of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners, the blinking rhythm their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small-form-factor PC) runs both Max and a video conference software to display &amp;quot;the other&amp;quot;. The Max patch and &amp;quot;the other&amp;quot; are displayed on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If this audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site somewhere else.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Observing the equation system over time (implemented in Python) shows the expected behavior.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Max implementation of the Kuramoto-Daido model&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125719</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125719"/>
		<updated>2021-06-17T11:50:02Z</updated>

		<summary type="html">&lt;p&gt;Js: /* final project: ISS overhead */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and can thus look down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition: a handbell is rung and the names of the people currently in outer space are read out loud.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set in relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
&lt;br /&gt;
The Max patch was [https://docs.cycling74.com/max8/vignettes/standalone_building exported to a standalone application]. It is now run on a small computer board to reduce the complexity of the setup.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally ignoring the material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
&lt;br /&gt;
= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please create your own first patch and upload it to the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin theremin] to get to know the workflow with Max. Move your mouse cursor around to control the pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little artwork is a first attempt, and also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly different from the background for this simple approach to work. This patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon more detailed examination. It was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) to sense the individual. These data streams will be combined to produce a visual output using Jitter. The result aims to represent the other humans in a manner that emphasizes personality rather than emulating the technicality of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners, the blinking rhythms their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via light-dependent resistors. A small computer (laptop or small form factor PC) runs both Max and video conference software to display &amp;quot;the other&amp;quot;. The Max patch and the &amp;quot;other&amp;quot; are shown on a screen or via a video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If this audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site somewhere else.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Observing the equation system over time shows the expected behavior (when implemented in Python).&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
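The behavior mentioned above can be reproduced in a few lines. This sketch (parameter values are illustrative, not those of the installation) integrates two Kuramoto-coupled phase oscillators with the Euler method; with equal natural frequencies and positive coupling their phase difference decays toward zero.

```python
import math

def simulate(theta1=0.0, theta2=2.0, omega=1.0, coupling=0.5,
             dt=0.01, steps=2000):
    """Two phase oscillators with Kuramoto-style sine coupling.

    d(theta1)/dt = omega + K * sin(theta2 - theta1)
    d(theta2)/dt = omega + K * sin(theta1 - theta2)

    Integrated with the explicit Euler method; returns final phases.
    """
    for _ in range(steps):
        diff = theta2 - theta1
        theta1 += dt * (omega + coupling * math.sin(diff))
        theta2 += dt * (omega - coupling * math.sin(diff))
    return theta1, theta2

if __name__ == "__main__":
    t1, t2 = simulate()
    print("final phase difference:", abs(t2 - t1))
```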
&lt;br /&gt;
Max implementation of the Kuramoto-Daido model&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125718</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125718"/>
		<updated>2021-06-17T11:49:33Z</updated>

		<summary type="html">&lt;p&gt;Js: /* final project: ISS overhead */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and thus can look down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can still reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition. A handbell is rung and the names of the people currently in outer space are read out loud.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and related to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered. If a speaker is used, the sound synthesis can encode different parameters.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, it is assumed that the world is flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
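The overhead check described above can also be sketched outside Max. The following minimal Python version queries the documented open-notify.org endpoint and applies the same flat-earth simplification; the function names and the threshold value are illustrative, not taken from the patch.

```python
import json
import math
import urllib.request

# Reference point: city center of Weimar (latitude, longitude in degrees).
WEIMAR = (50.979444, 11.329722)

def flat_distance_deg(lat1, lon1, lat2, lon2):
    """Euclidean distance in degrees, assuming a flat earth.

    The WGS 84 ellipsoid is deliberately ignored, as in the patch;
    this is only reasonable close to the reference point.
    """
    return math.hypot(lat1 - lat2, lon1 - lon2)

def is_overhead(lat, lon, threshold_deg=5.0):
    """True if a position lies within threshold_deg of Weimar (illustrative value)."""
    return flat_distance_deg(lat, lon, *WEIMAR) <= threshold_deg

def fetch_iss_position():
    """Fetch the current ISS position as JSON from open-notify.org."""
    url = "http://api.open-notify.org/iss-now.json"
    with urllib.request.urlopen(url) as response:
        data = json.load(response)
    pos = data["iss_position"]
    return float(pos["latitude"]), float(pos["longitude"])

if __name__ == "__main__":
    lat, lon = fetch_iss_position()
    if is_overhead(lat, lon):
        print("ISS overhead - ring the bell!")
```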
&lt;br /&gt;
The Max patch was [https://docs.cycling74.com/max8/vignettes/standalone_building exported to a standalone application]. It is now run on a small computer board to reduce the complexity of the setup.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally ignoring the material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
&lt;br /&gt;
= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please create your own first patch and upload it to the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin theremin] to get to know the workflow with Max. Move your mouse cursor around to control the pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little artwork is a first attempt, and also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly different from the background for this simple approach to work. This patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon more detailed examination. It was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) to sense the individual. These data streams will be combined to produce a visual output using Jitter. The result aims to represent the other humans in a manner that emphasizes personality rather than emulating the technicality of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners, the blinking rhythms their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via light-dependent resistors. A small computer (laptop or small form factor PC) runs both Max and video conference software to display &amp;quot;the other&amp;quot;. The Max patch and the &amp;quot;other&amp;quot; are shown on a screen or via a video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If this audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site somewhere else.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Observing the equation system over time shows the expected behavior (when implemented in Python).&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
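The behavior mentioned above can be reproduced in a few lines. This sketch (parameter values are illustrative, not those of the installation) integrates two Kuramoto-coupled phase oscillators with the Euler method; with equal natural frequencies and positive coupling their phase difference decays toward zero.

```python
import math

def simulate(theta1=0.0, theta2=2.0, omega=1.0, coupling=0.5,
             dt=0.01, steps=2000):
    """Two phase oscillators with Kuramoto-style sine coupling.

    d(theta1)/dt = omega + K * sin(theta2 - theta1)
    d(theta2)/dt = omega + K * sin(theta1 - theta2)

    Integrated with the explicit Euler method; returns final phases.
    """
    for _ in range(steps):
        diff = theta2 - theta1
        theta1 += dt * (omega + coupling * math.sin(diff))
        theta2 += dt * (omega - coupling * math.sin(diff))
    return theta1, theta2

if __name__ == "__main__":
    t1, t2 = simulate()
    print("final phase difference:", abs(t2 - t1))
```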
&lt;br /&gt;
Max implementation of the Kuramoto-Daido model&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:People_in_space.mp3&amp;diff=125717</id>
		<title>File:People in space.mp3</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:People_in_space.mp3&amp;diff=125717"/>
		<updated>2021-06-17T11:47:05Z</updated>

		<summary type="html">&lt;p&gt;Js: Js uploaded a new version of File:People in space.mp3&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125716</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125716"/>
		<updated>2021-06-17T11:41:46Z</updated>

		<summary type="html">&lt;p&gt;Js: /* implementation notes */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and thus can look down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can still reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and related to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered. If a speaker is used, the sound synthesis can encode different parameters.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, it is assumed that the world is flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
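The overhead check described above can also be sketched outside Max. The following minimal Python version queries the documented open-notify.org endpoint and applies the same flat-earth simplification; the function names and the threshold value are illustrative, not taken from the patch.

```python
import json
import math
import urllib.request

# Reference point: city center of Weimar (latitude, longitude in degrees).
WEIMAR = (50.979444, 11.329722)

def flat_distance_deg(lat1, lon1, lat2, lon2):
    """Euclidean distance in degrees, assuming a flat earth.

    The WGS 84 ellipsoid is deliberately ignored, as in the patch;
    this is only reasonable close to the reference point.
    """
    return math.hypot(lat1 - lat2, lon1 - lon2)

def is_overhead(lat, lon, threshold_deg=5.0):
    """True if a position lies within threshold_deg of Weimar (illustrative value)."""
    return flat_distance_deg(lat, lon, *WEIMAR) <= threshold_deg

def fetch_iss_position():
    """Fetch the current ISS position as JSON from open-notify.org."""
    url = "http://api.open-notify.org/iss-now.json"
    with urllib.request.urlopen(url) as response:
        data = json.load(response)
    pos = data["iss_position"]
    return float(pos["latitude"]), float(pos["longitude"])

if __name__ == "__main__":
    lat, lon = fetch_iss_position()
    if is_overhead(lat, lon):
        print("ISS overhead - ring the bell!")
```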
&lt;br /&gt;
The Max patch was [https://docs.cycling74.com/max8/vignettes/standalone_building exported to a standalone application]. It is now run on a small computer board to reduce the complexity of the setup.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally ignoring the material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
&lt;br /&gt;
= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please create your own first patch and upload it to the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin theremin] to get to know the Max workflow. Move the mouse cursor around to control pitch and volume.&lt;br /&gt;
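The mouse-to-sound mapping can be sketched outside Max; the screen size and frequency range below are illustrative assumptions, not values from the patch:

```python
def mouse_to_theremin(x, y, screen_w=1920, screen_h=1080,
                      f_min=110.0, f_max=880.0):
    """Map a mouse position to (frequency in Hz, amplitude 0..1).

    Horizontal position controls pitch, vertical position controls
    volume (top of the screen is loudest). The pitch mapping is
    exponential so equal mouse movements feel like equal intervals.
    """
    fx = min(max(x / screen_w, 0.0), 1.0)   # normalize and clamp
    fy = min(max(y / screen_h, 0.0), 1.0)
    freq = f_min * (f_max / f_min) ** fx    # exponential pitch curve
    amp = 1.0 - fy                          # y = 0 (top) -> full volume
    return freq, amp
```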
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little artwork is a first attempt. It is also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
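The peakamp~/past combination above can be mimicked in a few lines of Python to make the trigger logic explicit. The window length and threshold here are illustrative assumptions, not values from the patch:

```python
from collections import deque

class ScreamTrigger:
    """Track the peak amplitude over a sliding window (like peakamp~)
    and fire a single bang when it crosses a threshold (like past)."""

    def __init__(self, threshold=0.8, window=10):
        self.threshold = threshold           # assumed value, tune to taste
        self.samples = deque(maxlen=window)  # e.g. 10 peak readings ~ 1 s
        self.armed = True                    # re-arms once level drops

    def feed(self, amplitude):
        """Feed one amplitude reading; return True exactly once per scream."""
        self.samples.append(amplitude)
        peak = max(self.samples)
        if peak >= self.threshold and self.armed:
            self.armed = False
            return True      # bang: take the selfie
        if peak < self.threshold:
            self.armed = True
        return False
```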
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly different from the background for this simple approach to work. The patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
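The remaining math is mostly reducing the picked colour to a min/max range for jit.findbounds and the resulting bounding box to a centre point. A Python sketch of those two steps (the tolerance value is an assumption):

```python
def color_match_range(rgba, tolerance=0.1):
    """Build the min/max range around a picked RGBA colour
    (4-tuple of floats 0.0..1.0, as suckah yields) that would be
    fed to jit.findbounds. Tolerance is an assumed value."""
    lo = [max(0.0, c - tolerance) for c in rgba]
    hi = [min(1.0, c + tolerance) for c in rgba]
    return lo, hi

def bbox_center(min_x, min_y, max_x, max_y):
    """Reduce a bounding box (as jit.findbounds reports) to its
    centre point, the value used for tracking."""
    return ((min_x + max_x) / 2.0, (min_y + max_y) / 2.0)
```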
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon closer examination. It was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues, or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) for sensing the individual. These data streams will be combined to produce a visual output using Jitter. The result aims to represent the other humans in a manner that emphasizes personality rather than emulating the technicality of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when the partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync, even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners, the blinking rhythms their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via light-dependent resistors. A small computer (laptop or small form factor PC) runs both Max and video conference software to display &amp;quot;the other&amp;quot;. The Max patch and &amp;quot;the other&amp;quot; are shown on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If the audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance in the process. The installation may be presented with both parts in the same room or with one part off-site.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Observing the equation system over time shows the expected behavior (when implemented in Python).&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
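The Python simulation mentioned above can be sketched roughly as follows; the parameters here are illustrative, not the ones used in the installation:

```python
import math

def kuramoto_step(theta, omega, coupling, dt):
    """One Euler step of the two-oscillator Kuramoto model:
    dtheta_i/dt = omega_i + K * sin(theta_j - theta_i)"""
    t1, t2 = theta
    w1, w2 = omega
    d1 = w1 + coupling * math.sin(t2 - t1)
    d2 = w2 + coupling * math.sin(t1 - t2)
    return (t1 + d1 * dt, t2 + d2 * dt)

def simulate(theta0=(0.0, 2.0), omega=(1.0, 1.2), coupling=0.5,
             dt=0.01, steps=5000):
    """Integrate the coupled pair; returns the final phases."""
    theta = theta0
    for _ in range(steps):
        theta = kuramoto_step(theta, omega, coupling, dt)
    return theta
```

With the coupling K larger than half the natural-frequency difference (here 0.5 vs. 0.1), the pair phase-locks: the phase difference settles at arcsin((omega1 - omega2) / (2K)) instead of drifting.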
&lt;br /&gt;
Max implementation of the Kuramoto-Daido model&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125715</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125715"/>
		<updated>2021-06-17T11:37:32Z</updated>

		<summary type="html">&lt;p&gt;Js: /* final project: ISS overhead */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and can thus look down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set in relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered. If a speaker is used, the sound synthesis can encode different parameters.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on the ground) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math it is assumed that the world is flat, and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too large an error.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally, ignoring material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
&lt;br /&gt;
= process: other people's work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please create your own first patch and upload it to the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin theremin] to get to know the Max workflow. Move the mouse cursor around to control pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little artwork is a first attempt. It is also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly different from the background for this simple approach to work. The patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon closer examination. It was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues, or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) for sensing the individual. These data streams will be combined to produce a visual output using Jitter. The result aims to represent the other humans in a manner that emphasizes personality rather than emulating the technicality of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when the partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync, even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners, the blinking rhythms their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via light-dependent resistors. A small computer (laptop or small form factor PC) runs both Max and video conference software to display &amp;quot;the other&amp;quot;. The Max patch and &amp;quot;the other&amp;quot; are shown on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If the audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance in the process. The installation may be presented with both parts in the same room or with one part off-site.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Observing the equation system over time shows the expected behavior (when implemented in Python).&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Max implementation of the Kuramoto-Daido model&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125714</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125714"/>
		<updated>2021-06-17T11:37:19Z</updated>

		<summary type="html">&lt;p&gt;Js: /* final project: ISS overhead */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
{{Listen&lt;br /&gt;
|title=&amp;quot;ISS announcement&amp;quot;&lt;br /&gt;
|filename=people_in_space.mp3}}&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and can thus look down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set in relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered. If a speaker is used, the sound synthesis can encode different parameters.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on the ground) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math it is assumed that the world is flat, and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too large an error.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally, ignoring material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
&lt;br /&gt;
= process: other people's work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please create your own first patch and upload it to the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin theremin] to get to know the Max workflow. Move the mouse cursor around to control pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little artwork is a first attempt. It is also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly&lt;br /&gt;
different from the background in order for this simple approach to work. This patch yields both a bounding box and its center point to process further.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon closer examination. It was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift this perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) to sense the individual. These data streams will be combined to produce a visual output using Jitter. The result aims to represent the other humans in a way that emphasizes personality rather than emulating the technicalities of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when the partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners; the blinking rhythms represent their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small form factor PC) runs both Max and a video conference software to display &amp;quot;the other&amp;quot;. The Max patch and the &amp;quot;other&amp;quot; are displayed on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If the audio level exceeds a given threshold, the blinking of one LED is triggered / reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site somewhere else.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Observing the equation system over time shows the expected behavior (when implemented in Python).&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
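The Python experiment above can be sketched as a simple Euler integration of two Kuramoto-coupled phase oscillators; the frequencies, coupling constant and step size below are illustrative assumptions, not the values used for the plot.&lt;br /&gt;

```python
import math

def simulate(omega1=1.0, omega2=1.2, K=0.8, dt=0.01, steps=10000):
    """Euler-integrate two Kuramoto-coupled phase oscillators.

    d(theta_i)/dt = omega_i + K * sin(theta_j - theta_i)

    Returns the final phase difference wrapped to [-pi, pi].
    """
    t1, t2 = 0.0, math.pi  # start fully out of phase
    for _ in range(steps):
        d = t2 - t1
        t1 += (omega1 + K * math.sin(d)) * dt
        t2 += (omega2 - K * math.sin(d)) * dt
    # wrap the phase difference into [-pi, pi]
    return (t2 - t1 + math.pi) % (2 * math.pi) - math.pi
```

For a frequency mismatch smaller than 2K the phase difference settles to a constant with sin(diff) = (omega2 - omega1) / (2K), which is the phase locking the installation relies on.&lt;br /&gt;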
&lt;br /&gt;
Max implementation of the Kuramoto-Daido model&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
*  [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.)]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125713</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125713"/>
		<updated>2021-06-17T11:34:48Z</updated>

		<summary type="html">&lt;p&gt;Js: /* final project: ISS overhead */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:people_in_space.mp3]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and thus can look down on a vast planet. Humans, on the other hand, rarely look up and grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set into relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered. If a speaker is used, the sound synthesis can encode different parameters.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
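As a sketch outside Max, the flat-earth calculation can look like this in Python; the 100 km overhead threshold is an illustrative assumption, and the response fields follow the open-notify documentation linked above.&lt;br /&gt;

```python
import json
import math
import urllib.request

WEIMAR = (50.979444, 11.329722)  # city center, as in the text above
KM_PER_DEG_LAT = 111.0           # rough flat-earth scale per degree of latitude

def flat_distance_km(lat1, lon1, lat2, lon2):
    """Euclidean distance on a locally flat earth (fine for small distances)."""
    dy = (lat2 - lat1) * KM_PER_DEG_LAT
    # a degree of longitude shrinks with latitude
    dx = (lon2 - lon1) * KM_PER_DEG_LAT * math.cos(math.radians((lat1 + lat2) / 2))
    return math.hypot(dx, dy)

def iss_overhead(threshold_km=100.0):
    """True if the ISS ground track is within threshold_km of Weimar."""
    with urllib.request.urlopen("http://api.open-notify.org/iss-now.json") as r:
        data = json.load(r)
    pos = data["iss_position"]
    d = flat_distance_km(WEIMAR[0], WEIMAR[1],
                         float(pos["latitude"]), float(pos["longitude"]))
    return d < threshold_km
```

The same arithmetic can be done with expr objects inside Max; the sketch only illustrates the math behind the bang.&lt;br /&gt;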
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally ignoring the material reality is feasible on very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
&lt;br /&gt;
= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please do your own first patch and upload onto the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin Theremin] to get to know the workflow in Max. Move your mouse cursor around to control the pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little artwork is a first attempt. It is also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
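The peakamp~ / past chain can be mimicked in plain Python to illustrate the trigger logic; the Past class below is only a rough analogue of the Max past object, and the threshold value is an illustrative assumption.&lt;br /&gt;

```python
class Past:
    """Rough analogue of Max's [past]: bang once when the input rises
    above the threshold, then re-arm only after it falls back below."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.armed = True

    def update(self, value):
        if self.armed and value > self.threshold:
            self.armed = False
            return True   # -> bang: grab the selfie
        if value <= self.threshold:
            self.armed = True
        return False
```

Feeding the stream of peak amplitudes into update() yields one bang per scream rather than one per loud measurement interval.&lt;br /&gt;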
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly&lt;br /&gt;
different from the background in order for this simple approach to work. This patch yields both a bounding box and its center point to process further.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
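The math mentioned above is mostly midpoint and scaling arithmetic; here is a minimal Python sketch, assuming the min/max coordinates reported by jit.findbounds and an illustrative 320x240 frame size (not taken from the patch).&lt;br /&gt;

```python
def bbox_center(min_x, min_y, max_x, max_y):
    """Midpoint of the bounding box reported by jit.findbounds."""
    return ((min_x + max_x) / 2.0, (min_y + max_y) / 2.0)

def normalize(cx, cy, width=320, height=240):
    """Map pixel coordinates to 0.0 .. 1.0 for further processing."""
    return (cx / float(width), cy / float(height))

def color_range(rgba, tol=0.1):
    """Lower/upper RGBA bounds around a picked colour, clamped to
    0.0 .. 1.0 (the kind of list arithmetic vexpr is used for)."""
    lo = [max(0.0, c - tol) for c in rgba]
    hi = [min(1.0, c + tol) for c in rgba]
    return lo, hi
```

The normalized center point is presumably what the video theremin below maps to pitch and volume.&lt;br /&gt;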
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon closer examination. It was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift this perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) to sense the individual. These data streams will be combined to produce a visual output using Jitter. The result aims to represent the other humans in a way that emphasizes personality rather than emulating the technicalities of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when the partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners; the blinking rhythms represent their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small form factor PC) runs both Max and a video conference software to display &amp;quot;the other&amp;quot;. The Max patch and the &amp;quot;other&amp;quot; are displayed on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If the audio level exceeds a given threshold, the blinking of one LED is triggered / reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site somewhere else.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Observing the equation system over time shows the expected behavior (when implemented in Python).&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Max implementation of the Kuramoto-Daido model&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
*  [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.)]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:People_in_space.mp3&amp;diff=125712</id>
		<title>File:People in space.mp3</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:People_in_space.mp3&amp;diff=125712"/>
		<updated>2021-06-17T11:34:28Z</updated>

		<summary type="html">&lt;p&gt;Js: File uploaded with MsUpload&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:People_in_space_graham.wav&amp;diff=125711</id>
		<title>File:People in space graham.wav</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:People_in_space_graham.wav&amp;diff=125711"/>
		<updated>2021-06-17T11:34:08Z</updated>

		<summary type="html">&lt;p&gt;Js: File uploaded with MsUpload&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125710</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125710"/>
		<updated>2021-06-17T11:28:55Z</updated>

		<summary type="html">&lt;p&gt;Js: /* final project: ISS overhead */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
[[File:ISS_overhead.png|400px]]&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and thus can look down on a vast planet. Humans, on the other hand, rarely look up and grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set into relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered. If a speaker is used, the sound synthesis can encode different parameters.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally ignoring the material reality is feasible on very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
&lt;br /&gt;
= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please do your own first patch and upload onto the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin Theremin] to get to know the workflow in Max. Move your mouse cursor around to control the pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little artwork is a first attempt. It is also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly different from the background in order for this simple approach to work. This patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
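As a sketch of that math: jit.findbounds reports the matched region as a minimum and a maximum cell coordinate, from which the bounding box and its center point follow directly. In illustrative Python (the function name is made up, mirroring the patch logic):&lt;br /&gt;

```python
# Bounding box and center point from the two corner coordinates
# that jit.findbounds reports (min and max cell positions).
def box_and_center(min_x, min_y, max_x, max_y):
    width = max_x - min_x
    height = max_y - min_y
    # the center is the midpoint of the two corners
    center_x = (min_x + max_x) / 2
    center_y = (min_y + max_y) / 2
    return (width, height), (center_x, center_y)
```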
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values within a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon more detailed examination. It therefore was not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) for sensing the individual. These data streams will be combined to produce a visual output using Jitter. The result aims to represent the other humans in a manner that emphasizes personality rather than emulating the technicality of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives don&#039;t take place in the same environment anymore and are open to different influences. It takes effort to stay in sync even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners, the blinking rhythm their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small form factor PC) runs both Max and video conference software to display &amp;quot;the other&amp;quot;. The Max patch and the &amp;quot;other&amp;quot; are displayed on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If this audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site somewhere else. &lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Observing the equation system over time shows the expected behavior (when implemented in Python).&lt;br /&gt;
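For reference, a minimal Python sketch of the two-oscillator case (the parameter values are illustrative, not the ones used in the installation):&lt;br /&gt;

```python
import math

# Two coupled phase oscillators (Kuramoto model with N = 2):
#   d(theta_i)/dt = omega_i + K * sin(theta_j - theta_i)
# omega1, omega2 are the natural frequencies, K the coupling strength.
def simulate(omega1=1.0, omega2=1.3, K=0.8, dt=0.01, steps=5000):
    theta1, theta2 = 0.0, math.pi / 2  # arbitrary initial phases
    for _ in range(steps):
        d1 = omega1 + K * math.sin(theta2 - theta1)
        d2 = omega2 + K * math.sin(theta1 - theta2)
        theta1 += d1 * dt
        theta2 += d2 * dt
    # phase difference, wrapped into the interval [-pi, pi)
    return (theta2 - theta1 + math.pi) % (2 * math.pi) - math.pi
```

With these values the phase difference settles at a constant offset (phase locking); a sufficiently large frequency mismatch relative to the coupling makes the oscillators drift apart instead.&lt;br /&gt;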
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Max implementation of the Kuramoto-Daido model&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:ISS_overhead.png&amp;diff=125709</id>
		<title>File:ISS overhead.png</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=File:ISS_overhead.png&amp;diff=125709"/>
		<updated>2021-06-17T11:28:41Z</updated>

		<summary type="html">&lt;p&gt;Js: File uploaded with MsUpload&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;File uploaded with MsUpload&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125704</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125704"/>
		<updated>2021-06-17T11:14:47Z</updated>

		<summary type="html">&lt;p&gt;Js: /* implementation notes */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and thus can look down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set into relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered. If a speaker is used, the sound synthesis can encode different parameters.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
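The flat-earth shortcut can be sketched in a few lines of Python (the radius mentioned in the comment is an assumption for illustration, not a value taken from the patch):&lt;br /&gt;

```python
import math

# Flat-earth distance from the ISS ground point to Weimar's city center.
# One degree of latitude is roughly 111 km; a degree of longitude
# shrinks with the cosine of the latitude.
WEIMAR = (50.979444, 11.329722)
KM_PER_DEG = 111.0  # rough mean value

def ground_distance_km(lat, lon, ref=WEIMAR):
    dlat_km = (lat - ref[0]) * KM_PER_DEG
    dlon_km = (lon - ref[1]) * KM_PER_DEG * math.cos(math.radians(ref[0]))
    return math.hypot(dlat_km, dlon_km)

# the notifier then bangs whenever this distance stays below some
# chosen "overhead" radius, e.g. 100 km around the exhibition site
```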
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally ignoring the material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
&lt;br /&gt;
= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please do your own first patch and upload onto the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin Theremin] to get to know the workflow with Max. Move your mouse cursor around to control pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little artwork is a first attempt. It is also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly different from the background in order for this simple approach to work. This patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
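As a sketch of that math: jit.findbounds reports the matched region as a minimum and a maximum cell coordinate, from which the bounding box and its center point follow directly. In illustrative Python (the function name is made up, mirroring the patch logic):&lt;br /&gt;

```python
# Bounding box and center point from the two corner coordinates
# that jit.findbounds reports (min and max cell positions).
def box_and_center(min_x, min_y, max_x, max_y):
    width = max_x - min_x
    height = max_y - min_y
    # the center is the midpoint of the two corners
    center_x = (min_x + max_x) / 2
    center_y = (min_y + max_y) / 2
    return (width, height), (center_x, center_y)
```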
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values within a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon more detailed examination. It therefore was not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) for sensing the individual. These data streams will be combined to produce a visual output using Jitter. The result aims to represent the other humans in a manner that emphasizes personality rather than emulating the technicality of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives don&#039;t take place in the same environment anymore and are open to different influences. It takes effort to stay in sync even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners, the blinking rhythm their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small form factor PC) runs both Max and video conference software to display &amp;quot;the other&amp;quot;. The Max patch and the &amp;quot;other&amp;quot; are displayed on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If this audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site somewhere else. &lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Observing the equation system over time shows the expected behavior (when implemented in Python).&lt;br /&gt;
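For reference, a minimal Python sketch of the two-oscillator case (the parameter values are illustrative, not the ones used in the installation):&lt;br /&gt;

```python
import math

# Two coupled phase oscillators (Kuramoto model with N = 2):
#   d(theta_i)/dt = omega_i + K * sin(theta_j - theta_i)
# omega1, omega2 are the natural frequencies, K the coupling strength.
def simulate(omega1=1.0, omega2=1.3, K=0.8, dt=0.01, steps=5000):
    theta1, theta2 = 0.0, math.pi / 2  # arbitrary initial phases
    for _ in range(steps):
        d1 = omega1 + K * math.sin(theta2 - theta1)
        d2 = omega2 + K * math.sin(theta1 - theta2)
        theta1 += d1 * dt
        theta2 += d2 * dt
    # phase difference, wrapped into the interval [-pi, pi)
    return (theta2 - theta1 + math.pi) % (2 * math.pi) - math.pi
```

With these values the phase difference settles at a constant offset (phase locking); a sufficiently large frequency mismatch relative to the coupling makes the oscillators drift apart instead.&lt;br /&gt;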
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Max implementation of the Kuramoto-Daido model&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125703</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125703"/>
		<updated>2021-06-17T11:08:57Z</updated>

		<summary type="html">&lt;p&gt;Js: /* final project: ISS overhead */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and thus can look down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set into relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered. If a speaker is used, the sound synthesis can encode different parameters.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
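The flat-earth shortcut can be sketched in a few lines of Python (the radius mentioned in the comment is an assumption for illustration, not a value taken from the patch):&lt;br /&gt;

```python
import math

# Flat-earth distance from the ISS ground point to Weimar's city center.
# One degree of latitude is roughly 111 km; a degree of longitude
# shrinks with the cosine of the latitude.
WEIMAR = (50.979444, 11.329722)
KM_PER_DEG = 111.0  # rough mean value

def ground_distance_km(lat, lon, ref=WEIMAR):
    dlat_km = (lat - ref[0]) * KM_PER_DEG
    dlon_km = (lon - ref[1]) * KM_PER_DEG * math.cos(math.radians(ref[0]))
    return math.hypot(dlat_km, dlon_km)

# the notifier then bangs whenever this distance stays below some
# chosen "overhead" radius, e.g. 100 km around the exhibition site
```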
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally ignoring the material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Todo:&lt;br /&gt;
* extend to use different satellites&lt;br /&gt;
* leverage their orbital data&lt;br /&gt;
* attach a loudspeaker for announcements&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
&lt;br /&gt;
= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please do your own first patch and upload onto the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin Theremin] to get to know the workflow with Max. Move your mouse cursor around to control pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little artwork is a first attempt. It is also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly different from the background in order for this simple approach to work. This patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
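As a sketch of that math: jit.findbounds reports the matched region as a minimum and a maximum cell coordinate, from which the bounding box and its center point follow directly. In illustrative Python (the function name is made up, mirroring the patch logic):&lt;br /&gt;

```python
# Bounding box and center point from the two corner coordinates
# that jit.findbounds reports (min and max cell positions).
def box_and_center(min_x, min_y, max_x, max_y):
    width = max_x - min_x
    height = max_y - min_y
    # the center is the midpoint of the two corners
    center_x = (min_x + max_x) / 2
    center_y = (min_y + max_y) / 2
    return (width, height), (center_x, center_y)
```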
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon more detailed examination and was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues, or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) for sensing the individual. These data streams will be combined to produce a visual output using Jitter. The result aims to represent the other humans in a manner that emphasizes personality rather than emulating the technicalities of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync, even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners, the blinking rhythm their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small form factor PC) runs both Max and video conference software to display &amp;quot;the other&amp;quot;. The Max patch and the &amp;quot;other&amp;quot; are displayed on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If this audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site somewhere else.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Observing the equation system over time shows the expected behavior (when implemented in Python).&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
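A minimal Python sketch of such a check: two Kuramoto-coupled phase oscillators, integrated with a plain Euler step, whose phase difference settles at a constant offset once the coupling is strong enough. The frequencies, coupling constant, and step size are illustrative values, not the ones used in the installation.

```python
# Two-oscillator Kuramoto model, integrated with a simple Euler step.
# Frequencies w1, w2, coupling k and step size dt are illustrative values.
import math

def kuramoto_pair(w1, w2, k, dt=0.01, steps=5000):
    """Return the final phase difference (mod 2*pi) of two coupled oscillators."""
    th1, th2 = 0.0, 2.0  # arbitrary initial phases
    for _ in range(steps):
        d1 = w1 + k * math.sin(th2 - th1)
        d2 = w2 + k * math.sin(th1 - th2)
        th1 += dt * d1
        th2 += dt * d2
    return (th2 - th1) % (2 * math.pi)

# Coupling above the locking threshold |w2 - w1| / 2, so the phases lock
# at a constant offset of asin((w2 - w1) / (2 * k)):
diff = kuramoto_pair(w1=1.0, w2=1.2, k=0.5)
```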
&lt;br /&gt;
Max implementation of the Kuramoto–Daido model:&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
*  [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.)]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125702</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125702"/>
		<updated>2021-06-17T11:08:29Z</updated>

		<summary type="html">&lt;p&gt;Js: /* proof of concept: 2021-05-19 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and can thus look down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set in relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered. If a speaker is used, the sound synthesis can encode different parameters.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
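The distance check can be sketched in a few lines of Python. The JSON shape mirrors what open-notify.org returns (latitude/longitude as strings); the 10-degree radius for "overhead" is a made-up threshold, not the one used in the patch.

```python
# Flat-earth "overhead" test as described above: Euclidean distance in
# degrees between the ISS ground point and Weimar's city centre.
# The radius_deg threshold is an illustrative value.
import json
import math

WEIMAR = (50.979444, 11.329722)  # lat, lon of Weimar's city centre

def iss_overhead(payload, radius_deg=10.0):
    """Parse an open-notify-style JSON payload and apply the distance test."""
    pos = json.loads(payload)["iss_position"]
    lat = float(pos["latitude"])
    lon = float(pos["longitude"])
    distance = math.hypot(lat - WEIMAR[0], lon - WEIMAR[1])
    return distance <= radius_deg

# Payloads in the shape returned by the ISS-Location-Now endpoint:
nearby = iss_overhead('{"iss_position": {"latitude": "50.5", "longitude": "12.0"}}')
faraway = iss_overhead('{"iss_position": {"latitude": "-30.0", "longitude": "100.0"}}')
```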
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally, ignoring material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Todo:&lt;br /&gt;
* extend to use different satellites&lt;br /&gt;
* leverage their orbital data&lt;br /&gt;
* attach a loudspeaker for announcements&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please do your own first patch and upload onto the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin Theremin] to get to know the workflow in Max. Move your mouse cursor around to control the pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little artwork is a first attempt, and also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
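The trigger chain above (peakamp~ feeding past) can be sketched as a tiny state machine: fire once when the peak level first rises above the threshold, and re-arm only after it falls below again, so one scream yields one photo. This is a sketch of the behaviour the patch relies on, not the patch itself; the threshold value is illustrative.

```python
# Threshold trigger in the spirit of [peakamp~] feeding [past]:
# fire once on the rising edge past the threshold, re-arm on the way down.
# The threshold of 0.8 is an illustrative value.

class ScreamTrigger:
    def __init__(self, threshold=0.8):
        self.threshold = threshold
        self.armed = True

    def feed(self, peak):
        """Return True exactly once per excursion above the threshold."""
        if self.armed and peak > self.threshold:
            self.armed = False
            return True   # bang: grab the selfie
        if peak < self.threshold:
            self.armed = True  # level dropped again: re-arm
        return False

trigger = ScreamTrigger()
bangs = [trigger.feed(p) for p in (0.1, 0.9, 0.95, 0.3, 0.85)]
# bangs -> [False, True, False, False, True]
```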
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in a number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly different from the background for this simple approach to work. This patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon more detailed examination and was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues, or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) for sensing the individual. These data streams will be combined to produce a visual output using Jitter. The result aims to represent the other humans in a manner that emphasizes personality rather than emulating the technicalities of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync, even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners, the blinking rhythm their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small form factor PC) runs both Max and video conference software to display &amp;quot;the other&amp;quot;. The Max patch and the &amp;quot;other&amp;quot; are displayed on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If this audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site somewhere else.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Observing the equation system over time shows the expected behavior (when implemented in Python).&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Max implementation of the Kuramoto–Daido model:&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
*  [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.)]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125148</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125148"/>
		<updated>2021-06-10T12:46:10Z</updated>

		<summary type="html">&lt;p&gt;Js: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and can thus look down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set in relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered. If a speaker is used, the sound synthesis can encode different parameters.&lt;br /&gt;
&lt;br /&gt;
== proof of concept: 2021-05-19 ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally, ignoring material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Todo:&lt;br /&gt;
* extend to use different satellites&lt;br /&gt;
* leverage their orbital data&lt;br /&gt;
* attach a loudspeaker for announcements&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please do your own first patch and upload onto the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin Theremin] to get to know the workflow in Max. Move your mouse cursor around to control the pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little artwork is a first attempt, and also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in a number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly different from the background for this simple approach to work. This patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon more detailed examination and was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues, or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) for sensing the individual. These data streams will be combined to produce a visual output using Jitter. The result aims to represent the other humans in a manner that emphasizes personality rather than emulating the technicalities of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync, even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners; the blinking rhythm represents their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small form factor PC) runs both Max and video conferencing software to display &amp;quot;the other&amp;quot;. The Max patch and &amp;quot;the other&amp;quot; are displayed on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If this audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site somewhere else.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Observing the equation system over time shows the expected behavior (when implemented in Python).&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
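For reference, the two-oscillator Kuramoto–Daido dynamics can be sketched in a few lines of Python. The coupling strength and natural frequencies below are illustrative placeholders, not the values from the patch:

```python
import math

def phase_difference(w1, w2, K, dt=0.01, steps=5000):
    """Euler-integrate two Kuramoto-coupled phase oscillators:
    d(theta_i)/dt = w_i + K * sin(theta_j - theta_i)
    and return the final phase difference, wrapped to (-pi, pi]."""
    th1, th2 = 0.0, 2.0  # arbitrary initial phases
    for _ in range(steps):
        d1 = w1 + K * math.sin(th2 - th1)
        d2 = w2 + K * math.sin(th1 - th2)
        th1 += d1 * dt
        th2 += d2 * dt
    return (th2 - th1 + math.pi) % (2 * math.pi) - math.pi

# The oscillators lock whenever 2K exceeds the frequency detuning
# |w1 - w2|; the locked phase offset is asin((w2 - w1) / (2 * K)).
print(phase_difference(w1=1.0, w2=1.2, K=0.5))  # settles near asin(0.2)
```

With weaker coupling (2K below the detuning) the phases keep drifting apart instead of locking, which is the failure mode observed in the Max implementations.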
&lt;br /&gt;
Max implementation of the Kuramoto-Daido model&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). In H. Araki (ed.), Lecture Notes in Physics 39: International Symposium on Mathematical Problems in Theoretical Physics. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125145</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125145"/>
		<updated>2021-06-10T12:45:23Z</updated>

		<summary type="html">&lt;p&gt;Js: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and can thus look down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set in relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered. If a speaker is used, the sound synthesis can encode different parameters.&lt;br /&gt;
&lt;br /&gt;
== proof of concept: 2021-05-19 ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on the ground) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
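For illustration, the flat-earth distance check can be sketched in Python. The function names and the 100 km threshold are hypothetical; the ISS coordinates would come from the open-notify.org JSON response:

```python
import math

WEIMAR = (50.979444, 11.329722)  # city center, as used in the patch

def flat_distance_km(lat1, lon1, lat2, lon2):
    """Flat-earth (equirectangular) distance in km. Reasonable for small
    separations, where ignoring the earth's curvature and WGS 84 is fine."""
    km_per_deg = 111.32  # approximate length of one degree of latitude
    dlat = (lat2 - lat1) * km_per_deg
    # a degree of longitude shrinks with the cosine of the latitude
    dlon = (lon2 - lon1) * km_per_deg * math.cos(math.radians((lat1 + lat2) / 2))
    return math.hypot(dlat, dlon)

def iss_overhead(iss_lat, iss_lon, threshold_km=100.0):
    """True if the ISS ground point is within threshold_km of Weimar."""
    return threshold_km > flat_distance_km(WEIMAR[0], WEIMAR[1], iss_lat, iss_lon)

print(iss_overhead(50.98, 11.33))   # directly above Weimar: True
print(iss_overhead(0.0, 0.0))       # over the Gulf of Guinea: False
```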
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally ignoring material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Todo:&lt;br /&gt;
* extend to use different satellites&lt;br /&gt;
* leverage their orbital data&lt;br /&gt;
* attach a loudspeaker for announcements&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
__FORCETOC__&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Create your own first patch and upload it to the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin theremin] to get to know the Max workflow. Move the mouse cursor around to control pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little piece of art is a first attempt. It is also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly different from the background for this simple approach to work. This patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
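That math is simple enough to sketch outside Max. Below is a toy Python version of the same idea (illustrative names, a nested list standing in for the jit matrix; this is not the patch itself):

```python
def find_bounds(image, target, tol=0.1):
    """Bounding box ((min_x, min_y), (max_x, max_y)) of all pixels whose
    R, G and B channels each lie within tol of the picked target colour,
    or None if nothing matches. Pixels are (r, g, b) floats in 0.0..1.0."""
    hits = [(x, y)
            for y, row in enumerate(image)
            for x, px in enumerate(row)
            if all(tol >= abs(c - t) for c, t in zip(px, target))]
    if not hits:
        return None
    xs, ys = zip(*hits)
    return (min(xs), min(ys)), (max(xs), max(ys))

def center(bounds):
    """Center point of the bounding box, the value handed on for tracking."""
    (x0, y0), (x1, y1) = bounds
    return ((x0 + x1) / 2, (y0 + y1) / 2)

# toy 3x3 frame: a red object against a black background
red, black = (1.0, 0.0, 0.0), (0.0, 0.0, 0.0)
frame = [[black, red, red],
         [black, red, red],
         [black, black, black]]
print(find_bounds(frame, red))           # ((1, 0), (2, 1))
print(center(find_bounds(frame, red)))   # (1.5, 0.5)
```

In the patch, jit.findbounds performs this range search directly on the video matrix.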
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon closer examination. It was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) for sensing the individual. These data streams will be combined to produce a visual output using Jitter. The result aims to represent the other humans in a manner that emphasizes personality rather than emulating the technicality of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync, even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners; the blinking rhythm represents their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small form factor PC) runs both Max and video conferencing software to display &amp;quot;the other&amp;quot;. The Max patch and &amp;quot;the other&amp;quot; are displayed on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If this audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site somewhere else.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Observing the equation system over time shows the expected behavior (when implemented in Python).&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
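For reference, the two-oscillator Kuramoto–Daido dynamics can be sketched in a few lines of Python. The coupling strength and natural frequencies below are illustrative placeholders, not the values from the patch:

```python
import math

def phase_difference(w1, w2, K, dt=0.01, steps=5000):
    """Euler-integrate two Kuramoto-coupled phase oscillators:
    d(theta_i)/dt = w_i + K * sin(theta_j - theta_i)
    and return the final phase difference, wrapped to (-pi, pi]."""
    th1, th2 = 0.0, 2.0  # arbitrary initial phases
    for _ in range(steps):
        d1 = w1 + K * math.sin(th2 - th1)
        d2 = w2 + K * math.sin(th1 - th2)
        th1 += d1 * dt
        th2 += d2 * dt
    return (th2 - th1 + math.pi) % (2 * math.pi) - math.pi

# The oscillators lock whenever 2K exceeds the frequency detuning
# |w1 - w2|; the locked phase offset is asin((w2 - w1) / (2 * K)).
print(phase_difference(w1=1.0, w2=1.2, K=0.5))  # settles near asin(0.2)
```

With weaker coupling (2K below the detuning) the phases keep drifting apart instead of locking, which is the failure mode observed in the Max implementations.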
&lt;br /&gt;
Max implementation of the Kuramoto-Daido model&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). In H. Araki (ed.), Lecture Notes in Physics 39: International Symposium on Mathematical Problems in Theoretical Physics. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125129</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125129"/>
		<updated>2021-06-10T12:41:09Z</updated>

		<summary type="html">&lt;p&gt;Js: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= final project 3: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and can thus look down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set in relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered. If a speaker is used, the sound synthesis can encode different parameters.&lt;br /&gt;
&lt;br /&gt;
== proof of concept: 2021-05-19 ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on the ground) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
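For illustration, the flat-earth distance check can be sketched in Python. The function names and the 100 km threshold are hypothetical; the ISS coordinates would come from the open-notify.org JSON response:

```python
import math

WEIMAR = (50.979444, 11.329722)  # city center, as used in the patch

def flat_distance_km(lat1, lon1, lat2, lon2):
    """Flat-earth (equirectangular) distance in km. Reasonable for small
    separations, where ignoring the earth's curvature and WGS 84 is fine."""
    km_per_deg = 111.32  # approximate length of one degree of latitude
    dlat = (lat2 - lat1) * km_per_deg
    # a degree of longitude shrinks with the cosine of the latitude
    dlon = (lon2 - lon1) * km_per_deg * math.cos(math.radians((lat1 + lat2) / 2))
    return math.hypot(dlat, dlon)

def iss_overhead(iss_lat, iss_lon, threshold_km=100.0):
    """True if the ISS ground point is within threshold_km of Weimar."""
    return threshold_km > flat_distance_km(WEIMAR[0], WEIMAR[1], iss_lat, iss_lon)

print(iss_overhead(50.98, 11.33))   # directly above Weimar: True
print(iss_overhead(0.0, 0.0))       # over the Gulf of Guinea: False
```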
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally ignoring material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Todo:&lt;br /&gt;
* extend to use different satellites&lt;br /&gt;
* leverage their orbital data&lt;br /&gt;
* attach a loudspeaker for announcements&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;br /&gt;
&lt;br /&gt;
= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Create your own first patch and upload it to the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin theremin] to get to know the Max workflow. Move the mouse cursor around to control pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little piece of art is a first attempt. It is also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly different from the background for this simple approach to work. This patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
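That math is simple enough to sketch outside Max. Below is a toy Python version of the same idea (illustrative names, a nested list standing in for the jit matrix; this is not the patch itself):

```python
def find_bounds(image, target, tol=0.1):
    """Bounding box ((min_x, min_y), (max_x, max_y)) of all pixels whose
    R, G and B channels each lie within tol of the picked target colour,
    or None if nothing matches. Pixels are (r, g, b) floats in 0.0..1.0."""
    hits = [(x, y)
            for y, row in enumerate(image)
            for x, px in enumerate(row)
            if all(tol >= abs(c - t) for c, t in zip(px, target))]
    if not hits:
        return None
    xs, ys = zip(*hits)
    return (min(xs), min(ys)), (max(xs), max(ys))

def center(bounds):
    """Center point of the bounding box, the value handed on for tracking."""
    (x0, y0), (x1, y1) = bounds
    return ((x0 + x1) / 2, (y0 + y1) / 2)

# toy 3x3 frame: a red object against a black background
red, black = (1.0, 0.0, 0.0), (0.0, 0.0, 0.0)
frame = [[black, red, red],
         [black, red, red],
         [black, black, black]]
print(find_bounds(frame, red))           # ((1, 0), (2, 1))
print(center(find_bounds(frame, red)))   # (1.5, 0.5)
```

In the patch, jit.findbounds performs this range search directly on the video matrix.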
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search for values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon closer examination. It was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) for sensing the individual. These data streams will be combined to produce a visual output using Jitter. The result aims to represent the other humans in a manner that emphasizes personality rather than emulating the technicality of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync, even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners; the blinking rhythm represents their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small form factor PC) runs both Max and video conferencing software to display &amp;quot;the other&amp;quot;. The Max patch and &amp;quot;the other&amp;quot; are displayed on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If this audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site somewhere else.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Observing the equation system over time shows the expected behavior (when implemented in Python).&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
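For reference, the two-oscillator Kuramoto–Daido dynamics can be sketched in a few lines of Python. The coupling strength and natural frequencies below are illustrative placeholders, not the values from the patch:

```python
import math

def phase_difference(w1, w2, K, dt=0.01, steps=5000):
    """Euler-integrate two Kuramoto-coupled phase oscillators:
    d(theta_i)/dt = w_i + K * sin(theta_j - theta_i)
    and return the final phase difference, wrapped to (-pi, pi]."""
    th1, th2 = 0.0, 2.0  # arbitrary initial phases
    for _ in range(steps):
        d1 = w1 + K * math.sin(th2 - th1)
        d2 = w2 + K * math.sin(th1 - th2)
        th1 += d1 * dt
        th2 += d2 * dt
    return (th2 - th1 + math.pi) % (2 * math.pi) - math.pi

# The oscillators lock whenever 2K exceeds the frequency detuning
# |w1 - w2|; the locked phase offset is asin((w2 - w1) / (2 * K)).
print(phase_difference(w1=1.0, w2=1.2, K=0.5))  # settles near asin(0.2)
```

With weaker coupling (2K below the detuning) the phases keep drifting apart instead of locking, which is the failure mode observed in the Max implementations.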
&lt;br /&gt;
Max implementation of the Kuramoto-Daido model&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125127</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125127"/>
		<updated>2021-06-10T12:40:52Z</updated>

		<summary type="html">&lt;p&gt;Js: /* final project 3: ISS overhead */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please do your own first patch and upload onto the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin Theremin] to get to know the workflow in Max. Move your mouse cursor around to control pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little artwork is a first attempt, and also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number exceeds a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for the object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object&#039;s color needs to be distinctly different from the background for this simple approach to work. The patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - find the region of cells whose values fall within a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
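The &quot;rest is math&quot; mentioned above boils down to two small computations, sketched here in Python. The tolerance value is an assumed example, not a value read from the patch.

```python
def colour_range(rgb, tolerance=0.1):
    # Picked colour plus/minus a tolerance, clamped to [0.0, 1.0]:
    # the value range handed to jit.findbounds (computed with vexpr
    # in the patch). rgb is the float tuple delivered by suckah.
    lo = [max(0.0, c - tolerance) for c in rgb]
    hi = [min(1.0, c + tolerance) for c in rgb]
    return lo, hi

def bounds_center(min_xy, max_xy):
    # Midpoint of the two bounding-box corners reported by
    # jit.findbounds, i.e. the point the tracker follows.
    return tuple((a + b) / 2.0 for a, b in zip(min_xy, max_xy))
```

For example, bounds_center((10, 20), (30, 60)) gives (20.0, 40.0), the tracking point to be drawn on jit.lcd.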
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon closer examination. It was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues, or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) to sense the individual. These data streams will be combined to produce visual output using Jitter. The result aims to represent the other humans in a way that emphasizes personality rather than emulating the technicalities of face-to-face exchange.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync, even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners; the blinking rhythm represents their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs, each driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small-form-factor PC) runs both Max and a video-conferencing application to display &amp;quot;the other&amp;quot;. The Max patch and &amp;quot;the other&amp;quot; are shown on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound; if the audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Implemented in Python and observed over time, the equation system shows the expected behavior.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Max implementation of the Kuramoto-Daido model&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125102</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125102"/>
		<updated>2021-06-10T12:25:48Z</updated>

		<summary type="html">&lt;p&gt;Js: /* project 1: i see you and you see me */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please do your own first patch and upload onto the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin Theremin] to get to know the workflow in Max. Move your mouse cursor around to control pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little artwork is a first attempt, and also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number exceeds a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for the object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object&#039;s color needs to be distinctly different from the background for this simple approach to work. The patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - find the region of cells whose values fall within a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon closer examination. It was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues, or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) to sense the individual. These data streams will be combined to produce visual output using Jitter. The result aims to represent the other humans in a way that emphasizes personality rather than emulating the technicalities of face-to-face exchange.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives no longer take place in the same environment and are open to different influences. It takes effort to stay in sync, even though the distance may be bridged by modern means of communication. Two blinking LEDs represent the partners; the blinking rhythm represents their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs, each driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small-form-factor PC) runs both Max and a video-conferencing application to display &amp;quot;the other&amp;quot;. The Max patch and &amp;quot;the other&amp;quot; are shown on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound; if the audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Implemented in Python and observed over time, the equation system shows the expected behavior.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Max implementation of the Kuramoto-Daido model&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= final project 3: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the Earth approximately every 90 minutes and thus looks down on the whole planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set in relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered. If a speaker is used, the sound synthesis can encode different parameters.&lt;br /&gt;
&lt;br /&gt;
== proof of concept: 2021-05-19 ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and parsing its [https://json.org/ JSON] data. The distance (on the ground) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, the world is assumed to be flat and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the Earth does not introduce too much error.&lt;br /&gt;
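The flat-world distance check can be sketched in Python as follows. Fetching and parsing the open-notify JSON is omitted here, and the 500 km overhead radius is an assumed placeholder rather than the threshold used in the patch.

```python
import math

EARTH_RADIUS_KM = 6371.0
WEIMAR = (50.979444, 11.329722)  # city center, as in the patch

def flat_distance_km(lat1, lon1, lat2, lon2):
    # Equirectangular ("flat world") approximation: project the
    # latitude/longitude differences onto a plane and take the
    # Euclidean distance. Accurate enough for small separations.
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2.0))
    return EARTH_RADIUS_KM * math.hypot(dlat, dlon)

def iss_overhead(iss_lat, iss_lon, radius_km=500.0):
    # True when the subsatellite point is within radius_km of Weimar.
    # The radius is an assumed example value.
    distance = flat_distance_km(WEIMAR[0], WEIMAR[1], iss_lat, iss_lon)
    if distance > radius_km:
        return False
    return True
```

On every poll, the latitude/longitude pair from the API response would be fed into iss_overhead to decide whether to bang the notification.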
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* occasionally, ignoring material reality is feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Todo:&lt;br /&gt;
* extend to use different satellites&lt;br /&gt;
* leverage their orbital data&lt;br /&gt;
* attach a loudspeaker for announcements&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125100</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125100"/>
		<updated>2021-06-10T12:25:30Z</updated>

		<summary type="html">&lt;p&gt;Js: /* final project 2: distance communication  */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please do your own first patch and upload onto the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin Theremin] to get to know the workflow in Max. Move your mouse cursor around to control pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little artwork is a first attempt, and also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number exceeds a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
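The trigger chain (peakamp~ feeding past) can be sketched outside Max as well. A minimal Python sketch of that logic; the 0.8 threshold is an assumed example value, not necessarily the one used in the patch:

```python
def scream_detector(peaks, threshold=0.8):
    """Mimic the peakamp~ -> past chain: emit a trigger (bang) the
    moment a peak value first rises above the threshold, then stay
    quiet until the level drops back below it (re-arm)."""
    armed = True
    triggers = []
    for i, peak in enumerate(peaks):
        if armed and peak > threshold:
            triggers.append(i)   # "take the selfie" here
            armed = False        # don't re-trigger while still loud
        elif peak <= threshold:
            armed = True         # level dropped: re-arm
    return triggers
```

This fires once per scream rather than once per loud frame, which is exactly why past is used instead of a plain comparison.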
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in a number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly different from the background in order for this simple approach to work. This patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
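"The rest is math" boils down to two small steps: jit.findbounds reports the min/max corners of the matching colour region, from which the centre point and a normalized 0..1 position follow. An illustrative Python sketch of that arithmetic (the function names are hypothetical, not Max objects):

```python
def bounds_center(min_x, min_y, max_x, max_y):
    """Center point of the bounding box reported by jit.findbounds."""
    return ((min_x + max_x) / 2.0, (min_y + max_y) / 2.0)

def normalize(point, width, height):
    """Map a pixel coordinate into the 0..1 range, e.g. to drive
    pitch and volume in a later patch."""
    x, y = point
    return (x / width, y / height)
```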
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project 1: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon more detailed examination and was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues, or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) for sensing the individual. These data streams will be combined to produce a visual output using Jitter. The result aims to represent the other humans in a manner that emphasizes personality rather than emulating the technicality of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= project: distance communication =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives don&#039;t take place in the same environment anymore and are open to different influences. It takes effort to stay in sync even though the distance may be bridged by modern communication means. Two blinking LEDs represent the partners; the blinking rhythm, their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small-form-factor PC) runs both Max and a video conferencing software to display &amp;quot;the other&amp;quot;. The Max patch and &amp;quot;the other&amp;quot; are displayed on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If this audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Observing the equation system over time shows the expected behavior (when implemented in Python).&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
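The same two-oscillator Kuramoto dynamics can be sketched in a few lines of Python (an illustrative Euler integration; K, w1, w2 are example values, not the parameters from the notes above):

```python
import math

def simulate(K=0.8, w1=1.0, w2=1.3, dt=0.01, steps=5000):
    """Euler integration of two Kuramoto-coupled phase oscillators:
    dtheta_i/dt = omega_i + K * sin(theta_j - theta_i).
    With |w1 - w2| < 2K the phases lock to a constant difference."""
    th1, th2 = 0.0, 2.0  # arbitrary initial phases
    for _ in range(steps):
        d1 = w1 + K * math.sin(th2 - th1)
        d2 = w2 + K * math.sin(th1 - th2)
        th1 += d1 * dt
        th2 += d2 * dt
    # phase difference wrapped into (-pi, pi]
    return math.atan2(math.sin(th1 - th2), math.cos(th1 - th2))
```

In the locked state the phase difference settles where sin(th1 - th2) = (w1 - w2) / (2K), i.e. the slower oscillator trails the faster one by a fixed angle.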
&lt;br /&gt;
Max implementation of the Kuramoto-Daido model&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= final project 3: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and can thus look down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set in relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered. If a speaker is used, the sound synthesis can encode different parameters.&lt;br /&gt;
&lt;br /&gt;
== proof of concept: 2021-05-19 ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, it is assumed that the world is flat, and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
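The flat-earth distance check can be sketched in Python as follows. The 500 km "overhead" radius is an assumed example value, not taken from the patch; a degree of longitude is scaled by cos(latitude):

```python
import math

# City centre of Weimar, as used in the patch
HOME_LAT, HOME_LON = 50.979444, 11.329722

def flat_earth_distance_km(lat, lon, home_lat=HOME_LAT, home_lon=HOME_LON):
    """Euclidean ground distance on a 'flat' earth: one degree of
    latitude is ~111.32 km, and a degree of longitude shrinks by
    cos(latitude). Good enough for small distances."""
    dlat_km = (lat - home_lat) * 111.32
    dlon_km = (lon - home_lon) * 111.32 * math.cos(math.radians(home_lat))
    return math.hypot(dlat_km, dlon_km)

def iss_overhead(lat, lon, threshold_km=500.0):
    """Treat the ISS as 'overhead' within the assumed radius."""
    return flat_earth_distance_km(lat, lon) <= threshold_km
```

As a sanity check, the nearby city of Erfurt (about 50.9787, 11.0328) comes out roughly 20 km away, which matches the real ground distance.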
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* ignoring material reality is occasionally feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Todo:&lt;br /&gt;
* extend to use different satellites&lt;br /&gt;
* leverage their orbital data&lt;br /&gt;
* attach a loudspeaker for announcements&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
	<entry>
		<id>https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125099</id>
		<title>GMU:Max and I, Max and Me/Johannes Schneemann</title>
		<link rel="alternate" type="text/html" href="https://www.uni-weimar.de/kunst-und-gestaltung/wiki/index.php?title=GMU:Max_and_I,_Max_and_Me/Johannes_Schneemann&amp;diff=125099"/>
		<updated>2021-06-10T12:25:08Z</updated>

		<summary type="html">&lt;p&gt;Js: /* final project 1: i see you and you see me  */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents my work and experiments done during the &amp;quot;Max and I, Max and Me&amp;quot; course. Feel free to copy, but give attribution where appropriate.&lt;br /&gt;
&lt;br /&gt;
= process: other people&#039;s work to investigate =&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/F.Z._Ayg%C3%BCler F.Z._Aygüler AN EYE TRACKING EXPERIMENT]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/WeiRu_Tai Does social distancing change the image of connections between friends, family and strangers? (includes patch walkthrough)]&lt;br /&gt;
* [https://www.uni-weimar.de/kunst-und-gestaltung/wiki/GMU:Max_and_the_World/Paulina_Magdalena_Chwala Paulina Magdalena Chwala (VR, OSC, Unity)]&lt;br /&gt;
&lt;br /&gt;
= project: theremin =&lt;br /&gt;
Assignment: Please do your own first patch and upload onto the wiki under your name&lt;br /&gt;
&lt;br /&gt;
I built a simple [https://en.wikipedia.org/wiki/Theremin theremin] to get to know the Max workflow. Move your mouse cursor around to control pitch and volume.&lt;br /&gt;
&lt;br /&gt;
[[File:simple_mouse_theremin_screenshot.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:simple_mouse_theremin.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/loadbang loadbang]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/mousestate mousestate]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/cycle~ cycle~]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/slider slider]&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezdac~ ezdac~]&lt;br /&gt;
&lt;br /&gt;
= process: retrieve images from a camera =&lt;br /&gt;
This simple patch allows you to grab (still) images from the camera.&lt;br /&gt;
&lt;br /&gt;
[[File:retrieve_images_from_camera.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_retrieve_images_from_camera_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab]&lt;br /&gt;
&lt;br /&gt;
= project: screamie =&lt;br /&gt;
If you scream loud enough ... you will get a selfie.&lt;br /&gt;
&lt;br /&gt;
[[File:screamie_photo.png|400px]]&lt;br /&gt;
&lt;br /&gt;
This little artwork is a first attempt. It is also a snarky comment on social media and selfie culture.&lt;br /&gt;
[[File:screamie.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-25_screamie_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/levelmeter~ levelmeter~] - to display the audio signal level&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/peakamp~ peakamp~] - find peak in signal amplitude (of the last second)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/past past] - send a bang when a number is larger than a given value&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get image from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.matrix jit.matrix] - to buffer the image and write to disk&lt;br /&gt;
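The trigger chain (peakamp~ feeding past) can be sketched outside Max as well. A minimal Python sketch of that logic; the 0.8 threshold is an assumed example value, not necessarily the one used in the patch:

```python
def scream_detector(peaks, threshold=0.8):
    """Mimic the peakamp~ -> past chain: emit a trigger (bang) the
    moment a peak value first rises above the threshold, then stay
    quiet until the level drops back below it (re-arm)."""
    armed = True
    triggers = []
    for i, peak in enumerate(peaks):
        if armed and peak > threshold:
            triggers.append(i)   # "take the selfie" here
            armed = False        # don't re-trigger while still loud
        elif peak <= threshold:
            armed = True         # level dropped: re-arm
    return triggers
```

This fires once per scream rather than once per loud frame, which is exactly why past is used instead of a plain comparison.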
&lt;br /&gt;
= process: record and save audio and video data =&lt;br /&gt;
This patch demonstrates how to get audio/video data and how to write it to disk.&lt;br /&gt;
[[File:audio_video_recorder.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_audio_video_recorder_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
Objects used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/ezadc~ ezadc~] - to get audio input&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/meter~ meter~] - to display (audio) signal levels&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - to get a steady pulse of bangs&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/counter counter] - to count bangs/toggles&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/select select] - to detect certain values in a number stream&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/delay delay] - to delay a bang (to get DSP processing a head start)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - to get images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.vcr jit.vcr] - to write audio and video data to disk&lt;br /&gt;
&lt;br /&gt;
= project: color based movement tracker =&lt;br /&gt;
This movement tracker uses color as a marker for an object to follow. Hold the object in front of the camera and click on it to select an RGB value to track. The object color needs to be distinctly different from the background in order for this simple approach to work. This patch yields both a bounding box and its center point for further processing.&lt;br /&gt;
&lt;br /&gt;
The main trick is to overlay the video display area (jit.pwindow) with a color picker (suckah) to get the color. A drop-down menu is added as a convenience to select the input camera (e.g. an external one vs. the built-in device).&lt;br /&gt;
The rest is math.&lt;br /&gt;
&lt;br /&gt;
[[File:color_based_movement_tracker.png|400px]]&lt;br /&gt;
&lt;br /&gt;
[[:File:2021-04-23_color_based_movement_tracker_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
The following objects are used:&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/metro metro] - steady pulse to grab images from camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.grab jit.grab] - actually grab an image from the camera upon a bang&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/prepend prepend] - put a stored message in front of any input message&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/umenu umenu] - a drop-down menu with selection&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.pwindow jit.pwindow] - a canvas to display the image from the camera&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/suckah suckah] - pick a colour from the underlying part of the screen (yields RGBA as 4-tuple of floats (0.0 .. 1.0))&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/swatch swatch] - display a colour (picked)&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/vexpr vexpr] - for (C-style) math operations with elements in lists&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.findbounds jit.findbounds] - search values in a given range in the video matrix&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/jit.lcd jit.lcd] - a canvas to draw on&lt;br /&gt;
* [https://docs.cycling74.com/max8/refpages/send send] &amp;amp; [https://docs.cycling74.com/max8/refpages/receive receive] - avoid cable mess&lt;br /&gt;
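"The rest is math" boils down to two small steps: jit.findbounds reports the min/max corners of the matching colour region, from which the centre point and a normalized 0..1 position follow. An illustrative Python sketch of that arithmetic (the function names are hypothetical, not Max objects):

```python
def bounds_center(min_x, min_y, max_x, max_y):
    """Center point of the bounding box reported by jit.findbounds."""
    return ((min_x + max_x) / 2.0, (min_y + max_y) / 2.0)

def normalize(point, width, height):
    """Map a pixel coordinate into the 0..1 range, e.g. to drive
    pitch and volume in a later patch."""
    x, y = point
    return (x / width, y / height)
```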
&lt;br /&gt;
= project: video theremin =&lt;br /&gt;
Assignment: Combine your first patch with sound/video conversion/analysis tools&lt;br /&gt;
&lt;br /&gt;
This patch combines the previously developed color-based movement tracker with the first theremin.&lt;br /&gt;
&lt;br /&gt;
[[File:video_theremin.png|400px]]&lt;br /&gt;
[[:File:2021-04-23_video_theremin_JS.maxpat]]&lt;br /&gt;
&lt;br /&gt;
= project 1: i see you and you see me =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;The project did not grow legs upon more detailed examination and was therefore not pursued further.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
* in collaboration with [[Luise Krumbein]]&lt;br /&gt;
&lt;br /&gt;
Video conference calls replaced in-person meetings on a large scale in a short amount of time. Most software tries to mimic a conversation but actually either forces people into the spotlight and into monologues, or off to the side into silence. One-to-one communication allows a user to concentrate on the (one) other, but group meetings blur the personalities and pigeonhole expression into a cookie-cutter shape of what is technically efficient and socially barely tolerable. The final project in this course seeks to explore different representations in synchronous online group communication.&lt;br /&gt;
&lt;br /&gt;
To shift the perception, data will be gathered from multiple (ambient) sources. This means tapping into public data sets for broad context. It also means gathering hyper-local/personal data via an Arduino or similar microcontroller (probably an ESP8266) for sensing the individual. These data streams will be combined to produce a visual output using Jitter. The result aims to represent the other humans in a manner that emphasizes personality rather than emulating the technicality of face-to-face exchanges.&lt;br /&gt;
&lt;br /&gt;
= &amp;lt;s&amp;gt;final project 2: distance communication &amp;lt;/s&amp;gt; =&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Different implementations of the algorithm were tried in Max. None yielded a convergence of the phases of the oscillators. Therefore the project was cancelled.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The installation &amp;quot;distance communication&amp;quot; explores and reflects on (romantic) relationships at a time when partners are apart. Their lives don&#039;t take place in the same environment anymore and are open to different influences. It takes effort to stay in sync even though the distance may be bridged by modern communication means. Two blinking LEDs represent the partners; the blinking rhythm, their respective lives.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of two LEDs driven by a microcontroller and sensed via a light-dependent resistor. A small computer (laptop or small-form-factor PC) runs both Max and a video conferencing software to display &amp;quot;the other&amp;quot;. The Max patch and &amp;quot;the other&amp;quot; are displayed on a screen or video projector. The Max patch tries to synchronize the respective blinking patterns by leveraging the well-studied behaviour of fireflies. A microphone on the floor picks up ambient sound. If this audio level exceeds a given threshold, the blinking of one LED is triggered/reset to introduce a disturbance into the process. The installation may be presented with both parts in the same room or with one part off-site.&lt;br /&gt;
&lt;br /&gt;
== implementation notes ==&lt;br /&gt;
The firefly model turned out to be insufficient for the current setup due to the small number of agents. The interaction has been remodeled from first principles based on two harmonic oscillators and the Kuramoto–Daido model for phase locking.&lt;br /&gt;
&lt;br /&gt;
[[File:lover_equations_2021-05-24.jpg|400px]]&lt;br /&gt;
&lt;br /&gt;
Observing the equation system over time shows the expected behavior (when implemented in Python).&lt;br /&gt;
&lt;br /&gt;
[[File:lover_oscillators_2021-05-24.png|400px]]&lt;br /&gt;
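The same two-oscillator Kuramoto dynamics can be sketched in a few lines of Python (an illustrative Euler integration; K, w1, w2 are example values, not the parameters from the notes above):

```python
import math

def simulate(K=0.8, w1=1.0, w2=1.3, dt=0.01, steps=5000):
    """Euler integration of two Kuramoto-coupled phase oscillators:
    dtheta_i/dt = omega_i + K * sin(theta_j - theta_i).
    With |w1 - w2| < 2K the phases lock to a constant difference."""
    th1, th2 = 0.0, 2.0  # arbitrary initial phases
    for _ in range(steps):
        d1 = w1 + K * math.sin(th2 - th1)
        d2 = w2 + K * math.sin(th1 - th2)
        th1 += d1 * dt
        th2 += d2 * dt
    # phase difference wrapped into (-pi, pi]
    return math.atan2(math.sin(th1 - th2), math.cos(th1 - th2))
```

In the locked state the phase difference settles where sin(th1 - th2) = (w1 - w2) / (2K), i.e. the slower oscillator trails the faster one by a fixed angle.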
&lt;br /&gt;
Max implementation of the Kuramoto-Daido model&lt;br /&gt;
&lt;br /&gt;
[[File:portrait_of_two_lovers_at_a_distance_2021-05-31.png|400px]]&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://www.springer.com/gp/book/9783540071747 Kuramoto, Yoshiki (1975). H. Araki (ed.). Lecture Notes in Physics, International Symposium on Mathematical Problems in Theoretical Physics. 39. Springer-Verlag, New York. p. 420.]&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.jstor.org/stable/2808377 Buck, John. (1988). Synchronous Rhythmic Flashing of Fireflies. The Quarterly Review of Biology, September 1988, 265 - 286.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://www.journals.uchicago.edu/doi/10.1086/414564 Carlson, A.D. &amp;amp; Copeland, J. (1985). Flash Communication in Fireflies. The Quarterly Review of Biology, December 1985, 415 - 433.]&amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://deepblue.lib.umich.edu/handle/2027.42/56374 Lloyd, J. E. Studies on the flash communication system in Photinus fireflies 1966] &amp;lt;/s&amp;gt;&lt;br /&gt;
* &amp;lt;s&amp;gt;[https://academic.oup.com/icb/article/44/3/225/600910 Nobuyoshi Ohba: Flash Communication Systems of Japanese Fireflies]&amp;lt;/s&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= final project 3: ISS overhead =&lt;br /&gt;
&lt;br /&gt;
The International Space Station is currently the only extraterrestrial habitat for humans. It circles the earth approximately every 90 minutes and can thus look down on a vast planet. Humans, on the other hand, rarely look up to grasp what we have accomplished and what we can reach for. This small installation tracks the International Space Station and emits a notification when it is above the place of exhibition.&lt;br /&gt;
&lt;br /&gt;
The technical setup consists of a computer running Max and some means of notification, such as a bell or a speaker. The current position of the International Space Station is calculated from its orbital data and set in relation to the place of exhibition (e.g. Weimar). If the ISS is at a position that can be considered overhead, the notification is triggered. If a speaker is used, the sound synthesis can encode different parameters.&lt;br /&gt;
&lt;br /&gt;
== proof of concept: 2021-05-19 ==&lt;br /&gt;
The basic Max patch for tracking the ISS and sending a bang is done.&lt;br /&gt;
[[File:ISS_notifier_2021-05-19.png|400px]]&lt;br /&gt;
&lt;br /&gt;
Processing TLE files and orbital data was sidestepped by relying on [http://open-notify.org/Open-Notify-API/ISS-Location-Now/ open-notify.org] and processing [https://json.org/ JSON] data. The distance (on earth) to the city center of Weimar (at 50.979444, 11.329722) is calculated using the Euclidean distance as a metric. To simplify the math, it is assumed that the world is flat, and the [https://en.wikipedia.org/wiki/World_Geodetic_System#WGS84 WGS 84] reference system is ignored. This is acceptable for small distances, where the curvature of the earth does not introduce too much error.&lt;br /&gt;
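The flat-earth distance check can be sketched in Python as follows. The 500 km "overhead" radius is an assumed example value, not taken from the patch; a degree of longitude is scaled by cos(latitude):

```python
import math

# City centre of Weimar, as used in the patch
HOME_LAT, HOME_LON = 50.979444, 11.329722

def flat_earth_distance_km(lat, lon, home_lat=HOME_LAT, home_lon=HOME_LON):
    """Euclidean ground distance on a 'flat' earth: one degree of
    latitude is ~111.32 km, and a degree of longitude shrinks by
    cos(latitude). Good enough for small distances."""
    dlat_km = (lat - home_lat) * 111.32
    dlon_km = (lon - home_lon) * 111.32 * math.cos(math.radians(home_lat))
    return math.hypot(dlat_km, dlon_km)

def iss_overhead(lat, lon, threshold_km=500.0):
    """Treat the ISS as 'overhead' within the assumed radius."""
    return flat_earth_distance_km(lat, lon) <= threshold_km
```

As a sanity check, the nearby city of Erfurt (about 50.9787, 11.0328) comes out roughly 20 km away, which matches the real ground distance.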
&lt;br /&gt;
Lessons learned:&lt;br /&gt;
* sometimes you can find an API for exactly the data you need&lt;br /&gt;
* ignoring material reality is occasionally feasible within very narrow bounds&lt;br /&gt;
* school math turned out to be useful&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Todo:&lt;br /&gt;
* extend to use different satellites&lt;br /&gt;
* leverage their orbital data&lt;br /&gt;
* attach a loudspeaker for announcements&lt;br /&gt;
&lt;br /&gt;
== references ==&lt;br /&gt;
* [https://en.wikipedia.org/wiki/International_Space_Station wikipedia: International Space Station]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Two-line_element_set wikipedia: two-line element set (TLE)]&lt;br /&gt;
* [https://nssdc.gsfc.nasa.gov/nmc/SpacecraftQuery.jsp NASA Space Science Data Coordinated Archive Master Catalog Search]&lt;/div&gt;</summary>
		<author><name>Js</name></author>
	</entry>
</feed>