{{#ev:youtube|BRHOUWcj2sI|560|left}}
<br/><br/><br/><br/><br/><br/><br/><br/><br/><br/><br/><br/>
<br/><br/><br/><br/><br/><br/>
During the winter semester 2019 I took the project module "from random to fiction" with Ursula Damm.
Through different presentations during the course I got to know many new and very creative works related to AI, like "Emissaries" by Ian Cheng for example, which I found highly inspirational. This and other more Artificial Life oriented approaches gave me some ideas and directions in which I wanted to go.
So I got back to Processing and practised with Daniel Shiffman's book "The Nature of Code", starting with steering behaviour and the vector math basics and moving on to autonomous agents and genetic algorithms. My aim at that time was actually to be able, by the end of the course, to write my own "intelligent" agents with very simple neural network algorithms. I didn't get as far as neural nets, because the topic of evolutionary computing and artificial life simulations took me quite a while to get into, but it was also very interesting for me, so I decided to base my project more around that.
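To give an idea of what these exercises look like, here is a minimal sketch of the "seek" steering behaviour covered in "The Nature of Code", reduced to plain Java (a tiny Vec2 class stands in for Processing's PVector; the class and parameter values are illustrative, not taken from my actual sketches):

```java
// Minimal "seek" steering behaviour (Reynolds-style), as introduced in
// Daniel Shiffman's "The Nature of Code". Plain Java, no Processing needed.
public class Agent {
    static class Vec2 {
        float x, y;
        Vec2(float x, float y) { this.x = x; this.y = y; }
        Vec2 add(Vec2 o) { return new Vec2(x + o.x, y + o.y); }
        Vec2 sub(Vec2 o) { return new Vec2(x - o.x, y - o.y); }
        float mag() { return (float) Math.sqrt(x * x + y * y); }
        Vec2 setMag(float m) {
            float d = mag();
            return d == 0 ? new Vec2(0, 0) : new Vec2(x / d * m, y / d * m);
        }
        Vec2 limit(float max) { return mag() > max ? setMag(max) : this; }
    }

    Vec2 position, velocity;
    float maxSpeed = 2f;   // top speed of the agent
    float maxForce = 0.1f; // how hard it can steer per step

    Agent(float x, float y) {
        position = new Vec2(x, y);
        velocity = new Vec2(0, 0);
    }

    // Steering force = desired velocity (towards target, at full speed)
    // minus current velocity, clipped to the maximum steering force.
    Vec2 seek(Vec2 target) {
        Vec2 desired = target.sub(position).setMag(maxSpeed);
        return desired.sub(velocity).limit(maxForce);
    }

    // One simulation step: apply the steering force, then move.
    void update(Vec2 target) {
        velocity = velocity.add(seek(target)).limit(maxSpeed);
        position = position.add(velocity);
    }
}
```

Called every frame with a target position, this makes the agent gradually turn towards and accelerate at the target, which is the building block the later autonomous-agent and genetic-algorithm exercises extend.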
Time to come to the actual project, "Artificial Ocean". Please take a look at the video for visual documentation of the setup, some recorded final outcomes, and a very amateurish recording of a very early test projection in my living room. Due to the Covid-19 situation I wasn't able to access any university infrastructure, and because I didn't take the time to make a better recording the quality turned out really bad, but you can at least see and hear the audio and video working together in real time.
It was planned to be a simple wall projection installation, with no special forms or masks in the projection mapping stage. Because of Covid-19 it didn't make sense for me at that time to plan the hardware setup any further.
The style should be generative: basically a sound and video setup that constantly generates related, continuously changing output from "itself", based on the Artificial Life simulation running in the background. So I thought about three layers, Video, Audio and Computation, all connected through abstract relations.
As I still hadn't really come up with a story or something to guide me in the coding stage towards a more well-thought-out system structure, I chose events like the death of an agent, the birth of a new agent, or the position/velocity/size of the longest-living agent as data for the particle system in Unity and for the modular synthesizers. In the particle system, the death of an agent was shown as a circular impulse on the flow field of the particles. At the same time, the death triggered a clocked analog noise generator from which I sampled different modulation signals. Those then went into different parameters of a stereo delay sitting behind the (also triggered) synth. So on every death a wide variety of delayed sounds was playing, with corresponding visual feedback. Another example would be the position of the oldest agent, where the x value determined the frequency of a filter layered over the drone background sound. The same value was hooked up to the x transform of a Perlin noise velocity field which affects the particles' positions (wave movement).
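The "circular impulse on the flow field" part can be sketched roughly like this. This is a hypothetical reconstruction in plain Java, not the actual Unity (C#) code: a death event pushes every flow-field cell within some radius radially outward, with strength falling off towards the edge. All names and values here are illustrative assumptions:

```java
// Hypothetical sketch: a death event applies a circular (radial) impulse
// to a 2D flow field, similar in spirit to what drove the Unity particles.
public class FlowField {
    final int cols, rows;
    final float[][] fx, fy; // x and y vector components per grid cell

    FlowField(int cols, int rows) {
        this.cols = cols;
        this.rows = rows;
        fx = new float[cols][rows];
        fy = new float[cols][rows];
    }

    // Push every cell within `radius` of the death position (px, py)
    // radially outward, with linear falloff towards the radius edge.
    void applyImpulse(float px, float py, float radius, float strength) {
        for (int x = 0; x < cols; x++) {
            for (int y = 0; y < rows; y++) {
                float dx = x - px, dy = y - py;
                float d = (float) Math.sqrt(dx * dx + dy * dy);
                if (d > 0 && d < radius) {
                    float s = strength * (1 - d / radius); // linear falloff
                    fx[x][y] += s * dx / d; // unit outward direction * falloff
                    fy[x][y] += s * dy / d;
                }
            }
        }
    }
}
```

Particles advected through this field would then be pushed away from the death position in a ring, which reads visually as a single outward pulse; the same event simultaneously fired the clocked noise generator on the audio side.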
As I already said, unfortunately this project never became a real installation because of Covid. I also have to say that I never found out what I really wanted to do with my agent system and those algorithms, which is why I didn't come up with a consistent idea that really connects those layers. So on different levels my initial project idea failed, but still, I'm very happy with this project because I learned a lot on many different, foundational levels. Besides that, the whole setup process was very good practice for getting Unity, Processing and modular synths to work together, not just for installations but also for performative use cases, since the setup would fit an audio-visual performance piece just as well.