Extended View Toolkit

Authors: Peter Venus, Marian Weger, Cyrille Henry, Winfried Ritsch

Download full paper: Media:Extended View Toolkit.pdf

Project site: extendedview.mur.at, where the toolkit can be downloaded.

The Extended View Toolkit is a set of abstractions developed using Pd (Pure Data), a dataflow programming environment, and its graphics extension GEM, making use of GEM's OpenGL capabilities.

The motivation to create systems that can produce media with a wider projection area lies in our own visual and aural capabilities: nature gave us the ability to gather visual information over an angle of almost 180 degrees and to detect acoustic events 360 degrees around us. Although relevant information is filtered so that there is a center of focus we can concentrate on, these abilities form our main perception of our surroundings. Moreover, we tend to "turn our heads" towards an event that attracts our attention.

Based on this motivation and the idea of a universally usable toolkit, the intention was to create an open-source, community-driven project for experimenting with and realising panoramic video in combination with complex projection environments, creating immersive media experiences in different setups. Furthermore, the camera system built for the project uses common webcams to make experimentation possible at low development cost.

The toolkit can be divided into two basic parts: the image-merging part and the projection part. The source material for the image-merging (stitching) part is generated with a multiple-camera system in which the cameras are horizontally aligned on a circle to cover a panoramic viewing area. With the help of the toolkit, these sources can then be combined into one continuous image or video stream. The abstractions of the toolkit take care of the edge blending, as well as the correction of lens distortion caused by the optics and the arrangement of the cameras. These features are implemented with OpenGL shaders in order to keep CPU usage at a minimum.
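The two per-camera operations named above — an edge-blend ramp across the overlap between neighbouring images and a radial lens-distortion correction — can be sketched numerically. This is an illustrative sketch in Python/NumPy, not the toolkit's actual GLSL shader code; the function names, the gamma value, and the single-coefficient distortion model are assumptions.

```python
import numpy as np

def blend_weight(x, gamma=2.2):
    """Brightness ramp across an overlap region.

    x: normalized position within the overlap (0 at the outer edge,
    1 where the image is at full strength).  The linear ramp is
    gamma-corrected so that the seam appears even in *perceived*
    brightness.  gamma=2.2 is an assumed typical display gamma.
    """
    ramp = np.clip(np.asarray(x, dtype=float), 0.0, 1.0)
    return ramp ** (1.0 / gamma)

def radial_distort(points, k1):
    """First-order radial lens-distortion model (k1 term only):
    a point at radius r from the image center moves to r * (1 + k1 * r^2).
    A negative k1 approximates compensating barrel distortion.

    points: (N, 2) array of coordinates relative to the image center.
    """
    pts = np.asarray(points, dtype=float)
    r2 = np.sum(pts ** 2, axis=1, keepdims=True)
    return pts * (1.0 + k1 * r2)
```

In a real stitching setup the mirrored ramp `blend_weight(1 - x)` would be applied to the neighbouring camera, so the two weights sum to one across the seam before gamma correction.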

The second part of the toolkit aims at the creation of multi-screen and multi-projector environments, not only to display a correct representation of the generated panoramic material, but also to enable the easy creation of immersive visual media environments and video mapping. The abstractions can be combined to form large screens out of multiple projectors, with edge blending between the overlapping borders of adjacent projectors. Each abstraction contains a vertex model that can be adjusted at 4 points for planar surfaces, or at 9 points for curved surfaces, to correct projection distortions. Thus, projection mapping onto challenging geometric objects and forms is possible and easy to realise. Since every projection abstraction reads the same framebuffer, each instance can be adjusted so that it displays just a portion of that framebuffer. In this way, multiple abstractions can each render one fragment of a large image, and the fragments reassemble into one continuous screen. These vertex and texture features are likewise implemented with OpenGL shaders within GEM.
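The two mechanisms described above — pinning a projection to 4 adjustable corner points, and letting each projector instance sample only its portion of the shared framebuffer — can be sketched as coordinate remappings. This is a minimal Python/NumPy illustration of the idea, assuming a bilinear corner-pin and normalized texture coordinates; it does not reproduce the toolkit's own vertex model or parameter names.

```python
import numpy as np

def warp_quad(u, v, corners):
    """Bilinear corner-pin: map normalized coordinates (u, v) in [0, 1]^2
    onto an arbitrary quad given by its 4 corners
    (bottom-left, bottom-right, top-left, top-right).

    Moving a corner drags that part of the projected image with it,
    which is how a 4-point adjustment corrects a planar projection.
    """
    bl, br, tl, tr = (np.asarray(c, dtype=float) for c in corners)
    bottom = (1.0 - u) * bl + u * br
    top = (1.0 - u) * tl + u * tr
    return (1.0 - v) * bottom + v * top

def subtexture(u, v, offset, size):
    """Remap texture coordinates so one projector instance samples only
    its portion of the shared framebuffer.  offset and size are in
    normalized [0, 1] framebuffer coordinates (parameter names are
    illustrative assumptions, not the toolkit's API).
    """
    return (offset[0] + u * size[0], offset[1] + v * size[1])
```

For example, two side-by-side projectors forming one wide screen would use `offset=(0, 0), size=(0.5, 1)` and `offset=(0.5, 0), size=(0.5, 1)` so their fragments reassemble into the full framebuffer image.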

The Extended View Toolkit was originally developed at the IEM for the Comedia project “Extended View Streamed”.

See also Extended View Toolkit workshop


4th international Pure Data Convention 2011 Weimar ~ Berlin