
RGB 2.0 is an interactive musical installation for 2-9 users, created in collaboration with Tomas Dvorak (CZ) and Matous Godik (CZ). The main purpose of the work is the idea of audience participation in musical creation during a live performance. Communication between musicians and audience takes place through specially designed flashlights. Each flashlight can emit one of the three basic colours of the spectrum: red, green or blue; other colours can be made by combining the primary colours. The goal of the project is to explore the possibilities of live performance within this frame of communication. The artistic content is the algorithms and schemes of the communication.
The implemented algorithms are:

LIVE PROCESSING, a system to interact with live-recorded samples (visualized as abstract 3D shapes). Connecting a light to one of these "sound objects" changes a specific DSP parameter, and each light colour interacts with the sounds in a different way (a minimal sketch of this mapping follows below).

ATOMS, a sonic cellular automaton generated and modified by the actions of the lights (also sketched below).
The result is a collaborative audio-visual performance through a "public" interface that is at the same time functional and aesthetic.
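As an illustration of the LIVE PROCESSING mapping, the following Processing sketch shows how a tracked light colour could select a DSP parameter on a sound object and send it to Max/MSP over OSC with oscP5. The address pattern, ports and parameter names are illustrative assumptions, not the actual RGB 2.0 messages.

// Minimal sketch (Processing + oscP5) of the LIVE PROCESSING idea:
// the dominant colour of a flashlight attached to a "sound object"
// selects which DSP parameter is changed in Max/MSP.
// Address pattern, ports and parameter names are assumptions.

import oscP5.*;
import netP5.*;

OscP5 osc;
NetAddress maxMsp;

void setup() {
  size(400, 400);
  osc = new OscP5(this, 12000);               // local OSC port (assumed)
  maxMsp = new NetAddress("127.0.0.1", 7400); // Max/MSP patch port (assumed)
}

// Called whenever a light is "connected" to a sound object.
// r, g, b are the tracked light colour (0..255); amount is the
// normalised interaction strength (e.g. light size or proximity).
void sendLightInteraction(int objectId, int r, int g, int b, float amount) {
  OscMessage msg = new OscMessage("/rgb/liveproc");
  msg.add(objectId);
  // Each primary colour drives a different DSP parameter (assumption):
  // red -> filter cutoff, green -> delay feedback, blue -> grain size.
  if (r >= g && r >= b)      msg.add("cutoff");
  else if (g >= b)           msg.add("feedback");
  else                       msg.add("grainsize");
  msg.add(amount);
  osc.send(msg, maxMsp);
}

void mousePressed() {
  // stand-in for a pure red light touching sound object 0
  sendLightInteraction(0, 255, 0, 0, 0.8f);
}

void draw() {
  background(0);
}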
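Likewise, the ATOMS idea can be sketched as a one-dimensional cellular automaton whose cells are seeded by light actions and whose live cells would trigger sound events. The rule (elementary rule 90) and the mouse-based seeding are stand-ins for illustration, not the rules or tracking used in RGB 2.0.

// Minimal sketch of the ATOMS idea: a 1D cellular automaton seeded by
// "light actions"; live cells are drawn and could trigger sound events.

int cells = 32;
int[] row = new int[cells];

void setup() {
  size(640, 120);
  frameRate(4); // one generation per step
}

// A light action (e.g. a flashlight hitting column x) seeds a cell.
void seed(int x) {
  row[constrain(x, 0, cells - 1)] = 1;
}

void mousePressed() {
  seed(mouseX * cells / width); // stand-in for the tracked light position
}

void draw() {
  background(0);
  // draw and "sonify" the current generation
  for (int i = 0; i < cells; i++) {
    if (row[i] == 1) {
      fill(255);
      rect(i * width / (float) cells, 0, width / (float) cells, height);
      // here one would send an OSC note event to Max/MSP, e.g. pitch 48 + i
    }
  }
  // next generation, elementary rule 90: XOR of the two neighbours
  int[] next = new int[cells];
  for (int i = 0; i < cells; i++) {
    int left = row[(i + cells - 1) % cells];
    int right = row[(i + 1) % cells];
    next[i] = left ^ right;
  }
  row = next;
}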

Images and video are taken from the Prague performance of 9th May 2005 (at Palac Akropolis for the Entermultimediale festival); on stage with Tomas (live electronics) and Matous (Fender Rhodes) was the multi-instrumentalist Jarda Koran. All music was improvised.
A very special thanks to Ales Cerny, our producer.

For detailed information and more media:

>> go to the official web page

work: rgb 2.0 - photo documentation -
year: 2005
medium: mixed media


>> go to the page
work: rgb 2.0 - video documentation -
year: 2005
medium: mixed media


>> go to the page

RGB 2.0 was built using a mix of digital and analogue media; the following graph shows a simplified scheme of the installation. It is organized in three different layers of information:
The PHYSICAL layer shows the devices used in the installation and what the audience could perceive in the environment.
The COMPUTATIONAL layer shows how the system was organized into different tasks.
The NETWORK layer shows the technologies, programming languages and protocols used for the installation: EYESWEB (motion tracking), PROCESSING (visualization, generative graphics and interaction) and MAX/MSP (sound analysis, live sound processing and synthesis). We also used the OSC protocol and the oscP5 library (thanks to Andreas Schlegel for sharing it); a minimal sketch of this OSC glue follows below.
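As a rough illustration of how these layers could be glued together, the sketch below receives tracked-blob messages from EyesWeb with oscP5 and forwards derived control messages to Max/MSP. The address patterns, ports and argument layout are assumptions for illustration, not the actual RGB 2.0 messages.

// Minimal sketch of the network layer: EyesWeb -> Processing -> Max/MSP,
// all over OSC using the oscP5 library. Ports and addresses are assumed.

import oscP5.*;
import netP5.*;

OscP5 osc;
NetAddress maxMsp;

void setup() {
  size(640, 480);
  osc = new OscP5(this, 9000);                 // port EyesWeb sends to (assumed)
  maxMsp = new NetAddress("127.0.0.1", 7400);  // Max/MSP listening port (assumed)
}

// oscP5 calls this for every incoming OSC message.
void oscEvent(OscMessage m) {
  if (m.checkAddrPattern("/eyesweb/blob")) {
    float x = m.get(0).floatValue();  // tracked blob position
    float y = m.get(1).floatValue();
    int colour = m.get(2).intValue(); // 0 = red, 1 = green, 2 = blue (assumed)

    // forward a derived control message to Max/MSP
    OscMessage out = new OscMessage("/maxmsp/atoms");
    out.add(colour);
    out.add(x / width);   // normalised coordinates
    out.add(y / height);
    osc.send(out, maxMsp);
  }
}

void draw() {
  background(0);
}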

Download the Processing code (it is still a work in progress):

>>rgb_p5_code

