Body Hotel by Thomas Schaeben & Schad Privat (remix by COMA)
Directed & produced: David Lüpschen
“Featuring a geometrically vivacious blend of color and form, these bikes showcase the artwork of world-renowned artist Dalek (James Marshall). These rolling art pieces were commissioned by Hurley to complement their ultra high-end line of Phantom boardshorts and swimwear.”
Just two minutes in length, a teaser from a new documentary on the Mapping Festival encapsulates the growing range today’s visualist can cover. Far from look-alike video loops projected on narrow rectangles, that gamut extends to mapping and human performance that explodes the screen, from advanced, special-effects-laden cinema to abstract visual design. The video is appropriately titled: these are, simply, artists working in light, exploring with that light the concepts of rhythm and space long exploited by music and sound.
121 fully assignable controllers (come on, no snickering about “potards”):
4×16 button matrix
2×16 potentiometers (“potards”)
8 sliders
16 touch pads sending note on/note off
1 jog wheel
2 crossfaders
Power supplied by USB port
As I mentioned, the interactive modular L.E.D. system is made by my friends at NYC-based Sensacell Corporation. There is plenty of information about their offering and the technical specs of their system on their site at www.sensacell.com, but basically, they make modular L.E.D. tiles with built-in capacitive proximity sensors. I’ve been working with them on content and display software for each successive version of the system since they started in the early 2000s (I was a partner with one of the principals, Leo Fernekes, in the technology/surveillance-themed nightclub Remote Lounge in the early part of the decade). The tiles can work in various “autonomous modes” where they simply light up when the sensors are triggered, or (and this is where I come in) one can write software to read the sensors and send display data to the tiles. Earlier this year I started Madbutter (www.madbutter.com) as a design and programming studio to develop content and programming for these interactive installations. I’ve gotten a lot of favorable attention for the pieces I’ve done so far, but I’m still looking to get the word out to architects and designers who want to install interactive art that works reliably and effectively at this “architectural” scale.

Aside from this particular work, I’m very intrigued by the potential of the technology and how a variety of artists might push it in different directions. I hope we can get a discussion going here; do join in.
The current version of the system that I am using here is based on 6″ square tiles, each with full RGB LEDs set at a one-inch pitch – for 36 independently addressable LEDs per tile – and 4 proximity sensors. These can be arranged into arrays (or irregular shapes, for that matter) of virtually any size. We have put them in floors, in and on walls, in windows, in furniture, around pillars, etc. The general idea is to layer the tiles with a translucent, non-conductive surface that protects the tile and diffuses the LEDs slightly. In the case of this installation the tiles are fully interactive through a half inch of frosted plexiglass and a quarter inch of plate glass.
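The tile geometry above dictates the pixel resolution of any installation, and it’s worth seeing how quickly the numbers work out. This is just a back-of-the-envelope sketch in Python using the figures from this post (6″ tiles, 1″ LED pitch, 4 sensors per tile, assumed to sit in a 2×2 grid – that layout is my assumption, not a Sensacell spec):

```python
# Sketch: deriving display and sensor matrix sizes from the tile
# geometry described above. The 2x2 sensor layout is an assumption.

TILE_INCHES = 6        # each tile is 6" square
LED_PITCH_INCHES = 1   # LEDs at 1" pitch -> 6x6 = 36 LEDs per tile
SENSORS_PER_SIDE = 2   # 4 proximity sensors per tile, assumed 2x2

def panel_matrices(width_ft, height_ft, panels=1):
    """Return ((display_w, display_h), (sensor_w, sensor_h)) for a row of panels."""
    tiles_w = width_ft * 12 // TILE_INCHES * panels
    tiles_h = height_ft * 12 // TILE_INCHES
    leds_per_side = TILE_INCHES // LED_PITCH_INCHES
    display = (tiles_w * leds_per_side, tiles_h * leds_per_side)
    sensors = (tiles_w * SENSORS_PER_SIDE, tiles_h * SENSORS_PER_SIDE)
    return display, sensors

# The triptych described below: three 4-foot-wide by 6-foot-tall panels.
display, sensors = panel_matrices(4, 6, panels=3)
print(display)  # (144, 72)
print(sensors)  # (48, 24)
```

Those totals match the matrices used in the actual installation, which is a good sanity check when planning content for a new array shape.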
I wrote the display system in Max/MSP/Jitter. I worked with Sensacell to develop a custom box (we call it a SensaNode) that sends the polled sensor data of the whole tile array to my control computer, over TCP/IP, as a Jitter matrix, and in turn reads in a matrix of display data sent from Jitter to cut up, process, and route to the individual tiles. Currently we are comfortably and reliably getting 20 fps I/O on our read/write cycle, so I author display data accordingly. Because the pixel pitch is relatively large, the combined triptych (three 4-foot-wide by 6-foot-tall panels) in the 53rd St installation uses only a 144×72 display matrix, and the sensor matrix is only 48×24, so it is important to author content that is effective at that resolution! More important, of course, is coming up with interesting and immediately apparent ways to use the incoming sensor matrix to manipulate the outgoing display data in real time. I usually use some computer vision externals (the excellent cv.jit package) to process the sensor data to give me centroid or blob coordinates that I can use to interactively track some value I can manipulate in Jitter.
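For readers who don’t work in Max, the core idea – treat the sensor array as a low-resolution image, find the centroid of activity, and use it to drive the higher-resolution display matrix – can be sketched in NumPy. In the installation itself this is done in Max/MSP/Jitter with cv.jit; the code below is only an illustration of the technique, and nothing in it reflects the actual SensaNode data format:

```python
# Rough NumPy sketch of the centroid-tracking idea described above.
# Matrix sizes match the 53rd St triptych: 48x24 sensors, 144x72 pixels.
import numpy as np

SENSOR_W, SENSOR_H = 48, 24
DISPLAY_W, DISPLAY_H = 144, 72

def centroid(sensors, threshold=0.5):
    """Return the (x, y) centroid of triggered sensors, or None if none are active."""
    ys, xs = np.nonzero(sensors > threshold)
    if xs.size == 0:
        return None
    return xs.mean(), ys.mean()

def render(sensors):
    """Draw a soft glow on the display matrix wherever the sensors see a body."""
    frame = np.zeros((DISPLAY_H, DISPLAY_W, 3), dtype=np.uint8)
    c = centroid(sensors)
    if c is None:
        return frame
    # Scale sensor coordinates up to display coordinates (3x in each axis here).
    cx = c[0] * DISPLAY_W / SENSOR_W
    cy = c[1] * DISPLAY_H / SENSOR_H
    yy, xx = np.mgrid[0:DISPLAY_H, 0:DISPLAY_W]
    dist2 = (xx - cx) ** 2 + (yy - cy) ** 2
    glow = np.clip(255 - dist2, 0, 255).astype(np.uint8)
    frame[..., 0] = glow  # red channel only, for brevity
    return frame

# Simulate one triggered sensor near the middle of the array.
sensors = np.zeros((SENSOR_H, SENSOR_W))
sensors[12, 24] = 1.0
frame = render(sensors)  # brightest pixel lands at display (72, 36)
```

At 20 fps a loop like this has a comfortable 50 ms per frame, which is part of why content authored for this resolution can run reliably in real time.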