Today’s technical sessions were designed to make us aware of the possibilities of the various technologies that can be used to author works for the touchscreens. I’m already quite good friends with Processing, and am on speaking terms with Quartz Composer. Flash, however, has never been particularly pleasant company, and while I’ve heard interesting things about VVVV, we never cross paths in the OS X environment. This means that I’ll probably be making whatever I make using Processing, since time is tight and building up an adequate skill level in any of the others would take more time than we have.
The afternoon was spent engaging in an interesting discussion about what our projects might be and what steps we might go through to carry them out. Some members of the group had made innovative discoveries by changing the orientation of the touchscreens and placing objects upon them, and some had gone as far as to make a working prototype project. A sizeable cohort of us were having a more difficult time marrying the limitations and expectations of the touchscreen interface with the concerns of our own practices.
Touchscreens carry with them an expectation of immediacy, and have associations with the delivery of information, supply and demand, or real-time fulfilment. My work is often the complete opposite of this, using unedited footage of slowly-changing events, and offering up a version of “real-time” that is more akin to paint drying than whizz-bang interaction. The challenge, then, is to find a way of visualising slow change without compromising the immediacy that is a strength of the touchscreen interface.
I was pleased to discover that an older sketch of mine could be adapted to grab live camera images in time-lapse style, and decided to give this a try as a way of gathering some content. I’m also planning to scout some locations on Wednesday morning.
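I haven’t shared the sketch itself here, but the basic time-lapse idea in Processing looks something like the following. This is a minimal sketch of my own devising rather than the actual adapted code, and it assumes the standard processing.video library; the capture interval and filenames are purely illustrative:

```processing
// Minimal time-lapse capture sketch (illustrative; assumes processing.video).
import processing.video.*;

Capture cam;
int intervalMs = 5000;   // grab one frame every 5 seconds (arbitrary choice)
int lastCapture = 0;
int frameIndex = 0;

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
}

void draw() {
  // Pull the latest frame from the camera when one is ready.
  if (cam.available()) {
    cam.read();
  }
  image(cam, 0, 0);

  // Save a numbered still once the interval has elapsed.
  if (millis() - lastCapture > intervalMs) {
    saveFrame("timelapse-" + nf(frameIndex++, 4) + ".png");
    lastCapture = millis();
  }
}
```

Stitching the saved stills back together afterwards (or drawing them in sequence) gives the slow, paint-drying version of “real time” I’m after, while the live camera feed keeps a trace of immediacy on screen.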