In a little over two weeks we will be performing a film at the Glasgow Film Fest – improvising sound, music and video to make a film in real time. I outlined the project in a previous post, and starting with this one (and the three or four that will follow) I will collect and share the ideas behind it – creative limitations, Max patches, hardware interfacing, field recording, communication, video and the performance itself.
A project such as this is not without problems:
- Real-time: The project requires us to trigger and process sound and video in real time. A conventional DAW (Pro Tools, Logic, Nuendo, etc.) would be useless here simply because it isn't built for this kind of work. Performance-centric software (like Ableton Live or Resolume Avenue) might work, but we would still be tied to its architecture. Why not build custom audio, video and performance tools in Max/MSP with exactly the functionality we need? Rebuilding a different version of Ableton Live (with a gazillion features) would be not only difficult, but impractical and stupid.
- Balance: There needs to be a fine balance between creativity, purpose and technology. Getting carried away with the technology is not cool – we aren't building tools and performing for a convention of Max/MSP geeks. The audience will be made up of curious people who won't care that I'm granulating a sound into a million grains with variable pitch, followed by an auto-filter and the most awesome reverb known to mankind! The only thing they will take away is the emotional impact and the experience.
- Technology: While technology is our solution, it can also be our biggest problem. In creating custom software we not only have to make sure it works but also that it works well. Time has to be invested in making sure it is stable and that it doesn’t crash every five minutes!
- Communication: This project reminds me of when I used to play in a band. A lot of improvisation is based on trust and giving the other people in the group a chance to take center stage (or not to). Silence isn’t a bad thing. The music, sound effects, voice and video must find their own space and form dynamic and resonating relationships (just like in every other form of audio-visual media).
- Logistics: Equipment type, content type, equipment reliability, ease of use and communication between the performers (how do we know when to stop, or when to bring the piece to an end?) are all important in making sure everything works well.
- Unpredictability: Even with all this thinking and planning, there will be surprises.
I have divided my work on this project into three sections, which fall into the larger ecosystem of the project itself:
Video: Susan Kemp is currently shooting a variety of footage for us in Glasgow. We won't rehearse to any of it, to retain some spontaneity in our first performance. Gervais will be editing and effecting the video on the fly; he should blog about his tools soon.
Voice: Fiona Rintoul has written a poem specifically for this project, titled ‘Twelve Polaroids: Glasgow and Environs’. As the name suggests, it is divided into twelve parts. We recorded her reading it in different locations. The voice might lead the film (we are still unsure, but an audience will always latch on to the voice over the other sounds in a soundtrack).
Sound effects/design: Gervais and I spent last weekend in Glasgow, walking around and recording the city. Some of our location choices were based on the above-mentioned poem.
Music: We will be using some pre-composed music, which I will be effecting and redesigning with a newer version of the patch I used for the Designing Sound/Sonic-Terrain year-end sonic-mash.
Software: All three of us will be using custom tools built in Max/MSP to make the film performable. We have also built in a layer of communication so that our individual patches can influence one another.
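One common way to wire Max patches together like this is OSC over UDP (Max's udpsend/udpreceive objects). As a rough illustration of what travels over the wire – the address and values below are made up for this example, not our actual namespace – here is a minimal OSC message encoder in plain Python:

```python
import struct

def osc_message(address, *args):
    """Build a minimal OSC message: a null-terminated address, a type-tag
    string, then big-endian payloads, each string padded to 4 bytes."""
    def pad(b):
        # OSC strings are null-terminated and padded to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)

    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # 32-bit big-endian float
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # 32-bit big-endian int
        elif isinstance(a, str):
            tags += "s"
            payload += pad(a.encode())
        else:
            raise TypeError("unsupported OSC argument type")
    return pad(address.encode()) + pad(tags.encode()) + payload

# e.g. one patch nudging another's video parameter:
packet = osc_message("/video/blur", 0.5)
```

In practice the bytes would go out over a UDP socket; the point is simply that each message is a named address plus typed values, which makes a shared namespace between three patches easy to agree on.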
Hardware: I will be using TouchOSC on an iPad to control my patch – greater resolution (floating point and not MIDI 0-127) and a highly customisable interface. I was a bit skeptical about not having hardware faders (the ‘feel’), but I was quickly convinced once I began using it.
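The resolution difference is easy to see with numbers. A 7-bit MIDI controller can only land on one of 128 values, so a normalized parameter gets snapped to the nearest step – audible as zipper noise on a slow filter sweep – while an OSC float arrives untouched. A quick sketch (illustrative only):

```python
def through_midi(x):
    """Round-trip a normalized 0.0-1.0 value through a 7-bit MIDI CC:
    quantize to the nearest of 128 steps, then scale back."""
    return round(x * 127) / 127

x = 0.3333
print(through_midi(x))  # ~0.33071 (CC value 42): an error of ~0.0026
print(1 / 127)          # ~0.00787: the smallest move a CC can make
```

A TouchOSC fader sending floats sidesteps that step size entirely, which matters most on slow, exposed parameter sweeps.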
The performance is obviously a combination of the content, tools, imagination and spontaneity – with a time factor. We also need to make sure we don’t stray from the title (Glasgow: A Symphony of a Great City). It is easy to get lost in a world of drones and electronic noises, but the film needs to make some amount of sense to the audience! Things can and might go wrong, which is why my patch has a PANIC! button and my sound interface will be within easy reach if I need to mute my output quickly.
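For the curious: a panic mute is usually a very fast fade rather than a hard cut, because snapping the output to zero mid-waveform produces a click of its own. In Max that would typically be a short line~ ramp into a *~; the idea, sketched in Python (the fade length is my choice here, not a magic number):

```python
def panic_gain(n_samples):
    """Gain envelope for a click-free emergency mute: a short linear
    ramp from full level (1.0) down to silence (0.0), one value per
    audio sample."""
    if n_samples < 2:
        return [0.0]
    step = 1.0 / (n_samples - 1)
    return [1.0 - i * step for i in range(n_samples)]

# 10 ms at 44.1 kHz is 441 samples - fast enough to feel instant,
# slow enough to avoid a click.
ramp = panic_gain(441)
```

Multiplying the output buffer by this envelope gets you silence within a frame or two of hitting the button, without adding a bang of its own.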
In future posts I will look at each of these sections in greater detail and outline solutions to the problems above.
If you are in Glasgow around 18 February, do drop in. More info here.