A collaboratively devised experiment named "Proxima" had a showing a week ago Friday as part of a series called "The Experiment" at the East Sydney Community Arts Centre in Darlinghurst (the old Heffron Hall), presented by Brand X and curated by Jaimie Leonarder. I say "showing" because I consider it an experiment with different elements of sound, technology and movement... The 30-minute experience started with Ben and me (under our artist name "Flœk") explaining to the audience a bit about what we were going to do: I was wearing a heart rate monitor, and our phones were harnessed to our chests. During the performance, my live heartbeat was sonified, and the soundscapes were affected by Ben's and my movement (via the accelerometer/gyro in our phones). We also used our phones as the controllers to trigger each section/phrase of the work, which gave us a movement theme of separation and coming back together, and there was a sequencer component triggering live synthesis algorithms we'd designed (some quite simple, some quite complex).
Because we were using the audience's phones as the sound system, we were able to create some interesting spatial effects, and because we could isolate specific phones, Ben and I could have unique sounds on ours... The whole experience was unique to each individual, as each person's phone had its own part to play in the music.
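One simple way to give every phone its own part (a sketch only; I'm not claiming this is the scheme Proxima used, and the part names below are made up) is to hand each connecting client an index and derive its voice from that:

```typescript
// Hypothetical sketch: assigning each audience phone a unique part.
// The real Proxima system may well have worked differently.

const PARTS = ["high-noise", "cicada", "chord-tone-1", "chord-tone-2"];

// Deterministically map a client's join index to one of the available
// parts, cycling so every part stays covered as the audience grows.
function partForClient(clientIndex: number): string {
  return PARTS[clientIndex % PARTS.length];
}
```

A server handing out sequential indices on connection would then let specific phones (say, the performers' own) be addressed individually while the rest share the ensemble parts.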
I don't think this has been done before... and I'm really excited by its possibilities.
It was really successful as a developmental showing of some new ideas and tech. Compositionally, of course, it needs work. It had some interesting ideas, from very high-pitched "noise" sounds (like bellbirds or crickets) to a cicada scream, to later octatonic mega-chord structures, but the whole performance aspect was Ben and me moving apart and coming back together in different ways, at different speeds, and at different heights/levels. There were lots of human ways to interpret it, and if we take it further these aspects would be honed with more precision...
Dean Walsh helped us with some movement ideas we could play with, and Guy James Whitworth helped out with costuming and makeup.
It was quite a feat to get it together; Ben was coding until 8:40pm, and I was thinking about what to cut and how to simplify without ruining it entirely. We'd been glittered and painted, with phones harnessed to our chests, but the bloody code wasn't functioning as we'd thought it would! The hilarity of it was that the bits working just hours before the performance were the bits that hadn't worked the previous week, and the bits failing hours before the performance were the bits that had worked previously.
Fortunately, Ben discovered the issue at the very last minute, less than an hour before we were meant to be on. He added a button for the audience to "Enable Sound" on their phones, and hey presto, it worked (we had expected the site to start sound automatically, which it did in some browsers but not others).
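For anyone hitting the same wall: this is almost certainly the browsers' autoplay policy at work. Most mobile browsers create a Web Audio `AudioContext` in a "suspended" state and only allow it to be resumed from inside a user gesture, like a button tap. A minimal sketch of the pattern (the interface is trimmed to the two members the pattern needs so it stands alone outside a browser; in the page you'd pass a real `AudioContext`):

```typescript
// Minimal sketch of the "Enable Sound" pattern for autoplay policies.
// Structural interface covering just the AudioContext members used
// here, so the sketch runs outside a browser too.
interface AudioContextLike {
  state: string; // "suspended" | "running" | "closed"
  resume(): Promise<void>;
}

// Call this from a user-gesture handler (e.g. the "Enable Sound"
// button's click listener): resumes the context if the browser left
// it suspended, and does nothing if sound already started on its own.
async function enableSound(ctx: AudioContextLike): Promise<void> {
  if (ctx.state === "suspended") {
    await ctx.resume(); // permitted here because we're inside a gesture
  }
}
```

In the page itself this would be wired up as something like `button.addEventListener("click", () => enableSound(audioCtx))`, which covers the browsers that refuse to self-trigger while staying harmless in the ones that don't.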
There's heaps of work to do, but the potential of these capabilities is evident. For me, expanding the tech and the compositional possibilities it brings is a great evolution in my composition practice, and it amalgamates my interests.
We'll see where it leads us!