Tim Murray-Browne – Interview Snippet


Tim Murray-Browne discusses Art and Interactive Technology



Tim Murray-Browne is an artist who creates performances and installations with interactive technologies. He recently spoke at an Artful Spark event under the theme of “Immersive Sound” at Google Campus. Samuel Fry interviews him about his work.

How would you describe what you do?

It depends who I’m talking to, but usually I would say: I’m an artist working with interactive technology. I create performances and installations featuring sound and visuals that respond to the actions of people in the space. In terms of realisation, much of my work involves coding. I code algorithms to interpret data from sensors such as cameras to understand what is happening in a space, logic to define how the system should respond, and generative graphics and sound design.
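A minimal Processing sketch of that sense, interpret and respond structure might look something like the following, with the mouse standing in for a sensor such as a camera, a smoothed speed value as the ‘interpretation’, and some simple generative graphics as the response. It is an illustration of the structure rather than code from any of the projects discussed here.

// A sketch of the sense -> interpret -> respond structure described above.
// The mouse stands in for a sensor; nothing here is taken from the projects discussed.

float smoothedSpeed = 0;   // the "interpreted" view of the raw input

void setup() {
  size(640, 480);
  noStroke();
  background(0);
}

void draw() {
  // Sense: raw input (here, mouse position; in an installation, a camera or other sensor).
  // Interpret: reduce the raw data to a higher-level quantity, a smoothed movement speed.
  float speed = dist(mouseX, mouseY, pmouseX, pmouseY);
  smoothedSpeed = lerp(smoothedSpeed, speed, 0.1);

  // Respond: simple generative graphics driven by the interpreted value.
  fill(0, 20);                      // translucent black leaves a fading trail
  rect(0, 0, width, height);
  fill(255);
  float diameter = 10 + smoothedSpeed * 4;
  ellipse(mouseX, mouseY, diameter, diameter);
}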

But a key part of the process is the concept and design that defines the overall audience experience. This often involves collaborating with others and spending time understanding related disciplines such as music and dance.

What interests you about people’s sense of identity in relation to their environment?

I’ve always been fascinated by how seemingly random events in our environment contribute to the person we become. We moved a lot when I was growing up, so I went to lots of different schools. Without really intending it, every time I started a new school I seemed to be a different person. But I didn’t really have any agency over who this person was – it just seemed to arise out of the combination of me and my new peers. It seems apparent that if someone acts towards you as though you occupy a certain role, then you will to some extent naturally take on that role. But I feel like we still get trapped in this paradigm of thinking that we are a fixed self that must act on our environment to nurture that self. We forget that we also rely on that environment to inform us of who this self is.

This is what draws me to working with interactive technology so much – particularly when it involves the moving body with music and abstract imagery. Music and dance have this strange way of saying so much while also saying nothing. The abstraction lets us explore our human activities together before we get focused on the specific personal details of our lives. In some sense, you can reduce human experience down to this dialogue between what we do and what we sense. Mixing interaction, music and dance lets you create an abstract microcosm of experience. This is a space where you can explore this complex relation between identity and environment.

Last year, you created a big interactive dance project called “This Floating World”. What did it involve?

This Floating World was a collaboration with the dance artist Jan Lee. It is a dance solo in an immersive interactive audiovisual space.

We set out working together with the aim of seeing how far we could interweave our disciplines. We wanted to avoid some of the usual hierarchies that can arise in these kinds of collaboration: rather than creating interactive audiovisuals that supported a dance piece, or a dancer showcasing an interactive space, we wanted a cohesive work where both disciplines shine in their natural way. Everything, from concept to code to choreography, was part of this collaboration, so it was a lengthy process of storyboards, prototypes and other ways of getting to grips with each other’s creative process. But well worth it. Working like this has opened a lot of doors in terms of how I create work.

The piece charts an individual’s journey of building and dismantling the self through worlds evoking ancient organisms, glowing neurone patterns, and smears of cave paintings. We worked with narratives of evolution, from amoeba to humans on the edge of self-destruction, and of human development, through the stages of becoming aware of the self, exploring the world and controlling our environment. This lends itself naturally to exploring different kinds of relationship between the performer and the space. Sometimes these two entities coexist independently. Sometimes they are in dialogue. Other times they fire off each other in a more chaotic way.

Tech-wise, our kit is simply Kinect, laptop, projector and sound system. I wanted to keep the materials simple and familiar so we could push what is possible with interactive technology at an artistic rather than technical level. Keeping the tech simple meant that the piece had to work at an artistic and conceptual level to impress (for a tech-savvy audience at least). But it also gave us time and space to focus on these elements.

I understand that you have released an open source version of the code for that project. What do you hope that developers will do with this?

Having said how we tried to keep the tech simple, working with one Kinect is complicated enough. The standard body-tracking software from Microsoft is too unstable for this style of dance. It’s great when you’re facing the camera head-on, but it quickly gets confused when you move more creatively, like lying on the ground and sticking your feet in the air. We didn’t want these kinds of limitations on the choreography for the piece, so we developed our own skeleton-tracking software, which was less accurate but more flexible with unusual body shapes and so better suited to this kind of project.

I made this using a number of open source libraries from the internet, as well as some help from my friend Ali Nakipoglu from Marshmallow Laser Feast, who had worked on an interactive dance project the previous year. So it felt right to release our work back into the community. There’s also a lot of coding work around the edges of this kind of project: tools to help you prototype and tweak things in the studio, record screengrabs, and manage the progression between different chapters of the piece. These are all part of the open source release, which we called This Floating Tracker. We haven’t included the code for generating the graphics or sound at this stage, as it’s still an active show, but I’m planning to release these later.

What do I expect developers to do with this? Well, it’s a great starting point if you’re working in this area, as you can get coding on some generative graphics and pull in data from a Kinect, such as the end points of the body or the optical flow from the scene (a measurement that describes the overall visual motion in the scene). But I think these kinds of releases are often most useful as a learning tool. You can see a lot of someone’s working process through their code, as well as the algorithms they’re using to make sense of the data arriving from the sensors. One caveat, however – This Floating Tracker is the tracking system from This Floating World as used in our performances. As we developed the piece we had to prioritise our time, and we also had a rule of no software changes in the two weeks before a performance so we could be sure the software was stable and responded consistently. So there are plenty of bugs and idiosyncrasies that we decided to work with rather than root out.
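For a rough sense of what such an overall-motion measurement looks like in code, the Processing sketch below derives a single motion value from a camera using simple frame differencing. It assumes the Processing video library and a webcam rather than a Kinect, and is a stand-in illustration rather than code from This Floating Tracker.

// A crude "overall motion" measure from a camera using frame differencing.
// Assumes the Processing video library and a webcam; this is a stand-in for the
// optical flow measurement mentioned above, not code from This Floating Tracker.

import processing.video.*;

Capture cam;
int[] previous;      // the previous frame's pixels
float motion = 0;    // smoothed estimate of how much movement is in the scene

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
}

void draw() {
  if (cam.available()) {
    cam.read();
    cam.loadPixels();
    if (previous != null) {
      // Sum brightness changes between consecutive frames.
      float total = 0;
      for (int i = 0; i < cam.pixels.length; i++) {
        total += abs(brightness(cam.pixels[i]) - brightness(previous[i]));
      }
      float current = total / cam.pixels.length;
      motion = lerp(motion, current, 0.2);   // smooth out frame-to-frame flicker
    }
    previous = cam.pixels.clone();
  }

  // Visualise: the bar grows with the amount of movement (the 30 is an arbitrary scale).
  background(0);
  fill(255);
  rect(0, height - 20, map(motion, 0, 30, 0, width), 20);
}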

How does your work on This Floating World relate to your research with Jan Lee?

After This Floating World, we were keen to keep working together. This Floating World had been very much a production – we were making a single piece to a deadline and everything we did revolved around making that piece as good as it could be. This meant passing over a lot of exciting possibilities along the way. So afterwards, we wanted to research some of these avenues, which we’ve done under the umbrella project name of Waiting for a Grain of Sand to Leap Into the Air.

We’re both musicians and we wanted to focus on sound – what I describe as ‘interactive sound spaces’, open spaces without any objects where movement of the body is tracked to produce sound. For me, there’s something pure about the simplicity of this as a canvas and I’ve been using this project to try to deepen my understanding of interactivity as a medium in its own right. We’ve constructed lots of simple interactive systems and games – a few leading to public performances. It’s been a lot of fun.
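As a toy illustration of the ‘interactive sound space’ idea, the sketch below maps a position straight to pitch and volume using the Processing Sound library, with the mouse standing in for a tracked body position. It is only a sketch of the mapping, not code from the research project.

// A toy "interactive sound space": a position is mapped directly to pitch and volume.
// The mouse stands in for a tracked body position; assumes the Processing Sound library.
// Purely an illustration of the mapping, not code from the research described above.

import processing.sound.*;

SinOsc osc;

void setup() {
  size(640, 480);
  osc = new SinOsc(this);
  osc.play();
}

void draw() {
  background(0);
  // Horizontal position controls pitch, vertical position controls loudness.
  float freq = map(mouseX, 0, width, 110, 880);
  float amp  = map(mouseY, 0, height, 1.0, 0.0);
  osc.freq(freq);
  osc.amp(amp);

  fill(255);
  ellipse(mouseX, mouseY, 20, 20);
}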

How can people without technical knowledge get started on interactive projects?

I think it’s important firstly not to label yourself as a person who does things without technical knowledge. Everything you learn along the way is technical, and there isn’t a boundary where you suddenly need to become ‘technical’. Even coding, which initially seems like a perpetually alien world, is something people gradually drift into, often inadvertently through graphical programming languages like Max. And if you’re collaborating with a technologist, then the more you learn about what they do, the more effectively you’ll be able to use it creatively. Also, there is so much open source material available online that you can make it do wildly different things with a few tiny tweaks. But you need to get stuck in and be prepared to break stuff along the way.

If you’re interested in working with visuals, I would suggest starting with Processing. It has a great community and lots of libraries to help you work with interactive tech such as cameras. Different people learn in different ways. I like to just buy a really good tutorial book and work through it cover to cover before I really start experimenting, but I think I’m unusual in that respect. Good Processing books are Daniel Shiffman’s Learning Processing and The Nature of Code – the latter is free online.
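In that spirit, a first generative Processing sketch to tweak and break might be as small as this; the values are arbitrary starting points and meant to be changed.

// A tiny generative sketch to start tweaking: a point wanders around the canvas
// driven by Perlin noise, leaving a trail. All values here are arbitrary starting points.

float t = 0;          // position along the noise field
float px, py;         // previous point, so we can draw a continuous line

void setup() {
  size(640, 480);
  background(0);
  stroke(255, 120);
  px = noise(t) * width;
  py = noise(t + 1000) * height;
}

void draw() {
  // noise() returns a smoothly varying value between 0 and 1.
  float x = noise(t) * width;
  float y = noise(t + 1000) * height;
  line(px, py, x, y);
  px = x;
  py = y;
  t += 0.01;
}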

For sound, I would start with Max. Max is a graphical programming language, which means you place little objects on a canvas and use the mouse to connect them up. It’s not cheap unfortunately, but there’s a 30-day trial. And it has some excellent tutorials built in. And again, there is a huge community online with projects you can download and start tweaking. You can also connect it to more conventional music production software like Ableton Live using MIDI, which can get you quick results. There’s also an open source alternative to Max called PureData, which is just as effective but a little rougher around the edges.

What work do you have coming up in the future?

We’re currently fleshing out ideas for a new interactive piece looking at the relationship between our movement habits and our handwriting. Its working title is Movement Alphabet. As with all interactive art, you really have to be there in the flesh as a participant to experience it. If any of your readers would like to hear about it before we exhibit it rather than afterwards, head to my website at http://timmb.com and join the mailing list, and I’ll send you an email when we have some more details.


Tim Murray-Browne spoke at an ArtfulSpark event. For more information, visit www.artfulspark.org or follow the organisers @stellarnetwork and @thoughtben on Twitter.
All images courtesy of Tadeo Sendon and Tim Murray-Browne.
