Coding Darkstream

By Henry Garner
January 14, 2017

For our production of Darknet (2016), our Associate Artist Henry Garner led on creating Darkstream - a bespoke piece of technology allowing us to stream images and videos between phones and tablets. We caught up with him about his experiences coding new technology from the rehearsal room.

Tell us about the project

Darknet is a play that explores questions of privacy and ownership of data online. It's the product of a collaboration between director Russell Bender and writer Rose Lewenstein. I was asked to join the project as a creative technologist.

Tell us more about how you became involved 

I've known Russell since we were at school together. Back then, I was studying art and he was focusing on maths and the sciences, and we went on to get degrees in fine art and computing respectively. Somewhere since then we must have crossed over, because I've been working in technology for the past decade. For 4 of those years I was director of my own data analytics company. The company sourced information from social media to understand consumers' interests for marketing purposes, so the character of Allen in the play was particularly resonant for me. I chatted to Russell regularly over the course of the play's development and I'd like to think some of those conversations fed into the plot.

What was the brief?

To develop an application which would allow a suite of mobile devices to be controlled from the cue desk. This would enable the custom technology commissioned for the production to mesh with the lighting and sound that help create the show. Russell's early technical experiments had already shown that a really interesting hybrid physical / visual space could be created by building mosaics of screens of different sizes. They were reminiscent of Hockney's polaroid collages or cubist portraits, but using video and three dimensions to create a disorientating physical space.

What was your contribution?

I worked closely with Russell in the early stages to devise a reliable and secure way of remote-controlling the devices. Russell had already produced a sophisticated prototype himself that he'd workshopped with actors, so he already knew the sorts of things he wanted to achieve. Things like showing images on screens simultaneously, or sending video from one device camera to another device screen. We're both familiar with modern software development methodologies, and we captured each of these requirements as a 'feature' and prioritised them according to how important they were for the play. As it's often hard to know how long software development will take, working in priority order ensured that we wouldn't waste valuable time on unnecessary details.

What was the most important feature?

This was probably ensuring that there were no glitches that would affect the flow of the play. The prototype had shown what was possible, but it wasn't as reliable as we needed it to be. One of the first things I set about doing was rewriting the app in a programming language called Clojure, which I was more comfortable with and which I felt would enable me to work more quickly in the run-up to opening night without compromising code reliability. It's a cliché that a new developer will come along and rewrite a codebase when they get started, but in this case I felt justified.

Henry Garner working on the Darkstream app

Why Clojure?

I've been using Clojure professionally for about 5 years, so it was already a language I could be productive in. It also happens to be one of only a handful of options available for writing code on Android devices. It has a terse syntax, which means I don't have to write very much code to achieve the same result. This pays dividends on a project where speed of development is key. Perhaps most importantly though, it's an example of a 'functional' programming language, and functional languages in general have become more popular in the past few years as developers find them effective at containing and managing the complexity associated with changing requirements. I fully expected that I would be tweaking and adjusting the features right up until the first performance, and I wanted a language that wouldn't impose a drag on this way of working.

What was the development process?

I've mentioned that we captured features and sorted them by priority. We used an online tool to store these so that both Russell and I always had access to the most up-to-date requirements, and I could keep Russell notified about my progress. I would work on the most important feature and test on a real Android device connected to my laptop, checking things worked whilst simultaneously developing software that would eventually run on the cue desk.

Once we got into rehearsals though, there was a different class of problem to solve. Issues emerged when we tried to run the code on different models of tablet, and when we exposed it to rugged treatment by the actors. This meant that in the final week I sat in on all rehearsals taking notes, fixing anything I could from the auditorium. At night I would work directly on the misbehaving devices that were being used by the actors during the day. It was a long week.

What were the challenges in getting it all working?

One unexpected challenge was getting the software working the same way on all of the Android devices being used. We were using several models made by several manufacturers, and subtle differences between, for example, screen and camera resolutions or aspect ratios meant that certain combinations of devices would produce stretched video when streaming from one to the other, which needed to be fixed on a case-by-case basis. Ensuring that it was impossible for the actors to accidentally exit the software, by disabling the touchscreen and physical buttons, also required different approaches depending on the device and the operating system it was running: so a completely unexpected problem was keeping the software running at all!
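To give a flavour of the aspect-ratio fixes, the underlying idea was to letterbox the incoming stream rather than stretch it to fill the target screen. The sketch below is a simplified illustration in Clojure rather than the production code, with made-up dimensions:

;; A simplified sketch of the letterboxing calculation: scale a source
;; frame to fit a target screen while preserving its aspect ratio,
;; centring it and leaving black bars on the longer axis.
(defn fit-within [src-w src-h dst-w dst-h]
  (let [scale (min (/ dst-w src-w) (/ dst-h src-h))
        w     (long (* src-w scale))
        h     (long (* src-h scale))]
    {:width  w
     :height h
     :x (quot (- dst-w w) 2)    ;; horizontal offset of the scaled frame
     :y (quot (- dst-h h) 2)})) ;; vertical offset (the black bars)

;; e.g. a 16:9 camera frame shown on a 4:3 tablet screen
(fit-within 1280 720 1024 768)
;; => {:width 1024, :height 576, :x 0, :y 96}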

What are some of the ways it was used in the play?

We had 7 handheld screens of different sizes to work with in the play. At various points they were used to bring out humorous details involving the digital Personal Assistants Charlotte and David through images and video, and to display the melancholy connection between Allen and Candy using streaming video. But where they really came into their own was in representing on stage the disorientating virtual space of the dark web itself, using all the techniques and all the devices at once. In particular, being able to stream video allowed us to present the actors both on stage and in virtual space simultaneously, while other screens could show virtual characters present only in the dark web. Hopefully these scenes brought the experience of stepping into this anonymised world viscerally to life for the audience.

Darkstream in action

So how does it work?

We made use of standard web technologies in putting the show together. All the devices run on a local network like you might have at home, and each is connected to the cue desk via something called WebSockets, a technology that all modern web browsers support. It allows two-way communication between a device and the server, which meant we could push messages to individual devices (to get one to display a particular image, for example). Setting up streaming video involved a chain of messages, where the server would request a camera stream from device A, which would send a message back once it was ready, triggering a third message to device B to display the stream directly from device A. As you'd imagine, these more complex sequences were more challenging to get working reliably.
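In outline, the cue-desk end looked something like the sketch below. It's written against the http-kit web server's WebSocket support, and the device ids, message keys and helper names are illustrative rather than the actual production code:

;; A simplified sketch of the cue-desk server using http-kit WebSockets.
(ns darkstream.sketch
  (:require [org.httpkit.server :as http]
            [clojure.edn :as edn]))

;; map of device id -> open WebSocket channel
(defonce devices (atom {}))

(defn send-to! [device-id msg]
  (when-let [ch (get @devices device-id)]
    (http/send! ch (pr-str msg))))

(defn handler [request]
  (http/with-channel request channel
    (http/on-receive channel
      (fn [raw]
        (let [{:keys [type device-id] :as msg} (edn/read-string raw)]
          (case type
            ;; each device announces itself as it connects
            :hello        (swap! devices assoc device-id channel)
            ;; device A's camera stream is ready, so tell device B to show it
            :stream-ready (send-to! (:target msg)
                                    {:type   :show-stream
                                     :source device-id
                                     :url    (:url msg)})
            nil))))))

;; cueing an image on a single device from the cue desk might then be:
;; (send-to! "tablet-3" {:type :show-image :src "candy.jpg"})

(defn -main []
  (http/run-server handler {:port 8080}))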

What was it like working alongside the actors and the rest of the creative team?

There's no doubt that a theatre is a challenging environment in which to write code, but it was so important for me to see how the software was being used by the actors. A particularly rewarding aspect of being a freelance developer is the direct connection you have to the users of your software, and releasing new code every day between rehearsals provided a strong sense of achievement. Conversely, when things were not working as intended there was a strong motivation to fix them! There were tense moments: it was a complicated show from a technical perspective, with lighting and sound and projection all being worked on simultaneously right up to the first preview. But the cast were patient and understanding, and the crew pulled together. It was a really rewarding show to be involved in.

What advice would you have for anyone else trying to make new software for theatre?

The most important advice is the same advice I'd give to any software developer, but it especially applies to software for use during a play: keep it simple. The first several days of my time were spent on establishing a very clear design for the software, and a naming convention for commands that would be sent to and from the cue desk. It wasn't fancy, but it provided a basic framework for everything that followed.
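To give a sense of what I mean, a consistent command vocabulary might look something like the snippet below; these names are illustrative rather than the exact ones we used:

;; An illustrative command vocabulary: every cue is a plain map with a
;; :type drawn from a small, consistently named set.
{:type :show-image   :device "tablet-1" :src "charlotte.png"}
{:type :clear-screen :device "tablet-1"}
{:type :start-stream :from "tablet-2" :to "tablet-5"}
{:type :stop-stream  :from "tablet-2" :to "tablet-5"}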

If the budget allows, make sure you have the resources to test the software without interrupting actors during rehearsals, or be prepared to work late nights. Also, allow plenty of contingency. I never would have guessed that the most basic requirement - keeping the software running as devices are handled aggressively backstage - would require several days to resolve!
