
Darkstream

Created by Henry Garner

-


During workshops on Darknet in 2014, we experimented with the potential for using smartphones and tablets, manipulated by performers, to create visual images. We knew this was something we wanted to build on in the full production. In particular, we were excited by the idea of streaming video live between the camera of one device and the screen of another, and of layering several streams between different devices at the same time to create a unique puppetry language. To achieve this, we developed an in-house solution that would talk to QLab (the software used by the stage manager to run cues for the show) and stream photos and video over a WiFi network between the phones and tablets being used by the performers.

What does it do?

During the performances of Darknet, we used two smartphones, three mid-sized (7-inch) tablets and two large (10-inch) tablets running the Darknet app.

The cast manipulated these phones and tablets while their screens displayed videos and live-streamed footage shot elsewhere on stage, creating a host of arresting and unusual images.

Charlotte, the virtual PA, was tailed by screens showing readouts and data. Candy, a futuristic cam-girl, became a puppet with a video feed for a head. And in the anonymous stores of the Dark Net, avatars were created as a collage of different body parts filmed from different parts of the stage.

How does it work?

The technology consists of a native Android app written in Clojure, called darknet-app, and a lightweight Node.js server written in JavaScript, called darknet-server. The app connects to the server over a WebSocket connection, which allows the server to send it commands. QLab script cues can make HTTP requests to the server, which in turn issues commands to the app.
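The relay pattern this describes can be sketched roughly as follows in Node.js. This is a minimal illustration only, not the actual darknet-server code: the routes, device identifiers and message fields below are invented for the example.

```javascript
// Sketch of the relay pattern: devices connect over WebSocket, and
// QLab-triggered HTTP requests are forwarded to them as JSON commands.
// Route names, ports and fields are assumptions for illustration.
const express = require('express');
const { WebSocketServer } = require('ws');

const app = express();
app.use(express.json());

const wss = new WebSocketServer({ port: 8081 });
const devices = new Map(); // device id -> websocket

wss.on('connection', (ws, req) => {
  // Assume each device identifies itself in the query string, e.g. ?id=tablet-1
  const id = new URL(req.url, 'http://localhost').searchParams.get('id');
  devices.set(id, ws);
  ws.on('close', () => devices.delete(id));
});

// A QLab script cue would hit this endpoint to send a command to one device.
app.post('/command/:device', (req, res) => {
  const ws = devices.get(req.params.device);
  if (!ws) return res.status(404).send('device not connected');
  ws.send(JSON.stringify(req.body)); // e.g. { action: 'play-video', file: 'intro.mp4' }
  res.send('ok');
});

app.listen(8080);
```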

The app connected over a WiFi network to a server running on the show computer, which integrated with QLab. This allowed the stage manager to send cues to any of the devices: displaying text, pictures or videos on their screens, or starting a stream from the camera of one phone or tablet to the screen of another. During the performances of Darknet, it could reliably run up to three video streams at the same time.
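As a hypothetical usage example, the kinds of commands a QLab cue might trigger could look like the requests below, shown here as a small Node.js script. Again, the host name, routes and JSON fields are assumptions for illustration rather than the documented darknet-server API.

```javascript
// Run as an ES module on Node 18+ (which ships fetch). Everything here is
// illustrative: replace the host, routes and fields with whatever the real
// server expects.
const base = 'http://show-computer.local:8080';

// Show a pre-recorded video on one tablet.
await fetch(`${base}/command/tablet-1`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ action: 'play-video', file: 'charlotte-readout.mp4' }),
});

// Stream the camera of one phone to the screen of another tablet.
await fetch(`${base}/command/phone-1`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ action: 'start-camera-stream', target: 'tablet-2' }),
});
```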

The app also kept the device awake and ignored accidental button pushes during the performance. All the performers had to do was pick up the right device and hold it in the right place.

During performances, we ran our own WiFi network on a Ubiquiti UniFi UAC, used only for the show.

The source code for both the app and the server is released open source under the GNU General Public License v2.0. We would be delighted if anyone else wants to use this technology for their own purposes. You can follow a link to the source code below, which includes some simple documentation on how to use it. We are always happy to try to answer any questions about the app or the server, or to give you advice on how to use it.

