One of the compelling aspects of augmented and virtual reality is the capacity to create a secret world, an immersive experience that only you are witnessing.
From a compositional standpoint, AR and VR also offer a means to push the two-dimensional graphic score forward by placing the musician inside the score itself. Although virtual reality has been part of the computing landscape for several decades, it is only recently that the specialized equipment and software required have become available outside of academia. VR and AR games and apps have proliferated, and their presence in the general culture has become more widespread, since the introduction of the Oculus Rift (DK1) in 2013 and the HTC Vive in 2016. This paper follows the development of my VR / AR composition Hidden Motive from its initial stages in February, 2017 to its present development in November, 2018. My work with 360, VR and AR graphic scores has included many different types of technology, from AR phone apps to the Oculus Rift DK2, the HTC VIVE and the METAVision AR headset. In the process I have developed techniques for mixed reality in music concerts, live mixed reality projection, live telematic graphic scores, interactive score/controllers and sound sculpting with hand-tracking technology.
Composing in VR and AR
While primarily used for gaming and mobile apps, AR and VR also have unique capabilities in music. In the last decade, composers have put various 360, AR and VR technologies to use, in particular Paola Prestini (Hubble Cantata), Giovanni Santini (LINEAR) and in opera (X). In talking about composition for AR and VR it is important to distinguish between 360, augmented reality and virtual reality. 360 scores are two-dimensional images that can be viewed from all angles. With 360, the photo or video remains static, and cannot be touched or moved by the viewer. Virtual reality is interactive: it maps the position of the user and the environment using either external sensors in the corners of the room (HTC Vive) or inside-out tracking from the headset (METAVision). A user touches, moves and interacts with the environment using either handheld controllers (VIVE, Oculus) or hand-tracking technology (META). Virtual reality is an entirely digital environment (Oculus, VIVE), whereas augmented reality layers digital elements overtop real surroundings (METAVision).
Introduction to VR and AR at the Banff Centre - February, 2017
My own work on compositions for 360, AR and VR began at the Banff Centre in February 2017 during a residency called 'Concert in the 21st Century'. Although I had access to the VR system Oculus Rift DK2, I worked primarily with 360 still and video graphic scores. During this residency, I also collaborated with New Zealand artist Shannon Novak to create augmented reality sound installations using his artwork and my music. These were created using the program Aurasma and installed throughout the Banff Centre campus.
These initial 360 and AR projects introduced me to the significant difficulties of bringing VR into live music performance. The first and most immediate difficulty in VR and AR music composition is that most headsets, like the Oculus Rift and the Vive, occlude vision and cover part of the face, which makes them unsuitable for many musical instruments.
Although this is changing rapidly, VR technology has not always been particularly portable or affordable. It is only recently that laptops capable of running Vive or Oculus headsets became available. Wireless capabilities for headsets were only introduced in 2018, although 'backpack' systems for VR have been available since 2017. The equipment remains expensive, however. Working at the Banff Centre meant I had access to specialized VR equipment. From the end of the Banff residency until November, 2017 I worked primarily with 360 video because my only access to VR equipment was through my phone (Samsung S8) and the accompanying GearVR headset mount.
Audiences and Mixed Reality
Besides these practical considerations, from a compositional standpoint the primary difficulty in creating new works for VR and AR is bringing the audience into the first-person experience of the musician. One method of bringing the audience into the experience is simply to mirror the musician's score to a projector. Projecting 360 graphic scores works well because they are 2D, not 3D: a 360 score is simply a photo or video sized twice as wide as it is tall, and can be viewed through many different programs and headsets including Google Cardboard, Oculus Go and the Samsung GearVR.
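Because most 360 viewers expect an equirectangular image exactly twice as wide as it is tall, an arbitrary graphic usually has to be padded onto a 2:1 canvas before it can be displayed. A minimal sketch of that calculation (the function name and the centring choice are my own, not tied to any particular viewer):

```python
def equirect_canvas(width, height):
    """Return the smallest 2:1 canvas (W, H) that holds a width x height
    image, plus the (x, y) offset that centres the image on that canvas."""
    if width >= 2 * height:
        # The image is wider than 2:1, so the width fixes the canvas.
        canvas_h = (width + 1) // 2   # round up so W = 2H still covers the image
        canvas_w = 2 * canvas_h
    else:
        # The image is taller than 2:1, so the height fixes the canvas.
        canvas_h = height
        canvas_w = 2 * height
    offset = ((canvas_w - width) // 2, (canvas_h - height) // 2)
    return (canvas_w, canvas_h), offset
```

For example, a 600 x 400 graphic lands centred on an 800 x 400 canvas; any image editor or script can then do the actual padding.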
From March 2017 until October 2017 I performed several shows using 360 graphic scores including performances in Winnipeg, Manitoba and for Nocturne in Halifax, Nova Scotia.
Research at Copernicus Labs (October-November, 2017)
Starting in October, 2017 I began a collaboration with Ryan Cameron of Copernicus Labs and Electric Puppets. Ryan offered me access to VR equipment including an HTC Vive, and with it the VR art program Tilt Brush. The use of Tilt Brush as a starting point for creating three-dimensional VR graphic scores has both technical and artistic reasoning behind it. Within VR, Tilt Brush can create astonishing 3D visuals. It even has audio reactivity built in, which makes it an exciting platform for creating VR graphic scores.
However, as you can see, the exciting 3D aspect of Tilt Brush is extremely difficult to mirror for the audience. Projecting the score onto a screen or wall flattens the image to 2D, and the score loses some of its magic. This posed a serious problem for the composition: if I was creating a live 3D score in Tilt Brush for a musician, how could it be displayed for the audience in a way that mimicked its 'immersive' nature?
The first solution was a technique called mixed reality, in which 2D camera footage of the user is spliced into a third-person perspective of their VR environment.
I was fortunate that, along with Job Simulator (above), Tilt Brush is one of the only VR platforms to support quartered screens for mixed reality. The quartered screen peels the layers of the 3D environment into slices, allowing them to be re-layered with a live video feed in the streaming program Open Broadcaster Software (OBS). However, in order to create mixed reality, a green screen must first be used to subtract the background from the musician to enable this layering. Once I started working with my rudimentary green screen it became very clear that this technique could not possibly work in a music concert scenario.
Green screens are not practical for live music concerts for a few reasons. Firstly, the garish screen would be very distracting for the audience. As well, the set-up was too complex for a concert, as green screens need excellent lighting to work properly. It soon became clear that I needed a different means of subtracting the background from the musician. To mimic a green screen effect, I used ChromaCam and Isadora to cut away my background in a more practical way.
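The keying logic itself is simple enough to sketch: a pixel is treated as background when green clearly dominates, and those pixels are replaced with another layer. This is only an illustration of the principle (the threshold and function names are mine); ChromaCam in fact segments the person directly rather than keying on a colour, which is what makes a physical screen unnecessary.

```python
def green_key_mask(pixels, threshold=100):
    """Classify each RGB pixel as background (True) when green clearly
    dominates: the classic green-screen key."""
    mask = []
    for row in pixels:
        mask.append([g > threshold and g > r and g > b for (r, g, b) in row])
    return mask

def composite(fg_pixels, bg_pixels, mask):
    """Replace keyed-out foreground pixels with the background layer."""
    return [
        [bg if keyed else fg for fg, bg, keyed in zip(fg_row, bg_row, mask_row)]
        for fg_row, bg_row, mask_row in zip(fg_pixels, bg_pixels, mask)
    ]
```

In practice OBS performs this compositing on the GPU; the sketch only shows why good, even lighting matters, since any green spill on the performer will be keyed out too.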
By using ChromaCam, Isadora and OBS I was able to create a mixed reality illusion that I felt captured, in some ways, the immersive nature of Tilt Brush graphic scores.
Berklee Presentation - December, 2017
In December, 2017 I was invited to present my work to students at Berklee College of Music in Boston. Although I felt I was making progress with mixed reality, it still felt 2D and lacked the three-dimensional magic that users experience within the VR headset. Was it possible to bring this experience to the audience? For this Berklee workshop I experimented with using two projectors to create live mixed reality, allowing the live Tilt Brush graphics to be painted over the performer as they played.
Hidden Motive I (January to March, 2018)
In early 2018 I continued working on mixed reality and Tilt Brush, this time looking at its telematic properties. One of the advantages of the Tilt Brush platform is its ability to position virtual cameras within the VR space and send that camera feed to a live Youtube channel. A performer anywhere in the world could then view that channel, allowing them to perform from a 360 graphic score created live. The first Hidden Motive was performed at Interactive Traces – part of the Open Circuit Festival at the University of Liverpool – with cellist Kevin Davis. While the piece worked reasonably well, there was definite frustration on the part of the performer, who was not used to wearing a VR headset. While he initially found the 360 display interesting, the visual occlusion and the disruption to his performance practice led him to remove the headset halfway through the piece.
Acquisition of the METAVision Headset (April-May, 2018)
Following the presentation of Hidden Motive I, the META headset arrived along with a computer powerful enough to run it, thanks to a grant from my supervisor Dr. David Westwood and technical assistance from Ryan Cameron of Copernicus Labs.
Hidden Motive II (May, 2018)
Initially I thought to use the headset in a similar way to Hidden Motive I – as a delivery system for live improvised graphic scores, but with the practical addition of a clear visor. However, as I explored the META, particularly within the programming environment Unity, the affordances of the system began to emerge as something quite different from what I had anticipated. The META headset uses infrared light to map its surroundings, and uses that mapping both to recognize and track hand gestures (grab, stretch) in real time and to allow the placement of virtual objects relative to real ones. In the simplest terms it is augmented reality: it layers a virtual landscape overtop real life. A second advantage of the META is the use of Unity, which can be combined with the Tilt Brush SDK. Essentially, the graphic scores I was designing in Tilt Brush could be easily imported and displayed in the META. As well, I could add functionality to the graphic scores – making them interactive or animating them. In performance terms, the musician would be able to grab and move elements of the score, triggering audio files in the process.
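In Unity this amounts to a collider callback on each score element; the interaction logic itself reduces to a proximity test and a state toggle. A schematic Python version of that logic (the class, the grab radius and the clip name are all hypothetical illustrations – the actual piece is built in Unity with the META SDK):

```python
from dataclasses import dataclass

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

@dataclass
class ScoreElement:
    """One floating graphic in the AR score, tied to an audio loop."""
    clip: str
    position: tuple
    looping: bool = False

GRAB_RADIUS = 0.15  # metres; a hand this close can grab the element

def on_grab(element, hand_position):
    """When the tracked hand closes near an element, toggle its audio
    loop and let the element follow the hand."""
    if distance(hand_position, element.position) <= GRAB_RADIUS:
        element.looping = not element.looping
        element.position = hand_position
        return True
    return False
```

A grab both starts or stops the element's loop and repositions it, so moving the score and sculpting its sound become the same gesture.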
The second version of Hidden Motive was performed at the TENOR Conference in Montreal in May 2018. Here the performer is laptop artist Kristina Warren with the Concordia Laptop Ensemble. As you can see, the visor is clear, which makes performance a lot easier. It also projects well for the audience, giving them a first-person perspective of the artist interacting with the score. The artist can grab the graphics with their hands; the graphics loop sounds which the artist can then use to improvise.
Score or controller?
At this point Hidden Motive moved from the realm of 'graphic score' to the realm of 'controller', which brought up some interesting points. When I presented the work at TENOR, which is a notation conference, questions were raised by audience members as to whether the piece was a score or a controller. My impression was that these two categories were perceived as mutually exclusive: a score cannot be a controller and a controller cannot be a score. That would be equivalent to saying that a MIDI drum machine is a score, or that Ableton's Push device is a score. This presented me with what I thought was an interesting question in the virtual age. If something is no longer a physical object, but retains the same interactive properties, does it keep its classification or not? If 'touching' a floating graphic triggers an audio response, is it more a visual representation or an object? What does this mean in terms of composing a VR score?
Hidden Motive III (September - November, 2018)
The third version of Hidden Motive is a further refinement, one that includes not just the triggering of sounds but their manipulation through hand movements. One of the aspects of the META headset I appreciate as a musician is its reactivity to movement. It has great potential as both a locus of controlling sound and a locus of visual representation of sound. The headset uses tracking technology similar to that used in the Kinect. What I find unique about the META is that it also synthesizes this movement tracking with three-dimensional graphics. So although you see a 2D representation on the screen, within the headset the work has depth and width and can be moved in a natural manner. Sound files can be started and stopped with different movements. In addition, a manipulation field allows for working with pitch, distortion and other parameters.
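The mapping behind such a field can be described simply: normalise the hand's position within the field's volume, then scale each axis onto a parameter range. A hedged sketch of the idea (the axis assignments, ranges and names are illustrative assumptions, not the piece's actual mapping):

```python
def clamp(value, low=0.0, high=1.0):
    """Keep a normalised coordinate inside the field."""
    return max(low, min(high, value))

def manipulation_field(hand, origin=(0.0, 1.0, 0.5), size=0.5):
    """Map a tracked hand position inside a cubic 'field' to sound
    parameters: height controls pitch, depth controls distortion.
    The width axis is left free here (it could drive another parameter)."""
    # Normalise the hand position relative to the field's origin.
    nx, ny, nz = (clamp((h - o) / size) for h, o in zip(hand, origin))
    pitch_semitones = -12 + 24 * ny  # two octaves across the field's height
    distortion = nz                  # 0 (clean) to 1 (fully distorted)
    return pitch_semitones, distortion
```

Each tracking frame re-evaluates the mapping, so a slow hand traces a continuous glissando while a fast one jumps between parameter states.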
But what I love about working with this headset is that the piece becomes more than the sum of its programming. Because you can't see the field you are interacting with, you become an explorer, carefully working out the affordances of this particular piece. There is also a glitching aspect, where slower hand movements can create richer sound effects.
Hidden Motive lies at the intersection of graphic score, controller and improvisational movement. The goal of the work is the discovery and manipulation by the performer of the affordances of the technology, which is done through physical interaction and exploration. Hidden Motive explores how individual bodies interact within a performative augmented reality environment.