
Exploring film soundtracks with Radio 2 and BBC R&D

Published: 10 February 2017
  • Bruce Weir (PhD), Senior Engineer

BBC R&D has a long history of developing new ways of mixing video with graphics in order to present interesting content to the viewer. An earlier version of Venue Explorer, our experimental web application, was trialled at the Glasgow Commonwealth Games, and we are now trying it out with a series of concert sessions and a special episode of Friday Night is Music Night.

We first trialled the concept at the Glasgow Commonwealth Games, and we have since been working with the BBC Concert Orchestra and the BBC Philharmonic to develop the technology further. As well as offering extra information to the viewer in the form of graphical overlays describing the performance, both of these recordings also include audio ‘zooming’.

As you zoom into the video you will notice the audio mix change, highlighting the section of the orchestra you are viewing.

 

The server delivers the web application and uses Socket.IO over WebSocket as the real-time messaging interface between the editing tool, control interfaces, server and client browsers. We currently deploy multiple instances of the service.

This technology stack provides us with a simple deployment route, as well as the ability to handle both recorded and on-demand video services. The real-time comms channel via SocketIO/WebSocket allows live control over the service - triggering, adding or editing the graphical overlays during a streamed performance for example, or displaying live data from an external source, such as a musical score follower.
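The live-control channel described above can be sketched as a small message protocol: the editing tool emits overlay events, and each client browser folds them into its local overlay state. This is a minimal sketch under our own assumptions — the event shapes, channel names and helper functions here are hypothetical, not the production code.

```javascript
// Hypothetical overlay-event protocol (names and fields are assumptions).

// Build an overlay event as the editing tool might send it to clients.
function makeOverlayEvent(id, action, payload) {
  return JSON.stringify({ id, action, payload, ts: Date.now() });
}

// Apply an incoming event to the client's overlay state (a plain object),
// returning a new state so the UI can re-render from it.
function applyOverlayEvent(overlays, message) {
  const event = JSON.parse(message);
  const next = { ...overlays };
  if (event.action === 'add' || event.action === 'edit') {
    next[event.id] = event.payload; // create or update the overlay
  } else if (event.action === 'remove') {
    delete next[event.id];          // remove a triggered overlay
  }
  return next;
}

// With Socket.IO the client side would be wired up roughly as:
//   socket.on('overlay', msg => { state = applyOverlayEvent(state, msg); });
```

The same channel could carry live data from an external source, such as a score follower, by adding another event type.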



Audio Zooming

To produce the audio remixing effect, we capture multiple audio channels from the BBC outside broadcast truck. Rather than streaming the stereo broadcast mix to the browser, we split the audio into its component parts which, when combined (or summed), produce the stereo mix.

This means that at any time we can be streaming up to 24 channels of audio to the browser. The streams are generally separated by instrument group, for example Piano, First Violins, Harp and Woodwind. When you zoom in or pan the video we track the zoom-level and pan-position in the video scene, and modify the loudness and spatial position of the audio sources or objects in response. Static sources, such as the overall sound of the venue, are louder when zoomed out and quieter when zoomed in.

Conversely, dynamic sources such as individual instrument groups are louder when zoomed in and quieter when zoomed out, and louder when closer to the pan-position of the user. The level of the dynamic sources in the mix is adjusted to reflect the distance of the sound source from your pan position in the scene. Audio processing is all carried out in the browser using the Web Audio API.
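The zoom- and pan-driven remix logic above can be sketched as a pair of gain functions: one for static sources that fade out as you zoom in, and one for dynamic sources that get louder with zoom and with proximity to the pan position. The specific curves and names here are our own assumptions for illustration, not the production mixing code.

```javascript
// Hypothetical gain curves (a sketch, not the broadcast implementation).

// Static sources (e.g. the overall venue sound) fade out as you zoom in.
// zoom: 1 = fully zoomed out, larger values = zoomed further in.
function staticGain(zoom) {
  return 1 / zoom;
}

// Dynamic sources (instrument groups) get louder as you zoom in, scaled by
// how close the source sits to the current pan position in the scene.
function dynamicGain(zoom, panX, panY, srcX, srcY) {
  const dist = Math.hypot(srcX - panX, srcY - panY); // distance in the scene
  const proximity = 1 / (1 + dist);                  // closer => nearer 1
  const zoomBoost = 1 - 1 / zoom;                    // 0 when fully zoomed out
  return zoomBoost * proximity;
}

// In the browser, each of the up-to-24 streams would feed a Web Audio
// GainNode, updated on every zoom/pan change, e.g.:
//   gainNode.gain.setTargetAtTime(
//     dynamicGain(zoom, px, py, sx, sy), audioCtx.currentTime, 0.05);
```

Smoothing the gain change with `setTargetAtTime` (rather than setting `gain.value` directly) avoids audible clicks as the user zooms and pans.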

Try for Yourself

BBC Radio 2 explains how you can play around with some of the most iconic film soundtracks of all time.

— BBC Radio 2 (@BBCRadio2)

Friday Night is Music Night Remixed

Recently, Venue Explorer has been used to create a demo with the BBC Concert Orchestra, called Friday Night is Music Night Remixed. Three pieces of music were recorded for it: Star Wars, Jaws and Jurassic Park.

BBC Philharmonic - The Red Brick Sessions

We have been working with the BBC Philharmonic to offer enhancements to The Red Brick Sessions concert series.


  • Immersive and Interactive Content section

    The IIC section is a group of around 25 researchers investigating ways of capturing and creating new kinds of audio-visual content, with a particular focus on immersion and interactivity.
