
Venue Explorer

Enhance a live event for local or remote audiences using interactive data, video and audio.

Published: 1 January 2013

A set of tools that provides live data to accompany a performance or event, allows the user to interactively explore a panoramic view of the scene, and remixes the audio to match their view.

Project from 2013 to present

Venue Explorer delivering programme notes during a BBC Philharmonic concert in November 2018 at the Bridgewater Hall, Manchester

To enhance the experience of the audience at live events, we have been focusing on a specific use case in partnership with the BBC Philharmonic: providing live programme notes that audience members can view on their mobile phones. The notes are prepared in advance, then triggered at the right moments in the performance by an operator following the musical score, and delivered to a web application. The application can also show the musical score, with an indication of the current location.

  • Much excitement in the booth at the back of Bridgewater Hall as we get ready for our first live programme notes to mobile devices. Phones on in a concert?!
  • — BBC Philharmonic (@BBCPhilharmonic) September 22, 2018
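To make the delivery mechanism concrete, here is a minimal hypothetical sketch in TypeScript: an operator-triggered cue is pushed to the audience's web application over a WebSocket. The Cue shape and the feed URL are invented for illustration; the actual system's protocol is not described here.

```typescript
// Hypothetical sketch of the cue-delivery idea: an operator triggers
// pre-authored programme notes, which are pushed to audience web apps.
// The Cue shape and CUE_FEED_URL are illustrative, not the real system.

interface Cue {
  id: string;             // identifier of the pre-authored note
  title: string;          // short heading shown to the audience
  body: string;           // the programme note text
  scoreLocation?: string; // e.g. "bar 132", for the score view
}

const CUE_FEED_URL = "wss://example.org/philharmonic/cues"; // placeholder

// The audience web application subscribes to a feed of cues.
function listenForCues(onCue: (cue: Cue) => void): WebSocket {
  const socket = new WebSocket(CUE_FEED_URL);
  socket.onmessage = (event) => {
    const cue: Cue = JSON.parse(event.data);
    onCue(cue); // render the note and highlight the score location
  };
  return socket;
}

// Example usage: display each note as the operator triggers it.
listenForCues((cue) => {
  console.log(`[${cue.scoreLocation ?? "now"}] ${cue.title}: ${cue.body}`);
});
```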

Improving the experience for the audience at home

For remote audiences, we have focused on allowing them to freely explore a wide-angle view of the scene, with audio being re-mixed to match their view.

An ultra-high definition video of a live scene is captured from a fixed wide-angle camera overlooking the whole event. The image is displayed in a conventional tablet or PC web browser, in a way that allows the user to pan and zoom around the scene to explore the areas of most interest to them, much as they would when using a map application. We have investigated approaches whereby we only have to transmit the portion of the scene that the user is looking at, significantly reducing the bandwidth requirements. This could work as either a stand-alone or a second-screen experience; a tablet is an obvious starting point for second-screen applications.
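One way to realise "only transmit the visible portion" is to divide the frame into a grid of tiles and request just those the viewport overlaps, which matches the tiled video described later on this page. The TypeScript sketch below is a minimal illustration under that assumption; the tile size and the Viewport shape are invented for the example.

```typescript
// Minimal sketch of viewport-driven tile selection, assuming the
// ultra-high-definition frame is pre-divided into a fixed grid of tiles.
// The grid dimensions and the Viewport shape are assumptions.

interface Viewport {
  x: number;      // left edge in source-image pixels
  y: number;      // top edge in source-image pixels
  width: number;  // visible width in source-image pixels
  height: number; // visible height in source-image pixels
}

const TILE_WIDTH = 960;  // assumed tile size in pixels
const TILE_HEIGHT = 540;

// Return the (column, row) indices of every tile the viewport overlaps,
// so only those tiles need to be requested from the server.
function visibleTiles(view: Viewport): Array<[number, number]> {
  const firstCol = Math.floor(view.x / TILE_WIDTH);
  const lastCol = Math.floor((view.x + view.width - 1) / TILE_WIDTH);
  const firstRow = Math.floor(view.y / TILE_HEIGHT);
  const lastRow = Math.floor((view.y + view.height - 1) / TILE_HEIGHT);

  const tiles: Array<[number, number]> = [];
  for (let row = firstRow; row <= lastRow; row++) {
    for (let col = firstCol; col <= lastCol; col++) {
      tiles.push([col, row]);
    }
  }
  return tiles;
}
```

As the user pans or zooms, the client re-runs this selection and fetches only the tiles that have newly entered the view, which is what keeps the bandwidth requirement well below that of sending the full ultra-high-definition frame.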

To provide an audio feed of the area that the user is currently looking at, we create audio feeds relating to individual areas of the scene and an overall mix suitable for a wide view, and mix between these as the view is changed. When viewing a wide shot, the audio conveys the overall ambience, similar to what would be heard by someone in the audience. As the viewer zooms in to an area, the audio is re-mixed to be appropriate for the selected region. For an application in an athletics stadium, the audio feeds for different events could be obtained from the existing outside broadcast operation, and the ambience feed from a microphone near the camera. For an application in a music or arts event, different audio mixes could be created for different areas, using feeds from many microphones. This audio work forms part of our wider research on object-based media.
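The text does not specify the exact mixing law, but a simple zoom-driven crossfade conveys the idea. In the hypothetical sketch below, the ambience mix dominates in the wide shot, and per-region feeds are weighted up as the user zooms towards them. GainNode is the standard Web Audio API gain control; the weighting formulas are assumptions made for illustration.

```typescript
// A minimal sketch of view-dependent audio mixing: crossfade between an
// overall ambience mix and per-region feeds as the user zooms in.
// The weighting scheme here is an assumption, not the real system's.

interface RegionFeed {
  centerX: number; // region centre in source-image pixels
  centerY: number;
  gain: GainNode;  // Web Audio API gain node for this feed
}

function updateAudioMix(
  viewX: number, viewY: number,          // top-left of the current view
  viewWidth: number, viewHeight: number, // size of the current view
  fullWidth: number,                     // width of the full wide-angle image
  ambience: GainNode,                    // the overall wide-shot mix
  regions: RegionFeed[],
): void {
  // zoom runs from 0 (full wide shot) towards 1 (tightly zoomed in).
  const zoom = 1 - viewWidth / fullWidth;
  ambience.gain.value = 1 - zoom; // wide view favours the ambience mix

  const cx = viewX + viewWidth / 2;
  const cy = viewY + viewHeight / 2;
  for (const region of regions) {
    // Feeds near the centre of the view are weighted up as we zoom in.
    const dist = Math.hypot(region.centerX - cx, region.centerY - cy);
    const proximity = Math.max(0, 1 - dist / viewWidth);
    region.gain.gain.value = zoom * proximity;
  }
}
```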

We also acquire data relating to the scene. For an application at an athletics meet, this data could include background information on the various athletics events, and live information giving the latest results. For an arts event, it might include the names and biographies of actors. We use a version of an authoring tool we developed for an earlier project, modified to receive a live video input, to specify the location in the image associated with live data feeds, and also to create additional overlays manually. The user can choose to overlay this information on the image, aligned with the corresponding location, providing an 'augmented reality' display. This approach could in future be automated by using techniques such as object and face recognition, potentially allowing details of every athlete visible in a stadium to be made available as an overlay.
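Aligning an overlay with "the corresponding location" amounts to mapping a point in the full source image into the user's current pan/zoom view. The sketch below shows one hypothetical way to do that mapping; the function and parameter names are invented for illustration.

```typescript
// Hypothetical helper for the 'augmented reality' overlay: map a data
// item's anchor point in the full source image to screen coordinates
// for the user's current pan/zoom view.

function overlayPosition(
  sourceX: number, sourceY: number,           // anchor in source-image pixels
  viewX: number, viewY: number,               // top-left of the current view
  viewWidth: number, viewHeight: number,      // size of the current view
  screenWidth: number, screenHeight: number,  // display area in CSS pixels
): { x: number; y: number } | null {
  // If the anchor lies outside the current view, don't draw the overlay.
  if (
    sourceX < viewX || sourceX > viewX + viewWidth ||
    sourceY < viewY || sourceY > viewY + viewHeight
  ) {
    return null;
  }
  // Scale from source-image coordinates to screen coordinates.
  return {
    x: ((sourceX - viewX) / viewWidth) * screenWidth,
    y: ((sourceY - viewY) / viewHeight) * screenHeight,
  };
}
```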

">主播大秀 Philharmonic - Introducing the Red Brick Sessions

Venue Explorer is an example of one way in which broadcasting could move towards what is known as an object-based approach: current TV systems send the same audio and video to everyone, mixed by the broadcaster. In this system, the content is divided into separate 'objects': the video is divided up into tiles, the audio is sent as a number of separate streams relating to particular picture areas, and overlay data is sent separately, with information about the place in the image it relates to and what kind of data it is (results, schedule, etc.). The user's application assembles these objects according to the view selected by the user.
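As an illustration of the object-based idea, the sketch below describes the three kinds of object named above (video tiles, audio streams, and overlay data) as TypeScript types that a client application could assemble. The field names are assumptions rather than the actual broadcast format.

```typescript
// A rough sketch of how the separate 'objects' might be described to the
// client application; field names are assumptions, not the real format.

type OverlayKind = "results" | "schedule" | "biography";

interface VideoTile { col: number; row: number; url: string }
interface AudioStream { region: string; url: string }
interface Overlay {
  kind: OverlayKind;
  sourceX: number; // anchor point in the full source image
  sourceY: number;
  payload: string; // e.g. the latest result text
}

// Everything the client needs to assemble a personalised view,
// combining the tile-selection, audio-mix, and overlay sketches above.
interface MediaObjects {
  videoTiles: VideoTile[];
  audioStreams: AudioStream[];
  overlays: Overlay[];
}
```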

Outcomes

We conducted the first live public trial of the system at the 2014 Commonwealth Games in Glasgow. We had a camera viewing the opening ceremony in Celtic Park, which we then moved to Hampden Park to cover the athletics and the closing ceremony. In our public demonstration area at the Glasgow Science Centre, we showed both live and recorded content to members of the public. Data feeds for schedules and results were taken from the sources already available to BBC Sport from the games organisers. Audio feeds were taken from a double mid-side (MS) microphone next to the camera (for stadium ambience), as well as from feeds for various events produced as part of the main TV coverage. All video and audio feeds at the production side were handled using a system developed by BBC R&D.


We also worked with TNO (one of our project partners) to conduct trials on the open internet, using their tiled streaming application for the iPad.

The results of the trial were presented in a paper, and demonstrated on BBC R&D's exhibition stand.

Following this, we captured a musical performance to investigate an application in the area of music and arts, and demonstrated the results publicly.

We then conducted various other public trials, including one with Radio 2 in February 2017. NB: these trials only work in the Chrome web browser.

The in-venue programme notes system has been in regular use at BBC Philharmonic concerts at the Bridgewater Hall in Manchester since September 2018. Look for the 'Philharmonic Lab' symbol in the programme and book a seat in the designated area of the auditorium.


Project Team

  • Bruce Weir (PhD)

    Senior Engineer
  • Stephen Perrott

    Senior R&D Engineer
  • Dave Evans

    Technologist
  • Matthew Paradis (BA(Hons), MSc, PhD)

    Senior R&D Engineer (Audio)
  • Paul Debenham

    Senior Engineer
  • Graham Thomas (MA PhD CEng FIET)

    Head of Applied Research, Production
  • Hannah Birch (CEng MIET)

    Research Technologist
  • Becky Gregory-Clarke

    Research Technologist
  • Samuel Bason (MEng)

    Research Technologist
  • Immersive and Interactive Content section

    The IIC section is a group of around 25 researchers investigating ways of capturing and creating new kinds of audio-visual content, with a particular focus on immersion and interactivity.
