
360 Video and Virtual Reality

Investigating and developing 360-degree video and VR for broadcast-related applications

Published: 1 January 2014

Immersive experiences for audiences present new challenges for content creation and delivery

Project from 2014 - present


What we're doing

360 video and VR encompass a wide range of technology, from monoscopic rotatable video on a web page or mobile device, through smartphone-based headsets, to high-end headset-based VR experiences allowing full movement and interaction. 360-degree pannable interactive stills have been around since the mid-1990s, but it is only recently that it has become practical to capture and deliver 360-degree video, and to view it on an affordable headset of reasonable quality.
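Monoscopic 360 video is commonly stored as an equirectangular frame, and a player maps the viewer's head orientation to a region of that frame. A minimal sketch of that mapping (the function name, conventions and frame size are illustrative assumptions, not taken from any BBC player):

```python
def view_to_pixel(yaw_deg, pitch_deg, width, height):
    """Map a viewing direction (yaw/pitch, in degrees) to the pixel it
    corresponds to in an equirectangular 360 frame.
    Convention here: yaw 0 = centre of the frame, pitch +90 = straight up."""
    # Yaw wraps horizontally: the frame spans -180..+180 degrees.
    x = int(((yaw_deg + 180.0) % 360.0) / 360.0 * width) % width
    # Pitch maps vertically: +90 (up) at row 0, -90 (down) at the bottom row.
    y = int((90.0 - pitch_deg) / 180.0 * height)
    return x, min(max(y, 0), height - 1)

# A "4K" equirectangular frame: looking straight ahead lands mid-frame.
print(view_to_pixel(0, 0, 3840, 1920))   # (1920, 960)
```

A real player does this per-pixel on the GPU (typically by texturing the inside of a sphere), but the same longitude/latitude mapping underlies it.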

VR experiences, by which we mean those based on 3D graphics or "CGI" (computer-generated imagery), have been available in research labs and industrial applications for many years. Recent advances in headsets (displays and motion sensors), plus increased rendering power, mean that such experiences are becoming available on consumer-level devices. 360 video can provide highly realistic imagery and is comparatively easy to produce, but lacks interactivity and the ability for the viewer to move freely.

VR, on the other hand, can provide very rich interactivity and movement, but tends to be more expensive to produce and is technically challenging to make photorealistic, although commercial solutions such as volumetric capture and early approaches to light-field acquisition and rendering are becoming available.

David Attenborough in a VR video

At BBC R&D our job is to help the BBC understand more about what these technologies might offer our audience, and to influence and contribute to developments in technology, from editorial through to production, distribution and consumption. There is already a lot of activity in the wider industry, particularly focused on gaming: our focus is instead on how this technology can complement the kind of content that the BBC produces, rather than a pure 'gaming' experience.

It is still very early days for this, with no established editorial guidelines, production pipelines or universal distribution methods, and care is needed to identify the end-user experiences that will truly benefit from 360 or VR technology rather than simply creating a brief 'wow' moment. This is a very broad area of work, involving a number of projects across R&D teams in areas including editorial, production and user experience. Links to other projects, blog posts and demonstrators are given below.

VR headset user

360-degree video and VR are examples of a new form of content: one that goes beyond what the BBC currently produces, and that can be enabled by an all-IP production and distribution chain that is not tied to traditional audio-visual content formats. Our VR/360 work thus forms part of our broader programme of work in this area, which is helping us to imagine a BBC of the future, not necessarily the one that exists today.

This is not the first time that we have looked at how immersive and interactive video and audio systems and real-time 3D graphics can be used to enhance our audience's experience. Previous examples include:

  • Technology for mixed reality TV production, developed between 1997 and 2005, used to composite virtual content into real scenes in real time, and resulting in commercial products.
  • An augmented reality experience for a BBC/Natural History Museum installation in the Attenborough Studio, launched in 2010.
  • 3D audio, with our early application work continuing in various current projects.
  • An investigation into the possibilities of a room-sized 180-degree projection system to augment a conventional TV display, developed in 2010-2011.
  • Panoramic video and 3D audio providing navigable interactive experiences in earlier projects.

R&D's 360/VR work can be broadly divided into four areas: investigating the latest 360/VR technology, developing specific production tools to fill gaps, commissioning content across a range of genres to assess the editorial possibilities, and carrying out user experience research to gain deeper insight into viewer behaviour. Put another way: 'what new technology is already available, what other technology would we need, what can we make with it, and what do viewers think?'

'Easter Rising: Voice of a Rebel' screenshot

Understanding the technology and implications for production and delivery

We have investigated key parts of the production chain and various filming techniques so that we understand the limitations: this makes us an informed customer when dealing with external production companies, and allows us to provide advice to the wider BBC. It should be noted that 360/VR production is still fraught with technical problems, and based on our experience and general technical knowledge we have been called upon to advise external production companies on various BBC commissions: no one is truly an expert in this area yet!

One of our first production experiments was setting up a live stream of 360 video and 3D audio. Since then we have been directly involved in producing a number of 360-degree videos, often working with independent production companies. We have also been investigating various delivery approaches, including linking from our experimental content portal BBC Taster to YouTube and assessing other web-based players.

We also launched a service built on a WebVR 360 player that we developed, allowing producers to deliver 360-degree content on the web.

A man wearing a VR headset

Production tools and technology gaps

We are developing production tools to fill some of the gaps between the current TV workflow and what is needed for VR/360. Most of our work so far has been on object-based 3D audio production tools, building on the Audio Definition Model (ADM), the standard for representing object-based audio in production whose development we have led. This work was already underway before the current rise in prominence of VR/360, making us well placed to apply it in this area.

Our recent developments include two plug-ins for the Unity games engine, one to import ADM files and one for binaural rendering. We also developed a tool that allows the location of sound objects to be visualised in a VR headset during audio post-production; it has been used for various 360 video productions, including pieces released on Taster in 2015. We have recently configured our production tools to export 3D audio for delivery.
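The ADM (ITU-R BS.2076) describes object-based audio as XML metadata, including each object's position over time. As a rough illustration of the idea, not of the BBC's actual tooling, here is a sketch that pulls object positions out of a deliberately minimal, hand-written ADM-style fragment (real ADM metadata is considerably richer):

```python
import xml.etree.ElementTree as ET

# A heavily simplified fragment in the spirit of ITU-R BS.2076,
# written by hand for illustration only.
ADM_FRAGMENT = """
<audioFormatExtended>
  <audioChannelFormat audioChannelFormatName="Narrator" typeLabel="0003">
    <audioBlockFormat rtime="00:00:00.0" duration="00:00:05.0">
      <position coordinate="azimuth">-30.0</position>
      <position coordinate="elevation">0.0</position>
      <position coordinate="distance">1.0</position>
    </audioBlockFormat>
  </audioChannelFormat>
</audioFormatExtended>
"""

def object_positions(adm_xml):
    """Return (name, azimuth, elevation) for each channel's first block."""
    root = ET.fromstring(adm_xml)
    out = []
    for chan in root.iter("audioChannelFormat"):
        block = chan.find("audioBlockFormat")
        if block is None:
            continue
        pos = {p.get("coordinate"): float(p.text)
               for p in block.findall("position")}
        out.append((chan.get("audioChannelFormatName"),
                    pos.get("azimuth"), pos.get("elevation")))
    return out

print(object_positions(ADM_FRAGMENT))   # [('Narrator', -30.0, 0.0)]
```

Positions like these are what a binaural renderer consumes to place each sound object around the listener's head.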

'We Wait'


Trial/demo productions across a range of genres

We have commissioned the production of various pieces of content that explore different aspects of what 360/VR might offer, some of them as part of a themed season. Examples include:

  • Tests with news coverage, including a piece that used 360-degree video footage beside one of the main memorials to those who lost their lives, to give audiences a better impression of the atmosphere at the Place de la République.
  • 360 video exploring approaches to shooting grammar.
  • An audio-only interactive binaural experience.
  • A CGI, audio-led experience produced in collaboration with an external partner, bringing interactive elements into a narrative-led piece and building heavily on our work on 3D audio.
  • 'We Wait', a fictional depiction of migrants making the perilous journey from Turkey to Greece on smugglers' boats. It is based on accounts gathered by BBC News from migrants, brought to life as a CGI experience using animation techniques. The aim was to explore how VR has the potential to give audiences a greater understanding of a topic they otherwise wouldn't be able to experience.

More information and links to other experimental commissions are given below. We have also provided advice to 360/VR productions led by other parts of the BBC.

Scene from a 360 video research shoot

User experience research

We have carried out user experience research using some of this content, plus other specially-shot material, to understand its impact on viewers, as well as gathering feedback via an online rating system. Examples include:

  • looking at what controls viewers' attention, by recording head movement whilst viewing 360 video
  • the effects of different production choices
  • different approaches to placing overlays and subtitles
  • how the choice of viewing device affects the experience
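Head-movement recordings of the kind used in this research can be summarised very simply, for example by binning yaw samples to see how often viewers looked away from the initial framing. A hypothetical sketch (the bin width and sample data are made up; this is not the methodology used in the studies):

```python
from collections import Counter

def yaw_histogram(yaw_samples_deg, bin_width=30):
    """Bucket recorded head-yaw samples (degrees, 0 = the direction the
    viewer initially faces) into coarse bins, summarising where viewers
    looked. Illustrative only."""
    hist = Counter()
    for yaw in yaw_samples_deg:
        # Normalise to [-180, 180) so 'behind the viewer' wraps consistently.
        yaw = (yaw + 180.0) % 360.0 - 180.0
        bin_start = int(yaw // bin_width) * bin_width
        hist[bin_start] += 1
    return dict(hist)

# Six samples: most near the initial framing, a few well off to the sides.
print(yaw_histogram([2, -5, 10, 95, 170, -170]))
# {0: 2, -30: 1, 90: 1, 150: 1, -180: 1}
```

A heatmap over the equirectangular frame is the 2D extension of the same idea, adding pitch as a second binning axis.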

Outlook

There is still much to learn about how to produce good 360/VR content in a broadcast-related context: for example, how to make best use of existing assets, and of opportunities for additional location shooting that can be done relatively cheaply as part of existing TV shoots. We are learning more about what experiences work best in this new medium and how to create them, but we are a long way from a well-established commissioning process or production grammar. Delivery methods and consumer hardware are still evolving rapidly, and more work is needed to gain a solid understanding of how to create a good user experience, and even of what the long-term audience benefits are. Watch this space for further developments!


Project Team

  • Graham Thomas (MA PhD CEng FIET)

    Head of Applied Research, Production
  • Maxine Glancy (BA, MA, MPhil, BSc)

    Senior Research Scientist
  • Becky Gregory-Clarke

    Research Technologist
  • Alia Sheikh

    Senior Development Producer
  • Chris Pike (MEng PhD)

    Lead R&D Engineer - Audio
  • Andy Brown (BA, MSc, PhD)

    Project R&D Engineer
  • Simon Lumb

    Senior Product Manager
  • David Johnston

    Senior Product Manager, VR/AR
  • Immersive and Interactive Content section

    The Immersive and Interactive Content (IIC) section is a group of around 25 researchers investigating ways of capturing and creating new kinds of audio-visual content, with a particular focus on immersion and interactivity.
