We spoke to the creators behind a platform designed to bring augmented, virtual and mixed reality to life – a tool being used in a climate where live video now takes prominence.

The houselights go down, your heart is racing, you’ve got goosebumps, everyone in earshot is screaming, and the performance is about to start… The difference, however, is that you’re not in a venue; you’re in the comfort of your own home.

As we all know, the coronavirus pandemic brought live performances to a complete standstill in the first half of the year. As a result, live video has come to the forefront in many industries, whether for live events or university lectures. This raises the question: how can a remote experience be just as immersive as attending a concert or lecture in person? Extended reality may be the answer.

Extended reality (XR) is an umbrella term in live production that combines augmented (AR), virtual (VR) and mixed reality (MR) elements to extend the reality we experience by blending the virtual and ‘real’ worlds. According to Visual Capitalist, XR is expected to grow to a market size of more than $209bn (£160bn) by 2022 and is already being used across many industries such as corporate events, education, broadcast, live music, and e-sports, allowing brands, artists and organisations alike to connect with their audiences remotely.

The technology is also poised to take virtual productions to the next level. The company disguise has developed an extended reality (xR) platform that allows both creative and technical professionals to imagine, create and deliver spectacular live visual experiences. “It is a hardware and software solution that allows us to map video onto all sorts of creative surfaces and respond to the environment in various ways,” says Peter Kirkup, global technical solutions manager at disguise.

So how does it work? xR’s virtual set extension places presenters in environments larger than the physical spaces available, creating more compelling content that increases audience engagement. This complete immersion allows for interaction with computer graphics (CG) elements and real lighting, and supports reflective and refractive props. The virtual environment combines camera tracking with real-time content that is visible not only on screen but live on set and in camera. This process gives directors and designers more control and faster calibration workflows.

How disguise's xR technology works

Image credit: disguise
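The core idea behind virtual set extension can be illustrated with a toy sketch. This is not disguise’s implementation – the names and the flat-wall geometry are assumptions for illustration only – but it shows how a tracked camera pose determines which virtual content falls on the physical LED wall (and is rendered in-camera) versus beyond its edges (and is composited as a set extension):

```python
# Toy illustration of virtual set extension (assumed names, not
# disguise's API). The LED wall lies in the plane y = 0; the camera
# sits in front of it, and virtual scenery sits behind it.

from dataclasses import dataclass

@dataclass
class CameraPose:
    x: float
    y: float
    z: float  # tracked camera position in metres

WALL_HALF_WIDTH = 4.0  # physical LED wall spans x in [-4, 4]

def classify_point(camera: CameraPose, px: float, py: float) -> str:
    """Project a virtual point onto the wall plane along the camera ray
    and decide where it must be drawn."""
    if camera.y == py:
        return "set_extension"        # ray never crosses the wall plane
    t = camera.y / (camera.y - py)    # ray parameter at the wall plane
    hit_x = camera.x + t * (px - camera.x)
    if abs(hit_x) <= WALL_HALF_WIDTH:
        return "led_wall"             # rendered in-camera on the screen
    return "set_extension"            # composited beyond the wall's edge
```

As the camera moves, the same virtual point can drift from the screen onto the composited extension, which is why the two must stay pixel-aligned.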

Its creators say the xR workflow can also quickly and accurately align the virtual worlds, bringing together the content system, camera tracking system and LED screen with pixel-accurate precision. This method, known as spatial calibration, takes place on set and can be completed in under 30 minutes. “The system allows us to put structured light patterns onto the LED screen, and then perceive those through the camera,” Kirkup explains. “This also helps us understand where the camera is in reality against what the tracking system is telling us, and calibrate the relationship between the two.”
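The structured-light principle Kirkup describes can be sketched in a few lines. This is a toy illustration under assumed names, not disguise’s code: flashing a short sequence of binary stripe patterns on the LED wall lets each camera pixel decode exactly which screen column it is looking at, which is what makes pixel-accurate alignment possible:

```python
# Toy structured-light sketch (assumed names, not disguise's code):
# log2(W) binary stripe patterns uniquely identify each LED column.

W = 16  # LED wall width in pixels (toy size, power of two)

def patterns(width):
    """One black/white stripe pattern per bit of the column index."""
    bits = width.bit_length() - 1
    return [[(col >> b) & 1 for col in range(width)] for b in range(bits)]

def decode(observed_bits):
    """The on/off values one camera pixel sees across the pattern
    sequence decode back to the LED column it is looking at."""
    return sum(bit << b for b, bit in enumerate(observed_bits))

# A camera pixel watching LED column 11 observes these bits:
seen = [p[11] for p in patterns(W)]
assert decode(seen) == 11
```

Real systems typically use Gray codes and both screen axes, but the principle – a handful of patterns encoding every pixel’s identity – is the same.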

As part of the xR workflow, disguise handles the blending of real and virtual worlds through a colour calibration process embedded in the system, enabling the different ‘worlds’ to appear as one seamless environment. The system is also render-engine agnostic, allowing creatives to select their preferred content engine, such as Notch, Unreal Engine or Unity, to deliver high-quality visuals for their productions. Furthermore, disguise allows users to synchronise multiple render engines from a single timeline, with latency compensation built into the workflow to keep delays to a minimum.
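Latency compensation of this kind can also be illustrated with a toy sketch (assumed names, not disguise’s API): if the render-and-display pipeline takes a known amount of time, the tracked camera pose is extrapolated forward so that each frame matches where the camera will be when it finally appears on the LED wall:

```python
# Toy sketch of latency compensation (assumed names, not disguise's
# API): extrapolate the tracked camera pose forward by the known
# pipeline delay so rendered frames land "on time".

def predict_pose(position, velocity, latency_s):
    """Linearly extrapolate a tracked camera position forward by the
    pipeline latency (in seconds)."""
    return tuple(p + v * latency_s for p, v in zip(position, velocity))

# Camera panning right at 0.5 m/s with three frames of latency at 60 fps:
future = predict_pose((1.0, 2.0, 1.6), (0.5, 0.0, 0.0), 3 / 60)
```

Without this step, fast camera moves would reveal the virtual world lagging fractionally behind the physical set.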

According to the company, artists are one of many groups who can use xR to perform live, using content engines to create and render limitless possibilities for the video narrative in real-time. This transports viewers to a world beyond the physical LED walls, as recently demonstrated by American singer Katy Perry. Here, tech company XR Studios leveraged disguise’s technology to bring what they described as the “future of broadcast” to primetime American TV, with the Grammy nominee performing her latest single ‘Daisies’ on the season finale of American Idol.

“The seamless extension of the real-world LED screens to the virtual world environments could only be done by disguise’s xR camera registration workflow, allowing switching between camera perspectives and the LED content,” explains Scott Millar, a disguise xR workflow specialist who provided creative technology support on the project. Using disguise’s xR spatial mapping, the teams working on the project were able to accurately keyframe the positions of the real and virtual worlds into one coherent space where choreographed action could take place.

Here is an annotation of the xR set up and components demonstrated during Katy Perry’s American Idol performance. It’s important to note that the AR elements and set extension aren’t there in real life and the camera doesn’t ‘see’ these elements.

Image credit: disguise

Katy Perry is not the only world-renowned artist to perform virtually using this platform. The technology enabled partner XR Studios to create an environment for the Black Eyed Peas to promote their new album on primetime TV shows around the world. The Grammy award-winning hip-hop group recorded performances of four new songs, from their eighth studio album ‘Translation’, with XR Studios. These were then distributed as ‘promotion packages’ to a handful of primetime shows worldwide, including Good Morning America in the US and others across Europe.

The platform allowed in-house teams to design and recreate environments, as well as bring in an artist who was unable to make the shoot. The new single ‘Ritmo (Bad Boys For Life)’ features Colombian reggaeton singer J Balvin, who could not be present in person. To overcome this, disguise xR allowed Silent Partners Studios, who designed the content for the videos, to create a 3D digital version of the singer, who virtually performs his verse amid the original designs of the song’s music video.

“Will.i.am and the band have always pushed the boundaries of technology as it crosses over into music production, promotion and consumption, so a disguise xR workflow for their performances really was a perfect fit for them as a group,” explains JT Rooney of Silent Partners, who was closely involved in creating the content with Will.i.am and the band’s management. The band also performed one of their best-known songs, ‘Where Is The Love?’, as part of the promotion package. The performance began on a round platform, with a standout AR element, the red question-mark logo from the song’s original music video, floating above the band, while the scenery around them slowly evolved. The background displayed the names of recent victims of police brutality: a poignant statement in light of recent global events.

Other performances by the band saw the xR platform ‘extending the set’, with two dancers on set reacting to virtual dancers in the scene behind them. The teams also used mirrors in the performance; combining the mirrors with the LED walls in the virtual space allowed natural reflections of content across the LED, a technique where green screen falls short. “Green-screen technology is a very isolating and difficult environment for presenters and artists to work in,” Kirkup says. “You don’t have a genuine reaction to the environment around you as all you can see is green. But with this extended reality workflow, we overcome that.”

The xR technology was first used prior to the pandemic, in a broadcast that delivered a visual spectacle for viewers of the HP OMEN Challenge e-sports tournament in September 2019. There, disguise gx 2c and gx 1 media servers were used alongside the xR workflow to power the real-time generative Notch content that accompanied the gameplay. The company’s media servers also help power audio-visual company White Light’s ‘SmartStage’ system, which has been used to create an immersive virtual classroom for an online Master of Business Administration programme at Michigan University in the US.

The company believes xR technology will change the face of broadcast delivery, “proving to be a vital lifeline in taking virtual productions to the next level”. Tom Rockhill, chief sales executive at disguise, says: “xR enables creative and technical professionals to tell stories in new and innovative ways for the world’s leading brands and artists and produces opportunities for collaboration, combining teams and technologies to drive a single creative vision.”

Source: https://eandt.theiet.org/content/articles/2020/09/a-virtual-spectacle-bringing-video-production-to-life/