Image Courtesy of Ncam

Meet the company transforming real-time content creation

Virtual Production (VP), as a term, is in vogue: it’s being bandied around with growing frequency in the media and entertainment industries. And for good reason. VP has long held promise as the way to unify the disparate elements of the production pipeline, creating a more collaborative, cohesive and efficient process.

Yet many remain unsure what VP actually means in practice, and this may, in part, explain why adoption has been slow.

Foundry Trends spoke to Nic Hatch, founder and CEO of technology company Ncam, which works to transform the way visual content is created in real-time, to find out how VP has been applied, what it aims to solve, and where its potential lies.

“At present when you look through the lens on set, you see the action taking place, but it’s all happening in front of a blue screen. Ncam was born out of the goal to enable a VFX preview through the lens in real-time,” he said.

Ncam at work

In practice this means that, when directors or cinematographers look through the electronic viewfinder (EVF), they can see the action playing out against a backdrop that resembles the environment in which the final scene will take place.

This on-set previsualization is powered by Ncam’s markerless camera tracking, a solution that enables CG objects to be placed onto live-action footage with the correct spatial location, scale, orientation and motion.

This isn’t just a far-out ambition for the future; it has already been used in major productions. A standout example is Disney’s Solo: A Star Wars Story. Most notable in that production was a major stunt sequence featuring a heist atop the carriages of a speeding train against a mountainous backdrop, a sequence lasting more than ten minutes.

The Ncam system was used to provide a live preview of the CG background showing the mountains going past the train, which enabled cast, camera operators and editors to understand what they were shooting for each of the complicated beats of the scene. 

Hatch wants to maximize this capability even further.

Ncam at work

“Eventually we’d like to reach a point where we’ll help deliver real-time VFX during production, which will enable filmmakers to perhaps see around 80 per cent of the ultimate frame they are looking to create - compared to the 10 per cent they would otherwise see,” he said. 

The effect will be to dramatically improve production efficiency and enable new levels of collaboration and creative control on set.

According to Hatch, “not only will this save an incredible amount of time in post, it will enable all stakeholders - from directors and producers to actors and VFX crew - to make better informed decisions on set.”

Meanwhile, Ncam has worked to tackle other layers of complexity brought by the use of augmented graphics. Its ‘Real Light’ solution enables CG to respond to real-world lighting conditions in real-time. This will go a long way towards Hatch’s aim of giving those on set the most comprehensive representation possible of the final scene in real-time.

Ncam working on the set

Of course, much of the technology behind VP is still in its infancy, and while Ncam’s solutions already offer real-time previsualization, Hatch wants to go much further.

“There’s a lot more we want to achieve to make the process more efficient,” he added. “We eventually want to eliminate green and blue screens altogether. So many hours are spent trying to fix the issues they cause in post-production.”