What's new in virtual production?

Although technological advancements bring innovative ways of solving problems and aim to support creativity, they can also be met with skepticism, and adapting to them can become a real challenge. That is certainly true of virtual production (VP). It’s new, it’s exciting and it’s ever-evolving. But should we leave the conventional ways behind entirely? Should we dive in headfirst and see if it’s worth it? Or should we play it safe and wait until there are more tangible results?

There are still a lot of unknowns, but it’s undeniable that virtual production tools, and the other technologies that bring value to this space, are becoming more commonplace. With that in mind, let’s explore some of these technologies and the latest discussions on the topic.

For each problem, there’s at least one solution

At SIGGRAPH, our Foundry team was at the forefront of the virtual production sessions and able to take their pulse. One recurring theme was the need to educate industry professionals on VP. Some filmmakers are still wary of LED walls: whether they should be used at all and, if so, how and where. Their main concerns are a lack of control on set, not knowing how to get the most out of the available tools, and wasting time reinventing the wheel on every production.

Foundry Team at SIGGRAPH 2022

The steady expansion of virtual production raises the question of sustainability: how sustainable would a shift towards these technologies be? The research team at Filmakademie Baden-Wuerttemberg (Germany), led by Prof. Volker Helzle, has started exploring the sustainability gains VP could bring to VFX production. At SIGGRAPH, their presentation showed an energy budget comparison between two student projects, Sprout and Awakening. While the first used a traditional offline pipeline with green screen, the latter was a virtual production shoot that used an LED wall for in-camera visual effects (ICVFX).

So what were their findings? The virtual production shoot, at a reasonably similar scale, required significantly less energy, because offline rendering consumes more energy than rendering in real time on the wall. They also highlighted that virtual production can reduce travel costs and CO2 emissions. Overall, it is a more sustainable solution than traditional offline shooting, but it’s crucial to keep in mind that not every shoot is suitable for virtual production or for using an LED wall.

An equally relevant and notoriously difficult problem in virtual production is the potential mismatch between the color displayed on the LED wall and the color you get in camera. This happens either because of the LED panels’ limitations or because additional light sources on set affect the wall. Carol Payne and Francesco Luigi Giardiello from Netflix jumped in with a solution: a calibration toolset called OpenVPCal.

OpenVPCal displays calibration patches on the wall, which are then captured in camera. From those captures, an OpenColorIO (OCIO) transform is produced, making the calibration much easier for productions to track, transport, and reproduce. That OCIO transform can then be applied to the pixels going to the LED wall, so that what you see in camera matches the color you actually sent to the wall. The huge amount of interest in the talk shows just how important reliable color workflows are, and how complex this issue actually is. The best part? OpenVPCal is an open source framework!
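
To make the idea concrete, here is a minimal sketch, not OpenVPCal itself, of how such a calibration transform might be applied using OpenColorIO’s Python bindings. The config filename and color space names are hypothetical placeholders.

```python
# A minimal sketch (not OpenVPCal itself): apply a calibration transform
# with OpenColorIO's Python bindings. The config path and color space
# names below are hypothetical placeholders.
import PyOpenColorIO as OCIO

# Load the OCIO config produced by the calibration step (assumed path).
config = OCIO.Config.CreateFromFile("vp_calibration.ocio")

# Build a processor from the content color space to the calibrated
# LED-wall space; both names are assumptions defined by the config.
processor = config.getProcessor("scene_linear", "led_wall_calibrated")
cpu = processor.getDefaultCPUProcessor()

# Apply the transform to an RGB buffer (a flat list of floats, three
# values per pixel) before the frame is sent to the wall.
pixels = [0.18, 0.18, 0.18]  # one mid-gray pixel, for illustration
calibrated = cpu.applyRGB(pixels)
print(calibrated)
```

In a real pipeline the same transform would be baked into the image chain feeding the wall, so the calibration travels with the show’s OCIO config rather than living in any one tool.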

Foundry Team on Comandante at SIGGRAPH 2022

The longest talk at SIGGRAPH’s Virtual Production session was given by our very own team, on the Near Real-Time (NRT) workflow and its advantages. Led by Dan Ring and Dan Caffrey, with support from Peter Canning from High-Res and VFX designer Kevin Tod Haug, the talk presented a new machine learning-assisted VP workflow for Comandante, an Italian film that follows the true story of a submarine commander in World War II. Before Foundry were brought onto the project by Peter, Kevin had been conducting research with High-Res to solve one of the production’s hardest challenges. Given the film’s evidently aquatic theme, the filmmakers wanted to use a lot of water near an LED wall, which meant using waterproof LED panels. These naturally have a higher pixel pitch, so the images they display are of lower quality. With NRT brought in, the quality of the shot is instantly improved, the quality of light from the panels is maintained, and in-camera VFX and post-production are seamlessly connected, as VFX work can start on set. The workflow opens up a plethora of possibilities and involves new tools such as CopyCat and UnrealReader, as well as some unreleased tools and lens calibration workflows. For a more in-depth analysis, you can watch the session on-demand here.
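
As a rough illustration of where the machine learning step slots in, here is a minimal sketch assuming Nuke’s Python API: a model trained with CopyCat is applied to the low-quality plate via Nuke’s Inference node. The file paths are hypothetical, and the `modelFile` knob name is an assumption about the Inference node.

```python
# A hedged sketch of the inference side of an NRT-style workflow in Nuke:
# run a model trained with CopyCat (a .cat file) over the low-quality
# plate captured off the waterproof LED wall. Paths are hypothetical and
# the 'modelFile' knob name is assumed.
import nuke  # only available inside Nuke's Python environment

# Read the low-quality in-camera plate (hypothetical path).
read = nuke.nodes.Read(file="plates/lowres_wall.####.exr", first=1, last=100)

# The Inference node applies the trained CopyCat model.
inference = nuke.nodes.Inference()
inference["modelFile"].setValue("models/wall_upres.cat")  # assumed knob name
inference.setInput(0, read)

# Write out enhanced frames so post-production can begin immediately.
write = nuke.nodes.Write(file="renders/wall_nrt.####.exr")
write.setInput(0, inference)

# Render the frame range.
nuke.execute(write, 1, 100)
```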

Less talking, more showing

Exhibitor-led rooms at SIGGRAPH were a great way of showing how new technologies in virtual production actually work, and our Foundry team made sure not to miss those either.

Industrial Light & Magic (ILM) brought StageCraft to the table—or better said, to the screens. This is their very own production solution which they have been using for years, and most famously, on The Mandalorian. With StageCraft, filmmakers are able to scout and design before the shoot, and then reproduce that vision in-camera during the principal photography stage. When combined with LEDs or projection methodologies, StageCraft can help filmmakers save time while shooting and reduce VFX shot costs, an immense advantage for current tight schedules in post-production. But, as we mentioned earlier, virtual production is not a solution for everyone. ILM also highlighted that an LED volume is only suitable for approximately 50% of productions and shared that they sometimes actually have to convince people not to use it. 

Lux Machina, a studio that offers consultation and on-the-ground production support and specializes in virtual production, display technology, and in-camera VFX, brought their extensive infrastructure knowledge. Phil Galler gave a talk on some of their network setups and how they use a high-bandwidth, centralized build-and-storage server to reduce points of failure. An impressive example was their work in a live broadcast environment at League of Legends Worlds 2020 in Shanghai, where they also provided LED augmented reality (AR) throughout the competition, with camera tracking and multi-camera workflows.

This was a very complex setup: it was on air for 240 hours and had approximately 60 million viewers at its peak. A hiccup simply couldn’t happen, not because it wouldn’t look good, but because, in that particular scenario, it would affect the competition and its results. Phil shared stories that would give anyone sweaty palms, such as switching clusters within two-minute commercial breaks to ensure everything was in place for the next setup. It all goes to show that Lux Machina truly know their stuff when it comes to virtual production, and just how dependable they are in this space.

VP momentum

With advancements like virtual production, there is always a learning curve for both filmmakers and software creators. First, we need to familiarize ourselves not only with its capabilities but also with the potential problems it can bring and the best solutions to them. Then we need to keep in mind that education is key: we should learn not only how best to use these tools but also the right scenarios in which to use them, and accept that this is not a one-size-fits-all technology. Done right, virtual production helps filmmakers reduce VFX shot costs and save precious time, and it acts as a catalyst for software creators like ourselves, pushing us to build solutions for the challenges filmmakers encounter.

Can't get enough of the latest trends in VFX?

Sign up to our Insights newsletter!