Q&A with Important Looking Pirates: the studio pivoting to virtual production

Use of virtual production has soared in recent years, popularised by industry juggernaut Jon Favreau on both feature and episodic productions such as The Lion King and, more recently, The Mandalorian.

Its rising popularity is partly down to advancements in the tech behind it. Evolving real-time rendering capabilities, combined with LED screens to create in-camera VFX, enable greater creative freedom on set and bring post production artists and processes in earlier along the production pipeline. The ultimate aim is to give filmmakers and directors a better view of the final picture, so they can make more informed decisions on set, where time matters most.

One such studio to test the waters of virtual production is Stockholm-based Important Looking Pirates (ILP). Started in 2007 by Niklas Jacobson and Yafei Wu, the studio has worked on Star Wars Episode VIII, The Mandalorian, Westworld and HBO’s Watchmen, with VFX supported by Foundry tools Mari and Nuke for look development and compositing, respectively.

150-strong, ILP holds fast to its roots: creating a great company to work for, made by artists, for artists, where making amazing pictures is as important as having a good time.

As a full-service studio, ILP works from concept design to on-set supervision, CG asset creation and VFX. The team has hands-on experience using a wide range of virtual production techniques, from previs to mocap and in-camera VFX, and fundamental insight into how these are transforming production and post.

Below, we caught up with the studio to discuss its adoption of virtual production workflows, how post production tools Nuke and Mari support the team’s creative vision, and what’s in store for the future of this kind of content creation.

Q: What were the advantages that you, as a studio, saw in using virtual production?

A: Utilizing real-time rendering to substitute for physical builds of props or sets, or instead of offline renders, has obvious advantages for the cadence of the creative process. Real-time rendering allows us to iterate and share more in the same amount of time. That advantage exists before we even get to real-time rendering for final-pixel output, which remains a bit more elusive for our type of work. And beyond those advantages, various forms of real-time rendering seem like a truly inevitable part of the VFX future, so that alone means we’re onboard.

Q: How has virtual production changed your workflow for asset creation and in post?

A: As this is evolving so rapidly, the answer is unfortunately “it depends”. But in a general sense, the better your monitor can look during a mocap shoot or camera tracking session, the more confident you can be of the final result. That improved monitor view depends on the quality of your real-time assets, rigs, textures, and lighting, which means more of that polish has to happen prior to the shoot, so in “pre” rather than “post”.

A VR tech scout doesn’t need final-pixel quality, but the environment needs to be close enough that the team can be confident in their decisions and conclusions. Obviously, assets for an LED screen shoot need to be final-pixel ready in terms of quality while remaining performant enough not to trigger frame drops. All of this demands that we be involved earlier and given greater creative license than on a traditional VFX job, and that change can at times seem more daunting than the technical challenges.

Q: What challenges have you encountered during virtual production? How have you overcome these?

A: The more complex you make something, the more things can break. That extra-pretty monitor view comes at the price of additional complexity, at least for now. A major challenge is also pipeline integration, where highly secured, offline, Linux-based and Python-driven pipelines need to play nicely with Windows-based tools, unless you choose to compile Unreal Engine from source, in which case you’re facing a different set of obstacles.

Getting a real-time renderer so seamlessly integrated that it becomes just another render target in the pipeline, using the same assets, materials, and color science, is the biggest goal for anyone with a traditional VFX pipeline.
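
To give a concrete sense of what sharing color science across render targets can look like, here is a minimal sketch (not ILP’s actual setup) in which one OpenColorIO config drives both offline and real-time output. It assumes the OpenColorIO 2.x Python bindings; the config path and color space names are placeholders.

```python
# Minimal sketch: one shared OCIO config used as the single source of color science
# for every render target. Assumes OpenColorIO 2.x Python bindings; the config path
# and color space names below are placeholders, not a real studio setup.
import PyOpenColorIO as ocio

config = ocio.Config.CreateFromFile("/pipeline/color/studio_config.ocio")

# Transform from the working space to a display space; a real-time engine could
# bake the same transform into its display LUT so both paths match.
processor = config.getProcessor("ACEScg", "sRGB - Display")
cpu = processor.getDefaultCPUProcessor()

linear_pixel = [0.18, 0.18, 0.18]           # mid-grey in the working space
display_pixel = cpu.applyRGB(linear_pixel)  # the value the calibrated viewer would show
print(display_pixel)
```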

Q: Virtual production and high-end VFX typically involve passing large amounts of data across the production pipeline. How do you manage this?

A: Networks and storage are fast enough that real-time work isn’t at any specific disadvantage compared to the traditional pipeline. Real-time engines and the GPU horsepower available are getting to the point where, in many cases, you don’t need to mangle your assets much at all, and being able to use the same data formats everywhere is really key to making this work.

Where you are over budget for texture memory, for example, there are ways to generate proxy instances so that you can still stay current with the latest assets in the project without maintaining a separate real-time lane in your pipeline. And if you’re not using real-time for final pixel, 1 fps is still far better than waiting an hour for a frame on the farm.
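
As a purely illustrative example of that proxy idea, the sketch below generates lower-resolution proxy copies of textures that exceed a real-time memory budget. It uses Pillow for brevity; the paths, naming, and resolution cap are assumptions, not ILP’s pipeline.

```python
# Hypothetical sketch: build lower-resolution proxy textures for a real-time target
# when the full-resolution tiles exceed a per-tile budget. Paths, naming, and the
# 2048px cap are illustrative assumptions only.
from pathlib import Path
from PIL import Image

SOURCE_DIR = Path("/assets/spaceship/textures")    # full-resolution UDIM tiles
PROXY_DIR = Path("/assets/spaceship/textures_rt")  # proxies for the engine
MAX_RES = 2048                                      # per-tile resolution cap

def build_proxies(source_dir: Path, proxy_dir: Path, max_res: int) -> None:
    proxy_dir.mkdir(parents=True, exist_ok=True)
    for tex in sorted(source_dir.glob("*.png")):
        with Image.open(tex) as img:
            scale = max(img.width, img.height) / max_res
            if scale > 1.0:
                # Downscale only the tiles that blow the budget; others pass through.
                new_size = (round(img.width / scale), round(img.height / scale))
                img = img.resize(new_size, Image.LANCZOS)
            img.save(proxy_dir / tex.name)

if __name__ == "__main__":
    build_proxies(SOURCE_DIR, PROXY_DIR, MAX_RES)
```

Keeping the proxies under the same asset names and folder structure is what lets the real-time side stay current without a separate lane in the pipeline.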

Q: How do Foundry products Nuke and Mari fit into your virtual production workflows, and beyond this into your pre- and post-production workflows?

A: We rely on Nuke for all our final compositing in our post production workflow. We also use it whenever we need to add things like transitions or extra tweaks to our previs or animatics.

We use Mari as our main texturing package, from concept to pre-lookdev. We usually work on very complex assets, from creatures to spaceships and huge environments, and Mari's virtual texturing technology is capable of handling these types of assets with dozens of UDIMs at very high resolutions.
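
For readers unfamiliar with UDIMs, the tile numbering that Mari and most other texturing tools rely on follows a simple convention: tiles start at 1001 and count ten per row across UV space. The tiny helper below is purely illustrative:

```python
# Illustrative helper: map a UV coordinate to its UDIM tile number.
# Tiles start at 1001; each U tile adds 1 and each V tile adds 10.
import math

def udim_from_uv(u: float, v: float) -> int:
    u_tile = int(math.floor(u))
    v_tile = int(math.floor(v))
    return 1001 + u_tile + 10 * v_tile

# UV (1.25, 2.5) falls in the second column of the third row: tile 1022.
print(udim_from_uv(1.25, 2.5))  # -> 1022
```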

Mari's shading system allows us to pre-visualize the look of our assets under specific lighting situations, which will later be recreated in the lookdev stage, without leaving the texturing package. Thanks to this, we are able to tweak texture values in real time while looking at the final look of the asset.

Q: Are there any techniques or tips that you would have for others working on projects using virtual production? 

A: It’s still true that virtual production technologies, tools, formats and so on are far less established than their traditional VFX counterparts. This means the best tip we can give anyone in virtual production is to dedicate a chunk of time to staying up to date with the industry, and to be flexible in adapting to and exploring new solutions. This will move very quickly for the foreseeable future, so just enjoy the ride!

Q: What are your thoughts on the VFX industry and what the future holds? With more remote work, how do you think virtual production techniques can cater to this change in workflow, if at all? 

A: We were lucky to be ahead of the curve, with all workstations being remote even before COVID hit, so adapting to working remotely was as seamless as it could get. In a way, virtual production is more difficult to solve, since working with VR tech or virtual cameras using a Wi-Fi-connected iPad doesn’t work as well when your workstation is in a data centre.

As for the future of VFX, multiple people working on a scene or asset at once, all viewing it with real-time path tracing, is a big shift. This is starting to become possible through formats like USD and platforms like Nvidia’s Omniverse, which we’re actively evaluating, at least for real-time work. Better-quality viewports in the DCCs are of course nice, but the shift to truly parallel workflows might be the more revolutionary change once we get there.
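
As a rough sketch of why USD lends itself to that kind of parallel work, each department can contribute its own layer, composed non-destructively over a shared stage, so nobody overwrites anyone else’s file. The snippet assumes the pxr Python bindings and uses hypothetical file names.

```python
# Rough sketch: several departments contribute separate USD layers over one shared
# shot, composed non-destructively. Assumes the pxr Python bindings; all file names
# are hypothetical.
from pxr import Usd, UsdGeom

# The shot stage references the published asset once.
stage = Usd.Stage.CreateNew("shot010.usda")
ship = UsdGeom.Xform.Define(stage, "/World/Spaceship")
ship.GetPrim().GetReferences().AddReference("spaceship_asset.usda")

# Layout and lighting each work in their own sublayer; layer order (not file
# ownership) decides which opinion wins when they touch the same attribute.
root_layer = stage.GetRootLayer()
root_layer.subLayerPaths.append("layout_overrides.usda")
root_layer.subLayerPaths.append("lighting_overrides.usda")

root_layer.Save()
```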

Q: What does the move towards real time mean for the VFX industry, and what role will post production take on as part of this?

A: The boost in iteration cadence cannot be overstated. Being able to see in real time what you’re doing, in anything from simulations to lighting, or in-camera on set, gives everyone involved so much more clarity and confidence about the outcome. But to truly leverage the opportunities afforded by this tech, the VFX vendor needs to be much more integrated in the early stages of development, and this changes the dynamics of the industry. There will probably always be shots, or even whole shows or films, where this tech simply isn’t the answer, so it’s more a great new tool than a universal solution to everything.

Q: Are there any other upcoming developments or tech that you think will have a big impact on the VFX industry, or any other avenues you’d be keen to explore further?

A: Machine learning is already massive, and will only grow. Denoising, inpainting, tracking, face replacements, relighting, style transfers: the list is endless. One day we might get proper light field cameras too, but let’s not hold our breath for that one. The biggest boost right now is the continuation of what has already been going on for some time: the integration of the data formats, color science, tools and pipelines of the VFX world into real-time graphics engines.

Want more on the latest trends like virtual production sent straight to your inbox? Sign up for our Insights newsletter.
