Image courtesy of Thiago Christo

Micro Monsters: mighty next-gen immersive content in stunning 8K

The clue’s in the name with Alchemy Immersive. Blending the latest immersive technologies and content, the award-winning studio concocts magical experiences that transport audiences to entirely new worlds—whether that be the sunken depths of the ocean or the outer reaches of space.

Their latest project, Micro Monsters, may be smaller in scale, but is no less breathtaking in its delivery—especially as it allows Oculus users to experience live-action virtual environments in stunning 8K 3D at 60fps for the very first time.

Part of the Atlantic Productions group, Alchemy Immersive worked alongside Zoo VFX—also part of Atlantic Productions—to bring this five-part immersive series to life for the newly-released Quest 2. Viewers will join UK national treasure Sir David Attenborough to discover the real-life, hidden world of arthropods from an entirely new perspective.

Such an unprecedented project required a novel approach to production. For this, both the Alchemy Immersive and Zoo VFX teams used a combination of specially developed 3D stereoscopic camera rigs, 180° live-action capture, and newly-developed VFX compositing techniques in a pioneering feat of immersive media production.

Happily, Foundry’s compositing tool Nuke played a crucial part in this innovative process. The production pipeline for Micro Monsters was built in Nuke 12.0, with Nuke also serving as the main compositing pipeline for the team’s stereo 180 renders.

Overseeing the project was its director Elliot Graves, Alchemy Immersive’s Creative and Technical Lead. Working closely with David Barker, Head of Zoo VFX, the two worked tirelessly along with their teams to bring this microscopic world to life in glorious 8K resolution.

We caught up with both Elliot and David to explore the accomplishments, challenges, and technicalities of Micro Monsters—and what they think is in store for the future of immersive content.

A bug’s-eye view

Elliot Graves is no stranger to Foundry, having worked closely with our research team in 2019 to build a LiDAR and videogrammetry rig with E2 cameras. The data from this was then used to support the development of volumetric research project VoluMagic.

His experience and expertise meant that Micro Monsters was in good hands. And just as well, because with a credit list featuring just over 100 team members, this was no small project.

“The main objective of the Micro Monsters series was to reveal the micro-world of bugs in a way never experienced before in virtual reality,” Elliot comments. “We wanted to take full advantage of the new Oculus Quest headset and its ability to play 8K video, combining it with genuinely eye-opening stories from the natural world and the latest visual effects pipelines.”

With production taking place over the course of four months, the pressure was on to turn the project around in time for the release of the new Quest 2—no mean feat, but one Elliot and his team were well-prepared to handle.

 “The whole team worked tirelessly to ensure that what we produced was one of the highest-quality live-action VR projects to date, showcasing the huge potential of non-native immersive media within VR production,” he tells us.

Asked what made the project unique, an obvious answer springs to mind for Elliot, born from the uncharted territory that 2020 brought with it.

“Working through a pandemic certainly had its challenges,” he comments. “With a team spread across Thailand, Australia, London and with Oculus based in California, we spent a lot of time on remote calls. Luckily we had our Master Jedi Producer, Vianney Comot on hand constantly, guiding us!”

David Attenborough on set of Micro Monsters

Breaking new ground

Workforce limitations brought on by COVID-19 presented practical challenges that all businesses and studios have had to adapt to. But what project-specific, technical barriers did the team come up against during the production of Micro Monsters?

Crucially, producing high-resolution, non-native hero content was key to the success of the project. “This wasn’t our first 8K project, but it was the first where 8K could actually be seen in the headset,” Elliot tells us.

“In the past, creators of live-action VR production have had to come to terms with the lack of detail in deliverables within VR. With the launch of the Quest 2 headset, 8K video fully saturates the headset. This means our 8K deliverable is visible at 100%, and in turn, our delivery had to be flawless—there was nowhere to hide!”

For Elliot and his team, then, the main technical challenges of Micro Monsters were three-fold: creating an immersive experience using non-native immersive content, building a new VFX pipeline to support this, and rendering at 8K and 60fps.

“To utilise non-native immersive content, we had to think carefully about how each episode in the series would be composed,” Elliot explains. “We opted to start each episode with the same introduction, easing viewers into the world of bugs whilst using original and native 180 content that was either rendered or captured with ZCAM K2 cameras.”

“This provided viewers with a familiar environment, albeit at a much higher quality than previously expected. From here, we leveraged set-extended, high-res 3D macro rectilinear content that was adapted for VR. Creating the VFX pipeline to produce these conversions in a way that would be seamless and as immersive as native content was challenging.”

Fortunately, Nuke was on hand to help the team navigate challenges like these as they headed into the uncharted territory that came with this unique project. As the project’s director, Elliot recalls Nuke’s role throughout the production process.

“This pipeline was built using Nuke 12.0, with our team creating a set of gizmos that could somewhat automate the process, alleviating a shot-by-shot approach,” he tells us.

“Built into this was the capability of FFMPEG encoding for the multiple headset review formats that were required for a proper QA process. Nuke was used across the set extension work, but also acted as our main comp pipeline for our stereo 180 renders.”
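To give a flavour of what that encoding step might look like, here is a minimal sketch of an FFMPEG review encode driven from Python. The file paths, resolution, and codec settings are illustrative assumptions, not the production values used on Micro Monsters.

```python
# Minimal sketch of an FFMPEG headset-review encode driven from Python.
# Paths, resolution, and codec settings are illustrative assumptions only.
import subprocess

def encode_headset_review(exr_pattern, out_mp4, size=5760, fps=60):
    """Encode a rendered EXR sequence into an MP4 for review in the headset."""
    cmd = [
        "ffmpeg", "-y",                      # overwrite any existing output
        "-framerate", str(fps),              # read the sequence at 60fps
        "-i", exr_pattern,                   # e.g. "shot010_comp.%04d.exr"
        "-vf", f"scale={size}:{size}",       # conform to the review resolution
        "-c:v", "libx265",                   # HEVC plays back well on Quest 2
        "-crf", "18",                        # near visually lossless
        "-pix_fmt", "yuv420p",               # widest decoder compatibility
        out_mp4,
    ]
    subprocess.run(cmd, check=True)

encode_headset_review("shot010_comp.%04d.exr", "shot010_review.mp4")
```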

Yet it was not all smooth sailing, as Elliot explains: “Producing these at such a high resolution and framerate was challenging for our infrastructure and team. QAing heavy EXR renders took much longer, and getting a quick turnaround on playouts was harder still.”

“Here, FFMPEG came to the rescue, automating the process somewhat and allowing headset review of all individual shots as they were worked on through Shotgun. The outcome was shots that felt totally immersive, spherical and 3D, but with the visual qualities of the cinema cameras we’ve all come to rely on.”

But what did it take to get to this point? Time, resources, and technical expertise, according to Elliot: “The technical process of taking this evolution in live-action content into VR was the post-production challenge where most of the resources on this project were allocated, as David outlines below. Our main challenges were converting the footage and processing it through various pipelines at 8K 60fps.”

Compositing creatures for 8K: a technical deep-dive

David Barker has over ten years’ experience in documentary and VFX production, working mostly for the BBC’s science department. Having produced the VFX work on Micro Monsters, he’s well-placed to provide insight into the nuances of this technical process.

“Our challenge at Zoo VFX was to re-format rectangular images for a virtual reality headset by mapping them onto a 180°x180° hemisphere that accurately represents the human field of view,” David tells us.

“To put it simply, we had to turn a rectangle into a circle. Further complicating the challenge is that camera motion in immersive VR can rapidly cause nausea that persists for the rest of the day—until the viewer sleeps. This type of stabilisation required new tools to be developed.”

“The conventional technique is to simply project the image onto a hemisphere and magnify it until the edges—which turn into a bow tie shape—aren't visible. But this left the viewer unable to see more than a fraction of the intended subject while the excessive stereoscopic disparity forced their eyes to painfully diverge.”

Diving into how his team used Nuke to overcome this issue, David explains: “Stereoscopic visionary and world-class compositor Tim Baier (The Irishman, The Hobbit, Rise of the Planet of the Apes) developed a nimble and powerful toolset extension for Nuke. We began with stereoscopic image sequences and utilised STMaps to re-project the rectangular images onto a three-dimensional hemisphere. The mapping technique procedurally and dynamically warped the images to increase the field of view without creating a sense of distortion in the viewer.”
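To make the idea concrete, here is a minimal sketch of that re-projection math in Python with NumPy: for every pixel of a 180° lat-long output, it computes which point of the flat rectilinear source to sample, which is exactly the information an STMap encodes (Nuke’s STMap node reads UVs from the map’s red and green channels). The field of view and image layout are illustrative assumptions, not values from Tim Baier’s toolset.

```python
# Toy version of the STMap math: for each pixel of a 180-degree lat-long
# output, find where to sample the flat rectilinear source. The 40-degree
# FOV and square layout are illustrative assumptions.
import numpy as np

def rectilinear_to_180_stmap(size=2048, h_fov_deg=40.0):
    lon = np.linspace(-np.pi / 2, np.pi / 2, size)   # left to right
    lat = np.linspace(-np.pi / 2, np.pi / 2, size)   # bottom to top
    lon, lat = np.meshgrid(lon, lat)

    # Direction vector on the hemisphere for each output pixel
    x = np.sin(lon) * np.cos(lat)
    y = np.sin(lat)
    z = np.cos(lon) * np.cos(lat)                    # forward axis

    # Pinhole projection back into the flat source image
    focal = 0.5 / np.tan(np.radians(h_fov_deg) / 2.0)
    with np.errstate(divide="ignore", invalid="ignore"):
        u = 0.5 + focal * x / z
        v = 0.5 + focal * y / z

    # Directions behind the camera or outside the source frame have no
    # pixels: this is the 'bow tie' region that needs in-painting later.
    valid = (z > 0) & (u >= 0) & (u <= 1) & (v >= 0) & (v <= 1)
    return np.stack([np.where(valid, u, 0.0),        # red   = U
                     np.where(valid, v, 0.0),        # green = V
                     valid.astype(float)], axis=-1)  # blue  = coverage mask
```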

Features and techniques such as centre magnification, batch processing, X/Y warp to reposition the image centre, and Nuke 12’s new GPU-accelerated Inpaint tool to fix the ‘bow tie’ shape mentioned previously were all deployed by David and his team. Together, they allowed live-action content to be converted to a VR medium at the quality needed.
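A hedged sketch of how such a conversion chain might be wired up with Nuke’s Python API is below. The file paths and knob values are placeholders, and the class name of the GPU Inpaint node is an assumption that varies by Nuke version, so treat this as the shape of the graph rather than the team’s actual gizmo.

```python
# Hedged sketch of the conversion graph built with Nuke's Python API (run
# inside Nuke). File paths are placeholders; the Inpaint node's class name
# ("Inpaint2" here) is an assumption that varies by Nuke version.
import nuke

plate = nuke.nodes.Read(file="shot010_plate.%04d.exr", first=1001, last=1240)
stmap = nuke.nodes.Read(file="rect_to_180_stmap.exr")  # map from the step above

# STMap warps the plate onto the hemisphere using the map's red/green UVs
warp = nuke.nodes.STMap(inputs=[plate, stmap], uv="rgb")

# Fill the empty 'bow tie' edges with the GPU-accelerated Inpaint tool
fill = nuke.createNode("Inpaint2")
fill.setInput(0, warp)

out = nuke.nodes.Write(inputs=[fill], file="shot010_180.%04d.exr")
nuke.execute(out, 1001, 1240)
```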

Compositing Micro Monsters in Nuke

Crucially, however, stabilisation is required when translating live-action content to virtual reality. “Camera moves look nice on screen but cause viewer discomfort and nausea in VR,” David tells us. “Some of these were particularly problematic as they were the result of more than just a pan and tilt of the camera—the camera was actually craned and dollied at the same time. Additional challenges were presented as stereoscopic alignment of many shots in the 3D master for television had been dynamically animated.”

David and his team certainly had their work cut out when tackling this issue. “A great deal of complex image tracking was required—especially when mirror edge extensions were used—for which a new high-speed, high-accuracy tracking tool that leveraged precalculated motion vectors was developed,” he continues.

“This, in combination with the development of a warp stabilisation tool, could anchor our most problematic images in place on the hemisphere, while making it simple to extend the image using real details actually photographed by the camera at other points in time.”

“It was critical this could operate independently on the left and right images, to undo techniques that work fine for television but induce nausea in VR.”
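As a rough illustration of the stabilisation principle, the NumPy sketch below counter-rotates the sampling directions from the earlier re-projection example by a tracked pan/tilt, so the projected image stays anchored on the hemisphere. The rotation order and sign conventions are assumptions, and real per-frame angles would come from a tracker; run per eye, this also lets the left and right images be handled independently.

```python
# Rough illustration of the stabilisation principle: counter-rotate each
# frame's sampling directions by the tracked camera rotation so the image
# stays anchored on the hemisphere. Rotation order and sign conventions are
# assumptions; real pan/tilt values would come from a tracker.
import numpy as np

def rotation_yx(pan_rad, tilt_rad):
    """Rotation for a camera pan (about Y) followed by a tilt (about X)."""
    cy, sy = np.cos(pan_rad), np.sin(pan_rad)
    cx, sx = np.cos(tilt_rad), np.sin(tilt_rad)
    ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    return ry @ rx

def stabilise_directions(dirs, pan_rad, tilt_rad):
    """Apply the inverse camera rotation to (H, W, 3) direction vectors,
    removing the pan/tilt from the projected image. Calling this separately
    per eye lets the left and right images be handled independently."""
    r_inv = rotation_yx(pan_rad, tilt_rad).T  # inverse of a rotation matrix
    return dirs @ r_inv.T                     # rotate every direction vector
```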

The results speak for themselves. Micro Monsters is a visual marvel—one that induces amazement, not motion sickness. Asked what makes Nuke a particularly good fit for projects like these, David is quick to lend his thoughts.

“Node-based scripts make working on complex compositions much easier,” he tells us.

“What’s more, the ease of customisation was essential in creating our toolset extension. And Nuke’s compatibility with Python and CSV files was vital when working at such a high shot volume.”

“[Nuke’s] integration with Deadline also allows for rapid, mass-deployment of render workers.”
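For a sense of how Python and CSV files combine at high shot volume, here is a minimal sketch of a CSV shot list driving per-shot Nuke renders from the command line. The column names and script naming convention are hypothetical, and on a farm this loop would hand jobs to Deadline rather than render locally.

```python
# Minimal sketch of a CSV shot list driving per-shot Nuke renders. Column
# names and the per-shot script naming are hypothetical; on the farm this
# loop would submit jobs to Deadline instead of rendering locally.
import csv
import subprocess

with open("shot_list.csv", newline="") as f:
    for row in csv.DictReader(f):                   # columns: shot, first, last
        subprocess.run([
            "nuke",
            "-F", f"{row['first']}-{row['last']}",  # frame range to render
            "-x", f"{row['shot']}_conversion.nk",   # execute script headless
        ], check=True)
```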

Working in Premiere Pro - Micro Monsters

What does the future of immersive content look like?

With the Oculus Quest 2 just launched, Micro Monsters is now available on Oculus TV, a central hub where viewers can discover and watch media on both Oculus Quest and Oculus Go. It’s a prospect Elliot Graves couldn’t be happier with. 

“I’m super proud of the team and our ability to produce a VR experience that genuinely lives up to users’ expectations,” he tells us. “I feel with Micro Monsters, we created a series that you can watch multiple times without noticing you’re wearing a VR headset. VR is simply the delivery, with the content leading the experience.”

“The quality of the experience means you aren’t pixel hunting, and David Attenborough’s soothing voice leads you through some of nature’s most curious stories. I really hope our approach inspires other creators to think about non-native immersive media and to apply storytelling techniques first.”

Speaking of the Quest 2 and the myriad opportunities for next-generation VR content that its launch brings, Elliot shared his thoughts on the exciting trends he’s seeing around VR technology and immersive experiences such as Micro Monsters.

“With the launch of Quest 2, we’ve seen users being able to enjoy high-resolution immersive media for the first time,” he comments. “People can be truly transported to a new location and access worlds otherwise inaccessible, without having to rely on CGI.”

“The future of this will inevitably involve making the experience have six degrees of freedom. This means users would be able to view content in VR which has a volumetric element to it, moving around the video and seeing it from unique perspectives.”

Yet what does the future hold for volumetric capture and the potential it unlocks for immersive experiences?

“Producing this type of immersive content is currently challenging, especially at any impactful quality,” Elliot continues. “However, there are some impressive teams around the world working hard to make it happen. Thankfully, owing to real-time production, the workflows used in VR production are starting to mature, as is demand for volumetric capture and the toolsets offered by the likes of VoluMagic.”

“Whilst computational power is still a blocker to mass adoption by creators, I’ve got little doubt that volumetric capture and performance will be the next evolution in immersive media.”

Discover Nuke for immersive content

Get a free 30-day trial