Creating magical animated tales with TAT Productions and NukeX
Tackling the anti-heroes of compositing with NukeX and HieroPlayer.
Let’s start with an imagination exercise: picture a superhero. What do they look like? What is their mission? How powerful are they? Chances are you’re thinking of a well-built character with superpowers, fighting evil to restore peace in their community, alongside a wise sidekick who always supports them.
But superheroes aren’t always what we imagine them to be. The smallest ones are often the mightiest. That’s also the case in Epic Tails (originally Pattie et la colère de Poséidon), an animated family adventure film that tells the story of a super smart mouse called Pattie and Sam, her ginger feline friend. Together they try to save Yolkos, a port city of ancient Greece, from Poseidon’s wrath, and embark on a journey where they meet gods and anthropomorphized creatures from Greek mythology.
TAT Productions took on the challenge and showed their skills by telling a story like no other. Based in Toulouse, France, the studio was founded 23 years ago by David Alaux, Eric and Jean-François Tosti. Their most notable project, The Jungle Bunch, won an International Emmy Award, is broadcast in more than 200 countries and has been translated into 50 languages. Over the past five years, TAT Productions has grown into one of Europe’s most prolific animation studios, aiming—and succeeding—to release a film every two years: The Jungle Bunch (2017), Terra Willy (2019), and Pil’s Adventures (2021).
When working on Epic Tails, TAT Productions didn’t play cat and mouse with the challenges along their pipeline. They combined the power of NukeX, our advanced industry-standard compositing toolkit, with HieroPlayer’s ability to review shots in context.
We sat down with Technical Director Romain Teyssonneyre to tell us more about their creative process and share insights into the studio’s workflows and solutions. Prepare to be amazed by one of our most visual case studies yet!
How it all started
“The studio moved from After Effects to NukeX in 2020 for Pil’s Adventures. We gave ourselves six months of R&D to build a fresh new workflow with the help of two technical artists, Jérôme Desplas and Kevin Kergoat. After that, we could enter production more serenely,” says Romain.
Since then, the pipeline has been designed and built by Compositing Pipeline Supervisor Colin Wibaux with the help of CG Developer Romain Truchon. Currently, Justine Thibaut is the studio’s Compositing Supervisor—“she has an artistic view on all the projects and also helps to drive the pipeline from the production side,” Romain tells us.
Before they moved to Nuke, the studio couldn’t work as flexibly with their CG renders: cutouts, for example, were baked into the renders rather than handled with deep data, which left them with limited options for adjusting shots post-render. And so they chose NukeX.
Romain explains the motivation behind moving to Foundry’s advanced compositing toolset, NukeX: “We wanted the possibility of working Back To Beauty from V-Ray—more specifically, to work on additive light AOVs instead of working on the beauty directly. We also wanted to work with deep compositing and avoid rendering shots multiple times just because of the cutout mask. Additionally, we wanted to make use of cryptomattes, the 3D Viewport and DeepToPoints.”
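To make the Back To Beauty idea concrete: the beauty pass is reconstructed by summing the per-light AOVs, so each light’s contribution can be regraded independently without a re-render. The sketch below is purely illustrative (the AOV names, pixel values and per-light gain are our own example, not studio code):

```python
# Minimal sketch of "Back To Beauty": the beauty pixel is the sum of
# the additive light-group AOVs, so each light can be graded on its
# own before the sum. AOV names and values here are illustrative.

def back_to_beauty(light_aovs, gains):
    """Rebuild a beauty pixel from additive light-group AOVs.

    light_aovs: dict mapping AOV name -> (r, g, b) contribution
    gains:      dict mapping AOV name -> per-light multiplier
    """
    beauty = [0.0, 0.0, 0.0]
    for name, rgb in light_aovs.items():
        gain = gains.get(name, 1.0)  # untouched lights keep their render
        for i in range(3):
            beauty[i] += rgb[i] * gain
    return tuple(beauty)

# Example: brighten the key light by 20% without re-rendering.
aovs = {
    "light_key":  (0.50, 0.40, 0.30),
    "light_fill": (0.10, 0.10, 0.12),
    "light_rim":  (0.05, 0.05, 0.08),
}
graded = back_to_beauty(aovs, {"light_key": 1.2})
```

Because the decomposition is additive, summing the unmodified AOVs reproduces the original beauty exactly, which is what makes post-render lighting tweaks safe.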
Exploring the 3D pipeline
First, let’s have a look at TAT Productions’ 3D pipeline before going deeper into their compositing pipeline. The studio used 3ds Max and V-Ray for lighting and rendering.
The VFX team developed an internal tool to separate every element in the image on a dedicated layer that they named La PassTech, giving them full creative control in the comp.
They used the tool to separate each main character, groups of secondary characters or crowds, the foreground, the background, fire, smoke, water and volumetric lights.
Each of these layers is rendered in three passes with many different AOVs in a multi-channel EXR.
Beautify it and render!
These multi-channel EXRs can contain up to six light groups to make up the final image, as well as technical AOVs like Albedo, SSS and Velocity.
For occlusion, the studio used a custom override shader to manage different distance cases like environments, characters, fur or opacity maps. The pass above also contains many technical AOVs such as cryptomattes, normals, position reference passes (PRefs), and custom object IDs.
Finally, the team rendered a Deep image that only contains the Z information.
“As we can see, our rendering process is pretty heavy. We have an average render time of six to eight hours for a final image, and this can exceed 15 hours for big shots. We have an internal render farm of 280 nodes and 200 workstations, and everything is managed with Deadline,” Romain tells us.
But what happens when a shot is ready to render? TAT Productions implemented an internal process for precomping that launches a Nuke job which stacks passes just for rendering and AOVs verification. “Here, instead of importing from 3D scenes, we also extract metadata from EXR to export a Nuke Camera that will match perfectly to the renders,” Romain explains.
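One piece of that precomp step worth unpacking is deriving a matching Nuke Camera from the EXR metadata rather than re-importing the 3D scene. The sketch below shows the kind of math involved: recovering a focal length from a horizontal field of view and filmback width. The metadata key names and values are hypothetical—the actual keys depend on the renderer—and inside Nuke the result would be written onto a Camera node:

```python
import math

# Hypothetical sketch of deriving camera values from EXR metadata.
# The metadata keys below are illustrative, not the actual V-Ray keys;
# inside Nuke the result would be applied to a Camera node. Here we
# just compute the focal length (mm) from the horizontal field of
# view and the horizontal aperture (filmback width).

def focal_from_fov(h_fov_deg, h_aperture_mm):
    """Focal length (mm) from horizontal FOV and horizontal aperture."""
    return (h_aperture_mm / 2.0) / math.tan(math.radians(h_fov_deg) / 2.0)

# Metadata as it might be read from a rendered EXR (values illustrative):
meta = {"exr/cameraFov": 39.6, "exr/cameraAperture": 36.0}
focal = focal_from_fov(meta["exr/cameraFov"], meta["exr/cameraAperture"])
```

Because the camera is rebuilt from the render’s own metadata, it matches the rendered pixels exactly—there is no risk of the comp camera drifting from a separately exported scene file.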
Given the complexity of our compositing tool, every studio uses it differently, depending on their own challenges and workflows. TAT Productions gave us an insight into the pipeline they built around NukeX and the tools they developed to improve their workflow. One of those tools, The Lego, was used for shot creation.
“Our first tool was for creating a shot and importing all the image sequences. We had to enter the shot name, then choose between a Key Shot, the first shot, which defined the settings for the rest of the sequence, or a Baby Shot, where all the settings were copied from another shot and only the content was updated.”
On the right of the Node Graph, we can see the imported layers. Here, Lego will duplicate or delete layer templates depending on the shot content, which could range from the number of backgrounds or crowds to character layers or FX. The artist can access every AOV in every layer.
In the middle, we have global controllers for all of the elements of the shots, such as motion blur, fog, depth of field, and light wraps.
The left-hand side area was used for work specific to the shot like digital matte painting (DMP), particles, and skies.
Above, we can see Feng-shui, a tool they created for holdout and order management. For large comps with many layers, this is a great quality-of-life tool, allowing artists to access, inspect and change the order of the layers, all in a simple window.
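The core idea behind layer-order management is simple: layers are composited back to front, and each layer in front implicitly holds out whatever sits behind it. The single-pixel sketch below (our own illustration, not the Feng-shui tool itself) shows why reordering a list is all such a tool needs to expose:

```python
# Conceptual sketch of stacked layer ordering with holdouts: layers
# are comped back to front with a premultiplied "over", so each layer
# in front holds out the layers behind it by its alpha. Values are
# illustrative single-pixel (r, g, b, a) tuples, not studio data.

def over(fg, bg):
    """Premultiplied over: fg occludes bg by its alpha."""
    return tuple(f + b * (1.0 - fg[3]) for f, b in zip(fg, bg))

def flatten(layers_back_to_front):
    out = (0.0, 0.0, 0.0, 0.0)
    for layer in layers_back_to_front:
        out = over(layer, out)
    return out

# Reordering this list is all an order-management tool needs to change:
background = (0.2, 0.3, 0.8, 1.0)
character  = (0.5, 0.4, 0.3, 1.0)  # opaque: fully holds out the bg
final = flatten([background, character])
```

With dozens of layers per shot, surfacing this stacking order in one window is what saves artists from untangling a wall of Merge nodes by hand.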
Inspired by Bokeh, an essential tool introduced in the Nuke 14.0 release, TAT Productions created Optical Defocus, an in-house depth of field tool based on ZDefocus.
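A depth-driven defocus like this maps each pixel’s depth to a blur size via the circle of confusion. The thin-lens sketch below illustrates the underlying optics; it is our own illustrative model, not the studio’s tool, and the camera values are arbitrary:

```python
# Illustrative thin-lens circle-of-confusion model, the optics behind
# depth-driven defocus tools like ZDefocus. Not studio code; camera
# values below are arbitrary. All distances are in millimetres.

def circle_of_confusion(z, focus_dist, focal_mm, fstop):
    """Blur-disc diameter (mm on the sensor) for a point at depth z.

    Thin-lens model: coc = (f^2 / (N * (d_f - f))) * |z - d_f| / z,
    where d_f is the focus distance and N the f-stop.
    """
    scale = focal_mm ** 2 / (fstop * (focus_dist - focal_mm))
    return scale * abs(z - focus_dist) / z

# A point on the focal plane is sharp; blur grows away from it.
sharp  = circle_of_confusion(2000.0, 2000.0, 50.0, 2.8)
blurry = circle_of_confusion(8000.0, 2000.0, 50.0, 2.8)
```

Feeding a per-pixel blur size like this into a variable-size filter is what lets the defocus stay physically plausible across the whole depth range of a shot.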
Finally, they created Position Matte, a tool based on deep and position world information that allowed the studio to create custom position mattes, Z depth, 3D noise or vertical mattes. All these tools perform their math operations in deep, so the anti-aliasing is preserved and they have little to no edge artifacts. “It’s all linked to the shot camera—extracted from EXR metadata—so we don’t have to manually rotoscope,” Romain adds.
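To give a flavor of how a position matte works: the world-position AOV stores each pixel’s 3D coordinate, so a matte can be computed as membership in a 3D shape, such as a sphere with a soft edge. The sketch below is a simplified, non-deep illustration of the idea (the centre, radius and falloff values are our own example):

```python
import math

# Simplified sketch of a position matte: a mask derived from the
# world-space position AOV. A pixel is fully inside the matte when
# its P value falls within a sphere around a chosen centre, with a
# linear falloff for a soft edge. Centre/radius are illustrative,
# and unlike the studio's tool this operates on flat (non-deep) data.

def sphere_matte(p_world, center, radius, falloff):
    """Return a 0..1 matte value for a world-space position."""
    d = math.dist(p_world, center)
    if d <= radius:
        return 1.0
    if d >= radius + falloff:
        return 0.0
    return 1.0 - (d - radius) / falloff  # linear soft edge

inside  = sphere_matte((1.0, 0.0, 0.0), (0.0, 0.0, 0.0), 2.0, 1.0)
edge    = sphere_matte((2.5, 0.0, 0.0), (0.0, 0.0, 0.0), 2.0, 1.0)
outside = sphere_matte((5.0, 0.0, 0.0), (0.0, 0.0, 0.0), 2.0, 1.0)
```

Because the matte is defined in world space and driven by the shot camera, it sticks to the scene as the camera moves, which is exactly what removes the need for manual rotoscoping.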
On to reviewing
“All of our shots were rendered with NukeX in Deadline with a light denoiser. We also created a proxy video for all our EXR sequences for smoother review playback,” Romain explains.
When it came to reviewing, they turned to HieroPlayer, Foundry’s flexible and free review tool that allows artists to review shots in context and compare render versions efficiently.
“When HieroPlayer became available with Nuke and NukeX, we integrated it into our pipeline to review multiple shots with multiple timelines.” The team at TAT Productions used our tool for lighting videos, precomped lighting EXRs, comp videos, denoised EXRs, comp EXRs, and denoised videos.
As shown above, HieroPlayer allowed them to check an entire sequence at the same time and to visualize every step.
“Moreover, we linked this to our internal production tracker, so we could directly change the status of the shot.”
Let’s get creative
One of the biggest technical challenges the team had to manage on Epic Tails was the Zeus sequence. In that particular sequence, Zeus watches the main characters’ adventures in his basin and plays with it. But what were the challenges?
“On the 3D side, we had separate scenes but, because of the complexity of the content, we couldn’t afford to make Levels of Detail (LODs) or other assets,” Romain explains. “So we rendered each scene separately and reunited them in compositing while still managing FX, camera movement and parallax,” he continues.
We asked the team at TAT Productions to share some of their favorite scenes from the film. Keep scrolling to discover the fascinating visual story they created. If you’ve already watched Epic Tails, we’re sure these scenes will ring a bell. In case you haven’t, prepare to be amazed by the studio’s artistry!
Sailing to the future
What does the future look like at TAT Productions? “We are currently working on our fifth and sixth films, The Jungle Bunch 2 (2023) and Pets on a train (2025), as well as on the famous Netflix series, Asterix, which is scheduled for 2024.”
And NukeX will be there to support them along the way. “We have created all these custom tools for multiple projects and maintained them across multiple NukeX versions. We have planned to keep on working with NukeX on this workflow and constantly improve it. We are also looking closely at ACES & HDR workflows,” Romain tells us.
TAT Productions are storytellers (créateurs d’histoires) on the animation stage—they create the kind of stories you instantly become part of just by watching. The stories that make you feel. If you haven’t watched Epic Tails yet and want a reason, this is it. Go watch a story you’ll remember!