Project EIST: bringing the beauty of choice beyond media and entertainment

The art of storytelling is as old as the dawn of time, and has evolved to encompass a huge array of mediums as technology has advanced alongside it. A huge fuelling factor in this evolution is us, the audience, and our ever-increasing appetite for stories that engage and delight. As a society, it seems safe to say that we’re constantly on a quest for immersion.

More recently there’s been a rise in interactive storytelling across media and entertainment, which comes in response to our shifting media consumption habits. Interactive content such as Netflix’s 2018 release Black Mirror: Bandersnatch goes a long way towards bringing this form of storytelling into the modern zeitgeist. Let’s not forget that allowing an audience to ‘choose their own story’ has served as a cornerstone of massively popular role-playing games (RPGs) for years, reflecting our enduring interest in autonomous choice, now made possible in live-action productions.

As interactive storytelling has gained a new lease of life in recent years, so too have immersive experiences driven by virtual reality (VR) and augmented reality (AR) become hugely popular, thanks to technological advances in headsets and supporting hardware.

It makes sense, then, for the combination of both to serve as the apex of immersive storytelling—yet as we’ll explore below, this exact combination retains an air of exclusivity. We’ve written before on how a pioneering Dublin-based studio used AR to interactively re-spin a classic tale, but this was a complex undertaking which required no end of technical depth and expertise.

The Enabling Interactive Story Telling (EIST) project, which saw Foundry collaborate with the BBC R&D team, addressed the need to allow more people—from different backgrounds and industries, and with different skill sets—to engage with immersive media. 

Below, we dive into the details of this just-wrapped project and the benefits it brings, both to creatives looking to generate content from CGI and/or live-action footage faster, and to sectors outside of media and entertainment that want to explore a new kind of storytelling.

Man in VR headset

Scoping the market

Despite a somewhat lacklustre few years, the future looks bright for immersive technology: its expected market potential stands at $98.4m by 2023, with a global population penetration of 2%. Both end users and content creators anticipate what the future holds for this tech with bated breath—not just in the entertainment industry, but beyond it into other professional sectors.

With recent hardware advancements for displaying immersive experiences—such as the all-in-one Oculus Quest, which promises ‘no PC’, ‘no wires’ and ‘no limits’—interest across all sectors is at an all-time high in what will be possible for experiences that transport us beyond our own reality.

To sustain this growth, AR and VR hardware now needs content in the form of compelling immersive experiences that are well designed and of a high enough quality to satisfy a discerning audience.

Yet these same experiences, and their promise of highly interactive storytelling, use bespoke tools specifically designed for either the film or games sector, and often require the hiring of many games or media developers to create the experience. Not only does this send costs soaring, but it also raises the barrier to entry, limiting the potential for widespread impact beyond the media and entertainment industries.

Foundry firmly believes that immersive experiences are not—and should not be—limited to the creative and entertainment sectors. We’ve covered the use of AR/VR extensively elsewhere—such as in the architecture, engineering and construction industry—and the sentiment still stands. There’s a need across many sectors to use AR/VR both for in-house purposes and for end-user experiences which aren’t currently economically viable, such as virtual tourism.

Enter EIST

Spanning the course of two years, EIST was a government-funded grant project run via the Technology Strategy Board's emerging and enabling technologies programme. It aimed to put tools that are generally reserved for the creative sector into the hands of other industries—such as those mentioned above—allowing companies to quickly create branching, non-linear stories that had previously been prohibitively expensive to produce.

By reducing the barrier to entry for content creators in these industries, businesses would be better placed to improve productivity or generate new business models using AR or VR technology—whether that be through interactive staff training or totally new user experiences.

Quote from Dan Ring, Foundry's Head of Research

As part of the project, Foundry engaged with institutions in sectors outside of media and entertainment in a bid to understand their motivations and behaviour. This ensured that the design of the overall product and framework was appropriate to their needs in enabling storytelling for end-user scenarios.

Following this feedback, and together with the BBC R&D team, Foundry set out to develop a scene layout and sequencing tool for non-linear media objects, with native support for Pixar’s Universal Scene Description (USD) and support for multiple renderers, including real-time ray tracing. This ensured that various media types—including 360° video and light fields—could also be supported.

Speaking about the project, Dan Ring, Foundry’s Head of Research, comments: “[EIST] helped drive our investigations into key technologies like USD, Hydra & game engines, making it one of our most valuable research projects to date.”

Continuity is key

Before embarking on the development of project EIST’s scene layout and sequencing tool, Foundry’s research team first scoped what was already out there for content creators looking to tell interactive stories. Unsurprisingly, game engines proved the most prevalent tools in this space, since their development environments boast the rendering capabilities and frameworks required to engineer complex virtual worlds and storylines.

Yet the use of game engines in constructing interactive content comes with a caveat. Their real-time rendering capabilities make them especially valuable at early stages of the production pipeline, such as previsualization, but these same packages rarely provide the infrastructure for wholesale integration into the pipeline.

Consequently, decisions and changes made in these environments are not easily propagated, and have to be tracked and replicated elsewhere, leading to a loss of continuity in both data and decision-making. And as we’ve recently explored, both are indispensable in integrating real-time workflows into modern post-production pipelines.

With the above in mind, Foundry’s research team uncovered the need for a ‘real-time production hub’ which, right from the beginning of pre-production, sits at the core of the pipeline.

UI grab of the project's editing and sequencing tool

The right tools for the job

Under the umbrella of project EIST, a real-time scene layout and playback review tool was developed as an alternative to game engines for creating interactive stories, using the Oculus and Vive systems for real-time collaboration in VR.

It aims to solve a host of pipeline efficiency problems posed by game engines, whilst offsetting the continuity issues that come with introducing volumetric video into a dynamic timeline (made particularly difficult in first-person media such as VR, where frequently cutting between cameras to mask continuity issues can be uncomfortable).

Using USD and FBX files, the editing tool creates an environment in which animatics are the starting point and can be iterated on throughout production. As higher-fidelity assets are created, they can be layered into the scene and immediately reflected in the content visualisation, providing a continuity of both environment and decision-making that reduces the effort of replication.
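
To make this layering idea concrete, here’s a minimal sketch using USD’s Python bindings (pxr). It isn’t taken from the EIST tool itself—the file names and prim paths are invented for illustration—but it shows the composition mechanism such a workflow relies on: a low-fidelity animatic layer is authored first, and a stronger high-fidelity layer later overrides the same prim without touching the original.

```python
# A minimal sketch of USD layering with the pxr Python bindings.
# File names and prim paths are illustrative, not from the EIST tool.
from pxr import Sdf, Usd, UsdGeom

# 1. Author a low-fidelity animatic layer: a cube stands in for the hero asset.
animatic = Sdf.Layer.CreateNew("animatic.usda")
stage = Usd.Stage.Open(animatic)
UsdGeom.Cube.Define(stage, "/Scene/Hero")
animatic.Save()

# 2. Later in production, a higher-fidelity layer overrides the same prim path.
hifi = Sdf.Layer.CreateNew("hero_hifi.usda")
hifi_stage = Usd.Stage.Open(hifi)
hifi_stage.OverridePrim("/Scene/Hero")  # refined opinions authored here
hifi_stage.DefinePrim("/Scene/Hero/RenderMesh", "Mesh")
hifi.Save()

# 3. The shot stage composes both layers; earlier sublayers are stronger,
#    so the hi-fi opinions win wherever they override the animatic.
shot = Usd.Stage.CreateNew("shot.usda")
shot.GetRootLayer().subLayerPaths.append("hero_hifi.usda")
shot.GetRootLayer().subLayerPaths.append("animatic.usda")
shot.GetRootLayer().Save()

print(shot.ExportToString())  # flattened view: animatic plus hi-fi overrides
```

Because the animatic layer is never destructively edited, swapping in better assets is a matter of adding stronger layers, which is what lets decisions made early in pre-production survive into final content.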

The result is a more structured approach to capturing all the data: making sure it’s timestamped, synchronised and arranged into a consistent format, then delivered downstream in a more sensible way.
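
As a purely illustrative example (the project’s actual schema isn’t public here, so every field below is an assumption), one such structured capture record might look like this:

```python
import json
import time
import uuid

# Hypothetical capture record (invented schema, not the project's own):
# every change is timestamped, attributed to a source tool, and serialised
# into one consistent format for downstream consumers.
record = {
    "id": str(uuid.uuid4()),    # unique key for synchronising across tools
    "timestamp": time.time(),   # when the decision was made (epoch seconds)
    "source": "layout_tool",    # which application authored the change
    "event": "asset_swapped",
    "payload": {"prim": "/Scene/Hero", "from": "proxy_v1", "to": "hero_hifi_v3"},
}

print(json.dumps(record, indent=2))
```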

Second UI grab of EIST's editing and sequencing tool

With VR as the primary medium for interactive content, the tool explores solutions for clients engaged in making VR experiences, and for smaller content producers for whom hiring a team of game developers is expensive, in support of the overarching goal of project EIST.

Foundry’s efforts in upholding this same goal pair perfectly with StoryFormer, the tool created by the BBC as part of the EIST project. The structure behind its narrative paths has been designed to be easily navigated by people from a film or video background, without requiring constant input from game engineers or numerous coding iterations. As a result, reviewing and amending the underlying story graph requires little to no programming experience.
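
For a flavour of what a branching story graph looks like under the hood, here’s a hypothetical sketch in Python. To be clear, this is not StoryFormer’s actual data model: the class names, fields and clip names are all invented for illustration. The point is that the branches live in data, so reviewing and amending them means editing a graph rather than writing code.

```python
# Hypothetical branching story graph (NOT StoryFormer's actual data model).
# Class names, fields and clip names are invented for illustration.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Choice:
    label: str   # text presented to the viewer
    target: str  # id of the node this choice leads to


@dataclass
class StoryNode:
    node_id: str
    media: str                                            # clip this node plays
    choices: List[Choice] = field(default_factory=list)   # empty = ending


story = {
    "intro":  StoryNode("intro", "intro.mp4",
                        [Choice("Open the door", "door"),
                         Choice("Stay inside", "inside")]),
    "door":   StoryNode("door", "door.mp4"),
    "inside": StoryNode("inside", "inside.mp4"),
}


def play(node_id: str) -> None:
    """Walk the graph, prompting for a choice at each branch point."""
    node = story[node_id]
    print(f"Playing {node.media}")
    if not node.choices:
        return
    for i, choice in enumerate(node.choices):
        print(f"  [{i}] {choice.label}")
    picked = int(input("Choose: "))  # stands in for the viewer's controller/UI
    play(node.choices[picked].target)


play("intro")
```

Because the whole narrative is a plain data structure, a film or video editor can rearrange, add or prune branches in a visual tool, with no game engineer in the loop.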

Needless to say, we’re very excited about the potential both tools unlock for content creators wishing to experiment with interactive experiences involving VR, AR, volumetric and light field imagery. Stay tuned as we share more information over the coming months on how we aim to help you bring your immersive experiences to life—right before your very eyes.