Image courtesy of Volograms

Not the same old story: how Volograms used volumetric to respin a classic tale

Humanity has a habit of seeking increasingly engaging, captivating experiences, particularly when it comes to media and entertainment. 

We’ve seen it in TV and cinema—from color TV to 3D viewing, and now VR. The same can be said for games. If there’s spectacle involved, chances are that an audience wants to be as immersed as possible in the experience, which is often facilitated by cutting-edge technology.

Theatre, as a storytelling medium, has been a little slow to catch up—but caught up it has. There’s been a steady rise in interest in ‘promenade theatre’, where a traditionally closed set is transformed into a living, breathing environment in which the audience actively participates.

The autonomy granted to the audience to roam freely about a production set, choose who to speak to and how to act is what sets promenade theatre apart from traditional theatre, making the whole experience ultra-immersive.

It’s also what makes this type of theatre perfect for location-based AR, VR and MR.

MWM Interactive’s Chained: A Victorian Nightmare is a pop-up arcade based on Charles Dickens’ renowned tale A Christmas Carol. It uses VR to immerse stout-hearted participants in a... well, Victorian nightmare.

This isn’t just a re-telling of the classic story. Instead, when participants are asked by performers to ‘cross over’—an ominous way of asking them to put the VR headset on—they become the central Scrooge character.  

The participant’s perception of reality changes, shifting the narrative focus onto them so they can tell their own story—entirely through tech.

Pioneering companies are already making inroads into this space using volumetric imagery, with fascinating results.

Behind the curtain

We caught up with Rafael Pagés, CEO of Dublin-based Volograms, who specialize in turning real people into volumetric holograms—or, ‘volograms’.

These volograms can then be used in a huge array of AR and VR experiences: from apps and social media marketing efforts to—yep, you guessed it—immersive storytelling.

Volograms’ most recent effort in this space is a retelling of Samuel Beckett’s short play Play!. Work on the project began before Volograms even existed as a company.

Speaking of the project’s inception, Rafael comments: “We created the very first version of Play! while we were still researchers at V-SENSE, a research group in Trinity College Dublin in late 2017.”

“This first version we worked on was tailored for virtual reality, and as we didn’t have a studio, we gathered some of our own cameras and captured the actors in an improvised green-screen studio.”

Volograms was then established in 2018, and continued to improve upon the technology used in the first iteration of the project. All the sequences were reprocessed using volumetric imagery, and a version of Play! was created for augmented reality, taking place at Trinity College, Dublin.

Rafael comments: “This presented new challenges but gave us the possibility of mixing a real environment with the volograms of the actors, something we had never seen before.” 

Lights up

Taking place over one act, Play! focuses on a husband, his wife and his mistress as they recount their experiences with infidelity in a nonsensical stream of consciousness. 

In a traditional, stage-based version of the play, the three actors are typically presented to the audience in identical funeral urns. Their cues to speak are heralded by a spotlight which shines on them in turn.

Volograms identified this mechanic of the play as something that could be built upon using volumetric technology to heighten a sense of immersion and presence in the viewer. 

“The idea behind both the VR and AR versions of Play! was giving the user the possibility of impersonating the interrogator, so for that, we needed it to be interactive,” Rafael tells us.

“In the original play, each actor talks when the interrogator wants them to, which is normally signalled by pointing a spotlight in their face. In our version, the volograms of each character are gaze-activated, so they only talk when the user is looking at them.”

Immediately, the viewer becomes the central figure. They assume the role of interrogator and can control the pace and direction of the play in a way that traditional audiences can’t.

What’s more, the order in which each vologram speaks is entirely dependent on the viewer’s gaze. The story then becomes non-linear, and what one viewer experiences may be completely different from what the person next to them sees.
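As a rough illustration of how such a gaze trigger can work, here is a minimal Python sketch. It is not Volograms’ code: the character objects, their play() and pause() methods and the 10-degree threshold are assumptions, and a real AR app would read the head pose and gaze direction from the headset’s tracking API.

```python
import numpy as np

# Illustrative threshold: how closely the user must look at a character to "spotlight" them.
GAZE_ANGLE_DEG = 10.0

def is_gazed_at(head_position, gaze_direction, character_position, threshold_deg=GAZE_ANGLE_DEG):
    """Return True if the user's gaze direction points roughly at the character."""
    to_character = character_position - head_position
    to_character = to_character / np.linalg.norm(to_character)
    gaze = gaze_direction / np.linalg.norm(gaze_direction)
    angle = np.degrees(np.arccos(np.clip(np.dot(gaze, to_character), -1.0, 1.0)))
    return angle <= threshold_deg

def update(characters, head_position, gaze_direction):
    """Called every frame: play the character being looked at, pause the rest."""
    for character in characters:
        if is_gazed_at(head_position, gaze_direction, character.position):
            character.play()    # resume this character's vologram sequence and audio
        else:
            character.pause()   # the others wait, as they do for the spotlight on stage
```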

What’s in a vologram?

The AR version of Play! relied entirely on live actors being turned into volumetric holograms, or volograms, which were then placed into the viewer’s surroundings as 3D images through the Magic Leap One.

Described technically, a vologram is a set of 3D models that are played sequentially—one per frame, 30 per second. Each model is composed of a 3D mesh and texture image.
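In code terms, that structure might look something like the sketch below. The class names, file fields and frame-indexing logic are hypothetical rather than Volograms’ actual format; the point is simply that playback means swapping in a new mesh-and-texture pair 30 times a second.

```python
from dataclasses import dataclass
from typing import List

FRAME_RATE = 30  # models per second, as described above

@dataclass
class VologramFrame:
    mesh_path: str     # the 3D mesh for this frame (file format is an assumption)
    texture_path: str  # the texture image mapped onto that mesh

@dataclass
class Vologram:
    frames: List[VologramFrame]

    def frame_at(self, elapsed_seconds: float) -> VologramFrame:
        """Pick which mesh/texture pair to display at a given playback time."""
        index = min(int(elapsed_seconds * FRAME_RATE), len(self.frames) - 1)
        return self.frames[index]
```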

But how is a person turned into a vologram? We’ll let the expert explain: “We film that person from different viewpoints and we use the resulting synchronised videos to run our 3D reconstruction algorithms.”

“The resulting assets can have different polygonal and texture resolutions, so the user can use the version that guarantees the best performance for his/her development platform.”
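Picking a variant could then be as simple as the following sketch, where the resolution tiers, triangle counts, texture sizes and platform names are all invented for illustration rather than taken from Volograms’ specifications.

```python
# Hypothetical resolution tiers -- the numbers are illustrative, not published specs.
ASSET_VARIANTS = {
    "high":   {"triangles": 60_000, "texture_px": 4096},
    "medium": {"triangles": 20_000, "texture_px": 2048},
    "low":    {"triangles": 5_000,  "texture_px": 1024},
}

def pick_variant(platform: str) -> str:
    """Choose a mesh/texture resolution suited to the target device."""
    if platform in ("pc_vr", "desktop"):
        return "high"
    if platform in ("standalone_headset", "magic_leap"):
        return "medium"
    return "low"  # mobile AR and other constrained devices
```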

Since each vologram carries its own 3D meshes and textures for every frame, the sheer amount of data that needs to be stored posed its own challenges for the production of Play!.

Whilst its runtime is only around 10 minutes, at 30 models per second Play! nonetheless involved around 20,000 frames—and, consequently, around 20,000 models.
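A quick back-of-the-envelope estimate shows why that is a challenge. Only the frame rate and runtime below come from the figures above; the per-frame mesh and texture sizes are invented purely for illustration.

```python
FRAME_RATE = 30            # models per second
RUNTIME_SECONDS = 10 * 60  # a roughly ten-minute runtime

total_frames = FRAME_RATE * RUNTIME_SECONDS  # 18,000 -- i.e. "around 20,000" models

# Purely illustrative per-frame sizes; real mesh and texture sizes depend on resolution.
mesh_mb, texture_mb = 1.0, 0.5
uncompressed_gb = total_frames * (mesh_mb + texture_mb) / 1024
print(f"{total_frames} frames, roughly {uncompressed_gb:.0f} GB before compression")
```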

“Luckily, we were able to compress the sequence and it is now in a reasonable size,” Rafael comments.

The final product speaks for itself. Play! has been well received by critics and casual participants alike, with some claiming that even Samuel Beckett himself—famed for his eccentric methods of storytelling—would appreciate the artistry and creativity of this version of Play!.

When asked about what particularly stood out for him as a highlight when working on the project, Rafael reflects on how interdisciplinary it all was: “This is a research project involving artists, actors, drama scholars and computer scientists all working towards an ambitious goal. It was very exciting!”

Volumetric: due an encore?

So what is it about volumetric imaging, if anything, that makes it a more immersive storytelling experience—and where does this leave more traditional forms of storytelling?

Rafael comments: “I think volumetric video has the potential to become a new medium for storytelling.”

“In a way, it feels theatrical, as you are standing in front of actors. However, you can build cinematographic effects on top of it, bringing the experience to the next level.”

And this, says Rafael, is what gives volumetric video the edge over conventional avatar-based VR when it comes to telling a story: “Having real humans in the experience, as opposed to avatars, makes it realistic and much more immersive.”

It seems, then, that volumetric has the potential to provide the best of both worlds.

If used as part of promenade theatre, the audience can experience a heightened sense of immersion that is only unlocked through volumetric technology, whilst still enjoying the fundamental aspects of traditional theatre—live actors, sets, and so forth.

Their reality is not changed completely—only augmented to go above and beyond the realms of possibility in a terrific display of science fiction.

There are challenges to be addressed if volumetric imagery is to become a new medium for storytelling, as Rafael postulates. Huge data sizes, and how to handle these, keep studios from diving headfirst into using volumetric as a go-to method of heightening immersion. But if these challenges can be tackled and solved, the possibilities are endless for this burgeoning technology.