The Big Interview: Ben Grossmann, co-founder and CEO of Magnopus

Ben Grossmann is an Academy Award-winning visual effects supervisor and virtual reality (VR) director who won the Oscar for Outstanding Special Visual Effects for his work on Hugo in 2012. In 2014, just after receiving a second Academy Award nomination for Star Trek Into Darkness, he co-founded Magnopus – a VR experience company – with Alex Henning and Rodrigo Teixeira.

As part of our VR series, Foundry Trends sat down with Ben for a fascinating and forthright chat on just how far the space has come—and what lies ahead for experiential VR.

Foundry Trends (FT): Why did you set up Magnopus?

Ben Grossmann (BG): The three founders mostly came from the movie business, and in the years prior to forming the company we’d noticed a hunger from filmmakers for creating a more immersive moviegoing experience.

We were working with directors who were all reaching for techniques that could make the storytelling process more engaging—whether that was through high dynamic range, high frame rates, or stereoscopic 3D.

But we felt that they were going to hit a ceiling, because these are all techniques to fool the human eye into thinking something is real. Directors were still inherently controlling the audience’s viewpoint through lensing and editing, which is always going to feel passive and unnatural.

We believed that immersion would be the next evolution of narrative storytelling, and had been trying to move more in that direction for years. But we couldn’t do that from within the world of traditional VFX, bound by the constraints of filmmaking—we needed to start afresh from a different perspective.

Cinema rendered in 3D

FT: Do you have a process you follow, or do you rely on inspiration?

BG: When we started we took a relatively methodical approach to it. We sketched out a list of the kinds of technological capabilities that were coming to market, and how those would affect engagement between content creators and content consumers. In the early days we wanted to test out 360 video cameras—it was a very new thing at the time and we needed to figure out what it was capable of.

So we built some cameras, made a few experiences, and won some awards. It was definitely a leap forward, but it didn’t complete the journey because we found that audiences were still kinda passive. Sure, they had a little more control, but decision-making within the narrative was still missing.

We saw burgeoning potential in augmented reality (AR) and fully immersive VR powered by game engines, so that’s what we started moving towards—and we learned some surprising things.

 

FT: Can you give us any examples?

BG: We’d have suspected instinctively that having a team of highly qualified VFX artists and engineers would allow us to become experts in using game engines, but in reality it was almost counter-productive. The language of VFX is very technically proficient, yet the techniques, tools, and capabilities we’d learnt were in many cases antithetical to the real-time game engine process.

That meant we had to evolve the company to draw more heavily on talent from the games industry and academia. The quality of our experiences, both from a visual and interactivity standpoint, became much higher.

We also wrestled with 360 video as a format. Generally speaking, audiences engage with it well when it’s in the context of a live transmission or broadcast—like a piece of journalism or a live stream of a specific event. But we discovered that people wouldn’t pay for this content unless it was something really exclusive, and then they’d expect it to be very highly produced. That meant it was disproportionately expensive to create.

In the VFX world we struggle to render and composite 2K, 24fps material. But for good quality 360 video the absolute minimum is 4K, with a frame rate of at least 30fps. You really want high dynamic range, but you can’t get it easily due to bandwidth and computing power restrictions, especially on the mobile platforms most people want to use to consume 360 videos.

It was a massive headache, and it wasn’t like there was a bucket full of cash waiting for us at the end of that rainbow. We realised pretty quickly that 360 videos needed to be a smaller part of a much bigger experience.

 

FT: What kind of experiences are currently succeeding with consumers?

BG: Since our inception we’ve made more than 50 experiences, but three recent ones stand out as particularly popular.

Mission ISS is an educational tool that puts you inside the International Space Station. You can control things, complete missions, and even go outside the station. We also did two movie tie-ins: Blade Runner 2049: Memory Lab and Pixar’s Coco in VR. People love the idea of being part of a movie and being able to explore that world.

In traditional narrative storytelling, you sit down and listen to a story that I tell. I make all the choices as to where that story goes, and in a cinema or a theatre the audience accepts that without question. But, if you transport that into VR, the audience starts to reject it—they get frustrated and angry because they’re in an environment where they should be allowed to do almost anything, but you’re not letting them.

Of course, giving them more agency makes it exponentially more expensive. Creating a 30-minute linear story is one cost, but a story in which the audience can decide between, say, five different outcomes means producing 250 minutes of content.

So we thought: “what if we don’t tell the audience what to do at all?” What if we just put them in a world containing characters they know, and we let them go free to find little things to do and explore at their own pace? That was our thinking behind Coco in VR.

We also discovered that, when you put people into a story without a focus, they feel lonely. By adding a ‘social’ element to Coco, allowing friends to explore together, people started creating a story for themselves. The more people you add, the more fun you have, and this is something we’re now thinking about adding to every experience we do.

Noir cyberpunk streets in 3D

FT: How can the industry drive people to pick up a headset and get involved?

BG: There are two strategic ways you can look at that: from a consumer point of view and a business one. On the latter, hardware manufacturers have a lot invested, and if they don’t successfully create a viable marketplace for consumers and developers they will suffer. It’s still very hard to get consumers to buy devices, but they need to keep pushing it.

From a consumer standpoint I think there’s a couple of different strategies—one being to make experiences that become killer apps. People didn’t buy computers until there was spreadsheet software that definitively increased productivity. The VR sector should be focused on creating killer apps for enterprise and killer experiences for consumers.

Manufacturers also need to improve the quality of VR headsets while reducing their cost. People want faster, lighter headsets with better resolution, and that’s what’s being worked towards now.

At some point all of that will combine to create a tipping point, not dissimilar from most major technological movements in society.

FT: What role does mobile play?

BG: At the same time as VR is happening, AR is also picking up speed—and content developers are starting to have to make difficult choices. AR is appealing because it’s compatible with mobile devices that already have hundreds of millions of users, but right now the experience on mobile is generally poor. Yes, I can stick something on my coffee table and watch it run around, but do I really want to sit here holding my phone and losing the interactivity of my hands while watching that?

Another thing I’ve been told by a lot of my peers in the industry is that people who have VR hardware for PC are more likely to pay for content, and more likely to pay more for content—as much as $60, which is akin to a mainstream video game. But people who use mobile VR are far less likely to want to pay, and thanks to App Store prices their cap is around $5.99 per experience.

Essentially, you end up delivering a limited experience to mobile users, who end up thinking this whole VR thing is a bust. I speak to people all the time who’ll say “Yeah I’ve done Google Cardboard and it was okay I guess”. But that’s not true VR—a lot of mobile experiences just aren’t really helping the cause.

Screens and monitors on the spaceship

FT: Is there a piece of work Magnopus has made that you’re most proud of?

BG: Some of the most innovative, exciting experiences we create never actually see the light of day. One we’ve tested recently was an open world that exists entirely in the cloud, allowing a mobile client a much better experience than the device itself could possibly render. We’re pretty excited to see where we could go with that.

In Blade Runner 2049: Memory Lab we experimented with one of the first commercial applications of volumetric video. Viewers see a volumetric capture of an actual human performance – a much more faithful reproduction of an actor’s appearance – and we found that they really engaged with that, so we’re excited about pushing volumetric video further.

In addition we’ve created platforms like Disney Movies VR, and an upcoming project with a news company for 360 video and immersive information that I think people will love.

We’re also working in the background on things we’re really proud of, but that just don’t have the user base right now. They’re products we’ve been developing for a number of years already, related less to storytelling but giving people a better way to do everyday tasks. Watch this space for more on that.