More real than real: creating a feeling of 'presence' in VR
May 18, 2017
Virtual reality (VR) is on the verge of mainstream adoption, while augmented reality (AR) experiences have already begun to enter the public consciousness. What began as a niche is finally accelerating in its journey towards popular use.
From film studios and games developers to global brands and advertising agencies, creators are producing ever more sophisticated, immersive experiences to captivate and emotionally engage audiences. In its latest global forecast, CCS Insight predicts that sales of dedicated VR headsets will grow to 22 million by 2021 – an 800 percent increase on 2017. The same report claims sales of smartphone headsets, such as Samsung’s Gear VR, will grow five-fold to 70 million over the same period.
Despite the success of 360-degree video content to date, the entertainment industry has encountered numerous problems in achieving true immersion within those experiences. The Holy Grail in VR is to create a true sense of presence by making the viewer believe, even for a split second, they’re truly inside the virtual world. But that’s easier said than done and, while every technological advancement brings us a step closer, there remain a number of hurdles to overcome.
Most VR experiences are designed to be viewed on a headset strapped around your eyes like goggles. Stereoscopic 360 video adds depth in a similar way to stereo 3D movies but, while the experience is compelling, the depth is fixed at the point of capture and the illusion breaks as soon as you shift your head: the world moves with your head, rather than your head moving within the world.
This is one of the reasons VR in particular has a reputation for making people feel sick.
The development of volumetric video and positional VR aims to solve the limitations of 360 video. Companies like Lytro – with its light-field Immerge camera – are carving a new path in immersive content. The Immerge camera records the depth and distance of objects in an environment. Then, rather than stitching images together like a traditional 360-degree camera, it effectively recreates the scene in a 3D virtual space.
By capturing information on all light passing into the camera sensor, it’s possible to move around inside a scene, even looking under or behind objects, creating a true sense of presence. This has been termed ‘six degrees of freedom’ in VR circles.
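The difference between rotation-only tracking (three degrees of freedom, as in basic 360 video) and full positional tracking (six degrees of freedom) can be sketched in a few lines of Python. This is an illustrative toy, not any headset's actual tracking code; the function name and the yaw-only rotation are simplifying assumptions.

```python
import math

def world_to_view(point, head_pos, head_yaw, six_dof=True):
    """Transform a world-space point into the viewer's frame.

    With 6DoF the head's translation is taken into account; with
    3DoF (rotation-only) it is ignored, so the scene appears glued
    to the viewer's head position.
    """
    px, py, pz = point
    if six_dof:
        # Subtract the head position: the viewer moves *within* the world.
        px -= head_pos[0]; py -= head_pos[1]; pz -= head_pos[2]
    # Rotate into the viewer's facing direction (yaw about the vertical axis).
    c, s = math.cos(-head_yaw), math.sin(-head_yaw)
    return (c * px + s * pz, py, -s * px + c * pz)

# A point 2 m in front of the origin; lean 0.5 m right without turning.
point = (0.0, 0.0, 2.0)
v3 = world_to_view(point, head_pos=(0.5, 0.0, 0.0), head_yaw=0.0, six_dof=False)
v6 = world_to_view(point, head_pos=(0.5, 0.0, 0.0), head_yaw=0.0, six_dof=True)
print(v3)  # unchanged: the world moves with the head
print(v6)  # shifted left in view: the head moved within the world
```

Under 3DoF the point stays put no matter how the viewer leans, which is exactly the "world moves with your head" effect described above; under 6DoF it shifts in the view as a real object would.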
The experience is also impacted by the current field of view (FOV) offered by first- and second-generation headsets. Our binocular vision makes the human FOV around 200° horizontally, but most headsets give a measly 110° – just over half of what we see in reality.
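As a quick sanity check on those figures (which are the approximations quoted above, not exact optical measurements):

```python
human_fov = 200.0    # approximate human horizontal FOV in degrees
headset_fov = 110.0  # typical first/second-generation headset FOV

coverage = headset_fov / human_fov
print(f"Headsets cover about {coverage:.0%} of the human horizontal FOV")
# prints "Headsets cover about 55% of the human horizontal FOV"
```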
We’re also still a long way from creating experiences that match the resolution of the human eye; it’s immediately evident we’re watching via a screen, which makes it difficult to persuade the user that the action is unfolding before their very eyes.
Latency, too, can undo the illusion. Even the tiniest delay between a movement or command and the display’s response dissolves any sense of reality. To achieve full immersion, headsets need to improve drastically across FOV, resolution and latency.

Foveated rendering is another emerging technology making waves in VR. The rendering technique, which mimics the way humans focus on and process the world around them, uses gaze detection to tell the VR application where the user is looking – and therefore which area of the view to render in high definition.
Just as the human eye only focuses on a small window of the world around us at any one time, foveated rendering draws the rest of our FOV at lower resolutions. As well as saving an enormous amount of pixel data, the technology better replicates how we truly see the world, creating a deeper, more immersive experience.
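A minimal sketch of the idea in Python: assign each pixel a shading rate based on its angular distance from the gaze point, then estimate how much rendering work remains. The ring radii and rates below are illustrative assumptions – real foveation schedules vary by headset and GPU.

```python
import math

# Hypothetical foveation schedule: full resolution near the gaze point,
# progressively coarser shading further out. Radii and rates are
# illustrative assumptions, not any vendor's real values.
RINGS = [(10.0, 1.0),    # within 10° of gaze: full resolution
         (30.0, 0.5),    # 10°–30°: half resolution per axis
         (180.0, 0.25)]  # beyond 30°: quarter resolution per axis

def shading_rate(angle_deg):
    """Per-axis shading rate for a pixel at the given angular
    distance from the gaze direction."""
    for radius, rate in RINGS:
        if angle_deg <= radius:
            return rate
    return RINGS[-1][1]

def pixel_work_fraction(fov_deg=110.0, samples=111):
    """Estimate the fraction of full-resolution shading work left
    after foveation, over a square view with the gaze centred
    (small-angle approximation for angular distance)."""
    half, total = fov_deg / 2, 0.0
    for i in range(samples):
        for j in range(samples):
            x = -half + fov_deg * i / (samples - 1)
            y = -half + fov_deg * j / (samples - 1)
            rate = shading_rate(math.hypot(x, y))
            total += rate * rate  # rate is per axis, so area cost is rate²
    return total / samples**2

print(f"~{pixel_work_fraction():.0%} of full-resolution shading work remains")
```

Even with this crude three-ring schedule, well over half the shading work disappears, which is why the technique is so attractive for hitting VR frame rates.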
We have a way to go before we reach true immersion in VR and AR content, with major developments in both software and hardware still to be made. The push towards non-linear, adaptive experiences is an exciting one, but it will all be undone if a slight movement of the head means the illusion is broken – or indeed if we all end up feeling sick halfway through.
That we need hardware at all is a barrier in itself. After all, how immersed can you really feel if you’ve got a headset strapped to your face or are holding a controller in your hands? Perhaps somewhere down the line we’ll have an ocular implant, or maybe we’ll just see the arrival of something a little less ugly than Google Glass that people are more willing to wear.
Advancements in hardware and software need to go hand-in-hand if we’re to create a real sense of presence, and progress will likely be made in waves as a number of intertwined breakthroughs create a chain reaction that propels us forwards. We’re certainly on the road to something special, and it’s undoubtedly a fascinating time to be at the heart of the visual effects industry.
If this article has piqued your interest, why not get an insight into some of the hands-on challenges of VR production?