VR Technology Convergence
March 09, 2020
User Experience Architect
The first mention of virtual reality (VR) technologies in our annual Ten Digital Trends report was in our 2014 edition, and every report since 2016 has included an augmented reality (AR) or VR-related article. While marketing trends related to these technologies will continue into the foreseeable future, some discussion of the progression and capabilities of the technologies themselves would be helpful.
Interestingly, although related, AR and VR tend to have opposite goals: One strives to merge seamlessly with our day-to-day reality, while the other strives to take us as far away from reality as possible. Frankly, while the technology hurdles around AR are significant, the number of related and required capabilities is small compared to those required for immersive VR. So our focus here will mainly be on VR technologies.
You don your VR gear in your small apartment and find yourself in a VR adventure game, exploring a castle. The view is stunning. It is crisp, clear, and feels natural. Your eyes are relaxed. You see a pedestal with a book at the end of a long hallway. As you physically walk toward the book, you instinctively check your wrist for the time, but you have no watch in this world. However, as your eyes focus on your empty wrist, you catch some movement ahead. As you look up and quickly refocus on the pedestal, you see nothing. What was that? You hear a scurrying noise on the floor behind you, but you don’t want to look away as you continue walking toward the book.
As you stand before the book, you reach out and touch its leather binding. You can feel the texture of it. As you open the book, you feel the weight of the cover. You feel the edges of each page as they brush past your fingertips. You notice the smell of the pages, like the rarely visited sections of an old library. The spell you’ve been looking for is before you on the page, and there is faint, small writing in the margin warning against its use. You turn, speak the incantation…and summon your friend Jimmy from Chicago into the game.
As Jimmy walks up the hall toward you, you can hear how the echo of his voice sounds as natural as the sound of his footsteps bouncing off the stone walls. As you tell him about the book, you can see in his face that he is excited. Then you see his expression change as his eyes look over your shoulder. So as not to warn the creature, you silently begin to summon a fireball with your mind. Then you quickly step forward and turn, feeling the heat of the fireball in your hands, but it is too late. You feel a sharp, forceful impact on your chest and look down to see the hilt of the dagger with which you were just stabbed. As the world fades to black, you can hear Jimmy’s screams echoing further and further away as he continues to struggle with the beast.
This VR experience is the stuff of science fiction: You can easily read small, faint script in a book. Your eyes can focus close-up, causing further-away objects to blur naturally. You walk, physically, naturally, down the long hallway. You can feel the texture of the book and the weight of its cover. You can smell the musty odor of the pages. Your friend’s voice echoes naturally off the walls. You can see his expression, see the direction of his gaze. You conjure the fireball using only your thoughts, and you feel the heat on your hand. You feel the impact of the dagger.
Now the amazing thing: The technologies to make all this happen are not fictional. They are not vaporware or the empty promises of a vague patent. They are real and being experienced independently from one another today. If we were to plunder the labs of the companies working on these technologies, we could build a system that fully allows for the experience described above. Want to know more? At the bottom of this trend is a list of the various technologies in development. Each of them has, at a minimum, functional prototypes in the laboratories of the companies working on them.
As with any technological evolution, everything won’t happen at once. Some of these technologies are already starting to appear in consumer-available form. Others are still years away from being refined and ready for mass production. In a number of cases, there are competing approaches to solving the same basic problem, and a winning approach isn’t clear yet. A comprehensive discussion of these is beyond the scope of a single trend, but let’s briefly look at just the display problems and the technologies being developed to solve them.
The optical technology available in consumer VR gear today is quite basic: two displays, one for each eye, each viewed through a special lens that stretches the image over a wide field of view. These displays have two major issues.
Resolution and Display Density
Even though the displays seem fairly high resolution, because the image is stretched so wide, users experience what is called the “screen door” effect: you can see the black spaces between the lighted parts of the display. As a result, reading a book in VR, for example, requires you to bring the book very close to your face.
The most obvious solution to this problem is higher-resolution displays, and all the major players in the VR space are already working on this. However, they are also trying to present an ever-widening field of view. One company, Pimax, recently released the highest-resolution headset to date: the Pimax 8K has 3840×2160 pixels per eye and a 200-degree field of view, and initial reviews say the screen door effect is all but gone. The issue is that there isn’t yet a consumer-level computer and video card that can drive those displays at native resolution. No doubt such a system will be available in the next year or two, but its cost, not to mention that of the Pimax 8K headset itself, is prohibitive for the average consumer.
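A rough back-of-the-envelope comparison shows why even these numbers still fall short of the eye. Assuming pixels spread evenly across the stated field of view (a simplification; real lens optics distribute them unevenly), we can compare angular pixel density to the roughly 60 pixels per degree the human fovea can resolve:

```python
# Rough angular pixel density estimate for a VR headset.
# Assumes pixels are spread uniformly across the field of view,
# which lens distortion makes only approximately true.

def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Average horizontal pixels per degree of visual field."""
    return horizontal_pixels / horizontal_fov_deg

# Pimax 8K: 3840 horizontal pixels per eye over a ~200-degree field of view
# (headline numbers; per-eye FOV is narrower, so this is only a ballpark).
pimax = pixels_per_degree(3840, 200)

# Human foveal acuity is commonly cited as roughly 60 pixels per degree.
print(f"Pimax 8K: ~{pimax:.1f} ppd vs ~60 ppd foveal acuity")
```

By this crude measure, even the Pimax 8K delivers only about a third of foveal acuity, which is why wider fields of view and higher resolution keep pulling against each other.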
Another approach to the problem takes advantage of your eyes’ anatomy. Your eyes’ ability to perceive fine detail is concentrated in a relatively narrow area near the center of your retina called the fovea centralis. You can experience the effect quite easily: just try reading a sentence five lines above this one while keeping your eyes focused here. The technology that exploits this difference in acuity is called foveated rendering. It reduces the computational load by allowing your graphics card to render the majority of the view at very low resolution, retaining fine detail only at the center of your gaze. However, for foveated rendering to work, the system has to track where the user is looking and update the area of detailed rendering to match. The eye-tracking technology is something we already have, but the system reaction time for updating the location of detail isn’t quite there yet.
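As a sketch of the idea (the function name, tile scheme, and thresholds below are illustrative, not taken from any real VR SDK), a renderer might pick a per-tile shading rate based on each tile's distance from the tracked gaze point:

```python
# Minimal sketch of foveated rendering: choose a shading resolution per
# screen tile based on its distance from the tracked gaze point.
# All names and thresholds are illustrative assumptions.
import math

def shading_rate(tile_x: float, tile_y: float,
                 gaze_x: float, gaze_y: float) -> str:
    """Return a coarse shading rate for a tile (normalized 0..1 coords)."""
    dist = math.hypot(tile_x - gaze_x, tile_y - gaze_y)
    if dist < 0.1:      # foveal region: full detail
        return "1x1"
    elif dist < 0.3:    # parafoveal ring: quarter the shading work
        return "2x2"
    else:               # periphery: coarse shading
        return "4x4"

# Each frame, the renderer queries the eye tracker and shades every tile
# at the returned rate; low-latency gaze updates are what keep the coarse
# periphery from ever landing under the fovea.
print(shading_rate(0.55, 0.5, 0.5, 0.5))  # near the gaze point -> "1x1"
```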
Current thinking is that a combination of these two approaches will likely be the near-term solution to this problem.
Static Focal Plane
All consumer 3D-viewing technologies currently have this issue, including VR, AR, and even the 3D movies you see in the theater. In real life, if you focus on the foreground, things in the background get blurry, and vice versa. Flat displays, like those in the theater and in VR headsets, show the entire scene at the same focal distance. Even if you blur some elements, your eyes are still focused at the same depth to see those blurry elements. The stereo view, provided by each eye seeing a different image, is only part of how our brain perceives depth. When that stereo view and our eyes’ focus don’t match, a mismatch known as the vergence-accommodation conflict, we experience discomfort. We still perceive 3D, but the view, especially for a close-up object within about 6 feet, feels wrong. (This is especially an issue in AR, where digital elements are viewed directly against the real world.)
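A small worked example makes the mismatch concrete. The numbers below (a 63 mm interpupillary distance and a 2 m fixed focal distance, both typical but purely illustrative) show how the angle at which the eyes converge changes with the virtual object's distance, while the distance they must focus at never does:

```python
# Illustrates the vergence-accommodation mismatch: the eyes converge on
# the virtual object's apparent distance, but the lens must focus at the
# fixed optical distance of the headset display. Numbers are illustrative.
import math

IPD_M = 0.063            # typical interpupillary distance, ~63 mm
HEADSET_FOCAL_M = 2.0    # assumed fixed focal distance of the optics

def vergence_angle_deg(object_distance_m: float) -> float:
    """Angle between the two eyes' lines of sight toward an object."""
    return math.degrees(2 * math.atan((IPD_M / 2) / object_distance_m))

for d in (0.5, 2.0, 6.0):
    print(f"object at {d} m: vergence {vergence_angle_deg(d):.2f} deg, "
          f"focus still at {HEADSET_FOCAL_M} m")
```

The closer the virtual object, the larger the gap between where the eyes converge and where they must focus, which is why near-field objects feel the most wrong.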
The nearest-term solution uses eye tracking to render the scene at the focal depth of whatever the user is looking at. This is only a partial solution, though, because, as mentioned above, the out-of-focus areas of the scene are still at the same physical focal distance as everything else on the display.
A better and more promising approach is based on light field technology. A traditional camera captures a scene as a flat arrangement of colored pixels. A light field camera also captures the inbound direction of the light rays that make those pixels, generating an array of images, each from a slightly different point of view. A light field display presents that image such that, to see an object clearly, your eye has to focus on that object. It essentially presents light to the eye the way the eye was built to receive it.
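A toy sketch of the light-field idea, using synthetic data: store a grid of sub-views, each from a slightly shifted viewpoint, then refocus after the fact by shifting each sub-view in proportion to its viewpoint offset and averaging (the classic shift-and-add refocusing technique):

```python
# Toy light-field refocusing: a grid of sub-aperture views is shifted
# and averaged to bring a chosen depth plane into focus after capture.
# The data here is random and purely illustrative.
import numpy as np

rng = np.random.default_rng(0)

# 5x5 grid of sub-aperture views, each an 8x8 grayscale image.
views = rng.random((5, 5, 8, 8))

def refocus(views: np.ndarray, shift: int) -> np.ndarray:
    """Shift each sub-view proportionally to its grid offset, then average.
    Different |shift| values bring different depth planes into focus."""
    grid_u, grid_v, h, w = views.shape
    acc = np.zeros((h, w))
    for u in range(grid_u):
        for v in range(grid_v):
            du = (u - grid_u // 2) * shift
            dv = (v - grid_v // 2) * shift
            acc += np.roll(views[u, v], (du, dv), axis=(0, 1))
    return acc / (grid_u * grid_v)

focused = refocus(views, shift=1)
print(focused.shape)  # (8, 8)
```

A light field display runs the idea in reverse: instead of picking one focal plane at render time, it emits the directional rays themselves and lets the viewer's own eye do the focusing.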
To have the most immersive experience possible, many of these VR technologies will have to be brought together into a unified system. A good analogy would be the convergence of technologies brought together to create the iPhone. Capacitive touch screens, miniaturized components, mobile operating systems, mobile apps, proximity sensors, mobile cameras, the ability to sync your data to other services and/or your computer, and more all existed before the iPhone. But the full realization of the smartphone required all of them to be brought together into a unified form. The iPhone was a product of the convergence of all those technologies. Given the pace of research and innovation, it seems that we are rapidly approaching the inevitable convergence of VR technologies, and I’m excited to see the result.
List of Technologies
The following is a list of technologies, demonstrations, and presentations that, once matured, would allow someone to experience the VR scenario we imagined.
- Walk in all directions (360-degree treadmill)
- Feel things (gloves with force and haptic feedback, and temperature control)
- Feel larger things (haptic suits)
- Hear things (3D spatial audio)
- Smell things, feel the heat, feel moisture in the air
- Varifocal display system (another approach to addressing convergence and accommodation)
- Integrated low-latency eye/eye area tracking (also see foveated rendering)
- Speak to a computer with our minds
- Light field display that allows the eye to function naturally, with both convergence (crossing of the eyes) and accommodation (focus)
The list above is by no means comprehensive. There are other technical advancements being worked on that, while they don’t have direct application to our example VR experience, are important and exciting nonetheless.
- Feel rotational, and even some linear, motion without external mechanical gear
- Redirected walking
- Bare hand tracking
- Feel things in mid-air
- Real-time capture and streaming of volumetric people
- Light field recording and 360-degree video
- Hands-on: Pimax’s 8K Headset Proves That High FOV VR Is Coming (Road to VR)
- Infinadeck – ‘Ready Player One’ VR Treadmill (YouTube)
- KAT Walk Mini – A Ready Player One Omni-Directional VR Treadmill (YouTube)
- Virtuix Omni – Omniverse Trailer (YouTube)
- HaptX Gloves Launch Video – Realistic Touch for Virtual Reality (YouTube)
- A Real Life Haptic Glove (Ready Player One Technology Today) – Smarter Every Day 190 (YouTube)
- Feeling Temperature With a Haptic Glove – Smarter Every Day 191 (YouTube)
- Force Jacket: Pneumatically Actuated Jacket for Embodied Haptic Experiences (YouTube)
- NVIDIA VRWorks Audio Release 1.0 (YouTube)
- Oculus Reveals Its Next Big VR Trick: Wider-FOV Lenses That Automatically Move (Ars Technica)
- Tobii Proves That Eye Tracking Is VR’s Next Killer Feature (Engadget)
- Eye-Tracking Is a Game Changer for VR That Goes Far Beyond Foveated Rendering (Road to VR)
- MIT’s Wearable Device Can ‘Hear’ the Words You Say in Your Head (Engadget)
- Light Field Technology: the Future of VR and 3D Displays (YouTube)
- Edward Tang (Avegant): How Light Field Technology Will Change the Future of AR & Mixed Reality (YouTube)
- Researchers Induce Artificial Movement Sensation in VR Using Four-Pole Galvanic Vestibular Stimulation (Road to VR)
- Researchers Exploit Natural Quirk of Human Vision for Hidden Redirected Walking in VR (Road to VR)
- Oculus Claims Breakthrough in Hand-Tracking Accuracy (Road to VR)
- Leap Motion (Leap Motion)
- Jaunt Reveals Livestreamed Volumetric Video Solution for AR (Road to VR)
- Facebook Partners With RED Digital Cinema for Volumetric VR Camera (Road to VR)
- Adobe Motion Parallax Demo 1080 (YouTube)
- Add Depth to VR Video With Sidewinder (Adobe Research)
- Intel Demos World's First ‘Walk-Around’ VR Video Experience (YouTube)
- FeelReal | Multisensory VR Mask Review (YouTube)
- Founded by CERN Engineers, CREAL3D’s Light-field Display Is the Real Deal (Road to VR)