“Eye Tracking Is Coming to Virtual Reality Sooner Than You Think. What Now?”
Joakim Karlén handed me the headset, such as it was. It was a reference design only; the hard plastic box lacked a headstrap, and had a utilitarian form factor only a dystopian sci-fi fan would love. However, it was also completely self-contained—no cables trailing away from it to a nearby PC, no cellphone to power it. This was Qualcomm’s latest “standalone” headset reference, a prototype and platform architecture that the company would provide to developers in order to create all-in-one devices.

When I held the headset up to my eyes (no headstrap, remember?) I found myself looking into a mirror, seeing the reflection of the young woman who was my avatar. When I turned my head from side to side, so did the reflection—except her eyes stayed centered in their sockets. I could look at the mirror out of the corner of my eye, but my avatar couldn’t.

Until she could. I pressed a small button on the side of the headset, and immediately my avatar started acting less like a collection of sequences bounded by code, and more human. If I turned my head but looked at the mirror, my avatar’s pupils glided over to match mine. I could close my eyes. I could wink. All of it translated into my avatar’s facial expressions. That button had activated the eye-tracking technology of Tobii, the Swedish company where Karlén is a director of product management for VR. Two cameras inside the headset had begun watching my eyes, illuminating them with near-IR light, and making sure that my avatar’s eyes did exactly what mine did.

Tobii isn’t the only eye-tracking company around, but with 900 employees, it may be the largest. And while the Swedish company has been around since 2006, Qualcomm’s prototype headset—and the latest version of its Snapdragon mobile-VR platform, which it unveiled at the Game Developers Conference in San Francisco this week—marks the first time that eye-tracking is being included in a mass-produced consumer VR device.

Eye-tracking has been a part of the VR conversation for years; a company called FOVE even crowdfunded its own eye-tracking headset in 2015. Yet when the feature would actually arrive at scale has remained an open question. Even as multiple companies race to release all-in-one “standalone” headsets—Lenovo’s Mirage Solo launches in May, with Oculus’ $199 Go device rumored to be close behind, and HTC’s Vive Focus arriving stateside this year after a China-only debut—the dream of our eyes making it into VR seemed farsighted at best.

But this week has made clear that the technology may be closer to our consumer immersive-tech devices than many thought. The day before Qualcomm showed off its latest version of Snapdragon, Magic Leap released the developer toolkit for its own mixed-/augmented-reality headset, which included eye-tracking. Within the next year, we won’t just have better headsets—we’ll be better able to control them, using some of our most precise organs.


There’s no question that the power of the gaze has significant repercussions for virtual experiences. Eye-tracking unlocks “foveated rendering,” a technique in which graphical fidelity is only prioritized for the tiny portion of the display your pupils are focused on. For Tobii’s version, that’s anywhere from one-tenth to one-sixteenth of the display; everything outside that area can be dialed down as much as 40 or 50 percent without you noticing, which means less load on the graphics processor. VR creators can leverage that luxury in order to coax current-gen performance out of a last-gen GPU, or achieve a higher frame rate than they might otherwise be able to.
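The arithmetic behind that trade-off can be sketched in a few lines. This is a minimal, hypothetical model—the foveal radius and the 50 percent quality floor are illustrative values drawn from the ranges the article cites, not Tobii’s actual parameters:

```python
def render_scale(px, py, gaze_x, gaze_y, fovea_radius=0.15):
    """Return a quality multiplier for a screen region, given a gaze point.

    Coordinates are normalized to [0, 1]. Regions within the foveal radius
    of the gaze point render at full quality; everything outside can be
    dialed down -- here to 50%, the upper end of the reduction the article
    describes -- without the wearer noticing.
    """
    dist = ((px - gaze_x) ** 2 + (py - gaze_y) ** 2) ** 0.5
    return 1.0 if dist <= fovea_radius else 0.5


# Rough GPU-load estimate: average the multiplier over a coarse grid.
def estimated_load(gaze_x, gaze_y, steps=20):
    samples = [
        render_scale((i + 0.5) / steps, (j + 0.5) / steps, gaze_x, gaze_y)
        for i in range(steps)
        for j in range(steps)
    ]
    return sum(samples) / len(samples)
```

With a centered gaze, only the small foveal patch renders at full quality, so the estimated load lands much closer to 0.5 than to 1.0—which is exactly the headroom developers can spend on frame rate or fidelity.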

That’s just the ones and zeros stuff. There are compelling interface benefits as well. Generally, input in VR is a three-step process: look at something, point at it to select it, then click to input the selection. When your eyes become the selection tool, those first two steps become one. It’s almost like a smartphone, where tapping collapses selection and click into a single step. And because you’re using your eyes and not your head, that means less head motion, less fatigue, less chance for discomfort.
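The collapsed pipeline can be sketched as a tiny state check. The names here are hypothetical—this assumes the runtime already reports which object the gaze ray currently hits and whether the confirm button fired:

```python
def select_with_gaze(gaze_target, confirm_pressed):
    """Gaze-based selection: looking *is* pointing.

    Whatever object the gaze ray currently hits is already the selection
    candidate, so a single confirm press completes the interaction.
    Without eye-tracking, a separate head- or controller-pointing step
    would sit between looking and clicking.
    """
    if confirm_pressed and gaze_target is not None:
        return gaze_target
    return None
```

The design point is simply that the pointing step disappears: the only explicit input left to the user is the confirmation.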

But all of that might pale next to the power of non-verbal cues in multi-user VR. At the moment, avatars’ eyes are fixed in a forward position, and any eye contact is simulated; if you’re in a social VR space and you look around with just your eyes, nothing in your avatar demonstrates what you’re actually doing. With eye-tracking, though, your avatar can flick a sideways glance, blink, or give someone else a once-over—all crucial parts of a natural social dynamic.

However. There’s also that whole cameras-watching-your-eyes thing. Watching not just what your eyes are doing, but where they look and for how long—in other words, tracking your attention. That’s the kind of information advertisers and marketers would do just about anything to get their hands on. One study has even shown that gaze-tracking can be (mis)used to influence people’s biases and decision-making. If you’re looking for the Black Mirror outcome to the VR/AR story, those are the things to file in your Dystopian Potential subfolder.

Not a chance, according to Oscar Werner, the president of Tobii’s consumer business unit. “We take a very hard, open stance,” he says. “Pictures of your eyes never go to developers—only gaze direction. We do not allow applications to store or transfer eye-tracking data or aggregate over multiple users. It’s not storable, and it doesn’t leave the device.”

Tobii does make an exception for analytics, Werner concedes; the company has a business unit focused on working with research facilities and universities. He points to eye-tracking’s potential as a diagnostic tool for autism spectrum disorders, and to its applications in phobia research. But anyone using that analytical license, he says, must inform users and make eye-tracking data collection an opt-in process.

The company claims to have made this policy clear to all the headset manufacturers that it works with, from Qualcomm to StarVR. And Oculus, which by nature of its Facebook connection has already found itself under the data-collection microscope, has confirmed to WIRED that eye-tracking will not appear in either of its planned standalone devices. (In addition to the Go, the company is working on a more full-featured untethered headset, known as Santa Cruz.)

This particular conversation, though, is just beginning. VR is moving into its second generation, becoming lighter, stronger, and more immersive—and while eye-tracking is shaping up to be a key part of both performance and immersion, it’s also one of the most fraught improvements. If the eyes are the windows to our souls, it’s time to talk about how to draw the curtains.
