For XR, the Eyes are the Prize

About: This article was originally published on Motherboard (aka vice.com) in May 2019. This version contains new information and graphics.

How can we foster the best uses while minimizing harm?

Now is the time to begin building awareness and expectations. Eventually, I anticipate companies will have to concede to a government somewhere in the world (most likely in Europe) that eye data is like other health or biometric data, meaning it must be secured and protected at least as well as your medical records and fingerprints, perhaps better.

Eye Tracking vs. Human Vision

A short primer on human vision and eye tracking will make our evolutionary limitations and vulnerabilities much easier to understand. I’ll include some fun videos to watch alongside.

Our foveae see only a small part of the elephant at any given time, while our brains reconstruct a full image as we scan around. The green line is our horizon estimate.
The Art of Misdirection
Eye Tracking for Research Studies
You are blind more often than you realize
Redirected walking is helpful, but makes our vulnerability very clear

The Best Uses of Eye Tracking

There are enough positive and beneficial uses of eye-tracking to warrant its continued deployment, with sufficient safeguards.

Eye tracking for research

Level of Interest

There is a growing body of research indicating that pupil dilation, which many eye-tracking approaches can measure directly, can indicate the level of interest (emotional, sexual, or otherwise) in whatever one is looking at.
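To make that concrete, here’s a minimal sketch (in Python) of how a baseline-relative dilation score might be computed from a stream of pupil-diameter samples. The window sizes, sample values, and threshold of “interest” are illustrative assumptions, not values from any real product or study.

```python
# Sketch: baseline-relative pupil dilation as a crude "interest" signal.
# All numbers (window sizes, sample values) are illustrative assumptions,
# not taken from any real eye-tracking product or study.

from statistics import mean

def dilation_score(pupil_mm, baseline_samples=120, stimulus_samples=60):
    """pupil_mm: pupil diameters in millimeters, sampled at a fixed rate.
    Returns the percent change of the stimulus-period mean over the baseline mean."""
    if len(pupil_mm) < baseline_samples + stimulus_samples:
        raise ValueError("not enough samples")
    baseline = mean(pupil_mm[:baseline_samples])    # resting diameter before the stimulus
    during = mean(pupil_mm[-stimulus_samples:])     # diameter while viewing the stimulus
    return 100.0 * (during - baseline) / baseline

# Example: a mild dilation after something interesting appears.
samples = [3.0] * 120 + [3.3] * 60
print(f"dilation: {dilation_score(samples):.1f}%")  # ~10% above baseline
```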

The Dark Side of Eye Tracking

Some VR companies already collect your body-motion data. This can help them study and improve their user experience. I’ve worked on projects that did the same, albeit always with informed consent. But there’s a danger, to us and to these companies, that this data could be used improperly. And because they don’t share exactly what’s being saved, or how it’s stored, protected, and used, we’d never know.

Anonymity and Unique Identifiers

Machine-learning algorithms can uniquely identify you from your movements in VR (e.g., your walk, head motion, and hand gestures). It has recently been shown that you can be de-anonymized from your motion data in as little as five minutes. Dr. Jeremy Bailenson of Stanford VR has long written about how these kinds of algorithms can be used both for and against us. The patterns of your irises and the blood vessels in your retinas appear to be even more unique.
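To show how little it takes, here’s a toy sketch using scikit-learn with invented motion features and simulated data; real studies use richer features and far more sessions, but the principle is the same: per-user movement habits become a fingerprint.

```python
# Toy sketch: re-identifying "anonymous" users from motion summary features.
# The features, data, and scale are invented for illustration only.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def session_features(user_style):
    """Fake per-session features (e.g., mean head height, head-turn speed,
    hand-swing amplitude); user_style stands in for personal movement habits."""
    return user_style + rng.normal(scale=0.05, size=3)

# Simulate 3 users with 20 recorded sessions each.
styles = {u: rng.normal(size=3) for u in ["user_a", "user_b", "user_c"]}
X = np.array([session_features(s) for u, s in styles.items() for _ in range(20)])
y = np.array([u for u in styles for _ in range(20)])

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# A new, nominally anonymous session is re-identified from motion alone.
new_session = session_features(styles["user_b"]).reshape(1, -1)
print(clf.predict(new_session))  # expected: ['user_b']
```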

Data Mining

Microsoft and Magic Leap both make it relatively easy for third-party developers to build apps that use eye-gaze data from their devices. This is exciting for developers exploring new interaction paradigms, but it also represents a serious privacy concern.
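To illustrate the concern, here’s roughly how little code a third-party app would need in order to build a running record of everything you look at, once a runtime hands it gaze events. The names and types below are invented for this sketch; they are not Microsoft’s or Magic Leap’s actual APIs.

```python
# Hypothetical gaze-event callback, loosely modeled on what XR runtimes expose.
# None of these names are Microsoft's or Magic Leap's actual identifiers.

from dataclasses import dataclass
import time

@dataclass
class GazeSample:
    timestamp: float   # seconds since epoch
    target_id: str     # label of whatever the gaze ray hit
    dwell_ms: float    # how long the gaze has rested on that target

def on_gaze_sample(sample: GazeSample, log: list) -> None:
    """All a third-party app needs is a callback like this to build a
    running record of everything the user looks at, and for how long."""
    log.append((sample.timestamp, sample.target_id, sample.dwell_ms))

# Simulate the runtime delivering a few samples to the app.
gaze_log: list = []
for target, dwell in [("ad_banner_03", 850.0), ("price_tag", 1200.0), ("stranger_face", 400.0)]:
    on_gaze_sample(GazeSample(time.time(), target, dwell), gaze_log)

print(gaze_log)  # the app now holds a detailed attention trail
```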

Facebook’s Project Aria to learn how we see IRL
Future XR glasses will let us track you and everything else

Eye-tracking represents an unconscious “like” button for everything

Your eye movements are largely involuntary and unconscious. If a company is collecting that data, you won’t notice. The main limitation on these activities today is that AR headsets are still big and hot, not yet ready to be worn all day, everywhere. That will no doubt change with additional R&D.
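Here’s a minimal sketch of what that unconscious “like” button could look like in practice: involuntary dwell time, tallied per item, with no click and no awareness. The 500 ms threshold and the log format are assumptions for illustration.

```python
# Sketch: involuntary gaze dwell converted into implicit "likes."
# The 500 ms threshold and the log format are arbitrary assumptions.

from collections import Counter

DWELL_LIKE_THRESHOLD_MS = 500  # assumed: dwelling longer than this counts as interest

def implicit_likes(gaze_log):
    """gaze_log: iterable of (timestamp, target_id, dwell_ms) tuples.
    Returns a per-target tally of 'likes' -- no click, no awareness, no consent."""
    likes = Counter()
    for _, target_id, dwell_ms in gaze_log:
        if dwell_ms >= DWELL_LIKE_THRESHOLD_MS:
            likes[target_id] += 1
    return likes

print(implicit_likes([
    (0.0, "ad_banner_03", 850.0),   # counted
    (1.2, "stranger_face", 400.0),  # below threshold, not counted
    (2.5, "ad_banner_03", 900.0),   # counted again
]))  # Counter({'ad_banner_03': 2})
```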

Extrapolation

Have you ever searched for a product on Google and then seen ads on that topic for days or weeks? Have you ever seen those same ads continue to appear even after you bought said product?

Experimentation

It’s worth reminding ourselves at this point that advertising is not inherently evil. Some ads make you aware of products or ideas you might actually like, thus providing a mutual benefit to you and the manufacturer. But too much advertising today is already manipulative and misleading, albeit in a grossly inept kind of way.

Emotional Responses

Take a moment to imagine something that made you feel envious, regretful, angry, or happy. It could be something from your social life, family, work, politics, or any media that really moved you. Perhaps it’s a song lyric that perfectly matched your mood.

Advertising vs. Predation

Plan A: Let’s Do This The Easy Way

Here’s a starter set of some policies that almost any company could live by:

  • Raw eye data and related camera image streams should neither be stored nor transmitted. Highly sensitive data, like Iris-ID signatures, may be stored on each user’s device in a securely encrypted vault, protected by special hardware from hacking and spoofing. Protect this at all costs.
  • Derivatives of biometric data, if retained, must be encrypted on-device and never transmitted anywhere without informed consent. There must be a clear chain of custody, permission, encryption, and authentication, authorization, and accounting (AAA) wherever the data goes (ideally nowhere). This data should never be combined with other data without additional informed consent and the same verifiably high security everywhere.
  • Apps may only receive eye-gaze data, if at all, when a user is looking directly at the app, and must verifiably follow these same rules (see the sketch after this list).
  • Behavioral models exist solely for the benefit of the users they represent. The models must never be made available to, or used by, any third parties, on or off device. It would be impossible for a user to properly consent to all of the unknown threats surfaced by sharing these models.
  • EULAs, TOS, and pop-up agreements don’t provide informed consent. Companies must ensure that their users/customers are clearly aware of when, why, and how their personal data is being used; that each user individually agrees, after a full understanding; and that users can truly delete data and revoke permissions, in full, whenever they wish.
  • Don’t promise anonymity in place of real security, especially if anonymity can later be reversed. Anonymization does not prevent manipulation. Bucketing or other micro-targeting methods can still enable the more manipulative applications of sensitive user data in groups.
  • Users must be given an easy way to trace “why” any content was shown to them, which would expose to sunlight any such targeting and manipulation.
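As one way the gaze-scoping rule above might be enforced (a minimal sketch with invented types, not any platform’s actual mechanism), the runtime could simply refuse to deliver any gaze sample that doesn’t land inside the requesting app’s own bounds:

```python
# Sketch of a runtime-side gaze gate: an app receives a gaze sample only when
# the gaze point falls inside that app's own bounds. Rect and the coordinates
# are invented for illustration; real runtimes would use 3D gaze rays.

from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Rect:
    x: float
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)

def deliver_gaze(app_bounds: Rect, gaze_x: float, gaze_y: float) -> Optional[Tuple[float, float]]:
    """Forward the gaze point only if it lands on the app; everything else stays invisible."""
    if app_bounds.contains(gaze_x, gaze_y):
        return (gaze_x, gaze_y)
    return None

app_window = Rect(x=100, y=100, width=400, height=300)
print(deliver_gaze(app_window, 250, 200))  # (250, 200) -> user is looking at the app
print(deliver_gaze(app_window, 900, 50))   # None -> gaze elsewhere is never shared
```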

Plan B: “They took the hard way…”

Failing these voluntary steps leads to almost certain regulation. Eventually, at least one company goes too far, gets caught, and everyone suffers.

Plan C: For Anything Else We Didn’t Think Of…

Plan C is the most speculative, but worth following to see where it leads. Some companies may not heed the call for self-policing. Legislation may be slow or get derailed for any number of reasons, including that the “good side” is really busy. However, the same ideas in this article that put us at risk could be used to protect us, proactively, via some future private enterprise.

