August 22, 2016
Sune Alstrup Johansen is cofounder and CEO of The Eye Tribe.
Sune Alstrup Johansen of The Eye Tribe discusses how eye tracking will soon become an essential interface for AR devices.
PwC: Sune, can you tell us a little bit about your company?
Sune Alstrup Johansen: The Eye Tribe is based in Copenhagen, Denmark. We develop eye tracking software that enables a person to control a device by using the eyes. It can predict the intention of the user by knowing where the user looks. It supports everything from eye-based authentication to user interface control to analytics based on visual attention. Our components can be integrated into next-generation devices such as smartphones, cars, and head-mounted displays like virtual reality [VR] headsets or glasses. That’s what we’re now commercializing.
PwC: What are the different components of the technology you’ve developed?
Sune Alstrup Johansen: The technology consists of software, an infrared camera, and infrared illumination. Using an infrared image of the eye, we can track very accurately where a person looks on a screen or in space. All the secret sauce is in the software. The hardware is basically standard components. It’s similar to a Kinect, but we focus on the eyes instead of the body.
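The basic idea of locating the eye in an infrared image can be sketched in a few lines. Under IR illumination the pupil appears as a dark blob, so thresholding dark pixels and taking their centroid gives a first estimate of the pupil center. This is a hypothetical simplification for illustration, not The Eye Tribe's actual algorithm; production systems also track corneal glints and fit ellipses for sub-pixel accuracy.

```python
import numpy as np

def pupil_center(ir_image, threshold=40):
    """Estimate the pupil center in a grayscale IR image.

    Under infrared illumination the pupil shows up as a dark blob,
    so we threshold dark pixels and return their centroid.
    Toy sketch only; real systems refine this considerably.
    """
    dark = ir_image < threshold      # boolean mask of dark pixels
    ys, xs = np.nonzero(dark)
    if len(xs) == 0:
        return None                  # no pupil candidate found
    return xs.mean(), ys.mean()      # centroid as (x, y)

# Synthetic 100x100 IR frame: bright background, dark pupil at (60, 40)
frame = np.full((100, 100), 200, dtype=np.uint8)
yy, xx = np.ogrid[:100, :100]
frame[(xx - 60) ** 2 + (yy - 40) ** 2 < 8 ** 2] = 10

cx, cy = pupil_center(frame)
print(round(cx), round(cy))   # → 60 40
```

Mapping that pupil position to a point on a screen then requires a per-user calibration step, which is where much of the software complexity lives.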
PwC: What do you think this technology will be good for?
Sune Alstrup Johansen: There are multiple use cases. One is authentication based on the eye, so users can log in to devices by using only their eyes. We also enable user interface interaction, controlling the interface with the eyes instead of using touch, hands, or voice. There are also more passive features, such as using eye tracking for diagnostics. That could be behavioral diagnostics, like tracking where users look and using that information for advertising purposes. Or it could be medical diagnostics, since a lot of diseases can be detected in eye movements. Many companies are working on algorithms for detecting specific diseases. All these vendors need an eye tracking system to enable their algorithms.
PwC: What are some other strong use cases for this technology?
Sune Alstrup Johansen: Eye tracking for authentication is completely effortless. The user just needs to be present and look at a screen, and then the technology can authenticate and log in the user. When eye tracking is used for controlling a user interface, the interface will know what a user wants to do before the user even thinks about it. If someone is reading an e-book, for instance, the book will know when they want to turn the page; the person doesn’t need to think about turning the page. If someone wants to click a link on the web, they just look at it.
“Eye tracking for authentication is completely effortless. The user just needs to be present and look at a screen, and then the technology can authenticate and log in the user.”
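The "just look at it to click" interaction described above is commonly implemented with dwell-time selection: a target fires once the gaze has rested on it long enough. The sketch below is an illustrative assumption of how such logic might look, not The Eye Tribe's SDK; the 500 ms threshold is a placeholder value.

```python
class DwellSelector:
    """Fire a target once the gaze has rested on it for dwell_ms milliseconds."""

    def __init__(self, dwell_ms=500):
        self.dwell_ms = dwell_ms
        self._target = None   # target currently under the gaze
        self._since = None    # timestamp when the gaze entered it

    def update(self, target, t_ms):
        """Feed the target under the gaze (or None) at time t_ms.

        Returns the target when the dwell threshold is crossed,
        otherwise None.
        """
        if target != self._target:
            self._target, self._since = target, t_ms
            return None
        if target is not None and t_ms - self._since >= self.dwell_ms:
            self._since = t_ms   # reset so the target doesn't re-fire instantly
            return target
        return None

sel = DwellSelector(dwell_ms=500)
print(sel.update("link", 0))     # → None (gaze just arrived)
print(sel.update("link", 300))   # → None (still dwelling)
print(sel.update("link", 600))   # → link (threshold crossed, click fires)
```

Tuning the dwell threshold is the classic trade-off in gaze interfaces: too short and users trigger actions just by reading, too long and the interaction feels sluggish.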
PwC: You’re using infrared illumination. Is that the only way to track an eye, or are there other methods?
Sune Alstrup Johansen: Some people attempt to do eye tracking without infrared. That approach is based on visible light, and it produces very inaccurate eye tracking, especially if the subject moves. Infrared light is required for professional eye tracking systems. Some companies are performing eye tracking by using normal RGB cameras, and they can get a very crude estimation of whether a person looks left or right or up and down, for instance, but not much more than that. But that’s not the purpose of our company.
PwC: What are some of the limitations of this system today?
Sune Alstrup Johansen: The main limitation right now is that the sensors available on the market are sensitive to sunlight. If you have a camera sensor and the sun shines into it, it will see just a blank image. So for eye tracking, we cannot use the camera sensor if the sun is shining into it. That’s not directly a limitation of the eye tracking algorithm, but it’s a limitation of the sensors we use for eye tracking.
PwC: In the next three to five years, what is on your roadmap?
Sune Alstrup Johansen: In the next three to five years, eye tracking will become part of many everyday devices. Adoption in the mass market will occur like it did with touch screens. When Apple launched the iPhone in 2007, there was massive adoption of the touch screen in all kinds of devices. And I think eye tracking will be more or less the same. After the first successful implementation of eye tracking, it will spread from there.
PwC: What is the current point on the maturity curve for eye tracking?
Sune Alstrup Johansen: I think it’s still very early. The first mass-market adoption hasn’t happened yet. Eye tracking has been a niche technology until now, much like what has happened with other technologies, including the touch screen and the fingerprint reader. Both went from niche to mass adoption. That is happening with eye tracking, too, and I think it will be adopted by the big vendors. Eye tracking will improve technologically as time passes.
“I think the first devices to integrate eye tracking will likely be VR headsets.”
PwC: What device do you expect will be the one that drives adoption?
Sune Alstrup Johansen: I think the first devices to integrate eye tracking will likely be VR headsets. They don’t have it now, but the second revision of VR hardware will definitely have eye tracking. After that I expect adoption into smartphones and augmented reality [AR]. In the future, every professional AR headset will include eye tracking. Most vendors are now looking for eye tracking solutions to put into their headsets.
PwC: Do you think eye tracking technology will be good enough to displace the richness of all the interactions people do with the touch screen now—slide, tap, zoom, pinch, and other actions?
Sune Alstrup Johansen: The accuracy of our system is 0.2 degrees. You can compare it to touch. People had trouble hitting buttons on the first smartphones, but in later revisions developers adapted the keyboard to compensate for the size of users’ fingers, for example with a magnifying glass on the keyboard. All these things adapted the user experience to the new input modality, and the same thing will happen with user interfaces that take advantage of eye tracking.
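The quoted 0.2-degree accuracy can be translated into an on-screen error radius with simple trigonometry: at viewing distance d, the error is roughly d · tan(0.2°). A quick check, assuming a typical desktop viewing distance of about 60 cm (the distance is my assumption, not a figure from the interview):

```python
import math

def gaze_error_mm(accuracy_deg, distance_mm):
    """On-screen error radius for a given angular accuracy and viewing distance."""
    return distance_mm * math.tan(math.radians(accuracy_deg))

# 0.2 degrees at an assumed ~60 cm desktop viewing distance
err = gaze_error_mm(0.2, 600)
print(round(err, 1))   # → 2.1
```

So 0.2 degrees corresponds to roughly 2 mm on a desktop display, which is why gaze-driven interfaces, like early touch keyboards, size and snap their targets to suit the input modality rather than demanding pixel-perfect pointing.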