To control your phone or computer, you already know the touch screen, the mouse and the keyboard. A new input method could soon become widespread: controlling devices with your gaze.
This is what English speakers call "eye tracking": technology that detects precisely where you are looking on a screen. Several computers have already adopted it, and Honor's latest phone, the Magic 6, includes it as standard.
Many phones today already use the front camera to keep the screen from going to sleep while you are looking at it. Eye tracking goes further: the screen can scroll automatically as soon as your eyes reach the bottom of the page, and you can even trigger a function just by looking at it, which can be practical when your hands are full.
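To picture how gaze-driven scrolling might work, here is a minimal sketch in Python. It assumes a hypothetical eye-tracking API exposing a get_gaze_point() function and a scroll_down() command; the names and thresholds are illustrative, not a real phone SDK.

```python
# Minimal sketch of gaze-driven auto-scroll (hypothetical gaze API).
# get_gaze_point() and scroll_down() stand in for whatever the device's
# eye-tracking SDK actually exposes; names and thresholds are assumptions.

import time

SCROLL_ZONE = 0.85   # gaze below 85% of screen height counts as "reading the bottom"
SCROLL_STEP = 200    # pixels to scroll per tick

def auto_scroll_loop(get_gaze_point, scroll_down, screen_height):
    """Scroll the page whenever the user's gaze lingers near the bottom."""
    while True:
        x, y = get_gaze_point()          # current gaze coordinates in pixels
        if y > SCROLL_ZONE * screen_height:
            scroll_down(SCROLL_STEP)     # bring the next lines into view
        time.sleep(0.1)                  # poll roughly ten times per second
```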
No need to wink to click: as you can imagine, anyone with a tic would trigger a whole lot of misinterpretations. Instead, the device simply waits for you to stare at a notification, for example; it then flashes it before displaying it. It can also work alongside the keyboard and mouse. If you are asked "Do you want to save?", just look at the "Yes" button and it lights up; all that is left is to press Enter. No more lifting your hand from the keyboard to reach for the mouse and click. It also works in games, the 3D ones that are quite difficult to control: you can steer the camera that follows your character directly with your eyes.
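The "stare rather than wink" idea is usually called dwell-based activation: a target only fires after the gaze has rested on it continuously for a set time, so blinks and tics are ignored. Here is a minimal sketch under that assumption; get_gaze_point() and the targets' contains() method are hypothetical stand-ins, not an actual eye-tracking API.

```python
# Minimal sketch of dwell-based activation: a target triggers only after the
# gaze has stayed on it continuously for DWELL_TIME seconds, so brief blinks
# or tics do not fire anything. Gaze source and target objects are assumptions.

import time

DWELL_TIME = 0.8  # seconds of sustained gaze required to activate

def dwell_select(get_gaze_point, targets):
    """Return the first target the user stares at for DWELL_TIME seconds."""
    current, since = None, None
    while True:
        gaze = get_gaze_point()
        hit = next((t for t in targets if t.contains(gaze)), None)
        if hit is not current:           # gaze moved to a new target (or away)
            current, since = hit, time.time()
        elif hit is not None and time.time() - since >= DWELL_TIME:
            return hit                   # sustained gaze: activate this target
        time.sleep(0.05)                 # poll about twenty times per second
```

The same pattern covers the "look at Yes, then press Enter" case: the gaze only highlights the button, and the key press provides the explicit confirmation.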
Features designed for disability that are becoming mainstream
There are similarities with features that already exist for people with disabilities, to help paralyzed people communicate, for example: today, they can use their eyes like a mouse to type messages on an on-screen keyboard.
In fact, many features initially designed for a disability turn out to be useful to everyone. Dark mode on phones, for example, was originally meant to protect photosensitive people. Another example: videos on social networks are now systematically subtitled so they can be understood even without sound. So expect to see more and more devices controlled with nothing but our eyes.