How Umoove's Eye-Tracking Tech Will Make Apps That Can 'Flirt'

How Eye-Tracking Technology Can Ensure You Never Look Away
HOLLYWOOD, CA - FEBRUARY 24: Singer Adele Adkins arrives at the Oscars at Hollywood & Highland Center on February 24, 2013 in Hollywood, California. (Photo by Jason Merritt/Getty Images)

Over the past several decades, computers have evolved to become ever more sensitive to our cues. Room-sized machines controlled by punch cards were gradually replaced by desktop computers hooked up to mice, which have recently given way to palm-sized touchscreens we control with our fingers.

Next, predicts Israeli entrepreneur Moti Krispil, we’ll direct computers with our eyes: Ads on smartphones will come alive when we glance in their direction, ebooks will flip pages themselves when we've finished a section and videos will pause instantly when we look away.

Krispil is the chief executive of Umoove, an Israeli startup developing head- and eye-tracking technology for mobile devices that, according to Krispil, can be used to create more “human” devices that respond to our subtle movements and natural cues.

Next month, Umoove will make its tracking technology available to anyone from app developers to smartphone and tablet makers, and Krispil predicts that applications integrating Umoove’s tracking tools will be available within three to four months.

The Huffington Post spoke with Krispil about how his technology can spot movement in the pupils of the eyes; how it could be used to create content that seizes -- and holds -- users' attention; and what it means for the future of our interactions with machines.

What would we be able to do with head- and eye-tracking technology in our phones?

Imagine you go to your architect and you bring out an iPad to look at a diagram. By the angle of your head, the diagram will automatically tilt -- the content will be automatically aligned with your face. If you look closer, it automatically zooms in.

Now say you’re rushing into the subway and you want to read the latest news, but you only have one free hand, and you can’t scroll with your thumb. So you simply move your head up and down -- you’re reading like a human teleprompter. When you look at an image long enough, it plays a video. When someone asks you if this is the right station and you raise your head, the video automatically pauses -- it waits for you to look back again, and then it plays. You can also have intelligent content that reacts to what you do with your eyes even if you don’t move your head.
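That pause-and-resume behavior boils down to checking whether a face is visible in the front camera’s feed. Here is a minimal sketch of the idea in Python, using OpenCV’s stock Haar face detector as a stand-in for Umoove’s proprietary engine; the pause_video and resume_video calls are hypothetical placeholders for a real player.

```python
# Sketch: pause playback when no face is visible, resume when the viewer looks back.
# OpenCV's bundled Haar cascade stands in for Umoove's (unpublished) tracking engine.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def pause_video():
    print("video paused")    # placeholder for a real player call

def resume_video():
    print("video resumed")   # placeholder for a real player call

cap = cv2.VideoCapture(0)    # front-facing camera
playing = True
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
    if len(faces) == 0 and playing:       # viewer looked away
        pause_video()
        playing = False
    elif len(faces) > 0 and not playing:  # viewer looked back
        resume_video()
        playing = True
```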

Which of our movements can your technology recognize?

If you move your head -- up, left, right, down or tilted -- then I can spot this change. If you are changing the distance between your head and the screen, I can spot it. If you move your pupil or iris, I can spot this change. I can spot a one-pixel change in your eyes.

How does the head- and eye-tracking technology work?

All we need in order to track you is the raw video frames coming from the camera. We have technology that learns each and every frame in real time and compares the differences; by comparing those differences in the way we do it, we identify specific elements in your face and eyes.
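Umoove has not published its algorithm, but the general idea Krispil describes -- comparing successive camera frames to find what changed -- can be illustrated with simple frame differencing:

```python
# Rough illustration of frame-to-frame differencing. Umoove's actual algorithm
# is proprietary and certainly more involved; this only shows the basic idea
# of comparing consecutive camera frames to detect change.
import cv2

cap = cv2.VideoCapture(0)
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Compare the new frame against the previous one.
    diff = cv2.absdiff(gray, prev_gray)
    # Keep only pixels that changed noticeably between frames.
    _, motion_mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    changed_pixels = cv2.countNonZero(motion_mask)
    if changed_pixels > 0:
        print(f"{changed_pixels} pixels changed since the last frame")
    prev_gray = gray
```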

On top of the main engine that knows how to translate your facial characteristics into a motion or movement, we have other smart engines that are helping us avoid typical pitfalls of tracking, like dynamic lighting conditions. [With our technology], you can turn on the light, turn off the light, go from a well-lit room to a dark room, while constantly being tracked.
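One common way tracking pipelines cope with changing lighting is to normalize each frame before comparing it to the last; histogram equalization is shown here purely as an illustration of that kind of pre-processing, not as Umoove’s actual method.

```python
# Illustrative only: normalize frame contrast so a dim room and a bright room
# produce comparable images before frames are compared.
import cv2

def normalize_lighting(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Spread pixel intensities over the full range to reduce the effect of
    # lights being switched on or off between frames.
    return cv2.equalizeHist(gray)
```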

Why use this? What’s the appeal of incorporating this tracking technology into apps?

We are using your natural head and eye movement to control things. People will feel that they become the interface because we do not require them to pinch or to twist. You were not born pinching or twisting. You were born moving your head and eyes, looking closer at things, turning your head. We want to transform those into meaningful actions that convey intention. I call it “back to people.”

And we can provide you with insights by watching where you’re looking, what you do with your head and what your focus of attention is. Then we can apply that in very creative ways, such as increasing your attention span, and it can be monetized in many ways.

How could you use the technology to increase people’s attention span?

If you see a picture of Adele somewhere in an article, when you’re looking at her for more than two seconds, an Amazon ad could pop up suggesting you buy her latest record now. Or let’s say you’re looking at a book for kids. When my daughter is looking at Pooh the bear, why shouldn’t Pooh the bear smile and say, “Come with me, let me show you where I’ll go. The rabbit’s tree is there.”

[The content] looks like it flirts with me, it “understands” my attention. If you put all of that [technology] into an image or article or book, you can increase the attention span because people tend to follow what interacts with them. I’m using your natural head and eye movements to induce you to do something, to explore something.

Misusing this will not be good. But using it wisely creates a new breed of content that’s much more compelling to explore.
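The Adele example amounts to a dwell-time threshold: trigger an action once the gaze has stayed on a target region for more than two seconds. A hypothetical sketch, assuming some source of gaze coordinates and an on_dwell callback -- neither is part of any published Umoove API:

```python
# Hypothetical dwell-time trigger: fire a callback once the gaze point has
# stayed inside a target region for more than two seconds.
import time

DWELL_SECONDS = 2.0

def gaze_dwell_trigger(gaze_points, region, on_dwell):
    """gaze_points yields (x, y) tuples; region is (x, y, width, height)."""
    rx, ry, rw, rh = region
    entered_at = None
    for x, y in gaze_points:
        inside = rx <= x <= rx + rw and ry <= y <= ry + rh
        if inside:
            if entered_at is None:
                entered_at = time.monotonic()
            elif time.monotonic() - entered_at >= DWELL_SECONDS:
                on_dwell()          # e.g. pop up the record ad
                entered_at = None   # reset so it does not fire repeatedly
        else:
            entered_at = None
```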

What sort of data could you collect about people by tracking their head and eye movements?

If we put aside the privacy issues, dry facts-wise, you can gather what are probably the best analytics in the world. You can get analytics on where people are looking, which articles are being read more, things like that, and convert that into optimized placement, ranking of articles and customized local preferences -- so maybe you get certain types of content more often than other kinds.

How do you predict this technology will change our interactions with machines?

I believe it will make devices more human, more intention-based. It will make devices look more natural and allow them to respond without mediation. Our devices will understand many things that up until today you had to tell the system explicitly -- like if I can see your face, I will not dim the screen, but if you look away, I will dim the ebook to save power.

If sometimes people feel that what’s in front of them is more human, then I’ve achieved my goal.

What’s your goal for this technology?

The most compelling thing for us as humans is to expect that we will not notice that there is technology behind something. I always used to joke that it’s like the photographer at a wedding. If he’s doing a great job, you don’t notice him, but at the end you have the memories. Eventually you need to forget you have some technology in front of you.

This interview has been edited and condensed for clarity.
