Gaze Estimation using an iPhone to improve user interaction

As phone hardware advances, the focus is shifting more and more towards understanding user behaviour and improving how users interact with a product.
The latest iPhones with depth-sensing capabilities, together with the introduction of the ARKit and Vision frameworks to iOS, make gaze estimation possible.
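As a rough sketch of how ARKit exposes this, a face-tracking session on a TrueDepth-equipped iPhone delivers `ARFaceAnchor` updates that include an estimated gaze direction (`lookAtPoint`) and per-eye transforms. The class name and logging below are illustrative, not from the app described here:

```swift
import ARKit

// Minimal face-tracking session that reads ARKit's gaze estimate.
// Runs only on devices with a TrueDepth front camera.
final class GazeTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint is expressed in face-anchor space; combine it
            // with face.transform to obtain a world-space gaze target.
            let gaze = face.lookAtPoint
            print("gaze (face space): \(gaze)")
        }
    }
}
```

Mapping that gaze vector onto screen coordinates still requires projecting through the camera and calibrating per user, which is where most of the practical work lies.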

We can now detect and estimate the gaze of a user through the front camera, and thus understand how the user is interacting with the phone.
This can help in a variety of domains, be it a shopping app, a recipe app, ebook reading, or apps assisting people with disabilities. Being able to detect gaze and track eyes lets users trigger interactions with a simple double blink, for example.
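For the double-blink interaction, ARKit's blend-shape coefficients (`eyeBlinkLeft`, `eyeBlinkRight`) report how closed each eye is on a 0–1 scale. A sketch of a detector, with illustrative (untuned) threshold and timing values:

```swift
import ARKit

// Fires a callback when two deliberate blinks (both eyes) occur
// within a short window. Threshold 0.8 and the 0.5 s window are
// assumed values for illustration, not tuned constants.
final class BlinkDetector {
    private var lastBlink: TimeInterval?
    var onDoubleBlink: (() -> Void)?

    func process(_ face: ARFaceAnchor, at time: TimeInterval) {
        let left = face.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
        let right = face.blendShapes[.eyeBlinkRight]?.floatValue ?? 0
        guard left > 0.8, right > 0.8 else { return } // both eyes closed
        if let last = lastBlink, time - last < 0.5 {
            onDoubleBlink?()   // second blink inside the window
            lastBlink = nil
        } else {
            lastBlink = time   // remember the first blink
        }
    }
}
```

In practice you would also debounce while the eyes stay closed, so one long blink is not counted twice.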

To elaborate on the examples above: you can browse products in a shopping app, navigate back and forth within the app, and even make a purchase, using only your gaze.
You can read a recipe or an ebook without needing to scroll through the page or hit next; the page scrolls automatically based on gaze detection and the identified reading speed.

We had the chance to work on such an application: an auto book reader app. Its purpose is to give users a smooth reading experience and to gradually improve their reading speed.

The app identifies the user's gaze, tracks the word being read, and calculates the pace of reading. A normal reading pace is between 100 and 200 words per minute.
The app sets the scroll pace automatically from the user's measured pace. It encourages the user to read faster by gradually increasing the scroll speed and verifying that the user is keeping up.
With regular use, this helps the user hold a constant pace and eventually improve their reading speed toward comprehension or speed-reading levels.
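The pace-to-scroll mapping can be sketched as pure arithmetic: convert words per minute into lines per second for the current layout, then into points per second, with a small factor to nudge the reader slightly faster. All constants here (words per line, line height, speed-up factor) are hypothetical placeholders, not values from the app:

```swift
import Foundation

// Maps an estimated reading pace (words per minute) to a scroll
// speed in points per second. Line metrics are assumed averages
// for a given text layout.
struct ScrollPacer {
    var wordsPerLine: Double = 10     // assumed average words per line
    var lineHeight: Double = 22       // line height in points
    var speedupFactor: Double = 1.05  // gently push the reader faster

    func pointsPerSecond(forPace wordsPerMinute: Double) -> Double {
        let linesPerSecond = (wordsPerMinute / 60) / wordsPerLine
        return linesPerSecond * lineHeight * speedupFactor
    }
}

let pacer = ScrollPacer()
// At 150 wpm with these placeholder metrics: 0.25 lines/s scaled by
// line height and the speed-up factor.
print(pacer.pointsPerSecond(forPace: 150))
```

Feeding this value into a display-link-driven scroll offset update would give the smooth, continuously adjusting scroll the app describes.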