Expo Talk Panel
Accelerating Eye Movement Research Via Smartphone Gaze
Jaqui C Herman · Vidhya Navalpakkam

Sun Dec 06 08:00 PM -- 09:00 PM (PST)

Eye movements have been widely studied in vision, language, and usability research, yet progress has been limited because eye trackers are expensive (>$10K) and do not scale. In this talk, we'll present findings from our recent paper in Nature Communications, which shows that a smartphone's selfie camera, combined with machine learning, can achieve gaze accuracy comparable to state-of-the-art eye trackers that are 100x more expensive. We demonstrate that this smartphone technology can replicate key findings from prior eye movement research in neuroscience and psychology that previously required bulky, expensive desktop eye trackers in highly controlled settings (e.g., with a chin rest).

These findings offer the potential for orders-of-magnitude scaling of basic eye movement research in neuroscience and psychology (with explicit user consent) and unlock new applications in accessibility, usability, and screening for health conditions.
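The abstract doesn't detail the model, but the core idea, regressing on-screen gaze coordinates from selfie-camera frames with a neural network, can be sketched as follows. This is a minimal, hypothetical illustration in PyTorch; the GazeNet class, its layer sizes, and the 128x128 eye-crop input are assumptions for exposition, not the published architecture.

    # Hypothetical sketch (assumed PyTorch; not the published model): a small CNN
    # regressing on-screen gaze coordinates (x, y) from an eye-region crop.
    import torch
    import torch.nn as nn

    class GazeNet(nn.Module):
        def __init__(self):
            super().__init__()
            # Convolutional feature extractor over a 128x128 RGB eye-region crop.
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
                nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
                nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),  # global average pool to a 64-d vector
            )
            self.head = nn.Linear(64, 2)  # regression head: (x, y) on screen

        def forward(self, eye_crop):
            z = self.features(eye_crop).flatten(1)
            return self.head(z)

    model = GazeNet()
    crops = torch.randn(8, 3, 128, 128)  # a batch of dummy eye crops
    pred_xy = model(crops)               # predicted gaze points, shape (8, 2)
    loss = nn.functional.mse_loss(pred_xy, torch.rand(8, 2))  # squared gaze error

The paper additionally reports per-user personalization from a brief calibration task, which the sketch above omits.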

Author Information

Jaqui C Herman (Google AI)
Vidhya Navalpakkam (Google Research)

I am currently a Principal Scientist at Google Research. I lead an interdisciplinary team at the intersection of machine learning, neuroscience, cognitive psychology, and vision. My interests are in modeling user attention and behavior across multimodal interfaces, for improved usability and accessibility of Google products. I am also interested in applications of attention for healthcare (e.g., smartphone-based screening for health conditions). Before joining Google in 2012, I was at Yahoo Research. Prior to joining industry in 2010, I worked on modeling attention mechanisms in the brain during my postdoc at Caltech (working with Drs. Christof Koch, Pietro Perona, and Antonio Rangel) and my PhD at USC (working with Dr. Laurent Itti). I have a Bachelor's in Computer Science from the Indian Institute of Technology, Kharagpur.
