Eye-tracking in 360: Methods, Challenges, and Opportunities

Abstract

As eye-trackers are being built into commodity head-mounted displays, applications such as gaze-based interaction are poised to enter the mainstream. Gaze is a natural indicator of what the user is interested in. Eye-tracking in virtual environments offers the opportunity to study human behavior in simulated settings, both to create realistic virtual avatars and to learn models of saliency that apply to a three-dimensional scene. Research findings, such as the consistency in where people look in images and videos, and biases in two-dimensional eye-tracking (e.g. center bias), will need to be replicated and/or rediscovered in VR (e.g. equator bias as a generalization of center bias). These are only a few examples of the rich lines of inquiry waiting to be explored by VR researchers and practitioners who have a working knowledge of eye-tracking. In this tutorial, we will cover three topic areas:

  • The human visual system, the eyes, and models/parameters that are relevant to virtual reality
  • Eye-tracking technology and methods for collecting reliable data from human participants
  • Methods to generate heat maps from eye-tracking data (a minimal sketch follows this list)
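
As a preview of the third topic, here is a minimal sketch of one standard way to build a fixation heat map: sum an isotropic Gaussian at each fixation and normalize. The sketch is in Python (the tutorial's own materials use pseudocode and MATLAB), and the function name, pixel-based parameters, and duration weighting are illustrative assumptions. For equirectangular 360 frames, the kernel should really be defined on the sphere, since a planar Gaussian over-weights pixels near the poles.

  import numpy as np

  def fixation_heat_map(fixations, width, height, sigma_px=30.0):
      """Sum an isotropic Gaussian at each fixation (illustrative helper).

      fixations: iterable of (x, y) or (x, y, duration) in pixel coordinates.
      sigma_px:  Gaussian spread, often chosen to cover ~1 degree of visual angle.
      """
      ys, xs = np.mgrid[0:height, 0:width]
      heat = np.zeros((height, width), dtype=np.float64)
      for fix in fixations:
          x, y = fix[0], fix[1]
          w = fix[2] if len(fix) > 2 else 1.0  # optionally weight by fixation duration
          heat += w * np.exp(-((xs - x) ** 2 + (ys - y) ** 2) / (2.0 * sigma_px ** 2))
      if heat.max() > 0:
          heat /= heat.max()  # normalize to [0, 1] for display as a heat map
      return heat

  # Example: three fixations on a 1024x512 equirectangular frame.
  heat = fixation_heat_map([(512, 256, 0.4), (300, 250), (700, 260)], 1024, 512)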

Link to IEEEVR2019 Tutorial Page

Organizers

Olivier LE MEUR
IRISA / University of Rennes 1
olemeur@irisa.fr
http://people.irisa.fr/Olivier.Le_Meur/

Eakta JAIN
University of Florida
ejain@cise.ufl.edu
http://jainlab.cise.ufl.edu

Tutorial details:

Eye-trackers are now being built into commodity VR headsets (e.g. the FOVE headset, Tobii eye-trackers built into HTC Vive headsets). As a result, researchers and practitioners of VR must quickly develop a working understanding of eye-tracking. The audience members for this tutorial can expect to leave with the following:

  • A basic understanding of the eye and the human visual system, with a focus on the parameters that are relevant to eye-tracking in VR
  • An understanding of methods for collecting eye-tracking data, including sample protocols and pitfalls to avoid
  • A discussion of methods to generate saliency maps from eye-tracking data, including pseudocode and MATLAB implementations (a fixation-detection sketch follows this list)
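
Bridging the second and third items: raw gaze samples are typically classified into fixations before a saliency map is built. Below is a minimal Python sketch of one common method, dispersion-threshold identification (I-DT) in the style of Salvucci and Goldberg (2000). The names and thresholds are illustrative assumptions, not the tutorial's reference code, and a real 360 implementation would additionally handle the yaw wrap-around at ±180°.

  import numpy as np

  def idt_fixations(gaze, t, max_dispersion_deg=1.0, min_duration_s=0.1):
      """Dispersion-threshold (I-DT) fixation detection; illustrative sketch.

      gaze: (N, 2) array of gaze directions in degrees (e.g. yaw, pitch).
      t:    (N,) array of timestamps in seconds.
      Returns a list of (start, end) sample-index pairs, one per fixation.
      """
      def dispersion(a, b):
          # Horizontal plus vertical extent of the window gaze[a..b].
          return np.ptp(gaze[a:b + 1, 0]) + np.ptp(gaze[a:b + 1, 1])

      fixations, i, n = [], 0, len(gaze)
      while i < n:
          j = i
          while j < n and t[j] - t[i] < min_duration_s:
              j += 1  # grow the window to span the minimum duration
          if j >= n:
              break
          if dispersion(i, j) <= max_dispersion_deg:
              while j + 1 < n and dispersion(i, j + 1) <= max_dispersion_deg:
                  j += 1  # extend while the samples stay tightly clustered
              fixations.append((i, j))
              i = j + 1
          else:
              i += 1  # no fixation starts here; slide forward one sample
      return fixations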

The tutorial will be of interest to students, faculty, and researchers interested in quantifying user priorities and preferences with eye-tracking data, developing gaze-based interaction techniques, and applying eye-tracking data to the generation of virtual avatars. The prerequisites are kept to a minimum: anyone with an elementary background in computer science, image processing, or computer graphics can follow this tutorial.

Brief biographies of the presenters

Olivier Le Meur obtained his PhD degree from the University of Nantes in 2005. From 1999 to 2009, he worked in the media and broadcasting industry. In 2003 he joined the research center of Thomson-Technicolor in Rennes, where he supervised a research project on the modelling of human visual attention. Since 2009 he has been an associate professor of image processing at the University of Rennes 1. In the IRISA/SIROCCO team, his research focuses on understanding human visual attention, including computational modelling of visual attention and saliency-based applications (video compression, objective assessment of video quality, retargeting).

Eakta Jain is an Assistant Professor of Computer and Information Science and Engineering at the University of Florida. She received her PhD and MS degrees in Robotics from Carnegie Mellon University, and her B.Tech. degree in Electrical Engineering from IIT Kanpur. She has worked in industrial research at Texas Instruments R&D labs, Disney Research Pittsburgh, and Walt Disney Animation Studios. Her research interests are in eye tracking, visual perception, computer graphics, and virtual reality.

Slides and additional materials

Slides (Part 1)

Slides (Part 2)

Cite as follows:

@misc{jain_lemeur_VR_2019,
author = {Eakta Jain and Olivier Le Meur},
title = {Eye tracking in 360: Methods, Challenges and Opportunities},
year = {2019},
note = {Presented at IEEE VR 2019, Osaka, Japan}}