1 of 69

Eye Tracking and Gaze-based Interaction: Introduction and Current Research Trends

Human-Computer Interaction 2 - 10.01.2018

WS 2017/2018

1

Picture source: http://www.black.ninja

Eye Tracking and Gaze-based Interaction - Heiko Drewes, Jakob Karolus, Mohamed Khamis

2 of 69

New Interaction Methods are Desirable

The amount of interaction with computer devices increases.

Consequently, we look for interaction methods that:

  • are quicker
  • do not require training
  • demand little physical and mental effort

Eye tracking is a promising technology as the eyes are quick and we use them with ease.


3 of 69

Things We Can Do with Eye Tracking

Text 2.0

A project from DFKI (German Research Center for Artificial Intelligence)


4 of 69

A Smart Computer Needs Awareness for Gaze


Eye gaze is very important for human-human interaction

... and human-computer interaction should be like human-human interaction.

Taken from Milekic: The More You Look the More You Get: Intention-based Interface using Gaze-tracking


5 of 69

What the Eyes Can Tell

  • Eyes indicate visual interest
  • Gaze precedes actions
  • Eyes reflect cognitive processes
  • Eyes reflect physical and mental condition


In HCI Gaze Can Be

  • A source of information about the user
  • An input method

Eye trackers keep advancing and getting cheaper!


6 of 69

Expectations for Eye Gaze Interaction

Hopes

  • Ease of use
  • Speed up interaction
  • Maintenance free
  • Hygienic interface
  • Remote control
  • Safer interaction
  • Smarter interaction

Fears

  • Ability to control the eyes
  • Conflict of input and vision
  • Fatigue of eye muscles

As eye movements are part of our social protocol, we are able to control our eyes.

Normally we look at the interaction element for input anyway.

Our eyes move constantly, even while we sleep.


7 of 69

Eye Tracking Applications


8 of 69

Eye Tracking Applications


Gaze Monitoring

Implicit gaze-based Interaction

Gaze-supported (Multimodal) Interaction

Explicit gaze-based Interaction

Päivi Majaranta and Andreas Bulling, “Eye Tracking and Eye-Based Human–Computer Interaction,” in Advances in Physiological Computing, Human–Computer Interaction Series (Springer, London, 2014), 39–65, https://doi.org/10.1007/978-1-4471-6392-3_3.


9 of 69

Gaze Monitoring

  • Collect data about the user:
    • Cognitive load
    • User’s attention
    • Visual preference


10 of 69

Gaze Monitoring

Studying visual attention


Nicholas S. et al. (CHI 2015)


11 of 69

Gaze Monitoring

Support in collaborative environments


Akkil et al. (CHI 2016)


12 of 69

Gaze Monitoring

Optimizing Interfaces


13 of 69

Implicit gaze-based Interaction

Detecting intention/need for assistance


Karolus et al. (CHI 2017)

Walber et al. (CHI 2014)


14 of 69

Implicit gaze-based Interaction

Gaze-assisted Photo/Video editing


Santella et al. (CHI 2006)

Jain et al. (ACM Trans. Graph. 34, 2015)


15 of 69

Gaze-supported (Multimodal) Interaction

  • Combine gaze with other input modalities, such as:
    • Touch
    • Mid-air gestures
    • Pen input, phone input, etc.


Chatterjee et al. (ICMI 2015)

Khamis et al. (CHI 2016)

Kumar et al. (CHI 2007)


16 of 69

Gaze-supported (Multimodal) Interaction


Pfeuffer et al. (UIST 2014)


17 of 69

Gaze-supported (Multimodal) Interaction


Stellmach et al. (CHI 2013)


18 of 69

Explicit gaze-based interaction

  • Using eyes for input/control
    • Moving the mouse
    • Selection
    • Scrolling
    • Eye typing
    • Authentication


Gaze-based authentication: EyePassShapes - De Luca et al. (SOUPS 2009)

Microsoft Patent 2014

Source: http://www.winbeta.org/news/microsoft-patents-eye-tracking-keyboard-could-use-technology-future-devices

Eye-based interaction with displays - Zhang et al. (UbiComp 2013)

Gaze input for the disabled


19 of 69

Explicit gaze-based interaction


Esteves et al. (UIST 2015)


20 of 69

Explicit gaze-based interaction

Spontaneous Interaction with Displays


Zhang et al. (CHI 2013)


21 of 69

Explicit gaze-based interaction

Manual And Gaze Input Cascaded (MAGIC) pointing

  1. The user gazes at the target

  2. The cursor is warped to the eye-tracking position

  3. The user completes the selection manually; the remaining distance to the target is significantly shorter, despite the tracker’s accuracy limitations
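The warp step can be sketched in a few lines. This is a minimal illustration with a made-up threshold and function name, not the implementation from Zhai et al.:

```python
# Minimal sketch of "liberal" MAGIC pointing. The warp threshold is an
# illustrative assumption: only warp when the gaze point is far from the
# cursor, so small eye movements near the cursor do not cause jitter.

def magic_warp(cursor, gaze, warp_threshold=120.0):
    """Return the new cursor position for one gaze sample.

    cursor, gaze: (x, y) positions in pixels.
    """
    dx = gaze[0] - cursor[0]
    dy = gaze[1] - cursor[1]
    if (dx * dx + dy * dy) ** 0.5 > warp_threshold:
        return gaze    # warp: the user covers the remaining distance manually
    return cursor      # stay put: fine positioning is left to the hand
```
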


Zhai et al. (CHI 1999)


22 of 69

How Does Eye Tracking Work?


23 of 69

Eye Anatomy


24 of 69

The Human Eye - Movement Control


25 of 69

The Human Eye - Vision


The eye works similarly to a camera.

However, there are some differences:

  • To adjust focus, the lens changes its shape (not its position)
  • The sensor surface is curved (not planar)
  • The sensor resolution is higher in the central field of vision (not uniform)

From Duchowski, A. T.: Eye Tracking Methodology: Theory and Practice


26 of 69

Types of Eye Movement


27 of 69

The Human Eye - “Movement Types”

  • Fixations
    • (mostly) stable eye position
  • Saccades, Microsaccades
    • fast "ballistic" eye movements
  • (Smooth) Pursuit
    • following a moving object with your eyes
    • cannot be performed voluntarily without a moving target
  • Vestibulo-ocular Reflex
    • compensates for head movements
  • Optokinetic Nystagmus
    • combination of pursuit and saccades
  • Vergence Movements
    • cooperation of both eyes to focus on a single object
  • Pupil Dilation
    • diameter changes due to lighting conditions or increased cognitive load


28 of 69

Fixations

  • A pause, usually between 200 and 600 ms
  • Humans fixate to sharply focus on a narrow area, long enough for the brain to perceive it
  • Visualization as heat map
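Fixations are typically extracted from raw gaze samples with a dispersion-based filter (I-DT style). A minimal Python sketch; the dispersion and duration thresholds are illustrative:

```python
def detect_fixations(samples, max_dispersion=30.0, min_duration=0.2):
    """Dispersion-based (I-DT style) fixation detection sketch.

    samples: list of (t, x, y) gaze samples, timestamps in seconds.
    Returns a list of (t_start, t_end, cx, cy) fixations.
    """
    fixations = []
    i = 0
    while i < len(samples):
        j = i
        # grow the window while all points stay within the dispersion limit
        while j + 1 < len(samples):
            window = samples[i:j + 2]
            xs = [p[1] for p in window]
            ys = [p[2] for p in window]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        if samples[j][0] - samples[i][0] >= min_duration:
            # long enough: report the window's centroid as a fixation
            window = samples[i:j + 1]
            cx = sum(p[1] for p in window) / len(window)
            cy = sum(p[2] for p in window) / len(window)
            fixations.append((samples[i][0], samples[j][0], cx, cy))
            i = j + 1
        else:
            i += 1
    return fixations
```

The resulting fixation centroids and durations are exactly what heat-map visualizations aggregate.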


http://www.prweb.com/releases/2005/03/prweb213516.htm


29 of 69

Fixations - Interaction example

  • One possible interaction using fixations is the dwell-time method

Typical application: eye typing for people with disabilities (since the 1980s)

Problems: Midas Touch effect; slow

  • Another interaction using fixations is multimodal interaction: fixate and press a key
    Problems: needs a second modality and is not contact-free, but it is the quickest interaction method
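The dwell-time method boils down to a timer that starts when gaze enters a target and triggers once the gaze has rested there long enough. A hypothetical Python sketch (class name and the 0.8 s default are illustrative assumptions):

```python
class DwellButton:
    """Sketch of dwell-time selection: a target activates after gaze
    rests on it for `dwell` seconds."""

    def __init__(self, rect, dwell=0.8):
        self.rect = rect          # (x, y, w, h) in pixels
        self.dwell = dwell        # required dwell time in seconds
        self._enter_time = None   # when gaze entered the target

    def update(self, gaze, now):
        """Feed one gaze sample; returns True once when the dwell completes."""
        x, y, w, h = self.rect
        inside = x <= gaze[0] <= x + w and y <= gaze[1] <= y + h
        if not inside:
            self._enter_time = None       # leaving the target resets the timer
            return False
        if self._enter_time is None:
            self._enter_time = now        # gaze just entered the target
        if now - self._enter_time >= self.dwell:
            self._enter_time = None       # fire once, then re-arm
            return True
        return False
```

Note how this sketch makes the Midas Touch problem concrete: merely reading the button's label for 0.8 s already "presses" it.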


https://www.youtube.com/watch?v=oXBKXRqxVnU


30 of 69

Saccades

  • Fast jumps from one fixation point to another


http://www.web-solution-way.be/3-marketing-internet/19-eye-tracking-google.html

https://en.wikipedia.org/wiki/Eye_tracking


31 of 69

Saccades - Interaction Examples

  • Gaze gestures

Looking at the corners of the screen in a certain order

  • Reading detection, activity recognition

A series of forward saccades and a long backward saccade indicate reading activity.
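The reading pattern above can be detected heuristically from saccade amplitudes. The following Python sketch is illustrative; the names and thresholds are assumptions, not a published algorithm:

```python
def looks_like_reading(saccades, min_forward=4, sweep_factor=3.0):
    """Heuristic reading detector sketch.

    saccades: horizontal displacements (e.g. in degrees); positive =
    forward (left-to-right), negative = backward. Reading shows runs of
    small forward saccades ended by one long backward sweep to the
    start of the next line.
    """
    forward_run = 0
    avg_forward = 0.0
    for dx in saccades:
        if dx > 0:
            forward_run += 1
            # running mean of forward saccade amplitude
            avg_forward += (dx - avg_forward) / forward_run
        elif forward_run >= min_forward and -dx > sweep_factor * avg_forward:
            return True       # long back-sweep after a forward run
        else:
            forward_run = 0   # pattern broken: start over
            avg_forward = 0.0
    return False
```
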


32 of 69

Smooth Pursuits

  • Smoothly follow a moving target


https://www.andreas-bulling.de/fileadmin/docs/vidal13_ubicomp.pdf


33 of 69

Interaction using Smooth Pursuits
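Pursuits-style interfaces (as in Vidal et al., linked on the previous slide) select the moving target whose trajectory correlates best with the gaze trajectory; this needs no calibration because correlation compares relative motion, not absolute positions. A simplified Python sketch of the idea (function names and the 0.9 threshold are illustrative):

```python
def pearson(a, b):
    """Pearson correlation of two equal-length sequences (0.0 if flat)."""
    n = len(a)
    ma = sum(a) / n
    mb = sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    if va == 0 or vb == 0:
        return 0.0
    return cov / (va * vb)

def select_pursuit_target(gaze_x, gaze_y, targets, threshold=0.9):
    """Return the index of the target whose (x, y) trajectory correlates
    best with the gaze trajectory, or None if none exceeds the threshold.

    targets: list of (xs, ys) trajectories sampled like the gaze data.
    """
    best, best_r = None, threshold
    for i, (tx, ty) in enumerate(targets):
        # require both axes to correlate: take the weaker of the two
        r = min(pearson(gaze_x, tx), pearson(gaze_y, ty))
        if r > best_r:
            best, best_r = i, r
    return best
```
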


34 of 69

Vestibulo-ocular Reflex

  • Compensation of head movements

Interaction method using the vestibulo-ocular reflex:

gaze-based head gestures


https://www.youtube.com/watch?v=j_R0LcPnZ_w


35 of 69

Optokinetic Nystagmus

  • Combination of pursuit movements and reset saccades while watching a series of moving objects


Jalaliniya and Mardanbegi. CHI 2016. EyeGrip: Detecting Targets in a Series of Uni-directional Moving Objects Using Optokinetic Nystagmus Eye Movements.


36 of 69

Vergence Movements

  • Cooperation of both eyes to focus on a single object


Florian Alt, Stefan Schneegass, Jonas Auda, Rufat Rzayev, and Nora Broy. 2014. Using eye-tracking to support interaction with layered 3D interfaces on stereoscopic displays. IUI '14


37 of 69

Pupil Dilation

  • Change in diameter to accommodate lighting
    • circular muscles around the pupil

  • (Smaller) change in diameter when exerting mental effort
    • radial muscles around the pupil
    • used to estimate cognitive/mental workload
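As a toy illustration, relative dilation over a resting baseline can serve as a crude workload index. This sketch ignores lighting, which a serious model (like the one cited on this slide) must factor out because it changes the diameter far more than mental effort does:

```python
def workload_index(pupil_mm, baseline_mm):
    """Very simplified sketch: relative pupil dilation over a resting
    baseline as a rough cognitive-load indicator.

    pupil_mm: diameters measured during the task (mm).
    baseline_mm: diameters measured at rest under the same lighting (mm).
    Returns the fractional dilation, e.g. 0.05 = 5 % larger than baseline.
    """
    base = sum(baseline_mm) / len(baseline_mm)
    task = sum(pupil_mm) / len(pupil_mm)
    return (task - base) / base
```
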


Bastian Pfleging, Drea K. Fekety, Albrecht Schmidt, and Andrew L. Kun. A Model Relating Pupil Diameter to Mental Workload and Lighting Conditions. In Proc. CHI '16.


38 of 69

Eye Tracking Technologies


39 of 69

Eye Tracking Technologies


Remote Eye Tracking

Head Mounted Eye Tracking

Tobii eye trackers

Pupil labs eye tracker


40 of 69

Eye Tracking Technologies


Remote Eye Trackers vs. Head-Mounted Eye Trackers

Tracks gaze...
  • Remote: on displays
  • Head-mounted: in natural settings

Building responsive systems requires...
  • Remote: developing software using an SDK
  • Head-mounted: labeling the surroundings (e.g. adding QR codes to products)

Setup
  • Remote: easy to set up
  • Head-mounted: cumbersome to wear

Flexibility
  • Remote: allows very limited movement
  • Head-mounted: allows free head/body movements

Can users see the gaze data?
  • Remote: yes
  • Head-mounted: no, unless you provide the user with another display

http://www.research-results.de/

http://www.useeye.de/


41 of 69

Eye Tracking Producers


Commercial

Open Source

Open Eyes project

http://thirtysixthspan.com/openEyes/

Pupil Eye Tracker (open source software)


42 of 69

Eye Tracking Techniques


43 of 69

Eye Tracking Techniques

  • Video-based Tracking
  • Infrared Pupil-Corneal Reflection (IR-PCR) Tracking
  • Electrooculography-Based (EOG) Tracking


EOG Goggles

Tobii Glasses Eye tracker

Tobii Eye Tracker (stationary)

IR-PCR Eye trackers


44 of 69

Eye Tracking techniques - Video-based

  • Can be remote or head-mounted

  • Relies on image processing
    • Pupil detection (SET, Starburst algorithms)

  • Accuracy/quality is influenced by:
    • camera parameters (distance, resolution, …)
    • lighting conditions
    • reflections of glasses/lenses, obstacles (eyelids)
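To illustrate the image-processing step, a naive dark-pupil detector can threshold a grayscale eye image and take the centroid of the dark pixels. Real algorithms such as Starburst or SET add robust ellipse fitting and outlier rejection; this sketch (with an illustrative threshold) only shows the basic idea:

```python
def pupil_centroid(image, threshold=40):
    """Naive dark-pupil detection sketch.

    image: 2D list of brightness values 0-255 (grayscale eye image).
    Returns the (row, col) centroid of pixels darker than `threshold`,
    or None if no pixel qualifies.
    """
    rows = cols = count = 0
    for r, line in enumerate(image):
        for c, v in enumerate(line):
            if v < threshold:       # assume the pupil is the darkest region
                rows += r
                cols += c
                count += 1
    if count == 0:
        return None
    return (rows / count, cols / count)
```

This also makes the listed accuracy factors concrete: glare from glasses or strong lighting adds dark or bright pixels that drag the centroid away from the true pupil center.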


  • Li, D., & Parkhurst, D. J. (2005). Starburst: A robust algorithm for video-based eye tracking.
  • Javadi, A.-H., Hakimi, Z., Barati, M., Walsh, V., & Tcheang, L. (2015). SET: a pupil detection method using sinusoidal approximation. Frontiers in Neuroengineering, 8(April), 1–10. http://doi.org/10.3389/fneng.2015.00004


45 of 69

Eye Tracking techniques - IR-PCR

  • Can be remote or head-mounted
  • The corneal reflection serves as a static reference point
    • Allows slight head movement
    • More accurate gaze data
  • Does not work well outdoors (sunlight interferes with the infrared)


Majaranta and Bulling 2014


46 of 69

Eye Tracking techniques - EOG

  • Uses electrodes attached to the skin
  • Works in completely dark settings (and even with closed eyes)
  • Signal processing is computationally lightweight
  • Can be affected by signal noise (e.g. power-line interference)
  • Less accurate, even with medical-grade equipment


EOG Goggles

Ear-pad based [Manabe et al. 2013] (only detects looks to the left/right)


47 of 69

Challenges


48 of 69

Challenges

Interaction

  • Midas touch (Perception vs Interaction)
  • Involuntary eye movements

Data Interpretation

  • Gaze fixation point != Visual attention
  • Experiment Bias
  • Eye Fatigue

Technical

  • Accuracy
  • Calibration
  • Environmental Artefacts
  • Sampling Rates


49 of 69

Midas Touch


Should the class end earlier?

Accidentally choosing “No” while reading

(Midas Touch)


50 of 69

Involuntary eye movements


In other words:

What is relevant data and what is not?


51 of 69

Data Interpretation

Gaze fixation point != Visual attention

Humans are not necessarily paying attention to what they look at


52 of 69

Data Interpretation

Eye fatigue

Kosch et al. (CHI 2018)


53 of 69

Accuracy


http://www.cns.nyu.edu/~david/courses/perception/lecturenotes/eye/eye.html


54 of 69

Environmental Artefacts


Makeup

Interference (e.g. electromagnetic)

Lighting conditions


55 of 69

Calibration


56 of 69

Calibration


57 of 69

Calibration

+ Results in more accurate data.

  − Difficult and tedious.
  − Might break if the user moves.
  − Even so, it is almost impossible to determine the exact pixel a user is gazing at.
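Conceptually, calibration fits a mapping from raw eye coordinates to screen coordinates using the points the user fixated. A least-squares affine sketch in Python (real trackers often fit higher-order polynomials; names are illustrative):

```python
import numpy as np

def fit_calibration(eye_pts, screen_pts):
    """Least-squares affine calibration sketch.

    eye_pts, screen_pts: arrays of shape (n, 2) with the raw eye-feature
    coordinates and the known screen coordinates of n calibration points.
    Returns a function mapping an (ex, ey) pair to screen coordinates.
    """
    eye = np.asarray(eye_pts, dtype=float)
    scr = np.asarray(screen_pts, dtype=float)
    # design matrix [ex, ey, 1] for each calibration sample
    A = np.column_stack([eye, np.ones(len(eye))])
    coef, *_ = np.linalg.lstsq(A, scr, rcond=None)   # (3, 2) coefficients

    def gaze_to_screen(ex, ey):
        return tuple(np.array([ex, ey, 1.0]) @ coef)

    return gaze_to_screen
```

This also shows why calibration breaks when the user moves: the fitted coefficients encode one particular head/eye geometry, so the mapping becomes stale as soon as that geometry changes.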


On-screen calibration using the Eye-tribe

Calibration using markers


58 of 69

Calibration-free Eye Tracking


59 of 69

Calibration-free eye tracking


SideWays (Zhang et al., CHI 2013)

Smooth Pursuits (Vidal et al., UbiComp 2013)


60 of 69

Calibration-free eye tracking


Nagamatsu et al. PerDis 2014

Drewes et al. 2007

(can also be done without calibration)


61 of 69

Take-home Messages

Gaze is a promising modality

    • for understanding the user
    • for interacting with computers and smart environments
    • when combined with other modalities

Gaze technologies still have some limitations

    • Require calibration (low usability)
    • Track one user at a time
    • Can be confused with perception (Midas touch)
    • Not flexible for dynamic environments (e.g. public displays)


62 of 69

Our Research Interests


63 of 69

Gaze-based Interaction on Public Displays


64 of 69

Challenges of Gaze-based Interaction with Large Public Displays


1. Position

2. Movement

3. Calibration


65 of 69

EyeScout: Active Eye Tracking for Position and Movement Independent Gaze Interaction with Large Public Displays.


Project by Alexander Klimczack and Martin Reiss


66 of 69

GazeDrone: Using Drones as Mobile Remote Eye Trackers for Public Displays


Project by Anna Kienle


67 of 69

Text-based Calibration of Eye Trackers


Project by Ozan Saltuk


68 of 69

Text-based Calibration of Eye Trackers

Seamlessly integrating eye tracker calibration into public display applications


69 of 69

Our Research Interests


Proficiency Awareness through Gaze Features
