
DENVER COLORADO, USA

JUNE 25-28 2019


Table of Contents

ETRA Event Maps

ETRA Schedule Overview

Message from Conference Chairs

ETRA 2019 Credits

Keynotes

Tutorials

Doctoral Symposium

ETRA Wednesday Sessions

COGAIN Wednesday Sessions

ET4S Wednesday Sessions

ETRA 2019 Posters, Pre-function Space

ETRA 2019 Demos & Videos

ETRA Thursday Sessions

ETWEB Thursday Sessions

ETVIS Thursday Sessions

Privacy Panel

ETRA Friday Sessions

Sponsor Descriptions

Event Notes


ETRA Event Maps (legend: Exhibitors, Registration, Posters, Meeting Rooms, Meals, Office)


ETRA Schedule for Tuesday & Wednesday

06/25/2019 (Tuesday)
Tracks: Tutorials Track 1 (Oxford / Pikes Peak / Humboldt) | Tutorials Track 2 (Red Cloud) | Doctoral Symposium (Torrey's)

8:00 - 10:00    T1: Deep Learning in the Eye Tracking World | T3: Gaze Analytics Pipeline | Doctoral Symposium
10:00 - 10:30   Coffee break (Ballroom Foyer)
10:30 - 12:30   T1: Deep Learning in the Eye Tracking World | T3: Gaze Analytics Pipeline | Doctoral Symposium
12:30 - 13:30   Lunch break (Oxford / Pikes Peak / Humboldt)
13:30 - 15:30   T2: Discussion and Standardisation of the Metrics for Eye Movement Detection | T4: Eye Tracking in Autism and Other Developmental Conditions | Doctoral Symposium
15:30 - 16:00   Coffee break (Ballroom Foyer)
16:00 - 18:00   T2: Discussion and Standardisation of the Metrics for Eye Movement Detection | T4: Eye Tracking in Autism and Other Developmental Conditions | Doctoral Symposium

06/26/2019 (Wednesday)
Tracks: Workshops (Oxford) | Talks Track 1 (Humboldt) | Talks Track 2 (Pikes Peak)

8:30 - 9:30     Opening Session and Keynote Address by Oleg Komogortsev, "Eye Tracking Sensors Past, Present, Future and Their Applications" (Oxford / Humboldt / Pikes Peak)
9:30 - 10:00    Coffee break (Ballroom Foyer)
10:00 - 12:00   Workshop by Facebook Reality Labs | ETRA Session 1: Deep Learning, Paths, Transitions & Pursuits | COGAIN Session 1
12:00 - 13:00   Lunch break (The Lockwood Kitchen)
13:00 - 15:00   Workshop by Tobii Pro | ETRA Session 2: Calibration, Cognition, Smartphones, & Sequences | COGAIN Session 2
15:00 - 15:30   Coffee break (Ballroom Foyer)
15:30 - 17:30   POSTERS Fast Forward Session (16:15 - 17:30) | ET4S Session 1 (15:30 - 18:00)
17:30 - 21:00   Reception, Poster Session, Video & Demo Session (Ellingwood A&B and Red Cloud)

ETRA Schedule for Thursday & Friday

06/27/2019 (Thursday)
Tracks: Workshops (Red Cloud) | Talks Track 1 (Oxford / Pikes Peak / Humboldt) | Talks Track 2 (Torrey's)

8:30 - 9:30     Keynote Address by Enkelejda Kasneci, "From Gazing to Perceiving" (Oxford / Pikes Peak / Humboldt)
9:30 - 10:00    Coffee break (Ballroom Foyer)
10:00 - 12:00   Workshop by SR Research | ETRA Session 3: Visualisations & Programming | ETWEB Session 1
12:00 - 13:00   Lunch break (Oxford / Pikes Peak / Humboldt)
13:00 - 15:00   ETRA Session 4: Head-Mounted Eye Tracking | ETVIS Session 1: Visualization Tools and Techniques
15:00 - 15:30   Coffee break (Ballroom Foyer)
15:30 - 17:30   ETRA Session 5: Gaze Detection and Prediction | ETVIS Session 2: Visual Scanpath Comparison
18:30 - 21:30   Banquet (Oxford / Pikes Peak / Humboldt)

06/28/2019 (Friday)
Tracks: Talks Track 1 (Oxford / Pikes Peak / Humboldt) | Talks Track 2 (Torrey's)

8:30 - 9:30     Privacy in Eye Tracking - Panel Discussion (Oxford / Pikes Peak / Humboldt)
9:30 - 10:00    Coffee break (Ballroom Foyer)
10:00 - 12:00   ETRA Session 6: Privacy, Authentication, Fitts of Skill | CHALLENGE Session 1
12:00 - 13:00   Lunch break (Oxford / Pikes Peak / Humboldt)
13:00 - 15:00   Town Hall Meeting, ETRA 2020 Introduction (Oxford / Pikes Peak / Humboldt)


Message from Conference Chairs

We are very pleased to welcome you to the 11th ACM Symposium on Eye Tracking Research & Applications (ETRA) 2019 in Denver, Colorado, USA! We are excited to present an excellent program for you to experience. We strongly believe that the program and proceedings represent the most vibrant advances in eye tracking methodology and its applications.

For more than twenty years, the ACM ETRA conference has been the premier world-wide meeting place for the eye tracking community. ETRA is growing: starting this year, ETRA is being held annually after being biennial since its inception. ETRA 2019 puts forth the effort to enhance interdisciplinary collaboration. To make this happen, we are pleased to have ETRA 2019 collocated with four excellent thematic eye tracking meetings and tracks: Eye Tracking For Spatial Research (ET4S), Communication by Gaze Interaction (COGAIN), Eye Tracking and Visualization (ETVIS), and Eye Tracking for the Web (ETWEB). ET4S brings together researchers from different areas who share a common interest in using eye tracking for research questions related to spatial information and spatial decision making. ETWEB is a new initiative which covers topics related to the Web (interface semantics extraction, interaction adaptation, etc.) and eye tracking (attention visualization, crowd sourcing, etc.). COGAIN focuses on all aspects of gaze interaction, with special emphasis on eye-controlled assistive technology. It presents advances in these areas, leading to new capabilities in gaze interaction, gaze enhanced applications, and gaze contingent devices. ETVIS covers topics at the intersection of visualization research (information visualization, scientific visualization, and visual analytics) and eye tracking.

In 2019, ETRA will also host, for the first time, a challenge track consisting of a mining challenge. An open call was put out to apply analytical tools to a common human eye-movement data set. The challenge was for participants to creatively engage their newest and most exciting mining tools and approaches to make the most of this dataset. For ETRA 2019, we received 109 submissions to the main ETRA conference track. We accepted 28 full papers and 18 short papers after a two-phase reviewing process. This year we continue our tutorial program and our doctoral symposium, which has 13 accepted submissions. We also accommodate four half-day tutorials: Deep Learning in the Eye Tracking World by Pawel Kasprowski; Discussion and Standardization of the Metrics for Eye Movement Detection by Mikhail Startsev and Raimondas Zemblys; Gaze Analytics Pipeline by Nina Gehrer and Andrew Duchowski; and Eye Tracking in the Study of Developmental Conditions: A Computer Scientist's Primer by Frederick Shic.

ETRA 2019 also aims at bringing together science and business. We are most grateful to all of our sponsors. The ETRA conference is also supported by the efforts of the Association for Computing Machinery (ACM) and its Special Interest Groups on Computer-Human Interaction (SIGCHI) and Computer Graphics and Interactive Techniques (SIGGRAPH). We would like to point your attention to the sponsors' workshops accompanying the main conference. This year we accommodate three sponsor workshops, by Tobii Pro, Facebook Reality Labs, and SR Research Ltd. These workshops are free for all ETRA attendees and are an excellent platform for knowledge exchange between business practitioners and academics.

We are very excited about our keynote speakers addressing current trends in eye tracking. Oleg Komogortsev, a PECASE award winner, will discuss the past and present status of eye tracking sensors, along with his vision for future development. He will also discuss applications that necessitate the presence of such sensors in VR/AR devices. Enkelejda Kasneci will dive into more fundamental issues of human gazing and perception, presenting the requirements needed to shift our paradigm from foveal to retina-aware eye tracking and discussing novel ways to employ this new paradigm to further our understanding of human perception.

Putting on a conference takes many people working together towards a common goal. We thank all the authors and volunteer reviewers who have contributed to this year's submissions. We would especially like to highlight the work of the area chairs, who provided synthesizing meta-reviews, led discussions, and provided their expert guidance to reviewers and authors. It has been our pleasure to serve in our capacity as Conference Chairs, and we look forward to spending this year's ETRA together with all of you. We wish you a great time in Denver!

Bonita Sharif & Krzysztof Krejtz

ETRA 2019 Conference Chairs


ETRA 2019 Organization

Conference Chairs
Bonita Sharif (University of Nebraska-Lincoln, USA)
Krzysztof Krejtz (SWPS University of Social Sciences and Humanities, Poland)

Paper Chairs
Veronica Sundstedt (Blekinge Tekniska Högskola, Sweden)
Paige Rodeghero (Clemson University, USA)

Demo & Video Chairs
Tanja Blascheck (Universität Stuttgart, Germany)
Fabian Deitelhoff (University of Applied Sciences and Arts Dortmund, Germany)

Doctoral Symposium Chairs
Hana Vrakova (University of Colorado Boulder, USA)
Reynold Bailey (Rochester Institute of Technology, USA)
Ann McNamara (Texas A&M University, USA)

Tutorial Chairs
Preethi Vaidyanathan (LC Technologies Inc., USA)
Diako Mardanbegi (Lancaster University, UK)

Poster Chairs
Arantxa Villanueva (Public University of Navarra, Spain)
Eakta Jain (University of Florida, USA)

Eye Tracking Challenge Chairs
Susana Martinez-Conde (State University of New York, USA)
Jorge Otero-Millan (Johns Hopkins University, USA)

Sponsor Chairs
Oleg Komogortsev (Michigan State University, USA)
Kenan Bektas (Zurich University of Applied Sciences, Switzerland)

Social Media Chairs
Anna Niedzielska (SWPS University of Social Sciences and Humanities, Poland)
Nina Gehrer (Eberhard Karls University of Tübingen, Germany)

Web Chairs
Adrian Pilkington (University of Nebraska-Lincoln, USA)
Michael Decker (Bowling Green State University, USA)

Proceedings Chair
Stephen N. Spencer (University of Washington, USA)

Local Arrangements Chairs
Hana Vrakova (University of Colorado Boulder, USA)
Martha Crosby (University of Hawaii, USA)

Student Volunteer Chairs
Katarzyna Wisiecka (SWPS University of Social Sciences and Humanities, Poland)
Ayush Kumar (Stony Brook University, USA)

Accessibility Chairs
… (Poland)

Conference Companion Booklet Chairs
Matthew Crosby (USA)
Adrian Pilkington (University of Nebraska-Lincoln, USA)
Agnieszka Ozimek (SWPS University of Social Sciences and Humanities, Poland)

Design and Artwork Chairs
Matthew Crosby (USA)
Adrian Pilkington (University of Nebraska-Lincoln, USA)
Meera Patel (University of Nebraska-Lincoln, USA)

Co-sponsored by ACM SIGGRAPH and ACM SIGCHI

ETRA Steering Committee
Andrew Duchowski (chair), Pernilla Qvarfordt, Päivi Majaranta

Special thanks to the 128 reviewers and 51 area chairs!

Check out the ETRA 2019 website at http://etra.acm.org/2019/areachairs.html and http://etra.acm.org/2019/reviewers.html for a list of names.


ETRA 2019 Co-located Events Organization

COGAIN (Communication by Gaze Interaction)

Organizers
John Paulin Hansen, Technical University of Denmark, Denmark
Päivi Majaranta, Tampere University, Finland

Program Co-Chairs
Diako Mardanbegi, Lancaster University, United Kingdom
Ken Pfeuffer, Bundeswehr University Munich, Germany

ET4S (Eye Tracking for Spatial Research)

Organizers
Peter Kiefer, ETH Zurich, Switzerland
Fabian Göbel, ETH Zurich, Switzerland
David Rudi, ETH Zurich, Switzerland
Ioannis Giannopoulos, TU Vienna, Austria
Andrew T. Duchowski, Clemson University, USA
Martin Raubal, ETH Zurich, Switzerland

ETVIS (Eye Tracking and Visualization)

Organizers
Michael Burch, Eindhoven University of Technology, Netherlands
Pawel Kasprowski, Silesian University of Technology, Poland
Leslie Blaha, Air Force Research Laboratory, USA

Social Media Chair
Ayush Kumar, Stony Brook University, USA

ETWEB (Eye Tracking for the Web)

Organizers
Chandan Kumar, Institute WeST, University of Koblenz, Germany
Raphael Menges, Institute WeST, University of Koblenz, Germany
Sukru Eraslan, METU Northern Cyprus Campus

Program Committee
Alexandra Papoutsaki, Pomona College, USA
Jacek Gwizdka, University of Texas, USA
Scott MacKenzie, York University, Canada
Simon Harper, University of Manchester, UK
Caroline Jay, University of Manchester, UK
Victoria Yaneva, University of Wolverhampton, UK
Marco Porta, University of Pavia, Italy
Spiros Nikolopoulos, CERTH ITI, Greece
Korok Sengupta, University of Koblenz, Germany
Steffen Staab, University of Koblenz, Germany

Keynotes

Eye Tracking Sensors Past, Present, Future and Their Applications
Wednesday, June 26, 8:30 - 9:30

(Oxford / Humboldt / Pikes Peak)

OLEG KOMOGORTSEV, ASSOCIATE PROFESSOR AT TEXAS STATE UNIVERSITY, USA

Abstract. The availability of eye tracking sensors is set to explode, with billions of units available in future Virtual Reality (VR) and Augmented Reality (AR) platforms. In my talk I will discuss the past and present status of eye tracking sensors, along with my vision for future development. I will also discuss applications that necessitate the presence of such sensors in VR/AR devices, along with applications that will become possible once eye tracking sensors are widely adopted.

Bio. Dr. Komogortsev is currently a tenured Associate Professor at Texas State University and a Visiting Scientist at Facebook Reality Labs. He received his B.S. in Applied Mathematics from Volgograd State University, Russia, and his M.S./Ph.D. degrees in Computer Science from Kent State University, Ohio. He has previously worked for institutions such as Johns Hopkins University, the University of Notre Dame, and Michigan State University. Dr. Komogortsev conducts research in eye tracking with a focus on cyber security (biometrics), health assessment, human-computer interaction, usability, and bioengineering. This work has thus far yielded more than 100 peer-reviewed publications and several patents. Dr. Komogortsev's research has been covered by national media including NBC News, Discovery, Yahoo, LiveScience, and others. He is a recipient of four Google awards, including two Virtual Reality Research Awards (2016, 2017), a Google Faculty Research Award (2014), and a Google Global Faculty Research Award (2018). He has also won a National Science Foundation CAREER award and a Presidential Early Career Award for Scientists and Engineers (PECASE) from President Barack Obama on the topic of cybersecurity with an emphasis on eye movement-driven biometrics and health assessment. In addition, his research is supported by the National Science Foundation, the National Institutes of Health, and various industrial sources. Dr. Komogortsev's current grand vision is to push forward eye tracking solutions in future virtual and augmented reality platforms as enablers of more immersive experiences, security, and assessment of human state.


From Gazing to Perceiving
Thursday, June 27, 8:30 - 9:30

(Oxford / Pikes Peak / Humboldt)

ENKELEJDA KASNECI, ASSOCIATE PROFESSOR OF COMPUTER SCIENCE, PERCEPTION ENGINEERING LAB, UNIVERSITY OF TÜBINGEN, GERMANY

Abstract. Eye tracking technology is based on the assumption that our perception follows the fovea – a tiny region of our retina responsible for sharp central vision. In fact, what we usually refer to as the line of sight is nothing but the imaginary line connecting the fovea to the gazed location. However, our visual perception is far more complex than that: gazing is not perceiving. As a tangible example, consider our retinal peripheral view. Whereas we cannot distinguish details in this region, movements are perceptible nonetheless. In this talk, I will go beyond the line of sight by a) presenting the requirements needed to shift our paradigm from foveal to retina-aware eye tracking, and b) discussing novel ways to employ this new paradigm to further our understanding of human perception.

Bio. Enkelejda Kasneci is an Associate Professor of Computer Science at the University of Tübingen, Germany, where she leads the Perception Engineering Group. As a BOSCH scholar, she received her M.Sc. degree in Computer Science from the University of Stuttgart in 2007. In 2013, she received her PhD in Computer Science from the University of Tübingen, Germany. For her PhD research, she was awarded the research prize of the Federation Südwestmetall in 2014. From 2013 to 2015, she was a Margarete-von-Wrangell Fellow. Dr. Kasneci's overarching and long-term vision aims at computing systems that sense and infer the user's cognitive state, actions, and intentions based on eye movements. These systems set out to provide information for assistive technologies applicable to many activities of everyday life. Towards this vision, her research combines eye tracking technology with machine learning in various multidisciplinary projects funded from a variety of sources. In addition, she serves as an academic editor for PLOS ONE as well as a reviewer and PC member for several journals and major conferences.

Wednesday, June 26 2019

13:00 - 15:00 (Oxford)

Tobii Pro Workshop

Tobii Pro solutions for VR experiments

Presenters: Jonas Högström, Tobii Pro, and Tim Holmes, Royal Holloway, University of London

Abstract: Whereas experiments in Virtual Reality (VR) have grown much more common over the last years, they are still not as common nor as well-supported as standard screen-based experiments. The choice of research tool goes hand in hand with the research question of interest, but today the same question can be approached from different angles using screen-based experiments, glasses-based experiments, 360° VR media, and full 3D VR environments. The choice of what media to use in a VR experiment is determined by the researcher's desired level of control of the stimulus, how representative it is supposed to be, and the resources available for the project.

This workshop will present Tobii Pro's solutions for conducting VR experiments, and will go through how areas of interest, trials, and moving stimuli are taken care of by the software, and what is expected of the researchers themselves. Workshop attendees will get a chance to try the VR hardware and the software solutions themselves.


Wednesday, June 26 2019

10:00 - 12:00 (Oxford)

Facebook Reality Labs Workshop

Establishing a Ground-Truth for Eye Tracking

Robert Cavin, Research Lead, Eye Tracking, Facebook Reality Labs
Immo Schuetz, Postdoctoral Research Scientist, Facebook Reality Labs
Robin Sharma, Optical Scientist, Facebook Reality Labs
Kavitha Ratnam, Postdoctoral Research Scientist, Facebook Reality Labs
Michele Rucci, Professor, Center for Visual Science, University of Rochester
Austin Roorda, Professor, School of Optometry, University of California Berkeley

Calibration and performance evaluation of current eye trackers typically rely on comparing known target positions to measured gaze directions. A geometric eye model is then optimized based on this correspondence, essentially treating the calibration targets as the "ground truth" for each gaze direction. While this has worked reasonably well to achieve current calibration accuracies of around 0.5 degrees, trying to optimize beyond this point reveals that calibration targets are more a self-report measure than a true ground truth. Fixational eye movements such as drifts and micro-saccades, as well as the accuracy of positioning the fovea or preferred viewing location itself, all contribute to uncertainty in the "ground-truth" target location and thus form a lower bound for tracking accuracy. Many applications of eye tracking for virtual and augmented reality will require accuracy beyond this bound. In this workshop, we will explore the hypothesis that measuring ground-truth gaze in conjunction with a second, to-be-evaluated eye tracking system can push evaluation past this limit, treating ground-truth as the mapping of real-world content onto the retinal locus of fixation. The workshop will feature perspectives from academia and industry, followed by a panel discussion on the viability and possibilities of ground-truth eye tracking approaches. To continue the conversation after the workshop, we invite participants to a Facebook-sponsored social after the main conference events.
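To make the target-based calibration idea above concrete, here is a generic, minimal sketch (not the presenters' method): it fits a second-order polynomial mapping from raw eye-tracker features to screen coordinates by least squares, treating the known target positions as ground truth. The feature values and target layout are simulated purely for illustration.

```python
# Generic sketch of target-based calibration: fit a polynomial mapping from
# raw eye-tracker features to screen coordinates, treating known calibration
# targets as ground truth. Data below are simulated for illustration only.
import numpy as np

def design_matrix(xy):
    """Second-order polynomial terms for raw feature coordinates (x, y)."""
    x, y = xy[:, 0], xy[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

rng = np.random.default_rng(0)
targets = np.array([[x, y] for x in (0.1, 0.5, 0.9) for y in (0.1, 0.5, 0.9)])
raw = targets + rng.normal(scale=0.01, size=targets.shape)   # simulated eye features

coeffs, *_ = np.linalg.lstsq(design_matrix(raw), targets, rcond=None)
predicted = design_matrix(raw) @ coeffs
error = np.linalg.norm(predicted - targets, axis=1).mean()    # mean residual in screen units
print(f"mean calibration residual: {error:.4f}")
```

As the workshop abstract notes, with such a scheme the residual can never be smaller than the uncertainty in the "ground-truth" targets themselves.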

Thursday, June 27 2019

10:00 - 12:00 (Red Cloud)

SR Research Workshop

Title: Recording and Analyzing Gaze during Website Interactions with EyeLink Eye Trackers

Presenter: Dr. Sam Hutton, SR Research Ltd

Abstract: Eye tracking can be a powerful tool in usability research and graphical interface design, providing important information concerning where users direct their attention in the websites and applications they are interacting with. In website usability, for example, eye tracking can reveal important information about which areas of a web page are read, which areas are skipped, or even which areas increase cognitive workload. In traditional eye tracking, the researcher has tight control over what is shown, where it is shown, and when it is shown. Analysis of the gaze data typically involves mapping gaze onto various areas of interest. Using eye tracking for usability research, however, introduces a number of complications that traditional stimulus presentation and analysis software do not always deal with adequately. For example, the participants themselves determine what is shown, and when and where it is shown. As such, an accurate recording of the screen is critical. Web pages often contain dynamic (moving) content and can themselves be scrolled, adding further complications to traditional analysis approaches, in which interest areas are typically static.

This workshop will introduce new recording and analysis software from SR Research that allows researchers to record and quantify participants' gaze whilst they interact with websites. Key features include screen and audio recording, keypress and mouse logging, the ability to provide a live preview of the gaze data during recording, automatic scroll compensation at the analysis stage, automatic data segmentation and navigation based on URLs, data aggregation from multiple participants, and mouse event data visualization.
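As a minimal illustration of the kind of analysis described above (and not SR Research's software), the following sketch maps gaze samples onto page-anchored areas of interest after compensating for vertical scrolling. The AOI names and coordinates are invented.

```python
# Minimal sketch: map screen-coordinate gaze samples onto page-anchored AOIs
# after undoing the page's scroll offset. AOIs and coordinates are invented.
from dataclasses import dataclass

@dataclass
class AOI:
    name: str
    left: float
    top: float      # page coordinates (pixels from the top of the page)
    right: float
    bottom: float

    def contains(self, x: float, y: float) -> bool:
        return self.left <= x <= self.right and self.top <= y <= self.bottom

aois = [AOI("header", 0, 0, 1280, 150), AOI("article", 100, 150, 900, 2400)]

def label_sample(x_screen, y_screen, scroll_y, aois):
    """Convert a screen-coordinate gaze sample to page coordinates and return the AOI hit."""
    y_page = y_screen + scroll_y          # undo scrolling
    for aoi in aois:
        if aoi.contains(x_screen, y_page):
            return aoi.name
    return None

print(label_sample(400, 300, 600, aois))  # -> "article": the sample fell in the scrolled article
```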



Tutorials

Deep Learning in the Eye Tracking World
PAWEL KASPROWSKI, [email protected]
Tuesday, June 25, 2019, 8:00-12:30 (Oxford / Pikes Peak / Humboldt)

Abstract. Recently, deep learning has become a buzzword in computer science. Many problems which until now could be solved only using sophisticated algorithms can now be solved with specially developed neural networks.

Deep learning is also becoming more and more popular in the eye tracking world. It may be used in any place where a data-driven model is needed. The tutorial aims to show the potential applications (like calibration, event detection, gaze data analysis, and so on), and – what is more important – to show how to apply deep learning frameworks in such research.

There is a common belief that to use neural networks a strong mathematical background is necessary, as there is much theory which must be understood before starting work. There is also a belief that, because most deep learning frameworks are just libraries in programming languages, it is necessary to be a programmer and know the programming language being used. While such knowledge certainly helps to achieve better results, this tutorial aims to show that deep networks may be used even by people who know only a little about the theory. I will show you ready-to-use networks with exemplary eye movement datasets and try to explain the most critical issues which you will have to solve when preparing your own experiments. After the tutorial, you will probably not become an expert in deep learning, but you will know how to use it in practice with your eye movement data.

Audience. The tutorial is addressed to every person interested in deep learning; no special skills are required apart from some knowledge about eye tracking and eye movement analysis. However, minimal programming skills are welcome and may help in better understanding the problem.

Scope. This tutorial will include: (1) a gentle introduction to machine learning, (2) basics of neural networks, (3) an explanation of Convolutional Neural Networks and their applications to eye movement data, and (4) Recurrent Neural Networks and their possible usages. The tutorial will NOT include a detailed mathematical explanation of neural network architectures and algorithms. All subjects will be explained with simple try-on examples using real eye movement datasets.

Bio. Dr. Pawel Kasprowski is an Assistant Professor at the Institute of Informatics, Silesian University of Technology, Poland. He received his Ph.D. in Computer Science in 2004 under the supervision of Prof. Jozef Ober – one of the precursors of eye tracking. He has experience in both eye tracking and data mining. His primary research interest is using data mining methods to analyze eye movement signals. Dr. Kasprowski teaches data mining at the university as well as during commercial courses. At the same time, he is the author of numerous publications concerning eye movement analysis.

Additional information for prospective participants: http://www.kasprowski.pl/tutorial/
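As a flavor of the kind of try-on example the tutorial describes, here is a minimal, hypothetical sketch of a small 1D convolutional network that labels fixed-length windows of gaze velocity samples as fixation or saccade. The architecture, window length, and data below are invented for illustration and are not taken from the tutorial materials.

```python
# Hypothetical sketch: a tiny 1D CNN that classifies windows of gaze velocity
# samples (vx, vy) into two event classes. All choices are illustrative only.
import torch
import torch.nn as nn

class GazeEventCNN(nn.Module):
    def __init__(self, n_channels=2, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):              # x: (batch, channels, samples)
        return self.classifier(self.features(x).squeeze(-1))

model = GazeEventCNN()
dummy = torch.randn(8, 2, 50)          # 8 windows of 50 samples of (vx, vy)
logits = model(dummy)                  # (8, 2) class scores
print(logits.shape)
```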

Discussion and Standardisation of the Metrics for Eye Movement Detection
MIKHAIL STARTSEV, [email protected]
RAIMONDAS ZEMBLYS, [email protected]
Tuesday, June 25, 2019, 13:30-18:00 (Oxford / Pikes Peak / Humboldt)

Abstract. By now, a vast number of algorithms and approaches for detecting various eye movements (fixations, saccades, smooth pursuit, OKN, etc.) have been proposed and evaluated by researchers in the field. The reported results, however, are not always directly comparable and easily interpretable, even by experts. Part of this problem lies in the diversity of the metrics that are used to test the algorithms.

The multitude of metrics reported in the literature can be loosely divided into established groups. Firstly, there is a number of sample-level measures, such as agreement or disagreement rates.



Secondly, a growing number of event-level measures exist: average statistics of the "true" and detected events (duration, amplitude, etc.), quantitative and qualitative scores proposed by Komogortsev et al. [2010], different ways of computing F1 scores [Hooge et al. 2018, Zemblys et al. 2018, Startsev et al. 2018], variations of Cohen's kappa [Zemblys et al. 2018, Startsev et al. 2019], the temporal offset measures of Hooge et al. [2018], average intersection-over-union ratios [Startsev et al. 2018], and the Levenshtein distance between event sequences [Zemblys et al. 2018]. Almost all of the metrics listed above can be computed for all eye movement classes taken together or for each considered class in isolation.

Some aspects of these evaluation measures (especially on the level of events) contribute to their interpretability, bias, and suitability for various purposes and testing scenarios (e.g. whether expert manual annotations are available for comparison, or whether the stimuli were synthetically generated or recorded in naturalistic conditions). With the advent of machine learning-based models, the choice of a metric, a loss function, or a set of those should be motivated not just by differentiating between a handful of algorithms, but also by the metric's ability to guide the training process over thousands of epochs.

Right now, there is no clear-cut way of choosing a suitable metric for the problem of eye movement detection. Additionally, the set-up of an eye tracking experiment has a bearing on the applicable evaluation strategies. In this tutorial, we intend to provide an in-detail discussion of existing metrics, which will supply both theoretical and practical insights. We will illustrate our recommendations and conclusions through examples and experimental evidence. This tutorial aims to facilitate discussion and stimulate researchers to employ uniform and well-grounded evaluation strategies.
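To make two of the listed measures concrete, here is a minimal, self-contained sketch that computes a sample-level Cohen's kappa and an event-level intersection-over-union on toy labels. The label sequences and event boundaries are invented, and the exact definitions used in the cited papers may differ in detail.

```python
# Toy illustration of two evaluation measures: sample-level Cohen's kappa and
# event-level intersection-over-union (0 = fixation, 1 = saccade).
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Chance-corrected agreement between two equal-length label sequences."""
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected) if expected < 1 else 1.0

def event_iou(event_a, event_b):
    """Intersection-over-union of two events given as (start, end) sample indices."""
    inter = max(0, min(event_a[1], event_b[1]) - max(event_a[0], event_b[0]))
    union = (event_a[1] - event_a[0]) + (event_b[1] - event_b[0]) - inter
    return inter / union if union > 0 else 0.0

truth = [0, 0, 0, 1, 1, 0, 0, 0, 1, 1]
pred  = [0, 0, 1, 1, 1, 0, 0, 0, 0, 1]
print(cohens_kappa(truth, pred))        # agreement corrected for chance
print(event_iou((3, 5), (2, 5)))        # overlap of one detected saccade event
```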

Scope. The tutorial aims to provide its audience with a practice-oriented overview of evaluation in the field of eye movement detection, covering a wide variety of set-ups, such as eye movements with synthetic and naturalistic stimuli, in the presence or absence of manual annotations, as well as different purposes of the evaluation (e.g. uncovering systematic biases in the annotations by different experts; training a machine learning model) and evaluated entities (i.e. individual samples or whole events). The presentations will give recommendations for evaluation strategy choices for different scenarios, as well as support the discussion of various metrics by examples.

Audience. Researchers involved in eye movement detection (or even those who simply use its outputs) will benefit from the tutorial regardless of their background, either by discovering something new about the metrics they have or have not used before, or by contributing to the discussion and sharing their experiences.

Bio. Mikhail Startsev is a PhD student at the Technical University of Munich (TUM), Germany, and a member of the International Junior Research Group "Visual Efficient Sensing for the Perception-Action Loop" (VESPA) under the supervision of Michael Dorr. He received his Diplom degree in Computational Mathematics and Informatics from the Lomonosov Moscow State University (LMSU), Russia, in 2015, where he was a member of the Graphics and Media Lab. Mikhail's research is centred around the human visual system, with a particular emphasis on eye movements and saliency modelling, with several publications in human and computer vision-related conferences and journals.

Dr. Raimondas Zemblys is currently a researcher at Siauliai University (Lithuania) and a research engineer at Smart Eye AB (Sweden). His main research interests are eye-tracking methodology, eye-movement data quality, event detection, and applications of deep learning to eye-movement analysis. He received his PhD in Informatics Engineering from Kaunas University of Technology in 2013, and worked as a postdoc researcher at Lund University in 2013-2015 and at Michigan State University in 2017-2018.

Gaze Analytics Pipeline
ANDREW DUCHOWSKI, [email protected]
NINA GEHRER, [email protected]
Tuesday, June 25, 2019, 8:00-12:30 (Red Cloud)

Abstract. This tutorial gives a short introduction to experimental design in general and with regard to eye tracking studies in particular. Additionally, the design of three different eye tracking studies (using stationary as well as mobile eye trackers) will be presented, and the strengths and limitations of their designs will be discussed. Further, the tutorial presents details of a Python-based gaze analytics pipeline developed and used by Prof. Duchowski and Ms. Gehrer. The gaze analytics pipeline consists of Python scripts for extraction of raw eye movement data and event analysis, followed by statistical evaluation, analysis, and visualization of results using R.



Attendees of the tutorial will have the opportunity to run the scripts on an analysis of gaze data collected during categorization of different emotional expressions while viewing faces. The tutorial covers basic eye movement analysis as well as more advanced analysis using gaze transition entropy. Newer analytical tools and techniques, such as microsaccade detection and the Index of Pupillary Activity, will be covered time permitting.

Scope and Audience. The tutorial welcomes attendees at all levels of experience and expertise, from those just beginning to study eye movements and interested in the basics of experimental design to those well practiced in the profession who might wish to consider adopting the use of Python and R scripts, possibly wishing to contribute to, expand on, and improve the pipeline.

Bio. Dr. Duchowski is a professor of Computer Science at Clemson University. He received his baccalaureate (1990) from Simon Fraser University, Burnaby, Canada, and doctorate (1997) from Texas A&M University, College Station, TX, both in Computer Science. His research and teaching interests include visual attention and perception, eye tracking, computer vision, and computer graphics. He is a noted research leader in the field of eye tracking, having produced a corpus of papers and a monograph related to eye tracking research, and has delivered courses and seminars on the subject at international conferences. He maintains Clemson's eye tracking laboratory and teaches a regular course on eye tracking methodology attracting students from a variety of disciplines across campus.

Nina Gehrer is a clinical psychologist who has been working on her PhD thesis at the University of Tübingen, Germany, since she received her master's degree in 2015. Her main research interest lies in studying face and emotion processing using eye tracking and a preferably wide range of analytic methods. As a clinical psychologist, she is particularly interested in possible alterations related to psychological disorders. She began working with Prof. Duchowski in 2016. Since then, they have enhanced and implemented his gaze analytics pipeline in the analysis of several eye tracking studies involving face and emotion processing. Recently, they have started to extend their research to gaze patterns during social interactions.
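As an illustration of the gaze transition entropy measure mentioned in the abstract above, the following minimal sketch computes a stationary transition entropy over a sequence of AOI labels. The scanpath and AOI names are invented, and normalization conventions vary across publications, so this is not the pipeline's own implementation.

```python
# Minimal sketch: gaze transition entropy over a sequence of AOI fixation
# labels, in the usual H = -sum_i p_i * sum_j p_ij * log2(p_ij) form.
# The scanpath is invented; real analyses estimate probabilities from data.
import math
from collections import Counter

def transition_entropy(scanpath):
    transitions = Counter(zip(scanpath, scanpath[1:]))           # counts of i -> j
    out_totals = Counter(src for src, _ in transitions.elements())
    n = sum(transitions.values())
    entropy = 0.0
    for (src, dst), count in transitions.items():
        p_i = out_totals[src] / n                                # weight of source AOI i
        p_ij = count / out_totals[src]                           # conditional transition probability
        entropy -= p_i * p_ij * math.log2(p_ij)
    return entropy

print(transition_entropy(["eyes", "mouth", "eyes", "nose", "eyes", "mouth"]))
```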

Eye Tracking in the Study of Developmental Conditions: A Computer Scientist's Primer
FREDERICK SHIC, [email protected]
Tuesday, June 25, 2019, 13:30-18:00 (Red Cloud)

Abstract. Children with developmental conditions, such as autism, genetic disorders, and fetal alcohol syndrome, present with complex etiologies and can vary widely in their abilities and needs. Especially in very young children, heterogeneity across and within diagnostic categories makes uniform application of standard assessment methods, which often rely on assumptions of communicative or other abilities, difficult. Eye tracking offers a powerful tool to study both the mechanistic underpinnings of atypical development as well as facets of cognitive and attentional development that may be of clinical and prognostic value. In this tutorial we discuss the challenges and approaches associated with studying developmental conditions using eye tracking. Using autism spectrum disorder (ASD) as a model, we discuss the interplay between clinical facets of conditions and the studies and techniques used to probe neurodevelopment.

Scope and Audience. This tutorial is geared towards engineers and computer scientists who may be interested in the variety of ways eye tracking can be used in the study of developmental mechanisms or for the development of clinically-relevant methods, but does not assume deep knowledge of eye tracking hardware, algorithms, or engineering-focused literature. Similarly, the tutorial will be broadly accessible, assuming limited or no knowledge of developmental conditions, clinical research, and/or autism.

Bio. Frederick Shic, Ph.D. is an Associate Professor of Pediatrics at the University of Washington and an Investigator at Seattle Children's Research Institute's Center for Child Health, Behavior and Development. Dr. Shic has been an autism researcher for 15 years and, as a computer scientist by training, brings an interdisciplinary perspective to early developmental, therapeutic, and phenotyping research. Dr. Shic leads the Seattle Children's Innovative Technologies Laboratory (SCITL), a laboratory that develops and applies technologies such as eye tracking, functional near infrared spectroscopy, robots, mobile apps, and video games. His goals are to understand lifespan trajectories leading to heterogeneous outcomes in ASD, and to develop methods for positively intercepting these trajectories. To enable this, he focuses on big data perspectives of phenotypic variation, biomarker discovery enabled via technology, and rapid, adaptable, evolving frameworks for outcomes research applicable to diverse populations. His current and prior work, funded by NIMH, the Simons Foundation, and Autism Speaks, includes developmental, psychological, and applied autism research as well as predictive techniques. Previously, he was an engineering undergraduate at Caltech, a Sony PlayStation video game programmer, a magnetic resonance spectroscopy brain researcher, and a graduate student at Yale Computer Science's Social Robotics Lab. It was during this graduate work, when he needed child gaze patterns to program an attention system for a robot, that he began eye tracking research at the Yale Child Study Center. He continued this work as an NIMH T32 postdoc in Childhood Neuropsychiatric Disorders and then as an Assistant Professor at the Yale Child Study Center.



Doctoral Symposium

Schedule, Tuesday, June 25, 2019, Humboldt

08:00-10:00

Welcome Note - Doctoral Symposium Co-Chairs (30 minutes)
3-minute introductions: in 1 slide, introduce yourself, your educational background, collaborations, and why your work is important

10:00-10:30

Coffee break (Ballroom Foyer)

10:30-12:30

Large Group Discussions (All Together)
Discuss technical aspects of your work (5 minutes, up to 5 slides, 3 minutes for feedback and Q&A).
Each student is assigned two abstracts to review in detail.
One student will serve as moderator, introducing the speaker and topic and kicking off the discussion.
Another student will serve as scribe.

12:30-13:30

Lunch break (The Lockwood Kitchen)
Discuss the following with peers at your table, with one person taking notes (prepare 3-4 concrete questions):
What obstacles/challenges are you facing?
Do you feel your work is progressing smoothly?
What could you use guidance on?

13:30-15:30

Summary of lunch discussions (13:30-13:45)
Small Group Discussions with Faculty (13:45-15:30): three small groups of DS students meet with three groups of established researchers. Groups rotate every 30 minutes.

15:30-16:00

Coffee break (Ballroom Foyer)

16:00-17:00

Posters Fast-Forward practice run
Maximizing your conference experience
Closing Remarks

18:00

Social Event



Doctoral Symposium

Abstracts

When you don't see what you expect: incongruence in music and source code reading
Natalia Chitalkina (University of Turku)

Eye-tracking based Fatigue and Cognitive Assessment
Tanya Bafna (Technical University of Denmark) and John Paulin Hansen (Technical University of Denmark)

Pupil Diameter as a Measure of Emotion and Sickness in VR
Brendan John (University of Florida)

Accessible Control of Telepresence Robots based on Eye-Tracking
Guangtao Zhang (Technical University of Denmark)

The vision and interpretation of paintings: bottom-up visual processes, top-down culturally informed attention, and aesthetic experience
Pablo Fontoura (EHESS), Jean-Marie Schaeffer (EHESS), and Michel Menu (C2RMF)

Attentional orienting in real and virtual 360-degree environments: application to aeronautics
Rébaï Soret (ISAE-SUPAERO), Christophe Hurter (ENAC - Ecole Nationale de l'Aviation Civile), and Vsevolod Peysakhovich (ISAE)

Motion Tracking of Iris Features for Eye Tracking
Aayush Chaudhary (Rochester Institute of Technology)

Automatic quick-phase detection in bedside recordings from patients with acute dizziness and nystagmus
Sai Akanksha Punuganti (Johns Hopkins University, USA), Jing Tian (Johns Hopkins University, USA), and Jorge Otero-Millan (Johns Hopkins University, USA)

Towards a Data-driven Framework for Realistic Self-Organized Virtual Humans: Coordinated Head and Eye Movements
Zhizhuo Yang (Rochester Institute of Technology)

Looks Can Mean Achieving: Understanding Eye Gaze Patterns of …
Jonathan Saddler (University of Nebraska Lincoln)

High-Resolution Eye Tracking Using Scanning Laser Ophthalmoscopy
Norick Bowers (University of California, Berkeley)

Eye movements during reading and reading assessment in Swedish …
Andrea Strandberg (Karolinska Institute)


ETRA 2019 Long and Short Papers

Presented as Talks

Wednesday, June 26, 2019

10:00-12:00

(Humboldt)

Session 1

Session Chair: Tanja Blascheck (University of Stuttgart)

Deep learning investigation for chess player attention prediction using eye-tracking and game data
Justin Le Louedec, Thomas Guntz, James Crowley and Dominique Vaufreydaz Long

Semantic Gaze Labeling for Human-Robot Shared Manipulation

Reuben Aronson and Henny Admoni Long

Almoctar Hassoumi, Vsevolod Peysakhovich and Christophe Hurter Long

Exploring Simple Neural Network Architectures for Eye Movement Classification

Jonas Goltz, Michael Grossberg, and Ronak Etemadpour Short

Analyzing Gaze Transition Behavior Using Bayesian Mixed Effects Markov

Models

Islam Akef Ebeid, Nilavra Bhattacharya, Jacek Gwizdka and Abhra Sarkar Short

COGAIN 2019 Long and Short Papers

Presented as Talks

Wednesday, June 26, 2019

10:00-12:00 & 13:00-15:00

(Pike’s Peak)

Session 1

Session Chair: Arantxa Villanueva (Public University of Navarra, Spain)

10:00-10:15

Welcome and Brief Introduction

10:15-10:55

Invited talk Eye Tracking - From the Past to the Future [Abstract]

Heiko Drewes (University of Munich, Germany) [Biography]

11:00-11:20

A Comparative Study of Eye Tracking and Hand Controller for Aiming

Tasks in Virtual Reality

Francisco Lopez Luro (Blekinge Institute of Technology) and Veronica Sundstedt (Blekinge Institute of Technology) Long paper

11:20-11:40

Pointing by Gaze, Head, and Foot in a Head-Mounted Display

John Paulin Hansen (Technical University of Denmark), Katsumi Minakata (Technical University of Denmark), I. Scott MacKenzie (York University), Per Bækgaard (Technical University of Denmark), and Vijay Rajanna (Texas A&M University) Long paper

11:40-12:00

Hand- and Gaze-Control of Telepresence Robots

Guangtao Zhang (Technical University of Denmark), John Paulin Hansen (Technical University of Denmark), and Katsumi Minakata (Technical University of Denmark) Long paper

12:00-13:00 Lunch break (Lockwood)


Session 2

Session Chair: Scott MacKenzie

(York University, Canada)

13:00-13:20

SacCalib: Reducing Calibration Distortion for Stationary

Eye Trackers Using Saccadic Eye Movements

Michael Xuelin Huang (Max Planck Institute for Informatics) and Andreas Bulling (University of Stuttgart) Long paper

13:20-13:40

SaccadeMachine: Software for Analyzing Saccade Tests

(Anti-Saccade and Pro-saccade)

Diako Mardanbegi (Lancaster University, Lancaster, UK), Thomas Wilcockson (Lancaster University, Lancaster, UK), Pete Sawyer (Aston University, Birmingham, UK), Hans Gellersen (Lancaster University, Lancaster, UK), and Trevor Crawford (Lancaster University, Lancaster, UK) Long paper

13:40-14:00

GazeButton: Enhancing Buttons with Eye Gaze Interactions

Sheikh Radiah Rahim Rivu (Bundeswehr University Munich ), Yasmeen Abdrabou (German University in Cairo), Thomas Mayer (Ludwig Maximilian University of Munich), Ken Pfeuffer (Bundeswehr University Munich), and Florian Alt (Bundeswehr University Munich) Long paper

14:00-14:20

Impact of Variable Position of Text Prediction in Gaze-based Text Entry

Korok Sengupta (University of Koblenz-Landau), Raphael Menges (University Koblenz-Landau), Chandan Kumar (University of Koblenz-Landau), and Steffen Staab (Institut WeST, University Koblenz-Landau and WAIS, University of Southampton) Long paper

14:20-14:35

Inducing Gaze Gestures by Static Illustrations

Päivi Majaranta (Tampere University), Jari Laitinen (Tampere University), Jari Kangas (Tampere University), and Poika Isokoski (Tampere University) Short paper

14:35-15:00

Closing Session and Best COGAIN Paper Award

Session 2

Calibration, Cognition, Smartphones,

& Sequences

Session Chair: Izabela Krejtz

13:00-15:00

(Humboldt)

Gaze Behaviour on Interacted Objects during Hand Interaction

in Virtual Reality for Eye Tracking Re-calibration

Ludwig Sidenmark and Anders Lundström Long

Heiko Drewes, Ken Pfeuffer, and Florian Alt Long

Task-embedded online eye-tracker calibration for improving

robustness to head motion

Jimin Pi and Bertram E. Shi Long

Reducing Calibration Drift in Mobile Eye Trackers by

Exploiting Mobile Phone Usage

Philipp Müller, Daniel Buschek, Michael Xuelin Huang, and Andreas Bulling Long

Aiming for Quiet Eye in Biathlon

Dan Witzner Hansen, Amelie Heinrich, and Rouwen Cañal-Bruland

Long


ET4S 2019 Long and Short Papers Presented as Talks

Wednesday, June 26, 2019, 15:30-18:00

(Pike’s Peak)

Session 1: Eye Tracking for Spatial Research

Session Chair: Peter Kiefer

15:30-16:30

Eye Tracking in Mixed Reality and its Promises for Spatial Research

Sophie Stellmach Invited Talk

16:30-16:50

GeoGCD: Improved Visual Search via Gaze-Contingent Display

and Sara Irina Fabrikant Long

16:50-17:10

Eye gaze and head gaze in collaborative games

Oleg Špakov, Howell Istance, Kari-Jouko Räihä, Tiia Viitanen, and Harri Siirtola Long

17:10-17:30

Attentional orienting in virtual reality using endogenous

and exogenous cues in auditory and visual modalities

Rébaï Soret, Pom Charras, Christophe Hurter, and Vsevolod Peysakhovich Long

17:30-17:50

POITrack: Improving Map-Based Planning with Implicit POI Tracking

Fabian Göbel and Peter Kiefer Long

17:50-18:00

in a collaborative assembly task

Haofei Wang and Bertram E. Shi Short

POSTERS Fast Forward Session / ET4S

• W!NCE: Eyewear Solution for Upper Face Action Units Monitoring
• A Gaze-Based Experimenter Platform for Designing and Evaluating Adaptive Interventions in Information Visualizations
• PrivacEye: Privacy-Preserving Head-Mounted Eye Tracking Using Egocentric Scene Image and Eye Movement Features
• iLid: Eyewear Solution for Low-power Fatigue and Drowsiness Monitoring
• Get a Grip: Slippage-Robust and Glint-Free Gaze Estimation for Real-Time Pervasive Head-Mounted Eye Tracking
• Estimation of Situation Awareness Score and Performance Using Eye and Head Gaze for Human-Robot Collaboration
• When you don't see what you expect: incongruence in music and source code reading
• Eye-tracking based Fatigue and Cognitive Assessment
• Pupil Diameter as a Measure of Emotion and Sickness in VR
• Accessible Control of Telepresence Robots based on Eye-Tracking
• The vision and interpretation of paintings: bottom-up visual processes, top-down culturally informed attention, and aesthetic experience
• Attentional orienting in real and virtual 360-degree environments: application to aeronautics
• Motion Tracking of Iris Features for Eye tracking
• Automatic quick-phase detection in bedside recordings from patients with acute dizziness and nystagmus
• Towards a Data-driven Framework for Realistic Self-Organized Virtual Humans: Coordinated Head and Eye movements
• Looks Can Mean Achieving: Understanding Eye Gaze Patterns of … Comprehension
• High-Resolution Eye Tracking Using Scanning Laser Ophthalmoscopy
• Eye movements during reading and reading assessment in Swedish school children – a new …
• GazeVR: A Toolkit for Developing Gaze Interactive Applications in VR/AR
• Reading Detection in Real-time
• EyeVEIL: Degrading Iris Authentication in Eye-Tracking Headsets
• Remote Corneal Imaging by Integrating a 3D Face Model and an Eyeball Model
• Detecting cognitive bias in a relevance assessment task using an eye tracker
• TobiiGlassesPySuite: An open-source suite for using the Tobii Pro Glasses 2 in eye-tracking studies
• SeTA: Semiautomatic Tool for Annotation of Eye Tracking Images
• A Fitts' Law Study of Pupil Dilations in a Head-Mounted Display
• … Scale Replication Experiment
• Calibration-free Text Entry using Smooth Pursuit Eye Movements
• Analyzing Gaze Transition Behavior Using Bayesian Mixed Effects Markov Models
• Quantifying and Understanding the Differences in Visual Activities with Contrast Subsequences
• A Deep Learning Approach for Robust Head Pose Independent Eye movements recognition from Videos
• A Gaze Model Improves Autonomous Driving
• Inferring target locations from gaze data: A smartphone study
• Boosting Speed- and Accuracy of Gradient based Dark Pupil Tracking using Vectorization and Differential Evolution

The poster fast forward session will feature lightning talks from all ETRA short papers, doctoral symposium, videos and demos


ETRA 2019 POSTERS, PRE-FUNCTION SPACE

S5 Improving Real Time CNN-Based Pupil Detection Through Domain-Specific Data Augmentation

Shahram Eivazi (Eberhard Karls Universität Tübingen), Thiago Santini (Eberhard Karls Universität Tübingen), Alireza Keshavarzi (Eberhard Karls Universität Tübingen), Thomas Kübler (Eberhard Karls Universität Tübingen), and Andrea Mazzei (Cortical Arts GmbH)

S13 Reading Detection in Real-time

Conor Kelton (Stony Brook University), Zijun Wei (Stony Brook University), Seoyoung Ahn (Stony Brook University), Aruna Balasubramanian (Stony Brook University), Samir R. Das (Stony Brook University), Dimitris Samaras (Stony Brook University), and Gregory Zelinsky (Stony Brook University)

S26 Exploring Simple Neural Network Architectures for Eye Movement Classification

Jonas Goltz (Department of Computer Science and Mathematics, Munich University of Applied Sciences), Michael Grossberg (Department of Computer Science, City College of New York/CUNY), and Ronak Etemadpour (Department of Computer Science, City College of New York/CUNY)

S32 EyeVEIL: Degrading Iris Authentication in

Eye-Tracking Headsets

Brendan John (University of Florida), Sanjeev Koppal (University of Florida), and Eakta Jain (University of Florida)

S34 Remote Corneal Imaging by Integrating a 3D Face Model and an

Eyeball Model

Takamasa Utsu (Tokai University) and Kentaro Takemura (Tokai University)

S58 Detecting cognitive bias in a relevance assessment task using an eye

tracker

Christopher G. Harris (University of Northern Colorado)

S62

Wolfgang Fuhl (Eberhard Karls Universität Tübingen), Nora Castner (Eberhard Karls Universität Tübingen), Thomas Kübler (Eberhard Karls Universität Tübingen), Alexander Lotz (Daimler AG), Wolfgang Rosenstiel (Eberhard Karls Universität Tübingen), and Enkelejda Kasneci (University of Tübingen)

S65 TobiiGlassesPySuite: An open-source suite for using the Tobii Pro

Glasses 2 in eye-tracking studies

Davide De Tommaso (Istituto Italiano di Tecnologia) and Agnieszka Wykowska (Istituto Italiano di Tecnologia)

S70 SeTA: Semiautomatic Tool for Annotation of Eye Tracking Images

Andoni Larumbe (public university of navarra), Sonia Porta (public university of navarra), Rafael Cabeza (public university of navarra), and Arantxa Villanueva (public university of navarra)

S74 A Fitts Law Study of Pupil Dilations in a Head-Mounted Display

Per Bækgaard (Technical University of Denmark), John Paulin Hansen (Technical University of Denmark), Katsumi Minakata (Technical University of Denmark), and I. Scott MacKenzie (York University)

S83

Scale Replication Experiment

Cole Peterson (University of Nebraska - Lincoln), Nahla Abid (Kent State University), Corey Bryant (Kent State University), Jonathan Maletic (Kent State University), and Bonita Sharif (University of Nebraska - Lincoln)

S91 Calibration-free Text Entry using Smooth Pursuit Eye Movements

Yasmeen Abdrabou (German University in Cairo (GUC)), Mariam Mostafa (German University in Cairo (GUC)), Mohamed Khamis (University of Glasgow), and Amr Elmougy (German University in Cairo (GUC))

S103 Analyzing Gaze Transition Behavior Using Bayesian Mixed Effects

Markov Models

Islam Akef Ebeid (The University of Texas at Austin), Nilavra Bhattacharya (The University of Texas at Austin), Jacek Gwizdka (The University of Texas at Austin), and Abhra Sarkar (The University of Texas at Austin)

S111 Quantifying and Understanding the Differences in Visual Activities with

Contrast Subsequences

Yu Li (University of Missouri - Columbia), Carla Allen (University of Missouri - Columbia), and Chi-Ren Shyu (University of Missouri - Columbia)

S115 A Deep Learning Approach for Robust Head Pose Independent Eye

movements recognition from Videos

Remy Siegfried (Idiap Research Institute), Yu Yu (Idiap Research Institute), and Jean-Marc Odobez (Idiap Research Institute)

S119 A Gaze Model Improves Autonomous Driving

Congcong Liu (The Hong Kong University of Science and Technology), Yuying Chen (The Hong Kong University of Science and Technology), Lei Tai (The Hong Kong University of Science and Technology), Haoyang Ye (The Hong Kong University of Science and Technology), Ming Liu (The Hong Kong University of Science and Technology), and Bertram Shi (The Hong Kong University of Science and Technology)

S125 Inferring target locations from gaze data: A smartphone study

Stefanie Mueller (ZPID - Leibniz Institute of Psychology Information)

S130 Boosting Speed- and Accuracy of Gradient based Dark Pupil Tracking

using Vectorization and Differential Evolution

André Frank Krause (Mediablix IIT GmbH) and Kai Essig (Rhine-Waal University of Applied Sciences)



ETRA 2019 Demos & Videos

Wednesday, June 26, 2019

17:30-21:00

(Ellingwood A&B & Red Cloud)

A Gaze-Based Experimenter Platform for Designing and Evaluating

Adaptive Interventions in Information Visualizations

Sébastien Lallé, Cristina Conati, and Dereck Toker

Estimation of Situation Awareness Score and Performance Using Eye and Head Gaze for Human-Robot Collaboration

Lucas Paletta, Amir Dini, Cornelia Murko, Saeed Yahyanejad, and Ursula Augsdörfer

Get a Grip: Slippage-Robust and Glint-Free Gaze Estimation for Real-Time Pervasive Head-Mounted Eye Tracking

Thiago Santini, Diederick C. Niehorster, and Enkelejda Kasneci

iLid: Eyewear Solution for Low-power Fatigue and Drowsiness Monitoring

Soha Rostaminia, Addison Mayberry, Deepak Ganesan, Benjamin Marlin, and Jeremy Gummeson

PrivacEye: Privacy-Preserving Head-Mounted Eye Tracking Using Egocentric Scene Image and Eye Movement Features

Julian Steil, Marion Koelle, Wilko Heuten, Susanne Boll, and Andreas Bulling

SeTA: Semiautomatic Tool for Annotation of Eye Tracking Images

Andoni Larumbe-Bergera, Sonia Porta, Rafael Cabeza, and Arantxa Villanueva

W!NCE: Eyewear Solution for Upper Face Action Units Monitoring

Soha Rostaminia, Alexander Lamson, Subhransu Maji, Tauhidur Rahman, and Deepak Ganesan



Thursday 6.27.19

ETRA 2019 Long and Short Papers

Presented as Talks

Session 3: Visualisations & Programming Session Chair: Andrew Duchowski

Thursday, June 27, 2019, 10:00-12:00

(Oxford / Pikes Peak / Humboldt)

Eye Tracking Support for Visual Analytics Systems: Foundations, Current Applications, and Research Challenges

Nelson Silva, Tanja Blascheck, Radu Jianu, Nils Rodrigues, Daniel Weiskopf, Martin Raubal, and Tobias Schreck Long

Space-Time Volume Visualization of Gaze and Stimulus

Valentin Bruder, Kuno Kurzhals, Steffen Frey, Daniel Weiskopf, and Thomas Ertl Long

Using Developer Eye Movements to Externalize the Mental Model Used in Code Summarization Tasks

Nahla Abid, Jonathan Maletic, and Bonita Sharif Long

Visually Analyzing Eye Movements on Natural Language Texts and Source Code Snippets

Tanja Blascheck and Bonita Sharif Long

Sequence Analysis

Unaizah Obaidellah, Michael Raschke, and Tanja Blascheck Long

ETWEB 2019 Long and Short Papers

Presented as Talks

Session 1

Eye Tracking for the Web Session Chair: Chandan Kumar

Thursday, June 27, 2019 10:30 - 12:00

(Torrey’s)

Welcome and Brief Introduction - Eye Tracking for the Web: Challenges and Opportunities

Chandan Kumar, Raphael Menges, and Sukru Eraslan (Welcome address)

Interaction Graphs: Visual Analysis of Eye Movement Data from Interactive Stimuli

Michael Burch Long

Quantitative Visual Attention Prediction on Webpage Images using Multiclass SVM

Sandeep Vidyapu, Vijaya Saradhi Vedula, and Samit Bhattacharya Long

Agnieszka Ozimek, Paulina Lewandowska, Krzysztof Krejtz, Andrew T. Duchowski Short

Image, Brand and Price Info: do they always matter the same?

Mónica Cortiñas, Raquel Chocarro, and Arantxa Villanueva Long

An Interactive Web-Based Visual Analytics Tool for Detecting Strategic Eye Movement Patterns

Michael Burch, Ayush Kumar, and Neil Timmermans Long


Thursday 6.27.19

Session 4: Head-Mounted Eye Tracking Session Chair: Dan Witzner Hansen

Thursday, June 27, 2019

13:00-15:00

(Oxford / Pikes Peak / Humboldt)

Towards a low cost and high speed mobile eye tracker

Frank Borsato and Carlos Morimoto Long

Get a Grip: Slippage-Robust and Glint-Free Gaze Estimation for Real-Time Pervasive Head-Mounted Eye Tracking

Thiago Santini, Diederick Niehorster, and Enkelejda Kasneci Long

Experiments with Equirectangular Stimuli

Ioannis Agtzidis and Michael Dorr Long

headsets

Diako Mardanbegi, Christopher Clarke, and Hans Gellersen Long

ETVIS 2019 Long and Short Papers

Presented as Talks

Thursday, June 27, 2019

13:00 - 15:00 & 15:30 - 17:30

(Torrey’s)

Session 1: Visualization Tools and Techniques Session Chair: Michael Burch

13:00-15:00

Using Warped Time Distance Chart to Compare Scan-paths of Multiple Observers

Pawel Kasprowski and Katarzyna Harezlak Long

An Intuitive Visualization for Rapid Data Analysis - Using the DNA Metaphor for Eye Movement Patterns

Fabian Deitelhoff, Andreas Harrer, and Andrea Kienle Long

Iris: A Tool for Designing Contextually Relevant Gaze Visualizations

Sarah D’Angelo, Jeff Brewer, and Darren Gergle Long

Art Facing Science: Artistic Heuristics for Face Detection

Andrew Duchowski, Nina Gehrer, Michael Schoenenberg, and Krzysztof Krejtz Long


Thursday 6.27.19

Session 2: Visual Scanpath Comparison Session Chair: Pawel Kasprowski

Thursday, June 27, 2019, 15:30-17:30

(Torrey’s)

Visually Comparing Eye Movements over Space and Time

Ayush Kumar, Michael Burch, and Klaus Mueller Long

Clustered Eye Movement Similarity Matrices

Ayush Kumar, Neil Timmermans, Michael Burch, and Klaus Mueller Long

Finding the Outliers in Scanpath Data

Michael Burch, Ayush Kumar, Klaus Mueller, Titus Kervezee, Wouter Nuijten, Rens Oostenbach, Lucas Peeters, and Gijs Smit Long

General Discussion and ETVIS Best Papers Awards

Session 5: Gaze Detection and Prediction Session Chair: Andreas Bulling

Thursday, June 27, 2019, 15:30-17:30

(Oxford / Pikes Peak / Humboldt)

Characterizing Joint Attention Behavior during Real World Interactions using Automated Object and Gaze Detection

Pranav Venuprasad, Tushar Dobhal, Anurag Paul, Tu N.M. Nguyen, Andrew Gilman, Pamela Cosman, and Leanne Chukoskie Long

A Novel Gaze Event Detection Metric That Is Not Fooled by Gaze-independent Baselines

Mikhail Startsev, Stefan Göb, and Michael Dorr Long

prediction

Kai Dierkes, Moritz Kassner, and Andreas Bulling Long

Screen Corner Detection using Polarization Camera for Cross-Ratio Based Gaze Estimation

Masato Sasaki, Takashi Nagamatsu, and Kentaro Takemura Long

Guiding Gaze: Expressive Models of Reading and Face Scanning

Andrew Duchowski, Sophie Jörg, Jaret Screws, Nina Gehrer, Michael Schönenberg, and Krzysztof Krejtz Long


Friday 6.28.19

Panel - Privacy in Eye Tracking

Friday, June 28, 2019, 8:30-9:30

(Oxford / Pikes Peak / Humboldt)

Abstract

Technological advances in computing and sensing devices on the one hand, and the crucial role that eye tracking plays in near-eye displays (e.g., in VR/AR devices) and driver assistance systems on the other, are moving eye tracking into the mainstream. On the way towards a new human-machine interaction paradigm, fundamental questions have to be answered about how users want to use this new technology in the future. Discussions about ethical implications and issues of data privacy will be important for the further positive development of eye-tracking technology and its acceptance by society. Because eye tracking will become a pervasive technology, possibly affecting millions of people, its misuse has to be avoided.

This panel aims to discuss privacy questions in the eye tracking community and to offer a forum for people from the eye tracking, human-computer interaction, and other relevant communities, as well as industry, to gather and discuss how to handle eye movement data before it becomes a part of everyday life.

Panelists

Ulrica Wikström is Vice President of Sales for Tobii Tech's Specialty Markets division. She has almost 30 years of experience working in leading roles with well-known Swedish companies. She spent the last 9 years working with eye tracking at Tobii, where she has held various positions in Research and Development and Sales and has been involved in every aspect of eye tracking, from eye tracker quality and performance to end user privacy. With her extensive knowledge of eye tracking solutions and applications, Ulrica and her team help customers realize the potential of commercializing their own Tobii-powered eye tracking products in segments such as condition assessment, assistive technology, high-end surgery room products, and entertainment solutions. Ulrica earned her Master of Science at the Royal Institute of Technology in Stockholm, Sweden, where she focused on embedded systems. She still lives in Stockholm and, in her off time, enjoys being active in sports and living life to the fullest with family and friends.

Andreas Bulling is Full Professor (W3) of Computer Science at the University of Stuttgart where he holds the chair for Human-Computer Interaction and Cognitive Systems. He received his MSc. (Dipl.-Inform.) in Computer Science from the Karlsruhe Institute of Technology (KIT), Germany, focusing on embedded systems, robotics, and biomedical engineering. He holds a PhD in Information Technology and Electrical Engineering from the Swiss Federal Institute of Technology (ETH) Zurich, Switzerland. Andreas was previously a Feodor Lynen Research Fellow and a Marie Curie Research Fellow in the Computer Laboratory at the University of Cambridge, UK, a postdoctoral research associate in the School of Computing and Communications at Lancaster University, UK, as well as a Junior Research Fellow at Wolfson College, Cambridge. From 2013 – 2018 he was a Senior Researcher at the Max Planck Institute for Informatics and an Independent Research Group Leader (W2) at the Cluster of Excellence on Multimodal Computing and Interaction (MMCI) at Saarland University.

Apu Kapadia is an Associate Professor of Computer Science at the School of Informatics, Computing, and Engineering, Indiana University Bloomington. He received his Ph.D. in Computer Science from the University of Illinois at Urbana-Champaign in October 2005. For his dissertation research on trustworthy communication, he received a four-year High-Performance Computer Science Fellowship from the Department of Energy. Following his doctorate, he joined Dartmouth College as a Post-Doctoral Research Fellow with the Institute for Security Technology Studies (ISTS), and then as a Member of Technical Staff at MIT Lincoln Laboratory.

Apu Kapadia is interested in topics related to computer security and privacy. He is particularly interested in usable security and HCI; pervasive computing in the context of cameras, wearables, and IoT; and accountable anonymity. For his work on accountable anonymity, two of his papers were named as ‘Runners-up for PET Award 2009: Outstanding Research in Privacy Enhancing Technologies’. His work on privacy in the context of wearable cameras received an Honorable Mention Award at ACM CHI 2016. His work on usable privacy controls was given the ‘Honorable Mention Award (Runner-up for Best Paper)’ at the Conference on Pervasive Computing, 2007.

Ulrica Wikström (Tobii Tech)

Andreas Bulling (University of Stuttgart)

Apu Kapadia (Indiana University Bloomington)


Friday 6.28.19

Apu Kapadia has received eight NSF grants, including an NSF CAREER award in 2013, and a Google Research Award in 2014. He was also a recipient of the Indiana University Trustees Teaching Award in 2013 and a Distinguished Alumni Educator Award from the Department of Computer Science at the University of Illinois at Urbana-Champaign in 2015. For the years 2015 and 2016, he was Program Co-Chair of the Privacy Enhancing Technologies Symposium (PETS) and Co-Editor-in-Chief of the associated journal Proceedings on Privacy Enhancing Technologies (PoPETs).

Enkelejda Kasneci is an Associate Professor of Computer Science at the University of Tübingen, Germany, where she leads the Perception Engineering Group. As a BOSCH scholar, she received her M.Sc. degree in Computer Science from the University of Stuttgart in 2007. In 2013, she received her PhD in Computer Science from the University of Tübingen, Germany. For her PhD research, she was awarded the research prize of the Federation Südwestmetall in 2014. From 2013 to 2015, she was a Margarete-von-Wrangell Fellow. Dr. Kasneci's overarching and long-term vision aims at computing systems that sense and infer the user's cognitive state, actions, and intentions based on eye movements. These systems set out to provide information for assistive technologies applicable to many activities of everyday life. Towards this vision, her research combines eye tracking technology with machine learning in various multidisciplinary projects, with support from various industrial sources. In addition, she serves as an academic editor for PLOS ONE as well as a reviewer and PC member for several journals and major conferences.

Dr. Nino Zahirovic is a co-founder and the CTO of AdHawk Microsystems, where he leads a multi-disciplinary team to develop eye tracking technology for mobile, wearable, and consumer electronics devices. He received his Ph.D. in Electrical and Computer Engineering from the University of Waterloo in 2011, where he was the recipient of the Waterloo Institute for Nanotechnology Fellowship and the NSERC PGS-D Award. Following his graduate studies in RF MEMS, he joined Ignis Innovation Inc. as Senior Hardware Engineer and subsequently Director of Engineering. He was a key contributor to the development of OLED display technology that was licensed by Ignis to LG for use in its displays.

Since co-founding AdHawk and taking on the role of CTO, Nino has been focused on bringing AdHawk's vision of eye tracking as a pervasive human augmentation device to AdHawk's customers and partners. By precisely steering a beam of light across the eye thousands of times every second, AdHawk's eye tracker offers unprecedented sensing resolution within a power envelope that is a small fraction of camera-based systems, while leveraging the economies of wafer-scale manufacturing.

Lirong Xia is an associate professor in the Department of Computer Science at Rensselaer Polytechnic Institute (RPI). Prior to joining RPI in 2013, he was a CRCS fellow and NSF CI Fellow at the Center for Research on Computation and Society at Harvard University. He received his Ph.D. in Computer Science and M.A. in Economics from Duke University. His research focuses on the intersection of computer science and microeconomics. He is an associate editor of Mathematical Social Sciences and serves on the editorial boards of several journals. He has received an NSF CAREER award, a Simons-Berkeley Research Fellowship, and the 2018 Rensselaer James M. Tien '66 Early Career Award, and was named one of "AI's 10 to watch" by IEEE Intelligent Systems.

Enkelejda Kasneci (University of Tübingen)

Lirong Xia (Rensselaer Polytechnic Institute (RPI))

Nino Zahirovic (AdHawk Microsystems)


Friday 6.28.19

Moderators

Tanja Blascheck is a Post-Doctoral Researcher at the Institute for Visualization and Interactive Systems at the University of Stuttgart. Her main research areas are information visualization and visual analytics with a focus on evaluation, eye tracking, and interaction. She is interested in exploring how to effectively analyze eye tracking data with visualizations and the pervasive use of visualization on novel display technology like smartwatches. She received her Ph.D. in Computer Science from the University of Stuttgart, in 2017.

Eakta Jain is an Assistant Professor of Computer and Information Science and Engineering at the University of Florida. Her research interests are in leveraging eye tracking to access user priorities for creating algorithms, tools, and interactions in support of creativity and expressiveness for the visual medium. She received her PhD and MS degrees in Robotics from Carnegie Mellon University and her B.Tech. degree in Electrical Engineering from IIT Kanpur.

Organizers

Tanja Blascheck (University of Stuttgart)
Andreas Bulling (University of Stuttgart)
Eakta Jain (University of Florida)
Enkelejda Kasneci (University of Tübingen)
Diako Mardanbegi (AdHawk Microsystems)
Michael Raschke (Blickshift Analytics)

Panel Sponsors

The Privacy Panel was sponsored in part by:

University of Florida
Consortium on Trust in Media and Technology (University of Florida)
SFB-TRR 161 (University of Stuttgart)
University of Tübingen


Friday 6.28.19

ETRA 2019 Long and Short Papers

Presented as Talks

Session 6: Privacy, Authentication, Fitts of Skill Session Chair: Frederick Shic

Friday, June 28, 2019, 10:00-12:00

(Oxford / Pikes Peak / Humboldt)

PrivacEye: Privacy-Preserving Head-Mounted Eye Tracking Using Egocentric Scene Image and Eye Movement Features

Julian Steil, Marion Koelle, Wilko Heuten, Susanne Boll, and Andreas Bulling Long

Privacy-Aware Eye Tracking Using Differential Privacy

Julian Steil, Inken Hagestedt, Michael Xuelin Huang, and Andreas Bulling Long

Differential Privacy for Eye-Tracking Data

Ao Liu, Lirong Xia, Andrew Duchowski, Reynold Bailey, Kenneth Holmqvist, and Eakta Jain Long

Just Gaze and Wave: Exploring the Use of Gaze and Gestures for

Yasmeen Abdrabou, Mohamed Khamis, Rana Mohamed Eisa, Sherif Ismail, and Amr Elmougy Long

Assessing Surgeons' Skill Level in Laparoscopic Cholecystectomy using Eye Metrics

Nishan Gunawardena, Michael Matscheko, Bernhard Anzengruber, Alois Ferscha, Martin Schobesberger, Andreas Shamiyeh, Bettina Klugsberger, and Peter Solleder Long

ETRA 2019 Challenge Track

Friday, June 28, 2019, 10:00-12:00

(Torrey’s)

Ayush Kumar (Stony Brook University, United States), Anjul Tyagi (Stony Brook University, United States), Michael Burch (TU Eindhoven, Netherlands), Daniel Weiskopf (University of Stuttgart, Germany), and Klaus Mueller (Stony Brook University, United States)

Encodji: Encoding Gaze Data Into Emoji Space for an amusing Scanpath

Wolfgang Fuhl (University Tübingen, Germany), Efe Bozkir (University Tübingen, Germany), Benedikt Hosp (University Tübingen, Germany), Nora Castner (University Tübingen, Germany), David Geisler (University Tübingen, Germany), Thiago Santini (University Tübingen, Germany), and Enkelejda Kasneci (University Tübingen, Germany)

Towards a better description of visual exploration through temporal dynamic of ambient and focal modes

Alexandre Milisavljevic (Paris Descartes University, France), Thomas Le Bras (Paris Descartes University, France), Matei Mancas (University of Mons, Belgium), Coralie Petermann (Sublime Skinz, France), Bernard Gosselin (University of Mons, Belgium), and Karine Doré-Mazars (Paris Descartes University, France)

Understanding the Relation between Microsaccades and Pupil Dilation

Sudeep Raj (Saint Mary’s College of California, United States), Chia-Chien Wu (Harvard Medical School, Brigham and Women’s Hospital, United States), Shreya Raj (University of Washington, United States), and Nada Attar (San Jose State University, United States)



Platinum Sponsor Descriptions

Pupil Labs

We make state of the art hardware and software for eye tracking research.

Pupil Core - A platform that is comprised of an open source software suite and a wearable eye tracking headset. Pupil Core is more than just a product, it is an open platform used by a global community of researchers. Adapt Pupil Core hardware and software to your needs. Venture into new areas of inquiry. Build novel prototypes.

Pupil Invisible - We believe that great technology gets out of your way. We have taken steps towards realizing this vision in our newest product: eye tracking hardware that looks and feels like a normal pair of glasses. No setup, no adjustments, no calibration. We are just starting to roll out Pupil Invisible, and are really excited for you to start using it in your research. Get out of the lab. Build applications for everyday life.

iMotions A/S

iMotions is a software company that provides tools for the next generation of behavioral research. With our products and services, we enable a clearer and more incisive understanding of human behavior, allowing for advances to be made in any human-centric field.

Our best-in-class product, the iMotions software platform, is used worldwide to unpack human behavior and scale research in academia, business, healthcare, government, and more.

iMotions integrates multiple biosensors to uncover real human responses. The software works with any kind of biosensor, integrating eye tracking, facial expression analysis, EEG, EDA / GSR, EMG, ECG, surveys, and more, so you can move to multimodal research whenever you are ready. Data are seamlessly synchronized and presented in real time and are accompanied by analysis and export tools.

With our modular platform and suite of support and services, iMotions facilitates a smarter and faster way to achieve results for business and academia. The platform is used worldwide by Ivy League universities and some of the world's biggest companies, including Unilever, GSK, and P&G, among more than 900 global clients setting up cutting-edge labs and executing multimodal human behavioral research.

Tobii Pro

Tobii Pro provides world-leading eye tracking solutions to academic institutions and commercial companies that want to better understand human behavior. Our solutions consist of hardware, software, training, and support. Eye-tracking allows you to see how consumers react to different marketing messages and understand their cognitive engagement in real-time. It minimizes recall errors and the social desirability effect while revealing information conventional research methods normally miss. Ever since the start of the company in 2001, our mission has been to make eye-tracking as accessible as possible and to make eye-tracking easy to use for everyone. We also offer eye-tracking-based consumer research studies to customers that do not have the expertise or time to conduct the research themselves.

Gold Sponsor Descriptions

Facebook Reality Labs

Facebook Reality Labs (formerly Oculus Research) brings together a world-class team of researchers, developers, and engineers to create the future of virtual and augmented reality, which together will become as universal and essential as smartphones and personal computers are today. And just as personal computers have done over the past 45 years, AR and VR will ultimately change everything about how we work, play, and connect. We are developing all the technologies needed to enable breakthrough AR glasses and VR headsets, including optics and displays, computer vision, audio, graphics, brain-computer interface, haptic interaction, eye/hand/face/body tracking, perception science, and true telepresence. Some of those will advance much faster than others, but they all need to happen to enable AR and VR that are so compelling that they become an integral part of our lives.



VPixx Technologies

VPixx Technologies welcomes the eye tracking community to ETRA 2019, and is excited to demonstrate our TRACKPixx 2kHz binocular eye tracker. Visit us to also learn about the PROPixx DLP LED video projector, which now supports high refresh rates for the generation of precise stimuli for gaze-contingent, stereoscopic, and other dynamic applications. The PROPixx offers a resolution of 1920×1080 and a perfectly linear gamma. The solid-state LED light engine has 30x the lifetime of halogen projectors, a wider color gamut, and zero image ghosting for stereo vision or dynamic eye tracking applications. Our high-speed circular polarizer can project 480Hz stereoscopic stimuli for passive polarizing glasses into MRI and MEG environments. The TRACKPixx can be mounted under the projection screen for high speed gaze-contingent paradigms in MRI and MEG settings. In addition, the TRACKPixx includes an embedded data acquisition system, permitting microsecond synchronization between visual stimulation, eye tracking data, and other types of I/O including EEG, TMS, audio stimulation, button box input, TTL trigger output, analog acquisition, and more!

SR Research Ltd - EyeLink

Since its inception, SR Research has been dedicated to supporting and responding to the needs of the eye-tracking research community. Our cutting-edge EyeLink systems are renowned for their high sampling rates and uncompromised data quality, and are installed in behavioral labs as well as EEG/MEG/MRI environments in top research institutions around the world. With powerful data analysis software and legendary customer support, SR Research has been enabling academics to perform world class eye tracking research for over 20 years. In total, our users have published over 7000 peer-reviewed papers, with over 800 papers published in 2018 alone. We have recently been working to bring even closer integration between our eye tracking hardware and software and a wide range of neurophysiological recording devices, including EGI, BrainProducts, Neuroscan and BioSemi EEG systems, to name just a few. We are delighted to be attending ETRA, and will be demonstrating our new software solution for eye tracking in usability research settings.

FOVE

FOVE Inc. developed "FOVE0" as the world's first eye tracking VR headset and shipped the model "FOVE0" in early 2017, receiving huge support from enthusiastic backers on Kickstarter and many cooperative investors. Currently, FOVE is continuing to push the boundaries of advanced eye tracking research, and several projects are ongoing. We are hoping to meet as many people as possible and exchange thought-provoking ideas through ETRA 2019!

AdHawk Microsystems

AdHawk Microsystems produces custom silicon microsystems at the wafer-scale to enable the proliferation of eye tracking in consumer electronics products. We are a fabless semiconductor company that was spun out of the University of Waterloo to commercialize microsystems for human-computer interaction. Our team has re-imagined the architecture of conventional eye tracking systems with the goal of achieving the performance that is required for mobile applications and for the emerging AR/VR enterprise. For mobile eye tracking to become truly pervasive, we believe that order-of-magnitude improvements are needed across several dimensions - including power consumption, latency, form factor and sampling rate - without compromising accuracy and robustness. Although AdHawk's OEM modules are designed for consumer electronics products, they would also be welcomed by eye tracking researchers. At ETRA 2019, AdHawk is introducing MEMS-based eye tracking glasses that obviate the need for power-hungry cameras and computationally expensive image processing. We are seeking researchers from the ETRA community to evaluate the product through our beta kit program prior to our product launch. We welcome all attendees to visit our exhibition booth to learn more.




Smart Eye

Bridging the gap between man and machine

Smart Eye develops artificial intelligence (AI) powered eye-tracking technology that understands, assists and predicts human intentions and actions. By studying a person's eye, face and head movements, our technology can draw conclusions about an individual's alertness, attention and focus, and gain insights into a person's awareness and mental status. Today, our eye tracking technology is embedded in the next generation of vehicles, helping the automotive industry take another step towards safer and more eco-friendly transportation. Our research instruments offer unparalleled performance in complex, real-world situations, paving the way for new insights in aerospace, automotive, aviation, psychology, communication, neuroscience, medicine and clinical research.

Our systems are built to handle research situations and demands at almost every location. Because to us, tracking, measuring and analyzing human eye movements is an art. Our newest addition, the XO, is a remote eye tracking system that combines the durability of a true dual camera solution and the high performance of our Smart Eye Pro software. Connect it to any analysis software and start to analyze.

Silver Sponsor Descriptions

Gazepoint

Gazepoint brings over two decades of eye tracking design and experience to the creation of the most powerful eye tracking research systems available at an affordable price point. Gazepoint's recently released GP3HD offers the next level of eye-tracking performance with a 150Hz sampling rate, creating a more natural data collection environment and improving tracking reliability. Our recently released biometrics system now allows for easily capturing gaze data along with biometric signals such as heart rate and galvanic skin response.

Event Notes

The pool at Crowne Plaza is going through a remodel. The hotel has suggested that we use the pool at the Hilton Garden Inn (1 block west of Crowne Plaza) by showing the front desk your room key.

The hotel fitness center is equipped with treadmills, ellipticals, bikes, free weights, and balance balls.

Network: IHG Connect
Passcode: 6378

SIGCHI mobile app

The SIGCHI mobile app for #ETRA2019 is available for Android and iOS! The app displays the conference schedule and content data, and you are able to create your individual schedule, build a reading list, and add notes to the content. Start planning your schedule now!



Notes


EyeLink

SR Research

Herbert Wertheim College of Engineering, Department of Computer & Information Science & Engineering, University of Florida

Platinum Sponsors

Gold Sponsors

Silver Sponsors

Institutional Sponsors