Pupil Citation List

Year Published | Author(s) | Title | URL | PDF | Journal/Conference | Keywords
2019 | Nitish Padmanaban, Robert Konrad, Gordon Wetzstein | Autofocals: Evaluating gaze-contingent eyeglasses for presbyopes | https://advances.sciencemag.org/content/5/6/eaav6187.full | https://advances.sciencemag.org/content/5/6/eaav6187.full.pdf | Science Advances, Vol. 5, No. 6, 05 June 2019 | -
2019 | Rakshit Kothari, Zhizhuo Yang, Christopher Kanan, Reynold Bailey, Jeff Pelz, Gabriel Diaz | Gaze-in-wild: A dataset for studying eye and head coordination in everyday activities | https://arxiv.org/pdf/1905.13146.pdf | - | arXiv | Eye head coordination, in the wild, dataset
2019 | Joohwan Kim, Michael Stengel, Alexander Majercik (Nvidia), Shalini De Mello, David Dunn (UNC), Samuli Laine, Morgan McGuire, David Luebke | NVGaze: An Anatomically-Informed Dataset for Low-Latency, Near-Eye Gaze Estimation | https://research.nvidia.com/publication/2019-05_NVGaze%3A-An-Anatomically-Informed | - | ACM Conference on Human Factors in Computing Systems (CHI) 2019 | eye tracking, virtual reality, VR, neural networks, gaze estimation
2019 | Thiago Santini, Diederick C. Niehorster, Enkelejda Kasneci | Get a Grip: Slippage-Robust and Glint-Free Gaze Estimation for Real-Time Pervasive Head-Mounted Eye Tracking | http://www.ti.uni-tuebingen.de/uploads/tx_timitarbeiter/etra2019-slippage_small.pdf | - | ETRA 2019 | calibration; drift; embedded; eye; gaze estimation; open source; pervasive; pupil tracking; real-time; slippage; tracking
2019 | Viviane Clay, Peter König, Sabine U. Koenig | Eye Tracking in Virtual Reality | https://www.researchgate.net/publication/332780872_Eye_Tracking_in_Virtual_Reality | https://www.researchgate.net/profile/Viviane_Clay/publication/332780872_Eye_Tracking_in_Virtual_Reality/links/5cc95b74299bf120978bd0f6/Eye-Tracking-in-Virtual-Reality.pdf?origin=publication_detail | Journal of Eye Movement Research, 12. DOI: 10.16910/jemr.12.1.3 | Eye movement, eye tracking, virtual reality, VR, smooth pursuit, region of interest, gaze
2019 | Xi Wang, Andreas Ley, Sebastian Koch, David Lindlbauer, James Hays, Kenneth Holmqvist, Marc Alexa | The Mental Image Revealed by Gaze Tracking | https://doi.org/10.1145/3290605.3300839 | http://cybertron.cg.tu-berlin.de/xiwang/files/mi.pdf | CHI 2019, May 4–9, 2019, Glasgow, Scotland, UK | gaze pattern, mental imagery, eye tracking
2018 | Tobias Fischer, Hyung Jin Chang, Yiannis Demiris | RT-GENE: Real-Time Eye Gaze Estimation in Natural Environments | http://openaccess.thecvf.com/content_ECCV_2018/papers/Tobias_Fischer_RT-GENE_Real-Time_Eye_ECCV_2018_paper.pdf | - | ECCV 2018 | Gaze estimation, Gaze dataset, Convolutional Neural Network, Semantic inpainting, Eyetracking glasses
2018 | Stelling-Konczak, A., Vlakveld, W.P., Van Gent, P., Commandeur, J.J.F., Van Wee, G.P., Hagenzieker, M. | A study in real traffic examining glance behaviour of teenage cyclists when listening to music: Results and ethical considerations | https://doi.org/10.1016/j.trf.2018.02.031 | https://www.researchgate.net/profile/Marjan_Hagenzieker/publication/320563592_A_study_in_real_traffic_examining_glance_behaviour_of_teenage_cyclists_when_listening_to_music_Results_and_ethical_considerations/links/5ba923ab92851ca9ed225474/A-study-in-real-traffic-examining-glance-behaviour-of-teenage-cyclists-when-listening-to-music-Results-and-ethical-considerations.pdf | Transportation Research Part F: Traffic Psychology and Behaviour, Volume 55, May 2018, Pages 47-57 | Cycling safety, Music, Auditory perception, Visual attention, Visual performance, Research ethics
2018 | Trent Koessler, Harold Hill | Focusing on an illusion: Accommodating to perceived depth? | https://doi.org/10.1016/j.visres.2018.11.001 | https://reader.elsevier.com/reader/sd/pii/S0042698918302360?token=6C8E97AA7D2FC8F2FC32FC77EC3E6A7A9F22CE57F070F75B86550644FE991309A2D235248384C4C9989E47025A995919 | Vision Research, Volume 154, January 2019, Pages 131-141 | Accommodation, Convergence, Depth perception, Hollow-face illusion
2018 | Vicente Soto, John Tyson-Carr, Katerina Kokmotou, Hannah Roberts, Stephanie Cook, Nicholas Fallon, Timo Giesbrecht, Andrej Stancak | Brain Responses to Emotional Faces in Natural Settings: A Wireless Mobile EEG Recording Study | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6209651/ | - | Front. Psychol. 2018; 9: 2003 | EEG, eye-movement related potentials, N170 component, source dipole analysis, MoBI, mobile brain imaging, visual evoked potential (VEP)
2018 | Newman, Benjamin A., Aronson, Reuben M., Srinivasa, Siddhartha S., Kitani, Kris, Admoni, Henny | HARMONIC: A Multimodal Dataset of Assistive Human-Robot Collaboration | http://harp.ri.cmu.edu/harmonic/ | https://arxiv.org/pdf/1807.11154.pdf | - | Computer Science - Robotics, Computer Science - Human-Computer Interaction
2018 | Reuben M. Aronson, Thiago Santini, Thomas C. Kübler, Enkelejda Kasneci, Siddhartha Srinivasa, Henny Admoni | Eye-Hand Behavior in Human-Robot Shared Manipulation | http://harp.ri.cmu.edu/assets/pubs/hri2018_aronson.pdf | - | Human-Robot Interaction 2018 | Computer Science - Robotics, Computer Science - Human-Computer Interaction
2018 | Reuben M. Aronson, Henny Admoni | Gaze for Error Detection During Human-Robot Shared Manipulation | http://harp.ri.cmu.edu/assets/pubs/fja_rss2018_aronson.pdf | - | Joint Action Workshop at RSS 2018 | Computer Science - Robotics, Computer Science - Human-Computer Interaction
2018 | Jeff J. MacInnes, Shariq Iqbal, John Pearson, Elizabeth N. Johnson | Wearable Eye-tracking for Research: Automated dynamic gaze mapping and accuracy/precision comparisons across devices | https://www.biorxiv.org/content/early/2018/06/28/299925.abstract | https://www.biorxiv.org/content/biorxiv/early/2018/06/28/299925.full.pdf | - | -
2018 | Mohamed Khamis, Malin Eiband, Martin Zürn, Heinrich Hussmann | EyeSpot: Leveraging Gaze to Protect Private Text Content on Mobile Devices from Shoulder Surfing | https://doi.org/10.3390/mti2030045 | https://www.mdpi.com/2414-4088/2/3/45/pdf | Multimodal Technologies and Interaction, 2018, 2(3), 45 | mobile devices; privacy; gaze; eye tracking; security
2018 | Corten Singer | See-Thru: Towards Minimally Obstructive Eye-Controlled Wheelchair Interfaces | https://www2.eecs.berkeley.edu/Pubs/TechRpts/2018/EECS-2018-61.pdf | - | Master's Thesis, Technical Report No. UCB/EECS-2018-61 | Eye Gaze, Eye Tracking, Gaze Control, Eyes-Only Interaction, User Interfaces, Power Wheelchair, Smart Wheelchair, User Experience, Assistive Technology, Gaze Gestures, Field of View, Obstructive
2018 | Mingming Wang, Kate Walders, Martin E. Gordon, Jeff B. Pelz, Susan Farnand | Auto-simulator Preparation for Research into Assessing the Correlation Between Human Driving Behaviors and Fixation Patterns | https://www.ingentaconnect.com/contentone/ist/ei/2018/00002018/00000017/art00007# | http://docserver.ingentaconnect.com/deliver/connect/ist/24701173/v2018n17/s7.pdf?expires=1532320402&id=0000&titleid=72010604&checksum=5581D989E4F2D227E1AE054EEA0E51EB | Electronic Imaging, Autonomous Vehicles and Machines 2018, pp. 163-1-163-6(6) | -
2018 | Eike Langbehn, Frank Steinicke, Markus Lappe, Gregory F. Welch, Gerd Bruder | In the Blink of an Eye – Leveraging Blink-Induced Suppression for Imperceptible Position and Orientation Redirection in Virtual Reality | https://basilic.informatik.uni-hamburg.de/Publications/2018/LSLWB18/eye_blinks.pdf | - | ACM Trans. Graph., Vol. 37, No. 4, Article 66 | Human-centered computing → Virtual reality; Computing methodologies → Perception; virtual reality, eye blinks, redirected walking, psychophysical experiments
2018 | Carlos Rafael Fernandes Picanço, François Jacques Tonneau | A low-cost platform for eye-tracking research: Using Pupil© in behavior analysis | https://onlinelibrary.wiley.com/doi/abs/10.1002/jeab.448 | - | - | -
2018 | Thiago Santini, Wolfgang Fuhl, Enkelejda Kasneci | PuReST: Robust pupil tracking for real-time pervasive eye tracking | https://dl.acm.org/citation.cfm?id=3204578 | - | ETRA 2018 | -
2018 | Justus Thies, Michael Zollhofer, Marc Stamminger, Christian Theobalt, Matthias Nießner | FaceVR: Real-Time Gaze-Aware Facial Reenactment in Virtual Reality | https://dl.acm.org/citation.cfm?id=3182644 | https://arxiv.org/pdf/1610.03151.pdf | ACM Transactions on Graphics (TOG), Volume 37, Issue 2, Article No. 25 | face tracking, virtual reality, eye tracking
2018 | Li, W-C., Kearney, P., Braithwaite, G., Lin, J. | How much is too much? Visual Scan Patterns of Single Air Traffic Controller Performing Multiple Remote Tower Operations | https://doi.org/10.1016/j.ergon.2018.05.005 | - | International Journal of Industrial Ergonomics, 67, 136-144 | Air traffic management, Aviation safety, Cost-efficiency, Human-computer interactions, Multiple remote tower operations, Situation awareness
2018 | Kearney, P., Li, W-C. | Multiple Remote Tower for Single European Sky: The Evolution from Initial Operational Concept to Regulatory Approved Implementation | https://doi.org/10.1016/j.tra.2018.06.005 | - | Transportation Research Part A, 116, 15-30 | Air traffic control, Cost efficiency, Human performance, Multiple remote tower operations, Safety assessment, Single European Sky
2018 | Florian Jungwirth, Michael Haslgrübler, Alois Ferscha | Contour-Guided Gaze Gestures: Using Object Contours as Visual Guidance for Triggering Interactions | https://dl.acm.org/citation.cfm?id=3204530 | http://delivery.acm.org/10.1145/3210000/3204530/a28-jungwirth.pdf?ip=46.88.71.149&id=3204530&acc=OPEN&key=4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E6D218144511F3437&__acm__=1529603119_328af9dab57076b59c35dc9d6270f93b | ETRA 2018 | Wearable Computing; Pervasive Computing; Eye-Tracking; Gaze-based Interaction; Internet of Things
2018 | Carlos Elmadjian, Pushkar Shukla, Antonio Diaz Tula, Carlos H. Morimoto | 3D gaze estimation in the scene volume with a head-mounted eye tracker | https://dl.acm.org/citation.cfm?id=3206351 | http://delivery.acm.org/10.1145/3210000/3206351/a3-elmadjian.pdf?ip=46.88.71.149&id=3206351&acc=OPENTOC&key=4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E383ADA7593775D6F&__acm__=1529603164_193f8cbfe8ef54f156cf3d4a4f49c539 | ETRA / COGAIN 2018 | Head-mounted eye tracking, calibration, gaze estimation, 3D dataset
2018 | Michael Barz, Florian Daiber, Daniel Sonntag, Andreas Bulling | Error-Aware Gaze-Based Interfaces for Robust Mobile Gaze Interaction | https://doi.org/10.1145/3204493.3204536 | https://perceptual.mpi-inf.mpg.de/files/2018/04/barz18_etra.pdf | ETRA 2018 | Eye Tracking; Mobile Interaction; Gaze Interaction; Error Model; Error-Aware
2018 | Reuben M. Aronson, Thiago Santini, Thomas C. Kübler, Enkelejda Kasneci, Siddhartha Srinivasa, Henny Admoni | Eye-Hand Behavior in Human-Robot Shared Manipulation | https://dl.acm.org/citation.cfm?id=3171287 | https://www.ri.cmu.edu/wp-content/uploads/2018/01/hri2018_aronson.pdf | Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, March 2018 | human-robot interaction, eye gaze, eye tracking, shared autonomy, nonverbal communication
2018 | Mikko Kytö, Barrett Ens, Thammathip Piumsomboon, Gun A. Lee, Mark Billinghurst | Pinpointing: Precise Head- and Eye-Based Target Selection for Augmented Reality | https://dl.acm.org/citation.cfm?id=3173655 | https://www.researchgate.net/profile/Mikko_Kytoe/publication/323970135_Pinpointing_Precise_Head-_and_Eye-Based_Target_Selection_for_Augmented_Reality/links/5ab559da0f7e9b68ef4cf26a/Pinpointing-Precise-Head-and-Eye-Based-Target-Selection-for-Augmented-Reality.pdf | CHI 2018 | Eye tracking; gaze interaction; refinement techniques; target selection; augmented reality; head-worn display
2018 | Leanne Chukoskie, Shengyao Guo, Eric Ho, Yalun Zheng, Qiming Chen, Vivian Meng, John Cao, Nikhita Devgan, Si Wu, Pamela C. Cosman | Quantifying Gaze Behavior during Real World Interactions using Automated Object, Face, and Fixation Detection | https://ieeexplore.ieee.org/abstract/document/8328848/ | - | IEEE Transactions on Cognitive and Developmental Systems | eye-tracking, gaze behavior, face detection, computer vision
2018 | Kai Dierkes, Moritz Kassner, Andreas Bulling | A novel approach to single camera, glint-free 3D eye model fitting including corneal refraction | https://perceptual.mpi-inf.mpg.de/files/2018/04/dierkes18_etra.pdf | - | ETRA 2018 | Eye tracking, refraction, 3D eye model, pupil detection, contour-based, glint-free
2018 | Julian Steil, Michael Xuelin Huang, Andreas Bulling | Fixation Detection for Head-Mounted Eye Tracking Based on Visual Similarity of Gaze Targets | https://perceptual.mpi-inf.mpg.de/files/2018/04/steil18_etra.pdf | - | ETRA 2018 | Visual focus of attention; Mobile eye tracking; Egocentric vision
2018 | Thomas Kosch, Mariam Hassib, Daniel Buschek, Albrecht Schmidt | Look into my Eyes: Using Pupil Dilation to Estimate Mental Workload for Task Complexity Adaptation | https://dl.acm.org/citation.cfm?id=3188643 | - | CHI EA 2018 | Cognition-Aware Interfaces; Workload-Aware Computing; Pupil Dilation; Eye Tracking
2018 | Damian Almaraz, Brock Carlson, Hieu-Trung Vu, Jeremy Loebach | Pupillometry as a measure of auditory cognitive processes and listening effort | https://doi.org/10.1121/1.5035727 | - | The Journal of the Acoustical Society of America 143, 1751 (2018) | -
2018 | Iuliia Brishtel, Shoya Ishimaru, Olivier Augereau, Koichi Kise, Andreas Dengel | Assessing Cognitive Workload on Printed and Electronic Media using Eye-Tracker and EDA Wristband | https://dl.acm.org/citation.cfm?id=3180354 | https://www.dropbox.com/s/r807qwnvkz6v0lj/IUI2018Iuliia.pdf?raw=1 | IUI '18 Companion: Proceedings of the 23rd International Conference on Intelligent User Interfaces Companion | E-learning; Eye-Tracking; Reading; Information Processing; Electrodermal Activity; Cognitive Workload; User-Interface
2018 | Mirko Raković, Nuno Duarte, Jovica Tasevski, José Santos-Victor, Branislav Borovac | A dataset of head and eye gaze during dyadic interaction task for modeling robot gaze behavior | https://doi.org/10.1051/matecconf/201816103002 | https://www.matec-conferences.org/articles/matecconf/pdf/2018/20/matecconf_erzr2018_03002.pdf | 13th International Scientific-Technical Conference on Electromechanics and Robotics "Zavalishin's Readings" 2018 | -
2018 | Mohamed Khamis, Carl Oechsner, Florian Alt, Andreas Bulling | VRPursuits: Interaction in Virtual Reality using Smooth Pursuit Eye Movements | https://doi.org/10.1145/3206505.3206522 | https://perceptual.mpi-inf.mpg.de/files/2018/04/khamis18_avi.pdf | AVI 2018 | Eye Tracking, Virtual Reality, Gaze Interaction, Pursuits
2018 | Ruimin Li, Bin Li, Shixiong Zhang, Hong Fu, Wai-Lun Lo, Jie Yu, Cindy H.P. Sit, Desheng Wen | Evaluation of the fine motor skills of children with DCD using the digitalised visual-motor tracking system | https://ieeexplore.ieee.org/abstract/document/8316738/ | https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=8316738 | The Journal of Engineering, Volume 2018, Issue 2 | developmental coordination disorder, hand movement, eye gaze position, eye tracker, digitalised visual-motor tracking system, DCD, children, fine motor skills
2018 | Nuno Duarte, Jovica Tasevski, Moreno Coco, Mirko Raković, José Santos-Victor | Action Anticipation: Reading the Intentions of Humans and Robots | https://arxiv.org/abs/1802.02788 | https://arxiv.org/pdf/1802.02788 | - | -
2018 | Julian Steil, Philipp Müller, Yusuke Sugano, Andreas Bulling | Forecasting User Attention During Everyday Mobile Interactions Using Device-Integrated and Wearable Sensors | https://arxiv.org/abs/1801.06011.pdf | https://arxiv.org/pdf/1801.06011.pdf | MobileHCI 2018 | Egocentric Vision; Handheld Mobile Device; Attention Shifts; Mobile Eye Tracking; Attentive User Interfaces
2018 | Christian Lander, Marco Speicher, Frederic Kerber, Antonio Krüger | Towards Fixation Extraction in Corneal Imaging Based Eye Tracking Data | https://doi.org/10.1145/3170427.3188597 | - | CHI '18 Extended Abstracts, April 21–26, 2018 | Corneal Imaging; Fixation; Image Stitching
2018 | Sylvain Pauchet, Catherine Letondal, Jean-Luc Vinot, Mickaël Causse, Mathieu Cousy, Valentin Becquet, Guillaume Crouzet | GazeForm: Dynamic Gaze-adaptive Touch Surface for Eyes-free Interaction in Airliner Cockpits | https://www.researchgate.net/profile/Mickael_Causse/publication/324561583_GazeForm_Dynamic_Gaze-adaptive_Touch_Surface_for_Eyes-free_Interaction_in_Airliner_Cockpits/links/5ad5a8fda6fdcc293580bcdf/GazeForm-Dynamic-Gaze-adaptive-Touch-Surface-for-Eyes-free-Interaction-in-Airliner-Cockpits.pdf | - | DIS 2018 (Hong Kong) | Eyes-free interaction; touchscreens; TEI; shape-changing interfaces; adaptive interfaces; eye-tracking; critical contexts
2018 | Paul Schydlo, Mirko Rakovic, Lorenzo Jamone, Jose Santos-Victor | Anticipation in Human-Robot Cooperation: A recurrent neural network approach for multiple action sequences prediction | https://arxiv.org/abs/1802.10503.pdf | https://arxiv.org/pdf/1802.10503.pdf | - | -
2018 | Adithya B, Lee Hanna, Pavan Kumar B N, Youngho Chai | Calibration Techniques and Gaze Accuracy Estimation in Pupil Labs Eye Tracker | http://www.dbpia.co.kr/Journal/ArticleDetail/NODE07404419 | - | TECHART: Journal of Arts and Imaging Science, Vol. 5, No. 1 | calibration
2018 | T.A.B. de Boer, J. Hoogmoed, N.M. Looye, J.R.P. van der Toorn, R.P. de Vos, J. Stapel, P. Bazilinskyy, J.C.F. de Winter | Combining eye-tracking with semantic scene labelling in a car | https://www.researchgate.net/profile/Joost_De_Winter/publication/322293911_Combining_eye-tracking_with_semantic_scene_labelling_in_a_car/links/5a5136d0aca2725638c592c0/Combining-eye-tracking-with-semantic-scene-labelling-in-a-car.pdf | - | Working paper, January 2018 | Computer Vision, Semantic Scene Understanding, Object Classification, Driving, Transportation
2018 | Ľuboš Hládek, Bernd Porr, W. Owen Brimijoin | Real-time estimation of horizontal gaze angle by saccade integration using in-ear electrooculography | http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0190420 | http://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0190420&type=printable | PLoS ONE 13(1) | Electrooculography, Eye gaze angle, Saccade
2018 | Martina Truschzinski, Alberto Betella, Guido Brunnett, Paul F.M.J. Verschure | Emotional and cognitive influences in air traffic controller tasks: An investigation using a virtual environment? | https://www.sciencedirect.com/science/article/pii/S0003687017302855 | - | Applied Ergonomics, Volume 69 | Air traffic control, Personality, Workload, Mood, Virtual reality
2018 | Yasmeen Abdrabou, Khaled Kassem, Jailan Salah, Reem El-Gendy, Mahesty Morsy, Yomna Abdelrahman, Slim Abdennadher | Exploring the Usage of EEG and Pupil Diameter to Detect Elicited Valence | https://link.springer.com/chapter/10.1007/978-3-319-73888-8_45 | - | Intelligent Human Systems Integration | EEG, Eye tracker, Affective computing
2018 | Sebastiaan Mathôt, Jasper Fabius, Elle Van Heusden, Stefan Van der Stigchel | Safe and sensible preprocessing and baseline correction of pupil-size data | https://link.springer.com/article/10.3758/s13428-017-1007-2 | https://link.springer.com/content/pdf/10.3758%2Fs13428-017-1007-2.pdf | Behavior Research Methods (January 2018) | Pupillometry, Pupil size, Baseline correction, Research methods
2018 | Rahool Patel, Adrian Zurca | 396: I See What You See Adding Eye-Tracking To Medical Simulation | http://journals.lww.com/ccmjournal/Fulltext/2018/01001/396___I_SEE_WHAT_YOU_SEE_ADDING_EYE_TRACKING_TO.362.aspx | http://pdfs.journals.lww.com/ccmjournal/2018/01001/396___I_SEE_WHAT_YOU_SEE_ADDING_EYE_TRACKING_TO.362.pdf?token=method|ExpireAbsolute;source|Journals;ttl|1516159040898;payload|mY8D3u1TCCsNvP5E421JYK6N6XICDamxByyYpaNzk7FKjTaa1Yz22MivkHZqjGP4kdS2v0J76WGAnHACH69s21Csk0OpQi3YbjEMdSoz2UhVybFqQxA7lKwSUlA502zQZr96TQRwhVlocEp/sJ586aVbcBFlltKNKo+tbuMfL73hiPqJliudqs17cHeLcLbV/CqjlP3IO0jGHlHQtJWcICDdAyGJMnpi6RlbEJaRheGeh5z5uvqz3FLHgPKVXJzd9ia1/MJJUFmWp1b9urv13G3AW53fk6CUMZKseFCONh0=;hash|4XIUop/VFLFVP+lXHDRboA== | Critical Care Medicine, Volume 46, Issue 1 (January 2018) | Medical simulation
2018 | Julian Steil, Marion Koelle, Wilko Heuten, Susanne Boll, Andreas Bulling | PrivacEye: Privacy-Preserving First-Person Vision Using Image Features and Eye Movement Analysis | https://arxiv.org/abs/1801.04457 | https://arxiv.org/pdf/1801.04457.pdf | - | Egocentric Vision; Eye Tracking; Gaze Behaviour
2017 | Wen-Chin Li, Jiaqi Cao, Jr-Hung Lin, Graham Braithwaite, Matthew Greaves | The Evaluation of Pilot's First Fixation and Response Time to Different Design of Alerting Messages | https://pdfs.semanticscholar.org/f9e6/e1c8b5194c4e7b7565bf346e8192d7289803.pdf | - | AG 2017 | Cockpit design, Crew Alerting System, Eye movement, Human-computer interaction, Quick Reference Handbook
2017 | Henny Admoni, Siddhartha Srinivasa | Eye Gaze Reveals Intentions in Shared Autonomy | http://intentions.xyz/wp-content/uploads/2017/01/Admoni.intentions-eyegaze.reduced.pdf | - | Proceedings of the Intentions in HRI Workshop at HRI 2017 (Intentions in HRI '17), Vienna, Austria, March 2017 | human-robot interaction, intentions, nonverbal behavior, eye gaze
2017 | Marco Filippucci, Fabio Bianconi, Elisa Bettollini, Michela Meschini, Marco Seccaroni | Survey and Representation for Rural Landscape. New Tools for New Strategies: The Example of Campello Sul Clitunno | http://www.mdpi.com/2504-3900/1/9/934/pdf | - | - | landscape and image; perception; eye tracking; algorithmic spatial analysis; participation
2017 | Brendan John | A Dataset of Gaze Behavior in VR Faithful to Natural Statistics | http://scholarworks.rit.edu/cgi/viewcontent.cgi?article=10716&context=theses | - | RIT Student MS Thesis | -
2017 | Jork Stapel, Freddy Antony Mullakkal-Babu, Riender Happee | Driver behaviour and workload in an on-road automated vehicle | https://www.researchgate.net/profile/Jork_Stapel2/publication/318702364_Driver_Behavior_and_Workload_in_an_On-road_Automated_Vehicle/links/597887fa45851570a1b9623a/Driver-Behavior-and-Workload-in-an-On-road-Automated-Vehicle.pdf | - | Preprint | Automated Driving; On-road; Workload; Experience; Underload
2017 | Iuliia Kotseruba, John K. Tsotsos | STAR-RT: Visual attention for real-time video game playing | https://arxiv.org/abs/1711.09464 | https://arxiv.org/pdf/1711.09464.pdf | arXiv preprint | STAR; Cognitive Programs; visual attention; Visual Routines; real-time vision; platform video games; game AI
2017 | Qi Sun, Fu-Chung Huang, Joohwan Kim, Li-Yi Wei, David Luebke, Arie Kaufman | Perceptually-Guided Foveation for Light Field Displays | http://research.nvidia.com/publication/2017-11_Perceptually-Guided-Foveation-for | http://research.nvidia.com/sites/default/files/publications/c121-f121_199-a18-paperfinal-v3.pdf | ACM Trans. Graph. 36, 6, Article 192 (November 2017) | light field, computational display, foveation, sampling
2017 | Changwon Jang, Kiseung Bang, Seokil Moon, Jonghyun Kim, Seungjae Lee, Byoungho Lee | Retinal 3D: Augmented Reality Near-Eye Display Via Pupil-Tracked Light Field Projection on Retina | http://oeqelab.snu.ac.kr/retinal3d | http://oeqelab.snu.ac.kr/?module=file&act=procFileDownload&file_srl=74430&sid=cfde647ef4e59a0adb7f07197f53b84e&module_srl=74407 | SIGGRAPH 2017, Vol. 36, Issue 6, Article No. 190 | Near-eye display, eye tracking, computational displays, holographic optical element, vergence-accommodation conflict
2017 | Khaled Kassem, Jailan Salah, Yasmeen Abdrabou, Mahesty Morsy, Reem El-Gendy, Yomna Abdelrahman, Slim Abdennadher | DiVA: Exploring the Usage of Pupil Diameter to Elicit Valence and Arousal | https://www.researchgate.net/profile/Yomna_Abdelrahman/publication/321783827_DiVA_exploring_the_usage_of_pupil_di_ameter_to_elicit_v_alence_and_a_rousal/links/5a328a17aca27271444f3689/DiVA-exploring-the-usage-of-pupil-di-ameter-to-elicit-v-alence-and-a-rousal.pdf | - | - | Arousal; Valence; Eye Tracker; Pupil Diameter
2017 | Benjamin Hatscher, Maria Luz, Lennart E. Nacke, Norbert Elkmann, Veit Muller, Christian Hansen | GazeTap: Towards Hands-Free Interaction in the Operating Room | https://www.researchgate.net/profile/Christian_Hansen17/publication/320372375_GazeTap_Towards_Hands-Free_Interaction_in_the_Operating_Room/links/59f7498e0f7e9b553ebedb6d/GazeTap-Towards-Hands-Free-Interaction-in-the-Operating-Room.pdf | - | International Conference on Multimodal Interaction | Input techniques, multimodal interaction, foot input, gaze input, eye tracking, gaze-foot interaction, HCI in the operating room
2017 | Christian Lander, Sven Gehring, Markus Löchtefeld, Andreas Bulling, Antonio Krüger | EyeMirror: Mobile Calibration-Free Gaze Approximation using Corneal Imaging | https://perceptual.mpi-inf.mpg.de/files/2017/11/lander17_mum.pdf | - | MUM 2017, November 26–29, 2017 | Corneal Image; mobile device; gaze approximation; feature tracking; pervasive
2017 | Zhong Zhao, Robin N. Salesse, Ludovic Marin, Mathieu Gueugnon, Benoit G. Bardy | Likability's Effect on Interpersonal Motor Coordination: Exploring Natural Gaze Direction | https://www.frontiersin.org/articles/10.3389/fpsyg.2017.01864/full | https://www.frontiersin.org/articles/10.3389/fpsyg.2017.01864/pdf | Frontiers in Psychology | interpersonal motor coordination, likability, finger-tapping, markers
2017 | Ken Pfeuffer, Benedikt Mayer, Diako Mardanbegi, Hans Gellersen | Gaze + Pinch Interaction in Virtual Reality | https://dl.acm.org/citation.cfm?id=3132180 | https://kenpfeuffer.files.wordpress.com/2015/02/p99-pfeuffer.pdf | SUI '17, October 16–17, 2017 | Gaze; pinch; freehand gesture; interaction technique; multimodal interface; menu; eye tracking; virtual reality
2017 | Augusto Esteves, David Verweij, Liza Suraiya, Rasel Islam, Youryang Lee, Ian Oakley | SmoothMoves: Smooth Pursuits Head Movements for Augmented Reality | https://www.researchgate.net/publication/320571212_SmoothMoves_Smooth_Pursuits_Head_Movements_for_Augmented_Reality | https://www.researchgate.net/profile/Islam_Md_Rasel/publication/320571212_SmoothMoves_Smooth_Pursuits_Head_Movements_for_Augmented_Reality/links/5a0255c1aca2720df3ca03d2/SmoothMoves-Smooth-Pursuits-Head-Movements-for-Augmented-Reality.pdf | ACM Symposium on User Interface Software and Technology (UIST) | Wearable computing, eye tracking, augmented reality, AR, input technique, smooth pursuits, motion matching, HMD
2017 | Yun Suen Pai, Benjamin Outram, Benjamin Tag, Megumi Isogai, Daisuke Ochi, Kai Kunze | GazeSphere: Navigating 360-Degree-Video Environments in VR Using Head Rotation and Eye Gaze | http://dl.acm.org/citation.cfm?id=3102183 | - | ACM SIGGRAPH Posters | Virtual reality, 360-degree-video, eye tracking, orbital navigation
2017 | Mihai Bâce, Philippe Schlattner, Vincent Becker, Gábor Sörös | Facilitating Object Detection and Recognition through Eye Gaze | http://www.vs.inf.ethz.ch/publ/papers/mbace_MobileHCI2017_workshop.pdf | - | MobileHCI '17 Workshops, September 04–09, 2017 | Eye Tracking; Eye Gaze; Wearable Computing; HCI
2017 | Otto Lappi, Paavo Rinkkala, Jami Pekkanen | Systematic Observation of an Expert Driver's Gaze Strategy—An On-Road Case Study | https://doi.org/10.3389/fpsyg.2017.00620 | https://www.frontiersin.org/articles/10.3389/fpsyg.2017.00620/full | Front. Psychol., 27 April 2017 | -
2017 | Ana Serrano, Vincent Sitzmann, Jaime Ruiz-Borau, Gordon Wetzstein, Diego Gutierrez, Belen Masia | Movie Editing and Cognitive Event Segmentation in Virtual Reality Video | https://doi.org/10.1145/3072959.3073668 | http://webdiis.unizar.es/~aserrano/docs/Serrano_SIGG2017_VR-cine.pdf | ACM Transactions on Graphics, Vol. 36, No. 4, Article 47, July 2017 | VR, Human Centered Computing, Immersive environments, cinematography
2017 | Guillem Torrente Marti | Mobility for the severely disabled: a head-controlled wheelchair | https://repositori.upf.edu/bitstream/handle/10230/32897/Torrente_2017.pdf?sequence=1&isAllowed=y | - | Bachelor's Thesis | Accessibility, Eye Tracking
2017 | Francesco Walker, Berno Bucker, Nicola C. Anderson, Daniel Schreij, Jan Theeuwes | Looking at paintings in the Vincent Van Gogh Museum: Eye movement patterns of children and adults | https://doi.org/10.1371/journal.pone.0178912 | http://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0178912&type=printable | PLoS ONE | -
2017 | Jason Orlosky, Yuta Itoh, Maud Ranchet, Kiyoshi Kiyokawa, John Morgan, Hannes Devos | Emulation of Physician Tasks in Eye-Tracked Virtual Reality for Remote Diagnosis of Neurodegenerative Disease | https://www.researchgate.net/publication/313022699_Emulation_of_Physician_Tasks_in_Eye-tracked_Virtual_Reality_for_Remote_Diagnosis_of_Neurodegenerative_Disease | https://www.researchgate.net/profile/Jason_Orlosky/publication/313022699_Emulation_of_Physician_Tasks_in_Eye-tracked_Virtual_Reality_for_Remote_Diagnosis_of_Neurodegenerative_Disease/links/58d352f6a6fdccd24d43c710/Emulation-of-Physician-Tasks-in-Eye-tracked-Virtual-Reality-for-Remote-Diagnosis-of-Neurodegenerative-Disease.pdf | IEEE Transactions on Visualization and Computer Graphics | Virtual reality, eye tracking, diagnosis, visualization
2017 | Enkelejda Kasneci, Alex A. Black, Joanne M. Wood | Eye-Tracking as a Tool to Evaluate Functional Ability in Everyday Tasks in Glaucoma | https://www.hindawi.com/journals/joph/2017/6425913/abs/ | http://downloads.hindawi.com/journals/joph/2017/6425913.pdf | Journal of Ophthalmology | -
2017 | Andrew L. Kun, Hidde van der Meulen, Christian P. Janssen | Calling While Driving: An Initial Experiment With Hololens | https://www.ris.uu.nl/ws/files/31229478/DA2017HoloLens.pdf | - | International Driving Symposium on Human Factors in Driver Assessment, Training and Vehicle Design | Hololens
2017 | Ngu Nguyen, Stephan Sigg | PassFrame: Generating image-based passwords from egocentric videos | http://ieeexplore.ieee.org/abstract/document/7917518/authors | - | Pervasive Computing and Communications Workshops | Authentication, Videos, Image segmentation, Cameras, Visualization, Conferences, Pervasive computing
2017 | Timothy Stapleton, Helen Sumin Koo | Bicyclist biomotion visibility aids: a 3D eye-tracking analysis | http://www.emeraldinsight.com/doi/abs/10.1108/IJCST-05-2016-0060 | - | International Journal of Clothing Science and Technology | Design, Visibility, Eye-tracking, Bicyclists, Biomotion
2017 | Yuta Itoh, Jason Orlosky, Kiyoshi Kiyokawa, Toshiyuki Amano, Maki Sugimoto | Monocular focus estimation method for a freely-orienting eye using Purkinje-Sanson images | http://ieeexplore.ieee.org/abstract/document/7892252/ | - | IEEE Virtual Reality (VR) | Cameras, Three-dimensional displays, Lighting, Estimation, Measurement by laser beam, Solid modeling, Two dimensional displays
2017 | Thammathip Piumsomboon, Gun Lee, Robert W. Lindeman, Mark Billinghurst | Exploring natural eye-gaze-based interaction for immersive virtual reality | http://ieeexplore.ieee.org/abstract/document/7893315/ | - | IEEE Symposium on 3D User Interfaces (3DUI) | Erbium, Gaze tracking, Resists, Painting, Electronic mail, Two dimensional displays, Portable computers
2017 | Leanne Chukoskie, Jacqueline Nguyen, Jeanne Townsend | Gaze-contingent Games for Neurocognitive Therapy: More than Meets the Eye? | http://cogain2017.cogain.org/camready/talk2-Chukoskie.pdf | - | COGAIN 2017 (Talk) | attention, eye tracking, gaze-contingent, training, usability, video games
2017 | Carlos E. L. Elmadjian, Antonio Diaz-Tula, Fernando O. Aluani, Carlos H. Morimoto | Gaze interaction using low-resolution images at 5 FPS | http://cogain2017.cogain.org/camready/talk5-Elmadjian.pdf | - | COGAIN 2017 (Talk) | -
2017 | David Dunn, Cary Tippets, Kent Torell, Petr Kellnhofer, Kaan Aksit, Piotr Didyk, Karol Myszkowski, David Luebke, Henry Fuchs | Wide Field Of View Varifocal Near-Eye Display Using See-Through Deformable Membrane Mirrors | http://ieeexplore.ieee.org/abstract/document/7829412/ | http://telepresence.web.unc.edu/files/2017/01/Dunn_2017_TVCG_MembraneAR.pdf | IEEE Transactions on Visualization and Computer Graphics | Augmented reality, displays, focus accommodation, perception, user study
2017 | Christian Lander, Frederik Wiehr, Nico Herbig, Antonio Krüger, Markus Löchtefeld | Inferring Landmarks for Pedestrian Navigation from Mobile Eye-Tracking Data and Google Street View | http://dl.acm.org/citation.cfm?doid=3027063.3053201 | http://umtl.dfki.de/~fred/papers/final-ea2721-lander.pdf | CHI Extended Abstracts on Human Factors in Computing Systems | Landmarks, Eye Tracking, Google Street View, Navigation
2017 | Michaela Klauck, Yusuke Sugano, Andreas Bulling | Noticeable or Distractive?: A Design Space for Gaze-Contingent User Interface Notifications | http://dl.acm.org/citation.cfm?doid=3027063.3053085 | http://delivery.acm.org/10.1145/3060000/3053085/ea1779-klauck.pdf?ip=14.207.3.246&id=3053085&acc=OPEN&key=4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E6D218144511F3437&CFID=760488783&CFTOKEN=40502403&__acm__=1494304622_07758cbfc2c218d89f5a8aa9c6232cb7 | CHI Extended Abstracts on Human Factors in Computing Systems | Interruptions, Attentive User Interfaces, Eye Tracking, Public Display, Peripheral Display
2017 | Michael Barz, Peter Poller, Daniel Sonntag | Evaluating Remote and Head-worn Eye Trackers in Multi-modal Speech-based HRI | http://dl.acm.org/citation.cfm?id=3038367 | - | HRI (Human-Robot Interaction) | -
2017 | Martina Truschzinski | Modeling Workload: A System Theory Approach | http://dl.acm.org/citation.cfm?id=3038408 | http://dl.acm.org/ft_gateway.cfm?id=3038408&ftid=1850535&dwn=1&CFID=744538827&CFTOKEN=85983821 | HRI (Human-Robot Interaction) | -
2017 | Mark Billinghurst, Kunal Gupta, Masai Katsutoshi, Youngho Lee, Gun Lee, Kai Kunze, Maki Sugimoto | Is It in Your Eyes? Explorations in Using Gaze Cues for Remote Collaboration | http://link.springer.com/chapter/10.1007/978-3-319-45853-3_9 | - | Book chapter (Springer): Collaboration Meets Interactive Spaces | -
2017 | Patrick Renner, Thies Pfeiffer | Attention Guiding Techniques using Peripheral Vision and Eye Tracking for Feedback in Augmented-Reality-based Assistance Systems | http://ieeexplore.ieee.org/abstract/document/7893338/ | https://pub.uni-bielefeld.de/download/2908162/2908197 | IEEE Symposium on 3D User Interfaces (3DUI) | HCI, Information interfaces, Presentation
2017 | Avi Caspi, Arup Roy, Jessy D. Dorn, Robert J. Greenberg | Retinotopic to Spatiotopic Mapping in Blind Patients Implanted With the Argus II Retinal Prosthesis | http://iovs.arvojournals.org/article.aspx?articleid=2597840#149774812 | - | Investigative Ophthalmology & Visual Science (IOVS) | -
2017 | Hevesi, P., Ward, J.A., Amiraslanov, O., Pirkl, G., Lukowicz, P. | Analysis of the Usefulness of Mobile Eyetracker for the Recognition of Physical Activities | http://discovery.ucl.ac.uk/10039438/ | http://discovery.ucl.ac.uk/10039438/8/Ward_2017%20Analysis%20of%20the%20Usefulness%20of%20Mobile%20Eyetracker%20for%20the%20Recognition%20of%20Physical%20Activities_.pdf | UBICOMM 2017 | Eyetracker; activity recognition; sensor fusion
2017 | Thiago Santini, Wolfgang Fuhl, David Geisler, Enkelejda Kasneci | EyeRecToo: Open-Source Software for Real-Time Pervasive Head-Mounted Eye-Tracking | https://www.google.co.th/url?sa=t&rct=j&q=&esrc=s&source=web&cd=2&cad=rja&uact=8&ved=0ahUKEwiIkNH1yKzRAhUHPI8KHbiPBREQFgggMAE&url=http%3A%2F%2Fwww.ti.uni-tuebingen.de%2FWolfgang-Fuhl.1651.0.html&usg=AFQjCNE23Vekm0hOdVgjqHV3xzmm45-71A&sig2=yEdTm2nkUSk9eFkWAdx0Mw | http://www.ti.uni-tuebingen.de/uploads/tx_timitarbeiter/main.pdf | - | Eye movements, pupil detection, calibration, gaze estimation, open-source, eye tracking, data acquisition, human-computer interaction, real-time, pervasive
2017 | Aleksandar Rodić, Theodor Borangiu | Advances in Robot Design and Intelligent Control: Proceedings of the 25th Conference on Robotics in Alpe-Adria-Danube Region (RAAD16) | https://books.google.co.th/books?hl=en&lr=&id=QCecDQAAQBAJ&oi=fnd&pg=PA378&dq=pupil+labs&ots=dRoZH2Qv8Y&sig=S-RoCdSwNeQO5wxwHCikjZTojJ0&redir_esc=y#v=onepage&q=pupil%20labs&f=false | - | Proceedings from Robotics in Alpe-Adria-Danube Region (RAAD) | -
2016 | Miika Toivanen | An advanced Kalman filter for gaze tracking signal | https://www.researchgate.net/publication/287507544_An_advanced_Kalman_filter_for_gaze_tracking_signal | https://www.researchgate.net/profile/Miika_Toivanen/publication/287507544_An_advanced_Kalman_filter_for_gaze_tracking_signal/links/5a620e3e0f7e9b6b8fd4248f/An-advanced-Kalman-filter-for-gaze-tracking-signal.pdf | Biomedical Signal Processing and Control | Gaze tracking, Kalman filter, Principal component analysis, Image analysis
2016 | Florian Alt, Sarah Torma, Daniel Buschek | Don't Disturb Me – Understanding Secondary Tasks on Public Displays | https://pdfs.semanticscholar.org/d6f4/dab5af8e61f1167d84ac2e1c44e5e5518ec3.pdf | - | International Symposium on Pervasive Displays (PERDIS) | public display, secondary task performance, mental workload, parallel-task environment
2016 | Carolina Rodriguez-Paras, Shiyan Yang, Thomas K. Ferris | Using Pupillometry to Indicate the Cognitive Redline | http://journals.sagepub.com/doi/pdf/10.1177/1541931213601157 | - | Proceedings of the Human Factors and Ergonomics Society Annual Meeting | -
2016 | Ondřej Kubánek | The Impact Analysis of the Technical Devices Used by the Driver on the Reaction Time | https://dspace.vutbr.cz/bitstream/handle/11012/63302/DP%20Kub%C3%A1nek.pdf?sequence=2 | - | PhD thesis (?) | Driver, reaction time, legal requirements, calling
2016 | Yun Suen Pai | Physiological Signal-Driven Virtual Reality in Social Spaces | http://dl.acm.org/citation.cfm?id=2984787 | - | ACM Symposium on User Interface Software and Technology (UIST) | -
2016 | Henny Admoni, Siddhartha Srinivasa | Predicting User Intent Through Eye Gaze for Shared Autonomy | http://hennyadmoni.com/documents/admoni2016aaaifs.pdf | - | - | -
2016 | Astrid Weiss, Andreas Huber, Jürgen Minichberger, Markus Ikeda | First Application of Robot Teaching in an Existing Industry 4.0 Environment: Does It Really Work? | http://www.mdpi.com/2075-4698/6/3/20 | http://www.mdpi.com/2075-4698/6/3/20/pdf | - | human-robot interaction, Industry 4.0, case study, field test, robot teaching
2016 | Tilman Dingler, Rufat Rzayev, Valentin Schwind, Niels Henze | RSVP on the Go - Implicit Reading Support on Smart Watches Through Eye Tracking | http://dl.acm.org/citation.cfm?id=2971794&dl=ACM&coll=DL&CFID=712884317&CFTOKEN=39946609 | http://tilmanification.org/assets/pdf/Dingler2016RSVPEyeControl.pdf | ACM International Symposium on Wearable Computers (ISWC) | Reading interfaces, RSVP, eye-tracking, eye gaze interaction, mental load, comprehension
2016 | Hidde van der Meulen, Petra Varsanyi, Lauren Westendorf, Andrew L. Kun, Orit Shaer | Towards Understanding Collaboration around Interactive Surfaces: Exploring Joint Visual Attention | http://dl.acm.org/citation.cfm?id=2984778 | http://cs.wellesley.edu/~hcilab/publication/uist16-eyetracking.pdf | ACM Symposium on User Interface Software and Technology (UIST) | Visual attention, collaboration, eye tracking