Pupil Citation List
Year | Author(s) | Title | Journal/Conference | URL(s) | Keywords
2019 | St-Onge David, Kaufmann Marcel, Panerati Jacopo, Ramtoula Benjamin, Cao Yanjun, Coffey Emily, Beltrame Giovanni | Planetary exploration with robot teams | IEEE Robotics and Automation Magazine 2019 | http://espace2.etsmtl.ca/id/eprint/19351/1/St-Onge D 2019 19351.pdf | Keywords: space exploration, decentralized robotics, unmanned aerial vehicles, human-swarm interaction
2019 | Xiaoxue Fu, Eric E. Nelson, Marcela Borge, Kristin A. Buss, Koraly Pérez-Edgar | Stationary and ambulatory attention patterns are differentially associated with early temperamental risk for socioemotional problems: Preliminary evidence from a multimodal eye-tracking investigation | Development and Psychopathology (2019), 1–18 | doi:10.1017/S0954579419000427; https://static1.squarespace.com/static/52812781e4b0bfa86bc3c12f/t/5cd861b2eef1a15db85b4fdf/1557684660600/Fu+et+al+%28in+press%29+Dev.+%26+Psychopathology.pdf | Keywords: attention bias, behavioral inhibition, dot-probe task, eye-tracking, mobile eye-tracking
2019 | Ravi Teja Chadalavada, Henrik Andreasson, Maike Schindler, Rainer Palm, Achim J. Lilienthal | Bi-directional navigation intent communication using spatial augmented reality and eye-tracking glasses for improved safety in human–robot interaction | Robotics and Computer-Integrated Manufacturing, Volume 61, Feb 2020 | https://doi.org/10.1016/j.rcim.2019.101830 | Keywords: Human–robot interaction (HRI), Mobile robots, Intention communication, Eye-tracking, Intention recognition, Spatial augmented reality, Stimulated recall interview, Obstacle avoidance, Safety, Logistics
2019 | Tom Arthur, Sam Vine, Mark Brosnan, Gavin Buckingham | Exploring how material cues drive sensorimotor prediction across different levels of autistic-like traits | Experimental Brain Research, September 2019, Vol. 237, Issue 9, pp. 2255-2267 | https://link.springer.com/article/10.1007/s00221-019-05586-z | Keywords: Autism, Movement, Object lifting, Weight illusion, Grip force
2019 | Richard Wilkie, Callum Mole, Oscar Giles, Natasha Merat, Richard Romano, Gustav Markkula | Cognitive Load During Automation Affects Gaze Behaviours and Transitions to Manual Steering Control | Accepted to the 10th International Driving Symposium on Human Factors in Driver Assessment, Training, and Vehicle Design | https://www.researchgate.net/profile/Callum_Mole/publication/332727693_Cognitive_Load_During_Automation_Affects_Gaze_Behaviours_and_Transitions_to_Manual_Steering_Control/links/5cc6bc1792851c8d220c7e1b/Cognitive-Load-During-Automation-Affects-Gaze-Behaviours-and-Transitions-to-Manual-Steering-Control.pdf
2019 | Steve Grogorick, Matthias Ueberheide, Jan-Philipp Tauscher, Paul Maximilian Bittner | Gaze and Motion-aware Real-Time Dome Projection System | 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) | 10.1109/VR.2019.8797902; https://graphics.tu-bs.de/upload/publications/grogorick2019vr.pdf | Keywords: Human-centered computing—Human computer interaction (HCI)—Interactive systems and tools; Human-centered computing—Visualization—Visualization systems and tools; Computer systems organization—Real-time system
2019 | Maike Schindler, Achim J. Lilienthal | Domain-specific interpretation of eye tracking data: towards a refined use of the eye-mind hypothesis for the field of geometry | Educational Studies in Mathematics | https://link.springer.com/article/10.1007/s10649-019-9878-z | Keywords: Eye tracking, Eye movements, Eye-mind hypothesis, Geometry
2019 | Georg Simhandl, Philipp Paulweber, Uwe Zdun | Design of an Executable Specification Language Using Eye Tracking | EMIP '19 Proceedings of the 6th International Workshop on Eye Movements in Programming | 10.1109/EMIP.2019.00014; https://eprints.cs.univie.ac.at/6022/1/simhandl2019emip.pdf | Keywords: reading, software development
2019 | DMV Díaz, M Knodler, BC Ríos, AMF Medina | Evaluation of Safety Enhancements in School Zones with Familiar and Unfamiliar Drivers | http://safersim.nads-sc.uiowa.edu/final_reports/C%201%20Y2%20report_Final.pdf | Keywords: traffic safety, driving simulator, driving, eye-tracking
2019 | Nitish Padmanaban, Robert Konrad, Gordon Wetzstein | Autofocals: Evaluating gaze-contingent eyeglasses for presbyopes | Science Advances, Vol. 5, No. 6, 05 June 2019 | https://advances.sciencemag.org/content/5/6/eaav6187.full; https://advances.sciencemag.org/content/5/6/eaav6187.full.pdf
2019 | Rakshit Kothari, Zhizhuo Yang, Christopher Kanan, Reynold Bailey, Jeff Pelz, Gabriel Diaz | Gaze-in-wild: A dataset for studying eye and head coordination in everyday activities | arXiv | https://arxiv.org/pdf/1905.13146.pdf | Keywords: Eye head coordination, in the wild, dataset
2019 | Joohwan Kim, Michael Stengel, Alexander Majercik, Shalini De Mello, David Dunn, Samuli Laine, Morgan McGuire, David Luebke | NVGaze: An Anatomically-Informed Dataset for Low-Latency, Near-Eye Gaze Estimation | ACM Conference on Human Factors in Computing Systems (CHI) 2019 | https://research.nvidia.com/publication/2019-05_NVGaze%3A-An-Anatomically-Informed | Keywords: eye tracking, virtual reality, VR, neural networks, gaze estimation
2019 | Thiago Santini, Diederick C. Niehorster, Enkelejda Kasneci | Get a Grip: Slippage-Robust and Glint-Free Gaze Estimation for Real-Time Pervasive Head-Mounted Eye Tracking | ETRA 2019 | http://www.ti.uni-tuebingen.de/uploads/tx_timitarbeiter/etra2019-slippage_small.pdf | Keywords: calibration; drift; embedded; eye; gaze estimation; open source; pervasive; pupil tracking; real-time; slippage; tracking
2019 | Almoctar Hassoumi, Christophe Hurter | Eye Gesture in a Mixed Reality Environment | HUCAPP 2019: 3rd International Conference on Human Computer Interaction Theory and Applications, Feb 2019, Prague, Czech Republic, pp. 183-187 | https://hal-enac.archives-ouvertes.fr/hal-02073441; https://hal-enac.archives-ouvertes.fr/hal-02073441/document | Keywords: Eye-movement, Interaction, Eye Tracking, Smooth Pursuit, Mixed Reality, Accessibility
2019 | Alexiou, Evangelos; Xu, Peisen; Ebrahimi, Touradj | Towards modelling of visual saliency in point clouds for immersive applications | 26th IEEE International Conference on Image Processing (ICIP), Taipei, Taiwan, September 22-25, 2019 | https://infoscience.epfl.ch/record/265790; https://infoscience.epfl.ch/record/265790/files/ICIP2019.pdf | Keywords: visual saliency; immersive environments; point clouds; virtual reality; eye-tracking
2019 | Viviane Clay, Peter König, Sabine U. König | Eye Tracking in Virtual Reality | Journal of Eye Movement Research, 12, 10.16910/jemr.12.1.3 | https://www.researchgate.net/publication/332780872_Eye_Tracking_in_Virtual_Reality; https://www.researchgate.net/profile/Viviane_Clay/publication/332780872_Eye_Tracking_in_Virtual_Reality/links/5cc95b74299bf120978bd0f6/Eye-Tracking-in-Virtual-Reality.pdf?origin=publication_detail | Keywords: Eye movement, eye tracking, virtual reality, VR, smooth pursuit, region of interest, gaze
2019 | M. Kraus, T. Kilian, J. Fuchs | Real-Time Gaze Mapping in Virtual Environments | EUROVIS 2019 Posters, The Eurographics Association, 2019 | https://dx.doi.org/10.2312/eurp.20191135; http://kops.uni-konstanz.de/bitstream/handle/123456789/46434/Kraus_2-rrvzhdtz0wt93.pdf?sequence=1&isAllowed=y | Keywords: augmented reality, mixed reality
2019 | Haase, H. | How People with a Visual Field Defect Scan their Environment: An Eye-Tracking Study | Master's thesis | https://dspace.library.uu.nl/handle/1874/382794 | Keywords: Cognitive Psychology
2019 | Björn Jörges, Joan López-Moliner | Earth-Gravity Congruent Motion Benefits Visual Gain For Parabolic Trajectories | bioRxiv preprint | https://doi.org/10.1101/547497; https://www.biorxiv.org/content/biorxiv/early/2019/02/12/547497.full.pdf
2019 | Xi Wang, Andreas Ley, Sebastian Koch, David Lindlbauer, James Hays, Kenneth Holmqvist, Marc Alexa | The Mental Image Revealed by Gaze Tracking | CHI 2019, May 4–9, 2019, Glasgow, Scotland, UK | https://doi.org/10.1145/3290605.3300839; http://cybertron.cg.tu-berlin.de/xiwang/files/mi.pdf | Keywords: gaze pattern, mental imagery, eye tracking
2018 | Mathôt, Sebastiaan; Fabius, Jasper; Van Heusden, Elle; Van der Stigchel, Stefan | Safe and sensible preprocessing and baseline correction of pupil-size data | Behavior Research Methods (2018) 50:94-106 | https://doi.org/10.3758/s13428-017-1007-2; https://core.ac.uk/download/pdf/153218039.pdf | Keywords: Pupillometry, Pupil size, Baseline correction
2018 | Roser Cañigueral, Antonia F. Hamilton, Jamie A. Ward | Don't Look at Me, I'm Wearing an Eyetracker! | UbiComp/ISWC '18 Adjunct, October 8–12, 2018, Singapore | https://www.researchgate.net/publication/328681758_Don't_Look_at_Me_I'm_Wearing_an_Eyetracker/references | Keywords: Eye tracking, interaction, gaze contingency, social behavior, eye-based computing, wearables
2018 | Tobias Fischer, Hyung Jin Chang, Yiannis Demiris | RT-GENE: Real-Time Eye Gaze Estimation in Natural Environments | ECCV 2018 | http://openaccess.thecvf.com/content_ECCV_2018/papers/Tobias_Fischer_RT-GENE_Real-Time_Eye_ECCV_2018_paper.pdf | Keywords: Gaze estimation, Gaze dataset, Convolutional Neural Network, Semantic inpainting, Eyetracking glasses
2018 | Stelling-Konczak, A., Vlakveld, W.P., Van Gent, P., Commandeur, J.J.F., Van Wee, G.P., Hagenzieker, M. | A study in real traffic examining glance behaviour of teenage cyclists when listening to music: Results and ethical considerations | Transportation Research Part F: Traffic Psychology and Behaviour, Volume 55, May 2018, Pages 47-57 | https://doi.org/10.1016/j.trf.2018.02.031; https://www.researchgate.net/profile/Marjan_Hagenzieker/publication/320563592_A_study_in_real_traffic_examining_glance_behaviour_of_teenage_cyclists_when_listening_to_music_Results_and_ethical_considerations/links/5ba923ab92851ca9ed225474/A-study-in-real-traffic-examining-glance-behaviour-of-teenage-cyclists-when-listening-to-music-Results-and-ethical-considerations.pdf | Keywords: Cycling safety, Music, Auditory perception, Visual attention, Visual performance, Research ethics
2018 | Trent Koessler, Harold Hill | Focusing on an illusion: Accommodating to perceived depth? | Vision Research, Volume 154, January 2019, Pages 131-141 | https://doi.org/10.1016/j.visres.2018.11.001 | Keywords: Accommodation, Convergence, Depth perception, Hollow-face illusion
2018 | Vicente Soto, John Tyson-Carr, Katerina Kokmotou, Hannah Roberts, Stephanie Cook, Nicholas Fallon, Timo Giesbrecht, Andrej Stancak | Brain Responses to Emotional Faces in Natural Settings: A Wireless Mobile EEG Recording Study | Front. Psychol. 2018; 9: 2003 | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6209651/ | Keywords: EEG, eye-movement related potentials, N170 component, source dipole analysis, MoBI, mobile brain imaging, visual evoked potential (VEP)
2018 | Newman, Benjamin A.; Aronson, Reuben M.; Srinivasa, Siddhartha S.; Kitani, Kris; Admoni, Henny | HARMONIC: A Multimodal Dataset of Assistive Human-Robot Collaboration | http://harp.ri.cmu.edu/harmonic/; https://arxiv.org/pdf/1807.11154.pdf | Keywords: Computer Science - Robotics, Computer Science - Human-Computer Interaction
2018 | Reuben M. Aronson, Thiago Santini, Thomas C. Kübler, Enkelejda Kasneci, Siddhartha Srinivasa, Henny Admoni | Eye-Hand Behavior in Human-Robot Shared Manipulation | Human-Robot Interaction (HRI) 2018 | http://harp.ri.cmu.edu/assets/pubs/hri2018_aronson.pdf | Keywords: Computer Science - Robotics, Computer Science - Human-Computer Interaction
2018 | Reuben M. Aronson, Henny Admoni | Gaze for Error Detection During Human-Robot Shared Manipulation | Joint Action Workshop at RSS 2018 | http://harp.ri.cmu.edu/assets/pubs/fja_rss2018_aronson.pdf | Keywords: Computer Science - Robotics, Computer Science - Human-Computer Interaction
2018 | Jeff J. MacInnes, Shariq Iqbal, John Pearson, Elizabeth N. Johnson | Wearable Eye-tracking for Research: Automated dynamic gaze mapping and accuracy/precision comparisons across devices | bioRxiv preprint | https://www.biorxiv.org/content/early/2018/06/28/299925.abstract; https://www.biorxiv.org/content/biorxiv/early/2018/06/28/299925.full.pdf
2018 | Mohamed Khamis, Malin Eiband, Martin Zürn, Heinrich Hussmann | EyeSpot: Leveraging Gaze to Protect Private Text Content on Mobile Devices from Shoulder Surfing | Multimodal Technologies and Interaction 2018, 2(3), 45 | https://doi.org/10.3390/mti2030045; https://www.mdpi.com/2414-4088/2/3/45/pdf | Keywords: mobile devices; privacy; gaze; eye tracking; security
2018 | Corten Singer | See-Thru: Towards Minimally Obstructive Eye-Controlled Wheelchair Interfaces | Master's thesis, Technical Report No. UCB/EECS-2018-61 | https://www2.eecs.berkeley.edu/Pubs/TechRpts/2018/EECS-2018-61.pdf | Keywords: Eye Gaze, Eye Tracking, Gaze Control, Eyes-Only Interaction, User Interfaces, Power Wheelchair, Smart Wheelchair, User Experience, Assistive Technology, Gaze Gestures, Field of View, Obstructive
2018 | Mingming Wang, Kate Walders, Martin E. Gordon, Jeff B. Pelz, Susan Farnand | Auto-simulator Preparation for Research into Assessing the Correlation Between Human Driving Behaviors and Fixation Patterns | Electronic Imaging, Autonomous Vehicles and Machines 2018, pp. 163-1-163-6(6) | https://www.ingentaconnect.com/contentone/ist/ei/2018/00002018/00000017/art00007
2018 | Eike Langbehn, Frank Steinicke, Markus Lappe, Gregory F. Welch, Gerd Bruder | In the Blink of an Eye: Leveraging Blink-Induced Suppression for Imperceptible Position and Orientation Redirection in Virtual Reality | ACM Trans. Graph., Vol. 37, No. 4, Article 66 | https://basilic.informatik.uni-hamburg.de/Publications/2018/LSLWB18/eye_blinks.pdf | Keywords: Human-centered computing → Virtual reality; Computing methodologies → Perception; virtual reality, eye blinks, redirected walking, psychophysical experiments
2018 | Carlos Rafael Fernandes Picanço, François Jacques Tonneau | A low-cost platform for eye-tracking research: Using Pupil© in behavior analysis | https://onlinelibrary.wiley.com/doi/abs/10.1002/jeab.448
2018 | Thiago Santini, Wolfgang Fuhl, Enkelejda Kasneci | PuReST: robust pupil tracking for real-time pervasive eye tracking | ETRA 2018 | https://dl.acm.org/citation.cfm?id=3204578
2018 | Justus Thies, Michael Zollhöfer, Marc Stamminger, Christian Theobalt, Matthias Nießner | FaceVR: Real-Time Gaze-Aware Facial Reenactment in Virtual Reality | ACM Transactions on Graphics (TOG), Volume 37, Issue 2, Article No. 25 | https://dl.acm.org/citation.cfm?id=3182644; https://arxiv.org/pdf/1610.03151.pdf | Keywords: face tracking, virtual reality, eye tracking
2018 | Li, W-C., Kearney, P., Braithwaite, G., Lin, J. | How much is too much? Visual Scan Patterns of Single Air Traffic Controller Performing Multiple Remote Tower Operations | International Journal of Industrial Ergonomics (67, 136-144) | https://doi.org/10.1016/j.ergon.2018.05.005 | Keywords: Air traffic management, Aviation safety, Cost-efficiency, Human-computer interactions, Multiple remote tower operations, Situation awareness
2018 | Kearney, P., Li, W-C. | Multiple Remote Tower for Single European Sky: the Evolution from Initial Operational Concept to Regulatory Approved Implementation | Transportation Research Part A, 116, 15-30 | https://doi.org/10.1016/j.tra.2018.06.005 | Keywords: Air traffic control, Cost efficiency, Human performance, Multiple remote tower operations, Safety assessment, Single European Sky
2018 | Florian Jungwirth, Michael Haslgrübler, Alois Ferscha | Contour-Guided Gaze Gestures: Using Object Contours as Visual Guidance for Triggering Interactions | ETRA 2018 | https://dl.acm.org/citation.cfm?id=3204530 | Keywords: Wearable Computing; Pervasive Computing; Eye-Tracking; Gaze-based Interaction; Internet of Things
2018 | Carlos Elmadjian, Pushkar Shukla, Antonio Diaz Tula, Carlos H. Morimoto | 3D gaze estimation in the scene volume with a head-mounted eye tracker | ETRA/COGAIN 2018 | https://dl.acm.org/citation.cfm?id=3206351 | Keywords: Head-mounted eye tracking, calibration, gaze estimation, 3D dataset
2018 | Michael Barz, Florian Daiber, Daniel Sonntag, Andreas Bulling | Error-Aware Gaze-Based Interfaces for Robust Mobile Gaze Interaction | ETRA 2018 | https://doi.org/10.1145/3204493.3204536; https://perceptual.mpi-inf.mpg.de/files/2018/04/barz18_etra.pdf | Keywords: Eye Tracking; Mobile Interaction; Gaze Interaction; Error Model; Error-Aware
2018 | Mikko Kytö, Barrett Ens, Thammathip Piumsomboon, Gun A. Lee, Mark Billinghurst | Pinpointing: Precise Head- and Eye-Based Target Selection for Augmented Reality | CHI 2018 | https://dl.acm.org/citation.cfm?id=3173655; https://www.researchgate.net/profile/Mikko_Kytoe/publication/323970135_Pinpointing_Precise_Head-_and_Eye-Based_Target_Selection_for_Augmented_Reality/links/5ab559da0f7e9b68ef4cf26a/Pinpointing-Precise-Head-and-Eye-Based-Target-Selection-for-Augmented-Reality.pdf | Keywords: Eye tracking; gaze interaction; refinement techniques; target selection; augmented reality; head-worn display
2018 | Leanne Chukoskie, Shengyao Guo, Eric Ho, Yalun Zheng, Qiming Chen, Vivian Meng, John Cao, Nikhita Devgan, Si Wu, Pamela C. Cosman | Quantifying Gaze Behavior during Real World Interactions using Automated Object, Face, and Fixation Detection | IEEE Transactions on Cognitive and Developmental Systems | https://ieeexplore.ieee.org/abstract/document/8328848/ | Keywords: eye-tracking, gaze behavior, face detection, computer vision
2018 | Kai Dierkes, Moritz Kassner, Andreas Bulling | A novel approach to single camera, glint-free 3D eye model fitting including corneal refraction | ETRA 2018 | https://perceptual.mpi-inf.mpg.de/files/2018/04/dierkes18_etra.pdf | Keywords: Eye tracking, refraction, 3D eye model, pupil detection, contour-based, glint-free
2018 | Julian Steil, Michael Xuelin Huang, Andreas Bulling | Fixation Detection for Head-Mounted Eye Tracking Based on Visual Similarity of Gaze Targets | ETRA 2018 | https://perceptual.mpi-inf.mpg.de/files/2018/04/steil18_etra.pdf | Keywords: Visual focus of attention; Mobile eye tracking; Egocentric vision
2018 | Thomas Kosch, Mariam Hassib, Daniel Buschek, Albrecht Schmidt | Look into my Eyes: Using Pupil Dilation to Estimate Mental Workload for Task Complexity Adaptation | CHI EA 2018 | https://dl.acm.org/citation.cfm?id=3188643 | Keywords: Cognition-Aware Interfaces; Workload-Aware Computing; Pupil Dilation; Eye Tracking
2018 | Damian Almaraz, Brock Carlson, Hieu-Trung Vu, Jeremy Loebach | Pupillometry as a measure of auditory cognitive processes and listening effort | The Journal of the Acoustical Society of America 143, 1751 (2018) | https://doi.org/10.1121/1.5035727
2018 | Iuliia Brishtel, Shoya Ishimaru, Olivier Augereau, Koichi Kise, Andreas Dengel | Assessing Cognitive Workload on Printed and Electronic Media using Eye-Tracker and EDA Wristband | IUI '18 Companion: Proceedings of the 23rd International Conference on Intelligent User Interfaces Companion | https://dl.acm.org/citation.cfm?id=3180354; https://www.dropbox.com/s/r807qwnvkz6v0lj/IUI2018Iuliia.pdf?raw=1 | Keywords: E-learning; Eye-Tracking; Reading; Information Processing; Electrodermal Activity; Cognitive Workload; User-Interface
2018 | Mirko Raković, Nuno Duarte, Jovica Tasevski, José Santos-Victor, Branislav Borovac | A dataset of head and eye gaze during dyadic interaction task for modeling robot gaze behavior | 13th International Scientific-Technical Conference on Electromechanics and Robotics “Zavalishin’s Readings” 2018 | https://doi.org/10.1051/matecconf/201816103002; https://www.matec-conferences.org/articles/matecconf/pdf/2018/20/matecconf_erzr2018_03002.pdf
2018 | Mohamed Khamis, Carl Oechsner, Florian Alt, Andreas Bulling | VRPursuits: Interaction in Virtual Reality using Smooth Pursuit Eye Movements | AVI 2018 | https://doi.org/10.1145/3206505.3206522; https://perceptual.mpi-inf.mpg.de/files/2018/04/khamis18_avi.pdf | Keywords: Eye Tracking, Virtual Reality, Gaze Interaction, Pursuits
2018 | Ruimin Li, Bin Li, Shixiong Zhang, Hong Fu, Wai-Lun Lo, Jie Yu, Cindy H.P. Sit, Desheng Wen | Evaluation of the fine motor skills of children with DCD using the digitalised visual-motor tracking system | The Journal of Engineering, Volume 2018, Issue 2 | https://ieeexplore.ieee.org/abstract/document/8316738/; https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=8316738 | Keywords: developmental coordination disorder, hand movement, eye gaze position, eye tracker, digitalised visual-motor tracking system, DCD, children, fine motor skills
2018 | Nuno Duarte, Jovica Tasevski, Moreno Coco, Mirko Raković, José Santos-Victor | Action Anticipation: Reading the Intentions of Humans and Robots | arXiv | https://arxiv.org/abs/1802.02788; https://arxiv.org/pdf/1802.02788
2018 | Julian Steil, Philipp Müller, Yusuke Sugano, Andreas Bulling | Forecasting User Attention During Everyday Mobile Interactions Using Device-Integrated and Wearable Sensors | MobileHCI 2018 | https://arxiv.org/abs/1801.06011; https://arxiv.org/pdf/1801.06011.pdf | Keywords: Egocentric Vision; Handheld Mobile Device; Attention Shifts; Mobile Eye Tracking; Attentive User Interfaces
2018 | Christian Lander, Marco Speicher, Frederic Kerber, Antonio Krüger | Towards Fixation Extraction in Corneal Imaging Based Eye Tracking Data | CHI ’18 Extended Abstracts, April 21–26, 2018 | https://doi.org/10.1145/3170427.3188597 | Keywords: Corneal Imaging; Fixation; Image Stitching
2018 | Sylvain Pauchet, Catherine Letondal, Jean-Luc Vinot, Mickaël Causse, Mathieu Cousy, Valentin Becquet, Guillaume Crouzet | GazeForm: Dynamic Gaze-adaptive Touch Surface for Eyes-free Interaction in Airliner Cockpits | DIS 2018 (Hong Kong) | https://www.researchgate.net/profile/Mickael_Causse/publication/324561583_GazeForm_Dynamic_Gaze-adaptive_Touch_Surface_for_Eyes-free_Interaction_in_Airliner_Cockpits/links/5ad5a8fda6fdcc293580bcdf/GazeForm-Dynamic-Gaze-adaptive-Touch-Surface-for-Eyes-free-Interaction-in-Airliner-Cockpits.pdf | Keywords: Eyes-free interaction; touchscreens; TEI; shape-changing interfaces; adaptive interfaces; eye-tracking; critical contexts
2018 | Paul Schydlo, Mirko Rakovic, Lorenzo Jamone, Jose Santos-Victor | Anticipation in Human-Robot Cooperation: A recurrent neural network approach for multiple action sequences prediction | arXiv | https://arxiv.org/abs/1802.10503; https://arxiv.org/pdf/1802.10503.pdf
2018 | Adithya B, Lee Hanna, Pavan Kumar B N, Youngho Chai | Calibration Techniques and Gaze Accuracy Estimation in Pupil Labs Eye Tracker | TECHART: Journal of Arts and Imaging Science, Vol. 5, No. 1 | http://www.dbpia.co.kr/Journal/ArticleDetail/NODE07404419 | Keywords: calibration
2018 | T.A.B. de Boer, J. Hoogmoed, N.M. Looye, J.R.P. van der Toorn, R.P. de Vos, J. Stapel, P. Bazilinskyy, J.C.F. de Winter | Combining eye-tracking with semantic scene labelling in a car | Working paper, January 2018 | https://www.researchgate.net/profile/Joost_De_Winter/publication/322293911_Combining_eye-tracking_with_semantic_scene_labelling_in_a_car/links/5a5136d0aca2725638c592c0/Combining-eye-tracking-with-semantic-scene-labelling-in-a-car.pdf | Keywords: Computer Vision, Semantic Scene Understanding, Object Classification, Driving, Transportation
2018 | Ľuboš Hládek, Bernd Porr, W. Owen Brimijoin | Real-time estimation of horizontal gaze angle by saccade integration using in-ear electrooculography | PLoS ONE 13(1) | http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0190420; http://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0190420&type=printable | Keywords: Electrooculography, Eye gaze angle, Saccade
2018 | Martina Truschzinski, Alberto Betella, Guido Brunnett, Paul F.M.J. Verschure | Emotional and cognitive influences in air traffic controller tasks: An investigation using a virtual environment? | Applied Ergonomics, Volume 69 | https://www.sciencedirect.com/science/article/pii/S0003687017302855 | Keywords: Air traffic control, Personality, Workload, Mood, Virtual reality
2018 | Yasmeen Abdrabou, Khaled Kassem, Jailan Salah, Reem El-Gendy, Mahesty Morsy, Yomna Abdelrahman, Slim Abdennadher | Exploring the Usage of EEG and Pupil Diameter to Detect Elicited Valence | Intelligent Human Systems Integration | https://link.springer.com/chapter/10.1007/978-3-319-73888-8_45 | Keywords: EEG, Eye tracker, Affective computing
2018 | Rahool Patel, Adrian Zurca | 396: I See What You See: Adding Eye-Tracking to Medical Simulation | Critical Care Medicine, Volume 46, Issue 1 (January 2018) | http://journals.lww.com/ccmjournal/Fulltext/2018/01001/396___I_SEE_WHAT_YOU_SEE_ADDING_EYE_TRACKING_TO.362.aspx | Keywords: Medical simulation
2018 | Julian Steil, Marion Koelle, Wilko Heuten, Susanne Boll, Andreas Bulling | PrivacEye: Privacy-Preserving First-Person Vision Using Image Features and Eye Movement Analysis | arXiv | https://arxiv.org/abs/1801.04457; https://arxiv.org/pdf/1801.04457.pdf | Keywords: Egocentric Vision; Eye Tracking; Gaze Behaviour
2017 | Wen-Chin Li, Jiaqi Cao, Jr-Hung Lin, Graham Braithwaite, Matthew Greaves | The Evaluation of Pilot's First Fixation and Response Time to Different Design of Alerting Messages | AG 2017 | https://pdfs.semanticscholar.org/f9e6/e1c8b5194c4e7b7565bf346e8192d7289803.pdf | Keywords: Cockpit design, Crew Alerting System, Eye movement, Human-computer interaction, Quick Reference Handbook
2017 | Henny Admoni, Siddhartha Srinivasa | Eye Gaze Reveals Intentions in Shared Autonomy | Proceedings of Intentions in HRI Workshop at HRI 2017 (Intentions in HRI ’17), Vienna, Austria, March 2017 | http://intentions.xyz/wp-content/uploads/2017/01/Admoni.intentions-eyegaze.reduced.pdf | Keywords: human-robot interaction, intentions, nonverbal behavior, eye gaze
2017 | Marco Filippucci, Fabio Bianconi, Elisa Bettollini, Michela Meschini, Marco Seccaroni | Survey and Representation for Rural Landscape. New Tools for New Strategies: The Example of Campello Sul Clitunno | http://www.mdpi.com/2504-3900/1/9/934/pdf | Keywords: landscape and image; perception; eye tracking; algorithmic spatial analysis; participation
2017 | Brendan John | A Dataset of Gaze Behavior in VR Faithful to Natural Statistics | RIT MS thesis | http://scholarworks.rit.edu/cgi/viewcontent.cgi?article=10716&context=theses
2017 | Jork Stapel, Freddy Antony Mullakkal-Babu, Riender Happee | Driver behaviour and workload in an on-road automated vehicle | Preprint | https://www.researchgate.net/profile/Jork_Stapel2/publication/318702364_Driver_Behavior_and_Workload_in_an_On-road_Automated_Vehicle/links/597887fa45851570a1b9623a/Driver-Behavior-and-Workload-in-an-On-road-Automated-Vehicle.pdf | Keywords: Automated Driving; On-road; Workload; Experience; Underload
2017 | Iuliia Kotseruba, John K. Tsotsos | STAR-RT: Visual attention for real-time video game playing | arXiv preprint | https://arxiv.org/abs/1711.09464; https://arxiv.org/pdf/1711.09464.pdf | Keywords: STAR; Cognitive Programs; visual attention; Visual Routines; real-time vision; platform video games; game AI
2017 | Qi Sun, Fu-Chung Huang, Joohwan Kim, Li-Yi Wei, David Luebke, Arie Kaufman | Perceptually-Guided Foveation for Light Field Displays | ACM Trans. Graph. 36, 6, Article 192 (November 2017) | http://research.nvidia.com/publication/2017-11_Perceptually-Guided-Foveation-for; http://research.nvidia.com/sites/default/files/publications/c121-f121_199-a18-paperfinal-v3.pdf | Keywords: light field, computational display, foveation, sampling
2017 | Changwon Jang, Kiseung Bang, Seokil Moon, Jonghyun Kim, Seungjae Lee, Byoungho Lee | Retinal 3D: Augmented Reality Near-Eye Display Via Pupil-Tracked Light Field Projection on Retina | SIGGRAPH 2017, Vol. 36, Issue 6, Article No. 190 | http://oeqelab.snu.ac.kr/retinal3d | Keywords: Near-eye display, eye tracking, computational displays, holographic optical element, vergence-accommodation conflict
2017 | Khaled Kassem, Jailan Salah, Yasmeen Abdrabou, Mahesty Morsy, Reem El-Gendy, Yomna Abdelrahman, Slim Abdennadher | DiVA: Exploring the Usage of Pupil Diameter to Elicit Valence and Arousal | https://www.researchgate.net/profile/Yomna_Abdelrahman/publication/321783827_DiVA_exploring_the_usage_of_pupil_di_ameter_to_elicit_v_alence_and_a_rousal/links/5a328a17aca27271444f3689/DiVA-exploring-the-usage-of-pupil-di-ameter-to-elicit-v-alence-and-a-rousal.pdf | Keywords: Arousal; Valence; Eye Tracker; Pupil Diameter
2017 | Benjamin Hatscher, Maria Luz, Lennart E. Nacke, Norbert Elkmann, Veit Muller, Christian Hansen | GazeTap: Towards Hands-Free Interaction in the Operating Room | International Conference on Multimodal Interaction | https://www.researchgate.net/profile/Christian_Hansen17/publication/320372375_GazeTap_Towards_Hands-Free_Interaction_in_the_Operating_Room/links/59f7498e0f7e9b553ebedb6d/GazeTap-Towards-Hands-Free-Interaction-in-the-Operating-Room.pdf | Keywords: Input techniques, multimodal interaction, foot input, gaze input, eye tracking, gaze-foot interaction, HCI in the operating room
2017 | Christian Lander, Sven Gehring, Markus Löchtefeld, Andreas Bulling, Antonio Krüger | EyeMirror: Mobile Calibration-Free Gaze Approximation using Corneal Imaging | MUM 2017, November 26–29, 2017 | https://perceptual.mpi-inf.mpg.de/files/2017/11/lander17_mum.pdf | Keywords: Corneal Image; mobile device; gaze approximation; feature tracking; pervasive
2017 | Zhao, Zhong; Salesse, Robin N.; Marin, Ludovic; Gueugnon, Mathieu; Bardy, Benoit G. | Likability's Effect on Interpersonal Motor Coordination: Exploring Natural Gaze Direction | Frontiers in Psychology | https://www.frontiersin.org/articles/10.3389/fpsyg.2017.01864/full; https://www.frontiersin.org/articles/10.3389/fpsyg.2017.01864/pdf | Keywords: interpersonal motor coordination, likability, finger-tapping, markers
80
2017Ken Pfeuffer, Benedikt Mayer, Diako Mardanbegi, Hans GellersenGaze + Pinch Interaction in Virtual Realityhttps://dl.acm.org/citation.cfm?id=3132180https://kenpfeuffer.files.wordpress.com/2015/02/p99-pfeuffer.pdfSUI ’17, October 16–17, 2017Gaze; pinch; freehand gesture; interaction technique; multimodal
interface; menu; eye tracking; virtual reality.
81
2017Augusto Esteves, David Verweij, Liza Suraiya, Rasel Islam, Youryang Lee, Ian OakleySmoothMoves: Smooth Pursuits Head Movements for Augmented Realityhttps://www.researchgate.net/publication/320571212_SmoothMoves_Smooth_Pursuits_Head_Movements_for_Augmented_Realityhttps://www.researchgate.net/profile/Islam_Md_Rasel/publication/320571212_SmoothMoves_Smooth_Pursuits_Head_Movements_for_Augmented_Reality/links/5a0255c1aca2720df3ca03d2/SmoothMoves-Smooth-Pursuits-Head-Movements-for-Augmented-Reality.pdfACM Symposium on User Interface Software and Technology (UIST)Wearable computing, eye tracking, augmented reality, AR, input technique, smooth pursuits, motion matching, HMD
82
2017Yun Suen Pai, Benjamin Outram, Benjamin Tag, Megumi Isogai, Daisuke Ochi, Kai KunzeGazeSphere: Navigating 360-Degree-Video Environments in VR Using Head Rotation and Eye Gazehttp://dl.acm.org/citation.cfm?id=3102183ACM SIGGRAPH PostersVirtual reality, 360-degree-video, eye tracking, orbital navigation
83
2017Mihai Bâce, Philippe Schlattner, Vincent Becker, Gábor SörösFacilitating Object Detection and Recognition through Eye Gazehttp://www.vs.inf.ethz.ch/publ/papers/mbace_MobileHCI2017_workshop.pdfMobileHCI ’17 Workshops, September 04–09, 2017Eye Tracking; Eye Gaze; Wearable Computing; HCI
84
2017Otto Lappi, Paavo Rinkkala, Jami PekkanenSystematic Observation of an Expert Driver's Gaze Strategy—An On-Road Case Studyhttps://doi.org/10.3389/fpsyg.2017.00620https://www.frontiersin.org/articles/10.3389/fpsyg.2017.00620/fullFront. Psychol., 27 April 2017
85
2017Ana Serrano, Vincent Sitzmann, Jaime Ruiz-Borau, Gordon Wetzstein, Diego Gutierrez, Belen MasiaMovie Editing and Cognitive Event Segmentation in Virtual Reality Videohttps://doi.org/10.1145/3072959.3073668http://webdiis.unizar.es/~aserrano/docs/Serrano_SIGG2017_VR-cine.pdfACM Transactions on Graphics, Vol. 36, No. 4, Article 47. Publication date: July 2017VR, Human Centered Computing, Immersive environments, cinematography
86
2017Guillem Torrente MartiMobility for the severely disabled: a head-controlled wheelchairhttps://repositori.upf.edu/bitstream/handle/10230/32897/Torrente_2017.pdf?sequence=1&isAllowed=yBachelor's ThesisAccessibility, Eye Tracking
87
2017Francesco Walker, Berno Bucker, Nicola C. Anderson, Daniel Schreij, Jan TheeuwesLooking at paintings in the Vincent Van Gogh Museum: Eye movement patterns of children and adultshttps://doi.org/10.1371/journal.pone.0178912http://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0178912&type=printablePLoS ONE
88
2017Jason Orlosky, Yuta Itoh, Maud Ranchet, Kiyoshi Kiyokawa, John Morgan, Hannes DevosEmulation of Physician Tasks in Eye-Tracked Virtual Reality for Remote Diagnosis of Neurodegenerative Diseasehttps://www.researchgate.net/publication/313022699_Emulation_of_Physician_Tasks_in_Eye-tracked_Virtual_Reality_for_Remote_Diagnosis_of_Neurodegenerative_Diseasehttps://www.researchgate.net/profile/Jason_Orlosky/publication/313022699_Emulation_of_Physician_Tasks_in_Eye-tracked_Virtual_Reality_for_Remote_Diagnosis_of_Neurodegenerative_Disease/links/58d352f6a6fdccd24d43c710/Emulation-of-Physician-Tasks-in-Eye-tracked-Virtual-Reality-for-Remote-Diagnosis-of-Neurodegenerative-Disease.pdfIEEE Transactions on Visualization and Computer GraphicsVirtual reality, eye tracking, diagnosis, visualization
89
2017Enkelejda Kasneci, Alex A. Black, Joanne M. WoodEye-Tracking as a Tool to Evaluate Functional Ability in Everyday Tasks in Glaucomahttps://www.hindawi.com/journals/joph/2017/6425913/abs/http://downloads.hindawi.com/journals/joph/2017/6425913.pdfJournal of ophthalmology
90
2017Andrew L. Kun, Hidde van der Meulen, Christian P. JanssenCalling While Driving: An Initial Experiment With Hololenshttps://www.ris.uu.nl/ws/files/31229478/DA2017HoloLens.pdfInternational Driving Symposium on Human Factors in Driver Assessment, Training and Vehicle DesignHololens
91
2017Ngu Nguyen, Stephan SiggPassFrame: Generating image-based passwords from egocentric videoshttp://ieeexplore.ieee.org/abstract/document/7917518/authorsPervasive Computing and Communications WorkshopsAuthentication, Videos, Image segmentation, Cameras, Visualization, Conferences, Pervasive computing
92
2017Timothy Stapleton, Helen Sumin KooBicyclist biomotion visibility aids: a 3D eye-tracking analysishttp://www.emeraldinsight.com/doi/abs/10.1108/IJCST-05-2016-0060International Journal of Clothing Science and TechnologyDesign, Visibility, Eye-tracking, Bicyclists, Biomotion
93
2017Yuta Itoh, Jason Orlosky, Kiyoshi Kiyokawa, Toshiyuki Amano, Maki SugimotoMonocular focus estimation method for a freely-orienting eye using Purkinje-Sanson imageshttp://ieeexplore.ieee.org/abstract/document/7892252/IEEE Virtual Reality (VR)Cameras, Three-dimensional displays, Lighting, Estimation, Measurement by laser beam, Solid modeling, Two dimensional displays
94
2017Thammathip Piumsomboon, Gun Lee, Robert W. Lindeman, Mark BillinghurstExploring natural eye-gaze-based interaction for immersive virtual realityhttp://ieeexplore.ieee.org/abstract/document/7893315/IEEE Symposium on 3D User Interfaces (3DUI)Erbium, Gaze tracking, Resists, Painting, Electronic mail, Two dimensional displays, Portable computers
95
2017Leanne Chukoskie, Jacqueline Nguyen, Jeanne TownsendGaze-contingent Games for Neurocognitive Therapy: More than Meets the Eye?http://cogain2017.cogain.org/camready/talk2-Chukoskie.pdfCOGAIN 2017 (Talk)attention, eye tracking, gaze-contingent, training, usability, video games
96
2017Carlos E. L. Elmadjian, Antonio Diaz-Tula, Fernando O. Aluani, Carlos H. MorimotoGaze interaction using low-resolution images at 5 FPShttp://cogain2017.cogain.org/camready/talk5-Elmadjian.pdfCOGAIN 2017 (Talk)
97
2017David Dunn, Cary Tippets, Kent Torell, Petr Kellnhofer, Kaan Aksit, Piotr Didyk, Karol Myszkowski, David Luebke, Henry FuchsWide Field Of View Varifocal Near-Eye Display Using See-Through Deformable Membrane Mirrorshttp://ieeexplore.ieee.org/abstract/document/7829412/http://telepresence.web.unc.edu/files/2017/01/Dunn_2017_TVCG_MembraneAR.pdfIEEE Transactions on Visualization and Computer GraphicsAugmented reality, displays, focus accommodation, perception, user study
98
2017Christian Lander, Frederik Wiehr, Nico Herbig, Antonio Krüger, Markus LöchtefeldInferring Landmarks for Pedestrian Navigation from Mobile Eye-Tracking Data and Google Street Viewhttp://dl.acm.org/citation.cfm?doid=3027063.3053201http://umtl.dfki.de/~fred/papers/final-ea2721-lander.pdfCHI Conference (Extended Abstracts on Human Factors in Computing Systems)Landmarks, Eye Tracking, Google Street View, Navigation
99
2017Michaela Klauck, Yusuke Sugano, Andreas BullingNoticeable or Distractive?: A Design Space for Gaze-Contingent User Interface Notificationshttp://dl.acm.org/citation.cfm?doid=3027063.3053085http://delivery.acm.org/10.1145/3060000/3053085/ea1779-klauck.pdf?ip=14.207.3.246&id=3053085&acc=OPEN&key=4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E6D218144511F3437&CFID=760488783&CFTOKEN=40502403&__acm__=1494304622_07758cbfc2c218d89f5a8aa9c6232cb7CHI Conference (Extended Abstracts on Human Factors in Computing Systems)Interruptions, Attentive User Interfaces, Eye Tracking, Public Display, Peripheral Display
100
2017Michael Barz, Peter Poller, Daniel SonntagEvaluating Remote and Head-worn Eye Trackers in Multi-modal Speech-based HRIhttp://dl.acm.org/citation.cfm?id=3038367HRI (Human-Robot Interaction)