Pupil Citation List
Year Published | Author(s) | Title | URL | PDF | Journal/Conference | Keywords
2018 | Michael Barz, Florian Daiber, Daniel Sonntag, Andreas Bulling | Error-Aware Gaze-Based Interfaces for Robust Mobile Gaze Interaction | URL: https://doi.org/10.1145/3204493.3204536 | PDF: https://perceptual.mpi-inf.mpg.de/files/2018/04/barz18_etra.pdf | ETRA 2018 | Keywords: Eye Tracking; Mobile Interaction; Gaze Interaction; Error Model; Error-Aware
2018 | Reuben M. Aronson, Thiago Santini, Thomas C. Kübler, Enkelejda Kasneci, Siddhartha Srinivasa, Henny Admoni | Eye-Hand Behavior in Human-Robot Shared Manipulation | URL: https://dl.acm.org/citation.cfm?id=3171287 | PDF: https://www.ri.cmu.edu/wp-content/uploads/2018/01/hri2018_aronson.pdf | Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, March 2018 | Keywords: human-robot interaction, eye gaze, eye tracking, shared autonomy, nonverbal communication
2018 | Mikko Kytö, Barrett Ens, Thammathip Piumsomboon, Gun A. Lee, Mark Billinghurst | Pinpointing: Precise Head- and Eye-Based Target Selection for Augmented Reality | URL: https://dl.acm.org/citation.cfm?id=3173655 | PDF: https://www.researchgate.net/profile/Mikko_Kytoe/publication/323970135_Pinpointing_Precise_Head-_and_Eye-Based_Target_Selection_for_Augmented_Reality/links/5ab559da0f7e9b68ef4cf26a/Pinpointing-Precise-Head-and-Eye-Based-Target-Selection-for-Augmented-Reality.pdf | CHI 2018 | Keywords: Eye tracking; gaze interaction; refinement techniques; target selection; augmented reality; head-worn display
2018 | Leanne Chukoskie, Shengyao Guo, Eric Ho, Yalun Zheng, Qiming Chen, Vivian Meng, John Cao, Nikhita Devgan, Si Wu, Pamela C. Cosman | Quantifying Gaze Behavior during Real World Interactions using Automated Object, Face, and Fixation Detection | URL: https://ieeexplore.ieee.org/abstract/document/8328848/ | IEEE Transactions on Cognitive and Developmental Systems | Keywords: eye-tracking, gaze behavior, face detection, computer vision
2018 | Kai Dierkes, Moritz Kassner, Andreas Bulling | A novel approach to single camera, glint-free 3D eye model fitting including corneal refraction | PDF: https://perceptual.mpi-inf.mpg.de/files/2018/04/dierkes18_etra.pdf | ETRA 2018 | Keywords: Eye tracking, refraction, 3D eye model, pupil detection, contour-based, glint-free
2018 | Thomas Kosch, Mariam Hassib, Daniel Buschek, Albrecht Schmidt | Look into my Eyes: Using Pupil Dilation to Estimate Mental Workload for Task Complexity Adaptation | URL: https://dl.acm.org/citation.cfm?id=3188643 | CHI EA 2018 | Keywords: Cognition-Aware Interfaces; Workload-Aware Computing; Pupil Dilation; Eye Tracking
2018 | Damian Almaraz, Brock Carlson, Hieu-Trung Vu, Jeremy Loebach | Pupillometry as a measure of auditory cognitive processes and listening effort | URL: https://doi.org/10.1121/1.5035727 | The Journal of the Acoustical Society of America 143, 1751 (2018)
2018 | Iuliia Brishtel, Shoya Ishimaru, Olivier Augereau, Koichi Kise, Andreas Dengel | Assessing Cognitive Workload on Printed and Electronic Media using Eye-Tracker and EDA Wristband | URL: https://dl.acm.org/citation.cfm?id=3180354 | PDF: https://www.dropbox.com/s/r807qwnvkz6v0lj/IUI2018Iuliia.pdf?raw=1 | IUI '18 Companion: Proceedings of the 23rd International Conference on Intelligent User Interfaces Companion | Keywords: E-learning; Eye-Tracking; Reading; Information Processing; Electrodermal Activity; Cognitive Workload; User-Interface
2018 | Mirko Raković, Nuno Duarte, Jovica Tasevski, José Santos-Victor, Branislav Borovac | A dataset of head and eye gaze during dyadic interaction task for modeling robot gaze behavior | URL: https://doi.org/10.1051/matecconf/201816103002 | PDF: https://www.matec-conferences.org/articles/matecconf/pdf/2018/20/matecconf_erzr2018_03002.pdf | 13th International Scientific-Technical Conference on Electromechanics and Robotics “Zavalishin’s Readings” 2018
2018 | Mohamed Khamis, Carl Oechsner, Florian Alt, Andreas Bulling | VRPursuits: Interaction in Virtual Reality using Smooth Pursuit Eye Movements | URL: https://doi.org/10.1145/3206505.3206522 | PDF: https://perceptual.mpi-inf.mpg.de/files/2018/04/khamis18_avi.pdf | AVI 2018 | Keywords: Eye Tracking, Virtual Reality, Gaze Interaction, Pursuits
2018 | Ruimin Li, Bin Li, Shixiong Zhang, Hong Fu, Wai-Lun Lo, Jie Yu, Cindy H.P. Sit, Desheng Wen | Evaluation of the fine motor skills of children with DCD using the digitalised visual-motor tracking system | URL: https://ieeexplore.ieee.org/abstract/document/8316738/ | PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=8316738 | The Journal of Engineering, Volume 2018, Issue 2 | Keywords: developmental coordination disorder, hand movement, eye gaze position, eye tracker, digitalised visual-motor tracking system, DCD, children, fine motor skills
2018 | Nuno Duarte, Jovica Tasevski, Moreno Coco, Mirko Raković, José Santos-Victor | Action Anticipation: Reading the Intentions of Humans and Robots | URL: https://arxiv.org/abs/1802.02788 | PDF: https://arxiv.org/pdf/1802.02788
2018 | Julian Steil, Philipp Müller, Yusuke Sugano, Andreas Bulling | Forecasting User Attention During Everyday Mobile Interactions Using Device-Integrated and Wearable Sensors | URL: https://arxiv.org/abs/1801.06011 | PDF: https://arxiv.org/pdf/1801.06011.pdf | MobileHCI 2018 | Keywords: Egocentric Vision; Handheld Mobile Device; Attention Shifts; Mobile Eye Tracking; Attentive User Interfaces
2018 | Christian Lander, Marco Speicher, Frederic Kerber, Antonio Krüger | Towards Fixation Extraction in Corneal Imaging Based Eye Tracking Data | URL: https://doi.org/10.1145/3170427.3188597 | CHI ’18 Extended Abstracts, April 21–26, 2018 | Keywords: Corneal Imaging; Fixation; Image Stitching
2018 | Sylvain Pauchet, Catherine Letondal, Jean-Luc Vinot, Mickaël Causse, Mathieu Cousy, Valentin Becquet, Guillaume Crouzet | GazeForm: Dynamic Gaze-adaptive Touch Surface for Eyes-free Interaction in Airliner Cockpits | PDF: https://www.researchgate.net/profile/Mickael_Causse/publication/324561583_GazeForm_Dynamic_Gaze-adaptive_Touch_Surface_for_Eyes-free_Interaction_in_Airliner_Cockpits/links/5ad5a8fda6fdcc293580bcdf/GazeForm-Dynamic-Gaze-adaptive-Touch-Surface-for-Eyes-free-Interaction-in-Airliner-Cockpits.pdf | DIS 2018 (Hong Kong) | Keywords: Eyes-free interaction; touchscreens; TEI; shape-changing interfaces; adaptive interfaces; eye-tracking; critical contexts
2018 | Paul Schydlo, Mirko Rakovic, Lorenzo Jamone, Jose Santos-Victor | Anticipation in Human-Robot Cooperation: A recurrent neural network approach for multiple action sequences prediction | URL: https://arxiv.org/abs/1802.10503 | PDF: https://arxiv.org/pdf/1802.10503.pdf
2018 | Adithya B, Lee Hanna, Pavan Kumar B N, Youngho Chai | Calibration Techniques and Gaze Accuracy Estimation in Pupil Labs Eye Tracker | URL: http://www.dbpia.co.kr/Journal/ArticleDetail/NODE07404419 | TECHART: Journal of Arts and Imaging Science, Vol. 5, No. 1 | Keywords: calibration
2018 | T.A.B. de Boer, J. Hoogmoed, N.M. Looye, J.R.P. van der Toorn, R.P. de Vos, J. Stapel, P. Bazilinskyy, J.C.F. de Winter | Combining eye-tracking with semantic scene labelling in a car | PDF: https://www.researchgate.net/profile/Joost_De_Winter/publication/322293911_Combining_eye-tracking_with_semantic_scene_labelling_in_a_car/links/5a5136d0aca2725638c592c0/Combining-eye-tracking-with-semantic-scene-labelling-in-a-car.pdf | Working paper, January 2018 | Keywords: Computer Vision, Semantic Scene Understanding, Object Classification, Driving, Transportation
2018 | Ľuboš Hládek, Bernd Porr, W. Owen Brimijoin | Real-time estimation of horizontal gaze angle by saccade integration using in-ear electrooculography | URL: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0190420 | PDF: http://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0190420&type=printable | PLoS ONE 13(1) | Keywords: Electrooculography, Eye gaze angle, Saccade
2018 | Martina Truschzinski, Alberto Betella, Guido Brunnett, Paul F.M.J. Verschure | Emotional and cognitive influences in air traffic controller tasks: An investigation using a virtual environment? | URL: https://www.sciencedirect.com/science/article/pii/S0003687017302855 | Applied Ergonomics, Volume 69 | Keywords: Air traffic control, Personality, Workload, Mood, Virtual reality
2018 | Yasmeen Abdrabou, Khaled Kassem, Jailan Salah, Reem El-Gendy, Mahesty Morsy, Yomna Abdelrahman, Slim Abdennadher | Exploring the Usage of EEG and Pupil Diameter to Detect Elicited Valence | URL: https://link.springer.com/chapter/10.1007/978-3-319-73888-8_45 | Intelligent Human Systems Integration | Keywords: EEG, Eye tracker, Affective computing
2018 | Sebastiaan Mathôt, Jasper Fabius, Elle Van Heusden, Stefan Van der Stigchel | Safe and sensible preprocessing and baseline correction of pupil-size data | URL: https://link.springer.com/article/10.3758/s13428-017-1007-2 | PDF: https://link.springer.com/content/pdf/10.3758%2Fs13428-017-1007-2.pdf | Behavior Research Methods (January 2018) | Keywords: Pupillometry, Pupil size, Baseline correction, Research methods
2018 | Rahool Patel, Adrian Zurca | 396: I See What You See: Adding Eye-Tracking to Medical Simulation | URL: http://journals.lww.com/ccmjournal/Fulltext/2018/01001/396___I_SEE_WHAT_YOU_SEE_ADDING_EYE_TRACKING_TO.362.aspx | Critical Care Medicine, Volume 46, Issue 1 (January 2018) | Keywords: Medical simulation
2018 | Julian Steil, Marion Koelle, Wilko Heuten, Susanne Boll, Andreas Bulling | PrivacEye: Privacy-Preserving First-Person Vision Using Image Features and Eye Movement Analysis | URL: https://arxiv.org/abs/1801.04457 | PDF: https://arxiv.org/pdf/1801.04457.pdf | Keywords: Egocentric Vision; Eye Tracking; Gaze Behaviour
2017 | Wen-Chin Li, Jiaqi Cao, Jr-Hung Lin, Graham Braithwaite, Matthew Greaves | The Evaluation of Pilot’s First Fixation and Response Time to Different Design of Alerting Messages | PDF: https://pdfs.semanticscholar.org/f9e6/e1c8b5194c4e7b7565bf346e8192d7289803.pdf | AG 2017 | Keywords: Cockpit design, Crew Alerting System, Eye movement, Human-computer interaction, Quick Reference Handbook
2017 | Marco Filippucci, Fabio Bianconi, Elisa Bettollini, Michela Meschini, Marco Seccaroni | Survey and Representation for Rural Landscape. New Tools for New Strategies: The Example of Campello Sul Clitunno | PDF: http://www.mdpi.com/2504-3900/1/9/934/pdf | Keywords: landscape and image; perception; eye tracking; algorithmic spatial analysis; participation
2017 | Brendan John | A Dataset of Gaze Behavior in VR Faithful to Natural Statistics | URL: http://scholarworks.rit.edu/cgi/viewcontent.cgi?article=10716&context=theses | RIT Student MS Thesis
2017 | Jork Stapel, Freddy Antony Mullakkal-Babu, Riender Happee | Driver behaviour and workload in an on-road automated vehicle | PDF: https://www.researchgate.net/profile/Jork_Stapel2/publication/318702364_Driver_Behavior_and_Workload_in_an_On-road_Automated_Vehicle/links/597887fa45851570a1b9623a/Driver-Behavior-and-Workload-in-an-On-road-Automated-Vehicle.pdf | Preprint | Keywords: Automated Driving; On-road; Workload; Experience; Underload
2017 | Iuliia Kotseruba, John K. Tsotsos | STAR-RT: Visual attention for real-time video game playing | URL: https://arxiv.org/abs/1711.09464 | PDF: https://arxiv.org/pdf/1711.09464.pdf | arXiv preprint | Keywords: STAR; Cognitive Programs; visual attention; Visual Routines; real-time vision; platform video games; game AI
2017 | Qi Sun, Fu-Chung Huang, Joohwan Kim, Li-Yi Wei, David Luebke, Arie Kaufman | Perceptually-Guided Foveation for Light Field Displays | URL: http://research.nvidia.com/publication/2017-11_Perceptually-Guided-Foveation-for | PDF: http://research.nvidia.com/sites/default/files/publications/c121-f121_199-a18-paperfinal-v3.pdf | ACM Trans. Graph. 36, 6, Article 192 (November 2017) | Keywords: light field, computational display, foveation, sampling
2017 | Changwon Jang, Kiseung Bang, Seokil Moon, Jonghyun Kim, Seungjae Lee, Byoungho Lee | Retinal 3D: Augmented Reality Near-Eye Display Via Pupil-Tracked Light Field Projection on Retina | URL: http://oeqelab.snu.ac.kr/retinal3d | PDF: http://oeqelab.snu.ac.kr/?module=file&act=procFileDownload&file_srl=74430&sid=cfde647ef4e59a0adb7f07197f53b84e&module_srl=74407 | SIGGRAPH 2017, Vol. 36, Issue 6, Article No. 190 | Keywords: Near-eye display, eye tracking, computational displays, holographic optical element, vergence-accommodation conflict
2017 | Khaled Kassem, Jailan Salah, Yasmeen Abdrabou, Mahesty Morsy, Reem El-Gendy, Yomna Abdelrahman, Slim Abdennadher | DiVA: Exploring the Usage of Pupil Diameter to Elicit Valence and Arousal | PDF: https://www.researchgate.net/profile/Yomna_Abdelrahman/publication/321783827_DiVA_exploring_the_usage_of_pupil_di_ameter_to_elicit_v_alence_and_a_rousal/links/5a328a17aca27271444f3689/DiVA-exploring-the-usage-of-pupil-di-ameter-to-elicit-v-alence-and-a-rousal.pdf | Keywords: Arousal; Valence; Eye Tracker; Pupil Diameter
2017 | Benjamin Hatscher, Maria Luz, Lennart E. Nacke, Norbert Elkmann, Veit Muller, Christian Hansen | GazeTap: Towards Hands-Free Interaction in the Operating Room | PDF: https://www.researchgate.net/profile/Christian_Hansen17/publication/320372375_GazeTap_Towards_Hands-Free_Interaction_in_the_Operating_Room/links/59f7498e0f7e9b553ebedb6d/GazeTap-Towards-Hands-Free-Interaction-in-the-Operating-Room.pdf | International Conference on Multimodal Interaction | Keywords: Input techniques, multimodal interaction, foot input, gaze input, eye tracking, gaze-foot interaction, HCI in the operating room
2017 | Christian Lander, Sven Gehring, Markus Löchtefeld, Andreas Bulling, Antonio Krüger | EyeMirror: Mobile Calibration-Free Gaze Approximation using Corneal Imaging | PDF: https://perceptual.mpi-inf.mpg.de/files/2017/11/lander17_mum.pdf | MUM 2017, November 26–29, 2017 | Keywords: Corneal Image; mobile device; gaze approximation; feature tracking; pervasive
2017 | Zhong Zhao, Robin N. Salesse, Ludovic Marin, Mathieu Gueugnon, Benoit G. Bardy | Likability’s Effect on Interpersonal Motor Coordination: Exploring Natural Gaze Direction | URL: https://www.frontiersin.org/articles/10.3389/fpsyg.2017.01864/full | PDF: https://www.frontiersin.org/articles/10.3389/fpsyg.2017.01864/pdf | Frontiers in Psychology | Keywords: interpersonal motor coordination, likability, finger-tapping, markers
2017 | Ken Pfeuffer, Benedikt Mayer, Diako Mardanbegi, Hans Gellersen | Gaze + Pinch Interaction in Virtual Reality | URL: https://dl.acm.org/citation.cfm?id=3132180 | PDF: https://kenpfeuffer.files.wordpress.com/2015/02/p99-pfeuffer.pdf | SUI ’17, October 16–17, 2017 | Keywords: Gaze; pinch; freehand gesture; interaction technique; multimodal interface; menu; eye tracking; virtual reality
2017 | Augusto Esteves, David Verweij, Liza Suraiya, Rasel Islam, Youryang Lee, Ian Oakley | SmoothMoves: Smooth Pursuits Head Movements for Augmented Reality | URL: https://www.researchgate.net/publication/320571212_SmoothMoves_Smooth_Pursuits_Head_Movements_for_Augmented_Reality | PDF: https://www.researchgate.net/profile/Islam_Md_Rasel/publication/320571212_SmoothMoves_Smooth_Pursuits_Head_Movements_for_Augmented_Reality/links/5a0255c1aca2720df3ca03d2/SmoothMoves-Smooth-Pursuits-Head-Movements-for-Augmented-Reality.pdf | ACM Symposium on User Interface Software and Technology (UIST) | Keywords: Wearable computing, eye tracking, augmented reality, AR, input technique, smooth pursuits, motion matching, HMD
2017 | Yun Suen Pai, Benjamin Outram, Benjamin Tag, Megumi Isogai, Daisuke Ochi, Kai Kunze | GazeSphere: Navigating 360-Degree-Video Environments in VR Using Head Rotation and Eye Gaze | URL: http://dl.acm.org/citation.cfm?id=3102183 | ACM SIGGRAPH Posters | Keywords: Virtual reality, 360-degree-video, eye tracking, orbital navigation
2017 | Mihai Bâce, Philippe Schlattner, Vincent Becker, Gábor Sörös | Facilitating Object Detection and Recognition through Eye Gaze | PDF: http://www.vs.inf.ethz.ch/publ/papers/mbace_MobileHCI2017_workshop.pdf | MobileHCI ’17 Workshops, September 04–09, 2017 | Keywords: Eye Tracking; Eye Gaze; Wearable Computing; HCI
2017 | Otto Lappi, Paavo Rinkkala, Jami Pekkanen | Systematic Observation of an Expert Driver's Gaze Strategy—An On-Road Case Study | URL: https://doi.org/10.3389/fpsyg.2017.00620 | PDF: https://www.frontiersin.org/articles/10.3389/fpsyg.2017.00620/full | Frontiers in Psychology, 27 April 2017
2017 | Ana Serrano, Vincent Sitzmann, Jaime Ruiz-Borau, Gordon Wetzstein, Diego Gutierrez, Belen Masia | Movie Editing and Cognitive Event Segmentation in Virtual Reality Video | URL: https://doi.org/10.1145/3072959.3073668 | PDF: http://webdiis.unizar.es/~aserrano/docs/Serrano_SIGG2017_VR-cine.pdf | ACM Transactions on Graphics, Vol. 36, No. 4, Article 47 (July 2017) | Keywords: VR, Human Centered Computing, Immersive environments, cinematography
2017 | Guillem Torrente Marti | Mobility for the severely disabled: a head-controlled wheelchair | PDF: https://repositori.upf.edu/bitstream/handle/10230/32897/Torrente_2017.pdf?sequence=1&isAllowed=y | Bachelor's Thesis | Keywords: Accessibility, Eye Tracking
2017 | Francesco Walker, Berno Bucker, Nicola C. Anderson, Daniel Schreij, Jan Theeuwes | Looking at paintings in the Vincent Van Gogh Museum: Eye movement patterns of children and adults | URL: https://doi.org/10.1371/journal.pone.0178912 | PDF: http://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0178912&type=printable | PLoS ONE
2017 | Jason Orlosky, Yuta Itoh, Maud Ranchet, Kiyoshi Kiyokawa, John Morgan, Hannes Devos | Emulation of Physician Tasks in Eye-Tracked Virtual Reality for Remote Diagnosis of Neurodegenerative Disease | URL: https://www.researchgate.net/publication/313022699_Emulation_of_Physician_Tasks_in_Eye-tracked_Virtual_Reality_for_Remote_Diagnosis_of_Neurodegenerative_Disease | PDF: https://www.researchgate.net/profile/Jason_Orlosky/publication/313022699_Emulation_of_Physician_Tasks_in_Eye-tracked_Virtual_Reality_for_Remote_Diagnosis_of_Neurodegenerative_Disease/links/58d352f6a6fdccd24d43c710/Emulation-of-Physician-Tasks-in-Eye-tracked-Virtual-Reality-for-Remote-Diagnosis-of-Neurodegenerative-Disease.pdf | IEEE Transactions on Visualization and Computer Graphics | Keywords: Virtual reality, eye tracking, diagnosis, visualization
2017 | Enkelejda Kasneci, Alex A. Black, Joanne M. Wood | Eye-Tracking as a Tool to Evaluate Functional Ability in Everyday Tasks in Glaucoma | URL: https://www.hindawi.com/journals/joph/2017/6425913/abs/ | PDF: http://downloads.hindawi.com/journals/joph/2017/6425913.pdf | Journal of Ophthalmology
2017 | Andrew L. Kun, Hidde van der Meulen, Christian P. Janssen | Calling While Driving: An Initial Experiment With HoloLens | PDF: https://www.ris.uu.nl/ws/files/31229478/DA2017HoloLens.pdf | International Driving Symposium on Human Factors in Driver Assessment, Training and Vehicle Design | Keywords: HoloLens
2017 | Ngu Nguyen, Stephan Sigg | PassFrame: Generating image-based passwords from egocentric videos | URL: http://ieeexplore.ieee.org/abstract/document/7917518/authors | Pervasive Computing and Communications Workshops | Keywords: Authentication, Videos, Image segmentation, Cameras, Visualization, Conferences, Pervasive computing
2017 | Timothy Stapleton, Helen Sumin Koo | Bicyclist biomotion visibility aids: a 3D eye-tracking analysis | URL: http://www.emeraldinsight.com/doi/abs/10.1108/IJCST-05-2016-0060 | International Journal of Clothing Science and Technology | Keywords: Design, Visibility, Eye-tracking, Bicyclists, Biomotion
2017 | Yuta Itoh, Jason Orlosky, Kiyoshi Kiyokawa, Toshiyuki Amano, Maki Sugimoto | Monocular focus estimation method for a freely-orienting eye using Purkinje-Sanson images | URL: http://ieeexplore.ieee.org/abstract/document/7892252/ | IEEE Virtual Reality (VR) | Keywords: Cameras, Three-dimensional displays, Lighting, Estimation, Measurement by laser beam, Solid modeling, Two dimensional displays
2017 | Thammathip Piumsomboon, Gun Lee, Robert W. Lindeman, Mark Billinghurst | Exploring natural eye-gaze-based interaction for immersive virtual reality | URL: http://ieeexplore.ieee.org/abstract/document/7893315/ | IEEE Symposium on 3D User Interfaces (3DUI) | Keywords: Erbium, Gaze tracking, Resists, Painting, Electronic mail, Two dimensional displays, Portable computers
2017 | Leanne Chukoskie, Jacqueline Nguyen, Jeanne Townsend | Gaze-contingent Games for Neurocognitive Therapy: More than Meets the Eye? | PDF: http://cogain2017.cogain.org/camready/talk2-Chukoskie.pdf | COGAIN 2017 (Talk) | Keywords: attention, eye tracking, gaze-contingent, training, usability, video games
2017 | Carlos E. L. Elmadjian, Antonio Diaz-Tula, Fernando O. Aluani, Carlos H. Morimoto | Gaze interaction using low-resolution images at 5 FPS | PDF: http://cogain2017.cogain.org/camready/talk5-Elmadjian.pdf | COGAIN 2017 (Talk)
2017 | David Dunn, Cary Tippets, Kent Torell, Petr Kellnhofer, Kaan Aksit, Piotr Didyk, Karol Myszkowski, David Luebke, Henry Fuchs | Wide Field Of View Varifocal Near-Eye Display Using See-Through Deformable Membrane Mirrors | URL: http://ieeexplore.ieee.org/abstract/document/7829412/ | PDF: http://telepresence.web.unc.edu/files/2017/01/Dunn_2017_TVCG_MembraneAR.pdf | IEEE Transactions on Visualization and Computer Graphics | Keywords: Augmented reality, displays, focus accommodation, perception, user study
2017 | Christian Lander, Frederik Wiehr, Nico Herbig, Antonio Krüger, Markus Löchtefeld | Inferring Landmarks for Pedestrian Navigation from Mobile Eye-Tracking Data and Google Street View | URL: http://dl.acm.org/citation.cfm?doid=3027063.3053201 | PDF: http://umtl.dfki.de/~fred/papers/final-ea2721-lander.pdf | CHI Conference Extended Abstracts on Human Factors in Computing Systems | Keywords: Landmarks, Eye Tracking, Google Street View, Navigation
2017 | Michaela Klauck, Yusuke Sugano, Andreas Bulling | Noticeable or Distractive?: A Design Space for Gaze-Contingent User Interface Notifications | URL: http://dl.acm.org/citation.cfm?doid=3027063.3053085 | CHI Conference Extended Abstracts on Human Factors in Computing Systems | Keywords: Interruptions, Attentive User Interfaces, Eye Tracking, Public Display, Peripheral Display
2017 | Michael Barz, Peter Poller, Daniel Sonntag | Evaluating Remote and Head-worn Eye Trackers in Multi-modal Speech-based HRI | URL: http://dl.acm.org/citation.cfm?id=3038367 | HRI (Human-Robot Interaction)
2017 | Martina Truschzinski | Modeling Workload: A System Theory Approach | URL: http://dl.acm.org/citation.cfm?id=3038408 | HRI (Human-Robot Interaction)
2017 | Mark Billinghurst, Kunal Gupta, Katsutoshi Masai, Youngho Lee, Gun Lee, Kai Kunze, Maki Sugimoto | Is It in Your Eyes? Explorations in Using Gaze Cues for Remote Collaboration | URL: http://link.springer.com/chapter/10.1007/978-3-319-45853-3_9 | Book chapter (Springer): Collaboration Meets Interactive Spaces
2017 | Patrick Renner, Thies Pfeiffer | Attention Guiding Techniques using Peripheral Vision and Eye Tracking for Feedback in Augmented-Reality-based Assistance Systems | URL: http://ieeexplore.ieee.org/abstract/document/7893338/ | PDF: https://pub.uni-bielefeld.de/download/2908162/2908197 | IEEE Symposium on 3D User Interfaces (3DUI) | Keywords: HCI, Information interfaces, Presentation
2017 | Avi Caspi, Arup Roy, Jessy D. Dorn, Robert J. Greenberg | Retinotopic to Spatiotopic Mapping in Blind Patients Implanted With the Argus II Retinal Prosthesis | URL: http://iovs.arvojournals.org/article.aspx?articleid=2597840 | Investigative Ophthalmology & Visual Science (IOVS)
2017 | P. Hevesi, J. A. Ward, O. Amiraslanov, G. Pirkl, P. Lukowicz | Analysis of the Usefulness of Mobile Eyetracker for the Recognition of Physical Activities | URL: http://discovery.ucl.ac.uk/10039438/ | PDF: http://discovery.ucl.ac.uk/10039438/8/Ward_2017%20Analysis%20of%20the%20Usefulness%20of%20Mobile%20Eyetracker%20for%20the%20Recognition%20of%20Physical%20Activities_.pdf | UBICOMM 2017 | Keywords: Eyetracker; activity recognition; sensor fusion
2017 | Thiago Santini, Wolfgang Fuhl, David Geisler, Enkelejda Kasneci | EyeRecToo: Open-Source Software for Real-Time Pervasive Head-Mounted Eye-Tracking | URL: http://www.ti.uni-tuebingen.de/Wolfgang-Fuhl.1651.0.html | PDF: http://www.ti.uni-tuebingen.de/uploads/tx_timitarbeiter/main.pdf | Keywords: Eye movements, pupil detection, calibration, gaze estimation, open-source, eye tracking, data acquisition, human-computer interaction, real-time, pervasive
2017 | Aleksandar Rodić, Theodor Borangiu | Advances in Robot Design and Intelligent Control: Proceedings of the 25th Conference on Robotics in Alpe-Adria-Danube Region (RAAD16) | URL: https://books.google.co.th/books?hl=en&lr=&id=QCecDQAAQBAJ&oi=fnd&pg=PA378&dq=pupil+labs&ots=dRoZH2Qv8Y&sig=S-RoCdSwNeQO5wxwHCikjZTojJ0&redir_esc=y#v=onepage&q=pupil%20labs&f=false | Proceedings of Robotics in Alpe-Adria-Danube Region (RAAD)
2016 | Miika Toivanen | An advanced Kalman filter for gaze tracking signal | URL: https://www.researchgate.net/publication/287507544_An_advanced_Kalman_filter_for_gaze_tracking_signal | PDF: https://www.researchgate.net/profile/Miika_Toivanen/publication/287507544_An_advanced_Kalman_filter_for_gaze_tracking_signal/links/5a620e3e0f7e9b6b8fd4248f/An-advanced-Kalman-filter-for-gaze-tracking-signal.pdf | Biomedical Signal Processing and Control | Keywords: Gaze tracking, Kalman filter, Principal component analysis, Image analysis
2016 | Florian Alt, Sarah Torma, Daniel Buschek | Don’t Disturb Me – Understanding Secondary Tasks on Public Displays | PDF: https://pdfs.semanticscholar.org/d6f4/dab5af8e61f1167d84ac2e1c44e5e5518ec3.pdf | International Symposium on Pervasive Displays (PERDIS) | Keywords: public display, secondary task performance, mental workload, parallel-task environment
2016 | Carolina Rodriguez-Paras, Shiyan Yang, Thomas K. Ferris | Using Pupillometry to Indicate the Cognitive Redline | PDF: http://journals.sagepub.com/doi/pdf/10.1177/1541931213601157 | Proceedings of the Human Factors and Ergonomics Society Annual Meeting
2016 | Ondřej Kubánek | The Impact Analysis of the Technical Devices Used by the Driver on the Reaction Time | PDF: https://dspace.vutbr.cz/bitstream/handle/11012/63302/DP%20Kub%C3%A1nek.pdf?sequence=2 | Thesis, Brno University of Technology | Keywords: Driver, reaction time, legal requirements, calling
2016 | Yun Suen Pai | Physiological Signal-Driven Virtual Reality in Social Spaces | URL: http://dl.acm.org/citation.cfm?id=2984787 | ACM Symposium on User Interface Software and Technology (UIST)
2016 | Henny Admoni, Siddhartha Srinivasa | Predicting User Intent Through Eye Gaze for Shared Autonomy | PDF: http://hennyadmoni.com/documents/admoni2016aaaifs.pdf
2016 | Astrid Weiss, Andreas Huber, Jürgen Minichberger, Markus Ikeda | First Application of Robot Teaching in an Existing Industry 4.0 Environment: Does It Really Work? | URL: http://www.mdpi.com/2075-4698/6/3/20 | PDF: http://www.mdpi.com/2075-4698/6/3/20/pdf | Keywords: human-robot interaction, Industry 4.0, case study, field test, robot teaching
2016 | Tilman Dingler, Rufat Rzayev, Valentin Schwind, Niels Henze | RSVP on the Go - Implicit Reading Support on Smart Watches Through Eye Tracking | URL: http://dl.acm.org/citation.cfm?id=2971794 | PDF: http://tilmanification.org/assets/pdf/Dingler2016RSVPEyeControl.pdf | ACM International Symposium on Wearable Computers (ISWC) | Keywords: Reading interfaces, RSVP, eye-tracking, eye gaze interaction, mental load, comprehension
2016 | Hidde van der Meulen, Petra Varsanyi, Lauren Westendorf, Andrew L. Kun, Orit Shaer | Towards Understanding Collaboration around Interactive Surfaces: Exploring Joint Visual Attention | URL: http://dl.acm.org/citation.cfm?id=2984778 | PDF: http://cs.wellesley.edu/~hcilab/publication/uist16-eyetracking.pdf | ACM Symposium on User Interface Software and Technology (UIST) | Keywords: Visual attention, collaboration, eye tracking
2016 | Daniel Pohl, Xucong Zhang, Andreas Bulling, Oliver Grau | Concept for Using Eye Tracking in a Head-Mounted Display to Adapt Rendering to the User’s Current Visual Field | URL: http://dl.acm.org/citation.cfm?doid=2993369.2996300 | PDF: https://perceptual.mpi-inf.mpg.de/wp-content/blogs.dir/12/files/2016/11/pohl2016_vrst.pdf | Keywords: Virtual reality, eye tracking, rendering, head-mounted display
2016 | Santiago Bonada, Rafael Veras, Christopher Collins | Personalized Views for Immersive Analytics | URL: http://dl.acm.org/citation.cfm?id=3009953 | PDF: http://vialab.science.uoit.ca/wp-content/papercite-data/pdf/bon2016a.pdf
2016 | Hidde van der Meulen, Andrew L. Kun, Christian P. Janssen | Switching Back to Manual Driving: How Does it Compare to Simply Driving Away After Parking? | URL: http://dspace.library.uu.nl/handle/1874/338115 | PDF: http://www.cpjanssen.nl/Publications/VanDerMeulenKunJanssen_AutoUI2016.pdf | Keywords: Autonomous driving, driver distraction, parking, visual distraction
2016 | Mihai Bace, Teemu Leppanen, Argenis Ramirez Gomez, David Gil de Gomez | ubiGaze: Ubiquitous Augmented Reality Messaging Using Gaze Gestures | URL: http://dl.acm.org/citation.cfm?id=2999530 | PDF: http://www.vs.inf.ethz.ch/publ/papers/mbace_SA2016_ubiGaze.pdf | Keywords: Augmented Reality, Eye Tracking, Gesture Interaction, Gaze Gestures, Messaging
2016 | Vincent Sitzmann, Ana Serrano, Amy Pavel, Maneesh Agrawala, Diego Gutierrez, Gordon Wetzstein | Saliency in VR: How do people explore virtual environments? | URL: https://arxiv.org/abs/1612.04335 | PDF: https://arxiv.org/pdf/1612.04335.pdf
2016 | Alexander Plopski, Jason Orlosky, Yuta Itoh, Christian Nitschke, Kiyoshi Kiyokawa, Gudrun Klinker | Automated Spatial Calibration of HMD Systems with Unconstrained Eye-cameras | URL: http://ieeexplore.ieee.org/document/7781771/ | PDF: http://imd.naist.jp/imdweb/pub/plopski_ISMAR16/paper.pdf | Keywords: OST-HMD calibration, eye pose estimation
2016 | Evani Amaral Camargo, Daniel Paz de Araújo, Hermes Renato Hildebrand, Rosangela da Silva Leote | Tecnologias Assistivas e Arte-Educação: Interfaces Digitais e Físicas [Assistive Technologies and Art Education: Digital and Physical Interfaces] | URL: https://www.metodista.br/revistas/revistas-unimep/index.php/comunicacoes/article/view/2949/1850 | Keywords: Assistive Technologies, Digital Interfaces, Art Education, Interaction Design
2016 | Ngu Nguyen, Stephan Sigg | Personalized Image-based User Authentication using Wearable Cameras | PDF: https://arxiv.org/pdf/1612.06209.pdf
2016 | Moayad Mokatren, Tsvi Kuflik, Ilan Shimshoni | Listen to What You Look at: Combining an Audio Guide with a Mobile Eye Tracker on the Go | URL: http://www.cri.haifa.ac.il/index.php/40-uncategorised/474-moayad-mokatran-abstract | PDF: http://ceur-ws.org/Vol-1772/paper1.pdf | AI*CH @ AI*IA (Italian Workshop on Artificial Intelligence for Cultural Heritage)
2016 | Moayad Mokatren, Tsvi Kuflik | Exploring the potential contribution of mobile eye-tracking technology in enhancing the museum visit experience | PDF: http://ceur-ws.org/Vol-1621/paper5.pdf | AVI*CH (Workshop on Advanced Visual Interfaces for Cultural Heritage) | Keywords: Mobile guide; Mobile eye tracking; Personalized information; Smart environment; Context aware service
2016 | Maike Schindler, Achim J. Lilienthal, Ravi Teja Chadalavada, Magnus Ögren | Creativity in the Eye of the Student. Refining Investigations of Mathematical Creativity Using Eye-Tracking Goggles | URL: https://www.researchgate.net/publication/306263411_Creativity_in_the_eye_of_the_student_Refining_investigations_of_mathematical_creativity_using_eye-tracking_goggles | PDF: https://www.researchgate.net/profile/Achim_Lilienthal/publication/306263411_Creativity_in_the_eye_of_the_student_Refining_investigations_of_mathematical_creativity_using_eye-tracking_goggles/links/585546f208ae81995eb419b2.pdf | Conference of the International Group for the Psychology of Mathematics Education
2016 | Joshua Newn, Eduardo Velloso, Marcus Carter, Frank Vetere | Exploring the Effects of Gaze Awareness on Multiplayer Gameplay | URL: http://dl.acm.org/citation.cfm?id=2987740 | PDF: http://www.socialnui.unimelb.edu.au/publications/2016-SocialNUI-Newn-3.pdf | Symposium on Computer-Human Interaction in Play Companion Extended Abstracts | Keywords: Eye tracking, Gaze tracking, Co-located play, Multiplayer, Tabletop games, Board games, Video games, Cognitive strategies, Gaze awareness, EyePlay
2016 | Yun Suen Pai, Benjamin Outram, Noriyasu Vontin, Kai Kunze | Transparent Reality: Using Eye Gaze Focus Depth as Interaction Modality | URL: http://dl.acm.org/citation.cfm?id=2984754 | ACM Symposium on User Interface Software and Technology (UIST) | Keywords: Virtual reality; Eye tracking; Focus depth; Interaction modality
2016 | Kunal Gupta, Gun A. Lee, Mark Billinghurst | Do You See What I See? The Effect of Gaze Tracking on Task Space Remote Collaboration | URL: http://ieeexplore.ieee.org/document/7523400/ | IEEE Transactions on Visualization and Computer Graphics | Keywords: videoconferencing, computer-supported collaborative work, computer conferencing, teleconferencing
2016 | Justus Thies, Michael Zollhofer, Marc Stamminger, Christian Theobalt, Matthias Nießner | FaceVR: Real-Time Facial Reenactment and Eye Gaze Control in Virtual Reality | URL: https://arxiv.org/abs/1610.03151 | PDF: https://arxiv.org/pdf/1610.03151.pdf | arXiv (Computer Vision and Pattern Recognition)
2016 | Stewart Greenhill, Cathie Travers | Focal: An Eye-Tracking Musical Expression Controller | URL: http://stewartgreenhill.com/articles/focal/ | PDF: http://stewartgreenhill.com/documents/FocalEyeTrackingMusicalExpressionController-NIME2016.pdf | Keywords: computer music, eye tracking, expression controller, augmented reality, user interface design
2016 | Gerald Pirkl, Peter Hevesi, Paul Lukowicz, Pascal Klein, Carina Heisel, Sebastian Gröber, Jochen Kuhn, Bernhard Sick | Any Problems? A wearable sensor-based platform for representational learning-analytics | URL: http://dl.acm.org/citation.cfm?doid=2968219.2971383 | ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp) | Keywords: sensor supported education, sensor pen, eye tracker
2016 | Benedikt Gollan, Michael Haslgrübler, Alois Ferscha | Demonstrator for Extracting Cognitive Load from Pupil Dilation for Attention Management Services | URL: http://dl.acm.org/citation.cfm?doid=2968219.2968550 | ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp) | Keywords: Cognitive load, Attention Estimation, Pupillometry, Attention & Interruption Management
2016 | Michael Barz, Daniel Sonntag | Gaze-guided Object Classification using Deep Neural Networks for Attention-based Computing | URL: http://dl.acm.org/citation.cfm?doid=2968219.2971389 | ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp) | Keywords: eye tracking, gaze-based interaction, object classification, visual attention
2016 | Carlos Rafael Fernandes Picanço, François Jacques Tonneau | Análise do comportamento por meio de rastreamento de movimentos oculares: uma nota técnica [Behavior analysis using eye-movement tracking: a technical note] | URL: https://github.com/cpicanco/abpmc-2016 | Keywords: eye-movement tracking; simple discriminations; accessibility
2016 | Yun Suen Pai, Benjamin Tag, Benjamin Outram, Noriyasu Vontin, Kazunori Sugiura, Kai Kunze | GazeSim: Simulating Foveated Rendering Using Depth in Eye Gaze for VR | URL: http://dl.acm.org/citation.cfm?id=2945153 | SIGGRAPH 2016 (Poster) | Keywords: depth of field, eye gaze, foveated rendering, virtual reality, vr
2016 | Yamen Saraiji, Shota Sugimoto, Charith Lasantha Fernando, Kouta Minamizawa, Susumu Tachi | Layered Telepresence: Simultaneous Multi Presence Experience using Eye Gaze based Perceptual Awareness Blending | PDF: http://s2016.siggraph.org/poster_pdfs/Poster_24_-_0074.pdf | SIGGRAPH 2016 (Poster) | Keywords: augmented reality, telepresence, blended reality
2016 | Thiago Santini, Wolfgang Fuhl, Thomas Kubler, Enkelejda Kasneci | EyeRec: An Open-Source Data Acquisition Software for Head-Mounted Eye-Tracking | URL: http://www.ti.uni-tuebingen.de/Thiago-Santini.1781.0.html?&L=1 | PDF: http://www.ti.uni-tuebingen.de/uploads/tx_timitarbeiter/Santini_et_al.pdf | Keywords: Eye movements, recording software, pupil detection, calibration, open-source, data acquisition
2016 | Wolfgang Fuhl, Andreas Bulling, Marc Tonsen, Enkelejda Kasneci | Pupil detection for head-mounted eye tracking in the wild: an evaluation of the state of the art | URL: http://link.springer.com/article/10.1007/s00138-016-0776-4 | Keywords: Pupil detection, Head-mounted eye tracking, Data set, Computer vision, Image processing
2016 | Moayad Mokatren, Tsvi Kuflik, Ilan Shimshoni | Using Eye-Tracking for Enhancing the Museum Visit Experience | URL: http://dl.acm.org/citation.cfm?id=2926060 | International Working Conference on Advanced Visual Interfaces (AVI)
2016 | Xi Wang, David Lindlbauer, Christian Lessig, Marianne Maertens, Marc Alexa | Measuring Visual Salience of 3D Printed Objects | URL: http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=7478427&url=http%3A%2F%2Fieeexplore.ieee.org%2Fxpls%2Fabs_all.jsp%3Farnumber%3D7478427 | Keywords: Calibration, Cameras, Geometry, Observers, Predictive models, Three-dimensional displays, Visualization
2016 | Erol Aygar | Towards gaze recurrence quantification for N-body collaboration environments | PDF: http://www.erolaygar.com/wp-content/uploads/2016/05/TowardsNbodyGazeRecurrenceQuantification.pdf | Keywords: Human Computer Interaction; Ubiquitous Computing; Eye Tracking; Cross Recurrence Analysis