Pupil Citation List
Each entry below lists: Year Published | Author(s) | "Title" | URL | PDF | Journal/Conference | Keywords (fields omitted where empty).
2018 | Newman, Benjamin A.; Aronson, Reuben M.; Srinivasa, Siddhartha S.; Kitani, Kris; Admoni, Henny | "HARMONIC: A Multimodal Dataset of Assistive Human-Robot Collaboration" | URL: http://harp.ri.cmu.edu/harmonic/ | PDF: https://arxiv.org/pdf/1807.11154.pdf | Keywords: Computer Science - Robotics; Computer Science - Human-Computer Interaction

2018 | Reuben M. Aronson; Thiago Santini; Thomas C. Kübler; Enkelejda Kasneci; Siddhartha Srinivasa; Henny Admoni | "Eye-Hand Behavior in Human-Robot Shared Manipulation" | URL: http://harp.ri.cmu.edu/assets/pubs/hri2018_aronson.pdf | Venue: Human-Robot Interaction 2018 | Keywords: Computer Science - Robotics; Computer Science - Human-Computer Interaction

2018 | Reuben M. Aronson; Henny Admoni | "Gaze for Error Detection During Human-Robot Shared Manipulation" | URL: http://harp.ri.cmu.edu/assets/pubs/fja_rss2018_aronson.pdf | Venue: Joint Action Workshop at RSS 2018 | Keywords: Computer Science - Robotics; Computer Science - Human-Computer Interaction

2018 | Jeff J. MacInnes; Shariq Iqbal; John Pearson; Elizabeth N. Johnson | "Wearable Eye-tracking for Research: Automated dynamic gaze mapping and accuracy/precision comparisons across devices" | URL: https://www.biorxiv.org/content/early/2018/06/28/299925.abstract | PDF: https://www.biorxiv.org/content/biorxiv/early/2018/06/28/299925.full.pdf
2018 | Corten Singer | "See-Thru: Towards Minimally Obstructive Eye-Controlled Wheelchair Interfaces" | URL: https://www2.eecs.berkeley.edu/Pubs/TechRpts/2018/EECS-2018-61.pdf | Venue: Master Thesis - Technical Report No. UCB/EECS-2018-61 | Keywords: Eye Gaze; Eye Tracking; Gaze Control; Eyes-Only Interaction; User Interfaces; Power Wheelchair; Smart Wheelchair; User Experience; Assistive Technology; Gaze Gestures; Field of View; Obstructive

2018 | Mingming Wang; Kate Walders; Martin E. Gordon; Jeff B. Pelz; Susan Farnand | "Auto-simulator Preparation for Research into Assessing the Correlation Between Human Driving Behaviors and Fixation Patterns" | URL: https://www.ingentaconnect.com/contentone/ist/ei/2018/00002018/00000017/art00007# | Venue: Electronic Imaging, Autonomous Vehicles and Machines 2018, pp. 163-1-163-6(6)
2018 | Eike Langbehn; Frank Steinicke; Markus Lappe; Gregory F. Welch; Gerd Bruder | "In the Blink of an Eye: Leveraging Blink-Induced Suppression for Imperceptible Position and Orientation Redirection in Virtual Reality" | PDF: https://basilic.informatik.uni-hamburg.de/Publications/2018/LSLWB18/eye_blinks.pdf | Venue: ACM Trans. Graph., Vol. 37, No. 4, Article 66 | Keywords: virtual reality; eye blinks; redirected walking; psychophysical experiments

2018 | Carlos Rafael Fernandes Picanço; François Jacques Tonneau | "A low-cost platform for eye-tracking research: Using Pupil© in behavior analysis" | URL: https://onlinelibrary.wiley.com/doi/abs/10.1002/jeab.448

2018 | Thiago Santini; Wolfgang Fuhl; Enkelejda Kasneci | "PuReST: robust pupil tracking for real-time pervasive eye tracking" | URL: https://dl.acm.org/citation.cfm?id=3204578 | Venue: ETRA 2018
2018 | Justus Thies; Michael Zollhofer; Marc Stamminger; Christian Theobalt; Matthias Nießner | "FaceVR: Real-Time Gaze-Aware Facial Reenactment in Virtual Reality" | URL: https://dl.acm.org/citation.cfm?id=3182644 | PDF: https://arxiv.org/pdf/1610.03151.pdf | Venue: ACM Transactions on Graphics (TOG), Volume 37, Issue 2, Article No. 25 | Keywords: face tracking; virtual reality; eye tracking

2018 | Li, W-C.; Kearney, P.; Braithwaite, G.; Lin, J. | "How much is too much? Visual Scan Patterns of Single Air Traffic Controller Performing Multiple Remote Tower Operations" | URL: https://doi.org/10.1016/j.ergon.2018.05.005 | Venue: International Journal of Industrial Ergonomics (67, 136-144) | Keywords: Air traffic management; Aviation safety; Cost-efficiency; Human-computer interactions; Multiple remote tower operations; Situation awareness

2018 | Kearney, P.; Li, W-C. | "Multiple Remote Tower for Single European Sky: the Evolution from Initial Operational Concept to Regulatory Approved Implementation" | URL: https://doi.org/10.1016/j.tra.2018.06.005 | Venue: Transportation Research Part A, 116, 15-30 | Keywords: Air traffic control; Cost efficiency; Human performance; Multiple remote tower operations; Safety assessment; Single European Sky
2018 | Florian Jungwirth; Michael Haslgrübler; Alois Ferscha | "Contour-Guided Gaze Gestures: Using Object Contours as Visual Guidance for Triggering Interactions" | URL: https://dl.acm.org/citation.cfm?id=3204530 | Venue: ETRA 2018 | Keywords: Wearable Computing; Pervasive Computing; Eye-Tracking; Gaze-based Interaction; Internet of Things

2018 | Carlos Elmadjian; Pushkar Shukla; Antonio Diaz Tula; Carlos H. Morimoto | "3D gaze estimation in the scene volume with a head-mounted eye tracker" | URL: https://dl.acm.org/citation.cfm?id=3206351 | Venue: ETRA / COGAIN 2018 | Keywords: Head-mounted eye tracking; calibration; gaze estimation; 3D dataset

2018 | Michael Barz; Florian Daiber; Daniel Sonntag; Andreas Bulling | "Error-Aware Gaze-Based Interfaces for Robust Mobile Gaze Interaction" | URL: https://doi.org/10.1145/3204493.3204536 | PDF: https://perceptual.mpi-inf.mpg.de/files/2018/04/barz18_etra.pdf | Venue: ETRA 2018 | Keywords: Eye Tracking; Mobile Interaction; Gaze Interaction; Error Model; Error-Aware
2018 | Reuben M. Aronson; Thiago Santini; Thomas C. Kübler; Enkelejda Kasneci; Siddhartha Srinivasa; Henny Admoni | "Eye-Hand Behavior in Human-Robot Shared Manipulation" | URL: https://dl.acm.org/citation.cfm?id=3171287 | PDF: https://www.ri.cmu.edu/wp-content/uploads/2018/01/hri2018_aronson.pdf | Venue: Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, March 2018 | Keywords: human-robot interaction; eye gaze; eye tracking; shared autonomy; nonverbal communication

2018 | Mikko Kytö; Barrett Ens; Thammathip Piumsomboon; Gun A. Lee; Mark Billinghurst | "Pinpointing: Precise Head- and Eye-Based Target Selection for Augmented Reality" | URL: https://dl.acm.org/citation.cfm?id=3173655 | PDF: https://www.researchgate.net/profile/Mikko_Kytoe/publication/323970135_Pinpointing_Precise_Head-_and_Eye-Based_Target_Selection_for_Augmented_Reality/links/5ab559da0f7e9b68ef4cf26a/Pinpointing-Precise-Head-and-Eye-Based-Target-Selection-for-Augmented-Reality.pdf | Venue: CHI 2018 | Keywords: Eye tracking; gaze interaction; refinement techniques; target selection; augmented reality; head-worn display

2018 | Leanne Chukoskie; Shengyao Guo; Eric Ho; Yalun Zheng; Qiming Chen; Vivian Meng; John Cao; Nikhita Devgan; Si Wu; Pamela C. Cosman | "Quantifying Gaze Behavior during Real World Interactions using Automated Object, Face, and Fixation Detection" | URL: https://ieeexplore.ieee.org/abstract/document/8328848/ | Venue: IEEE Transactions on Cognitive and Developmental Systems | Keywords: eye-tracking; gaze behavior; face detection; computer vision
2018 | Kai Dierkes; Moritz Kassner; Andreas Bulling | "A novel approach to single camera, glint-free 3D eye model fitting including corneal refraction" | PDF: https://perceptual.mpi-inf.mpg.de/files/2018/04/dierkes18_etra.pdf | Venue: ETRA 2018 | Keywords: Eye tracking; refraction; 3D eye model; pupil detection; contour-based; glint-free

2018 | Thomas Kosch; Mariam Hassib; Daniel Buschek; Albrecht Schmidt | "Look into my Eyes: Using Pupil Dilation to Estimate Mental Workload for Task Complexity Adaptation" | URL: https://dl.acm.org/citation.cfm?id=3188643 | Venue: CHI EA 2018 | Keywords: Cognition-Aware Interfaces; Workload-Aware Computing; Pupil Dilation; Eye Tracking

2018 | Damian Almaraz; Brock Carlson; Hieu-Trung Vu; Jeremy Loebach | "Pupillometry as a measure of auditory cognitive processes and listening effort" | URL: https://doi.org/10.1121/1.5035727 | Venue: The Journal of the Acoustical Society of America 143, 1751 (2018)

2018 | Iuliia Brishtel; Shoya Ishimaru; Olivier Augereau; Koichi Kise; Andreas Dengel | "Assessing Cognitive Workload on Printed and Electronic Media using Eye-Tracker and EDA Wristband" | URL: https://dl.acm.org/citation.cfm?id=3180354 | PDF: https://www.dropbox.com/s/r807qwnvkz6v0lj/IUI2018Iuliia.pdf?raw=1 | Venue: IUI '18 Companion, Proceedings of the 23rd International Conference on Intelligent User Interfaces Companion | Keywords: E-learning; Eye-Tracking; Reading; Information Processing; Electrodermal Activity; Cognitive Workload; User-Interface
2018 | Mirko Raković; Nuno Duarte; Jovica Tasevski; José Santos-Victor; Branislav Borovac | "A dataset of head and eye gaze during dyadic interaction task for modeling robot gaze behavior" | URL: https://doi.org/10.1051/matecconf/201816103002 | PDF: https://www.matec-conferences.org/articles/matecconf/pdf/2018/20/matecconf_erzr2018_03002.pdf | Venue: 13th International Scientific-Technical Conference on Electromechanics and Robotics "Zavalishin's Readings" - 2018

2018 | Mohamed Khamis; Carl Oechsner; Florian Alt; Andreas Bulling | "VRPursuits: Interaction in Virtual Reality using Smooth Pursuit Eye Movements" | URL: https://doi.org/10.1145/3206505.3206522 | PDF: https://perceptual.mpi-inf.mpg.de/files/2018/04/khamis18_avi.pdf | Venue: AVI 2018 | Keywords: Eye Tracking; Virtual Reality; Gaze Interaction; Pursuits

2018 | Ruimin Li; Bin Li; Shixiong Zhang; Hong Fu; Wai-Lun Lo; Jie Yu; Cindy H.P. Sit; Desheng Wen | "Evaluation of the fine motor skills of children with DCD using the digitalised visual-motor tracking system" | URL: https://ieeexplore.ieee.org/abstract/document/8316738/ | PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=8316738 | Venue: The Journal of Engineering (Volume 2018, Issue 2) | Keywords: developmental coordination disorder; hand movement; eye gaze position; eye tracker; digitalised visual-motor tracking system; DCD; children; fine motor skills
2018 | Nuno Duarte; Jovica Tasevski; Moreno Coco; Mirko Raković; José Santos-Victor | "Action Anticipation: Reading the Intentions of Humans and Robots" | URL: https://arxiv.org/abs/1802.02788 | PDF: https://arxiv.org/pdf/1802.02788

2018 | Julian Steil; Philipp Müller; Yusuke Sugano; Andreas Bulling | "Forecasting User Attention During Everyday Mobile Interactions Using Device-Integrated and Wearable Sensors" | URL: https://arxiv.org/abs/1801.06011 | PDF: https://arxiv.org/pdf/1801.06011.pdf | Venue: MobileHCI 2018 | Keywords: Egocentric Vision; Handheld Mobile Device; Attention Shifts; Mobile Eye Tracking; Attentive User Interfaces

2018 | Christian Lander; Marco Speicher; Frederic Kerber; Antonio Krüger | "Towards Fixation Extraction in Corneal Imaging Based Eye Tracking Data" | URL: https://doi.org/10.1145/3170427.3188597 | Venue: CHI '18 Extended Abstracts, April 21–26, 2018 | Keywords: Corneal Imaging; Fixation; Image Stitching

2018 | Sylvain Pauchet; Catherine Letondal; Jean-Luc Vinot; Mickaël Causse; Mathieu Cousy; Valentin Becquet; Guillaume Crouzet | "GazeForm: Dynamic Gaze-adaptive Touch Surface for Eyes-free Interaction in Airliner Cockpits" | PDF: https://www.researchgate.net/profile/Mickael_Causse/publication/324561583_GazeForm_Dynamic_Gaze-adaptive_Touch_Surface_for_Eyes-free_Interaction_in_Airliner_Cockpits/links/5ad5a8fda6fdcc293580bcdf/GazeForm-Dynamic-Gaze-adaptive-Touch-Surface-for-Eyes-free-Interaction-in-Airliner-Cockpits.pdf | Venue: DIS 2018 (Hong Kong) | Keywords: Eyes-free interaction; touchscreens; TEI; shape-changing interfaces; adaptive interfaces; eye-tracking; critical contexts
2018 | Paul Schydlo; Mirko Rakovic; Lorenzo Jamone; Jose Santos-Victor | "Anticipation in Human-Robot Cooperation: A recurrent neural network approach for multiple action sequences prediction" | URL: https://arxiv.org/abs/1802.10503 | PDF: https://arxiv.org/pdf/1802.10503.pdf

2018 | Adithya B; Lee Hanna; Pavan Kumar B N; Youngho Chai | "Calibration Techniques and Gaze Accuracy Estimation in Pupil Labs Eye Tracker" | URL: http://www.dbpia.co.kr/Journal/ArticleDetail/NODE07404419 | Venue: TECHART: Journal of Arts and Imaging Science, Vol. 5, No. 1 | Keywords: calibration

2018 | T.A.B. de Boer; J. Hoogmoed; N.M. Looye; J.R.P. van der Toorn; R.P. de Vos; J. Stapel; P. Bazilinskyy; J.C.F. de Winter | "Combining eye-tracking with semantic scene labelling in a car" | PDF: https://www.researchgate.net/profile/Joost_De_Winter/publication/322293911_Combining_eye-tracking_with_semantic_scene_labelling_in_a_car/links/5a5136d0aca2725638c592c0/Combining-eye-tracking-with-semantic-scene-labelling-in-a-car.pdf | Venue: Working paper, January 2018 | Keywords: Computer Vision; Semantic Scene Understanding; Object Classification; Driving; Transportation

2018 | Ľuboš Hládek; Bernd Porr; W. Owen Brimijoin | "Real-time estimation of horizontal gaze angle by saccade integration using in-ear electrooculography" | URL: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0190420 | PDF: http://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0190420&type=printable | Venue: PLoS ONE 13(1) | Keywords: Electrooculography; Eye gaze angle; Saccade
2018 | Martina Truschzinski; Alberto Betella; Guido Brunnett; Paul F.M.J. Verschure | "Emotional and cognitive influences in air traffic controller tasks: An investigation using a virtual environment?" | URL: https://www.sciencedirect.com/science/article/pii/S0003687017302855 | Venue: Applied Ergonomics, Volume 69 | Keywords: Air traffic control; Personality; Workload; Mood; Virtual reality

2018 | Yasmeen Abdrabou; Khaled Kassem; Jailan Salah; Reem El-Gendy; Mahesty Morsy; Yomna Abdelrahman; Slim Abdennadher | "Exploring the Usage of EEG and Pupil Diameter to Detect Elicited Valence" | URL: https://link.springer.com/chapter/10.1007/978-3-319-73888-8_45 | Venue: Intelligent Human Systems Integration | Keywords: EEG; Eye tracker; Affective computing

2018 | Sebastiaan Mathôt; Jasper Fabius; Elle Van Heusden; Stefan Van der Stigchel | "Safe and sensible preprocessing and baseline correction of pupil-size data" | URL: https://link.springer.com/article/10.3758/s13428-017-1007-2 | PDF: https://link.springer.com/content/pdf/10.3758%2Fs13428-017-1007-2.pdf | Venue: Behavior Research Methods (January 2018) | Keywords: Pupillometry; Pupil size; Baseline correction; Research methods

2018 | Rahool Patel; Adrian Zurca | "396: I See What You See: Adding Eye-Tracking To Medical Simulation" | URL: http://journals.lww.com/ccmjournal/Fulltext/2018/01001/396___I_SEE_WHAT_YOU_SEE_ADDING_EYE_TRACKING_TO.362.aspx | Venue: Critical Care Medicine, Volume 46, Issue 1 (January 2018) | Keywords: Medical simulation

2018 | Julian Steil; Marion Koelle; Wilko Heuten; Susanne Boll; Andreas Bulling | "PrivacEye: Privacy-Preserving First-Person Vision Using Image Features and Eye Movement Analysis" | URL: https://arxiv.org/abs/1801.04457 | PDF: https://arxiv.org/pdf/1801.04457.pdf | Keywords: Egocentric Vision; Eye Tracking; Gaze Behaviour
2017 | Wen-Chin Li; Jiaqi Cao; Jr-Hung Lin; Graham Braithwaite; Matthew Greaves | "The Evaluation of Pilot's First Fixation and Response Time to Different Design of Alerting Messages" | PDF: https://pdfs.semanticscholar.org/f9e6/e1c8b5194c4e7b7565bf346e8192d7289803.pdf | Venue: AG 2017 | Keywords: Cockpit design; Crew Alerting System; Eye movement; Human-computer interaction; Quick Reference Handbook

2017 | Henny Admoni; Siddhartha Srinivasa | "Eye Gaze Reveals Intentions in Shared Autonomy" | URL: http://intentions.xyz/wp-content/uploads/2017/01/Admoni.intentions-eyegaze.reduced.pdf | Venue: Proceedings of the Intentions in HRI Workshop at HRI 2017 (Intentions in HRI '17), Vienna, Austria, March 2017 | Keywords: human-robot interaction; intentions; nonverbal behavior; eye gaze

2017 | Marco Filippucci; Fabio Bianconi; Elisa Bettollini; Michela Meschini; Marco Seccaroni | "Survey and Representation for Rural Landscape. New Tools for New Strategies: The Example of Campello Sul Clitunno" | URL: http://www.mdpi.com/2504-3900/1/9/934/pdf | Keywords: landscape and image; perception; eye tracking; algorithmic spatial analysis; participation
2017 | Brendan John | "A Dataset of Gaze Behavior in VR Faithful to Natural Statistics" | URL: http://scholarworks.rit.edu/cgi/viewcontent.cgi?article=10716&context=theses | Venue: RIT Student MS Thesis

2017 | Jork Stapel; Freddy Antony Mullakkal-Babu; Riender Happee | "Driver behaviour and workload in an on-road automated vehicle" | PDF: https://www.researchgate.net/profile/Jork_Stapel2/publication/318702364_Driver_Behavior_and_Workload_in_an_On-road_Automated_Vehicle/links/597887fa45851570a1b9623a/Driver-Behavior-and-Workload-in-an-On-road-Automated-Vehicle.pdf | Venue: Preprint | Keywords: Automated Driving; On-road; Workload; Experience; Underload

2017 | Iuliia Kotseruba; John K. Tsotsos | "STAR-RT: Visual attention for real-time video game playing" | URL: https://arxiv.org/abs/1711.09464 | PDF: https://arxiv.org/pdf/1711.09464.pdf | Venue: arXiv preprint | Keywords: STAR; Cognitive Programs; visual attention; Visual Routines; real-time vision; platform video games; game AI

2017 | Qi Sun; Fu-Chung Huang; Joohwan Kim; Li-Yi Wei; David Luebke; Arie Kaufman | "Perceptually-Guided Foveation for Light Field Displays" | URL: http://research.nvidia.com/publication/2017-11_Perceptually-Guided-Foveation-for | PDF: http://research.nvidia.com/sites/default/files/publications/c121-f121_199-a18-paperfinal-v3.pdf | Venue: ACM Trans. Graph. 36, 6, Article 192 (November 2017) | Keywords: light field; computational display; foveation; sampling
2017 | Changwon Jang; Kiseung Bang; Seokil Moon; Jonghyun Kim; Seungjae Lee; Byoungho Lee | "Retinal 3D: Augmented Reality Near-Eye Display Via Pupil-Tracked Light Field Projection on Retina" | URL: http://oeqelab.snu.ac.kr/retinal3d | PDF: http://oeqelab.snu.ac.kr/?module=file&act=procFileDownload&file_srl=74430&sid=cfde647ef4e59a0adb7f07197f53b84e&module_srl=74407 | Venue: SIGGRAPH 2017, Vol. 36, Issue 6, Article No. 190 | Keywords: Near-eye display; eye tracking; computational displays; holographic optical element; vergence-accommodation conflict

2017 | Khaled Kassem; Jailan Salah; Yasmeen Abdrabou; Mahesty Morsy; Reem El-Gendy; Yomna Abdelrahman; Slim Abdennadher | "DiVA: Exploring the Usage of Pupil Diameter to Elicit Valence and Arousal" | PDF: https://www.researchgate.net/profile/Yomna_Abdelrahman/publication/321783827_DiVA_exploring_the_usage_of_pupil_di_ameter_to_elicit_v_alence_and_a_rousal/links/5a328a17aca27271444f3689/DiVA-exploring-the-usage-of-pupil-di-ameter-to-elicit-v-alence-and-a-rousal.pdf | Keywords: Arousal; Valence; Eye Tracker; Pupil Diameter

2017 | Benjamin Hatscher; Maria Luz; Lennart E. Nacke; Norbert Elkmann; Veit Muller; Christian Hansen | "GazeTap: Towards Hands-Free Interaction in the Operating Room" | PDF: https://www.researchgate.net/profile/Christian_Hansen17/publication/320372375_GazeTap_Towards_Hands-Free_Interaction_in_the_Operating_Room/links/59f7498e0f7e9b553ebedb6d/GazeTap-Towards-Hands-Free-Interaction-in-the-Operating-Room.pdf | Venue: International Conference on Multimodal Interaction | Keywords: Input techniques; multimodal interaction; foot input; gaze input; eye tracking; gaze-foot interaction; HCI in the operating room
2017 | Christian Lander; Sven Gehring; Markus Löchtefeld; Andreas Bulling; Antonio Krüger | "EyeMirror: Mobile Calibration-Free Gaze Approximation using Corneal Imaging" | PDF: https://perceptual.mpi-inf.mpg.de/files/2017/11/lander17_mum.pdf | Venue: MUM 2017, November 26–29, 2017 | Keywords: Corneal Image; mobile device; gaze approximation; feature tracking; pervasive

2017 | Zhao, Zhong; Salesse, Robin N.; Marin, Ludovic; Gueugnon, Mathieu; Bardy, Benoit G. | "Likability's Effect on Interpersonal Motor Coordination: Exploring Natural Gaze Direction" | URL: https://www.frontiersin.org/articles/10.3389/fpsyg.2017.01864/full | PDF: https://www.frontiersin.org/articles/10.3389/fpsyg.2017.01864/pdf | Venue: Frontiers in Psychology | Keywords: interpersonal motor coordination; likability; finger-tapping; markers

2017 | Ken Pfeuffer; Benedikt Mayer; Diako Mardanbegi; Hans Gellersen | "Gaze + Pinch Interaction in Virtual Reality" | URL: https://dl.acm.org/citation.cfm?id=3132180 | PDF: https://kenpfeuffer.files.wordpress.com/2015/02/p99-pfeuffer.pdf | Venue: SUI '17, October 16–17, 2017 | Keywords: Gaze; pinch; freehand gesture; interaction technique; multimodal interface; menu; eye tracking; virtual reality

2017 | Augusto Esteves; David Verweij; Liza Suraiya; Rasel Islam; Youryang Lee; Ian Oakley | "SmoothMoves: Smooth Pursuits Head Movements for Augmented Reality" | URL: https://www.researchgate.net/publication/320571212_SmoothMoves_Smooth_Pursuits_Head_Movements_for_Augmented_Reality | PDF: https://www.researchgate.net/profile/Islam_Md_Rasel/publication/320571212_SmoothMoves_Smooth_Pursuits_Head_Movements_for_Augmented_Reality/links/5a0255c1aca2720df3ca03d2/SmoothMoves-Smooth-Pursuits-Head-Movements-for-Augmented-Reality.pdf | Venue: ACM Symposium on User Interface Software and Technology (UIST) | Keywords: Wearable computing; eye tracking; augmented reality; AR; input technique; smooth pursuits; motion matching; HMD
2017 | Yun Suen Pai; Benjamin Outram; Benjamin Tag; Megumi Isogai; Daisuke Ochi; Kai Kunze | "GazeSphere: Navigating 360-Degree-Video Environments in VR Using Head Rotation and Eye Gaze" | URL: http://dl.acm.org/citation.cfm?id=3102183 | Venue: ACM SIGGRAPH Posters | Keywords: Virtual reality; 360-degree-video; eye tracking; orbital navigation

2017 | Mihai Bâce; Philippe Schlattner; Vincent Becker; Gábor Sörös | "Facilitating Object Detection and Recognition through Eye Gaze" | PDF: http://www.vs.inf.ethz.ch/publ/papers/mbace_MobileHCI2017_workshop.pdf | Venue: MobileHCI '17 Workshops, September 04–09, 2017 | Keywords: Eye Tracking; Eye Gaze; Wearable Computing; HCI

2017 | Otto Lappi; Paavo Rinkkala; Jami Pekkanen | "Systematic Observation of an Expert Driver's Gaze Strategy: An On-Road Case Study" | URL: https://doi.org/10.3389/fpsyg.2017.00620 | PDF: https://www.frontiersin.org/articles/10.3389/fpsyg.2017.00620/full | Venue: Front. Psychol., 27 April 2017

2017 | Ana Serrano; Vincent Sitzmann; Jaime Ruiz-Borau; Gordon Wetzstein; Diego Gutierrez; Belen Masia | "Movie Editing and Cognitive Event Segmentation in Virtual Reality Video" | URL: https://doi.org/10.1145/3072959.3073668 | PDF: http://webdiis.unizar.es/~aserrano/docs/Serrano_SIGG2017_VR-cine.pdf | Venue: ACM Transactions on Graphics, Vol. 36, No. 4, Article 47, July 2017 | Keywords: VR; Human Centered Computing; immersive environments; cinematography
2017 | Guillem Torrente Marti | "Mobility for the severely disabled: a head-controlled wheelchair" | URL: https://repositori.upf.edu/bitstream/handle/10230/32897/Torrente_2017.pdf?sequence=1&isAllowed=y | Venue: Bachelor's Thesis | Keywords: Accessibility; Eye Tracking

2017 | Francesco Walker; Berno Bucker; Nicola C. Anderson; Daniel Schreij; Jan Theeuwes | "Looking at paintings in the Vincent Van Gogh Museum: Eye movement patterns of children and adults" | URL: https://doi.org/10.1371/journal.pone.0178912 | PDF: http://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0178912&type=printable | Venue: PLoS ONE

2017 | Jason Orlosky; Yuta Itoh; Maud Ranchet; Kiyoshi Kiyokawa; John Morgan; Hannes Devos | "Emulation of Physician Tasks in Eye-Tracked Virtual Reality for Remote Diagnosis of Neurodegenerative Disease" | URL: https://www.researchgate.net/publication/313022699_Emulation_of_Physician_Tasks_in_Eye-tracked_Virtual_Reality_for_Remote_Diagnosis_of_Neurodegenerative_Disease | PDF: https://www.researchgate.net/profile/Jason_Orlosky/publication/313022699_Emulation_of_Physician_Tasks_in_Eye-tracked_Virtual_Reality_for_Remote_Diagnosis_of_Neurodegenerative_Disease/links/58d352f6a6fdccd24d43c710/Emulation-of-Physician-Tasks-in-Eye-tracked-Virtual-Reality-for-Remote-Diagnosis-of-Neurodegenerative-Disease.pdf | Venue: IEEE Transactions on Visualization and Computer Graphics | Keywords: Virtual reality; eye tracking; diagnosis; visualization

2017 | Enkelejda Kasneci; Alex A. Black; Joanne M. Wood | "Eye-Tracking as a Tool to Evaluate Functional Ability in Everyday Tasks in Glaucoma" | URL: https://www.hindawi.com/journals/joph/2017/6425913/abs/ | PDF: http://downloads.hindawi.com/journals/joph/2017/6425913.pdf | Venue: Journal of Ophthalmology

2017 | Andrew L. Kun; Hidde van der Meulen; Christian P. Janssen | "Calling While Driving: An Initial Experiment With HoloLens" | PDF: https://www.ris.uu.nl/ws/files/31229478/DA2017HoloLens.pdf | Venue: International Driving Symposium on Human Factors in Driver Assessment, Training and Vehicle Design | Keywords: HoloLens
2017 | Ngu Nguyen; Stephan Sigg | "PassFrame: Generating image-based passwords from egocentric videos" | URL: http://ieeexplore.ieee.org/abstract/document/7917518/authors | Venue: Pervasive Computing and Communications Workshops | Keywords: Authentication; Videos; Image segmentation; Cameras; Visualization; Conferences; Pervasive computing

2017 | Timothy Stapleton; Helen Sumin Koo | "Bicyclist biomotion visibility aids: a 3D eye-tracking analysis" | URL: http://www.emeraldinsight.com/doi/abs/10.1108/IJCST-05-2016-0060 | Venue: International Journal of Clothing Science and Technology | Keywords: Design; Visibility; Eye-tracking; Bicyclists; Biomotion

2017 | Yuta Itoh; Jason Orlosky; Kiyoshi Kiyokawa; Toshiyuki Amano; Maki Sugimoto | "Monocular focus estimation method for a freely-orienting eye using Purkinje-Sanson images" | URL: http://ieeexplore.ieee.org/abstract/document/7892252/ | Venue: IEEE Virtual Reality (VR) | Keywords: Cameras; Three-dimensional displays; Lighting; Estimation; Measurement by laser beam; Solid modeling; Two dimensional displays

2017 | Thammathip Piumsomboon; Gun Lee; Robert W. Lindeman; Mark Billinghurst | "Exploring natural eye-gaze-based interaction for immersive virtual reality" | URL: http://ieeexplore.ieee.org/abstract/document/7893315/ | Venue: IEEE Symposium on 3D User Interfaces (3DUI) | Keywords: Erbium; Gaze tracking; Resists; Painting; Electronic mail; Two dimensional displays; Portable computers

2017 | Leanne Chukoskie; Jacqueline Nguyen; Jeanne Townsend | "Gaze-contingent Games for Neurocognitive Therapy: More than Meets the Eye?" | URL: http://cogain2017.cogain.org/camready/talk2-Chukoskie.pdf | Venue: COGAIN 2017 (Talk) | Keywords: attention; eye tracking; gaze-contingent; training; usability; video games
2017 | Carlos E. L. Elmadjian; Antonio Diaz-Tula; Fernando O. Aluani; Carlos H. Morimoto | "Gaze interaction using low-resolution images at 5 FPS" | URL: http://cogain2017.cogain.org/camready/talk5-Elmadjian.pdf | Venue: COGAIN 2017 (Talk)

2017 | David Dunn; Cary Tippets; Kent Torell; Petr Kellnhofer; Kaan Aksit; Piotr Didyk; Karol Myszkowski; David Luebke; Henry Fuchs | "Wide Field Of View Varifocal Near-Eye Display Using See-Through Deformable Membrane Mirrors" | URL: http://ieeexplore.ieee.org/abstract/document/7829412/ | PDF: http://telepresence.web.unc.edu/files/2017/01/Dunn_2017_TVCG_MembraneAR.pdf | Venue: IEEE Transactions on Visualization and Computer Graphics | Keywords: Augmented reality; displays; focus accommodation; perception; user study

2017 | Christian Lander; Frederik Wiehr; Nico Herbig; Antonio Krüger; Markus Löchtefeld | "Inferring Landmarks for Pedestrian Navigation from Mobile Eye-Tracking Data and Google Street View" | URL: http://dl.acm.org/citation.cfm?doid=3027063.3053201 | PDF: http://umtl.dfki.de/~fred/papers/final-ea2721-lander.pdf | Venue: CHI Conference (Extended Abstracts on Human Factors in Computing Systems) | Keywords: Landmarks; Eye Tracking; Google Street View; Navigation

2017 | Michaela Klauck; Yusuke Sugano; Andreas Bulling | "Noticeable or Distractive?: A Design Space for Gaze-Contingent User Interface Notifications" | URL: http://dl.acm.org/citation.cfm?doid=3027063.3053085 | Venue: CHI Conference (Extended Abstracts on Human Factors in Computing Systems) | Keywords: Interruptions; Attentive User Interfaces; Eye Tracking; Public Display; Peripheral Display

2017 | Michael Barz; Peter Poller; Daniel Sonntag | "Evaluating Remote and Head-worn Eye Trackers in Multi-modal Speech-based HRI" | URL: http://dl.acm.org/citation.cfm?id=3038367 | Venue: HRI (Human-Robot Interaction)

2017 | Martina Truschzinski | "Modeling Workload: A System Theory Approach" | URL: http://dl.acm.org/citation.cfm?id=3038408 | Venue: HRI (Human-Robot Interaction)
2017 | Mark Billinghurst; Kunal Gupta; Masai Katsutoshi; Youngho Lee; Gun Lee; Kai Kunze; Maki Sugimoto | "Is It in Your Eyes? Explorations in Using Gaze Cues for Remote Collaboration" | URL: http://link.springer.com/chapter/10.1007/978-3-319-45853-3_9 | Venue: Book chapter (Springer): Collaboration Meets Interactive Spaces

2017 | Patrick Renner; Theis Pfeiffer | "Attention Guiding Techniques using Peripheral Vision and Eye Tracking for Feedback in Augmented-Reality-based Assistance Systems" | URL: http://ieeexplore.ieee.org/abstract/document/7893338/ | PDF: https://pub.uni-bielefeld.de/download/2908162/2908197 | Venue: IEEE Symposium on 3D User Interfaces (3DUI) | Keywords: HCI; Information interfaces; Presentation

2017 | Avi Caspi; Arup Roy; Jessy D. Dorn; Robert J. Greenberg | "Retinotopic to Spatiotopic Mapping in Blind Patients Implanted With the Argus II Retinal Prosthesis" | URL: http://iovs.arvojournals.org/article.aspx?articleid=2597840#149774812 | Venue: Investigative Ophthalmology & Visual Science (IOVS)

2017 | Hevesi, P.; Ward, J.A.; Amiraslanov, O.; Pirkl, G.; Lukowicz, P. | "Analysis of the Usefulness of Mobile Eyetracker for the Recognition of Physical Activities" | URL: http://discovery.ucl.ac.uk/10039438/ | PDF: http://discovery.ucl.ac.uk/10039438/8/Ward_2017%20Analysis%20of%20the%20Usefulness%20of%20Mobile%20Eyetracker%20for%20the%20Recognition%20of%20Physical%20Activities_.pdf | Venue: UBICOMM 2017 | Keywords: Eyetracker; activity recognition; sensor fusion
2017 | Thiago Santini; Wolfgang Fuhl; David Geisler; Enkelejda Kasneci | "EyeRecToo: Open-Source Software for Real-Time Pervasive Head-Mounted Eye-Tracking" | URL: http://www.ti.uni-tuebingen.de/Wolfgang-Fuhl.1651.0.html | PDF: http://www.ti.uni-tuebingen.de/uploads/tx_timitarbeiter/main.pdf | Keywords: Eye movements; pupil detection; calibration; gaze estimation; open-source; eye tracking; data acquisition; human-computer interaction; real-time; pervasive

2017 | Aleksandar Rodić; Theodor Borangiu | "Advances in Robot Design and Intelligent Control: Proceedings of the 25th Conference on Robotics in Alpe-Adria-Danube Region (RAAD16)" | URL: https://books.google.co.th/books?hl=en&lr=&id=QCecDQAAQBAJ&oi=fnd&pg=PA378&dq=pupil+labs&ots=dRoZH2Qv8Y&sig=S-RoCdSwNeQO5wxwHCikjZTojJ0&redir_esc=y#v=onepage&q=pupil%20labs&f=false | Venue: Proceedings from Robotics in Alpe-Adria-Danube Region (RAAD)

2016 | Miika Toivanen | "An advanced Kalman filter for gaze tracking signal" | URL: https://www.researchgate.net/publication/287507544_An_advanced_Kalman_filter_for_gaze_tracking_signal | PDF: https://www.researchgate.net/profile/Miika_Toivanen/publication/287507544_An_advanced_Kalman_filter_for_gaze_tracking_signal/links/5a620e3e0f7e9b6b8fd4248f/An-advanced-Kalman-filter-for-gaze-tracking-signal.pdf | Venue: Biomedical Signal Processing and Control | Keywords: Gaze tracking; Kalman filter; Principal component analysis; Image analysis
2016 | Florian Alt; Sarah Torma; Daniel Buschek | "Don't Disturb Me – Understanding Secondary Tasks on Public Displays" | PDF: https://pdfs.semanticscholar.org/d6f4/dab5af8e61f1167d84ac2e1c44e5e5518ec3.pdf | Venue: International Symposium on Pervasive Displays (PERDIS) | Keywords: public display; secondary task performance; mental workload; parallel-task environment

2016 | Carolina Rodriguez-Paras; Shiyan Yang; Thomas K. Ferris | "Using Pupillometry to Indicate the Cognitive Redline" | PDF: http://journals.sagepub.com/doi/pdf/10.1177/1541931213601157 | Venue: Proceedings of the Human Factors and Ergonomics Society Annual Meeting

2016 | Ondřej Kubánek | "The Impact Analysis of the Technical Devices Used by the Driver on the Reaction Time" | URL: https://dspace.vutbr.cz/bitstream/handle/11012/63302/DP%20Kub%C3%A1nek.pdf?sequence=2 | Venue: Thesis | Keywords: Driver; reaction time; legal requirements; calling

2016 | Yun Suen Pai | "Physiological Signal-Driven Virtual Reality in Social Spaces" | URL: http://dl.acm.org/citation.cfm?id=2984787 | Venue: ACM Symposium on User Interface Software and Technology (UIST)

2016 | Henny Admoni; Siddhartha Srinivasa | "Predicting User Intent Through Eye Gaze for Shared Autonomy" | PDF: http://hennyadmoni.com/documents/admoni2016aaaifs.pdf

2016 | Astrid Weiss; Andreas Huber; Jürgen Minichberger; Markus Ikeda | "First Application of Robot Teaching in an Existing Industry 4.0 Environment: Does It Really Work?" | URL: http://www.mdpi.com/2075-4698/6/3/20 | PDF: http://www.mdpi.com/2075-4698/6/3/20/pdf | Keywords: human-robot interaction; Industry 4.0; case study; field test; robot teaching
87
2016 Tilman Dingler, Rufat Rzayev, Valentin Schwind, Niels Henze
RSVP on the Go - Implicit Reading Support on Smart Watches Through Eye Tracking
http://dl.acm.org/citation.cfm?id=2971794&dl=ACM&coll=DL&CFID=712884317&CFTOKEN=39946609
http://tilmanification.org/assets/pdf/Dingler2016RSVPEyeControl.pdf
ACM International Symposium on Wearable Computers (ISWC)
Reading interfaces, RSVP, eye-tracking, eye gaze interaction, mental load, comprehension
88
2016 Hidde van der Meulen, Petra Varsanyi, Lauren Westendorf, Andrew L. Kun, Orit Shaer
Towards Understanding Collaboration around Interactive Surfaces: Exploring Joint Visual Attention
http://dl.acm.org/citation.cfm?id=2984778
http://cs.wellesley.edu/~hcilab/publication/uist16-eyetracking.pdf
ACM Symposium on User Interface Software and Technology (UIST)
Visual attention, collaboration, eye tracking
89
2016 Daniel Pohl, Xucong Zhang, Andreas Bulling, Oliver Grau
Concept for Using Eye Tracking in a Head-Mounted Display to Adapt Rendering to the User’s Current Visual Field
http://dl.acm.org/citation.cfm?doid=2993369.2996300
https://perceptual.mpi-inf.mpg.de/wp-content/blogs.dir/12/files/2016/11/pohl2016_vrst.pdf
Virtual reality, eye tracking, rendering, head-mounted display
90
2016 Santiago Bonada, Rafael Veras, Christopher Collins
Personalized Views for Immersive Analytics
http://dl.acm.org/citation.cfm?id=3009953
http://vialab.science.uoit.ca/wp-content/papercite-data/pdf/bon2016a.pdf
-
91
2016 Hidde van der Meulen, Andrew L. Kun, Christian P. Janssen
Switching Back to Manual Driving: How Does it Compare to Simply Driving Away After Parking?
http://dspace.library.uu.nl/handle/1874/338115
http://www.cpjanssen.nl/Publications/VanDerMeulenKunJanssen_AutoUI2016.pdf
Autonomous driving, driver distraction, parking, visual distraction
92
2016 Mihai Bace, Teemu Leppanen, Argenis Ramirez Gomez, David Gil de Gomez
ubiGaze: Ubiquitous Augmented Reality Messaging Using Gaze Gestures
http://dl.acm.org/citation.cfm?id=2999530
http://www.vs.inf.ethz.ch/publ/papers/mbace_SA2016_ubiGaze.pdf
Augmented Reality, Eye Tracking, Gesture Interaction, Gaze Gestures, Messaging
93
2016 Vincent Sitzmann, Ana Serrano, Amy Pavel, Maneesh Agrawala, Diego Gutierrez, Gordon Wetzstein
Saliency in VR: How do people explore virtual environments?
https://arxiv.org/abs/1612.04335
https://arxiv.org/pdf/1612.04335.pdf
-
94
2016 Alexander Plopski, Jason Orlosky, Yuta Itoh, Christian Nitschke, Kiyoshi Kiyokawa, Gudrun Klinker
Automated Spatial Calibration of HMD Systems with Unconstrained Eye-cameras
http://ieeexplore.ieee.org/document/7781771/
http://imd.naist.jp/imdweb/pub/plopski_ISMAR16/paper.pdf
OST-HMD calibration, eye pose estimation
95
2016 Evani Amaral Camargo, Daniel Paz de Araújo, Hermes Renato Hildebrand, Rosangela da Silva Leote
Tecnologias Assistivas E Arte-Educação: Interfaces Digitais E Físicas [Assistive Technologies and Art Education: Digital and Physical Interfaces]
https://www.metodista.br/revistas/revistas-unimep/index.php/comunicacoes/article/view/2949/1850
-
Assistive Technologies, Digital Interfaces, Art Education, Interaction Design.
96
2016 Ngu Nguyen, Stephan Sigg
Personalized Image-based User Authentication using Wearable Cameras
-
https://arxiv.org/pdf/1612.06209.pdf
-
97
2016 Moayad Mokatren, Tsvi Kuflik, Ilan Shimshoni
Listen to What You Look at: Combining an Audio Guide with a Mobile Eye Tracker on the Go
http://www.cri.haifa.ac.il/index.php/40-uncategorised/474-moayad-mokatran-abstract
http://ceur-ws.org/Vol-1772/paper1.pdf
AI*CH@AI*IA (Italian Workshop on Artificial Intelligence for Cultural Heritage)
-
98
Moayad Mokatren, Tsvi Kuflik
Exploring the potential contribution of mobile eye-tracking technology in enhancing the museum visit experience
http://ceur-ws.org/Vol-1621/paper5.pdf
AVI*CH (Workshop on Advanced Visual Interfaces for Cultural Heritage)
Mobile guide, Mobile eye tracking, Personalized information, Smart environment, Context aware service
99
2016 Maike Schindler, Achim J. Lilienthal, Ravi Teja Chadalavada, Magnus Ögren
Creativity In The Eye Of The Student. Refining Investigations Of Mathematical Creativity Using Eye-Tracking Goggles.
https://www.researchgate.net/publication/306263411_Creativity_in_the_eye_of_the_student_Refining_investigations_of_mathematical_creativity_using_eye-tracking_goggles
https://www.researchgate.net/profile/Achim_Lilienthal/publication/306263411_Creativity_in_the_eye_of_the_student_Refining_investigations_of_mathematical_creativity_using_eye-tracking_goggles/links/585546f208ae81995eb419b2.pdf
Conference of the International Group for the Psychology of Mathematics Education
-
100
2016 Joshua Newn, Eduardo Velloso, Marcus Carter, Frank Vetere
Exploring the Effects of Gaze Awareness on Multiplayer Gameplay
http://dl.acm.org/citation.cfm?id=2987740
http://www.socialnui.unimelb.edu.au/publications/2016-SocialNUI-Newn-3.pdf
Symposium on Computer-Human Interaction in Play Companion Extended Abstracts
Eye tracking, Gaze tracking, Co-located play, Multiplayer, Tabletop games, Board games, Video games, Cognitive strategies, Gaze awareness, EyePlay