ABSTRACT: Collaborating remotely on tasks that require engagement with the physical environment is difficult with existing technologies, which do not adequately support a shared 3D space or effective ways to communicate spatial information. We present an approach to unobtrusive mobile telecollaboration that bridges physical and virtual environments, using computer vision-based tracking and mapping together with augmented reality techniques to convey spatial information about a novel scene. This enables rich interaction among remote collaborators, useful in applications ranging from mission-critical field service repairs to helping a friend or relative program the DVR.
ABOUT THE PRESENTER: Matthew Turk is Chair of the Department of Computer Science at the University of California, Santa Barbara, where he co-directs the UCSB Four Eyes Lab, focused on the "four I's" of Imaging, Interaction, and Innovative Interfaces. He received a BS from Virginia Tech, an MS from Carnegie Mellon University, and a PhD from the Massachusetts Institute of Technology. He has worked at Martin Marietta Denver Aerospace, LIFIA/ENSIMAG (Grenoble, France), Teleos Research, and Microsoft Research, where he was a founder of the Vision Technology Group. He co-founded a startup company in 2014 that was acquired by PTC Vuforia in 2016. Prof. Turk has received several best paper awards and has served as general or program chair of many conferences, including ACM Multimedia, Face and Gesture Recognition, WACV, ICMI, and CVPR. He is an IEEE Fellow, an IAPR Fellow, and the recipient of the 2011-2012 Fulbright-Nokia Distinguished Chair in Information and Communications Technologies.
Many thanks,
The MARA seminar team