Call for Research Interns: July/August 2024

About the Wearable and Interactive Technology Lab

The Wearable and Interactive Technology Lab in the School of Electrical Engineering at KAIST focuses on the design, development, and evaluation of wearable, physical, and tangible interactive computing systems. Bringing together design perspectives, computer science skills, and psychological methods, the WIT lab conceives, creates, and studies the next generation of human-computer interfaces.

SonarID: Sensing fingers around a smartwatch

https://doi.org/10.1145/3491102.3501935

FingerText: Typing on fingernails for AR

https://doi.org/10.1145/3411764.3445106

Gesture passwords for device lock

https://doi.org/10.1109/SP40000.2020.00034

Why join WIT Lab?

Conduct research on emerging technology

Work with a graduate student mentor

Work towards a research paper

How to apply?

  • How to
    • Select one or more topics that you would like to work on
    • Write a brief one-paragraph statement explaining why you want to work on the topic. If you select more than one topic, write a statement for each.
    • Send an email with this content to Ian Oakley <ianoakley@kaist.ac.kr>
  • Logistics
    • Only full-time internships are accepted (no part-time participation)
    • Remote work is possible (coordinate with your mentor)
    • Stipend will be paid for the two months (July/August)
    • The internship can be completed as EE495 independent research (1 credit)
  • Timeline
    • Mon 29th April: Announcement
    • Fri 24th May: Application deadline
    • Fri 31st May: Acceptance notification and mentor matching
    • 3rd-28th June: Pre-internship meetings with your mentor
    • 1st July - 23rd August: Internship (8 weeks)

Exploring the Design Space for Acoustic Interaction in a Wearable Ecosystem

Mentor: Jiwan Kim (mail: kjwan4435@gmail.com, web: http://jiwan.kim/)

Required skills: Python data processing, Arduino, Prototyping

Background: Acoustic interaction enables device-free gesture tracking on smart devices using their built-in speakers and microphones. For example, AAMouse continuously tracks a device's position in real time relative to devices such as smart TVs, PCs, and laptops that already have multiple speakers. In this project, we will extend acoustic interaction to wearable devices and their interactive ecosystem. To enable this, we will build microphone and speaker modules, analyze the acoustic signals to track hand or finger gestures, and integrate the modules into wearables.
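As a rough illustration of the kind of signal processing involved, the Python sketch below estimates the per-frame Doppler shift of an inaudible tone reflected off a moving hand; the sampling rate, carrier frequency, and recording file name are assumptions for illustration, not the project's actual setup.

import numpy as np

FS = 48_000   # sampling rate in Hz (assumed)
F0 = 20_000   # emitted carrier frequency in Hz (assumed)
FRAME = 2048  # samples per analysis frame
HOP = 512     # hop between frames

def doppler_shift_per_frame(mic: np.ndarray) -> np.ndarray:
    """Estimate the Doppler shift (Hz) of the carrier in each frame.

    A hand moving toward the microphone raises the apparent carrier
    frequency; moving away lowers it (radial velocity = shift * c / (2 * F0)).
    """
    window = np.hanning(FRAME)
    freqs = np.fft.rfftfreq(FRAME, d=1 / FS)
    band = (freqs > F0 - 400) & (freqs < F0 + 400)  # narrow band around carrier
    shifts = []
    for start in range(0, len(mic) - FRAME, HOP):
        spectrum = np.abs(np.fft.rfft(mic[start:start + FRAME] * window))
        peak = freqs[band][np.argmax(spectrum[band])]
        shifts.append(peak - F0)
    return np.array(shifts)

if __name__ == "__main__":
    mic = np.load("mic_recording.npy")  # hypothetical mono recording
    velocity = doppler_shift_per_frame(mic) * 343.0 / (2 * F0)  # radial velocity, m/s
    print(velocity[:10])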

Expected Outcomes: Embedded Sonar System, Data analysis

References:

1. Wang, W., Liu, A. X., & Sun, K. (2016, October). Device-free gesture tracking using acoustic signals. In Proceedings of the 22nd Annual International Conference on Mobile Computing and Networking (pp. 82-94).

2. Yun, S., Chen, Y. C., & Qiu, L. (2015, May). Turning a mobile device into a mouse in the air. In Proceedings of the 13th Annual International Conference on Mobile Systems, Applications, and Services (pp. 15-29).

Figures: AAMouse (MobiSys ’15), LLAP (MobiCom ’16), wearable ecosystem

TMRing: Turning a Ring into a Mouse That Works Everywhere Using Magnetic Sensing

Mentor: Jiwan Kim (mail: kjwan4435@gmail.com, web: http://jiwan.kim/)

Required skills: Arduino, Python data processing, Prototyping

Background: In recent decades, magnetoresistance sensing has developed a new branch called tunnel magnetoresistance (TMR) technology, which can measure fine-grained changes in the magnetic field. In this project, we will integrate a TMR sensor into a smart ring so that the ring can be used as a mouse anywhere, using magnetic sensing.
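As a rough sketch of the data pipeline, the Python snippet below reads hypothetical 3-axis TMR samples streamed by an Arduino over serial, subtracts an ambient-field baseline, and maps field changes to cursor motion; the port name, line format, and scaling are assumptions, and a real system would replace the linear mapping with a calibrated or learned model.

import numpy as np
import serial  # pyserial

PORT = "/dev/ttyUSB0"  # hypothetical serial port
BAUD = 115200
GAIN = 50.0            # field-change-to-pixel scaling (assumed)

def read_sample(ser: serial.Serial) -> np.ndarray:
    # Wait for a well-formed "x,y,z" line and parse it into a float vector.
    while True:
        parts = ser.readline().decode(errors="ignore").strip().split(",")
        if len(parts) == 3:
            try:
                return np.array([float(v) for v in parts])
            except ValueError:
                continue

def main() -> None:
    with serial.Serial(PORT, BAUD, timeout=1) as ser:
        # Average a short window at rest to estimate the ambient field.
        baseline = np.mean([read_sample(ser) for _ in range(100)], axis=0)
        while True:
            delta = read_sample(ser) - baseline
            dx, dy = GAIN * delta[0], GAIN * delta[1]  # crude 2-D mapping
            print(f"cursor move: ({dx:+.1f}, {dy:+.1f})")

if __name__ == "__main__":
    main()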

Expected Outcomes: Ring prototype, Data analysis

References:

1. Zhou, H., Lu, T., Liu, Y., Zhang, S., Liu, R., & Gowda, M. (2023, May). One ring to rule them all: An open source smartring platform for finger motion analytics and healthcare applications. In Proceedings of the 8th ACM/IEEE Conference on Internet of Things Design and Implementation (pp. 27-38).

2. Iqbal, S., Khan, S. N., Sajid, M., Ali, S., Iqbal, K. F., Asgher, U., & Ayaz, Y. (2022). Novel Approach for Sensing the Humanoid Hand Finger Position Using Non-contact TMR Sensor. Industrial Cognitive Ergonomics and Engineering Psychology, 35, 38.

Figures: TMR for finger tracking (AHFE 2022), OmniRing (IoTDI 2023)

Stabilizing AR interface placement while mobile

Mentor: Yonghwan Shin (yonghwan.shin@outlook.com)

Required skills: Python, C# (Unity3D), Arduino

Background: Although many of the most compelling use cases for smart glasses involve mobility (e.g., navigation, information access on the go, fieldwork), such situations make it challenging to operate smart glasses. For example, the canonical mobility task of walking greatly reduces performance on basic smart glass input tasks such as selecting a button or target. One key contributor to this problem is that it is difficult to present input elements so that they appear fixed with respect to the user: as the user moves, buttons move in tandem, creating an unstable experience. This project will explore software (e.g., filters, signal processing) and hardware (e.g., Arduino, motion sensors) techniques to create a more stable presentation of interfaces on smart glasses while users are walking. Ultimately, applications for state-of-the-art XR devices can be built on the developed techniques.
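One candidate software-side technique is an adaptive low-pass filter such as the 1€ (One Euro) filter (Casiez et al., CHI 2012), which is widely used to reduce jitter in interactive systems without adding excessive lag. The Python sketch below is a minimal version applied to a stream of interface anchor positions; the parameter values are illustrative and untuned.

import math

class OneEuroFilter:
    def __init__(self, min_cutoff=1.0, beta=0.02, d_cutoff=1.0):
        self.min_cutoff = min_cutoff  # controls jitter at low speeds
        self.beta = beta              # controls lag at high speeds
        self.d_cutoff = d_cutoff
        self.x_prev = None
        self.dx_prev = 0.0

    @staticmethod
    def _alpha(cutoff, dt):
        tau = 1.0 / (2.0 * math.pi * cutoff)
        return 1.0 / (1.0 + tau / dt)

    def __call__(self, x, dt):
        if self.x_prev is None:
            self.x_prev = x
            return x
        # Smooth the derivative, then adapt the cutoff to the current speed.
        dx = (x - self.x_prev) / dt
        a_d = self._alpha(self.d_cutoff, dt)
        dx_hat = a_d * dx + (1 - a_d) * self.dx_prev
        cutoff = self.min_cutoff + self.beta * abs(dx_hat)
        a = self._alpha(cutoff, dt)
        x_hat = a * x + (1 - a) * self.x_prev
        self.x_prev, self.dx_prev = x_hat, dx_hat
        return x_hat

# Example: smooth the vertical position of a head-anchored button at 60 Hz.
f = OneEuroFilter()
for raw_y in [0.00, 0.02, -0.01, 0.03, 0.01]:  # noisy samples while walking
    print(round(f(raw_y, dt=1 / 60), 4))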

Expected Outcomes: XR application, software or hardware solution to mitigate instability due to walking

References:

1. F. Lu, S. Davari, L. Lisle, Y. Li and D. A. Bowman, "Glanceable AR: Evaluating Information Access Methods for Head-Worn Augmented Reality," 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 2020, pp. 930-939, doi: 10.1109/VR46266.2020.00113.

2. Q. Zhou, D. Yu, M. N. Reinoso, J. Newn, J. Goncalves and E. Velloso, "Eyes-free Target Acquisition During Walking in Immersive Mixed Reality," in IEEE Transactions on Visualization and Computer Graphics, vol. 26, no. 12, pp. 3423-3433, Dec. 2020, doi: 10.1109/TVCG.2020.3023570.

Development of EIT-based Authentication System for Smartwatches

Mentor: Eunyong Cheon (beer@unist.ac.kr)

Required skills: Arduino, Python

Background: Smartwatches often record and store sensitive user data, especially health-related data. To protect such data from security threats, current commercial devices adopt traditional authentication techniques such as 4-digit PINs. However, these methods rely on keyboards or PIN pads laid out on small smartwatch screens, which causes usability problems [1]. To achieve a usable authentication system for smartwatches, this project will develop an Electrical Impedance Tomography (EIT) based biometric user authentication system [2]. The work in this project will involve 1) developing a prototype that can collect electrical signals conducted through the body tissues of a user's wrist, 2) analyzing these signals and reconstructing inner-wrist images, and 3) training and evaluating deep learning-based classifiers on the reconstructed images.
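As a rough sketch of step 3, the snippet below trains a small convolutional network on placeholder tensors standing in for reconstructed wrist images; the image resolution, number of enrolled users, and architecture are assumptions rather than the project's final design.

import torch
import torch.nn as nn

NUM_USERS = 5  # hypothetical number of enrolled users
IMG = 32       # hypothetical reconstructed image resolution

class EITClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(32 * (IMG // 4) ** 2, NUM_USERS),
        )

    def forward(self, x):
        return self.net(x)

model = EITClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Random tensors stand in for reconstructed EIT frames and user identities.
images = torch.randn(64, 1, IMG, IMG)
labels = torch.randint(0, NUM_USERS, (64,))

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")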

Expected Outcomes: Hardware, authentication system, human-subject experiments

References:

1. Y. Wu, D. Jiang, J. Duan, X. Liu, R. Bayford and A. Demosthenous, "Towards a High Accuracy Wearable Hand Gesture Recognition System Using EIT," 2018 IEEE International Symposium on Circuits and Systems (ISCAS), Florence, Italy, 2018, pp. 1-4, doi: 10.1109/ISCAS.2018.8351296.

2. Jun Ho Huh, Hyejin Shin, HongMin Kim, Eunyong Cheon, Youngeun Song, Choong-Hoon Lee, and Ian Oakley. 2023. WristAcoustic: Through-Wrist Acoustic Response Based Authentication for Smartwatches. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 6, 4, Article 167 (December 2022), 34 pages. https://doi.org/10.1145/3569473

3. B. Ben Atitallah et al., "Hand Sign Recognition System Based on EIT Imaging and Robust CNN Classification," in IEEE Sensors Journal, vol. 22, no. 2, pp. 1729-1737, 15 Jan.15, 2022, doi: 10.1109/JSEN.2021.3130982.

Figures: EIT-based hand sign recognition, sample EIT hardware