The HET is a wearable assistive technology that gives people who are blind, have low vision, or are autistic real-time access to others' non-verbal communication cues (NVC) during social interactions. By translating visual information into an intuitive haptic (sense of touch) sensation applied to the forearm, the HET breaks down barriers to effective and engaging conversation by raising awareness of non-verbal cues in real time. The technology uses AI and computer vision to detect and recognize facial expressions, emotions, gestures, and body language captured by a camera embedded in a pair of glasses, then maps each cue to a dynamic haptic pattern delivered to the user's forearm through a wearable sleeve. Each pattern produces a distinct sensation along the forearm corresponding to the facial expression, body language, or gesture of the person or people the user is conversing with. All patterns were designed, tested, and refined with end users to ensure the technology translates non-verbal cues accurately, intuitively, and intelligibly. With the HET, you can tell in real time whether someone is happy, upset, reaching out for a handshake, or waving at you.
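To give a sense of the cue-to-haptics mapping described above, here is a minimal sketch in Python. The cue labels, four-actuator sleeve layout, and pattern values are illustrative assumptions for demonstration only, not the HET's actual design or firmware.

```python
# Illustrative sketch only: cue labels, actuator layout, and pattern values
# are assumptions for demonstration, not the HET's actual implementation.

from dataclasses import dataclass
from typing import List

@dataclass
class HapticPulse:
    actuator: int     # index along a hypothetical 4-actuator forearm sleeve
    intensity: float  # 0.0 (off) to 1.0 (full strength)
    duration_ms: int  # how long this actuator vibrates

# Hypothetical mapping from a detected non-verbal cue to a forearm pattern,
# e.g. a smile sweeps gently up the arm; a wave pulses the two outer actuators.
CUE_PATTERNS = {
    "smile": [HapticPulse(a, 0.4, 150) for a in range(4)],
    "frown": [HapticPulse(a, 0.8, 300) for a in reversed(range(4))],
    "wave": [HapticPulse(0, 0.6, 100), HapticPulse(3, 0.6, 100)] * 2,
    "handshake_offered": [HapticPulse(1, 1.0, 250), HapticPulse(2, 1.0, 250)],
}

def render_pattern(cue: str) -> List[HapticPulse]:
    """Look up the haptic pattern for a detected cue; stay silent if unknown."""
    return CUE_PATTERNS.get(cue, [])

def drive_sleeve(pulses: List[HapticPulse]) -> None:
    """Stand-in for sleeve firmware: print each pulse instead of vibrating."""
    for p in pulses:
        print(f"actuator {p.actuator}: intensity {p.intensity:.1f} for {p.duration_ms} ms")

if __name__ == "__main__":
    # A vision model (not shown) would emit cue labels from the glasses camera;
    # here a single detection is simulated.
    detected_cue = "smile"
    drive_sleeve(render_pattern(detected_cue))
```

In practice, the detection step would run continuously on the camera feed and stream cue labels to the sleeve, but the core idea of a fixed, user-tested mapping from cue to pattern is the same.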
The HET significantly enhances the quality of life for its users by providing real-time, non-verbal communication insights, enabling them to confidently navigate any social interaction, from job interviews to casual dinners with friends. By offering more comprehensive information during these interactions, the HET not only empowers users socially but also increases their earning potential, helping them seize opportunities and engage more fully in professional and personal environments.
We are looking for beta testers who want to be at the forefront of technology, be part of the development process, and help the community.
If you are interested or have any questions, email or call Jack Walters: jackwalters@mines.edu or 720-280-8117