InterViewR: The Virtual Reality Interview Training Platform
Icons provide live feedback to the user about their eye contact, heart rate, response time, and volume level.
- Finding ways to monitor the user's state (e.g., eye contact and heart rate).
- Providing useful, unobtrusive real-time feedback to the user.
- Creating intuitive and realistic VR environments as well as user interaction.
- Sending data to users so they can analyze their interview progress.
- Advanced eye gaze detection.
- Handling of disruptions.
- Additional distractions.
- Realistic text-to-speech audio.
Interviews are challenging for everyone but are especially hard for individuals with ASD.
Up to 90% of adults with ASD are unemployed or underemployed.
InterViewR aims to address this challenge.
Chris White, Nick Kim, Marley Medema, Anais Dawson, Matthew Stepita
Advisors: Dr. Wesley Deneke, Dr. Moushumi Sharmin, Dr. Shameem Ahmed
- InterViewR is a VR-based interview simulator designed to help individuals with Autism Spectrum Disorder (ASD) get interview training and, ultimately, employment.
- InterViewR provides users with real-time feedback that helps them maintain appropriate social interaction (eye contact, speaking volume, response length) and tracks their physiological response (heart rate variation).
- There are multiple environments and varying numbers of interviewers to provide a wide range of user experiences.
- When the program starts, the user appears in an introductory area where they can acclimate to the VR environment (upper left). There they can customize their experience and, once ready, start the interview.
- A conference room will render around the user, and depending on the selected options, distractions will appear as well.
- The interviewers ask the user standard questions, and feedback about their answers is displayed within the environment in a picture frame.
- After answering the questions, the user may view statistics and feedback on their interview performance.
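The intro → interview → summary sequence above can be sketched as a simple state flow. This is an illustrative sketch only, assuming a three-stage session; the stage names and function are hypothetical, not taken from InterViewR's implementation.

```python
# Hypothetical sketch of the session flow described above:
# intro area -> interview in the conference room -> optional summary screen.
from enum import Enum, auto
from typing import Optional


class Stage(Enum):
    INTRO = auto()       # user acclimates and customizes options
    INTERVIEW = auto()   # conference room renders; interviewers ask questions
    SUMMARY = auto()     # optional statistics and feedback


def next_stage(current: Stage, view_summary: bool = True) -> Optional[Stage]:
    """Advance the session; returning None means the session has ended."""
    if current is Stage.INTRO:
        return Stage.INTERVIEW
    if current is Stage.INTERVIEW:
        return Stage.SUMMARY if view_summary else None
    return None  # SUMMARY is the final stage


print(next_stage(Stage.INTRO))  # Stage.INTERVIEW
```

The optional `view_summary` flag mirrors the user's choice of whether to see post-interview statistics.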
NEAT Research Lab, Computer Science Department
- Personalized user experience: the number of interviewers and the amount of environmental distractions can be adjusted.
- Feedback on eye contact, volume level, heart rate, and response length provided in real time.
- Dynamic change of environment details (color schemes, object types).
- Seamless interaction with environment objects.
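The real-time feedback icons can be thought of as threshold checks over the monitored signals (eye contact, volume, heart rate, response length). The sketch below is a minimal illustration under assumed, hypothetical thresholds and field names; it does not reflect InterViewR's actual sensor pipeline or values.

```python
# Illustrative sketch: mapping raw readings to per-icon feedback states.
# All thresholds and names here are assumptions for illustration only.
from dataclasses import dataclass


@dataclass
class Reading:
    gaze_on_interviewer: bool   # e.g., from headset eye tracking
    volume_db: float            # microphone input level
    heart_rate_bpm: int         # e.g., from a wearable sensor
    response_seconds: float     # length of the current answer


def feedback_icons(r: Reading) -> dict:
    """Return one state per feedback icon: 'ok' or 'warn'."""
    return {
        "eye_contact": "ok" if r.gaze_on_interviewer else "warn",
        "volume": "ok" if 40.0 <= r.volume_db <= 70.0 else "warn",
        "heart_rate": "ok" if r.heart_rate_bpm <= 100 else "warn",
        "response_time": "ok" if 10.0 <= r.response_seconds <= 120.0 else "warn",
    }


sample = Reading(gaze_on_interviewer=True, volume_db=55.0,
                 heart_rate_bpm=88, response_seconds=45.0)
print(feedback_icons(sample))
```

In a VR client, a loop like this would run each frame and update the on-screen icons, keeping the feedback visible but unobtrusive.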