Body Tracking with Holographic Augmented Reality Displays to Increase Rehabilitation Compliance
Rachel Naidich
Stanford University Department of Computer Science
rnaidich@stanford.edu

Hand Tracking
Fig.1: The software uses the MRTK hand tracking capabilities to overlay a cube object on each joint of the user's hand. This allows the user to see that their hand has been detected by the device and enables the user to physically interact with the virtual objects that appear.
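The joint overlay in Fig.1 amounts to copying each tracked joint position onto a small cube marker every frame. The actual implementation uses MRTK's hand-joint data inside Unity (C#); the Python sketch below only illustrates that per-frame logic, and the Marker type, joint names, and data shapes are illustrative stand-ins rather than the MRTK API.

```python
from dataclasses import dataclass

@dataclass
class Marker:
    """A stand-in for the cube object rendered at one hand joint."""
    position: tuple = (0.0, 0.0, 0.0)
    visible: bool = False

def update_joint_markers(joint_poses, markers):
    """Snap one marker onto each tracked joint; hide markers when tracking is lost.

    joint_poses: dict mapping joint name -> (x, y, z), or None if the hand is
    not currently detected (assumed shape, for illustration only).
    """
    for name, marker in markers.items():
        pose = (joint_poses or {}).get(name)
        if pose is None:
            marker.visible = False       # joint not tracked this frame
        else:
            marker.position = pose       # overlay the cube on the joint
            marker.visible = True

# Example with three of the hand joints MRTK can report.
markers = {name: Marker() for name in ("wrist", "indexTip", "thumbTip")}
update_joint_markers({"wrist": (0.10, 0.90, 0.30), "indexTip": (0.15, 0.95, 0.32)}, markers)
print({name: (m.visible, m.position) for name, m in markers.items()})
```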
Hand Exercise Gamification
Fig.2: A: A screenshot from the HoloLens at the beginning of the hand exercise scene. When the user first starts the scene, the first set of instructions appears for the user to complete the gravity-eliminated finger flexion exercise. The user must place the side of their hand on a flat surface, and for the HoloLens to detect the hand, the user must look at it. B: A screenshot from the HoloLens of a closed fist. After the HoloLens detects the hand, the hand exercise game begins and instructs the user to bend their fingers. When the user bends their fingers all the way into a fist, the on-screen text shows a high flexion percentage, which can be used to motivate the user to bend as much as possible. Cubes then begin to appear next to the hand and move towards it, and the user must flatten their fingers back out to smash each cube. C: A screenshot from the HoloLens of a flat hand. When the user flattens out their hand, the text shows a very low flexion percentage. If the user successfully touches a cube, the cube explodes and disappears. A repetition is only counted and displayed in the text when the user successfully smashes a cube. This visual feedback makes the repetitive exercise more enjoyable and engaging.
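The caption does not state how the flexion percentage is computed; one simple proxy, assumed here purely for illustration, is to normalize the fingertip-to-palm distance between calibrated "open" and "closed" values. The CubeSmashGame class, its constants, and the threshold for a "flat" hand are hypothetical; in the application itself this logic runs in Unity/C# against MRTK joint data.

```python
import math

def flexion_percent(fingertip, palm, open_dist=0.18, closed_dist=0.04):
    """Rough flexion estimate from fingertip-to-palm distance.

    open_dist / closed_dist are illustrative calibration constants (meters):
    the distance when the hand is flat vs. fully fisted.
    """
    d = math.dist(fingertip, palm)
    t = (open_dist - d) / (open_dist - closed_dist)
    return round(100 * max(0.0, min(1.0, t)))

class CubeSmashGame:
    """Counts a repetition only when a flattened hand touches a cube."""

    def __init__(self, smash_radius=0.05):
        self.smash_radius = smash_radius
        self.reps = 0

    def update(self, fingertip, palm, cube_position):
        pct = flexion_percent(fingertip, palm)
        hand_is_flat = pct < 20                               # fingers straightened back out
        touching = math.dist(fingertip, cube_position) < self.smash_radius
        if hand_is_flat and touching:
            self.reps += 1                                    # cube "explodes"; rep recorded
            return pct, True
        return pct, False

game = CubeSmashGame()
# Flat hand (fingertip far from the palm) touching a nearby cube -> rep counted.
print(game.update(fingertip=(0.20, 1.00, 0.30), palm=(0.20, 1.00, 0.47),
                  cube_position=(0.21, 1.00, 0.31)))
```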
Body Tracking
Fig.3: A user posing with the HoloLens on, with the user's view in the HoloLens displayed on a monitor. The software uses the Azure Kinect to track body movements. The body that the Kinect has detected and is tracking shows up as a yellow figure in the bottom right of the computer monitor. The tracked joints are then mapped onto an avatar, which mirrors the movements of the user.
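As a rough illustration of the mirroring step, the sketch below flips tracked joint positions across the vertical plane and swaps left/right joint labels so the avatar behaves like a reflection of the user. The joint names mimic the Azure Kinect Body Tracking SDK's left/right suffix style, but the function, coordinate convention, and data layout are assumptions for this sketch, not the project's actual Unity code.

```python
def mirror_joints(tracked_joints):
    """Mirror tracked body joints across the vertical plane so the avatar
    moves like the user's reflection.

    tracked_joints: dict of joint name -> (x, y, z) in a camera-facing frame
    (assumed convention). Left/right labels are swapped so the mirrored
    skeleton stays consistently rigged.
    """
    def swap_side(name):
        if name.endswith("_LEFT"):
            return name[:-len("_LEFT")] + "_RIGHT"
        if name.endswith("_RIGHT"):
            return name[:-len("_RIGHT")] + "_LEFT"
        return name

    return {swap_side(name): (-x, y, z) for name, (x, y, z) in tracked_joints.items()}

# Example frame: the user raises their left hand.
frame = {
    "PELVIS": (0.00, 0.90, 2.00),
    "HAND_LEFT": (-0.40, 1.50, 1.90),
    "HAND_RIGHT": (0.30, 0.80, 2.00),
}
print(mirror_joints(frame))
```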
Body Tracking Gamification
Fig.4: A: A screenshot from the HoloLens of the squat detection scene. The user can view a mirrored version of themselves along with text that displays the number of squats they have completed. The program tracks the position of the hip center as the user moves. When the user squats down far enough and the program detects the change in location of the hip center, the squat count displayed in the text is incremented. Other exercises, such as raising an arm, can be detected similarly. Progress for dynamic exercises that require more precise measurements can be displayed as a percentage of movement rather than as repetitions completed. B: An image of the user squatting, which is mirrored by the avatar on the screen.
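The caption describes the squat counter only at a high level. The sketch below shows one plausible implementation, assuming the Azure Kinect supplies the hip-center height each frame: the count increments once the hip center drops far enough, and the user must return to near standing before the next repetition can register. The SquatCounter class and its thresholds are illustrative, not the poster's actual parameters; it also returns a movement percentage of the kind suggested for exercises that need more precise measurement.

```python
class SquatCounter:
    """Counts squat repetitions from the tracked hip-center height.

    A repetition is registered when the hip center drops at least
    `down_thresh` meters below the standing height; the user must return to
    within `up_thresh` of standing before the next one can be counted.
    """

    def __init__(self, standing_height, down_thresh=0.25, up_thresh=0.05):
        self.standing_height = standing_height
        self.down_thresh = down_thresh
        self.up_thresh = up_thresh
        self.in_squat = False
        self.count = 0

    def update(self, hip_center_y):
        drop = self.standing_height - hip_center_y
        if not self.in_squat and drop >= self.down_thresh:
            self.in_squat = True
            self.count += 1                  # squat detected: increment the displayed count
        elif self.in_squat and drop <= self.up_thresh:
            self.in_squat = False            # user stood back up; ready for the next rep
        # Progress toward a full squat, usable as a percentage display.
        progress = max(0.0, min(1.0, drop / self.down_thresh))
        return self.count, round(100 * progress)

counter = SquatCounter(standing_height=0.95)
for y in (0.95, 0.80, 0.68, 0.66, 0.80, 0.94):   # one simulated squat, hip height in meters
    print(counter.update(y))
```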
Discussion