1 of 17

Privacy Leakage Study and Protection for Virtual Reality Devices

Dirk Catpo Risco, Brody Vallier, and Emily Yao

11 July 2024

Project Advisor: Dr. Chen

Mentors: Changming Li, Honglu Li, and Tianfang Zhang

2 of 17

Introducing the Team

Team members:
  • Emily Yao (HTHS HS)
  • Dirk Catpo Risco (RU ECE MS)
  • Brody Vallier (RU ECE UG)

Mentors:
  • Changming Li (RU ECE PhD)
  • Honglu Li (RU ECE PhD)
  • Tianfang Zhang (RU ECE PhD)

Advisor:
  • Dr. Yingying (Jennifer) Chen

3 of 17

Project Overview

  • Augmented reality (AR)/virtual reality (VR) devices are becoming more popular
  • Used in many applications (e.g., healthcare, communication, tourism)
  • Privacy concerns arise from zero-permission sensors, which apps can read without requesting user consent
  • We study activity privacy leakage in AR/VR devices
  • Our study focuses on activity recognition based on motion sensors


4 of 17

Week 6 Recap

  • Listed the motion data to capture for training a convolutional neural network (CNN)
    • Studied existing code for building CNN models (a sketch follows this list)
  • Defined specifics of the motion data
    • How many samples
    • How many users
    • Which data we will use
  • Designed a basic prompt for a large language model (LLM)
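For illustration, a minimal sketch of the kind of 1-D CNN we are studying, written with the Keras API. The window length, channel count, and layer sizes are placeholder assumptions, not the final design:

    # Minimal 1-D CNN sketch for classifying IMU windows into six activities.
    # Shapes are assumptions: 200-sample windows with 6 channels
    # (3-axis accelerometer + 3-axis gyroscope from one device).
    from tensorflow.keras import layers, models

    NUM_CLASSES = 6    # the six motion patterns (Activities 1-6)
    WINDOW_LEN = 200   # assumed samples per recording
    NUM_CHANNELS = 6   # assumed: ax, ay, az, gx, gy, gz

    model = models.Sequential([
        layers.Input(shape=(WINDOW_LEN, NUM_CHANNELS)),
        layers.Conv1D(32, kernel_size=5, activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        layers.Conv1D(64, kernel_size=5, activation="relu"),
        layers.GlobalAveragePooling1D(),
        layers.Dense(64, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])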


5 of 17

Week 7 Progress

  • Collected 250 samples from six different motions to enlarge the dataset for the CNN task (a loading sketch follows this list)
  • Designed prompts with specific parts for LLMs to set up the activity recognition task
  • Tested the prompt with different LLMs
    • ChatGPT 4o
    • Gemini Advanced
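A sketch of how the collected samples could be loaded into labeled arrays for the CNN task; the directory layout and file naming are assumptions for illustration:

    # Load collected motion samples into labeled training arrays.
    # Assumed layout: data/activity_1/sample_000.csv, data/activity_2/..., etc.
    from pathlib import Path
    import numpy as np
    import pandas as pd

    DATA_DIR = Path("data")  # hypothetical root folder for the collected CSVs

    windows, labels = [], []
    for activity_dir in sorted(DATA_DIR.glob("activity_*")):
        label = int(activity_dir.name.split("_")[1]) - 1  # Activity 1 -> class 0
        for csv_path in sorted(activity_dir.glob("*.csv")):
            df = pd.read_csv(csv_path)  # one recorded motion sample
            windows.append(df.to_numpy(dtype=np.float32))
            labels.append(label)

    X = np.stack(windows)  # assumes samples were trimmed to a common length
    y = np.array(labels)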


6 of 17

Motions for Data Collection

  • Arm movements
    • Front raising and lowering
    • Side raising and lowering
  • Head movements
    • Looking left, then back to center
    • Looking right, then back to center
    • Looking up, then back to center
    • Looking down, then back to center


7 of 17

Hand Motion Design


  • Collecting activity data from arm movements
    • Raising and lowering the arm to the front (Activity 1)
    • Raising and lowering the arm to the side (Activity 2)

[Figure: illustrations of Activity 1 (front raise) and Activity 2 (side raise)]

8 of 17

Head Motion Design


  • Collecting activity data from head movements
    • Looking left, then back to center (Activity 3)
    • Looking right, then back to center (Activity 4)

[Figure: illustrations of Activity 3 (look left) and Activity 4 (look right)]

9 of 17

Head Motion Design


  • Collecting activity data from head movements
    • Looking down, then back to center (Activity 5)
    • Looking up, then back to center (Activity 6)

[Figure: illustrations of Activity 5 (look down) and Activity 6 (look up)]
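Taken together, the motion design slides define six activity labels, which can be kept in a single mapping for labeling data and reporting results:

    # Activity labels as defined on the motion design slides above.
    ACTIVITIES = {
        1: "raising and lowering the arm to the front",
        2: "raising and lowering the arm to the side",
        3: "looking left, then back to center",
        4: "looking right, then back to center",
        5: "looking down, then back to center",
        6: "looking up, then back to center",
    }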

10 of 17

LLM Prompt Design

  • Design prompts that describe the task in natural language as input to the LLM
  • We use expert knowledge to guide the LLM by including explicit text-based descriptions of the relationship between sensor patterns and user activity


11 of 17

LLM Prompt Detailed Design

  • Design prompts including four parts (an assembly sketch follows this list):
    • Objective
    • Background
    • Data and Expert Knowledge
    • Response Format
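A minimal sketch of how the four parts can be assembled into one prompt string; the section texts passed in below are abbreviated placeholders, not our exact wording:

    # Assemble the four prompt parts into one natural-language task description.
    def build_prompt(objective, background, data_and_expert_knowledge,
                     response_format):
        return "\n\n".join([
            "Objective:\n" + objective,
            "Background:\n" + background,
            "Data and Expert Knowledge:\n" + data_and_expert_knowledge,
            "Response Format:\n" + response_format,
        ])

    prompt = build_prompt(
        objective="Determine the user's head and hand motions ...",
        background="Acceleration and gyroscope patterns vary with motion ...",
        data_and_expert_knowledge="You will receive .csv files from ...",
        response_format="Graph, then Reasoning, then Summary ...",
    )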


12 of 17

LLM Prompt Detailed Design

  • Objective
    • Determine the user’s head and hand motions by analyzing acceleration and gyroscope data from an Oculus Meta Quest HMD and controllers
  • Background
    • Acceleration and gyroscope data patterns vary based on the user’s head and hand motions. These patterns help infer the user’s actions with their head and hands


13 of 17

LLM Prompt Detailed Design

  • Data and Expert Knowledge
    • You will receive acceleration and gyroscope data from an Oculus Meta Quest HMD and controllers as .csv files. Here's how to interpret this data:
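Before attaching a file to the prompt, it helps to verify its contents; a quick inspection sketch, with column names that are assumptions about the capture format:

    # Inspect one device's CSV before handing it to the LLM.
    # Column names are assumptions; the real capture format may differ.
    import pandas as pd

    df = pd.read_csv("hmd_sample.csv")  # hypothetical HMD recording
    print(df.columns.tolist())          # e.g., accel_x..accel_z, gyro_x..gyro_z
    print(df.describe())                # value ranges help sanity-check a capture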


14 of 17

LLM Prompt Detailed Design

  • Response Format
    • Graph: Plot the .csv files for each device
    • Reasoning: Provide a comprehensive analysis of the accelerometer and gyroscope data for each device
    • Summary: Conclude with a brief summary of your findings, detailing the inferred head and hand motions
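The "Graph" step asks the LLM to plot each file; for comparison, an equivalent local plotting sketch (column names again assumed):

    # Plot one device's accelerometer and gyroscope traces, mirroring the
    # "Graph" step of the response format. Column names are assumptions.
    import matplotlib.pyplot as plt
    import pandas as pd

    df = pd.read_csv("controller_sample.csv")  # hypothetical recording
    fig, (ax_acc, ax_gyr) = plt.subplots(2, 1, sharex=True)
    for col in ("accel_x", "accel_y", "accel_z"):
        ax_acc.plot(df[col], label=col)
    for col in ("gyro_x", "gyro_y", "gyro_z"):
        ax_gyr.plot(df[col], label=col)
    ax_acc.set_ylabel("acceleration")
    ax_gyr.set_ylabel("angular velocity")
    ax_acc.legend()
    ax_gyr.legend()
    plt.show()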


15 of 17

LLM Result: ChatGPT 4o

  • Inaccurate results using Activity 2 data


[Figure: ChatGPT 4o response to Activity 2 data]

16 of 17

LLM Result: Gemini Advanced

  • Accurate results using Activity 2 data


[Figure: Gemini Advanced response to Activity 2 data]

17 of 17

Week 8 Goals

  • Improve the prompt design to get more accurate prediction results from LLMs

  • Begin developing a CNN using the samples collected from the six motion patterns (a training sketch follows)
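A sketch of what that first training step might look like, reusing X and y from the loading sketch and model from the CNN sketch earlier in this deck (both assumptions, not final code):

    # Hold out a test split, then fit the sketched CNN on the collected samples.
    from sklearn.model_selection import train_test_split

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, stratify=y, random_state=0)

    model.fit(X_train, y_train, epochs=20, batch_size=16,
              validation_data=(X_test, y_test))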
