
Unit 2: Representation in Virtual Reality: Aural, Haptic & Integrative Dimensions

SY-MDM


Foundations of Virtual World Representation

  • Definition, abstraction levels, history, applications

Visual Representation in VR (Principles)

  • Human perception, rendering, realism vs. stylization

Visual Representation in VR (Advanced)

  • Avatars, environments, spatial illusions, foveated rendering

Aural Representation in VR (Basics)

  • Sound in immersion, binaural hearing, spatial audio

Aural Representation in VR (Advanced)

  • Interactive soundscapes, voice, emotion, technical challenges

Haptic Representation in VR

  • Types of haptics, applications, limitations, case studies

Integrative & Future VR Representation

  • Multimodal integration, cognitive load, ethics, neural interfaces


Introduction

  • VR → Beyond visuals: sound & touch critical for immersion
  • Goal: Explore aural, haptic & integrative representations
  • Key focus: Basics → Advanced → Future


Part 1 - Aural Representation (Basics)


Importance of Sound

  • Enhances immersion & realism
  • Directs attention, builds atmosphere
  • Example: Horror VR – sound more effective than visuals


Binaural Hearing

The human auditory system localizes sound using:

  • Interaural Time Difference (ITD)
  • Interaural Level Difference (ILD)
  • Head-Related Transfer Function (HRTF)

Activity: Play a stereo vs. binaural demo


Interaural Time Difference (ITD)

  • Definition:

ITD is the difference in arrival time of a sound wave at the left and right ears.

  • Mechanism:

If a sound source is to the right, it reaches the right ear slightly earlier than the left ear.

The brain uses this tiny delay (in microseconds) to determine the direction.

  • Important For:

Low-frequency sounds (<1500 Hz), since their wavelengths are long.

  • Visual: Diagram showing sound waves hitting one ear first, then the other (delay). A worked delay estimate follows below.
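To make the microsecond scale concrete, here is a minimal Python sketch of the classic Woodworth spherical-head approximation; the head radius and speed of sound are illustrative averages, and itd_woodworth is a name chosen for this sketch, not a standard API:

import math

SPEED_OF_SOUND = 343.0   # m/s, air at ~20 °C
HEAD_RADIUS = 0.0875     # m, a commonly assumed average head radius

def itd_woodworth(azimuth_deg: float) -> float:
    """Approximate ITD (seconds) for a far-field source at the given azimuth.

    Woodworth spherical-head model: ITD = (r / c) * (theta + sin(theta)),
    for azimuths between 0° (front) and 90° (fully to one side).
    """
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

# A source 90° to the right arrives earlier at the right ear:
print(f"{itd_woodworth(90) * 1e6:.0f} µs")   # ≈ 656 µs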


Interaural Level Difference (ILD)

  • Definition:

ILD is the difference in sound intensity (loudness) reaching each ear.

  • Mechanism:

The head blocks some sound energy, creating a “head shadow.” The ear closer to the sound hears it louder than the far ear.

  • Important For:

High-frequency sounds (>1500 Hz), since short wavelengths can be blocked.

  • Visual: Diagram showing a stronger sound wave on one side and a weaker one on the other (volume difference). A toy gain model follows below.
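A toy illustration of the head-shadow effect in code, assuming a 20 dB maximum ILD at ±90°; real ILDs vary strongly with frequency and individual anatomy, and the sinusoidal pan law is this sketch's choice rather than a measured model:

import math

def stereo_gains(azimuth_deg: float, max_ild_db: float = 20.0):
    """Map azimuth (0 = front, +90 = right, -90 = left) to left/right
    linear gains: full assumed ILD at ±90°, no difference at 0°."""
    ild_db = max_ild_db * math.sin(math.radians(azimuth_deg))
    right = 10 ** (+ild_db / 40)   # split the dB difference
    left = 10 ** (-ild_db / 40)    # symmetrically between the ears
    return left, right

left, right = stereo_gains(90)
print(left, right)   # right ear ~20 dB louder than the left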


Head-Related Transfer Function (HRTF)

  • Definition:

HRTF describes how the shape of the head, ears (pinna), and torso filter sound before it reaches the eardrum.

  • Mechanism:

Each person has a unique HRTF. It captures elevation (up/down) and front/back localization.

  • Important For:

Spatial audio in VR, gaming, and 3D sound rendering.

  • Visual: Diagram of sound waves being modified by the ear’s shape before entering the canal. A convolution sketch follows below.
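In rendering terms, applying an HRTF amounts to convolving the source signal with a left/right pair of head-related impulse responses (HRIRs). The sketch below uses random placeholder HRIRs; in practice they would be loaded from a measured dataset (e.g., the MIT KEMAR set or a SOFA file) for the desired direction:

import numpy as np
from scipy.signal import fftconvolve

# Placeholder HRIRs: decaying noise standing in for measured responses.
rng = np.random.default_rng(0)
hrir_left = rng.standard_normal(256) * np.exp(-np.arange(256) / 32)
hrir_right = rng.standard_normal(256) * np.exp(-np.arange(256) / 32)

def binauralize(mono: np.ndarray) -> np.ndarray:
    """Render a mono signal binaurally by convolving it with the
    left- and right-ear HRIRs of one fixed source direction."""
    left = fftconvolve(mono, hrir_left)
    right = fftconvolve(mono, hrir_right)
    return np.stack([left, right], axis=1)   # (samples, 2) stereo

mono = rng.standard_normal(48000)   # 1 s of noise at 48 kHz
stereo = binauralize(mono)
print(stereo.shape)                 # (48255, 2)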


Combined Role

ITD + ILD = Azimuth (left-right localization)

HRTF = Elevation & distance cues

Together: Enable full 3D sound perception in VR


Spatial Audio in VR

  • 3D positioning of sound sources (a minimal sketch follows below)
  • Environmental acoustics: echo, reverb, occlusion
  • Example: Thunder rumbling far away, footsteps behind the player
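A minimal sketch of 3D positioning that ignores HRTFs and reflections: gain falls off with the inverse of distance, and the signal is delayed by its propagation time. Function and variable names here are illustrative:

import numpy as np

SPEED_OF_SOUND = 343.0   # m/s

def position_source(mono, src_pos, listener_pos, sr=48000):
    """Apply inverse-distance attenuation and propagation delay for a
    static source; no directionality, no room acoustics."""
    dist = float(np.linalg.norm(np.asarray(src_pos, float) -
                                np.asarray(listener_pos, float)))
    gain = 1.0 / max(dist, 1.0)                    # clamp near the listener
    delay = int(round(dist / SPEED_OF_SOUND * sr))
    return np.concatenate([np.zeros(delay), gain * mono])

thunder = np.random.randn(48000)
far = position_source(thunder, src_pos=(300, 0, 50), listener_pos=(0, 0, 0))
# ~304 m away: heavily attenuated and delayed by ~0.89 s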


Echo

  • Definition: Echo is the distinct repetition of a sound caused by reflection from a surface.
  • Mechanism:
  • If a sound wave reflects and returns after roughly 50–100 ms, it is perceived as a separate sound (an echo).
  • Multiple reflections produce a series of repeated echoes (like shouting in a canyon).
  • Role in VR:
  • Helps convey large open spaces (mountains, caves).
  • Visual: Diagram of sound bouncing off a wall and returning later. A one-reflection sketch follows below.
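A one-reflection sketch of echo as a delay line: a delayed, attenuated copy of the signal is mixed back in, and at an 80 ms delay it is heard as a separate event (delay and gain values are illustrative):

import numpy as np

def add_echo(signal, sr=48000, delay_s=0.08, gain=0.5):
    """Mix in one delayed, attenuated copy of the signal,
    as if it had bounced off a single distant surface."""
    delay = int(delay_s * sr)
    out = np.zeros(len(signal) + delay)
    out[:len(signal)] += signal
    out[delay:] += gain * signal
    return out

clap = np.random.randn(4800)    # a 100 ms burst
echoed = add_echo(clap)         # direct sound + one echo 80 ms later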


Reverberation (Reverb)

  • Definition: Reverb is the persistence of sound after the source stops, due to many small reflections blending together.
  • Mechanism:
  • Unlike echo, reverb overlaps with the original sound.
  • Characterized by decay time (RT60) — how long it takes to fade.
  • Role in VR:
  • Gives users a sense of room size, surface material, and atmosphere.
  • Example: short reverb in a small room, long reverb in a cathedral.
  • Visual: A room with sound waves bouncing repeatedly in all directions. A minimal comb-filter sketch follows below.
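A deliberately minimal reverb sketch built from a single feedback comb filter (real designs, e.g. Schroeder reverbs, combine several combs plus all-pass stages). The feedback gain comes straight from the RT60 definition, 60 dB of decay after rt60 seconds:

import numpy as np

def comb_reverb(signal, sr=48000, delay_s=0.03, rt60=1.5):
    """One feedback comb filter whose loop gain
    g = 10 ** (-3 * delay_s / rt60)
    makes the tail decay by 60 dB over rt60 seconds."""
    delay = int(delay_s * sr)
    g = 10 ** (-3 * delay_s / rt60)
    out = np.concatenate([signal.astype(float), np.zeros(int(rt60 * sr))])
    for n in range(delay, len(out)):
        out[n] += g * out[n - delay]
    return out

impulse = np.zeros(480); impulse[0] = 1.0
small_room = comb_reverb(impulse, rt60=0.3)   # short tail
cathedral = comb_reverb(impulse, rt60=4.0)    # long tail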


Occlusion

  • Definition: Occlusion is the reduction or alteration of sound when an object blocks the direct path between source and listener.
  • Mechanism:
  • Direct sound is weakened.
  • Remaining sound is mostly reflections and muffled frequencies.
  • Role in VR:
  • Important for realism & immersion — e.g., hearing someone talk behind a wall or door.
  • Visual: Head on one side of a wall, sound source on the other; the direct path is blocked, and only diffracted/reflected waves reach the ear. A low-pass sketch follows below.
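A common game-audio approximation rather than a physical simulation: attenuate the occluded source and low-pass it, since a wall removes most of the direct-path and high-frequency energy (cutoff and level values are illustrative):

import numpy as np

def occlude(signal, sr=48000, cutoff_hz=800.0, level=0.4):
    """One-pole IIR low-pass plus attenuation, standing in for a
    source heard through a wall."""
    a = np.exp(-2 * np.pi * cutoff_hz / sr)   # filter pole
    out = np.empty(len(signal))
    y = 0.0
    for n, x in enumerate(signal):
        y = (1 - a) * x + a * y               # smoothing removes highs
        out[n] = y
    return level * out

voice = np.random.randn(48000)
behind_wall = occlude(voice)    # quieter and muffled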


Azimuth

In spatial audio / VR:

  • Definition: Azimuth is the horizontal angle of a sound source around the listener’s head, measured relative to the front-facing direction (0°).
  • Range:
    0° → sound directly in front
    90° → sound to the right
    180° → sound directly behind
    270° (or –90°) → sound to the left


Azimuth (contd.)

  • Role: ITD (Interaural Time Difference) and ILD (Interaural Level Difference) mainly help the brain determine azimuth, i.e., left vs. right positioning.
  • 📌 Quick Example in VR: If someone claps their hands to your right, the azimuth is ~90°. If the clap is behind you, the azimuth is 180°.
  • 👉 In short: Azimuth = left–right positioning angle of a sound source (whereas Elevation = up–down angle). A computation sketch follows below.
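A small sketch of how an engine might compute azimuth from listener and source positions on the horizontal plane; the coordinate convention (forward = +y, angles increasing clockwise toward the right) is this example's assumption:

import math

def azimuth_deg(listener_pos, listener_forward, source_pos):
    """Return the source's azimuth in degrees: 0 front, 90 right,
    180 behind, 270 left. Inputs are 2-D (x, y) tuples."""
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    source_angle = math.atan2(dx, dy)                  # vs. the +y axis
    facing_angle = math.atan2(listener_forward[0], listener_forward[1])
    return math.degrees(source_angle - facing_angle) % 360

# Listener at the origin facing +y; a clap at (5, 0) is directly right.
print(azimuth_deg((0, 0), (0, 1), (5, 0)))   # 90.0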


Summary

  • ITD, ILD, HRTF → Spatial positioning (where sound comes from)
  • Echo, Reverb, Occlusion → Environmental realism (what space the sound is in)


Part 2 - Aural Representation (Advanced)


Interactive Soundscapes

  • Dynamic audio responding to user actions
  • Adaptive soundtracks → emotional impact
  • Example: Music intensifies when danger approaches in a VR game (see the sketch below)
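One way to sketch "music intensifies as danger approaches": crossfade two music layers by a normalized danger parameter. The layer names are illustrative; production middleware (e.g., FMOD or Wwise) exposes similar parameter-driven mixing:

def layer_gains(danger: float) -> dict:
    """Linear crossfade between a calm and a tense music layer,
    driven by a game-state 'danger' value clamped to 0..1."""
    danger = min(max(danger, 0.0), 1.0)
    return {"calm_layer": 1.0 - danger, "tense_layer": danger}

# Enemy 40 m away with a 50 m detection radius -> still mostly calm.
print(layer_gains(1 - 40 / 50))   # {'calm_layer': 0.8, 'tense_layer': 0.2}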


Voice & Emotion

  • Voice input for VR interaction
  • Emotion conveyed through tone & pitch
  • Real-time voice modulation (a naive sketch follows below)
  • Discussion: How voice can build a stronger avatar identity
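A toy sketch of voice modulation by naive resampling. It shifts pitch but also stretches duration, which is exactly why real-time avatar voice changers use phase-vocoder or PSOLA techniques instead:

import numpy as np

def pitch_shift_naive(signal, semitones: float):
    """Resample the signal by 2 ** (semitones / 12); played back at
    the original rate, the pitch shifts and the length changes too."""
    ratio = 2 ** (semitones / 12)
    old_idx = np.arange(len(signal))
    new_idx = np.arange(0, len(signal) - 1, ratio)
    return np.interp(new_idx, old_idx, signal)

t = np.arange(48000) / 48000
voice = np.sin(2 * np.pi * 220 * t)      # 220 Hz test tone
deeper = pitch_shift_naive(voice, -4)    # ~4 semitones lower, longer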


Technical Challenges

  • Latency & sync issues
  • Personalized HRTFs → still a challenge
  • Hardware/bandwidth limitations
  • Example: Lag in multiplayer VR sound = broken immersion


Part 3 - Haptic Representation in VR


Types of Haptic Feedback

  • Tactile feedback: vibrations, textures (see the envelope sketch below)
  • Force feedback: resistance, weight
  • Kinesthetic: body movement, exoskeletons
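In practice, tactile feedback often reduces to streaming an amplitude envelope to a vibrotactile actuator. The sketch below builds a recoil-style pulse; the controller.set_vibration call at the end is hypothetical, since real APIs (OpenXR haptics, game-engine controller interfaces) differ:

import numpy as np

def recoil_envelope(sr=1000, attack_ms=5, decay_ms=120):
    """Amplitude envelope (0..1) for a weapon-recoil buzz: a sharp
    attack followed by an exponential decay. sr is the update rate
    at which values would be sent to the actuator."""
    attack = np.linspace(0.0, 1.0, int(sr * attack_ms / 1000))
    decay = np.exp(-np.arange(int(sr * decay_ms / 1000)) / (sr * 0.03))
    return np.concatenate([attack, decay])

env = recoil_envelope()
# for amp in env: controller.set_vibration(amp)   # hypothetical API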


Applications

  • Gaming: weapon recoil, object handling
  • Medical VR: surgery simulation, rehab
  • Education: tactile prototyping, science labs
  • Case Example: Haptics in VR surgical training


Limitations

  • High cost & complexity
  • Fatigue in long sessions
  • Realism gap vs physical world
  • Prompt: Why hasn’t haptics gone mainstream yet?


Case Studies


TeslaSuit full-body haptic suit


Part 4 - Integrative & Future VR Representation


Multimodal Integration

  • Vision + audio + haptics = stronger immersion
  • Cross-modal effects: sound changes touch perception
  • Example: Crunch sound enhances virtual apple bite


Cognitive Load

  • Too much sensory input → overload
  • Balance realism vs usability
  • Discussion: Where should designers draw the line?


Ethics in VR

  • Manipulation through immersive cues
  • Voice/biometric data privacy
  • Safety (addiction, physical harm)
  • Question: Should VR be regulated like drugs/media?


Neural Interfaces (Future)

  • Brain-computer interfaces (BCI)
  • EEG-based VR control
  • Thought-controlled interaction
  • Vision: Full sensory VR via neural links


Summary

  • Aural VR: from binaural basics → interactive soundscapes
  • Haptic VR: tactile, force, kinesthetic → applications & limits
  • Future VR: multimodal, ethics, neural interfaces
  • “Immersion is not only what we see, but what we hear & feel.”