1 of 32

Global Illumination

Instructor: Christopher Rasmussen (cer@cis.udel.edu)

May 3, 2023 ❖ Lecture 33

2 of 32

Outline

  • Diffuse inter-reflection and other light paths: Photon mapping/path tracing
  • Noise

3 of 32

Photon Mapping (H. Jensen, 1996)

  • Two-pass algorithm somewhat like bidirectional ray tracing, but photons stored differently
    • Related to particle tracing approach in Marschner 23.1
  • 1st pass: Build photon map (analog of rexes)
    • Shoot random rays from light(s) into scene
    • Each photon carries fraction of light’s power
    • Follow specular bounces, but store photons in map at each diffuse surface hit (or scattering event)
  • 2nd pass: Render scene
    • Modified ray tracing: follow eye rays into scene
    • Use photons near each intersection to compute light

4 of 32

Photon Mapping: 1st pass (Write)

  • Probabilistically decide on photon reflection, transmission, or absorption based on material properties of object hit
    • Specular surface: Send new photon (with scaled-down power) in reflection/refraction direction just like ray tracing
    • Diffuse surface: If at least one bounce, store photon in photon map, send new photon in random direction (usually cosine distribution, see Marschner 14.4.1)
      • So do NOT store photon at specular interactions
    • Arbitrary BRDF: Use BRDF as probability distribution on new photon’s direction
  • Photon map is kd-tree
    • Decoupling from scene geometry allows using fewer photons than there are scene objects/triangles (no texture maps, no meshes needed)
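A minimal C++ sketch of the per-hit decision described above; the Photon record and the kd/ks reflectances are illustrative assumptions, not part of any particular renderer:

// Minimal sketch (not Jensen's exact code): Russian-roulette decision at a
// surface hit during pass 1, plus the rule for when to store a photon.
// kd, ks are the hit surface's diffuse and specular reflectances, kd + ks <= 1.
#include <array>
#include <random>
#include <vector>

struct Photon {
    std::array<float, 3> position;   // where the photon landed
    std::array<float, 3> direction;  // incident direction
    std::array<float, 3> power;      // RGB flux carried by the photon
};

enum class Event { DiffuseBounce, SpecularBounce, Absorb };

// Choose the photon's fate with probability proportional to reflectance.
// Surviving photons keep their full power, which in expectation equals
// scaling the power down at every bounce and keeps stored photons at
// roughly equal power.
Event scatter(float kd, float ks, std::mt19937& rng)
{
    float xi = std::uniform_real_distribution<float>(0.f, 1.f)(rng);
    if (xi < kd)      return Event::DiffuseBounce;   // also store (see below)
    if (xi < kd + ks) return Event::SpecularBounce;  // reflect/refract, do NOT store
    return Event::Absorb;
}

// Store only at diffuse hits after at least one bounce, never at specular hits.
void maybeStore(std::vector<Photon>& map, const Photon& p, bool diffuseHit, int bounces)
{
    if (diffuseHit && bounces >= 1)
        map.push_back(p);
}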

5 of 32

kd-trees

  • Each point parametrizes an axis-aligned splitting plane; the splitting axis rotates (cycles) at each level of the tree
  • Example kd tree for k = 2 and N = 6:


  • But balance is important to get O(log N) efficiency for nearest-neighbor queries
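A compact sketch of such a tree over photon positions; the function names are just for illustration. buildKd reorders an array into kd-tree form by median split (cycling the axis with depth), and gather returns indices of all points within a search radius:

// Minimal 3-D kd-tree sketch: build by median split, cycling the split axis
// with depth, then gather all points within radius r of a query point.
// Balanced median splits are what give the O(log N) query behavior.
#include <algorithm>
#include <array>
#include <vector>

using Point = std::array<float, 3>;

// Recursively reorder pts[lo, hi) so the median along the current axis sits
// at the middle index; the two halves become the left/right subtrees.
void buildKd(std::vector<Point>& pts, int lo, int hi, int depth = 0)
{
    if (hi - lo <= 1) return;
    int axis = depth % 3, mid = (lo + hi) / 2;
    std::nth_element(pts.begin() + lo, pts.begin() + mid, pts.begin() + hi,
                     [axis](const Point& a, const Point& b) { return a[axis] < b[axis]; });
    buildKd(pts, lo, mid, depth + 1);
    buildKd(pts, mid + 1, hi, depth + 1);
}

// Collect indices of all points within radius r of q, skipping any subtree
// whose splitting plane lies farther than r from the query point.
void gather(const std::vector<Point>& pts, int lo, int hi, int depth,
            const Point& q, float r, std::vector<int>& out)
{
    if (hi <= lo) return;
    int axis = depth % 3, mid = (lo + hi) / 2;
    const Point& p = pts[mid];
    float d2 = 0.f;
    for (int i = 0; i < 3; ++i) d2 += (p[i] - q[i]) * (p[i] - q[i]);
    if (d2 <= r * r) out.push_back(mid);
    float planeDist = q[axis] - p[axis];        // signed distance to splitting plane
    if (planeDist <= r)  gather(pts, lo, mid, depth + 1, q, r, out);      // near side
    if (planeDist >= -r) gather(pts, mid + 1, hi, depth + 1, q, r, out);  // far side only if sphere crosses plane
}

Usage: call buildKd(photonPositions, 0, (int)photonPositions.size()) once after pass 1, then gather(...) around each eye-ray hit in pass 2.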

7 of 32

Photon Map: Example

8 of 32

Raytraced scene (courtesy of P. Christensen)

9 of 32

Photon map of scene (n = 500,000) [notice nothing stored at pure specular surfaces]

10 of 32

WebGL path tracer (not same as photon mapping, but similar)

11 of 32

Blender "Cycles" path tracer (not same as photon mapping, but similar)

Tutorial from same source as earlier texture-mapping videos

Another rendering tutorial

12 of 32

Photon Mapping: 2nd pass (Read)

  • For each eye ray intersection, estimate irradiance as a function of nearby photons:
    • Each photon stores position, power, and incident direction, so it can be treated like a mini light source
    • Use filtering (cone or Gaussian) to weight nearer photons more heavily
    • Can use discs instead of spheres to gather only photons from the same planar surface
  • For soft, indirect illumination, irradiance estimates are combined with standard local illumination calculations after final gathering, which shoots additional rays to bring back irradiance estimates from other diffuse surfaces, just as ray tracing adds reflection/refraction components to the local color
  • As usual, more accurate with more photons → Use multiple maps for different phenomena
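A sketch of the density estimate this slide describes, assuming a trimmed-down Photon record and the set of photons already gathered within radius r of the hit point (e.g., by the kd-tree query above); the cone weight and its normalization follow the usual cone-filter form, with k >= 1 controlling sharpness:

// Irradiance estimate at an eye-ray hit point x from the nearby photons.
// A cone filter weights nearer photons more; weight is 1 at x and 0 at distance k*r.
#include <array>
#include <cmath>
#include <vector>

struct Photon {
    std::array<float, 3> position;
    std::array<float, 3> power;   // RGB flux
};

std::array<float, 3> irradianceEstimate(const std::vector<Photon>& nearby,
                                        const std::array<float, 3>& x,
                                        float r, float k = 1.1f)
{
    const float PI = 3.14159265f;
    std::array<float, 3> E{0.f, 0.f, 0.f};
    for (const Photon& p : nearby) {
        float d2 = 0.f;
        for (int i = 0; i < 3; ++i) d2 += (p.position[i] - x[i]) * (p.position[i] - x[i]);
        float w = 1.f - std::sqrt(d2) / (k * r);   // cone weight
        if (w <= 0.f) continue;
        for (int i = 0; i < 3; ++i) E[i] += w * p.power[i];
    }
    // Normalize by the search disc's area, adjusted for the cone filter.
    float area = (1.f - 2.f / (3.f * k)) * PI * r * r;
    for (float& e : E) e /= area;
    return E;   // combine with the BRDF / local shading as described above
}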

13 of 32

Irradiance estimates based on nearby photons

14 of 32

Previous image combined w/ texture maps & material colors

15 of 32

Scene after final gathering

16 of 32

Raytraced scene (courtesy of P. Christensen)

17 of 32

Lighting Components, Reconsidered

  • Break the rendering equation into parts: L = L_direct + L_specular + L_indirect + L_caustic
  • Can get L_direct and L_specular using ray casting and ray tracing, respectively
  • L_indirect is the main reason we're looking at photon mapping: it covers our LD*E paths
  • L_caustic comes from a special "caustic" photon map

18 of 32

Multiple Photon Maps

  • Global map: Shoot photons everywhere for diffuse, indirect illumination
  • Caustic map: Shoot photons only at specular objects (“aimed” sort of like shadow rays)
  • Volume map: Photon interactions with participating media such as fog, smoke

Caustic map

Global map

19 of 32

Ray Tracing

20 of 32

Photon Mapping

21 of 32

Photon Mapping

Where are LSDE paths? LDDE paths?

22 of 32

Example: Water caustics

23 of 32

Example: Smoke (volume map)

24 of 32

Demo: Photon mapping with caustics

25 of 32

What is Computer Graphics? Goals


More procedural modeling -- noise can help with shape, motion, pattern, ...

27 of 32

Noise as a Texture Generator

  • Easiest texture to make: Random values for texels
    • noise(x, y) = random()
  • If random() has limited range (e.g., [0, 1]), can control maximum value via amplitude
    • a * noise(x, y)
  • But the results usually aren’t very exciting visually
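A minimal sketch of that generator, with the amplitude a applied per texel; the function name and layout are just for illustration:

// Fill a width x height texture with uniform random texels in [0, a].
#include <random>
#include <vector>

std::vector<float> randomTexture(int width, int height, float a, unsigned seed = 1)
{
    std::mt19937 rng(seed);
    std::uniform_real_distribution<float> random(0.f, 1.f);   // noise(x, y) = random()
    std::vector<float> tex(width * height);
    for (float& texel : tex)
        texel = a * random(rng);                              // a * noise(x, y)
    return tex;
}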

28 of 32

3-D Noise

  • 3-D or solid texture has value at every point (x, y, z)
  • This makes texture mapping very easy :)
  • Simple solid texture generator is noise function on lattice:
    • noise(x, y, z) = random()
  • For points in between, we need to interpolate
  • Technically, this is "value" noise -- "Perlin" noise is based on random gradients

courtesy of L. McMillan

Perlin noise is implemented in GLM library -- see this page
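A sketch of a lattice value-noise generator along these lines: each integer lattice point gets a repeatable pseudo-random value in [0, 1] by hashing its coordinates (the hash constants are one common spatial-hash choice, not anything canonical):

// Deterministic pseudo-random value in [0, 1] at integer lattice point (x, y, z).
#include <cstdint>

float latticeNoise(int x, int y, int z)
{
    uint32_t h = uint32_t(x) * 73856093u ^ uint32_t(y) * 19349663u ^ uint32_t(z) * 83492791u;
    h ^= h >> 13;  h *= 0x5bd1e995u;  h ^= h >> 15;   // extra bit mixing
    return float(h & 0xFFFFFFu) / float(0xFFFFFF);    // map low 24 bits to [0, 1]
}

Hashing rather than storing a table of random() values keeps the lattice implicit and repeatable; for gradient (Perlin) noise, the GLM routine mentioned above (glm::perlin() in <glm/gtc/noise.hpp>) can be used instead.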

29 of 32

3-D Noise Interpolation

  • f(x, y, z) can be evaluated at non-lattice points with a straightforward extension of 2-D bilinear interpolation (to trilinear interpolation)
  • Other interpolation methods (quadratic, cubic, etc.) also applicable
  • All of these are approximations of smoothing filters such as Gaussians
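A sketch of that evaluation, building on the latticeNoise() sketch from the previous slide: interpolate along x on the surrounding cell's four x-edges, then along y, then along z:

#include <cmath>

float latticeNoise(int x, int y, int z);   // from the previous sketch

float lerp(float a, float b, float t) { return a + t * (b - a); }

// Trilinear interpolation of the lattice noise at a non-lattice point.
float smoothNoise(float x, float y, float z)
{
    int x0 = int(std::floor(x)), y0 = int(std::floor(y)), z0 = int(std::floor(z));
    float tx = x - x0, ty = y - y0, tz = z - z0;

    // Interpolate along x on the four x-edges of the surrounding cell...
    float c00 = lerp(latticeNoise(x0, y0, z0),         latticeNoise(x0 + 1, y0, z0),         tx);
    float c10 = lerp(latticeNoise(x0, y0 + 1, z0),     latticeNoise(x0 + 1, y0 + 1, z0),     tx);
    float c01 = lerp(latticeNoise(x0, y0, z0 + 1),     latticeNoise(x0 + 1, y0, z0 + 1),     tx);
    float c11 = lerp(latticeNoise(x0, y0 + 1, z0 + 1), latticeNoise(x0 + 1, y0 + 1, z0 + 1), tx);
    // ...then along y...
    float c0 = lerp(c00, c10, ty);
    float c1 = lerp(c01, c11, ty);
    // ...then along z.
    return lerp(c0, c1, tz);
}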

courtesy of L. McMillan

X, Y, and Z interpolation steps

30 of 32

3-D Noise Texturing: Examples

Original object

Noise with trilinear smoothing

Triquadratic noise

courtesy of L. McMillan

31 of 32

Noise Frequency

  • By selecting larger spacing between lattice points, we increase the magnification of the noise texture and hence reduce its frequency
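A one-function sketch of that knob, assuming the smoothNoise() sketch above; dividing the sample position by the lattice spacing makes each lattice cell cover more of the texture:

float smoothNoise(float x, float y, float z);   // from the previous sketch

// Larger spacing -> lattice features cover more area -> lower frequency.
float noiseAtSpacing(float x, float y, float z, float spacing)
{
    return smoothNoise(x / spacing, y / spacing, z / spacing);
}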

Original noise

Smoothed noise

Smoothed noise at different magnification levels

courtesy of H. Elias

32 of 32

Fractal Noise (aka "turbulence", aka "fractional Brownian motion" (FBM))

  • Many frequencies present, looks more natural
  • Can get this by summing noise at different magnifications
  • turb(x, y, z) = Σ_i a_i * noise_i(x, y, z)
  • Typical (but totally adjustable) parameters:
    • Frequency doubles at each level (octave), i.e., lattice magnification halves
    • Amplitude drops by half
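A sketch of the fractal sum, again building on the smoothNoise() sketch; each octave doubles the frequency and halves the amplitude, matching the typical parameters above:

float smoothNoise(float x, float y, float z);   // from the earlier sketch

// turb(x, y, z) = sum_i a_i * noise_i(x, y, z) with a_i = 0.5^i and
// noise_i sampled at frequency 2^i.
float turbulence(float x, float y, float z, int octaves = 4)
{
    float sum = 0.f, amplitude = 1.f, frequency = 1.f;
    for (int i = 0; i < octaves; ++i) {
        sum += amplitude * smoothNoise(x * frequency, y * frequency, z * frequency);
        amplitude *= 0.5f;   // amplitude drops by half per octave
        frequency *= 2.f;    // frequency doubles per octave
    }
    return sum;
}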

1 x noise + 0.5 x noise + 0.25 x noise + 0.125 x noise = fractal sum

courtesy of H. Elias