1 of 52

Texturing

Instructor: Christopher Rasmussen (cer@cis.udel.edu)

March 15, 2022 ❖ Lecture 11

2 of 52

Outline

  • Bump maps
  • Texturing pipeline
  • More applications
    • Light maps
    • Environment maps
    • Shadow maps

3 of 52

Bump Mapping

  • So far we’ve been thinking of textures modulating color and transparency only
    • Billboards, decals, lightmaps, etc.
  • But any other per-pixel properties are fair game...
  • Pixel normals are usually smoothly varying
    • Gouraud shading: lighting computed at vertices only; colors interpolated per pixel
    • Phong shading: normals interpolated from vertices; lighting computed per pixel
  • Textures allow setting per-pixel normal with a bump map

4 of 52

Bump map: Example

from wikipedia.org

Sphere WITH bumps

Sphere

5 of 52

Bump mapping: Why?

  • Can get a lot more surface detail without the expense of more object vertices to light and transform

courtesy of Nvidia

6 of 52

Bump + color map: Example

7 of 52

Bump mapping: Example

from MIT CG lecture notes

[figure: base image + bump map = bumped result]

8 of 52

Bump mapping: Example

courtesy of A. Awadallah

Height map

Bump texture applied to teapot

9 of 52

Bump mapping: How?

  • Idea: Perturb pixel normals n(u, v) derived from object geometry to get additional detail for shading
  • Compute lighting per pixel (like Phong)

from Hill

10 of 52

Bump mapping: Representations

  • 3-D vector m(u, v) added directly to normal n
  • Or: 2-D vector of coefficients (bu, bv) that scale u, v vectors tangent to surface

from Akenine-Moller & Haines

11 of 52

Bump representation: Height map f(u, v)

  • Store just scalar “altitude” at each pixel
  • Get bu, bv from partial derivatives: bu = ∂f/∂u, bv = ∂f/∂v

    • Approximate with finite differencing, e.g. bu ≈ [f(u+1, v) − f(u−1, v)] / 2

from Akenine-Moller & Haines
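The finite-difference approximation above can be sketched in a few lines of Python (`height_to_bump` is a hypothetical helper; it assumes the height map is stored as a list of rows indexed f[v][u] and that (u, v) is an interior texel):

```python
def height_to_bump(f, u, v):
    """Approximate bu = df/du and bv = df/dv at texel (u, v) by
    central finite differences on height map f, indexed f[v][u].
    A sketch; assumes (u, v) is an interior texel."""
    bu = (f[v][u + 1] - f[v][u - 1]) / 2.0
    bv = (f[v + 1][u] - f[v - 1][u]) / 2.0
    return bu, bv

# A ramp rising along u: slope 1 in u, slope 0 in v
ramp = [[0.0, 1.0, 2.0],
        [0.0, 1.0, 2.0],
        [0.0, 1.0, 2.0]]
print(height_to_bump(ramp, 1, 1))  # -> (1.0, 0.0)
```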

12 of 52

Example: Converting height maps to normal displacements (aka normal maps)

Z coordinate is set to a constant scale factor; (X, Y) are normalized to the [0, 1] range.

Right image is mostly blue because the “straight up” normal encodes as (0.5, 0.5, 1)

courtesy of Nvidia
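The encoding above can be sketched as follows (a common convention, assumed here rather than the exact one any particular tool uses: each normal component in [−1, 1] maps to [0, 1]):

```python
def encode_normal_rgb(n):
    """Pack a unit normal into an RGB triple: each component in
    [-1, 1] maps linearly to [0, 1], so the "straight up" normal
    (0, 0, 1) encodes as the bluish color (0.5, 0.5, 1.0)."""
    return tuple((c + 1.0) / 2.0 for c in n)

print(encode_normal_rgb((0.0, 0.0, 1.0)))  # -> (0.5, 0.5, 1.0)
```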

13 of 52

Bump mapping: Issues

  • Bumps don’t cast shadows
  • Geometry doesn’t change, so silhouette of object is unaffected
  • Textures can be used to modify underlying geometry with displacement maps
    • Generally in direction of surface normal

courtesy of Nvidia

from https://en.wikipedia.org/wiki/Normal_mapping

14 of 52

Displacement Mapping

courtesy of spot3d.com

Bump mapping

Displacement mapping

15 of 52

Displacement Mapping the Sphere

courtesy of geeks3d.com

16 of 52

Displacement Mapping – Height Maps

courtesy of artofillusion.org -- Julian MacDonald

17 of 52

Texture mapping: Steps

  • Creation: Where does the texture image come from?
  • Geometry: Transformation from 3-D shape locations to 2-D texture image coordinates
  • Rasterization: What to draw at each pixel (since texture coords. are floats)
    • E.g., bilinear interpolation vs. nearest-neighbor

18 of 52

Texturing: Creation

  • Reproductions
    • Photographs
    • Handpainted
  • Directly-computed functions
    • E.g., lightmaps, visibility maps
  • Procedurally-built
    • Synthesize with randomness, pattern-generating rules, etc.
    • More about this after midterm

courtesy of H. Elias

courtesy of Nvidia

Procedural bump mapping

19 of 52

Texture mapping applications: Lightmaps

courtesy of K. Miller

[figure: color texture + lightmap = lit scene]

Idea: Precompute expensive static lighting effects (such as ambient occlusion or diffuse reflectance) and “bake” them into color texture. Then scene looks more realistic as camera moves without expense of recomputing effects
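At render time the baked lighting amounts to a per-texel multiply, as in fixed-function multitexture modulation; a minimal sketch (`apply_lightmap` is a hypothetical helper, RGB values assumed in [0, 1]):

```python
def apply_lightmap(albedo, light):
    """Combine a color-map texel with a baked lightmap texel by
    componentwise multiplication; both are RGB triples in [0, 1]."""
    return tuple(a * l for a, l in zip(albedo, light))

# Half-intensity baked lighting darkens the color map uniformly
print(apply_lightmap((0.8, 0.4, 0.2), (0.5, 0.5, 0.5)))
```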

20 of 52

Texture mapping application: Lightmaps

Tenebrae Quake screenshot

21 of 52

Lightmap example: Diffuse lighting only

22 of 52

Lightmap example: Light configuration

23 of 52

Lightmap example: Diffuse + lightmap

24 of 52

Texturing Pipeline (Geometry + Rasterization)

  • Compute object space location (x, y, z) from screen space location (i, j)

list adapted from Akenine-Moller & Haines

courtesy of R. Wolfe

25 of 52

Texturing Pipeline (Geometry + Rasterization)

  • Compute object space location (x, y, z) from screen space location (i, j)
  • Use projector function to obtain object surface coordinates (u, v) (3-D -> 2-D projection)

list adapted from Akenine-Moller & Haines

courtesy of R. Wolfe

26 of 52

Texturing Pipeline (Geometry + Rasterization)

  • Compute object space location (x, y, z) from screen space location (i, j)
  • Use projector function to obtain object surface coordinates (u, v) (3-D -> 2-D projection)
  • Use corresponder function to find texel coordinates (s, t) (2-D -> 2-D transformation)
    • Scale, shift, wrap like viewport transform in geometry pipeline

list adapted from Akenine-Moller & Haines

courtesy of R. Wolfe

27 of 52

Texturing Pipeline (Geometry + Rasterization)

  • Compute object space location (x, y, z) from screen space location (i, j)
  • Use projector function to obtain object surface coordinates (u, v) (3-D -> 2-D projection)
  • Use corresponder function to find texel coordinates (s, t) (2-D -> 2-D transformation)
    • Scale, shift, wrap like viewport transform in geometry pipeline
  • Filter texel at (s, t)

list adapted from Akenine-Moller & Haines

courtesy of R. Wolfe

28 of 52

Texturing Pipeline (Geometry + Rasterization)

  • Compute object space location (x, y, z) from screen space location (i, j)
  • Use projector function to obtain object surface coordinates (u, v) (3-D -> 2-D projection)
  • Use corresponder function to find texel coordinates (s, t) (2-D -> 2-D transformation)
    • Scale, shift, wrap like viewport transform in geometry pipeline
  • Filter texel at (s, t)
  • Modify pixel (i, j)

list adapted from Akenine-Moller & Haines

courtesy of R. Wolfe

Rasterization
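The corresponder and filter steps above can be sketched for a grayscale texture (`sample_texture` is a hypothetical helper, using repeat-wrapping as the corresponder and bilinear interpolation as the filter):

```python
def sample_texture(tex, u, v, wrap=True):
    """Map surface coords (u, v) to texel coords (s, t), then
    bilinearly filter. `tex` is a list of rows of grayscale texels."""
    h, w = len(tex), len(tex[0])
    if wrap:                            # corresponder: repeat the texture
        u, v = u % 1.0, v % 1.0
    s, t = u * (w - 1), v * (h - 1)     # scale to texel coordinates
    s0, t0 = int(s), int(t)
    s1, t1 = min(s0 + 1, w - 1), min(t0 + 1, h - 1)
    fs, ft = s - s0, t - t0
    top = tex[t0][s0] * (1 - fs) + tex[t0][s1] * fs
    bot = tex[t1][s0] * (1 - fs) + tex[t1][s1] * fs
    return top * (1 - ft) + bot * ft    # bilinear interpolation

tex = [[0.0, 1.0],
       [0.0, 1.0]]
print(sample_texture(tex, 0.5, 0.0))  # midway between texels -> 0.5
```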

29 of 52

Texture coordinates at vertices

  • Polygons can be treated as parametric patches by assigning texture coordinates to vertices
    • The standard OpenGL way

courtesy of R. Wolfe

30 of 52

Parametric Surfaces

  • If we have a surface patch already parametrized by some natural (u, v) such that x = f(u, v), y = g(u, v), z = h(u, v), we can use parametric coordinates u, v without a projector

courtesy of R. Wolfe
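For example, the sphere's natural parametrization gives (u, v) directly (a sketch, assuming the usual longitude/colatitude convention with u, v in [0, 1]):

```python
import math

def sphere_point(u, v, r=1.0):
    """Natural (u, v) parametrization of a sphere of radius r:
    u maps to longitude theta, v to colatitude phi, so (u, v) can
    serve as texture coordinates with no projector."""
    theta = 2 * math.pi * u
    phi = math.pi * v
    return (r * math.sin(phi) * math.cos(theta),
            r * math.sin(phi) * math.sin(theta),
            r * math.cos(phi))

print(sphere_point(0.0, 0.0))  # v = 0 -> north pole: (0.0, 0.0, 1.0)
```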

31 of 52

Projector Functions

  • Want a way to get from a 3-D point to 2-D surface coordinates as an intermediate step
  • Idea: Project complex object onto simple object’s surface with parallel or perspective projection (focal point inside object)
    • Mesh: piecewise planar (default option with texture coords)

    • Plane
    • Cylinder
    • Sphere
    • Cube

Planar projector

courtesy of R. Wolfe

32 of 52

Planar projector

Orthographic projection onto XY plane: u = x, v = y

...onto YZ plane

...onto XZ plane

courtesy of R. Wolfe

33 of 52

Cylindrical projector

  • Convert rectangular coordinates (x, y, z) to cylindrical (r, h, θ), use only (h, θ) to index texture image

courtesy of R. Wolfe
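A minimal sketch of the cylindrical projector, assuming the cylinder axis is the z-axis and the height coordinate is already scaled to [0, 1]:

```python
import math

def cylindrical_uv(x, y, z):
    """Cylindrical projector: keep only the angle theta around the
    axis and the height h, dropping the radius r."""
    theta = math.atan2(y, x)               # angle around the axis
    u = (theta + math.pi) / (2 * math.pi)  # wrap theta to [0, 1]
    v = z                                  # height, assumed pre-scaled to [0, 1]
    return u, v

print(cylindrical_uv(1.0, 0.0, 0.5))  # -> (0.5, 0.5)
```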

34 of 52

Spherical projector

  • Convert rectangular coordinates (x, y, z) to spherical (r, θ, Φ), use only (θ, Φ)

courtesy of R. Wolfe
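Likewise, a sketch of the spherical projector (angles taken as longitude theta and colatitude phi from +Z; assumes the point is not at the origin):

```python
import math

def spherical_uv(x, y, z):
    """Spherical projector: convert to spherical angles (theta, phi)
    and drop the radius r."""
    r = math.sqrt(x * x + y * y + z * z)
    theta = math.atan2(y, x)               # longitude
    phi = math.acos(z / r)                 # colatitude from +Z
    u = (theta + math.pi) / (2 * math.pi)  # wrap to [0, 1]
    v = phi / math.pi                      # scale to [0, 1]
    return u, v

print(spherical_uv(0.0, 0.0, 1.0))  # north pole -> (0.5, 0.0)
```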

35 of 52

Projecting in non-standard directions

  • The texture projector function doesn’t have to project a ray from the object center through position (x, y, z); the ray can be based on any attribute of that position. For example:
    • Ray comes from another location
    • Ray is surface normal n at (x, y, z)
    • Ray is reflection-from-eye vector r at (x, y, z)
    • Etc.

courtesy of R. Wolfe

36 of 52

Projecting in non-standard directions

  • This can lead to interesting or informative effects

courtesy of R. Wolfe

Different ray directions for a spherical projector

37 of 52

Environment/Reflection Mapping

  • Problem: To render pixel on mirrored surface correctly, we need to follow reflection of eye vector back to first intersection with another surface and get its color
  • This is an expensive procedure with ray tracing
  • Idea: Approximate with texture mapping

from Angel

38 of 52

Environment mapping: Details

  • Key idea: Render 360 degree view of environment from center of object with sphere or box as intermediate surface
  • Intersection of eye reflection vector with intermediate surface provides texture coordinates for reflection/environment mapping

courtesy of R. Wolfe
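The lookup can be sketched as: reflect the eye vector about the surface normal, then (for a cube map) pick the face from the reflection's dominant axis. Both helpers below are hypothetical sketches:

```python
def reflect(d, n):
    """Reflect direction d about unit normal n: r = d - 2(d.n)n.
    This r indexes the environment map."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2 * dot * ni for di, ni in zip(d, n))

def cube_face(r):
    """Pick the cube-map face hit by ray r: the axis with the
    largest absolute component, signed by that component."""
    ax = max(range(3), key=lambda i: abs(r[i]))
    return ("+" if r[ax] >= 0 else "-") + "xyz"[ax]

r = reflect((0.0, 0.0, -1.0), (0.0, 0.0, 1.0))
print(r, cube_face(r))  # head-on ray reflects straight back: (0.0, 0.0, 1.0) +z
```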

39 of 52

Making environment textures: Cube

  • Cube map straightforward to make: Render/photograph six rotated views of environment
    • 4 side views at compass points
    • 1 straight-up view, 1 straight-down view

40 of 52

Making environment textures: Sphere

  • Or construct from several photographs of mirrored sphere

courtesy of P. Debevec

41 of 52

Environment mapping: Example

courtesy of G. Miller

~1982

42 of 52

Environment mapping: Example

From “Terminator II” (1991)

43 of 52

Environment mapping example: Same scene, different lighting

courtesy of P. Debevec

44 of 52

Environment Bump Mapping

  • Idea: Bump map perturbs eye reflection vector

from Akenine-Moller & Haines

45 of 52

Environment mapping: Issues

  • Only physically correct under the assumptions that the object is convex and all radiance comes from infinitely far away
    • Object concavities mean self-reflections, which won’t show up
    • Other objects won’t be reflected
    • Parallel reflection vectors access same environment texel, which is only a good approximation when environment objects are very far from object


from Angel

46 of 52

Shadow Maps

  • Idea: If we render scene from point of view of light source, all visible surfaces are lit and hidden surfaces are in shadow
    • “Camera” parameters here = spotlight characteristics

View from light

View from camera

47 of 52

Z-buffering

  • A per-pixel hidden surface elimination technique
  • Maintain an image-sized buffer of the nearest depth drawn at each pixel so far
  • Only draw a pixel if it’s closer than what’s been rendered already

from Hill
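The per-fragment test can be sketched as follows (hypothetical buffers stored as nested lists; smaller depth means nearer to the camera):

```python
def draw_pixel(zbuf, cbuf, i, j, depth, color):
    """Z-buffer test: draw only if this fragment is nearer than
    what is already stored at pixel (i, j)."""
    if depth < zbuf[i][j]:      # smaller depth = closer to the camera
        zbuf[i][j] = depth
        cbuf[i][j] = color
        return True
    return False

INF = float("inf")
zbuf = [[INF]]                  # depth buffer starts at "infinitely far"
cbuf = [[None]]                 # color buffer
draw_pixel(zbuf, cbuf, 0, 0, 5.0, "red")   # nothing drawn yet: passes
draw_pixel(zbuf, cbuf, 0, 0, 9.0, "blue")  # farther: rejected
print(cbuf[0][0])  # -> red
```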

48 of 52

Z-buffer: Example

courtesy of DAM Entertainment

Color buffer

Depth buffer

49 of 52

Z-buffering

  • A per-pixel hidden surface elimination technique
  • Maintain an image-sized buffer of the nearest depth drawn at each pixel so far
  • Only draw a pixel if it’s closer than what’s been rendered already

from Hill

50 of 52

Shadow Maps

  • Idea: If we render scene from point of view of light source, all visible surfaces are lit and hidden surfaces are in shadow
    • “Camera” parameters here = spotlight characteristics
  • When rasterizing scene from eye view, transform each pixel to get 3-D position with respect to the light
    • Project pixel to (i, j, depth) with respect to light
    • Compare depth to value in shadow buffer (aka light’s z-buffer) at (i, j) to see if it is visible to the light, i.e., not shadowed

View from light

View from camera
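The depth comparison can be sketched as follows (`in_shadow` is a hypothetical helper; the small bias is a standard guard against self-shadowing artifacts, not something specified on the slide):

```python
def in_shadow(shadow_map, light_ij, light_depth, bias=1e-3):
    """Shadow-map depth compare: a pixel is shadowed when its depth
    from the light exceeds the nearest depth the light recorded
    at that location."""
    i, j = light_ij
    return light_depth > shadow_map[i][j] + bias

shadow_map = [[2.0]]  # nearest surface the light sees at this texel
print(in_shadow(shadow_map, (0, 0), 2.0))  # the surface itself: lit -> False
print(in_shadow(shadow_map, (0, 0), 5.0))  # behind it: shadowed -> True
```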

51 of 52

Shadow Maps

  • Shadow edges have aliasing depending on shadow map resolution and scene geometry
  • Shadow edges are “hard” by default

from GPU Gems

52 of 52

Shadow Maps

  • Shadow edges have aliasing depending on shadow map resolution and scene geometry
  • Shadow edges are “hard” by default
  • Solutions to both problems typically involve multiple offset shadow buffer lookups