Climate Model Ensembles: Why they are complicated, valuable, and fascinating

May 22, 2024, 5:00 – 6:30 p.m. ET

Benjamin Brown-Steiner, Ph.D.

Research Associate, PRI

Surface Temperature, Two Layers

Surface Temperature, Many Layers

Simple Climate Models

Atmosphere

Land

Ocean
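
To make the boxes concrete, here is a minimal sketch of a simple climate model (my own construction, not from the talk): an atmosphere box exchanges heat with land and ocean boxes under a prescribed forcing. All coefficients are illustrative placeholders, not tuned values.

```python
# Minimal three-box sketch of a simple climate model (illustrative only).

def step(T_atm, T_land, T_ocean, forcing, dt=0.1):
    """Advance the box temperature anomalies (K) by one step (dt in years)."""
    C_atm, C_land, C_ocean = 1.0, 2.0, 20.0  # heat capacities: ocean >> land > atmosphere
    k_land, k_ocean = 0.5, 0.5               # atmosphere-surface exchange coefficients
    lam = 1.2                                # radiative damping back to space

    flux_land = k_land * (T_atm - T_land)
    flux_ocean = k_ocean * (T_atm - T_ocean)
    T_atm += dt / C_atm * (forcing - lam * T_atm - flux_land - flux_ocean)
    T_land += dt / C_land * flux_land
    T_ocean += dt / C_ocean * flux_ocean
    return T_atm, T_land, T_ocean

T = (0.0, 0.0, 0.0)          # anomalies start at zero
for _ in range(2000):        # 200 years of 0.1-year steps
    T = step(*T, forcing=3.7)  # roughly a CO2-doubling forcing, W m^-2
print([round(x, 2) for x in T])
```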

Global Climate Models

Q&A Session #1

The Spark

  • Wendell Berry:
  • “Raindrops that pass in random fashion through an imaginary plane above the forest canopy…tend to leave the ecosystem as they entered it, in randomized fashion.”
  • “Does ‘random’ in this (or any) context describe a verifiable condition or a limit of perception (i.e. random as far as we can tell)?”
  • Berry’s answer is: “It describes a limit of perception…based on the belief that pattern is verifiable by limited information, whereas the information required to verify randomness is unlimited…”
  • “…should have said that rainwater moves from mystery through pattern back into mystery...”
  • “To call this mystery ‘randomness’ is to take charge of it on behalf of those who do not respect pattern. To call the unknown ‘random’ is to plant the flag by which to colonize and exploit the unknown. To call the unknown by its right name, ‘mystery’…”

Perhaps the most familiar attempts to get at uncertainty/mystery…

IPCC, Guidance Notes, July 2005

IPCC, AR5, Summary for Policymakers

“There is high confidence that the ENSO will remain the dominant mode…and precipitation variability on regional scales will likely intensify. Natural variations…are large and thus confidence in any specific projected change in ENSO …remains low.”

What is science?

the study of or knowledge about the natural world

One more complexity…all that stuff science cannot do

Facts

Values

Knowledge

Wisdom

Decisions

Actions

Very Certain → Uncertain

“One of the purposes of objectivity, in practice, is to avoid coming to a moral conclusion.”

- Wendell Berry

REAL WORLD (OMNISCIENT)

the past → today → the future

OUR VIEW AS OBSERVERS (NOT OMNISCIENT)

AVAILABILITY (missing data)

ACCURACY (inexactness)

INDETERMINACY (unobservable)

IGNORANCE (mystery)

UNPREDICTABILITY (the unexpected)

PATH DEPENDENT (what will we decide?)

the past → today → the future

“…we will always experience [data] as probabilistic, as shimmering, rather than fixed.”

“…all knowledge about [the earth system] depends fundamentally on modeling.”

- Paul Edwards, A Vast Machine, page 352

SCIENTIFIC METHOD LETS US PUT THINGS TOGETHER

  • We make Presuppositions (“the world exists,” “what we see is what is there,” “we can represent the world with mathematics”)

  • We take Observations

  • We build a body of Evidence

  • We apply Logic (modus ponens: if A → B and A, then B)

  • We develop a Model (theoretical knowledge, empirical knowledge, computational capabilities, structure, parameterizations β, Ω, µ, and BCs & ICs)

  • We get Output

  • Now what?

PEL Model, Scientific Method in Practice, Hugh Gauch, Jr. (2003)

This is important

But the interesting stuff is out here

Understanding = Knowledge + Ignorance

Bill Vitek and Wes Jackson, Virtues of Ignorance, 2005

We must characterize what we don’t know as much as what we do know.

BUILDING TRUST THROUGH CHARACTERIZING UNCERTAINTIES

EPISTEMIC UNCERTAINTIES

  • Observational Uncertainty (the past shimmers)
  • Known-Unknowns: uncertainties in the parameters or model components
  • Unknown-Knowns: the very rare happy surprise (more often luck)
  • Unknown-Unknowns: things we cannot or do not know how to describe

ALEATORY UNCERTAINTIES

  • Chaos / Dependence on Initial Conditions
  • Internal Variability
  • Surprises (non-linearities, unobservables)

We want to know how much rain will pass through that forest canopy.

ALEATORY UNCERTAINTIES · EPISTEMIC UNCERTAINTIES

If we look at the aleatory uncertainties as chaos or randomness, we have to consider: “Does ‘random’ in this (or any) context describe a verifiable condition or a limit of perception?” When we use random, do we mean “random as far as we can tell?”

- Wes Jackson, Virtues of Ignorance, page 22

MYSTERY? RANDOM?

PERTURB BOUNDARY CONDITIONS (a.k.a. CREATE SCENARIOS)

  • What does this do for us?
    • SRES, RCPs, etc. allow us to proceed under uncertainty; very practical
    • Allows us to explore the boundaries of what is possible (in our models) through exploration of epistemic uncertainties (via model structure)
    • Lets us use our expert knowledge (which is good, but fallible)
    • Lets us sidestep the responsibility of saying which path is the best/worst
    • Historically, we have been hesitant to apply probabilities to scenarios
    • CAVEAT: Only explores what we think is possible, so we may be missing a lot
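
As a toy illustration of perturbing boundary conditions (a sketch of mine, not the presenter's; the forcing ramps are invented, RCP-flavored placeholders rather than official scenario data), the same simple model can be run under several prescribed forcing pathways:

```python
# One model, several boundary-condition scenarios (invented forcing ramps).

def run_model(forcings, lam=1.2, C=10.0, dt=1.0):
    """Integrate dT/dt = (F - lam * T) / C; return the final anomaly (K)."""
    T = 0.0
    for F in forcings:
        T += dt * (F - lam * T) / C
    return T

years = range(2025, 2101)
scenarios = {
    "RCP2.6-like": [min(2.6, 1.0 + 0.05 * (y - 2025)) for y in years],
    "RCP4.5-like": [min(4.5, 1.0 + 0.06 * (y - 2025)) for y in years],
    "RCP8.5-like": [min(8.5, 1.0 + 0.10 * (y - 2025)) for y in years],
}

for name, forcing in scenarios.items():
    print(f"{name}: {run_model(forcing):.2f} K by 2100")
```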

PERTURB INITIAL CONDITIONS

  • Ensemble Forecasting (or Monte Carlo approach)
  • Keep everything in the model the same (parameters, structure, etc.)
  • Exploring aleatory uncertainties
  • …but not all of them. Just the ones your model can simulate.

[Figure: ensemble members spreading from t0 through t50 to t100, fanning out toward the full range of possibilities. Adapted from Wilks 2006.]
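
A sketch of the idea (my construction; the Lorenz-63 system stands in here for a chaotic weather/climate model): run the identical model from minutely perturbed initial conditions and watch the members spread.

```python
import random

def lorenz_step(x, y, z, dt=0.01, s=10.0, r=28.0, b=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 equations."""
    return (x + dt * s * (y - x),
            y + dt * (x * (r - z) - y),
            z + dt * (x * y - b * z))

def run(x0, steps=2000):
    """Integrate from a given initial x and return the final x."""
    state = (x0, 1.0, 1.05)
    for _ in range(steps):
        state = lorenz_step(*state)
    return state[0]

random.seed(0)
# Identical model and parameters; only the initial condition is nudged.
ensemble = [run(1.0 + random.gauss(0.0, 1e-6)) for _ in range(20)]
print("ensemble min:", round(min(ensemble), 2),
      "max:", round(max(ensemble), 2))
```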

Figure 2: Scenarios of future emissions for various greenhouse gases and other pollutants. Image from Riahi, K. et al. The Shared Socioeconomic Pathways and their energy, land use, and greenhouse gas emissions implications: An overview. Global Environmental Change 42, 153–168 (2017).

Q&A Session #2

[Figure: a set of dice (D4, D6, D8, D10, D12, D20).]

Group Name    Dice
“Today”       D6 + D6
“2.6”         D6 + D4
“4.5”         D6 + D8
“6.0”         D6 + D12
“8.5”         D6 + D20

[Figure: tally marks (X) accumulating along an axis of Gigatons of Emitted GHG (as CO2-eq), comparing “Today” (D6 + D6) against “RCP6.0” (D6 + D12).]

Dice Models

  • Model Structure:
    • Two Dice
  • Scenarios:
    • Different Dice for Different Levels of GHG Emissions
  • Aleatory Uncertainties:
    • “Randomness” of the Dice Emulates Perturbed Initial Conditions
  • Uncertainty Characterization:
    • Large Ensembles of the Different Scenarios
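
The exercise translates directly into code. This is my reconstruction, with the number of rolls chosen arbitrarily: each scenario is a pair of dice, each roll is one ensemble member, and a large ensemble characterizes each distribution.

```python
import random

# Each scenario is a pair of dice; each roll is one ensemble member.
SCENARIOS = {"Today": (6, 6), "2.6": (6, 4), "4.5": (6, 8),
             "6.0": (6, 12), "8.5": (6, 20)}

def roll(die_a, die_b):
    """One ensemble member: the sum of the scenario's two dice."""
    return random.randint(1, die_a) + random.randint(1, die_b)

random.seed(42)
for name, dice in SCENARIOS.items():
    ensemble = [roll(*dice) for _ in range(10_000)]  # a large ensemble
    mean = sum(ensemble) / len(ensemble)
    print(f"{name:>5}: mean {mean:5.2f}, range {min(ensemble)}-{max(ensemble)}")
```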

Thanks!

COMPARE TO INDEPENDENT MODELS (WE HOPE)

Masson and Knutti, 2011

  • Ideally, comparing independent models will let you explore structural epistemic uncertainties
  • Although you are not excluding your aleatory uncertainties
  • Also known as multi-model ensembles (MMEs)

  • If two models get the same result from two different approaches, you may be approaching real understanding
  • Where two models don’t agree…well…
  • In reality, models are largely not independent, so this method will give an underestimate of the total uncertainty
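
A toy illustration of that last caveat (my construction, with made-up numbers): when every “model” shares a common error, the ensemble spread stays narrow while the whole ensemble can miss the truth together.

```python
import random

random.seed(1)
TRUTH = 3.0  # a hypothetical true value (K)

def ensemble(shared_sd, own_sd, n=10):
    """n 'models' sharing one common error plus individual errors."""
    common = random.gauss(0.0, shared_sd)
    return [TRUTH + common + random.gauss(0.0, own_sd) for _ in range(n)]

for shared_sd in (0.0, 0.8):
    members = ensemble(shared_sd, own_sd=0.3)
    mean = sum(members) / len(members)
    spread = max(members) - min(members)
    print(f"shared error sd {shared_sd}: spread {spread:.2f} K, "
          f"ensemble-mean miss {abs(mean - TRUTH):.2f} K")
```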

https://crt-climate-explorer.nemac.org/

Questions to Consider

  • What is this model simulating?
  • What dimensions are considered?
  • What are the inputs?
  • What is the structure of the model?
  • What are the outputs?
  • What are the diagnostics?
  • What does this model do?
  • What work does this model do?
  • Try to break it!

https://atlas.climate.copernicus.eu/atlas

Questions to Consider

  • What is this model simulating?
  • What dimensions are considered?
  • What are the inputs?
  • What is the structure of the model?
  • What are the outputs?
  • What are the diagnostics?
  • What does this model do?
  • What work does this model do?
  • Try to break it!

https://www2.cesm.ucar.edu/experiments/cesm1.0/diagnostics/cam5_diag/f40_amip_cam5_c03_78b/f40_amip_cam5_c03_78b-obs/

https://ourworldindata.org/explorers/ipcc-scenarios

FIXING IMPERFECT MODELS (THAT DON’T MATCH THE OBSERVATIONS)

REAL WORLD

MODEL RESULTS

OBSERVATIONS

What could be wrong?

  • Observations
  • BC or IC
  • Parameters
  • Model
    • Theoretical
    • Empirical
  • Structure

POTENTIAL SOLUTIONS

(1) Think Harder / Do Better

  • This only really works for your known-knowns or known-unknowns
  • Reducing epistemic uncertainties (model or parametric)

(2) Tune Your Model (a.k.a. “play with the knobs”)

  • “tuning is the manipulation of uncertain or unobservable parameters to best match observations through a sequence of subjective choices.” [Mauritsen et al., 2012]
  • If your model does better after some tuning, you cannot assume it’s because it is a better model. It’s more likely that you’ve manipulated your model such that “compensation among model errors is occurring”
  • (you’re running around in the space that’s filled up with uncertainties)
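
A toy version of that danger (my construction, not from Mauritsen et al.): tune a single knob to best match observations, and the knob quietly absorbs a structural error in the model.

```python
OBS = [0.1, 0.3, 0.5, 0.8, 1.0]  # hypothetical observed anomalies (K)

def model(knob):
    """A stand-in model with a hidden structural error (the 0.05*t*t term)."""
    return [knob * t + 0.05 * t * t for t in range(1, 6)]

def misfit(knob):
    return sum((m - o) ** 2 for m, o in zip(model(knob), OBS))

# Brute-force "knob turning": scan candidate values, keep the best match.
candidates = [k * 0.01 for k in range(1, 51)]
best = min(candidates, key=misfit)
print(f"tuned knob: {best:.2f}, misfit: {misfit(best):.4f}")
# A better fit here comes from compensation among model errors,
# not from better physics.
```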

PROJECTING INTO THE FUTURE (WE MUST BE OUT OF OUR MINDS)

…unless you wait.

  • Why should anyone believe our projections? Why should they trust us?
  • It’s our responsibility to give our projections meaning
  • Without the ability to compare to reality, we’re left with…
  • Characterizing our uncertainties (epistemic and aleatory)
  • We are up against:
    • IGNORANCE (mystery)
    • UNPREDICTABILITY (the unexpected)
    • PATH DEPENDENT (what will we decide?)
  • Options:
    • (1) Perturb Your BCs (a.k.a. Create Scenarios), Bound Your Uncertainty, Hedge Your Bets
    • (2) Perturb your ICs, Run Ensembles, Attack Your Aleatory Uncertainties
    • (3) Perturb your Parameters, Run Ensembles (PPEs), Attack Your Epistemic Uncertainties
    • (4) Compare to Other Independent (uh-oh) Models (MMEs)
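
Option (3) in code form (a sketch of mine; the parameter range is invented): the same model and the same initial conditions, with an uncertain feedback parameter varied across the ensemble.

```python
import random

def run_model(lam, forcing=3.7, C=10.0, dt=1.0, steps=200):
    """Integrate dT/dt = (forcing - lam * T) / C to near-equilibrium."""
    T = 0.0
    for _ in range(steps):
        T += dt * (forcing - lam * T) / C
    return T

random.seed(7)
# Sample the feedback parameter lambda from a plausible (made-up) range.
ppe = [run_model(lam=random.uniform(0.8, 1.8)) for _ in range(100)]
print(f"PPE warming range: {min(ppe):.1f}-{max(ppe):.1f} K")
```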

The future is literally unverifiable