Causality is very much an open science topic
A meta-tool for navigating causal assessment
SIPS 2025 Budapest
June 25, 09:00–10:30, second floor, room 213
Michael Höfler, Technical University Dresden, Germany
michael.hoefler@tu-dresden.de
Short introduction round
Explicit causality is avoided outside experimental research
‘That one cannot randomize a factor does not mean that one is not interested in its effect!’
‘Conflating the means and the ends.’
1. Background
A lack of transparency, even about the causal goal, is disastrous for scientific communication.
Explanations are aligned with cognitive dissonance theory
Hypothesis 1: Avoiding causality is rewarded because addressing causality has large costs
Hypothesis 2: Without a profound understanding of causality, the acceptance of inevitability coincides with inappropriate stances
Hypothesis 3: The modes of dissonance reduction are diverse
Hypothesis 4: Short reflection on potential benefits does not help against avoidance
Hypothesis 5: Social aspects maintain the avoidance
Observed in the common blending of causal and associational wording.
Addressing causality … begins with qualitative considerations
If a new study or analysis is required, one must begin with a qualitative generative model of processes beyond the data (i.e., of bias)
You cannot have no generative model!
‘Let the data speak for themselves’ assumes:
Nobody would buy these assumptions if they were explicit!
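One such hidden assumption can be made concrete with a small simulation (a hypothetical example, not from the talk): a common cause Z produces a sizeable X–Y association even though X has no effect on Y, so the data alone cannot distinguish the two generative models.

```python
import random
import statistics

random.seed(1)

# Hypothetical generative model: Z is an unmeasured common cause of X and Y,
# and X has NO causal effect on Y.
n = 10_000
z = [random.gauss(0, 1) for _ in range(n)]
x = [zi + random.gauss(0, 1) for zi in z]  # Z -> X
y = [zi + random.gauss(0, 1) for zi in z]  # Z -> Y (no arrow X -> Y)

def corr(a, b):
    ma, mb = statistics.mean(a), statistics.mean(b)
    cov = sum((p - ma) * (q - mb) for p, q in zip(a, b)) / len(a)
    return cov / (statistics.pstdev(a) * statistics.pstdev(b))

# The data "speak" of a clear association (around 0.5 here); only the
# generative model tells us it is not causal.
print(round(corr(x, y), 2))
```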
Convenient method for generative models: Directed Acyclic Graphs (DAGs)
e.g. on common causes, the issue specific to causal inference in observational studies
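As a sketch of what a DAG buys you, the toy code below (all variable names invented) reads common causes off a parent list: a deliberately simplified version of the backdoor idea that ignores colliders and other subtleties, which tools like DAGitty handle properly.

```python
# Hypothetical DAG as a parent list (who points into whom); names are invented.
parents = {
    "X": ["Z1", "Z2"],
    "Y": ["X", "Z1", "Z3"],
    "Z1": [], "Z2": [], "Z3": [],
}

def ancestors(node, blocked=frozenset()):
    """All ancestors of `node`, ignoring paths that run through `blocked` nodes."""
    seen, stack = set(), [p for p in parents.get(node, []) if p not in blocked]
    while stack:
        p = stack.pop()
        if p not in seen:
            seen.add(p)
            stack.extend(q for q in parents.get(p, []) if q not in blocked)
    return seen

# Common causes of X and Y: ancestors of X that also reach Y *not* through X.
common_causes = ancestors("X") & ancestors("Y", blocked={"X"})
print(sorted(common_causes))  # → ['Z1']
```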
Underrated: testing causal predictions through secondary data analysis or collecting new data
Alternative explanations
… to an analysis supporting a causal effect (e.g. unconsidered common causes, poor measurement …)
Causal toolbox, very much under-utilised in psychology
... and many visual tools that facilitate design decisions and analysis, see slides below
What can be done to make addressing causality more appealing?
E.g. ‘is the hypothesis causal or associational?’ If one wants to inform theory or assess the potential of an intervention → causal.
2. A meta-tool for navigating causal assessment
Existing tools hardly cover that
Is the hypothesis meant to be…
Causal?
Associational?
Alternative, noncausal explanations: e.g. an association has been found that might be due to common causes or measurement error …
Reasons to believe an alternative explanation
Reasons to believe causal explanation (e.g. known mechanism)
But a new meta-tool may guide researchers through the qualitative steps
Creates transparency on arguments and their foundations
Navigate through options and tools to collect new evidence for or against the hypothesis
Collect ideas on testing an explanation with secondary data analysis or new studies
Help planning a causal observational study and its analysis, identify helpful tools
Navigate through existing tools
Can a prediction be made that can be tested with perhaps existing observational data?
Go through causal methods and their assumptions to assess if they might help: instrumental variables, Mendelian randomisation, Granger causality …
Can they be used in perhaps existing observational data?
Methodological advice usually starts only here
Navigate through options and tools to collect new evidence for or against a causal hypothesis
… or just there
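To illustrate the kind of method being navigated here, a toy version of the instrumental-variable idea (simulated data, invented parameters): an instrument W that influences Y only through X recovers the effect that the naive regression gets wrong under confounding.

```python
import random
import statistics

random.seed(2)

# Simulated example of the instrumental-variable (Wald) idea; all numbers
# are invented. U confounds X and Y; the instrument W affects X but
# influences Y only through X.
n = 20_000
w = [random.choice([0.0, 1.0]) for _ in range(n)]
u = [random.gauss(0, 1) for _ in range(n)]
x = [wi + ui + random.gauss(0, 1) for wi, ui in zip(w, u)]
true_effect = 0.5
y = [true_effect * xi + ui + random.gauss(0, 1) for xi, ui in zip(x, u)]

def cov(a, b):
    ma, mb = statistics.mean(a), statistics.mean(b)
    return sum((p - ma) * (q - mb) for p, q in zip(a, b)) / len(a)

naive = cov(x, y) / cov(x, x)  # regression slope, biased upwards by U
wald = cov(w, y) / cov(w, x)   # Wald IV estimator, consistent if W is a valid instrument
print(round(naive, 2), round(wald, 2))
```

The naive slope lands well above 0.5 because U is ignored, while the Wald ratio stays close to the true effect, provided the (untestable) instrument assumptions hold.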
A concise page collecting:
1) Hypothesis, meant causal or not causal?
2) Arguments for and against an effect and alternative explanations
3) Options for secondary data analyses and associated tools
4) Options for new study and associated tools
Returned sheet
https://tally.so/create
Is the hypothesis meant to be …
causal or associational?
Might return:
3 out of 3 answers suggest that your hypothesis is causal.
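The branching could be implemented as a simple scored checklist; the question wording below is invented for illustration and is not the actual tool's.

```python
# Invented screening questions (illustrative only); the third is reverse-keyed,
# since wanting only a prediction points to an associational hypothesis.
questions = [
    "Do you want to inform theory about why the outcome occurs?",
    "Do you want to assess the potential of an intervention?",
    "Would a mere prediction, without any 'why', fully answer your question?",
]
answers = [True, True, False]  # one hypothetical respondent

causal_votes = [answers[0], answers[1], not answers[2]]
n = sum(causal_votes)
print(f"{n} out of {len(causal_votes)} answers suggest that your hypothesis is causal.")
```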
Ideas on the sections
Reasons to believe causal explanation
Is there a known mechanism that could explain the effect if it exists?
Is there a known mechanism that speaks against the effect? (e.g. the law of gravitation speaks against astrology)
New study warranted? What reasons to believe could this add?
Is a study feasible that adds substantial evidence for or against the effect?
Might reveal that a study is premature (e.g. knowledge of common causes is so limited that one cannot specify which of such variables should be collected → instead recommend to engage in theory building)
‘Study’ might just mean analysing existing data to test predictions that a causal hypothesis makes (on associations)
Causal design and analysis tools: evaluate whether they are potentially helpful
Which common causes must be collected?
If this cannot be decided but many such variables are known: use a very large sample size that allows adjustment for many of them
Make use of the ‘target trials’ conception.
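Why collecting the right common causes matters can be shown with a stratification sketch (simulated data, invented numbers): adjusting for a measured common cause Z removes the bias in the crude group difference.

```python
import random
import statistics

random.seed(3)

# Simulated data (invented numbers): Z is a common cause of exposure X and outcome Y.
n = 20_000
z = [random.choice([0, 1]) for _ in range(n)]
x = [random.random() < (0.7 if zi else 0.3) for zi in z]  # Z -> X
true_effect = 0.2
y = [true_effect * xi + 0.5 * zi + random.gauss(0, 1) for xi, zi in zip(x, z)]

def mean_y(x_val, z_val=None):
    sel = [yi for yi, xi, zi in zip(y, x, z)
           if xi == x_val and (z_val is None or zi == z_val)]
    return statistics.mean(sel)

crude = mean_y(True) - mean_y(False)  # confounded by Z (about twice too large here)
adjusted = statistics.mean([mean_y(True, zv) - mean_y(False, zv) for zv in (0, 1)])
print(round(crude, 2), round(adjusted, 2))
```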
(Software used so far:)
https://tally.so/create
Collection of existing tools (free and non-expert tools so far)
3. Existing tools
DAGitty Shiny App and R package
Theoraizer: uses AI to create a model (which can be fed into DAGitty)
Causal Loop Diagram (CLD) method
Caveat: model building may be fed with information on mere associations, since papers blend causal and associational language and do not use causal methods?
Example CLD variables: child maltreatment, internalizing problems in adulthood, bonding characteristics, parental psychopathology, parenting characteristics
ViSe
Graphical tool, shiny app and R package to determine a confirmational threshold that accounts for unconsidered common causes (confounders).
ViSe example (X = factor, Y = outcome, Z = common disposition): α1 denotes Z → X, α2 denotes Z → Y. Entering the left bound of the confidence interval for Cohen's d (0.09) produces a plot of the (α1, α2) combinations for which the effect is still supported.
Currently based just on this simple generative model (Z → X, Z → Y) and, beyond that qualitative model, on quantitative assumptions (the “variable omission model”)
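The variable-omission idea can be mimicked with a simulation (my own sketch, not ViSe's actual computation): under the model Z → X, Z → Y with no true effect, check how large a spurious Cohen's d a given pair of confounder strengths produces, and compare it with an observed lower bound such as the d = 0.09 in the example above.

```python
import math
import random
import statistics

random.seed(4)

def spurious_d(a1, a2, n=50_000):
    """Cohen's d produced purely by an omitted common cause Z
    (a1: strength of Z -> X, a2: strength of Z -> Y; no true X -> Y effect)."""
    z = [random.gauss(0, 1) for _ in range(n)]
    x = [random.random() < 1 / (1 + math.exp(-a1 * zi)) for zi in z]  # binary factor
    y = [a2 * zi + random.gauss(0, 1) for zi in z]                    # no X -> Y arrow
    y1 = [yi for yi, xi in zip(y, x) if xi]
    y0 = [yi for yi, xi in zip(y, x) if not xi]
    return (statistics.mean(y1) - statistics.mean(y0)) / statistics.pstdev(y)

# Even moderate confounding can exceed a lower confidence bound of d = 0.09:
print(round(spurious_d(0.5, 0.5), 2))
```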
thinkCausal
Nice teaching material
Guides you well through analysis
DoWhy
Python library that combines generative models, data analysis and robustness checks
Nicely implemented interplay between model building and model checking with data; provides verbal assessments. Uses simple graphs; seems useful only for researchers with a keen methodological interest.
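One of DoWhy's robustness checks is a placebo-treatment refutation; here is the idea stripped down to plain Python (simulated data, not DoWhy's API): replace the treatment with a random permutation and check that the estimate collapses.

```python
import random
import statistics

random.seed(5)

# Simulated data with a real effect of X on Y (invented numbers).
n = 10_000
x = [random.gauss(0, 1) for _ in range(n)]
y = [0.3 * xi + random.gauss(0, 1) for xi in x]

def slope(a, b):
    ma, mb = statistics.mean(a), statistics.mean(b)
    c = sum((p - ma) * (q - mb) for p, q in zip(a, b)) / len(a)
    return c / statistics.pvariance(a)

estimate = slope(x, y)         # recovers roughly 0.3
placebo_x = x[:]
random.shuffle(placebo_x)      # placebo "treatment": all real structure destroyed
placebo = slope(placebo_x, y)  # should be near 0 if the estimate is sound
print(round(estimate, 2), round(placebo, 2))
```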
TETRAD
R and Python package with a graphical user interface that requires the Java JDK
(CausalImpact)
R package by Google that does “causal inference using Bayesian structural time-series models” and visualizes the results
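The underlying counterfactual-forecast idea in a toy form (made-up numbers, and a flat pre-period mean forecast instead of CausalImpact's Bayesian structural time-series model):

```python
import statistics

# Made-up series around a hypothetical intervention.
pre = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2]   # observed before the intervention
post = [11.2, 11.5, 11.1, 11.4]            # observed after the intervention

# Toy counterfactual: "what would have happened without the intervention"
# is forecast as the pre-period mean (CausalImpact fits a proper model).
forecast = statistics.mean(pre)
pointwise = [obs - forecast for obs in post]
cumulative = sum(pointwise)
print(round(cumulative, 2))  # → 5.0 (estimated cumulative impact)
```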
Tools for sensitivity analysis that require only summary results (e.g. confidence intervals)
Considers only confounding; the outcome can be binary or dimensional
Software review (2021 or older)
ViSe can be used for sensitivity analysis as well
(TippingSens)
Similar to ViSe for binary variables, but needs data input.
(CausalOptim)
You can add a DAG to be evaluated, and bounds are computed. You can specify whether variables are dimensional or binary. Requires an understanding of potential outcomes. The website is unstable.
(CausalVis)
Python library, not directly usable interactively
(CausalNex)
Python library: machine learning to identify causal relations with big data; not directly usable interactively