1 of 16

Explaining Deep Learning Predictions in Healthcare using Clinical Concepts

Sayantan Kumar

Advisor: Dr. Philip Payne

Doctoral Student Seminar

December 2nd, 2022


Deep learning is Everywhere


Motivation

Methods

Contribution

Takeaways

Results

Criminal Justice

Recommendation Systems

Healthcare


Deep Learning in Healthcare: Example


Clinical outcome

Example: Predict whether a patient will die in the ICU within the next 24 hours.


Challenge: Deep Learning = Black-Box Models


Clinical outcome

Predict whether a patient will die in the ICU within the next 24 hours.

  • Difficult for clinicians to understand the reasoning (the why) behind model predictions.

  • Lack of trust hinders deployment of deep learning frameworks in healthcare.

Black box: Why did the model predict mortality?


Objective


Goal: A deep learning framework that can explain/interpret model predictions.

    • Explanations should provide insights into the possible reasons behind predicted mortality.

    • Explanations should be understandable to clinicians.

Clinical outcome

Predict whether a patient will die in the ICU within the next 24 hours.

Why did the model predict mortality?


Limitations: Prior Work on Explainability in EHR


Assign weights to individual features

EHR features

  • Lab tests
  • Vital signs
  • Demographics

Feature-based explanations

  • Rank features – assign weights (importance)
  • Not clinically informative for high-dimensional data

[Diagram: example features (creatinine, urine output, blood pressure, age, …) assigned importance weights between 0 and 1]

Example:

  • "Did the patient die of multi-organ failure?" – high-level
  • "Did the patient die due to creatinine, age, urine output, etc.?" – too granular


Our Contribution


Contribution: Concept-based explanations

    • Easier for clinicians to understand.

    • Derived from the input features.
Goal: A deep learning framework that can explain/interpret why the model predicted patient mortality in the ICU.

[Diagram: EHR features (lab tests, vital signs) → neural network → intermediate concepts (heart failure, renal failure) → high-level explanation: "The patient died due to heart and kidney failure."]

Example concepts → organ-failure risk scores.

Challenge: feature-based explanations are too granular; concepts provide an intermediate, clinically meaningful level.


Research Hypothesis



Hypothesis: Concept-based explanations provide intuitive clinical insights about patient mortality.


Novelty of our work – Supervised EHR Concepts


Existing work (concept-based explanations):

  • Applied to images; no applications on clinical EHR.
  • Concepts learnt in an unsupervised way – learnt concepts might have no clinical significance.

Our proposed work:

  • First application on EHR for predicting a clinical outcome.
  • Concepts learnt in a supervised way – clinically meaningful and driven by domain knowledge.

Koh, Pang Wei, et al. "Concept Bottleneck Models." International Conference on Machine Learning, PMLR, 2020.

Alvarez-Melis, David, and Tommi Jaakkola. "Towards Robust Interpretability with Self-Explaining Neural Networks." Advances in Neural Information Processing Systems 31 (2018).


Clinical Concepts: SOFA


[Diagram: EHR features → neural network → six clinical concepts c1–c6, one per organ system: respiratory, coagulation, kidney, heart, nervous system, liver. Clinical concepts = organ-specific risk scores.]

Sequential Organ Failure Assessment (SOFA)

  • Quantifies the extent of failure (a risk score) for each organ system.

  • Tracks a patient's health status in the ICU.

  • Each organ score ranges from 0 to 4 (higher = greater risk of organ failure).
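As a toy illustration (the values are hypothetical, but the structure is standard SOFA: six organ components, each scored 0–4, summing to a total of 0–24), a patient's SOFA state at one timepoint could be represented as:

```python
# Hypothetical SOFA component scores at one ICU timepoint (each 0-4).
sofa = {
    "respiratory": 2,
    "coagulation": 0,
    "liver": 1,
    "cardiovascular": 3,
    "nervous_system": 1,
    "kidney": 4,
}

# Each component must lie in the 0-4 range; the total SOFA is their sum.
assert all(0 <= s <= 4 for s in sofa.values())
total_sofa = sum(sofa.values())
```

In this framework each component score, not the total, is a separate clinical concept.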

Concepts are intermediate and derived from the input clinical features, lifting explanations from the feature level to the organ level.


Proposed framework (high-level)


[Diagram: EHR features → clinical concepts c1, …, cn (predicted auxiliary layer) → relevance scores w1, …, wn → predicted mortality within the next 24 hours]

y = f( Σ_{i=1}^{n} w_i · c_i )

The predicted outcome y is a weighted combination of the clinical concepts c_i, weighted by their relevance scores w_i.

    • Learn clinical concepts supervised by domain knowledge (SOFA) – a regression problem.

    • Multitask framework: use the concepts as auxiliary tasks to predict the final outcome.

    • Relevance scores → the contribution of each concept towards the clinical outcome; the weighted combination maximizes interpretability.
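A minimal NumPy sketch of this forward pass (illustrative only: the layer shapes, the linear concept and relevance heads, and the sigmoid link f are all assumptions, not the authors' implementation):

```python
import numpy as np

def sigmoid(z):
    # Logistic link: maps the weighted combination to a probability.
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W_c, b_c, W_w, b_w):
    """One forward pass of the concept-bottleneck sketch.

    x : (d,) EHR feature vector at one timepoint.
    c : (n,) predicted clinical concepts (SOFA-supervised regression).
    w : (n,) relevance scores, one per concept.
    y : scalar mortality probability = sigmoid(sum_i w_i * c_i).
    """
    c = W_c @ x + b_c           # auxiliary layer: concept predictions
    w = W_w @ x + b_w           # input-dependent relevance scores
    y = sigmoid(np.sum(w * c))  # weighted combination of concepts
    return y, c, w

rng = np.random.default_rng(0)
d, n = 8, 6                     # 8 features, 6 organ-system concepts
x = rng.normal(size=d)
y, c, w = forward(x, rng.normal(size=(n, d)), np.zeros(n),
                     rng.normal(size=(n, d)), np.zeros(n))
```

In the actual multitask setup, c would additionally be trained against the SOFA organ scores with a regression loss, alongside the mortality loss.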


Longitudinal Prediction: Final Outcome


Predict whether a patient will die in the ICU within the next 24 hours.

Clinical outcome

[Timeline: ICU admit → hourly timepoints → died/discharged]

for timepoint t = 1, 2, 3, …, T hours:
    if death occurs within the next 24 hours:
        outcome(t) = died (+)
    else:
        outcome(t) = alive (−)

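The labeling rule above can be sketched in Python (a hypothetical helper, assuming hourly timepoints and a known hour of death; all names are illustrative):

```python
def label_outcomes(T, death_hour=None, window=24):
    """Assign an hourly outcome label to each timepoint t = 1..T.

    A timepoint is labeled '+' (died) if death occurs within the
    next `window` hours, otherwise '-' (alive).
    """
    labels = {}
    for t in range(1, T + 1):
        if death_hour is not None and t < death_hour <= t + window:
            labels[t] = "+"
        else:
            labels[t] = "-"
    return labels

# Patient dies at hour 80: hours 56-79 fall within 24 hours of death.
labels = label_outcomes(T=80, death_hour=80)
```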


Longitudinal Prediction: Concepts


Predicted maximum SOFA organ score within next 24 hours

[Diagram: six clinical concepts c1–c6, one per organ system: respiratory, coagulation, kidney, heart, nervous system, liver; each concept = the maximum SOFA organ score within the next 24 hours]

[Timeline: ICU admit → hourly timepoints → died/discharged]

for concept c = 1 to 6:
    for timepoint t = 1, 2, 3, …, T hours:
        c(t) = max SOFA score for concept c within the next 24 hours

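The concept target above is a forward-looking rolling maximum. A minimal sketch (assuming an hourly list of SOFA scores for one organ, and that "within the next 24 hours" means hours t+1 .. t+24; the function name is illustrative):

```python
def forward_max(scores, window=24):
    """For each hour t, the maximum score within the next `window` hours.

    scores: hourly SOFA scores for one organ system, indexed 0..T-1.
    Near the end of the stay, when no future hours remain, the current
    score is used as a fallback.
    """
    T = len(scores)
    return [max(scores[t + 1 : min(t + 1 + window, T)], default=scores[t])
            for t in range(T)]

targets = forward_max([0, 1, 3, 2, 4, 1], window=2)
```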


Proposed framework (high-level)


[Diagram: EHR features → attention → clinical concepts c1, …, cn = predicted maximum SOFA organ scores within the next 24 hours (auxiliary layer) → relevance scores w1, …, wn → predicted mortality within the next 24 hours]

Explanations: Which organ-system failures provide insights about mortality?
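One way to realize the attention box is standard dot-product attention over the concepts. This sketch is hypothetical (the query construction and key matrix are illustrative, not the authors' architecture); it shows how attention scores can be normalized into relevance scores that sum to one:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax: shift by the max before exponentiating.
    e = np.exp(z - np.max(z))
    return e / e.sum()

def relevance_from_attention(query, concept_keys):
    """Score each concept against a query derived from the patient's
    features, then normalize the scores into relevance weights.

    query        : (k,) summary vector of EHR features.
    concept_keys : (n, k) one learned key per clinical concept.
    Returns (n,) non-negative weights summing to 1.
    """
    scores = concept_keys @ query   # dot-product attention scores
    return softmax(scores)

# Toy example: 6 concepts, 4-dimensional query.
w = relevance_from_attention(np.ones(4), np.eye(6, 4))
```

Concepts whose keys align with the patient's current state receive larger relevance, which is what the explanation panel visualizes over time.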


Explanations: Why will the patient die?


Hypothesis: Concept-based explanations provide interpretable clinical insights about mortality.

[Figure: example patient, ICU admit to death at t = 80. Shown over time: the predicted mortality probability within 24 hours, and the relevance scores (low → high) for each concept, with SOFA neurological 24h max and SOFA cardiovascular 24h max highlighted.]


Key Takeaways


  • A deep learning framework to explain/interpret why the model predicted patient mortality in the ICU.

  • Concepts, together with their relevance scores, provide the explanations.

  • They offer interpretable clinical insights into "Why will the patient die within the next 24 hours?"
