Mountaineer: Topology-Driven Visual Analytics for Comparing Local Explanations
Parikshit Solunke
Committee:
Claudio Silva (advisor)
Juliana Freire
Luis Gustavo Nonato
Agenda
Why do we need Explanations for Black-box Machine Learning Models?
Why XAI?
Local Explanation Methods for Black-box ML Models
Classifier Task: Binary - “Does the person with the given features earn more than 100K?”
Explainer Task: “What features contributed to the classifier’s prediction, and how significant was each contribution?”
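To make the two tasks concrete, here is a minimal sketch on hypothetical synthetic data: a binary classifier is trained, then a toy occlusion explainer (replacing one feature at a time with its background mean, a simple baseline and not SHAP/LIME themselves) produces per-feature local attributions. All data and names are illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical tabular data: 3 features, binary label.
X = rng.normal(size=(500, 3))
y = (X @ np.array([1.5, 1.0, 0.5]) + rng.normal(scale=0.5, size=500) > 0).astype(int)

clf = LogisticRegression().fit(X, y)  # the "black-box" classifier

def occlusion_attributions(clf, x, background):
    """Attribution of feature i = drop in predicted probability when
    feature i is replaced by its background mean (a toy occlusion
    baseline, standing in for SHAP/LIME-style attribution methods)."""
    base = clf.predict_proba(x[None, :])[0, 1]
    attrs = []
    for i in range(len(x)):
        x_occ = x.copy()
        x_occ[i] = background[i]
        attrs.append(base - clf.predict_proba(x_occ[None, :])[0, 1])
    return np.array(attrs)

x = X[0]
attrs = occlusion_attributions(clf, x, X.mean(axis=0))
print(attrs)  # one signed contribution per feature
```

Real attribution methods (SHAP, LIME, Integrated Gradients, ...) differ in how they perturb and weight inputs, but all return such a per-feature vector for one instance.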
Local Explanation Methods - Attributions
Local Explanation Methods - Problems
Explanation results are difficult to compare and evaluate!
IntGrad
SHAP
LIME
DeepLIFT
Anchors
Local Explanation Methods - Disagreement
Problem: Given multiple explanation results:
1) How do you compare them - locally AND globally?
2) How do you decide which ones to trust?
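One common way to quantify local disagreement between two explainers is top-k feature agreement: the overlap between the k most important features (by absolute attribution) of each explanation. A minimal sketch, with made-up attribution vectors for illustration:

```python
import numpy as np

def rank_agreement(a, b, k=3):
    """Fraction of overlap between the top-k features (ranked by
    |attribution|) of two explanations of the same instance."""
    top_a = set(np.argsort(-np.abs(a))[:k])
    top_b = set(np.argsort(-np.abs(b))[:k])
    return len(top_a & top_b) / k

# Hypothetical attributions for one instance from two explainers.
shap_attrs = np.array([0.42, -0.10, 0.31, 0.05, -0.02])
lime_attrs = np.array([0.05, -0.38, 0.33, 0.12, -0.01])

print(rank_agreement(shap_attrs, lime_attrs, k=2))  # → 0.5
```

A score of 1.0 means the explainers agree on the most important features; low scores signal the disagreement problem above. Such pointwise scores still say nothing about *global* structure, which motivates the topological view that follows.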
Proposal: Leverage Topological Data Analysis (TDA) to help understand and compare explanation results on a structural level.
Geometrically different but topologically equivalent
Background: Topology
Background - TDA
Mapper Algorithm and Bottleneck Distance
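The bottleneck distance compares two persistence diagrams (sets of (birth, death) pairs): each point may be matched to a point of the other diagram or to the diagonal, and the distance is the minimized worst-case L∞ cost. A brute-force sketch for tiny diagrams (real libraries such as persim scale better; this is for intuition only):

```python
import itertools
import math

def linf(p, q):
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

def diag(p):
    """Closest point on the diagonal y = x to p."""
    m = (p[0] + p[1]) / 2
    return (m, m)

def bottleneck(d1, d2):
    """Brute-force bottleneck distance between two small persistence
    diagrams. Each diagram is padded with the diagonal projections of
    the other's points, so unmatched features pay their distance to the
    diagonal; diagonal-to-diagonal matches are free."""
    a = list(d1) + [diag(p) for p in d2]
    b = list(d2) + [diag(p) for p in d1]
    n1, n2 = len(d1), len(d2)
    best = math.inf
    for perm in itertools.permutations(range(len(b))):
        cost = 0.0
        for i, j in enumerate(perm):
            if i >= n1 and j >= n2:  # padding matched to padding: free
                continue
            cost = max(cost, linf(a[i], b[j]))
        best = min(best, cost)
    return best

print(bottleneck([(0, 10)], [(0, 9)]))  # → 1.0: match the two features
print(bottleneck([(0, 10)], []))        # → 5.0: the lone feature dies on the diagonal
```

Two topological representations are considered close when this worst-case matching cost is small, which is how structurally similar explanation spaces can be detected.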
Creating topological representations of Explanations
1) Explanation Output
2) Predicted Probabilities
3) Overlapping Clustering
4) Topological Graph
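The pipeline above follows the Mapper recipe: use the predicted probabilities as a 1-D lens over the explanation output, cover the lens with overlapping intervals, cluster the explanation vectors within each interval, and connect clusters that share points. A minimal toy sketch (production tools like KeplerMapper offer richer covers and clusterers; all data here is hypothetical):

```python
import numpy as np
from sklearn.cluster import DBSCAN

def mapper_graph(lens, X, n_intervals=4, overlap=0.3, eps=0.5):
    """Toy Mapper: cover the 1-D lens (e.g. predicted probabilities)
    with overlapping intervals, cluster the rows of X (e.g. attribution
    vectors) falling in each interval, and connect clusters that share
    points. Returns (nodes, edges)."""
    lo, hi = lens.min(), lens.max()
    length = (hi - lo) / n_intervals
    nodes = []  # each node: the set of point indices in one cluster
    for i in range(n_intervals):
        a = lo + i * length - overlap * length
        b = lo + (i + 1) * length + overlap * length
        idx = np.where((lens >= a) & (lens <= b))[0]
        if len(idx) == 0:
            continue
        labels = DBSCAN(eps=eps, min_samples=1).fit_predict(X[idx])
        for lab in set(labels):
            nodes.append(set(idx[labels == lab]))
    edges = {(i, j) for i in range(len(nodes))
             for j in range(i + 1, len(nodes)) if nodes[i] & nodes[j]}
    return nodes, edges

# Hypothetical: lens = model probabilities, X = attribution vectors.
lens = np.linspace(0.0, 1.0, 10)
X = np.column_stack([lens, np.zeros(10)])
nodes, edges = mapper_graph(lens, X, n_intervals=2, overlap=0.3, eps=0.5)
```

Because the cover intervals overlap, adjacent clusters share points and the resulting nodes form a connected graph that summarizes the shape of the explanation space; running the same pipeline on each explainer's output yields comparable topological graphs.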
Why does TDA help?
Workflow
Interactions and Linked Views