TELEGAM: Combining Visualization and Verbalization for Interpretable Machine Learning
VIS 2019, Vancouver, Canada
Arjun Srinivasan, Georgia Tech; Fred Hohman (@fredhohman), Georgia Tech; Steven Drucker, Microsoft Research
While building and deploying ML models is now an increasingly common practice, interpreting models is not.
GAMUT: operationalized interpretability in a design probe using generalized additive models (GAMs); an investigation of the emerging practice of interpretability with industry practitioners.
GAMUT: A Design Probe to Understand How Data Scientists Understand Machine Learning Models. Fred Hohman, Andrew Head, Rich Caruana, Robert DeLine, Steven Drucker. CHI, 2019.
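GAMs are well suited to verbalization because their predictions decompose into one readable contribution per feature. A minimal sketch of this additive structure (the shape functions and feature values below are hypothetical stand-ins, not fitted GAMUT models):

```python
# Sketch of a generalized additive model (GAM) prediction: the output is an
# intercept plus one shape function per feature, so each feature's
# contribution can be read off (and verbalized) directly.

def gam_predict(x, intercept, shape_fns):
    """Additive prediction: intercept + sum of per-feature contributions."""
    contributions = [f(xi) for f, xi in zip(shape_fns, x)]
    return intercept + sum(contributions), contributions

# Hypothetical shape functions standing in for fitted splines
shape_fns = [
    lambda age: 0.5 * age,                 # linear contribution
    lambda sqft: 0.01 * sqft ** 2 / 100,   # nonlinear contribution
]

pred, contribs = gam_predict([30, 100], intercept=10.0, shape_fns=shape_fns)
```

Because `contribs` is an explicit per-feature list, a system like TELEGAM can rank, compare, and describe feature impacts without approximating the model.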
Visualization Explanations: show model context; interactive analytics; rely on user interpretation.
Verbalization Explanations: direct and concise; less cognitive load; no training needed.
Visualization Explanations + Verbalization Explanations
TELEGAM: automatically generate natural language statements, or verbalizations, to complement explanatory visualizations for machine learning models.
Demo
Visualize each feature's global impact on the model, grouped by verbalization
Interactively highlight verbalization in context of the visualization
Adjust verbalization explanation resolution
Comparative verbalization of two prediction visualizations
Explanation Resolution (brief to detailed)
Brief: "Predictions vary potentially due to some features contributing differently from both instances."
Medium: "Predictions vary potentially due to 9 features contributing differently from both instances."
Detailed: "Predictions 126,024 and 312,129 vary potentially due to 9 features (i.e., 25%) contributing differently from both instances."
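One way such a resolution control could be realized is with templates that progressively reveal specifics. A minimal sketch (the function name, template wording, and feature counts are illustrative, not TELEGAM's implementation):

```python
def verbalize_comparison(id_a, id_b, n_diff, n_total, resolution):
    """Generate a prediction-comparison verbalization at a chosen
    explanation resolution: 0 = brief, 1 = medium, 2 = detailed."""
    if resolution == 0:
        return ("Predictions vary potentially due to some features "
                "contributing differently from both instances.")
    if resolution == 1:
        return (f"Predictions vary potentially due to {n_diff} features "
                "contributing differently from both instances.")
    pct = round(100 * n_diff / n_total)
    return (f"Predictions {id_a:,} and {id_b:,} vary potentially due to "
            f"{n_diff} features (i.e., {pct}%) contributing differently "
            "from both instances.")

# Hypothetical instance IDs; 9 of 36 features differ (25%)
print(verbalize_comparison(126024, 312129, n_diff=9, n_total=36, resolution=2))
```

Letting the user pick `resolution` trades off simplicity (brief) against completeness (detailed), rather than fixing one level for everyone.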
Verbalization Types
TELEGAM: model features, instance features, instance comparison
Future Work: dataset context, uncertainty, …
Takeaways
Visualization + verbalization are complementary: combining explanation mediums for the best of both worlds.
Use interaction for generation & presentation: let users decide resolution, balancing simplicity and completeness.
Thanks!
TELEGAM: Combining Visualization and Verbalization for Interpretable Machine Learning
Fred Hohman (@fredhohman), Georgia Tech; Arjun Srinivasan, Georgia Tech; Steven Drucker, Microsoft Research
bit.ly/telegam-vis: Demo, Paper, Video, Code, Slides
We thank the GT Vis Lab and the anonymous reviewers for their constructive feedback. Funded by a NASA PhD Fellowship.