
Functional Transparency for Structured Data: a Game-Theoretic Approach - PowerPoint PPT Presentation

Functional Transparency for Structured Data: a Game-Theoretic Approach. Guang-He Lee, Wengong Jin, David Alvarez-Melis, and Tommi S. Jaakkola.


  1. Functional Transparency for Structured Data: a Game-Theoretic Approach. Guang-He Lee, Wengong Jin, David Alvarez-Melis, and Tommi S. Jaakkola

  2. Goal: understand the (complex) network. (figure: deep nets; Img: https://blog.paperspace.com/intro-to-optimization-in-deep-learning-gradient-descent/)

  3. Typical method: post-hoc explanation. 1. Given an example. (figure: deep nets; Img: https://blog.paperspace.com/intro-to-optimization-in-deep-learning-gradient-descent/)

  4. Typical method: post-hoc explanation. 1. Given an example 2. Choose a neighborhood. (figure: deep nets; Img: https://blog.paperspace.com/intro-to-optimization-in-deep-learning-gradient-descent/)

  5. Typical method: post-hoc explanation. 1. Given an example 2. Choose a neighborhood 3. Find a simple approximation - e.g., linear model, decision tree (≈ the deep net locally). (figure: deep nets; Img: https://blog.paperspace.com/intro-to-optimization-in-deep-learning-gradient-descent/)

  6. Typical method: post-hoc explanation. 1. Given an example 2. Choose a neighborhood 3. Find a simple approximation - e.g., linear model, decision tree (≈ the deep net locally; a minimal code sketch of this recipe follows below). (figure: deep nets; Img: https://blog.paperspace.com/intro-to-optimization-in-deep-learning-gradient-descent/)
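To make the post-hoc recipe on slides 3-6 concrete, here is a minimal LIME-style sketch, not the authors' code: `black_box`, the Gaussian neighborhood, and the ridge surrogate are illustrative stand-ins for any deep net, any neighborhood choice, and any simple model family.

```python
# Hypothetical sketch of the post-hoc recipe: sample a neighborhood around one
# example, query the black-box model, and fit a simple (linear) surrogate.
import numpy as np
from sklearn.linear_model import Ridge

def local_linear_explanation(black_box, x, radius=0.1, n_samples=500, seed=0):
    """Fit a linear surrogate to `black_box` in a small ball around `x`."""
    rng = np.random.default_rng(seed)
    # 1. Given an example, 2. choose a neighborhood (Gaussian perturbations here)
    neighbors = x + radius * rng.standard_normal((n_samples, x.shape[0]))
    targets = black_box(neighbors)           # black-box predictions on the neighborhood
    # 3. Find a simple approximation (here: a ridge-regularized linear model)
    surrogate = Ridge(alpha=1.0).fit(neighbors, targets)
    return surrogate.coef_, surrogate.intercept_

# Toy usage: the "deep net" is just a nonlinear scalar function of 5 features.
black_box = lambda X: np.tanh(X @ np.arange(1.0, 6.0))
x0 = np.zeros(5)
coefs, bias = local_linear_explanation(black_box, x0)
print(coefs)  # the local feature weights serve as the explanation
```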

  7. Post-hoc explanations are not stable: Input 1 → Explanation 1 (Alvarez-Melis & Jaakkola, '18; Ghorbani et al., '19)

  8. Post-hoc explanations are not stable: Input 1 → Explanation 1; Input 2 (Input 1 plus a small perturbation) → Explanation 2 (Alvarez-Melis & Jaakkola, '18; Ghorbani et al., '19)

  9. Reason: the network does not actually operate the way the desired explanation does; the ≈ holds only loosely.

  10. Training complex models to exhibit meaningful properties locally: stability, transparency, ...

  11. Training complex models to exhibit meaningful properties locally: stability, transparency, ... Define a set of functions with the desired property.

  12. Training complex models to exhibit meaningful properties locally: stability, transparency, ... Define a set of functions with the desired property - example for transparency: linear models, decision trees.

  13. Training complex models to exhibit meaningful properties locally: stability, transparency, ... Define a set of functions with the desired property. Measure the degree to which the property is enforced on the model around each example.

  14. Training complex models to exhibit meaningful properties locally: stability, transparency, ... Define a set of functions with the desired property. Measure the degree to which the property is enforced on the model around each example. Regularize the model towards the property (a hedged reconstruction of the resulting objective follows below).
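One way to write the idea on slides 10-14 as a training objective; the notation below is a hedged reconstruction rather than a transcription of the slides: f_theta is the complex predictor, 𝒢 the set of functions with the desired property, N(x) a local neighborhood around example x, and D a deviation (disagreement) measure.

```latex
% Hedged reconstruction of the regularized objective (notation assumed, not from the slides)
\min_{\theta}\; \mathbb{E}_{(x,y)}\Big[
  \underbrace{L\big(f_\theta(x),\, y\big)}_{\text{task loss}}
  \;+\; \lambda\,
  \underbrace{\min_{g \in \mathcal{G}}\;
     \mathbb{E}_{x' \in N(x)}\, D\big(f_\theta(x'),\, g(x')\big)}_{\text{degree to which the property holds around } x}
\Big]
```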

  15. Functional property enforcement: for each example, a witness from the designated set measures the enforcement.

  16. Functional property enforcement: for each example, a witness from the designated set measures the enforcement. We regularize the predictor towards agreement with the witness.

  17. Functional property enforcement: a co-operative game between the predictor and the witness. The asymmetry between the two players leads to efficiency in optimization (see the paper for more details; a sketch of the alternating updates follows below).
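A hypothetical PyTorch sketch of the asymmetric, alternating updates behind the co-operative game: the witness best-responds with a closed-form affine fit on each neighborhood while the predictor is frozen, and the predictor then takes a gradient step on the task loss plus the deviation penalty. The model, hyperparameters, and toy data are illustrative assumptions, not the paper's exact setup.

```python
# Sketch of the predictor-witness game (assumed setup, for illustration only).
import torch

torch.manual_seed(0)
d, lam, radius = 5, 0.1, 0.1
predictor = torch.nn.Sequential(torch.nn.Linear(d, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
opt = torch.optim.Adam(predictor.parameters(), lr=1e-3)

# Toy regression data standing in for structured inputs.
X = torch.randn(256, d)
y = torch.tanh(X.sum(dim=1, keepdim=True)) + 0.05 * torch.randn(256, 1)

for step in range(100):
    idx = torch.randint(0, X.shape[0], (8,))
    xb, yb = X[idx], y[idx]
    task_loss = torch.nn.functional.mse_loss(predictor(xb), yb)

    deviation = 0.0
    for x in xb:                                   # one local game per example
        nbrs = x + radius * torch.randn(64, d)     # local neighborhood N(x)
        f_out = predictor(nbrs)                    # predictor player's behavior
        # Witness player: best-responding affine fit, computed with the predictor frozen.
        A = torch.cat([nbrs, torch.ones(64, 1)], dim=1)
        w = torch.linalg.lstsq(A, f_out.detach()).solution
        g_out = A @ w                              # transparent witness's behavior
        deviation = deviation + torch.mean((f_out - g_out) ** 2)

    loss = task_loss + lam * deviation / len(xb)
    opt.zero_grad(); loss.backward(); opt.step()
```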

  18. Examples: Predictor | Task | Witness

  19. Examples: Predictor | Task | Witness

  20. Examples: Predictor | Task | Witness; toxic (Hamilton et al., '17)

  21. Examples: Predictor | Task | Witness; toxic (Hamilton et al., '17) (Jin et al., '18)

  22. Empirical study: we can measure transparency based on the deviation between the predictor and the explainer (see the metric sketch below).
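A hedged sketch of how such a transparency measurement could look in practice: for each test point, fit the transparent explainer to the predictor over a local neighborhood and report the average predictor-explainer deviation, where lower means more functionally transparent. `predict_fn`, the Gaussian neighborhood, and the linear explainer are illustrative assumptions.

```python
# Illustrative transparency metric: mean local predictor-explainer deviation.
import numpy as np
from sklearn.linear_model import LinearRegression

def local_deviation(predict_fn, X_test, radius=0.1, n_neighbors=200, seed=0):
    rng = np.random.default_rng(seed)
    deviations = []
    for x in X_test:
        nbrs = x + radius * rng.standard_normal((n_neighbors, x.shape[0]))
        f_out = predict_fn(nbrs)
        witness = LinearRegression().fit(nbrs, f_out)   # best local linear explainer
        deviations.append(np.mean((f_out - witness.predict(nbrs)) ** 2))
    return float(np.mean(deviations))

# Usage with a toy nonlinear "predictor":
predict_fn = lambda Z: np.sin(Z @ np.linspace(0.5, 2.5, 5))
print(local_deviation(predict_fn, np.random.default_rng(1).standard_normal((20, 5))))
```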

  23. Models trained with this approach yield more compact explanations (figure: the explanation from our model vs. the explanation from a normal model).

  24. Poster: 06:30 -- 09:00 PM @ Pacific Ballroom #64 - Details and analysis about the framework. Related work on functional transparency: Towards Robust, Locally Linear Deep Networks, ICLR '19.
