Department of Computer Science CSCI 5622: Machine Learning


  1. Department of Computer Science, CSCI 5622: Machine Learning
     Chenhao Tan
     Lecture 23: Machine learning and society
     Slides adapted from Chris Ketelsen

  2. Learning objectives
     • Learn about the connection between our society and machine learning
     • Make sure that you think about ethics when applying machine learning
     • Fill in the FCQ

  3. Now you understand the magic behind machine learning!

  4. (image slide)

  5. Now you understand the magic behind machine learning!
     Machine learning in real life

  6. Machine learning is commonly used in our society

  7. Brainstorm: Where did you see machine learning today?

  8. Machine learning and our society
     • Machine learning is increasingly connected with our society:
       • Authorizing credit
       • Sentencing guidelines
       • Suggesting medical treatment
       • And many more!

  9. Ethical issues in machine learning

  10. https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

  11. Case study: Decision making
      https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

  12. Case study: Decision making
      • Crime prediction; similar situations arise when someone is getting a loan
      • Harms of allocation
      https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

  13. Case study: Decision making
      • Crime prediction; similar situations arise when someone is getting a loan
      • Harms of allocation
      • What are potential reasons in the machine learning pipeline? (see the sketch below)
      https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
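      One place to start looking is the model's error rates: ProPublica's analysis compared false positive rates across racial groups and found they differed sharply. Below is a minimal sketch of that kind of disparity check in Python/NumPy; all arrays are made-up toy data, for illustration only.

          import numpy as np

          def false_positive_rate(y_true, y_pred, group, g):
              # FPR: fraction of group-g members who did NOT reoffend
              # (y_true == 0) but were still classified high risk (y_pred == 1).
              mask = (group == g) & (y_true == 0)
              return (y_pred[mask] == 1).mean()

          # Hypothetical toy data: y_true = reoffended, y_pred = labeled
          # high risk, group = a demographic attribute.
          y_true = np.array([0, 0, 1, 0, 1, 0, 0, 1])
          y_pred = np.array([1, 0, 1, 1, 1, 0, 0, 0])
          group  = np.array(["a", "a", "a", "b", "b", "b", "a", "b"])

          for g in ("a", "b"):
              print(g, false_positive_rate(y_true, y_pred, group, g))

      A gap between the two printed rates is exactly the kind of allocation harm the ProPublica article documents, even when overall accuracy looks reasonable.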

  14. Case study: Word embeddings

  15. Case study: Word embeddings
      Figure credit: Lior Shkiller

  16. Case study: Word embeddings
      Bolukbasi et al. 2016, "Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings"
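      The paper's headline analogy can be reproduced with plain vector arithmetic over pretrained embeddings. A minimal sketch, assuming a dict `vectors` mapping words to NumPy arrays (the variable names and data source are hypothetical):

          import numpy as np

          def analogy(a, b, c, vectors):
              # Find the word whose vector is closest to
              # vec(b) - vec(a) + vec(c), i.e. "a is to b as c is to ?".
              target = vectors[b] - vectors[a] + vectors[c]
              best, best_sim = None, -np.inf
              for w, v in vectors.items():
                  if w in (a, b, c):
                      continue  # exclude the query words themselves
                  sim = (v @ target) / (np.linalg.norm(v) * np.linalg.norm(target))
                  if sim > best_sim:
                      best, best_sim = w, sim
              return best

          # With word2vec trained on news text, Bolukbasi et al. report that
          # analogy("man", "computer_programmer", "woman", vectors)
          # returns "homemaker".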

  17. Case study: Word embeddings
      SVD to the rescue
      Bolukbasi et al. 2016, "Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings"
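      Concretely, the debiasing step estimates a gender direction from definitional word pairs via SVD and projects it out of gender-neutral words. A simplified sketch of that idea (the paper centers each pair separately before its PCA; this version centers the pooled differences, and all names are illustrative):

          import numpy as np

          def gender_direction(vectors, pairs):
              # Top singular vector of the centered definitional-pair
              # differences: the SVD step mentioned on the slide.
              diffs = np.array([vectors[f] - vectors[m] for f, m in pairs])
              diffs -= diffs.mean(axis=0)
              _, _, vt = np.linalg.svd(diffs, full_matrices=False)
              return vt[0]

          def neutralize(v, g):
              # Remove the component of v along the unit bias direction g.
              g = g / np.linalg.norm(g)
              return v - (v @ g) * g

          # Hypothetical usage, given a dict `vectors` of word -> np.ndarray:
          # g = gender_direction(vectors, [("she", "he"), ("woman", "man"), ("her", "his")])
          # vectors["programmer"] = neutralize(vectors["programmer"], g)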

  18. Credit: @mmantyla

  19. Case study: Recommender Systems
      Filter bubble [Pariser, 2011]
      • Main idea: personalized search determines what information you see and what information you don't see
      • Google, Facebook, Netflix

  20. Case study: Recommender Systems
      • Blue feed vs. Red feed
      http://graphics.wsj.com/blue-feed-red-feed/

  21. (image slide)

  22. Case study: Recommender Systems
      Filter bubble [Pariser, 2011]
      • Main idea: personalized search determines what information you see and what information you don't see (see the toy sketch below)
      • Google, Facebook, Netflix
      • Debate: a good thing or a bad thing?
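      The mechanism is visible even in a toy content-based recommender: rank candidates by similarity to a profile averaged from past clicks, and the system keeps surfacing more of whatever the user already consumed. A minimal sketch with hypothetical item features (nothing here comes from any real platform):

          import numpy as np

          def recommend(history, items, k=3):
              # items: dict of item_id -> feature vector (hypothetical).
              # Build a user profile as the mean of previously clicked items.
              profile = np.mean([items[i] for i in history], axis=0)

              def sim(v):
                  return (v @ profile) / (np.linalg.norm(v) * np.linalg.norm(profile) + 1e-9)

              # Rank unseen items by similarity to the profile: the closer an
              # item is to past clicks, the higher it ranks, so the user's
              # exposure narrows toward what they have already seen.
              candidates = [i for i in items if i not in history]
              return sorted(candidates, key=lambda i: sim(items[i]), reverse=True)[:k]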

  23. Case study: Recommender Systems
      • Bakshy, Messing, Adamic, "Exposure to ideologically diverse news and opinion on Facebook"

  24. Case study: Recommender Systems
      Bakshy, Messing, Adamic, "Exposure to ideologically diverse news and opinion on Facebook"

  25. Case study: Recommender Systems
      • Latanya Sweeney 2013, "Discrimination in Online Ad Delivery"

  26. (image slide)

  27. Case study: Recommender Systems
      • Latanya Sweeney 2013, "Discrimination in Online Ad Delivery"

  28. Credit: @math_rachel, Kate Crawford

  29. Representation harms
      • Denigration
      • Stereotyping
      • Recognition
      • Under-representation
      Credit: Solon Barocas, Kate Crawford, Aaron Shapiro, Hanna Wallach

  30. Case study: Physical Systems
      Self-driving cars

  31. Case study: Physical Systems
      Self-driving cars
      • In the event of an inevitable crash leading to likely loss of life, what should the car do?
      • Example: the crash will result in the death of either
        • the driver and several passengers, or
        • several pedestrians
      • Debate: How do we choose?

  32. At least think about the following questions
      • What questions can/should we ask?
      • What data is OK to use?
      • What is the boundary between private data and public data?
      • If something is public, should you use the data?
        • Example: Enron email corpus
      • Who owns your data?
      • What anonymization should be done, and is the data *really* anonymized?

  33. Plug
      Thinking more about
      • why we develop machine learning systems
      • how we develop machine learning systems for humans
      Human-centered Machine Learning in Spring '18 (CS 7000)

  34. Thanks! Zhenguo Chen, Sean Harrison, Tyler Scott, and most importantly, all of you!

  35. FCQ time! https://colorado.campuslabs.com/courseeval/
