  1. Usable security and the human in the loop Michelle Mazurek Some slides adapted from Lujo Bauer, Lorrie Cranor, Rob Reeder, Blase Ur, and Yinqian Zhang 1

  2. The human threat • Malicious humans • Humans who don’t know what to do • Unmotivated humans • Humans with human limitations 2

  3. Key challenges • Security is a secondary task – Users are trying to get something else done • Security concepts are hard – Viruses, certificates, SSL, encryption, phishing • Human capabilities are limited 3

  4. Are you capable of remembering a unique strong password for every account you have? 4

  5. Key challenges • Security is a secondary task • Security concepts are hard • Human capabilities are limited • Habituation – The “crying wolf” problem • Misaligned priorities 5

  6. Security expert: “Keep the bad guys out!” User: “Don’t lock me out!” 6

  7. Key challenges • Security is a secondary task • Security concepts are hard • Human capabilities are limited • Habituation • Misaligned priorities • Active adversaries – Unlike ordinary UX 7

  8. 8

  9. Case study #1: GREY AND USER BUY-IN 9

  10. Grey: Smartphone-enabled doors • Access control system for doors in the CMU CyLab offices • Based on formal proofs of access – Allows users to grant access to others remotely • Year-long interview study – 29 users x 12 accesses per week L. Bauer, L.F. Cranor, R.W. Reeder, M.K. Reiter, and K. Vaniea. A User Study of Policy Creation in a Flexible Access-Control System. CHI 2008. L. Bauer, L.F. Cranor, M.K. Reiter, and K. Vaniea. Lessons Learned from the Deployment of a Smartphone-Based Access-Control System. SOUPS 2007. 10

  11. Users complained about speed • Videotaped a door to understand how Grey is different from keys 11

  12. Average access times • Grey is not noticeably slower than keys! • Keys: door closed → stop in front of door: 3.6 sec (σ = 3.6); getting keys: 5.4 sec (σ = 3.1); door opened: 5.7 sec (σ = 3.1); total: 14.7 sec (σ = 5.6) • Grey: door closed → getting phone: 8.4 sec (σ = 2.8); stop in front of door: 2.9 sec (σ = 1.5); door opened: 3.8 sec (σ = 1.1); total: 15.1 sec (σ = 3.9) 12

  13. “I find myself standing outside and everybody inside is looking at me standing outside while I am trying to futz with my phone and open the stupid door.” Takeaway: Misaligned priorities 13

  14. Case study #2: PASSWORD EXPIRATION AND USER BEHAVIOR 14

  15. Does password expiration improve security in practice? • Observation – Users often respond to password expiration by transforming their previous passwords in small ways [Adams & Sasse 99 … we’ll talk about this later] • Conjecture – Attackers can exploit the similarity of passwords in the same account to predict the future password based on the old ones [Zhang et al., CCS 2010] 15

  16. Empirical analysis • UNC “Onyen” logins – Broadly used by campus and hospital personnel – Password change required every 3 months – No repetition within 1 year • 51141 unsalted hashes, 10374 defunct accounts – 4 to 15 hashes per account in temporal order • Cracked ~8k accounts, 8 months, standard tools • Experimental set: 7752 accounts – At least one cracked password, NOT the last one 16
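The unsalted hashes are what make this analysis tractable: without per-account salts, one digest per candidate guess can be checked against every account at once. A minimal sketch of that idea, assuming MD5 and made-up sample passwords (the study itself used standard cracking tools over roughly eight months, not this loop):

```python
import hashlib

def crack_unsalted(target_hashes, candidates, algo="md5"):
    """Check candidate passwords against a set of unsalted hashes.

    With no per-account salt, a single digest per candidate tests
    all accounts simultaneously.
    """
    targets = set(target_hashes)
    cracked = {}
    for pw in candidates:
        digest = hashlib.new(algo, pw.encode()).hexdigest()
        if digest in targets:
            cracked[digest] = pw
    return cracked

# Toy demonstration with made-up passwords:
leaked = [hashlib.md5(p.encode()).hexdigest() for p in ("tarheels#1", "letmein")]
found = crack_unsalted(leaked, ["password", "tarheels#1", "qwerty", "letmein"])
```

A salted scheme would force the attacker to redo this work per account, which is why the same wordlist effort scales so cheaply here.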

  17. Transform trees • [tree diagram: root “password”; one-step transforms such as s → $ (“pa$sword”?) and p → P (“Password”?); composed transforms at deeper nodes (“pa$$word”?, “Pa$sword”?)] • Approximation algorithm for optimal tree searching 17

  18. Location-independent transforms • Capitalization: tarheels#1 → tArheels#1 • Deletion: tarheels#1 → tarheels1 • Duplication: tarheels#1 → tarheels#11 • Substitution: tarheels#1 → tarheels#2 • Insertion: tarheels#1 → tarheels#12 • Leet transform: tarheels#1 → t@rheels#1 • Block move: tarheels#1 → #tarheels1 • Keyboard transform: tarheels#1 → tarheels#! 18
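These categories are easy to mechanize. A sketch of three of them in Python, each producing the set of one-step variants of a password (the function names and the leet table are mine, not from the paper):

```python
LEET = {"a": "@", "e": "3", "i": "1", "o": "0", "s": "$"}

def deletions(pw):
    """Delete one character: tarheels#1 -> tarheels1."""
    return {pw[:i] + pw[i + 1:] for i in range(len(pw))}

def leet_variants(pw):
    """Apply one leet substitution: tarheels#1 -> t@rheels#1."""
    return {pw[:i] + LEET[c] + pw[i + 1:]
            for i, c in enumerate(pw) if c in LEET}

def digit_substitutions(pw):
    """Replace one digit with another: tarheels#1 -> tarheels#2."""
    return {pw[:i] + d + pw[i + 1:]
            for i, c in enumerate(pw) if c.isdigit()
            for d in "0123456789" if d != c}

guesses = (deletions("tarheels#1")
           | leet_variants("tarheels#1")
           | digit_substitutions("tarheels#1"))
```

Even these three generators already cover the motivating examples on the slide, which is the point: the guess space around an old password is small and cheap to enumerate.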

  19. Evaluation • Pick a known plaintext, non-last password (OLD) • Pick any later password (NEW) • Attempt to crack NEW with transform tree rooted at OLD 19
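The evaluation step can be sketched as a breadth-first search over the transform tree rooted at OLD. This naive version only illustrates the idea (the paper uses an approximation algorithm to order the search); `matches_new` stands in for hashing a candidate and comparing it to NEW's stored hash:

```python
from collections import deque

def search_transform_tree(old, transforms, matches_new, max_depth=3):
    """Breadth-first search for NEW among transformed variants of OLD.

    `transforms` is a list of functions mapping a password to a set of
    one-step variants; `matches_new(pw)` reports whether pw matches the
    target. Returns (password, depth) or (None, -1) on failure.
    """
    seen = {old}
    queue = deque([(old, 0)])
    while queue:
        pw, depth = queue.popleft()
        if matches_new(pw):
            return pw, depth
        if depth < max_depth:
            for t in transforms:
                for variant in t(pw):
                    if variant not in seen:
                        seen.add(variant)
                        queue.append((variant, depth + 1))
    return None, -1

# Toy example: NEW increments the trailing digit of OLD twice.
bump = (lambda pw: {pw[:-1] + str((int(pw[-1]) + 1) % 10)}
        if pw[-1].isdigit() else set())
guess, depth = search_transform_tree("tarheels#1", [bump],
                                     lambda pw: pw == "tarheels#3")
```

The `seen` set keeps the search from revisiting variants reachable along multiple transform paths, which is what makes the depth-bounded enumeration cheap.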

  20. Results: offline attack within 3 seconds! • [bar chart: success rate vs. search depth (1–4) for edit-distance, edit-distance with location-independent moves, and pruned transform sets; success grows with depth, reaching about 41% at depth 4] • Takeaways: Memory limitations matter; convenience always wins 20

  21. Understanding the human • Who wants to practice good security but doesn’t know how • Who is indifferent to security but will comply – If it’s easy – If it’s the default – If it doesn’t interfere with the primary task 21

  22. Human-in-the-loop framework • Based on Communication-Human Information Processing Model (C-HIP) from Warnings Science • Models human interaction with secure systems • Can help identify (non-malicious) human threats L. Cranor. A Framework for Reasoning About the Human In the Loop. Usability, Psychology and Security 2008. http://www.usenix.org/events/upsec08/tech/full_papers/cranor/cranor.pdf 22

  23. Human-in-the-loop framework • [diagram: a communication passes through communication impediments (environmental stimuli, interference) to the human receiver, who handles it in three stages: communication delivery (attention switch, attention maintenance), communication processing (comprehension, knowledge acquisition), and application (knowledge retention, knowledge transfer); personal variables (demographics and personal characteristics, knowledge & experience), intentions (attitudes and beliefs, motivation), and capabilities all shape the resulting behavior] 23

  24. Human threat identification and mitigation process • Task identification: identify points where the system relies on humans to perform security-critical functions • Task automation: find ways to partially or fully automate some of these tasks • Failure identification: identify potential failure modes for the remaining tasks (human-in-the-loop framework, user studies) • Failure mitigation: find ways to prevent these failures (user studies) 24

  25. Human-in-the-loop framework • [diagram: a communication passes through communication impediments (environmental stimuli, interference) to the human receiver, who handles it in three stages: communication delivery (attention switch, attention maintenance), communication processing (comprehension, knowledge acquisition), and application (knowledge retention, knowledge transfer); personal variables (demographics and personal characteristics, knowledge & experience), intentions (attitudes and beliefs, motivation), and capabilities all shape the resulting behavior] 25

  26. 26

  27. Internet Explorer cookie flag 27

  28. Human threat identification and mitigation process • Task identification: identify points where the system relies on humans to perform security-critical functions • Task automation: find ways to partially or fully automate some of these tasks • Failure identification: identify potential failure modes for the remaining tasks (human-in-the-loop framework, user studies) • Failure mitigation: find ways to prevent these failures (user studies) 28

  29. 29

  30. 30

  31. Users are not the enemy • “These observations cannot be disputed, but the conclusion that this behavior occurs because users are inherently careless — and therefore insecure — needs to be challenged.” • Study methods: – Online survey, primarily from organization A – Interviews at organizations A and B – Grounded theory 31

  32. Discussion questions • This paper is “classic” (from 1999). What do you think might be different today? What questions would you add or change? • Are these participants representative (of what)? – What other groups could you ask? How might the results be different? 32

  33. Discussion questions • “Users identified certain systems as worthy of secure password practices, while others were perceived as ‘not important enough.’” – How do you motivate users? – How do you treat users as partners? – What about when this behavior is rational/correct? • What solutions are suggested? – Do you think these would work? Why / why not? – Other suggestions? 33

  34. (One) Hierarchy of solutions • Make it “just work” – Invisible security • Make security/privacy understandable – Make it visible – Make it intuitive – Use metaphors that users can relate to • Train the user 34

  35. Automation considered harmful? Problems: • Insufficient flexibility • Imposition of values • Impact on user experience – Especially in failure cases • Examples from your home domain? 35

  36. Considerations for automating • Accuracy • Implicit instead? • Stakeholder values • Keep human informed? • Information overload? • Fail gracefully? • Do you agree with all of these? • Are there others we should add? 36
