  1. Introduction to Human Computer Interaction Course on NPTEL, Spring 2018. Week 7: Usable Security. Ponnurangam Kumaraguru (“PK”), Associate Professor, ACM Distinguished Speaker & TEDx Speaker. Linkedin/in/ponguru/ fb/ponnurangam.kumaraguru, @ponguru

  2. Usability and Security ● Why should we study this? ● Why is it important? ● Any experience / relationship?

  3. Everyday Security Problems: Setting File Permissions
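The file-permission problem on this slide can be made concrete with a small sketch (illustrative Python, not from the slides; assumes a POSIX system): a single extra flag turns a private file into a world-readable one, which is exactly the kind of unintuitive interface that makes this an "everyday security problem".

```python
# Sketch: the raw permission interface is bit-twiddling, and one small
# mistake silently exposes a file to everyone. Illustrative only.
import os
import stat
import tempfile

fd, path = tempfile.mkstemp()
os.close(fd)

# Intended: private file, owner read/write only (0o600).
os.chmod(path, stat.S_IRUSR | stat.S_IWUSR)
private = stat.S_IMODE(os.stat(path).st_mode)

# Easy slip: one extra flag makes the file world-readable (0o604).
os.chmod(path, stat.S_IRUSR | stat.S_IWUSR | stat.S_IROTH)
leaky = stat.S_IMODE(os.stat(path).st_mode)

print(oct(private), oct(leaky))
os.remove(path)
```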

  4. Secure, but usable?

  5. Unusable security frustrates users

  6. Usable Privacy and Security ● “Give end-users security controls they can understand and privacy they can control for the dynamic, pervasive computing environments of the future.” - Grand Challenges in Information Security & Assurance, Computing Research Association (2003) ● More research needed on how “cultural and social influences can affect how people use computers and electronic information in ways that increase the risk of cybersecurity breaches.” - Grand Challenges for Engineering, National Academy of Engineering (2008)

  7. Humans are the weakest link ● Most security breaches are attributed to “human error” ● Social engineering attacks proliferate

  8. How can we make secure systems more usable? ● Make it “just work” ● Invisible security ● Make security/privacy understandable ● Make it visible ● Make it intuitive ● Use metaphors that users can relate to ● Train the user

  9. Concerns may not be aligned ● Security expert: “Keep the bad guys out” ● User: “Don’t lock me out!”

  10. Grey ● Smartphone based access-control system ● Used to open doors in the Carnegie Mellon CIC building ● Allows users to grant access to their doors remotely L. Bauer, L.F. Cranor, R.W. Reeder, M.K. Reiter, and K. Vaniea. A User Study of Policy Creation in a Flexible Access-Control System. CHI 2008. http://www.robreeder.com/pubs/greyCHI2008.pdf L. Bauer, L. F. Cranor, M. K. Reiter, and K. Vaniea. Lessons Learned from the Deployment of a Smartphone-Based Access-Control System. SOUPS 2007. http://cups.cs.cmu.edu/soups/2007/proceedings/p64_bauer.pdf

  11. Data collection ● Year-long interview study ● Recorded 30 hours of interviews with Grey users ● System was actively used: 29 users × 12 accesses per week

  12. Users complained about speed ● Users said Grey was slow ● But Grey was as fast as keys ● Videotaped a door to better understand how doors are opened differently with Grey and keys

  13. “I find myself standing outside and everybody inside is looking at me standing outside while I am trying to futz with my phone and open the stupid door.”

  14. Train the user

  15. Why do humans fall for phish? ● Not motivated to pay attention to training ● “Security is not my problem” ● Mental models inconsistent with reality ● “If site looks professional it must be legitimate” ● Need actionable advice they can understand ● Difficult to be alert if you don’t know what you’re looking for

  16. How do we get people trained? Learning science principles + Teachable moments + Fun P. Kumaraguru, S. Sheng, A. Acquisti, L. Cranor, and J. Hong. Teaching Johnny Not to Fall for Phish. ACM Trans. Internet Technol. 10, 2 (May 2010), 1-31.

  17. PhishGuru embedded training ● Send emails that look like phish ● If the recipient falls for one, train in a succinct and engaging format ● Study demonstrated effectiveness of PhishGuru and found that the same training was not effective when sent as a regular email Learning science principles + Teachable moments + Fun

  18. Design rationale ● Paper and HTML prototypes ● One page constraint ● Analyzed instructions from most popular websites ● Present the training materials when users click on the link

  19. Applies learning-by-doing and immediate feedback principles

  20. Applies story-based agent principle

  21. Applies contiguity principle Presents procedural knowledge

  22. Applies personalization principle Presents conceptual knowledge

  23. Iterations

  24. First intervention

  25. Intervention: eBay

  26. Focus group studies ● One with age group 18–55 and another with age group greater than 65 ● All age groups read the interventions ● Everybody liked the goldfish and the comic-strip format ● Participants did not like the phisher character

  27. First lab study results ● Security notices are an ineffective medium for training users ● Users educated with embedded training make better decisions than those sent security notices Kumaraguru, P., Rhee, Y., Acquisti, A., Cranor, L. F., Hong, J., and Nunge, E. Protecting people from phishing: the design and evaluation of an embedded training email system. CHI ’07, pp. 905-914.

  28. Second lab study results ● Users educated with PhishGuru retained knowledge after seven days ● Users trained with embedded training did better than users trained with non-embedded training Kumaraguru, P., Rhee, Y., Sheng, S., Hasan, S., Acquisti, A., Cranor, L. F., and Hong, J. Getting users to pay attention to anti-phishing education: Evaluation of retention and transfer. e-Crime Researchers Summit, Anti-Phishing Working Group (2007).

  29. Real world study: Portuguese ISP ● PhishGuru is effective in training people in the real world ● Trained participants retained knowledge after 7 days of training Kumaraguru, P., Sheng, S., Acquisti, A., Cranor, L. F., and Hong, J. Lessons from a real world evaluation of anti-phishing training. e-Crime Researchers Summit, 2008

  30. Real world study: CMU ● Evaluate effectiveness of PhishGuru training in the real world ● Investigate retention after 1 week, 2 weeks, and 4 weeks ● Compare effectiveness of 2 training messages with effectiveness of 1 training message P. Kumaraguru, J. Cranshaw, A. Acquisti, L. Cranor, J. Hong, M. A. Blair, and T. Pham. School of Phish: A Real-World Evaluation of Anti-Phishing Training. SOUPS 2009.

  31. Study design ● Sent email to all CMU students, faculty and staff to recruit participants to opt-in to study ● 515 participants in three conditions ● Control ● One training message ● Two training messages ● Emails sent over 28 day period ● 7 simulated spear-phishing messages ● 3 legitimate messages from ISO (cyber security scavenger hunt) ● Exit survey

  32. What study design? ● For 2 different solutions: PhishGuru & PhishX

  33. Comparing Two Alternatives ● Between-groups experiment: two groups of test users; each group uses only one of the systems ● Within-groups experiment: one group of test users; each person uses both systems, in randomized order (the same tasks can’t be reused in the same order, because of learning effects) ● Between-groups requires many more participants than within-groups
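The two designs above can be sketched in code (a minimal illustration; the participant list and random seed are assumptions, only the system names come from the slides):

```python
# Between-groups vs. within-groups assignment for two systems.
import itertools
import random

participants = [f"P{i}" for i in range(1, 9)]   # assumed: 8 test users
systems = ["PhishGuru", "PhishX"]

# Between-groups: each participant uses exactly one system;
# random assignment keeps the two groups balanced.
random.seed(0)
shuffled = participants[:]
random.shuffle(shuffled)
between = {p: systems[i % 2] for i, p in enumerate(shuffled)}

# Within-groups: each participant uses both systems, with the order
# counterbalanced across participants to cancel out learning effects.
orders = list(itertools.permutations(systems))  # (A, B) and (B, A)
within = {p: orders[i % 2] for i, p in enumerate(participants)}

print(between)
print(within)
```

Counterbalancing the order (the second half of the sketch) is what makes the within-groups design defensible despite each person seeing both systems.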

  34. Implementation ● Unique hash in the URL for each participant ● Demographic and department/status data linked to each hash ● Form does not POST login details ● Campus help desks and all spoofed departments were notified before messages were sent
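The per-participant hash idea can be sketched as follows (a hypothetical illustration: the secret key, token length, and helper names are all assumptions; only the `andrewwebmail.org` domain appears in the slides). An HMAC keeps tokens non-guessable while letting the researchers attribute each click without ever collecting login details:

```python
# Sketch: embed a unique, non-forgeable hash per participant in the
# simulated-phish URL. Secret key and helpers are assumed for illustration.
import hashlib
import hmac

SECRET_KEY = b"study-secret-key"  # assumed; would be kept server-side
BASE_URL = "http://andrewwebmail.org/password/change.htm"

def participant_token(email: str) -> str:
    # HMAC, so tokens can't be enumerated from email addresses alone.
    return hmac.new(SECRET_KEY, email.encode(), hashlib.sha256).hexdigest()[:8]

def tracking_url(email: str) -> str:
    return f"{BASE_URL}?ID={participant_token(email)}"

url = tracking_url("alice@andrew.cmu.edu")
print(url)
```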

  35. Study schedule

      Day of study   Control             One training message   Two training messages
      Day 0          Test and real       Train and real         Train and real
      Day 2          Test                Test                   Test
      Day 7          Test and real       Test and real          Test and real
      Day 14         Test                Test                   Train
      Day 16         Test                Test                   Test
      Day 21         Test                Test                   Test
      Day 28         Test and real       Test and real          Test and real
      Day 35         Post-study survey (all conditions)
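The schedule can be encoded as data to cross-check the design: each condition receives exactly seven simulated spear-phishing messages over the 28 days, matching the study design slide (legitimate emails and the day-35 survey are omitted; this encoding is an illustration, not from the slides):

```python
# Simulated-phish schedule per condition; "train" marks the message that
# triggers the PhishGuru intervention when clicked.
schedule = {
    "control":   {0: "test",  2: "test", 7: "test", 14: "test",  16: "test", 21: "test", 28: "test"},
    "one_train": {0: "train", 2: "test", 7: "test", 14: "test",  16: "test", 21: "test", 28: "test"},
    "two_train": {0: "train", 2: "test", 7: "test", 14: "train", 16: "test", 21: "test", 28: "test"},
}

for condition, days in schedule.items():
    print(condition, len(days), "phishing messages,",
          sum(v == "train" for v in days.values()), "of them training")
```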

  36. Simulated spear-phishing message ● Plain-text email without graphics ● URL is not hidden

  37. Simulated phishing website http://andrewwebmail.org/password/change.htm?ID=9009

  38. Simulated phishing website http://andrewwebmail.org/password/thankyou.html?ID=9009

  39. PhishGuru intervention

  40. Effect of PhishGuru

      Condition   N     % who clicked on Day 0   % who clicked on Day 28
      Control     172   52.3                     44.2
      Trained     343   48.4                     24.5
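As a sanity check on the counts on this slide, the day-28 gap between conditions can be recomputed (percentages and Ns are from the slide; the two-proportion z-test is an addition for illustration, not part of the study report):

```python
# Recompute the day-28 click rates and test the control-vs-trained gap.
import math

def clicks(n, pct):
    # Slide reports percentages; recover approximate counts.
    return round(n * pct / 100)

n_ctrl, n_train = 172, 343
ctrl_d28 = clicks(n_ctrl, 44.2)    # control clicks on day 28
train_d28 = clicks(n_train, 24.5)  # trained clicks on day 28

# Two-proportion z-test on day-28 click rates.
p1, p2 = ctrl_d28 / n_ctrl, train_d28 / n_train
p = (ctrl_d28 + train_d28) / (n_ctrl + n_train)
z = (p1 - p2) / math.sqrt(p * (1 - p) * (1 / n_ctrl + 1 / n_train))
print(round(p1 - p2, 3), round(z, 2))
```

The large positive z is consistent with the next slides' claim that trained participants were less likely to fall for phish.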

  41. Results conditioned on participants who clicked on day 0 Trained participants less likely to fall for phish

  42. Results conditioned on participants who clicked on day 0 Trained participants less likely to fall for phish Trained participants remember what they learned 28 days later

  43. Results conditioned on participants who clicked on day 0 and day 14 Two-train participants less likely than one-train participants to click on days 16 and 21

  44. Results conditioned on participants who clicked on day 0 and day 14 Two-train participants less likely than one-train participants to click on days 16 and 21 Two-train participants less likely than one-train participants to provide information on day 28

  45. Legitimate emails

      Condition   N    Day 0 clicked %   Day 7 clicked %   Day 28 clicked %
      Control     90   50.0              41.1              38.9
      One-train   89   39.3              42.7              32.3
      Two-train   77   48.1              44.2              35.1

      No difference between the three conditions on days 0, 7, and 28

  46. Legitimate emails

      Condition   N    Day 0 clicked %   Day 7 clicked %   Day 28 clicked %
      Control     90   50.0              41.1              38.9
      One-train   89   39.3              42.7              32.3
      Two-train   77   48.1              44.2              35.1

      No difference between the three conditions on days 0, 7, and 28
      No difference within the three conditions for the three emails
