User education, mental models, and risk factors
Michelle Mazurek
With material from Lorrie Cranor
Today
• User education
• Risk factors: demographics and behavior
• Folk models of security
• Activity
USER EDUCATION
Case study: PhishGuru and Anti-Phishing Phil
Challenges in user education
• Users are not motivated
• Security is a secondary task
• Risk of increasing false positives
– All you do is make them more paranoid
Is user education possible?
• Security education “puts the burden on the wrong shoulders.” [1]
• “Security user education is a myth.” [2]
• “User education is a complete waste of time. It is about as much use as nailing jelly to a wall…. They are not interested.” [3]
[1] J. Nielsen, 2004. User education is not the answer to security problems.
[2] S. Görling, 2006. The myth of user education.
[3] M. Overton, http://news.cnet.com/2100-7350_3-6125213-2.html
Evaluating existing training (2007)
• Lab study: 28 non-experts
• Evaluate 10 sites, 15 min break, 10 more sites
– Control break: read email, play solitaire
– Experiment break: read training materials
• Experimental group: better after training
– But more false positives
P. Kumaraguru, S. Sheng, A. Acquisti, L. Cranor, and J. Hong. Teaching Johnny Not to Fall for Phish. ACM TOIT, May 2010.
Maybe we can nail jelly after all…
http://graeme.woaf.net/otherbits/jelly.html
PhishGuru and learning science
• Send email that looks like phishing
• If participant falls for it, redirect to intervention
[PhishGuru comic intervention, shown one annotated panel at a time:]
• Applies learning-by-doing and immediate-feedback principles
• Applies story-based agent principle
• Applies contiguity principle; presents procedural knowledge
• Applies personalization principle; presents conceptual knowledge
Evaluating PhishGuru
• Two lab studies, field study
• Second lab study: roleplay and respond to email
– Part 1: 16 emails, training, 16 more
– Part 2: 16 emails (7 days later)
• Training: four conditions
– Email from friend, no relevant content
– Email from friend mentions phishing
– PhishGuru cartoon in email
– PhishGuru embedded (when click on link)
Results: Identifying phishing emails
[Bar chart: mean correctness before training, immediately after (learning), and after a delay (retention), for the non-embedded, embedded, control, and suspicion conditions; values range from 0.04 to 0.68. Takeaway: embedded training helps, and the gain is retained.]
Results: Legitimate links
[Bar chart: mean correctness on legitimate links before, immediately after, and after a delay; values stay high (0.80–1.00) across conditions.]
Field study at CMU
• Investigate retention at 1, 2, and 4 weeks
• Opt-in to all students, faculty, staff (N=515)
• 28-day period:
– 7 simulated phishing
– 3 legitimate ISO messages
– Exit survey
• Unique hash in link per participant
– Collect demographic data
P. Kumaraguru, J. Cranshaw, A. Acquisti, L. Cranor, J. Hong, M. A. Blair, and T. Pham. School of Phish: A Real-World Evaluation of Anti-Phishing Training. SOUPS 2009.
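The per-participant link instrumentation above can be sketched in a few lines. This is a hypothetical illustration, not the study's actual code: the secret key, URL, and token format are all assumptions.

```python
import hashlib
import hmac

# Illustrative secret; a real study would generate and protect this key.
SECRET = b"study-secret-key"

def tracking_token(participant_id: str, message_id: str) -> str:
    """Derive a stable, non-guessable token for one (participant, email) pair."""
    msg = f"{participant_id}:{message_id}".encode()
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()[:16]

def tracking_url(participant_id: str, message_id: str) -> str:
    # Hypothetical landing-page URL; the real study used CMU infrastructure.
    return f"https://phish-study.example.edu/go?t={tracking_token(participant_id, message_id)}"
```

When a participant clicks, the server maps the token back to the (participant, email) pair, which is what lets click rates be joined with the consented demographic data from the exit survey.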
Email schedule
Day | Control        | One training   | Two training
0   | Test and real  | Train and real | Train and real
2   | Test           | Test           | Test
7   | Test and real  | Test and real  | Test and real
14  | Test           | Test           | Train
16  | Test           | Test           | Test
21  | Test           | Test           | Test
28  | Test and real  | Test and real  | Test and real
35  | Exit survey    | Exit survey    | Exit survey
Sample phishing email
[Screenshot: plain-text email without graphics; URL is not hidden]
Phishing emails
From | Subject
ISO | Bandwidth quota offer
Networking Services | Register for CMU’s annual networking event
Webmaster | Change Andrew password
Enrollment Services | Congratulations – Plaid Ca$h
Sophie Jones | Please register for the conference
Community Service Links | Volunteer at Community Service
Help Desk | Your Andrew password alert
Results
• Training → less clicking on phishing
• Retained training for 28 days
• Two trainings better than one
• No increase in false positives
• 80% recommended CMU continue
Condition | N   | % clicked day 1 | % clicked day 28
Control   | 172 | 52.3            | 44.2
Trained   | 343 | 48.4            | 24.5
Anti-Phishing Phil
[Screenshot of the Anti-Phishing Phil game]
Evaluation
• 10 URLs before, 10 after, randomized
– In between, up to 15 min training
• Two standard tutorials, game
Results
• All training made people more suspicious
• No sig. difference in false negatives
• Only game did not increase false positives
• Lots of positive comments
– Online study with similar results
Why does it work?
• Fun to play
• People like to win things (even just points)
• Fast – 10 minutes
• Teaches actionable steps
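The “actionable steps” Phil teaches are mostly URL-reading heuristics. A minimal sketch of that kind of check follows; the rules and the brand watch-list below are illustrative, not the game's actual rule set.

```python
import re
from urllib.parse import urlparse

SPOOFED_BRANDS = ("paypal", "ebay", "bank")  # illustrative watch-list

def looks_suspicious(url: str) -> bool:
    """Apply a few Phil-style cues: raw IP hosts, user@host tricks,
    and brand names buried in front of an unrelated domain."""
    host = urlparse(url).hostname or ""
    if re.fullmatch(r"\d{1,3}(\.\d{1,3}){3}", host):
        return True  # numeric IP instead of a domain name
    if "@" in urlparse(url).netloc:
        return True  # everything before '@' is ignored; the real host follows it
    parts = host.split(".")
    if len(parts) > 2 and any(b in parts[:-2] for b in SPOOFED_BRANDS):
        return True  # e.g. www.paypal.com.evil.net really belongs to evil.net
    return False
```

The key lesson the game drills is the last rule: only the rightmost registered domain matters, no matter what familiar names appear earlier in the URL.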
CASE STUDY: MALWARE RISK FACTORS
Lalonde-Levesque et al., CCS 2013
A “clinical study” of malware
• Distributed laptops, 50 participants, 4 months
– Sold to participants below retail
• Installed with a bunch of anti-virus
• Configured to collect data about:
– Applications installed/updated
– Browsing, downloads, browser plugins, wi-fi
• Monthly sessions to check for infection
– Fill out survey
– Collect additional infection data
Results: Infections
• 38% exposed within 4 months
– Consistent over all months
– 17 via portable storage
– 1 outlier with 28 unique detections
• 18 threats on 10 machines
– 7 unwanted, 9 adware, 1 malware, 1 maybe
– 1 fake AV scanner
Risk factors
• Risk of exposure (detection), not infection
• Self-reported expertise → more exposure
– Nothing for gender, age, field
• Behavioral factors for more exposure:
– Installing more applications, visiting more sites
– Sites: mp3, peer-to-peer, gambling, sports, etc.
MENTAL MODELS
Case study: Viruses, hackers, advice
Wash, SOUPS 2010
Folk models of security
• Qualitative interviews, snowball sample
• 23 in first round
• 10 more to specifically test conclusions
Viruses and malware
• Viruses are bad; not much else
• Viruses are buggy software
– Must be intentionally placed on computer
• Viruses are mischievous, annoying
– Caught by visiting bad parts of internet
• Viruses support crime
– Identity theft, not damage
Hackers
• Graffiti artists
– Target anyone, cause mischief, impress friends
• Opportunistic criminals
– Steal identity/financials, targets of opportunity
• Organized criminals
– Target important/rich people
• Criminal contractors
– Hybrid of first and third
Effect on security advice
[Table: rows are 12 pieces of common security advice; columns are the four virus models (viruses are bad, buggy software, mischief, support crime) and four hacker models (graffiti artist, burglar, big fish, contractor). Each cell marks how important that advice appears under that model: !! = very important, ?? = maybe helpful, not too important, xx = unnecessary.
Advice: 1. Use anti-virus software; 2. Keep anti-virus updated; 3. Regularly scan computer with anti-virus; 4. Use security software (firewall, etc.); 5. Don’t click on attachments; 6. Be careful downloading from websites; 7. Be careful which websites you visit; 8. Disable scripting in web and email; 9. Use good passwords; 10. Make regular backups; 11. Keep patches up to date; 12. Turn off computer when not in use.]
EDUCATION DESIGN
Avoiding scams on social media
In groups: Train users to avoid social media scams
• Get free stuff, shocking videos, hoax news
• Use learning science principles
– Learning by doing, immediate feedback, agent/story, contiguity (words + pictures), personalization (I/we/you), conceptual + procedural
– See required reading #2 for more
• Consider mental models
• Explain context/delivery, sketch the intervention