CSci 5271: Introduction to Computer Security
Tor and usability combined lecture
Stephen McCamant
University of Minnesota, Computer Science & Engineering

Outline
- Tor basics
- Tor experiences and challenges
- Usability and security
- Usable security example areas

Tor: an overlay network
- Tor (originally from "the onion router")
  https://www.torproject.org/
- An anonymous network built on top of the non-anonymous Internet
- Designed to support a wide variety of anonymity use cases

Low-latency TCP applications
- Tor works by proxying TCP streams (and DNS lookups)
- Focuses on achieving interactive latency
  - WWW, but potentially also chat, SSH, etc.
- Anonymity tradeoffs compared to remailers

Onion routing
- Stream from sender to D forwarded via A, B, and C
- One Tor circuit made of four TCP hops
- Encrypt packets (512-byte "cells") as E_A(B, E_B(C, E_C(D, P)))
- TLS-like hybrid encryption with "telescoping" path setup

Client perspective
- Install Tor client running in background
- Configure browser to use Tor as proxy
  - Or complete Tor+Proxy+Browser bundle
- Browse web as normal, but a lot slower
  - Also, sometimes google.com is in Swedish
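The layered encryption E_A(B, E_B(C, E_C(D, P))) above can be sketched in a few lines of Python. This is a toy illustration only: a self-inverse XOR keystream stands in for Tor's real AES counter-mode cell crypto, the one-byte next-hop field stands in for real relay addressing, and the key names are invented.

```python
import hashlib

def toy_cipher(key: bytes, data: bytes) -> bytes:
    """Toy XOR keystream 'cipher' (self-inverse, so it decrypts too).
    A stand-in for Tor's real per-hop symmetric encryption."""
    stream = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(a ^ b for a, b in zip(data, stream))

def wrap_onion(payload: bytes, route, dest: bytes) -> bytes:
    """Client side: build E_A(B, E_B(C, E_C(D, P))) by wrapping
    innermost layer first. Each layer = 1-byte next-hop id + inner cell."""
    cell, next_hop = payload, dest
    for name, key in reversed(route):          # wrap C, then B, then A
        cell = toy_cipher(key, next_hop + cell)
        next_hop = name
    return cell

def peel(cell: bytes, key: bytes):
    """Relay side: strip one layer; learn only the next hop, not the payload."""
    plain = toy_cipher(key, cell)
    return plain[:1], plain[1:]

route = [(b"A", b"key-A"), (b"B", b"key-B"), (b"C", b"key-C")]
cell = wrap_onion(b"GET / HTTP/1.1", route, b"D")
for name, key in route:                        # A, B, C peel in turn
    next_hop, cell = peel(cell, key)
# Only after the last relay peels is the destination and plaintext visible
assert next_hop == b"D" and cell == b"GET / HTTP/1.1"
```

Note the design point this captures: each relay learns only its predecessor and successor; none of A, B, or C alone sees both the client and the plaintext destined for D.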
Entry/guard relays
- "Entry node": first relay on path
- Entry knows the client's identity, so particularly sensitive
  - Many attacks possible if one adversary controls entry and exit
- Choose a small random set of "guards" as the only entries to use
  - Rotate slowly, or when necessary
  - For repeat users, better than a fresh random choice each time

Exit relays
- Forward traffic to/from the non-Tor destination
- Focal point for anti-abuse policies
  - E.g., no exits will forward for port 25 (email sending)
- Can see plaintext traffic, so danger of sniffing, MITM, etc.

Centralized directory
- How to find relays in the first place?
- Straightforward current approach: central directory servers
- Relay information includes bandwidth, exit policies, public keys, etc.
- Replicated, but a potential bottleneck for scalability and blocking

Outline
- Tor basics
- Tor experiences and challenges
- Usability and security
- Usable security example areas

Anonymity loves company
- Diverse user pool needed for anonymity to be meaningful
  - Hypothetical Department of Defense Anonymity Network
- Tor aims to be helpful to a broad range of (sympathetic-sounding) potential users

Who (arguably) needs Tor?
- Consumers concerned about web tracking
- Businesses doing research on the competition
- Citizens of countries with Internet censorship
- Reporters protecting their sources
- Law enforcement investigating targets
Tor and the US government
- Onion routing research started with the US Navy
- Academic research still supported by NSF
- Anti-censorship work supported by the State Department
  - Same branch as Voice of America
- But also targeted by the NSA
  - Per Snowden, so far with only limited success

Volunteer relays
- Tor relays are run basically by volunteers
  - Most are idealistic
  - A few have been less-ethical researchers, or GCHQ
- Never enough relays, or enough bandwidth
- P2P-style mandatory participation? Unworkable/undesirable
- Various other kinds of incentives explored

Performance
- Increased latency from long paths
- Bandwidth limited by relays
- Recently 1-2 sec for 50KB, 3-7 sec for 1MB
- Historically worse for many periods
  - Flooding (guessed to be a botnet) fall 2013

Anti-censorship
- As a web proxy, Tor is useful for getting around blocking
  - Unless Tor itself is blocked, as it often is
- Bridges are special less-public entry points
- Also, a protocol obfuscation arms race (uneven)

Hidden services
- Tor can be used by servers as well as clients
- Identified by cryptographic key, use a special rendezvous protocol
- Servers often present an easier attack surface

Undesirable users
- P2P filesharing
  - Discouraged by Tor developers, to little effect
- Terrorists
  - At least the NSA thinks so
- Illicit e-commerce
  - "Silk Road" and its successors
Intersection attacks
- Suppose you use Tor to update a pseudonymous blog, and reveal you live in Minneapolis
- Comcast can tell who in the city was sending to Tor at the moment you post an entry
- Anonymity set of 1000 → reasonable protection
- But if you keep posting, the adversary can keep narrowing down the set

Exit sniffing
- Easy mistake to make: log in to an HTTP web site over Tor
- A malicious exit node could now steal your password
- Another reason to always use HTTPS for logins

Traffic confirmation attacks
- If the same entity controls both guard and exit on a circuit, many attacks can link the two connections
  - "Traffic confirmation attack"
- Can't directly compare payload data, since it is encrypted
- Standard approach: insert and observe delays
- Protocol bug until recently: covert channel in hidden service lookup

Browser bundle JS attack
- Tor's Browser Bundle disables many features to try to stop tracking
- But, JavaScript defaults to on
  - Usability for non-expert users
  - Fingerprinting via NoScript settings
  - Was incompatible with Firefox auto-updating
- Many Tor users de-anonymized in August 2013 by a JS vulnerability patched in June

Hidden service traffic confirmation
- Bug allowed a signal to the guard when a user looked up a hidden service
  - Non-statistical traffic confirmation
- For 5 months in 2014, 115 guard nodes (about 6%) participated in this attack
- Apparently researchers at CMU's SEI/CERT
- Beyond "research," they also gave/sold info. to the FBI
  - Apparently used in the Silk Road 2.0 prosecution, etc.

Outline
- Tor basics
- Tor experiences and challenges
- Usability and security
- Usable security example areas
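The intersection attack described above is, mechanically, just repeated set intersection: each post lets the observer intersect the current candidate set with the set of local users connected to Tor at that moment. A toy sketch with invented user sets (in reality the starting set might be ~1000 users, not five):

```python
# Hypothetical observations: which local users were sending traffic
# to Tor at the moment of each pseudonymous blog post.
observations = [
    {"alice", "bob", "carol", "dave", "erin"},   # post 1
    {"alice", "carol", "frank", "grace"},        # post 2
    {"alice", "carol", "heidi"},                 # post 3
    {"alice", "ivan"},                           # post 4
]

suspects = set(observations[0])
for active_users in observations[1:]:
    suspects &= active_users    # the blogger must appear in every observation
    print(len(suspects), "candidates remain")

assert suspects == {"alice"}    # de-anonymized after four posts
```

This is why a one-time anonymity set of 1000 is reasonable protection, while a long-lived pseudonym leaks a little more with every linkable action.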
Users are not "ideal components"
- Frustrates engineers: cannot give users instructions like a computer
  - Closest approximation: military
- Unrealistic expectations are bad for security

Most users are benign and sensible
- On the other hand, you can't just treat users as adversaries
  - Some level of trust is inevitable
  - Your institution is not a prison
- Also need to take advantage of user common sense and expertise
  - A resource you can't afford to pass up

Don't blame users
- "User error" can be the end of a discussion
  - This is a poor excuse
- Almost any "user error" could be avoidable with better systems and procedures

Users as rational
- Economic perspective: users have goals and pursue them
  - They're just not necessarily aligned with security
- Ignoring a security practice can be rational if the reward is greater than the risk

Perspectives from psychology
- Users become habituated to experiences and processes
  - Learn the "skill" of clicking OK in dialog boxes
- Heuristic factors affect perception of risk
  - Level of control, salience of examples
- Social pressures can override security rules
  - "Social engineering" attacks

User attention is a resource
- Users have limited attention to devote to security
  - Exaggeration: treat it as fixed
- If you waste attention on unimportant things, it won't be available when you need it
  - Fable of the boy who cried wolf
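The "users as rational" point above is at bottom an expected-value comparison. With invented numbers (all values below are hypothetical, chosen only to make the arithmetic visible), ignoring a policy can indeed come out ahead:

```python
# Hypothetical figures for one user deciding whether to follow a
# burdensome password policy. All numbers are invented.
compliance_cost = 25.0       # annualized hassle of following the policy
breach_probability = 0.001   # chance that ignoring it causes a breach
breach_loss = 10_000.0       # user's loss if that breach happens

expected_loss = breach_probability * breach_loss   # 0.001 * 10000 = 10.0
rational_choice = ("ignore the policy" if compliance_cost > expected_loss
                   else "follow the policy")
assert rational_choice == "ignore the policy"
```

The security-design lesson is the converse: to make compliance rational, either lower its cost or make the risk (as the user perceives it) genuinely larger than that cost.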
Research: ecological validity
- User behavior with respect to security is hard to study
- Experimental settings are not like real situations
- Subjects often:
  - Have little really at stake
  - Expect experimenters will protect them
  - Do what seems socially acceptable
  - Do what they think the experimenters want

Research: deception and ethics
- Have to be very careful about the ethics of experiments with human subjects
  - Enforced by institutional review systems
- When is it acceptable to deceive subjects?
  - Many security problems naturally include deception

Outline
- Tor basics
- Tor experiences and challenges
- Usability and security
- Usable security example areas

Email encryption
- Technology became available with PGP in the early 90s
- Classic depressing study: "Why Johnny can't encrypt: a usability evaluation of PGP 5.0" (USENIX Security 1999)
- Still an open "challenge problem"
- Also some other non-UI difficulties: adoption, govt. policy

Phishing
- Attacker sends email appearing to come from an institution you trust
- Links to a web site where you type your password, etc.
- Spear phishing: individually targeted, can be much more effective

Phishing defenses
- Educate users to pay attention to X:
  - Spelling → copy from real emails
  - URL → homograph attacks
  - SSL "lock" icon → fake lock icon, or SSL-hosted attack
- Extended validation (green bar) certificates
- Phishing URL blacklists
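The homograph attacks listed above rely on lookalike characters, e.g. Cyrillic "а" (U+0430) substituted for Latin "a". A minimal detection sketch: flag hostnames containing any non-ASCII character, and show how the IDNA (punycode) form that browsers often display exposes the trick:

```python
def looks_homograph(hostname: str) -> bool:
    """Crude check: a hostname that is not pure ASCII may be spoofing
    a familiar ASCII name with lookalike Unicode characters.
    (Real browsers use much finer-grained mixed-script heuristics.)"""
    return not hostname.isascii()

spoofed = "p\u0430ypal.com"        # Cyrillic 'а': displays like 'paypal.com'
assert looks_homograph(spoofed)
assert not looks_homograph("paypal.com")

# The IDNA/punycode encoding makes the substitution visible:
punycode = spoofed.encode("idna").decode("ascii")
assert punycode.startswith("xn--")
print(punycode)
```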
SSL warnings: prevalence
- Browsers will warn on SSL certificate problems
- In the wild, most are false positives
  - foo.com vs. www.foo.com
  - Recently expired
  - Technical problems with validation
  - Self-signed certificates (HA2)
- Classic warning-fatigue danger

Older SSL warning
[Figure: screenshot of an older browser SSL certificate warning]

SSL warnings: effectiveness
- Early warnings fared very poorly in lab settings
- Recent browsers have a new generation of designs:
  - Harder to click through mindlessly
  - Persistent storage of exceptions
- Recent telemetry study: they work pretty well

Modern Firefox warning
[Figure: screenshots of the modern Firefox SSL certificate warning, parts 1-3]
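Two of the false-positive categories above (name mismatch, recent expiry) can be made concrete with a small classifier over certificate fields. This is a sketch with hypothetical field names; real browsers do full chain validation, revocation checks, and stricter wildcard rules.

```python
from datetime import datetime, timedelta
from fnmatch import fnmatch

def cert_warning(cert_names, requested_host, not_after, now):
    """Return a warning category, or None if the cert looks OK.
    cert_names: hostnames/wildcards the cert covers, e.g. ["*.foo.com"].
    Simplified: ignores chain validation, revocation, and key usage."""
    if not any(fnmatch(requested_host, name) for name in cert_names):
        return "hostname mismatch"   # e.g., foo.com cert, www.foo.com request
    if now > not_after:
        return "expired"             # in practice, often only just expired
    return None

now = datetime(2024, 1, 15)
# Cert issued only for the bare domain, but the user typed www:
assert cert_warning(["foo.com"], "www.foo.com",
                    now + timedelta(days=90), now) == "hostname mismatch"
# Cert expired two days ago:
assert cert_warning(["foo.com"], "foo.com",
                    now - timedelta(days=2), now) == "expired"
# Wildcard cert covers the subdomain: no warning
assert cert_warning(["*.foo.com"], "www.foo.com",
                    now + timedelta(days=90), now) is None
```

Note that both warning cases are exactly the benign misconfigurations the slide lists, which is why users see far more false alarms than real attacks, feeding the warning-fatigue problem.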