"Security requires a particular mindset. Security professionals -- at least the good ones -- see the world differently. They can't walk into a store without noticing how they might shoplift. They can't use a computer without wondering about the security vulnerabilities. They can't vote without trying to figure out how to vote twice. They just can't help it […] This kind of thinking is not natural for most people. It's not natural for engineers. Good engineering involves thinking about how things can be made to work; the security mindset involves thinking about how things can be made to fail. It involves thinking like an attacker, an adversary or a criminal. You don't have to exploit the vulnerabilities you find, but if you don't see the world that way, you'll never notice most security problems." -- Bruce Schneier
It seems easy ◦ "How hard can it be?"
Hard to test ◦ Working implementations don't mean a lot
Moving target ◦ Attack technologies advance; what is secure today may be completely inappropriate tomorrow
Subtle issues ◦ Security "bugs" can be extremely subtle
Interaction of various complex subsystems ◦ Hard enough to get right without an attacker
Multi-disciplinary ◦ Need to take into account user actions, economics
If you fail, really bad things can happen
The Economist, Sept. 2011
Successful "Anonymous" attacks in 2011
◦ Fine Gael website, Government of Tunisia, National Democratic Party of Egypt, HBGary Federal, Westboro Baptist Church, Sony, …
◦ Most of these were 'simple' attacks on well-known weaknesses
Moving target ◦ Attack technologies advance; what is secure today may be completely inappropriate tomorrow
Standard privacy approach (EU Directive 95/46/EC): separate all personally identifiable information from the data; the remaining data is then considered safe to use.
Approach:
◦ Anonymise everything possible
◦ Get user consent for everything else
◦ Implement strict controls inside the companies
◦ Tough contracts with third parties
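As a rough illustration of the "anonymise everything possible" step, here is a minimal Python sketch (field names and records are invented) that strips direct identifiers from subscriber records before release. The Netflix case below shows why this alone is not enough: the remaining quasi-identifiers, such as dated ratings, can still single people out.

# Minimal sketch of naive anonymisation: strip the direct identifiers and
# release the rest. Field names and records are invented for illustration.
DIRECT_IDENTIFIERS = {"name", "email", "customer_id"}

def anonymise(record: dict) -> dict:
    """Return a copy of the record without direct identifiers."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

subscribers = [
    {"name": "Alice", "email": "a@example.org", "customer_id": 17,
     "ratings": [("The Astro-Zombies", 4, "2005-03-12")]},
]

released = [anonymise(r) for r in subscribers]
print(released[0])  # no name or email left, but the dated ratings remain
# The dated ratings are quasi-identifiers: combined with an outside source
# (e.g. public IMDb reviews) they can still re-identify the subscriber.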
Netflix: online movie rental service
In October 2006, Netflix released real movie ratings of 500,000 subscribers
◦ About 10% of all Netflix users as of late 2005
◦ Names removed
◦ Information may be perturbed
◦ Numerical ratings as well as dates
◦ Average user rated over 200 movies
Task: predict how a user will rate a movie
◦ Beat Netflix's own algorithm (called Cinematch) by 10%
◦ You get 1 million dollars
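To make the "beat Cinematch by 10%" criterion concrete, here is a small sketch of how such an improvement is usually measured as a relative reduction in root-mean-square error (RMSE); the rating vectors below are invented toy numbers, not real contest data.

import math

def rmse(predicted, actual):
    """Root-mean-square error between two equal-length rating lists."""
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual))

# Toy numbers, purely illustrative.
actual     = [4, 3, 5, 2, 4]
cinematch  = [3.8, 3.4, 4.2, 2.9, 3.6]   # stand-in for the baseline's predictions
challenger = [4.1, 3.1, 4.7, 2.4, 3.9]   # stand-in for a contestant's predictions

baseline = rmse(cinematch, actual)
ours     = rmse(challenger, actual)
improvement = (baseline - ours) / baseline
print(f"baseline RMSE {baseline:.3f}, challenger RMSE {ours:.3f}, "
      f"relative improvement {improvement:.1%}")   # the prize required >= 10%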
Most popular movie: rated by almost half the users!
Least popular: 4 users
Most users rate movies outside the top 100/500/1000
The auxiliary (IMDb) data is extremely noisy, and some data is missing
Most IMDb users are not in the Netflix dataset
From the Netflix record of one IMDb user we learn things that are not in his IMDb profile
The average subscriber has 214 dated ratings
Two ratings are enough to reduce the match to 8 candidate records
Four ratings are enough to identify a subscriber uniquely (on average)
Works even better with relatively rare ratings: "The Astro-Zombies" rather than "Star Wars"
The fat-tail effect helps here: most people watch obscure movies (really!)
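A minimal sketch of the kind of linkage attack this enables, loosely in the spirit of the Narayanan–Shmatikov Netflix de-anonymisation: score each released record against an auxiliary record (say, ratings scraped from public IMDb reviews) by counting approximately matching (movie, rating, date) triples, then take the best-scoring record. The scoring rule, tolerances, and data are invented for illustration and are not the paper's actual algorithm.

from datetime import date

# Anonymised Netflix-style records: id -> list of (movie, rating, date). Toy data.
released = {
    "r1": [("The Astro-Zombies", 4, date(2005, 3, 12)), ("Star Wars", 5, date(2005, 3, 20))],
    "r2": [("Star Wars", 5, date(2005, 7, 1)), ("Titanic", 3, date(2005, 7, 2))],
}

# Auxiliary knowledge about one person, e.g. taken from public IMDb reviews.
aux = [("The Astro-Zombies", 4, date(2005, 3, 14))]

def score(record, aux, rating_tol=1, day_tol=14):
    """Count auxiliary ratings that approximately match a released record."""
    hits = 0
    for movie, rating, when in aux:
        for m, r, d in record:
            if m == movie and abs(r - rating) <= rating_tol and abs((d - when).days) <= day_tol:
                hits += 1
                break
    return hits

best = max(released, key=lambda rid: score(released[rid], aux))
print("best matching anonymised record:", best)   # 'r1' links back to the person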
Subtle issues ◦ Security "bugs" can be extremely subtle
Example: "Using Memory Errors to Attack a Virtual Machine", S. Govindavajhala and A. Appel, 2003. If you don't know about such attacks, you won't defend against them.
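The paper's core idea, very roughly: fill memory with many references to attacker-controlled objects, then wait for (or induce) a random memory error; a single flipped bit in a reference is then likely to land on an attacker object of the wrong type, breaking the virtual machine's type safety. The Python below is only a toy analogy of that probability argument (an integer index standing in for a pointer, a list standing in for the heap); it is not the actual JVM attack.

import random

HEAP_SIZE = 1 << 16          # toy "heap" of 65536 slots, indexed by a 16-bit "pointer"
ATTACKER_FILL = 0.7          # the attacker fills most of memory with her own objects

heap = ["attacker_obj" if random.random() < ATTACKER_FILL else "other_obj"
        for _ in range(HEAP_SIZE)]
heap[0] = "other_obj"        # our benign reference initially points at harmless data
benign_pointer = 0

def flip_random_bit(pointer: int, bits: int = 16) -> int:
    """Simulate a memory error: flip one random bit of the 'pointer'."""
    return pointer ^ (1 << random.randrange(bits))

trials = 10_000
hits = sum(heap[flip_random_bit(benign_pointer)] == "attacker_obj" for _ in range(trials))
print(f"a single bit flip redirects the reference to an attacker object "
      f"in {hits / trials:.0%} of trials")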
Interaction of various complex subsystems ◦ Hard enough to get right without an attacker
Examples: the London Ambulance Service dispatch-system disaster (1992) and Ariane 5 Flight 501 (1996), both failures of interacting complex subsystems even without an attacker.
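For Ariane 501, the failure is commonly attributed to an unprotected conversion of a 64-bit floating-point value (the horizontal bias) to a 16-bit signed integer in software reused from Ariane 4; on Ariane 5's trajectory the value no longer fit, the conversion raised an unhandled operand error, and the inertial reference system shut down. The sketch below only illustrates why narrowing a large value to 16 bits goes wrong (emulated here as silent wraparound, since Python integers do not overflow); the numeric values are invented.

def to_int16(value: float) -> int:
    """Emulate a narrowing conversion to a signed 16-bit integer (silent wraparound)."""
    v = int(value) & 0xFFFF
    return v - 0x10000 if v >= 0x8000 else v

# One value that fits the old flight profile's assumptions, and one that does not.
ariane4_like = 25_000.0     # fits in a signed 16-bit integer (max 32767)
ariane5_like = 48_000.0     # does not fit; illustrative, not the real flight value

print(to_int16(ariane4_like))   # 25000  -> conversion looks fine
print(to_int16(ariane5_like))   # -17536 -> nonsense value after wraparound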
Multi-disciplinary ◦ Need to take into account user actions, economics
"What is new, however, is that both the password with which it can be opened and the file itself can now be found on the net. Der Freitag tried to reconstruct how this came about. According to his own account, Wikileaks founder Julian Assange passed the password on to an undisclosed person he trusted. This person published it, supposedly believing it was only a temporary password. The newspaper writes that the password consists of a "phrase" and was "identifiable for connoisseurs of the matter"." The Asian Sun, 28.9.11
Security is never a goal in itself; it always serves another goal.
◦ Must integrate with the business case, economics, usability, …
Technology alone doesn't help.
◦ Other aspects matter just as much: procedures, training, incident response, legal protection, continuity plans, …
Not understanding the technology is a road to failure.
◦ It is the glue between the other blocks, and your attacker will understand it.