When foes are friends: adversarial examples as protective technologies




  1. When foes are friends: adversarial examples as protective technologies. Carmela Troncoso @carmelatroncoso, Security and Privacy Engineering Lab

  2. The machine learning revolution

  3. Machine Learning. Definition (Wikipedia): Machine learning [...] gives "computers the ability to learn without being explicitly programmed" [and] [...] explores the study and construction of algorithms that can learn from and make predictions on data – such algorithms overcome following strictly static program instructions by making data-driven predictions or decisions, through building a model from sample inputs. Data → Machine learning → Services

  4. Machine Learning. Training: Data + Expected Output → Learning / Training → Model. (Data → Machine learning → Services)

  5. Machine Learning. Training: Data + Expected Output → Learning / Training → Model. Prediction: New Data → Model → Model Output. (Data → Machine learning → Services)
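
To make the training and prediction pipeline on the two slides above concrete, here is a minimal sketch in Python. The use of scikit-learn, the RandomForestClassifier, and the toy data are illustrative assumptions, not part of the talk.

```python
from sklearn.ensemble import RandomForestClassifier

# Toy stand-in for "Data" and "Expected Output": each row is a sample,
# each column a feature, y_train holds the expected labels.
X_train = [[0, 1], [1, 1], [1, 0], [0, 0]]
y_train = [1, 1, 0, 0]

model = RandomForestClassifier().fit(X_train, y_train)  # Learning / Training -> Model

x_new = [[0, 1]]              # New Data
print(model.predict(x_new))   # Model Output
```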

  6. Adversarial machine learning: adversarial examples and poisoning

  7. Adversarial machine learning. Adversarial examples: crafted testing samples that get misclassified by an ML algorithm (New Data → Model → Output). https://ai.googleblog.com/2018/09/introducing-unrestricted-adversarial.html
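
One standard way to craft such testing-time adversarial examples in a continuous domain such as images is the fast gradient sign method (FGSM). The PyTorch sketch below is our illustrative choice, not the talk's own approach (which targets discrete domains later on); model, x and y are assumed to be given.

```python
import torch
import torch.nn.functional as F

def fgsm_example(model, x, y, eps=0.03):
    """Craft a testing-time adversarial example: perturb the input in the
    direction that increases the loss, bounded by eps in the L_inf norm."""
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x_adv), y)
    loss.backward()
    x_adv = x_adv + eps * x_adv.grad.sign()
    return x_adv.clamp(0.0, 1.0).detach()  # keep pixel values in a valid range
```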

  8. Adversarial machine learning. Poisoning: crafted training samples that change the decision boundaries of an ML algorithm (Data + Expected Output → Learning → Model)

  9. Adversarial machine learning is here to stay. Poisoning: crafted training samples that change the decision boundaries of an ML algorithm. Adversarial examples: crafted testing samples that get misclassified by an ML algorithm.

  10. [image-only slide]

  11. [image-only slide]

  12. Three ways of using adversarial examples as defensive technologies: for security, for privacy, for social justice. Adversarial examples as security metrics; adversarial examples to defend from adversarial machine learning uses.

  13. Three ways of using adversarial examples as defensive technologies: for security, for privacy, for social justice. Adversarial examples as security metrics; adversarial examples to defend from adversarial machine learning uses.

  14. Defending against adversarial examples?

  15. Defending against adversarial examples?

  16. Defending against adversarial examples? Adversarial training: training on simulated adversarial examples.

  17. Does this solve security problems? Adversarial training: training on simulated adversarial examples.

  18. Does this solve security problems? Adversarial training: training on simulated adversarial examples. [Figure: random noise vs. images]
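
A hedged sketch of what "training on simulated adversarial examples" can look like in practice, reusing the hypothetical fgsm_example helper from the earlier sketch; the 50/50 mix of clean and adversarial loss is an assumption, not the talk's recipe.

```python
import torch.nn.functional as F

def adversarial_training_step(model, optimizer, x, y, eps=0.03):
    """One adversarial-training step: craft adversarial versions of the batch
    and fit the model on both the clean and the perturbed samples."""
    x_adv = fgsm_example(model, x, y, eps)   # simulated adversarial examples
    optimizer.zero_grad()
    loss = 0.5 * F.cross_entropy(model(x), y) \
         + 0.5 * F.cross_entropy(model(x_adv), y)
    loss.backward()
    optimizer.step()
    return loss.item()
```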

  19. Does this solve security problems? Malware? Twitter bots? Spam?

  20. Does this solve security problems? Malware? Twitter bots? Spam? In security problems, examples belong to a DISCRETE and CONSTRAINED domain. FEASIBILITY, COST: how can we become the enemy?

  21. Does this solve security problems? Malware? Twitter bots? Spam? In security problems, examples belong to a DISCRETE and CONSTRAINED domain. FEASIBILITY, COST: how can we become the enemy?

  22. Does this solve security problems? Malware? Twitter bots? Spam? In security problems, examples belong to a DISCRETE and CONSTRAINED domain. FEASIBILITY, COST: how can we become the enemy?

  23. Our approach: search as a graph. A node is an account's feature vector (Number of followers: x, Age of account: y) fed to the ML classifier.

  24. Our approach: search as a graph. Start node: Number of followers: few, Age of account: new.

  25. Our approach: search as a graph. Start node: (followers: few, age: new). Children after one transformation: (followers: few, age: 1 year), (followers: few, age: 2 years), (followers: few, age: 3 years).

  26. Our approach: search as a graph. Start node: (followers: few, age: new). Children: (followers: few, age: 1 year), (followers: few, age: 2 years), (followers: few, age: 3 years). Grandchildren of (followers: few, age: 1 year): (followers: some, age: 1 year), (followers: many, age: 1 year).

  27. Our approach: search as a graph. [Same graph, with one child also varying Number of tweets: few.] In security problems, examples belong to a DISCRETE and CONSTRAINED domain. FEASIBILITY, COST: how can we become the enemy?

  28. Our approach: search as a graph. Edge costs from the start node (followers: few, age: new): $2 to (age: 1 year), $4 to (age: 2 years), $7 to (age: 3 years); from (age: 1 year): $2 to (followers: many), $8 to (followers: some). Path costs: cost = $2 + $2 = $4 and cost = $2 + $8 = $10.

  29. Our approach: search as a graph. [Same cost-annotated graph as the previous slide.] In security problems, examples belong to a DISCRETE and CONSTRAINED domain. FEASIBILITY, COST: how can we become the enemy?
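
A minimal sketch of the "search as a graph" idea on the slides above: nodes are account feature vectors, edges are feasible single transformations with dollar costs, and a uniform-cost search returns the cheapest sequence of changes that evades the classifier. The transformation table, the costs, and the is_misclassified callable are illustrative assumptions mirroring the slide, not the actual attack model.

```python
import heapq

# Illustrative single-step transformations with made-up dollar costs,
# mirroring the slide's graph: (feature, old value, new value) -> cost.
TRANSFORMATIONS = {
    ("age", "new", "1 year"): 2,
    ("age", "new", "2 years"): 4,
    ("age", "new", "3 years"): 7,
    ("followers", "few", "many"): 2,
    ("followers", "few", "some"): 8,
}

def neighbours(account):
    """Children of a node: every feasible single transformation of the account."""
    for (feature, old, new), cost in TRANSFORMATIONS.items():
        if account.get(feature) == old:
            child = dict(account)
            child[feature] = new
            yield tuple(sorted(child.items())), cost

def cheapest_evasion(start, is_misclassified):
    """Uniform-cost search for the cheapest transformation sequence that flips
    the target classifier's decision. is_misclassified is a hypothetical
    callable wrapping the ML model."""
    frontier = [(0, tuple(sorted(start.items())))]
    visited = set()
    while frontier:
        cost, node = heapq.heappop(frontier)
        if node in visited:
            continue
        visited.add(node)
        if is_misclassified(dict(node)):
            return cost, dict(node)
        for child, step_cost in neighbours(dict(node)):
            if child not in visited:
                heapq.heappush(frontier, (cost + step_cost, child))
    return None  # no feasible adversarial example found

# Example, starting from the slide's "few followers, new account" node:
# cheapest_evasion({"followers": "few", "age": "new"}, is_misclassified)
```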

  30. The graph approach comes with more advantages. Enables the use of graph theory to EFFICIENTLY find adversarial examples (A*, beam search, hill climbing, etc.). CAPTURES most attacks in the literature! (comparison base)

  31. The graph approach comes with more advantages. Enables the use of graph theory to EFFICIENTLY find adversarial examples (A*, beam search, hill climbing, etc.). CAPTURES most attacks in the literature! (comparison base). Find MINIMAL COST adversarial examples if: (1) the discrete domain is a subset of R^m, for example categorical one-hot encoded features: [0 1 0 0]; (2) the cost of each single transformation is an L_p norm, for example L_∞([0 1 0 0], [1 0 0 0]) = 1; (3) we can compute pointwise robustness for the target classifier over R^m.

  32. A* search with a heuristic: pointwise robustness over R^m.

  33. A* search with a heuristic.

  34. A* search with a heuristic.
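
A generic sketch of A* over such a transformation graph. The heuristic is assumed to lower-bound the remaining cost to reach an adversarial example (the slides derive it from pointwise robustness of the classifier over R^m), which is what makes the first adversarial node popped from the frontier minimal-cost. All callables are placeholders, not the paper's implementation.

```python
import heapq
from itertools import count

def a_star(start, is_adversarial, neighbours, heuristic):
    """A* search over the transformation graph. heuristic(node) must be an
    admissible lower bound on the cost still needed to reach an adversarial
    example, e.g. derived from pointwise robustness over R^m."""
    tie = count()  # breaks ties so heapq never compares nodes directly
    frontier = [(heuristic(start), next(tie), 0, start)]
    best_cost = {start: 0}
    while frontier:
        _, _, cost, node = heapq.heappop(frontier)
        if is_adversarial(node):
            return cost, node  # minimal-cost adversarial example
        for child, step_cost in neighbours(node):
            new_cost = cost + step_cost
            if new_cost < best_cost.get(child, float("inf")):
                best_cost[child] = new_cost
                heapq.heappush(
                    frontier,
                    (new_cost + heuristic(child), next(tie), new_cost, child),
                )
    return None
```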

  35. Minimal adversarial example found!

  36. Minimal adversarial example found!

  37. Minimal adversarial example found! Confidence of the example.

  38. Is this really efficient? l = d → just flip the decision.

  39. Is this really efficient? l > d → high-confidence example.

  40. Is this really efficient? l > d → high-confidence example.

  41. Why is this relevant? MINIMAL COST adversarial examples can become security metrics! Cost can be associated with RISK. We cannot stop attacks, but can we ensure they are expensive? In constrained security domains, continuous-domain approaches can be very conservative!

  42. Applicability. Non-linear models: approximate heuristics; transferability (basic attacks transfer worse than high-confidence ones). Faster search: trade off guarantees vs. search speed (e.g., hill climbing; see the sketch below). Other cost functions.
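
As a contrast to A*, a hill-climbing sketch trades the minimal-cost guarantee for speed, as the slide suggests. The score function is a hypothetical stand-in (for example, the target classifier's confidence in the wrong class penalised by the cost spent so far); neighbours is assumed to yield (child, cost) pairs as in the earlier sketch.

```python
def hill_climb(start, score, neighbours, max_steps=100):
    """Greedy local search: repeatedly move to the best-scoring child.
    Much faster than A*, but with no guarantee that the example found
    is minimal-cost, or that one is found at all."""
    current, best = start, score(start)
    for _ in range(max_steps):
        improved = False
        for child, _cost in neighbours(current):
            s = score(child)
            if s > best:
                current, best, improved = child, s, True
        if not improved:
            break  # local optimum reached
    return current
```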

  43. Three ways of using adversarial examples as defensive technologies: for security, for privacy, for social justice. Adversarial examples as security metrics; adversarial examples to defend from adversarial machine learning uses.

  44. Adversarial examples are only adversarial when you are the algorithm!

  45. Machine learning as a privacy adversary. Privacy-oriented literature: avoid that the service learns about the data (Data → ML → Service).

  46. Machine learning as a privacy adversary. Privacy-oriented literature: avoid that the service learns about the data (Data → ML → Service). Users actively (maybe not willingly) provide data; solutions like differential privacy and encryption are suitable.

  47. Machine learning as a privacy adversary. Privacy-oriented literature: avoid that the service learns about the data (Data → ML → Service). But with no active sharing, we cannot count on those solutions.

  48. Adversarial examples as privacy defenses: Data → Inferences.

  49. Adversarial examples as privacy defenses (Data → Inferences). Goal: modify the data to avoid inferences. Social network data, browsing patterns, traffic patterns, location, …

  50. Adversarial examples as privacy defenses (Data → Inferences). Goal: modify the data to avoid inferences. Social network data, browsing patterns, traffic patterns, location, … Privacy solutions are also CONSTRAINED in FEASIBILITY and COST!

  51. Protecting from traffic analysis: Encrypted traffic trace → ML → Apps, webs, words, …

  52. Protecting from traffic analysis: Encrypted traffic trace → ML → Apps, webs, words, … FEASIBLE perturbations: add packets, delay packets. [Search graph: children such as "add 1 packet in position 1", "add 1 packet in position 2", …, "add 1 packet in position N"; deeper nodes such as "add 2 packets in position 1", "add 3 packets in position 2", …]
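
A toy sketch of the two feasible perturbations on this slide, with a traffic trace modelled simply as a list of packet sizes; real traces and their cost model (extra bandwidth, added latency) are of course richer than this, so treat it only as an illustration of the graph's transformations.

```python
def add_packet(trace, position, size=1):
    """Feasible perturbation 1: insert a dummy packet at the given position."""
    return trace[:position] + [size] + trace[position:]

def delay_packet(trace, position, slots=1):
    """Feasible perturbation 2: move the packet at `position` later in the trace."""
    new = list(trace)
    pkt = new.pop(position)
    new.insert(min(position + slots, len(new)), pkt)
    return new

trace = [512, 1500, 1500, 64]           # toy encrypted traffic trace (packet sizes)
print(add_packet(trace, 1))             # child node: add 1 packet in position 1
print(delay_packet(trace, 0, slots=2))  # child node: delay the first packet
```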
