1. Information Flow
Gang Tan, Penn State University
Spring 2019, CMPSC 447: Software Security

2. Information Flow
- Many security requirements can be formulated as what information flow is allowed/disallowed
  - E.g., confidential data should not flow to unauthorized personnel
  - E.g., data from untrusted sources should not taint a database that stores data from trusted sources
- An information flow policy
  - Provides data with labels
    - E.g., confidential data vs. non-confidential data
    - E.g., low-integrity data vs. high-integrity data
  - Specifies how labeled data can/cannot flow in a system
    - E.g., confidential data cannot flow to the network

3. Multi-Level Security (MLS)
- Used in the military to classify personnel and data
- Label L = 〈rank, compartment〉
- Dominance relation
  - L1 ≤ L2 iff rank1 ≤ rank2 and compartment1 ⊆ compartment2
  - Example: 〈Unclassified, Iraq〉 ≤ 〈Secret, CENTCOM〉
- Mathematically, the relation forms a lattice
- Subjects: users or processes
  - Label(S) = clearance of S
- Objects: documents or resources
  - Label(O) = classification of O
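The dominance relation above can be sketched in a few lines of Python (a hypothetical illustration, not the course's code; the rank table and the `dominates` helper are assumptions):

```python
# Sketch of the MLS dominance relation: a label is a (rank, compartments) pair.
RANKS = {"Unclassified": 0, "Confidential": 1, "Secret": 2, "Top Secret": 3}

def dominates(l1, l2):
    """L1 <= L2 iff rank1 <= rank2 and compartments1 is a subset of compartments2."""
    (rank1, comps1), (rank2, comps2) = l1, l2
    return RANKS[rank1] <= RANKS[rank2] and set(comps1) <= set(comps2)

# A <Top Secret, {CAT, DOG}> label dominates <Secret, {CAT}>:
assert dominates(("Secret", {"CAT"}), ("Top Secret", {"CAT", "DOG"}))

# Incomparable labels: neither dominates the other (the relation is a
# partial order, which is why the labels form a lattice, not a line).
assert not dominates(("Top Secret", {"CAT"}), ("Secret", {"CAT", "DOG"}))
assert not dominates(("Secret", {"CAT", "DOG"}), ("Top Secret", {"CAT"}))
```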

4. MLS Example: A Linear Lattice for Confidentiality
- Unclassified ≤ Confidential ≤ Secret ≤ Top Secret (TS)
- Information flows up, but not down!

5. Bell-LaPadula Model
- Simple Security Condition: subject S can read object O if and only if L(O) ≤ L(S)
  - Example: a person with Top Secret clearance can read Secret files
  - "No read up"
- *-Property (Star Property): subject S can write object O if and only if L(S) ≤ L(O)
  - Example: a person with Secret clearance can write Top Secret files
  - "No write down"
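Both Bell-LaPadula conditions can be sketched directly against the linear lattice from the previous slide (a minimal illustration; the `leq`, `can_read`, and `can_write` names are assumptions, not the model's notation):

```python
# The linear confidentiality lattice from the MLS example.
LEVELS = ["Unclassified", "Confidential", "Secret", "Top Secret"]

def leq(l1, l2):
    """l1 <= l2 in the linear lattice order."""
    return LEVELS.index(l1) <= LEVELS.index(l2)

def can_read(subject_level, object_level):
    # Simple Security Condition ("no read up"): L(O) <= L(S).
    return leq(object_level, subject_level)

def can_write(subject_level, object_level):
    # *-Property ("no write down"): L(S) <= L(O).
    return leq(subject_level, object_level)

assert can_read("Top Secret", "Secret")       # reading down is fine
assert not can_read("Secret", "Top Secret")   # no read up
assert can_write("Secret", "Top Secret")      # writing up is fine
assert not can_write("Top Secret", "Secret")  # no write down
```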

6. Integrity [Biba 1977]
- Information flow can be used to enforce integrity as well as confidentiality
- Integrity is the dual of confidentiality
- Labels: low-integrity (tainted) vs. high-integrity
- Tainted data should not flow to places that demand high-integrity data
[Diagram: information may flow from high integrity to low integrity, but not the reverse]

7. What about both Confidentiality and Integrity?
- Idea: combine the two lattices {Unclassified ≤ Secret} (confidentiality) and {High integrity → Low integrity} (integrity)
- The product lattice has four labels: 〈Secret, Low integrity〉 at the top, 〈Secret, High integrity〉 and 〈Unclassified, Low integrity〉 in the middle (incomparable), and 〈Unclassified, High integrity〉 at the bottom
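One way to read the combined lattice is as a product order: a label may flow to another iff confidentiality rises (or stays the same) and integrity falls (or stays the same). A sketch under that reading (the dictionaries and the `may_flow` helper are assumptions for illustration):

```python
# Product of the two-point confidentiality and integrity lattices.
CONF = {"Unclassified": 0, "Secret": 1}   # information flows up
INTEG = {"Low": 0, "High": 1}             # information flows from High to Low

def may_flow(a, b):
    """Label a = (conf, integ) may flow to label b iff confidentiality
    does not decrease and integrity does not increase."""
    (c1, i1), (c2, i2) = a, b
    return CONF[c1] <= CONF[c2] and INTEG[i1] >= INTEG[i2]

labels = [(c, i) for c in CONF for i in INTEG]  # the four combined labels

# ("Unclassified", "High") is the bottom of the flow order: it may flow anywhere.
assert all(may_flow(("Unclassified", "High"), l) for l in labels)
# ("Secret", "Low") is the top: everything may flow to it.
assert all(may_flow(l, ("Secret", "Low")) for l in labels)
# The two middle labels are incomparable.
assert not may_flow(("Secret", "High"), ("Unclassified", "Low"))
assert not may_flow(("Unclassified", "Low"), ("Secret", "High"))
```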

8. Compartments
- Provide finer-grained security labels
- Suppose TOP SECRET is divided into <TOP SECRET, CAT> and <TOP SECRET, DOG>
  - Both are TOP SECRET, but information flow is restricted between the two kinds
  - A user with <TOP SECRET, CAT> clearance cannot access data with <TOP SECRET, DOG>, and vice versa
- Compartments are designed to enforce the need-to-know principle
  - Even with a top secret clearance, a user cannot access all top secret data

9. Compartments Example
- Arrows indicate allowed information flow
[Lattice diagram with nodes <TOP SECRET, CAT & DOG>, <TOP SECRET, CAT>, <TOP SECRET, DOG>, <TOP SECRET>, <SECRET, CAT & DOG>, <SECRET, CAT>, <SECRET, DOG>, <SECRET>; information flows toward higher ranks and larger compartment sets]
- Not all classifications are comparable, e.g., <TOP SECRET, CAT> vs. <SECRET, CAT & DOG>

10. Controlling Information Flow
* Some slides borrowed from Shmatikov

11. Mixing Information of Multiple Levels
- Systems (programs) often mix information of different security levels
  - E.g., an OS manages both secret and public documents and is shared by users of different clearances
- Q: how do we know such a system respects multi-level security?
  - That is, that information at a higher level does not flow to information at a lower level

12. Noninterference
- A system is modeled as a black box with some inputs and outputs
- Each input/output has a security level
- Noninterference requires
  - Output at a lower level does not depend on input at a higher level
  - Changing higher-level input won't change lower-level output
[Diagram: an OS holds both secret and public documents; one output is seen by users with secret clearance, another by users with public clearance]

13. Noninterference for a Two-Point Lattice (High/Low)
- High: cannot be observed by the attacker before, after, or during execution
- Low: can be observed by the attacker before and after the execution, but not during
- Some of these are inputs and some are outputs
  - H: high input; L_in: low input; L_out: low output (of program P)
- Example: web server
  - High input: the server's private key
  - Low input: user requests to access webpages
  - Low output: returned webpages

14. Noninterference for a Two-Point Lattice (High/Low)
- Noninterference
  - Low output does not depend on high input
  - No matter what the high input is, the system returns the same low output for the same low input
  - E.g., no matter what the private key is, the web server returns the same information for the same user webpage request
- As a result, the attacker learns no information about the high input by observing the low output
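The definition above suggests a direct (if brute-force) check on a toy system: fix the low input, vary the high input, and see whether the low output changes. A sketch, where both example "systems" and the checker are hypothetical:

```python
def secure_system(high, low):
    # Noninterfering: the low output ignores the high input entirely.
    return low * 2

def leaky_system(high, low):
    # Violates noninterference: the low output encodes one bit of high.
    return low * 2 + (1 if high > 0 else 0)

def noninterfering(f, highs, lows):
    """f satisfies noninterference on these test inputs iff, for each low
    input, the low output is identical across all high inputs."""
    return all(len({f(h, l) for h in highs}) == 1 for l in lows)

assert noninterfering(secure_system, highs=[-1, 0, 5], lows=[1, 2, 3])
assert not noninterfering(leaky_system, highs=[-1, 0, 5], lows=[1, 2, 3])
```

This only tests finitely many inputs, of course; the point of static enforcement (discussed in the following slides) is to guarantee the property for all inputs.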

15. Challenges for Enforcement
- Goal: low output should not depend on high input
  - An end-to-end security policy
- Enforcement challenges
  - Various channels for information flow (e.g., implicit flows)
  - Need to track information flow inside P
  - Declassification: downgrading information

16. Explicit Flows
- Direct transfer of information
  - E.g., via copying: in "x = y", the information of y is copied to x
  - E.g., via writing to displays, files, sockets, etc., which are visible to the attacker

17. Explicit Flows Example
- Example about confidentiality
  String hi; // security label secret
  String lo; // security label public
- Which program fragments (may) cause problems if hi has to be kept confidential?
  1. hi = lo;
  2. lo = hi;
  3. lo = "1234";
  4. hi = "1234";
  5. println(lo);
  6. println(hi);

18. Implicit Flows
- Indirect transfer of information via the control flow of a program
- Information in a variable x may be correlated with y's information due to control flow
- Example: if (hi > 0) lo = 100; else lo = 1000;
  - Note there are no explicit flows such as "lo = hi"; only constants are assigned to lo
  - But at the end of the program, lo's value is correlated with hi's value
  - By observing lo, an adversary can infer information about hi!
- We call this an implicit flow
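The slide's example, transcribed into runnable form, makes the leak concrete: no statement ever copies hi into lo, yet an observer of lo learns the sign of hi.

```python
def implicit_leak(hi):
    # Only constants are ever assigned to lo; there is no "lo = hi".
    if hi > 0:
        lo = 100
    else:
        lo = 1000
    return lo

# Yet observing lo reveals one bit of the secret hi:
assert implicit_leak(7) == 100     # lo == 100  =>  hi > 0
assert implicit_leak(-3) == 1000   # lo == 1000 =>  hi <= 0
```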

19. Yet There Are Other Channels
- Termination channel: if (hi > 0) infinite_loop();
  - If the attacker can observe the program's termination behavior, then there is a leakage
- Side channels
  - E.g., timing behavior: if (hi > 0) run_1000_cycles(); else run_1_cycle();
- An end-to-end enforcement would require us to control all these possible channels

20. Enforcement Challenge: Must Track Information Flow Internally
- The input flows inside a system through intermediate variables and memory
- To prevent high input from flowing to low output, we must also track internally how information flows
  - E.g., need to know that x contains hi's information in "lo = x":
    x = hi; lo = x;
  - E.g., if (hi > 0) x = 0; else x = 1; lo = x;

21. Tracking Information Flow Inside a System
- Static enforcement
  - E.g., via a type system
- Dynamic enforcement
  - Straightforward for dynamically tracking explicit flows (taint tracking)
  - Much harder for the case of implicit flows
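Dynamic taint tracking for explicit flows can be sketched with a small tainted-value wrapper (a hypothetical illustration; the `Tainted` class is an assumption, not a real library):

```python
class Tainted:
    """A value carrying a taint bit; mixing values propagates the taint."""
    def __init__(self, value, tainted=True):
        self.value, self.tainted = value, tainted

    def __add__(self, other):
        # The result is tainted if either operand is.
        other_value = other.value if isinstance(other, Tainted) else other
        other_taint = other.tainted if isinstance(other, Tainted) else False
        return Tainted(self.value + other_value, self.tainted or other_taint)

hi = Tainted(42, tainted=True)   # secret input
lo = Tainted(7, tainted=False)   # public input

x = hi + 0         # explicit flow through an intermediate variable
assert x.tainted   # the taint bit travels with the copy

y = lo + 1
assert not y.tainted

# Implicit flows escape this scheme: branching on hi.value and assigning
# an untainted constant loses the taint bit, which is why the slide calls
# the implicit-flow case much harder for dynamic enforcement.
```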

22. Jif [Myers]
- Jif: Java with static information flow control
- Jif augments Java types with labels
  - int {Alice:Bob} x;
  - Object {L} o;
- Subtyping follows the lattice order
- Type inference
  - Programmers may omit types; Jif will infer them from how values are used in expressions

23. Implicit Flows (1)
  int{Alice:} a;        // PC label: {}
  int{Bob:} b;
  ...
  if (a > 0) then {     // PC label becomes {Alice:}
    b = 4;
  }
- This assignment leaks information contained in the program counter (PC): which branch executes depends on a, so assigning to b reveals information about a

24. Implicit Flows (2) [Zdancewic]
  int{Alice:} a;        // PC label: {}
  int{Bob:} b;
  ...
  if (a > 0) then {     // PC label becomes {Alice:}
    b = 4;              // rejected
  }
- To assign to a variable with label L, must have PC ⊑ L
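The rule on this slide can be sketched on a simple two-point lattice (a hedged illustration: Jif's actual labels are decentralized {owner:reader} policies, and `assign_allowed` is an assumed helper, not Jif's API):

```python
# Two-point lattice standing in for Jif's label lattice.
ORDER = {"public": 0, "secret": 1}

def assign_allowed(pc_label, var_label):
    """Jif-style rule: assignment to a variable labeled L is permitted
    only if the current PC label flows to L (PC <= L)."""
    return ORDER[pc_label] <= ORDER[var_label]

# Inside "if (a > 0)" where a is secret, the PC label is secret, so
# assigning to a public variable (the b = 4 above) is rejected:
assert not assign_allowed("secret", "public")

# Assignments that stay at or above the PC label are fine:
assert assign_allowed("public", "secret")
assert assign_allowed("public", "public")
```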

25. Jif Caveats
- No threads
  - Information flow is hard to control with threads
  - Active area of current research
- Timing channels not controlled
  - Explicit choice for practicality
- Differences from Java
  - Some exceptions are fatal
  - Restricted access to some system calls

26. Enforcement Challenge: Declassification
- In realistic systems, disallowing all information flow from a higher level to a lower level is too prohibitive
- Very often, information needs to be declassified to a lower level
- Jif requires explicit declassification by programmers

27. Declassification Example
- A password-checking system
  - H (high input): the real password
  - L_in (low input): the user-input password
  - L_out (low output): yes/no
- The low output does depend on the real password
  - It reveals some info about the real password: namely, whether the user-input password is correct or not
- However, the amount of information flow is extremely small, so we can declassify that output
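The password checker can be sketched as follows (hypothetical code: in Jif, `declassify` is an explicit annotation checked by the compiler, whereas here it is just a marker showing where the deliberate downgrade happens):

```python
def declassify(value):
    # Marks the point where a high-labeled value is deliberately
    # downgraded to low; a no-op at runtime.
    return value

def check_password(real_password, user_input):
    # The comparison result depends on the high input (real_password),
    # but only by one bit, so the programmer declassifies it.
    return declassify(real_password == user_input)

assert check_password("hunter2", "hunter2") is True
assert check_password("hunter2", "guess") is False
```

Without the declassification, a static checker would reject this program outright, since the yes/no output provably depends on the high input; the annotation records the programmer's judgment that this one-bit flow is acceptable.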
