Privacy engineering, privacy by design, privacy impact assessments, and privacy governance
Lorrie Faith Cranor • November 11, 2014
8-533 / 8-733 / 19-608 / 95-818: Privacy Policy, Law, and Technology
CyLab Usable Privacy and Security Laboratory • Engineering & Public Policy • HTTP://CUPS.CS.CMU.EDU 1
Today’s agenda • Quiz • Questions/comments about the readings • Discussion about the midterm • Privacy engineering • Privacy by design • Privacy impact assessments • Privacy governance 2
By the end of class you will be able to: • Understand how to apply various approaches to privacy engineering and privacy by design to design problems 3
Privacy by policy vs. architecture • What techniques are used in each approach? • What are the advantages and disadvantages of each approach? 4
How privacy rights are protected • By policy – Protection through laws and organizational privacy policies – Must be enforced – Transparency facilitates choice and accountability – Technology facilitates compliance and reduces the need to rely solely on trust and external enforcement – Violations still possible due to bad actors, mistakes, government mandates • By architecture – Protection through technology – Reduces the need to rely on trust and external enforcement – Violations only possible if technology fails or the availability of new data or technology defeats protections – Often viewed as too expensive or restrictive 5
What system features tend to lead to more or less privacy? 6
[Figure: privacy by policy (through FIPs) vs. privacy by architecture, positioned along two axes: degree of person identifiability (low to high) and degree of network centricity (low to high)] 7
Privacy by policy techniques • Notice • Choice • Security safeguards • Access • Accountability – Audits – Privacy policy management technology • Enforcement engine 8
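To make the "enforcement engine" idea above concrete, here is a minimal sketch of a purpose-based policy check in Python. The data categories, purposes, and function names are illustrative assumptions, not part of any particular product or the lecture material.

```python
# Minimal sketch of a privacy policy enforcement check (hypothetical
# purposes, data categories, and rules; not drawn from the slides).

POLICY = {
    # data category -> purposes for which use is permitted
    "email_address": {"order_confirmation", "account_recovery"},
    "purchase_history": {"order_confirmation", "fraud_detection"},
}

def is_use_permitted(data_category: str, purpose: str) -> bool:
    """Return True only if the stated purpose is allowed for this data category."""
    return purpose in POLICY.get(data_category, set())

def access(record: dict, data_category: str, purpose: str):
    """Gate every access through the policy check and log the decision for audits."""
    allowed = is_use_permitted(data_category, purpose)
    print(f"AUDIT: access to {data_category} for {purpose}: {'ALLOWED' if allowed else 'DENIED'}")
    if not allowed:
        raise PermissionError(f"{data_category} may not be used for {purpose}")
    return record[data_category]

if __name__ == "__main__":
    customer = {"email_address": "alice@example.com", "purchase_history": ["sku-123"]}
    access(customer, "email_address", "order_confirmation")   # permitted
    try:
        access(customer, "email_address", "ad_targeting")     # not permitted
    except PermissionError as err:
        print("Blocked:", err)
```

The audit log line is what makes such a check useful for the accountability and audit techniques listed above: every decision, allowed or denied, leaves a trace.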
Privacy by architecture techniques • Best – No collection of contact information – No collection of long-term person characteristics – k-anonymity with large value of k • Good – No unique identifiers across databases – No common attributes across databases – Random identifiers – Contact information stored separately from profile or transaction information – Collection of long-term personal characteristics w/ low granularity – Technically enforced deletion of profile details at regular intervals 9
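The architecture-level techniques listed above can be illustrated with a small sketch: random identifiers issued at registration, contact information kept in a separate store from transaction data, and technically enforced deletion after a retention interval. The schema and the 90-day retention period are assumptions made for the example.

```python
# Sketch of a few "privacy by architecture" techniques: random identifiers,
# contact information stored separately from transaction data, and
# technically enforced deletion of details after a fixed interval.

import secrets
from datetime import datetime, timedelta

RETENTION = timedelta(days=90)   # assumed retention period
contacts = {}                    # pseudonym -> contact info (kept separately)
transactions = {}                # pseudonym -> list of (timestamp, item)

def register(email: str) -> str:
    """Issue a random pseudonym; nothing about it is derived from the contact info."""
    pseudonym = secrets.token_hex(16)
    contacts[pseudonym] = email
    transactions[pseudonym] = []
    return pseudonym

def record_purchase(pseudonym: str, item: str) -> None:
    transactions[pseudonym].append((datetime.utcnow(), item))

def purge_expired() -> None:
    """Technically enforced deletion: drop transaction details older than RETENTION."""
    now = datetime.utcnow()
    for pseudonym, entries in transactions.items():
        transactions[pseudonym] = [(t, i) for t, i in entries if now - t < RETENTION]

pid = register("alice@example.com")
record_purchase(pid, "book")
purge_expired()   # would run on a schedule in a real system
```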
Privacy stages, with identifiability, linkability of data to personal identifiers, approach to privacy protection, and system characteristics:
Stage 0: identified; linked; privacy by policy (notice and choice)
– unique identifiers across databases
– contact information stored with profile information
Stage 1: pseudonymous; linkable with reasonable & automatable effort; privacy by policy (notice and choice)
– no unique identifiers across databases
– common attributes across databases
– contact information stored separately from profile or transaction information
Stage 2: pseudonymous; not linkable with reasonable effort; privacy by architecture
– no unique identifiers across databases
– no common attributes across databases
– random identifiers
– contact information stored separately from profile or transaction information
– collection of long-term person characteristics on a low level of granularity
– technically enforced deletion of profile details at regular intervals
Stage 3: anonymous; unlinkable; privacy by architecture
– no collection of contact information
– no collection of long-term person characteristics
– k-anonymity with large value of k 10
De-identification and re-identification • Simplistic de-identification: remove obvious identifiers • Better de-identification: also k-anonymize and/or use statistical confidentiality techniques • Re-identification can occur through linking entries within the same database or to entries in external databases 11
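A brief sketch of both ideas: a linking attack that re-identifies "de-identified" records by joining them to an external dataset on quasi-identifiers, and a check of whether a dataset satisfies k-anonymity over those quasi-identifiers. The records and the choice of ZIP code, birth date, and sex as quasi-identifiers are illustrative.

```python
# Re-identification by linking on quasi-identifiers, plus a k-anonymity check.
# All records below are fabricated for illustration.

from collections import Counter

QUASI_IDENTIFIERS = ("zip", "birthdate", "sex")

# "De-identified" health records (names removed)
health = [
    {"zip": "15213", "birthdate": "1970-07-31", "sex": "F", "diagnosis": "flu"},
    {"zip": "15217", "birthdate": "1982-01-02", "sex": "M", "diagnosis": "asthma"},
]

# Public voter list with names and the same quasi-identifiers
voters = [
    {"name": "Alice", "zip": "15213", "birthdate": "1970-07-31", "sex": "F"},
    {"name": "Bob",   "zip": "15217", "birthdate": "1982-01-02", "sex": "M"},
]

def link(records, external):
    """Join the two datasets on quasi-identifier values."""
    index = {tuple(r[q] for q in QUASI_IDENTIFIERS): r["name"] for r in external}
    matches = []
    for rec in records:
        key = tuple(rec[q] for q in QUASI_IDENTIFIERS)
        if key in index:
            matches.append((index[key], rec["diagnosis"]))
    return matches

def satisfies_k_anonymity(records, k):
    """True if every quasi-identifier combination appears at least k times."""
    counts = Counter(tuple(r[q] for q in QUASI_IDENTIFIERS) for r in records)
    return all(c >= k for c in counts.values())

print(link(health, voters))              # each "anonymous" record is re-identified
print(satisfies_k_anonymity(health, 2))  # False: every combination is unique
```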
Examples • When RFID tags are sewn into every garment, how might we use this to identify and track people? • What if the tags are partially killed so only the product information is broadcast, not a unique ID? • How can a cellular provider identify an anonymous pre-paid cell phone user? 12
Privacy by Design Principles (PbD) 1. Proactive not Reactive; Preventative not Remedial 2. Privacy as the Default Setting 3. Privacy Embedded into Design 4. Full Functionality—Positive-Sum, not Zero-Sum 5. End-to-End Security—Full Lifecycle Protection 6. Visibility and Transparency—Keep it Open 7. Respect for User Privacy—Keep it User-Centric Ann Cavoukian 13
Privacy by design Rubinstein, Ira and Good, Nathan, Privacy by Design: A Counterfactual Analysis of Google and Facebook Privacy Incidents. 28 Berkeley Technology Law Journal 1333 (2013). http://ssrn.com/abstract=2128146 or http://dx.doi.org/10.2139/ssrn.2128146 • PbD principles “more aspirational than practical or operational” • Microsoft principles outdated (ignore social media) and don’t provide insights into decision making behind “company approval” • PbD requires “translation of FIPs into engineering and design principles and practices” 14
Privacy Impact Assessment A methodology for – assessing the impacts on privacy of a project, policy, program, service, product, or other initiative which involves the processing of personal information and, – in consultation with stakeholders, for taking remedial actions as necessary in order to avoid or minimize negative impacts D. Wright and P. De Hert, eds. Privacy Impact Assessment. Springer 2012. 15
PIA is a process • Should begin at early stages of a project • Should continue to end of project and beyond 16
Why carry out a PIA?
• To manage risks
– Negative media attention
– Reputation damage
– Legal violations
– Fines, penalties
– Privacy harms
– Opportunity costs
• To derive benefits
– Increase trust
– Avoid future liability
– Early warning system
– Facilitate privacy by design early in design process
– Enforce or encourage accountability 17
Who has to carry out PIAs? • US administrative agencies, when developing or procuring IT systems that include PII – Required by E-Government Act of 2002 • Government agencies in many other countries • Sometimes done by private sector – Case studies from Vodafone, Nokia, and Siemens in PIA book 18
Data governance • People, process, and technology for managing data within an organization • Data-centric threat modeling and risk assessment • Protect data throughout information lifecycle – Including data destruction at end of lifecycle • Assign responsibility 19
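As a rough sketch of how retention rules and assigned responsibility might be enforced in code, the following (with hypothetical dataset names, owners, and retention periods) flags datasets whose retention period has elapsed so the responsible owner can destroy or re-justify them.

```python
# Hedged sketch of a data-governance retention check: a per-dataset inventory
# records an owner (assigned responsibility) and a retention period, and the
# check flags datasets that are past their scheduled destruction date.
# Dataset names, owners, and periods are assumptions for this example.

from datetime import date, timedelta

inventory = [
    {"dataset": "web_logs",        "owner": "security-team",  "created": date(2014, 1, 15), "retention_days": 180},
    {"dataset": "customer_orders", "owner": "privacy-office", "created": date(2013, 6, 1),  "retention_days": 365},
]

def datasets_due_for_destruction(inventory, today=None):
    """Return datasets whose retention period has elapsed, with the responsible owner."""
    today = today or date.today()
    due = []
    for entry in inventory:
        expires = entry["created"] + timedelta(days=entry["retention_days"])
        if today >= expires:
            due.append((entry["dataset"], entry["owner"], expires))
    return due

for name, owner, expired_on in datasets_due_for_destruction(inventory, today=date(2014, 11, 11)):
    print(f"{name}: retention expired {expired_on}; notify {owner} to destroy or re-justify")
```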
Beam discussion • https://www.youtube.com/channel/UC_Cqp2VdYp9YSQqK07bIMmQ • What privacy issues does this technology raise in the home environment? How might these issues be addressed? 20