Insider Problem and Elections


  1. Insider Problem and Elections — Matt Bishop, Computer Security Lab, Dept. of Computer Science, University of California, Davis. Matt Bishop, February 18, 2016 Slide #1

  2. Opening Thought • There's no sense in being precise when you don't even know what you're talking about. — John von Neumann

  3. What is an Insider? • Government intelligence analyst – By day: analyzes information obtained from monitoring electronic signals to determine what the adversary is up to – By night: provides this information to the adversary so they know what the analyst's government is being told

  4. Legendary Example • Greeks wanted to get inside Troy • They built a wooden horse and put soldiers inside • The Trojans pulled the horse into the city • At night, the Greeks inside the horse got out and opened the gates • The Greeks entered the city (and sacked it)

  5. Real-Life Examples • In World War II, the Abwehr sent spies to England – Germany fed them information about other spies – The British had captured all of them, turned many of them, and so got the information • But the Soviets had penetrated British counterintelligence – Kim Philby was a high-ranking British official – He was also a Soviet spy

  6. University of California, San Francisco • A transcriber in Pakistan said she would post patient records on the Internet unless UCSF helped her get money owed from "Tom", the subcontractor who hired her • Tom was subcontracted by Sonya • Sonya was subcontracted by Transcription Stat • Transcription Stat contracted with UCSF (for the past 20 years!)

  7. Defining the Insider • "an already trusted person with access to sensitive information and information systems" – Understanding the Insider Threat, RAND (2004), xi • "anyone with access, privileges, or knowledge of information systems and services" – Same report, 10 • Anyone operating inside the security perimeter – New Incident Response Best Practices: Patch and Process is No Longer Acceptable Incident Response, Guidance Software (2003)

  8. More Definitions • "legitimate users who maliciously leverage their system privileges, and familiarity and proximity to their computational environment to compromise valuable information or inflict damage" – "Towards a Theory of Insider Threat Assessment", Proc. 2008 Intl. Conf. on Dependable Systems and Networks (2008) • "a person with legitimate access to an organization's computers and networks" – "Insiders Behaving Badly: Addressing Bad Actors and Their Actions", IEEE Trans. Information Forensics and Security (2010) • "users with privileged knowledge about a system" – "Designing Host and Network Sensors to Mitigate the Insider Threat", IEEE Security & Privacy (2009)

  9. Still More Definitions • "Insider attacks—that is, attacks by users with privileged knowledge about a system" – "Designing Host and Network Sensors to Mitigate the Insider Threat," IEEE Security & Privacy (2009) • "legitimate users in an IT Infrastructure" – "Towards an Insider Threat Prediction Specification Language," Information Management & Computer Security (2006)

  10. And Still More Definitions • "a person [who] has been legitimately empowered with the right to access, represent, or decide about one or more assets of the organization's structure" – "Countering Insider Threats", Schloss Dagstuhl (2008) • "a human entity that has/had access to the information system of an organization and does not comply with the security policy of the organization" – "An Insider Threat Prediction Model", Proc. 7th Intl. Conf. on Trust, Privacy, and Security (2010)

  11. And a Final Definition • "A current or former employee, contractor, or business partner who – has or had authorized access to an organization's network, system, or data and – intentionally exceeded or misused that access in a manner that negatively affected the confidentiality, integrity, or availability of the organization's information or information systems" • Common Sense Guide to Prevention and Detection of Insider Threats, 3rd Edition — Version 3.1 (2009)

  12. Perimeters • [Diagram: a firewall marks the organization's perimeter, with insiders inside and the Internet and outsiders beyond it; mobile devices appear on both sides of the boundary]

  13. Problems • How well defined is your perimeter? – Mobile computing, especially BYOD – Virtual private networks – Remote sites – Unknown modems, etc. • How does physical access play into this? – Authorized users – Others, such as janitors

  14. Supply Chain Problem • Someone sells you a program to solve a problem your company has • When you use it, it copies data from your computer (including medical records) to a server on the Internet • So . . . Are you an insider? • And . . . Is the person who wrote it an insider? • And . . . Is the person who sold it an insider?

  15. Example: Voting Machines • Rumor: the son of a U.S. presidential candidate ran an investment firm that was rumored to have a stake in Hart InterCivic, a maker of e-voting systems used in Ohio, USA – No evidence this is true; but suppose it is • The implication is that someone in the company, on instructions from a stakeholder, could corrupt the e-voting system to deliver votes as desired – This is a supply chain attack, as the person is not an election official; so, is that person an insider? Is the stakeholder?

  16. Not Just Computer Scientists • Insider trading – In U.S. law, defined by the agency that regulates the stock exchanges (the Securities and Exchange Commission) – Extensively litigated over the years – Considerable grey area makes it difficult to know whether a particular transaction is legal or not

  17. Common Notions in Definitions • Access – Without access, nothing can happen – Access can be direct or indirect • Hunker and Probst list 3 other categories of attributes of insiders – Knowledge – Ability to represent something – Trust by the organization • All require some form of (direct or indirect) access
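The slide's point that access can be indirect can be sketched as reachability in a delegation graph: if Alice can act through a service account that can read the records, she has indirect access to them. This is a minimal illustration, not anything from the talk; the account and resource names are invented.

```python
from collections import deque

def reachable(edges, start):
    """Return every node reachable from start via the directed edges (BFS)."""
    adj = {}
    for src, dst in edges:
        adj.setdefault(src, set()).add(dst)
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in adj.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# alice can act through a service account that can read the records,
# so she has indirect access; bob has no path to them.
edges = [("alice", "svc_account"), ("svc_account", "medical_records"),
         ("bob", "web_server")]
print("medical_records" in reachable(edges, "alice"))  # True
print("medical_records" in reachable(edges, "bob"))    # False
```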

  18. Back to Basics (For a Moment) • Security policy defines security • Security mechanisms enforce security • The mechanisms are imprecise – They do not enforce the security policy precisely – Jones and Lipton result: there is no generic procedure for developing security mechanisms that are both secure and precise (except in trivial cases)
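A toy sketch of "secure but not precise" may help here (this is an invented illustration, not from the slides; the policy and labels are hypothetical): the stated policy permits reading any file not labeled sensitive, while the mechanism bluntly over-restricts. The mechanism never permits what the policy forbids, so it is secure, but it denies reads the policy allows, so it is not precise.

```python
def policy_allows(user, file_label):
    # The stated policy: any user may read files not labeled "sensitive".
    return file_label != "sensitive"

def mechanism_allows(user, file_label):
    # A blunt over-restriction: only admin, and only "public" files.
    return user == "admin" and file_label == "public"

# Secure: the mechanism never permits an action the policy forbids.
violations = [mechanism_allows(u, l) and not policy_allows(u, l)
              for u in ("admin", "carol") for l in ("public", "sensitive")]
print(any(violations))  # False

# Imprecise: the policy allows this read, but the mechanism denies it.
print(policy_allows("carol", "public"), mechanism_allows("carol", "public"))
# True False
```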

  19. Policies • Idea: define policies that minimize access – Principles of least privilege, fail-safe defaults – Look for inconsistencies that could enable violation of the security policy
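Fail-safe defaults, mentioned on this slide, amount to "deny unless explicitly granted". A minimal sketch (the grant table and names are invented for illustration):

```python
# Explicit grants; anything not listed here is refused.
GRANTS = {("alice", "read"), ("bob", "read")}

def allowed(user, action):
    # Fail-safe default: no matching grant means deny.
    return (user, action) in GRANTS

print(allowed("alice", "read"))   # True: explicitly granted
print(allowed("alice", "write"))  # False: never granted, denied by default
```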

  20. Critical Assumption • The stated policy is complete, precise, and correct – Reality: this is rarely (if ever) true – Indeed, you may not know the policy you need!

  21. Analyzing It More • Apply the notion of "layers of abstraction" to the security policy • Examine the discrepancies between different layers • Can integrate intention into these layers

  22. Issues • Feasibility – Computer systems understand accounts, not people – Computer systems understand actions, not intentions

  23. Example • Policy: Alice is authorized to read medical records for the purpose of transcription • Implementation: account alice is authorized read access to files labeled "medical records" • Gaps – Anyone with access to account alice can read files labeled "medical records" – Account alice can read medical records and then do anything with that data (including posting the data to the Web) – Account alice can read any file labeled "medical records" whether it is a medical record or not
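The gap on this slide comes from what the implementation can actually see. The policy speaks of a person (Alice) and a purpose (transcription); the check below, like real access-control code, sees only an account name and a file label. A minimal sketch (account and label strings are illustrative):

```python
def may_read(account, file_label):
    # Everything the system can evaluate: an account and a label.
    # The person behind the account, the purpose of the read, and what
    # happens to the data afterward are all invisible to this check.
    return account == "alice" and file_label == "medical records"

print(may_read("alice", "medical records"))  # True: whoever holds the credential
print(may_read("alice", "vacation photos"))  # False: the label, not the content, decides
```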

  24. Unifying Policy Hierarchy • Ideal policy: what you want, even if not stated explicitly • Feasible policy: what you can actually implement on an actual system • Configured policy: what you actually configure your system to enforce • Run-time policy: what the system actually enforces (including what vulnerabilities enable)

  25. Simple Example • Bob is surfing the web using a browser vulnerable to a remote exploit • He accidentally surfs to a site with an attack that exploits it – If an attacker, Alice, gets access to Bob's account, she's in the Configured/Run-Time gap • He deliberately surfs to a site with an attack that exploits it, then claims "oops, I didn't know!" – Now Bob gives access to Alice, so he's in the Ideal/Feasible gap

  26. The Threats • Someone has more access at a lower policy level than at a higher policy level • Someone has less access at a lower policy level than at a higher policy level • This provides a policy-based definition of "insider" – an entity that falls into these gaps
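The gap-based definition above can be sketched mechanically: represent each policy level as a set of (entity, resource) access pairs and flag any entity whose access differs between a higher level and a lower one. This is a hypothetical illustration; the entities and resources are invented.

```python
def gap_entities(higher, lower):
    """Entities with more or less access at the lower level than the higher."""
    extra = lower - higher    # more access than the higher-level policy grants
    missing = higher - lower  # less access than the higher-level policy intends
    return {entity for entity, _ in extra | missing}

# Ideal policy vs. what was actually configured: alice was also given payroll
# access, so she falls into the Ideal/Configured gap.
ideal      = {("bob", "mail"), ("alice", "records")}
configured = {("bob", "mail"), ("alice", "records"), ("alice", "payroll")}

print(gap_entities(ideal, configured))  # {'alice'}
```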
