Secure Hardware
How can we protect our hardware? How can our hardware protect itself?
Trusted or Trustworthy?
An object is trusted if and only if it operates as expected.
An object is trustworthy if and only if it is proven to operate as expected.
- Peter Neumann (paraphrased)
What Do TPMs Do? (Trusted Computing)
• Store secrets (keys)
• Protect secrets against infected systems
• Provide a repository for trusted system measurements that a compromised platform cannot lie about when interrogated
• TPMs can best be described as platform trust coprocessors
• TPMs are also very useful in enabling verified boot of OSes and hypervisors
Other useful features of TPMs
• Policy-protected persistent secure storage (for keys, certificates, and other data)
• Keys
  – Generate, store, and use symmetric and asymmetric keys
  – Key hierarchy – key-encrypting keys, platform keys, endorsement keys, storage root keys (see the sketch after this list)
  – Key cache
  – Key migration
  – Key usage policy enforcement
• Random number generation
• Limited crypto in TPM 2.0 (sign, verify, encrypt, decrypt)
• Most TPM chips are FIPS 140-2 validated
• Common Criteria protection profile for the chips themselves (not the systems they are in)
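A minimal sketch of the key-encrypting-key idea behind that hierarchy, assuming the Python `cryptography` package is available; the key names are illustrative and this is not the TPM's own API:

```python
# Conceptual sketch of a key hierarchy (NOT the TPM API): a storage root
# key wraps child keys so that only wrapped blobs ever leave protected
# storage.  Key names here are hypothetical.
import os
from cryptography.hazmat.primitives.keywrap import aes_key_wrap, aes_key_unwrap

storage_root_key = os.urandom(32)   # in a real TPM this never leaves the chip
application_key = os.urandom(32)    # a child key to be protected

# Wrap the child key under the root key; the blob can be stored anywhere.
wrapped_blob = aes_key_wrap(storage_root_key, application_key)

# Later the blob is loaded and unwrapped only inside the protected boundary.
assert aes_key_unwrap(storage_root_key, wrapped_blob) == application_key
```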
Platform Configuration Registers (PCRs)
• In volatile storage in the TPM
• SHA-256 cumulative hash
• Initialized to zero at TPM initialization
• NEVER written to directly, always extended:
  PCR_new = SHA-256(PCR_old || new value)
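A minimal sketch of the extend rule above, assuming SHA-256 PCRs as on the slide; a real TPM performs the extend inside the chip and software never writes the register directly:

```python
# Model of the PCR extend operation: PCR_new = SHA-256(PCR_old || value).
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    return hashlib.sha256(pcr + measurement).digest()

pcr = bytes(32)                          # initialized to all zeros
for component in (b"bootloader", b"kernel", b"hypervisor"):
    pcr = extend(pcr, hashlib.sha256(component).digest())

print(pcr.hex())                         # depends on every prior extend, in order
```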
Because the cumulative hash alone does not reveal what was measured, or in what order, you also need the platform to send you its measurement log.
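A sketch of how a verifier uses that log, reusing the extend rule from the previous sketch; the (description, measurement) log format here is a simplifying assumption:

```python
# Replay the claimed event log and check that it reproduces the reported PCR.
import hashlib

def replay_log(event_log):
    pcr = bytes(32)
    for _description, measurement in event_log:
        pcr = hashlib.sha256(pcr + measurement).digest()
    return pcr

def verify(event_log, reported_pcr: bytes) -> bool:
    # Any edit, omission, or reordering of the log changes the replayed
    # value, so a compromised platform cannot lie about what was measured.
    return replay_log(event_log) == reported_pcr
```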
Extreme HW Security
The Threat Model
– Who are the attackers?
  • buggy software from business partners (intent is not required!)
  • system administrators?
  • employee insiders?
  • thrill-seeking hackers?
  • terrorists?
  • organized nation-states?
– What resources do they have?
– What attacks are anticipated?
What can a successful attacker gain?
– notoriety
– money
– life and limb of others
– a marketable identity
– defeat of an enemy in wartime
– control of
  • a flight deck?
  • the brake system?
  • the on-board movie?
  • the radio of the BMW next to me in traffic?
How will my product defend itself?
Physical
– its own enclosure
– the enclosure of the system it's in
– a secure environment (inside the Pentagon)
– inspection / human beings / dogs
Logical / software
– Encryption of sensitive material
– Authentication of operators
– Integrity of incoming commands and software (see the sketch after this list)
– Integrity of on-board memory
– Integrity of the bootstrap / factory initialization
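A minimal sketch of "integrity of incoming commands and software", assuming the Python `cryptography` package and an Ed25519 signing key; this is an illustration of the idea, not the 4758/4769 code-loading design:

```python
# The device ships with the vendor's public key and accepts a command or
# firmware image only if its signature verifies against that trust anchor.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Vendor side (done once, at the code-signing facility).
vendor_key = Ed25519PrivateKey.generate()
device_trust_anchor = vendor_key.public_key()   # burned into the device

firmware_image = b"new firmware image"
signature = vendor_key.sign(firmware_image)

# Device side: reject anything that does not verify.
try:
    device_trust_anchor.verify(signature, firmware_image)
    print("signature ok: accept and install")
except InvalidSignature:
    print("signature bad: reject the update")
```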
Why secure hardware?
• Because we are all doing . . .
  – Increasingly important operations
  – In increasingly distributed environments
  – That are increasingly open
• As a consequence . . .
  – We need to trust machines we cannot control
  – And to which motivated adversaries may have direct access
What is a secure coprocessor?
• A general-purpose computing environment that withstands physical and logical attacks
• It runs only the programs it's supposed to
• It runs them unmolested
• One can (remotely) tell the difference between the real program on the real thing, and a clever impersonator (see the attestation sketch after this list)
• An attacker might carry out destructive analysis on one or more devices, yet not break the security of the whole system
• Usually incorporates high-performance crypto, but not just a fast crypto box
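A sketch of that "real thing vs. impersonator" check, in the spirit of a TPM quote; the message format and the use of Ed25519 are simplifying assumptions:

```python
# The device signs a verifier-supplied nonce bound to its current PCR value
# with an attestation key that never leaves the hardware.
import os
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Inside the secure coprocessor.
attestation_key = Ed25519PrivateKey.generate()
attestation_pub = attestation_key.public_key()   # certified by the manufacturer
current_pcr = hashlib.sha256(b"measured boot state").digest()

def quote(nonce: bytes) -> bytes:
    return attestation_key.sign(nonce + current_pcr)

# Verifier side: a fresh nonce defeats replayed quotes.
nonce = os.urandom(16)
expected_pcr = current_pcr            # the state a healthy platform should be in
try:
    attestation_pub.verify(quote(nonce), nonce + expected_pcr)
    print("quote verified: genuine device in the expected state")
except InvalidSignature:
    print("quote rejected: impersonator or unexpected state")
```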
What do applications need from secure hardware?
• Acceleration of security operations (e.g. cryptography, random number generation)
• Physical protection of information assets
  • Cryptographic keys
  • Electronic valuables (e.g. e-cash, postage, coupons)
  • Software (e.g. meters, risk calculations)
• Enablement of network security operations (e.g. intrusion detection)
Some applications for secure hardware
• Network security
  • Intrusion and virus defense, filtering, virtual private networks
• E-commerce
  • Electronic payments, e-postage/tickets/coupons, smart card personalization
• Data centers
  • Secure databases (e.g. healthcare, corporate secrets), entertainment content
• Banking and finance
  • Securities trading, transaction processing and funds transfer
• Government and military
  • Benefits transfer, government suppliers, high-assurance systems for defense, intelligence, justice
IBM PCI-X Cryptographic Coprocessor (PCIXCC)
▪ Announced in September 2003
▪ Greatly improved performance
▪ PCI-X and network interface
▪ Same physical / logical security feature set as the 4758
▪ Certified FIPS 140-2 level 4
4764 hardware architecture (block diagram)
• CPU: PowerPC SoC, 266+ MHz
• Memory: 64MB DRAM, 16MB FLASH, battery-backed RAM, and ROM
• ECC interlock
• Time-of-day clock
• I/O: PCI-X, Ethernet (PCI, serial port)
• New pipeline and crypto ASICs with much better performance (e.g. 50X in some cases)
• Proprietary pipelining hardware, DES/TDES engine, modular math engine, SHA-1 engine
• Hardware random number generator
• Tamper-responding membrane with active tamper-detection and response circuitry
Daughter card
Inner copper enclosure
Tamper-sensing mesh/membrane
Outer copper shell and potting material
Completed assembly
IBM 4769 PCIe Cryptographic Coprocessor (HSM), 2019
Matchbox: Solution Overview
- Allows collaborative processing amongst parties that do not trust each other
- One or more parties provide data (kept encrypted)
- One or more parties make requests (request encrypted, processing in secure coprocessor)
- One or more parties get the results (transmission encrypted)
Solution Components
• IBM 4758
• MatchBox software
Matchbox: Solution
• All processing and distributed results are based on automatically enforced contracts
• Sensitive processing is confined within the secure coprocessor (IBM 4758) and is not observable from outside
• Sensitive data outside the coprocessor is always encrypted
• Secure matching of biometrics in the 4758
(Diagram: railroads, airlines, airport agencies, and law enforcement each connect to the Matchbox secure matching service over authenticated/encrypted (Auth/Enc) links, with matching performed inside 4758 coprocessors.)
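A toy model of that data flow, under stated assumptions: each party shares an authenticated symmetric key with the coprocessor, data and results appear in the clear only inside match_inside_coprocessor(), and Fernet stands in for the real Auth/Enc protocol; all names are illustrative, not the MatchBox implementation:

```python
from cryptography.fernet import Fernet

provider_key = Fernet.generate_key()    # Agency 1 <-> coprocessor
requester_key = Fernet.generate_key()   # Law Enforcement <-> coprocessor

# Data provider: records leave the provider only in encrypted form.
encrypted_records = [Fernet(provider_key).encrypt(name)
                     for name in (b"alice", b"bob", b"carol")]

# Requester: the query is also encrypted in transit.
encrypted_query = Fernet(requester_key).encrypt(b"bob")

def match_inside_coprocessor(enc_records, enc_query):
    """Everything in this function models code running inside the 4758."""
    records = {Fernet(provider_key).decrypt(r) for r in enc_records}
    query = Fernet(requester_key).decrypt(enc_query)
    result = b"match" if query in records else b"no match"
    return Fernet(requester_key).encrypt(result)    # encrypted to the requester

encrypted_result = match_inside_coprocessor(encrypted_records, encrypted_query)
print(Fernet(requester_key).decrypt(encrypted_result))   # b'match'
```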
FIPS 140-2 Physical Security Requirements
Level 1 – production grade
Level 2 – tamper evident
Level 3 – tamper resistant
Level 4 – tamper responding
FIPS 140-2 Level 1 – Production Grade analogy
Ordinary snap-cap bottles
FIPS 140-2 Level 2 – Tamper Evident analogy
Tamper-evident bottle caps
▪ alert the user to tampering
▪ require inspection (rely on humans for protection)
FIPS 140-2 Level 3 – Tamper Resistant analogy
Child-resistant medicine bottle caps resist opening by unauthorized users (children).
Tamper resistance → the device begins to protect itself
FIPS 140-2 Level 4 – Tamper Responding analogy
The PillSafe™ stacks drug tablets next to a stable chemical reactant that can destroy the drugs. Attempts to force the mechanism or penetrate the bottle cause instant destruction of the medication.
Tamper responding → the device protects itself
See http://www.healthcarepackaging.com/archives/2007/03/futuristic_pill_container_zaps.php
FIPS level 1 (out of 4) – the bare minimum
Requirements are mainly cumulative, with a few exceptions
• Mainly algorithmic compliance (“our AES is compatible with yours”)
• Limited testing outside algorithm verification
• Important indication if certified (but check the security policy)
• Double-check algorithm certificates (key sizes, modes, etc.)
• State transition diagrams
FIPS level 2 (out of 4)
Level 1 and . . .
• Crypto users are identified, acting in roles
• Not very useful for software modules
• Limited use for modules which provide only raw operations
• Must authenticate the user's role
• Visible tamper evidence is not very relevant today:
  – One does not actually see most of the devices one interacts with
  – Most devices themselves have disappeared into systems (or onto chips)
  – This was still feasible when FIPS 140-1 was written
  – Still useful in some restricted environments (example: ATMs)
FIPS and CC roles and identities
• Users must be identified / authenticated somehow
• Users can be companies or software, not necessarily human beings, e.g.
  – A human being in Personnel
  – IBM as a microcode and operating system loader
  – An operating system as an application loader
FIPS level 3 (out of 4)
Level 2 and . . .
• Infeasible for practical software, unless protected by a Level 3+ enclosure (plus other features)
• Additional, extensive testing over Levels 1-2
• User separation / identification required (more than in Level 2)