
Secure Hardware Design


The Black Hat Briefings July 26-27, 2000

Brian Oblivion, Kingpin [oblivion, kingpin]@atstake.com

Why Secure Hardware?

- Embedded systems now common in the industry: hardware tokens, smartcards, crypto accelerators, Internet appliances
- Detailed analysis & reverse engineering techniques available to all; the means exist
- Increase the difficulty of attack

Solid Development Process

- Clearly identified design requirements
- Identify risks in the life-cycle:
  - Secure build environment
  - Hardware/software revision control
  - Verbose design documentation
  - Secure assembly and initialization facility
  - End-of-life recommendations
- Identify single points of failure
- Security fault analysis
- Third-party design review

Sources of Attack

Attacker resources and methods vary greatly:

    Resource        Teenager    Academic      Org. Crime   Gov't
    Time            Limited     Moderate      Large        Large
    Budget ($)      <$1000      $10K-$100K    $100K+       Unknown
    Creativity      Varies      High          Varies       Varies
    Detectability   High        High          Low          Low
    Target          Challenge   Publicity     Money        Varies
    Number          Many        Moderate      Few          Unknown
    Organized       No          No            Yes          Yes
    Spread info?    Yes         Yes           Varies       No

Source: Cryptography Research, Inc., "Crypto Due Diligence," 1999

Accessibility to Product

- Purchase: all attacks possible
- Evaluation: most attacks possible
- Active, in-service: most attacks possible, with risk of detection
- Remote access: no physical access

Attack Scenarios

- System
- Enclosure
- Circuit
- Firmware

Attack Scenarios

- System
  - Initial experimentation & probing
  - Viewed as a "black box"
  - Can be performed remotely
  - Bootstrapping attacks

Attack Scenarios

- Enclosure
  - Gaining access to product internals
  - Probing (X-ray, thermal imaging, optical)
  - Bypassing tamper-proofing mechanisms

Attack Scenarios

- Circuit
  - PCB design & parts placement analysis
  - Component substitution
  - Active bus and device probing
  - Fault induction attacks [1]
  - Timing attacks [2]
  - Integrated circuit die analysis [3]

Attack Scenarios

- Firmware
  - Low-level understanding of the product
  - Obtain & modify intellectual property
  - Bypass system security mechanisms
  - Ability to mask failure detection

Attack Scenarios

Strictly firmware: no product needed!

- Obtain firmware from the vendor's public-facing web site
- Can be analyzed and disassembled without detection

What Needs To Be Protected?

- Firmware binaries
- Boot sequence
- Cryptographic functionality (offloaded to a coprocessor)
- Secret storage and management
- Configuration and management communication channels

System

Trusted Base

- Minimal functionality:
  - Trusted base to verify the integrity of firmware and/or operating system
  - Secure store for secrets; secrets never leave the base unencrypted
  - Security kernel
- Examples of a trusted base:
  - A single IC (some provide secure store for secrets); may be purchased or custom built (secure coprocessor)
  - All internals (circuit boards, components, etc.), with the entire trusted base residing within a tamper envelope
  - Firmware (a security kernel)

Security Kernel

- Better when implemented in the trusted base, but can function in the OS
- Enforces the security policy
- Ability to decouple secrets from the OS
- Example: Cryptlib [4]

Trusted Base example

[Block diagram: a CSOC trusted base connected to the host over memory-mapped, control, and bulk-transfer buses; an external memory bus to data memory (may be dual-ported) for bulk encrypt/decrypt; the host side shows host firmware, host processor, main memory (DRAM), and communication interface(s)]

Failure Modes

- Determine how the product handles failures: fail-open or fail-closed?
- Response depends on failure type (a minimal fail-closed handler is sketched below):
  - Halt the system
  - Set failure flags and continue
  - Zeroize critical areas
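
A minimal sketch in C of a fail-closed handler. The names here (secret_store, log_failure(), halt_system()) are hypothetical platform primitives, not from this presentation; the point is that critical memory is zeroized before anything else, through a volatile pointer so the compiler cannot optimize the wipe away.

    #include <stdint.h>
    #include <stddef.h>

    /* Hypothetical region holding keys and other critical state. */
    extern volatile uint8_t secret_store[256];

    extern void halt_system(void);      /* assumed platform primitive */
    extern void log_failure(int code);  /* assumed audit hook */

    /* Volatile pointer prevents the wipe being removed as a
     * "dead store" by the optimizer. */
    static void zeroize(volatile uint8_t *p, size_t n)
    {
        while (n--)
            *p++ = 0;
    }

    /* Fail closed: wipe secrets first, record the event, then
     * halt rather than continue in an unknown state. */
    void handle_failure(int code)
    {
        zeroize(secret_store, sizeof secret_store);
        log_failure(code);
        halt_system();
    }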

Management Interfaces

- Do not include service backdoors!
- Utilize access control
- Encrypt all management sessions:
  - SSH for shell administration
  - SSL for web administration

Firmware

Secure Programming Practice

- Code obfuscation & symbol stripping: remove symbol tables and debug information
- Use compiler optimizations
- Remove functionality not needed in production; keep two versions of firmware, development and production

Secure Programming Practice

- Buffer overflows [5]
  - Highly publicized and attempted
  - If interfacing to a PC, driver code with an overflow could potentially lead to compromise (a bounded-copy sketch follows)
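
A sketch of the defensive pattern in C. The length-prefixed command format and the CMD_MAX buffer size are illustrative assumptions, not a real driver interface: the untrusted length field is validated against both the destination buffer and the packet before any copy, where an unchecked memcpy() would be the classic overflow.

    #include <stdint.h>
    #include <string.h>

    #define CMD_MAX 64

    /* Parse a length-prefixed command from the host. The length
     * comes from untrusted PC-side traffic, so validate before
     * copying. cmd_out must hold at least CMD_MAX bytes. */
    int read_command(const uint8_t *pkt, size_t pkt_len, uint8_t *cmd_out)
    {
        if (pkt_len < 1)
            return -1;

        size_t cmd_len = pkt[0];

        /* Reject lengths exceeding the buffer or the packet. */
        if (cmd_len > CMD_MAX || cmd_len > pkt_len - 1)
            return -1;

        memcpy(cmd_out, pkt + 1, cmd_len);
        return (int)cmd_len;
    }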

Boot Sequence

[Diagram: common boot model. Hardware reset → Flash (BIOS; may be ROM) → new or overloaded functionality, shown over time]

Trusted Boot Sequence

[Diagram: trusted boot sequence, contrasted with the common boot model. Hardware reset → CSOC boot ROM (POST, security kernel) verifies the boot ROM and Flash → Flash verifies the embedded OS or state machine on FlashDisk/fixed disk → the OS verifies the applications on FlashDisk/fixed disk → applications run on the host system. Each stage verifies the next before handing over control; a verify-then-jump sketch follows.]
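
A sketch in C of the verify-then-jump step each boot stage performs on the next. The sha1() routine and the reference digest held in the trusted base are assumptions for illustration; the actual hash or signature scheme is a design choice the slides leave open.

    #include <stdint.h>
    #include <string.h>

    /* Assumed primitives: a hash routine in the boot ROM and a
     * reference digest stored in the tamper-protected trusted base. */
    extern void sha1(const uint8_t *data, size_t len, uint8_t digest[20]);
    extern const uint8_t expected_os_digest[20];

    extern void zeroize_secrets(void);
    extern void halt_system(void);

    /* Verify the next stage before transferring control to it.
     * Each stage repeats this pattern for the stage after it, so
     * a modified image is caught before it ever executes. */
    void boot_next_stage(const uint8_t *image, size_t len,
                         void (*entry)(void))
    {
        uint8_t digest[20];

        sha1(image, len, digest);
        if (memcmp(digest, expected_os_digest, sizeof digest) != 0) {
            zeroize_secrets();   /* fail closed */
            halt_system();
        }
        entry();                 /* image verified; jump to it */
    }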


Run-Time Diagnostics

- Make sure the device is 100% operational at all times
- Periodic system checks (a self-test pass is sketched below)
- A failing device may result in compromise
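
A sketch of a periodic diagnostic pass. The individual self-tests (test_crypto_known_answer() and the rest) are assumed hooks, not functions from the presentation; any failure routes into the same fail-closed handler sketched earlier.

    #include <stdbool.h>

    /* Assumed per-subsystem self-tests; each returns true on pass. */
    extern bool test_crypto_known_answer(void);
    extern bool test_memory_pattern(void);
    extern bool test_tamper_sensors(void);

    extern void handle_failure(int code);   /* fail-closed handler */

    /* Called periodically (e.g., from a timer tick) so a failing
     * or tampered device is caught while in service, not only at
     * boot. */
    void run_diagnostics(void)
    {
        if (!test_crypto_known_answer())
            handle_failure(1);
        if (!test_memory_pattern())
            handle_failure(2);
        if (!test_tamper_sensors())
            handle_failure(3);
    }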


Secret Management

- Never let unencrypted secrets leave the device
- Escrow mechanisms are a security hazard
  - If required, perform at key generation, in the physical presence of humans
  - Physically export the Key Encryption Key and protect it
  - Export other keys encrypted with the Key Encryption Key (a wrap-before-export sketch follows)
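
A sketch of the export rule. The kek_loaded() and encrypt_with_kek() primitives are hypothetical names for functions inside the trusted base: the only code path out of the device returns the KEK-wrapped form, never the plaintext key.

    #include <stdint.h>
    #include <stddef.h>
    #include <stdbool.h>

    /* Assumed primitives inside the trusted base; the KEK itself
     * never crosses this boundary in the clear. */
    extern bool kek_loaded(void);
    extern int  encrypt_with_kek(const uint8_t *in, size_t len,
                                 uint8_t *out, size_t *out_len);

    /* Export a key for escrow/backup: only the KEK-wrapped form
     * ever leaves the device. Returns 0 on success. */
    int export_key(const uint8_t *key, size_t key_len,
                   uint8_t *wrapped, size_t *wrapped_len)
    {
        if (!kek_loaded())
            return -1;  /* refuse rather than fall back to plaintext */
        return encrypt_with_kek(key, key_len, wrapped, wrapped_len);
    }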


Cryptographic Functions

- If possible, move cryptography out of firmware...
- ...into an ASIC:
  - Difficult to modify the algorithm; cannot be upgraded easily
  - Increased performance
- ...into a commercial CSOC or FPGA:
  - Can reconfigure for other algorithms
  - May also provide key management
  - Increased performance
  - Reconfiguration via a signed download procedure (CSOC only)


Field Programmability

- Is your firmware accessible to everyone from your product support web page?
- Encryption:
  - Compressing the image is not secure
  - Encrypting code will limit exposure of intellectual property
- Code signing:
  - Reduces the possibility of loading unauthorized code (a signed-update sketch follows)
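
A sketch of the signed-update check. The signature_valid() and flash_write_image() primitives, and the verification public key assumed to be already stored in the device, are illustrative names, not from the slides.

    #include <stdint.h>
    #include <stdbool.h>
    #include <stddef.h>

    /* Assumed primitives: a signature check against a public key
     * burned into the device, and a flash writer. */
    extern bool signature_valid(const uint8_t *image, size_t len,
                                const uint8_t *sig, size_t sig_len);
    extern int  flash_write_image(const uint8_t *image, size_t len);

    /* Accept a field update only if its signature verifies; the
     * image may additionally be distributed encrypted to limit
     * exposure of the intellectual property it contains. */
    int apply_update(const uint8_t *image, size_t len,
                     const uint8_t *sig, size_t sig_len)
    {
        if (!signature_valid(image, len, sig, sig_len))
            return -1;          /* refuse unauthorized code */
        return flash_write_image(image, len);
    }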

Circuit

PCB Design

- Remove unnecessary test points
- Keep traces as short as possible
- Run differential lines in parallel (even if on separate layers)
- Separate analog, digital & power ground planes
- Alternate power and ground planes


Parts Placement

- Make access to critical components difficult
- Place power filtering circuitry as close to the input as possible
- Compartmentalize noisy circuitry (e.g., inductors)

Physical Access to Components

- Epoxy encapsulation of critical components
- Include detection mechanisms in and under the epoxy boundary

Power Supply & Clock Protection

- Set minimum & maximum operating limits
- Protect against intentional voltage variation:
  - Watchdogs (e.g., Maxim, Dallas Semiconductor)
  - DC-DC converters, regulators, diodes
- Monitor clock signals to detect variations (an environment-check sketch follows)
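
A sketch of an environment monitor. The voltage and clock-window thresholds and the read_supply_mv()/clock_edges_last_window() hooks are illustrative assumptions: anything out of range is treated as a possible glitch or fault-induction attempt and handled fail-closed.

    #include <stdint.h>

    /* Assumed hooks: ADC reading of the supply rail in millivolts,
     * and a count of clock edges seen in the last window. */
    extern uint32_t read_supply_mv(void);
    extern uint32_t clock_edges_last_window(void);

    extern void handle_failure(int code);

    #define VMIN_MV     3000u   /* minimum operating limit */
    #define VMAX_MV     3600u   /* maximum operating limit */
    #define CLK_MIN     9500u   /* expected edges per window, -5% */
    #define CLK_MAX    10500u   /* expected edges per window, +5% */

    /* Out-of-range voltage or clock is a possible attack. */
    void check_environment(void)
    {
        uint32_t mv  = read_supply_mv();
        uint32_t clk = clock_edges_last_window();

        if (mv < VMIN_MV || mv > VMAX_MV)
            handle_failure(10);
        if (clk < CLK_MIN || clk > CLK_MAX)
            handle_failure(11);
    }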


I/O Port Properties

- Use unused pins to detect probing or tampering (especially on FPGAs): a "digital honeypot" (sketched below)
- Disable all unused I/O pins
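
A sketch of the "digital honeypot" check, assuming a hypothetical read_unused_pins() register accessor: the unused pins are held at a known idle level, so any other reading indicates someone is probing the board.

    #include <stdint.h>

    /* Assumed GPIO read of pins unused by the design and driven
     * to a known idle level. */
    extern uint32_t read_unused_pins(void);
    extern void handle_failure(int code);

    #define IDLE_PATTERN 0x0u   /* unused pins held low */

    void check_digital_honeypot(void)
    {
        if (read_unused_pins() != IDLE_PATTERN)
            handle_failure(20); /* probe or tamper detected */
    }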

Programmable Logic & Memory

- Make use of on-chip security features
- FPGA design:
  - Make sure all conditions are covered; state machines should have default states in place (see the sketch below)
- Be aware of what information is stored in memory at all times [6] (e.g., passwords, private keys)
- Prevent back-powering of non-volatile memory devices
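
The default-state rule, expressed as a C sketch rather than HDL (the presentation's context is FPGA state machines; the states and events here are illustrative): every switch carries a default arm so that an undefined state, such as one reached after a glitch, recovers or fails closed instead of wandering.

    typedef enum { ST_IDLE, ST_AUTH, ST_RUN } state_t;

    extern void handle_failure(int code);

    /* Mirrors the FPGA rule that a state machine must define
     * behavior for all states, not only the expected ones. */
    state_t step(state_t s, int event)
    {
        switch (s) {
        case ST_IDLE: return event ? ST_AUTH : ST_IDLE;
        case ST_AUTH: return event ? ST_RUN  : ST_IDLE;
        case ST_RUN:  return ST_RUN;
        default:
            handle_failure(30);  /* undefined state: fail closed */
            return ST_IDLE;
        }
    }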

Advanced Memory Management

- Often implemented in a small FPGA
- Bounds checking in hardware (modeled in the sketch below):
  - Execution and read/write restricted to defined memory
  - DMA restricted to specified areas only
- Trigger a response on detection of "code probing" or an error condition
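
A software model of the hardware bounds check, with an illustrative window table; in the real design this comparison lives in the FPGA and fires the response logic directly on a violation.

    #include <stdint.h>
    #include <stdbool.h>

    /* Table of windows in which execution, read/write, or DMA
     * is allowed; everything outside is denied. */
    typedef struct {
        uint32_t base;
        uint32_t limit;
        uint32_t perm;              /* bitmask: 1=X, 2=RW, 4=DMA */
    } window_t;

    static const window_t windows[] = {
        { 0x00000000, 0x0000FFFF, 1 },  /* code: execute only */
        { 0x20000000, 0x2000FFFF, 2 },  /* data: read/write   */
        { 0x20010000, 0x20013FFF, 4 },  /* DMA buffers only   */
    };

    bool access_allowed(uint32_t addr, uint32_t perm)
    {
        for (unsigned i = 0; i < sizeof windows / sizeof windows[0]; i++)
            if (addr >= windows[i].base && addr <= windows[i].limit)
                return (windows[i].perm & perm) != 0;
        return false;               /* outside all windows: deny */
    }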


Bus Management

- COMSEC requirements:
  - Keep black (encrypted) and red (in-the-clear) buses separate
  - Data leaving the device should always be black
- Be aware of data on shared buses

Enclosure

Tamper Proofing

- Resistance, evidence, detection, response
- Most effective when layered
- Possibly bypassed with knowledge of the method

Tamper Proofing

- Tamper resistance:
  - Hardened steel enclosures
  - Locks
  - Encapsulation, potting
  - Security screws
  - Tight airflow channels and 90° bends to prevent optical probing
  - A side effect is tamper evidence

Tamper Proofing

- Tamper evidence:
  - Major deterrent for minimal risk takers
  - Passive detectors: seals, tapes, cables
  - Special enclosure finishes
  - Most can be bypassed [7]

Tamper Proofing

- Tamper detection, for example: temperature sensors, micro-switches, nichrome wire, flex circuit, radiation sensors, magnetic switches, pressure contacts, fiber optics

Tamper Proofing

- Tamper response:
  - Result of tampering being detected
  - Zeroization of critical memory areas
  - Provide audit information (a handler sketch follows)
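
A sketch of a tamper-response interrupt handler, assuming hypothetical zeroize_secrets(), record_audit_event(), and current_time() hooks: zeroization happens first because it is the time-critical step, then an audit record notes which sensor fired.

    #include <stdint.h>

    extern void zeroize_secrets(void);
    extern void record_audit_event(uint32_t sensor_id, uint32_t time);
    extern uint32_t current_time(void);

    /* Wired to the tamper-detection sensors: wipe critical
     * memory first, then leave an audit trail. */
    void tamper_isr(uint32_t sensor_id)
    {
        zeroize_secrets();
        record_audit_event(sensor_id, current_time());
    }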


RF, ESD Emissions & Immunity

- Clean, properly filtered power supply
- EMI shielding: coatings, sprays, housings
- Electrostatic discharge protection:
  - Could be injected by an attacker to cause failures
  - Diodes, transient voltage suppressor devices (e.g., Semtech)


External Interfaces

- Use caution if connecting to the "outside world":
  - Protect against malformed, intentionally bad packets
  - Encrypt or (at least) obfuscate traffic
- Be aware if interfaces provide access to an internal bus:
  - Control bus activity through transceivers
  - Attenuate signals that leak through transceivers with exposed buses (token interfaces)
- Disable JTAG and diagnostic functionality in operational modes

In Conclusion…

- As a designer:
  - Think as an attacker would
  - While design is in progress, allocate time to analyze and break the product
  - Peer review and third-party analysis
- Be aware of the latest attack methodologies & trends

References

1. Maher, David P., "Fault Induction Attacks, Tamper Resistance, and Hostile Reverse Engineering in Perspective," Financial Cryptography, February 1997, pp. 109-121

2. Timing Attacks, Cryptography Research, Inc., http://www.cryptography.com/timingattack/

3. Beck, F., "Integrated Circuit Failure Analysis: A Guide to Preparation Techniques," John Wiley & Sons, Ltd., 1998

4. Gutmann, P., Cryptlib, "The Design of a Cryptographic Security Architecture," Usenix Security Symposium 1999, http://www.cs.auckland.ac.nz/~pgut001/cryptlib.html

5. Mudge, "Compromised Buffer Overflows, from Intel to SPARC version 8," http://www.L0pht.com/advisories/bufitos.pdf

6. Gutmann, P., "Secure Deletion from Magnetic and Solid-State Memory Devices," http://www.cs.auckland.ac.nz/~pgut001/secure_del.html

7. "Physical Security and Tamper-Indicating Devices," http://www.asis.org/midyear-97/Proceedings/johnstons.html

Additional Reading

1. DoD Trusted Computer System Evaluation Criteria (Orange Book), 5200.28-STD, December 1985, http://www.radium.ncsc.mil/tpep/library/rainbow/5200.28-STD.html

2. Clark, Andrew J., "Physical Protection of Cryptographic Devices," Eurocrypt: Advances in Cryptography, April 1987, pp. 83-93

3. Chaum, D., "Design Concepts for Tamper Responding Systems," Crypto 1983, pp. 387-392

4. Weingart, S.H., White, S.R., Arnold, W.C., Double, G.P., "An Evaluation System for the Physical Security of Computing Systems," Sixth Annual Computer Security Applications Conference 1990, pp. 232-243

5. Differential Power Analysis, Cryptography Research, Inc., http://www.cryptography.com/dpa/

6. The Complete, Unofficial TEMPEST Information Page, http://www.eskimo.com/~joelm/tempest.html

Thanks!