
An Introduction to Information Security
Why there’s more to protect than you might think, and why protecting it is a lot tougher than you ever dreamed of in your wildest, most paranoid nightmare
Roadmap

 Introduction
 The key mechanisms of information security: their strengths, weaknesses, and inter-dependencies
   INFOSEC
   OPSEC
   Personnel and procedural security
   Physical security
 Summary

Roadmap: Introduction

 Introduction
   Purpose
   Context
   Some key vocabulary, including some integrating concepts
 Mechanisms of information security and their inter-dependencies
 Summary

Purpose

 Provide an overview of the context of digital information security
 Summarize the key “soft” factors beyond the hardware and software technologies
 Provide an introduction to the key concepts, vocabulary, and issues of digital information security itself

The Context of Information Security

[Diagram: nested scopes – INFOSEC within Information Security within Information Assurance]

 Information assurance is making sure that valid information is only accessible to the right people and it’s there when they ask for it
 Information security is about protecting information from unauthorized disclosure or modification, but not specifically about assuring all aspects of its accessibility
 INFOSEC (a.k.a. ISS) is an abbreviation of Information Systems Security, the protection of information systems – which correctly highlights the fact that electronic data systems are by no means the only places that information can be compromised

Some Underlying Vocabulary and Integrating Concepts

 To have access is to be able to do something
 Authorization means that you’re supposed to have access
 A security policy describes who is authorized which type(s) of access to what
 Mechanisms are the physical, electronic, and procedural means of enforcing a security policy
 A system’s security architecture consists of all the mechanisms involved in enforcing its security policy
 An attack is a deliberate attempt to circumvent some mechanism and violate a security policy

Some Underlying Vocabulary and Integrating Concepts (cont’d)

 A vulnerability is some aspect of the security architecture that may be subject to attack
 A threat is a person or persons who might attempt to attack a system; characterized by, among other factors:
   Knowledge/skill
     Of INFOSEC attacks in general
     Of your security architecture in particular
   Resources
 A threat analysis is necessary before coming up with a security policy, which is then used as the starting point for the design of a security architecture

The Mechanisms of Information Security

[Diagram: Information Assurance contains Information Security, which contains INFOSEC; INFOSEC comprises COMPUSEC and COMSEC (including crypto and emissions security), alongside OPSEC, physical security, and personnel security]

COMPUSEC

 Informally: Security of information in computers
 Formally:
“Measures and controls that ensure confidentiality, integrity, and availability of the information processed and stored by a computer.”

A More Detailed View of COMPUSEC

 COMPUSEC is the defense against three kinds of attacks on computers or the information they store:
   Theft of service (TOS), which is the unauthorized use of computational resources (most often CPU time or disk storage space)
   A breach of confidentiality (BOC), which allows the unauthorized disclosure of information
   A denial of service (DOS), which prevents valid disclosures of valid information to valid users
 Although technically an availability reduction (and hence seemingly more in the domain of information assurance than information security), some types of denial of service are so closely related to breach of confidentiality that they require a coordinated COMPUSEC defense

How Denial of Service is Related to Breach of Confidentiality

 Denial of service comes in a spectrum with two clear endpoints:
   Coarse DOS – e.g., denying anybody access to anything
     May be hard to prevent or cure, but it’s sure easy to detect
   Subtle DOS – surreptitiously altering information, thus denying some valid user(s) valid information or expected service
     Subtle DOS is better known as “breach of integrity” (BOI)
     May be very hard to detect
     May be very, very, very (like absolutely extremely) nasty indeed in its consequences – e.g., changing the blood type in a medical record, changing the target coordinates for an ICBM, or blowing up a natural gas pipeline in Siberia
 Defense against breach of integrity in particular requires defense against breach of confidentiality and vice versa

Why a Common Defense is Required for Both BOI and BOC

 Unauthorized modification of a program is a breach of integrity that might, for example, result in the insertion of malicious code designed to make possible a subsequent breach of confidentiality
 A breach of confidentiality may reveal information that makes possible a breach of integrity attack
   E.g., unauthorized disclosure of a system administrator’s password
 The point: although conceptually distinct, breach of integrity and breach of confidentiality are related in that you can’t successfully defend against the one without also defending against the other

COMPUSEC’s Dependency on Other Types of Security

 COMPUSEC is heavily dependent on both physical and personnel security
   COMPUSEC can often be easily bypassed by attackers who gain physical access to the machine – e.g., on a Windows machine an attacker could insert a floppy, reboot from the floppy, and use DOS to totally bypass all Windows security features (laughable as they are)
   Many users do not have exclusive use of their machines and must trust certain other users, particularly system administrators, to not be malicious – good personnel security is thus required
 COMPUSEC is dependent on even the trusted users following security procedures properly
   E.g., depending on the physical security environment, taping your password to your keyboard may be the same as mailing it to Moscow

COMPUSEC and Cryptography

 Although very useful in COMSEC, cryptography is of far less value in COMPUSEC than most people realize
   The reasons are illustrative of some key issues in COMPUSEC, but they are subtle and require some additional technical background first, so a more detailed examination of the relationship between COMPUSEC and cryptography will be deferred until after an overview of the key concepts of COMPUSEC itself

The Key Technical Concepts to Understand COMPUSEC

 Objects and Subjects
 Identification and Authentication
 Operations and Access Modes
 Access Rights and the Security Policy
 The Trusted Computing Base (TCB)

Objects

 Objects: The data (including programs) on a computer to be protected from unauthorized access
 The key issue: What is the lowest level of granularity at which access is going to be controlled?
   The entire computer system?
   An entire disk?
   A file?
     E.g., if you are authorized access to anything in the HR database (stored in a single file), you have access to the entire HR database
   A record?
     E.g., you can be authorized to see the student employee records in the HR database but not faculty or staff records
   An individual field?
     E.g., you may see most of my HR record but not my salary

Subjects (And Their Identification and Authentication)

 Subjects: The active entities that access the (passive) objects
 Users are the starting point for defining subjects, which leads to the need for mechanisms for:
   Identification
     To answer the question: who are you?
   Authentication
     To answer the question: why should the system trust that you’re really who you say you are?

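To make the identification/authentication split concrete, here is a minimal sketch in Python (the user store, the names, and the salted-hash scheme are illustrative assumptions, not something the slides prescribe): the claimed username answers “who are you?”, and knowledge of the password gives the system its reason to believe the claim.

    import hashlib, hmac, os

    # Hypothetical user store: the system records a salted hash, never the password itself
    users: dict[str, tuple[bytes, bytes]] = {}

    def register(username: str, password: str) -> None:
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        users[username] = (salt, digest)

    def authenticate(username: str, password: str) -> bool:
        if username not in users:          # identification: who are you?
            return False
        salt, stored = users[username]     # authentication: prove it
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return hmac.compare_digest(candidate, stored)

    register("alice", "correct horse battery staple")
    assert authenticate("alice", "correct horse battery staple")
    assert not authenticate("alice", "wrong guess")
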
Access Modes and Access Rights

 Access Mode: A set of one or more operations that a system is only designed to grant or deny together
   E.g., will the system allow a file to be modified by someone who can’t delete it? Can it be deleted by someone who can’t modify it?
   Some systems grant or deny users the “delete” and “modify” operations together, with the same access mode; some systems control them with independent access modes
   Misunderstanding of the implications of the operations covered by a single access mode is a contributor to much human error
 Access Rights: The access modes authorized for a given subject for a given object

Security Policies, the TCB, and Trusted Software

 A Security Policy spells out the access rights of all subjects for all objects on a computer system
 The Trusted Computing Base (TCB) consists of the software that is involved in enforcing the security policy
   The TCB must be trusted; all other software outside of the TCB need not be trusted at all
   The TCB is only as trustworthy as its least trustworthy part
 There are then two key questions with regard to the TCB:
1. What does it take to make software trustworthy anyway?
2. How much software needs to be trusted (i.e., how much software must be in the TCB)? All of it? All 45 million lines of Windows XP? All the games you download off the internet?

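Tying the last two slides together, here is a toy sketch of a security policy as a table of access rights, with access modes that are granted or denied as a unit; the subjects, objects, and rights are invented for illustration:

    from enum import Flag, auto

    class Mode(Flag):            # operations the system grants or denies together
        READ = auto()
        MODIFY = auto()
        DELETE = auto()

    # A toy security policy: access rights per (subject, object)
    POLICY = {
        ("alice", "hr_database/student_records"): Mode.READ,
        ("hr_admin", "hr_database"): Mode.READ | Mode.MODIFY | Mode.DELETE,
    }

    def check_access(subject: str, obj: str, requested: Mode) -> bool:
        granted = POLICY.get((subject, obj), Mode(0))
        return granted & requested == requested   # every requested mode must be granted

    assert check_access("alice", "hr_database/student_records", Mode.READ)
    assert not check_access("alice", "hr_database/student_records", Mode.MODIFY)

In a real system, both this check and the policy table behind it would have to live inside the TCB; the table is itself an object that must be protected from unauthorized modification.
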
Trusted Software

 Software can be said to be trusted if:
1. Its implementation is correct with respect to its requirements
   The software must actually do what it is required to do
   It must not do anything it’s not supposed to do
2. Its requirements are correct with respect to specified security properties
   We’re continually seeing examples of security software that was a correct implementation of its specified requirements, but the requirements didn’t specify the security properties people thought they did
3. The software that actually executes hasn’t been improperly modified since it was shown to be correct

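Condition 3 is the most mechanically checkable of the three. A hedged sketch (the file name and contents are invented): record a digest of the program image at assurance time, and refuse to load an image whose bytes no longer match.

    import hashlib
    from pathlib import Path
    from tempfile import TemporaryDirectory

    def sha256_of(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    def load_if_unmodified(path: Path, expected: str) -> bytes:
        image = path.read_bytes()
        if sha256_of(image) != expected:
            raise RuntimeError(f"{path.name} changed since it was assured")
        return image

    with TemporaryDirectory() as d:
        module = Path(d) / "access_checker.bin"   # stand-in for an assured program
        module.write_bytes(b"\x90\x90\xc3")       # the bytes that were verified
        assured = sha256_of(module.read_bytes())  # digest recorded at assurance time

        load_if_unmodified(module, assured)       # unchanged: OK to trust
        module.write_bytes(b"\xcc")               # surreptitious modification...
        try:
            load_if_unmodified(module, assured)
        except RuntimeError as e:
            print(e)                              # ...is caught at load time

Of course, the checker and the recorded digest are themselves software and data that must be trusted: they are part of the TCB, which is exactly where the next slides pick up.
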
Level of Assurance

 “Level of Assurance” refers to the degree to which one is confident that the requirements and code are correct with respect to desired properties
 The federal government helped develop the now internationally standard “Common Criteria”, which spell out several discrete levels of assurance and the activities necessary to achieve them

Why is High Assurance Software Expensive?

 Testing is good at showing that there are circumstances where the software sometimes appears to do what you want; it is not so good at showing that there aren’t other circumstances where the software does something else, beyond what was wanted
   “Well, we tested the reactor control logic for two years in the lab and it always worked fine; how were we to know that it would deliberately blow up the reactor to celebrate if the Cubs won the World Series?”
 There are a variety of other means of attempting to assure the correctness of requirements and code; some are cheap, but not too effective; the highly effective ones are expensive
   Cheap: “We hired the boss’s nephew for a day and he read our two handwritten pages of documentation and said everything was OK”
   Expensive: “We proved mathematically that the code implements the requirements correctly and does not contain any hidden side effects”

So How Big Does a TCB Have to Be?

 Remember, for code to be trusted, the code that executes must be the same as the code that has been assured to be correct
 Now note that a program on disk is an object. Even if that code itself was originally perfect, if other TCB software is not perfect, a breach of integrity attack facilitated by a flaw in the imperfect code could surreptitiously alter the installed version of a previously perfect program so that it no longer does what its designers intended
 So the TCB must include all code that is involved in protecting objects from unauthorized modification – i.e., the TCB must protect itself from unauthorized modification and cannot use or rely on any code outside of the TCB for its protection
   Untrusted code can always make use of TCB code; but
   TCB code cannot make any use of untrusted code for any access to any data object involved in enforcing a security policy

So How Big Does a TCB Have to Be? (cont’d)

 What is one set of code that all other code always uses, whether it wants to or not, whether it even knows it or not?
   Answer: The operating system (or at least some portion of it)
 So parts of the OS must be trusted, since other parts of the TCB will make use of them, and then all the other OS parts those OS parts use, and all the other parts those other parts use, and so on, until it is a closed (or complete) set that makes no references to any software outside itself – that’s the TCB

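That “and so on, until it is a closed set” is a transitive closure, which a few lines of Python can make concrete; the module names and the “uses” graph are invented for illustration:

    # The "uses" relation between software modules (invented for illustration)
    USES = {
        "security_kernel": {"memory_manager"},
        "memory_manager": {"scheduler"},
        "scheduler": set(),
        "downloaded_game": {"scheduler"},   # untrusted code may freely use TCB code
    }

    def tcb(roots: set[str]) -> set[str]:
        closed: set[str] = set()
        frontier = set(roots)
        while frontier:
            module = frontier.pop()
            closed.add(module)
            frontier |= USES[module] - closed   # follow every "uses" edge
        return closed

    print(tcb({"security_kernel"}))
    # {'security_kernel', 'memory_manager', 'scheduler'} -- the game stays outside
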
The TCB and the Operating System (cont’d)

 Unless security is carefully considered and designed in from the beginning, and the OS/TCB designers are very skillful and document their work very carefully, it will not be possible to assert with great confidence that the assurance team has really identified the entire TCB and that it’s both small and complete:
   Small, so that it can be verified to a high level of assurance without breaking the bank
   Complete (closed), so that there’s no other piece of software somewhere that’s involved somehow, sometime, in enforcing a security policy but was overlooked during the TCB assurance effort

The TCB and the Operating System (cont’d)

 Lots of vendors claim to have small “security kernels”,* but the OSs have usually not been submitted for high assurance evaluation by the independent certification agencies
   Often vendors don’t even apply for a low assurance evaluation
   Often also, vendor claims for their own software assurance activities are subsequently shown to be erroneous
     “I knew we couldn’t trust the boss’s nephew!”
 Often then, the real TCB (and it won’t be very trusted) will include all, or at least most, of the OS (45 gazillion lines for Windows XP, for example), although almost certainly no one will ever know for sure

* The term is often misused; technically, a security kernel is but one particularly vital part of the whole (larger) TCB

Summary of COMPUSEC

 The key factor in COMPUSEC is the Trusted Computing Base, the sum total of all software that must be trusted to enforce a security policy, including protecting itself from all unauthorized modification
   How big is it, really?
   What level of assurance does one repose in it?
     Including the degree of confidence that it has been correctly identified in the first place; i.e., it really doesn’t depend on any additional software that wasn’t identified as being part of it
     “Oh damn, that’s right; the page replacement software could corrupt an executing module, couldn’t it?”

Summary of COMPUSEC (cont’d)

 High assurance TCBs are expensive because testing alone (which is expensive enough by itself when done properly) can’t provide very much assurance that the software doesn’t have hidden features and flaws (which “ordinary” OS software surely does, and by the bucketful)
 Consumer OSs are low assurance because:
1. Their TCBs are substantial fractions of the overall OS – their exact composition (which parts of the OS must really be trusted?) is often not really known with great assurance
2. Commercial vendors don’t see sufficient economic payoff in the extra design, documentation, and analysis required to achieve higher assurance levels – did you ask to see the Common Criteria certificate for the OS on the last computer you bought?

Prognosis for COMPUSEC in the Civilian World

 Extremely poor
 Because the existing consumer OSs were designed without sufficient thought for security properties, they would require major redesign and re-implementation to provide greater levels of trustworthiness, and that redesign would probably render them incompatible with much (possibly most) of the several trillion dollars of already existing commercial software designed for the current OS versions

Prognosis for COMPUSEC in the Civilian World (cont’d)

 Toys like firewalls help stop most current high school students from penetrating today’s computer systems; they are useless against serious attackers
   There are types of Trojan Horse attacks that are known to be theoretically impossible for even a perfect firewall to defend against
   There are known defenses against such attacks, but they involve major extensions to TCBs and result in major inconvenience for users
 Security professionals have long known that you can’t add security after the fact; it must be designed in from the beginning. Since no major commercial OS software vendors did so, their products will never be particularly trustworthy
   Never – it’s too late

Roadmap: The Mechanisms for Information Security

 Introduction
 Overview of the higher level concepts and vocabulary
 Mechanisms for information security and their inter-dependencies
   INFOSEC
   COMPUSEC
   COMSEC
     Emissions security
     Cryptography
   Network Security
   OPSEC
   Personnel and procedural security
   Physical security
 Summary

COMSEC: Communication Security

 Informally: Protection of information as it is being electronically transmitted from one place to another
 Formally:
“Measures and controls taken to deny unauthorized persons information derived from telecommunications and to ensure the authenticity of such telecommunications. Communications security includes cryptosecurity, transmission security, emissions security, and physical security of COMSEC material.”

The Components of COMSEC

 Cryptosecurity: Security that results from the provision of technically sound cryptosystems and their proper use
 Emission security: Protection resulting from all measures taken to deny unauthorized persons information of value which might be derived from intercept and analysis of compromising emanations from crypto-equipment, AIS [automated information systems], and telecommunications systems
 Physical security: The component of communications security that results from all physical measures necessary to safeguard classified equipment, material, and documents from access thereto or observation thereof by unauthorized persons
 Transmission security: Security that results from the application of measures designed to protect transmissions from interception and exploitation by means other than cryptanalysis

Cryptography

 Informally: 230;xir459scvn09348)23-nf90*(&^a3590hjv0
(You can’t read this unless you too have a secret decoder ring.)
 Formally: “[The] art or science concerning the principles, means, and methods for rendering plain information unintelligible, and for restoring encrypted information to intelligible form. … .”

The Basic Concepts of Cryptography

[Diagram: in one physically secure location, PT and a key feed a crypto engine for encryption, producing CT; the CT crosses the unsecured outside world; in another physically secure location, the CT and the key feed a crypto engine for decryption, recovering the PT]

 Plain Text (PT): The original information,
e.g., The quick brown fox jumped over the lazy yellow dog
 Cipher Text (CT): The unintelligible translation of the PT,
e.g., xmx3mdj%uitojuglk)sdvna$erqw0293hdfp9erhiodfv
 Key: a secret tool; in digital systems, much like a password (a secret set of characters)
 Cryptographic engine: a machine or program for combining the key and the plain text to produce the cipher text or vice versa

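As a toy illustration of the diagram in code – XOR against a repeating key is emphatically not a sound cryptosystem, but it shows how an engine combines key and PT to yield CT, and how the same key recovers the PT:

    from itertools import cycle

    # Toy "crypto engine": XOR with a repeating key. Trivially breakable --
    # for illustrating the PT/CT/key roles only, not for real use.
    def engine(text: bytes, key: bytes) -> bytes:
        return bytes(b ^ k for b, k in zip(text, cycle(key)))

    pt = b"The quick brown fox jumped over the lazy yellow dog"
    key = b"secret decoder ring"
    ct = engine(pt, key)          # unintelligible without the key
    assert engine(ct, key) == pt  # XOR is its own inverse: same engine decrypts
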
How Strong is Your Cryptography?

 In modern cryptography, the strength of a cryptographic system relies on the secrecy of the key and the mathematical validity of the algorithm in the cryptographic engine, not the algorithm’s secrecy
   Generally, the mathematical logic (encryption algorithm) in the engine is assumed to be known to a potential adversary – remember how the allies captured the German Enigma machine at the start of WWII?
   But if the key is kept secret, the CT will still be unreadable provided the encryption algorithm really scrambles things up as well as its designers think it does – the Enigma didn’t; and many modern crypto systems developed “in private”, without sufficient review by enough knowledgeable professionals, have been shown to have mathematical flaws that reduce their strength

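To see why that mathematical validity matters, consider what happens to the toy XOR engine from the previous sketch when an eavesdropper can guess even one stretch of plain text (a “crib”):

    from itertools import cycle

    def engine(text: bytes, key: bytes) -> bytes:
        return bytes(b ^ k for b, k in zip(text, cycle(key)))

    key = b"secret decoder ring"
    ct = engine(b"The quick brown fox jumped over the lazy yellow dog", key)

    # A guessed 19-byte crib hands the attacker the whole repeating key --
    # exactly the kind of flaw that review by knowledgeable professionals
    # is meant to catch before the system is fielded
    crib = b"The quick brown fox"                    # plausible guess at the PT
    leaked = bytes(c ^ p for c, p in zip(ct, crib))  # CT XOR PT = key stream
    assert leaked == key                             # key recovered; game over
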
Why Software Cryptography Doesn’t Help COMPUSEC That Much

 If cryptographic software is being used to assist in COMPUSEC (by encrypting files on a disk, for example, so unauthorized users won’t be able to make sense of them), it is no more trustworthy than the OS which dispatches it and allocates it memory to run in
   A corrupted OS might, for example, distribute surreptitious copies of the encryption keys or alter the code doing the encryption to weaken its strength (resistance to unauthorized decryption without the key); or, more likely, make surreptitious copies of everything before it gets encrypted
 Hence the degree of trust in the security of encrypted data can be no greater than the degree of trust in the TCB that could prevent unauthorized access to unencrypted files in the first place

Is Cryptography Useless, Then?

 Not at all. The military, for example, builds custom hardware to do its encryption. The same basic analysis as we just did applies, but the hardware equivalent of a TCB is much smaller than Windows XP, since it (the crypto hardware “TCB”) is special purpose, devoted entirely to encryption, and not a general purpose consumer product designed to allow children to run multi-tasking interactive games downloaded from the internet
 Being so much smaller and simpler, the hardware crypto logic can be analyzed to a fare-thee-well – it still won’t be cheap, but it won’t double the national debt either
 It is software cryptography whose utility is severely limited by the assurance level of the underlying TCB

Cryptography’s Dependency on Other Forms of Security

 Key compromise is a common attack on cryptographic systems
 Keys can be compromised by:
   Poor physical security – code books (containing keys) have been lost or stolen
   Poor personnel security – remember John Walker, the Navy cryptology specialist who sold codebooks to the Russians
   Poor computer security – if I gain access to the wrong files on your computer, I can see your PGP encryption keys; then your encrypted email messages will be easy for me to read
     If your COMPUSEC is weak in the first place I could probably just steal your email from your computer, but maybe your overall information security is strong enough that I can’t access your files as often as I’d like (e.g., the dastardly janitor only comes in on Sunday); but to steal your PGP key, I only need to access your files once

Roadmap: The Mechanisms for Information Security

 Introduction
 Mechanisms for information security and their inter-dependencies
   INFOSEC
   COMPUSEC
   COMSEC
     Cryptography
     Emissions security
     Transmission security
   Network Security
   OPSEC
   Personnel and procedural security
   Physical security
 Summary

Emissions Security (EMSEC)

 Informally: Protection against electronic eavesdropping (which can come in some surprisingly nasty forms)
 Formally:
“Protection resulting from all measures taken to deny unauthorized persons information of value which might be derived from intercept and analysis of compromising emanations from crypto-equipment, AIS, and telecommunications systems.”

EMSEC

 At first blush, it would seem that emissions security is unnecessary if adequate cryptographic measures are employed (so the bad guys pick up my encrypted transmission, so what?)
 There are two problems with this overly simplistic view:
1. Sometimes just the fact that communication is taking place is almost as sensitive as the actual content of the communication
2. No data is always encrypted – a classic and effective EMSEC attack is to intercept the electronic signature of a CRT
   With a big enough antenna and some signal processing software, eavesdroppers can display the contents of your CRT screen (where your data can’t be encrypted, or why put it on the CRT?) on their CRT despite never having had physical or network access to your machine (never getting within a hundred yards of it, in fact)

EMSEC’s Relationship to Other Forms of Security

 EMSEC is heavily dependent on physical security
   If, for example, the bad guys’ antennas were only good enough to intercept my CRT’s electromagnetic signature from a distance of 300 yards or less, and I could control the physical security to a radius of more than 300 yards, I would be in good shape
   If, on the other hand, I could control the physical security only to 200 yards, I’d need to find another way to reduce the emitted signal strength of my electronics so that the signals couldn’t provide eavesdroppers useful information beyond my 200 yard physical security perimeter
   Electronic shielding (a.k.a. TEMPEST) or jamming is a common solution
 EMSEC requirements are influenced by OPSEC needs
   It’s tougher to keep an adversary from knowing you have electronics around than it is to keep him from knowing when they’re actually doing something useful, which is tougher than keeping him from knowing exactly what they’re doing

Transmission Security

 Informally: Pass a folded note by hand directly to its intended recipient and it won’t matter that it wasn’t encrypted
 Formally:
“[A] Component of COMSEC resulting from the application of measures designed to protect transmissions from interception and exploitation by means other than cryptanalysis.”

Roadmap: The Mechanisms for Information Security

 Introduction
 Mechanisms for information security and their inter-dependencies
   INFOSEC
   COMPUSEC
   COMSEC
     Cryptography
     Emissions security
     Transmission security
   Network Security
   OPSEC
   Personnel and procedural security
   Physical security
 Summary
 Further information

Network Security

 Informally: The name says it all – the security of information on networks
 Formally:
Protection of networks and their services from unauthorized modification, destruction, or disclosure. It provides assurance the network performs its critical functions correctly and there are no harmful side-effects.

Network Security

[Diagram: three computers (#1, #2, #3) joined by a network]

 Network security is a combination of COMPUSEC and COMSEC
   Information must be protected when it’s in the computers of the network – that’s straight COMPUSEC
   Information must be protected when it is being communicated on the network – that’s straight COMSEC
 But the difficulties are compounded when one computer must trust another to help it protect some of its own information, and that second one may in fact depend on a third, and so on
   The problems are not qualitatively different from their COMPUSEC counterparts, but the quantitative aspects can make them really daunting

Example of Network Security Issues

[Diagram: user A connects to computer #3, which is networked to computers #1 and #2]

 Suppose user A is connected to computer #3 and requests some access to a data object stored on computer #2
 Computer #2’s security policy indicates that user A is allowed that mode of access to the requested data object
 But computer #2 has not itself authenticated user A’s claimed identification as user A and, in this example, has no protocol/software for doing so “remotely”; it must trust that the other computer’s TCB did the identification and authentication (I&A) properly, i.e., in a trustworthy manner

Network Security and the Network TCB

[Diagram: real user A on computer #3 and an imposter user A on computer #1, both reaching computer #2 over the network]

 Now there may be many disparate computers on the network; they may not all be equally trustworthy
 Computer #2 may trust computer #3 enough to accept its authentication of user A, but how does computer #2 know that the computer requesting access for user A is really computer #3 and not computer #1?
 Now computer #2 needs to I&A other computers as well as humans; not a conceptually new problem, but:
   Probably can’t use the same I&A software for computers as for humans (biometrics don’t work so well ;-) so
   That’s even more software that needs to be trusted and is hence part of the network TCB (NTCB)

MSJ-49
Trusting the NTCB

And remember, NTCB software can be no more trustworthy
than the underlying TCB on the computer on which it (any
piece of the NTCB) is running

So the NTCB as a whole can be no more trustworthy than the
least trusted TCB that participates in hosting the NTCB
software of the network
8/15/2002

Technically, there can be different virtual trusted networks running at
different levels of trust on the same physical comm network if the
COMSEC (usually crypto) is trusted enough to keep them separate

But within a given virtual trusted network, the NTCB trust level can
be no stronger than its weakest computer’s TCB
Copyright 2002 by M.S. jaffe
MSJ-50
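The weakest-link rule reduces to a one-liner (the trust levels here are invented for illustration):

    # Each hosting TCB gets a trust level; the virtual trusted network as a
    # whole can claim no more than the minimum
    tcb_trust_level = {"computer1": 1, "computer2": 4, "computer3": 3}
    ntcb_trust_level = min(tcb_trust_level.values())   # the network gets a 1
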
OPSEC: Operations Security

 Informally:
“We can tell something is up at the White House by keeping track of the number of pizzas delivered after midnight”
 Formally:
“[The] process of denying to potential adversaries information about capabilities and/or intentions by identifying, controlling and protecting generally unclassified evidence of the planning and execution of sensitive activities.”

OPSEC Deals with What Can Be Inferred from What Can Be Observed

 OPSEC may be thought of as a form of “inferential” security, trying to keep the bad guys from ascertaining sensitive information based on inferences drawn from the observation of somewhat less sensitive information – e.g., we won’t attack Iraq without a lot of late night staff work at the White House, and the bad guys keep track of the “normal” level of after hours pizza delivery; if the level of pizza delivery is still normal, any attack can’t be scheduled for earlier than next week

OPSEC is Heavily Dependent on Physical and Personnel Security

 The greater the degree of physical control over the greater area, the less “inferentially useful” information will be observable by the bad guys; the more we trust the people who do have access to areas where inferentially useful information can be observed, the less we worry about such information falling into the bad guys’ hands
 E.g., if no Iraqis are allowed within 5 miles of the White House (assuming there are take-out pizza parlors within that 5 mile radius), and we trust everyone else who is allowed within the five mile radius not to give pizza information to the Iraqis, then our White House staff can eat their pizzas without worrying about the cost in soldiers’ lives of doing so (ordering takeout pizza)

The Two Big Problems of OPSEC (and One Saving Grace for INFOSEC)

 The two big problems of OPSEC are:
   Figuring out all these odd clues from which sensitive inferences might be drawn (pizza delivery rates, for crying out loud!)
   Figuring out how to deny observation of such clues to potential bad guys
     There’s a reason much of Nevada is one giant, controlled military test area
     But then what can one do about satellites overhead? (Lots, actually – like schedule no crucial tests or exercises when the bad guys’ satellites are in a position to observe them)
 One saving grace in all of this is that inferences are often not terribly specific; the pizza delivery rate indicates we’re going to attack somebody, but maybe it’s France, not Iraq

Physical Security

 Informally: Keeping the bad guys out of places they’re not supposed to be
 Formally:
“The physical measures necessary to safeguard equipment, material, and documents from access thereto or observation thereof by unauthorized persons.”

Physical Security and Access Control

 Physical security needs are particularly dependent on threat analysis: what type of access is threatening? Is the threat observation of sensitive data on a CRT screen or insertion of a Trojan Horse floppy disk into a disk drive?
 The type of access required to effect an attack can vary tremendously:
   Floppy disk insertion requires physical access to the computer itself (from a distance of essentially zero)
   Eyeball capture of screen data only needs line-of-sight from within perhaps a dozen feet
   Camera capture of classified images from a CRT still requires line of sight, but may be possible at ranges out to a hundred feet or more
   Electronic intercept of unshielded CRT signatures can be done from surprisingly far away: the bigger the antenna the bad guys can bring in without detection, the farther away they can be and still capture the CRT data

Physical Security and Personnel Security

 Improved personnel security can reduce the need for physical security, and vice versa
 Suppose, for example, a worrisome attack requires the attacker to insert a floppy disk into a computer; if the computer is in a secure room where the only people with physical access to the entire room are trusted not to insert Trojan Horse disks, physical security is already adequate for that threat
 Suppose janitorial staff clean the room at night and are not sufficiently trusted. Might have to have a continuous trusted escort, or put a lock on the floppy disk drive, no?

Personnel Security

 Informally: Not hiring bad guys, and keeping good guys from becoming bad guys
 Formally: The ongoing screening, selection, management, and evaluation of people with security clearances, sensitive positions, and/or special access