Software Security Assessment
COEN 225
Code Auditing vs. Black Box Penetration Testing
Security audits of software:
White box testing
Auditors work with source code
+: Complete code coverage is possible
-: Complexity:
Manual code inspection
Automated tools such as
RATS, ITS4, Splint, Flawfinder, Jlint, Codespy
Tools are imperfect and need to be supported by manual review
-: Occasional lack of availability of source code
Black box testing
Auditors provide input to program / service under audit.
+: Black box testing is always possible
+: Portability
Can test several applications with the same test suite
+: Simplicity
-: Coverage
-: Lack of intelligence
Black Box Testing
Manual Testing
  E.g.: Provide single quotes to various parameters in a form to find an SQL injection or XSS attack possibility
Automated Testing or Fuzzing
  Pros:
    Availability: Fuzzing is always possible
    Reproducibility: Fuzzing tests port to similar applications
    Simplicity: Fuzzing replaces analysis with extensive trials
  Cons:
    Coverage: Good coverage usually requires code inspection
    Intelligence: Fuzzing is unlikely to find complicated attack patterns
Gray box testing
Combines black box testing with some Reverse Engineering (RE)
  RE is used to identify possible vulnerabilities
Manual gray box testing
  Use IDA Pro or a similar disassembler to generate assembly code from the binary
  Identify possible vulnerabilities
  Generate sample input
Automated gray box testing
  A number of tools automate the process:
  BugScam, Inspector, Bin Audit, LogiScan, SecurityReview
Gray box testing
  Pros:
    Availability: Can always be done
    Coverage: Better than black box testing
  Cons:
    Complexity: Reverse Engineering is very difficult
Example
    struct keyval
    {
        char *key;
        char *value;
    };

    int handle_query_string(char *query_string)
    {
        struct keyval *qstring_values, *ent;
        char buf[1024];

        if(!query_string)
            return 0;
        qstring_values = split_keyvalue_pairs(query_string);
        if((ent = find_entry(qstring_values, "mode")) != NULL)
        {
            sprintf(buf, "MODE=%s", ent->value);
            putenv(buf);
        }
        …
    }

Vulnerability:
  The programmer assumes that ent->value fits into the buffer,
  but ent->value is controlled by input.
Example
The web server behaves differently if the query string contains mode=xxx
  It places the string xxx into the buffer
  The buffer can overflow
Black box testing will have difficulties finding this possible vulnerability
Gray box testing needs to find the if statement
Code inspection can find the faulty sprintf statement and check for the existence of an actual vulnerability
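A defensive rewrite of the faulty sprintf call can be sketched as follows. The helper name build_mode_env is hypothetical, introduced only for illustration; the slides do not name it:

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical helper sketching the fix for the example above:
 * snprintf bounds the write, and truncation is treated as an error,
 * so an attacker-controlled "mode" value cannot overflow the buffer. */
int build_mode_env(char *buf, size_t buflen, const char *value)
{
    int n = snprintf(buf, buflen, "MODE=%s", value);
    /* snprintf returns the length it WOULD have written;
     * n >= buflen means the value did not fit. */
    if (n < 0 || (size_t)n >= buflen)
        return -1;
    return 0;
}
```

With the original 1024-byte buffer, any mode value longer than 1018 characters is rejected instead of corrupting the stack.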
Code Auditing and Development Life Cycle
System Development Life Cycle
Feasibility study
Requirements definition
Design
Implementation
Integration and Testing
Operation and Maintenance
Trust Relationships
Trust relationships:
  Different components in a system place varying degrees of trust in each other.
  Trust relationships need to be made explicit and investigated.
Transitive Nature of Trust
Trust Relationships:
Misplaced Trust
Misplaced Trust = Making an unfounded assumption
Input:
  Most vulnerabilities are triggered by malicious input
  Developer assumes that no one enters a 5000-character telephone number
  Developer assumes that a related software module sanitizes input to the module under development
Interfaces:
Mechanisms by which software components communicate with
each other and the outside world.
Developers may:
  choose a method of exposing an interface that does not provide enough protection from external attackers
  choose a reliable method of exposing an interface, but configure it incorrectly
  assume that an interface is too difficult for an attacker to access
Example:
Custom network interface with custom encryption.
Attacker needs to reverse engineer a client
Environment
  Software does not run in a vacuum
  The developer trusts the environment, but the attacker manipulates it
  Classic example: the /tmp race
    Application creates a file in /tmp or /var/tmp
    Attacker creates a symbolic link while the app is running
    Application writes to the symbolic link instead
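The /tmp race above can be avoided on POSIX systems by creating the temporary file atomically; this is a minimal sketch, assuming a POSIX platform:

```c
#include <stdlib.h>
#include <unistd.h>

/* Sketch (POSIX): mkstemp() picks an unpredictable name and creates the
 * file atomically with O_CREAT|O_EXCL, so a symbolic link pre-planted at
 * a guessed name makes the call fail instead of being followed.
 * template_buf must end in "XXXXXX" and is modified in place. */
int open_safe_tempfile(char *template_buf)
{
    return mkstemp(template_buf);   /* returns an open fd, or -1 on error */
}
```

Typical use: `char path[] = "/tmp/app_XXXXXX"; int fd = open_safe_tempfile(path);`, unlinking the file when done. By contrast, opening a fixed, predictable name such as /tmp/app.tmp is exactly the racy pattern described above.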
Exceptions
  Attacker causes an unexpected change in the application's program flow by external measures
  Example:
    App writes to an (attacker-controlled) pipe
    Attacker causes the pipe to close just before the write
    Results in a SIGPIPE signal (on *nix)
    App aborts, possibly leaving data incoherent
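One standard defense, sketched below for POSIX systems: if the process ignores SIGPIPE, the failed write surfaces as an ordinary error code instead of killing the application. The function name is illustrative:

```c
#include <errno.h>
#include <signal.h>
#include <unistd.h>

/* Sketch (POSIX): with SIGPIPE ignored, writing to a pipe whose read end
 * was closed fails with errno == EPIPE instead of terminating the process,
 * so the condition can be handled like any other I/O error.
 * Returns 1 if the failed write surfaced as EPIPE, 0 or -1 otherwise. */
int closed_pipe_write_is_handled(void)
{
    int fds[2];

    signal(SIGPIPE, SIG_IGN);   /* turn the signal into an errno value */
    if (pipe(fds) != 0)
        return -1;
    close(fds[0]);              /* simulate the attacker closing the pipe */
    ssize_t n = write(fds[1], "x", 1);
    int handled = (n == -1 && errno == EPIPE);
    close(fds[1]);
    return handled;
}
```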
Design Review
Algorithms
  E.g.: A sorted list poses a DoS risk if an attacker can cause it to grow beyond reasonable bounds
Problem Domain Logic (Business Logic)
  Banking example:
    A person can make one monthly transaction with their money market account to or from checking
    Can make unlimited transfers to the checking account
    If the checking account is below a limit, money is transferred from the money market account to checking
Design Review
Trust Relationships
Trust boundary
  Reflects a limitation of trust between modules
Trust domains
  Regions of shared trust, limited by trust boundaries
Model
  Abstraction that presents these relationships
Design Review: Trust Relationship
Win98 Trust Model
  Users are not protected from each other
  If not networked, an attacker needs physical access to the machine
  (Diagram: the "Rest of World" lies outside the Physical Access Boundary; the Administrator sits inside the Administrative Privilege Boundary; User 1 and User 2 share the same trust domain)
Design Review
Examples for Design Flaws
Exploiting Strong Coupling
  Application is not decomposed along trust boundaries
  Example: Windows Shatter attack
    Windows WM_TIMER message handling can enable privilege elevation (12/2002)
    Interactive processes need to react to user events
    One mechanism is WM_TIMER, sent at the expiration of a timer
    Causes the process to execute a timer callback function
    One process can cause another process to execute a timer callback function (of its choosing), even if the second process did not set a timer
    Several processes run with LocalSystem privileges
    An attacker logs onto the system interactively and executes a program that levies a WM_TIMER request upon a LocalSystem-privileged process, causing it to take any action the attacker specifies
    Fixed by Microsoft in 2003
Exploiting transitive trusts
Solaris Example:
Solaris contains automountd
Runs as root
Allows root to specify a command as part of a mounting operation
Does not listen on an IP network
Available only through three protected loopback transports
Solaris contains rpc.statd
Runs as root
Listens on TCP and UDP interfaces
Monitors NFS servers to send out notification if they go down
Clients tell rpc.statd which host to contact and what RPC program
number to call on host
Solaris Example continued:
Attacker registers local automountd with rpc.statd
Attacker tells rpc.statd that NFS server has crashed
rpc.statd contacts automountd daemon through
loopback device
automountd trusts message since it comes from root
through loopback device and carries out a command of the
attacker’s choice.
Some work needed to make request a valid automountd
request.
Failure Handling
  User friendly:
    Recovers from the problem
    Generates assistance in solving the problem
  Security conscious:
    Assumes that failure conditions are the result of an attack
    Closes down the app without explanation
Authentication
  Lack of authentication
    Attacker can get access to a (presumably) private interface between modules in an app
    Example: Web site does authentication in a main page, but then does not check it when using links from the main site
  Untrustworthy credentials
    Versions of sadmind were shipped with a default of "no authentication required" (1999)
    Use of the source IP address as a credential
Authorization
Omitting authorization checks
Allowing users to set up authorization themselves
…
Accountability
Logging Failure
Example: Log of login attempts

    200801091536 Logon Failure: Bob
    200801091539 Logon Success: Alice
    200801091547 Logout: Alice

If the user name allows newlines, an attacker can forge entries. Logging in with the
username "Alice\n200801091559 Logon Success: Alice" yields:

    200801091536 Logon Failure: Bob
    200801091539 Logon Success: Alice
    200801091559 Logon Failure: Alice
    200801091559 Logon Success: Alice
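The forgery above can be prevented by neutralizing line breaks before user-controlled data reaches the log. This is a minimal sketch; sanitize_for_log is an illustrative helper name, not from the slides:

```c
#include <string.h>

/* Sketch: replace newline and carriage-return characters in a
 * user-controlled string in place, so a crafted username cannot
 * inject additional log lines. */
void sanitize_for_log(char *s)
{
    for (; *s; s++)
        if (*s == '\n' || *s == '\r')
            *s = '_';
}
```

Applied to the attack string above, the fake "Logon Success" stays on the same log line as the real "Logon Failure" entry.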
Confidentiality & Integrity
Obfuscation instead of encryption
Insecure Use of Encryption
Example: XOR-encryption
Storing Sensitive Data Unnecessarily
Example: Storing a password
Instead store (1-way) salted hash of password
Without salt, can use rainbow tables to crack password
Bait & Switch Attacks
Example: Using an insecure hash (MD5, SHA1) to validate
Application signs hash of request
If hash is insecure, can generate request with the same hash
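The salting point above can be illustrated with a toy hash. FNV-1a is emphatically NOT a password hash (a real system would use a slow, dedicated scheme such as bcrypt, scrypt, or PBKDF2); it only demonstrates that a per-user salt makes identical passwords store different digests, which is what defeats precomputed rainbow tables:

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Toy 64-bit FNV-1a hash, for illustration of the salting pattern only. */
uint64_t fnv1a(const char *s)
{
    uint64_t h = 1469598103934665603ULL;   /* FNV offset basis */
    while (*s) {
        h ^= (unsigned char)*s++;
        h *= 1099511628211ULL;             /* FNV prime */
    }
    return h;
}

/* Hash salt and password together: the same password under two
 * different salts yields two different stored values. */
uint64_t salted_hash(const char *salt, const char *password)
{
    char buf[256];
    snprintf(buf, sizeof buf, "%s:%s", salt, password);
    return fnv1a(buf);
}
```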
Design Review
Threat Modeling
Michael Howard and David LeBlanc: Writing Secure Code, Microsoft Press, 2002
Frank Swiderski and Window Snyder: Threat Modeling, Microsoft Press, 2004
Process during the design phase, updated in later development phases:
1. Information Collection
2. Application Architecture Modeling
3. Threat Identification
4. Documentation of Findings
5. Prioritizing of Implementation Review
Design Review
Threat Modeling: Information Collection
Goal: Get understanding of application
Assets:
What has value for an attacker?
Entry points:
Path through which an attacker can access the system.
External entities:
Those that communicate with process through the entry
points
External trust levels
Major components
Use scenarios
Developer Interviews
  Keep in mind: developers have put a lot of effort into their work
  Avoid any judgmental or condescending overtones
Developer Documentation
  Often incomplete
  Often no longer representative of the implementation
Standards Documentation
Source Profiling (Not Source Inspection)
System Profiling
  Requires access to a functioning installation
  Approaches:
    File system layout
    Code reuse
    Imports and Exports
    Sandboxing: determine all objects touched and all activities performed
    Use a sniffer and application proxies to record any network activity
    Use tools such as FileMon, RegMon, WinObj, Process Explorer
    Scanning
Design Review
Threat Modeling: Application Architecture Modeling
Create Data Flow Diagrams
  (DFD: the User sends http/https requests to the Web Application and receives http/https answers; the Web Application sends database queries to the Database and receives database responses)
DFD level-0 diagram of login process
  (DFD: the User sends a login to the Login process and receives a login status; the Login process exchanges database queries and responses with the Database; the User sends operational requests to Authenticated Operations and receives operational responses; Authenticated Operations also exchanges database queries and responses with the Database)
(Expanded login DFD: the User submits a login request → check for HTTPS; if not HTTPS, redirect to https; once the https connection is accepted, look up the user and query the Database for the user's password salt; an invalid salt value means login failed and access denied; with a valid salt, check the password by querying the Database for the username with the salted password; the Database returns the user record and the login is accepted, or an invalid password leads to access denied)
Design Review
Threat Identification
Process of determining an application’s
security exposure
Uses attack trees
Design Review
Threat Identification
Attack tree (graphical form):
1. Adversary gains access to a user's personal information
  1.1 Gain direct access to the database
    1.1.1 Exploit a hole in system application or kernel
  1.2 Log in as target user
    1.2.1 Brute-force login
      1.2.1.1 Identify user name
      1.2.1.2 Identify user password
    1.2.2 Steal user credentials
  1.3 Hijack user session
    1.3.1 Steal user session cookie
  1.4 Passively intercept personal data
    1.4.1 Identify user connection initiation
    1.4.2 Sniff network traffic for personal data
Design Review
Threat Identification
Textual representation:
1. Adversary gains access to a user's personal information
  OR 1.1 Gain direct access to the database
       1.1.1 Exploit a hole in system application or kernel
     1.2 Log in as target user
       OR 1.2.1 Brute-force login
            AND 1.2.1.1 Identify user name
                1.2.1.2 Identify user password
          1.2.2 Steal user credentials
     1.3 Hijack user session
       1.3.1 Steal user session cookie
     1.4 Passively intercept personal data
       AND 1.4.1 Identify user connection initiation
           1.4.2 Sniff network traffic for personal data
Design Review
Threat Mitigation
Adorn attack tree with threat mitigation
measures
Dashed lines indicate improbable attack
vectors
Design Review
Threat Mitigation
Attack tree with mitigation measures:
1. Adversary gains access to a user's personal information
  1.1 Gain direct access to the database
    1.1.1 Exploit a hole in system application or kernel (mitigation: system patches up to date)
  1.2 Log in as target user
    1.2.1 Brute-force login
      1.2.1.1 Identify user name
      1.2.1.2 Identify user password
    1.2.2 Steal user credentials
  1.3 Hijack user session
    1.3.1 Steal user session cookie (mitigation: https required)
  1.4 Passively intercept personal data
    1.4.1 Identify user connection initiation
    1.4.2 Sniff network traffic for personal data (mitigation: https required)
Design Review
Documentation of Findings
Threat summary structure:
  Threat: Brute force login
  Affected component: Web application login
  Description: Clients can brute-force usernames and passwords by repeatedly connecting and attempting to log in. This threat is increased because the application returns different error messages for invalid usernames and passwords, making usernames easier to guess.
  Result: Untrusted clients can gain access to a user account and therefore read or modify sensitive information.
  Mitigation strategies: Make error messages ambiguous so that an attacker does not know whether the username or the password is invalid. Lock the account after repeated failed login attempts.
Design Review
DREAD Risk Ratings
Brute force login:
  Damage potential:  6
  Reproducibility:   8
  Exploitability:    4
  Affected users:    5
  Discoverability:   8
  Overall:           6.2
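The overall score is the arithmetic mean of the five category ratings, which the numbers above reproduce:

```c
/* DREAD overall rating: the arithmetic mean of the five category scores.
 * For brute force login: (6 + 8 + 4 + 5 + 8) / 5 = 6.2 */
double dread_overall(int damage, int reproducibility, int exploitability,
                     int affected_users, int discoverability)
{
    return (damage + reproducibility + exploitability +
            affected_users + discoverability) / 5.0;
}
```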
Operational Review
Operational vulnerabilities result
  from the application's configuration
  from the deployment environment
They can result from
  configuration options
  failure to use platform security mechanisms properly
  insecure deployment
  an insecure base platform
Hence, responsibility falls between developer and administrative personnel
Operational Review
Attack surface reduction
  Minimizing the attack surface = hardening the platform
  Get rid of unnecessary services
  Use virtualization
  Example: IIS HTR vulnerabilities
    Scripting technology not widely used because supplanted by ASP
    IIS enabled the HTR service by default
    1999-2002: a number of HTR vulnerabilities
Insecure Defaults
  Often chosen in order to make installation simple
  Pay attention to
    The application's default settings
    The platform's default settings
Operational Review
Access Control
  Externally, an application depends completely on the access control of the host OS or platform
  Example:
    Python on Windows installed in C:\Python25
    Windows grants default write permissions to any direct subdirectory of the C: drive
    python.exe uses msvcr71.dll
    Attacker can plant a msvcr71.dll in the Python25 directory
    python.exe will pick up the msvcr71.dll in its own directory by preference
Operational Review
Secure Channel Vulnerabilities
  Simply not using a secure channel
    Example: Web site sends the session cookie in the clear
    Acceptable for web-based email
    Not acceptable for banking
Spoofing and Identification
  Trusting TCP or UDP source addresses
Network profiles
  NFS or Server Message Block (SMB) are acceptable inside a firewall, but not outside it
Operational Review
HTTP request methods:
  Question honoring TRACE, OPTIONS, and CONNECT requests
    OPTIONS: lists all methods the server accepts
    TRACE: echoes the request body back to the client; worry about cross-site scripting attacks
    CONNECT: provides a way for proxies to establish SSL connections
Directory Indexing
Operational Review
Protective Measures
  Stack protection
    Non-executable stack
    Canaries
  Address space layout randomization
  Registered function pointers
    Long-lived function pointer calls are wrapped by protective checks
  Virtual machines
Operational Review
Host-based measures
Object and file system permissions
Restricted accounts
Chroot jails
System virtualization
Enhanced kernel protection
One virtual system per service
SELinux, Core Force
Host-based firewalls
Antimalware applications
File and object change monitors
Host-based intrusion detection systems
Operational Review
Network-based measures
Network address translation
Virtual private networks
Network Intrusion Detection Systems (NIDS)
Application Review Process
Application Review Process
Process Outline
  Preassessment
    Planning and scoping an application review
    Setting up a time line
  Application Review
  Documentation and Analysis
  Remediation Support
Application Review Process
Application access can be
  Source only
  Binary only
  Both binary and source
  Checked build (binary with additional debug information)
  Strict black box
Application Review Process
Application Review
  It is natural to follow the waterfall model, starting with design.
  However, design review needs a thorough understanding of the code, which comes with exposure.
  Therefore, postpone the design review.
Application Review Process
The methodology is constrained by the code reviewer's capability to concentrate.
Application review process:
  1. Initial preparation
  2. Iterate through 2-3 hour cycles:
     1. Plan
     2. Work
     3. Reflect
     4. Break
  3. Documentation, Analysis, and Recommendations
Without documentation, derive the design from the implementation: Top Down, Bottom Up, or Hybrid.
Application Review Process
Code Navigation
Described in terms of
  External flow sensitivity: control flow vs. data flow
  Tracing direction: forward vs. backward
Application Review Process
Code Navigation

    int foo(int c)
    {
        if(c == 4)
        {
            bar(c);
            if(c == 72)
                fubar();
            for(; c; c--)
                updateGlobalState();
        }
    }

Ignoring control flow and data flow: read the code from top to bottom.
Control flow sensitive: start at the top of foo. Then inspect bar, fubar, and updateGlobalState.
Data flow sensitive: start at the top of foo. Then follow into bar, because it receives c. Do not follow fubar and updateGlobalState, because they do nothing with c.
Control and data flow sensitive: you would have some idea of the range of c. For example, if c is always larger than 40, you would not bother following bar.
Application Review Process
Code Auditing Strategies
Three basic categories of code auditing strategies:
1. Code Comprehension Strategies
   Analyze the source code directly, to discover vulnerabilities and to improve understanding of the code
2. Candidate Point Strategies
   1. Create a list of potential issues
   2. Examine the source code to determine the relevance of these issues
3. Design Generalization Strategies
   Analyze for potential medium- to high-level logic and design flaws
Application Review Process
Code Comprehension Strategies
CC1: Tracing malicious input
  Start at an entry point to the system
  Trace the flow of code forward, while performing limited data flow analysis
  Basically, keep a set of "malicious inputs" in the back of your mind as you read the code
  Focus effort on any type of behavior that fits into a vulnerability class that you know
Strengths:
  Inherent focus on security-relevant code
  Can sometimes identify subtle flaws
  Difficult to go off track
Weaknesses:
  Code and data paths balloon up quickly
  Easy to overlook issues
  Requires focus and experience
Application Review Process
Code Comprehension Strategies
CC2: Analyze a module
  Read a single source file from top to bottom
  Look at each function in a vacuum and document potential issues
Strengths:
You learn the way application is programmed
Easier to analyze cohesive modules
Can find subtle and abstract flaws
Weaknesses:
Mentally taxing
Easy to mismanage time
Constant documentation requires discipline
Application Review Process
Code Comprehension Strategies
CC3: Analyze an algorithm
  Look at the algorithm and identify any possible weakness in its design
  Pick security-relevant algorithms:
    Crypto
    Security model enforcement
    Input processing
Strengths:
  You cannot go off track
  Can find subtle and abstract flaws
Weaknesses:
  Mentally taxing
  Lacks context
Application Review Process
Code Comprehension Strategies
CC4: Analyze a class / object
  Focus on a class instead of a module (CC2)
Strengths:
Less likely to go off track than for module analysis
Weaknesses:
Mentally taxing
Might lack context
More likely to go off track than algorithm analysis
Application Review Process
Code Comprehension Strategies
CC5: Trace black box hits
  Start out with a list of possible problems obtained by fuzzing or manual black box testing
    Problems: program crashes or program displays useful information
  Trace input to vulnerabilities
Strengths:
  Traces some form of known issue
  Easy to stay on track
  Simple
Weaknesses:
  Ignores many other issues
  Has the potential to produce many false positives
Application Review Process
Candidate Point Strategies
CP1: General Candidate Point Approach
  Start with low-level routines that grant access to application assets or that could harbor vulnerabilities
  Trace backward through the code to see whether these routines expose any vulnerabilities accessible from an application entry point
Strengths:
  Good coverage of known vulnerability classes
  Not too difficult
  Hard to get off track
Weaknesses:
  Biases towards a limited set of potential issues
  Comprehensive impact is much lower than with code comprehension strategies
  The results are only as good as your candidate points
Application Review Process
CP1 Example
Assume a tool reports: util.c line 143: sprintf() used on a stack buffer
  You cannot determine whether this bug is exploitable unless you can control either argument and get it to be long enough to overflow the buffer
  Need to check all calls to the function

    int construct_email(char *name, char *domain)
    {
        char buf[1024];
        sprintf(buf, "%s@%s", name, domain);
        ...
    }
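If the audit confirms the arguments are attacker-controlled, the fix mirrors the earlier sprintf example: bound the write and report truncation. This is a sketch; the out/outlen parameters are added for illustration and are not in the original:

```c
#include <stdio.h>
#include <string.h>

/* Sketch of a safe rewrite of construct_email(): snprintf bounds the
 * write and truncation is reported, so overly long attacker-controlled
 * name or domain values cannot overflow a stack buffer. */
int construct_email_safe(char *out, size_t outlen,
                         const char *name, const char *domain)
{
    int n = snprintf(out, outlen, "%s@%s", name, domain);
    if (n < 0 || (size_t)n >= outlen)
        return -1;   /* result would not fit */
    return 0;
}
```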
Application Review Process
Candidate Point Strategies
CP2: Automated source analysis tool
  Early source analysis tools were simply lexical analyzers
    Search for patterns matching potentially vulnerable source code
  Newer systems do a more extensive analysis job
    Helpful in identifying candidate points
    Offer some level of analysis to speed up a manual confirming review
Strengths:
  Good coverage for easily identified vulnerabilities
  Not too difficult
  Hard to get off track
Weaknesses:
  Biases towards only a limited set of potential issues
  Comprehensive impact is much lower than with code comprehension strategies
  The results are only as good as your tool
Application Review Process
Candidate Point Strategies
CP3: Simple lexical candidate points
  A wide range of vulnerabilities (SQL injection, format string vulnerabilities) can be easily identified
  Simple tools can find all instances of a certain vulnerability class
  Eliminate from this list everything that cannot be a vulnerability, based on whether a module handles any potentially malicious input
  After paring down, use the candidate point strategy
Strengths:
  Good coverage for known vulnerability classes
  Not too difficult
  Hard to get off track
Weaknesses:
  Confirms only a limited set of issues
  Does not lead to comprehension
  Search results depend on the pattern used
Application Review Process
Candidate Point Strategies
CP4: Simple binary candidate points
  Certain classes of vulnerabilities can also be found in binary code
    E.g., sign extension vulnerabilities, by looking for the MOVSX instruction
  Trace backward from these candidates
Strengths:
  Good coverage for known vulnerability classes
  Not too difficult
  Hard to go off track
Weaknesses:
  Confirms only a limited set of issues
  Does not lead to comprehension
  Search results depend on the pattern used
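The bug class that MOVSX hints at can be sketched in C: a signed 8-bit length is sign-extended on conversion to an unsigned size, so the byte 0x80 (-128) becomes an enormous count. The function names are illustrative:

```c
#include <stddef.h>

/* Sign extension bug: converting a negative signed char to size_t
 * yields a huge unsigned value (e.g. a copy length). */
size_t naive_count(signed char len)
{
    return (size_t)len;                 /* sign extension happens here */
}

/* Checked variant: reject negative lengths before conversion. */
size_t checked_count(signed char len)
{
    return len < 0 ? 0 : (size_t)len;
}
```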
Application Review Process
Candidate Point Strategies
CP5: Black box generated candidate points
  Use fuzzing or manual black box testing to find issues
  Trace them back to user-malleable input
Strengths:
  Good coverage for known vulnerability classes
  Not too difficult, after training and depending on the trace
  Hard to go off track
Weaknesses:
  Confirms only a limited set of issues
  Does not lead to comprehension
  Results depend on the tool
Application Review Process
Example

    movzx   ecx, word ptr [eax+0Ah]
    dec     ecx
    mov     edx, ecx
    shr     ecx, 2
    lea     edi, [eax+19h]
    rep movsd
    mov     ecx, edx
    and     ecx, 3
    rep movsb
    pop     edi
    pop     esi

A huge copy occurs if we control the short integer at [eax+0Ah] and set it to zero
  The dec ecx will result in an integer underflow
  The crash would occur on the rep movsd instruction (which gets the number of moves from the ecx register)
Once the crash is identified, we need to figure out where [eax+0Ah] is populated
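The underflow in the assembly above can be sketched in C; copy_count is an illustrative name for the value the rep instructions consume:

```c
#include <stdint.h>

/* C sketch of the assembly pattern: the 16-bit length field is loaded
 * (movzx), decremented (dec ecx), and used as the copy count. A field
 * value of 0 wraps around to 0xFFFFFFFF, causing a huge copy. */
uint32_t copy_count(uint16_t field)
{
    uint32_t ecx = field;   /* movzx ecx, word ptr [eax+0Ah] */
    ecx -= 1;               /* dec ecx: underflows when field == 0 */
    return ecx;             /* count consumed by rep movsd / rep movsb */
}
```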
Application Review Process
Example
A crash is analyzed by:
  Finding the instruction where the program crashed
  Examining why it crashed there:
    Invalid source operand?
    Invalid destination written to?
    Index into memory too large?
    Loop counter not a sane value?
  Working backward to determine where the invalid operand came from
  Connecting the invalid operand with some data fed to the program at the entry point that was fuzz-tested; determine the part of the data that causes the exception to occur
Application Review Process
Example
Dealing with faults where the application seems to crash at a random location:
  Usually memory corruption, where the corrupted portion is not used immediately
In our example:
  You determined that [eax+0Ah] was set to 10h at initialization and never changed
  But obviously it now contains 0
  Two possibilities:
    1. Memory corruption in the structure to which eax points
    2. Corruption of another buffer on the heap has overwritten the structure eax points at
  In the first case: fuzzing with the same input should result in an identical crash
  In the second case: the application might crash somewhere else or not at all
Application Review Process
Candidate Point Strategies
CP6: Application-specific candidate points
  Sometimes, after working with an application, you find recurring vulnerability patterns
  Search for these resulting patterns: the application-specific candidate points
Strengths:
  Good balance of speed and depth of coverage
  Not too difficult
  Hard to go off track
Weaknesses:
  Requires a thorough understanding of the code base
  Deals only with a limited set of issues
Application Review Process
Design Generalization Strategies
DG1: Model the system
  Start with the implementation and reverse engineer the design
  Most effective method for identifying logic and design vulnerabilities
  Can identify even some operational vulnerabilities
  Limit yourself to security-critical components
Strengths:
  Provides detailed knowledge of the application's design and architecture
Weaknesses:
  Requires a thorough understanding of the system implementation
  Requires focus and experience
  Can be extremely time consuming
Application Review Process
Design Generalization Strategies
DG2: Hypothesis testing
  Determine the design of smaller elements of code by making a hypothesis and testing it
  If you are correct, you have reverse engineered a part of the application and can investigate its consequences
  If not, then you probably understand the code element better and can make a better guess
Strengths:
  Faster method to identify issues with the design
  Helps build a good understanding of the design
  Well suited to identifying more complex and subtle issues
Weaknesses:
  Easy to go off track
  Poor assumptions can derail the design analysis
  Mentally taxing
Application Review Process
Design Generalization Strategies
DG3: Deriving purpose and function
  Try to directly identify the abstraction that the implementation represents
  Pick key programmatic elements and summarize them
  Should lead to a good understanding of the programmatic idioms responsible for the components of the trust model
  Derive and identify design and architectural issues
Strengths:
  Focuses on areas known to be security relevant
  Builds a good understanding of the application design and architecture
  Builds a good understanding of individual design aspects
Weaknesses:
  Poor assumptions can derail later elements of the review
  Mentally taxing
Application Review Process
Design Generalization Strategies
DG4: Design conformity check
Focuses
on vulnerabilities arising from
differences between implementation and
design
Design is
typically
Strengths:
Hard
to go off underspecified
track
Implementation
canbalance
also just
deviate
from design
Provides good
between
implementation
and
design understanding
Method tries
to find “policy
breaches”
Much easier than deriving function without a design
These are
then analyzed for security
Weaknesses:
Misinterpretation of design could result in overlooking
consequences
vulnerabilities
Quality of result depends on the quality of original design
Application Review Process
Code Auditing Tactics
  Purpose: make errors, such as skipping a line of code, less likely
  A set of simple tricks
Application Review Process
Code Auditing Tactics
Internal Flow Analysis
  Follow all control and data flows in a given module
  Overlooked, but potentially relevant code:
    Error-checking branches
    Pathological code paths
    Functions with many small and non-terminating branches
  Example:

    char *ReadString(int fd, int maxlength)
    {
        int length;
        char *data;

        if(read_integer(fd, &length) < 0)
            return NULL;
        data = (char *) malloc(length + 1);
        if(data == NULL)
            return NULL;
        if(read(fd, data, length) < 0)
        {
            free(data);
            return NULL;
        }
        data[length] = '\0';
        return data;
    }

  If read_integer fails: return NULL; not very exciting.
  If read fails: free(data), then return NULL.
  There is a major difference in handling a failure between these two function calls.
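A hardened variant can be sketched as follows. Note that the original accepts a maxlength parameter it never uses, and never validates the attacker-controlled length; the read_integer helper below is an assumption for the sketch, since the slides do not show its implementation:

```c
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

/* Assumed helper: reads a binary int length field from the descriptor. */
static int read_integer(int fd, int *out)
{
    return read(fd, out, sizeof *out) == (ssize_t)sizeof *out ? 0 : -1;
}

/* Hardened sketch of ReadString: validates length against maxlength
 * (which the original accepted but never used), avoids the length+1
 * overflow for negative or huge lengths, treats short reads as errors,
 * and frees data on every failure path. */
char *read_string_checked(int fd, int maxlength)
{
    int length;
    char *data;

    if (read_integer(fd, &length) < 0)
        return NULL;
    if (length < 0 || length > maxlength)    /* reject hostile lengths */
        return NULL;
    data = malloc((size_t)length + 1);
    if (data == NULL)
        return NULL;
    if (read(fd, data, (size_t)length) != (ssize_t)length) {
        free(data);                          /* consistent failure handling */
        return NULL;
    }
    data[length] = '\0';
    return data;
}
```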
Application Review Process
Code Auditing Tactics
Subsystem and dependency analysis
  Often, security-sensitive code is spread over several modules
Rereading code
  With different emphasis and vulnerability-class targets
Application Review Process
Code Auditing Tactics
Desk-Checking
  Create a table of all variables in a code fragment
  Populate them with initial values
  Follow execution by stepping through each line of code
Application Review Process
Code Auditing Tactics
Test Cases
  For a program or a small isolated part of code
  Implemented by:
    Writing software to interact with the program and provide input
    Entering values manually into the program using a debugger
    Entering values manually using desk-checking
  Choosing test values:
    Boundary values
    Several inputs: too many cases to try them all
    Constraint establishment: the practice of verifying that certain test values cannot reach the program fragment
    Extraneous input thinning: eliminate test inputs that are not a concern