security engineering - University of Sydney


ELEC5616
computer and network security
matt barrie
[email protected]
lecture 1 :: overview
goals
• Understanding of security fundamentals
• Introduction to applied cryptography
• Issues with designing secure systems
• Experience in designing and implementing one
• Examination of real world case studies
• Understanding of the cross-disciplinary issues
• Why systems fail
about us
• Matt is an External Lecturer
  – Chief Executive Officer of Freelancer.com
  – Also lectures Technology Venture Creation
  – Non-Executive Chairman at Leasate (TVC'10)
  – Strategic Advisor to QuintessenceLabs (Quantum Cryptography)
  – Formerly:
    • Chief Executive Officer of Sensory Networks, Inc.
    • Director of Packet Storm (packetstormsecurity.org), which was at the time the world's largest information security resource
    • Ran the Systems and Network Assessment Practice at the Kroll-O'Gara Information Security Group
    • Managing Director of Infilsec, a computer security consulting firm
syllabus
• Hash functions
• Authentication
• Secret key encryption
• Public key encryption
• Key exchange
• Digital signatures
• Cryptographic protocols
• Secure programming
• Real world systems and protocols
• Political and legal issues
• Attacks
• How and why systems fail
• The shape of things to come
mechanics
• Two lectures per week, for twelve weeks
– Friday 5pm – 7pm (EE450)
• One 2-hour lab working on a project
– Friday 9 – 11am (EE630)
– Friday 11am – 1pm (EE630)
• Tutors:
– Emma Fitzgerald
– Meena Rajani
• Assessment:
– Assignments & Challenges (25%)
• Wargames (12.5%)
• Quiz on papers given out in class (2.5%)
• One Assignment (10%)
– Project (25%)
– Final Exam (50%) [two hours, closed book]
expectations
• All lectures are compulsory
• All labs are compulsory
• Attendance below 50% can be grounds for failure
• It is your responsibility to make up missed classes
textbooks
1. Cryptography and Network Security, William Stallings (Prentice Hall), 4th Edition
2. Handbook of Applied Cryptography, A. Menezes, P. van Oorschot, S. Vanstone (online)
   URL: http://www.cacr.math.uwaterloo.ca/hac/
3. Lecture notes and additional reading material will also be handed out in class.

Highly recommended:
• Applied Cryptography, 2nd Ed., Bruce Schneier (Wiley), 1996
• Security Engineering, Ross Anderson (Wiley), 2001
project
project part 1
• It is 2011 and Big Brother is way ahead of schedule.
• Naturally, the Internet by now is fully tapped by the first-world countries as part of Echelon and other partner SIGINT networks.
• Committed to the global war on terrorism, the world's terrorist organisations plan to develop a global information exchange using civilian infrastructure (i.e. the Internet).
• Naturally, none of the terrorists involved want to necessarily be identified as the buyers or sellers of this information (even to each other!), hence the need for a secure, anonymous platform for facilitating this exchange... layered on the Internet.
• This exchange will be used by cells to trade classified information and dirty secrets through a wholesale information exchange:
  – SIGINT on known military units, e.g. email, voice transcripts
  – Blueprints and 'eyeballs' of bases and capitalist agent identities
  – Classified agency documents
  – Private video collections of dictators around the world
  – The occasional bootleg Britney Spears MP3
stealthnet
• Your group has been hired by a rogue cypherpunk cell to build a secure communications application for underground messaging, file transfers and secrets exchange
• Think of it as a secure version of ICQ (www.icq.com) with the ability to buy and sell 'black market' information
• You may assume that anonymity will be handled by the underlying StealthNet network layers
• Written in Java with crypto library support (a minimal sketch of such library use follows below)
• Teams of two
• You will be supplied with an insecure skeleton for reference
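By way of illustration only, here is a minimal sketch of what "crypto library support" in Java can look like, using the standard javax.crypto API to encrypt and decrypt a single message with AES. The class name, message and parameter choices are purely hypothetical and are not part of the supplied StealthNet skeleton; a real solution also needs key exchange, authentication and integrity, all covered later in the course.

// Hypothetical sketch: standard javax.crypto usage, not the StealthNet skeleton.
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;
import java.security.SecureRandom;

public class CryptoSketch {
    public static void main(String[] args) throws Exception {
        // Generate a random 128-bit AES session key (the only secret).
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(128);
        SecretKey key = keyGen.generateKey();

        // Fresh random IV so identical plaintexts encrypt differently.
        byte[] iv = new byte[16];
        new SecureRandom().nextBytes(iv);

        // Encrypt the message with AES in CBC mode.
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        byte[] ciphertext = cipher.doFinal("attack at dawn".getBytes("UTF-8"));

        // Decrypt with the same key and IV.
        cipher.init(Cipher.DECRYPT_MODE, key, new IvParameterSpec(iv));
        System.out.println(new String(cipher.doFinal(ciphertext), "UTF-8"));
    }
}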
challenges
• We will be providing "challenges" for you to solve as part of "Wargames"
• There will be a leader board of highest scores
• Challenges can be attempted individually or in teams (max. of 4)
• IMPORTANT: You do not need to solve all or even half the challenges!
• WARNING: these challenges may be a tremendous drain on your time, and are mostly provided for your own interest and enjoyment
• In the olden days some of the challenges had cash prizes (e.g. a US$30,000 prize for the RSA challenge). Unfortunately, due to the Global Financial Crisis, this has been discontinued. However, we will be substituting some k-rad security prizes!
• We will be giving out overall prizes for the top 3 teams at the end of semester (final submission deadline Thursday @ midnight before the last lecture).
  – Challenge difficulty will range from easy to extremely difficult; by 'extremely difficult' we mean that thousands of people have been trying for years to solve them with no success, and these challenges may not even be solved in your lifetime
challenge marking
• Each challenge is worth a different number of points based on its difficulty. Some challenges will also have a time limit.
• There are two types of challenges that will be given:
  1. Challenges with a single solution
     • Points will be determined by the number of people who have solved it
     • Your points will decay as more people submit correct answers
  2. Challenges with many or infinite numbers of solutions
     • The goal is to find the 'best' answer
     • Points will be determined by the quality of the solution
     • Better solutions get more points
     • You may submit multiple solutions as you find better answers
     • Points might still decay as more people solve the challenge
• Your mark for the challenges will be scaled against all submissions at the end of the course and accounts for half your total assignment mark (or 12.5% of the total course mark)
• REMEMBER: you do have a life outside this course. Don't get carried away...
help!
Help algorithm:
1. Check the website
   – http://www.ee.usyd.edu.au/~mattb/2010/
2. If FAIL, post on the class message board
   – Linked from the course website
   – Others may already have asked your question and got an answer
   – Others may be having the same problem
3. If FAIL, e-mail us
   – [email protected]
   – We have a neural connection to the Internet
we are entering a brave new world ...
did cypherpunk become true?
• Wikileaks. A dude with a suit and a laptop flying around Europe giving lectures and press conferences while spraying corporate and government secrets across (relatively) uncontrollable cyberspace. Julian Assange is a second-string character in a Gibson novel.
• Stuxnet worm attacks uranium enrichment facility.
• China-sponsored hackers launch simultaneous, multi-vector attacks on US megacorps.
• 4chan. Anonymous. They remind me of the Panther Moderns. Reckless, anarchistic, technical. Sometimes moral, sometimes not.
• William Gibson no longer writes novels set in the future.
(stolen from reddit)
actual newspaper headlines
• "WebTV virus dials 911"
• "The number 7 blocks Belgian ATM machines"
• "Tampered heart monitors, simulating failure to get human organs"
• "Secret American spy photos broadcast unencrypted over satellite TV"
• "Software flaw in submarine-launched ballistic missile system"
• "Accidental launch of live Canadian Navy missile: color-code mixup"
• "Navy to use Windows 2000 on aircraft carriers"
• "Classified data in wrong systems at Rocky Flats nuclear weapons plant"
• "Russian nuclear warheads armed by computer malfunction"
• "U.S. House approves life sentences for crackers"
• "Fingerprints can now be scanned from 2 meters away"
• "$15 phone, 3 minutes all that's needed to eavesdrop on GSM call"
• "Stuxnet worm attacks uranium enrichment facility"
Courtesy of RISKS (http://catless.ncl.ac.uk/Risks/)
and now, the bad news...
nothing is secure in the digital world
• The digital world behaves differently to the physical world
  – Everything in the digital world is made of bits
  – Bits have no uniqueness
  – It's easy to copy bits perfectly
• Therefore, if you have something, I can copy it
  – Information
  – Privileges
  – Identity
  – Media
  – Software
  – Digital money
• Much of information security revolves around making it hard to copy bits
matt’s definition of information security
• You spend X so that your opponent has to spend Y to do something you don't want them to do
  – Y is rarely greater than X
  – ... and there are lots of opponents
• It's all a resource game
  – Time
  – $$$
  – Computational power (time x $$$)
• Implication:
  – Given enough resources, someone's going to get in
  – Given enough attackers, someone's going to get in
  – Given enough time, someone's going to get in
  – Thus all systems can and will fail
• The trick is to raise the bar to an adequate level of (in)security for the resource you are trying to protect
security requirements
• Everything you have been taught so far in engineering revolves around building dependable systems that work
  – Typically engineering efforts are associated with ensuring something does happen, e.g. John can access this file
• Security engineering traditionally revolves around building dependable systems that work in the face of a world full of clever, malicious attackers
  – Typically security has been about ensuring something can't happen, e.g. the Chinese government can't access this file
• Reality is far more complex
• Security requirements differ greatly between systems
why do systems fail?
• Systems often fail because designers:
  – Protect the wrong things
  – Protect the right things in the wrong way
  – Make poor assumptions about their systems
  – Do not understand their system's threat model properly
  – Fail to account for paradigm shifts (e.g. the Internet)
  – Fail to understand the scope of their system
bank security requirements
• Core of a bank's operations is its bookkeeping system
  – Most likely threat: internal staff stealing petty cash
  – Goal: highest level of integrity
• ATMs
  – Most likely threat: petty thieves
  – Goal: authentication of customers, resist attack
• High value transaction systems
  – Most likely threat: internal staff, sophisticated criminals
  – Goal: integrity of transactions
• Internet banking
  – Most likely threat: hacking the website or account
  – Goal: authentication and availability
• Safe
  – Threat: physical break-ins, stealing the safe
  – Goal: physical integrity, difficult to transport, slow to open
military communications
• Electronic warfare systems
  – Objective: jam enemy radar without being jammed yourself
  – Goal: covertness, availability
  – Result: countermeasures, counter-countermeasures, etc.
• Military communications
  – Objective: low probability of intercept (LPI)
  – Goal: confidentiality, covertness, availability
  – Result: spread spectrum communications, etc.
• Compartmentalisation
  – Objective example: logistics software, where administration of boot polish is different from Stinger missiles
  – Goal: confidentiality, availability, resilience to traffic analysis?
• Nuclear weapons command & control
  – Goal: prevent weapons from being used outside the chain of command
hospital security requirements
• Use of web-based technologies
  – Goal: harness economies of the Internet (EoI), e.g. online reference books
  – Goal: integrity of data
• Remote access for doctors
  – Goal: authentication, confidentiality
• Patient record systems
  – Goal: "nurses may only look at records of patients who have been in their ward in the last 90 days"
  – Goal: anonymity of records for research
• Paradigm shifts introduce new threats
  – Shift to online drug databases means paper records are no longer kept
  – Results in new threats on
    • availability, e.g. denial of service of network
    • integrity, e.g. malicious "temporary" tampering of information
risk analysis
Risk Impact Matrix (cell values are risk ratings: 1 = severe through 7 = trivial)

Likelihood \ Impact | Extreme | High | Medium | Low | Negligible
Certain             |    1    |  1   |   2    |  3  |     4
Likely              |    1    |  2   |   3    |  4  |     5
Moderate            |    2    |  3   |   4    |  5  |     6
Unlikely            |    3    |  4   |   5    |  6  |     7
Rare                |    4    |  5   |   6    |  7  |     7

Rating legend:
1 = severe: must be managed by senior management with a detailed plan
2 = high: detailed research and management planning required at senior levels
3 = major: senior management attention is needed
4 = significant: management responsibility must be specified
5 = moderate: manage by specific monitoring or response procedures
6 = low: manage by routine procedures
7 = trivial: unlikely to need specific application of resources
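If you wanted to automate this lookup (say, inside a risk register tool), the matrix reduces to a small two-dimensional table. The Java sketch below is one hypothetical encoding of the matrix above; the class and enum names are our own and carry no official meaning.

// Hypothetical encoding of the risk matrix above: rating 1 (severe) to 7 (trivial).
public class RiskMatrix {
    enum Likelihood { CERTAIN, LIKELY, MODERATE, UNLIKELY, RARE }
    enum Impact { EXTREME, HIGH, MEDIUM, LOW, NEGLIGIBLE }

    // Rows follow Likelihood order, columns follow Impact order.
    private static final int[][] RATING = {
        {1, 1, 2, 3, 4},  // Certain
        {1, 2, 3, 4, 5},  // Likely
        {2, 3, 4, 5, 6},  // Moderate
        {3, 4, 5, 6, 7},  // Unlikely
        {4, 5, 6, 7, 7},  // Rare
    };

    static int rating(Likelihood l, Impact i) {
        return RATING[l.ordinal()][i.ordinal()];
    }

    public static void main(String[] args) {
        // A likely event with extreme impact rates 1: severe, so it must be
        // managed by senior management with a detailed plan.
        System.out.println(rating(Likelihood.LIKELY, Impact.EXTREME));
    }
}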
axioms of information security
• All systems are buggy
• The bigger the system, the more buggy it is
• Nothing works in isolation
• Humans are most often the weakest link
• It's a lot easier to break a system than to make it secure
a system can be..
• A product or component
  – e.g. software program, cryptographic protocol, smart card
• ... plus infrastructure
  – e.g. PC, operating system, communications
• ... plus applications
  – e.g. web server, payroll system
• ... plus IT staff
• ... plus users and management
• ... plus customers and external users
• ... plus partners, vendors
• ... plus the law, the media, competitors, politicians, regulators...
aspects of security
• Authenticity
  – Proof of a message's origin
  – Integrity plus freshness (i.e. the message is not a replay)
• Confidentiality
  – The ability to keep messages secret (for time t)
• Integrity
  – Messages should not be able to be modified in transit
  – Attackers should not be able to substitute fakes
• Non-repudiation
  – Cannot deny that a message was sent (related to authenticity)
• Availability
  – Guarantee of quality of service (fault tolerance)
• Covertness
  – Message existence secrecy (related to anonymity)
passive attacks
• Those that do not involve modification or fabrication of data
• Examples include eavesdropping on communications
• Interception
  – An unauthorised party gains access to an asset
  – Release of message contents: an attack on confidentiality
  – Traffic analysis: an attack on covertness
active attacks
• Those which involve some modification of the data stream or creation of a false stream
• Fabrication
  – An unauthorised party inserts counterfeit objects into the system
  – Examples include masquerading as an entity to gain access to the system
  – An attack on authenticity
• Interruption
  – An asset of the system is destroyed or becomes unavailable or unusable
  – Examples include denial-of-service attacks on networks
  – An attack on availability
• Modification
  – An unauthorised party not only gains access to but tampers with an asset
  – Examples include changing values in a data file or a virus
  – An attack on integrity
definitions
• Secrecy
  – A technical term which refers to the effect of actions to limit access to information
• Confidentiality
  – An obligation to protect someone or some organisation's secrets
• Privacy
  – The ability and/or right to protect the personal secrets of you and your family, including protection against invasions of your personal space
  – Privacy does not extend to corporations
• Anonymity
  – The ability/desire to keep the source/destination of a message confidential
trust
• A trusted system is one whose failure can break security policy.
• A trustworthy system is one which won't fail.
• An NSA employee caught selling US nuclear secrets to a foreign diplomat is trusted but not trustworthy.
• In information security, trust is your enemy.
trust is your enemy
• You cannot trust software or vendors
  – They won't tell you their software is broken
  – They won't fix it if you tell them
• You cannot trust the Internet nor its protocols
  – It's built from broken pieces
  – It's a monoculture; something breaks ⇒ everything breaks
  – It was designed to work, not to be secure
• You cannot trust managers
  – They don't want to be laggards nor leaders
  – Security is a cost centre, not a profit centre!
• You cannot trust the government
  – They only want to raise the resource game to their level
• You cannot trust your employees or users
  – They are going to pick poor passwords
  – They are going to mess up the configuration and try to hack in
  – They account for 90% of security problems
trust is your enemy
• You cannot trust your peers
  – They are as bad as you
• You cannot trust algorithms nor curves
  – Moore's law does not keep yesterday's secrets
  – Tomorrow they might figure out how to factor large numbers
  – Tomorrow they might build a quantum computer
• You cannot trust the security community
  – They are going to ridicule you when they find a problem
  – They are going to tell the whole world about it
• You cannot trust information security
  – It's always going to be easier to break knees than break codes
• You cannot trust yourself
  – You are human
  – One day you will screw up
tenet of information security
• Security through obscurity does not work
• Full disclosure of the mechanisms of security algorithms and systems (except secret key material) is the only policy that works
• Kerckhoffs' Principle: for a system to be truly secure, all secrecy must reside in the key (see the sketch below)
• If the algorithms are known but cannot be broken, the system is a good system
• If an algorithm is secret and no-one has looked at it, nothing can be said for its security
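To make the principle concrete, the hypothetical Java sketch below computes an HMAC-SHA256 tag over a message: the algorithm is completely public and published, and all of the secrecy resides in the key. The class name and message are illustrative only.

// Illustrative sketch of Kerckhoffs' principle: public algorithm, secret key.
import javax.crypto.KeyGenerator;
import javax.crypto.Mac;
import javax.crypto.SecretKey;

public class KerckhoffsDemo {
    public static void main(String[] args) throws Exception {
        // The key is the only secret; HMAC-SHA256 itself is public knowledge.
        SecretKey key = KeyGenerator.getInstance("HmacSHA256").generateKey();

        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(key);
        byte[] tag = mac.doFinal("meet at the dead drop".getBytes("UTF-8"));

        // An attacker who knows the algorithm but not the key still cannot
        // forge a valid tag for a different message.
        System.out.printf("tag length: %d bytes%n", tag.length);
    }
}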
morals of the story
• Nothing is perfectly secure
• Information security is a resource game
• Nothing works in isolation
• Know your system
• Know your threat model
• Trust is your enemy
• All systems can and will fail
• Humans are usually the weakest link
• Attackers often know more about your system than you do
references
• Stallings
  – §1
• Interesting Websites
  – http://www.csl.sri.com/users/neumann/illustrative.html
  – http://www.packetstormsecurity.org
  – http://www.securityfocus.com
  – http://www.digicrime.com
  – http://www.cryptome.org
  – http://www.phrack.org
  – http://www.eff.org