Transcript Document

CSCI E-170:
Computer Security,
Usability & Privacy
Simson L. Garfinkel
http://e170.ex.com/
[email protected]
Eleni Drinea
© 2005 Simson Garfinkel
Course Fundamentals
• Online Info:
– [email protected]
– http://e170.ex.com/
• Texts:
– Security and Usability (Cranor &
Garfinkel, 2005)
– Lots of papers from the website.
• Class meetings:
– L01, 53 Church Street, Mondays
5:30-7:30
© 2005 Simson Garfinkel
Streaming Information
This class is being streamed live!
(30 second delay)
Students in Cambridge may sign a release so that they
can get on-camera!
-Let me know if you don’t want to be on camera.
Ask questions with AIM to: TK
© 2005 Simson Garfinkel
Check the website!
• Announcements
• Assignments
• Notes
• Materials
© 2005 Simson Garfinkel
LiveJournal
• We will be using
LiveJournal as a
collaboration system
• Get an account!
• Upload a photograph!
• Understand the security
and privacy of LiveJournal
© 2005 Simson Garfinkel
Today’s Class
• Hour #1: Computer Security
– What is it?
– What is a security policy?
– What does it include / not include?
– Perimeter definition & risk assessment
– Best practices
• Saltzer’s Design Principles
• Hour #2: Understanding Privacy
– Data disclosure
– Fair information practices
• Assignment #1: A Security Incident…
© 2005 Simson Garfinkel
What is Computer Security?
• COMPUTER SECURITY:
– “A computer is secure if you can depend on it and
its software to behave as you expect.” (Garfinkel
& Spafford, 1991)
© 2005 Simson Garfinkel
Brief History of Computer Security
• 1930s - Turing
• 1940s - Cracking codes
• 1950s - Batch computing
– Deck of cards had account, no password
• 1960s - Interactive Computing
– usernames & passwords
• 1971 - First reports of “hacking”
© 2005 Simson Garfinkel
RFC 602 (1973)
• Arpa Network Working Group
Request for Comments: 602
Bob Metcalfe (PARC-MAXC)
Dec 1973, NIC #21021
"The Stockings Were Hung by the Chimney with Care”
The ARPA Computer Network is susceptible to security
violations for at least the three following reasons:
(1) Individual sites, used to physical limitations on
machine access, have not yet taken sufficient precautions
toward securing their systems against unauthorized remote
use. For example, many people still use passwords which are
easy to guess: their first names, their initials, their host
name spelled backwards, a string of characters which are
easy to type in sequence (e.g. ZXCVBNM).
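A minimal sketch (in Python, not from the original slides) of a check that rejects exactly the guessable patterns RFC 602 complains about: first name, initials, host name spelled backwards, and easy-to-type keyboard runs. The user name, initials, and host name below are hypothetical examples.

    # Hypothetical check for the guessable-password patterns RFC 602 lists.
    KEYBOARD_ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

    def is_guessable(password, first_name, initials, host_name):
        p = password.lower()
        obvious = {first_name.lower(), initials.lower(), host_name.lower()[::-1]}
        if p in obvious:
            return True
        # a straight run of adjacent keys, e.g. "ZXCVBNM"
        return any(p in row for row in KEYBOARD_ROWS)

    print(is_guessable("ZXCVBNM", "Robert", "rm", "parc-maxc"))    # True: keyboard run
    print(is_guessable("cxam-crap", "Robert", "rm", "parc-maxc"))  # True: host name reversed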
© 2005 Simson Garfinkel
RFC 602 (1973)
• (2) The TIP allows access to the ARPANET to
a much wider audience than is thought or
intended. TIP phone numbers are posted, like
those scribbled hastily on the walls of phone
booths and men's rooms. The TIP required no
user identification before giving service.
Thus, many people, including those who used to spend their time ripping off Ma Bell, get access to our stockings in a most anonymous way.
© 2005 Simson Garfinkel
RFC 602 (1973)
• (3) There is lingering affection for the
challenge of breaking someone's system. This
affection lingers despite the fact that
everyone knows that it's easy to break
systems, even easier to crash them.
© 2005 Simson Garfinkel
RFC 602 (1973)
• All of this would be quite humorous and cause for
raucous eye winking and elbow nudging, if it weren't
for the fact that in recent weeks at least two major
serving hosts were crashed under suspicious
circumstances by people who knew what they were
risking; on yet a third system, the system wheel
password was compromised -- by two high school
students in Los Angeles no less.
We suspect that the number of dangerous security
violations is larger than any of us know and is growing.
You are advised not to sit "in hope that Saint
Nicholas would soon be there".
© 2005 Simson Garfinkel
Brief History of Computer Security
Cont…
• 1980s - Emergence of the hacker
underground
– 1983 - WarGames
• “Is it a game, or is it real?”
• “War Dialing”
– 1986 - The Cuckoo’s Egg
• Cliff Stoll and the German Hackers
• January 15, 1990
– AT&T Network Crash
– Operation Sun Devil
– http://www.mit.edu/hacker/hacker.html
© 2005 Simson Garfinkel
Goals of Computer Security
• Availability
– Make sure you can use your system
© 2005 Simson Garfinkel
Goals of Computer Security (2)
• Confidentiality
– Keep your secrets secret!
© 2005 Simson Garfinkel
Goals of Computer Security (3)
• Data Integrity
– Prevent others from modifying your data
© 2005 Simson Garfinkel
Goals of Computer Security (4)
• Control
– Regulate the use of your system
© 2005 Simson Garfinkel
Goals of Computer Security (5)
• Audit
– What happened?
– How do we undo it?
© 2005 Simson Garfinkel
Goals of Computer Security:
• Availability
• Confidentiality
• Data Integrity
– who/what are we protecting?
• Control
– who/what are we protecting against?
• Audit
– how are we going to do it?
© 2005 Simson Garfinkel
Different environments have
different priorities
• Banking environment:
– integrity, control and audit are more critical than
confidentiality and availability
• Intelligence service:
– confidentiality may come first, availability last.
• Military on the battlefield:
– availability may come first, audit may come last
• University:
– Integrity and availability may come first.
© 2005 Simson Garfinkel
Vulnerabilities, Threats, Attacks
• Most security texts focus on bad-guy
attackers, worms, viruses, etc.
• Most continuity problems arise from:
– Operator error
– Software error
– Environmental problems
• The best security measures protect against
both inadvertent and malicious threats.
© 2005 Simson Garfinkel
Class Participation:
• Threats to:
– Availability
– Confidentiality
– Data Integrity?
– Control?
– Audit?
© 2005 Simson Garfinkel
Security Policy
• Defines a security perimeter
– Because you can’t secure everything
© 2005 Simson Garfinkel
Security Policy
• Defines a security perimeter
• Standards:
– codify what should be done
• Guidelines:
– explain how it will be done
© 2005 Simson Garfinkel
How do you create a policy?
• Option #1: Risk Assessment (a worked numeric sketch follows this list):
– Identify assets and their value
– Identify the threats
– Calculate the risks
– Conduct a Cost-Benefit Analysis
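A minimal sketch of the risk-assessment arithmetic above, using the common Annualized Loss Expectancy formulation (expected annual loss = asset value × exposure factor × expected incidents per year). This formulation and every figure below are illustrative assumptions, not numbers from the course.

    # Hypothetical assets: (name, value in dollars, expected incidents per year,
    # fraction of value lost per incident).
    assets = [
        ("customer database", 500_000, 0.05, 0.60),
        ("public web server",  50_000, 2.00, 0.10),
    ]

    def annualized_loss(value, incidents_per_year, exposure_factor):
        # single-loss expectancy (value * exposure) times how often it happens
        return value * exposure_factor * incidents_per_year

    for name, value, rate, exposure in assets:
        print(f"{name}: expected annual loss ${annualized_loss(value, rate, exposure):,.0f}")

    # Cost-benefit test: a safeguard is worth it only if its annual cost is
    # less than the reduction in expected annual loss it buys.
    safeguard_cost = 10_000
    saved = annualized_loss(500_000, 0.05, 0.60) - annualized_loss(500_000, 0.01, 0.60)
    print("worth buying:", saved > safeguard_cost)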
© 2005 Simson Garfinkel
How do you create a policy?
• Option #2: Adopt “Best Practices.”
© 2005 Simson Garfinkel
Techniques For Drafting Policies
• Assign a specific “owner” to everything that is
to be protected.
• Be positive
• Be realistic in your expectations
• Concentrate on education and prevention
© 2005 Simson Garfinkel
Threats to Consider:
• Human error
• “Hackers”
– technical gurus, script kiddies, criminals looking for gain.
• Disgruntled employees
• Organized crime
– increasingly a threat! Breaking into hospitals, e-commerce sites, etc.
• Foreign espionage (it happens!)
• Cyber terrorists (it hasn’t happened yet)
• Information warfare attacks (depends on how you count)
• Microsoft / RIAA / MPAA
• Mom
© 2005 Simson Garfinkel
Risk can be reduced, but not
eliminated.
• You can purchase a UPS…
– But the power failure may outlast the batteries
– But the UPS may fail
– But the cleaning crew may unplug it
– But the UPS may crash due to a software error.
© 2005 Simson Garfinkel
Spaf’s first principle of security
administration:
• “If you have responsibility for security, but
have no authority to set rules or punish
violators, your own role in the organization is
to take the blame when something big goes
wrong.”
© 2005 Simson Garfinkel
Technical Design Principles
• “The Protection of Information in Computer
Systems,” (Saltzer & Schroeder, 1975)
• Designed for securing operating systems, but
generally applicable.
© 2005 Simson Garfinkel
Saltzer & Schroeder’s Principles:
– Least Privilege
– Economy of Mechanism
– Complete Mediation
– Open Design
– Separation of Privilege
– Least Common Mechanism
– Psychological Acceptability
© 2005 Simson Garfinkel
Least Privilege
• “Every user and process should have the
minimum amount of access rights necessary.
Least privilege limits the damage that can be
done by malicious attackers and errors alike.
Access rights should be explicitly required,
rather than given to users by default.”
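A minimal sketch of the principle above on a Unix-like system: do the one step that genuinely needs root, then permanently drop to an unprivileged account before touching any untrusted input. The account name "www-data" is an assumption; this only runs as root on a system where that account exists.

    import os
    import pwd
    import socket

    # Step that genuinely needs root: binding a port below 1024.
    listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener.bind(("0.0.0.0", 80))
    listener.listen(5)

    # Permanently drop privileges; drop the group first, then the user id.
    unprivileged = pwd.getpwnam("www-data")   # assumed account name
    os.setgid(unprivileged.pw_gid)
    os.setuid(unprivileged.pw_uid)

    # From here on, a compromise is limited to what "www-data" can do.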
© 2005 Simson Garfinkel
Economy of mechanism
• “The design of the system should be small
and simple so that it can be verified and
correctly implemented.”
© 2005 Simson Garfinkel
Complete Mediation
• “Every access should be checked for proper
authorization.”
© 2005 Simson Garfinkel
Open Design
• “Security should not depend upon the
ignorance of the attacker. This criterion
precludes back doors in systems, which give
access to users who know about them.”
© 2005 Simson Garfinkel
Separation of privilege
• “Where possible, access to system resources
should depend on more than one condition
being satisfied.”
© 2005 Simson Garfinkel
Least Common Mechanism
• “Users should be isolated from one another
by the system. This limits both covert
monitoring and cooperative efforts to override
system security mechanisms.”
© 2005 Simson Garfinkel
Psychological acceptability
• “The security controls must be easy to use so
that they will be used and not bypassed.”
© 2005 Simson Garfinkel
Privacy-Protecting Policies
• Simson L. Garfinkel
The Big Idea:
• Many technical problems can be solved through the use of policy
• Technologists tend to overlook policy solutions because they:
– Aren’t 100% effective
– Don’t work across legislative boundaries
– Are open to [possibly intentional] misinterpretation
• Example: CAN-SPAM act
© 2005 Simson Garfinkel
On the other hand…
• Policy solutions can be more flexible than
technical solutions
– Policy can be “technology-neutral”
– Policy doesn’t need to be upgraded
– Policy doesn’t crash when there are typos
– Policy can enable lawsuits that attack the human
root of problems
© 2005 Simson Garfinkel
The “Bad People” problem
• The world is filled with bad people.
• You can’t put them all in jail.
Evidence of “bad people”
• Decreasing inventory at stores
– Shoplifting?
– Employee theft?
• Merchandise purchased with “lost” credit
cards
– Perhaps the card was stolen
– Perhaps the card wasn’t stolen
More Evidence...
• Money borrowed and not repaid
• Faked insurance claims
• Forged checks
Solution to the
“bad person” problem
• Make a list of the bad people.
• Don’t do business with anybody on the list.
Examples of Solution...
• Retail Credit
– List of people “known” not to repay their debts
• Medical Information Bureau (est. 1902)
– List of people with “known” medical problems
• Chicago-area merchants (1950s)
– List of “known” shoplifters
Typical Credit Report
• “Retired Army Lieutenant Colonel”
– “A rather wild-tempered, unreasonable, and
uncouth person….
– “who abused his rank and wasn’t considered a
well-adjusted person.
– “He was known to roam the reservation at Ft.
Hood and shoot cattle belonging to ranchers
who had leased the grazing land from the
Army.”
• —Hearings on the Retail Credit Company,
1968
Credit reports of the 1960s
• Contained information that was hearsay or
just plain wrong.
• Records confused between individuals.
• No “statute of limitations” on the information.
• People frequently prohibited from seeing their
own records.
Fair Credit Reporting Act, 1970
• Right to see your credit report.
• Right to challenge incorrect information.
• Information automatically removed from report
after 7 years
– 10 years for Bankruptcy.
• Right to know who accesses your report.
• Right to a free credit report if you are denied
credit.
Code of Fair Information Practices
(1973) #1
• There must be no personal data recordkeeping systems whose very existence is
secret.
CFIP #2
• There must be a way for a person to find out
what information about the person is in a
record and how it is used.
CFIP #3
• There must be a way for a person to prevent
information about the person that was
obtained for one purpose from being used or
made available for other purposes without the
person's consent.
CFIP #4
• There must be a way for a person to correct
or amend a record of identifiable information
about the person.
CFIP #5
• Any organization creating, maintaining, using,
or disseminating records of identifiable
personal data must assure the reliability of the
data for their intended use and must take
precautions to prevent misuses of the data.
CFIPs in Short
• No Secret databanks
• You are allowed to see your own record
• Information obtained for one purpose can’t
be used for another without consent.
• Ways for correcting or amending
information.
• Prevention of misuse.
CFIPs…
• Good ideas --- matches what we believe.
• FCRA - 1970
• 1980 OECD Guidelines
• 1999 Canada “C6”
• FTC’s “Notice, Choice, Security and Access”
CFIPs, cont.
• Good ideas --- matches what we believe.
• Never passed into law.
• Adopted in Europe.
1980 OECD Guidelines
• “Guidelines on the Protection of Privacy and
Transborder Flows of Personal Data”
• Collection Limitation Principle
– “obtained by lawful and fair means”
– “with the knowledge or consent” where
appropriate
• Data Quality Principle
– Data should be relevant and kept up-to-date.
1980 OECD Guidelines, Cont.
• Purpose Specification Principle
– Purpose specified before the data is collected.
• Use Limitation Principle
– Not be used for purposes other than originally
intended except
• With the consent of the data subject
• By the authority of law.
1980 OECD Guidelines, Cont.
• Security Safeguards Principle
– “Reasonable security safeguards” to prevent loss,
unauthorized access, destruction, use,
modification or disclosure of data.
• Openness Principle
– Clearly stated practices and policies.
– No secret databases.
1980 OECD Guidelines, Cont.
• Individual Participation Principle
– Individuals have the right to see their own records.
– Right to challenge and demand correction or erasure.
• (note Steve Ross story!)
• Accountability Principle
– “A data controller should be accountable for complying
with measures which give effect to the principles stated
above.”
1995 CSA “Privacy Standard”
1. Accountability
2. Identifying Purposes
3. Consent
4. Limiting Collection
5. Limiting Use, Disclosure, and Retention
6. Accuracy
7. Safeguards
8. Openness
9. Individual Access
10. Challenging Compliance
1999: Canada “C6”
• Comprehensive privacy law applies to both
public and private sector
• National businesses, banks, etc
• Medical records, prescriptions and insurance
records (January 1, 2002)
• Law extends to all commercial activity in
Canada (January 1, 2004)
What really makes C6 work...
Approaches to Privacy
Enforcement
• Governmental Standards
– Enforcement by regulatory agencies, states, etc.
• Industry Standards
– “Codes of conduct”
– Limited enforcement through licensing
– Limited enforcement from government
• Unregulated Market
– Reputation, or Caveat emptor
HIPAA - 1996*
(Health Insurance Portability and Accountability Act of 1996)
• Key Provisions:
– Largely about health insurance portability, not
about privacy
– Privacy mandates are largely about security:
• Firewalls, anti-virus, etc.
• Designate a privacy officer
• Post privacy policy
• Require outsourcing companies to protect information.
• Access to health information; procedures for
correcting errors.
– Enforced by the States (unfunded mandate);
HHS enforces in “extreme cases.”
• (*privacy rule passed 2002)
COPPA (1998)
(Children’s Online Privacy Protection Act.)
• Key Provisions:
– Applies to online collection of info on children under 13
– Requires “verifiable parental consent”
• Very hard in most cases; letter, fax or phone call
• Some exceptions — one time response to “homework help”
– Privacy notice must be posted on website
• http://www.ftc.gov/opa/1999/9910/childfinal.htm
GLB
(Gramm-Leach-Bliley Act of 1999)
• Consumers must be informed of privacy
policies
– Initial notice in 2000
– Annual notice
– Notices were mostly ignored!
• Consumers must have a chance to “opt-out”
– Many different ways to “opt-out”
Sarbanes-Oxley Act of 2002
• Financial Auditing and Accountability
• Mostly security; no real privacy issues.
• Lots of money spent on compliance.
– Audit
– Control
– Visibility
© 2005 Simson Garfinkel
Identity Theft & Phishing
• Personal stories from class?
© 2005 Simson Garfinkel
Assignment #1
• Write a 600-900 word essay describing an incident
in which you were personally involved. Be
sure to include relevant details. Sanitize it for
publication. Post online before next class.
© 2005 Simson Garfinkel
Reading Assignment
• Recommended:
– Chapter 3: Design for Usability
– Chapter 20: A User-Centric Privacy Space
Framework
• Optional:
– Chapter 2: Usable Security
– Chapter 19: Privacy Issues and HCI
© 2005 Simson Garfinkel