CS 4/591: Introduction to Computer Security Lecture 8: Multilateral models James Hook 11/6/2015 3:18 PM.


CS 4/591: Introduction to
Computer Security
Lecture 8:
Multilateral models
James Hook
Last Time
• Applying cryptography to secure
communication
Today
• Multilateral security models
– Models that partition information to
enforce need-to-know between peers
Hybrid Policies
• Policy models in specific domains
• Combine notions of confidentiality and
integrity
• Two case studies:
– Chinese Wall, Brewer and Nash
– British Medical Association (BMA) model,
Ross Anderson, 1996
Chinese Wall
• Domain:
– Financial institutions
• Problem:
– Want to enable sharing of sensitive information
between traded companies and investment banks
– Don’t want the investment banks to become a
conduit of information
– British securities law dictates that strict conflict-of-interest
rules be applied, preventing one specialist
from working with two clients in the same sector
Example
• Oil Companies
– Shell
– Texaco
– Mobil
• Soft Drink
– Pepsi
– Coke
• Analysts
– Amy
– Bob
• Problem
– Amy is working on Shell
and Pepsi
• Amy cannot work on Texaco,
Mobil, or Coke
– Bob starts working on
Coke
• Can Bob help Amy on
Shell?
Novel Aspect
• Model is temporal --- it changes with
time
• Before Bob starts working on Coke he
can work on anything
• Once he commits to Coke he is directly
blocked from working on Pepsi
Concepts
• Objects: information related to a client
company
• Company Dataset (CD): objects related to a
single company
• Conflict of Interest (COI) class: datasets of
the companies in competition
• Sanitized: non-confidential information about
a company
Rules
• Simple Security:
– S can read O if one of:
• S has accessed O’ and CD(O) = CD(O’)
• For all previously accessed O’, COI(O’) ≠ COI(O)
• O is sanitized
• *-Property:
– S can write O iff:
• S can read O by the rule above
• For all unsanitized O’ readable by S, CD(O’) = CD(O)
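The two rules can be sketched as executable checks. This is a minimal sketch, not Brewer and Nash's formalization: the CD and COI tables, the object names, and the use of the subject's access history as an approximation of "readable by S" are all hypothetical illustration.

```python
# Hypothetical company datasets and conflict-of-interest classes.
CD = {"memo_shell": "Shell", "memo_pepsi": "Pepsi", "memo_coke": "Coke"}
COI = {"Shell": "Oil", "Pepsi": "SoftDrink", "Coke": "SoftDrink"}
SANITIZED = {"annual_report"}  # non-confidential objects


def can_read(history, obj):
    """Simple Security: sanitized, same company dataset as a prior
    access, or a COI class the subject has never touched."""
    if obj in SANITIZED:
        return True
    accessed = [o for o in history if o not in SANITIZED]
    if any(CD[o] == CD[obj] for o in accessed):
        return True
    return all(COI[CD[o]] != COI[CD[obj]] for o in accessed)


def can_write(history, obj):
    """*-Property: readable, and every unsanitized object the subject
    has read lies in the same company dataset as obj."""
    if not can_read(history, obj):
        return False
    return all(CD[o] == CD[obj] for o in history if o not in SANITIZED)
```

With this sketch, a subject who has read the Pepsi memo may still read the Shell memo (different COI class) but not the Coke memo; and once she has read objects from two datasets she can write to neither, which illustrates how strict the *-Property is.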
Comments
• *-Property is very strict (too strict?)
• How does this relate to BLP?
BMA Model
• Presentation follows Anderson Section
9.2.3 [First edition 8.2.3]
• BMA model influenced HIPAA
Medical Records
• Scenario:
– Alice and Bob are married
– Cindy is the daughter of Alice and Bob
• Alice has 3 doctors:
– General practitioner, Obstetrician, Psychiatrist
• Bob has 2 doctors
– GP, Fertility specialist
• Cindy has 1 doctor
– Pediatrician
Traditional practice
• In paper based systems each provider
would keep a separate medical record
– Alice’s psychiatrist’s notes would not be
directly accessible to any other provider
Automating Medical Records
• One super record?
– Single Electronic Patient Record (EPR) that
follows patient from conception to autopsy
– A central, persistent record avoids some medical
errors
– Alice’s GP’s advice nurse can now read Alice’s
psychiatrist’s notes!
• One record per provider?
– Matches existing flow of work
– Supports existing ethical practices
One patient?
• Does Cindy have rights to any
information in her mother’s
obstetrician’s medical record?
– What about Bob?
• Most policy is based on the
simplification of assuming only one
patient at a time has rights to data
Expectations
• Medical information is private and
confidential
– You want to be able to tell your physician that
you use drugs without worrying about
disclosure to law enforcement
• Some public health concerns require
reporting, even if that may violate
confidentiality
• Statistical research methods may advance
the state of knowledge
AIDS and Privacy
• AIDS epidemic brought many privacy
concerns to a head
• Some organizations were discriminating
against individuals based on HIV
infection
• Effective AIDS treatments, such as AZT,
were not used to treat other diseases, so
even a prescription record could reveal a
patient's HIV status
British Medical Association
• BMA is a policy doctrine developed by
R. Anderson in 1995
• Context:
– National Health Service was centralizing
data
Threat Model
• Typical attack
– Hello, this is Dr. B of the cardiology
department at …. Your patient S has just
been admitted here in a coma, and he has
a funny-looking ventricular arrhythmia.
Can you tell me if there’s anything relevant
in his record?
• At the time of the study (1997), 15% of
such queries were bogus
Insider Threats
• Trusted employee discloses information
• Risk doesn’t scale
– Acceptable risk with one receptionist with
access to 2,000 patient records
– Unacceptable when thousands of
receptionists have access to millions of
patient records
Past Approaches
• Adapt Military policy
– Secret: AIDS, STDs
– Confidential: “normal” patient records
– Restricted: admin and prescription data
• Problem:
– What about a prescription for AZT?
BMA Model Goals
• “… enforce principle of patient consent,
prevent too many people getting access
to too large databases of identifiable
records. … not anything new … codify
best practices”
BMA Principles
• Access Control
– each identifiable clinical record shall be marked
with an access control list naming the people or
groups of people who may read it and append
data to it. The system shall prevent anyone not
on the ACL from accessing the record in any way
• Record Opening
– a clinician may open a record with herself and the
patient on the ACL. Where a patient has been
referred, she may open a record with herself, the
patient, and the referring clinician(s) on the ACL
BMA Principles (cont)
• Control:
– One of the clinicians on the ACL must be marked
as being responsible. Only she may alter the ACL,
and she may only add other health care
professionals to it
• Consent and notification:
– the responsible clinician must notify the patient of
the names on his record’s ACL when it is opened,
of all subsequent additions, and whenever
responsibility is transferred. His consent must
also be obtained, except in emergency or in the
case of statutory exemptions
BMA Principles (cont)
• Persistence:
– No one shall have the ability to delete
clinical information until the appropriate
time period has expired
• Attribution:
– all accesses to clinical records shall be
marked on the record with the subject’s
name, as well as the date and time. An
audit trail must also be kept of all deletions
BMA
• Information flow:
– Information derived from record A may be
appended to record B if and only if B’s ACL is
contained in A’s
• Aggregation control:
– There shall be effective measures to prevent the
aggregation of personal health information. In
particular, patients must receive special
notification if any person whom it is proposed to
add to their access control list already has access
to personal health information on a large number
of people
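Three of the principles above (record opening, ACL control, and the information-flow rule on appends) can be sketched in a few lines. The Record class, the actor names, and the use of an exception for denied ACL changes are hypothetical illustration, not part of the BMA doctrine.

```python
class Record:
    """A clinical record with a BMA-style access control list."""

    def __init__(self, responsible, patient):
        # Record Opening: the clinician opens the record with herself
        # and the patient on the ACL; she is marked responsible.
        self.responsible = responsible
        self.acl = {responsible, patient}

    def add_to_acl(self, actor, clinician):
        # Control: only the responsible clinician may alter the ACL
        # (and, per the doctrine, may only add health care professionals).
        if actor != self.responsible:
            raise PermissionError("only the responsible clinician may alter the ACL")
        self.acl.add(clinician)


def may_flow(src, dst):
    # Information flow: data from record src may be appended to dst
    # only if dst's ACL is contained in src's, so no one gains access
    # to information they could not already read.
    return dst.acl <= src.acl
```

For example, Alice's psychiatrist's record (ACL: psychiatrist, Alice) may not be appended to her GP's record once the GP's advice nurse is on that ACL, because the nurse could then read the psychiatric notes.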
BMA
• Trusted Computing Base
– computer systems that handle personal
health information shall have a subsystem
that enforces the above principles in an
effective way. Its effectiveness shall be
subject to evaluation by independent
experts.
Contrasts
• BMA is decentralized
• Chinese Wall is centralized
• Both hybrid models reflect concerns not
naturally provided by BLP alone
Inference Control
• Netflix Challenge
– On October 2, 2006, Netflix published anonymized
movie ratings of 500,000 subscribers
– Promised $1M to best algorithm for
recommending movies
Netflix Challenge
• Is there any customer information in the
dataset that should be kept private?
– No, all customer identifying information has been
removed; all that remains are ratings and dates.
This follows our privacy policy […] Even if, for
example, you knew all your own ratings and their
dates you probably couldn’t identify them reliably
in the data because only a small sample was
included (less than one-tenth of our complete
dataset) and that data was subject to
perturbation. Of course, since you know all your
own ratings that really isn’t a privacy problem is
it? – FAQ of Netflix challenge
• Narayanan and Shmatikov present an
algorithm to de-anonymize the Netflix data
– http://www.cs.utexas.edu/~shmat/shmat_oak08netflix.pdf
• They can generate queries from IMDb
data that give precise results
• The following slides draw examples and
quotes from the paper
So what?
• Joe publishes on IMDb his ratings of
– Power and Terror: Noam Chomsky in Our
Times
– Fahrenheit 9/11
– Jesus of Nazareth
– The Gospel of John
• Ratings and dates of posts are used to
index into published Netflix data
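The indexing step can be illustrated with a toy matcher: score each anonymized record by how many of the public (movie, rating, date) triples it contains. This is a drastic simplification of the Narayanan-Shmatikov algorithm, which also weights rare movies more heavily and requires the best score to stand out from the runner-up; the data, field layout, and date tolerance below are all hypothetical.

```python
def score(aux, record, date_tolerance=3):
    """Count auxiliary (movie -> (rating, day)) entries that match a
    candidate record, allowing the rating date to be a few days off."""
    hits = 0
    for movie, (rating, day) in aux.items():
        if movie in record:
            r, d = record[movie]
            if r == rating and abs(d - day) <= date_tolerance:
                hits += 1
    return hits


def best_match(aux, anonymized):
    """Return the ID of the anonymized record that best fits the
    auxiliary (e.g., IMDb-derived) ratings."""
    return max(anonymized, key=lambda rid: score(aux, anonymized[rid]))
```

Because micro-data is sparse and high-dimensional, even a handful of rated movies is usually enough to make one record score far above the rest.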
• Netflix data shows that Joe also rated
– Bent
– Queer as Folk
• Joe publicly released data that might
reveal his politics and religion, but not his
sexual preference
• Inference correlates that with information
about sexual preference that he did not
intend to reveal
Micro-data
• Example of “micro-data”, information
about specific individuals
• Micro-data characterized by
– high dimensionality
– sparsity
• “fat tail”: individual transaction and preference
records tend to include statistically rare
attributes
• NS give an algorithm for deanonymization of data that is
surprisingly robust
• The implications for other “micro data”,
such as medical records are significant
11/6/2015 3:18 PM
• “You Might Also Like:” Privacy Risks of
Collaborative Filtering
– Joseph A. Calandrino, Ann Kilzer, Arvind
Narayanan, Edward W. Felten, and
Vitaly Shmatikov
• Our work concretely demonstrates the risk
posed by data aggregated from private records
and undermines the widely accepted dichotomy
between “personally identifiable” individual
records and “safe,” large-scale, aggregate
statistics. Furthermore, it demonstrates that
the dynamics of aggregate outputs constitute a
new vector for privacy breaches. Dynamic
behavior of high-dimensional aggregates like
item similarity lists falls beyond the protections
offered by any existing privacy technology,
including differential privacy.
• Modern systems have vast surfaces for attacks on privacy,
making it difficult to protect fine-grained information
about their users. Unintentional leaks of private
information are akin to side-channel attacks: it is very
hard to enumerate all aspects of the system’s publicly
observable behavior which may reveal information about
individual users. Increasingly, websites learn from—and
indirectly expose—aggregated user activity in order to
improve user experience, provide recommendations, and
support many other features. Our work demonstrates the
inadequacy of current theory and practice in
understanding the privacy implications of aggregated
data.
Other work by Shmatikov and others
• CACM article “De-anonymizing Social
Networks”, June 2010
• “You Might Also Like:” Privacy Risks of
Collaborative Filtering, S&P 2011.
• http://www.cs.utexas.edu/~shmat/
Micro-data
• Other examples?
Medical Domains
• Enable research
• Protect public health
• Patient consent
Census Data
• Can I find out statistics about Phil
Knight from the census?
• Can I build a series of queries that zero
in on an individual?
Inference Control
• D. Denning in the late 1970s and early 1980s
• Characteristic formula
– query set
– query set size control
• only answer queries with query set size ≥ t
– dual
• if total N records, only answer queries with size
≤N–t
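Query set size control can be sketched as a gate in front of the aggregate: answer only when the query set holds between t and N − t records, refusing queries that isolate very small groups or their complements. The threshold, records, and predicate below are hypothetical.

```python
def answer_query(records, predicate, t, aggregate):
    """Denning-style query set size control: evaluate the aggregate
    only when t <= |query set| <= N - t, else refuse (return None)."""
    query_set = [r for r in records if predicate(r)]
    n, total = len(query_set), len(records)
    if n < t or n > total - t:
        return None  # refused: query set too small, or complement too small
    return aggregate(query_set)
```

The dual bound (n ≤ N − t) matters because a query matching all but one record reveals as much about that individual as a query matching only that record.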
Anderson’s example
• The two queries
– Average salary of all professors in the
computer laboratory
– Average salary of all male professors in the
computer laboratory
• Reveals the salary of the unique female
professor in the computer laboratory
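The arithmetic behind this tracker attack is elementary: from each average and its group size, recover the group total, and the difference of the totals is the one salary not shared by both query sets. The salary figures below are hypothetical.

```python
# Hypothetical lab: one female professor and three male professors.
salaries = {"f": 90, "m1": 80, "m2": 70, "m3": 60}

n_all = len(salaries)
avg_all = sum(salaries.values()) / n_all            # answered: all professors

males = {k: v for k, v in salaries.items() if k != "f"}
n_male = len(males)
avg_male = sum(males.values()) / n_male             # answered: male professors

# Both query sets are large enough to pass a size control of, say, t = 2,
# yet subtracting the totals isolates the unique female professor's salary.
female_salary = n_all * avg_all - n_male * avg_male
```

Note that query set size control alone does not block this: both queries individually look innocuous, and only their combination is revealing.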
Fascinating Area
• Watch this space
• Consent is key to participation
• Understanding the consequences of
consent is difficult
• Who knew that your IMDb posts could
unlock other data revealing more
complete Netflix behavior?
• https://www.google.com/ads/preferences