Online Privacy Issues Overview

Design for Privacy
September 2009
CyLab Usable Privacy and Security Laboratory
http://cups.cs.cmu.edu/
Outline

• Engineering privacy
• Design of privacy tools
• Design for privacy in everyday software
• Obtaining informed consent
Engineering privacy
How Privacy Rights are Protected

By policy
– Protection through laws and organizational privacy policies
– Must be enforced
– Often requires mechanisms to obtain and record consent
– Transparency facilitates choice and accountability
– Technology facilitates compliance and reduces the need to rely solely on trust and external enforcement
– Technology reduces or eliminates any form of manual processing or intervention by humans
– Violations still possible due to bad actors, mistakes, government mandates

By architecture
– Protection through technology
– Reduces the need to rely on trust and external enforcement
– Violations only possible if technology fails or the availability of new data or technology defeats protections
– Often viewed as too expensive or restrictive
  • Limits the amount of data available for data mining, R&D, targeting, and other business purposes
  • May require a more complicated system architecture or expensive cryptographic operations
  • Pay now or pay later
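Privacy by policy often hinges on "mechanisms to obtain and record consent." As a rough illustration of what such a mechanism might look like, here is a minimal, hypothetical sketch of a consent ledger; the class and field names are our own invention, not any particular system's API.

```python
# Hypothetical sketch of a consent ledger for privacy-by-policy.
# All names and fields are illustrative assumptions, not a real API.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    user_id: str
    purpose: str          # e.g., "marketing_email", "usage_analytics"
    granted: bool
    timestamp: datetime

class ConsentLedger:
    def __init__(self):
        self._records: list[ConsentRecord] = []

    def record(self, user_id: str, purpose: str, granted: bool) -> None:
        """Append a consent decision; keep history for accountability."""
        self._records.append(
            ConsentRecord(user_id, purpose, granted,
                          datetime.now(timezone.utc)))

    def is_permitted(self, user_id: str, purpose: str) -> bool:
        """The most recent decision for this user and purpose wins."""
        for rec in reversed(self._records):
            if rec.user_id == user_id and rec.purpose == purpose:
                return rec.granted
        return False  # no recorded consent means no processing

ledger = ConsentLedger()
ledger.record("alice", "usage_analytics", granted=True)
ledger.record("alice", "usage_analytics", granted=False)  # later withdrawal
assert not ledger.is_permitted("alice", "usage_analytics")
```

Keeping the full decision history, rather than just the latest flag, supports the transparency and accountability goals named above.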
Degrees of Identifiability

Privacy stage 0: identified
  Approach to privacy protection: privacy by policy (notice and choice)
  Linkability of data to personal identifiers: linked
  System characteristics:
  • unique identifiers across databases
  • contact information stored with profile information

Privacy stage 1: identified
  Approach to privacy protection: privacy by policy (notice and choice)
  Linkability of data to personal identifiers: linkable with reasonable & automatable effort
  System characteristics:
  • no unique identifiers across databases
  • common attributes across databases
  • contact information stored separately from profile or transaction information

Privacy stage 2: pseudonymous
  Approach to privacy protection: privacy by architecture
  Linkability of data to personal identifiers: not linkable with reasonable effort
  System characteristics:
  • no unique identifiers across databases
  • no common attributes across databases
  • random identifiers
  • contact information stored separately from profile or transaction information
  • collection of long-term personal characteristics at a low level of granularity
  • technically enforced deletion of profile details at regular intervals

Privacy stage 3: anonymous
  Approach to privacy protection: privacy by architecture
  Linkability of data to personal identifiers: unlinkable
  System characteristics:
  • no collection of contact information
  • no collection of long-term personal characteristics
  • k-anonymity with a large value of k

Sarah Spiekermann and Lorrie Faith Cranor. Engineering Privacy. IEEE Transactions on Software Engineering, Vol. 35, No. 1, January/February 2009, pp. 67-82. http://ssrn.com/abstract=1085333
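The last row mentions k-anonymity with a large value of k: every combination of quasi-identifier values must be shared by at least k records, so no individual stands out. As a concrete illustration, the toy sketch below (our own example, not from the paper) checks this property over a chosen set of quasi-identifier attributes.

```python
# Toy k-anonymity check: every quasi-identifier combination must
# appear in at least k records. Illustrative only.
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """Return True if each quasi-identifier combination occurs >= k times."""
    combos = Counter(
        tuple(rec[attr] for attr in quasi_identifiers) for rec in records
    )
    return all(count >= k for count in combos.values())

records = [
    {"zip": "152**", "age_range": "20-29", "purchase": "book"},
    {"zip": "152**", "age_range": "20-29", "purchase": "movie"},
    {"zip": "152**", "age_range": "30-39", "purchase": "book"},
]
print(is_k_anonymous(records, ["zip", "age_range"], k=2))
# False: one record is alone in the ("152**", "30-39") group,
# so that person is re-identifiable despite the generalized values.
```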
Design of Privacy Tools
Privacy tool examples

• Cookie managers
• Anonymizers
• Encryption tools
• Disk wiping utilities
• P3P user agents
Laptop Compubody Sock for privacy, warmth, and concentration in public spaces
Created by Becky Stern
http://sternlab.org/2008/04/body-technology-interfaces/
Issues to consider

• Privacy is a secondary task
  – Users of privacy tools often seek out these tools due to their awareness of or concern about privacy
  – Even so, users still want to focus on their primary tasks
• Users have differing privacy concerns and needs
  – A one-size-fits-all interface may not work
• Most users are not privacy experts
  – Difficult to explain the current privacy state or future privacy implications
  – Difficult to explain privacy options to them
  – Difficult to capture privacy needs/preferences
• Many privacy tools reduce application performance, functionality, or convenience
Case study: Tor

• Internet anonymity system
• Allows users to send messages that cannot be traced back to them (web browsing, chat, P2P, etc.)
• UI was mostly a command-line interface until recently
• 2005 Tor GUI competition
  – CUPS team won phase 1 with its design for FoxTor!
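Tor itself is the anonymity layer; applications reach it through a local SOCKS proxy. As a small illustration of how a program routes traffic through a running Tor client, here is a sketch that assumes Tor is listening on its default SOCKS port 9050 and that the requests library is installed with SOCKS support (pip install "requests[socks]").

```python
# Route an HTTP request through a locally running Tor client.
# Assumes Tor's SOCKS proxy on 127.0.0.1:9050.
import requests

TOR_PROXY = "socks5h://127.0.0.1:9050"  # socks5h: resolve DNS via Tor too

def fetch_via_tor(url: str) -> str:
    response = requests.get(
        url,
        proxies={"http": TOR_PROXY, "https": TOR_PROXY},
        timeout=60,  # Tor circuits are slower than direct connections
    )
    response.raise_for_status()
    return response.text

if __name__ == "__main__":
    # This Tor Project service echoes the IP it sees, which should be
    # a Tor exit relay's address rather than the client's own.
    print(fetch_via_tor("https://check.torproject.org/api/ip"))
```

The socks5h scheme matters: with plain socks5, DNS lookups would go out directly and leak the sites a user visits.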
One-size-doesn’t-fit-all problem

• Tor is configurable, and different users will want to configure it in different ways
  – But most users won’t understand the configuration options
  – Give users choices, not dilemmas
• We began by trying to understand our users
  – No budget, little time, limited access to users
  – So we brainstormed about their needs, tried to imagine them, and developed personas for them
• This process led to the realization that our users had three categories of privacy needs
  – Basic, selective, critical
• Instead of asking users to figure out complicated settings, most of our configuration involves figuring out which types of privacy needs they have
Understand the primary task

• Anonymity is not a primary task
• What are the primary tasks our users are engaged in when they want anonymity?
• Lots of them: web browsing, chatting, file sharing, etc., but we speculate that browsing will be most frequent for most users
• So, instead of building an anonymity tool that you can use to anonymize web browsing…
• … build a web browser with built-in anonymity functions
Metaphors

• Because of performance issues and problems accessing some web sites through Tor, some users will want to turn the anonymity function on and off
• It is important to make it easy for users to determine the current state
• Communicate the state through a visual symbol and a readily understandable metaphor
• Brainstormed possibilities: torized/untorized, private/exposed, cloaked/uncloaked, masked/unmasked
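One way to keep the current state unambiguous is to model it explicitly and derive every indicator (icon, label, proxy settings) from that single piece of state, so they can never disagree. Below is a minimal, hypothetical sketch of such a toggle using the masked/unmasked metaphor; the class and method names are our own, not FoxTor's.

```python
# Hypothetical sketch: a single source of truth for the anonymity
# state, so the icon, label, and proxy settings always agree.
from enum import Enum

class AnonymityState(Enum):
    MASKED = "masked"      # traffic routed through Tor
    UNMASKED = "unmasked"  # direct connection

class AnonymityToggle:
    TOR_PROXY = "socks5h://127.0.0.1:9050"  # assumed local Tor client

    def __init__(self):
        self.state = AnonymityState.UNMASKED

    def toggle(self) -> AnonymityState:
        self.state = (AnonymityState.UNMASKED
                      if self.state is AnonymityState.MASKED
                      else AnonymityState.MASKED)
        return self.state

    def proxies(self) -> dict:
        """Proxy settings derived from, never stored alongside, the state."""
        if self.state is AnonymityState.MASKED:
            return {"http": self.TOR_PROXY, "https": self.TOR_PROXY}
        return {}

    def status_label(self) -> str:
        """Readily understandable label for the UI."""
        if self.state is AnonymityState.MASKED:
            return "You are masked"
        return "You are NOT masked"
```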
Design for privacy in everyday software
Examples

• E-commerce personalization systems
  – Concerns about use of user profiles
• Software that “phones home” to fetch software updates or refresh content, report bugs, relay usage data, verify authorization keys, etc.
  – Concerns that software will track and profile users
• Communications software (email, IM, chat)
  – Concerns about traffic monitoring and eavesdroppers
• Presence systems (buddy lists, shared spaces, friend finders)
  – Concerns about limiting when info is shared and with whom
Issues to consider

• Similar to the issues to consider for privacy tools, PLUS:
• Users may not be aware of privacy issues up front
  – When they find out about privacy issues they may be angry or confused, especially if they view the notice as inadequate or the defaults as unreasonable
• Users may have to give up functionality or convenience, or spend more time configuring the system, for better privacy
• Failure to address privacy issues adequately may lead to bad press and legal action
The Prada NYC dressing room

• http://www.sggprivalite.com/
• What aspects seem privacy invasive?
• How could the design be changed to reduce privacy concerns?
Amazon.com privacy makeover
Streamline menu navigation for customization

Provide a way to set up default rules

• Every time a user makes a new purchase that they want to rate or exclude, they have to edit profile info
  – There should be a way to set up default rules, for example (see the sketch after this list):
    • Exclude all purchases
    • Exclude all purchases shipped to my work address
    • Exclude all movie purchases
    • Exclude all purchases I had gift wrapped
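Default rules like these are straightforward to express as predicates evaluated against each new purchase. A hypothetical sketch follows; the purchase fields and rule names are illustrative, not Amazon's actual data model.

```python
# Hypothetical sketch: default exclusion rules as predicates over
# purchases. Field names are illustrative, not Amazon's data model.
from dataclasses import dataclass

@dataclass
class Purchase:
    title: str
    category: str
    ship_to: str        # e.g., "home", "work"
    gift_wrapped: bool

# Each rule maps a purchase to True if it should be excluded
# from the recommendation profile.
DEFAULT_RULES = {
    "exclude_all": lambda p: True,
    "exclude_shipped_to_work": lambda p: p.ship_to == "work",
    "exclude_movies": lambda p: p.category == "movie",
    "exclude_gift_wrapped": lambda p: p.gift_wrapped,
}

def excluded_from_profile(purchase, enabled_rules):
    """Apply the user's chosen default rules to a new purchase."""
    return any(DEFAULT_RULES[name](purchase) for name in enabled_rules)

p = Purchase("Some DVD", "movie", ship_to="work", gift_wrapped=True)
print(excluded_from_profile(p, ["exclude_movies"]))  # True
```

With rules applied automatically at purchase time, the user edits their profile once rather than after every order.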
Remove excluded purchases from profile

• Users should be able to remove items from their profile
• If purchase records are needed for legal reasons, users should be able to request that they not be accessible online
Better: options for controlling recent history
Use personae

• Amazon already allows users to store multiple credit cards and addresses
• Why not allow users to create personae linked to each card and address, with the option of keeping recommendations and history separate? (This would allow an easy way to separate work/home/gift personae.)
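A hypothetical data-model sketch of the idea: each persona bundles a payment card and address with its own purchase history, so activity under one persona never feeds the recommendations shown under another. All names here are our own invention.

```python
# Hypothetical sketch of per-persona separation: each persona keeps
# its own purchase history, so recommendations never leak across
# work/home/gift contexts. Names are illustrative.
from dataclasses import dataclass, field

@dataclass
class Persona:
    name: str                      # e.g., "work", "home", "gifts"
    card_id: str                   # stored payment card it maps to
    address_id: str                # stored shipping address it maps to
    history: list = field(default_factory=list)

class Account:
    def __init__(self):
        self.personas: dict[str, Persona] = {}

    def add_persona(self, persona: Persona) -> None:
        self.personas[persona.name] = persona

    def record_purchase(self, persona_name: str, item: str) -> None:
        # Purchases accrue only to the active persona's history.
        self.personas[persona_name].history.append(item)

    def recommendation_basis(self, persona_name: str) -> list:
        # Recommendations are computed only from this persona's history.
        return self.personas[persona_name].history
```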
Allow users to access all privacy-related options in one place

• Currently, privacy-related options are found with the relevant features
• Users have to be aware of the features to find the options
• Put them all in one place
• But also leave them with the relevant features
I didn’t buy it for myself

How about an “I didn’t buy it for myself” check-off box (perhaps automatically checked if gift wrapping is requested)?

[Mockup: a checkbox labeled “I didn’t buy it for myself”]
Other ideas for improving the Amazon privacy interface?
Obtaining informed consent

• Many software products contain phone-home features, for example, for performing software updates or monitoring usage patterns. In some cases software phones home quite frequently, for example, to update phishing blacklists or check for fresh image files. Users may be concerned that the software company is using these features to track or profile them. Thus it is important that the software is up front about the fact that it is phoning home. Furthermore, some users may wish to disable such features or be prompted every time before the software phones home (due to privacy or other concerns), whereas other users are happy to have them operate automatically.
• Discuss the various approaches you have seen different software manufacturers take to addressing this problem. What do you like/dislike about them?
• How should phone-home features be designed so that they facilitate informed consent? Describe an example user interface design and general principles that might be applied to specific cases. (One possible sketch follows below.)
• What sort of user studies should be performed to test this user interface design?
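As one possible answer to the design question above, here is a minimal, hypothetical sketch of a consent-gated update check with three user-selectable modes (always allow, ask each time, never). The policy names and the check_for_updates() helper are invented for illustration.

```python
# Hypothetical sketch: a phone-home feature gated on user consent.
# Policy names and check_for_updates() are invented for illustration.
from enum import Enum

class PhoneHomePolicy(Enum):
    ALWAYS = "always"   # phone home automatically
    ASK = "ask"         # prompt the user every time
    NEVER = "never"     # feature disabled

def check_for_updates():
    """Placeholder for the actual network request to the vendor."""
    print("Contacting update server...")

def maybe_phone_home(policy: PhoneHomePolicy) -> None:
    if policy is PhoneHomePolicy.NEVER:
        return  # respect the user's choice; no silent override
    if policy is PhoneHomePolicy.ASK:
        # Tell the user what will be sent and to whom, up front.
        answer = input("Check for updates now? This contacts the "
                       "vendor's server and sends your version number. "
                       "[y/N] ")
        if answer.strip().lower() != "y":
            return
    check_for_updates()

maybe_phone_home(PhoneHomePolicy.ASK)
```

The key informed-consent properties are that the prompt states what is sent and to whom before any connection is made, and that "never" is a real off switch rather than a suppressed notification.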