Quantifying Location Privacy:
The Case of Sporadic Location Exposure
Reza Shokri
George Theodorakopoulos
George Danezis
Jean-Pierre Hubaux
Jean-Yves Le Boudec
The 11th Privacy Enhancing Technologies Symposium (PETS), July 2011
Mobility
[Framework figure: the user's actual trajectory is exposed through the application (exposed trajectory), distorted by the protection mechanism (distorted trajectory), and observed by the adversary, whose attack yields a reconstructed trajectory; the privacy metric compares the reconstructed trajectory with the actual one.]
● Assume time and location are discrete…
Location-based Services
• Sporadic vs. Continuous Location Exposure
• Application Model
[Figure: a Mobility Model generates the actual location of user ‘u’ at time ‘t’; the application then decides whether the location is exposed (0/1).]
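The 0/1 exposure decision in the application model can be sketched as a Bernoulli draw per time instant. This is an illustrative simplification, not the paper's implementation; the function name `expose` and the rate 0.3 are assumptions.

```python
import random

def expose(theta):
    """Application-model sketch: at each time instant the application
    exposes the user's actual location with probability theta (1)
    or keeps it hidden (0)."""
    return 1 if random.random() < theta else 0

# Simulate a sporadic exposure pattern over 10 time instants.
random.seed(0)
trace = [expose(0.3) for _ in range(10)]
```

A "once-in-a-while" application corresponds to a small theta; a continuous-exposure application approaches theta = 1.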
Protection Mechanisms
[Figure: the actual trajectory of user ui over a grid of 25 numbered regions. The protection mechanism first obfuscates each actual location into an observed location, then anonymizes the user before the trace reaches the application.]
● Consider a given user at a given time instant
Protection Mechanisms
• Model
– Output: the observed location of pseudonymous user u’ at time t
– The mechanism also determines the user-to-pseudonym assignment
● User pseudonyms stay unchanged over time…
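The user-to-pseudonym assignment can be illustrated as a random permutation that stays fixed over time, matching the model above. A minimal sketch; the helper `assign_pseudonyms`, the seed, and the user names are hypothetical.

```python
import random

def assign_pseudonyms(users, seed=1):
    """Anonymization sketch: draw a random permutation that maps each
    user to a persistent pseudonym. The mapping is fixed once and,
    as in the model, does not change over time."""
    rng = random.Random(seed)
    pseudonyms = list(range(len(users)))
    rng.shuffle(pseudonyms)
    return dict(zip(users, pseudonyms))

mapping = assign_pseudonyms(["alice", "bob", "carol"])
```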
Adversary
• Background Knowledge
– Stronger: Users’ transition probability between locations
• Markov Chain transition probability matrix
– Weaker: Users’ location distribution over space
• Stationary distribution of the ‘transition probability matrix’
● The adversary also knows the PDFs associated with the ‘application’ and the ‘protection mechanism’
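The weaker form of background knowledge, the stationary distribution of a user's transition matrix, can be sketched by power iteration. The toy 2-location profile below is illustrative, not a real mobility profile from the paper.

```python
def stationary_distribution(P, iterations=1000):
    """Sketch: approximate the stationary distribution pi of a Markov
    transition matrix P (the vector satisfying pi = pi P) by
    repeatedly multiplying a uniform start vector by P."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iterations):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Toy mobility profile: row i gives Pr[next location = j | current = i].
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = stationary_distribution(P)
```

Here the stationary distribution is (5/6, 1/6): the user spends most time in location 0.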
Adversary
• Localization Attack
– What is the probability that Alice is at a given location at a
specific time instant? (given the observation and
adversary’s background knowledge)
– Bayesian inference relying on a Hidden Markov Model
• Forward-Backward algorithm, maximum-weight assignment
● Find the details of the attack in the paper
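The inference step of the localization attack can be sketched with the standard forward-backward algorithm on a toy Hidden Markov Model. The matrices below are illustrative assumptions, not the paper's mobility profiles or observation model.

```python
def forward_backward(P, prior, emit, obs):
    """Localization-attack sketch: posterior Pr[location at time t | all
    observations] for one user.
    P: transition matrix, prior: distribution over initial locations,
    emit[x][o]: Pr[observation o | actual location x]."""
    n, T = len(P), len(obs)
    # Forward pass: alpha[t][x] ∝ Pr[obs[0..t], location at t = x].
    alpha = [[prior[x] * emit[x][obs[0]] for x in range(n)]]
    for t in range(1, T):
        alpha.append([emit[x][obs[t]] *
                      sum(alpha[-1][y] * P[y][x] for y in range(n))
                      for x in range(n)])
    # Backward pass: beta[t][x] ∝ Pr[obs[t+1..] | location at t = x].
    beta = [[1.0] * n for _ in range(T)]
    for t in range(T - 2, -1, -1):
        beta[t] = [sum(P[x][y] * emit[y][obs[t + 1]] * beta[t + 1][y]
                       for y in range(n)) for x in range(n)]
    # Normalize per time instant to obtain the posterior.
    posterior = []
    for t in range(T):
        w = [alpha[t][x] * beta[t][x] for x in range(n)]
        s = sum(w)
        posterior.append([v / s for v in w])
    return posterior

# Toy instance: 2 locations, noisy observations (e.g. due to obfuscation).
P = [[0.9, 0.1], [0.5, 0.5]]
prior = [0.5, 0.5]
emit = [[0.8, 0.2], [0.2, 0.8]]
posterior = forward_backward(P, prior, emit, obs=[0, 0, 1])
```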
Location Privacy Metric
• Anonymity?
– How successfully can the adversary link the user
pseudonyms to their identities?
– Metric: The percentage of correct assignments
• Location Privacy?
– How correctly can the adversary localize the users?
– Metric: Expected Estimation Error (Distortion)
● Justification:
R. Shokri, G. Theodorakopoulos, J-Y. Le Boudec, J-P. Hubaux.
‘Quantifying Location Privacy’. IEEE S&P 2011
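The distortion metric can be sketched as the posterior-weighted distance between each candidate location and the user's actual location. A minimal illustration; the distance function and the numbers are assumptions.

```python
def expected_estimation_error(posterior, actual, dist):
    """Location-privacy metric sketch: the adversary's expected
    estimation error (distortion) under its posterior belief."""
    return sum(p * dist(x, actual) for x, p in enumerate(posterior))

# Toy example on a line of locations, with |x - y| as the distance.
privacy = expected_estimation_error([0.1, 0.7, 0.2], actual=1,
                                    dist=lambda x, y: abs(x - y))
```

A sharper posterior concentrated on the actual location drives the error, and hence the user's location privacy, toward zero.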
Evaluation
• Location-Privacy Meter
– Input: Actual Traces
• Vehicular traces in San Francisco: 20 mobile users moving among 40 regions
– Output: ‘Anonymity’ and ‘Location Privacy’ of users over time
– Modules: Associated PDFs of ‘Location-based Application’
and ‘Location-Privacy Preserving Mechanisms’
● More information here: http://lca.epfl.ch/projects/quantifyingprivacy
Evaluation
• Location-based Applications
– once-in-a-while APP(o, Θ)
– local search APP(s, Θ)
• Location-Privacy Preserving Mechanisms LPPM(φ, ρ, {u,g})
– fake-location injection (with rate φ)
• (u) Uniform selection
• (g) Selection according to the average mobility profile
– location obfuscation (with parameter ρ)
• ρ: The number of removed low-order bits from the location identifier
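The two mechanisms above can be sketched as follows. This is a simplified illustration; `obfuscate` and `fake_location` are hypothetical helpers, not the tool's API.

```python
import random

def obfuscate(location, rho):
    """Obfuscation sketch: drop the rho low-order bits of the location
    identifier, so 2**rho locations map to the same observed value."""
    return location >> rho

def fake_location(profile, rng):
    """Fake-location injection sketch: pick a decoy location either
    uniformly (pass a uniform profile, variant u) or according to the
    average mobility profile (variant g)."""
    return rng.choices(range(len(profile)), weights=profile)[0]

rng = random.Random(0)
decoy = fake_location([0.25, 0.25, 0.25, 0.25], rng)
```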
Results – Anonymity
[Results figure omitted]
Results – Location Privacy
(φ: the fake-location injection rate)
[Results figure omitted]
More Results – Location Privacy
[Figure: location privacy under uniform fake-location selection, for the parameter settings (obfuscation, fake injection, hiding): (0, 0.0, 0.0), (2, 0.0, 0.0), (4, 0.0, 0.0), (0, 0.3, 0.0), (0, 0.5, 0.0), (0, 0.0, 0.3), (0, 0.0, 0.5).]
Conclusions & Future Work
• The effectiveness of ‘Location-Privacy Preserving Mechanisms’
cannot be evaluated independently of the ‘Location-based
Application’ used by the users
• The fake-location injection technique is very effective for ‘sporadic
location exposure’ applications
– Advantage: no loss of quality of service
– Drawback: more traffic exchange
• The ‘Location-Privacy Meter’ tool has been enhanced to model
applications as well as new protection mechanisms, notably fake-location injection
• Changing pseudonyms over time: to be added to our probabilistic
framework
Location-Privacy Meter (LPM):
A Tool to Quantify Location Privacy
http://lca.epfl.ch/projects/quantifyingprivacy