Transcript File - George Corser

Slide: 1
Dissertation Proposal
Securing Location Privacy in
Vehicular Communication Systems
and Applications
George Corser, PhD Candidate
Oakland University
May 1, 2014
Slide: 2
Agenda
1. Background
2. Problem Statement
3. Related Work
4. Preliminary Results
5. Proposed Research
Slide: 3
1. Background
• What is VANET?
• DSRC Protocol Stack(s)
• Why VANET?
• What is Privacy?
• VANET Privacy Threat Model
Slide: 4
What is VANET?
• VANET: Vehicular Ad-hoc Network
• GPS: Global Positioning System
• RSU: Roadside Unit
• V2V: Vehicle-to-vehicle
• V2R: Vehicle-to-roadside (also called V2I, vehicle-to-infrastructure)
Image source: http://adrianlatorre.com/projects/pfc/img/vanet_full.jpg
Slide: 5
DSRC Protocol Stack(s)
• Two DSRC stacks
  – WSMP: WAVE Short Message Protocol
  – TCP/IP (used by Location Based Services)
• DSRC: Dedicated Short Range Communications
• WAVE: Wireless Access for Vehicular Environments
Image source: Kenney, 2011
Slide: 6
Why VANET?
Major Applications
• Safety
  – Application: Collision Avoidance
  – Est: Eliminate 82% of crashes of non-impaired drivers (US DOT)
  – Est: $299.5 billion for traffic crashes (AAA)
• Traffic Management
  – Application: Congestion reduction
  – Est: $97.7 billion for congestion (AAA)
• Infotainment (LBS)
  – Applications: Simple queries, Navigation
  – Application: Frequent precise location (FPL) queries
Slide: 7
What is Privacy?
• Definitions of privacy
– Charles Fried (1984): “Privacy is not simply an absence of information about us in the minds of others, rather it is the control we have over information about ourselves.”
– James Moor (1997): “I agree that it is highly desirable that we control information about ourselves. However, in a highly computerized culture this is simply impossible.”
– IEEE 1609.2 (2013): “Anonymity—meaning the ability of private drivers to maintain a certain amount of privacy—is a core goal of the system.”
Slide: 8
What is Privacy?
• Types of privacy
– Identity privacy: unlinkability with personally identifiable information (PII); often achieved with pseudonyms.
– Location privacy: unlinkability of PII with a geographical position, and further, the unlinkability of one pseudonym with another by using location data.
– Query privacy: unlinkability of PII, not only with location, but also with the particular type of request made or service used.
• This research would focus on location privacy.
Slide: 9
What is Privacy?
• Desired properties of vehicle network privacy systems
1. Safety (Collision Avoidance)
2. Trust (Authentication)
3. Identity Privacy (Pseudonymity*)
4. Location Privacy (Untrackability)
5. Historical Privacy (Untraceability)
6. Conditional Privacy (Accountability)
7. Revocability
8. Trust Authority Decentralization
9. Anonymous LBS Access (LBS Pseudonymity)
10. Map Database Undeanonymizability
11. Context Awareness (Contextuality)
12. User Consent, Choice, Control
* a.k.a. anonymous authentication, pseudonymous authentication
Slide: 10
VANET Privacy Threat Model
• MAC Layer
  – RSU: Roadside unit, a wireless access point for vehicles to connect to wired network infrastructure
• APP Layer
  – LBS: Location Based Service, an internet application which uses geographical position as input (e.g., Google Navigation)
Slide: 11
2. Problem Statement
• VANET (MAC Layer)
– Ultra low latency, for safety
– Low overhead, for wireless efficiency
– Conditional/revocable anonymity, for privacy
• LBS (APP Layer)
– Frequent precise location (FPL) service availability
  – Undeanonymizable* anonymous service access with privacy over wide geographical range
• How to achieve vehicular location privacy?
* protect from RSU/LBS collusion and map deanonymization
Slide: 12
3. Related Work
• Location Privacy Techniques
• Location Privacy Theory
• Dummy Events
• Dummy Events v. Active Decoys
• Location Privacy Metrics
Slide: 13
Location Privacy Techniques
• Group signature
– Chaum, 1991, 1712 citations
– Boneh, Boyen, Shacham, 2004, 1024 citations
• Mix zones
– Beresford, Stajano, 2003, 1068 citations
• Cloaking, anonymous LBS
– Gruteser, Greenwald, 2003, 1303 citations
Slide: 14
Location Privacy Theory
Group Signatures
Mix Zones
Cloaking
Image source: Shokri (2010)
Slide: 15
Dummy Events
Source: Location Privacy in Pervasive Computing, Beresford & Stajano, 2003
Early abandonment
Assumption: many concentrated vehicles require continuous privacy protection?
Slide: 16
Dummy Events
Authors | Methods | Category
You, Peng and Lee (2007) | Random trajectory | Spatial shift
Lu, Jensen and Yiu (2008) | Virtual grid, virtual circle | Spatial shift
Chow & Golle (2009) | Google Maps polyline | Trajectory database
Kido, Yanagisawa and Satoh (2009) | Moving in a neighborhood | Spatial shift
Krumm (2009) | Data gathered from GPS receivers, then modified with noise | Trajectory database
Alnahash, Corser, Fu, Zhu (ASEE, 2014) | Random trajectory confined to road grid | Spatial shift
Corser et al. (IEEE, 2014) | “Live” dummies generated by active vehicles | Active decoy
Recent resurgence, special applicability to vehicular settings
Assumption: only a subset of users desire privacy?
Slide: 17
Dummy Events v. Active Decoys
• Dummy event: a message containing false data, sent in order to help conceal a genuine message. Dummy events and genuine messages are sent by the same genuine entity, and function analogously to aircraft flares.
• Active decoy: a dummy event sent by an entity pretending to be the genuine one. Active decoys function analogously to fleeing and dispersing animals in a herd.
The proposed research is designed to examine the tradeoffs between safety, efficiency and privacy using dummy event and active decoy methods.
Slide: 18
Metrics*
• Anonymity Set Size: |AS|
• Entropy of |AS|: H( |AS| )
• Tracking Probability: Pt = Prob( |AS| = 1 )
• Short-term Disclosure (SD)
• Long-term Disclosure (LD)
• Distance Deviation (dst)
* See supplemental slides for equations
Slide: 19
4. Preliminary Results
• EPZ: Endpoint Protection Zone
• PBD: Privacy by Decoy
• RRVT: Random Rotation of Vehicle Trajectory
Slide: 20
Endpoint Vulnerability
• Motorists will use LBS applications (V2I)
• LBS administrators can cross-reference vehicle trajectory endpoints with map databases to identify the LBS user (privacy problem)
LBS: Location Based Service (like Google Navigation)
Slide: 21
Cloaking Under FPL
• Under FPL, cloaking can be defeated by examining the trajectory (series of snapshots)
#1: Vehicle/roadway mobility is more predictable than mobile phone mobility.
#2: What if there are no other active LBS users in the vicinity?
Slide: 22
EPZ
• Endpoint Protection Zone (EPZ)
V: number of vehicles in region R
λ: ratio of LBS user vehicles to V
A: area of R
w, h: width, height of the EPZ (endpoint protection zone)
E{ |AS_EPZ| } = λVwh/A
The “Corserian” mix zone provides a “Snowden” privacy defense, and defends against map deanonymization.
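As a quick illustration of the expression above, a minimal Python sketch (hypothetical parameter values, not taken from the simulations) computes the expected anonymity set size E{ |AS_EPZ| } = λVwh/A:

```python
def expected_epz_anonymity_set(num_vehicles, lbs_ratio, epz_width, epz_height, region_area):
    """Expected anonymity set size inside an Endpoint Protection Zone.

    E{|AS_EPZ|} = lambda * V * w * h / A, assuming LBS-user vehicles are
    uniformly distributed over the region.
    """
    return lbs_ratio * num_vehicles * epz_width * epz_height / region_area

# Hypothetical example: 1000 vehicles in a 3000 m x 3000 m region,
# 10% LBS users, 600 m x 600 m EPZ.
print(expected_epz_anonymity_set(1000, 0.1, 600, 600, 3000 * 3000))  # -> 4.0
```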
Slide: 23
EPZ Simulation Set-up
• Realistic mobility models [15][16][17]: MMTS
  – Did not want to use grid-like models (e.g. Manhattan) because the EPZ is square-shaped
• Counted vehicles originating in the EPZ
• Computed metrics
  – Metrics: |AS|, H(|AS|), Pt
  – Variables: LBS user percentage, λ, and EPZ size
MMTS: Multi-agent Microscopic Traffic Simulator [16]
Slide: 24
Metric: |AS|
• The anonymity set, AS_i, of target LBS user, i, is the collection of all LBS users, j, including i, within the set of all LBS userIDs, ID, whose trajectories, T_j, are indistinguishable from T_i
AS_i = { j | j ∈ ID, ∃ T_j s.t. p(i,j) ≠ 0 }
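The following minimal Python sketch (hypothetical data structures, not the simulation code) builds the anonymity set of a target user from a table of indistinguishability probabilities p(i,j):

```python
def anonymity_set(target, user_ids, p):
    """Anonymity set AS_i: all LBS users j whose trajectory T_j is
    indistinguishable from T_i, i.e. p(i, j) != 0 (target included)."""
    return {j for j in user_ids if p.get((target, j), 0.0) != 0.0}

# Hypothetical probabilities: from the adversary's view, user 1's trajectory
# could belong to users 1, 2 or 3, but not user 4.
p = {(1, 1): 0.5, (1, 2): 0.25, (1, 3): 0.25, (1, 4): 0.0}
print(anonymity_set(1, [1, 2, 3, 4], p))  # -> {1, 2, 3}
```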
Slide: 25
Metric: H(|AS|)
• Entropy expresses the level of uncertainty in the correlations between T_i and T_j
• It is the sum of the products of all probabilities and their logarithms, base 2.
H_i = − Σ_{j ∈ AS_i} p(i,j) · log2( p(i,j) )
If all trajectories are equally likely to be the real one, then H_max = − log2( p(i,j) ) = log2( |AS_i| )
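A small Python sketch (same hypothetical probability table as above) computes H_i and the maximum-entropy bound:

```python
import math

def entropy(target, anonymity_set_members, p):
    """H_i = -sum over j in AS_i of p(i,j) * log2(p(i,j))."""
    return -sum(p[(target, j)] * math.log2(p[(target, j)])
                for j in anonymity_set_members)

# Hypothetical probabilities for target user 1 and its anonymity set {1, 2, 3}.
p = {(1, 1): 0.5, (1, 2): 0.25, (1, 3): 0.25}
members = {1, 2, 3}
print(entropy(1, members, p))    # -> 1.5 bits
print(math.log2(len(members)))   # H_max, about 1.585 bits in the uniform case
```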
Slide: 26
Metric: Pt
• Tracking probability, Pt_i, is defined as the chance that |AS_i| = k = 1
  – If |AS| = 1, then the vehicle has no anonymity
• This metric is important because average Pt tells what percentage of vehicles have some privacy, and what percentage have no privacy at all, not just how much privacy exists in the overall system
Pt_i = P( |AS_i| = 1 )
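A minimal sketch (hypothetical per-vehicle anonymity set sizes) of the average tracking probability over a simulation run:

```python
def average_tracking_probability(anonymity_set_sizes):
    """Fraction of vehicles whose anonymity set collapsed to a single member
    (|AS_i| = 1), i.e. vehicles with no anonymity at all."""
    tracked = sum(1 for size in anonymity_set_sizes if size == 1)
    return tracked / len(anonymity_set_sizes)

# Hypothetical |AS_i| values observed for five LBS users.
print(average_tracking_probability([1, 3, 4, 1, 2]))  # -> 0.4
```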
Slide: 27
Performance Evaluation: |AS|
Average anonymity set size, |AS| = k, for 10% LBS users (λ=0.1) and 20% LBS users (λ=0.2)
Slide: 28
Performance Evaluation: H(|AS|)
Entropy of average anonymity set size, H(|AS|) = H(k), for 10% LBS users (λ=0.1) and 20% LBS users (λ=0.2)
Slide: 29
Performance Evaluation: Pt
Average tracking probability, Pt, for 10% LBS users (λ=0.1) and 20% LBS users (λ=0.2)
Slide: 30
RSU/LBS Collusion Vulnerability
• Suppose a vehicle tried sending a request to an LBS using a false location.
Slide: 31
PBD
• Privacy by Decoy (PBD)
Note: an active decoy is different from a dummy.
PARROTS: Position Altered Requests Relayed Over Time and Space
Slide: 32
Group PBD
Slide: 33
PBD Simulation Setup
• Grid: 3000 m x 3000 m (1.864 mi x 1.864 mi)
• Mobility models: rural, urban and city
• Sim. time: 2000 seconds or 33.3 minutes
• EPZ: 600 m x 600 m (25 EPZs) to 300 m x 300 m (100 EPZs)
• λ = LBS users; ρ = potential parrots; φ = pirates
Slide: 34
PBD Overall Results
Slide: 35
PBD Results
Theoretical values of |AS|, before PBD (EPZ only) and after PBD:
ρ: ratio of potential parrots to total vehicles
φ: ratio of LBS users who desire privacy
Individual login: E{ |AS_EPZpi| } = 1 + ρ/(φλ)
Group login: E{ |AS_EPZpg| } = (λ + ρ)wh/A
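As a quick check of the individual-login expression, a minimal Python sketch with hypothetical ratios (the grouping ρ/(φλ) is my reading of the slide's formula, stated as an assumption):

```python
def expected_as_individual_login(parrot_ratio, privacy_ratio, lbs_ratio):
    """E{|AS_EPZpi|} = 1 + rho/(phi*lambda): the privacy-seeking LBS user plus
    the parrots it can expect to recruit (all arguments are ratios of the
    total vehicle population)."""
    return 1 + parrot_ratio / (privacy_ratio * lbs_ratio)

# Hypothetical ratios: rho = 0.2, phi = 0.5, lambda = 0.1.
print(expected_as_individual_login(0.2, 0.5, 0.1))  # -> 5.0
```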
Slide: 36
Pure Dummy Event Solution
• Can a vehicle transmit dummy events without recruiting parrots?
Slide: 37
RRVT
• Random Rotation of Vehicular Trajectory
Note: vehicles desiring privacy can produce accurate dummies using points from other vehicles which transmit precise locations.
Left image source: You, Peng and Lee, 2007
Slide: 38
Metric: SD
• Short-term Disclosure (SD)
m: time slices
D_i: set of true and dummy locations at time slot i
SD: the probability of an eavesdropper successfully identifying a true trajectory given a set of true and dummy POSITIONS over a short period of time
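The slide's formula image did not survive extraction; the sketch below assumes the closed form SD = (1/m) Σ_i 1/|D_i| suggested by the variables above (an assumption, not taken from the slide):

```python
def short_term_disclosure(location_sets):
    """SD under the assumed closed form (1/m) * sum_i 1/|D_i|, where each
    D_i holds the true position plus the dummy positions at time slot i."""
    m = len(location_sets)
    return sum(1.0 / len(d_i) for d_i in location_sets) / m

# Hypothetical run: 3 time slots, each with 1 true + 4 dummy positions.
slots = [[(0, 0)] + [(x, x) for x in range(1, 5)] for _ in range(3)]
print(short_term_disclosure(slots))  # -> 0.2
```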
Slide: 39
Metric: LD
• Long-term Disclosure (LD)
More overlap means more privacy
n: total trajectories
k: trajectories that overlap
n − k: trajectories that do not overlap
T_k: the number of possible trajectories amongst the overlapping trajectories
LD: the probability of an eavesdropper successfully identifying a true trajectory given a set of true and dummy TRAJECTORIES over a longer period of time
Slide: 40
Overlap Improves Privacy
• 3 trajectories
• 8 possible paths
Image source: You, Peng and Lee, 2007
Slide: 41
Metric: dst
• Distance Deviation (dst)
dst_i: the distance deviation of user i
PL_i^j: the location of true user i at the jth time slot
L_dk^j: the location of the kth dummy at the jth time slot
dist(): expresses the distance between the true user location and the dummy location
n: dummies; m: time slots
dst is the average distance between the trajectories of the dummies and the true user
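A minimal sketch (hypothetical coordinates, Euclidean dist()) of the distance-deviation average over n dummies and m time slots:

```python
import math

def distance_deviation(true_trajectory, dummy_trajectories):
    """dst_i: average of dist(PL_i^j, L_dk^j) over all m time slots j and
    all n dummies k (Euclidean distance assumed here)."""
    m = len(true_trajectory)
    n = len(dummy_trajectories)
    total = sum(math.dist(true_trajectory[j], dummy[j])
                for dummy in dummy_trajectories
                for j in range(m))
    return total / (n * m)

# Hypothetical example: one true trajectory and two dummy trajectories, m = 3.
true_traj = [(0, 0), (1, 0), (2, 0)]
dummies = [[(0, 3), (1, 3), (2, 3)], [(0, 4), (1, 4), (2, 4)]]
print(distance_deviation(true_traj, dummies))  # -> 3.5
```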
Slide: 42
RRVT Simulation Setup
Example real trajectory in red; example dummy trajectories in black
• Sim. time: 20 time slots
• Speed: ~3 squares/slot
• Dummies: sets of 5 to 25
• Manhattan grid 50x50
• Trajectories constrained to roadways every 10 grid squares (dummy generation sketched below)
• Ran simulation nine times per dummy set
• Data presented: median number of trajectory intersection overlaps
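A minimal, hypothetical sketch of the rotation idea: rotate the real trajectory by a random angle about its start point, then snap each point back onto the roadway grid (the parameter values and helper names are illustrative, not the simulator's):

```python
import math
import random

ROAD_SPACING = 10  # roadways every 10 grid squares on the 50x50 Manhattan grid

def snap_to_roadway(x, y, spacing=ROAD_SPACING):
    """Constrain a point to the road grid by snapping whichever coordinate
    is closer to a roadway line."""
    rx, ry = round(x / spacing) * spacing, round(y / spacing) * spacing
    return (rx, y) if abs(x - rx) <= abs(y - ry) else (x, ry)

def rotated_dummy(trajectory, angle=None):
    """Rotate the true trajectory by a random angle about its start point,
    then constrain the rotated points to the roadway grid."""
    angle = random.uniform(0, 2 * math.pi) if angle is None else angle
    x0, y0 = trajectory[0]
    dummy = []
    for x, y in trajectory:
        dx, dy = x - x0, y - y0
        rx = x0 + dx * math.cos(angle) - dy * math.sin(angle)
        ry = y0 + dx * math.sin(angle) + dy * math.cos(angle)
        dummy.append(snap_to_roadway(rx, ry))
    return dummy

# Hypothetical true trajectory moving east along a roadway.
true_traj = [(10, 20), (13, 20), (16, 20), (19, 20)]
print(rotated_dummy(true_traj, angle=math.pi / 2))  # rotated north, snapped to roads
```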
Slide: 43
RRVT Results
Improvement in LD when roadway mobility enforced
Charts: SD and LD (for SD and LD, lower is better)
Slide: 44
5. Proposed Research
• Systematic Study
• Anticipated Contributions
• Timeline
Slide: 45
Systematic Study
• Measure the effectiveness of existing methods (See: Metrics supplemental slides)
• Create new methods* and compare their tradeoffs and effectiveness with existing methods
• Create new metrics, if necessary
• Consider vehicular domain-specific issues
  – Mobility/density (city, suburb, rural), location privacy metrics, mix zone choices, GPS precision, LBS query frequency (esp. FPL), RSU coverage area, LBS market penetration, MAC/APP layer collusion, map deanonymization, ...
* Currently working on gas station mix zone
Slide: 46
Anticipated Contributions
• Combined MAC layer and APP layer privacy has not been studied in vehicular contexts.
• Dummy event and active decoy methods have been ignored for many years. It is possible they may apply in vehicular applications because of the different network architecture.
• Journal publication(s) detailing the discovered mathematical relationships (extending conference papers)
Slide: 47
Timeline
May
• Present this dissertation proposal to DAC on May 1
• Apply to graduate in fall 2014
• Gather early simulation results, develop simulation (simple anonymous LBS access)
June
• Gather more simulation results (LBS access with spatial-temporal cloaking, and active decoy LBS access)
July
• Final simulation results
• Begin dissertation write-up
August (Move to Saginaw)
• Conclude dissertation initial draft write-up
• Submit dissertation draft to DAC prior to August 31
• Meet with adviser to ensure all degree requirements met
• Register for 1 dissertation credit, Fall 2014
• Dissertation write-up
September
• Wait for DAC approval
• Schedule defense
October
• Submit Dissertation Defense Announcement Form to Graduate Study and Lifelong Learning (at least 2 weeks prior to defense)
• Defend before DAC prior to Oct 31
November
• Format dissertation and submit for binding
December
• Graduate December 13
Slide: 48
Slide: 49
Publications to Date
Venue | Topic | Year
IEEE IV 2014: 2014 IEEE Intelligent Vehicles Symposium (Dearborn, MI) | PBD: Privacy-by-decoy | 2014
IEEE ICCVE 2013: Second International Conference on Connected Vehicles & Expo (Las Vegas, NV) | EPZ: Endpoint Protection Zone | 2013
IJITN: International Journal of Interdisciplinary Telecommunications and Networking 2013 | Measuring Attacker Motivation (Journal) | 2013
ACM InfoSecCD 2012: 2012 Information Security Curriculum Development Conference | A tale of two CTs: IP packets rejected by a firewall (Award) | 2012
ACM InfoSecCD 2012: 2012 Information Security Curriculum Development Conference | Professional association membership (Award) | 2012
• Alrajei, N., Corser, G., Fu, H., Zhu, Y. (2014, February). Energy Prediction Based Intrusion Detection In Wireless Sensor Networks. International Journal of Emerging Technology and Advanced Engineering (IJETAE), Volume 4, Issue 2. (Journal)
• Oluoch, J., Corser, G., Fu, H., Zhu, Y. (2014, April). Simulation Evaluation of Existing Trust Models in Vehicular Ad Hoc Networks. In 2014 American Society For Engineering Education North Central Section Conference (ASEE NCS 2014).
• Alnahash, N., Corser, G., Fu, H. (2014, April). Protecting Vehicle Privacy using Dummy Events. In 2014 American Society For Engineering Education North Central Section Conference (ASEE NCS 2014).
Slide: 50
Location Privacy:
Vehicular Methods and Techniques
Method | Technique | MAC Layer | IP, APP Layers
Hiding | Silent period | Unsafe | No service
+ Anonymizing | Mix zone | Unsafe | No service
Anonymizing | PseudoID | OK (1) | OK (1)
Dummifying | False data | Unsafe | Congestion
+ Dummifying | Cloaking region | Latency: TTP (2) | Congestion
+ Dummifying | Active decoy | OK (2) | Congestion
Obfuscating | Noise | Unsafe | Impaired service

All techniques, except active decoy, impair APP-level continuous precise location (CPL) and frequent precise location (FPL) queries. Other problems:
1. Anonymizing problems: PseudoID-to-pseudoID tracking, map deanonymization
2. MAC layer cloaking/decoy problems: too slow for safety beacon, exposes duplicate beacons, complicates authentication/CRL/congestion