Transcript Slide 1

Krishna Rawali Puppala
Privacy-Aware Personalization for Mobile Advertising
Michaela Hardt, Suman Nath

Summary
• Contextual Computing
• Personalized Ad Delivery System
• Framework
• Ad Selection Algorithms
• Distributed Count Protocol
• Experimental Set-up
• Conclusion

Contextual Computing

Personalized Ad Delivery System: Statistics Gathering

Personalized Ad Delivery System: Ad Delivery

Personalized Ad Delivery System: Billing Advertisers

Privacy-Aware Ad Delivery
• Server-only personalization → RePriv system
• Client-only personalization → Privad system
• Can we formalize a common framework for personalized ad delivery that can be instantiated at any desired trade-off point?

– Hybrid Framework

Privacy-Preserving Statistics Gathering
• Personalization information is based on click-through rates (CTRs).

– Problems:
  • Users may or may not be available during the course of statistics gathering
  • Users might decline to participate
– How can we gather statistics in an efficient and privacy-preserving way?
– Answer: a differentially private protocol that works without a trusted third party
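To make the CTR-based personalization signal concrete, here is a minimal sketch of estimating per-(ad, context) CTRs from an impression/click log. The tuple layout and the `estimate_ctrs` helper are hypothetical illustrations; the actual system gathers these statistics privately via the protocol described later.

```python
from collections import defaultdict

def estimate_ctrs(events):
    """Estimate click-through rates per (ad, context) pair.

    `events` is an iterable of (ad_id, context, clicked) tuples, where
    `clicked` is True if the impression led to a click. Field names are
    illustrative; the real system aggregates these counts privately.
    """
    impressions = defaultdict(int)
    clicks = defaultdict(int)
    for ad_id, context, clicked in events:
        key = (ad_id, context)
        impressions[key] += 1
        if clicked:
            clicks[key] += 1
    # CTR = clicks / impressions for each (ad, context) observed.
    return {key: clicks[key] / impressions[key] for key in impressions}

# Example: a tiny log of three impressions.
log = [("ad1", "coffee", True), ("ad1", "coffee", False), ("ad2", "coffee", True)]
print(estimate_ctrs(log))  # {('ad1', 'coffee'): 0.5, ('ad2', 'coffee'): 1.0}
```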

Framework
• Users, who are served ads
• Advertisers, who pay for clicks on their ads
• Ad service provider, which decides which ads to display

Desiderata
• Goals for ad delivery
   Privacy
   Efficiency
   Revenue & relevance: expected revenue should be maximized
• Goals for statistics gathering
   Privacy in the absence of a trusted server
   Scalability
   Robustness
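One natural way to formalize the revenue goal, assuming each candidate ad a has a context-dependent click-through rate CTR(a, c) and a payment per click ppc(a), is the following (a hedged reading of the slide, not a quote from the paper):

```latex
\mathbb{E}[\text{revenue}] \;=\; \sum_{a \in A_{\text{shown}}} \mathrm{CTR}(a, c) \cdot \mathrm{ppc}(a)
```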

Privacy-Aware Ad Delivery
• The P-E-R trade-offs
  – One has to find a reasonable trade-off among the three design goals (privacy, efficiency, revenue)
• Optimizing ad delivery
  – Client-side computation
  – Server-side computation
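As a rough illustration of how the optimization can be split between client and server, here is a minimal sketch, assuming a hypothetical `Ad` record with a per-context CTR table: the server pre-filters using only a coarse, disclosed context, and the client re-ranks locally using the full private context.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Ad:
    # Hypothetical ad record: a per-context CTR table and a payment per click.
    name: str
    ctr_by_context: Dict[str, float]
    ppc: float

    def ctr(self, context: str) -> float:
        return self.ctr_by_context.get(context, 0.0)

def server_candidates(ads: List[Ad], coarse_context: str, k: int) -> List[Ad]:
    """Server side: rank by expected revenue under the coarse, disclosed
    context and send the top-k candidates to the client."""
    return sorted(ads, key=lambda a: a.ctr(coarse_context) * a.ppc, reverse=True)[:k]

def client_select(candidates: List[Ad], full_context: str, slots: int) -> List[Ad]:
    """Client side: re-rank with the full private context, which never
    leaves the device, and keep only the ads that fit the display slots."""
    return sorted(candidates, key=lambda a: a.ctr(full_context) * a.ppc, reverse=True)[:slots]
```

In this sketch the candidate-set size k is the trade-off knob: a larger k costs more communication but lets the client find more relevant ads, while a coarser disclosed context improves privacy at some cost in relevance.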

Ad Selection Algorithms
• The client and server can efficiently compute their parts of the optimization jointly to choose the best set of ads that achieves a desired trade-off.

Approximation Algorithm
• Greedy algorithm
  – Starts with an empty set and, in each round, adds the ad that most increases the expected revenue.

• Related to the maximum coverage problem; the greedy algorithm provides an approximation guarantee.

Greedy Algorithm
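A minimal sketch of the greedy selection described above, assuming a generic monotone objective `expected_revenue` over sets of ads (the function name and the budget parameter are illustrative):

```python
def greedy_select(ads, expected_revenue, budget):
    """Greedy approximation: start from the empty set and, in each round,
    add the ad whose inclusion most increases the expected revenue, until
    `budget` ads have been chosen or no ad improves the objective.

    `expected_revenue(selected)` is a placeholder for the paper's objective;
    any monotone set function can be plugged in here.
    """
    selected = []
    remaining = list(ads)
    while remaining and len(selected) < budget:
        best_ad, best_gain = None, 0.0
        for ad in remaining:
            gain = expected_revenue(selected + [ad]) - expected_revenue(selected)
            if gain > best_gain:
                best_ad, best_gain = ad, gain
        if best_ad is None:  # no remaining ad adds positive expected revenue
            break
        selected.append(best_ad)
        remaining.remove(best_ad)
    return selected
```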

Private Statistics Gathering
• How can statistics be gathered in a privacy-preserving way?

– Uses a server and a proxy
  • Server: key distribution
  • Proxy: aggregation and anonymization
  – Example: VeriSign as the proxy
• Assumptions
  – Honest-but-curious servers
  – An honest fraction of users
• Works with ε-differential privacy
  – Noise is generated in the absence of a trusted third party; the probabilistic relaxation (ε,δ)-differential privacy is adopted.
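To illustrate how noise can be generated without a trusted third party, here is a sketch of one standard distributed-noise construction (not necessarily the exact protocol from the talk, which also handles malicious users and uses the (ε,δ) relaxation): each user adds a small random share to its private bit so that the shares sum to Laplace noise at the aggregator.

```python
import random

def user_contribution(bit, num_users, epsilon, sensitivity=1.0):
    """One user's noisy share for a distributed count.

    Each user adds the difference of two Gamma(1/N, b) variables to its
    private bit; summed over N users, these shares add up to Laplace(b)
    noise with b = sensitivity / epsilon, so the aggregate count is
    differentially private even though no single party generates the
    full noise.
    """
    b = sensitivity / epsilon
    noise_share = random.gammavariate(1.0 / num_users, b) - random.gammavariate(1.0 / num_users, b)
    return bit + noise_share

def aggregate(shares):
    """Aggregator (proxy) simply sums the noisy shares."""
    return sum(shares)

# Example: 1000 users, 300 of whom have the bit set.
N = 1000
bits = [1] * 300 + [0] * 700
noisy_count = aggregate(user_contribution(b, N, epsilon=0.5) for b in bits)
print(round(noisy_count))  # close to 300, perturbed by Laplace noise of scale 2
```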

Privacy-Preserving Distributed Count
• Counting protocol
• Privacy-preserving estimates
• Top-down computation
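A hedged sketch of the top-down computation over a context hierarchy: the (noisy) count of a node is queried first, and children are expanded only when the parent's count clears a threshold, so effort is not wasted on rare contexts. The dictionary-based hierarchy and the `noisy_count` callback are illustrative stand-ins for the actual counting protocol.

```python
def topdown_counts(node, noisy_count, threshold, results=None):
    """Top-down traversal of a context hierarchy: query the privacy-
    preserving count for a node and recurse into its children only if the
    noisy count is at least `threshold`. `node` is a dict of the form
    {"name": ..., "children": [...]}, and `noisy_count(name)` is assumed
    to run the distributed counting protocol for that context.
    """
    if results is None:
        results = {}
    count = noisy_count(node["name"])
    results[node["name"]] = count
    if count >= threshold:
        for child in node.get("children", []):
            topdown_counts(child, noisy_count, threshold, results)
    return results

# Example hierarchy: country -> city (names purely illustrative).
hierarchy = {"name": "USA", "children": [{"name": "Seattle"}, {"name": "Redmond"}]}
fake_counts = {"USA": 120.0, "Seattle": 80.0, "Redmond": 35.0}
print(topdown_counts(hierarchy, fake_counts.get, threshold=50))
```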

Experimental Setup
• Dataset: a trace of location-aware searches with the schema {user-ID, query, user-location, business-ID}
• Context: evaluation over three kinds of context
  – Location: the user's location
  – Interest: the multiset of IDs the user clicked on before
  – Query: the search query the user sends
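For concreteness, one record of the trace with the schema above could be represented like this (purely an illustration of the field layout, not the actual trace format):

```python
from typing import NamedTuple

class SearchRecord(NamedTuple):
    # One location-aware search event: {user-ID, query, user-location, business-ID}.
    user_id: str
    query: str
    user_location: str
    business_id: str

record = SearchRecord("u42", "coffee shop", "47.61,-122.33", "b1017")
```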

Experimental Setup
• Attribute generalization
  – Location
  – Interest
  – Query
• Context hierarchy
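A sketch of what attribute generalization might look like: each context attribute can be disclosed at progressively coarser levels, and the chosen level controls the privacy/relevance trade-off. The specific levels listed here are illustrative assumptions, not the hierarchies used in the paper.

```python
# Illustrative generalization levels per attribute. Level 0 is the exact
# value; higher levels are progressively coarser, ending in "*" (nothing
# disclosed).
GENERALIZATION_LEVELS = {
    "location": ["exact coordinates", "zip code", "city", "state", "*"],
    "interest": ["business ID", "business category", "top-level category", "*"],
    "query":    ["exact query", "query category", "*"],
}

def generalize(attribute, level):
    """Return the generalization level the client would disclose for
    `attribute`, clamping to the coarsest level '*'."""
    levels = GENERALIZATION_LEVELS[attribute]
    return levels[min(level, len(levels) - 1)]

print(generalize("location", 2))  # 'city'
```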

Evaluating Trade-Offs
• Effect of CTR Threshold
• Effect of Communication Complexity

Evaluating Trade-Offs (Cont.)
• Effect of Information Disclosure

Conclusion
• Addressed the personalized ad delivery problem without compromising user privacy
• Proposed a differentially private protocol
  – Computes statistics even in the presence of malicious users
• Achieved the desired P-E-R (privacy, efficiency, revenue) trade-off