Information

An Individual Differential Privacy Framework for Rigorous and High-Utility Privacy Accounting in Web Measurement

Meeting

Event details

Date:
Pacific Daylight Time
Status:
Confirmed
Location:
4 Concourse Level - Huntington
Participants:
Panos Astithas, Matthew Atkinson, Christian Berkhoff, Aykut Bulut, Benjamin Case, Yug Desai, Alessandro Distaso, Roxana Geambasu, Fabian Höring, Richa Jain, Osei Manu Kagyah, Michal Kalisz, Joey Knightbrook, Mirja Kühlewind, FATMA MOALLA, Vinod Panicker, Thomas Prieur, Alexandra Reimers, Vikas Sahu, Phillipp Schoppmann, Wendy Seltzer, Joshua Ssengonzi, Zacharias Törnblom
Big meeting:
TPAC 2024 (Calendar)

@bmcase and I, along with several differential privacy researchers, have developed a compelling privacy framework in which each device tracks and controls the privacy loss incurred by the user’s participation in various measurements, such as advertising, engagement, or mobility analytics. Currently, these measurements require collecting sensitive user activity traces (e.g., visited sites, purchases), which raises privacy concerns. Our framework proposes a privacy-preserving alternative: the device tracks activity locally and generates encrypted reports, which can be aggregated by a trusted execution environment (TEE) or a secure multi-party computation system.
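
As a toy illustration of the report pipeline (not the framework’s actual protocol), the sketch below uses additive secret sharing between two helper parties as a stand-in for TEE/MPC aggregation; the modulus and function names are our own placeholders.

```python
import secrets

MODULUS = 2**32  # toy modulus for additive secret sharing (placeholder)


def make_report(value: int) -> tuple[int, int]:
    """Device side: split one measurement value into two additive shares.

    Each share alone reveals nothing about the value; only the sum of
    both helpers' aggregates recovers the total across devices.
    """
    share_a = secrets.randbelow(MODULUS)
    share_b = (value - share_a) % MODULUS
    return share_a, share_b


def aggregate(shares_a: list[int], shares_b: list[int]) -> int:
    """Helper side: combine each party's running sum to get the aggregate."""
    return (sum(shares_a) + sum(shares_b)) % MODULUS


# Example: three devices each report a conversion value of 0 or 1.
reports = [make_report(v) for v in (1, 0, 1)]
print(aggregate([a for a, _ in reports], [b for _, b in reports]))  # -> 2
```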

We formalize our framework using individual differential privacy, which allows each device to account for and constrain its own user’s privacy loss toward each measurement party. This approach offers significant privacy-utility benefits over traditional models and improves transparency by letting users monitor their privacy loss on each device. However, it also introduces potential bias in measurement results, which we are working to address and for which we seek the community’s input.
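
To make the individual accounting concrete, here is a minimal, hypothetical sketch of a per-device, per-querier budget filter; it is not the framework’s actual algorithm, and the epsilon cap and querier names are placeholders. It also shows one source of the bias mentioned above: once a querier’s budget on a device is exhausted, that device contributes a null report instead of its true value.

```python
from dataclasses import dataclass, field


@dataclass
class DeviceBudget:
    """Toy per-device accountant: one epsilon budget per measurement party."""

    epsilon_cap: float = 1.0
    spent: dict[str, float] = field(default_factory=dict)

    def contribute(self, querier: str, true_value: float, epsilon: float) -> float:
        """Return the device's contribution to a query costing `epsilon`.

        If the query fits within the querier's remaining budget, charge it
        and return the true value; otherwise return a null (zero) report,
        which caps this user's privacy loss toward that querier but biases
        the aggregate downward.
        """
        remaining = self.epsilon_cap - self.spent.get(querier, 0.0)
        if epsilon > remaining:
            return 0.0
        self.spent[querier] = self.spent.get(querier, 0.0) + epsilon
        return true_value


device = DeviceBudget(epsilon_cap=1.0)
print(device.contribute("ads.example", true_value=1.0, epsilon=0.4))  # 1.0
print(device.contribute("ads.example", true_value=1.0, epsilon=0.7))  # 0.0 (budget exhausted)
```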

At the breakout, we thus plan to:

  1. Present our privacy framework, which we developed initially for advertising measurement use cases.
  2. Seek community feedback on applying the framework to other domains, as we believe it is much more general.
  3. Discuss strategies to mitigate bias introduced by individual privacy tracking.

An academic paper describing our privacy framework can be found here.

Agenda

Chairs:
Roxana Geambasu, Benjamin Case

Goal(s):
To present our individual differential privacy framework for web measurements, gather community feedback on extending its application beyond advertising, and explore strategies for addressing challenges like bias in measurement results.

Agenda:
Outline:

  • Background on ad measurements and emerging APIs
  • Our privacy framework: Cookie Monster
  • Discussion on broader applications and bias mitigation

Materials:
