Coupled-Worlds Privacy: Exploiting Adversarial Uncertainty in Statistical Data Privacy
Speaker: Raef Bassily, Penn State

Starts:
10:00 am on Wednesday, October 9, 2013
Ends:
11:30 am on Wednesday, October 9, 2013
Location:
MCS 137
Abstract: In this talk, I will present a new framework for defining privacy in statistical databases that enables reasoning about, and exploiting, adversarial uncertainty about the data. Roughly, our framework requires indistinguishability between the real world, in which a mechanism is computed over the real dataset, and an ideal world, in which a simulator outputs some function of a “scrubbed” version of the dataset (e.g., one in which an individual user’s data is removed). In each world, the underlying dataset is drawn from the same distribution in some class (specified as part of the definition), which models the adversary’s uncertainty about the dataset. I will argue that our framework provides meaningful guarantees in a broader range of settings than previous efforts to model privacy in the presence of adversarial uncertainty. I will also present several natural, “noiseless” mechanisms that satisfy our definitional framework under realistic assumptions on the distribution of the underlying data. Joint work with Adam Groce, Jonathan Katz, and Adam Smith, appearing in FOCS 2013.
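
For readers who want the shape of the requirement before the talk, the indistinguishability condition described in the abstract can be sketched roughly as follows. The notation (mechanism A, simulator Sim, distribution class \mathcal{D}, scrubbed dataset X_{-i}) and the particular (epsilon, delta)-closeness measure are illustrative here, not taken verbatim from the paper.

% Rough sketch of the coupled-worlds requirement as described in the abstract.
% A is the mechanism, Sim the simulator, \mathcal{D} the class of admissible
% data distributions, and X_{-i} the dataset with user i's data removed.
\[
  \forall\, \mathcal{P} \in \mathcal{D},\ \forall\, i:
  \qquad
  A(X) \;\approx_{\varepsilon,\delta}\; \mathrm{Sim}\!\left(X_{-i}\right),
  \qquad X \sim \mathcal{P},
\]
where $\approx_{\varepsilon,\delta}$ means that, for every set $S$ of possible outputs,
\[
  \Pr\!\left[A(X) \in S\right]
  \;\le\;
  e^{\varepsilon}\,\Pr\!\left[\mathrm{Sim}(X_{-i}) \in S\right] + \delta,
\]
and symmetrically with the roles of the two worlds exchanged. The probabilities are taken over the draw of $X$ from $\mathcal{P}$ and the internal randomness of $A$ and $\mathrm{Sim}$, so the guarantee is required to hold for every distribution in the class, which is how the adversary's uncertainty about the data enters the definition.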