Munz, R., Eigner, F., Maffei, M., Francis, P., & Garg, D. (2018). UniTraX: Protecting Data Privacy with Discoverable Biases. In L. Bauer & R. Küsters (Eds.), Principles of Security and Trust (pp. 278–299). Springer, Lecture Notes in Computer Science. https://doi.org/10.1007/978-3-319-89722-6_12
Springer, Lecture Notes in Computer Science, Switzerland
Keywords: Data privacy; UniTraX; Differential privacy
An ongoing challenge with differentially private database systems is maximizing system utility while staying within a given privacy budget. One approach is to maintain per-user budgets instead of a single global budget, and to silently drop users whose budget is depleted. This, however, can lead to highly misleading analyses, because the system cannot provide the analyst with any information about which users have been dropped.
This paper presents UniTraX, the first differentially private system that allows per-user budgets while providing the analyst with information about the budget state. The key insight behind UniTraX is that it tracks budgets not only for the actual records in the system, but for all points in the domain of the database, including points that could exist but do not. UniTraX can safely report the budget state because the analyst cannot tell whether the state refers to actual records or not. We prove that UniTraX is differentially private. UniTraX is compatible with existing differentially private analyses, and our implementation on top of PINQ shows only moderate runtime overheads on a realistic workload.
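The core idea of tracking budget over the entire data domain, rather than per actual record, can be illustrated with a hedged sketch. This is not the authors' implementation (UniTraX is built on PINQ); the class name, interval partitioning, and method signatures below are illustrative assumptions, showing only why reporting remaining budget for a domain region leaks nothing about which records exist.

```python
# Illustrative sketch (hypothetical, not UniTraX's actual code): a budget
# tracker over a 1-D numeric domain. Budget is held by domain regions, so
# a query over a range charges that region whether or not real records
# fall inside it. Reporting the remaining budget for a region therefore
# depends only on the domain and the query history, never on the data.

class DomainBudgetTracker:
    """Track remaining privacy budget over a numeric domain [lo, hi)."""

    def __init__(self, domain_min, domain_max, n_bins, total_budget):
        width = (domain_max - domain_min) / n_bins
        # (interval_lo, interval_hi) -> remaining budget for that slice
        self.budgets = {
            (domain_min + i * width, domain_min + (i + 1) * width): total_budget
            for i in range(n_bins)
        }

    def charge(self, query_lo, query_hi, epsilon):
        """Deduct epsilon from every interval overlapping [query_lo, query_hi)."""
        affected = [k for k in self.budgets
                    if k[0] < query_hi and k[1] > query_lo]
        if any(self.budgets[k] < epsilon for k in affected):
            raise ValueError("insufficient budget in queried range")
        for k in affected:
            self.budgets[k] -= epsilon

    def remaining(self, lo, hi):
        """Safe to report: a function of past queries only, not of records."""
        return min(self.budgets[k] for k in self.budgets
                   if k[0] < hi and k[1] > lo)
```

For example, after charging epsilon 0.3 against the range [0, 50) of a [0, 100) domain with total budget 1.0, the reported remaining budget is 0.7 for that range and 1.0 elsewhere, regardless of whether any user record has a value in [0, 50).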