TY - GEN
T1 - Conservative or liberal? Personalized differential privacy
AU - Jorgensen, Zach
AU - Yu, Ting
AU - Cormode, Graham
N1 - Publisher Copyright:
© 2015 IEEE.
PY - 2015/05/26
Y1 - 2015/05/26
N2 - Differential privacy is widely accepted as a powerful framework for providing strong, formal privacy guarantees for aggregate data analysis. A limitation of the model is that the same level of privacy protection is afforded to all individuals. However, data subjects commonly have quite different expectations regarding the acceptable level of privacy for their data. Consequently, differential privacy may provide insufficient privacy protection for some users while over-protecting others. We argue that by accepting that not all users require the same level of privacy, a higher level of utility can often be attained by not providing excess privacy to those who do not want it. We propose a new privacy definition called personalized differential privacy (PDP), a generalization of differential privacy in which users specify a personal privacy requirement for their data. We then introduce several novel mechanisms for achieving PDP. Our primary mechanism is a general one that automatically converts any existing differentially private algorithm into one that satisfies PDP. We also present a more direct approach to achieving PDP, inspired by the well-known exponential mechanism. We demonstrate our framework through extensive experiments on real and synthetic data.
AB - Differential privacy is widely accepted as a powerful framework for providing strong, formal privacy guarantees for aggregate data analysis. A limitation of the model is that the same level of privacy protection is afforded to all individuals. However, data subjects commonly have quite different expectations regarding the acceptable level of privacy for their data. Consequently, differential privacy may provide insufficient privacy protection for some users while over-protecting others. We argue that by accepting that not all users require the same level of privacy, a higher level of utility can often be attained by not providing excess privacy to those who do not want it. We propose a new privacy definition called personalized differential privacy (PDP), a generalization of differential privacy in which users specify a personal privacy requirement for their data. We then introduce several novel mechanisms for achieving PDP. Our primary mechanism is a general one that automatically converts any existing differentially private algorithm into one that satisfies PDP. We also present a more direct approach to achieving PDP, inspired by the well-known exponential mechanism. We demonstrate our framework through extensive experiments on real and synthetic data.
UR - http://www.scopus.com/inward/record.url?scp=84940878770&partnerID=8YFLogxK
U2 - 10.1109/ICDE.2015.7113353
DO - 10.1109/ICDE.2015.7113353
M3 - Conference contribution
AN - SCOPUS:84940878770
T3 - Proceedings - International Conference on Data Engineering
SP - 1023
EP - 1034
BT - 2015 IEEE 31st International Conference on Data Engineering, ICDE 2015
PB - IEEE Computer Society
T2 - 2015 IEEE 31st International Conference on Data Engineering, ICDE 2015
Y2 - 13 April 2015 through 17 April 2015
ER -
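
The "primary mechanism" named in the abstract converts any differentially private algorithm into one satisfying PDP by non-uniform sampling. The sketch below is our own illustration, not the authors' code: it assumes the formulation in which record i, carrying a personal budget eps_i and given a global threshold t, is kept with probability (e^{eps_i} - 1)/(e^{t} - 1) when eps_i < t and with certainty otherwise, after which any t-differentially-private mechanism is run on the sample. All names (sample_mechanism, noisy_count, dp_mechanism) are illustrative.

    import math
    import random

    def sample_mechanism(records, eps, t, dp_mechanism):
        # Keep record i with probability (e^{eps_i} - 1) / (e^t - 1)
        # when eps_i < t; records whose personal budget meets the
        # threshold t are kept with certainty.
        sampled = []
        for rec, e in zip(records, eps):
            p = 1.0 if e >= t else (math.exp(e) - 1.0) / (math.exp(t) - 1.0)
            if random.random() < p:
                sampled.append(rec)
        # dp_mechanism must itself satisfy t-differential privacy.
        return dp_mechanism(sampled)

    def noisy_count(sampled, t):
        # A t-DP count: Laplace noise of scale 1/t (count has sensitivity 1),
        # sampled by inverse CDF.
        u = random.uniform(-0.5, 0.5)
        noise = -(1.0 / t) * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
        return len(sampled) + noise

    # Hypothetical usage: five records with mixed privacy preferences.
    records = [1, 2, 3, 4, 5]
    eps = [0.1, 0.5, 0.5, 0.2, 0.5]
    t = 0.5
    result = sample_mechanism(records, eps, t, lambda s: noisy_count(s, t))

Note that the sampling step biases the raw answer downward (strict-privacy records are sometimes dropped), so in practice the downstream analysis would need to account for the sampling rate; the paper's experiments evaluate this trade-off.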