Differentially Private Gaussian Processes

Michael Thomas Smith, Max Zwiessele and Neil D. Lawrence, 2016.

Abstract

A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Differential privacy is a framework which allows algorithms to have provable privacy guarantees. Gaussian processes are a widely used approach for dealing with uncertainty in functions. This paper explores differentially private mechanisms for Gaussian processes. We compare binning and adding noise before regression with adding noise post-regression. For the former we develop a new kernel for use with binned data. For the latter we show that using inducing inputs allows us to reduce the scale of the added perturbation. We find that, for the datasets used, adding noise to a binned dataset has superior accuracy. Together these methods provide a starter toolkit for combining differential privacy and Gaussian processes.
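As a rough, self-contained illustration of the "bin then perturb before regression" route mentioned in the abstract, the sketch below aggregates outputs into fixed bins, privatises each bin mean with the standard Laplace mechanism, and then fits an ordinary GP to the noisy bin values. The bin layout, the sensitivity bound, the RBF kernel and all parameter values are assumptions chosen for the example; this is not the paper's binned-data kernel or its post-regression mechanism.

```python
# Illustrative sketch only: bin the outputs, privatise the bin means with the
# Laplace mechanism, then fit a plain GP to the noisy bins. The sensitivity
# bound, bin layout and kernel choices are assumptions made for this example.
import numpy as np

rng = np.random.default_rng(0)

# Toy sensitive data: inputs in [0, 10], outputs clipped to a known range [-2, 2].
X = rng.uniform(0, 10, 200)
y = np.clip(np.sin(X) + 0.3 * rng.standard_normal(200), -2.0, 2.0)

# --- Step 1: bin the data and release noisy bin means (epsilon-DP) ---
edges = np.linspace(0, 10, 21)               # 20 equal-width bins (assumed layout)
centres = 0.5 * (edges[:-1] + edges[1:])
idx = np.digitize(X, edges) - 1
counts = np.array([max((idx == b).sum(), 1) for b in range(20)])
means = np.array([y[idx == b].mean() if (idx == b).any() else 0.0
                  for b in range(20)])
epsilon = 1.0
sensitivity = 4.0 / counts                   # one record can shift a bin mean by at most range / n_b
noisy_means = means + rng.laplace(scale=sensitivity / epsilon)

# --- Step 2: ordinary GP regression on the privatised bin values ---
def rbf(A, B, lengthscale=1.0, variance=1.0):
    d = A[:, None] - B[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

noise_var = 0.2                               # treats the DP perturbation as extra observation noise
K = rbf(centres, centres) + noise_var * np.eye(20)
Xtest = np.linspace(0, 10, 100)
Ks = rbf(Xtest, centres)
alpha = np.linalg.solve(K, noisy_means)
post_mean = Ks @ alpha                        # posterior mean computed from privatised data only
print(post_mean[:5])
```

Note that the noise added by the mechanism is heteroscedastic (its scale depends on each bin's count), so a more careful treatment would feed the per-bin noise variances into the GP likelihood rather than using a single shared noise term as this sketch does.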

Cite this Paper


BibTeX
@InProceedings{pmlr-v-smith-dpgp16,
  title = {Differentially Private Gaussian Processes},
  author = {Michael Thomas Smith and Max Zwiessele and Neil D. Lawrence},
  year = {2016},
  url = {http://inverseprobability.com/publications/smith-dpgp16.html},
  abstract = {A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Differential privacy is a framework which allows algorithms to have provable privacy guarantees. Gaussian processes are a widely used approach for dealing with uncertainty in functions. This paper explores differentially private mechanisms for Gaussian processes. We compare binning and adding noise before regression with adding noise post-regression. For the former we develop a new kernel for use with binned data. For the latter we show that using inducing inputs allows us to reduce the scale of the added perturbation. We find that, for the datasets used, adding noise to a binned dataset has superior accuracy. Together these methods provide a starter toolkit for combining differential privacy and Gaussian processes.}
}
APA
Smith, M.T., Zwiessele, M. & Lawrence, N.D. (2016). Differentially Private Gaussian Processes.
