Differentially Private Gaussian Processes


Michael Thomas Smith, University of Sheffield
Max Zwiessele, University of Sheffield
Neil D. Lawrence, University of Sheffield


Abstract

A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Differential privacy is a framework which allows algorithms to have provable privacy guarantees. Gaussian processes are a widely used approach for dealing with uncertainty in functions. This paper explores differentially private mechanisms for Gaussian processes. We compare binning and adding noise before regression with adding noise post-regression. For the former we develop a new kernel for use with binned data. For the latter we show that using inducing inputs allows us to reduce the scale of the added perturbation. We find that, for the datasets used, adding noise to a binned dataset has superior accuracy. Together these methods provide a starter toolkit for combining differential privacy and Gaussian processes.
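For orientation, the pre-regression route described in the abstract (bin the data, perturb the bin summaries, then regress) can be sketched in a few lines. The sketch below is a minimal illustration under stated assumptions, not the paper's mechanism or its binned-data kernel: outputs are assumed clipped to a known range so each bin mean has bounded sensitivity, bin counts are treated as public, and the RBF kernel hyperparameters are fixed by hand rather than learned.

```python
# Hedged sketch: epsilon-DP input perturbation on binned data, followed by
# ordinary GP regression on the noisy bin means. Illustrative assumptions:
#   - outputs are clipped to [y_min, y_max], so changing one individual moves
#     a bin mean by at most (y_max - y_min) / n_k,
#   - Laplace noise calibrated to that sensitivity gives epsilon-DP per bin,
#   - bin counts are treated as public (a real mechanism must account for them).
import numpy as np

rng = np.random.default_rng(0)

# Toy "private" data: inputs in [0, 100], clipped outputs.
X = rng.uniform(0, 100, size=500)
y = np.clip(np.sin(X / 15.0) + 0.3 * rng.standard_normal(500), -2.0, 2.0)
y_min, y_max = -2.0, 2.0

# 1. Bin the inputs and compute per-bin means.
edges = np.linspace(0, 100, 11)                # 10 equal-width bins
centres = 0.5 * (edges[:-1] + edges[1:])
idx = np.digitize(X, edges[1:-1])
counts = np.array([max((idx == k).sum(), 1) for k in range(10)])
means = np.array([y[idx == k].mean() if (idx == k).any() else 0.0
                  for k in range(10)])

# 2. Laplace mechanism on the bin means, noise scaled to per-bin sensitivity.
epsilon = 1.0
sensitivity = (y_max - y_min) / counts
noisy_means = means + rng.laplace(scale=sensitivity / epsilon)

# 3. Standard GP regression on the (already private) noisy bin means.
def rbf(A, B, lengthscale=20.0, variance=1.0):
    d = A[:, None] - B[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

noise_var = 0.1
K = rbf(centres, centres) + noise_var * np.eye(10)
Xs = np.linspace(0, 100, 200)
Ks = rbf(Xs, centres)
alpha = np.linalg.solve(K, noisy_means)
post_mean = Ks @ alpha                          # DP posterior mean over Xs
```

Note that the Laplace scale shrinks as bins fill up, which gives some intuition for the abstract's observation that, on the datasets considered, perturbing a binned dataset before regression was more accurate than perturbing the GP posterior afterwards.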


@TechReport{smith-dpgp16,
  title = 	 {Differentially Private Gaussian Processes},
  author = 	 {Michael Thomas Smith and Max Zwiessele and Neil D. Lawrence},
  year = 	 {2016},
  institution = 	 {University of Sheffield},
  month = 	 {06},
  url =  	 {http://inverseprobability.com/publications/smith-dpgp16.html},
  abstract = 	 {A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Differential privacy is a framework which allows algorithms to have provable privacy guarantees. Gaussian processes are a widely used approach for dealing with uncertainty in functions. This paper explores differentially private mechanisms for Gaussian processes. We compare binning and adding noise before regression with adding noise post-regression. For the former we develop a new kernel for use with binned data. For the latter we show that using inducing inputs allows us to reduce the scale of the added perturbation. We find that, for the datasets used, adding noise to a binned dataset has superior accuracy. Together these methods provide a starter toolkit for combining differential privacy and Gaussian processes.},
  key = 	 {Smith:dpgp16},
  linkpdf = 	 {https://arxiv.org/abs/1606.00720},
}
%T Differentially Private Gaussian Processes
%A Michael Thomas Smith and Max Zwiessele and Neil D. Lawrence
%D 2016
%F smith-dpgp16
%U http://inverseprobability.com/publications/smith-dpgp16.html
%X A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Differential privacy is a framework which allows algorithms to have provable privacy guarantees. Gaussian processes are a widely used approach for dealing with uncertainty in functions. This paper explores differentially private mechanisms for Gaussian processes. We compare binning and adding noise before regression with adding noise post-regression. For the former we develop a new kernel for use with binned data. For the latter we show that using inducing inputs allows us to reduce the scale of the added perturbation. We find that, for the datasets used, adding noise to a binned dataset has superior accuracy. Together these methods provide a starter toolkit for combining differential privacy and Gaussian processes.
TY  - RPRT
TI  - Differentially Private Gaussian Processes
AU  - Michael Thomas Smith
AU  - Max Zwiessele
AU  - Neil D. Lawrence
PY  - 2016/06/02
DA  - 2016/06/02	
ID  - smith-dpgp16	
L1  - https://arxiv.org/abs/1606.00720
UR  - http://inverseprobability.com/publications/smith-dpgp16.html
AB  - A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Differential privacy is a framework which allows algorithms to have provable privacy guarantees. Gaussian processes are a widely used approach for dealing with uncertainty in functions. This paper explores differentially private mechanisms for Gaussian processes. We compare binning and adding noise before regression with adding noise post-regression. For the former we develop a new kernel for use with binned data. For the latter we show that using inducing inputs allows us to reduce the scale of the added perturbation. We find that, for the datasets used, adding noise to a binned dataset has superior accuracy. Together these methods provide a starter toolkit for combining differential privacy and Gaussian processes.
ER  -

Smith, M.T., Zwiessele, M. & Lawrence, N.D. (2016). Differentially Private Gaussian Processes. arXiv preprint arXiv:1606.00720.