Differentially Private Regression with Gaussian Processes

Michael Smith
Mauricio Álvarez
Max Zwiessele
Neil Lawrence

Proceedings of Machine Learning Research, 84:1195-1203, 2018

Abstract

A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Here we combine the provable privacy guarantees of the differential privacy framework with the flexibility of Gaussian processes (GPs). We propose a method using GPs to provide differentially private (DP) regression. We then improve this method by crafting the DP noise covariance structure to efficiently protect the training data, while minimising the scale of the added noise. We find that this cloaking method achieves the greatest accuracy, while still providing privacy guarantees, and offers practical DP for regression over multi-dimensional inputs. Together these methods provide a starter toolkit for combining differential privacy and GPs.

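As a rough illustration of the cloaking idea summarised in the abstract (a simplified sketch, not the paper's exact algorithm): the GP posterior mean at the test points is a linear map C of the training outputs, so the effect of any single training output on the released predictions is a column c_i of C. Drawing the DP noise from a Gaussian whose covariance is shaped by these columns protects each output while adding far less noise than an isotropic mechanism. The snippet below uses plain NumPy with a hand-rolled RBF kernel, a non-optimised choice of cloaking covariance M = C C^T (the paper optimises a weighted version to shrink the noise further), and the standard Gaussian-mechanism constant; all function and parameter names here are illustrative assumptions.

import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) covariance between two sets of inputs.
    sq_dists = (np.sum(X1**2, axis=1)[:, None]
                + np.sum(X2**2, axis=1)[None, :]
                - 2.0 * X1 @ X2.T)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def dp_gp_mean(X, y, Xtest, eps=1.0, delta=0.01, sens=1.0,
               noise_var=0.1, lengthscale=1.0, variance=1.0, seed=0):
    # Release the GP posterior mean at Xtest with cloaking-style DP noise.
    # `sens` bounds how much any single training output may differ between
    # neighbouring datasets; inputs X are treated as public, only y is protected.
    rng = np.random.default_rng(seed)
    K = rbf_kernel(X, X, lengthscale, variance) + noise_var * np.eye(len(X))
    Ks = rbf_kernel(Xtest, X, lengthscale, variance)
    C = Ks @ np.linalg.inv(K)        # columns c_i: effect of each y_i on the predictions
    mean = C @ y

    # Simplified cloaking covariance M = C C^T (non-optimised choice).
    M = C @ C.T
    Minv = np.linalg.pinv(M)
    Delta = max(c @ Minv @ c for c in C.T)   # worst-case squared Mahalanobis sensitivity

    c_delta = np.sqrt(2.0 * np.log(1.25 / delta))   # standard Gaussian-mechanism constant
    scale = c_delta * sens * np.sqrt(Delta) / eps

    # Sample correlated noise with covariance scale^2 * M via an eigendecomposition
    # (M is typically rank-deficient, so a plain Cholesky factor may not exist).
    evals, evecs = np.linalg.eigh(M)
    root = evecs * np.sqrt(np.clip(evals, 0.0, None))
    return mean + scale * (root @ rng.standard_normal(len(Xtest))), mean

# Toy usage: noisy sine data, DP release of the posterior mean on a grid.
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 5.0, size=(30, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(30)
Xtest = np.linspace(0.0, 5.0, 50)[:, None]
dp_mean, non_private_mean = dp_gp_mean(X, y, Xtest, eps=1.0, delta=0.01, sens=2.0)
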

@InProceedings{smith18a,
  title = 	 {Differentially Private Regression with Gaussian Processes},
  author = 	 {Michael Smith and Mauricio Álvarez and Max Zwiessele and Neil Lawrence},
  booktitle = 	 {Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics},
  pages = 	 {1195--1203},
  year = 	 {2018},
  volume = 	 {84},
  series = 	 {Proceedings of Machine Learning Research},
  publisher = 	 {PMLR},
  url =  	 {http://inverseprobability.com/publications/smith18a.html},
  pdf = 	 {http://proceedings.mlr.press/v84/smith18a/smith18a.pdf},
  abstract = 	 {A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Here we combine the provable privacy guarantees of the differential privacy framework with the flexibility of Gaussian processes (GPs). We propose a method using GPs to provide differentially private (DP) regression. We then improve this method by crafting the DP noise covariance structure to efficiently protect the training data, while minimising the scale of the added noise. We find that this cloaking method achieves the greatest accuracy, while still providing privacy guarantees, and offers practical DP for regression over multi-dimensional inputs. Together these methods provide a starter toolkit for combining differential privacy and GPs.}
}
%T Differentially Private Regression with Gaussian Processes
%A Michael Smith and Mauricio Álvarez and Max Zwiessele and Neil Lawrence
%B Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics
%D 2018
%F smith18a
%I PMLR	
%P 1195--1203
%U http://inverseprobability.com/publications/smith18a.html
%V 84
%X A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Here we combine the provable privacy guarantees of the differential privacy framework with the flexibility of Gaussian processes (GPs). We propose a method using GPs to provide differentially private (DP) regression. We then improve this method by crafting the DP noise covariance structure to efficiently protect the training data, while minimising the scale of the added noise. We find that this cloaking method achieves the greatest accuracy, while still providing privacy guarantees, and offers practical DP for regression over multi-dimensional inputs. Together these methods provide a starter toolkit for combining differential privacy and GPs.
TY  - CPAPER
TI  - Differentially Private Regression with Gaussian Processes
AU  - Michael Smith
AU  - Mauricio Álvarez
AU  - Max Zwiessele
AU  - Neil Lawrence
PY  - 2018
DA  - 2018/03/31
ID  - smith18a
PB  - PMLR	
SP  - 1195
EP  - 1203
L1  - http://proceedings.mlr.press/v84/smith18a/smith18a.pdf
UR  - http://inverseprobability.com/publications/smith18a.html
AB  - A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Here we combine the provable privacy guarantees of the differential privacy framework with the flexibility of Gaussian processes (GPs). We propose a method using GPs to provide differentially private (DP) regression. We then improve this method by crafting the DP noise covariance structure to efficiently protect the training data, while minimising the scale of the added noise. We find that this cloaking method achieves the greatest accuracy, while still providing privacy guarantees, and offers practical DP for regression over multi-dimensional inputs. Together these methods provide a starter toolkit for combining differential privacy and GPs.
ER  -

Smith, M., Álvarez, M., Zwiessele, M. & Lawrence, N. (2018). Differentially Private Regression with Gaussian Processes. Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 84:1195-1203.