Differentially Private Regression with Gaussian Processes

Michael T. Smith, Mauricio Álvarez, Max Zwiessele, Neil D. Lawrence
In Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:1195-1203, 2018.

Abstract

A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Here we combine the provable privacy guarantees of the differential privacy framework with the flexibility of Gaussian processes (GPs). We propose a method using GPs to provide differentially private (DP) regression. We then improve this method by crafting the DP noise covariance structure to efficiently protect the training data, while minimising the scale of the added noise. We find that this cloaking method achieves the greatest accuracy, while still providing privacy guarantees, and offers practical DP for regression over multi-dimensional inputs. Together these methods provide a starter toolkit for combining differential privacy and GPs.
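The abstract describes the approach only at a high level. The sketch below illustrates one way the basic idea can be realised in code: compute the ordinary GP posterior mean, then add Gaussian-mechanism noise whose covariance is shaped by the columns of the "cloaking" matrix C = K_*(K + σ²I)⁻¹, since column c_i captures how much a single training output can move the predictions. This is a minimal illustration, not the paper's optimised construction: the function names (`rbf_kernel`, `dp_gp_predict`) are made up for this example, and the unweighted choice of noise covariance M = CCᵀ is a simplification of the weighted covariance the paper optimises.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between the rows of A and the rows of B."""
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def dp_gp_predict(X, y, Xstar, sens_y, epsilon, delta, noise_var=0.1, rng=None):
    """Differentially private GP posterior mean at Xstar (illustrative sketch).

    Noise is drawn from a Gaussian whose covariance is built from the columns of
    C = K_* (K + noise_var I)^{-1}; sens_y bounds how much any single training
    output can change.
    """
    rng = np.random.default_rng() if rng is None else rng
    K = rbf_kernel(X, X) + noise_var * np.eye(len(X))
    C = rbf_kernel(Xstar, X) @ np.linalg.inv(K)
    mean = C @ y
    # Simplification: unweighted sum of outer products of the columns of C
    # (i.e. C C^T); the paper instead optimises per-column weights to reduce
    # the total noise while still covering every column.
    M = C @ C.T + 1e-10 * np.eye(len(Xstar))  # small jitter for numerical stability
    scale = sens_y * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon  # Gaussian mechanism
    noise = scale * rng.multivariate_normal(np.zeros(len(Xstar)), M)
    return mean + noise

# Toy usage: private predictions for noisy sin() observations.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 5.0, size=(30, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(30)
Xstar = np.linspace(0.0, 5.0, 50)[:, None]
y_dp = dp_gp_predict(X, y, Xstar, sens_y=2.0, epsilon=1.0, delta=0.01, rng=rng)
```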

Cite this Paper


BibTeX
@InProceedings{pmlr-v84-smith18a,
  title = {Differentially Private Regression with Gaussian Processes},
  author = {Michael T. Smith and Mauricio Álvarez and Max Zwiessele and Neil D. Lawrence},
  booktitle = {Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics},
  pages = {1195--1203},
  year = {2018},
  volume = {84},
  series = {Proceedings of Machine Learning Research},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v84/smith18a/smith18a.pdf},
  url = {http://inverseprobability.com/publications/smith18a.html},
  abstract = {A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Here we combine the provable privacy guarantees of the differential privacy framework with the flexibility of Gaussian processes (GPs). We propose a method using GPs to provide differentially private (DP) regression. We then improve this method by crafting the DP noise covariance structure to efficiently protect the training data, while minimising the scale of the added noise. We find that this cloaking method achieves the greatest accuracy, while still providing privacy guarantees, and offers practical DP for regression over multi-dimensional inputs. Together these methods provide a starter toolkit for combining differential privacy and GPs.}
}
Endnote
%0 Conference Paper
%T Differentially Private Regression with Gaussian Processes
%A Michael T. Smith
%A Mauricio Álvarez
%A Max Zwiessele
%A Neil D. Lawrence
%B Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2018
%F pmlr-v84-smith18a
%I PMLR
%J Proceedings of Machine Learning Research
%P 1195--1203
%U http://inverseprobability.com/publications/smith18a.html
%V 84
%W PMLR
%X A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Here we combine the provable privacy guarantees of the differential privacy framework with the flexibility of Gaussian processes (GPs). We propose a method using GPs to provide differentially private (DP) regression. We then improve this method by crafting the DP noise covariance structure to efficiently protect the training data, while minimising the scale of the added noise. We find that this cloaking method achieves the greatest accuracy, while still providing privacy guarantees, and offers practical DP for regression over multi-dimensional inputs. Together these methods provide a starter toolkit for combining differential privacy and GPs.
RIS
TY - CPAPER
TI - Differentially Private Regression with Gaussian Processes
AU - Michael T. Smith
AU - Mauricio Álvarez
AU - Max Zwiessele
AU - Neil D. Lawrence
BT - Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics
PY - 2018
ID - pmlr-v84-smith18a
PB - PMLR
VL - 84
SP - 1195
EP - 1203
DP - PMLR
L1 - http://proceedings.mlr.press/v84/smith18a/smith18a.pdf
UR - http://inverseprobability.com/publications/smith18a.html
AB - A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Here we combine the provable privacy guarantees of the differential privacy framework with the flexibility of Gaussian processes (GPs). We propose a method using GPs to provide differentially private (DP) regression. We then improve this method by crafting the DP noise covariance structure to efficiently protect the training data, while minimising the scale of the added noise. We find that this cloaking method achieves the greatest accuracy, while still providing privacy guarantees, and offers practical DP for regression over multi-dimensional inputs. Together these methods provide a starter toolkit for combining differential privacy and GPs.
ER -
APA
Smith, M.T., Álvarez, M., Zwiessele, M. & Lawrence, N.D. (2018). Differentially Private Regression with Gaussian Processes. Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, in PMLR 84:1195-1203.

Related Material

Download PDF: http://proceedings.mlr.press/v84/smith18a/smith18a.pdf