Differentially Private Regression with Gaussian Processes

Michael T. Smith, Mauricio Álvarez, Max Zwiessele, Neil D. Lawrence
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:1195-1203, 2018.

Abstract

A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Here we combine the provable privacy guarantees of the differential privacy framework with the flexibility of Gaussian processes (GPs). We propose a method using GPs to provide differentially private (DP) regression. We then improve this method by crafting the DP noise covariance structure to efficiently protect the training data, while minimising the scale of the added noise. We find that this cloaking method achieves the greatest accuracy, while still providing privacy guarantees, and offers practical DP for regression over multi-dimensional inputs. Together these methods provide a starter toolkit for combining differential privacy and GPs.
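The abstract's "cloaking" idea rests on the fact that the GP posterior mean is a linear map of the training outputs, so privacy can be obtained by adding Gaussian noise calibrated to how much any single output can shift the released predictions. The sketch below is an illustrative, simplified variant only (a plain Gaussian mechanism with spherical noise, not the paper's optimised noise covariance); the RBF kernel, the clipping range d, and the function names are assumptions for the example, not the authors' code.

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between row-wise inputs A and B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def dp_gp_posterior_mean(X, y, Xstar, d, epsilon, delta, noise_var=0.1):
    """Release an (epsilon, delta)-DP noisy GP posterior mean at test inputs Xstar.

    d bounds how much a single training output can change (its clipped range).
    """
    K = rbf(X, X) + noise_var * np.eye(len(X))
    C = rbf(Xstar, X) @ np.linalg.inv(K)   # posterior mean is C @ y, linear in y
    mean = C @ y
    # Worst-case L2 change in the prediction vector if one y_i moves by d:
    sensitivity = d * np.max(np.linalg.norm(C, axis=0))
    # Standard Gaussian-mechanism calibration (valid for epsilon < 1).
    sigma = sensitivity * np.sqrt(2 * np.log(1.25 / delta)) / epsilon
    return mean + np.random.normal(0.0, sigma, size=mean.shape)

# Usage: 20 noisy sine observations, clipped to [-1, 1], released with (0.5, 1e-5)-DP.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, (20, 1))
y = np.clip(np.sin(X[:, 0]) + 0.1 * rng.standard_normal(20), -1, 1)
Xstar = np.linspace(0, 10, 50)[:, None]
private_mean = dp_gp_posterior_mean(X, y, Xstar, d=2.0, epsilon=0.5, delta=1e-5)
```

The paper's cloaking method improves on this spherical-noise baseline by shaping the noise covariance to the columns of C, so less noise is spent in directions the predictions cannot move.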

Cite this Paper


BibTeX
@InProceedings{differentially-private-regression-with-gaussian-processes,
  title = {Differentially Private Regression with Gaussian Processes},
  author = {Smith, Michael T. and Álvarez, Mauricio and Zwiessele, Max and Lawrence, Neil D.},
  booktitle = {Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics},
  pages = {1195--1203},
  year = {2018},
  volume = {84},
  series = {Proceedings of Machine Learning Research},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v84/smith18a/smith18a.pdf},
  url = {http://inverseprobability.com/publications/differentially-private-regression-with-gaussian-processes.html},
  abstract = {A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Here we combine the provable privacy guarantees of the differential privacy framework with the flexibility of Gaussian processes (GPs). We propose a method using GPs to provide differentially private (DP) regression. We then improve this method by crafting the DP noise covariance structure to efficiently protect the training data, while minimising the scale of the added noise. We find that this cloaking method achieves the greatest accuracy, while still providing privacy guarantees, and offers practical DP for regression over multi-dimensional inputs. Together these methods provide a starter toolkit for combining differential privacy and GPs.}
}
Endnote
%0 Conference Paper
%T Differentially Private Regression with Gaussian Processes
%A Michael T. Smith
%A Mauricio Álvarez
%A Max Zwiessele
%A Neil D. Lawrence
%B Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2018
%F differentially-private-regression-with-gaussian-processes
%I PMLR
%P 1195--1203
%U http://inverseprobability.com/publications/differentially-private-regression-with-gaussian-processes.html
%V 84
%X A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Here we combine the provable privacy guarantees of the differential privacy framework with the flexibility of Gaussian processes (GPs). We propose a method using GPs to provide differentially private (DP) regression. We then improve this method by crafting the DP noise covariance structure to efficiently protect the training data, while minimising the scale of the added noise. We find that this cloaking method achieves the greatest accuracy, while still providing privacy guarantees, and offers practical DP for regression over multi-dimensional inputs. Together these methods provide a starter toolkit for combining differential privacy and GPs.
RIS
TY - CPAPER
TI - Differentially Private Regression with Gaussian Processes
AU - Michael T. Smith
AU - Mauricio Álvarez
AU - Max Zwiessele
AU - Neil D. Lawrence
BT - Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics
DA - 2018/03/31
ID - differentially-private-regression-with-gaussian-processes
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 84
SP - 1195
EP - 1203
L1 - http://proceedings.mlr.press/v84/smith18a/smith18a.pdf
UR - http://inverseprobability.com/publications/differentially-private-regression-with-gaussian-processes.html
AB - A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Here we combine the provable privacy guarantees of the differential privacy framework with the flexibility of Gaussian processes (GPs). We propose a method using GPs to provide differentially private (DP) regression. We then improve this method by crafting the DP noise covariance structure to efficiently protect the training data, while minimising the scale of the added noise. We find that this cloaking method achieves the greatest accuracy, while still providing privacy guarantees, and offers practical DP for regression over multi-dimensional inputs. Together these methods provide a starter toolkit for combining differential privacy and GPs.
ER -
APA
Smith, M.T., Álvarez, M., Zwiessele, M. & Lawrence, N.D. (2018). Differentially Private Regression with Gaussian Processes. Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 84:1195-1203. Available from http://inverseprobability.com/publications/differentially-private-regression-with-gaussian-processes.html.