Cloaking Functions: Differential Privacy with Gaussian Processes

at CD-MAKE Workshop at ARES 2017 on Aug 30, 2017
Neil D. Lawrence, Amazon Research Cambridge and University of Sheffield

Abstract

Processing of personally sensitive information should respect an individual's privacy. One promising framework for this is Differential Privacy (DP). In this talk I'll present work, led by Michael Smith at the University of Sheffield, on the use of cloaking functions to make Gaussian process (GP) predictions differentially private. Gaussian process models are flexible models with particular advantages in handling missing and noisy data. Our hope is that advances in DP for GPs will make it easier to 'learn without looking', i.e. to gain the advantages of prediction from patient data without impinging on patients' privacy. Joint work with Michael T. Smith, Max Zwiessele and Mauricio Alvarez.
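To give a flavour of the general problem, the sketch below illustrates how differentially private release of GP predictions can work in the simplest case: compute a GP posterior mean, then perturb it with noise calibrated via the standard Gaussian mechanism. This is a minimal, hypothetical illustration only, not the cloaking-function construction from the talk; all function names, the sensitivity bound, and the parameter values are my own assumptions for the example.

```python
import numpy as np

# Hypothetical sketch: (epsilon, delta)-DP release of GP posterior-mean
# predictions via the classic Gaussian mechanism. Not the cloaking-function
# method from the talk; sensitivity and hyperparameters are assumed values.

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between the rows of A and B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def gp_posterior_mean(X, y, Xstar, noise=0.1):
    """Standard GP regression posterior mean at test inputs Xstar."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(Xstar, X)
    return Ks @ np.linalg.solve(K, y)

def gaussian_mechanism(values, sensitivity, epsilon, delta, rng):
    """Add Gaussian noise calibrated to (epsilon, delta)-DP.

    Uses the classic bound sigma = sensitivity * sqrt(2 ln(1.25/delta)) / epsilon.
    """
    sigma = sensitivity * np.sqrt(2 * np.log(1.25 / delta)) / epsilon
    return values + rng.normal(0.0, sigma, size=values.shape)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(20, 1))          # training inputs
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(20)  # noisy observations
Xstar = np.linspace(-3, 3, 5)[:, None]        # test inputs

mean = gp_posterior_mean(X, y, Xstar)
# Assumed bound on how much one individual's data can shift the predictions.
private_mean = gaussian_mechanism(mean, sensitivity=0.5,
                                  epsilon=1.0, delta=1e-4, rng=rng)
```

The point of the cloaking-function work is precisely to do better than this naive per-prediction noise: by shaping the noise covariance to the GP's own structure, far less perturbation is needed for the same privacy guarantee.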