Learning to Learn with the Informative Vector Machine

Neil D. Lawrence, University of Sheffield
John C. Platt, Microsoft Research

in Proceedings of the International Conference on Machine Learning 21, pp. 512–519

Abstract

This paper describes an efficient method for learning the parameters of a Gaussian process (GP). The parameters are learned from multiple tasks which are assumed to have been drawn independently from the same GP prior. An efficient algorithm is obtained by extending the informative vector machine (IVM) algorithm to handle the multi-task learning case. The multi-task IVM (MT-IVM) saves computation by greedily selecting the most informative examples from the separate tasks. The MT-IVM is also shown to be more efficient than sub-sampling on an artificial data-set and more effective than the traditional IVM in a speaker-dependent phoneme recognition task.
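The core idea of greedy informative-example selection can be illustrated with a simplified sketch. The snippet below is not the paper's MT-IVM (which handles multiple tasks and classification via assumed-density filtering); it is a minimal single-task regression analogue, assuming an RBF kernel, where at each step the point with the largest posterior predictive variance is added to the active set, since for a Gaussian this maximizes the entropy reduction. All function and parameter names here are illustrative, not from the paper.

```python
import numpy as np

def rbf(X, Y, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of points."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def greedy_informative_selection(X, d, noise=0.1, jitter=1e-8):
    """Greedily pick d indices of X by maximum posterior variance.

    For a Gaussian predictive distribution, differential entropy is
    monotone in the variance, so this is entropy-reduction selection.
    """
    n = X.shape[0]
    K = rbf(X, X)
    prior_var = np.diag(K).copy()
    var = prior_var.copy()          # current posterior variances
    active, remaining = [], list(range(n))
    for _ in range(d):
        # pick the remaining point the model is most uncertain about
        i = remaining[int(np.argmax(var[remaining]))]
        active.append(i)
        remaining.remove(i)
        # recompute posterior variances conditioned on the active set
        Ka = K[:, active]
        Kaa = K[np.ix_(active, active)] + (noise + jitter) * np.eye(len(active))
        var = prior_var - np.einsum('ij,jk,ik->i', Ka, np.linalg.inv(Kaa), Ka)
    return active
```

The paper's contribution is to run this style of selection jointly across tasks drawn from a shared GP prior, so that computation is spent only on the examples that are most informative about the shared kernel parameters.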


@InProceedings{lawrence-learning04,
  title = 	 {Learning to Learn with the Informative Vector Machine},
  author = 	 {Neil D. Lawrence and John C. Platt},
  booktitle = 	 {Proceedings of the International Conference on Machine Learning},
  pages = 	 {512--519},
  year = 	 {2004},
  editor = 	 {Russell Greiner and Dale Schuurmans},
  volume = 	 {21},
  publisher = 	 {Omnipress},
  url =  	 {http://inverseprobability.com/publications/lawrence-learning04.html},
  abstract = 	 {This paper describes an efficient method for learning the parameters of a Gaussian process (GP). The parameters are learned from multiple tasks which are assumed to have been drawn independently from the same GP prior. An efficient algorithm is obtained by extending the informative vector machine (IVM) algorithm to handle the multi-task learning case. The multi-task IVM (MT-IVM) saves computation by greedily selecting the most informative examples from the separate tasks. The MT-IVM is also shown to be more efficient than sub-sampling on an artificial data-set and more effective than the traditional IVM in a speaker-dependent phoneme recognition task.},
  crossref =  {Greiner:icml04},
  key = 	 {Lawrence:learning04},
  doi = 	 {10.1145/1015330.1015382},
  linkpdf = 	 {ftp://ftp.dcs.shef.ac.uk/home/neil/mtivm.pdf},
  linkpsgz =  {ftp://ftp.dcs.shef.ac.uk/home/neil/mtivm.ps.gz},
  linksoftware = {http://inverseprobability.com/mtivm/},
  group = 	 {shefml,gp,spgp}
}
%T Learning to Learn with the Informative Vector Machine
%A Neil D. Lawrence and John C. Platt
%B Proceedings of the International Conference on Machine Learning
%D 2004
%E Russell Greiner and Dale Schuurmans
%F lawrence-learning04
%I Omnipress	
%P 512--519
%R 10.1145/1015330.1015382
%U http://inverseprobability.com/publications/lawrence-learning04.html
%V 21
%X This paper describes an efficient method for learning the parameters of a Gaussian process (GP). The parameters are learned from multiple tasks which are assumed to have been drawn independently from the same GP prior. An efficient algorithm is obtained by extending the informative vector machine (IVM) algorithm to handle the multi-task learning case. The multi-task IVM (MT-IVM) saves computation by greedily selecting the most informative examples from the separate tasks. The MT-IVM is also shown to be more efficient than sub-sampling on an artificial data-set and more effective than the traditional IVM in a speaker-dependent phoneme recognition task.
TY  - CPAPER
TI  - Learning to Learn with the Informative Vector Machine
AU  - Neil D. Lawrence
AU  - John C. Platt
BT  - Proceedings of the International Conference on Machine Learning
PY  - 2004/01/01
DA  - 2004/01/01
ED  - Russell Greiner
ED  - Dale Schuurmans	
ID  - lawrence-learning04
PB  - Omnipress	
SP  - 512
EP  - 519
DO  - 10.1145/1015330.1015382
L1  - ftp://ftp.dcs.shef.ac.uk/home/neil/mtivm.pdf
UR  - http://inverseprobability.com/publications/lawrence-learning04.html
AB  - This paper describes an efficient method for learning the parameters of a Gaussian process (GP). The parameters are learned from multiple tasks which are assumed to have been drawn independently from the same GP prior. An efficient algorithm is obtained by extending the informative vector machine (IVM) algorithm to handle the multi-task learning case. The multi-task IVM (MT-IVM) saves computation by greedily selecting the most informative examples from the separate tasks. The MT-IVM is also shown to be more efficient than sub-sampling on an artificial data-set and more effective than the traditional IVM in a speaker-dependent phoneme recognition task.
ER  -

Lawrence, N.D. & Platt, J.C. (2004). Learning to Learn with the Informative Vector Machine. Proceedings of the International Conference on Machine Learning 21:512–519