Efficient Inference in Matrix-Variate Gaussian Models with i.i.d. Observation Noise

Oliver Stegle, Christoph Lippert, Joris Mooij, Neil D. Lawrence, Karsten Borgwardt
Neural Information Processing Systems, pp. 630-638, 2011.

Abstract

Inference in matrix-variate Gaussian models has major applications for multi-output prediction and joint learning of row and column covariances from matrix-variate data. Here, we discuss an approach for efficient inference in such models that explicitly accounts for i.i.d. observation noise. Computational tractability can be retained by exploiting the Kronecker product between row and column covariance matrices. Using this framework, we show how to generalize the Graphical Lasso in order to learn a sparse inverse covariance between features while accounting for a low-rank confounding covariance between samples. We show practical utility on applications to biology, where we model covariances with more than 100,000 dimensions. We find greater accuracy in recovering biological network structures and are able to better reconstruct the confounders.
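
The Kronecker trick mentioned in the abstract can be illustrated concretely: for a Gaussian with covariance C ⊗ R + σ²I, eigendecomposing the row covariance R and column covariance C separately yields the full eigensystem of the n·d-dimensional covariance, so the log-likelihood never requires forming the Kronecker product. The sketch below is a minimal NumPy reconstruction of this identity under these assumptions, not the authors' implementation; the function name and interface are illustrative.

```python
import numpy as np

def kron_gaussian_loglik(Y, R, C, sigma2):
    """Log-density of vec(Y) ~ N(0, C kron R + sigma2 * I).

    Y      : (n, d) data matrix
    R      : (n, n) row (sample) covariance
    C      : (d, d) column (feature) covariance
    sigma2 : variance of the i.i.d. observation noise
    """
    n, d = Y.shape
    # Eigendecompose the two small factor covariances: O(n^3 + d^3).
    lam_r, U_r = np.linalg.eigh(R)
    lam_c, U_c = np.linalg.eigh(C)
    # Eigenvalues of C kron R + sigma2*I are all products lam_c[i]*lam_r[j],
    # shifted by sigma2; collect them on an (n, d) grid.
    S = np.outer(lam_r, lam_c) + sigma2
    # Rotate the data into the joint eigenbasis (U_c kron U_r).
    Y_rot = U_r.T @ Y @ U_c
    # log|K| and the quadratic form y^T K^{-1} y from the eigenvalue grid.
    logdet = np.sum(np.log(S))
    quad = np.sum(Y_rot**2 / S)
    return -0.5 * (n * d * np.log(2.0 * np.pi) + logdet + quad)
```

For small n and d the result can be checked against the dense computation on np.kron(C, R) + sigma2 * np.eye(n * d); the factored route reduces the dominant cost from O(n³d³) to O(n³ + d³) plus O(nd(n + d)) for the rotation, which is what makes covariances with more than 100,000 dimensions feasible.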

Cite this Paper


BibTeX
@InProceedings{Stegle-sparse11,
  title     = {Efficient Inference in Matrix-Variate {G}aussian Models with i.i.d. Observation Noise},
  author    = {Stegle, Oliver and Lippert, Christoph and Mooij, Joris and Lawrence, Neil D. and Borgwardt, Karsten},
  booktitle = {Neural Information Processing Systems},
  pages     = {630--638},
  year      = {2011},
  pdf       = {https://proceedings.neurips.cc/paper/2011/file/a732804c8566fc8f498947ea59a841f8-Paper.pdf},
  url       = {http://inverseprobability.com/publications/stegle-sparse11.html},
  abstract  = {Inference in matrix-variate Gaussian models has major applications for multi-output prediction and joint learning of row and column covariances from matrix-variate data. Here, we discuss an approach for efficient inference in such models that explicitly accounts for i.i.d. observation noise. Computational tractability can be retained by exploiting the Kronecker product between row and column covariance matrices. Using this framework, we show how to generalize the Graphical Lasso in order to learn a sparse inverse covariance between features while accounting for a low-rank confounding covariance between samples. We show practical utility on applications to biology, where we model covariances with more than 100,000 dimensions. We find greater accuracy in recovering biological network structures and are able to better reconstruct the confounders.}
}
Endnote
%0 Conference Paper
%T Efficient Inference in Matrix-Variate Gaussian Models with i.i.d. Observation Noise
%A Oliver Stegle
%A Christoph Lippert
%A Joris Mooij
%A Neil D. Lawrence
%A Karsten Borgwardt
%B Neural Information Processing Systems
%D 2011
%F Stegle-sparse11
%P 630--638
%U http://inverseprobability.com/publications/stegle-sparse11.html
%X Inference in matrix-variate Gaussian models has major applications for multi-output prediction and joint learning of row and column covariances from matrix-variate data. Here, we discuss an approach for efficient inference in such models that explicitly accounts for i.i.d. observation noise. Computational tractability can be retained by exploiting the Kronecker product between row and column covariance matrices. Using this framework, we show how to generalize the Graphical Lasso in order to learn a sparse inverse covariance between features while accounting for a low-rank confounding covariance between samples. We show practical utility on applications to biology, where we model covariances with more than 100,000 dimensions. We find greater accuracy in recovering biological network structures and are able to better reconstruct the confounders.
RIS
TY  - CPAPER
TI  - Efficient Inference in Matrix-Variate Gaussian Models with i.i.d. Observation Noise
AU  - Oliver Stegle
AU  - Christoph Lippert
AU  - Joris Mooij
AU  - Neil D. Lawrence
AU  - Karsten Borgwardt
BT  - Neural Information Processing Systems
DA  - 2011/12/12
ID  - Stegle-sparse11
SP  - 630
EP  - 638
L1  - https://proceedings.neurips.cc/paper/2011/file/a732804c8566fc8f498947ea59a841f8-Paper.pdf
UR  - http://inverseprobability.com/publications/stegle-sparse11.html
AB  - Inference in matrix-variate Gaussian models has major applications for multi-output prediction and joint learning of row and column covariances from matrix-variate data. Here, we discuss an approach for efficient inference in such models that explicitly accounts for i.i.d. observation noise. Computational tractability can be retained by exploiting the Kronecker product between row and column covariance matrices. Using this framework, we show how to generalize the Graphical Lasso in order to learn a sparse inverse covariance between features while accounting for a low-rank confounding covariance between samples. We show practical utility on applications to biology, where we model covariances with more than 100,000 dimensions. We find greater accuracy in recovering biological network structures and are able to better reconstruct the confounders.
ER  -
APA
Stegle, O., Lippert, C., Mooij, J., Lawrence, N.D. & Borgwardt, K. (2011). Efficient Inference in Matrix-Variate Gaussian Models with i.i.d. Observation Noise. Neural Information Processing Systems, 630-638. Available from http://inverseprobability.com/publications/stegle-sparse11.html.
