Efficient Inference in Matrix-Variate Gaussian Models with i.i.d. Observation Noise
Neural Information Processing Systems, pp. 630-638, 2011.
Abstract
Inference in matrix-variate Gaussian models has major applications for multi-output prediction and joint learning of row and column covariances from matrix-variate data. Here, we discuss an approach for efficient inference in such models that explicitly account for i.i.d. observation noise. Computational tractability can be retained by exploiting the Kronecker product between row and column covariance matrices. Using this framework, we show how to generalize the Graphical Lasso in order to learn a sparse inverse covariance between features while accounting for a low-rank confounding covariance between samples. We demonstrate practical utility in applications to biology, where we model covariances with more than 100,000 dimensions. We find greater accuracy in recovering biological network structures and are able to better reconstruct the confounders.
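
As a rough illustration of the Kronecker-product trick referred to in the abstract, the sketch below uses the standard identity (A ⊗ B) vec(X) = vec(B X Aᵀ) to apply a Kronecker-structured covariance without ever materializing the full n·p × n·p matrix. This is not the paper's code; the variable names, sizes, and random covariance factors are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 40                      # rows (samples) and columns (features)
A = rng.standard_normal((p, p))    # column (feature) covariance, made PSD below
A = A @ A.T
B = rng.standard_normal((n, n))    # row (sample) covariance, made PSD below
B = B @ B.T
X = rng.standard_normal((n, p))    # matrix-variate observation

# Naive: form the full Kronecker product explicitly, O(n^2 p^2) memory.
naive = np.kron(A, B) @ X.reshape(-1, order="F")

# Efficient: two small matrix products, using (A kron B) vec(X) = vec(B X A^T).
efficient = (B @ X @ A.T).reshape(-1, order="F")

assert np.allclose(naive, efficient)

The efficient path costs O(np(n + p)) time and O(np) memory, versus O(n^2 p^2) for the explicit Kronecker product, which is what makes covariances with more than 100,000 dimensions tractable in principle.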