Residual Component Analysis

Alfredo A. Kalaitzis, Neil D. Lawrence
Proceedings of the International Conference on Machine Learning 29, Morgan Kaufmann, 2012.

Abstract

Probabilistic principal component analysis (PPCA) seeks a low dimensional representation of a data set in the presence of independent spherical Gaussian noise, $\Sigma = \sigma^2\mathbf{I}$. The maximum likelihood solution for the model is an eigenvalue problem on the sample covariance matrix. In this paper we consider the situation where the data variance is already partially explained by other factors, e.g. conditional dependencies between the covariates, or temporal correlations leaving some residual variance. We decompose the residual variance into its components through a generalised eigenvalue problem, which we call residual component analysis (RCA). We explore a range of new algorithms that arise from the framework, including one that factorises the covariance of a Gaussian density into a low-rank and a sparse-inverse component. We illustrate the ideas on the recovery of a protein-signaling network, a gene expression time-series data set and the recovery of the human skeleton from motion capture 3-D cloud data.
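The core computation described in the abstract, finding residual low-rank structure once a known covariance has been accounted for, reduces to a generalised eigenvalue problem between the sample covariance and the explained covariance. The sketch below illustrates this on synthetic data; it is a minimal illustration, not the paper's implementation, and the variable names (`Sigma`, `W_true`, `S`) are ours. It assumes SciPy's `scipy.linalg.eigh`, which solves the symmetric generalised eigenproblem directly.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
d, n, q = 5, 200, 2  # dimension, samples, residual rank

# Known "already explained" covariance (standing in for, e.g., temporal
# correlations or conditional dependencies); in RCA this is given, not fit.
A = rng.standard_normal((d, d))
Sigma = A @ A.T / d + np.eye(d)

# Simulate data whose covariance is Sigma plus an extra rank-q component.
W_true = rng.standard_normal((d, q))
C = W_true @ W_true.T + Sigma
Y = rng.multivariate_normal(np.zeros(d), C, size=n)
S = Y.T @ Y / n  # sample covariance

# Generalised eigenvalue problem S v = lam * Sigma v; keep the top q pairs.
# Generalised eigenvalues above 1 signal variance beyond what Sigma explains.
lam, V = eigh(S, Sigma, subset_by_index=[d - q, d - 1])
print(lam)  # the q largest generalised eigenvalues, ascending
```

Setting `Sigma = sigma2 * np.eye(d)` recovers the ordinary PPCA eigenvalue problem as a special case, which is the sense in which RCA generalises it.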

Cite this Paper


BibTeX
@InProceedings{Kalaitzis:rca12,
  title = {Residual Component Analysis},
  author = {Kalaitzis, Alfredo A. and Lawrence, Neil D.},
  booktitle = {Proceedings of the International Conference on Machine Learning},
  year = {2012},
  editor = {Langford, John and Pineau, Joelle},
  volume = {29},
  address = {San Francisco, CA},
  publisher = {Morgan Kaufmann},
  pdf = {http://icml.cc/2012/papers/114.pdf},
  url = {http://inverseprobability.com/publications/kalaitzis-rca12.html},
  abstract = {Probabilistic principal component analysis (PPCA) seeks a low dimensional representation of a data set in the presence of independent spherical Gaussian noise, $\Sigma = \sigma^2\mathbf{I}$. The maximum likelihood solution for the model is an eigenvalue problem on the sample covariance matrix. In this paper we consider the situation where the data variance is already partially explained by other factors, e.g. conditional dependencies between the covariates, or temporal correlations leaving some residual variance. We decompose the residual variance into its components through a generalised eigenvalue problem, which we call residual component analysis (RCA). We explore a range of new algorithms that arise from the framework, including one that factorises the covariance of a Gaussian density into a low-rank and a sparse-inverse component. We illustrate the ideas on the recovery of a protein-signaling network, a gene expression time-series data set and the recovery of the human skeleton from motion capture 3-D cloud data.}
}
Endnote
%0 Conference Paper
%T Residual Component Analysis
%A Alfredo A. Kalaitzis
%A Neil D. Lawrence
%B Proceedings of the International Conference on Machine Learning
%D 2012
%E John Langford
%E Joelle Pineau
%F Kalaitzis:rca12
%I Morgan Kaufmann
%U http://inverseprobability.com/publications/kalaitzis-rca12.html
%V 29
%X Probabilistic principal component analysis (PPCA) seeks a low dimensional representation of a data set in the presence of independent spherical Gaussian noise, $\Sigma = \sigma^2\mathbf{I}$. The maximum likelihood solution for the model is an eigenvalue problem on the sample covariance matrix. In this paper we consider the situation where the data variance is already partially explained by other factors, e.g. conditional dependencies between the covariates, or temporal correlations leaving some residual variance. We decompose the residual variance into its components through a generalised eigenvalue problem, which we call residual component analysis (RCA). We explore a range of new algorithms that arise from the framework, including one that factorises the covariance of a Gaussian density into a low-rank and a sparse-inverse component. We illustrate the ideas on the recovery of a protein-signaling network, a gene expression time-series data set and the recovery of the human skeleton from motion capture 3-D cloud data.
RIS
TY - CPAPER
TI - Residual Component Analysis
AU - Alfredo A. Kalaitzis
AU - Neil D. Lawrence
BT - Proceedings of the International Conference on Machine Learning
DA - 2012/01/01
ED - John Langford
ED - Joelle Pineau
ID - Kalaitzis:rca12
PB - Morgan Kaufmann
VL - 29
L1 - http://icml.cc/2012/papers/114.pdf
UR - http://inverseprobability.com/publications/kalaitzis-rca12.html
AB - Probabilistic principal component analysis (PPCA) seeks a low dimensional representation of a data set in the presence of independent spherical Gaussian noise, $\Sigma = \sigma^2\mathbf{I}$. The maximum likelihood solution for the model is an eigenvalue problem on the sample covariance matrix. In this paper we consider the situation where the data variance is already partially explained by other factors, e.g. conditional dependencies between the covariates, or temporal correlations leaving some residual variance. We decompose the residual variance into its components through a generalised eigenvalue problem, which we call residual component analysis (RCA). We explore a range of new algorithms that arise from the framework, including one that factorises the covariance of a Gaussian density into a low-rank and a sparse-inverse component. We illustrate the ideas on the recovery of a protein-signaling network, a gene expression time-series data set and the recovery of the human skeleton from motion capture 3-D cloud data.
ER -
APA
Kalaitzis, A.A. & Lawrence, N.D. (2012). Residual Component Analysis. Proceedings of the International Conference on Machine Learning 29. Available from http://inverseprobability.com/publications/kalaitzis-rca12.html.

Related Material