# Residual Component Analysis

Alfredo A. Kalaitzis, Microsoft
Neil D. Lawrence, University of Sheffield

in *Proceedings of the International Conference on Machine Learning* 29

#### Abstract

Probabilistic principal component analysis (PPCA) seeks a low-dimensional representation of a data set in the presence of independent spherical Gaussian noise, $\Sigma = \sigma^2\mathbf{I}$. The maximum likelihood solution for the model is an eigenvalue problem on the sample covariance matrix. In this paper we consider the situation where the data variance is already partially explained by other factors, e.g. conditional dependencies between the covariates or temporal correlations, leaving some residual variance. We decompose the residual variance into its components through a generalised eigenvalue problem, which we call residual component analysis (RCA). We explore a range of new algorithms that arise from the framework, including one that factorises the covariance of a Gaussian density into a low-rank and a sparse-inverse component. We illustrate the ideas on the recovery of a protein-signaling network, a gene expression time-series data set and the recovery of the human skeleton from motion capture 3-D cloud data.
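As a concrete illustration of the generalised eigenvalue computation at the heart of RCA, below is a minimal sketch in Python. It assumes `numpy`/`scipy`; the helper name `rca` is our own, and the retained-eigenvalue threshold of 1 and the loading formula $\mathbf{W} = \Sigma\mathbf{U}_q(\Lambda_q - \mathbf{I})^{1/2}$ follow by analogy with the PPCA maximum likelihood solution rather than from the authors' released code.

```python
import numpy as np
from scipy.linalg import eigh

def rca(S, Sigma, q):
    """Minimal residual component analysis sketch.

    S     -- (d, d) sample covariance of the data.
    Sigma -- (d, d) covariance already explained by other factors
             (e.g. a temporal kernel, or a sparse-inverse term).
    q     -- number of residual components to retain.
    """
    # Symmetric-definite generalised eigenproblem S u = lam * Sigma u;
    # scipy returns eigenvectors normalised so that U^T Sigma U = I.
    lam, U = eigh(S, Sigma)
    top = np.argsort(lam)[::-1][:q]        # largest generalised eigenvalues
    lam_q, U_q = lam[top], U[:, top]
    # Directions with lam > 1 carry variance beyond what Sigma explains;
    # by the PPCA analogy, W = Sigma U_q (Lambda_q - I)^{1/2}, up to rotation.
    W = Sigma @ U_q * np.sqrt(np.maximum(lam_q - 1.0, 0.0))
    return W, lam_q
```

As a sanity check, setting `Sigma = sigma**2 * np.eye(d)` gives generalised eigenvalues $\ell_i/\sigma^2$ for sample-covariance eigenvalues $\ell_i$, and the loadings reduce to the familiar PPCA solution $\mathbf{U}_q(\mathbf{L}_q - \sigma^2\mathbf{I})^{1/2}$.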

    @InProceedings{kalaitzis-rca12,
      title     = {Residual Component Analysis},
      author    = {Alfredo A. Kalaitzis and Neil D. Lawrence},
      booktitle = {Proceedings of the International Conference on Machine Learning},
      year      = {2012},
      editor    = {John Langford and Joelle Pineau},
      volume    = {29},
      address   = {San Francisco, CA},
      publisher = {Morgan Kaufmann},
      url       = {http://inverseprobability.com/publications/kalaitzis-rca12.html},
      linkpdf   = {http://icml.cc/2012/papers/114.pdf},
      crossref  = {Langford:icml12},
      key       = {Kalaitzis:rca12}
    }
Kalaitzis, A.A. & Lawrence, N.D. (2012). Residual Component Analysis. In *Proceedings of the International Conference on Machine Learning* 29.