# Residual Component Analysis

Alfredo A. Kalaitzis, Neil D. Lawrence, 2011.

#### Abstract

Probabilistic principal component analysis (PPCA) seeks a low dimensional representation of a data set in the presence of independent spherical Gaussian noise, $\Sigma = \sigma^2\mathbf{I}$. The maximum likelihood solution for the model is an eigenvalue problem on the sample covariance matrix. In this paper we consider the situation where the data variance is already partially explained by other factors, e.g. covariates of interest or temporal correlations, leaving some residual variance. We decompose the residual variance into its components through a generalized eigenvalue problem, which we call residual component analysis (RCA). We show that canonical covariates analysis (CCA) is a special case of our algorithm and explore a range of new algorithms that arise from the framework. We illustrate the ideas on a gene expression time series data set and the recovery of human pose from silhouette.
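The generalized eigenvalue formulation described above can be sketched in a few lines of numpy/scipy. This is a minimal illustration of the idea, not the authors' implementation: the function `rca`, its arguments, the toy data, and the noise scale are all assumptions made for the example. Given the sample covariance $S$ and a covariance $\Sigma$ already explained by other factors (in PPCA, spherical noise $\sigma^2\mathbf{I}$), the residual components are the leading solutions of $S\mathbf{w} = \lambda\,\Sigma\,\mathbf{w}$.

```python
import numpy as np
from scipy.linalg import eigh

def rca(Y, Sigma, k):
    """Sketch of residual component analysis.

    Y     : (n, d) data matrix, rows are observations
    Sigma : (d, d) positive-definite covariance already explained
            by other factors (e.g. sigma^2 * I for spherical noise)
    k     : number of residual components to keep
    """
    S = np.cov(Y, rowvar=False)
    # scipy's eigh solves the generalized symmetric eigenproblem
    # S w = lambda * Sigma w, returning eigenvalues in ascending order
    lam, W = eigh(S, Sigma)
    # keep the k largest generalized eigenvalues and their eigenvectors
    order = np.argsort(lam)[::-1][:k]
    return lam[order], W[:, order]

# Toy example: with Sigma proportional to the identity, the recovered
# directions coincide with ordinary PCA directions (up to scaling).
rng = np.random.default_rng(0)
Y = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))
lam, W = rca(Y, 0.1 * np.eye(5), 2)
```

Choosing $\Sigma$ to encode covariates of interest or temporal correlations, rather than plain noise, is what distinguishes RCA from PCA in this view.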

#### Cite this Paper

BibTeX

```
@InProceedings{pmlr-v-kalaitzis-rca11,
title = {Residual Component Analysis},
author = {Alfredo A. Kalaitzis and Neil D. Lawrence},
year = {2011},
url = {http://inverseprobability.com/publications/kalaitzis-rca11.html},
abstract = {Probabilistic principal component analysis (PPCA) seeks a low dimensional representation of a data set in the presence of independent spherical Gaussian noise, $\Sigma = (\sigma^2)\mathbf{I}$. The maximum likelihood solution for the model is an eigenvalue problem on the sample covariance matrix. In this paper we consider the situation where the data variance is already partially explained by other factors, e.g. covariates of interest, or temporal correlations leaving some residual variance. We decompose the residual variance into its components through a generalized eigenvalue problem, which we call residual component analysis (RCA). We show that canonical covariates analysis (CCA) is a special case of our algorithm and explore a range of new algorithms that arise from the framework. We illustrate the ideas on a gene expression time series data set and the recovery of human pose from silhouette.}
}
```

Endnote

```
%0 Conference Paper
%T Residual Component Analysis
%A Alfredo A. Kalaitzis
%A Neil D. Lawrence
%C Proceedings of Machine Learning Research
%D 2011
%F pmlr-v-kalaitzis-rca11
%I PMLR
%J Proceedings of Machine Learning Research
%U http://inverseprobability.com/publications/kalaitzis-rca11.html
%W PMLR
%X Probabilistic principal component analysis (PPCA) seeks a low dimensional representation of a data set in the presence of independent spherical Gaussian noise, $\Sigma = (\sigma^2)\mathbf{I}$. The maximum likelihood solution for the model is an eigenvalue problem on the sample covariance matrix. In this paper we consider the situation where the data variance is already partially explained by other factors, e.g. covariates of interest, or temporal correlations leaving some residual variance. We decompose the residual variance into its components through a generalized eigenvalue problem, which we call residual component analysis (RCA). We show that canonical covariates analysis (CCA) is a special case of our algorithm and explore a range of new algorithms that arise from the framework. We illustrate the ideas on a gene expression time series data set and the recovery of human pose from silhouette.
```

RIS

```
TY - CPAPER
TI - Residual Component Analysis
AU - Alfredo A. Kalaitzis
AU - Neil D. Lawrence
PY - 2011
ID - pmlr-v-kalaitzis-rca11
PB - PMLR
DP - PMLR
UR - http://inverseprobability.com/publications/kalaitzis-rca11.html
AB - Probabilistic principal component analysis (PPCA) seeks a low dimensional representation of a data set in the presence of independent spherical Gaussian noise, $\Sigma = (\sigma^2)\mathbf{I}$. The maximum likelihood solution for the model is an eigenvalue problem on the sample covariance matrix. In this paper we consider the situation where the data variance is already partially explained by other factors, e.g. covariates of interest, or temporal correlations leaving some residual variance. We decompose the residual variance into its components through a generalized eigenvalue problem, which we call residual component analysis (RCA). We show that canonical covariates analysis (CCA) is a special case of our algorithm and explore a range of new algorithms that arise from the framework. We illustrate the ideas on a gene expression time series data set and the recovery of human pose from silhouette.
ER -
```

APA

Kalaitzis, A.A. & Lawrence, N.D. (2011). Residual Component Analysis. *PMLR*.
