# Accounting for Probe-level Noise in Principal Component Analysis of Microarray Data

Guido Sanguinetti, University of Edinburgh
Marta Milo, University of Sheffield
Magnus Rattray, University of Manchester
Neil D. Lawrence, University of Sheffield

Bioinformatics 21(19), pp 3748–3754

#### Abstract

**Motivation:** Principal Component Analysis (PCA) is one of the most popular dimensionality reduction techniques for the analysis of high-dimensional datasets. However, in its standard form, it does not take into account any error measures associated with the data points beyond a standard spherical noise. This indiscriminate nature provides one of its main weaknesses when applied to biological data with inherently large variability, such as expression levels measured with microarrays. Methods now exist for extracting credibility intervals from the probe-level analysis of cDNA and oligonucleotide microarray experiments. These credibility intervals are gene and experiment specific, and can be propagated through an appropriate probabilistic downstream analysis.

**Results:** We propose a new model-based approach to PCA that takes into account the variances associated with each gene in each experiment. We develop an efficient EM-algorithm to estimate the parameters of our new model. The model provides significantly better results than standard PCA, while remaining computationally reasonable. We show how the model can be used to 'denoise' a microarray dataset leading to improved expression profiles and tighter clustering across profiles. The probabilistic nature of the model means that the correct number of principal components is automatically obtained.

**Availability:** The software used in the paper is available from http://www.bioinf.manchester.ac.uk/resources/puma. The microarray data are deposited in the NCBI database.
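The underlying idea can be illustrated as probabilistic PCA with a known, element-specific noise variance attached to each observation, fitted by EM. The following is a minimal sketch, not the paper's implementation: the function name, the row-wise M-step organisation, and the synthetic setup are illustrative assumptions.

```python
import numpy as np

def noisy_pca_em(Y, S, q=2, n_iter=50, seed=0):
    """EM for a PCA-like latent variable model with known per-element
    noise variances (y_n = W x_n + mu + eps_n, eps_nd ~ N(0, S[n, d])).

    Y : (N, D) data matrix, e.g. rows = experiments, columns = genes
    S : (N, D) known noise variance for each entry of Y
    q : number of latent components
    """
    rng = np.random.default_rng(seed)
    N, D = Y.shape
    W = rng.standard_normal((D, q)) * 0.1
    mu = Y.mean(axis=0)
    for _ in range(n_iter):
        # E-step: posterior moments of the latent x_n for each sample,
        # using the sample-specific diagonal noise covariance S[n].
        Ex = np.zeros((N, q))
        Exx = np.zeros((N, q, q))
        for n in range(N):
            inv_s = 1.0 / S[n]                     # diagonal of S_n^{-1}
            A = np.eye(q) + (W.T * inv_s) @ W      # I + W^T S_n^{-1} W
            Sigma = np.linalg.inv(A)               # posterior covariance
            Ex[n] = Sigma @ (W.T @ (inv_s * (Y[n] - mu)))
            Exx[n] = Sigma + np.outer(Ex[n], Ex[n])
        # M-step: noise is element-specific, so mu and each row of W get
        # their own inverse-variance-weighted least-squares update.
        mu = ((Y - Ex @ W.T) / S).sum(axis=0) / (1.0 / S).sum(axis=0)
        R = Y - mu
        for d in range(D):
            w_inv = 1.0 / S[:, d]
            G = np.einsum('n,nij->ij', w_inv, Exx)  # sum_n E[x x^T]/s_nd
            b = (w_inv * R[:, d]) @ Ex              # sum_n r_nd E[x_n]/s_nd
            W[d] = np.linalg.solve(G, b)
    return W, mu, Ex

# Illustrative usage on synthetic two-component data.
rng = np.random.default_rng(1)
X_true = rng.standard_normal((30, 2))
W_true = rng.standard_normal((5, 2))
S = np.full((30, 5), 0.01)                 # known noise variances
Y = X_true @ W_true.T + 0.1 * rng.standard_normal((30, 5))
W, mu, Ex = noisy_pca_em(Y, S, q=2)
denoised = Ex @ W.T + mu                   # 'denoised' reconstruction
```

Because the posterior over the latent variables weights each entry by its inverse noise variance, unreliable measurements contribute less to the fitted components than in standard PCA.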

  @Article{sanguinetti-accounting05, title = {Accounting for Probe-level Noise in Principal Component Analysis of Microarray Data}, journal = {Bioinformatics}, author = {Guido Sanguinetti and Marta Milo and Magnus Rattray and Neil D. Lawrence}, pages = {3748--3754}, year = {2005}, volume = {21}, number = {19}, edit = {https://github.com/lawrennd//publications/edit/gh-pages/_posts/2005-01-01-sanguinetti-accounting05.md}, url = {http://inverseprobability.com/publications/sanguinetti-accounting05.html}, abstract = {**Motivation:** Principal Component Analysis (PCA) is one of the most popular dimensionality reduction techniques for the analysis of high-dimensional datasets. However, in its standard form, it does not take into account any error measures associated with the data points beyond a standard spherical noise. This indiscriminate nature provides one of its main weaknesses when applied to biological data with inherently large variability, such as expression levels measured with microarrays. Methods now exist for extracting credibility intervals from the probe-level analysis of cDNA and oligonucleotide microarray experiments. These credibility intervals are gene and experiment specific, and can be propagated through an appropriate probabilistic downstream analysis.\ \ **Results:** We propose a new model-based approach to PCA that takes into account the variances associated with each gene in each experiment. We develop an efficient EM-algorithm to estimate the parameters of our new model. The model provides significantly better results than standard PCA, while remaining computationally reasonable. We show how the model can be used to 'denoise' a microarray dataset leading to improved expression profiles and tighter clustering across profiles. The probabilistic nature of the model means that the correct number of principal components is automatically obtained.\ \ **Availability:** The software used in the paper is available from http://www.bioinf.manchester.ac.uk/resources/puma. The microarray data are deposited in the NCBI database.}, key = {Sanguinetti-accounting05}, doi = {10.1093/bioinformatics/bti617}, linkpdf = {http://bioinformatics.oxfordjournals.org/cgi/reprint/21/19/3748}, linksoftware = {http://inverseprobability.com/nppca/}, group = {shefml,puma,pca,ppca} }
 %T Accounting for Probe-level Noise in Principal Component Analysis of Microarray Data %A Guido Sanguinetti and Marta Milo and Magnus Rattray and Neil D. Lawrence %D 2005 %F sanguinetti-accounting05 %J Bioinformatics %P 3748--3754 %R 10.1093/bioinformatics/bti617 %U http://inverseprobability.com/publications/sanguinetti-accounting05.html %V 21 %N 19 %X **Motivation:** Principal Component Analysis (PCA) is one of the most popular dimensionality reduction techniques for the analysis of high-dimensional datasets. However, in its standard form, it does not take into account any error measures associated with the data points beyond a standard spherical noise. This indiscriminate nature provides one of its main weaknesses when applied to biological data with inherently large variability, such as expression levels measured with microarrays. Methods now exist for extracting credibility intervals from the probe-level analysis of cDNA and oligonucleotide microarray experiments. These credibility intervals are gene and experiment specific, and can be propagated through an appropriate probabilistic downstream analysis.\ \ **Results:** We propose a new model-based approach to PCA that takes into account the variances associated with each gene in each experiment. We develop an efficient EM-algorithm to estimate the parameters of our new model. The model provides significantly better results than standard PCA, while remaining computationally reasonable. We show how the model can be used to 'denoise' a microarray dataset leading to improved expression profiles and tighter clustering across profiles. The probabilistic nature of the model means that the correct number of principal components is automatically obtained.\ \ **Availability:** The software used in the paper is available from http://www.bioinf.manchester.ac.uk/resources/puma. The microarray data are deposited in the NCBI database. 
 TY - JOUR TI - Accounting for Probe-level Noise in Principal Component Analysis of Microarray Data AU - Guido Sanguinetti AU - Marta Milo AU - Magnus Rattray AU - Neil D. Lawrence PY - 2005/01/01 DA - 2005/01/01 ID - sanguinetti-accounting05 JF - Bioinformatics VL - 21 IS - 19 SP - 3748 EP - 3754 DO - 10.1093/bioinformatics/bti617 L1 - http://bioinformatics.oxfordjournals.org/cgi/reprint/21/19/3748 UR - http://inverseprobability.com/publications/sanguinetti-accounting05.html AB - **Motivation:** Principal Component Analysis (PCA) is one of the most popular dimensionality reduction techniques for the analysis of high-dimensional datasets. However, in its standard form, it does not take into account any error measures associated with the data points beyond a standard spherical noise. This indiscriminate nature provides one of its main weaknesses when applied to biological data with inherently large variability, such as expression levels measured with microarrays. Methods now exist for extracting credibility intervals from the probe-level analysis of cDNA and oligonucleotide microarray experiments. These credibility intervals are gene and experiment specific, and can be propagated through an appropriate probabilistic downstream analysis.\ \ **Results:** We propose a new model-based approach to PCA that takes into account the variances associated with each gene in each experiment. We develop an efficient EM-algorithm to estimate the parameters of our new model. The model provides significantly better results than standard PCA, while remaining computationally reasonable. We show how the model can be used to 'denoise' a microarray dataset leading to improved expression profiles and tighter clustering across profiles. The probabilistic nature of the model means that the correct number of principal components is automatically obtained.\ \ **Availability:** The software used in the paper is available from http://www.bioinf.manchester.ac.uk/resources/puma. The microarray data are deposited in the NCBI database. ER - 
 Sanguinetti, G., Milo, M., Rattray, M. & Lawrence, N. D. (2005). Accounting for Probe-level Noise in Principal Component Analysis of Microarray Data. Bioinformatics 21(19):3748–3754