Is Protein Quantification and Physical Normalization Always Necessary in Proteomics?
Zelter, A.; Riffle, M.; Merrihew, G. E.; Mutawe, B.; Maurais, A.; Inman, J. L.; Celniker, S. E.; Mao, J.-H.; Wan, K. H.; Snijders, A. M.; Wu, C. C.; MacCoss, M. J.
Dogma suggests protein quantification is a prerequisite for LC-MS/MS-based proteomics studies. Such quantification allows a standardized ratio of sample to digestion enzyme and enables physical normalization of the amount of protein digest loaded onto the mass spectrometer for analysis. Most proteomics studies include these steps. However, performing protein quantification and physical normalization for every sample carries significant costs in time, money, and experimental complexity, especially for larger studies. Proteomics data analysis pipelines typically include computational normalization strategies to compensate for unavoidable systematic biases. These strategies also have the potential to compensate for avoidable variation, such as that introduced by omitting sample amount normalization. Here we investigate the effects of either physically normalizing the amount of protein for each individual sample or leaving it unnormalized. Our results show the relationship between increased variation in the amount of protein in the sample input and the variance of the quantified relative abundances of peptides and proteins output after data analysis. The experiments presented here suggest that protein quantification and physical normalization steps can be omitted from some quantitative proteomic experiments without incurring an unacceptable increase in measurement variability after computational normalization has been applied. This work will enable important time- and cost-saving optimizations to be made to many proteomics workflows.
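The abstract does not specify which computational normalization strategy is meant; a minimal sketch of one common approach, median normalization of log-scale abundances, illustrates how a per-sample loading offset can be removed in software. The abundance matrix and loading offsets below are hypothetical, not data from the study.

```python
import numpy as np

# Hypothetical log2 peptide-abundance matrix: rows = peptides, columns = samples.
# On the log scale, unequal protein loading shifts every peptide in a sample
# by roughly the same additive offset.
rng = np.random.default_rng(0)
true_signal = rng.normal(20.0, 2.0, size=(1000, 4))   # shared underlying biology
loading_offset = np.array([0.0, 1.5, -0.8, 2.2])      # unnormalized input amounts
observed = true_signal + loading_offset

# Median normalization: subtract each sample's median so all samples share a
# common center, compensating computationally for the loading differences.
normalized = observed - np.median(observed, axis=0)

print(np.round(np.median(normalized, axis=0), 6))     # per-sample medians, all ~0
```

Because the offset is approximately constant per sample on the log scale, subtracting a robust per-sample location estimate (here the median) recovers comparable relative abundances without any physical normalization of input amounts.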