Raman spectroscopy has been revolutionised in recent decades by major technological advances such as lasers, charge-coupled device (CCD) detectors and notch/edge filters. In contrast, the development of signal processing algorithms has progressed at a slower pace. Spectroscopic applications increasingly focus on ‘real-world’ problems that are not conducted under highly controlled conditions and that place more stringent limitations on acquisition (e.g. low power for in vivo and explosives analysis). It is often necessary to work with signals of a quality traditionally considered poor. In this study an alternative paradigm for processing poor-quality signals is presented and rigorously assessed. Instead of estimating the background on the individual signals, it is estimated on the results of a multivariate analysis. Under this paradigm, prediction reproducibility is unaffected by the signal processing, unlike the traditional paradigm of correcting individual signals, which induces errors that propagate through to the prediction. The paradigms were tested on a ‘real-world’ dataset to predict the concentration of a pathologically relevant protein modification, carboxymethyl lysine (CML). Using the new paradigm, signals with a signal-to-noise ratio (SNR) of 2.4 gave a prediction with a variance of just 8.7% of the mean, whereas the traditional paradigm gave a variance of over 140% of the mean. A significant improvement in reproducibility was observed even with signals as good as SNR 85. The ability to obtain reproducible predictions from low-quality signals allows shorter acquisition times (e.g. mapping or on-line analysis), the use of low laser powers (in vivo diagnostics, hazardous materials (HAZMAT) analysis) or the use of cheaper equipment. Copyright © 2011 John Wiley & Sons, Ltd.
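The contrast between the two paradigms can be sketched as follows. This is only an illustrative sketch: the abstract does not name the specific background-estimation or multivariate method, so a low-order polynomial baseline and an SVD-based decomposition are assumed here as stand-ins.

```python
import numpy as np

def poly_baseline_correct(y, order=3):
    """Fit and subtract a low-order polynomial baseline.
    A simple stand-in for the background-estimation step;
    the actual method used in the paper may differ."""
    x = np.arange(len(y))
    coeffs = np.polyfit(x, y, order)
    return y - np.polyval(coeffs, x)

def traditional_paradigm(spectra, n_components=2):
    """Traditional paradigm: baseline-correct every individual (noisy)
    spectrum first, then run the multivariate decomposition. Errors made
    on each noisy signal propagate into the decomposition."""
    corrected = np.array([poly_baseline_correct(s) for s in spectra])
    centred = corrected - corrected.mean(axis=0)
    # SVD as a generic multivariate analysis (PCA on mean-centred data)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return vt[:n_components]

def alternative_paradigm(spectra, n_components=2):
    """Alternative paradigm: decompose the raw spectra first, then
    baseline-correct only the resulting loadings, so per-signal noise
    never enters the background estimate."""
    centred = spectra - spectra.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return np.array([poly_baseline_correct(v) for v in vt[:n_components]])
```

In the alternative paradigm the background is estimated once, on the (much less noisy) multivariate components, rather than hundreds of times on individual low-SNR spectra, which is why its prediction reproducibility is insensitive to the background-correction step.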