A fast and memory-saving PLS regression algorithm for matrices with large numbers of objects is presented. It is called the kernel algorithm for PLS. Tall matrices X (N × K) and Y (N × M), i.e. matrices with many objects N, are condensed into a small square 'kernel' matrix XᵀYYᵀX of size (K × K), equal to the number of X-variables. Using this kernel matrix XᵀYYᵀX together with the small covariance matrices XᵀX (K × K), XᵀY (K × M) and YᵀY (M × M), it is possible to estimate all parameters needed for a complete PLS regression solution, including some statistical diagnostics. The new developments are presented in equation form. A comparison of the floating point operations consumed by the kernel and the classical PLS algorithms is given. As appendices, a condensed matrix algebra version of the kernel algorithm is given together with the MATLAB code.
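The central idea, that all condensed matrices have sizes depending only on K and M rather than on the number of objects N, can be illustrated with a short sketch. This is not the paper's MATLAB appendix code but a minimal NumPy illustration under standard assumptions: the kernel matrix XᵀYYᵀX equals (XᵀY)(XᵀY)ᵀ, so its dominant eigenvector coincides (up to sign) with the leading left singular vector of XᵀY, a common starting point for the first PLS weight vector. All variable names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, M = 10_000, 5, 3                 # many objects N, few variables K and M
X = rng.standard_normal((N, K))
Y = rng.standard_normal((N, M))

# Condense the tall data matrices once; every product below has a size
# governed by K and M only, independent of the number of objects N.
XtX = X.T @ X                          # (K x K)
XtY = X.T @ Y                          # (K x M)
YtY = Y.T @ Y                          # (M x M)
kernel = XtY @ XtY.T                   # X^T Y Y^T X, (K x K)

# First PLS weight vector as the dominant eigenvector of the kernel matrix.
eigvals, eigvecs = np.linalg.eigh(kernel)
w = eigvecs[:, -1]                     # eigenvector of the largest eigenvalue

# Cross-check: it matches the leading left singular vector of X^T Y (up to sign).
u, s, vt = np.linalg.svd(XtY)
assert np.isclose(abs(w @ u[:, 0]), 1.0)
```

Once the condensed matrices are formed in a single pass over the data, subsequent PLS components can be extracted by working with these small matrices alone, which is the source of the speed and memory savings claimed above.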