The method of moments has been applied extensively to a wide class of electromagnetic problems, using a variety of basis functions paired with different testing procedures. This work examines, for a given set of basis functions, the selection of test functions that minimize the mean square error. The resulting test functions guarantee an error equal to or smaller than that obtained with any other choice of test functions, and the approximation improves monotonically as the order of the system increases. Although these test functions require one additional application of the differential or integral operator, they provide an absolute lower bound on the mean square error of the approximation and a systematic means of improving accuracy with each increase in the number of unknowns.
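The idea can be illustrated with a small finite-dimensional sketch (the setup and names below are my own, not taken from the paper): if the basis functions are the columns of a matrix B and the operator is a matrix L, then choosing the test functions to be the operator applied to the basis functions (w_n = L u_n) turns the moment-method equations into the normal equations of a least-squares problem. The residual norm is then the minimum over the span of the chosen basis, so with nested bases it can only decrease as the order of the system grows.

```python
import numpy as np

rng = np.random.default_rng(0)
m = 50
# A well-conditioned "operator" standing in for the differential/integral operator
L = rng.standard_normal((m, m)) + 5.0 * np.eye(m)
x_true = rng.standard_normal(m)
g = L @ x_true  # right-hand side (the "excitation")

# Nested basis: the first N columns of a fixed matrix play the role of u_1..u_N
B_full = rng.standard_normal((m, 10))

errs = []
for N in range(1, 11):
    B = B_full[:, :N]
    A = L @ B  # test functions w_n = L u_n: MoM system becomes (L B)^T (L B) a = (L B)^T g
    a, *_ = np.linalg.lstsq(A, g, rcond=None)  # least-squares solve of the normal equations
    errs.append(np.linalg.norm(A @ a - g))     # residual (mean-square-error surrogate)

# With nested bases the residual norm is monotonically nonincreasing in N
assert all(errs[i + 1] <= errs[i] + 1e-9 for i in range(len(errs) - 1))
```

Because each larger basis contains the previous one, the minimized residual at order N+1 can never exceed the residual at order N, which is the monotonic-improvement property the abstract describes.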