The SMOS mission is a European Space Agency (ESA) project aimed at the global monitoring of surface soil moisture and sea surface salinity from radiometric L-band observations. The radiometer onboard SMOS uses a 2-D synthetic aperture concept to achieve satisfactory spatial resolution at a minimal cost in payload mass and volume. As the satellite moves along its orbit, every area on the Earth's surface is observed at a variety of incidence angles, and this multiangular capability is exploited in the retrieval of geophysical parameters. A major issue for obtaining useful measurements of surface salinity is radiometric accuracy, since the overall dynamic range resulting from ocean salinity variations extends over only a few kelvins. To improve instrument performance, it is foreseen that independent retrieved salinity estimates will be averaged over a suitable space/time domain. This should bring the random uncertainty due to radiometric sensitivity down to around 0.1 on the Practical Salinity Scale (PSS). However, several systematic error sources are also present: biases arise from channel or baseline instrument errors and are superimposed on Gibbs oscillations generated through the reconstruction of brightness temperature fields from correlation products. It is thus important to assess to what extent these errors can be averaged out when building space/time averages of the retrieved salinity values. The present study is a step toward addressing this issue.
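The distinction between random and systematic errors under averaging can be illustrated with a minimal Monte Carlo sketch. The numbers below (single-retrieval noise, bias magnitude, number of retrievals per space/time domain) are illustrative assumptions, not mission specifications; the sketch only shows that the random part of the error shrinks as the square root of the number of independent retrievals, while a constant bias survives averaging untouched.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Illustrative (hypothetical) values, not SMOS specifications.
true_salinity = 35.0   # "true" surface salinity, PSS
sigma_single = 1.0     # random uncertainty of one retrieval, PSS
bias = 0.2             # hypothetical systematic error, PSS
n_retrievals = 100     # independent retrievals averaged per domain
n_trials = 10_000      # Monte Carlo repetitions of the averaging

# Each trial averages n_retrievals noisy estimates that all share
# the same systematic bias.
samples = true_salinity + bias + sigma_single * rng.standard_normal(
    (n_trials, n_retrievals))
means = samples.mean(axis=1)

print(f"std of a single retrieval : {sigma_single:.3f} PSS")
print(f"std of the averaged value : {means.std():.3f} PSS "
      f"(expected {sigma_single / np.sqrt(n_retrievals):.3f})")
print(f"residual error of the mean: {means.mean() - true_salinity:.3f} PSS "
      f"(the {bias} PSS bias does not average out)")
```

With these assumed values, averaging 100 independent retrievals brings the random uncertainty from 1 PSS down to about 0.1 PSS, consistent with the target quoted above, while the residual error remains pinned at the systematic bias. This is the mechanism that makes the treatment of biases and Gibbs oscillations the critical question for space/time averaging.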