The detection of long-term trends in geophysical time series is a key issue in climate change studies. This detection is affected by several factors: the size of the trend to be detected, the length of the available data sets, and the noise properties. Although the noise autocorrelation observed in geophysical time series does not bias the trend estimate, it affects the estimation of the trend uncertainty and, consequently, the ability to detect a significant trend. Ignoring the noise autocorrelation typically leads to an overdetection of significant trends. Because satellite lifetimes are usually between 5 and 10 years, sea surface time series do not cover the same period and are acquired by different sensors with different characteristics. These differences lead to unknown level shifts (biases) between the data sets, which affect the trend detection. In this work, we develop a generic framework to detect and evaluate linear trends and level shifts in multisensor time series of satellite chlorophyll-a concentrations, as provided by the Medium Resolution Imaging Spectrometer (MERIS) and Sea-viewing Wide Field-of-view Sensor (SeaWiFS) ocean-color missions. We also discuss the optimization of the observation networks in terms of the time overlap needed between successive time series to reduce the uncertainty in the detection of long-term trends. For the upcoming Sentinel-3 Ocean and Land Colour Instrument (OLCI) mission, scheduled for launch at the end of 2014, we propose a global map of the number of months of observations needed to enhance the trend detection performed with the joint SeaWiFS-MERIS analysis.
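The overdetection effect mentioned above can be illustrated with a small Monte Carlo sketch (not the framework developed in this work): trendless AR(1) series are fit with a linear trend, and the fraction flagged as "significant" is compared between a naive ordinary-least-squares standard error and one inflated by the usual first-order autocorrelation factor (1 + φ)/(1 − φ). The series length, autocorrelation level, and number of simulations are illustrative assumptions.

```python
# Illustrative sketch only: how ignoring AR(1) noise autocorrelation
# inflates the false-detection rate of linear trends. All parameter
# values (n, phi, n_sims) are assumptions, not from the study.
import numpy as np

rng = np.random.default_rng(0)

def trend_zscores(n=120, phi=0.7, n_sims=2000):
    """Fit a linear trend to trendless AR(1) series; return naive and
    autocorrelation-corrected z-scores for the fitted slope."""
    t = np.arange(n)
    X = np.column_stack([np.ones(n), t])
    naive, corrected = [], []
    for _ in range(n_sims):
        # generate zero-trend AR(1) noise (stationary start)
        e = np.empty(n)
        e[0] = rng.standard_normal() / np.sqrt(1 - phi**2)
        for i in range(1, n):
            e[i] = phi * e[i - 1] + rng.standard_normal()
        # OLS fit of intercept + slope, residual variance of the slope
        beta, *_ = np.linalg.lstsq(X, e, rcond=None)
        r = e - X @ beta
        s2 = r @ r / (n - 2)
        var_slope = s2 / np.sum((t - t.mean()) ** 2)
        # estimate lag-1 autocorrelation from the residuals and apply
        # the (1 + phi)/(1 - phi) variance inflation factor
        phi_hat = (r[:-1] @ r[1:]) / (r @ r)
        inflation = (1 + phi_hat) / (1 - phi_hat)
        naive.append(beta[1] / np.sqrt(var_slope))
        corrected.append(beta[1] / np.sqrt(var_slope * inflation))
    return np.array(naive), np.array(corrected)

naive, corrected = trend_zscores()
# fraction of trendless series flagged significant at the 5% level
false_naive = np.mean(np.abs(naive) > 1.96)     # far above the nominal 5%
false_corr = np.mean(np.abs(corrected) > 1.96)  # much closer to 5%
print(f"naive: {false_naive:.2%}, corrected: {false_corr:.2%}")
```

Under these settings the naive test flags a large fraction of pure-noise series as having a significant trend, while the corrected test stays near the nominal level, which is the overdetection behavior the abstract warns about.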