This study used Monte Carlo simulation to investigate the interaction between a linear trend and a lag-one autoregressive (AR(1)) process when both are present in a time series. The simulation experiments demonstrated that the presence of serial correlation alters the variance of the estimate of the Mann–Kendall (MK) statistic, and that the presence of a trend alters the estimate of the magnitude of the serial correlation. Furthermore, removing a positive serial correlation component from a time series by pre-whitening was shown to reduce the magnitude of the existing trend, whereas removing the trend component from the series first, prior to pre-whitening, eliminates the influence of the trend on the serial correlation and does not seriously affect the estimate of the true AR(1) coefficient. These results indicate that the commonly used pre-whitening procedure for eliminating the effect of serial correlation on the MK test can lead to inaccurate assessments of the significance of a trend, and that other procedures are more appropriate for eliminating the impact of serial correlation on the MK test. In essence, it is advocated that the trend first be removed from a series before the magnitude of the serial correlation is estimated. This alternative approach and the previously existing approaches were applied to assess the significance of trends in serially correlated annual mean and annual minimum streamflow data from several pristine river basins in Ontario, Canada. The results indicate that, with the previously existing procedures, researchers and practitioners may have incorrectly identified significant trends. Copyright © Environment Canada. Published by John Wiley & Sons, Ltd.
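The advocated order of operations (estimate and remove the trend first, then estimate the AR(1) coefficient, pre-whiten the detrended residuals, blend the trend back in, and apply the MK test) can be sketched as follows. This is an illustrative reconstruction under simple assumptions (Theil–Sen slope, no-ties MK variance), not the authors' code; all function names are our own.

```python
import numpy as np

def sens_slope(x):
    """Theil-Sen trend estimate: the median of all pairwise slopes."""
    n = len(x)
    slopes = [(x[j] - x[i]) / (j - i) for i in range(n) for j in range(i + 1, n)]
    return np.median(slopes)

def lag1_autocorr(x):
    """Sample lag-one serial correlation coefficient."""
    x = x - x.mean()
    return np.sum(x[1:] * x[:-1]) / np.sum(x * x)

def mk_z(x):
    """Mann-Kendall S statistic converted to a standard normal Z (no ties)."""
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return (s - 1) / np.sqrt(var_s)
    if s < 0:
        return (s + 1) / np.sqrt(var_s)
    return 0.0

def tfpw_mk(x):
    """Trend-free pre-whitening: remove the trend before estimating AR(1),
    pre-whiten, restore the trend, then apply the MK test."""
    x = np.asarray(x, dtype=float)
    t = np.arange(len(x))
    b = sens_slope(x)                # 1. estimate the linear trend
    detrended = x - b * t            # 2. remove it before estimating r1
    r1 = lag1_autocorr(detrended)    # 3. r1 no longer inflated by the trend
    residual = detrended[1:] - r1 * detrended[:-1]  # 4. pre-whiten
    blended = residual + b * t[1:]   # 5. add the trend back
    return mk_z(blended)             # 6. assess significance on the blend
```

Estimating `r1` on the detrended series (step 3) is the point of the alternative approach: estimating it on the raw series would mix trend into the serial correlation estimate, and pre-whitening would then remove part of the trend itself.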