Background-error variances estimated from a small ensemble of data assimilations need to be filtered because of the associated sampling noise. Previous studies showed that objective spectral filtering is effective at reducing this noise while largely preserving relevant features. However, since such filters are homogeneous, they tend to smooth out small-scale structures of interest.
In many applications, nonlinear thresholding of wavelet coefficients has proved to be an efficient technique for denoising signals. The algorithm shrinks the wavelet coefficients of the noisy signal toward zero according to an estimated threshold, which is equivalent to applying an adaptive local spatial filter. A quasi-optimal value for the threshold can be computed from the noise variance. We show that the statistical properties of the sampling noise associated with the estimation of background-error variances can be used to calculate the noise level and hence the appropriate threshold value.
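The thresholding procedure can be illustrated with a minimal numpy sketch. This is not the paper's filter: it assumes a Haar wavelet, soft thresholding, and a known noise standard deviation `sigma`, from which the classical "universal" threshold sigma * sqrt(2 ln N) of Donoho and Johnstone is formed; the method described above instead derives the noise level from the sampling-noise statistics of the ensemble. The piecewise-constant signal, the noise amplitude, and the function names are illustrative choices.

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar transform: (approximation, detail)."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def haar_idwt(a, d):
    """Inverse of one Haar level."""
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def soft_threshold(c, t):
    """Shrink coefficients toward zero by t (soft thresholding)."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def wavelet_denoise(x, sigma, levels=4):
    """Soft-threshold the Haar detail coefficients of x.

    sigma is the (assumed known) noise standard deviation; the threshold
    t = sigma * sqrt(2 ln N) is the universal choice, used here only as
    a stand-in for a threshold estimated from the noise statistics.
    """
    n = len(x)  # assumed a power of two with levels <= log2(n)
    t = sigma * np.sqrt(2.0 * np.log(n))
    a = np.asarray(x, dtype=float)
    details = []
    for _ in range(levels):
        a, d = haar_dwt(a)
        details.append(soft_threshold(d, t))
    for d in reversed(details):
        a = haar_idwt(a, d)
    return a

# Example: a piecewise-constant "variance field" plus white noise.
rng = np.random.default_rng(0)
truth = np.concatenate([np.full(64, 1.0), np.full(64, 3.0)])
noisy = truth + 0.3 * rng.standard_normal(128)
smooth = wavelet_denoise(noisy, sigma=0.3)
```

Because large coefficients (the discontinuity) survive the threshold while small noise coefficients are removed, the sharp jump in the field is preserved, which is the adaptive behaviour a homogeneous spectral filter lacks.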
This method is first applied to 1D academic examples, with emphasis on correlated and heterogeneous noise. The approach is shown to outperform the commonly used homogeneous filters, since it automatically adapts to the local structure of the signal. We also show that this technique compares favourably with a heterogeneous diffusion-based filter, with the advantage of requiring less trial-and-error tuning. These results are then confirmed in a more realistic 2D problem, using the Arome-France convective-scale model.