Impact on the tensor-to-scalar ratio of incorrect Galactic foreground modelling



A key goal of many cosmic microwave background (CMB) experiments is the detection of gravitational waves through their B-mode polarization signal at large scales. Extracting such a signal requires modelling contamination from the Galaxy. Using the Planck experiment as an example, we investigate the impact of incorrectly modelling foregrounds on estimates of the polarized CMB, quantified by the bias in the tensor-to-scalar ratio r and the optical depth τ. We use a Bayesian parameter estimation method to estimate the CMB, synchrotron, and thermal dust components from simulated observations spanning 30–353 GHz, starting from a model that fits the simulated data, returning r < 0.03 at 95 per cent confidence for an r = 0 model and r = 0.09 ± 0.03 for an r = 0.1 model. We then introduce a set of mismatches between the simulated data and the assumed model. Including a curvature of the synchrotron spectral index with frequency, but assuming a power-law model, can bias r high by ∼1σ (δr ∼ 0.03). A similar bias is seen for thermal dust with a modified blackbody frequency dependence incorrectly modelled as a power law. If too much freedom is allowed in the model, for example fitting for spectral indices in 3° pixels over the sky with physically reasonable priors, we find that r can be biased up to ∼3σ high by effectively setting the indices to the wrong values. Increasing the signal-to-noise ratio, either by reducing the number of free parameters or by adding additional foreground data, reduces the bias. We also find that neglecting a ∼1 per cent polarized free–free or spinning dust component has a negligible effect on r. These tests highlight the importance of modelling the foregrounds in a way that allows for sufficient complexity while minimizing the number of free parameters.
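For reference, the competing frequency dependences discussed above can be sketched as follows; the pivot frequency ν₀ and the symbol choices are illustrative assumptions, not taken from this work. A synchrotron power law with a curvature term C, compared with a pure power law of index β_s, and a thermal dust modified blackbody of index β_d and temperature T_d, compared with its power-law approximation, take the standard forms
\begin{align}
S_{\rm sync}(\nu) &\propto \left(\frac{\nu}{\nu_0}\right)^{\beta_s + C \ln(\nu/\nu_0)}
 &&\text{vs.}\quad \left(\frac{\nu}{\nu_0}\right)^{\beta_s},\\
S_{\rm dust}(\nu) &\propto \left(\frac{\nu}{\nu_0}\right)^{\beta_d} B_\nu(T_d)
 &&\text{vs.}\quad \left(\frac{\nu}{\nu_0}\right)^{\beta_d'},
\end{align}
where $B_\nu(T_d)$ is the Planck function. Fitting the simpler right-hand forms to data generated from the left-hand forms is the type of model mismatch whose impact on r is quantified in the abstract.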