The present case study evaluates the downward longwave radiation at the surface (DLR) in several high-resolution (≈1°) general circulation models (GCMs) using surface observations from a semiarid continental site in New South Wales, Australia (Uardry, 34.39°S, 142.30°E). The site lies on a large grassland plain that is uniform in both land use and land-cover type, and is therefore particularly well suited for comparison with GCM grid-mean values. Monthly averages of newly constructed clear-sky and all-sky DLR climatologies, and the resulting cloud-radiative forcing, are compared. It is shown that the GCMs exceed the observed DLR under cloud-free conditions by 10–20 W m−2 at this semiarid site on an annual basis, with a strong seasonal dependence. The calculated clear-sky fluxes are overestimated during the warmer summer season, when absolute values of DLR are large, while the biases are reduced in the colder and drier winter season, when the fluxes are smaller. This gives direct support to recent evidence that DLR model biases depend systematically on the thermal and humidity structure of the cloudless atmosphere: fluxes from strongly emitting atmospheres tend to be overestimated, while fluxes from weakly emitting atmospheres may be underestimated. This points to common problems inherent in the simulation of emission from the cloudless atmosphere in current longwave radiation codes.
The comparison of the all-sky climatologies at Uardry shows that the clear-sky biases are partly masked in models with insufficient cloud-radiative forcing, which counterbalances the excessive DLR of the cloud-free atmosphere. Conversely, when the cloud-radiative forcing is improved, the biases of the cloud-free atmosphere become fully apparent in the all-sky fluxes.
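The compensation described above follows directly from the definition of the longwave cloud-radiative forcing at the surface: CRF = all-sky DLR − clear-sky DLR, so the all-sky bias is the sum of the clear-sky bias and the CRF bias. A minimal sketch of this bookkeeping, using invented (hypothetical) monthly flux values purely for illustration:

```python
# Longwave cloud-radiative forcing (CRF) at the surface is defined as
# all-sky DLR minus clear-sky DLR. All numbers below are hypothetical,
# not the observed or modeled values from the study.

allsky_dlr = {"Jan": 380.0, "Jul": 290.0}    # W m^-2, hypothetical all-sky DLR
clearsky_dlr = {"Jan": 355.0, "Jul": 275.0}  # W m^-2, hypothetical clear-sky DLR

# CRF for each month: warming effect of clouds on the surface LW budget
crf = {m: allsky_dlr[m] - clearsky_dlr[m] for m in allsky_dlr}

# Bias bookkeeping: an all-sky bias decomposes into clear-sky bias + CRF bias.
# A clear-sky overestimate can be masked when the model's CRF is too small
# by a comparable amount (hypothetical magnitudes).
model_clearsky_bias = 15.0   # W m^-2, model clear-sky DLR too high
model_crf_bias = -15.0       # W m^-2, model CRF too weak
allsky_bias = model_clearsky_bias + model_crf_bias  # biases cancel in all-sky
```

With an improved (unbiased) CRF, `model_crf_bias` goes to zero and the full clear-sky overestimate carries through to the all-sky flux, which is the behavior reported in the abstract.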