Significant differences in heating rates are found when two solar irradiance spectra are used in a line-by-line radiative transfer code. Compared with a spectrum based on recent satellite data, an older theoretical spectrum gives 20–40% more heating in the ozone Hartley band, which is important in the upper stratosphere. Both spectra are implemented in a broad-band radiation code, to which improvements are also made in the ozone absorption parameterization. A widely used spectrum based on ground-based measurements from the 1960s gives somewhat lower heating rates. The effects of these changes in the spectrum, and in the broad-band scheme, on the temperatures simulated by a middle-atmosphere GCM are investigated. The model has previously shown a warm bias, compared with climatology, around the stratopause; this bias is significantly reduced when the satellite-based spectrum replaces the theoretical one and the new ozone parameterization is incorporated. The change of spectrum accounts for two-thirds of the improvement.