It has been shown that HD molecules can form efficiently in metal-free gas collapsing into massive protogalactic haloes at high redshift. The resulting radiative cooling by HD can lower the gas temperature to that of the cosmic microwave background (CMB), T_CMB = 2.7(1 + z) K, significantly below the temperature of a few × 100 K achievable via H2 cooling alone, and thus reduce the masses of the first generation of stars. Here we consider the suppression of HD cooling by ultraviolet (UV) irradiation in the Lyman–Werner (LW) bands. We include photodissociation of both H2 and HD, and explicitly compute the self-shielding and shielding of both molecules by neutral hydrogen, H i, as well as the shielding of HD by H2. We use a simplified dynamical collapse model, and follow the chemical and thermal evolution of the gas in the presence of a UV background. We find that a LW flux of J_crit,HD ≈ 10^−22 erg cm^−2 sr^−1 s^−1 Hz^−1 is able to suppress HD cooling and thus prevent collapsing primordial gas from reaching temperatures below ∼100 K. The main reason for the lack of HD cooling for J > J_crit,HD is the partial photodissociation of H2, which prevents the gas from reaching sufficiently low temperatures (T < 150 K) for HD to become the dominant coolant; direct HD photodissociation is unimportant except for a narrow range of fluxes and column densities. Since the prevention of HD cooling requires only partial H2 photodissociation, the critical flux J_crit,HD is modest, and is below the UV background required to re-ionize the Universe at z ∼ 10–20. We conclude that HD cooling can reduce the masses of typical stars only in rare haloes forming well before the epoch of re-ionization.