Relating the observed CO emission from giant molecular clouds (GMCs) to the underlying H2 column density is a long-standing problem in astrophysics. While the Galactic CO–H2 conversion factor (XCO) appears to be reasonably constant, observations indicate that XCO may be depressed in high-surface-density starburst environments. Using a multiscale approach, we investigate the dependence of XCO on the galactic environment in numerical simulations of disc galaxies and galaxy mergers. XCO is proportional to the GMC surface density divided by the integrated CO intensity, WCO, and WCO is related to the kinetic temperature and velocity dispersion in the cloud. In disc galaxies (except within the central ∼ kpc), the galactic environment is largely unimportant in setting the physical properties of GMCs provided they are gravitationally bound. The temperatures are roughly constant at ∼10 K due to the balance of CO cooling and cosmic ray heating, giving a nearly constant CO–H2 conversion factor in discs. In mergers, the velocity dispersion of the gas rises dramatically during coalescence. The gas temperature also rises, as the gas couples well to the warm (∼50 K) dust at high densities (n > 10^4 cm^−3). The rise in velocity dispersion and temperature combines to offset the rise in surface density in mergers, causing XCO to drop by a factor of ∼2–10 compared to the disc simulations. This model predicts that high-resolution Atacama Large Millimeter/submillimeter Array observations of nearby ultraluminous infrared galaxies should show velocity dispersions of 10^1–10^2 km s^−1, and brightness temperatures comparable to the dust temperatures.
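The scaling invoked in the abstract can be sketched as follows. This is a schematic, not the paper's exact calibration; the symbols Σ_GMC (GMC surface density), T (kinetic temperature) and σ (cloud velocity dispersion) are introduced here for illustration:

```latex
% X_CO is defined as the H2 column density per unit velocity-integrated
% CO intensity:
X_{\mathrm{CO}} \equiv \frac{N_{\mathrm{H_2}}}{W_{\mathrm{CO}}}
% For a resolved cloud, N_{H_2} traces the GMC surface density
% \Sigma_{GMC}, and for optically thick CO the integrated intensity
% scales with the brightness temperature and line width:
W_{\mathrm{CO}} \propto T\,\sigma
% Combining the two gives the schematic dependence described in the text:
X_{\mathrm{CO}} \propto \frac{\Sigma_{\mathrm{GMC}}}{T\,\sigma}
```

Under this scaling, the merger-driven rises in T and σ can offset the rise in Σ_GMC, which is the sense in which XCO is suppressed in starbursting systems.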