The most common means of converting an observed CO line intensity into a molecular gas mass requires the use of a conversion factor (XCO). While in the Milky Way this quantity does not appear to vary significantly, there is good reason to believe that XCO will depend on the larger-scale galactic environment. With sensitive instruments pushing detections to increasingly high redshift, characterizing XCO as a function of physical conditions is crucial to our understanding of galaxy evolution. Utilizing numerical models, we investigate how variations in metallicity, gas temperature and velocity dispersion within galaxies affect the way CO line emission traces the underlying H2 gas mass, and under what circumstances XCO may differ from the Galactic mean value. We find that, owing to the combined effects of increased gas temperature and velocity dispersion, XCO is depressed below the Galactic mean in high surface density environments such as ultraluminous infrared galaxies (ULIRGs). In contrast, in low-metallicity environments, XCO tends to be higher than in the Milky Way, owing to photodissociation of CO in metal-poor clouds. At higher redshifts, gas-rich discs may have gravitationally unstable clumps that are warm (due to increased star formation) and have elevated velocity dispersions. These discs tend to have XCO values intermediate between those of present-epoch gas-rich mergers and quiescent discs at low z. Our model shows that, on average, mergers do have lower XCO values than disc galaxies, though there is significant overlap. XCO varies smoothly with the local conditions within a galaxy, and is not a function of global galaxy morphology. We combine our results to provide a general fitting formula for XCO as a function of CO line intensity and metallicity.
We show that replacing the traditional approach of using one constant XCO for starbursts and another for discs with our best-fitting function produces star formation laws that are continuous rather than bimodal, and that have significantly reduced scatter.
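The dependence described above can be sketched in code. The snippet below is an illustrative toy, not the paper's fitting formula (which is not given in this abstract): the power-law form, `slope`, and the pivot intensity of 10 K km/s are placeholder assumptions. Only the Galactic mean normalization, XCO ≈ 2 × 10^20 cm^-2 (K km s^-1)^-1, is a standard literature value. The sketch captures the qualitative trends: XCO falls below the Galactic mean at high CO surface brightness (ULIRG-like conditions) and rises above it at low metallicity.

```python
# Illustrative sketch of an XCO that depends on local conditions.
# The functional form and coefficients are placeholders, NOT the
# best-fitting formula from the paper.

XCO_GALACTIC = 2.0e20  # cm^-2 (K km s^-1)^-1, Milky Way mean value


def xco(w_co, z_prime, norm=XCO_GALACTIC, slope=-0.5):
    """Hypothetical XCO scaling.

    w_co    : integrated CO intensity [K km s^-1]
    z_prime : metallicity in solar units
    The pivot of 10 K km s^-1 and the slope are assumptions chosen
    so xco(10, 1) recovers the Galactic mean.
    """
    return norm * (w_co / 10.0) ** slope / z_prime


def h2_column(w_co, z_prime):
    """H2 column density [cm^-2] implied by a CO intensity."""
    return xco(w_co, z_prime) * w_co
```

With this toy form, a bright merger-like region (w_co = 100, z_prime = 1) yields a lower XCO than the Galactic value, while a metal-poor region (z_prime = 0.2) yields a higher one, mirroring the trends stated in the abstract.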