Relating the observed CO emission from giant molecular clouds (GMCs) to the underlying H$_2$ column density is a long-standing problem in astrophysics. While the Galactic CO-H$_2$ conversion factor ($X_{\rm CO}$) appears to be reasonably constant, observations indicate that $X_{\rm CO}$ may be depressed in high-surface-density starburst environments. Using a multi-scale approach, we investigate the dependence of $X_{\rm CO}$ on the galactic environment in numerical simulations of disc galaxies and galaxy mergers. $X_{\rm CO}$ is proportional to the GMC surface density divided by the integrated CO intensity, $W_{\rm CO}$, and $W_{\rm CO}$ is related to the kinetic temperature and velocity dispersion of the cloud. In disc galaxies (except within the central $\sim$kpc), the galactic environment is largely unimportant in setting the physical properties of GMCs, provided they are gravitationally bound. The gas temperatures are roughly constant at $\sim$10 K owing to the balance between CO cooling and cosmic-ray heating, giving a nearly constant CO-H$_2$ conversion factor in discs. In mergers, the velocity dispersion of the gas rises dramatically during coalescence. The gas temperature also rises, as the gas couples well to the warm ($\sim$50 K) dust at high densities ($n > 10^{4}$ cm$^{-3}$). The rises in velocity dispersion and temperature combine to offset the rise in surface density in mergers, causing $X_{\rm CO}$ to drop by a factor of $\sim$2-10 compared with the disc simulation. This model predicts that high-resolution ALMA observations of nearby ULIRGs should show velocity dispersions of $10^{1}$-$10^{2}$ km s$^{-1}$ and brightness temperatures comparable to the dust temperatures.
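As a schematic illustration of the scaling invoked above (the symbols $N_{\rm H_2}$, $\Sigma_{\rm GMC}$, $T_{\rm B}$ and $\sigma_v$ are introduced here for clarity and are not defined in the abstract itself), the standard definition of the conversion factor together with an order-of-magnitude estimate of the integrated intensity gives
\begin{equation}
X_{\rm CO} \equiv \frac{N_{\rm H_2}}{W_{\rm CO}} \propto \frac{\Sigma_{\rm GMC}}{W_{\rm CO}},
\qquad
W_{\rm CO} = \int T_{\rm B}\,{\rm d}v \sim T_{\rm B}\,\sigma_v ,
\end{equation}
so that, for optically thick CO with $T_{\rm B}$ tracking the kinetic temperature, a simultaneous rise in temperature and velocity dispersion can offset an increase in $\Sigma_{\rm GMC}$ and lower $X_{\rm CO}$, as described for the merger case.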