Using hydrodynamical simulations of the $\Lambda$CDM cosmology that include both radiative cooling and a phenomenological model for star formation and supernova feedback, we investigate the impact of galaxy formation on the X-ray properties of groups at redshift zero. Motivated by the observed ``break'' in the $L_x$--$T_x$ relation at $kT_x \sim 1$--$2\,\mathrm{keV}$, our feedback model is based on the assumption that supernovae imprint a temperature scale on the hot gas, with the star formation rate and the corresponding reheated gas mass then depending only on the available energy budget. We demonstrate that a strong feedback model with a heating temperature comparable to this break ($kT_{\rm SN} = 2\,\mathrm{keV}$) and an energy budget twice that available from supernovae ($\epsilon = 2$) raises the core entropy of groups sufficiently to produce an adequate match to their observed X-ray properties. A lower value of $\epsilon$ increases the star formation rate without significantly affecting the X-ray properties of groups, and a model with $\epsilon \sim 0.1$ reproduces the observed fraction of baryons in stars. However, a heating temperature lower than the virial temperatures of the groups leads to an excess of cooling gas that boosts their X-ray luminosities, because the reheated material then fails to escape the gravitational potential. A limited study of numerical resolution effects reveals that the temperature of poorly resolved objects is underestimated; a fully resolved group population would therefore (in our case) yield a steeper $L_x$--$T_x$ relation, bringing our results into even better agreement with the observations.
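The feedback prescription above amounts to a simple energy balance; as a minimal sketch (with notation introduced here for illustration, not taken from the paper: $\epsilon_{\rm SN}$ is the supernova energy released per unit mass of stars formed, $\mu m_{\rm H}$ the mean particle mass, and $\dot{M}_*$ the star formation rate),
\[
\frac{3 k T_{\rm SN}}{2 \mu m_{\rm H}} \, \dot{M}_{\rm reheat} = \epsilon \, \epsilon_{\rm SN} \, \dot{M}_*
\quad\Longrightarrow\quad
\dot{M}_{\rm reheat} = \frac{2 \, \epsilon \, \epsilon_{\rm SN} \, \mu m_{\rm H}}{3 k T_{\rm SN}} \, \dot{M}_* ,
\]
so that once $kT_{\rm SN}$ is fixed, the reheated gas mass is tied to the star formation rate solely through the energy budget $\epsilon$, as stated above.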