A maximum stellar surface density $\Sigma_{\rm max} \sim 3\times10^{5}\,\mathrm{M_{\odot}\,pc^{-2}}$ is observed across all classes of dense stellar systems (e.g. star clusters, galactic nuclei), spanning $\sim 8$ orders of magnitude in mass. It has been proposed that this characteristic scale is set by some dynamical feedback mechanism that prevents collapse beyond a certain surface density. However, simple analytic models and detailed simulations of star formation moderated by feedback from massive stars argue that feedback becomes less efficient at higher surface densities (with the star formation efficiency increasing as $\sim \Sigma/\Sigma_{\rm crit}$). We therefore propose an alternative model in which stellar feedback becomes ineffective at moderating star formation above some $\Sigma_{\rm crit}$, so the supply of star-forming gas is rapidly converted to stars before the system can contract to a higher surface density. We show that such a model, with $\Sigma_{\rm crit}$ taken directly from the theory, naturally predicts the observed $\Sigma_{\rm max}$: specifically, $\Sigma_{\rm max} \sim 100\,\Sigma_{\rm crit}$ because the gas consumption time remains longer than the global freefall time even when feedback is ineffective. Moreover, the predicted $\Sigma_{\rm max}$ is robust to spatial scale and metallicity, and is preserved even if multiple episodes of star formation or gas inflow occur. In this context, the observed $\Sigma_{\rm max}$ directly tells us where feedback fails.
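For concreteness, the two quoted scales can be combined into a simple consistency check (using only the numbers stated above; this is illustrative arithmetic, not an independent derivation):
\begin{equation}
\Sigma_{\rm crit} \sim \frac{\Sigma_{\rm max}}{100} \sim \frac{3\times10^{5}\,\mathrm{M_{\odot}\,pc^{-2}}}{100} \sim 3\times10^{3}\,\mathrm{M_{\odot}\,pc^{-2}},
\end{equation}
i.e. the feedback-failure scale implied by the observed $\Sigma_{\rm max}$ is of order a few $\times 10^{3}\,\mathrm{M_{\odot}\,pc^{-2}}$, the surface density at which the efficiency scaling $\sim \Sigma/\Sigma_{\rm crit}$ reaches order unity.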