Radiation feedback from stellar clusters is expected to play a key role in setting the rate and efficiency of star formation in giant molecular clouds (GMCs). To investigate how radiation forces influence realistic turbulent systems, we have conducted a series of numerical simulations employing the Hyperion radiation hydrodynamics solver, considering the regime that is optically thick to ultraviolet (UV) and optically thin to infrared (IR) radiation. Our model clouds cover initial surface densities $\Sigma_{\rm cl,0} \sim 10{-}300~M_\odot~{\rm pc^{-2}}$, with varying initial turbulence. We follow them through turbulent, self-gravitating collapse, formation of star clusters, and cloud dispersal by stellar radiation. All our models display a lognormal distribution of gas surface density $\Sigma$; for an initial virial parameter $\alpha_{\rm vir,0} = 2$, the lognormal standard deviation is $\sigma_{\ln\Sigma} = 1{-}1.5$ and the star formation rate coefficient $\varepsilon_{\rm ff,\bar\rho} = 0.3{-}0.5$, both of which are sensitive to turbulence but not to radiation feedback. The net star formation efficiency $\varepsilon_{\rm final}$ increases with $\Sigma_{\rm cl,0}$ and decreases with $\alpha_{\rm vir,0}$. We interpret these results via a simple conceptual framework, whereby steady star formation increases the radiation force, such that local gas patches at successively higher $\Sigma$ become unbound. Based on this formalism (with fixed $\sigma_{\ln\Sigma}$), we provide an analytic upper bound on $\varepsilon_{\rm final}$, which is in good agreement with our numerical results. The final star formation efficiency depends on the distribution of Eddington ratios in the cloud and is strongly increased by turbulent compression of gas.
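The argument sketched above (a lognormal $\Sigma$-PDF, with patches above an Eddington surface density remaining bound) can be illustrated numerically. The sketch below is our own, not from the paper: it assumes a mass-weighted lognormal distribution of $\ln\Sigma$ with dispersion $\sigma_{\ln\Sigma}$ and computes the mass fraction at $\Sigma$ above a given Eddington value; the function name and weighting convention are illustrative assumptions.

```python
import math

def bound_mass_fraction(sigma_edd, sigma_mean, sigma_ln=1.25):
    """Illustrative sketch: fraction of cloud mass at surface density
    Sigma > sigma_edd, i.e. gas dense enough to remain bound against
    radiation forces.

    Assumes a mass-weighted lognormal PDF of ln(Sigma) with dispersion
    sigma_ln (the abstract quotes sigma_ln ~ 1-1.5) and mean
    mu = ln(sigma_mean) + sigma_ln**2 / 2, so that the area-weighted
    mean surface density equals sigma_mean. These conventions are
    assumptions for illustration.
    """
    mu = math.log(sigma_mean) + 0.5 * sigma_ln**2
    z = (math.log(sigma_edd) - mu) / (math.sqrt(2.0) * sigma_ln)
    # Complementary error function gives the upper-tail mass fraction.
    return 0.5 * math.erfc(z)
```

Because the lognormal has an extended high-$\Sigma$ tail, turbulent compression (larger $\sigma_{\ln\Sigma}$) keeps more mass super-Eddington at fixed mean $\Sigma$, consistent with the trend stated in the abstract.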