The observed cooling rate of hot gas in clusters is much lower than that inferred from the gas density profiles, suggesting that the gas is being heated by some source. We use an adaptive-mesh refinement code (Flash) to simulate the effect of multiple, randomly positioned injections of thermal energy within 50 kpc of the centre of an initially isothermal cluster with mass $M_{200} = 3 \times 10^{14}\,\mathrm{M}_{\sun}$ and $kT = 3.1$ keV. We have performed eight simulations in which spherical bubbles of energy are generated every $10^{8}$ years, over a total of 1.5 Gyr. Each bubble is created by injecting thermal energy steadily for $10^{7}$ years; the total energy per bubble ranges from $0.1$ to $3 \times 10^{60}$ erg, depending on the simulation. We find that $2 \times 10^{60}$ erg per bubble (corresponding to an average power of $6.3 \times 10^{44}\ \mathrm{erg\,s^{-1}}$) effectively balances the energy losses in the cluster and prevents the accumulation of gas below $kT = 1$ keV from exceeding observational limits. This injection rate is comparable to the radiated luminosity of the cluster, and the required energy and recurrence timescale of events are consistent with observations of bubbles produced by central AGN in clusters. The effectiveness of this process depends primarily on the total amount of injected energy and on the initial location of the bubbles, but is relatively insensitive to the exact duty cycle of events.
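For reference, the quoted average power follows directly from dividing the per-bubble energy by the $10^{8}$ yr recurrence interval (taking $1\ \mathrm{yr} \approx 3.16 \times 10^{7}$ s); the symbols $\bar{P}$, $E_{\rm bubble}$, and $\Delta t$ are introduced here purely as shorthand:
\[
\bar{P} \;=\; \frac{E_{\rm bubble}}{\Delta t}
\;=\; \frac{2 \times 10^{60}\ \mathrm{erg}}{10^{8}\ \mathrm{yr} \times 3.16 \times 10^{7}\ \mathrm{s\,yr^{-1}}}
\;\approx\; 6.3 \times 10^{44}\ \mathrm{erg\,s^{-1}}.
\]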