We determine the distribution of the total energy emitted by gamma-ray bursts, using bursts with measured fluences and distance information. Our core sample consists of eight bursts with BATSE spectra and spectroscopic redshifts. We extend this sample by adding four bursts with BATSE spectra and host-galaxy R magnitudes; from these R magnitudes we calculate a redshift probability distribution, a method that requires a model of the host-galaxy population. From a sample of ten bursts with both spectroscopic redshifts and host-galaxy R magnitudes (some lack BATSE spectra) we find that the burst rate is proportional to the galaxy luminosity at the epoch of the burst. Assuming that the total emitted energy has a log-normal distribution, we find that the average emitted energy (assumed to be radiated isotropically) is $\langle E_{\gamma\,\mathrm{iso}} \rangle = 1.3^{+1.2}_{-1.0} \times 10^{53}$ ergs (for $H_0 = 65$ km s$^{-1}$ Mpc$^{-1}$, $\Omega_m = 0.3$, and $\Omega_\Lambda = 0.7$); the distribution has a logarithmic width of $\sigma_\gamma = 1.7^{+0.7}_{-0.3}$. The corresponding distribution of the X-ray afterglow energy (for seven bursts) has $\langle E_{X\,\mathrm{iso}} \rangle = 4.0^{+1.6}_{-1.8} \times 10^{51}$ ergs and $\sigma_X = 1.3^{+0.4}_{-0.3}$. For completeness, we also provide spectral fits for all bursts with BATSE spectra for which afterglow searches were conducted.
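For reference, a log-normal energy distribution of the kind assumed above is conventionally parameterized by a central energy and a logarithmic width; the following is a minimal sketch of that standard form (the normalization convention and the base of the logarithm used in the actual fit are assumptions here, not taken from the text):
\begin{equation}
p(E)\,dE \;=\; \frac{1}{\sqrt{2\pi}\,\sigma}\,
\exp\!\left[-\frac{\left(\ln E - \ln E_0\right)^2}{2\sigma^2}\right]\,
\frac{dE}{E},
\end{equation}
where $E_0$ plays the role of the central energy $\langle E \rangle$ and $\sigma$ the logarithmic width quoted above ($\sigma_\gamma$ for the prompt gamma-ray emission, $\sigma_X$ for the X-ray afterglow).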