We model the effects of repeated supernova explosions from starbursts in dwarf galaxies on the interstellar medium of these galaxies, taking into account the gravitational potential of their dominant dark matter haloes. We explore supernova rates from one every 30,000 yr to one every 3 million yr, equivalent to steady mechanical luminosities of L = 0.1-10 \times 10^{38} erg s^{-1}, occurring in dwarf galaxies with gas masses M_{g} = 10^{6}-10^{9} M_{\odot}. We address in detail, both analytically and numerically, the following three questions: (1) When do the supernova ejecta blow out of the disk of the galaxy? (2) When blowout occurs, what fraction of the interstellar gas is blown away, escaping the potential of the galactic halo? (3) What happens to the metals ejected by the massive stars of the starburst: are they retained or blown away? We give quantitative results for when blowout will or will not occur in galaxies with 10^{6} \leq M_{g} \leq 10^{9} M_{\odot}. Surprisingly, we find that the mass ejection efficiency is very low for galaxies with mass M_{g} \geq 10^{7} M_{\odot}. Only galaxies with M_{g} \lesssim 10^{6} M_{\odot} have their interstellar gas blown away, and then virtually independently of L. On the other hand, metals from the supernova ejecta are accelerated to velocities larger than the escape speed from the galaxy far more easily than the gas. We find that for L_{38} = 1, only about 30% of the metals are retained by a 10^{9} M_{\odot} galaxy, and virtually none by smaller galaxies. We discuss the implications of our results for the evolution, metallicity, and observational properties of dwarf galaxies.
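The quoted rate-to-luminosity equivalence can be checked with a short sketch, assuming the canonical energy release E_SN = 10^{51} erg per supernova (an assumption; the abstract quotes only the rates and the resulting luminosities):

```python
# Sketch: convert a steady supernova rate to a mechanical luminosity,
# assuming E_SN = 1e51 erg per supernova (assumed canonical value,
# not stated in the abstract).
E_SN = 1e51        # erg released per supernova (assumption)
YR = 3.156e7       # seconds per year

def mechanical_luminosity(interval_yr):
    """Steady luminosity for one supernova every `interval_yr` years, in erg/s."""
    return E_SN / (interval_yr * YR)

L_high = mechanical_luminosity(3e4)   # one SN per 30,000 yr
L_low = mechanical_luminosity(3e6)    # one SN per 3 million yr
print(f"L = {L_low:.1e} - {L_high:.1e} erg/s")
```

With these inputs the range comes out near 10^{37}-10^{39} erg/s, matching the abstract's L = 0.1-10 \times 10^{38} erg s^{-1} (i.e. L_{38} = 0.1-10).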