The phase-transition-induced collapse of a neutron star to a more compact configuration (typically a ``strange'' star) and the subsequent core bounce are often invoked as a model for gamma-ray bursts. We present the results of numerical simulations of this kind of event using realistic neutrino physics and a high-density equation of state. The collapse itself is represented by the arbitrary motion of a piston deep within the star, but if any shock is to develop, the transition, or at least its final stages, must occur in less than a sonic time. Fine surface zoning is employed to adequately resolve the acceleration of the shock to relativistic speeds and to determine the amount and energy of the ejecta. We find that these explosions are far too baryon-rich ($M_{\rm ejecta} \sim 0.01\,M_\odot$) and far too low in energy to explain gamma-ray bursts. The total energy of the ejecta with relativistic $\Gamma \gtrsim 40$ is less than $10^{46}$ erg even in our most optimistic models (deep bounce, no neutrino losses or photodisintegration). However, the total energy of all the ejecta, mostly mildly relativistic, is $\sim 10^{51}$ erg, and, if they occur, these events might be observed. They would also contribute to Galactic nucleosynthesis, especially the r-process, even though the most energetic layers are composed of helium and nucleons, not heavy elements.