The radio continuum spectra of 14 star-forming galaxies are investigated by fitting nonthermal (synchrotron) and thermal (free-free) radiation laws. The underlying radio continuum measurements cover a frequency range of $\sim 325$ MHz to 24.5 GHz (32 GHz in the case of M 82). We find that most of the synchrotron spectra are not simple power laws: they are best represented by a low-frequency spectrum with a mean slope $\alpha_{\rm nth} = 0.59 \pm 0.20$ ($S_{\nu} \propto \nu^{-\alpha}$), followed by a break or an exponential decline in the frequency range of 1--12 GHz. Simple power laws or mildly curved synchrotron spectra lead to unrealistically low thermal flux densities, and/or to strong deviations from the expected optically thin free-free spectrum with slope $\alpha_{\rm th} = 0.10$ in the fits. The corresponding break or cutoff energies are in the range of 1.5--7 GeV. We briefly discuss the possible origin of such a cutoff or break. If the low-frequency spectra obtained here reflect the injection spectrum of cosmic-ray electrons, they comply with the mean spectral index of Galactic supernova remnants. A comparison of the fitted thermal flux densities with the (foreground-corrected) H$\alpha$ fluxes yields the extinction, which increases with metallicity. The fraction of thermal emission is higher than previously believed, especially at high frequencies, and is highest in the dwarf galaxies of our sample, which we interpret in terms of a lack of containment in these low-mass systems, or a time effect caused by a very young starburst.
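For illustration, a schematic two-component model consistent with the description above (for the exponential-decline case) can be written as below; the 1 GHz reference frequency, the normalizations $S_{\rm th}$ and $S_{\rm nth}$, and the break frequency $\nu_{\rm br}$ are illustrative parameters and not necessarily the exact parametrization adopted in the fits:
\[
S_{\nu} \;=\; S_{\rm th}\left(\frac{\nu}{1\,\mathrm{GHz}}\right)^{-0.10} \;+\; S_{\rm nth}\left(\frac{\nu}{1\,\mathrm{GHz}}\right)^{-\alpha_{\rm nth}} \exp\!\left(-\frac{\nu}{\nu_{\rm br}}\right),
\]
where the first term is the optically thin free-free component with fixed slope 0.10 and the second term is the synchrotron component with low-frequency slope $\alpha_{\rm nth}$ and a decline setting in near $\nu_{\rm br}$.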