The light curves of many supernovae (SNe) and gamma-ray bursts (GRBs) can be explained by a sustained injection of extra energy from a possible central engine, a rapidly rotating, strongly magnetic neutron star (i.e. a magnetar). The magnetic dipole radiation power that the magnetar supplies comes at the expense of the star's rotational energy. However, radiation by gravitational waves (GWs) can be more efficient than magnetic dipole radiation because of its stronger dependence on the neutron star spin rate $\Omega$, i.e. $\Omega^{6}$ (for a static `mountain') or $\Omega^{8}$ (for an r-mode fluid oscillation) versus $\Omega^{4}$ for magnetic dipole radiation. Here, we use the magnetic field $B$ and initial spin period $P_{0}$ inferred from SN and GRB observations to obtain simple constraints on the dimensionless amplitudes: $\varepsilon<0.01$ for the mountain and $\alpha<1$ for the r-mode oscillation, the former being similar to constraints obtained in recent works. We then include GW emission within the magnetar model. We show that when $\varepsilon>10^{-4}\,(B/10^{14}\,\mbox{G})(P_{0}/1\,\mbox{ms})$ or $\alpha>0.01\,(B/10^{14}\,\mbox{G})(P_{0}/1\,\mbox{ms})^{2}$, light curves are strongly affected, with a significant decrease in peak luminosity and an increase in the time to peak luminosity. Thus the GW effects studied here are more pronounced for low $B$ and short $P_{0}$, but they are unlikely to be important in modelling SN and GRB light curves, since the amplitudes needed for noticeable changes are quite large.
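For concreteness, the spin-down luminosity scalings invoked above can be summarized schematically as follows. This is a sketch assuming the standard vacuum dipole and mass-quadrupole GW formulae; the r-mode prefactor depends on the stellar model and is left implicit, and $I$ denotes the stellar moment of inertia:
\[
L_{\rm dip}\propto B^{2}\Omega^{4},\qquad
L_{\rm GW}^{\rm mountain}=\frac{32G}{5c^{5}}\,I^{2}\varepsilon^{2}\Omega^{6},\qquad
L_{\rm GW}^{\rm r\mbox{-}mode}\propto\alpha^{2}\Omega^{8}.
\]
The steeper $\Omega$-dependence of the GW terms is why GW emission can dominate the spin-down at short initial periods $P_{0}=2\pi/\Omega_{0}$, diverting rotational energy that would otherwise power the light curve.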