Relativistic astrophysical phenomena such as gamma-ray bursts (GRBs) and active galactic nuclei often require long-lived, strong magnetic fields that cannot be achieved by shock compression alone. Here, we report three-dimensional special-relativistic magnetohydrodynamic (MHD) simulations, performed with a second-order Godunov-type conservative code, that explore the amplification and decay of macroscopic turbulent dynamo action excited by the so-called Richtmyer-Meshkov instability (RMI; a Rayleigh-Taylor-type instability). This instability is an inevitable outcome of interactions between a shock and ambient density fluctuations. We find that the magnetic energy grows exponentially within a few eddy-turnover times because of field-line stretching, and then, following the decay of the kinetic turbulence, decays with a temporal power-law exponent of -0.7. The magnetic-energy fraction can reach \epsilon_B \sim 0.1 but depends on the initial magnetic field strength, which can diversify the observed phenomena. We find that the magnetic energy grows by at least two orders of magnitude relative to the magnetic energy immediately behind the shock, provided the kinetic energy of the turbulence injected by the RMI exceeds the magnetic energy. This minimum degree of amplification does not depend on the amplitude of the initial density fluctuations, while the growth timescale and the maximum magnetic energy depend on the degree of density inhomogeneity. The transition from the Kolmogorov cascade to the MHD critical-balance cascade occurs at \sim 1/10 of the initial inhomogeneity scale, which limits the maximum synchrotron polarization to less than \sim 2\%. We derive analytical formulas for these numerical results and apply them to GRBs. New results include the avoidance of electron cooling by RMI turbulence, a turbulent photosphere model via the RMI, and the shallow decay of the early afterglow from the RMI.
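The growth-then-decay behavior quoted above can be written compactly as follows (a schematic sketch assembled from the scalings stated in the text; the saturation time t_{\rm sat}, the eddy-turnover time \tau_{\rm eddy}, and the normalization by the total post-shock energy E_{\rm tot} are illustrative labels, not quantities defined in this abstract):

\begin{equation*}
E_B(t) \propto
\begin{cases}
\exp\!\left(t/\tau_{\rm eddy}\right), & t < t_{\rm sat} \quad \text{(growth by field-line stretching)},\\[4pt]
\left(t/t_{\rm sat}\right)^{-0.7}, & t > t_{\rm sat} \quad \text{(decay following the kinetic turbulence)},
\end{cases}
\qquad
\epsilon_B \equiv \frac{E_B}{E_{\rm tot}} \lesssim 0.1 .
\end{equation*}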
We also performed a simulation of freely decaying turbulence with relativistic velocity dispersion. We find that relativistic turbulence begins to decay on a timescale much shorter than one eddy-turnover time because of fast shock dissipation, which does not support the relativistic turbulence model of Narayan & Kumar.