We model the evolution of the magnetic fields of neutron stars as a long-term power-law decay modulated by short-term, small-amplitude oscillations. Our model predictions for the timing noise $\ddot{\nu}$ of neutron stars agree well with the observed statistical properties and correlations of normal radio pulsars. Fitting the model predictions to the observed data, we find that the initial surface dipole magnetic field strength is $B_0 \sim 5 \times 10^{14}$ G for an initial decay timescale $t_0 = 0.4$ yr, and that the oscillations have amplitudes $K \sim 10^{-8}$ to $10^{-5}$ and periods $T$ on the order of years. For individual pulsars our model can effectively reduce their timing residuals, thus offering the potential for more sensitive detections of gravitational waves with pulsar timing arrays. Finally, our model can also reproduce the observed correlations and oscillations of $\ddot{\nu}$, as well as the ``slow glitch'' phenomenon.
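For concreteness, a minimal sketch of a field-evolution law of the kind described above could be written as follows; the power-law index $\alpha$ and phase $\phi$ are illustrative assumptions, not parameters quoted in the text:
\begin{equation}
B(t) = B_0 \left(1 + \frac{t}{t_0}\right)^{-\alpha} \left[1 + K \sin\!\left(\frac{2\pi t}{T} + \phi\right)\right],
\end{equation}
where $B_0$ is the initial surface dipole field strength, $t_0$ sets the decay timescale, and $K$ and $T$ are the oscillation amplitude and period discussed above.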