An analytical model is developed for the screening of the external magnetic field of a rotating, axisymmetric neutron star due to the accretion of plasma from a disk. The decrease of the field occurs due to the electric current in the infalling plasma. The deposition of this current-carrying plasma on the star's surface creates an induced magnetic moment with a sign opposite to that of the original magnetic dipole. The field decreases independently of whether the star spins up or spins down. The time-scale for an appreciable decrease (a factor of $>100$) of the field is found to be $\sim 1.6 \times 10^{7}$ yr for a mass accretion rate $\dot{M} = 10^{-9}\,M_{\odot}/\mathrm{yr}$ and an initial magnetic moment $\mu_{i} = 10^{30}~\mathrm{G\,cm^{3}}$, which corresponds to a surface field of $10^{12}$ G if the star's radius is $10^{6}$ cm. The time-scale varies approximately as $\mu_{i}^{3.8}/\dot{M}^{1.9}$. The decrease of the magnetic field does not bear a simple relation to the accreted mass. Once the accretion stops, the field leaks out on an Ohmic diffusion time-scale, which is estimated to be $>10^{9}$ yr.
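For reference, the quoted normalization and scaling can be combined into a single compact formula; this form is not written explicitly above, and it assumes the power-law scaling holds in the neighborhood of the fiducial values quoted:

% Sketch: compact form of the quoted decay time-scale, normalized to the
% fiducial values above; assumes the power-law scaling \mu_i^{3.8}/\dot{M}^{1.9}
% applies near these values.
\begin{equation}
  \tau \simeq 1.6 \times 10^{7}\,\mathrm{yr}
  \left( \frac{\mu_{i}}{10^{30}\,\mathrm{G\,cm^{3}}} \right)^{3.8}
  \left( \frac{\dot{M}}{10^{-9}\,M_{\odot}\,\mathrm{yr}^{-1}} \right)^{-1.9}
\end{equation}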