Knowledge of the scatter in the mass-observable relation is a key ingredient for a cosmological analysis based on galaxy clusters in a photometric survey. In this paper we quantify the capability of the correlation function of galaxy clusters to constrain the intrinsic scatter \sigma_{\ln M}. We demonstrate how the linear bias measured from the cluster correlation function can be used to determine the value of this parameter. The new method is tested on simulations of a 5,000 deg^{2} optical survey up to z \sim 1, similar to the ongoing Dark Energy Survey (DES). Our results show that the method performs better at lower scatter values: an intrinsic scatter of \sigma_{\ln M} = 0.1 can be measured with a standard deviation of \sigma(\sigma_{\ln M}) \sim 0.03. However, the expected intrinsic scatter of the DES redMaPPer cluster catalog, \sigma_{\ln M} \sim 0.2, cannot be recovered with suitable accuracy and precision because the survey area is insufficient. Future photometric surveys with larger area coverage, such as LSST and Euclid, will reduce the statistical errors; we therefore forecast higher precision in measuring the intrinsic scatter, including values as large as \sigma_{\ln M} \sim 0.2. We conclude that this method can serve as an internal consistency check on the simplifying assumptions of other calibration methods, complementary to cross-calibration techniques in multi-wavelength cluster observations.
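To make the idea concrete, the following is a minimal sketch of the underlying relation, assuming a lognormal mass-observable relation with scatter \sigma_{\ln M}, a halo mass function n(M), and a halo bias b(M); the observable threshold M_{\rm obs}^{\rm th} and the specific selection function are illustrative assumptions, not necessarily the exact formulation used in the paper. The effective bias of a threshold-selected cluster sample is the halo bias weighted by the probability of each halo entering the sample,
\[
b_{\rm eff}(\sigma_{\ln M}) = \frac{\int \mathrm{d}M \, n(M)\, b(M)\, S(M; \sigma_{\ln M})}{\int \mathrm{d}M \, n(M)\, S(M; \sigma_{\ln M})},
\qquad
S(M; \sigma_{\ln M}) = \frac{1}{2}\,\mathrm{erfc}\!\left(\frac{\ln M_{\rm obs}^{\rm th} - \ln M}{\sqrt{2}\,\sigma_{\ln M}}\right),
\]
and the cluster correlation function follows as \xi_{\rm cl}(r) = b_{\rm eff}^{2}\,\xi_{\rm m}(r). Because a larger scatter preferentially up-scatters the more abundant low-mass (low-bias) halos into the sample, b_{\rm eff} typically decreases with \sigma_{\ln M}, so a measurement of the linear bias from \xi_{\rm cl}(r) can be inverted to constrain the scatter.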