We compute the impact of the running of higher-order density correlation functions on the two-point functions of CMB spectral distortions (SDs). We show that even a modest level of running enhances all of the SD signals by a few orders of magnitude, which may make them easier to detect. Taking a reasonable range $|n_{f_{NL}}| \lesssim 1.1$ with $f_{NL} = 5$, we show that for a PIXIE-like experiment the signal-to-noise ratio, $(S/N)_i$, is enhanced to $\lesssim 4000$ and $\lesssim 10$ for $\mu T$ and $yT$, respectively, toward the upper limit of $n_{f_{NL}}$. In addition, assuming $|n_{\tau_{NL}}| < 1$ and $\tau_{NL} = 10^{3}$, $(S/N)_i$ increases to $\lesssim 8 \times 10^{6}$, $\lesssim 10^{4}$ and $\lesssim 18$ for $\mu\mu$, $\mu y$ and $yy$, respectively. Therefore, CMB spectral distortions can serve as a direct probe of the running of higher-order correlation functions in the near future.