We investigate the clustering effect of dark energy (DE) on the formation of galaxy clusters using the spherical collapse model. Assuming a fully clustered DE component, we treat the spherical overdense region as an isolated system in which the energies of matter and DE are conserved separately. We then introduce a parameter $r$ to characterize the degree of DE clustering, defined as the ratio of the nonlinear density contrasts of DE to matter at turnaround in the recollapsing process, i.e., $r \equiv \delta^{\rm NL}_{\rm de,ta}/\delta^{\rm NL}_{\rm m,ta}$. This parameter uniquely determines the spherical collapse process and hence the virialized overdensity $\Delta_{\rm vir}$ through a proper virialization scheme. Estimates of the virialized overdensities from current observations of galaxy clusters suggest $0.5 < r < 0.8$ at the $1\sigma$ level for clustered DE with $w < -0.9$. We also compare our method with linear perturbation theory, which follows the growth of DE perturbations at early times. While the two approaches give consistent results, ours is practically simpler and shows that the collapse process is largely independent of the initial DE perturbation and its early-time evolution.
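To make the setting concrete, the following is a minimal numerical sketch of the standard spherical collapse pipeline that the abstract builds on. It is *not* the paper's clustered-DE scheme: DE is kept homogeneous (only its background density enters), $w$ is constant, and the virialization rule $R_{\rm vir} = R_{\rm ta}/2$ is the textbook matter-only estimate. All parameter values ($\Omega_{m0}$, $w$, $a_i$, $\delta_i$) are illustrative assumptions chosen so that collapse occurs near $a \sim 1$.

```python
# Sketch: homogeneous-DE spherical collapse in units H0 = 1, rho_crit0 = 1.
# Integrates the shell radius R(t), finds turnaround (dR/dt = 0), and
# estimates Delta_vir from the R_vir = R_ta/2 rule. Illustrative only.
import numpy as np
from scipy.integrate import solve_ivp

Om0, w = 0.3, -0.95          # illustrative cosmology
Ode0 = 1.0 - Om0

def H(a):
    """Background Hubble rate for a flat cosmology with constant-w DE."""
    return np.sqrt(Om0 * a**-3 + Ode0 * a**(-3.0 * (1.0 + w)))

a_i, delta_i = 1e-3, 2.05e-3  # initial epoch and matter overdensity (tuned by hand)
R_i = 1.0                     # initial shell radius (arbitrary units)
# GM from the enclosed matter mass: 4*pi*G/3 * rho_m(a_i) = 0.5 * Om0 / a_i^3
GM = 0.5 * Om0 * a_i**-3 * (1.0 + delta_i) * R_i**3

def rhs(t, y):
    """y = [a, R, dR/dt]; homogeneous DE enters only via its background density."""
    a, R, Rdot = y
    rho_de = Ode0 * a**(-3.0 * (1.0 + w))
    Rddot = -GM / R**2 - 0.5 * (1.0 + 3.0 * w) * rho_de * R
    return [a * H(a), Rdot, Rddot]

def turnaround(t, y):         # dR/dt crosses zero from above
    return y[2]
turnaround.terminal = False
turnaround.direction = -1

def collapse(t, y):           # treat R ~ 0 as (approximate) full collapse
    return y[1] - 0.01 * R_i
collapse.terminal = True
collapse.direction = -1

# Growing-mode-like initial velocity: Hubble flow reduced by the perturbation
y0 = [a_i, R_i, H(a_i) * R_i * (1.0 - delta_i / 3.0)]
sol = solve_ivp(rhs, (0.0, 20.0), y0, events=[turnaround, collapse],
                rtol=1e-10, atol=1e-12)

a_ta, R_ta = sol.y_events[0][0][0], sol.y_events[0][0][1]
a_c = sol.y_events[1][0][0]   # scale factor at (near-)collapse
R_vir = R_ta / 2.0            # standard matter-only virialization estimate

# Delta_vir = cluster matter density / background matter density at collapse
Delta_vir = (1.0 + delta_i) * (R_i / R_vir)**3 * (a_c / a_i)**3
print(f"a_ta = {a_ta:.3f}, a_collapse = {a_c:.3f}, Delta_vir ~ {Delta_vir:.0f}")
```

In the matter-only (EdS) limit this recovers the classic $\Delta_{\rm vir} = 18\pi^2 \approx 178$. The paper's point is that once DE clusters, the energy budget of the DE inside the shell changes, so both the turnaround state and the virialization condition acquire a dependence on the single parameter $r$ defined above, replacing the homogeneous-DE treatment sketched here.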