We show that the observed mass-to-light ratio of galaxy clusters increases with cluster temperature, as expected from cosmological simulations. Contrary to previous observational suggestions, we find a mild but robust increase of $M/L$ from poor ($T \sim 1$--$2$ keV) to rich ($T \sim 12$ keV) clusters; over this range, the mean $M/L_V$ increases by a factor of $\sim 2$. The best-fit relation is $M/L_V = (170 \pm 30)\,T_{\rm keV}^{0.3 \pm 0.1}\,h$ at $z = 0$, with large scatter. This trend confirms predictions from cosmological simulations, which show that the richest clusters are antibiased, with a higher ratio of mass per unit light than average; the antibias increases with cluster temperature. The effect is caused by the relatively older age of the high-density clusters, whose light has declined more significantly than average since their earlier formation time. Combining the current observations with simulations, we find a global value of $M/L_V = 240 \pm 50\,h$ and a corresponding mass density of the universe of $\Omega_m = 0.17 \pm 0.05$.
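The step from the global $M/L_V$ to $\Omega_m$ follows the standard mean-density (Oort) argument: multiply the representative mass-to-light ratio by the mean luminosity density of the universe and divide by the critical density. A minimal sketch is given below, assuming a fiducial critical mass-to-light ratio $(M/L_V)_{\rm crit} \equiv \rho_c/\mathcal{L}_V \approx 1400\,h$ in solar V-band units; the abstract does not quote the adopted luminosity density, so this value is illustrative only.
% Sketch of the mean-density argument (fiducial L_V assumed, not from the abstract):
\begin{align*}
  \Omega_m &= \frac{\bar\rho_m}{\rho_c}
            = \frac{(M/L_V)\,\mathcal{L}_V}{\rho_c}
            = \frac{M/L_V}{(M/L_V)_{\rm crit}}, \\
  (M/L_V)_{\rm crit} &\equiv \frac{\rho_c}{\mathcal{L}_V}
            \approx \frac{2.8\times10^{11}\,h^{2}\,M_\odot\,{\rm Mpc}^{-3}}
                         {2\times10^{8}\,h\,L_\odot\,{\rm Mpc}^{-3}}
            \approx 1400\,h, \\
  \Omega_m &\approx \frac{(240 \pm 50)\,h}{1400\,h} \approx 0.17,
\end{align*}
consistent with the quoted $\Omega_m = 0.17 \pm 0.05$ once the uncertainties in both $M/L_V$ and the luminosity density are presumably folded in.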