Determining the scaling relations between galaxy cluster observables requires large samples of uniformly observed clusters. We measure the mean X-ray luminosity--optical richness ($\bar{L}_X$--$\bar{N}_{200}$) relation for an approximately volume-limited sample of more than 17,000 optically selected clusters from the maxBCG catalog spanning the redshift range $0.1 < z < 0.3$. By stacking the X-ray emission from many clusters using ROSAT All-Sky Survey data, we measure mean X-ray luminosities to $\sim$10\% (including systematic errors) for clusters in nine independent optical richness bins. In addition, we crudely measure the individual X-ray emission from $\sim$800 of the richest clusters. Assuming a log-normal form for the scatter in the $L_X$--$N_{200}$ relation, we measure $\sigma_{\ln L} = 0.86 \pm 0.03$ at fixed $N_{200}$. This scatter is large enough to significantly bias the mean stacked relation. The corrected median relation can be parameterized by $\widetilde{L}_X = e^{\alpha} (\bar{N}_{200}/40)^{\beta}\, 10^{42}\ h^{-2}\ \mathrm{ergs\ s^{-1}}$, where $\alpha = 3.57 \pm 0.08$ and $\beta = 1.82 \pm 0.05$. We find that X-ray selected clusters are significantly brighter than optically selected clusters at a given optical richness. This selection bias explains the apparently X-ray underluminous nature of optically selected cluster catalogs.
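
As a quick illustration of the best-fit normalization (a numerical evaluation of the quoted parameters, not an additional result), at the pivot richness $\bar{N}_{200} = 40$ the power-law term is unity and the median relation reduces to
\[
\widetilde{L}_X \approx e^{3.57} \times 10^{42}\ h^{-2}\ \mathrm{ergs\ s^{-1}} \simeq 3.6 \times 10^{43}\ h^{-2}\ \mathrm{ergs\ s^{-1}},
\]
i.e., a cluster of richness $N_{200} = 40$ has a median X-ray luminosity of a few $\times 10^{43}\ h^{-2}\ \mathrm{ergs\ s^{-1}}$.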