Programs to observe evolution in the $M_\bullet$--$\sigma$ or $M_\bullet$--$L$ relations typically compare black-hole masses, $M_\bullet$, in high-redshift galaxies selected by nuclear activity to $M_\bullet$ in local galaxies selected by luminosity $L$ or stellar velocity dispersion $\sigma$. Because AGN luminosity is likely to depend on $M_\bullet$, selection effects differ between the high-redshift and local samples, potentially producing a false signal of evolution. This bias arises because cosmic scatter in the $M_\bullet$--$\sigma$ and $M_\bullet$--$L$ relations means that the mean $\log_{10} L$ or $\log_{10}\sigma$ among galaxies that host a black hole of given $M_\bullet$ may differ substantially from the $\log_{10} L$ or $\log_{10}\sigma$ obtained by inverting the $M_\bullet$--$L$ or $M_\bullet$--$\sigma$ relation for the same nominal $M_\bullet$. The bias is particularly strong at high $M_\bullet$, where the luminosity and dispersion functions of galaxies fall rapidly. The most massive black holes occur more often as rare outliers in galaxies of modest mass than in the even rarer high-mass galaxies, which would be the sole hosts of such black holes in the absence of cosmic scatter. Because of this bias, $M_\bullet$ will typically appear too large in the distant sample for a given $L$ or $\sigma$. For the largest black holes and the largest plausible cosmic scatter, the bias can reach a factor of 3 in $M_\bullet$ for the $M_\bullet$--$\sigma$ relation and a factor of 9 for the $M_\bullet$--$L$ relation. Unfortunately, the actual cosmic scatter is not known well enough to correct for the bias. Measuring evolution in the relations between $M_\bullet$ and galaxy properties therefore requires object selection that is precisely defined and exactly the same at all redshifts.
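The mechanism behind this bias is easy to demonstrate with a toy Monte Carlo. The sketch below is not taken from the paper; the dispersion distribution, ridge-line zero-point and slope, scatter value, and mass cuts are all illustrative assumptions chosen only to have roughly the right order of magnitude. It draws host dispersions from a distribution that falls steeply at the high end, assigns black-hole masses through an assumed $M_\bullet$--$\sigma$ ridge line with lognormal cosmic scatter, selects objects by $M_\bullet$ (as an AGN-luminosity selection effectively does), and compares the mean selected mass with the mass the ridge line predicts from the hosts' mean dispersion.

```python
import numpy as np

rng = np.random.default_rng(42)

# --- Illustrative, assumed ingredients (not the paper's exact choices) ---
# Host dispersion function: log10(sigma / km s^-1) drawn from a Gaussian,
# so the number of high-dispersion galaxies falls steeply.
n = 2_000_000
log_sigma = rng.normal(loc=2.0, scale=0.15, size=n)

# Assumed M-sigma ridge line: log10(M/Msun) = a + b*log10(sigma/200 km s^-1),
# plus lognormal cosmic scatter (in dex) -- the key unknown in the argument.
a, b = 8.1, 4.0
scatter = 0.3
log_m = a + b * (log_sigma - np.log10(200.0)) + rng.normal(0.0, scatter, n)

# "AGN-like" selection: keep galaxies above a black-hole mass cut, mimicking
# a high-redshift sample selected on nuclear luminosity.
for m_cut in (8.5, 9.0, 9.5):
    sel = log_m > m_cut
    mean_m = log_m[sel].mean()                                # mean selected mass
    ridge_m = a + b * (log_sigma[sel].mean() - np.log10(200.0))  # mass implied by hosts' sigma
    print(f"cut at 1e{m_cut} Msun: apparent offset = {mean_m - ridge_m:+.2f} dex")
```

Under these assumptions the offset is positive and grows with the mass cut, reaching several tenths of a dex (a factor of a few in $M_\bullet$) for the highest cut: most galaxies above the cut got there by scattering up from the abundant modest-dispersion population, so at fixed $\sigma$ the selected sample looks overmassive even though no evolution was put in.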