Selection Bias in Observing the Cosmological Evolution of the M•–σ and M•–L Relationships
Abstract
Programs to observe evolution in the M•–σ or M•–L relations typically compare black-hole masses, M•, in high-redshift galaxies selected by nuclear activity to M• in local galaxies selected by luminosity, L, or stellar velocity dispersion, σ. Because AGN luminosity is likely to depend on M•, selection effects are different for the high-redshift and local samples, potentially producing a false signal of evolution. This bias arises because cosmic scatter in the M•–σ and M•–L relations means that the mean log10 L or log10 σ among galaxies that host a black hole of a given M• may differ substantially from the log10 L or log10 σ obtained by inverting the M•–L or M•–σ relation for the same nominal M•. The bias is particularly strong at high M•, where the luminosity and dispersion functions of galaxies are falling rapidly. The most massive black holes occur more often as rare outliers in galaxies of modest mass than in the even rarer high-mass galaxies, which would otherwise be the sole location of such black holes in the absence of cosmic scatter. Because of this bias, M• will typically appear to be too large in the distant sample for a given L or σ. For the largest black holes and the largest plausible cosmic scatter, the bias can reach a factor of 3 in M• for the M•–σ relation and a factor of 9 for the M•–L relation. Unfortunately, the actual cosmic scatter is not known well enough to correct for the bias. Measuring evolution of the relations between M• and galaxy properties requires object selection to be precisely defined and exactly the same at all redshifts.
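The statistical effect described above can be illustrated with a toy Monte Carlo experiment (not the paper's actual calculation). The sketch below assumes a steeply falling, Schechter-like velocity-dispersion function, the canonical M•–σ slope of ~4, and 0.3 dex of intrinsic scatter in log10 M• at fixed σ; all of these numbers are illustrative choices, not values from the paper. Selecting objects by black-hole mass, as an AGN survey effectively does, then yields hosts whose mean σ is lower than the σ obtained by inverting the mean relation, mimicking evolution:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2_000_000

# Toy dispersion function: draw sigma (in units of 200 km/s) from a
# gamma distribution whose tail falls steeply at high sigma.
log_sigma = np.log10(rng.gamma(shape=2.0, scale=0.5, size=n) * 100.0 / 200.0)

# Assumed mean relation: log10 M = norm + slope * log_sigma, plus
# 0.3 dex of lognormal "cosmic" scatter (illustrative values).
slope, norm, scatter = 4.0, 8.2, 0.3
log_m = norm + slope * log_sigma + rng.normal(0.0, scatter, n)

# Select massive black holes, as mass-based (AGN) selection would.
sel = log_m > 9.0

# Sigma predicted by naively inverting the mean relation at the mean
# selected mass, vs. the actual mean sigma of the selected hosts.
log_sigma_inverted = (log_m[sel].mean() - norm) / slope
log_sigma_actual = log_sigma[sel].mean()

# Apparent offset in M at fixed sigma, in dex; positive means the
# selected black holes look overmassive for their hosts.
bias_dex = slope * (log_sigma_inverted - log_sigma_actual)
print(f"apparent M-sigma offset: {bias_dex:.2f} dex "
      f"(factor {10**bias_dex:.1f} in M)")
```

Because the dispersion function falls steeply at the selection threshold, most selected objects are positive-scatter outliers in modest-σ hosts, so `bias_dex` comes out positive; its size grows with the assumed scatter and with the steepness of the tail, consistent with the abstract's statement that the bias is worst for the largest black holes.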