UV background fluctuations and three-point correlations in the large-scale clustering of the Lyman α forest

2019
Vol 487 (4)
pp. 5346-5362
Author(s):
Suk Sien Tie
David H. Weinberg
Paul Martini
Wei Zhu
Sébastien Peirani
...

ABSTRACT Using the Lyman α (Lyα) Mass Association Scheme, we make theoretical predictions for the three-dimensional three-point correlation function (3PCF) of the Lyα forest at redshift z = 2.3. We bootstrap results from the (100 h⁻¹ Mpc)³ Horizon hydrodynamic simulation to a (1 h⁻¹ Gpc)³ N-body simulation, considering both a uniform ultraviolet background (UVB) and a fluctuating UVB sourced by quasars with a comoving number density nq ≈ 10⁻⁵ h³ Mpc⁻³ placed either in massive haloes or randomly. On scales of 10–30 h⁻¹ Mpc, the flux 3PCF displays hierarchical scaling with the square of the two-point correlation function (2PCF), but with an unusual value of Q ≡ ζ₁₂₃/(ξ₁₂ξ₁₃ + ξ₁₂ξ₂₃ + ξ₁₃ξ₂₃) ≈ −4.5 that reflects the low bias of the Lyα forest and the anticorrelation between mass density and transmitted flux. For halo-based quasars and an ionizing photon mean free path of λ = 300 h⁻¹ Mpc comoving, UVB fluctuations moderately depress the 2PCF and 3PCF, with cancelling effects on Q. For λ = 100 or 50 h⁻¹ Mpc, UVB fluctuations substantially boost the 2PCF and 3PCF on large scales, shifting the hierarchical ratio to Q ≈ −3. We scale our simulation results to derive a rough estimate of the detectability of the 3PCF in current and future observational data sets for the redshift range z = 2.1–2.6. At r = 10 and 20 h⁻¹ Mpc, we predict a signal-to-noise ratio (SNR) of ∼9 and ∼7, respectively, for both the Baryon Oscillation Spectroscopic Survey (BOSS) and the extended BOSS (eBOSS), and ∼37 and ∼25 for the Dark Energy Spectroscopic Instrument (DESI). At r = 40 h⁻¹ Mpc the predicted SNR is lower by a factor of ∼3–5. Measuring the flux 3PCF would provide a novel test of the conventional paradigm of the Lyα forest and help separate the contributions of UVB fluctuations and density fluctuations to Lyα forest clustering, thereby solidifying its foundation as a tool of precision cosmology.
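For readers who want to see the hierarchical ratio in action, the short Python sketch below evaluates Q for a single triangle configuration. Only the definition of Q comes from the abstract above; the power-law 2PCF amplitude and slope are illustrative placeholders, not values from the paper.

```python
def hierarchical_ratio(zeta_123, xi_12, xi_13, xi_23):
    """Q = zeta_123 / (xi_12*xi_13 + xi_12*xi_23 + xi_13*xi_23),
    the hierarchical ratio defined in the abstract above."""
    return zeta_123 / (xi_12 * xi_13 + xi_12 * xi_23 + xi_13 * xi_23)

# Toy power-law 2PCF of the transmitted flux; amplitude and slope are
# placeholders chosen for illustration, not fitted to the simulations.
def xi_flux(r, r0=2.0, gamma=1.6):
    return (r / r0) ** (-gamma)

# Equilateral triangle with 20 h^-1 Mpc sides.
r = 20.0
xi = xi_flux(r)
zeta = -4.5 * 3.0 * xi ** 2                    # the 3PCF implied by Q = -4.5
print(hierarchical_ratio(zeta, xi, xi, xi))    # -> -4.5
```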

1994
Vol 161
pp. 295-300
Author(s):
R. Fong
N. Metcalfe
T. Shanks

Machine measurements of UK Schmidt plates have produced two very large galaxy surveys, the APM survey and the Edinburgh-Durham Southern Galaxy Catalogue (or COSMOS survey). These surveys can constrain the power on large scales of ≳ 10 h⁻¹ Mpc better than current redshift surveys, simply because such large numbers of galaxies, ≳ 2 million to bJ ≤ 20.5, provide very high signal-to-noise in the estimated galaxy two-point correlation function. Furthermore, the results for the three-dimensional galaxy two-point correlation function, ξ(r), obtained from the measured projected function, ω(θ), should be quite robust for reasonable model number-redshift distributions, N(z), at these magnitude limits (see, e.g., Roche et al. 1993). Another clear advantage of measuring ω(θ) is that it is unaffected by the peculiar velocities of the galaxies, whereas these have an important effect on the corresponding redshift-space correlation function, ξ(s), measured from galaxy redshift surveys.
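As a concrete illustration of the ω(θ)-ξ(r) link described above, the sketch below evaluates the small-angle Limber integral for an assumed power-law ξ(r) and a toy Gaussian radial selection function; the values of r0, γ, and the selection parameters are placeholders, not APM or COSMOS measurements.

```python
import numpy as np
from scipy import integrate

# Small-angle Limber relation between w(theta) and xi(r):
#   w(theta) = int dchi f(chi)^2 int dDelta xi( sqrt(Delta^2 + (chi*theta)^2) ),
# where f(chi) is the normalized radial distribution implied by N(z).

r0, gamma = 5.0, 1.8                       # h^-1 Mpc, illustrative only
chi_mean, chi_sig = 600.0, 150.0           # h^-1 Mpc, toy radial selection

def xi(r):
    return (r / r0) ** (-gamma)

def f(chi):                                # normalized: integral of f dchi = 1
    return np.exp(-0.5 * ((chi - chi_mean) / chi_sig) ** 2) / (
        chi_sig * np.sqrt(2.0 * np.pi))

def w_theta(theta_rad):
    inner = lambda chi: integrate.quad(
        lambda d: xi(np.hypot(d, chi * theta_rad)),
        -200.0, 200.0, points=[0.0])[0]
    return integrate.quad(lambda chi: f(chi) ** 2 * inner(chi),
                          chi_mean - 4 * chi_sig, chi_mean + 4 * chi_sig)[0]

for theta_deg in (0.1, 0.5, 1.0):
    print(theta_deg, w_theta(np.radians(theta_deg)))
```

For a pure power law this integral reproduces the familiar scaling w(θ) ∝ θ^(1−γ), which is why a measured ω(θ) slope translates so directly into a slope for ξ(r).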


2006
Vol 15 (08)
pp. 1199-1215
Author(s):
T. Goldman
Juan Pérez-Mercader

We study the common relationships that exist between the various structures in the Universe, and show that a unifying description appears when these are considered as emerging from dynamical critical phenomena characterized by complex exponents in the two-point correlation function of matter density fluctuations. Since gravity drives their formation, structures are more likely to form where there is maximal correlation in the matter density. Applying this simple principle of maximal correlation to the two-point correlation function in a scaling regime with complex exponents leads to a hierarchy of structures where: (1) the structures can be classified according to an integer and (2) there is a common real exponent for the two-point correlation function across the range of structures. This in turn implies the existence of both universal size and mass hierarchy-order relationships. We show that these relationships are in good agreement with observations, and that sizes and masses for the known structures, from Globules in the Interstellar Medium to Clusters of Galaxies, can be classified (essentially to within one order of magnitude out of more than 10 orders of magnitude) in terms of just three constants.
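To make the role of complex exponents concrete, the sketch below evaluates a 2PCF of the form ξ(r) ∝ r^(−α) cos(ω ln r + φ), whose log-periodic factor repeats whenever r grows by λ = exp(2π/ω), so maximally correlated scales form a geometric hierarchy labelled by an integer n with a universal size ratio between levels. The values of α, ω, and r_ref here are illustrative placeholders, not the constants derived in the paper.

```python
import numpy as np

# Real part of a 2PCF with complex scaling exponent gamma = alpha + i*omega:
#   xi(r) ~ r**(-alpha) * cos(omega * ln(r) + phase)
alpha, omega, phase = 1.8, 1.5, 0.0        # placeholder values

def xi(r):
    return r ** (-alpha) * np.cos(omega * np.log(r) + phase)

lam = np.exp(2.0 * np.pi / omega)          # size ratio between consecutive levels

# Discrete scale invariance: rescaling r by lam only rescales the amplitude.
r = np.array([0.5, 1.0, 5.0, 20.0])
print(np.allclose(xi(lam * r), lam ** (-alpha) * xi(r)))   # True

# Hierarchy of preferred scales r_n = r_ref * lam**n (arbitrary units).
r_ref = 1.0e-2
for n in range(5):
    print(n, r_ref * lam ** n)
```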


1983
Vol 104
p. 175
Author(s):
J. Bean
G. Efstathiou
R. S. Ellis
B. A. Peterson
T. Shanks
...

The aim of the survey is to sample a relatively large, randomly chosen volume of the Universe in order to study the large-scale distribution of galaxies through the two-point correlation function and the peculiar velocities between galaxy pairs, and to provide an estimate of the galaxian luminosity function that is unaffected by density inhomogeneities and Virgo infall.


2019
Vol 491 (3)
pp. 3290-3317
Author(s):
Oliver H. E. Philcox
Daniel J. Eisenstein
Ross O’Connell
Alexander Wiegand

ABSTRACT To make use of clustering statistics from large cosmological surveys, accurate and precise covariance matrices are needed. We present a new code to estimate large-scale galaxy two-point correlation function (2PCF) covariances in arbitrary survey geometries that, due to new sampling techniques, runs ∼10⁴ times faster than previous codes, computing finely binned covariance matrices with negligible noise in less than 100 CPU-hours. As in previous works, non-Gaussianity is approximated via a small rescaling of shot noise in the theoretical model, calibrated by comparing jackknife survey covariances to an associated jackknife model. The flexible code, RascalC, has been publicly released, and automatically takes care of all necessary pre- and post-processing, requiring only a single input data set (without a prior 2PCF model). Deviations between large-scale model covariances from a mock survey and those from a large suite of mocks are found to be indistinguishable from noise. In addition, the choice of input mock is shown to be irrelevant for desired noise levels below that of ∼10⁵ mocks. Coupled with its generalization to multitracer data sets, this shows the algorithm to be an excellent tool for analysis, reducing the need for large numbers of mock simulations to be computed.
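The jackknife calibration step mentioned above compares a model against a survey covariance built from delete-one jackknife regions. The sketch below implements the textbook delete-one jackknife covariance for a binned 2PCF; it is not the internals of RascalC, and the array shapes and random inputs are assumptions for illustration only.

```python
import numpy as np

def jackknife_covariance(xi_jk):
    """Delete-one jackknife covariance of a binned 2PCF.

    xi_jk : array of shape (n_regions, n_bins), one 2PCF measurement per
            jackknife region (i.e. with that region removed).
    """
    xi_jk = np.asarray(xi_jk, dtype=float)
    n_jk = xi_jk.shape[0]
    delta = xi_jk - xi_jk.mean(axis=0)
    # The (n_jk - 1)/n_jk prefactor accounts for the strong correlation
    # between delete-one resamples.
    return (n_jk - 1.0) / n_jk * delta.T @ delta

# Toy usage with random numbers standing in for real measurements.
rng = np.random.default_rng(0)
fake_xi = rng.normal(size=(100, 20))       # 100 regions, 20 radial bins
cov = jackknife_covariance(fake_xi)
print(cov.shape)                           # (20, 20)
```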


2017
Vol 468 (1)
pp. 1070-1083
Author(s):
Zachary Slepian
Daniel J. Eisenstein
Florian Beutler
Chia-Hsun Chuang
Antonio J. Cuesta
...

2017
Vol 469 (2)
pp. 1738-1751
Author(s):
Zachary Slepian
Daniel J. Eisenstein
Joel R. Brownstein
Chia-Hsun Chuang
Héctor Gil-Marín
...
