Superclusters from velocity divergence fields

2020 · Vol 500 (1) · pp. L32-L36
Author(s): J D Peñaranda-Rivera, D L Paipa-León, S D Hernández-Charpak, J E Forero-Romero

ABSTRACT Superclusters are a convenient way to partition and characterize the large-scale structure of the Universe. In this Letter, we explore the advantages of defining superclusters as watershed basins of the velocity divergence field. We apply this definition to diverse data sets generated from linear theory and N-body simulations, with different grid sizes, smoothing scales, and types of tracers. From this framework emerges a linear scaling relation between the average supercluster size and the autocorrelation length of the divergence field, a result that holds over one order of magnitude, from 10 up to 100 Mpc h⁻¹. These results suggest that the divergence-based definition provides a robust context to quantitatively compare results across different observational or computational frameworks. Through its connection with linear theory, it can also facilitate the exploration of how supercluster properties depend on cosmological parameters, paving the way to use superclusters as cosmological probes.
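The watershed-basin construction lends itself to a compact numerical illustration. The following is a minimal sketch, not the authors' code, of how basins of a gridded velocity-divergence field might be identified with numpy, scipy, and scikit-image; the function name, smoothing default, and toy random field are purely illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.segmentation import watershed

def supercluster_basins(vx, vy, vz, box_size, smooth_mpc=2.0):
    """Segment a gridded velocity field into watershed basins of its divergence.

    vx, vy, vz : 3D arrays of the velocity components on a regular cubic grid.
    box_size   : box side length in Mpc/h.
    smooth_mpc : Gaussian smoothing scale in Mpc/h (illustrative default).
    """
    n = vx.shape[0]
    cell = box_size / n                       # grid spacing in Mpc/h
    sigma = smooth_mpc / cell                 # smoothing scale in grid cells

    # Divergence of the smoothed field: div v = dvx/dx + dvy/dy + dvz/dz
    div = (np.gradient(gaussian_filter(vx, sigma), cell, axis=0) +
           np.gradient(gaussian_filter(vy, sigma), cell, axis=1) +
           np.gradient(gaussian_filter(vz, sigma), cell, axis=2))

    # Watershed basins of the divergence field: each basin drains toward a
    # local minimum of the divergence, i.e. a region of converging flow.
    labels = watershed(div)
    return div, labels

# Toy usage with a random field (not a real simulation):
# vx, vy, vz = (np.random.normal(size=(64, 64, 64)) for _ in range(3))
# div, labels = supercluster_basins(vx, vy, vz, box_size=100.0)
# print("number of basins:", labels.max())
```

The average basin volume from such a segmentation is the kind of supercluster-size statistic that the Letter relates to the autocorrelation length of the divergence field.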

1999 · Vol 183 · pp. 178-184
Author(s): B.J. Boyle, R.J. Smith, T. Shanks, S.M. Croom, L. Miller

The study of large-scale structure through QSO clustering provides a potentially powerful route to determining the fundamental cosmological parameters of the Universe (see Croom & Shanks 1996). Unfortunately, previous QSO clustering studies have been limited by the relatively small sizes of the homogeneous QSO catalogues available. Although approximately 10,000 QSOs are now known (Veron-Cetty & Veron 1997), the largest catalogues suitable for clustering studies contain only 500–1000 QSOs (Boyle et al. 1990, Crampton et al. 1990, Hewett et al. 1994). Even when all such catalogues are combined, the total number of QSOs usable for clustering studies is still only about 2000.
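For context, clustering studies of this kind typically rest on a two-point correlation estimator whose noise is driven by the number of available QSO pairs. The sketch below shows a standard Landy-Szalay estimator as a generic illustration, not this paper's analysis, assuming numpy and scipy are available.

```python
import numpy as np
from scipy.spatial import cKDTree

def landy_szalay(data, randoms, r_bins):
    """Landy-Szalay estimator xi(r) = (DD - 2DR + RR) / RR.

    data, randoms : (N, 3) arrays of comoving positions.
    r_bins        : array of separation bin edges."""
    nd, nr = len(data), len(randoms)
    td, tr = cKDTree(data), cKDTree(randoms)

    def pairs(tree_a, tree_b, n_norm):
        # Cumulative pair counts within each radius -> counts per bin,
        # normalised by the total number of (ordered) pairs.
        cum = tree_a.count_neighbors(tree_b, r_bins)
        return np.diff(cum) / n_norm

    dd = pairs(td, td, nd * (nd - 1))   # data-data
    rr = pairs(tr, tr, nr * (nr - 1))   # random-random
    dr = pairs(td, tr, nd * nr)         # data-random
    return (dd - 2.0 * dr + rr) / rr

# Example bins: 1-100 Mpc/h, logarithmically spaced (illustrative only)
# r_bins = np.logspace(0, 2, 11)
# xi = landy_szalay(qso_positions, random_positions, r_bins)
```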


Author(s): Yu-Cheng Chou, David Ko, Harry H. Cheng, Roger L. Davis, Bo Chen

Two challenging problems in scientific computation are long computation times and large-scale, distributed, and diverse data sets. As the scale of science and engineering applications rapidly expands, these two problems become more pronounced than ever. This paper presents the concept of Mobile Agent-based Computational Steering (MACS) for distributed simulation. MACS allows users to apply new or modified algorithms to a running application by altering selected sections of the program code, without stopping the execution or recompiling the code. The concept has been validated through an application for dynamic CFD data post-processing. The validation results show that MACS has great potential to enhance the productivity and data manageability of large-scale distributed computational systems.
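As a rough illustration of the steering idea, the hypothetical Python sketch below reloads a user-editable module between iterations of a running loop, so post-processing logic can be changed without stopping or recompiling. The module and function names are invented for this example; the actual MACS design is mobile-agent based rather than reload-based.

```python
# Minimal sketch (not the MACS implementation) of steering a running loop by
# hot-reloading user-editable code between iterations.  The module name
# `steering_rules` and its function `postprocess(state)` are hypothetical.
import importlib
import time

import steering_rules   # user-editable module defining postprocess(state)

def run_simulation(n_steps=1000):
    state = {"step": 0, "field": [0.0] * 64}
    for step in range(n_steps):
        state["step"] = step
        # ... advance the solver here ...

        # Pick up any edits the user saved to steering_rules.py since the
        # last step: the application keeps executing while its
        # post-processing logic is swapped out underneath it.
        importlib.reload(steering_rules)
        steering_rules.postprocess(state)

        time.sleep(0.1)   # stand-in for real computation

if __name__ == "__main__":
    run_simulation()
```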


Author(s): Dave Higdon, Katrin Heitmann, Charles Nakhleh, Salman Habib

This article focuses on the use of a Bayesian approach that combines simulations and physical observations to estimate cosmological parameters. It begins with an overview of the Λ-cold dark matter (ΛCDM) model, the simplest cosmological model in agreement with the cosmic microwave background (CMB) and large-scale structure analyses. The ΛCDM model is determined by a small number of parameters that control the composition, expansion, and fluctuations of the universe. The present study aims to learn the values of these parameters using measurements from the Sloan Digital Sky Survey (SDSS). Computationally intensive simulation results are combined with the SDSS measurements to infer a subset of the parameters that control the ΛCDM model. The article also describes a statistical framework used to determine a posterior distribution for these cosmological parameters and concludes by showing how the framework can be extended to include data from diverse sources.
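A schematic of this kind of simulation-plus-data inference is sketched below: a Gaussian-process emulator is fitted to a small design of simulation runs, and a simple Metropolis sampler then explores the posterior of the input parameters. The toy simulator, parameter ranges, and mock measurement are placeholders, not the article's actual analysis or SDSS data.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def toy_simulator(theta):
    """Stand-in for an expensive simulation: maps parameters to a summary."""
    omega_m, sigma_8 = theta
    return sigma_8 * omega_m ** 0.5          # crude growth-like amplitude

# Design of simulation runs and their outputs
design = rng.uniform([0.1, 0.6], [0.5, 1.0], size=(40, 2))
outputs = np.array([toy_simulator(t) for t in design])

# Emulator trained on the simulation design
emulator = GaussianProcessRegressor(kernel=RBF(length_scale=0.2))
emulator.fit(design, outputs)

y_obs, sigma_obs = 0.45, 0.02                # mock measurement and its error

def log_posterior(theta):
    if not (0.1 < theta[0] < 0.5 and 0.6 < theta[1] < 1.0):
        return -np.inf                       # flat prior on the design box
    pred = emulator.predict(theta.reshape(1, -1))[0]
    return -0.5 * ((y_obs - pred) / sigma_obs) ** 2

# Simple Metropolis sampler over (omega_m, sigma_8)
theta, chain = np.array([0.3, 0.8]), []
lp = log_posterior(theta)
for _ in range(5000):
    prop = theta + rng.normal(scale=0.02, size=2)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta.copy())

print("posterior mean:", np.mean(chain, axis=0))
```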


Author(s): Malcolm S. Longair

Since 1980, our empirical knowledge of the universe has advanced tremendously and precision cosmology has become a reality. These developments have been largely technology-driven: the result of increased computer power, new generations of telescopes for all wavebands, new types of semiconductor detectors such as CCDs, and major investments by many nations in superb observing facilities. The discipline has also benefited from the influx of experimental and theoretical physicists into the cosmological arena. The accuracy and reliability of the values of the cosmological parameters have improved dramatically, many of them now being known to about 1%. The ΛCDM model provides a remarkable fit to all the observational data, demonstrating that the cosmological constant is non-zero and that the global geometry of the universe is flat. The underlying physics of galaxy and large-scale structure formation has advanced dramatically and has demonstrated the key roles played by dark matter and dark energy.


Geophysics · 2002 · Vol 67 (1) · pp. 204-211
Author(s): Pascal Audigane, Jean-Jacques Royer, Hideshi Kaieda

Hydraulic fracturing is a common procedure to increase the permeability of a reservoir. It consists of injecting high-pressure fluid into pilot boreholes. These hydraulic tests locally induce seismic emission (microseismicity), from which large-scale permeability estimates can be derived by assuming a diffusion-like process of the pore pressure into the surrounding stimulated rock. The procedure is applied to six data sets collected in the vicinity of two geothermal sites, at Soultz (France) and Ogachi (Japan). The results show that the method is adequate to estimate large-scale permeability tensors at different depths in the reservoir. The approach yields pre-fracturing permeabilities of the medium that are compatible with in situ measurements. By using a line-source formulation of the diffusion equation rather than the classical point-source approach, improvements are proposed to account for situations where the injection is performed over a well section. Applied to successive fluid-injection tests, the technique indicates an increase in permeability by an order of magnitude. The underestimates observed in some cases are attributed to the difference in the scales at which permeability is estimated (roughly 1 km³, corresponding to the seismically active rock volume, compared with a few metres around the well for pumping or pressure-oscillation tests). One advantage of the proposed method is that it provides permeability tensor estimates at the reservoir scale.
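The diffusion-based estimate can be illustrated schematically: the envelope of microseismic event distances versus time after injection start is fitted with a triggering front r(t) = sqrt(4πDt) to obtain a hydraulic diffusivity D, which an assumed poroelastic relation then converts to permeability. The sketch below is generic (a point-source picture, not the paper's line-source formulation), and the modulus and viscosity values are placeholders.

```python
# Rough sketch (not the paper's workflow) of a diffusion-front permeability
# estimate from microseismicity.  The poroelastic modulus N and fluid
# viscosity eta below are illustrative placeholder values only.
import numpy as np

def fit_diffusivity(t_sec, r_m, quantile=0.95):
    """Fit D [m^2/s] so that r = sqrt(4*pi*D*t) envelopes the event cloud."""
    # Diffusivity that would place the triggering front exactly at each event:
    d_event = r_m ** 2 / (4.0 * np.pi * t_sec)
    # Use a high quantile as the enveloping (front) diffusivity.
    return np.quantile(d_event, quantile)

def permeability_from_diffusivity(D, eta=1.0e-3, N=1.0e10):
    """k = D * eta / N, with eta [Pa s] and N [Pa] assumed, not measured."""
    return D * eta / N

# t = event times since injection start [s], r = distances to injection [m]
# D = fit_diffusivity(t, r)
# k = permeability_from_diffusivity(D)   # in m^2 (1 darcy ~ 1e-12 m^2)
```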


2020 · Vol 495 (2) · pp. 1613-1640
Author(s): Mehdi Rezaie, Hee-Jong Seo, Ashley J Ross, Razvan C Bunescu

ABSTRACT Robust measurements of cosmological parameters from galaxy surveys rely on our understanding of systematic effects that impact the observed galaxy density field. In this paper, we present, validate, and implement the idea of adopting the systematics mitigation method of artificial neural networks for modelling the relationship between the target galaxy density field and various observational realities including but not limited to Galactic extinction, seeing, and stellar density. Our method by construction allows a wide class of models and alleviates overtraining by performing k-fold cross-validation and dimensionality reduction via backward feature elimination. By permuting the choice of the training, validation, and test sets, we construct a selection mask for the entire footprint. We apply our method on the extended Baryon Oscillation Spectroscopic Survey (eBOSS) Emission Line Galaxies (ELGs) selection from the Dark Energy Camera Legacy Survey (DECaLS) Data Release 7 and show that the spurious large-scale contamination due to imaging systematics can be significantly reduced by up-weighting the observed galaxy density using the selection mask from the neural network and that our method is more effective than the conventional linear and quadratic polynomial functions. We perform extensive analyses on simulated mock data sets with and without systematic effects. Our analyses indicate that our methodology is more robust to overfitting compared to the conventional methods. This method can be utilized in the catalogue generation of future spectroscopic galaxy surveys such as eBOSS and Dark Energy Spectroscopic Instrument (DESI) to better mitigate observational systematics.
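The core of such a neural-network mitigation scheme can be sketched in a few lines: regress the observed pixelized galaxy density on the imaging attributes using out-of-fold predictions from k-fold cross-validation, then up-weight each pixel by the inverse of its predicted contamination. The snippet below is a simplified stand-in using scikit-learn, not the released eBOSS pipeline, and it omits the backward feature elimination step.

```python
import numpy as np
from sklearn.model_selection import KFold
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

def systematics_weights(features, counts, n_folds=5):
    """features : (n_pix, n_maps) imaging attributes (extinction, seeing, ...)
    counts   : (n_pix,) observed galaxy counts per pixel."""
    target = counts / counts.mean()                  # relative density
    weights = np.zeros_like(target)

    for train, test in KFold(n_splits=n_folds, shuffle=True,
                             random_state=0).split(features):
        scaler = StandardScaler().fit(features[train])
        model = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000,
                             random_state=0)
        model.fit(scaler.transform(features[train]), target[train])
        pred = model.predict(scaler.transform(features[test]))
        # Up-weight pixels where imaging conditions suppressed the density;
        # clipping keeps extreme predictions from dominating.
        weights[test] = 1.0 / np.clip(pred, 0.5, 2.0)

    return weights
```

Because every pixel is weighted by a model that never saw it during training, the scheme guards against the overfitting that a single global fit would allow.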


Author(s): Karl E. Misulis, Mark E. Frisse

Data science is the study of how analytics techniques can be applied to large and diverse data sets. The field is emerging because of the availability of massive data sets in both the consumer and health sectors, new machine learning and other analytics methods requiring large-scale computation, and the vital need to identify risk factors, trends, and other relationships that are not apparent when traditional analytics methods are applied to smaller structured data sets. In some organizations, the primary role of a clinical informatics professional is no longer focused on how electronic health records are used in healthcare delivery, but instead on how patient encounter information can be collected efficiently, aggregated with information from other encounters or sources, and analyzed to improve our understanding of how population studies can improve the care of individuals. Such an understanding is critical to improving care quality and lowering healthcare costs.


2013 · Vol 22 (10) · pp. 1350065
Author(s): Stefano Viaggiu, Marco Montuori

The distribution of matter in the universe shows a complex pattern, formed by clusters of galaxies, voids, and filaments, collectively known as the cosmic web. Different approaches have been proposed to model this structure in the framework of general relativity. Recently, one of us proposed a generalization (the ΛFB model) of the Fractal Bubble (FB) model of Wiltshire, which accounts for such large-scale structure. The ΛFB model is an evolution of the FB model and consistently includes a description of the inhomogeneous matter distribution together with a Λ term. Here, we analyze the ΛFB model, focusing on the relations between cosmological parameters. The main result is the consistency of the ΛCDM values ΩΛ0 ≈ 0.7 and ∣Ωk0∣ ≲ 0.01 with a large fraction of voids. This allows us to quantify the extent to which the inhomogeneous structure could account for the Λ constant consistently with standard values of the other cosmological parameters.


2020 · Vol 500 (3) · pp. 3838-3853
Author(s): Fuyu Dong, Yu Yu, Jun Zhang, Xiaohu Yang, Pengjie Zhang

ABSTRACT The integrated Sachs–Wolfe (ISW) effect is caused by the decay of the cosmological gravitational potential and is therefore a unique probe of dark energy. However, its robust detection is still problematic: various tensions exist between different data sets, between different large-scale structure (LSS) tracers, and between data and the ΛCDM theory prediction. We propose a novel method of ISW measurement that cross-correlates the cosmic microwave background (CMB) with the LSS traced by ‘low-density positions’ (LDPs). It isolates the ISW effect generated by low-density regions of the universe while remaining insensitive to selection effects associated with voids. We apply it to the DR8 galaxy catalogue of the DESI Legacy imaging surveys and obtain LDPs at z ≤ 0.6 over ∼20 000 deg² of sky coverage. We then cross-correlate with the Planck temperature map and detect the ISW effect at 3.2σ. We further compare the measurement with numerical simulations of the concordance ΛCDM cosmology and find the ISW amplitude parameter AISW = 1.14 ± 0.38 when we adopt an LDP definition radius Rs = 3 arcmin, fully consistent with the prediction of the standard ΛCDM cosmology (AISW = 1). This agreement with ΛCDM cosmology holds for all the galaxy samples and values of Rs that we have investigated. Furthermore, the S/N is comparable to that of galaxy-based ISW measurements. These results demonstrate that the LDP method is a competitive alternative to existing ISW measurement methods and provides independent checks on existing tensions.
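The basic cross-correlation step behind such a measurement can be sketched as follows, assuming healpy and placeholder map files: build the LDP overdensity map inside the survey footprint and compute its pseudo cross power spectrum with the CMB temperature map. This is a schematic outline, not the authors' pipeline.

```python
# Schematic sketch of a CMB x LDP cross-correlation.  File names and the
# footprint mask are placeholders, not real data products.
import healpy as hp
import numpy as np

cmb = hp.read_map("planck_smica_T.fits")            # placeholder file name
ldp = hp.read_map("ldp_counts_Rs3arcmin.fits")      # placeholder file name
mask = hp.read_map("footprint_mask.fits") > 0.5

# LDP overdensity inside the footprint, zero outside
delta = np.zeros_like(ldp)
delta[mask] = ldp[mask] / ldp[mask].mean() - 1.0
cmb = np.where(mask, cmb - cmb[mask].mean(), 0.0)

# Pseudo cross power spectrum, roughly corrected for the sky fraction
f_sky = mask.mean()
cl_cross = hp.anafast(delta, cmb, lmax=200) / f_sky

# The ISW amplitude A_ISW is then obtained by fitting cl_cross (or its
# configuration-space counterpart) against the LCDM prediction.
```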


Webology · 2021 · Vol 18 (2) · pp. 462-474
Author(s): Marischa Elveny, Mahyuddin KM Nasution, Muhammad Zarlis, Syahril Efendi

Business intelligence can be described as the set of techniques and tools for acquiring and transforming raw data into meaningful, useful information for business analysis. This study aims to build business intelligence to optimize large-scale data based on e-metrics. E-metrics are data created from electronic-based customer behaviour. As more and more large data sets become available, the challenge of analyzing them grows accordingly. Business intelligence therefore faces new challenges, but also interesting opportunities, such as describing the needs of the market in real time. Optimization is done using adaptive multivariate regression, which can address high-dimensional data, produce accurate predictions of the response variables, and build continuous models whose knots are chosen by the smallest GCV value; large and diverse data are first simplified and then modelled according to the level of behavioural similarity and basic measurements of distances, attributes, times, places, and transactions between social actors. Customer purchases represent each preferred behaviour, and a formula is used to calculate a score for each customer from seven input variables. Adaptive multivariate regression searches the customer behaviour to obtain the deviation cut-off that is the determining factor for performance on the data. The results show the strategies and information needed for a sustainable business: merchants who sell fast food or run food stalls are more in demand by customers.
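Assuming the "adaptive multivariate regression" here refers to a MARS-style spline regression, the toy sketch below shows the basic ingredients named in the abstract: hinge basis functions placed at candidate knots and knot selection by the smallest generalised cross-validation (GCV) score. It is a single-variable illustration only, not the study's model.

```python
# Toy, single-variable MARS-style fit: place hinge functions max(0, x - t)
# and max(0, t - x) at candidate knots t and keep the knot with the
# smallest GCV score.  Real MARS adds terms greedily over many variables.
import numpy as np

def gcv(y, y_hat, n_params, penalty=3.0):
    """GCV = (RSS/n) / (1 - C(M)/n)^2 with C(M) = penalty * n_params."""
    n = len(y)
    rss = np.sum((y - y_hat) ** 2)
    return (rss / n) / (1.0 - penalty * n_params / n) ** 2

def best_single_knot(x, y, candidate_knots):
    """Pick the hinge-function knot with the smallest GCV score."""
    best = None
    for t in candidate_knots:
        # Basis: intercept, max(0, x - t), max(0, t - x)
        X = np.column_stack([np.ones_like(x),
                             np.maximum(0.0, x - t),
                             np.maximum(0.0, t - x)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        score = gcv(y, X @ beta, n_params=X.shape[1])
        if best is None or score < best[0]:
            best = (score, t, beta)
    return best   # (gcv_score, knot, coefficients)

# x = customer behaviour score, y = response (e.g. purchase amount)
# score, knot, beta = best_single_knot(x, y, np.linspace(x.min(), x.max(), 25))
```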

