Advances in constraining intrinsic alignment models with hydrodynamic simulations

2021 ◽  
Vol 508 (1) ◽  
pp. 637-664 ◽  
Author(s):  
S Samuroff ◽  
R Mandelbaum ◽  
J Blazek

ABSTRACT We use galaxies from the IllustrisTNG, MassiveBlack-II, and Illustris-1 hydrodynamic simulations to investigate the behaviour of large-scale galaxy intrinsic alignments (IAs). Our analysis spans four redshift slices over the approximate range of contemporary lensing surveys, z = 0−1. We construct comparable weighted samples from the three simulations, which we then analyse using the tidal alignment and tidal torquing (TATT) model, which includes both linear and quadratic alignment contributions. Our data vector includes galaxy–galaxy, galaxy–shape, and shape–shape projected correlations, with the joint covariance matrix estimated analytically. In all of the simulations, we detect non-zero IAs at the level of several σ. For a fixed lower mass threshold, we find a relatively strong redshift dependence in all three simulations, with the linear IA amplitude increasing by a factor of ∼2 between redshifts z = 0 and z = 1. We report no significant evidence for non-zero values of the tidal torquing amplitude, A2, in TNG, above statistical uncertainties, although MBII favours a moderately negative A2 ∼ −2. Examining the properties of the TATT model as a function of colour, luminosity and galaxy type (satellite or central), our findings are consistent with the most recent measurements on real data. We also outline a novel method for constraining the TATT model parameters directly from the pixelized tidal field, alongside a proof-of-concept exercise using TNG. This technique is shown to be promising, although comparison with previous results obtained via other methods is non-trivial.
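
As a sketch of the fitting step described above: at fixed cosmology, the measured alignment correlations can be modeled as theory template curves scaled by the amplitudes A1 and A2, which a generalized least-squares fit then constrains. The templates, data vector, and covariance below are hypothetical placeholders, not the paper's actual pipeline.

```python
import numpy as np

# Minimal sketch of the amplitude-fitting step, assuming the measured
# galaxy-shape correlation is modeled as a linear combination of
# precomputed theory templates: w_g+(r) = A1*T1(r) + A2*T2(r).
# T1, T2, the data vector d, and the covariance C are placeholders.

rng = np.random.default_rng(0)
r = np.logspace(0, 2, 15)            # projected separations [Mpc/h]
T1 = r**-0.8                          # placeholder linear-alignment template
T2 = 0.3 * r**-1.1                    # placeholder tidal-torquing template
C = np.diag((0.05 * T1)**2 + 1e-6)   # placeholder analytic covariance

A_true = np.array([3.0, -0.5])
d = A_true[0]*T1 + A_true[1]*T2 + rng.multivariate_normal(np.zeros(r.size), C)

# Generalized least squares: A_hat = (X^T C^-1 X)^-1 X^T C^-1 d
X = np.column_stack([T1, T2])
Cinv = np.linalg.inv(C)
F = X.T @ Cinv @ X                    # Fisher matrix of the amplitudes
A_hat = np.linalg.solve(F, X.T @ Cinv @ d)
A_err = np.sqrt(np.diag(np.linalg.inv(F)))

print(f"A1 = {A_hat[0]:.2f} +/- {A_err[0]:.2f}")
print(f"A2 = {A_hat[1]:.2f} +/- {A_err[1]:.2f}")
```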

2017 ◽  
Vol 21 ◽  
pp. 369-393
Author(s):  
Nelson Antunes ◽  
Vladas Pipiras ◽  
Patrice Abry ◽  
Darryl Veitch

Poisson cluster processes are special point processes that find use in modeling Internet traffic, neural spike trains, computer failure times and other real-life phenomena. The focus of this work is on the moments and cumulants of Poisson cluster processes, and specifically on their behavior at small and large scales. Under suitable assumptions motivated by the multiscale behavior of Internet traffic, it is shown that all of these quantities satisfy scale-free (scaling) relations at both small and large scales. Only some of these relations carry information about salient model parameters of interest, and consequently can be used to infer the scaling behavior of Poisson cluster processes. At large scales, the derived results complement those available in the literature on the distributional convergence of normalized Poisson cluster processes, and also bring forward a more practical interpretation of the so-called slow and fast growth regimes. Finally, the results are applied to a real data trace from Internet traffic.
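
As an illustration of the kind of scaling relation involved (not the paper's derivation), the sketch below simulates a simple one-dimensional Poisson cluster process and estimates the scaling exponent of the variance (second cumulant) of interval counts; all parameter values are arbitrary choices for demonstration.

```python
import numpy as np

# Parents arrive as a Poisson process; each parent spawns a
# Poisson-distributed number of offspring at exponential delays.

rng = np.random.default_rng(1)
T = 1e5                      # observation horizon
lam_parent = 0.1             # intensity of the parent Poisson process
mean_offspring = 5           # mean cluster size (Poisson-distributed)
delay_scale = 2.0            # exponential spread of points within a cluster

parents = rng.uniform(0, T, rng.poisson(lam_parent * T))
points = np.concatenate([
    p + rng.exponential(delay_scale, rng.poisson(mean_offspring))
    for p in parents
])
points = points[points < T]

# Variance of counts in windows of width delta, across scales
scales = np.logspace(0, 3, 10)
variances = []
for delta in scales:
    counts, _ = np.histogram(points, bins=np.arange(0, T, delta))
    variances.append(counts.var())

# Log-log slope ~ scaling exponent of the second cumulant
slope = np.polyfit(np.log(scales), np.log(variances), 1)[0]
print(f"estimated scaling exponent of Var(N(delta)): {slope:.2f}")
```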


1996 ◽  
Vol 465 ◽  
pp. 499 ◽  
Author(s):  
J. Richard Gott, III ◽  
Renyue Cen ◽  
Jeremiah P. Ostriker

2019 ◽  
Vol XVI (2) ◽  
pp. 1-11
Author(s):  
Farrukh Jamal ◽  
Hesham Mohammed Reyad ◽  
Soha Othman Ahmed ◽  
Muhammad Akbar Ali Shah ◽  
Emrah Altun

A new three-parameter continuous model, the exponentiated half-logistic Lomax distribution, is introduced in this paper. Basic mathematical properties of the proposed model are investigated, including raw and incomplete moments, skewness, kurtosis, generating functions, Rényi entropy, the Lorenz, Bonferroni and Zenga curves, probability weighted moments, the stress-strength model, order statistics, and record statistics. The model parameters are estimated using the maximum likelihood criterion, and the behaviour of these estimates is examined through a simulation study. The applicability of the new model is illustrated by applying it to a real data set.
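
For illustration, a minimal maximum-likelihood fit is sketched below, assuming the exponentiated half-logistic-G construction F(x) = [G(x)/(2 − G(x))]^a with a Lomax baseline; this parameterization and the synthetic data are assumptions made for the sketch, not the paper's exact definitions.

```python
import numpy as np
from scipy.optimize import minimize

# Assumed EHL-Lomax construction: F(x) = [G(x)/(2 - G(x))]^a with
# Lomax baseline G(x) = 1 - (1 + x/s)^(-t), giving the density
# f(x) = 2*a*g(x)*G(x)^(a-1) / (2 - G(x))^(a+1).

def neg_log_lik(params, x):
    a, t, s = params                     # exponent, Lomax shape, Lomax scale
    if a <= 0 or t <= 0 or s <= 0:
        return np.inf
    G = 1.0 - (1.0 + x / s) ** (-t)      # Lomax CDF
    log_g = np.log(t / s) - (t + 1.0) * np.log1p(x / s)  # Lomax log-pdf
    ll = (np.log(2.0 * a) + log_g
          + (a - 1.0) * np.log(G) - (a + 1.0) * np.log(2.0 - G))
    return -np.sum(ll)

# Synthetic positive data as a stand-in for a real data set
rng = np.random.default_rng(2)
x = rng.pareto(2.5, size=500)            # Lomax(shape=2.5, scale=1) draws

res = minimize(neg_log_lik, x0=[1.0, 2.0, 1.0], args=(x,),
               method="Nelder-Mead")
print("MLE (a, t, s):", np.round(res.x, 3))
```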


2008 ◽  
Vol 59 (11) ◽  
Author(s):  
Iulia Lupan ◽  
Sergiu Chira ◽  
Maria Chiriac ◽  
Nicolae Palibroda ◽  
Octavian Popescu

Amino acids are obtained by bacterial fermentation, extraction from natural protein, or enzymatic synthesis from specific substrates. With the introduction of recombinant DNA technology, it has become possible to apply more rational approaches to the enzymatic synthesis of amino acids. Aspartase (L-aspartate ammonia-lyase) catalyzes the reversible deamination of L-aspartic acid to yield fumaric acid and ammonia. It is one of the most important industrial enzymes used to produce L-aspartic acid on a large scale. Here we describe a novel method for the synthesis of [15N]-L-aspartic acid from fumarate and ammonia (15NH4Cl) using a recombinant aspartase.


Cancers ◽  
2021 ◽  
Vol 13 (9) ◽  
pp. 2111
Author(s):  
Bo-Wei Zhao ◽  
Zhu-Hong You ◽  
Lun Hu ◽  
Zhen-Hao Guo ◽  
Lei Wang ◽  
...  

Identification of drug-target interactions (DTIs) is a significant step in the drug discovery or repositioning process. Compared with time-consuming and labor-intensive in vivo experimental methods, computational models can provide high-quality DTI candidates almost instantly. In this study, we propose a novel method called LGDTI to predict DTIs based on large-scale graph representation learning. LGDTI captures both the local and the global structural information of the graph: the first-order neighbor information of nodes is aggregated by a graph convolutional network (GCN), while the higher-order neighbor information of nodes is learned by the graph embedding method DeepWalk. Finally, the two kinds of features are fed into a random forest classifier to train and predict potential DTIs. The results show that our method obtained an area under the receiver operating characteristic curve (AUROC) of 0.9455 and an area under the precision-recall curve (AUPR) of 0.9491 under 5-fold cross-validation. Moreover, we compare the presented method with some existing state-of-the-art methods. These results imply that LGDTI can efficiently and robustly capture undiscovered DTIs. The proposed model is expected to bring new inspiration and provide novel perspectives to relevant researchers.
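
A minimal sketch of an LGDTI-style pipeline is given below, on a synthetic graph: one step of GCN-like neighbor aggregation for local structure, DeepWalk (truncated random walks plus skip-gram via gensim) for global structure, and a random forest on concatenated edge features. This is an illustrative reading of the described architecture, not the authors' code.

```python
import numpy as np
import networkx as nx
from gensim.models import Word2Vec
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
G = nx.erdos_renyi_graph(200, 0.05, seed=3)   # stand-in drug-target graph
A = nx.to_numpy_array(G)
X = rng.normal(size=(A.shape[0], 16))         # random initial node features

# Local structure: normalized first-order aggregation, H = D^-1 (A+I) X
A_hat = A + np.eye(A.shape[0])
H_local = (A_hat / A_hat.sum(axis=1, keepdims=True)) @ X

# Global structure: DeepWalk random walks fed to a skip-gram model
def random_walks(graph, num_walks=10, walk_len=20):
    walks = []
    for _ in range(num_walks):
        for node in graph.nodes():
            walk, cur = [node], node
            for _ in range(walk_len - 1):
                nbrs = list(graph.neighbors(cur))
                if not nbrs:
                    break
                cur = nbrs[rng.integers(len(nbrs))]
                walk.append(cur)
            walks.append([str(n) for n in walk])
    return walks

w2v = Word2Vec(random_walks(G), vector_size=32, window=5,
               min_count=0, sg=1, seed=3)
H_global = np.array([w2v.wv[str(n)] for n in G.nodes()])

# Edge features: concatenate endpoint features; labels are synthetic
feats = np.hstack([H_local, H_global])
edges = list(G.edges())
non_edges = list(nx.non_edges(G))[:len(edges)]
pairs = edges + non_edges
y = np.array([1] * len(edges) + [0] * len(non_edges))
Z = np.array([np.concatenate([feats[u], feats[v]]) for u, v in pairs])

clf = RandomForestClassifier(n_estimators=200, random_state=3).fit(Z, y)
print("training accuracy:", clf.score(Z, y))
```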


2021 ◽  
Vol 11 (2) ◽  
pp. 582
Author(s):  
Zean Bu ◽  
Changku Sun ◽  
Peng Wang ◽  
Hang Dong

Calibration between multiple sensors is a fundamental procedure for data fusion. To address the problems of large errors and tedious operation, we present a novel method for calibration between light detection and ranging (LiDAR) and a camera. We design a calibration target: an arbitrary triangular pyramid with a chessboard pattern on each of its three planes. The target contains both 3D and 2D information, which can be utilized to obtain the intrinsic parameters of the camera and the extrinsic parameters of the system. In the proposed method, the world coordinate system is established through the triangular pyramid. We extract the equations of the triangular pyramid planes to find the relative transformation between the two sensors. A single capture from the camera and LiDAR is sufficient for calibration, and errors are reduced by minimizing the distance between points and planes. Accuracy can be further improved with additional captures. We carried out experiments on simulated data with varying degrees of noise and numbers of frames. Finally, the calibration results were verified on real data through incremental validation and analysis of the root mean square error (RMSE), demonstrating that our calibration method is robust and provides state-of-the-art performance.
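
The point-to-plane refinement step might look like the sketch below, which recovers a rigid LiDAR-to-camera transform from synthetic points on three known planes by minimizing signed point-to-plane distances; the plane geometry and noise levels are invented for illustration.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

rng = np.random.default_rng(4)

# Three planes of a synthetic triangular pyramid in the camera frame,
# each given as (unit normal n, offset d) with n.p + d = 0
normals = np.array([[1.0, 0, 0], [0, 1.0, 0], [0, 0, 1.0]])
offsets = np.array([-1.0, -1.0, -1.0])

# Ground-truth LiDAR -> camera transform to recover
R_true = Rotation.from_euler("xyz", [5, -3, 10], degrees=True)
t_true = np.array([0.2, -0.1, 0.3])

# Sample noisy LiDAR points lying on each plane
pts, plane_ids = [], []
for i, (n, d) in enumerate(zip(normals, offsets)):
    basis = np.linalg.svd(n.reshape(1, 3))[2][1:]     # in-plane directions
    p_cam = -d * n + rng.uniform(-0.5, 0.5, (50, 2)) @ basis
    p_lid = R_true.inv().apply(p_cam - t_true)        # move to LiDAR frame
    pts.append(p_lid + rng.normal(0, 0.002, p_lid.shape))
    plane_ids += [i] * 50
pts = np.vstack(pts)
plane_ids = np.array(plane_ids)

def residuals(x):
    R = Rotation.from_rotvec(x[:3])
    p_cam = R.apply(pts) + x[3:]
    # signed point-to-plane distance for every LiDAR point
    return np.einsum("ij,ij->i", normals[plane_ids], p_cam) + offsets[plane_ids]

sol = least_squares(residuals, x0=np.zeros(6))
print("recovered t:", np.round(sol.x[3:], 3), " true t:", t_true)
```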


2021 ◽  
Vol 22 (12) ◽  
pp. 6394
Author(s):  
Jacob Spinnen ◽  
Lennard K. Shopperly ◽  
Carsten Rendenbach ◽  
Anja A. Kühl ◽  
Ufuk Sentürk ◽  
...  

For in vitro modeling of human joints, osteochondral explants represent an acceptable compromise between conventional cell culture and animal models. However, the scarcity of native human joint tissue poses a challenge for experiments requiring high numbers of samples and makes the method rather unsuitable for toxicity analyses and dosing studies. To scale their application, we developed a novel method that allows the preparation of up to 100 explant cultures from a single human sample with a simple setup. Explants were cultured for 21 days, stimulated with TNF-α or TGF-β3, and analyzed for cell viability, gene expression and histological changes. Tissue cell viability remained stable at >90% for three weeks. Proteoglycan levels and gene expression of COL2A1, ACAN and COMP were maintained for 14 days before decreasing. TNF-α and TGF-β3 caused dose-dependent changes in cartilage marker gene expression as early as 7 days. Histologically, cultures under TNF-α stimulation showed a 32% reduction in proteoglycans, detachment of collagen fibers and cell swelling after 7 days. In conclusion, thin osteochondral slice cultures behaved analogously to conventional punch explants despite cell stress exerted during fabrication. In pharmacological testing, both the shorter diffusion distance and the lack of need for serum in the culture suggest a positive effect on sensitivity. The ease of fabrication and the scalability of the sample number make this manufacturing method a promising platform for large-scale preclinical testing in joint research.


Crystals ◽  
2021 ◽  
Vol 11 (8) ◽  
pp. 885
Author(s):  
Nicole Knoblauch ◽  
Peter Mechnich

Zirconium-yttrium-co-doped ceria (Ce0.85Zr0.13Y0.02O1.99) compacts consisting of fibers with diameters in the range of 8–10 µm have been successfully prepared by direct infiltration of commercial YSZ fibers with a cerium oxide matrix and subsequent sintering. The resulting chemically homogeneous fiber compacts are sinter-resistant up to 1923 K and retain a high porosity of around 58 vol% and a permeability of 1.6–3.3 × 10⁻¹⁰ m² at a pressure gradient of 100–500 kPa. The fiber compacts show high potential for application in thermochemical redox cycling due to their fast redox kinetics. A first evaluation of the redox kinetics shows that the relaxation time of oxidation is five times shorter than that of dense samples of the same composition. The improved gas exchange due to the high porosity also allows higher reduction rates, which enable higher hydrogen yields in thermochemical water-splitting redox cycles. The presented cost-effective fiber-compact preparation method is considered very promising for manufacturing large-scale functional components for solar-thermal high-temperature reactors.


Mathematics ◽  
2021 ◽  
Vol 9 (16) ◽  
pp. 1850
Author(s):  
Rashad A. R. Bantan ◽  
Farrukh Jamal ◽  
Christophe Chesneau ◽  
Mohammed Elgarhy

Unit distributions are commonly used in probability and statistics to describe useful quantities with values between 0 and 1, such as proportions, probabilities, and percentages. Some unit distributions are defined in a natural analytical manner, while others are derived through the transformation of an existing distribution defined on a larger domain. In this article, we introduce the unit gamma/Gompertz distribution, founded on the inverse-exponential scheme and the gamma/Gompertz distribution. The gamma/Gompertz distribution is known to be a very flexible three-parameter lifetime distribution, and we aim to transpose this flexibility to the unit interval. First, we check this aspect through the analytical behavior of the primary functions. It is shown that the probability density function can be increasing, decreasing, "increasing-decreasing" and "decreasing-increasing", with pliant asymmetric properties. On the other hand, the hazard rate function can have monotonically increasing, decreasing, or constant shapes. We complete the theoretical part with some propositions on stochastic ordering, moments, quantiles, and the reliability coefficient. Practically, to estimate the model parameters from unit data, the maximum likelihood method is used. We present some simulation results to evaluate this method. Two applications using real data sets, one on trade shares and the other on flood levels, demonstrate the importance of the new model when compared to other unit models.
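
A small sketch of the construction is given below, reading the "inverse-exponential scheme" as the transformation X = exp(−Y) of a gamma/Gompertz variable Y; the CDF used for inverse-transform sampling and the parameter values are assumptions for illustration, not the paper's exact definitions.

```python
import numpy as np

# Assumed gamma/Gompertz CDF: F(y) = 1 - b0^s / (b0 - 1 + exp(b*y))^s,
# so that X = exp(-Y) lives on the unit interval (0, 1).

rng = np.random.default_rng(5)
b, s, b0 = 1.0, 0.5, 2.0          # arbitrary illustrative parameters

# Inverse-CDF sampling of Y: solving F(y) = u gives
# y = log(b0*(1-u)^(-1/s) - b0 + 1) / b
u = rng.uniform(size=100_000)
y = np.log(b0 * (1.0 - u) ** (-1.0 / s) - b0 + 1.0) / b
x = np.exp(-y)                    # unit-interval variable

print(f"range: ({x.min():.4g}, {x.max():.4g})")
print(f"sample mean: {x.mean():.4f}, sample variance: {x.var():.4f}")
```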


2021 ◽  
Vol 11 (15) ◽  
pp. 6998
Author(s):  
Qiuying Li ◽  
Hoang Pham

Many NHPP software reliability growth models (SRGMs) have been proposed to assess software reliability over the past 40 years, but most treat the fault detection process (FDP) in one of two ways. The first is to ignore the fault correction process (FCP), i.e., to assume that faults are removed instantaneously once the failures they cause are detected. In real software development this assumption is unrealistic: fault removal takes time, detected faults cannot always be removed at once, and detected failures become increasingly difficult to correct as testing progresses. The second is to model the fault correction process through a time delay between fault detection and correction, where the delay has been assumed to be a constant, a function of time, or a random variable following some distribution. In this paper, some useful approaches to modeling the dual fault detection and correction processes are discussed. Rather than a correction time delay, the dependence between the fault counts of the two processes is considered. A model is proposed that integrates the fault detection and fault correction processes and incorporates a fault introduction rate and a testing coverage rate into the software reliability evaluation. The model parameters are estimated using the least squares estimation (LSE) method. The descriptive and predictive performance of the proposed model and of existing NHPP SRGMs is investigated on three real data sets using four criteria. The results show that the new model yields significantly better reliability estimation and prediction.
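
As a generic illustration of LSE fitting for an NHPP SRGM (the paper's dual detection/correction model is more elaborate), the sketch below fits the classic Goel-Okumoto mean value function to synthetic cumulative fault counts standing in for a real failure data set.

```python
import numpy as np
from scipy.optimize import curve_fit

# Goel-Okumoto mean value function: m(t) = a * (1 - exp(-b*t)),
# where a is the expected total fault count and b the detection rate.
def mvf(t, a, b):
    return a * (1.0 - np.exp(-b * t))

t = np.arange(1, 21, dtype=float)              # testing weeks
faults = mvf(t, 120.0, 0.15) + np.random.default_rng(6).normal(0, 3, t.size)

# Least squares estimation of (a, b) from the cumulative counts
(a_hat, b_hat), _ = curve_fit(mvf, t, faults, p0=[100.0, 0.1])
pred = mvf(t, a_hat, b_hat)
mse = np.mean((faults - pred) ** 2)            # a common descriptive criterion

print(f"a = {a_hat:.1f}, b = {b_hat:.3f}, MSE = {mse:.2f}")
print(f"predicted cumulative faults by week 30: {mvf(30.0, a_hat, b_hat):.1f}")
```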

