model finding
Recently Published Documents

TOTAL DOCUMENTS: 68 (FIVE YEARS: 25)
H-INDEX: 10 (FIVE YEARS: 3)

2022 · Vol 924 (1) · pp. 39
Author(s): Ajit Kumar Mehta, Alessandra Buonanno, Jonathan Gair, M. Coleman Miller, Ebraheem Farag, ...

Abstract Using ground-based gravitational-wave detectors, we probe the mass function of intermediate-mass black holes (IMBHs), wherein we also include BHs in the upper mass gap at ∼60–130 M⊙. Employing the projected sensitivity of the upcoming LIGO–Virgo fourth observing run (O4), we perform Bayesian analysis on quasi-circular, nonprecessing, spinning IMBH binaries (IMBHBs) with total masses 50–500 M⊙, mass ratios 1.25, 4, and 10, and dimensionless spins up to 0.95, and estimate the precision with which the source-frame parameters can be measured. We find that, at 2σ, the mass of the heavier component of IMBHBs can be constrained with an uncertainty of ∼10%–40% at a signal-to-noise ratio of 20. Focusing on the stellar-mass gap with new tabulations of the ¹²C(α, γ)¹⁶O reaction rate and its uncertainties, we evolve massive helium-core stars using MESA to establish the lower and upper edges of the mass gap as ≃59 (+34/−13) M⊙ and ≃139 (+30/−14) M⊙, respectively, where the error bars give the mass range that follows from the ±3σ uncertainty in the ¹²C(α, γ)¹⁶O nuclear reaction rate. We find that high resolution of the tabulated reaction rate and fine temporal resolution are necessary to resolve the peak of the BH mass spectrum. We then study IMBHBs with components lying in the mass gap and show that the O4 run will be able to robustly identify most such systems. Finally, we reanalyze GW190521 with a state-of-the-art aligned-spin waveform model, finding that the primary mass lies in the mass gap with 90% credibility.
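As a rough illustration of how the quoted ∼10%–40% (2σ) uncertainty on the heavier component mass behaves with signal strength, here is a minimal sketch assuming the usual Fisher-matrix scaling in which statistical errors shrink roughly as 1/SNR; the scaling and the printed numbers are illustrative, not results from the paper.

```python
import numpy as np

# Illustrative 2-sigma fractional uncertainty band on the heavier component mass
# at the reference signal-to-noise ratio (SNR) of 20 quoted in the abstract.
ref_snr = 20.0
ref_band = np.array([0.10, 0.40])  # ~10%-40%

def scaled_uncertainty(snr, ref_snr=ref_snr, ref_band=ref_band):
    """Fisher-matrix-style scaling: statistical errors shrink roughly as 1/SNR."""
    return ref_band * (ref_snr / snr)

for snr in (10, 20, 40, 80):
    lo, hi = scaled_uncertainty(snr)
    print(f"SNR {snr:3d}: ~{100 * lo:.0f}%-{100 * hi:.0f}% (2-sigma) on the heavier mass")
```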


2021 · Vol 8
Author(s): João P. C. Bertoldo, Etienne Decencière, David Ryckelynck, Henry Proudhon

X-Ray Computed Tomography (XCT) techniques have evolved to a point where high-resolution data can be acquired so fast that classic segmentation methods are prohibitively cumbersome, demanding automated data pipelines capable of dealing with non-trivial 3D images. Meanwhile, deep learning has demonstrated success in many image processing tasks, including materials science applications, offering a promising alternative for a human-free segmentation pipeline. However, the rapidly increasing number of available architectures can be a serious drag on the wide adoption of this type of model by end users. In this paper, a modular interpretation of U-Net (Modular U-Net) is proposed, with a parametrized architecture that can be easily tuned for optimization. As an example, the model is trained to segment 3D tomography images of a three-phase glass fiber-reinforced Polyamide 66. We compare 2D and 3D versions of our model, finding that the former is slightly better than the latter. We observe that human-comparable results can be achieved even with only 13 annotated slices, and that using a shallow U-Net yields better results than a deeper one. Consequently, neural networks indeed offer a promising avenue to automate XCT data processing pipelines with no ad hoc human intervention.
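A minimal sketch of what such a parametrized, modular U-Net could look like in PyTorch; the class name ModularUNet, the depth/base-width parameters, and the 2D/3D switch are illustrative assumptions, not the authors' published implementation.

```python
import torch
import torch.nn as nn

def conv_block(dim, in_ch, out_ch):
    """Two convolution + BatchNorm + ReLU layers; `dim` selects 2D or 3D ops."""
    Conv = nn.Conv2d if dim == 2 else nn.Conv3d
    Norm = nn.BatchNorm2d if dim == 2 else nn.BatchNorm3d
    return nn.Sequential(
        Conv(in_ch, out_ch, kernel_size=3, padding=1), Norm(out_ch), nn.ReLU(inplace=True),
        Conv(out_ch, out_ch, kernel_size=3, padding=1), Norm(out_ch), nn.ReLU(inplace=True),
    )

class ModularUNet(nn.Module):
    """U-Net whose dimensionality, depth, and base width are plain constructor arguments."""
    def __init__(self, dim=2, in_ch=1, n_classes=3, depth=3, base=16):
        super().__init__()
        Pool = nn.MaxPool2d if dim == 2 else nn.MaxPool3d
        Up = nn.ConvTranspose2d if dim == 2 else nn.ConvTranspose3d
        Conv = nn.Conv2d if dim == 2 else nn.Conv3d
        widths = [base * 2 ** i for i in range(depth + 1)]

        self.pool = Pool(2)
        self.down = nn.ModuleList(
            [conv_block(dim, in_ch, widths[0])]
            + [conv_block(dim, widths[i], widths[i + 1]) for i in range(depth)]
        )
        self.up = nn.ModuleList([Up(widths[i + 1], widths[i], kernel_size=2, stride=2)
                                 for i in reversed(range(depth))])
        self.dec = nn.ModuleList([conv_block(dim, 2 * widths[i], widths[i])
                                  for i in reversed(range(depth))])
        self.head = Conv(widths[0], n_classes, kernel_size=1)

    def forward(self, x):
        skips = []
        for i, block in enumerate(self.down):       # encoder path
            x = block(x if i == 0 else self.pool(x))
            skips.append(x)
        for up, dec, skip in zip(self.up, self.dec, reversed(skips[:-1])):  # decoder path
            x = dec(torch.cat([up(x), skip], dim=1))
        return self.head(x)

# Example: a shallow 2D U-Net for 3-phase segmentation of tomography slices.
model = ModularUNet(dim=2, in_ch=1, n_classes=3, depth=2, base=16)
logits = model(torch.randn(1, 1, 128, 128))  # -> shape (1, 3, 128, 128)
```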


Symmetry · 2021 · Vol 13 (11) · pp. 2108
Author(s): David Benisty, Gonzalo J. Olmo, Diego Rubiera-Garcia

Early cosmology, driven by a single scalar field, both massless and massive, is explored in the context of Eddington-inspired Born-Infeld gravity. We show the existence of nonsingular solutions of bouncing and loitering type (depending on the sign of the gravitational theory's parameter, ϵ) replacing the Big Bang singularity, and discuss their properties. In addition, in the massive case, we find new features of the cosmological evolution depending on the value of the mass parameter, including asymmetries in the expansion/contraction phases and a continuous transition from a contracting phase to an expanding one via an intermediate loitering phase. We also provide a combined analysis of cosmic chronometers, standard candles, BAO, and CMB data to constrain the model, finding that for roughly |ϵ| ≲ 5·10⁻⁸ m² the model is compatible with the latest observations while successfully removing the Big Bang singularity. This bound is several orders of magnitude stronger than the most stringent constraints currently available in the literature.
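A minimal sketch of how a combined bound on ϵ of this kind is typically obtained, by scanning the parameter and summing per-dataset χ² contributions; the toy chi2_* functions below are placeholders (the real analysis integrates the EiBI cosmological equations against cosmic chronometer, supernova, BAO, and CMB data).

```python
import numpy as np

# Toy stand-ins for the per-dataset chi-squared as functions of the EiBI
# parameter epsilon (in m^2); purely illustrative quadratic forms.
def chi2_cc(eps):  return (eps / 4e-8) ** 2
def chi2_sne(eps): return (eps / 6e-8) ** 2
def chi2_bao(eps): return (eps / 5e-8) ** 2
def chi2_cmb(eps): return (eps / 3e-8) ** 2

def combined_bound(eps_grid, delta_chi2=4.0):
    """Sum the chi-squared contributions over a grid of epsilon and return the
    range allowed at Delta chi^2 <= 4 (roughly 2 sigma for one parameter)."""
    chi2 = sum(f(eps_grid) for f in (chi2_cc, chi2_sne, chi2_bao, chi2_cmb))
    allowed = eps_grid[chi2 - chi2.min() <= delta_chi2]
    return allowed.min(), allowed.max()

eps_grid = np.linspace(-2e-7, 2e-7, 4001)
lo, hi = combined_bound(eps_grid)
print(f"allowed epsilon range: [{lo:.1e}, {hi:.1e}] m^2")
```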


Author(s): Shangfeng Zhang, Jingjue Xu, Wei Chen, Manzhou Teng, Xiuwen Yu, ...

As an emerging economy, China has experienced market distortions arising from institutional adjustment during its economic transformation. In particular, price distortions of capital and labor lead to factor misallocation among provinces, which ultimately reduces total factor productivity (TFP) at the national level. Based on Hsieh and Klenow's [1] model framework, this paper measures the degree of misallocation of capital and labor among provinces and estimates the growth potential of China's TFP using input-output data from 1993 to 2017. The findings show that: First, the degree of inter-provincial labor misallocation is greater than that of capital; for example, in 2017 the degree of capital (labor) misallocation was 5.77% (10.25%), resulting in a TFP loss of 17.23% for China. Second, owing to factor-market reforms, the degree of labor misallocation has declined while the degree of capital misallocation has intensified in recent years. Lastly, this paper introduces a time-varying elasticity production function model, finding that using the Cobb-Douglas production function causes factor misallocation to be underestimated by 5.91%, owing to its assumption of constant output elasticity.
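A minimal sketch of the kind of misallocation accounting involved: a toy two-province Cobb-Douglas economy in which actual output under a distorted factor allocation is compared with the output obtained when the same totals of capital and labor are reallocated efficiently. The numbers and the simple aggregation are illustrative assumptions, not the paper's Hsieh-Klenow estimates.

```python
import numpy as np
from scipy.optimize import minimize

# Toy two-province economy with decreasing-returns Cobb-Douglas technology
# Y_i = A_i * K_i^alpha * L_i^beta (alpha + beta < 1), a stand-in for the
# Hsieh-Klenow setup in which imperfect substitution keeps every producer active.
alpha, beta = 0.3, 0.5
A = np.array([1.0, 1.3])     # province-level TFP (illustrative)
K = np.array([60.0, 40.0])   # actual (distorted) capital allocation
L = np.array([70.0, 30.0])   # actual (distorted) labor allocation

def total_output(K_alloc, L_alloc):
    return float(np.sum(A * K_alloc ** alpha * L_alloc ** beta))

actual = total_output(K, L)

# Efficient benchmark: reallocate the same totals of K and L to maximize output
# (equivalently, to equalize marginal products across provinces).
def neg_output(x):
    k1, l1 = x
    return -total_output(np.array([k1, K.sum() - k1]), np.array([l1, L.sum() - l1]))

res = minimize(neg_output, x0=[K.sum() / 2, L.sum() / 2],
               bounds=[(1e-3, K.sum() - 1e-3), (1e-3, L.sum() - 1e-3)])
efficient = -res.fun

print(f"actual output   : {actual:.2f}")
print(f"efficient output: {efficient:.2f}")
print(f"output (TFP) loss from misallocation: {100 * (1 - actual / efficient):.1f}%")
```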


2021
Author(s): Nick H Barton, Oluwafunmilola Olusanya

A species distributed across heterogeneous environments may adapt to local conditions. Szep et al. (2021, Evolution) modelled this process in the infinite island model, finding the stationary distribution of allele frequencies and deme sizes. We extend this to ask how a metapopulation responds to changes in carrying capacity, selection strength, or migration rate, restricting attention to fixed deme size ("soft selection"). We develop a "fixed-state" approximation (accurate when migration is rare) which assumes that the loci are near fixation. Under this approximation, polymorphism is only possible for a narrow range of habitat proportions when selection is weak compared to drift, but for a much wider range otherwise. When local conditions (Ns or Nm) change in a single deme, it takes a time of ~1/m to reach the new equilibrium. However, even with many loci, there can be substantial fluctuations in net adaptation, due to the bimodal allele frequency distributions at each locus. Thus, in a finite metapopulation, variation may gradually be lost by chance, even if it would persist with infinitely many demes. When conditions change across the whole metapopulation, the response can be rapid and is accurately predicted by the fixed-state approximation when Nm ≪ 1.
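A minimal sketch of the soft-selection island dynamics described: a single focal deme of fixed size N under selection toward the locally favoured allele, migration from a migrant pool, and binomial drift. Parameter values are illustrative assumptions, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_deme(N=100, s=0.02, m=0.01, p_migrant=0.1, p0=0.5, generations=2000):
    """Wright-Fisher dynamics in one deme of fixed size N ('soft selection'):
    selection of strength s for the locally favoured allele, migration at rate m
    from a pool at frequency p_migrant, then binomial drift over 2N gene copies."""
    p = p0
    trajectory = []
    for _ in range(generations):
        p_sel = p * (1 + s) / (p * (1 + s) + (1 - p))   # selection
        p_mig = (1 - m) * p_sel + m * p_migrant          # migration
        p = rng.binomial(2 * N, p_mig) / (2 * N)         # drift
        trajectory.append(p)
    return np.array(trajectory)

# After a change in local conditions, the deme approaches its new equilibrium
# on a timescale of roughly 1/m generations (here ~100).
traj = simulate_deme()
print("mean frequency over the last 500 generations:", traj[-500:].mean().round(3))
```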


2021
Author(s): João Paulo Casagrande Bertoldo, Etienne Decencière Ferrandière, David Ryckelynck, Henry Proudhon

Abstract X-ray Computed Tomography (XCT) techniques have evolved to a point where high-resolution data can be acquired so fast that classic segmentation methods are prohibitively cumbersome, demanding automated data pipelines capable of dealing with non-trivial 3D images. Deep learning has demonstrated success in many image processing tasks, including materials science applications, offering a promising alternative for a human-free segmentation pipeline. In this paper, a modular interpretation of U-Net (Modular U-Net) is proposed and trained to segment 3D tomography images of a three-phase glass fiber-reinforced Polyamide 66. We compare 2D and 3D versions of our model, finding that the former is slightly better than the latter. We observe that human-comparable results can be achieved even with only 10 annotated layers, and that using a shallow U-Net yields better results than a deeper one. Consequently, neural networks (NNs) indeed offer a promising avenue to automate XCT data processing pipelines with no ad hoc human intervention.


2021
Author(s): Björn Koneswarakantha, Timothé Ménard

Background - As investigator site audits have largely been conducted remotely during the COVID-19 pandemic, remote quality monitoring has gained momentum. To further facilitate the conduct of remote Quality Assurance (QA) activities, we developed new quality indicators, building on a previously published statistical modelling methodology. Methods - We modeled the risk of having an audit or inspection finding using historical audit and inspection data from 2011-2019. We used logistic regression to model finding risk for 4 clinical impact factor (CIF) categories: Safety Reporting, Data Integrity, Consent, and Protecting Endpoints. Results - The resulting areas under the receiver operating characteristic curve were between 0.57 and 0.66, with calibrated predictive ranges of 27-41%. The combined and adjusted risk factors can be used to interpret risk estimates easily. Conclusion - Continuous surveillance of the identified risk factors and the resulting risk estimates could complement remote QA strategies and help manage audit targets and audit focus, also in post-pandemic times.
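A minimal sketch of the modelling step described (logistic regression on site-level risk factors with an AUC readout); the feature names and the synthetic data below are illustrative assumptions, not the published risk factors.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000

# Synthetic site-level features standing in for historical audit/inspection data.
X = np.column_stack([
    rng.poisson(3, n),       # e.g. number of protocol deviations
    rng.normal(0, 1, n),     # e.g. standardized enrollment rate
    rng.integers(0, 2, n),   # e.g. first-time investigator flag
])
# Synthetic label: did the audit yield a finding in a given CIF category?
logit = 0.3 * X[:, 0] - 0.5 * X[:, 1] + 0.8 * X[:, 2] - 2.0
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"test AUC: {auc:.2f}")  # the paper reports AUCs in the 0.57-0.66 range
```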


2021 · Vol 2021 · pp. 1-13
Author(s): Leyla Sadat Tavassoli, Reza Massah, Arsalan Montazeri, Mirpouya Mirmozaffari, Guang-Jun Jiang, ...

In this paper, a modified version of the Nondominated Sorting Genetic Algorithm II (NSGA-II), one of the multiobjective evolutionary algorithms, is proposed. The algorithm is designed to make a trade-off between minimizing the cost of preventive maintenance (PM) and minimizing the time taken to perform this maintenance for a series-parallel system. The model also considers the labor and equipment limitations of the maintenance team and the effects of maintenance on manufacturing. In the mathematical model, finding appropriate objective functions for the maintenance scheduling problem requires all maintenance costs and failure rates to be integrated; additionally, the effects of production interruption during preventive maintenance are added to the objective functions. Furthermore, to improve on the regular NSGA-II, the proposed algorithm keeps otherwise unacceptable solutions in a repository; these solutions can then be repaired into acceptable ones by the proposed mutation procedure. Modified operators, such as simulated binary crossover and polynomial mutation, improve convergence and produce uniformly distributed, more diverse solutions. Finally, comparing the experimental results with those of the Strength Pareto Evolutionary Algorithm 2 (SPEA2) and the regular NSGA-II shows that MNSGA-II generates more efficient and uniform solutions than the other two algorithms.
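A minimal sketch of the two variation operators named above, simulated binary crossover (SBX) and polynomial mutation, in their standard real-coded form; the distribution indices, bounds, and toy encoding are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

def sbx_crossover(p1, p2, eta=15):
    """Simulated binary crossover for real-coded parent vectors (standard form)."""
    u = rng.random(p1.shape)
    beta = np.where(u <= 0.5,
                    (2 * u) ** (1 / (eta + 1)),
                    (1 / (2 * (1 - u))) ** (1 / (eta + 1)))
    c1 = 0.5 * ((1 + beta) * p1 + (1 - beta) * p2)
    c2 = 0.5 * ((1 - beta) * p1 + (1 + beta) * p2)
    return c1, c2

def polynomial_mutation(x, lower, upper, eta=20, prob=0.1):
    """Polynomial mutation: perturb each gene with probability `prob`."""
    u = rng.random(x.shape)
    delta = np.where(u < 0.5,
                     (2 * u) ** (1 / (eta + 1)) - 1,
                     1 - (2 * (1 - u)) ** (1 / (eta + 1)))
    mask = rng.random(x.shape) < prob
    return np.clip(x + mask * delta * (upper - lower), lower, upper)

# Example on a toy 5-gene maintenance-schedule encoding in [0, 1].
p1, p2 = rng.random(5), rng.random(5)
c1, c2 = sbx_crossover(p1, p2)
child = polynomial_mutation(c1, lower=0.0, upper=1.0)
print(np.round(child, 3))
```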


2021
Author(s): Alex Romanova

Document topic analysis benefits from a bridge between the word embedding process and the capacity of graphs to connect the dots and represent complex correlations between entities. In this study, we examine the processes of building a semantic graph model, finding document topics, and validating topic discovery. We introduce a novel Word2Vec2Graph model built on top of the Word2Vec word embedding model. We demonstrate how this model can be used to analyze long documents and uncover document topics as graph clusters. To validate the topic discovery method, we map words to vectors and vectors to images, and use deep learning image classification.
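A minimal sketch of the embeddings-to-graph idea: train Word2Vec, connect word pairs whose cosine similarity exceeds a threshold, and read topics off graph communities. The toy corpus, threshold, and community-detection method are illustrative assumptions, not the paper's Word2Vec2Graph pipeline.

```python
from itertools import combinations

import networkx as nx
from gensim.models import Word2Vec
from networkx.algorithms.community import greedy_modularity_communities

# Tiny illustrative corpus of tokenized "documents".
corpus = [
    ["gravitational", "wave", "detector", "black", "hole", "merger"],
    ["black", "hole", "mass", "spectrum", "stellar", "evolution"],
    ["neural", "network", "segmentation", "tomography", "image"],
    ["deep", "learning", "image", "classification", "neural", "network"],
]

# 1) Word embeddings.
w2v = Word2Vec(corpus, vector_size=32, window=3, min_count=1, seed=7)

# 2) Semantic graph: edge between words whose cosine similarity passes a threshold.
graph = nx.Graph()
vocab = list(w2v.wv.index_to_key)
graph.add_nodes_from(vocab)
for a, b in combinations(vocab, 2):
    sim = float(w2v.wv.similarity(a, b))
    if sim > 0.1:  # illustrative threshold
        graph.add_edge(a, b, weight=sim)

# 3) Topics as graph communities (clusters).
for i, community in enumerate(greedy_modularity_communities(graph, weight="weight")):
    print(f"topic {i}: {sorted(community)}")
```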


2021 · Vol 9 (1) · pp. 22-33
Author(s): Aree Saeed Mustafa

This study extends agency theory by explaining the client's understanding of audit quality. It contributes to the audit literature by examining the effect of the control-ownership wedge on the choice of industry-specialist auditors, which has not previously been researched in Turkey. The interests of minority and controlling shareholders are not completely aligned. The analysis used a logistic regression model, finding that firms with a larger difference between control rights and cash-flow rights tend to prefer high audit quality, as measured by the appointment of industry-specialist auditors. This study encourages regulators to improve law enforcement and enhance the role of corporate governance in Turkey, addressing the features of ownership-control firms and offering a suitable environment for investment and for minority shareholders.

