reasonable approximation
Recently Published Documents


TOTAL DOCUMENTS: 65 (five years: 13)
H-INDEX: 13 (five years: 2)

2021 ◽  
Vol 64 (1) ◽  
pp. 42-49
Author(s):  
Christine Taylor ◽  
Budy Notohardjono ◽  
Suraush Khambati ◽  
Shawn Canfield

Abstract: In optimizing packaging design, the product’s fragility is qualified by a prototype undergoing quantitative and qualitative tests that rely heavily on past knowledge and experiments. With the addition of finite element analysis (FEA), the product’s fragility can instead be assessed in the initial stages of product design through material characterization and simulation. FEA can predict the Gs on the product as well as examine the strains, which makes product failure easier to interpret at the design stage. To incorporate FEA, the foam material was first characterized under compression at various strain rates. Next, a shipping package containing an aluminum block of consistent density was dropped from three heights, 610 mm (24 in.), 915 mm (36 in.), and 1067 mm (42 in.), to confirm the methodology. For the final demonstration, an I/O book was packaged, incorporating FEA with an electronic card package. In an electronic card package, the electronic assemblies are sensitive to strains on the system board: if the strains on the board are high, the assemblies’ solder connections to the board can be damaged, resulting in a defect during shipment. The simulations’ predicted Gs and board strains were compared with experimental drop-test results at 610 mm (24 in.) and 915 mm (36 in.). The simulation results at each sensor location were within reasonable approximation of the experimental results, verifying that FEA can be used in the initial design stages to predict the accelerations and strains for packaging development in parallel with product design.
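As a rough illustration of the post-processing behind reported G levels (a minimal sketch with assumed pulse parameters and sampling rate, not the authors' data or code), the peak shock in multiples of g can be read off an accelerometer trace as follows:

```python
import numpy as np

G = 9.81  # standard gravity, m/s^2

def peak_g(accel_m_s2: np.ndarray) -> float:
    """Peak shock level in multiples of g from an acceleration trace."""
    return float(np.max(np.abs(accel_m_s2)) / G)

# Illustrative half-sine shock pulse (all numbers are assumptions):
fs = 10_000              # sampling rate, Hz
duration = 0.010         # pulse width, s
t = np.arange(0, duration, 1 / fs)
pulse = 50 * G * np.sin(np.pi * t / duration)  # 50 g half-sine

print(f"peak shock: {peak_g(pulse):.1f} g")
```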


Author(s):  
Flavio Calvino ◽  
Daniele Giachini ◽  
Mattia Guerini

Abstract: We investigate the shape and determinants of the age distribution of business firms. Employing a novel dataset covering the population of French businesses, we show that a geometric law provides a reasonable approximation for the age distribution, although relevant systematic deviations and sectoral heterogeneity appear. We develop a stochastic model of firm dynamics to explain the mechanisms behind this evidence and relate them to business dynamism. The results reveal a long-term decline in entry rates and lower survival probabilities of young firms. Our findings bear important implications for aggregate outcomes, notably employment growth.
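To make the geometric-law claim concrete (a sketch on synthetic data, not the authors' French business registry), the parameter of a geometric age distribution supported on ages 0, 1, 2, ... can be estimated by maximum likelihood as p̂ = 1 / (1 + mean age):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic firm ages (years since entry); a geometric law with parameter p
# means P(age = k) = p * (1 - p)**k for k = 0, 1, 2, ...
true_p = 0.12
ages = rng.geometric(true_p, size=100_000) - 1  # numpy's geometric starts at 1

# Maximum-likelihood estimate: p_hat = 1 / (1 + mean age)
p_hat = 1.0 / (1.0 + ages.mean())
print(f"true p = {true_p}, estimated p = {p_hat:.4f}")

# Compare empirical and fitted survival P(age >= k) = (1 - p)**k;
# systematic deviations of the kind the paper documents would show up here.
for k in (0, 5, 10, 20):
    emp = (ages >= k).mean()
    fit = (1 - p_hat) ** k
    print(f"k={k:2d}  empirical {emp:.4f}  geometric fit {fit:.4f}")
```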


Author(s):  
Gábor Riczu ◽  
József Cseh

Samples of the spectrum of the 36Ar nucleus are known in different energy windows. In addition to the ground-state (GS) region, a superdeformed (SD) state has been observed, and there is a good candidate for a hyperdeformed (HD) state as well. These states are populated in different reactions. We intend to describe the gross features of the spectra of different energies, deformations, and reactions in a unified way, applying the multiconfigurational dynamical symmetry (MUSY). The U(3) quantum numbers of the shape isomers from previous studies pave the way for this description. MUSY reproduces the gross features of the spectra to a reasonable approximation. The energy spectrum of the three valleys (GS, SD, HD) indicates that the multiconfigurational symmetry is valid to a good approximation and that different cluster configurations coexist in the shape isomers.


2021 ◽  
Author(s):  
Theo Tricou ◽  
Eric Tannier ◽  
Damien M de Vienne

The data known and sampled in any evolutionary study are always a small part of what exists, known or not, or of what existed in the past and is now extinct. It is therefore likely that all detected past horizontal gene fluxes, hybridizations, introgressions, admixtures, or transfers involve "ghosts", that is, extinct or unsampled lineages. The presence of these ghosts is acknowledged by all scientists, but almost all assume, or act as if, their blurring influence is low, like background noise that can, to a reasonable approximation, be ignored. We assess this undervalued hypothesis by qualifying and quantifying the effect of ghost lineages on introgression detection with the popular D-statistic method. We use a genomic dataset of bears to illustrate and circumscribe the possibility of misinterpretation, and we show on simulated data that under certain conditions, far from unrealistic, most results interpreted from D-statistics, concerning both the existence of introgression and the identity of donors and recipients of horizontal gene flow, are erroneous. In particular, the use of a distant outgroup, usually presented as solid ground for these tests, in fact increases the error probability and leads to false interpretations in the vast majority of cases. We argue for a switch of the null hypothesis: the results of gene-flux detection methods should be interpreted with the full and visible participation of the unknown ghosts.
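For context, a minimal sketch of the D-statistic itself (the textbook ABBA-BABA test, not the authors' simulation pipeline): for a four-taxon tree (((P1,P2),P3),O), D is computed from biallelic site-pattern counts, and a D significantly different from zero is classically read as introgression involving P3.

```python
def d_statistic(n_abba: int, n_baba: int) -> float:
    """ABBA-BABA D-statistic from site-pattern counts for (((P1,P2),P3),O).

    Under incomplete lineage sorting alone, ABBA and BABA patterns are
    equally likely, so D ~ 0; an excess of one pattern (D != 0) is the
    classic signal interpreted as gene flow between P3 and P2 (or P1).
    """
    total = n_abba + n_baba
    if total == 0:
        raise ValueError("no informative sites")
    return (n_abba - n_baba) / total

# Toy counts (assumed values, for illustration only):
print(d_statistic(n_abba=1200, n_baba=900))  # D > 0: classically read as P2-P3 flow
```

The paper's point is precisely that this classical reading can be wrong when the flow actually involves a ghost lineage outside the sampled quartet.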


Author(s):  
Nico van Dijk ◽  
Barteld Schilstra

Abstract: Overflow mechanisms can be found in a variety of queueing models. This paper studies a simple and generic overflow system that allows the service times to be both job-type and station dependent. This system does not exhibit a product form. To justify simple product-form computations, two product-form modifications are given: a so-called call-packing principle and a stop protocol. The provided proofs are self-contained, straightforward for the exponential case, and of merit in themselves. Next, it is studied numerically whether and when, or under which conditions, the modifications lead to a reasonable approximation of the blocking probability, if not an ordering. The numerical results indicate that call packing provides a rather accurate approximation when the overflow station is not heavily utilized. Moreover, when overflowed jobs have an equal or faster service rate, the approximation is consistently found to be pessimistic, which can be useful for practical purposes. The stop protocol, in contrast, appears to be less accurate in most natural situations, although for extreme situations the ordering may change. In addition, for the stop protocol the product form is proven to be insensitive (i.e., to apply also for arbitrary non-exponential service times). For call packing this numerically appears not to be the case, which is of interest in itself; from a practical viewpoint, however, the sensitivity seems slight. The results are intriguing for both theoretical and practical further research.
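As background (the classic Erlang-B building block for product-form blocking computations, not the paper's more general call-packing model), the blocking probability of a single loss station is commonly evaluated with the numerically stable recursion B(0) = 1, B(k) = a·B(k-1) / (k + a·B(k-1)) for offered load a and k servers:

```python
def erlang_b(servers: int, offered_load: float) -> float:
    """Blocking probability of an M/M/c/c loss system (Erlang B).

    Uses the stable recursion B(0) = 1, B(k) = a*B(k-1) / (k + a*B(k-1)),
    which avoids the overflow-prone direct factorial formula.
    """
    b = 1.0
    for k in range(1, servers + 1):
        b = offered_load * b / (k + offered_load * b)
    return b

# Example: 10 servers offered 7 Erlangs of traffic (illustrative numbers)
print(f"blocking probability: {erlang_b(10, 7.0):.4f}")
```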


2021 ◽  
Vol 24 (1) ◽  
pp. 112-136
Author(s):  
Elvira Di Nardo ◽  
Federico Polito ◽  
Enrico Scalas

Abstract: This paper is devoted to a fractional generalization of the Dirichlet distribution. The form of the multivariate distribution is derived under the assumption that the n partitions of the interval [0, W_n] are independent and identically distributed random variables following the generalized Mittag-Leffler distribution. The expected value and variance of the one-dimensional marginal are derived, as well as the form of its probability density function. A related generalized Dirichlet distribution is studied that provides a reasonable approximation for some values of the parameters. The relation between this distribution and other generalizations of the Dirichlet distribution is discussed. Monte Carlo simulations of the one-dimensional marginals of both distributions are presented.
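As a point of reference (the classical, non-fractional case, not the paper's generalization), a standard Dirichlet vector arises by normalizing independent Gamma increments, mirroring the construction of the distribution from i.i.d. partitions of an interval:

```python
import numpy as np

rng = np.random.default_rng(42)

def dirichlet_via_gammas(alphas, size=1):
    """Sample Dirichlet(alphas) by normalizing independent Gamma variates.

    Classical construction: if X_i ~ Gamma(alpha_i, 1) are independent,
    then (X_1, ..., X_n) / sum_i X_i ~ Dirichlet(alpha_1, ..., alpha_n).
    The fractional generalization replaces the increments' law with the
    generalized Mittag-Leffler distribution.
    """
    x = rng.gamma(shape=np.asarray(alphas), scale=1.0, size=(size, len(alphas)))
    return x / x.sum(axis=1, keepdims=True)

samples = dirichlet_via_gammas([2.0, 3.0, 5.0], size=100_000)
print(samples.mean(axis=0))  # ~ alphas / sum(alphas) = [0.2, 0.3, 0.5]
```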


Energies ◽  
2020 ◽  
Vol 13 (20) ◽  
pp. 5489
Author(s):  
Matthew Kuperus Heun ◽  
Zeke Marshall ◽  
Emmanuel Aramendia ◽  
Paul E. Brockway

Lighting provides an indispensable energy service, illumination. The field of societal exergy analysis considers light (and many other energy products) to be enablers of economic growth, and lighting contributes a non-negligible proportion of total useful exergy supplied to modern economies. In societal exergy analysis, the exergetic efficiency of electric lamps is central to determining the exergy contribution of lighting to an economy. Conventionally, societal exergy practitioners estimate the exergetic efficiency of lamps by an energy efficiency, causing confusion and, sometimes, overestimation of exergetic efficiency by a factor as large as 3. In response, we use recent results from the fields of radiation thermodynamics and photometry to develop an exact method for calculating the exergy of light and the exergetic efficiency of lamps. The exact method (a) is free of any assumptions for the value of the maximum luminous efficacy, (b) uses a non-unity spectral exergy-to-energy ratio, and (c) allows choices for the spectral luminous weighting function, which converts broad-spectrum electromagnetic radiation to light. The exact method exposes shortcomings inherent to the conventional method and leads to a reasonable approximation of lamp exergetic efficiency, when needed. To conclude, we provide three recommendations for societal exergy practitioners: use (a) the exact method when a lamp’s spectral power distribution is available, (b) the universal luminous weighting function, and (c) the reasonable approximation to the exact method when a lamp’s luminous efficacy is known but its spectral power distribution is not.
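To make the photometric conversion concrete (a sketch under stated assumptions, not the paper's exact method), the luminous efficacy of radiation is obtained by weighting a lamp's spectral power distribution with the photopic luminosity function V(λ) and scaling by the maximum luminous efficacy K_max = 683 lm/W; here V(λ) is approximated by a Gaussian centred at 555 nm, a rough stand-in for the tabulated CIE curve:

```python
import numpy as np

K_MAX = 683.0  # maximum luminous efficacy, lm/W, at 555 nm

def luminous_efficacy(wavelengths_nm, spectral_power_w_per_nm):
    """Luminous efficacy of radiation (lm/W) from a spectral power distribution.

    V(lambda) is approximated by a Gaussian centred at 555 nm with a ~42 nm
    standard deviation -- an assumed stand-in for the CIE photopic curve.
    """
    v = np.exp(-0.5 * ((wavelengths_nm - 555.0) / 42.0) ** 2)
    radiant_flux = np.trapz(spectral_power_w_per_nm, wavelengths_nm)                # W
    luminous_flux = K_MAX * np.trapz(v * spectral_power_w_per_nm, wavelengths_nm)   # lm
    return luminous_flux / radiant_flux

# Illustrative flat ("white") spectrum over the visible band:
wl = np.linspace(380.0, 780.0, 401)
spd = np.ones_like(wl)  # W/nm, arbitrary scale; the efficacy is scale-free
print(f"~{luminous_efficacy(wl, spd):.0f} lm/W")
```

The exergetic efficiency discussed in the paper additionally requires a spectral exergy-to-energy ratio; the sketch above covers only the photometric weighting step.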


2020 ◽  
Vol 87 (4) ◽  
pp. 1876-1914 ◽  
Author(s):  
Mark Gertler ◽  
Christopher Huckfeldt ◽  
Antonella Trigari

Abstract: We revisit the issue of the high cyclicality of wages of new hires. We show that, after controlling for composition effects likely involving procyclical upgrading of job match quality, the wages of new hires are no more cyclical than those of existing workers. The key implication is that the sluggish behaviour of wages for existing workers is a better guide to the cyclicality of the marginal cost of labour than the high measured cyclicality of new hires’ wages unadjusted for composition effects. Key to our identification is distinguishing between new hires from unemployment and those who are job changers. We argue that, to a reasonable approximation, the wages of the former provide a composition-free estimate of wage flexibility, while the same is not true of the latter. We then develop a quantitative general equilibrium model with sticky wages via staggered contracting, on-the-job search, and heterogeneous match quality, and show that it can account for both the panel-data evidence and the aggregate evidence on labour market volatility.
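The composition argument can be illustrated with a toy simulation (all numbers assumed, not the paper's model): if the wage per unit of match quality is acyclical but job changers land better matches in booms, their measured wage cyclicality is spuriously high, while hires from unemployment show roughly zero cyclicality.

```python
import numpy as np

rng = np.random.default_rng(1)

T = 200
cycle = np.sin(np.linspace(0, 8 * np.pi, T))   # boom > 0, bust < 0 (stylized)

base_wage = 1.0                                 # acyclical wage per quality unit
# Job changers upgrade match quality in booms; hires from unemployment do not.
quality_changers = 1.0 + 0.3 * np.clip(cycle, 0, None) + 0.05 * rng.standard_normal(T)
quality_unemployed = 1.0 + 0.05 * rng.standard_normal(T)

wage_changers = base_wage * quality_changers
wage_from_unemp = base_wage * quality_unemployed

for name, w in [("job changers", wage_changers),
                ("hires from unemployment", wage_from_unemp)]:
    cov = np.cov(w, cycle)
    beta = cov[0, 1] / cov[1, 1]                # regression of wage on the cycle
    print(f"{name}: measured wage cyclicality = {beta:+.3f}")
```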


Proceedings ◽  
2019 ◽  
Vol 33 (1) ◽  
pp. 6
Author(s):  
Dirk Nille ◽  
Udo von Toussaint

An analysis tool using adaptive kernels to solve an ill-posed inverse problem on a 2D model space is introduced. It is applicable to linear and non-linear forward models, for example in tomography and image reconstruction. While an optimisation based on a Gaussian approximation is possible, it becomes intractable for more than a few hundred kernel functions, because the determinant of the Hessian of the system has to be evaluated. The SVD typically used for 1D problems fails with increasing problem size. Alternatively, stochastic trace estimation can be used, giving a reasonable approximation. An alternative to searching for the MAP solution is to integrate using Markov chain Monte Carlo, which avoids determining the determinant of the Hessian altogether. This also allows treating problems where a linear approximation is not justified.
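As an illustration of the stochastic trace estimation mentioned above (a minimal sketch of Hutchinson's estimator, not the authors' implementation), tr(A) is approximated by averaging zᵀAz over random Rademacher probe vectors z; applied to log(A), the same idea yields log-determinant estimates via log det A = tr(log A).

```python
import numpy as np

rng = np.random.default_rng(7)

def hutchinson_trace(matvec, dim, n_probes=200):
    """Hutchinson's stochastic trace estimator.

    tr(A) = E[z^T A z] for random z with E[z z^T] = I; Rademacher probes
    (entries +/- 1) minimize the variance among such choices. Only
    matrix-vector products are needed, never the full matrix.
    """
    total = 0.0
    for _ in range(n_probes):
        z = rng.choice([-1.0, 1.0], size=dim)
        total += z @ matvec(z)
    return total / n_probes

# Check on a random symmetric positive definite matrix (illustrative):
n = 500
B = rng.standard_normal((n, n))
A = B @ B.T / n + np.eye(n)
print("exact trace:    ", np.trace(A))
print("estimated trace:", hutchinson_trace(lambda v: A @ v, n))
```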


Author(s):  
Mohammed Ehsan Ur Rahman ◽  
Hari Dasyam Sri Saaketh Ram

In this paper, we try to shed some light on the pressing and challenging issue of ethics in technologies under the umbrella of AI. This concern must be discussed extensively by philosophers, economists, and AI researchers, and that discussion is an ongoing process. Autonomous services, especially those involving socio-technical systems [5], can increasingly become a major factor in making the environment hostile. The inclusion of human morals into human-level intelligence systems [8] is crucial. The capacity of the human mind for formulating, ideating, conceptualizing, thinking, innovating, and solving complex problems is minuscule compared with the mass of problems whose solutions are essential for objectively rational and moral behaviour in the real world, or even for a novice's reasonable approximation to such target sanity. Equally important notions to consider, in order to avoid possible shortcomings of and havoc caused by this technology, are the involvement of human ethics and the consideration of its social and moral implications.

