Analysis of random variability in tortilla shell baking

2021 ◽  
Vol 293 ◽  
pp. 110372
Author(s):  
Rosalina Iribe-Salazar ◽  
José Caro-Corrales ◽  
Yessica Vázquez-López

2004 ◽  
Vol 41 (2) ◽  
pp. 351-355 ◽  
Author(s):  
Dieter Stolle ◽  
Peijun Guo ◽  
Gabriel Sedran

This paper analyzes the impact of natural random variation of soil properties on the constitutive modelling of geomaterial behaviour. A theoretical framework for accommodating variation in soil properties is presented and then used to examine the consequences of parameter variability on stress–strain relations. An important observation is that average soil parameters obtained from a series of tests on small specimens, in which the density of the specimens varies randomly, do not necessarily reflect the average constitutive behaviour of the soil. Model predictions are shown to be consistent with the experimental data.
Key words: random variability, deterministic analysis, soil parameters, constitutive model.
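The paper's key observation, that averaged parameters from randomly varying specimens need not reproduce averaged behaviour, follows from the fact that a nonlinear response does not commute with averaging. A minimal sketch of this effect, using a purely hypothetical constitutive law in which stiffness varies with specimen density (not the paper's model):

```python
import random

random.seed(0)

def stress(strain, density):
    # Hypothetical nonlinear constitutive law, for illustration only:
    # stiffness grows with the square of the specimen density.
    return density ** 2 * strain

strain = 0.01
densities = [random.gauss(1.0, 0.1) for _ in range(10_000)]

# Prediction from the *average* parameter...
mean_density = sum(densities) / len(densities)
stress_at_mean = stress(strain, mean_density)

# ...versus the average of the *individual* responses.
mean_stress = sum(stress(strain, d) for d in densities) / len(densities)

# The two disagree by exactly Var(density) * strain for this law, so a
# model calibrated to averaged parameters underestimates the averaged
# response.
print(stress_at_mean < mean_stress)
```

The same discrepancy appears for any constitutive relation that is convex or concave in the randomly varying parameter (Jensen's inequality).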


1971 ◽  
Vol 28 (1) ◽  
pp. 291-301 ◽  
Author(s):  
Donald W. Zimmerman

A model of variability in measurement, which is sufficiently general for a variety of applications and which includes the main content of traditional theories of errors of measurement and psychological tests, can be derived from the axioms of probability, without introducing “true values” and “errors.” Beginning with probability spaces (Ω, P1) and (φ, P2), the set Ω representing the outcomes of a measurement procedure and the set φ representing individuals or experimental objects, it is possible to construct suitable product probability spaces and collections of random variables that can yield all the results needed to describe random variability and reliability. This paper attempts to fill gaps in the mathematical derivations of many classical theories and, at the same time, to overcome limitations of the language of “true values” and “errors” by presenting explicitly the essential constructions required for a general probability model.
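The construction described, outcomes in one probability space and individuals in another, combined through a product space, can be illustrated numerically. In the sketch below, reliability is computed as the correlation between two parallel measurements; the per-individual outcome distributions are given an arbitrary Gaussian parameterization purely for illustration (the model itself postulates no such form, and no “true value” enters the calculation):

```python
import math
import random

random.seed(1)

def corr(xs, ys):
    # Plain Pearson correlation, computed from the joint sample.
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sxx = sum((a - mx) ** 2 for a in xs)
    syy = sum((b - my) ** 2 for b in ys)
    return sxy / math.sqrt(sxx * syy)

# phi: the individuals. Each carries a probability law over the outcome
# set Omega (illustrative Gaussian parameterization, assumed here).
centers = [random.gauss(100, 15) for _ in range(5000)]

def measure(mu):
    return random.gauss(mu, 5)

# Two parallel measurements of every individual: two independent draws
# from the product space Omega x phi.
x = [measure(m) for m in centers]
y = [measure(m) for m in centers]

# Reliability: correlation between parallel measurements, obtained
# directly from the joint distribution, with no "error" term in sight.
print(round(corr(x, y), 2))  # close to 15**2 / (15**2 + 5**2) = 0.9
```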


2020 ◽  
pp. 247255522095838
Author(s):  
Maria Filipa Pinto ◽  
Francisco Figueiredo ◽  
Alexandra Silva ◽  
António R. Pombinho ◽  
Pedro José Barbosa Pereira ◽  
...  

The throughput level currently reached by automatic liquid handling and assay monitoring techniques is expected to facilitate the discovery of new modulators of enzyme activity. Judicious and dependable ways to interpret vast amounts of information are, however, required to effectively meet this challenge. Here, the 3-point method of kinetic analysis is proposed as a means to significantly increase hit success rates and decrease the number of falsely identified compounds (false positives). In this post-Michaelis–Menten approach, each screened reaction is probed on three different occasions, none of which necessarily coincides with the initial period of constant velocity. Enzymology principles rather than subjective criteria are applied to identify unwanted outliers such as assay artifacts, and then to accurately distinguish true enzyme modulation effects from false positives. The exclusion and selection criteria are defined based on the 3-point reaction coordinates, whose relative positions along the time-courses may change from well to well or from plate to plate, if necessary. The robustness and efficiency of the new method are illustrated in a small drug repurposing screening of potential modulators of the deubiquitinating activity of ataxin-3, a protein implicated in Machado–Joseph disease. Apparently intractable Z factors are drastically enhanced after (1) eliminating spurious results, (2) improving the normalization method, and (3) increasing the assay resilience to systematic and random variability. Numerical simulations further demonstrate that the 3-point analysis is highly sensitive to specific, catalytic, and slow-onset modulation effects that are particularly difficult to detect by typical endpoint assays.
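The exclusion and selection logic described above can be sketched in a few lines. The function below is a simplified illustration, not the authors' actual criteria: it takes the three reaction coordinates of a control and of a screened well, discards wells whose product signal is not monotonically increasing (an assay artifact under ordinary progress-curve kinetics), and classifies the rest by comparison with the control. The tolerance threshold is an assumed parameter.

```python
def three_point_screen(t, control, well, tol=0.15):
    """Classify one screened reaction from three reaction coordinates.

    t        -- the three sampling times
    control  -- product signal of the unmodulated control at those times
    well     -- product signal of the screened reaction
    tol      -- fractional deviation treated as noise (assumed value)
    """
    # Outlier rule (enzymology, not subjective judgment): product
    # formation must not decrease along a plain progress curve.
    if well[1] < well[0] or well[2] < well[1]:
        return "artifact"
    # Compare against the control point by point.
    ratios = [w / c for w, c in zip(well, control)]
    if all(abs(r - 1) <= tol for r in ratios):
        return "inactive"
    return "inhibitor" if ratios[-1] < 1 else "activator"

print(three_point_screen((1, 2, 4), (10, 18, 30), (4, 7, 12)))   # inhibitor
print(three_point_screen((1, 2, 4), (10, 18, 30), (4, 3, 5)))    # artifact
```

Because the classification uses ratios to an on-plate control, the three sampling times can differ from well to well or plate to plate, as the abstract notes.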


2012 ◽  
Vol 24 (12) ◽  
pp. 3145-3180 ◽  
Author(s):  
Thibaud Taillefumier ◽  
Jonathan Touboul ◽  
Marcelo Magnasco

In vivo cortical recordings reveal that indirectly driven neural assemblies can produce reliable and temporally precise spiking patterns in response to stereotyped stimulation. This suggests that despite being fundamentally noisy, the collective activity of neurons conveys information through temporal coding. Stochastic integrate-and-fire models delineate a natural theoretical framework in which to study the interplay of intrinsic neural noise and spike timing precision. However, there are inherent difficulties in simulating their network dynamics in silico with standard numerical discretization schemes. Indeed, the well-posedness of the evolution of such networks requires temporally ordering every neuronal interaction, whereas the order of interactions is highly sensitive to the random variability of spiking times. Here, we address these issues for perfect stochastic integrate-and-fire neurons by designing an exact event-driven algorithm for the simulation of recurrent networks with delayed Dirac-like interactions. In addition to being exact from the mathematical standpoint, the proposed method is highly efficient numerically. We envision that our algorithm is especially suited to studying the emergence of polychronized motifs in networks evolving under spike-timing-dependent plasticity with intrinsic noise.
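The event-driven idea can be sketched for perfect (non-leaky) integrate-and-fire neurons, whose first-passage time to threshold under Brownian input is inverse-Gaussian distributed and can therefore be sampled exactly. The sketch below is simplified where the paper is exact: when a delayed input arrives, it advances the membrane along its mean path and resamples the crossing time, whereas the full algorithm conditions on the path not having already crossed. All parameter values are arbitrary illustrations.

```python
import heapq
import math
import random

random.seed(2)

def first_passage(gap, drift, sigma):
    """Exact first-passage time of drifted Brownian motion to a barrier
    `gap` away (inverse-Gaussian sample, Michael-Schucany-Haas method)."""
    mu, lam = gap / drift, (gap / sigma) ** 2
    nu = random.gauss(0.0, 1.0) ** 2
    x = mu + mu * mu * nu / (2 * lam) \
        - (mu / (2 * lam)) * math.sqrt(4 * mu * lam * nu + (mu * nu) ** 2)
    return x if random.random() <= mu / (mu + x) else mu * mu / x

def simulate(n=3, drift=1.0, sigma=0.5, threshold=1.0,
             weight=0.3, delay=0.1, t_end=50.0):
    v = [0.0] * n            # potential at each neuron's last update
    t_last = [0.0] * n
    version = [0] * n        # invalidates crossings scheduled earlier
    events = []              # heap of (time, kind, neuron, version)
    spikes = []

    def schedule(i, now):
        dt = first_passage(threshold - v[i], drift, sigma)
        heapq.heappush(events, (now + dt, 1, i, version[i]))

    for i in range(n):
        schedule(i, 0.0)

    while events:
        t, kind, i, ver = heapq.heappop(events)
        if t > t_end:
            break
        if kind == 1:                     # threshold crossing
            if ver != version[i]:
                continue                  # stale: an input intervened
            spikes.append((t, i))
            v[i], t_last[i] = 0.0, t      # reset and reschedule
            version[i] += 1
            schedule(i, t)
            for j in range(n):            # delayed Dirac-like interactions
                if j != i:
                    heapq.heappush(events, (t + delay, 0, j, 0))
        else:                             # an input pulse arrives at i
            # Simplified step: advance along the mean path, add the jump.
            # The exact algorithm instead conditions on no prior crossing.
            v[i] = min(v[i] + drift * (t - t_last[i]) + weight,
                       threshold - 1e-6)
            t_last[i] = t
            version[i] += 1               # old crossing time is now invalid
            schedule(i, t)
    return spikes

spikes = simulate()
print(len(spikes) > 0)
```

The version counters implement the temporal-ordering requirement from the abstract: any interaction invalidates the previously sampled crossing time, so events are always processed in their true causal order.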


Euphytica ◽  
1962 ◽  
Vol 11 (3) ◽  
pp. 213-220
Author(s):  
J. H. A. Ferguson

2020 ◽  
Vol 497 (4) ◽  
pp. 4448-4458
Author(s):  
N St-Louis ◽  
C Piaulet ◽  
N D Richardson ◽  
T Shenar ◽  
A F J Moffat ◽  
...  

ABSTRACT We present the results of a 4-month spectroscopic campaign on the Wolf–Rayet dust-making binary WR 137. We detect only small-amplitude random variability in the C iii λ5696 emission line and its integrated quantities (radial velocity, equivalent width, skewness, and kurtosis) that can be explained by stochastic clumps in the wind of the WC star. We find no evidence of the large-scale periodic variations often associated with Corotating Interaction Regions that could have explained the observed intrinsic continuum polarization of this star. Our moderately high-resolution, high signal-to-noise average Keck spectrum shows narrow double-peaked emission profiles in the H α, H β, H γ, He i λ6678, and He i λ5876 lines. These peaks have a stable blue-to-red intensity ratio with a mean of 0.997 and a root mean square of 0.004, commensurate with the noise level; no variability is found during the entire observing period. We suggest that these profiles arise in a decretion disc around the O9 companion, which is thus an O9e star. The characteristics of the profiles are compatible with those of other Be/Oe stars. The presence of this disc can explain the constant component of the continuum polarization of this system, whose angle is perpendicular to the plane of the orbit, implying that the rotation axis of the O9e star is aligned with that of the orbit. It remains to be explained why the disc is so stable within the strong ultraviolet radiation field of the O star. We present a binary evolutionary scenario that is compatible with the current stellar and system parameters.
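The integrated line quantities mentioned (equivalent width, a centroid radial-velocity proxy, skewness, and kurtosis) are flux-weighted moments of the rectified line profile. A minimal sketch on a synthetic Gaussian emission line centred near C iii λ5696; the uniform wavelength grid and unit continuum are simplifying assumptions:

```python
import math

def line_moments(wavelength, flux, continuum=1.0):
    """Flux-weighted moments of an emission line in a rectified spectrum,
    assuming a uniform wavelength grid (illustrative implementation)."""
    excess = [f - continuum for f in flux]
    dw = wavelength[1] - wavelength[0]
    area = sum(excess) * dw
    ew = -area / continuum            # emission lines: negative EW convention
    cen = sum(w * e for w, e in zip(wavelength, excess)) * dw / area
    var = sum((w - cen) ** 2 * e for w, e in zip(wavelength, excess)) * dw / area
    sig = math.sqrt(var)
    skew = sum((w - cen) ** 3 * e
               for w, e in zip(wavelength, excess)) * dw / area / sig ** 3
    kurt = sum((w - cen) ** 4 * e
               for w, e in zip(wavelength, excess)) * dw / area / sig ** 4
    return ew, cen, skew, kurt

# Synthetic symmetric Gaussian emission line centred on 5696 A.
wl = [5696 + 0.1 * (k - 100) for k in range(201)]
fx = [1.0 + 2.0 * math.exp(-0.5 * ((w - 5696) / 2.0) ** 2) for w in wl]
ew, cen, skew, kurt = line_moments(wl, fx)
print(abs(cen - 5696) < 0.01 and abs(skew) < 0.01)  # True: symmetric profile
```

Tracking these moments epoch by epoch is what lets small-amplitude random variability (wind clumping) be separated from the coherent, periodic changes a corotating structure would imprint.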


2020 ◽  
Vol 10 (10) ◽  
pp. 3610 ◽  
Author(s):  
Simone Palladino ◽  
Luca Esposito ◽  
Paolo Ferla ◽  
Elena Totaro ◽  
Renato Zona ◽  
...  

Safety assessment of structures can be performed using limit design, which overcomes uncertainties about the actual response arising from inelastic constitutive behaviour and, more generally, from non-linear structural response and the random variability of loads. Limit analysis evaluates the safety of a structure directly from the load level, without any knowledge of the load history. In this paper, a lower-bound calculation is proposed in which a new strain-based approach describes the residual stresses and displacements in terms of permanent strains. The strategy uses the permanent strains as the effective parameters of the procedure, so that ductility requirements can be assessed for the complete load program up to collapse or shakedown. The procedure is validated against shakedown experiments on aluminium beams.
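The lower-bound (static) theorem behind such calculations states that any equilibrated stress field nowhere exceeding yield gives a safe estimate of the collapse load. A minimal sketch on the classic three-bar truss (one degree of static indeterminacy), not the paper's strain-based procedure, scans the admissible bar forces and keeps the largest equilibrated load:

```python
import math

# Three-bar truss: a vertical middle bar and two side bars at 45 degrees
# meet at a node carrying a vertical load P. Vertical equilibrium gives
#   N_mid + sqrt(2) * N_side = P
# (each side bar contributes N_side * cos 45, and there are two of them),
# leaving one degree of static indeterminacy.
N_p = 1.0                                          # plastic capacity per bar
grid = [N_p * k / 100 for k in range(-100, 101)]   # admissible: |N| <= N_p

# Lower-bound theorem: maximize P over equilibrated, admissible forces.
best_P = max(n_mid + math.sqrt(2) * n_side
             for n_mid in grid for n_side in grid)
print(round(best_P, 3))  # (1 + sqrt(2)) * N_p, the exact limit load here
```

For this statically simple case the scan recovers the analytical collapse load; the paper's contribution is to parameterize the search by permanent strains instead, which also exposes the ductility demand along the load program.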


1968 ◽  
Vol 17 (2) ◽  
pp. 333-358 ◽  
Author(s):  
P. Parisi ◽  
M. Di Bacco

Summary
A twin study was undertaken with the twofold aim (a) of studying the hereditary behaviour of digital dermatoglyphic traits at both the qualitative and the quantitative level, and (b) of working out a method for discriminating MZ and DZ twins by means of fingerprints. Fingerprints of 50 MZ (25 ♂ and 25 ♀) and 50 DZ (25 ♂ and 25 ♀) twin pairs were examined and analyzed by means of a special methodology and a 7044/K32 IBM computer.
The qualitative analysis showed significantly higher concordance in MZ than in DZ twin pairs, with some variability among single-finger concordance values. The quantitative analysis showed significantly higher correlation values in MZ than in DZ twin pairs, with very narrow confidence intervals in the former. Single ridge counts apparently behave like cumulative counts on the five or ten fingers, although with an obviously higher random variability.
Digital dermatoglyphics thus appear to show practically complete genetic conditioning which, rather than acting at a cumulative level for the ten fingers, as is widely believed, appears to act on single-finger quali-quantitative traits. The total finger ridge count is therefore not a trait in itself, but a useful, artificial cumulative value. Applied to the diagnosis of zygosity, it provides by itself a fairly high overall probability (0.86) of a correct diagnosis.
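The core comparison, a higher within-pair correlation for MZ than for DZ twins, can be sketched with simulated ridge counts. The variance fractions below are hypothetical placeholders, not the study's estimates:

```python
import math
import random

random.seed(3)

def corr(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sxx = sum((a - mx) ** 2 for a in xs)
    syy = sum((b - my) ** 2 for b in ys)
    return sxy / math.sqrt(sxx * syy)

def twin_pairs(n, shared):
    """Simulated (standardized) ridge counts for n twin pairs; `shared`
    is the fraction of variance common to both twins (assumed values)."""
    pairs = []
    for _ in range(n):
        g = random.gauss(0, 1)                       # shared component
        a = math.sqrt(shared) * g + math.sqrt(1 - shared) * random.gauss(0, 1)
        b = math.sqrt(shared) * g + math.sqrt(1 - shared) * random.gauss(0, 1)
        pairs.append((a, b))
    return pairs

mz = twin_pairs(500, 0.95)   # MZ pairs: nearly all variance shared
dz = twin_pairs(500, 0.50)   # DZ pairs: roughly half shared

r_mz = corr(*zip(*mz))
r_dz = corr(*zip(*dz))
print(r_mz > r_dz)  # True
```

The gap between the two correlations is what a cumulative value such as the total finger ridge count exploits when used, as in the study, for a probabilistic diagnosis of zygosity.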

