Entropy budget and coherent structures associated with a spectral closure model of turbulence

2018 ◽  
Vol 857 ◽  
pp. 806-822
Author(s):  
Rick Salmon

We ‘derive’ the eddy-damped quasi-normal Markovian model (EDQNM) by a method that replaces the exact equation for the Fourier phases with a solvable stochastic model, and we analyse the entropy budget of the EDQNM. We show that a quantity that appears in the probability distribution of the phases may be interpreted as the rate at which entropy is transferred from the Fourier phases to the Fourier amplitudes. In this interpretation, the decrease in phase entropy is associated with the formation of structures in the flow, and the increase of amplitude entropy is associated with the spreading of the energy spectrum in wavenumber space. We use Monte Carlo methods to sample the probability distribution of the phases predicted by our theory. This distribution contains a single adjustable parameter that corresponds to the triad correlation time in the EDQNM. Flow structures form as the triad correlation time becomes very large, but the structures take the form of vorticity quadrupoles that do not resemble the monopoles and dipoles that are actually observed.

Fluids ◽  
2021 ◽  
Vol 6 (3) ◽  
pp. 105
Author(s):  
Ichiro Ueno

Coherent structures formed by particles suspended in half-zone thermocapillary liquid bridges, as revealed by experimental approaches, are introduced. General knowledge on particle accumulation structures (PAS) is described, and the spatio-temporal behaviours of the particles forming the PAS are then illustrated with results of two- and three-dimensional particle tracking. Variations of the coherent structures as functions of the intensity of the thermocapillary effect and of the particle size are introduced, with a focus on the PAS of azimuthal wave number m=3. The correlation between the particle behaviour and the ordered flow structures known as Kolmogorov–Arnold–Moser tori is discussed. Recent works on the PAS of m=1 are briefly introduced.


Author(s):  
Hasanatul Iftitah ◽  
Y Yuhandri

Vocational High School (SMK) Negeri 4 Kota Jambi is one of the favorite vocational schools in Jambi City and the only purely tourism-oriented vocational school in Jambi Province. It offers several vocational majors, namely culinary arts, beauty, fashion, and hospitality. In general, students who choose a vocational school hope to work immediately after graduating, without having to continue their studies first. This study predicts the rate at which graduates of SMK Negeri 4 Kota Jambi are accepted into the business and industrial world using the Monte Carlo method. Monte Carlo is a method that approximates the actual value of future events based on the distribution of sampled data; its technique is to select random numbers from a probability distribution to drive the simulation. The data used are records of SMK Negeri 4 Kota Jambi graduates who found work between the 2015/2016 and 2018/2019 academic years. These data are processed with the Monte Carlo method, and the simulation is implemented in PHP. The result is a prediction accuracy of 84% for the acceptance of SMK Negeri 4 Kota Jambi graduates into the business and industrial world.
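
The abstract describes the Monte Carlo procedure only at a high level. A minimal Python sketch of the standard version of that procedure, building cumulative-probability intervals from historical frequencies and mapping uniform random draws back to outcomes, is given below; the frequency values are illustrative placeholders, not the study's data.

```python
import random

# Hypothetical historical frequencies of graduates absorbed by industry each year
# (value -> frequency); illustrative numbers only, not the study's data.
historical_counts = {120: 2, 135: 4, 150: 3, 165: 1}

def build_intervals(counts):
    """Convert frequencies into cumulative-probability intervals for sampling."""
    total = sum(counts.values())
    intervals, lower = [], 0.0
    for value, freq in sorted(counts.items()):
        upper = lower + freq / total
        intervals.append((lower, upper, value))
        lower = upper
    return intervals

def simulate(intervals, n_years=4, seed=42):
    """Draw one uniform random number per simulated year and map it to a value."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_years):
        u = rng.random()
        for lower, upper, value in intervals:
            if lower <= u < upper:
                results.append(value)
                break
        else:
            results.append(intervals[-1][2])  # guard against floating-point round-off
    return results

predicted = simulate(build_intervals(historical_counts))
print(predicted)
```

The predicted values would then be compared against the observed figures for each academic year to obtain an accuracy percentage such as the 84% reported above.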


2021 ◽  
Author(s):  
Faezeh Ghasemnezhad ◽  
Ommolbanin Bazrafshan ◽  
Mehdi Fazeli ◽  
Mohammad Parvinnia ◽  
Vijay Singh

Abstract The Standardized Runoff Index (SRI), one of the best-known hydrological drought indices, may contain uncertainties arising from the choice of distribution function, time scale, and record length of the statistical data. In this study, the uncertainty in SRI estimates from monthly discharge records of 30 and 49 years from the Minab dam watershed in southern Iran was investigated. Four probability distribution functions (gamma, Weibull, lognormal, and normal) were used to fit the cumulative discharge data at 3-, 6-, 9-, 12-, 24- and 48-month time scales, with their goodness of fit and normality evaluated by the Kolmogorov–Smirnov (K-S) and normality tests, respectively. Using Monte Carlo sampling, 50,000 synthetic values were generated for each event and each time scale, and 95% confidence intervals were then computed. The width of the confidence interval was taken as the measure of uncertainty, and the sources of uncertainty were traced to the factors above. The maximum uncertainty was associated with the normal and lognormal distributions and the minimum with the gamma and Weibull distributions. Further, increasing either the time scale or the record length decreased the uncertainty.
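
The sampling step is only outlined in the abstract. Under the assumption that the synthetic series are drawn by a parametric bootstrap from the fitted distribution, a minimal Python sketch of the confidence-interval-width measure of uncertainty for a gamma-based SRI could look like this (the discharge record below is synthetic, not the Minab dam data, and the bootstrap size is reduced for speed):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical aggregated monthly discharge record (illustrative only).
discharge = rng.gamma(shape=2.0, scale=15.0, size=30 * 12)

def sri(value, sample):
    """SRI: gamma CDF of the value, mapped to a standard-normal quantile."""
    shape, loc, scale = stats.gamma.fit(sample, floc=0)
    p = stats.gamma.cdf(value, shape, loc=loc, scale=scale)
    return stats.norm.ppf(p)

def sri_confidence_width(value, sample, n_boot=500, level=0.95):
    """Parametric bootstrap: refit the distribution to synthetic records of the
    same length and use the spread of the resulting SRI values as uncertainty."""
    shape, loc, scale = stats.gamma.fit(sample, floc=0)
    estimates = []
    for _ in range(n_boot):
        synthetic = stats.gamma.rvs(shape, loc=loc, scale=scale,
                                    size=len(sample), random_state=rng)
        estimates.append(sri(value, synthetic))
    lo, hi = np.percentile(estimates, [(1 - level) / 2 * 100, (1 + level) / 2 * 100])
    return hi - lo  # interval width used as the uncertainty measure

print(sri_confidence_width(discharge[-1], discharge))
```

Repeating this for each candidate distribution, time scale, and record length reproduces the kind of comparison reported in the study.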


2019 ◽  
Author(s):  
Adrian S. Wong ◽  
Kangbo Hao ◽  
Zheng Fang ◽  
Henry D. I. Abarbanel

Abstract. Statistical Data Assimilation (SDA) is the transfer of information from field or laboratory observations to a user-selected model of the dynamical system producing those observations. The data are noisy and the model has errors; the information transfer addresses properties of the conditional probability distribution of the model states conditioned on the observations. The quantities of interest in SDA are the conditional expected values of functions of the model state, and these require the approximate evaluation of high-dimensional integrals. We introduce a conditional probability distribution and use the Laplace method with annealing to identify the maxima of the conditional probability distribution. The annealing method slowly increases the precision term of the model as it enters the Laplace method. In this paper, we extend the idea of precision annealing (PA) to Monte Carlo calculations of conditional expected values using Metropolis-Hastings methods.
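
As a compact illustration of the precision-annealing Metropolis-Hastings idea, the sketch below anneals the model-precision coefficient in a toy one-dimensional map standing in for the dynamical model; the map f, the noise levels, and the annealing ladder are illustrative assumptions, not the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):
    """Toy scalar dynamics (illustrative stand-in for the user-selected model)."""
    return 0.9 * x + 0.5 * np.sin(x)

# Synthetic noisy observations of a short trajectory (illustrative only).
T = 20
truth = np.empty(T)
truth[0] = 1.0
for t in range(T - 1):
    truth[t + 1] = f(truth[t])
obs = truth + 0.2 * rng.standard_normal(T)

def action(path, Rm, Rf):
    """Negative log of the conditional density: measurement plus model-error terms."""
    meas = 0.5 * Rm * np.sum((obs - path) ** 2)
    model = 0.5 * Rf * np.sum((path[1:] - f(path[:-1])) ** 2)
    return meas + model

def metropolis(path, Rm, Rf, n_steps=2000, step=0.05):
    """Random-walk Metropolis-Hastings targeting exp(-action) at fixed precision Rf."""
    current = action(path, Rm, Rf)
    for _ in range(n_steps):
        proposal = path + step * rng.standard_normal(T)
        proposed = action(proposal, Rm, Rf)
        if np.log(rng.random()) < current - proposed:
            path, current = proposal, proposed
    return path

# Precision annealing: slowly raise the model precision Rf, reusing the previous
# level's sample as the starting path for the next level.
path = obs.copy()
Rm = 25.0
for Rf in 2.0 ** np.arange(0, 15):
    path = metropolis(path, Rm, Rf)

print(np.round(path, 3))
```

In practice one retains the samples at each precision level to estimate the conditional expected values, rather than only the final path as in this sketch.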


2016 ◽  
Author(s):  
William Gilpin ◽  
Vivek N. Prakash ◽  
Manu Prakash

Abstract. We present a simple, intuitive algorithm for visualizing time-varying flow fields that can reveal complex flow structures with minimal user intervention. We apply this technique to a variety of biological systems, including the swimming currents of invertebrates and the collective motion of swarms of insects. We compare our results to more experimentally difficult and mathematically sophisticated techniques for identifying patterns in fluid flows, and suggest that our tool represents an essential “middle ground” that allows experimentalists to easily determine whether a system exhibits interesting flow patterns and coherent structures, without the need to resort to more intensive techniques. In addition to being informative, the visualizations generated by our tool are often striking and elegant, illustrating coherent structures directly from videos without the need for computational overlays. Our tool is available as fully documented open-source code for MATLAB, Python, or ImageJ at www.flowtrace.org.
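
The tool itself is documented at www.flowtrace.org. As an illustration of the kind of streak-style visualization the abstract describes, here is a minimal Python sketch of a sliding-window maximum-intensity projection of a video; this is a common way to produce such pathline-like images and is offered as an assumption, not as a transcription of the authors' code.

```python
import numpy as np

def streak_projection(frames, window=30):
    """Sliding-window maximum-intensity projection of a grayscale video.

    frames: array of shape (n_frames, height, width).
    Returns an array of the same shape in which moving tracers leave streaks,
    making flow structures visible without computational overlays.
    """
    frames = np.asarray(frames, dtype=float)
    n = frames.shape[0]
    out = np.empty_like(frames)
    for i in range(n):
        out[i] = frames[i:i + window].max(axis=0)
    return out

# Minimal usage example on synthetic data: a bright particle drifting across the frame.
if __name__ == "__main__":
    n, h, w = 60, 64, 64
    video = np.zeros((n, h, w))
    for t in range(n):
        video[t, 32, t % w] = 1.0        # moving tracer
    streaks = streak_projection(video, window=20)
    print(streaks[0].sum())              # the streak frame integrates many tracer positions
```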


2005 ◽  
Vol 23 (6) ◽  
pp. 429-461
Author(s):  
Ian Lerche ◽  
Brett S. Mudford

This article derives an estimation procedure to evaluate how many Monte Carlo realisations need to be done in order to achieve prescribed accuracies in the estimated mean value, and also in the cumulative probabilities of achieving values greater than, or less than, a particular value as that chosen value is allowed to vary. In addition, by inverting the argument and asking what accuracies result for a prescribed number of Monte Carlo realisations, one can assess the computer time that would be involved should one choose to carry out the realisations. The arguments and numerical illustrations are carried through in detail for the four distributions lognormal, binomial, Cauchy, and exponential, but the procedure is valid for any choice of distribution function. The general method given in Lerche and Mudford (2005) is not merely a coincidence owing to the nature of the Gaussian distribution but is of universal validity. This article provides (in the Appendices) the general procedure for obtaining equivalent results for any distribution and shows quantitatively how the procedure operates for the four specific distributions; the methodology is therefore available for any choice of probability distribution function. Some distributions need more than two parameters to be defined precisely. Estimates of the mean value and of the standard error around the mean determine only two parameters for each distribution, so any distribution with more than two parameters has degrees of freedom that either must be constrained from other information or are unknown and can be freely specified. That fluidity in such distributions allows a similar fluidity in the estimates of the number of Monte Carlo realisations needed to achieve prescribed accuracies, as well as in the estimates of the accuracy achievable for a prescribed number of realisations. Without some way to control the free parameters in such distributions one will, presumably, always have such dynamic uncertainties. Even when the free parameters are known precisely, there is still considerable uncertainty in determining the number of Monte Carlo realisations needed to achieve prescribed accuracies, and in the accuracies achievable with a prescribed number of Monte Carlo realisations, because of the different functional forms of probability distribution from which one can choose to draw the realisations. Without knowledge of the distribution functions appropriate to a given problem, the choices made in the numerical implementation of the basic procedure will presumably bias the estimates of achievable accuracy and of the number of realisations one should undertake. The cautionary note, which is the main point of this article and is exhibited sharply with numerical illustrations, is that one must specify precisely what distributions are being used and what free parameter values have been chosen (and why) when assessing the achievable accuracy and the number of Monte Carlo realisations needed. Without such information it is not a very useful exercise to undertake Monte Carlo realisations, because other investigations, using other distributions and other values of the available free parameters, will arrive at very different conclusions.
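
The article develops this trade-off distribution by distribution; the simplest central-limit version of the argument for the mean value, offered here only as orientation and not as the article's full procedure, can be written in a few lines of Python (the sigma and epsilon values are illustrative):

```python
import math

def realisations_needed(sigma, epsilon, z=1.96):
    """Number of Monte Carlo realisations so the estimated mean lies within
    +/- epsilon of the true mean at roughly 95% confidence, using the
    central-limit estimate epsilon = z * sigma / sqrt(N)."""
    return math.ceil((z * sigma / epsilon) ** 2)

def accuracy_achievable(sigma, n, z=1.96):
    """The inverted question: confidence-interval half-width for a fixed N."""
    return z * sigma / math.sqrt(n)

# Illustrative numbers only: a spread of sigma = 2.5 units.
print(realisations_needed(sigma=2.5, epsilon=0.1))   # ~2401 realisations
print(accuracy_achievable(sigma=2.5, n=10_000))      # ~0.049 units
```

The article's point is precisely that sigma, and hence these estimates, depend strongly on which distribution and which free-parameter values are assumed.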


2021 ◽  
pp. 1-8
Author(s):  
Arley S. Carvalhal ◽  
Gloria M. N. Costa ◽  
Silvio A. B. Vieira de Melo

Summary Uncertainties regarding the factors that influence asphaltene deposition in porous media (e.g., those resulting from oil composition, rock properties, and rock/fluid interaction) strongly affect the prediction of important variables such as oil production. In addition, some aspects of these predictions, such as the aggregation of asphaltene precipitates, are stochastic processes. For this reason, a single well-defined output from an asphaltene-deposition model may not be attainable; instead, the objective of rigorous modeling of this phenomenon should be to obtain the probability distribution of the important outputs (e.g., permeability reduction and oil production). This probability distribution would support the design of a risk-based policy for the prevention and mitigation of asphaltene deposition. In this paper we present a new approach to assessing the risk of formation damage caused by asphaltene deposition using Monte Carlo simulations. With this approach, the probability distribution function of the permeability reduction was obtained. To connect this information to a parameter more closely tied to economic concepts, the probability distribution of the damage ratio (DR), the fraction of production loss caused by formation damage, was also calculated. A hypothetical scenario involving a decision on asphaltene-prevention policy is presented as an application of the method. A novel approach to modeling the prevention of asphaltene aggregation using inhibitors is also proposed and successfully applied in this scenario.
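
The workflow described here, propagating uncertain inputs through a deposition model by Monte Carlo sampling and reading off the probability distributions of permeability reduction and damage ratio, can be illustrated with a deliberately simplified placeholder model. The input distributions, the power-law damage relation, and the assumption that production scales with permeability are all illustrative assumptions, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(7)

def permeability_reduction(deposit_fraction, exponent):
    """Placeholder damage model: permeability ratio as a power law of the
    fraction of pore volume occupied by deposits (illustrative only)."""
    return (1.0 - deposit_fraction) ** exponent

n_trials = 50_000

# Uncertain inputs drawn from assumed distributions (illustrative, not field data).
deposit_fraction = rng.beta(2.0, 8.0, n_trials)   # fraction of pore volume plugged
exponent = rng.normal(3.0, 0.5, n_trials)         # sensitivity of permeability to plugging

k_ratio = permeability_reduction(deposit_fraction, exponent)

# Damage ratio: fraction of production lost, taking production proportional to permeability.
damage_ratio = 1.0 - k_ratio

print("P50 damage ratio:", np.percentile(damage_ratio, 50).round(3))
print("P90 damage ratio:", np.percentile(damage_ratio, 90).round(3))
print("P(DR > 0.5):", float((damage_ratio > 0.5).mean()))
```

A risk-based prevention policy can then be compared against the do-nothing case by rerunning the same sampling with the input distributions shifted to reflect inhibitor use.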

