Parametric modelling of turbulence

Some steps are taken towards a parametric statistical model for the velocity and velocity-derivative fields in stationary turbulence, building on the background of existing theoretical and empirical knowledge of such fields. While the ultimate goal is a model for the three-dimensional velocity components, and hence for the corresponding velocity derivatives, we concentrate here on the streamwise velocity component. Discrete- and continuous-time stochastic processes of the first-order autoregressive type, with one-dimensional marginals having log-linear tails, are constructed and compared with two large data sets. It turns out that a first-order autoregression that fits the local correlation structure well is not capable of describing the correlations over longer ranges. A good fit locally as well as at longer ranges is achieved by using a process that is the sum of two independent autoregressions. We study this type of model in some detail. We also consider a model derived from the above-mentioned autoregressions whose dependence structure lies on the borderline of long-range dependence. This model is obtained by means of a general method for the construction of processes with long-range dependence. Some suggestions for future empirical and theoretical work are given.
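
As an illustration of the two-component construction mentioned above, the Python sketch below simulates the sum of two independent first-order autoregressions and prints its empirical autocorrelation at short and long lags; the coefficients (0.70 and 0.995) and the Gaussian innovations are arbitrary choices for illustration, not values fitted in the study.

```python
# Illustrative sketch only: the sum of two independent AR(1) processes, one
# capturing the local (short-lag) correlation and one decaying slowly.
# Coefficients and Gaussian innovations are arbitrary, not fitted values.
import numpy as np

def ar1(n, phi, rng):
    """Simulate n steps of a zero-mean AR(1) process x_t = phi*x_{t-1} + eps_t."""
    x = np.zeros(n)
    eps = rng.normal(size=n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + eps[t]
    return x

def acf(x, lag):
    """Empirical autocorrelation of x at the given lag."""
    x = x - x.mean()
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

rng = np.random.default_rng(0)
n = 200_000
u = ar1(n, 0.70, rng) + ar1(n, 0.995, rng)   # two-component "velocity" signal
print([round(acf(u, k), 3) for k in (1, 10, 100, 1000)])
```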

2019, Vol 8 (3), pp. 103
Author(s): Zhigang Han, Fen Qin, Caihui Cui, Yannan Liu, Lingling Wang, ...

A soil erosion model is used to evaluate soil erosion conditions and to guide agricultural production. Recently, high-spatial-resolution data have been collected in new ways, such as three-dimensional laser scanning, providing the foundation for refined soil erosion modelling. However, serial computing cannot fully meet the computational requirements of such massive data sets, so soil erosion modelling needs to be carried out within a parallel computing framework. This paper presents a parallel computing framework for soil erosion modelling based on the Hadoop platform. The framework comprises three layers: methodology, algorithm, and application. In the methodology layer, two parallel data-splitting strategies are defined: a row-oriented method and a sub-basin-oriented method. In the algorithm layer, six parallel calculation operators for local, focal, and zonal computing tasks are designed in detail; these operators can be called to compute the model factors and to perform the model calculations. A key-value data structure in GeoCSV format is defined for vector data and for row-based and cell-based rasters as the input to these algorithms. In the application layer, a geoprocessing toolbox is developed and integrated with a geographic information system (GIS) platform. The performance of the framework is examined using the Gushanchuan basin as an example. The results show that the framework can perform calculations on large data sets with high computational efficiency and GIS integration. The approach is easy to extend and use, and it provides essential support for applying high-precision data to refined soil erosion modelling.
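
As a plain-Python sketch of the key-value idea behind a "local" operator (the simplest of the three operator types), each raster cell can be keyed by its (row, col) position and carry the per-cell factors of a cell-wise erosion model. The 'row,col,R,K,LS,C,P' column layout and the RUSLE-style product below are illustrative assumptions, not the framework's actual GeoCSV schema or Hadoop code.

```python
# Hypothetical key-value sketch of a cell-wise ("local") operator.  The
# 'row,col,R,K,LS,C,P' layout and the RUSLE-style product A = R*K*LS*C*P are
# illustrative assumptions, not the framework's actual GeoCSV schema.
from typing import Dict, Tuple

Key = Tuple[int, int]          # (row, col) of a raster cell

def parse_cell(line: str) -> Tuple[Key, Dict[str, float]]:
    """Parse one assumed GeoCSV record into a (key, factors) pair."""
    row, col, r, k, ls, c, p = line.strip().split(",")
    factors = {"R": float(r), "K": float(k), "LS": float(ls),
               "C": float(c), "P": float(p)}
    return (int(row), int(col)), factors

def local_operator(record: Tuple[Key, Dict[str, float]]) -> Tuple[Key, float]:
    """Map-style step: each cell is processed independently of its neighbours."""
    key, f = record
    return key, f["R"] * f["K"] * f["LS"] * f["C"] * f["P"]

lines = ["0,0,120.0,0.32,1.8,0.45,1.0", "0,1,120.0,0.28,2.1,0.45,1.0"]
print([local_operator(parse_cell(l)) for l in lines])
```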


Fractals, 2007, Vol 15 (02), pp. 105-126
Author(s): Yingchun Zhou, Murad S. Taqqu

Bucket random permutations (shuffling) are used to modify the dependence structure of a time series, and they may destroy long-range dependence when it is present. Three types of bucket permutation are considered here: external, internal and two-level permutations. It is commonly believed that (1) an external random permutation destroys the long-range dependence and keeps the short-range dependence, (2) an internal permutation destroys the short-range dependence and keeps the long-range dependence, and (3) a two-level permutation distorts the medium-range dependence while keeping both the long-range and short-range dependence. This paper provides a theoretical basis for investigating these claims. It extends the study started in Ref. 1 and analyzes the effects that these random permutations have on a long-range dependent, finite-variance stationary sequence, both in the time domain and in the frequency domain.
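
Read literally, the external and internal permutations can be sketched as follows (a plain illustration of the definitions, not code from the paper; a two-level permutation composes the two ideas on nested buckets).

```python
# Illustrative bucket permutations on a series x with bucket size m.
import numpy as np

def external_permutation(x, m, seed=None):
    """Shuffle the order of the buckets; the order inside each bucket is kept."""
    rng = np.random.default_rng(seed)
    blocks = [x[i:i + m] for i in range(0, len(x), m)]
    order = rng.permutation(len(blocks))
    return np.concatenate([blocks[i] for i in order])

def internal_permutation(x, m, seed=None):
    """Shuffle the values inside each bucket; the order of the buckets is kept."""
    rng = np.random.default_rng(seed)
    return np.concatenate([rng.permutation(x[i:i + m])
                           for i in range(0, len(x), m)])

x = np.arange(12)
print(external_permutation(x, 4, seed=0))   # whole buckets change position
print(internal_permutation(x, 4, seed=0))   # values move only within buckets
```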


2001, Vol 34 (1), pp. 76-79
Author(s): Lynn Ribaud, Guang Wu, Yuegang Zhang, Philip Coppens

As the combination of high-intensity synchrotron sources and area detectors allows collection of large data sets in a much shorter time span than previously possible, the use of open helium gas-flow systems is much facilitated. A flow system installed at the SUNY X3 synchrotron beamline at the National Synchrotron Light Source has been used for collection of a number of large data sets at a temperature of ∼16 K. Instability problems encountered when using a helium cryostat for three-dimensional data collection are eliminated. Details of the equipment, its temperature calibration and a typical result are described.


2012, Vol 75 (6), pp. 1029-1035
Author(s): Ali Al Sakkaf, Geoff Jones

New Zealand has a high rate of reported campylobacteriosis compared with other developed countries. One possible reason is that local strains have greater heat tolerance and thus are better able to survive undercooking; this hypothesis is supported by the remarkably high D-values reported for Campylobacter jejuni in The Netherlands. The objective of this study was to investigate the thermal inactivation of isolates from New Zealand in broth, using strains that are commonly found in human cases and food samples in New Zealand. Typed Campylobacter strains were heated to a predetermined temperature using a submerged-coil heating apparatus. The first-order kinetic model has been used extensively in the calculation of the thermal inactivation parameters, D and z; however, nonlinear survival curves have been reported, and a number of models have been proposed to describe the patterns observed. Therefore, this study compared the conventional first-order model with eight nonlinear models for survival curves. Kinetic parameters were estimated using both one- and two-step regression techniques. In general, nonlinear models fit the individual inactivation data sets better than the log-linear model. However, the log-linear and the (nonlinear) Weibull models were the only models that could be successfully fitted to all data sets. For seven relevant New Zealand C. jejuni strains, at temperatures from 51.5 to 60°C, D- and z-values were obtained, ranging from 1.5 to 228 s and 4 to 5.2°C, respectively. These values are in broad agreement with published international data and do not indicate that the studied New Zealand C. jejuni strains are more heat resistant than other strains, in contrast with some reports from The Netherlands.
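
For reference, the two primary models that could be fitted to all data sets, and the usual secondary model relating D-values to temperature through z, can be written out as below; the parameter values are arbitrary illustrations, not the fitted New Zealand estimates.

```python
# Worked sketch of the survival models; parameter values are illustrative only.
import numpy as np

def log_linear(t, log_n0, d):
    """First-order model: log10 N(t) = log10 N0 - t / D (one log per D seconds)."""
    return log_n0 - t / d

def weibull(t, log_n0, delta, p):
    """Weibull model: log10 N(t) = log10 N0 - (t / delta) ** p."""
    return log_n0 - (t / delta) ** p

def d_at_temperature(temp, d_ref, temp_ref, z):
    """Secondary (Bigelow) model: log10 D(T) = log10 D(Tref) - (T - Tref) / z."""
    return 10 ** (np.log10(d_ref) - (temp - temp_ref) / z)

t = np.array([0.0, 30.0, 60.0, 120.0])                    # heating time, s
print(log_linear(t, log_n0=7.0, d=60.0))                  # straight-line survival
print(weibull(t, log_n0=7.0, delta=45.0, p=0.7))          # curved survival
print(d_at_temperature(60.0, d_ref=100.0, temp_ref=55.0, z=4.5))  # D at 60 °C
```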


PeerJ, 2021, Vol 9, pp. e10545
Author(s): Matt A. White, Nicolás E. Campione

Classifying isolated vertebrate bones to a high level of taxonomic precision can be difficult. Many of Australia’s Cretaceous terrestrial vertebrate fossil-bearing deposits, for example, produce large numbers of isolated bones and very few associated or articulated skeletons. Identifying these often fragmentary remains beyond high-level taxonomic ranks, such as Ornithopoda or Theropoda, is difficult, and those classified to lower taxonomic levels are often debated. The ever-increasing accessibility of 3D-based comparative techniques has allowed palaeontologists to undertake a variety of shape analyses, such as geometric morphometrics, which, although powerful and often ideal, require the recognition of diagnostic landmarks and the generation of sufficiently large data sets to detect clusters and accurately describe major components of morphological variation. As a result, such approaches are often outside the scope of basic palaeontological research that aims simply to identify fragmentary specimens. Herein we present a workflow in which pairwise comparisons between fragmentary fossils and better-known exemplars are made digitally through three-dimensional mapping of their surface profiles and the iterative closest point (ICP) algorithm. To showcase this methodology, we compared a fragmentary theropod ungual (NMV P186153) from Victoria, Australia, identified as a neovenatorid, with the manual unguals of the megaraptoran Australovenator wintonensis (AODF604). We discovered that NMV P186153 was a near-identical match to AODF604 manual ungual II-3, differing only in size, which, given their 10–15 Ma age difference, suggests stasis in megaraptoran ungual morphology throughout this interval. Although useful, our approach is not free of subjectivity; care must be taken to eliminate the effects of broken and incomplete surfaces and to identify the human errors incurred during scaling, such as through replication. Nevertheless, this approach will help to evaluate and identify fragmentary remains, adding a quantitative perspective to an otherwise qualitative endeavour.
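
The core ICP step can be sketched in a few lines of numpy/scipy (a generic point-to-point implementation, not the software used in the study; real meshes would first be sampled to point clouds and, as noted above, rescaled before comparison).

```python
# Generic point-to-point ICP sketch; not the study's implementation.
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, c_dst - R @ c_src

def icp(source, target, iters=50):
    """Iteratively align source to target via nearest-neighbour correspondences."""
    tree = cKDTree(target)
    src = source.copy()
    for _ in range(iters):
        _, idx = tree.query(src)
        R, t = best_rigid_transform(src, target[idx])
        src = src @ R.T + t
    dists, _ = tree.query(src)
    return src, float(dists.mean())  # aligned cloud and mean surface deviation

rng = np.random.default_rng(0)
cloud = rng.normal(size=(500, 3))
theta = 0.1                                           # a mild rigid motion
rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                [np.sin(theta),  np.cos(theta), 0.0],
                [0.0, 0.0, 1.0]])
aligned, err = icp(cloud, cloud @ rot.T + [0.5, -0.2, 0.1])
print(err)                                            # small for a good match
```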


2020, Vol 17 (2), pp. 297-307
Author(s): Bikramaditya Ghosh, Saleema J. S., Aniruddha Oak, Manu K. S., Sangeetha R.

Long-range dependence (LRD) in financial markets remains a key factor in determining whether there is market memory, herding, or a bubble in the economy. Usually referred to as ‘long memory’, LRD has remained a key parameter since the mid-1970s. In November 2016, a sudden and drastic demonetization measure took place in the Indian market, aimed at curbing money laundering and terrorist funding. This study attempts to identify market behavior, through long-range dependence, during the few days surrounding demonetization, and to detect nascent traces of a bubble and embedded herding during that time. An autoregressive fractionally integrated moving average (ARFIMA) model is fitted for three consecutive days around the event, using tick-by-tick data from CNX Nifty high-frequency trading (CNX Nifty HFT), approximately 5,000 data points from the morning trading session on each of the three days. The results show a clear and profound presence of herd behavior in all three data sets. The herd intensity remained similar across the days, indicating a mixture of both the ‘Noah effect’ and the ‘Joseph effect’ and pointing to a clear regime switch, with the event day itself showing stable and prominent herding. Mandelbrot’s effects were thus tested on a sudden and uncertain financial event in India and were found to describe it well.
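
For readers who want a quick check of the memory parameter without fitting the full ARFIMA model, a log-periodogram (GPH) estimate of the fractional-differencing parameter d can be sketched as follows; this is an illustration only, not the study's procedure, which is the ARFIMA fit described above.

```python
# Illustrative GPH (log-periodogram) estimate of the long-memory parameter d
# (0 < d < 0.5 indicates long-range dependence); not the study's ARFIMA fit.
import numpy as np

def gph_estimate(x, frac=0.5):
    """Regress log I(w_j) on log(4 sin^2(w_j/2)) over the lowest frequencies."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    m = int(n ** frac)                                  # number of frequencies used
    freqs = 2.0 * np.pi * np.arange(1, m + 1) / n
    periodogram = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2.0 * np.pi * n)
    slope = np.polyfit(np.log(4.0 * np.sin(freqs / 2.0) ** 2),
                       np.log(periodogram), 1)[0]
    return -slope                                       # estimate of d

rng = np.random.default_rng(0)
print(round(gph_estimate(rng.normal(size=5000)), 3))    # roughly 0 for white noise
```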


2005, Vol 37 (02), pp. 342-365
Author(s): C. C. Heyde, N. N. Leonenko

Stochastic processes with Student marginals and various types of dependence structure, allowing for both short- and long-range dependence, are discussed in this paper. A particular motivation is the modelling of risky asset time series.


2013, Vol 6 (4), pp. 1261-1273
Author(s): T. Heus, A. Seifert

Abstract. This paper presents a method for feature tracking of fields of shallow cumulus convection in large eddy simulations (LES) by connecting the projected cloud cover in space and time and by accounting for splitting and merging of cloud objects. Existing methods tend to be either imprecise or, when using the full three-dimensional (3-D) spatial field, prohibitively expensive for large data sets. Compared with those 3-D methods, the current method reduces the memory footprint by up to a factor of 100, while retaining most of the precision by correcting for splitting and merging events between different clouds. The precision of the algorithm is further enhanced by taking the vertical extent of the cloud into account. Rain and subcloud thermals are also tracked, and links between clouds, their rain, and their subcloud thermals are established. The method compares well with results from the literature. Resolution and domain dependencies are also discussed. For the current simulations, the cloud size distribution converges for clouds larger than an effective resolution of 6 times the horizontal grid spacing and smaller than about 20% of the horizontal domain size.
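
A toy version of the overlap-based linking between two output times can be written with scipy's connected-component labelling; the published algorithm additionally uses the vertical cloud extent and corrects for splits and merges, whereas this sketch only records which projected clouds overlap.

```python
# Toy sketch of tracking by overlap of projected cloud masks between two times.
import numpy as np
from scipy import ndimage

def project(liquid_water_3d, threshold=1e-6):
    """2-D projected cloud mask from a 3-D liquid-water field ordered (z, y, x)."""
    return liquid_water_3d.max(axis=0) > threshold

def link_clouds(mask_prev, mask_now):
    """Pairs (label_prev, label_now) of clouds whose projections overlap."""
    lab_prev, _ = ndimage.label(mask_prev)
    lab_now, _ = ndimage.label(mask_now)
    both = (lab_prev > 0) & (lab_now > 0)
    links = {(int(a), int(b)) for a, b in zip(lab_prev[both], lab_now[both])}
    return sorted(links)   # repeated first label = split, repeated second = merge

a = np.zeros((1, 5, 5)); a[0, 1:3, 1:3] = 1e-5
b = np.zeros((1, 5, 5)); b[0, 2:4, 2:4] = 1e-5
print(link_clouds(project(a), project(b)))   # [(1, 1)]
```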


Geophysics, 1990, Vol 55 (10), pp. 1321-1326
Author(s): X. Wang, R. O. Hansen

Two-dimensional (profile) inversion techniques for magnetic anomalies are widely used in exploration geophysics, but, until now, the three-dimensional (3-D) methods available have been restricted in their geologic applicability, dependent upon good initial values, or limited by the capabilities of existing computers. We have developed a fully 3-D inversion algorithm intended for routine application to large data sets. The algorithm, based on a Fourier-transform expression for the magnetic field of homogeneous polyhedral bodies (Hansen and Wang, 1998), is a 3-D generalization of CompuDepth (O’Brien, 1972). Like CompuDepth, the new inversion algorithm employs the spatial equivalent of frequency-domain autoregression to determine a series of coefficients from which the depths and locations of polyhedral vertices are calculated by solving complex polynomials. These vertices are used to build a 3-D geologic model. Application to the Medicine Lake Volcano aeromagnetic anomaly resulted in a geologically reasonable model of the source.


2006, Vol 21 (2), pp. 102-104
Author(s): Colleen S. Frazer, Mark A. Rodriguez, Ralph G. Tissot

The Interactive Data Language (IDL) has been used to produce a software program capable of advanced three-dimensional visualizations of pole figure and θ-2θ data. The data can also be used to calculate quantitative properties such as strain level and to minimize the peak-height texture effects in individual θ-2θ scans. The collection of the large data sets necessary for these analyses is facilitated by the use of a position-sensitive detector or an area detector.

