A Mathematical Method for Determining the Parameters of Functional Dependencies Using Multiscale Probability Distribution Functions

Mathematics ◽  
2021 ◽  
Vol 9 (10) ◽  
pp. 1085
Author(s):  
Ilya E. Tarasov

This article discusses a method for approximating experimental data by functional dependencies that uses a probabilistic assessment of the deviation of the assumed dependence from the experimental data. The method introduces an independent parameter, the scale of the error probability distribution function, and allows one to synthesize deviation functions that form spaces with a nonlinear metric, based on existing assumptions about the sources of errors and noise. Classical regression analysis can be obtained from the considered method as a special case. The article examines examples of analysis of experimental data and shows the high resistance of the method to the appearance of single outliers in the sample under study. Since the introduction of an independent parameter increases the number of computations, for the practical application of the method in measuring and information systems, the architecture of a specialized computing device of the "system on a chip" class and practical approaches to its implementation based on programmable logic integrated circuits are considered.
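The abstract does not spell out the algorithm, so the following is only a rough Python sketch of the general idea it describes: maximum-likelihood fitting with an explicit error-PDF scale parameter, where a Gaussian error PDF reduces to ordinary least squares and a heavy-tailed PDF (a Cauchy model is assumed here purely for illustration) yields a nonlinear deviation metric that resists a single outlier. All data and parameter values below are made up.

```python
# Illustrative sketch (not the author's exact formulation): fit y = a*x + b by
# minimising the negative log-likelihood of the residuals under an error PDF
# with an explicit scale parameter s. A Gaussian PDF recovers least squares;
# a Cauchy PDF gives an outlier-resistant, nonlinear deviation metric.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import cauchy, norm

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 30)
y = 2.0 * x + 1.0 + rng.normal(scale=0.3, size=x.size)
y[15] += 25.0  # inject a single gross outlier

def neg_log_likelihood(params, dist):
    a, b, log_s = params
    resid = y - (a * x + b)
    return -np.sum(dist.logpdf(resid, scale=np.exp(log_s)))

x0 = np.array([1.0, 0.0, 0.0])
fit_gauss = minimize(neg_log_likelihood, x0, args=(norm,), method="Nelder-Mead")
fit_cauchy = minimize(neg_log_likelihood, x0, args=(cauchy,), method="Nelder-Mead")

print("Gaussian-error fit (≈ least squares):", fit_gauss.x[:2])
print("Cauchy-error fit (outlier-resistant):", fit_cauchy.x[:2])
```

Running the sketch shows the slope and intercept of the heavy-tailed fit staying close to the true values despite the outlier, which is the behaviour the article attributes to the proposed method.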

2021 ◽  
Vol 11 (8) ◽  
pp. 3310
Author(s):  
Marzio Invernizzi ◽  
Federica Capra ◽  
Roberto Sozzi ◽  
Laura Capelli ◽  
Selena Sironi

For environmental odor nuisance, it is extremely important to identify the instantaneous concentration statistics. In this work, a Fluctuating Plume Model for different statistical moments is proposed. It provides data in terms of mean concentration, variance, and concentration intensity. The 90th-percentile peak-to-mean factor, R90, was tested here by comparing it with the experimental results (Uttenweiler field experiment), considering different Probability Distribution Functions (PDFs): the Gamma and the Modified Weibull. Seventy-two percent of the simulated mean concentration values fell within a factor of 2 of the experimental ones, so the model was judged acceptable. Both the modelled standard deviation, σC, and concentration intensity, Ic, overestimate the experimental data; this may be due to the non-ideality of the measurement system. The propagation of those errors to the estimation of R90 is complex, but the ranges covered are quite repeatable: 1–3 for the Gamma PDF, 1.5–4 for the Modified Weibull PDF, and 1.4–3.6 for the experimental data.
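As a minimal sketch of how a 90th-percentile peak-to-mean factor can be obtained from an assumed concentration PDF, the snippet below parameterises a Gamma distribution from the mean concentration and the concentration intensity Ic = σC / mean and reads off R90 as the 90th percentile divided by the mean. The Gamma parameterisation shown (shape = 1/Ic², scale = mean·Ic²) is an assumption for illustration, not necessarily the exact formulation used in the paper.

```python
# Hedged sketch: R90 from an assumed Gamma concentration PDF parameterised by
# the mean concentration and the concentration intensity Ic = sigma_C / mean.
import numpy as np
from scipy.stats import gamma

def r90_gamma(c_mean, ic):
    k = 1.0 / ic**2           # shape parameter
    theta = c_mean * ic**2    # scale parameter, so that k * theta = c_mean
    c90 = gamma.ppf(0.90, a=k, scale=theta)
    return c90 / c_mean

for ic in (0.5, 1.0, 1.5):
    print(f"Ic = {ic:.1f}  ->  R90 = {r90_gamma(1.0, ic):.2f}")
```

With these assumptions, increasing Ic pushes R90 upward, which is consistent with the range of values reported in the abstract.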


1997 ◽  
Vol 78 (10) ◽  
pp. 1904-1907 ◽  
Author(s):  
Weinan E ◽  
Konstantin Khanin ◽  
Alexandre Mazel ◽  
Yakov Sinai

Author(s):  
Adam Barylski ◽  
Mariusz Deja

Silicon wafers are the most widely used substrates for fabricating integrated circuits. A sequence of processes is needed to turn a silicon ingot into silicon wafers. One of these processes is flattening by lapping or by grinding to achieve a high degree of flatness and parallelism of the wafer [1, 2, 3]. Lapping can effectively remove or reduce the waviness induced by preceding operations [2, 4]. The main aim of this paper is to compare the simulation results with lapping experimental data obtained from the Polish producer of silicon wafers, the company Cemat Silicon from Warsaw (www.cematsil.com). The proposed model is going to be implemented by this company for tool wear prediction. The model can be applied to lapping or grinding with single- or double-disc lapping kinematics [5, 6, 7]. Geometrical and kinematical relations, together with the simulations, are presented in the work. Results generated for a given workpiece diameter and for different kinematical parameters are studied using models programmed in the Matlab environment.
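The geometrical and kinematical relations themselves are not reproduced in the abstract; the sketch below is only an assumed planetary-kinematics illustration (in Python rather than the authors' Matlab) of the kind of quantity such a model can produce: the trajectory of a single point on the wafer relative to the lapping plate and the resulting sliding distance, which is a typical input to tool-wear estimates. All parameter names and values are hypothetical.

```python
# Illustrative sketch (not the authors' model): relative trajectory of one point
# on a wafer over the lapping plate for assumed planetary kinematics.
# omega_p: plate speed, omega_c: orbital speed of the wafer centre,
# omega_w: wafer spin, R_c: orbit radius of the wafer centre [mm],
# r_p: radius of the tracked point on the wafer [mm].
import numpy as np

def point_trajectory(t, omega_p, omega_c, omega_w, R_c, r_p):
    # position of the point in the fixed machine frame
    xm = R_c * np.cos(omega_c * t) + r_p * np.cos(omega_w * t)
    ym = R_c * np.sin(omega_c * t) + r_p * np.sin(omega_w * t)
    # transform into the frame rotating with the lapping plate
    c, s = np.cos(-omega_p * t), np.sin(-omega_p * t)
    return c * xm - s * ym, s * xm + c * ym

t = np.linspace(0.0, 60.0, 20_000)  # 60 s of process time
x, y = point_trajectory(t, omega_p=2.0, omega_c=0.8, omega_w=3.5,
                        R_c=120.0, r_p=40.0)
sliding_distance = np.sum(np.hypot(np.diff(x), np.diff(y)))  # mm
print(f"sliding distance over the plate: {sliding_distance:.0f} mm")
```

Sweeping the angular speeds in such a sketch shows how different kinematical parameters change the coverage of the plate by the workpiece, which is the mechanism behind uneven tool wear.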


2021 ◽  
Author(s):  
Hamed Farhadi ◽  
Manousos Valyrakis

Applying an instrumented particle [1-3], the probability density functions of the kinetic energy of a coarse particle (at different solid densities) mobilised over a range of above-threshold flow conditions corresponding to the intermittent transport regime were explored. The experiments were conducted in the Water Engineering Lab at the University of Glasgow on a tilting recirculating flume of 800 cm (length) × 90 cm (width). Twelve different flow conditions corresponding to the intermittent transport regime, for the range of particle densities examined herein, have been implemented in this research. To ensure fully developed flow conditions, the start of the test section was located 3.2 meters upstream of the flume outlet. The bed surface of the flume is flat and made up of well-packed glass beads of 16.2 mm diameter, offering a uniform roughness over which the instrumented particle is transported. A MEMS sensor unit comprising a 3-axis gyroscope and a 3-axis accelerometer is embedded within the instrumented particle. At the beginning of each experimental run, the instrumented particle is placed at the upstream end of the test section, fully exposed to the free-stream flow. Its motion is recorded with top and side cameras to enable a deeper understanding of particle transport processes. Using results from sets of instrumented-particle transport experiments with varying flow rates and particle densities, the probability distribution functions (PDFs) of the instrumented particle's kinetic energy were generated. The best-fitted PDFs were selected by applying the Kolmogorov-Smirnov test, and the results were discussed in light of the recent literature on particle velocity distributions.

[1] Valyrakis, M.; Alexakis, A. Development of a "smart-pebble" for tracking sediment transport. In Proceedings of the International Conference on Fluvial Hydraulics (River Flow 2016), St. Louis, MO, USA, 12–15 July 2016.

[2] Al-Obaidi, K.; Xu, Y.; Valyrakis, M. The Design and Calibration of Instrumented Particles for Assessing Water Infrastructure Hazards. Journal of Sensor and Actuator Networks 2020, 9(3), 36.

[3] Al-Obaidi, K.; Valyrakis, M. A sensory instrumented particle for environmental monitoring applications: development and calibration. IEEE Sensors Journal 2020 (accepted).
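The fitting-and-selection step mentioned above can be illustrated with a short Python sketch: candidate PDFs are fitted to the kinetic-energy samples by maximum likelihood and ranked by the Kolmogorov-Smirnov statistic. The synthetic Gamma-distributed data below merely stand in for the instrumented-particle measurements, and the candidate families are assumptions, not necessarily those tested by the authors.

```python
# Minimal sketch: fit candidate PDFs to kinetic-energy samples and rank them by
# the Kolmogorov-Smirnov distance (smaller D = better fit).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
kinetic_energy = rng.gamma(shape=1.8, scale=0.05, size=500)  # placeholder data, J

candidates = {"gamma": stats.gamma, "lognorm": stats.lognorm, "expon": stats.expon}
results = {}
for name, dist in candidates.items():
    params = dist.fit(kinetic_energy)                      # maximum-likelihood fit
    d_stat, p_val = stats.kstest(kinetic_energy, name, args=params)
    results[name] = (d_stat, p_val)

for name, (d, p) in sorted(results.items(), key=lambda kv: kv[1][0]):
    print(f"{name:8s}  D = {d:.3f}  p = {p:.3f}")
print("best-fitting PDF:", min(results, key=lambda k: results[k][0]))
```

Note that p-values from a KS test are optimistic when the distribution parameters are estimated from the same sample, so in practice the statistic is best used for ranking candidates rather than for formal hypothesis testing.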


2021 ◽  
pp. 285-293
Author(s):  
Anurag Sharma ◽  
Deepak Swami ◽  
Nitin Joshi

Climate modelling and prediction studies play a crucial role in identifying suitable mitigation techniques to minimize or avoid adverse consequences of climate extremes. Accurate, spatially and temporally distributed temperature and rainfall datasets are key components in climate prediction studies. Reanalysis datasets provide better spatial and temporal coverage than observational datasets; therefore, reanalysis datasets are widely used for global and regional studies. However, before using a reanalysis dataset in climate modelling studies, it is crucial to compare its robustness and accuracy with the observational dataset. In this study, daily gridded maximum and minimum temperature datasets of the Indian Meteorological Department (IMD) (1° × 1°) and Sheffield (0.25° × 0.25°) are compared using 62 years of data (1951-2012). The comparison is based on differences in spatial distribution patterns, probability distribution function plots, and box-plots of the respective gridded datasets. The spatial distributions of the grid-wise averaged maximum and minimum temperature datasets generally compare well across pan India in both IMD and Sheffield; however, significant differences are observed over the western Himalaya (WH) and northeast (NE) regions. The probability distribution of the pooled mean minimum temperature dataset of IMD is found to be significantly different from Sheffield using the two-sample Kolmogorov-Smirnov (KS) test. This study will be helpful for researchers who are planning to use the Sheffield gridded temperature dataset for climate modelling studies.
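The two-sample KS comparison described above is straightforward to reproduce once the pooled samples are available; the sketch below shows the step with scipy. The arrays are synthetic placeholders for the pooled daily minimum-temperature values of the two products, since the actual gridded data are not part of this abstract.

```python
# Hedged sketch: two-sample Kolmogorov-Smirnov test on pooled daily minimum
# temperatures from two gridded products (placeholder arrays, degrees C).
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(2)
tmin_imd = rng.normal(loc=18.0, scale=6.0, size=10_000)        # stand-in for IMD
tmin_sheffield = rng.normal(loc=17.2, scale=6.5, size=10_000)  # stand-in for Sheffield

stat, p_value = ks_2samp(tmin_imd, tmin_sheffield)
print(f"KS statistic = {stat:.3f}, p-value = {p_value:.2e}")
if p_value < 0.05:
    print("Distributions differ significantly at the 5% level.")
else:
    print("No significant difference detected at the 5% level.")
```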


Author(s):  
D. Xue ◽  
S. Y. Cheing ◽  
P. Gu

This research introduces a new systematic approach to identify the optimal design configuration and attributes that minimize potential construction project changes. This second part of the paper focuses on the attribute design aspect. In this research, the potential changes of design attribute values are modeled by probability distribution functions. Design attribute values whose construction tasks are least sensitive to changes of these values are identified using the Taguchi method. In addition, estimation of the potential project change cost due to design attribute value changes is also discussed. Case studies in pipeline engineering design and construction have been conducted to show the effectiveness of the introduced approach.
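A minimal sketch of this kind of robustness screening is given below, with a hypothetical change-cost response and an assumed normal PDF for attribute-value perturbations (neither is taken from the paper). Each candidate nominal value is perturbed, the resulting change cost is evaluated, and the candidate with the highest Taguchi smaller-is-better signal-to-noise ratio is selected as the least sensitive choice.

```python
# Illustrative sketch (hypothetical response, not the paper's model): rank candidate
# attribute values by a Taguchi smaller-is-better signal-to-noise ratio under
# perturbations drawn from an assumed probability distribution of value changes.
import numpy as np

rng = np.random.default_rng(3)

def change_cost(value):
    # hypothetical stand-in for the re-work cost triggered by an attribute change
    return 0.4 * (value - 12.0) ** 2 + 1.0

candidates = [10.0, 12.0, 14.0, 16.0]  # candidate nominal attribute values
sn_ratios = {}
for nominal in candidates:
    perturbed = nominal + rng.normal(scale=1.0, size=1_000)  # assumed change PDF
    y = change_cost(perturbed)
    sn_ratios[nominal] = -10.0 * np.log10(np.mean(y ** 2))   # smaller-is-better S/N

for nominal, sn in sn_ratios.items():
    print(f"value {nominal:5.1f}  S/N = {sn:6.2f} dB")
print("most robust attribute value:", max(sn_ratios, key=sn_ratios.get))
```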


2015 ◽  
Vol 19 (sup8) ◽  
pp. S8-32-S8-37 ◽  
Author(s):  
K. Deng ◽  
J. Deng ◽  
T. Sun ◽  
Y. Guan ◽  
F. Yang ◽  
...  
