Feasibility of Vs Liquefaction Evaluation Methods by Bachu Xinjiang Earthquake Data

2012 ◽  
Vol 238 ◽  
pp. 848-851
Author(s):  
Qian Yu Zhao ◽  
Rui Sun ◽  
Yu Run Li ◽  
Wei Ming Wang

Two discrimination models for soil liquefaction are established by analyzing liquefied and non-liquefied sites in the Bachu, Xinjiang earthquake, based on 44 shear-wave velocity records. The first model, based on the Code for Seismic Design of Buildings, is linear; it is simple and convenient, with an evaluation success rate of 80%, although it is less capable than a nonlinear model. The second model is based on probability analysis, and its evaluation success rate reaches 93%; because it is grounded in real data analysis, its discrimination results are highly reliable.
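A linear Vs-based screen of this kind can be sketched as follows. The coefficients below are purely illustrative placeholders, not the paper's fitted model; the idea is only that a site is flagged as liquefiable when its measured shear-wave velocity falls below a depth-dependent critical value.

```python
# Hypothetical linear liquefaction screen: the critical shear-wave velocity
# grows linearly with depth; coefficients a and b are illustrative only.
def critical_vs(depth_m, a=90.0, b=8.0):
    """Critical Vs (m/s) below which a layer is flagged as liquefiable."""
    return a + b * depth_m

def is_liquefiable(vs_measured, depth_m):
    return vs_measured < critical_vs(depth_m)

# (Vs in m/s, depth in m) for two hypothetical sites
sites = [(100.0, 4.0), (220.0, 6.0)]
flags = [is_liquefiable(vs, d) for vs, d in sites]  # [True, False]
```

A probabilistic counterpart would replace the hard threshold with a fitted liquefaction probability, which is what lifts the success rate in the second model.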

2014 ◽  
Vol 39 (2) ◽  
pp. 107-127 ◽  
Author(s):  
Artur Matyja ◽  
Krzysztof Siminski

Abstract: Missing values are not uncommon in real data sets. Algorithms and methods designed for the analysis of complete data sets cannot always be applied to data with missing values. One solution is to preprocess the missing-value data so that existing methods for complete data can be used; the other is to create new algorithms dedicated to missing-value data sets. The objective of our research is to compare these preprocessing techniques with the specialised algorithms and to identify the situations in which each is most advantageous.
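The two routes can be contrasted with a minimal sketch (generic techniques, not the specific algorithms compared in the paper): listwise deletion discards incomplete rows, while mean imputation fills each gap with the column mean of the observed values.

```python
# None marks a missing value.
def listwise_delete(rows):
    """Drop every row that contains a missing value."""
    return [r for r in rows if None not in r]

def mean_impute(rows):
    """Replace each missing value with the mean of its column's observed values."""
    cols = list(zip(*rows))
    means = [sum(v for v in c if v is not None) / sum(v is not None for v in c)
             for c in cols]
    return [[means[j] if v is None else v for j, v in enumerate(r)]
            for r in rows]

data = [[1.0, 2.0], [None, 4.0], [3.0, None]]
kept = listwise_delete(data)   # [[1.0, 2.0]]
filled = mean_impute(data)     # [[1.0, 2.0], [2.0, 4.0], [3.0, 3.0]]
```

A specialised algorithm would instead work on `data` directly, for example by computing distances only over the attributes two rows share.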


Author(s):  
P. Ingram

It is well established that unique physiological information can be obtained by rapidly freezing cells in various functional states and analyzing the cell element content and distribution by electron probe x-ray microanalysis. (The other techniques of microanalysis that are amenable to imaging, such as electron energy loss spectroscopy, secondary ion mass spectroscopy, particle induced x-ray emission etc., are not addressed in this tutorial.) However, the usual processes of data acquisition are labor intensive and lengthy, requiring that x-ray counts be collected from individually selected regions of each cell in question and that data analysis be performed subsequent to data collection. A judicious combination of quantitative elemental maps and static raster probes adds not only an additional overall perception of what is occurring during a particular biological manipulation or event, but substantially increases data productivity. Recent advances in microcomputer instrumentation and software have made readily feasible the acquisition and processing of digital quantitative x-ray maps of one to several cells.


2019 ◽  
Author(s):  
Rumen Manolov

The lack of consensus regarding the most appropriate analytical techniques for single-case experimental design data requires justifying the choice of any specific analytical option. The current text mentions some of the arguments, provided by methodologists and statisticians, in favor of several analytical techniques. Additionally, a small-scale literature review is performed in order to explore whether and how applied researchers justify the analytical choices that they make. The review suggests that certain practices are not sufficiently explained. In order to improve the reporting of data-analytical decisions, it is proposed to choose and justify the data-analytical approach prior to gathering the data. As a possible justification for the data analysis plan, we propose using the expected data pattern as a basis (specifically, the expectation about an improving baseline trend and about the immediate or progressive nature of the intervention effect). Although there are multiple alternatives for single-case data analysis, the current text focuses on visual analysis and multilevel models and illustrates an application of these analytical options with real data. User-friendly software is also developed.


2017 ◽  
Vol 2 (2) ◽  
pp. 155-168 ◽  
Author(s):  
David Wong

This research aims at analyzing (1) the effect of the vendor's ability, benevolence, and integrity on e-commerce customers' trust at UBM; (2) the effect of the vendor's ability, benevolence, and integrity on the level of e-commerce customers' participation in Indonesia; and (3) the effect of trust on the level of e-commerce customers' participation at UBM. This research uses UBM e-commerce users as the research sample and a Likert-scale questionnaire for data collection. The questionnaires were sent to 200 respondents. For the data analysis, a Structural Equation Model was used. Of the three predictor variables (ability, benevolence, and integrity), only the vendor's integrity has a positive and significant effect on customers' trust. In turn, only the vendor's integrity and customers' trust have a positive and significant effect on e-commerce customers' participation at UBM.
Keywords: e-commerce customers' participation, ability, benevolence, integrity


Author(s):  
Saheb Foroutaifar

Abstract: The main objectives of this study were to compare the prediction accuracy of different Bayesian methods for traits with a wide range of genetic architectures, using simulation and real data, and to assess the sensitivity of these methods to violations of their assumptions. For the simulation study, different scenarios were implemented based on two traits with low or high heritability and different numbers of QTL and distributions of their effects. For the real data analysis, a German Holstein dataset for milk fat percentage, milk yield, and somatic cell score was used. The simulation results showed that, with the exception of Bayes R, the methods were sensitive to changes in the number of QTL and the distribution of QTL effects. Having a distribution of QTL effects similar to what a given Bayesian method assumes for estimating marker effects did not improve its prediction accuracy. The Bayes B method gave accuracy higher than or equal to that of the other methods. The real data analysis showed that, as in the simulation scenarios with a large number of QTL, there was no difference between the accuracies of the different methods for any of the traits.


Mathematics ◽  
2021 ◽  
Vol 9 (16) ◽  
pp. 1850
Author(s):  
Rashad A. R. Bantan ◽  
Farrukh Jamal ◽  
Christophe Chesneau ◽  
Mohammed Elgarhy

Unit distributions are commonly used in probability and statistics to describe useful quantities with values between 0 and 1, such as proportions, probabilities, and percentages. Some unit distributions are defined in a natural analytical manner, while others are derived through the transformation of an existing distribution defined on a larger domain. In this article, we introduce the unit gamma/Gompertz distribution, founded on the inverse-exponential scheme and the gamma/Gompertz distribution. The gamma/Gompertz distribution is known to be a very flexible three-parameter lifetime distribution, and we aim to transpose this flexibility to the unit interval. First, we check this aspect with the analytical behavior of the primary functions. It is shown that the probability density function can be increasing, decreasing, "increasing-decreasing" and "decreasing-increasing", with pliant asymmetric properties. On the other hand, the hazard rate function has monotonically increasing, decreasing, or constant shapes. We complete the theoretical part with some propositions on stochastic ordering, moments, quantiles, and the reliability coefficient. Practically, to estimate the model parameters from unit data, the maximum likelihood method is used. We present some simulation results to evaluate this method. Two applications using real data sets, one on trade shares and the other on flood levels, demonstrate the importance of the new model when compared to other unit models.
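The inverse-exponential construction can be illustrated numerically. Assuming the common gamma/Gompertz parameterization with CDF F(x) = 1 − β^s/(β − 1 + e^(bx))^s (the paper's parameterization may differ) and setting Y = e^(−X), the unit-interval CDF becomes G(y) = β^s/(β − 1 + y^(−b))^s, and differentiating gives the density below; a crude midpoint-rule check confirms that it integrates to about 1 on (0, 1).

```python
# Density of the unit gamma/Gompertz candidate obtained by differentiating
# G(y) = beta**s / (beta - 1 + y**(-b))**s with respect to y.
def unit_gg_pdf(y, b=2.0, s=1.5, beta=3.0):
    return s * b * beta**s * y**(-b - 1) / (beta - 1 + y**(-b))**(s + 1)

# Midpoint-rule check that the density integrates to ~1 over (0, 1).
n = 200_000
total = sum(unit_gg_pdf((i + 0.5) / n) for i in range(n)) / n  # ≈ 1.0
```

With b = 2, s = 1.5, β = 3, the density behaves like y^(bs−1) = y² near zero, so the numerical integral is well behaved.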


1998 ◽  
Vol 11 (1) ◽  
pp. 574-574
Author(s):  
A.E. Gómez ◽  
S. Grenier ◽  
S. Udry ◽  
M. Haywood ◽  
V. Sabas ◽  
...  

Using Hipparcos parallaxes and proper motions together with radial-velocity data and individual ages estimated from isochrones, the velocity ellipsoid has been determined as a function of age. On the basis of the available kinematic data, two different samples were considered: a first one (7789 stars) for which only tangential velocities were calculated, and a second one containing 3104 stars with available U, V and W velocity components and total velocities ≤ 65 km s⁻¹. The main conclusions are: -Mixing is not complete at about 0.8-1 Gyr. -The shape of the velocity ellipsoid changes with time, getting rounder from σU/σV/σW = 1/0.63/0.42 ± 0.04 at about 1 Gyr to 1/0.7/0.62 ± 0.04 at 4-5 Gyr. -The age-velocity-dispersion relation (from the kinematically selected sample) rises to a maximum and thereafter remains roughly constant; there is no dynamically significant evolution of the disk after about 4-5 Gyr. -Among the stars with solar metallicities and log(age) > 9.8, two groups are identified: one has typical thin-disk characteristics; the other is older than 10 Gyr and lags the LSR by about 40 km s⁻¹. -The variation of the tangential velocity with age (without selection on the tangential velocity) shows a discontinuity at about 10 Gyr, which may be attributed to stars typical of the thick-disk population for ages > 10 Gyr.


1985 ◽  
Vol 57 (3) ◽  
pp. 793-794 ◽  
Author(s):  
Leroy Matthews ◽  
Julie Hunt

Forty-six subjects were presented with 25 pairs of grades representing students' performance over the preceding two semesters. One group was asked to predict the grade for the following semester, and the other group was asked to predict the grade that occurred before the two given semesters. A linear model applied to the data indicated that both groups of subjects weighted the temporally contiguous grade more heavily than the other grade.
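The kind of linear model described can be sketched with the 2×2 normal equations. The data below are invented so that the predictions lean on the more recent grade (weights 0.2 and 0.8 by construction); they are not the study's data.

```python
# Fit prediction = w1*grade1 + w2*grade2 (no intercept) by least squares,
# solving the 2x2 normal equations by hand.
def fit_two_weights(x1, x2, y):
    s11 = sum(a * a for a in x1)
    s12 = sum(a * b for a, b in zip(x1, x2))
    s22 = sum(b * b for b in x2)
    t1 = sum(a * c for a, c in zip(x1, y))
    t2 = sum(b * c for b, c in zip(x2, y))
    det = s11 * s22 - s12 * s12
    return (s22 * t1 - s12 * t2) / det, (s11 * t2 - s12 * t1) / det

g1 = [3.0, 2.0, 4.0, 3.5]    # older semester
g2 = [3.5, 2.5, 3.0, 4.0]    # most recent semester
pred = [3.4, 2.4, 3.2, 3.9]  # constructed as 0.2*g1 + 0.8*g2
w1, w2 = fit_two_weights(g1, g2, pred)  # recovers (0.2, 0.8)
```

The heavier weight on `g2` is what a heavier weighting of the temporally contiguous grade looks like in such a model.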

