Parametric Methodologies for Detecting Changes in Maximum Temperature of Tlaxco, Tlaxcala, México

2019, Vol 2019, pp. 1-14
Author(s): Silvia Herrera Cortés, Bulmaro Juárez Hernández, Victor Hugo Vázquez Guevara, Hugo Adán Cruz Suárez

This paper presents comparison results for parametric change-point methodologies applied to maximum temperature records from the municipality of Tlaxco, Tlaxcala, México. The methodologies considered are the likelihood ratio test, the score test, binary segmentation (BS), pruned exact linear time (PELT), and segment neighborhood (SN). To compare these methodologies, a quality analysis of the data was performed; in addition, missing data were estimated with linear regression, and finally, SARIMA models were fitted.
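As a rough illustration of how such a comparison can be set up (not the authors' actual pipeline), the sketch below runs PELT and binary segmentation on a synthetic temperature-like series using the ruptures Python package; the penalty value, the implanted shift, and all other settings are assumptions made for the example.

```python
# Illustrative sketch (not the authors' code): mean-shift change-point
# detection on a synthetic "maximum temperature" series with ruptures,
# which implements PELT and binary segmentation.
import numpy as np
import ruptures as rpt

rng = np.random.default_rng(0)
# Synthetic monthly maxima with an artificial shift after index 120.
signal = np.concatenate([rng.normal(24.0, 1.5, 120),
                         rng.normal(25.5, 1.5, 120)])

# PELT: penalized search, number of change points not fixed in advance.
pelt_bkps = rpt.Pelt(model="l2").fit(signal).predict(pen=10)

# Binary segmentation: here a single break is requested for comparison.
bs_bkps = rpt.Binseg(model="l2").fit(signal).predict(n_bkps=1)

print("PELT breakpoints:", pelt_bkps)   # e.g. [120, 240]
print("BinSeg breakpoints:", bs_bkps)   # e.g. [120, 240]
```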

2015, Vol 744-746, pp. 890-893
Author(s): Xun Wu, Yong Lan Zhang

In this paper, SAP2000 and ANSYS software are used to model and analyze a three-span continuous beam bridge with high piers as a case study. Two finite element models are created by using different bearing types and combinations to form different design options. Dynamic characteristics, elastic response spectra, and linear and nonlinear time history analyses are carried out, with a focus on comparing the dynamic characteristics of the earthquake response of the two schemes. Processing and comparison of the output data show that a rational combination and arrangement of pot rubber bearings with different parameters gives the bridge better seismic performance.


2017, Vol 17 (1), pp. 115-125
Author(s): Guido Ceccherini, Simone Russo, Iban Ameztoy, Andrea Francesco Marchese, Cesar Carmona-Moreno

Abstract. The purpose of this article is to show the extreme temperature regime of heat waves across Africa over recent years (1981–2015). Heat waves have been quantified using the Heat Wave Magnitude Index daily (HWMId), which merges the duration and the intensity of extreme temperature events into a single numerical index. The HWMId enables a comparison between heat waves with different timing and location, and it has been applied to maximum and minimum temperature records. The time series used in this study have been derived from (1) observations from the Global Summary of the Day (GSOD) and (2) reanalysis data from ERA-Interim. The analysis shows an increasing number of heat waves in both maximum and minimum temperatures over recent decades. Results from heat wave analysis of maximum temperature (HWMIdtx) indicate an increase in the intensity and frequency of extreme events. Specifically, from 1996 onwards HWMIdtx events can be observed to spread, with the greatest presence during 2006–2015. Between 2006 and 2015 the frequency (spatial coverage) of extreme heat waves increased to 24.5 observations per year (60.1 % of land area), compared with 12.3 per year (37.3 % of land area) in the period from 1981 to 2005, for GSOD stations (reanalysis data).
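The HWMId itself is defined in the cited literature; the sketch below is only a simplified, illustrative approximation of the idea (at least three consecutive days above a 90th-percentile threshold, with daily exceedances normalised by quartiles of reference-period annual maxima). The function name, data layout, and threshold details are assumptions for the example.

```python
# Simplified sketch of a HWMId-style calculation (illustrative only; the
# published index uses a calendar-day-centred 90th-percentile threshold and
# normalises daily exceedances by the 25th-75th percentile range of annual
# maxima over a reference period).
import numpy as np

def hwmid_like(tmax, ref):
    """Largest heat-wave magnitude in a year of daily maxima `tmax`,
    given reference-period daily maxima `ref` (both 1-D arrays, deg C)."""
    thr = np.percentile(ref, 90)                     # heat-wave threshold
    t25, t75 = np.percentile(ref.reshape(-1, 365).max(axis=1), [25, 75])
    best, run, mag = 0.0, 0, 0.0
    for day, is_hot in enumerate(tmax > thr):
        if is_hot:
            run += 1
            mag += max(0.0, (tmax[day] - t25) / (t75 - t25))
        else:
            if run >= 3:                             # >= 3 consecutive hot days
                best = max(best, mag)
            run, mag = 0, 0.0
    if run >= 3:
        best = max(best, mag)
    return best

# Example with synthetic data: 30 reference years plus one test year.
rng = np.random.default_rng(1)
ref = rng.normal(28, 3, 365 * 30)
year = rng.normal(28, 3, 365)
year[200:208] += 8                                   # implanted heat wave
print(round(hwmid_like(year, ref), 2))
```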


Entropy, 2020, Vol 22 (4), pp. 374
Author(s): Lei He, Xiao-Hong Shen, Mu-Hang Zhang, Hai-Yan Wang

Due to the diversity of ship-radiated noise (SRN), audio segmentation is an essential procedure in identifying ship statuses and categories. However, existing segmentation methods are not well suited to SRN because of the lack of prior knowledge. In this paper, using a generalized likelihood ratio (GLR) test on the ordinal pattern distribution (OPD), we propose a segmentation criterion and introduce it into single change-point detection (SCPD) and multiple change-point detection (MCPD) for SRN. The proposed method requires neither acoustic feature extraction nor the corresponding probability distribution estimation. In addition, exploiting the sequential structure of ordinal patterns, the OPD is estimated efficiently over a series of analysis windows. We evaluate the performance of the proposed method against a Bayesian Information Criterion (BIC) based segmentation method on both synthetic signals and real-world SRN. The segmentation results on synthetic signals show that the proposed method estimates the number and locations of the change-points more accurately, and the classification results on real-world SRN show that our method yields more distinguishable segments, which verifies its effectiveness for SRN segmentation.
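A minimal sketch of the underlying idea follows, assuming a multinomial GLR over ordinal pattern counts and a single candidate change point; this is not the authors' implementation, which estimates the OPD over sliding analysis windows, and all names and parameters are illustrative.

```python
# Minimal sketch (not the authors' implementation) of single change-point
# detection with a generalized likelihood ratio on ordinal pattern counts.
import itertools
import numpy as np

def ordinal_patterns(x, m=3):
    """Map each length-m window of x to the index of its ordinal pattern."""
    perms = {p: i for i, p in enumerate(itertools.permutations(range(m)))}
    return np.array([perms[tuple(np.argsort(x[i:i + m]))]
                     for i in range(len(x) - m + 1)])

def log_lik(counts):
    n = counts.sum()
    nz = counts[counts > 0]
    return float((nz * np.log(nz / n)).sum())

def glr_single_change(pattern_ids, n_symbols=6, min_seg=50):
    """Return (best split index, GLR statistic) over candidate splits."""
    best_k, best_g = None, -np.inf
    pooled = np.bincount(pattern_ids, minlength=n_symbols)
    for k in range(min_seg, len(pattern_ids) - min_seg):
        left = np.bincount(pattern_ids[:k], minlength=n_symbols)
        right = pooled - left
        g = 2 * (log_lik(left) + log_lik(right) - log_lik(pooled))
        if g > best_g:
            best_k, best_g = k, g
    return best_k, best_g

# Synthetic example: white noise followed by a smoother (correlated) signal.
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(size=2000),
                    np.cumsum(rng.normal(size=2000)) * 0.1])
ids = ordinal_patterns(x, m=3)
print(glr_single_change(ids))
```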


2020, Vol 2020, pp. 1-16
Author(s): Yi Dong, Jianmin Liu, Yanbin Liu, Xinyong Qiao, Xiaoming Zhang, ...

In order to improve the reliability and fatigue life of cylinder gaskets in heavy-duty diesel engines, several methods and algorithms are applied to optimize the operating factors of the gaskets. The finite element method is used to compute and analyze the temperature fields, thermal-mechanical coupling stress fields, and deformations of the gasket. After determining the maximum values of three state parameters, the orthogonal experimental design method is adopted to analyze how five operating factors influence the three state parameters of the gaskets, and the four factors that most significantly affect these state parameters are identified. Then, a method that predicts the state parameters from the operating factors is established using a hybrid neural network (HNN) based on partial least squares regression and a deep neural network; comparison between predicted and real values verifies the accuracy of the hybrid neural network. Based on the artificial bee colony (ABC) algorithm, improvements are made to the way the three kinds of grey wolves locate prey in the grey wolf optimizer (GWO) and to the way wolves of different hierarchy levels determine the three weight coefficients and the location of the prey; the resulting method, which uses the artificial bee colony algorithm to optimize the grey wolf algorithm, is called ABC-GWO. The proposed HNN and the ABC-GWO method are applied to find the operating factor values that correspond to the optimal state parameters of the gasket, and the gaskets are optimized according to these optimal values. Finite element analysis results demonstrate that the maximum temperature, maximum coupling stress, and maximum deformation decrease by 6 K, 12.57 MPa, and 0.0925 mm relative to the original values, respectively, which proves the accuracy of the algorithm and the validity of the improvement.
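For orientation, the sketch below implements the canonical grey wolf optimizer on a toy objective; the ABC-based improvements to how the alpha, beta, and delta wolves are weighted are not reproduced here, and the population size, iteration count, and test function are illustrative assumptions.

```python
# Sketch of the canonical grey wolf optimizer (GWO); the paper's ABC-GWO
# hybrid modifies how the alpha, beta and delta wolves are weighted, which
# is not reproduced here.
import numpy as np

def gwo(objective, dim, bounds, n_wolves=20, n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    wolves = rng.uniform(lo, hi, size=(n_wolves, dim))
    for t in range(n_iter):
        fitness = np.array([objective(w) for w in wolves])
        alpha, beta, delta = wolves[np.argsort(fitness)[:3]]
        a = 2.0 * (1.0 - t / n_iter)                  # decreases from 2 to 0
        new = np.empty_like(wolves)
        for i, w in enumerate(wolves):
            xs = []
            for leader in (alpha, beta, delta):
                A = 2 * a * rng.random(dim) - a
                C = 2 * rng.random(dim)
                xs.append(leader - A * np.abs(C * leader - w))
            new[i] = np.clip(np.mean(xs, axis=0), lo, hi)
        wolves = new
    fitness = np.array([objective(w) for w in wolves])
    return wolves[np.argmin(fitness)], float(fitness.min())

# Toy usage: minimize the sphere function in 5 dimensions.
best_x, best_f = gwo(lambda x: float(np.sum(x ** 2)), dim=5, bounds=(-5, 5))
print(best_x.round(3), round(best_f, 6))
```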


2017, Vol 42 (3), pp. 221-239
Author(s): Chun Wang, David J. Weiss

The measurement of individual change has been an important topic in both education and psychology. For instance, teachers are interested in whether students have significantly improved (e.g., learned) from instruction, and counselors are interested in whether particular behaviors have been significantly changed after certain interventions. Although classical test methods have been unable to adequately resolve the problems in measuring change, recent approaches for measuring change have begun to use item response theory (IRT). However, all prior methods mainly focus on testing whether growth is significant at the group level. The present research targets a key research question: Is the “change” in latent trait estimates for each individual significant across occasions? Many researchers have addressed this research question assuming that the latent trait is unidimensional. This research generalizes their earlier work and proposes four hypothesis testing methods to evaluate individual change on multiple latent traits: a multivariate Z-test, a multivariate likelihood ratio test, a multivariate score test, and a Kullback–Leibler test. Simulation results show that these tests hold promise of detecting individual change with low Type I error and high power. A real-data example from an educational assessment illustrates the application of the proposed methods.
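One plausible form of the multivariate Z-test is a Wald-type statistic on the vector of trait changes, referred against a chi-square distribution with one degree of freedom per trait. The sketch below illustrates that idea with hypothetical estimates and error covariances; the authors' exact formulation may differ.

```python
# Sketch of a Wald-type multivariate test of individual change: given latent
# trait estimates and their error covariance matrices from two occasions,
# test whether the change vector differs from zero.
import numpy as np
from scipy.stats import chi2

def change_test(theta1, theta2, cov1, cov2):
    d = np.asarray(theta2) - np.asarray(theta1)
    cov = np.asarray(cov1) + np.asarray(cov2)         # independent occasions
    stat = float(d @ np.linalg.solve(cov, d))
    p = chi2.sf(stat, df=len(d))
    return stat, p

# Hypothetical examinee measured on two latent traits at two occasions.
theta_pre  = [-0.20, 0.10]
theta_post = [ 0.55, 0.35]
se_cov = np.diag([0.09, 0.12])                        # squared standard errors
print(change_test(theta_pre, theta_post, se_cov, se_cov))
```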


2020, Vol 29 (11), pp. 3235-3248
Author(s): Chun Yin Lee, KF Lam

We apply a maximal likelihood ratio test for the presence of multiple change-points in the covariate effects based on the Cox regression model. The covariate effect is assumed to change smoothly at one or more unknown change-points. The number of change-points is inferred by a sequential approach. Confidence intervals for the regression and change-point parameters are constructed by a bootstrap method based on Bernstein polynomials conditionally on the number of change-points. The methods are assessed by simulations and are applied to two datasets.
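As a rough illustration of a maximal likelihood ratio scan for a single change-point in a covariate effect, the sketch below fits Cox models with a hinge term over a grid of candidate change-points using the lifelines package on simulated data. The hinge parameterisation, the grid, and all simulation settings are assumptions; the paper's smooth formulation, sequential procedure, and Bernstein-polynomial bootstrap are not reproduced.

```python
# Rough sketch of a maximal (profile) likelihood ratio scan for one
# change-point in a covariate effect under a Cox model, using lifelines.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 500
x = rng.uniform(0, 10, n)
# Simulated hazard: the effect of x kinks upward beyond x = 6.
lin_pred = 0.1 * x + 0.6 * np.maximum(x - 6.0, 0.0)
t = rng.exponential(1.0 / np.exp(lin_pred))
c = rng.exponential(2.0, n)                           # censoring times
df = pd.DataFrame({"T": np.minimum(t, c), "E": (t <= c).astype(int), "x": x})

null_ll = CoxPHFitter().fit(df, duration_col="T", event_col="E").log_likelihood_

best_tau, best_lr = None, -np.inf
for tau in np.linspace(2, 8, 25):                     # candidate change-points
    df["hinge"] = np.maximum(df["x"] - tau, 0.0)
    ll = CoxPHFitter().fit(df, duration_col="T", event_col="E").log_likelihood_
    lr = 2.0 * (ll - null_ll)
    if lr > best_lr:
        best_tau, best_lr = tau, lr

print(f"estimated change-point ~ {best_tau:.2f}, max LR = {best_lr:.2f}")
```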


2009, Vol 25 (6), pp. 1515-1544
Author(s): Morten Ørregaard Nielsen

This paper presents a family of simple nonparametric unit root tests indexed by one parameter, d, and containing the Breitung (2002, Journal of Econometrics 108, 342–363) test as the special case d = 1. It is shown that (a) each member of the family with d > 0 is consistent, (b) the asymptotic distribution depends on d and thus reflects the parameter chosen to implement the test, and (c) because the asymptotic distribution depends on d and the test remains consistent for all d > 0, it is possible to analyze the power of the test for different values of d. The usual Phillips–Perron and Dickey–Fuller type tests are indexed by bandwidth, lag length, etc., but have none of these three properties. It is shown that members of the family with d < 1 have higher asymptotic local power than the Breitung (2002) test, and when d is small the asymptotic local power of the proposed nonparametric test is relatively close to the parametric power envelope, particularly in the case with a linear time trend. Furthermore, generalized least squares (GLS) detrending is shown to improve power when d is small, which is not the case for the Breitung (2002) test. Simulations demonstrate that when applying a sieve bootstrap procedure, the proposed variance ratio test has very good size properties, with finite-sample power that is higher than that of the Breitung (2002) test and even rivals the (nearly) optimal parametric GLS detrended augmented Dickey–Fuller test with lag length chosen by an information criterion.
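For concreteness, the sketch below computes the d = 1 member of this family, the Breitung (2002) variance ratio statistic, on simulated I(1) and I(0) series; critical values, GLS detrending, the sieve bootstrap, and the fractional (d not equal to 1) generalisation are not reproduced, and the detrending options shown are assumptions for the example.

```python
# Rough sketch of the Breitung (2002) variance ratio statistic (the d = 1
# special case): partial sums of the demeaned/detrended series enter the
# numerator, the series itself the denominator, and small values speak
# against a unit root.
import numpy as np

def variance_ratio_stat(y, detrend="constant"):
    y = np.asarray(y, dtype=float)
    T = len(y)
    if detrend == "constant":
        y = y - y.mean()
    elif detrend == "trend":
        tt = np.arange(T)
        y = y - np.polyval(np.polyfit(tt, y, 1), tt)
    U = np.cumsum(y)                                  # partial sums
    return (U @ U) / (y @ y) / T**2

rng = np.random.default_rng(4)
unit_root = np.cumsum(rng.normal(size=500))           # I(1): statistic O(1)
stationary = rng.normal(size=500)                     # I(0): statistic near 0
print(variance_ratio_stat(unit_root), variance_ratio_stat(stationary))
```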


2017, Vol 20 (2), pp. 108-118
Author(s): Camelia C. Minică, Giulio Genovese, Christina M. Hultman, René Pool, Jacqueline M. Vink, ...

Sequence-based association studies are at a critical inflexion point with the increasing availability of exome-sequencing data. A popular test of association is the sequence kernel association test (SKAT). Weights are embedded within SKAT to reflect the hypothesized contribution of the variants to the trait variance. Because the true weights are generally unknown, and so are subject to misspecification, we examined the efficiency of a data-driven weighting scheme. We propose the use of a set of theoretically defensible weighting schemes, of which, we assume, the one that gives the largest test statistic is likely to capture best the allele frequency–functional effect relationship. We show that the use of alternative weights obviates the need to impose arbitrary frequency thresholds. As both the score test and the likelihood ratio test (LRT) may be used in this context, and may differ in power, we characterize the behavior of both tests. The two tests have equal power if the set of weights includes weights resembling the correct ones. However, if the weights are badly specified, the LRT shows superior power (due to its robustness to misspecification). With this data-driven weighting procedure the LRT detected significant signal in genes located in regions already confirmed as associated with schizophrenia, namely PRRC2A (p = 1.020e-06) and VARS2 (p = 2.383e-06), in the Swedish schizophrenia case-control cohort of 11,040 individuals with exome-sequencing data. The score test is currently preferred for its computational efficiency and power; indeed, assuming correct specification, in some circumstances the score test is the most powerful test. However, the LRT has the advantageous properties of being generally more robust and more powerful under weight misspecification. This is an important result given that, arguably, misspecified models are likely to be the rule rather than the exception in weighting-based approaches.
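The sketch below illustrates only the data-driven selection step: several candidate Beta(MAF; a, b) weighting schemes are evaluated and the one yielding the largest statistic is kept. A simple burden-style score statistic stands in for SKAT and the LRT, all data are simulated, and the final p-value would still need to account for the selection step (for example by permutation).

```python
# Toy sketch of the data-driven weighting idea: try several weighting
# schemes and keep the one giving the largest test statistic.
import numpy as np
from scipy.stats import beta

def burden_score_stat(G, y, weights):
    """Score-type statistic for a weighted burden of rare variants."""
    burden = G @ weights                              # per-subject burden
    r = y - y.mean()
    num = float(burden @ r) ** 2
    den = float(r.var() * ((burden - burden.mean()) ** 2).sum())
    return num / den

rng = np.random.default_rng(5)
n, p = 2000, 30
maf = rng.uniform(0.001, 0.05, p)
G = rng.binomial(2, maf, size=(n, p)).astype(float)   # genotype matrix
y = G[:, :5].sum(axis=1) * 0.4 + rng.normal(size=n)   # 5 causal variants

schemes = {"flat": np.ones(p),
           "beta(1,25)": beta.pdf(maf, 1, 25),        # common SKAT default
           "beta(0.5,0.5)": beta.pdf(maf, 0.5, 0.5)}
stats = {name: burden_score_stat(G, y, w) for name, w in schemes.items()}
print(max(stats, key=stats.get), stats)
```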


Author(s): A. Syamsundar, V. N. A. Naikan

The failure processes of maintained systems operating in a changing environment may be affected by the changes and exhibit different failure behaviour before and after the changes. Processes exhibiting abrupt changes in failure intensities at specified times require segmented models, with the process domain divided into segments at the points of change in the environment, to represent them. The individual segments can be modeled by any of the usual point process models and combined to form a composite segmented model with multiple change points. This paper proposes such segmented models with multiple change points to represent the failure processes of these systems and uses a hierarchical binary segmentation method to locate the changes. Its purpose is to quantify the impacts of changes in the environment on the failure intensities. These models are applied to field data from an industrial setting; parameter estimates are obtained, and the segmented models are shown to describe the failure processes of maintained systems in a changing environment more accurately than the single point process models usually used. The interpretation and use of these models for maintained systems are also discussed.
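A toy sketch of hierarchical binary segmentation on a failure process follows, assuming homogeneous Poisson segments and an illustrative likelihood ratio threshold; the paper combines the same segmentation idea with richer point process models for each segment, and all names and values here are assumptions.

```python
# Toy sketch of hierarchical binary segmentation of a failure process into
# segments of constant intensity (homogeneous Poisson pieces).
import numpy as np

def seg_loglik(k, length):
    """Max log-likelihood of a homogeneous Poisson segment with k events."""
    if length <= 0:
        return -np.inf
    return 0.0 if k == 0 else k * np.log(k / length) - k

def binary_segmentation(times, start, end, threshold=10.0):
    times = np.asarray(times)
    inside = times[(times > start) & (times <= end)]
    full = seg_loglik(len(inside), end - start)
    best_lr, best_tau = -np.inf, None
    for tau in inside[inside < end]:                  # candidate change points
        k_left = int(np.sum(inside <= tau))
        lr = 2 * (seg_loglik(k_left, tau - start)
                  + seg_loglik(len(inside) - k_left, end - tau) - full)
        if lr > best_lr:
            best_lr, best_tau = lr, tau
    if best_tau is None or best_lr < threshold:
        return []
    return (binary_segmentation(times, start, best_tau, threshold)
            + [float(best_tau)]
            + binary_segmentation(times, best_tau, end, threshold))

# Simulated failures: rate 0.2/day for ~300 days, then 1.0/day afterwards.
rng = np.random.default_rng(6)
events = np.concatenate([np.cumsum(rng.exponential(5.0, 60)),
                         300 + np.cumsum(rng.exponential(1.0, 100))])
events = events[events < 400]
print(binary_segmentation(events, 0.0, 400.0))
```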

