statistical selection
Recently Published Documents

TOTAL DOCUMENTS: 96 (five years: 11)

H-INDEX: 14 (five years: 1)
2021 ◽  
Vol 2021 (2) ◽  
pp. 1-6
Author(s):  
Roman Tytarenko ◽  
Roman Khmil ◽  
Iryna Dankevych ◽  
...  

The article presents a theoretical analysis of existing concepts for evaluating the non-failure of RC structures in operation. To perform the analysis, the authors considered a number of scientific works by both Ukrainian and foreign researchers. The main focus was on works in which the model of the stochastic nature of RC structure operation included random parameters of the acting loads, as well as the reserve of bearing capacity and serviceability (geometric dimensions of cross sections of structural members, strength and deformation characteristics of materials, etc.). Among others, in the authors' view, the important problems in the analysis of an individual work were the volume of the statistical selection of random parameters, their number and impact on the study result, and the rationality of the adopted method for calculating the probability of failure (or non-failure) of an RC structure in operation. Based on the processing of a number of scientific works, the authors highlight the relevance, advantages, and disadvantages of the non-failure assessment concepts proposed there, and formulate conclusions and recommendations for further experimental and theoretical research in this area.
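The failure-probability calculation the abstract refers to is commonly estimated by Monte Carlo simulation over the random parameters. The sketch below is a generic illustration only, not any specific method from the reviewed works; the distributions and parameter values are invented for the example.

```python
import random

random.seed(42)

# Minimal Monte Carlo sketch: bearing capacity (resistance R) and acting
# load effect S are modelled as random variables, and failure occurs when
# R < S. The normal distributions and their parameters (in kN*m) are
# illustrative assumptions, not values from the reviewed studies.
N = 100_000
failures = sum(
    random.gauss(30.0, 3.0) < random.gauss(20.0, 2.5)  # failure event R < S
    for _ in range(N)
)
p_failure = failures / N
print(f"estimated P_f = {p_failure:.4f}")
```

With these assumed distributions, R − S is normal with mean 10 and standard deviation about 3.9, so the estimated failure probability comes out near 0.005.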


Author(s):  
Medard Makrenek

The paper presents the methodology behind the statistical selection of input parameters, using the example of spraying two cold-sprayed coatings. Ti and Cr3C2-25(Ni20Cr)-Gr coatings were tested. Despite the large difference in the structure of these coatings, nanoindentation studies were carried out, focusing on the nanohardness H and elastic modulus E. Based on the four input parameters and two output parameters, a two-level fractional factorial 2^(k-p) experimental design was performed. The analysis showed a significant influence of the spray distance on the H and E values in the case of the Ti coating. For the cermet coating, the spray distance and the type of carrier gas used turned out to be statistically significant. Taking the statistical analysis into account, the coatings were then sprayed with modified values of the input parameters.
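With four two-level factors, a 2^(k-p) design with p = 1 is a half-fraction of 8 runs instead of 16. The sketch below generates such a design; the factor names are illustrative guesses, not necessarily the parameters used in the paper.

```python
from itertools import product

# Illustrative factor names for a cold-spray process (assumptions only).
factors = ["spray_distance", "gas_pressure", "gas_temperature", "carrier_gas"]

# Full 2^3 design in the first three factors, levels coded -1/+1 ...
base_runs = list(product([-1, 1], repeat=3))

# ... with the fourth factor aliased to the three-way interaction
# (defining relation D = ABC), the standard resolution-IV half-fraction.
design = [run + (run[0] * run[1] * run[2],) for run in base_runs]

for row in design:
    print(dict(zip(factors, row)))
```

Each of the 8 runs fixes all four factors, and main effects are estimated by contrasting the output (here H and E) between the -1 and +1 levels of each column.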


2021 ◽  
Vol 200 (2) ◽  
pp. 236-244
Author(s):  
Teresa Grabińska

The paper considers the problem of information credibility. Currently, this problem affects scientists as well as ordinary people who depend on information networks. Hence, the Author formulates three postulates that should be observed in dealing with the quality of information: P1 – identify the source of information, P2 – determine the level of credibility of the information source, P3 – recognize the purpose of information dissemination. The first two postulates are universal because they are applicable to all users of information. The third becomes more and more important in the social and political choices of citizens. In scientific work, empirical facts are transformed into empirical data (increasingly, in the form of big data), which are the results of advanced registration and processing by means of technical and information-science tools, such as: a) technical transformation of the empirical signal into information; b) statistical selection of signals and, next, statistical processing of the received data; c) assessment of the results for suitability in applications. Other "epistemic" factors, however, are also involved, such as: d) the conceptual apparatus used for idealization (and then for interpretation); e) assessment of the results in terms of compliance with the epistemological (sometimes also commercial or ideological) position. All these factors should be the subject of careful study within the errology proposed by P. Homola.


2021 ◽  
Vol 15 (3) ◽  
Author(s):  
Dieter Schott

Abstract: Monotone function problems are introduced on a very elementary level to reveal the close connection to certain statistical problems. Equations F(x) = c and inequalities F(x) ≥ c with monotone increasing functions F are considered. Solution methods are stated. In the following, it is shown how some important problems of statistics, especially statistical selection problems, can be solved by transformation to monotone function problems.
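For a monotone increasing F, the equation F(x) = c can be solved by bisection whenever F(lo) ≤ c ≤ F(hi). The sketch below is a minimal illustration of this idea, not the specific solution methods stated in the paper.

```python
def solve_monotone(F, c, lo, hi, tol=1e-10):
    """Solve F(x) = c for monotone increasing F by bisection,
    assuming F(lo) <= c <= F(hi)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if F(mid) < c:
            lo = mid   # root lies in the upper half
        else:
            hi = mid   # root lies in the lower half
    return 0.5 * (lo + hi)

# Example: solve x**3 = 8 on [0, 10].
x = solve_monotone(lambda t: t ** 3, 8.0, 0.0, 10.0)
print(round(x, 6))  # → 2.0
```

The same routine also answers the inequality F(x) ≥ c: by monotonicity its solution set is exactly [x, hi] for the root x found above, which is how such problems connect to quantile-type statistical questions.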


2020 ◽  
Author(s):  
Davide Putero ◽  
Rita Traversi ◽  
Angelo Lupi ◽  
Francescopiero Calzolari ◽  
Maurizio Busetto ◽  
...  

In this work, eight years (2006–2013) of continuous measurements of near-surface ozone (O3) at the WMO/GAW contributing station "Concordia" (DMC, 75°06'S, 123°20'E, 3280 m a.s.l.) are presented, and the role of specific atmospheric processes in affecting O3 variability is investigated. In particular, during the period of highest data coverage (i.e., 2008–2013), O3 enhancement events (OEEs) were systematically observed at DMC, affecting 11.6% of the dataset. As deduced by a statistical selection methodology, the OEEs are affected by a significant interannual variability, both in the average and in the frequency of O3 values. To explain part of this variability, OEEs were analyzed as a function of: (i) total column of O3 (TCO) and UV-A irradiance variability, (ii) long-range transport of air masses over the Antarctic plateau (using LAGRANTO), and (iii) occurrence of "deep" stratospheric intrusion events (using STEFLUX). The overall O3 concentrations are controlled by day-to-day variability, which indicates the dominating influence of processes occurring at "synoptic" scales rather than "local" processes. Although previous studies indicated an inverse relationship between OEEs and TCO, we found that the annual frequency of OEEs was higher when TCO values at DMC were higher than usual. The annual occurrence of OEEs at DMC was also related to the total time spent by air masses over the Antarctic plateau before their arrival at DMC, suggesting that the accumulation of photochemically produced O3 during transport dominated over local O3 production.
Lastly, the influence of "deep" stratospheric intrusion events at DMC was analyzed; this contribution was observed to play only a marginal role (the highest frequency observed was 3% of the period, in November).

This latter point, i.e., the frequency and seasonality of stratosphere-to-troposphere exchange (STE) events, the relative influence of specific transport mechanisms, as well as snow chemistry, are still under debate. These topics will be investigated in the STEAR (Stratosphere-to-Troposphere Exchange in the Antarctic Region) project, starting in 2020 and funded by the Italian Antarctic Research Program (PNRA). In particular, STEAR will provide an assessment of STE events in Antarctica, using both continuous observations (e.g., O3 and Beryllium-7) at DMC and modeling outputs. In addition to DMC measurements, simultaneous atmospheric composition datasets will be analyzed at the Antarctic coastal observatories Mario Zucchelli (MZS) and Jang Bogo (JBS).
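Enhancement events in a time series like the DMC ozone record are often selected statistically as exceedances over a running baseline. The sketch below shows one generic such scheme (running median plus a multiple of the running standard deviation); the actual selection methodology of the study may differ.

```python
import statistics

def flag_enhancement_events(series, window=31, n_sigma=2.0):
    """Flag indices whose value exceeds the running median of a centred
    window by n_sigma running standard deviations. A generic illustration
    of sigma-based event selection, not the authors' exact method."""
    half = window // 2
    flags = []
    for i, v in enumerate(series):
        lo, hi = max(0, i - half), min(len(series), i + half + 1)
        win = series[lo:hi]
        med = statistics.median(win)
        sd = statistics.pstdev(win)
        if sd > 0 and v > med + n_sigma * sd:
            flags.append(i)
    return flags

# Toy example: a flat baseline with one spike at index 20.
data = [10.0] * 20 + [25.0] + [10.0] * 20
print(flag_enhancement_events(data))  # → [20]
```

The fraction of flagged points (11.6% in the abstract) and its year-to-year variation then follow directly from counting the flagged indices per year.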


2019 ◽  
Vol 122 ◽  
pp. 402-410
Author(s):  
Nelson Rosa Ferreira ◽  
Maria Inez de Moura Sarquis ◽  
Rubens Menezes Gobira ◽  
Márcia Gleice da Silva Souza ◽  
Alberdan Silva Santos

2019 ◽  
Vol 627 ◽  
pp. A31 ◽  
Author(s):  
J. González-Nuevo ◽  
S. L. Suárez Gómez ◽  
L. Bonavera ◽  
F. Sánchez-Lasheras ◽  
F. Argüeso ◽  
...  

Context. The statistical analysis of a large sample of strong lensing events can be a powerful tool to extract valuable astrophysical or cosmological information. Their selection using submillimetre galaxies has been demonstrated to be very effective, with more than ∼200 proposed candidates in the case of Herschel-ATLAS data and several tens in the case of the South Pole Telescope. However, the number of confirmed events is still relatively low, i.e. a few tens, mostly because of the lengthy observational validation process on individual events.

Aims. In this work we propose a new methodology with a statistical selection approach to increase by a factor of ∼5 the number of such events within the Herschel-ATLAS data set. Although the methodology can be applied to several selection problems, it has particular benefits in the case of the identification of strongly lensed galaxies: objectivity, minimal initial constraints in the main parameter space, and preservation of statistical properties.

Methods. The proposed methodology is based on the Bhattacharyya distance as a measure of the similarity between probability distributions of properties of two different cross-matched galaxies. The particular implementation for the aim of this work is called SHALOS, and it combines the information of four different properties of the pair of galaxies: angular separation, luminosity percentile, redshift, and the ratio of the optical to the submillimetre flux densities.

Results. The SHALOS method provides a ranked list of strongly lensed galaxies. The number of candidates within ∼340 deg² of the Herschel-ATLAS surveyed area with final associated probability Ptot > 0.7 is 447, and they have an estimated mean amplification factor of 3.12 for a halo with a typical cluster mass. Additional statistical properties of the SHALOS candidates, such as the correlation function or the source number counts, are in agreement with previous results, indicating the statistical lensing nature of the selected sample.
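The Bhattacharyya distance the abstract builds on is a standard similarity measure between probability distributions. A minimal sketch for discrete (histogram) distributions, unrelated to the specific SHALOS implementation:

```python
import math

def bhattacharyya_distance(p, q):
    """Bhattacharyya distance D_B = -ln(BC) between two discrete
    probability distributions, where the Bhattacharyya coefficient
    is BC = sum_i sqrt(p_i * q_i). Both inputs must sum to 1."""
    bc = sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))
    return -math.log(bc)

# Identical distributions give BC = 1 and distance 0, while more
# dissimilar histograms give a larger distance.
p = [0.2, 0.5, 0.3]
q = [0.6, 0.3, 0.1]
print(round(bhattacharyya_distance(p, q), 4))  # → 0.0977
```

In a SHALOS-like setting, small distances between the property distributions of a cross-matched pair indicate compatibility, and the per-property distances are combined into the final ranking probability.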


2019 ◽  
Author(s):  
Marie-Pierre Chapuis ◽  
Louis Raynal ◽  
Christophe Plantamp ◽  
Christine N. Meynard ◽  
Laurence Blondin ◽  
...  

Abstract: Dating population divergence within species from molecular data and relating such dating to climatic and biogeographic changes is not trivial. Yet it can help formulate evolutionary hypotheses regarding local adaptation and future responses to changing environments. Key issues include the statistical selection of a demographic and historical scenario among a set of possible scenarios, and the estimation of the parameter(s) of interest under the chosen scenario. Such inferences greatly benefit from new statistical approaches, including approximate Bayesian computation - random forest (ABC-RF), which provides reliable inference at a low computational cost, with the possibility of taking into account prior knowledge on both biogeographical history and genetic markers. Here, we used ABC-RF, including independent information on the evolutionary rate and pattern at microsatellite markers, to decipher the evolutionary history of the African arid-adapted pest locust, Schistocerca gregaria. We found that the evolutionary processes that have shaped the present geographical distribution of the species in two disjoint northern and southern regions of Africa were recent, dating back 2.6 Ky (90% CI: 0.9–6.6 Ky). ABC-RF inferences also supported a southern colonization of Africa from a low number of founders of northern origin. The inferred divergence history is better explained by the peculiar biology of S. gregaria, which involves a density-dependent swarming phase with some exceptional spectacular migrations, rather than by a continuous colonization resulting from the continental expansion of open vegetation habitats during more ancient Quaternary glacial climatic episodes.
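The scenario-choice step of ABC can be illustrated with plain rejection ABC: simulate summary statistics under competing scenarios, keep the simulations closest to the observed summary, and vote. This toy sketch shows only that principle; ABC-RF, as used in the study, replaces the rejection step with a trained random-forest classifier, and the scenario names and toy models below are invented for the example.

```python
import random

random.seed(0)

def simulate(scenario):
    # Hypothetical toy models: each scenario implies a different mean
    # for a single summary statistic (pure illustration).
    mu = 0.0 if scenario == "north_to_south" else 3.0
    return random.gauss(mu, 1.0)

observed = 0.2  # hypothetical observed summary statistic
scenarios = ("north_to_south", "south_to_north")
sims = [(name, simulate(name)) for name in scenarios * 5000]

# Keep the 100 simulations closest to the observation ...
kept = sorted(sims, key=lambda s: abs(s[1] - observed))[:100]

# ... and choose the scenario by majority vote among them.
votes = {name: sum(1 for n, _ in kept if n == name) for name in scenarios}
print(max(votes, key=votes.get))  # → north_to_south
```

In real applications the summary statistic is a high-dimensional vector of genetic summaries, which is exactly where the random-forest classifier of ABC-RF pays off over naive rejection.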

