type 2 errors
Recently Published Documents


TOTAL DOCUMENTS: 24 (FIVE YEARS: 8)

H-INDEX: 7 (FIVE YEARS: 2)

2021 ◽  
Author(s):  
Daniel Lakens

Psychological science would become more efficient if researchers implemented sequential designs where feasible. Miller and Ulrich (2020) propose an independent segments procedure in which data can be analyzed at a prespecified number of equally spaced looks while controlling the Type 1 error rate. Such procedures already exist in the sequential analysis literature, and in this commentary I reflect on whether psychologists should adopt these existing procedures instead. I believe limitations in the independent segments procedure make it relatively unattractive. Being forced to stop for futility based on a bound not chosen to control Type 2 errors, or to reject a smallest effect size of interest in an equivalence test, limits the inferences one can make. Having to use a prespecified number of equally spaced looks is logistically inconvenient. And not having the flexibility to choose α and β spending functions limits the possibility of designing efficient studies based on the goals and constraints of the researcher. Recent software packages such as rpact (Wassmer & Pahlke, 2019) make sequential designs as easy to perform as the independent segments procedure. While learning new statistical methods always takes time, I believe psychological scientists should start on a path that will not limit the flexibility and inferences their statistical procedures provide.
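As a hedged illustration of why corrected bounds matter in such designs, here is a minimal Monte Carlo sketch in Python. It is not taken from the commentary and does not use rpact; the two-look setup, the unit-variance normal data and the Pocock-style critical value of 2.178 are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    n_per_look, n_sims = 50, 20_000

    def rejection_rate(z_crit):
        """Two-sided Type 1 error rate of a two-look sequential design under the null."""
        rejected = 0
        for _ in range(n_sims):
            data = rng.normal(0.0, 1.0, 2 * n_per_look)    # true effect is zero, known unit variance
            for look in (1, 2):
                sample = data[: look * n_per_look]
                z = sample.mean() * np.sqrt(sample.size)   # z statistic on the accumulated data
                if abs(z) > z_crit:                        # stop early and reject at this look
                    rejected += 1
                    break
        return rejected / n_sims

    print("naive 1.96 bound at both looks:", rejection_rate(1.96))    # inflated, roughly 0.08
    print("Pocock-style bound of 2.178   :", rejection_rate(2.178))   # close to the nominal 0.05

Spending-function approaches generalize this correction to looks that are neither fixed in number nor equally spaced, which is the flexibility the commentary argues for.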


Author(s):  
O. Baranik

The article analyzes the current state of the fleet of guided air weapons (missiles) and the problems of repairing them and extending their service life. It substantiates the need to transition air-to-surface guided missiles to condition-based operation. The shortcomings of the existing system for inspecting the technical condition of guided air weapons are shown. Within the existing system of technical operation, one way to solve the problem of maintaining the combat readiness of aircraft is the transition to condition-based operation of guided air weapons. It is shown that transitioning guided missiles to condition-based operation, together with the modernization of unguided missiles, requires strengthening the role of operations that measure and monitor their parameters and characteristics in order to determine their actual technical condition and make informed decisions about their further operation. The circumstances under which Type 1 and Type 2 errors arise during inspection of the technical condition of aviation armament are presented. A method is proposed for increasing the reliability of air-to-surface missile monitoring equipment by conducting control checks within the inter-check interval. The fundamental difference between the proposed information-redundant model of operating guided missiles and the classic model is the introduction of a new diagnostic operation, an intermediate control check, into the system of operation. The application of the developed method to calculating both the number of control checks and their periodicity within the inter-check interval is shown.
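A hedged sketch of the underlying reliability reasoning follows (a Monte Carlo toy model in Python; the exponential failure model, the check error rates and the single mid-interval check are illustrative assumptions, not parameters from the article).

    import random

    def simulate(T=1.0, failure_rate=0.2, check_times=(), alpha=0.02, beta=0.10, n=100_000):
        """Estimate how often a unit is failed but undetected at the end of the interval T,
        and how often a good unit is falsely rejected, given intermediate control checks."""
        undetected = false_alarms = 0
        for _ in range(n):
            t_fail = random.expovariate(failure_rate)   # random time of actual failure
            detected = false_alarm = False
            for tc in check_times:
                if t_fail <= tc:                        # unit has already failed at this check
                    if random.random() > beta:          # check catches it (1 - Type 2 error)
                        detected = True
                        break
                elif random.random() < alpha:           # Type 1 error: good unit flagged as failed
                    false_alarm = True
                    break
            if false_alarm:
                false_alarms += 1
            elif t_fail <= T and not detected:
                undetected += 1
        return undetected / n, false_alarms / n

    print("no intermediate check  :", simulate())
    print("one check at mid-point :", simulate(check_times=(0.5,)))

Adding the intermediate check lowers the chance that a failed missile is still counted as serviceable at the end of the interval, at the cost of some false rejections; this trade-off is what drives the choice of the number and periodicity of checks.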


2021 ◽  
Author(s):  
James A Watson ◽  
Stephen Kissler ◽  
Nicholas PJ Day ◽  
Yonatan H. Grad ◽  
Nicholas J White

There is no agreed methodology for the pharmacometric assessment of candidate antiviral drugs in COVID-19. The most widely used measure of virological response in clinical trials so far is the time to viral clearance, assessed by qPCR of viral nucleic acid in eluates from serial nasopharyngeal swabs. We posited that the rate of viral clearance would have better discriminatory value. Using a pharmacodynamic model fit to individual SARS-CoV-2 virus clearance data from 46 uncomplicated COVID-19 infections in a cohort of prospectively followed adults, we simulated qPCR viral load data to compare type 2 errors when using time to clearance and rate of clearance under varying antiviral effects, sample sizes, sampling frequencies and durations of follow-up. The rate of viral clearance is a uniformly superior endpoint compared with time to clearance with respect to type 2 error, and it does not depend on initial viral load or assay sensitivity. For greatest efficiency, pharmacometric assessments should be conducted in early illness and daily qPCR samples should be taken over 7 to 10 days in each patient studied. Adaptive randomisation and early stopping for success permit more rapid identification of active interventions.
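A hedged illustration of the endpoint comparison is sketched below (a simplified Python simulation, not the authors' pharmacodynamic model; the linear log10 decline, the noise level, the detection limit and the 30% effect on the clearance rate are all illustrative assumptions).

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    days = np.arange(10)      # daily swabs over 10 days
    LOD = 1.0                 # assumed log10 detection limit
    sigma = 1.0               # assay noise on the log10 scale

    def simulate_arm(n, slope_mean):
        """Return per-patient estimated clearance rates and times to clearance."""
        rates, t_clear = [], []
        for _ in range(n):
            v0 = rng.normal(7, 1)                         # baseline log10 viral load
            slope = rng.normal(slope_mean, 0.1)           # true clearance rate per day
            y = v0 - slope * days + rng.normal(0, sigma, days.size)
            rates.append(-np.polyfit(days, y, 1)[0])      # rate of clearance: fitted slope
            below = np.where(y < LOD)[0]                  # time to clearance: first value below LOD
            t_clear.append(days[below[0]] if below.size else days[-1] + 1)
        return np.array(rates), np.array(t_clear)

    def power(endpoint, n=30, sims=500):
        hits = 0
        for _ in range(sims):
            ctrl = simulate_arm(n, 0.5)
            drug = simulate_arm(n, 0.5 * 1.3)             # antiviral speeds clearance by 30%
            p = stats.mannwhitneyu(ctrl[endpoint], drug[endpoint]).pvalue
            hits += p < 0.05
        return hits / sims

    print("power with rate of clearance:", power(0))
    print("power with time to clearance:", power(1))

Higher power for the rate endpoint at the same sample size corresponds to the lower type 2 error reported in the abstract.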


2020 ◽  
Vol 42 (5) ◽  
pp. 1041-1076
Author(s):  
Jeong-eun Kim ◽  
Yejin Cho ◽  
Youngsun Cho ◽  
Yeonjung Hong ◽  
Seohyun Kim ◽  
...  

This study examines the effects of asymmetrical mappings of L2 sounds to L1 sounds on real-time processing of L2 phonology. L1-Korean participants completed a self-paced listening (SPL) task paired with a picture verification (PV) task, in which an English sentence was presented word by word along with a picture that matched or mismatched the sentence. In the critical region, an L2 vowel was deliberately replaced with the wrong vowel for two types of English vowel pairs: Type 1, English vowel pairs showing a one-to-one mapping to Korean counterparts (e.g., English /i/ and /æ/ to Korean /i/ and /æ/, respectively); and Type 2, English vowel pairs showing a two-to-one mapping to a single Korean counterpart (e.g., English /i/ and /ɪ/ to Korean /i/). We analyzed response times (RTs) and PV accuracy. Longer RTs were observed for Type 1 errors than for Type 2 errors, indicating lower sensitivity to L2 vowels with a two-to-one mapping to an L1 vowel. PV accuracy was also lower for sentences containing Type 2 errors. These results suggest that asymmetrical L2-L1 sound mapping can affect learners’ processing of L2 phonological knowledge, which in turn can negatively affect their comprehension.


2020 ◽  
Author(s):  
Junya Hu ◽  
Wansuo Duan ◽  
Qian Zhou

The “spring predictability barrier” (SPB) is a well-known characteristic of ENSO prediction and has been widely studied for El Niño events. However, because of the nonlinearity of the coupled ocean–atmosphere system and the asymmetries between El Niño and La Niña, it is worthwhile to investigate the SPB for La Niña events and to reveal how it differs from that of El Niño. This study investigates the season-dependent predictability of sea surface temperature (SST) for La Niña events by exploring initial error growth in a perfect-model scenario within the Community Earth System Model. The results show that, for predictions made through the spring season, the prediction errors caused by initial errors have a season-dependent evolution and induce an SPB for La Niña events. Two types of initial errors that often yield the SPB phenomenon are identified. Type-1 initial errors show positive SST errors in the central-eastern equatorial Pacific accompanied by a large positive error in the upper layers of the eastern equatorial Pacific. Type-2 errors present an SST pattern with positive errors in the southeastern equatorial Pacific and a west–east dipole pattern in the subsurface ocean. The type-1 errors exhibit an evolving mode similar to the growth phase of an El Niño-like event, while the type-2 errors initially experience a La Niña-like decay and then a transition to the growth phase of an El Niño-like event. Both types of initial errors cause positive prediction errors for Niño3 SST and under-predict the corresponding La Niña events. The resultant prediction errors of the type-1 errors are due to the growth of the initial errors in the upper layers of the eastern equatorial Pacific; for the type-2 errors, the prediction errors originate from the initial errors in the subsurface layers of the western equatorial Pacific. These two regions may represent sensitive areas for targeted observation for La Niña prediction. In addition, the type-2 errors in the equatorial regions are enlarged by the recharge process from 10°N in the central Pacific during their eastward propagation, so the off-equatorial regions around 10°N in the central Pacific may represent another sensitive area for La Niña prediction. Additional observations may be prioritized in these identified sensitive areas to better predict La Niña events.


2019 ◽  
Vol 33 (2) ◽  
pp. 17-24 ◽  
Author(s):  
Robert S. Kaplan

SYNOPSIS: Faculty have increased the number of articles submitted to journals ranked in the top five of their discipline. This is a rational response to the overweighting of publications in top-5 journals by university promotion and tenure committees. Using journal impact factors to infer the quality of a faculty member's publications, however, incurs a high incidence of both Type 1 errors, when we conclude incorrectly that a paper published in a top-5 journal is a high-impact paper, and Type 2 errors, when we conclude that papers (and books) not published in these journals have low impact. A third type of error occurs when scholars underinvest in research about practice innovations because such research is viewed as unpublishable in top-5 journals. The paper suggests reforms to overcome the dysfunctional fixation on publication in top-5 journals.


2019 ◽  
Vol 69 ◽  
pp. 00014 ◽  
Author(s):  
Mikhail Basimov

The article raises the question of how statistical dependencies are studied in sociological research. When sociologists study cause-and-effect relations, most offer interpretations based only on linear relations and linear models of their data. However, the problem is not only that sociologists who interpret linear dependencies ignore a large number of simple non-linear relations (type 1 errors), often without understanding the essence of the issue. In the last 20-25 years sociologists have often studied processes that are not simple enough to be described by linear models, and they have (consciously or not) come to present weak linear relations, judged only against the hypothesis of a zero correlation coefficient (the significance stars in SPSS), as “significant” and to treat them tacitly as correlations strong enough to be of scientific interest for interpreting cause-and-effect relations. But there is an even more serious error (a type 2 error): failing to notice strong, simple non-linear dependencies between parameters whose linear approximations yield only a weak, or even very weak, correlation (0.11-0.3), which completely distorts the real picture of the phenomenon or process under study. The result is scientific knowledge that does not correspond to reality, which encourages the parallel development of philosophical (qualitative) analysis of social processes based mainly on an intuitive understanding of social problems and gives rise to contradictions between the approaches. The article examines individual dependencies and their interpretation using the results of a study of the political preferences of young people, demonstrating both type 1 and type 2 errors.
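A hedged illustration of the second kind of mistake described above (a small Python sketch with invented data; the quadratic relationship and the noise level are assumptions chosen only to make the point):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, 1000)
    y = x**2 + rng.normal(0, 0.05, x.size)        # strong, simple non-linear dependence

    r_linear = stats.pearsonr(x, y)[0]            # linear correlation: close to zero
    r_quadratic = stats.pearsonr(x**2, y)[0]      # correlation with the correct quadratic term

    print(f"Pearson r, linear model   : {r_linear:+.2f}")
    print(f"Pearson r, quadratic model: {r_quadratic:+.2f}")

An analysis limited to the linear correlation would report a negligible relation (a type 2 error in the article's sense) even though the underlying dependence is strong and simple.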


2018 ◽  
Vol 2018 (1) ◽  
pp. 65-81
Author(s):  
Andrej Makarov

This article discusses the rapid formation of the Rule of Reason (ROR) approach in antitrust policy in the field of anti-competitive agreements. In many jurisdictions (the US, the EU) the use of the per se approach (prohibition on the basis of formal characteristics) has been significantly reduced in favor of the ROR approach; nowadays agreements are usually permitted or prohibited on the basis of an analysis of their positive and negative effects. The article analyzes and summarizes the experience of these jurisdictions in developing the ROR approach and its chronology for agreements of various types (horizontal and vertical agreements). Discussions in economic theory played a role in this process by providing the argumentation for expanding effects-based evaluation. At the same time, the article examines the problems of this transformation, including legal uncertainty and the growing risk of type 2 errors.


PeerJ ◽  
2017 ◽  
Vol 5 ◽  
pp. e3231 ◽  
Author(s):  
Caroline Witton ◽  
Joel B. Talcott ◽  
G. Bruce Henning

Measuring sensory sensitivity is important in studying development and developmental disorders. However, with children, there is a need to balance reliable but lengthy sensory tasks with the child’s ability to maintain motivation and vigilance. We used simulations to explore the problems associated with shortening adaptive psychophysical procedures, and suggest how these problems might be addressed. We quantify how adaptive procedures with too few reversals can over-estimate thresholds, introduce substantial measurement error, and make estimates of individual thresholds less reliable. The associated measurement error also obscures group differences. Adaptive procedures with children should therefore use as many reversals as possible, to reduce the effects of both Type 1 and Type 2 errors. Differences in response consistency, resulting from lapses in attention, further increase the over-estimation of threshold. Comparisons between data from individuals who may differ in lapse rate are therefore problematic, but measures to estimate and account for lapse rates in analyses may mitigate this problem.
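A hedged sketch of the kind of simulation described follows (Python; the 2-down-1-up rule, the logistic psychometric function, the step size and the starting level are illustrative assumptions rather than the authors' settings).

    import numpy as np

    rng = np.random.default_rng(42)

    def p_correct(level, threshold=0.0, slope=1.5, guess=0.5):
        """Logistic psychometric function for a 2AFC task (level in arbitrary units)."""
        return guess + (1 - guess) / (1 + np.exp(-slope * (level - threshold)))

    def run_staircase(n_reversals, step=2.0, start=10.0):
        """2-down-1-up staircase; threshold estimated from the later reversal levels."""
        level, direction, reversals, streak = start, -1, [], 0
        while len(reversals) < n_reversals:
            if rng.random() < p_correct(level):
                streak += 1
                move = -1 if streak == 2 else 0          # step down after two correct responses
                if streak == 2:
                    streak = 0
            else:
                streak, move = 0, +1                     # step up after one incorrect response
            if move != 0:
                if move != direction:                    # direction change: record a reversal
                    reversals.append(level)
                    direction = move
                level += move * step
        return np.mean(reversals[-max(4, n_reversals // 2):])

    for n_rev in (4, 12):
        estimates = [run_staircase(n_rev) for _ in range(500)]
        print(f"{n_rev:2d} reversals: mean estimate = {np.mean(estimates):+.2f}, "
              f"SD = {np.std(estimates):.2f}")

With only a few reversals the estimates remain biased toward the easy starting level and are noticeably more variable; longer runs bring the mean down toward the staircase's convergence point and shrink the spread, which is the pattern the paper describes.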

