positive error
Recently Published Documents

TOTAL DOCUMENTS: 58 (FIVE YEARS: 19)
H-INDEX: 9 (FIVE YEARS: 1)

2021 ◽  
Author(s):  
Rachit Dubey ◽  
Mark K Ho ◽  
Hermish Mehta ◽  
Tom Griffiths

Psychologists have long been fascinated with understanding the nature of Aha! moments: moments when we transition from not knowing to suddenly realizing the solution to a problem. In this work, we present a theoretical framework that explains when and why we experience Aha! moments. Our theory posits that during problem-solving, in addition to solving the problem, people also maintain a meta-cognitive model of their ability to solve it, as well as a prediction of how long it will take them. Aha! moments arise when we experience a positive error in this meta-cognitive prediction, i.e., when we solve a problem much faster than we expected to. We posit that this meta-cognitive error is analogous to a positive reward prediction error, thereby explaining why we feel so good after an Aha! moment. A large-scale pre-registered experiment on anagram solving supports this theory, showing that people's time prediction errors are strongly correlated with their ratings of the Aha! experience while solving anagrams. A second experiment provides further evidence for the theory by demonstrating a causal link between time prediction errors and the Aha! experience. These results highlight the importance of meta-cognitive prediction errors and deepen our understanding of human meta-reasoning.
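The core quantity in this framework can be sketched in a few lines (an illustrative sketch, not the authors' implementation; the function name and the seconds-based example are ours):

```python
# Illustrative sketch of a meta-cognitive time prediction error.
# A positive value (solved faster than predicted) is the proposed
# trigger for an Aha! experience, by analogy with a positive
# reward prediction error.

def time_prediction_error(predicted_time: float, actual_time: float) -> float:
    """Predicted minus actual solution time, in the same units."""
    return predicted_time - actual_time

# Expecting an anagram to take 60 s but solving it in 12 s yields a
# large positive error; taking 90 s instead yields a negative one.
print(time_prediction_error(60.0, 12.0))  # 48.0
print(time_prediction_error(60.0, 90.0))  # -30.0
```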


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Andrew Buxton ◽  
Eleni Matechou ◽  
Jim Griffin ◽  
Alex Diana ◽  
Richard A. Griffiths

Abstract
Ecological surveys risk incurring false negative and false positive detections of the target species. With indirect survey methods, such as environmental DNA (eDNA), such errors can occur at two stages: sample collection and laboratory analysis. Here we analyse a large qPCR-based eDNA data set using two occupancy models: one by Griffin et al. (J R Stat Soc Ser C Appl Stat 69: 377–392, 2020), which accounts for false positive error, and one by Stratton et al. (Methods Ecol Evol 11: 1113–1120, 2020), which assumes no false positive error. Additionally, we apply the Griffin et al. (2020) model to simulated data to determine optimal levels of replication at both sampling stages. The Stratton et al. (2020) model, which assumes no false positive results, consistently overestimated both overall and individual site occupancy compared with both the Griffin et al. (2020) model and previous estimates of pond occupancy for the target species. Replication at both stages of eDNA analysis (sample collection and in the laboratory) reduces both bias and credible interval width in estimates of occupancy and detectability. Even collecting more than one sample from a site can improve parameter estimates more than a high number of replicates within the laboratory analysis alone.
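The replication question the simulations address can be illustrated with a toy two-stage detection model (a sketch under assumed rates; none of the numbers below are estimates from the paper):

```python
import random

# Hypothetical rates, not the paper's estimates.
PSI = 0.5         # true site occupancy probability
P_SAMPLE = 0.8    # prob. a water sample captures DNA at an occupied site
FP_SAMPLE = 0.05  # prob. a sample is contaminated at an unoccupied site
P_QPCR = 0.9      # prob. a qPCR replicate amplifies DNA present in the sample
FP_QPCR = 0.02    # prob. a qPCR replicate amplifies spuriously

def site_detected(n_samples: int, n_qpcr: int, rng: random.Random) -> bool:
    """True if any qPCR replicate of any sample from the site is positive."""
    occupied = rng.random() < PSI
    for _ in range(n_samples):
        dna = rng.random() < (P_SAMPLE if occupied else FP_SAMPLE)
        for _ in range(n_qpcr):
            if rng.random() < (P_QPCR if dna else FP_QPCR):
                return True
    return False

rng = random.Random(1)
# Naive occupancy: fraction of sites with at least one positive replicate.
naive = sum(site_detected(2, 3, rng) for _ in range(10_000)) / 10_000
print(naive)
```

With false positives possible at both stages, the naive estimate exceeds the true occupancy of 0.5, echoing the direction of bias the abstract describes for analyses that ignore false positive error.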


Author(s):  
Gabriele Steuer ◽  
Maria Tulis ◽  
Markus Dresel

Abstract
A frequent observation in the school context is that opportunities to learn from errors are often missed. A positive error climate, however, may support learning from errors. For the school subject of mathematics, some findings about characteristics of the error climate already exist, but a comparison of the error climate across school subjects is still pending. The present study analyzes whether the error climate differs between school subjects and whether the same interrelations with the ways individuals deal with errors can be found in each of them. In a study with 937 students from 48 classrooms in grades 5 to 7, in different secondary schools in Germany and Austria, we assessed the error climate and individual reactions following errors in mathematics, German, and English. The analyses yielded small mean differences between mathematics and the two language subjects, and medium-sized correlations between the error climate measures in the three school subjects. However, the same pattern of interrelations between error climate and the way individuals deal with errors was found for all three school subjects. The results suggest that the perception of the error climate is rather similar across school subjects. This has implications, for instance, for interventions that aim at fostering the error climate.


2021 ◽  
Vol 2 (1) ◽  
pp. p1
Author(s):  
Edward J. Lusk

Focus: Decision-making is often aided by examining False Positive Error-risk profiles [FPEs]. This research report addresses the decision-making jeopardy one invites by eschewing the Exact factorial-binomial Probability-values used to form the FPEs in favor of (i) Normal Approximations [NA] or (ii) Continuity-Corrected Normal Approximations [CCNA]. Results: Referencing an audit context where testing sample sizes for Re-Performance & Re-Calculation protocols are, by economic necessity, in the range of 20 to 100 account items, there are indications that audit decisions would benefit from using the Exact Probability-values. Specifically, using a jeopardy screen of ±2.5% created by benchmarking the NA and the CCNA against the Exact FPEs, it is observed that: (i) for sample sizes of 100 there is little difference between the Exact and the CCNA FPEs; (ii) almost uniformly for both sample extremes of 20 and 100, the FPEs created using the NA are lower and outside the jeopardy screen; and (iii) for the CCNA arm with sample sizes of n = 20, the CCNA FPEs are only sometimes interior to the jeopardy screen. These results argue against forgoing the Exact factorial-binomial results. Finally, an illustrative example is offered of an a priori FPE-risk Decision-Grid that can be parametrized and used in a decision-making context.
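The three ways of computing an upper-tail probability that the report compares can be sketched as follows (hypothetical audit parameters, not the report's data; `exact_tail` and `normal_tail` are our names):

```python
from math import comb, erf, sqrt

def exact_tail(n: int, k: int, p: float) -> float:
    """Exact binomial P(X >= k) for X ~ Bin(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def normal_tail(n: int, k: int, p: float, continuity: bool = False) -> float:
    """Normal approximation to P(X >= k), optionally continuity-corrected."""
    mu, sd = n * p, sqrt(n * p * (1 - p))
    x = (k - 0.5) if continuity else k
    z = (x - mu) / sd
    return 0.5 * (1 - erf(z / sqrt(2)))  # standard normal P(Z >= z)

# Hypothetical audit test: 4 or more deviations in a sample of 20 items,
# with an assumed 10% deviation rate.
n, k, p = 20, 4, 0.1
print(exact_tail(n, k, p), normal_tail(n, k, p), normal_tail(n, k, p, True))
```

For these parameters the plain NA understates the exact tail probability, while the CCNA lands much closer to it — the qualitative pattern the report describes for small samples.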


Italus Hortus ◽  
2020 ◽  
Vol 27 ◽  
pp. 3-18
Author(s):  
Giacomo Bedini ◽  
Giorgia Bastianelli ◽  
Swathi Sirisha Nallan Chakravartula ◽  
Carmen Morales-Rodríguez ◽  
Luca Rossini ◽  
...  

The authors explored the potential of Vis/NIR hyperspectral imaging (HSI) and Fourier-transform near-infrared (FT-NIR) spectroscopy as in-line tools for detecting unsound chestnut fruits (i.e., infected and/or infested), in comparison with the traditional sorting technique. For this purpose, a total of 720 raw fruits were collected from a local company. Chestnut fruits were preliminarily classified into sound (360 fruits) and unsound (360 fruits) batches using a proprietary floating system at the facility, along with manual selection performed by expert workers. The two batches were stored at 4 ± 1 °C until use. Samples were left at ambient temperature for at least 12 h before measurements. Subsequently, fruits were subjected to non-destructive measurements (i.e., spectral analysis) immediately followed by destructive analyses (i.e., microbiological and entomological assays). Classification models were trained using Partial Least Squares Discriminant Analysis (PLS-DA) by pairing the spectrum of each fruit with the categorical information obtained from its destructive assay (i.e., sound, Y = 0; unsound, Y = 1). Categorical data were also used to evaluate the classification performance of the traditional sorting method. The performance of each PLS-DA model was evaluated in terms of false positive error (FP), false negative error (FN), and total error (TE) rates. The best result (8% FP, 14% FN, 11% TE) was obtained by applying a Savitzky-Golay first derivative with a 5-point smoothing window to the dataset of raw reflectance spectra scanned from the hilum side of the fruit with the Vis/NIR HSI setup. This model's false negative error rate was similar to that of the best model computed from the FT-NIR data (15% FN), which, however, had the lowest overall performance (17% TE) owing to the highest false positive error rate (19%).
Finally, considering that the traditional sorting system's total error rate was about 14.5%, with a tendency to misclassify unsound fruits, the results indicate the feasibility of a rapid, in-line detection system based on spectroscopic measurements.
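The reported error rates combine as follows on the study's balanced design (a sketch; the raw counts 29 and 50 are made up to reproduce the reported 8% FP, 14% FN, and 11% TE):

```python
# Error rates for a balanced two-class screening task, as used above:
# FP rate = sound fruits wrongly flagged unsound, FN rate = unsound
# fruits wrongly passed, TE rate = all mistakes over all fruits.

def error_rates(fp: int, fn: int, n_sound: int, n_unsound: int):
    fp_rate = fp / n_sound
    fn_rate = fn / n_unsound
    te_rate = (fp + fn) / (n_sound + n_unsound)
    return fp_rate, fn_rate, te_rate

# Hypothetical counts consistent with the best HSI model on the
# 360 sound / 360 unsound batches: ~8% FP, ~14% FN, ~11% TE.
fp_rate, fn_rate, te_rate = error_rates(fp=29, fn=50, n_sound=360, n_unsound=360)
print(fp_rate, fn_rate, te_rate)
```

With equal class sizes, the total error rate is simply the average of the FP and FN rates, which is why 8% and 14% yield 11%.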


2020 ◽  
Vol 34 (1) ◽  
pp. 5-12
Author(s):  
Stephen Braude

In my book Immortal Remains (Braude, 2003), I considered an intriguing argument William James offered against the suggestion that mediumistic evidence for postmortem survival could be explained away in normal, or at least non-survivalist, terms—that is, either by appealing to what I’ve called The Usual Suspects (e.g., misperception, hidden memories, fraud) or The Unusual Suspects (e.g., dissociation + latent abilities, exceptional memory, or living-agent psi). More specifically, James was concerned with a fascinating, but frustrating, feature of the material gathered from mental mediumship—namely, that even the best cases present a maddening mixture of (a) material suggesting survival, (b) material suggesting psi among the living, and (c) apparent rubbish.

At their best, of course, mediums furnish detailed information for which no normal explanation will suffice. In the cases most strongly suggesting survival, that information concerns the past lives of the deceased. But sometimes mediums also provide information on the present actions, thoughts, and feelings of the living, and that’s one reason why some cases suggest psi among the living, and why a living-agent–psi interpretation of mediumship is difficult to rule out. After all, information about present states of affairs is not something to which the deceased would enjoy privileged access.

Moreover, to complicate matters further:

. . . gems of correct, detailed, and relevant information are nearly always imbedded in an immense matrix of twaddle, vagueness, irrelevance, ignorance, pretension, positive error, and occasional prevarication. (Broad, 1962, p. 259)

