degree of confidence
Recently Published Documents


TOTAL DOCUMENTS

426
(FIVE YEARS 129)

H-INDEX

27
(FIVE YEARS 2)

Author(s):  
James W. Firman ◽  
Mark T. D. Cronin ◽  
Philip H. Rowe ◽  
Elizaveta Semenova ◽  
John E. Doe

Abstract: There is consensus that the traditional means by which the safety of chemicals is assessed, namely reliance upon apical outcomes obtained following in vivo testing, is increasingly unfit for purpose. Whilst efforts to develop suitable alternatives continue, few have achieved the level of robustness required for regulatory acceptance. An array of "new approach methodologies" (NAMs) for determining toxic effect, spanning the in vitro and in silico spheres, has by now emerged. It has been suggested, intuitively, that combining data obtained from across these sources might enhance overall confidence in the derived judgment. This concept may be formalised in the "tiered assessment" approach, whereby evidence gathered through a sequential NAM testing strategy is exploited so as to infer the properties of a compound of interest. Our intention has been to illustrate how such a scheme might be developed and applied in a practical setting, adopting for this purpose the endpoint of rat acute oral lethality. Bayesian statistical inference is drawn upon to quantify the degree of confidence that a substance belongs to one of five LD50-associated toxicity categories. Informing this is evidence acquired both from existing in silico and in vitro resources, alongside a purposely constructed random forest model and structural alert set. Results indicate that the combination of in silico methodologies provides moderately conservative estimations of hazard, conducive to application in safety assessment, and for which levels of certainty are defined. Accordingly, scope for extension of the approach to further toxicological endpoints is demonstrated.
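The core of the tiered approach, sequential Bayesian updating of a probability distribution over the five toxicity categories, can be sketched as follows. This is a minimal illustration, not the authors' model: the prior and the per-source likelihood values are invented for demonstration only.

```python
# Illustrative Bayesian update over five hypothetical LD50 toxicity
# categories, combining evidence from two independent NAM sources.
# All likelihood values below are invented for demonstration.

def bayes_update(prior, likelihood):
    """Posterior is proportional to prior times likelihood,
    normalised over the five categories."""
    unnorm = [p * l for p, l in zip(prior, likelihood)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

categories = ["Cat 1", "Cat 2", "Cat 3", "Cat 4", "Cat 5"]
prior = [0.2] * 5  # uninformative prior over categories

# Hypothetical P(evidence | category) from an in silico model and an
# in vitro assay applied to the same compound.
in_silico = [0.05, 0.10, 0.50, 0.25, 0.10]
in_vitro  = [0.05, 0.15, 0.45, 0.25, 0.10]

# Tiered assessment: fold in each evidence source in turn.
posterior = bayes_update(prior, in_silico)
posterior = bayes_update(posterior, in_vitro)

for cat, p in zip(categories, posterior):
    print(f"{cat}: {p:.3f}")
```

In this toy run both sources favour the middle category, so the posterior concentrates there; the spread of the remaining probability mass is what supplies the "degree of confidence" attached to the categorisation.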


Author(s):  
Anastasios E. Giannopoulos ◽  
Ioanna Zioga ◽  
Konstantinos Kontoangelos ◽  
Panos Papageorgiou ◽  
Fotini Kapsali ◽  
...  

Background: Body dysmorphic disorder (BDD) is a psychiatric disorder characterized by excessive preoccupation with imagined defects in appearance. Optical illusions induce illusory effects that distort the presented stimulus, leading to ambiguous percepts. Using electroencephalography (EEG), we investigated whether BDD is related to differentiated perception during illusory percepts. Methods: 18 BDD patients and 18 controls were presented with 39 optical illusions, each together with a statement testing whether or not they perceived the illusion. After a delay period, they were prompted to answer whether the statement was right or wrong and to rate their degree of confidence in their answer. We investigated the effect of BDD on task performance and self-reported confidence, and analysed brain oscillations during decision-making using nonparametric cluster statistics. Results: Behaviorally, the BDD group exhibited reduced confidence when responding incorrectly, potentially attributable to higher levels of doubt. Electrophysiologically, the BDD group showed significantly reduced alpha power at mid-central scalp areas, suggesting impaired allocation of attention. Interestingly, the lower the alpha power of the identified cluster, the higher the BDD severity, as assessed by BDD psychometrics. Conclusions: The results indicate that alpha power during illusory processing might serve as a quantitative EEG biomarker of BDD, potentially associated with reduced inhibition of task-irrelevant areas.
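The alpha-power measure at the centre of these findings can be sketched in a few lines: band power is the spectral energy of the EEG signal between roughly 8 and 12 Hz. The snippet below is a simplified stand-in for the authors' cluster-based pipeline, using a synthetic single-channel signal and an assumed 250 Hz sampling rate.

```python
import numpy as np

# Minimal sketch of alpha-band (8-12 Hz) power extraction from one
# EEG channel. The signal is synthetic: a 10 Hz alpha rhythm plus
# Gaussian noise. Parameters are illustrative assumptions.

fs = 250                      # sampling rate in Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

# Periodogram via the real FFT
spectrum = np.abs(np.fft.rfft(signal)) ** 2 / t.size
freqs = np.fft.rfftfreq(t.size, d=1 / fs)

alpha = (freqs >= 8) & (freqs <= 12)
alpha_power = spectrum[alpha].mean()
broadband = spectrum[(freqs >= 1) & (freqs <= 40)].mean()
print(f"relative alpha power: {alpha_power / broadband:.2f}")
```

Because the synthetic signal contains a strong 10 Hz component, the mean power in the alpha band comes out well above the broadband mean; in the study, it is the reduction of this quantity at mid-central electrodes that distinguished the BDD group.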


2021 ◽  
Vol 7 (1) ◽  
pp. 1-13
Author(s):  
Paul Lockwood ◽  
Abbaas Khan

Introduction: Chest X-rays are the most frequently requested X-ray imaging in English hospitals. This study aimed to assess final-year UK radiography students' confidence and ability in image interpretation of chest X-rays. Methods: Thirty-three diagnostic radiography students were invited to assess their confidence and ability in interpreting chest X-rays from a bank of n=10 cases using multiple-choice answers. Data analysis included 2x2 contingency tables, kappa for inter-rater reliability, a Likert scale of confidence for each case, and questions assessing individual interpretation skills and ways to increase learning of the subject. Results: Twenty-three students participated in the study. The pooled accuracy achieved was 61% (95% CI 38.4-77.7; k=0.22). The degree of confidence and ability varied depending upon the student and the conditions observed. High confidence was noted with COVID-19 (n=12/23; 52%), lung metastasis (n=14/23; 61%), and pneumothorax (n=13/23; 57%). Low confidence was noted with consolidation (n=8/23; 35%), haemothorax (n=8/23; 35%), and surgical emphysema (n=8/23; 35%). From the sample, n=11 (48%) participants stated they felt they had the knowledge to interpret chest X-rays required of a newly qualified radiographer. Conclusion: The results demonstrated final-year radiography students' confidence and ability in image interpretation of chest X-rays. Student feedback indicated a preference for learning support through university lectures, online study resources, and time spent with reporting radiographers in clinical practice to improve ability and confidence in interpreting chest X-rays.
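The kappa statistic reported alongside the pooled accuracy corrects raw agreement for agreement expected by chance, computed from the marginals of the 2x2 contingency table. A minimal sketch, using an invented table rather than the study's data:

```python
# Hypothetical 2x2 contingency table comparing student answers with
# the reference reading (counts are illustrative, not study data):
#                    reference abnormal   reference normal
#  student abnormal        a=120               b=40
#  student normal          c=50                d=90

def cohens_kappa(a, b, c, d):
    """Cohen's kappa for a 2x2 agreement table."""
    n = a + b + c + d
    p_observed = (a + d) / n
    # Chance agreement from the marginal proportions
    p_yes = ((a + b) / n) * ((a + c) / n)
    p_no = ((c + d) / n) * ((b + d) / n)
    p_chance = p_yes + p_no
    return (p_observed - p_chance) / (1 - p_chance)

print(f"kappa = {cohens_kappa(120, 40, 50, 90):.2f}")
```

A kappa of 0.22, as found in the study, falls in the range conventionally described as only "fair" agreement, which is why it tempers the headline 61% accuracy figure.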


Author(s):  
Mario Ljubičić

The origin, mechanics and properties of the Solar System are analyzed in the framework of Complete Relativity. The analysis confirms the postulates and hypotheses of the theory with a high degree of confidence. During the analysis, some new hypotheses have emerged; these are discussed and confirmed with varying degrees of confidence. To increase confidence in, or to refute, some of these hypotheses, experimental verification is necessary. The main conclusions are: - the Solar System is a scaled Carbon isotope with a nucleus in a condensed (bosonic) state and components in various vertically excited states; - Earth is a living being of extremely introverted intelligence, and life is common everywhere, albeit extroverted complex forms are present on planetary surfaces only during planetary neurogenesis; - anthropogenic climate change is only a part (a trigger, from one perspective) of larger global changes on Earth and in the Solar System during planetary neurogenesis; - major extinction events are relative extinctions, a regular part of the transformation and transfer of life in the process of planetary neurogenesis.


2021 ◽  
Vol 11 (1) ◽  
pp. 1
Author(s):  
Takashi Ishida ◽  
Atsushi Maruyama ◽  
Shinichi Kurihara

In this study, we develop a model of food consumption focused on consumers' subjectively assessed risk and their degree of confidence in that risk assessment, and use it to examine consumer behavior in the chaotic situation created by the 2011 Fukushima nuclear accident. The data were collected in March 2012 through a mail survey of 1300 Japanese women, the primary food purchasers. Respondents were asked to evaluate the cancer risk of eating agricultural products assumed to be grown in the affected area despite meeting national regulatory standards for radioactive materials, as a measure of their risk assessment, and to state their willingness to purchase Fukushima beef. The results show that the effect of confidence in a consumer's risk assessment on their behavior depends on the stated risk level: when the stated risk is below an estimated critical value, termed the switching point, the risk perceived by a consumer without confidence exceeds that of one with confidence. On the other hand, perceived risk is inversely related to confidence when the stated risk exceeds the switching point.
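The switching-point pattern can be illustrated with a stylised model in which a low-confidence consumer shrinks the stated risk towards a prior belief; the crossover then occurs where the stated risk equals the prior. This is a sketch under that assumption, not the authors' estimated model, and all numbers are invented.

```python
# Stylised illustration of the "switching point": a consumer with
# low confidence in their risk assessment shrinks the stated risk
# towards a prior belief, so low- and high-confidence perceived risk
# cross over where stated risk equals the prior. Numbers invented.

PRIOR = 0.30  # hypothetical prior risk belief

def perceived_risk(stated, confidence):
    """Weighted average of the stated risk and the prior belief."""
    return confidence * stated + (1 - confidence) * PRIOR

for stated in (0.10, 0.30, 0.50):
    low = perceived_risk(stated, confidence=0.2)
    high = perceived_risk(stated, confidence=0.9)
    print(f"stated={stated:.2f}  low-conf={low:.2f}  high-conf={high:.2f}")
```

Below the prior (stated 0.10), the low-confidence consumer perceives more risk than the confident one; above it (stated 0.50), the relationship inverts, matching the qualitative pattern the study reports.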


Educatia 21 ◽  
2021 ◽  
pp. 144-151
Author(s):  
Mirela Minică

Drawing on the concept of social capital, the article highlights the changes generated by the COVID-19 pandemic in the educational system. This research identified the attributes of social capital among the adults involved in the educational process (parents, students), the impact of the current period on them, and the opinion of the respondents regarding education reform. The conclusions of the study show an activation of intentions to become involved and participate in management structures and educational projects at the level of the school organization, along with a low degree of confidence in the way the reform of the educational system is designed and implemented. Change management in recent years has highlighted the need for school involvement in the development of social capital, and also the need to increase the role of social stakeholders in solving the challenges facing school organizations.


MAUSAM ◽  
2021 ◽  
Vol 47 (1) ◽  
pp. 59-66
Author(s):  
J. CHATTOPADHYAY ◽  
R. BHATLA

The relationship between ENSO/anti-ENSO events in the Pacific basin and the simultaneous all-India monsoon has been re-examined for the period 1901-1990 using the Southern Oscillation Index (SOI). The result shows that there is a fairly strong association between ENSO events and dry monsoon years. There exists a weak teleconnection between anti-ENSO events and wet monsoons, indicating that anti-ENSO events have only a moderate impact on Indian monsoon rainfall. Developing ENSO (anti-ENSO) episodes during the monsoon season indicate the non-occurrence of simultaneous floods (droughts) with a very high degree of confidence. 70 (50) per cent of the droughts (floods) during the above period occurred during ENSO (anti-ENSO) events, indicating that extreme monsoon activities in the form of droughts (floods) might be important factors for the occurrence of simultaneous ENSO/anti-ENSO events.


2021 ◽  
Vol 2 (2) ◽  
pp. 127-132
Author(s):  
Shintia Puspita Dewi ◽  
Sekar Dwi Ardianti ◽  
Muhammad Noor Ahsin

The purpose of this study is to analyze the impact of online learning on students of elementary school 1 Barongan Kudus. The research method used is qualitative. Research data were collected through questionnaires, observations, interviews, and documentation. The research was conducted in the odd semester of the 2021/2021 academic year. The research data were collected using observation and interview techniques. Research sources include teachers, students, and parents of students of elementary school 1 Barongan Kudus. Data checking was carried out using triangulation to increase the degree of confidence in, and accuracy of, the data. Triangulation was done with three strategies: source triangulation, method triangulation, and time triangulation. The data analysis technique used is the Miles and Huberman technique. The results showed that teachers, students and parents were unable to communicate directly, and that online learning required all parties to communicate via WhatsApp.


2021 ◽  
Author(s):  
◽  
David Friggens

<p>Concurrent data structure algorithms have traditionally been designed using locks to regulate the behaviour of interacting threads, thus restricting access to parts of the shared memory to only one thread at a time. Since locks can lead to issues of performance and scalability, there has been interest in designing so-called nonblocking algorithms that do not use locks. However, designing and reasoning about concurrent systems is difficult, and is even more so for nonblocking systems, as evidenced by the number of incorrect algorithms in the literature. This thesis explores how the technique of model checking can aid the testing and verification of nonblocking data structure algorithms. Model checking is an automated verification method for finite state systems, and is able to produce counterexamples when verification fails. For verification, concurrent data structures are considered to be infinite state systems, as there is no bound on the number of interacting threads, the number of elements in the data structure, nor the number of possible distinct data values. Thus, in order to analyse concurrent data structures with model checking, we must either place finite bounds upon them, or employ an abstraction technique that will construct a finite system with the same properties. First, we discuss how nonblocking data structures can best be represented for model checking, and how to specify the properties we are interested in verifying. These properties are the safety property linearisability, and the progress properties wait-freedom, lock-freedom and obstruction-freedom. Second, we investigate using model checking for exhaustive testing, by verifying bounded (and hence finite state) instances of nonblocking data structures, parameterised by the number of threads, the number of distinct data values, and the size of storage memory (e.g. array length, or maximum number of linked list nodes).
It is widely held, based on anecdotal evidence, that most bugs occur in small instances. We investigate the smallest bounds needed to falsify a number of incorrect algorithms, which supports this hypothesis. We also investigate verifying a number of correct algorithms for a range of bounds. If an algorithm can be verified for bounds significantly higher than the minimum bounds needed for falsification, then we argue it provides a high degree of confidence in the general correctness of the algorithm. However, with the available hardware we were not able to verify any of the algorithms to high enough bounds to claim such confidence. Third, we investigate using model checking to verify nonblocking data structures by employing the technique of canonical abstraction to construct finite state representations of the unbounded algorithms. Canonical abstraction represents abstract states as 3-valued logical structures, and allows the initial coarse abstraction to be refined as necessary by adding derived predicates. We introduce several novel derived predicates and show how these allow linearisability to be verified for linked-list-based nonblocking stack and queue algorithms. This is achieved within the standard canonical abstraction framework, in contrast to recent approaches that have added extra abstraction techniques on top to achieve the same goal. The finite state systems we construct using canonical abstraction are still relatively large, being exponential in the number of distinct abstract thread objects. We present an alternative application of canonical abstraction, which more coarsely collapses all threads in a state to be represented by a single abstract thread object. In addition, we define further novel derived predicates, and show that these allow linearisability to be verified for the same stack and queue algorithms far more efficiently.</p>
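The bounded exhaustive testing described in the abstract can be made concrete with a toy example: enumerate every interleaving of a tiny concurrent program and inspect the reachable final states. The sketch below is invented for illustration (it is not from the thesis): two threads each perform a non-atomic read-then-write increment, and exhaustive enumeration of the smallest instance already exposes the lost-update bug, consistent with the "most bugs occur in small instances" hypothesis.

```python
from itertools import permutations

# Toy bounded model check: two threads each increment a shared
# counter non-atomically (read, then write back +1). We enumerate
# every interleaving of the four atomic steps and collect the set
# of reachable final counter values.

def run(schedule):
    """Execute one schedule; each thread does read, then write."""
    shared = 0
    local = {0: None, 1: None}
    step = {0: 0, 1: 0}  # 0 = about to read, 1 = about to write
    for tid in schedule:
        if step[tid] == 0:
            local[tid] = shared          # read current value
        else:
            shared = local[tid] + 1      # write back incremented value
        step[tid] += 1
    return shared

# All distinct interleavings of the 4 atomic steps (2 per thread)
schedules = set(permutations([0, 0, 1, 1]))
outcomes = {run(s) for s in schedules}
print(sorted(outcomes))  # an outcome of 1 witnesses the lost update
```

Six interleavings suffice to find the counterexample (both threads read 0, so one increment is lost and the counter ends at 1). A real model checker does the same state-space enumeration symbolically, at vastly larger scale, and reports the failing schedule as a counterexample trace.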


