Precise/not precise (PNP): A Brunswikian model that uses judgment error distributions to identify cognitive processes

Author(s):  
Joakim Sundh ◽  
August Collsiöö ◽  
Philip Millroth ◽  
Peter Juslin

Abstract: In 1956, Brunswik proposed a definition of what he called intuitive and analytic cognitive processes, not in terms of verbally specified properties, but operationally, based on observable error distributions. In the decades since, the diagnostic value of error distributions has generally been overlooked, arguably because of a long tradition of treating error as exogenous to (and irrelevant to) the process. Based on Brunswik’s ideas, we develop the precise/not precise (PNP) model, which uses a mixture distribution to model the proportion of error-perturbed versus error-free executions of an algorithm, to determine whether Brunswik’s claims can be replicated and extended. In Experiment 1, we demonstrate that the PNP model recovers Brunswik’s distinction between perceptual and conceptual tasks. In Experiment 2, we show that, even in symbolic tasks that involve no perceptual noise, the PNP model identifies both types of processes from the error distributions. In Experiment 3, we apply the PNP model to confirm the often-assumed “quasi-rational” nature of the rule-based processes involved in multiple-cue judgment. The results demonstrate that the PNP model reliably identifies the two cognitive processes proposed by Brunswik, and often recovers the parameters of the process more effectively than a standard regression model with homogeneous Gaussian error, suggesting that the standard Gaussian assumption incorrectly specifies the error distribution in many tasks. We discuss the untapped potential of using error distributions to identify cognitive processes and how the PNP model relates to, and can enlighten, debates on intuition and analysis in dual-systems theories.
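As a minimal illustration of the mixture idea (not the authors' actual estimation procedure, which may well be more elaborate), responses that match the algorithm's prediction exactly can be treated as error-free executions, yielding the mixing proportion directly, with the error SD estimated from the remaining, perturbed trials:

```python
import numpy as np

def fit_pnp(predicted, observed, tol=1e-9):
    """Minimal PNP-style fit: a mixture of error-free executions
    (residual exactly zero) and Gaussian-perturbed executions.
    Returns (lambda_hat, sigma_hat): the estimated proportion of
    precise executions and the error SD of the imprecise ones."""
    residuals = np.asarray(observed, dtype=float) - np.asarray(predicted, dtype=float)
    precise = np.abs(residuals) < tol            # error-free executions
    lam = precise.mean()                         # mixing proportion
    noisy = residuals[~precise]
    sigma = noisy.std(ddof=0) if noisy.size else 0.0
    return lam, sigma

# Simulated task: ~70% of trials execute the rule exactly, ~30% with noise.
rng = np.random.default_rng(0)
truth = rng.integers(10, 100, size=1000).astype(float)
perturbed = rng.random(1000) > 0.7
resp = truth + np.where(perturbed, rng.normal(0, 5, 1000), 0.0)
lam, sigma = fit_pnp(truth, resp)
```

On this simulated data the recovered mixing proportion is close to 0.7 and the error SD close to 5, the generating values.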

2014 ◽  
Vol 532 ◽  
pp. 113-117
Author(s):  
Zhou Jin ◽  
Ru Jing Wang ◽  
Jie Zhang

Rotating machinery in a factory typically has a complex structure and highly automated logic, and generates large amounts of monitoring data. Dealing with this massive data and locating faults in a timely manner is an infeasible task for users. In this paper, we explore the causality between symptoms and faults in the context of fault diagnosis in rotating machinery. We introduce data mining into fault diagnosis and provide a formal definition of a causal diagnosis rule based on statistical tests. A general framework for diagnosis rule discovery based on causality is provided, and a simple implementation is explored, with the purpose of providing some enlightenment for the application of causality discovery in fault diagnosis of rotating machinery.
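The statistical-test step in such a framework can be sketched as a significance screen over a symptom/fault contingency table built from monitoring records; the symptom and fault names and the threshold below are illustrative, not taken from the paper:

```python
import numpy as np

def chi2_statistic(table):
    """Pearson chi-square statistic for a 2x2 symptom/fault
    contingency table [[a, b], [c, d]]."""
    table = np.asarray(table, dtype=float)
    row = table.sum(axis=1, keepdims=True)
    col = table.sum(axis=0, keepdims=True)
    expected = row @ col / table.sum()
    return float(((table - expected) ** 2 / expected).sum())

def candidate_rule(symptom, fault, records, threshold=3.841):
    """Accept 'symptom -> fault' as a candidate diagnosis rule when
    the association is significant at the 5% level (chi2 > 3.841 for
    1 degree of freedom). This is a screening step only: significance
    alone does not establish causality."""
    a = sum(1 for r in records if r[symptom] and r[fault])
    b = sum(1 for r in records if r[symptom] and not r[fault])
    c = sum(1 for r in records if not r[symptom] and r[fault])
    d = sum(1 for r in records if not r[symptom] and not r[fault])
    return chi2_statistic([[a, b], [c, d]]) > threshold

# Toy monitoring log: high vibration co-occurs with bearing faults.
records = (
    [{"high_vibration": True, "bearing_fault": True}] * 40
    + [{"high_vibration": True, "bearing_fault": False}] * 10
    + [{"high_vibration": False, "bearing_fault": True}] * 5
    + [{"high_vibration": False, "bearing_fault": False}] * 45
)
```

On this toy log, `candidate_rule("high_vibration", "bearing_fault", records)` accepts the rule, while a symptom distributed independently of the fault would be rejected.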


2021 ◽  
Vol 16 (7-8) ◽  
pp. 106-109
Author(s):  
L.O. Malsteva ◽  
W.W. Nikonov ◽  
N.A. Kazimirova ◽  
A.A. Lopata

The review aims to present the chronological sequence of developing universal definitions of myocardial infarction (MI) and new ideas for improving the screening of post-infectious and sepsis-associated MI (casuistic masks of myocardial infarction). The stages of the development of a common and global definition of myocardial infarction are outlined: 1 — by WHO working groups, based on ECG, for epidemiological studies; 2 — by the European Society of Cardiology and the American College of Cardiology, using clinical and biochemical approaches; 3 — the Global Task Force consensus document of a universal definition, with subsequent classification of MI into five subtypes (spontaneous; mismatch in oxygen delivery and consumption; lethal outcome before the rise of specific markers of myocardial damage; PCI-associated; CABG-associated); 4 — revision of the above document by the Joint Task Force, based on the inclusion of more sensitive markers — troponins; 5 — the identification of 17 non-ischemic causes of myocardial damage accompanied by an increase in the troponin level; 6 — characterization of the atrial natriuretic peptide from the standpoint of its synthesis, storage, release, and diagnostic value as a biomarker of acute myocardial damage; 7 — a clinical definition of myocardial infarction, presented in the materials of the III Consensus on myocardial infarction, 2017. The diagnosis of myocardial infarction using the criteria set out in this document requires the integration of clinical data, ECG patterns, laboratory data, imaging findings and, in some cases, pathological results, which are considered in the context of the time frame of the suspected event. K. Thygesen et al. consider the additional use of: 1) cardiovascular magnetic resonance to determine the etiology of myocardial damage; 2) computed coronary angiography in suspected myocardial infarction. Myocardial infarction is a combination of specific cardiac markers with at least one of the symptoms listed above.
The formation of myocardial infarction can occur during or after an acute respiratory infection, and causal relationships between these two states have been established. Post-infectious myocardial infarction is strongly recommended to be individualized as a separate diagnostic entity. In sepsis, global myocardial ischemia with ischemic myocardial damage arises as a result of humoral and cellular factors and is accompanied by an increase in troponins, a decrease in the left ventricular ejection fraction by 45 %, an increase in the end-diastolic size of the left ventricle, and the development of sepsis-associated multiple organ failure, which is an unfavourable prognostic factor.


2021 ◽  
pp. 21-24
Author(s):  
A. V. Fedorova ◽  
N. V. Kochergina ◽  
A. B. Bludov ◽  
I. V. Boulycheva ◽  
E. A. Sushentsov ◽  
...  

Purpose. To determine the diagnostic value of magnetic resonance imaging for accurately defining the grade of bone chondrosarcoma at the pre-surgical examination. Material and methods. We analyzed examination data (magnetic resonance imaging without contrast enhancement) of 70 patients with chondrosarcoma (35 with low-grade and 35 with high-grade chondrosarcoma). Informative weighted coefficients were determined separately for the ‘learning’ and ‘examination’ samples. On the basis of the weighted coefficients, a decision rule was created for differentiating between low-grade and high-grade chondrosarcoma. Results. The sensitivity of the method was 87.0%, specificity was 95.6%, and total correct classification was 91.03%. Conclusion. Magnetic resonance imaging is a highly informative method for predicting chondrosarcoma grade at the pre-surgical examination.
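The reported performance figures all follow from a 2x2 confusion matrix of the decision rule's outputs against the true grades; a small sketch with hypothetical counts (not the study's data) shows how sensitivity, specificity and total correct classification are computed:

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and total correct classification
    from the cells of a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)                  # true positives among diseased
    specificity = tn / (tn + fp)                  # true negatives among healthy
    accuracy = (tp + tn) / (tp + fn + tn + fp)    # total correct classification
    return sensitivity, specificity, accuracy

# Hypothetical counts for illustration only:
sens, spec, acc = diagnostic_metrics(tp=40, fn=10, tn=45, fp=5)
```

With these invented counts the rule would score 80% sensitivity, 90% specificity and 85% total correct classification.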


2007 ◽  
Vol 112 (11) ◽  
pp. 577-582 ◽  
Author(s):  
Tetsuo Konno ◽  
Noboru Fujino ◽  
Kenshi Hayashi ◽  
Katsuharu Uchiyama ◽  
Eiichi Masuta ◽  
...  

Differences in the diagnostic value of a variety of definitions of negative T waves for HCM (hypertrophic cardiomyopathy) have not yet been clarified, resulting in a number of definitions being applied in previous studies. The aim of the present study was to determine the most accurate diagnostic definition of negative T waves for HCM in genotyped populations. Electrocardiographic and echocardiographic findings were analysed in 161 genotyped subjects (97 carriers and 64 non-carriers). We applied three different criteria that have been used in previous studies: Criterion 1, negative T wave >10 mm in depth in any leads; Criterion 2, negative T wave >3 mm in depth in at least two leads; and Criterion 3, negative T wave >1 mm in depth in at least two leads. Of the three criteria, Criterion 3 had the highest sensitivity (43% compared with 5 and 26% in Criterion 1 and Criterion 2 respectively; P<0.0001) and retained a specificity of 95%, resulting in the highest accuracy. In comparison with abnormal Q waves, negative T waves for Criterion 3 had a lower sensitivity in detecting carriers without LVH (left ventricular hypertrophy) (12.9% for negative T waves compared with 22.6% for abnormal Q waves). On the other hand, in detecting carriers with LVH, the sensitivity of negative T waves increased in a stepwise direction with the increasing extent of LVH (P<0.001), whereas there was less association between the sensitivity of abnormal Q waves and the extent of LVH. In conclusion, Criterion 3 for negative T waves may be the most accurate definition of HCM based on genetic diagnoses. Negative T waves may show different diagnostic value according to the different criteria and phenotypes in genotyped populations with HCM.
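Each of the three criteria reduces to a depth threshold plus a minimum number of leads, so their relative sensitivity can be sketched directly; the per-lead depth profiles below are hypothetical, not from the genotyped cohort:

```python
def meets_criterion(neg_t_depths_mm, depth_mm, min_leads):
    """True if at least `min_leads` leads show a negative T wave
    deeper than `depth_mm` (depths given as positive magnitudes)."""
    return sum(d > depth_mm for d in neg_t_depths_mm) >= min_leads

CRITERIA = {
    1: dict(depth_mm=10, min_leads=1),   # >10 mm in any lead
    2: dict(depth_mm=3, min_leads=2),    # >3 mm in at least two leads
    3: dict(depth_mm=1, min_leads=2),    # >1 mm in at least two leads
}

def sensitivity(carrier_ecgs, criterion):
    """Proportion of carriers whose ECG meets the given criterion."""
    hits = sum(meets_criterion(ecg, **CRITERIA[criterion]) for ecg in carrier_ecgs)
    return hits / len(carrier_ecgs)

# Hypothetical negative-T-wave depths (mm) for three carriers:
carriers = [
    [12.0, 4.0, 2.0],   # meets all three criteria
    [2.0, 1.5, 0.5],    # meets criterion 3 only
    [0.8, 0.5, 0.0],    # meets none
]
```

Because criterion 3 nests the other two (any ECG satisfying criterion 1 or 2 also satisfies 3), its sensitivity is necessarily at least as high, which matches the ordering reported in the abstract.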


Author(s):  
Lourdes M. Brasil ◽  
Jean C. C. Rojas ◽  
Fernando M. de Azevedo ◽  
Carlos W. D. de Almeida ◽  
...  

This work presents the hybrid module of the IACVIRTUAL meta-environment. In this context we basically approach the Hybrid Expert System (HES), which is composed of the Neural Network Based Expert System (NNES) and the Rule-Based Expert System (RBES). The HES is designed to support the decisions of a clinical-surgical team, in the area of cardiology, in defining a therapeutic conduct for patients with coronary heart disease. The implementation process starts with Knowledge Acquisition (KA), which comes from the analysis of a series of clinical parameters used as input data for the NNES. In this way, knowledge acquired during elicitation is converted into fuzzy rules. Through these rules, the elicited knowledge is mapped into AND/OR graphs, which then represent the starting structure of the NNES. Learning and optimization of the NNES are performed through the Genetic-Backpropagation Based Learning Algorithm (GENBACK), which can, during the learning process, modify the weights of the connections as well as the network structure. Knowledge abstracted from the NNES, being already refined, trained and tested, is used to form the Knowledge Base of the RBES.
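The RBES side of such a system ultimately fires IF-THEN rules over patient findings until no new conclusions follow. A minimal forward-chaining sketch (with invented, non-clinical rule content, not the IACVIRTUAL knowledge base) illustrates the mechanism:

```python
# Forward chaining over IF-THEN rules of the kind a rule-based
# expert system (RBES) stores in its knowledge base.
# Rules and facts are illustrative only.
RULES = [
    ({"chest_pain", "st_depression"}, "suspected_ischemia"),
    ({"suspected_ischemia", "multivessel_disease"}, "consider_surgery"),
]

def forward_chain(facts, rules):
    """Repeatedly fire rules whose premises are all satisfied,
    adding their conclusions, until no new fact is derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

derived = forward_chain({"chest_pain", "st_depression", "multivessel_disease"}, RULES)
```

Note the second rule only fires after the first has added `suspected_ischemia`, which is why the loop iterates to a fixed point rather than making a single pass.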


2015 ◽  
Vol 19 (3) ◽  
pp. 269-282
Author(s):  
Luke Pearson

This essay attempts to outline the ways in which contemporary videogames produce spatial experiences, and how architects might interrogate their unique media form. Framing videogames as both computational constructions and cultural artefacts, the paper places the study in a lineage of architectural thinkers examining ‘pop-culture’ and technology, drawing from the Smithsons’ writings on advertisements as technical images, Venturi and Scott Brown’s studies on symbolism, through to Reyner Banham’s definition of mass-produced gizmos. The paper first outlines the importance of videogames to society and their Smithsonian impulses towards architectural design. To support this, I examine the work of game theorists such as Espen Aarseth and Ian Bogost. Aarseth argues that game spaces sever certain ties and ‘deviate’ from reality in order to become playable spaces. Bogost contends that game rules produce ‘procedural rhetoric’: games may advance arguments through the playing of their rules. Reading from these theories, I argue that these rule-based breaks from the real are a potent site for architectural speculation. The second section comprises design case studies scrutinising existing game worlds and producing new videogames as architectural experiments. I begin by examining the significance of symbolism in videogame worlds, and how this might provide alternative trajectories for digital architectural design. I subsequently explore Atkinson and Willis’s concept of the ludodrome, slippages between virtual and real, and discuss Ubiquity, a game I produced to explore this condition. I return to Banham’s Great Gizmo, alongside P.W. Singer’s writings on military robotics, to see the gamepad as a new order of gizmo for colonising space. And I discuss the loading screen of ‘Grand Theft Auto V’ as a manifestation of satellite imagery aesthetics that collapse space.
The paper concludes that games are powerful media for spatial experimentation and we must prepare for new generations of designers highly influenced by such ‘deviated’ architectures.


2017 ◽  
Vol 40 (1) ◽  
pp. 65-83 ◽  
Author(s):  
Guillermo Domingo Martinez ◽  
Heleno Bolfarine ◽  
Hugo Salinas

Regression analysis is a technique widely used in different areas of human knowledge, with distinct distributions for the error term. It is the case, however, that regression models with bimodal responses or, equivalently, with the error term following a bimodal distribution, are not common in the literature, perhaps due to the lack of simple-to-handle bimodal error distributions. In this paper we propose a simple-to-handle bimodal regression model with a symmetric-asymmetric distribution for the error term which, for some values of the shape parameter, can be bimodal. This new distribution contains the normal and skew-normal as special cases. A real-data application reveals that the new model can be extremely useful in such situations.
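To see why a homogeneous Gaussian error term misfits bimodal residuals, a small simulation is instructive; here a two-component Gaussian mixture stands in for the bimodal error (it is not the paper's symmetric-asymmetric distribution, only an illustration of the phenomenon):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
x = rng.uniform(0, 10, n)

# Bimodal, zero-mean error: two well-separated Gaussian components.
mode = rng.random(n) < 0.5
error = np.where(mode, rng.normal(-2, 0.5, n), rng.normal(2, 0.5, n))
y = 1.0 + 0.8 * x + error

# OLS still recovers the slope (the error has mean zero) ...
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (intercept + slope * x)

# ... but the residuals are clearly bimodal: almost none fall near 0,
# which a single Gaussian with the same overall SD could not produce
# (it would place roughly 19% of residuals within +/-0.5 of zero).
near_zero = np.mean(np.abs(residuals) < 0.5)
```

The point estimates survive, but any inference or density statement built on the homogeneous Gaussian assumption is badly specified, which is the gap the proposed bimodal error family addresses.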

