Normative decision asymmetries with symmetric priors but asymmetric evidence

2020 ◽  
Author(s):  
Tahra Eissa ◽  
Joshua I Gold ◽  
Krešimir Josić ◽  
Zachary P Kilpatrick

Abstract Decisions based on rare events are challenging because rare events alone can be both informative and unreliable as evidence. How humans should and do overcome this challenge is not well understood. Here we present results from a preregistered study of 200 on-line participants performing a simple inference task in which the evidence was rare and asymmetric but the priors were symmetric. Consistent with a Bayesian ideal observer, most participants exhibited choice asymmetries that reflected a tendency to rationally interpret a rare event as evidence for the alternative likely to produce slightly more events, even when the two alternatives were equally likely a priori. A subset of participants exhibited additional biases based on an under-weighting of rare events. The results provide new quantitative and theoretically grounded insights into rare-event inference, which is relevant to both real-world problems like predicting stock-market crashes and common laboratory tasks like predicting changes in reward contingencies.
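
As a concrete illustration of the ideal-observer logic described above, the minimal sketch below applies Bayes' rule to two equally likely sources that emit rare events at slightly different rates; a single observed event rationally shifts the posterior toward the higher-rate source. The Poisson likelihood and the specific rates are illustrative assumptions, not the parameters of the study's task.

```python
# Minimal sketch (not the authors' code) of a Bayesian ideal observer with
# symmetric priors but asymmetric rare-event rates. Rates and the Poisson
# observation model are assumptions chosen for illustration.
import numpy as np
from scipy.stats import poisson

rate_low, rate_high = 0.1, 0.2   # assumed rare-event rates per observation window
prior = np.array([0.5, 0.5])     # symmetric priors over the two sources

def posterior(num_events: int) -> np.ndarray:
    """Posterior over (low-rate, high-rate) sources after observing num_events."""
    likelihood = np.array([poisson.pmf(num_events, rate_low),
                           poisson.pmf(num_events, rate_high)])
    unnormalized = prior * likelihood
    return unnormalized / unnormalized.sum()

print(posterior(0))  # no event observed -> weak evidence for the low-rate source
print(posterior(1))  # one rare event -> posterior shifts toward the high-rate source
```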

10.29007/ch8v ◽  
2018 ◽  
Author(s):  
Sebastian Lindner ◽  
Raphael Elsner ◽  
Phuong Nga Tran ◽  
Andreas Timm-Giel

The Limited Relative Error algorithm is an alternative statistical method for data evaluation. Through on-line result analysis, it continuously requests more samples until it deems the evaluation confident enough. This allows researchers to hand over control of the simulation time to the algorithm, and because the target result resolution is set through a priori configuration, arbitrarily rare events can be investigated. We provide a new description of the method as well as a stand-alone implementation and an integration of the algorithm into the OMNeT++ simulator.
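
The sketch below shows only the general stopping idea behind such sample-controlled evaluation (keep sampling until the relative error of the running estimate falls below a configured target); it is not the Limited Relative Error method itself nor the authors' OMNeT++ integration, and the function and parameter names are illustrative.

```python
# A minimal sketch, assuming a Bernoulli rare-event indicator and a CLT-based
# relative-error criterion; not the LRE algorithm or its implementation.
import math
import random

def estimate_until_confident(simulate_once, target_rel_error=0.05,
                             min_samples=1000, max_samples=10_000_000):
    """Estimate P(rare event) while controlling the relative error on-line."""
    hits, n = 0, 0
    while n < max_samples:
        hits += simulate_once()     # returns 1 if the rare event occurred, else 0
        n += 1
        if n >= min_samples and hits > 0:
            p_hat = hits / n
            # relative standard error of a Bernoulli proportion estimate
            rel_error = math.sqrt((1 - p_hat) / (n * p_hat))
            if rel_error <= target_rel_error:
                break
    return hits / n, n

# Example: a toy "simulation" whose rare event has probability 1e-3
p_hat, samples_used = estimate_until_confident(lambda: random.random() < 1e-3)
print(p_hat, samples_used)
```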


2020 ◽  
Vol 39 (6) ◽  
pp. 8463-8475
Author(s):  
Palanivel Srinivasan ◽  
Manivannan Doraipandian

Rare event detection is performed using spatial-domain and frequency-domain procedures. Footage from omnipresent surveillance cameras is increasing exponentially over time, and monitoring all events manually is impractical and time-consuming. An automated rare event detection mechanism is therefore required to make this process manageable. In this work, a Context-Free Grammar (CFG) is developed for detecting rare events in a video stream, and an Artificial Neural Network (ANN) is used to train the CFG. A set of dedicated algorithms performs frame splitting, edge detection, and background subtraction, and converts the processed data into the CFG. The developed CFG is converted into nodes and edges to form a graph, which is given to the input layer of the ANN to classify normal and rare event classes. The graph derived from the CFG on the input video stream is used to train the ANN. The performance of the developed Artificial Neural Network-Based Context-Free Grammar Rare Event Detection (ACFG-RED) model is then compared with existing techniques using performance metrics such as accuracy, precision, sensitivity, recall, average processing time, and average processing power. Better metric values were observed for the ANN-CFG model than for the other techniques. The developed model provides a better solution for detecting rare events in video streams.
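
To make the front end of such a pipeline concrete, the sketch below extracts simple per-frame edge and foreground statistics with common OpenCV calls. It is only an illustrative stand-in for the preprocessing stages (frame split, edge detection, background subtraction); the CFG construction, graph conversion, and the ACFG-RED classifier are not reproduced, and `video_path` and the feature choices are assumptions.

```python
# A rough sketch of the preprocessing stages only, under the assumptions
# stated above; not the ACFG-RED model.
import cv2
import numpy as np

def frame_features(video_path, max_frames=200):
    """Per-frame edge and foreground statistics as a crude feature vector."""
    cap = cv2.VideoCapture(video_path)
    bg_subtractor = cv2.createBackgroundSubtractorMOG2()
    features = []
    for _ in range(max_frames):
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 100, 200)          # edge detection
        foreground = bg_subtractor.apply(frame)    # background subtraction
        features.append([edges.mean(), foreground.mean()])
    cap.release()
    return np.array(features)

# A downstream classifier (in the paper, an ANN over the CFG-derived graph)
# would then label each segment as a normal or rare event.
```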


Cells ◽  
2021 ◽  
Vol 10 (4) ◽  
pp. 826
Author(s):  
Rafael Kretschmer ◽  
Marcelo Santos de Souza ◽  
Ivanete de Oliveira Furo ◽  
Michael N. Romanov ◽  
Ricardo José Gunski ◽  
...  

Interchromosomal rearrangements involving microchromosomes are rare events in birds. To date, they have been found mostly in Psittaciformes, Falconiformes, and Cuculiformes, although only a few orders have been analyzed. Hence, cytogenomic studies focusing on microchromosomes in species belonging to different bird orders are essential to shed more light on avian chromosome and karyotype evolution. On this basis, we performed comparative chromosome mapping of chicken microchromosomes 10 to 28 using interspecies BAC-based FISH in five species representing four Neoaves orders (Caprimulgiformes, Piciformes, Suliformes, and Trogoniformes). Our results suggest that the ancestral microchromosomal syntenies are conserved in Pteroglossus inscriptus (Piciformes), Ramphastos tucanus tucanus (Piciformes), and Trogon surrucura surrucura (Trogoniformes). On the other hand, chromosome reorganization in Phalacrocorax brasilianus (Suliformes) and Hydropsalis torquata (Caprimulgiformes) included fusions involving both macro- and microchromosomes. Fissions in macrochromosomes were observed in P. brasilianus and H. torquata. Relevant hypothetical Neognathae and Neoaves ancestral karyotypes were reconstructed to trace these rearrangements. We found no interchromosomal rearrangement involving microchromosomes that was shared between the avian orders in which rearrangements were detected. Our findings suggest that convergent evolution involving microchromosomal change is a rare event in birds and may be useful for cytotaxonomic inference in the orders where these rearrangements occurred.


Dependability ◽  
2018 ◽  
Vol 18 (4) ◽  
pp. 3-9 ◽  
Author(s):  
I. S. Shubinsky ◽  
A. M. Zamyshliaev ◽  
L. P. Papi

The paper examines the reliability of an information management system as its ability to provide required services that can be justifiably trusted. It is assumed that the system functions without an operator. The aim is to ensure the dependability of a multimodule control system whose problem-solving results are affected by failures, faults, and solution errors of the system's computation modules (CMs). Conventional fault tolerance methods do not provide the desired effect: even with unlimited structural redundancy, given realistic capabilities for on-line detection of CM failures and faults, the system's dependability remains significantly lower than expected. The paper proposes and evaluates methods of adaptive dependability. These methods are intended to ensure the observability of control systems under limited capabilities for supervising the operability of component CMs, and to achieve the required levels of dependability of information management systems when float time and structural redundancy are limited. These goals are achieved through active, automatic reassignment of the available computational resources for on-line information processing. The methods of adaptive dependability enable, without interrupting computational processes and while real-world problems are being solved, timely automatic detection and elimination of CM failures and faults and of errors in the solution of specified problems, through on-line localization of faulty modules and subsequent automatic reconfiguration of the system that removes such modules from operation.
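
A schematic sketch of the reconfiguration idea follows: when a computation module is localized as faulty, its pending work is reassigned to the remaining healthy modules and it is taken out of service without stopping the overall computation. This is not the authors' method; the class, module, and task names are illustrative.

```python
# A minimal sketch of fault-driven reassignment among computation modules,
# under the assumptions stated above.
from collections import deque

class AdaptivePool:
    def __init__(self, module_ids):
        self.healthy = set(module_ids)
        self.queues = {m: deque() for m in module_ids}

    def assign(self, task):
        # Simple load-based assignment among currently healthy modules.
        target = min(self.healthy, key=lambda m: len(self.queues[m]))
        self.queues[target].append(task)

    def report_fault(self, module_id):
        # On-line localization flagged this module: remove it from service
        # and redistribute its queued tasks.
        self.healthy.discard(module_id)
        for task in self.queues.pop(module_id, deque()):
            self.assign(task)

pool = AdaptivePool(["CM1", "CM2", "CM3"])
for t in range(6):
    pool.assign(f"task{t}")
pool.report_fault("CM2")   # CM2's tasks migrate to CM1 and CM3
```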


Author(s):  
Carlos A. Maldonado ◽  
Marc L. Resnick

The Internet has become a growing channel for consumer purchases: half of all U.S. consumers made at least one purchase on-line in 2001. However, many consumers report frustration with the lack of navigation support within many Internet retailers' web sites. Several design patterns have been suggested to overcome these limitations, such as expanded hierarchies and breadcrumbs. This study investigated the effects of these design patterns on users' quantitative performance and subjective preference for e-commerce web sites. Expanded hierarchies, a design pattern commonly used by many retail web sites, degraded all of the performance metrics assessed in the study: users required more time, made more errors, used more clicks, and had lower satisfaction scores for sites designed with expanded hierarchies. The results for breadcrumbs suggest that they may improve performance. The inclusion of breadcrumbs reduced the number of clicks required to complete the tasks, although the differences in the other performance metrics did not reach statistical significance. The results indicate that design patterns believed a priori to improve performance may not yield the expected results.


1974 ◽  
Vol 96 (4) ◽  
pp. 426-432 ◽  
Author(s):  
R. Isermann ◽  
U. Bauer

An identification method is described which first identifies a linear nonparametric model (cross-correlation function, impulse response) by correlation analysis and then estimates the parameters of a parametric model (discrete transfer function); it also includes a method for detecting the model order and the time delay. The performance, computational expense, and overall reliability of this method are compared with five other identification methods. This two-step identification method, which can be applied off-line or on-line, is especially suited to identification by process computers, since it requires little a priori knowledge about the structure of the process model, very short computation time, and small computer storage; no initial values of matrices or parameters are necessary, and no divergence is possible for the on-line version. Results of an on-line identification of an industrial process with a process computer are shown.
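
The numerical sketch below illustrates the two-step idea on a toy first-order system: estimate the impulse response by cross-correlation with a white test input, then fit a low-order discrete transfer function by least squares. It is not the original algorithm (and omits the order and time-delay detection); the simulated plant and model structure are assumptions.

```python
# A compact sketch of correlation analysis followed by parametric least-squares
# estimation, under the assumptions stated above.
import numpy as np

rng = np.random.default_rng(0)
N = 5000
u = rng.standard_normal(N)          # white-noise test input
y = np.zeros(N)
for k in range(2, N):               # simulated plant: y[k] = 0.6 y[k-1] + 0.5 u[k-1] + 0.3 u[k-2]
    y[k] = 0.6 * y[k - 1] + 0.5 * u[k - 1] + 0.3 * u[k - 2]
y += 0.05 * rng.standard_normal(N)  # measurement noise

# Step 1: nonparametric model by correlation analysis. For a white input,
# the cross-correlation R_uy(tau) is proportional to the impulse response g(tau).
M = 20
g_hat = np.array([np.mean(y[tau:] * u[:N - tau]) for tau in range(M)])
print("impulse response estimate (first taps):", g_hat[:4])

# Step 2: parametric model (first-order ARX with two input taps) by least squares.
Phi = np.column_stack([y[1:N - 1], u[1:N - 1], u[0:N - 2]])
theta, *_ = np.linalg.lstsq(Phi, y[2:N], rcond=None)
print("estimated [a1, b1, b2]:", theta)   # should be close to [0.6, 0.5, 0.3]
```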


Author(s):  
Susumu Hara ◽  
Kenji Nakamura ◽  
Tatsuo Narikiyo

This study discusses the positioning and vibration control of vibration systems whose parameters are time-varying. We assume that the time-varying parameter of the vibration system is obtained by on-line measurement or wavelet analysis. The paper treats two control methods based on nonstationary optimal regulators (NORs) for this problem. The first method is gain scheduling of NORs: the actual controller is obtained by interpolation of plural NORs designed a priori. The second is an NOR design based on wavelet analysis of the vibration system; in this case, a single NOR derived from the analysis result is applied. This study shows the effectiveness of these methods through numerical calculations and experiments. Based on a comparison of the two methods, the paper suggests suitable applications of NORs according to the characteristics of each control problem.
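
The sketch below illustrates only the gain-scheduling idea of the first method: feedback gains designed a priori for two operating points are interpolated according to the currently measured (or wavelet-estimated) parameter. The NOR design itself is not reproduced, and the gains and parameter range are illustrative assumptions.

```python
# A schematic sketch of gain scheduling between regulators designed a priori,
# under the assumptions stated above; not the paper's NOR design.
import numpy as np

# State-feedback gains designed off-line for two values of the time-varying
# parameter (e.g. a stiffness that drifts during operation).
K_soft = np.array([12.0, 3.5])     # gain for the low-stiffness plant
K_stiff = np.array([28.0, 6.0])    # gain for the high-stiffness plant
k_min, k_max = 100.0, 400.0        # parameter range covered by the two designs

def scheduled_gain(k_now: float) -> np.ndarray:
    """Interpolate the feedback gain for the currently measured parameter."""
    w = np.clip((k_now - k_min) / (k_max - k_min), 0.0, 1.0)
    return (1.0 - w) * K_soft + w * K_stiff

# At each sampling instant the measured parameter selects the interpolated
# gain: u = -K(k_now) @ x
x = np.array([0.01, 0.0])          # position error and velocity
u = -scheduled_gain(250.0) @ x
print(u)
```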


Author(s):  
Yanwen Xu ◽  
Pingfeng Wang

Abstract Accurately analyzing rare failure events at an affordable computational cost is challenging in many engineering applications, especially for problems with high-dimensional system inputs. The extremely low probabilities of occurrence of such rare events often lead to large probability estimation errors and low computational efficiency. It is therefore vital to develop advanced probability analysis methods capable of providing robust estimates of rare event probabilities with narrow confidence bounds. Confidence intervals for an estimator can generally be established based on the central limit theorem, but a critical obstacle is low computational efficiency, since the widely used Monte Carlo method often requires a large number of simulation samples to derive a reasonably narrow confidence interval. This paper develops a new probability analysis approach that can efficiently derive estimates of rare event probabilities with narrow estimation bounds for high-dimensional problems. The asymptotic behavior of the developed estimator is proved theoretically without imposing strong assumptions, and an asymptotic confidence interval is established for the estimator. The presented study offers important insights into robust estimation of the probability of occurrence of rare events. The accuracy and computational efficiency of the developed technique are assessed with numerical and engineering case studies. The case study results demonstrate that narrow bounds can be built efficiently using the developed approach and that the true values always lie within the estimation bounds, indicating good estimation accuracy along with significantly improved efficiency.
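
For context, the sketch below shows the plain Monte Carlo baseline that the abstract contrasts against, with its CLT-based confidence interval; the inefficiency visible for small probabilities is what the proposed approach targets. The paper's own estimator is not reproduced, and the limit-state function, dimensionality, and sample size are illustrative assumptions.

```python
# Baseline crude Monte Carlo with a CLT confidence interval, under the
# assumptions stated above; not the paper's method.
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x = rng.standard_normal((n, 10))                 # high-dimensional standard-normal inputs
failures = (x.sum(axis=1) > 11.0)                # assumed rare "failure" region

p_hat = failures.mean()
std_err = np.sqrt(p_hat * (1.0 - p_hat) / n)
ci = (p_hat - 1.96 * std_err, p_hat + 1.96 * std_err)
print(f"p_hat = {p_hat:.2e}, 95% CI = ({ci[0]:.2e}, {ci[1]:.2e})")
# For very small p_hat the interval stays wide relative to the estimate unless
# n grows very large, which drives the cost of the crude Monte Carlo baseline.
```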


2005 ◽  
Vol 29 (2) ◽  
pp. 195-209
Author(s):  
Dany Dionne ◽  
Hannah Michalska

A new adaptive proportional navigation law for the interception of a maneuvering target is presented. The approach employs a bank of guidance laws and an on-line governor that selects the guidance law in effect at each time instant. The members of the bank are the proportional navigation law and a companion law suited to a target moving with constant acceleration. The governor is a hierarchical decision rule that uses the outputs of a maneuver detector and the available a priori information about the expected number of evasive maneuvers. Simulation results demonstrate that the adaptive approach reduces the miss distance compared with cases where only a single non-adaptive guidance law is available.
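
As a rough illustration of such a bank, the sketch below pairs classical proportional navigation with the standard augmented form for a constantly accelerating target and switches between them on a maneuver-detector flag. This is a simplification, not the authors' guidance bank or hierarchical governor; all gains and signal values are illustrative.

```python
# A schematic sketch of PN, an augmented companion law, and a trivial
# switching governor, under the assumptions stated above.

N_GAIN = 3.0  # navigation constant

def pn(closing_speed: float, los_rate: float) -> float:
    """Classical proportional navigation acceleration command."""
    return N_GAIN * closing_speed * los_rate

def apn(closing_speed: float, los_rate: float, target_accel: float) -> float:
    """Augmented PN: companion law for a constant-acceleration target."""
    return N_GAIN * closing_speed * los_rate + 0.5 * N_GAIN * target_accel

def governor(maneuver_detected: bool, closing_speed, los_rate, target_accel_est):
    """Pick the law in effect at this instant from the maneuver detector output."""
    if maneuver_detected:
        return apn(closing_speed, los_rate, target_accel_est)
    return pn(closing_speed, los_rate)

# Example call at one guidance update:
a_cmd = governor(maneuver_detected=True, closing_speed=900.0,
                 los_rate=0.02, target_accel_est=30.0)
print(a_cmd)
```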


2019 ◽  
Vol 63 (8) ◽  
pp. 1819-1848
Author(s):  
Dariusz Dereniowski ◽  
Dorota Osula

Abstract We consider the following on-line pursuit-evasion problem. A team of mobile agents called searchers starts at an arbitrary node of an unknown network. Their goal is to execute a search strategy that guarantees capturing a fast and invisible intruder regardless of its movements, using as few searchers as possible. We require that the strategy be connected and monotone: at each point of the execution, the part of the graph that is guaranteed to be free of the fugitive is connected, and whenever a node gains the property that it cannot be occupied by the fugitive, the strategy must operate so as to keep this property until the end. As a way of modeling two-dimensional shapes, we restrict our attention to networks that are embedded into partial grids: nodes are placed on the plane at integer coordinates, and only nodes at distance one can be adjacent. Agents have no a priori knowledge about the graph, but they recognize the direction of each incident edge (up, down, left, or right). We give an on-line algorithm for the searchers that allows them to compute a connected and monotone strategy that searches any unknown partial grid using $O(\sqrt{n})$ searchers, where $n$ is the number of nodes in the grid. As for a lower bound, there exist partial grids that require ${\varOmega}(\sqrt{n})$ searchers. Moreover, we prove that for each on-line searching algorithm there is a partial grid that forces the algorithm to use ${\varOmega}(\sqrt{n})$ searchers, although $O(\log n)$ searchers are sufficient in the off-line scenario. This gives a lower bound of ${\varOmega}(\sqrt{n}/\log n)$ on the achievable competitive ratio of any on-line algorithm.

