Team Size and Decision Rule in the Performance of Simulated Monitoring Teams

Author(s):
Wayne L. Waag, Charles G. Halcomb

A methodology is presented for randomly creating “teams” from a data pool of individual response records. Using this approach, the effects of two variables on team monitoring performance were investigated: (1) team size and (2) the decision rule employed in defining the requirements of a “team” response. The size of the simulated teams was varied from two to five members. The decision rule was varied from “parallel,” in which a response by any one or more members produced a “team” response, to “series,” in which a “team” response occurred only if all members responded. “Parallel” teams were found to maximize correct detections, while “series” teams eliminated all false alarms. For each decision rule, detection rate increased as a function of team size. For each team size, detection rate deteriorated as the decision rule required responses from more members.
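Because the “parallel” rule is simply an OR over member responses and the “series” rule an AND, the simulation is easy to reproduce. Below is a minimal Python sketch of the team-sampling idea; the pool size, number of signals, and individual detection rate are illustrative assumptions, not figures from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pool: one row per individual, one column per signal event;
# True means that individual detected (responded to) the event.
# The 60% individual detection rate is an assumption for illustration.
pool = rng.random((100, 32)) < 0.6

def team_response(members, rule):
    """Combine member records into a 'team' response per event.
    'parallel' = any member responding (OR); 'series' = all members (AND)."""
    return members.any(axis=0) if rule == "parallel" else members.all(axis=0)

for size in range(2, 6):
    for rule in ("parallel", "series"):
        # Average detection rate over many randomly drawn simulated teams.
        rates = [
            team_response(pool[rng.choice(100, size, replace=False)], rule).mean()
            for _ in range(1000)
        ]
        print(f"size={size} {rule:8s} detection rate={np.mean(rates):.2f}")
```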

Author(s):
Sunilkumar Soni, Santanu Das, Aditi Chattopadhyay

An optimal sensor placement methodology is proposed, based on a detection theory framework, to maximize the detection rate and minimize the false-alarm rate. Minimizing the false-alarm rate for a given detection rate plays an important role in improving the efficiency of a Structural Health Monitoring (SHM) system, as it reduces the number of false alarms. The placement technique is designed so that the sensor features are as directly correlated with, and as sensitive to, damage as possible. The technique accounts for a number of factors, such as actuation frequency and strength, minimum damage size, damage detection scheme, material damping, signal-to-noise ratio (SNR), and sensing radius; these factors are not independent and affect one another. Optimal sensor placement is done in two steps. First, a sensing radius is calculated that can capture any detectable change caused by a perturbation above a certain threshold. The threshold value is based on a Neyman-Pearson detector, which maximizes the detection rate for a fixed false-alarm rate. To avoid sensor redundancy, a criterion is defined to minimize the overlap between the sensing regions of neighboring sensors. Based on the sensing region and the minimum-overlap criterion, the number of sensors needed on a structural component is calculated. In the second step, a damage distribution pattern, known as the probability-of-failure distribution, is calculated for the structural component using finite element analysis. This failure distribution helps in selecting the most sensitive sensors, thereby removing those that make only remote contributions to the overall detection scheme.
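For intuition, under a simple Gaussian noise assumption the Neyman-Pearson threshold for a fixed false-alarm rate reduces to an inverse survival function of the noise distribution. The sketch below illustrates this step only; the signal and noise parameters are assumed for illustration and are not taken from the paper.

```python
from scipy.stats import norm

def np_threshold(p_fa, noise_sigma):
    """Neyman-Pearson threshold for a constant signal in zero-mean Gaussian
    noise: the smallest threshold whose false-alarm probability is p_fa."""
    return noise_sigma * norm.isf(p_fa)

def detection_rate(threshold, signal_mean, noise_sigma):
    """Probability that signal-plus-noise exceeds the threshold."""
    return norm.sf((threshold - signal_mean) / noise_sigma)

# Illustrative numbers only: 1% false-alarm rate, unit noise, signal mean 3.
t = np_threshold(0.01, 1.0)
print(f"threshold = {t:.3f}, detection rate = {detection_rate(t, 3.0, 1.0):.3f}")
```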


2019, Vol 8 (9), pp. 384
Author(s):
Park, Lee

Remote sensing technologies, particularly Synthetic Aperture Radar (SAR) systems, can provide timely and critical information for assessing landslide distributions over large areas. Most space-borne SAR systems operate in different polarimetric modes to meet various operational requirements. This study aims to discuss how much detectability can be expected in landslide maps produced from the single-, dual-, and quad-polarization modes of observation. Experimental analysis of the characteristic changes of PALSAR-2 signals showed that quad-polarization parameters indicating signal depolarization properties revealed noticeable landslide-induced temporal changes across all local incidence angle ranges. To produce a landslide map, a simple change detection method based on the characteristic scattering properties of landslide areas was proposed. The accuracy assessment showed that the depolarization parameters, such as the co-pol coherence and the polarizing contribution, can identify areas affected by landslides with a detection rate of 60% and a false-alarm rate of 5%. In contrast, the single- or dual-pol parameters can only be expected to provide half that accuracy, with significant false alarms in areas exhibiting temporal variations unrelated to landslides.
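A minimal sketch of the kind of change detection described, thresholding the temporal drop in a depolarization-related parameter between pre- and post-event acquisitions; the arrays, threshold, and coherence-drop behavior are illustrative assumptions, not values from the study.

```python
import numpy as np

def landslide_mask(pre, post, drop_threshold):
    """Flag pixels whose depolarization-related parameter (e.g. co-pol
    coherence, in [0, 1]) dropped by more than drop_threshold between the
    pre- and post-event acquisitions."""
    return (pre - post) > drop_threshold

# Toy data standing in for PALSAR-2-derived parameter maps.
rng = np.random.default_rng(1)
pre = rng.uniform(0.6, 0.9, (4, 4))
post = pre.copy()
post[1:3, 1:3] -= 0.4  # simulated landslide-induced drop
print(landslide_mask(pre, post, drop_threshold=0.2))
```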


1964, Vol 19 (2), pp. 435-440
Author(s):
Earl L. Wiener, Gary K. Poock, Matthew Steele

A three-group experiment was performed to evaluate the effect of a secondary task of simple mental arithmetic on visual monitoring performance. The primary task was detection of a slightly larger excursion of a voltmeter needle making 50 uniform excursions per minute. The vigil lasted 48 min., during which 32 signals were presented. The time-sharing group (T) performed, in addition to the monitoring task, a secondary task of adding two one-digit numbers presented auditorily three times per minute. Two control groups, one with the numbers presented (N) and one with only random noise (C), performed only the monitoring task. Results showed no difference in detection rate between groups, but a significant decrement over time (p < .001) and no group-by-time-period interaction. Commissive errors were significantly higher in the time-sharing group than in the control groups. The results are seen as contrary to the arousal theory of vigilance.


Author(s):
Matthew T. Nare, Gabriella M. Hancock

Healthcare is an industry in which errors have the potential to cause significant consequences, such as unintended patient injury or death. Teams of healthcare professionals work together to complete patient care activities across several hospital environments, including operating rooms, trauma centers, and care floors. While most healthcare tasks are patient-facing, some, such as cardiac telemetry monitoring, occur with little patient interaction. Even so, cardiac telemetry monitoring is a task in which errors can result in patient harm or death. The purpose of this study was to investigate the vigilance decrement using a healthcare visual search task based on cardiac telemetry monitoring. The results were interpreted through the lens of resource theory of vigilance. Findings include a decrease in correct detections and false alarms with increased workload, which supports previous vigilance decrement research and provides directions for future research.


2018, Vol 18 (01), pp. e05
Author(s):
John Adedapo Ojo, Jamiu Alabi Oladosu

Video-based fire detection (VFD) technologies have recently received significant attention from both academic and industrial communities. However, existing VFD approaches remain susceptible to false alarms due to changes in illumination, camera noise, variability of shape, motion, and colour, irregular patterns of smoke and flames, and modelling and training inaccuracies. Hence, this work aimed at developing a video smoke detection (VSD) system with a high detection rate, a low false-alarm rate, and a short response time. Moving blocks in video frames were segmented and analysed in HSI colour space, and wavelet energy analysis of the smoke candidate blocks was performed. In addition, dynamic texture descriptors were obtained using the Weber Local Descriptor in Three Orthogonal Planes (WLD-TOP). These features were combined and used as inputs to a Support Vector Classifier with a radial basis kernel function, while the post-processing stage employs temporal image filtering to reduce false alarms. The algorithm was implemented in MATLAB 8.1.0.604 (R2013a). An accuracy of 99.30%, a detection rate of 99.28%, and a false-alarm rate of 0.65% were obtained when tested with some online videos. The output of this work would find application in early fire detection systems and in other areas such as robot vision and automated inspection.
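A brief sketch of the classification stage, assuming the colour, wavelet-energy, and WLD-TOP features have already been extracted and concatenated per candidate block; the data below are random placeholders, and scikit-learn's SVC stands in for the paper's MATLAB implementation.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder training data: each row would concatenate the HSI colour,
# wavelet-energy, and WLD-TOP features of one candidate block;
# label 1 = smoke, 0 = non-smoke.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))
y = rng.integers(0, 2, size=200)

# Support vector classifier with an RBF (radial basis) kernel.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale", C=1.0))
clf.fit(X, y)
print(clf.predict(X[:5]))
```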


2020
Author(s):
Sonam Chhikara, Rajeev Kumar

Steganography hides data within a media file in an imperceptible way; steganalysis exposes steganography by using detection measures. Traditionally, steganalysis revealed steganography by targeting perceptible and statistical properties, which in turn drove the development of more secure steganography schemes. In this work, we target LSB image steganography using entropy and joint entropy metrics for steganalysis. First, the embedded image is processed for feature extraction and then analyzed with entropy and joint entropy against the corresponding original image. Second, SVM and ensemble classifiers are trained on the analysis results; the classifiers' decisions discriminate cover images from stego images. The scheme is further applied to attacked stego images to check detection reliability. Performance evaluation of the proposed scheme is conducted on grayscale image datasets. We analyzed LSB-embedded images by comparing the information gain from the entropy and joint entropy metrics. The results show that the entropy of a suspected image is better preserved than its joint entropy: before a histogram attack, the detection rate is 70% with the entropy metric and 98% with the joint entropy metric, while after the attack the entropy metric falls to a 30% detection rate and the joint entropy metric still achieves 93%. Joint entropy therefore proves to be the better steganalysis measure, with 93% detection accuracy and fewer false alarms across varying hiding ratios.
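A small sketch of the two metrics, assuming 8-bit grayscale images; the toy LSB embedding illustrates the reported effect that marginal entropy is nearly preserved while joint entropy with the cover image shifts.

```python
import numpy as np

def entropy(img, bins=256):
    """Shannon entropy (bits) of an 8-bit grayscale image's histogram."""
    counts, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def joint_entropy(a, b, bins=256):
    """Joint entropy (bits) of paired pixel intensities (cover vs. suspect)."""
    counts, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins,
                                  range=[[0, 256], [0, 256]])
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

# Toy LSB embedding: flipping least significant bits barely moves the
# marginal entropy but visibly shifts the joint entropy with the cover.
rng = np.random.default_rng(0)
cover = rng.integers(0, 256, (64, 64))
stego = cover ^ rng.integers(0, 2, cover.shape)
print(entropy(cover), entropy(stego))                            # nearly identical
print(joint_entropy(cover, cover), joint_entropy(cover, stego))  # differs
```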


2018, pp. 286-312
Author(s):
Masoumeh Zareapoor, Pourya Shamsolmoali, M. Afshar Alam

Fraud detection requires a holistic approach whose objective is to correctly classify transactions as legitimate or fraudulent. Existing methods emphasize detecting all fraudulent transactions, since missed fraud results in monetary loss, and to do so they often have to compromise on some genuine transactions. Thus, a major issue that credit card fraud detection systems face today is that a significant percentage of transactions labelled as fraudulent are in fact legitimate. These “false alarms” delay transactions and create inconvenience and dissatisfaction for the customer. The objective of this research is therefore to develop an intelligent, data-mining-based fraud detection system for secure online payment transactions. The performance of the proposed model is evaluated on a real credit card dataset, and the model is found to have a higher fraud detection rate and a lower false-alarm rate than other state-of-the-art classifiers.
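For reference, the two reported metrics can be computed from a confusion matrix as follows; the labels in the example are made up.

```python
def rates(y_true, y_pred):
    """Detection rate = flagged fraud / all fraud (recall);
    false-alarm rate = flagged legitimate / all legitimate."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    return tp / (tp + fn), fp / (fp + tn)

# Made-up labels: 1 = fraudulent, 0 = legitimate.
detection, false_alarm = rates([1, 1, 0, 0, 0, 1], [1, 0, 0, 1, 0, 1])
print(f"detection rate = {detection:.2f}, false-alarm rate = {false_alarm:.2f}")
```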


Author(s):
James C. Ferraro, Mustapha Mouloua

Despite its rapid advancement, automation remains vulnerable to system failures. The reliability of automation may affect users' trust and how they interact with it, and the type of error can redirect user behavior in distinct ways. This study investigated how reliability and error type impact operator trust and monitoring performance. Participants completed a monitoring task at either 50% or 90% reliability, experiencing either misses or false alarms from an automated alert system. It was hypothesized that automation reliability would impact trust, while error type would impact reliance and compliance behaviors. Results indicated that misses had a greater impact on monitoring performance than false alarms, while reliability did not influence performance. Trust was not influenced by reliability or error type and showed no relationship with the performance measures. These results help clarify how automation failures shape human interaction with automated systems and can inform the design of future automation.


2004, Vol 126 (1), pp. 55-61
Author(s):
Ranjan Ganguli, Budhadipta Dan

Trend shift detection is posed as a two-part problem: filtering of the gas turbine measurement deltas followed by the use of edge detection algorithms. Measurement deltas are deviations in engine gas path measurements from a “good” baseline engine and are a key health signal used for gas turbine performance diagnostics. The measurements used in this study are exhaust gas temperature, low rotor speed, high rotor speed, and fuel flow, which are called cockpit measurements and are typically found on most commercial jet engines. A cascaded recursive median (RM) filter of increasing order is used for noise reduction and outlier removal, and a hybrid edge detector that applies both the gradient and the Laplacian of the cascaded RM-filtered signal is used to detect step changes in the measurements. Simulated results with test signals indicate that cascaded RM filters can give a noise reduction of more than 38% while preserving the essential features of the signal. The cascaded RM filter also shows excellent robustness in dealing with outliers, which are quite often found in gas turbine data and can cause spurious trend detections. Suitable thresholding of the gradient edge detector, coupled with the use of the Laplacian edge detector for cross-checking, can reduce the system's false alarms and missed-detection rate. Further reductions in the trend shift detection false-alarm and missed-detection rates can be achieved by selecting gas path measurements with higher signal-to-noise ratios.
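A hedged sketch of the filtering-plus-edge-detection pipeline: a simple recursive median filter (each window mixes previously filtered outputs with raw upcoming samples), cascaded with increasing order, followed by gradient and Laplacian operators. Window orders, the test signal, and the outlier are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def recursive_median(x, k):
    """Recursive median (RM) filter with half-window k: each window mixes
    the k most recent *filtered* outputs with the current and next k inputs."""
    y = np.array(x, dtype=float)
    for n in range(k, len(y) - k):
        y[n] = np.median(np.concatenate([y[n - k:n], y[n:n + k + 1]]))
    return y

def cascaded_rm(x, orders=(1, 2, 3)):
    """Cascade RM filters of increasing order for smoothing/outlier removal."""
    for k in orders:
        x = recursive_median(x, k)
    return x

# Toy measurement-delta signal: step change at sample 50, noise, one outlier.
rng = np.random.default_rng(0)
x = np.concatenate([np.zeros(50), np.ones(50)]) + 0.1 * rng.normal(size=100)
x[20] += 2.0
s = cascaded_rm(x)
grad = np.gradient(s)          # gradient edge detector
lap = np.gradient(grad)        # second difference as a Laplacian cross-check
print("step detected near sample", int(np.argmax(np.abs(grad))))
```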

