Vision-Based Finger Tapping Detection Without Fingertip Observation

2021 ◽  
Vol 33 (3) ◽  
pp. 484-493
Author(s):  
Shotaro Narita ◽  
Shingo Kagami ◽  
Koichi Hashimoto

This study investigates a machine learning approach to detecting finger taps on a handheld surface, where the movement of the surface is observed visually but the tapping finger itself is not directly visible. A feature vector extracted from consecutive frames captured by a high-speed camera observing a surface patch is input to a convolutional neural network, which predicts whether the surface is tapped within the frame sequence (“tap”), is still (“still”), or is moved by hand (“move”). Receiver operating characteristic (ROC) analysis of the binary discrimination of “tap” from the other two labels shows that true positive rates exceeding 97% are achieved when the false positive rate is fixed at 3%, although generalization to different tapped objects or different ways of tapping is not satisfactory. An informal test with a heuristic post-processing filter suggests that temporal history information should be considered for further improvement.
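
As an illustration of the pipeline this abstract describes, here is a minimal sketch of a three-class CNN over a stack of consecutive frames. The paper's actual architecture, input size, and feature extraction are not given in the abstract, so every detail below is an assumption.

```python
import torch
import torch.nn as nn

class TapClassifier(nn.Module):
    """Three-class classifier over a stack of consecutive grayscale frames.

    The frame stack is treated as input channels; the classes follow the
    abstract: "tap", "still", "move". Layer sizes are illustrative only.
    """
    def __init__(self, n_frames=8, n_classes=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(n_frames, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x):  # x: (batch, n_frames, height, width)
        return self.net(x)

# Four hypothetical 8-frame windows of a 64x64 surface patch
logits = TapClassifier()(torch.randn(4, 8, 64, 64))
print(logits.shape)  # torch.Size([4, 3])
```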

2005 ◽  
Vol 46 (7) ◽  
pp. 750-755 ◽  
Author(s):  
Y. Baba ◽  
H. Hokotate ◽  
M. Nakajo

Purpose: To retrospectively evaluate criteria for adrenal vein sampling values in patients with aldosterone-producing adrenal adenoma (APA). Material and Methods: Between 1988 and 2002, 59 hypertensive patients (15 M and 44 F, aged 47.58±9.45 years) were referred to our hospital and diagnosed with APA based on established criteria. During the same period, 23 patients with non-functioning adrenal adenoma (11 M and 12 F, aged 53.56±11.76 years) were diagnosed based on computed tomography and laboratory data. Results: All 82 patients were enrolled in the present study. Bilateral adrenal vein sampling (AVS) for measurement of plasma aldosterone (A) and cortisol (C) was performed, and receiver operating characteristic (ROC) analysis was conducted to establish the best criteria from the AVS-derived indices in patients with APA. The ratio A (APA side)/A (contralateral side) provided the best diagnostic accuracy (>2.5 for right APA: sensitivity 83.3%, specificity 79.5%; >3 for left APA: sensitivity 84.2%, specificity 76.9%). The Az values (areas under the ROC curve) for A (APA side)/A (contralateral side) were 0.8948 and 0.9260 for right and left APA, respectively. Conclusion: The A (APA side)/A (contralateral side) ratio offered the best compromise between sensitivity and false-positive rate for lateralization of APA.
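
For readers who want to reproduce this kind of analysis, a minimal sketch of the ROC/Az computation on the A (APA side)/A (contralateral side) ratio using scikit-learn follows; the per-patient values below are hypothetical, not the study's data.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical aldosterone ratios A(APA side)/A(contralateral side)
ratio = np.array([5.2, 1.8, 3.9, 0.9, 4.4, 2.2, 7.1, 1.3, 3.2, 2.7])
has_apa = np.array([1, 0, 1, 0, 1, 0, 1, 0, 1, 0])  # 1 = lateralized APA

fpr, tpr, thresholds = roc_curve(has_apa, ratio)
az = roc_auc_score(has_apa, ratio)  # "Az" = area under the ROC curve

# Sensitivity/specificity at the study's right-sided cutoff of 2.5
pred = ratio > 2.5
sens = (pred & (has_apa == 1)).sum() / (has_apa == 1).sum()
spec = (~pred & (has_apa == 0)).sum() / (has_apa == 0).sum()
print(f"Az={az:.3f}, sensitivity={sens:.1%}, specificity={spec:.1%}")
```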


Author(s):  
Stefano Mariani ◽  
Thompson V. Nguyen ◽  
Xuan Zhu ◽  
Simone Sternini ◽  
Francesco Lanza di Scalea ◽  
...  

The University of California at San Diego (UCSD), under a Federal Railroad Administration (FRA) Office of Research and Development (R&D) grant, is developing a system for high-speed, non-contact rail defect detection. A prototype has been realized that pairs ultrasonic air-coupled guided wave generation and air-coupled signal detection with a real-time statistical analysis algorithm. Because air-coupled ultrasonic measurements in rail steel have an inherently poor signal-to-noise ratio, the system requires a specialized filtering approach based on electrical impedance matching. Various aspects of the prototype have been designed with the aid of numerical analyses. In particular, simulations of ultrasonic guided wave propagation in rails have been performed using a Local Interaction Simulation Approach (LISA) algorithm. The system’s operating parameters were selected based on Receiver Operating Characteristic (ROC) curves, which provide a quantitative means of evaluating detection performance in terms of the trade-off between detection rate and false positive rate. The prototype based on this technology was tested in October 2014 at the Transportation Technology Center (TTC) in Pueblo, Colorado, and again in November 2015 after incorporating changes based on lessons learned.
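
A hedged sketch of the ROC-based parameter selection described above: given detection-statistic samples from defective and clean rail, scan thresholds and keep the best detection rate whose false positive rate stays within a budget. The scores, budget, and function name are illustrative assumptions, not the UCSD system's actual values.

```python
import numpy as np

def pick_operating_point(scores_defect, scores_clean, max_fpr=0.05):
    """Choose a detection threshold from empirical ROC data.

    Scans candidate thresholds and returns the one with the highest
    detection (true positive) rate whose false positive rate stays
    within `max_fpr`.
    """
    thresholds = np.unique(np.concatenate([scores_defect, scores_clean]))
    best = None
    for t in thresholds:
        tpr = (scores_defect >= t).mean()   # detection rate
        fpr = (scores_clean >= t).mean()    # false alarm rate
        if fpr <= max_fpr and (best is None or tpr > best[1]):
            best = (t, tpr, fpr)
    return best  # (threshold, detection rate, false positive rate)

rng = np.random.default_rng(1)
# Hypothetical statistics: defects score higher than clean rail on average
print(pick_operating_point(rng.normal(3, 1, 500), rng.normal(0, 1, 5000)))
```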


2021 ◽  
Author(s):  
Sushant Kumar Pandey ◽  
Anil Kumar Tripathi

Abstract: The quality of defect datasets is a critical issue in the domain of software defect prediction (SDP). These datasets are obtained by mining software repositories. Recent studies have raised concerns over the quality of defect datasets, because of inconsistencies between bug/clean fix keywords in fault reports and the corresponding links in change management logs. The class imbalance (CI) problem is also a major challenge for SDP models. A defect prediction method trained on noisy and imbalanced data yields inconsistent and unsatisfactory results, so a combined analysis of noisy instances and the CI problem is required. To the best of our knowledge, few studies have addressed these aspects together. In this paper, we examine the impact of noise and the CI problem on five baseline SDP models: we manually added noise at various levels (0 to 80%) and measured its effect on the performance of those models. We further provide guidelines on the range of noise each baseline model can tolerate, and we suggest an SDP model that has the highest noise tolerance and outperforms the other classical methods. The True Positive Rate (TPR) and False Positive Rate (FPR) values of the baseline models degrade by 20% to 30% after adding 10% to 40% noisy instances; similarly, their ROC (Receiver Operating Characteristic) values drop by 40% to 50%. The suggested model tolerates noise levels of 40% to 60%, more than the other traditional models.
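
To make the noise-injection protocol concrete, here is a minimal sketch that flips a fraction of training labels on an imbalanced synthetic dataset and reports TPR/FPR at each noise level; the dataset, classifier, and injection details are assumptions, since the abstract does not specify them.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

def flip_labels(y, noise_level, rng):
    """Flip a fraction `noise_level` of training labels, a simplified
    stand-in for mislabeled bug/clean instances in defect datasets."""
    y = y.copy()
    idx = rng.choice(len(y), size=int(noise_level * len(y)), replace=False)
    y[idx] = 1 - y[idx]
    return y

rng = np.random.default_rng(0)
# weights=[0.8] yields an imbalanced dataset (80% clean, 20% defective)
X, y = make_classification(n_samples=2000, weights=[0.8], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for noise in [0.0, 0.2, 0.4, 0.6]:
    model = RandomForestClassifier(random_state=0)
    model.fit(X_tr, flip_labels(y_tr, noise, rng))
    tn, fp, fn, tp = confusion_matrix(y_te, model.predict(X_te)).ravel()
    print(f"noise={noise:.0%}  TPR={tp/(tp+fn):.2f}  FPR={fp/(fp+tn):.2f}")
```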


Author(s):  
Tran Ngoc Thinh ◽  
Cuong Pham-Quoc ◽  
Biet Nguyen-Hoang ◽  
Thuy-Chau Tran-Thi ◽  
Chien Do-Minh ◽  
...  

In this paper, we propose a novel FPGA-based high-speed DDoS countermeasure system that can flexibly adapt to DDoS attacks while maintaining system performance. The system includes a packet decoder module and multiple DDoS countermeasure mechanisms. We apply a dynamic partial reconfiguration technique so that the countermeasure mechanisms can be flexibly changed or updated on the fly. The proposed architecture separates the DDoS protection modules (which implement the DDoS countermeasure techniques) from the packet decoder module, so that one DDoS protection module can be reconfigured without interfering with the others. The proposed system is implemented on a NetFPGA-10G board. Synthesis results show that the system can operate at up to 116.782 MHz while utilizing up to 39.9% of the registers and 49.85% of the BlockRAM of the board's Xilinx Virtex-5 XC5VTX240T FPGA device. The system achieves a detection rate of 100%, a false negative rate of 0%, and a false positive rate close to 0.16%. The prototype achieves a packet decoding throughput of 9.869 Gbps in half-duplex mode and 19.738 Gbps in full-duplex mode.
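
As a quick sanity check on how these rates combine, a sketch with hypothetical traffic counts chosen only to reproduce figures of the same order as those reported:

```python
# Hypothetical counts; the actual test traffic composition is not given.
attacks_total, attacks_detected = 10_000, 10_000
benign_total, benign_flagged = 50_000, 80

detection_rate = attacks_detected / attacks_total        # TPR = 100%
false_negative_rate = 1 - detection_rate                 # FNR = 0%
false_positive_rate = benign_flagged / benign_total      # ~0.16%

print(f"detection={detection_rate:.1%}, FNR={false_negative_rate:.1%}, "
      f"FPR={false_positive_rate:.2%}")
```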


Author(s):  
Aureli Alabert ◽  
Mercè Farré

Abstract: The doctrinal paradox is analysed from a probabilistic point of view, assuming a simple parametric model for the committee’s behaviour. The well-known premise-based and conclusion-based majority rules are compared in this model by means of the concepts of false positive rate (FPR), false negative rate (FNR) and the Receiver Operating Characteristic (ROC) space. We also introduce a new rule, which we call path-based, that lies halfway between the other two. Under our model assumptions, the premise-based rule is shown to be the best of the three according to an optimality criterion based on ROC maps, for all values of the model parameters (committee size and competence of its members), when equal weight is given to FPR and FNR. We extend this result to show that, for unequal weights of FNR and FPR, the relative merits of the rules depend on the competence values and the weights, in a way that is precisely described. The results are illustrated with numerical examples.
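
A minimal Monte Carlo sketch of the comparison, assuming two independent premises (each true with probability 1/2), a conclusion that is their conjunction, and members who judge each premise correctly with a common competence; these modelling choices are ours, made to match the standard doctrinal-paradox setup, not the paper's exact parametrization.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n_members=5, competence=0.7, n_trials=100_000):
    """Estimate FPR/FNR of premise- vs. conclusion-based majority rules."""
    # True state of the two premises; conclusion = their conjunction
    truth = rng.random((n_trials, 2)) < 0.5
    true_conclusion = truth.all(axis=1)

    # Each member judges each premise correctly with prob. `competence`
    correct = rng.random((n_trials, n_members, 2)) < competence
    votes = np.where(correct, truth[:, None, :], ~truth[:, None, :])

    # Premise-based: majority vote on each premise, then conjunction
    premise_majority = votes.sum(axis=1) > n_members / 2
    pb = premise_majority.all(axis=1)

    # Conclusion-based: each member's own conjunction, then majority
    member_conclusions = votes.all(axis=2)
    cb = member_conclusions.sum(axis=1) > n_members / 2

    def rates(decision):
        fpr = decision[~true_conclusion].mean()    # accept a false conclusion
        fnr = (~decision[true_conclusion]).mean()  # reject a true conclusion
        return fpr, fnr

    return rates(pb), rates(cb)

# Typically the premise-based rule accepts more readily:
# higher FPR, much lower FNR than the conclusion-based rule.
print(simulate())
```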


Author(s):  
M. B. Shete

Abstract: There are many areas in which companies can adopt technologies that support decision-making, and Artificial Intelligence is among the most important, widely used to assist companies and institutions in business strategy, organizational matters, and people management. Lately, increasing attention has been paid to Human Resources (HR), since professional skills and capabilities represent a growth factor and a genuine competitive advantage for organizations. Having first been introduced in sales and marketing departments, artificial intelligence is also beginning to guide employee-related decisions within HR management. The aim is to support decisions that are based not on subjective impressions but on objective data analysis. The goal of this work is to analyse how objective factors influence employee attrition, to identify the main causes that contribute to a worker's decision to leave a company, and to predict whether a specific employee will leave. The proposed prediction algorithm was tested on an actual dataset with almost 150 samples, on which it produced the best results across all experimental parameters. It achieves the best recall rate, which measures the ability of a classifier to find all the true positives, while keeping a comparatively low false positive rate. The presented results help identify the behaviour of employees who may leave in the following period. Experimental results reveal that the logistic regression approach reaches up to 86% accuracy, outperforming the alternatives. Several algorithms can be used to process the data: K-Nearest Neighbour, logistic regression, decision tree, random forest, Support Vector Machine, etc. Keywords: Employee Attrition, Machine Learning, Support Vector Machine (SVM), KNN (K-Nearest Neighbour)
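
A minimal sketch of a logistic-regression attrition predictor on a synthetic ~150-sample dataset; the real features, preprocessing, and the dataset itself are not described in the abstract, so everything below (feature names, the label-generating rule) is an assumption.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, recall_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 150  # roughly the sample size mentioned in the abstract
X = np.column_stack([
    rng.integers(20, 60, n),        # age (hypothetical feature)
    rng.integers(1, 30, n),         # years at company (hypothetical)
    rng.normal(5000, 1500, n),      # monthly income (hypothetical)
])
# Hypothetical rule: younger, newer, lower-paid employees leave more often
p = 1 / (1 + np.exp(0.05 * (X[:, 0] - 35) + 0.1 * X[:, 1]
                    + 0.0005 * (X[:, 2] - 5000)))
y = (rng.random(n) < p).astype(int)   # 1 = employee left

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
pred = model.predict(X_te)
print(f"accuracy={accuracy_score(y_te, pred):.0%}, "
      f"recall={recall_score(y_te, pred):.0%}")
```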


2002 ◽  
Vol 41 (01) ◽  
pp. 37-41 ◽  
Author(s):  
S. Shung-Shung ◽  
S. Yu-Chien ◽  
Y. Mei-Due ◽  
W. Hwei-Chung ◽  
A. Kao

Summary. Aim: Even with careful observation, the overall false-positive rate of laparotomy remains 10-15% when acute appendicitis is suspected. We therefore assessed the clinical efficacy of the Tc-99m HMPAO labeled leukocyte (TC-WBC) scan for the diagnosis of acute appendicitis in patients presenting with atypical clinical findings. Patients and Methods: Eighty patients presenting with acute abdominal pain and possible acute appendicitis but atypical findings were included in this study. After intravenous injection of TC-WBC, serial anterior abdominal/pelvic images at 30, 60, 120 and 240 min with 800k counts were obtained with a gamma camera. Any abnormal localization of radioactivity in the right lower quadrant of the abdomen, equal to or greater than bone marrow activity, was considered a positive scan. Results: 36 of the 49 patients with positive TC-WBC scans underwent appendectomy, and all proved to have positive pathological findings. Five positive TC-WBC scans were due to other pathological lesions rather than acute appendicitis. Eight patients were not operated on, and clinical follow-up after one month revealed no acute abdominal condition. Three of the 31 patients with negative TC-WBC scans underwent appendectomy; they also had positive pathological findings. The remaining 28 patients did not undergo surgery and showed no evidence of appendicitis after at least one month of follow-up. The overall sensitivity, specificity, accuracy, and positive and negative predictive values of the TC-WBC scan for diagnosing acute appendicitis were 92, 78, 86, 82, and 90%, respectively. Conclusion: The TC-WBC scan provides a rapid and highly accurate method for the diagnosis of acute appendicitis in patients with equivocal clinical findings. It proved useful in reducing the false-positive rate of laparotomy and in shortening the time necessary for clinical observation.
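
The reported figures can be reconstructed from the counts in the abstract, assuming the five scans positive for other pathology are excluded from the appendicitis analysis (which is what the quoted percentages suggest):

```python
# Counts taken from the abstract
tp = 36  # positive scan, appendicitis confirmed at appendectomy
fp = 8   # positive scan, no acute condition on follow-up
fn = 3   # negative scan, appendicitis found at appendectomy
tn = 28  # negative scan, no appendicitis after follow-up

sensitivity = tp / (tp + fn)                 # 36/39 ≈ 92%
specificity = tn / (tn + fp)                 # 28/36 ≈ 78%
accuracy = (tp + tn) / (tp + fp + fn + tn)   # 64/75 ≈ 85-86%
ppv = tp / (tp + fp)                         # 36/44 ≈ 82%
npv = tn / (tn + fn)                         # 28/31 ≈ 90%
print(sensitivity, specificity, accuracy, ppv, npv)
```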


1993 ◽  
Vol 32 (02) ◽  
pp. 175-179 ◽  
Author(s):  
B. Brambati ◽  
T. Chard ◽  
J. G. Grudzinskas ◽  
M. C. M. Macintosh

Abstract: The analysis of the clinical efficiency of a biochemical parameter in the prediction of chromosome anomalies is described, using a database of 475 cases including 30 abnormalities. A comparison was made of two different approaches to the statistical analysis: the use of Gaussian frequency distributions and likelihood ratios, and logistic regression. Both methods computed that for a 5% false-positive rate approximately 60% of anomalies are detected on the basis of maternal age and serum PAPP-A. The logistic regression analysis is appropriate where the outcome variable (chromosome anomaly) is binary and the detection rates refer to the original data only. The likelihood ratio method is used to predict the outcome in the general population. The latter method depends on the data or some transformation of the data fitting a known frequency distribution (Gaussian in this case). The precision of the predicted detection rates is limited by the small sample of abnormals (30 cases). Varying the means and standard deviations (to the limits of their 95% confidence intervals) of the fitted log Gaussian distributions resulted in a detection rate varying between 42% and 79% for a 5% false-positive rate. Thus, although the likelihood ratio method is potentially the better method in determining the usefulness of a test in the general population, larger numbers of abnormal cases are required to stabilise the means and standard deviations of the fitted log Gaussian distributions.
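
A sketch of the screening calculation the abstract refers to: assume log-Gaussian distributions for affected and unaffected pregnancies, set the cutoff at a 5% false-positive rate among the unaffected, and read off the detection rate among the affected. The parameter values below are hypothetical, since the fitted means and standard deviations are not given in the abstract.

```python
from scipy.stats import norm

# Hypothetical parameters for log10(PAPP-A MoM)
mu_unaff, sd_unaff = 0.00, 0.25   # unaffected pregnancies
mu_aff, sd_aff = -0.35, 0.30      # chromosomally abnormal pregnancies

# Cutoff giving a 5% false-positive rate among the unaffected
# (low PAPP-A flags risk, so screening is on values below the cutoff)
cutoff = norm.ppf(0.05, loc=mu_unaff, scale=sd_unaff)

# Detection rate: fraction of affected pregnancies below the cutoff
detection_rate = norm.cdf(cutoff, loc=mu_aff, scale=sd_aff)
print(f"detection rate at 5% FPR: {detection_rate:.1%}")
```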


2019 ◽  
Author(s):  
Amanda Kvarven ◽  
Eirik Strømland ◽  
Magnus Johannesson

Andrews & Kasy (2019) propose an approach for adjusting effect sizes in meta-analysis for publication bias. We use the Andrews-Kasy estimator to adjust the result of 15 meta-analyses and compare the adjusted results to 15 large-scale multiple labs replication studies estimating the same effects. The pre-registered replications provide precisely estimated effect sizes, which do not suffer from publication bias. The Andrews-Kasy approach leads to a moderate reduction of the inflated effect sizes in the meta-analyses. However, the approach still overestimates effect sizes by a factor of about two or more and has an estimated false positive rate of between 57% and 100%.

