Searching for tainted vulnerabilities in the static analysis tool Svace

Author(s):  
Alexey Evgenevich Borodin ◽  
Alexey Vyacheslavovich Goremykin ◽  
Sergey Pavlovitch Vartanov ◽  
Andrey Andreevich Belevantsev

The paper is dedicated to the search for taint-based errors in the source code of programs, i.e. errors caused by unsafe use of data obtained from external sources, which could potentially be modified by an attacker. The interprocedural static analyzer Svace was used as a basis. The analyzer searches both for defects in the program and for suspicious places where the program's logic may be violated. The goal is to find as many errors as possible at an acceptable speed and with a low level of false positives (below 20-35%). To find errors, Svace uses a modified compiler to build a low-level typed intermediate representation, which serves as input to the main SvEng analyzer. The analyzer builds a call graph and then performs a summary-based analysis: functions are traversed according to the call graph starting from the leaves, and after a function is analyzed, a summary is created for it, which is then used to analyze call instructions. The analysis has both high speed and good scalability. Intra-procedural analysis is based on symbolic execution with merging of states at path join points. For some checkers, an SMT solver can be used to filter out infeasible paths; the solver is called only if there is a suspicion of an error. The analyzer has been extended to find defects caused by the use of tainted data. The checkers are implemented as plugins using a source-sink scheme. Sources are calls to library functions that receive data from outside the program, as well as the arguments of the main function. Sinks are array accesses, the use of variables as a loop step or bound, and calls to functions that require checked arguments. Checkers covering most of the possible types of vulnerabilities for tainted integers and strings have been implemented. The Juliet project was used to assess the coverage. The false negative rate ranged from 46.31% to 81.17%, with a small number of false positives.
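The source-sink scheme described above can be illustrated with a small sketch. The toy IR, the source and sink names, and the propagation rules below are invented stand-ins for Svace's actual plugin interface, which the paper does not spell out:

```python
# Minimal sketch of a source-sink taint checker over a toy IR; not Svace code.
SOURCES = {"fgets", "recv", "getenv"}     # library calls returning external data
SINKS = {"array_index", "loop_bound"}     # uses that require checked values

def analyze(instructions):
    """Flag sink instructions whose operand is derived from a taint source."""
    tainted = set()
    warnings = []
    for op, dest, args in instructions:   # (op, dest, args) triples of a toy IR
        if op == "call" and args and args[0] in SOURCES:
            tainted.add(dest)                       # source: result is tainted
        elif op in ("assign", "add", "mul"):
            if any(a in tainted for a in args):     # taint propagates through
                tainted.add(dest)                   # copies and arithmetic
        elif op in SINKS and any(a in tainted for a in args):
            warnings.append((op, args))             # tainted value reaches a sink
    return warnings

# Example: idx comes from fgets() and is used as an array index -> warning
prog = [
    ("call", "s", ["fgets"]),
    ("assign", "idx", ["s"]),
    ("array_index", None, ["idx"]),
]
print(analyze(prog))   # [('array_index', ['idx'])]
```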

2010 ◽  
Vol 15 (9) ◽  
pp. 1116-1122 ◽  
Author(s):  
Xiaohua Douglas Zhang

In most genome-scale RNA interference (RNAi) screens, the ultimate goal is to select siRNAs with a large inhibition or activation effect. The selection of hits typically requires statistical control of two types of errors: false positives and false negatives. Traditional methods of controlling false positives and false negatives do not take into account an important feature of RNAi screens: many small interfering RNAs (siRNAs) may have very small but real nonzero average effects on the measured response, so these methods cannot effectively control false positives and false negatives. To address the deficiencies of traditional approaches in RNAi screening, the author proposes a new method for controlling false positives and false negatives in RNAi high-throughput screens. False negatives are statistically controlled through a false-negative rate (FNR) or false nondiscovery rate (FNDR). FNR is the proportion of false negatives among all siRNAs examined, whereas FNDR is the proportion of false negatives among declared nonhits. The author also proposes new concepts, the q*-value and p*-value, to control FNR and FNDR, respectively. The proposed method should have broad utility for hit selection in which one needs to control both false discovery and false nondiscovery rates in genome-scale RNAi screens in a robust manner.
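The FNR and FNDR definitions above translate directly into code. The sketch below applies them to simulated screen data; the effect sizes, replicate count, and the t-test hit rule are illustrative assumptions, not the author's q*-value/p*-value procedure:

```python
# FNR vs. FNDR on simulated screen data; hit-calling rule is illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_sirnas, n_reps = 1000, 3
true_effect = np.where(rng.random(n_sirnas) < 0.2, 1.5, 0.0)  # 20% real hits
data = true_effect[:, None] + rng.normal(0, 1, (n_sirnas, n_reps))

# Declare hits with a one-sample t-test at alpha = 0.05 (assumed rule)
pvals = stats.ttest_1samp(data, 0.0, axis=1).pvalue
declared_hit = pvals < 0.05

false_neg = (~declared_hit) & (true_effect > 0)
fnr = false_neg.sum() / n_sirnas            # among ALL siRNAs examined
fndr = false_neg.sum() / (~declared_hit).sum()  # among declared non-hits
print(f"FNR = {fnr:.3f}, FNDR = {fndr:.3f}")
```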


2020 ◽  
Vol 32 (6) ◽  
pp. 87-100
Author(s):  
Alexey Evgenevich Borodin ◽  
Irina Aleksandrovna Dudina

Svace is a static analysis tool for bug detection in C/C++/Java source code. To analyze a program, Svace performs an intra-procedural analysis of individual functions, starting from the leaves of the call graph and moving towards the roots, and uses summaries of previously analyzed procedures at call sites. In this paper, we overview the approaches and techniques employed by Svace for the intra-procedural analysis. This phase is performed by an analyzer engine and an extensible set of detectors. The core engine employs a symbolic execution approach with state merging. It uses value numbering to reduce the set of symbolic expressions, maintains a points-to graph for memory modeling, and performs strong and weak updates of program values. Detectors are responsible for discovering and reporting bugs. They calculate different properties of program values using a variety of abstract domains. All detectors work simultaneously, orchestrated by the engine. Svace analysis is unsound and employs a variety of heuristics to speed up the analysis. We designed Svace to analyze big projects (several MLOC) in just a few hours and report as many warnings as possible, while keeping a good quality of reports (≥ 65% true positives). For example, analysis of Tizen 5.5 (20 MLOC) takes 8.6 hours and produces 18,920 warnings, more than 70% of which are true positives.
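Strong versus weak updates, one of the memory-modeling techniques mentioned above, can be sketched in a few lines. The abstract locations and set-based value domain below are simplifications, not Svace's engine:

```python
# Strong vs. weak updates in a toy abstract memory model with set-union merge.
class Memory:
    def __init__(self):
        self.store = {}                       # abstract location -> set of values

    def write(self, pointees, value):
        """pointees: abstract locations the pointer may refer to."""
        if len(pointees) == 1:                # target known exactly:
            loc = next(iter(pointees))        # strong update replaces the value
            self.store[loc] = {value}
        else:                                 # several possible targets:
            for loc in pointees:              # weak update keeps old values too
                self.store.setdefault(loc, set()).add(value)

    def merge(self, other):
        """Join two states at a control-flow merge point (set union)."""
        merged = Memory()
        for loc in self.store.keys() | other.store.keys():
            merged.store[loc] = self.store.get(loc, set()) | other.store.get(loc, set())
        return merged

m = Memory()
m.write({"a"}, 1)          # *p = 1 where p -> {a}: strong update
m.write({"a", "b"}, 2)     # *q = 2 where q -> {a, b}: weak update
print(m.store)             # {'a': {1, 2}, 'b': {2}}
```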


1996 ◽  
Vol 79 (3) ◽  
pp. 939-945 ◽  
Author(s):  
Cooper B. Holmes ◽  
Megan J. Beishline

Combined Verbal and Quantitative GRE scores were obtained from the records of 24 former students of a master's degree program (from a total of 128 students) who had either successfully completed a doctorate in psychology or withdrawn from a psychology doctoral program. The success rate under GRE-based classification was calculated using both a cut-off of 1000 and a cut-off of 1100. The results indicated a high false negative rate: that is, many students whose GRE scores would not have predicted success nevertheless obtained a Ph.D.
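As a hypothetical illustration of this cut-off classification, the sketch below computes the false negative rate among successful students; the scores and outcomes are invented, and only the arithmetic mirrors the study's design:

```python
# Hypothetical data: (combined GRE score, completed doctorate?)
def false_negative_rate(scores_outcomes, cutoff):
    """FN = completed a doctorate despite scoring below the cutoff."""
    successes = [(s, ok) for s, ok in scores_outcomes if ok]
    fn = sum(1 for s, ok in successes if s < cutoff)
    return fn / len(successes)

sample = [(980, True), (1050, True), (1120, True), (940, True), (1150, False)]
for cutoff in (1000, 1100):
    print(cutoff, f"{false_negative_rate(sample, cutoff):.2f}")
# cutoff 1000 -> 0.50 of successes misclassified; cutoff 1100 -> 0.75
```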


Cybersecurity ◽  
2020 ◽  
Vol 3 (1) ◽  
Author(s):  
Lili Xu ◽  
Mingjie Xu ◽  
Feng Li ◽  
Wei Huo

Abstract The Integer-Overflow-to-Buffer-Overflow (IO2BO) vulnerability has been widely exploited by attackers to cause severe damage to computer systems. Automatically identifying this kind of vulnerability is critical for software security. Although much work has been done to mitigate integer overflow, existing tools either report a large number of false positives or introduce unacceptable time consumption. To address this problem, in this article we present a static analysis framework. It first constructs an inter-procedural call graph and utilizes taint analysis to accurately identify potential IO2BO vulnerabilities. Then it uses a lightweight method to further filter out false positives. Specifically, it generates constraints representing the conditions under which a potential IO2BO vulnerability can be triggered, and feeds the constraints to an SMT solver to decide their satisfiability. We have implemented a prototype system, ELAID, based on LLVM, and evaluated it on 228 programs from NIST's SAMATE Juliet test suite and 14 known IO2BO vulnerabilities in real-world applications. The experimental results show that our system can effectively and efficiently detect all known IO2BO vulnerabilities.
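The constraint-generation step can be sketched with Z3's Python bindings (the z3-solver package). The guard and allocation pattern below are a hypothetical IO2BO instance, not code from ELAID; it shows why a signed bounds check does not rule out wrap-around of a product fed to an allocator:

```python
from z3 import BitVec, ZeroExt, Solver, sat

n = BitVec("n", 32)          # tainted 32-bit count coming from program input
elem_size = 8                # bytes per element in a hypothetical malloc(n * 8)

solver = Solver()
solver.add(n < 1024)         # the program's own (signed) sanity check

# Overflow condition: the 32-bit product differs from the exact 64-bit one.
prod32 = n * elem_size
prod64 = ZeroExt(32, n) * elem_size
solver.add(ZeroExt(32, prod32) != prod64)

if solver.check() == sat:    # sat: the guard does not prevent wrap-around,
    print("IO2BO feasible, witness n =", solver.model()[n])
else:                        # unsat: the warning would be filtered out
    print("infeasible: discarded as a false positive")
```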


Author(s):  
Tran Ngoc Thinh ◽  
Cuong Pham-Quoc ◽  
Biet Nguyen-Hoang ◽  
Thuy-Chau Tran-Thi ◽  
Chien Do-Minh ◽  
...  

In this paper, we propose a novel FPGA-based high-speed DDoS countermeasure system that can flexibly adapt to DDoS attacks while maintaining system performance. The system includes a packet decoder module and multiple DDoS countermeasure mechanisms. We apply the dynamic partial reconfiguration technique so that the countermeasure mechanisms can be flexibly changed or updated on the fly. The proposed system architecture separates the DDoS protection modules (which implement DDoS countermeasure techniques) from the packet decoder module, so one DDoS protection module can be reconfigured without interfering with other modules. The proposed system is implemented on a NetFPGA-10G board. The synthesis results show that the system can work at up to 116.782 MHz while utilizing up to 39.9% of the registers and 49.85% of the BlockRAM of the Xilinx Virtex-5 xc5vtx240t FPGA device on the NetFPGA-10G board. The system achieves a detection rate of 100%, a false negative rate of 0%, and a false positive rate close to 0.16%. The prototype achieves a packet decoding throughput of 9.869 Gbps in half-duplex mode and 19.738 Gbps in full-duplex mode.
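As a rough software analogy of this design (it does not model the hardware), the sketch below shows the key architectural idea: protection modules are swapped at runtime while the decoder and the remaining modules keep running. All names are invented:

```python
# Software analogy of hot-swappable countermeasure modules; mirrors the idea
# of dynamic partial reconfiguration, not its FPGA implementation.
class DDoSPipeline:
    def __init__(self):
        self.modules = {}                    # slot name -> countermeasure fn

    def reconfigure(self, slot, module):
        """Swap one protection module without touching the others."""
        self.modules[slot] = module

    def process(self, packet):
        fields = {"src": packet[0], "syn": packet[1]}        # toy packet decoder
        return any(m(fields) for m in self.modules.values()) # True = drop

pipeline = DDoSPipeline()
pipeline.reconfigure("syn_flood", lambda f: f["syn"] and f["src"] == "spoofed")
print(pipeline.process(("spoofed", True)))   # True: dropped
pipeline.reconfigure("syn_flood", lambda f: False)  # module updated on the fly
print(pipeline.process(("spoofed", True)))   # False: new module passes it
```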


Financial crisis is a serious problem for organizations and individual investors alike when investing in financial institutions such as banks and fund development institutions. Hence, a reliable system for early financial crisis prediction is essential to avoid investment in weak financial institutions that might go bankrupt. This paper focuses on designing a hybrid optimized algorithm, the Hybrid Unified Machine Classifier (HUMC), based on machine learning techniques. HUMC identifies categorical and continuous variables in a financial crisis dataset and produces a confusion matrix from which performance measures such as accuracy, F-score, sensitivity, specificity, false positive rate (FPR), and false negative rate (FNR) are derived. On the Australian credit dataset, baseline machine learning classifiers (Decision Tree, PART, Naive Bayes, RBF Network, and Multilayer Perceptron) achieved accuracies of 85.50%, 83.62%, 77.24%, 82.75%, and 84.93%, respectively. HUMC combines classification features from a decision tree with hidden-node identification and a boosting technique to enhance prediction performance. Its design combines the strengths of classification trees and neural networks: the first stage finds categorization criteria in the dataset, and the second stage captures hidden continuous structure. HUMC was implemented and tested in MATLAB. The results showed that HUMC achieved higher accuracy (86.25%) than the other classifier models, along with improvements in the other performance measures. Thus, the algorithm improves financial crisis prediction with good performance.
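HUMC itself is not publicly available, so the sketch below only shows how the listed performance measures are derived from a confusion matrix; the counts are hypothetical:

```python
# Performance measures from a confusion matrix; counts below are invented.
def metrics(tp, fp, fn, tn):
    acc = (tp + tn) / (tp + fp + fn + tn)
    sens = tp / (tp + fn)                 # sensitivity (recall)
    spec = tn / (tn + fp)                 # specificity
    prec = tp / (tp + fp)
    f1 = 2 * prec * sens / (prec + sens)  # F-score
    fpr = fp / (fp + tn)                  # false positive rate = 1 - specificity
    fnr = fn / (fn + tp)                  # false negative rate = 1 - sensitivity
    return dict(accuracy=acc, f_score=f1, sensitivity=sens,
                specificity=spec, FPR=fpr, FNR=fnr)

# Hypothetical counts for a credit-risk classifier:
print(metrics(tp=250, fp=40, fn=55, tn=345))
```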


Methodology ◽  
2019 ◽  
Vol 15 (3) ◽  
pp. 97-105
Author(s):  
Rodrigo Ferrer ◽  
Antonio Pardo

Abstract. In a recent paper, Ferrer and Pardo (2014) tested several distribution-based methods designed to assess when test scores obtained before and after an intervention reflect a statistically reliable change. However, we still do not know how these methods perform from the point of view of false negatives. For this purpose, we simulated change scenarios (different effect sizes in a pre-post-test design) with distributions of different shapes and with different sample sizes. For each simulated scenario, we generated 1,000 samples. In each sample, we recorded the false-negative rate of the five distribution-based methods with the best performance from the point of view of false positives. Our results revealed unacceptable rates of false negatives even for effects of very large size, ranging from 31.8% in an optimistic scenario (effect size of 2.0 and a normal distribution) to 99.9% in the worst scenario (effect size of 0.2 and a highly skewed distribution). Our results therefore suggest that the widely used distribution-based methods must be applied with caution in a clinical context, because they need huge effect sizes to detect a true change. We also offer some considerations regarding effect size and the commonly used cut-off points that allow more precise estimates.
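A simplified version of such a simulation is sketched below, using the Jacobson-Truax reliable change index (RCI) as one representative distribution-based method. The reliability, effect size, and error model are our assumptions, so the resulting rate will not match the paper's figures exactly:

```python
# False-negative rate of the RCI under a simulated true change.
import numpy as np

rng = np.random.default_rng(42)
r_xx = 0.8                                # test-retest reliability (assumed)
effect = 2.0                              # true change, in SD units
n, n_samples = 50, 1000

err_sd = np.sqrt(1 - r_xx)                # measurement-error SD when total SD = 1
sdiff = np.sqrt(2) * err_sd               # SD of difference scores under no change

fn_rate = 0.0
for _ in range(n_samples):
    true = rng.normal(0, np.sqrt(r_xx), n)
    pre = true + rng.normal(0, err_sd, n)
    post = true + effect + rng.normal(0, err_sd, n)
    rci = (post - pre) / sdiff            # Jacobson-Truax reliable change index
    fn_rate += np.mean(np.abs(rci) <= 1.96)  # changed cases not detected

print(f"false-negative rate: {fn_rate / n_samples:.3f}")
```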


Author(s):  
Brian M. Katt ◽  
Casey Imbergamo ◽  
Fortunato Padua ◽  
Joseph Leider ◽  
Daniel Fletcher ◽  
...  

Abstract Introduction There is a known false negative rate when using electrodiagnostic studies (EDS) to diagnose carpal tunnel syndrome (CTS). This can pose a management dilemma for patients with signs and symptoms that correlate with CTS but normal EDS. While corticosteroid injection into the carpal tunnel has been used in this setting for diagnostic purposes, there is little data in the literature supporting this practice. The purpose of this study is to evaluate the prognostic value of a carpal tunnel corticosteroid injection in patients with a normal electrodiagnostic study but signs and symptoms suggestive of carpal tunnel syndrome who proceed with a carpal tunnel release. Materials and Methods The group included 34 patients presenting to an academic orthopedic practice over the years 2010 to 2019 who had negative EDS, a carpal tunnel corticosteroid injection, and a carpal tunnel release. One patient (2.9%) whose response to the corticosteroid injection was not documented was excluded from the study, yielding a study cohort of 33 patients. Three patients had bilateral disease, yielding 36 hands for evaluation. Statistical analysis was performed using chi-square analysis for nonparametric data. Results Thirty-two hands (88.9%) demonstrated complete or partial relief of neuropathic symptoms after the corticosteroid injection, while four (11.1%) did not experience any improvement. Thirty-one hands (86.1%) had symptom improvement following surgery, compared with five (13.9%) which did not. Of the 32 hands that demonstrated relief following the injection, 29 (90.6%) improved after surgery. Of the four hands that did not demonstrate relief after the injection, two (50%) improved after surgery. This difference was statistically significant (p = 0.03). Conclusion Patients diagnosed with a high index of suspicion for CTS do well with operative intervention despite a normal electrodiagnostic test if they have had a positive response to a preoperative injection. The injection can provide reassurance to both the patient and surgeon before proceeding to surgery. Although patients with a normal electrodiagnostic test and no response to cortisone can still do well with surgical intervention, the surgeon should carefully review both the history and physical examination, as surgical success may decrease when both diagnostic tests are negative. A corticosteroid injection is an additional diagnostic tool to consider in the management of patients with CTS and normal electrodiagnostic testing.
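For readers who want to retrace the statistics, the 2x2 comparison can be recomputed from the counts in the abstract with scipy. Whether the original analysis applied a continuity correction is not stated; without one, the result is close to the reported p = 0.03:

```python
# Chi-square test on the 2x2 table implied by the counts reported above.
from scipy.stats import chi2_contingency

#            improved  not improved   (after surgery)
table = [[29, 3],   # relief after injection (32 hands)
         [2, 2]]    # no relief after injection (4 hands)

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")   # p is approximately 0.03
```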


2020 ◽  
Vol 22 (1) ◽  
pp. 25-29
Author(s):  
Zubayer Ahmad ◽  
Mohammad Ali ◽  
Kazi lsrat Jahan ◽  
ABM Khurshid Alam ◽  
G M Morshed

Background: Biliary disease is one of the most common surgical problems encountered all over the world. Ultrasound is widely accepted for the diagnosis of biliary system disease. However, it is a highly operator-dependent imaging modality, and its diagnostic success is also influenced by factors such as non-fasting status, obesity, and intestinal gas. Objective: To compare ultrasonographic findings with peroperative findings in biliary surgery. Methods: This prospective study was conducted in the General Hospital, Comilla, between July 2006 and June 2008 among 300 patients with biliary diseases for whom operative treatment was planned. Sonographic findings were compared with operative findings. Results: Right hypochondriac pain and jaundice were the two most significant symptoms (93% and 15%). Right hypochondriac tenderness, jaundice, and a palpable gallbladder were the most valuable physical findings (40%, 15%, and 5%, respectively). Of 252 ultrasonographically positive gallbladders, stones were confirmed in 249 cases at operation. The sensitivity of USG in diagnosing gallstone disease was 100%; the false positive rate, however, was 25%, giving a specificity of 75%. USG could demonstrate stones in the common bile duct in only 12 out of 30 cases: the sensitivity of the test in diagnosing common bile duct stones was 40%, with a false negative rate of 60%. In this series, ultrasonography was 100% sensitive in diagnosing stones in the cystic duct. USG detected chronic cholecystitis with relatively good sensitivity (92.3%) and worms inside the gallbladder with lower sensitivity (50%). Conclusion: Ultrasonography is the most important investigation in the diagnosis of biliary disease and a useful test for patients undergoing operative management, for planning and anticipating technical difficulties. Journal of Surgical Sciences (2018) Vol. 22 (1): 25-29
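The common bile duct figures follow directly from the reported counts; the snippet below just makes the sensitivity/false-negative arithmetic explicit:

```python
# Sensitivity and false negative rate from the reported CBD stone counts.
def sensitivity_fnr(detected, total_with_disease):
    sens = detected / total_with_disease
    return sens, 1 - sens          # false negative rate = 1 - sensitivity

sens, fnr = sensitivity_fnr(detected=12, total_with_disease=30)
print(f"sensitivity = {sens:.0%}, false negative rate = {fnr:.0%}")
# sensitivity = 40%, false negative rate = 60%
```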


Author(s):  
Yongmei Liu ◽  
Rajen Dias

Abstract The study presented here shows that infrared thermography has the potential to be a nondestructive analysis tool for evaluating package sublayer defects. Thermal imaging is achieved by applying pulsed external heating to the package surface and monitoring the surface thermal response as a function of time with a high-speed IR camera. Since the thermal response of the surface is affected by defects such as voids and delamination below the package surface, the technique can be used to assist in the detection and analysis of package defects.
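The pulsed-thermography principle can be illustrated with a toy cooling-curve model: heat trapped above a subsurface void makes that region cool more slowly, and the contrast against a defect-free reference region reveals the defect. The time constants below are invented for illustration:

```python
# Toy cooling curves after a heat pulse; a void slows heat dissipation.
import numpy as np

t = np.linspace(0.01, 2.0, 200)               # seconds after the heat pulse
sound = 20 + 30 * np.exp(-t / 0.3)            # surface over a defect-free area
over_void = 20 + 30 * np.exp(-t / 0.5)        # heat trapped: slower cooling

contrast = over_void - sound                  # thermal contrast vs. time
t_peak = t[np.argmax(contrast)]
print(f"peak contrast {contrast.max():.1f} deg C at t = {t_peak:.2f} s")
```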

