An Evaluation of Container Security Vulnerability Detection Tools

2021
Author(s): Omar Javed, Salman Toor
Author(s): Subhasish Goswami, Rabijit Singh, Nayanjeet Saikia, Kaushik Kumar Bora, Utpal Sharma

PLoS ONE, 2019, Vol 14 (8), pp. e0221530
Author(s): Yuancheng Li, Longqiang Ma, Liang Shen, Junfeng Lv, Pan Zhang

2019, Vol 9 (23), pp. 5100
Author(s): Congxi Song, Xu Zhou, Qidi Yin, Xinglu He, Hangwei Zhang, ...

Fuzzing is an effective technique for software testing and security vulnerability detection. Unfortunately, fuzzing is extremely compute-intensive: finding a single bug may take thousands of computing hours. Most recent work improves fuzzing efficiency by developing more sophisticated algorithms. In this paper, we propose a complementary direction: leveraging parallel computing to improve fuzzing efficiency. To this end, we develop P-fuzz, a parallel fuzzing framework that can harness massive, distributed computing resources. P-fuzz uses a database to share fuzzing status such as seeds and coverage information. All fuzzing nodes fetch tasks from the database and write their fuzzing status back to it. P-fuzz also handles the data races and exceptions that arise in parallel fuzzing. In our experiments, we compare P-fuzz with AFL and with Roving, another parallel fuzzing framework. The results show that, using 4 nodes, P-fuzz speeds up AFL by about 2.59× and Roving by about 1.66× on average.
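The shared-status coordination the abstract describes can be sketched as follows. This is a minimal illustration only: the class and function names are invented for this sketch, and an in-process lock stands in for P-fuzz's actual database; it is not the framework's real implementation.

```python
import threading


class FuzzStatusStore:
    """Toy stand-in for the shared database that holds seeds and coverage."""

    def __init__(self):
        self._lock = threading.Lock()
        self._seeds = []        # pending seeds: the "tasks" nodes fetch
        self._coverage = set()  # coverage IDs observed globally so far

    def push_seed(self, seed):
        with self._lock:
            self._seeds.append(seed)

    def pop_seed(self):
        # Atomic take: prevents the data race of two nodes claiming one seed.
        with self._lock:
            return self._seeds.pop(0) if self._seeds else None

    def report_coverage(self, edges):
        # Returns only the edges that are new globally; only inputs that
        # add coverage are worth sharing back as fresh seeds.
        with self._lock:
            new = set(edges) - self._coverage
            self._coverage |= new
            return new


def fuzz_node(store, mutate, execute, rounds):
    """One fuzzing node: fetch a seed, mutate it, run it, report back."""
    for _ in range(rounds):
        seed = store.pop_seed()
        if seed is None:
            break
        child = mutate(seed)
        edges = execute(child)           # coverage produced by this input
        if store.report_coverage(edges):
            store.push_seed(child)       # new coverage: share with all nodes
```

In a real deployment the store would be a networked database and `fuzz_node` would run on each worker machine; the point here is only the fetch-task / update-status cycle the abstract outlines.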


2020, Vol 10 (24), pp. 9119
Author(s): Francesc Mateo Tudela, Juan-Ramón Bermejo Higuera, Javier Bermejo Higuera, Juan-Antonio Sicilia Montalvo, Michael I. Argyros

The techniques and algorithms used by static, dynamic and interactive security testing tools differ in design, so each tool detects, to a greater or lesser extent, the types of vulnerability it was designed for. Their different designs also mean that they produce different rates of false positives. To exploit the possible synergies between the different types of analysis tools, this paper combines several static, dynamic and interactive analysis security testing tools: static white-box security analysis (SAST), dynamic black-box security analysis (DAST) and interactive white-box security analysis (IAST), respectively. The aim is to investigate how to improve the effectiveness of security vulnerability detection while reducing the number of false positives. Specifically, two static, two dynamic and two interactive security analysis tools are combined to study their behavior, using a specific benchmark for the OWASP Top Ten security vulnerabilities and considering various scenarios of different criticality for the applications analyzed. Finally, the study analyzes and discusses the values of the selected metrics applied to the results of each n-tool combination.
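One way to picture combining findings from several tools is a simple voting rule over a labeled benchmark: reporting the union of all tools maximizes detection, while requiring agreement among k tools trims false positives. This k-of-n vote is an illustrative combination strategy chosen for this sketch, not necessarily the rule the paper applies, and all names below are invented.

```python
from collections import Counter


def combine_findings(tool_reports, min_votes=2):
    """Report only findings flagged by at least `min_votes` tools.

    tool_reports: one set of hashable finding IDs per tool.
    min_votes=1 is the union (max detection); higher thresholds trade
    detection rate for fewer false positives.
    """
    votes = Counter()
    for report in tool_reports:
        votes.update(report)
    return {finding for finding, n in votes.items() if n >= min_votes}


def score(reported, ground_truth):
    """Benchmark-style tally: (true positives, false positives)."""
    true_pos = len(reported & ground_truth)
    false_pos = len(reported - ground_truth)
    return true_pos, false_pos
```

For example, if one tool's spurious finding is not confirmed by any other tool, a 2-of-n vote drops it while keeping every vulnerability that two or more tools agree on.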

