Benchmarking Approach to Compare Web Applications Static Analysis Tools Detecting OWASP Top Ten Security Vulnerabilities

2020 ◽  
Vol 64 (3) ◽  
pp. 1555-1577 ◽  
Author(s):  
Juan R. Bermejo Higuera ◽  
Javier Bermejo Higuera ◽  
Juan A. Sicilia Montalvo ◽  
Javier Cubo Villalba ◽  
Juan José Nombela Pérez
Computing ◽  
2018 ◽  
Vol 101 (2) ◽  
pp. 161-185 ◽  
Author(s):  
Paulo Nunes ◽  
Ibéria Medeiros ◽  
José Fonseca ◽  
Nuno Neves ◽  
Miguel Correia ◽  
...  

SQL injection vulnerabilities have been predominant in database-driven web applications for almost a decade. Exploiting such vulnerabilities enables attackers to gain unauthorized access to back-end databases by altering the original SQL statements through manipulated user input. Testing web applications to identify SQL injection vulnerabilities before deployment is essential to eliminate them. However, checking for such vulnerabilities by hand is tedious, difficult, and time-consuming. Web vulnerability static analysis tools automatically identify the root cause of SQL injection vulnerabilities in web application source code. In this paper, we test and evaluate three free/open-source static analysis tools on eight web applications with numerous known vulnerabilities, focusing primarily on false negative rates. The evaluation results were compared and analysed, and they indicate a need to improve the tools.
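As an illustration of the root-cause pattern such tools flag (a minimal sketch, not code from the paper; class and method names are hypothetical), the Java snippet below contrasts a query built by string concatenation, which a static analyzer reports as a SQL injection sink, with the parameterized alternative.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

// Hypothetical example of the SQL injection pattern static analyzers report.
public class UserLookup {

    // BAD: user input flows into the SQL string, so input such as
    // "' OR '1'='1" alters the original statement.
    static ResultSet findUserUnsafe(Connection conn, String name) throws SQLException {
        Statement stmt = conn.createStatement();
        return stmt.executeQuery("SELECT * FROM users WHERE name = '" + name + "'");
    }

    // GOOD: a parameterized query keeps the input as data, not SQL.
    static ResultSet findUserSafe(Connection conn, String name) throws SQLException {
        PreparedStatement stmt = conn.prepareStatement("SELECT * FROM users WHERE name = ?");
        stmt.setString(1, name);
        return stmt.executeQuery();
    }
}
```

On a benchmark of applications with known vulnerabilities, a tool's false negative rate is the fraction of seeded flaws like findUserUnsafe that it fails to report.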


Author(s):  
Marco Pistoia ◽  
Omer Tripp ◽  
David Lubensky

Mobile devices have revolutionized many aspects of our lives. Without realizing it, we often run programs on them that access and transmit private information over the network. Integrity concerns arise when mobile applications use untrusted data as input to security-sensitive computations. Program-analysis tools for integrity and confidentiality enforcement have become a necessity. Static-analysis tools are particularly attractive because they do not require installing and executing the program, and they have the potential of never missing any vulnerability. Nevertheless, such tools often have high false-positive rates. To reduce the number of false positives, static analysis has to be very precise, but precision conflicts with performance and scalability because it requires a more refined model of the application. This chapter proposes Phoenix, a novel solution that combines static analysis with machine learning to identify programs exhibiting suspicious operations. This approach has been widely applied to mobile applications, obtaining impressive results.
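The abstract does not describe Phoenix's internals, so the following is only a hedged sketch of the general idea of ranking static-analysis warnings with a learned score and filtering out low-confidence ones. The Finding record, its features, and the score threshold are all hypothetical, not Phoenix's actual design.

```java
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical sketch: filter static-analysis warnings with a learned score.
public class WarningFilter {

    // A finding from the static analyzer, with simple features an ML model might use.
    record Finding(String sink, int pathLength, boolean crossesSanitizer, double modelScore) {}

    // Keep only findings the (pre-trained, hypothetical) model rates above a threshold.
    static List<Finding> filter(List<Finding> findings, double threshold) {
        return findings.stream()
                .filter(f -> f.modelScore() >= threshold)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Finding> raw = List.of(
                new Finding("Socket.send", 3, false, 0.91), // likely a true leak
                new Finding("Log.d", 12, true, 0.08));      // sanitized; likely noise
        System.out.println(filter(raw, 0.5)); // prints only the high-confidence finding
    }
}
```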


Author(s):  
Pietro Ferrara ◽  
Amit Kr Mandal ◽  
Agostino Cortesi ◽  
Fausto Spoto

The Open Web Application Security Project (OWASP) released the "OWASP Top 10 Internet of Things 2018" list of high-priority security vulnerabilities for IoT systems. The diversity of these vulnerabilities poses a great challenge to the development of a robust solution for their detection and mitigation. In this paper, we discuss the relationship between these vulnerabilities and the ones listed in the OWASP Top 10 (which focuses on Web applications rather than IoT systems), how these vulnerabilities can actually be exploited, and in which cases static analysis can help in preventing them. Then, we present an extension of an industrial analyzer (Julia) that already covers five out of the top seven vulnerabilities of the OWASP Top 10, and we discuss which IoT Top 10 vulnerabilities might be detected by the existing analyses or their extension. The experimental results present the application of some of Julia's existing analyses and their extension to IoT systems, showing the effectiveness of the analysis on some representative case studies.
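As one concrete (hypothetical, not drawn from the paper) instance of an IoT Top 10 issue that static analysis can catch, the snippet below hardcodes a device password, the constant-credential pattern behind item I1 ("Weak, Guessable, or Hardcoded Passwords") of the 2018 list, alongside a per-device alternative.

```java
// Hypothetical illustration of OWASP IoT Top 10 item I1 (hardcoded passwords),
// a pattern a static analyzer can flag by tracking credential constants.
public class DeviceAuth {

    // BAD: the credential is a compile-time constant, identical on every device.
    private static final String PASSWORD = "admin123";

    static boolean checkUnsafe(String attempt) {
        return PASSWORD.equals(attempt);
    }

    // BETTER: read a per-device secret provisioned at deployment time.
    static boolean checkSafe(String attempt) {
        String secret = System.getenv("DEVICE_SECRET");
        return secret != null && secret.equals(attempt);
    }
}
```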


2021 ◽  

Many security vulnerabilities can be detected by static analysis. This paper is a case study and a performance comparison of four open-source static analysis tools and plugins (PMD, SpotBugs, Find Security Bugs, and SonarQube) on Java source code. Experiments have been conducted on the widely used Juliet Test Suite with respect to six selected weaknesses from the official Top 25 list of the Common Weakness Enumeration. In this study, analysis metrics have been calculated to help Java developers decide which tools to use when checking their programs for security vulnerabilities. The results show that particular weaknesses are best detected with particular tools.
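The paper's exact metric definitions are not reproduced here; the sketch below merely shows the standard detection metrics usually computed in such comparisons (recall, precision, F-measure) from a tool's true positive, false positive, and false negative counts on a labelled suite such as Juliet. The counts are made-up placeholders.

```java
// Standard detection metrics commonly used when scoring tools on a
// labelled suite such as Juliet; the counts below are placeholders.
public class ToolMetrics {

    static double recall(int tp, int fn)    { return (double) tp / (tp + fn); }
    static double precision(int tp, int fp) { return (double) tp / (tp + fp); }

    static double fMeasure(int tp, int fp, int fn) {
        double p = precision(tp, fp), r = recall(tp, fn);
        return 2 * p * r / (p + r);
    }

    public static void main(String[] args) {
        // e.g. a tool that found 80 of 100 seeded flaws with 20 spurious reports
        int tp = 80, fp = 20, fn = 20;
        System.out.printf("recall=%.2f precision=%.2f F1=%.2f%n",
                recall(tp, fn), precision(tp, fp), fMeasure(tp, fp, fn));
    }
}
```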


2020 ◽  
Vol 10 (24) ◽  
pp. 9119
Author(s):  
Francesc Mateo Tudela ◽  
Juan-Ramón Bermejo Higuera ◽  
Javier Bermejo Higuera ◽  
Juan-Antonio Sicilia Montalvo ◽  
Michael I. Argyros

The techniques and algorithms used by static, dynamic, and interactive security testing tools differ in design. Therefore, each tool detects, to a greater or lesser extent, each type of vulnerability it is designed for. In addition, their different designs mean that they produce different percentages of false positives. In order to take advantage of the possible synergies between different types of analysis tools, this paper combines several static, dynamic, and interactive analysis security testing tools: static white-box security analysis (SAST), dynamic black-box security analysis (DAST), and interactive white-box security analysis (IAST), respectively. The aim is to investigate how to improve the effectiveness of security vulnerability detection while reducing the number of false positives. Specifically, two static, two dynamic, and two interactive security analysis tools are combined to study their behavior using a specific benchmark for the OWASP Top Ten security vulnerabilities, taking into account various scenarios of different criticality for the applications analyzed. Finally, this study analyzes and discusses the values of the selected metrics applied to the results of each n-tools combination.
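The paper's combination scheme and metric values are its own; as a generic illustration of why combining tools helps, the sketch below takes the union of two tools' findings (better detection, more false positives) and their intersection (fewer false positives, possible missed flaws). All finding identifiers are hypothetical.

```java
import java.util.HashSet;
import java.util.Set;

// Generic sketch of combining findings from two analysis tools.
// Union favors detection; intersection favors fewer false positives.
public class ToolCombination {

    public static void main(String[] args) {
        Set<String> sast = Set.of("SQLi@login.jsp:42", "XSS@search.jsp:17");
        Set<String> dast = Set.of("SQLi@login.jsp:42", "PathTraversal@download.jsp:9");

        Set<String> union = new HashSet<>(sast);
        union.addAll(dast);           // everything either tool reported

        Set<String> intersection = new HashSet<>(sast);
        intersection.retainAll(dast); // only findings both tools agree on

        System.out.println("union: " + union);
        System.out.println("intersection: " + intersection);
    }
}
```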


Cybersecurity ◽  
2021 ◽  
Vol 4 (1) ◽  
Author(s):  
Roee S. Leon ◽  
Michael Kiperberg ◽  
Anat Anatey Leon Zabag ◽  
Nezer Jacob Zaidenberg

Malware analysis is a task of utmost importance in cyber-security. Two approaches exist for malware analysis: static and dynamic. Modern malware uses an abundance of techniques to evade both dynamic and static analysis tools. Current dynamic analysis solutions either make modifications to the running malware or use a higher-privilege component that performs the actual analysis. The former can easily be detected by sophisticated malware, while the latter often induces a significant performance overhead. We propose a method that performs malware analysis within the context of the OS itself. Furthermore, the analysis component is camouflaged by a hypervisor, which makes it completely transparent to the running OS and its applications. The evaluation of the system's efficiency suggests that the induced performance overhead is negligible.

