Evaluating State-of-the-Art Free and Open Source Static Analysis Tools Against Buffer Errors in Android Apps

Author(s): Bushra Aloraini, Meiyappan Nagappan
Author(s): Tao Zhang, Wenjun Hu, Xiapu Luo, Xiaobo Ma

Recently, there has been consistent growth in the number of Android applications (apps). Under these circumstances, software maintenance for Android apps becomes an essential and important task. The core of software maintenance is to locate bugs in source files. Previous bug localization approaches mainly focus on open-source desktop software (e.g., Eclipse, Mozilla, GCC). Even though a few studies locate bugs in Android apps, they are dedicated to a single app, ZXing, and do not develop a general method for locating bugs in Android apps that takes into account the unique characteristics of Android apps' bug reports. Such characteristics include a smaller number of historical bug reports, insufficiently detailed descriptions, etc. These characteristics hinder existing localization approaches from being directly applied to Android apps, because the lack of sufficient information degrades the performance of localization approaches that rely on historical bug reports. Commit messages, in contrast, contain richer information that can provide details about reported bugs. Therefore, in this paper, we propose a novel information retrieval-based approach that utilizes commit messages to locate new bugs in Android apps. This approach not only considers the structured textual similarity between the given bug and the candidate source files, but also computes the unstructured textual similarities between the new bug and the commit messages linked to the corresponding source files. According to the experimental results on 10 popular open-source Android apps hosted on GitHub, our approach outperforms the state-of-the-art bug localization methods BugLocator, BLUiR, and the two-phase model.
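As an illustration of the scoring idea described above, the following is a minimal sketch in Java, assuming a plain bag-of-words cosine similarity and a linear combination with a weight alpha; the class and method names (BugLocalizationSketch, scoreFile) and the weighting scheme are hypothetical and not the paper's exact model.

import java.util.*;

public class BugLocalizationSketch {

    // Bag-of-words cosine similarity between two token lists.
    static double cosine(List<String> a, List<String> b) {
        Map<String, Integer> fa = termFreq(a), fb = termFreq(b);
        double dot = 0, na = 0, nb = 0;
        for (Map.Entry<String, Integer> e : fa.entrySet()) {
            dot += e.getValue() * fb.getOrDefault(e.getKey(), 0);
            na += e.getValue() * e.getValue();
        }
        for (int v : fb.values()) nb += v * v;
        return (na == 0 || nb == 0) ? 0 : dot / (Math.sqrt(na) * Math.sqrt(nb));
    }

    static Map<String, Integer> termFreq(List<String> tokens) {
        Map<String, Integer> tf = new HashMap<>();
        for (String t : tokens) tf.merge(t.toLowerCase(), 1, Integer::sum);
        return tf;
    }

    // Combined score: structured similarity to the candidate source file plus
    // unstructured similarity to its linked commit messages (alpha is hypothetical).
    static double scoreFile(List<String> bugReport, List<String> sourceFile,
                            List<String> commitMessages, double alpha) {
        return alpha * cosine(bugReport, sourceFile)
             + (1 - alpha) * cosine(bugReport, commitMessages);
    }

    public static void main(String[] args) {
        List<String> bug = Arrays.asList("crash", "rotate", "camera", "preview");
        List<String> file = Arrays.asList("camera", "preview", "surface", "holder");
        List<String> commits = Arrays.asList("fix", "crash", "on", "rotate", "camera");
        System.out.printf("score = %.3f%n", scoreFile(bug, file, commits, 0.6));
    }
}

In this sketch a higher alpha favors the source-file text over the commit messages; the paper's actual combination of structured and unstructured similarities may differ.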


Author(s): Lucas Torri, Guilherme Fachini, Leonardo Steinfeld, Vesmar Camara, Luigi Carro, ...

2021, Vol 24 (3), pp. 1-37
Author(s): Amit Seal Ami, Kaushal Kafle, Kevin Moran, Adwait Nadkarni, Denys Poshyvanyk

Mobile application security has been a major area of focus for security research over the course of the last decade. Numerous application analysis tools have been proposed in response to malicious, curious, or vulnerable apps. However, existing tools, and specifically static analysis tools, trade soundness of the analysis for precision and performance and are hence soundy. Unfortunately, the specific unsound choices or flaws in the design of these tools are often not known or well documented, leading to misplaced confidence among researchers, developers, and users. This article describes the Mutation-Based Soundness Evaluation (μSE) framework, which systematically evaluates Android static analysis tools to discover, document, and fix flaws by leveraging the well-founded practice of mutation analysis. We implemented μSE and applied it to a set of prominent Android static analysis tools that detect private data leaks in apps. In a previously conducted study, we used μSE to discover 13 previously undocumented flaws in FlowDroid, one of the most prominent data leak detectors for Android apps. Moreover, we discovered that these flaws also propagated to other tools that build upon the design or implementation of FlowDroid or its components. This article substantially extends our μSE framework and offers a new in-depth analysis of two more major tools in our 2020 study; we find 12 new, undocumented flaws and demonstrate that all 25 flaws are found in more than one tool, regardless of any inheritance relation among the tools. Our results motivate the need for systematic discovery and documentation of unsound choices in soundy tools and demonstrate the opportunities in leveraging mutation testing to achieve this goal.
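To illustrate the kind of security operator a mutation-based evaluation injects, below is a hedged Java/Android example of a data-leak mutant in the style of the classic source/sink pair used by data leak detectors such as FlowDroid: a sensitive source (the device ID) flows to a log sink inside a lifecycle callback. The class name LeakMutantActivity is hypothetical, and the snippet assumes the Android SDK and the READ_PHONE_STATE permission; a sound data-leak detector would be expected to flag this flow.

import android.app.Activity;
import android.content.Context;
import android.os.Bundle;
import android.telephony.TelephonyManager;
import android.util.Log;

// Illustrative data-leak mutant of the kind a mutation-based evaluation could
// inject into an app under test; not an actual μSE-generated mutant.
public class LeakMutantActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        TelephonyManager tm =
                (TelephonyManager) getSystemService(Context.TELEPHONY_SERVICE);
        String deviceId = tm.getDeviceId();  // taint source: sensitive identifier
        Log.d("LEAK", deviceId);             // taint sink: data written to the log
    }
}

If an analysis tool fails to report the injected flow, the surviving mutant points to a specific unsound choice or flaw worth documenting.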


2021

Abstract: Many security vulnerabilities can be detected by static analysis. This paper is a case study and performance comparison of four open-source static analysis tools and plugins (PMD, SpotBugs, Find Security Bugs, and SonarQube) on Java source code. Experiments were conducted on the widely used Juliet Test Suite with respect to six selected weaknesses from the official Common Weakness Enumeration (CWE) Top 25 list. In this study, analysis metrics were calculated to help Java developers decide which tools can be used when checking their programs for security vulnerabilities. It turned out that particular weaknesses are best detected with particular tools.
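As a sketch of the kind of per-tool metrics such a comparison relies on, the following Java snippet computes precision, recall, and F1 for a tool's reported findings against the ground-truth flawed test cases of a Juliet CWE category; the Set-based inputs and the test-case identifiers are assumed simplifications, not the paper's actual evaluation pipeline.

import java.util.Set;

public class DetectionMetrics {

    // Returns {precision, recall, F1} for one tool on one CWE category.
    static double[] evaluate(Set<String> reported, Set<String> trueFlaws) {
        long tp = reported.stream().filter(trueFlaws::contains).count();
        long fp = reported.size() - tp;   // reported but not actually flawed
        long fn = trueFlaws.size() - tp;  // flawed but missed by the tool
        double precision = (tp + fp == 0) ? 0 : (double) tp / (tp + fp);
        double recall    = (tp + fn == 0) ? 0 : (double) tp / (tp + fn);
        double f1 = (precision + recall == 0) ? 0
                : 2 * precision * recall / (precision + recall);
        return new double[]{precision, recall, f1};
    }

    public static void main(String[] args) {
        // Hypothetical identifiers, for illustration only.
        Set<String> reported = Set.of("CWE89_case01", "CWE89_case03", "CWE89_good02");
        Set<String> flawed   = Set.of("CWE89_case01", "CWE89_case02", "CWE89_case03");
        double[] m = evaluate(reported, flawed);
        System.out.printf("precision=%.2f recall=%.2f f1=%.2f%n", m[0], m[1], m[2]);
    }
}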


Cybersecurity, 2021, Vol 4 (1)
Author(s): Roee S. Leon, Michael Kiperberg, Anat Anatey Leon Zabag, Nezer Jacob Zaidenberg

Abstract: Malware analysis is a task of utmost importance in cyber-security. Two approaches exist for malware analysis: static and dynamic. Modern malware uses an abundance of techniques to evade both dynamic and static analysis tools. Current dynamic analysis solutions either make modifications to the running malware or use a higher-privilege component that does the actual analysis. The former can be easily detected by sophisticated malware, while the latter often induces a significant performance overhead. We propose a method that performs malware analysis within the context of the OS itself. Furthermore, the analysis component is camouflaged by a hypervisor, which makes it completely transparent to the running OS and its applications. The evaluation of the system's efficiency suggests that the induced performance overhead is negligible.

