Deep Dive into Directory Traversal and File Inclusion Attacks Leading to Privilege Escalation

Author(s):  
Mrunalsinh Chawda ◽  
Dr. Priyanka Sharma ◽  
Mr. Jatin Patel

In modern web applications, a directory traversal vulnerability can allow an attacker to read arbitrary files, including sensitive ones, and exploiting such vulnerabilities or misconfigurations can ultimately yield root privileges. When building a web application, ensure that arbitrary files are not publicly reachable through the production server. File inclusion vulnerabilities exploit the dynamic file-include mechanism that exists in many programming frameworks: a local file inclusion occurs when uncontrolled user input, such as form values or headers, is used to construct a file-include path. By exploiting directory traversal on a web server, and by chaining it with code injection, an attacker can upload a shell to the web server and carry out a website defacement attack. The remote variant (remote file inclusion) abuses vulnerable website parameters by including a URL reference to remotely hosted malicious code, allowing remote code execution and leading to a privilege escalation attack.
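As a minimal illustration (not from the paper) of the flaw described above, the following Python sketch contrasts an include path built directly from user input with a fix that resolves the path and confirms it stays inside an allowed base directory; the TEMPLATE_DIR constant and function names are assumptions made for the example:

import os

TEMPLATE_DIR = "/var/www/app/templates"  # illustrative base directory

def load_template_unsafe(name: str) -> str:
    # Vulnerable: a name like "../../../../etc/passwd" escapes TEMPLATE_DIR.
    with open(os.path.join(TEMPLATE_DIR, name)) as f:
        return f.read()

def load_template_safe(name: str) -> str:
    # Resolve the final path and reject anything outside TEMPLATE_DIR.
    path = os.path.realpath(os.path.join(TEMPLATE_DIR, name))
    if not path.startswith(os.path.realpath(TEMPLATE_DIR) + os.sep):
        raise PermissionError("path traversal attempt blocked")
    with open(path) as f:
        return f.read()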

Organizational web servers reflect the public image of an organization and serve web pages and information to the organization's clients via web browsers using the HTTP protocol. Some web server software may host web applications that enable users to perform high-level tasks, such as querying a database and delivering the output through the web server to the client browser as an HTML file. Hackers constantly try to exploit vulnerabilities or flaws in web servers and web applications, which can pose a serious threat to an organization. This chapter explains the importance of protecting web servers and applications, along with the different tools used for analyzing the security of web servers and web applications. The chapter also introduces the different web attacks carried out by an attacker either to gain illegal access to web server data or to reduce the availability of web services. These attacks include denial-of-service (DoS) attacks, buffer overflow exploits, website defacement with SQL injection (SQLi) attacks, cross-site scripting (XSS) attacks, remote file inclusion (RFI) attacks, directory traversal attacks, phishing attacks, brute force attacks, source code disclosure attacks, session hijacking, parameter/form tampering, man-in-the-middle (MITM) attacks, HTTP response splitting attacks, cross-site request forgery (XSRF), lightweight directory access protocol (LDAP) attacks, and hidden field manipulation attacks. The chapter describes different web server and web application testing tools and vulnerability scanners, including Nikto, Burp Suite, Paros, IBM AppScan, Fortify, Acunetix, and ZAP. Finally, the chapter discusses countermeasures to be implemented while designing any web application for any organization in order to reduce risk.
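To make one of the listed countermeasures concrete, here is a minimal sketch (not from the chapter) of a parameterized query, the standard defense against the SQLi attacks enumerated above; the table, column, and payload are invented for illustration:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # classic SQLi payload

# Vulnerable pattern: string concatenation lets the payload rewrite the query.
# rows = conn.execute("SELECT role FROM users WHERE name = '" + user_input + "'")

# Countermeasure: a parameterized query treats the payload as plain data.
rows = conn.execute("SELECT role FROM users WHERE name = ?", (user_input,))
print(rows.fetchall())  # [] -- the injection string matches no user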


Author(s):  
Ibrahim Mahmood Ibrahim ◽  
Siddeeq Y. Ameen ◽  
Hajar Maseeh Yasin ◽  
Naaman Omar ◽  
Shakir Fattah Kak ◽  
...  

Today, web services are increasing rapidly and are accessed by many users, leading to massive traffic on the Internet. The web server suffers under this load, and it becomes challenging to manage the total traffic as the number of users grows: the server becomes overloaded, response times degrade, and bottlenecks appear, so this massive traffic must be shared among several servers. Load balancing technologies and server clusters are therefore potent methods for dealing with server bottlenecks. Load balancing techniques distribute the load among the servers in a cluster so that all web servers stay balanced. The motivation of this paper is to give an overview of the several load balancing techniques used to enhance the efficiency of web servers in terms of response time, throughput, and resource utilization. Different algorithms have been addressed by researchers with good results; for example, the pending-job and IP-hash algorithms achieve better performance.
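As an illustration of the IP-hash technique the survey cites, the following sketch (not drawn from any of the surveyed systems) maps a client address to a fixed backend by hashing it; the backend pool is a placeholder:

import hashlib

BACKENDS = ["web1:8080", "web2:8080", "web3:8080"]  # placeholder server pool

def pick_backend(client_ip: str) -> str:
    # Hash the source address so a given client always lands on the same
    # server, which keeps sessions sticky without any shared state.
    digest = hashlib.md5(client_ip.encode()).digest()
    index = int.from_bytes(digest[:4], "big") % len(BACKENDS)
    return BACKENDS[index]

for ip in ["10.0.0.1", "10.0.0.2", "10.0.0.1"]:
    print(ip, "->", pick_backend(ip))  # same IP maps to the same backend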


2018 ◽  
Vol 7 (3.6) ◽  
pp. 106
Author(s):  
B J. Santhosh Kumar ◽  
Kankanala Pujitha

The application uses URLs as input for web application vulnerability detection. If the URL is too long, scanning it consumes more time (Ain Zubaidah et al. 2014). Existing systems can examine individual web pages but not the overall web application; this application tests URLs of any length using a string matching algorithm. To avoid XSS and CSRF, and to detect attacks that try to sidestep browser-enforced policies, it uses whitelisting and DOM sandboxing techniques (Elias Athanasopoulos et al. 2012). The web application incorporates a list of cryptographic hashes of legitimate (trusted) client-side scripts: if the hash of a script is found in the whitelist, the script is considered trusted; otherwise it is not. The application uses SHA-1 to create the message digest. The web server stores trusted scripts inside div or span HTML elements whose attributes mark them as trusted. DOM sandboxing helps in identifying the script or code by partitioning program symbols into code and non-code, which exposes any hidden code inside a trusted tag that would otherwise bypass the web server. The application also scans the website to detect injection locations, injects malicious XSS attack vectors at those injection points, and checks for these attacks in the vulnerable web application (Shashank Gupta et al. 2015). The proposed application improves the false negative rate.
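The hash-whitelist check described above can be sketched as follows; this is an illustrative reconstruction, not the authors' code, and the sample script and whitelist contents are invented:

import hashlib

# Whitelist of SHA-1 digests of the client-side scripts the server trusts.
TRUSTED_HASHES = {
    hashlib.sha1(b"renderMenu();").hexdigest(),
}

def is_trusted(script_body: bytes) -> bool:
    # A script is accepted only if its digest appears in the whitelist;
    # an injected payload produces an unknown digest and is rejected.
    return hashlib.sha1(script_body).hexdigest() in TRUSTED_HASHES

print(is_trusted(b"renderMenu();"))             # True  - whitelisted script
print(is_trusted(b"document.location='evil'"))  # False - unknown script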


Author(s):  
Dilip Singh Sisodia

Web robots are autonomous software agents used for crawling websites in a mechanized way for both non-malicious and malicious reasons. With the popularity of Web 2.0 services, web robots are proliferating and growing in sophistication, and web servers are flooded with their access requests. Web access requests are recorded in the form of web server logs, which contain significant knowledge about the access patterns of visitors. The presence of web robot requests in log repositories distorts the actual access patterns of human visitors, yet those human access patterns are potentially useful for enhancing services for greater user satisfaction or for optimizing server resources. In this chapter, the correlative access patterns of human visitors and web robots are discussed using the web server access logs of a portal.
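A first step in such studies, separating robot from human requests in an access log, can be sketched with a simple User-Agent keyword heuristic; the log records and keyword list below are illustrative, and the chapter's actual analysis uses richer access-pattern features:

# Minimal split of access-log records into robot vs. human requests.
ROBOT_KEYWORDS = ("bot", "crawler", "spider", "slurp")

log_records = [
    ("10.0.0.5", "/index.html", "Mozilla/5.0 (Windows NT 10.0)"),
    ("66.249.66.1", "/index.html", "Googlebot/2.1 (+http://www.google.com/bot.html)"),
]

robots, humans = [], []
for ip, path, agent in log_records:
    bucket = robots if any(k in agent.lower() for k in ROBOT_KEYWORDS) else humans
    bucket.append((ip, path))

print("robot requests:", robots)
print("human requests:", humans)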


Respati ◽  
2020 ◽  
Vol 15 (2) ◽  
pp. 6
Author(s):  
Lukman Lukman ◽  
Melati Suci

Network security on the web server is the most important part of guaranteeing integrity and service for users. Web servers are often the target of attacks that result in data damage. One such attack is the SYN flood, a type of denial-of-service (DoS) attack that sends massive numbers of SYN requests to the web server. To strengthen web server network security, an Intrusion Detection System (IDS) is applied to detect, monitor, and analyze attacks on the web server. The IDS software most often used is Snort and Suricata, each of which has its own advantages and disadvantages. The purpose of this study is to compare the two IDSs on the Linux operating system by testing with a SYN flood attack against the web server; Snort and Suricata, installed on the web server, issue a warning when an attack occurs. In determining the results of the comparison, the reference parameters are the number of attacks detected and the attack-detection effectiveness of the two IDSs. Keywords: Network Security, Web Server, IDS, SYN Flood, Snort, Suricata.
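The detection idea both IDSs apply to SYN floods can be sketched as a per-source SYN counter over a sliding time window; this is a toy reconstruction, not Snort or Suricata internals, and the window and threshold values are invented:

from collections import defaultdict, deque

WINDOW = 10.0   # seconds (illustrative)
LIMIT = 100     # SYNs from one source within WINDOW before alerting

syn_times = defaultdict(deque)

def on_syn(src_ip: str, now: float) -> None:
    # Record the SYN, drop timestamps older than the window, and alert
    # once a single source exceeds the limit within that window.
    times = syn_times[src_ip]
    times.append(now)
    while times and now - times[0] > WINDOW:
        times.popleft()
    if len(times) > LIMIT:
        print(f"ALERT: possible SYN flood from {src_ip}")

# Simulated burst: 150 SYNs from one address in under a second.
for i in range(150):
    on_syn("192.0.2.10", i * 0.005)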


2021 ◽  
Vol 5 (1) ◽  
pp. 132-138
Author(s):  
Hataw Jalal Mohammed ◽  
Kamaran Hama Ali Faraj

The web servers WSGI (Python) and Apache (PHP) sit in the middleware tier of a three-tier architecture: the middleware lies between the frontend tier and the backend tier, connecting the two. Two e-learning systems were designed with two different dynamic web technologies, the first with Python and WSGI and the second with PHP (Personal Home Page) on Apache. The two websites were built with these open-source, cross-platform web programming languages, Python and PHP, with the same structure and weight, and their performance was evaluated over two different operating systems: 1) Windows-16 and 2) Linux Ubuntu 20.04. Both systems run on the same computer architecture (64-bit) on the server side with a common MySQL backend web database. The middleware for PHP is the cross-platform Apache MySQL PHP Perl stack (XAMPP), while the middleware for Python is PyCharm and the Web Server Gateway Interface (WSGI). WSGI and Apache are both web servers, and this paper shows which of them has the better response time (RT). The experimental results demonstrate that although Python-WSGI is heavier in Mbytes than PHP-Apache, Python is still faster and more accurate than PHP. The SPG was designed with hand-written code, once in PHP source code and once in Python source code. The Python-WSGI and PHP-Apache results are compared by the lowest response time in milliseconds, taking enhanced performance into account.
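A client-side response-time comparison of the kind reported here can be sketched with Python's standard library; the two endpoint URLs below are placeholders for the PHP-Apache and Python-WSGI deployments:

import time
import urllib.request

def mean_response_ms(url: str, runs: int = 20) -> float:
    # Issue repeated GETs and average the wall-clock latency.
    total = 0.0
    for _ in range(runs):
        start = time.perf_counter()
        urllib.request.urlopen(url).read()
        total += time.perf_counter() - start
    return total / runs * 1000.0

# Placeholder endpoints for the two deployments under test.
for url in ("http://localhost:8080/spg.php", "http://localhost:8000/spg"):
    print(url, f"{mean_response_ms(url):.1f} ms")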


Author(s):  
Apurva Solanki ◽  
Aryan Parekh ◽  
Gaurav Chawda ◽  
Mrs. Geetha S.

Day by day, the number of users on the Internet is increasing, and web servers need to serve requests constantly; compared to past years, requests on the web have surged exponentially this year due to a global pandemic and lockdowns in various countries. The complexity of configuring a web server also increases as development continues. In this paper, we propose the Lightron web server, which is highly scalable and can serve many requests at a time. Additionally, to spare users the configuration of the web server, we introduce a beginner-friendly graphical user interface.
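The abstract does not show Lightron's implementation; purely as a generic illustration of serving many requests concurrently, here is a minimal asyncio responder in Python (not Lightron's code):

import asyncio

async def handle(reader: asyncio.StreamReader, writer: asyncio.StreamWriter) -> None:
    # Read the request head, answer, and close; the event loop keeps many
    # such connections in flight on a single thread.
    await reader.readuntil(b"\r\n\r\n")
    writer.write(b"HTTP/1.1 200 OK\r\nContent-Length: 2\r\n\r\nok")
    await writer.drain()
    writer.close()

async def main() -> None:
    server = await asyncio.start_server(handle, "127.0.0.1", 8080)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())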


2014 ◽  
Vol 3 (4) ◽  
pp. 1-16 ◽  
Author(s):  
Harikesh Singh ◽  
Shishir Kumar

Load balancing applications introduce delays due to load relocation among web servers, and these delays depend upon the design of the balancing algorithm and the resources to be shared in large, wide-area applications. The performance of web servers depends upon efficient sharing of resources, and it can be evaluated by the overall completion time of the tasks under the load balancing algorithm. Each load balancing algorithm introduces a delay in task allocation among the web servers, yet still improves the performance of the web servers dynamically. As a result, the queue length of a web server and the average waiting time of tasks decrease with load balancing under zero, deterministic, and random types of delay. In this paper, the effects of delay due to load balancing have been analyzed in terms of two factors: average queue length and average waiting time of tasks. In the proposed Ratio Factor Based Delay Model (RFBDM), these factors are minimized, improving the functioning of the web server system based on the average task completion time of each web server node. Based on the ratio of average task completion times, the average queue length and average waiting time of the tasks allocated to the web server have been analyzed and simulated with Monte-Carlo simulation. The simulation results show that the effects of delays in terms of average queue length and average waiting time under the proposed model are smaller than in existing delay models of web servers.
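In the spirit of the paper's Monte-Carlo evaluation, the following toy simulation estimates the average queue length and average waiting time of tasks at a single server; the arrival and service probabilities are invented, and this is not the RFBDM itself:

import random

random.seed(1)

ARRIVAL_P = 0.45   # probability a task arrives in a time slot (illustrative)
SERVICE_P = 0.55   # probability the head task finishes in a slot (illustrative)

def simulate(slots: int = 100_000):
    queue, queue_len_sum, waits = [], 0, []
    for t in range(slots):
        if random.random() < ARRIVAL_P:
            queue.append(t)                 # record the arrival time
        if queue and random.random() < SERVICE_P:
            waits.append(t - queue.pop(0))  # waiting time of finished task
        queue_len_sum += len(queue)
    return queue_len_sum / slots, sum(waits) / len(waits)

avg_q, avg_wait = simulate()
print(f"average queue length: {avg_q:.2f}, average wait: {avg_wait:.2f} slots")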


2021 ◽  
Vol 17 (2) ◽  
pp. 58-65
Author(s):  
Iman Khazal ◽  
Mohammed Hussain

Cross-Site Scripting (XSS) is one of the most common and dangerous attacks. The user is the target of an XSS attack, but the attacker reaches the user by exploiting an XSS vulnerability in a web application as a bridge. There are three types of XSS attacks: reflected, stored, and DOM-based. This paper focuses on the stored-XSS attack, the most dangerous of the three, in which the attacker injects a malicious script into the web application and saves it in the website repository. This paper proposes a method to detect and prevent stored XSS. The Prevent Stored-XSS Server (PSS) is proposed as a server that tests and sanitizes input to web applications before saving it in the database. Any user input must be checked to see whether it contains a malicious script; if so, the input is sanitized, and the sanitized version is saved in the database instead of the harmful input. The PSS was tested using a vulnerable open-source web application and succeeded both in detection, identifying the harmful script within the input, and in prevention, sanitizing the input, with an average time of 0.3 seconds.
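The check-then-sanitize step the PSS performs before storage can be sketched as follows; the detection pattern is deliberately simplified and is not the paper's actual filter:

import html
import re

# Simplified detector for script content (the paper's filter is richer).
SCRIPT_PATTERN = re.compile(r"<\s*script|on\w+\s*=|javascript:", re.IGNORECASE)

def sanitize_before_store(user_input: str) -> str:
    # If the input looks like a stored-XSS payload, escape it so the
    # database receives inert text instead of executable markup.
    if SCRIPT_PATTERN.search(user_input):
        return html.escape(user_input)
    return user_input

print(sanitize_before_store("hello world"))
print(sanitize_before_store("<script>steal(document.cookie)</script>"))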

