Web Server Performance Improvement Using Dynamic Load Balancing Techniques: A Review

Author(s):  
Ibrahim Mahmood Ibrahim ◽  
Siddeeq Y. Ameen ◽  
Hajar Maseeh Yasin ◽  
Naaman Omar ◽  
Shakir Fattah Kak ◽  
...  

Web services have grown rapidly and are accessed by many users, generating massive traffic on the Internet. Web servers suffer under this load, and managing the total traffic becomes challenging as the number of users grows: an overloaded server shows longer response times and becomes a bottleneck, so the traffic must be shared among several servers. Load balancing technologies and server clusters are therefore effective methods for dealing with server bottlenecks. Load balancing techniques distribute the load among the servers in a cluster so that all web servers remain balanced. The motivation of this paper is to give an overview of the load balancing techniques used to enhance the efficiency of web servers in terms of response time, throughput, and resource utilization. Several algorithms have been addressed by researchers with good results; for example, the pending-job and IP-hash algorithms achieve better performance.
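As an illustration of one of the algorithms named above, the following is a minimal Python sketch (not taken from any of the reviewed papers) of IP-hash dispatching: a client's IP address is hashed to pick a backend, so the same client keeps reaching the same server. The server names and client addresses are hypothetical.

```python
# Minimal IP-hash sketch: hash the client address onto a fixed backend pool.
import hashlib

SERVERS = ["web1", "web2", "web3"]  # assumed backend pool

def pick_by_ip_hash(client_ip: str) -> str:
    digest = hashlib.md5(client_ip.encode()).hexdigest()
    return SERVERS[int(digest, 16) % len(SERVERS)]

print(pick_by_ip_hash("203.0.113.7"))   # always maps this IP to the same backend
print(pick_by_ip_hash("203.0.113.42"))
```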

2013 ◽  
Vol 1 (2) ◽  
pp. 28
Author(s):  
Dite Ardian ◽  
Adian Fatchur Rochim ◽  
Eko Didik Widianto

The development of Internet technology has led many organizations to expand their website services. Initially a single web server, accessible to everyone over the Internet, is used, but when the number of users accessing it grows large, the traffic load overwhelms the server. Web server optimization is therefore necessary to cope with the overload received when traffic is high. The methodology of this final-project research includes a literature study, system design, and system testing; references were drawn from related books and several Internet sources. The design uses HAProxy and Pound as load balancers for the web servers. The research concludes with testing of the network system, so as to produce a web server system that is reliable and safe. The result is a web server system that can be accessed by many users simultaneously and rapidly, with HAProxy and Pound set up as a load-balancing front end, creating a web server with high performance and high availability.
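For illustration only (not part of the original thesis), the sketch below shows in Python what such a front end does conceptually: it accepts HTTP GET requests and forwards each one to the next backend in round-robin order. The backend addresses and listening port are assumptions; HAProxy and Pound provide this behaviour, plus health checks and failover, in production.

```python
# Minimal round-robin HTTP front-end sketch with assumed backend addresses.
import itertools
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

BACKENDS = ["http://127.0.0.1:8081", "http://127.0.0.1:8082"]  # assumed backend servers
_next_backend = itertools.cycle(BACKENDS)

class RoundRobinFrontEnd(BaseHTTPRequestHandler):
    def do_GET(self):
        backend = next(_next_backend)                 # pick the next backend in turn
        with urllib.request.urlopen(backend + self.path) as resp:
            body = resp.read()
        self.send_response(resp.status)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # listen on port 8080 (assumed) and spread requests over the backends
    ThreadingHTTPServer(("0.0.0.0", 8080), RoundRobinFrontEnd).serve_forever()
```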


Author(s):  
Apurva Solanki ◽  
Aryan Parekh ◽  
Gaurav Chawda ◽  
Mrs. Geetha S.

Day by day, the number of users on the Internet is increasing and web servers need to cater to their requests constantly; compared with past years, requests on the web have surged this year due to the global pandemic and lockdowns in various countries. The complexity of configuring a web server also increases as development continues. In this paper, we propose the Lightron web server, which is highly scalable and can cater to many requests at a time. Additionally, to spare users the complexity of configuring the web server, we introduce a beginner-friendly graphical user interface.


2014 ◽  
Vol 3 (4) ◽  
pp. 1-16 ◽  
Author(s):  
Harikesh Singh ◽  
Shishir Kumar

Load balancing applications introduce delays due to load relocation among web servers, and these delays depend on the design of the balancing algorithm and the resources to be shared in large, wide-scale applications. The performance of web servers depends on efficient sharing of resources and can be evaluated by the overall completion time of the tasks under the load balancing algorithm. Each load balancing algorithm introduces a delay in allocating tasks among the web servers, yet still improves web server performance dynamically. As a result, the queue length of a web server and the average waiting time of tasks decrease at load balancing instants under zero, deterministic, and random types of delay. In this paper, the effects of delay due to load balancing are analyzed in terms of two factors: average queue length and average waiting time of tasks. In the proposed Ratio Factor Based Delay Model (RFBDM), these factors are minimized and the functioning of the web server system is improved based on the average task completion time of each web server node. Based on the ratio of average task completion times, the average queue length and average waiting time of the tasks allocated to the web servers have been analyzed and simulated with Monte Carlo simulation. The simulation results show that the effects of delay, in terms of average queue length and average waiting time, are smaller under the proposed model than under existing delay models of web servers.
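The following is a small Monte Carlo sketch in the spirit of this kind of analysis, not the paper's RFBDM itself: tasks arrive at a dispatcher, are sent to the least-loaded server with a fixed relocation delay added, and the average waiting time plus a queue-length proxy are reported for several delay values. All parameters are illustrative.

```python
# Toy Monte Carlo experiment: effect of a fixed load-balancing delay on
# average waiting time and on how many servers are busy at each arrival.
import random

def simulate(n_tasks=10_000, n_servers=4, service_time=1.0,
             balancing_delay=0.1, seed=42):
    random.seed(seed)
    next_free = [0.0] * n_servers          # time each server finishes its backlog
    arrival, total_wait, busy_samples = 0.0, 0.0, []
    for _ in range(n_tasks):
        arrival += random.expovariate(0.8 * n_servers / service_time)  # Poisson arrivals
        target = min(range(n_servers), key=lambda i: next_free[i])     # least-loaded server
        start = max(arrival, next_free[target]) + balancing_delay      # relocation delay
        total_wait += start - arrival
        next_free[target] = start + random.expovariate(1.0 / service_time)
        busy_samples.append(sum(t > arrival for t in next_free))       # queue-length proxy
    return total_wait / n_tasks, sum(busy_samples) / len(busy_samples)

for delay in (0.0, 0.1, 0.5):              # zero and deterministic-style delays
    wait, busy = simulate(balancing_delay=delay)
    print(f"delay={delay:.1f}  avg wait={wait:.3f}  avg busy servers={busy:.2f}")
```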


2013 ◽  
Vol 8 (2) ◽  
Author(s):  
Jefry Alvonsius Rabu ◽  
Joko Purwadi ◽  
Willy Sudiarto Raharjo

The web server plays a vital role in serving requests from clients. As the number of Internet users grows rapidly, the number of requests increases significantly, reducing the overall performance of the web server. One solution is to implement a load balancing system. In this paper, we implement load balancing with Linux Virtual Server (LVS) to distribute the load to several machines within a cluster, and compare the Round Robin and Least Connection algorithms. The study reveals that load balancing using LVS-NAT can double throughput compared to a single web server, with less significant gains in response time and CPU utilization. Implementing LVS-NAT with the Round Robin algorithm is more robust in optimizing throughput, CPU utilization, and response time than Least Connection.
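The two schedulers compared here can each be summarized in a line or two; the Python sketch below (with hypothetical server names and connection counts) shows the selection rule of each, while LVS implements them inside the kernel.

```python
# Selection rules of the two schedulers compared above (illustrative only).
import itertools

servers = ["web1", "web2", "web3"]            # hypothetical backend pool
active = {s: 0 for s in servers}              # current connection count per server
_rr = itertools.cycle(servers)

def pick_round_robin():
    # each new connection goes to the next server in a fixed rotation
    return next(_rr)

def pick_least_connection():
    # each new connection goes to the server with the fewest active connections
    return min(servers, key=lambda s: active[s])

active["web2"] = 5                            # toy state: web2 is already busy
print(pick_round_robin(), pick_least_connection())
```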


2019 ◽  
Vol 6 (2) ◽  
pp. 211
Author(s):  
Dodon Turianto Nugrahadi ◽  
Rudy Herteno ◽  
Muhammad Anshari

The rapid development of technology, the increase in web-based systems, and the development of microcontroller devices have an impact on the ability of web servers to respond to client requests. This study aims to analyze the round-robin load balancing method and tuning, and their influence on response time and on the number of clients that can be handled when serving clients on a web server running on a microcontroller device. In StressTool testing, response times of 2064, 2331.4, and 1869.2 ms were obtained without load balancing, and 2270, 2306.2, and 2202 ms with load balancing, from 700 requests served by the web servers. It is concluded that web server response times with load balancing are smaller than without load balancing. Furthermore, with tuning, a response time of 3103.4 ms was obtained from 1100 requests, so tuning can reduce response time and increase the number of requests served. A significance-level calculation shows that the tuning configuration has a significant effect on response time and on the number of clients on the microcontroller.

Keywords: Web server, Raspberry, Load balancing, Response time, Stresstool, Jmeter.
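For context, the following minimal Python sketch performs the kind of response-time measurement reported above: it sends a number of sequential requests to a test URL and reports the mean latency in milliseconds. The URL and request count are placeholders; the study itself used StressTool/JMeter.

```python
# Toy response-time measurement against a placeholder URL.
import time
import urllib.request

def average_response_time(url, requests=100):
    total = 0.0
    for _ in range(requests):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()                       # wait for the full body
        total += time.perf_counter() - start
    return 1000 * total / requests            # mean latency in milliseconds

print(f"{average_response_time('http://127.0.0.1:8080/'):.1f} ms")  # placeholder URL
```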


2010 ◽  
Vol 7 (1) ◽  
pp. 153-162 ◽  
Author(s):  
Lin Zhang ◽  
Li Xiao-Ping ◽  
Su Yuan

Considering the variety of Web requests and the heterogeneity of Web servers, this paper presents a content-based load balancing algorithm. The mechanism of the algorithm is that each request is allocated to the server with the lowest load, according to the request's effect on the server combined with the server's load state. In addition, a probability-based random distribution method assigns each request to an appropriate server according to the servers' weights. All parameters used in the algorithm can be acquired through simulated tests. Experimental results suggest that the algorithm balances the load of web server clusters effectively, makes full use of existing software and hardware resources, and substantially improves server performance.
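A minimal sketch of the probability-based random distribution described here might look like the following, where each server's weight is derived from its (assumed) load and requests are drawn in proportion to those weights; the weight formula and load values are illustrative, not the paper's.

```python
# Weighted random server selection: lighter load -> higher selection probability.
import random
from collections import Counter

loads = {"web1": 0.2, "web2": 0.5, "web3": 0.8}         # assumed current load per server
weights = {s: 1.0 - load for s, load in loads.items()}  # illustrative weight formula

def pick_server():
    servers = list(weights)
    return random.choices(servers, weights=[weights[s] for s in servers], k=1)[0]

# over many draws the least-loaded server receives the largest share of requests
print(Counter(pick_server() for _ in range(10_000)))
```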


2017 ◽  
Vol 6 (1) ◽  
pp. 55
Author(s):  
Molavi Arman ◽  
Novan Wijaya ◽  
Hafiz Irsyad

In industry, many companies use the web for promotion or for online transactions such as e-commerce, with web servers handling millions of hits. Many things can cause a hit to fail; one of them is a volume of requests or transactions the web server cannot handle, or slow responses, which is dangerous and costly for the company. A single high-end, very expensive web server is one way to face overload, but only certain companies can afford it. The issue addressed here is how several inexpensive PCs can be used as web servers by implementing network load balancing technology. Network load balancing is expected to handle simultaneous web server load with a small failure rate. Web server performance under the round robin and least connection algorithms is measured with parameters such as throughput, request loss, response time, and CPU utilization; from these measurements, the better algorithm to use can be seen.
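As an illustration of the throughput and request-loss measurements mentioned, the Python sketch below fires a batch of concurrent requests at a front-end address and reports completed requests per second plus the number of failures; the address, request count, and worker count are assumptions.

```python
# Toy throughput / request-loss measurement against an assumed front-end address.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            resp.read()
        return True
    except OSError:
        return False          # counts as a lost request

def measure(url, requests=500, workers=50):
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(fetch, [url] * requests))
    elapsed = time.perf_counter() - start
    return sum(results) / elapsed, results.count(False)

rps, lost = measure("http://127.0.0.1/")   # hypothetical front-end address
print(f"throughput: {rps:.1f} req/s, lost requests: {lost}")
```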


Author(s):  
Intan Ferina Irza ◽  
Zulhendra Zulhendra ◽  
Efrizon Efrizon

The Internet is developing rapidly in the era of globalization; anyone, anywhere, can access it given adequate tools and connections. Accessing the web involves a two-way relationship between client and server, and good web server performance affects the quality of that relationship. Two web servers widely used today are Apache and Nginx. As a content provider, a server is expected to meet all user needs, especially in terms of performance. To show how Apache and Nginx compare when handling user data requests, it is necessary to test and compare the parameters of each web server. Based on the problems above, the authors analyze and compare the performance of Apache and Nginx so users can choose the better web server. The comparison covers throughput, connections, requests, replies, and errors, applying load in each test, performed virtually on attributes of beritalinux.com. After testing, the results show that Nginx was superior to Apache in responding to and serving the data requested by clients. Based on these results, the administrator of beritalinux.com is advised to use the Nginx web server for better website performance.

Keywords: Analysis, Performance, Web Server, Apache, Nginx, HTTPERF

