Survey on: A variety of AQM algorithm schemas and intelligent techniques developed for congestion control

Author(s):  
Amar A. Mahawish ◽  
Hassan J. Hassan

Congestion on the Internet is the main issue affecting the performance of data transmission over the network. A congestion control algorithm is required to keep any network efficient and reliable for transferring users' traffic. Many algorithms have been suggested over the years to improve congestion control in the network, such as drop-tail queueing. Recently, many algorithms have been developed to overcome the drawbacks of the drop-tail procedure. One of the most important is active queue management (AQM), which provides efficient congestion control by reducing packet drops; this technique is considered a base for many other congestion control schemes. It works in the network core (the router), controlling the dropping and marking of packets in the router's buffer before congestion sets in. In this study, a comprehensive survey is made of the AQM schemes that have been proposed and of the modifications made to them to achieve the best performance, classifying AQM algorithms by whether they are based on queue length, queue delay, or both. The advantages and limitations of each algorithm are discussed, as are the intelligent techniques combined with AQM algorithms to optimize their operation. Finally, the algorithms are compared on different metrics to identify the weaknesses and strengths of each.
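As an illustration of the queue-length-based AQM idea surveyed above, the following is a minimal sketch of a RED-style (Random Early Detection) queue in Python; the thresholds, EWMA weight, and maximum drop probability are illustrative assumptions, not parameters taken from the survey.

```python
import random

class REDQueue:
    """Illustrative RED-style AQM: drop probability grows with the average queue length."""

    def __init__(self, min_th=5, max_th=15, max_p=0.1, weight=0.002):
        self.min_th = min_th      # below this average length, never drop
        self.max_th = max_th      # above this average length, always drop
        self.max_p = max_p        # drop probability reached at max_th
        self.weight = weight      # EWMA weight for the average queue length
        self.avg = 0.0
        self.queue = []

    def enqueue(self, packet):
        # Exponentially weighted moving average of the instantaneous queue length
        self.avg = (1 - self.weight) * self.avg + self.weight * len(self.queue)
        if self.avg < self.min_th:
            drop = False
        elif self.avg >= self.max_th:
            drop = True
        else:
            # Drop probability rises linearly between the two thresholds
            p = self.max_p * (self.avg - self.min_th) / (self.max_th - self.min_th)
            drop = random.random() < p
        if not drop:
            self.queue.append(packet)
        return not drop
```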

Sensors ◽  
2019 ◽  
Vol 19 (16) ◽  
pp. 3484 ◽  
Author(s):  
Jiashuai Wang ◽  
Xiaoping Yang ◽  
Ying Liu ◽  
Zhihong Qian

Existing hop-by-hop congestion control algorithms are mainly divided into two categories: those that improve the sending rate and those that suppress the receiving rate. However, these congestion control algorithms have problems with validity and limitations, and an unreasonable congestion-mitigation method can even paralyze the network. In this paper, we present a contention-based hop-by-hop bidirectional congestion control algorithm (HBCC). This algorithm uses a congestion detection method with queue length as the parameter. By detecting the queue length of the current node and of the next-hop node, the congestion condition can be classified into four categories: 0–0, 0–1, 1–0, and 1–1 (0 means no congestion, 1 means congestion). When at least one of the two nodes is congested, the HBCC algorithm adaptively adjusts the contention window of the current node, which changes the priority with which the current node accesses the channel and thereby reduces the buffer queue length of the congested node. When the congestion condition is 1–1, the hop-by-hop priority congestion control (HPCC) method proposed in this paper is used: it adaptively changes the degree of adjustment of the current node's contention window and raises the priority of congestion processing at the next-hop node. NS2 simulations show that, compared with the distributed coordination function (DCF) without congestion control, the proposed unidirectional algorithms hop-by-hop receiving-based congestion control (HRCC) and hop-by-hop sending-based congestion control (HSCC), and the existing congestion alleviation MAC (CA-MAC) algorithm, HBCC increases the average saturation throughput by approximately 90%, 62%, 12%, and 62%, respectively, and reduces the buffer overflow loss ratio by approximately 80%, 79%, 44%, and 79%.
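The two-node congestion-state logic described above can be sketched as follows; the occupancy threshold, contention-window bounds, and adjustment factors are illustrative assumptions rather than the values used in the paper.

```python
def congestion_state(queue_len, buffer_size, threshold=0.8):
    """Return 1 (congested) if buffer occupancy exceeds the threshold, else 0."""
    return 1 if queue_len >= threshold * buffer_size else 0

def adjust_contention_window(cw, own_state, next_hop_state,
                             cw_min=32, cw_max=1024, factor=2):
    """Illustrative bidirectional adjustment of the current node's contention window.

    0-0: leave the window unchanged.
    1-0: own queue is backlogged, so shrink CW to drain it faster.
    0-1: next hop is congested, so enlarge CW to slow down forwarding.
    1-1: both congested; favour the next hop (HPCC-like priority) with a larger increase.
    """
    if own_state == 0 and next_hop_state == 0:
        return cw
    if own_state == 1 and next_hop_state == 0:
        return max(cw_min, cw // factor)
    if own_state == 0 and next_hop_state == 1:
        return min(cw_max, cw * factor)
    # 1-1: both nodes congested
    return min(cw_max, cw * factor * 2)
```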


2017 ◽  
Vol 14 (2) ◽  
pp. 77
Author(s):  
Rastri Prathivi ◽  
Vensy Vydia

Worm attacks have become a dangerous threat that causes damage in the Internet. When worms and trojans attack a network, they disrupt traffic data and consume and waste bandwidth, making the Internet connection slow. Detecting worms and trojans on the Internet, especially new or hidden variants, is still a challenging problem. Worm and trojan attacks generally occur in computer networks or on the Internet where the level of security is low and hosts are vulnerable to infection. Worm and trojan attacks can be detected and analyzed by examining anomalies in Internet traffic and in the Internet protocol addresses being accessed.

This research applied the C4.5 and Bayesian Network methods experimentally to classify anomalies in Internet network traffic. The classification analysis is applied to Internet addresses, Internet protocols, and Internet bandwidth suspected of being subject to worm and trojan attacks.

The result of this research is the analysis and classification of Internet addresses, Internet protocols, and Internet bandwidth to detect worm and trojan attacks.
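A minimal sketch of the classification step, assuming scikit-learn's entropy-based decision tree as a stand-in for C4.5 and Gaussian naive Bayes as a stand-in for the Bayesian Network method; the flow features and labels are hypothetical placeholders, not the paper's dataset.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical flow features: [packets/s, bytes/s, distinct destination IPs, distinct ports]
X = np.random.rand(1000, 4)
y = np.random.randint(0, 2, size=1000)   # 0 = normal traffic, 1 = worm/trojan anomaly

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

tree = DecisionTreeClassifier(criterion="entropy").fit(X_train, y_train)   # C4.5-like
bayes = GaussianNB().fit(X_train, y_train)                                 # Bayesian stand-in

print("decision tree accuracy:", accuracy_score(y_test, tree.predict(X_test)))
print("naive Bayes accuracy:  ", accuracy_score(y_test, bayes.predict(X_test)))
```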


New developments in the architecture of the Internet have increased Internet traffic. The introduction of Peer-to-Peer (P2P) applications is affecting the performance of traditional Internet applications. Network optimization is used to monitor and manage Internet traffic and improve the performance of Internet applications, but existing optimization methods are not able to provide better management for networks. Machine learning (ML) is one of the familiar techniques for handling Internet traffic; it is used to identify and reduce the traffic. The lack of relevant datasets has reduced the performance of ML techniques in the classification of Internet traffic. The aim of this research is to develop a hybrid classifier to classify Internet traffic data and mitigate the traffic. The proposed method is deployed in the classification of traffic traces of University Technology Malaysia. The method produced an accuracy of 98.3% with less computation time.
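A minimal sketch of a hybrid traffic classifier in the spirit of the approach described above; the component learners (a decision tree and k-nearest neighbours combined by soft voting) and the per-flow features are assumptions, since the paper's exact hybrid design is not given here.

```python
import numpy as np
from sklearn.ensemble import VotingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical per-flow features: [duration, packet count, mean packet size, mean inter-arrival time]
X = np.random.rand(500, 4)
y = np.random.randint(0, 3, size=500)   # e.g. 0 = web, 1 = P2P, 2 = streaming

# Hybrid classifier: two base learners combined by soft (probability-averaging) voting
hybrid = VotingClassifier(
    estimators=[("tree", DecisionTreeClassifier()), ("knn", KNeighborsClassifier())],
    voting="soft",
)
scores = cross_val_score(hybrid, X, y, cv=5)
print("mean cross-validated accuracy:", scores.mean())
```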


Author(s):  
Petar Halachev ◽  
Victoria Radeva ◽  
Albena Nikiforova ◽  
Miglena Veneva

This report is dedicated to the role of the website as an important tool for presenting a business on the Internet. A classification of site types is made in terms of their application in business and the types of structures used in their construction. Life-cycle models for designing business websites are analyzed, and their strengths and weaknesses are outlined. The stages in the design, construction, commissioning, and maintenance of a business website are distinguished, and the activities and requirements of each stage are specified.


2020 ◽  
Author(s):  
Kunal Srivastava ◽  
Ryan Tabrizi ◽  
Ayaan Rahim ◽  
Lauryn Nakamitsu

The ceaseless connectivity imposed by the Internet has made many vulnerable to offensive comments, be it about their physical appearance, political beliefs, or religion. Some define hate speech as any kind of personal attack on one's identity or beliefs. Of the many sites that allow such offensive speech to spread, Twitter has arguably become the primary medium for individuals and groups to spread these hurtful comments. Such comments typically fail to be detected by Twitter's anti-hate system and can linger online for hours before finally being taken down. Through sentiment analysis, this algorithm is able to distinguish hate speech effectively through the classification of sentiment.
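A minimal sketch of sentiment-based hate-speech classification along the lines described above, assuming a TF-IDF plus logistic-regression pipeline; the model choice and the toy tweets and labels are illustrative assumptions, not the authors' actual system or data.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data: hypothetical tweets with benign (0) / hate speech (1) labels
tweets = ["have a great day everyone", "you people are worthless",
          "love this community", "get out of this country"]
labels = [0, 1, 0, 1]

# Sentiment-style text classification: TF-IDF features fed to a linear classifier
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(tweets, labels)

# Likely classified as hate speech, since it overlaps with the hate-labeled example
print(model.predict(["you people are awful"]))
```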


1999 ◽  
Vol 27 (1) ◽  
pp. 212-213 ◽  
Author(s):  
Narayanan Venkitaraman ◽  
Tae-eun Kim ◽  
Kang-Won Lee
