Influence of load balancing on quality of real time data transmission

2009 ◽ Vol 6 (3) ◽ pp. 515-524
Author(s): Natasa Maksic, Petar Knezevic, Marija Antic, Aleksandra Smiljanic

The routing algorithm with load balancing presented in [1] is a modification of the OSPF protocol that enables optimization for higher network throughput. With load-balanced routing, packets belonging to the same stream take different paths through the network. This paper analyzes the influence of the resulting differences in packet propagation times on the quality of real-time data transmission. The proposed algorithm was implemented, and a simulation network was set up to measure the jitter.

2021
Author(s): Saurabh Shukla, Mohd. Fadzil Hassan, Duc Chung Tran, Rehan Akbar, Irving Vitra Paputungan, ...

2013
Author(s): Abdulrahman A. Al-Amer, Muhammad Al-Gosayir, Naser Al-Naser, Hussain Al-Towaileb

2015 ◽ Vol 2015 ◽ pp. 1-14
Author(s): Woochul Kang, Jaeyong Chung

With the ubiquitous deployment of sensors and network connectivity, the amount of real-time data handled by embedded systems is increasing rapidly, and many embedded systems require database capability for systematic management of real-time data. In such systems, supporting the timeliness of tasks that access databases is an important problem. However, recent multicore embedded architectures pose a significant challenge for such data-intensive real-time tasks, since the response time of data accesses can be significantly affected by intercore interference. In this paper, we propose a novel feedback control scheme that supports the timeliness of data-intensive tasks against unpredictable intercore interference. In particular, we use a multiple-input/multiple-output (MIMO) control method that exploits multiple control knobs, for example, CPU frequency and Quality-of-Data (QoD), to handle highly unpredictable workloads in multicore systems. Experimental results from an actual implementation show that the proposed approach achieves the target Quality-of-Service (QoS) goals, such as task timeliness and QoD, while consuming less energy than baseline approaches.
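The multi-knob feedback idea can be illustrated with a minimal loop in which both knobs, CPU frequency and QoD, react to the same response-time error. The plant model, gains, and limits below are invented for illustration; the paper's actual controller design is not reproduced here.

```python
# Minimal sketch of a two-knob (MIMO-style) feedback loop that keeps task
# response time near a target by trading CPU frequency against
# Quality-of-Data (data volume). All constants are illustrative assumptions.

TARGET_RT = 10.0   # ms, desired response time

def plant(freq_ghz, qod, interference):
    """Toy model: response time grows with data volume (qod) and
    intercore interference, and shrinks with CPU frequency."""
    return (8.0 * qod + interference) / freq_ghz

def control_step(freq, qod, measured_rt):
    """Proportional MIMO update: both knobs respond to the same error."""
    error = measured_rt - TARGET_RT
    freq = min(3.0, max(1.0, freq + 0.05 * error))   # scale frequency up first
    qod = min(1.0, max(0.3, qod - 0.02 * error))     # shed data if still late
    return freq, qod

freq, qod = 1.0, 1.0
for step in range(50):
    interference = 4.0 if step >= 25 else 0.0        # interference burst begins
    rt = plant(freq, qod, interference)
    freq, qod = control_step(freq, qod, rt)

print(plant(freq, qod, 4.0))   # settles near the 10 ms target
```

When the interference burst arrives, the measured response time overshoots the deadline; the controller then raises frequency and lowers QoD together until the error dies out, which is the essence of compensating with multiple actuators at once.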


2021
Author(s): Graciela Eva Naveda, France Dominique Louie, Corinna Locatelli, Julien Davard, Sara Fragassi, ...

Abstract Natural gas has become one of the major sources of energy for homes, public buildings and businesses; gas storage is therefore particularly important to ensure continuous provision, compensating for the differences between supply and demand. Stogit, part of the Snam group, has been carrying out gas storage activities since the early 1960s. Natural gas is usually stored underground, in large storage reservoirs. The gas is injected into the porous rock of depleted reservoirs, bringing each reservoir close to its original condition, and can be withdrawn as needed. Gas market demand from industries and homes in Italy is mostly met from these Stogit reservoirs, even in periods when imports are in crisis. Typically, from April to October, gas is injected into these "geologically tested" natural reservoirs, while from November to March it is extracted from the same reservoirs and pumped into the distribution networks to meet the higher consumer demand.
Thirty-eight (38) wells, across nine (9) depleted fields, are completed with downhole quartz gauges, and some of them with fiber-optic gauges. The downhole gauges continuously measure and record temperature and pressure from multiple reservoirs. The real-time data system installed for 29 wells collects, transmits and makes downhole data available to the Stogit (Snam) headquarters office. Data is automatically collected from remote terminal units (RTUs) and transferred over the Stogit (Snam) network. The entire system works autonomously and can be managed remotely from anywhere over the corporate Stogit (Snam) IT network. Historical trends, including those from the fiber-optic gauges, are visualized, and data sets can be retrieved using fast, user-friendly software that enables data import into interpretation and reservoir-modeling software.
The use of this data collection and transmission system, versus traditional manual downloads, brought timely data delivery to multiple users, along with improved personnel safety, since land travel was eliminated. The following pages describe the case study, lessons learned, and the new integrated practices used to improve current and future data-transmission deployments.
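The collect-and-retrieve pattern described above can be sketched in a few lines: timestamped downhole readings per well, appended by each upload and queried as historical trends. Well names, units, and the in-memory store are hypothetical stand-ins; the real system uses RTUs, quartz and fiber-optic gauges, and Stogit's corporate network.

```python
# Illustrative sketch of centralized collection of timestamped downhole
# readings and historical-trend retrieval. All names and values are
# hypothetical; this is not the vendor system described in the abstract.

from collections import defaultdict

store = defaultdict(list)   # well id -> [(t_hours, pressure_bar, temp_c), ...]

def ingest(well, t, pressure_bar, temp_c):
    """What one RTU upload would append for a single gauge sample."""
    store[well].append((t, pressure_bar, temp_c))

def trend(well, t_from, t_to):
    """Historical-trend query: samples with t_from <= t <= t_to."""
    return [s for s in store[well] if t_from <= s[0] <= t_to]

# One day of hourly samples from a hypothetical well, pressure slowly falling.
for hour in range(24):
    ingest("well-01", hour, 150.0 - 0.1 * hour, 60.0)

print(len(trend("well-01", 6, 12)))   # samples in a 6-hour window
```

The point of the pattern is that, once ingestion is automatic, every downstream consumer (visualization, reservoir modeling) queries the same central store instead of waiting for a manual download from the wellsite.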


Author(s): Manjunath Ramachandra, Vikas Jain

Present-day Internet traffic largely consists of multimedia traffic, opening up new and previously unthinkable applications such as tele-surgery. The complexity of data transactions increases with the demand for in-time and real-time data transfers, stretching the limited resources of the network beyond their capabilities. This requires prioritization of data transfers, controlled admission of data into the network, and so on. To make matters worse, data from different origins combine, imparting long-lasting detrimental features such as self-similarity and long-range dependence into the traffic. Fortunately, multimedia data carries redundancies that can be removed through efficient compression techniques, and there is provision to control the compression, or bitrate, based on the availability of resources in the network. The traffic controller, or shaper, has to optimize the quality of the transferred multimedia data depending upon the state of the network. In this chapter, a novel traffic shaper is introduced that takes the adverse properties of the network into account and counteracts them.
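A common building block for such shapers is the token bucket, which bounds the rate at which traffic enters the network; a rate knob on the bucket is a natural hook for network-state feedback such as lowering the multimedia bitrate under congestion. The sketch below shows this standard primitive only, not the chapter's novel shaper; all parameter values are illustrative.

```python
# Generic token-bucket shaper sketch. The set_rate() hook is where
# network-state feedback (e.g., congestion-driven bitrate reduction)
# would plug in. Parameters are illustrative assumptions.

class TokenBucket:
    def __init__(self, rate_bps, burst_bits):
        self.rate = rate_bps        # token fill rate (bits/s)
        self.burst = burst_bits     # bucket depth (bits)
        self.tokens = burst_bits
        self.last = 0.0

    def allow(self, now, packet_bits):
        """Refill tokens for the elapsed time, then try to send the packet."""
        self.tokens = min(self.burst,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if packet_bits <= self.tokens:
            self.tokens -= packet_bits
            return True
        return False                # non-conforming: delay or drop

    def set_rate(self, rate_bps):
        """Feedback hook: tighten or relax the shaping rate."""
        self.rate = rate_bps

# Offer 8 kb packets every 5 ms (1.6 Mb/s) against a 1 Mb/s shaping rate.
tb = TokenBucket(rate_bps=1_000_000, burst_bits=10_000)
sent = sum(tb.allow(t / 1000.0, 8_000) for t in range(0, 100, 5))
print(sent)   # only the conforming fraction of the 20 packets passes
```

Offered load above the shaping rate is thinned to the configured rate; pairing `set_rate()` with a congestion estimate yields the adaptive behavior the chapter discusses.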

