Analysis of total signal decay and capacity of information data in wireless atmospheric communication links. Part 2

Author(s):  
Irit Juwiler ◽  
Irina Bronfman ◽  
Natan Blaunstein

Introduction: The analysis of total signal decay is based on prediction of the total path loss occurring in atmospheric communication links, accounting for attenuation and scattering by gaseous structures, absorption and attenuation by hydrometeors (rain, snow, and clouds), and fast fading caused by turbulent structures, all affecting radio and optical signals passing through atmospheric channels with fading. Purpose: To present a novel methodology for defining and estimating the effects of decay, absorption, scattering, and fading of radio and optical signals propagating in atmospheric channels under various meteorological conditions. Results: The impact of gaseous structures, hydrometeors, and turbulent structures on the total path loss for link budget design, and on the degradation of data stream parameters such as capacity, spectral efficiency, and bit error rate, was analyzed; this degradation leads to loss of information in data signals passing through such channels with fast fading and to a decrease in quality of service. An optimal algorithm was found for predicting the total path loss in various meteorological situations occurring in the real atmosphere at different heights and for various frequencies of the radiated signals. A method was proposed for evaluating the data stream parameters (capacity, spectral efficiency, and bit error rate) while accounting for the impact of atmospheric turbulence on fast fading, which corrupts information passing through such channels. All practical tests were illustrated using MATLAB. A new methodology was proposed for evaluating and estimating the capacity, the spectral efficiency, and the loss in energy and in the information data stream for different scenarios of radio and optical signal propagation via atmospheric channels with fading caused by various meteorological conditions. Practical relevance: The results obtained allow better prediction accuracy and increased quality of service in atmospheric communication channels.
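The link-budget and capacity relationships summarized above can be sketched numerically. The snippet below is a minimal illustration, not the authors' MATLAB code: it combines free-space path loss with hypothetical gaseous and rain attenuation figures, then evaluates the resulting Shannon capacity and spectral efficiency.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(freq_hz, dist_m):
    """Free-space path loss: 20 * log10(4 * pi * d * f / c)."""
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / C)

def received_snr_db(tx_dbm, gains_db, freq_hz, dist_m,
                    gas_db, rain_db, noise_dbm):
    """Link budget: total path loss = FSPL + gaseous + hydrometeor terms.
    gains_db lumps together the transmit and receive antenna gains."""
    path_loss = fspl_db(freq_hz, dist_m) + gas_db + rain_db
    return tx_dbm + gains_db - path_loss - noise_dbm

def shannon_capacity(bw_hz, snr_db):
    """C = B * log2(1 + SNR); spectral efficiency is C / B."""
    return bw_hz * math.log2(1 + 10 ** (snr_db / 10))

# Hypothetical 28 GHz, 2 km link with 30 dBm transmit power:
clear = received_snr_db(30, 40, 28e9, 2000,
                        gas_db=0.5, rain_db=0.0, noise_dbm=-100)
rainy = received_snr_db(30, 40, 28e9, 2000,
                        gas_db=0.5, rain_db=15.0, noise_dbm=-100)
bw = 100e6
print(shannon_capacity(bw, clear) / bw, "vs",
      shannon_capacity(bw, rainy) / bw, "bit/s/Hz")
```

Rain attenuation enters the budget as a plain dB term, so the same skeleton accepts any meteorological model's output for the hydrometeor and gaseous losses.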

Author(s):  
I. Juwiler ◽  
I. Bronfman ◽  
N. Blaunstein

Introduction: This article is based on recent research work on two subjects: signal data parameters in fiber optic communication links, and the dispersive properties of optical signals caused by non-homogeneous material phenomena and by multimode propagation of optical signals in such wired links. Purpose: Studying multimode dispersion by analyzing the propagation of guided optical waves along a fiber optic cable with various refractive index profiles of the inner optical cable (core) relative to the outer cladding, as well as the dispersion properties of a fiber optic cable due to the inhomogeneous nature of the cladding along the cable, for two types of signal code sequences transmitted via the cable: return-to-zero and non-return-to-zero. Methods: The dispersion properties of multimode propagation inside a fiber optic cable are analyzed with an advanced 3D model of optical wave propagation in the given guiding structure. The effects of multimodal dispersion and material dispersion causing optical signal delay spread along the cable were investigated analytically and numerically. Results: Time dispersion properties were obtained and graphically illustrated for two kinds of fiber optic structures with different refractive index profiles. The dispersion was caused by multimode (i.e., multi-ray) propagation and by the inhomogeneous nature of the material along the cable. Their effect on the capacity and spectral efficiency of a data signal stream passing through such a guiding optical structure is illustrated for arbitrary refractive indices of the inner (core) and outer (cladding) elements of the optical cable. A new methodology is introduced for finding and evaluating the effects of time dispersion of optical signals propagating in fiber optic structures of various kinds.
An algorithm is proposed for estimating the spectral efficiency loss, measured in bits per second per hertz per kilometer along the cable, for arbitrary presentation of the code signals in the data stream, whether non-return-to-zero or return-to-zero. All practical tests are illustrated using MATLAB.
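As a rough numerical companion to the delay-spread discussion above, the sketch below is an illustrative ray approximation, not the article's 3D model; the refractive indices are hypothetical, and the bit-rate bound is a common rule of thumb rather than the paper's NRZ/RZ-specific result.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def modal_delay_spread(n_core, n_clad, length_m):
    """Intermodal delay spread of a step-index fiber (ray approximation):
    delta_t ~ (n_core * L / c) * (n_core - n_clad) / n_core."""
    delta = (n_core - n_clad) / n_core
    return n_core * length_m / C * delta

def bit_rate_limit(delay_spread_s):
    """Rule-of-thumb limit B <= 1 / (2 * delta_t) set by pulse spreading."""
    return 1.0 / (2.0 * delay_spread_s)

# Hypothetical multimode fiber: n_core = 1.48, n_clad = 1.46, 1 km span
dt = modal_delay_spread(1.48, 1.46, 1000.0)
print(dt * 1e9, "ns delay spread ->",
      bit_rate_limit(dt) / 1e6, "Mbit/s over 1 km")
```

Shrinking the core/cladding index contrast shrinks the delay spread, which is why graded-index profiles support far higher bit rates per kilometer.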


Author(s):  
D. V. Shelkovoy ◽  
A. A. Chernikov

The article presents testing results for mathematical models that estimate the channel resource required for serving a given multimedia load in packet-switched communication networks. The attainable level of quality of service at the data packet transport level was assessed by means of simulation modeling of the functioning of a switching node of a communication network. The developed modeling algorithm differs from existing ones by taking into account the delay introduced in processing each data stream packet arriving at the switching node, depending on the size of the reserved buffer and the channel resource allocated for its maintenance. A joint examination of the packet loss probability and the delay introduced in processing data packets in the border router allows a comprehensive assessment of end-to-end quality of service, which in turn yields more accurate values of the effective data transmission rate of the aggregated flows at the entrance to the transport network.
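The trade-off between buffer size, packet loss, and queueing delay examined jointly above can be illustrated with a classical finite-buffer M/M/1/K model. This is a textbook sketch, not the article's simulation algorithm, and the traffic figures are hypothetical.

```python
def mm1k_metrics(lam, mu, k):
    """Steady-state loss probability and mean sojourn time of an M/M/1/K
    queue: p_n = rho**n / sum(rho**m for m in 0..K), loss = p_K."""
    rho = lam / mu
    weights = [rho ** n for n in range(k + 1)]
    norm = sum(weights)
    p = [w / norm for w in weights]
    loss = p[k]                          # arriving packet finds buffer full
    mean_n = sum(n * pn for n, pn in enumerate(p))
    lam_eff = lam * (1 - loss)           # accepted (carried) traffic
    delay = mean_n / lam_eff             # Little's law: W = N / lambda_eff
    return loss, delay

# Hypothetical node: 900 packets/s offered to a 1000 packets/s server
for buf in (5, 20, 50):
    loss, delay = mm1k_metrics(900.0, 1000.0, buf)
    print(buf, "slots ->", round(loss, 4), "loss,",
          round(delay * 1000, 2), "ms delay")
```

Growing the buffer lowers the loss probability but raises the queueing delay, which is exactly the joint effect the node model in the article accounts for.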


Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1534
Author(s):  
Remigiusz Rajewski

The banyan-type switching networks well known in switching theory, called the log_d N switching fabrics, are composed of symmetrical switching elements of size d×d. In turn, the modified baseline architecture, called the MBA(N,e,g), is only partially built from symmetrical optical switching elements and is constructed mostly from asymmetrical optical switching elements. Recently, it was shown that the MBA(N,e,g) structure requires fewer passive as well as active optical elements than the banyan-type switching fabric of the same capacity and functionality, which makes it an attractive solution. However, the optical signal-to-crosstalk ratio of the MBA(N,e,g) had not been investigated before. Therefore, in this paper, the optical signal-to-crosstalk ratio in the MBA(N,e,g) was determined. Such crosstalk influences the quality of the output signal: the lower the crosstalk, the better the signal quality. The switching fabric proposed in the author's previous work has lower optical signal losses than typical Beneš and banyan-type switching networks of the same capacity and functionality, which gives better quality of the transmitted optical signals at the switching node's output. The investigated MBA(N,e,g) architecture also contains one stage fewer than a banyan-type network of the same capacity, which is an essential feature from the optical switching point of view.
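The stage-count argument above matters because first-order crosstalk accumulates across stages. Under the common worst-case assumption that each traversed switching element injects crosstalk at X dB below the signal and that these contributions add in power (a simplified model, not the paper's analysis; the element figure is hypothetical), the output signal-to-crosstalk ratio can be sketched as follows.

```python
import math

def output_sxr_db(element_x_db, stages):
    """Worst-case output SXR when each of `stages` elements injects
    crosstalk element_x_db below the signal and the crosstalk powers
    add incoherently: SXR = X - 10 * log10(stages)."""
    return element_x_db - 10 * math.log10(stages)

# Hypothetical 35 dB elements: one stage fewer gives a measurably better SXR
for stages in (5, 6):
    print(stages, "stages ->", round(output_sxr_db(35.0, stages), 2), "dB")
```

In this model, removing one stage from a six-stage path improves the SXR by 10*log10(6/5), about 0.8 dB, on top of the reduced insertion loss.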


2016 ◽  
Vol 65 (8) ◽  
pp. 6038-6050 ◽  
Author(s):  
Zhicheng Dong ◽  
Pingzhi Fan ◽  
Rose Qingyang Hu ◽  
Jake Gunther ◽  
Xianfu Lei

2010 ◽  
Vol 56 (4) ◽  
pp. 351-355
Author(s):  
Marcin Rodziewicz

Joint Source-Channel Coding in Dictionary Methods of Lossless Data Compression

Limitations on memory and resources of communication systems require powerful data compression methods. Decompression of a compressed data stream is very sensitive to errors arising during transmission over noisy channels, so error correction coding is also required. One solution to this problem is the application of joint source and channel coding. This paper describes methods of joint source-channel coding based on the popular data compression algorithms LZ77 and LZSS. These methods are capable of introducing some error resiliency into the compressed data stream without degrading the compression ratio. We analyze joint source and channel coding algorithms based on these compression methods and present their novel extensions. We also present simulation results showing the usefulness and achievable quality of the analyzed algorithms.
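For readers unfamiliar with the base algorithms, a minimal LZSS-style coder (an illustrative sketch of the dictionary method itself, not the paper's joint source-channel construction) emits either a literal symbol or an (offset, length) back-reference into a sliding window:

```python
def lzss_compress(data, window=255, min_match=3, max_match=15):
    """Emit literals or (offset, length) references into a sliding window."""
    i, out = 0, []
    while i < len(data):
        best_len, best_off = 0, 0
        for j in range(max(0, i - window), i):
            length = 0
            while (length < max_match and i + length < len(data)
                   and data[j + length] == data[i + length]):
                length += 1
            if length > best_len:
                best_len, best_off = length, i - j
        if best_len >= min_match:          # reference pays off: encode it
            out.append((best_off, best_len))
            i += best_len
        else:                              # match too short: emit a literal
            out.append(data[i])
            i += 1
    return out

def lzss_decompress(tokens):
    out = []
    for t in tokens:
        if isinstance(t, tuple):           # copy back from the window
            off, length = t
            for _ in range(length):
                out.append(out[-off])
        else:
            out.append(t)
    return "".join(out)

s = "abracadabra abracadabra"
tokens = lzss_compress(s)
print(tokens[:8], lzss_decompress(tokens) == s)
```

A single corrupted back-reference desynchronizes everything that follows, which is precisely the fragility the joint source-channel extensions in the paper aim to mitigate.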


2018 ◽  
Vol 7 (2.28) ◽  
pp. 181
Author(s):  
Ali M. Al-Saegh

Building scheduling algorithms for satellite communication links has become a necessity given the typical problems that satellite networks suffer from, such as congestion, jamming, mobility, atmospheric impairment, and meeting quality of service (QoS) requirements. However, building efficient algorithms requires several considerations to be taken into account, such as the satellite and earth station node(s) and the link parameters and specifications, along with the service requirements and limitations. This paper presents an efficient approach for accumulating the effective considerations that a designer should employ as a framework for building a proper and efficient scheduling algorithm. The proposed approach provides solutions to satellite communication impairments and satisfies the quality of service requirements in satellite communication networks.


Mnemosyne ◽  
2015 ◽  
Vol 68 (5) ◽  
pp. 794-813
Author(s):  
Cornelis van Tilburg

The founding of a city requires certain hygienic and meteorological conditions. The climate must be moderate, neither too hot, nor too cold; neither too dry, nor too moist; fresh air and water are crucial. Ancient medical writers such as the authors of the Hippocratic Corpus, Celsus and Galen prescribe ideal conditions for the city. Wind-directions, local climate (heat, cold, humidity), quantity and quality of air and water and a clean environment were crucial factors to establish a healthy city. Did their opinions correspond with the opinions of non-medical ancient sources like Vitruvius, Varro, and Columella? And, finally, were these conditions really realised in practice, as proved by excavations? According to his book Res rusticae, the Roman author M. Terentius Varro improved the hygienic situation by cleaning polluted air, when he changed the position of doors and windows. If this story is true, there is evidence that there was some knowledge of improving health, bringing theory into practice.


Author(s):  
Sana Rekik

The advent of geospatial big data has led to a paradigm shift in which most related applications became data driven, and therefore intensive in both data and computation. This revolution has covered most domains, notably real-time systems such as web search engines, social networks, and tracking systems. The latter are linked to the high-velocity feature, which characterizes the dynamism of fast-changing and moving data streams. Therefore, the response time and speed of such queries, along with the space complexity, are among the requirements of data stream analysis systems, and they still call for improvement through sophisticated algorithms. In this vein, this chapter discusses new approaches that can reduce the complexity and costs in time and space while improving the efficiency and quality of responses of geospatial big data stream analysis, in order to efficiently detect changes over time, draw conclusions, and predict future events.


2018 ◽  
Vol 23 ◽  
pp. 00016 ◽  
Author(s):  
Joanna A. Kamińska

Two data mining methods – a random forest and boosted regression trees – were used to model values of roadside air pollution depending on meteorological conditions and traffic flow, using the example of data obtained in the city of Wrocław in the years 2015–2016. Eight explanatory variables – five continuous and three categorical – were considered in the models. A comparison was made of the quality of the fit of the models to empirical data. Commonly used goodness-of-fit measures did not imply a significant preference for either of the methods. Residual analysis was also performed; this showed boosted regression trees to be a more effective method for predicting typical values in the modelling of NO2, NOx and PM2.5, while the random forest method leads to smaller errors when predicting peaks.


2013 ◽  
Vol 4 (4) ◽  
pp. 1-22
Author(s):  
Zrinka Lukač ◽  
Manuel Laguna

The recent development in network multimedia technology has created numerous real-time multimedia applications where the Quality-of-Service (QoS) requirements are quite rigorous. This has made multicasting under QoS constraints one of the most prominent routing problems. The authors consider the problem of the efficient delivery of data streams to receivers for multi-source communication groups. Efficiency in this context means minimizing cost while meeting bounds on the end-to-end delay of the application. The authors adopt the multi-core approach and utilize SPAN (Karaman and Hassane, 2007), a core-based framework for multi-source group applications, as the basis to develop greedy randomized adaptive search procedures (GRASP) for the associated constrained cost minimization problem. The procedures are tested in asymmetric networks and computational results show that they consistently outperform their counterparts in the literature.
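GRASP combines a greedy randomized construction phase, which samples from a restricted candidate list (RCL), with a local search phase. The toy below shows the two phases on a 1-D core-placement problem; it is an illustrative sketch of the metaheuristic, not the authors' SPAN-based procedure, and the problem instance is hypothetical.

```python
import random

def total_cost(cores, clients):
    # each client is served by its nearest selected core
    return sum(min(abs(c - f) for f in cores) for c in clients)

def grasp_p_median(clients, candidates, k, iters=30, alpha=0.3, seed=7):
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(iters):
        # Phase 1: greedy randomized construction; the RCL holds every
        # candidate whose insertion cost is within alpha of the best one
        sol = []
        while len(sol) < k:
            scored = [(total_cost(sol + [c], clients), c)
                      for c in candidates if c not in sol]
            lo = min(s for s, _ in scored)
            hi = max(s for s, _ in scored)
            rcl = [c for s, c in scored if s <= lo + alpha * (hi - lo)]
            sol.append(rng.choice(rcl))
        # Phase 2: local search, swapping a chosen core for an unused one
        improved = True
        while improved:
            improved = False
            cur = total_cost(sol, clients)
            for i in range(k):
                for c in candidates:
                    if c in sol:
                        continue
                    trial = sol[:i] + [c] + sol[i + 1:]
                    t = total_cost(trial, clients)
                    if t < cur:
                        sol, cur, improved = trial, t, True
        if cur < best_cost:
            best, best_cost = sol, cur
    return sorted(best), best_cost

clients = [0, 1, 2, 100, 101, 102]
print(grasp_p_median(clients, clients, k=2))  # picks one core per cluster
```

The alpha parameter trades greediness against diversification: alpha = 0 reduces the construction to pure greedy, while alpha = 1 makes it fully random.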

