Impact of scheduling algorithms on the performance of telemedicine traffic in cellular networks

2021 ◽  
Vol 9 (2) ◽  
pp. 13-27
Author(s):  
E. J. Obamila

Efficient transmission of medical information is an emerging area of telecommunication engineering because such traffic conveys critical data about a patient’s state and vital measurements. Consequently, these transmissions must be fast and error-free. This requirement goes beyond simply scheduling users at a base station; it calls for provisioning guaranteed bandwidth for the transmission of critical medical data. To achieve this, a scheduling scheme is needed that prioritizes all forms of telemedicine traffic over regular traffic at the base station. There is also a need to measure, evaluate, and quantify the impact of such a scheduling scheme on telemedicine traffic transmission in cellular networks in terms of the throughput attained. To address these problems, priority and non-priority scheduling algorithms for telemedicine traffic transmission were developed and simulated in MATLAB 8.1.0, and the impact of the developed algorithm on telemedicine traffic transmission was evaluated. The results show a significant increase in telemedicine users’ throughput under the priority scheduling scheme. Over 20 simulation rounds, the impact of packet size, traffic load, and codec rate on the average throughput of telemedicine traffic was studied and discussed.
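The priority idea described in this abstract can be sketched as a simple queue discipline at the base station; the packet names and the two-level priority are illustrative assumptions, not the paper's exact algorithm:

```python
import heapq

# Minimal sketch (not the paper's exact scheme): a base-station scheduler
# that transmits telemedicine packets before regular traffic, preserving
# arrival order within each class.
TELEMEDICINE, REGULAR = 0, 1  # lower value = higher priority

def schedule(packets):
    """Return packets in transmission order: priority class first, then arrival order."""
    heap = [(prio, seq, pkt) for seq, (prio, pkt) in enumerate(packets)]
    heapq.heapify(heap)
    return [pkt for _, _, pkt in (heapq.heappop(heap) for _ in range(len(heap)))]

arrivals = [(REGULAR, "web-1"), (TELEMEDICINE, "ecg-1"),
            (REGULAR, "web-2"), (TELEMEDICINE, "vitals-1")]
order = schedule(arrivals)
# Telemedicine packets are transmitted before all regular packets.
```

Because the heap orders tuples lexicographically, the arrival sequence number breaks ties inside each priority class, so the discipline is work-conserving and starvation-free only for the high-priority class, mirroring the throughput gain the abstract reports.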

2018 ◽  
Vol 27 (12) ◽  
pp. 1850195
Author(s):  
P. Mangayarkarasi ◽  
J. Raja

Energy-efficient and reliable data transmission is a challenging task in wireless relay networks (WRNs). Energy efficiency in cellular networks has received significant attention because of the present need for reduced energy consumption, which maintains the profitability of networks and in turn makes them “greener”. An urban cell topography needs more energy to cover the total area of the cell: the base station does not cover the entire area of a given topography, and adding more base stations is cost-prohibitive. This work proposes an energy-efficient relay placement model that calculates the maximum cell coverage across all sectors, together with an energy-efficient incremental redundancy-hybrid automatic repeat request (IR-HARQ) power allocation scheme that improves the reliability of the network by improving overall network throughput. The IR-HARQ power allocation method maximizes the average incremental mutual information at each round, and its throughput quickly converges to the ergodic channel capacity as the number of retransmissions increases. Simulation results show that the proposed IR-HARQ power allocation achieves full channel capacity with moderate average transmission delay and maintains good throughput under low power consumption. The impact of the distances between relay station and base station and between user and relay station, as well as of the relay height under line-of-sight conditions, on relaying performance is analyzed using full decode-and-forward (FDF) and partial decode-and-forward (PDF) relaying schemes. Compared to the FDF scheme, the PDF scheme provides better performance and allows more freedom in relay placement for increased cell coverage.
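The IR-HARQ behaviour described above (mutual information accumulating over retransmission rounds until the target rate is reached) can be sketched as a small Monte Carlo experiment; the target rate, per-round powers, and Rayleigh fading model are illustrative assumptions, not the paper's optimized allocation:

```python
import math, random

def ir_harq_rounds(rate, powers, max_rounds, rng):
    """Number of HARQ rounds until the accumulated mutual information
    reaches the target rate; Rayleigh fading gain drawn per round."""
    mi = 0.0
    for k in range(max_rounds):
        gain = rng.expovariate(1.0)              # exponential power gain (Rayleigh)
        mi += math.log2(1.0 + powers[k] * gain)  # incremental mutual information
        if mi >= rate:
            return k + 1
    return max_rounds                            # give up after max_rounds

rng = random.Random(7)
rate, max_rounds = 3.0, 4
powers = [1.0, 1.5, 2.0, 2.5]                    # illustrative per-round power profile
trials = [ir_harq_rounds(rate, powers, max_rounds, rng) for _ in range(10000)]
avg_rounds = sum(trials) / len(trials)
throughput = rate / avg_rounds                   # bits/s/Hz, ignoring failed decodes
```

An optimized allocation in the spirit of the paper would choose `powers` to maximize the expected mutual-information increment per round; the fixed increasing profile here merely demonstrates the accumulation mechanism.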


2018 ◽  
Vol 2018 ◽  
pp. 1-10 ◽  
Author(s):  
Jiaqi Lei ◽  
Hongbin Chen ◽  
Feng Zhao

The energy efficiency (EE) is a key metric of ultradense heterogeneous cellular networks (HCNs). Earlier works on the EE analysis of ultradense HCNs by using the stochastic geometry tool only focused on the impact of the base station density ratio and ignored the function of different tiers. In this paper, a two-tier ultradense HCN with small-cell base stations (SBSs) and user equipments (UEs) densely deployed in a traditional macrocell network is considered. Firstly, the performance of the ultradense HCN in terms of the association probability, average link spectral efficiency (SE), average downlink throughput, and average EE is theoretically analyzed by using the stochastic geometry tool. Then, the problem of maximizing the average EE while meeting minimum requirements of the average link SE and average downlink throughput experienced by UEs in macrocell and small-cell tiers is formulated. As it is difficult to obtain the explicit expression of average EE, impacts of the SBS density ratio and signal-to-interference-plus-noise ratio (SINR) threshold on the network performance are investigated through numerical simulations. Simulation results validate the accuracy of theoretical results and demonstrate that the maximum value of average EE can be achieved by optimizing the SBS density ratio and the SINR threshold.
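The tier-association step analyzed above can be illustrated with a Monte Carlo sketch: both tiers are drawn as Poisson point processes and a user at the origin joins the tier offering the strongest average received power. The densities, transmit powers, and path-loss exponent below are illustrative assumptions, not the paper's parameters:

```python
import math, random

def poisson(lam, rng):
    """Knuth's method for a Poisson sample with mean lam."""
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def best_power(density, p_tx, alpha, radius, rng):
    """Strongest average received power from one PPP tier in a disk."""
    n = poisson(density * math.pi * radius ** 2, rng)
    best = 0.0
    for _ in range(n):
        r = max(radius * math.sqrt(rng.random()), 1e-3)  # uniform point in the disk
        best = max(best, p_tx * r ** -alpha)
    return best

def sbs_association_prob(lam_sbs, trials, seed):
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        macro = best_power(2e-6, 100.0, 4.0, 500.0, rng)  # sparse, high-power tier
        sbs = best_power(lam_sbs, 1.0, 4.0, 500.0, rng)   # dense, low-power tier
        hits += sbs > macro
    return hits / trials

sparse = sbs_association_prob(5e-6, 2000, 1)
dense = sbs_association_prob(5e-5, 2000, 1)
# Densifying the small-cell tier raises the small-cell association probability,
# which is the density-ratio effect the abstract studies.
```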


2014 ◽  
Vol 2014 ◽  
pp. 1-7 ◽  
Author(s):  
Ahmed A. Ali ◽  
Rosdiadee Nordin ◽  
Mahamod Ismail ◽  
Huda Abdullah

In Long Term Evolution-Advanced (LTE-A), the signal quality of a wireless channel is estimated from channel quality measurements, and the measurement results are used to select a suitable modulation and coding scheme for each transmission. However, feedback and processing delays can cause a mismatch between the channel quality information (CQI) and the current channel state at the base station, and such delays in the reception of CQI may degrade system performance. This study analyzes the impact of CQI feedback delay on the joint user scheduling (JUS) and separated random user scheduling (SRUS) schemes in an LTE-A system with carrier aggregation. Systems with delayed CQI and with perfect channel knowledge are compared across different deployment scenarios. We study the throughput performance of both scheduling schemes in each deployment scenario and then recommend the scenario best suited to maintaining the desired QoS for a given number of users. Results show that, with the main beam directed at sector boundaries and diverse coverage, the JUS scheme performs better than SRUS, which justifies its intensive use of user equipment power and extra control signaling overhead.
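The CQI-aging effect studied above can be illustrated with a toy model: the base station picks a rate from an outdated Rayleigh channel estimate, and the transmission fails when the current channel can no longer support that rate. The Gauss-Markov correlation parameter `rho` stands in for feedback delay and is an assumption of this sketch, not the paper's channel model:

```python
import math, random

def goodput(rho, snr_avg, n, rng):
    """Average goodput when the rate is chosen from an outdated channel
    estimate; rho = 1.0 means the CQI is perfectly fresh."""
    total = 0.0
    for _ in range(n):
        h_old = complex(rng.gauss(0, math.sqrt(0.5)), rng.gauss(0, math.sqrt(0.5)))
        w = complex(rng.gauss(0, math.sqrt(0.5)), rng.gauss(0, math.sqrt(0.5)))
        h_now = rho * h_old + math.sqrt(1 - rho ** 2) * w  # Gauss-Markov channel aging
        rate = math.log2(1 + snr_avg * abs(h_old) ** 2)    # rate picked from old CQI
        if rate <= math.log2(1 + snr_avg * abs(h_now) ** 2):
            total += rate                                  # decoded successfully
    return total / n

rng = random.Random(3)
fresh = goodput(1.0, 10.0, 4000, rng)  # no feedback delay
stale = goodput(0.7, 10.0, 4000, rng)  # outdated CQI causes outages, lower goodput
```

With `rho = 1.0` every transmission succeeds at capacity, while a lower correlation makes roughly half of the rate selections overshoot the current channel, which is the mismatch-driven degradation the abstract describes.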


1975 ◽  
Vol 14 (04) ◽  
pp. 199-201
Author(s):  
D. Lahaye ◽  
D. Roosels ◽  
J. Viaene

The drafting of a medical computer file for pneumoconiosis at the Fund of Occupational Diseases was essentially based on an intuitive choice of medical information processing drawn from long experience. For statistical purposes, however, a more scientific selection of stored information is needed. We therefore checked the medical data of 25,830 complete medical records, testing the frequency of all answer possibilities question by question. A minority of questions had to be reexamined because they led to insufficient answers. With this working method it will be possible to improve the storage of medical information in the future. A further investigation comparing the results achieved by several physicians opens the possibility of rating the impact of subjective judgments in medical examinations.


2020 ◽  
pp. 5-15
Author(s):  
Abhishek P. Patil ◽  
Neelika Chakrabarti

The Health Insurance Portability and Accountability Act (HIPAA) of 1996 was enacted to help reorganize the flow of healthcare information, prescribing how sensitive medical data held by healthcare and insurance firms should be protected from theft and tampering. It has served as a pioneer in healthcare privacy and set one of the earliest benchmarks for legal instruments governing the storage and dissemination of medical information in the form of electronic health records. The HITECH Act of 2009 and the HIPAA Omnibus Rule of 2013 further cemented the use of standardized frameworks that help control, reduce, and track possible breaches of the confidentiality and integrity of such personal information. This paper explores the content, reasoning, and timeline of HIPAA and its impact on the health information technology sector. It also explains the challenges faced in implementing the policy and gives a holistic perspective of the rights and responsibilities of each stakeholder involved.


Sensors ◽  
2021 ◽  
Vol 21 (18) ◽  
pp. 6253
Author(s):  
Tuan Anh Nguyen ◽  
Iure Fe ◽  
Carlos Brito ◽  
Vishnu Kumar Kaliappan ◽  
Eunmi Choi ◽  
...  

The ongoing waves of world-wide virus pandemics urge further study of the performability of local computing infrastructures at hospitals and medical centers, to provide a high level of assurance and trustworthiness of medical services and treatment for patients and to help diminish the burden and chaos of medical management and operations. Previous studies made tremendous progress on the dependability quantification of existing computing paradigms (e.g., cloud and grid computing) at remote data centers, while only a few works investigated the performance of medical services under the operational availability constraints of devices and systems at local medical centers. It is therefore critical to rapidly develop appropriate models to quantify the operational metrics of medical services provided and sustained by medical information systems (MIS), even before practical implementation. In this paper, we propose a comprehensive performability SRN model of an edge/fog-based MIS for the performability quantification of medical data transactions and services in local hospitals or medical centers. The model elaborates the different failure modes of fog nodes and their virtual machines (VMs) under fail-over mechanisms. Sophisticated behaviors and dependencies between the performance and availability of data transactions are captured in a comprehensive manner for three main load-balancing techniques, (i) probability-based, (ii) random-based, and (iii) shortest-queue-based distribution of medical data from the edge to the fog layer, with and without fail-over mechanisms in the case of component failures at the two levels of fog nodes and fog VMs.
Different performability metrics of interest are analyzed, including (i) recover token rate, (ii) mean response time, (iii) drop probability, (iv) throughput, and (v) queue utilization of network devices and fog nodes, to assess the impact of the load-balancing techniques and fail-over mechanisms. Discrete-event simulation results highlight the effectiveness of combining these techniques for enhancing the performability of medical services provided by an MIS. In particular, fail-over mechanisms improve metrics of medical service continuity and quality, while load-balancing techniques enhance system performance metrics. Implementing load balancing together with fail-over mechanisms yields better performability metrics than either alone, and the harmony of the integrated strategies ultimately secures a high level of trustworthiness of medical services. This study can help improve the design of MIS systems that integrate different load-balancing techniques and fail-over mechanisms to maintain continuous performance under the availability constraints of medical services with heavy computing workloads in local hospitals and medical centers, helping to combat new waves of virus pandemics.
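The three load-balancing policies named in this abstract can be sketched as a dispatcher over fog-node queues; the queue states, weights, and job counts below are illustrative, and no service process is modeled:

```python
import random

def dispatch(policy, queues, weights, rng):
    """Pick the index of the fog node that receives the next medical-data job."""
    if policy == "probability":                   # weighted random split
        return rng.choices(range(len(queues)), weights=weights)[0]
    if policy == "random":                        # uniform random split
        return rng.randrange(len(queues))
    if policy == "shortest":                      # join the shortest queue
        return min(range(len(queues)), key=lambda i: queues[i])
    raise ValueError(policy)

rng = random.Random(1)
final = {}
for policy in ("probability", "random", "shortest"):
    queues = [0, 0, 0]
    for _ in range(300):
        queues[dispatch(policy, queues, [0.5, 0.3, 0.2], rng)] += 1
    final[policy] = queues                        # arrivals only; no departures
```

With arrivals only, the shortest-queue policy spreads the 300 jobs exactly evenly, while the probability-based split tracks the weights on average; a fail-over mechanism in the paper's sense would additionally remove failed nodes from the candidate set before dispatching.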


2017 ◽  
Vol 2017 ◽  
pp. 1-14 ◽  
Author(s):  
Abu Jahid ◽  
Abdullah Bin Shams ◽  
Md. Farhad Hossain

This paper proposes a novel framework for PV-powered cellular networks with a standby grid supply and an essential energy management technique for achieving envisaged green networks. The proposal considers an emerging cellular network architecture employing two types of coordinated multipoint (CoMP) transmission techniques for serving the subscribers. Under the proposed framework, each base station (BS) is powered by an individual PV solar energy module having an independent storage device. BSs are also connected to the conventional grid supply for meeting additional energy demand. We also propose a dynamic inter-BS solar energy sharing policy through a transmission line for further greening the proposed network by minimizing the consumption from the grid supply. An extensive simulation-based study in the downlink of a Long-Term Evolution (LTE) cellular system is carried out for evaluating the energy efficiency performance of the proposed framework. System performance is also investigated for identifying the impact of various system parameters including storage factor, storage capacity, solar generation capacity, transmission line loss, and different CoMP techniques.
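The inter-BS solar sharing policy can be illustrated with a toy energy balance: surplus PV energy is pooled over a lossy transmission line and used to cover other base stations' deficits before any grid energy is bought. All figures below are illustrative, not from the paper's simulation:

```python
def grid_draw(generation, demand, line_eff, share):
    """Energy (kWh) bought from the grid, with or without inter-BS solar sharing."""
    surplus = [max(g - d, 0.0) for g, d in zip(generation, demand)]
    deficit = [max(d - g, 0.0) for g, d in zip(generation, demand)]
    if share:
        pool = sum(surplus) * line_eff            # surplus arriving after line loss
        for i in range(len(deficit)):
            used = min(pool, deficit[i])
            deficit[i] -= used
            pool -= used
    return sum(deficit)                           # remainder covered by the grid

gen, dem = [8.0, 2.0, 5.0], [4.0, 6.0, 5.0]       # per-BS PV output and load (kWh)
without = grid_draw(gen, dem, line_eff=0.9, share=False)
with_sharing = grid_draw(gen, dem, line_eff=0.9, share=True)
# Sharing cuts the grid draw from 4.0 kWh to roughly 0.4 kWh in this example.
```

The `line_eff` factor plays the role of the transmission line loss in the abstract: a lower efficiency shrinks the usable pooled surplus and hence the greening benefit.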


Electronics ◽  
2019 ◽  
Vol 8 (10) ◽  
pp. 1187 ◽  
Author(s):  
Enass Hriba ◽  
Matthew C. Valenti

In this paper, we provide a comprehensive analysis of macrodiversity for millimeter wave (mmWave) cellular networks. The key issue with mmWave networks is that signals are prone to blocking by objects in the environment, which causes paths to go from line-of-sight (LOS) to non-LOS (NLOS). We identify macrodiversity as an important strategy for mitigating blocking, as with macrodiversity the user will attempt to connect with two or more base stations. Diversity is achieved because if the closest base station is blocked, then the next base station might still be unblocked. However, since it is possible for a single blockage to simultaneously block the paths to two base stations, the issue of correlated blocking must be taken into account by the analysis. Our analysis characterizes the macrodiversity gain in the presence of correlated random blocking and interference. To do so, we develop a framework to determine distributions for the LOS probability, Signal to Noise Ratio (SNR), and Signal to Interference and Noise Ratio (SINR) by taking into account correlated blocking. We validate our framework by comparing our analysis, which models blockages using a random point process, with an analysis that uses real-world data to account for blockage. We consider a cellular uplink with both diversity combining and selection combining schemes. We also study the impact of blockage size and blockage density along with the effect of co-channel interference arising from other cells. We show that the assumption of independent blocking can lead to an incorrect evaluation of macrodiversity gain, as the correlation tends to decrease macrodiversity gain.
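The core point about correlated blocking can be seen in a toy two-link model: besides independent per-link blockages, a single common blocker can cut both paths at once, so the joint blocking probability exceeds what the independence assumption predicts. The probabilities below are illustrative, not derived from the paper's geometry:

```python
# Toy model: each path is blocked independently with probability q, and a
# "common" blocker (probability c) blocks both paths simultaneously.
def blocking_probs(q, c):
    p_one = c + (1 - c) * q        # marginal probability a given link is blocked
    p_both = c + (1 - c) * q * q   # joint probability both links are blocked
    p_indep = p_one ** 2           # what the independence assumption would predict
    return p_one, p_both, p_indep

p_one, p_both, p_indep = blocking_probs(q=0.3, c=0.2)
# p_both > p_indep: the chance that *both* links fail is higher than the
# independent model suggests, so correlation reduces the macrodiversity gain.
```

The macrodiversity outage of a two-BS selection scheme is exactly `p_both`, so an analysis that uses `p_indep` instead overestimates the gain, which is the paper's central caution.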


A mobile device generates a call-request message during a call, which is then sent to a base station (BS). The BS processes the call request and chooses to accept or deny the call. Signaling traffic such as location notifications, paging, and handoffs due to user mobility takes a significant share of the total traffic load in mobile cellular networks, and the maximum allowable delay may differ between signaling packet types. A signaling packet is considered delayed when its processing time exceeds the allowable pause; this reduces the quality of service, which is not acceptable to service providers. In this paper, we propose an empirical model to determine the overall delay in the processing of wireless cellular network signaling packets, comprising the delay over the radio channel and the processing delay in the wired component. We demonstrate the effectiveness of priority processing in reducing handoff delays. We also evaluate how the delay differs between cells depending on their position within the network area, and how the number of nodes influences processing delay.
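The benefit of priority processing for handoff signaling can be sketched with a single-server backlog: serving handoff packets first lowers their mean waiting time relative to FIFO. The packet mix and service times below are illustrative, not the paper's empirical model:

```python
def mean_handoff_wait(packets, priority):
    """Mean waiting time of handoff packets; all packets queued at time 0.
    packets: list of (kind, service_time) in arrival order."""
    if priority:  # stable sort: handoff first, arrival order preserved within class
        packets = sorted(packets, key=lambda p: p[0] != "handoff")
    t, waits = 0.0, []
    for kind, service_time in packets:
        if kind == "handoff":
            waits.append(t)          # time spent waiting before service starts
        t += service_time
    return sum(waits) / len(waits)

backlog = [("paging", 2.0), ("handoff", 1.0), ("location", 3.0),
           ("handoff", 1.0), ("paging", 2.0)]
fifo = mean_handoff_wait(backlog, priority=False)  # handoffs wait behind everything
prio = mean_handoff_wait(backlog, priority=True)   # handoffs served first
```

Because the server is work-conserving, moving handoff packets to the head of the backlog can only shorten their waits, at the cost of longer waits for the lower-priority signaling types.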


2015 ◽  
Vol 14 (8) ◽  
pp. 6014-6020
Author(s):  
Gheorghe Mihaela ◽  
Petre Stefania Ruxandra

Over the past decades, the field of medical informatics has grown rapidly and drawn the attention of many researchers. The digitization of medical information, including medical history records, research papers, medical images, and laboratory analyses and reports, has generated large amounts of data that need to be handled. As the rate of data acquisition is greater than the rate of data interpretation, new computational technologies are needed to manage the resulting repositories of medical data and to extract relevant knowledge from them. Such methods are provided by data mining techniques, which discover meaningful patterns and trends within the data and help improve various aspects of health informatics. Before data mining techniques can be applied, the data needs to be cleansed and transformed; normalization is one of the most important pre-processing methods serving this purpose. This paper presents the impact of applying different data normalization methods on the performance obtained with the K-Nearest Neighbour algorithm on medical data sets.
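The effect of normalization on K-Nearest Neighbour can be shown with a two-feature toy example: without scaling, the large-range feature dominates the Euclidean distance and can flip the predicted neighbour. All values are invented for illustration:

```python
import math

def min_max(rows):
    """Min-max normalization of each column to [0, 1]."""
    lo = [min(c) for c in zip(*rows)]
    hi = [max(c) for c in zip(*rows)]
    return [[(v - l) / (h - l) for v, l, h in zip(r, lo, hi)] for r in rows]

def nearest(train, labels, x):
    """1-NN classification by Euclidean distance."""
    d = [math.dist(r, x) for r in train]
    return labels[d.index(min(d))]

train = [[240.0, 1.0], [180.0, 9.0]]   # [cholesterol mg/dL, severity score 0-10]
labels = ["A", "B"]
query = [235.0, 9.0]

raw = nearest(train, labels, query)     # cholesterol (range ~60) dominates -> "A"
lo = [min(c) for c in zip(*train)]
hi = [max(c) for c in zip(*train)]
norm_q = [(v - l) / (h - l) for v, l, h in zip(query, lo, hi)]
scaled = nearest(min_max(train), labels, norm_q)  # severity now counts -> "B"
```

Note that the query is scaled with the training set's min/max, as it would be in practice; scaling it with its own statistics would leak information and distort distances.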

