Enhancement of Data Security using Circular Queue Based Encryption Algorithm

Data security is an ongoing challenge for developers and hackers alike. To counter the various attacks mounted by hackers, increasingly robust security techniques are needed. In the proposed system, a low-complexity security algorithm based on a circular queue data structure is developed. Several complicating factors are used to strengthen the algorithm and make recovery of the original message by an attacker progressively more difficult. These tunable factors are the size of the circular queue, the starting letter of the chosen keyword, and the multiple representations of a number in the Fibonacci scheme. All letters are converted into ASCII binary format so that the algorithm can apply logical and shift operations to them. The results show that the proposed security algorithm has roughly half the complexity of the multiple circular queues algorithm (MCQA) against which it was compared. The Fibonacci scheme and the variable number of complicating factors provide flexibility in adjusting the security of the algorithm to the circumstances. A circular queue is a data structure that can be used in data security to make the ciphered message harder to decipher; for example, an algorithm can use the shifting and replacing operations of a bi-column bi-row push on a circular queue to increase security. A random number is used in this algorithm to control the shifting between rows and columns, which ultimately increases the difficulty of decrypting the plaintext.
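As a rough illustration of the operations described above (not the authors' exact algorithm), the Python sketch below converts letters to ASCII binary, loads the bits into a fixed-size circular queue, and rotates each block by an offset derived from a keyword letter and a Fibonacci number; the queue size, keyword, and Fibonacci offsets stand in for the tunable factors. Decryption would apply the inverse rotations in reverse order.

```python
from collections import deque

def fib(n):
    """First n Fibonacci numbers, used here as hypothetical rotation offsets."""
    a, b, out = 1, 1, []
    for _ in range(n):
        out.append(a)
        a, b = b, a + b
    return out

def encrypt(message, keyword, queue_size=64):
    """Toy circular-queue cipher: rotate ASCII bit blocks by keyword/Fibonacci offsets."""
    bits = ''.join(f'{ord(c):08b}' for c in message)        # letters -> ASCII binary
    offsets = fib(len(keyword))
    cipher_bits = []
    for j, start in enumerate(range(0, len(bits), queue_size)):
        q = deque(bits[start:start + queue_size])           # circular queue holding one block
        shift = ord(keyword[j % len(keyword)]) + offsets[j % len(offsets)]
        q.rotate(shift)                                      # circular shift driven by keyword + Fibonacci
        cipher_bits.append(''.join(q))
    return ''.join(cipher_bits)

print(encrypt("HELLO WORLD", "KEY"))
```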

2019 ◽  
Vol 10 (11) ◽  
pp. 1131-1135
Author(s):  
Tomas Hambili Paulo Sanjuluca ◽  
◽  
Ricardo Correia ◽  
Anabela Antunes de Almeida ◽  
Ana Gloria Diaz Martinez ◽  
...  

Introduction: A good assessment of the quality of maternal and child health care requires up-to-date and reliable information. Objective: To evaluate the impact of implementing a computerized database of clinical processes in the admission, archive and medical statistics section of the Irene Neto Maternity Hospital, Lubango, Angola. Methodology: A descriptive study with a quantitative and qualitative approach, carrying out a retrospective case study of delivery and newborn records from 2014 to 2017. Final considerations: The implementation of this project may contribute to improving clinical and management support at the hospital, as well as facilitating access to information for research and scientific production.


2020 ◽  
Vol 2020 (4) ◽  
pp. 25-32
Author(s):  
Viktor Zheltov ◽  
Viktor Chembaev

The article considers the calculation of the unified glare rating (UGR) based on the luminance spatial-angular distribution (LSAD). The local-estimations variant of the Monte Carlo method is proposed for modeling LSAD. On the basis of LSAD, it becomes possible to evaluate the quality of lighting by many criteria, including the generally accepted UGR. UGR allows a preliminary assessment of the level of visual comfort for performing a visual task in a lighting system. A new method of "pixel-by-pixel" calculation of UGR based on LSAD is proposed.
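For reference, the standard CIE formula that such a pixel-by-pixel evaluation ultimately feeds is UGR = 8 log10[(0.25 / L_b) Σ L_i² ω_i / p_i²]. The short Python sketch below evaluates it for a set of glare sources described by luminance, solid angle, and Guth position index; the numeric inputs are purely illustrative and not taken from the article.

```python
import math

def unified_glare_rating(background_luminance, sources):
    """
    CIE Unified Glare Rating:
        UGR = 8 * log10( (0.25 / L_b) * sum(L_i^2 * omega_i / p_i^2) )
    sources: iterable of (L, omega, p) tuples --
        L     luminaire luminance toward the eye [cd/m^2]
        omega solid angle subtended by the luminaire [sr]
        p     Guth position index (dimensionless)
    """
    s = sum(L**2 * omega / p**2 for L, omega, p in sources)
    return 8.0 * math.log10(0.25 / background_luminance * s)

# Hypothetical example: two luminaires against a 40 cd/m^2 background
print(unified_glare_rating(40.0, [(2000.0, 0.01, 1.5), (1500.0, 0.008, 2.0)]))
```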


2020 ◽  
Author(s):  
JAYDIP DATTA

With reference to earlier works such as MATHEMATICAL STATISTICS: AN APPLICATION BASED STATISTICS (December 2019, DOI: 10.13140/RG.2.2.32537.57446), DATA STRUCTURE & MANAGEMENT SYSTEM: A REVIEW (December 2019, DOI: 10.13140/RG.2.2.36453.96488), and OPTIMISATION: A VIEW FROM INDUSTRIAL ECONOMICS (January 2020, DOI: 10.13140/RG.2.2.35662.61764), this note highlights the corresponding aspects of general graduate engineering courses.


2021 ◽  
Vol 15 ◽  
pp. 174830262110080
Author(s):  
Changjun Zha* ◽  
Qian Zhang* ◽  
Huimin Duan

Traditional single-pixel imaging systems are aimed mainly at relatively static or slowly changing targets. When there is relative motion between the imaging system and the target, sizable deviations between the measured values and the real values can occur, resulting in poor quality of the reconstructed image. To solve this problem, a novel dynamic compressive imaging system is proposed. In this system, a single-column digital micro-mirror device is used to modulate the target image, and compressive measurement values are obtained for each column of the image. Based on an analysis of the measurement values, a new recovery model for dynamic compressive imaging is given. Unlike traditional reconstruction results, the measurement values of any column vector in the target image can be used to reconstruct the vectors of two adjacent columns at the same time. Based on the characteristics of these results, a method of image quality enhancement using an overlapping average algorithm is proposed. Simulation experiments and analysis show that the proposed dynamic compressive imaging system can effectively reconstruct the target image, and that when the moving speed of the system changes within a certain range, the system still reconstructs the original image well. The system overcomes the impact of dynamically changing speeds and affords significantly better performance than traditional compressive imaging.
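Under the property stated in the abstract, that each column measurement reconstructs the two adjacent columns, the sketch below shows a minimal version of the overlapping-average enhancement: every interior column is recovered twice and its two estimates are averaged. The compressive reconstruction step itself is abstracted away, and all array shapes are illustrative.

```python
import numpy as np

def overlap_average(column_pairs):
    """
    column_pairs[i] is an (N, 2) array holding the reconstruction of
    columns i and i+1 obtained from the i-th column measurement.
    Interior columns appear in two pairs; averaging their two estimates
    suppresses motion-induced deviations (overlapping average).
    """
    n_pairs = len(column_pairs)
    n_cols = n_pairs + 1
    height = column_pairs[0].shape[0]
    acc = np.zeros((height, n_cols))
    cnt = np.zeros(n_cols)
    for i, pair in enumerate(column_pairs):
        acc[:, i] += pair[:, 0]
        acc[:, i + 1] += pair[:, 1]
        cnt[i] += 1
        cnt[i + 1] += 1
    return acc / cnt  # each interior column is the mean of its two estimates

# Hypothetical example: 4 pair reconstructions of an 8-row image -> 5 columns
pairs = [np.random.rand(8, 2) for _ in range(4)]
print(overlap_average(pairs).shape)  # (8, 5)
```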


Author(s):  
Tianqi Jing ◽  
Shiwen He ◽  
Fei Yu ◽  
Yongming Huang ◽  
Luxi Yang ◽  
...  

Cooperation between mobile edge computing (MEC) and mobile cloud computing (MCC) in computation offloading can improve the quality of service (QoS) of user equipments (UEs) with computation-intensive tasks. In this paper, in order to minimize the expected charge, we focus on the problem of how to offload the computation-intensive task from the resource-scarce UE to the access points (APs) and the cloud, and on the density allocation of APs at the mobile edge. We consider three offloading computing modes and focus on the coverage probability of each mode and the corresponding ergodic rates. The resulting optimization problem is mixed-integer and non-convex in both the objective function and the constraints. We propose a low-complexity suboptimal algorithm called Iteration of Convex Optimization and Nonlinear Programming (ICONP) to solve it. Numerical results verify the superior performance of the proposed algorithm: optimal computing ratios and AP density allocation contribute to the charge savings.
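The abstract does not detail ICONP's steps; the sketch below is only a hedged illustration of the kind of decomposition its name suggests, alternating a continuous (convex) subproblem over the offloading ratio with an integer re-selection of the offloading mode. All cost terms, prices, and congestion weights are hypothetical.

```python
import numpy as np

# Hypothetical cost model: the fraction r of a task offloaded in a given mode
# incurs a remote price plus a quadratic congestion term; the rest runs locally.
LOCAL_COST = 4.0
REMOTE_PRICE = {"edge": 1.0, "cloud": 2.0}   # hypothetical per-unit charges
CONGESTION = {"edge": 5.0, "cloud": 2.0}     # hypothetical congestion weights

def charge(mode, r):
    return (1 - r) * LOCAL_COST + r * REMOTE_PRICE[mode] + CONGESTION[mode] * r**2

def iconp_like(n_iter=5):
    """Alternate a continuous (convex) subproblem with an integer one:
    1) for a fixed mode, pick the offloading ratio minimizing the convex charge;
    2) for that ratio, re-select the cheapest integer mode."""
    mode, r = "edge", 0.0
    for _ in range(n_iter):
        grid = np.linspace(0.0, 1.0, 1001)                 # stands in for a convex solver
        r = grid[np.argmin([charge(mode, g) for g in grid])]
        mode = min(REMOTE_PRICE, key=lambda m: charge(m, r))
    return mode, round(r, 3), round(charge(mode, r), 3)

print(iconp_like())  # -> ('edge', 0.3, 3.55) under these hypothetical costs
```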


2021 ◽  
Vol 14 ◽  
pp. 117954412199377
Author(s):  
Philip Muccio ◽  
Josh Schueller ◽  
Miriam van Emde Boas ◽  
Norm Howe ◽  
Edward Dabrowski ◽  
...  

Chronic lower back pain is one of the most common medical conditions leading to a significant decrease in quality of life. This study retrospectively analyzed whether the AxioBionics Wearable Therapy Pain Management (WTPM) System, a customized and wearable electrical stimulation device, alleviated chronic lower back pain, and improved muscular function. This study assessed self-reported pain levels using the visual analog scale before and during the use of the AxioBionics WTPM System when performing normal activities such as sitting, standing, and walking (n = 69). Results showed that both at-rest and activity-related pain were significantly reduced during treatment with the AxioBionics WTPM System (% reduction in pain: 64% and 60%, respectively; P < .05). Thus, this study suggests that the AxioBionics WTPM System is efficacious in treating chronic lower back pain even when other therapies have failed to sufficiently decrease reported pain levels.


2013 ◽  
Vol 694-697 ◽  
pp. 3675-3679
Author(s):  
Yi Xiang ◽  
Jun Peng ◽  
Qian Xiong ◽  
Liang Lei ◽  
Ming Ying You

Bilingual "Data Structure" classes in colleges and universities commonly suffer from a lack of qualified teachers, a steep learning curve, and poor teaching outcomes. This paper analyzes these problems and seeks to address the teacher-quality issue through the development of a network teaching platform and a supporting resource library for delivering high-quality bilingual teaching.


Author(s):  
Ahmad Al-Jarrah ◽  
Amer Albsharat ◽  
Mohammad Al-Jarrah

This paper proposes a new algorithm for text encryption that uses English words as the unit of encoding. The algorithm removes any feature that could be used to reveal the encrypted text by adopting variable code lengths for the English words, using a variable-length encryption key, applying two-dimensional binary shuffling techniques at the bit level, and using four binary logical operations with randomized shuffling inputs. Alphabetically sorted English words are divided into four lookup tables, and each word is assigned an index. The strength of the proposed algorithm derives from several components. Firstly, each lookup table uses a different index size, and none of the index sizes is a multiple of a byte. Secondly, the shuffling operations are conducted on a two-dimensional binary matrix of variable length. Lastly, the parameters of the shuffling operation are randomized based on a randomly selected encryption key of varying size, so the shuffling operations move adjacent bits apart in a randomized fashion. As a result, the proposed algorithm erases any signature or statistical feature of the original message. Moreover, the algorithm reduces the size of the encrypted message, an added advantage achieved by using the smallest possible index size for each lookup table.
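A minimal, hypothetical sketch of the word-as-unit encoding step: words are looked up in small tables with non-byte-aligned index widths, the indices are concatenated into one bit string, and a key-seeded permutation shuffles the bits. The tables, widths, and key handling here are illustrative only, not the paper's actual parameters or shuffling operations.

```python
import random

# Hypothetical lookup tables: word -> (table id, index), with non-byte-aligned widths
TABLES = {
    "the": (0, 0), "a": (0, 1),          # table 0: 5-bit indices
    "queue": (1, 0), "data": (1, 1),     # table 1: 9-bit indices
}
WIDTHS = {0: 5, 1: 9}

def encode(words):
    """Concatenate variable-length table indices into a single bit string."""
    bits = ""
    for w in words:
        table, idx = TABLES[w.lower()]
        bits += format(table, "02b") + format(idx, f"0{WIDTHS[table]}b")
    return bits

def shuffle_bits(bits, key):
    """Key-seeded permutation of bit positions (stands in for the paper's
    two-dimensional shuffling and logical operations)."""
    order = list(range(len(bits)))
    random.Random(key).shuffle(order)
    return "".join(bits[i] for i in order)

cipher = shuffle_bits(encode(["the", "data", "queue"]), key=1234)
print(cipher)
```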

