Parallel Computing Enabled Cloud-Based IoT Applications

2021 ◽  
Author(s):  
Abiraami T V ◽  
Maithili K ◽  
Nivetha J E

In delay-sensitive IoT applications, acquiring and processing data from sensor devices in a cloud-based computing environment results in high computational cost and an inefficient system. Cloud servers dedicated to performing large tasks struggle because IoT applications collect data frequently, and the constant retrieval and updating this entails requires continual synchronization of data in the cloud. As a result, cloud servers and virtual machines tend to lose the capacity to compute large tasks. Hence, the scope for parallel computing in cloud-based IoT applications is explored using parallel shared-memory models of computation. The Parallel Random Access Machine (PRAM) is introduced into cloud-based IoT applications, and parallel algorithms are designed to eliminate the conflicts encountered in the proposed model. In particular, conflict-resolution algorithms for Concurrent Read Concurrent Write (CRCW) PRAM and Concurrent Read Exclusive Write (CREW) PRAM are introduced, which improve the efficiency of cloud-based IoT applications.
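The CREW/CRCW distinction the abstract refers to can be illustrated with a small simulation; the following sketch is not the paper's algorithm, and the function names and the "priority" combining rule are assumptions chosen for illustration (priority is one of the standard CRCW conflict-resolution rules).

```python
# Illustrative sketch: simulating PRAM write-conflict policies for one
# synchronous step. A "write" is a tuple (address, value, processor_id).

class WriteConflict(Exception):
    pass

def crew_step(memory, writes):
    """CREW PRAM: concurrent reads are allowed, but two processors
    writing the same cell in one step is a conflict and is rejected."""
    targets = [addr for addr, _, _ in writes]
    if len(targets) != len(set(targets)):
        raise WriteConflict("two processors wrote the same cell")
    for addr, value, _ in writes:
        memory[addr] = value

def crcw_priority_step(memory, writes):
    """CRCW PRAM with the priority rule: when several processors write
    the same cell, the lowest-numbered processor wins."""
    winners = {}
    for addr, value, pid in writes:
        if addr not in winners or pid < winners[addr][1]:
            winners[addr] = (value, pid)
    for addr, (value, _) in winners.items():
        memory[addr] = value

mem = [0] * 4
# Processors 1 and 2 both write cell 0; processor 1 has higher priority.
crcw_priority_step(mem, [(0, 7, 2), (0, 5, 1), (3, 9, 0)])
# mem[0] == 5 (processor 1 wins), mem[3] == 9
```

Under CREW the same pair of writes to cell 0 would raise `WriteConflict` instead, which is why conflict-resolution algorithms are needed before a CRCW-style model can be applied.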

1997 ◽  
Vol 62 (2) ◽  
pp. 103-110 ◽  
Author(s):  
Dany Breslauer ◽  
Artur Czumaj ◽  
Devdatt P. Dubhashi ◽  
Friedhelm Meyer auf der Heide

2019 ◽  
Vol 18 (1) ◽  
pp. 709-723 ◽  
Author(s):  
Derya Malak ◽  
Howard Huang ◽  
Jeffrey G. Andrews

2020 ◽  
Vol 33 (18) ◽  
pp. 7777-7786
Author(s):  
Kaiyue Shan ◽  
Xiping Yu

Abstract The establishment of a tropical cyclone (TC) trajectory model that represents the basic physics and is practically advantageous in terms of both accuracy and computational cost is essential to climatological studies of global TC activity. In this study, a simple deterministic model is proposed based on a newly developed semiempirical formula for the beta drift under known conditions of the environmental steering flow. To verify the proposed model, all historical TC tracks in the western North Pacific and North Atlantic Ocean basins during the period 1979–2018 are simulated and statistically compared with results derived from observed data. The proposed model is shown to capture well the spatial distribution patterns of TC occurrence frequency in the two ocean basins. Prevailing TC tracks, as well as the latitudinal distribution of landfalling TC counts in the western North Pacific Ocean basin, also agree better with the results derived from observed data than do existing models that take different strategies to include the effect of the beta drift. It is concluded that the present model is advantageous in terms of not only accuracy but also the capacity to accommodate a varying climate, and the proposed TC trajectory model therefore has the potential to be used for assessing possible impacts of climate change on tropical cyclone activity.
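The structure of such a trajectory model (storm motion = environmental steering plus beta drift) can be sketched as follows. This is only an illustration of the integration scheme, not the paper's model: the semiempirical beta-drift formula is not reproduced in the abstract, so `beta_drift` below is a placeholder constant northwestward drift, and `steering_flow` is a stand-in for gridded environmental winds.

```python
# Toy trajectory integration: dX/dt = steering flow + beta drift.
# All fields and parameter values here are illustrative assumptions.
import math

def steering_flow(lon, lat, t):
    """Placeholder environmental steering (m/s): easterly flow in the
    tropics, turning westerly poleward of 25N."""
    u = -5.0 if lat < 25.0 else 5.0
    v = 1.0
    return u, v

def beta_drift():
    """Placeholder beta drift (m/s): roughly 2 m/s toward the northwest,
    a common first approximation."""
    return -1.4, 1.4

def advect(lon, lat, hours, dt_hours=6.0):
    """Forward-Euler integration of the track in degrees lon/lat."""
    R = 6.371e6  # Earth radius, m
    t = 0.0
    while t < hours:
        u_s, v_s = steering_flow(lon, lat, t)
        u_b, v_b = beta_drift()
        u, v = u_s + u_b, v_s + v_b
        dt = dt_hours * 3600.0
        lat += math.degrees(v * dt / R)
        lon += math.degrees(u * dt / (R * math.cos(math.radians(lat))))
        t += dt_hours
    return lon, lat

# Five-day track starting near 140E, 15N: moves west-northwest.
lon, lat = advect(140.0, 15.0, hours=120)
```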


Author(s):  
Yong Xiao ◽  
Ling Wei ◽  
Junhao Feng ◽  
Wang En

Edge computing has emerged to meet the ever-increasing computation demands of delay-sensitive Internet of Things (IoT) applications. However, the computing capability of an edge device, whether a computing-enabled end user or an edge server, is insufficient to support the massive numbers of tasks generated by IoT applications. In this paper, we propose a two-tier end-edge collaborative computation offloading policy that supports as many computation-intensive tasks as possible while keeping the edge computing system strongly stable. We formulate the two-tier end-edge collaborative offloading problem with the objective of minimizing the task processing and offloading cost, subject to the stability of the queue lengths of end users and edge servers. We analyze the Lyapunov drift-plus-penalty properties of the problem and then propose a cost-aware computation offloading (CACO) algorithm that finds optimal two-tier offloading decisions, minimizing cost while keeping the edge computing system stable. Our simulation results show that the proposed CACO outperforms the benchmark algorithms, especially under varying numbers of end users and edge servers.
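The drift-plus-penalty idea behind such policies can be sketched in a few lines. This is a generic Lyapunov drift-plus-penalty rule, not the paper's CACO algorithm: each slot, the action minimizing V·cost plus the relevant queue backlog is chosen, so the parameter V trades cost against queue stability. The cost values, service rates, and single-user/single-edge topology are illustrative assumptions.

```python
# Hedged sketch of a drift-plus-penalty offloading rule.

def offload_decision(q_user, q_edge, local_cost, offload_cost, V):
    """Return 'local' or 'offload' for one unit-size task.

    Serving locally grows the user queue; offloading grows the edge
    queue. V weights cost against backlog (larger V favors low cost)."""
    local_score = V * local_cost + q_user
    offload_score = V * offload_cost + q_edge
    return "local" if local_score <= offload_score else "offload"

def simulate(arrivals, V=10.0, local_cost=3.0, offload_cost=1.0,
             user_rate=1, edge_rate=2):
    """Toy single-user / single-edge queue simulation; `arrivals` gives
    tasks per slot. Returns the post-service queue lengths per slot."""
    q_user = q_edge = 0
    history = []
    for a in arrivals:
        for _ in range(a):
            if offload_decision(q_user, q_edge, local_cost,
                                offload_cost, V) == "local":
                q_user += 1
            else:
                q_edge += 1
        q_user = max(q_user - user_rate, 0)  # local service
        q_edge = max(q_edge - edge_rate, 0)  # edge service
        history.append((q_user, q_edge))
    return history

hist = simulate([2] * 50)
# The edge is cheaper and fast enough here, so tasks offload and both
# queues stay bounded, i.e., the system is stable.
```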


Author(s):  
Yujiang Xie ◽  
Catherine A. Rychert ◽  
Nicholas Harmon ◽  
Qinya Liu ◽  
Dirk Gajewski

Abstract Full waveform inversion, or adjoint tomography, is routinely performed to image the internal structure of the Earth at high resolution. This is typically done using the Fréchet kernels and the approximate Hessian, or the approximate inverse Hessian, because of the high computational cost of computing and storing the full Hessian. Alternatively, the full Hessian kernels can be used to improve inversion resolution and convergence rates, as well as possibly to mitigate interparameter trade-offs. The storage requirements of the full Hessian kernel calculations can be reduced by compression methods, but often at a price in accuracy that depends on the compression factor. Here, we present open-source codes to compute both Fréchet and full Hessian kernels on the fly in computer random access memory (RAM) by simultaneously solving four wave equations, an approach we call the Quad Spectral-Element Method (QuadSEM). By recomputing two forward fields at the same time that two adjoint fields are calculated during the adjoint simulation, QuadSEM constructs the full Hessian kernels from the exact forward and adjoint fields. In addition, we implement an alternative approach based on the classical wavefield storage method (WSM), which stores the forward wavefields every kth (k≥1) timestep during the forward simulation and reads the required fields back into memory during the adjoint simulation for kernel construction. Both Fréchet and full Hessian kernels can be computed simultaneously with either the QuadSEM or the WSM code, at only double the computational cost of computing the Fréchet kernels alone. Compared with WSM, QuadSEM reduces disk space and input/output cost by three orders of magnitude in the presented examples, which use 15,000 timesteps. Numerical examples are presented to demonstrate the functionality of the methods, and the computer codes are provided with this contribution.
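The storage-versus-recomputation trade-off the abstract describes can be shown with a toy scalar "wavefield". This is not the QuadSEM code: the update rule is a placeholder, and the point is only that WSM pays storage proportional to nsteps/k while the recompute-on-the-fly strategy pays extra forward computation instead.

```python
# Toy illustration of the WSM vs. recompute-on-the-fly trade-off.

def forward_step(u, t):
    """Placeholder one-timestep forward update of a scalar 'wavefield'."""
    return 0.99 * u + 0.01 * t

def wsm_forward(u0, nsteps, k):
    """WSM strategy: run forward once, storing the field every k-th
    timestep; storage (disk/IO in a real solver) grows as nsteps / k."""
    stored = {}
    u = u0
    for t in range(nsteps):
        if t % k == 0:
            stored[t] = u
        u = forward_step(u, t)
    return stored

def recompute_forward_at(u0, t_target):
    """On-the-fly strategy: store nothing; when the adjoint sweep needs
    the forward field at t_target, re-advance it from t = 0."""
    u = u0
    for t in range(t_target):
        u = forward_step(u, t)
    return u

stored = wsm_forward(1.0, nsteps=100, k=10)   # 10 snapshots stored
exact = recompute_forward_at(1.0, 50)          # no storage, extra compute
# Both strategies yield the identical forward field at t = 50.
```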


2021 ◽  
Vol 11 (3) ◽  
pp. 19-32
Author(s):  
Shahin Fatima ◽  
Shish Ahmad

Cloud computing has become a feasible solution for the virtualization of cloud resources. Although it has great potential to attract individuals by offering many benefits to organizations, security loopholes remain when data are outsourced. To ensure the security of data in cloud computing, quantum key cryptography is introduced. Quantum cryptography makes use of quantum mechanics and qubits. The proposed method uses quantum key distribution with Kerberos to secure data in the cloud. The paper discusses a model for quantum key distribution that uses the Kerberos ticket distribution center to authenticate cloud service providers. The proposed model is compared with plain quantum key distribution and provides faster computation with a lower error rate.
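The key-distribution step can be illustrated with a classical simulation of BB84-style basis sifting. This is an assumption for illustration only: the abstract does not name the QKD protocol, real QKD requires quantum hardware, and the Kerberos integration is not modeled here.

```python
# Classical toy simulation of BB84 key sifting (no eavesdropper, no
# channel noise). All of this is illustrative, not the paper's scheme.
import random

def bb84_sift(n, seed=0):
    """Alice sends n qubits encoded in random bases; Bob measures in
    random bases; both keep only the bits where their bases matched,
    yielding about n/2 shared key bits on average."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.randint(0, 1) for _ in range(n)]  # 0 = +, 1 = x
    bob_bases   = [rng.randint(0, 1) for _ in range(n)]
    key = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if a_basis == b_basis:   # matching basis: Bob reads the bit exactly
            key.append(bit)
    return key

key = bb84_sift(64)
# With no eavesdropper, Alice's and Bob's sifted keys agree bit for bit;
# in the paper's model, this shared key would then be used alongside
# Kerberos-based authentication of the cloud service provider.
```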

