computing capacity
Recently Published Documents


TOTAL DOCUMENTS

77
(FIVE YEARS 33)

H-INDEX

7
(FIVE YEARS 2)

2022 ◽  
Author(s):  
Ozgur Umut Akgul ◽  
Wencan Mao ◽  
Byungjin Cho ◽  
Yu Xiao

<div>Edge/fog computing is a key enabling technology in 5G and beyond for fulfilling the tight latency requirements of compute-intensive vehicular applications such as cooperative driving. Given the spatio-temporal variation in vehicular traffic flows and in the demand for edge computing capacity generated by connected vehicles, vehicular fog computing (VFC) has been proposed as a cost-efficient deployment model that complements stationary fog nodes with mobile ones carried by moving vehicles. Assessing the feasibility and applicability of such a hybrid topology, and further planning and managing the networking and computing resources at the edge, require a deep understanding of the spatio-temporal variations in the demand for and supply of edge computing capacity, as well as of the trade-offs between achievable Quality-of-Service and potential deployment and operating costs. To meet these requirements, we propose in this paper an open platform for simulating the VFC environment and for evaluating the performance and cost efficiency of capacity planning and resource allocation strategies under diverse physical conditions and business strategies. Compared with existing edge/fog computing simulators, our platform supports the mobility of fog nodes and provides realistic modeling of vehicular networking over 5G-and-beyond networks in urban environments. We demonstrate the functionality of the platform using city-scale VFC capacity planning as an example. The simulation results provide insights into the feasibility of different deployment strategies from both technical and financial perspectives.</div>
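The core planning question the abstract describes — whether vehicle-carried fog nodes can cover demand peaks that stationary nodes alone cannot — can be illustrated with a toy hourly supply-vs-demand model. This is a minimal sketch under assumed inputs (the function name, capacity units, and traffic figures are hypothetical, not the paper's simulator):

```python
def hourly_unmet_demand(stationary_capacity, vehicles_per_hour,
                        per_vehicle_capacity, demand):
    """Toy hourly supply-vs-demand model for a hybrid VFC deployment.

    Supply per hour = fixed stationary fog capacity plus the capacity
    carried by vehicles present in that hour; returns unmet demand per hour.
    """
    unmet = []
    for vehicles, d in zip(vehicles_per_hour, demand):
        supply = stationary_capacity + vehicles * per_vehicle_capacity
        unmet.append(max(0.0, d - supply))
    return unmet

# Rush-hour traffic brings extra capacity exactly when demand peaks.
print(hourly_unmet_demand(100.0, [10, 50, 20], 2.0, [150.0, 210.0, 120.0]))
# → [30.0, 10.0, 0.0]
```

The illustration shows the spatio-temporal coupling the abstract emphasizes: mobile capacity tracks the same traffic flows that generate the demand.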


2022 ◽  
Author(s):  
Bin Xu ◽  
Tao Deng ◽  
Yichuan Liu ◽  
Yunkai Zhao ◽  
Zipeng Xu ◽  
...  

Abstract The combination of idle computing resources in mobile devices and the computing capacity of mobile edge servers enables all available devices in an edge network to complete computing tasks cooperatively, effectively improving the computing capacity of the edge network; this is a research hotspot for 5G technology applications. Previous research has focused on minimizing energy consumption and/or delay when formulating the computation offloading strategy, but has neglected the cost charged by the collaborating devices (mobile devices, mobile edge servers, etc.); we therefore propose a cost-based collaborative computation offloading model. In this model, a task that requests assistance from these devices must pay the corresponding calculation cost, and the task is offloaded and computed on this basis. In addition, for this model we propose an adaptive neighborhood search based on a simulated annealing algorithm (ANSSA) to jointly optimize the offloading decision and resource allocation with the goal of minimizing the sum of energy consumption and calculation cost. The adaptive mechanism enables different operators to update their selection probabilities according to historical experience and environmental perception, giving the individual evolution a degree of autonomy. Extensive experiments conducted on mobile user instances of different scales show that the ANSSA achieves satisfactory time performance with guaranteed solution quality. The experimental results demonstrate the superiority of the mobile edge computing (MEC) offloading system. It is of great significance to strike a balance between extending the lifetime of smart mobile devices and breaking the performance bottleneck of MEC servers.
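The objective described above — simulated annealing over offloading decisions, minimizing energy consumption plus the calculation cost paid to helper devices — can be sketched in a few lines. This is a generic simulated-annealing skeleton under assumed inputs (the cost tables, function names, and single-swap neighborhood are illustrative; ANSSA's adaptive operator selection is not reproduced here):

```python
import math
import random

def total_cost(assignment, energy, price):
    # Energy of running each task on its assigned device, plus the
    # per-task calculation cost charged by that device.
    return sum(energy[t][d] + price[d] for t, d in enumerate(assignment))

def anneal_offloading(energy, price, t0=10.0, cooling=0.95, steps=500, seed=0):
    """Simulated-annealing search over task -> device offloading decisions.

    energy[t][d]: energy to run task t on device d; price[d]: fee per task.
    """
    rng = random.Random(seed)
    n_tasks, n_devices = len(energy), len(price)
    current = [rng.randrange(n_devices) for _ in range(n_tasks)]
    best = list(current)
    temp = t0
    for _ in range(steps):
        cand = list(current)
        cand[rng.randrange(n_tasks)] = rng.randrange(n_devices)  # neighborhood move
        delta = total_cost(cand, energy, price) - total_cost(current, energy, price)
        # Always accept improvements; accept worse moves with probability e^(-delta/T).
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            current = cand
        if total_cost(current, energy, price) < total_cost(best, energy, price):
            best = list(current)
        temp *= cooling
    return best, total_cost(best, energy, price)
```

For a toy instance where device 0 is both cheaper and more energy-efficient, `anneal_offloading([[1.0, 5.0], [1.0, 5.0]], [0.0, 0.0])` converges to assigning both tasks to device 0.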


Author(s):  
Raghavendra Devidas ◽  
Hrushikesh Srinivasachar

With increased vulnerabilities and vast technology landscapes, it is extremely critical to build systems that are highly resistant to cyber-attacks. It is almost impossible to build 100% secure authentication & authorization mechanisms merely through a standard password/PIN (with all combinations of special characters, numbers, and upper/lower-case letters, or by using any of the graphical password mechanisms). Immense computing capacity and the variety of hacking methods in use make almost every authentication method susceptible to cyber-attacks in one way or another. The only proven system that remains resistant despite highly sophisticated computing power is the human brain. In this paper, we present a new method of authentication that combines a computer's computing ability with human intelligence. This human intelligence is, in fact, personalized, making the overall security method more secure. Text-based passwords are easy to crack [6], so there is an increased need for alternative and more complex authentication and authorization methods. Some of the methods [7] [8] in the category of graphical passwords can be compromised when shoulder surfing, cameras, or spy devices are used.


2021 ◽  
Vol 9 (2) ◽  
pp. 5-35
Author(s):  
César Daltoé Berci ◽  
Ceslo Pascoli Bottura

Several characteristics of financial time series are of interest both from an academic point of view, which aims to analyze the dynamics of the data and its numerical properties, and from the investors' point of view, as they use this knowledge to generate profit in their financial transactions. By applying several analysis tools and using massive computing capacity, the numerical and statistical properties of the assets that compose the IBOVESPA index were evaluated. Given the relevance and scope of the analyzed time series, the results obtained from this analysis can serve as a basis for the characterization of financial time series.


2021 ◽  
pp. 1-6
Author(s):  
Jason M. Pudlo ◽  
William C. Ellis ◽  
Jamie M. Cole

ABSTRACT Increased computing capacity and the spread of computational knowledge have generated the expectation that organizations and municipalities use large quantities of data to drive decision making. However, municipalities may lack the resources to meaningfully use their data for decision making. Relatedly, political science and public administration programs face the challenge of training students for success in this environment. We believe one remedy is the adoption of coproduction as a pedagogical strategy. This article presents a case study of a partnership between a university research team and a municipal emergency communications center as a demonstration of how coproduction can be harnessed as a teaching tool. Findings from this project were presented at the Southern Political Science Association Annual Meeting, January 8–11, 2020, in San Juan, Puerto Rico.


Electronics ◽  
2021 ◽  
Vol 10 (17) ◽  
pp. 2176
Author(s):  
Jingyu Liu ◽  
Qiong Wang ◽  
Dunbo Zhang ◽  
Li Shen

Deep learning has achieved outstanding results in various machine learning tasks against the background of rapidly increasing computing capacity. However, as models achieve higher performance, their size grows, training and inference take longer, memory and storage occupancy increase, computing efficiency shrinks, and energy consumption rises. Consequently, it is difficult to run these models on edge devices such as micro and mobile devices. Model compression techniques, such as model quantization, are therefore emerging as an active research area. Quantization-aware training accounts for the accuracy loss caused by data mapping during model training: it clamps and approximates the data when updating parameters and introduces the quantization errors into the model's loss function. During quantization, we found that some stages of two super-resolution networks, SRGAN and ESRGAN, are sensitive to quantization, which greatly reduces performance. Therefore, we use higher-bit integer quantization for the sensitive stages and train the whole model with quantization-aware training. Although some model-size reduction is sacrificed, accuracy approaching that of the original model is achieved: the ESRGAN model is still reduced by nearly 67.14% and the SRGAN model by nearly 68.48%, while inference time is reduced by nearly 30.48% and 39.85%, respectively. Moreover, the PI values of SRGAN and ESRGAN are 2.1049 and 2.2075, respectively.
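The mechanism the abstract relies on — simulating integer rounding during training so the loss function "sees" the quantization error, and giving sensitive stages more bits — can be illustrated with a fake-quantize round trip. This is a minimal sketch of uniform symmetric quantization (the function name and tensor are illustrative, not the paper's exact scheme):

```python
import numpy as np

def fake_quantize(x, num_bits=8):
    """Quantize to signed integers, then dequantize back to floats.

    The round trip injects the same rounding error a real integer kernel
    would produce, which is the core of quantization-aware training.
    """
    qmax = 2 ** (num_bits - 1) - 1
    peak = np.max(np.abs(x))
    scale = peak / qmax if peak > 0 else 1.0
    q = np.clip(np.round(x / scale), -qmax - 1, qmax)  # integer grid
    return q * scale                                    # back to float

x = np.linspace(-1.0, 1.0, 11)
err8 = np.abs(fake_quantize(x, 8) - x).max()  # 8-bit error
err4 = np.abs(fake_quantize(x, 4) - x).max()  # 4-bit error
```

Comparing `err4` and `err8` shows why assigning higher bit-widths to quantization-sensitive stages preserves accuracy: the 4-bit round trip leaves a noticeably larger worst-case error than the 8-bit one.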


PLoS ONE ◽  
2021 ◽  
Vol 16 (8) ◽  
pp. e0256665
Author(s):  
Muhammad Rabani Mohd Romlay ◽  
Azhar Mohd Ibrahim ◽  
Siti Fauziah Toha ◽  
Philippe De Wilde ◽  
Ibrahim Venkat

Low-end LiDAR sensors provide an alternative for depth measurement and object recognition on lightweight devices. However, due to low computing capacity, complicated algorithms cannot be executed on such devices, and the sparse information further limits the features available for extraction. Therefore, a classification method is required that accepts sparse input while providing ample leverage for the classification process to accurately differentiate objects within limited computing capability. To achieve reliable feature extraction from a sparse LiDAR point cloud, this paper proposes a novel Clustered Extraction and Centroid-Based Clustered Extraction (CE-CBCE) method for feature extraction, followed by a convolutional neural network (CNN) object classifier. The integration of the CE-CBCE and CNN methods enables us to utilize lightweight actuated LiDAR input and provides a low-computation means of classification while maintaining accurate detection. Based on genuine LiDAR data, the final results show a reliable accuracy of 97% for the proposed method.
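The general idea of centroid-based clustered extraction — grouping sparse points into spatial clusters and summarizing each by a compact descriptor cheap enough for a low-power device — can be sketched with a grid-based grouping step. This is a hypothetical stand-in for the CE-CBCE stage (the grid clustering, function name, and centroid-plus-count descriptor are assumptions; the paper's exact procedure may differ):

```python
import numpy as np

def centroid_cluster_features(points, grid=1.0):
    """Group sparse 3-D points into coarse grid cells and summarize each
    cluster as (centroid_x, centroid_y, centroid_z, point_count)."""
    cells = {}
    for p in points:
        key = tuple(np.floor(p / grid).astype(int))  # cell index of this point
        cells.setdefault(key, []).append(p)
    feats = []
    for pts in cells.values():
        pts = np.asarray(pts)
        feats.append(np.append(pts.mean(axis=0), len(pts)))  # centroid + count
    return np.asarray(feats)

# Two nearby points and one distant point -> two clusters, two feature rows.
cloud = np.array([[0.1, 0.2, 0.0], [0.3, 0.1, 0.0], [5.0, 5.0, 0.0]])
feats = centroid_cluster_features(cloud)
```

Each row of `feats` is a fixed-length descriptor, which is the kind of compact, sparse-tolerant input a small CNN classifier can consume.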


2021 ◽  
Vol 8 (2) ◽  
pp. 102-115
Author(s):  
Yamini D Shah ◽  
Shailvi M Soni ◽  
Manish P Patel

Artificial Intelligence (AI) is described as a field of science and engineering concerned with the computational understanding of what is commonly called intelligent behavior, and with the creation of artifacts that exhibit such behavior. AI is a broad concept that encompasses a range of technologies (many of which have been under development for decades) intended to apply human-like insight to problem solving. Currently, fueled by a tremendous increase in computing capacity and an even greater increase in data, we are experiencing renewed enthusiasm for AI. AI, along with machine learning, can be used in computer vision, and further advances in engineering as well as in medicine can build on these developments worldwide. Healthcare is seen as the next domain to be transformed by artificial intelligence. AI methods are applied to critical diseases such as cancer, neurological disorders, cardiovascular disease, and diabetes. This review covers the current status of AI applications in medical services, including several progressive explorations that offer a perspective on a future in which healthcare delivery is gradually reshaped alongside human interaction. Likewise, this review discusses how AI and machine learning can save lives, and serves as a guide for healthcare professionals on how, when, and where AI can be most effective and achieve the desired outcomes.

