Parallel Implementation of Multi-Layer Neural Network Based on Cloud Computing for Real Time Data Transmission in Large Offshore Platform

2017, Vol 24 (s2), pp. 39-44
Author(s): Zhang Hu, Wei Qin

Abstract With the rapid development of electronic, network, and cloud computing technologies, data volumes are growing massively and we have entered the era of big data. Based on cloud computing clusters, this paper proposes a novel method for the parallel implementation of multi-layered neural networks using Map-Reduce. To meet the requirements of big data processing, it presents an efficient mapping scheme for a fully connected multi-layered neural network trained with the error back propagation (BP) algorithm on Map-Reduce cloud computing clusters (MRBP). Batch-training (or epoch-training) regimes are realized by partitioning the samples across the cluster: each node trains separately on its own partition, and the weights are aggregated at each iteration until convergence. The time required to run the parallel BP algorithm on the cluster and a serial BP algorithm on a uniprocessor is derived, and performance parameters such as speed-up and the optimal and minimum numbers of data nodes are evaluated for the parallel algorithm. Experimental results demonstrate that the proposed parallel BP algorithm achieves better speed-up, a faster convergence rate, and fewer iterations than existing algorithms.
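To make the training scheme concrete, here is a minimal single-process sketch of the MRBP idea: samples are partitioned into shards, each "mapper" runs one BP pass over its shard and emits gradients, and a "reducer" averages them into a single weight update per epoch. The toy network, data, and all names are illustrative assumptions, not taken from the paper, and the map/reduce phases are emulated in one process rather than on a real cloud cluster.

```python
# Minimal sketch of MapReduce-style batch BP training (the MRBP idea):
# each "mapper" computes gradients on its data shard, a "reducer"
# averages them, and the driver applies the update once per epoch.
# Everything here is illustrative; a real system would run the map
# and reduce phases on a Hadoop/Map-Reduce cluster, not in one process.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1200, 8))                 # toy inputs
y = (X.sum(axis=1, keepdims=True) > 0) * 1.0   # toy binary targets

W1 = rng.normal(scale=0.1, size=(8, 16))       # input -> hidden
W2 = rng.normal(scale=0.1, size=(16, 1))       # hidden -> output

def map_phase(shard_X, shard_y, W1, W2):
    """One mapper: forward + backward pass over its shard, emit gradients."""
    h = np.tanh(shard_X @ W1)
    out = 1 / (1 + np.exp(-(h @ W2)))          # sigmoid output
    err = out - shard_y                        # dLoss/d(pre-sigmoid), log loss
    gW2 = h.T @ err / len(shard_X)
    gh = (err @ W2.T) * (1 - h**2)             # backprop through tanh
    gW1 = shard_X.T @ gh / len(shard_X)
    return gW1, gW2

def reduce_phase(grads):
    """The reducer: average the per-shard gradients into one update."""
    gW1s, gW2s = zip(*grads)
    return np.mean(gW1s, axis=0), np.mean(gW2s, axis=0)

lr, n_shards = 0.5, 4
shards = list(zip(np.array_split(X, n_shards), np.array_split(y, n_shards)))
for epoch in range(200):                       # one Map-Reduce round per epoch
    grads = [map_phase(sx, sy, W1, W2) for sx, sy in shards]  # parallel in reality
    gW1, gW2 = reduce_phase(grads)
    W1 -= lr * gW1
    W2 -= lr * gW2
```

Because each mapper sees only its own shard, the per-epoch communication cost is just the gradient exchange, which is what gives the speed-up the abstract measures against the serial uniprocessor baseline.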

2019, Vol 3 (2), pp. 152
Author(s): Xianglan Wu

In today's society, the rise and rapid development of the Internet produce a huge amount of data every day, so traditional data processing and storage models can no longer fully analyze and mine these data. More and more new information technologies (such as cloud computing, virtualization, and big data) have emerged and been applied, networks have moved from informatization toward intelligence, and campus construction has entered the smart campus stage. Smart campus construction draws on big data and cloud computing technology to improve the quality of information services in colleges and universities by integrating, storing, and mining huge volumes of data.


Big Data, 2016, pp. 2165-2198
Author(s): José Carlos Cavalcanti

Analytics (the discovery and communication of meaningful patterns in data) of Big Data (characterized by large structured and unstructured data volumes, from a variety of sources, arriving at high velocity, i.e., real-time data capture, storage, and analysis), delivered through Cloud Computing (a model of network computing), is becoming the new "ABC" of information and communication technologies (ICTs), with important effects on the creation of new firms and the restructuring of established ones. However, as this chapter argues, successful application of these new ABC technologies and tools depends on two interrelated policy aspects: 1) the use of a proper model for approaching the structure and dynamics of the firm, and 2) how the complex trade-off between information technology (IT) and communication technology (CT) costs is handled within, between, and beyond firms, organizations, and institutions.


Author(s): Amitava Choudhury, Kalpana Rangra

The type and amount of data in human society are growing at an amazing speed, driven by emerging services such as cloud computing, the internet of things, and location-based services: the era of big data has arrived. As data has become a fundamental resource, how to manage and utilize big data better has attracted much attention. Especially with the development of the internet of things, processing large amounts of real-time data has become a great challenge in research and applications. Cloud computing technology has recently attracted much attention for its high performance, but how to use it for large-scale real-time data processing remains under-studied. In this chapter, various big data processing techniques are discussed.


Author(s): Rajni Aron, Deepak Kumar Aggarwal

Cloud Computing has become a buzzword in the IT industry. By providing inexpensive computing resources on a pay-as-you-go basis, Cloud Computing is rapidly gaining momentum as a substitute for traditional Information Technology (IT) infrastructure in organizations, and the increased utilization of Clouds makes the execution of Big Data processing jobs a vital research area. As more and more users store and process their real-time data in Cloud environments, resource provisioning and scheduling of Big Data processing jobs become key considerations for the efficient execution of Big Data applications. This chapter discusses the fundamental concepts behind the terms Cloud Computing and Big Data and the relationship between them. It will help researchers identify the important characteristics Cloud Resource Management Systems need in order to handle Big Data processing jobs, and to select the most suitable technique for processing such jobs in a Cloud Computing environment.
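The chapter surveys techniques rather than prescribing one; as a generic illustration of the provisioning problem it frames, the sketch below greedily packs Big Data jobs onto pay-as-you-go VM instances, renting the cheapest adequate instance whenever no running one has room. The VM catalog, prices, and job demands are invented for the example and stand in for whatever a real Cloud Resource Management System would expose.

```python
# Generic illustration (not the chapter's method): greedy first-fit
# provisioning of Big Data jobs onto pay-as-you-go VM instances.
# VM types, hourly prices, and job core demands below are invented.
from dataclasses import dataclass, field

@dataclass
class VM:
    kind: str
    cores: int
    price_per_hour: float
    free_cores: int = field(init=False)
    def __post_init__(self):
        self.free_cores = self.cores

VM_CATALOG = [("small", 2, 0.05), ("medium", 4, 0.10), ("large", 8, 0.20)]

def provision(jobs):
    """Assign each (name, cores_needed) job to the first running VM with
    room; rent the cheapest adequate catalog type when none fits."""
    fleet, plan = [], []
    for name, need in sorted(jobs, key=lambda j: -j[1]):   # big jobs first
        vm = next((v for v in fleet if v.free_cores >= need), None)
        if vm is None:
            kind, cores, price = min(
                (t for t in VM_CATALOG if t[1] >= need), key=lambda t: t[2])
            vm = VM(kind, cores, price)
            fleet.append(vm)
        vm.free_cores -= need
        plan.append((name, vm.kind))
    cost = sum(v.price_per_hour for v in fleet)            # one billed hour
    return plan, cost

plan, hourly_cost = provision([("etl", 3), ("train", 6), ("report", 1)])
```

Even this toy version shows the trade-off such systems manage: packing jobs tightly lowers the pay-as-you-go bill, while spreading them out improves per-job performance.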


2014, Vol 1 (2), pp. 1-17
Author(s): Hoda Ahmed Abdelhafez

The internet era creates new types of large and real-time data; much of it is non-standard, such as streaming and sensor-generated data. Advanced big data technologies enable organizations to extract insights from such sophisticated data. Volume, variety, and velocity represent the core big data challenges, causing difficulties in capture, storage, search, sharing, analysis, and visualization. Therefore, technologies like NoSQL, Hadoop, and cloud computing are used to extract value from large volumes and a wide variety of data and to discover business needs. This article focuses on the challenges of big data and how recent technologies can be used to address them, illustrated through real-world case studies; it also presents the lessons learned from those case studies.


Sensors, 2019, Vol 19 (10), pp. 2338
Author(s): Yuanju Qu, Xinguo Ming, Siqi Qiu, Maokuan Zheng, Zengtao Hou

With the development of the internet of things (IoT), big data, smart sensing technology, and cloud technology, industry has entered a new stage of revolution, and traditional manufacturing enterprises are transforming into service-oriented manufacturing based on prognostics and health management (PHM). However, a systematic and comprehensive PHM framework for creating added value has been lacking. In this paper, the authors propose an integrative framework that addresses the problem at three levels: a strategic level, where PHM creates added value; a tactical level, where the implementation route is planned; and an operational level, where PHM is applied in detail. At the strategic level, the authors provide an innovative business model that creates added value through big data, and propose a health index (HI) based on a condition-based maintenance (CBM) method to monitor equipment status. At the tactical level, they lay out the implementation route across application integration, analysis services, and visual management, using a convolutional neural network (CNN) to satisfy different stakeholders' functional requirements. At the operational level, they construct a self-sensing network based on anti-interference, self-organizing Zigbee to capture real-time data from the equipment group. Finally, the authors verify the feasibility of the framework in a real case from China.
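The abstract does not reproduce the paper's actual HI construction. A common CBM-style formulation, shown as a sketch below, is a weighted, normalized deviation of live sensor readings from a healthy baseline, squashed into (0, 1] so that 1.0 means healthy. The sensors, baselines, spreads, weights, and alarm threshold are assumptions made for illustration only.

```python
# Hypothetical CBM-style health index (HI): weighted deviation of live
# sensor readings from a healthy baseline, mapped into (0, 1] where
# 1.0 means "healthy". The paper's real HI is not given in the abstract;
# the sensors, baselines, spreads, and weights here are invented.
import math

BASELINE = {"vibration_mm_s": 2.0, "temperature_C": 55.0, "current_A": 12.0}
SPREAD   = {"vibration_mm_s": 0.5, "temperature_C": 5.0,  "current_A": 1.5}
WEIGHT   = {"vibration_mm_s": 0.5, "temperature_C": 0.3,  "current_A": 0.2}

def health_index(reading: dict) -> float:
    """Weighted normalized deviation -> HI in (0, 1]; a CBM policy might
    trigger maintenance when HI drops below some threshold, say 0.6."""
    dev = sum(
        WEIGHT[k] * abs(reading[k] - BASELINE[k]) / SPREAD[k]
        for k in BASELINE
    )
    return math.exp(-dev)  # zero deviation -> 1.0; large deviation -> ~0

print(health_index({"vibration_mm_s": 2.2, "temperature_C": 58.0,
                    "current_A": 12.5}))   # ~0.64 for this reading
```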


2021, Vol 2021, pp. 1-9
Author(s): Zhidong Sun, Xueqing Li

With the rapid progress of information technology, science and technology are reshaping every field and changing how information is transmitted; the advent of big data has played its part in the promotion and dissemination of resources, letting more and more people benefit. In the context of cloud computing, big data has ushered in another wave of development and growth. Against this background, live broadcast training platforms focused on enterprise staff training and online education have emerged, favored for their convenience, real-time performance, and high efficiency. However, the low value density of big data and the security problems of cloud computing make constructing a live broadcast training platform difficult. In this paper, the structure of the live broadcast training platform is improved by constructing three modules: a live training module based on cloud computing, a user recommendation module based on big data, and a security policy guarantee module. In addition, to ensure that trainees can receive training anytime and anywhere, the paper uses wireless communication technology to guarantee the quality and speed of all users' live video sources.


Data analytics (DA) is the job of reviewing datasets in order to draw conclusions about the information they contain, increasingly with specialized systems and software; with the emergence of Big Data, data analytics became a necessity. The problem we consider here is a fraud detection application. We address major aspects such as an application-independent format (XML/JSON) for the clusterization process based on a no-label classification algorithm, focus on the resulting clusters to enhance the oversampling process, and exploit the merits of parallel computing to speed up the system. We aim to use MapReduce functionality in our application and to deploy it on Amazon AWS. Datasets gathered for studies often comprise millions of records and can carry hard-to-detect concealed pitfalls. In this paper, we work on two datasets: the first is a medical dataset and the second is a customer dataset. Big Data Analytics is the suggested solution in this day and age, given the growing demand for analyzing huge information sets and performing the required processing on complicated data structures. The main problems at the moment are how to store and analyze the large amounts of data generated from heterogeneous sources such as social media, and how to make data quickly accessible on a modest budget. The Map-Reduce framework helps resolve these problems: by offering an integrated approach to machine learning, it speeds up processing. We explore the LEOS algorithm, SVM, MapReduce, and the JOSE algorithm, along with their requirements, benefits, disadvantages, difficulties, and corresponding solutions.
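As a concrete illustration of the MapReduce step such a pipeline might run before clustering, the sketch below emulates the map, shuffle, and reduce phases in a single process: the mapper normalizes heterogeneous JSON records into (key, value) pairs and the reducer folds each account's transactions into simple per-account features. The record layout and field names are invented; in a real deployment the same two functions would run under Hadoop on AWS (e.g., on EMR) rather than in one process.

```python
# Single-process emulation of the Map-Reduce aggregation step a
# fraud-detection pipeline might use: map parses heterogeneous JSON
# records into (key, value) pairs; reduce folds each key's values into
# per-account features that later feed clustering/oversampling.
# Field names and records are invented for illustration.
import json
from collections import defaultdict

RAW = [
    '{"account": "A1", "amount": 120.0}',
    '{"account": "A2", "amount": 15.5}',
    '{"account": "A1", "amount": 980.0}',
]

def map_fn(line):
    """Mapper: normalize one JSON record and emit a (key, value) pair."""
    rec = json.loads(line)
    yield rec["account"], rec["amount"]

def reduce_fn(key, amounts):
    """Reducer: fold all amounts for one account into simple features."""
    return key, {"n_txn": len(amounts), "total": sum(amounts),
                 "max": max(amounts)}

# Shuffle/sort phase: group mapper output by key.
groups = defaultdict(list)
for line in RAW:                       # mappers run in parallel in reality
    for key, value in map_fn(line):
        groups[key].append(value)

features = dict(reduce_fn(k, v) for k, v in groups.items())
# 'features' would feed the clustering and oversampling stages above.
```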

