Comparison Study of Different NoSQL and Cloud Paradigm for Better Data Storage Technology

Author(s):  
Pankaj Lathar ◽  
K. G. Srinivasa ◽  
Abhishek Kumar ◽  
Nabeel Siddiqui

Advancements in web-based technology and the proliferation of sensors and mobile devices interacting with the internet have resulted in immense data management requirements. These data management activities include the storage and processing of big data and the demand for high-performance read-write operations. Large-scale, high-concurrency applications such as social networking services (SNS) and search engines face challenges in using relational databases to store and query dynamic user data. NoSQL and cloud computing have emerged as paradigms that can meet these requirements. The diversity of existing NoSQL and cloud computing solutions makes it difficult to comprehend the domain and choose an appropriate solution for a specific business task. This chapter therefore reviews NoSQL and cloud-system-based solutions with the goal of providing a perspective on data storage technologies and algorithms, offering guidance to researchers and practitioners in selecting the best-fit data store, and identifying the challenges and opportunities of the paradigm.
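To make the schema-flexibility argument concrete, here is a minimal sketch (not from the chapter) contrasting a fixed relational schema with a schemaless document collection; the `users` table, the `documents` dict standing in for a NoSQL collection, and all field names are invented for illustration.

```python
import sqlite3
import json

# Relational side: a fixed schema must be declared (and migrated) up front.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
db.execute("INSERT INTO users VALUES (?, ?, ?)", (1, "Alice", "alice@example.com"))
# Adding a new attribute later requires an ALTER TABLE touching every row:
db.execute("ALTER TABLE users ADD COLUMN last_login TEXT")

# Document side: each record carries its own structure, so heterogeneous,
# rapidly evolving user data needs no schema migration at all.
documents = {}  # stand-in for a document collection in a NoSQL store
documents[1] = json.dumps({"name": "Alice", "email": "alice@example.com"})
documents[2] = json.dumps({"name": "Bob", "badges": ["beta"],
                           "geo": {"lat": 48.1, "lon": 11.6}})

print(json.loads(documents[2])["geo"]["lat"])  # nested fields read directly
```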

2018 ◽  
Vol 7 (4.6) ◽  
pp. 13
Author(s):  
Mekala Sandhya ◽  
Ashish Ladda ◽  
Uma N. Dulhare

In this generation of the Internet, information and data grow continuously across a wide range of Internet services and applications; hundreds of billions, even trillions, of web indexes exist. Such large data brings people a mass of information but, at the same time, more difficulty in discovering useful knowledge within it. Cloud computing can provide the infrastructure for large data. Cloud computing has two significant characteristics of distributed computing: scalability and high availability. Scalability means the system can seamlessly extend to large-scale clusters. High availability means that cloud computing can tolerate node errors: node failures will not affect the program running correctly. Cloud computing combined with data mining performs significant data processing on high-performance machines. Mass data storage and distributed computing provide a new method for mass data mining and become an effective solution to distributed storage and efficient computing in data mining.
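As a toy illustration of the availability property described above, the sketch below farms shard-counting tasks over a worker pool and simply re-schedules any task whose (simulated) node fails; `flaky_count`, the 30% failure rate, and the retry budget are invented assumptions, not from the paper.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def flaky_count(shard):
    """Count records in one data shard; the 'node' fails 30% of the time."""
    if random.random() < 0.3:
        raise RuntimeError("simulated node failure")
    return len(shard)

def run_with_retries(task, arg, attempts=10):
    """Re-schedule a failed task on another attempt, as a cloud scheduler would."""
    for _ in range(attempts):
        try:
            return task(arg)
        except RuntimeError:
            continue
    raise RuntimeError("shard permanently failed")

shards = [list(range(n)) for n in (100, 250, 175, 300)]

with ThreadPoolExecutor(max_workers=4) as pool:
    totals = list(pool.map(lambda s: run_with_retries(flaky_count, s), shards))

print(sum(totals))  # 825: node failures did not affect the final result
```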


2013 ◽  
Vol 756-759 ◽  
pp. 1275-1279
Author(s):  
Lin Na Huang ◽  
Feng Hua Liu

High-performance cloud storage is a basic condition for cloud computing. This article introduces the concept and advantages of cloud storage, discusses the infrastructure of a cloud storage system as well as the architecture of cloud data storage, and examines the design of the Distributed File System within cloud data storage. It also puts forward different development strategies for enterprises according to the different roles they play during the development of cloud computing.
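To give a flavour of the Distributed File System design discussed here, the following sketch chunks a file, replicates each chunk on several nodes, and records placement in a master metadata table, in the spirit of GFS/HDFS; `put`, `get`, the chunk size, and the replica count are hypothetical choices for illustration only.

```python
import hashlib

CHUNK_SIZE = 4   # bytes per chunk (tiny, for illustration)
REPLICAS = 2     # copies of each chunk kept on distinct nodes
NODES = {f"node{i}": {} for i in range(4)}   # node -> {chunk_id: bytes}

def put(filename, data, metadata):
    """Split a file into chunks, replicate each chunk across nodes,
    and record the placement in a master metadata table."""
    chunk_ids = []
    for off in range(0, len(data), CHUNK_SIZE):
        chunk = data[off:off + CHUNK_SIZE]
        cid = hashlib.sha1(chunk).hexdigest()[:8]
        # place replicas on consecutive nodes chosen by hashing the chunk id
        start = int(cid, 16) % len(NODES)
        holders = [f"node{(start + r) % len(NODES)}" for r in range(REPLICAS)]
        for h in holders:
            NODES[h][cid] = chunk
        chunk_ids.append((cid, holders))
    metadata[filename] = chunk_ids

def get(filename, metadata):
    """Reassemble a file, falling back to a replica if a node is gone."""
    out = b""
    for cid, holders in metadata[filename]:
        for h in holders:
            if h in NODES and cid in NODES[h]:
                out += NODES[h][cid]
                break
    return out

meta = {}
put("hello.txt", b"hello cloud storage", meta)
NODES.pop("node1")             # simulate the loss of one storage node
print(get("hello.txt", meta))  # b'hello cloud storage': replicas cover the loss
```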


2020 ◽  
Vol 17 (9) ◽  
pp. 4411-4418
Author(s):  
S. Jagannatha ◽  
B. N. Tulasimala

In the world of information and communication technology (ICT), the term cloud computing has been the buzzword. Cloud computing keeps changing its definition as technocrats use it in different environments; as a definition it remains contentious, stated relative to a particular application with no unanimous formulation, making it altogether elusive. In spite of this, it is this technology that is revolutionizing the traditional usage of computer hardware, software, data storage media, and processing mechanisms, with many benefits to the stakeholders. In the past, the use of autonomous computers and of nodes interconnected into computer networks with shared software resources minimized the cost of hardware, and of software to a certain extent. Evolutionary changes in computing technology over a few decades have thus brought platform and environment changes in machine architecture, operating systems, network connectivity, and application workloads, making the commercial use of the technology predominant. Instead of centralized systems, parallel and distributed systems are increasingly preferred for solving computational problems in the business domain; such hardware is ideal for solving large-scale problems over the internet. This computing model is data-intensive and network-centric. Most organizations using ICT have found storing huge data, maintaining and processing it, and communicating through the internet to automate the entire process a challenge. In this paper we explore the growth of cloud computing technology over several years: how high-performance computing (HPC) and high-throughput computing (HTC) systems enhance computational performance, and how cloud computing, according to various experts, the scientific community, and the service providers, is going to be more cost-effective across different dimensions of business.
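As a small illustration of the high-throughput style of computing mentioned above, where many independent jobs are farmed across workers (in contrast to HPC's tightly coupled parallel jobs), the following sketch times the same batch of tasks serially and on a process pool; the `job` kernel and task sizes are invented.

```python
import time
from concurrent.futures import ProcessPoolExecutor

def job(n):
    """One independent task: a small CPU-bound checksum loop."""
    total = 0
    for i in range(n):
        total = (total + i * i) % 1_000_003
    return total

if __name__ == "__main__":
    tasks = [200_000] * 8  # eight independent jobs: an HTC-style workload

    t0 = time.perf_counter()
    serial = [job(n) for n in tasks]
    t1 = time.perf_counter()

    with ProcessPoolExecutor() as pool:  # task farming across cores
        parallel = list(pool.map(job, tasks))
    t2 = time.perf_counter()

    assert serial == parallel
    print(f"serial {t1 - t0:.2f}s  parallel {t2 - t1:.2f}s")
```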


2012 ◽  
Vol 13 (03n04) ◽  
pp. 1250009 ◽  
Author(s):  
CHANGQING JI ◽  
YU LI ◽  
WENMING QIU ◽  
YINGWEI JIN ◽  
YUJIE XU ◽  
...  

With the rapid growth of emerging applications like social networks, the semantic web, sensor networks, and LBS (Location Based Service) applications, the volume and variety of data to be processed continue to increase quickly. Effective management and processing of large-scale data poses an interesting but critical challenge. Recently, big data has attracted a lot of attention from academia, industry, as well as government. This paper introduces several big data processing techniques from the system and application aspects. First, from the view of cloud data management and big data processing mechanisms, we present the key issues of big data processing, including the definition of big data, big data management platforms, big data service models, distributed file systems, data storage, data virtualization platforms, and distributed applications. Following the MapReduce parallel processing framework, we introduce some MapReduce optimization strategies reported in the literature. Finally, we discuss the open issues and challenges, and explore future research directions for big data processing in cloud computing environments.
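The MapReduce model that the surveyed optimization strategies target can be sketched in a few lines; the single-process version below shows the map, shuffle, and reduce phases that a real framework distributes across a cluster (the word-count example and function names are illustrative, not from the paper).

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    """map: emit a (word, 1) pair for every word in one input split."""
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    """shuffle: group intermediate values by key, as the framework would."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """reduce: aggregate all values observed for one key."""
    return key, sum(values)

splits = ["big data in the cloud", "big data processing", "cloud data"]
intermediate = chain.from_iterable(map_phase(s) for s in splits)
counts = dict(reduce_phase(k, v) for k, v in shuffle(intermediate).items())
print(counts["data"])  # 3
```

One commonly reported optimization, the combiner, pre-aggregates map output locally so that less intermediate data has to cross the network during the shuffle.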


2014 ◽  
Vol 687-691 ◽  
pp. 3733-3737
Author(s):  
Dan Wu ◽  
Ming Quan Zhou ◽  
Rong Fang Bie

Massive image processing places high demands on the processor and memory, requiring high-performance processors and large-capacity memory; single-processor or single-core processing with traditional memory cannot satisfy these needs. This paper introduces cloud computing into a massive image processing system. Through cloud computing, the system expands its virtual space, saves computing resources, and improves the efficiency of image processing. The system processor uses a multi-core DSP parallel processor, and a visualization parameter-setting window and output display were developed with VC software. Through simulation we obtain the image processing speed curve and the system's image adaptation curve. This provides a technical reference for the design of large-scale image processing systems.
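The tile-parallel idea can be sketched as follows: the image is split into independent strips, each processed by a separate worker, much as a multi-core array divides the work. This is a hypothetical Python sketch; `invert_tile`, the image size, and the tile height are invented stand-ins for the actual DSP kernels.

```python
from multiprocessing import Pool

WIDTH, HEIGHT, TILE = 512, 512, 128

def invert_tile(args):
    """Process one strip of rows independently, as each core would."""
    y0, rows = args
    return y0, [[255 - px for px in row] for row in rows]

if __name__ == "__main__":
    # A synthetic 8-bit grayscale image as nested lists.
    image = [[(x + y) % 256 for x in range(WIDTH)] for y in range(HEIGHT)]
    tiles = [(y, image[y:y + TILE]) for y in range(0, HEIGHT, TILE)]

    with Pool() as pool:  # stand-in for the multi-core processor array
        for y0, rows in pool.map(invert_tile, tiles):
            image[y0:y0 + len(rows)] = rows

    print(image[0][0], image[0][1])  # 255 254: image inverted tile by tile
```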


Cloud computing is the on-demand availability of computer system resources, especially data storage and computing power, without direct active management by the client. In the simplest terms, cloud computing means storing and accessing data and programs over the Internet instead of on your computer's hard drive. With the development of cloud computing, more and more applications are migrated into the cloud. A significant element of cloud computing is pay-as-you-go pricing: the cloud offers strong computational capacity to the public at reduced cost, enabling clients with minimal computational resources to outsource their large computational workloads and economically enjoy massive computational power, bandwidth, storage, and even suitable software on a per-use basis. The main obstacle preventing wide adoption of this computing model is that clients' confidential data may be exposed during the computation; what is needed is a mechanism that both solves the problem and shields it from malicious behaviour. In this paper, we examine secure outsourcing for large-scale systems of linear equations, which are among the most common problems in various engineering disciplines. Linear programming (LP) is an operations research technique in which the customer formulates private data as a set of matrices and vectors; we develop a set of efficient privacy-preserving problem transformation techniques that allow customers to transform the original LP problem into an arbitrary one while protecting sensitive input/output information. We identify that solving the LP problem in the cloud incurs only limited extra cost on the cloud server, and we utilize a homomorphic encryption system to increase performance and time efficiency.
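The transformation idea can be sketched as follows; this is a simplified illustration of the general disguise-based approach, not the paper's exact protocol. To solve Ax = b privately, the customer picks a secret random invertible matrix P and vector r, sends the disguised system (AP)y = b + Ar to the cloud, and recovers x = Py - r locally; all matrix choices below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# The customer's private system Ax = b.
n = 5
A = rng.standard_normal((n, n)) + n * np.eye(n)  # kept well-conditioned
b = rng.standard_normal(n)

# --- customer side: disguise the problem with secret P (invertible) and r ---
P = rng.standard_normal((n, n)) + n * np.eye(n)
r = rng.standard_normal(n)
A_prime = A @ P       # with x = P y - r, A x = b becomes (A P) y = b + A r
b_prime = b + A @ r

# --- cloud side: sees only the disguised system, never A, b, or x ---
y = np.linalg.solve(A_prime, b_prime)

# --- customer side: cheap local recovery and verification ---
x = P @ y - r
assert np.allclose(A @ x, b)
print(x)
```

In practice the secret transform is chosen to be cheap to apply (for example, sparse or diagonal factors), so the customer's local work stays far below the cost of solving the original system.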


Author(s):  
Kyuseok Shim ◽  
Sang Kyun Cha ◽  
Lei Chen ◽  
Wook-Shin Han ◽  
Divesh Srivastava ◽  
...  

Author(s):  
Adrian Jackson ◽  
Michèle Weiland

This chapter describes experiences using Cloud infrastructures for scientific computing, both serial and parallel. Amazon's High Performance Computing (HPC) Cloud resources were compared to traditional HPC resources to quantify performance and to assess the complexity and cost of using the Cloud. Furthermore, a shared Cloud infrastructure is compared to standard desktop resources for scientific simulations. Whilst this is only a small-scale evaluation of these Cloud offerings, it does allow some conclusions to be drawn: in particular, the Cloud currently cannot match the parallel performance of dedicated HPC machines for large-scale parallel programs, but it can match the serial performance of standard computing resources for serial and small-scale parallel programs. Also, the shared Cloud infrastructure cannot match dedicated computing resources on low-level benchmarks, although for an actual scientific code performance is comparable.
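A measurement harness of the kind such comparisons rely on can be sketched briefly; this is a generic illustration, not the chapter's actual benchmark suite (`benchmark`, `serial_workload`, and the repeat count are invented), and a real study would substitute codes such as HPL or the scientific application itself.

```python
import statistics
import time

def benchmark(task, repeats=5):
    """Wall-clock a workload several times and report the median in seconds,
    the kind of measurement used to compare cloud and dedicated nodes."""
    samples = []
    for _ in range(repeats):
        t0 = time.perf_counter()
        task()
        samples.append(time.perf_counter() - t0)
    return statistics.median(samples)

def serial_workload():
    """A stand-in compute kernel used purely for illustration."""
    s = 0.0
    for i in range(1, 2_000_000):
        s += 1.0 / (i * i)
    return s

print(f"median runtime: {benchmark(serial_workload):.3f}s")
# Running the same harness on a Cloud instance and on an HPC node, at
# increasing core counts, yields the serial/parallel comparison above.
```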

