RESEARCH AND DESIGN OF INSPECTION CLOUD PLATFORM FRAMEWORK FOR SURVEYING AND MAPPING PRODUCTS

Author(s):  
L. B. Zhang ◽  
H. Chen ◽  
Z. Li

Abstract. With the continuous improvement of modern surveying and mapping technology and the rapid growth of its outputs, traditional quality inspection software built for a single machine, a single task and a single data type struggles to handle massive, multi-source, heterogeneous products and cannot meet the requirement of rapid, accurate and efficient quality inspection. With the development of IT technologies such as cloud computing, big data and artificial intelligence, quality inspection software needs to combine cloud computing technology with the quality inspection business and refactor its software framework. To address the storage and spatial query requirements of inspecting surveying and mapping products, this paper researches and designs distributed storage and distributed indexing of spatial data in the cloud platform. Management of inspection rules is the core of the cloud platform: an inspection rule is the minimum independently executable unit, and it becomes an inspection item through parameterization; the paper builds a full run-time operating mechanism for inspection rules in the cloud platform. Finally, combining the inspection requirements for surveying and mapping products with the inspection business, the paper researches and designs the inspection cloud framework for surveying and mapping products.
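As a rough illustration of the rule-to-item mechanism described above, the sketch below models an inspection rule as the minimum independently executable unit and an inspection item as that rule bound to concrete parameters. All class names, fields, and the example check are hypothetical illustrations, not the paper's actual design.

```python
# Hypothetical sketch of "rule -> parameterized inspection item"; names are
# illustrative, not the paper's API.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class InspectionRule:
    """Smallest independently executable inspection unit."""
    rule_id: str
    description: str
    check: Callable[[dict, Dict[str, float]], List[str]]  # returns error messages


@dataclass
class InspectionItem:
    """A rule bound to concrete parameters, ready to run on the platform."""
    rule: InspectionRule
    params: Dict[str, float] = field(default_factory=dict)

    def run(self, feature: dict) -> List[str]:
        return self.rule.check(feature, self.params)


# Example rule: a vertex-count tolerance check on a vector feature.
def vertex_count_check(feature: dict, params: Dict[str, float]) -> List[str]:
    n = len(feature.get("vertices", []))
    if n > params.get("max_vertices", 10000):
        return [f"feature {feature.get('id')}: {n} vertices exceeds limit"]
    return []


rule = InspectionRule("R001", "vertex count tolerance", vertex_count_check)
item = InspectionItem(rule, {"max_vertices": 5000})
print(item.run({"id": "F1", "vertices": [(0, 0)] * 6000}))
```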

2019 ◽  
pp. 1-4
Author(s):  
C. T. Kantharaja

Cloud computing technology has signicant role in academic libraries. Most of the library services are available on cloud platform and library software vendors developed their Library Management Software on cloud platform. It is the right time for library professionals to upgrade their technical skills to provide good services to the library stakeholders. This study shows the library services and facilities available on cloud. It is the right time to migrate to cloud


2015 ◽  
Vol 8 (1) ◽  
pp. 206-210 ◽  
Author(s):  
Yu Junyang ◽  
Hu Zhigang ◽  
Han Yuanyuan

The energy consumption of cloud computing has attracted increasing attention from scholars, as has research on Hadoop as a cloud platform and its energy use. This paper presents a method to measure the energy consumption of jobs that run on Hadoop, and applies it to periodic tasks on the Hadoop platform. Combining the measurements with a mainstream energy-estimation formula for further analysis, the paper reaches conclusions on how to reduce the energy consumption of Hadoop by adjusting the split size or by using an appropriate number of workers (servers). Finally, experiments show the effectiveness of these methods as energy-saving strategies and verify the feasibility of the measurement method for periodic tasks.
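The abstract's two tuning knobs, split size and worker count, can be illustrated with a common linear server power model. The sketch below is a hedged approximation with invented constants and a simplified wave-based runtime model; it is not the paper's measurement method.

```python
# Hedged sketch: E = workers * (P_idle + (P_peak - P_idle) * utilization) * t.
# All constants and the runtime model are illustrative assumptions.

def job_runtime(input_bytes: int, split_bytes: int, workers: int,
                secs_per_split: float = 30.0) -> float:
    """Map-phase runtime: splits are processed in waves across the workers."""
    splits = -(-input_bytes // split_bytes)   # ceiling division
    waves = -(-splits // workers)
    return waves * secs_per_split

def job_energy(input_bytes: int, split_bytes: int, workers: int,
               p_idle: float = 100.0, p_peak: float = 250.0,
               utilization: float = 0.8) -> float:
    """Energy in joules consumed by all workers over the job's runtime."""
    t = job_runtime(input_bytes, split_bytes, workers)
    p_per_worker = p_idle + (p_peak - p_idle) * utilization
    return workers * p_per_worker * t

# Comparing configurations, as the abstract suggests: larger splits or a
# better-matched worker count can cut the estimated energy of a periodic job.
for split_mb, workers in [(64, 10), (128, 10), (128, 5)]:
    e = job_energy(10 * 1024**3, split_mb * 1024**2, workers)
    print(f"split={split_mb} MB, workers={workers}: {e / 1000:.1f} kJ")
```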


2015 ◽  
Vol 4 (1) ◽  
pp. 135-142 ◽  
Author(s):  
Nimisha Singh ◽  
Abha Rishi

As the world becomes increasingly interlinked through the Internet, cyberspace frauds are also on the rise. This is a case study on a company, Pyramid Cyber Security (P) Ltd., which specializes in digital crime, fraud and forensic solutions and services in India. Over the years, the company has established several digital forensics laboratories and security projects for agencies in law enforcement, the public sector and corporate organizations. With the scalability, flexibility and economic advantage offered by cloud computing, more and more organizations are moving to the cloud for their applications. Yet with all its benefits, cloud computing also exposes a company to the danger of digital crime and security breaches on the cloud platform. This has thrown open new vistas for Pyramid, putting it in a dilemma of whether to focus on its existing business or to explore the new opportunities in cloud forensics investigation created by the wide acceptance of cloud computing. It also poses the question of whether a company should opt for pre-incident or post-incident digital network security architecture. It is a teaching case.


Author(s):  
VINITHA S P ◽  
GURUPRASAD E

Cloud computing has been envisioned as the next-generation architecture of the IT enterprise. It moves application software and databases to centralized large data centers, where the management of data and services may not be fully trustworthy. This paradigm brings many new security challenges, such as maintaining the correctness and integrity of data in the cloud. The integrity of cloud data may be lost through unauthorized access, modification or deletion, and availability may suffer because cloud service providers (CSPs), seeking to increase their profit margin by reducing cost, may discard rarely accessed data without this being detected in a timely fashion. To overcome these issues, flexible distributed storage with tokens and signature creation is used to ensure the integrity of data; an auditing mechanism assists in maintaining the correctness of data and in locating and identifying the server where data has been corrupted; and dependability and availability are achieved through distributed storage of the data in the cloud. Further, to ensure authorized access to cloud data, an admin module was proposed in our previous conference paper, which prevents unauthorized users from accessing data, together with a selective storage scheme based on different parameters of the cloud servers for efficient storage of data in the cloud. Extending that work, this paper supports dynamic data operations such as updating, deleting and adding data.
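One standard way to realize the token-and-signature integrity check described above is a keyed challenge-response audit over data blocks. The following minimal sketch uses HMAC tokens; the block size, key handling, and audit protocol are simplified assumptions rather than the paper's exact scheme.

```python
# Minimal token-based integrity audit: the owner precomputes an HMAC token
# per block, then challenges the server and compares. A standard construction,
# not necessarily the paper's exact protocol.
import hashlib
import hmac
import secrets

KEY = secrets.token_bytes(32)   # owner's secret verification key
BLOCK = 4096                    # bytes per block (illustrative)

def blocks(data: bytes):
    return [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]

def precompute_tokens(data: bytes):
    """Owner side: one keyed token per block, kept locally."""
    return [hmac.new(KEY, b, hashlib.sha256).digest() for b in blocks(data)]

def audit(stored: bytes, tokens, challenge_indices):
    """Verifier side: recompute tokens for the challenged blocks and locate
    the block (hence the server holding it) where corruption occurred."""
    bs = blocks(stored)
    bad = []
    for i in challenge_indices:
        got = hmac.new(KEY, bs[i], hashlib.sha256).digest()
        if not hmac.compare_digest(got, tokens[i]):
            bad.append(i)
    return bad

data = b"x" * 20000
tokens = precompute_tokens(data)
corrupted = data[:5000] + b"y" + data[5001:]   # flip one byte in block 1
print(audit(corrupted, tokens, [0, 1, 2]))     # -> [1]
```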


Author(s):  
Nghia Viet Nguyen ◽  
Thu Hoai Thi Trinh ◽  
Hoa Thi Pham ◽  
Trang Thu Thi Tran ◽  
Lan Thi Pham ◽  
...  

Land cover is a critical factor for climate change and hydrological models. The extraction of land cover data from remote sensing images has traditionally been carried out with specialized commercial software; however, the limitations of computer hardware and of the commercial software's algorithms make this costly, and the classification demands a lot of time, patience, and skill. The cloud computing platform Google Earth Engine brought a breakthrough in 2010 for analyzing and processing spatial data. This study applied object-based Random Forest classification on the Google Earth Engine platform to produce 2010 land cover data for the Vu Gia - Thu Bon river basin. The classification yielded 7 land cover categories, consisting of plantation forest, natural forest, paddy field, urban residence, rural residence, bare land, and water surface, with an overall accuracy of 73.9% and a kappa of 0.70.
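For orientation, a pixel-based variant of such a workflow in the Google Earth Engine Python API might look like the sketch below. The image collection, band names, region, and training table are placeholders, and the study's object-based step (e.g. a segmentation such as SNIC) is omitted for brevity.

```python
# Hedged sketch of a Random Forest land-cover classification in the Google
# Earth Engine Python API; asset IDs and the training table are placeholders.
import ee

ee.Initialize()

# Median composite over the basin for 2010 (illustrative collection and ROI).
roi = ee.Geometry.Rectangle([107.5, 15.2, 108.6, 16.2])  # approx. basin extent
image = (ee.ImageCollection('LANDSAT/LT05/C02/T1_L2')
         .filterBounds(roi)
         .filterDate('2010-01-01', '2010-12-31')
         .median()
         .clip(roi))

bands = ['SR_B1', 'SR_B2', 'SR_B3', 'SR_B4', 'SR_B5', 'SR_B7']
training_points = ee.FeatureCollection('users/example/training_2010')  # placeholder

samples = image.select(bands).sampleRegions(
    collection=training_points, properties=['landcover'], scale=30)

classifier = ee.Classifier.smileRandomForest(numberOfTrees=100).train(
    features=samples, classProperty='landcover', inputProperties=bands)

classified = image.select(bands).classify(classifier)

# An accuracy assessment against a held-out sample would use an error matrix,
# from which overall accuracy and kappa (as reported above) are derived.
```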


2021 ◽  
Vol 17 (5) ◽  
pp. e1008977
Author(s):  
Amir Bahmani ◽  
Kyle Ferriter ◽  
Vandhana Krishnan ◽  
Arash Alavi ◽  
Amir Alavi ◽  
...  

Genomic data analysis across multiple cloud platforms is an ongoing challenge, especially when large amounts of data are involved. Here, we present Swarm, a framework for federated computation that promotes minimal data motion and facilitates crosstalk between genomic datasets stored on various cloud platforms. We demonstrate its utility via common queries of genomic variants across BigQuery in the Google Cloud Platform (GCP), Athena in Amazon Web Services (AWS), Apache Presto and MySQL. Compared to single-cloud platforms, the Swarm framework significantly reduced computational costs, run-time delays, and the risks of security breach and privacy violation.
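The core fan-out-and-merge idea can be sketched as follows: issue the same variant query to BigQuery and Athena where each dataset resides, and move only the small result sets between clouds. Dataset, table, and bucket names below are hypothetical, and this is not Swarm's actual API.

```python
# Hedged sketch of federated variant lookup across two clouds; only the
# per-cloud result rows cross platform boundaries. Names are placeholders.
import boto3
from google.cloud import bigquery

SQL = ("SELECT chrom, pos, ref, alt FROM {table} "
       "WHERE chrom = '1' AND pos BETWEEN 1000000 AND 1100000")

def query_bigquery():
    client = bigquery.Client()
    job = client.query(SQL.format(table='genomics.variants'))  # hypothetical table
    return [dict(row) for row in job.result()]

def query_athena():
    athena = boto3.client('athena')
    qid = athena.start_query_execution(
        QueryString=SQL.format(table='variants'),
        QueryExecutionContext={'Database': 'genomics'},           # hypothetical
        ResultConfiguration={'OutputLocation': 's3://example-bucket/results/'},
    )['QueryExecutionId']
    # Production code would poll get_query_execution until the query state is
    # SUCCEEDED before fetching results; omitted here for brevity.
    rows = athena.get_query_results(QueryExecutionId=qid)['ResultSet']['Rows']
    header, *data = rows
    keys = [c['VarCharValue'] for c in header['Data']]
    return [dict(zip(keys, (c.get('VarCharValue') for c in r['Data'])))
            for r in data]

# Merge locally: each cloud scanned its own data in place.
merged = query_bigquery() + query_athena()
```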


Author(s):  
C. X. Chen ◽  
J. X. Zhang ◽  
H. T. Zhao ◽  
Y. M. Xu ◽  
S. Yin

Abstract. Today's society has entered the era of big data, and the quality of surveying and mapping results has become a focus of government departments. Like the statistical results of other industries, surveying and mapping results, as one of the basic data sources, provide data support for government decision-making, and the status of surveying and mapping projects is constantly improving. This article introduces the ISO9001 quality management system implemented by surveying and mapping production units, the CMA quality management system implemented by surveying and mapping quality inspection units, and the "two-level inspection, one-level acceptance" system for surveying and mapping products. Through a cause-and-effect diagram, taking the quality control of fundamental geographical conditions monitoring in a national major surveying and mapping project as an example, it demonstrates how the core tool 5M1E (Man, Machine, Material, Method, Measurement, Environment) is used in the quality management system, showing that the quality management system plays an important role in the project and providing reference experience for peers.


Distributed computing has been the buzz in the recent past, and cloud computing stands first in this category, since users can obtain data storage and substantial computing facilities with little infrastructure, from anywhere at any time. On the other hand, both public and private cloud computing strategies attract foul players who perform intrusions: the convenience the cloud platform provides leads end users to adopt these services for storing and computing sensitive data, and because these services rely on the Internet protocol, the scope for breaching data or services over cloud computing is frequent and easy. In this regard, research on intrusion detection defense mechanisms has prominent scope. This manuscript proposes a novel intrusion detection mechanism called "calibration factors-based intrusion detection (CFID)" for cloud computing networks. The experimental study portrayed the significant ability of the proposed CFID to detect the intrusion activities listed as Remote-to-Local, Port Scanning, and Virtual-Machine-Trapping.
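The abstract does not describe its calibration factors, so the sketch below is purely illustrative of the detection setting: a generic threshold-based port-scan detector over flow records, not the proposed CFID mechanism.

```python
# Illustrative only: a generic port-scan detector, flagging sources that touch
# many distinct ports within a short time window. Not the paper's CFID.
from collections import defaultdict

def detect_port_scans(flows, port_threshold=50, window_seconds=60):
    """flows: iterable of (timestamp, src_ip, dst_ip, dst_port) tuples."""
    seen = defaultdict(set)   # (src, window) -> distinct (dst, port) pairs
    alerts = set()
    for ts, src, dst, port in flows:
        window = int(ts // window_seconds)
        seen[(src, window)].add((dst, port))
        if len(seen[(src, window)]) > port_threshold:
            alerts.add(src)
    return alerts

# One host probing 100 ports in under two minutes trips the threshold.
flows = [(t, "10.0.0.5", "10.0.0.9", 1000 + t) for t in range(100)]
print(detect_port_scans(flows))   # -> {'10.0.0.5'}
```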


Author(s):  
MD. Jareena Begum ◽  
B. Haritha

Cloud computing plays an essential role in the business landscape, as computing resources are delivered on demand to users over the Internet. It provides on-demand, ubiquitous access to a centralized pool of configurable resources such as networks, applications, and services, which leads most enterprises and many users to externalize their data to cloud servers. Recently, secure deduplication techniques have attracted considerable interest in both academic and industrial communities. The main advantage of using cloud storage, from the users' point of view, is that they can reduce their expenditure on purchasing and maintaining storage infrastructure; given the growing data volumes in cloud computing, a reduction in stored data also helps providers cut the costs of running large storage systems and save power. Data deduplication techniques have therefore been proposed to improve storage efficiency in cloud storage. In addition, to protect sensitive files, users often apply encryption algorithms before storing records in cloud storage. In this paper we propose strategies for secure data deduplication.
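A standard building block in the secure-deduplication literature this abstract surveys is convergent encryption, where the encryption key is derived from the file's own hash, so identical plaintexts produce identical, deduplicable ciphertexts. The sketch below (using the third-party cryptography package) illustrates that idea; it is not necessarily the paper's proposed scheme.

```python
# Convergent encryption sketch: key and nonce derive from the content hash,
# so equal files encrypt to equal ciphertexts the server can deduplicate
# without reading them. Deterministic encryption has known leakage trade-offs.
import hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def convergent_encrypt(data: bytes):
    digest = hashlib.sha256(data).digest()
    key = digest                                          # content-derived key
    nonce = hashlib.sha256(b"nonce" + digest).digest()[:12]
    ciphertext = AESGCM(key).encrypt(nonce, data, None)
    dedup_tag = hashlib.sha256(ciphertext).hexdigest()    # server-side index
    return key, nonce, ciphertext, dedup_tag

store = {}   # dedup_tag -> ciphertext, standing in for the cloud server

for blob in (b"report.pdf bytes", b"report.pdf bytes", b"other file"):
    key, nonce, ct, tag = convergent_encrypt(blob)
    store.setdefault(tag, ct)     # duplicate uploads collapse to one copy

print(len(store))                 # -> 2: the identical files are stored once
```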

