Data Encryption and Decryption Techniques for a High Secure Dataset using Artificial Intelligence

Author(s):  
B. Herawan Hayadi ◽  
Edy Victor Haryanto ◽  

Data analysis is the science of extracting patterns, trends, and actionable insight from large data sets. Data now accumulates on servers across many countries in structured, semi-structured, and unstructured formats, and conventional IT infrastructure cannot meet the demands of this modern data-analysis landscape. For this reason, many companies are turning to Hadoop, an open-source project, as a possible solution to this unmet business need. As the volume of data collected by organizations, especially unstructured data, continues to grow, Hadoop is emerging as one of the primary platforms for storing and operating on that data. A second concern in data analysis is security: the rapid growth of internet use and the widespread adoption of social media applications that let users freely generate content have intensified the already enormous flow of data. Firms launching big data and analytics initiatives must therefore keep a few things in mind; in the business environment, secure data analytics tools are mandatory. Previous work secured a high-profile dataset using an encryption technique alone, but encryption by itself cannot protect data strongly, since a third party may still recover the original data. To reduce these issues, this paper introduces artificial intelligence: by using both encryption and decryption models within an AI framework, the drawback of the existing work can be resolved and the data given a significantly higher degree of authentication. Data extraction is further restricted through attribute-based access controls. The proposed model is expected to outperform the present model in both security and sensitive data analytics.
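The abstract does not specify the encryption scheme used alongside the AI model. As a minimal illustration of the encrypt/decrypt round trip the paper relies on, the following sketch uses a one-time XOR keystream from Python's standard library; the function names and the message are hypothetical, and a real system would use an authenticated cipher such as AES-GCM from a vetted library.

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # XOR each plaintext byte with the corresponding key byte.
    # The key must be random and at least as long as the message
    # (a toy one-time pad, for illustration only).
    assert len(key) >= len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption reuses the same operation.
    return encrypt(ciphertext, key)

message = b"high-profile dataset record"
key = secrets.token_bytes(len(message))     # fresh random key
ciphertext = encrypt(message, key)
assert ciphertext != message                  # data is hidden...
assert decrypt(ciphertext, key) == message    # ...and recoverable with the key
```

The round trip demonstrates the basic contract the paper's model must satisfy: without the key, the ciphertext reveals nothing; with it, the original data are recovered exactly.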

2018 ◽  
Vol 20 (1) ◽  
Author(s):  
Tiko Iyamu

Background: Over the years, big data analytics has been statically carried out in a programmed way, which does not allow for translation of data sets from a subjective perspective. This approach affects an understanding of why and how data sets manifest themselves into various forms in the way that they do. This has a negative impact on the accuracy, redundancy and usefulness of data sets, which in turn affects the value of operations and the competitive effectiveness of an organisation. Also, the current single approach lacks a detailed examination of data sets, which big data deserve in order to improve purposefulness and usefulness.
Objective: The purpose of this study was to propose a multilevel approach to big data analysis. This includes examining how a sociotechnical theory, the actor network theory (ANT), can be complementarily used with analytic tools for big data analysis.
Method: In the study, qualitative methods were employed from the interpretivist perspective.
Results: From the findings, a framework that offers big data analytics at two levels, micro- (strategic) and macro- (operational), was developed. Based on the framework, a model was developed, which can be used to guide the analysis of heterogeneous data sets that exist within networks.
Conclusion: The multilevel approach ensures a fully detailed analysis, which is intended to increase accuracy, reduce redundancy and put the manipulation and manifestation of data sets into perspective for improved organisational competitiveness.


Today’s technological advancements have enabled researchers to collect and organize various forms of healthcare data. Data is an integral part of healthcare analytics, and drug discovery from clinical data analytics represents important breakthrough work in computational approaches to healthcare systems. Healthcare analysis also provides better value for money. Healthcare data management is very challenging, as 80% of the data is unstructured: it includes handwritten documents, images, and computer-generated clinical reports such as MRI, ECG, and CT scans. This paper provides a summary of work carried out by scientists and researchers in healthcare domains, focusing specifically on clinical data analysis for the period 2013 to 2019. The surveyed work is organized by data sets, techniques and methods used, tools adopted, and key findings in clinical data analysis. The overall objective is to identify the current challenges, trends, and gaps in clinical data analysis. The work proceeds as a bibliometric survey and summarizes the key findings in a novel way.


2021 ◽  
Vol 2 (4) ◽  
pp. 1-22
Author(s):  
Jing Rui Chen ◽  
P. S. Joseph Ng

Griffith AI&BD is a technology company that uses a big data platform and artificial intelligence technology to produce products for schools. The company focuses on primary and secondary school education support, data analysis assistance systems, and campus artificial intelligence products for the compulsory education stage in the Chinese market. Through big data, machine learning, and data mining, scattered campus and distributed systems enable anyone to sign up and join a large data-processing grid, gaining access to big data analysis and matching for learning support, which helps students expand their knowledge across a variety of disciplines. The company improves the learning process based on large data sets of students and combines AI technology to develop AI electronic devices, aiming to provide schools with the best learning experience so they can survive in a competitive world.


Author(s):  
Arpit Kumar Sharma ◽  
Arvind Dhaka ◽  
Amita Nandal ◽  
Kumar Swastik ◽  
Sunita Kumari

The meaning of the term “big data” can be inferred from its name itself: the collection of large structured or unstructured data sets. Beyond their sheer quantity, these data sets are so complex that they cannot be analyzed using conventional data-handling software and hardware tools. If processed judiciously, big data can prove to be a huge advantage for the industries using it. Because of this usefulness, studies are being conducted to create methods for handling big data. Knowledge extraction from big data is essential; without it, there is no purpose in accumulating such volumes of data. Cloud computing is a powerful tool that provides a platform for the storage and computation of massive amounts of data.


2020 ◽  
Vol 9 (1) ◽  
pp. 2206-2209

Emerging theories of embedded systems, the Internet of Things (IoT), data analytics and artificial intelligence open up a broad spectrum for developing innovative applications and making existing systems more efficient. The advent of these new technologies speeds up the race towards automation in every aspect of human life. This paper provides a study of embedded systems, microcontrollers and sensors that can be used for developing such applications. Alongside this study, a survey of vital tools and technologies is discussed and proposed for the development of IoT-based applications. The paper also provides a model to gather data from an industrial plant on which data analytics can be done. It presents a wide IoT perspective on the design of embedded systems by discussing the research done in this area and using that theory to develop products that benefit society and the welfare of humankind.
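The abstract does not detail the paper's data-gathering model. As a minimal sketch of the general pattern, assuming a polling loop that samples plant sensors and accumulates rows for later analytics, the following uses simulated readings (the sensor names, baselines, and sampling count are all hypothetical):

```python
import random
import statistics

def read_sensor(name: str) -> float:
    # Stand-in for a real driver/ADC call; returns a simulated reading
    # near a nominal baseline for each hypothetical plant sensor.
    base = {"temperature_c": 75.0, "pressure_kpa": 101.0}[name]
    return base + random.uniform(-1.0, 1.0)

def gather(samples: int) -> dict:
    # Poll each sensor `samples` times and keep the raw rows,
    # ready to hand off to an analytics stage.
    rows = {"temperature_c": [], "pressure_kpa": []}
    for _ in range(samples):
        for name in rows:
            rows[name].append(read_sensor(name))
    return rows

data = gather(samples=10)
summary = {name: statistics.mean(vals) for name, vals in data.items()}
```

In a real deployment the `read_sensor` stub would be replaced by microcontroller or gateway I/O, and the gathered rows would be forwarded to the analytics platform rather than summarized in place.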


Author(s):  
Louise Leenen ◽  
Thomas Meyer

Cybersecurity analysts rely on vast volumes of security event data to predict, identify, characterize, and deal with security threats. These analysts must understand and make sense of these huge datasets in order to discover patterns which lead to intelligent decision making and advance warnings of possible threats, and this ability requires automation. Big data analytics and artificial intelligence can improve cyber defense. Big data analytics methods are applied to large data sets that contain different data types. The purpose is to detect patterns, correlations, trends, and other useful information. Artificial intelligence provides algorithms that can reason or learn and improve their behavior, and includes semantic technologies. A large number of automated systems are currently based on syntactic rules which are generally not sophisticated enough to deal with the level of complexity in this domain. An overview of artificial intelligence and big data technologies in cyber defense is provided, and important areas for future research are identified and discussed.
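The chapter names no specific detection algorithm, but one elementary pattern-detection step on security event data can be sketched as a z-score outlier flag over event counts. This is a hedged illustration only; the threshold, series, and function name are hypothetical, and production systems would use far more sophisticated (often learned) models:

```python
import statistics

def flag_anomalies(event_counts, threshold=2.0):
    # Flag indices whose count lies more than `threshold` population
    # standard deviations from the mean of the series.
    mean = statistics.mean(event_counts)
    stdev = statistics.pstdev(event_counts)
    if stdev == 0:
        return []
    return [i for i, c in enumerate(event_counts)
            if abs(c - mean) / stdev > threshold]

# Hypothetical hourly login counts; the spike at index 5 stands out.
hourly_logins = [52, 48, 50, 51, 49, 400, 50, 47]
print(flag_anomalies(hourly_logins))  # → [5]
```

Syntactic rules would need an explicit threshold written for every event type; the point of analytics and learning approaches is to derive such baselines, and richer correlations, from the data itself.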


Author(s):  
Tianxiang He

The development of artificial intelligence (AI) technology is firmly connected to the availability of big data. However, using data sets involving copyrighted works for AI analysis or data mining without authorization will incur risks of copyright infringement. Considering the fact that incomplete data collection may lead to data bias, and since it is impossible for the user of AI technology to obtain a copyright licence from each and every right owner of the copyrighted works used, a mechanism that can free the data from copyright restrictions under certain conditions is needed. In the case of China, it is crucial to check whether China’s current copyright exception model can take on the role and offer that kind of function. This chapter suggests that a special AI analysis and data mining copyright exception that follows a semi-open style should be added to the current exceptions list under the Copyright Law of China.


Big Data analytics and Deep Learning should not be treated as two entirely separate concepts. Big Data refers to extremely large data sets that can be analyzed to find patterns and trends, and Deep Learning is one technique that can help us find abstract patterns in such data. If we apply Deep Learning to Big Data, we can find unknown and useful patterns that were previously impossible to discover. With the help of Deep Learning, AI is getting smarter. There is a hypothesis in this regard: the more data, the more abstract the knowledge. A concise survey of Big Data, Deep Learning, and the application of Deep Learning to Big Data is therefore necessary.


2019 ◽  
pp. 357-385
Author(s):  
Eric Guérin ◽  
Orhun Aydin ◽  
Ali Mahdavi-Amiri

Abstract In this chapter, we provide an overview of different artificial intelligence (AI) and machine learning (ML) techniques and discuss how these techniques have been employed in managing geospatial data sets as they pertain to Digital Earth. We introduce statistical ML methods that are frequently used in spatial problems and their applications. We discuss generative models, one of the hottest topics in ML, to illustrate the possibility of generating new data sets that can be used to train data analysis methods or to create new possibilities for Digital Earth such as virtual reality or augmented reality. We finish the chapter with a discussion of deep learning methods that have high predictive power and have shown great promise in data analysis of geospatial data sets provided by Digital Earth.


2018 ◽  
Vol 7 (4.6) ◽  
pp. 209
Author(s):  
Dr. K. Dhana Sree ◽  
Dr. C. Shoba Bindu

The two maestros, Artificial Intelligence and Machine Learning, rule the data-filled world with good analytics. Many of these domain skills are used in industry to analyze and interpret data beyond what it literally says. In the spirit of the saying "put the horse before the cart," data should be normalized before it is analyzed. This article focuses on what normalization actually is, why normalization is needed before data analysis, and how data normalization is done.
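The article's own procedure is not given in this abstract, but the two most common normalization schemes it likely covers can be sketched in a few lines: min-max scaling, which rescales values into [0, 1], and z-score standardization, which centers on the mean and scales by the standard deviation (the `ages` data here is hypothetical):

```python
import statistics

def min_max(values):
    # Rescale values linearly so the smallest maps to 0 and the largest to 1.
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def z_score(values):
    # Center on the mean and scale by the population standard deviation.
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    return [(v - mean) / stdev for v in values]

ages = [20, 30, 40, 50, 60]
print(min_max(ages))  # → [0.0, 0.25, 0.5, 0.75, 1.0]
# z-scored data has mean 0 by construction:
print(round(statistics.mean(z_score(ages)), 10))  # → 0.0
```

Which scheme to use depends on the downstream analysis: min-max preserves the shape of the distribution within a fixed range, while z-scores are preferred when the method assumes centered data or when features on very different scales must be compared.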

