Integration and Optimization of Ancient Literature Information Resources Based on Big Data Technology

2021 ◽  
Vol 2021 ◽  
pp. 1-8
Author(s):  
Lingling Gu

Big data refers to a collection of data that cannot be captured, managed, and processed with conventional software tools within a certain time frame. It is a massive, high-volume, high-growth-rate, and diversified information asset that requires new processing models to deliver stronger decision-making power, insight discovery, and process optimization capabilities. This article studies the integration and optimization of ancient literature information resources using big data technology; that is, it integrates and optimizes ancient literature information resources through big data technology to make the literature more systematic and complete, allowing readers to find and browse it more conveniently. The paper focuses on literary works and the related collation, annotation, and textual research results, and divides the scope of each subtopic by genre. The biggest difference between the information platform built in this paper and existing ancient books databases is that it provides semantic analysis, subject retrieval, data generation, and similar functions: after text learning, the computer can automatically classify related vocabulary. Based on the effective integration of big data and cultural resources, the experimental results show that, to date, technical optimization and resource integration have brought more than 12,000 copies of ancient literature back into the collection and restored more than 10,000 publications. Big data technology is therefore essential for the integration and optimization of cultural resources.
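The automatic classification of related vocabulary that the platform performs after text learning can be illustrated with a small supervised text classifier. Below is a minimal sketch assuming a scikit-learn pipeline with made-up passages and genre labels; the paper does not disclose its actual model or training corpus.

```python
# A minimal, illustrative text classifier for genre-based subtopics.
# Corpus, labels, and model choice are assumptions, not the paper's method.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Hypothetical training pairs: (passage, genre/subtopic).
train_texts = [
    "ode in regulated verse on autumn rivers",
    "collation and annotation notes on a Tang anthology",
    "chronicle entry recording a provincial appointment",
]
train_labels = ["poetry", "annotation", "history"]

# Character n-gram TF-IDF features feed a simple Naive Bayes classifier.
model = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(1, 3)),
    MultinomialNB(),
)
model.fit(train_texts, train_labels)

# After "text learning", new passages are classified automatically.
print(model.predict(["verse fragment on mountain mist"]))
```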

2021 ◽  
Author(s):  
Kiran Chaudhary ◽  
Mansaf Alam ◽  
Mabrook S. Al-Rakhami ◽  
Abdu Gumaei

Many consumers are influenced by social media to purchase products and to spend more money on their purchases. We collected data from social media to analyse consumer behaviour, considering consumer data from Facebook, Twitter, LinkedIn, and YouTube. Because social media produces diverse, high-speed, high-volume data, we used big data technology, a recent technology applied in various fields of research. In this paper we apply big data technology to process and analyse data in order to predict consumer behaviour on social media. We analyse consumer behaviour against defined parameters and criteria, including consumer perception of and attitude towards social media. Before prediction, we pre-process the data to obtain quality data, so that quality decisions can be taken based on the outcome of our model. We use predictive big data analytics techniques to analyse and predict consumer behaviour in this paper.
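The pre-process-then-predict workflow the abstract outlines can be sketched as follows; the engagement features (likes, shares, sentiment), the purchase label, and the logistic-regression model are illustrative assumptions, not the authors' actual pipeline.

```python
# Illustrative pre-processing and purchase prediction on social-media data.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical engagement records gathered from social platforms.
df = pd.DataFrame({
    "likes":     [120, 3, 45, 300, 8, 210],
    "shares":    [30, 0, 5, 80, 1, 40],
    "sentiment": [0.8, -0.2, 0.4, 0.9, 0.1, 0.7],
    "purchased": [1, 0, 0, 1, 0, 1],
})

# Pre-processing for quality data: drop incomplete rows, clip outliers.
df = df.dropna()
df["likes"] = df["likes"].clip(upper=df["likes"].quantile(0.99))

X, y = df[["likes", "shares", "sentiment"]], df["purchased"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, random_state=0)

# Predictive model: does a consumer's engagement predict a purchase?
clf = LogisticRegression().fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```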


Author(s):  
Fitri Retrialisca ◽  
Umi Chotijah

Background: Big data technology has been used in several sectors in Indonesia. Its adoption offers great potential for research, particularly into how big data is implemented in manufacturing companies. The Data Warehousing Institute (TDWI) Maturity Model is a tool for measuring the "as-is" state of a big data implementation along 5 main dimensions; the maturity level indicates how well an organization has adapted to big data technology.
Objective: This study aims to measure the maturity level of big data technology implementation at the manufacturing company PT. XYZ. This measurement matters because it reveals how structured, high-volume data is managed and supports more transparent reporting, helping the company make well-informed decisions and increase stakeholder trust.
Methods: This study uses qualitative methods to analyze the research data with the TDWI Maturity Model. Interviews were used to collect respondent data; the interview guidelines were prepared around the 5 dimensions and 50 indicators defined by TDWI.
Results: The research showed that the company's implementation of big data technology as a whole has reached the corporate adoption level. The infrastructure, data management, and analytics dimensions have reached the corporate adoption level, while the organizational and governance dimensions remain at the early adoption level.
Conclusion: The maturity level of big data technology adoption in manufacturing companies can be measured with qualitative methods using the TDWI Maturity Model, with interview guides for data collection built around the 5 dimensions and 50 indicators defined by TDWI.
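Scoring such an assessment can be sketched as averaging interview-derived indicator scores per dimension and mapping each average onto a maturity stage. The dimension names follow the abstract, but the indicator names, 1-5 scores, and stage thresholds below are illustrative assumptions rather than TDWI's published scoring.

```python
# Illustrative TDWI-style maturity scoring per dimension.
from statistics import mean

STAGES = ["nascent", "pre-adoption", "early adoption",
          "corporate adoption", "mature/visionary"]

def stage(score: float) -> str:
    """Map a 1-5 average indicator score onto an assumed stage scale."""
    return STAGES[min(int(score) - 1, len(STAGES) - 1)]

# Hypothetical averaged interview scores (1-5) per dimension.
scores = {
    "organization":    {"sponsorship": 3, "strategy": 4, "culture": 3},
    "infrastructure":  {"platform": 4, "integration": 4},
    "data management": {"quality": 4, "metadata": 4},
    "analytics":       {"skills": 5, "tooling": 4},
    "governance":      {"policy": 3, "stewardship": 4},
}

for dim, indicators in scores.items():
    avg = mean(indicators.values())
    print(f"{dim}: {avg:.1f} -> {stage(avg)}")
```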


2020 ◽  
Vol 7 (1) ◽  
Author(s):  
Abou Zakaria Faroukhi ◽  
Imane El Alaoui ◽  
Youssef Gahi ◽  
Aouatif Amine

Value chains have long been considered a key model for managing value creation processes efficiently within organizations. However, with the digitization of end-to-end processes that adopt data as a main source of value, traditional value chain models have become outdated. Researchers have therefore developed new value chain models, called Data Value Chains, to support data-driven organizations. With the emergence of Big Data, these evolved into Big Data Value Chains (BDVCs), designed to face new data-related challenges such as high volume, velocity, and variety. A BDVC describes the data flow within organizations that rely on Big Data to extract valuable insights: a set of ordered steps, built mainly on Big Data analytics tools, leading from data generation to knowledge creation. Advances in Big Data and the BDVC, with clear processes for aggregating and exploiting data, have given rise to what is called data monetization: using an organization's data to generate profit, whether by selling the data directly for cash or by relying on it to create value indirectly. Monetizing data is not as new a concept as it looks, but in the era of Big Data and the BDVC it is becoming attractive. The aim of this paper is to provide a comprehensive review of value creation, data value, and Big Data value chains with their different steps. This review leads us to construct an end-to-end, exhaustive BDVC that regroups most of the phases addressed in the literature. Furthermore, we present a possible evolution of that generic BDVC to support Big Data monetization, discussing different approaches that enable data monetization throughout data value chains. Finally, we highlight the need to adopt specific data monetization models suited to Big Data's specificities.
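The ordered-steps character of a BDVC, with monetization as a possible final stage, can be sketched as a simple pipeline; the stage names and toy functions below are assumptions distilled from the abstract, not the paper's exact chain.

```python
# Illustrative Big Data Value Chain: ordered stages from data generation to
# knowledge creation, with monetization appended as a final stage.
from typing import Any, Callable, List

Stage = Callable[[Any], Any]

def run_chain(data: Any, stages: List[Stage]) -> Any:
    """Feed the output of each stage into the next, in order."""
    for s in stages:
        data = s(data)
    return data

generate  = lambda _: [{"sensor": 1, "reading": 21.5}, {"sensor": 2, "reading": None}]
acquire   = lambda d: [r for r in d if r["reading"] is not None]  # ingest + clean
store     = lambda d: {"records": d}                              # persist
analyze   = lambda s: {"mean": sum(r["reading"] for r in s["records"]) / len(s["records"])}
create    = lambda a: f"insight: mean reading = {a['mean']:.1f}"  # knowledge creation
monetize  = lambda k: {"data_product": k, "price_usd": 9.99}      # direct/indirect value

print(run_chain(None, [generate, acquire, store, analyze, create, monetize]))
```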


2020 ◽  
Vol 11 (4) ◽  
Author(s):  
Elena Deeva

Today, Big Data is one of the key economic and information resources in the transformation of the digital economy and in the competitiveness of consulting firms. Analysis of the scientific literature indicates the lack of a unified approach to the concept of "Big Data" in the consulting market, which makes the development of Big Data technology in the consulting services market and its commercial application a relevant topic. The article discusses the evolution of approaches to the concept of "Big Data", the prospects and factors of its development, and sources of data acquisition, as well as the benefits of using Big Data technology. The research summarizes available knowledge about the nature and development of Big Data technology in the consulting services market, reveals the power of the tool available to the analyst, and draws a distinction between the terms "Deep Data" and "Big Data". We also examine issues related to Big Data processing costs, in particular expensive equipment and wages for qualified specialists, along with the problems of confidentiality and loss of information.


Symmetry ◽  
2018 ◽  
Vol 10 (12) ◽  
pp. 698 ◽  
Author(s):  
Shabana Ramzan ◽  
Imran Bajwa ◽  
Rafaqut Kazmi

Handling complexity in the data of information systems has emerged as a serious challenge in recent times. Typical relational databases have limited ability to manage the discrete and heterogeneous nature of modern data, and the complexity of data in relational databases is now so high that efficient retrieval of information has become a bottleneck in traditional information systems. Big Data, on the other hand, has emerged as a practical solution for heterogeneous and complex data (structured, semi-structured, and unstructured) by providing architectural support for complex data and a tool-kit for its efficient analysis. Organizations that are sticking to relational databases while facing the challenge of handling complex data need to migrate their data to a Big Data solution to gain benefits such as horizontal scalability, real-time interaction, and the handling of high-volume data. However, such migration from relational databases to Big Data is itself a challenge because of the complexity of the data. In this paper, we introduce a novel approach that handles the complexity of automatically transforming an existing relational database (MySQL) into a Big Data solution (Oracle NoSQL). The approach supports a bi-fold transformation (schema-to-schema and data-to-data) to minimize data complexity and allow improved analysis. A software prototype for this transformation was also developed as a proof of concept. Experimental results show the correctness of our transformations, which outperform other similar approaches.
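The bi-fold idea can be sketched in miniature: derive a key-value schema from a relational table (schema-to-schema), then convert each row into a document (data-to-data). In the sketch below, sqlite3 stands in for MySQL so the example is self-contained, and plain JSON documents stand in for Oracle NoSQL records; the paper's actual transformation rules are more elaborate.

```python
# Illustrative bi-fold relational-to-NoSQL transformation.
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE book (id INTEGER PRIMARY KEY, title TEXT, year INTEGER)")
conn.executemany("INSERT INTO book VALUES (?, ?, ?)",
                 [(1, "Dream of the Red Chamber", 1791),
                  (2, "Journey to the West", 1592)])

# Schema-to-schema: read column names from the relational catalog.
columns = [row[1] for row in conn.execute("PRAGMA table_info(book)")]

# Data-to-data: each row becomes a key-value document keyed by table/primary key.
for row in conn.execute("SELECT * FROM book"):
    doc = dict(zip(columns, row))
    print(f"book/{doc['id']} ->", json.dumps(doc))
```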


Author(s):  
Yu Zhang ◽  
Yan-Ge Wang ◽  
Yan-Ping Bai ◽  
Yong-Zhen Li ◽  
Zhao-Yong Lv ◽  
...  

2017 ◽  
Author(s):  
Michael J Madison ◽  
Brett M. Frischmann ◽  
Katherine J. Strandburg

This chapter describes methods for systematically studying knowledge commons as an institutional mode of governance of knowledge and information resources, including references to adjacent but distinct approaches to research that looks primarily to the role(s) of intellectual property systems in institutional contexts concerning innovation and creativity.

Knowledge commons refers to an institutional approach (commons) to governing the production, use, management, and/or preservation of a particular type of resource (knowledge or information, including resources linked to innovative and creative practice). Commons refers to a form of community management or governance. It applies to a resource, and it involves a group or community of people who share access to and/or use of the resource. Commons does not denote the resource, the community, a place, or a thing; it is the institutional arrangement of these elements and their coordination via combinations of law and other formal rules; social norms, customs, and informal discipline; and technological and other material constraints. Community or collective self-governance of the resource, by individuals who collaborate or coordinate among themselves effectively, is a key feature of commons as an institution, but self-governance may be and often is linked to other formal and informal governance mechanisms.

For purposes of this chapter, knowledge refers to a broad set of intellectual and cultural resources. There are important differences between the various resources captured by such a broad definition; for example, knowledge, information, and data may differ from each other in meaningful ways. But an inclusive term is necessary to permit knowledge commons researchers to capture and study a broad and inclusive range of commons institutions and to highlight the importance of examining knowledge commons governance as part of dynamic, ecological contexts.

