Trends in Design, Optimization, Languages, and Analytical Processing of Big Data (DOLAP 2020)

2022, Vol 104, pp. 101929
Author(s): Katja Hose, Oscar Romero, Il-Yeol Song
2012, Vol 5 (12), pp. 1802-1813
Author(s): Yanpei Chen, Sara Alspaugh, Randy Katz

2019, Vol 1 (2), pp. 65-80
Author(s): Hangjun Zhou, Jieyu Zhou, Guang Sun, Wangdong Jiang, Chuntian Luo, ...

Author(s): V. A. Konovalov

The paper assesses the prospects for applying the big data paradigm to socio-economic systems by analyzing the factors that distinguish it from the well-known scientific ideas of data synthesis and decomposition. The idea of extracting knowledge directly from big data is analyzed. The article compares approaches to extracting knowledge from big data: algebraic analysis and the multidimensional data analysis used in OLAP (OnLine Analytical Processing) systems. An intermediate conclusion is drawn that it is advisable to divide systems for working with big data into two main classes: automatic and non-automatic. To assess the result of extracting knowledge from big data, it is proposed to use well-known scientific criteria: reliability and efficiency. Two components of reliability are considered: methodical and instrumental. The main goals of knowledge extraction in socio-economic systems are highlighted: forecasting and support for management decision-making. The factors that distinguish big data, namely volume, variety, and velocity, are analyzed as applied to the study of socio-economic systems. The expediency of introducing into big data processing systems a universe that describes the variety of big data and of source protocols is analyzed. The impact of the properties of samples drawn from big data, namely incompleteness, heterogeneity, and non-representativeness, on the choice of mathematical methods for processing big data is analyzed. The conclusion is drawn that a systemic, comprehensive, and cautious approach is needed when fundamental decisions of a socio-economic nature are made using the big data paradigm in the study of individual socio-economic subsystems.


Author(s): Salman Ahmed Shaikh, Kousuke Nakabasami, Toshiyuki Amagasa, Hiroyuki Kitagawa

Data warehousing and multidimensional analysis go hand in hand. Data warehouses provide clean and partially normalized data for fast, consistent, and interactive multidimensional analysis. With advances in data generation and collection technologies, businesses and organizations now generate big data (defined by the 3Vs: volume, variety, and velocity). Since big data differs from traditional data, it requires a different set of tools and techniques for processing and analysis. This chapter discusses multidimensional analysis (also known as online analytical processing, or OLAP) of big data, focusing particularly on data streams, which are characterized by huge volume and high velocity. OLAP requires maintaining a number of materialized views corresponding to user queries so that analysis remains interactive. Specifically, this chapter discusses the issues in maintaining materialized views over data streams, the use of special windows for materialized view maintenance, and the issues of coupling a stream processing engine (SPE) with an OLAP engine.
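To make the view-maintenance idea concrete, the following is a minimal sketch (not the chapter's own implementation) of a materialized aggregate view kept incrementally over a tumbling window of stream tuples; the measure, dimension keys, and window length are illustrative assumptions.

```python
from collections import defaultdict

class TumblingWindowView:
    """Materialized SUM/COUNT view over a tumbling window of stream tuples.

    Each incoming tuple is applied incrementally, so the view is always
    ready for interactive OLAP-style queries without rescanning the stream.
    """

    def __init__(self, window_seconds=60):
        self.window_seconds = window_seconds
        self.window_start = None
        self.cells = defaultdict(lambda: {"sum": 0.0, "count": 0})

    def insert(self, timestamp, dims, measure):
        # Close the current window when a tuple falls outside it.
        if self.window_start is None:
            self.window_start = timestamp
        elif timestamp - self.window_start >= self.window_seconds:
            self.flush()
            self.window_start = timestamp
        # Incremental maintenance: update only the affected cell.
        cell = self.cells[dims]
        cell["sum"] += measure
        cell["count"] += 1

    def query(self, dims):
        """Point query against the materialized view (average per cell)."""
        cell = self.cells.get(dims)
        if cell is None or cell["count"] == 0:
            return None
        return cell["sum"] / cell["count"]

    def flush(self):
        # In a real SPE/OLAP coupling this would be handed to the OLAP engine.
        self.cells.clear()

# Hypothetical usage: (region, product) dimensions with a sales measure.
view = TumblingWindowView(window_seconds=60)
view.insert(0.0, ("EU", "books"), 12.5)
view.insert(1.2, ("EU", "books"), 7.5)
print(view.query(("EU", "books")))  # -> 10.0
```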


Author(s): M. Baby Nirmala

In this emerging era of Analytics 3.0, where big data is at the heart of discussion in every sector, many vendors help organizations realize the full potential of this vast data through their new-generation analytical processing systems. This chapter briefly introduces the categories of analytical processing systems and then surveys prominent analytical platforms, appliances, frameworks, engines, fabrics, solutions, tools, and products from big data vendors. Finally, it deals with big data analytics in the network, its security, and WAN optimization tools and techniques for cloud-based big data analytics.


2020, Vol 9 (2), pp. 88
Author(s): Damião Ribeiro de Almeida, Cláudio de Souza Baptista, Fabio Gomes de Andrade, Amilcar Soares

Trajectory data allow the study of the behavior of moving objects, from humans to animals. Wireless communication, mobile devices, and technologies such as the Global Positioning System (GPS) have contributed to the growth of the trajectory research field. With the considerable growth in the volume of trajectory data, storing such data in Spatial Database Management Systems (SDBMS) has become challenging. Hence, Spatial Big Data emerges as a data management technology for indexing, storing, and retrieving large volumes of spatio-temporal data. A Data Warehouse (DW) is one of the premier infrastructures for Big Data analysis and complex query processing, and Trajectory Data Warehouses (TDW) emerge as DWs dedicated to trajectory data analysis. The primary goals of this survey are to list and discuss the problems addressed with TDWs and to point out directions for future work in this field. The article collects the state of the art on Big Data trajectory analytics. Understanding how research on trajectory data is being conducted, which main techniques have been used, and how they can be embedded in an Online Analytical Processing (OLAP) architecture can enhance the efficiency and development of decision-making systems that deal with trajectory data.
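As an illustration of how raw trajectory data can feed an OLAP-style analysis (this sketch is not taken from the survey; the grid size, time bucket, and field names are assumptions), GPS points can be rolled up into a small fact table keyed by spatial cell and time bucket:

```python
from collections import defaultdict

def rollup_trajectories(points, cell_deg=0.01, bucket_seconds=3600):
    """Roll up raw GPS points into (spatial cell, time bucket) fact rows.

    points: iterable of (object_id, lat, lon, unix_time) tuples.
    Returns a dict mapping (cell_lat, cell_lon, bucket) ->
    {"points": n, "objects": set of object ids}, i.e. a tiny cube that
    an OLAP engine could slice by space, time, or moving object.
    """
    facts = defaultdict(lambda: {"points": 0, "objects": set()})
    for obj_id, lat, lon, ts in points:
        cell_lat = round(lat // cell_deg * cell_deg, 6)
        cell_lon = round(lon // cell_deg * cell_deg, 6)
        bucket = int(ts // bucket_seconds)
        key = (cell_lat, cell_lon, bucket)
        facts[key]["points"] += 1
        facts[key]["objects"].add(obj_id)
    return facts

# Hypothetical usage with three GPS fixes from two moving objects.
sample = [
    ("bus-1", 40.4168, -3.7038, 1_600_000_000),
    ("bus-1", 40.4171, -3.7035, 1_600_000_300),
    ("bus-2", 40.4169, -3.7040, 1_600_000_600),
]
for key, row in rollup_trajectories(sample).items():
    print(key, row["points"], len(row["objects"]))
```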


2020, Vol 14 (2), pp. 71
Author(s): Omar Mohammed Zragat

This study aimed to discover the impact of big data, in terms of its dimensions (Variety, Velocity, Volume, and Veracity), on the quality of financial reports in the presence of business intelligence, in terms of its dimensions (Online Analytical Processing (OLAP), Data Mining, and Data Warehouse), as a moderating variable in Jordanian telecom companies. The sample included 139 employees of Jordanian telecom companies. Multiple and stepwise linear regression were used to test the effect of the independent variable on the dependent variable, and hierarchical regression analysis was used to test that effect in the presence of the moderating variable. The study reached a set of results, the most prominent of which are that using big data has a statistically significant effect on improving the quality of financial reports, and that business intelligence strengthens the impact of big data, in terms of its dimensions (Volume, Velocity, Variety, and Veracity), on the quality of financial reports. The study recommends making use of big data and resorting to business intelligence solutions because of their great role in improving the quality of financial reports and thus supporting decision-making for a large group of users.
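For readers unfamiliar with the hierarchical (moderated) regression design mentioned above, the sketch below shows the general pattern on synthetic, hypothetical survey data; the column names, coefficients, and the single interaction term are assumptions for illustration and do not reproduce the study's actual model or results.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical survey-style data: scores for the big data dimensions,
# a business intelligence score, and rated financial report quality.
rng = np.random.default_rng(0)
n = 139
df = pd.DataFrame({
    "volume": rng.normal(3.5, 0.8, n),
    "velocity": rng.normal(3.4, 0.7, n),
    "variety": rng.normal(3.6, 0.9, n),
    "veracity": rng.normal(3.3, 0.8, n),
    "bi": rng.normal(3.7, 0.6, n),
})
df["quality"] = (0.4 * df["volume"] + 0.3 * df["veracity"]
                 + 0.2 * df["bi"] * df["volume"] + rng.normal(0, 0.5, n))

# Step 1: main effects of the big data dimensions only.
step1 = smf.ols("quality ~ volume + velocity + variety + veracity", data=df).fit()

# Step 2: add the moderator and an interaction term; a rise in R-squared
# from step 1 to step 2 indicates a moderating effect of business intelligence.
step2 = smf.ols(
    "quality ~ volume + velocity + variety + veracity + bi + bi:volume",
    data=df,
).fit()

print(f"R2 step 1: {step1.rsquared:.3f}, R2 step 2: {step2.rsquared:.3f}")
```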

