Design of Library Data Warehouse Using SnowFlake Scheme Method: Case Study: Library Database of Campus XYZ

Author(s):  
Akhmad Dahlan ◽  
Ferry Wahyu Wibowo


2018 ◽
Vol 14 (3) ◽  
pp. 44-68 ◽  
Author(s):  
Fatma Abdelhedi ◽  
Amal Ait Brahim ◽  
Gilles Zurfluh

Nowadays, most organizations need to improve their decision-making process using Big Data. To achieve this, they have to store Big Data, perform an analysis, and transform the results into useful and valuable information. To perform this, it is necessary to deal with new challenges in designing and creating a data warehouse. Traditionally, creating a data warehouse followed a well-governed process based on relational databases. The influence of Big Data challenged this traditional approach primarily due to the changing nature of data. As a result, using NoSQL databases has become a necessity to handle Big Data challenges. In this article, the authors show how to create a data warehouse on NoSQL systems. They propose the Object2NoSQL process that generates column-oriented physical models starting from a UML conceptual model. To ensure efficient automatic transformation, they propose a logical model that exhibits a sufficient degree of independence so as to enable its mapping to one or more column-oriented platforms. The authors validate their approach with experiments on a case study in the health care field.
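The kind of two-step mapping the abstract describes (UML conceptual model → platform-independent logical model → column-oriented physical model) can be sketched as follows. This is an illustrative reconstruction, not the paper's actual Object2NoSQL implementation; the class and attribute names are hypothetical.

```python
# Sketch of a conceptual-to-logical transformation in the spirit of
# Object2NoSQL: a UML-like class is mapped to a generic column-family
# table, which could then target a platform such as HBase or Cassandra.
from dataclasses import dataclass, field

@dataclass
class ConceptualClass:          # simplified stand-in for a UML class
    name: str
    attributes: dict            # attribute name -> type name

@dataclass
class ColumnFamilyTable:        # platform-independent logical model
    name: str
    families: dict = field(default_factory=dict)  # family -> [columns]

def to_logical(cls_: ConceptualClass) -> ColumnFamilyTable:
    """Group attributes into one column family per attribute type."""
    table = ColumnFamilyTable(name=cls_.name)
    for attr, typ in cls_.attributes.items():
        table.families.setdefault(typ, []).append(attr)
    return table

patient = ConceptualClass("Patient", {"id": "int", "name": "str", "age": "int"})
logical = to_logical(patient)
print(logical.families)  # {'int': ['id', 'age'], 'str': ['name']}
```

The grouping rule here (one family per type) is a deliberately simple placeholder; the real process would derive families from the conceptual model's associations and access patterns.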


Author(s):  
Samuel Otero Schmidt ◽  
Edmir Parada Vasques Prado

Organizations are currently investing more in information technology to store and process a vast amount of information. Generally, this information does not comply with any standard, which hinders the decision-making process. The cause of the difficulties can be attributed to Information Quality (IQ), which has technical characteristics related to the architecture used in Data Warehouse (DW) and Business Intelligence (BI) environments. On the basis of the relevant literature in IQ, DW, and BI, a research model was created to identify the relations between components of DW/BI architecture and IQ dimensions. This research model was applied in a real case study (Big Financial Company in Brazil). This case study involved semi-structured interviews with managers and analysts. This chapter attempts to provide a better understanding of the relations between IT architecture and IQ in DW and BI environments. The authors hope to motivate the discussion around the development of IQ-oriented architectures for BI and the relationship between these concepts.


Author(s):  
Eric Infield ◽  
Laura Sebastian-Coleman

This paper is a case study of the data quality program implemented for Galaxy, a large health care data warehouse owned by UnitedHealth Group and operated by Ingenix. The paper presents an overview of the program’s goals and components. It focuses on the program’s metrics and includes examples of the practical application of statistical process control (SPC) for measuring and reporting on data quality. These measurements pertain directly to the quality of the data and have implications for the wider question of information quality. The paper provides examples of specific measures, the benefits gained in applying them in a data warehouse setting, and lessons learned in the process of implementing and evolving the program.
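As a minimal sketch of how statistical process control applies to data quality measurement (not Galaxy's actual implementation), a daily metric such as the percentage of records failing a completeness check can be compared against control limits of mean ± 3 standard deviations computed from a baseline period:

```python
# Flag data-quality measurements that fall outside SPC control limits.
import statistics

def control_limits(baseline):
    """Return (lower, upper) control limits: mean +/- 3 sample stdevs."""
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return mean - 3 * sd, mean + 3 * sd

def out_of_control(baseline, new_points):
    """Return the new measurements outside the baseline control limits."""
    lo, hi = control_limits(baseline)
    return [x for x in new_points if x < lo or x > hi]

# Daily percentage of records failing a completeness check (illustrative)
baseline = [1.1, 0.9, 1.0, 1.2, 0.8, 1.0, 1.1, 0.9]
print(out_of_control(baseline, [1.0, 1.3, 4.5]))  # [4.5]
```

The value of SPC in this setting is that a stable process defines its own expected range, so a spike like 4.5% stands out against normal day-to-day variation rather than against an arbitrary fixed threshold.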


2009 ◽  
pp. 1787-1807
Author(s):  
Daniel Maier ◽  
Thomas Muegeli ◽  
Andrea Krejza

Customer investigations in the banking industry are carried out in connection with prosecutions, the administration of estates or other legal actions. The Investigation & Inquiries Department of Credit Suisse has to handle approximately 5,000 client investigations per year. To date, the investigation process has been very complex, time consuming and expensive. Several redundant query processes are needed to achieve satisfactory results. In the past few years, new regulatory requirements have led to a massive increase in the number of investigations to be performed. This case study describes how these requirements can be met by redesigning the process and building a data-warehouse-based application that automates most of the process. These two measures have significantly improved the customer investigation process, resulting in considerable cost and time savings for Credit Suisse.


A data warehouse (DW) is a repository of historical data widely used across organizations to analyze data for business decision-making. It acts as a decision support system that helps decision makers draw conclusions based on the analyzed data. DWs are used across many fields, including retail, insurance, finance, sales, services, health care, and education. This paper analyzes and proposes data warehouse design considerations for the supply chain. The design is explained with a detailed case study on the visibility of sales orders at various stages.
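A snowflake-style design for sales-order visibility, of the general kind this abstract describes, can be sketched with a fact table joined to normalized dimensions. This uses SQLite purely for illustration; the table and column names are assumptions, not the paper's actual design. The distinguishing feature of a snowflake schema is that dimensions are themselves normalized into sub-dimensions (here, product → category), unlike the flat dimensions of a star schema.

```python
# Minimal snowflake-schema sketch: one fact table, normalized dimensions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_category (category_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_product  (product_id INTEGER PRIMARY KEY, name TEXT,
                           category_id INTEGER REFERENCES dim_category);
CREATE TABLE dim_status   (status_id INTEGER PRIMARY KEY, stage TEXT);
CREATE TABLE fact_sales_order (
    order_id   INTEGER PRIMARY KEY,
    product_id INTEGER REFERENCES dim_product,
    status_id  INTEGER REFERENCES dim_status,
    quantity   INTEGER);
""")
conn.execute("INSERT INTO dim_category VALUES (1, 'Electronics')")
conn.execute("INSERT INTO dim_product  VALUES (10, 'Router', 1)")
conn.execute("INSERT INTO dim_status   VALUES (1, 'Shipped')")
conn.execute("INSERT INTO fact_sales_order VALUES (100, 10, 1, 5)")

# Order visibility: follow the snowflaked dimension chain at query time.
row = conn.execute("""
SELECT f.order_id, p.name, c.name, s.stage
FROM fact_sales_order f
JOIN dim_product  p ON f.product_id  = p.product_id
JOIN dim_category c ON p.category_id = c.category_id
JOIN dim_status   s ON f.status_id   = s.status_id
""").fetchone()
print(row)  # (100, 'Router', 'Electronics', 'Shipped')
```

The trade-off illustrated by the extra `dim_category` join is the usual one: snowflaking reduces dimension redundancy at the cost of longer join chains in analytical queries.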


2017 ◽  
Vol 2 (1) ◽  
pp. 15
Author(s):  
Becky Yoose

The rise of evidence-based practices and assessment in libraries in recent years, combined with tying outcomes to future funding and resource allotments, has made libraries more reliant on patron data to determine how to allocate limited resources and funding. Libraries that want to use data for research and analysis while also protecting patron privacy find themselves wondering how to balance these two priorities. This article explores The Seattle Public Library's attempt to strike a balance between patron privacy and data analysis through a data warehouse with de-identified patron data, as well as the implications of data warehouses and de-identification as an option for other libraries.
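Two common de-identification techniques for a warehouse like the one described are pseudonymization (replacing patron IDs with keyed hashes so records can still be joined for analysis) and generalization (coarsening attributes such as age or ZIP code to reduce re-identification risk). The sketch below illustrates both; it is not The Seattle Public Library's actual pipeline, and all field names are hypothetical.

```python
# Illustrative de-identification of a patron record before warehousing.
import hashlib

SALT = b"library-secret-salt"  # in practice, a secret kept out of the warehouse

def pseudonymize(patron_id: str) -> str:
    """Replace a patron ID with a salted hash; stable, so records still join."""
    return hashlib.sha256(SALT + patron_id.encode()).hexdigest()[:16]

def de_identify(record: dict) -> dict:
    """Return a record safe(r) to load: hashed key, generalized attributes."""
    return {
        "patron_key": pseudonymize(record["patron_id"]),
        "age_band": f"{(record['age'] // 10) * 10}s",  # e.g. 34 -> '30s'
        "zip3": record["zip"][:3],                     # coarsen ZIP code
        "checkouts": record["checkouts"],              # non-identifying measure
    }

print(de_identify({"patron_id": "P12345", "age": 34,
                   "zip": "98104", "checkouts": 7}))
```

Because the hash is deterministic for a given salt, analysts can still count distinct patrons or link a patron's activity over time without ever seeing the original identifier; rotating or destroying the salt severs that linkability when it is no longer needed.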

