Improving Data Quality and Data Governance Using Master Data Management: A Review

Author(s):
Sanny Hikmawati, Paulus Insap Santosa, Indriana Hidayah

Master data management (MDM) is a method of maintaining, integrating, and harmonizing master data to ensure consistent information across systems. The primary function of MDM is to control master data so that it remains consistent, accurate, current, relevant, and contextual enough to meet the differing business needs of applications and divisions. MDM also affects data governance, which establishes the roles, functions, and responsibilities of organizational actors in maintaining data quality. Poorly managed master data can become inaccurate and incomplete, leading to poor stakeholder decision-making. This article is a literature review that aims to determine how MDM improves data quality and data governance, and how the success of an MDM implementation can be assessed. The review shows that the MDM process can overcome data quality problems caused by data originating from various scattered sources. MDM encourages organizations to improve data management by adjusting the roles and responsibilities of business actors and information technology (IT) staff, documented through data governance. Organizations can assess the success of an MDM implementation in improving data quality and data governance by following existing frameworks.
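The consolidation step this review describes can be pictured with a small sketch. The following Python is a minimal, hypothetical illustration of merging duplicate records from scattered source systems into a single "golden record"; the field names, source systems, and the recency-based survivorship rule are assumptions for the example, not taken from the article.

```python
# Hypothetical sketch: consolidating customer master data from scattered
# sources into one "golden record" using a simple survivorship rule.
# Field names, source systems, and the recency-wins rule are assumptions.
from dataclasses import dataclass
from datetime import date

@dataclass
class SourceRecord:
    system: str           # originating application, e.g. "CRM" or "ERP"
    customer_id: str      # shared natural key across systems
    name: str
    email: str | None
    updated: date         # last-modified date in the source system

def golden_record(records: list[SourceRecord]) -> dict:
    """Merge duplicates for one customer: per attribute, keep the most
    recently updated non-empty value (a 'recency wins' rule)."""
    by_recency = sorted(records, key=lambda r: r.updated, reverse=True)
    merged: dict = {"customer_id": by_recency[0].customer_id}
    for attr in ("name", "email"):
        for r in by_recency:
            value = getattr(r, attr)
            if value:                 # first non-empty value wins
                merged[attr] = value
                break
        else:
            merged[attr] = None       # no source had a usable value
    return merged

records = [
    SourceRecord("CRM", "C-100", "A. Smith", None, date(2020, 5, 1)),
    SourceRecord("ERP", "C-100", "Alice Smith", "a.smith@example.com",
                 date(2019, 11, 3)),
]
print(golden_record(records))
# {'customer_id': 'C-100', 'name': 'A. Smith', 'email': 'a.smith@example.com'}
```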

2020
Author(s):
Panagiotis Lepeniotis

This research aims to identify the impact of Master Data Management (MDM) on a Business Transformation Programme (BTP). A BTP consists of three distinct phases. The first is the selection of an appropriate set of application systems and the introduction of new business processes across multiple lines of business and channels. The second is the implementation of the new application systems and the data migration process. The third and final phase is the transition from the legacy application systems and business processes to the newly defined framework of processes and technologies that ensures business and data continuity. MDM plays a pivotal role during the second and third phases of a BTP and is defined here as a process that runs in parallel with every other business process, assigning responsibility to people and technology for processing, capturing, maintaining, and defining data accuracy against a defined set of rules.

Multiple parameters relevant to MDM, such as change management, a lack of practical commitment from senior management, non-compliance with data governance policies, the implementation of new integrations, and pre-existing data quality challenges, among others, can jeopardise the successful completion of a BTP. Because MDM becomes significant in the second phase, the research focuses on how the adverse circumstances arising from such parameters during this phase and beyond may be addressed by the BTP's programme directorate to enhance decision-making through its influence on MDM. The programme committee of a BTP would thus become aware of how to: a) manage master data, b) reinforce enterprise data quality, and c) govern the overall BTP lifecycle by safeguarding data practices.

Alongside an extensive review of the learned literature and industry resources to establish the research aims, the research applied a deductive and interpretive methodology to two data audits treated as case studies, plus a series of semi-structured interviews, all subjected to comprehensive qualitative analysis. Each BTP either had faced challenges or was about to face them. The different roles of the participants and the different phases of each BTP in which the audits took place allowed the research to use these multiple methods to reflect different aspects of the same issue. Guided by the Data Audit Framework for added structure, the two data audits took place in two different companies. The first company was performing the audit after a failed BTP and already had an MDM function within the organisation; the audit focused on the performance of that existing function. The second company had initiated a BTP and wanted to ensure that the required controls were in place for a successful delivery. These two audits provided valuable case study evidence for evaluating the decisions made during a BTP with regard to a) master data, b) what led the programme directorate to those decisions, and c) how the decisions affected the outcome of the BTP and the organisation itself. The interviews consisted of twenty-eight semi-structured questions and involved eighteen people with diverse backgrounds from divergent functions of the business, all of whom were participating in a BTP with an underlying MDM process.

The interviews provided evidence on a) how different roles within the programme reflect and react under specific circumstances and b) how each workstream prioritised data-related activities in conjunction with the overall programme. From the case study audits and the interviews, the research developed an enhanced understanding of the reasons behind decisions made during a BTP concerning MDM, and of how those decisions affect the successful implementation of the BTP. From these findings, the research proposes a novel MDM-impacted BTP decision model that brings together its contributions to knowledge and forms the basis for future work.
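As a rough illustration of the "defined set of rules" for data accuracy mentioned above, the following Python sketches a rule-based audit of the kind an MDM workstream might run against legacy data before migration. The rules, field names, and reference values are assumptions for the example, not rules from the thesis.

```python
# Illustrative sketch only: rule-based data quality checks of the kind an
# MDM workstream might enforce during the migration phase of a BTP.
# Rules, field names, and reference values are assumptions.
import re

RULES = {
    "customer_id": lambda v: bool(v),                 # completeness
    "email": lambda v: v is None or                   # validity (optional field)
        re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
    "country": lambda v: v in {"GB", "GR", "DE"},     # conformity to reference data
}

def audit(rows: list[dict]) -> dict[str, float]:
    """Return the pass rate per rule -- the kind of metric a programme
    directorate could track migration readiness against."""
    passed = {rule: 0 for rule in RULES}
    for row in rows:
        for field, check in RULES.items():
            if check(row.get(field)):
                passed[field] += 1
    return {field: passed[field] / len(rows) for field in RULES}

legacy_rows = [
    {"customer_id": "C-1", "email": "a@b.com", "country": "GB"},
    {"customer_id": "", "email": "not-an-email", "country": "XX"},
]
print(audit(legacy_rows))
# {'customer_id': 0.5, 'email': 0.5, 'country': 0.5}
```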


2021
Vol 2021 (3)
pp. 24-26
Author(s):
Christiana Klingenberg, Kristin Weber

Companies expect Master Data Management (MDM) to deliver efficiency, transparency, and risk minimisation in the handling of their master data. MDM is meant to help organisations manage master data as a corporate asset. This article offers practical advice on how MDM implementations can be designed sustainably so that the data contributes to business success. It introduces the quality-oriented Data Governance Framework. The framework ensures that an implementation addresses all aspects of MDM, including strategic and organisational questions. Consistent alignment with data quality ensures that all parts of the company can put master data to productive use.
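As one hypothetical illustration of the organisational questions such a framework addresses, the following Python sketch records which governance role is accountable for which data quality dimension of a master data domain, and flags dimensions no one yet owns. The roles, dimensions, and domain names are illustrative assumptions, not content of the framework itself.

```python
# Minimal sketch of the organisational side of MDM governance: mapping
# data quality dimensions of a master data domain to accountable roles.
# Roles, dimensions, and the domain are assumptions for the example.
from dataclasses import dataclass, field

@dataclass
class DomainGovernance:
    domain: str                                     # e.g. "Material master"
    accountability: dict[str, str] = field(default_factory=dict)

    def assign(self, dimension: str, role: str) -> None:
        """Record the role accountable for one data quality dimension."""
        self.accountability[dimension] = role

    def unowned(self, required: set[str]) -> set[str]:
        """Dimensions the framework requires but no role yet owns --
        a gap check before declaring an MDM implementation complete."""
        return required - self.accountability.keys()

material = DomainGovernance("Material master")
material.assign("completeness", "Data Steward, Procurement")
material.assign("consistency", "Data Owner, Supply Chain")
print(material.unowned({"completeness", "consistency", "timeliness"}))
# {'timeliness'}
```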


Author(s):
David A. Weir, Stephen Murray, Pankaj Bhawnani, Douglas Rosenberg

Traditionally, business areas within an organization individually manage the data essential to their operation. This data may live in specialized software applications, in MS Excel or MS Access files, in e-mail filing, and in hardcopy documents. These applications and data stores support the local business area's decision-making and add to its knowledge, but the approach has problems. Data, knowledge, and decisions are captured only locally within the business area, and in many cases this information is not easily identifiable or available for enterprise-wide sharing. Furthermore, individuals within the business areas often keep "shadow files" of data and information whose accuracy, completeness, and timeliness are often questionable. Information created and managed at the local business level can be lost when a staff member leaves his or her role, which is especially significant given ongoing changes in today's workforce. Data must be properly managed and maintained to retain its value to the organization.

Developing and executing a "single version of the truth", or master data management, requires a partnership between the business areas, records management, legal, and information technology groups of an organization. Master data management is expected to yield significant gains in staff effectiveness, efficiency, and productivity. In 2011, Enbridge Pipelines applied the principles of master data management and trusted digital data repositories to a widely used, geographically dispersed small database (fewer than 10,000 records) with noted shortcomings such as incomplete or incorrect data, multiple shadow files, and inconsistent usage across the organization of the application that stewards the data. This paper provides an overview of best practices in developing an authoritative single source of data and Enbridge's experience in applying these practices to a real-world example. The challenges of the approach used by Enbridge and the lessons learned are examined and discussed.
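The shadow-file problem described above can be sketched as a reconciliation check against the authoritative source. The following Python compares shadow-file copies with the master repository and flags stale and orphaned entries; the record shape and matching key are assumptions for the example, since the paper does not publish the actual schema.

```python
# Hedged sketch of reconciling "shadow file" copies against the
# authoritative single source of data. Record shape and the matching
# key are assumptions for the example.
def reconcile(authoritative: dict[str, dict], shadow: dict[str, dict]):
    """Classify shadow-file entries as stale (values diverged from the
    master record) or orphaned (missing from the single source of truth)."""
    stale, orphaned = [], []
    for key, copy in shadow.items():
        master = authoritative.get(key)
        if master is None:
            orphaned.append(key)       # exists only in the shadow file
        elif master != copy:
            stale.append(key)          # diverged from the master record
    return stale, orphaned

master = {"PIPE-7": {"diameter_mm": 610, "status": "in service"}}
shadow = {
    "PIPE-7": {"diameter_mm": 610, "status": "abandoned"},   # stale copy
    "PIPE-9": {"diameter_mm": 324, "status": "in service"},  # no master record
}
print(reconcile(master, shadow))       # (['PIPE-7'], ['PIPE-9'])
```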

