Master Data Quality in the Era of Digitization - Toward Inter-organizational Master Data Quality in Value Networks: A Problem Identification

Author(s):  
Thomas Schäffer ◽  
Christian Leyh


Author(s):  
Mohammed Ragheb Hakawati ◽  
Yasmin Yacob ◽  
Amiza Amir ◽  
Jabiry M. Mohammed ◽  
Khalid Jamal Jadaa

Extensible Markup Language (XML) is emerging as the primary standard for representing and exchanging data: accounting for more than 60% of the total, XML is considered the most dominant document type on the web. Nevertheless, the quality of XML data is not as expected. XML integrity constraints, especially XML functional dependencies (XFDs), play an important role in keeping XML datasets as consistent as possible, but their ability to solve data quality issues is still intangible. The main reason is that traditional data dependencies were introduced to maintain the consistency of the schema rather than that of the data. The purpose of this study is to introduce a method for discovering pattern tableaus for XML conditional dependencies, to be used for enhancing XML document consistency as part of the data quality improvement phases. The notions of conditional dependencies as new rules are designed mainly for improving data instances; they extend traditional XML dependencies by enforcing pattern tableaus of semantically related constants. Subsequently, a set of minimal approximate conditional dependencies (XCFDs, XCINDs) is discovered and learned from the XML tree using a set of mining algorithms. The discovered patterns can then be used as master data to detect inconsistencies that do not respect the majority of the dataset.
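As a rough illustration of the idea described above, the following Python sketch mines a majority-based pattern tableau from an XML tree: for each left-hand-side value it learns the dominant right-hand-side value as an approximate conditional dependency, so that tuples disagreeing with the majority can be flagged as inconsistent. The element names (order, zip, city), the path arguments, and the confidence threshold are illustrative assumptions, not the paper's actual XCFD/XCIND mining algorithms.

```python
# Minimal sketch of discovering an approximate XML conditional dependency
# (XCFD-style pattern tableau). Paths, tags, and threshold are assumptions.
import xml.etree.ElementTree as ET
from collections import Counter, defaultdict

def discover_patterns(xml_text, lhs_path, rhs_path, record_tag, min_conf=0.9):
    """For each LHS value, learn the majority RHS value as a pattern tableau
    row; tuples disagreeing with the majority can be flagged as inconsistent."""
    root = ET.fromstring(xml_text)
    groups = defaultdict(Counter)
    for rec in root.iter(record_tag):
        lhs = rec.findtext(lhs_path)
        rhs = rec.findtext(rhs_path)
        if lhs is not None and rhs is not None:
            groups[lhs][rhs] += 1
    tableau = {}
    for lhs, counts in groups.items():
        rhs, freq = counts.most_common(1)[0]
        support = sum(counts.values())
        if freq / support >= min_conf:   # approximate dependency holds
            tableau[lhs] = rhs           # one pattern row: lhs -> rhs
    return tableau

xml_doc = """<orders>
  <order><zip>1010</zip><city>Vienna</city></order>
  <order><zip>1010</zip><city>Vienna</city></order>
  <order><zip>1010</zip><city>Wien</city></order>
  <order><zip>8000</zip><city>Zurich</city></order>
</orders>"""

patterns = discover_patterns(xml_doc, "zip", "city", "order", min_conf=0.6)
print(patterns)  # {'1010': 'Vienna', '8000': 'Zurich'} -- majority patterns
```

Records whose values contradict a learned pattern row (here, the "Wien" tuple) would then be reported as inconsistencies against the majority of the dataset.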


2011 ◽  
Vol 24 (3) ◽  
pp. 288-303 ◽  
Author(s):  
Anders Haug ◽  
Jan Stentoft Arlbjørn

2020 ◽  
Author(s):  
Panagiotis Lepeniotis

This research aims to identify the useful impact of Master Data Management (MDM) on a Business Transformation Programme (BTP). A BTP consists of three distinct phases. The first phase is the selection of the appropriate set of application systems and the introduction of new business processes across multiple lines of business and different channels. The second phase is the implementation of the new application systems and the data migration process. The third and final phase is the transition from the legacy application systems and business processes to the newly defined framework of processes and technologies that ensures business and data continuity. MDM plays a pivotal role during the second and third phases of a BTP and is defined as the process that runs in parallel with any other business process, assigning responsibility to people and technology for processing, capturing, maintaining and defining data accuracy based on a defined set of rules. Multiple parameters relevant to MDM, such as change management, a lack of practical commitment from senior management, non-compliance with data governance policies, the implementation of new integrations, or pre-existing data quality challenges, among others, can jeopardise the successful completion of a BTP. As MDM becomes significant in the second phase, the research focuses on how the invasive circumstances arising from such parameters during this BTP phase and beyond may be addressed by the BTP's programme directorate to enhance decision-making through appropriate attention to MDM. The programme committee of a BTP would thus become aware of how to: a) manage master data, b) reinforce enterprise data quality, and c) govern the overall BTP lifecycle by safeguarding data practices. Alongside an extensive review of the scholarly literature and industry resources to establish the research aims from the outset, the research applied a deductive and interpretive methodology to two data audits as case studies plus a series of semi-structured interviews, all subjected to comprehensive qualitative analysis. Each BTP either faced challenges or was about to face them. The different roles of the participants and the different phases of each BTP in which the audits took place allowed the research to employ these multiple methods to reflect different aspects of the same issue. Referring to the Data Audit Framework for added structure, the two data audits took place in two different companies. The first company was performing the audit after a failed BTP and already had an MDM function within the organisation; the audit focused on the performance of that existing function. The second company had initiated a BTP and wanted to ensure that the required controls were in place for a successful delivery. These two audits provided valuable case study evidence for the evaluation of the decisions made during the BTP with regard to a) master data, b) what led the programme directorate to these decisions, and c) how the decisions affected the outcome of the BTP as well as the organisation itself. The interviews consisted of twenty-eight semi-structured questions and involved eighteen people with diverse backgrounds from divergent functions of the business. All the interviewees were participating in a BTP with an underlying MDM process.
The interviews provided evidence on a) how different roles within the programme reflect and react under specific circumstances and b) how each workstream prioritised data-related activities in conjunction with the overall programme. From the case study audits and the interviews, the research developed an enhanced understanding of the reasons behind the decisions made during a BTP concerning MDM, and of how these decisions consequently affect the successful implementation of a BTP. From these findings, the research proposes a novel MDM-impacted BTP decision model that brings together its contributions to knowledge and forms the basis for future work.
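To make concrete what a "defined set of rules" for data accuracy can look like in an MDM process, the following is a minimal rule-based validation sketch of the kind that might run alongside a BTP data migration. The field names, rules, and records are hypothetical examples, not drawn from the study's audits.

```python
# Minimal sketch of rule-based master data validation during a migration.
# All fields, rules, and reference values below are illustrative assumptions.
import re

RULES = {
    "customer_id": lambda v: bool(v),                       # completeness
    "email": lambda v: bool(re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", v or "")),
    "country": lambda v: v in {"GB", "DE", "GR"},           # reference data
}

def validate_record(record):
    """Return the names of the rules a master data record violates."""
    return [field for field, rule in RULES.items() if not rule(record.get(field))]

legacy_records = [
    {"customer_id": "C001", "email": "a@b.com", "country": "GB"},
    {"customer_id": "", "email": "not-an-email", "country": "XX"},
]

for rec in legacy_records:
    issues = validate_record(rec)
    if issues:
        print(rec["customer_id"] or "<missing id>", "violates:", issues)
```

In practice such rules would be owned and versioned by the MDM function, so that the programme directorate can track rule violations as a measurable data quality signal across the BTP phases.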


2016 ◽  
Vol 56 (2) ◽  
pp. 575 ◽  
Author(s):  
Mark Puzey ◽  
Stephen Latham

Many organisations in the industry constantly struggle to manage oil and gas master data effectively so that operations and maintenance can be carried out efficiently and safely. Poor data quality costs organisations time and money and increases the risk of production downtime, resulting in inefficiencies, rework, and sub-optimal decision support. In this extended abstract, the authors provide insight from their experience into overcoming many years of poor data quality that impairs the ability to analyse maintenance reliability data, which can effectively drive efficiencies and reduce outages caused by breakdowns. This includes insights from a global fact-finding mission analysing the big data management of large oil and gas and related companies around the world, with valuable lessons in understanding why these types of data projects fail so often. Challenges include finding where and why data quality is unreliable, as well as how to keep cleansed and new data clean and effective and promote efficiency. Other challenges include changing organisational behaviour to value data and providing streamlined processes to sustain data quality.

