Information Quality Management
Latest Publications

Total documents: 12 (five years: 0)
H-index: 2 (five years: 0)
Published by: IGI Global
ISBN: 9781599040240, 9781599040264

2011, pp. 190-220
Author(s): Zhanming Su, Zhanming Jin

Product Information Quality (PIQ) is critical in manufacturing enterprises, yet the field lacks comprehensive methodologies for its evaluation. In this paper, the authors develop such a methodology, called Activity-based Measuring and Evaluating of PIQ (AMEQ), to form a basis for PIQ measurement and evaluation. The methodology encompasses a road map for measuring and improving PIQ; an indicator system based on the characteristics and logic-temporal sequences of processes; and a set of models to quantitatively describe, operate on, and measure the design and manufacturing processes of mechanical product information. The methodology is illustrated through a business case, and its results are useful for identifying and reviewing the best areas for PIQ improvement activities.


2011, pp. 168-189
Author(s): Latif Al-Hakim

This chapter considers information flow as an important dimension of information quality and proposes a procedure for mapping it. The surgery management process (SMP) of a public hospital is used as a case to illustrate the steps of the developed procedure. The chapter discusses the issues that make information mapping of the SMP a challenging task and explains the difficulties traditional process mapping techniques have in determining the interdependencies and information flow within and between the various elements of SMP activities. The proposed procedure integrates a structured process mapping technique known as IDEF0 with another structured technique, the dependency structure matrix (DSM), to map the information flow within the SMP. The chapter indicates that feedback from other activities that degrades SMP performance can be reduced by administratively controlling the information flow through certain SMP activities.
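The feedback idea behind the DSM can be sketched briefly. In a DSM, activities are listed in execution order along both axes, and a mark above the diagonal means an earlier activity needs information produced by a later one, i.e., feedback. The activity names and dependencies below are illustrative assumptions, not taken from the chapter's hospital case:

```python
# A minimal DSM sketch with hypothetical surgery-management activities.
activities = ["referral", "assessment", "scheduling", "surgery", "recovery"]

# dsm[i][j] = 1 means activity i needs information produced by activity j.
dsm = [
    [0, 0, 0, 0, 0],  # referral depends on nothing
    [1, 0, 0, 0, 0],  # assessment uses referral information
    [0, 1, 0, 1, 0],  # scheduling uses assessment AND surgery info (feedback)
    [0, 1, 1, 0, 0],  # surgery uses assessment and scheduling info
    [0, 0, 0, 1, 0],  # recovery uses surgery information
]

def feedback_links(matrix):
    """Marks above the diagonal are feedback: information flowing
    from a later activity back to an earlier one in the sequence."""
    n = len(matrix)
    return [(i, j) for i in range(n) for j in range(i + 1, n) if matrix[i][j]]

for i, j in feedback_links(dsm):
    print(f"feedback: {activities[i]} <- {activities[j]}")
# -> feedback: scheduling <- surgery
```

Resequencing activities, or routing the offending information through an administrative control point, aims to move such marks below the diagonal.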


2011, pp. 275-291
Author(s): Suhaiza Zailani, Premkumar Rajagopal

This paper examines the important role information quality plays in supply chain performance. To make smarter use of global resources, companies should attend to the quality of information in order to provide better services to their customers. The paper examines the factors influencing information quality and investigates its influence on supply chain performance. Information quality is classified into four types: accuracy, completeness, consistency, and timeliness. The influencing factors comprise technological, organizational, and environmental characteristics, and supply chain performance is measured with financial and non-financial indices. The findings indicate that greater information quality improves supply chain performance, and that the level of information quality is shaped by technological, organizational, and environmental characteristics. The authors hope that understanding the factors that drive information quality toward better supply chain performance will not only inform researchers of better designs for studying information quality but also aid understanding of the intricate relationships among these factors.


2011, pp. 119-144
Author(s): Ismael Caballero, Mario Piattini

This chapter introduces a way of assessing and improving information quality in organizations. Information is one of the most important assets of today's enterprises, since it is the basis for organizational decisions. However, as information is produced from data, both data quality and information quality must be managed. Although many studies have proposed technical and managerial solutions to specific information quality problems, an integrative framework bringing these kinds of solutions together is still lacking. The authors' proposal is a framework for assessing and improving information quality through the concept of the Information Management Process (IMP). An IMP is assessed against an information quality maturity model by means of an assessment and improvement methodology. The framework provides a consistent road map for coordinating efforts and resources to manage information quality from a strategic perspective. A case study is included in the chapter as an application example.


Author(s): Zbigniew J. Gackowski

This chapter presents a qualitative inquiry into the universe of quality attributes of symbolic representations such as data and information values. It offers a rationale for a move from the internal toward the external, from the ontological to the teleological perspective. The focus is on approaches that derive attributes from established theories. The special relativity of quality as applied to information values is discussed at various levels of viewing those attributes within business-decision contexts. Four cases offer examples of top-down, dataflow-up examination of quality attributes to demonstrate the potential of the teleological perspective, and a rationale is given for broader use of qualified names for quality attributes. The evolutionary view of Liu and Chi (2002) and the purpose-focused view of Gackowski (2004, 2005a/b) of operations quality offer new potential for integrating the present theoretical contributions into a more complete, cohesive, and pragmatic model. Examples begin with the quantity and utility value of information, proceed through the direct primary attributes of quality and some of the direct secondary attributes, and end with samples of indirect attributes.


Author(s): Laure Berti-Equille

For non-collaborative distributed data sources, quality-driven query processing is difficult to achieve because the sources generally do not export data quality indicators. This chapter deals with extending and adapting query processing to take into account constraints on the quality of distributed data. It presents a novel framework for adaptive query processing over quality-extended query declarations, proposing an expressive query language extension that combines SQL with QML, the Quality of service Modeling Language of Frølund and Koistinen (1998), to flexibly define dimensions and metrics for the quality of data, sources, and services. The originality of the approach is to include the negotiation of quality contracts between the distributed data sources competing to answer the query. The principle is to dynamically find the best trade-off between local query cost and result quality. The author is convinced that quality of data (QoD) and quality of service (QoS) can be advantageously conciliated to tackle the problems of quality-aware query processing in distributed environments, and more generally that this opens innovative research perspectives for quality-aware adaptive query processing.
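The cost/quality trade-off can be sketched in a few lines. The source names, quality dimensions, and scoring rule below are illustrative assumptions, not the chapter's QML syntax or its negotiation protocol:

```python
# A minimal sketch of picking a source under a quality contract.
# All names and numbers are hypothetical.
sources = {
    "warehouse_eu": {"accuracy": 0.95, "freshness": 0.60, "cost": 4.0},
    "warehouse_us": {"accuracy": 0.80, "freshness": 0.90, "cost": 1.5},
    "cache_local":  {"accuracy": 0.70, "freshness": 0.99, "cost": 0.2},
}

contract = {"accuracy": 0.75, "freshness": 0.50}  # minimum quality required

def best_source(sources, contract, quality_weight=2.0):
    """Keep only sources satisfying the contract, then trade off
    aggregate quality against local query cost."""
    def score(q):
        return quality_weight * sum(q[d] for d in contract) - q["cost"]
    eligible = {name: q for name, q in sources.items()
                if all(q[d] >= v for d, v in contract.items())}
    return max(eligible, key=lambda name: score(eligible[name]))

print(best_source(sources, contract))  # -> warehouse_us
```

Here the local cache is rejected outright (it violates the accuracy clause of the contract), and the cheaper US source wins the trade-off despite the EU source's higher accuracy.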


Author(s): M. Mehdi Owrang O.

Current database technology involves processing large volumes of data in order to discover new knowledge. However, knowledge discovery on only the most detailed and recent data does not reveal long-term trends. Relational databases create new types of problems for knowledge discovery, since they are normalized to avoid redundancies and update anomalies, a design that makes them unsuitable for knowledge discovery. A key issue in any discovery system is ensuring the consistency, accuracy, and completeness of the discovered knowledge. The author describes these problems affecting the quality of discovered knowledge and provides some solutions for avoiding them.


Author(s): John Talburt, Richard Wang, Kimberly Hess, Emily Kuo

This chapter introduces abstract algebra as a means of understanding and creating data quality metrics for entity resolution, the process in which records determined to represent the same real-world entity are successively located and merged. Entity resolution is a particular form of data mining that is foundational to a number of applications in both industry and government; examples include commercial customer recognition systems and information sharing on "persons of interest" across federal intelligence agencies. Despite the importance of these applications, most of the data quality literature focuses on measuring the intrinsic quality of individual records rather than the quality of record grouping or integration. In this chapter, the authors describe current research into the creation and validation of quality metrics for entity resolution, primarily in the context of customer recognition systems. The approach is based on an algebraic view of the system as creating a partition of a set of entity records based on the indicative information for the entities in question. In this view, the relative quality of entity identification between two systems can be measured by the similarity of the partitions they produce. The authors discuss the difficulty of applying statistical cluster analysis to this problem when the datasets are large and propose an alternative index suitable for these situations. They also report some preliminary experimental results and outline areas and approaches for further research.
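The partition-comparison idea can be illustrated with a Rand-style pairwise agreement score; this is a standard measure used for illustration here, not the chapter's proposed index, and the record IDs are hypothetical:

```python
from itertools import combinations

def to_labels(partition, records):
    """Map each record to the id of the cluster containing it."""
    label = {}
    for cluster_id, cluster in enumerate(partition):
        for rec in cluster:
            label[rec] = cluster_id
    return [label[r] for r in records]

def pairwise_agreement(p1, p2):
    """Fraction of record pairs on which two partitions agree:
    both merge the pair, or both keep it apart. Enumerates O(n^2)
    pairs, which is exactly what strains on large datasets."""
    records = sorted(r for cluster in p1 for r in cluster)
    a, b = to_labels(p1, records), to_labels(p2, records)
    idx = range(len(records))
    agree = sum((a[i] == a[j]) == (b[i] == b[j])
                for i, j in combinations(idx, 2))
    total = len(records) * (len(records) - 1) // 2
    return agree / total

# Two entity-resolution systems grouping the same five records.
system_a = [{"r1", "r2"}, {"r3"}, {"r4", "r5"}]
system_b = [{"r1", "r2", "r3"}, {"r4"}, {"r5"}]
print(pairwise_agreement(system_a, system_b))  # -> 0.7
```

The quadratic number of pairs is why such indices become impractical at customer-database scale, motivating the alternative index the chapter proposes.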


2011, pp. 221-251
Author(s): Andy Koronios, Shien Lin

This chapter discusses the critical issues of information quality (IQ) associated with engineering asset management. It introduces an asset management (AM)-specific IQ framework as a means of studying IQ in engineering asset management. It argues that ensuring the quality of data in monitoring systems, control systems, maintenance systems, procurement systems, logistics systems, and a range of mission-support applications is essential for effective AM. There is also a growing need to address IQ in enterprise asset management (EAM) systems by analyzing existing practices and developing frameworks and models that assist engineering enterprises in capturing, processing, and delivering quality data and information. Furthermore, the authors hope that a better understanding of the current issues and the emerging key factors for ensuring high-quality AM data, gained through the AM IQ framework, will not only raise general IQ awareness in engineering asset management organisations but also help AM and IT professionals gain an insightful, overall appreciation of what AM IQ problems are and why they have emerged.


2011, pp. 145-167
Author(s): Elizabeth M. Pierce

This paper takes the basic constructs of the IP-Map diagram and demonstrates how they can be combined with the Event-Driven Process Chain Methodology’s family of diagrams. This extended family of diagrams can be used to more fully describe the organizational, procedural, informational, and communication structure of a business process while at the same time highlighting the manufacture of the information products used by that business process. The paper concludes with a review of requirements for a software package that will allow analysts to model and explore their business processes with an emphasis on improving the quality of the organization’s information products.

