Creating a User-Centric Data Flow Visualization: A Case Study

Author(s):  
Karin Butler ◽  
Michelle Leger ◽  
Denis Bueno ◽  
Christopher Cuellar ◽  
Michael J. Haass ◽  
...  
Author(s):  
Bipin Chadha ◽  
R. E. Fulton ◽  
J. C. Calhoun

Abstract Information integration is vital for keeping manufacturing operations competitive. A case study approach has been adopted to better understand the role of information in integrated manufacturing. Information is now considered a corporate asset. The creation, processing, movement, and security of information are therefore as important as those of the products/services of an enterprise. The case studies have helped in identifying the issues involved in developing an information system and supporting software framework for a manufacturing enterprise. They have also helped in refining an integration model and in identifying the characteristics desirable in modeling methodologies and tools. This paper describes a case study dealing with the integrated manufacture of optical fiber products. A phased development and implementation approach was adopted in which a small, manageable slice of the system is selected for the case study, followed by functional modeling (IDEF0) and data flow modeling (Data Flow Diagrams). This identifies the pieces of information of interest. The information relationships are modeled using Extended Entity Relationship (EER) diagrams, which are then mapped onto a relational model. The relational tables thus obtained were implemented on a commercial database management system. The functional constraints and application interfaces were then built using SQL and commercial application interface tools. The sections of the paper describe the functional models, data flow diagrams, EER diagrams, relational database design, and user/application interfaces developed for the system. Implementation experiences and observations are discussed, followed by plans for the next phase of the system.
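The EER-to-relational mapping step described above can be caricatured in a few lines. This is a hypothetical sketch only: the entity, attribute, and table names (FiberSpool, ProductionLot) are invented for illustration and do not come from the paper.

```python
# Hypothetical sketch of mapping one EER entity to relational DDL.
# Entity/attribute names are illustrative, not from the case study.

def entity_to_ddl(entity, attributes, primary_key, foreign_keys=None):
    """Render one EER entity as a CREATE TABLE statement."""
    cols = [f"{name} {sqltype}" for name, sqltype in attributes]
    cols.append(f"PRIMARY KEY ({primary_key})")
    # Each EER relationship becomes a foreign-key constraint.
    for col, (ref_table, ref_col) in (foreign_keys or {}).items():
        cols.append(f"FOREIGN KEY ({col}) REFERENCES {ref_table}({ref_col})")
    return f"CREATE TABLE {entity} (\n  " + ",\n  ".join(cols) + "\n);"

ddl = entity_to_ddl(
    "FiberSpool",
    [("spool_id", "INTEGER"), ("fiber_type", "VARCHAR(40)"),
     ("length_m", "REAL"), ("lot_id", "INTEGER")],
    primary_key="spool_id",
    foreign_keys={"lot_id": ("ProductionLot", "lot_id")},
)
print(ddl)
```

The generated DDL could then be fed to any commercial DBMS, which is the implementation route the abstract describes.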


Author(s):  
Masahide Nakamura ◽  
Hiroshi Igaki ◽  
Takahiro Kimura ◽  
Kenichi Matsumoto

In order to support legacy migration to the service-oriented architecture (SOA), this paper presents a pragmatic method that derives candidate services from procedural programs. In SOA, every service is supposed to be a process (procedure) with (1) an open interface, (2) self-containedness, and (3) coarse granularity for business. Such services are identified from the source code and its data flow diagram (DFD) by analyzing data and control dependencies among processes. Specifically, the DFD is first obtained with reverse-engineering techniques. For each layer of the DFD, every data flow is classified into one of three categories. Using the data categories and the control dependencies among processes, four types of dependency are defined. Finally, six rules are applied that aggregate mutually dependent processes and extract them as a service. A case study with a liquor shop inventory control system extracts service candidates of various granularities.
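The aggregation idea, grouping mutually dependent processes into one service candidate, can be sketched as finding strongly connected components of the dependency graph. This is an illustrative simplification, not the paper's six rules; the process names are invented.

```python
# Illustrative sketch: processes that mutually depend on each other form a
# strongly connected component and are grouped as one candidate service.
from itertools import count

def service_candidates(deps):
    """deps: dict process -> set of processes it depends on.
    Returns one frozenset per candidate (Tarjan's SCC algorithm)."""
    index, low, on_stack, stack, out = {}, {}, set(), [], []
    counter = count()

    def strongconnect(v):
        index[v] = low[v] = next(counter)
        stack.append(v); on_stack.add(v)
        for w in deps.get(v, ()):
            if w not in index:
                strongconnect(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:           # v is the root of a component
            comp = set()
            while True:
                w = stack.pop(); on_stack.discard(w); comp.add(w)
                if w == v:
                    break
            out.append(frozenset(comp))

    for v in list(deps):
        if v not in index:
            strongconnect(v)
    return out

# Toy dependency graph: "order" and "stock" depend on each other.
deps = {"order": {"stock"}, "stock": {"order"}, "report": {"stock"}}
cands = service_candidates(deps)
print(cands)  # {"order", "stock"} merge into one candidate; "report" stands alone
```

Self-containedness and coarse granularity, the other two SOA properties named above, would require further checks beyond this grouping step.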


Author(s):  
Farheen Siddiqui ◽  
Parul Agarwal

In this chapter, the authors work at feature-level opinion mining and make a user-centric selection of each feature. They then preprocess the data using techniques such as sentence splitting and stemming. Ontology plays an important role in annotating documents with metadata, improving the performance of information extraction and reasoning, and making data interoperable between different applications. To build the ontology, the method uses a (product) domain ontology, ConceptNet, and the WordNet database. The authors discuss the current approaches to the same problem through an extensive literature survey. In addition, an approach for ontology-based mining is proposed and exploited using a product as a case study, supported by an implementation. The chapter concludes with results and discussion.
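The preprocessing steps named above (sentence splitting, stemming) can be sketched with naive rules. This is a toy illustration under assumed rules; the chapter's actual pipeline and tools are not specified here.

```python
# Toy preprocessing sketch: regex sentence splitting plus a naive suffix
# stripper standing in for a real stemmer.
import re

def split_sentences(text):
    """Split on sentence-final punctuation followed by whitespace."""
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def naive_stem(word):
    """Strip a few common English suffixes (a crude stand-in for stemming)."""
    for suffix in ("ing", "edly", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

text = "The camera works well. Battery charging takes ages!"
sentences = split_sentences(text)
tokens = [naive_stem(w) for w in re.findall(r"[a-z]+", sentences[1].lower())]
print(sentences)  # two sentences
print(tokens)     # ['battery', 'charg', 'tak', 'age']
```

A real pipeline would use a proper stemmer (e.g. Porter) before looking features up against the domain ontology, ConceptNet, and WordNet.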


Author(s):  
Dan Johansson ◽  
Mikael Wiberg

Mobility has become an omnipresent part of our modern IT society. Alongside the general taxonomy of mobile users, terminals, sessions, and services, there are also more specialized forms of mobility. Context-Awareness Supported Application Mobility (CASAM), or “Application Mobility”, is one such form explored in this chapter. CASAM builds on the idea of using context to move an application between different devices during its execution in order to provide relevant information and/or services. The authors use a concept-driven approach to advance mobile systems research, integrating it with a more traditional user-centric method and a case study to further explore the concept of CASAM. To empirically situate their design work, the authors conducted an empirical study of a home care service group serving the Swedish municipality of Skellefteå, followed by an exercise in matching the properties of the CASAM concept to problems within the current workflow.
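The core CASAM mechanism, using context to pick the device an application should migrate to, can be caricatured as a small rule. This is a toy illustration only, not the authors' design; the device names and context fields are invented.

```python
# Toy illustration of context-driven device selection for application
# mobility. Field and device names are hypothetical.

def select_device(context, devices):
    """Prefer an available device in the user's current room;
    fall back to the user's phone."""
    for device in devices:
        if device["location"] == context["user_location"] and device["available"]:
            return device["name"]
    return "phone"

devices = [
    {"name": "kitchen_display", "location": "kitchen", "available": True},
    {"name": "office_pc", "location": "office", "available": False},
]
target = select_device({"user_location": "kitchen"}, devices)
print(target)  # kitchen_display
```

A real CASAM system would also have to serialize and transfer the running application's state, which is where most of the engineering difficulty lies.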


Author(s):  
Shaoying Liu

FRSM (Formal Requirements Specification Method) is a structured formal language and method for requirements analysis and specification construction based on data flow analysis. It uses a formalized DeMarco data flow diagram to describe the overall structure of systems and a VDM-SL-like formal notation to describe precisely the functionality of components in the diagrams. This paper first describes the formal syntax and semantics of FRSM and then presents an example of using the axioms and inference rules given in the definition of the formal semantics to check the consistency of specifications. A case study applying FRSM to a practical example is described to demonstrate the principle of constructing requirements specifications and to uncover the benefits and deficiencies of FRSM.
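One flavor of the consistency checking mentioned above can be sketched by treating each DFD process as carrying pre- and post-conditions and testing, over sample values, that what an upstream process guarantees satisfies what its downstream neighbor requires. This is a crude sample-based sketch, not FRSM's axiomatic proof rules; the conditions are invented.

```python
# Crude consistency sketch: a data flow from process A to process B is
# consistent (over the samples tried) if every value admitted by A's
# postcondition also satisfies B's precondition.

def consistent(post, pre, samples):
    """True if every sample admitted by `post` also satisfies `pre`."""
    return all(pre(x) for x in samples if post(x))

# Hypothetical conditions: A promises a non-negative output,
# B requires its input to be below 100.
post_a = lambda x: x >= 0
pre_b = lambda x: x < 100

ok_small = consistent(post_a, pre_b, samples=range(-10, 50))
ok_large = consistent(post_a, pre_b, samples=range(-10, 200))
print(ok_small, ok_large)  # True False: larger samples expose a counterexample
```

FRSM's actual check is deductive (axioms and inference rules) rather than sample-based, so it can establish consistency outright instead of merely failing to find a counterexample.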


2021 ◽  
Vol 41 (11) ◽  
pp. 1660-1710
Author(s):  
Soroosh Saghiri ◽  
Vahid Mirzabeiki

Purpose
This paper aims to explore how omni-channel data flows should be integrated by specifying what data, omni-channel agents, and information and digital technologies (IDTs) should be considered and connected.

Design/methodology/approach
A multiple case study method is used with 17 British companies. The studies are supported by 68 interviews with the case companies and their consumers, 5 site visits, 4 focus group meetings, and the companies’ archival data and documentation.

Findings
This paper provides novel frameworks for omni-channel data flow integration from consumer and business perspectives. The frameworks consist of omni-channel agents, their data transactions, and their supporting IDTs. Relatedly, this paper formalizes omni-channel data flow integration in the forms of horizontal, vertical, and total integration and explores their contributions to the adaptability of the omni-channel as a complex adaptive system (CAS). It also discusses how inter-organizational governance mechanisms can support data flow integration and the relevant IDT implementations.

Research limitations/implications
The breadth and depth of the IDTs required for omni-channel integration demonstrate the necessity for omni-channel systems to move toward total integration. Therefore, supported by CAS and inter-organizational governance theories, this research indicates how data flow integration and IDTs can transform the omni-channel through self-organization and enhanced autonomy.

Originality/value
This research’s recommended frameworks provide a robust platform for formalizing data flow integration as the omni-channel’s core driver. Accordingly, it moves the literature beyond a basic description of “what omni-channel is” and offers a novel and significant debate on what specific data should be shared, at what levels, between which agents of the omni-channel, and with what type of relationship governance mechanism, to assure omni-channel horizontal, vertical, and total integration.


2002 ◽  
Vol 5 (1) ◽  
Author(s):  
Tatiana Sugeta ◽  
Adenilso da Silva Simao ◽  
Jose Carlos Maldonado ◽  
Maria Carolina Monard

Structured testing criteria are usually used to assess the adequacy of test case sets by defining coverage measures. Control- and data-flow-based criteria employ information about the program graph, as well as the definition and usage of variables, to establish the testing requirements. In this paper, we present an approach to prototyping supporting tools for control- and data-flow-based criteria. In the proposed approach, we use TXL, a language based on the transformational paradigm, to analyze and instrument the program under test. The instrumentation makes it possible to process the resulting data with a Prolog program that allows the tester to assess the adequacy of the test case set. A simple example illustrates the main ideas of our approach.
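The kind of requirement a data-flow criterion imposes, covering definition-use pairs, can be sketched from an instrumentation trace. This is an illustrative stand-in (in Python, not the paper's TXL/Prolog tooling), with an invented trace format of (event, variable, line) records.

```python
# Sketch: compute which def-use pairs an execution trace covers.
# A pair (var, def_line, use_line) is covered when a use of `var` is
# reached while the definition recorded at def_line is still the live one.

def covered_du_pairs(trace):
    live_def = {}   # variable -> line of its most recent definition
    covered = set()
    for event, var, line in trace:
        if event == "def":
            live_def[var] = line          # new definition kills the old one
        elif event == "use" and var in live_def:
            covered.add((var, live_def[var], line))
    return covered

# Hypothetical trace: x defined at line 1, used at 2, redefined at 3, used at 4.
trace = [("def", "x", 1), ("use", "x", 2), ("def", "x", 3), ("use", "x", 4)]
pairs = covered_du_pairs(trace)
print(sorted(pairs))  # [('x', 1, 2), ('x', 3, 4)]
```

Comparing such covered pairs against the set of all required pairs (derived statically from the program graph) yields the coverage measure a tester would report.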


2010 ◽  
pp. 217-226 ◽  
Author(s):  
Alberto Morell Pérez ◽  
Jorge Marx Gómez ◽  
Carlos Pérez Risquet
