Harmonized and Reversible Development Framework for HLA Based Interoperable Application

Author(s):
Zhiying Tu
Gregory Zacharewicz
David Chen

This chapter aims at proposing an approach to implement a distributed information system built on top of a federation of existing (reused) software components. The solution takes as its core consideration the problem of interoperability of the data exchanged between enterprises. The idea is to adapt and reuse experience from the development of enterprises' legacy information systems in order to create an HLA (High Level Architecture) based system of systems. In that perspective, this chapter proposes a new bi-directional development life cycle: MDA (Model Driven Architecture) and the HLA FEDEP (Federation Development and Execution Process) are combined and harmonized to implement distributed information systems from enterprise models of existing systems. Conversely, model reverse engineering techniques are used to help re-implement existing systems so that they become interoperable without being fully reconstructed. Then, building on the new features of HLA 1516 Evolved, this chapter proposes a solution based on an open source RTI, poRTIco, to implement Web-enabled federates.
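The federate lifecycle underlying this abstract (create or join a federation execution, subscribe to object classes, exchange attribute updates, resign) can be sketched conceptually. The following is a minimal, self-contained Python model of that lifecycle; all class and method names are illustrative stand-ins, not the actual poRTIco API, which exposes the IEEE 1516 interfaces in Java and C++.

```python
# Toy model of the HLA federate lifecycle: create/join a federation
# execution, publish/subscribe object attributes, exchange updates, resign.
# Names are illustrative only -- not the real poRTIco / IEEE 1516 API.

class RTI:
    """A toy Run-Time Infrastructure holding federation executions."""
    def __init__(self):
        self.federations = {}          # federation name -> {federate: subscriptions}

    def create_federation(self, name):
        self.federations.setdefault(name, {})

    def join(self, federation, federate):
        self.federations[federation][federate] = set()

    def subscribe(self, federation, federate, object_class):
        self.federations[federation][federate].add(object_class)

    def update_attributes(self, federation, sender, object_class, attributes):
        # Deliver the update to every other federate subscribed to the class
        # (the role of reflectAttributeValues callbacks in real HLA).
        for federate, subs in self.federations[federation].items():
            if federate is not sender and object_class in subs:
                federate.reflect(object_class, attributes)

    def resign(self, federation, federate):
        self.federations[federation].pop(federate, None)


class Federate:
    """A reused enterprise component wrapped as an HLA federate."""
    def __init__(self, name):
        self.name, self.received = name, []

    def reflect(self, object_class, attributes):
        # Callback invoked by the RTI when a subscribed object is updated.
        self.received.append((object_class, dict(attributes)))


rti = RTI()
rti.create_federation("enterprise")
erp, crm = Federate("ERP"), Federate("CRM")
rti.join("enterprise", erp)
rti.join("enterprise", crm)
rti.subscribe("enterprise", crm, "Order")
rti.update_attributes("enterprise", erp, "Order", {"id": 42, "status": "shipped"})
rti.resign("enterprise", erp)
```

The sketch shows why reused components need only a thin wrapper (the `Federate` class) to participate in the federation, which is the reuse argument the chapter makes.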

Author(s):
Gregory Zacharewicz
David Chen
Bruno Vallespir

This chapter aims at presenting some future trends, drawn from the final results of the INTEROP Network of Excellence (Chen et al., 2007), that prospect ways to support federation-oriented enterprise interoperability. First, a detailed definition of enterprise interoperability is given and the relevant concepts are structured in an enterprise interoperability framework. A review presents two early collaborative interoperable platforms developed in the 1990s, followed by a review of more recent solutions for establishing interoperability that aim to solve the earlier shortcomings. The study in this chapter focuses on ongoing research into solutions based on the High Level Architecture (HLA) at the technological level, because that standard tends to provide the desired properties. HLA was originally developed for military interoperability of large simulators with real environments. Indeed, the HLA standard has been successfully transposed to enterprise interoperability at the implementation level, reusing modeling and simulation techniques developed through years of experience with distributed systems to manage causality, confidentiality and interoperability. The presentation of HLA platforms is followed by a synthetic comparison of the various approaches. The state of the art concludes by presenting the MDA (Model Driven Architecture) methodology, which supports the transformation of enterprise models from the conceptual level to models for execution or simulation, and the emerging MDI (Model Driven Interoperability) methodology. From that postulate, it is proposed to rationalize the development lifecycle of distributed enterprise models by merging the HLA FEDEP and the MDA/MDI methodology into a unified lifecycle that guides the development of distributed enterprise models from the conceptual level to the implementation of an HLA-compliant solution.
Finally, as a perspective, some discussion is given of methodology to facilitate the reuse of legacy platforms in new interoperable systems of systems.
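The MDA transformation chain mentioned above (conceptual enterprise model refined down to an executable, HLA-targeted model) can be illustrated with a deliberately small sketch. The model contents and mapping rules below are invented for illustration; real MDA/MDI tooling works on standardized metamodels, not dictionaries.

```python
# Illustrative MDA chain: a computation-independent model (CIM) is refined
# into a platform-independent model (PIM), then into a platform-specific
# model (PSM) targeting an HLA federation. Mapping rules are invented.

def cim_to_pim(cim):
    # Keep only the process structure; drop business-level annotations.
    return {"processes": [p["name"] for p in cim["business_processes"]]}

def pim_to_psm(pim, platform="HLA"):
    # Map each process to a federate of the target platform.
    return {"platform": platform,
            "federates": [f"{name}Federate" for name in pim["processes"]]}

cim = {"business_processes": [{"name": "Order", "owner": "Sales"},
                              {"name": "Invoice", "owner": "Finance"}]}
psm = pim_to_psm(cim_to_pim(cim))
```

The point of the chain is that each step discards detail irrelevant to the next level while adding platform commitments only at the end, which is what lets the same conceptual model target different execution platforms.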


2013
Vol 61 (3)
pp. 569-579
Author(s):
A. Poniszewska-Marańda

Abstract Nowadays, the growth and complexity of the functionalities of current information systems, especially dynamic, distributed and heterogeneous ones, makes the design and creation of such systems a difficult task and, at the same time, strategic for businesses. A very important stage of data protection in an information system is the creation of a high-level model, independent of the software, satisfying the needs of system protection and security. The process of role engineering, i.e. identifying roles and setting them up in an organization, is a complex task. The paper presents the modeling and design stages of the role engineering process in the context of security schema development for information systems, in particular for dynamic, distributed information systems, based on the role concept and the usage concept. Such a schema is created primarily during the design phase of a system. Two actors, the application developer and the security administrator, should cooperate in this creation process to determine the minimal set of user roles consistent with the security constraints that guarantee the global security coherence of the system.
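The role-based schema described above can be sketched as a small data structure in which roles aggregate permissions, users are assigned roles, and security constraints are checked at assignment time. This is a minimal illustration of the general role concept, assuming a static separation-of-duty constraint as the example; role and permission names are invented.

```python
# Minimal role-engineering sketch: roles aggregate permissions, users hold
# roles, and a separation-of-duty constraint is enforced at assignment time
# so the role set stays coherent with the security policy.

class SecuritySchema:
    def __init__(self):
        self.role_permissions = {}     # role -> set of permissions
        self.user_roles = {}           # user -> set of roles
        self.exclusive = set()         # pairs of mutually exclusive roles

    def define_role(self, role, permissions):
        self.role_permissions[role] = set(permissions)

    def add_constraint(self, role_a, role_b):
        # Static separation of duty: no user may hold both roles.
        self.exclusive.add(frozenset((role_a, role_b)))

    def assign(self, user, role):
        held = self.user_roles.setdefault(user, set())
        for other in held:
            if frozenset((role, other)) in self.exclusive:
                raise ValueError(f"{role} conflicts with {other}")
        held.add(role)

    def permissions_of(self, user):
        perms = set()
        for role in self.user_roles.get(user, ()):
            perms |= self.role_permissions[role]
        return perms


schema = SecuritySchema()
schema.define_role("clerk", {"create_order"})
schema.define_role("auditor", {"read_audit_log"})
schema.add_constraint("clerk", "auditor")   # the administrator's constraint
schema.assign("alice", "clerk")             # the developer's assignment
```

Checking constraints at assignment time, rather than at access time, is what lets the developer and the security administrator detect an incoherent role set during the design phase, as the abstract advocates.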


Author(s):
Ricardo Pérez-Castillo
Ignacio García Rodríguez de Guzmán
Mario Piattini

Legacy information systems can be a serious headache for companies: on the one hand, these systems cannot be thrown away, since they have accumulated a great deal of valuable business knowledge over time; on the other hand, they cannot be maintained easily at an acceptable cost. For many years, reengineering has been a solution to this problem, because it facilitates the reuse of the software artifacts and knowledge embedded in the system. However, reengineering often fails because it relies on non-standardized, ad hoc processes. Currently, software modernization, and particularly ADM (Architecture-Driven Modernization), standardized by the OMG, is proving to be an important solution to that problem, since ADM advocates carrying out reengineering processes taking into account the principles and standards of model-driven development. This chapter provides an overview of ADM and shows how it allows legacy information systems to evolve, making them more agile, preserving the embedded business knowledge, and reducing maintenance costs. The chapter also presents the software archeology process using ADM and some ADM success stories.
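The model-driven reengineering idea behind ADM can be illustrated with a toy version of its "horseshoe": reverse-engineer a legacy artifact into a higher-abstraction model (in the spirit of OMG's KDM), then generate a restructured artifact from that model. The legacy snippet, the naive pattern matching, and the model format below are all invented for illustration; real ADM tools work on standardized metamodels.

```python
# Toy ADM horseshoe: legacy source -> abstract model -> modern artifact.
# The "legacy" language and the extraction rules are invented.

import re

LEGACY_SOURCE = """
PROCEDURE CALC_DISCOUNT
  IF TOTAL > 1000 THEN RATE = 5
  IF TOTAL > 5000 THEN RATE = 10
END
"""

def reverse_engineer(source):
    # Recover the embedded business rules (threshold -> rate) as a model,
    # preserving the business knowledge rather than the legacy syntax.
    rules = re.findall(r"IF TOTAL > (\d+) THEN RATE = (\d+)", source)
    return {"procedure": "CALC_DISCOUNT",
            "rules": [(int(t), int(r)) for t, r in rules]}

def forward_engineer(model):
    # Regenerate the same logic as a modern, maintainable function.
    def discount(total):
        rate = 0
        for threshold, r in sorted(model["rules"]):
            if total > threshold:
                rate = r
        return rate
    return discount

discount = forward_engineer(reverse_engineer(LEGACY_SOURCE))
```

Because the round trip goes through an explicit model, the business rules survive the migration even though none of the legacy syntax does, which is the knowledge-preservation property the chapter attributes to ADM.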


Author(s):
João Duarte
André Vasconcelos

In the past decade, the rush to technology has created several flaws in the management of computers, applications, middleware, and information systems, and organizations therefore struggle to understand how these elements behave. Even today, as Enterprise Architectures grow in significance and are acknowledged as advantageous artifacts for managing change, their benefit to the organization has yet to be fully explored. In this paper, the authors focus on the challenge of real-time information systems evaluation, using the enterprise architecture as a boundary object and a basis for communication. The proposed solution comprises five major steps: establishing a strong conceptual base for the evaluation of information systems, defining a high-level language for this activity, extending an architecture creation pipeline, creating a framework that automates it, and implementing that framework. The proposed conceptual framework avoids imprecise definitions of quality and quality attributes; it was materialized in a model-eval-display loop framework and implemented using Model Driven Software Development practices and tools. Finally, a prototype is applied to a real-world scenario to verify the conceptual solution in practice.
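A model-eval-display loop of the kind the abstract names can be sketched in a few lines: an architecture model annotated with measured quality attributes is evaluated against explicit thresholds, and the result is rendered for stakeholders. The components, attributes, and thresholds below are invented for illustration, not taken from the paper.

```python
# Sketch of a model-eval-display loop: (model) measured quality attributes
# per architectural component -> (eval) compare to explicit thresholds ->
# (display) a per-component summary usable as a boundary object.

def evaluate(model, thresholds):
    # Mark each measured attribute ok/degraded against its threshold,
    # so "quality" is never left as an imprecise judgment.
    return {component: {attr: ("ok" if value >= thresholds[attr] else "degraded")
                        for attr, value in metrics.items()}
            for component, metrics in model.items()}

def display(report):
    # One line per component, for communication with stakeholders.
    return [f"{c}: " + ", ".join(f"{a}={s}" for a, s in sorted(m.items()))
            for c, m in sorted(report.items())]

model = {"OrderService":   {"availability": 0.999, "throughput": 120},
         "BillingService": {"availability": 0.95,  "throughput": 80}}
thresholds = {"availability": 0.99, "throughput": 100}
lines = display(evaluate(model, thresholds))
```

Making the thresholds an explicit input is the sketch's stand-in for the paper's insistence on precise definitions of quality attributes: the evaluation step cannot run without them.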


2014
Author(s):
Rafael de Paula Herrera
Alan Salvany Felinto

Legacy information systems play key roles in organizations' development and growth. However, they can become a risk factor in the operations chain when they no longer meet demand or start acting as single points of failure. In this work, we propose a migration model able to handle systems that depend on relational databases, in which changes are driven through a distributed middleware. We also show how this approach was successfully applied while migrating a legacy information system to a cloud computing based infrastructure, adding fault tolerance to its architecture as a competitive advantage and enabling the related services to be clustered and then horizontally scaled on demand. All major concerns about how the whole solution and its aggregated tools were conceived are discussed in detail, so that they can be independently reproduced and integrated into other systems to achieve the same goals or improve their level of quality assurance.
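One common way a middleware can drive such a migration is dual-writing: during the transition, every write goes to both the legacy relational store and the new cloud-hosted store, so the new system can take over without a hard cutover. The sketch below illustrates that general pattern with in-memory stand-ins for the databases; it is an assumption about the mechanism, not the paper's actual middleware.

```python
# Dual-write migration sketch: the middleware forwards each write to both
# the legacy store and the migration target, and reads prefer the target,
# falling back to legacy data not yet migrated.

class Store:
    """In-memory stand-in for a database."""
    def __init__(self):
        self.rows = {}
    def write(self, key, value):
        self.rows[key] = value
    def read(self, key):
        return self.rows.get(key)

class MigrationMiddleware:
    def __init__(self, legacy, target):
        self.legacy, self.target = legacy, target
    def write(self, key, value):
        self.legacy.write(key, value)   # keep the legacy system consistent
        self.target.write(key, value)   # populate the new store as we go
    def read(self, key):
        value = self.target.read(key)   # prefer the migration target
        return value if value is not None else self.legacy.read(key)

legacy, cloud = Store(), Store()
mw = MigrationMiddleware(legacy, cloud)
mw.write("order:1", {"status": "paid"})
```

Because both stores stay consistent throughout, either side can serve traffic at any point, which removes the single point of failure the abstract warns about.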


Author(s):
Sandugash Serikbayeva
J. A. Tussupov
M. A. Sambetbayeva
A.S. Yerimbetova
G.B. Borankulova
...

Based on the analysis of typical scenarios of information servers, the tasks that must be solved when organizing an access control system for distributed information resources are formulated. The possibilities of the Z39.50 technologies, as the most suitable for building such a system, are considered. Within the framework of this technology, three access control models are discussed, which differ in the degree of integration of information server functions with the Z39.50 technologies. The creation and support of distributed information systems and electronic libraries that integrate heterogeneous information resources and operate in various software and hardware environments require special approaches to managing these systems. While the resources or the data themselves can be managed locally, even in distributed information systems, the task of managing access to distributed resources cannot be solved within the framework of local administration. The justification for this last thesis can be seen in the typical information server scenarios described below.
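The core point, that access to distributed resources cannot be administered locally, suggests a shared access-control service that every information server consults before serving a resource. The sketch below illustrates that arrangement in the spirit of the models discussed; the policy format and API are invented, and real Z39.50 access control is defined by the protocol's own access-control facility.

```python
# Centralized access control for distributed resources: one policy service,
# many information servers, each delegating authorization decisions to it.

class AccessControlService:
    """Shared policy store: local administration cannot provide this view."""
    def __init__(self):
        self.policy = {}               # (user, resource) -> allowed operations
    def grant(self, user, resource, operations):
        self.policy.setdefault((user, resource), set()).update(operations)
    def is_allowed(self, user, resource, operation):
        return operation in self.policy.get((user, resource), set())

class InformationServer:
    """One server in the distributed system, managing its own collection."""
    def __init__(self, name, acs):
        self.name, self.acs = name, acs
    def search(self, user, resource):
        if not self.acs.is_allowed(user, resource, "search"):
            raise PermissionError(f"{user} may not search {resource}")
        return f"results from {self.name}/{resource}"

acs = AccessControlService()
acs.grant("reader", "catalog", {"search"})
server = InformationServer("libA", acs)
```

The three models the article compares differ, in these terms, in how much of the `InformationServer` logic is integrated with the Z39.50 machinery versus kept outside it.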


2020
Vol 2020 (3)
pp. 38-46
Author(s):
Andrey Krasov
Stanislav Shterenberg
Andrey Moskal'chuk

As information infrastructures grow in scale and significance, the problems of ensuring information security, and of training skilled specialists in this field, become increasingly urgent. A virtual laboratory in which one can carry out one's own investigations is of considerable assistance here. This paper describes a method for creating a virtual laboratory in which penetration testing can be carried out and methods of securing information systems can be analyzed. The laboratory was built with licensed software, taking into account the system requirements of the computer hosting it. The paper then considers penetration testing against the pre-configured vulnerable machine Metasploitable 2. The next stage of the work demonstrates, in detail, the use of the created laboratory to run custom scripts, taking a vulnerable version of Drupal as the example; Python code enabling remote code execution is shown. The purpose of this work was to build an environment for studying and assessing an attacker's actions and for analyzing the origin of specific system vulnerabilities. The practical skills obtained will help increase the information security of real information systems.


2004
Vol 4 (4)
pp. 281-293
Author(s):
Teresa Wu
Nan Xie
Jennifer Blackhurst

Collaborative Product Development (CPD) is an engineering process that involves decision-making through iterative communication and coordination among product designers throughout the lifecycle of the product. The high level of collaboration and communication in a CPD environment requires a robust distributed information system. In this paper, we review existing research related to CPD and propose an information framework, termed VE4PD, based on the integration of web services and agent technologies to manage the CPD process. An implementation is developed to validate the approach.
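The agent-based coordination that frameworks like VE4PD rely on can be sketched as designer agents registering interest in parts of the product model and being notified when a design decision changes. The coordination service below stands in for the web-services layer, and all names and data are invented for illustration.

```python
# Publish/subscribe coordination sketch for CPD: designer agents subscribe
# to parts of the product model; a coordination service (standing in for
# the web-services layer) propagates design changes to interested agents.

class CoordinationService:
    def __init__(self):
        self.subscribers = {}          # part name -> list of agents
    def subscribe(self, part, agent):
        self.subscribers.setdefault(part, []).append(agent)
    def publish(self, part, change):
        # Iterative communication: every interested designer is notified.
        for agent in self.subscribers.get(part, []):
            agent.notify(part, change)

class DesignerAgent:
    def __init__(self, name):
        self.name, self.inbox = name, []
    def notify(self, part, change):
        self.inbox.append((part, change))

bus = CoordinationService()
mech, elec = DesignerAgent("mechanical"), DesignerAgent("electrical")
bus.subscribe("housing", mech)
bus.subscribe("housing", elec)
bus.publish("housing", {"wall_thickness_mm": 2.5})
```

Routing every change through the coordination service, rather than point-to-point between designers, is what keeps the iterative communication consistent as the number of collaborating designers grows.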

