7th Database Symposium: Engineering Data Management — Key to Success in a Global Market
Latest Publications

Total documents: 25 (five years: 0)
H-index: 0 (five years: 0)
Published by: American Society of Mechanical Engineers
ISBN: 9780791811696

Author(s): Lee P. Brintle, Elizabeth A. Koppes, J. K. Wu

Abstract The Tracked Vehicle Workstation (TVWS) is a distributed, object-oriented design environment that stores the large amounts of data associated with mechanical system design and analysis across heterogeneous UNIX machines on a network. A Distributed ASCII File Storage (DAFS) server was developed to provide an easy-to-port, easy-to-modify means of retrieving and updating files on remote machines. This paper describes the techniques the TVWS environment has used previously, including commercial solutions, single-node storage, and the Network File System (NFS), the difficulties encountered with each, and the current method based on DAFS.
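The abstract does not give the DAFS request format or interface; purely as an illustration of the kind of remote ASCII file retrieval and update service it describes, here is a minimal Python sketch. The FETCH/STORE request lines, the port number, and the function names are hypothetical, not taken from the paper.

    import socket

    # Hypothetical line-oriented requests in the spirit of a distributed ASCII
    # file server: "FETCH <path>" returns the file contents, "STORE <path>"
    # followed by the new contents replaces the remote copy.
    def fetch(host, path, port=7070):
        """Retrieve the contents of a remote ASCII file (illustrative only)."""
        with socket.create_connection((host, port)) as sock:
            sock.sendall(f"FETCH {path}\n".encode("ascii"))
            chunks = []
            while True:
                data = sock.recv(4096)
                if not data:
                    break
                chunks.append(data)
        return b"".join(chunks).decode("ascii")

    def store(host, path, text, port=7070):
        """Replace the contents of a remote ASCII file (illustrative only)."""
        with socket.create_connection((host, port)) as sock:
            sock.sendall(f"STORE {path}\n".encode("ascii") + text.encode("ascii"))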


Author(s): Yamini Gourishankar, Frank Weisgerber

Abstract It is observed that calculating the wind pressures on structures involves more data retrieval from the ASCE standard than subjective reasoning on the designer’s part. Once the initial design requirements are established, the computation procedure is straightforward. This paper discusses an approach to automating wind pressure computation for one-story and multi-story buildings using a data management strategy implemented with the ORACLE database management system. In the prototype system developed herein, the designer supplies the design requirements in the form of the structure’s exposure type, its dimensions, and its occupancy. Using these requirements, the program retrieves the necessary standards data from an independently maintained database and computes the wind pressures. The final output contains the wind pressures on the main wind force resisting system and on the components and claddings, for wind blowing parallel and perpendicular to the ridge. The knowledge encoded in the system was gained from ASCE codes, design guidelines, and interviews with various experts and practitioners. Several information modeling methodologies, such as the entity-relationship model and IDEF1X, were employed in the system analysis and design phase of this project. The prototype is implemented on an IBM PC using the ORACLE DBMS and the C programming language. Appendix A illustrates a sample run.
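The abstract outlines the flow (designer inputs, retrieval of standards data from the database, pressure computation) without giving the equations or schema. As a rough sketch of that pattern, the Python fragment below uses an in-memory SQLite table standing in for the ORACLE database and a simplified velocity-pressure relation of the form q_z = 0.00256·K_z·(I·V)^2 (psf, V in mph) of the kind used in ASCE standards of that period; the table name, columns, coefficient value, and sample inputs are illustrative, not the paper's.

    import sqlite3

    # Illustrative stand-in for the standards database (the paper used ORACLE);
    # the table layout and the single sample row are hypothetical.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE exposure_coeff (exposure TEXT, height_ft REAL, kz REAL)")
    conn.execute("INSERT INTO exposure_coeff VALUES ('C', 30.0, 0.98)")

    def velocity_pressure(exposure, height_ft, wind_speed_mph, importance=1.0):
        """q_z = 0.00256 * Kz * (I * V)**2 in psf, with Kz retrieved from the database."""
        row = conn.execute(
            "SELECT kz FROM exposure_coeff WHERE exposure=? AND height_ft>=? "
            "ORDER BY height_ft LIMIT 1",
            (exposure, height_ft),
        ).fetchone()
        kz = row[0]
        return 0.00256 * kz * (importance * wind_speed_mph) ** 2

    print(velocity_pressure("C", 30.0, 90.0))  # sample run with illustrative values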


Author(s): Art Goldschmidt

Abstract NAVFAC is a system of some 4200 workstations used to manage U.S. Naval Facilities worldwide. Its applications span the architectural, cartographic, civil, electrical, and mechanical engineering disciplines. NAVFAC includes a Modeling and Drawing Management System (MDMS) to manage the tens of thousands of disparate files in the environment. IBM responded to the dozens of detailed requirements for MDMS with a custom-built solution that includes several innovations in the areas of configuration management, version control, and globally distributed file management.


Author(s): K. J. Cleetus

Abstract In order to make the traditional product structure tree representation amenable to concurrent engineering, relationships such as perspective-of and dependent-on have to be added to the essential part-of relationship. Complex data can be held in proprietary formats, while simple data will be in a common representation for direct access by diverse disciplines. Coordination among team members in a project can be carried out using such a model. In addition, a virtually unified view of all the data is possible, even though the data may lie in distributed and heterogeneous databases. A necessary characteristic of such a model is that its time evolution should be easy to represent, in order to reflect the dynamic nature of product development, where the model itself, and not merely the data values, changes. Managing versions is also facilitated by the comprehensive structure of the Unified Product Data Model (UPDM).
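The abstract names the relationships (part-of, extended with perspective-of and dependent-on) without defining a schema. A minimal sketch of how such a product structure might be represented follows; the class layout, field names, and example objects are illustrative, not the UPDM itself.

    from dataclasses import dataclass, field

    @dataclass
    class Item:
        """One node in a product structure; the relationship names follow the
        abstract, the representation itself is hypothetical."""
        name: str
        version: int = 1
        parts: list["Item"] = field(default_factory=list)                # part-of: components of this item
        perspective_of: dict[str, "Item"] = field(default_factory=dict)  # discipline-specific views
        dependent_on: list["Item"] = field(default_factory=list)         # cross-branch dependencies

    vehicle = Item("vehicle")
    hull = Item("hull")
    fem_view = Item("hull-FEM-model")
    vehicle.parts.append(hull)                             # hull is part-of vehicle
    hull.perspective_of["structural-analysis"] = fem_view  # analysis view of the hull
    fem_view.dependent_on.append(hull)                     # the view must track changes to the hull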


Author(s): Morten Lovstad, Tor G. Syvertsen

Abstract Huge steel or reinforced concrete structures in deep waters support the installations for oil and gas production in the North Sea. Steady operations in a hostile environment require that structural safety and integrity are maintained. For rapid evaluation and assessment of structural integrity in case of modifications or emergencies, Structural Integrity Systems are established, comprising computational models and structural analysis programs. A major problem for structural assessment at short notice is keeping the analysis models updated and consistent with the actual state of the physical structure and the loadings. This paper proposes a layered approach to model integration, which enables maintenance of the models at a high level, from which detailed analysis models are derived in a consistent manner.
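The abstract describes the layered idea (maintain the model at a high level and derive detailed, consistent analysis models from it) without concrete data structures. The toy Python sketch below illustrates that derivation step only; the member description, fields, and meshing rule are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class HighLevelMember:
        """High-level description of a structural member, maintained as the master layer."""
        name: str
        length_m: float
        section: str          # e.g. a catalogue section designation
        n_elements: int = 4   # how finely to mesh when deriving the analysis model

    def derive_analysis_model(members):
        """Derive a detailed (finite-element style) model from the high-level layer,
        so the detailed model always reflects the current master data."""
        elements = []
        for m in members:
            seg = m.length_m / m.n_elements
            for i in range(m.n_elements):
                elements.append({"member": m.name, "section": m.section,
                                 "start_m": i * seg, "end_m": (i + 1) * seg})
        return elements

    model = derive_analysis_model([HighLevelMember("brace-12", 18.0, "CHS 508x16")])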


Author(s): William J. Rasdorf, Lisa K. Spainhour

Abstract Researchers and materials engineers require a greater understanding of the problems and solutions that emerge when integrating composite materials data with computer technology, so that utilitarian composite materials databases can be developed to effectively and efficiently support analysis and design software. Composite materials constitute a representational challenge due to their composition and use. This paper suggests that a conceptual composite materials data model and application software interfaces must be developed to support the dissemination and use of composite materials data. The paper primarily analyzes several of the problems facing developers of composite materials databases, arising from the complexity of the materials themselves and from the current lack of testing and data representation standards. Without a clear understanding of the scope and nature of these problems, there is no possibility of designing concise yet comprehensive composites data models, yet we feel that such an understanding is presently lacking. In addition, an effort is made to present possible solutions to these difficulties that are being suggested and/or implemented both by the authors and by other researchers in the field. Such an effort provides a firm foundation upon which future research may be based.
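The abstract argues for a conceptual composite materials data model without presenting one. As a rough illustration of why such data resist a flat property table, here is a hypothetical layered representation (constituents, plies, laminate layup) in Python; it is not the authors' model, and the names and values are invented.

    from dataclasses import dataclass

    @dataclass
    class Constituent:
        name: str            # e.g. a fibre or matrix designation
        modulus_gpa: float   # a single illustrative property

    @dataclass
    class Ply:
        fibre: Constituent
        matrix: Constituent
        fibre_volume_fraction: float
        orientation_deg: float        # properties depend on the layup, not just the constituents

    @dataclass
    class Laminate:
        stacking_sequence: list[Ply]  # the same plies in a different order are a different material

    carbon = Constituent("carbon fibre", 230.0)
    epoxy = Constituent("epoxy", 3.5)
    laminate = Laminate([Ply(carbon, epoxy, 0.6, angle) for angle in (0, 45, -45, 90)])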


Author(s): K. C. Morris

Abstract The problem of sharing data has many facets. The need to share data across multiple enterprises, different hardware platforms, different data storage paradigms and systems, and a variety of network architectures is growing. The emerging Standard for the Exchange of Product Model Data (STEP), being developed in the International Organization for Standardization (ISO), addresses this need by providing information models, called application protocols, which clearly and unambiguously describe data. The validity of these information models is essential for success in sharing data in a highly automated engineering environment. This paper describes the Data Probe: a tool for examining, editing, and managing EXPRESS-based data. The Data Probe tool supports the validation of STEP application protocols. The paper includes a description of the software architecture, the initial implementation, and plans for future enhancements. The software is designed as independent components that can be incorporated into other STEP-related systems or into software requiring general-purpose editing tools for structured information. The initial version of the Data Probe tool is based on two implementation mechanisms defined within STEP: the conceptual modeling language EXPRESS and the STEP exchange file format. Future work will focus on integrating a database system into the software. The software architecture and the use of object-oriented techniques enable code reusability and system extensibility and have been instrumental in a phased implementation. The software is under development at the National Institute of Standards and Technology and is in the public domain. The software supports the Validation Testing System, part of the Application Protocol Development Environment, at the CALS-sponsored National PDES Testbed. (PDES, Product Data Exchange using STEP, is the U.S. effort in support of the international standard.)
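The abstract refers to the EXPRESS language and the STEP exchange file format without showing either. As a simplified illustration, the sketch below reads instance lines of the general form #10=ENTITY(...); found in STEP exchange files; the regular expression and handling are deliberately minimal and are not the Data Probe implementation (real files also have header sections, multi-line records, and typed values).

    import re

    # Very simplified reader for instance lines of a STEP exchange file,
    # e.g.  #10=PRODUCT('P1','bracket','',(#20));
    # Only splits each line into an instance id, an entity name, and raw arguments.
    INSTANCE = re.compile(r"#(\d+)\s*=\s*([A-Z0-9_]+)\s*\((.*)\)\s*;")

    def read_instances(lines):
        instances = {}
        for line in lines:
            m = INSTANCE.match(line.strip())
            if m:
                entity_id, entity_name, raw_args = m.groups()
                instances[int(entity_id)] = (entity_name, raw_args)
        return instances

    sample = ["#10=PRODUCT('P1','bracket','',(#20));"]
    print(read_instances(sample))   # {10: ('PRODUCT', "'P1','bracket','',(#20)")}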


Author(s): Wolfgang Mueller, Bernd Kleinjohann

Abstract Most engineering tasks are highly data intensive, as they must cope with the increasing complexity of systems. Gigabytes of heterogeneous engineering data have to be managed consistently by a huge array of tools. This necessitates sophisticated integration techniques based on a common database management system in order to decrease the amount of data that have to be exchanged between these tools. In this paper we present a new approach to distributed design frameworks that integrates graphical as well as text processing tools. Tools may share the same graphical and logical data online, synchronized by the event mechanism of an object management system. The synchronization concept is based on a tight integration with an object-oriented object management system and provides means of keeping the graphical views of multiple agents consistent. We outline these concepts using the example of an integrated EXPRESS modeling workbench.
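The abstract describes tools that share data online, kept consistent by the event mechanism of an object management system. A generic publish/subscribe sketch of that idea follows; it is not the authors' system, and the object, tools, and attribute are invented.

    class SharedObject:
        """A shared design object; registered views are notified of every change."""
        def __init__(self, name):
            self.name = name
            self.attributes = {}
            self._observers = []

        def attach(self, callback):
            self._observers.append(callback)

        def update(self, key, value):
            self.attributes[key] = value
            for notify in self._observers:   # event mechanism keeping all views consistent
                notify(self.name, key, value)

    # Two hypothetical tools (a graphical editor and a text editor) observing the same object.
    obj = SharedObject("entity:Pump")
    obj.attach(lambda name, k, v: print(f"[graphic view] redraw {name}: {k}={v}"))
    obj.attach(lambda name, k, v: print(f"[text view] refresh {name}: {k}={v}"))
    obj.update("supertype", "Equipment")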


Author(s): R. Karinathi, V. Jagannathan, V. Montan, J. Petro, M. Sobolewski, ...

Abstract The engineering data of a large enterprise is typically distributed over a wide area and archived in a variety of databases and file systems. Access to such information is crucial to a team member, particularly in a concurrent engineering setting. However, this is not easy, because (1) a model of the relevant information is not available, and (2) there is no simple way to access the information without being knowledgeable about various computer data formats, file systems, and networks. Yet in a concurrent engineering environment there is every need to be aware of the perspectives of the other members of the team. We have developed a system called the Information Sharing Server (ISS) to enable access to diverse and distributed information within a corporation. Such data could be stored in different repositories, such as databases (relational, object-oriented, etc.) and file systems, including those that contain multiple media (text, graphics, audio, etc.). The ISS maintains an enterprise model that is visible to the user. The modeling of the enterprise is done in a language called EXPRESS, developed as part of the international STEP standard. The ISS also stores mappings from the model to the actual data residing in the repositories. The ISS accepts requests from the user and converts them into requests specific to a repository. The request is then communicated to the repository over the network and the results are fetched back to the user. The ISS is currently integrated with engineering data from two domains: electrical and mechanical. This paper describes the methodology of the ISS, the details of the implementation, and the extensions planned for the future. We believe the transparency offered by the ISS will make it a very useful tool for an engineer and will make it very convenient to integrate heterogeneous legacy databases.
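The abstract describes the ISS pattern (a visible enterprise model, stored mappings from model entities to repositories, and translation of user requests into repository-specific requests) without implementation detail. The mediator sketch below illustrates only that pattern; the entities, mappings, repositories, and request strings are all hypothetical.

    # Hypothetical mapping from enterprise-model entities to repositories;
    # in the ISS the model is written in EXPRESS, here it is just a dictionary.
    MAPPINGS = {
        "Connector": {"repository": "relational", "location": "parts_db.connectors"},
        "Bracket":   {"repository": "filesystem", "location": "/cad/mech/brackets/"},
    }

    def fetch_from_relational(location, key):
        # Stand-in for issuing an SQL query against the named table.
        return f"SELECT * FROM {location} WHERE id='{key}'"

    def fetch_from_filesystem(location, key):
        # Stand-in for retrieving a file from a remote file system.
        return f"read {location}{key}"

    HANDLERS = {"relational": fetch_from_relational, "filesystem": fetch_from_filesystem}

    def request(entity, key):
        """Translate a model-level request into a repository-specific one."""
        mapping = MAPPINGS[entity]
        return HANDLERS[mapping["repository"]](mapping["location"], key)

    print(request("Connector", "C-104"))   # repository details stay hidden from the engineer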


Author(s): Stephen J. Schoonmaker

Abstract This paper presents a management model for “in-house” engineering software. The model expands on a previous publication by the author and is intended to satisfy the requirements of ISO 9000 registration. ISO 9000 is the quality assurance standard of the International Organization for Standardization. ISO 9000 registration, which includes on-site independent auditing, is becoming a requirement for some businesses operating in the global marketplace. This paper draws on the author’s first-hand experience with this process. The author also urges the ASME to produce guidelines or standards on this material so that other mechanical engineers can benefit from his experience.

