An Advanced Coupling Complexity Metric for Evaluating the Quality of OO Software Modules

2018, Vol 6 (9), pp. 78-82
Author(s):  
N. Vijayaraj ◽  
T.N. Ravi
Author(s):  
Joannes Gullaksen

Abstract: Development of software applications for subsea engineering design and analysis is to a large extent based on the codes and standards used in the offshore industry for subsea pipelines. This paper describes a software application whose main purpose is to facilitate the design and analysis process so that results and documentation are generated automatically, increasing the quality of documentation. The current scope is a standard calculation tool covering different aspects of design in compliance with relevant offshore codes. A modularization technique is used to divide the software system into multiple discrete and independent modules, based on offshore codes, each capable of carrying out its tasks independently. All modules operate on a project model that is accessed directly by the other modules for analysis and performance prediction, allowing design changes to flow through automatically and facilitating smooth communication and coordination between different design activities. All modules share a number of common design features. The quality of the implementation of each offshore code in an independent software module is measured by defining the level of inter-dependability among modules and their interactions, and by defining the degree of intra-dependability within the elements of a module. The modularization technique brings other benefits as well, such as ease of maintenance and updates. The improvements relate to the objective of a state-of-the-art procedure for performing engineering, design and analysis using offshore codes implemented in a software application. The application is developed in the .NET C# language with MS Visual Studio technology, which provides a powerful graphical user interface well integrated into the Windows environment.
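The abstract measures module quality through inter-dependability between modules and intra-dependability within a module. The sketch below shows one simple way such counts could be computed; the module names, dependency data, and formulas are illustrative assumptions, not the metric defined in the paper, and the sketch is in Python for brevity rather than the paper's C# implementation.

```python
# Minimal sketch of coupling/cohesion-style measurements for a modular design.
# Module names, dependency data, and formulas are hypothetical illustrations.

def coupling(module, dependencies):
    """Inter-dependability: number of distinct external modules this module depends on."""
    return len({dep for dep in dependencies[module] if dep != module})

def cohesion(internal_calls, elements):
    """Intra-dependability: share of element pairs linked by an internal call."""
    possible_pairs = len(elements) * (len(elements) - 1) / 2
    return len(internal_calls) / possible_pairs if possible_pairs else 1.0

# Hypothetical modules, each implementing one offshore code and sharing a project model.
dependencies = {
    "WallThickness":     {"ProjectModel"},
    "OnBottomStability": {"ProjectModel", "WallThickness"},
}

print(coupling("OnBottomStability", dependencies))                            # -> 2
print(round(cohesion({("load", "check")}, ["load", "check", "report"]), 2))   # -> 0.33
```

Lower coupling and higher cohesion values would indicate a cleaner separation of the offshore codes into independent modules.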


Author(s):  
David del Rio Astorga ◽  
Manuel F Dolz ◽  
Luis Miguel Sánchez ◽  
J Daniel García ◽  
Marco Danelutto ◽  
...  

Since the ‘free lunch’ of processor performance is over, parallelism has become the new trend in hardware and architecture design. However, parallel resources deployed in data centers are underused in many cases, given that sequential programming is still deeply rooted in current software development. To address this problem, new methodologies and techniques for parallel programming have been progressively developed. For instance, parallel frameworks offering programming patterns allow concurrency in applications to be expressed so as to better exploit parallel hardware. Nevertheless, a large portion of production software, from a broad range of scientific and industrial areas, is still developed sequentially. Considering that these software modules contain thousands, or even millions, of lines of code, an extremely large amount of effort is needed to identify parallel regions. To pave the way in this area, this paper presents the Parallel Pattern Analyzer Tool, a software component that aids the discovery and annotation of parallel patterns in source code. This tool simplifies the transformation of sequential source code into parallel code. Specifically, we provide support for identifying the Map, Farm, and Pipeline parallel patterns and evaluate the quality of the detection on a set of different C++ applications.
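To make the Map pattern concrete: an element-wise loop with no cross-iteration dependencies is the canonical candidate for this annotation. The sketch below contrasts a sequential loop with its parallel-map rewrite; it is a generic Python illustration of the pattern, not output of the Parallel Pattern Analyzer Tool, which targets C++ sources.

```python
# Generic illustration of the Map pattern: independent iterations rewritten as a
# parallel map. Not produced by the Parallel Pattern Analyzer Tool.
from concurrent.futures import ProcessPoolExecutor

def transform(x):
    # Purely element-wise work with no shared state: the hallmark of a Map pattern.
    return x * x + 1

if __name__ == "__main__":
    data = list(range(10_000))

    # Sequential form: the kind of loop a pattern analyzer would flag as a Map candidate.
    sequential = [transform(x) for x in data]

    # Parallel form: the same computation expressed as a map over a pool of workers.
    with ProcessPoolExecutor() as pool:
        parallel = list(pool.map(transform, data, chunksize=500))

    assert parallel == sequential
```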


Author(s):  
A. L. Popov

Objectives. The aim of the study is to develop a set of interconnected information blocks for the formation, subsequent processing and application of data in the preparation of reporting and functional documents.

Method. Regularly used templates were chosen as the basis for developing the information blocks. Based on the structure and forms of the selected templates, software modules for the information block interfaces were developed.

Result. Based on the results of identifying and analyzing problematic issues in the activities of the operational duty shift, and with a view to automating and intellectualizing the work of specialists, the idea of developing and using universal information blocks as part of a database management system for automated workplaces was proposed. The interfaces of the information blocks are included in each workstation and support the processes of entering, changing, searching and filtering data, and of previewing, printing and saving documents. In developing the algorithms, original software solutions were applied: combining data prepared by several specialists, generating document details, transferring information between different workstations, monitoring the availability of documents, and accessing the necessary reference information.

Conclusion. The introduction of the developed information blocks (“Schedule of SLM duty”, “Requisites”, “Journal of Carriage of Duty”, “Control” and “Reference book”) into the complex of automated workplaces of the operational duty shift in the Crisis Management Center makes it possible to abandon manual filling of text templates and, as a result, to reduce labor intensity, save resources and improve the quality of preparation of operational reporting documents in the event of accidents, emergencies and fires, and of functional documents in the mode of daily activities, in accordance with the regulatory, job and functional duties of specialists.


2019, Vol 1 (1), pp. 59-63
Author(s):  
Andreas Andreas ◽  
Riska Natariasari

ERP systems are integrated information systems that can be applied in both business and non-business organizations. For business organizations they cover the entire functional enterprise, including accounting and finance, production, sales, purchasing, personnel and other functions. These functions are separated into software modules and interconnected through an integrated data center. Implementation of ERP systems does not always provide satisfaction for end-users. This paper examines how the quality of information systems and of service affects end-user satisfaction, specifically in banking companies located in Pekanbaru, Indonesia. The data analysis results reveal that information system quality and service quality each partially affect end-users’ satisfaction with ERP systems; these findings remind the designers of ERP systems to improve the quality of information systems and the availability of user-friendly service.


Author(s):  
Benjamin Denham ◽  
Russel Pears ◽  
Andy M. Connor

Evaluating software modules for inclusion in a Drupal website is a crucial and complex task that currently requires manual assessment of a number of module facets. This study applied data-mining techniques to identify quality-related metrics associated with highly popular and unpopular Drupal modules. The data-mining approach produced a set of important metrics and thresholds that highlight a strong relationship between the overall perceived reliability of a module and its popularity. Areas for future research into open-source software quality are presented, including a proposed module evaluation tool to aid developers in selecting high-quality modules.
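As an indication of how such a metric-and-threshold analysis can be set up (the metric names, data, and model below are hypothetical and are not the study's dataset or method), a minimal sketch:

```python
# Sketch of mining quality-related metrics for popular vs. unpopular modules.
# Metric names and values are hypothetical placeholders, not the study's data.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

modules = pd.DataFrame({
    "open_bug_ratio":    [0.05, 0.40, 0.10, 0.55, 0.08, 0.60],
    "releases_per_year": [4, 1, 3, 0, 5, 1],
    "popular":           [1, 0, 1, 0, 1, 0],   # 1 = highly installed
})

tree = DecisionTreeClassifier(max_depth=2, random_state=0)
tree.fit(modules[["open_bug_ratio", "releases_per_year"]], modules["popular"])

# The learned split points play the role of the metric thresholds that such a
# study would report as indicators of perceived reliability and popularity.
print(export_text(tree, feature_names=["open_bug_ratio", "releases_per_year"]))
```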


2021, Vol 12 (4), pp. 0-0

Software quality engineering applies numerous techniques for assuring the quality of software, namely testing, verification, validation, fault tolerance, and fault prediction. Machine learning techniques facilitate the identification of software modules as faulty or non-faulty. In most of the research, these approaches predict fault-prone modules within the same release of the software, although a model is more efficient and better validated when training and test data are taken from previous and subsequent releases of the software, respectively. The contribution of this paper is to predict faults in two scenarios, i.e., inter-release and intra-release prediction. A comparison of intra- and inter-release fault prediction, computing various performance metrics with machine learning methods, shows that intra-release prediction has better accuracy than inter-release prediction across all releases, while both scenarios achieve good results in comparison with existing research work.
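The two evaluation scenarios can be made concrete as follows: intra-release prediction splits one release's modules into training and test sets, while inter-release prediction trains on release N and tests on release N+1. The sketch below uses synthetic data and a generic classifier purely to show the experimental setup; it is not the paper's dataset, feature set, or model.

```python
# Sketch of intra-release vs. inter-release fault prediction evaluation.
# Features and data are synthetic placeholders, not the paper's dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def make_release(n=500):
    X = rng.normal(size=(n, 5))                                   # e.g. size/complexity metrics
    y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=n) > 1).astype(int)  # 1 = faulty module
    return X, y

X_r1, y_r1 = make_release()   # release N
X_r2, y_r2 = make_release()   # release N+1

# Intra-release: train and test within the same release.
X_tr, X_te, y_tr, y_te = train_test_split(X_r1, y_r1, test_size=0.3, random_state=0)
intra = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print("intra-release accuracy:", accuracy_score(y_te, intra.predict(X_te)))

# Inter-release: train on the previous release, test on the subsequent one.
inter = RandomForestClassifier(random_state=0).fit(X_r1, y_r1)
print("inter-release accuracy:", accuracy_score(y_r2, inter.predict(X_r2)))
```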


Author(s):  
K. T. Tokuyasu

During past investigations of immunoferritin localization of intracellular antigens in ultrathin frozen sections, we found that the degree of negative staining required to delineate ultrastructural details was often too dense for the recognition of ferritin particles. The quality of positive staining of ultrathin frozen sections, on the other hand, has generally been far inferior to that attainable in conventional plastic-embedded sections, particularly in the definition of membranes. As we discussed before, a main cause of this difficulty seemed to be the vulnerability of frozen sections to the damaging effects of air-water surface tension at the time of drying of the sections. Indeed, we found that the quality of positive staining is greatly improved when positively stained frozen sections are protected against the effects of surface tension by embedding them in thin layers of mechanically stable materials at the time of drying (unpublished).


Author(s):  
L. D. Jackel

Most production electron beam lithography systems can pattern minimum features a few tenths of a micron across. Linewidth in these systems is usually limited by the quality of the exposing beam and by electron scattering in the resist and substrate. By using a smaller spot along with exposure techniques that minimize scattering and its effects, laboratory e-beam lithography systems can now make features hundredths of a micron wide on standard substrate material. This talk will outline some of these high-resolution e-beam lithography techniques. We first consider parameters of the exposure process that limit resolution in organic resists. For concreteness, suppose that we have a “positive” resist in which exposing electrons break bonds in the resist molecules, thus increasing the exposed resist's solubility in a developer. The attainable resolution is obviously limited by the overall width of the exposing beam, but the spatial distribution of the beam intensity, the beam “profile”, also contributes to the resolution. Depending on the local electron dose, more or fewer resist bonds are broken, resulting in slower or faster dissolution in the developer.


Author(s):  
G. Lehmpfuhl

Introduction. In electron microscopic investigations of crystalline specimens, the direct observation of the electron diffraction pattern gives additional information about the specimen. The quality of this information depends on the quality of the crystals or of the crystal area contributing to the diffraction pattern. By selected-area diffraction in a conventional electron microscope, specimen areas as small as 1 µm in diameter can be investigated. It is well known that crystal areas of that size, which must be thin enough (on the order of 1000 Å) for electron microscopic investigations, are normally somewhat distorted by bending, or they are not homogeneous. Furthermore, the crystal surface is not well defined over such a large area. These facts cause a reduction of information in the diffraction pattern. The intensity of a diffraction spot, for example, depends on the crystal thickness. If the thickness is not uniform over the investigated area, one observes an averaged intensity, so that the intensity distribution in the diffraction pattern cannot be used for an analysis unless additional information is available.

