Imperfect Debugging in Software Systems by SRGM with TEF

Author(s):  
K. Swetha

Abstract: The proposed work integrates two important processes, the testing-effort function (TEF) and imperfect debugging, into a software reliability growth model (SRGM) for analyzing the fault detection process (FDP) and the fault correction process (FCP). Debugging tools are applied to identify failures and correct the underlying faults in order to attain high reliability. The testing-effort function describes how testing resources are allocated over time, which considerably influences both the fault detection rate and the correction of the detected faults. In addition, new faults may be introduced as a side effect of the correction activity. The approach first incorporates the TEF and fault introduction into the FDP, and then develops the FCP as a delayed FDP with an explicit correction effort. Paired FDP and FCP models are derived under different assumptions about fault introduction and correction effort. Optimal software release policies for different criteria are also presented, together with examples. Keywords: FDP, FCP, TEF, Fault
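As a minimal, hedged illustration of this class of model (not the paper's exact formulation), the sketch below assumes a Goel-Okumoto-style FDP driven by a cumulative testing effort W(t), a constant fault-introduction fraction gamma to represent imperfect debugging, and an FCP modeled as the FDP delayed by a fixed correction lag. All function names and parameter values are illustrative assumptions.

```python
import numpy as np

def cumulative_effort(t, alpha=100.0, beta=0.1):
    """Illustrative testing-effort function W(t) (assumed exponential form)."""
    return alpha * (1.0 - np.exp(-beta * t))

def mean_faults_detected(t, a=150.0, b=0.05, gamma=0.02):
    """Mean value function of the FDP with imperfect debugging.

    Assumes an effort-dependent exponential SRGM in which a fraction gamma of
    corrections introduces new faults, giving
        m_d(t) = a / (1 - gamma) * (1 - exp(-(1 - gamma) * b * W(t))).
    """
    W = cumulative_effort(t)
    return a / (1.0 - gamma) * (1.0 - np.exp(-(1.0 - gamma) * b * W))

def mean_faults_corrected(t, delay=5.0, **kwargs):
    """FCP modeled as the FDP delayed by a fixed correction lag (assumption)."""
    return mean_faults_detected(np.maximum(t - delay, 0.0), **kwargs)

if __name__ == "__main__":
    for t in (10, 20, 40, 80):
        print(f"t={t:3d}  detected={mean_faults_detected(t):7.2f}  "
              f"corrected={mean_faults_corrected(t):7.2f}")
```

A release-policy analysis of the kind mentioned in the abstract would then search for the time t that balances a chosen criterion (for example, remaining faults or total cost) against the accumulated testing effort W(t).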

Author(s):  
R. B. Lenin ◽  
S. Ramaswamy ◽  
Liguo Yu ◽  
R. B. Govindan

Complex software systems and the huge amounts of data they produce are becoming an integral part of our organizations. We are also becoming increasingly dependent on high-quality software products in our everyday lives. These systems ‘evolve’ as we identify and correct existing defects, provide new functionalities, or improve their non-functional qualities, such as security, maintainability, and performance. At the same time, more software development projects are distributed over multiple locations (often globally) and involve development costs of several million dollars. Consequently, as the Internet continually eliminates geographic boundaries, the concept of doing business within a single country has given way to companies competing in an international marketplace. The digitalization of work and the reorganization of work processes across many organizations have resulted in routine and/or commodity components being outsourced.


1994 ◽  
Vol 23 (4) ◽  
pp. 261-267 ◽  
Author(s):  
Iain J. Mckendrick ◽  
George Gettinby ◽  
Yiqun Gu ◽  
Andrew Peregrine ◽  
Crawford Revie

Large scale population growth in sub-Saharan Africa makes it imperative to achieve an equivalent increase in food production in this area. It is also important that any increase be sustainable in the long-term, not causing lasting damage to local ecosystems. Recent advances in information technology make the successful diffusion of relevant expertise to farmers a more practical option than ever before. How this might be achieved is described in this paper, which considers the transfer of expertise in the diagnosis, treatment and management of trypanosomiasis in cattle. Using current technology, the combination of different software systems in one integrated hybrid system could allow the delivery of high quality, well focused information to the potential user.


Author(s):  
Xiaobing Dang ◽  
Kai He ◽  
Feifei Zhang ◽  
Qiyang Zuo ◽  
Ruxu Du

Bending curved metal plates is an important process in many heavy industries such as shipbuilding, where it is a basic step in manufacturing hull surfaces. The conventional approach is the so-called line heating method, which is not only labor intensive but also inefficient and error-prone. We have presented a new method of incremental bending. In this paper, the incremental bending system is described in terms of its hardware and software, and a curved metal plate is formed both experimentally and by finite element simulation. The manufactured workpiece is of high quality with a smooth surface, showing that the presented method can be successfully used to form curved metal plates. It is a highly flexible forming method with potential for wide application in industry.


Author(s):  
Saqib Saeed ◽  
Farrukh Masood Khawaja ◽  
Zaigham Mahmood

Pervasive systems and increased reliance on embedded systems require that the underlying software is properly tested and has high quality built in. The approaches often adopted to realize software systems have inherent weaknesses that have resulted in less robust software applications. The requirement for reliable software suggests that quality needs to be instilled at all stages of the software development paradigm, especially at the testing stages of the development cycle, ensuring that quality attributes and parameters are taken into account when designing and developing software. In this respect, numerous tools, techniques, and methodologies have been proposed. In this chapter, the authors present and review different methodologies employed to improve software quality during the software development lifecycle.


2019 ◽  
Vol 2019 ◽  
pp. 1-10
Author(s):  
Li Zhiyong ◽  
Zhao Hongdong ◽  
Zeng Ruili ◽  
Xia Kewen ◽  
Guo Qiang ◽  
...  

In order to select fault feature parameters simply and quickly and to improve the fault identification rate of a diesel engine from its vibration signals, this paper proposes a diesel engine fault identification method based on the Pearson correlation coefficient diagram (PCC Diagram) and orthogonal vibration signals. First, orthogonal vibration acceleration signals are acquired synchronously from the top and side of the cylinder head, and time-domain feature parameters are extracted from these signals to compute the Pearson correlation coefficient (PCC). Then, the correlation coefficient diagram used for feature parameter screening is constructed by selecting the feature parameters whose correlation coefficient exceeds 0.9. Finally, a generalized regression neural network (GRNN) is adopted to classify and identify fuel supply faults in the diesel engine. The results show that the PCC Diagram simplifies the selection of feature parameters from the orthogonal vibration signals quickly and effectively, and that adding the newly proposed cross-correlation coefficient of the orthogonal vibration signals to the GRNN input feature vector significantly improves the fault identification rate.
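The screening step can be sketched as follows; this is an illustrative reading of the abstract, not the authors' code. It assumes time-domain features (RMS, peak, kurtosis, crest factor are assumed examples) extracted per segment from each cylinder-head channel, keeps the features whose Pearson correlation between the top and side channels exceeds the stated 0.9 threshold, and treats the "cross-correlation coefficient" of the two orthogonal signals as their zero-lag correlation, which is an assumption.

```python
import numpy as np
from scipy.stats import pearsonr, kurtosis

def time_domain_features(x):
    """Illustrative time-domain feature parameters of one vibration segment."""
    rms = float(np.sqrt(np.mean(x ** 2)))
    peak = float(np.max(np.abs(x)))
    return {"rms": rms, "peak": peak, "kurtosis": float(kurtosis(x)), "crest": peak / rms}

def screen_features(top_segments, side_segments, threshold=0.9):
    """Keep features whose values on the two orthogonal channels have |PCC| > threshold."""
    names = list(time_domain_features(top_segments[0]))
    top_feats = {n: [] for n in names}
    side_feats = {n: [] for n in names}
    for top, side in zip(top_segments, side_segments):
        for n, v in time_domain_features(top).items():
            top_feats[n].append(v)
        for n, v in time_domain_features(side).items():
            side_feats[n].append(v)
    selected = []
    for n in names:
        r, _ = pearsonr(top_feats[n], side_feats[n])
        if abs(r) > threshold:
            selected.append(n)
    return selected

def cross_correlation_coefficient(top, side):
    """Zero-lag correlation of the two orthogonal signals (assumed extra GRNN input)."""
    r, _ = pearsonr(top, side)
    return r

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    top = [rng.standard_normal(1024) for _ in range(20)]
    side = [0.8 * t + 0.2 * rng.standard_normal(1024) for t in top]
    print("selected features:", screen_features(top, side))
```

The selected features, plus the cross-correlation coefficient, would then form the input vector of the GRNN classifier described in the abstract.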


Computing ◽  
2021 ◽  
Author(s):  
Evangelos Ntentos ◽  
Uwe Zdun ◽  
Konstantinos Plakidas ◽  
Patric Genfer ◽  
Sebastian Geiger ◽  
...  

Abstract: One of the chief problems in software architecture is avoiding architecture model drift and erosion in all kinds of complex software systems. Microservice-based systems introduce new challenges in this context, as they often use a large variety of technologies in their latest iterations and are changed and released very frequently. Existing solutions that can be used to reconstruct architecture models fall short in addressing these new challenges: they cannot easily cope with continuous evolution, their accuracy is too low, and highly polyglot settings are not supported well. In this work, we report on a research study aiming to design a highly accurate architecture model abstraction approach for comprehending component architecture models of highly polyglot systems that can cope with continuous evolution. After analyzing the results of related studies, we found two possible architecture model abstraction approaches that meet the requirements of our study: an opportunistic approach and a reusable semi-automatic detector-based approach. We have conducted an empirical case study to validate and compare the two approaches. We conclude that both detector approaches are feasible. In our case study, the reusable approach breaks even in terms of the time and effort needed to establish reuse if even modest reuse of detectors is possible, and it produces slightly higher-quality and more evolution-stable solutions than the opportunistic approach.


Development of complex, high-quality software necessitates the use of a development model so that the development process is efficient, reliable, and fast. The software development life cycle (SDLC) is a well-defined and well-organized process used to plan, develop, deploy, and maintain high-quality software systems. DevOps is a recent addition to the SDLC that ensures the development and operations teams collaborate to accelerate the deployment and delivery of higher-quality software products. This paper sheds light on how development processes are accelerated using DevOps practices such as continuous integration and deployment (CI/CD) pipelines; however, several factors prevent organizations from adopting these approaches. Tracing the evolution of DevOps and its continuous practices gives a thorough understanding of the importance of the DevOps culture. Manual deployment and testing increase the feedback time of a commit operation. The paper discusses various tools available in the DevOps community that can be used to automate the stages of a continuous integration and deployment pipeline, so that the feedback time is reduced.


2007 ◽  
Vol 353-358 ◽  
pp. 2716-2719
Author(s):  
Hong Sheng Su ◽  
You Peng Zhang

To improve accuracy and overcome the flaws of a single neural network, an integrated neural classifier for steam turbine vibration fault identification based on particle swarm optimization (PSO) is proposed in this paper. The method first establishes a diagnosis decision table for steam turbines, from fault sources to fault symptoms, by applying wavelet packet decomposition to the fault waveforms. A discrete decision table is then obtained by quantifying the attribute values in the decision table using information entropy, and a simplified decision table is generated by rough set reduction. Based on this table, several neural networks are applied to identify steam turbine faults simultaneously, and their results are integrated using a PSO-based scheme. Both simulation and trials in steam turbine damage identification indicate that the proposed method has a higher identification rate, shorter training time, and excellent generalization ability, making it an ideal pattern classifier.
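One plausible reading of the PSO-based integration step is a weighted soft vote over the individual networks' class scores, with the weights tuned by PSO to maximize validation accuracy. The sketch below is an illustrative implementation under that assumption, not the authors' algorithm; the swarm parameters and the weighted-vote rule are arbitrary choices.

```python
import numpy as np

def ensemble_accuracy(weights, member_probs, labels):
    """Accuracy of a weighted soft vote over the members' class probabilities."""
    w = np.clip(weights, 0.0, None)
    w = w / (w.sum() + 1e-12)
    combined = np.tensordot(w, member_probs, axes=1)  # shape: (samples, classes)
    return np.mean(np.argmax(combined, axis=1) == labels)

def pso_ensemble_weights(member_probs, labels, n_particles=20, n_iters=50, seed=0):
    """Minimal PSO searching for member weights that maximize validation accuracy."""
    rng = np.random.default_rng(seed)
    n_members = member_probs.shape[0]
    pos = rng.random((n_particles, n_members))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_fit = np.array([ensemble_accuracy(p, member_probs, labels) for p in pos])
    gbest = pbest[np.argmax(pbest_fit)].copy()
    for _ in range(n_iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, 1.0)
        fit = np.array([ensemble_accuracy(p, member_probs, labels) for p in pos])
        improved = fit > pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
        gbest = pbest[np.argmax(pbest_fit)].copy()
    return gbest / (gbest.sum() + 1e-12)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Three toy "networks" predicting 4 fault classes for 100 validation samples.
    labels = rng.integers(0, 4, size=100)
    member_probs = rng.random((3, 100, 4))
    w = pso_ensemble_weights(member_probs, labels)
    print("weights:", w, "accuracy:", ensemble_accuracy(w, member_probs, labels))
```

In the paper's setting, `member_probs` would hold the outputs of the individual neural networks on the reduced decision-table attributes rather than random data.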


Author(s):  
DONGFENG WANG ◽  
FAROKH B. BASTANI ◽  
I-LING YEN

The development of a software system basically comprises specification, design, and implementation. Various specification mechanisms and design methods have been proposed to facilitate the implementation of software systems. However, high system quality cannot be easily assured because of limitations of current design methods as well as semantic gaps between the specification mechanisms and the design methods. In particular, manual effort is needed to transform the specification of a system into a design framework for the system. Considering these problems, in this paper we propose a new system design method. The method is systematic in that the design framework of a system can be automatically created from the specification of the system, and the resulting design framework supports high quality assurance for the system. This design framework is a composition of several individual components, each of which can be developed and hardened completely independently. Further, system properties (reliability, safety, liveness, etc.) can be mathematically inferred from the properties of the individual components. These components are referred to as IDEAL (Independently Developable End-user Assessable Logical) components, and the design method is mainly targeted at continuous process-control systems. The paper develops the approach and illustrates it for a vehicle control system.

