Optimal Release Time Estimation of Software System using Box-Cox Transformation and Neural Network

Author(s):  
Momotaz Begum ◽  
Tadashi Dohi

The determination of the release time for a new software product is one of the most critical issues in designing and controlling software development processes. This paper presents a novel technique for predicting the optimal software release time using a neural network. In our approach, a three-layer perceptron neural network with multiple outputs is used, where the underlying software fault count data are transformed into approximately Gaussian data by means of the well-known Box-Cox power transformation. The optimal software release time, which minimizes the expected software cost, is then predicted with the neural network. Numerical examples with four actual software fault count data sets are presented, in which we compare our approach with conventional Non-Homogeneous Poisson Process (NHPP)-based Software Reliability Growth Models (SRGMs).
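
As a rough illustration of the preprocessing step described above, the sketch below applies the Box-Cox power transformation to a cumulative fault count series before it would be fed to a neural network. The fault counts and the inverse-transform helper are illustrative assumptions, not data or code from the paper.

```python
# Minimal sketch: make cumulative fault counts approximately Gaussian
# with the Box-Cox power transformation (counts are placeholders).
import numpy as np
from scipy.stats import boxcox

cumulative_faults = np.array([3, 7, 12, 20, 27, 33, 38, 41, 43, 44], dtype=float)

# scipy estimates the power parameter lambda by maximum likelihood;
# Box-Cox requires strictly positive inputs.
transformed, lam = boxcox(cumulative_faults)
print(f"estimated lambda = {lam:.3f}")

# Inverse transform to map neural-network predictions back to the count scale.
def inv_boxcox(y, lam):
    return np.exp(y) if lam == 0 else (lam * y + 1.0) ** (1.0 / lam)

assert np.allclose(inv_boxcox(transformed, lam), cumulative_faults)
```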

Author(s):  
Yuka Minamino ◽  
Shinji Inoue ◽  
Shigeru Yamada

Existing optimal software release problems have been discussed using a single evaluation criterion such as cost, reliability, or delivery time. When such methods are used, the optimal release time is determined by one criterion alone. In practice, however, it is more realistic to determine the optimal release time from multiple attributes. Therefore, in this study, we estimate the optimal release time by using multi-attribute utility theory (MAUT). Because MAUT is a utility theory, we can estimate the optimal release time and the change-point from the perspective of utility by maximizing a multi-attribute utility function. In particular, we consider two attributes: cost and reliability. We then apply a software reliability growth model (SRGM) with change-point to represent the cost and reliability attributes; concretely, we use an exponential SRGM with change-point, so that we can estimate not only the optimal release time but also the change-point. Finally, we show numerical examples using actual data sets and examine the behavior of the optimal release time, the change-point, the total software cost, and the utility.
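
To make the MAUT idea concrete, here is a minimal sketch that maximizes an additive two-attribute utility over candidate release times. The additive utility form, the exponential SRGM m(t) = a(1 − e^(−bt)), and all parameter values and weights are assumptions for illustration; the paper's calibrated change-point model is not reproduced.

```python
# Illustrative sketch: pick a release time by maximizing an additive
# multi-attribute utility over cost and reliability.
import numpy as np

a, b = 100.0, 0.05            # expected total faults, detection rate (assumed)
c1, c2, c3 = 1.0, 5.0, 0.2    # unit costs: test-phase fix, field fix, testing time

def m(t):                      # NHPP mean value function (exponential SRGM)
    return a * (1.0 - np.exp(-b * t))

T = np.linspace(1.0, 200.0, 2000)
cost = c1 * m(T) + c2 * (a - m(T)) + c3 * T
rel = m(T) / a                 # fraction of faults removed by release

# Single-attribute utilities scaled to [0, 1]: lower cost and higher
# reliability are both preferred.
u_cost = (cost.max() - cost) / (cost.max() - cost.min())
u_rel = rel
w_cost, w_rel = 0.4, 0.6       # manager's weights (assumed)

U = w_cost * u_cost + w_rel * u_rel
print(f"utility-optimal release time = {T[np.argmax(U)]:.1f}")
```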


Author(s):  
Tadashi Dohi ◽  
Naoto Kaio ◽  
Shunji Osaki

This paper presents a new stochastic model for determining the optimal release time of computer software in the testing phase, taking account of the debugging time lag. Most earlier software release models assumed that a detected error can be removed instantaneously; in other words, none of them quantitatively discussed the effect of the software maintenance action on the optimal software release time. The main purpose of this work is to relate the optimal software release policy to the arrival-service process experienced by users in the operational phase. We use Non-Homogeneous Poisson Process (NHPP) type software reliability growth models to describe the software error detection phenomenon and derive the optimal software release policies that minimize the expected total software cost. As a result, the usage circumstances of the software in the operational phase have a monotone effect on software release planning.
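
For context, the sketch below works out the classical expected-cost release-time optimum under an exponential (Goel-Okumoto) NHPP, which this line of work extends with the debugging time lag. The closed-form optimum and all parameter values are illustrative assumptions; the lag itself is omitted here.

```python
# Sketch of the classical expected-cost release-time calculation.
# With m(t) = a(1 - exp(-b t)) and C(T) = c1*m(T) + c2*(a - m(T)) + c3*T,
# setting dC/dT = 0 gives a closed-form interior optimum when c2 > c1.
import numpy as np

a, b = 120.0, 0.08             # assumed NHPP parameters
c1, c2, c3 = 1.0, 6.0, 0.5     # fix cost in test, fix cost in field, cost per unit test time

# Interior optimum solves m'(T) = a*b*exp(-b*T) = c3 / (c2 - c1).
T_star = np.log(a * b * (c2 - c1) / c3) / b
print(f"optimal release time T* = {T_star:.1f}")

def cost(T):
    m = a * (1.0 - np.exp(-b * T))
    return c1 * m + c2 * (a - m) + c3 * T

# Numerical sanity check that T_star is a local minimum.
assert cost(T_star) <= min(cost(T_star - 1.0), cost(T_star + 1.0))
```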


Author(s):  
PARMOD KUMAR KAPUR ◽  
V. S. SARMA YADAVALLI ◽  
SUNIL KUMAR KHATRI ◽  
MASHAALLAH BASIRZADEH

Modeling of software reliability has gained considerable importance in recent years. The use of software in critical applications has led to a tremendous increase in the amount of work on software reliability growth modeling. A number of analytic software reliability growth models (SRGMs) exist in the literature. They are based on certain assumptions, and none of them works well across different environments. The current software reliability literature is inconclusive as to which models and techniques are best, and some researchers believe that each organization needs to try several approaches to determine what works best for it. Data-driven artificial neural network (ANN) based models, on the other hand, can provide better software reliability estimation. In this paper we present a new approach that builds an ensemble of different ANNs to improve the accuracy of estimation for complex software architectures. The model has been validated on two data sets cited from the literature. Results show a fair improvement in forecasting software reliability over individual neural network based models.
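
The following sketch conveys the ensemble idea in miniature: several differently seeded MLPs are fit to the same fault history and their predictions combined by simple averaging. The data, network configurations, and averaging rule are assumptions for illustration; the paper's ensemble construction may differ.

```python
# Minimal sketch of an ensemble of neural networks for cumulative-fault
# prediction: train several MLPs, then average their forecasts.
import numpy as np
from sklearn.neural_network import MLPRegressor

# Illustrative grouped fault data: (testing week, cumulative faults).
t = np.arange(1, 16, dtype=float).reshape(-1, 1)
y = np.array([4, 9, 15, 22, 27, 33, 38, 42, 45, 48, 50, 52, 53, 54, 55], dtype=float)

members = [
    MLPRegressor(hidden_layer_sizes=(8,), activation="tanh",
                 solver="lbfgs", max_iter=5000, random_state=s)
    for s in range(5)
]
for net in members:
    net.fit(t, y)

t_next = np.array([[16.0]])
preds = np.array([net.predict(t_next)[0] for net in members])
print(f"member predictions: {np.round(preds, 1)}, ensemble mean: {preds.mean():.1f}")
```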


Author(s):  
Ompal Singh ◽  
Saurabh Panwar ◽  
P. K. Kapur

In the software engineering literature, numerous software reliability growth models have been designed to evaluate and predict the reliability of software products and to determine the optimal time-to-market of software systems. Most existing studies on software release time assessment assume that testing terminates when the software is released. In practice, however, the testing team often releases the software product first and continues the testing process for an added period in the operational phase. Therefore, in this study, a coherent reliability growth model is developed to predict the expected reliability of the software product. The debugging process is considered imperfect, since new faults can be introduced into the software during each fault removal. The proposed model assumes that the fault observation rate of the testing team changes after the software release; the release time of the software is therefore regarded as the change-point. It has been established that the predictive accuracy of growth models improves when change-point theory is incorporated. A unified approach is used to model the debugging process, wherein both testers and users simultaneously identify faults in the post-release testing phase. A joint optimization problem is formulated based on two decision criteria: cost and reliability. To incorporate the manager's preferences over these two criteria, a multi-criteria decision-making technique known as multi-attribute utility theory is employed. A numerical illustration using actual data sets from a software project is presented to determine the optimal software time-to-market and testing termination time.
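
A minimal sketch of the change-point ingredient follows: an exponential SRGM whose detection rate switches from a testing-phase value to an operational-phase value at the release time. The functional form and all parameters are illustrative assumptions rather than the paper's fitted model.

```python
# Sketch of an exponential SRGM whose fault-detection rate changes at the
# release time (the change-point).
import numpy as np

a = 150.0                      # expected total fault content (assumed)
b1, b2 = 0.10, 0.04            # detection rates before/after release (assumed)

def m(t, tau):
    """Mean value function with change-point tau (the release time)."""
    t = np.asarray(t, dtype=float)
    pre = a * (1.0 - np.exp(-b1 * t))
    post = a * (1.0 - np.exp(-b1 * tau - b2 * (t - tau)))
    return np.where(t <= tau, pre, post)

tau = 30.0                     # candidate release time
t_end = 80.0                   # testing-termination time in the operational phase
print(f"faults found by release:        {float(m(tau, tau)):.1f}")
print(f"faults found by termination:    {float(m(t_end, tau)):.1f}")
print(f"residual faults at termination: {a - float(m(t_end, tau)):.1f}")
```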


2018 ◽  
Vol 24 (1) ◽  
pp. 22-36 ◽  
Author(s):  
Momotaz Begum ◽  
Tadashi Dohi

Purpose – The purpose of this paper is to present a novel method to estimate the optimal software testing time that minimizes the relevant expected software cost via a refined neural network approach with grouped data, where multi-stage look-ahead prediction is carried out with a simple three-layer perceptron neural network with multiple outputs.

Design/methodology/approach – To analyze software fault count data that follow a Poisson process with an unknown mean value function, the authors transform the underlying Poisson count data to Gaussian data by means of one of three data transformation methods, and predict the cost-optimal software testing time via a neural network.

Findings – In numerical examples with two actual software fault count data sets, the authors compare the neural network approach with common non-homogeneous Poisson process-based software reliability growth models. It is shown that the proposed method can provide more accurate and more flexible decision making than the common stochastic modeling approach.

Originality/value – It is shown that the neural network approach can be used to predict the optimal software testing time more accurately.
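
The sketch below illustrates the multi-stage look-ahead setup in miniature: one multi-output perceptron maps a sliding window of past grouped fault counts to the next several counts. The window length, horizon, data, and network settings are assumptions for illustration.

```python
# Illustrative sketch: a single three-layer perceptron with multiple
# output units predicts the next h grouped fault counts from a sliding
# window of w past counts.
import numpy as np
from sklearn.neural_network import MLPRegressor

counts = np.array([4, 9, 15, 22, 27, 33, 38, 42, 45, 48, 50, 52, 53, 54], float)
w, h = 4, 3                    # input window length, look-ahead horizon

X = np.array([counts[i:i + w] for i in range(len(counts) - w - h + 1)])
Y = np.array([counts[i + w:i + w + h] for i in range(len(counts) - w - h + 1)])

net = MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                   max_iter=5000, random_state=0)
net.fit(X, Y)                  # multi-output MLP: h units in the output layer

# Predict the next h cumulative counts from the most recent window.
print(np.round(net.predict(counts[-w:].reshape(1, -1)), 1))
```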


Author(s):  
Vishal Pradhan ◽  
Ajay Kumar ◽  
Joydip Dhar

The fault reduction factor (FRF) is a significant parameter for controlling software reliability growth. It is the ratio of net fault correction to the number of failures encountered. In the literature, many factors affect the behaviour of the FRF, namely fault dependency, debugging time lag, human learning behaviour, and imperfect debugging. In addition, several distributions, for example inflection S-shaped, Weibull, and Exponentiated-Weibull, have been used as the FRF. However, these standard distributions are not flexible enough to describe the observed behaviour of FRFs. This paper proposes three different software reliability growth models (SRGMs) that incorporate a three-parameter generalized inflection S-shaped (GISS) distribution as the FRF. To model realistic SRGMs, time lags between the fault detection and fault correction processes are also incorporated. Two of the proposed models are for single-release software, whereas the third is designed for multi-release software. Moreover, the first model assumes perfect debugging, while the other two assume an imperfect debugging environment. Extensive experiments are conducted for the proposed models with six single-release and one multi-release data sets. The choice of the GISS distribution as the FRF improves software reliability evaluation in comparison with existing models in the literature. Finally, the development cost and optimal release time are calculated in a perfect debugging environment.
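
As a sketch of what a flexible FRF can look like, the snippet below evaluates a three-parameter S-shaped curve bounded in (0, 1). The extra exponent is one plausible way to generalize the standard inflection S-shaped CDF to three parameters; the paper's exact GISS parameterization may differ.

```python
# Sketch of a time-varying fault reduction factor (FRF). The inflection
# S-shaped CDF is standard; the exponent theta is an assumed third
# parameter added for flexibility, not necessarily the paper's form.
import numpy as np

def giss_frf(t, b=0.1, beta=2.0, theta=1.5):
    """Assumed generalized inflection S-shaped FRF, bounded in (0, 1)."""
    base = (1.0 - np.exp(-b * t)) / (1.0 + beta * np.exp(-b * t))
    return base ** theta

t = np.linspace(0.0, 60.0, 7)
print(np.round(giss_frf(t), 3))   # FRF grows toward 1 as testing matures
```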


Author(s):  
Qingtian Zeng ◽  
Qiang Sun ◽  
Geng Chen ◽  
Hua Duan

Wireless cellular traffic prediction is a critical issue for researchers and practitioners in the 5G/B5G field. However, it is very challenging, since wireless cellular traffic usually exhibits high nonlinearity and complex patterns. Most existing wireless cellular traffic prediction methods lack the ability to model the dynamic spatial-temporal correlations of wireless cellular traffic data and thus cannot yield satisfactory prediction results. To improve the accuracy of 5G/B5G cellular network traffic prediction, an attention-based multi-component spatiotemporal cross-domain neural network model (att-MCSTCNet) is proposed. It uses Conv-LSTM or Conv-GRU to model neighbor data, daily cycle data, and weekly cycle data, and then assigns different weights to the three kinds of feature data through an attention layer, improving their feature extraction ability and suppressing feature information that interferes with the prediction. Finally, the model incorporates timestamp feature embedding and fuses multiple cross-domain data sources to assist in traffic prediction. Experimental results show that the prediction performance of the proposed model is better than that of existing models. The RMSE performance of the att-MCSTCNet (Conv-LSTM) model on the Sms, Call, and Internet datasets improves on other existing models by 13.70~54.96%, 10.50~28.15%, and 35.85~100.23%, respectively, and the RMSE performance of the att-MCSTCNet (Conv-GRU) model improves on them by about 14.56~55.82%, 12.24~29.89%, and 38.79~103.17%, respectively.
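
The fusion step of the attention layer can be sketched independently of the full architecture: three component feature maps receive softmax attention weights and are combined into one representation. The scoring function, shapes, and weights below are simplified assumptions; the Conv-LSTM/Conv-GRU encoders and cross-domain inputs of att-MCSTCNet are not reproduced.

```python
# Simplified numpy sketch of attention-weighted fusion of three component
# outputs (neighbor, daily-cycle, weekly-cycle features).
import numpy as np

rng = np.random.default_rng(0)
H, W, C = 20, 20, 8                       # spatial grid and channel count (assumed)
components = [rng.standard_normal((H, W, C)) for _ in range(3)]

# A projection vector scores each component; softmax turns the scores
# into attention weights that sum to one.
v = rng.standard_normal(C)
scores = np.array([np.mean(feat @ v) for feat in components])
weights = np.exp(scores - scores.max())
weights /= weights.sum()

fused = sum(w * feat for w, feat in zip(weights, components))
print("attention weights:", np.round(weights, 3), "fused shape:", fused.shape)
```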

