Have Variability Tools Fulfilled the Needs of the Software Industry?

2020 ◽  
Vol 26 (10) ◽  
pp. 1282-1311
Author(s):  
Ana Allian ◽  
Edson OliveiraJr ◽  
Rafael Capilla ◽  
Elisa Nakagawa

For nearly 30 years, industry and researchers have proposed many software variability tools to cope with the complexity of modeling variability in software development, followed by a number of publications on variability techniques built upon theoretical foundations. Yet after more than 25 years of practice in software variability, few studies have investigated the impact of software variability tools in industry and the perception of practitioners. For this reason, we investigate in this research work how existing software variability tools fulfill the needs of companies demanding this kind of tool support. We conducted a survey with practitioners from companies in eight different countries to analyze the missing capabilities of software variability management tools, and we compared the survey results with the scientific literature through a systematic mapping study (SMS) to analyze whether the proposed solutions cover the needs of practitioners. Our major findings indicate that many tools lack important qualities such as interoperability, collaborative work, code generation, scalability, impact analysis, and testing, while the results from the SMS showed that such capabilities are, to some extent, found in some of the existing tools.

Author(s):  
Yasmina Maizi ◽  
Ygal Bendavid

With the fast development of IoT technologies and the potential of real-time data gathering, which gives decision makers real-time visibility into their processes, the rise of Digital Twins (DT) has attracted considerable research interest. DT are among the highest technological trends for the near future; their evolution is expected to transform several industries and applications and opens the door to a huge number of possibilities. However, the application of the DT concept remains in its infancy and is mainly restricted to the manufacturing sector. In fact, its true potential will be revealed in many other sectors. In this research paper, we propose a DT prototype for in-store daily operations management and test its impact on daily operations management performance. More specifically, we focus our impact analysis of DT on the fitting room area.


Author(s):  
D. JEYAMALA ◽  
K. SABARI NATHAN ◽  
A. JALILA ◽  
S. BALAMURUGAN

High-quality software can be obtained by resolving the complexity of the software. According to the Pareto principle, 20% of components lead to 80% of the problems [1], so those 20% of components need to be identified during testing. Therefore, this research work proposes an automated software testing framework to identify critical components using mutant-based dynamic impact analysis for the Software under Test (SUT). Mutants are automatically generated by injecting faults into the components using Offutt mutation operators, and they are used to identify each component's impact level on the other components of the system. The generated mutants and the original program are executed against the test suite; by comparing both sets of results, the mutation score is computed and used as the test case adequacy criterion to determine a component's impact level on the other components of the system. The outcome of this approach is a testing tool, JImpact Analyzer, that automates the entire task and generates various graphs for visualization purposes.
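The mutation score the abstract uses as its adequacy criterion can be sketched in a few lines. This is a minimal illustration of the general metric (killed mutants over total mutants), not code from JImpact Analyzer; the function name and the example numbers are invented for illustration.

```python
def mutation_score(mutant_results):
    """Compute the mutation score from a list of booleans, one per generated
    mutant: True if the test suite 'killed' the mutant (at least one test
    failed on the mutated program), False if the mutant survived."""
    if not mutant_results:
        return 0.0
    killed = sum(mutant_results)
    return killed / len(mutant_results)

# Example: the suite detects 8 of 10 generated mutants.
print(mutation_score([True] * 8 + [False] * 2))  # 0.8
```

A higher score means the test suite is more sensitive to the injected faults, which is what lets the approach rank a component's impact on the rest of the system.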


2016 ◽  
Vol 13 (2) ◽  
pp. 74-101
Author(s):  
Gustavo Ansaldi Oliva ◽  
Marco Aurélio Gerosa ◽  
Fabio Kon ◽  
Virginia Smith ◽  
Dejan Milojicic

In ever-changing business environments, organizations continuously refine their processes to benefit from and meet the constraints of new technology, new business rules, and new market requirements. Workflow management systems (WFMSs) support organizations in evolving their processes by providing them with technological mechanisms to design, enact, and monitor workflows. However, workflow repositories often grow and start to encompass a variety of interdependent workflows. Without appropriate tool support, keeping track of such interdependencies and staying aware of the impact of a change in a workflow schema becomes hard. Workflow designers are often blindsided by changes that end up inducing side- and ripple-effects. This poses threats to the reliability of the workflows and ultimately hampers the evolvability of the workflow repository as a whole. In this paper, the authors introduce a change impact analysis approach based on metrics and visualizations to support the evolution of workflow repositories. They implemented the approach and later integrated it as a module in the HP Operations Orchestration (HP OO) WFMS. The authors conducted an exploratory study in which they thoroughly analyzed the workflow repositories of 8 HP OO customers. They characterized the customer repositories from a change impact perspective and compared them against each other. The authors were able to spot the workflows with high change impact among thousands of workflows in each repository. They also found that while the out-of-the-box repository included in HP OO had 10 workflows with high change impact, customer repositories included 11 (+10%) to 35 (+250%) workflows with this same characteristic. This result indicates the extent to which customers should put additional effort into evolving their repositories. The authors' approach contributes to the body of knowledge on static workflow evolution and complements existing dynamic workflow evolution approaches. Their techniques also aim to help organizations build more flexible and reliable workflow repositories.
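The core of the metric-based analysis described above can be sketched as a reverse reachability computation: given "workflow A invokes workflow B" edges, the impact set of a change to B is every workflow that transitively depends on B. The graph below is a made-up toy repository, not data from HP OO, and the function is an illustrative sketch rather than the authors' implementation.

```python
from collections import defaultdict, deque

def impact_set(depends_on, changed):
    """depends_on maps each workflow to the workflows it invokes.
    Returns the set of workflows affected by a change to `changed`,
    i.e. the reverse transitive closure of the dependency graph."""
    # Invert the edges: callee -> set of direct callers.
    dependents = defaultdict(set)
    for wf, callees in depends_on.items():
        for callee in callees:
            dependents[callee].add(wf)
    # Breadth-first traversal over the inverted edges.
    seen, queue = set(), deque([changed])
    while queue:
        node = queue.popleft()
        for dep in dependents[node]:
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return seen

# Toy repository: 'release' invokes 'deploy', which invokes 'build' and 'notify'.
repo = {"deploy": {"build", "notify"}, "release": {"deploy"},
        "build": set(), "notify": set()}
print(sorted(impact_set(repo, "build")))  # ['deploy', 'release']
```

Ranking workflows by the size of their impact set is one simple way to surface the "high change impact" workflows the study counts per repository.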


2020 ◽  
Vol 10 (21) ◽  
pp. 7749
Author(s):  
Arshad Ahmad ◽  
José Luis Barros Justo ◽  
Chong Feng ◽  
Arif Ali Khan

Context: The use of controlled vocabularies (CVs) aims to increase the quality of software requirements specifications by producing well-written documentation that reduces both ambiguity and complexity. Many studies suggest that defects introduced in the requirements engineering (RE) phase have a negative impact significantly higher than defects introduced in later stages of the software development lifecycle. However, our knowledge about the impact of using CVs in specific RE activities is very scarce. Objective: To identify and classify the types of CVs, and the impact they have on the requirements engineering phase of software development. Method: A systematic mapping study, collecting empirical evidence published up to July 2019. Results: This work identified 2348 papers published on CVs and RE, of which only 90 primary studies were selected as relevant. Data extraction revealed that 79 studies reported the use of ontologies, whereas the remaining 11 focused on taxonomies. The RE activities with the greatest empirical support were specification (29 studies) and elicitation (28 studies). Seventeen different impacts of CVs on RE activities were classified and ranked, the two most cited being guidance and understanding (38%) and automation and tool support (22%). Conclusions: The trend in the number of papers published over the last 10 years shows that interest in the use of CVs remains high. The research community has broad representation, distributed across five continents. Most of the research focuses on the application of ontologies and taxonomies, whereas the use of thesauri and folksonomies is less reported. The evidence demonstrates the usefulness of CVs in all RE activities, especially during elicitation and specification, helping developers understand, facilitating the automation process, and identifying defects, conflicts, and ambiguities in the requirements. Collaboration in research between academic and industrial contexts is low and should be promoted.


2015 ◽  
Vol 15 (10) ◽  
pp. 2283-2297 ◽  
Author(s):  
M. Bostenaru Dan ◽  
I. Armas

Abstract. This study aims to create an alternative to the classical GIS representation of the impact of earthquake hazards on urban areas. To accomplish this, the traditional map was revised so that it can cope with contemporary innovative ways of planning, namely strategic planning. As in the theory of fractals, the building dimension and the urban neighbourhood dimension are addressed as different geographic scales between which lessons for decisions can be learned through regression. The interaction between the two scales is useful when looking for alternatives, for the completion of a GIS analysis, and in choosing the landmarks, which, in the case of hazards, become strategic elements in strategic planning. A methodology to innovate mapping as a digital means for analysing and visualising the impact of hazards is proposed. This method relies on concepts from various geography, urban planning, structural engineering and architecture approaches related to disaster management. The method has been tested at the building scale for the N–S Boulevard in Bucharest, Romania, called Magheru. At the urban scale, an incident database has been created, in which the case study for the building level can be mapped. The paper presented is part of a larger research work, which addresses decision making using the framework shown here. The main value of the paper is in proposing a conceptual framework to deconstruct the map for digital earthquake disaster impact analysis and representation. The originality of the concept consists in the representation of elements at different scales considered to be of different levels of importance in the urban tissue, according to the analysis to be performed on them.


2015 ◽  
Vol 3 (5) ◽  
pp. 3287-3321 ◽  
Author(s):  
M. Bostenaru Dan ◽  
I. Armas

Abstract. We aim to create an alternative to GIS representation of the impact of hazards on urban areas. To accomplish this, we revise the traditional map so that it can cope with today's innovative ways of planning, namely strategic planning. As in the theory of fractals, we address the building dimension and the urban neighbourhood dimension as different geographic scales between which lessons for decisions can be learned through regression. The interaction between the two scales can be seen when looking for alternatives, in the completion of a GIS analysis, or in choosing the landmarks, which, in the case of hazards, become strategic elements in strategic planning. A methodology to innovate mapping as a digital means for analysing and visualising the impact of hazards has been developed. This new method relies on concepts from various geography, urban planning, structural engineering and architecture approaches related to disaster management. The method has been tested at the building scale for the central N–S boulevard in Bucharest, Romania, comprising the protected urban zone 04 "Magheru". At the urban scale, an incident database has been created, in which the case study for the building level can be mapped. The paper presented is part of a larger research work, which addresses decision making using the framework shown here. The main value of the paper is in proposing a conceptual framework to deconstruct the map for digital disaster impact analysis and representation. This concept is highly original, because it considers the representation of elements at different scales to be of different importance in the urban tissue, according to the analysis to be performed on them.


Author(s):  
Chetna Gupta ◽  
Varun Gupta

Change is an integral part of any software system. Predicting the impact of changes through change impact analysis techniques helps engineers identify and analyze those parts of the system that will potentially be affected by requested change(s). This chapter presents a semi-automated approach to (a) compute the likelihood of impacted functions in a system through identification and analysis of the functional dependencies between them and (b) assist software engineers in selective regression testing. The technique first classifies the impact set data into two categories based on their type of impact propagation. Next, it prioritizes the classified data to rank functions into higher and lower levels according to the degree of impact they will have. This prediction helps lower maintenance cost and the effort of software engineers: an engineer can first run those test cases that cover higher-priority impacted segments, minimizing regression test selection.
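The selective regression testing step described above can be illustrated with a short sketch: rank impacted functions by an impact score, then order test cases so that those covering the highest-ranked functions run first. The scores, test names, and coverage mapping below are invented for illustration and are not taken from the chapter.

```python
def prioritize_tests(impact_scores, coverage):
    """impact_scores: function name -> predicted impact score (higher = more
    likely affected). coverage: test name -> set of functions it exercises.
    Returns test names ordered by the highest impact score each one covers."""
    def best_score(test):
        return max((impact_scores.get(fn, 0) for fn in coverage[test]),
                   default=0)
    return sorted(coverage, key=best_score, reverse=True)

# Hypothetical impact prediction and coverage data.
scores = {"parse": 0.9, "render": 0.4, "log": 0.1}
tests = {"t_parse": {"parse"}, "t_render": {"render"}, "t_log": {"log"}}
print(prioritize_tests(scores, tests))  # ['t_parse', 't_render', 't_log']
```

Running the head of this ordering first gives early feedback on the change's most likely breakage points, which is the cost-saving the chapter claims for prioritized selection.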


2020 ◽  
Vol 91 (3) ◽  
pp. 31301
Author(s):  
Nabil Chakhchaoui ◽  
Rida Farhan ◽  
Meriem Boutaldat ◽  
Marwane Rouway ◽  
Adil Eddiai ◽  
...  

Novel textiles have received a lot of attention from researchers in the last decade due to some of their unique features. The introduction of intelligent materials into textile structures offers an opportunity to develop multifunctional textiles capable of sensing, reacting, conducting electricity and performing energy conversion operations. In this research work, a new highly piezoelectric, electroactive β-phase nanocomposite textile has been developed using the pad-dry-cure method. A deposit of poly(vinylidene fluoride) (PVDF) − carbon nanofillers (CNF) − tetraethyl orthosilicate (TEOS), Si(OCH2CH3)4, was applied to a treated textile substrate using a coating technique followed by evaporation, transforming the passive (non-functional) textile into a dynamic textile with an enhanced piezoelectric β-phase. The aim of the study is to investigate the impact of coating the textile with PVDF-CNF-based piezoelectric nanocomposites by optimizing the piezoelectric crystalline phase. The chemical composition of the CT/PVDF-CNC-TEOS textile was determined by qualitative elemental analysis (SEM/EDX). Adding 0.5% of CNF during the process yields textiles with a piezoelectric β-phase content of up to 50%, as measured by FTIR experiments. These results indicate that CNF is highly efficient at transforming the α-phase present in unloaded PVDF into the β-phase in the nanocomposites. Consequently, the fabricated textile exhibits a pronounced piezoelectric β-phase even at a relatively low PVDF-CNF-TEOS coating content. The study demonstrates that the pad-dry-cure method can potentially be used to develop piezoelectric nanocomposite-coated wearable textiles for sensing and energy harvesting applications. We believe that our study may inspire research toward future advanced applications.


The university is considered one of the engines of growth in a local economy or its market area, since its direct contributions consist of 1) employment of faculty and staff, 2) services to students, and 3) supply-chain links to vendors, all of which define the university's market area. Indirect contributions consist of those agents associated with the university through community and civic events. Each of these activities represents an economic benefit to its host community and can be classified as part of the economic impact a university has on its local economy, whose spatial market area includes each of the above agents. In addition, there are the critical links to the university, which can be considered part of its demand and supply chain. This paper contributes to the field of public/private impact analysis, which is used to substantiate the social and economic benefits of cooperating for economic resources. We use Census data on output of goods and services, labor income on salaries, wages and benefits, indirect state and local taxes, property tax revenue, population, and inter-industry flows to measure economic impact (Implan, 2016).


Author(s):  
Kulwant Singh ◽  
Gurbhinder Singh ◽  
Harmeet Singh

The weight reduction concept is most effective at reducing greenhouse gas emissions from vehicles, and it also improves fuel efficiency. Amongst lightweight materials, magnesium alloys are attractive to the automotive sector as a structural material. The welding feasibility of magnesium alloys plays an influential role in their usage for lightweight applications. Friction stir welding (FSW) is a more appropriate technique than other welding techniques for joining magnesium alloys, and the field is still emerging. In the current research work, friction stir welding was selected to weld AZ91 magnesium alloys. The microstructure and mechanical characteristics of the produced FSW butt joints have been investigated. Further, the influence of post-welding heat treatment (at 260 °C for 1 h) on these properties has also been examined. Post-welding heat treatment (PWHT) improved the grain structure of the weld zones, which affected the mechanical performance of the joints. After heat treatment, the tensile strength and elongation of the joint increased by 12.6 % and 31.9 %, respectively. After PWHT, the microhardness of the stir zone decreased, and a comparatively smoother microhardness profile of the FSW joint was obtained. No considerable variation in the location of the tensile fracture was observed after PWHT. The results show that the impact toughness of the weld joints decreases further after post-welding heat treatment.

