Open Source Software Systems

2010 ◽  
Vol 2 (4) ◽  
pp. 28-47
Author(s):  
R. B. Lenin ◽  
S. Ramaswamy ◽  
Liguo Yu ◽  
R. B. Govindan

Complex software systems and the huge amounts of data they produce are becoming an integral part of our organizations. We are also becoming increasingly dependent on high-quality software products in our everyday lives. These systems ‘evolve’ as we identify and correct existing defects, provide new functionality, or improve nonfunctional qualities such as security, maintainability, and performance. Simultaneously, more software development projects are distributed over multiple locations (often globally) and often cost several million dollars to develop. Consequently, as the Internet continually eliminates geographic boundaries, the concept of doing business within a single country has given way to companies competing in an international marketplace. The digitalization of work and the reorganization of work processes across many organizations have resulted in routine and/or commodity components being outsourced.



Development of complex, high-quality software necessitates the use of a development model so that the development process is efficient, reliable, and fast. The software development life cycle (SDLC) is a well-defined and well-organized process used to plan, develop, deploy, and maintain high-quality software systems. DevOps is a recent addition to the SDLC that ensures the development and operations teams collaborate to accelerate the deployment and delivery of higher-quality software products. This paper sheds light on how development processes are accelerated using DevOps tactics such as continuous integration and deployment (CI/CD) pipelines; however, several factors prevent organizations from adopting these approaches. Tracing the evolution of DevOps and its continuous practices gives a thorough understanding of the importance of the DevOps culture. Manual deployment and testing increase the feedback time of a commit operation. The paper discusses various tools available in the DevOps community that can be used to automate the stages of a continuous integration and deployment pipeline, so that the feedback time is reduced.
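
As a rough illustration of the kind of automation described here, the following Python sketch runs build, test, and deploy stages in sequence and stops at the first failure, so the committer gets feedback as early as possible. The stage commands are placeholders; real pipelines would be driven by tools such as Jenkins, GitLab CI, or GitHub Actions.

```python
# Minimal sketch of an automated CI/CD pipeline runner (illustrative only;
# the stage commands below are placeholders, not any specific project's).
import subprocess
import time

STAGES = [
    ("build",  ["python", "-m", "compileall", "src"]),
    ("test",   ["python", "-m", "pytest", "-q"]),
    ("deploy", ["echo", "deploying build artifact"]),
]

def run_pipeline() -> bool:
    """Run each stage in order; stop at the first failure so feedback
    on a commit arrives as early as possible."""
    start = time.monotonic()
    for name, cmd in STAGES:
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode != 0:
            print(f"[{name}] FAILED after {time.monotonic() - start:.1f}s")
            print(result.stdout + result.stderr)
            return False
        print(f"[{name}] ok")
    print(f"pipeline passed in {time.monotonic() - start:.1f}s")
    return True

if __name__ == "__main__":
    run_pipeline()
```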


2015 ◽  
Vol 8 (1) ◽  
pp. 62-81
Author(s):  
Héctor J. Macho ◽  
Gregorio Robles ◽  
Jesus M. González-Barahona

In today's world, managers often rely on FLOSS (Free/Libre/Open Source Software) systems to run their organizations. However, the nature of FLOSS is different from the software they have been using in recent decades. Its development model is distributed, and its authors are diverse, as many volunteers and companies may collaborate on a project. In this paper, the authors want to shed some light on how to evaluate a FLOSS system by looking at the Moodle platform, which is currently the most widely used learning management system among educational institutions worldwide. In contrast with other evaluation models that have been proposed so far, the one presented here is based on retrieving historical information that can be obtained publicly from the Internet, allowing the authors to study its evolution. As a result, they show how, using their methodology, management can make informed decisions that lower the risk organizations face when investing in a FLOSS system.
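
To make the idea of publicly retrievable historical information concrete, here is a minimal Python sketch that counts commits per month from a local clone of a project's repository (for example, Moodle's). The repository path and the reliance on the git command line are assumptions of the sketch, not the authors' actual tooling.

```python
# Minimal sketch of mining publicly available history from a FLOSS
# repository (assumes a local clone and that `git` is on PATH).
import subprocess
from collections import Counter

def commits_per_month(repo_path: str) -> Counter:
    """Count commits per YYYY-MM as a simple evolution/activity indicator."""
    log = subprocess.run(
        ["git", "-C", repo_path, "log", "--date=format:%Y-%m", "--pretty=%ad"],
        capture_output=True, text=True, check=True,
    )
    return Counter(log.stdout.split())

if __name__ == "__main__":
    activity = commits_per_month("moodle")   # path to the cloned repository
    for month, n in sorted(activity.items()):
        print(month, n)
```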


Author(s):  
Irina Lobacheva ◽  
Nataliya Koceruba

With the development, spread, and improvement of information systems and technologies, mankind has invented a large number of ways to make life easier with the help of the Internet and electronics. The economy is not standing still either: a few years ago, for example, online stores became very popular, offering customers a wide range of products. Thanks to them, many different economic transactions are conducted every day via the Internet. As a result, the concept of e-commerce has developed rapidly. The term means doing business over global networks; put more simply, trade via the Internet. In addition to physical stores, many companies also open online ones. As a result, such companies have the opportunity to increase their competitiveness, reduce the costs associated with selling products, and provide more useful, higher-quality information about goods to their customers.


Author(s):  
CHRISTOPHER M. LOTT

The use of empirical data to understand and improve software products and software engineering processes is gaining ever-increasing attention. Empirical data from products and processes is needed to help an organization understand and improve its way of doing business in the software domain. Additional motivation for collecting and using data is provided by the need to conform to guidelines and standards that mandate measurement, specifically the SEI’s Capability Maturity Model and ISO 9000–3. Some software engineering environments (SEEs) offer automated support for collecting and, in a few cases, using empirical data. Measurement will clearly play a significant role in future SEEs. The paper surveys the trend towards supporting measurement in SEEs and gives details about several existing research and commercial software systems.
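
As a toy illustration of what automated measurement support in an SEE might look like, the sketch below appends simple process events (who, what, when, outcome) to a log so that empirical data accumulates as a by-product of normal work. The schema and CSV storage are assumptions of the sketch, not any specific SEE's format.

```python
# Illustrative sketch of automated measurement collection in an SEE.
import csv
import datetime
import getpass

LOG = "process_events.csv"   # hypothetical measurement store

def record_event(activity: str, artifact: str, outcome: str) -> None:
    """Append one measurement record, e.g. a test run or an inspection."""
    with open(LOG, "a", newline="") as f:
        csv.writer(f).writerow([
            datetime.datetime.now().isoformat(timespec="seconds"),
            getpass.getuser(), activity, artifact, outcome,
        ])

if __name__ == "__main__":
    record_event("unit-test", "parser.py", "pass")
    record_event("inspection", "scheduler.py", "2 defects found")
```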


Author(s):  
Faraz Idris Khan ◽  
Yasir Javed ◽  
Mamdouh Alenezi

Incorporating Open Source Software (OSS) tools in software development is increasing day by day due to their accessibility on the Internet. With the advantages of OSS come disadvantages in terms of security vulnerabilities. Therefore, in this paper, we analyze four popular open source software tools (i.e., Moodle, Joomla, Flask, and the VLC media player) that are used by software developers nowadays. For each system, security vulnerabilities and weaknesses were identified, threat models were constructed, and code inspection was performed. The findings are discussed in more detail.
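
As a toy illustration of the code-inspection step, the following Python sketch flags a few well-known risky constructs in Python source files. A real audit of systems like Moodle or Flask would combine dedicated static-analysis scanners with manual review; the list of risky calls here is only an illustrative subset.

```python
# Toy code inspection: flag a few potentially unsafe Python calls.
import ast
import sys

RISKY_CALLS = {"eval", "exec", "system", "loads"}  # illustrative subset

def inspect_file(path: str) -> None:
    with open(path, encoding="utf-8") as f:
        tree = ast.parse(f.read(), filename=path)
    for node in ast.walk(tree):
        if isinstance(node, ast.Call):
            func = node.func
            name = getattr(func, "id", getattr(func, "attr", None))
            if name in RISKY_CALLS:
                print(f"{path}:{node.lineno}: potentially unsafe call to {name}()")

if __name__ == "__main__":
    for source_file in sys.argv[1:]:
        inspect_file(source_file)
```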


2020 ◽  
Vol 2020 (4) ◽  
pp. 116-1-116-7
Author(s):  
Raphael Antonius Frick ◽  
Sascha Zmudzinski ◽  
Martin Steinebach

In recent years, the number of forged videos circulating on the Internet has increased immensely. Software and services to create such forgeries have become more and more accessible to the public, and with them the risk of malicious use of forged videos has risen. This work proposes an approach, based on the ghost effect known from image forensics, for detecting forgeries in videos that replace faces in video sequences or change the facial expressions of a face. The experimental results show that the proposed approach is able to identify forgeries in high-quality encoded video content.
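
For intuition, the JPEG ghost effect can be sketched as follows: recompress a frame at a range of qualities and look for regions whose recompression error dips at a quality different from the rest of the frame, which hints at spliced-in content such as a replaced face. The minimal Python sketch below (assuming Pillow and NumPy, and a hypothetical extracted frame file) computes such an error map per quality; the paper's actual method for video necessarily adds more, such as temporal analysis.

```python
# Minimal sketch of the ghost effect from image forensics, per frame.
import io
import numpy as np
from PIL import Image

def ghost_map(frame: Image.Image, quality: int) -> np.ndarray:
    """Recompress at the given JPEG quality and return per-pixel squared error.
    Regions previously compressed near this quality show a dip (a 'ghost')."""
    buf = io.BytesIO()
    frame.save(buf, format="JPEG", quality=quality)
    recompressed = Image.open(buf)
    a = np.asarray(frame, dtype=np.float32)
    b = np.asarray(recompressed, dtype=np.float32)
    return ((a - b) ** 2).mean(axis=-1)

if __name__ == "__main__":
    frame = Image.open("frame.png").convert("RGB")  # hypothetical extracted frame
    # Sweep qualities; region-wise minima at differing qualities suggest forgery.
    for q in range(50, 96, 5):
        print(f"quality={q}: mean squared error={ghost_map(frame, q).mean():.2f}")
```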


Author(s):  
Maria Ulan ◽  
Welf Löwe ◽  
Morgan Ericsson ◽  
Anna Wingkvist

A quality model is a conceptual decomposition of an abstract notion of quality into relevant, possibly conflicting characteristics and further into measurable metrics. For quality assessment and decision making, metrics values are aggregated to characteristics and ultimately to quality scores. Aggregation has often been problematic as quality models do not provide the semantics of aggregation. This makes it hard to formally reason about metrics, characteristics, and quality. We argue that aggregation needs to be interpretable and mathematically well defined in order to assess, to compare, and to improve quality. To address this challenge, we propose a probabilistic approach to aggregation and define quality scores based on joint distributions of absolute metrics values. To evaluate the proposed approach and its implementation under realistic conditions, we conduct empirical studies on bug prediction of ca. 5000 software classes, maintainability of ca. 15000 open-source software systems, and on the information quality of ca. 100000 real-world technical documents. We found that our approach is feasible, accurate, and scalable in performance.
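
One way to realize this idea, sketched below in Python, is to score each unit by the empirical joint probability of observing metric values at least as bad on every metric simultaneously. The assumption that higher metric values mean worse quality, and the exact estimator, are simplifications for illustration, not the paper's precise construction.

```python
# Sketch of probabilistic aggregation over joint metric distributions.
import numpy as np

def quality_scores(metrics: np.ndarray) -> np.ndarray:
    """metrics: (n_units, n_metrics) array of absolute metric values,
    where higher is assumed worse. Returns a score in (0, 1]; a higher
    score means more units are jointly at least as bad, i.e. higher quality."""
    n = metrics.shape[0]
    scores = np.empty(n)
    for i, row in enumerate(metrics):
        worse_or_equal = (metrics >= row).all(axis=1)  # joint event over metrics
        scores[i] = worse_or_equal.mean()              # empirical joint probability
    return scores

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m = rng.lognormal(size=(1000, 3))   # e.g., size, complexity, coupling
    s = quality_scores(m)
    print("best unit metrics:", m[s.argmax()], "score:", s.max())
```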

