Can Open Source projects succeed when the producers are not users? Lessons from the data processing field

2012, Vol 16, pp. 113-127
Author(s): Nicolas Jullien, Karine Roudaut

Free/Libre Open Source Software (FLOSS) proposes an original way to solve the incentive dilemma in the production of information goods, based on von Hippel's (1988) user-as-innovator principle: because users benefit from innovation, they have an incentive to produce it, and because they can expect cumulative innovation to build on their own contributions, they have an incentive to share it. But what is the incentive for producers when they are not users? We discuss this question via a qualitative study of FLOSS projects in “algorithm-based industries”. We find that, in this case, producers hardly participate in such projects.

2019, Vol 14 (1)
Author(s): H. Soon Gweon, Liam P. Shaw, Jeremy Swann, Nicola De Maio, ...

Abstract
Background: Shotgun metagenomics is increasingly used to characterise microbial communities, particularly for the investigation of antimicrobial resistance (AMR) in different animal and environmental contexts. There are many different approaches for inferring the taxonomic composition and AMR gene content of complex community samples from shotgun metagenomic data, but there has been little work establishing the optimum sequencing depth, data processing and analysis methods for these samples. In this study we used shotgun metagenomics and sequencing of cultured isolates from the same samples to address these issues. We sampled three potential environmental AMR gene reservoirs (pig caeca, river sediment, effluent) and sequenced samples with shotgun metagenomics at high depth (~200 million reads per sample). Alongside this, we cultured single-colony isolates of Enterobacteriaceae from the same samples and used hybrid sequencing (short- and long-reads) to create high-quality assemblies for comparison to the metagenomic data. To automate data processing, we developed an open-source software pipeline, ‘ResPipe’.
Results: Taxonomic profiling was much more stable to sequencing depth than AMR gene content. One million reads per sample was sufficient to achieve <1% dissimilarity to the full taxonomic composition. However, at least 80 million reads per sample were required to recover the full richness of different AMR gene families present in the sample, and additional allelic diversity of AMR genes was still being discovered in effluent at 200 million reads per sample. Normalising the number of reads mapping to AMR genes using gene length and an exogenous spike of Thermus thermophilus DNA substantially changed the estimated gene abundance distributions. While the majority of genomic content from cultured isolates from effluent was recoverable using shotgun metagenomics, this was not the case for pig caeca or river sediment.
Conclusions: Sequencing depth and profiling method can critically affect the profiling of polymicrobial animal and environmental samples with shotgun metagenomics. Both sequencing of cultured isolates and shotgun metagenomics can recover substantial diversity that is not identified using the other methods. Particular consideration is required when inferring AMR gene content or presence by mapping metagenomic reads to a database. ResPipe, the open-source software pipeline we have developed, is freely available (https://gitlab.com/hsgweon/ResPipe).
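To make the normalisation idea concrete, the sketch below computes a gene-length- and spike-in-normalised abundance for each AMR gene. It is a minimal illustration, not ResPipe's exact implementation: all read counts are hypothetical, the T. thermophilus genome size is approximate, and the formula shown (reads per base of gene, divided by reads per base of the spike) is one common way such normalisation can be done.

```python
# Illustrative normalisation of AMR gene read counts by gene length and an
# exogenous spike-in; a minimal sketch, not ResPipe's exact formula.

# Hypothetical mapped-read counts and reference lengths (bp).
amr_counts = {"blaTEM-1": 5400, "tet(A)": 1300}
amr_lengths = {"blaTEM-1": 861, "tet(A)": 1200}

spike_reads = 250_000            # reads mapping to the T. thermophilus spike
spike_genome_length = 2_100_000  # approx. T. thermophilus genome size (bp)

# Reads per base of the spike give a per-sample scaling factor, so abundances
# become comparable across samples sequenced to different depths.
spike_depth = spike_reads / spike_genome_length

for gene, reads in amr_counts.items():
    # Length-normalised coverage of the gene, scaled by the spike depth.
    normalised = (reads / amr_lengths[gene]) / spike_depth
    print(f"{gene}: {normalised:.2f}")
```

Dividing by gene length removes the bias whereby longer genes attract more reads; dividing by the spike depth anchors abundances to a known quantity of exogenous DNA.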


Author(s): Andrew McCullum

In 2015, Central Asia saw some vital improvements in the environment for cross-border e-commerce: Kazakhstan's accession to the World Trade Organization (WTO) will improve trade transparency, while the Kyrgyz Republic's membership in the Eurasian Customs Union expands its consumer base. Why e-commerce? Two reasons. First, e-commerce reduces the cost of distance. Central Asia is the highest trade-cost region in the world: vast distances from major markets make finding buyers challenging, shipping goods slow, and export costs high. Second, e-commerce can attract populations that are traditionally under-represented in export markets, such as women, small businesses, and rural entrepreneurs.


Open Physics, 2012, Vol 10 (1)
Author(s): David Nečas, Petr Klapetek

Abstract In this article, we review special features of Gwyddion, a modular, multiplatform, open-source software package for scanning probe microscopy data processing, which is available at http://gwyddion.net/. We describe its architecture with an emphasis on modularity and easy integration of the provided algorithms into other software. Special functionalities, such as processing of data from non-rectangular areas, grain and particle analysis, and metrology support, are discussed as well. It is shown that, on the basis of open-source software development, a fully functional software package can be created that covers the needs of a large part of the scanning probe microscopy user community.
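To illustrate the modular architecture the abstract highlights, here is a minimal, hypothetical plugin-registry sketch in Python. It is not Gwyddion's actual API (Gwyddion modules are written in C); it only shows the general pattern by which processing functions can be registered and discovered independently of the host application's core.

```python
# Hypothetical sketch of a modular data-processing registry, illustrating the
# plugin style described in the Gwyddion abstract; not Gwyddion's real API.
from typing import Callable, Dict, List

Image = List[List[float]]  # 2D data as rows of floats, for illustration

# The registry maps a module name to a processing function.
_MODULES: Dict[str, Callable[[Image], Image]] = {}

def register(name: str):
    """Decorator: add a processing function to the registry under `name`."""
    def wrap(func: Callable[[Image], Image]) -> Callable[[Image], Image]:
        _MODULES[name] = func
        return func
    return wrap

@register("level")
def level(data: Image) -> Image:
    # Toy processing step: subtract the mean value from every pixel.
    flat = [v for row in data for v in row]
    mean = sum(flat) / len(flat)
    return [[v - mean for v in row] for row in data]

def run(name: str, data: Image) -> Image:
    """Look up a module by name and apply it, as a host application might."""
    return _MODULES[name](data)

print(run("level", [[1.0, 2.0], [3.0, 4.0]]))
```

The point of the pattern is that new processing modules can be added without touching the core: they register themselves, and the host only needs the registry.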


Geophysics, 2018, Vol 83 (2), pp. F9-F20
Author(s): Can Oren, Robert L. Nowack

We present an overview of reproducible 3D seismic data processing and imaging using the Madagascar open-source software package. So far, there have been only a limited number of studies on the processing of real 3D data sets using open-source software packages. Madagascar, with its wide range of individual programs and tools, provides the capability to fully process 3D seismic data sets. The goal is to provide a streamlined illustration of the approach for the implementation of 3D seismic data processing and imaging using the Madagascar open-source software package. A brief introduction is first given to the Madagascar open-source software package and the publicly available 3D Teapot Dome seismic data set. Several processing steps are applied to the data set, including amplitude gaining, ground-roll attenuation, muting, deconvolution, static corrections, spike-like random noise elimination, normal moveout (NMO) velocity analysis, NMO correction, stacking, and band-pass filtering. A 3D velocity model in depth is created using Dix conversion and time-to-depth scaling. Three-dimensional poststack depth migration is then performed, followed by f-x deconvolution and structure-enhancing filtering of the migrated image to suppress random noise and enhance the useful signal. We show that Madagascar, as a powerful open-source environment, can be used to construct a basic workflow to process and image 3D seismic data in a reproducible manner.
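As a taste of what such a reproducible workflow looks like, here is a minimal sketch in Madagascar's Python-based SConstruct convention. The short chain shown (band-pass filtering, NMO correction, stacking) and all file names and parameter values are illustrative assumptions, not the paper's actual Teapot Dome workflow.

```python
# Minimal Madagascar SConstruct sketch illustrating a reproducible workflow.
# File names and parameter values below are illustrative assumptions.
from rsf.proj import *

# Assume 'cmps.rsf' holds CMP-sorted gathers and 'vnmo.rsf' holds a picked
# NMO velocity field, both prepared in earlier (omitted) steps.

# Band-pass filter to suppress low-frequency ground roll and high-frequency noise.
Flow('filtered', 'cmps', 'bandpass flo=10 fhi=60')

# Normal-moveout correction with the picked velocities, then stack the gathers.
Flow('nmoed', 'filtered vnmo', 'nmo velocity=${SOURCES[1]}')
Flow('stack', 'nmoed', 'stack')

# 'Result' registers the stacked section as a reproducible figure: running
# `scons` rebuilds every target whose inputs or commands have changed.
Result('stack', 'grey title="Stacked section"')

End()
```

Because each Flow records its inputs, outputs, and command, the entire processing sequence can be regenerated, inspected, and modified by anyone with the data, which is the sense in which the workflow is reproducible.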


2014, Vol 39 (2), pp. 127-143
Author(s): Abhoy K Ojha, Ravi Anand Rao

Institutional theory offers a very powerful lens to understand and explain societal phenomena. In the context of innovation and technology, this perspective provides insights that complement the understandings derived from a focus on technology or economics alone. Adopting this standpoint, this paper examines the emergence of the organizational field of open source software as a response to the norms of proprietary software that were unacceptable to many passionate software researchers and programmers.

The context of software product development has some unique characteristics that separate it from other industries. First, software products are information goods. In general, information goods have very high fixed costs of development and low marginal costs of reproduction, which often leads to market inefficiencies. Second, IP protection has the potential to exaggerate the problem of market inefficiencies. Third, software is both an input and an output of the production function, and IP protection has the potential to make the cost of software products prohibitively high. Fourth, the Internet has created the potential for the larger society to participate in the production process. These features of the software industry influence the dynamics among software professionals and organizations, creating a distinctive context which can be better understood through the lens of institutional theory.

According to institutional theory, organizations seek to obtain legitimacy, which goes beyond technological or economic performance, by conforming to institutional requirements in a context. There are three forms of legitimacy. Pragmatic legitimacy, based on regulative requirements, is acquired by complying with the legal and regulative rules in the organizational field. Moral legitimacy, based on normative requirements, is obtained by ensuring that the activities of an organization promote societal good or welfare. Finally, cognitive legitimacy is derived from the extent to which the activities of an organization mesh with the taken-for-granted norms in the larger context. While institutions normally persist for long periods, they do experience change. Institutional change is driven by institutional entrepreneurs who create, maintain, and disrupt the practices that are considered legitimate, and who challenge the boundaries that demarcate one field from another.

The findings of this study capture the intricate dynamics and interactions among institutional requirements, software professionals, and organizations that led to the norms of the institution of proprietary software being challenged. It suggests that the process of institutional change can lead to the creation of a new, alternate organizational field, leaving the original field largely untouched. This paper contributes to the understanding of the software industry and suggests implications for other industries that produce information goods.


2020, Vol 9 (11), pp. 679
Author(s): Nathalie Guimarães, Luís Pádua, Telmo Adão, Jonáš Hruška, Emanuel Peres, ...

Currently, the use of free and open-source software is increasing. The flexibility, availability, and maturity of this software can be a key driver in developing useful and interesting solutions. In general, open-source solutions solve specific tasks and can replace commercial solutions, which are often very expensive. This is even more noticeable in areas requiring the analysis and manipulation/visualization of large volumes of data. Considering that there is a major gap in the development of web applications for photogrammetric processing based on open-source technologies that offer quality results, the application presented in this article is intended to fill this niche. Thus, in this article a solution for photogrammetric processing is presented, based on the integration of the MicMac, GeoServer, Leaflet, and Potree software packages. The implemented architecture, focusing on open-source software for data processing and for graphical manipulation, visualization, measuring, and analysis, is presented in detail. To assess the results produced by the proposed web application, a case study is presented, using imagery acquired by an unmanned aerial vehicle in two different areas.
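To make the architecture more concrete, below is a minimal Python sketch of how a server-side component might chain MicMac's command-line tools before handing results to GeoServer and Potree. The mm3d sub-commands shown (Tapioca, Tapas, Malt) are standard MicMac processing steps, but every path, image pattern, and option here is an illustrative assumption, not the paper's actual implementation.

```python
# Illustrative server-side orchestration of MicMac's CLI via Python's
# subprocess module; a sketch under assumed paths and options, not the
# application's real configuration.
import subprocess

def run(args):
    """Run one MicMac step, raising CalledProcessError if it fails."""
    print("running:", " ".join(args))
    subprocess.run(args, check=True)

# 1. Tie-point extraction across the UAV image set.
run(["mm3d", "Tapioca", "MulScale", ".*.JPG", "500", "2500"])

# 2. Camera calibration and orientation via bundle adjustment.
run(["mm3d", "Tapas", "RadialStd", ".*.JPG", "Out=Calib"])

# 3. Dense matching to produce depth maps / orthophoto inputs.
run(["mm3d", "Malt", "Ortho", ".*.JPG", "Calib"])

# Downstream (omitted here): publish the resulting rasters through GeoServer
# for the Leaflet map client, and convert point clouds for Potree viewing.
```

Separating the processing chain (MicMac) from publication (GeoServer) and visualization (Leaflet, Potree) is what lets each open-source component be swapped or upgraded independently.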

