Software documentation with markdoc 5.0

Author(s):  
E. F. Haghish

markdoc is a general-purpose literate programming package for generating dynamic documents, dynamic presentation slides, Stata help files, and package vignettes in various formats. In this article, I introduce markdoc version 5.0, which, using the mini engine, performs independently of any third-party software. The mini engine is a lightweight alternative to Pandoc (MacFarlane 2006, https://pandoc.org/) written entirely in Stata. I also propose a procedure for remodeling package documentation and data documentation in Stata and present a tutorial for generating help files, package vignettes, and GitHub Wiki documentation using markdoc.

2010, Vol 2 (5)
Author(s):  
Johan Berntsson ◽  
Norman Lin ◽  
Zoltan Dezso

In this paper we present a general-purpose middleware, called ExtSim, that allows OpenSim to communicate with external simulation software and to synchronize the in-world representation of the simulator state. We briefly present two projects in ScienceSim where ExtSim has been used, Galaxsee, an interactive real-time N-body simulation, and a protein folding demonstration, before discussing the merits and problems of the current approach. The main limitation is that until now we have been restricted to a third-party viewer and a fixed server-client protocol, but we present our work on a new viewer, called 3Di Viewer “Rei”, which opens new possibilities for enhancing both the performance and the richness of visualization for scientific computing. Finally, we discuss some ideas we are currently studying for future work.


2019, Vol 35 (21), pp. 4402-4404
Author(s):  
Carol L Ecale Zhou ◽  
Stephanie Malfatti ◽  
Jeffrey Kimbrel ◽  
Casandra Philipson ◽  
Katelyn McNair ◽  
...  

Summary: To address the need for improved phage annotation tools that scale, we created an automated throughput annotation pipeline: the multiple-genome Phage Annotation Toolkit and Evaluator (multiPhATE). multiPhATE is a throughput pipeline driver that invokes an annotation pipeline (PhATE) across a user-specified set of phage genomes. The tool incorporates a de novo phage gene-calling algorithm and assigns putative functions to gene calls using protein-, virus- and phage-centric databases. multiPhATE's modular construction allows the user to run all or any portion of the analyses by acquiring local instances of the desired databases and specifying the desired analyses in a configuration file. We demonstrate multiPhATE by annotating two newly sequenced Yersinia pestis phage genomes. Within multiPhATE, the PhATE processing pipeline can be readily run across multiple processors, making it adaptable for throughput sequencing projects. Software documentation assists the user in configuring the system. Availability and implementation: multiPhATE is implemented in Python 3.7 and runs as a command-line tool under Linux or Unix. multiPhATE is freely available under an open-source BSD3 license from https://github.com/carolzhou/multiPhATE. Instructions for acquiring the databases and third-party codes used by multiPhATE are included in the distribution README file. Users may report bugs by submitting to the GitHub issues page associated with the multiPhATE distribution. Supplementary information: Supplementary data are available at Bioinformatics online.
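As a rough illustration of the fan-out described above, the sketch below runs a per-genome annotation step across multiple processors with Python's standard multiprocessing module; annotate_genome and the genome list are hypothetical stand-ins, not multiPhATE's actual code.

```python
# Hedged sketch of running an annotation step per genome in parallel.
# annotate_genome is a placeholder, not multiPhATE's real pipeline.
from multiprocessing import Pool

def annotate_genome(fasta_path: str) -> str:
    # stand-in for one PhATE-style pipeline run on a single genome
    return f"annotated {fasta_path}"

if __name__ == "__main__":
    genomes = ["phage_A.fasta", "phage_B.fasta"]  # user-specified set
    with Pool(processes=2) as pool:
        for result in pool.map(annotate_genome, genomes):
            print(result)
```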


2011, Vol 28 (1), pp. 15-27
Author(s):  
Christopher J. Fluke ◽  
David G. Barnes ◽  
Benjamin R. Barsdell ◽  
Amr H. Hassan

General-purpose computing on graphics processing units (GPGPU) is dramatically changing the landscape of high performance computing in astronomy. In this paper, we identify and investigate several key decision areas, with the goal of simplifying the early adoption of GPGPU in astronomy. We consider the merits of OpenCL as an open standard for reducing the risks associated with coding in a native, vendor-specific programming environment, and present a GPU programming philosophy based on brute-force solutions. We assert that effective use of new GPU-based supercomputing facilities will require a change in approach from astronomers. This will likely include improved programming training, an increased need for software development best practice through the use of profiling and related optimisation tools, and a greater reliance on third-party code libraries. As with any new technology, those willing to take the risks and make the investment of time and effort to become early adopters of GPGPU in astronomy stand to reap great benefits.


Author(s):  
Robert M. Andrews ◽  
Michael Smith

Fracture control studies for new gas transmission pipelines usually produce a specified minimum Charpy energy, often including “correction factors”, which will ensure that a crack arrests in the body of the pipe. The basic pipeline parameters such as pressure, pipe grade, diameter and wall thickness are fixed early in design, and the reservoir and process engineering design set limits on the extremes of the gas composition. The inverse case, where the gas composition in an existing pipeline is to be changed from the original design basis, is more challenging. Changes in composition can arise from ageing of the reservoir supplying a pipeline, or from opportunities for the operator to generate additional revenue from third-party access. Sales gas specifications for general-purpose natural gas transmission often have broad limits, which can be met by a wide range of compositions. As a wide range of gas compositions can give the same crack driving force, determining the composition limits is a “many to one” problem without a unique solution. This paper describes the derivation of an envelope of richer gas compositions which gave an acceptable probability of crack arrest in an existing pipeline originally designed for a very lean gas mixture. It was therefore necessary to limit the amount of rich third-party gas to ensure that the crack driving force did not increase enough to propagate a long running fracture. Manufacturing test data for the linepipe were used with the EPRG probabilistic approach to derive a characteristic Charpy energy which would achieve a 95% probability of crack arrest in 5 joints or fewer. After “uncorrecting” the high Charpy energy, the value was used with the Battelle Two Curve model to analyse a range of gas compositions and derive an envelope of acceptable compositions. Sensitivity studies were carried out to assess the effects of increasing the temperature and of expanding the limits for nitrogen and carbon dioxide beyond the initial assumptions. It is concluded that for a specific case it is possible to solve the inverse problem and produce composition limits which allow increased flexibility of operation whilst maintaining safety.
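To make the probabilistic target concrete, the arithmetic below shows what a 95% chance of arrest within five joints implies under the simplifying assumption that each joint independently arrests a running crack with the same probability; this is an illustration only, not the EPRG method itself.

```python
# Illustrative arithmetic only: per-joint arrest probability needed so
# that P(arrest within 5 joints) >= 0.95, assuming independent joints
# (a simplification; the EPRG probabilistic approach is more involved).
target, joints = 0.95, 5
p = 1 - (1 - target) ** (1 / joints)
print(f"required per-joint arrest probability: {p:.3f}")  # about 0.451
```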


Author(s):  
Stéphane Hertz-Clémens

The main cause of failure for most transmission pipelines is mechanical damage due to third-party activities [1]. Usually, the damage takes the form of a dent with an associated gouge. When this type of defect is detected, pipeline operators' repair procedures necessarily imply removal of the gouge by grinding. This type of defect is considered critical for pipeline integrity; however, the importance of the various characteristics of the dent (depth, length, width) for pipeline integrity is not well known. To achieve a better understanding of this kind of mechanical damage, Gaz de France has been working on the numerical modelling of realistic defects for several years. This study presents a comparison between the results of experimental, realistic dent-creation tests and the predictions of the associated numerical model. Tests were carried out using different pipelines. All the parameters of the study were chosen in the range typical of gas transmission pipelines. Dents with gouges were created with the Gaz de France Research and Development Division's full-scale external damage testing facility: the pipe aggression rig. The tests were simulated with the general-purpose finite element (FE) package Abaqus/Standard 6.4. A good correlation was found between the FE calculation results and the experimental data in terms of residual dent depth, load and strain levels measured during the creation of the dent. This study shows that a better knowledge of the mechanical damage induced by third-party activity can be achieved through the development of numerical models.


2019, Vol 214, pp. 03018
Author(s):  
Wojciech Krzemien ◽  
Federico Stagni ◽  
Christophe Haen ◽  
Zoltan Mathe ◽  
Andrew McNab ◽  
...  

The Message Queue (MQ) architecture is an asynchronous communication scheme that provides an attractive solution for certain scenarios in a distributed computing model. Introducing an MQ as an intermediate component between the interacting processes decouples the end-points, making the system more flexible and providing high scalability and redundancy. DIRAC is general-purpose interware for distributed computing systems, which offers a common interface to a number of heterogeneous providers and guarantees transparent and reliable usage of the resources. The DIRAC platform has been adopted by several scientific projects, including High Energy Physics communities such as LHCb, the Linear Collider and Belle2. A generic Message Queue interface has been incorporated into the DIRAC framework to help solve the scalability challenges that must be addressed during LHC Run 3, starting in 2021. It allows the MQ scheme to be used for message exchange among DIRAC components or for communication with third-party services. In this contribution we describe the integration of MQ systems with DIRAC and show several use cases. Message Queues are foreseen for use in the pilot logging system, and as a backbone of the DIRAC component logging system and monitoring.
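The following minimal Python sketch illustrates the decoupling the MQ pattern provides, with a plain in-process queue standing in for a broker; it is a schematic of the pattern only, not DIRAC's actual MQ interface.

```python
# Schematic of the MQ pattern: producer and consumer share only the
# queue, never each other. A plain in-process queue stands in for an
# external broker here; DIRAC's generic MQ interface is not shown.
import queue
import threading

mq: "queue.Queue[str]" = queue.Queue()   # the intermediate component

def producer() -> None:
    for i in range(3):
        mq.put(f"log record {i}")        # endpoint A publishes and moves on
    mq.put("STOP")                       # sentinel to end the demo

def consumer() -> None:
    while (msg := mq.get()) != "STOP":   # endpoint B drains at its own pace
        print("consumed:", msg)

threading.Thread(target=producer).start()
consumer()
```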


2014, Vol 48 (4), pp. 322-354
Author(s):  
Paolo Manghi ◽  
Michele Artini ◽  
Claudio Atzori ◽  
Alessia Bardi ◽  
Andrea Mannocci ◽  
...  

Purpose – The purpose of this paper is to present the architectural principles and the services of the D-NET software toolkit. D-NET is a framework in which designers and developers find the tools for constructing and operating aggregative infrastructures (systems for aggregating data sources with heterogeneous data models and technologies) in a cost-effective way. Designers and developers can select from a variety of D-NET data management services, configure them to handle data according to given data models, and construct autonomic workflows to obtain personalized aggregative infrastructures. Design/methodology/approach – The paper provides a definition of aggregative infrastructures, sketching their architecture and components, as inspired by real-case examples. It then describes the limits of current solutions, whose shortcomings lie in the realization and maintenance costs of such complex software. Finally, it proposes D-NET as an optimal solution for designers and developers willing to realize aggregative infrastructures. The D-NET architecture and services are presented, drawing a parallel with those of aggregative infrastructures. Finally, real cases of D-NET are presented to showcase the statements above. Findings – The D-NET software toolkit is a general-purpose service-oriented framework in which designers can construct customized, robust, scalable, autonomic aggregative infrastructures in a cost-effective way. D-NET is today adopted by several EC projects, national consortia and communities to create customized infrastructures under diverse application domains, and other organizations are enquiring about or experimenting with its adoption. Its customizability and extendibility make D-NET a suitable candidate for creating aggregative infrastructures mediating between different scientific domains and therefore supporting multi-disciplinary research. Originality/value – D-NET is the first general-purpose framework of this kind. Other solutions are available in the literature but focus on specific use cases and therefore suffer from limited re-use in different contexts. Due to its maturity, D-NET can also be used by third-party organizations not necessarily involved in the software design and maintenance.


2016, Vol 31 (1), pp. 9-34
Author(s):  
Karl A. Froeschl ◽  
Tetsuo Yamada ◽  
Roman Kudrna

UNIDO maintains a long tradition in the compilation of international statistics on industrial production and, in particular, has developed a suite of tools and techniques for improving third-party data, mainly supplied directly or indirectly by National Statistics Offices, to enable and enhance both cross-country and long-term comparability. However, changing IT environments, socio-economic conditions, and customer requirements increasingly challenge established procedures and behaviors. After reviewing (i) the relevance of industrial statistics in general, and of international industrial statistics in particular, for industrial development in the macroeconomic framework, (ii) the importance of data comparability in those statistics for accurate and objective analysis of industrial development, and (iii) UNIDO's specific efforts to increase cross-country data comparability, this paper discusses a proposal for an integrated data and data documentation framework aimed at (i) recording all measures taken by UNIDO and (ii) reporting all residual inconsistencies and deficiencies adversely affecting data comparability and interpretation. The proposal is illustrated by a prototype implementation using current UNIDO data.


2019, Vol 5, pp. e227
Author(s):  
Volodymyr B. Kopei ◽  
Oleh R. Onysko ◽  
Vitalii G. Panchuk

Typically, component-oriented acausal hybrid modeling of complex dynamic systems is implemented with specialized modeling languages; a well-known example is the Modelica language. The specialized nature and the complexity of implementing and learning such languages somewhat limit their development and wide use by developers who know only general-purpose languages. This paper suggests a principle for developing a Modelica-like system, simple to understand and modify, based on the general-purpose programming language Python. The principle consists of the following: (1) Python classes are used to describe components and their systems; (2) the declarative symbolic tools of SymPy are used to describe component behavior by difference or differential equations; (3) the solution procedure uses a function initially created with the SymPy lambdify function and computes unknown values in the current step using known values from the previous step; (4) Python imperative constructs are used for simple event handling; (5) external solvers of differential-algebraic equations can optionally be applied via the Assimulo interface; (6) the SymPy package allows arbitrary manipulation of model equations, code generation and symbolic solution of some equations. A basic set of mechanical components (1D translational “mass”, “spring-damper” and “force”) is developed. Models of a sucker rod string are developed and simulated using these components. Comparison of the sucker rod string simulation results with practical dynamometer cards and with Modelica results verifies the adequacy of the models. The proposed approach simplifies understanding of the system, its modification and improvement, and adaptation for other purposes; it makes the system available to a much larger community and simplifies integration into third-party software.
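The short sketch below, with illustrative names and numbers rather than the paper's actual code, shows how steps (2) and (3) of the stated principle can fit together for a single 1D translational mass with a spring-damper and a constant force.

```python
# Minimal sketch of steps (2)-(3): symbolic difference equations via
# SymPy, lambdified into a numeric step function. Illustrative only.
import sympy as sp

x0, v0, dt = sp.symbols('x0 v0 dt')      # state at the previous step
m, k, c, F = 1.0, 10.0, 0.5, 1.0         # mass, stiffness, damping, force

# explicit-Euler difference equations for m*x'' = F - k*x - c*x'
v1 = v0 + (F - k * x0 - c * v0) / m * dt
x1 = x0 + v0 * dt

# lambdify turns the symbolic update into a fast numeric function
step = sp.lambdify((x0, v0, dt), (x1, v1))

x, v = 0.0, 0.0                          # initial conditions
for _ in range(1000):                    # known previous values -> next step
    x, v = step(x, v, 0.01)
print(f"x = {x:.4f}, v = {v:.4f}")       # x approaches static deflection F/k = 0.1
```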


2020
Author(s):  
Alifa Hayati ◽  
Afriyeni Afriyeni

This study aims to analyze, and to add insight into, credit by using the Loan to Deposit Ratio (LDR) and Non Performing Loan (NPL) ratio at PT. BPR Jorong Kampung Tangah Pariaman, in order to determine the efficiency and effectiveness of the third-party funds and bad loans held by this bank. The data used are secondary data taken from the financial documentation of PT. BPR Jorong Kampung Tangah Pariaman. The results were quite good: third-party funds declined slightly while bad loans remained balanced.
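For readers unfamiliar with the two ratios, the sketch below computes them from hypothetical figures; the study's actual values come from the bank's financial records.

```python
# Minimal sketch of the two ratios named above, with hypothetical
# figures only; not the study's actual data.
third_party_funds = 50_000_000_000      # total deposits (hypothetical)
total_loans       = 42_000_000_000      # credit extended (hypothetical)
bad_loans         =  1_500_000_000      # non-performing loans (hypothetical)

ldr = total_loans / third_party_funds   # Loan to Deposit Ratio
npl = bad_loans / total_loans           # Non Performing Loan ratio
print(f"LDR = {ldr:.1%}, NPL = {npl:.1%}")   # 84.0%, 3.6%
```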

