Software engineering principles to improve quality and performance of R software

2019 ◽  
Vol 5 ◽  
pp. e175
Author(s):  
Seth Russell ◽  
Tellen D. Bennett ◽  
Debashis Ghosh

Today’s computational researchers are expected to be highly proficient in using software to solve problems ranging from processing large datasets to developing personalized treatment strategies from a growing range of options. Researchers are well versed in their own fields, but may lack formal training and appropriate mentorship in software engineering principles. Two major themes covered neither in most university coursework nor in the current literature are software testing and software optimization. Through a survey of all currently available Comprehensive R Archive Network (CRAN) packages, we show that reproducible and replicable software tests are frequently absent and that many packages do not appear to employ software performance and optimization tools and techniques. Using examples from an existing R package, we demonstrate powerful testing and optimization techniques that can improve the quality of any researcher’s software.
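
The two themes of the abstract, reproducible testing and measured optimization, can be illustrated with a short, self-contained R sketch. The package choice (testthat, microbenchmark) and the toy function are ours, not taken from the paper.

```r
# Minimal sketch of the abstract's two themes: reproducible tests and measured
# optimization. Package choice (testthat, microbenchmark) is an assumption.
library(testthat)
library(microbenchmark)

# Naive implementation: grows a vector inside a loop.
cum_mean_loop <- function(x) {
  out <- numeric(0)
  for (i in seq_along(x)) out <- c(out, mean(x[1:i]))
  out
}

# Optimized implementation: a single vectorised expression.
cum_mean_vec <- function(x) cumsum(x) / seq_along(x)

# Reproducible test: both versions must agree on a fixed input.
test_that("cumulative mean implementations agree", {
  x <- c(2, 4, 6, 8)
  expect_equal(cum_mean_loop(x), c(2, 3, 4, 5))
  expect_equal(cum_mean_vec(x), cum_mean_loop(x))
})

# Measured optimization: compare run times instead of guessing.
x <- rnorm(1e4)
print(microbenchmark(loop = cum_mean_loop(x), vectorised = cum_mean_vec(x), times = 10))
```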

2021 ◽  
Author(s):  
Jason Hunter ◽  
Mark Thyer ◽  
Dmitri Kavetski ◽  
David McInerney

Probabilistic predictions provide crucial information about the uncertainty of hydrological predictions, which is a key input for risk-based decision-making. However, they are often excluded from hydrological modelling applications because suitable probabilistic error models can be challenging to construct and interpret, and the quality of the results often depends on the objective function used to calibrate the hydrological model.

We present an open-source R package and an online web application with two aims. First, these resources are easy to use and accessible, so that users need not have specialised knowledge of probabilistic modelling to apply them. Second, the probabilistic error model we describe provides high-quality probabilistic predictions for a wide range of commonly used hydrological objective functions; it achieves this through a new innovation that resolves a long-standing issue with model assumptions that previously prevented such broad application.

We demonstrate our methods by comparing the new probabilistic error model with an existing reference error model in an empirical case study covering 54 perennial Australian catchments, the hydrological model GR4J, 8 common objective functions and 4 performance metrics (reliability, precision, volumetric bias and errors in the flow duration curve). The existing reference error model introduces additional flow dependencies into the residual error structure when used with most of the study objective functions, which in turn leads to poor-quality probabilistic predictions. In contrast, the new probabilistic error model achieves high-quality probabilistic predictions for all objective functions used in this case study.

The new probabilistic error model, the open-source software and the web application aim to facilitate the adoption of probabilistic predictions in the hydrological modelling community, and to improve the quality of predictions and of the decisions made using them. In particular, our methods can be used to obtain high-quality probabilistic predictions from hydrological models calibrated with a wide range of common objective functions.
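
The abstract does not spell out the error model, so the following is only a generic sketch of the underlying idea: deterministic simulations are post-processed with a residual error model fitted in a transformed space, and probabilistic predictions are obtained by sampling residuals back onto the simulations. The log transform, Gaussian residual assumption, and all data are illustrative assumptions, not the authors' formulation.

```r
# Illustrative sketch only: post-process deterministic simulations with a
# simple residual error model to obtain probabilistic predictions.
# The transform (log) and residual distribution (Gaussian) are assumptions;
# the paper's error model is more sophisticated.
set.seed(1)
obs <- c(12, 18, 25, 9, 14, 30, 22, 11)    # observed flows (illustrative)
sim <- c(10, 20, 23, 11, 13, 27, 25, 10)   # simulated flows from a calibrated model

# Fit residuals in log space to crudely account for heteroscedasticity.
res   <- log(obs) - log(sim)
sigma <- sd(res)

# Probabilistic prediction: sample many residual realisations per time step.
n_rep <- 1000
pred  <- sapply(sim, function(s) exp(log(s) + rnorm(n_rep, mean(res), sigma)))

# Summarise as a 90% predictive interval and check empirical reliability.
lower    <- apply(pred, 2, quantile, 0.05)
upper    <- apply(pred, 2, quantile, 0.95)
coverage <- mean(obs >= lower & obs <= upper)
print(round(coverage, 2))   # fraction of observations inside the interval
```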


Author(s):  
Vincenzo De Florio ◽  
Chris Blondia

Current software systems, and the environments such systems are meant for, require a precise characterization of the available resources and provisions in order to constantly re-optimize in the face of endogenous and exogenous changes and failures. This paper claims that it is simply not possible today to conceive of software design without explicitly addressing adaptability and dependability. As an example, the authors remark on how mobile computing technologies call for effective software engineering techniques to design, develop and maintain services that can continue to deliver a fixed, agreed-upon quality of service despite changes in the location of the client software, performance failures, and the characteristics of the environment. The paper concludes that novel paradigms are required in software engineering to provide effective system structures for adaptive and dependable services while keeping design complexity under control. The authors discuss this problem and propose one such structure, also briefly surveying the major milestones in the state of the art in this domain.


1986 ◽  
Vol 21 (3) ◽  
pp. 286-299
Author(s):  
Arthur Knight

The term corporate governance has come into use to describe both the purposes and the methods which determine the structure and the control of companies. A wide range of legal, regulatory and less formalized arrangements is thus embraced. In the UK in recent years discussion has related to a number of interrelated issues: the structure and functioning of boards of directors, reporting to shareholders, and the ways in which shareholders use their power. These issues have a bearing upon business performance, though the debate about ways to improve the quality of management also embraces cultural factors, the educational system and training arrangements; and performance depends, too, upon factors wholly or largely beyond the influence of managers, such as the tensions arising from class division, over-powerful unions and the uncertainties which flow from discontinuities in public policy, which are especially evident in the British political system. But in the general debate the corporate governance issues have perhaps had less attention than they deserve; the discussion has been confined to a limited circle. It is proposed here to concentrate on non-executive directors.


Author(s):  
David Worth ◽  
Chris Greenough ◽  
Shawn Chin

The purpose of this chapter is to introduce scientific software developers to software engineering tools and techniques that will save them much blood, sweat, and tears and allow them to demonstrate the quality of their software. By introducing ideas around the software development life cycle, source code analysis, documentation, and testing, and touching on best practices, this chapter demonstrates ways in which scientific software can be improved and future developments made easier. This is not a research article on current software engineering methods, nor does it attempt to specify best practices. Its aim is to introduce components that can be built into a tailored process. The chapter draws upon ideas of best practice current in software engineering, but recommends using these only selectively. This is done by presenting details of tools that can be used to implement these ideas and a set of case studies to demonstrate their use.
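
The chapter's specific tools are not listed in this abstract; as a small illustration of two of its themes, documentation and source code analysis, here is what they might look like in R using roxygen2-style comments and the lintr static-analysis package (both tool choices are assumptions).

```r
# Small illustration of documentation and source code analysis in R.
# Tool choice (roxygen2-style comments, lintr) is an assumption, not taken
# from the chapter.

#' Trapezoidal integration of sampled data.
#'
#' @param x Numeric vector of sample locations (must be increasing).
#' @param y Numeric vector of sample values, same length as x.
#' @return Approximate integral of y over x.
trapz <- function(x, y) {
  stopifnot(length(x) == length(y), !is.unsorted(x))
  sum(diff(x) * (head(y, -1) + tail(y, -1)) / 2)
}

# Source code analysis: run a static checker over the file before review.
# lintr::lint("trapz.R")   # flags style and common correctness issues
```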


2013 ◽  
Vol 64 (3) ◽  
pp. 133-142 ◽  
Author(s):  
Amin Safari ◽  
Ali Ahmadian ◽  
Masoud Aliakbar Golkar

Recently, the honey bee mating optimization (HBMO) technique and genetic algorithms (GA) have attracted considerable attention among modern heuristic optimization techniques. This paper presents the application and performance comparison of HBMO and GA for the coordinated design of a STATCOM and PSS. The design objective is to enhance the damping of low-frequency oscillations. The controller design problem is formulated as an optimization problem, and both HBMO and GA are employed to search for optimal controller parameters. The performance of both optimization techniques in damping low-frequency oscillations is tested and demonstrated through nonlinear time-domain simulation and performance-index studies for different disturbances over a wide range of loading conditions. The results show that HBMO performs better than GA in finding the solution. Moreover, system performance analysis under different operating conditions shows that the φ-based controller is superior to the C-based controller.
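
As a rough sketch of the general approach, controller tuning can be cast as minimising an objective over bounded parameters and handed to a heuristic optimizer. The R GA package, the toy second-order response and the cost function below are our assumptions; the paper's HBMO/GA implementations, STATCOM/PSS model and damping objective are not reproduced here.

```r
# Sketch of casting controller design as parameter optimization with a genetic
# algorithm (GA package; our choice, not the paper's code).
library(GA)

# Placeholder objective: penalise poorly damped oscillation of a toy
# second-order response parameterised by controller gains k1, k2.
damping_cost <- function(k) {
  t <- seq(0, 10, by = 0.05)
  y <- exp(-k[1] * t) * cos(2 * pi * k[2] * t)   # toy oscillatory response
  sum(y^2)                                       # smaller = better damped
}

# ga() maximises, so negate the cost; bounds stand in for controller limits.
fit <- ga(type = "real-valued",
          fitness = function(k) -damping_cost(k),
          lower = c(0.1, 0.1), upper = c(5, 5),
          popSize = 30, maxiter = 50, seed = 1)
fit@solution   # tuned parameters
```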


Neurosurgery ◽  
2017 ◽  
Vol 82 (1) ◽  
pp. 24-34 ◽  
Author(s):  
Jennifer L Shah ◽  
Gordon Li ◽  
Jenny L Shaffer ◽  
Melissa I Azoulay ◽  
Iris C Gibbs ◽  
...  

Abstract Glioblastoma is the most common primary brain tumor in adults. Standard therapy depends on patient age and performance status but principally involves surgical resection followed by a 6-wk course of radiation therapy given concurrently with temozolomide chemotherapy. Despite such treatment, prognosis remains poor, with a median survival of 16 mo. Challenges in achieving local control, maintaining quality of life, and limiting toxicity plague treatment strategies for this disease. Radiotherapy dose intensification through hypofractionation and stereotactic radiosurgery is a promising strategy that has been explored to meet these challenges. We review the use of hypofractionated radiotherapy and stereotactic radiosurgery for patients with newly diagnosed and recurrent glioblastoma.


2020 ◽  
Vol 7 (2) ◽  
pp. 34-41
Author(s):  
Vladimir Nikonov ◽  
Anton Zobov

The construction and selection of a suitable bijective function, that is, a substitution, is becoming an important applied task, particularly for building block encryption systems. Many articles have suggested different approaches to determining the quality of a substitution, but most of them are computationally complex. Solving this problem would significantly expand the range of methods for constructing and analyzing schemes in information protection systems. The purpose of this research is to find easily measurable characteristics of substitutions that allow their quality to be evaluated, as well as measures of the proximity of a particular substitution to a random one, or of its distance from it. To this end, two characteristics are proposed in this work, a difference characteristic and a polynomial characteristic; their mathematical expectations are derived, as is the variance of the difference characteristic. This makes it possible to judge the quality of a particular substitution by comparing the computed value of the characteristic with its mathematical expectation. From a computational point of view, the results of the article are of particular interest because of the simplicity of the algorithm for quantifying the quality of bijective substitutions. By its nature, calculating the difference characteristic amounts to a simple summation of integer terms over a fixed and small range. Such an operation, on both current and prospective hardware, maps naturally onto the logic of a wide range of functional elements, especially when computations are implemented in the optical range or on other carriers related to the field of nanotechnology.
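
The abstract does not give the formula for the difference characteristic, so the sketch below uses a stand-in, the sum of |s(i) − i| over a permutation s, purely to illustrate the workflow of comparing a specific substitution against the characteristic's expectation for random permutations; the stand-in formula and the block size are assumptions.

```r
# Illustrative sketch only: a stand-in "difference characteristic",
# sum(|s(i) - i|), compared against a Monte Carlo estimate of its expectation
# under uniformly random permutations. Not the paper's exact formula.
set.seed(7)
n <- 16

diff_characteristic <- function(s) sum(abs(s - seq_along(s)))

# Expected value of the characteristic for random permutations of size n.
expected <- mean(replicate(10000, diff_characteristic(sample(n))))

# A "bad" substitution (identity) versus a random-looking one.
identity_sub <- 1:n
random_sub   <- sample(n)

cat("expected:", round(expected, 1),
    "identity:", diff_characteristic(identity_sub),
    "random:",   diff_characteristic(random_sub), "\n")
# A value far from the expectation (like the identity's 0) signals a
# substitution that is far from random.
```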


2019 ◽  
Vol 1 (1) ◽  
pp. 92
Author(s):  
Fazidah Hanim Husain

Lighting is one of the key elements in any space and building infrastructure. Good design for an area in a building requires sufficient light to support the efficiency of the activities carried out there. The correct method allows natural light to be transmitted while reducing heat and glare, providing a conducive learning environment. Light has a significant influence on the quality of a space and contributes to students' focus in an architecture studio. Previous research has shown that light also affects students' emotions, behavior and mood. The artificial lighting used for most of the day and night in an architecture studio may produce both excess and inadequacy at the same time. This paper therefore focuses on identifying the quality of light in the architecture studio at UiTM (Perak), in order to instill a creative learning environment. Several methods were adopted in this study: illuminance measurements using a lux meter (LM-8100) and a questionnaire survey gauging the lighting comfort level from the students' perspective. The study revealed that the illuminance level in the architecture studio is insufficient, is not within the acceptable range stated in Malaysian Standard MS 1525:2007, and is not evenly distributed. The study also concluded that the current studio environment is not conducive and appears monotonous.
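
For readers unfamiliar with how lux-meter surveys are summarised, the sketch below computes a mean illuminance and a simple evenness measure from a set of readings. The readings and the recommended band are placeholders, not values quoted from the paper or from MS 1525:2007.

```r
# Illustrative summary of lux-meter readings from a studio; the readings and
# the recommended band are placeholder assumptions, not quotes from MS 1525:2007.
lux <- c(210, 340, 180, 150, 400, 220, 190, 260)   # hypothetical readings

recommended <- c(300, 500)                          # assumed target band (lux)

mean_lux   <- mean(lux)
uniformity <- min(lux) / mean_lux                   # common evenness measure

cat("mean illuminance:", round(mean_lux), "lux\n",
    "within recommended band:", mean_lux >= recommended[1] & mean_lux <= recommended[2], "\n",
    "uniformity (min/mean):", round(uniformity, 2), "\n")
```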

