Open Science Expectations for Simulation-Based Research

2021 ◽  
Vol 3 ◽  
Author(s):  
Gretchen L. Mullendore ◽  
Matthew S. Mayernik ◽  
Douglas C. Schuster

There is strong agreement across the sciences that replicable workflows are needed for computational modeling. Open and replicable workflows not only strengthen public confidence in the sciences, but also result in more efficient community science. However, the massive size and complexity of geoscience simulation outputs, as well as the large cost to produce and preserve these outputs, present problems related to data storage, preservation, duplication, and replication. The simulation workflows themselves present additional challenges related to usability, understandability, documentation, and citation. These challenges make it difficult for researchers to meet the bewildering variety of data management requirements and recommendations across research funders and scientific journals. This paper introduces initial outcomes and emerging themes from the EarthCube Research Coordination Network project titled “What About Model Data? - Best Practices for Preservation and Replicability,” which is working to develop tools to assist researchers in determining what elements of geoscience modeling research should be preserved and shared to meet evolving community open science expectations. Specifically, the paper offers approaches to address the following key questions:

• How should preservation of model software and outputs differ for projects that are oriented toward knowledge production vs. projects oriented toward data production?
• What components of dynamical geoscience modeling research should be preserved and shared?
• What curation support is needed to enable sharing and preservation for geoscience simulation models and their output?
• What cultural barriers impede geoscience modelers from making progress on these topics?

2018 ◽  
Vol 37 (4) ◽  
Author(s):  
Heidi Enwald

Open research data is data that is free to access, reuse, and redistribute. This study focuses on the perceptions, opinions, and experiences of staff and researchers of research institutes on topics related to open research data. Furthermore, differences across gender, role in the research organization, and research field were investigated. An international questionnaire survey, translated into Finnish and Swedish, was used as the data collection instrument. The online survey was distributed to Finnish research organizations through an open-science-related network. In the end, 469 respondents answered all 24 questions of the survey. Findings indicate that many are still unaware of or uncertain about issues related to data sharing and long-term data storage. Women, as well as staff and researchers in the medical and health sciences, were most concerned about the possible problems associated with data sharing. Those at the beginning of their scientific careers hesitated to share their data.


2014 ◽  
Vol 6 ◽  
pp. 217584 ◽  
Author(s):  
J. Schilp ◽  
C. Seidel ◽  
H. Krauss ◽  
J. Weirather

Process monitoring and modelling can contribute to fostering the industrial relevance of additive manufacturing. Process-related temperature gradients and thermal inhomogeneities cause residual stresses and distortions and influence the microstructure. Variations in wall thickness can cause heat accumulations. These occur predominantly in filigree part areas and can be detected by off-axis thermographic monitoring during the manufacturing process. In addition, numerical simulation models on the scale of whole parts can enable an analysis of temperature fields upstream of the build process. In a microscale domain, modelling of several exposed single hatches allows temperature investigations at high spatial and temporal resolution. Within this paper, FEM-based micro- and macroscale modelling approaches as well as an experimental setup for thermographic monitoring are introduced. By discussing and comparing experimental data with simulation results in terms of temperature distributions, both the potential of numerical approaches and the complexity of determining suitable, computationally efficient process models are demonstrated. This paper contributes to the vision of adjusting the transient temperature field during manufacturing in order to improve the resulting part quality through simulation-based process design upstream of the build process and inline process monitoring.
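The macroscale temperature-field idea above can be illustrated with a minimal sketch: a 1D explicit finite-difference model of transient heat conduction. All values below (thermal diffusivity, geometry, boundary temperatures, step counts) are invented placeholders, not the paper's actual process parameters.

```python
# Minimal 1D explicit finite-difference sketch of transient heat conduction.
# All material and geometry values are invented placeholders.

def simulate_temperature(n_nodes=50, length=0.01, alpha=5e-6,
                         t_ambient=293.0, t_source=1800.0, steps=200):
    """Return the temperature profile (K) along a 1D bar after `steps` steps."""
    dx = length / (n_nodes - 1)
    dt = 0.4 * dx * dx / alpha  # explicit scheme is stable for dt <= dx^2/(2*alpha)
    temps = [t_ambient] * n_nodes
    temps[0] = t_source         # heated boundary, e.g. a laser-exposed surface
    for _ in range(steps):
        new = temps[:]
        for i in range(1, n_nodes - 1):
            new[i] = temps[i] + alpha * dt / dx ** 2 * (
                temps[i - 1] - 2 * temps[i] + temps[i + 1])
        new[0], new[-1] = t_source, t_ambient  # fixed boundary conditions
        temps = new
    return temps

profile = simulate_temperature()  # steep gradient near the heated end
```

The choice `dt = 0.4 * dx**2 / alpha` keeps the explicit scheme inside its stability bound; real process models face exactly the trade-off the abstract mentions between resolution and computation time.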


2012 ◽  
pp. 862-880
Author(s):  
Russ Miller ◽  
Charles Weeks

Grids represent an emerging technology that allows geographically- and organizationally-distributed resources (e.g., computer systems, data repositories, sensors, imaging systems, and so forth) to be linked in a fashion that is transparent to the user. The New York State Grid (NYS Grid) is an integrated computational and data grid that provides access to a wide variety of resources to users from around the world. NYS Grid can be accessed via a Web portal, where the users have access to their data sets and applications, but do not need to be made aware of the details of the data storage or computational devices that are specifically employed in solving their problems. Grid-enabled versions of the SnB and BnP programs, which implement the Shake-and-Bake method of molecular structure (SnB) and substructure (BnP) determination, respectively, have been deployed on NYS Grid. Further, through the Grid Portal, SnB has been run simultaneously on all computational resources on NYS Grid as well as on more than 1100 of the over 3000 processors available through the Open Science Grid.


2019 ◽  
pp. 98-131
Author(s):  
Johannes Lenhard

This chapter shows that—and how—simulation models are epistemically opaque. Nevertheless, it is argued, simulation models can provide a means to control dynamics. Researchers can employ a series of iterated (experimental) runs of the model and can learn to orient themselves within the model—even if the dynamics of the simulation remain (at least partly) opaque. Admittedly, such an acquaintance with the model falls short of the high epistemic standards usually ascribed to mathematical models. This lower standard is still sufficient, however, when the aim is controlled intervention in technological contexts. On the other hand, opacity has to be accepted if the option for control is to remain in any way open. This chapter closes by discussing whether epistemic opacity restricts simulation-based science to a pragmatic—“weak”—version of scientific understanding.


Author(s):  
Haibo Chen ◽  
Torgeir Moan ◽  
Sverre Haver ◽  
Kjell Larsen

The safety of tandem offloading between an FPSO and a shuttle tanker is a matter of concern. A few collisions between the two vessels have occurred in the North Sea in recent years. In these incidents, excessive relative motions (termed surging and yawing in this paper) between FPSO and tanker were identified as “failure prone situations” that contributed to the initiation of most collision incidents. To quantitatively assess the probability of surging and yawing events and, more importantly, to effectively reduce their occurrence in tandem offloading operations, this paper presents a simulation-based approach carried out with the state-of-the-art time-domain simulation code SIMO. The SIMO simulation models are set up and calibrated for a typical North Sea purpose-built FPSO and a DP shuttle tanker, and the motion of this two-vessel system during tandem offloading is simulated. The simulated relative distance and relative heading between FPSO and tanker are analyzed by fitting their extreme values to statistical models, which yields the probabilities of surging and yawing events. Sensitivity studies are performed to analyze the contributions of various technical and operational factors, and measures to minimize the occurrence of surging and yawing from design and operational points of view are proposed.
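As a rough illustration of the extreme-value step only (not the authors' SIMO workflow), one can fit per-run extreme values to a Gumbel distribution by the method of moments and read off an exceedance probability. The synthetic "extremes" and the 35 m threshold below are invented for illustration.

```python
# Illustrative extreme-value fit; random.gauss stands in for per-run extremes
# from time-domain simulation, and the 35 m threshold is an invented limit.
import math
import random

random.seed(1)
# pretend each value is the maximum relative excursion (m) from one simulated run
extremes = [random.gauss(25.0, 3.0) for _ in range(100)]

# method-of-moments Gumbel fit: beta = s*sqrt(6)/pi, mu = mean - 0.5772*beta
mean = sum(extremes) / len(extremes)
var = sum((x - mean) ** 2 for x in extremes) / (len(extremes) - 1)
beta = math.sqrt(6.0 * var) / math.pi
mu = mean - 0.5772156649 * beta

def prob_exceed(threshold):
    """P(run extreme > threshold) under the fitted Gumbel model."""
    return 1.0 - math.exp(-math.exp(-(threshold - mu) / beta))

p_event = prob_exceed(35.0)  # chance a run's extreme exceeds the 35 m limit
```

Sensitivity studies of the kind the abstract describes would then repeat this fit while varying the underlying simulation parameters and compare the resulting event probabilities.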


2007 ◽  
Vol 15 (4) ◽  
pp. 249-268 ◽  
Author(s):  
Gurmeet Singh ◽  
Karan Vahi ◽  
Arun Ramakrishnan ◽  
Gaurang Mehta ◽  
Ewa Deelman ◽  
...  

In this paper we examine the issue of optimizing disk usage when scheduling large-scale scientific workflows onto distributed resources, where the workflows are data-intensive, requiring large amounts of data storage, and the resources have limited storage. Our approach is two-fold: we minimize the amount of space a workflow requires during execution by removing data files at runtime when they are no longer needed, and we demonstrate that workflows may have to be restructured to reduce their overall data footprint. We show the results of our data management and workflow restructuring solutions using a Laser Interferometer Gravitational-Wave Observatory (LIGO) application and an astronomy application, Montage, running on a large-scale production grid, the Open Science Grid. We show that although a 48% reduction in the data footprint of Montage can be achieved with dynamic data cleanup techniques alone, LIGO Scientific Collaboration workflows require additional restructuring to achieve a 56% reduction in data space usage. We also examine the cost of the workflow restructuring in terms of the application's runtime.
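The dynamic data-cleanup idea can be sketched as a toy simulation (not the actual grid implementation): delete each intermediate file as soon as no remaining task consumes it, and compare the peak storage footprint with and without cleanup. The pipeline, file names, and sizes are invented.

```python
# Toy model of dynamic data cleanup: a file is deleted once no remaining
# task lists it as an input; peak on-disk usage is tracked throughout.

def peak_footprint(tasks, sizes, cleanup):
    """tasks: ordered (inputs, outputs) pairs; sizes: file -> size in MB."""
    remaining_uses = {}
    for inputs, _ in tasks:
        for f in inputs:
            remaining_uses[f] = remaining_uses.get(f, 0) + 1
    on_disk, current, peak = set(), 0.0, 0.0
    for inputs, outputs in tasks:
        for f in outputs:                 # task materializes its outputs
            if f not in on_disk:
                on_disk.add(f)
                current += sizes[f]
        peak = max(peak, current)
        for f in inputs:                  # consume inputs; maybe delete them
            remaining_uses[f] -= 1
            if cleanup and remaining_uses[f] == 0 and f in on_disk:
                on_disk.remove(f)
                current -= sizes[f]
    return peak

# hypothetical 4-stage pipeline with invented file sizes (MB)
sizes = {"raw": 100, "a": 50, "b": 50, "final": 10}
tasks = [((), ("raw",)),
         (("raw",), ("a",)),
         (("a",), ("b",)),
         (("b",), ("final",))]

no_cleanup = peak_footprint(tasks, sizes, cleanup=False)
with_cleanup = peak_footprint(tasks, sizes, cleanup=True)
```

In this toy chain the cleanup variant peaks at 150 MB instead of 210 MB; the abstract's point about restructuring is that when a file has many downstream consumers, reordering the tasks is what shortens its lifetime.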


2021 ◽  
Author(s):  
Eric Feczko ◽  
Greg Conan ◽  
Scott Marek ◽  
Brenden Tervo-Clemens ◽  
Michaela Cordova ◽  
...  

The Adolescent Brain Cognitive Development Study (ABCD), a 10-year longitudinal neuroimaging study of the largest population-based and demographically distributed cohort of 9-10 year olds (N=11,877), was designed to overcome the reproducibility limitations of prior child mental health studies. Alongside this wealth of research opportunities, the extremely large size of the ABCD data set also creates enormous data storage, processing, and analysis challenges for researchers. To ensure data privacy and safety, researchers are not currently able to share neuroimaging data derivatives through the central repository at the National Data Archive (NDA). However, sharing derived data laterally among researchers can powerfully accelerate scientific progress and ensure that the maximum public benefit is derived from the ABCD study. To simultaneously promote collaboration and data safety, we developed the ABCD-BIDS Community Collection (ABCC), which includes both curated processed data and software utilities for further analyses. The ABCC also enables researchers to upload their own custom-processed versions of ABCD data and derivatives for sharing with the research community. This NeuroResource is meant to serve as the companion guide for the ABCC. Section I describes the ABCC, Section II highlights ABCC utilities that help researchers access, share, and analyze ABCD data, and Section III provides two exemplar reproducibility analyses using ABCC utilities. We hope that adoption of the ABCC's data-safe, open-science framework will boost access and reproducibility, thus facilitating progress in child and adolescent mental health research.


Author(s):  
Zhimin Xi ◽  
Hao Pan ◽  
Ren-Jye Yang

Reliability analysis based on a simulation model can be wrong if the model has not been validated. Various model bias correction approaches have been developed to improve model credibility by adding the identified model bias to the baseline simulation model. However, little research has been conducted on simulation models with dynamic system responses. This paper presents such a framework for model bias correction of dynamic system responses for reliability analysis by addressing three technical components: i) a validation metric for dynamic system responses, ii) an effective approach for dynamic model bias calibration and approximation, and iii) reliability analysis considering the dynamic model bias. Two case studies, a thermal problem and a corroded beam problem, are employed to demonstrate the proposed approaches for simulation-based reliability analysis.
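The general bias-correction idea can be sketched as follows (a hedged illustration, not the authors' framework): add an identified bias term to the baseline model's response, then estimate reliability by Monte Carlo sampling of the uncertain input. The models, bias term, load distribution N(10, 1), and limit value 26 are all invented for illustration.

```python
# Sketch of bias-corrected reliability analysis with invented models and values.
import random

random.seed(0)

def baseline_model(load):
    return 2.0 * load   # low-fidelity prediction of a system response

def bias(load):
    return 0.3 * load   # model bias identified from validation experiments

def corrected_model(load):
    return baseline_model(load) + bias(load)

def reliability(model, limit=26.0, n=20000):
    """Monte Carlo estimate of P(response < limit) for load ~ N(10, 1)."""
    safe = sum(1 for _ in range(n) if model(random.gauss(10.0, 1.0)) < limit)
    return safe / n

r_baseline = reliability(baseline_model)    # optimistic without the bias term
r_corrected = reliability(corrected_model)  # lower, more credible estimate
```

The gap between the two estimates is the abstract's core point: an unvalidated baseline model overstates reliability. For the dynamic responses the paper targets, the scalar bias term would become a time-dependent function calibrated against measured response histories.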


Author(s):  
Wolfgang Fimml ◽  
Christian Fuchs ◽  
Thomas Jauk ◽  
Andreas Wimmer

The characterization of diesel sprays for the simulation-based optimization of injection strategies and combustion chamber geometries is of particular importance to reach future targets concerning performance, fuel consumption and emissions. The prediction quality of this simulation process depends largely upon the adequate calibration of the spray models used. This paper aims to present the experimental setup of a spray box, the applied optical visualization techniques and the results. Furthermore, it will show the adjustment and the validation of the simulation models based on the experimental analysis.


Author(s):  
Tero Eskola ◽  
Heikki Handroos

A hardware-in-the-loop (HIL) simulation-based method for designing and testing fluid power driven machines has recently been studied in [1], [2] and [3]. In those papers the method was successfully used to drive physical prototypes with simulation models of various hydraulic circuits. Although the results of the tested method have appeared reasonable, the critical boundary conditions of the system have not yet been studied. In this paper a simple hydraulic system is modeled and used to drive the simulator. The simulated system is then built from real components and measured, and the measured and simulated results are compared. One of the main goals of this paper is to answer the following question: what is the maximum bandwidth that the simulator can reproduce with sufficient accuracy? The answer demonstrates the applicability of the developed HIL simulator. Different sizes of time steps are also studied.
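The time-step question can be illustrated with a toy experiment (not the authors' HIL rig): integrate a first-order lag with explicit Euler at two step sizes and compare against the exact step response evaluated one time constant in. The time constant and step sizes are invented values.

```python
# Toy bandwidth/time-step experiment: explicit Euler integration of the
# first-order lag dy/dt = (1 - y)/tau under a unit step input.
import math

def euler_step_response(tau, dt, t_end):
    """Explicit-Euler step response of dy/dt = (1 - y)/tau at time t_end."""
    y = 0.0
    for _ in range(round(t_end / dt)):
        y += dt * (1.0 - y) / tau
    return y

tau = 0.05                                              # plant time constant (s)
exact = 1.0 - math.exp(-1.0)                            # exact response at t = tau
fine = euler_step_response(tau, dt=0.0005, t_end=tau)   # 100 sub-steps per tau
coarse = euler_step_response(tau, dt=0.025, t_end=tau)  # only 2 sub-steps per tau
```

Once the time step approaches the fastest time constant of the simulated hydraulics, the integration error grows sharply, which is one face of the bandwidth limit the paper sets out to measure on real hardware.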

