Editorial: The publication of geoscientific model developments v1.0

2013 ◽  
Vol 6 (4) ◽  
pp. 1233-1242

Abstract. In 2008, the first volume of the European Geosciences Union (EGU) journal Geoscientific Model Development (GMD) was published. GMD was founded because we perceived there to be a need for a space to publish comprehensive descriptions of numerical models in the geosciences. The journal is now well established, with the submission rate increasing over time. However, there are several aspects of model publication that we believe could be further improved. In this editorial we assess the lessons learned over the first few years of the journal's life, and describe some changes to GMD's editorial policy, which will ensure that the models and model developments are published in such a way that they are of maximum value to the community. These changes to editorial policy mostly focus on improving the rigour of the review process through a stricter requirement for access to the materials necessary to test the behaviour of the models. Throughout this editorial, "must" means that the stated actions are required, and the paper cannot be published without them; "strongly encouraged" means that we encourage the action, but papers can still be published if the criteria are not met; "may" means that the action may be carried out by the authors or referees, if they so wish. We have reviewed and rationalised the manuscript types into five new categories. For all papers which are primarily based on a specific numerical model, the changes are as follows:

– The paper must be accompanied by the code, or means of accessing the code, for the purpose of peer-review. If the code is normally distributed in a way which could compromise the anonymity of the referees, then the code must be made available to the editor. The referee/editor is not required to review the code in any way, but they may do so if they so wish.

– All papers must include a section at the end of the paper entitled "Code availability". In this section, instructions for obtaining the code (e.g. from a supplement, or from a website) should be included; alternatively, contact information should be given where the code can be obtained on request, or the reasons why the code is not available should be clearly stated.

– We strongly encourage authors to upload any user manuals associated with the code.

– For models where this is practicable, we strongly encourage referees to compile the code, and run test cases supplied by the authors where appropriate.

– For models which have been previously described in the "grey" literature (e.g. as internal institutional documents), we strongly encourage authors to include this grey literature as a supplement, when this is allowed by the original authors.

– All papers must include a model name and version number (or other unique identifier) in the title.

It is our perception that, since Geoscientific Model Development (GMD) was founded, it has become increasingly common to see model descriptions published in other more traditional journals, so we hope that our insights may be of general value to the wider geoscientific community.

Author(s):  
Vito Basile ◽  
Francesco Modica ◽  
Irene Fassi

In the present paper, a numerical approach to model the layer-by-layer construction of cured material during the Additive Manufacturing (AM) process is proposed. The method is based on a recursive mechanical finite element (FE) analysis and takes into account the forces and pressures acting on the cured material during the process, in order to simulate its behavior and investigate the sources of the failure conditions that lead to defects in the final part geometry. The study focuses on evaluating the capability of the Stereolithography (SLA) process to build parts with challenging features at the meso-micro scale without supports. Two test cases, a cantilever part and a bridge-shaped component, have been considered in order to evaluate the potential of the approach. The numerical models have been tuned by experimental tests. The simulations are validated on the two test cases and briefly compared to the printed samples. Results show the potential of the adopted approach but also the difficulties in configuring the simulations.
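The recursive build-up idea can be illustrated with a short, self-contained sketch. The geometry, load values, and strength limit below are illustrative assumptions, not taken from the paper, and a simple cantilever beam formula stands in for the authors' FE analysis:

```python
# Minimal sketch of a layer-by-layer build check in the spirit of the
# recursive approach above. NOT the authors' FE model: the loads, geometry,
# and failure criterion are simplified illustrative assumptions.

LAYER_THICKNESS = 25e-6   # m, assumed cure depth per layer
LAYER_WIDTH = 1e-3        # m, assumed cantilever width
DENSITY = 1200.0          # kg/m^3, typical cured photopolymer (assumed)
PEEL_PRESSURE = 5e3       # Pa, assumed separation load on the newest layer
STRENGTH_LIMIT = 2e6      # Pa, assumed strength of the green material
G = 9.81

def bending_stress(length, thickness, line_load):
    """Max bending stress of a cantilever under a uniform line load (N/m)."""
    moment = line_load * length**2 / 2.0               # N*m at the clamped root
    section_modulus = LAYER_WIDTH * thickness**2 / 6.0
    return moment / section_modulus

def simulate_build(n_layers, overhang_length):
    """Re-check the growing unsupported cantilever after each cured layer."""
    for i in range(1, n_layers + 1):
        t = i * LAYER_THICKNESS                        # cured thickness so far
        self_weight = DENSITY * G * LAYER_WIDTH * t    # N/m along the span
        peel = PEEL_PRESSURE * LAYER_WIDTH             # N/m on the top face
        stress = bending_stress(overhang_length, t, self_weight + peel)
        status = "FAIL" if stress > STRENGTH_LIMIT else "ok"
        print(f"layer {i:3d}: t={t*1e6:6.1f} um  stress={stress/1e6:8.2f} MPa  {status}")

if __name__ == "__main__":
    simulate_build(n_layers=20, overhang_length=2e-3)
```

With these assumed values the earliest, thinnest build states fail while later ones survive, which is the kind of intermediate-state failure source the recursive analysis is designed to expose.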


2011 ◽  
Vol 28 (8) ◽  
pp. 1007-1018 ◽  
Author(s):  
Christopher C. Hennon ◽  
Charles N. Helms ◽  
Kenneth R. Knapp ◽  
Amanda R. Bowen

Abstract An algorithm to detect and track global tropical cloud clusters (TCCs) is presented. TCCs are organized large areas of convection that form over warm tropical waters. TCCs are important because they are the “seedlings” that can evolve into tropical cyclones. A TCC satisfies the necessary condition of a “preexisting disturbance,” which provides the required latent heat release to drive the development of tropical cyclone circulations. The operational prediction of tropical cyclogenesis is poor because of weaknesses in the observational network and numerical models; thus, past studies have focused on identifying differences between “developing” (evolving into a tropical cyclone) and “nondeveloping” (failing to do so) TCCs in the global analysis fields to produce statistical forecasts of these events. The algorithm presented here has been used to create a global dataset of all TCCs that formed from 1980 to 2008. Capitalizing on a global, Gridded Satellite (GridSat) infrared (IR) dataset, areas of persistent, intense convection are identified by analyzing characteristics of the IR brightness temperature (Tb) fields. Identified TCCs are tracked as they move around their ocean basin (or cross into others); variables such as TCC size, location, convective intensity, cloud-top height, development status (i.e., developing or nondeveloping), and a movement vector are recorded in Network Common Data Form (NetCDF). The algorithm can be adapted to near-real-time tracking of TCCs, which could be of great benefit to the tropical cyclone forecast community.
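The detection step lends itself to a compact illustration. In the sketch below, the cold-cloud threshold and minimum-size criterion are assumed values for illustration, not those used in the paper, and the real algorithm adds persistence checks and tracking logic on top of this:

```python
# Minimal sketch: flag contiguous cold-cloud areas in a gridded IR
# brightness-temperature (Tb) field as candidate tropical cloud clusters.
import numpy as np
from scipy import ndimage

def detect_clusters(tb, cold_threshold=224.0, min_pixels=50):
    """Return a label array and a list of candidate cluster properties."""
    cold = tb < cold_threshold                 # mask of deep convection (assumed threshold)
    labels, n = ndimage.label(cold)            # connected-component labelling
    clusters = []
    for k in range(1, n + 1):
        mask = labels == k
        size = int(mask.sum())
        if size < min_pixels:                  # drop small, disorganized areas
            labels[mask] = 0
            continue
        cy, cx = ndimage.center_of_mass(mask)  # cluster centroid (grid indices)
        clusters.append({"label": k, "pixels": size,
                         "centroid": (cy, cx),
                         "min_tb": float(tb[mask].min())})
    return labels, clusters

# Toy example: a synthetic Tb field with one embedded cold cluster.
rng = np.random.default_rng(0)
tb = 290.0 + 5.0 * rng.standard_normal((100, 100))
tb[40:60, 40:60] = 210.0                       # synthetic deep convection
_, clusters = detect_clusters(tb)
print(clusters)
```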


2021 ◽  
Author(s):  
Christian Zeman ◽  
Christoph Schär

Since their first operational application in the 1950s, atmospheric numerical models have become essential tools in weather and climate prediction. As such, they are constantly subject to change, thanks to advances in computer systems, numerical methods, and the ever-increasing knowledge about Earth's atmosphere. Many of the changes in today's models relate to seemingly unsuspicious modifications, associated with minor code rearrangements, changes in hardware infrastructure, or software upgrades. Such changes are meant to preserve the model formulation, yet their verification is challenged by the chaotic nature of our atmosphere: any small change, even rounding errors, can have a big impact on individual simulations. Overall this represents a serious challenge to a consistent model development and maintenance framework.

Here we propose a new methodology for quantifying and verifying the impacts of minor changes to an atmospheric model, or to its underlying hardware/software system, by using ensemble simulations in combination with a statistical hypothesis test. The methodology can assess effects of model changes on almost any output variable over time, and can also be used with different hypothesis tests.

We present first applications of the methodology with the regional weather and climate model COSMO. The changes considered include a major system upgrade of the supercomputer used, the change from double- to single-precision floating-point representation, changes in the update frequency of the lateral boundary conditions, and tiny changes to selected model parameters. While providing very robust results, the methodology also shows a large sensitivity to more significant model changes, making it a good candidate for an automated tool to guarantee model consistency in the development cycle.
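The core of the methodology can be sketched in a few lines. The example below compares two synthetic ensembles with a two-sample Kolmogorov-Smirnov test at each output time; the data, ensemble size, choice of test, and 5% significance level are illustrative assumptions, and the actual methodology treats multiple variables and test multiplicity more carefully:

```python
# Minimal sketch: detect a tiny systematic model change by comparing two
# simulation ensembles (e.g. before/after a system upgrade) per time step,
# something a single pair of simulations could never distinguish from chaos.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_members, n_times = 50, 10

# Two ensembles of one scalar output variable; ensemble B gets a small bias.
ens_a = rng.standard_normal((n_members, n_times))
ens_b = rng.standard_normal((n_members, n_times)) + 0.02

for t in range(n_times):
    stat, p = stats.ks_2samp(ens_a[:, t], ens_b[:, t])
    flag = "reject H0 (distributions differ)" if p < 0.05 else "consistent"
    print(f"t={t:2d}  KS={stat:.3f}  p={p:.3f}  {flag}")
```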


2021 ◽  
Vol 69 (3) ◽  
pp. 857-872
Author(s):  
Kate McCue ◽  
Bill McCue

In 2018, the Chippewas of Georgina Island First Nation (GIFN) implemented a First Nation property tax system under the First Nations Fiscal Management Act (FMA)—one of the earliest First Nations in Ontario to do so. Implementation of a property tax system gave GIFN an opportunity to improve funding for and expand local services, and provide a more equitable sharing of local service costs between cottagers leasing First Nation land and the First Nation. Key challenges encountered when implementing the property tax system were building consensus around the need for a tax system, building an appropriate administrative infrastructure, carrying out property assessments, and a lack of knowledge of First Nation property tax among professionals. These challenges, however, presented opportunities to create a knowledge base around property taxation within GIFN, among cottage leaseholders, and in the wider community. Key lessons learned were (1) start as soon as possible; (2) First Nations Tax Commission support and standards are important; (3) staff training is important; (4) communicate early and often; (5) hold open houses; (6) local services are more than garbage collection; (7) property taxes do not harm lease rates or cottage sales; (8) educate lawyers, real estate agents, and other professionals; (9) startup costs were significant; (10) coordinate laws and standards with provincial variations; (11) modernize systems; and (12) utilize other parts of the FMA.


2021 ◽  
Vol 3 ◽  
Author(s):  
Sonya Ziaja ◽  
Mohit Chhabra

This Policy Brief provides lessons learned from the regulation of climate adaptation by energy utilities. The regulatory bodies responsible for oversight of investor-owned energy utilities are ill-equipped to regulate climate adaptation in the energy sector, but they may be the only institutions with the authority to do so. In 2018, the California Public Utilities Commission initiated the first quasi-legislative procedure to regulate investor-owned energy utilities' climate adaptation activities. The Commission's new rules offer some general guidance on climate adaptation and require investor-owned utilities to conduct and submit climate vulnerability studies. Structural limitations, including conflicting interests, staff capacity, and the scope of the problem, hampered the success of adaptation regulation, which failed to address fundamental questions about what constitutes adaptive measures.


Author(s):  
Sylvain Cloutier

ABSTRACT

Objective: Statistics Canada initiated the Canadian Statistical Demographic Database (CSDD) research project to determine if and how administrative data could be used to support the Canadian Census Program. The project's goal is to create a census spine from administrative data sources. The CSDD's current scope is limited to basic information (name, sex, birth date and usual place of residence) for all Canadians.

Method: Two 2011 CSDD prototypes were built using and linking hundreds of administrative files obtained mainly from other federal departments. Extensive pre-processing activities must take place prior to linkage to remove duplicates and standardize file variables. Given that Canadians do not possess a single unique identifier, administrative files were linked using record linkage methods; key matching variables were identified, validated and used to perform the linkage. This work led to the development of auxiliary files, which serve specific purposes related to the CSDD development. They also provide useful linkage keys to other Statistics Canada statistical programs.

Results: The quality of the CSDD was assessed by comparing it to two references. First, comparisons were done at the aggregate level (Canadian, provincial and sub-provincial levels) by contrasting the results with Demography Division's official population estimates for the 2011 Census. The CSDD was also compared with the 2011 Census of Population's Response Database (RDB), which allows for analysis at the micro (record) level. The RDB contains non-imputed data on name, sex, birth date and usual place of residence as provided by individual census respondents. Comparisons with the RDB have allowed us to address the question, "Does the CSDD put the right person at the same address as the 2011 Census does?" Results are promising. At the aggregate level, the CSDD compares well with the demographic estimates for the 2011 Census at the national, provincial/territorial and some urban area levels. At the micro level, the CSDD contains more individuals than the RDB. Improvements are needed with regard to its ability to place persons accurately in rural areas, owing to the lack of good residential addresses in administrative data files. Initial results led to the planning of new CSDD prototypes, this time for 2016, in line with the 2016 Census of Population.

Conclusion: The presentation will give an overview of the methods and principles behind the construction of the CSDD. Basic analytical results will highlight areas of strength and weakness. Lessons learned and upcoming challenges, along with their proposed solutions, will complete the presentation.
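As a toy illustration of the standardization and linkage steps, the sketch below performs deterministic record linkage on standardized key variables. The field names, cleaning rules, and sample records are assumptions for illustration only; the CSDD itself relies on far richer matching variables and linkage methods:

```python
# Minimal sketch: standardize two administrative files, drop duplicates,
# then link records that agree exactly on the key matching variables.
import pandas as pd

def standardize(df):
    """Uppercase and trim names, normalize dates, drop exact duplicates."""
    out = df.copy()
    out["name"] = out["name"].str.upper().str.strip()
    out["dob"] = pd.to_datetime(out["dob"]).dt.strftime("%Y-%m-%d")
    return out.drop_duplicates(subset=["name", "sex", "dob"])

admin_a = standardize(pd.DataFrame({
    "name": ["Jean Tremblay ", "ann smith"],
    "sex": ["M", "F"],
    "dob": ["1980-01-15", "1975-06-02"],
    "address": ["123 Rue Principale", "45 Oak St"]}))

admin_b = standardize(pd.DataFrame({
    "name": ["JEAN TREMBLAY", "Bob Jones"],
    "sex": ["M", "M"],
    "dob": ["1980-01-15", "1990-12-30"],
    "postal_code": ["H0H0H0", "K1A0B1"]}))

# Link records that agree on all standardized key variables.
linked = admin_a.merge(admin_b, on=["name", "sex", "dob"], how="inner")
print(linked)
```

Note how "Jean Tremblay " only links to "JEAN TREMBLAY" because of the standardization pass, which is why the pre-processing step matters so much in practice.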


2021 ◽  
Author(s):  
Julie Deshayes

When comparing realistic simulations produced by two ocean general circulation models, differences may emerge from alternative choices in boundary conditions and forcings, which alters our capacity to identify the actual differences between the two models (in the equations solved, the discretization schemes employed and/or the parameterizations introduced). The use of idealised test cases (idealised configurations with analytical boundary conditions and forcings, resolving a given set of equations) has proven effective in revealing numerical bugs, determining the advantages and pitfalls of certain numerical choices, and highlighting remaining challenges. I propose to review the historical progress enabled by the use of idealised test cases, and to promote their utilization when assessing ocean dynamics as represented by an ocean model. For the latter, I will illustrate my talk with examples from my own research activities using NEMO in various contexts. I also see idealised test cases as a promising training tool for inexperienced ocean modellers, and an efficient way to broaden collaboration with experts in adjacent disciplines, such as mathematics, fluid dynamics and computer science.


2017 ◽  
Vol 98 (10) ◽  
pp. 2057-2059 ◽  
Author(s):  
Marika M. Holland ◽  
Donald Perovich

Abstract Arctic sea ice has undergone significant change with large reductions in thickness and areal extent over the historical record. Numerical models project sea ice loss to continue for the foreseeable future, with the possibility of September ice-free conditions later this century. Understanding the mechanisms behind ice loss and its consequences for the larger Arctic and global systems is important if we are to anticipate and plan for the future. Meeting this challenge requires the collective and collaborative insights of scientists investigating the system from numerous perspectives. One impediment to progress has been a disconnect between the observational and modeling research communities. Advancing the science requires enhanced integration between these communities and more collaborative approaches to understanding Arctic sea ice loss. This paper discusses a successful effort to further these aims: a weeklong sea ice summer camp held in Barrow, Alaska (now known as Utqiaġvik), in May 2016. The camp brought together 25 participants who were a heterogeneous mix of observers and modelers from 13 different institutions at career stages from graduate students to senior researchers. The summer camp provided an accelerated program on sea ice observations and models and also fostered future collaborative interdisciplinary activities. A dialogue with Barrow community members was initiated in order to further understand the local consequences of Arctic sea ice loss. The discussion herein describes lessons learned from this activity and paths forward to advance the understanding and prediction of Arctic climate change.


Author(s):  
Nicholas Klymyshyn ◽  
Pavlo Ivanusa ◽  
Kevin Kadooka ◽  
Casey Spitz

Abstract In 2017, the United States Department of Energy (DOE) collaborated with Spanish and Korean organizations to perform a multimodal transportation test to measure shock and vibration loads imparted to used nuclear fuel (UNF) assemblies. This test used real fuel assembly components containing surrogate fuel mass to approximate the response characteristics of real, irradiated used nuclear fuel. Pacific Northwest National Laboratory was part of the test team and used the data collected during this test to validate numerical models needed to predict the response of real used nuclear fuel in other transportation configurations. This paper summarizes the modeling work and identifies lessons learned related to the modeling and analysis methodology. The modeling includes railcar dynamics using the NUCARS software code and explicit dynamic finite element modeling of used nuclear fuel cladding in LS-DYNA. The NUCARS models were validated against railcar dynamics data collected during captive track testing at the Federal Railroad Administration’s Transportation Technology Center in Pueblo, CO. The LS-DYNA models of the fuel cladding were validated against strain gage data collected throughout the test campaign. One of the key results of this work was an assessment of fuel cladding fatigue, and the methods used to calculate fatigue are detailed in this paper. The validated models and analysis methodologies described in this paper will be applied to evaluate future UNF transportation systems.
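As background to the fatigue assessment, the sketch below shows a generic rainflow-counting plus Miner's-rule damage tally. This is not the method detailed in the paper: the S-N curve parameters are placeholders rather than real cladding properties, and the example relies on the third-party rainflow package (pip install rainflow):

```python
# Minimal sketch: cumulative fatigue damage of a strain-gage time series
# via rainflow cycle counting and Miner's linear damage rule.
import math
import rainflow

# Placeholder S-N curve: cycles to failure N = C / (amplitude ** m).
SN_C = 1e12   # assumed material constant (NOT a cladding property)
SN_M = 4.0    # assumed S-N exponent

def miner_damage(strain_history):
    """Sum fractional damage over all counted cycles (Miner's rule)."""
    damage = 0.0
    for cycle_range, count in rainflow.count_cycles(strain_history):
        amplitude = cycle_range / 2.0          # amplitude = half the range
        if amplitude <= 0:
            continue
        n_fail = SN_C / (amplitude ** SN_M)    # cycles to failure at this amplitude
        damage += count / n_fail               # fraction of life consumed
    return damage

# Toy signal: slow sway plus high-frequency vibration (microstrain).
signal = [50 * math.sin(0.01 * i) + 5 * math.sin(0.7 * i) for i in range(5000)]
print(f"cumulative damage index: {miner_damage(signal):.3e}")
```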


Author(s):  
Tilman Brück ◽  
Neil T N Ferguson ◽  
Valeria Izzi ◽  
Wolfgang Stojetz

Abstract In the last decade, well over $10 billion has been spent on employment programs designed to contribute to peace and stability. Despite the outlay, whether these programs perform, and how they do so, remain open questions. This study conducts three reviews to derive the status quo of knowledge. First, it draws on academic literature on the microfoundations of instability to distill testable theories of how employment programs could affect stability at the micro level. Second, it analyses academic and grey literature that directly evaluates the impacts of employment programs on peace-related outcomes. Third, it conducts a systematic review of program-based learning from over 400 interventions. This study finds good theoretical reasons to believe that employment programs could contribute to peace. However, only very limited evidence exists on overall impacts on peace or on the pathways underlying the theories of change. At the program level, the review finds strong evidence that contributions to peace and stability are often simply assumed to have occurred. This poses a major challenge to the justification of continued spending on jobs-for-peace programs. Instead, systematic and rigorous learning on the impacts of jobs-for-peace programs needs to be scaled up urgently.

