Boosting climate change research with direct access to high performance computers

Author(s):  
Maria Moreno de Castro ◽  
Stephan Kindermann ◽  
Sandro Fiore ◽  
Paola Nassisi ◽  
Guillaume Levavasseur ◽  
...  

Earth system observational and model data volumes are constantly increasing, and it can be challenging to discover, download, and analyze data if scientists do not have the required computing and storage resources at hand. This is especially the case for detection and attribution studies in the field of climate change research, since they require multi-source and cross-disciplinary comparisons across datasets of high spatial and large temporal coverage. Researchers and end users are therefore looking for access to cloud solutions and high-performance computing facilities. The Earth System Grid Federation (ESGF, https://esgf.llnl.gov/) maintains a global system of federated data centers that provides access to the largest archive of climate model data worldwide. ESGF portals provide free access to the model output contributing, through the Coupled Model Intercomparison Project, to the next assessment report of the Intergovernmental Panel on Climate Change. To support users in directly accessing high-performance computing facilities for analyses such as detection and attribution of climate change and its impacts, the EU Commission funded a new service within the infrastructure of the European Network for Earth System Modelling (ENES, https://portal.enes.org/data/data-metadata-service/analysis-platforms). This new service is designed to reduce data transfer issues, speed up computational analysis, provide storage, and ensure access to and maintenance of the resources. Furthermore, the service is free of charge and requires only a lightweight application. We will present a demo showing how flexibly climate indices can be calculated from different ESGF datasets covering a wide range of temporal and spatial scales, using cdo (Climate Data Operators, https://code.mpimet.mpg.de/projects/cdo/) and Jupyter notebooks running directly at the ENES partner high-performance computing centers: DKRZ (Germany), JASMIN (UK), CMCC (Italy), and IPSL (France).
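
As a flavour of the kind of computation such a demo targets, the following is a minimal sketch, not the demo itself, of computing a simple climate index, the annual number of summer days (daily maximum temperature above 25 °C), from a CMIP6 tasmax file with xarray. The file name and the chosen location are hypothetical placeholders.

```python
# Illustrative sketch only: annual "summer days" index (tasmax > 25 degC) from a
# CMIP6 daily-maximum-temperature file. The file name and coordinates below are
# placeholders for a dataset available in an ESGF/ENES data pool.
import xarray as xr

ds = xr.open_dataset("tasmax_day_MODEL_historical_r1i1p1f1_gn_19500101-20141231.nc")
tasmax_c = ds["tasmax"] - 273.15                     # CMIP6 tasmax is stored in kelvin
summer_days = (tasmax_c > 25.0).groupby("time.year").sum("time")

# time series of the index for a single grid cell (here: nearest cell to Berlin)
summer_days.sel(lat=52.5, lon=13.4, method="nearest").plot()
```

The same index can also be produced with cdo's ready-made ECA climate index operators; the xarray form is shown here only because it fits naturally into a Jupyter notebook.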

Green computing is a contemporary research topic addressing climate and energy challenges. In this chapter, the authors examine the duality between green computing and technological trends in other fields of computing, such as high-performance computing (HPC) and cloud computing, on the one hand, and economy and business on the other. For instance, providing electricity for large-scale cloud infrastructures and reaching exascale computing requires huge amounts of energy. Thus, green computing is a challenge for the future of cloud computing and HPC. Conversely, clouds and HPC provide solutions for green computing and climate change. In this chapter, the authors discuss this proposition by looking at the technology in detail.


Author(s):  
Atta ur Rehman Khan ◽  
Abdul Nasir Khan

Mobile devices are gaining high popularity due to their support for a wide range of applications. However, mobile devices are resource-constrained, and many applications require substantial resources. To address this issue, researchers envision the use of mobile cloud computing technology, which offers high-performance computing, execution of resource-intensive applications, and energy efficiency. This chapter highlights the importance of mobile devices, high-performance applications, and the computing challenges of mobile devices. It also provides a brief introduction to mobile cloud computing technology, its architecture, types of mobile applications, the computation offloading process, effective offloading challenges, and high-performance computing applications on mobile devices that are enabled by mobile cloud computing technology.


2014 ◽  
Vol 11 (9) ◽  
pp. 10273-10317 ◽  
Author(s):  
S. Wi ◽  
Y. C. E. Yang ◽  
S. Steinschneider ◽  
A. Khalil ◽  
C. M. Brown

Abstract. This study utilizes high performance computing to test the performance and uncertainty of calibration strategies for a spatially distributed hydrologic model in order to improve model simulation accuracy and understand prediction uncertainty at interior ungaged sites of a sparsely-gaged watershed. The study is conducted using a distributed version of the HYMOD hydrologic model (HYMOD_DS) applied to the Kabul River basin. Several calibration experiments are conducted to understand the benefits and costs associated with different calibration choices, including (1) whether multisite gaged data should be used simultaneously or in a step-wise manner during model fitting, (2) the effects of increasing parameter complexity, and (3) the potential to estimate interior watershed flows using only gaged data at the basin outlet. The implications of the different calibration strategies are considered in the context of hydrologic projections under climate change. Several interesting results emerge from the study. The simultaneous use of multisite data is shown to improve the calibration over a step-wise approach, and both multisite approaches far exceed a calibration based on only the basin outlet. The basin outlet calibration can lead to projections of mid-21st century streamflow that deviate substantially from projections under multisite calibration strategies, supporting the use of caution when using distributed models in data-scarce regions for climate change impact assessments. Surprisingly, increased parameter complexity does not substantially increase the uncertainty in streamflow projections, even though parameter equifinality does emerge. The results suggest that increased (excessive) parameter complexity does not always lead to increased predictive uncertainty if structural uncertainties are present. The largest uncertainty in future streamflow results from variations in projected climate between climate models, which substantially outweighs the calibration uncertainty.
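
To make the contrast between the calibration strategies concrete, the following is a minimal sketch, not the study's code, of the "simultaneous multisite" idea: a single objective function pools the Nash-Sutcliffe efficiency (NSE) over all gauged sites instead of fitting one gauge at a time. A toy one-parameter linear reservoir stands in for the distributed HYMOD_DS model, and the forcing, gauges, and parameter bounds are invented for illustration only.

```python
# Minimal sketch (not the study's code): simultaneous multisite calibration by
# maximizing the mean Nash-Sutcliffe efficiency (NSE) over all gauges at once.
# A toy linear reservoir replaces HYMOD_DS; forcing and "observations" are synthetic.
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(42)
rain = rng.gamma(2.0, 5.0, size=365)                 # synthetic daily rainfall forcing

def toy_model(k, rain):
    """Toy linear reservoir: outflow is a fraction k of storage (placeholder model)."""
    q, storage = np.empty_like(rain), 0.0
    for t, p in enumerate(rain):
        storage += p
        q[t] = k * storage
        storage -= q[t]
    return q

# synthetic observations at two interior gauges (true k = 0.3, different drainage areas)
site_factor = {"gauge_A": 1.0, "gauge_B": 0.6}
obs = {s: toy_model(0.3, rain) * f + rng.normal(0.0, 0.5, rain.size)
       for s, f in site_factor.items()}

def nse(sim, o):
    return 1.0 - np.sum((sim - o) ** 2) / np.sum((o - o.mean()) ** 2)

def objective(params):
    """Negative mean NSE pooled across all gauged sites (to be minimized)."""
    k = params[0]
    return -np.mean([nse(toy_model(k, rain) * f, obs[s]) for s, f in site_factor.items()])

best = differential_evolution(objective, bounds=[(0.01, 0.99)], seed=1)
print(f"calibrated k = {best.x[0]:.3f}, mean NSE = {-best.fun:.3f}")
```

A stepwise strategy would instead fit the gauges one after another, and an outlet-only strategy would use a single downstream gauge in the objective; the abstract's finding is that the pooled objective generalizes best to interior sites.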


2020 ◽  
Author(s):  
Dirk Barbi ◽  
Nadine Wieters ◽  
Paul Gierz ◽  
Fatemeh Chegini ◽  
Sara Khosravi ◽  
...  

Abstract. Earth system and climate modelling involves the simulation of processes on a wide range of scales and within and across various components of the Earth system. In practice, component models are often developed independently by different research groups and then combined using dedicated coupling software. This procedure not only leads to a strongly growing number of available versions of model components and coupled setups but also to model- and system-dependent ways of obtaining and operating them. Therefore, implementing these Earth system models (ESMs) can be challenging and extremely time-consuming, especially for less experienced modellers or scientists aiming to use different ESMs, as in the case of intercomparison projects. To assist researchers and modellers by reducing avoidable complexity, we developed the ESM-Tools software, which provides a standard way of downloading, configuring, compiling, running and monitoring different models, coupled ESMs and stand-alone models alike, on a variety of high-performance computing (HPC) systems. (ESM-Tools is equally applicable and helpful for stand-alone and for coupled models; in fact, it is used as the standard compile and runtime infrastructure for FESOM2 and is currently also applied to ECHAM and ICON stand-alone simulations. As coupled ESMs are technically the more challenging task, we focus on coupled setups, always implying that stand-alone models benefit in the same way.) With ESM-Tools, the user is only required to provide a short script containing only the experiment-specific definitions, while the software executes all phases of a simulation in the correct order. The software, which is well documented and easy to install and use, currently supports four ocean models, three atmosphere models, two biogeochemistry models, an ice sheet model, an isostatic adjustment model, a hydrology model and a land-surface model. ESM-Tools has recently been entirely re-coded in a high-level programming language (Python) and now provides researchers with an even more user-friendly interface for Earth system modelling. ESM-Tools was developed within the framework of the Advanced Earth System Model Capacity project, supported by the Helmholtz Association.


2018 ◽  
Author(s):  
LM Simon ◽  
S Karg ◽  
AJ Westermann ◽  
M Engel ◽  
AHA Elbehery ◽  
...  

Abstract. Background: With the advent of the age of big data in bioinformatics, large volumes of data and high-performance computing power enable researchers to perform re-analyses of publicly available datasets at an unprecedented scale. Ever more studies implicate the microbiome in both normal human physiology and a wide range of diseases. RNA sequencing technology (RNA-seq) is commonly used to infer global eukaryotic gene expression patterns under defined conditions, including human disease-related contexts, but its generic nature also enables the detection of microbial and viral transcripts. Findings: We developed a bioinformatic pipeline to screen existing human RNA-seq datasets for the presence of microbial and viral reads by re-inspecting the non-human-mapping read fraction. We validated this approach by recapitulating outcomes from six independent controlled infection experiments in cell line models and by comparison with an alternative metatranscriptomic mapping strategy. We then applied the pipeline to close to 150 terabytes of publicly available raw RNA-seq data from >17,000 samples from >400 studies relevant to human disease, using state-of-the-art high-performance computing systems. The resulting data of this large-scale re-analysis are made available in the presented MetaMap resource. Conclusions: Our results demonstrate that common human RNA-seq data, including those archived in public repositories, might contain valuable information to correlate microbial and viral detection patterns with diverse diseases. The presented MetaMap database thus provides a rich resource for hypothesis generation towards the role of the microbiome in human disease.
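
The first step of such a screen, isolating the read fraction that does not map to the human reference, can be sketched as below. This is a hedged illustration, not the MetaMap pipeline itself; the BAM file name and the output path are placeholders.

```python
# Hedged sketch (not the MetaMap pipeline): extract reads that did not map to the
# human reference from an aligned BAM file, so they can be screened against
# microbial and viral references afterwards. File names are placeholders.
import pysam

with pysam.AlignmentFile("human_aligned.bam", "rb") as bam, \
        open("nonhuman_reads.fasta", "w") as fasta:
    # until_eof=True also yields unmapped reads, which carry no mapping coordinate
    for read in bam.fetch(until_eof=True):
        if read.is_unmapped and read.query_sequence:
            fasta.write(f">{read.query_name}\n{read.query_sequence}\n")
```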


2021 ◽  
Vol 14 (6) ◽  
pp. 4051-4067
Author(s):  
Dirk Barbi ◽  
Nadine Wieters ◽  
Paul Gierz ◽  
Miguel Andrés-Martínez ◽  
Deniz Ural ◽  
...  

Abstract. Earth system and climate modelling involves the simulation of processes on a wide range of scales and within and across various compartments of the Earth system. In practice, component models are often developed independently by different research groups, adapted by others to their special interests and then combined using dedicated coupling software. This procedure not only leads to a strongly growing number of available versions of model components and coupled setups but also to model- and high-performance computing (HPC)-system-dependent ways of obtaining, configuring, building and operating them. Therefore, implementing these Earth system models (ESMs) can be challenging and extremely time-consuming, especially for less experienced modellers or scientists aiming to use different ESMs, as in the case of intercomparison projects. To assist researchers and modellers by reducing avoidable complexity, we developed the ESM-Tools software, which provides a standard way of downloading, configuring, compiling, running and monitoring different models on a variety of HPC systems. It should be noted that ESM-Tools is not coupling software itself but a workflow and infrastructure management tool that provides access to, and increases the usability of, already existing components and coupled setups. As coupled ESMs are technically the more challenging task, we focus on coupled setups, always implying that stand-alone models benefit in the same way. With ESM-Tools, the user is only required to provide a short script containing only the experiment-specific definitions, while the software executes all phases of a simulation in the correct order. The software, which is well documented and easy to install and use, currently supports four ocean models, three atmosphere models, two biogeochemistry models, an ice sheet model, an isostatic adjustment model, a hydrology model and a land-surface model. Compared to previous versions, ESM-Tools has lately been entirely recoded in a high-level programming language (Python) and provides researchers with an even more user-friendly interface for Earth system modelling. ESM-Tools was developed within the framework of the Advanced Earth System Model Capacity project, supported by the Helmholtz Association.


Geosciences ◽  
2021 ◽  
Vol 11 (2) ◽  
pp. 72
Author(s):  
Muhammad Rizwan Riaz ◽  
Hiroki Motoyama ◽  
Muneo Hori

Recent achievements of research on soil-structure interaction (SSI) are reviewed, with a main focus on numerical analysis. The review is based on continuum mechanics theory and the use of high-performance computing (HPC), and it clarifies the characteristics of a wide range of SSI treatments, from simplified models to high-fidelity models. It is emphasized that all these treatments can be regarded as results of mathematical approximations in solving a physical continuum mechanics problem of a soil-structure system. The use of HPC is inevitable if solutions of higher accuracy and finer resolution are needed. An example of using HPC for the analysis of SSI is presented.


2013 ◽  
Vol 98 ◽  
pp. 131-135 ◽  
Author(s):  
Jean-André Vital ◽  
Michael Gaurut ◽  
Romain Lardy ◽  
Nicolas Viovy ◽  
Jean-François Soussana ◽  
...  

2021 ◽  
Author(s):  
Maria Moreno de Castro ◽  
Marco Kulüke ◽  
Fabian Wachsmann ◽  
Regina Kwee-Hinzmann ◽  
Stephan Kindermann ◽  
...  

Tired of downloading tons of model results? Is your internet connection flaky? Are you about to overload your computer's memory with the constant increase of data volume, and do you need more computing resources? You can request free-of-charge computing time at one of the supercomputers of the Infrastructure for the European Network for Earth System Modelling (IS-ENES) [1], the European part of the Earth System Grid Federation (ESGF) [2], which also hosts and maintains more than 6 petabytes of CMIP6 and CORDEX data.

Thanks to this new EU Commission funded service, you can run your own scripts in your favorite programming language and straightforwardly pre- and post-process model data. There is no need for heavy data transfer: just load, with one line of code, the data slice you need, because your script will directly access the data pool. Calculations that would otherwise take days are therefore done in seconds. You can test the service; we very easily provide pre-access activities.

In this session we will run Jupyter notebooks directly on the German Climate Computing Center (DKRZ) [3], one of the ENES high-performance computing centers and an ESGF data center, showing how to load, filter, concatenate, average, and plot several CMIP6 models to compare their results, use CMIP6 models to calculate climate indices for any location and period, and evaluate model skill against observational data. We will use Climate Data Operators (cdo) [4] and Python packages for big data manipulation, such as Intake [5], to easily extract the data from the huge catalog, and Xarray [6], to easily read NetCDF files and scale to parallel computing. We are continuously creating more use cases for multi-model evaluation, mechanisms of variability, and impact analysis; visit the demos, find more information, and apply here: https://portal.enes.org/data/data-metadata-service/analysis-platforms.

[1] https://is.enes.org/
[2] https://esgf.llnl.gov/
[3] https://www.dkrz.de/
[4] https://code.mpimet.mpg.de/projects/cdo/
[5] https://intake.readthedocs.io/en/latest/
[6] http://xarray.pydata.org/en/stable/
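
As a rough indication of what "load with one line of code the data slice you need" looks like in such a notebook, here is a minimal sketch assuming the intake-esm and Xarray packages are available on the HPC system; the catalogue file name, model names, and search keywords are placeholders rather than the actual DKRZ settings.

```python
# Minimal sketch, not the session's notebook: browse a CMIP6 intake-esm catalogue
# and lazily open the matching datasets with xarray/dask, so no bulk download is
# needed. The catalogue path and the search values below are illustrative placeholders.
import intake
import matplotlib.pyplot as plt

catalog_file = "cmip6_data_pool_catalog.json"   # placeholder: catalogue of the CMIP6 data pool
col = intake.open_esm_datastore(catalog_file)

# filter the catalogue: monthly near-surface air temperature, historical runs, one member
cat = col.search(variable_id="tas", table_id="Amon", experiment_id="historical",
                 member_id="r1i1p1f1",
                 source_id=["MPI-ESM1-2-HR", "AWI-CM-1-1-MR"])

# one lazily loaded (dask-backed) xarray dataset per model
dsets = cat.to_dataset_dict()

for name, ds in dsets.items():
    # quick, unweighted global annual mean as a first model comparison
    series = ds["tas"].mean(dim=["lat", "lon"]).groupby("time.year").mean().squeeze()
    series.plot(label=name)

plt.legend()
plt.show()
```

Because the datasets are dask-backed, the actual data are only read when the means are computed, which is what keeps the workflow feasible directly on the data pool.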

