Analyzing the I/O Performance of Post-Hoc Visualization of Huge Simulation Datasets on the K Computer

2017
Author(s):
Eduardo C. Inacio
Jorji Nonaka
Kenji Ono
Mario A. R. Dantas

As computational science simulations produce ever-increasing volumes of data, executing part or even all of the visualization pipeline on the supercomputer side becomes more a requirement than an option. Given the uniqueness of the high-performance K computer architecture, the HIVE visualization framework was developed to meet the visualization and data analysis demands of scientists and engineers. In this paper, we present an analysis of the input/output (I/O) performance of post-hoc visualization. The contribution of this work is an analysis of a set of empirical case studies of huge simulation datasets using HIVE on the K computer. Results from the experiments, using a dataset produced by a real-world global climate simulation, provide detailed knowledge of how dataset partitioning parameters affect the I/O performance of large-scale visualization systems, and highlight challenges and opportunities for performance optimization.
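To make the kind of measurement described above concrete, the sketch below times parallel reads of a dataset split into different numbers of partition files and reports aggregate read bandwidth. It is a minimal illustration only, not HIVE or the K computer's I/O stack; the directory names, thread count, and file layout are hypothetical.

```python
# Minimal sketch (hypothetical layout): time parallel reads of a dataset that
# has been split into different numbers of partition files, to compare how
# partitioning granularity affects aggregate read bandwidth.
import os
import time
from concurrent.futures import ThreadPoolExecutor

def read_partition(path):
    """Read one partition file and return its size in bytes."""
    with open(path, "rb") as f:
        return len(f.read())

def measure_read_bandwidth(partition_paths, workers=8):
    """Return aggregate read bandwidth in MiB/s over all partition files."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        total_bytes = sum(pool.map(read_partition, partition_paths))
    elapsed = time.perf_counter() - start
    return (total_bytes / (1024 ** 2)) / elapsed

# Example: compare two hypothetical partitionings of the same dataset.
for layout in ("parts_64", "parts_1024"):
    paths = [os.path.join(layout, name) for name in sorted(os.listdir(layout))]
    print(layout, f"{measure_read_bandwidth(paths):.1f} MiB/s")
```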

1995
Vol 117 (1)
pp. 155-157
Author(s):
F. C. Anderson
J. M. Ziegler
M. G. Pandy
R. T. Whalen

We have examined the feasibility of using massively parallel and vector-processing supercomputers to solve large-scale optimization problems for human movement. Specifically, we compared the computational expense of determining the optimal controls for the single support phase of gait using a conventional serial machine (SGI Iris 4D25), a MIMD parallel machine (Intel iPSC/860), and a parallel-vector-processing machine (Cray Y-MP 8/864). With the human body modeled as a 14-degree-of-freedom linkage actuated by 46 musculotendinous units, computation of the optimal controls for gait could take up to 3 months of CPU time on the Iris. Both the Cray and the Intel are able to reduce this time to practical levels. The optimal solution for gait can be found with about 77 hours of CPU time on the Cray and about 88 hours on the Intel. Although the overall speeds of the Cray and the Intel were found to be similar, the unique capabilities of each machine are better suited to different portions of the computational algorithm used. The Intel was best suited to computing the derivatives of the performance criterion and the constraints, whereas the Cray was best suited to parameter optimization of the controls. These results suggest that the ideal computer architecture for solving very large-scale optimal control problems is a hybrid system in which a vector-processing machine is integrated into the communication network of a MIMD parallel machine.
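The split of work described above (derivative evaluation versus parameter optimization) hinges on the fact that finite-difference derivatives of the performance criterion are independent across control parameters and so map naturally onto a MIMD machine. The sketch below illustrates only that structure; the cost function, step size, worker count, and 46-element control vector are stand-ins, not the authors' gait model or code.

```python
# Minimal sketch (illustrative only): the derivative step evaluates the cost
# once per perturbed control parameter, and those evaluations are mutually
# independent, which is why this stage parallelizes well on a MIMD machine.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def cost(controls):
    """Hypothetical performance criterion; a real one would run a gait simulation."""
    return float(np.sum(controls ** 2) + np.sum(np.sin(controls)))

def _perturbed_cost(args):
    controls, i, h = args
    perturbed = controls.copy()
    perturbed[i] += h
    return cost(perturbed)

def finite_difference_gradient(controls, h=1e-6, workers=4):
    """Forward-difference gradient; each component is an independent cost evaluation."""
    base = cost(controls)
    tasks = [(controls, i, h) for i in range(controls.size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        perturbed = list(pool.map(_perturbed_cost, tasks))
    return (np.array(perturbed) - base) / h

if __name__ == "__main__":
    u = np.zeros(46)  # one control value per musculotendinous actuator, as in the model
    print(finite_difference_gradient(u)[:5])
```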


2017
Vol 10 (3)
pp. 1383-1402
Author(s):
Paolo Davini
Jost von Hardenberg
Susanna Corti
Hannah M. Christensen
Stephan Juricke
...

Abstract. The Climate SPHINX (Stochastic Physics HIgh resolutioN eXperiments) project is a comprehensive set of ensemble simulations aimed at evaluating the sensitivity of present and future climate to model resolution and stochastic parameterisation. The EC-Earth Earth system model is used to explore the impact of stochastic physics in a large ensemble of 30-year climate integrations at five different atmospheric horizontal resolutions (from 125 km down to 16 km). The project includes more than 120 simulations in both a historical scenario (1979–2008) and a climate change projection (2039–2068), together with coupled transient runs (1850–2100). A total of 20.4 million core hours have been used, made available through a single-year grant from PRACE (the Partnership for Advanced Computing in Europe), and close to 1.5 PB of output data have been produced on the SuperMUC IBM Petascale System at the Leibniz Supercomputing Centre (LRZ) in Garching, Germany. About 140 TB of post-processed data are stored on the CINECA supercomputing centre archives and are freely accessible to the community thanks to an EUDAT data pilot project. This paper presents the technical and scientific set-up of the experiments, including details of the forcing used for the simulations performed, defining the SPHINX v1.0 protocol. In addition, an overview of preliminary results is given. An improvement in the simulation of Euro-Atlantic atmospheric blocking with increasing resolution is observed. It is also shown that including stochastic parameterisation in the low-resolution runs helps to improve some aspects of the tropical climate, specifically the Madden–Julian Oscillation and tropical rainfall variability. These findings show the importance of representing the impact of small-scale processes on large-scale climate variability either explicitly (with high-resolution simulations) or stochastically (in low-resolution simulations).
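As a rough illustration of how such an experiment matrix can be enumerated, the sketch below combines the five resolutions, the two 30-year time slices, and the stochastic-physics switch. The intermediate grid spacings and the omission of per-configuration ensemble members are assumptions; the authoritative set-up is the SPHINX v1.0 protocol defined in the paper.

```python
# Minimal sketch (assumptions marked): enumerate base experiment configurations.
from itertools import product

resolutions_km = [125, 80, 40, 25, 16]  # 125 and 16 km from the paper; intermediates assumed
periods = [("historical", 1979, 2008), ("scenario", 2039, 2068)]
stochastic_physics = [False, True]

experiments = [
    {"resolution_km": r, "period": name, "start": start, "end": end, "stochastic_physics": stoch}
    for r, (name, start, end), stoch in product(resolutions_km, periods, stochastic_physics)
]
print(len(experiments), "base configurations (ensemble members per configuration not enumerated)")
```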


2020
Vol 54 (28)
pp. 4387-4395
Author(s):
Sanchi Arora
Abhijit Majumdar
Bhupendra Singh Butola

The beneficial effect of shear thickening fluid (STF) impregnation in enhancing the impact resistance of high-performance fabrics has been extensively reported in the literature. However, this work shows that fabric structure has a decisive role in moderating the effectiveness of STF impregnation in terms of impact energy absorption. Plain woven fabrics with sett varying from 25 × 25 inch⁻¹ to 55 × 55 inch⁻¹ were impregnated with STF at two different padding pressures to obtain different add-ons. The impact energy absorption of STF-impregnated loosely woven fabrics was found to be higher than that of their neat counterparts at both levels of add-on, while the opposite trend was observed for tightly woven fabrics. Further, a comparison of tightly woven plain, 2/2 twill, 3/1 twill and 2 × 2 matt fabrics revealed a beneficial effect of STF impregnation, except for the plain woven fabric, establishing that an interplay between fabric structure and STF impregnation tunes the impact resistance of woven fabrics.


2020
Author(s):
Paul Kim
Daniel Partridge
James Haywood

Global climate model (GCM) ensembles still produce a significant spread of estimates for the future of climate change, which hinders our ability to influence policymakers. The range of these estimates can only partly be explained by structural differences and varying choices of parameterisation schemes between GCMs. GCM representation of cloud and aerosol processes, more specifically aerosol microphysical properties, remains a key source of uncertainty contributing to the wide spread of climate change estimates. The radiative effect of aerosol is directly linked to the microphysical properties, and these are in turn controlled by aerosol source and sink processes during transport as well as meteorological conditions.

A Lagrangian, trajectory-based GCM evaluation framework, using spatially and temporally collocated aerosol diagnostics, has been applied to over a dozen GCMs via the AeroCom initiative. This framework is designed to isolate the source and sink processes that occur during the aerosol life cycle in order to improve the understanding of the impact of these processes on the simulated aerosol burden. Measurement station observations linked to reanalysis trajectories are then used to evaluate each GCM with respect to a quasi-observational standard to assess GCM skill. The AeroCom trajectory experiment specifies strict guidelines for modelling groups; all simulations have wind fields nudged to ERA-Interim reanalysis and all simulations use emissions from the same inventories. This ensures that the discrepancies between GCM parameterisations are emphasised and differences due to large-scale transport patterns, emissions and other external factors are minimised.

Preliminary results from the AeroCom trajectory experiment will be presented and discussed, some of which are summarised now. A comparison of GCM aerosol particle number size distributions against observations made by measurement stations in different environments will be shown, highlighting the difficulties that GCMs have in reproducing observed aerosol concentrations across all size ranges in pristine environments. The impact of precipitation during transport on aerosol microphysical properties in each GCM will be shown, and the implications this has for the resulting aerosol forcing estimates will be discussed. Results demonstrating the trajectory collocation framework will highlight its ability to give more accurate estimates of the key aerosol sources in GCMs and the importance of these sources in influencing modelled aerosol-cloud effects. In summary, it will be shown that this analysis approach enables us to better understand the drivers behind inter-model and model-observation discrepancies.
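A minimal sketch of the collocation idea follows: sample a gridded model aerosol field at the latitude, longitude, and time of each point along a reanalysis trajectory. It assumes a simple (time, lat, lon) array layout and nearest-neighbour matching, and is not the AeroCom tooling; the variable names, synthetic data, and 6-hourly output interval are illustrative.

```python
# Minimal sketch (assumed data layout): collocate a gridded model field with
# points along a back-trajectory by nearest-neighbour lookup in space and time.
# Longitude wrapping across 0/360 degrees is deliberately ignored here.
import numpy as np

def collocate_along_trajectory(field, lats, lons, times, traj):
    """field: array shaped (time, lat, lon); traj: iterable of (time, lat, lon)
    tuples along the trajectory. Returns one sampled model value per point."""
    samples = []
    for t, lat, lon in traj:
        ti = int(np.abs(times - t).argmin())   # nearest model time step
        yi = int(np.abs(lats - lat).argmin())  # nearest grid latitude
        xi = int(np.abs(lons - lon).argmin())  # nearest grid longitude
        samples.append(field[ti, yi, xi])
    return np.array(samples)

# Example with synthetic data: a 6-hourly field on a 2-degree grid.
times = np.arange(0, 120, 6.0)   # hours since trajectory start
lats = np.arange(-90, 91, 2.0)
lons = np.arange(0, 360, 2.0)
field = np.random.rand(times.size, lats.size, lons.size)
trajectory = [(0.0, 60.1, 10.2), (6.0, 58.7, 12.9), (12.0, 57.2, 15.4)]
print(collocate_along_trajectory(field, lats, lons, times, trajectory))
```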


Author(s):  
Gordon Bell
David H Bailey
Jack Dongarra
Alan H Karp
Kevin Walsh

The Gordon Bell Prize is awarded each year by the Association for Computing Machinery to recognize outstanding achievement in high-performance computing (HPC). The purpose of the award is to track the progress of parallel computing with particular emphasis on rewarding innovation in applying HPC to applications in science, engineering, and large-scale data analytics. Prizes may be awarded for peak performance or special achievements in scalability and time-to-solution on important science and engineering problems. Financial support for the US$10,000 award is provided through an endowment by Gordon Bell, a pioneer in high-performance and parallel computing. This article examines the evolution of the Gordon Bell Prize and the impact it has had on the field.


The paper examines the impact of resource-based capability and competitive strategy on the performance of high-tech new ventures in Kerala. Resource-based capability was analyzed using the variables managerial capability, technical capability, marketing capability and input sourcing capability of the firm. Competitive strategy was measured by looking into the cost strategy, quality strategy, innovation strategy, and customization strategy adopted by the firm. A descriptive research design was used to analyze the performance of the ventures. A survey method was administered to collect data from 83 high-tech startups in Kerala. Independent-sample t-tests, multiple regression and cluster analysis were used to arrive at a conclusion. The findings indicate that both resource-based capability and competitive strategy influence the performance of a venture. It was found that the entrepreneurs of profitable ventures exhibit better managerial capability. The results of the cluster analysis show that entrepreneurs with high performance indicators are customer centric. The innovation strategy and marketing capability adopted by the firm have an impact on venture performance. The findings suggest that sales, profitability and financial position can be predicted to an extent by the resource-based capability and competitive strategy adopted by the venture. The research work can be used by emerging entrepreneurs, policymakers and incubation managers to develop a framework to improve the performance of startups.


Author(s):  
Linyuan Guo

China, the developing country with the largest and oldest public education system, is transforming its education system through a nation-wide curriculum reform. This large-scale curriculum change signifies China's complex and multi-dimensional processes and endeavors in empowering its educational system to meet the challenges and opportunities of the era of globalization. This paper reports on an interpretive case study with a particular interest in understanding the impact of the nation-wide curriculum reform on teachers in urban areas. Findings from this study present the complex dimensions of teachers' lived experiences during this dramatic education change and offer new insights into the current teaching profession in urban China.


Significance: COP22 has been dubbed "the COP of action, adaptation and Africa". It is a key opportunity to build confidence in the system of global cooperation adopted at the Paris Climate Conference. The Paris meeting ushered in a new framework for cooperation on climate change based on voluntary emissions reduction targets that will be jointly reviewed every five years. Negotiators gathering in Marrakech for COP22 face the task of making the Paris Agreement work, and of delivering results on a sufficiently large scale.

Impacts: Cooperation under the Paris framework will help reduce climate change effects, though overshooting of the 2-degree target is inevitable. The Paris deal's reliance on peer pressure and self-policing risks national-level backsliding during the implementation process. Actions taken in the next ten years will determine the impact of climate change on global growth prospects for the whole of this century.


Materials
2019
Vol 12 (24)
pp. 4161
Author(s):
Vincenzo Tagliaferri
Federica Trovalusci
Stefano Guarino
Simone Venettacci

In this study, the authors present a comparative analysis of different additive manufacturing (AM) technologies for high-performance components. Four 3D printers, currently available on the Italian national manufacturing market and belonging to three different AM technologies, were considered. The analysis focused on technical aspects to highlight the characteristics and performance limits of each technology, economic aspects to allow for an assessment of the costs associated with the different processes, and environmental aspects to focus on the impact of the production cycles associated with these technologies on the ecosystem, resources and human health. This study highlighted the current limits of additive manufacturing technologies in terms of production capacity in the case of large-scale production of plastic components, especially large ones. At the same time, this study highlights how the geometry of the object to be developed greatly influences the optimal choice between the various AM technologies, in both technological and economic terms. Fused deposition modeling (FDM) is the technology that exhibits the greatest limitations hindering mass production due to production times and costs, but also due to the associated environmental impact.


2019
Vol 76 (6)
pp. 1524-1542
Author(s):
Melissa A Haltuch
Z Teresa A’mar
Nicholas A Bond
Juan L Valero

Abstract US West Coast sablefish are economically valuable, with landings of 11.8 million pounds valued at over $31 million during 2016, making assessing and understanding the impact of climate change on the California Current (CC) stock a priority for (1) forecasting future stock productivity, and (2) testing the robustness of management strategies to climate impacts. Sablefish recruitment is related to large-scale climate forcing indexed by regionally correlated sea level (SL) and zooplankton communities that pelagic young-of-the-year sablefish feed upon. This study forecasts trends in future sablefish productivity using SL from Global Climate Models (GCMs) and explores the robustness of harvest control rules (HCRs) to climate driven changes in recruitment using management strategy evaluation (MSE). Future sablefish recruitment is likely to be similar to historical recruitment but may be less variable. Most GCMs suggest that decadal SL trends result in recruitments persisting at lower levels through about 2040 followed by higher levels that are more favorable for sablefish recruitment through 2060. Although this MSE suggests that spawning biomass and catches will decline, and then stabilize, into the future under both HCRs, the sablefish stock does not fall below the stock size that leads to fishery closures.
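For readers unfamiliar with harvest control rules, the sketch below implements a generic ramped HCR of the kind commonly exercised in such management strategy evaluations: the full target fishing rate applies above an upper biomass reference point, ramps down linearly below it, and reaches zero at a lower threshold where the fishery closes. The reference-point fractions and the numbers in the example are placeholders, not the values used in the sablefish assessment.

```python
# Minimal sketch (illustrative, not the assessment code): a generic ramped
# harvest control rule returning the prescribed fishing mortality rate.
def ramped_hcr(spawning_biomass, b_unfished, f_target,
               upper_frac=0.40, lower_frac=0.10):
    """Return the fishing mortality rate prescribed by the control rule."""
    upper = upper_frac * b_unfished   # full target fishing rate above this level
    lower = lower_frac * b_unfished   # fishery closed below this level
    if spawning_biomass >= upper:
        return f_target
    if spawning_biomass <= lower:
        return 0.0
    # Linear ramp between the lower and upper reference points.
    return f_target * (spawning_biomass - lower) / (upper - lower)

# Example: stock status relative to a hypothetical unfished biomass of 100 kt.
for b in (50.0, 30.0, 8.0):
    print(b, ramped_hcr(b, b_unfished=100.0, f_target=0.2))
```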

