Tuned Forest Fire Prediction: Static Calibration of the Evolutionary Component of ‘ESS’

2014 ◽  
Vol 17 (2) ◽  
Author(s):  
Germán Bianchini ◽  
Paola Caymes Scutari

Forest fires are a major risk factor with a strong impact at the eco-environmental and socio-economical levels, which makes their study and modeling very important. However, the models frequently carry a certain level of uncertainty in some input parameters, which must be approximated or estimated because the conditions of the phenomenon are difficult to measure accurately in real time. This has led to the development of several methods for uncertainty reduction, whose trade-off between accuracy and complexity can vary significantly. ESS (Evolutionary-Statistical System) is a method that aims to reduce this uncertainty by combining Statistical Analysis, High Performance Computing (HPC) and Parallel Evolutionary Algorithms (PEAs). The PEAs use several parameters that require adjustment and that determine the quality of the results. Calibrating these parameters is therefore crucial for reaching good performance and improving the system output. This paper presents an empirical study of parameter tuning to evaluate the effectiveness of different configurations and the impact of their use on forest fire prediction.
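The paper's calibration is specific to ESS, but the general idea of statically scoring a small grid of evolutionary-algorithm configurations can be sketched generically. The snippet below is a minimal illustration, not the authors' code: `prediction_error` is a hypothetical stand-in for running the parallel evolutionary prediction and comparing it against an observed fire front, and the parameter ranges are assumptions chosen for illustration.

```python
# Minimal sketch (not the authors' ESS code): static calibration of
# evolutionary-algorithm parameters by scoring a small grid of
# configurations against a reference fire-spread observation.
import itertools
import random

def prediction_error(population_size, mutation_rate, crossover_rate, trials=5):
    """Hypothetical stand-in for running the PEA-based prediction and
    comparing it against the observed fire front; returns a mean error."""
    random.seed(42)
    errors = []
    for _ in range(trials):
        # Placeholder model: lower error for larger populations and
        # moderate mutation/crossover rates, plus some noise.
        noise = random.uniform(0.0, 0.1)
        errors.append(abs(mutation_rate - 0.05) + abs(crossover_rate - 0.7)
                      + 1.0 / population_size + noise)
    return sum(errors) / len(errors)

def calibrate():
    grid = itertools.product([50, 100, 200],      # population size
                             [0.01, 0.05, 0.10],  # mutation rate
                             [0.5, 0.7, 0.9])     # crossover rate
    return min(grid, key=lambda cfg: prediction_error(*cfg))

if __name__ == "__main__":
    print("Best static configuration (pop, mut, cx):", calibrate())
```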

Circulation ◽  
2015 ◽  
Vol 132 (suppl_3) ◽  
Author(s):  
Christie L Mulvey ◽  
Sally J Rudy ◽  
David L Rodgers ◽  
Tammi J Bortner ◽  
Elizabeth H Sinz ◽  
...  

Introduction: Prompting devices for chest compressions have been advocated as a means of improving CPR quality in previous AHA guidelines. Studies have shown that overall CPR quality improves with the use of these devices. Hypothesis: This study compared the impact of prompting devices on providers with varying levels of experience and proficiency. Methods: A convenience sample of 53 subjects with varying degrees of CPR experience, ranging from zero to frequent opportunities to perform CPR, was enrolled. Using a skills-recording CPR manikin, data on each subject’s chest compression performance were obtained. All subjects performed an initial one-minute cycle of continuous chest compressions with no prompting device. After a brief rest, subjects were randomized to use one of two CPR prompting devices (Philips MRX with Q-CPR or Laerdal Medical CPRmeter). An additional one minute of CPR was conducted with the first device. Subjects were then crossed over to use the other prompting device after another brief rest. Results: Across the entire group, nearly all parameters significantly improved with the prompting devices, confirming previous studies on the efficacy of CPR prompting devices. However, when subjects’ results were examined by breaking the group into three performance levels (high, medium and low) based on the Overall CPR Score generated by the manikin software, there were differences in performance. Paired t-tests were conducted on the low- and high-performance groups. The low-level group significantly improved across 7 of 8 variables with both devices. The high-level group had only minor changes from baseline (both positive and negative) in most variables, but showed a significant or near-significant decrease in proficiency in one variable, percent of correctly released compressions (p = 0.011 for the Philips device; p = 0.052 for the Laerdal device). Conclusions: CPR prompting devices improve the overall quality of chest compressions. Individuals with existing high-performance CPR skills could be distracted by the device, reducing the quality of compressions compared to using no device. When a CPR prompting device is introduced into a health care system, all providers, especially high performers, require practice with the device in order to acclimate to its use.
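The paired analysis described above can be illustrated with a short sketch. The numbers below are fabricated placeholders, not study data; the snippet only shows the shape of the within-subject comparison (baseline versus prompted compressions) using SciPy's paired t-test.

```python
# Illustrative sketch only (fabricated numbers, not study data): the paired
# comparison described above, baseline compressions vs. prompted compressions
# for the same subjects, using a paired t-test.
from scipy import stats

# Percent of correctly released compressions for a hypothetical
# high-performance subgroup, without and then with a prompting device.
baseline    = [98, 97, 99, 96, 98, 97, 99, 98]
with_device = [93, 95, 96, 94, 92, 95, 96, 93]

t_stat, p_value = stats.ttest_rel(baseline, with_device)
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
```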


Author(s):  
Qiang Guan ◽  
Nathan DeBardeleben ◽  
Sean Blanchard ◽  
Song Fu ◽  
Claude H. Davis IV ◽  
...  

As the high performance computing (HPC) community continues to push towards exascale computing, today's HPC applications are affected by soft errors only to a small degree, but we expect this to become a more serious issue as HPC systems grow. We propose F-SEFI, a Fine-grained Soft Error Fault Injector, as a tool for profiling software robustness against soft errors. We use soft error injection to mimic the impact of errors on logic circuit behavior. Leveraging the open source virtual machine hypervisor QEMU, F-SEFI enables users to modify emulated machine instructions to introduce soft errors. F-SEFI can control which application and which sub-function to target, as well as when and how to inject soft errors at different granularities, without interfering with other applications that share the same environment. We demonstrate use cases of F-SEFI on several benchmark applications with different characteristics to show how data corruption can propagate to incorrect results. The findings from the fault injection campaign can be used for designing robust software and power-efficient hardware.
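F-SEFI itself flips bits in emulated machine instructions inside QEMU, which is beyond a short snippet. The sketch below is only a user-level illustration of the underlying idea: a single-bit soft error injected into one floating-point operand of a computation, showing how the corruption propagates to an incorrect result. All names and values are illustrative assumptions.

```python
# Minimal illustration (not F-SEFI itself, which injects faults into emulated
# instructions inside QEMU): flip a single bit of a 64-bit float and observe
# how the error propagates through a computation.
import struct

def flip_bit(value: float, bit: int) -> float:
    """Flip one bit of the IEEE-754 representation of `value`."""
    packed = struct.unpack("<Q", struct.pack("<d", value))[0]
    return struct.unpack("<d", struct.pack("<Q", packed ^ (1 << bit)))[0]

def dot(a, b, faulty_index=None, faulty_bit=None):
    """Dot product with an optional injected fault in one operand."""
    total = 0.0
    for i, (x, y) in enumerate(zip(a, b)):
        if i == faulty_index:
            x = flip_bit(x, faulty_bit)  # simulated soft error
        total += x * y
    return total

a = [1.0, 2.0, 3.0, 4.0]
b = [0.5, 0.25, 0.125, 0.0625]
golden = dot(a, b)
corrupted = dot(a, b, faulty_index=2, faulty_bit=62)  # flip a high exponent bit
print("golden:", golden, "corrupted:", corrupted)
```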


Energies ◽  
2019 ◽  
Vol 12 (11) ◽  
pp. 2129 ◽  
Author(s):  
Alberto Cocaña-Fernández ◽  
Emilio San José Guiote ◽  
Luciano Sánchez ◽  
José Ranilla

High Performance Computing Clusters (HPCCs) are common platforms for solving both up-to-date challenges and high-dimensional problems faced by IT service providers. Nonetheless, the use of HPCCs carries a substantial and growing economic and environmental impact, owing to the large amount of energy they need to operate. In this paper, a two-stage holistic optimisation mechanism is proposed to manage HPCCs in an eco-efficient manner. The first stage logically optimises the resources of the HPCC through reactive and proactive strategies, while the second stage optimises hardware allocation by leveraging a genetic fuzzy system tailored to the underlying equipment. The model finds optimal trade-offs among quality of service, direct/indirect operating costs, and environmental impact through multiobjective evolutionary algorithms that meet the preferences of the administrator. Experiments were conducted using both actual workloads from the Scientific Modelling Cluster of the University of Oviedo and synthetically generated workloads, yielding statistical evidence that supports the adoption of the new mechanism.
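The paper's optimiser is a genetic fuzzy system tailored to the cluster's hardware; the sketch below is not that system, only a minimal illustration of the multiobjective trade-off it navigates: filtering hypothetical cluster operating points to a Pareto front over quality-of-service penalty, operating cost, and environmental impact.

```python
# Hedged sketch (not the paper's genetic fuzzy system): keep only the
# Pareto-optimal cluster configurations when each candidate is scored on
# QoS penalty, operating cost, and CO2 emissions (all to be minimised).
from typing import List, Tuple

Config = Tuple[float, float, float]  # (qos_penalty, cost_eur, co2_kg)

def dominates(a: Config, b: Config) -> bool:
    """a dominates b if it is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates: List[Config]) -> List[Config]:
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates if other != c)]

# Hypothetical candidate operating points for the cluster.
candidates = [(0.10, 120.0, 40.0), (0.05, 150.0, 55.0),
              (0.20, 90.0, 30.0), (0.10, 140.0, 50.0)]
print(pareto_front(candidates))  # the dominated point (0.10, 140.0, 50.0) is dropped
```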


Author(s):  
Gordon Bell ◽  
David H Bailey ◽  
Jack Dongarra ◽  
Alan H Karp ◽  
Kevin Walsh

The Gordon Bell Prize is awarded each year by the Association for Computing Machinery to recognize outstanding achievement in high-performance computing (HPC). The purpose of the award is to track the progress of parallel computing with particular emphasis on rewarding innovation in applying HPC to applications in science, engineering, and large-scale data analytics. Prizes may be awarded for peak performance or special achievements in scalability and time-to-solution on important science and engineering problems. Financial support for the US$10,000 award is provided through an endowment by Gordon Bell, a pioneer in high-performance and parallel computing. This article examines the evolution of the Gordon Bell Prize and the impact it has had on the field.


2014 ◽  
Vol 22 (2) ◽  
pp. 141-155 ◽  
Author(s):  
Daniel Laney ◽  
Steven Langer ◽  
Christopher Weber ◽  
Peter Lindstrom ◽  
Al Wegener

This paper examines whether lossy compression can be used effectively in physics simulations as a possible strategy to combat the expected data-movement bottleneck in future high performance computing architectures. We show that, for the codes and simulations we tested, compression levels of 3–5X can be applied without causing significant changes to important physical quantities. Rather than applying signal processing error metrics, we utilize physics-based metrics appropriate for each code to assess the impact of compression. We evaluate three different simulation codes: a Lagrangian shock-hydrodynamics code, an Eulerian higher-order hydrodynamics turbulence modeling code, and an Eulerian coupled laser-plasma interaction code. We compress relevant quantities after each time-step to approximate the effects of tightly coupled compression and study the compression rates to estimate memory and disk-bandwidth reduction. We find that the error characteristics of compression algorithms must be carefully considered in the context of the underlying physics being modeled.
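As a rough illustration of this workflow (not the paper's codes or compressor), the sketch below applies a crude lossy step, truncating mantissa bits of a field after a time-step, and then judges the loss with a physics-based metric (relative change in a conserved total) rather than a generic signal-processing norm. The field, bit budget, and metric are assumptions chosen for illustration.

```python
# Simplified sketch: lossy-compress a field by truncating mantissa bits,
# then assess the damage with a physics-based metric (relative change in
# the field's total) instead of a generic error norm.
import numpy as np

def truncate_mantissa(field: np.ndarray, keep_bits: int) -> np.ndarray:
    """Return a copy of a float64 field with low-order mantissa bits zeroed."""
    bits = field.view(np.uint64)
    mask = np.uint64(~((1 << (52 - keep_bits)) - 1) & 0xFFFFFFFFFFFFFFFF)
    return (bits & mask).view(np.float64)

rng = np.random.default_rng(0)
energy_density = rng.uniform(1.0, 2.0, size=(64, 64))   # hypothetical field
compressed = truncate_mantissa(energy_density, keep_bits=16)

total_before = energy_density.sum()
total_after = compressed.sum()
rel_error = abs(total_after - total_before) / total_before
print(f"relative error in total energy after compression: {rel_error:.2e}")
```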


Author(s):  
Albert Sesé

Abstract Evidence generation in the current Social and Health Sciences faces important barriers that undermine the credibility of scientific products. Information and communication technologies have a strong impact on social relationships in our postmodern societies. The spread of post-truth in this context is generating a pernicious relativism, far removed from any verification of the veracity of information. The aim of this paper is to analyze and discuss the challenges facing research methods and statistical models, specifically in psychological research, taking into account the impact of novel techniques such as big data and virtual reality. Special attention is also devoted to the statistical shortcomings of psychological research and to the reproducibility problem. Finally, some potential solutions are proposed to improve the quality of scientific evidence.


2019 ◽  
Vol 27 (3) ◽  
pp. 263-267
Author(s):  
Alexander S. Ayriyan

In this note we discuss the impact of the development of parallel computing architecture and technology on the typical life cycle of a computational experiment. In particular, it is argued that the development and installation of high-performance computing systems is important in itself, regardless of specific scientific tasks, since the presence of cutting-edge HPC systems within an academic infrastructure opens up broad possibilities and stimulates new research.

