Improving Scalability of Autonomic Systems: The Frequency-Aware Search Approach

Author(s):  
Pedro Fonseca ◽  
Hugo Miranda


Author(s):
Apangshu Das ◽  
Sambhu Nath Pradhan

Background: Output polarity selection for sub-functions is generally used to reduce the area and power of a circuit at the two-level realization. Along with area and power, power density is another significant parameter that needs to be considered, because power density translates directly into circuit temperature: more than 50% of modern-day integrated circuits are damaged by excessive overheating. Methods: This work demonstrates the impact of power-density-aware logic synthesis, in the form of suitable polarity selection for the sub-functions of Programmable Logic Arrays (PLAs) and their multi-level realization, on temperature reduction. Two-level PLA optimization using output polarity selection is considered first and compared with existing techniques; an And-Inverter Graph (AIG) based multi-level realization is then used to overcome the redundant solutions generated in two-level synthesis. AIG nodes and the associated power dissipation are reduced through rewriting, refactoring, and balancing. Node reduction lowers area but, on the contrary, increases the power and power density of the circuit. A meta-heuristic search approach, the Non-dominated Sorting Genetic Algorithm II (NSGA-II), is therefore proposed to select suitable output polarities of the PLA sub-functions for an optimal realization. Results: The best power-density-based solution saves up to 8.29% power density compared with ‘espresso–dopo’ based solutions. Around 9.57% savings in area and 9.67% savings in power (switching activity) are obtained with respect to the ‘espresso’ based solution using NSGA-II. Conclusion: The circuit realized with the selected output polarities is converted into a multi-level AIG structure and synthesized to overcome the redundancy of the two-level circuit. It is observed that the temperature of a circuit increases with its power density.
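To make the search concrete, below is a minimal NSGA-II sketch in Python for selecting output polarities. Everything here is illustrative: the chromosome is one polarity bit per sub-function, and the two objectives are toy stand-ins for the area and power-density estimates that a real synthesis flow (e.g., espresso plus an AIG optimizer) would supply.

```python
import random

N_BITS = 8  # hypothetical: one output-polarity bit per PLA sub-function
POP, GENS = 40, 50

def objectives(bits):
    # Toy, deliberately conflicting stand-ins for (area, power density);
    # a real flow would query the synthesis tool for these values.
    area = sum(bits)
    pdens = (N_BITS - area) + 0.5 * sum(bits[i] ^ bits[i - 1] for i in range(1, N_BITS))
    return area, pdens

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def fronts(objs):
    # Simple O(n^2) non-dominated sorting into successive Pareto fronts.
    remaining, out = set(range(len(objs))), []
    while remaining:
        f = [i for i in remaining
             if not any(dominates(objs[j], objs[i]) for j in remaining if j != i)]
        out.append(f)
        remaining -= set(f)
    return out

def crowding(front, objs):
    # Crowding distance keeps each front spread out during selection.
    dist = {i: 0.0 for i in front}
    for m in range(2):
        srt = sorted(front, key=lambda i: objs[i][m])
        dist[srt[0]] = dist[srt[-1]] = float("inf")
        span = (objs[srt[-1]][m] - objs[srt[0]][m]) or 1.0
        for k in range(1, len(srt) - 1):
            dist[srt[k]] += (objs[srt[k + 1]][m] - objs[srt[k - 1]][m]) / span
    return dist

pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP)]
for _ in range(GENS):
    kids = []
    while len(kids) < POP:  # one-point crossover + bit-flip mutation
        p1, p2 = random.sample(pop, 2)
        cut = random.randrange(1, N_BITS)
        kid = p1[:cut] + p2[cut:]
        if random.random() < 0.2:
            kid[random.randrange(N_BITS)] ^= 1
        kids.append(kid)
    union = pop + kids
    objs = [objectives(ind) for ind in union]
    pop = []  # refill by front rank, breaking ties by crowding distance
    for f in fronts(objs):
        if len(pop) + len(f) <= POP:
            pop += [union[i] for i in f]
        else:
            d = crowding(f, objs)
            pop += [union[i] for i in sorted(f, key=lambda i: -d[i])[:POP - len(pop)]]
            break

objs = [objectives(ind) for ind in pop]
for i in fronts(objs)[0]:
    print(pop[i], "-> area=%.1f  power density=%.1f" % objs[i])
```

The Pareto front printed at the end is the set of polarity assignments among which a designer would pick an area/power-density trade-off; the paper's flow additionally feeds the chosen circuit into AIG rewriting, refactoring, and balancing.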


Energies ◽  
2021 ◽  
Vol 14 (4) ◽  
pp. 857
Author(s):  
Jahedul Islam ◽  
Md Shokor A. Rahaman ◽  
Pandian M. Vasant ◽  
Berihun Mamo Negash ◽  
Ahshanul Hoqe ◽  
...  

Well placement optimization is considered a non-convex and highly multimodal optimization problem. In this article, a modified crow search algorithm is proposed to tackle it. The modifications are based on local search and niching techniques within the crow search algorithm (CSA). The suggested approach is first verified on benchmark functions, where it demonstrated a higher convergence rate and better solutions. Its performance is then evaluated on the well placement optimization problem and compared with Particle Swarm Optimization (PSO), the Gravitational Search Algorithm (GSA), and the baseline CSA. The outcomes of the study reveal that the niching crow search algorithm is the most efficient and effective of the compared techniques.
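For readers unfamiliar with CSA, the following is a compact Python sketch of the baseline algorithm on a benchmark (sphere) function. The niching and local-search modifications are the article's contribution and are not reproduced here; only the standard awareness-probability/flight-length dynamics are shown.

```python
import random

def sphere(x):
    # Benchmark objective: global minimum 0 at the origin.
    return sum(v * v for v in x)

def csa(obj, dim=10, n_crows=20, iters=500, fl=2.0, ap=0.1, bound=5.0):
    # Baseline Crow Search Algorithm: fl = flight length, ap = awareness probability.
    pos = [[random.uniform(-bound, bound) for _ in range(dim)] for _ in range(n_crows)]
    mem = [p[:] for p in pos]          # each crow's memorized best position
    fit = [obj(m) for m in mem]
    for _ in range(iters):
        for i in range(n_crows):
            j = random.randrange(n_crows)  # crow i tails a random crow j
            if random.random() >= ap:
                # j is unaware: move toward j's memorized hiding place.
                r = random.random()
                new = [pos[i][d] + r * fl * (mem[j][d] - pos[i][d]) for d in range(dim)]
            else:
                # j noticed the pursuit: i flies to a random position.
                new = [random.uniform(-bound, bound) for _ in range(dim)]
            if all(-bound <= v <= bound for v in new):  # feasibility check
                pos[i] = new
                f = obj(new)
                if f < fit[i]:                          # update memory on improvement
                    mem[i], fit[i] = new[:], f
    best = min(range(n_crows), key=lambda i: fit[i])
    return mem[best], fit[best]

x_best, f_best = csa(sphere)
print("best fitness:", f_best)
```

A niching variant would, roughly speaking, restrict which crows may be followed to those within the same neighborhood, preserving multiple basins of attraction in a multimodal landscape such as well placement.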


2021 ◽  
Vol 11 (7) ◽  
pp. 2962
Author(s):  
Mohamadreza Afrasiabi ◽  
Christof Lüthi ◽  
Markus Bambach ◽  
Konrad Wegener

This paper presents an efficient mesoscale simulation of a Laser Powder Bed Fusion (LPBF) process using the Smoothed Particle Hydrodynamics (SPH) method. The efficiency lies in reducing the computational effort via spatial adaptivity, for which a dynamic particle refinement pattern with an optimized neighbor-search algorithm is used. The melt pool dynamics are modeled by resolving the thermal, mechanical, and material fields in a single laser track application. After validating the solver with two benchmark tests for which analytical and experimental data are available, we simulate a single-track LPBF process by adopting SPH at multiple resolutions. The LPBF simulation results show that the proposed adaptive refinement saves almost 50% of the SPH calculation time with the optimized neighbor search and 35% without it. This achievement opens up several opportunities for parametric studies and for running high-resolution models with less computational effort.
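As a point of reference for the neighbor-search component, the sketch below shows a standard cell-list (uniform background grid) neighbor search, the usual way SPH codes avoid the naive O(N²) pair test. It is written in Python for illustration only; the paper's optimized, multi-resolution variant for dynamically refined particles is more involved.

```python
from collections import defaultdict
import random

def cell_list_pairs(points, h):
    # Bin 2D particles into square cells of edge h (the smoothing radius),
    # then test only the 3x3 block of cells around each particle.
    grid = defaultdict(list)
    for idx, (x, y) in enumerate(points):
        grid[(int(x // h), int(y // h))].append(idx)

    pairs, h2 = [], h * h
    for (cx, cy), members in grid.items():
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for j in grid.get((cx + dx, cy + dy), ()):
                    for i in members:
                        if i < j:  # count each interacting pair once
                            (px, py), (qx, qy) = points[i], points[j]
                            if (px - qx) ** 2 + (py - qy) ** 2 <= h2:
                                pairs.append((i, j))
    return pairs

pts = [(random.random(), random.random()) for _ in range(1000)]
print(len(cell_list_pairs(pts, 0.05)), "interacting pairs")
```

With adaptive refinement the smoothing radius varies between particles, so a single cell size no longer fits all; hierarchical grids or per-resolution cell lists are common remedies, which is where an optimized search pays off.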


Atmosphere ◽  
2021 ◽  
Vol 12 (2) ◽  
pp. 238
Author(s):  
Pablo Contreras ◽  
Johanna Orellana-Alvear ◽  
Paul Muñoz ◽  
Jörg Bendix ◽  
Rolando Célleri

The Random Forest (RF) algorithm, a decision-tree-based technique, has become a promising approach for runoff forecasting in remote areas. This machine learning approach can overcome the scarcity of the spatio-temporal data and physical parameters needed by process-based hydrological models. However, the influence of RF hyperparameters is still uncertain and needs to be explored. Therefore, the aim of this study is to analyze the sensitivity of RF runoff forecasting models of varying lead time to the hyperparameters of the algorithm. For this, models were trained using (a) default hyperparameters and (b) extensive hyperparameter combinations explored through a grid search, which allows the optimal set to be reached. Model performance was assessed with the R2, %Bias, and RMSE metrics. We found that: (i) the most influential hyperparameter is the number of trees in the forest; however, the combination of the tree-depth and number-of-features hyperparameters produced the highest variability and instability in the models. (ii) Hyperparameter optimization significantly improved model performance for longer lead times (12 and 24 h). For instance, the performance of the 12-h forecasting model under default RF hyperparameters improved to R2 = 0.41 after optimization (a gain of 0.17). For the short lead time (4 h), however, there was no significant improvement (0.69 < R2 < 0.70). (iii) There is a range of values for each hyperparameter within which model performance is not significantly affected and remains close to the optimum, so different combinations of hyperparameter values can produce similarly high performance. The improvements after optimization can be explained from a hydrological point of view: for lead times longer than the concentration time of the catchment, the generalization ability of the models tends to rely more on hyperparameterization than on what they can learn from the input data. This insight can help in the development of operational early warning systems.
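The grid-search setup described above maps directly onto standard tooling; here is a minimal scikit-learn sketch over the three hyperparameters the study highlights. The data are synthetic stand-ins, since the lagged precipitation/runoff features and the %Bias metric of the original models are not reproduced here.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

# Synthetic regression data standing in for the catchment's lagged inputs.
X, y = make_regression(n_samples=500, n_features=12, noise=0.3, random_state=0)

param_grid = {
    "n_estimators": [100, 300, 500],  # number of trees in the forest
    "max_depth": [5, 15, None],       # depth of the tree
    "max_features": [0.3, 0.6, 1.0],  # fraction of features tried per split
}

search = GridSearchCV(
    RandomForestRegressor(random_state=0),
    param_grid,
    scoring="r2",  # the study also reports %Bias and RMSE
    cv=5,
    n_jobs=-1,
)
search.fit(X, y)
print("best R2: %.2f" % search.best_score_, "using", search.best_params_)
```

Inspecting search.cv_results_ rather than only best_params_ is what reveals the plateau behavior noted in finding (iii): many hyperparameter combinations score within noise of the optimum.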

