Distributed Computing for Genetic Approximation Algorithms for the Kissing Problem (n-dimensions)

2021
Author(s):  
Andrew Kamal

Utilizing distributed computing for genetic optimization algorithms that approximate kissing numbers in n dimensions helps offload data and increases the efficiency of the approximation. To demonstrate this, we use mathematical proofs centered on n-dimensional vectors and arrays, as well as exponential dimensional analysis. Optimization algorithms built on these proofs can offload processed data through a shared network of computers running simultaneous multi-threaded computational processes. One can build a computational model based on mathematical constraints viewed as higher-dimensional complexity. Formulating such proofs depends on the degree of certainty versus uncertainty in the approximation, and on which processing task should be optimized in order to yield the best result.
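The abstract does not include the algorithm itself, but the underlying search problem can be sketched: evolve sets of unit vectors in n dimensions whose pairwise angles are all at least 60 degrees (equivalently, dot products at most 0.5), so the largest valid set found is a lower-bound approximation of the kissing number. The operators and parameters below are illustrative assumptions, not the paper's method.

```python
# Minimal sketch of a genetic search for kissing-number lower bounds.
# All parameters (population size, mutation rate, etc.) are illustrative.
import math
import random

def normalize(v):
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

def random_unit_vector(n, rng):
    # Gaussian sampling + normalization gives a uniform direction on the sphere.
    return normalize([rng.gauss(0.0, 1.0) for _ in range(n)])

def valid_subset_size(vectors, tol=1e-9):
    """Greedy count of vectors keeping every pairwise dot product <= 0.5,
    i.e. every pairwise angle >= 60 degrees (the kissing constraint)."""
    kept = []
    for v in vectors:
        if all(sum(a * b for a, b in zip(v, w)) <= 0.5 + tol for w in kept):
            kept.append(v)
    return len(kept)

def mutate(vectors, n, rng, rate=0.2):
    child = [v[:] for v in vectors]
    for i in range(len(child)):
        if rng.random() < rate:  # jitter a vector and re-project to the sphere
            child[i] = normalize([x + rng.gauss(0.0, 0.1) for x in child[i]])
    if rng.random() < rate:      # occasionally try to grow the candidate set
        child.append(random_unit_vector(n, rng))
    return child

def approximate_kissing_number(n, pop_size=20, generations=200, seed=0):
    rng = random.Random(seed)
    population = [[random_unit_vector(n, rng) for _ in range(2 * n)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=valid_subset_size, reverse=True)
        survivors = population[: pop_size // 2]       # elitist selection
        population = survivors + [mutate(s, n, rng) for s in survivors]
    return max(valid_subset_size(p) for p in population)
```

Because each individual's fitness is evaluated independently, this is the kind of loop that distributes naturally: the `valid_subset_size` evaluations can be farmed out to worker machines or threads with no coordination beyond the generation barrier.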

Author(s):  
Masao Arakawa

Teamology was established by Prof. Wilde as a method for forming creative teams within projects. As a first step, it uses questionnaires to characterize the personality of each member who joins a project. Assume that in academic project-based learning, a number of students join and we are going to form several teams; ideally, every team should have the same potential. In order to create such teams, we need to quantify the students' characters and formalize them to meet the guidelines of Teamology. In this study, we develop a multi-objective optimization formulation of Teamology and show an example of team formation using a genetic optimization algorithm with data taken from a PBL course at Kagawa University.
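Wilde's actual questionnaire scoring and the study's multi-objective formulation are not reproduced in the abstract. As an illustrative single-objective sketch, team formation can be posed as minimizing the spread of team "potentials" over equal-sized teams, searched with a simple genetic loop; the `scores` values and the balance objective below are assumptions for demonstration only.

```python
# Illustrative sketch: balance hypothetical student "potential" scores across
# equal-sized teams with an elitist genetic loop using swap mutations.
import random
import statistics

def team_totals(assignment, scores, n_teams):
    totals = [0.0] * n_teams
    for student, team in enumerate(assignment):
        totals[team] += scores[student]
    return totals

def imbalance(assignment, scores, n_teams):
    """Objective to minimize: spread of the teams' total potentials."""
    return statistics.pstdev(team_totals(assignment, scores, n_teams))

def balanced_assignment(scores, n_teams, pop_size=30, generations=300, seed=0):
    rng = random.Random(seed)
    n = len(scores)
    base = [i % n_teams for i in range(n)]  # team sizes differ by at most one
    population = []
    for _ in range(pop_size):
        a = base[:]
        rng.shuffle(a)
        population.append(a)
    for _ in range(generations):
        population.sort(key=lambda a: imbalance(a, scores, n_teams))
        survivors = population[: pop_size // 2]
        children = []
        for s in survivors:  # swap mutation preserves the team sizes
            c = s[:]
            i, j = rng.randrange(n), rng.randrange(n)
            c[i], c[j] = c[j], c[i]
            children.append(c)
        population = survivors + children
    return min(population, key=lambda a: imbalance(a, scores, n_teams))
```

A genuinely multi-objective version, closer to the study, would keep a Pareto front over several such objectives (e.g. one balance term per personality dimension) instead of a single scalar.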


2017
Author(s):  
Eshin Jolly,
Luke J. Chang

Psychology is a complicated science. It has no general axioms or mathematical proofs, its subject matter is rarely directly observable, and it has the privilege of being the only discipline in which the content under investigation (i.e. human psychological phenomena) is the very tool used to conduct the investigation. For these reasons, it is easy to be seduced by the idea that our psychological theories, limited by our cognitive capacities, accurately reflect a far more complex landscape. Like the Flatlanders in Edwin Abbott's famous short story (1884), we may be led to believe that the parsimony offered by our low-dimensional theories reflects the reality of a much higher-dimensional problem. Here we contend that this "Flatland fallacy" leads us to seek out simplified explanations of complex phenomena, limiting our capacity as scientists to build and communicate useful models of human psychology. We suggest that this fallacy can be overcome through (1) the use of quantitative models, which force researchers to formalize their theories, and (2) improved quantitative training, which can build new norms for conducting psychological research.


2020
Vol 21 (3)
Author(s):  
Mateusz Starzec,
Grażyna Starzec,
Mateusz Paciorek

Scalability is essential when one wants to utilize HPC infrastructure in an efficient and reasonable way. In such infrastructures, synchronization limits the efficiency of parallel algorithms. However, one can consider introducing certain means of desynchronization in order to increase scalability. Allowing certain messages to be omitted or delayed is easily accepted in the case of metaheuristics. Furthermore, some simulations can also follow this pattern and handle bigger environments. The paper presents a short survey of the desynchronization idea, pointing out results already obtained and sketching future work focused on scaling parallel and distributed computing or simulation algorithms that leverage desynchronization.
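The desynchronization idea can be illustrated with a toy island model: each worker improves its own solution, publishes its best value asynchronously, and reads incoming migrants non-blockingly, so a dropped or delayed message is simply skipped rather than waited for. Everything below (the toy objective, migration probability, mailbox layout) is an illustrative assumption, not one of the surveyed systems.

```python
# Illustrative sketch: desynchronized island-model search. Workers never block
# on communication -- missing migrants are skipped, so exchange only speeds up
# convergence and is never required for correctness.
import queue
import random
import threading

def worker(island_id, inbox, outbox, rng_seed, steps, results):
    rng = random.Random(rng_seed)
    best = rng.uniform(0.0, 100.0)            # toy objective: minimize |x|
    for _ in range(steps):
        candidate = best + rng.uniform(-1.0, 1.0)
        best = min(best, abs(candidate))      # local improvement step
        try:
            best = min(best, inbox.get_nowait())  # non-blocking read: a
        except queue.Empty:                       # missing migrant is skipped
            pass
        if rng.random() < 0.5:                # desync: publish only sometimes
            outbox.put(best)
    results[island_id] = best

def run_islands(n_islands=4, steps=500):
    shared = queue.Queue()                    # one shared mailbox for all islands
    results = [None] * n_islands
    threads = [threading.Thread(target=worker,
                                args=(i, shared, shared, i, steps, results))
               for i in range(n_islands)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return min(results)
```

The key property for scalability is in the `try/except` block: no worker ever waits on another, so adding islands cannot introduce a synchronization bottleneck.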


2001
Vol 11 (04)
pp. 1159-1167
Author(s):  
V. MÜLLER,
B. KOTCHOUBEY,
J. PERELMOUTER,
W. LUTZENBERGER,
N. BIRBAUMER,
...  

The effects of disturbing noise on the nonlinear dynamics of electrical brain activity (EEG) were investigated in a sample of 12 healthy volunteers. The prediction that a high-intensity noise increases the dimensional complexity of brain activity compared to a low-intensity noise was confirmed using a deterministic chaos algorithm. Further, there was a tendency for rest conditions to have a higher dimensional complexity than the condition in which the low-intensity noise was presented. Although a Fourier analysis of the EEG revealed a difference in alpha power between the rest condition at the end of the experiment and the noise conditions, no effect of noise intensity on the EEG power spectrum was obtained. The data indicate that the disturbing effect of a loud noise results in the recruitment of additional cortical networks and thus increases the complexity of cortical cell assemblies.
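The "deterministic chaos algorithm" behind such dimensional-complexity measures is commonly the Grassberger-Procaccia correlation dimension: the signal is embedded in a higher-dimensional space by time delays, and the scaling of the fraction of close point pairs with radius estimates the attractor's dimension. The sketch below uses illustrative parameters, not the study's actual pipeline or settings.

```python
# Minimal sketch of a Grassberger-Procaccia correlation-dimension estimate
# via time-delay embedding; radii and delays are illustrative choices.
import math

def embed(signal, dim, delay):
    """Time-delay embedding: map a 1-D signal to points in R^dim."""
    n = len(signal) - (dim - 1) * delay
    return [[signal[i + k * delay] for k in range(dim)] for i in range(n)]

def correlation_sum(points, r):
    """Fraction of point pairs closer than radius r (Euclidean distance)."""
    count, pairs = 0, 0
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            pairs += 1
            if math.dist(points[i], points[j]) < r:
                count += 1
    return count / pairs

def correlation_dimension(signal, dim=3, delay=1, r1=0.5, r2=1.0):
    """Slope of log C(r) vs log r between two radii approximates D2."""
    pts = embed(signal, dim, delay)
    c1, c2 = correlation_sum(pts, r1), correlation_sum(pts, r2)
    return (math.log(c2) - math.log(c1)) / (math.log(r2) - math.log(r1))
```

A regular signal (a sine wave, whose trajectory is a closed curve) yields a dimension near 1, while noise fills the embedding space and yields a much higher estimate; the study's finding is the analogous contrast between low-noise and loud-noise EEG conditions.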

