Computational Algorithms
Recently Published Documents

TOTAL DOCUMENTS: 536 (five years: 128)
H-INDEX: 34 (five years: 3)

2022 · Vol 71 (2) · pp. 3621-3634
Author(s): Ali Raza, Dumitru Baleanu, Muhammad Rafiq, Syed Zaheer Abbas, Abubakar Siddique, ...

Author(s): Yu-Ying Chuang, R. Harald Baayen

Naive discriminative learning (NDL) and linear discriminative learning (LDL) are simple computational algorithms for lexical learning and lexical processing. Both NDL and LDL assume that learning is discriminative, driven by prediction error, and that it is this error that calibrates the association strength between input and output representations. Both words’ forms and their meanings are represented by numeric vectors, and mappings between forms and meanings are set up. For comprehension, form vectors predict meaning vectors; for production, meaning vectors map onto form vectors. These mappings can be learned incrementally, approximating how children learn the words of their language; alternatively, optimal mappings representing the end state of learning can be estimated. The NDL and LDL algorithms are incorporated in a computational theory of the mental lexicon, the ‘discriminative lexicon’. The model performs well both in production and comprehension accuracy and in predicting aspects of lexical processing, including morphological processing, across a wide range of experiments. Since, mathematically, NDL and LDL implement multivariate multiple regression, the ‘discriminative lexicon’ provides a cognitively motivated statistical modeling approach to lexical processing.
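The error-driven learning that NDL and LDL share can be illustrated with a minimal sketch. The update below is the Widrow-Hoff (delta) rule applied to toy, hypothetical data — two one-hot form vectors mapped onto two-dimensional meaning vectors — not the authors' implementation; all names and constants are illustrative.

```python
# Incremental error-driven learning (Widrow-Hoff / delta rule), the kind of
# update NDL/LDL describe: the mapping F from form vectors to meaning vectors
# is adjusted in proportion to the prediction error.

def train(pairs, dim_in, dim_out, eta=0.1, epochs=500):
    # F is a dim_in x dim_out mapping, initialised to zero.
    F = [[0.0] * dim_out for _ in range(dim_in)]
    for _ in range(epochs):
        for c, s in pairs:  # c: form (cue) vector, s: meaning (outcome) vector
            pred = [sum(c[i] * F[i][j] for i in range(dim_in))
                    for j in range(dim_out)]
            err = [s[j] - pred[j] for j in range(dim_out)]  # prediction error
            for i in range(dim_in):
                for j in range(dim_out):
                    F[i][j] += eta * c[i] * err[j]  # error calibrates weights
    return F

# Two toy "words": one-hot form vectors paired with 2-d meaning vectors.
pairs = [([1.0, 0.0], [0.9, 0.1]),
         ([0.0, 1.0], [0.2, 0.8])]
F = train(pairs, dim_in=2, dim_out=2)
```

With one-hot cues, each row of the learned mapping converges to the corresponding meaning vector — the multivariate-regression end state the abstract refers to.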


2021 · Vol 8 (11) · pp. 325-331
Author(s): Eko Hariyanto, Sri Wahyuni, Supina Batubara

The main problem addressed in this study is the large number of students who drop out, which harms universities because early monitoring as a preventive measure is difficult. This research is therefore important: it enables higher-education institutions to detect early (classify) students who may not complete their studies on time or who are likely to drop out (DO). Institutions, through related parties such as academic advisors and academic bureaus, can then intervene early by offering the best possible solutions to the problems students face. This research aims to build a training-data model from academic and non-academic factors (including information extracted from social media). This model is then used as a basis for classifying students as likely to "graduate on time", "graduate not on time", or "DO". The approach is quantitative: text-mining computational algorithms extract knowledge/information from social media for use in the training data, and data-mining computational algorithms classify the potential completion of student studies. The mandatory external target is publication in a Scopus Q4 international journal in the first year and in a Scopus Q3 international journal in the second year; additional external targets in both years are publications in international journals covered by reputable indexers, ISBN textbooks, and copyrights. The technology readiness level (TKT) of this study, up to level 2, covers the formulation of technological concepts and applications for classifying the potential completion of student studies using data mining. Keywords: student dropout, knowledge/information extraction, data classification, text mining, data mining.
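As an illustration of the classification step described above, the sketch below builds bag-of-words vectors from short text snippets and assigns one of the three outcome labels with a nearest-centroid rule. All snippets, features, and the classifier choice are hypothetical — the paper does not specify its text-mining or data-mining algorithms.

```python
# Minimal bag-of-words nearest-centroid classifier for the three outcomes.
from collections import Counter
import math

LABELS = ["graduate on time", "graduate not on time", "DO"]

def vectorize(text):
    """Bag-of-words term counts for one text snippet."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[k] * b.get(k, 0) for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def centroids(samples):
    """samples: list of (text, label); centroid = summed counts per label."""
    cents = {lab: Counter() for lab in LABELS}
    for text, lab in samples:
        cents[lab].update(vectorize(text))
    return cents

def classify(text, cents):
    v = vectorize(text)
    return max(LABELS, key=lambda lab: cosine(v, cents[lab]))

# Hypothetical training snippets (academic records + social-media text).
train = [("good grades active attendance", "graduate on time"),
         ("retake courses delayed thesis", "graduate not on time"),
         ("absent unpaid tuition inactive", "DO")]
cents = centroids(train)
label = classify("delayed thesis retake", cents)
```

In practice the feature set would combine academic factors with text mined from social media, as the abstract describes.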


Author(s): Алексей Владимирович Снытников, Галина Геннадьевна Лазарева

The article deals with applied issues that arise when exascale computing is used to solve applied problems. Based on a review of work in this area, the most pressing issues related to exascale calculations are highlighted. Particular attention is paid to software features, algorithms, and numerical methods for exaflop supercomputers. The requirements for such programs and algorithms are formulated. Based on a review of existing approaches to achieving high performance, the main fundamentally different, non-overlapping directions for improving computational performance are identified. The question of the need for applicability criteria for computational algorithms on exaflop supercomputers is raised. Currently, the only criterion in use demands the absence of a significant drop in efficiency in the transition from a petaflop calculation to a ten-petaflop calculation; when such calculations are not possible, simulation modelling can be carried out. Examples are given of developing new algorithms and of adapting existing algorithms and numerical methods to problems of continuum mechanics. The fundamental difference between algorithms designed specifically for exascale machines and algorithms adapted for exaflops is shown.
The analysis of publications showed that, for problems of continuum mechanics, the prevailing approach is not the development of new numerical methods and algorithms but the adaptation of existing ones to the architecture of exaflop supercomputers. An analysis of the most popular applications is made. The most relevant application of exaflop supercomputers in this area is computational fluid dynamics, because hydrodynamic applications form a rich and diverse field. The number of publications indicates that high-performance computing is now both available and in demand.


Author(s): Olha Milchenko

A non-linear optimal control problem for a hyperbolic system of first-order equations on a line, in the case of degeneracy of the initial-condition line, is considered. This problem describes many natural, economic, and social processes, including the optimality of Slutsky demand and the theory of bio-populations. The research is based on the method of characteristics and on nonstandard variations of the increment of the target functional, which leads to the construction of efficient computational algorithms.
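The method of characteristics the abstract builds on can be sketched for the simplest first-order hyperbolic equation, the linear advection equation u_t + a u_x = 0: the solution is constant along the characteristic lines x - a*t = const, so u(x, t) = u0(x - a*t). The example below is illustrative only and does not reproduce the paper's non-linear, degenerate setting; the constants are arbitrary.

```python
# Method of characteristics for the linear advection equation
# u_t + a*u_x = 0: the initial profile is transported along the
# characteristics x(t) = x0 + a*t without changing shape.
import math

def solve_advection(u0, a):
    """Return u(x, t) for u_t + a*u_x = 0 with u(x, 0) = u0(x)."""
    return lambda x, t: u0(x - a * t)

u0 = lambda x: math.exp(-x * x)   # initial profile
u = solve_advection(u0, a=2.0)

# After time t = 1 the peak has moved from x = 0 to x = a*t = 2.
val = u(2.0, 1.0)
```

For non-linear systems the characteristics themselves depend on the solution, which is where the variational machinery described in the abstract comes in.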


2021 · Vol 21 (1)
Author(s): Jungwon Yoon, Heather Billings, Chung-Il Wi, Elissa Hall, Sunghwan Sohn, ...

Abstract Background A subgroup of patients with asthma has been reported to have an increased risk of asthma-associated infectious and inflammatory multimorbidities (AIMs). To systematically investigate the association of asthma with AIMs in a large patient cohort, it is desirable to leverage a broad range of electronic health record (EHR) data sources to identify AIMs automatically, accurately, and efficiently. Methods We established an expert consensus on an operational definition for each AIM from the EHR through a modified Delphi technique. A series of questions about the operational definitions of 19 AIMs (11 infectious diseases and 8 inflammatory diseases) was generated by a core team of experts who considered feasibility, the balance between sensitivity and specificity, and generalizability. Eight internal and 5 external expert panelists were invited to individually complete a series of online questionnaires and provide judgement and feedback throughout three sequential internal rounds and two external rounds. Panelists’ responses were collected, descriptive statistics were tabulated, and results were reported back to the entire group. Following each round, the core team of experts made iterative edits to the operational definitions until a moderate (≥ 60%) or strong (≥ 80%) level of consensus among the panel was achieved. Results Response rates were 100% in all 5 Delphi rounds, with the following consensus levels: (1) internal panel: 100% for 8 definitions, 88% for 10 definitions, and 75% for 1 definition; (2) external panel: 100% for 12 definitions and 80% for 7 definitions. Conclusions The final operational definitions of AIMs established through a modified Delphi technique can serve as a foundation for developing computational algorithms to automatically identify AIMs from EHRs, enabling large-scale research studies on patients’ multimorbidities associated with asthma.
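The consensus rule described above (moderate at ≥ 60%, strong at ≥ 80%) can be sketched directly; the vote data below are hypothetical, not the study's actual panel responses.

```python
# Grade panel agreement on one operational definition using the
# thresholds from the abstract: strong >= 80%, moderate >= 60%.

def consensus_level(votes):
    """votes: list of booleans, True = panelist agrees with the definition."""
    pct = 100.0 * sum(votes) / len(votes)
    if pct >= 80:
        return pct, "strong"
    if pct >= 60:
        return pct, "moderate"
    return pct, "none"

# Hypothetical round: 7 of 8 internal panelists agree (87.5% -> strong).
pct, level = consensus_level([True] * 7 + [False])
```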


2021
Author(s): Jiacheng Zhang, George Alexandrou, Chris Toumazou, Melpomeni Kalofonou

2021 · Vol 2091 (1) · pp. 012072
Author(s): P Korolenko, R Kubanov, N Pavlov, A Zotov

Abstract A brief retrospective analysis of studies of fractal light radiation is carried out. To assess the prospects of this scientific direction, new original results on the diffractive propagation of vortex wave beams with a fractal structure (vortex diffractals) are also presented. For this purpose, computational algorithms and related software have been developed. In calculating the amplitude-phase and scaling characteristics of diffractals, two-dimensional Weierstrass functions and multistructures of Gaussian beams were used. The results indicate a high information capacity of vortex diffractals and explain their resistance to the influence of turbulence in the propagation medium.
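As an illustration of the kind of construction mentioned above, the sketch below evaluates a truncated two-dimensional Weierstrass-type sum in a simple separable form, W(x, y) = sum over n of a^n [cos(b^n πx) + cos(b^n πy)] with 0 < a < 1 and ab > 1. The exact functions and parameters used by the authors may differ; all values here are illustrative.

```python
# Truncated two-dimensional Weierstrass-type sum: continuous everywhere,
# increasingly jagged (fractal-like) as the number of terms grows.
import math

def weierstrass2d(x, y, a=0.5, b=3.0, terms=20):
    return sum(a ** n * (math.cos(b ** n * math.pi * x) +
                         math.cos(b ** n * math.pi * y))
               for n in range(terms))

# At the origin every cosine equals 1, so the sum is a geometric series:
# 2 * (1 - a**terms) / (1 - a), which approaches 4 for a = 0.5.
w00 = weierstrass2d(0.0, 0.0)
```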

