Journal of Computing and Natural Science
Latest Publications


TOTAL DOCUMENTS

25
(FIVE YEARS 25)

H-INDEX

0
(FIVE YEARS 0)

Published By Anapub Publications

ISSN: 2789-181X

Author(s):  
Yung Ming ◽  
Lily Yuan

Machine Learning (ML) and Artificial Intelligence (AI) methods are transforming many commercial and academic areas, including feature extraction, autonomous driving, computational linguistics, and voice recognition. These technologies are now having a significant effect in radiography, forensics, and many other areas where the availability of automated systems may improve the precision and repeatability of essential tasks. In this systematic review, we begin by providing a short overview of the different methods currently being developed, with a particular emphasis on those used in biomedical studies.


Author(s):  
Marco Scotini ◽  
Hussein Abdullah

With a central focus on the research question "What must be done to encourage people to become more E+STEM educated?", this research is based on a systematic review of ecological knowledge as it relates to teachers' career growth and to Environmental Science, Technology, Engineering, and Mathematics (E+STEM) pedagogy. The aim is to identify, in light of expert views, what instructors must do to improve their experience and credentials as E+STEM-educated professionals. To elicit those expert views, a mixed-method research approach combining qualitative and quantitative techniques is used, following an exploratory sequential design. The first phase of the Delphi study gathers qualitative data on teachers' professional growth. Once these data have been analyzed, the quantitative methodology is applied in the second phase of the Delphi study. Lastly, the final quantitative formulation (third phase) is produced following the evaluation of the second-phase data.
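The abstract reports no formulas, but a common way to summarise the quantitative Delphi phases is to compute, for each questionnaire item, the median rating and interquartile range (IQR) across the expert panel and flag items that have reached consensus. The Python sketch below illustrates that step; the ratings, the 1-5 scale, and the IQR ≤ 1 consensus rule are illustrative assumptions, not taken from the paper.

```python
import statistics

def delphi_summary(ratings, iqr_threshold=1.0):
    """Summarise one Delphi item: median, IQR, and a simple consensus flag.
    The IQR <= 1 consensus rule is a common convention, assumed here."""
    q1, _, q3 = statistics.quantiles(ratings, n=4)
    iqr = q3 - q1
    return {"median": statistics.median(ratings),
            "iqr": iqr,
            "consensus": iqr <= iqr_threshold}

# Illustrative expert ratings (1-5 Likert scale) for two round-two items.
print(delphi_summary([4, 4, 5, 4, 3, 4, 5]))   # tight spread -> consensus
print(delphi_summary([1, 5, 2, 4, 3, 5, 2]))   # wide spread  -> no consensus
```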


Author(s):  
Mohammad Biglarbegian

The primary goal of this article, in the research area of Advanced Engineering Informatics (AEI), is to depict and formalize engineering knowledge that is multidimensional. The paper introduces conceptual frameworks and rationality as implicit methodologies for regularizing knowledge. The objectives of professionals, as well as the circumstances in which they work, should be considered when depicting and standardizing knowledge. The constructs of epistemology, rationality, and context are used to present alternative data analysis techniques and practices that experts can use to institutionalize intricate engineering expertise, and to substantiate whether a specialized conceptual model can support engineers in their challenging operations. A bottom-up method of research in advanced engineering, involving the engineers themselves, is suggested in this article. We also recommend that scientists adopt a social-scientific approach to generating knowledge for formalization and to validating it.
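As a loose illustration of what a formalized knowledge item annotated with rationality and context might look like, here is a minimal Python sketch; the field names and the example entry are hypothetical, not the paper's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class KnowledgeItem:
    """One formalised piece of engineering knowledge, annotated with the
    constructs discussed above; fields are illustrative assumptions."""
    statement: str                      # the engineering rule or heuristic itself
    rationale: str                      # why practitioners consider it valid
    context: dict = field(default_factory=dict)  # circumstances where it applies

# Illustrative entry captured bottom-up from practising engineers.
item = KnowledgeItem(
    statement="Derate fastener capacity by 20% in corrosive environments",
    rationale="Field failures observed when nominal capacity was assumed",
    context={"domain": "structural design", "environment": "offshore"},
)
print(item)
```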


Author(s):  
Amir Antonie ◽  
Andrew Mathus

The parallel, component-based setting constrains performance assessment and model construction: component functions, for example, should be observable without direct ties to a particular programming language, and applications that are composed dynamically at program execution require reusable performance-monitoring components. Given these restrictions, a coarse-grained Performance Evaluation (PE) approach is described in this paper. A performance model of the application system can be aggregated from the resulting data. To validate the evaluation and model-construction techniques included in the framework, simple components with well-known optimization models are employed.
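A minimal Python sketch of what a reusable, coarse-grained performance-monitoring component might look like under these constraints; the component names and workloads are invented for illustration, and this is not the paper's actual framework.

```python
import time
from collections import defaultdict

class ComponentMonitor:
    """Coarse-grained monitor: wraps components composed at runtime and
    records wall-clock timings per component, without touching their code."""

    def __init__(self):
        self.samples = defaultdict(list)  # component name -> list of durations (s)

    def wrap(self, name, func):
        """Return a monitored version of `func`; reusable for any callable."""
        def monitored(*args, **kwargs):
            start = time.perf_counter()
            try:
                return func(*args, **kwargs)
            finally:
                self.samples[name].append(time.perf_counter() - start)
        return monitored

    def profile(self):
        """Aggregate raw samples into a per-component performance profile."""
        return {name: {"calls": len(d), "total_s": sum(d), "mean_s": sum(d) / len(d)}
                for name, d in self.samples.items()}

# Usage: components with well-known behaviour (here, simple sleeps) are
# wrapped at composition time and a profile is aggregated afterwards.
monitor = ComponentMonitor()
fast = monitor.wrap("fast_component", lambda: time.sleep(0.01))
slow = monitor.wrap("slow_component", lambda: time.sleep(0.05))
for _ in range(3):
    fast(); slow()
print(monitor.profile())
```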


Author(s):  
Daniel Ashlock

In the early 1980s, the creation of a Knowledge-Based System (KBS) was regarded as a process of transferring human knowledge into an implemented knowledge base. The premise behind this transfer was that the information the KBS required already existed and only needed to be gathered and applied. Most of the time, the necessary information was gleaned by talking to professionals about how they handle particular problems. This knowledge was usually encoded as production rules, which were then executed by a rule interpreter linked to them. Here, we demonstrate a number of new ideas and approaches that have emerged over the last few years. This paper presents MIKE, PROTÉGÉ-II, and CommonKADS as three different modeling frameworks that may be used together or separately.
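As a rough illustration of the production-rule style described above (not the actual MIKE, PROTÉGÉ-II, or CommonKADS tooling), the following is a minimal forward-chaining rule interpreter in Python; the facts and rules are invented for the example.

```python
# Minimal forward-chaining interpreter: each production rule is a
# (conditions, conclusion) pair over a set of symbolic facts.
rules = [
    ({"has_fever", "has_cough"}, "suspect_flu"),
    ({"suspect_flu", "short_of_breath"}, "refer_to_doctor"),
]

def forward_chain(facts, rules):
    """Repeatedly fire rules whose conditions hold until no new facts appear."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(forward_chain({"has_fever", "has_cough", "short_of_breath"}, rules))
# -> includes 'suspect_flu' and 'refer_to_doctor'
```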


Author(s):  
Abirami S.K ◽  
Keerthika J

Rapid technological advancement offers major potential to fulfill the 2030 Agenda for Sustainable Development. Technological breakthroughs can help to eradicate poverty, monitor key environmental-sustainability performance indicators, enhance food security, promote resource efficiency and effectiveness, enable deep structural transitions, support social integration, combat disease, and widen access to higher education. Technological advancement also creates new policy problems, threatening to outrun authorities' and society's ability to respond to the changes brought about by new technology. Automation may have an uncertain and possibly detrimental effect on the economy, profitability, internationalization, and competitiveness. In that regard, this paper focuses on technological change in the field of science. It starts with an analysis of the effect of rapid technological development on global disparities, followed by a literature survey, before evaluating technological changes in science.


Author(s):  
Rimma Padovano

"Cloud computing" refers to large-scale parallel and distributed systems, which are essentially collections of autonomous. As a result, the “cloud organization” is made up on a wide range of ideas and experiences collected since the first digital computer was used to solve algorithmically complicated problems. Due to the complexity of established parallel and distributed computing ontologies, it is necessary for developers to have a high level of expertise to get the most out of the consolidated computer resources. The directions for future research for parallel and distributed computing are critically presented in this research: technology and application and cross-cutting concerns.


Author(s):  
Chris Duncan Lee

Consumers and academics are paying attention to affordable Virtual Reality (VR) solutions such as the Sony Entertainment VR, Vive VR, and Oculus VR, as well as Mixed-Reality Interfaces (MRITF) such as HoloLens, which suggests VR may be the next big thing in technical advancement. Nevertheless, VR has a long history: the idea behind the technology originated in the 1960s, and commercial VR toolkits were introduced in the 1980s. This paper starts with an analysis of the development from VR to Augmented Reality (AR). We conclude by evaluating the prospects of MRITF, AR, and VR succeeding in the scientific disciplines, altering human interaction, social engagement, and understanding among individuals, much as occurred with the emergence of smartphones.


Author(s):  
Rich Caruana ◽  
Yin Lou

Many real-life problems are multi-objective, with conflicting objectives that must be optimized concurrently; improving one objective comes at the cost of another. Multi-Objective Optimization (MOO) problems are challenging but practically important, and because of their wide range of applications they have been analyzed by researchers from distinct scholarly backgrounds. This has yielded distinct approaches for tackling them, and there is a wide-ranging literature on the techniques used to handle MOO problems. It is important to keep in mind that each technique has its strengths and limitations, and there is no single optimal alternative for solution seekers in a typical scenario. MOO problems arise in various areas, e.g., path optimization, airplane design, automobile design, and finance, among others. This contribution presents a survey of prevailing MOO challenges and of swarm intelligence approaches for mitigating them. Its main purpose is to provide a basis for understanding MOO challenges.
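To make the idea of conflicting objectives concrete, here is a small Python sketch that filters a candidate set down to its Pareto-optimal (non-dominated) solutions for two invented objectives; this dominance check is the basic building block most swarm-intelligence MOO methods rely on.

```python
# Pareto filtering for a two-objective minimisation problem.
# The parameter x and the objectives are invented for illustration:
# f1 could be the cost and f2 the travel time of a route parameterised by x.
def objectives(x):
    f1 = x ** 2            # grows with x
    f2 = (x - 2.0) ** 2    # shrinks until x = 2, so the objectives conflict on [0, 2]
    return f1, f2

def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly better in one."""
    return all(ai <= bi for ai, bi in zip(a, b)) and any(ai < bi for ai, bi in zip(a, b))

candidates = [i / 4 for i in range(13)]            # x = 0.0, 0.25, ..., 3.0
evaluated = [(x, objectives(x)) for x in candidates]
pareto = [(x, f) for x, f in evaluated
          if not any(dominates(g, f) for _, g in evaluated)]
print(pareto)   # only the trade-off points on [0, 2] survive; x > 2 is dominated
```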


Author(s):  
Steve Blair ◽  
Jon Cotter

The need for high-performance Data Mining (DM) algorithms is being driven by the exponentially increasing availability of data such as images, audio, and video from a variety of domains, including social networks and the Internet of Things (IoT). Deep learning is currently an emerging field of pattern recognition and Machine Learning (ML) research. It offers computational models with numerous nonlinear processing layers of neurons that can learn and represent data at higher levels of abstraction. Deep learning models, which may be deployed on cloud platforms and large computational systems, can inherently capture the complex structure of large data sets. Heterogeneity is one of the most prominent characteristics of large data sets, and Heterogeneous Computing (HC) raises issues for system integration and Advanced Analytics. This article presents HC processing techniques, Big Data Analytics (BDA), large-dataset tools, and some classic ML and DM methodologies. The application of deep learning to Data Analytics is investigated, the benefits of integrating BDA, deep learning, High Performance Computing (HPC), and HC are highlighted, and coping with a wide range of data types in Data Analytics is discussed.
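As a toy illustration of the "multiple nonlinear processing layers" idea (not any specific model surveyed in the article), the NumPy sketch below stacks a few randomly initialised nonlinear layers so that raw input is mapped into progressively more compact representations; the layer sizes and activation are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense_layer(x, in_dim, out_dim):
    """One nonlinear processing layer: affine transform followed by tanh."""
    w = rng.normal(scale=1.0 / np.sqrt(in_dim), size=(in_dim, out_dim))
    b = np.zeros(out_dim)
    return np.tanh(x @ w + b)

# A toy "deep" stack: each layer re-represents the data at a higher
# level of abstraction (here, smaller random nonlinear projections).
x = rng.normal(size=(4, 16))          # 4 samples, 16 raw features
h1 = dense_layer(x, 16, 8)
h2 = dense_layer(h1, 8, 4)
h3 = dense_layer(h2, 4, 2)            # final 2-dimensional representation
print(h3.shape)                       # (4, 2)
```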

