domain specific language
Recently Published Documents

TOTAL DOCUMENTS: 719 (FIVE YEARS: 168)
H-INDEX: 24 (FIVE YEARS: 3)

Author(s): Rusul Yousif Alsalhee, Abdulhussein Mohsin Abdullah

The Holy Quran is full of inspiring stories and lessons, so understanding it requires particular attention to search and information retrieval. Many works have been carried out in the Holy Quran field, but some dealt with only part of the Quran or covered it only in general terms, and some did not support semantic search techniques or make Quranic knowledge understandable to both people and computers. Others adopted data analysis, processing, and ontology techniques in a way that directed them toward linguistic rather than semantic aspects. A further weakness of previous works is that they entered the ontology manually, which is costly and time-consuming. In this paper, we construct an ontology of Quranic stories. The ontology is built with the MappingMaster domain-specific language (MappingMaster DSL), through which concepts and individuals can be created and linked to the ontology automatically from Excel sheets. The conceptual structure was built using the Object Role Modeling (ORM) language. The SPARQL query language was used to test and evaluate the proposed ontology by asking many competency questions; the ontology answered all of these questions well.
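The core idea behind the MappingMaster approach — each spreadsheet row becomes an ontology individual, each column a property — can be sketched roughly as follows. The class name, column headers, sample data, and the matching logic below are illustrative assumptions, not the paper's actual ontology or queries.

```python
# Hypothetical sketch of the MappingMaster idea: each spreadsheet row
# becomes an ontology individual, each column a property. Class name,
# column headers, and the sample data below are illustrative only.

def rows_to_triples(class_name, header, rows):
    """Map tabular rows to (subject, predicate, object) triples."""
    triples = []
    for row in rows:
        individual = row[0]                      # first column names the individual
        triples.append((individual, "rdf:type", class_name))
        for prop, value in zip(header[1:], row[1:]):
            triples.append((individual, prop, value))
    return triples

# Toy "Excel sheet" of Quranic-story individuals (illustrative data).
header = ["name", "hasProphet", "mentionedInSurah"]
rows = [
    ["StoryOfYusuf", "Yusuf", "Yusuf"],
    ["StoryOfMusa",  "Musa",  "Al-Qasas"],
]
kb = rows_to_triples("QuranicStory", header, rows)

# A competency question, answered by matching triples (a stand-in for
# the SPARQL queries used in the paper):
# "Which stories mention the prophet Musa?"
answers = [s for (s, p, o) in kb if p == "hasProphet" and o == "Musa"]
print(answers)  # ['StoryOfMusa']
```

In a real pipeline the triples would be loaded into an OWL ontology and queried with SPARQL; the triple-filtering here only mimics that pattern-matching step.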


2022, Vol. 64, pp. 103006
Author(s): Antonia M. Reina Quintero, Salvador Martínez Pérez, Ángel Jesús Varela-Vaca, María Teresa Gómez López, Jordi Cabot

Webology, 2021, Vol. 19 (1), pp. 01-18
Author(s): Hayder Rahm Dakheel AL-Fayyadh, Salam Abdulabbas Ganim Ali, Dr. Basim Abood

The goal of this paper is to use artificial intelligence to build and evaluate an adaptive learning system, adopting the basic approaches of both spiking neural networks and artificial neural networks. Spiking neural networks are receiving increasing attention because of their advantages over traditional artificial neural networks: they have proven to be energy efficient, biologically plausible, and up to 10^5 times faster when simulated on analogue learning systems. Artificial neural network libraries use computational graphs as a pervasive representation; spiking models, however, remain heterogeneous and difficult to train. Using the deductive method of artificial intelligence, the paper posits two hypotheses, examining whether 1) a common representation exists for both neural-network paradigms in tutorial mentoring, and whether 2) spiking and non-spiking models can learn a simple recognition task for learning activities in adaptive learning. The first hypothesis is confirmed by specifying and implementing a domain-specific language that generates semantically similar spiking and non-spiking neural networks for tutorial mentoring. Through three classification experiments, the second hypothesis is shown to hold for non-spiking models but could not be proven for the spiking models. The paper contributes three findings: 1) a domain-specific language for modelling neural network topologies in adaptive tutorial mentoring for students, 2) a preliminary model for generalizable learning through back-propagation in spiking neural networks for student learning activities, also presented in the results section, and 3) a method for transferring optimised non-spiking parameters to spiking neural networks in an adaptive learning system. The last contribution is promising because the vast machine-learning literature can spill over to the emerging fields of spiking neural networks and adaptive learning computing. Future work includes improving the back-propagation model, exploring time-dependent models for learning, and adding support for adaptive learning systems.
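The abstract's central idea — one topology description generating both a non-spiking and a spiking network, with the same parameters usable in either — can be sketched roughly as follows. The mini-DSL (a list of layer sizes), the hand-picked weights, and the leaky integrate-and-fire dynamics below are illustrative assumptions, not the paper's actual implementation.

```python
import math

# Illustrative topology "DSL": a list of layer sizes shared by both paradigms.
topology = [2, 3, 1]

# Shared weights (hand-picked here; in the paper's setting these would be
# optimised on the non-spiking network, then transferred to the spiking one).
weights = [
    [[0.5, -0.2], [0.3, 0.8], [-0.4, 0.1]],   # layer 0 -> 1 (3x2)
    [[0.6, -0.1, 0.9]],                        # layer 1 -> 2 (1x3)
]

def dense(w, x, act):
    """One fully connected layer with activation `act`."""
    return [act(sum(wi * xi for wi, xi in zip(row, x))) for row in w]

def nonspiking_forward(x):
    """Conventional ANN: sigmoid activations."""
    sigmoid = lambda v: 1.0 / (1.0 + math.exp(-v))
    for w in weights:
        x = dense(w, x, sigmoid)
    return x

def spiking_forward(x, steps=20, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire network reusing the same weights: each
    neuron integrates weighted input over time and spikes (then resets)
    when its membrane potential crosses the threshold; spike rates
    feed the next layer."""
    rates = x
    for w in weights:
        counts = [0] * len(w)
        v = [0.0] * len(w)
        for _ in range(steps):
            for i, row in enumerate(w):
                v[i] = leak * v[i] + sum(wi * ri for wi, ri in zip(row, rates))
                if v[i] >= threshold:
                    counts[i] += 1
                    v[i] = 0.0          # reset after spike
        rates = [c / steps for c in counts]
    return rates

x = [1.0, 0.5]
print(nonspiking_forward(x))
print(spiking_forward(x))
```

The point of the sketch is that both forward passes read the same `topology` and `weights`, which is what makes parameter transfer from the non-spiking to the spiking model possible at all.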


Author(s):  
Akif Quddus Khan

This paper aims to provide an overview of the complete process of developing a Domain-Specific Language (DSL). It explains construction steps such as preliminary research, language implementation, and evaluation. Moreover, it details key components commonly found in DSLs, such as the abstraction layer, the DSL metamodel, and the applications. It also discusses the general limitations of Domain-Specific Languages for Workflows.
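The components named above — a metamodel, an abstraction layer over it, and an application that consumes the model — can be illustrated with a minimal workflow DSL. The `a -> b` syntax, the class names, and the scheduling application below are invented for illustration and are not from the paper.

```python
# Minimal workflow-DSL sketch illustrating the components the paper names:
# a metamodel (the Workflow class), an abstraction layer (the parser, which
# hides execution details behind 'a -> b' dependency lines), and an
# application (a scheduler). All names and syntax here are illustrative.

from dataclasses import dataclass, field

@dataclass
class Workflow:
    """Metamodel: a workflow is a set of tasks plus dependency edges."""
    tasks: set = field(default_factory=set)
    edges: list = field(default_factory=list)   # (before, after) pairs

def parse(source):
    """Abstraction layer: turn 'a -> b' lines into a Workflow model."""
    wf = Workflow()
    for line in source.strip().splitlines():
        before, after = (part.strip() for part in line.split("->"))
        wf.tasks.update({before, after})
        wf.edges.append((before, after))
    return wf

def schedule(wf):
    """Application: topologically sort tasks so dependencies run first."""
    order = []
    remaining = set(wf.tasks)
    while remaining:
        ready = sorted(t for t in remaining
                       if all(b not in remaining
                              for b, a in wf.edges if a == t))
        order.extend(ready)
        remaining -= set(ready)
    return order

wf = parse("""
fetch_data -> clean_data
clean_data -> train_model
fetch_data -> make_report
""")
print(schedule(wf))
```

Separating the three layers this way is the design choice the paper's overview describes: the concrete syntax can change without touching the metamodel, and new applications (validation, visualisation, execution) can be added against the same model.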

