Lógica difusa para una descripción de la gramática de las lenguas naturales [Fuzzy logic for a description of the grammar of natural languages]

Triangle ◽  
2020 ◽  
pp. 73
Author(s):  
Adrià Torrens Urrutia

Defining natural language and its gradient phenomena forces us to look for formal tools that can represent the foundations of a grammar with degrees of grammaticality. Mathematical and formal models are often used in linguistics, and yet fuzzy logic has not received the attention it deserves as a tool for explaining natural language processing. Here, we present the theoretical bases that have led us to treat natural language (NL) inputs as gradient. The basics of fuzzy logic for NL are explained as a tool capable of defining non-discrete, and therefore gradual or fuzzy, values. A Property Grammar provides the rules of the fuzzy grammar.
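As an illustration of how degrees of grammaticality might be computed, the following minimal Python sketch (a hypothetical toy, not the author's formalism) treats an input's grammaticality as a fuzzy value derived from the Property Grammar constraints it satisfies or violates:

```python
# Minimal sketch: a toy Property Grammar whose constraints are evaluated on an
# input, yielding a graded (fuzzy) grammaticality value rather than a binary one.

def grammaticality(satisfied, violated, weights=None):
    """Degree of grammaticality in [0, 1] from satisfied/violated properties.

    satisfied, violated: lists of property names; weights: optional dict of
    per-property importance (defaults to 1.0). The ratio of satisfied weight
    to total evaluated weight acts as a fuzzy membership degree.
    """
    weights = weights or {}
    w = lambda p: weights.get(p, 1.0)
    total = sum(w(p) for p in satisfied + violated)
    return sum(w(p) for p in satisfied) / total if total else 1.0

# Hypothetical properties for a noun phrase: determiner precedes noun,
# at most one determiner, determiner agrees with noun.
sat = ["precedence(det, noun)", "uniqueness(det)"]
vio = ["agreement(det, noun)"]
print(grammaticality(sat, vio, weights={"agreement(det, noun)": 2.0}))  # 0.5
```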

Author(s):  
PASCUAL JULIÁN-IRANZO ◽  
FERNANDO SÁENZ-PÉREZ

This paper introduces techniques to integrate WordNet into a Fuzzy Logic Programming system. Since WordNet relates words but does not give graded information on the relations between them, we have implemented standard similarity measures and new directives that allow proximity equations linking two words to be generated with an approximation degree. Proximity equations are the key syntactic structures which, in addition to a weak unification algorithm, make a flexible query-answering process possible in this kind of programming language. This addition widens the scope of Fuzzy Logic Programming, allowing certain forms of lexical reasoning and reinforcing Natural Language Processing (NLP) applications.
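A rough idea of how such graded proximity equations could be produced outside the authors' system: the sketch below uses NLTK's WordNet interface and the Wu-Palmer measure (assumptions here, standing in for the paper's implemented similarity measures) to emit a word-pair equation with an approximation degree:

```python
# Sketch only: the paper generates these inside a Fuzzy Logic Programming
# system; here NLTK's WordNet interface illustrates how a graded proximity
# equation between two words could be derived from a similarity measure.
# Requires: nltk.download('wordnet')
from nltk.corpus import wordnet as wn

def proximity_equation(word1, word2):
    """Return a proximity equation 'word1 ~ word2 = degree' using the best
    Wu-Palmer similarity over the words' noun synsets."""
    synsets1 = wn.synsets(word1, pos=wn.NOUN)
    synsets2 = wn.synsets(word2, pos=wn.NOUN)
    degrees = [s1.wup_similarity(s2) for s1 in synsets1 for s2 in synsets2]
    degrees = [d for d in degrees if d is not None]
    best = max(degrees) if degrees else 0.0
    return f"{word1} ~ {word2} = {best:.2f}"

print(proximity_equation("dog", "wolf"))   # a relatively high degree
print(proximity_equation("dog", "stone"))  # a much lower degree
```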


2021 ◽  
Author(s):  
Akuwan Saleh

Natural language processing is the process of building a computational model of language, so that interaction between humans and computers can take place through the medium of natural language. This computational model can be useful for scientific purposes, such as studying the properties of a particular form of natural language, as well as for everyday needs. The fields of knowledge involved in natural language processing include phonetics and phonology, morphology, syntax, semantics, pragmatics, discourse knowledge, and world knowledge. Semantics is defined as the mapping of syntactic structures, drawing on each word, into a more fundamental form that does not depend on sentence structure. Semantics studies the meaning of a word and how those word meanings combine to form the meaning of a complete sentence. Semantic analysis is used to recognize words that precede and relate to the words in the domain. This process is carried out by linking syntactic structures from words, phrases, and sentences up to paragraphs. In previous research on semantic mapping, the mapping was based on physical appearance and, subsequently, on the role of a model/character in a story. The essence of a game object does not have to emerge from the physical appearance of a character alone; it can also be linked to other important parameters such as the clothing, tools, objects, and weapons carried by each character.
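A toy illustration of the kind of attribute-based semantic mapping described above (the character names and attributes below are hypothetical, not taken from the study):

```python
# Illustrative sketch: extending the semantic mapping of game characters beyond
# physical appearance to other attributes named in the abstract (clothing,
# tools, weapons), and matching a query against those attributes.
characters = {
    "knight":   {"appearance": "armored", "clothing": "plate armor",
                 "weapon": "sword", "role": "protagonist"},
    "sorcerer": {"appearance": "robed", "clothing": "robe",
                 "weapon": "staff", "role": "antagonist"},
}

def semantic_match(query_attrs, characters):
    """Rank characters by how many queried attribute/value pairs they share."""
    scores = {
        name: sum(attrs.get(k) == v for k, v in query_attrs.items())
        for name, attrs in characters.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(semantic_match({"weapon": "staff", "role": "antagonist"}, characters))
```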


2021 ◽  
Vol 23 (10) ◽  
pp. 81-92
Author(s):  
Dr. ASHISH KUMAR TAMRAKAR

Natural Language Processing (NLP) is a computational approach to analyzing text that depends on both a set of theories and a set of technologies. NLP is a subfield of artificial intelligence and linguistics that studies the problems of automatic generation and understanding of natural human languages. Natural language generation systems convert information from computer databases into natural-sounding human language, and natural language understanding systems convert samples of human language into more formal representations that are easier for computer programs to manipulate. The fuzzy-logic-based approach provides another alternative for effective natural language analysis. It is commonly recognized that many phenomena in natural language lend themselves to description by fuzzy mathematics, including fuzzy sets, fuzzy relations, and fuzzy logic. By defining a fuzzy logic system and acquiring proper rules, we hope that difficulties in the analysis of speech can be alleviated. The goal of NLP is to enable communication between people and computers without resorting to memorization of complex commands and procedures.
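For readers unfamiliar with the fuzzy machinery mentioned here, the following small sketch (illustrative values only) shows a fuzzy set for the linguistic term "tall" and Zadeh's "very" hedge, the kind of graded description that fuzzy mathematics brings to natural language:

```python
# A minimal sketch of a fuzzy set for a natural-language term.
def tall(height_cm):
    """Membership in the fuzzy set 'tall': 0 below 160 cm, 1 above 185 cm,
    rising linearly in between (illustrative thresholds)."""
    if height_cm <= 160:
        return 0.0
    if height_cm >= 185:
        return 1.0
    return (height_cm - 160) / 25

def very(membership):
    """Classic concentration hedge: 'very X' squares the membership degree."""
    return membership ** 2

for h in (158, 170, 180, 190):
    print(h, round(tall(h), 2), round(very(tall(h)), 2))
```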


2020 ◽  
Vol 2020 ◽  
pp. 1-14
Author(s):  
Nicholas A. I. Omoregbe ◽  
Israel O. Ndaman ◽  
Sanjay Misra ◽  
Olusola O. Abayomi-Alli ◽  
Robertas Damaševičius

The use of natural language processing (NLP) methods and their application to developing conversational systems for health diagnosis increases patients’ access to medical knowledge. In this study, a chatbot service was developed for the Covenant University Doctor (CUDoctor) telehealth system based on fuzzy logic rules and fuzzy inference. The service focuses on assessing the symptoms of tropical diseases in Nigeria. The Telegram Bot Application Programming Interface (API) was used to connect the chatbot to the system, while the Twilio API was used for connectivity between the system and short messaging service (SMS) subscribers. The service uses a knowledge base consisting of known facts on diseases and symptoms acquired from medical ontologies. A fuzzy support vector machine (SVM) is used to predict the disease from the symptoms entered. Users’ inputs are recognized by NLP and forwarded to CUDoctor for decision support. Finally, a notification message indicating the end of the diagnosis process is sent to the user. The result is a medical diagnosis system that provides a personalized diagnosis, using self-reported input from users to diagnose diseases effectively. The usability of the developed system was evaluated using the System Usability Scale (SUS), yielding a mean SUS score of 80.4, which indicates an overall positive evaluation.
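The study's classifier is a fuzzy SVM, but the flavor of the fuzzy rule evaluation it also relies on can be sketched much more simply; the symptom and disease names below are hypothetical and not CUDoctor's actual knowledge base:

```python
# Hedged sketch: a tiny fuzzy inference step mapping graded symptom severities
# to disease scores, using min for fuzzy AND and max for fuzzy OR.
symptoms = {"fever": 0.8, "headache": 0.6, "joint_pain": 0.3}  # user-reported degrees

# Each rule: (required symptoms, diagnosis); rule strength = min of its antecedents.
rules = [
    (("fever", "headache"), "malaria"),
    (("fever", "joint_pain"), "dengue"),
]

scores = {}
for antecedents, disease in rules:
    strength = min(symptoms.get(s, 0.0) for s in antecedents)  # fuzzy AND
    scores[disease] = max(scores.get(disease, 0.0), strength)  # fuzzy OR over rules

print(max(scores.items(), key=lambda kv: kv[1]))  # ('malaria', 0.6)
```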


2018 ◽  
Vol 132 ◽  
pp. 1375-1384 ◽  
Author(s):  
Charu Gupta ◽  
Amita Jain ◽  
Nisheeth Joshi

2015 ◽  
Vol 8 (2) ◽  
Author(s):  
Nisa Kurniasih Wangsanegara ◽  
Beki Subaeki

Ejaan yang Disempurnakan (EYD, Perfected Spelling) is an important aspect of writing a document. Spelling must conform to the standard rules issued by the Ministry of National Education. The most frequent errors occur in the writing of words, punctuation, and capital letters. This application identifies and counts the number of capitalization/word and punctuation errors. The EYD-accuracy measure was built using the Tsukamoto fuzzy logic method. The steps of the Tsukamoto method are fuzzification, rule construction, the inference engine, and defuzzification. The application currently contains 31,759 words, most of which are taken from the Kamus Besar Bahasa Indonesia. The application was built as a website using PHP, with MySQL as the database. Based on tests on 20 undergraduate-thesis abstracts, the identification results matched manual identification in 70% of cases. Keywords: EYD, writing, Fuzzy Logic Tsukamoto, PHP.
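A minimal sketch of the Tsukamoto steps listed above (fuzzification, rules, inference, defuzzification), written in Python rather than the application's PHP and using made-up membership functions and rules:

```python
# Minimal Tsukamoto fuzzy-inference sketch (illustrative membership functions
# and rules, not the application's actual rule base): error counts in, a crisp
# writing-quality score out.

def low(n, hi=20.0):    # decreasing membership: 1 with no errors, 0 at `hi` errors
    return max(0.0, min(1.0, (hi - n) / hi))

def high(n, hi=20.0):   # increasing membership: 0 with no errors, 1 at `hi` errors
    return max(0.0, min(1.0, n / hi))

def tsukamoto_quality(capital_errors, punctuation_errors):
    # Rule 1: IF capitalization errors are low AND punctuation errors are low
    #         THEN quality is good (monotonic consequent: mu(z) = (z - 60) / 40)
    a1 = min(low(capital_errors), low(punctuation_errors))   # fuzzification + AND
    z1 = 60 + 40 * a1                    # inverse of the "good" consequent at a1
    # Rule 2: IF capitalization errors are high OR punctuation errors are high
    #         THEN quality is poor (monotonic consequent: mu(z) = (80 - z) / 80)
    a2 = max(high(capital_errors), high(punctuation_errors))
    z2 = 80 - 80 * a2                    # inverse of the "poor" consequent at a2
    # Defuzzification: weighted average of the rule outputs
    return (a1 * z1 + a2 * z2) / (a1 + a2) if (a1 + a2) else 0.0

print(round(tsukamoto_quality(3, 5), 1))  # 82.5 for a text with few errors
```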


2020 ◽  
pp. 3-17
Author(s):  
Peter Nabende

Natural Language Processing for under-resourced languages is now a mainstream research area. However, there are limited studies on Natural Language Processing applications for many indigenous East African languages. As a contribution toward filling this knowledge gap, this paper focuses on evaluating the application of well-established machine translation methods to one heavily under-resourced indigenous East African language, Lumasaaba. Specifically, we review the most common machine translation methods in the context of Lumasaaba, including both rule-based and data-driven methods. We then apply a state-of-the-art data-driven machine translation method to learn models for automating translation between Lumasaaba and English using a very limited data set of parallel sentences. Automatic evaluation results show that a transformer-based Neural Machine Translation model architecture leads to consistently better BLEU scores than the recurrent neural network-based models. Moreover, the automatically generated translations can be comprehended to a reasonable extent and are usually consistent with the source-language input.
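The automatic evaluation reported here relies on BLEU; a minimal sketch of how such scores can be computed with the sacrebleu package (toy sentences, not the paper's Lumasaaba-English test data):

```python
# Illustrative BLEU evaluation: compare model outputs against reference
# translations, as in the paper's automatic evaluation of the NMT models.
import sacrebleu

hypotheses = [
    "the child is going to the market",
    "she cooked food for the visitors",
]
references = [[
    "the child is going to the market",
    "she prepared food for the visitors",
]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.1f}")
```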

