On the Meaning of Logical Rules I: Syntax Versus Semantics

1999 ◽  
pp. 215-272 ◽  
Author(s):  
Jean-Yves Girard
2018 ◽  
Author(s):  
Matthew J. Bolton ◽  
William G. Blumberg ◽  
Lara K. Ault ◽  
H. Michael Mogil ◽  
Stacie H. Hanes

Weather is important to all people, including vulnerable populations (those whose circumstances include cognitive-processing, hearing, or vision differences, physical disability, homelessness, and other scenarios and factors). Autism spectrum conditions (ASC) affect information processing and areas of neurological functioning that can inhibit the reception of hazardous weather information, which is of particular concern for weather messengers. People on the autism spectrum tend to score highly on tests of systemizing, a psychological process that heavily entails attention to detail and revolves around the creation of logical rules to explain things that occur in the world. This article reports the results of three preliminary studies examining weather salience (psychological attention to weather) and its potential relationships with systemizing in autistic people. Initial findings suggest that weather salience is enhanced among autistic individuals compared to those without the condition, and that this may be related to systemizing. These findings suggest some possible strategies for communicating weather to autistic populations and motivate future work on a conceptual model that blends systemizing and chaos theory to better understand weather salience.


Processes ◽  
2021 ◽  
Vol 9 (8) ◽  
pp. 1292
Author(s):  
Muna Mohammed Bazuhair ◽  
Siti Zulaikha Mohd Jamaludin ◽  
Nur Ezlin Zamri ◽  
Mohd Shareduwan Mohd Kasihmuddin ◽  
Mohd. Asyraf Mansor ◽  
...  

One of the influential models in the artificial neural network (ANN) research field for addressing the issue of knowledge in non-systematic logical rules is Random k Satisfiability. In this context, knowledge structure representation is also a potential application of Random k Satisfiability. Despite many attempts to represent logical rules in a non-systematic structure, previous studies have failed to consider higher-order logical rules. As the amount of information in the logical rule increases, the proposed network is unable to proceed to the retrieval phase, where the behavior of Random Satisfiability can be observed. This study approaches these issues by proposing higher-order Random k Satisfiability for k ≤ 3 in the Hopfield Neural Network (HNN). In this regard, introducing the 3 Satisfiability logical rule to the existing network increases the synaptic weight dimensions in Lyapunov's energy function and the local field. We propose an Election Algorithm (EA) to optimize the learning phase of HNN and so compensate for the high computational complexity of that phase. This research extensively evaluates the proposed model using various performance metrics. The main findings indicate the compatibility and performance of the Random 3 Satisfiability logical representation during the learning and retrieval phases via EA with HNN, in terms of error evaluations, energy analysis, similarity indices, and variability measures. The results also emphasize that the proposed Random 3 Satisfiability representation, incorporated with EA in HNN, is capable of optimizing the learning and retrieval phases compared to the conventional model, which deployed Exhaustive Search (ES).
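To make the setting concrete, the following is a minimal sketch (not the paper's actual HNN implementation) of a random 3 Satisfiability formula and the usual Hopfield-style cost: the energy counts unsatisfied clauses, so a global minimum of zero corresponds to a satisfying assignment, and the Exhaustive Search baseline simply tries all assignments. All function names and parameters here are illustrative.

```python
import random

def random_3sat(num_vars, num_clauses, rng=random.Random(0)):
    """Generate a random 3-SAT formula: each clause has three distinct
    variables, each negated with probability 0.5. Literal k means
    variable |k| is true; -k means it is false (1-indexed)."""
    clauses = []
    for _ in range(num_clauses):
        vars_ = rng.sample(range(1, num_vars + 1), 3)
        clauses.append(tuple(v if rng.random() < 0.5 else -v for v in vars_))
    return clauses

def energy(clauses, assignment):
    """Count unsatisfied clauses -- the Hopfield-style cost function:
    a global minimum (zero) corresponds to a satisfying assignment."""
    def sat(lit):
        return assignment[abs(lit)] if lit > 0 else not assignment[abs(lit)]
    return sum(0 if any(sat(l) for l in clause) else 1 for clause in clauses)

def brute_force_min(clauses, num_vars):
    """Exhaustive Search (the ES baseline): try all 2^n assignments."""
    best = None
    for bits in range(2 ** num_vars):
        a = {v: bool(bits >> (v - 1) & 1) for v in range(1, num_vars + 1)}
        e = energy(clauses, a)
        if best is None or e < best:
            best = e
    return best

formula = random_3sat(num_vars=6, num_clauses=10)
assignment = {v: True for v in range(1, 7)}
print(energy(formula, assignment) >= brute_force_min(formula, 6))  # → True
```

The exponential loop in `brute_force_min` is exactly the cost that the Election Algorithm is meant to avoid during the learning phase.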


1957 ◽  
Vol 77 (1) ◽  
pp. 93-102 ◽  
Author(s):  
L. Minio-Paluello

Cod. Florence Bibl. Nazion. Centr. Conv. Soppr. J.VI.34—formerly in Niccolò Niccoli's and St. Mark's libraries—written in a beautiful French hand of c. A.D. 1150–1200—contains the second edition of Boethius's translation of Pr. An. Many scholia, written on the margins and between the lines by the same calligraphic hand which wrote the Aristotelian text or by a hand very similar to and contemporary with it, accompany the translation in this MS. They are mainly concentrated in about one-half of the work, viz. in book i.23–30 (40b–46a) and book ii (52a–70b); quite a few accompany i. 1, 5–6, 30–45 (24a, 27b–28a, 46a–50a); almost none is to be found in i. 10–14, 17–22 (30b7–33b25, 37a25–40b10). Arrangement and writing suggest that the scribe intended to give the reader Aristotle's text together with what was available to him of an authoritative commentary. The scholia range, in nature and extent, from short glosses on single words or phrases and short summaries of sections of Aristotle's work to detailed explanations and doctrinal developments of important or difficult passages. Here and there carefully drawn diagrams illustrate logical rules and geometrical examples. The following scholia are mainly chosen from book i; others, from both books, will be given farther on.


2019 ◽  
Vol 1 ◽  
pp. 1-2
Author(s):  
Otakar Čerba

<p><strong>Abstract.</strong> Ontologies (in computer science and information science) are the essential tool for a formalised description of concepts, data, information, knowledge and other entities, as well as the relations among them. Their history is relatively old. The idea of ontologies in informatics started in the mid-1970s, but ontology as the philosophical discipline concerned with existence and the nature of reality comes from Ancient Greece. Ontologies as a part of knowledge-based systems were discussed in the 1980s. In 1993 Thomas R. Gruber defined ontology in information science as "a specification of a conceptualisation". After that, the first languages and formats for coding ontologies were developed, and a massive process of ontology construction began. For example, the Basel Register of Thesauri, Ontologies and Classifications presents about 700 ontologies and more than 1000 other tools of a similar character. The theory and development of ontologies are at a high level. However, their implementation (especially in several domains) is in its infancy.</p><p> For example, in the geographical domain there are many ontologies (called geo-ontologies), such as the FAO (Food and Agriculture Organization of the United Nations) Geopolitical Ontology, the ontologies of the USGS (United States Geological Survey) or the ontologies of Ordnance Survey. However, their implementation is usually limited to the home organisations, which provide for the management, development and updating of the ontologies. In many cases they are not an integral part of Linked Open Data (LOD). This fact can be considered a critical shortcoming, because only in connection with Linked Open Data and free data sharing and combining can the main benefits of ontologies (emphasis on semantic description, derivation of new knowledge, or complete independence) be fully appreciated.</p><p> This document describes opportunities for the implementation of ontologies in cartography. The purpose of implementing an ontology depends on its type. Four essential types of ontologies are defined: upper ontologies, domain ontologies, task ontologies and application ontologies.</p><p> Upper and domain ontologies contain general terms (in the case of upper ontologies) and domain-specific terms (in the case of domain ontologies). Annotation properties (labels, definitions or comments) usually describe these terms, which are interconnected by data properties and/or object properties and restricted by logical axioms. Such ontologies are usually provided as vocabularies or thesauri. They can be used in two ways. Domain ontologies can describe cartography as a science or human activity. In previous years several papers and articles discussed the term "cartography" and its position in the Linked Open Data space, including various ontologies, the ontological description of cartographic knowledge, and the ontological comparison of various definitions of the term "map". These activities can aim at the development of a cartographic knowledge base or the building of semantic tools such as multilingual thesauri or vocabularies.</p><p> The second way consists in the exploitation of domain ontologies containing semantic information about the data visualised by a map. In this case, such a domain ontology can be used as a tool for developing the legend of a map, especially when the map is focused on particular issues. If such an ontology is published as Linked Open Data, it is possible to generate the legend automatically as well as to reflect any changes. Such a solution enables an efficient interconnection of cartographers and domain experts. Domain ontologies can be used to define logical rules restricting and describing data, information and knowledge. These rules, and the knowledge extracted in the reasoning process, can be applied during map development. They can provide information on possible combinations of data or on the hierarchy of objects visualised by a map and described by a map legend.</p><p> Task ontologies are not focused on a complicated system of classes (representing types of objects), as domain ontologies are. They are usually based on instances (individuals) representing concrete data objects. Therefore they can be used as data resources. However, the overwhelming majority of geo-ontologies do not contain any geometry (coordinates) that would enable visualisation in a map. This apparent disadvantage shows the importance of LOD. If a task ontology is published as 5-star LOD (RDF /Resource Description Framework/ data with interconnections to external data resources, published on the Web under an open license), and identity relations (links to equivalent objects published in other data sets) are filled in, it is possible to find geometries as well as other additional information and attributes for visualisation in the LOD space.</p><p> The remaining type of ontology is called an application ontology. It is a combination of both previous kinds – domain ontology and task ontology. Application ontologies usually provide vocabularies as well as data stored in an ontological structure. Such a combination allows data correctness and integrity to be controlled by a set of logical rules. This functionality is strengthened by the rich possibilities of Description Logic (quantifiers or types of relations). Their implementation in cartography corresponds with the methods discussed in the previous paragraphs. The main advantage of the approach using an application ontology consists in a homogeneous interconnection of data and semantics.</p><p> The real implementation of ontologies, other semantic resources and Linked Open Data principles in cartography can make the web mapping development process more efficient, because a normalised semantic description enables the automation of many activities, including the derivation of new data and knowledge and the checking of data as well as of cartographic processes. Such an approach can bring cartography closer to knowledge bases and systems and realise the ideas of real-time cartography.</p><p> The research reported in this paper has been supported by the following project – Sustainability support of the centre NTIS – New Technologies for the Information Society, LO1506, Czech Ministry of Education, Youth and Sports.</p>
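As a minimal illustration of the integrity checking an application ontology provides, the sketch below applies domain/range rules to instance data in plain Python. All names (`River`, `hasLegendSymbol`, and so on) are invented for illustration; a real system would use RDF Schema or OWL rather than dictionaries.

```python
# Toy application-ontology check: class assertions plus domain/range
# rules for properties, applied to instance data. Names are
# illustrative, not from any published geo-ontology.

# instance -> asserted class
types = {
    "river_01": "River",
    "symbol_blue_line": "LegendSymbol",
}

# property -> (expected domain class, expected range class)
property_rules = {
    "hasLegendSymbol": ("River", "LegendSymbol"),
}

# (subject, property, object) assertions
triples = [
    ("river_01", "hasLegendSymbol", "symbol_blue_line"),
]

def integrity_errors(triples, types, property_rules):
    """Return a list of domain/range violations, in the spirit of the
    logical rules an application ontology imposes on its data."""
    errors = []
    for s, p, o in triples:
        if p not in property_rules:
            errors.append(f"unknown property: {p}")
            continue
        domain, rng = property_rules[p]
        if types.get(s) != domain:
            errors.append(f"{s} is not a {domain} (domain of {p})")
        if types.get(o) != rng:
            errors.append(f"{o} is not a {rng} (range of {p})")
    return errors

print(integrity_errors(triples, types, property_rules))  # → []
```

Swapping the subject and object of the triple would produce two violations, which is the kind of automatic data checking the paragraph above refers to.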


Description logic gives us the ability to reason with acceptable computational complexity while retaining the power of expressiveness. The power of description logic can be complemented by defeasible logic to manage non-monotonic reasoning. In some domains we need flexible reasoning and knowledge representation to deal with the dynamicity of those domains. In this paper, we present a DL representation for a small domain that describes the connections between different entities in a university publication system, to show how we can deal with changeability in domain rules. Automated support can be provided on the basis of defeasible logical rules to represent typicality in the knowledge base and to resolve the conflicts that might arise.
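The conflict-resolution idea can be sketched as follows, assuming a simple priority-based scheme; the individuals, predicates and priorities are invented for illustration and are not the paper's actual university-publication model.

```python
# Minimal sketch of defeasible reasoning: facts plus defeasible rules,
# where a higher-priority rule defeats a lower-priority one when their
# conclusions conflict (p vs not_p for the same atom).

facts = {"professor(ann)", "on_sabbatical(ann)"}

# (priority, premises, conclusion)
rules = [
    (1, {"professor(ann)"}, "publishes(ann)"),
    (2, {"on_sabbatical(ann)"}, "not_publishes(ann)"),
]

def defeasible_closure(facts, rules):
    """Fire all rules whose premises hold; when conclusions conflict,
    keep only the conclusion of the highest-priority rule."""
    candidates = [(prio, concl) for prio, prem, concl in rules
                  if prem <= facts]
    derived = {}  # underlying atom -> (priority, signed conclusion)
    for prio, concl in candidates:
        atom = concl.removeprefix("not_")
        if atom not in derived or prio > derived[atom][0]:
            derived[atom] = (prio, concl)
    return facts | {concl for _, concl in derived.values()}

print("not_publishes(ann)" in defeasible_closure(facts, rules))  # → True
```

Here the sabbatical rule (priority 2) defeats the default that professors publish, which is the typicality-with-exceptions behaviour the abstract describes.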


Author(s):  
Wlodzislaw Duch ◽  
Rafal Adamczak ◽  
Krzysztof Grabczewski ◽  
Grzegorz Zal

A methodology for extracting optimal sets of logical rules using neural networks and global minimization procedures has been developed. Initial rules are extracted using density-estimation neural networks with rectangular functions, or multilayered perceptron (MLP) networks trained with a constrained backpropagation algorithm that transforms MLPs into simpler networks performing logical functions. A constructive algorithm called CMLP2LN is proposed, in which rules of increasing specificity are generated consecutively by adding more nodes to the network. Neural rule extraction is followed by optimization of the rules using global minimization techniques. Estimation of the confidence of various sets of rules is discussed. The hybrid approach to rule extraction has been applied to a number of benchmark and real-life problems with very good results.
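Once crisp rules have been extracted, their confidence can be estimated on data. The sketch below (thresholds, feature names and data are invented, not the paper's) measures the coverage and accuracy of a rectangular rule of the kind density-estimation networks yield.

```python
# Estimating the confidence of an extracted logical rule on labeled
# data: coverage = fraction of examples the rule fires on, accuracy =
# fraction of covered examples with the predicted class.

data = [
    # (petal_len, petal_wid, label) -- illustrative values
    (1.4, 0.2, "setosa"),
    (4.7, 1.4, "versicolor"),
    (5.9, 2.1, "virginica"),
    (1.6, 0.4, "setosa"),
]

def rule_setosa(petal_len, petal_wid):
    """A rectangular (crisp) rule: IF petal_len < 2.5 THEN setosa."""
    return petal_len < 2.5

def confidence(rule, data, target):
    """Return (coverage, accuracy) of a rule predicting `target`."""
    covered = [(x, y, lab) for x, y, lab in data if rule(x, y)]
    if not covered:
        return 0.0, 0.0
    correct = sum(1 for _, _, lab in covered if lab == target)
    return len(covered) / len(data), correct / len(covered)

cov, acc = confidence(rule_setosa, data, "setosa")
print(cov, acc)  # → 0.5 1.0
```

In the hybrid approach described above, such measures would guide the global optimization step that trades rule specificity against coverage.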


In our previous papers, a new Ant Routing Protocol for Ad-hoc Networks inspired by ant colony optimization was presented. We introduced a new approach which decreases both node energy consumption and routing overhead within the network. The validation of our routing protocol was based on a series of simulations, whose results show that our new algorithm provides a significant improvement over other protocols. With the algorithm defined and published, we found it important to formally validate each of its components in order to avoid any conflicts, gaps or misbehaving situations. This process first requires a formal specification, which is our main concern in this paper: we propose a formal specification using inference systems based on logical rules. A formal validation using these inference systems is then carried out in order to prove the correctness, soundness, completeness and optimality of the proposal.
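An inference system of this kind can be pictured as a set of (premises, conclusion) rules whose derivable facts are the least fixpoint of forward chaining. The sketch below is a generic illustration; the predicates are invented and do not come from the protocol's actual specification.

```python
# Sketch of an inference system for formal specification: each rule is
# a (premises, conclusion) pair, and derivation closes a set of axioms
# under the rules (forward chaining to a fixpoint).

rules = [
    ({"route_request(a, b)", "neighbor(a, c)"}, "forward(c, b)"),
    ({"forward(c, b)", "neighbor(c, b)"}, "route_found(a, b)"),
]

def derive(axioms, rules):
    """Close a set of facts under the inference rules."""
    facts = set(axioms)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

axioms = {"route_request(a, b)", "neighbor(a, c)", "neighbor(c, b)"}
print("route_found(a, b)" in derive(axioms, rules))  # → True
```

Soundness and completeness proofs then argue about exactly this closure: every derivable fact corresponds to a reachable protocol state, and vice versa.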


Author(s):  
Xenia Naidenova

The purpose of this chapter is to demonstrate the possibility of transforming a large class of machine learning algorithms into commonsense reasoning processes based on well-known deductive and inductive logical rules. The concept of a good classification (diagnostic) test for a given set of positive examples lies at the basis of our approach to machine learning problems. The task of inferring all good diagnostic tests is formulated as searching for the best approximations of a given classification (a partitioning) on a given set of examples. Lattice theory is used as the mathematical language for constructing good classification tests. The algorithms for inferring good tests are decomposed into subtasks and operations that accord with the main rules of human commonsense reasoning.
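One simple reading of a diagnostic test is an attribute subset whose induced partition of the examples is class-pure. The sketch below (data and attribute names invented, and using minimal pure subsets as a stand-in for the chapter's lattice-based construction) enumerates such tests from smallest to largest.

```python
from itertools import combinations

# A "test" here: projecting the examples onto a subset of attributes
# never maps two examples of different classes to the same value tuple.

examples = [
    # (color, shape, size) -> class, purely illustrative
    (("red", "round", "big"), "pos"),
    (("red", "square", "small"), "pos"),
    (("blue", "round", "big"), "neg"),
    (("blue", "square", "small"), "neg"),
]

def is_test(attr_idx, examples):
    """True if the projection onto attr_idx is class-pure."""
    seen = {}
    for attrs, label in examples:
        key = tuple(attrs[i] for i in attr_idx)
        if seen.setdefault(key, label) != label:
            return False
    return True

def good_tests(examples, n_attrs):
    """All minimal attribute subsets that are tests: smaller subsets
    are checked first, and supersets of found tests are skipped."""
    found = []
    for r in range(1, n_attrs + 1):
        for subset in combinations(range(n_attrs), r):
            if any(set(t) <= set(subset) for t in found):
                continue
            if is_test(subset, examples):
                found.append(subset)
    return found

print(good_tests(examples, 3))  # → [(0,)]
```

Here the color attribute alone separates the classes, so it is the single minimal test; shape and size each mix positive and negative examples.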

