Large-Scale Ontological Reasoning via Datalog

Author(s):  
Mario Alviano ◽  
Marco Manna

Reasoning over OWL 2 is a very expensive task in general, and therefore the W3C identified tractable profiles exhibiting good computational properties. Ontological reasoning for many fragments of OWL 2 can be reduced to the evaluation of Datalog queries. This paper surveys some of these compilations, in particular the one addressing queries over Horn-SHIQ knowledge bases and its implementation in DLV2, enhanced by a new version of the Magic Sets algorithm.
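To make the compilation idea concrete, here is a minimal sketch (in Python, not DLV2 itself, with invented predicates) of how a Horn axiom such as Professor ⊑ Teacher becomes the Datalog rule teacher(X) :- professor(X) and is then evaluated bottom-up to a fixpoint; the Magic Sets rewriting, which restricts this evaluation to the facts relevant to a given query, is omitted here.

```python
# Toy Datalog evaluator: rules are (head, body) pairs of unary predicates
# sharing one variable, e.g. ("teacher", ["professor"]) encodes the rule
# teacher(X) :- professor(X), the compilation of Professor ⊑ Teacher.
rules = [
    ("teacher", ["professor"]),    # Professor ⊑ Teacher
    ("employee", ["teacher"]),     # Teacher ⊑ Employee
]
facts = {("professor", "alice"), ("teacher", "bob")}

def naive_eval(rules, facts):
    """Naive bottom-up evaluation: fire all rules until no new fact appears."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        constants = {c for (_, c) in derived}
        for head, body in rules:
            for c in constants:
                if all((b, c) in derived for b in body) and (head, c) not in derived:
                    derived.add((head, c))
                    changed = True
    return derived

print(sorted(naive_eval(rules, facts)))
# alice is derived to be a teacher and an employee; bob an employee.
```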

2021 ◽  
Author(s):  
Jose Emilio Labra-Gayo ◽  
Alejandro González Hevia ◽  
Daniel Fernández Álvarez ◽  
Ammar Ammar ◽  
Dan Brickley ◽  
...  

Knowledge graphs have successfully been adopted by academia, government, and industry to represent large-scale knowledge bases. Open and collaborative knowledge graphs such as Wikidata capture knowledge from different domains and harmonize it under a common format, making it easier for researchers to access the data while also supporting Open Science. Wikidata keeps getting bigger and better, which subsumes integration use cases. Having a large amount of data in a scopeless Wikidata offers some advantages, e.g., a unique access point and a common format, but also poses some challenges, e.g., performance. Regular Wikidata users are familiar with frequent timeouts of submitted queries. Due to its popularity, limits have been imposed to allow fair access for many; however, this suppresses many interesting and complex queries that require more computational power and resources. Replicating Wikidata on one's own infrastructure can be a solution, which also offers a snapshot of the contents of Wikidata at a given point in time. There is no need to replicate Wikidata in full: it is possible to work with subsets targeting, for instance, a particular domain. Creating those subsets has emerged as an alternative to reduce the amount and spectrum of data offered by Wikidata. Less data makes more complex queries possible while keeping compatibility with the whole of Wikidata, as the model is preserved. In this paper we report the tasks carried out as part of a Wikidata subsetting project during the Virtual BioHackathon Europe 2020 and SWAT4(HC)LS 2021, work that had already started at the NBDC/DBCLS BioHackathon 2019 in Japan, the SWAT4(HC)LS hackathon 2019, and the Virtual COVID-19 BioHackathon 2019. We describe some of the approaches we identified to create subsets, some subsets from the Life Sciences domain, as well as other use cases we discussed.
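As a hedged illustration of the query side of subsetting, the sketch below runs a small SELECT against the public Wikidata SPARQL endpoint with the SPARQLWrapper library; P31 (instance of) and Q7187 (gene) are Wikidata identifiers chosen to match the Life Sciences flavor of the examples, and the approaches reported in the paper typically build subsets from full dumps (e.g., filtered by shape expressions) rather than from the live endpoint.

```python
# Requires: pip install sparqlwrapper
from SPARQLWrapper import SPARQLWrapper, JSON

# A descriptive user agent is recommended by the Wikimedia query-service policy.
endpoint = SPARQLWrapper("https://query.wikidata.org/sparql",
                         agent="subset-sketch/0.1 (example)")
endpoint.setQuery("""
SELECT ?gene ?geneLabel WHERE {
  ?gene wdt:P31 wd:Q7187 .                                  # instance of: gene
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
}
LIMIT 10
""")
endpoint.setReturnFormat(JSON)
results = endpoint.query().convert()
for row in results["results"]["bindings"]:
    print(row["gene"]["value"], row.get("geneLabel", {}).get("value", ""))
```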


Author(s):  
Stefan Borgwardt ◽  
İsmail İlkan Ceylan ◽  
Thomas Lukasiewicz

We give a survey on recent advances at the forefront of research on probabilistic knowledge bases for representing and querying large-scale automatically extracted data. We concentrate especially on increasing the semantic expressivity of formalisms for representing and querying probabilistic knowledge (i) by giving up the closed-world assumption, (ii) by allowing for commonsense knowledge (and in parallel giving up the tuple-independence assumption), and (iii) by giving up the closed-domain assumption, while preserving some computational properties of query answering in such formalisms.
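To make point (ii) concrete, the following minimal sketch (with an invented table and probabilities) shows the tuple-independence assumption that such formalisms give up: every tuple holds independently with its own marginal probability, so the probability of a simple existential query factorizes over the tuples.

```python
from math import prod

# Hypothetical tuple-independent table: tuple -> marginal probability.
author = {("stefan", "kb_paper"): 0.9, ("thomas", "kb_paper"): 0.6}

# Q = "someone authored kb_paper" fails only if every tuple is absent, so
# P(Q) = 1 - prod(1 - p_t) under tuple independence.
p_q = 1 - prod(1 - p for p in author.values())
print(f"P(Q) = {p_q:.2f}")   # 1 - 0.1 * 0.4 = 0.96
```

Commonsense extensions drop exactly this factorization, since default rules introduce correlations between tuples.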


1994 ◽  
Vol 33 (05) ◽  
pp. 454-463 ◽  
Author(s):  
A. M. van Ginneken ◽  
J. van der Lei ◽  
J. H. van Bemmel ◽  
P. W. Moorman

Abstract: Clinical narratives in patient records are usually recorded in free text, limiting the use of this information for research, quality assessment, and decision support. This study focuses on the capture of clinical narratives in a structured format by supporting physicians with structured data entry (SDE). We analyzed and made explicit which requirements SDE should meet to be acceptable for the physician on the one hand, and to generate unambiguous patient data on the other. Starting from these requirements, we found that in order to support SDE, the knowledge on which it is based needs to be made explicit: we refer to this knowledge as descriptional knowledge. We articulate the nature of this knowledge and propose a model in which it can be formally represented. The model allows the construction of specific knowledge bases, each representing the knowledge needed to support SDE within a circumscribed domain. Data entry is made possible through a general entry program, whose behavior is determined by a combination of user input and the content of the applicable domain knowledge base. We clarify how descriptional knowledge is represented, modeled, and used for data entry to achieve SDE that meets the proposed requirements.
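As a hedged sketch of this architecture (not the paper's formal model; the domain content is invented), the following shows a general entry routine whose behavior is fixed, while a small declarative knowledge base of findings and allowed attribute values determines what can be entered, yielding unambiguous structured data.

```python
from dataclasses import dataclass, field

@dataclass
class Attribute:
    name: str
    allowed_values: list[str]      # closed value set -> unambiguous entries

@dataclass
class Finding:
    name: str
    attributes: list[Attribute] = field(default_factory=list)

# Hypothetical circumscribed domain: a tiny "cough" finding.
cough = Finding("cough", [
    Attribute("duration", ["days", "weeks", "months"]),
    Attribute("character", ["dry", "productive"]),
])

def enter(finding: Finding, choices: dict[str, str]) -> dict[str, str]:
    """Generic entry program: accepts only values the knowledge base allows."""
    record = {"finding": finding.name}
    for attr in finding.attributes:
        value = choices.get(attr.name)
        if value not in attr.allowed_values:
            raise ValueError(f"{value!r} is not a valid {attr.name}")
        record[attr.name] = value
    return record

print(enter(cough, {"duration": "weeks", "character": "dry"}))
```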


Author(s):  
Olga V. Khavanova

The second half of the eighteenth century in the lands under the sceptre of the House of Austria was a period in which a language policy developed to address the ethno-linguistic diversity of the monarchy's subjects. On the one hand, the sphere of use of the German language widened, embracing ever more segments of administration, education, and culture. On the other hand, the authorities were perfectly aware that communication in the languages and vernaculars of the nationalities living in the Austrian Monarchy was one of the principal instruments for spreading decrees and announcements from the central and local authorities to the less-educated strata of the population. Consequently, a large-scale reform of primary education was launched, aimed at making the whole population literate, regardless of social status, nationality (mother tongue), or confession. In parallel with the centrally coordinated state policy on education and language use, subjects, both language experts and amateur polyglots, joined the process of writing grammar books intended to ease communication between the different nationalities of the Habsburg lands. This article considers some examples of such editions, with primary attention given to the correlation between private initiative and governmental policies, the mechanisms for vetting the textbooks to be published, their content, and their potential readers. The paper demonstrates that it was very important for grammar-book authors to be integrated into the patronage networks at the court and in administrative bodies, and it stresses that the Vienna court controlled the selection and financing of grammar books to be published, depending on their quality and their ability to satisfy the aims and goals of state policy.


2019 ◽  
Author(s):  
Robert C. Hockett

This white paper lays out the guiding vision behind the Green New Deal Resolution proposed to the U.S. Congress by Representative Alexandria Ocasio-Cortez and Senator Ed Markey in February of 2019. It explains the senses in which the Green New Deal is 'green' on the one hand, and a new 'New Deal' on the other. It also 'makes the case' for a shamelessly ambitious, not a low-ball or slow-walked, Green New Deal agenda. At the core of the paper's argument lies the observation that only a true national mobilization on the scale of those associated with the original New Deal and the Second World War will be up to the task of comprehensively revitalizing the nation's economy, justly growing our middle class, and expeditiously achieving carbon-neutrality within the twelve-year time-frame that climate science tells us we have before reaching an environmental 'tipping point.' But this is actually good news, the paper argues. For, paradoxically, an ambitious Green New Deal will also be the most 'affordable' Green New Deal, by virtue of the enormous productivity, widespread prosperity, and attendant public-revenue benefits that large-scale public investment will bring. In effect, the Green New Deal will amount to that very transformative stimulus which the nation has awaited since the crash of 2008 and its debt-deflationary sequel.


Author(s):  
Jochen von Bernstorff

The chapter explores the notion of “community interests” with regard to the global “land-grab” phenomenon. Over the last decade, a dramatic increase of foreign investment in agricultural land could be observed. Bilateral investment treaties protect around 75 per cent of these large-scale land acquisitions, many of which came with associated social problems, such as displaced local populations and negative consequences for food security in Third World countries receiving these large-scale foreign investments. Hence, two potentially conflicting areas of international law are relevant in this context: Economic, social, and cultural rights and the principles of permanent sovereignty over natural resources and “food sovereignty” challenging large-scale investments on the one hand, and specific norms of international economic law stabilizing them on the other. The contribution discusses the usefulness of the concept of “community interests” in cases where the two colliding sets of norms are both considered to protect such interests.


Electronics ◽  
2021 ◽  
Vol 10 (4) ◽  
pp. 423
Author(s):  
Márk Szalay ◽  
Péter Mátray ◽  
László Toka

The stateless cloud-native design improves the elasticity and reliability of applications running in the cloud. The design decouples the life-cycle of application states from that of application instances; states are written to and read from cloud databases deployed close to the application code to ensure low latency bounds on state access. However, the scalability of applications inherits the well-known limitations of the distributed databases in which the states are stored. In this paper, we propose a full-fledged state layer that supports the stateless cloud application design. In order to minimize the inter-host communication due to state externalization, we propose, on the one hand, a system design jointly with a data placement algorithm that places functions' states across the hosts of a data center. On the other hand, we design a dynamic replication module that decides the proper number of copies for each state, striking a sweet spot between short state-access times and low network traffic. We evaluate the proposed methods in realistic scenarios. We show that our solution yields state-access delays close to the optimal and ensures fast replica placement decisions in large-scale settings.
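To illustrate the replication trade-off (a toy cost model, not the authors' algorithm; all rates and cost weights are invented), more replicas make local reads likelier but multiply update-synchronization traffic, so a per-state replica count can be chosen by minimizing a combined network cost:

```python
def best_replica_count(read_rate, write_rate, hosts,
                       remote_read_cost=1.0, sync_cost=1.0):
    """Replica count minimizing expected inter-host traffic per second."""
    def cost(r):
        p_local = r / hosts                        # chance the reader holds a copy
        reads = read_rate * (1 - p_local) * remote_read_cost
        writes = write_rate * (r - 1) * sync_cost  # update fan-out to other copies
        return reads + writes
    return min(range(1, hosts + 1), key=cost)

# Read-heavy states are replicated widely; write-heavy states keep one copy.
print(best_replica_count(read_rate=1000, write_rate=10, hosts=8))   # -> 8
print(best_replica_count(read_rate=10, write_rate=1000, hosts=8))   # -> 1
```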


Genetics ◽  
2003 ◽  
Vol 165 (4) ◽  
pp. 2269-2282
Author(s):  
D Mester ◽  
Y Ronin ◽  
D Minkov ◽  
E Nevo ◽  
A Korol

Abstract This article is devoted to the problem of ordering in linkage groups with many dozens or even hundreds of markers. The ordering problem belongs to the field of discrete optimization on a set of all possible orders, amounting to n!/2 for n loci; hence it is considered an NP-hard problem. Several authors attempted to employ the methods developed in the well-known traveling salesman problem (TSP) for multilocus ordering, using the assumption that for a set of linked loci the true order will be the one that minimizes the total length of the linkage group. A novel, fast, and reliable algorithm developed for the TSP and based on evolution-strategy discrete optimization was applied in this study for multilocus ordering on the basis of pairwise recombination frequencies. The quality of derived maps under various complications (dominant vs. codominant markers, marker misclassification, negative and positive interference, and missing data) was analyzed using simulated data with ∼50-400 markers. High performance of the employed algorithm allows systematic treatment of the problem of verification of the obtained multilocus orders on the basis of computing-intensive bootstrap and/or jackknife approaches for detecting and removing questionable marker scores, thereby stabilizing the resulting maps. Parallel calculation technology can easily be adopted for further acceleration of the proposed algorithm. Real data analysis (on maize chromosome 1 with 230 markers) is provided to illustrate the proposed methodology.
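To make the objective concrete, here is a minimal sketch (invented toy data, and a bare (1+1) evolution strategy rather than the authors' full algorithm) in which a candidate order is scored by the total linkage-group length, i.e., the sum of recombination fractions between adjacent loci, and mutated by random segment reversals as in 2-opt for the TSP:

```python
import random

def map_length(order, rf):
    """Total map length: sum of adjacent pairwise recombination fractions."""
    return sum(rf[order[i]][order[i + 1]] for i in range(len(order) - 1))

def es_order(rf, iters=20000, seed=0):
    """(1+1) evolution strategy: keep a segment-reversal mutant if not worse."""
    rng = random.Random(seed)
    order = list(range(len(rf)))
    rng.shuffle(order)
    best = map_length(order, rf)
    for _ in range(iters):
        i, j = sorted(rng.sample(range(len(rf)), 2))
        cand = order[:i] + order[i:j + 1][::-1] + order[j + 1:]
        length = map_length(cand, rf)
        if length <= best:
            order, best = cand, length
    return order, best

# Toy data: 6 loci whose true order is 0..5; rf grows with map distance.
rf = [[abs(i - j) * 0.05 for j in range(6)] for i in range(6)]
print(es_order(rf))   # recovers 0..5 (or its reverse, which is equivalent)
```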


2017 ◽  
Vol 899 ◽  
pp. 173-178 ◽  
Author(s):  
Ronydes Batista Jr. ◽  
Bruna Sene Alves Araújo ◽  
Pedro Ivo Brandão e Melo Franco ◽  
Beatriz Cristina Silvério ◽  
Sandra Cristina Danta ◽  
...  

In view of the constant search for new sources of renewable energy, the reuse of particulate agro-industrial waste emerges as an advantageous alternative. However, despite the advantages of using biomass as an energy source, there is still strong resistance to the large-scale replacement of petroleum products, due to the lack of scientifically proven, efficient conversion technologies. In this context, pyrolysis is one of the most widely used thermal decomposition processes. Knowledge of chemical kinetics, thermodynamics, and heat and mass transfer is therefore important, since these aspects influence the quality of the product. This paper presents a kinetic study of the slow pyrolysis of coffee grounds waste based on dynamic thermogravimetric (TG) experiments, using different powder catalysts. The primary thermal decomposition was described by the one-step reaction model, which considers a single global reaction. The kinetic parameters were estimated using nonlinear regression and the differential evolution method. The coffee grounds waste was dried at 105°C for 24 hours. The untreated sample was analyzed at different heating rates, namely 10, 15, 20, 30, and 50 K/min. In the catalytic pyrolysis, about 5% (w/w) of catalyst was added to the sample, at a heating rate of 30 K/min. The results show that the one-step model does not accurately represent the weight-loss (TG) data and its derivative (DTG), but it can provide an estimate of the activation energy of the reaction and can reveal the differences caused by the catalysts. Although nothing can be said about the products formed with the addition of the catalysts, as that would require micro-pyrolysis analysis, the influence of the catalysts on the samples can be assessed based on the data obtained in the thermogravimetric tests.
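For concreteness, a hedged sketch of the one-step global model and a differential-evolution fit follows; the rate law is dα/dT = (A/β) exp(-E/(RT)) (1-α)^n, where α is the conversion and β the heating rate, and the data below are synthetic stand-ins for experimental TG curves, with all parameter values illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import differential_evolution

R = 8.314        # gas constant, J/(mol K)
beta = 30 / 60   # heating rate, K/s (30 K/min)

def model_alpha(T, A, E, n):
    """Integrate dalpha/dT = (A/beta) exp(-E/(R T)) (1-alpha)^n over T."""
    rhs = lambda t, a: [(A / beta) * np.exp(-E / (R * t)) * max(1 - a[0], 0.0) ** n]
    return solve_ivp(rhs, (T[0], T[-1]), [0.0], t_eval=T).y[0]

def sse(params, T, alpha_exp):
    logA, E, n = params   # fit log10(A) to keep the search space well scaled
    return np.sum((model_alpha(T, 10.0 ** logA, E, n) - alpha_exp) ** 2)

# Synthetic conversion curve standing in for a TG experiment at 30 K/min.
T = np.linspace(450, 800, 50)
alpha_exp = model_alpha(T, 1e8, 1.2e5, 1.5)

result = differential_evolution(sse, bounds=[(5, 12), (5e4, 3e5), (0.5, 3)],
                                args=(T, alpha_exp), seed=1)
print(result.x)  # recovered (log10 A, E, n), here ~ (8, 1.2e5, 1.5)
```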


2020 ◽  
Vol 23 (3) ◽  
pp. 189-205 ◽  
Author(s):  
Françoise Naudillon

The documentary film C’est ma terre by Fabrice Bouckat, screened during the 2019 edition of Terrafestival, is one of the first large-scale films produced locally on the crisis of the chlordecone molecule. This article examines, from a decolonial perspective, how its director, a Martinican of Gabonese origin who lives and works in Guadeloupe, develops a synthetic and universal vision of environmental crises, and thus demonstrates that the destruction of ecosystems crosses time and space, cultures and lands, languages and peoples, by bringing the ecological crisis in the West Indies closer to the one experienced by the Vietnamese victims of Agent Orange.

