Framework for development of conceptual data modelling techniques

1991 ◽  
Vol 33 (2) ◽  
pp. 134-142 ◽  
Author(s):  
HD Crockett ◽  
J Guynes ◽  
CW Slinkman
1989 ◽  
Vol 4 (4) ◽  
pp. 205-215
Author(s):  
Daniel T. Lee

Traditional data modelling techniques of decision support systems (DSS) and modern knowledge representation methodologies of expert systems (ES) are inconsistent with one another. A new unifying model is needed to integrate the two into a unified whole. After a brief review of data modelling techniques and knowledge representation methodologies, the unifying model is described, and integrated systems are used to exemplify its usefulness.


Semantic Web ◽  
2020 ◽  
pp. 1-16
Author(s):  
Francesco Beretta

This paper addresses the issue of interoperability of data generated by historical research and heritage institutions in order to make them re-usable for new research agendas according to the FAIR principles. After introducing the symogih.org project’s ontology, it proposes a description of the essential aspects of the process of historical knowledge production. It then develops an epistemological and semantic analysis of conceptual data modelling applied to factual historical information, based on the foundational ontologies Constructive Descriptions and Situations and DOLCE, and discusses the reasons for adopting the CIDOC CRM as a core ontology for the field of historical research, but extending it with some relevant, missing high-level classes. Finally, it shows how collaborative data modelling carried out in the ontology management environment OntoME makes it possible to elaborate a communal fine-grained and adaptive ontology of the domain, provided an active research community engages in this process. With this in mind, the Data for history consortium was founded in 2017 and promotes the adoption of a shared conceptualization in the field of historical research.


2000 ◽  
Vol 56 (3) ◽  
pp. 250-278 ◽  
Author(s):  
Kalervo Järvelin ◽  
Peter Ingwersen ◽  
Timo Niemi

This article presents a novel user-oriented interface for generalised informetric analysis and demonstrates how informetric calculations can easily and declaratively be specified through advanced data modelling techniques. The interface is declarative and at a high level. Therefore it is easy to use, flexible and extensible. It enables end users to perform basic informetric ad hoc calculations easily and often with much less effort than in contemporary online retrieval systems. It also provides several fruitful generalisations of typical informetric measurements like impact factors. These are based on substituting traditional foci of analysis, for instance journals, by other object types, such as authors, organisations or countries. In the interface, bibliographic data are modelled as complex objects (non-first normal form relations) and terminological and citation networks involving transitive relationships are modelled as binary relations for deductive processing. The interface is flexible, because it makes it easy to switch focus between various object types for informetric calculations, e.g. from authors to institutions. Moreover, it is demonstrated that all informetric data can easily be broken down by criteria that foster advanced analysis, e.g. by years or content-bearing attributes. Such modelling allows flexible data aggregation along many dimensions. These salient features emerge from the query interface's general data restructuring and aggregation capabilities combined with transitive processing capabilities. The features are illustrated by means of sample queries and results in the article.
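The modelling ideas in this abstract can be sketched in a few lines of Python. This is a hypothetical illustration, not the authors' actual interface: the record fields (`journal`, `authors`, `year`), paper identifiers, and function names are all invented for the sketch. It shows bibliographic data as complex (nested) objects, the citation network as a binary relation processed transitively, and a generalised citation count whose focus of analysis can be switched between object types.

```python
from collections import defaultdict

# Hypothetical data: bibliographic records as complex objects
# (nested structures rather than flat first-normal-form rows).
records = {
    "p1": {"journal": "J1", "authors": ["A1", "A2"], "year": 1998},
    "p2": {"journal": "J2", "authors": ["A2"], "year": 1999},
    "p3": {"journal": "J1", "authors": ["A3"], "year": 1999},
}

# Citation network as a binary relation of (citing, cited) pairs.
cites = {("p2", "p1"), ("p3", "p2")}

def transitive_closure(rel):
    """Deductive (transitive) processing of a binary relation."""
    closure = set(rel)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(closure):
            for (c, d) in list(closure):
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure

def impact_by(focus):
    """Generalised citation count, broken down by a switchable
    focus attribute: 'journal', 'authors', or 'year'."""
    counts = defaultdict(int)
    for (_citing, cited) in cites:
        value = records[cited][focus]
        # Multi-valued attributes (e.g. authors) credit each value.
        for v in (value if isinstance(value, list) else [value]):
            counts[v] += 1
    return dict(counts)
```

Switching the argument of `impact_by` from `"journal"` to `"authors"` mirrors the focus-switching the abstract describes, and `transitive_closure(cites)` derives indirect citation links (here, p3 transitively cites p1) for deductive queries.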


1997 ◽  
Vol 39 (1) ◽  
pp. 15-25 ◽  
Author(s):  
P.J.M. Frederiks ◽  
A.H.M. ter Hofstede ◽  
E. Lippe

1993 ◽  
Vol 10 (1) ◽  
pp. 65-100 ◽  
Author(s):  
A.H.M. ter Hofstede ◽  
Th.P. van der Weide
