Marefa

2018 ◽  
Vol 14 (3) ◽  
pp. 167-183
Author(s):  
Ahmed Ktob ◽  
Zhoujun Li

This article describes how, recently, many new technologies have been introduced to the web; linked data is probably the most important. Individuals and organizations have started publishing their data on the web in adherence to a set of best practices. This data is published mostly in English; hence, only English-speaking agents can consume it. Meanwhile, although the number of Arabic users on the web is immense, few Arabic datasets are published. Publication catalogs are one of the primary sources of Arabic data that is not being exploited. Arabic catalogs provide a significant amount of meaningful data and metadata, commonly stored in Excel sheets. In this article, an effort has been made to help publishers easily and efficiently share their catalogs' data as linked data. Marefa is the first implemented tool that automatically extracts RDF triples from Arabic catalogs, aligns them with the BIBO ontology, and links them to the Arabic chapter of DBpedia. An evaluation of the framework was conducted, and statistical measures were generated during the different phases of the extraction process.
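The extraction idea the abstract describes can be sketched minimally: map already-parsed catalog rows to BIBO-typed statements and link each entity to the Arabic DBpedia chapter. This is an illustrative sketch, not Marefa's actual code; the `http://example.org/marefa/` base URI and the row keys (`title`, `author`, `dbpedia`) are assumptions.

```python
# Hedged sketch: catalog rows -> N-Triples aligned to BIBO, linked to
# the Arabic DBpedia chapter. Base URI and row keys are hypothetical.
BIBO = "http://purl.org/ontology/bibo/"
DCT = "http://purl.org/dc/terms/"
BASE = "http://example.org/marefa/"          # assumed resource base
AR_DBPEDIA = "http://ar.dbpedia.org/resource/"

def rows_to_ntriples(rows):
    """Yield one N-Triples line per statement extracted from a row."""
    for i, row in enumerate(rows):
        s = f"<{BASE}book/{i}>"
        yield f"{s} <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <{BIBO}Book> ."
        yield f'{s} <{DCT}title> "{row["title"]}"@ar .'
        yield f'{s} <{DCT}creator> "{row["author"]}"@ar .'
        if row.get("dbpedia"):  # link the entity to Arabic DBpedia
            yield f"{s} <{DCT}subject> <{AR_DBPEDIA}{row['dbpedia']}> ."

triples = list(rows_to_ntriples(
    [{"title": "مقدمة ابن خلدون", "author": "ابن خلدون", "dbpedia": "ابن_خلدون"}]))
```

In practice a tool like this would also need entity matching against DBpedia labels; here the link target is taken directly from the row for illustration.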

Author(s):  
Christian Bizer ◽  
Tom Heath ◽  
Tim Berners-Lee

The term “Linked Data” refers to a set of best practices for publishing and connecting structured data on the Web. These best practices have been adopted by an increasing number of data providers over the last three years, leading to the creation of a global data space containing billions of assertions: the Web of Data. In this article, the authors present the concept and technical principles of Linked Data, and situate these within the broader context of related technological developments. They describe progress to date in publishing Linked Data on the Web, review applications that have been developed to exploit the Web of Data, and map out a research agenda for the Linked Data community as it moves forward.
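Two of the Linked Data principles the abstract alludes to, that looking up an HTTP URI should yield useful data, and that data should link to other URIs so agents can discover more, can be illustrated with a toy in-memory "web of data". This is an assumption-laden sketch, not the authors' code; the `example.org` URIs are hypothetical.

```python
# Toy Web of Data: each URI "dereferences" to triples, and an agent
# crawls it by following object links. All URIs are illustrative.
DATA = {
    "http://example.org/alice": [
        ("http://example.org/alice", "knows", "http://example.org/bob"),
    ],
    "http://example.org/bob": [
        ("http://example.org/bob", "name", '"Bob"'),
    ],
}

def dereference(uri):
    """Looking up a URI yields useful, RDF-like data."""
    return DATA.get(uri, [])

def crawl(start):
    """Follow links between datasets to discover more data."""
    seen, frontier, triples = set(), [start], []
    while frontier:
        uri = frontier.pop()
        if uri in seen:
            continue
        seen.add(uri)
        for s, p, o in dereference(uri):
            triples.append((s, p, o))
            if o.startswith("http://"):  # principle 4: links are URIs
                frontier.append(o)
    return triples

found = crawl("http://example.org/alice")
```

On the real Web, `dereference` would be an HTTP GET with content negotiation for RDF; the dictionary stands in for that here.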


Author(s):  
Lehireche Nesrine ◽  
Malki Mimoun ◽  
Lehireche Ahmed ◽  
Reda Mohamed Hamou

The purpose of the semantic web goes well beyond the simple provision of raw data: it is a matter of linking data together. This data-meshing approach, called linked data (LD), refers to a set of best practices for publishing and interlinking data on the web. From its principles, a new context has appeared, called linked enterprise data (LED). LED is the application of linked data to the information system (IS) of the enterprise to answer the challenges of an IS, in order to obtain an agile, performant system in which internal data sources link to external data and information is accessed easily and in good time. This article focuses on using LED to support the challenges of database integration and reviews the state of the art for mapping relational databases (RDB) to RDF based on LD. Then, the authors introduce a proposal for on-demand extract-transform-load (ETL) mapping of RDB to RDF using algorithms. Finally, the authors present a conclusion and discuss their perspectives on implementing the solution.
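The on-demand RDB-to-RDF idea can be sketched in the style of a direct mapping: each row becomes a subject URI, each column a predicate, and triples are produced lazily so only the data actually requested is transformed. This is a hedged illustration under assumed names (the `example.org/db/` base URI, the `employee` table), not the authors' algorithm.

```python
import sqlite3

BASE = "http://example.org/db/"  # assumed base URI for mapped resources

def map_table(conn, table):
    """Lazily yield (subject, predicate, object) triples for a table,
    direct-mapping style: row -> subject URI, column -> predicate."""
    cur = conn.execute(f"SELECT rowid, * FROM {table}")
    cols = [d[0] for d in cur.description]
    for row in cur:
        subject = f"{BASE}{table}/{row[0]}"  # rowid keys the subject URI
        for col, value in zip(cols[1:], row[1:]):
            yield (subject, f"{BASE}{table}#{col}", value)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (name TEXT, dept TEXT)")
conn.execute("INSERT INTO employee VALUES ('Nesrine', 'CS')")
triples = list(map_table(conn, "employee"))
```

Because `map_table` is a generator, the transformation runs only as triples are consumed, which is the "on demand" half of the proposal; a full system would also customize the mapping (e.g. via R2RML-style rules) rather than mapping columns one-to-one.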


2018 ◽  
Vol 42 (1) ◽  
pp. 107-123 ◽  
Author(s):  
Danila Feitosa ◽  
Diego Dermeval ◽  
Thiago Ávila ◽  
Ig Ibert Bittencourt ◽  
Bernadette Farias Lóscio ◽  
...  

Purpose – Data providers have been increasingly publishing content as linked data (LD) on the Web. This process follows guidelines (i.e. good practices) for publishing, sharing, and connecting data on the Web. People in many areas, for instance the sciences, medicine, and government, use these practices to publish data. The LD community has proposed many practices to aid the publication of data on the Web. However, discovering these practices is a costly and time-consuming task, given how many practices the literature has produced. Moreover, the community still lacks a comprehensive understanding of how these practices are used for publishing LD. Thus, the purpose of this paper is to investigate and better understand how best practices support the publication of LD, as well as to identify to what extent they have been applied in this field.
Design/methodology/approach – The authors conducted a systematic literature review, following a predefined review protocol, to identify the primary studies that propose best practices addressing the publication of LD. The authors then identified the motivations for recommending best practices for publishing LD and looked for evidence of the benefits of using such practices. The authors also examined the data formats and areas addressed by the studies, as well as the institutions that have been publishing LD.
Findings – In summary, the main findings of this work are: there is empirical evidence of the benefits of using best practices for publishing LD, especially for defining standard practices and for the integrability and uniformity of LD; most of the studies used RDF as the data format; many areas are interested in disseminating data in a connected way; and a great variety of institutions have published data on the Web.
Originality/value – The results presented in this systematic review can be very useful to the semantic web and LD community, since the review gathers evidence from the included primary studies, forming a body of knowledge on the use of best practices for publishing LD and pointing out interesting opportunities for future research.



Author(s):  
Tobias Käfer ◽  
Benjamin Jochum ◽  
Nico Aßfalg ◽  
Leonard Nürnberg

For Read-Write Linked Data, an environment of reasoning and RESTful interaction, we investigate the use of the Guard-Stage-Milestone approach for specifying and executing user agents. We present an ontology to specify user agents. Moreover, we give operational semantics to the ontology in a rule language that allows for executing user agents on Read-Write Linked Data. We evaluate our approach formally and regarding performance. Our work shows that, despite the assumptions of this environment differing from those of the traditional environment of workflow management systems, the Guard-Stage-Milestone approach can be transferred and successfully applied to the web of Read-Write Linked Data.
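The Guard-Stage-Milestone lifecycle idea can be illustrated with a tiny step function: a stage opens when its guard condition holds over the current state, and closes once its milestone condition holds. This is an assumption-laden sketch of the general GSM idea, not the paper's ontology or rule language; the `ship-order` stage and state keys are hypothetical.

```python
# Minimal GSM-style stage lifecycle: guard opens the stage, milestone
# closes it. State is a plain dict standing in for RDF resource state.
class Stage:
    def __init__(self, name, guard, milestone):
        self.name, self.guard, self.milestone = name, guard, milestone
        self.open = False
        self.achieved = False

    def step(self, state):
        """One rule-evaluation step over the current state."""
        if not self.open and not self.achieved and self.guard(state):
            self.open = True  # guard fires: activate the stage
        if self.open and self.milestone(state):
            self.open, self.achieved = False, True  # milestone achieved

shipping = Stage(
    "ship-order",
    guard=lambda s: s.get("paid", False),
    milestone=lambda s: s.get("delivered", False),
)
shipping.step({"paid": True})                      # guard holds: opens
opened = shipping.open
shipping.step({"paid": True, "delivered": True})   # milestone: closes
done = shipping.achieved
```

In the paper's setting, the conditions would be queries over Linked Data resources and the step would be driven by rule evaluation over HTTP state, rather than over an in-memory dict.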


Author(s):  
Olaf Hartig ◽  
Juan Sequeda ◽  
Jamie Taylor ◽  
Patrick Sinclair

Author(s):  
Tim Berners-Lee ◽  
Kieron O’Hara

This paper discusses issues that will affect the future development of the Web, either increasing its power and utility, or alternatively suppressing its development. It argues for the importance of the continued development of the Linked Data Web, and describes the use of linked open data as an important component of that. Second, the paper defends the Web as a read–write medium, and goes on to consider how the read–write Linked Data Web could be achieved.


2017 ◽  
Vol 10 (2-3) ◽  
pp. 109-132 ◽  
Author(s):  
Donatella Della Ratta

In this essay, I reflect on the aesthetic, political and material implications of filming as a continuous life activity since the beginning of the 2011 uprising in Syria. I argue that the blurry, shaky and pixelated aesthetics of Syrian user-generated videos serve to construct an ethical discourse (Rancière 2009a; 2013) to address the genesis and the goal of the images produced, and to shape a political commitment to the evidence-image (Didi-Huberman 2008). However, while the unstable visuals of the handheld camera powerfully reconnect, both at a symbolic and aesthetic level, to the truthfulness of the moment of crisis in which they are generated, they fail to produce a clearer understanding of the situation and a counter-hegemonic narrative. In this article, I explore how new technologies have impacted this process of bearing witness and documenting events in real time, and how they have shaped a new understanding of the image as a networked, multiple object connected with the living archive of history, in a permanent dialogue with the seemingly endless flow of data nurtured by the Web 2.0.


2020 ◽  
Vol 8 (2) ◽  
pp. 251-268 ◽  
Author(s):  
Cristela Garcia-Spitz ◽  
Kathryn Creely

How are ethnographic photographs from the twentieth century accessed and represented in the twenty-first century? This report from the Tuzin Archive for Melanesian Anthropology at the University of California San Diego Library provides an overview of the photographic materials, arrangements and types of documentation in the archive, followed by summaries of specific digitization projects of the photographs from physician Sylvester Lambert and anthropologists Roger Keesing and Harold Scheffler, among others. Through the process of digitization and online access, ethnographic photographs are transformed and may be discovered and contextualized in new ways. Utilizing new technologies and forming broad collaborations, these digitization projects incorporate both anthropological and archival practices and also raise ethical questions. This is an in-depth look at what is digitized and how it is described to re/create meaning and context and to bring new life to these images.

