Building a modern data archive with React, GraphQL, and friends

Author(s):  
Christian Hettlage ◽  
Lucian Botha ◽  
Nhlavutelo Macebele ◽  
Moses Mogotsi ◽  
Sifiso Myeza ◽  
...  
MedienJournal ◽  
2017 ◽  
Vol 38 (4) ◽  
pp. 50-61 ◽  
Author(s):  
Jan Jagodzinski

This paper first briefly maps out the shift from disciplinary to control societies (what I call designer capitalism; the idea of control societies comes from Gilles Deleuze) in relation to surveillance and the mediation of life through screen cultures. It then turns to the issues of digitalization in relation to big data, which risk continuing to close off life as zoë, that is, life that is creative rather than captured via attention technologies through marketing techniques and surveillance. The last part of the paper develops the ways artists are able to resist the big data archive by turning the data in on itself, offering viewers and participants a glimpse of the current state of manipulating desire and maintaining copyright in order to keep the future closed rather than potentially open.


2020 ◽  
Vol 4 ◽  
pp. 101-106
Author(s):  
Konstantin Simonov ◽  
Alexander Matsulev

This study analyzes changes in the Equivalent Water Height (EWH) parameter over the geoid, based on satellite measurements from space systems, using the GRACE and GRACE-FO satellite data archives. The assessment covers the Earth as a whole, including land areas and the World Ocean. Anomalous states of the geoenvironment are interpreted from digital maps of the spatial distribution of the EWH parameter using a histogram approach and correlation analysis. A comparative analysis was also carried out between data from the GRACE mission and data from the new GRACE-FO satellite system, launched into orbit in the summer of 2018.
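The histogram approach and correlation analysis described in this abstract can be sketched in a few lines of NumPy. The gridded fields below are synthetic stand-ins for EWH maps (the real values would come from the GRACE/GRACE-FO archives); the grid size, units, and noise model are illustrative assumptions, not the authors' actual processing pipeline.

```python
import numpy as np

# Synthetic stand-ins for two gridded EWH fields (e.g., GRACE vs. GRACE-FO)
# on a 1-degree-style latitude/longitude grid. Values are illustrative only.
rng = np.random.default_rng(42)
lat, lon = 180, 360
ewh_grace = rng.normal(loc=0.0, scale=5.0, size=(lat, lon))        # EWH anomaly, cm
ewh_grace_fo = ewh_grace + rng.normal(scale=1.0, size=(lat, lon))  # correlated field

# Histogram approach: summarize the spatial distribution of EWH anomalies.
counts, bin_edges = np.histogram(ewh_grace, bins=50)

# Correlation analysis: compare the two missions cell by cell.
r = np.corrcoef(ewh_grace.ravel(), ewh_grace_fo.ravel())[0, 1]
print(f"Pearson r between the two fields: {r:.3f}")
```

With the noise level chosen here the two fields are strongly correlated, which mirrors the kind of mission-to-mission consistency check the abstract describes.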


2021 ◽  
Vol 8 (1) ◽  
pp. 205395172110075
Author(s):  
Jean-Christophe Plantin

Archival data processing consists of cleaning and formatting data between the moment a dataset is deposited and its publication on the archive’s website. In this article, I approach data processing by combining scholarship on invisible labor in knowledge infrastructures with a Marxian framework and show the relevance of considering data processing as factory labor. Using this perspective to analyze ethnographic data collected during a six-month participant observation at a U.S. data archive, I generate a taxonomy of the forms of alienation that data processing generates, but also the types of resistance that processors develop, across four categories: routine, speed, skill, and meaning. This synthetic approach demonstrates, first, that data processing reproduces typical forms of factory workers’ alienation: processors are asked to work along a strict standardized pipeline, at a fast pace, without acquiring substantive skills or having a meaningful involvement in their work. It reveals, second, how data processors resist the alienating nature of this workflow by developing multiple tactics along the same four categories. Seen through this dual lens, data processors are therefore not only invisible workers, but also factory workers who follow and subvert a workflow organized as an assembly line. I conclude by proposing a four-step framework to better value the social contribution of data workers beyond the archive.


Neuroforum ◽  
2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Michael Hanke ◽  
Franco Pestilli ◽  
Adina S. Wagner ◽  
Christopher J. Markiewicz ◽  
Jean-Baptiste Poline ◽  
...  

Abstract Decentralized research data management (dRDM) systems handle digital research objects across participating nodes without critically relying on central services. We present four perspectives in defense of dRDM, illustrating that, in contrast to centralized or federated research data management solutions, a dRDM system based on heterogeneous but interoperable components can offer a sustainable, resilient, inclusive, and adaptive infrastructure for scientific stakeholders: an individual scientist or laboratory, a research institute, a domain data archive or cloud computing platform, and a collaborative multisite consortium. All perspectives share the use of a common, self-contained, portable data structure as an abstraction from current technology and service choices. In conjunction, the four perspectives review how varying requirements of independent scientific stakeholders can be addressed by a scalable, uniform dRDM solution, and they present a working system as an exemplary implementation.


Author(s):  
Musavver Didem Cambaz ◽  
Mehmet Özer ◽  
Yavuz Güneş ◽  
Tuğçe Ergün ◽  
Zafer Öğütcü ◽  
...  

Abstract As the earliest institute in Turkey dedicated to locating, recording, and archiving earthquakes in the region, the Kandilli Observatory and Earthquake Research Institute (KOERI) has a long history in seismic observation, dating back to the installation of its first seismometers soon after the devastating Istanbul earthquake of 10 July 1894. Since the deployment of its first seismometer, the KOERI seismic network has grown steadily for over a century. In this article, we present the KOERI seismic network facilities as a data center for the seismological community, providing data and services through the European Integrated Data Archive (EIDA) and the Rapid Raw Strong-Motion (RRSM) database, both integrated in the Observatories and Research Facilities for European Seismology (ORFEUS). The objective of this article is to provide an overview of the KOERI seismic services within ORFEUS and to introduce some of the procedures that allow checking the health of the seismic network and the quality of the data recorded at KOERI seismic stations, which are shared through EIDA and RRSM.
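EIDA nodes such as KOERI expose waveform data through the standard FDSN web services, so a request can be expressed as a plain `fdsnws/dataselect` query URL. The sketch below only builds such a URL; the host name and the station/channel codes are assumptions for illustration and should be checked against the current EIDA node list before use.

```python
from urllib.parse import urlencode

# FDSN dataselect endpoint of the KOERI EIDA node. The "/fdsnws/dataselect/1/query"
# path follows the FDSN web service specification; the host is an assumption.
BASE = "http://eida.koeri.boun.edu.tr/fdsnws/dataselect/1/query"

# Hypothetical request: one hour of vertical broadband data from a station in
# the KO network (station and channel codes are placeholders).
params = {
    "net": "KO",
    "sta": "ISK",
    "cha": "HHZ",
    "start": "2021-01-01T00:00:00",
    "end": "2021-01-01T01:00:00",
}
url = f"{BASE}?{urlencode(params)}"
print(url)
```

In practice, clients such as ObsPy's FDSN client wrap this URL construction and return parsed waveform objects instead of raw miniSEED bytes.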


2021 ◽  
Vol 92 (3) ◽  
pp. 1854-1875 ◽  
Author(s):  
Klaus Stammler ◽  
Monika Bischoff ◽  
Andrea Brüstle ◽  
Lars Ceranna ◽  
Stefanie Donner ◽  
...  

Abstract Germany has a long history in seismic instrumentation. The first station sites were installed in regions with seismic activity. Later on, with an increasing need for seismic hazard assessment, seismological state services were established over the course of several decades, using heterogeneous technology. In parallel, scientific research and international cooperation projects triggered the establishment of institutional and nationwide networks and arrays also focusing on topics other than monitoring local or regional areas, such as recording global seismicity or verifying compliance with the Comprehensive Nuclear-Test-Ban Treaty. At each of the observatories and data centers, an extensive analysis of the recordings is performed, providing high-level data products, for example, earthquake catalogs, as a basis for supporting state or federal authorities, informing the public on topics related to seismology, and transferring information to international institutions. These data products are usually also accessible on the websites of the responsible organizations. The establishment of the European Integrated Data Archive (EIDA) led to a consolidation of existing waveform data exchange mechanisms and their definition as standards in Europe, along with a harmonization of the applied data quality assurance procedures. In Germany, the German Regional Seismic Network as national backbone network and the state networks of Saxony, Saxony-Anhalt, Thuringia, and Bavaria spearheaded the national contributions to EIDA. The benefits of EIDA are attracting additional state and university networks, which are now about to join the EIDA community.

