Making ECMWF Open Data more easily accessible via cloud-based services

2021 ◽  
Author(s):  
Julia Wagemann ◽  
Umberto Modigliani ◽  
Stephan Siemen ◽  
Vasileios Baousis ◽  
Florian Pappenberger

The European Centre for Medium-Range Weather Forecasts (ECMWF) is moving gradually towards an open data licence, aiming to make real-time forecast data available under a full, free and open data licence by 2025. The introduction of open data policies generally leads to an increase in data requests and a broader user base. Therefore, a much larger community of diverse users will be interested in accessing, understanding and using ECMWF Open Data (real-time). While an open data licence is an important prerequisite, it does not automatically lead to an increased uptake of open data. To increase the uptake of (open) data, Wilkinson et al. (2016) defined the FAIR principles, which emphasize the need to make data better 'findable', 'accessible', 'interoperable' and 'reusable'.

In 2019, we conducted a web-based survey among users of big Earth data to obtain a better understanding of users' needs in terms of the data they are interested in, the applications they need the data for, the way they access and process data, and the challenges they face. The results show that users are particularly interested in meteorological and climate forecast data, but face challenges related to growing data volumes, data heterogeneity and limited processing capacities. At the same time, survey respondents showed an interest in using cloud-based services in the near future, but expressed the need for easier data discovery and interoperability of data systems. Moreover, an ECMWF-supported activity that made a subset of ERA5 climate reanalysis data available to the user community of the Google Earth Engine platform revealed that interoperability of data systems is a growing bottleneck.

Conclusions from both activities are helping ECMWF to define the way forward to make ECMWF Open Data (real-time) more accessible via cloud-based services. In this presentation we would like to share and discuss lessons learned about making open data more easily 'accessible' and 'interoperable' and the role cloud-based services play in doing so. We will also cover our future plans.
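As an illustration of what 'accessible' can mean in practice, the sketch below retrieves a small ECMWF Open Data (real-time) field and opens it for analysis. It is a minimal sketch, assuming the ecmwf-opendata Python client and the cfgrib/xarray stack are installed; the parameter and forecast-step choices are illustrative placeholders, not a prescription from the presentation.

```python
# Minimal sketch, assuming the ecmwf-opendata client and cfgrib are installed
# (pip install ecmwf-opendata cfgrib xarray); field/step choices are illustrative.
from ecmwf.opendata import Client
import xarray as xr

client = Client(source="ecmwf")

# Download 2 m temperature from the latest high-resolution forecast run.
client.retrieve(
    type="fc",
    step=24,
    param="2t",
    target="ecmwf_open_data.grib2",
)

# Open the GRIB file as a labelled dataset for further analysis.
ds = xr.open_dataset("ecmwf_open_data.grib2", engine="cfgrib")
print(ds)
```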

2020 ◽  
Author(s):  
Timo Benjamin Roettger ◽  
Michael Franke ◽  
Jennifer Cole

Real-time speech comprehension is challenging because communicatively relevant information is distributed throughout the entire utterance. In five mouse-tracking experiments on German and American English, we probe whether listeners can, in principle, use early intonational information to anticipate upcoming referents. Listeners had to select a speaker-intended referent with their mouse guided by intonational cues, allowing for anticipation by moving their hand toward the referent prior to lexical disambiguation. While German listeners (Exps. 1-3) seemed to ignore early pitch cues, American English listeners (Exps. 4-5) were in principle able to use these early pitch cues to anticipate upcoming referents. However, many listeners showed no indication of doing so. These results suggest that there are important positional asymmetries in the way intonational information is integrated, with early information receiving less attention than later cues in the utterance. Open data, scripts, and materials can be retrieved here: https://osf.io/xf8be/.


2018 ◽  
Author(s):  
Dick Bierman ◽  
Jacob Jolij

We have tested the feasibility of a method to prevent the occurrence of so-called Questionable Research Practices (QRPs). Apart from embedded pre-registration, the major aspect of the system is real-time uploading of data to a secure server. We outline the method, discuss the treatment of drop-outs, compare it to the Born-open data method, and report on our preliminary experiences. We also discuss the extension of the data-integrity system from a secure server to the use of blockchain technology.
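One way such a data-integrity system can anchor uploaded records, whether on a secure server or a blockchain, is by chaining cryptographic hashes so that any retroactive edit to a trial's data invalidates every subsequent entry. The sketch below is a minimal illustration of that idea, not the authors' implementation; the record fields and function names are hypothetical.

```python
# Minimal hash-chain sketch (not the authors' system); record fields are hypothetical.
import hashlib
import json

def append_record(chain, record):
    """Append a data record, linking it to the hash of the previous entry."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"prev": prev_hash, "record": record}, sort_keys=True)
    entry = {"prev": prev_hash, "record": record,
             "hash": hashlib.sha256(payload.encode()).hexdigest()}
    chain.append(entry)
    return entry

def verify(chain):
    """Recompute every hash; any retroactive change breaks the chain."""
    prev_hash = "0" * 64
    for entry in chain:
        payload = json.dumps({"prev": prev_hash, "record": entry["record"]},
                             sort_keys=True)
        if entry["prev"] != prev_hash or \
           entry["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
        prev_hash = entry["hash"]
    return True

chain = []
append_record(chain, {"participant": 1, "trial": 1, "response": 0.82})
append_record(chain, {"participant": 1, "trial": 2, "response": 0.47})
print(verify(chain))  # True; editing any earlier record makes this False
```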


2021 ◽  
Vol 83 (2) ◽  
Author(s):  
S. Engwell ◽  
L. Mastin ◽  
A. Tupper ◽  
J. Kibler ◽  
P. Acethorp ◽  
...  

Abstract Understanding the location, intensity, and likely duration of volcanic hazards is key to reducing risk from volcanic eruptions. Here, we use a novel near-real-time dataset comprising Volcanic Ash Advisories (VAAs) issued over 10 years to investigate global rates and durations of explosive volcanic activity. The VAAs were collected from the nine Volcanic Ash Advisory Centres (VAACs) worldwide. The information extracted allowed analysis of the frequency and type of explosive behaviour, including analysis of key eruption source parameters (ESPs) such as volcanic cloud height and duration. The results reflect changes in the VAA reporting process, data sources, and volcanic activity through time. The data show an increase in the number of VAAs issued since 2015 that cannot be directly correlated to an increase in volcanic activity. Instead, many represent increased observations, including improved capability to detect low- to mid-level volcanic clouds (FL101–FL200, 3–6 km asl) by higher temporal, spatial, and spectral resolution satellite sensors. Comparison of ESP data extracted from the VAAs with the Mastin et al. (J Volcanol Geotherm Res 186:10–21, 2009a) database shows that traditional assumptions used in the classification of volcanoes could be much simplified for operational use. The analysis highlights the VAA data as an exceptional resource documenting global volcanic activity on timescales that complement more widely used eruption datasets.
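For readers unfamiliar with the flight-level notation used in VAAs, the quoted FL101–FL200 band can be converted to altitude with the standard definition of a flight level (hundreds of feet of pressure altitude); the short sketch below reproduces the approximate 3–6 km range stated above.

```python
# Convert aviation flight levels (hundreds of feet) to kilometres.
FT_TO_M = 0.3048

def flight_level_to_km(fl):
    """FL101 -> ~3.1 km, FL200 -> ~6.1 km (pressure altitude, not geometric)."""
    return fl * 100 * FT_TO_M / 1000.0

for fl in (101, 200):
    print(f"FL{fl} ≈ {flight_level_to_km(fl):.1f} km")
```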


Healthcare ◽  
2021 ◽  
Vol 9 (7) ◽  
pp. 915
Author(s):  
Irena Duś-Ilnicka ◽  
Aleksander Szymczak ◽  
Małgorzata Małodobra-Mazur ◽  
Miron Tokarski

Since the 2019 novel coronavirus outbreak began in Wuhan, China, diagnostic methods in the field of molecular biology have been developing faster than ever under the vigilant eye of the world's research community. Unfortunately, the medical community was not prepared for testing such large volumes or ranges of biological materials, whether blood samples for antibody immunological testing or salivary/swab samples for real-time PCR. For this reason, many medical diagnostic laboratories have made the switch to working in the field of molecular biology, and research has been undertaken to speed up the flow of samples through the laboratory. The aim of this narrative review is to evaluate the current literature on laboratory techniques for the diagnosis of SARS-CoV-2 infection available on pubmed.gov and Google Scholar, and according to the writers' knowledge and experience of laboratory medicine. It assesses the available information in the field of molecular biology by comparing real-time PCR, the LAMP technique, RNA sequencing, and immunological diagnostics, and examines the newest techniques along with their limitations for use in SARS-CoV-2 diagnostics.


Processes ◽  
2021 ◽  
Vol 9 (5) ◽  
pp. 737
Author(s):  
Chaitanya Sampat ◽  
Rohit Ramachandran

The digitization of manufacturing processes has led to an increase in the availability of process data, which has enabled the use of data-driven models to predict the outcomes of these manufacturing processes. Data-driven models are near-instantaneous to simulate and can provide real-time predictions, but lack any governing physics within their framework. When process data deviate from the original conditions, the predictions from these models may not agree with physical boundaries. In such cases, the use of first-principles-based models to predict process outcomes has proven to be effective but computationally inefficient, and such models cannot be solved in real time. Thus, there remains a need to develop efficient data-driven models with a physical understanding of the process. In this work, we demonstrate the addition of physics-based boundary constraints to a neural network to improve its predictability of granule density and granule size distribution (GSD) for a high-shear granulation process. The physics-constrained neural network (PCNN) was better at predicting granule growth regimes when compared to other neural networks with no physical constraints. When input data that violated the physics-based boundaries were provided, the PCNN identified these points more accurately than the non-physics-constrained neural networks, with an error of <1%. A sensitivity analysis of the PCNN to the input variables was also performed to understand individual effects on the final outputs.
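To make the idea of physics-based constraints concrete, the sketch below adds soft penalty terms to an otherwise standard regression loss so that the network is discouraged from predicting physically impossible granule properties. It is a minimal sketch in PyTorch under assumed bounds (a density envelope, and GSD fractions that must be non-negative and sum to one); the paper's actual constraints, architecture, and variable names are not reproduced here.

```python
# Minimal physics-constrained loss sketch; the bounds and layer sizes are
# illustrative assumptions, not the published PCNN configuration.
import torch
import torch.nn as nn

class GranuleNet(nn.Module):
    def __init__(self, n_inputs=5, n_outputs=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_inputs, 32), nn.ReLU(),
            nn.Linear(32, 32), nn.ReLU(),
            nn.Linear(32, n_outputs),  # [granule density, 3 GSD fractions]
        )

    def forward(self, x):
        return self.net(x)

def physics_constrained_loss(pred, target, rho_min=400.0, rho_max=1600.0, lam=10.0):
    """Data-fit term plus soft penalties for physically impossible outputs."""
    mse = nn.functional.mse_loss(pred, target)
    density = pred[:, 0]
    fractions = pred[:, 1:]
    # Penalise densities outside the assumed envelope (kg/m^3).
    p_density = torch.relu(rho_min - density) + torch.relu(density - rho_max)
    # Penalise negative size fractions and fractions not summing to 1.
    p_gsd = torch.relu(-fractions).sum(dim=1) + (fractions.sum(dim=1) - 1.0).abs()
    return mse + lam * (p_density.mean() + p_gsd.mean())
```

During training, the penalty terms are zero whenever the prediction respects the bounds, so the constrained and unconstrained networks behave identically on physically plausible data and differ only where extrapolation would violate the physics.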


2012 ◽  
Vol 39 (9) ◽  
pp. 1072-1082 ◽  
Author(s):  
Ali Montaser ◽  
Ibrahim Bakry ◽  
Adel Alshibani ◽  
Osama Moselhi

This paper presents an automated method for estimating the productivity of earthmoving operations in near-real-time. The developed method utilizes the Global Positioning System (GPS) and Google Earth to extract the data needed to perform the estimation process. A GPS device is mounted on a hauling unit to capture the spatial data along designated hauling roads for the project. The variations in the captured cycle times were used to model the uncertainty associated with the operation involved. This was carried out by automated classification, data fitting, and computer simulation. The automated classification is applied through a spreadsheet application that classifies GPS data and accordingly identifies the durations of different activities in each cycle using spatial coordinates and directions captured by GPS and recorded on its receiver. The data fitting was carried out using commercially available software to generate the probability distribution functions used in the simulation software "Extend V.6". The simulation was utilized to balance the production of an excavator with that of the hauling units. A spreadsheet application was developed to perform the calculations. An example of an actual project was analyzed to demonstrate the use of the developed method and illustrate its essential features. The analyzed case study demonstrates how the proposed method can assist project managers in taking corrective actions based on the near-real-time actual data captured and processed to estimate the productivity of the operations involved.
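As an illustration of the kind of automated classification described above, the sketch below assigns time-stamped GPS fixes to loading, hauling, dumping, and return activities using assumed zone coordinates and a distance threshold, then sums the duration of each activity. It is a simplified sketch, not the authors' spreadsheet application; the zone locations, radius, and field names are hypothetical.

```python
# Simplified sketch of classifying truck GPS fixes into cycle activities;
# zone centres and the radius are hypothetical placeholders.
import math

LOAD_ZONE = (30.0444, 31.2357)   # (lat, lon) of the excavator / loading area
DUMP_ZONE = (30.0600, 31.2500)   # (lat, lon) of the dump area
ZONE_RADIUS_M = 75.0

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon) points."""
    r = 6371000.0
    la1, lo1, la2, lo2 = map(math.radians, (*p, *q))
    a = math.sin((la2 - la1) / 2) ** 2 + \
        math.cos(la1) * math.cos(la2) * math.sin((lo2 - lo1) / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def classify_fix(prev_label, point):
    """Label a GPS fix as loading, dumping, hauling, or returning."""
    if haversine_m(point, LOAD_ZONE) <= ZONE_RADIUS_M:
        return "loading"
    if haversine_m(point, DUMP_ZONE) <= ZONE_RADIUS_M:
        return "dumping"
    # Outside both zones: the previous state decides haul vs. return leg.
    return "hauling" if prev_label in ("loading", "hauling") else "returning"

def activity_durations(track):
    """track: list of (timestamp_s, lat, lon). Returns seconds per activity."""
    durations = {"loading": 0.0, "hauling": 0.0, "dumping": 0.0, "returning": 0.0}
    label = "returning"
    for (t0, *p0), (t1, *_) in zip(track, track[1:]):
        label = classify_fix(label, tuple(p0))
        durations[label] += t1 - t0
    return durations
```

Cycle times obtained this way could then be fitted to probability distributions and fed into a discrete-event simulation to balance excavator and hauler production, as the method describes.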


2021 ◽  
Author(s):  
Yessica Fransisca ◽  
Karinka Adiandra ◽  
Vinda Manurung ◽  
Laila Warkhaida ◽  
M. Aidil Arham ◽  
...  

Abstract This paper describes the combination of strategies deployed to optimize horizontal well placement in a 40 ft thick isotropic sand with very low resistivity contrast against an underlying anisotropic shale in the Semoga field. These strategies were developed after previously unsuccessful attempts to drill a horizontal well with multiple side-tracks, which was finally drilled and completed as a highly inclined well. To maximize reservoir contact of the subject horizontal well, a new well placement methodology was developed by applying lessons learned and taking into account the additional challenges within this well. The first step was to conduct a thorough analysis of the previous inclined well to evaluate each formation layer's anisotropy ratio for use in an effective geosteering model that could better simulate the real-time environment. Correct selection of geosteering tools, based on comprehensive pre-well modelling, was made to ensure an on-target landing section and facilitate an effective lateral section. A comprehensive geosteering pre-well model was constructed to guide real-time operations. In the subject horizontal well, the landing strategy was analysed in four stages of anisotropy ratio. The lateral section strategy focused on how to cater for the expected fault and maintain the trajectory to maximize reservoir exposure. Execution of the geosteering operations resulted in 100% reservoir contact. By monitoring the behaviour of the shale anisotropy ratio from resistivity measurements and gamma ray at-bit data while drilling, the subject well was precisely landed at 11.5 ft TVD below the top of the target sand. In the lateral section, the wellbore trajectory intersected two faults exhibiting greater associated throw than the seismic estimate. Resistivity geo-signal and azimuthal resistivity responses were used to maintain the wellbore attitude inside the target reservoir. In this case-history well, in a low-resistivity-contrast environment, the methodology successfully enabled efficient operations to land the well precisely at the target with minimum borehole tortuosity. This was achieved by reducing geological uncertainty due to anomalous resistivity data responding to shale electrical anisotropy. Recognition of these electromagnetic resistivity values also played an important role in identifying the overlying anisotropic shale layer, hence avoiding reservoir exit. This workflow also helps in benchmarking future horizontal well placement operations in the Semoga Field. Technical Categories: Geosteering and Well Placement, Reservoir Engineering, Low Resistivity Low Contrast Reservoir Evaluation, Real-Time Operations, Case Studies
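For context on the anisotropy analysis mentioned above, the electrical anisotropy of a shale is commonly expressed as the ratio of vertical to horizontal resistivity; the sketch below computes that ratio and the equivalent anisotropy coefficient from assumed resistivity values. The numbers are illustrative only and do not come from the Semoga field data.

```python
import math

def anisotropy(rv_ohm_m, rh_ohm_m):
    """Return the resistivity anisotropy ratio Rv/Rh and the
    anisotropy coefficient lambda = sqrt(Rv/Rh)."""
    ratio = rv_ohm_m / rh_ohm_m
    return ratio, math.sqrt(ratio)

# Illustrative values only: an isotropic sand vs. an anisotropic shale.
print(anisotropy(2.0, 2.0))   # (1.0, 1.0)  -> isotropic
print(anisotropy(8.0, 2.0))   # (4.0, 2.0)  -> strongly anisotropic
```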


1981 ◽  
Vol 71 (4) ◽  
pp. 1351-1360
Author(s):  
Tom Goforth ◽  
Eugene Herrin

Abstract An automatic seismic signal detection algorithm based on the Walsh transform has been developed for short-period data sampled at 20 samples/sec. Since the amplitude of a Walsh function is either +1 or −1, the Walsh transform can be accomplished in a computer with a series of shifts and fixed-point additions. The savings in computation time make it possible to compute the Walsh transform and to perform prewhitening and band-pass filtering in the Walsh domain with a microcomputer for use in real-time signal detection. The algorithm was initially programmed in FORTRAN on a Raytheon Data Systems 500 minicomputer. Tests utilizing seismic data recorded in Dallas, Albuquerque, and Norway indicate that the algorithm has a detection capability comparable to that of a human analyst. Programming of the detection algorithm in machine language on a Z80 microprocessor-based computer has been accomplished; run time on the microcomputer is approximately 1/10 real time. The detection capability of the Z80 version of the algorithm is not degraded relative to the FORTRAN version.
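Because the Walsh functions take only the values +1 and −1, the transform reduces to butterflies of additions and subtractions, which is what makes it attractive for fixed-point hardware. The sketch below is a generic fast Walsh–Hadamard transform in that spirit, not the authors' FORTRAN or Z80 implementation.

```python
def fwht(samples):
    """Fast Walsh-Hadamard transform of a power-of-two-length sequence,
    using only additions and subtractions (no multiplications)."""
    x = list(samples)
    n = len(x)
    assert n and n & (n - 1) == 0, "length must be a power of two"
    h = 1
    while h < n:
        for start in range(0, n, 2 * h):
            for i in range(start, start + h):
                a, b = x[i], x[i + h]
                x[i], x[i + h] = a + b, a - b
        h *= 2
    return x

# Example: an 8-sample window of integer seismic counts.
print(fwht([1, 0, 1, 0, 0, 1, 1, 0]))
```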


2016 ◽  
Vol 3 (1) ◽  
pp. 1
Author(s):  
Luis Roberto Vega-González

In this paper it is proposed that, similar to the evolution and maturation of any organization, the Linking and Management of Technology Office (L & MoT) of a Mexican public R&D centre has evolved and is on the way to being transformed into a Technology Transfer Office (TTO). The case of the fifteen-year evolution of the Centro de Ciencias Aplicadas y Desarrollo Tecnológico L & MoT presents empirical evidence to identify the main phases and actions that have been driving this process over that time. Standard results obtained through the years using the L & MoT Management of Technology Model (MoT) are presented. Emphasis is placed, in a final section, on the lessons obtained from non-standard results coming from unsuccessful negotiations and failed linking actions between the Centre and some external organizations. Experience has shown that not all negotiations are successful but, curiously, the best lessons for the personnel of a technology transfer office are probably derived from these problematic cases.

