Browser-based state-of-the-art software for automated data reduction, quality control and dissemination of marine carbon data

Author(s):  
Benjamin Pfeil ◽  
Steve Jones ◽  
Maren Karlsen ◽  
Camilla Stegen Landa ◽  
Rocio Castano Primo ◽  
...  

<p>Essential Ocean Variable Inorganic Carbon observations collected from instruments at sea are typically processed by individual PIs before being submitted to data centres and other data archives. Often this work is done on an ad-hoc basis using unpublished, self-built software, and published in unique formats. This conflicts with the Interoperability and Reusability aspects of the FAIR data principles: such data requires significant reformatting effort by data centres and/or end users, and reproducibility is impossible without a full record of the processing performed and the QC decisions made by PIs. The manual nature of this process also implies additional workload for PIs, who must submit their data to multiple archives and data products. There is a clear need to standardise the data workflow from measurement to publication using common, open-source, documented tools whose algorithms are fully accessible and whose processing is recorded for full transparency.</p><p>The Ocean Thematic Centre of the European Research Infrastructure ICOS (Integrated Carbon Observation System) is developing QuinCe, a browser-based tool for uploading, processing, automatic and manual quality control, and publication of data from underway pCO₂ systems on ships and moorings. Data can be uploaded directly from instruments in any text format; it is then standardised and processed using algorithms approved by the scientific community. Automatic QC algorithms detect many obvious data errors; afterwards PIs can perform full quality control of the data following Standard Operating Procedures and best practices. All QC decisions, with enforced explanatory notes, are recorded by the software to enable full traceability and reproducibility. The final quality-controlled dataset can be downloaded by the PI, and is sent to the ICOS Carbon Portal and the SOCAT project for publication. The ICOS Carbon Portal integrates marine data with ICOS ecosystem and atmosphere data on a regional scale, and the data is integrated via SOCAT into the annual Global Carbon Budgets of the Global Carbon Project, where it informs policy/decision makers, the scientific community and the general public.</p><p>For platforms with operational data flows, data is transmitted directly from ship to shore; QuinCe processes, quality-controls and publishes Near Real Time data to the ICOS Carbon Portal and to the Copernicus Marine Environmental Monitoring Service In Situ TAC as soon as it is received, with no human intervention, greatly reducing the time from measurement to data availability.</p><p>Full metadata records for instruments are kept and maintained at the ICOS Carbon Portal, utilising existing standardised vocabularies and version control to maintain a complete history. The correct metadata for any given dataset is available at any time and can be converted to any required format, allowing compliance with the United Nations Sustainable Development Goal 14.3.1 methodology ‘average marine acidity (pH) measured at agreed suite of representative sampling stations’; ICOS data relevant to SDG 14.3 is distributed to IOC UNESCO’s IODE. While much of this work is currently performed manually, international efforts are underway to develop fully automated systems, and these will be integrated as they become available.</p>
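The automatic QC stage described above can be illustrated with a simple range check on incoming records. The variable names, thresholds and flag values below (WOCE-style: 2 = good, 4 = bad) are illustrative assumptions, not QuinCe's actual configuration or algorithms.

```python
# Illustrative automatic QC range check for underway pCO2 records.
# Thresholds and flag values (WOCE-style) are assumptions, not
# QuinCe's actual configuration.

GOOD, QUESTIONABLE, BAD = 2, 3, 4

# Hypothetical physically plausible ranges per variable: (min, max)
RANGES = {
    "sst": (-2.0, 40.0),       # sea surface temperature, degrees C
    "salinity": (0.0, 42.0),   # practical salinity
    "pco2": (80.0, 1200.0),    # pCO2, microatmospheres
}

def range_check(record):
    """Return a {variable: flag} mapping for one measurement record."""
    flags = {}
    for var, (lo, hi) in RANGES.items():
        value = record.get(var)
        if value is None:
            flags[var] = BAD       # missing value cannot be used
        elif lo <= value <= hi:
            flags[var] = GOOD
        else:
            flags[var] = BAD       # outside the plausible range
    return flags
```

A record such as `{"sst": 18.2, "salinity": 35.1, "pco2": 4500.0}` would have its pCO₂ value flagged bad while the other variables pass; in QuinCe such automatic flags are then reviewed by the PI during manual QC.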

2007 ◽  
Vol 26 (3) ◽  
pp. 245-247
Author(s):  
Petros Karkalousos

The Schemes of External Quality Control in Laboratory Medicine in the Balkans

There are many differences between the national External Quality Control Schemes across Europe, but the most pronounced are those between the countries of the Balkan region. These differences stem from the countries' different political and financial development, as well as from their traditions and the development of clinical chemistry science in each one. As a result, some Balkan countries have highly developed EQAS, while others have no such scheme at all. Undoubtedly, the scientific community in these countries wants to develop EQAS despite the financial and other difficulties.


2018 ◽  
Vol 228 ◽  
pp. 02001
Author(s):  
Bing Han ◽  
Qiang Fu

To remedy the low precision of conventional water-stage monitoring methods, and to achieve real-time data collection, automated operation and long-distance transmission, we have designed a novel water-stage monitoring system based on a resonant pressure transducer and wireless connectivity technologies. The system has been deployed in a field experiment for an oil and gas pipeline engineering project. Analysis and comparison of the experimental results show that the system offers high agility, reliability, immediacy and accuracy, low cost, and strong resistance to disturbance, making it ideal for unattended, multi-point water-stage monitoring at regional scale. The system fully meets the practical demand for automated monitoring of hydrogeological parameters in geotechnical engineering.
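The core measurement behind such a system is hydrostatic: a submerged pressure transducer reads the pressure of the overlying water column, which converts directly to water stage. The sketch below is an illustration of that relation only; the density, gravity and atmospheric-pressure constants are assumptions, and the paper's actual sensor calibration is not described here.

```python
# Convert an absolute pressure reading from a submerged transducer to
# water stage (height of water above the sensor) via the hydrostatic
# relation P = P_atm + rho * g * h. All constants are assumed values.

RHO_WATER = 1000.0   # fresh water density, kg/m^3 (assumed)
GRAVITY = 9.81       # gravitational acceleration, m/s^2
P_ATM = 101325.0     # reference atmospheric pressure, Pa (assumed fixed)

def water_stage(pressure_pa, p_atm=P_ATM, rho=RHO_WATER, g=GRAVITY):
    """Water height above the sensor, in metres."""
    return (pressure_pa - p_atm) / (rho * g)
```

For example, an absolute reading of 111135 Pa corresponds to a stage of about 1.0 m under these assumed constants; a real deployment would also compensate for atmospheric-pressure variation, typically with a vented sensor or a barometric reference.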


2021 ◽  
Author(s):  
Rose Line Spacagna ◽  
Massimo Cesarano ◽  
Stefania Fabozzi ◽  
Edoardo Peronace ◽  
Attilio Porchia ◽  
...  

<p>The Seismic Microzonation studies (SMs), promoted across the Italian territory by the Department of Civil Protection, provide fundamental knowledge of subsoil response under seismic conditions at the urban scale. Amplification phenomena related to lithostratigraphic and morphological characteristics, along with instabilities and permanent deformations activated by earthquakes, are highlighted in hazard maps produced at increasing reliability levels (levels 1 to 3 of SM). In particular, zones prone to liquefaction instability are first identified from predisposing factors such as geological and geotechnical characteristics and seismicity. The robustness of the delineation of these areas depends strongly on the availability and spatial distribution of surveys. Moreover, the typology and quality of the investigations considerably influence the method of analysis and the degree of uncertainty in the results.</p><p>This work aims to update the procedure of the current SM guidelines and to integrate recent research activities at the different levels of SMs, improving the accuracy of hazard maps in terms of liquefaction susceptibility. To this end, the case of the Calabria region in southern Italy, well known for its high level of seismicity, was studied. At the regional scale, a base-level analysis was implemented for a preliminary assessment of the Attention Zones (AZ) potentially susceptible to liquefaction. The predisposing factors were implemented at a large scale, taking advantage of geostatistical tools to quantify uncertainties and filter inconsistent data. The regional-scale analysis highlighted areas prone to liquefaction and effectively guided the subsequent level of analysis. At the local scale, the liquefaction potential was evaluated quantitatively using simplified methods, integrating data from the different survey types (CPT, SPT, Down-Hole, Cross-Hole, MASW) available in the SM database. The Susceptibility Zones (SZ) were defined using additional indexes, combining the results obtained from the different survey typologies and quantifying the uncertainty due to limited data availability with geostatistical methods. The analyses at the regional and municipal scales were matched against seismic liquefaction evidence well documented in past seismic events. This multi-scale process optimises resource allocation to reduce the level of uncertainty at subsequent levels of analysis, providing useful information for land management and emergency planning.</p>
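The "simplified methods" for quantitative liquefaction assessment mentioned above are commonly summarised through the Liquefaction Potential Index (LPI) of Iwasaki et al., which integrates the factor of safety against liquefaction over the top 20 m of soil. The sketch below illustrates that general index, not the specific procedure adopted in this study; the factor-of-safety profile would in practice come from CPT- or SPT-based triggering analyses.

```python
# Liquefaction Potential Index (Iwasaki et al.):
#   LPI = integral over 0-20 m of F(z) * w(z) dz,
# where F(z) = 1 - FS(z) when FS < 1 (else 0) and w(z) = 10 - 0.5*z
# is a linear depth weight favouring shallow layers. Illustrative
# sketch; FS profiles are assumed inputs from CPT/SPT analyses.

def lpi(depths_m, fs_values):
    """Trapezoidal integration of F(z) * w(z) over a FS-vs-depth profile."""
    def integrand(z, fs):
        if z > 20.0:
            return 0.0                   # only the top 20 m contribute
        severity = max(0.0, 1.0 - fs)    # F(z): only FS < 1 contributes
        weight = 10.0 - 0.5 * z          # w(z): shallow layers weigh more
        return severity * weight

    total = 0.0
    for (z1, f1), (z2, f2) in zip(zip(depths_m, fs_values),
                                  zip(depths_m[1:], fs_values[1:])):
        total += 0.5 * (integrand(z1, f1) + integrand(z2, f2)) * (z2 - z1)
    return total
```

A uniform factor of safety of 0.5 over the full 0-20 m profile gives LPI = 50, half the theoretical maximum of 100; published classifications typically treat LPI > 15 as very high liquefaction potential.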


2020 ◽  
Vol 5 (1) ◽  
pp. 73-93
Author(s):  
Jared Eutsler ◽  
D. Kip Holderness ◽  
Megan M. Jones

The Public Company Accounting Oversight Board's (PCAOB) Part II inspection reports, which disclose systemic quality control issues that auditors fail to remediate, signal poor audit quality for triennially inspected audit firms. Auditors that receive a Part II inspection report typically experience a decrease in clients, which demonstrates a general demand for audit quality. However, some companies hire auditors that receive Part II inspection reports. We examine potential reasons for hiring these audit firms. We find that relative to companies that switch to auditors without Part II reports, companies that switch to auditors with Part II reports have higher discretionary accruals in the first fiscal year after the switch, which indicates lower audit quality and a heightened risk for future fraud. We find no difference in audit fees. Our results suggest that PCAOB Part II inspection reports may signal low-quality auditors to companies that desire low-quality audits. Data Availability: Data are available from the public sources cited in the text.


2019 ◽  
pp. 205-218
Author(s):  
Theresa Chapple-McGruder ◽  
Jaime Slaughter-Acey ◽  
Jennifer Kmet ◽  
Tonia Ruddock

This chapter offers instructions on how to find the data needed for a particular public health improvement program. It begins by defining two systems of data collection: primary and secondary. It is important to remember that all data have limitations; there is no such thing as perfect data. The use of primary data in practice or policy decision-making is often constrained by resources and time, as collecting robust data typically takes years. Secondary data has its own limits: it may not have been collected for the specific health question at hand, it may not be representative of the population of interest, and there may be a lag in its availability. Nevertheless, the chapter concludes, things can always be improved even if perfection is never reached.


2020 ◽  
Vol 4 (4) ◽  
pp. 3-78 ◽  
Author(s):  
Christina Leb

Cross-border data and information exchange is one of the most challenging issues for transboundary water management. While the regular exchange of data and information has been identified as one of the general principles of international water law, only a minority of treaties include direct obligations related to mutual data exchange. Technological innovations related to real-time data availability, space technology and earth observation have led to an increase in quality and availability of hydrological, meteorological and geo-spatial data. These innovations open new avenues for access to water related data and transform data and information exchange globally. This monograph is an exploratory assessment of the potential impacts of these disruptive technologies on data and information exchange obligations in international water law.


Author(s):  
Elzbieta Malinowski

Data warehouses (DWs) integrate data from different source systems in order to provide historical information that supports the decision-making process. The design of a DW is a complex and costly task, since the inclusion of different data items in a DW depends both on users' needs and on data availability in source systems. Currently, there is still a lack of a methodological framework that guides developers through the different stages of the DW design process. On the one hand, there are several proposals that informally describe the phases used for developing DWs based on the authors' experience in building such systems (Inmon, 2002; Kimball, Reeves, Ross, & Thornthwaite, 1998). On the other hand, the scientific community proposes a variety of approaches for developing DWs, discussed in the next section. Nevertheless, these either include features tied to the specific conceptual model used by their authors, or they are very complex. This situation arose because the need to build DW systems that fulfill user expectations ran ahead of methodological and formal approaches for DW development, much as happened earlier with operational databases.


2019 ◽  
Vol 9 (23) ◽  
pp. 5024
Author(s):  
Andrian ◽  
Kim ◽  
Ju

In space science research, the Indonesia National Institute of Aeronautics and Space (LAPAN) develops a system that provides current information and predictions, called the Space Weather Information and Forecast Services (SWIFtS). SWIFtS is supported by a data storage system built on a centralized storage model. This causes problems for researchers as the primary users: the storage is a single point of failure, and delays in updating data on the server are a significant issue when researchers need the latest data but the server cannot provide it. To overcome these problems, we propose a new system that uses a decentralized storage model, leveraging the InterPlanetary File System (IPFS). Our method focuses on an automated background process whose scheme increases data availability and throughput by spreading data across nodes through peer-to-peer connections. We also include system monitoring of real-time data flow from each node and of node status, combining active and passive approaches. For evaluation, experiments compared the performance of the proposed system to that of the existing system by calculating the mean replication time and the mean throughput of a node. As expected, the performance evaluations showed that our scheme achieved faster file replication and supported higher throughput.
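The two evaluation metrics named above, mean replication time and mean throughput, are straightforward aggregates over timing measurements. The sketch below is a generic illustration of how such measurements could be reduced; the measurement procedure itself (for example, polling replica nodes until a content identifier resolves) is an assumption, not the paper's actual benchmark harness.

```python
# Reduce raw timing measurements into the two metrics used in the
# evaluation. Generic illustration; how the timestamps are collected
# (e.g. polling replicas until a CID resolves) is assumed.

def mean_replication_time(added_at, replicated_at):
    """Mean delay (seconds) between a file being added on the source
    node and becoming available on a replica, from paired timestamps."""
    delays = [r - a for a, r in zip(added_at, replicated_at)]
    return sum(delays) / len(delays)

def mean_throughput(file_sizes_bytes, transfer_times_s):
    """Mean per-file transfer rate in bytes per second."""
    rates = [size / t for size, t in zip(file_sizes_bytes, transfer_times_s)]
    return sum(rates) / len(rates)
```

With these definitions, comparing the centralized and IPFS-based systems reduces to running the same file set through both and comparing the two means.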

