Implementation of Real-Time Quality Control Procedures by Means of a Probabilistic Estimate of Seawater Temperature and Its Temporal Evolution

2013
Vol 30 (3)
pp. 609-625
Author(s): Giuseppe M. R. Manzella, Marco Gambetta

Abstract Near-real-time quality control procedures for temperature profiles collected from ships of opportunity were implemented during the 1980s in oceans across the world and from the 1990s in the Mediterranean. In this sea, the procedures were originally based on seven steps (detection of the end of profile, gross range check, position control, elimination of spikes, Gaussian smoothing and resampling at 1-m intervals, general malfunction control, and comparison with climatology), complemented by initial and final visual checks. The quality of data derived from a comparison with historical data (namely, climatology) depends on the availability of a large amount of data that can statistically represent the mean characteristics of the seawater. A significant amount of data has since been collected, and the existing temperature database in the Mediterranean can now provide more information on temporal and spatial variability at monthly and mesoscale resolution, so an improved procedure for data quality control has now been adopted. New “best” estimates of monthly temperature profiles are calculated by using a maximum likelihood method. It has been found that more than one “best estimate” temperature can be defined in particular areas and at particular depths, as a consequence of climate variability. Additional near-real-time control procedures have been included in order to provide information on the long-term variability associated with the data. This information is included in metafiles to be used for reanalysis and studies of long-term variability and change.
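The abstract does not spell out the maximum likelihood calculation. As a minimal sketch of how more than one “best estimate” can emerge at a given depth and month, the hypothetical example below fits a Gaussian mixture to synthetic historical temperatures and reports each component mean as a candidate estimate; all names, values and the use of scikit-learn are assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): candidate "best estimate"
# temperatures at one depth/month via a maximum likelihood mixture fit.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical historical temperatures (°C) at a fixed depth and month,
# drawn from two regimes to mimic climate variability.
temps = np.concatenate([rng.normal(14.2, 0.3, 120),
                        rng.normal(15.1, 0.3, 80)]).reshape(-1, 1)

best = None
for k in (1, 2, 3):                       # choose the model order by BIC
    gm = GaussianMixture(n_components=k, random_state=0).fit(temps)
    if best is None or gm.bic(temps) < best.bic(temps):
        best = gm

# Each component mean is a candidate monthly "best estimate".
for mean, weight in zip(best.means_.ravel(), best.weights_):
    print(f"best estimate: {mean:.2f} °C (weight {weight:.2f})")
```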

2021
Author(s): Tommaso Alberti, Davide Faranda

While COVID-19 is rapidly propagating around the globe, the need to provide real-time forecasts of the epidemic pushes fits of dynamical and statistical models to available data beyond their capabilities. Here we focus on statistical predictions of COVID-19 infections obtained by fitting asymptotic distributions to actual data. Taking as a case study the evolution of total COVID-19 infections in Chinese provinces and Italian regions, we find that predictions are characterized by large uncertainties at the early stages of epidemic growth. Those uncertainties reduce significantly after the epidemic peak is reached. Differences in the uncertainty of the forecasts at the regional level can be used to highlight the delay in the spread of the virus. Our results warn that long-term extrapolation of epidemic counts must be handled with extreme care, as it depends crucially not only on the quality of the data but also on the stage of the epidemic, owing to the intrinsically non-linear nature of the underlying dynamics. These results suggest that real-time epidemiological projections should include wide uncertainty ranges and underline the need for high-quality datasets of infection counts, including asymptomatic patients.

Alberti T. and Faranda D. (2020) On the uncertainty of real-time predictions of epidemic growths: A COVID-19 case study for China and Italy. Commun. Nonlin. Sci. Num. Sim., 90, 105372.
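The abstract does not specify the asymptotic distribution used. As a rough sketch of the kind of fit involved, the hypothetical example below fits a logistic curve to synthetic cumulative counts and shows how the uncertainty of the fitted asymptote shrinks as more of the epidemic is observed; the functional form, synthetic data and parameters are assumptions.

```python
# Illustrative sketch: fitting a logistic curve to cumulative case counts
# and inspecting how the uncertainty of the asymptote depends on how much
# of the epidemic has been observed. The functional form is an assumption.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Cumulative counts: asymptote K, growth rate r, midpoint t0."""
    return K / (1.0 + np.exp(-r * (t - t0)))

rng = np.random.default_rng(1)
t = np.arange(60)
true = logistic(t, K=80_000, r=0.25, t0=30)
counts = rng.poisson(true).astype(float)          # synthetic noisy data

for days in (20, 35, 60):                         # early, near peak, late
    try:
        popt, pcov = curve_fit(logistic, t[:days], counts[:days],
                               p0=(counts[days - 1] * 2, 0.1, days),
                               maxfev=20_000)
        K_hat, K_err = popt[0], float(np.sqrt(pcov[0, 0]))
        print(f"{days:2d} days observed: K = {K_hat:9.0f} ± {K_err:.0f}")
    except RuntimeError:
        print(f"{days:2d} days observed: fit did not converge")
```

With only the early exponential-looking portion of the curve, the fitted asymptote carries a very large standard error, which is the behaviour the abstract describes.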


1987
Vol 33 (12)
pp. 2267-2271

Abstract A method for measuring glycated hemoglobin (Hb A1c) and an accompanying method of specimen transport to a central laboratory were developed for the multicenter Diabetes Control and Complications Trial (DCCT). In the DCCT, results for Hb A1c are used to assess chronic glycemic control for data collection and patient management. During the feasibility phase of the trial, central (CHL) and backup laboratories using automated, "high-performance" ion-exchange liquid-chromatographic methods were established. Whole-blood samples were stored (4 degrees C) at each of the 21 clinical centers for up to 72 h before air-express shipment to the CHL. Quality-control procedures included daily analyses of three calibration specimens. A pooled hemolysate was assayed frequently over time as a long-term quality control (LTQC). After 18 months, within- and between-run CVs were less than 6%. Mean values for split duplicate samples assayed in a masked fashion at the CHL were nearly identical. LTQC results indicated no significant assay drift over time. More than 6000 samples were assayed (mean interval between obtaining the blood sample and completing the assay: less than six days). Hb A1c evidently can be precisely and reliably measured in the context of a long-term, multicenter trial such as the DCCT.
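As an aside on the precision criterion quoted above (within- and between-run CVs below 6%), the sketch below shows one conventional way of estimating both CVs from repeated LTQC assays; the numbers and the duplicate design are illustrative assumptions, not DCCT data.

```python
# Illustrative sketch: within-run and between-run coefficients of variation
# for a long-term quality-control (LTQC) hemolysate assayed in duplicate.
# Values and the pooling scheme are assumptions for demonstration only.
import numpy as np

# Rows = analytical runs, columns = replicate Hb A1c results (%) per run.
ltqc = np.array([
    [8.1, 8.3],
    [8.0, 8.2],
    [8.4, 8.1],
    [8.2, 8.3],
])

grand_mean = ltqc.mean()
within_run_sd = np.sqrt(ltqc.var(axis=1, ddof=1).mean())   # pooled within-run SD
between_run_sd = ltqc.mean(axis=1).std(ddof=1)             # SD of run means
# (note: the simple SD of run means still contains part of the within-run variance)

print(f"within-run CV:  {100 * within_run_sd / grand_mean:.1f}%")
print(f"between-run CV: {100 * between_run_sd / grand_mean:.1f}%")
```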


1984
Vol 30 (1)
pp. 145-149
Author(s): M L Gozzo, G Barbaresi, G Giocoli, B Zappacosta, C Zuppi

Abstract We propose a statistical procedure for long-term quality-control of laboratory instruments, including daily, day-to-day, and monthly evaluations. The procedure is based on the unique and unequivocal interpretation of five results for control sera by calculation of a Reliability Index and further manipulations of this unitless parameter. This method, which we have tested during the past two years, allows for monitoring analytical performance and making comparisons with results of interlaboratory surveys. The monthly analytical variability, expressed as "total error," is an indicator of the clinical usefulness of analytical results.
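The abstract does not give the formula of the Reliability Index. Purely as an illustration of condensing five control-serum results into a single unitless daily score, the hypothetical sketch below averages absolute standardized deviations from assigned targets; the scoring rule, targets and threshold are assumptions, not the authors' definition.

```python
# Purely illustrative sketch (the paper's Reliability Index formula is not
# given in the abstract): one way to condense five control-serum results
# into a single unitless daily score. Targets and SDs are made up.
targets = {"A": 5.2, "B": 140.0, "C": 7.4, "D": 0.90, "E": 98.0}   # assigned values
sds     = {"A": 0.2, "B": 3.0,   "C": 0.1, "D": 0.05, "E": 2.5}    # assigned SDs

def daily_index(results: dict[str, float]) -> float:
    """Mean absolute z-score of the five control sera (unitless)."""
    z = [abs(results[k] - targets[k]) / sds[k] for k in targets]
    return sum(z) / len(z)

today = {"A": 5.4, "B": 143.0, "C": 7.35, "D": 0.92, "E": 101.0}
index = daily_index(today)
print(f"daily index = {index:.2f}"
      + ("  (within control)" if index < 1.5 else "  (investigate run)"))
```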


2021
Author(s): Orestis Faklaris, Leslie Bancel-Vallee, Aurelien Dauphin, Baptiste Monterroso, Perrine Frere, ...

Reliable, reproducible and comparable results are what biology requires from microscopy. To achieve that level of confidence, monitoring the stability of microscope performance over time with standardized quality-testing routines is essential for mining quantitative data. Three levels of microscope quality control procedures should be considered: i) use of accessible and affordable tools and samples, ii) execution of easy and fast, preferably automated, acquisition protocols, iii) analysis of data in the most automated way possible, with adequate metrics for long-term monitoring. In this paper, we test the acquisition protocols on the most commonly used microscopy techniques (wide-field, spinning disk and confocal microscopy) with simple quality control tools. Seven protocols specify metrics for measuring the lateral and axial resolution (Point-Spread Function) of the system, field flatness, chromatic aberrations and co-registration, illumination power monitoring and stability, stage drift and positioning repeatability, and finally temporal and spatial noise sources of camera detectors. We designed an ImageJ/Fiji Java plugin named MetroloJ_QC to incorporate the identified metrics and automate the data processing for the analysis. After processing and comparing data from microscopes at more than ten imaging facilities, we test the robustness of the metrics and the protocols by determining experimental limit values. Our results give a first extensive characterization of the quality control procedures of a light microscope, with automated data processing and experimental limit values that can be used by core facility staff and researchers to monitor microscope performance over time.
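MetroloJ_QC itself is an ImageJ/Fiji Java plugin; as a language-neutral illustration of one of the metrics it reports, the sketch below estimates lateral resolution as the full width at half maximum (FWHM) of a Gaussian fitted to a bead intensity profile. The pixel size, profile values and fitting choices are assumptions, not the plugin's code.

```python
# Illustrative sketch (not MetroloJ_QC code): lateral resolution estimated
# as the FWHM of a Gaussian fitted to a 1-D intensity profile through a
# sub-resolution bead. Pixel size and profile values are assumptions.
import numpy as np
from scipy.optimize import curve_fit

PIXEL_SIZE_UM = 0.065          # assumed lateral pixel size (µm)

def gaussian(x, amp, mu, sigma, offset):
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + offset

# Hypothetical intensity profile through the bead centre (ADU).
profile = np.array([102, 110, 148, 260, 510, 730, 760, 545, 290, 150, 108, 101],
                   dtype=float)
x = np.arange(profile.size)

p0 = (profile.max() - profile.min(), np.argmax(profile), 1.5, profile.min())
(amp, mu, sigma, offset), _ = curve_fit(gaussian, x, profile, p0=p0)

fwhm_um = 2 * np.sqrt(2 * np.log(2)) * abs(sigma) * PIXEL_SIZE_UM
print(f"lateral FWHM ≈ {fwhm_um:.3f} µm")
```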


ACTA IMEKO
2016
Vol 5 (1)
pp. 64
Author(s): Daniel M. Toma, Albert Garcia Benadí, Bernat-Joan Manuel-Gonzalez, Joaquin Del-Río-Fernandez

With recent technological advances, many new observation platforms have been created and networked for the dissemination of numerous and diverse observations; they also make it possible to connect many kinds of users, facilitating large-scale and long-term studies. This paper focuses on marine observations and the platforms employed for this purpose. Real-time data and big data have to meet some minimum data quality requirements. Usually, the task of ensuring these quality requirements falls to those responsible for the platforms. The aim of this paper is to explain the design of these quality control systems and their implementation in an ocean observation platform.
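The abstract does not detail the checks implemented on the platform. As a generic sketch of the kind of real-time quality control referred to, the example below assigns a good/suspect/bad flag to each incoming temperature reading using a gross range test and a step test; thresholds and flag codes are assumptions.

```python
# Illustrative sketch: minimal real-time QC flagging for a stream of
# sea-water temperature readings. Thresholds and flag values are assumptions.
GOOD, SUSPECT, BAD = 1, 3, 4        # assumed flag convention

def qc_flags(values, valid=(-2.0, 35.0), max_step=2.0):
    """Gross range test plus a simple step test against the last good value."""
    flags, last_good = [], None
    for v in values:
        if not (valid[0] <= v <= valid[1]):
            flags.append(BAD)                      # outside plausible range
        elif last_good is not None and abs(v - last_good) > max_step:
            flags.append(SUSPECT)                  # implausible jump
        else:
            flags.append(GOOD)
            last_good = v
    return flags

stream = [18.2, 18.3, 24.9, 18.4, 41.0, 18.5]      # synthetic readings (°C)
print(list(zip(stream, qc_flags(stream))))
```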


2003
Vol 21 (1)
pp. 49-62
Author(s): G. M. R. Manzella, E. Scoccimarro, N. Pinardi, M. Tonani

Abstract. A "ship of opportunity" program was launched as part of the Mediterranean Forecasting System Pilot Project. During the operational period (September 1999 to May 2000), six tracks covered the Mediterranean from the northern to southern boundaries approximately every 15 days, while a long eastwest track from Haifa to Gibraltar was covered approximately every month. XBT data were collected, sub-sampled at 15 inflection points and transmitted through a satellite communication system to a regional data centre. It was found that this data transmission system has limitations in terms of quality of the temperature profiles and quantity of data successfully transmitted. At the end of the MFSPP operational period, a new strategy for data transmission and management was developed. First of all, VOS-XBT data are transmitted with full resolution. Secondly, a new data management system, called Near Real Time Quality Control for XBT (NRT.QC.XBT), was defined to produce a parallel stream of high quality XBT data for further scientific analysis. The procedure includes: (1) Position control; (2) Elimination of spikes; (3) Re-sampling at a 1 metre vertical interval; (4) Filtering; (5) General malfunctioning check; (6) Comparison with climatology (and distance from this in terms of standard deviations); (7) Visual check; and (8) Data consistency check. The first six steps of the new procedure are completely automated; they are also performed using a new climatology developed as part of the project. The visual checks are finally done with a free-market software that allows NRT final data assessment. Key words. Oceanography: physical (instruments and techniques; general circulation; hydrography)

