An overview of the National COVID-19 Chest Imaging Database: data quality and cohort analysis

GigaScience ◽  
2021 ◽  
Vol 10 (11) ◽  
Author(s):  
Dominic Cushnan ◽  
Oscar Bennett ◽  
Rosalind Berka ◽  
Ottavia Bertolli ◽  
Ashwin Chopra ◽  
...  

Abstract
Background: The National COVID-19 Chest Imaging Database (NCCID) is a centralized database containing mainly chest X-rays and computed tomography scans from patients across the UK. The objective of the initiative is to support a better understanding of the coronavirus SARS-CoV-2 disease (COVID-19) and the development of machine learning technologies that will improve care for patients hospitalized with a severe COVID-19 infection. This article introduces the training dataset, including a snapshot analysis covering the completeness of clinical data and the availability of image data for the various use-cases (diagnosis, prognosis, longitudinal risk). An additional cohort analysis measures how well the NCCID represents the wider COVID-19-affected UK population in terms of geographic, demographic, and temporal coverage.
Findings: The NCCID offers high-quality DICOM images acquired across a variety of imaging machinery; multiple time points, including historical images, are available for a subset of patients. This volume and variety make the database well suited to the development of diagnostic/prognostic models for COVID-associated respiratory conditions. Historical images and clinical data may aid long-term risk stratification, particularly as the availability of comorbidity data increases through linkage to other resources. The cohort analysis revealed good alignment with general UK COVID-19 statistics for some categories, e.g., sex, whilst identifying areas for improvement to data collection methods, particularly geographic coverage.
Conclusion: The NCCID is a growing resource that provides researchers with a large, high-quality database that can be leveraged both to support the response to the COVID-19 pandemic and as a test bed for building clinically viable medical imaging models.

2021 ◽  
Author(s):  
Dominic Cushnan ◽  
Oscar Bennett ◽  
Rosalind Berka ◽  
Ottavia Bertolli ◽  
Ashwin Chopra ◽  
...  

Abstract
The National COVID-19 Chest Imaging Database (NCCID) is a centralised database containing chest X-rays, chest Computed Tomography (CT) scans and cardiac Magnetic Resonance Images (MRI) from patients across the UK, jointly established by NHSX, the British Society of Thoracic Imaging (BSTI), Royal Surrey NHS Foundation Trust (RSNFT) and Faculty. The objective of the initiative is to support a better understanding of the coronavirus SARS-CoV-2 disease (COVID-19) and the development of machine learning (ML) technologies that will improve care for patients hospitalised with a severe COVID-19 infection. The NCCID is now accumulating data from 20 NHS Trusts and Health Boards across England and Wales, with a total contribution of approximately 25,000 imaging studies in the training set (at the time of writing), and is actively being used as a research tool by several organisations. This paper introduces the training dataset, including a snapshot analysis performed by NHSX covering the completeness of clinical data, the availability of image data for the various use-cases (diagnosis, prognosis and longitudinal risk), and potential model confounders within the imaging data. The aim is to inform both existing and potential data users of the NCCID's suitability for developing diagnostic/prognostic models. In addition, a cohort analysis was performed to measure the representativeness of the NCCID with respect to the wider COVID-19-affected population, covering three major aspects: geographic, demographic and temporal coverage. It revealed good alignment in some categories, e.g., sex, and identified areas for improvement in the data collection methods, particularly with respect to geographic coverage. All analyses and discussions focus on the implications for building ML tools that will generalise well to the clinical use cases.
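As a brief illustration of working with the kind of imaging data described above, the sketch below loads a single DICOM study with pydicom and normalises the pixel data for a typical ML pipeline. The file name, rescaling, and normalisation choices are illustrative assumptions, not part of the NCCID specification.

```python
# Minimal sketch: read one DICOM chest X-ray and prepare it as a model input.
# The file path and preprocessing choices are hypothetical.
import numpy as np
import pydicom

ds = pydicom.dcmread("example_cxr.dcm")          # hypothetical file name
pixels = ds.pixel_array.astype(np.float32)

# Apply the rescale slope/intercept if the study provides them.
slope = float(getattr(ds, "RescaleSlope", 1.0))
intercept = float(getattr(ds, "RescaleIntercept", 0.0))
pixels = pixels * slope + intercept

# Min-max normalise to [0, 1] for a typical ML pipeline.
pixels = (pixels - pixels.min()) / (pixels.max() - pixels.min() + 1e-8)

# Metadata such as acquisition date supports the prognosis and
# longitudinal use-cases mentioned above.
print(ds.get("StudyDate"), ds.get("Modality"), pixels.shape)
```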


Sensors ◽  
2021 ◽  
Vol 21 (13) ◽  
pp. 4554
Author(s):  
Ralph-Alexandru Erdelyi ◽  
Virgil-Florin Duma ◽  
Cosmin Sinescu ◽  
George Mihai Dobre ◽  
Adrian Bradu ◽  
...  

The most common imaging technique for dental diagnoses and treatment monitoring is X-ray imaging, which has evolved from the first intraoral radiographs to high-quality three-dimensional (3D) Cone Beam Computed Tomography (CBCT). Other imaging techniques have shown potential, such as Optical Coherence Tomography (OCT). We have recently reported on the boundaries between these two types of techniques, regarding the dental fields where each one is more appropriate or where both should be used. The aim of the present study is to explore the unique capabilities of the OCT technique to optimize imaging with X-ray units (i.e., in terms of image resolution, radiation dose, or contrast). Two types of commercially available and widely used X-ray units are considered. To adjust their parameters, a protocol is developed that employs OCT images of dental conditions documented on high-resolution (i.e., less than 10 μm) OCT images (both B-scans/cross-sections and 3D reconstructions) but hardly identifiable on the 200 to 75 μm resolution panoramic or CBCT radiographs. The optimized calibration of the X-ray unit includes choosing appropriate values for the anode voltage and current intensity of the X-ray tube, as well as the patient's positioning, in order to reach the highest possible X-ray resolution at a radiation dose that is safe for the patient. The optimization protocol is developed in vitro on OCT images of extracted teeth and is further applied in vivo for each type of dental investigation. Optimized radiographic results are compared with previously performed, un-optimized radiographs. We also show that OCT permits a rigorous comparison between two (types of) X-ray units. In conclusion, high-quality dental images can be obtained using low radiation doses if an optimized protocol, developed using OCT, is applied for each type of dental investigation. There are also situations where X-ray technology has drawbacks for dental diagnosis or treatment assessment; in such situations, OCT proves capable of providing qualitative images.
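To make the idea of exposure-parameter optimization concrete, the hypothetical sketch below ranks candidate settings by a simple contrast-to-noise ratio (CNR) between a lesion region and a background region, subject to an assumed dose constraint. The images, region masks, candidate settings, and dose values are synthetic placeholders; the study's actual protocol assesses real radiographs against OCT findings.

```python
# Hypothetical sketch: select an (anode voltage, tube current) setting by
# contrast-to-noise ratio under an assumed dose limit. All data are synthetic.
import numpy as np

def cnr(image, lesion_mask, background_mask):
    """Contrast-to-noise ratio between a lesion ROI and a background ROI."""
    lesion, background = image[lesion_mask], image[background_mask]
    return abs(lesion.mean() - background.mean()) / (background.std() + 1e-8)

rng = np.random.default_rng(0)
h, w = 64, 64
lesion_mask = np.zeros((h, w), bool); lesion_mask[20:30, 20:30] = True
background_mask = np.zeros((h, w), bool); background_mask[40:60, 40:60] = True

# candidate (kV, mA, assumed relative dose) settings
candidates = [(60, 4, 1.0), (70, 6, 1.4), (80, 8, 1.9)]
dose_limit = 1.5                                   # assumed constraint

best = None
for kv, ma, dose in candidates:
    # synthetic radiograph: higher mA -> less noise, fixed lesion contrast
    image = rng.normal(loc=kv / 10.0, scale=8.0 / ma, size=(h, w))
    image[lesion_mask] += 2.0
    score = cnr(image, lesion_mask, background_mask)
    if dose <= dose_limit and (best is None or score > best[1]):
        best = ((kv, ma), score)

print("selected setting (kV, mA) and CNR:", best)
```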


2021 ◽  
Vol 5 (1) ◽  
Author(s):  
Kara-Louise Royle ◽  
David A. Cairns

Abstract
Background: The United Kingdom Myeloma Research Alliance (UK-MRA) Myeloma Risk Profile is a prognostic model for overall survival. It was trained and tested on clinical trial data, aiming to improve the stratification of transplant ineligible (TNE) patients with newly diagnosed multiple myeloma. Missing data is a common problem that affects the development and validation of prognostic models, and decisions on how to address missingness have implications for the choice of methodology.
Methods
Model building: The training and test datasets were the TNE pathways from two large, randomised, multicentre phase III clinical trials. Potential prognostic factors were identified by expert opinion. Missing data in the training dataset was imputed using multiple imputation by chained equations. Univariate analysis fitted Cox proportional hazards models in each imputed dataset, with the estimates combined by Rubin's rules. Multivariable analysis applied penalised Cox regression models, with a fixed penalty term across the imputed datasets. The estimates from each imputed dataset and bootstrap standard errors were combined by Rubin's rules to define the prognostic model.
Model assessment: Calibration was assessed by visualising the observed and predicted probabilities across the imputed datasets. Discrimination was assessed by combining the prognostic separation D-statistic from each imputed dataset by Rubin's rules.
Model validation: The D-statistic was applied in a bootstrap internal validation process in the training dataset and an external validation process in the test dataset, where acceptable performance was pre-specified.
Development of risk groups: Risk groups were defined using the tertiles of the combined prognostic index, obtained by combining the prognostic index from each imputed dataset by Rubin's rules.
Results: The training dataset included 1852 patients, 1268 (68.47%) with complete-case data. Ten imputed datasets were generated. Five hundred and twenty patients were included in the test dataset. The D-statistic for the prognostic model was 0.840 (95% CI 0.716–0.964) in the training dataset and 0.654 (95% CI 0.497–0.811) in the test dataset, and the corrected D-statistic was 0.801.
Conclusion: The decision to impute missing covariate data in the training dataset influenced the methods implemented to train and test the model. To extend the current literature and aid future researchers, we have presented a detailed example of one approach. Whilst our example is not without limitations, a benefit is that all of the patient information available in the training dataset was utilised to develop the model.
Trial registration: Both trials were registered: Myeloma IX, ISRCTN68454111, registered 21 September 2000; Myeloma XI, ISRCTN49407852, registered 24 June 2009.
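The general workflow described in the Methods (multiple imputation, a penalised Cox model per imputed dataset, pooling by Rubin's rules) can be sketched as follows. This is a minimal illustration assuming scikit-learn's IterativeImputer and lifelines, numeric covariates, and model-based rather than bootstrap standard errors; it is not the UK-MRA implementation.

```python
# Sketch: multiple imputation + penalised Cox per imputed dataset + Rubin's rules.
# Column names, the penalty value, and the imputer are illustrative assumptions.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from lifelines import CoxPHFitter

def fit_pooled_cox(df, duration_col, event_col, m=10, penalizer=0.1):
    covariates = [c for c in df.columns if c not in (duration_col, event_col)]
    estimates, variances = [], []
    for i in range(m):                                   # m imputed datasets
        imputer = IterativeImputer(sample_posterior=True, random_state=i)
        imputed = df.copy()
        imputed[covariates] = imputer.fit_transform(df[covariates])
        cph = CoxPHFitter(penalizer=penalizer)           # fixed penalty term
        cph.fit(imputed, duration_col=duration_col, event_col=event_col)
        estimates.append(cph.params_)
        variances.append(cph.standard_errors_ ** 2)
    est, var = pd.concat(estimates, axis=1), pd.concat(variances, axis=1)
    pooled_beta = est.mean(axis=1)                       # Rubin's rules
    within, between = var.mean(axis=1), est.var(axis=1, ddof=1)
    pooled_se = np.sqrt(within + (1 + 1 / m) * between)
    return pooled_beta, pooled_se
```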


1996 ◽  
Vol 14 (4-6) ◽  
pp. 341-352 ◽  
Author(s):  
W. F. Kuhs ◽  
F. C. Bauer ◽  
R. Hausmann ◽  
H. Ahsbahs ◽  
R. Dorwarth ◽  
...  

1934 ◽  
Vol 30 (2) ◽  
pp. 310-316
Author(s):  
R. Ya. Gasul

Although the first quite successful attempts to treat gastric ulcers with X-rays were made as early as 1913 (Kodo), and to influence the secretion and acidity of gastric juice by irradiation (Brgel, 1916), clinical practice has not yet shown sufficient interest in the use of X-rays in the treatment of ulcers and their complications. Some clinicians still believe that only malignant neoplasms are treated with X-rays, and only those that have been rejected by surgeons.


Molecules ◽  
2019 ◽  
Vol 24 (19) ◽  
pp. 3490
Author(s):  
Krishna P. Khakurel ◽  
Borislav Angelov ◽  
Jakob Andreasson

Crystallography has long been the unrivaled method for providing atomistic structural models of macromolecules, using either X-rays or electrons as probes. The methodology has gone through several revolutionary periods, driven by the development of new sources, detectors, and other instrumentation. Novel sources of both X-rays and electrons are constantly emerging. The increase in brightness of these sources, complemented by advanced detection techniques, has relaxed the traditionally strict need for large, high-quality crystals. Recent reports suggest that high-quality diffraction datasets can be routinely obtained from crystals as small as a few hundred nanometers. This has resulted in the genesis of a new field of macromolecular nanocrystal crystallography. Here we give a brief comparative review of this growing field, focusing on the use of X-ray and electron sources.


2012 ◽  
Vol 23 ◽  
pp. v38
Author(s):  
H.F. Peach ◽  
P.W.M. Johnson ◽  
S. Johnson ◽  
L.K. Jones ◽  
M. Jones ◽  
...  

2020 ◽  
Vol 49 (4) ◽  
pp. 1316-1325 ◽  
Author(s):  
Sarah Booth ◽  
Richard D Riley ◽  
Joie Ensor ◽  
Paul C Lambert ◽  
Mark J Rutherford

Abstract
Background: Prognostic models are typically developed in studies covering long time periods. However, if more recent years have seen improvements in survival, then using the full dataset may lead to out-of-date survival predictions. Period analysis addresses this by developing the model in a subset of the data from a recent time window, but results in a reduction of sample size.
Methods: We propose a new approach, called temporal recalibration, to combine the advantages of period analysis and full cohort analysis. This approach develops a model in the entire dataset and then recalibrates the baseline survival using a period analysis sample. The approaches are demonstrated using a prognostic model in colon cancer built with both Cox proportional hazards and flexible parametric survival models, with data from 1996–2005 from the Surveillance, Epidemiology, and End Results (SEER) Program database. Model predictions were compared with observed survival estimates for new patients subsequently diagnosed in 2006 and followed up until 2015.
Results: Period analysis and temporal recalibration provided more up-to-date survival predictions that more closely matched observed survival in subsequent data than the standard full cohort models. In addition, temporal recalibration provided more precise estimates of predictor effects.
Conclusion: Prognostic models are typically developed using a full cohort analysis, which can result in out-of-date long-term survival estimates when survival has improved in recent years. Temporal recalibration is a simple method to address this, which can be used when developing and updating prognostic models to ensure survival predictions are more closely calibrated with the observed survival of individuals diagnosed subsequently.
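A minimal sketch of the temporal recalibration idea follows: predictor effects are estimated on the full cohort, and only the baseline cumulative hazard is re-estimated (here with a hand-rolled Breslow estimator) from a recent subset. The lifelines dependency, the column names, and the use of a simple recent-diagnosis subset rather than a proper delayed-entry period-analysis sample are assumptions of this sketch, not the authors' implementation.

```python
# Sketch: keep full-cohort Cox coefficients, re-estimate the baseline
# cumulative hazard (Breslow) from a recent sample only.
import numpy as np
from lifelines import CoxPHFitter

def temporally_recalibrated_model(full_df, recent_df, covariates,
                                  duration_col="time", event_col="event"):
    # 1) predictor effects from the full cohort
    cph = CoxPHFitter()
    cph.fit(full_df[covariates + [duration_col, event_col]],
            duration_col=duration_col, event_col=event_col)
    beta = cph.params_

    # 2) Breslow baseline cumulative hazard from the recent sample,
    #    holding the full-cohort coefficients fixed
    risk = np.exp(recent_df[covariates].values @ beta.values)
    t = recent_df[duration_col].values
    e = recent_df[event_col].values
    baseline, cum = {}, 0.0
    for time in np.sort(np.unique(t[e == 1])):
        events_at_t = np.sum((t == time) & (e == 1))
        at_risk = risk[t >= time].sum()
        cum += events_at_t / at_risk
        baseline[time] = cum                      # H0(time)
    return beta, baseline

# Predicted survival for a new patient x: S(t | x) = exp(-H0(t) * exp(x @ beta))
```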


2020 ◽  
Vol 35 (Supplement_3) ◽  
Author(s):  
Eunjeong Kang ◽  
Yaerim Kim ◽  
Yong Chul Kim ◽  
Soojin Lee ◽  
Seungyeup Han ◽  
...  

Abstract
Background and Aims: Glomerular diseases, a set of debilitating and complex disease entities, are related to mortality and morbidity. To gain insight into the pathophysiology and novel treatment targets of glomerular disease, various types of biospecimens linked to deep clinical phenotyping, including clinical information, digital pathology, and well-defined outcomes, are required. We provide the rationale and design of the KOrea Renal biobank NEtwoRk System TOward Next-generation analysis (KORNERSTONE).
Method: The KORNERSTONE, initiated by the Korea Centers for Disease Control and Prevention, is designed as a multi-centre, prospective cohort study and biobank for glomerular diseases. Clinical data and questionnaires will be collected at the time of kidney biopsy and every year thereafter. All of the clinical data will be extracted from the electronic health record and automatically uploaded to the web-based database. High-quality digital pathology images are obtained and linked in the database. Various types of biospecimens are collected at baseline and during follow-up: serum, urine, buffy coat, stool, glomerular complementary DNA (cDNA), and tubulointerstitial cDNA. All data and biospecimens are processed and stored in a standardised manner. The primary outcomes are mortality and end-stage renal disease. The secondary outcomes are deterioration of renal function, remission of proteinuria, cardiovascular events, and quality of life.
Discussion: Ethical approval has been obtained from the institutional review board of each participating centre and the ethics oversight committee. The KORNERSTONE is designed to deliver pioneering insights into glomerular diseases. The study design allows comprehensive, integrated, and high-quality data collection on baseline laboratory findings and clinical outcomes, including administrative data and digital pathology images. It may provide various biospecimens and information to many researchers and establish the rationale for future, more individualised treatment strategies for glomerular diseases.
Conclusion: In conclusion, we describe the objectives and clinical protocol for the KORNERSTONE. As the first large-scale glomerulonephropathy cohort study in Korea to integrate clinical data, biospecimens, and digital pathology images, the KORNERSTONE will help to clarify the natural course, complication profiles, and novel treatment targets of the Asian population with glomerular disease.

