Prioritisation by FIT to mitigate the impact of delays in the 2-week wait colorectal cancer referral pathway during the COVID-19 pandemic: a UK modelling study

Gut ◽  
2020 ◽  
pp. gutjnl-2020-321650 ◽  
Author(s):  
Chey Loveday ◽  
Amit Sud ◽  
Michael E Jones ◽  
John Broggio ◽  
Stephen Scott ◽  
...  

Objective: To evaluate the impact of faecal immunochemical testing (FIT) prioritisation in mitigating the effect of delays in the colorectal cancer (CRC) urgent diagnostic (2-week-wait (2WW)) pathway consequent on the COVID-19 pandemic. Design: We modelled the reduction in CRC survival and the life years lost resulting from per-patient delays of 2–6 months in the 2WW pathway. We stratified by age group the individual-level benefit in CRC survival versus the age-specific nosocomial COVID-19–related fatality per referred patient undergoing colonoscopy. We modelled mitigation strategies using FIT triage thresholds of 2, 10 and 150 µg Hb/g to prioritise 2WW referrals for colonoscopy. To construct the underlying models, we employed 10-year net CRC survival for England 2008–2017, 2WW pathway CRC case and referral volumes, and per-day-delay HRs generated from observational studies of the diagnosis-to-treatment interval. Results: Delays of 2/4/6 months across all 11 266 patients with CRC diagnosed per typical year via the 2WW pathway were estimated to result in 653/1419/2250 attributable deaths and the loss of 9214/20 315/32 799 life years. The risk–benefit of urgent investigatory referral is particularly sensitive to nosocomial COVID-19 rates for patients aged >60. Prioritising out of delay the 18% of symptomatic referrals with FIT >10 µg Hb/g would avoid 89% of the deaths attributable to presentational/diagnostic delay while reducing the immediate requirement for colonoscopy by >80%. Conclusions: Delays in the pathway to CRC diagnosis and treatment have the potential to cause significant mortality and loss of life years. FIT triage of symptomatic patients in primary care could streamline access to colonoscopy, reduce delays for true-positive CRC cases and reduce nosocomial COVID-19 mortality in older true-negative 2WW referrals. However, this strategy offers benefit only for short-term rationalisation of limited endoscopy services: the appreciable false-negative rate of FIT in symptomatic patients means that most colonoscopies will still be required.
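
For readers less familiar with this kind of modelling, the Python sketch below shows one simplified way a per-day-of-delay hazard ratio can be applied to a baseline net survival to yield attributable deaths and life years lost. It is not the authors' model; all function names and numeric inputs are illustrative assumptions.

```python
# Illustrative sketch (not the authors' model): translating a per-day-of-delay
# hazard ratio into attributable deaths and life years lost for a delayed cohort.
# All numeric inputs below are assumed placeholders, not values from the paper.

def survival_after_delay(baseline_survival: float, hr_per_day: float, delay_days: int) -> float:
    """Apply a per-day-of-delay hazard ratio to a baseline net survival.

    Under proportional hazards, S_delayed = S_baseline ** (HR ** delay_days).
    """
    cumulative_hr = hr_per_day ** delay_days
    return baseline_survival ** cumulative_hr

def attributable_impact(n_patients: int, baseline_survival: float,
                        hr_per_day: float, delay_days: int,
                        mean_life_years_per_death: float) -> tuple[float, float]:
    """Deaths and life years lost attributable to the delay."""
    s_delayed = survival_after_delay(baseline_survival, hr_per_day, delay_days)
    extra_deaths = n_patients * (baseline_survival - s_delayed)
    return extra_deaths, extra_deaths * mean_life_years_per_death

# Example with assumed inputs: 11,266 patients, 60% 10-year net survival,
# HR 1.0006 per day of delay, a 4-month (120-day) delay, 14 life years per death.
deaths, life_years = attributable_impact(11_266, 0.60, 1.0006, 120, 14.0)
print(f"Attributable deaths: {deaths:.0f}, life years lost: {life_years:.0f}")
```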

Sensors ◽  
2020 ◽  
Vol 20 (7) ◽  
pp. 1982 ◽  
Author(s):  
Noor Ul Huda ◽  
Bolette D. Hansen ◽  
Rikke Gade ◽  
Thomas B. Moeslund

Thermal cameras are popular for detection because of their precision in surveillance in the dark and their preservation of privacy. In the era of data-driven problem solving, manually finding and annotating a large amount of data is inefficient in terms of cost and effort. With the introduction of transfer learning, rather than a large dataset, a dataset covering all characteristics and aspects of the target place is more important. In this work, we studied a large thermal dataset recorded over 20 weeks and identified nine phenomena in it. Moreover, we investigated the impact of each phenomenon on model adaptation in transfer learning. Each phenomenon was investigated separately and in combination, and the performance was analyzed by computing the F1 score, precision, recall, true negative rate, and false negative rate. Furthermore, to underline our investigation, the model trained on our dataset was tested on publicly available datasets, and encouraging results were obtained. Finally, our dataset was also made publicly available.
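
For reference, a minimal Python sketch of the evaluation metrics named in the abstract (precision, recall, F1 score, true negative rate, false negative rate), computed from confusion-matrix counts; the counts in the example are made up and this is not the authors' evaluation code.

```python
# Minimal sketch of the reported evaluation metrics, computed from
# confusion-matrix counts. Not the authors' evaluation code.

def detection_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0          # true positive rate
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    tnr = tn / (tn + fp) if (tn + fp) else 0.0             # true negative rate
    fnr = fn / (fn + tp) if (fn + tp) else 0.0             # false negative rate
    return {"precision": precision, "recall": recall, "f1": f1, "tnr": tnr, "fnr": fnr}

# Example with made-up counts for a detector evaluated on thermal frames.
print(detection_metrics(tp=870, fp=60, tn=930, fn=140))
```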


2020 ◽  
Vol 30 (Supplement_5) ◽  
Author(s):  
A Cornez ◽  
G Silversmit ◽  
V Gorasso ◽  
I Grant ◽  
G M A Wyper ◽  
...  

Abstract Background: Monitoring the health status of a population requires consistent and comparable data on the morbidity and mortality impacts of a disease. The Disability-Adjusted Life Year (DALY) is an increasingly used disease burden indicator, combining healthy life years lost due to living with disease (Years Lived with Disability; YLDs) and due to dying prematurely (Years of Life Lost; YLLs). In Belgium, as in many other developed countries, cancer is a major contributor to the overall burden of disease. To date, however, local estimates of the burden of cancer are lacking. Methods: We estimated the burden of 48 cancers in Belgium from 2004 to 2017 in terms of DALYs, using national population-based cancer registry data and international disease models. We developed a microsimulation model to translate incidence-based into prevalence-based estimates, and used expert elicitation to integrate the long-term impact of surgical treatment. Results: In 2017, in Belgium, breast cancer was the cancer with the highest disease burden among women, followed by lung cancer and colorectal cancer. Among men, lung cancer had the highest disease burden, followed by colorectal cancer and prostate cancer. Between 2004 and 2017, the burden of lung cancer increased by more than 50% in women, while in both sexes significant increases were observed in the melanoma and skin cancer burden. The majority of the cancer burden remained linked to premature mortality. Conclusions: Cancer maintains a major impact on the health of the Belgian population. Current resources allocated to cancer prevention and treatment will need to be maintained to further reduce this burden. Lung cancer remains a crucial challenge among both men and women, calling for strengthened tobacco control policies. Integrating the current study into the Belgian national burden of disease study (BeBOD) will allow monitoring the burden of cancer over time, highlighting new trends and assessing the impact of public health policies. Key messages: Burden of disease studies allow assessing and monitoring the impact of diseases and risk factors in a comparable way. Cancer maintains a major impact on the health of the Belgian population; lung cancer in particular remains a crucial challenge.
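
To make the DALY decomposition concrete, here is a small Python illustration of DALY = YLL + YLD as typically computed in burden-of-disease work; the disability weight, duration and counts are hypothetical placeholders, not Belgian registry figures.

```python
# Simple illustration of the DALY decomposition used in burden-of-disease studies:
# DALY = YLL + YLD, with YLL = deaths x residual life expectancy at death, and
# YLD = prevalent cases x disability weight x duration. Inputs are assumed
# placeholders, not Belgian registry figures.

def years_of_life_lost(deaths: int, residual_life_expectancy: float) -> float:
    return deaths * residual_life_expectancy

def years_lived_with_disability(prevalent_cases: int, disability_weight: float,
                                average_duration_years: float) -> float:
    return prevalent_cases * disability_weight * average_duration_years

def dalys(deaths: int, residual_le: float, prevalent_cases: int,
          dw: float, duration: float) -> float:
    return (years_of_life_lost(deaths, residual_le)
            + years_lived_with_disability(prevalent_cases, dw, duration))

# Hypothetical example for a single cancer site and year.
print(dalys(deaths=5_000, residual_le=15.0, prevalent_cases=40_000, dw=0.29, duration=1.0))
```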


Author(s):  
Marta Olive‐Gadea ◽  
Manuel Requena ◽  
Facundo Diaz ◽  
Alvaro Garcia‐Tornel ◽  
Marta Rubiera ◽  
...  

Introduction: In acute ischemic stroke patients, current guidelines recommend noninvasive vascular imaging to identify intracranial vessel occlusions (VO) that may benefit from endovascular treatment (EVT). However, VO can be missed in CT angiography (CTA) readings. We aim to evaluate the impact of consistently including CT perfusion (CTP) in admission stroke imaging protocols on VO diagnosis and EVT rates. Methods: We included patients with a suspected acute ischemic stroke who underwent urgent non-contrast CT, CTA and CTP from April to October 2020. Hypoperfusion areas defined by a Tmax>6s delay (RAPID software), congruent with the clinical symptoms and a vascular territory, were considered due to a VO (CTP-VO). Cases in which mechanical thrombectomy was performed were defined as therapeutically relevant VO (EVT-VO). For patients who received EVT, the site of VO according to digital subtraction angiography was recorded. Two experienced neuroradiologists, blinded to CTP but not to clinical symptoms, retrospectively evaluated NCCT and CTA to identify intracranial VO (CTA-VO). We analyzed CTA-VO sensitivity and specificity for detecting CTP-VO and EVT-VO, respectively. We performed a logistic regression to test the association of Tmax>6s volumes with CTA-VO identification and indication of EVT. Results: Of the 338 patients included in the analysis, 157 (46.5%) presented a CTP-VO (median Tmax>6s: 73 [29-127] ml). CTA-VO was identified in 83 (24.5%) of the cases. Overall CTA-VO sensitivity for the detection of CTP-VO was 50.3% and specificity was 97.8%. A higher hypoperfusion volume was associated with increased CTA-VO detection, with an odds ratio of 1.03 (95% confidence interval 1.02-1.04) (figure). DSA was indicated in 107 patients; in 4 of them no EVT was attempted because of recanalization or a VO too distal to treat in the first angiographic run. EVT was performed in 103 patients (30.5%; Tmax>6s: 102 [63-160] ml), representing 65.6% of all CTP-VO. Overall CTA-VO sensitivity for the detection of EVT-VO was 69.9%. The CTA-VO sensitivity for detecting patients with an indication for EVT according to clinical guidelines was as follows: 91.7% for ICA occlusions and 84.4% for M1-MCA occlusions. For all other occlusion sites that received EVT, the CTA-VO sensitivity was 36.1%. The overall specificity was 95.3%. Among patients who received EVT, CTA-VO was not detected in 31 cases, resulting in a false negative rate of 30.1%. False negative CTA-VO cases had lower Tmax>6s volumes (69 [46-99.5] vs 126 [84-169.5] ml, p<0.001) and lower NIHSS (13 [8.5-16] vs 17 [14-21], p<0.001). Conclusions: Systematically including CTP in acute stroke admission imaging protocols may increase the diagnosis of VO and the rate of EVT.
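
As a companion to the reported sensitivity and specificity figures, the Python sketch below shows how paired readings (CTA reading vs a CTP-defined reference) yield those metrics; the patient labels are toy values and this is not the study's analysis code.

```python
# Sketch (not the study's analysis code): sensitivity and specificity of CTA
# readings against a CTP-defined reference, from per-patient boolean labels.

def sensitivity_specificity(cta_positive: list[bool],
                            reference_positive: list[bool]) -> tuple[float, float]:
    tp = sum(c and r for c, r in zip(cta_positive, reference_positive))
    fn = sum((not c) and r for c, r in zip(cta_positive, reference_positive))
    tn = sum((not c) and (not r) for c, r in zip(cta_positive, reference_positive))
    fp = sum(c and (not r) for c, r in zip(cta_positive, reference_positive))
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    specificity = tn / (tn + fp) if (tn + fp) else 0.0
    return sensitivity, specificity

# Toy example: six patients, CTA reading vs CTP-defined occlusion.
cta = [True, False, True, False, False, True]
ctp = [True, True,  True, False, True,  True]
print(sensitivity_specificity(cta, ctp))
```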


Author(s):  
Jati Pratomo ◽  
Monika Kuffer ◽  
Javier Martinez ◽  
Divyani Kohli

Object-Based Image Analysis (OBIA) has been successfully used to map slums. In general, the occurrence of uncertainties in producing geographic data is inevitable. However, most studies have concentrated solely on assessing classification accuracy, neglecting the inherent uncertainties. Our research analyses the impact of uncertainties on measuring the accuracy of OBIA-based slum detection. We selected Jakarta as our case study area because a national policy of slum eradication is causing rapid changes in slum areas. Our research comprises four parts: slum conceptualization, ruleset development, implementation, and accuracy and uncertainty measurements. Existential and extensional uncertainties arise when producing reference data. The comparison of manual expert delineations of slums with the OBIA slum classification results in four combinations: True Positive, False Positive, True Negative and False Negative. However, the higher the True Positive (which leads to better accuracy), the lower the certainty of the results. This demonstrates the impact of extensional uncertainties. Our study also demonstrates the role of non-observable indicators (i.e., land tenure) in assisting slum detection, particularly in areas where uncertainties exist. In conclusion, uncertainties increase when aiming to achieve a higher classification accuracy by matching manual delineation and OBIA classification.
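
A minimal Python illustration of the four combinations the abstract refers to, computed pixel-wise from a manual reference delineation and an OBIA classification mask; the tiny masks used here are assumed placeholders rather than Jakarta data.

```python
# Illustrative pixel-wise comparison of a manual reference delineation with an
# OBIA slum classification (binary masks), yielding the four combinations the
# paper discusses. Masks and values here are assumed placeholders.
import numpy as np

def confusion_counts(reference: np.ndarray, classified: np.ndarray) -> dict:
    """reference/classified: boolean arrays of the same shape (True = slum)."""
    tp = np.sum(reference & classified)
    fp = np.sum(~reference & classified)
    fn = np.sum(reference & ~classified)
    tn = np.sum(~reference & ~classified)
    return {"TP": int(tp), "FP": int(fp), "FN": int(fn), "TN": int(tn),
            "overall_accuracy": float((tp + tn) / reference.size)}

# Tiny toy masks (3 x 3 pixels).
ref = np.array([[1, 1, 0], [1, 0, 0], [0, 0, 0]], dtype=bool)
obia = np.array([[1, 0, 0], [1, 1, 0], [0, 0, 0]], dtype=bool)
print(confusion_counts(ref, obia))
```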


2007 ◽  
Vol 18 (2) ◽  
pp. 121-127 ◽  
Author(s):  
Adrienne Morrow ◽  
Philippe De Wals ◽  
Geneviève Petit ◽  
Maryse Guay ◽  
Lonny James Erickson

BACKGROUND: In the United States, implementation of the seven-valent conjugate vaccine into childhood immunization schedules has had an effect on the burden of pneumococcal disease in all ages of the population. To evaluate the impact in Canada, it is essential to have an estimate of the burden of pneumococcal disease before routine use of the vaccine. METHODS: The incidence and costs of pneumococcal disease in the Canadian population in 2001 were estimated from various sources, including published studies, provincial databases and expert opinion. RESULTS: In 2001, there were 565,000 cases of pneumococcal disease in the Canadian population, with invasive infections representing 0.7%, pneumonia 7.5% and acute otitis media 91.8% of cases. There were a total of 3000 deaths, mainly as a result of pneumonia and largely attributable to the population aged 65 years or older. There were 54,330 life-years lost due to pneumococcal disease, and 37,430 quality-adjusted life-years lost due to acute disease, long-term sequelae and deaths. Societal costs were estimated to be $193 million (range $155 to $295 million), with 82% borne by the health system and 18% borne by families. Invasive pneumococcal infections represented 17% of the costs and noninvasive infections represented 83%, with approximately one-half of this proportion attributable to acute otitis media and myringotomy. CONCLUSIONS: The burden of pneumococcal disease before routine use of the pneumococcal conjugate vaccine was substantial in all age groups of the Canadian population. This estimate provides a baseline for further analysis of the direct and indirect impacts of the vaccine.
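
For readers who want to see the mechanics, a small Python sketch of the kind of burden aggregation described (cases, deaths and societal costs summed over disease categories); all figures are hypothetical placeholders, not the 2001 Canadian estimates.

```python
# Rough sketch of the burden aggregation described (cases, deaths and societal
# costs summed over disease categories). Figures are assumed placeholders.
from dataclasses import dataclass

@dataclass
class Syndrome:
    name: str
    cases: int
    deaths: int
    cost_per_case: float  # societal cost per case, CAD

def total_burden(syndromes: list[Syndrome]) -> dict:
    cases = sum(s.cases for s in syndromes)
    deaths = sum(s.deaths for s in syndromes)
    cost = sum(s.cases * s.cost_per_case for s in syndromes)
    return {"cases": cases, "deaths": deaths, "total_cost_CAD": cost}

burden = total_burden([
    Syndrome("invasive disease", 4_000, 500, 12_000.0),
    Syndrome("pneumonia", 42_000, 2_500, 1_800.0),
    Syndrome("acute otitis media", 520_000, 0, 150.0),
])
print(burden)
```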


Author(s):  
David T Levy ◽  
K Michael Cummings ◽  
Bryan W Heckman ◽  
Yameng Li ◽  
Zhe Yuan ◽  
...  

Abstract Introduction: The U.S. Food and Drug Administration (FDA) has proposed lowering the nicotine content of cigarettes to a minimally addictive level to increase smoking cessation and reduce initiation. This study has two aims: (1) to determine when cigarette manufacturers had the technical capability to reduce cigarette nicotine content and (2) to estimate the lost public health benefits of implementing a standard in 1965, 1975, or 1985. Methods: To determine the technical capability of cigarette companies, we reviewed public patents and internal cigarette company business records using the Truth Tobacco Industry Documents. To evaluate the impact of a very low nicotine content cigarette (VLNC) standard on smoking attributable deaths (SADs) and life-years lost (LYLs), we applied a validated (CISNET) model that uses past smoking data, along with estimates of the potential impact of VLNCs derived from expert elicitation. Results: Cigarette manufacturers recognized that cigarettes were deadly and addictive before 1964. Manufacturers have had the technical capability to lower cigarette nicotine content for decades. Our model projected that a standard implemented in 1965 could have averted 21 million SADs (54% reduction) and 272 million LYLs (64% reduction) from 1965 to 2064, a standard implemented in 1975 could have averted 18.9 million SADs and 245.4 million LYLs from 1975 to 2074, and a standard implemented in 1985 could have averted 16.3 million SADs and 211.5 million LYLs from 1985 to 2084. Conclusions: Millions of premature deaths could have been averted if companies had sold only VLNCs decades ago. The FDA should act immediately to implement a VLNC standard. Implications: Prior research has shown that a mandated reduction in the nicotine content of cigarettes could reduce the prevalence of smoking and improve public health. Here we report that cigarette manufacturers have had the ability to voluntarily implement such a standard for decades. We use a well-validated model to demonstrate that millions of smoking attributable deaths and life-years lost would have been averted if the industry had implemented such a standard.
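
As a toy illustration only (the CISNET model is far more detailed), the Python sketch below shows how cumulative smoking-attributable deaths averted can be projected from a baseline death trajectory and year-by-year relative reductions of the kind obtained from expert elicitation; every number here is an assumed placeholder.

```python
# Highly simplified sketch (not the CISNET model): projecting cumulative
# smoking-attributable deaths (SADs) averted as a baseline death trajectory
# multiplied by an annual relative-reduction fraction. Inputs are placeholders.

def sads_averted(baseline_sads_per_year: list[float],
                 annual_reduction_fraction: list[float]) -> float:
    """Sum over years of baseline deaths times the fraction averted that year."""
    return sum(d * f for d, f in zip(baseline_sads_per_year, annual_reduction_fraction))

# Hypothetical 5-year horizon: reductions phase in as smokers quit or never start.
baseline = [480_000, 475_000, 470_000, 465_000, 460_000]
reduction = [0.02, 0.08, 0.15, 0.22, 0.30]
print(f"SADs averted over horizon: {sads_averted(baseline, reduction):,.0f}")
```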


2019 ◽  
Vol 120 (11) ◽  
pp. 1052-1058 ◽  
Author(s):  
Elisavet Syriopoulou ◽  
Eva Morris ◽  
Paul J. Finan ◽  
Paul C. Lambert ◽  
Mark J. Rutherford

SLEEP ◽  
2020 ◽  
Vol 43 (Supplement_1) ◽  
pp. A225-A225 ◽  
Author(s):  
C D Morse ◽  
S Meissner ◽  
L Kodali

Abstract Introduction: Sleep apnea is a serious disorder associated with numerous health conditions. In clinical practice, providers order screening home sleep testing (HST) for obstructive sleep apnea (OSA); however, there is limited research on the negative predictive value (NPV) and false negative rate of this test. Providers may not understand HST limitations; therefore, what are the NPV and false negative rate in clinical practice? Methods: A retrospective study of non-diagnostic HSTs was conducted in a rural community sleep clinic in the Northeastern US. The study population included adult patients ≥18 years old who underwent HST from 2016-2019. Each non-diagnostic HST result was compared with the gold standard, the patient's nocturnal polysomnogram (NPSG). The results provide the NPV (true negatives/total) and false negative rate (true positives/total) for the non-diagnostic HST. Results: We identified 211 potential patients with a mean age of 43 years, of whom 67% were female. Of those, 85% (n=179) underwent NPSG, with the others declining/delaying testing or lost to follow-up. The non-diagnostic HSTs showed a 15.6% NPV for no apnea using AHI<5 and an 8.4% NPV using a respiratory disturbance index (tRDI)<5. The false negative rate for AHI/tRDI was 84.4% and 91.6%, respectively. The AHI for positive tests ranged from 5-89 per hour (mean AHI 14.9/tRDI 16 per hour), with OSA identified by an elevated AHI (≥5) as 54.2% mild, 21.8% moderate, and 8.4% severe. Conclusion: The high false negative rate of the HST is alarming. Some providers and patients may forgo NPSG after a non-diagnostic HST because of a lack of understanding of the HST's limitations. Knowing that a non-diagnostic HST is a very poor predictor of no sleep apnea will help providers advise patients appropriately about the necessity of the NPSG. The subsequent NPSG provides an accurate diagnosis and, therefore, an informed decision about pursuing or eschewing sleep apnea treatment. Support: None.
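
The Python sketch below mirrors the reported calculation: among patients whose HST was non-diagnostic and who then had in-lab NPSG, the NPV is the share confirmed negative and the false negative rate is the share found to have OSA; the AHI values used here are hypothetical, not the study's data.

```python
# Sketch of the reported metrics: among patients with a non-diagnostic (negative)
# home sleep test who then had in-lab polysomnography, NPV is the share confirmed
# negative and the false negative rate the share found to have OSA.
# The AHI values below are hypothetical, not the study's data.

def npv_and_false_negative_rate(npsg_ahi: list[float],
                                threshold: float = 5.0) -> tuple[float, float]:
    """npsg_ahi: NPSG apnea-hypopnea indices for patients whose HST was non-diagnostic."""
    total = len(npsg_ahi)
    false_negatives = sum(ahi >= threshold for ahi in npsg_ahi)  # OSA missed by HST
    true_negatives = total - false_negatives
    return true_negatives / total, false_negatives / total

# Hypothetical follow-up NPSG results for 10 non-diagnostic HSTs.
ahis = [2.1, 6.4, 11.0, 3.8, 9.2, 27.5, 4.9, 7.7, 15.3, 5.6]
npv, fnr = npv_and_false_negative_rate(ahis)
print(f"NPV = {npv:.1%}, false negative rate = {fnr:.1%}")
```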


2014 ◽  
Vol 24 (2) ◽  
pp. 238-246 ◽  
Author(s):  
Enora Laas ◽  
Mathieu Luyckx ◽  
Marjolein De Cuypere ◽  
Frederic Selle ◽  
Emile Daraï ◽  
...  

Objective: Complete tumor cytoreduction seems to be beneficial for patients with recurrent epithelial ovarian cancer (REOC). The challenge is to identify patients eligible for such surgery. Several scores based on simple clinical parameters have attempted to predict resectability and help in patient selection for surgery in REOC. The aims of this study were to assess the performance of these models in an independent population and to evaluate the impact of complete resection. Materials and Methods: A total of 194 patients with REOC between January 2000 and December 2010 were included in 2 French centers. Two scores were used: the AGO DESKTOP OVAR trial score and a score from Tian et al. The performance (sensitivity, specificity, and predictive values) of these scores was evaluated in our population. Survival curves were constructed to evaluate the survival impact of surgery on recurrence. Results: Positive predictive values for complete resection were 80.6% and 74.0% for the DESKTOP trial score and the Tian score, respectively. The false-negative rate was high for both models (65.4% and 71.4%, respectively). We found a significantly higher survival in the patients with complete resection (59.4 vs 17.9 months, P < 0.01) even after adjustment for the confounding variables (hazard ratio [HR], 2.53; 95% confidence interval, 1.01–6.3; P = 0.04). Conclusions: In REOC, surgery seems to have a positive impact on survival if complete surgery can be achieved. However, factors predicting complete resection are not yet clearly defined. Recurrence-free interval and initial resection seem to be the most relevant factors. Laparoscopic evaluation could help to clarify the indications for surgery.
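
To show one way the adjusted survival comparison could be reproduced, here is a short Python sketch using the lifelines package (the abstract does not name the software actually used), fitting a Cox proportional-hazards model of survival on complete resection with the recurrence-free interval as a covariate; the data frame is entirely hypothetical.

```python
# Hypothetical sketch (not the authors' analysis): Cox model of overall survival
# on complete resection, adjusted for recurrence-free interval, using lifelines.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "months":  [72, 40, 65, 35, 80, 58, 14, 20, 10, 30, 16, 12],  # follow-up (months)
    "died":    [0,  1,  0,  1,  0,  1,  1,  1,  1,  0,  1,  1],   # 1 = death observed
    "complete_resection":              [1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0],
    "recurrence_free_interval_months": [24, 12, 30, 10, 36, 20, 6, 8, 5, 14, 7, 6],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="died")
cph.print_summary()  # the exp(coef) column gives hazard ratios with 95% CIs
```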


2021 ◽  
Vol 22 (1) ◽  
Author(s):  
Michael M. Khayat ◽  
Sayed Mohammad Ebrahim Sahraeian ◽  
Samantha Zarate ◽  
Andrew Carroll ◽  
Huixiao Hong ◽  
...  

Abstract Background: Genomic structural variations (SV) are important determinants of genotypic and phenotypic changes in many organisms. However, the detection of SV from next-generation sequencing data remains challenging. Results: In this study, DNA from a Chinese family quartet is sequenced at three different sequencing centers in triplicate. A total of 288 derivative data sets are generated utilizing different analysis pipelines and compared to identify sources of analytical variability. Mapping methods provide the major contribution to variability, followed by sequencing centers and replicates. Interestingly, SVs supported by only one center or replicate often represent true positives, with 47.02% and 45.44%, respectively, overlapping the long-read SV call set. This is consistent with an overall higher false negative rate for SV calling across centers and replicates compared with mappers (15.72%). Finally, we observe that the SV calling variability also persists in a genotyping approach, indicating the impact of the underlying sequencing and preparation approaches. Conclusions: This study provides the first detailed insights into the sources of variability in SV identification from next-generation sequencing and highlights remaining challenges in SV calling for large cohorts. We further give recommendations on how to reduce SV calling variability and on the choice of alignment methodology.
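
As a simple illustration of the concordance measurement described, a Python sketch that counts what fraction of short-read SV calls private to one center overlap a long-read call set within a positional tolerance; the call representation, tolerance and example calls are simplifying assumptions.

```python
# Illustrative sketch of call-set concordance: the fraction of short-read SV
# calls that overlap a long-read call set within a positional tolerance.
# Call representation and tolerance are simplifying assumptions.

def overlaps_truth(call, truth_calls, tolerance: int = 500) -> bool:
    """call/truth_calls: tuples of (chrom, pos, svtype)."""
    chrom, pos, svtype = call
    return any(chrom == t_chrom and svtype == t_type and abs(pos - t_pos) <= tolerance
               for t_chrom, t_pos, t_type in truth_calls)

def fraction_supported(calls, truth_calls, tolerance: int = 500) -> float:
    supported = sum(overlaps_truth(c, truth_calls, tolerance) for c in calls)
    return supported / len(calls) if calls else 0.0

# Toy example: calls private to one sequencing center vs a long-read truth set.
center_private = [("chr1", 100_200, "DEL"), ("chr2", 5_500_000, "INS"), ("chr3", 9_000, "DEL")]
long_read_truth = [("chr1", 100_450, "DEL"), ("chr2", 5_499_800, "INS"), ("chr7", 1_000, "DUP")]
print(f"{fraction_supported(center_private, long_read_truth):.1%} supported by long reads")
```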

