Pre-Enucleation Chemotherapy for Eyes Severely Affected by Retinoblastoma Masks Risk of Tumor Extension and Increases Death From Metastasis

2011 ◽  
Vol 29 (7) ◽  
pp. 845-851 ◽  
Author(s):  
Junyang Zhao ◽  
Helen Dimaras ◽  
Christine Massey ◽  
Xiaolin Xu ◽  
Dongsheng Huang ◽  
...  

Purpose Initial response of intraocular retinoblastoma to chemotherapy has encouraged primary chemotherapy instead of primary enucleation for eyes with clinical features suggesting a high risk of extraocular extension or metastasis. Upfront enucleation of such high-risk eyes allows pathologic evaluation of extraocular extension, key to management with appropriate surveillance and adjuvant therapy. Does chemotherapy before enucleation mask histologic features of extraocular extension, potentially endangering the child's life by subsequent undertreatment? Methods We performed a retrospective analysis of 100 eyes with advanced retinoblastoma enucleated, with or without primary chemotherapy, at Beijing Tongren Hospital from October 31, 2008. The extent of retinoblastoma invasion into the optic nerve, uvea, and anterior chamber on histopathology was staged by the pTNM classification. The treatment groups were compared for pathologic stage (Cochran-Armitage trend test) and disease-specific mortality (competing risks methods). Results Children who received chemotherapy before enucleation had lower pTNM stage than primarily enucleated children (P = .01). Five patients who received pre-enucleation chemotherapy died as a result of extension into the brain or metastasis. No patients who had primary enucleation died. For children with group E eyes, disease-specific survival (DSS) was lower with pre-enucleation chemotherapy (n = 45) than with primary enucleation (n = 37; P = .01). Enucleation more than 3 months after diagnosis was also associated with lower DSS (P < .001). Conclusion Chemotherapy before enucleation of group E eyes with advanced retinoblastoma downstaged pathologic evidence of extraocular extension and, when enucleation was performed more than 3 months after diagnosis, increased the risk of metastatic death through reduced surveillance and inappropriate management of high-risk disease.

2020 ◽  
Vol 98 (Supplement_3) ◽  
pp. 9-10
Author(s):  
Maggie J Smith ◽  
Mike E King ◽  
Karol E Fike ◽  
Esther D McCabe ◽  
Glenn M Rogers ◽  
...  

Abstract The objective of this study was to identify trends in the percentage of each type of respiratory viral vaccine administered to lots of beef calves offered for sale in summer video auctions from 2000 through 2018. There were 59,762 lots of single-gender beef calves (7,167,352 total calves) offered for sale in 145 summer video auctions during these years. Information describing calf lots was obtained from the auction service (Superior Livestock Auction, Fort Worth, TX), which included the named vaccines administered to each lot. Named 4- or 5-way respiratory viral vaccines were classified into three groups based on the type of antigens they contained: all modified live antigens (MLV), all killed antigens (KILLED), and a combination of modified live and killed antigens (COMBO). The Cochran-Armitage trend test was used to quantify the significance of a trend in the usage of each respiratory viral vaccine type. There was an increase (P < 0.0001) in the percentage of MLV vaccines given to beef calf lots from 2000 (39.7%) through 2018 (88.9%). At the same time, the percentages of both KILLED and COMBO vaccines administered to lots of beef calves declined (P < 0.0001 for each). In 2000, 31.2% and 29.1% of the total respiratory viral vaccines given to beef calf lots were KILLED or COMBO vaccines, respectively. By 2018, only 4.7% of respiratory viral vaccines were KILLED and only 6.4% were COMBO vaccines. This dramatic shift indicates an industry trend toward increasing MLV vaccine utilization and declining usage of KILLED and COMBO vaccines. This trend may be a result of MLV vaccine approval for use in calves nursing pregnant cows.


2017 ◽  
Vol 41 (S1) ◽  
pp. S575-S576
Author(s):  
Z. Mansuri ◽  
S. Patel ◽  
P. Patel ◽  
O. Jayeola ◽  
A. Das ◽  
...  

Objective To determine trends and the impact on outcomes of atrial fibrillation (AF) in patients with pre-existing psychosis. Background While post-AF psychosis has been extensively studied, contemporary studies, including temporal trends, on the impact of pre-existing psychosis on AF and post-AF outcomes are largely lacking. Methods We used the Nationwide Inpatient Sample (NIS) from the Healthcare Cost and Utilization Project (HCUP) for the years 2002–2012. We identified AF and psychosis as primary and secondary diagnoses, respectively, using validated International Classification of Diseases, 9th Revision, Clinical Modification (ICD-9-CM) codes, and used the Cochran-Armitage trend test and multivariate regression to generate adjusted odds ratios (aOR). Results We analyzed a total of 3,887,827 AF hospital admissions from 2002–2012, of which 1.76% had psychosis. The proportion of hospitalizations with psychosis increased from 5.23% to 14.28% (P trend < 0.001). Utilization of atrial cardioversion was lower in patients with psychosis (0.76% vs. 5.79%; P < 0.001). In-hospital mortality was higher in patients with psychosis (aOR 1.206; 95% CI 1.003–1.449; P < 0.001), and discharge to specialty care was significantly more frequent (aOR 4.173; 95% CI 3.934–4.427; P < 0.001). The median length of hospitalization (3.13 vs. 2.14 days; P < 0.001) and median cost of hospitalization (16,457 vs. 13,172; P < 0.001) were also higher in hospitalizations with psychosis. Conclusions Our study showed an increasing proportion of patients with psychosis admitted for AF, with higher mortality and markedly higher morbidity post-AF, and significantly lower utilization of atrial cardioversion. There is a need to explore the reasons behind this disparity to improve post-AF outcomes in this vulnerable population. Disclosure of interest The authors have not supplied their declaration of competing interest.


2016 ◽  
Vol 27 (9) ◽  
pp. 2657-2673 ◽  
Author(s):  
Mathieu Emily

The Cochran-Armitage trend test (CA) has become a standard procedure for association testing in large-scale genome-wide association studies (GWAS). However, when the disease model is unknown, there is no consensus on the most powerful test to use among CA, allelic, and genotypic tests. In this article, we tackle the question of whether CA is best suited to single-locus scanning in GWAS and propose a power comparison of CA against allelic and genotypic tests. Our approach relies on the evaluation of Taylor decompositions of non-centrality parameters, thus allowing an analytical comparison of the power functions of the tests. Compared with simulation-based comparisons, our approach offers the advantage of simultaneously accounting for the multidimensionality of the set of features involved in the power functions. Although the power of CA depends on the sample size, the case-to-control ratio, and the minor allele frequency (MAF), our results first show that it is largely influenced by the mode of inheritance and by deviation from Hardy–Weinberg Equilibrium (HWE). Furthermore, when compared with the other tests, CA is shown to be the most powerful test under a multiplicative disease model or when the single-nucleotide polymorphism deviates largely from HWE. In all other situations, CA lacks power, and the differences can be substantial, especially under a recessive mode of inheritance. Finally, our results are illustrated by comparing the performance of the statistics in two genome scans.
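For readers unfamiliar with the statistic compared throughout this abstract, the two-sided Cochran-Armitage trend test for a 2 × k table can be sketched in a few lines of Python. This is a minimal implementation of the standard score-weighted statistic and its null variance; the function name and the default scores 0, 1, ..., k−1 (an additive genetic coding) are illustrative choices, not taken from the article:

```python
import numpy as np
from scipy.stats import norm

def cochran_armitage_trend(cases, controls, scores=None):
    """Two-sided Cochran-Armitage trend test for a 2 x k table.

    cases, controls: counts per ordered category (e.g. genotypes AA, Aa, aa).
    scores: category scores; defaults to 0, 1, 2, ... (additive coding).
    Returns (z, p_value).
    """
    cases = np.asarray(cases, dtype=float)
    controls = np.asarray(controls, dtype=float)
    k = len(cases)
    t = np.arange(k, dtype=float) if scores is None else np.asarray(scores, float)

    col = cases + controls              # category totals C_i
    r1, r2 = cases.sum(), controls.sum()
    n = r1 + r2

    # Trend statistic: score-weighted imbalance between cases and controls
    stat = np.sum(t * (cases * r2 - controls * r1))

    # Null variance under hypergeometric sampling of the table
    var = (r1 * r2 / n) * (np.sum(t**2 * col * (n - col))
                           - 2 * np.sum(np.triu(np.outer(t, t), 1)
                                        * np.outer(col, col)))
    z = stat / np.sqrt(var)
    p = 2 * norm.sf(abs(z))
    return z, p
```

With identical case and control profiles the statistic is zero (p = 1); with a monotone shift of cases toward higher-scored categories, |z| grows and p shrinks, which is the trend the test is designed to detect.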


2012 ◽  
Vol 6 (6) ◽  
pp. 442 ◽  
Author(s):  
Todd M. Webster ◽  
Christopher Newell ◽  
John F. Amrhein

Objective: Cancer Care Ontario has published an evidence-based guideline on its website, "Guideline for Optimization of Surgical and Pathological Quality Performance for Radical Prostatectomy in Prostate Cancer Management: Surgical and Pathological Guidelines." The evidentiary base for this guideline was recently published in CUAJ. The CCO guideline proposes the following: a positive surgical margin (PSM) rate of <25% for organ-confined disease (pT2), a perioperative mortality of <1%, a rate of rectal injury of <1%, and a blood transfusion rate of <10% in non-anemic patients. The objective of this study was to review the radical prostatectomy practice at Grey Bruce Health Services, an Ontario community hospital, and to compare our performance with the Cancer Care Ontario guideline and the literature. Methods: We conducted a retrospective review of all radical prostatectomies performed at Grey Bruce Health Services from January 1, 2006 to December 31, 2007. The following data were obtained from clinical records and pathology reports: patient age, pre-biopsy prostate-specific antigen, biopsy Gleason score, resected prostate gland weight, radical prostatectomy Gleason score, surgical margin status, pathological tumour stage (pT), lymph node dissection status, perioperative incidence of transfusion of blood products and whether the patient was anemic (hemoglobin <140 g/L) preoperatively, incidence of rectal injury, and perioperative mortality within 30 days following surgery. Results: Using the method proposed by D'Amico, most patients undergoing radical prostatectomy were intermediate risk (62%), with a minority of low-risk (24%) and high-risk (14%) patients. The overall PSM rate was 37%. The rate of PSMs in organ-confined disease (pT2) was 26%. There was a statistically significant trend between increasing D'Amico risk category and increasing rate of PSM (Cochran-Armitage trend test, p = 0.023).
There was a strong correlation between pathological tumour stage and the rate of PSM (Cochran-Armitage trend test, p = 0.0003). The rate of blood transfusion in non-anemic patients was 6%. One patient (0.8%) experienced a rectal injury. There were no perioperative deaths in our study group. Conclusion: Our results show that a community hospital group can appropriately select patients to undergo radical prostatectomy and achieve an acceptable rate of PSMs. We believe that ongoing critical appraisal and reflective practice are essential to improving surgical outcomes and providing quality care.


2021 ◽  
pp. ijgc-2021-002582
Author(s):  
Gitte Ortoft ◽  
Claus Høgdall ◽  
Estrid Stæhr Hansen ◽  
Margit Dueholm

Objective To compare the performance of the new ESGO-ESTRO-ESP (European Society of Gynecological Oncology-European Society for Radiotherapy & Oncology-European Society for Pathology) 2020 risk classification system with the previous 2016 risk classification in predicting survival and patterns of recurrence in the Danish endometrial cancer population. Methods This Danish national cohort study included 4516 patients with endometrial cancer treated between 2005 and 2012. Five-year Kaplan–Meier adjusted and unadjusted survival estimates and actuarial recurrence rates were calculated for the previous and the new classification systems. Results In the 2020 risk classification system, 81.0% of patients were allocated to low, intermediate, or high-intermediate risk, compared with 69.1% in the 2016 system, mainly due to reclassification of 44.5% of patients previously classified as high risk to either intermediate or, especially, high-intermediate risk. Survival in the 2020 high risk group was significantly lower, and the recurrence rate, especially the non-local recurrence rate, was significantly higher than in the 2016 high risk group (2020/2016: overall survival 59%/66%; disease-specific survival 69%/76%; recurrence 40.5%/32.3%; non-local recurrence 34.5%/25.8%). Survival and recurrence rates in the other risk groups, and the decline in overall and disease-specific survival rates from the low risk group to the higher risk groups, were similar in patients classified according to the 2016 and 2020 systems. Conclusion The new ESGO-ESTRO-ESP 2020 risk classification system allocated fewer patients to the high risk group than the previous system. The main differences were lower overall and disease-specific survival and a higher recurrence rate in the 2020 high risk group.
The introduction of the new 2020 risk classification will result in fewer patients classified as high risk, and allocation to the new high risk group will predict lower survival, potentially allowing more specific selection of patients for postoperative adjuvant therapy.
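The five-year Kaplan–Meier survival estimates reported above come from the standard product-limit construction, which can be sketched in a few lines of Python. This is a minimal estimator under the usual right-censoring assumptions; the function and variable names are illustrative, not drawn from the study:

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimate.

    times: follow-up time for each patient.
    events: 1 if the event (e.g. death) occurred, 0 if censored.
    Returns (event_times, survival_prob) evaluated at each distinct event time.
    """
    order = np.argsort(times)
    t, e = np.asarray(times)[order], np.asarray(events)[order]
    event_times = np.unique(t[e == 1])
    surv, out = 1.0, []
    for u in event_times:
        at_risk = np.sum(t >= u)          # still under observation just before u
        d = np.sum((t == u) & (e == 1))   # events occurring at u
        surv *= 1.0 - d / at_risk         # multiply in the conditional survival
        out.append(surv)
    return event_times, np.array(out)
```

A five-year survival figure such as the 59% reported for the 2020 high risk group corresponds to reading this step function at t = 5 years; censored patients contribute to the risk sets without triggering a drop in the curve.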


2020 ◽  
pp. 29-52
Author(s):  
Chaitanya Lakkimsetti

This chapter provides an overview of HIV/AIDS policies as well as how sexually marginalized groups are drawn into biopower programs as "high-risk" groups. In 1983, when HIV/AIDS was first detected among sex workers in India, the state's initial response was to blame the sex workers themselves, to test them forcibly, and to confine them in prison. However, it proved impossible to incarcerate every sex worker and thereby stop the spread of the HIV/AIDS epidemic. Instead, I argue, a consensus ultimately formed that supported giving marginalized groups a leadership role in tackling the epidemic. Drawing on ethnographic observations and the HIV/AIDS policy of the National AIDS Control Organization (NACO), this chapter also highlights how these biopower projects deepened the involvement of high-risk groups as they moved from simple prevention to behavioral change. Ultimately, communities became extensions of biopower projects as they implemented these programs at the day-to-day level.


Author(s):  
Niranjan Sathianathen

This chapter describes the design, main findings, relevance, and limitations of the landmark Prostate Cancer Prevention Trial (PCPT), which randomized men to finasteride versus placebo and followed them for 7 years. It found a major reduction in prostate cancer incidence but also a higher proportion of high-risk cancer in men diagnosed with prostate cancer. The study did not address the more important oncological outcomes of disease-specific and overall survival. Secondary analyses of PCPT outcomes favored the finasteride arm and suggested that the risk of high-risk cancer is not increased. Linkage analysis of participants from PCPT to Medicare claims data suggested no adverse long-term cardiac, endocrine, or sexual effects.


2011 ◽  
Vol 2011 ◽  
pp. 1-7 ◽  
Author(s):  
Deborah T. Gold ◽  
David L. Weinstein ◽  
Gerhardt Pohl ◽  
Kelly D. Krohn ◽  
Yi Chen ◽  
...  

Purpose. To determine patient-reported reasons for discontinuation of teriparatide. Methods. Patients taking teriparatide in a multicenter, prospective, observational study were given three questionnaires: baseline, follow-up questionnaire 1 (QF1, 2 to 6 months), and follow-up questionnaire 2 (QF2, 12 months). Discontinuation reported at QF1 and QF2 was defined as "early" and "late," respectively, and the remaining patients were considered persistent. The Cochran-Armitage trend test was used to identify factors associated with discontinuation. Results. Side effects, concern about improper use, injection difficulties, and several patient-perceived physician issues were associated with early discontinuation. Low patient-perceived importance of continuing treatment, side effects, difficulty paying, and low patient-perceived physician knowledge were associated with late discontinuation. The most common specific reasons selected for discontinuing treatment were "concerns about treatment outweighing the benefits" (n=53) and "difficulty paying" (n=47). Conclusions. Persistence with teriparatide depends on managing side effects, addressing financial challenges, providing proper training, and obtaining support from the healthcare provider.


2015 ◽  
Vol 81 (6) ◽  
pp. 600-604 ◽  
Author(s):  
Stephen C. Gale ◽  
Dena Arumugam ◽  
Viktor Y. Dombrovskiy

Traditionally, general surgeons provide emergency general surgery (EGS) coverage by assigned call. The acute care surgery (ACS) model is new and remains confined mostly to academic centers. Some argue that in busy trauma centers, on-call trauma surgeons may be unable to also care for EGS patients. In New Jersey, all three Level 1 Trauma Centers (L1TC) have provided ACS services for many years. Analyzing NJ state inpatient data, we sought to determine whether outcomes in one common surgical illness, diverticulitis, have differed between L1TC and nontrauma centers (NTC) over a 10-year period. The NJ Medical Database was queried for patients aged 18 to 90 hospitalized from 2001 to 2010 for acute diverticulitis. Demographics, comorbidities, operative rates, and mortality were compiled and analyzed comparing L1TC with NTC. For additional comparison between L1TC and NTC, 1:1 propensity score matching with replacement was performed. The χ2 test, t test, and Cochran-Armitage trend test were used. From 2001 to 2010, 88,794 patients were treated in NJ for diverticulitis; 2621 patients (2.95%) were treated at L1TCs. Operative rates were similar between hospital types. Patients treated at L1TCs were more often younger (63.1 ± 0.3 vs 64.7 ± 0.1; P < 0.001), nonwhite (43.1% vs 23.1%; P < 0.0001), and uninsured (11.0% vs 5.5%; P < 0.0001). After propensity matching, neither operative mortality (9.7% vs 7.9%; P = 0.45) nor nonoperative mortality (1.2% vs 1.3%; P = 0.60) differed between groups. Mortality and operative rates for patients with acute diverticulitis are equivalent between L1TC and NTC in NJ. Trauma centers in NJ more commonly provide care to minority and uninsured patients.
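The 1:1 propensity score matching with replacement described in the methods can be sketched as follows. This is a minimal illustration using logistic-regression propensity scores and nearest-neighbor matching; the study's actual covariates and software are not specified, so all names and modeling choices here are assumptions:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def match_with_replacement(X, treated):
    """1:1 nearest-neighbor propensity-score matching with replacement.

    X: covariate matrix (e.g. age, comorbidities, insurance status).
    treated: boolean indicator of the exposure (e.g. treated at an L1TC).
    Returns (treated_indices, matched_control_indices), aligned pairwise.
    """
    # Propensity score: estimated probability of exposure given covariates
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    t_idx = np.flatnonzero(treated)
    c_idx = np.flatnonzero(~treated)
    # For each exposed unit, pick the control with the closest score;
    # "with replacement" means the same control may be reused.
    matches = c_idx[np.argmin(np.abs(ps[t_idx][:, None] - ps[c_idx][None, :]),
                              axis=1)]
    return t_idx, matches
```

Matching with replacement, as used here, favors match quality over control-group independence: reused controls reduce bias but require weighted or paired analyses afterward, which is consistent with the paired outcome comparisons the abstract reports.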

