Multiple Immunofluorescence Imaging Analysis Reveals Differential Expression of Disialogangliosides GD3 and GD2 in Neuroblastomas

2021 ◽  
pp. 109352662110487
Author(s):  
Haruna Nishimaki ◽  
Yoko Nakanishi ◽  
Hiroshi Yagasaki ◽  
Shinobu Masuda

Background Peripheral neuroblastic tumors (pNTs) are the most common childhood extracranial solid tumors. There are several therapeutic strategies targeting disialoganglioside GD2, and disialoganglioside GD3 has become a potential target. However, the mechanism by which pNTs express GD3 and GD2 remains unclear. We investigated the combined expression status of GD3 and GD2 in pNTs and delineated their clinicopathological values. Methods GD3 and GD2 expression was examined in pNT tissue samples (n = 35) using immunohistochemistry and multiple immunofluorescence imaging. Results GD3 and GD2 expression was positive in 32/35 and 25/35 samples, respectively. Combinatorial analysis of GD3 and GD2 expression in neuroblastoma showed that both were heterogeneously expressed from cell to cell. There were higher numbers of GD3-positive and GD2-negative cells in the low-risk group than in the intermediate-risk (P = 0.014) and high-risk (P = 0.009) groups. Cases with high proportions of GD3-positive and GD2-negative cells were associated with International Neuroblastoma Staging System stage (P = 0.004), Children’s Oncology Group risk group (P = 0.001), and outcome (P = 0.019), and tended to have a higher overall survival rate. Conclusion We demonstrated that neuroblastomas from low-risk patients included more GD3-positive and GD2-negative cells than those from high-risk patients. Clarifying the heterogeneity of neuroblastoma aids in better understanding its biological characteristics and clinical behavior.
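
As a rough illustration of the per-cell combinatorial analysis described above, the sketch below classifies segmented cells into the four GD3/GD2 combinations from multiplex immunofluorescence intensities and reports their proportions for one case. The column names and intensity cutoffs are hypothetical placeholders, not values from the study.

```python
# Sketch: classify segmented cells into GD3/GD2 combinations from
# multiplex immunofluorescence intensities and report per-case proportions.
# Column names and positivity thresholds are hypothetical.
import pandas as pd

GD3_THRESHOLD = 500.0   # assumed intensity cutoff (arbitrary units)
GD2_THRESHOLD = 500.0

def combinatorial_fractions(cells: pd.DataFrame) -> pd.Series:
    """cells: one row per segmented cell with 'gd3_intensity' and 'gd2_intensity'."""
    gd3_pos = cells["gd3_intensity"] >= GD3_THRESHOLD
    gd2_pos = cells["gd2_intensity"] >= GD2_THRESHOLD
    labels = pd.Series("GD3-/GD2-", index=cells.index)
    labels[gd3_pos & gd2_pos] = "GD3+/GD2+"
    labels[gd3_pos & ~gd2_pos] = "GD3+/GD2-"
    labels[~gd3_pos & gd2_pos] = "GD3-/GD2+"
    return labels.value_counts(normalize=True)

# Toy per-cell measurements for a single case
cells = pd.DataFrame({
    "gd3_intensity": [820, 90, 640, 300, 710],
    "gd2_intensity": [150, 40, 900, 620, 80],
})
print(combinatorial_fractions(cells))
```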

Author(s):  
Yan Fan ◽  
Hong Shen ◽  
Brandon Stacey ◽  
David Zhao ◽  
Robert J. Applegate ◽  
...  

Abstract The purpose of this study was to explore the utility of echocardiography and the EuroSCORE II in stratifying patients with low-gradient severe aortic stenosis (LG SAS) and preserved left ventricular ejection fraction (LVEF ≥ 50%) with or without aortic valve intervention (AVI). The study included 323 patients with LG SAS (aortic valve area ≤ 1.0 cm² and mean pressure gradient < 40 mmHg). Patients were divided into two groups: a high-risk group (EuroSCORE II ≥ 4%, n = 115) and a low-risk group (EuroSCORE II < 4%, n = 208). Echocardiographic and clinical characteristics were analyzed. All-cause mortality was used as the clinical outcome during a mean follow-up of 2 ± 1.3 years. Two-year cumulative survival was significantly lower in the high-risk group than in the low-risk group (62.3% vs. 81.7%, p = 0.001). AVI tended to reduce mortality in the high-risk patients (70% vs. 59%; p = 0.065). It did not significantly reduce mortality in the low-risk patients (82.8% with AVI vs. 81.2%, p = 0.68). Multivariable analysis identified heart failure, renal dysfunction and stroke volume index (SVi) as independent predictors of mortality. The study suggested that individualization of AVI based on risk stratification could be considered in patients with LG SAS and preserved LVEF.
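
A minimal sketch of the stratification and survival comparison described above, assuming a patient table with hypothetical column names: patients are dichotomized at EuroSCORE II 4% and all-cause mortality is compared with Kaplan-Meier estimates and a log-rank test using the lifelines library. The data shown are toy values.

```python
# Sketch: dichotomize patients by EuroSCORE II (>= 4% = high risk) and compare
# all-cause mortality with Kaplan-Meier curves and a log-rank test.
# DataFrame columns and values are hypothetical placeholders.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "euroscore2": [5.1, 2.0, 7.3, 1.2, 4.4, 0.9],
    "years_followed": [1.0, 2.5, 0.8, 3.0, 1.7, 2.9],
    "died": [1, 0, 1, 0, 1, 0],
})
df["high_risk"] = df["euroscore2"] >= 4.0

kmf = KaplanMeierFitter()
for label, grp in df.groupby("high_risk"):
    kmf.fit(grp["years_followed"], grp["died"], label=f"high_risk={label}")
    print(kmf.survival_function_)

high, low = df[df["high_risk"]], df[~df["high_risk"]]
result = logrank_test(high["years_followed"], low["years_followed"],
                      event_observed_A=high["died"], event_observed_B=low["died"])
print("log-rank p =", result.p_value)
```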


2018 ◽  
Vol 9 (1_suppl) ◽  
pp. 5-12 ◽  
Author(s):  
Dominique N van Dongen ◽  
Rudolf T Tolsma ◽  
Marion J Fokkert ◽  
Erik A Badings ◽  
Aize van der Sluis ◽  
...  

Background: Pre-hospital risk stratification of non-ST-elevation acute coronary syndrome (NSTE-ACS) by the complete HEART score has not yet been assessed. We investigated whether pre-hospital risk stratification of patients with suspected NSTE-ACS using the HEART score is accurate in predicting major adverse cardiac events (MACE). Methods: This is a prospective observational study including 700 patients with suspected NSTE-ACS. Risk stratification was performed by ambulance paramedics using the HEART score; low risk was defined as a HEART score ≤ 3. The primary endpoint was the occurrence of MACE within 45 days after inclusion. The secondary endpoint was myocardial infarction or death. Results: A total of 172 patients (24.6%) were stratified as low risk and 528 patients (75.4%) as intermediate to high risk. Mean age was 53.9 years in the low risk group and 66.7 years in the intermediate to high risk group (p < 0.001); 50% were male in the low risk group versus 60% in the intermediate to high risk group (p = 0.026). MACE occurred in five patients in the low risk group (2.9%) and in 111 patients (21.0%) at intermediate or high risk (p < 0.001). There were no deaths in the low risk group and the occurrence of acute myocardial infarction in this group was 1.2%. In the intermediate to high risk group, six patients died (1.1%) and 76 patients had a myocardial infarction (14.4%). Conclusions: In suspected NSTE-ACS, pre-hospital risk stratification by ambulance paramedics, including troponin measurement, is accurate in differentiating between low risk and intermediate to high risk. Future studies should investigate whether transportation of low risk patients to a hospital can be avoided, and whether high risk patients benefit from immediate transfer to a hospital with facilities for early coronary angiography.
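
For readers unfamiliar with the HEART score used for the pre-hospital stratification above, the sketch below totals its five components (each scored 0-2) and applies the study's cutoff of 3 or less for low risk. The component definitions follow the published HEART score; the field names and the example patient are illustrative.

```python
# Sketch: total the five HEART components (History, ECG, Age, Risk factors,
# Troponin; each 0-2) and apply the study's pre-hospital cutoff
# (total <= 3 = low risk, otherwise intermediate to high risk).
from dataclasses import dataclass

@dataclass
class HeartComponents:
    history: int       # 0 = slightly, 1 = moderately, 2 = highly suspicious
    ecg: int           # 0 = normal, 1 = nonspecific changes, 2 = significant ST deviation
    age_years: int
    risk_factors: int  # 0 = none, 1 = 1-2 factors, 2 = >=3 factors or known atherosclerosis
    troponin: int      # 0 = <= normal limit, 1 = 1-3x, 2 = >3x normal limit

def heart_score(c: HeartComponents) -> int:
    age_points = 0 if c.age_years < 45 else (1 if c.age_years < 65 else 2)
    return c.history + c.ecg + age_points + c.risk_factors + c.troponin

def risk_group(score: int) -> str:
    return "low" if score <= 3 else "intermediate-to-high"

patient = HeartComponents(history=1, ecg=0, age_years=54, risk_factors=1, troponin=0)
score = heart_score(patient)
print(score, risk_group(score))   # 3 low
```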


2020 ◽  
Vol 12 (2) ◽  
Author(s):  
Widowati W ◽  
Akbar SH ◽  
Tin MH

Introduction: Enamel demineralization is associated with a decrease in saliva pH due to fermentation of sugar by oral commensal bacteria. Thus, exploring the changing pattern of saliva pH is meaningful for dental caries prevention. The aim of this study was to compare the changing pattern of saliva pH after consuming different types of sweeteners (sucrose and maltitol). Methods: It was a case-control study involving 14 male patients attending the IIUM dental clinic, selected so as to obtain seven patients with high caries risk (DMFT ≥ 6) and seven patients with low caries risk (DMFT ≤ 3), all with an initial saliva pH between 6.5 and 7.5. Patients were asked to consume snacks containing 8 g of sucrose and 8 g of maltitol as sweeteners. The changing pH values of the saliva were measured with a Waterproof pHTestr 10BNC (Oakton, Vernon Hills, USA) seven times consecutively: at 0 minutes (before snack consumption) and at 5, 10, 15, 20, 30 and 60 minutes after snack consumption. The saliva pH values of patients with low and high caries risk after consuming sucrose and maltitol were statistically analyzed using ANOVA and Tukey HSD tests at α = 0.05. Results: There were significant differences in saliva pH changes between the low-risk and high-risk groups after consuming sucrose and maltitol. Conclusion: The changing patterns of saliva pH in high-risk patients were lower than those of low-risk patients after consuming the two types of snacks containing sucrose and maltitol.
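
As a sketch of the statistical comparison described above (ANOVA followed by Tukey HSD at α = 0.05), the code below compares saliva pH at a single time point across the four risk/sweetener combinations. The pH values are toy data, not the study's measurements.

```python
# Sketch: one-way ANOVA and Tukey HSD on saliva pH measured at a single time
# point across the four risk/sweetener combinations (toy data).
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

ph = {
    "low_sucrose":   [6.9, 6.8, 7.0, 6.7, 6.9, 6.8, 7.0],
    "low_maltitol":  [7.1, 7.0, 7.2, 7.0, 7.1, 7.2, 7.0],
    "high_sucrose":  [6.2, 6.1, 6.3, 6.0, 6.2, 6.1, 6.3],
    "high_maltitol": [6.8, 6.7, 6.9, 6.6, 6.8, 6.7, 6.9],
}

# Overall test for any difference among the four groups
f_stat, p_value = f_oneway(*ph.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Pairwise comparisons with Tukey HSD at alpha = 0.05
values = np.concatenate(list(ph.values()))
groups = np.repeat(list(ph.keys()), [len(v) for v in ph.values()])
print(pairwise_tukeyhsd(values, groups, alpha=0.05))
```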


2020 ◽  
Author(s):  
Yi Ding ◽  
Tian Li ◽  
Min Li ◽  
Tuersong Tayier ◽  
MeiLin Zhang ◽  
...  

Abstract Background: Autophagy and long non-coding RNAs (lncRNAs) have been the focus of research on the pathogenesis of melanoma. However, the autophagy network of lncRNAs in melanoma has not been reported. The purpose of this study was to identify lncRNA prognostic markers related to melanoma autophagy and predict the prognosis of patients with melanoma. Methods: We downloaded RNA-sequencing data and clinical information of melanoma from The Cancer Genome Atlas. The co-expression of autophagy-related genes (ARGs) and lncRNAs was analyzed. A risk model of autophagy-related lncRNAs was established by univariate and multivariate Cox regression analyses, and the best prognostic index was evaluated in combination with clinical data. Finally, gene set enrichment analysis was performed on patients in the high- and low-risk groups. Results: According to the results of the univariate Cox analysis, only overexpression of LINC00520 was associated with poor overall survival, unlike HLA-DQB1-AS1, USP30-AS1, AL645929, AL365361, LINC00324, and AC055822. The results of the multivariate Cox analysis showed that the overall survival of patients in the high-risk group was shorter than that recorded in the low-risk group (p < 0.001). Moreover, in the receiver operating characteristic curve of the risk model we constructed, the area under the curve (AUC) was 0.734, while the AUCs of T and N stage were 0.707 and 0.658, respectively. Gene Ontology terms were mainly enriched in the positive regulation of autophagy and activation of the immune system. The results of the Kyoto Encyclopedia of Genes and Genomes (KEGG) enrichment were mostly related to autophagy, immunity, and melanin metabolism. Conclusion: The positive regulation of autophagy may slow the transition of melanoma patients from low risk to high risk. Furthermore, compared with clinical information, the autophagy-related lncRNA risk model may better predict the prognosis of patients with melanoma and provide new treatment ideas.
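
The following sketch illustrates the kind of autophagy-related lncRNA risk model described above: a multivariate Cox regression is fitted with the lifelines library, the linear predictor is used as a risk score, and patients are split at the median into high- and low-risk groups. The expression values and outcomes are simulated placeholders, and only two of the lncRNAs named in the abstract are included for brevity.

```python
# Sketch: fit a multivariate Cox model on lncRNA expression, derive a risk
# score from the fitted model, and split patients at the median risk score.
# Data are simulated; column names are placeholders.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 100
df = pd.DataFrame({
    "LINC00520": rng.lognormal(size=n),
    "USP30_AS1": rng.lognormal(size=n),
    "survival_months": rng.exponential(36, size=n),
    "dead": rng.integers(0, 2, size=n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="survival_months", event_col="dead")
cph.print_summary()

# Risk score from the fitted model (partial hazard); median split into groups.
df["risk_score"] = cph.predict_partial_hazard(df)
df["risk_group"] = np.where(df["risk_score"] > df["risk_score"].median(),
                            "high", "low")
print(df["risk_group"].value_counts())
```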


2019 ◽  
Vol 40 (Supplement_1) ◽  
Author(s):  
T Grinberg ◽  
T Bental ◽  
Y Hammer ◽  
A R Assali ◽  
H Vaknin-Assa ◽  
...  

Abstract Background Following myocardial infarction (MI), patients are at increased risk for recurrent cardiovascular events, particularly during the immediate period. Yet some patients are at higher risk than others owing to their clinical characteristics and comorbidities, and these high-risk patients are less often treated with guideline-recommended therapies. Aim To examine temporal trends in treatment and outcomes of patients with MI according to the TIMI risk score for secondary prevention (TRS2°P), a recently validated risk stratification tool. Methods A retrospective cohort study of patients with an acute MI who underwent percutaneous coronary intervention and were discharged alive between 2004–2016. Temporal trends were examined in the early (2004–2010) and late (2011–2016) time periods. Patients were stratified by the TRS2°P into a low (≤1), intermediate (2) or high-risk group (≥3). Clinical outcomes included 30-day MACE (death, MI, target vessel revascularization, coronary artery bypass grafting, unstable angina or stroke) and 1-year mortality. Results Among 4921 patients, 31% were low-risk, 27% intermediate-risk and 42% high-risk. Compared with low and intermediate-risk patients, high-risk patients were older, more commonly female, and had more comorbidities such as hypertension, diabetes, peripheral vascular disease, and chronic kidney disease. They presented more often with non-ST-elevation MI and 3-vessel disease. High-risk patients were less likely to receive drug-eluting stents and potent anti-platelet drugs, among other guideline-recommended therapies. Accordingly, they experienced higher 30-day MACE (8.1% vs. 3.9% and 2.1% in intermediate and low-risk patients, respectively, P<0.001) and 1-year mortality (10.4% vs. 3.9% and 1.1% in intermediate and low-risk patients, respectively, P<0.001). Comparing the early with the late period, the use of potent antiplatelets and statins increased across the entire cohort (P<0.001). However, only the high-risk group demonstrated a significantly lower 30-day MACE rate over time (P=0.001). There were no differences in 1-year mortality over time in any risk category. [Figure: Temporal trends in 30-day MACE by TRS2°P.] Conclusion Despite better application of guideline-recommended therapies, high-risk patients after MI are still relatively undertreated. Nevertheless, they demonstrated the most notable improvement in outcomes over time.
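
As a small illustration of the stratification and temporal comparison described above, the sketch below assigns patients to risk groups using the study's TRS2°P cutoffs (≤1 low, 2 intermediate, ≥3 high) and tabulates 30-day MACE rates by period. The TRS2°P itself is assumed to be computed upstream; the data frame and its values are illustrative.

```python
# Sketch: stratify by a pre-computed TRS2P score (<=1 low, 2 intermediate,
# >=3 high) and compare 30-day MACE rates between the early (2004-2010)
# and late (2011-2016) periods. Column names and values are placeholders.
import pandas as pd

def trs2p_group(score: int) -> str:
    if score <= 1:
        return "low"
    if score == 2:
        return "intermediate"
    return "high"

df = pd.DataFrame({
    "trs2p": [0, 2, 4, 1, 3, 5, 2, 0],
    "year":  [2005, 2006, 2012, 2014, 2009, 2015, 2011, 2016],
    "mace_30d": [0, 0, 1, 0, 1, 1, 0, 0],
})
df["risk_group"] = df["trs2p"].apply(trs2p_group)
df["period"] = pd.cut(df["year"], bins=[2003, 2010, 2016], labels=["early", "late"])

# Mean of the 0/1 outcome = 30-day MACE rate per risk group and period
print(df.groupby(["risk_group", "period"], observed=True)["mace_30d"].mean())
```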


2019 ◽  
Vol 34 (12) ◽  
pp. 2185-2188 ◽  
Author(s):  
Ahmed S. Ghoneima ◽  
Karen Flashman ◽  
Victoria Dawe ◽  
Eleanor Baldwin ◽  
Valerio Celentano

Abstract Aim Bowel resection in Crohn's disease still has a high rate of complications due to risk factors including immune suppression, malnutrition and active inflammation or infection at the time of surgery. In this study, we use serum and inflammatory markers to predict the risk of complications in patients undergoing resection for complicated Crohn's disease. Methods All patients undergoing laparoscopic bowel resection for Crohn's disease from 5 November 2012 to 11 October 2017 were included in this retrospective observational study. Patients were assigned a score of 0, 1, 2 or 3 based on their pre-operative haemoglobin concentration (Hb), C-reactive protein (CRP) and albumin (Alb), where 1 point was given for an abnormal value of each as detailed in the definitions. They were then grouped into a low risk group comprising those scoring 0 or 1 and a high risk group comprising those scoring 2 or 3, and data were collected to compare outcomes and the incidence of septic complications. Results Seventy-nine patients were included. Eleven patients (13.9%) had two abnormal values and 2 (2.5%) had three abnormal values of CRP, Alb and Hb, and were categorized as high risk. High risk patients had a significantly higher rate of post-operative septic complications (30.7%) compared with low risk patients (10.6%), p value < 0.0001. Conclusion Pre-operative CRP, haemoglobin and albumin can serve as predictors of septic complications after surgery for Crohn's disease and can therefore be used to guide pre-operative optimisation and clinical decision-making.
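
A minimal sketch of the pre-operative score described above: one point each for abnormal haemoglobin, CRP and albumin, with 0-1 classed as low risk and 2-3 as high risk. The abnormality cutoffs shown are illustrative assumptions; the paper defines its own thresholds.

```python
# Sketch: pre-operative risk score giving 1 point each for abnormal
# haemoglobin, CRP and albumin, grouping 0-1 as low risk and 2-3 as high risk.
# The cutoffs below are assumptions for illustration only.
def preop_risk_score(hb_g_dl: float, crp_mg_l: float, alb_g_l: float) -> int:
    score = 0
    if hb_g_dl < 12.0:   # assumed anaemia cutoff
        score += 1
    if crp_mg_l > 5.0:   # assumed elevated-CRP cutoff
        score += 1
    if alb_g_l < 35.0:   # assumed hypoalbuminaemia cutoff
        score += 1
    return score

def risk_group(score: int) -> str:
    return "low" if score <= 1 else "high"

score = preop_risk_score(hb_g_dl=10.8, crp_mg_l=42.0, alb_g_l=31.0)
print(score, risk_group(score))   # 3 high
```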


2020 ◽  
Vol 41 (Supplement_1) ◽  
Author(s):  
W Sun ◽  
B P Y Yan

Abstract Background We have previously demonstrated that unselected screening for atrial fibrillation (AF) in patients ≥65 years old in an out-patient setting yielded 1-2% new AF each time screen-negative patients underwent repeated screening at 12- to 18-month intervals. Selection criteria to identify high-risk patients for repeated AF screening may be more efficient than repeat screening of all patients. Aims This study aimed to validate the CHA2DS2VASC score as a predictive model to select the target population for repeat AF screening. Methods 17,745 consecutive patients underwent 24,363 index AF screenings (26.9% of patients underwent repeated screening) using a handheld single-lead ECG (AliveCor) from Dec 2014 to Dec 2017 (NCT02409654). Adverse clinical outcomes to be predicted included (i) new AF detected by repeated screening; (ii) new AF clinically diagnosed during follow-up; and (iii) ischemic stroke/transient ischemic attack (TIA) during follow-up. Performance evaluation and validation of the CHA2DS2VASC score as a prediction model were based on 15,732 subjects, 35,643 person-years of follow-up and 765 outcomes. Internal validation was conducted by k-fold cross-validation (k = n = 15,732, i.e., leave-one-out cross-validation). Performance measures included the c-index for discriminatory ability and decision curve analysis for clinical utility. Risk groups were defined as CHA2DS2VASC ≤1, 2-3, or ≥4. Calibration was assessed by comparing predicted with observed event proportions. Results The CHA2DS2VASC score achieved acceptable discrimination with a c-index of 0.762 (95% CI: 0.746-0.777) for derivation and 0.703 for cross-validation. Decision curve analysis showed that using CHA2DS2VASC to select patients for rescreening was superior to rescreening all or no patients in terms of net benefit across all reasonable threshold probabilities (Figure 1, left). Predicted and observed probabilities of adverse clinical outcomes progressively increased with increasing CHA2DS2VASC score (Figure 1, right): 0.7% outcome events in the low-risk group (CHA2DS2VASC ≤1, predicted prob. ≤0.86%), 3.5% in the intermediate-risk group (CHA2DS2VASC 2-3, predicted prob. 2.62%-4.43%) and 11.3% in the high-risk group (CHA2DS2VASC ≥4, predicted prob. ≥8.50%). The odds ratios for outcome events were 4.88 (95% CI: 3.43-6.96) for the intermediate- versus low-risk group and 17.37 (95% CI: 12.36-24.42) for the high- versus low-risk group. Conclusion Repeat AF screening of a high-risk population may be more efficient than rescreening all screen-negative individuals. The CHA2DS2VASC score may be used as a selection tool to identify high-risk patients to undergo repeat AF screening.
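
For reference, the sketch below computes the CHA2DS2VASC score with its standard component weights and maps it to the study's rescreening risk groups (≤1 low, 2-3 intermediate, ≥4 high). The patient record fields are illustrative, and the scoring is the generic published definition rather than anything specific to this cohort.

```python
# Sketch: compute CHA2DS2-VASc with standard weights and map it to the
# study's rescreening risk groups (<=1 low, 2-3 intermediate, >=4 high).
from dataclasses import dataclass

@dataclass
class Patient:
    age: int
    female: bool
    heart_failure: bool
    hypertension: bool
    diabetes: bool
    prior_stroke_or_tia: bool
    vascular_disease: bool

def cha2ds2vasc(p: Patient) -> int:
    score = 0
    score += 1 if p.heart_failure else 0
    score += 1 if p.hypertension else 0
    score += 2 if p.age >= 75 else (1 if p.age >= 65 else 0)
    score += 1 if p.diabetes else 0
    score += 2 if p.prior_stroke_or_tia else 0
    score += 1 if p.vascular_disease else 0
    score += 1 if p.female else 0
    return score

def rescreening_group(score: int) -> str:
    if score <= 1:
        return "low"
    return "intermediate" if score <= 3 else "high"

p = Patient(age=72, female=True, heart_failure=False, hypertension=True,
            diabetes=True, prior_stroke_or_tia=False, vascular_disease=False)
s = cha2ds2vasc(p)
print(s, rescreening_group(s))   # 4 high
```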


Blood ◽  
2015 ◽  
Vol 126 (23) ◽  
pp. 5452-5452
Author(s):  
Fabio Augusto Ruiz Gómez ◽  
Valentin García Gutierrez ◽  
Elia Gomez ◽  
Pilar Herrera Puente ◽  
Isabel Page Herráez ◽  
...  

Abstract Background and aims: Serum galactomannan (GM) is used as a screening test for the early diagnosis of Aspergillus infection in patients at high risk of invasive fungal infection. Serial GM analysis has proven useful in high-risk patients, especially when neutropenic and not receiving anti-mold prophylaxis. There is a lack of information regarding the benefit of GM in patients undergoing allogeneic hematopoietic cell transplant (HCT). The aim of this work is to measure the real clinical benefit of serial periodic determination of GM in the post allo-HCT period. Material and methods: 139 consecutive patients who received an allo-HCT (59% related family donors and 41% unrelated donors) in our centre over five years (January 2010 to February 2015) were included in the study. Median age was 46 years. Baseline characteristics of these patients are shown in Table 1. Patients were monitored with GM weekly and received primary prophylaxis with fluconazole from admission until immunosuppression was tapered. In order to find a population that could benefit the most from GM antigen monitoring, we classified our population into patients at low risk of invasive fungal infection (IFI) versus high-risk patients (those with previous proven or probable IFI or those suffering from graft-versus-host disease (GVHD); high-risk patients received anti-mold prophylaxis, mainly with voriconazole or posaconazole). Patients initially considered low risk who later developed GVHD were censored from the low-risk group and considered high risk from the onset of GVHD, at which point anti-mold prophylaxis was started. GM positivity was determined according to standard criteria. When GM positivity was detected, radiological and clinical studies (chest/sinus CT scans, cultures, etc.) to rule out aspergillosis were performed as soon as possible. Every patient was followed prospectively until the last medical consultation or death. Results: Global overall survival (OS) for the entire cohort was 55.39% and the cumulative incidence of severe GVHD grade III/IV was 49.5%. During follow-up, GM became positive in 31/139 (22%) cases. With this approach, the global false positive and false negative rates were 31% and 6%, respectively. 110/139 (79.14%) patients were identified as low risk. We observed GM positivization in 1.81% (2/110) and 37.18% (29/78) of the low- and high-risk groups, respectively. Both positive GM results in the low-risk group were false positives. Regarding the high-risk group, 34.48% (10/29) were false positives, while in the remaining 19 patients (65.52%) subsequent radiological and clinical findings allowed us to diagnose Aspergillus infection (despite anti-mold prophylaxis). Conclusions: In our experience there is not enough evidence to support serial GM monitoring in patients at low risk of IFI in the post allo-HCT period. However, it may be a useful tool in high-risk patients.
Table 1. Baseline characteristics of the patients, N (%)
- Age (years): mean 46.18, median 48.01
- Sex (male vs female): 91 vs 48 (65% vs 35%)
- Hematologic disease: acute myeloid leukemia 79 (56.8%); acute lymphoblastic leukemia 18 (12.9%); multiple myeloma 9 (6.5%); chronic myeloid leukemia 3 (2.2%); non-Hodgkin lymphoma 13 (9.4%); Hodgkin lymphoma 9 (6.5%); other diseases 8 (5.6%)
- Pre allo-HSCT IFI: 17 (12.2%)
- Conditioning: myeloablative 95 (68.3%); reduced intensity 44 (31.7%)
- Source of progenitors: peripheral blood 132 (95.0%); bone marrow 7 (5.0%)
- Type of donor: related 82 (59.0%); unrelated 57 (41.0%)
- Engraftment time (days): neutrophils (>500 in two consecutive determinations) 15.52; platelets (>30,000 in two consecutive determinations) 17.75
- Post-transplant CMV viremia (>600 copies in the post-HSCT period): 48 (34.5%)

Disclosures: No relevant conflicts of interest to declare.


2012 ◽  
Vol 30 (15_suppl) ◽  
pp. 1577-1577
Author(s):  
Deesha Sarma ◽  
So Yeon Kim ◽  
David H. Henry

1577 Background: Venous thromboembolism (VTE) poses a significant health risk to cancer patients and is one of the leading causes of death among this population. The most effective way to prevent VTE and reduce its prominence as a public health burden is by identifying high-risk patients and administering prophylactic measures. In 2008, Khorana et al. developed a model that classified patients by risk based on clinical factors. Methods: We conducted a retrospective study to test this model’s efficacy on 150 patients with cancer receiving chemotherapy at an outpatient oncology clinic between January 1 and August 1, 2011. We aggregated data and assigned points based on the five factors in the Khorana model: site of cancer, with 2 points for a very high-risk site and 1 point for a high-risk site, and 1 point each for leukocyte counts greater than 11 × 10⁹/L, platelet counts greater than 350 × 10⁹/L, hemoglobin levels less than 100 g/L and/or the use of erythropoiesis-stimulating agents, and BMI greater than 35 kg/m² (Khorana et al., Blood 2008). Based on this scoring system, patients with 0 points were grouped into the low-risk category, those with 1-2 points were considered intermediate-risk, and those with 3-4 points were classified as high-risk. Results: As shown in the table, VTE incidence was 1.9% in the low-risk group, 3.9% in the intermediate-risk group, and 9.1% in the high-risk group. Conclusions: High-risk patients were about 4.5 times more likely to develop a VTE than low-risk patients. These results provide valuable insight for determining which patients might benefit from prophylaxis and for motivating the design of prospective clinical trials that assess the VTE predictive model in various ambulatory cancer settings. [Table: see text]
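
A short sketch of the Khorana scoring described above, using the study's grouping (0 low, 1-2 intermediate, 3 or more high). The point assignments are taken from the abstract; the lists of very-high-risk and high-risk cancer sites are drawn from the published Khorana model and are not spelled out in this abstract.

```python
# Sketch: Khorana VTE risk score as described in the abstract, with the
# study's grouping (0 = low, 1-2 = intermediate, >=3 = high). Site lists
# follow the published model (an assumption, not stated in the abstract).
VERY_HIGH_RISK_SITES = {"stomach", "pancreas"}
HIGH_RISK_SITES = {"lung", "lymphoma", "gynecologic", "bladder", "testicular"}

def khorana_score(site: str, leukocytes_e9_per_l: float, platelets_e9_per_l: float,
                  hemoglobin_g_l: float, uses_esa: bool, bmi_kg_m2: float) -> int:
    score = 0
    if site.lower() in VERY_HIGH_RISK_SITES:
        score += 2
    elif site.lower() in HIGH_RISK_SITES:
        score += 1
    if leukocytes_e9_per_l > 11:
        score += 1
    if platelets_e9_per_l > 350:
        score += 1
    if hemoglobin_g_l < 100 or uses_esa:
        score += 1
    if bmi_kg_m2 > 35:
        score += 1
    return score

def risk_category(score: int) -> str:
    if score == 0:
        return "low"
    return "intermediate" if score <= 2 else "high"

s = khorana_score("pancreas", leukocytes_e9_per_l=12.4, platelets_e9_per_l=410,
                  hemoglobin_g_l=95, uses_esa=False, bmi_kg_m2=29)
print(s, risk_category(s))   # 5 high
```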


2019 ◽  
Vol 37 (15_suppl) ◽  
pp. e12046-e12046 ◽  
Author(s):  
Yao Yuan ◽  
Alison Len Van Dyke ◽  
Allison W. Kurian ◽  
Serban Negoita ◽  
Valentina I. Petkov

e12046 Background: OncotypeDX DCIS is a 12-gene assay designed to predict the 10-year risk of local recurrence and to guide treatment decisions, specifically the benefit of radiation therapy in breast ductal carcinoma in situ (DCIS). The test became available in December 2011 and is not currently recommended by guidelines. The Surveillance, Epidemiology and End Results (SEER) program captures cancer data at the population level and has been conducting annual linkages with Genomic Health Clinical Laboratory, the only lab performing the test, to identify patients receiving the test. Methods: SEER cases diagnosed with in situ breast cancer (DCIS or papillary in situ) between 2011-2015 were included in the analysis. SEER data on patient demographics, tumor characteristics, and treatments were combined with linkage variables for OncotypeDX DCIS tests reported by Genomic Health. Logistic regression was used to identify which patient-related factors were associated with having received the test and to evaluate the relationship between test-generated risk categories and treatments. Results: Of the 68,826 in situ breast cancer cases, 2,155 were linked to DCIS test data. Test utilization increased from < 1% to 5.3% for patients diagnosed in 2011 vs. 2015. Patients were less likely to receive the test if they had larger and higher-grade tumors, were divorced, had Medicaid insurance, and were in the lowest socioeconomic status tertile. The majority of patients (68%) were in the low risk group, 17% in the intermediate, and 15% in the high risk group. Patients at intermediate or high risk were more likely to receive radiation (OR = 2.4, 95% CI: 1.8-3.2 and OR = 3.0, 95% CI: 2.3-4.1, respectively) than the low risk group. High risk patients were more likely than low risk patients to receive chemotherapy (OR = 4.3, 95% CI: 1.2-14.4) and to undergo mastectomy rather than lumpectomy (OR = 1.47, 95% CI: 1.12-1.93). Conclusions: Clinical adoption of the OncotypeDX DCIS test has been slow. The association between multiple demographic factors and receiving the test indicated disparities in the US population. Clinical factors also influenced whether patients received the test. OncotypeDX DCIS results appeared to guide clinical decisions.
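
As an illustration of the logistic regression analysis described above, the sketch below regresses receipt of radiation on the test-assigned risk category (low as reference) with statsmodels and reports odds ratios as exponentiated coefficients. The data are simulated and the column names are placeholders.

```python
# Sketch: logistic regression of radiation receipt on risk category
# (low as reference), reporting odds ratios as exp(coef). Simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
risk = rng.choice(["low", "intermediate", "high"], size=n, p=[0.68, 0.17, 0.15])
base_prob = {"low": 0.3, "intermediate": 0.5, "high": 0.6}
radiation = rng.binomial(1, [base_prob[r] for r in risk])

df = pd.DataFrame({
    "risk": pd.Categorical(risk, categories=["low", "intermediate", "high"]),
    "radiation": radiation,
})
model = smf.logit("radiation ~ C(risk, Treatment(reference='low'))", data=df).fit()
print(model.summary())
print("Odds ratios:\n", np.exp(model.params))
```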

