Impact of socioeconomic status on presentation, treatment and outcomes of patients with pancreatic cancer

2020 ◽  
Vol 9 (17) ◽  
pp. 1233-1241
Author(s):  
Omar Abdel-Rahman

Objective: To assess the impact of socioeconomic status (SES) on the patterns of care and outcomes of patients with pancreatic cancer. Materials & methods: The Surveillance, Epidemiology, and End Results (SEER) specialized SES registry was accessed, and patients diagnosed with pancreatic cancer (2000–2015) were evaluated. The following SES variables were included: employment percentage, percentage of people above the poverty line, percentage of people identified as working class, educational level, median rent, median household value and median household income. Within this SES registry, patients were classified according to their census-tract SES into three groups (where group 1 represents the lowest SES category and group 3 the highest). Multivariable logistic regression analysis was used to assess the impact of SES on access to surgical resection, and multivariable Cox regression analysis was used to assess the impact of SES on pancreatic cancer-specific survival. Kaplan–Meier survival estimates were also used to compare overall survival (OS) outcomes according to SES. Results: A total of 83,902 pancreatic cancer patients were included in the current analysis. In multivariable logistic regression analysis among patients with localized/regional disease, patients with lower SES were less likely to undergo surgical resection for pancreatic cancer (odds ratio: 0.719; 95% CI: 0.673–0.767; p < 0.001). Among patients with localized/regional disease who underwent surgical resection, patients with higher SES had better OS (median OS for group 3: 20.0 vs 17.0 months for group 1; p < 0.001). Moreover, patients with lower SES had worse pancreatic cancer-specific survival than patients with higher SES (hazard ratio for group 1 vs group 3: 1.212; 95% CI: 1.135–1.295; p < 0.001). Conclusion: Poor neighborhood SES is associated with more advanced disease at presentation, a lower probability of surgical resection and poorer outcomes even after surgical resection.
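A minimal sketch of the kind of multivariable logistic regression described above, assuming a hypothetical extract of the SEER SES registry; all column names (ses_group, resected, age, stage, sex) are placeholders, since the abstract does not list the authors' actual covariates:

```python
# Hedged sketch: odds of surgical resection by census-tract SES group,
# with group 3 (highest SES) as the reference category.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("seer_pancreas.csv")  # hypothetical registry extract

model = smf.logit(
    "resected ~ C(ses_group, Treatment(reference=3)) + age + C(stage) + C(sex)",
    data=df,
).fit()

# Exponentiating coefficients and CI bounds yields odds ratios with 95% CIs,
# the form in which the abstract reports its estimate (OR 0.719, 0.673-0.767).
or_table = np.exp(pd.concat([model.params, model.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table)
```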

2021 ◽  
Vol 80 (Suppl 1) ◽  
pp. 329.1-329
Author(s):  
Y. Miyazaki ◽  
S. Nakayamada ◽  
K. Nakano ◽  
S. Kubo ◽  
Y. Inoue ◽  
...  

Background: Tofacitinib (TOFA) and baricitinib (BARI) are widely used in many regions for the treatment of rheumatoid arthritis (RA), but how to select a JAK inhibitor for an individual patient remains a major concern. Objectives: The differences in efficacy between Janus kinase (JAK) inhibitors have not been clarified in patients with RA in clinical practice. Here, we compared the efficacy of TOFA and BARI in clinical practice. Methods: A retrospective observational study. The efficacy of TOFA (n=156) in patients with RA was compared with that of BARI (n=138). Selection bias was minimized using propensity score-based inverse probability of treatment weighting (IPTW). We analyzed the trajectories of changes in disease activity in patients receiving TOFA or BARI using growth mixture modeling (GMM). Multivariable logistic regression analysis was performed to identify factors associated with membership in the treatment-resistant group defined by GMM. The observation period of the study was 24 weeks. Results: No significant difference was observed in patient characteristics between the TOFA and BARI groups after adjustment by propensity score-based IPTW. Retention rates over 24 weeks did not differ between the TOFA and BARI groups, nor did the incidence of adverse events. The clinical disease activity index (CDAI) at week 24 after the introduction of JAK inhibitors was 8.0 ± 8.9 and 6.2 ± 7.2 in the TOFA and BARI groups, respectively. The rates of CDAI remission at week 24 in the TOFA and BARI groups were 43/153 (28.3%) and 57/141 (40.4%), respectively. Compared with the TOFA group, the BARI group showed a significantly lower CDAI (ΔCDAI = -1.9, 95% confidence interval: -3.7 to -0.3, p=0.02) and a significantly higher rate of CDAI remission (odds ratio: 1.7, 95% CI: 1.1–2.7, p=0.04) at week 24. Similarly, at week 24, SDAI was significantly lower in the BARI group (TOFA vs BARI: 10.1 ± 9.9 vs 7.3 ± 7.5, ΔSDAI = -2.2, 95% CI: -4.2 to -0.2, p=0.04), and the rate of SDAI remission was significantly higher (OR: 1.6, 95% CI: 1.0–2.6, p=0.04). Based on the analysis of CDAI trajectories using GMM, the patients fell into groups: Group 1, with moderate to high disease activity (MDA to HDA) at baseline, and Groups 2 and 3, with higher disease activity at baseline than Group 1. In Groups 1 and 2, disease activity improved immediately after the introduction of JAK inhibitors. In Group 3, disease activity improved only partially, and low disease activity was not achieved at week 24 after the introduction of JAK inhibitors; the patients in Group 3 were therefore considered resistant to treatment (treatment-resistant group). When multivariable logistic regression analysis was performed for all patients receiving JAK inhibitors, the factors associated with membership in the treatment-resistant group were a high baseline HAQ-DI score (OR: 1.76, 95% CI: 1.09–2.84, p=0.02), a high number of biological disease-modifying anti-rheumatic drugs (bDMARDs) used before JAK inhibitors (OR: 1.51, 95% CI: 1.16–1.95, p=0.002) and TOFA use (OR: 2.13, 95% CI: 1.05–4.30, p=0.03). Next, multivariable logistic regression analysis was performed separately for each treatment group. In the TOFA group, patients who had received more bDMARDs before the JAK inhibitor were more likely to belong to the treatment-resistant group (OR: 1.76, 95% CI: 1.24–4.06).
Among patients with RA who received TOFA, those who had received ≥4 bDMARDs before the introduction of TOFA were more likely to be classified into the treatment-resistant group. In the BARI group, multivariable logistic regression analysis did not identify any factors associated with membership in the treatment-resistant group. Conclusion: TOFA may be only partially effective in patients resistant to many bDMARDs; consequently, efficacy may differ between TOFA and BARI. Because TOFA was less effective in RA patients resistant to ≥4 bDMARDs, the present study suggests that BARI may be more appropriate for RA patients resistant to many bDMARDs. Disclosure of Interests: Yusuke Miyazaki Speakers bureau: Eli Lilly, Shingo Nakayamada Speakers bureau: Bristol-Myers, UCB, Astellas, Abbvie, Eisai, Pfizer, Takeda, Kazuhisa Nakano Speakers bureau: Bristol-Myers, Sanofi, AbbVie, Eisai, Eli Lilly, Chugai, Pfizer, Takeda, and Mitsubishi-Tanabe, Satoshi Kubo Speakers bureau: Bristol-Myers, Yoshino Inoue: None declared, Yoshihisa Fujino: None declared, Yoshiya Tanaka Speakers bureau: Abbvie, Daiichi-Sankyo, Chugai, Takeda, Mitsubishi-Tanabe, Bristol-Myers, Astellas, Eisai, Janssen, Pfizer, Asahi-kasei, Eli Lilly, GlaxoSmithKline, UCB, Teijin, MSD, and Santen
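As a rough illustration of the propensity score-based IPTW step described in the Methods, here is a minimal sketch under assumed column names (tofa as the treatment indicator and a made-up covariate list); the authors' actual propensity model is not specified in the abstract:

```python
# Hedged sketch: stabilized inverse probability of treatment weighting (IPTW)
# to balance baseline covariates between TOFA and BARI recipients.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("ra_cohort.csv")  # hypothetical dataset
covariates = ["age", "disease_duration", "baseline_cdai", "n_prior_bdmards"]

# Propensity score: probability of receiving TOFA given baseline covariates.
ps = LogisticRegression(max_iter=1000).fit(df[covariates], df["tofa"]) \
        .predict_proba(df[covariates])[:, 1]

# Stabilized weights: marginal treatment probability over the propensity score.
p_tofa = df["tofa"].mean()
df["iptw"] = np.where(df["tofa"] == 1, p_tofa / ps, (1 - p_tofa) / (1 - ps))

# Weighted mean week-24 CDAI per arm (0 = BARI, 1 = TOFA).
for arm, sub in df.groupby("tofa"):
    print(arm, round(np.average(sub["cdai_w24"], weights=sub["iptw"]), 1))
```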


2021 ◽  
Vol 10 (3) ◽  
pp. 527
Author(s):  
Byuk Sung Ko ◽  
Sung-Hyuk Choi ◽  
Tae Gun Shin ◽  
Kyuseok Kim ◽  
You Hwan Jo ◽  
...  

This study aimed to address the impact of 1-h bundle achievement on outcomes in septic shock patients. A secondary analysis was conducted of multicenter, prospectively collected data on septic shock patients who had undergone protocolized resuscitation bundle therapy at emergency departments. In-hospital mortality according to 1-h bundle achievement was compared using multivariable logistic regression analysis. Patients were also divided into three groups according to the time of bundle achievement, and outcomes were compared to examine differences between the groups over time: group 1 (≤1 h, reference), group 2 (1–3 h) and group 3 (3–6 h). In total, 1612 patients with septic shock were included. The 1-h bundle was achieved in 461 (28.6%) patients. On multivariable logistic regression analysis, the group that achieved the 1-h bundle did not show a significant difference in in-hospital mortality compared with the group that did not (<1 vs >1 h; odds ratio = 0.74, p = 0.091). However, 3- and 6-h bundle achievement showed significantly lower odds ratios for in-hospital mortality compared with non-achievement (<3 vs >3 h and <6 vs >6 h; odds ratio = 0.604 and 0.458, respectively). There was no significant difference in in-hospital mortality over time for groups 2 and 3 compared with group 1. One-hour bundle achievement was not associated with improved outcomes in septic shock patients. These data suggest that further investigation into the clinical implications of 1-h bundle achievement in patients with septic shock is warranted.
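A minimal sketch of the three-group comparison described above, assuming hypothetical variable names (bundle_hours, died_in_hospital) and plausible severity adjusters; the study's actual adjustment set is not listed in the abstract:

```python
# Hedged sketch: in-hospital mortality by time-to-bundle group,
# with group 1 (<=1 h) as the reference level.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("septic_shock.csv")  # hypothetical dataset
df["bundle_group"] = pd.cut(
    df["bundle_hours"], bins=[0, 1, 3, 6], labels=["le1h", "1to3h", "3to6h"]
)  # patients achieving the bundle after 6 h fall outside the bins and drop out

# Multivariable logistic regression; lactate/SOFA/age are assumed confounders.
model = smf.logit(
    "died_in_hospital ~ C(bundle_group) + lactate + sofa + age", data=df
).fit()
print(model.summary())
```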


2018 ◽  
Vol 36 (6_suppl) ◽  
pp. 177-177
Author(s):  
Hanan Goldberg ◽  
Ally Hoffman ◽  
Teck Sing Woon ◽  
Zachary William Abraham Klaassen ◽  
Thenappan Chandrasekar ◽  
...  

Background: PSA produced by prostate cancer (PC) cells escapes proteolytic processing, resulting in more complexed PSA and a lower percentage of free PSA (%fPSA). A higher %fPSA correlates with lower PC risk. However, the role of fPSA in biochemical recurrence (BCR) after radical prostatectomy (RP) is unknown. Methods: All patients who had BCR after RP and at least one fPSA test were included. Patients were stratified according to a %fPSA cut-off of 0.15. Multivariable logistic regression analysis was performed to identify covariates associated with a higher %fPSA. Results: A total of 81 men with BCR were identified (Table 1). Interestingly, 20% of group 1 vs 60% of group 2 became castration-resistant (CRPC), p<0.0001, and the time to reach the CRPC state was much shorter in group 2 (33.5 months) than in group 1 (57.9 months), p=0.05. Additionally, 60% of group 2 patients vs 32.5% of group 1 patients developed metastasis, p=0.014. Lastly, median survival was 193 months for group 2, whereas median survival was not reached for group 1 (log-rank test p=0.023). Multivariable logistic regression analysis demonstrated that a secondary Gleason score of 5 (compared with 3) and %fPSA>0.15 predicted CRPC status (OR 11.63, 95% CI 1.38-97.4, p=0.024 and OR 7.99, 95% CI 2-31.95, p=0.003, respectively). Conclusions: %fPSA>0.15 in the setting of BCR confers more aggressive disease, manifesting in faster development of CRPC, metastasis and death. Our findings suggest a reversal in the significance of %fPSA values in BCR patients and should be validated in larger cohorts. [Table: see text]
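For the survival comparison reported here (median 193 months in group 2, median not reached in group 1, log-rank p=0.023), a minimal Kaplan-Meier sketch using lifelines, with hypothetical column names:

```python
# Hedged sketch: overall survival stratified at the %fPSA cutoff of 0.15.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("bcr_cohort.csv")  # hypothetical dataset
g1 = df[df["fpsa_pct"] <= 0.15]     # assumed group 1
g2 = df[df["fpsa_pct"] > 0.15]      # assumed group 2

kmf = KaplanMeierFitter()
for label, sub in [("%fPSA<=0.15", g1), ("%fPSA>0.15", g2)]:
    kmf.fit(sub["months"], sub["death"], label=label)
    print(label, "median OS:", kmf.median_survival_time_)  # inf if not reached

res = logrank_test(g1["months"], g2["months"], g1["death"], g2["death"])
print("log-rank p =", res.p_value)
```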


2020 ◽  
Vol 22 (Supplement_3) ◽  
pp. iii440-iii440
Author(s):  
Harriet Dulson ◽  
Rachel McAndrew ◽  
Mark Brougham

Abstract INTRODUCTION Children treated for CNS tumours experience a very high burden of adverse effects. Platinum-based chemotherapy and cranial radiotherapy can cause ototoxicity, which may be particularly problematic in patients who have impaired vision and cognition as a result of their tumour and associated treatment. This study assessed the prevalence of impaired hearing and vision and how these may impact upon education. METHODS 53 patients diagnosed with solid tumours in Edinburgh, UK between August 2013 and 2018 were included in the study. Patients were split into three groups according to treatment received: Group 1, cisplatin-based chemotherapy and cranial radiotherapy; Group 2, platinum-based chemotherapy without cranial radiotherapy; Group 3, benign brain tumours treated with surgery only. Data were collected retrospectively from patient notes. RESULTS Overall, 69.5% of those treated with platinum-based chemotherapy experienced ototoxicity as assessed by Brock grading, and 5.9% of patients had reduced visual acuity; patients in Group 1 had the highest prevalence of both. 44.4% of patients in Group 1 needed increased educational support following treatment, either extra support in the classroom or being unable to continue in mainstream school, compared with 12.5% of Group 2 patients and 31.3% of Group 3 patients. CONCLUSIONS Children with CNS tumours frequently require support for future education, but those treated with both platinum-based chemotherapy and cranial radiotherapy are at particular risk, which may be compounded by co-existent ototoxicity and visual impairment. It is essential to provide appropriate support for this patient cohort in order to maximise their educational potential.


Healthcare ◽  
2021 ◽  
Vol 9 (6) ◽  
pp. 722
Author(s):  
Yusuke Ito ◽  
Hidetaka Wakabayashi ◽  
Shinta Nishioka ◽  
Shin Nomura ◽  
Ryo Momosaki

The objective of this study was to determine the impact of rehabilitation dose on nutritional status at discharge from a convalescent rehabilitation ward in malnourished patients with hip fracture. This retrospective case-control study involved malnourished patients with hip fracture aged 65 years or older who had been admitted to a convalescent rehabilitation ward and whose data were registered in the Japan Rehabilitation Nutrition Database. The primary outcome was nutritional status at discharge. Patients were classified according to whether nutritional status had improved or not at discharge, based on the Mini Nutritional Assessment-Short Form® (MNA-SF) score. The association between improved nutritional status and rehabilitation dose was assessed by logistic regression analysis. Data were available for 145 patients (27 men, 118 women; mean age 85.1 ± 7.9 years). Median daily rehabilitation dose was 109.5 (IQR 94.6–116.2) min and median MNA-SF score at admission was 5 (IQR 4–6). Nutritional status improved in 97 patients and did not improve in 48. Logistic regression analysis showed the following factors to be independently associated with nutritional status at discharge: Functional Independence Measure score (OR 1.042, 95% CI 1.016–1.068), energy intake (OR 1.002, 95% CI 1.000–1.004), daily rehabilitation dose (OR 1.023, 95% CI 1.002–1.045), and length of hospital stay (OR 1.026, 95% CI 1.003–1.049). The daily rehabilitation dose in malnourished patients with hip fracture may positively impact nutritional status at discharge.
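Because the reported odds ratios are per unit of each covariate, the daily rehabilitation dose estimate (OR 1.023 per minute) compounds multiplicatively over larger increments; a quick worked check:

```python
# Hedged arithmetic sketch: converting a per-minute OR into the OR for a
# clinically meaningful increment (+30 min/day of rehabilitation).
import math

or_per_min = 1.023                   # reported OR per additional minute/day
beta = math.log(or_per_min)          # log-odds per minute
or_per_30min = math.exp(30 * beta)   # = 1.023**30, about 1.98
print(f"OR per +30 min/day: {or_per_30min:.2f}")
```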


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Yu Liu ◽  
Jing Li ◽  
Wanyu Zhang ◽  
Yihong Guo

Abstract: Oestradiol (E2), an important hormone in follicular development and endometrial receptivity, is closely related to clinical outcomes of fresh in vitro fertilization-embryo transfer (IVF-ET) cycles. A supraphysiologic E2 level is inevitable during controlled ovarian hyperstimulation (COH), and its effect on the outcome of IVF-ET is controversial. The aim of this retrospective study was to evaluate the association between elevated serum oestradiol levels on the day of human chorionic gonadotrophin (hCG) administration and neonatal birthweight after IVF-ET cycles. The data of 3659 infertile patients with fresh IVF-ET cycles between August 2009 and February 2017 at the First Hospital of Zhengzhou University were analysed retrospectively. Patients were categorized by serum E2 level on the day of hCG administration into six groups: group 1 (≤ 1000 pg/mL, n = 230), group 2 (1001–2000 pg/mL, n = 524), group 3 (2001–3000 pg/mL, n = 783), group 4 (3001–4000 pg/mL, n = 721), group 5 (4001–5000 pg/mL, n = 548), and group 6 (> 5000 pg/mL, n = 852). Univariate linear regression was used to evaluate the independent correlation between each factor and the outcome index, and multiple logistic regression was used to adjust for confounding factors. The low-birthweight (LBW) rates were 3.0% (group 1), 2.9% (group 2), 1.9% (group 3), 2.9% (group 4), 2.9% (group 5), and 2.0% (group 6) (P = 0.629), with no statistically significant differences among the six groups. We did not detect an association between peak serum E2 level during ovarian stimulation and neonatal birthweight after IVF-ET, nor between higher E2 levels and increased LBW risk. Our observations suggest that the hyper-oestrogenic milieu during controlled ovarian stimulation does not seem to have adverse effects on the birthweight of offspring after IVF. Although this study provides some reference, obstetric-related factors were not included for historical reasons; the impact of a high-oestrogen environment during ovarian stimulation on the birthweight of IVF offspring still needs future research.
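A minimal sketch of the across-group LBW comparison (the P = 0.629 result), assuming hypothetical column names for E2 on the hCG trigger day and the LBW indicator:

```python
# Hedged sketch: chi-square test of low-birthweight rates across six E2 bands.
import pandas as pd
from scipy.stats import chi2_contingency

df = pd.read_csv("ivf_cycles.csv")  # hypothetical dataset
bins = [0, 1000, 2000, 3000, 4000, 5000, float("inf")]
df["e2_group"] = pd.cut(df["e2_hcg_day"], bins=bins, labels=range(1, 7))

table = pd.crosstab(df["e2_group"], df["lbw"])  # counts per group x outcome
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.3f}")
```

Note that the abstract also reports multiple logistic regression to adjust for confounders; the unadjusted test above only mirrors the group-wise rate comparison.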


2020 ◽  
Vol 79 (Suppl 1) ◽  
pp. 340.2-341
Author(s):  
V. Orefice ◽  
F. Ceccarelli ◽  
C. Barbati ◽  
R. Lucchetti ◽  
G. Olivieri ◽  
...  

Background: Systemic lupus erythematosus (SLE) is an autoimmune disease mainly affecting women of childbearing age. The interplay between genetic and environmental factors may contribute to disease pathogenesis1. To date, no robust data are available on the possible contribution of diet to SLE. Caffeine, one of the most widely consumed products in the world, seems to interact with multiple components of the immune system by acting as a non-specific phosphodiesterase inhibitor2. In vitro, dose-dependent treatment with caffeine seems to down-regulate mRNA levels of key inflammation-related genes and similarly reduce levels of different pro-inflammatory cytokines3. Objectives: We evaluated the impact of caffeine consumption on SLE-related disease phenotype and activity, in terms of clinimetric assessment and cytokine levels. Methods: We performed a cross-sectional study, enrolling consecutive patients and recording their clinical and laboratory data. Disease activity was assessed by the SLE Disease Activity Index 2000 (SLEDAI-2K)4. Caffeine intake was evaluated by a 7-day food frequency questionnaire covering all the main sources of caffeine. As previously reported, patients were divided into four groups according to daily caffeine intake: <29.1 mg/day (group 1), 29.2–153.7 mg/day (group 2), 153.8–376.5 mg/day (group 3) and >376.6 mg/day (group 4)5. After completion of the questionnaire, blood samples were collected from each patient to assess cytokine levels, measured using a Bio-Plex assay panel for IL-6, IL-10, IL-17, IL-27, IFN-γ, IFN-α and BLyS. Results: We enrolled 89 SLE patients (F/M 87/2, median age 46 years, IQR 14; median disease duration 144 months, IQR 150). The median caffeine intake was 195 mg/day (IQR 160.5). At enrollment, 8 patients (8.9%) reported a caffeine intake <29.1 mg/day (group 1), 27 patients (30.3%) between 29.2 and 153.7 mg/day (group 2), 45 patients (51%) between 153.8 and 376.5 mg/day (group 3) and 9 patients (10.1%) >376.6 mg/day (group 4). A negative correlation between caffeine intake and disease activity, evaluated with SLEDAI-2K, was observed (p=0.01, r=-0.26). Comparing the four groups, a significantly higher prevalence of lupus nephritis, neuropsychiatric involvement, haematological manifestations, hypocomplementemia and anti-dsDNA positivity was observed in patients with lower caffeine intake (figure 1 A-E). Furthermore, patients with lower caffeine intake used glucocorticoids significantly more frequently [group 4: 22.2%, versus group 1 (50.0%, p=0.0001), group 2 (55.5%, p=0.0001) and group 3 (40.0%, p=0.009)]. Turning to the cytokine analysis, a negative correlation between daily caffeine consumption and serum IFN-γ level was found (p=0.03, r=-0.2) (figure 2A); furthermore, patients with higher caffeine intake showed significantly lower levels of IFN-α (p=0.02, figure 2B), IL-17 (p=0.01, figure 2C) and IL-6 (p=0.003, figure 2D). Conclusion: This is the first report demonstrating the impact of caffeine on SLE disease activity status, as shown by the inverse correlation between caffeine intake and both SLEDAI-2K values and cytokine levels. Moreover, in our cohort, patients with lower caffeine consumption seemed to have a more severe disease phenotype, especially in terms of renal and neuropsychiatric involvement.
Our results seem to suggest a possible immunoregulatory, dose-dependent effect of caffeine through the modulation of serum cytokine levels, as already suggested by in vitro analysis. References: [1] Kaul et al. Nat Rev Dis Primers 2016; [2] Aronsen et al. Eur J Pharmacol 2014; [3] Iris et al. Clin Immunol 2018; [4] Gladman et al. J Rheumatol 2002; [5] Mikuls et al. Arthritis Rheum 2002. Disclosure of Interests: Valeria Orefice: None declared, Fulvia Ceccarelli: None declared, Cristiana Barbati: None declared, Ramona Lucchetti: None declared, Giulio Olivieri: None declared, Enrica Cipriano: None declared, Francesco Natalucci: None declared, Carlo Perricone: None declared, Francesca Romana Spinelli Grant/research support from: Pfizer, Consultant of: Novartis, Gilead, Lilly, Sanofi, Celgene, Speakers bureau: Lilly, Cristiano Alessandri Grant/research support from: Pfizer, Guido Valesini: None declared, Fabrizio Conti Speakers bureau: BMS, Lilly, Abbvie, Pfizer, Sanofi
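The abstract does not name the correlation statistic behind r = -0.26; a rank-based coefficient is one defensible choice for skewed intake data, so the following sketch uses Spearman's rho purely as an assumption, with hypothetical column names:

```python
# Hedged sketch: correlation between daily caffeine intake and SLEDAI-2K.
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("sle_caffeine.csv")  # hypothetical dataset
rho, p = spearmanr(df["caffeine_mg_day"], df["sledai_2k"])
print(f"rho={rho:.2f}, p={p:.3f}")
```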


2021 ◽  
pp. 019459982199338
Author(s):  
Flora Yan ◽  
Dylan A. Levy ◽  
Chun-Che Wen ◽  
Cathy L. Melvin ◽  
Marvella E. Ford ◽  
...  

Objective To assess the impact of rural-urban residence on children with obstructive sleep-disordered breathing (SDB) who were candidates for tonsillectomy with or without adenoidectomy (TA). Study Design Retrospective cohort study. Setting Tertiary children’s hospital. Methods A cohort of otherwise healthy children aged 2 to 18 years with a diagnosis of obstructive SDB between April 2016 and December 2018 for whom TA was recommended was included. Rural-urban designation was defined by ZIP code approximation of rural-urban commuting area codes. The main outcomes were the association of rurality with time to TA and with loss to follow-up, assessed using Cox and logistic regression analyses. Results In total, 213 patients were included (mean age 6 ± 2.9 years, 117 [55%] male, 69 [32%] rural dwelling). Rural-dwelling children were more often insured by Medicaid than private insurance (P < .001) and had a median driving distance of 74.8 vs 16.8 miles (P < .001) compared with urban-dwelling patients. The majority (94.9%) eventually underwent the recommended TA once evaluated by an otolaryngologist. Multivariable logistic regression analysis did not reveal any significant predictors of loss to follow-up in receiving TA. Cox regression analysis adjusted for age, sex, insurance, and race showed that rural-dwelling patients had a 30% reduction in receipt of TA over time compared with urban-dwelling patients (hazard ratio, 0.7; 95% CI, 0.50-0.99). Conclusion Rural-dwelling patients experienced longer wait times and driving distances to TA. This study suggests that rurality should be considered a potential barrier to surgical intervention and highlights the need to further investigate geographic access as an important determinant of care in pediatric SDB.
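A minimal sketch of the adjusted Cox model reported above (HR 0.7 for rural residence), assuming lifelines and hypothetical, numerically coded columns:

```python
# Hedged sketch: time to tonsillectomy (TA) with rural residence as exposure,
# adjusted for age, sex, insurance and race as stated in the abstract.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("sdb_cohort.csv")  # hypothetical dataset, numeric covariates

cph = CoxPHFitter()
cph.fit(
    df[["days_to_ta", "received_ta", "rural", "age", "male", "medicaid", "race_code"]],
    duration_col="days_to_ta",
    event_col="received_ta",
)
cph.print_summary()  # an HR near 0.7 for 'rural' would match the abstract
```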


2021 ◽  
Vol 22 (1) ◽  
Author(s):  
Chul Park ◽  
Ryoung-Eun Ko ◽  
Jinhee Jung ◽  
Soo Jin Na ◽  
Kyeongman Jeon

Abstract Background Limited data are available on practical predictors of successful de-cannulation among patients who undergo tracheostomy. We evaluated factors associated with failed de-cannulation to develop a prediction model that could easily be used at the time of weaning from mechanical ventilation (MV). Methods In a retrospective cohort of 346 tracheostomised patients managed by a standardized de-cannulation program, multivariable logistic regression analysis identified variables that were independently associated with failed de-cannulation. Based on the logistic regression analysis, a new predictive scoring system for successful de-cannulation, referred to as the DECAN score, was developed and then internally validated. Results The model included age > 67 years, body mass index < 22 kg/m2, underlying malignancy, non-respiratory causes of MV, presence of neurologic disease, vasopressor requirement, presence of post-tracheostomy pneumonia, and presence of delirium. The DECAN score was associated with good calibration (goodness-of-fit, 0.6477) and discrimination (area under the receiver operating characteristic curve 0.890, 95% CI 0.853–0.921). The optimal cut-off point of the DECAN score for the prediction of successful de-cannulation was ≤ 5 points, with a specificity of 84.6% (95% CI 77.7–90.0) and a sensitivity of 80.2% (95% CI 73.9–85.5). Conclusions The DECAN score for tracheostomised patients who are successfully weaned from prolonged MV can be computed at the time of weaning to assess the probability of successful de-cannulation based on readily available variables.
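The abstract gives the DECAN cutoff (≤5 points) and its operating characteristics but not the component weights, so the sketch below assumes a precomputed, hypothetical score column and shows how such a cutoff is typically located on the ROC curve:

```python
# Hedged sketch: ROC-based cutoff selection for a clinical score where a
# LOWER score indicates a higher probability of successful de-cannulation.
import pandas as pd
from sklearn.metrics import roc_auc_score, roc_curve

df = pd.read_csv("decan_cohort.csv")  # hypothetical dataset

# Negate the score so that higher values predict success, as sklearn expects.
auc = roc_auc_score(df["decannulated"], -df["decan"])
fpr, tpr, thresholds = roc_curve(df["decannulated"], -df["decan"])

# Youden's J (sensitivity + specificity - 1) is one common cutoff criterion.
j = (tpr - fpr).argmax()
print(f"AUC={auc:.3f}, cutoff: score <= {-thresholds[j]:.0f}")
```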


2021 ◽  
Author(s):  
Chenxi Yuan ◽  
Qingwei Wang ◽  
Xueting Dai ◽  
Yipeng Song ◽  
Jinming Yu

Abstract Background: Lung adenocarcinoma (LUAD) and skin cutaneous melanoma (SKCM) are common tumors worldwide, yet the prognosis of patients with advanced disease is poor. Because NLRP3 has not been extensively studied in cancers, we aimed to identify the impact of NLRP3 on LUAD and SKCM through bioinformatics analyses. Methods: The TCGA and TIMER databases were used in this study. We compared the expression of NLRP3 across cancers and evaluated its influence on the survival of LUAD and SKCM patients. Correlations between clinical characteristics and NLRP3 expression were analyzed using logistic regression, and clinicopathologic characteristics associated with overall survival were analyzed by Cox regression. In addition, we explored the correlation between NLRP3 and immune infiltrates. Gene set enrichment analysis (GSEA) and analysis of genes co-expressed with NLRP3 were also performed. Results: NLRP3 was differentially expressed between tumor and normal tissues. Cox regression analysis indicated that up-regulated NLRP3 was an independent prognostic factor for good prognosis in LUAD and SKCM. Logistic regression analysis showed that increased NLRP3 expression was significantly correlated with favorable clinicopathologic parameters such as absence of lymph node invasion and absence of distant metastasis. Specifically, a positive correlation between increased NLRP3 expression and the immune infiltration levels of various immune cells was observed. Conclusion: Taken together, these findings indicate that increased NLRP3 expression correlates with favorable prognosis and an increased proportion of immune cells in LUAD and SKCM, and that NLRP3 may serve as a potential biomarker for evaluating prognosis and immune infiltration level.
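The authors' exact pipeline is not described beyond the TCGA and TIMER sources, but a typical survival comparison by expression level looks like the following sketch, using a median split and hypothetical column names:

```python
# Hedged sketch: overall survival by NLRP3 expression (median split) in a
# TCGA-style merged clinical/expression table.
import pandas as pd
from lifelines.statistics import logrank_test

df = pd.read_csv("tcga_luad_clinical_expr.csv")  # hypothetical merged table
high = df[df["nlrp3_expr"] > df["nlrp3_expr"].median()]
low = df[df["nlrp3_expr"] <= df["nlrp3_expr"].median()]

res = logrank_test(high["os_months"], low["os_months"],
                   high["os_event"], low["os_event"])
print(f"log-rank p = {res.p_value:.3g}")
```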

