Prescribing patterns for FOLFIRINOX in the real world.

2018 ◽  
Vol 36 (4_suppl) ◽  
pp. 463-463
Author(s):  
Chad Michael Guenther ◽  
Nizar Bhulani ◽  
Adam Korenke ◽  
Jenny Jing Li ◽  
Leticia Khosama ◽  
...  

463 Background: FOLFIRINOX therapy is associated with improved outcomes in patients with gastrointestinal cancers. The regimen can be associated with significant toxicity, and empiric dose modifications are often used. We analyzed 1) real-world prescribing patterns of FOLFIRINOX and 2) toxicity of therapy. Methods: Patients undergoing FOLFIRINOX chemotherapy at an academic, NCI-Designated Comprehensive Cancer Center were identified and their electronic medical records reviewed. Patients who received at least one dose of FOLFIRINOX were included. Chemotherapy dose, growth factor use, and toxicity data were abstracted for the first 8 weeks. ‘Standard FOLFIRINOX’ was defined as the regimen utilized by Conroy et al (NEJM 2011). Any empiric reduction or withholding of drug dose for cycle 1 was classified as ‘modified FOLFIRINOX’. Bivariate analysis was performed on the data. Results: There were 111 patients seen between 5/2011 and 3/2017, and 94% had pancreatic cancer. Ages ranged from 29 to 87 years, and 52% of patients were female. 59% received ‘modified FOLFIRINOX’ and 20% received empiric growth factors. Line of therapy for standard vs modified, respectively, was 71.1% vs 45.5% for 1st, 17.8% vs 36.4% for 2nd, and 11.1% vs 18.2% beyond 2nd (p = 0.03). Patients on ‘modified FOLFIRINOX’ were more likely to have metastatic disease (p = 0.01), to be receiving second-line or later therapy, and to have a higher ECOG score (p = 0.03). Patients on ‘modified FOLFIRINOX’ had a trend toward fewer treatment-related ED visits or hospitalizations vs ‘standard FOLFIRINOX’ (27.2% vs 42.2%, p = 0.10) and fewer treatment delays (25.8% vs 42.2%, p = 0.07). Conclusions: In the real-world setting, a majority of patients on FOLFIRINOX receive empiric dose modifications. Although modified dosing did not translate into a significant difference in ED visits, hospitalizations, or treatment delays, there was a trend toward fewer events.

2021 ◽  
Vol 39 (15_suppl) ◽  
pp. e13056-e13056
Author(s):  
Michael Grimm ◽  
Bhuvaneswari Ramaswamy ◽  
Maryam B. Lustberg ◽  
Robert Wesolowski ◽  
Sagar D. Sardesai ◽  
...  

e13056 Background: Invasive lobular carcinoma (ILC) accounts for only 10-15% of all invasive breast cancers but has distinct clinicopathologic characteristics and genomic profiles. In particular, metastatic lobular cancers (mILC) have unique sites of metastasis, and it is unclear whether their response to initial endocrine therapy differs from that of metastatic ductal cancers (mIDC). We therefore undertook a single-institution, retrospective analysis to compare overall outcomes and response to initial endocrine therapy (ET) in patients (pts) with metastatic ILC and IDC. Methods: An IRB-approved retrospective review of medical records was conducted evaluating pts treated for metastatic IDC and ILC at The Ohio State University Comprehensive Cancer Center from January 1, 2004 to December 31, 2014. Pts diagnosed with mIDC were matched 2:1 to pts diagnosed with mILC on age, year of diagnosis, estrogen receptor/progesterone receptor and HER2 status, and site of metastasis. Overall survival (OS) was defined as the time from metastasis to death or last known follow-up. Progression-free survival (PFS) was defined as the time from metastasis to progression on first-line ET. Time to chemotherapy (TTC) was defined as the time from starting ET for metastasis to initiation of chemotherapy. Kaplan-Meier (KM) methods were used to estimate median OS, PFS, and TTC. Results: A total of 161 pts with metastatic breast cancer were included in this analysis. Demographic features were well balanced between the two groups and are included in the table below. Median OS was 2.6 yrs (95% CI: 1.55, 3.22) for mILC and 2.2 yrs (95% CI: 1.95, 2.62) for mIDC. A subset of 111 pts who started on endocrine therapy was used in the PFS and TTC analyses. Median PFS following first-line ET was 2.2 yrs (95% CI: 1.0, 2.7) for mILC and 1.4 yrs (95% CI: 0.91, 1.90) for mIDC. Median TTC was 2.1 yrs (95% CI: 1.71, 4.92) for mILC and 2.4 yrs (95% CI: 1.90, 4.77) for mIDC.
There was no statistically significant difference in outcomes between the two groups. Conclusions: Reported outcomes in patients with ILC and IDC have varied, with several studies reporting that patients with ILC have worse outcomes and response to chemotherapy. Our retrospective study comparing outcomes in mILC with mIDC showed no difference in OS. Given the concern for resistance to conventional therapies in patients with lobular cancers, it is reassuring that median PFS on first-line ET and TTC were similar to those of metastatic ductal cancers. [Table: see text]
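The median OS, PFS, and TTC figures above come from the standard Kaplan-Meier product-limit estimator. As an illustration only, here is a minimal pure-Python sketch of that estimator on toy follow-up data (not the study's dataset; a real analysis would use a dedicated statistics package such as lifelines):

```python
# Minimal Kaplan-Meier (product-limit) estimator on toy data.
# Each record: (time_in_years, event), event=1 death/progression, event=0 censored.

def kaplan_meier(records):
    """Return [(time, survival_probability)] at each event time."""
    records = sorted(records)
    n_at_risk = len(records)
    surv = 1.0
    curve = []
    i = 0
    while i < len(records):
        t = records[i][0]
        deaths = sum(1 for (ti, e) in records if ti == t and e == 1)
        removed = sum(1 for (ti, e) in records if ti == t)  # events + censored at t
        if deaths > 0:
            surv *= (n_at_risk - deaths) / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed
        i += removed
    return curve

def median_survival(curve):
    """First time at which the survival curve drops to <= 0.5 (None if not reached)."""
    for t, s in curve:
        if s <= 0.5:
            return t
    return None

# Hypothetical follow-up times (years) for a small cohort:
data = [(0.5, 1), (1.1, 1), (1.4, 0), (2.2, 1), (2.6, 1), (3.0, 0), (3.2, 1)]
curve = kaplan_meier(data)
```

Censored patients (event=0) leave the risk set without dropping the curve, which is exactly why the abstract can report a median OS even though not every patient had died by last follow-up.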


2018 ◽  
Vol 25 (7) ◽  
pp. 1645-1650 ◽  
Author(s):  
Stefanie K Clark ◽  
Lisa M Anselmo

Pemetrexed is a multitargeted antifolate indicated for locally advanced or metastatic non-squamous non-small-cell lung cancer and malignant pleural mesothelioma. Cutaneous reactions are associated with pemetrexed use. Pemetrexed prescribing information recommends oral dexamethasone 4 mg twice daily for three days, starting the day before pemetrexed infusion, to prevent cutaneous reactions. At the University of New Mexico Comprehensive Cancer Center, patients receive intravenous dexamethasone before pemetrexed infusion, but the oral dexamethasone recommendation is not always followed. The objective of this study was to determine whether outcomes differed between patients who received three days of oral dexamethasone starting the day before pemetrexed infusion and patients who did not, as measured by the incidence of cutaneous reactions, delays in therapy, and therapy changes due to adverse reactions. Eighty-five patients received at least one dose of pemetrexed between August 1, 2012 and August 31, 2017. Twenty-nine patients (34.1%) did not receive three days of oral dexamethasone 4 mg twice daily and 56 patients (65.9%) did. There was no statistically significant difference between the two groups in the incidence of cutaneous reactions (13.8% vs. 25.0%; p = 0.384), delays in pemetrexed therapy (44.8% vs. 32.1%; p = 0.2), or therapy changes due to adverse events (34.5% vs. 23.2%; p = 0.654). Results suggest that three days of oral dexamethasone 4 mg twice daily did not significantly affect the incidence of cutaneous reactions, delays in therapy, or therapy changes in patients who received intravenous dexamethasone before pemetrexed infusion at the University of New Mexico Comprehensive Cancer Center.


2019 ◽  
Author(s):  
Rolf Hut ◽  
Casper Albers ◽  
Sam Illingworth ◽  
Chris Skinner

Abstract. From the wilderness of Hyrule and the continent of Tamriel to Middle Earth, players of videogames are exposed to wondrous, fantastic, but ultimately fake landscapes. Given the time people may spend in these worlds, compared to the time they spend being trained in geoscience, we wondered whether expert geoscientists would differ from non-geoscientists in judging the realism of the landscapes in these games. Since games offer a great opportunity for tangential learning, it would be a missed opportunity if features that are obviously fake to geoscientists were perceived as plausible by non-geoscientists. To satisfy our curiosity and answer this question, we conducted a survey in which we asked people to judge both photos of real landscapes and screenshots from the recent videogame The Legend of Zelda: Breath of the Wild on how likely they thought the depicted features were to exist in the real world. Since game-world screenshots are easily identified by their rendered, pixelated nature, we pre-processed all pictures with an artistic Van Gogh filter that removed the rendered appearance but retained the dominant landscape features. We found a small but significant difference between geoscientists and non-geoscientists, with geoscientists being slightly better at judging which pictures came from the real world versus the game world. While significant, the effect is small enough to conclude that fantastical worlds in games can be used for tangential learning on geoscientific subjects.


2019 ◽  
Vol 37 (15_suppl) ◽  
pp. 9532-9532
Author(s):  
Richard Wayne Joseph ◽  
Alicia C. Shillington ◽  
Todd Lee ◽  
Cynthia Macahilig ◽  
Scott J. Diede ◽  
...  

9532 Background: Both pembrolizumab (PEMBRO) and combination ipilimumab + nivolumab (IPI+NIVO) are FDA-approved immunotherapies for advanced melanoma (AM). These two treatment regimens have different toxicity profiles, which may impact health care resource utilization (HCRU). Our aim was to compare the real-world risk of hospitalization and emergency department (ED) visits within 12 months of starting each regimen. Methods: A retrospective cohort study was conducted in patients ≥18 years old with AM initiating PEMBRO or IPI+NIVO between Jan 1, 2016 and Dec 30, 2017. Patients were identified from 12 US academic medical centers and affiliated satellite clinics. Data were abstracted through chart review. All-cause hospitalizations or ED visits and the rates per patient per month (PPPM) through 12 months of follow-up were calculated. Utilization was compared between PEMBRO and IPI+NIVO using multivariate logistic regression analysis. Results: 400 patients were included, 200 each on PEMBRO and IPI+NIVO, with mean (SD) follow-up times of 10 (3) and 10 (4) months, respectively. The PEMBRO cohort had poorer Eastern Cooperative Oncology Group (ECOG) performance status at treatment start, 71% ECOG 0 or 1 vs 88% (p < .001); more diabetes, 21% vs 13% (p = .045); a trend toward more heart disease, 18% vs 12% (p = .067); was more likely to be PD-L1 expression positive, 77% vs 63% (p = .011); and was less likely to harbor a BRAF mutation, 35% vs 50% (p = .003). The proportion with at least one hospitalization through 12 months was 17% for PEMBRO vs 24% for IPI+NIVO. Less than 2% of patients had more than one admission, and none had more than two, regardless of cohort. Unadjusted mean (SD) PPPM hospitalization rates were .016 (.037) for PEMBRO and .020 (.038) for IPI+NIVO. The adjusted odds ratio for any hospitalization with PEMBRO was 0.55 (95% CI .31, .97; p = .039) vs. IPI+NIVO.
ED visits occurred in 18% vs 21% of PEMBRO and IPI+NIVO patients, respectively, with no difference in covariate-adjusted analysis (p = .147). Conclusions: In the real world, patients receiving PEMBRO had a significantly lower probability of hospitalization and a similar probability of ED visits compared with IPI+NIVO through 12 months.
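For context, the crude (unadjusted) odds ratio implied by the reported hospitalization proportions can be reconstructed with simple arithmetic. The sketch below is illustrative only: the abstract's 0.55 is a covariate-adjusted logistic-regression estimate, so it differs from this crude value.

```python
import math

# Crude odds ratio for any hospitalization, reconstructed from the reported
# proportions: 17% of 200 PEMBRO pts vs 24% of 200 IPI+NIVO pts hospitalized.
a, b = 34, 200 - 34    # PEMBRO: hospitalized, not hospitalized
c, d = 48, 200 - 48    # IPI+NIVO: hospitalized, not hospitalized

odds_ratio = (a / b) / (c / d)

# Wald 95% confidence interval on the log-odds scale.
se = math.sqrt(1/a + 1/b + 1/c + 1/d)
ci_lo = math.exp(math.log(odds_ratio) - 1.96 * se)
ci_hi = math.exp(math.log(odds_ratio) + 1.96 * se)

print(f"crude OR = {odds_ratio:.2f}, 95% CI ({ci_lo:.2f}, {ci_hi:.2f})")
```

Note that the crude interval spans 1.0 while the adjusted interval in the abstract (.31, .97) does not, which is consistent with the PEMBRO cohort's poorer baseline characteristics confounding the unadjusted comparison.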


2021 ◽  
Vol 13 (21) ◽  
pp. 11991
Author(s):  
Jan Dirk Fijnheer ◽  
Herre van Oostendorp ◽  
Geert-Jan Giezeman ◽  
Remco C. Veltkamp

This paper presents the results of a game study comparing Powersaver Game with a competition feature to the same game without it, with respect to energy conservation in the household. In a pretest-posttest design, we tested whether changes in attitude, knowledge, and behavior with respect to household energy conservation differed between participants playing Powersaver Game with or without competition. All energy conservation activities that the application provides (e.g., washing clothes at low temperatures) take place in the real world, and feedback is based on real-time energy consumption. This so-called reality-enhanced game approach aims to optimize transfer between the game world and the real world. Household energy consumption changed significantly and positively in the long term due to competition: a significant difference of 8% in energy consumption between the two conditions was detected after the intervention. Besides energy conservation, no further differences between conditions were detected. A path analysis confirmed the chain of events whereby an increase in knowledge leads to attitude change, which in turn results in long-term behavior change. We conclude that Powersaver Game is effective in the transfer of energy conservation knowledge, which leads to long-term energy-saving behavior, while competition additionally contributes to behavior change.


2021 ◽  
Author(s):  
Haiying Zhang ◽  
Yuyuan Jia ◽  
Ying Ji ◽  
Xu Cong ◽  
Yan Liu ◽  
...  

Background: Although effective vaccines have been developed against COVID-19, the level of neutralizing antibodies (NAbs) induced by vaccination in the real world is still unknown. We aimed to evaluate the level and persistence of NAbs induced by two inactivated COVID-19 vaccines in China. Methods and findings: Serum samples were collected from 1,335 people aged 18 and over who were vaccinated with an inactivated COVID-19 vaccine at Peking University People's Hospital from January 19 to June 23, 2021, for detection of COVID-19 antibodies. NAbs were quantified against the WHO international standard for SARS-CoV-2 NAbs. The coefficients of variation between the detection results and the true values for the WHO standard, at the original concentration and at theoretical dilutions of 500 IU/mL, 250 IU/mL, 125 IU/mL, 72.5 IU/mL, 36.25 IU/mL, and 18.125 IU/mL, were all below the WHO international standard of 3%. On days 11-70, the positive rate of NAbs against COVID-19 was 82% to 100%; from day 71 to day 332, the positive rate decreased to 27%. The level of NAbs was significantly higher at 3-8 weeks than at 0-3 weeks. There was a high linear correlation between NAbs and IgG antibodies in the 1,335 vaccinated individuals. NAb levels decreased in 31 of 38 people (81.6%) tested at two time points after the second dose of vaccine. There was no significant difference in age between the groups with increased and decreased neutralizing antibody levels (x2 = -0.034, P > 0.05). The positive rate of NAbs in the two-dose vaccine group (77.3%) was significantly higher than that in the one-dose group (18.1%) (x2 = 312.590, P < 0.001). A total of 206 people tested 11-70 days after the second dose were divided into three age groups: 18-40 years, 41-60 years, and > 60 years. The positive rates of NAbs in these three groups were 95.14%, 78.43%, and 81.8%, respectively.
The positive rate of NAbs was significantly higher in the 18-40 group than in the 41-60 group (x2 = 12.547, P < 0.01), and the NAb titer in the 18-40 group was significantly higher than that in the 41-60 group (t = -0.222, P < 0.01). The positive rate of NAbs in the male group (89.32%) was lower than in the female group (91.26%), but the difference was not significant (x2 = 0.222, P > 0.05). Conclusions: The positive rate of NAbs was highest from 10 to 70 days after the second dose of vaccine and gradually decreased over time. There was a high linear correlation between COVID-19 NAbs and IgM/IgG antibodies in vaccinees, suggesting that where NAbs cannot be detected, IgM/IgG antibodies can be measured instead. The level of NAbs produced after vaccination was affected by age, but not by gender. The highest levels of NAbs were produced when doses were given 21 to 56 days apart, suggesting that an interval of 21 to 56 days between doses is suitable for vaccination.
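The group comparisons above rest on Pearson's chi-square test for 2x2 tables of NAb positivity. A minimal pure-Python sketch follows; the counts are hypothetical, since the abstract reports percentages but not every group size:

```python
# Pearson chi-square statistic for a 2x2 contingency table [[a, b], [c, d]],
# in the spirit of the positivity-rate comparisons above. Counts hypothetical.

def chi_square_2x2(table):
    """Return the Pearson chi-square statistic (1 degree of freedom)."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    # Sum (observed - expected)^2 / expected over the four cells.
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        expected = row * col / n
        stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical: 98/103 NAb-positive in a younger group vs 40/51 in an older group.
stat = chi_square_2x2([[98, 5], [40, 11]])
print(f"chi-square = {stat:.2f}")  # compare against the chi2(1) critical value 3.84
```

A statistic above 3.84 corresponds to P < 0.05 at one degree of freedom, which is how reported values such as x2 = 12.547 translate into the "P < 0.01" claims in the abstract.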


2007 ◽  
Vol 25 (18_suppl) ◽  
pp. 21035-21035
Author(s):  
A. S. The ◽  
V. Reddy ◽  
Y. Li ◽  
R. Davis ◽  
M. Baird ◽  
...  

21035 Background: SEER registry data indicate African Americans (AA) have a lower incidence of non-Hodgkin's lymphoma (NHL) but face higher lymphoma-specific mortality rates than Caucasians (C). To investigate this, we compared outcomes between AA & C patients (pts) with DLBCL treated at UAB CCC. Correlative histologic & molecular studies of established prognostic features were performed to investigate differences. Methods: After IRB approval, DLBCL pts diagnosed '95-'06 were identified from pathology & referral databases. Baseline demographic & clinical data were extracted, including treatment (Rx), response & survival. Expression patterns of CD10, bcl-6, & MUM-1 were used to identify germinal center B-cell-like (GCB) vs non-GCB molecular subtype of DLBCL based on the algorithm of Hans et al (2004). HLA-DR, bcl-2, & CD138 expression was determined. A 2:1 age- & gender-matched comparison of C to AA pts was performed to evaluate differences. Descriptive analysis with chi-square & Wilcoxon nonparametric tests & Kaplan-Meier survival analysis was performed to determine differences. Results: 188 DLBCL patients were identified. Race was noted in 129, including 18 AA pts (14%). In comparison to 36 C pts (2:1 match), no differences were seen in LDH or stage at presentation. AA pts achieved a better response to 1st-line Rx (p=0.01) & received fewer regimens (1 vs >1, p=0.05); however, they were more likely to receive rituximab with 1st-line Rx (p=0.02), likely reflecting presentation in the post-rituximab era. Median survival, not yet reached for AA pts, was 23 months for C pts (p=0.0509). Univariate & parametric survival analyses demonstrated LDH (p=0.0217) & 1st-line rituximab (p=0.0048) independently affected survival. In a separate cohort, no significant difference between races was seen in frequency of GCB vs non-GCB subtype or in expression of bcl-2, CD138, or HLA-DR.
Conclusions: Contrary to SEER observations, the outcome of AA DLBCL pts was superior to that of C pts in this single-center study. No molecular or clinical prognostic feature dominated in either race. The fact that more AA than C patients received rituximab upfront is consistent with the benefit of adding this agent to 1st-line Rx. No significant financial relationships to disclose.


2017 ◽  
Vol 35 (15_suppl) ◽  
pp. e15168-e15168
Author(s):  
Kristin Lynn Koenig ◽  
Christina Sing-Ying Wu ◽  
Wei Chen ◽  
Wendy L Frankel ◽  
Daniel Jones ◽  
...  

e15168 Background: 2-8% of all colorectal cancer (CRC) cases occur in younger adults (YAs), patients (pts) less than 50 years of age. However, current understanding of CRC in YAs is inadequate, especially of sporadic-onset disease. We conducted a study to describe the landscape of genomic alterations in YA CRC pts presenting to a large academic practice. Methods: Adult pts with CRC presenting to The Ohio State University Comprehensive Cancer Center oncology clinics were offered next-generation sequencing (NGS) through a customized 22-gene Ion AmpliSeq Mutation Panel as part of clinical care. Commonly mutated regions of select genes (including AKT1, ALK, BRAF, EGFR, ERBB2/4, FBXW7, FGFR1/2/3, KRAS/NRAS, MET, NOTCH1, PIK3CA, PTEN, TP53) were sequenced from tumor sections. Institutional review board approval was obtained to retrospectively analyze this NGS testing between 1/2013 and 3/2016. Results: 258 CRC pts underwent genomic profiling. 57 pts (22.1%) were YAs at diagnosis (range 22-49 years); 20 pts (7.8%) were 40 years old or younger. 31 YA pts (54.4%) had metastatic disease. Of the YAs with CRC, 18 pts (31.6%) were diagnosed with right-sided colon cancer, 16 pts (28.1%) with left-sided colon cancer, and 22 pts (38.6%) with rectal cancer. 110 genomic alterations were found in YA pts, with a mean of 1.9 mutations per tumor (range 0-6); 35 of these alterations (31.8%), found in 32 YA pts (56.1%), were actionable. Of the 110 alterations, 41.8% were in TP53, 28.2% in KRAS/NRAS, 10.0% in PIK3CA, 3.6% in BRAF, 3.6% in FBXW7, and 2.7% in PTEN. 6 YA pts (10.5%) had microsatellite instability (MSI-H). Only 1 pt had concomitant MSI-H and a BRAF mutation; 4 pts with BRAF mutations were microsatellite stable. Comparing our YA pts to a separate cohort of pts aged > 50 who had testing done, no significant difference was seen in mutation incidence in KRAS/NRAS (p = 1.0), TP53 (p = 0.3), PIK3CA (p = 0.128), or BRAF (p = 1.0). Conclusions: Genomic profiling through a targeted NGS panel is feasible as part of routine clinical practice.
There is disagreement in the literature on genetic mutations in YA compared to older-age CRC pts. Knowledge of the genomic landscape in YAs with CRC will lead to an improved understanding of how the underlying biology of CRC in YAs differs from that of CRC in older pts, and could impact the future care of this cohort.
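The per-gene p-values quoted above (e.g., p = 1.0 for KRAS/NRAS) are characteristic of Fisher's exact test on 2x2 mutation-frequency tables. A pure-Python sketch of the two-sided test follows; the counts are hypothetical, since the abstract reports only the p-values for these comparisons:

```python
from math import comb

# Two-sided Fisher's exact test for a 2x2 table:
#   [[a, b],    mutated / not mutated in group 1 (e.g., YA pts)
#    [c, d]]    mutated / not mutated in group 2 (e.g., pts aged > 50)

def fisher_exact_2x2(a, b, c, d):
    """Two-sided p-value: sum of probabilities of tables as or less likely."""
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2

    def hypergeom(x):
        # Probability of x successes in row 1 under fixed margins.
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = hypergeom(a)
    x_min = max(0, col1 - row2)
    x_max = min(row1, col1)
    return sum(hypergeom(x) for x in range(x_min, x_max + 1)
               if hypergeom(x) <= p_obs + 1e-12)

# Hypothetical counts: 20/57 KRAS/NRAS-mutant YA pts vs 70/201 older pts.
p = fisher_exact_2x2(20, 37, 70, 131)
```

With nearly identical mutation rates in the two groups, the most extreme tables all have probability at least that of the observed one, which is how a two-sided p-value of 1.0 arises.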


2019 ◽  
Vol 37 (15_suppl) ◽  
pp. 6542-6542
Author(s):  
Jack S Bevins ◽  
Hannah Fullington ◽  
Thomas W. Froehlich ◽  
Stephanie Hobbs ◽  
Ethan Halm ◽  
...  

6542 Background: Several cancer centers describe cancer-patient-dedicated urgent care clinics (UCCs) that address commonly anticipated complaints of adults with cancer. UCCs may be capable of preventing some ED visits, but little is known about patient safety and outcomes after a UCC visit. Methods: We identified UCC visits made by adults at our comprehensive cancer center between 2013 and 2016 and compared the cohort to patients who did not visit the UCC. We linked patients to tumor registry data and their electronic health records from the UCC visit, then tracked ED visits and inpatient and intensive care unit (ICU) admissions occurring within 24 hours of the UCC visit. Results: Between 2013 and 2016, 551 patients generated 772 UCC visits, compared to 17,496 patients who did not visit. UCC users had significantly (p < 0.001) more advanced-stage cancer than non-UCC users (37.3% vs 18.9%), but there were no significant differences in mean age, race/ethnicity, or death within 180 days of diagnosis. The most common chief complaints accounted for nearly half of all UCC visits: (17.4%), URI symptoms/fever (12.6%), nausea/vomiting/diarrhea (7.8%), and fatigue/weakness (7.6%). After 10.0% of UCC visits, patients had an ED visit, while after 12.3% they were admitted to the hospital; only 5 UCC visits (0.7%) had an associated ICU stay. Most patients (75.7%) had only a single UCC visit, but patients who visited the UCC more often tended to have higher rates of ED visits and hospitalizations within 24 hours (Table). The mean time from UCC arrival to ED arrival was 3.0 hours, and from UCC arrival to inpatient arrival 6.5 hours. Conclusions: The majority of patients seen in the UCC did not require an ED visit or inpatient hospitalization. Patients with subsequent ED or inpatient visits had minimal delays in care. Findings suggest that triaging cancer patients with commonly anticipated complaints to a UCC does not result in high rates of mis-triage or major delays in care.
Patients with an ED, inpatient, or ICU visit after UCC, stratified by UCC visits per patient (2013-2016). [Table: see text]


2019 ◽  
Vol 37 (4_suppl) ◽  
pp. 710-710
Author(s):  
Benjamin D Fangman ◽  
Muhammad Shaalan Beg ◽  
Aravind Sanjeevaiah ◽  
Farshid Araghizadeh ◽  
Shannon Scielzo ◽  
...  

710 Background: Oncologic treatment at National Cancer Institute (NCI)-designated comprehensive cancer centers improves outcomes in a variety of malignancies. Racial disparity plays an important role in cancer outcomes and prognosis. We compared outcomes by race in patients with early-stage colorectal cancer who presented to a comprehensive cancer center. Methods: This is a retrospective analysis of patients diagnosed with AJCC stage II or stage III colorectal cancer who underwent surgery or received adjuvant chemotherapy within the University of Texas Southwestern and Simmons Comprehensive Cancer Center. Pertinent data points were abstracted from the EMR, including demographics and dates of initial diagnosis, surgery, adjuvant chemotherapy, progression, and death. Results: Between 4/2011 and 11/2015, 203 patients were identified, and 167 patients had complete follow-up data available. The median age of the cohort was 62 (range 21-90), and most patients were men (52.7%). Stage II comprised 44.3% of patients, while 55.7% were diagnosed at stage III. One hundred and twenty patients (71.9%) were white, 34 patients (20.4%) identified as black, and the rest belonged to other races. Hispanic ethnicity was identified in 10.4% of patients. There was no significant difference between the white and black cohorts in age (median 62 vs. 64.5 years; p = 0.44), gender (p = 0.43), or stage (p = 0.99) of colorectal cancer. Similarly, there was no significant difference between white and black patients in days to surgery (median 17 vs. 32 days; p = 0.53), days to first medical oncology appointment (30 vs. 34 days; p = 0.23), or days to adjuvant chemotherapy (42 vs. 52 days; p = 0.24). The rate of recurrence (10.9% vs. 26%; p = 0.09), rate of death (14.1% vs. 14.7%; p = 1.0), median relapse-free survival (41.7 vs. 36.2 months; p = 0.173), and median overall survival (42 vs. 38.5 months; p = 0.491) from colorectal cancer were also not significantly different between white and black patients.
Conclusions: Oncologic treatment at NCI-designated comprehensive cancer centers may lead to racial parity in colorectal cancer outcomes. Further research should compare these results to those seen at safety-net hospitals.

