Impact of obesity on functional and oncological outcomes in radical perineal prostatectomy

2015, Vol 9 (11-12), pp. 766
Author(s): Bulent Altay, Bulent Erkurt, Vahit Guzelburc, Murat Can Kiremit, Mustafa Yucel Boz, ...

Introduction: We evaluated the impact of obesity on perioperative morbidity and on functional and oncological outcomes after radical perineal prostatectomy (RPP). Methods: A total of 298 consecutive patients underwent RPP at our institution. Patients were categorized into 3 groups based on their body mass index (BMI): normal weight <25 kg/m2 (Group 1), overweight 25 to <30 kg/m2 (Group 2), and obese ≥30 kg/m2 (Group 3). We compared the groups with respect to perioperative data and postoperative oncological and functional outcomes. Urinary continence and erectile function were evaluated using a patient-reported questionnaire and the International Index of Erectile Function-5 questionnaire, respectively, administered preoperatively and at 3, 6, and 12 months. Limitations included short follow-up time, retrospective design, and the lack of a morbidly obese group. Results: No significant differences were found among the 3 groups with regard to operative time, estimated blood loss, length of hospital stay, catheter removal time, positive surgical margin rate, and complication rates. At 12 months, 94.7%, 95%, and 95% of normal-weight, overweight, and obese patients, respectively, were continent (free of pad use) (p = 0.81). At 12 months, 30.6%, 29.8%, and 30.4% of patients in Group 1, Group 2, and Group 3, respectively, had spontaneous erections and were able to penetrate and complete intercourse (p = 0.63). Conclusions: In this cohort of patients, no clinically relevant risks were associated with increasing BMI.
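As a minimal sketch of the BMI grouping described in this abstract (the cutoffs <25 / 25 to <30 / ≥30 kg/m2 come from the text; the data and column names are invented for illustration, with pandas as an assumed tool choice):

```python
import pandas as pd

# Hypothetical patient BMIs; in the study there were 298 consecutive RPP patients
patients = pd.DataFrame({"bmi": [22.4, 27.1, 31.8, 24.9, 30.0]})

patients["bmi_group"] = pd.cut(
    patients["bmi"],
    bins=[0, 25, 30, float("inf")],
    right=False,  # left-closed intervals: 25.0 counts as overweight, 30.0 as obese
    labels=["normal", "overweight", "obese"],
)
print(patients)
```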

2020, Vol 22 (Supplement_3), pp. iii440-iii440
Author(s): Harriet Dulson, Rachel McAndrew, Mark Brougham

Abstract INTRODUCTION Children treated for CNS tumours experience a very high burden of adverse effects. Platinum-based chemotherapy and cranial radiotherapy can cause ototoxicity, which may be particularly problematic in patients who have impaired vision and cognition as a result of their tumour and associated treatment. This study assessed the prevalence of impaired hearing and vision and how these may impact upon education. METHODS 53 patients diagnosed with solid tumours in Edinburgh, UK between August 2013 and 2018 were included in the study. Patients were split into three groups according to treatment received: Group 1, cisplatin-based chemotherapy and cranial radiotherapy; Group 2, platinum-based chemotherapy without cranial radiotherapy; Group 3, benign brain tumours treated with surgery only. Data were collected retrospectively from patient notes. RESULTS Overall, 69.5% of those treated with platinum-based chemotherapy experienced ototoxicity as assessed by Brock grading, and 5.9% of patients had reduced visual acuity. Patients in Group 1 had the highest prevalence of both. 44.4% of patients in Group 1 needed increased educational support following treatment, either with extra support in the classroom or because they were unable to continue in mainstream school; 12.5% of Group 2 patients and 31.3% of Group 3 patients required such support. CONCLUSIONS Children with CNS tumours frequently require support for future education, but those treated with both platinum-based chemotherapy and cranial radiotherapy are at particular risk, which may be compounded by co-existent ototoxicity and visual impairment. It is essential to provide appropriate support for this patient cohort in order to maximise their educational potential.
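A rough sketch of the prevalence-by-treatment-group tabulation behind results like these; all values and column names below are invented for illustration:

```python
import pandas as pd

# Toy version of the retrospective dataset: one row per patient
df = pd.DataFrame({
    "group":         ["1", "1", "1", "2", "2", "3", "3"],
    "ototoxicity":   [1, 1, 0, 1, 0, 0, 0],
    "extra_support": [1, 0, 0, 0, 0, 1, 0],
})

# Mean of a 0/1 indicator per group = prevalence of that outcome per group
print(df.groupby("group")[["ototoxicity", "extra_support"]].mean())
```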


2021, Vol 11 (1)
Author(s): Yu Liu, Jing Li, Wanyu Zhang, Yihong Guo

Abstract Oestradiol, an important hormone in follicular development and endometrial receptivity, is closely related to clinical outcomes of fresh in vitro fertilization-embryo transfer (IVF-ET) cycles. A supraphysiologic E2 level is inevitable during controlled ovarian hyperstimulation (COH), and its effect on the outcome of IVF-ET is controversial. The aim of this retrospective study was to evaluate the association between elevated serum oestradiol (E2) levels on the day of human chorionic gonadotrophin (hCG) administration and neonatal birthweight after IVF-ET cycles. The data of 3659 infertile patients with fresh IVF-ET cycles between August 2009 and February 2017 at the First Hospital of Zhengzhou University were analysed retrospectively. Patients were categorized by serum E2 level on the day of hCG administration into six groups: group 1 (≤1000 pg/mL, n = 230), group 2 (1001-2000 pg/mL, n = 524), group 3 (2001-3000 pg/mL, n = 783), group 4 (3001-4000 pg/mL, n = 721), group 5 (4001-5000 pg/mL, n = 548), and group 6 (>5000 pg/mL, n = 852). Univariate linear regression was used to evaluate the independent correlation between each factor and the outcome index, and multiple logistic regression was used to adjust for confounding factors. The low birthweight (LBW) rates were 3.0% (group 1), 2.9% (group 2), 1.9% (group 3), 2.9% (group 4), 2.9% (group 5), and 2.0% (group 6) (P = 0.629); there were no statistically significant differences in the incidence of neonatal LBW among the six groups. We did not detect an association between peak serum E2 level during ovarian stimulation and neonatal birthweight after IVF-ET, nor between higher E2 levels and increased LBW risk. Our observations suggest that the hyper-oestrogenic milieu during COH does not seem to have adverse effects on the birthweight of offspring after IVF. Although this study provides some reference, obstetric-related factors could not be included for historical reasons, and the impact of a high-oestrogen environment during COH on the birthweight of IVF offspring still needs future research.
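A sketch of the two analysis steps described above: binning trigger-day E2 into the six reported groups, then a logistic regression for LBW adjusted for confounders. The E2 cutoffs are from the abstract; the simulated data, the confounder (maternal age), and the use of statsmodels are assumptions:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "e2_pg_ml": rng.uniform(500, 7000, n),       # trigger-day E2, simulated
    "maternal_age": rng.normal(31, 4, n),        # assumed confounder
    "lbw": rng.binomial(1, 0.05, n),             # low-birthweight indicator
})

# Six E2 groups as in the abstract: <=1000, 1001-2000, ..., >5000 pg/mL
df["e2_group"] = pd.cut(
    df["e2_pg_ml"],
    bins=[0, 1000, 2000, 3000, 4000, 5000, np.inf],
    labels=["1", "2", "3", "4", "5", "6"],
)

# Multiple logistic regression; group 1 is the reference level
model = smf.logit("lbw ~ C(e2_group) + maternal_age", data=df).fit(disp=False)
print(np.exp(model.params))   # odds ratios for LBW by E2 group
```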


2020, Vol 79 (Suppl 1), pp. 340.2-341
Author(s): V. Orefice, F. Ceccarelli, C. Barbati, R. Lucchetti, G. Olivieri, ...

Background: Systemic lupus erythematosus (SLE) is an autoimmune disease mainly affecting women of childbearing age. The interplay between genetic and environmental factors may contribute to disease pathogenesis1. To date, no robust data are available on the possible contribution of diet to SLE. Caffeine, one of the most widely consumed products in the world, seems to interact with multiple components of the immune system by acting as a non-specific phosphodiesterase inhibitor2. In vitro, dose-dependent treatment with caffeine seems to down-regulate mRNA levels of key inflammation-related genes and similarly reduce levels of different pro-inflammatory cytokines3. Objectives: We evaluated the impact of caffeine consumption on SLE-related disease phenotype and activity, in terms of clinimetric assessment and cytokine levels. Methods: We performed a cross-sectional study, enrolling consecutive patients and recording their clinical and laboratory data. Disease activity was assessed by the SLE Disease Activity Index 2000 (SLEDAI-2K)4. Caffeine intake was evaluated by a 7-day food frequency questionnaire covering all the main sources of caffeine. As previously reported, patients were divided into four groups according to daily caffeine intake: <29.1 mg/day (group 1), 29.2-153.7 mg/day (group 2), 153.8-376.5 mg/day (group 3), and >376.6 mg/day (group 4)5. After questionnaire completion, blood samples were collected from each patient, and cytokine levels were measured using a Bio-Plex assay panel (IL-6, IL-10, IL-17, IL-27, IFN-γ, IFN-α, and BLyS). Results: We enrolled 89 SLE patients (F/M 87/2; median age 46 years, IQR 14; median disease duration 144 months, IQR 150). The median caffeine intake was 195 mg/day (IQR 160.5). At enrolment, 8 patients (8.9%) reported a caffeine intake <29.1 mg/day (group 1), 27 patients (30.3%) between 29.2 and 153.7 mg/day (group 2), 45 patients (51%) between 153.8 and 376.5 mg/day (group 3), and 9 patients (10.1%) >376.6 mg/day (group 4). A negative correlation between caffeine intake and disease activity, evaluated with SLEDAI-2K, was observed (p=0.01, r=-0.26). Comparing the four groups, a significantly higher prevalence of lupus nephritis, neuropsychiatric involvement, haematological manifestations, hypocomplementemia, and anti-dsDNA positivity was observed in patients with lower caffeine intake (figure 1 A-E). Furthermore, patients with lower caffeine intake used glucocorticoids significantly more often [group 4: 22.2%, versus group 1 (50.0%, p=0.0001), group 2 (55.5%, p=0.0001), and group 3 (40.0%, p=0.009)]. Turning to the cytokine analysis, a negative correlation between daily caffeine consumption and serum IFN-γ level was found (p=0.03, r=-0.2) (figure 2A); furthermore, patients with higher caffeine intake showed significantly lower levels of IFN-α (p=0.02, figure 2B), IL-17 (p=0.01, figure 2C), and IL-6 (p=0.003, figure 2D). Conclusion: This is the first report demonstrating an impact of caffeine on SLE disease activity, as shown by the inverse correlation between caffeine intake and both SLEDAI-2K values and cytokine levels. Moreover, in our cohort, patients with lower caffeine consumption seem to have a more severe disease phenotype, especially in terms of renal and neuropsychiatric involvement.
Our results seem to suggest a possible immunoregulatory, dose-dependent effect of caffeine through the modulation of serum cytokine levels, as already suggested by in vitro analysis. References: [1] Kaul et al. Nat Rev Dis Primers 2016. [2] Aronsen et al. Eur J Pharm 2014. [3] Iris et al. Clin Immunol 2018. [4] Gladman et al. J Rheumatol 2002. [5] Mikuls et al. Arthritis Rheum 2002. Disclosure of Interests: Valeria Orefice: None declared. Fulvia Ceccarelli: None declared. Cristiana Barbati: None declared. Ramona Lucchetti: None declared. Giulio Olivieri: None declared. Enrica Cipriano: None declared. Francesco Natalucci: None declared. Carlo Perricone: None declared. Francesca Romana Spinelli: Grant/research support from Pfizer; Consultant of Novartis, Gilead, Lilly, Sanofi, Celgene; Speakers bureau: Lilly. Cristiano Alessandri: Grant/research support from Pfizer. Guido Valesini: None declared. Fabrizio Conti: Speakers bureau: BMS, Lilly, Abbvie, Pfizer, Sanofi.
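A minimal sketch of the intake-activity correlation reported in this abstract (p=0.01, r=-0.26). The data below are invented, and Spearman's rho is an assumption, since the abstract does not name the correlation statistic:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 89                                             # 89 patients, as enrolled
caffeine_mg_day = rng.uniform(0, 500, n)
# Simulated SLEDAI-2K with a weak negative dependence on intake, floored at 0
sledai = np.clip(10 - 0.01 * caffeine_mg_day + rng.normal(0, 3, n), 0, None)

rho, p = spearmanr(caffeine_mg_day, sledai)
print(f"rho={rho:.2f}, p={p:.3f}")
```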


Circulation, 2014, Vol 130 (suppl_2)
Author(s): Janet W Elcano, Hui Nam Pak

Background: The incidence of atrial fibrillation (AF) is increasing in the elderly population; however, there is a paucity of data on safety outcomes in this patient subgroup, so we sought to investigate the impact of age on the safety of catheter ablation for AF. Methods and Results: We included 1,293 patients (75% male) enrolled in the Yonsei AF Ablation Cohort database in Seoul, South Korea, from March 2009 to November 2013. We divided the patients into 4 groups according to age (Group 1, aged 17-49, N=295; Group 2, 50-59, N=421; Group 3, 60-69, N=408; Group 4, ≥70, N=169) and evaluated the incidence of procedure-related complications. No procedure-related death occurred in this study. There was a trend of increasing incidence of procedure-related complications with age: Group 1, 3.7%; Group 2, 4.0%; Group 3, 6.6%; Group 4, 7.1% (p=0.15). There were 28 cases (2.2%) of major complications (Group 1, 1.7%; Group 2, 1.9%; Group 3, 2%; Group 4, 4.1%), tamponade being the most common. Major complications in Group 4 included tamponade (4 cases), phrenic nerve palsy (1 case), atrioesophageal fistula (1 case), and third-degree AV block (1 case). Multivariate regression analysis showed that ablation time (odds ratio (OR) 1.2, confidence interval (CI) 1.0-1.017, p=0.017), procedure time (OR 1.008, CI 1.0-1.15, p=0.04), decreasing eGFR (OR 1.013, CI 1.002-1.026, p=0.018), coronary artery disease (CAD) (OR 1.847, CI 1.003-3.524, p=0.04), and age (OR 1.028, CI 1.003-1.055, p=0.03) were associated with an increased adjusted risk of total complications. Predictors of major complications included age (OR 1.044, CI 1.003-1.086, p=0.02) and ablation time (OR 1.009, CI 0.999-1.000, p=0.033). Conclusion: Our data suggest that the incidence of procedural complications in RFA of AF increases with age. Ablation time and age are independent predictors of a major complication.
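A sketch of a multivariable logistic model of the kind reported above, exponentiating coefficients into odds ratios with 95% CIs. The covariate names mirror the abstract, but the data are simulated and statsmodels is an assumed tool choice:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 1293                                          # cohort size, as reported
df = pd.DataFrame({
    "age": rng.normal(58, 11, n),
    "ablation_min": rng.normal(90, 25, n),
    "procedure_min": rng.normal(180, 40, n),
    "egfr": rng.normal(75, 15, n),
    "cad": rng.binomial(1, 0.1, n),
    "complication": rng.binomial(1, 0.05, n),     # any procedure-related complication
})

fit = smf.logit(
    "complication ~ age + ablation_min + procedure_min + egfr + cad", df
).fit(disp=False)

# exp(coef) gives the adjusted OR per unit of each covariate
ors = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
ors.columns = ["OR", "CI 2.5%", "CI 97.5%"]
print(ors.round(3))
```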


2020, Vol 102-B (6_Supple_A), pp. 24-30
Author(s): Andrew T. Livermore, Jill A. Erickson, Brenna Blackburn, Christopher L. Peters

Aims A significant percentage of patients remain dissatisfied after total knee arthroplasty (TKA). The aim of this study was to determine whether the sequential addition of accelerometer-based navigation for femoral component preparation and sensor-guided ligament balancing improved complication rates, radiological alignment, or patient-reported outcome measures (PROMs) compared with a historical control group using conventional instrumentation. Methods This retrospective cohort study included 371 TKAs performed by a single surgeon sequentially. A historical control group, with the use of intramedullary guides for distal femoral resection and surgeon-guided ligament balancing, was compared with a group using accelerometer-based navigation for distal femoral resection and surgeon-guided balancing (group 1), and one using navigated femoral resection and sensor-guided balancing (group 2). Primary outcome measures were Patient-Reported Outcomes Measurement Information System (PROMIS) and Knee injury and Osteoarthritis Outcome Score (KOOS) scores, measured preoperatively and at six weeks and 12 months postoperatively. The position of the components and the mechanical axis of the limb were measured postoperatively. The postoperative range of motion (ROM), haematocrit change, and complications were also recorded. Results There were 194 patients in the control group, 103 in group 1, and 74 in group 2. There were no significant differences in baseline demographics between the groups. Patients in group 2 had significantly higher baseline mental health subscores than control and group 1 patients (53.2 vs 50.2 vs 50.2, p = 0.041). There were no significant differences in any PROMs at six weeks or 12 months postoperatively (p > 0.05). There was no difference in the rate of manipulation under anaesthesia (MUA), complication rates, postoperative ROM, or blood loss. There were fewer mechanical axis outliers in groups 1 and 2 (25.2% and 14.9%, respectively) versus control (28.4%), but this was not statistically significant (p = 0.10). Conclusion The sequential addition of navigation of the distal femoral cut and sensor-guided ligament balancing did not improve short-term PROMs, radiological outcomes, or complication rates compared with conventional techniques. The costs of these added technologies may not be justified. Cite this article: Bone Joint J 2020;102-B(6 Supple A):24–30.
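A hedged sketch of the kind of three-group score comparison reported above (e.g. the baseline mental-health subscores, p = 0.041); the score values are simulated and a one-way ANOVA via scipy is an assumption, since the abstract does not name the test:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(3)
control = rng.normal(50.2, 8, 194)   # group sizes and means as reported
group1 = rng.normal(50.2, 8, 103)
group2 = rng.normal(53.2, 8, 74)

stat, p = f_oneway(control, group1, group2)
print(f"F={stat:.2f}, p={p:.3f}")
```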


2018, Vol 3 (3), pp. 2473011418S0008
Author(s): Andrew Molloy, Samantha Whitehouse, Lyndon Mason

Category: Trauma. Introduction/Purpose: Ankle fractures are among the most common fractures. Historically they have frequently been treated by non-specialists and junior staff. In 2011 we presented high malunion rates, which have been mirrored in other departments' work. We present the results of system changes introduced to improve the results of ankle fracture fixation. Methods: Image intensifier films were reviewed on PACS and scored based on the criteria published by Pettrone et al. At least two blinded assessors assigned scores independently. Patients' clinical data were collected from medical records. In 2011 we presented the results of fixation in 94 consecutive patients (Group 1) from 2009. Following this, there was a period of education in the department to allow change. 68 patients (Group 2) were then reviewed from a 7-month period in 2014. Multiple system changes were subsequently introduced in the department, including new treatment algorithms, dedicated foot and ankle trauma lists and clinics, and next-day review of all intra-operative radiographs by an independent attending. Prospective data were collected on 205 consecutive cases (Group 3) from 01/01/15 to 09/30/16. Results: Patients in Group 1 had a malreduction rate of 33%. The major complication rate in this group was 8.5% (8 patients), with only one of these occurring in a correctly reduced fracture; these complications comprised 4 revision fixations, 2 deep infections, and 1 amputation. Following the period of re-education, the malreduction rate in Group 2 deteriorated to 43.8%, with a major complication rate of 10.9%, including 6 revision fixations and 1 ankle fusion. In Group 3, following the overall system changes, the malreduction rate was 2.4%, a statistically significant improvement. The major complication rate fell to 0.98% (1 deep infection and 1 amputation, in a polytrauma patient with vascular injury); this result is again statistically significant. Conclusion: Our initial results show that very poor results follow when insufficient attention is given to what are frequently considered 'simple' fractures. In Group 2 we demonstrated that soft educational changes (e.g. presentations, emails) are ineffective in improving results. Hard (institutional system) changes in our department provided statistically significant improvements, by putting the correct surgeon for the fracture in charge of both determining the treatment plan and operating. With these changes, malreduction rates fell from 43.8% to 2.4% and major complication rates from 10.9% to 0.98%.
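A minimal sketch of the before/after comparison of malreduction rates (Group 2: 43.8% of 68 vs Group 3: 2.4% of 205). Fisher's exact test is an assumption here, as the abstract states only that the result is statistically significant:

```python
from scipy.stats import fisher_exact

# Reconstruct approximate counts from the reported rates and group sizes
mal_g2, n_g2 = round(0.438 * 68), 68     # ~30 of 68 malreduced
mal_g3, n_g3 = round(0.024 * 205), 205   # ~5 of 205 malreduced

# 2x2 table: rows = malreduced / well reduced, columns = Group 2 / Group 3
table = [[mal_g2, mal_g3], [n_g2 - mal_g2, n_g3 - mal_g3]]

odds, p = fisher_exact(table)
print(f"odds ratio={odds:.1f}, p={p:.2g}")
```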


Author(s): Kirsten E Lyke, Alexandra Singer, Andrea A Berry, Sharina Reyes, Sumana Chakravarty, ...

Abstract Background A live-attenuated Plasmodium falciparum (Pf) sporozoite (SPZ) vaccine (PfSPZ Vaccine) has shown up to 100% protection against controlled human malaria infection (CHMI) using homologous parasites (the same Pf strain as in the vaccine). Using a more stringent CHMI with heterologous parasites (a different Pf strain), we assessed the impact of higher PfSPZ doses, a novel multi-dose prime regimen, and a delayed vaccine boost upon vaccine efficacy (VE). Methods Four groups of 15 healthy, malaria-naïve adults were immunized. Group (Grp) 1 received five doses of 4.5×10⁵ PfSPZ (days 1, 3, 5, 7; week 16). Grps 2, 3 and 4 received three doses (weeks 0, 8, 16), with Grp 2 receiving 9.0×10⁵/dose, Grp 3 receiving 18.0×10⁵/dose, and Grp 4 receiving 27.0×10⁵ for dose 1 and 9.0×10⁵ for doses 2 and 3. VE was assessed by heterologous CHMI after 12 or 24 weeks. Volunteers not protected at 12 weeks were boosted prior to repeat CHMI at 24 weeks. Results At 12-week CHMI, 6/15 (40%) of Group 1 (P=0.04) and 3/15 (20%) of Group 2 vs. 0/8 controls remained aparasitemic. At 24-week CHMI, 3/13 (23%) of Group 3 and 3/14 (21%) of Group 4 vs. 0/8 controls remained aparasitemic (Groups 2-4, VE not significant). Post-boost, 9/14 (64%) vs. 0/8 controls remained aparasitemic (3/6 Group 1, P=0.025; 6/8 Group 2, P=0.002). Conclusions Four stacked priming injections (multi-dose priming) showed 40% VE against heterologous CHMI, while dose escalation of PfSPZ using single-dose priming was not significantly protective. Boosting unprotected subjects improved VE at 24 weeks to 64%.
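A back-of-envelope check of the headline efficacy figure from the counts above (Group 1: 6/15 aparasitemic vs controls: 0/8). VE is computed here as 1 minus the relative risk of parasitemia, one common definition; the abstract does not state the exact estimator used:

```python
# Parasitemic counts reconstructed from the abstract
infected_vax, n_vax = 15 - 6, 15   # 9 of 15 vaccinees became parasitemic
infected_ctl, n_ctl = 8, 8         # all 8 controls became parasitemic

rr = (infected_vax / n_vax) / (infected_ctl / n_ctl)
ve = 1 - rr
print(f"VE = {ve:.0%}")            # 40%, matching the reported figure
```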


Blood, 2007, Vol 110 (11), pp. 830-830
Author(s): J. Alejandro Madrigal, Neema P. Mayor, Hazael Maldonado-Torres, Bronwen E. Shaw, Steven G.E. Marsh

Abstract Haematopoietic Stem Cell Transplantation (HSCT) using volunteer Unrelated Donors (UD) has become an important and viable option in the treatment of Acute Leukaemia (AL). While matching donors and recipients usually refers to five of the classical HLA genes (HLA-A, -B, -C, -DRB1 and -DQB1), the impact of a sixth gene, HLA-DPB1, on the outcome of UD-HSCT is increasingly emerging. We have previously shown an increased risk of relapse with HLA-DPB1 matching and, independently, with NOD2/CARD15 genotype. In light of these data, we have analysed a larger UD-HSCT cohort in order to establish the impact on transplant outcome when both HLA-DPB1 matching status and NOD2/CARD15 genotype are considered. HLA typing and NOD2/CARD15 genotyping were performed on 304 AL patients and their Anthony Nolan Trust volunteer unrelated donors. Transplants occurred between 1996 and 2005 at UK transplant centres. Diagnoses were ALL (47%) and AML (53%). 67% of the cohort were a 10/10 HLA match, with 16% also being matched for HLA-DPB1. Myeloablative conditioning regimens were used in 74% of transplants. T-cell depletion was included in 82% of conditioning protocols. Bone marrow was used in 72% of transplants, with the remaining 28% using peripheral blood stem cells. Two forms of post-transplant immunosuppression predominated: Cyclosporine A and Methotrexate (47%) and Cyclosporine A alone (38%). Previous studies on a subgroup of this cohort showed that HLA-DPB1 matching and NOD2/CARD15 SNPs independently caused an increase in disease relapse. Consequently, the cohort was grouped into three categories to reflect this risk: group 1 (HLA-DPB1 matched, NOD2/CARD15 SNP; n=24), group 2 (HLA-DPB1 matched, NOD2/CARD15 wild-type (WT), or HLA-DPB1 mismatched, NOD2/CARD15 SNP; n=112) and group 3 (HLA-DPB1 mismatched, NOD2/CARD15 WT; n=168). There was a significant difference in disease relapse between the three groups (1 year: group 1, 68%; group 2, 48%; group 3, 30%; p=0.0038). This finding persisted in multivariate analysis, where being in either group 2 or 3 was protective against relapse as compared to group 1 (RR 0.321, 95% CI 0.167–0.616, p=0.001 and RR 0.478, 95% CI 0.244–0.934, p=0.031, respectively). In the group with the highest relapse risk (group 1), this resulted in a decrease in Overall Survival (OS) (33% vs 54% in group 3; RR 0.617, 95% CI 0.359–1.060, p=0.080). The best OS was seen in the group with the lowest risk of relapse (group 3). Here, in addition to low relapse, there was increased acute and chronic Graft-versus-Host Disease (aGvHD and cGvHD; p=0.0019 and p=0.0058, respectively). In this cohort, cGvHD (in its limited form) was associated with a significantly lower incidence of relapse (p=0.0066) and better OS (p<0.0001). In concordance with our previous findings, being HLA-DPB1 matched and having NOD2/CARD15 SNPs predicts the worst outcome, with a significant increase in relapse and reduced OS. Conversely, the ideal pairing would be HLA-DPB1 mismatched and NOD2/CARD15 WT. These data suggest that prospectively typing AL patients for HLA-DPB1 and NOD2/CARD15 SNPs will allow the prediction of disease relapse, aGvHD and cGvHD, and in addition will allow the effects of being HLA-DPB1 matched or having a NOD2/CARD15 SNP to be offset by intelligently selecting a suitable, less precarious donor.
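A sketch of a proportional-hazards model of the kind behind the relapse relative risks quoted above (group 2 vs 1: RR 0.321; group 3 vs 1: RR 0.478). The data are simulated, the group sizes mirror the abstract, and lifelines is an assumed tool choice:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 304
# Assign risk groups with the reported sizes (24 / 112 / 168)
grp = rng.choice(["g1", "g2", "g3"], size=n, p=[24 / 304, 112 / 304, 168 / 304])

df = pd.DataFrame({
    "months": rng.exponential(24, n),          # simulated time to relapse
    "relapse": rng.binomial(1, 0.5, n),        # 1 = relapsed, 0 = censored
    "group2": (grp == "g2").astype(int),       # group 1 is the reference
    "group3": (grp == "g3").astype(int),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="relapse")
# exp(coef) is the hazard ratio vs group 1, with its 95% CI
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```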


Blood, 2009, Vol 114 (22), pp. 1223-1223
Author(s): Alessandro Corso, Silvia Mangiacavalli, Luciana Barbarano, Annalisa Citro, Paola Brasca, ...

Abstract 1223, Poster Board I-245. Introduction: This study aimed at evaluating the impact of three different pre-transplant therapies on the outcome of patients (pts) eligible for high-dose therapy. Methods: 268 newly diagnosed MM pts aged ≤65 years, Durie-Salmon stage III, II, or I in progression, were consecutively enrolled from 2000 to 2007 in three different protocols with three different pre-transplant therapies. Group 1 (145 pts): 3 pulse-VAD cycles. Group 2 (67 pts): 3 pulse-VAD cycles plus 3 Thal-Dex cycles (thalidomide 100 mg/day orally at bedtime, continuously for 3 months; oral dexamethasone 20 mg on days 1-4 and 14-17 every 28 days). Group 3 (57 pts): 4 Vel-Dex courses (bortezomib 1.3 mg/m2 i.v. on days 1, 4, 8, 11; oral dexamethasone 40 mg on days 1-4 and 8-11 every 3 weeks). After induction, all pts received two DCEP-short cycles as mobilization (oral dexamethasone 40 mg/day on days 1-4 plus cyclophosphamide 700 mg/m2/day i.v., etoposide 100 mg/m2/day i.v., and cisplatin 25 mg/m2/day, for 2 days) with peripheral blood stem-cell (PBSC) collection prompted by G-CSF, followed by one or two transplants (Tx) with melphalan 200 mg/m2 as conditioning regimen. Response was defined according to IMWG uniform criteria; pts were considered responsive when obtaining at least a PR. Results: pts in the three groups were similar for age, gender, Ig type, and ISS stage. A significantly higher percentage of Durie-Salmon stage III was found in group 3 (83% vs 68% in group 1 and 67% in group 2, p=0.0002). The median follow-up was 46 (1-150) months for group 1, 43 (1-68) months for group 2, and 29.7 (1-79) months for group 3. At the time of this analysis, 51%, 65%, and 90% of transplanted pts in the three groups respectively were still alive, and progression after transplant was registered in 84%, 80%, and 50% respectively. Patient flow before Tx was similar (p=0.45): 19% in group 1, 27% in group 2, 23% in group 3. In group 1, 2% of pts went off-study after VAD and 17% after the mobilization phase. In group 2, patient flow was equally distributed: 7% after pulse VAD, 10% after Thal-Dex, 9% after DCEP. In group 3, 12% of pts went off-study after Vel-Dex and 11% after DCEP. Table 1 summarizes responses. In group 3 (Vel-Dex), response was better along all protocol phases with respect to groups 1 and 2 (p<0.00001): the number of responsive pts progressively increased from 87% after Vel-Dex (CR 31%) to 96% after transplant (CR 38%). Response rates of group 1 and group 2 patients were not significantly different after induction (p=0.6), after DCEP (p=0.5), or after Tx (p=0.65). On an intention-to-treat basis, Vel-Dex induction produced a better, although not significant, PFS (34.6 months vs 29 in group 1 and 26.8 in group 2, p=0.56). OS was not statistically different among the three groups, even though the different follow-up could affect the analysis (median OS 110 months in group 1, 66 months in group 2, and not reached in group 3, p=0.37). In multivariate analysis, PFS was improved only by the achievement of CR (p=0.001); no significant difference was observed between VGPR and PR (p=0.43). Conclusion: In this study, only CR, not VGPR, impacted the outcome. Vel-Dex, producing a significantly high CR rate after Tx (38%), seems to improve survival of MM patients who are candidates for high-dose therapy with respect to conventional pre-transplant strategies. Disclosures: Morra: Roche.
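A sketch of the kind of survival comparison behind the PFS figures above (median 34.6 vs 29 vs 26.8 months, p=0.56), shown for two of the arms via a logrank test. The data are simulated from the reported medians and group sizes; lifelines is an assumed tool choice:

```python
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(5)
# Simulated PFS in months; exponential scale set to the reported medians' ballpark
pfs_veldex = rng.exponential(34.6, 57)     # group 3, Vel-Dex, 57 pts
pfs_vad = rng.exponential(29.0, 145)       # group 1, pulse-VAD, 145 pts
# Event indicators (1 = progression observed, 0 = censored)
events_veldex = rng.binomial(1, 0.5, 57)
events_vad = rng.binomial(1, 0.84, 145)

res = logrank_test(pfs_veldex, pfs_vad,
                   event_observed_A=events_veldex,
                   event_observed_B=events_vad)
print(f"logrank p = {res.p_value:.3f}")
```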


Blood, 2008, Vol 112 (11), pp. 54-54
Author(s): Yves Beguin, Johan Maertens, Bernard De Prijck, Rik Schots, Pascale Frere, ...

Abstract Background: We previously reported a retrospective study suggesting that erythropoietin therapy starting on day 28 after autologous HCT was highly effective in improving Hb levels and that i.v. iron might improve response in patients with low transferrin saturation (Baron et al., Clin Cancer Res 2003). This prompted us to conduct a multicenter prospective randomized study analyzing the impact of darbepoetin alfa, with or without i.v. iron, on erythroid recovery after autologous HCT. Patients and Methods: 127 autologous HCT recipients with lymphoid malignancies were randomized 1:2:2 between no treatment (group 1, n=25), darbepoetin alfa (Aranesp®) 300 μg QOW starting on day 28 after HCT for a total of 7 doses (group 2, n=52), or the same regimen of darbepoetin alfa plus i.v. iron sucrose (Venofer®) 200 mg on days 28, 42 and 56 after HCT (group 3, n=50). Once the target Hb (13 g/dL) was attained, the dose of darbepoetin alfa was reduced to 150 μg, and it was withheld when Hb was ≥14 g/dL. Primary endpoints were the proportion of complete correctors (i.e. patients reaching Hb ≥13 g/dL) before day 126 post-transplant and the median time to Hb correction in each arm. Data were analyzed following the intention-to-treat principle. The proportion of complete correctors by day 126 in each group was compared using Fisher's exact test, and median times to reach 13 g/dL were compared using the logrank test. Results: The proportion of complete correctors was 24% in group 1, 81% in group 2 (P<0.001 compared with group 1), and 92% in group 3 (P<0.001 compared to group 1, and P=0.099 compared to group 2). Median time to achieve Hb ≥13 g/dL was not reached in group 1, 42 days in group 2 (P<0.001 compared to group 1), and 32 days in group 3 (P<0.001 compared to group 1 and P=0.127 compared to group 2) (Fig 1A). Hb evolution in each group is shown in Fig 1B. Mean ± standard deviation total doses of darbepoetin alfa administered were 1,445 ± 489 μg in group 2 vs 1,272 ± 443 μg in group 3 (P=0.06). Ferritin levels at the end of the study were 477 ± 597, 393 ± 599 and 479 ± 488 in groups 1, 2 and 3, respectively (NS). Eight patients (2 in group 1, 4 in group 2, and 2 in group 3) required red blood cell transfusions on study, including 4 patients following early disease progression. There was no difference in rates of thrombo-embolic events or other complications among the groups. Quality-of-life data will be presented. Conclusions: This is the first prospective randomized trial demonstrating that darbepoetin alfa is safe and highly effective in ensuring full erythroid reconstitution after autologous HCT when started on day 28 post-transplant. I.v. iron sucrose tended (not statistically significantly) to further hasten erythroid recovery, with a lower dose of darbepoetin alfa required. Future studies in this setting should further investigate the impact i.v. iron may have on improving response in patients with low transferrin saturation. Figure 1.
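A sketch of the "median time to Hb ≥13 g/dL" endpoint described above: a Kaplan-Meier estimate for one arm, with patients who never reach the target treated as censored. The data are simulated around the reported figures (81% correctors, median 42 days in group 2); lifelines is an assumed tool choice:

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(6)
n = 52                                     # group 2 size, as reported
days_to_correction = rng.exponential(45, n)
corrected = rng.binomial(1, 0.81, n)       # 1 = reached Hb >= 13 g/dL, 0 = censored

kmf = KaplanMeierFitter()
kmf.fit(days_to_correction, event_observed=corrected, label="darbepoetin alfa")
print(kmf.median_survival_time_)           # cf. the reported 42 days
```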

