How did the COVID crisis affect use of neoadjuvant therapy for patients with breast cancer?

2021, Vol 39 (15_suppl), pp. e18708-e18708
Author(s): Anees B. Chagpar, Donald R. Lannin, Sarah Schellhorn Mougalian, Elizabeth Rapp Berger, Cary Philip Gross, ...

Background: The COVID-19 pandemic has caused shifts in cancer management, but their impact has not been well elucidated in a contemporary cohort of patients in clinical practice in the US. We hypothesized that closure of operating rooms would increase the use of neoadjuvant therapy (NT) during the early pandemic period. Methods: The nationwide Flatiron Health database is a longitudinal electronic health record (EHR)-derived database comprising de-identified, patient-level structured and unstructured data, curated via technology-enabled abstraction. These data originated from approximately 280 cancer clinics. We compared patients diagnosed with non-metastatic breast cancer during the early pandemic period (March 1 – June 30, 2020; group 1) with those diagnosed in the four-month period prior (November 1, 2019 – February 29, 2020; group 2) and those diagnosed during the same period one year earlier (March 1 – June 30, 2019; group 3). Results: There were 174 patients in group 1, 277 in group 2, and 348 in group 3. Overall, 591 (74.1%) were ER/PR+HER2-, 100 (12.6%) were HER2+, and 106 (13.3%) were triple negative (TN). Patients in the three groups were equally likely to be ER/PR+HER2- (75.3% vs. 72.2% vs. 74.9%, p = 0.68), HER2+ (12.1% vs. 14.9% vs. 11%, p = 0.33), or TN (12.6% vs. 12.7% vs. 14.2%, p = 0.83), and equally likely to be high risk by genomic testing (either Oncotype Dx or Mammaprint; p = 0.72). While neither clinical stage (p = 0.36) nor patient age at diagnosis (p = 0.76) differed across the three groups, patients diagnosed during the early pandemic (group 1) were more likely to receive NT than those diagnosed one year earlier (group 3): 28.7% vs. 16.4%, p < 0.01 (see table). The use of NT differed between the three groups in the ER/PR+HER2- (p < 0.01) and HER2+ patients (p = 0.05), but not in the TN patients (p = 0.61). There was no difference in the overall use of NT during the pandemic by geographic state (p = 0.32) or practice setting (p = 0.23); NT use was also similar by geographic state and practice setting within the ER/PR+HER2-, HER2+, and TN subsets. Conclusions: Despite clinicopathologic features similar to those of earlier time periods, use of NT increased during the early pandemic compared with the same period in the prior year. This was seen particularly in the ER/PR+HER2- group, suggesting increased use of neoadjuvant endocrine therapy. [Table: see text]
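To illustrate the kind of between-cohort comparison reported above, the sketch below runs a chi-square test on approximate counts back-calculated from the reported NT rates (28.7% of 174 in group 1 vs. 16.4% of 348 in group 3). The counts, and the choice of scipy's chi-square test, are assumptions for illustration only; the abstract does not state which test the authors used.

# Hedged sketch: compares neoadjuvant therapy (NT) use between the early-pandemic
# cohort (group 1) and the prior-year cohort (group 3). Counts are approximate,
# back-calculated from the reported percentages; the actual test is not specified.
from scipy.stats import chi2_contingency

nt_group1, n_group1 = 50, 174    # ~28.7% of 174 received NT (assumption)
nt_group3, n_group3 = 57, 348    # ~16.4% of 348 received NT (assumption)

table = [
    [nt_group1, n_group1 - nt_group1],   # group 1: NT yes / NT no
    [nt_group3, n_group3 - nt_group3],   # group 3: NT yes / NT no
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")  # p falls well below 0.01, consistent with the abstract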

2018, Vol 31 (Supplement_1), pp. 9-10
Author(s): Michal Lada, Christian Peyre, Joseph Wizorek, Thomas Watson, Jeffrey Peters, ...

Abstract Background Five-year survival after the surgical treatment of esophageal cancer has traditionally been reported to be as low as 15%. More recently, improved clinical staging with PET/CT and the introduction of neoadjuvant chemoradiation have each altered survival outcomes for patients with this lethal disease. The impact of these factors on survival trends has not been well described in the literature. The aim of this study was to analyze survival trends after esophagectomy for esophageal adenocarcinoma at a high-volume center. Methods The study population consisted of 471 consecutive patients undergoing esophagectomy for esophageal adenocarcinoma at a university-based medical center between January 1, 2000 and July 31, 2017. Clinical variables were collected and compared for three groups based on the date of esophagectomy (Group 1: 2000–2004, Group 2: 2005–2011, Group 3: 2012–2017). Survival was compared via the Kaplan-Meier (KM) method. Results The 471 patients had a median age of 64.0 years (range 27.0–86.2) and 395/471 (84%) were male. Dysphagia (282/471, 60%), heartburn (63/471, 13%) and chest pain (29/471, 6%) were the most common presenting symptoms. Transhiatal esophagectomy (n = 279, 59.1%) and en-bloc esophagectomy (n = 85, 18.0%) were the most common procedures. Staging with PET/CT was utilized in 316/471 patients (67%): 6% of Group 1, 76% of Group 2 and 100% of Group 3 (P < 0.001). Neoadjuvant therapy was utilized in 44% of patients (209/357): 0% of Group 1, 45% of Group 2 and 76% of Group 3 (P < 0.001). The median survival for the entire cohort was 30.0 months (range 0.3–208.0), with 5-year KM survival of 30% for Group 1, 43% for Group 2 and 47% for Group 3 (P < 0.001, Figure). When comparing Group 1 and Group 2, 10-year KM survival improved from 23% to 37% (P < 0.001). Conclusion This analysis reveals an improvement in 5-year survival after esophagectomy from 30% to 47% over the past two decades. Similarly, 10-year survival has improved from 23% to 37%. The evolution of better clinical staging and advances in neoadjuvant therapy likely played a vital role in these trends. In contrast to the earliest cohort, PET/CT is now routinely utilized in the staging of esophageal cancer. Further, all patients other than those with early-stage disease are currently evaluated for neoadjuvant chemoradiation. Notably, the 5-year survival rate for the most recent cohort (2012–2017) approaches 50% and would likely be higher if patients with esophageal adenocarcinoma treated endoscopically were included. Improvements in staging and treatment paradigms for esophageal adenocarcinoma have resulted in significant progress towards cure. Disclosure All authors have declared no conflicts of interest.
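As a hedged illustration of the survival comparison described above (Kaplan-Meier estimates across the three eras with a log-rank test), the sketch below uses the lifelines library. The DataFrame and its column names (time_months, event, era) are hypothetical; this is not the authors' analysis code.

# Hedged sketch: Kaplan-Meier curves and a log-rank test across three surgical eras.
# Column names (time_months, event, era) and the file name are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

df = pd.read_csv("esophagectomy_cohort.csv")  # hypothetical dataset, one row per patient

kmf = KaplanMeierFitter()
for era, sub in df.groupby("era"):            # era in {"2000-2004", "2005-2011", "2012-2017"}
    kmf.fit(sub["time_months"], sub["event"], label=era)
    print(era, "5-year survival:", kmf.predict(60.0))   # survival probability at 60 months

# Log-rank test across all three groups (analogous to the reported P < 0.001)
result = multivariate_logrank_test(df["time_months"], df["era"], df["event"])
print("log-rank p =", result.p_value)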


2017, Vol 39 (1), pp. 75-77
Author(s): R Liubota, V Cheshuk, R Vereshchako, O Zotov, V Zaichuk, ...

The aim of the study was to investigate the impact of locoregional treatment of the primary tumor (surgery and/or radiotherapy) on overall survival in patients with primary metastatic breast cancer (PMBC). Materials and Methods: This retrospective study included 295 women aged 23 to 76 years with PMBC. Among the 295 patients, the effect of locoregional treatment of the primary tumor on survival outcomes was evaluated in 177 women with distant metastases at diagnosis of breast cancer: 35 patients underwent breast surgery (group 1), 95 received radiotherapy (group 2), and 47 received a combination of breast surgery and radiation (group 3). The remaining 118 patients received neither surgery nor radiotherapy (group 4). All patients received systemic cytotoxic chemotherapy. Results: The groups of patients with PMBC did not differ significantly by age, menstrual function, ER status, HER2 status, site of metastasis, or number of metastatic lesions. Two- and 5-year overall survival was 54% and 32% in group 1, 47% and 8% in group 2, 73% and 18% in group 3, and 26% and 9% in group 4, respectively. Median survival was 36 months in patients who underwent surgery, 24 months in those who received radiotherapy, and 30 months in those who received combined breast surgery and radiation, versus 18 months in patients who did not undergo locoregional treatment of the primary tumor. Conclusions: The results of this study showed a favourable effect of locoregional treatment in patients with PMBC.


2020, Vol 22 (Supplement_3), pp. iii440-iii440
Author(s): Harriet Dulson, Rachel McAndrew, Mark Brougham

Abstract INTRODUCTION Children treated for CNS tumours experience a very high burden of adverse effects. Platinum-based chemotherapy and cranial radiotherapy can cause ototoxicity, which may be particularly problematic in patients who have impaired vision and cognition as a result of their tumour and associated treatment. This study assessed the prevalence of impaired hearing and vision and how this may impact upon education. METHODS 53 patients diagnosed with solid tumours in Edinburgh, UK, between August 2013 and 2018 were included in the study. Patients were split into three groups according to treatment received: Group 1: cisplatin-based chemotherapy and cranial radiotherapy; Group 2: platinum-based chemotherapy, no cranial radiotherapy; Group 3: benign brain tumours treated with surgery only. Data were collected retrospectively from patient notes. RESULTS Overall, 69.5% of those treated with platinum-based chemotherapy experienced ototoxicity as assessed by Brock grading, and 5.9% of patients had reduced visual acuity. Patients in Group 1 had the highest prevalence of both. 44.4% of patients in Group 1 needed increased educational support following treatment, either with extra support in the classroom or because they were unable to continue in mainstream school; 12.5% of Group 2 patients and 31.3% of Group 3 patients required such support. CONCLUSIONS Children with CNS tumours frequently require support for future education, but those treated with both platinum-based chemotherapy and cranial radiotherapy are at particular risk, which may be compounded by co-existent ototoxicity and visual impairment. It is essential to provide appropriate support for this patient cohort in order to maximise their educational potential.


2021, Vol 11 (1)
Author(s): Yu Liu, Jing Li, Wanyu Zhang, Yihong Guo

Abstract Oestradiol, an important hormone in follicular development and endometrial receptivity, is closely related to clinical outcomes of fresh in vitro fertilization-embryo transfer (IVF-ET) cycles. A supraphysiologic E2 level is inevitable during controlled ovarian hyperstimulation (COH), and its effect on the outcome of IVF-ET is controversial. The aim of this retrospective study was to evaluate the association between elevated serum oestradiol (E2) levels on the day of human chorionic gonadotrophin (hCG) administration and neonatal birthweight after IVF-ET cycles. The data of 3659 infertile patients with fresh IVF-ET cycles were analysed retrospectively between August 2009 and February 2017 at the First Hospital of Zhengzhou University. Patients were categorized by serum E2 levels on the day of hCG administration into six groups: group 1 (serum E2 levels ≤ 1000 pg/mL, n = 230), group 2 (serum E2 levels between 1001 and 2000 pg/mL, n = 524), group 3 (serum E2 levels between 2001 and 3000 pg/mL, n = 783), group 4 (serum E2 levels between 3001 and 4000 pg/mL, n = 721), group 5 (serum E2 levels between 4001 and 5000 pg/mL, n = 548), and group 6 (serum E2 levels > 5000 pg/mL, n = 852). Univariate linear regression was used to evaluate the independent correlation between each factor and the outcome index. Multiple logistic regression was used to adjust for confounding factors. The low birthweight (LBW) rates were 3.0% (group 1), 2.9% (group 2), 1.9% (group 3), 2.9% (group 4), 2.9% (group 5), and 2.0% (group 6) (P = 0.629). There were no statistically significant differences in the incidence of neonatal LBW among the six groups. We did not detect an association between peak serum E2 level during ovarian stimulation and neonatal birthweight after IVF-ET. The results of this retrospective cohort study showed that peak serum E2 levels during ovarian stimulation were not associated with birthweight during IVF cycles. In addition, no association was found between higher E2 levels and increased LBW risk. Our observations suggest that the hyper-oestrogenic milieu during COH does not seem to have adverse effects on the birthweight of offspring after IVF. Although this study provides a useful reference, obstetric-related factors could not be included because the data were historical. The impact of a high-oestrogen environment during COH on the birthweight of IVF offspring still requires further research.
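As a hedged sketch of the adjusted analysis described above (multiple logistic regression of low birthweight on E2 group with adjustment for confounders), the code below uses statsmodels. The DataFrame, the column names (lbw, e2_group, maternal_age, bmi, gestational_age), and the covariate set are hypothetical stand-ins, not the study's actual variables.

# Hedged sketch: multiple logistic regression of low birthweight (LBW) on E2 group,
# adjusting for possible confounders. All column names and the file are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ivf_cycles.csv")  # hypothetical per-cycle dataset

model = smf.logit(
    "lbw ~ C(e2_group, Treatment(reference=1)) + maternal_age + bmi + gestational_age",
    data=df,
).fit()

# Adjusted odds ratios with 95% confidence intervals for each E2 group vs. the reference
odds_ratios = np.exp(model.params)
conf_int = np.exp(model.conf_int())
print(pd.concat([odds_ratios, conf_int], axis=1).round(3))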


2020, Vol 79 (Suppl 1), pp. 340.2-341
Author(s): V. Orefice, F. Ceccarelli, C. Barbati, R. Lucchetti, G. Olivieri, ...

Background: Systemic lupus erythematosus (SLE) is an autoimmune disease mainly affecting women of childbearing age. The interplay between genetic and environmental factors may contribute to disease pathogenesis [1]. To date, no robust data are available on the possible contribution of diet to SLE. Caffeine, one of the most widely consumed products in the world, seems to interact with multiple components of the immune system by acting as a non-specific phosphodiesterase inhibitor [2]. In vitro, dose-dependent treatment with caffeine seems to down-regulate mRNA levels of key inflammation-related genes and similarly reduce levels of different pro-inflammatory cytokines [3]. Objectives: We evaluated the impact of caffeine consumption on SLE-related disease phenotype and activity, in terms of clinimetric assessment and cytokine levels. Methods: We performed a cross-sectional study, enrolling consecutive patients and recording their clinical and laboratory data. Disease activity was assessed by the SLE Disease Activity Index 2000 (SLEDAI-2K) [4]. Caffeine intake was evaluated by a 7-day food frequency questionnaire covering all the main sources of caffeine. As previously reported, patients were divided into four groups according to daily caffeine intake: <29.1 mg/day (group 1), 29.2-153.7 mg/day (group 2), 153.8-376.5 mg/day (group 3) and >376.6 mg/day (group 4) [5]. After the questionnaire was completed, blood samples were collected from each patient to assess cytokine levels, measured with a Bio-Plex assay panel for IL-6, IL-10, IL-17, IL-27, IFN-γ, IFN-α and BLyS. Results: We enrolled 89 SLE patients (F/M 87/2, median age 46 years, IQR 14; median disease duration 144 months, IQR 150). The median caffeine intake was 195 mg/day (IQR 160.5). At enrollment, 8 patients (8.9%) reported a caffeine intake <29.1 mg/day (group 1), 27 patients (30.3%) between 29.2 and 153.7 mg/day (group 2), 45 patients (51%) between 153.8 and 376.5 mg/day (group 3) and 9 patients (10.1%) >376.6 mg/day (group 4). A negative correlation between caffeine intake and disease activity, evaluated with SLEDAI-2K, was observed (p=0.01, r=-0.26). Comparing the four groups, a significantly higher prevalence of lupus nephritis, neuropsychiatric involvement, haematological manifestations, hypocomplementemia and anti-dsDNA positivity was observed in patients with lower caffeine intake (figure 1 A-E). Furthermore, patients with lower caffeine intake showed significantly more frequent use of glucocorticoids [group 4: 22.2%, versus group 1 (50.0%, p=0.0001), group 2 (55.5%, p=0.0001) and group 3 (40.0%, p=0.009)]. Turning to the cytokine analysis, a negative correlation between daily caffeine consumption and serum levels of IFN-γ was found (p=0.03, r=-0.2) (figure 2A); furthermore, patients with higher caffeine intake showed significantly lower levels of IFN-α (p=0.02, figure 2B), IL-17 (p=0.01, figure 2C) and IL-6 (p=0.003, figure 2D). Conclusion: This is the first report demonstrating an impact of caffeine on SLE disease activity, as shown by the inverse correlation between its intake and both SLEDAI-2K values and cytokine levels. Moreover, in our cohort, patients with lower caffeine consumption seemed to have a more severe disease phenotype, especially in terms of renal and neuropsychiatric involvement.
Our results suggest a possible immunoregulatory, dose-dependent effect of caffeine, mediated through modulation of serum cytokine levels, as already suggested by in vitro analysis. References: [1] Kaul et al. Nat Rev Dis Primers 2016; [2] Aronsen et al. Eur J Pharm 2014; [3] Iris et al. Clin Immunol 2018; [4] Gladman et al. J Rheumatol 2002; [5] Mikuls et al. Arthritis Rheum 2002. Disclosure of Interests: Valeria Orefice: None declared, Fulvia Ceccarelli: None declared, Cristiana Barbati: None declared, Ramona Lucchetti: None declared, Giulio Olivieri: None declared, Enrica Cipriano: None declared, Francesco Natalucci: None declared, Carlo Perricone: None declared, Francesca Romana Spinelli Grant/research support from: Pfizer, Consultant of: Novartis, Gilead, Lilly, Sanofi, Celgene, Speakers bureau: Lilly, Cristiano Alessandri Grant/research support from: Pfizer, Guido Valesini: None declared, Fabrizio Conti Speakers bureau: BMS, Lilly, Abbvie, Pfizer, Sanofi
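A minimal sketch of the correlation reported above (caffeine intake vs. SLEDAI-2K, r = -0.26, p = 0.01) is shown below. The abstract does not state whether a Pearson or Spearman coefficient was used, so Spearman's rho is assumed here, and the data columns are hypothetical.

# Hedged sketch: correlation between daily caffeine intake and SLEDAI-2K.
# Spearman's rho is an assumption (the abstract does not specify the method);
# 'caffeine_mg_day' and 'sledai_2k' are hypothetical columns.
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("sle_cohort.csv")  # hypothetical per-patient dataset (n = 89)

rho, p = spearmanr(df["caffeine_mg_day"], df["sledai_2k"])
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")  # the abstract reports r = -0.26, p = 0.01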


2020, Vol 28 (3), pp. 460-466
Author(s): Berkan Özpak

Background: In this study, we present one-year results of drug-eluting balloon treatment of femoropopliteal in-stent restenosis. Methods: A total of 62 patients (48 males, 14 females; mean age 64.2±9.1 years; range, 54 to 81 years) who underwent drug-eluting balloon treatment for femoropopliteal in-stent restenosis between August 2013 and October 2017 were included in the study. The patients were classified into three groups based on the length of narrowing within the stents: Group 1 (n=17), narrowing <1/2 of the stent length; Group 2 (n=22), narrowing >1/2 of the stent length but not totally occluded; and Group 3 (n=23), totally occluded. In-stent restenosis was treated with drug-eluting balloons. Results: There was a significant difference among the three groups in terms of in-stent restenosis. The length of stenosis was a predictor of in-stent restenosis. The mean stent length was 107.7±24.6 mm in Group 1, 164.6±17.9 mm in Group 2, and 180±19.3 mm in Group 3. For non-occluded in-stent restenosis, the restenosis rate at one year after balloon angioplasty was 47.1% in Group 1, 86.4% in Group 2, and 95.7% in Group 3. Femoropopliteal bypass was performed in five patients in whom treatment failed. None of the patients required amputation. Conclusion: The length of in-stent restenosis in femoropopliteal arterial stents is an important predictor of recurrent stenosis when reflow is achieved with drug-eluting balloons.


Author(s): Osman Erdogan, Alper Parlakgumus, Ugur Topal, Kemal Yener, Umit Turan, ...

Aims: Mucinous, medullary, and papillary carcinomas are rarely encountered types of breast cancer. This study aims to contribute to the literature by comparing the clinical and prognostic features and treatment alternatives of rare breast carcinomas. Study Design: Thirty-four patients with rare breast cancer out of a total of 1368 patients who underwent surgery for breast cancer in our clinic between January 2011 and December 2020 were included in the study. Methodology: The patients were assigned to three groups: medullary carcinoma (Group 1), mucinous carcinoma (Group 2) and papillary carcinoma (Group 3). Demographic and clinical features, treatment modalities used, surgical approaches, pathological features of the tumors and survival were compared between the groups. Results: Thirty-four patients were included in the study. The mean age of the patients in Group 3 was higher, though the difference was not statistically significant. Modified radical mastectomy was the most frequently performed procedure in all the groups. The number of lymph nodes removed through axillary dissection and the number of positive lymph nodes were similar across the groups. The tumors in all the groups were also of comparable sizes (30 mm in Group 1, 42.5 mm in Group 2 and 30 mm in Group 3; p = 0.464). Estrogen receptors were negative in a significantly higher proportion of Group 1 patients (66.7%, p < 0.001). A significantly higher proportion of Group 1 received postoperative chemotherapy (93.3%, p = 0.001), but the proportion receiving hormone therapy was significantly lower in this group (26.7%, p < 0.001). The patients with medullary cancer had significantly longer survival than those with mucinous cancer and those with papillary cancer (76.2 in Group 1, 54.5 in Group 2 and 58.4 in Group 3; p = 0.005). Conclusion: While the rare subtypes of breast carcinoma did not affect the choice of surgical treatment, the selection of oncological therapy depended on the hormone receptor status of these tumors. Long-term survival differed between the rare breast tumors. In view of the unique clinical pictures of these tumors, patients should be evaluated individually, and the evaluation should be guided by the evidence-based principles available for more common breast carcinomas.


Circulation, 2014, Vol 130 (suppl_2)
Author(s): Janet W Elcano, Hui Nam Pak

Background: The incidence of atrial fibrillation (AF) is increasing in the elderly population; however, there is a paucity of data on safety outcomes in this patient subgroup. We therefore sought to investigate the impact of age on the safety of catheter ablation for AF. Methods and Results: We included 1,293 patients (75% male) enrolled in the Yonsei AF Ablation Cohort database in Seoul, South Korea, from March 2009 to November 2013. We divided the patients into 4 groups according to age (Group 1: aged 17-49, N=295; Group 2: 50-59, N=421; Group 3: 60-69, N=408; Group 4: ≥70, N=169) and evaluated the incidence of procedure-related complications. No procedure-related death occurred in this study. There was a trend toward an increasing incidence of procedure-related complications with age: Group 1 = 3.7%; Group 2 = 4.0%; Group 3 = 6.6%; Group 4 = 7.1% (p = 0.15). There were 28 cases (2.2%) of major complications (Group 1 = 1.7%, Group 2 = 1.9%, Group 3 = 2%, Group 4 = 4.1%), tamponade being the most common. Major complications in Group 4 included tamponade (4 cases), phrenic nerve palsy (1), atrioesophageal fistula (1) and third-degree AV block (1). Multivariate regression analysis showed that ablation time (odds ratio [OR] 1.2, confidence interval [CI] 1.0-1.017, p=0.017), procedure time (OR 1.008, CI 1.0-1.15, p=0.04), decreasing eGFR (OR 1.013, CI 1.002-1.026, p=0.018), coronary artery disease (OR 1.847, CI 1.003-3.524, p=0.04) and age (OR 1.028, CI 1.003-1.055, p=0.03) were associated with an increased adjusted risk of total complications. Predictors of major complications included age (OR 1.044, CI 1.003-1.086, p=0.02) and ablation time (OR 1.009, CI 0.999-1.000, p=0.033). Conclusion: Our data suggest that the incidence of procedural complications after radiofrequency ablation of AF increases with age. Ablation time and age are independent predictors of a major complication.
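As a hedged sketch of how adjusted odds ratios like those reported above can be obtained, the code below fits a multivariable logistic regression with statsmodels and exponentiates the coefficients. The DataFrame, the column names, and the exact covariate set are hypothetical approximations of what the abstract describes, not the authors' actual model.

# Hedged sketch: multivariable logistic regression for procedure-related complications.
# 'df' and all column names are hypothetical; the covariates only approximate those
# reported in the abstract (ablation time, procedure time, eGFR, CAD, age).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("af_ablation_cohort.csv")  # hypothetical per-patient dataset (n = 1293)

model = smf.logit(
    "complication ~ ablation_time + procedure_time + egfr + cad + age",
    data=df,
).fit()

# Adjusted odds ratios with 95% confidence intervals and p-values
summary = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI_lower": np.exp(model.conf_int()[0]),
    "CI_upper": np.exp(model.conf_int()[1]),
    "p": model.pvalues,
})
print(summary.round(3))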


Phlebologie, 2015, Vol 44 (01), pp. 13-17
Author(s): J. Duben, J. Gatek, T. Saha, G. Hnatkova, L. Hnatek

Summary Introduction: In recent years, many endovascular techniques have been developed to eliminate reflux not only in stem veins but also in perforating veins and their tributaries. Aim: The aim of this study was to use endovascular RFITT and foam sclerotherapy to occlude perforating veins, as the prime source of reflux, and their tributaries. Material and Methods: The Celon method was used for the thermal treatment. Polidocanol at concentrations of 1% and 2%, prepared with the DSS technique, was used for the foam sclerotherapy. RFITT was performed on 127 perforating veins in total. This group was divided into three subgroups: in the first, only RFITT was carried out (n=41); in the second, RFITT was combined with sclerotherapy in one session (n=48); in the third, RFITT was completed with sclerotherapy one month after the RFITT intervention (n=38). The control group included perforating veins treated with sclerotherapy only (n=81). A power setting of 6 W was used on the generator with the CelonProSurge micro probe and 18 W with the Celon ProCurve probe. Results: At one-year follow-up, the effectiveness of the procedure was 8.8% in group 1, 93.7% in group 2, 92.1% in group 3 and 76.5% in the control group. There was no significant difference in effectiveness between groups 1, 2 and 3. A marginal difference was seen between the three RFITT groups and the control group. Significant differences were found in the extinction of visible varicose veins with reflux from perforators: extinction was faster in group 3 than in group 2 and the control group, and slowest in group 1. A significant difference was observed between groups 2 and 3 compared with group 1, and a marginal difference between groups 2 and 3 compared with the control group. No significant difference was observed between group 1 and the control group. Conclusions: All procedures are effective. The most important is the combination of RFITT and sclerotherapy one month after the thermal intervention, which is associated with a low risk of recanalization and the fastest extinction of visible varicose veins.


Author(s): Kirsten E Lyke, Alexandra Singer, Andrea A Berry, Sharina Reyes, Sumana Chakravarty, ...

Abstract Background A live-attenuated Plasmodium falciparum (Pf) sporozoite (SPZ) vaccine (PfSPZ Vaccine) has shown up to 100% protection against controlled human malaria infection (CHMI) using homologous parasites (same Pf strain as in the vaccine). Using a more stringent CHMI with heterologous parasites (different Pf strain), we assessed the impact of higher PfSPZ doses, a novel multi-dose prime regimen, and a delayed vaccine boost upon vaccine efficacy (VE). Methods Four groups of 15 healthy, malaria-naïve adults were immunized. Group (Grp) 1 received five doses of 4.5×10^5 PfSPZ (days 1, 3, 5, 7; week 16). Grps 2, 3 and 4 received three doses (weeks 0, 8, 16), with Grp 2 receiving 9.0×10^5/dose, Grp 3 receiving 18.0×10^5/dose, and Grp 4 receiving 27.0×10^5 for dose 1 and 9.0×10^5 for doses 2 and 3. VE was assessed by heterologous CHMI after 12 or 24 weeks. Volunteers not protected at 12 weeks were boosted prior to repeat CHMI at 24 weeks. Results At 12-week CHMI, 6/15 (40%) in Group 1 (P=0.04) and 3/15 (20%) in Group 2 vs. 0/8 controls remained aparasitemic. At 24-week CHMI, 3/13 (23%) in Group 3 and 3/14 (21%) in Group 4 vs. 0/8 controls remained aparasitemic (Groups 2-4, VE not significant). Post-boost, 9/14 (64%) vs. 0/8 controls remained aparasitemic (3/6 in Group 1, P=0.025; 6/8 in Group 2, P=0.002). Conclusions Four stacked priming injections (multi-dose priming) showed 40% VE against heterologous CHMI, while dose escalation of PfSPZ using single-dose priming was not significantly protective. Boosting unprotected subjects improved VE at 24 weeks to 64%.
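To illustrate the kind of small-sample efficacy comparison reported above (e.g., 6/15 protected in Group 1 vs. 0/8 controls at the 12-week CHMI), the sketch below applies a Fisher's exact test. The abstract does not name the statistical test used, so this choice, and whether a one- or two-sided test was applied, are assumptions for illustration.

# Hedged sketch: protection (remaining aparasitemic) in Group 1 vs. controls at 12-week CHMI.
# The abstract does not specify the test; Fisher's exact test is assumed here.
from scipy.stats import fisher_exact

#           protected, infected
group1   = [6, 9]   # 6/15 aparasitemic in Group 1
controls = [0, 8]   # 0/8 aparasitemic in controls

odds_ratio, p_two_sided = fisher_exact([group1, controls])
_, p_one_sided = fisher_exact([group1, controls], alternative="greater")
print(f"two-sided p = {p_two_sided:.3f}, one-sided p = {p_one_sided:.3f}")
# The abstract reports P = 0.04; the exact value depends on the test and sidedness chosen.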

