588. Seroconversion Among Adults After Receiving At Least One Dose of a COVID-19 Vaccine: COVID-19 Community Research Partnership, Mid-Atlantic, Southeast and Southern United States, December 2020-May 2021

2021 ◽  
Vol 8 (Supplement_1) ◽  
pp. S396-S397
Author(s):  
DeAnna J Friedman-Klabanoff ◽  
Ashley Tjaden ◽  
Michele Santacatterina ◽  
Iqra Munawar ◽  
John W Sanders ◽  
...  

Abstract Background Well-regulated clinical trials have shown authorized COVID-19 vaccines to be immunogenic and highly efficacious. Information about antibody responses after vaccination in real-world settings is needed. Methods We evaluated seroconversion rates in adults reporting ≥ 1 dose of an authorized COVID-19 vaccine in a U.S. multistate longitudinal cohort study, the COVID-19 Community Research Partnership. Participants were recruited through 12 participating healthcare systems and community outreach. Participants had periodic home-based serologic testing using either a SARS-CoV-2 nucleocapsid and spike IgM/IgG lateral flow assay (63% of participants) or a SARS-CoV-2 spike IgG enzyme-linked immunosorbent assay (37% of participants). The timing and number of tests before and after vaccination varied based on participant time in study. Participants were included if they were seronegative on their last test before vaccination and had ≥ 1 test result after vaccination (some had previously been seropositive but seroreverted). A weighted Cox regression model with right censoring was used to obtain adjusted hazard ratios for sex, age, race/ethnicity, and prior seropositivity. Time-to-event (seroconversion) was defined as time to first positive test > 4 days after vaccination; participants were censored at the date of their last available test result. Results 13,459 participants were included and 11,722 seroconverted (Table). Median time in study was 272 days (range 31–395). Median follow-up time from vaccine to last available test was 56 days (range 1–147). Participants had a median of 3 tests (range 1–12) before and 2 tests (range 1–8) after vaccination. Based on the Kaplan-Meier method, median time to seroconversion after first COVID-19 vaccination was 35 days (interquartile range: 25–45). Likelihood of seroconversion decreased with older age (Table). Female participants, non-Hispanic Black participants, and participants who were previously seropositive were more likely to seroconvert (Table). Conclusion All subgroups had high rates of seroconversion, with some small differences in likelihood of seroconversion between subgroups. These data demonstrate the excellent immunogenicity of COVID-19 vaccines in real-world settings in the US. Disclosures All Authors: No reported disclosures
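
As a rough illustration of the analysis described above (not the authors' code), the Kaplan-Meier median time to seroconversion and the weighted Cox model with right censoring could be fitted as follows in Python with pandas and lifelines; the file name and column names are hypothetical.

```python
# Sketch of the seroconversion analysis, assuming a participant-level table
# with hypothetical column names. Requires: pandas, lifelines.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Hypothetical input: one row per participant.
#   days_to_event    days from first vaccine dose to first positive test
#                    (>4 days post-vaccination) or to last available test
#   seroconverted    1 = seroconverted, 0 = right-censored at last test
#   weight           study sampling weight (assumed column)
#   age_group, sex, race_ethnicity, prior_seropositive  covariates
df = pd.read_csv("participants.csv")

# Kaplan-Meier median time to seroconversion, as in the Results.
kmf = KaplanMeierFitter()
kmf.fit(df["days_to_event"], event_observed=df["seroconverted"])
print("KM median time to seroconversion (days):", kmf.median_survival_time_)

# Weighted Cox model with right censoring for adjusted hazard ratios.
cph = CoxPHFitter()
cph.fit(
    df,
    duration_col="days_to_event",
    event_col="seroconverted",
    weights_col="weight",
    robust=True,  # recommended by lifelines when weights are non-integer
    formula="age_group + sex + race_ethnicity + prior_seropositive",
)
cph.print_summary()  # exp(coef) gives adjusted hazard ratios with 95% CIs
```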

Angiology ◽  
2021 ◽  
pp. 000331972110043
Author(s):  
Clemens Höbaus ◽  
Gerfried Pesau ◽  
Bernhard Zierfuss ◽  
Renate Koppensteiner ◽  
Gerit-Holger Schernthaner

We evaluated angiogenin as a prospective biomarker in peripheral artery disease (PAD) patients with and without claudication symptoms. A pilot study suggested an elevation of angiogenin in critical limb ischemia. However, in PAD patients, the predictive value of angiogenin has not yet been evaluated. For this purpose, 342 patients with PAD (age: 69 ± 10 years, 34.5% women) were followed up for 7 years. Angiogenin was measured by enzyme-linked immunosorbent assay. All-cause and cardiovascular mortality were analyzed by Cox regression. Angiogenin levels were higher in men (P = .001) and were associated with patient waist-to-hip ratio (P < .001) and fasting triglycerides (P = .011), and inversely with estimated glomerular filtration rate (P = .009). However, angiogenin showed no association with age, characteristics of diabetes, markers of lipid metabolism, or C-reactive protein. Angiogenin did not correlate with markers of angiogenesis such as vascular endothelial growth factor, angiopoietin-2, or tie-2. Furthermore, angiogenin was not associated with PAD Fontaine stages or ankle-brachial index, nor with all-cause mortality (hazard ratio [HR] = 1.09 [95% CI: 0.89-1.34]) or cardiovascular mortality (HR = 1.05 [0.82-1.35]). These results suggest that angiogenin does not provide further information regarding outcome prediction in patients with PAD.
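
A minimal sketch of the mortality analysis described above, assuming Python with lifelines; the column names are hypothetical, and the hazard ratio is expressed per 1-SD increase in angiogenin, which is an assumption about scaling rather than the authors' specification.

```python
# Cox regression of all-cause mortality on angiogenin (sketch, not the
# authors' code). Requires: pandas, lifelines.
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical columns: followup_years, died (1/0), angiogenin (assay units)
df = pd.read_csv("pad_cohort.csv")

# Standardize so exp(coef) reads as HR per 1-SD increase (an assumption).
df["angiogenin_z"] = (df["angiogenin"] - df["angiogenin"].mean()) / df["angiogenin"].std()

cph = CoxPHFitter()
cph.fit(df[["followup_years", "died", "angiogenin_z"]],
        duration_col="followup_years", event_col="died")
cph.print_summary()  # exp(coef) is the HR per 1-SD increase, with 95% CI
```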


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Xiaoxi Yu ◽  
Xin Zhang ◽  
Zhichao Lai ◽  
Jiang Shao ◽  
Rong Zeng ◽  
...  

Abstract Background Drug-coated balloons (DCBs) have shown superiority in the endovascular treatment of short femoropopliteal artery disease. Few studies have focused on outcomes in long lesions. This study aimed to evaluate the safety and effectiveness of Orchid® DCBs in long lesions over 1 year of follow-up. Methods This was a multicentre, real-world cohort study. The patients had femoropopliteal artery lesions ≥ 150 mm in length and were revascularized with DCBs. The primary endpoints were primary patency and freedom from clinically driven target lesion revascularization (TLR) at 12 months, and major adverse events (all-cause death and major target limb amputation). The secondary endpoints were the changes in Rutherford classification and the ankle-brachial index (ABI). Results One hundred fifteen lesions in 109 patients (mean age 67 ± 11 years, 71.6% male) were included in this study. The mean lesion length was 252.3 ± 55.4 mm, and 78.3% of the lesions were chronic total occlusions (CTO). Primary patency by Kaplan–Meier estimation was 98.1% at 6 months and 82.1% at 12 months. The rate of freedom from TLR by Kaplan–Meier estimation was 88.4% through 12 months. There were no procedure- or device-related deaths through 12 months. The rate of all-cause death was 2.8%. Cox regression analysis suggested that renal failure and critical limb ischaemia (CLI) were statistically significant predictors of the primary patency endpoint. Conclusion In our real-world study, DCBs were safe and effective when used in long femoropopliteal lesions, and the primary patency rate at 12 months by Kaplan–Meier estimation was 82.1%.
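
The Kaplan–Meier patency estimates quoted above could be reproduced along these lines; this is a sketch with hypothetical column names, not the study's code.

```python
# Kaplan-Meier estimate of primary patency at fixed time points (sketch).
# Requires: pandas, lifelines.
import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical columns:
#   months_to_event  months from index procedure to loss of primary patency
#                    or to last follow-up
#   patency_lost     1 = loss of primary patency, 0 = censored
df = pd.read_csv("lesions.csv")

kmf = KaplanMeierFitter()
kmf.fit(df["months_to_event"], event_observed=df["patency_lost"])

# Estimated primary patency (survival probability) at 6 and 12 months
print(kmf.survival_function_at_times([6, 12]))
```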


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Nervana Elbakary ◽  
Sami Ouanes ◽  
Sadaf Riaz ◽  
Oraib Abdallah ◽  
Islam Mahran ◽  
...  

Abstract Background Major Depressive Disorder (MDD) requires therapeutic intervention within the first month after diagnosis to achieve better disease outcomes. International guidelines recommend a duration of 4–12 weeks for an initial antidepressant (IAD) trial at an optimized dose to achieve a response. If depressive symptoms persist after this duration, guidelines recommend switching, augmenting, or combining strategies as the next step. Premature discontinuation of the IAD due to ineffectiveness can cause unfavorable consequences. As the primary outcome, we aimed to determine the prevalence and patterns of the strategies applied after an IAD was changed because of a suboptimal response. Secondary outcomes included the median survival time on the IAD before any change and the predictors associated with IAD change. Methods This was a retrospective study conducted in the Mental Health Services in Qatar. A dataset covering January 1, 2018, to December 31, 2019, was extracted from the electronic health records. Inclusion and exclusion criteria were defined and applied. The sample size was calculated to be at least 379 patients. Descriptive statistics were reported as frequencies and percentages, in addition to means and standard deviations. The median time on the IAD before any change strategy was calculated using survival analysis. Associated predictors were examined using several Cox regression models. Results A total of 487 patients met the inclusion criteria of the study; 431 (88%) of them had their IAD changed to another strategy before the end of the study. Almost half of the sample (212 (49%); 95% CI [44–53%]) had their IAD changed within 30 days or less. The median time to IAD change was 43 days (95% CI [33.2–52.7]). The factors statistically associated with a higher hazard of IAD change were younger age, failure to optimize the IAD dose before any change, and comorbid anxiety. Conclusions Because almost half of the patients in this study changed their IAD as early as within the first month, efforts to avoid treatment failure are needed to ensure patient-treatment targets are met. Our findings offer some clues to help clinicians identify the high-risk predictors of short survival on, and subsequent failure of, the IAD.
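
A minimal sketch of the time-to-change analysis described in the Methods, assuming Python with lifelines; the column names are hypothetical, and the covariates shown are only those reported as significant in the Results.

```python
# Median time to IAD change (with 95% CI) and Cox regression of predictors
# (sketch, not the authors' code). Requires: pandas, lifelines.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.utils import median_survival_times

# Hypothetical columns:
#   days_on_iad   days from IAD start to change of strategy or censoring
#   changed       1 = IAD changed (switch/augment/combine), 0 = censored
#   age, dose_optimized (1/0), comorbid_anxiety (1/0)
df = pd.read_csv("mdd_cohort.csv")

kmf = KaplanMeierFitter()
kmf.fit(df["days_on_iad"], event_observed=df["changed"])
print("Median time to IAD change (days):", kmf.median_survival_time_)
print(median_survival_times(kmf.confidence_interval_))  # 95% CI of the median

cph = CoxPHFitter()
cph.fit(df[["days_on_iad", "changed", "age", "dose_optimized", "comorbid_anxiety"]],
        duration_col="days_on_iad", event_col="changed")
cph.print_summary()  # hazard ratios for age, dose optimization, anxiety
```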


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Geng-He Chang ◽  
Fong-Fu Chou ◽  
Ming-Shao Tsai ◽  
Yao-Te Tsai ◽  
Ming-Yu Yang ◽  
...  

Abstract Patients with end-stage renal disease (ESRD) may demonstrate secondary hyperparathyroidism (SHPT), characterized by parathyroid hormone oversecretion in response to electrolyte imbalance (e.g., hypocalcemia and hyperphosphatemia). Moreover, this electrolyte imbalance may affect vocal cord muscle contraction and lead to voice change. Here, we explored the effects of SHPT on the voices of patients with ESRD. We used data from 147,026 patients with ESRD in the registry for catastrophic illness patients, a sub-database of the Taiwan National Health Insurance Research Database. We divided these patients into 2 groups based on whether they had hyperparathyroidism (HPT) and compared the incidence of vocal dysfunction (VD) between them. We also prospectively included 60 ESRD patients with SHPT, 45 of whom underwent parathyroidectomy. Voice analysis was used to investigate changes in vocal parameters preoperatively and postoperatively. In the real-world database analysis, the presence of HPT significantly increased VD incidence in patients with ESRD (p = 0.003): Cox regression analysis indicated that ESRD patients with HPT had an approximately 1.6-fold increased VD risk (p = 0.003). In the clinical analysis, the "jitter" and "shimmer" factors improved significantly after the operation, whereas the aerodynamic factors remained unchanged. In conclusion, SHPT was an independent risk factor for VD in patients with ESRD, mainly affecting their acoustic factors.
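
The abstract does not name the statistical test used for the pre- versus post-operative voice comparison; a paired Wilcoxon signed-rank test is one common choice and is sketched below with hypothetical column names (not the authors' code).

```python
# Paired pre/post comparison of acoustic voice parameters (sketch).
# Requires: pandas, scipy.
import pandas as pd
from scipy.stats import wilcoxon

# Hypothetical columns, one row per operated patient:
#   jitter_pre, jitter_post, shimmer_pre, shimmer_post
df = pd.read_csv("voice_parameters.csv")

for param in ["jitter", "shimmer"]:
    stat, p = wilcoxon(df[f"{param}_pre"], df[f"{param}_post"])
    print(f"{param}: W = {stat:.1f}, p = {p:.4f}")
```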


Circulation ◽  
2012 ◽  
Vol 125 (suppl_10) ◽  
Author(s):  
Amitava Banerjee ◽  
Sophie Taillandier ◽  
Jonas B Olesen ◽  
Deirdre A Lane ◽  
Benedicte Lallemand ◽  
...  

Background: The risk of stroke and thromboembolism (TE) in patients with non-valvular atrial fibrillation (NVAF) can be estimated using commonly used stroke risk stratification scores. The role of the pattern of atrial fibrillation in risk prediction is unclear in contemporary ‘real world’ cohorts. Methods: Patients diagnosed with NVAF in a four-hospital institution between 2000 and 2010 were identified and included. Event rates of stroke/TE were calculated according to pattern of AF, i.e. paroxysmal, persistent and permanent, defined by consensus guidelines. Independent risk factors for stroke/TE were investigated by Cox regression. Results: Among 7156 patients with NVAF, 4176 (58.4%) patients with paroxysmal, 376 (5.3%) with persistent and 2604 (36.3%) with permanent NVAF patterns were included. In non-anticoagulated patients, the overall stroke/TE event rate per 100 person-years was 1.29 (95% CI 1.13–1.47). Paroxysmal NVAF patients were more likely to be female (p<0.001). Persistent NVAF patients were less likely to have a prior history of stroke (p=0.002) and vascular disease (p<0.001), and more likely to have hypertension (p<0.001) and vitamin K antagonist therapy (p<0.001). Permanent NVAF patients were more likely to have diabetes (p<0.001) and heart failure therapy (p<0.001) and less likely to have dyslipidaemia (p<0.001). Compared with paroxysmal NVAF, rates of stroke/TE (p=0.001), bleeding (p<0.001) and all-cause mortality (p<0.001) were significantly higher in permanent NVAF patients but not in persistent NVAF patients. In multivariate analyses, only previous stroke (hazard ratio, HR 2.58, 95% CI 2.08–3.21), vascular disease (HR 1.34, 1.12–1.61), heart failure (HR 1.20, 1.00–1.44), age ≥75 years (HR 2.75, 2.16–3.50) and age 65–74 years (HR 1.60, 1.22–2.09) increased stroke/TE risk; persistent (HR 1.13, 0.76–1.70) and permanent (HR 1.44, 0.96–2.16) patterns of NVAF did not. Conclusion: In this large ‘real world’ cohort of NVAF patients, there were significant differences in rates of stroke, TE, death and bleeding between patterns of NVAF; however, only previous stroke, age, heart failure and vascular disease (not pattern of NVAF) independently increased the risk of stroke/TE, death and bleeding in multivariate analyses. Therefore, the risk of stroke is similar across all patterns of NVAF, and antithrombotic therapy should be based on clinical risk factors, not NVAF pattern.
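
A minimal sketch of how the crude stroke/TE event rate per 100 person-years quoted above can be computed, with an exact Poisson confidence interval assumed for illustration; the column names are hypothetical and this is not the study's code.

```python
# Crude event rate per 100 person-years with an exact (Garwood) Poisson CI.
# Requires: pandas, scipy.
import pandas as pd
from scipy.stats import chi2

# Hypothetical columns (non-anticoagulated patients only):
#   followup_years   person-time contributed by each patient
#   stroke_te        1 = stroke/thromboembolism during follow-up, 0 = none
df = pd.read_csv("nvaf_cohort.csv")

events = int(df["stroke_te"].sum())
person_years = df["followup_years"].sum()
rate = 100 * events / person_years

# Exact 95% CI for a Poisson count, scaled to per 100 person-years
lo = 0.0 if events == 0 else 100 * (chi2.ppf(0.025, 2 * events) / 2) / person_years
hi = 100 * (chi2.ppf(0.975, 2 * (events + 1)) / 2) / person_years
print(f"Stroke/TE rate: {rate:.2f} per 100 person-years (95% CI {lo:.2f}-{hi:.2f})")
```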


Author(s):  
Tirsa Verani ◽  
 Kanadi Sumapradja

Objective: To assess the estrone (E1), estradiol (E2) and estriol (E3) blood levels and their ratios (E2:E1, E2:E3 and E1:E3) in women with and without endometriosis. Method: We performed an analytical cross-sectional study of 27 women with endometriosis and 27 women without endometriosis who met the inclusion criteria. The samples were recruited in Dr. Cipto Mangunkusumo Hospital and other satellite hospitals from October 2012 to April 2013. The blood levels of estrogen metabolites were examined by enzyme-linked immunosorbent assay (ELISA). Comparison between the two groups was analyzed using the Mann-Whitney test. Result: The level of estrone was lower in the endometriosis group than in the control group (54.66 pg/ml vs 73.52 pg/ml, p=0.229). Similarly, the levels of estradiol and estriol were lower in the endometriosis group (29 pg/ml vs 35 pg/ml, p=0.815 and 1.11 pg/ml vs 1.67 pg/ml, p=0.095, respectively). The E2:E1 ratio was higher in the endometriosis group (0.51 vs 0.38, p=0.164), as were the E2:E3 ratio (26.53 vs 21.11, p=0.223) and the E1:E3 ratio (58.55 vs 50.28, p=0.684). However, none of these differences was statistically significant. Conclusion: The estrone, estradiol and estriol levels in women with endometriosis were lower than those in women without endometriosis. The E2:E1, E2:E3 and E1:E3 ratios were higher in the endometriosis group. However, none of these differences reached statistical significance. [Indones J Obstet Gynecol 2014; 3: 127-131] Keywords: endometriosis, estradiol, estriol, estrogen, estrone
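
A minimal sketch of the Mann-Whitney comparison described above, with hypothetical column names (not the authors' code).

```python
# Between-group comparison of estrogen metabolites and their ratios (sketch).
# Requires: pandas, scipy.
import pandas as pd
from scipy.stats import mannwhitneyu

# Hypothetical columns: group ("endometriosis"/"control"), e1, e2, e3 (pg/mL)
df = pd.read_csv("estrogen_levels.csv")
df["e2_e1"] = df["e2"] / df["e1"]
df["e2_e3"] = df["e2"] / df["e3"]
df["e1_e3"] = df["e1"] / df["e3"]

cases = df[df["group"] == "endometriosis"]
controls = df[df["group"] == "control"]

for var in ["e1", "e2", "e3", "e2_e1", "e2_e3", "e1_e3"]:
    stat, p = mannwhitneyu(cases[var], controls[var], alternative="two-sided")
    print(f"{var}: U = {stat:.1f}, p = {p:.3f}")
```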


2021 ◽  
Vol 12 ◽  
Author(s):  
Bin Lou ◽  
Guanghua Ma ◽  
Feifei LV ◽  
Quan Yuan ◽  
Fanjie Xu ◽  
...  

Objective Hepatitis B virus (HBV) reinfection is a serious complication that arises in patients who undergo HBV-related liver transplantation. We aimed to use biomarkers to evaluate HBV reinfection in patients after orthotopic liver transplantation. Methods Seventy-nine patients who underwent liver transplantation between 2009 and 2015 were enrolled, and levels of biomarkers were analyzed at different time points. Cox regression and receiver operating characteristic (ROC) curves of different markers at baseline were used to analyze sustained hepatitis B surface antigen (HBsAg) loss. The Kaplan-Meier method was used to compare the levels of the biomarkers. Results Among the 79 patients, 42 sustained HBsAg loss with a median time of 65.2 months (12.0-114.5, IQR 19.5) after liver transplantation and 37 patients exhibited HBsAg recurrence with a median time of 8.8 months (0.47-59.53, IQR 19.47). In the ROC curve analysis at baseline, 4.25 log10 IU/mL quantitative antibody against hepatitis B core antigen (qHBcAb) and 2.82 log10 IU/mL quantitative HBsAg (qHBsAg) showed the maximum Youden's index values, with areas under the curve (AUCs) of 0.685 and 0.651, respectively. The Kaplan-Meier method indicated that the qHBsAg and qHBcAb levels in the two groups were significantly different (p = 0.031 and 0.006, respectively). Furthermore, the Cox regression model confirmed the predictive ability of qHBcAb at baseline (AUC = 0.685). Conclusion Lower pretransplantation qHBcAb is associated with HBV reinfection. The baseline concentration of qHBcAb is a promising predictor for the recurrence of HBV in patients undergoing liver transplantation and can be used to guide antiviral treatment for HBV infection.
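
A minimal sketch of the baseline ROC analysis with Youden's index described above, assuming Python with scikit-learn; the column names are hypothetical and this is not the authors' code.

```python
# ROC curve and Youden's index cutoff for baseline qHBcAb (sketch).
# Requires: pandas, scikit-learn.
import pandas as pd
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical columns:
#   qhbcab_log10     baseline qHBcAb, log10 IU/mL
#   sustained_loss   1 = sustained HBsAg loss, 0 = HBsAg recurrence
df = pd.read_csv("transplant_cohort.csv")

y = df["sustained_loss"]
score = df["qhbcab_log10"]  # higher qHBcAb is expected with sustained loss

fpr, tpr, thresholds = roc_curve(y, score)
youden_j = tpr - fpr
best = youden_j.argmax()
print("AUC:", roc_auc_score(y, score))
print("Cutoff maximizing Youden's J (log10 IU/mL):", thresholds[best])
```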


2021 ◽  
Author(s):  
Ru Liu ◽  
Tianyu Li ◽  
Deshan Yuan ◽  
Yan Chen ◽  
Xiaofang Tang ◽  
...  

Abstract Objectives: This study analyzed the association between on-treatment platelet reactivity and long-term outcomes of patients with acute coronary syndrome (ACS) and thrombocytopenia (TP) in the real world. Methods: A total of 10,724 consecutive cases with coronary artery disease who underwent percutaneous coronary intervention (PCI) were collected from January to December 2013. Cases with ACS and TP under dual antiplatelet therapy were enrolled from the total cohort. Five-year clinical outcomes were evaluated among cases with high on-treatment platelet reactivity (HTPR), low on-treatment platelet reactivity (LTPR) and normal on-treatment platelet reactivity (NTPR), tested by thromboelastogram (TEG) at baseline. Results: Cases with HTPR, LTPR and NTPR accounted for 26.2%, 34.4% and 39.5%, respectively. Compared with the other two groups, cases with HTPR had the highest proportion of males, the lowest hemoglobin level, the highest erythrocyte sedimentation rate and the most left main (LM) or three-vessel disease. The rates of 5-year all-cause death, major adverse cardiovascular and cerebrovascular events (MACCE), cardiac death, myocardial infarction (MI), revascularization, stroke and bleeding did not differ significantly among the three groups. Multivariable Cox regression indicated that, compared with cases with NTPR, neither cases with HTPR nor cases with LTPR were independently associated with any of the endpoints (all P>0.05). Conclusions: In patients with ACS and TP undergoing PCI, 5-year all-cause death, MACCE, MI, revascularization, stroke and bleeding risk were all similar between cases with HTPR and cases with NTPR, as assessed by TEG at baseline, in the real world. The same held for the comparison between cases with LTPR and NTPR.
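
A minimal sketch of how baseline TEG results might be grouped into HTPR, LTPR and NTPR. The cutoffs for ADP-induced clot strength (MA-ADP) below are illustrative assumptions, not the thresholds used in this study, and the column names are hypothetical.

```python
# Classify on-treatment platelet reactivity from baseline TEG (sketch).
# Requires: pandas.
import pandas as pd

HTPR_CUTOFF_MM = 47  # assumed: MA-ADP above this -> high reactivity
LTPR_CUTOFF_MM = 31  # assumed: MA-ADP below this -> low reactivity

def classify_reactivity(ma_adp_mm: float) -> str:
    """Classify on-treatment platelet reactivity from TEG MA-ADP (mm)."""
    if ma_adp_mm > HTPR_CUTOFF_MM:
        return "HTPR"
    if ma_adp_mm < LTPR_CUTOFF_MM:
        return "LTPR"
    return "NTPR"

# Hypothetical column: ma_adp (mm), one row per case
df = pd.read_csv("acs_tp_cohort.csv")
df["reactivity_group"] = df["ma_adp"].apply(classify_reactivity)
print(df["reactivity_group"].value_counts(normalize=True))
```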

