A New Model for the Prediction of Preeclampsia in Twin Pregnancy: A Retrospective Cohort Study

Author(s):  
Qing Han ◽  
Shuisen Zheng ◽  
Rongxin Chen ◽  
Huale Zhang ◽  
Jianying Yan

Abstract Objective: To develop an effective nomogram model with which to predict the risk of preeclampsia in twin pregnancies. Material and Methods: The study was a retrospective cohort study of women pregnant with twins who attended antenatal care and labored between January 2015 and December 2020 at the Fujian Maternity and Child Health Hospital, China. We extracted maternal demographic data and clinical characteristics, then performed least absolute shrinkage and selection operator (LASSO) regression, combined with clinical significance, to screen variables. Thereafter, multivariate logistic regression was used to construct a nomogram predicting the risk of preeclampsia in twin pregnancies. Finally, the nomogram was validated using the C-statistic (C-index) and calibration curves. Results: A total of 2469 women with twin pregnancies were included, of whom 325 (13.16%) had preeclampsia. Multivariate logistic regression models revealed that serum creatinine, uric acid, mean platelet volume, high-density lipoprotein, lactate dehydrogenase, fibrinogen, primiparity, pre-pregnancy body mass index, and regular prenatal care were independently associated with preeclampsia in twin pregnancies. The constructed predictive model exhibited good discrimination and predictive ability for preeclampsia in twin pregnancies (concordance index 0.821). Conclusion: The model for the prediction of preeclampsia in twin pregnancies has high accuracy and specificity. It can be used to assess the risk of preeclampsia in twin pregnancies.
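For readers unfamiliar with the C-statistic used to validate the nomogram: it is the probability that a randomly chosen case receives a higher predicted risk than a randomly chosen non-case (0.5 = chance, 1.0 = perfect discrimination). A minimal pure-Python sketch on toy data, not the study's 2469-patient dataset:

```python
def c_index(risks, outcomes):
    """Concordance: fraction of case/non-case pairs in which the case
    has the higher predicted risk (ties count half)."""
    concordant, pairs = 0.0, 0
    for i, (r1, y1) in enumerate(zip(risks, outcomes)):
        for r2, y2 in zip(risks[i + 1:], outcomes[i + 1:]):
            if y1 == y2:
                continue  # only case vs non-case pairs are comparable
            pairs += 1
            case_risk, ctrl_risk = (r1, r2) if y1 == 1 else (r2, r1)
            if case_risk > ctrl_risk:
                concordant += 1.0
            elif case_risk == ctrl_risk:
                concordant += 0.5
    return concordant / pairs

# toy example: every case outranks every non-case, so C = 1.0
risks    = [0.9, 0.8, 0.3, 0.2, 0.6]
outcomes = [1,   1,   0,   0,   0]
print(c_index(risks, outcomes))  # 1.0
```

A reported C-index of 0.821 means a randomly drawn preeclampsia case scored above a randomly drawn non-case about 82% of the time.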

2020 ◽  
pp. 088506661990109 ◽  
Author(s):  
Tetsuro Maeda ◽  
Janvi Paralkar ◽  
Toshiki Kuno ◽  
Paru Patrawalla

Background: Lactate clearance has become important in the management of sepsis. However, factors unrelated to sepsis-induced hyperlactatemia, including β-2 adrenergic agonists, can interfere with lactate clearance. Objectives: To investigate the association of inhaled albuterol with lactate clearance in patients with sepsis. Methods: This was a single-center retrospective cohort study. Adult patients with sepsis diagnosed in the emergency department from May 2015 to May 2016 with initial lactate levels >2 mmol/L and serial lactate measurements 2 to 6 hours apart were included. Patients were divided into 2 groups based on whether they received inhaled albuterol between lactate measurements. The primary end point was lactate clearance of at least 10%. Secondary end points included intensive care unit (ICU) consultation and in-hospital mortality. A multivariate logistic regression analysis was performed to assess the effect of inhaled albuterol on lactate clearance. Results: Of 269 patients included, 58 (22%) received inhaled albuterol between lactate measurements. This group had a significantly higher prevalence of pulmonary disease and a lower initial lactate compared to those who did not receive inhaled albuterol. They had a significantly lower rate of lactate clearance (45% vs 77%, P < .001); however, ICU consultation (71% vs 57%, P = .066) and in-hospital mortality (19% vs 22%, P = .64) were not significantly different. A multivariate logistic regression analysis adjusting for age, sex, chronic kidney disease, cirrhosis, cancer, septic shock or severe sepsis, and the amount of intravenous fluids received showed that inhaled albuterol was independently associated with impaired lactate clearance (adjusted odds ratio: 0.26, 95% confidence interval: 0.14-0.50, P < .001). Conclusions: Inhaled albuterol in patients with sepsis was associated with impaired lactate clearance without an increase in ICU consultation or in-hospital mortality.
Impaired lactate clearance in patients with sepsis who receive inhaled albuterol should be interpreted with caution.
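Lactate clearance, as commonly defined (the abstract does not print the formula), is the percent fall from the initial value; the 10% end point used here can be checked in one line:

```python
def lactate_clearance(initial, repeat):
    """Percent fall from the initial lactate: (initial - repeat) / initial * 100."""
    return (initial - repeat) / initial * 100.0

# illustrative values in mmol/L, not patient data
initial, repeat = 4.0, 3.2
clearance = lactate_clearance(initial, repeat)
print(round(clearance, 1))   # 20.0
print(clearance >= 10.0)     # True: the study's primary end point is met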


2021 ◽  
Vol 39 (15_suppl) ◽  
pp. e18550-e18550
Author(s):  
Amina Dhahri ◽  
Sam Azargoon ◽  
Portia Buchongo ◽  
Tatiana Chicas ◽  
Amrik Singh ◽  
...  

e18550 Background: Early detection through screening mammography has been shown to decrease breast cancer mortality. Screening mammography rates remain low among racial/ethnic minorities and patients with socioeconomic deprivation (SED). Most studies evaluating the role of area-level social determinants of health in breast cancer screening have included only a small number of variables; in this study, a comprehensive and granular measure of SED comprising 17 variables was used to determine an association with screening mammogram completion. Methods: A retrospective cohort study was conducted at an academic hospital system between 2014 and 2020 to identify asymptomatic female patients who received screening mammogram referrals in their primary care clinic after they were deemed eligible per screening guidelines. Patients were assessed for mammogram completion at their annual visits. SED was evaluated using the area deprivation index (ADI), a measure of 17 variables including education, housing, and income at the census block group level. Other covariates analyzed were insurance status, age, and race. The chi-square test, Kruskal-Wallis test, and a multivariate logistic regression model were used for statistical analysis. Results: 856 women were referred for screening mammography, of whom 324 (38%) underwent mammography. Patients with high, moderate, and low SED comprised 69 (8%), 287 (34%), and 500 (58%) of the cohort, respectively. In multivariable analysis, SED and race were not associated with higher screening rates. Uninsured and self-pay patients had the lowest odds of screening mammography completion (AOR 0.22; 95% CI 0.08, 0.60), and Medicare patients had decreased odds of mammogram completion relative to privately insured patients (AOR 0.64; 95% CI 0.43, 0.97). Older age was associated with slightly higher odds of mammography completion (AOR 1.02; 95% CI 1.00, 1.04).
Conclusions: The receipt of screening mammography was low among all patients relative to previously published rates. Uninsured/self-pay status was the strongest indicator for completion of mammography. Additional research is needed to understand the barriers that may influence mammography completion in this population with high socioeconomic deprivation. Multivariate Logistic Regression Estimates for Associations Between Mammogram Completion and SED category.[Table: see text]
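The adjusted odds ratios above carry Wald-type confidence intervals: given a logistic coefficient and its standard error, OR = exp(beta) and the 95% CI is exp(beta ± 1.96·SE). A sketch that roughly reproduces the reported Medicare interval, using a back-solved and therefore hypothetical SE of 0.21:

```python
import math

def odds_ratio_ci(beta, se, z=1.959964):
    """Odds ratio and Wald 95% CI from a logistic-regression coefficient."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# beta chosen so exp(beta) matches the reported AOR 0.64; SE is hypothetical
or_, lo, hi = odds_ratio_ci(beta=math.log(0.64), se=0.21)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 0.64 0.42 0.97
```

The interval excludes 1, which is why the Medicare effect is reported as statistically significant.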


2021 ◽  
Vol 9 (2) ◽  
pp. 270
Author(s):  
Kiyoharu Fukushima ◽  
Hiroshi Kida

Chronic pulmonary aspergillosis (CPA) has been reported to be associated with poor prognosis in non-tuberculous mycobacteria (NTM)-pulmonary disease (PD) patients. However, whether isolation of Aspergillus species is associated with poor outcome, or is mostly just a reflection of colonization, is a widely debated and as yet unsolved question. We conducted this single-center retrospective cohort study of 409 NTM-PD patients to assess the prevalence and impact of Aspergillus isolation and CPA development. The median observation time was 85 months. Aspergillus species were isolated from 79 patients (19.3%), and 23 (5.6%) developed CPA. Isolation of Aspergillus species was not associated with mortality in NTM-PD patients (p = 0.9016). Multivariate logistic regression analysis revealed that higher CRP (p = 0.0213) and AFB stain positivity (p = 0.0101) were independently associated with Aspergillus isolation. Mycobacterial species was not associated with Aspergillus isolation. Survival curves for patients with a CPA diagnosis were significantly and strikingly different from those without (p = 0.0064), suggesting that CPA development severely affects clinical outcome. Multivariate logistic regression analysis revealed that the use of systemic steroids (p = 0.0189) and the presence of a cavity (p = 0.0207) were independent risk factors for progression to CPA. Considering the high mortality rate of CPA in NTM-PD, early diagnosis and treatment are essential to improve outcomes for NTM-PD patients.


2018 ◽  
Vol 69 (9) ◽  
pp. 2465-2466
Author(s):  
Iustin Olariu ◽  
Roxana Radu ◽  
Teodora Olariu ◽  
Andrada Christine Serafim ◽  
Ramona Amina Popovici ◽  
...  

Osseointegration of a dental implant may encounter a variety of problems caused by various factors, such as prior health-related problems, patients' habits, and the technique of implant insertion. We performed a retrospective cohort study of 70 patients who received implants between January 2011 and April 2016 in one dental unit, using the Kaplan-Meier method to calculate the probability of implant survival at 60 months. The analysis included demographic data, age, gender, medical history, behavioral risk factors, and the type and location of the implant. For this cohort, implant survival for the first 6 months was 92.86% relative to the number of patients and 97.56% relative to the total number of implants placed, with a cumulative failure rate of 2.43% after 60 months. Failures occurred exclusively in posterior mandible implants (6.17%), the odds ratio (OR) for these failures being 16.76 (P = 0.05) compared with other implant locations, and exclusively in men, with a median age of 42 years.
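The Kaplan-Meier figure cited above is a product-limit estimate: at each failure time, cumulative survival is multiplied by (number at risk − failures) / number at risk, with censored implants simply leaving the risk set. A minimal sketch on toy implant data, not the study's 70 patients:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.
    times: follow-up in months; events: 1 = failure, 0 = censored."""
    at_risk = len(times)
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        failures = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        if failures:
            surv *= (at_risk - failures) / at_risk
            curve.append((t, surv))
        at_risk -= sum(1 for ti in times if ti == t)  # remove failed and censored
    return curve

# toy data: 8 implants, one failure at 6 months, the rest censored at 60 months
times  = [6, 60, 60, 60, 60, 60, 60, 60]
events = [1,  0,  0,  0,  0,  0,  0,  0]
print(kaplan_meier(times, events))  # [(6, 0.875)]
```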


2020 ◽  
Vol 7 (Supplement_1) ◽  
pp. S262-S262
Author(s):  
Kok Hoe Chan ◽  
Bhavik Patel ◽  
Iyad Farouji ◽  
Addi Suleiman ◽  
Jihad Slim

Abstract Background Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) infection can lead to many different cardiovascular complications; we were interested in studying prognostic markers in patients with atrial fibrillation/flutter (A. Fib/Flutter). Methods A retrospective cohort study of patients with confirmed COVID-19 and either existing or new-onset A. Fib/Flutter who were admitted to our hospital between March 15 and May 20, 2020. Demographic, outcome, and laboratory data were extracted from the electronic medical record and compared between survivors and non-survivors. Univariate and multivariate logistic regression were employed to identify the prognostic markers associated with mortality in patients with A. Fib/Flutter. Results The total number of confirmed COVID-19 patients during the study period was 350; 37 of them had existing or new-onset A. Fib/Flutter. Twenty-one (57%) expired, and 16 (43%) were discharged alive. The median age was 72 years, with a range of 19 to 100 years. Comorbidities were present in 33 (89%) patients, with hypertension (82%) being the most common, followed by diabetes (46%) and coronary artery disease (30%). New-onset atrial fibrillation was identified in 23 patients (70%), of whom 13 (57%) expired; 29 patients (78%) presented with atrial fibrillation with rapid ventricular response, and 2 patients (5%) with atrial flutter. Mechanical ventilation was required for 8 patients, of whom 6 expired. In univariate analysis, we found a significant difference in baseline ferritin (p=0.04), LDH (p=0.02), neutrophil-lymphocyte ratio (NLR) (p=0.05), neutrophil-monocyte ratio (NMR) (p=0.03), and platelet count (p=0.015) between survivors and non-survivors. In multivariable logistic regression analysis, the only variable independently associated with survival was a low NLR (odds ratio 0.74; 95% confidence interval 0.53–0.93).
Conclusion This retrospective cohort study of hospitalized patients with COVID-19 demonstrated an association between increased NLR and death in COVID-19 patients with A. Fib/Flutter. A high NLR has been associated with increased incidence, severity, and risk of stroke in atrial fibrillation patients, but to our knowledge, we are the first to demonstrate its use for mortality prediction in COVID-19 patients with A. Fib/Flutter. Disclosures Jihad Slim, MD, AbbVie (Speaker's Bureau), Gilead (Speaker's Bureau), Janssen (Speaker's Bureau), Merck (Speaker's Bureau), ViiV (Speaker's Bureau)
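The NLR and NMR used here as prognostic markers are simple ratios from the differential blood count; the cell counts below are illustrative values only, not patient data:

```python
def blood_ratios(neutrophils, lymphocytes, monocytes):
    """Neutrophil-lymphocyte ratio (NLR) and neutrophil-monocyte ratio (NMR)
    from absolute counts (e.g., 10^9 cells/L)."""
    return neutrophils / lymphocytes, neutrophils / monocytes

nlr, nmr = blood_ratios(neutrophils=7.0, lymphocytes=1.0, monocytes=0.5)
print(nlr, nmr)  # 7.0 14.0
```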


Critical Care ◽  
2019 ◽  
Vol 23 (1) ◽  
Author(s):  
Edgar Santos ◽  
Arturo Olivares-Rivera ◽  
Sebastian Major ◽  
Renán Sánchez-Porras ◽  
Lorenz Uhlmann ◽  
...  

Abstract Objective Spreading depolarizations (SD) are characterized by breakdown of transmembrane ion gradients and excitotoxicity. Experimentally, N-methyl-d-aspartate receptor (NMDAR) antagonists block a majority of SDs. In many hospitals, the NMDAR antagonist s-ketamine and the GABAA agonist midazolam represent the current second-line combination treatment to sedate patients with devastating cerebral injuries. A pressing clinical question is whether this option should become first-line in sedation-requiring individuals in whom SDs are detected, yet the s-ketamine dose necessary to adequately inhibit SDs is unknown. Moreover, use-dependent tolerance could be a problem for SD inhibition in the clinic. Methods We performed a retrospective cohort study of 66 patients with aneurysmal subarachnoid hemorrhage (aSAH) from a prospectively collected database. Thirty-three of 66 patients received s-ketamine during electrocorticographic neuromonitoring of SDs in neurointensive care. The decision to give s-ketamine was dependent on the need for stronger sedation, so it was expected that patients receiving s-ketamine would have a worse clinical outcome. Results S-ketamine application started 4.2 ± 3.5 days after aSAH. The mean dose was 2.8 ± 1.4 mg/kg body weight (BW)/h and thus higher than the dose recommended for sedation. First, patients were divided according to whether they received s-ketamine at any time or not. No significant difference in SD counts was found between groups (negative binomial model using the SD count per patient as outcome variable, p = 0.288). This most likely resulted from the fact that 368 SDs had already occurred in the s-ketamine group before s-ketamine was given. 
However, in patients receiving s-ketamine, we found a significant decrease in SD incidence when s-ketamine was started (Poisson model with a random intercept per patient, coefficient −1.83 (95% confidence interval −2.17; −1.50), p < 0.001; logistic regression model, odds ratio (OR) 0.13 (0.08; 0.19), p < 0.001). Thereafter, data were further divided into low-dose (0.1–2.0 mg/kg BW/h) and high-dose (2.1–7.0 mg/kg BW/h) segments. High-dose s-ketamine resulted in a further significant decrease in SD incidence (Poisson model, −1.10 (−1.71; −0.49), p < 0.001; logistic regression model, OR 0.33 (0.17; 0.63), p < 0.001). There was little evidence of SD tolerance to long-term s-ketamine sedation through 5 days. Conclusions These results provide a foundation for a multicenter, neuromonitoring-guided, proof-of-concept trial of ketamine and midazolam as a first-line sedative regimen.
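Exponentiating the reported Poisson coefficient and its confidence limits converts them to an incidence rate ratio, which is how the roughly 84% drop in SD incidence after starting s-ketamine can be read off:

```python
import math

# Poisson coefficient and 95% CI reported for the start of s-ketamine
coef, lo, hi = -1.83, -2.17, -1.50

# exp(coefficient) is the SD incidence rate ratio (after vs before)
print(round(math.exp(coef), 2),  # 0.16 -> ~84% fewer SDs per unit time
      round(math.exp(lo), 2),    # 0.11
      round(math.exp(hi), 2))    # 0.22
```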


2022 ◽  
Vol 14 (1) ◽  
pp. 20-25
Author(s):  
Riccardo Garbo ◽  
Francesca Valent ◽  
Gian Luigi Gigli ◽  
Mariarosaria Valente

There is limited information regarding the severity of COVID-19 in immunocompromised patients. We conducted a retrospective cohort study covering the period from 1 March 2020 to 31 December 2020 to determine whether pre-existing lymphopenia increases the risk of hospitalization and death after SARS-CoV-2 infection in the general population. The laboratory and hospital discharge databases of the Azienda Sanitaria Universitaria Friuli Centrale were used, and 5415 subjects infected with SARS-CoV-2 and with at least one recent absolute lymphocyte count determination before SARS-CoV-2 positivity were included. In total, 817 (15.1%) patients had severe COVID-19. Patients developing severe COVID-19 were more frequently male (44.9% of the severe COVID-19 group vs. 41.5% of the non-severe COVID-19 group; p < 0.0001) and were older (73.2 ± 13.8 vs. 58.4 ± 20.3 years; p < 0.0001). Furthermore, 29.9% of the lymphopenic patients developed severe COVID-19 vs. 14.5% of the non-lymphopenic patients (p < 0.0001). In a logistic regression model, female sex remained a protective factor (OR = 0.514, 95% CI 0.438–0.602, p < 0.0001), while age and lymphopenia remained risk factors for severe COVID-19 (OR = 1.047, 95% CI 1.042–1.053, p < 0.0001 for each additional year of age; OR = 1.715, 95% CI 1.239–2.347, p = 0.0011 for lymphopenia). This provides further information with which to stratify the risk of COVID-19 severity, which may be an important element in the management of immunosuppressive therapies.
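The reported age effect (OR 1.047 per additional year) compounds multiplicatively across years, so, for instance, a 15-year age difference corresponds to roughly double the odds of severe COVID-19:

```python
per_year_or = 1.047  # reported odds ratio per additional year of age
years = 15           # illustrative age difference

# compounding a per-year OR over a span of years
print(round(per_year_or ** years, 2))  # 1.99: roughly twice the odds
```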


2022 ◽  
Author(s):  
John J Fraser ◽  
Ryan Pommier ◽  
Andrew J MacGregor ◽  
Amy B Silder ◽  
Todd C Sander

Context: Musculoskeletal injuries (MSKIs) are ubiquitous during initial entry military training, with overuse injuries in the lower extremities the most frequent. A common mechanism for overuse injuries is running, an activity that is an integral part of United States Coast Guard (USCG) training and a requirement for graduation. Objective: To assess the effects of athletic footwear choice on lower quarter MSKI risk in USCG recruits. Design: Descriptive epidemiological study. Setting: USCG Training Center, Cape May, NJ. Participants: A retrospective cohort study was performed in which 1229 recruits (1038 males, 191 females) were allowed to self-select athletic footwear during training. A group of 2876 recruits (2260 males, 616 females) who trained under a policy requiring obligatory wear of prescribed athletic shoes served as a control. Main Outcome Measures: Demographic data and physical performance were derived from administrative records. Injury data were abstracted from a medical tracking database. Multivariable logistic regression was used to assess group, age, sex, height, body mass, and run times on MSKI outcomes. Results: Ankle-foot, leg, knee, and lumbopelvic-hip complex diagnoses were ubiquitous in both groups (experimental: 20.37 to 29.34 per 1000 recruits; control: 18.08 to 25.59 per 1000 recruits). Group was not a significant factor for any of the injuries assessed. Sex was a significant factor in all injury types, with female recruits demonstrating approximately 2-fold greater odds of experiencing running-related injuries (RRIs), overuse injuries, or any MSKI in general. For ankle-foot or bone stress injuries, female recruits had 3.73 to 4.11 times greater odds than their male counterparts. Run time was a significant predictor of RRI, all overuse injuries, and any MSKI in general. Conclusion: While footwear choice did not influence MSKI risk in USCG recruits, female sex was a primary, nonmodifiable intrinsic risk factor.
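The injury figures above are expressed per 1000 recruits; the conversion is a one-liner. The case count below is hypothetical, chosen only to land near the reported experimental-group range:

```python
def rate_per_1000(cases, cohort_size):
    """Injury rate expressed per 1000 recruits."""
    return cases / cohort_size * 1000.0

# hypothetical: 25 diagnoses among the 1229 self-select recruits
print(round(rate_per_1000(25, 1229), 2))  # 20.34
```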


2021 ◽  
Author(s):  
Guifang Deng ◽  
Lanlan Wu ◽  
Yao Liu ◽  
Zengyou Liu ◽  
Hengying Chen ◽  
...  

Abstract Background: Blood urea nitrogen (BUN) and serum creatinine (SCr) are associated with gestational diabetes mellitus (GDM). However, there are limited data in the literature on the influence of BUN and SCr on maternal and fetal outcomes of pregnancy. We aimed to examine the association of BUN and SCr levels during gestation with the risk of selected adverse pregnancy outcomes. Methods: This retrospective cohort study included 1606 singleton mothers aged 22-44 years. BUN and SCr levels were collected and measured during the second (16-18th week) and third (28-30th week) trimesters of gestation, and pregnancy outcomes were followed up. Multivariate logistic regression was used for statistical analysis. Results: In the multivariate adjusted logistic regression model, the highest level of SCr in the second trimester increased the risk of PROM by 45% (95% CI, 1.01-2.09). In the third trimester, compared with the lowest quartile, BUN levels in the highest quartile decreased the risk of macrosomia by 60% (95% CI, 0.20-0.78) and of LGA by 66% (95% CI, 0.21-0.55), and increased the risk of SGA by 137% (95% CI, 1.06-5.31) and 186% (95% CI, 1.29-6.34) in the third and fourth quartiles, respectively. Compared with the first quartile of SCr levels, the adjusted OR (95% CI) was 0.46 (0.24-0.87) for macrosomia in the fourth quartile, 2.36 (1.10-5.10) for SGA in the third quartile, and 0.61 (0.41-0.91) for LGA in the fourth quartile. An elevated change in BUN (>0.64 mmol/L) was a risk factor for SGA (OR: 2.11, 95% CI: 1.03-4.32). Conclusion: Higher BUN and SCr levels during the 28-30th week of gestation, even those towards the upper limit of the normal range, can act as a warning sign of impending SGA. Elevated changes in BUN and SCr during pregnancy were also associated with lower birth weight.
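The quartile comparisons above require assigning each mother's lab value to a cohort quartile. A simple empirical-cut-point sketch with toy BUN values (the study's actual cut points are not given in the abstract):

```python
def quartile(value, sorted_values):
    """1-based quartile of `value` within a cohort's sorted distribution,
    using simple empirical cut points at the 25th/50th/75th positions."""
    n = len(sorted_values)
    cuts = [sorted_values[n // 4], sorted_values[n // 2], sorted_values[3 * n // 4]]
    for q, cut in enumerate(cuts, start=1):
        if value < cut:
            return q
    return 4

# toy third-trimester BUN values in mmol/L, not the study's data
bun = sorted([2.8, 3.1, 3.4, 3.7, 4.0, 4.3, 4.6, 4.9])
print(quartile(3.0, bun), quartile(4.8, bun))  # 1 4
```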


2003 ◽  
Vol 8 (1) ◽  
pp. 1-8 ◽  
Author(s):  
B Pedalino ◽  
E Feely ◽  
P McKeown ◽  
B Foley ◽  
B Smyth ◽  
...  

A retrospective cohort study was conducted to investigate an outbreak of Norwalk-like viral gastroenteritis that occurred in Irish holidaymakers visiting Andorra in January–February 2002. Preliminary results showed that the risk was higher for tourists who stayed in Soldeu and consumed ice cubes in their drinks (OR = 2.5, 95% CI 1.3–4.6, after logistic regression adjusting for sex and water consumption).
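In such outbreak investigations, a crude odds ratio with a Woolf 95% CI can be computed straight from a 2x2 exposure table before any regression adjustment; the counts below are toy numbers, not the outbreak's data:

```python
import math

def odds_ratio(a, b, c, d):
    """Crude OR and Woolf 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    return (or_,
            math.exp(math.log(or_) - 1.96 * se),
            math.exp(math.log(or_) + 1.96 * se))

# toy counts: ice-cube consumers vs non-consumers, ill vs well
print(tuple(round(x, 2) for x in odds_ratio(30, 20, 15, 25)))  # (2.5, 1.06, 5.87)
```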

