Plasma prothrombin time and activated partial thromboplastin time as predictors of bleeding manifestations during dengue hemorrhagic fever

2009 ◽  
Vol 49 (2) ◽  
pp. 69
Author(s):  
I. N. Budastra ◽  
B. N. P. Arhana ◽  
IB. Mudita

Background: Massive bleeding and shock are complications of dengue hemorrhagic fever (DHF) that are associated with high mortality. Impaired hemostasis, especially coagulopathy, contributes to bleeding manifestations in DHF. Parameters such as activated partial thromboplastin time (APTT) and plasma prothrombin time (PPT) reflect the state of the coagulation system. Objective: To determine the relationship between APTT and PPT levels and bleeding manifestations in DHF patients. Methods: A prospective cohort study was applied to subjects diagnosed with DHF at the Infection and Tropical Diseases Division, Department of Child Health, Medical School, Udayana University, Sanglah Hospital, Denpasar, Indonesia. Laboratory tests to determine APTT and PPT were carried out on the third, fourth, and fifth day after the onset of fever. Bleeding manifestations were examined in patients during their hospital stay. Univariate and Cox regression analyses were performed to examine the relationship between APTT and PPT values and bleeding manifestations in DHF patients. Results: Forty-three children were enrolled in this study. There was a significant relationship between increased APTT values and bleeding manifestations in DHF patients [RR 2.79 (95% CI 1.68 to 4.69), P < 0.01]. Cox regression analysis showed that only increased APTT values correlated with bleeding manifestations [RR 2.05 (95% CI 1.92 to 3.90), P = 0.02]. Conclusion: APTT values may be used as a predictor of bleeding manifestations in DHF.
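The abstract reports relative risks with 95% confidence intervals. As an illustration only (the underlying 2x2 cell counts are not given in the abstract), a relative risk and its CI by the standard log method can be computed like this:

```python
import math

def relative_risk(a, b, c, d, z=1.96):
    """Relative risk from a 2x2 table with a 95% CI (log method).

    a: exposed with outcome, b: exposed without,
    c: unexposed with outcome, d: unexposed without.
    """
    risk_exposed = a / (a + b)
    risk_unexposed = c / (c + d)
    rr = risk_exposed / risk_unexposed
    # Standard error of log(RR)
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi
```

The interval is symmetric on the log scale, which is why published CIs such as 1.68 to 4.69 around RR 2.79 are asymmetric on the natural scale.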

Circulation ◽  
2015 ◽  
Vol 131 (suppl_1) ◽  
Author(s):  
Juan C Villar ◽  
Luz X Martínez ◽  
Yeny Z Castellanos ◽  
Skarlet M Vásquez ◽  
Víctor M Herrera

Background: Overweight is a modifiable risk factor for high blood pressure (BP). Despite the increasing prevalence of both conditions in the Latin American population, there are no estimates of either the incidence of hypertension or the impact of overweight on it that inform the design and evaluation of individual and community-based preventive interventions in the region. Methods: We conducted a prospective cohort study in a sample of normotensive, blood donors from Bucaramanga, Colombia, who were free of transfusion-transmitted infectious and cardiovascular diseases at baseline. Participants were re-evaluated after a median follow-up of 12 years to determine the incidence of hypertension defined as: 1) Self-reported diagnosis with evidence of pharmacological treatment; 2) Systolic BP >140 mmHg or diastolic BP >90 mmHg (average of two measures in seated position); or 3) Current systolic/diastolic BP >120/80 mmHg with evidence of increments >10/5 mmHg from baseline. We estimated crude incidence rates of hypertension and age- and sex-adjusted hazard ratios (HRs) for baseline overweight (body mass index ≥25 kg/m2) using Cox regression analysis. The population attributable fraction (PAF) for overweight was also assessed. Results: We followed 594 participants (baseline mean age = 38.0 years; 64% male; adherence rate = 78%) at risk of hypertension among which we observed 164 incident cases: Cumulative incidence of 27.6%; incidence rate of 23.4 cases per 1,000 person-years. Incidence rate was similar in men and women (23.4 vs. 23.2 per 1,000 person-years; p>0.05) and tended to increase with age (17.4, 21.2, and 27.8 per 1,000 person-years among participants <30, 30-39, and ≥40 years old, respectively; p>0.05). Participants with overweight at baseline had twice the risk of developing hypertension than participants with normal weight (adjusted-HR = 2.00, 95%CI: 1.11, 3.61). The estimated PAF was 25.7%, considering a national prevalence of overweight equal to 34.6%. 
Conclusion: The incidence of hypertension in our study is similar to that reported two decades ago in cohorts from developed countries, which is consistent with the ongoing epidemiological transition in Latin America. We also confirmed the role of overweight as a risk factor for hypertension, accounting for about 1 out of 4 incident cases. This finding highlights the importance of addressing overweight in our population.
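The reported attributable fraction can be reproduced from the abstract's own figures, assuming the authors applied Levin's formula with the adjusted HR (2.00) standing in for the risk ratio and the national overweight prevalence (34.6%):

```python
def levin_paf(prevalence, rr):
    """Population attributable fraction (Levin's formula):
    PAF = p*(RR - 1) / (1 + p*(RR - 1))."""
    excess = prevalence * (rr - 1)
    return excess / (1 + excess)

# Figures reported in the abstract
paf = levin_paf(prevalence=0.346, rr=2.00)
cumulative_incidence = 164 / 594  # incident cases / participants at risk
print(round(paf, 3))                   # → 0.257  (reported PAF: 25.7%)
print(round(cumulative_incidence, 3))  # → 0.276  (reported: 27.6%)
```

Both reported summary figures fall out of the raw numbers in the abstract, which is a useful sanity check on the arithmetic.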


2014 ◽  
Vol 58 (7) ◽  
pp. 3799-3803 ◽  
Author(s):  
Regis G. Rosa ◽  
Luciano Z. Goldani

ABSTRACT The time to antibiotic administration (TTA) has been proposed as a quality-of-care measure in febrile neutropenia (FN); however, few data regarding the impact of the TTA on the mortality of adult cancer patients with FN are available. The objective of this study was to determine whether the TTA is a predictor of mortality in adult cancer patients with FN. A prospective cohort study of all consecutive cases of FN, evaluated from October 2009 to August 2011, at a single tertiary referral hospital in southern Brazil was performed. The TTA was assessed as a predictive factor for mortality within 28 days of FN onset using the Cox proportional hazards model. Kaplan-Meier curves were used for an assessment of the mortality rates according to different TTAs; the log-rank test was used for between-group comparisons. In total, 307 cases of FN (169 subjects) were evaluated. During the study period, there were 29 deaths. In a Cox regression analysis, the TTA was independently associated with mortality within 28 days (hazard ratio [HR], 1.18; 95% confidence interval [CI], 1.10 to 1.26); each increase of 1 h in the TTA raised the risk of mortality within 28 days by 18%. Patients with FN episodes with a TTA of ≤30 min had lower 28-day mortality rates than those with a TTA of between 31 min and 60 min (3.0% versus 18.1%; log-rank P = 0.0002). Early antibiotic administration was associated with higher survival rates in the context of FN. Efforts should be made to ensure that FN patients receive effective antibiotic therapy as soon as possible. A target TTA of 30 min should be adopted for cancer patients with FN.
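Under the proportional-hazards model with a log-linear effect of TTA, the reported per-hour HR of 1.18 compounds multiplicatively with the length of the delay. A small sketch of that interpretation:

```python
def delay_hazard_multiplier(hr_per_hour, hours):
    """Multiplicative effect on the hazard of an h-hour delay,
    assuming proportional hazards with a log-linear effect of TTA."""
    return hr_per_hour ** hours

# HR of 1.18 per hour (from the abstract):
# one extra hour raises the hazard by 18%, two hours by ~39%
print(round(delay_hazard_multiplier(1.18, 1), 2))  # → 1.18
print(round(delay_hazard_multiplier(1.18, 2), 2))  # → 1.39
```

This is why small per-unit hazard ratios matter clinically: the effect compounds over each additional hour of delay.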


2018 ◽  
Vol 22 (6) ◽  
pp. 47-55
Author(s):  
V. A. Dobronravov ◽  
A. O. Mukhametdinova ◽  
M. S. Khrabrova ◽  
A. Nabokow ◽  
H. -J. Gröne ◽  
...  

THE OBJECTIVE of the study was to assess the impact of the count of interstitial CD3+, CD68+ and CD20+ cells on the long-term prognosis of renal allograft (RA). PATIENTS AND METHODS. 86 RA recipients with glomerulitis, proven by biopsy according to the Banff 2013-2017 criteria, were enrolled in this retrospective study. The patients were subdivided into the following groups: 1) isolated glomerulitis with negative donor-specific antibodies (DSA) at the biopsy (n=53); 2) glomerulitis with positive DSA (n=22); 3) glomerulitis with undetermined DSA (n=11). Quantitative assay of interstitial positive cells was performed after immunohistochemical staining for CD68+, CD3+, CD20+. The Kaplan-Meier method and Cox proportional hazards regression model were used for the analysis of the relationship between interstitial CD3+, CD68+, CD20+ cells and the risk of RA loss. RESULTS. CD68+ and CD3+ cells prevailed in the interstitium in RA glomerulitis. CD20+ infiltrates were found in 60% of cases. CD20+ cells tended to form infiltrates; in 9 cases these infiltrates reached large sizes (≥ 50 CD20+ lymphocytes) and formed nodular structures. There was no difference in the count of interstitial CD3+ and CD68+ cells or in the presence of CD20+ infiltrates between DSA subgroups. Interstitial CD68+ ≥ 5 cells per field of view (FOV) (x400) and CD3+ ≥ 8 cells per FOV (x400), as well as the presence of large CD20+ infiltrates, were associated with lower RA survival (p log-rank < 0.05). Interstitial CD68+ (≥ 5 cells/FOV), CD3+ (≥ 8 cells/FOV) and the presence of large CD20+ interstitial infiltrates were independently associated with the risk of RA loss in the multivariable Cox regression analysis adjusted for DSA, cold and warm ischemia time (p < 0.05). CONCLUSION. The grade of interstitial infiltration by CD68+, CD3+ and CD20+ cells in RA glomerulitis could be an independent predictor of RA loss.
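The survival comparisons above rest on Kaplan-Meier estimates. A minimal, dependency-free sketch of the estimator (illustrative only; the study's actual data are not reproduced here):

```python
from itertools import groupby

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates from (time, event) pairs.

    times: follow-up times; events: 1 = graft loss observed, 0 = censored.
    Returns a list of (time, survival probability) at each event time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    for t, group in groupby(data, key=lambda pair: pair[0]):
        group = list(group)
        deaths = sum(e for _, e in group)
        if deaths:
            # Multiply in the conditional survival at this event time
            survival *= 1 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= len(group)  # events and censorings both leave the risk set
    return curve
```

Censored subjects reduce the risk set without stepping the curve down, which is what distinguishes this from a naive cumulative-incidence calculation.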


2021 ◽  
Vol 8 (11) ◽  
Author(s):  
Dan Ilges ◽  
David J Ritchie ◽  
Tamara Krekel ◽  
Elizabeth A Neuner ◽  
Nicholas Hampton ◽  
...  

Abstract Background Hospital-acquired and ventilator-associated pneumonia (HAP/VAP) cause significant mortality. Guidelines recommend empiric broad-spectrum antibiotics followed by de-escalation (DE). This study sought to assess the impact of DE on treatment failure. Methods This single-center retrospective cohort study screened all adult patients with a discharge diagnosis code for pneumonia from 2016 to 2019. Patients were enrolled if they met predefined criteria for HAP/VAP ≥48 hours after admission. Date of pneumonia diagnosis was defined as day 0. Spectrum scores were calculated, and DE was defined as a score reduction on day 3 versus day 1. Patients with DE were compared to patients with no de-escalation (NDE). The primary outcome was composite treatment failure, defined as all-cause mortality or readmission for pneumonia within 30 days of diagnosis. Results Of 11860 admissions screened, 1812 unique patient-admissions were included (1102 HAP, 710 VAP). Fewer patients received DE (876 DE vs 1026 NDE). Groups were well matched at baseline, although more patients receiving DE had respiratory cultures ordered (56.6% vs 50.6%, P = .011). There was no difference in composite treatment failure (35.0% DE vs 33.8% NDE, P = .604). De-escalation was not associated with treatment failure on multivariable Cox regression analysis (hazard ratio, 1.13; 95% confidence interval, 0.96–1.33). Patients receiving DE had fewer antibiotic days (median 9 vs 11, P < .0001), episodes of Clostridioides difficile infection (2.2% vs 3.8%, P = .046), and hospital days (median 20 vs 22 days, P = .006). Conclusions De-escalation and NDE resulted in similar rates of 30-day treatment failure; however, DE was associated with fewer antibiotic days, episodes of C difficile infection, and days of hospitalization.
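The study's DE definition reduces to a comparison of antibiotic spectrum scores between day 1 and day 3. A trivial sketch, assuming a numeric score where lower means narrower coverage (the actual scoring instrument is not specified in the abstract):

```python
def classify_de(score_day1, score_day3):
    """Label a course as de-escalated (DE) or not (NDE), per the study's
    definition: DE = spectrum score reduction on day 3 versus day 1.

    Assumes a numeric spectrum score where lower = narrower coverage;
    the concrete scoring system is an assumption, not from the abstract.
    """
    return "DE" if score_day3 < score_day1 else "NDE"

print(classify_de(10, 7))  # narrowed coverage → DE
print(classify_de(8, 8))   # unchanged coverage → NDE
```

Note that under this definition an unchanged or broadened regimen both count as NDE, which matters when interpreting the group sizes.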


1994 ◽  
Vol 72 (05) ◽  
pp. 685-692 ◽  
Author(s):  
Michael T Nurmohamed ◽  
René J Berckmans ◽  
Willy M Morriën-Salomons ◽  
Fenny Berends ◽  
Daan W Hommes ◽  
...  

Summary Background. Recombinant hirudin (RH) is a new anticoagulant for prophylaxis and treatment of venous and arterial thrombosis. To what extent the activated partial thromboplastin time (APTT) is suitable for monitoring RH has not been properly evaluated. Recently, a capillary whole blood device was developed for bedside monitoring of the APTT, and it was demonstrated that this device was suitable for monitoring heparin therapy. However, monitoring of RH was not evaluated. Study Objectives. To evaluate, in vitro and ex vivo, the responsiveness and reproducibility for hirudin monitoring of the whole blood monitor and of plasma APTT assays, which were performed with several reagents and two conventional coagulometers. Results. Large interindividual differences in hirudin responsiveness were noted in both the in vitro and the ex vivo experiments. The relationship between the APTT, expressed as clotting time or as the ratio of initial and prolonged APTT, and the hirudin concentration was nonlinear. A 1.5-fold increase of the clotting times was obtained at 150-200 ng/ml plasma. However, only a 2-fold increase was obtained at hirudin levels varying from 300 ng to more than 750 ng RH/ml plasma, regardless of the assay. The relationship linearized upon logarithmic conversion of the ratio and the hirudin concentration. Disregarding the interindividual differences, and presuming full linearity of the relationship, all combinations were equally responsive to hirudin. Conclusions. All assays were equally responsive to hirudin. Levels up to 300 ng/ml plasma can be reliably estimated with each assay. The manual device may be preferable in situations where rapid availability of test results is necessary.
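The abstract notes that the APTT-ratio versus hirudin-concentration relationship linearized after logarithmic conversion of both quantities. That transformation amounts to fitting a line on the log-log scale, which can be sketched with a plain least-squares fit (illustrative; the assay data themselves are not reproduced here):

```python
import math

def fit_loglog(concentrations, ratios):
    """Least-squares fit of log(APTT ratio) = a + b * log(concentration).

    Returns (a, b); a perfect power law ratio = k * conc**b gives
    b exactly and a = log(k).
    """
    xs = [math.log(c) for c in concentrations]
    ys = [math.log(r) for r in ratios]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b
```

A straight line on this scale is what lets the authors compare the responsiveness of different reagent/coagulometer combinations with a single slope.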


2021 ◽  
pp. 152660282199672
Author(s):  
Giovanni Tinelli ◽  
Marie Bonnet ◽  
Adrien Hertault ◽  
Simona Sica ◽  
Gian Luca Di Tanna ◽  
...  

Purpose: Evaluate the impact of hybrid operating room (HOR) guidance on the long-term clinical outcomes following fenestrated and branched endovascular repair (F-BEVAR) for complex aortic aneurysms. Materials and Methods: Prospectively collected registry data were retrospectively analyzed to compare the procedural, short- and long-term outcomes of consecutive F-BEVAR performed from January 2010 to December 2014 under standard mobile C-arm versus hybrid room guidance in a high-volume aortic center. Results: A total of 262 consecutive patients, including 133 patients treated with a mobile C-arm equipped operating room and 129 with a HOR guidance, were enrolled in this study. Patient radiation exposure and contrast media volume were significantly reduced in the HOR group. Short-term clinical outcomes were improved despite higher case complexity in the HOR group, with no statistical significance. At a median follow-up of 63.3 months (Q1 33.4, Q3 75.9) in the C-arm group, and 44.9 months (Q1 25.1, Q3 53.5, p=0.53) in the HOR group, there was no statistically significant difference in terms of target vessel occlusion and limb occlusion. When the endograft involved 3 or more fenestrations and/or branches (complex F-BEVAR), graft instability (36% vs 25%, p=0.035), reintervention on target vessels (20% vs 11%, p=0.019) and total reintervention rates (24% vs 15%, p=0.032) were significantly reduced in the HOR group. The multivariable Cox regression analysis did not show statistically significant differences for long-term death and aortic-related death between the 2 groups. Conclusion: Our study suggests that better long-term clinical outcomes could be observed when performing complex F-BEVAR in the latest generation HOR.


Antibiotics ◽  
2021 ◽  
Vol 10 (7) ◽  
pp. 798
Author(s):  
Ignacio Martin-Loeches ◽  
Adrian Ceccato ◽  
Marco Carbonara ◽  
Gianluigi li Bassi ◽  
Pierluigi di Natale ◽  
...  

Background: Cardiovascular failure (CVF) may complicate intensive care unit-acquired pneumonia (ICUAP) and radically alters the empirical treatment of this condition. The aim of this study was to determine the impact of CVF on outcome in patients with ICUAP. Methods: A prospective, single-center, observational study was conducted in six medical and surgical ICUs at a university hospital. CVF was defined as a score of 3 or more on the cardiovascular component of the Sequential Organ Failure Assessment (SOFA) score. At the onset of ICUAP, CVF was reported as absent, transient (if lasting ≤ 3 days) or persistent (>3 days). The primary outcome was 90-day mortality modelled through a Cox regression analysis. Secondary outcomes were 28-day mortality, hospital mortality, ICU length of stay (LOS) and hospital LOS. Results: 358 patients were enrolled: 203 (57%) without CVF, 82 (23%) with transient CVF, and 73 (20%) with persistent CVF. Patients with transient and persistent CVF were more severely ill and presented a higher inflammatory response than those without CVF. Despite having similar severity and aetiology, the persistent CVF group more frequently received inadequate initial antibiotic treatment and presented more treatment failures than the transient CVF group. In the persistent CVF group, at day 3, a bacterial superinfection was more frequently detected. The 90-day mortality was significantly higher in the persistent CVF group (62%). The 28-day mortality rates for patients without CVF, with transient and with persistent CVF were 19, 35 and 41% respectively, and ICU mortality was 60, 38 and 19% respectively. In the multivariate analysis, chronic pulmonary conditions, lack of PaO2/FiO2 improvement at day 3, pulmonary superinfection at day 3 and persistent CVF were independently associated with 90-day mortality in ICUAP patients. Conclusions: Persistent CVF has a significant impact on the outcome of patients with ICUAP. 
Patients at risk from persistent CVF should be promptly recognized to optimize treatment and outcomes.
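The study's three CVF categories can be sketched as a function of daily cardiovascular SOFA scores. Whether "lasting" counts total or consecutive failure days is not specified in the abstract, so counting total failure days here is an assumption:

```python
def classify_cvf(cv_sofa_by_day):
    """Classify cardiovascular failure (CVF) at ICUAP onset.

    cv_sofa_by_day: daily cardiovascular SOFA scores from pneumonia onset.
    CVF is present on days with a score >= 3 (study definition); it is
    'absent' if never present, 'transient' if lasting <= 3 days, and
    'persistent' if lasting > 3 days. Counting total (rather than
    consecutive) failure days is an assumption of this sketch.
    """
    failure_days = sum(1 for score in cv_sofa_by_day if score >= 3)
    if failure_days == 0:
        return "absent"
    return "transient" if failure_days <= 3 else "persistent"

print(classify_cvf([1, 2, 1]))      # no day reaches SOFA-CV >= 3
print(classify_cvf([4, 3, 1]))      # two failure days
print(classify_cvf([4, 4, 4, 4]))   # four failure days
```

Encoding the definition this way makes the boundary cases explicit, e.g. exactly three failure days still counts as transient.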


2021 ◽  
Vol 14 ◽  
pp. 175628482110234
Author(s):  
Mario Romero-Cristóbal ◽  
Ana Clemente-Sánchez ◽  
Patricia Piñeiro ◽  
Jamil Cedeño ◽  
Laura Rayón ◽  
...  

Background: Coronavirus disease (COVID-19) with acute respiratory distress syndrome is a life-threatening condition. A previous diagnosis of chronic liver disease is associated with poorer outcomes. Nevertheless, the impact of silent liver injury has not been investigated. We aimed to explore the association of pre-admission liver fibrosis indices with the prognosis of critically ill COVID-19 patients. Methods: The work presented was an observational study in 214 patients with COVID-19 consecutively admitted to the intensive care unit (ICU). Pre-admission liver fibrosis indices were calculated. In-hospital mortality and predictive factors were explored with Kaplan–Meier and Cox regression analysis. Results: The mean age was 59.58 (13.79) years; 16 patients (7.48%) had previously recognised chronic liver disease. Up to 78.84% of patients according to Forns, and 45.76% according to FIB-4, had more than minimal fibrosis. Fibrosis indices were higher in non-survivors [Forns: 6.04 (1.42) versus 4.99 (1.58), p < 0.001; FIB-4: 1.77 (1.17) versus 1.41 (0.91), p = 0.020], but no differences were found in liver biochemistry parameters. Patients with any degree of fibrosis either by Forns or FIB-4 had a higher mortality, which increased according to the severity of fibrosis (p < 0.05 for both indices). Both Forns [HR 1.41 (1.11–1.81); p = 0.006] and FIB-4 [HR 1.31 (0.99–1.72); p = 0.051] were independently related to survival after adjusting for the Charlson comorbidity index, APACHE II, and ferritin. Conclusion: Unrecognised liver fibrosis, assessed by serological tests prior to admission, is independently associated with a higher risk of death in patients with severe COVID-19 admitted to the ICU.
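FIB-4, one of the two serological indices used above, has a standard closed form (age × AST divided by platelets × √ALT). A sketch with illustrative inputs, not values from the study's patients:

```python
import math

def fib4(age_years, ast_u_l, alt_u_l, platelets_10e9_l):
    """FIB-4 liver fibrosis index:
    age (years) * AST (U/L) / (platelets (10^9/L) * sqrt(ALT (U/L)))."""
    return (age_years * ast_u_l) / (platelets_10e9_l * math.sqrt(alt_u_l))

# Illustrative inputs: age 60, AST 40 U/L, ALT 30 U/L, platelets 200 x 10^9/L
print(round(fib4(60, 40, 30, 200), 2))  # → 2.19
```

Because the inputs are routine admission labs, the index can be computed retrospectively from pre-admission data, which is exactly what the study exploits.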


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
J. M. van Rees ◽  
W. Hartman ◽  
J. J. M. E. Nuyttens ◽  
E. Oomen-de Hoop ◽  
J. L. A. van Vugt ◽  
...  

Abstract Background Chemoradiation with capecitabine followed by surgery is standard care for locally advanced rectal cancer (LARC). Severe diarrhea is considered a dose-limiting toxicity of adding capecitabine to radiation therapy. The aim of this study was to describe the risk factors and the impact of body composition on severe diarrhea in patients with LARC during preoperative chemoradiation with capecitabine. Methods A single centre retrospective cohort study was conducted in a tertiary referral centre. All patients treated with preoperative chemoradiation with capecitabine for LARC from 2009 to 2015 were included. Patients with locally recurrent rectal cancer who received chemoradiation for the first time were included as well. Logistic regression analyses were performed to identify risk factors for severe diarrhea. Results A total of 746 patients were included. Median age was 64 years (interquartile range 57–71) and 477 patients (64%) were male. All patients received a radiation dosage of 25 × 2 Gy during a period of five weeks with either concomitant capecitabine administered on radiation days or continuously during radiotherapy. In this cohort 70 patients (9%) developed severe diarrhea. In multivariable logistic regression analyses female sex (OR: 4.42, 95% CI 2.54–7.91) and age ≥ 65 (OR: 3.25, 95% CI 1.85–5.87) were the only risk factors for severe diarrhea. Conclusions Female patients and patients aged sixty-five or older had an increased risk of developing severe diarrhea during preoperative chemoradiation therapy with capecitabine. No relation was found between body composition and severe diarrhea.
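The multivariable results above are odds ratios from logistic regression, i.e. exponentiated model coefficients. A small helper shows the conversion (illustrative; the study's fitted coefficients are not given in the abstract):

```python
import math

def odds_ratio(beta, se, z=1.96):
    """Odds ratio and 95% CI from a logistic regression coefficient
    and its standard error (Wald interval on the log-odds scale)."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))
```

As with relative risks, the interval is symmetric on the log scale, so a reported OR of 4.42 (2.54–7.91) is asymmetric around the point estimate on the natural scale.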


2021 ◽  
pp. oemed-2020-106903
Author(s):  
Julio González Martin-Moro ◽  
Marta Chamorro Gómez ◽  
Galicia Dávila Fernández ◽  
Ana Elices Apellaniz ◽  
Ana Fernández Hortelano ◽  
...  

Objectives Reverse transcriptase PCR (RT-PCR) is considered the gold standard in diagnosing COVID-19. Infected healthcare workers do not go back to work until RT-PCR has demonstrated that the virus is no longer present in the upper respiratory tract. The aim of this study is to determine the most efficient time to perform RT-PCR prior to healthcare workers’ reincorporation. Materials and methods This is a cohort study of healthcare workers with RT-PCR-confirmed COVID-19. Data were collected using the medical charts of healthcare workers and completed with a telephone interview. Kaplan-Meier curves were used to determine the influence of several variables on the time to RT-PCR negativisation. The impact of the variables on survival was assessed using the Breslow test. A Cox regression model was developed including the associated variables. Results 159 subjects with a positive RT-PCR out of 374 workers with suspected COVID-19 were included. The median time to negativisation was 25 days from symptom onset (IQR 20–35 days). Presence of IgG, dyspnoea, cough and throat pain were associated with a significantly longer time to negativisation. Cox regression was used to adjust for confounding variables. Only dyspnoea and cough remained in the model as significant determinants of prolonged negativisation time. Adjusted HRs were 0.68 (0.48–0.96) for dyspnoea and 0.61 (0.42–0.88) for dry cough. Conclusions RT-PCR during the first 3 weeks leads to a high percentage of positive results. In the presence of respiratory symptoms, negativisation took nearly 1 week more. Those who developed antibodies needed a longer time to negativise.

