Antibiotic-Impregnated Calcium Sulfate Beads Are Not Effective in the Primary Prevention of Infection in Open Femur and Tibia Fractures

2021 ◽  
Vol 108 (Supplement_2) ◽  
Author(s):  
A Fung ◽  
A Ward ◽  
K Patel ◽  
M Krkovic

Abstract

Introduction: Infection is a major complication of open fractures. Antibiotic-impregnated calcium sulfate (AICS) beads are widely used as an adjuvant to systemic antibiotics. Whilst their efficacy in the secondary prevention of infection is established, we present the first retrospective study evaluating AICS beads in the primary prevention of infection in open fractures.

Method: 214 open femur and tibia fractures in 207 patients were reviewed over a seven-year period. 148 fractures received only systemic antibiotic prophylaxis; 66 fractures also received AICS beads. The occurrence of acute infection (wound infection and acute osteomyelitis) was recorded, as well as that of long-term complications (chronic osteomyelitis, non-union and death).

Results: Fractures that received AICS with systemic antibiotics had an overall acute infection rate of 42% (28/66), compared with 43% (63/148) in fractures that received only systemic antibiotics (p > 0.05). There was no significant difference in infection rate even when fractures were stratified by Gustilo-Anderson grade. There was also no significant difference in the rate of long-term complications.

Conclusions: Our results indicate that the adjuvant use of AICS beads is not effective for the primary prevention of acute infection or long-term complications in open leg fractures. Further research is needed to elucidate the factors influencing the outcomes of AICS use.
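The headline comparison (28/66 vs 63/148, p > 0.05) can be reproduced with a standard two-proportion test. The abstract does not name the test used, so the pooled z-test below is an assumption for illustration only:

```python
from math import erf, sqrt

def two_proportion_z_test(x1: int, n1: int, x2: int, n2: int):
    """Two-sided two-proportion z-test with pooled variance.

    Returns (z statistic, two-sided p-value). Uses the normal CDF
    Phi(z) = 0.5 * (1 + erf(z / sqrt(2))).
    """
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# AICS + systemic (28/66) vs systemic only (63/148):
z, p = two_proportion_z_test(28, 66, 63, 148)  # p is far above 0.05
```

The rates (42.4% vs 42.6%) are nearly identical, so the p-value is close to 1, consistent with the reported non-significance.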

Heart ◽  
2018 ◽  
Vol 104 (18) ◽  
pp. 1529-1535 ◽  
Author(s):  
Sérgio Barra ◽  
Rui Providência ◽  
Serge Boveda ◽  
Rudolf Duehmke ◽  
Kumar Narayanan ◽  
...  

Objective: In patients indicated for cardiac resynchronisation therapy (CRT), the choice between a CRT-pacemaker (CRT-P) and a CRT-defibrillator (CRT-D) remains controversial, and indications in this setting have not been well delineated. Apart from inappropriate therapies, which are inherent to the presence of a defibrillator, whether adding a defibrillator to CRT in the primary prevention setting affects the risk of other acute and late device-related complications has not been well studied and may bear relevance for device selection.

Methods: Observational multicentre European cohort study of 3008 consecutive patients with ischaemic or non-ischaemic dilated cardiomyopathy and no history of sustained ventricular arrhythmias, undergoing CRT implantation with (CRT-D, n=1785) or without (CRT-P, n=1223) a defibrillator. Using propensity score and competing risk analyses, we assessed the risk of significant device-related complications requiring surgical reintervention. Inappropriate shocks were not considered, except those due to lead malfunction requiring lead revision.

Results: Acute complications occurred in 148 patients (4.9%), without significant difference between groups, even after considering potential confounders (OR=1.20, 95% CI 0.72 to 2.00, p=0.47). During a mean follow-up of 41.4±29 months, late complications occurred in 475 patients, giving annual incidence rates of 26 (95% CI 9 to 43) and 15 (95% CI 6 to 24) per 1000 patient-years in CRT-D and CRT-P patients, respectively. CRT-D was independently associated with an increased occurrence of late complications (HR=1.68, 95% CI 1.27 to 2.23, p=0.001). In particular, when compared with CRT-P, CRT-D was associated with an increased risk of device-related infection (HR 2.10, 95% CI 1.18 to 3.45, p=0.004). Acute complications did not predict overall late complications, but did predict device-related infection (HR 2.85, 95% CI 1.71 to 4.56, p<0.001).

Conclusions: Compared with CRT-P, CRT-D is associated with a similar risk of periprocedural complications but an increased risk of long-term complications, mainly infection. This needs to be considered in the decision to implant CRT with or without a defibrillator.
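The incidence rates per 1000 patient-years quoted above are crude rates (events divided by accumulated follow-up, scaled by 1000). A minimal sketch; the example numbers in the comment are illustrative placeholders, not the study's per-arm totals, which the abstract does not report:

```python
def rate_per_1000_patient_years(events: int, patient_years: float) -> float:
    """Crude incidence rate: events per 1000 patient-years of follow-up."""
    return 1000 * events / patient_years

# Illustrative only: 52 events over 2000 patient-years -> 26 per 1000 patient-years
rate = rate_per_1000_patient_years(52, 2000)  # 26.0
```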


Blood ◽  
2007 ◽  
Vol 110 (11) ◽  
pp. 3989-3989
Author(s):  
Asher Winder ◽  
Tatiana Bruzgol ◽  
Inna Lichman ◽  
Amir Herman ◽  
Eyal Leibovitz ◽  
...  

Abstract

Introduction: The coagulation system plays a pivotal role in the creation and development of vascular clotting. It is activated during infection episodes, as can be measured by markers such as d-dimer, plasminogen activator inhibitor-1 (PAI-1) and fibrinogen. The goal of this study was to determine whether high levels of d-dimer, fibrinogen or PAI-1 during acute infection, as markers of coagulation-system activation, indicate a higher risk of cerebro-cardiovascular disease or all-cause mortality in the elderly during the 6 months after the infection episode.

Methods: D-dimer, fibrinogen and PAI-1 levels, as well as other routine demographic and laboratory variables, were recorded in elderly patients hospitalized due to infections. The variables were recorded at admission, at discharge and at 6 months after discharge. End points of the study were cerebrovascular events, acute coronary syndrome, other thromboembolic events and all-cause mortality.

Results: The study group consisted of 52 patients; 5 of them died during hospitalization, 4 died later from infectious complications and 6 had a cerebrovascular or cardiovascular (CCV) event. Lower fibrinogen levels were associated with the development of CCV events. A statistically significant difference (p=0.048) was found between fibrinogen levels at discharge in patients with a CCV event (366±149 mg/dL) and patients without a CCV event (511±139 mg/dL), although the comparison of fibrinogen levels at admission did not reach statistical significance (patients with a CCV event 456±70 mg/dL, patients without a CCV event 544±133 mg/dL, p=0.052). Furthermore, the ratio of d-dimer levels at admission to the levels at discharge was significantly higher (p=0.029) in patients who eventually had a CCV event (1.60±0.5) than in patients who did not (1.12±0.4). PAI-1, platelets and other blood variables were not significantly different between the groups. Based on these differences, an algorithm was built to identify patients with infection at high risk of a CCV event (figure). Patients with fibrinogen levels on admission below 530 mg/dL and a d-dimer ratio above 1.54 were identified as being at high risk of a CCV event. The algorithm yielded a sensitivity and specificity of 66% and 95%, respectively. The hazard ratio for these patients was 16.8 (95% confidence interval: 3.06–92).

Conclusions: In this study we found that lower levels of fibrinogen are associated with a higher risk of developing a CCV event, while the opposite holds for the d-dimer ratio. An algorithm was developed for the identification of high-risk patients. Further research is needed to validate these results in order to identify a group of people who might benefit from long-term antithrombotic agents to prevent cardio- and cerebrovascular events.
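The decision rule reported above (admission fibrinogen below 530 mg/dL combined with a d-dimer admission/discharge ratio above 1.54) can be expressed directly. The function name and signature below are illustrative, not taken from the study:

```python
def high_ccv_risk(fibrinogen_admission_mg_dl: float,
                  ddimer_admission: float,
                  ddimer_discharge: float) -> bool:
    """Flag patients at high risk of a cerebro-/cardiovascular (CCV) event
    after an acute infection, using the thresholds reported in the abstract:
    admission fibrinogen < 530 mg/dL AND a d-dimer admission/discharge
    ratio > 1.54 (reported sensitivity 66%, specificity 95%).
    """
    ddimer_ratio = ddimer_admission / ddimer_discharge
    return fibrinogen_admission_mg_dl < 530 and ddimer_ratio > 1.54

# Example: a patient with admission fibrinogen 456 mg/dL and a d-dimer
# ratio of 1.6 meets both criteria and is flagged as high risk.
flag = high_ccv_risk(456, ddimer_admission=1.6, ddimer_discharge=1.0)  # True
```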


2020 ◽  
Vol 7 (Supplement_1) ◽  
pp. S443-S444
Author(s):  
David B Kopelman ◽  
Sharon B Wright ◽  
Howard Gold ◽  
Preeti Mehrotra

Abstract

Background: In an effort to more accurately diagnose Clostridioides difficile infection (CDI), many hospitals have switched to two-step testing algorithms that rely on nucleic acid amplification testing (NAAT) with reflex enzyme immunoassay for toxin. Additionally, oral vancomycin prophylaxis (OVP) against CDI is increasingly being used; initial studies focused on preventing recurrence in patients with a prior history of CDI, but OVP is also being studied in primary prevention. We hypothesized that following the implementation of two-step testing, clinicians may use OVP to prevent a patient's first episode of CDI based on knowledge of prior PCR+/Toxin- testing.

Methods: We performed a single-center, retrospective cohort study of patients admitted to Beth Israel Deaconess Medical Center. We identified patients who received oral vancomycin once daily or twice daily (BID) for the prevention of CDI following implementation of two-step testing. Patients who received oral vancomycin as part of a taper following acute infection were excluded. We categorized the rationale for prophylaxis based on clinical documentation and collected details of patients' CDI history, antibiotic exposure, and subsequent CDI testing during hospitalization.

Results: In the 12 months following implementation of two-step testing, 80 patients received OVP during hospitalization (2 once daily and 78 BID). The vast majority (73, 91.3%) had a history of CDI and received OVP for secondary prevention while receiving systemic antibiotics. Only 3 patients (3.8%) without a known clinical history of CDI had clinicians who documented prophylaxis based on previous PCR+/Toxin- testing. Patients on OVP received a mean of 4.1 systemic antibiotics during hospitalization. When OVP was continued for a finite period after discontinuation of systemic antibiotics, this was most commonly done for 2-7 days (16 of 26, 61.5%). 22 patients underwent stool testing for CDI while receiving OVP in the hospital, and all were PCR-negative.

[Figures: OVP indication; OVP duration]

Conclusion: Following implementation of two-step testing for CDI, use of OVP for primary prevention based solely on knowledge of PCR+/Toxin- testing in patients without a history of CDI was rare. Acute CDI appears unlikely in patients actively receiving OVP.

Disclosures: All Authors: No reported disclosures
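The two-step algorithm described in the Background can be sketched as a simple reflex rule. The result labels below are illustrative interpretations, not the study's own categories:

```python
from typing import Optional

def two_step_cdi_test(naat_positive: bool,
                      toxin_eia_positive: Optional[bool] = None) -> str:
    """Two-step C. difficile testing: NAAT/PCR runs first; only a positive
    NAAT reflexes to the toxin enzyme immunoassay (EIA)."""
    if not naat_positive:
        # Negative NAAT: no reflex testing is performed.
        return "PCR-negative: CDI unlikely"
    if toxin_eia_positive is None:
        raise ValueError("a positive NAAT must reflex to the toxin EIA")
    if toxin_eia_positive:
        return "PCR+/Toxin+: active CDI likely"
    # The PCR+/Toxin- discordant result motivated the OVP hypothesis above.
    return "PCR+/Toxin-: possible colonization; clinical correlation needed"
```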


Problems in the design calculation of reinforced concrete structures based on the diagram of concrete deformation under compression, as presented in both Russian and foreign regulatory documents on the design of concrete and reinforced concrete structures, are considered. The accuracy of these diagrams across all classes of concrete remains approximate; in particular, a significant discrepancy arises when the Euronorm is used, owing to differences in the shape and size of the test specimens. At present there are no methodical recommendations for determining the ultimate relative deformations of concrete under axial compression or for constructing curvilinear deformation diagrams, which limits the available experimental data and, as a result, prevents more detailed ultimate strain values from being entered into domestic standards. Results of experimental studies determining the ultimate relative deformations of concrete under compression for different classes of concrete are presented; these made it possible to derive analytical dependences for evaluating the ultimate relative deformations and for describing curvilinear deformation diagrams. The article discusses various options for using the deformation model to assess the stress-strain state of a structure and concludes that not only the final values of the ultimate deformations but also their intermediate values must be used. This requires reliable σ–ε diagrams for all classes of concrete. The difficulties of measuring deformations in concrete subjected to the peak load corresponding to the prismatic strength, as well as to the main cracks that appear under long-term step loading, are highlighted, and variants of more accurate measurement are proposed. The development and implementation of a new standard, GOST "Concretes. Methods for determination of complete diagrams", on the basis of the developed method for obtaining complete diagrams of concrete deformation under compression, are necessary for the evaluation of the ultimate deformability of concrete under compression.
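The abstract does not reproduce its analytical dependences. As one well-known example of a curvilinear σ–ε diagram of the kind discussed, the Eurocode 2 nonlinear compressive stress-strain relation (EN 1992-1-1) can be sketched; the parameter values in the comment are illustrative:

```python
def ec2_stress(eps_c: float, f_cm: float, eps_c1: float, E_cm: float) -> float:
    """Eurocode 2 (EN 1992-1-1) nonlinear stress-strain relation for
    concrete in compression:

        sigma_c = f_cm * (k*eta - eta**2) / (1 + (k - 2)*eta)

    where eta = eps_c / eps_c1 (strain normalised by the strain at peak
    stress) and k = 1.05 * E_cm * eps_c1 / f_cm. At eps_c = eps_c1 the
    stress equals the mean cylinder strength f_cm.
    """
    eta = eps_c / eps_c1
    k = 1.05 * E_cm * eps_c1 / f_cm
    return f_cm * (k * eta - eta ** 2) / (1 + (k - 2) * eta)

# Illustrative parameters in MPa (roughly class C30/37):
# f_cm = 38 MPa, eps_c1 = 0.0022, E_cm = 33000 MPa.
peak = ec2_stress(0.0022, 38.0, 0.0022, 33000.0)  # equals f_cm = 38.0
```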


Author(s):  
Diana Hart

All countries are faced with the problem of the prevention and control of non-communicable diseases (NCD): implementing prevention strategies effectively, keeping up the momentum with long-term benefits at the individual and the population level, while at the same time tackling health inequalities. The affordability of therapy and care, including innovative therapies, is going to be one of the key public health priorities in the years to come. This article outlines the steps Germany has taken in the prevention and control of NCDs. Germany's health system has a long history of guaranteeing access to high-quality treatment through universal health care coverage. Through their membership, people are entitled to prevention and care services maintaining and restoring their health, as well as long-term follow-up. As in many other countries, general life expectancy has been increasing steadily in Germany. Currently, the average life expectancy is 83 and 79 years in women and men, respectively. The other side of the coin is that population aging is strongly associated with a growing burden of disease from NCDs. Already over 70 percent of all deaths in Germany are caused by four disease entities: cardiovascular disease, cancer, chronic respiratory disease and diabetes. These diseases all share four common risk factors: smoking, alcohol abuse, lack of physical activity and overweight. At the same time, more and more people become long-term survivors of disease due to improved therapy and care. The German Government and public health decision-makers are aware of the need for action and have responded by initiating and implementing a wide spectrum of activities. One instrument for strengthening primary prevention is the Prevention Health Care Act. Its overarching aim is to prevent NCDs before they can manifest themselves by strengthening primary prevention and health promotion in different settings. One of the main emphases of the Prevention Health Care Act is occupational health promotion at the workplace.


2020 ◽  
Vol 132 (5) ◽  
pp. 1405-1413 ◽  
Author(s):  
Michael D. Staudt ◽  
Holger Joswig ◽  
Gwynedd E. Pickett ◽  
Keith W. MacDougall ◽  
Andrew G. Parrent

OBJECTIVE: The prevalence of trigeminal neuralgia (TN) in patients with multiple sclerosis (MS-TN) is higher than in the general population (idiopathic TN [ITN]). Glycerol rhizotomy (GR) is a percutaneous lesioning surgery commonly performed for the treatment of medically refractory TN. While acute pain relief after treatment is excellent, long-term pain relief is poorer. The object of this study was to assess the efficacy of percutaneous retrogasserian GR for the treatment of MS-TN versus ITN.

METHODS: A retrospective chart review was performed, identifying 219 patients who had undergone 401 GR procedures from 1983 to 2018 at a single academic institution. All patients were diagnosed with medically refractory MS-TN (182 procedures) or ITN (219 procedures). The primary outcome measures of interest were immediate pain relief and time to pain recurrence following initial and repeat GR procedures. Secondary outcomes included medication usage and the presence of periprocedural hypesthesia.

RESULTS: The initial pain-free response rate was similar between groups (p = 0.726): MS-TN initial GR 89.6%; MS-TN repeat GR 91.9%; ITN initial GR 89.6%; ITN repeat GR 87.0%. The median time to recurrence after initial GR was similar between MS-TN (2.7 ± 1.3 years) and ITN (2.1 ± 0.6 years) patients (p = 0.87). However, there was a statistically significant difference in the time to recurrence after repeat GR between MS-TN (2.3 ± 0.5 years) and ITN patients (1.2 ± 0.2 years; p < 0.05). The presence of periprocedural hypesthesia was highly predictive of pain-free survival (p < 0.01).

CONCLUSIONS: Patients with MS-TN achieve meaningful pain relief following GR, with an efficacy comparable to that following GR in patients with ITN. Initial and subsequent GR procedures are equally efficacious.


2021 ◽  
Vol 4 (Supplement_1) ◽  
pp. 234-236
Author(s):  
P Willems ◽  
J Hercun ◽  
C Vincent ◽  
F Alvarez

Abstract

Background: The natural history of primary sclerosing cholangitis (PSC) in children seems to differ from PSC in adults. However, studies on this matter have been limited by short follow-up periods and inconsistent classification of patients with autoimmune cholangitis (AIC, or overlap syndrome). Consequently, it remains unclear whether long-term outcomes are affected by the clinical phenotype.

Aims: The aims of this study are to describe the long-term evolution of PSC and AIC in a pediatric cohort with extension of follow-up into adulthood, and to evaluate the influence of phenotype on clinical outcomes.

Methods: This is a retrospective study of patients with AIC or PSC followed at CHU-Sainte-Justine, a pediatric referral center in Montreal. All charts between January 1998 and December 2019 were reviewed. Patients were classified as either AIC (duct disease on cholangiography with histological features of autoimmune hepatitis) or PSC (large or small duct disease on cholangiography and/or histology). Extension of follow-up after the age of 18 was done for patients followed at the Centre hospitalier de l'Université de Montréal. Clinical features at diagnosis, response to treatment at one year and liver-related outcomes were compared.

Results: 40 patients (27 PSC and 13 AIC) were followed for a median time of 71 months (range 2 to 347), with 52.5% followed into adulthood. 70% (28/40) had associated inflammatory bowel disease (IBD) (78% PSC vs 54% AIC; p=0.15). A similar proportion of patients had biopsy-proven significant fibrosis at diagnosis (45% PSC vs 67% AIC; p=0.23). Baseline liver tests were similar in both groups. At diagnosis, all patients were treated with ursodeoxycholic acid. Significantly more patients with AIC (77% AIC vs 30% PSC; p=0.005) were initially treated with immunosuppressive drugs, without a significant difference in the use of anti-TNF agents (0% AIC vs 15% PSC; p=0.12). At one year, 55% (15/27) of patients in the PSC group had normal liver tests, versus only 15% (2/13) in the AIC group (p=0.02). During follow-up, more liver-related events (cholangitis, liver transplant and cirrhosis) were reported in the AIC group (HR=3.7 (95% CI: 1.4–10), p=0.01). Abnormal liver tests at one year were a strong predictor of liver-related events during follow-up (HR=8.9 (95% CI: 1.2–67.4), p=0.03), while having IBD was not (HR=0.48 (95% CI: 0.15–1.5), p=0.22). 5 patients required liver transplantation, with no difference between the two groups (8% AIC vs 15% PSC; p=0.53).

Conclusions: Pediatric patients with AIC and PSC show, at onset, a similar stage of liver disease with comparable clinical and biochemical characteristics. However, patients with AIC more often receive immunosuppressive therapy, and treatment response is less frequent. AIC is associated with more liver-related events, and abnormal liver tests at one year are a predictor of poor outcomes.

Funding Agencies: None


2021 ◽  
Vol 7 (5) ◽  
pp. 369
Author(s):  
Joseph Cherabie ◽  
Patrick Mazi ◽  
Adriana M. Rauseo ◽  
Chapelle Ayres ◽  
Lindsey Larson ◽  
...  

Histoplasmosis is a common opportunistic infection in people with HIV (PWH); however, no study has examined factors associated with long-term mortality from histoplasmosis in PWH. We conducted a single-center retrospective study on the long-term mortality of PWH diagnosed with histoplasmosis between 2002 and 2017. Patients were categorized into three groups based on length of survival after diagnosis: early mortality (death < 90 days), late mortality (death ≥ 90 days), and long-term survivors. Patients diagnosed during or after 2008 were considered part of the modern antiretroviral therapy (ART) era. Insurance type (private vs. public) was used as a surrogate indicator of socioeconomic status. Among 54 PWH with histoplasmosis, overall mortality was 37% (14.8% early mortality and 22.2% late mortality). There was no statistically significant difference in survival based on the availability of modern ART (p = 0.60). Insurance status reached statistical significance, with 38% of long-term survivors having private insurance versus only 8% in the late mortality group (p = 0.05). High mortality persists despite the advent of modern ART, implicating a contribution from social determinants of health such as insurance status. Larger studies are needed to elucidate the role of these factors in the mortality of PWH.


Cancers ◽  
2021 ◽  
Vol 13 (14) ◽  
pp. 3390
Author(s):  
Mats Enlund

Retrospective studies indicate that cancer survival may be affected by the anaesthetic technique, and propofol seems to be a better choice than volatile anaesthetics such as sevoflurane. The first two retrospective studies suggested better long-term survival with propofol, but not for breast cancer. Subsequent retrospective studies from Asia indicated the same. When data from seven Swedish hospitals were analysed, including 6305 breast cancer patients, different analyses gave different results, ranging from a non-significant difference in survival to a remarkably large difference in favour of propofol, an illustration of the innate weakness of the retrospective design. The largest randomised clinical trial registered on clinicaltrials.gov with survival as an outcome is the Cancer and Anesthesia study, in which patients are randomised to propofol or sevoflurane. The inclusion of patients with breast cancer was completed in autumn 2017. Delayed by the pandemic, one-year survival data for the cohort were presented in November 2020. Because short-term survival in breast cancer is extremely good, one-year survival is of limited interest for this disease; however, as the inclusions took almost five years, a time trend could also be observed. Unsurprisingly, no difference was found in one-year survival between the two groups, and the trend indicated no difference either.

