Abstract P197: Impact of Cranioplasty on Neurological Recovery in Syndrome of the Trephined: A Prospective Longitudinal Study

Stroke ◽  
2021 ◽  
Vol 52 (Suppl_1) ◽  
Author(s):  
Lukas Sveikata ◽  
Lana Vasung ◽  
Amir El Rahal ◽  
Andrea Bartoli ◽  
Armin Schnider ◽  
...  

Introduction: Syndrome of the Trephined (SoT) is a common and underdiagnosed complication after decompressive craniectomy (DC). This study aimed to address the knowledge gap in SoT incidence, risk factors, and the impact of cranioplasty timing on neurological recovery. Methods: In a prospective single-center study, we examined 40 consecutive patients who underwent a large DC and cranioplasty for diverse etiologies. Participants underwent cognitive, motor, and radiological evaluation 1-4 days before and after cranioplasty. SoT was diagnosed when neurological symptoms worsened before cranioplasty ('a priori') or when neurological improvement was observed after cranioplasty without previous overt symptoms ('a posteriori'). The primary outcome was the occurrence of SoT; the secondary outcome was improvement of disability after cranioplasty (mRS 0-3 defined as good outcome). We used logistic regression models to assess risk factors for SoT and the impact of cranioplasty timing on neurological recovery. Hemorrhagic lesions were defined as initial brain injury or DC-related intraparenchymal or subarachnoid hemorrhage. Radiologic signs were sinking skin flap, paradoxical midline shift, or slit-like ventricle. Results: Of 40 patients enrolled, 14 (35%) developed 'a priori' and 12 (30%) 'a posteriori' SoT. Cranioplasty resulted in mRS improvement in 7 (18%) patients 1-4 days after surgery. A shift towards good outcome was observed 1-4 days post-cranioplasty (62% vs. 42%, p=0.025) and at 90 days (73% vs. 42%, p=0.005) compared to 1-4 days pre-cranioplasty in the SoT group, but not in the non-SoT group. A composite score (0-3) of traumatic brain injury, hemorrhagic lesions, and radiologic markers had 92% sensitivity and 86% specificity for predicting SoT. Increasing delay to cranioplasty was associated with decreased odds of disability improvement after cranioplasty when adjusting for age and baseline disability (OR 0.96, p=0.03).
Conclusions: In this prospective study, SoT occurred more frequently after DC than previously reported. Earlier cranioplasty was associated with neurological improvement. The proposed 3-variable score could help predict SoT and better allocate scarce rehabilitation resources by supporting earlier cranioplasty.
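The composite score described above can be illustrated with a short sketch. The scoring rule (one point per risk factor) follows the abstract; the patient data and the ≥2 threshold below are hypothetical, chosen only to show how sensitivity and specificity of such a score are computed:

```python
def composite_score(tbi, hemorrhage, radiologic_sign):
    """One point per risk factor present, giving a 0-3 score."""
    return int(tbi) + int(hemorrhage) + int(radiologic_sign)

def sensitivity_specificity(scores, outcomes, threshold):
    """Sensitivity/specificity of the rule 'score >= threshold' for a binary outcome."""
    tp = sum(1 for s, o in zip(scores, outcomes) if s >= threshold and o)
    fn = sum(1 for s, o in zip(scores, outcomes) if s < threshold and o)
    tn = sum(1 for s, o in zip(scores, outcomes) if s < threshold and not o)
    fp = sum(1 for s, o in zip(scores, outcomes) if s >= threshold and not o)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical cohort: (TBI, hemorrhagic lesion, radiologic sign, developed SoT)
cohort = [
    (1, 1, 1, True), (0, 1, 1, True), (1, 0, 1, True), (0, 0, 1, False),
    (0, 0, 0, False), (1, 0, 0, False), (0, 1, 0, True), (0, 0, 0, False),
]
scores = [composite_score(t, h, r) for t, h, r, _ in cohort]
outcomes = [sot for *_, sot in cohort]
sens, spec = sensitivity_specificity(scores, outcomes, threshold=2)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```

The abstract's reported 92%/86% figures come from the study cohort; this sketch only shows the mechanics of evaluating such a threshold rule.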

2021 ◽  
Author(s):  
Lukas Sveikata ◽  
Lana Vasung ◽  
Amir El Rahal ◽  
Andrea Bartoli ◽  
Martin Bretzner ◽  
...  

Abstract Background: Syndrome of the Trephined (SoT) is an underrecognized complication after decompressive craniectomy. We aimed to investigate SoT incidence, clinical spectrum, risk factors, and the impact of cranioplasty on neurological recovery. Methods: Patients undergoing a large craniectomy (>80 cm²) and cranioplasty were prospectively evaluated using the modified Rankin score (mRS) and cognitive (attention, processing speed, executive function, language, visuospatial neglect), motor (Motricity Index, Jamar dynamometer, postural score, gait assessment), and radiological evaluation within four days before and after cranioplasty. The primary outcome was SoT, diagnosed when a neurological improvement was observed after the cranioplasty. The secondary outcome was good outcome (mRS 0-3) four days and 90 days after the cranioplasty. Logistic regression models were used to evaluate the risk factors for SoT and the impact of cranioplasty timing on neurological recovery. Results: Twenty-six patients (65%) developed SoT and improved after cranioplasty. Brain trauma, hemorrhagic lesions, and shifting of brain structures were associated with SoT. After cranioplasty, a shift towards a good outcome was observed within four days (p=0.025) and persisted at 90 days (p=0.005). Increasing delay to cranioplasty was associated with decreased odds of improvement when adjusting for age and baseline disability (odds ratio 0.96; 95% CI, 0.93-0.99; p=0.012). Conclusions: SoT is frequent after craniectomy and interferes with recovery. A high suspicion of SoT should be exercised in patients who fail to progress or who have a previous trauma, hemorrhage, or shifting of brain structures. Performing the cranioplasty earlier was associated with improved and quantifiable neurological recovery.


Author(s):  
Lukas Sveikata ◽  
Lana Vasung ◽  
Amir El Rahal ◽  
Andrea Bartoli ◽  
Martin Bretzner ◽  
...  

Abstract: Syndrome of the trephined (SoT) is an underrecognized complication after decompressive craniectomy. We aimed to investigate SoT incidence, clinical spectrum, risk factors, and the impact of the cranioplasty on neurologic recovery. Patients undergoing a large craniectomy (>80 cm²) and cranioplasty were prospectively evaluated using the modified Rankin score (mRS), cognitive (attention/processing speed, executive function, language, visuospatial), motor (Motricity Index, Jamar dynamometer, postural score, gait assessment), and radiologic evaluation within four days before and after a cranioplasty. The primary outcome was SoT, diagnosed when a neurologic improvement was observed after the cranioplasty. The secondary outcome was a good neurologic outcome (mRS 0–3) 4 days and 90 days after the cranioplasty. Logistic regression models were used to evaluate the risk factors for SoT and the impact of cranioplasty timing on neurologic recovery. We enrolled 40 patients with a large craniectomy; 26 (65%) developed SoT and improved after the cranioplasty. Brain trauma, hemorrhagic lesions, and shifting of brain structures were associated with SoT. After cranioplasty, a shift towards a good outcome was observed within 4 days (p = 0.025) and persisted at 90 days (p = 0.005). Increasing delay to cranioplasty was associated with decreased odds of improvement when adjusting for age and baseline disability (odds ratio 0.96; 95% CI, 0.93–0.99; p = 0.012). In conclusion, SoT is frequent after craniectomy and interferes with neurologic recovery. High suspicion of SoT should be exercised in patients who fail to progress or who have a previous trauma, hemorrhage, or shifting of brain structures. Performing the cranioplasty earlier was associated with improved and quantifiable neurologic recovery.
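One way to read the reported odds ratio of 0.96 per unit of delay: under the log-linear logistic model, per-unit odds ratios compound multiplicatively over the delay. A minimal sketch, assuming the unit is one day (the abstracts do not state the unit, so this is an assumption):

```python
# Cumulative odds multiplier under a log-linear logistic model.
# Assumption: the reported OR of 0.96 is per day of cranioplasty delay.

def odds_multiplier(or_per_unit: float, units: float) -> float:
    """Odds ratio compounded over `units` of delay."""
    return or_per_unit ** units

for weeks in (4, 8, 12):
    m = odds_multiplier(0.96, weeks * 7)
    print(f"{weeks:2d} weeks of delay -> odds of improvement scaled by {m:.2f}")
```

Under that assumption, a four-week delay scales the odds of improvement by roughly 0.96^28 ≈ 0.32, which is one way to see why earlier cranioplasty would matter under this model.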


Author(s):  
MF Shamji ◽  
C Mohanty ◽  
EM Massicotte ◽  
MG Fehlings

Objective: The impact of spinal alignment on neurological recovery among myelopathy patients has not been thoroughly investigated. This study evaluated the impact of sagittal cervical alignment on neurological recovery in a prospective surgical series of myelopathy patients. Methods: Prospective data were analyzed from surgical CSM patients at a tertiary-care neurosurgical centre. Demographic data and preoperative and postoperative clinical measures of neurological disability (mJOA, Nurick, NDI scores) were analyzed for dependency on cervical spine imaging parameters. Results: Among 124 CSM patients, 34% exhibited kyphotic alignment. Surgical intervention was more frequently anterior or combined anterior/posterior in this group than in those with preserved lordosis. Most patients exhibited postoperative neurological improvement in myelopathy severity; however, the extent of this improvement differed according to preoperative sagittal alignment. Improvement was greater among patients with preoperative lordosis (ΔmJOA of 3.1) than among those with preoperative kyphosis (ΔmJOA of 1.4, p=0.02). Surgical correction of spinal malalignment did not confer additional neurological recovery, although whether it protects against symptomatic adjacent segment disease is unclear. Conclusion: Most CSM patients showed postoperative neurological improvement. Patients with preoperative lordotic alignment exhibited greater improvement than those with preoperative kyphotic alignment. Neither correction of spinal alignment nor surgical approach in this series specifically affected the extent of neurological recovery.


2010 ◽  
Vol 22 (8) ◽  
pp. 1216-1224 ◽  
Author(s):  
Robert C. Baldwin

ABSTRACT Background: Achieving remission in late-life depressive disorder is difficult; it is far better to prevent depression. In the last ten years there have been a number of clinical studies of the feasibility of prevention. Methods: A limited literature review was undertaken of studies from 2000 onwards specifically concerning the primary prevention of late-life depressive disorder, or where primary prevention was a relevant secondary outcome. Results: Selective primary prevention (targeting individuals at risk but not expressing depression) has been shown to be effective in stroke and macular degeneration but not hip fracture. It may also prove effective for the depression associated with caregiving in dementia. Emerging evidence finds effectiveness for indicated prevention (in those identified with subthreshold depression, often with other risk factors such as functional limitation). Despite a number of promising risk factors (for example, diet, exercise, vascular risk factors, homocysteine and insomnia), universal prevention of late-life depression (acting to reduce the impact of risk factors at the population level) has no current evidence base, although a population approach might mitigate suicide. Conclusion: Interventions which work in preventing late-life depression include antidepressant medication in standard doses and Problem-Solving Treatment. When integrated into a care model, such as collaborative care, prevention is feasible, but more economic studies are needed.


BMJ Open ◽  
2020 ◽  
Vol 10 (8) ◽  
pp. e040550
Author(s):  
Jean-Francois Payen ◽  
Marion Richard ◽  
Gilles Francony ◽  
Gérard Audibert ◽  
Emmanuel L Barbier ◽  
...  

Introduction: Intracranial hypertension is considered an independent risk factor for mortality and neurological disability after severe traumatic brain injury (TBI). However, clinical studies have demonstrated that episodes of brain ischaemia/hypoxia are common despite normalisation of intracranial pressure (ICP). This study assesses the impact on neurological outcome of guiding therapeutic strategies by monitoring both brain tissue oxygenation pressure (PbtO2) and ICP during the first 5 days following severe TBI. Methods and analysis: Multicentre, open-label, randomised controlled superiority trial with two parallel groups in 300 patients with severe TBI. Intracerebral monitoring must be in place within the first 16 hours post-trauma. Patients are randomly assigned to the ICP group or to the ICP + PbtO2 group. The ICP group is managed according to international guidelines to maintain ICP ≤20 mm Hg. The ICP + PbtO2 group is managed to maintain PbtO2 ≥20 mm Hg in addition to the conventional optimisation of ICP. The primary outcome measure is neurological status at 6 months as assessed using the extended Glasgow Outcome Scale. Secondary outcome measures include quality-of-life assessment, mortality rate, therapeutic intensity and incidence of critical events during the first 5 days. Analysis will be performed according to the intention-to-treat principle and a full statistical analysis plan developed prior to database freeze. Ethics and dissemination: This study has been approved by the Institutional Review Board of Sud-Est V (14-CHUG-48) and by the National Agency for Medicines and Health Products Safety (Agence Nationale de Sécurité du Médicament et des produits de santé) (141 435B-31). Results will be presented at scientific meetings and published in peer-reviewed publications. The study was registered at ClinicalTrials.gov (NCT02754063) on 28 April 2016 (pre-results).


2014 ◽  
Vol 14 (11) ◽  
pp. 15867-15894
Author(s):  
A. Fraser ◽  
P. I. Palmer ◽  
L. Feng ◽  
H. Bösch ◽  
R. Parker ◽  
...  

Abstract. We use the GEOS-Chem global 3-D atmospheric chemistry transport model to interpret XCH4:XCO2 column ratios retrieved using a proxy method from the Japanese Greenhouse gases Observing SATellite (GOSAT). The advantage of these data over CO2 and CH4 columns retrieved independently using a full-physics optimal estimation algorithm is that they suffer less from scattering-related regional bias. We show the model is able to reproduce observed global and regional spatial (mean bias = 0.7%) and temporal variations (global r2 = 0.92) of this ratio, with model bias <2.5%. We also show these variations are driven by emissions of CO2 and CH4 that are typically six months out of phase, which may reduce the sensitivity of the ratio to changes in either gas. To simultaneously estimate fluxes of CO2 and CH4 we use a formal Bayesian inverse model infrastructure. We use two approaches to independently resolve flux estimates of these two gases using GOSAT observations of XCH4:XCO2: (1) using the a priori error covariance between CO2 and CH4 to describe their common source from biomass burning; and (2) additionally fitting independent surface atmospheric measurements of CH4 and CO2 mole fraction, which provide additional constraints and improve the effectiveness of the observed GOSAT ratio in constraining fluxes. We demonstrate the impact of these two approaches using Observing System Simulation Experiments. A posteriori flux estimates inferred using only the GOSAT ratios, taking advantage of the error covariance due to biomass burning, are not consistent with the true fluxes in our experiments, as the inversion system cannot judge which species' fluxes to adjust. This can result in a posteriori fluxes that are further from the truth than the a priori fluxes. We find that adding the surface data to the inversion dramatically improves the ability of the GOSAT ratios to infer both CH4 and CO2 fluxes.
We show that using real GOSAT XCH4:XCO2 ratios together with the surface data during 2010 outperforms inversions using the individual XCH4 or the full-physics XCO2 data products. The regional fluxes that show the greatest improvements have model-minus-observation differences with a large seasonal cycle, such as Tropical South America, for which we report a small but significant annual source of CO2 compared with a small annual sink inferred from the XCO2 data. Based on our analysis, we argue that with the ratios we may be reaching the limits of the precision of these data.
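The point that a ratio observation alone under-determines the two species, while adding a direct surface constraint resolves both, can be reproduced with a toy linear Gaussian update (the standard form x̂ = xa + BHᵀ(HBHᵀ + R)⁻¹(y − Hxa)). All numbers below are illustrative, not GOSAT values, and the "ratio" is linearised as a simple difference of the two flux scalings:

```python
import numpy as np

# Toy linear Gaussian update: x_hat = x_a + B H^T (H B H^T + R)^-1 (y - H x_a).
# State vector: hypothetical [CO2, CH4] flux scalings. The "ratio"
# observation is linearised as CH4 - CO2, constraining only that combination.

def update(x_a, B, H, R, y):
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)  # gain matrix
    return x_a + K @ (y - H @ x_a)

x_true = np.array([2.0, 1.0])                 # "true" fluxes to recover
x_a = np.array([1.0, 1.0])                    # a priori fluxes
B = np.array([[0.5, 0.2],                     # a priori covariance with a
              [0.2, 0.5]])                    # biomass-burning correlation

# (1) ratio-like observation alone: under-determined, both species shift
H_ratio = np.array([[-1.0, 1.0]])
x_ratio_only = update(x_a, B, H_ratio, np.array([[0.01]]), H_ratio @ x_true)

# (2) adding a direct surface CH4 constraint resolves both species
H_both = np.array([[-1.0, 1.0], [0.0, 1.0]])
x_both = update(x_a, B, H_both, np.diag([0.01, 0.01]), H_both @ x_true)

print("ratio only   :", x_ratio_only.round(3))
print("ratio + surf :", x_both.round(3))
```

In this sketch the ratio-only posterior pushes the CH4 flux away from a prior that was already correct, mirroring the abstract's observation that a posteriori fluxes can end up further from the truth than the a priori; the two-observation case recovers both fluxes closely.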


2015 ◽  
Vol 282 (1798) ◽  
pp. 20141013 ◽  
Author(s):  
Rachel C. M. Warnock ◽  
James F. Parham ◽  
Walter G. Joyce ◽  
Tyler R. Lyson ◽  
Philip C. J. Donoghue

Calibration is the rate-determining step in every molecular clock analysis and, hence, considerable effort has been expended in the development of approaches to distinguish good from bad calibrations. These can be categorized into a priori evaluation of the intrinsic fossil evidence, and a posteriori evaluation of congruence through cross-validation. We contrasted these competing approaches and explored the impact of different interpretations of the fossil evidence upon Bayesian divergence time estimation. The results demonstrate that a posteriori approaches can lead to the selection of erroneous calibrations. Bayesian posterior estimates are also shown to be extremely sensitive to the probabilistic interpretation of temporal constraints. Furthermore, the effective time priors implemented within an analysis differ for individual calibrations when employed alone and in differing combinations with others. This compromises the implicit assumption of all calibration consistency methods: that the impact of an individual calibration is the same when used alone or in unison with others. Thus, the most effective means of establishing the quality of fossil-based calibrations is through a priori evaluation of the intrinsic palaeontological, stratigraphic, geochronological and phylogenetic data. However, effort expended in establishing calibrations will not be rewarded unless they are implemented faithfully in divergence time analyses.
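The claim that a calibration's effective prior changes when combined with others can be seen with a toy rejection sampler: conditioning a uniform ancestor calibration on the ancestor being older than a calibrated descendant shifts the ancestor's effective mean away from the calibration density used alone. All age bounds here are hypothetical:

```python
import random

# Toy illustration of interacting calibrations: an ancestor node must be
# older than its calibrated descendant, so the ancestor's *effective*
# prior under the joint constraint differs from its own calibration
# density used alone. All age bounds are hypothetical.

random.seed(42)

def sample_joint(n, desc_lo=10.0, desc_hi=50.0, anc_lo=20.0, anc_hi=60.0):
    """Rejection-sample (descendant, ancestor) ages with the ancestor older."""
    pairs = []
    while len(pairs) < n:
        d = random.uniform(desc_lo, desc_hi)   # descendant calibration
        a = random.uniform(anc_lo, anc_hi)     # ancestor calibration
        if a > d:                              # topological constraint
            pairs.append((d, a))
    return pairs

pairs = sample_joint(50_000)
anc_joint = sum(a for _, a in pairs) / len(pairs)
anc_alone = (20.0 + 60.0) / 2  # mean of the ancestor calibration used alone

print(f"ancestor mean age, calibration alone:     {anc_alone:.1f}")
print(f"ancestor mean age, joint with descendant: {anc_joint:.1f}")
```

The joint mean sits noticeably above the stand-alone mean because the topological constraint discards young-ancestor samples, which is the mechanism behind the inconsistency the paragraph describes.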


BMJ Open ◽  
2019 ◽  
Vol 9 (3) ◽  
pp. e019186 ◽  
Author(s):  
Aurore Berthe-Aucejo ◽  
Phuong Khanh Hoang Nguyen ◽  
François Angoulvant ◽  
Xavier Bellettre ◽  
Patrick Albaret ◽  
...  

Background and objective: Pediatrics: Omission of Prescription and Inappropriate prescription (POPI) is the first detection tool for potentially inappropriate medicines (PIMs) and potential prescribing omissions (PPOs) in paediatrics. The aim of this study was to evaluate the prevalence of PIMs and PPOs detected by POPI in prescriptions in hospital and for outpatients. The secondary objective was to determine the risk factors related to PIMs and PPOs. Design: A retrospective, descriptive study was conducted in an emergency department (ED) and a community pharmacy (CP) over 6 months. POPI was used to identify PIMs and PPOs. Setting: Robert-Debré Hospital (France) and Albaret community pharmacy (Seine-et-Marne). Participants: Patients who were under 18 years old and had one or more drugs prescribed were included. Exclusion criteria were inaccessible medical records for patients seen in the ED and prescriptions without drugs for outpatients. Primary and secondary outcome measures: PIM and PPO rates and risk factors. Results: At the ED, 18 562 prescriptions of 15 973 patients were analysed, and at the CP, 4780 prescriptions of 2225 patients. The PIM and PPO rates were, respectively, 2.9% and 2.3% at the ED and 12.3% and 6.1% at the CP. Respiratory and digestive diseases had the highest rates of PIMs. Conclusion: This is the first study to assess the prevalence of PIMs and PPOs detected by POPI in a paediatric population, within both a hospital and a community pharmacy. POPI could be used to improve drug use and patient care and to limit hospitalisation and adverse drug reactions. A prospective multicentre study should be conducted to evaluate the impact and benefit of implementing POPI in clinical practice.


BMJ Open ◽  
2019 ◽  
Vol 9 (7) ◽  
pp. e030263 ◽  
Author(s):  
Fredrikke Tove Birgitta Dam Larsen ◽  
Christian Thomas Brandt ◽  
Lykke Larsen ◽  
Vibeke Klastrup ◽  
Lothar Wiese ◽  
...  

Objective: To examine predefined risk factors and outcome of seizures in community-acquired bacterial meningitis (CABM). Design: Observational cohort studies. Setting: Denmark. Participants: In the derivation cohort, we retrospectively included all adults (>15 years of age) with CABM in the North Denmark Region from 1998 to 2014 and at Hvidovre and Hillerød hospitals from 2003 to 2014. In the validation cohort, we prospectively included all adults (>18 years of age) with CABM treated at all departments of infectious diseases in Denmark from 2015 to 2017. Primary and secondary outcome measures: In the derivation cohort, we used modified Poisson regression to compute adjusted relative risks (RRs) with 95% confidence intervals for predefined risk factors for seizures during CABM, as well as for the risks of death and unfavourable outcome assessed by the Glasgow Outcome Scale score (1-4). Results were then validated in the validation cohort. Results: In the derivation cohort (n=358), risk factors for seizures at any time were pneumococcal aetiology (RR 1.69, 1.01–2.83) and abnormal cranial imaging (RR 2.27, 1.46–3.53), while the impact of age >65 years and immunocompromise was more uncertain. For seizures occurring after admission, risk factors were abnormal cranial imaging (RR 2.23, 1.40–3.54) and immunocompromise (RR 1.59, 1.01–2.50). Seizures at any time were associated with increased risks of in-hospital mortality (RR 1.45, 1.01–2.09) and unfavourable outcome at discharge (RR 1.27, 1.02–1.60). In the validation cohort (n=379), pneumococcal aetiology (RR 1.69, 1.10–2.59) and abnormal cranial imaging (RR 1.68, 1.09–2.59) were confirmed as risk factors for seizures at any time. For seizures occurring after admission, only pneumococcal meningitis (RR 1.92, 1.12–3.29) remained significant.
Seizures at any time were also associated with in-hospital mortality (RR 3.26, 1.83–5.80) and unfavourable outcome (RR 1.23, 1.00–1.52) in this cohort. Conclusions: Pneumococcal aetiology, immunocompromise and abnormal cranial imaging were risk factors for seizures in CABM. Seizures were strongly associated with mortality and unfavourable outcome.

