Percutaneous versus Cut-Down Technique for Indwelling Port Placement

2017, Vol 83 (12), pp. 1336-1342
Author(s): Mario Matiotti-Neto, Mariam F. Eskander, Omidreza Tabatabaie, Gyulnara Kasumova, Lindsay A. Bliss, ...

The superiority of surgical cut-down of the cephalic vein versus percutaneous catheterization of the subclavian vein for the insertion of totally implantable venous access devices (TIVADs) is debated. The aim of this study was to compare the safety and efficacy of surgical cut-down versus percutaneous placement of TIVADs. This is a single-institution retrospective cohort study of oncologic patients who had TIVADs implanted by 14 surgeons. Primary outcomes were inability to place the TIVAD by the primary approach and postoperative complications within 30 days; secondary outcomes included operative time. Multivariate analysis was performed by logistic regression. Two hundred and forty-seven (55.9%) percutaneous and 195 (44.1%) cephalic cut-down patients were identified. The 30-day complication rate was 5.2 per cent: 14 patients (5.7%) in the percutaneous and nine (4.6%) in the cut-down group. Technique was not a significant predictor of a 30-day complication (odds ratio = 0.820; 95% confidence interval 0.342–1.879). Implantation failure occurred in 16 percutaneous patients (6.5%) and 28 cut-down patients (14.4%) (adjusted odds ratio for cut-down vs percutaneous = 2.387; 95% confidence interval 1.275–4.606). The median operative time for percutaneous patients was 46 minutes (interquartile range = 35, 59) versus 37.5 minutes (interquartile range = 30, 49) for cut-down patients (P < 0.0001). Both the percutaneous and cut-down techniques are safe and effective for TIVAD implantation. Operative times were shorter and the odds of implantation failure higher for cephalic cut-down. As implantation failure is common, surgeons should be familiar with both techniques.
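As a quick check on the direction of effect, the unadjusted odds ratio for implantation failure can be recomputed from the counts reported in the abstract. A minimal sketch (the `odds_ratio` helper is illustrative; the abstract's 2.387 is a multivariate-adjusted estimate, so the unadjusted figure differs slightly):

```python
def odds_ratio(a, b, c, d):
    """OR for exposure: a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    return (a * d) / (b * c)

# Cut-down: 28 failures out of 195; percutaneous: 16 failures out of 247.
or_cutdown = odds_ratio(28, 195 - 28, 16, 247 - 16)
print(round(or_cutdown, 2))  # unadjusted; close to the adjusted 2.387
```

The agreement between the crude and adjusted estimates supports the reported association of cut-down with higher implantation failure.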

2017, Vol 13 (1), pp. 45-52
Author(s): Mark M. Mitsnefes, Aisha Betoko, Michael F. Schneider, Isidro B. Salusky, Myles Selig Wolf, ...

Background and objectives: High plasma concentration of fibroblast growth factor 23 (FGF23) is a risk factor for left ventricular hypertrophy (LVH) in adults with CKD and induces myocardial hypertrophy in experimental CKD. We hypothesized that high FGF23 levels are associated with a higher prevalence of LVH in children with CKD. Design, setting, participants, & measurements: We performed echocardiograms and measured plasma C-terminal FGF23 concentrations in 587 children with mild-to-moderate CKD enrolled in the Chronic Kidney Disease in Children (CKiD) study. We used linear and logistic regression to analyze the association of plasma FGF23 with left ventricular mass index (LVMI) and LVH (LVMI ≥95th percentile), adjusted for demographics, body mass index, eGFR, and CKD-specific factors. We also examined the relationship between FGF23 and LVH by eGFR level. Results: Median age was 12 years (interquartile range, 8–15) and median eGFR was 50 ml/min per 1.73 m² (interquartile range, 38–64). Overall prevalence of LVH was 11%. After adjustment for demographics and body mass index, the odds of LVH were 2.53-fold higher (95% confidence interval, 1.28 to 4.97; P<0.01) in participants with FGF23 concentrations ≥170 RU/ml compared with those with FGF23 <100 RU/ml, but this association was attenuated after full adjustment. Among participants with eGFR ≥45 ml/min per 1.73 m², the prevalence of LVH was 5.4%, 11.2%, and 15.3% for those with FGF23 <100 RU/ml, 100–169 RU/ml, and ≥170 RU/ml, respectively (P for trend=0.01). When eGFR was ≥45 ml/min per 1.73 m², higher FGF23 concentrations were independently associated with LVH (fully adjusted odds ratio, 3.08 in the highest versus lowest FGF23 category; 95% confidence interval, 1.02 to 9.24; P<0.05; fully adjusted odds ratio, 2.02 per doubling of FGF23; 95% confidence interval, 1.29 to 3.17; P<0.01). By contrast, in participants with eGFR <45 ml/min per 1.73 m², FGF23 was not associated with LVH. Conclusions: Plasma FGF23 concentration ≥170 RU/ml is an independent predictor of LVH in children with eGFR ≥45 ml/min per 1.73 m².
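An "odds ratio per doubling" of a biomarker corresponds to a logistic-regression coefficient fitted on the log2-transformed predictor. A small arithmetic sketch of how the reported 2.02 estimate compounds (illustrative only; it uses no study data beyond that figure):

```python
import math

# OR per doubling = exp(beta), where beta is the coefficient on log2(FGF23).
or_per_doubling = 2.02            # reported fully adjusted estimate
beta = math.log(or_per_doubling)  # implied coefficient on log2(FGF23)

# A 4-fold (two-doubling) FGF23 difference multiplies the odds by exp(2*beta),
# which is simply the per-doubling odds ratio squared.
or_fourfold = math.exp(2 * beta)
print(round(or_fourfold, 2))
```

This multiplicative behavior is why log-transformed biomarkers are commonly reported "per doubling" rather than per unit.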


2021, Vol 16 (4), pp. 514-521
Author(s): Alexandre Karras, Marine Livrozet, Hélène Lazareth, Nicolas Benichou, Jean-Sébastien Hulot, ...

Background and objectives: Kidney involvement is frequent among patients with coronavirus disease 2019 (COVID-19), and occurrence of AKI is associated with higher mortality in this population. The objective of this study was to describe the occurrence and significance of proteinuria in this setting. Design, setting, participants, & measurements: We conducted a single-center retrospective study to describe the characteristics of proteinuria measured within 48 hours of admission among patients with COVID-19 admitted to a tertiary care hospital in France, and to evaluate its association with initiation of dialysis, intensive care unit admission, and death. Results: Among 200 patients with available data, urine protein-creatinine ratio at admission was ≥1 g/g for 84 (42%), although kidney function was normal in most patients, with a median serum creatinine of 0.94 mg/dl (interquartile range, 0.75–1.21). Median urine albumin-creatinine ratio was 110 mg/g (interquartile range, 50–410), with a urine albumin-protein ratio <50% in 92% of patients. Urine retinol binding protein concentrations, available for 85 patients, were ≥0.03 mg/mmol in 62% of patients. A urine protein-creatinine ratio ≥1 g/g was associated with initiation of dialysis (odds ratio, 4.87; 95% confidence interval, 2.03 to 13.0; P<0.001), admission to the intensive care unit (odds ratio, 3.55; 95% confidence interval, 1.93 to 6.71; P<0.001), and death (odds ratio, 3.56; 95% confidence interval, 1.90 to 6.54; P<0.001). Conclusions: Proteinuria is very frequent among patients admitted for COVID-19 and may precede AKI. Low levels of albuminuria suggest a predominantly tubular origin, consistent with the elevated levels of urine retinol binding protein. A urine protein-creatinine ratio ≥1 g/g at admission is strongly associated with poor kidney and patient outcomes.
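The urine albumin-to-protein ratio used above to infer a tubular origin is the fraction of total urinary protein that is albumin, with both measures indexed to creatinine. A minimal sketch with illustrative values taken from the reported medians (the `uapr` helper is not from the study):

```python
def uapr(uacr_mg_per_g, upcr_g_per_g):
    """Urine albumin-to-protein ratio: fraction of urinary protein that is
    albumin, with both ratios indexed to urine creatinine."""
    return uacr_mg_per_g / (upcr_g_per_g * 1000.0)  # convert g/g to mg/g

# Reported medians: UACR 110 mg/g; a UPCR of 1 g/g is the study's cutoff.
ratio = uapr(110, 1.0)
print(f"{ratio:.0%}")  # well below 50%, i.e. a predominantly tubular pattern
```

A ratio below 50% indicates that most urinary protein is non-albumin (e.g. low-molecular-weight proteins such as retinol binding protein), pointing away from glomerular injury.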


Author(s): Ayelet Grupper, Nechama Sharon, Talya Finn, Regev Cohen, Meital Israel, ...

Background and objectives: Coronavirus disease 2019 (COVID-19) is associated with higher morbidity and mortality in patients on maintenance hemodialysis. Patients on dialysis tend to have a reduced immune response to infection or vaccination. We aimed to assess, for the first time to the best of our knowledge, the humoral response following vaccination with the BNT162b2 vaccine in patients on maintenance hemodialysis and the factors associated with it. Design, setting, participants, & measurements: The study included 56 patients on maintenance hemodialysis (dialysis group) and a control group of 95 health care workers. All participants had received two doses of the BNT162b2 (Pfizer-BioNTech) vaccine. Serology testing was done using the Abbott Quant II IgG anti-Spike severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) assay a median of 30 days after receipt of the second dose of the vaccine. Results: All subjects in the control group developed an antibody response, compared with 96% (54 of 56) positive responders in the dialysis group. IgG levels in the dialysis group (median, 2900; interquartile range, 1128–5651) were significantly lower than in the control group (median, 7401; interquartile range, 3687–15,471); a Mann–Whitney U test confirmed that this difference was statistically significant (U=1238; P<0.001). There was a significant inverse correlation between age and IgG levels in both groups. The odds of being in the lower quartile were significantly higher for older individuals (odds ratio, 1.11 per year of age; 95% confidence interval, 1.08 to 1.20; P=0.004) and for the dialysis group compared with the control group (odds ratio, 2.7; 95% confidence interval, 1.13 to 7.51; P=0.05). Within the dialysis group, older age and lower lymphocyte count were associated with an antibody response in the lower quartile (odds ratio, 1.22 per year of age; 95% confidence interval, 1.13 to 1.68; P=0.03, and odds ratio, 0.83 per 10³/µl-higher lymphocyte count; 95% confidence interval, 0.58 to 0.97; P=0.05). Conclusions: Although most patients on maintenance hemodialysis developed a substantial humoral response following the BNT162b2 vaccine, it was significantly lower than in controls. Age was an important factor in the humoral response, regardless of chronic medical conditions.


2017, Vol 14 (2), pp. 139-143
Author(s): Anna Duda-Sobczak, Aleksandra Araszkiewicz, Magdalena Urbas, Lukasz Borucki, Katarzyna Kulas, ...

Introduction: Olfactory dysfunction is suggested to be a clinical manifestation of central diabetic neuropathy. The aim of the study was to assess olfactory function in adult patients with type 1 diabetes. Materials and methods: A total of 106 patients with type 1 diabetes and 30 healthy subjects were included in the study. We evaluated the metabolic control of diabetes and the presence of chronic complications. Olfactory function was assessed with Sniffin' Sticks. Results: We found a negative correlation between olfactory identification scores and body mass index (Rs = −0.2; p = 0.04) and triglycerides (Rs = −0.2; p = 0.04). We showed lower olfactory identification scores in the neuropathy group versus the non-neuropathy group [8 (interquartile range, 7–9) vs 10 (interquartile range, 9–11) points; p = 0.005]. In multivariate linear regression, impaired olfaction was independently associated with neuropathy (beta, −0.3; p = 0.005). In multivariate logistic regression, diabetes duration (odds ratio, 1.06; 95% confidence interval, 1.00–1.11; p = 0.04) and olfactory identification score (odds ratio, 0.61; 95% confidence interval, 0.43–0.85; p = 0.003) were independently associated with neuropathy. Conclusion: Olfactory dysfunction is observed in patients with type 1 diabetes and diabetic peripheral neuropathy.


2017, Vol 22 (1), pp. 44-50
Author(s): Immad Sadiq, Samuel Z. Goldhaber, Ping-Yu Liu, Gregory Piazza

Ultrasound-facilitated, catheter-directed, low-dose fibrinolysis minimizes the risk of intracranial bleeding compared with systemic full-dose fibrinolytic therapy for pulmonary embolism (PE). However, major bleeding is nevertheless a potential complication. We analyzed the 150-patient SEATTLE II trial of submassive and massive PE to describe patients who suffered major bleeding events following ultrasound-facilitated, catheter-directed, low-dose fibrinolysis and to identify risk factors for bleeding. Major bleeding was defined as GUSTO severe/life-threatening or moderate bleeds within 72 hours of initiation of the procedure. Of the 15 patients with major bleeding, four (26.6%) developed access site-related bleeding. Multiple venous access attempts were more frequent in the major bleeding group (27.6% vs 3.6%; p<0.001). All patients with major bleeding had femoral vein access for device delivery. Patients who developed major bleeding had a longer intensive care unit stay (6.8 days vs 4.7 days; p=0.004) and longer hospital stay (12.9 days vs 8.4 days; p=0.004). The frequency of inferior vena cava filter placement was 40% in patients with major bleeding compared with 13% in those without major bleeding (p=0.02). Massive PE (adjusted odds ratio 3.6; 95% confidence interval 1.01–12.9; p=0.049) and multiple venous access attempts (adjusted odds ratio 10.09; 95% confidence interval 1.98–51.46; p=0.005) were independently associated with an increased risk of major bleeding. In conclusion, strategies for improving venous access should be implemented to reduce the risk of major bleeding associated with ultrasound-facilitated, catheter-directed, low-dose fibrinolysis. ClinicalTrials.gov Identifier: NCT01513759. Sponsor: EKOS Corporation.


2016, Vol 4, pp. 205031211562643
Author(s): Mariecel Pilapil, Lee Morris, Kohta Saito, Francine Kouya, Vivian Maku, ...

Objectives: Young women are more likely to be infected with HIV globally, in sub-Saharan Africa, and in Cameroon. Despite its clear clinical and public health benefits, condom use among HIV-infected women continues to be low. The objective of this study was to describe the prevalence of inconsistent condom use among HIV-infected women in Cameroon and the factors associated with it. Methods: We conducted a cross-sectional study of HIV-infected young women aged 17–26 years from three semi-urban HIV clinics in the Northwest Region of Cameroon. This study was a subgroup analysis of a previously reported study on inconsistent condom use in HIV-infected and -uninfected youth. Inconsistent condom use was defined as reporting “sometimes” or “never” to questions regarding frequency of condom use. Logistic regression modeling was used to determine factors associated with inconsistent condom use. Results: A total of 84 participants were recruited and submitted completed questionnaires for analysis. Median age was 24 years (interquartile range = 22–25) and the median age at HIV diagnosis was 21 years (interquartile range = 20–23). Fifty percent of the participants reported no prior schooling or only primary school education. Overall, 61/84 (73%) reported inconsistent condom use. After adjusting for potential confounders, education to the secondary school level was protective against inconsistent condom use (odds ratio = 0.19; confidence interval: 0.04–0.95), and having ≥2 pregnancies was associated with inconsistent condom use (odds ratio = 7.52; confidence interval: 1.67–34.00). Conclusion: There is a high prevalence of inconsistent condom use among young HIV-infected women in Cameroon, which appears to be associated with lower levels of educational attainment and higher parity. Further larger studies assessing the factors associated with poor condom use in this population are warranted and may inform public health policy in resource-limited settings with high HIV prevalence.
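The confidence intervals quoted above are typically Wald-type intervals computed on the log-odds scale. A generic sketch of the calculation for a 2x2 table (the counts below are hypothetical; the study reports only the adjusted estimates, so these numbers are not the study's data):

```python
import math

def or_with_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Wald 95% CI from a 2x2 table:
    SE(log OR) = sqrt(1/a + 1/b + 1/c + 1/d); CI = exp(log OR +/- z*SE)."""
    log_or = math.log((a * d) / (b * c))
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    return (math.exp(log_or),
            math.exp(log_or - z * se),
            math.exp(log_or + z * se))

est, lo, hi = or_with_ci(20, 5, 10, 15)  # hypothetical exposure/outcome counts
print(round(est, 2), round(lo, 2), round(hi, 2))
```

Small cell counts produce wide intervals on the ratio scale, which is why the study's ≥2-pregnancies estimate (OR 7.52; CI 1.67–34.00) spans such a broad range.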


2021, pp. 112972982110268
Author(s): Maria Giuseppina Annetta, Matt Ostroff, Bruno Marche, Alessandro Emoli, Andrea Musarò, ...

Background: Chest-to-arm (CTA) tunneling has been described recently as a technique that allows an optimal exit site at mid-arm even in chronically ill patients with complex clinical issues and challenging problems of vascular access. Method: We adopted CTA tunneling in oncologic and in non-oncologic patients, in totally implanted and in external devices, for both medium and long-term intravenous treatments. We report our experience with 60 cases of CTA tunneling: 19 patients requiring a totally implantable device, who had bilateral contraindication to venous access at the arm and bilateral contraindication to placement of the pocket in the infra-clavicular area; 41 patients requiring an external central venous catheter, who had bilateral contraindication to insertion of peripherally inserted central catheters or femoral catheters, as well as contraindication to an exit site in the infraclavicular area. All venous access devices were inserted with ultrasound guidance and tip location by intracavitary electrocardiography, under local anesthesia. Results: There were no immediate or early complications. Patients with CTA-ports had no late complications. In patients with CTA-tunneled external catheters, there were two dislodgments, four episodes of central line associated blood stream infections, and one local infection. There were no episodes of venous thrombosis or catheter malfunction. Conclusion: Our experience suggests that CTA tunneling is a safe maneuver, with very low risk of complications, and should be considered as an option in patients with complex venous access.


Author(s): M.A. Gregory, G.P. Hadley

The insertion of implanted venous access systems in children undergoing prolonged courses of chemotherapy has become a common procedure in pediatric surgical oncology. Although not intended to be permanent, the devices are expected to remain functional until cure of the primary disease is assured. Despite careful patient selection and standardised insertion and access techniques, some devices fail. The most commonly encountered problems are colonisation of the device with bacteria and catheter occlusion. Both of these difficulties relate to the development of a biofilm within the port and catheter. The morphology and evolution of biofilms in indwelling vascular catheters is the subject of ongoing investigation. To date, however, such investigations have been confined to the examination of fragments of biofilm scraped or sonicated from sections of catheter. This report describes a novel method for the extraction of intact biofilms from indwelling catheters. Fifteen children with Wilms' tumour who had received venous implants were studied. Catheters were removed because of infection (n=6) or electively at the end of chemotherapy.

