The Impact of Hospital Safety-Net Status on Inpatient Outcomes for Brain Tumor Craniotomy: A 10-Year Nationwide Analysis

Author(s):  
Oliver Y Tang ◽  
Krissia M Rivera Perla ◽  
Rachel K Lim ◽  
Robert J Weil ◽  
Steven A Toms

Abstract Background Outcome disparities have been documented at safety-net hospitals (SNHs), which disproportionately serve vulnerable patient populations. Using a nationwide retrospective cohort, we assessed inpatient outcomes following brain tumor craniotomy at SNHs in the United States. Methods We identified all craniotomy procedures in the National Inpatient Sample from 2002 to 2011 for four brain tumor types: glioma, metastasis, meningioma, and vestibular schwannoma. Safety-net burden was calculated as the number of Medicaid plus uninsured admissions divided by total admissions. Hospitals in the top quartile of burden were defined as SNHs. The associations between SNH status and in-hospital mortality, discharge disposition, complications, hospital-acquired conditions (HACs), length of stay (LOS), and costs were assessed. Multivariate regression adjusted for patient, hospital, and severity characteristics. Results 304,719 admissions were analyzed. The most common subtype was glioma (43.8%). Of 1,206 unique hospitals, 242 were SNHs. SNH admissions were more likely to be non-white (P<0.001) and low-income (P<0.001), and to have higher severity scores (P=0.034). Mortality rates were higher at SNHs for metastasis admissions (odds ratio [OR]=1.48, P=0.025), and SNHs had higher complication rates for meningioma (OR=1.34, P=0.003) and all tumor types combined (OR=1.17, P=0.034). However, there were no differences at SNHs in discharge disposition or HACs. LOS and hospital costs were elevated at SNHs for all subtypes, culminating in a 10% increase in LOS and a 9% increase in costs for the overall population (all P<0.001). Conclusions SNHs demonstrated poorer inpatient outcomes for brain tumor craniotomy. Further analyses of the differences observed and of potential interventions to ameliorate interhospital disparities are warranted.
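As an illustration of the exposure definition above, here is a minimal sketch of computing per-hospital safety-net burden and flagging top-quartile hospitals as SNHs; the table and column names are hypothetical stand-ins, not the actual NIS schema.

```python
import pandas as pd

# Hypothetical admissions table: one row per admission, with a hospital
# identifier and the primary payer (column names are illustrative).
admissions = pd.DataFrame({
    "hospital_id": [1, 1, 1, 2, 2, 3, 3, 3, 3],
    "payer": ["Medicaid", "Private", "Uninsured",
              "Private", "Private",
              "Medicaid", "Medicaid", "Uninsured", "Private"],
})

# Safety-net burden = (Medicaid + uninsured admissions) / total admissions.
is_vulnerable_payer = admissions["payer"].isin(["Medicaid", "Uninsured"])
burden = is_vulnerable_payer.groupby(admissions["hospital_id"]).mean()

# SNHs = hospitals in the top quartile of safety-net burden.
snh = burden >= burden.quantile(0.75)
print(pd.DataFrame({"burden": burden, "is_snh": snh}))
```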

2014 ◽  
Vol 84 (5-6) ◽  
pp. 244-251 ◽  
Author(s):  
Robert J. Karp ◽  
Gary Wong ◽  
Marguerite Orsi

Abstract. Introduction: Foods dense in micronutrients are generally more expensive than those with higher energy content. These cost differentials may put low-income families at risk of diminished micronutrient intake. Objectives: We sought to determine differences in the cost for iron, folate, and choline in foods available for purchase in a low-income community when assessed by energy content and by serving size. Methods: Sixty-nine foods listed in the menu plans provided by the United States Department of Agriculture (USDA) for low-income families were considered, in 10 domains. The cost-for-energy and cost-per-serving of these foods were determined for each of the three micronutrients. Exact Kruskal-Wallis tests were used for comparisons of energy costs; Spearman rho tests for correlations with micronutrient content. Ninety families were interviewed in a pediatric clinic to assess the impact of food cost on food selection. Results: Significant differences between domains were shown for energy density with both cost-for-energy (p < 0.001) and cost-per-serving (p < 0.05) comparisons. All three micronutrient contents were significantly correlated with cost-for-energy (p < 0.01). Both iron and choline contents were significantly correlated with cost-per-serving (p < 0.05). Of the 90 families, 38 (42%) worried about food costs; 40 (44%) had chosen foods of high caloric density in response to that fear; and 29 of those 40 families reported both the worry and the food selection. Conclusion: Adjustments to USDA meal plans using cost-for-energy analysis showed differentials for both energy and micronutrients. These differentials were reduced using cost-per-serving analysis, but were not eliminated. A substantial proportion of low-income families are vulnerable to micronutrient deficiencies.
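To make the two cost metrics concrete, the sketch below computes a cost-for-energy measure (here dollars per 1,000 kcal, one plausible operationalization) and cost-per-serving on made-up values, then applies the named tests. Note that scipy's Kruskal-Wallis implementation is asymptotic, so it only approximates the exact tests the authors used.

```python
import pandas as pd
from scipy import stats

# Hypothetical food records: price (USD), energy (kcal), servings per
# package, and iron content (mg), grouped into food domains.
foods = pd.DataFrame({
    "domain":    ["grains", "grains", "dairy", "dairy", "produce", "produce"],
    "price_usd": [1.20, 0.90, 2.50, 3.10, 2.00, 1.50],
    "kcal":      [400, 350, 150, 200, 80, 60],
    "servings":  [4, 4, 2, 2, 2, 2],
    "iron_mg":   [3.0, 2.5, 0.2, 0.3, 1.1, 0.8],
})

# The two cost metrics used in the study.
foods["cost_per_1000_kcal"] = foods["price_usd"] / foods["kcal"] * 1000
foods["cost_per_serving"] = foods["price_usd"] / foods["servings"]

# Kruskal-Wallis test of energy cost across domains (asymptotic version).
groups = [g["cost_per_1000_kcal"].values for _, g in foods.groupby("domain")]
h_stat, p_kw = stats.kruskal(*groups)

# Spearman correlation of micronutrient content with cost-for-energy.
rho, p_sp = stats.spearmanr(foods["iron_mg"], foods["cost_per_1000_kcal"])
print(f"Kruskal-Wallis p={p_kw:.3f}; Spearman rho={rho:.2f} (p={p_sp:.3f})")
```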


2021 ◽  
Author(s):  
Rajeev S. Ramchandran ◽  
Reza Yousefi-Nooraie ◽  
Porooshat Dadgostar ◽  
Sule Yilmaz ◽  
Jesica Basant ◽  
...  

BACKGROUND Store-and-forward, camera-based evaluation, or teleophthalmology, is considered an effective way to identify diabetic retinopathy, the leading cause of blindness in the United States, but uptake has been slow. OBJECTIVE Understanding the barriers to and facilitators of implementing teleophthalmology programs from those actively adopting, running, and sustaining such programs is important for widespread adoption. METHODS This qualitative study interviewed nurses and doctors at three urban, low-income, largely minority-serving safety-net primary care clinics in Rochester, NY, USA on implementing a teleophthalmology program, using questions informed by the Practical, Robust Implementation and Sustainability Model (PRISM) and the Consolidated Framework for Implementation Research (CFIR). RESULTS Primary care nurses operationalizing the program in their clinics saw increased work burden and a lack of self-efficacy as barriers. Interviewees identified a need for continuous training on the teleophthalmology process for nurses, doctors, and administrative staff through in-service sessions and peer-training by champions/super-users. Facilitators included the perceived convenience for the patient and a perceived educational advantage, as the program gave providers an opportunity to discuss the importance of eye care with patients. Interviewees also highlighted concerns about making and tracking referrals to ophthalmology due to challenges in care coordination. Financial aspects of the program (e.g., patient coverage and care provider reimbursement) were unclear to many staff, influencing adoption and sustainability. CONCLUSIONS Streamlining processes and workflows, training and assigning adequate staff, coordinating care effectively between primary care and eye care to improve follow-up, and ensuring financial viability can all support wider adoption of teleophthalmology.


Elements ◽  
2021 ◽  
Vol 16 (1) ◽  
pp. 39-52
Author(s):  
Charlie Power

The debate over the future direction of elementary and secondary education in the United States is fractious and contentious. Many of these disputes are rooted in concerns over disparities in financial circumstances and race. While the full extent of these gaps, along with the United States' mediocre educational performance relative to other industrialized nations, has been a subject of frequent research and heated debate, one crucial component of this divide has received comparatively little analysis: summer learning loss. This paper closely examines the published literature to assess the impact of summer learning loss. It argues that summer learning varies by socioeconomic status (SES), with low-income populations gradually falling behind over the years, a phenomenon that affects student achievement and helps explain the disparities that accumulate over a student's educational career. Finally, based on current evidence, this paper makes policy recommendations on how to change the current education system to better address summer's inherent inequities.


1975 ◽  
Vol 7 (1) ◽  
pp. 223-231 ◽  
Author(s):  
Ron Mittelhammer ◽  
Donald A. West

The USDA's Food Stamp Program (FSP) is a major item in the department's budget. In effect from 1939 to 1943 and revived as a pilot program in 1961, FSP has grown until, in 1973, it provided nearly $4 billion in food stamps to an average of 12 million persons per month. About 55 percent of the $4 billion is federal subsidy. The program is continuing to expand as a result of a congressional mandate that FSP be in effect nationwide after June 30, 1974. Because of the FSP's growth, questions are now being asked about the program's impact on demand for food in the United States. In its pre-World War II inception, FSP was developed as an alternative to direct distribution of commodities to relief families. Although the objective of improving food consumption among needy households was recognized, FSP was viewed primarily as a method for stimulating demand for farm products.


Stroke ◽  
2014 ◽  
Vol 45 (suppl_1) ◽  
Author(s):  
Cristine W Small ◽  
Donald L Price ◽  
Jeffrey D Ferguson ◽  
Lawrence I Madubeze ◽  
Susan D Freeman

Purpose: To determine whether the stroke alert process results in improved outcomes, as reflected in door-to-lytic times and other outcome measures. Introduction: The diagnosis and treatment of stroke is time-sensitive and should be inclusive of all seven D's in the "chain of survival": Detection, Dispatch, Delivery, Door, Data, Decision, and Drug (Adams, Stroke, 2007). Early stroke activation is part of "Delivery," which incorporates transport and management by Emergency Medical Services (EMS). Clinical suspicion of stroke by EMS resulted in a process of early activation labeled "Stroke Alert." This expedited the code stroke process upon arrival, preparing the hospital-based stroke team to provide immediate triage and evaluation. The goal was to improve clinical efficiency and possibly clinical outcomes. Methods:
• Implementation of a notification process from EMS to the ED (Stroke Alert)
• Incorporation of Stroke Response Team (SRT) nurses into the Stroke Alert process on January 22, 2011
• Retrospective review of an internal stroke database (January 22, 2011 to July 2013) for comparative analysis of admissions where a Stroke Alert was called versus those where none was called
• Evaluation of clinical outcomes directly related to the Stroke Alert process
Results: From January 22, 2011 to July 2013:
Stroke Alert called:
• 37 t-PA patients, of whom 14 (37.8%) met the 60-minute benchmark
• Average door-to-lytic time: 65 minutes
Stroke Alert NOT called:
• 35 t-PA patients, of whom 10 (28.6%) met the 60-minute benchmark
• Average door-to-lytic time: 79 minutes
Conclusions: The SRT meets the golden-hour stroke benchmark more frequently when a Stroke Alert is called to the SRT nurse. Future plans include review of stroke severity scores, length of stay (LOS), and discharge disposition to determine the impact a Stroke Alert may have on clinical outcomes.
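The benchmark percentages above follow directly from the reported counts; this minimal sketch reproduces them and, as an illustration only, compares the two proportions with Fisher's exact test (the abstract itself reports no significance test).

```python
from scipy.stats import fisher_exact

# Reported counts: patients meeting the 60-minute door-to-lytic benchmark.
alert_met, alert_total = 14, 37        # Stroke Alert called
no_alert_met, no_alert_total = 10, 35  # Stroke Alert not called

print(f"Alert called:  {alert_met / alert_total:.1%} met benchmark")      # 37.8%
print(f"No alert:      {no_alert_met / no_alert_total:.1%} met benchmark")  # 28.6%

# Illustrative comparison of the two proportions (not in the abstract).
table = [[alert_met, alert_total - alert_met],
         [no_alert_met, no_alert_total - no_alert_met]]
odds_ratio, p = fisher_exact(table)
print(f"OR={odds_ratio:.2f}, p={p:.3f}")
```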


2021 ◽  
Author(s):  
Jaime Lancaster

This thesis expands the literature on minimum and living wages by investigating local minimum wage ordinances and voluntary living wage programs. It is presented as three distinct papers; the first explores a county-wide minimum wage ordinance in New Mexico, USA, while papers 2 and 3 explore New Zealand's voluntary living wage program. In the United States, local minimum wage ordinances are growing in popularity, and research is emerging on their effects. Setting minimum wages at the local level is politically easier than enacting federal legislation, and local minimum wages may be better targeted to local economic conditions. In my first chapter, "Local Minimum Wage Laws and Labour Market Outcomes: Evidence from New Mexico," I use fixed effects and synthetic control analysis to uncover the effects of a local minimum wage law on the Albuquerque/Bernalillo region of New Mexico, with a focus on how provisions exempting tipped workers affect gains in earnings. My findings reveal that these provisions can lead to reductions in hourly wages for workers exempted from the minimum wage even when the labour market is not harmed overall. I find that the minimum wage ordinance did not reduce teen employment but that it increased the supply of teen labour, leading to a rise in the teen unemployment rate. The second and third papers address the voluntary living wage program in New Zealand. In the first quantitative work on New Zealand's living wage, I utilize data from Statistics New Zealand's Integrated Data Infrastructure (IDI) to explore several facets of the living wage experience for employers and employees. In the second paper, "The New Zealand Living Wage: Earnings, Labour Costs and Turnover," I investigate the characteristics of New Zealand living wage firms and use fixed effects to examine the impact of living wage certification on employment, worker earnings, and turnover. My results provide some evidence for increases in labour costs and worker earnings following certification, but this change is driven by small firms that employ few workers. I find no evidence of a reduction in turnover. In my final chapter, "Who Benefits from Living Wage Certification?" I investigate the distribution of benefits from the living wage based on an employee's pre-treatment earnings, time of hire, and whether or not they remained employed with the living wage firm. To do this, I utilize a worker-level panel dataset containing the full earnings history of all workers who were employed by a living wage or matched control firm between January 2014 and December 2015. I estimate models containing fixed effects for worker, firm, and month to compare patterns of earnings growth for workers hired before certification ('pre-hires') with those hired after certification ('joiners') and those who left their living wage job but remained in the workforce ('leavers'). I also estimate the impact of living wage employment on the earnings of low-income workers. I find that the financial benefit of the living wage accrues almost exclusively to workers hired after certification and to low-income workers. In addition, my analysis of the worker-level panel suggests that overall earnings growth in living wage firms lagged that in control firms over the observation period. This result is driven by relative declines in earnings for living wage workers in large firms and is attributed to the published living wage rate lagging behind wage growth in the relevant segments of the job market.
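As a rough sketch of the fixed effects approach described in the final chapter, the toy example below absorbs worker and month effects with dummy variables. All variable names are hypothetical rather than actual IDI fields; the thesis's full specification also includes firm fixed effects, and a real panel of this size would typically use an absorbing estimator rather than explicit dummies.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical worker-month panel: log earnings and a post-certification
# indicator for workers at living wage firms (illustrative values only).
panel = pd.DataFrame({
    "log_earnings": [7.20, 7.25, 7.60, 7.10, 7.15, 7.55,
                     7.40, 7.45, 7.50, 7.00, 7.05, 7.10],
    "living_wage":  [0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0],
    "worker":       ["a"] * 3 + ["b"] * 3 + ["c"] * 3 + ["d"] * 3,
    "month":        ["2014-01", "2014-02", "2014-03"] * 4,
})

# Worker and month fixed effects enter as categorical dummies; the
# coefficient on living_wage is the within-worker earnings effect.
fe_model = smf.ols(
    "log_earnings ~ living_wage + C(worker) + C(month)",
    data=panel,
).fit()
print(fe_model.params["living_wage"])
```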


2019 ◽  
Vol 13 ◽  
pp. 175346661984123 ◽  
Author(s):  
Mark R. Bowling ◽  
Erik E. Folch ◽  
Sandeep J. Khandhar ◽  
Jordan Kazakov ◽  
William S. Krimsky ◽  
...  

Background: Fiducial markers (FMs) help direct stereotactic body radiation therapy (SBRT) and localization for surgical resection in lung cancer management. We report the safety, accuracy, and practice patterns of FM placement utilizing electromagnetic navigation bronchoscopy (ENB). Methods: NAVIGATE is a global, prospective, multicenter, observational cohort study of ENB using the superDimension™ navigation system. This prospectively collected subgroup analysis presents the patient demographics, procedural characteristics, and 1-month outcomes in patients undergoing ENB-guided FM placement. Follow-up through 24 months is ongoing. Results: Two hundred fifty-eight patients from 21 centers in the United States were included. General anesthesia was used in 68.2%. Lesion location was confirmed by radial endobronchial ultrasound in 34.5% of procedures. The median ENB procedure time was 31.0 min. Concurrent lung lesion biopsy was conducted in 82.6% (213/258) of patients. A mean of 2.2 ± 1.7 FMs (median 1.0) was placed per patient, and 99.2% were accurately positioned based on subjective operator assessment. Follow-up imaging showed that 94.1% (239/254) of markers remained in place. The procedure-related pneumothorax rate was 5.4% (14/258) overall and 3.1% (8/258) for grade ⩾ 2 events on the Common Terminology Criteria for Adverse Events scale. The procedure-related grade ⩾ 4 respiratory failure rate was 1.6% (4/258). There were no bronchopulmonary hemorrhages. Conclusion: ENB is an accurate and versatile tool for placing FMs for SBRT and localization for surgical resection, with low complication rates. The ability to perform a biopsy safely in the same procedure can also increase efficiency. The impact of practice pattern variations on therapeutic effectiveness requires further study. Trial registration: ClinicalTrials.gov identifier: NCT02410837.
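The complication rates above are simple proportions; as an illustrative aside (the abstract reports point estimates only), the sketch below reproduces the overall pneumothorax rate and attaches a Wilson 95% confidence interval.

```python
from statsmodels.stats.proportion import proportion_confint

# Reported procedure-related pneumothorax: 14 of 258 patients.
events, n = 14, 258
rate = events / n  # 5.4%, as reported

# Wilson score interval; the abstract does not report one, so this is
# purely illustrative.
lo, hi = proportion_confint(events, n, method="wilson")
print(f"{rate:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```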


2019 ◽  
Vol 3 (Supplement_1) ◽  
pp. S64-S65
Author(s):  
Emma Aguila ◽  
Jaqueline L Angel ◽  
Kyriakos Markides

Abstract The United States and Mexico differ greatly in the organization and financing of their old-age welfare states. They also differ politically and organizationally in government response, at all levels, to the needs of low-income and frail citizens. While both countries are aging rapidly, Mexico faces more serious challenges in old-age support arising from a less developed old-age welfare state and economy. In Mexico, financial support and medical care for older low-income citizens are universal rights; however, limited fiscal resources for a large low-income population create inevitable competition between the old and the young. Although the United States has a more developed economy and well-developed Social Security and health care financing systems for the elderly, older Mexican-origin individuals in the U.S. do not necessarily benefit fully from these programs. These institutional and financial challenges of aging are compounded in both countries by longer life spans, smaller families, and changing gender roles and cultural norms. In this interdisciplinary panel, the authors of five papers deal with the following topics: (1) an analysis of old-age health and dependency conditions, the supply of aging and disability services, and related norms and policies, including the role of the government and the private sector; (2) a binational comparison of federal safety-net programs for low-income elderly in the U.S. and Mexico; (3) when strangers become family: the role of civil society in addressing the needs of aging populations; and (4) unmet needs for dementia care for Latinos in the Hispanic-EPESE.


Neurosurgery ◽  
2004 ◽  
Vol 54 (3) ◽  
pp. 553-565 ◽  
Author(s):  
Edward R. Smith ◽  
William E. Butler ◽  
Fred G. Barker

Abstract OBJECTIVE Large provider caseloads are associated with better patient outcomes after many complex surgical procedures. Mortality rates for pediatric brain tumor surgery in various practice settings have not been described. We used a national hospital discharge database to study the volume-outcome relationship for craniotomy performed for pediatric brain tumor resection, as well as trends toward centralization and specialization. METHODS We conducted a cross-sectional and longitudinal cohort study using Nationwide Inpatient Sample data for 1988 to 2000 (Agency for Healthcare Research and Quality, Rockville, MD). Multivariate analyses adjusted for age, sex, geographic region, admission type (emergency, urgent, or elective), tumor location, and malignancy. RESULTS We analyzed 4712 admissions (329 hospitals, 480 identified surgeons) for pediatric brain tumor craniotomy. The in-hospital mortality rate was 1.6% and decreased from 2.7% (in 1988–1990) to 1.2% (in 1997–2000) during the study period. On a per-patient basis, median annual caseloads were 11 for hospitals (range, 1–59 cases) and 6 for surgeons (range, 1–32 cases). In multivariate analyses, the mortality rate was significantly lower at high-volume hospitals than at low-volume hospitals (odds ratio, 0.52 for 10-fold larger caseload; 95% confidence interval, 0.28–0.94; P = 0.03). The mortality rate was 2.3% at the lowest-volume-quartile hospitals (4 or fewer admissions annually), compared with 1.4% at the highest-volume-quartile hospitals (more than 20 admissions annually). There was a trend toward lower mortality rates after surgery performed by high-volume surgeons (P = 0.16). Adverse hospital discharge disposition was less likely to be associated with high-volume hospitals (P < 0.001) and high-volume surgeons (P = 0.004). Length of stay and hospital charges were minimally related to hospital caseloads. Approximately 5% of United States hospitals performed pediatric brain tumor craniotomy during this period. The burden of care shifted toward large-caseload hospitals, teaching hospitals, and surgeons whose practices included predominantly pediatric patients, indicating progressive centralization and specialization. CONCLUSION Mortality and adverse discharge disposition rates for pediatric brain tumor craniotomy were lower when the procedure was performed at high-volume hospitals and by high-volume surgeons in the United States from 1988 to 2000. There were trends toward lower mortality rates, greater centralization of surgery, and more specialization among surgeons during this period.
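The phrasing "odds ratio, 0.52 for 10-fold larger caseload" suggests annual hospital volume entered the logistic mortality model on a log10 scale, so that exp(coefficient) is the odds ratio per 10-fold increase in caseload. Below is a hedged sketch of that specification fitted to simulated data, not the actual Nationwide Inpatient Sample.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 4712  # number of admissions, as in the study

# Simulated admission-level data (NOT the NIS): annual hospital caseload
# and an in-hospital death indicator whose log-odds decline with
# log10(caseload), targeting an odds ratio near 0.52 per 10-fold increase.
df = pd.DataFrame({"hospital_caseload": rng.integers(1, 60, size=n)})
log_odds = -3.5 + np.log(0.52) * np.log10(df["hospital_caseload"])
df["died"] = (rng.random(n) < 1 / (1 + np.exp(-log_odds))).astype(int)

# With volume on a log10 scale, exp(coefficient) is the odds ratio for a
# 10-fold larger caseload.
model = smf.logit("died ~ np.log10(hospital_caseload)", data=df).fit(disp=0)
print(np.exp(model.params["np.log10(hospital_caseload)"]))  # ~0.52, up to noise
```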


2012 ◽  
Vol 69 (3) ◽  
pp. 351-365 ◽  
Author(s):  
Patricia Pittman ◽  
Carolina Herrera ◽  
Joanne Spetz ◽  
Catherine R. Davis

More than 8% of employed RNs licensed since 2004 in the United States were educated overseas, yet little is known about the conditions of their recruitment or the impact of that experience on health care practice. This study assessed whether the labor rights of foreign-educated nurses were at risk during the latest period of high international recruitment, 2003 to 2007. Using consensus-based standards contained in the Voluntary Code of Ethical Conduct for the Recruitment of Foreign-Educated Health Professionals to the United States, this study found that 50% of actively recruited foreign-educated nurses experienced a negative recruitment practice. The study also found that nurses educated in low-income countries and nurses with high contract-breach fees were significantly more likely to report such problems. If, as experts believe may occur, the nursing shortage in the United States returns around 2014, oversight of international recruitment will become critically important to delivering high-quality health care to Americans.

