Sequelae of an Evidence-based Approach to Management for Access to Care in the Veterans Health Administration

Medical Care ◽  
2019 ◽  
Vol 57 ◽  
pp. S213-S220 ◽  
Author(s):  
Peter J. Kaboli ◽  
Isomi M. Miake-Lye ◽  
Christopher Ruser ◽  
Elizabeth M. Yano ◽  
Greg Orshansky ◽  
...  
2009 ◽  
Vol 174 (1) ◽  
pp. 029-034 ◽  
Author(s):  
Kim Hamlett-Berry ◽  
John Davison ◽  
Daniel R. Kivlahan ◽  
Marybeth H. Matthews ◽  
Jane E. Hendrickson ◽  
...  

2020 ◽  
Vol 42 (3) ◽  
pp. 148-156 ◽  
Author(s):  
Quyen A. Ngo-Metzger ◽  
Iris R. Mabry-Hernandez ◽  
Jane Kim ◽  
Prajakta Adsul ◽  
Laura B. Higginbotham ◽  
...  

2009 ◽  
Vol 43 (10) ◽  
pp. 1565-1575 ◽  
Author(s):  
Michael L Johnson ◽  
Laura A Petersen ◽  
Raji Sundaravaradan ◽  
Margaret M Byrne ◽  
Jennifer C Hasche ◽  
...  

2014 ◽  
Vol 104 (S4) ◽  
pp. S532-S534 ◽  
Author(s):  
Michael R. Kauth ◽  
Jillian C. Shipherd ◽  
Jan Lindsay ◽  
John R. Blosnich ◽  
George R. Brown ◽  
...  

2021 ◽  
Author(s):  
Caitlin Reardon ◽  
Andrea Nevedal ◽  
Marilla A. Opra Widerquist ◽  
Maria Arasim ◽  
George L. Jackson ◽  
...  

Abstract
Background: There are challenges associated with measuring sustainment of evidence-based practices (EBPs). First, the terms sustainability and sustainment are often falsely conflated: sustainability assesses the likelihood of an EBP being in use in the future, while sustainment assesses the extent to which an EBP is (or is not) currently in use. Second, grant funding often ends before sustainment can be assessed. The Veterans Health Administration (VHA) Diffusion of Excellence (DoE) is one of the few large-scale models of diffusion; it seeks to identify and disseminate practices across the VHA system. The DoE sponsors “Shark Tank” competitions, in which leaders bid on the opportunity to implement a practice with 6 months of implementation support. As part of an ongoing evaluation of the DoE, we sought to develop and administer a pragmatic instrument to assess sustainment of DoE practices.
Methods: In June 2020, surveys were sent to 64 facilities that were part of the DoE evaluation. We began analysis by comparing the alignment of quantitative and qualitative responses; some facility representatives reported in the open text box of the survey that their practice was on a temporary hold due to COVID-19 but answered the primary outcome question differently. As a result, the team reclassified the primary outcome of these facilities as Sustained: Temporary COVID-Hold. Following this reclassification, the number and percent of facilities in each category were calculated. We used directed content analysis, guided by the Consolidated Framework for Implementation Research (CFIR), to analyze open text box responses.
Results: A representative from each of forty-one facilities (64%) completed the survey. Among responding facilities, 29/41 had sustained their practice, 1/41 had partially sustained their practice, 8/41 had not sustained their practice, and 3/41 had never implemented their practice. Sustainment rates increased across Cohorts 1–4.
Conclusions: The development and administration of our pragmatic survey allowed us to assess sustainment of DoE practices. Planned updates to the survey will enable flexibility in assessing sustainment and its determinants at any phase after adoption. This assessment approach can flex with the longitudinal and dynamic nature of sustainment, including capturing nuances in outcomes when practices are on a temporary hold.
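As a rough illustration of the reclassification and tabulation step described in the Methods, the sketch below assumes a hypothetical list of survey records with illustrative field names (outcome, open_text); it is not the evaluation team's actual code or data schema, only a minimal example of moving COVID-hold responses into their own category and counting facilities per category.

```python
# Minimal sketch (hypothetical data model, not the DoE evaluation's code):
# reclassify facilities whose open-text response reports a temporary COVID-19
# hold, then tabulate counts and percents of facilities per sustainment category.
from collections import Counter


def reclassify(record: dict) -> str:
    """Return the facility's sustainment category, moving facilities whose
    free-text response mentions a COVID-related hold into
    'Sustained: Temporary COVID-Hold'."""
    text = record.get("open_text", "").lower()
    if "covid" in text and ("hold" in text or "pause" in text):
        return "Sustained: Temporary COVID-Hold"
    return record["outcome"]


def tabulate(records: list[dict]) -> dict[str, tuple[int, float]]:
    """Count facilities in each category and compute the percent of respondents."""
    counts = Counter(reclassify(r) for r in records)
    total = sum(counts.values())
    return {category: (n, 100 * n / total) for category, n in counts.items()}


# Example with made-up records mirroring the reported outcome categories.
example = [
    {"outcome": "Sustained", "open_text": ""},
    {"outcome": "Not sustained", "open_text": "Practice on hold due to COVID-19"},
    {"outcome": "Never implemented", "open_text": ""},
]
print(tabulate(example))
```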

