Quality information disclosure and health insurance demand: evidence from VA hospital report cards
2019, Vol 20 (2), pp. 177-199. Author(s): Xiaoxue Li

JAMA Surgery, 2014, Vol 149 (2), p. 143. Author(s): Justin B. Dimick, Samantha K. Hendren

Medical Care, 2005, Vol 43 (8), pp. 801-809. Author(s): Peter C. Austin, Jack V. Tu, David A. Alter, C. David Naylor

2005, Vol 25 (1), pp. 11-19. Author(s): Peter C. Austin, Geoffrey M. Anderson

2004, Vol 148 (6), pp. 1041-1046. Author(s): Peter C. Austin, David A. Alter, Geoffrey M. Anderson, Jack V. Tu

2014, Vol 34, pp. 42-58. Author(s): Shin-Yi Chou, Mary E. Deily, Suhui Li, Yi Lu

2006, Vol 0 (0). Author(s): Laurent G. Glance, Andrew W. Dick, Turner M. Osler, Dana B. Mukamel

Author(s): Mathew J. Reeves, Peter C. Austin

Background: Hospital report cards are published with increasing frequency, but their accuracy is controversial, especially when case volumes are small. Our objective was to determine the relationship between hospital case volume and the accuracy of hospital report cards using simulation studies.

Methods: Monte Carlo simulations were used in a setting in which the true hospital rankings were known with certainty and perfect risk adjustment was possible. Parameters used to generate the simulated datasets were obtained from analyses of 31,000 patients hospitalized with acute myocardial infarction (AMI) in Ontario. We varied the number of patients per hospital from 100 to 2,000 in increments of 100. For each scenario we simulated 500 datasets and determined the correlation between the true hospital ranking (determined by the hospital-specific random effects) and the observed ranking (determined by observed-to-expected (O/E) or predicted-to-expected (P/E) ratios). Baseline simulations used the observed 30-day AMI mortality rate of 11%. In sensitivity analyses we explored the impact of a lower event rate (2% mortality) more typical of cardiovascular procedures.

Results: Provider volume had a strong effect on the accuracy of hospital report cards. When the event rate was high (11%), provider volume had to exceed 500 before the correlation between the known and observed rankings exceeded 80%, and for hospitals with 200 or fewer cases the correlation was below 60% (Figure). When the event rate was low (2%), provider volume had to exceed 1,200 before the correlation exceeded 80%, and for hospitals with 200 or fewer cases the correlation was below 40%.

Conclusions: These results highlight the substantial random error that occurs when profiling hospitals, even with perfect risk adjustment. These errors increase as hospital volume and event rates decrease. Hospital report cards achieve acceptable accuracy only when provider volumes exceed those seen in many hospitals.
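The sketch below is a rough illustration of the simulation design described in the Methods, not the authors' actual code. The event rates and volume sweep come from the abstract; everything else is an assumption chosen only to make the example run: the number of hospitals, the hospital random-effect standard deviation, a single standardized patient risk factor with an assumed coefficient (the paper instead calibrates to Ontario AMI data), ranking by O/E ratios only, and Spearman rank correlation as the accuracy measure.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)

def simulate_ranking_correlation(n_hospitals=100, patients_per_hospital=500,
                                 baseline_rate=0.11, re_sd=0.3, beta_risk=0.8,
                                 n_datasets=500):
    """Average Spearman correlation between the true hospital ranking (known
    random effects) and the ranking by observed/expected (O/E) mortality
    ratios, under perfect risk adjustment. Parameter values are illustrative
    assumptions, not the paper's calibrated values."""
    baseline_logit = np.log(baseline_rate / (1 - baseline_rate))
    correlations = []
    for _ in range(n_datasets):
        # True hospital quality: hospital-specific random intercepts.
        true_effects = rng.normal(0.0, re_sd, size=n_hospitals)
        oe_ratios = np.empty(n_hospitals)
        for h in range(n_hospitals):
            # One perfectly measured patient-level risk factor, so risk
            # adjustment can be exact.
            x = rng.normal(0.0, 1.0, size=patients_per_hospital)
            logit = baseline_logit + beta_risk * x + true_effects[h]
            deaths = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))
            # Expected deaths under the true patient-level model with the
            # population-average hospital effect (perfect risk adjustment).
            p_expected = 1.0 / (1.0 + np.exp(-(baseline_logit + beta_risk * x)))
            oe_ratios[h] = deaths.sum() / p_expected.sum()
        rho, _ = spearmanr(true_effects, oe_ratios)
        correlations.append(rho)
    return float(np.mean(correlations))

if __name__ == "__main__":
    # Mirror the paper's design: sweep hospital volume at a high (11%) and a
    # low (2%) event rate. Fewer replications than the paper's 500, for speed.
    for rate in (0.11, 0.02):
        for volume in (100, 200, 500, 1000, 2000):
            corr = simulate_ranking_correlation(
                patients_per_hospital=volume, baseline_rate=rate, n_datasets=50)
            print(f"event rate {rate:.0%}, volume {volume:5d}: "
                  f"mean rank correlation = {corr:.2f}")
```

Qualitatively, a sketch like this should reproduce the pattern reported above (rank correlations rising with provider volume and falling with the event rate), although the exact correlation values depend on the assumed random-effect variance and risk model.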

