Longer-term Direct and Indirect Effects of Infant Rotavirus Vaccination Across All Ages in the United States in 2000–2013: Analysis of a Large Hospital Discharge Data Set

2018 ◽  
Vol 68 (6) ◽  
pp. 976-983 ◽  
Author(s):  
Julia M Baker ◽  
Jacqueline E Tate ◽  
Claudia A Steiner ◽  
Michael J Haber ◽  
Umesh D Parashar ◽  
...  

Abstract Background Rotavirus disease rates have declined dramatically among children <5 years of age since the rotavirus vaccine was introduced in 2006; population-level impacts remain to be fully elucidated. Methods Data from the Healthcare Cost and Utilization Project State Inpatient Databases were used to conduct a time-series analysis of monthly hospital discharges across age groups for acute gastroenteritis and rotavirus from 2000 to 2013. Rate ratios were calculated comparing prevaccine and postvaccine eras. Results Following vaccine introduction, a decrease in rotavirus hospitalizations occurred, with a shift toward biennial patterns across all ages. The 0–4-year age group experienced the largest decrease in rotavirus hospitalizations (rate ratio, 0.14; 95% confidence interval, .09–.23). The 5–19-year and 20–59-year age groups experienced significant declines in rotavirus hospitalization rates overall; the even postvaccine calendar years were characterized by progressively lower rates, and the odd postvaccine years were associated with reductions in rates that diminished over time. Those aged ≥60 years experienced the smallest change in rotavirus hospitalization rates overall, with significant reductions in even postvaccine years compared with prevaccine years (rate ratio, 0.51; 95% confidence interval, .39–.66). Conclusions Indirect impacts of infant rotavirus vaccination are apparent in the emergence of biennial patterns in rotavirus hospitalizations that extend to all age groups ineligible for vaccination. These observations are consistent with the notion that young children are of primary importance in disease transmission and that the initial postvaccine period of dramatic population-wide impacts will be followed by more complex incidence patterns across the age range in the long term.
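To make the rate-ratio comparison concrete, here is a minimal Python sketch, with purely hypothetical counts rather than the study's data, of a rate ratio and a Wald-type 95% confidence interval computed on the log scale (the function name and inputs are illustrative assumptions, not the authors' time-series model):

```python
import math

def rate_ratio_ci(events_post, persontime_post, events_pre, persontime_pre):
    """Rate ratio (postvaccine/prevaccine) with a Wald-type 95% CI on the log scale."""
    rr = (events_post / persontime_post) / (events_pre / persontime_pre)
    se_log_rr = math.sqrt(1 / events_post + 1 / events_pre)  # assumes Poisson counts
    z = 1.96
    lower = math.exp(math.log(rr) - z * se_log_rr)
    upper = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lower, upper

# Hypothetical counts: 140 postvaccine vs 1000 prevaccine rotavirus
# hospitalizations over equal person-time denominators.
print(rate_ratio_ci(140, 1_000_000, 1_000, 1_000_000))
```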

2016 ◽  
Vol 132 (1) ◽  
pp. 65-75 ◽  
Author(s):  
Prabhu P. Gounder ◽  
Robert C. Holman ◽  
Sara M. Seeman ◽  
Alice J. Rarig ◽  
Mary McEwen ◽  
...  

Objective: Reports about infectious disease (ID) hospitalization rates among American Indian/Alaska Native (AI/AN) persons have been constrained by data limited to the tribal health care system and by comparisons with the general US population. We used a merged state database to determine ID hospitalization rates in Alaska. Methods: We combined 2010 and 2011 hospital discharge data from the Indian Health Service and the Alaska State Inpatient Database. We used the merged data set to calculate average annual age-adjusted and age-specific ID hospitalization rates for AI/AN and non-AI/AN persons in Alaska. We stratified the ID hospitalization rates by sex, age, and ID diagnosis. Results: ID diagnoses accounted for 19% (6501 of 34 160) of AI/AN hospitalizations, compared with 12% (7397 of 62 059) of non-AI/AN hospitalizations. The average annual age-adjusted hospitalization rate was >3 times as high for AI/AN persons (2697 per 100 000 population) as for non-AI/AN persons (730 per 100 000 population; rate ratio = 3.7, P < .001). Lower respiratory tract infection (LRTI), which accounted for 38% (2486 of 6501) of AI/AN ID hospitalizations, was the most common reason for ID hospitalization. AI/AN persons were significantly more likely than non-AI/AN persons to be hospitalized for LRTI (rate ratio = 5.2, P < .001). Conclusions: A substantial disparity in ID hospitalization rates exists between AI/AN and non-AI/AN persons, and the most common reason for ID hospitalization among AI/AN persons was LRTI. Public health programs and policies that address the risk factors for LRTI are likely to benefit AI/AN persons.
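The age-adjusted rates reported above rely on direct age standardization; the short Python sketch below illustrates that calculation with hypothetical counts, populations, and standard-population weights (none of these numbers come from the Alaska data):

```python
# Illustrative direct age standardization (all counts, populations, and
# weights below are hypothetical, not the Alaska data).
age_specific = {               # age group: (ID hospitalizations, population)
    "<5":   (900, 20_000),
    "5-64": (4_000, 180_000),
    "65+":  (1_600, 30_000),
}
std_weights = {"<5": 0.07, "5-64": 0.80, "65+": 0.13}  # standard population weights (sum to 1)

age_adjusted_rate = sum(
    std_weights[group] * (events / population) * 100_000
    for group, (events, population) in age_specific.items()
)
print(f"Age-adjusted rate: {age_adjusted_rate:.0f} per 100,000")
```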


2008 ◽  
Vol 36 (12) ◽  
pp. 2328-2335 ◽  
Author(s):  
Laurel A. Borowski ◽  
Ellen E. Yard ◽  
Sarah K. Fields ◽  
R. Dawn Comstock

Background With more than a million high school athletes playing during the 2006–2007 academic year, basketball is one of the most popular sports in the United States. Hypothesis Basketball injury rates and patterns differ by gender and type of exposure. Study Design Descriptive epidemiology study. Methods Basketball-related injury data were collected during the 2005–2006 and 2006–2007 academic years from 100 nationally representative US high schools via Reporting Information Online. Results High school basketball players sustained 1518 injuries during 780 651 athlete exposures, for an injury rate of 1.94 per 1000 athlete exposures. The injury rate per 1000 athlete exposures was greater during competition (3.27) than during practice (1.40; rate ratio, 2.33; 95% confidence interval, 2.10–2.57) and was greater among girls (2.08) than among boys (1.83; rate ratio, 1.14; 95% confidence interval, 1.03–1.26). The ankle/foot (39.7%), knee (14.7%), head/face/neck (13.6%), arm/hand (9.6%), and hip/thigh/upper leg (8.4%) were most commonly injured. The most frequent injury diagnoses were ligament sprains (44.0%), muscle/tendon strains (17.7%), contusions (8.6%), fractures (8.5%), and concussions (7.0%). Female basketball players sustained a greater proportion of concussions (injury proportion ratio, 2.41; 95% confidence interval, 1.49–3.91) and knee injuries (injury proportion ratio, 1.71; 95% confidence interval, 1.27–2.30), whereas boys more frequently sustained fractures (injury proportion ratio, 1.87; 95% confidence interval, 1.27–2.77) and contusions (injury proportion ratio, 1.52; 95% confidence interval, 1.00–2.31). Knee ligament sprains were the most common girls’ injury requiring surgery (47.9%). Conclusion High school basketball injury patterns vary by gender and type of exposure. This study suggests several areas of emphasis for targeted injury prevention interventions.
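A brief Python sketch of the quantities reported here: an injury rate per 1000 athlete exposures (using the abstract's overall totals) and a ratio of two proportions with a log-scale Wald interval (using hypothetical counts); this is illustrative only and not the study's analysis code:

```python
import math

def injury_rate_per_1000(injuries, athlete_exposures):
    """Injuries per 1000 athlete exposures."""
    return 1000 * injuries / athlete_exposures

def ratio_ci(a, n1, b, n2, z=1.96):
    """Ratio of two proportions (a/n1) / (b/n2) with a log-scale Wald 95% CI."""
    ratio = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    return ratio, math.exp(math.log(ratio) - z * se), math.exp(math.log(ratio) + z * se)

# Overall rate from the abstract's totals (1518 injuries, 780 651 exposures).
print(injury_rate_per_1000(1518, 780_651))   # ~1.94 per 1000 athlete exposures

# Hypothetical counts for a girls-vs-boys proportion comparison.
print(ratio_ci(60, 700, 45, 818))
```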


2019 ◽  
Vol 57 (6) ◽  
Author(s):  
Priyanka Uprety ◽  
Ana María Cárdenas

ABSTRACT Chlamydia trachomatis and Neisseria gonorrhoeae are the two most common causes of sexually transmitted disease in the United States. Studies in adults, mostly in men who have sex with men, have shown that the prevalence of C. trachomatis and N. gonorrhoeae infections is much higher at extragenital sites than at urogenital sites. A similarly large body of data on the burden of C. trachomatis and N. gonorrhoeae infections by anatomic site is lacking in children. We retrospectively analyzed data from 655 patients tested for C. trachomatis (887 specimens) and N. gonorrhoeae (890 specimens) at the Children’s Hospital of Philadelphia. We restricted the analysis to patients between 2 and 17 years of age who had all three sources (urine, oropharynx, and rectum) collected at the same visit. The final data set included specimens from all three sources from 148 and 154 patients for C. trachomatis and N. gonorrhoeae, respectively. Specimens were tested for C. trachomatis and N. gonorrhoeae using a Gen-Probe Aptima Combo 2 assay. The burden of C. trachomatis and N. gonorrhoeae infection was significantly higher in the 14- to 17-year age group (24.7%, P = 0.041; 25.8%, P = 0.001) than in the 10- to 13-year (5.9%; 5.6%), 6- to 9-year (4.6%; 4.6%), and 2- to 5-year (8.3%; 0%) age groups, respectively. The positivity rate for C. trachomatis was highest for rectal (16.2%), followed by urine (5.4%) and oropharyngeal (0.7%) sites. The positivity rate for N. gonorrhoeae was highest for rectal sites (10.4%), followed by oropharyngeal (9.7%) and urine (1.9%) sites. The source with the highest diagnostic yield was the rectum for C. trachomatis and the rectum and oropharynx for N. gonorrhoeae. Hence, extragenital screening is critical for the comprehensive detection of C. trachomatis and N. gonorrhoeae in the pediatric population.
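The site-specific positivity rates above are simple proportions; the sketch below shows the calculation with hypothetical counts and adds a Fisher's exact comparison of two sites, which is illustrative only since the study reports descriptive positivity:

```python
from scipy.stats import fisher_exact

# Hypothetical positives / specimens tested per anatomic site for one organism.
sites = {"rectal": (24, 148), "urine": (8, 148), "oropharyngeal": (1, 148)}

for site, (positives, tested) in sites.items():
    print(f"{site}: {100 * positives / tested:.1f}% positivity")

# Compare rectal vs. urine positivity with Fisher's exact test.
a, n1 = sites["rectal"]
b, n2 = sites["urine"]
odds_ratio, p_value = fisher_exact([[a, n1 - a], [b, n2 - b]])
print(f"odds ratio {odds_ratio:.2f}, p = {p_value:.3f}")
```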


2020 ◽  
Vol 10 (4) ◽  
pp. 204589402091183
Author(s):  
John W. McConnell ◽  
Yuen Tsang ◽  
Janis Pruett ◽  
William Drake III

Two oral medications targeting the prostacyclin pathway are available to treat pulmonary arterial hypertension in the United States: oral treprostinil and selexipag. We compared real-world hospitalization in patients receiving these medications. A retrospective administrative claims study was conducted using the Optum® Clinformatics® Data Mart database. Patients with pulmonary hypertension were identified using diagnostic codes. Cohort inclusion required age ≥ 18 years, a first oral treprostinil or selexipag prescription between 1 January 2015 and 30 September 2017 (index date), and continuous enrollment for ≥6 months before the index date. Patients who switched index drug were excluded. Follow-up ran from the index date until the earliest of: end of index drug exposure, end of continuous enrollment, death, or 31 December 2017. Multivariable Cox proportional hazards and Poisson regression models were used to compare the risk and rate, respectively, of hospitalization associated with oral treprostinil vs. selexipag, adjusting for potential confounders. The study cohort included 99 patients receiving oral treprostinil and 123 receiving selexipag. Mean age was 61 years, and most patients were female (71%). Compared with oral treprostinil, selexipag was associated with a 46% lower risk of all-cause hospitalization (hazard ratio 0.54, 95% confidence interval 0.31, 0.92; P = 0.02), a 47% lower risk of pulmonary hypertension-related hospitalization (hazard ratio 0.53, 95% confidence interval 0.31, 0.93; P = 0.03), a 42% lower all-cause hospitalization rate (rate ratio 0.58, 95% confidence interval 0.39, 0.87; P = 0.01), and a 46% lower pulmonary hypertension-related hospitalization rate (rate ratio 0.54, 95% confidence interval 0.35, 0.82; P = 0.004). This study suggests that selexipag is associated with a lower hospitalization risk and rate than oral treprostinil.
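A hedged Python sketch of the two model types named in the methods, a Cox proportional hazards model for hospitalization risk and a Poisson regression with follow-up time as exposure for hospitalization rate, fitted to synthetic data; the variable names and values are placeholders, not the Optum cohort:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 222
df = pd.DataFrame({
    "selexipag": rng.integers(0, 2, n),          # 1 = selexipag, 0 = oral treprostinil
    "age": rng.normal(61, 12, n).round(),
    "follow_up_yrs": rng.uniform(0.2, 2.5, n),   # time on index drug
    "hospitalized": rng.integers(0, 2, n),       # any all-cause hospitalization
    "n_hospitalizations": rng.poisson(0.8, n),   # count during follow-up
})

# Cox proportional hazards model for time to first hospitalization (risk).
cox = CoxPHFitter().fit(
    df[["follow_up_yrs", "hospitalized", "selexipag", "age"]],
    duration_col="follow_up_yrs",
    event_col="hospitalized",
)
print(cox.hazard_ratios_)

# Poisson regression for the hospitalization rate, with follow-up time as exposure.
poisson = smf.glm(
    "n_hospitalizations ~ selexipag + age",
    data=df,
    family=sm.families.Poisson(),
    exposure=df["follow_up_yrs"],
).fit()
print(np.exp(poisson.params))  # exponentiated coefficients = rate ratios
```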


Author(s):  
Douglas R Tolleson ◽  
David W Schafer

Abstract Hot-iron branding is a traditional form of permanent cattle identification in the United States. There is a need for science-based determination of cattle brand age. Near infrared reflectance spectroscopy (NIRS) has been used to obtain information about animal tissues and healing processes. Height-width allometry and NIRS were applied to hot-iron cattle brand scars to determine if either or both of these methods can be used to non-invasively establish the interval since application of hot-iron cattle brands. Length and width of a brand routinely applied to calves (~30 to 60 d old) were established, and then the same measurements were recorded on 378 calfhood-branded cattle of known age ranging from 0.5 to > 6.5 yr of age. Brand width and height increased over the original measurements by > 100% between calfhood application and 2.5 yr of age (P < 0.001). Brand size did not change dramatically between 2.5 and > 6.5 yr; however, both width and height were greater (P < 0.05) at maturity than at weaning. Near infrared spectra were collected from: a) branded skin, b) non-clipped (hair), non-branded skin, and c) hair-clipped, non-branded skin on Bos taurus cross calves. Individual trial calibrations yielded high R2 and low SE of calibration values as well as similar cross validation performance (P < 0.001). Numerically lower but still strong performance (P < 0.001) resulted from combined data set calibrations. Cross-trial prediction of brand age was unsuccessful. One single-year calibration underpredicted (P < 0.001) brand age of an independent validation set by 2.83 d, and another single-year calibration underpredicted (P < 0.001) the same validation set by 9.91 d. When combined, these two datasets resulted in a calibration that overpredicted brand age in the validation set by 6.9 d (P < 0.02). Discriminant analyses for identification of skin surface type yielded success rates of 90% for branded, 99% for non-clipped, non-branded, and 96% for clipped, non-branded skin (P < 0.01). Discriminant analyses were also performed on samples grouped into: a) less than 33 d, b) 141–153 d, and c) 169 d categories. All group membership identifications were successful at greater than 90% (P < 0.01). Preliminary results indicate that brand size could be used to indicate brand age and that NIRS can predict brand age as well as discriminate between broad brand age groups in cattle. More work will need to be done before these techniques can be used in real-world forensic applications.
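A rough Python sketch of the analysis pattern described here, a PLS calibration of brand age on spectra plus a discriminant analysis of skin-surface type, using randomly generated placeholder spectra rather than the study's NIRS data (so the cross-validation scores are meaningless except as a template):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_samples, n_wavelengths = 200, 350
spectra = rng.normal(size=(n_samples, n_wavelengths))   # placeholder NIR spectra
brand_age_days = rng.uniform(0, 180, n_samples)         # known interval since branding
skin_type = rng.integers(0, 3, n_samples)               # 0=branded, 1=non-clipped, 2=clipped

# Calibration: PLS regression of brand age on spectra, assessed by cross-validated R^2.
pls = PLSRegression(n_components=10)
print(cross_val_score(pls, spectra, brand_age_days, cv=5, scoring="r2").mean())

# Discriminant analysis to classify skin-surface type from the spectra.
lda = LinearDiscriminantAnalysis()
print(cross_val_score(lda, spectra, skin_type, cv=5).mean())  # classification accuracy
```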


2019 ◽  
Vol 111 (10) ◽  
pp. 1104-1106 ◽  
Author(s):  
Rebecca L Siegel ◽  
Genet A Medhanie ◽  
Stacey A Fedewa ◽  
Ahmedin Jemal

Abstract The extent to which the increase in early-onset colorectal cancer (CRC) in the United States varies geographically is unknown. We analyzed changes in CRC incidence and risk factors among people aged 20–49 years by state using high-quality population-based cancer registry data provided by the North American Association of Central Cancer Registries and national survey data, respectively. Early-onset CRC incidence was mostly stable among blacks and Hispanics but increased in 40 of 47 states among non-Hispanic whites, most prominently in western states. For example, rates increased in Washington from 6.7 (per 100 000) during 1995–1996 to 11.5 during 2014–2015 (rate ratio = 1.73, 95% confidence interval = 1.48 to 2.01) and in Colorado from 6.0 to 9.5 (rate ratio = 1.57, 95% confidence interval = 1.30 to 1.91). Nevertheless, current CRC incidence was highest in southern states. From 1995 to 2005, obesity prevalence increased in all states and heavy alcohol consumption increased in one-third of states, but neither was correlated with CRC incidence trends. Early-onset CRC is increasing most rapidly among whites in western states. Etiologic studies are needed to explore early-life colorectal carcinogenesis.
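A short Python sketch of the two calculations underlying these findings: a rate ratio with a log-scale Wald interval and a state-level rank correlation between risk-factor change and incidence change; the case counts and correlation inputs are hypothetical:

```python
import math
from scipy.stats import spearmanr

def incidence_rate_ratio_ci(rate_late, rate_early, cases_late, cases_early, z=1.96):
    """Rate ratio of two incidence rates with a log-scale Wald CI."""
    rr = rate_late / rate_early
    se = math.sqrt(1 / cases_late + 1 / cases_early)
    return rr, math.exp(math.log(rr) - z * se), math.exp(math.log(rr) + z * se)

# Washington-style example from the abstract (11.5 vs 6.7 per 100,000),
# with hypothetical underlying case counts.
print(incidence_rate_ratio_ci(11.5, 6.7, cases_late=520, cases_early=300))

# State-level rank correlation between change in obesity prevalence and change
# in early-onset CRC incidence (all values hypothetical).
obesity_change = [4.1, 5.3, 3.8, 6.0, 4.9]
crc_incidence_change = [1.2, 0.4, 2.1, 0.8, 1.5]
print(spearmanr(obesity_change, crc_incidence_change))
```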


1989 ◽  
Vol 19 (2) ◽  
pp. 56-62 ◽  
Author(s):  
Don Hindle ◽  
Angela Cook ◽  
John Pilla

The content of discharge abstracts (or ‘morbidity statistics forms’ as they are popularly known in Australia) is determined by perceived needs for information about inpatients. It should be sensitive to changes in those needs. The emergence of interest in diagnosis related group (DRG) data has had an impact on discharge abstracting. However, revisions have been minor because the DRG system was designed to make use of the standard discharge data set in the United States. In Australia, it has been necessary only to adjust practices to bring them more or less in line with the American standard. In the near future, however, it is likely that more significant changes will be needed. In this paper the authors discuss one new area of interest concerning measurement of resource use by DRG. They suggest that it will lead to the addition of new fields on the discharge abstract and to major changes in the way that discharges are defined. (AMRJ (1989). 19(2), 56–62).


2019 ◽  
Vol 188 (8) ◽  
pp. 1475-1483 ◽  
Author(s):  
Philip A Collender ◽  
Christa Morris ◽  
Rose Glenn-Finer ◽  
Andrés Acevedo ◽  
Howard H Chang ◽  
...  

Abstract Mass gatherings exacerbate infectious disease risks by creating crowded, high-contact conditions and straining the capacity of local infrastructure. While mass gatherings have been extensively studied in the context of epidemic disease transmission, the role of gatherings in incidence of high-burden, endemic infections has not been previously studied. Here, we examine diarrheal incidence among 17 communities in Esmeraldas, Ecuador, in relation to recurrent gatherings characterized using ethnographic data collected during and after the epidemiologic surveillance period (2004–2007). Using distributed-lag generalized estimating equations, adjusted for seasonality, trend, and heavy rainfall events, we found significant increases in diarrhea risk in host villages, peaking 2 weeks after an event’s conclusion (incidence rate ratio, 1.21; confidence interval, adjusted for false coverage rate of ≤0.05: 1.02, 1.43). Stratified analysis revealed heightened risks associated with events where crowding and travel were most likely (2-week-lag incidence rate ratio, 1.51; confidence interval, adjusted for false coverage rate of ≤0.05: 1.09, 2.10). Our findings suggest that community-scale mass gatherings might play an important role in endemic diarrheal disease transmission and could be an important focus for interventions to improve community health in low-resource settings.
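A minimal Python sketch of a distributed-lag Poisson GEE on synthetic village-week data; seasonality and trend terms are omitted for brevity, and all values are placeholders rather than the Esmeraldas surveillance data:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_weeks, n_villages = 150, 17
df = pd.DataFrame({
    "village": np.repeat(np.arange(n_villages), n_weeks),
    "week": np.tile(np.arange(n_weeks), n_villages),
    "gathering": rng.integers(0, 2, n_weeks * n_villages),  # event held this week
    "heavy_rain": rng.integers(0, 2, n_weeks * n_villages),
    "cases": rng.poisson(3, n_weeks * n_villages),           # diarrhea cases
    "person_weeks": 500.0,                                   # surveillance denominator
})

# Distributed lags: gathering indicators 1 and 2 weeks earlier, within each village.
for lag in (1, 2):
    df[f"gathering_lag{lag}"] = df.groupby("village")["gathering"].shift(lag).fillna(0)

# Poisson GEE with exchangeable correlation within villages.
model = smf.gee(
    "cases ~ gathering + gathering_lag1 + gathering_lag2 + heavy_rain",
    groups="village",
    data=df,
    family=sm.families.Poisson(),
    cov_struct=sm.cov_struct.Exchangeable(),
    offset=np.log(df["person_weeks"]),
).fit()
print(np.exp(model.params))  # incidence rate ratios at lags 0, 1, and 2 weeks
```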


2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S1001-S1002
Author(s):  
Shikha Garg ◽  
Alissa O’Halloran ◽  
Charisse N Cummings ◽  
...  

Abstract Background The 2018–19 influenza season was characterized by prolonged co-circulation of influenza A H3N2 (H3) and H1N1pdm09 (H1) viruses. We used data from the Influenza Hospitalization Surveillance Network (FluSurv-NET) to describe age-related differences in the distribution of influenza A subtypes. Methods We included all cases residing within a FluSurv-NET catchment area and hospitalized with laboratory-confirmed influenza during October 1, 2018–April 30, 2019. We multiply imputed influenza A subtype for the 63% of cases with unknown subtype, basing imputation on factors that could be associated with missing subtype, including surveillance site, 10-year age group, and month of hospital admission. We calculated influenza hospitalization rates and 95% confidence intervals (95% CI) by type and subtype per 100,000 population. We compared the proportion of cases with H1 by year of age in FluSurv-NET to the distribution obtained from US public health laboratories participating in virologic surveillance and providing specimen-level influenza results. Results Based on available data, 18,669 hospitalizations were reported; 41% received influenza vaccination ≥2 weeks prior to hospitalization and 90% received antivirals. Cumulative hospitalization rates per 100,000 population were as follows: H1, 32.5 (95% CI 31.7–33.3); H3, 29.3 (95% CI 28.5–30.1); and B, 2.5 (95% CI 2.3–2.7). Based on weekly rates, H1 hospitalizations peaked during February (week 8) and H3 hospitalizations during March (week 11) (Figure A). FluSurv-NET data showed distinct patterns of subtype distribution by age, with H1 predominating among cases aged 0–9 and 24–70 years, and H3 predominating among cases aged 10–23 and ≥71 years. Data on the proportion of H1 results by age correlated well between FluSurv-NET and US virologic surveillance (Figure B). Conclusion Influenza A H1 and H3 virus circulation patterns varied by age group during the 2018–2019 season. The proportion of cases with H1 relative to H3 was low among those born between 1996 and 2009 and those born before 1948. These findings may indicate protection against H1 viruses in age groups with exposure to H1N1pdm09 during the 2009 pandemic or to older, antigenically similar H1N1 viruses as young children. Disclosures Evan J. Anderson, MD, AbbVie (Consultant), GSK (Grant/Research Support), Merck (Grant/Research Support), Micron (Grant/Research Support), PaxVax (Grant/Research Support), Pfizer (Consultant, Grant/Research Support), sanofi pasteur (Grant/Research Support), Keipp Talbot, MD, MPH, Sequirus (Other Financial or Material Support, On Data Safety Monitoring Board).
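One plausible reading of the imputation step is a stratified hot-deck multiple imputation of the missing subtype; the Python sketch below applies that idea to synthetic data (the study's actual imputation model may differ):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
n = 5000
df = pd.DataFrame({
    "site": rng.integers(0, 10, n),                     # surveillance site
    "age_group": rng.integers(0, 9, n),                 # 10-year age group
    "month": rng.choice([10, 11, 12, 1, 2, 3, 4], n),   # admission month
    "subtype": rng.choice(["H1", "H3", None], n, p=[0.20, 0.17, 0.63]),
})

def impute_once(data, rng):
    """Fill missing subtype by sampling from observed subtypes within each stratum."""
    out = data.copy()
    for _, idx in out.groupby(["site", "age_group", "month"]).groups.items():
        block = out.loc[idx, "subtype"]
        observed = block.dropna()
        if observed.empty:
            continue                                    # no donors in this stratum
        missing_idx = block.index[block.isna()]
        out.loc[missing_idx, "subtype"] = rng.choice(observed.to_numpy(), size=len(missing_idx))
    return out

# Ten imputed data sets; pool the H1 share across imputations.
h1_shares = [(impute_once(df, rng)["subtype"] == "H1").mean() for _ in range(10)]
print(np.mean(h1_shares))
```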


2020 ◽  
Vol 15 (6) ◽  
pp. 805-812 ◽  
Author(s):  
Finnian R. Mc Causland ◽  
Jim A. Tumlin ◽  
Prabir Roy-Chaudhury ◽  
Bruce A. Koplan ◽  
Alexandru I. Costea ◽  
...  

Background and objectives Patients receiving maintenance hemodialysis (HD) have a high incidence of cardiac events, including arrhythmia and sudden death. Intradialytic hypotension (IDH) is a common complication of HD and is associated with development of reduced myocardial perfusion, a potential risk factor for arrhythmia. Design, setting, participants, & measurements We analyzed data from the Monitoring in Dialysis study, which used implantable loop recorders to detect and continuously monitor electrocardiographic data from patients on maintenance HD (n=66 from the United States and India) over a 6-month period (n=4720 sessions). Negative binomial mixed effects regression was used to test the association of IDH20 (decline in systolic BP >20 mm Hg from predialysis systolic BP) and IDH0–20 (decline in systolic BP 0–20 mm Hg from predialysis systolic BP) with clinically significant arrhythmia (bradycardia ≤40 bpm for ≥6 seconds, asystole ≥3 seconds, ventricular tachycardia ≥130 bpm for ≥30 seconds, or patient-marked events) during HD. Results The median age of participants was 58 (25th–75th percentile, 49–66) years; 70% were male; and 65% were from the United States. IDH20 occurred in 2251 (48%) of the 4720 HD sessions analyzed, whereas IDH0–20 occurred during 1773 sessions (38%). The number of sessions complicated by at least one intradialytic clinically significant arrhythmia was 27 (1.2%) where IDH20 occurred and 15 (0.8%) where IDH0–20 occurred. Participants who experienced IDH20 (versus not) had a nine-fold greater rate of developing an intradialytic clinically significant arrhythmia (incidence rate ratio, 9.4; 95% confidence interval, 3.0 to 29.4), whereas IDH0–20 was associated with a seven-fold higher rate (incidence rate ratio, 7.2; 95% confidence interval, 2.1 to 25.4). Conclusions IDH is common in patients on maintenance HD and is associated with a greater risk of developing intradialytic clinically significant arrhythmia.
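As a stand-in for the study's negative binomial mixed effects regression, the hedged Python sketch below fits a negative binomial GEE clustered on patient to synthetic session-level data:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n_patients, sessions_per_patient = 66, 72
n = n_patients * sessions_per_patient
df = pd.DataFrame({
    "patient": np.repeat(np.arange(n_patients), sessions_per_patient),
    "idh20": rng.integers(0, 2, n),        # SBP decline >20 mm Hg during this session
    "arrhythmias": rng.poisson(0.05, n),   # clinically significant arrhythmias per session
})

# Negative binomial GEE clustered on patient, as a stand-in for the study's
# negative binomial mixed effects model.
model = smf.gee(
    "arrhythmias ~ idh20",
    groups="patient",
    data=df,
    family=sm.families.NegativeBinomial(alpha=1.0),
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()
print(np.exp(model.params["idh20"]))  # incidence rate ratio for IDH20 vs. not
```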

