Abstract T P274: Accreditation Status of Outpatient Cerebrovascular Testing Facilities Among Medicare Beneficiaries: The VALUE (Vascular Accreditation, Location & Utilization Evaluation) Study

Stroke ◽  
2015 ◽  
Vol 46 (suppl_1) ◽  
Author(s):  
Scott C Brown ◽  
Kefeng Wang ◽  
Chuanhui Dong ◽  
Mary B Farrell ◽  
Gary V Heller ◽  
...  

OBJECTIVE: Accreditation of cerebrovascular ultrasound laboratories by the Intersocietal Accreditation Commission (IAC) or equivalent bodies is supported by The Joint Commission certification of stroke centers. Limited information exists on the accreditation status and geographic distribution of these testing facilities in the US. The aims were to: (1) Identify the proportion of IAC accredited vascular testing facilities used by Medicare beneficiaries for outpatient cerebrovascular testing services; (2) Describe the geographical distribution of these facilities; and (3) Identify variation in the types and volumes of cerebrovascular testing procedures by accreditation status. METHODS: As part of the VALUE (Vascular Accreditation, Location & Utilization Evaluation) Study, we examined the proportion of IAC accredited facilities that conducted cerebrovascular testing in a 5% CMS random Outpatient Limited Data Set (LDS) for the US in 2011 and investigated their geographical distribution using the Medicare Provider of Services (POS) file. RESULTS: Of the 7,864 total facilities billing Medicare for cerebrovascular testing procedures, only 22% (n=1,723) were IAC accredited. The percentage of facilities conducting cerebrovascular testing that were IAC accredited varied by region (χ²(3)=400.4, p<0.0001), with 43%, 21%, 17% and 13% located in the Northeast, South, Midwest, and West, respectively. However, when examining the total number of cerebrovascular outpatient procedures conducted in 2011 (total n=38,646), 41% (15,729) were conducted in IAC accredited facilities. Moreover, when examining procedure type across all sites, 98% (38,011) of all cerebrovascular testing procedures conducted were carotid duplex, of which 41% (15,417) were conducted in IAC accredited facilities. In contrast, 1% (n=315) of all cerebrovascular procedures were transcranial Doppler (TCD), of which 56% (n=177) were conducted in IAC accredited facilities.
CONCLUSIONS: The proportion of IAC accredited facilities conducting outpatient cerebrovascular testing is low and varies by region. The growing number of certified stroke centers should be accompanied by a larger number of accredited vascular testing facilities, which could improve the quality of stroke care.
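The regional comparison above is a standard Pearson chi-square test of independence on a 2 × 4 accreditation-by-region table. A minimal sketch of that computation follows; the regional counts here are assumed for illustration only, since the abstract reports percentages but not the per-region facility totals, so the resulting statistic will not match the reported χ²(3)=400.4.

```python
# Hypothetical 2 x 4 table: (accredited, not accredited) counts per region.
# These counts are illustrative assumptions, not the study's data.
observed = {
    "Northeast": (860, 1140),
    "South": (420, 1580),
    "Midwest": (340, 1660),
    "West": (260, 1740),
}

def chi_square_2xk(table):
    """Pearson chi-square statistic for a 2 x k contingency table (df = k - 1)."""
    col_totals = [sum(row) for row in table.values()]
    acc_total = sum(row[0] for row in table.values())
    non_total = sum(row[1] for row in table.values())
    grand = acc_total + non_total
    stat = 0.0
    for (acc, non), col in zip(table.values(), col_totals):
        for obs, margin in ((acc, acc_total), (non, non_total)):
            expected = margin * col / grand
            stat += (obs - expected) ** 2 / expected
    return stat

print(round(chi_square_2xk(observed), 1))  # -> 599.6 for these assumed counts
```

A statistic this large against a chi-square distribution with 3 degrees of freedom corresponds to p < 0.0001, the same qualitative conclusion the abstract draws.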

2020 ◽  
Vol 41 (S1) ◽  
pp. s224-s224
Author(s):  
Curt Hewitt ◽  
Katharina Weber ◽  
Danielle LeSassier ◽  
Anthony Kappell ◽  
Kathleen Schulte ◽  
...  

Background: The prevalence of healthcare-acquired infections (HAIs) and rising levels of antimicrobial resistance place a significant burden on modern healthcare systems. Cultures are typically used to track HAIs; however, culture methods provide limited information and are not applicable to all pathogens. Next-generation sequencing (NGS) can detect and characterize pathogens present within a sample, but few research studies have explored how NGS could be used to detect pathogen transmission events under HAI-relevant scenarios. The objective of this CDC-funded project was to evaluate and correlate sequencing approaches for pathogen transmission with standard culture-based analysis. Methods: We modeled pathogen transfer via hand contact using synthetic skin. These skin coupons were seeded with a community of commensal organisms to mimic the human skin microbiome. Pathogens were added at physiologically relevant high or low levels prior to skin-to-skin contact. The ESKAPE pathogens (E. faecium, S. aureus, K. pneumoniae, A. baumannii, P. aeruginosa, and Enterobacter spp.) plus C. difficile were employed because they are among the most common antibiotic-resistant HAIs. Pathogen transfer between skin coupons was measured following direct skin contact and fomite surface transmission. The effects of handwashing or fomite decontamination were also evaluated. Transferred pathogens were enumerated via culture to establish a robust data set against which DNA and RNA sequence analyses of the same samples could be compared. These data also provide a quantitative assessment of individual ESKAPE+C pathogen transfer rates in skin contact scenarios. Results: Metagenomic and metatranscriptomic analysis using custom analysis pipelines and reference databases successfully identified the commensal and pathogenic organisms present in each sample at the species level. This analysis also identified antibiotic resistance genes and plasmids.
Metatranscriptomic analysis permitted not only gene identification but also confirmation of gene expression, a critical factor in the evaluation of antibiotic resistance. DNA analysis does not require cell viability, a key differentiator between sequencing and culturing that is reflected in the simulated handwashing data. Sensitivity remains a key limitation of metagenomic analysis, as shown by the poor species identification and gene content characterization of pathogens present at low abundance within the simulated microbial community. Species-level identification typically failed as ratios fell below 1:1,000 pathogen CFU:total community CFU. Conclusions: These findings demonstrate the strengths and weaknesses of NGS for molecular epidemiology. The data sets produced for this study are publicly available so they can be employed for future metagenomic benchmarking studies.
Funding: None
Disclosures: None
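The reported sensitivity limit is an abundance ratio: species-level calls degraded once a pathogen fell below roughly 1 pathogen CFU per 1,000 total community CFU. A minimal sketch of that threshold check, with illustrative counts rather than the study's data:

```python
# Empirical detection floor from the abstract: 1:1,000 pathogen CFU : total CFU.
DETECTION_RATIO = 1 / 1000

def likely_detectable(pathogen_cfu, total_cfu, threshold=DETECTION_RATIO):
    """Flag whether a pathogen's relative abundance clears the empirical ratio."""
    return pathogen_cfu / total_cfu >= threshold

print(likely_detectable(5_000, 1_000_000))  # 1:200 ratio -> True
print(likely_detectable(500, 1_000_000))    # 1:2,000 ratio -> False
```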


2021 ◽  
Vol 8 (1) ◽  
Author(s):  
Marco Diers ◽  
Robert Weigel ◽  
Heike Culmsee ◽  
Christoph Leuschner

Abstract Background Organic carbon stored in forest soils (SOC) represents an important element of the global C cycle. It is thought that the C storage capacity of the stable pool can be enhanced by increasing forest productivity, but empirical evidence in support of this assumption from forests differing in tree species and productivity, while stocking on similar substrate, is scarce. Methods We determined the stocks of SOC and macro-nutrients (nitrogen, phosphorus, calcium, potassium, and magnesium) in nine paired European beech/Scots pine stands on similar Pleistocene sandy substrates across a precipitation gradient (560–820 mm·yr⁻¹) in northern Germany and explored the influence of tree species, forest history, climate, and soil pH on SOC and nutrient pools. Results While the organic layer stored on average about 80% more C under pine than beech, the pools of SOC and total N in the total profile (organic layer plus mineral soil measured to 60 cm and extrapolated to 100 cm) were greater under pine by about 40% and 20%, respectively. This contrasts with a higher annual production of foliar litter and a much higher fine root biomass in beech stands, indicating that soil C sequestration is unrelated to the production of leaf litter and fine roots in these stands on Pleistocene sandy soils. The pools of available P and base cations tended to be higher under beech. Neither precipitation nor temperature influenced the SOC pool, whereas tree species was a key driver. An extended data set (which included additional pine stands established more recently on former agricultural soil) revealed that, besides tree species identity, forest continuity is an important factor determining the SOC and nutrient pools of these stands.
Conclusion We conclude that tree species identity can exert a considerable influence on the stocks of SOC and macronutrients, which may be unrelated to productivity but closely linked to species-specific forest management histories, thus masking weaker climate and soil chemistry effects on pool sizes.
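Per-layer SOC stocks of the kind compared above are conventionally computed as carbon concentration × bulk density × layer thickness, with a correction for stone content. The abstract does not report layer-level data, so the values below are purely illustrative; the formula itself is the standard depth-interval calculation, not a method specific to this study.

```python
def soc_stock_mg_ha(c_percent, bulk_density_g_cm3, depth_cm, stone_fraction=0.0):
    """SOC stock of one soil layer in Mg C per hectare.

    Derivation: soil mass per ha = bulk density (g/cm3) x depth (cm) x 1e8 cm2/ha,
    which simplifies to BD x depth x 100 Mg/ha; multiply by the carbon fraction.
    """
    return c_percent / 100 * bulk_density_g_cm3 * depth_cm * (1 - stone_fraction) * 100

# Illustrative stone-free layer: 1.2% C, bulk density 1.4 g/cm3, 0-30 cm depth.
print(round(soc_stock_mg_ha(1.2, 1.4, 30), 1))  # -> 50.4 Mg C/ha
```

Profile totals like the 0–100 cm pools in the abstract are then the sum of such layer stocks.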


2021 ◽  
pp. 194016122110091
Author(s):  
Magdalena Wojcieszak ◽  
Ericka Menchen-Trevino ◽  
Joao F. F. Goncalves ◽  
Brian Weeks

The online environment dramatically expands the number of ways people can encounter news, but there remain questions of whether these abundant opportunities facilitate news exposure diversity. This project examines key questions regarding how internet users arrive at news and what kinds of news they encounter. We account for a multiplicity of avenues to news online, some of which have never been analyzed: (1) direct access to news websites, (2) social networks, (3) news aggregators, (4) search engines, (5) webmail, and (6) hyperlinks in news. We examine the extent to which each avenue promotes news exposure and also exposes users to news sources that are left-leaning, right-leaning, and centrist. When combined with information on individual political leanings, we show the extent of dissimilar, centrist, or congenial exposure resulting from each avenue. We rely on web browsing history records from 636 social media users in the US paired with survey self-reports, a unique data set that allows us to examine both aggregate and individual-level exposure. Visits to news websites account for about 2 percent of the total number of visits to URLs and are unevenly distributed among users. The most widespread ways of accessing news are search engines and social media platforms (and hyperlinks within news sites once people arrive at news). These two avenues also increase dissimilar news exposure, compared to accessing news directly, yet direct news access drives the highest proportion of centrist exposure.
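Classifying a visit into one of the six avenues from browsing records amounts to mapping the visit's referrer domain to an avenue category. The sketch below illustrates that idea; the domain lists and the classification rules are illustrative assumptions, not the authors' actual coding scheme.

```python
# Illustrative domain sets -- placeholders, not the study's classification lists.
AGGREGATORS = {"news.google.com"}
WEBMAIL = {"mail.google.com", "outlook.live.com"}
SOCIAL = {"facebook.com", "twitter.com"}
SEARCH = {"google.com", "bing.com"}
NEWS_SITES = {"nytimes.com", "foxnews.com"}

def classify_avenue(referrer_domain):
    """Map a news visit's referrer domain to one of the six avenues."""
    if referrer_domain in AGGREGATORS:
        return "aggregator"
    if referrer_domain in WEBMAIL:
        return "webmail"
    if referrer_domain in SOCIAL:
        return "social"
    if referrer_domain in SEARCH:
        return "search"
    if referrer_domain in NEWS_SITES:
        return "within-news hyperlink"
    return "direct" if referrer_domain == "" else "other"

print(classify_avenue("twitter.com"))  # -> social
print(classify_avenue(""))             # -> direct (no referrer)
```

Aggregating these labels over all news visits, per user and overall, yields the avenue proportions the abstract reports.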


2018 ◽  
Vol 61 (1) ◽  
pp. 55-82 ◽  
Author(s):  
Takaaki Masaki

Abstract: This article utilizes a newly available dataset on the geographical distribution of development projects in Zambia to test whether electoral incentives shape aid allocation at the subnational level. Based on this dataset, it argues that when political elites have limited information to target distributive goods specifically to swing voters, they allocate more donor projects to districts where opposition to the incumbent is strong, as opposed to districts where the incumbent enjoys greater popularity.


2018 ◽  
Vol 21 ◽  
pp. S2
Author(s):  
W Lo-Ciganic ◽  
WF Gellad ◽  
L Zhou ◽  
JM Donohue ◽  
A Roubal ◽  
...  

Circulation ◽  
2008 ◽  
Vol 118 (suppl_18) ◽  
Author(s):  
Robert L Page ◽  
Christopher Hogan ◽  
Kara Strongin ◽  
Roger Mills ◽  
JoAnn Lindenfeld

In fiscal year 2003, Medicare beneficiaries with heart failure (HF) accounted for 37% of all Medicare spending and nearly 50% of all hospital inpatient costs. On average, each beneficiary had 10.3 outpatient and 2 inpatient visits specifically for HF. Despite significant improvements in medical care for HF, mortality and hospital admissions remain high. No data exist regarding the number of providers ordering and providing care for this population. An analysis of fiscal year 2005 Medicare claims was conducted, using a 5% sample standard analytic and denominator file, limited data set version, to extrapolate to the 34,150,200 Medicare beneficiaries. Three cohorts were defined according to mild, moderate, or severe HF, employing the Centers for Medicare and Medicaid Services Hierarchical Condition Categories Model and Chronic Care Improvement Program definitions. HMO enrollees, persons without Part A and Part B coverage, and those outside the United States were excluded. We identified physicians by using the unique physician identification number of performing physicians. Based on inclusion criteria, 173,863 beneficiaries were identified. The average number of providers providing care in all sites was 15.9, 18.6, and 23.1 for beneficiaries with mild, moderate, and severe HF, respectively; and 10.1, 11.5, and 12.1 in the outpatient setting, respectively. The average number of providers ordering care in all sites was 8.3, 9.6, and 11.2 for beneficiaries with mild, moderate, and severe HF, respectively; and 6.5, 7.3, and 7.8 in the outpatient setting, respectively. For beneficiaries with mild disease, only 10% of all office visits were specifically for HF, while for those with moderate or severe disease, only 20% were specifically for HF. Medicare beneficiaries with HF, even those with mild disease, have a large number of providers ordering and providing care. These data highlight the importance of developing systems and processes of coordinated care for this population.
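The per-beneficiary averages above reduce to tallying unique performing-physician identification numbers per beneficiary across claim lines and averaging within each severity cohort. A minimal sketch of that tally, with illustrative claims rather than CMS data:

```python
from collections import defaultdict

# Illustrative claim lines: (beneficiary_id, performing_physician_id).
# Real analyses would read these from the Medicare standard analytic file.
claims = [
    ("bene1", "A1"), ("bene1", "A2"), ("bene1", "A1"),  # repeat visits, 2 providers
    ("bene2", "B1"), ("bene2", "B2"), ("bene2", "B3"),  # 3 providers
]

providers_per_bene = defaultdict(set)
for bene, physician in claims:
    providers_per_bene[bene].add(physician)  # sets deduplicate repeat visits

avg_providers = sum(len(s) for s in providers_per_bene.values()) / len(providers_per_bene)
print(avg_providers)  # -> 2.5
```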


Author(s):  
Dan Lin ◽  
Ziv Shkedy ◽  
Dani Yekutieli ◽  
Tomasz Burzykowski ◽  
Hinrich W.H. Göhlmann ◽  
...  

Dose-response studies are commonly used in experiments in pharmaceutical research in order to investigate the dependence of the response on dose, i.e., a trend of the response level (toxicity) with respect to dose. In this paper we focus on dose-response experiments within a microarray setting in which several microarrays are available for a sequence of increasing dose levels. A gene is called differentially expressed if there is a monotonic trend (with respect to dose) in the gene expression. We review several testing procedures that can be used to test equality among the gene expression means against ordered alternatives with respect to dose, namely Williams' (Williams 1971 and 1972), Marcus' (Marcus 1976), the global likelihood ratio test (Bartholomew 1961, Barlow et al. 1972, and Robertson et al. 1988), and M (Hu et al. 2005) statistics. Additionally we introduce a modification to the standard error of the M statistic. We compare the performance of these five test statistics. Moreover, we discuss the issue of one-sided versus two-sided testing procedures. The False Discovery Rate (Benjamini and Hochberg 1995, Ge et al. 2003) and resampling-based Familywise Error Rate (Westfall and Young 1993) are used to handle the multiple testing issue. The methods above are applied to a data set with 4 doses (3 arrays per dose) and 16,998 genes. Results on the number of significant genes from each statistic are discussed. A simulation study is conducted to investigate the power of each statistic. An R library, IsoGene, implementing the methods is available from the first author.
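The False Discovery Rate control cited above is the Benjamini–Hochberg step-up procedure: sort the per-gene p-values, compare the i-th smallest against i·α/m, and reject every hypothesis up to the largest rank that clears its threshold. A minimal sketch, with illustrative p-values rather than the 16,998-gene data set:

```python
def benjamini_hochberg(pvalues, alpha=0.05):
    """Benjamini-Hochberg step-up: return indices of hypotheses rejected at FDR alpha."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])  # indices by ascending p-value
    k = 0  # largest rank whose p-value clears its BH threshold
    for rank, idx in enumerate(order, start=1):
        if pvalues[idx] <= rank * alpha / m:
            k = rank
    return sorted(order[:k])  # reject everything at or below rank k

# Six illustrative per-gene p-values; thresholds are 0.0083, 0.0167, ..., 0.05.
pvals = [0.001, 0.008, 0.039, 0.041, 0.09, 0.20]
print(benjamini_hochberg(pvals, alpha=0.05))  # -> [0, 1]
```

Note the step-up logic: rejection is decided by the largest qualifying rank, so a small p-value can pull borderline ones above it into the rejection set, which is what distinguishes BH from a simple per-test cutoff.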


2021 ◽  
Vol 10 (1) ◽  
Author(s):  
Michael S. Puddicombe

Productivity is widely recognized as one of the main contributors to increased economic and societal wellbeing. Unfortunately, productivity has been extremely difficult to operationalize in a repeatable context in the construction sector. The result is a lack of consensus on the basic question of whether there has been improvement or decline in the productivity of the sector. This study focuses on productivity in the housing industry. Productivity is especially important in this industry, as in addition to providing shelter, the housing market is the primary source of wealth accumulation in the US. An individual's ability to enter this market will be a function of affordability, which will be affected by the productivity of the industry. The combination of academic and societal impacts suggests that there is a need to address a fundamental question: What is the status of productivity in the housing industry? In order to address this question, a database was compiled from the 10-Ks of the largest, long-lived US companies in the single-family housing industry. The result is a panel data set that consists of information on 11 firms over a 15-year period. These 11 firms were responsible for approximately 25% of all new home sales in any given year. The data set was analyzed with random effects GLS time series regression. The results indicate that, at best, the housing industry has seen negligible total productivity growth.
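The headline finding is a trend estimate: regress (log) productivity on time and read the slope as the annual growth rate. The study uses random effects GLS on the firm panel; the sketch below shows only the simpler generic version of that idea, a pooled log-linear trend via ordinary least squares, on an illustrative near-flat series rather than the 10-K data.

```python
import math

def ols_slope(x, y):
    """Slope of the ordinary least squares line of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
           sum((xi - mx) ** 2 for xi in x)

# Illustrative 15-year productivity index drifting up ~0.1% per year.
years = list(range(2004, 2019))
productivity = [1.00 + 0.001 * (y - 2004) for y in years]

growth = ols_slope(years, [math.log(p) for p in productivity])
print(f"{growth:.4%} annual growth")  # ~0.1% per year: "negligible" in the abstract's terms
```

The random effects GLS estimator additionally allows each firm its own intercept and accounts for within-firm error correlation, but the interpretation of the time coefficient as a growth rate is the same.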

