Should policy makers trust composite indices? A commentary on the pitfalls of inappropriate indices for policy formation

2021 ◽  
Vol 19 (1) ◽  
Author(s):  
Matthias Kaiser ◽  
Andrew Tzer-Yeu Chen ◽  
Peter Gluckman

Abstract Background This paper critically discusses the use and merits of global indices, in particular, the Global Health Security Index (GHSI; Cameron et al. https://www.ghsindex.org/#l-section--map) in times of an imminent crisis, such as the current pandemic. This index ranked 195 countries according to their expected preparedness in the case of a pandemic or other biological threat. The coronavirus disease 2019 (Covid-19) pandemic provides the background against which each country's predicted performance on the GHSI can be compared with its actual performance. In general, there is an inverse relation between predicted and actual performance, i.e. the predicted top performers are among those that were worst hit. Obviously, this reflects poorly on the potential policy uses of this index in imminent crisis management. Methods The paper analyses the GHSI and identifies why it may have struggled to predict actual pandemic preparedness as evidenced by the Covid-19 pandemic. The paper also uses two different data sets, one from Worldometer on the spread of the Covid-19 pandemic, and the other from the International Network for Government Science Advice (INGSA) Evidence-to-Policy Tracker, to draw comparisons between the actual introduction of pandemic response policies and the corresponding death rate in 29 selected countries. Results This paper analyses the reasons for the poor match between prediction and reality in the index, and makes six general observations applying to global indices in this respect. These observations are based on methodological and conceptual analyses. The level of abstraction in these global indices builds uncertainties upon uncertainties and hides implicit value assumptions, which potentially removes them from the policy needs on the ground. Conclusions From the analysis, the question is raised of whether the policy community might have better tools for decision-making in a pandemic. 
On the basis of data from the INGSA Evidence-to-Policy Tracker, and with backing in studies from social psychology and philosophy of science, some simple heuristics are suggested, which may be more useful than a global index.

2021 ◽  
Vol 13 (2) ◽  
pp. 179-194
Author(s):  
Serene Ong ◽  
Jeffrey Ling ◽  
Angela Ballantyne ◽  
Tamra Lysaght ◽  
Vicki Xafis

Abstract Governments are investing in precision medicine (PM) with the aim of improving healthcare through the use of genomic analyses and data analytics to develop tailored treatment approaches for individual patients. The success of PM is contingent upon clear public communications that engender trust and secure the social licence to collect and share large population-wide data sets, because specific consent for each data re-use is impractical. Variation in the terminology different programmes use to describe PM may hinder clear communication and threaten trust. Language is used to create common understanding and expectations regarding precision medicine among researchers, clinicians and volunteers. There is a need to better understand public interpretations of PM-related terminology. This paper reports on a qualitative study involving 24 focus group participants in the multi-lingual context of Singapore. The study explored how Singaporeans interpret and understand the terms ‘precision medicine’ and ‘personalised medicine’, and which term they felt more aptly communicates the concept and goals of PM. Results suggest that participants were unable to readily link the terms with this area of medicine and initially displayed preferences for the more familiar term ‘personalised’. The use of visual aids to convey key concepts resonated with participants, some of whom then indicated preferences for the term ‘precision’ as a more accurate description of PM research. These aids helped to facilitate dialogue around the ethical and social value, as well as the risks, of PM. Implications for programme developers and policy makers are discussed.


Author(s):  
Timo Wandhöfer ◽  
Steve Taylor ◽  
Miriam Fernandez ◽  
Beccy Allen ◽  
Harith Alani ◽  
...  

The role of social media in politics has increased considerably. A particular challenge is how to deal with the deluge of information generated on social media: it is impractical to read large numbers of messages in the hope of finding useful information. In this chapter, the authors suggest an alternative approach: utilizing analysis software to extract the most relevant information from the discussions taking place. This chapter discusses the WeGov Toolbox as one concept for policy-makers to deal with the information overload on social media, and how it may be applied. Two complementary, in-depth case studies were carried out to validate the usefulness of the WeGov Toolbox components' analysis results within the target audience's everyday life. Firstly, the authors used the “HeadsUp” forum, operated by the Hansard Society. Here, they were able to compare the key themes and opinions extracted automatically by the Toolbox to a control group of manually pre-analyzed data sets. In parallel, results of analyses based on four weeks' intensive monitoring of policy area-specific Facebook pages selected by German policy makers, as well as topics on Twitter globally and locally, were assessed by taking into account the policy makers' existing experience with the content discussed and user behavior in their respective public spheres. The cases show that there are interesting applications for policy-makers to use the Toolbox in combination with online forums (blogs) and social networks, provided that behavioral user patterns are considered and the framework is refined.


Author(s):  
Eleftheria Vasileiadou

The participation of stakeholders in policy formation has increased, based on the recognition that policy-makers today face increasingly complex and non-linear problems, requiring flexible modes of governance. In this chapter, I analyse the role of formalised stakeholder consultations in EU energy policy and their potential for integrating climate change issues. More specifically, I empirically investigate how stakeholder consultation processes influenced the formation of the EU Energy Communication of 2007. The analysis shows that there was limited diversity of participation in consultations, as actors from civil society and NGOs were not included. Moreover, the role of scientific knowledge in the consultations was minimal. Actors at the regional and sub-national level are generally ignored in such formalised consultation processes. Recommendations for EU policymakers and organisers of consultations are provided.


<em>Abstract.</em>—The landscape for policy and management of fish habitat is changing. The historic focus on evaluating environmental impact assessments for large projects, and issuing (or not) permits for small projects, is being supplanted by new expectations for habitat managers and policy makers. Many of these new expectations are rooted in the adoption of an ecosystem approach to management of diverse human activities, including fisheries, in aquatic ecosystems, combined with a growing emphasis on integrated management of those human activities, in turn aided by spatial planning and spatial management approaches in many fields. These new expectations placed on habitat managers and policy makers create the need for expanded support from a new blending of habitat and population sciences. Historically, it may have been sufficient to use science advice based on relative indices of habitat quality and carefully assembled expert opinion as the basis for many tasks in habitat policy and management. Such tools now must be augmented by much more quantitative science advice, to allow for setting operational objectives for managing habitats, assessing the quality and quantity of critical or essential habitat for protected or exploited fish populations, conducting risk assessments of projects and mitigation measures, making siting decisions about marine protected areas and other spatial zoning measures, and many other tasks in which habitat managers and policy makers must participate. Science advice now must be able to quantify the relationships between habitat features and population status and productivity, as well as with community properties such as resilience and vulnerability. This advice has to capture the uncertainty in the relationships and data sources, in forms that fit comfortably into risk assessments. 
Tools for forward projection of the habitat consequences of management options are needed, as are tools for cost-benefit analyses of tradeoffs among different types of habitats for different groups of aquatic species. None of these analytical challenges is beyond the scope of modern statistical and modelling capabilities and current ecological concepts; few of them, however, can be met by existing tools and databases. Moreover, many of the conceptual approaches to aquatic habitat management have been imported from terrestrial habitat management. They may have served adequately for management of riverine and marine benthic habitats, but some of the fundamental conceptual starting points are being questioned for marine and lacustrine habitats more generally. The paper brings out both some promising opportunities and some difficult challenges for the science needed to support contemporary habitat management and policy.


Author(s):  
Stephen G. Zemba ◽  
Michael R. Ames ◽  
Laura C. Green

Most ash generated by waste-to-energy (WTE) facilities in the U.S. is landfilled. Studies undertaken in the late 1980s and early 1990s indicated no significant environmental concerns associated with ash landfilling. However, in 2001, policy-makers at the Massachusetts Department of Environmental Protection (MA DEP) became concerned that the “cumulative” impacts of landfills, including ash landfills, might pose a risk to human health. To address this concern, we performed an in-depth assessment of impacts to air quality, and theoretical risks to health, from fugitive emissions associated with an ash landfill. Nine sources of fugitive ash emissions were modeled using methods that coupled detailed information about the site operations, ash properties, and meteorological conditions on an hour-by-hour basis. The results of these assessments, combined with ambient air data collected by others, demonstrated that the impacts from fugitive emissions of the ash were negligible. Accordingly, in 2006, MA DEP revised its policy, exempting ash disposal landfills from the requirement to demonstrate no significant impact, effectively granting presumptive certainty to ash landfills that employ best management practices. Detailed analyses such as those described herein, combined with robust data sets, can form the basis of more efficient regulatory policies.


Land ◽  
2020 ◽  
Vol 9 (5) ◽  
pp. 143 ◽  
Author(s):  
Daikun Wang ◽  
Victor Jing Li ◽  
Huayi Yu

The traditional linear regression model of mass appraisal is increasingly unable to satisfy the standard of mass appraisal with large data volumes, complex housing characteristics and high accuracy requirements. Therefore, it is essential to utilize the inherent spatial-temporal characteristics of properties to build a more effective and accurate model. In this research, we take Beijing’s core area, a typical urban center, as the study area of modeling for the first time. Thousands of real transaction records spanning 2014, 2016 and 2018 are aggregated at the community level (community annual average price). Three different models, including multiple regression analysis (MRA) with ordinary least squares (OLS), geographically weighted regression (GWR) and geographically and temporally weighted regression (GTWR), are adopted for comparative analysis. The results indicate that the GTWR model, with an adjusted R2 of 0.8192, performs better in the mass appraisal modeling of real estate. The comparison of different models provides a useful benchmark for policy makers regarding the mass appraisal process of urban centers. The finding also highlights the spatial characteristics of price-related parameters in high-density residential areas, providing an efficient evaluation approach for planning, land management, taxation, insurance, finance and other related fields.
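The contrast the abstract draws between a single global OLS fit and geographically weighted alternatives can be illustrated with a minimal sketch. This is not the authors' model: the data, coordinates, kernel and bandwidth below are all synthetic assumptions, and the temporal weighting of GTWR is omitted for brevity. In GWR, each location gets its own weighted least-squares fit, with nearby observations weighted more heavily.

```python
import numpy as np

# Hypothetical sketch: a housing attribute whose price effect drifts
# from west to east, which a single global OLS slope cannot capture.
rng = np.random.default_rng(0)
n = 200
coords = rng.uniform(0, 10, size=(n, 2))            # (x, y) community locations
attr = rng.normal(size=n)                           # one housing attribute
beta_true = 1.0 + 0.3 * coords[:, 0]                # slope varies with longitude
price = beta_true * attr + rng.normal(scale=0.1, size=n)

X = np.column_stack([np.ones(n), attr])

# Global OLS: one slope estimate for the whole study area
beta_ols, *_ = np.linalg.lstsq(X, price, rcond=None)

def gwr_at(point, bandwidth=1.5):
    """Locally weighted least squares centred on `point` (Gaussian kernel)."""
    d = np.linalg.norm(coords - point, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)         # nearby points weigh more
    XtW = X.T * w                                    # weight each observation
    return np.linalg.solve(XtW @ X, XtW @ price)

west = gwr_at(np.array([1.0, 5.0]))
east = gwr_at(np.array([9.0, 5.0]))
print(beta_ols[1], west[1], east[1])                # local slopes straddle OLS
```

The local fits recover a smaller slope in the west and a larger one in the east, while the global OLS slope averages the two; GTWR extends the same kernel idea to a space-time distance.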


2019 ◽  
Vol 20 (1) ◽  
Author(s):  
Tuan-Minh Nguyen ◽  
Adib Shafi ◽  
Tin Nguyen ◽  
Sorin Draghici

Abstract Background Many high-throughput experiments compare two phenotypes, such as disease vs. healthy, with the goal of understanding the underlying biological phenomena characterizing the given phenotype. Because of the importance of this type of analysis, more than 70 pathway analysis methods have been proposed so far. These can be categorized into two main categories: non-topology-based (non-TB) and topology-based (TB). Although some review papers discuss this topic from different aspects, there is no systematic, large-scale assessment of such methods. Furthermore, the majority of pathway analysis approaches rely on the assumption of uniformity of p values under the null hypothesis, which is often not true. Results This article presents the most comprehensive comparative study on pathway analysis methods available to date. We compare the actual performance of 13 widely used pathway analysis methods in over 1085 analyses. These comparisons were performed using 2601 samples from 75 human disease data sets and 121 samples from 11 knockout mouse data sets. In addition, we investigate the extent to which each method is biased under the null hypothesis. Together, these data and results constitute a reliable benchmark against which future pathway analysis methods could and should be tested. Conclusion Overall, the results show that no method is perfect. In general, TB methods appear to perform better than non-TB methods. This is somewhat expected, since TB methods take into consideration the structure of the pathway, which is meant to describe the underlying phenomena. We also discover that most, if not all, of the listed approaches are biased and can produce skewed results under the null.
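The bias check the abstract describes rests on a simple idea: when the null hypothesis is true, a well-calibrated test should produce p values uniformly distributed on [0, 1]. A minimal sketch of that idea, not one of the 13 benchmarked methods, is to simulate many comparisons with no real group difference and measure the distance of the resulting p-value distribution from uniform; the test, group sizes and normal approximation below are all illustrative assumptions.

```python
import math
import numpy as np

rng = np.random.default_rng(1)

def welch_p(a, b):
    """Two-sided Welch t-test p value via a normal approximation
    (adequate here because each group has 50 samples)."""
    var_term = a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b)
    t = (a.mean() - b.mean()) / math.sqrt(var_term)
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(t) / math.sqrt(2.0))))

# Both "phenotype" groups are drawn from the SAME distribution,
# so every rejection would be a false positive.
pvals = np.sort([
    welch_p(rng.normal(size=50), rng.normal(size=50))
    for _ in range(2000)
])

# Kolmogorov-Smirnov-style distance from Uniform(0, 1):
# a large value would indicate a skewed (biased) null distribution.
ecdf = np.arange(1, len(pvals) + 1) / len(pvals)
ks = float(np.max(np.abs(ecdf - pvals)))
print(round(ks, 3))
```

For a calibrated test the distance stays small; applying the same simulation to a pathway analysis method with resampled null data is one way to expose the skew the authors report.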


2009 ◽  
Vol 30 (6) ◽  
pp. 852-872 ◽  
Author(s):  
Rebecca L. Clark ◽  
Jennifer E. Glick ◽  
Regina M. Bures

Family researchers and policy makers are giving increasing attention to the consequences of immigration for families. Immigration affects the lives of family members who migrate as well as those who remain behind and has important consequences for family formation, kinship ties, living arrangements, and children's outcomes. We present a selective review of the literature on immigrant families in the United States, focusing on key research themes and needs. A summary of secondary data sets that can be used to study immigrant families is presented as well as suggestions for future research in this increasingly important area of family research and policy.


1995 ◽  
Vol 31 (2) ◽  
pp. 213-226 ◽  
Author(s):  
P. K. Thornton ◽  
A. R. Saka ◽  
U. Singh ◽  
J. D. T. Kumwenda ◽  
J. E. Brink ◽  
...  

SUMMARY A computer crop simulation model of the growth and development of maize was validated using data sets obtained from field experiments run at various sites in the mid-altitude maize zone of central Malawi between 1989 and 1992. The model was used to provide information concerning management options such as the timing and quantity of nitrogen fertilizer applications and to quantify the weather-related risks of maize production in the region. It was linked to a Geographic Information System to provide information at a regional level that could ultimately be of value to policy makers and research and extension personnel.


2021 ◽  
Author(s):  
Mark Adrian Turcato

The performance gap, the difference between how a building was intended to perform and its actual performance, poses a challenge to successful high performance design. This research examines the application of submetering data and whole building energy models to evaluate the performance gap in buildings as related to energy consumption, and specifically energy use associated with receptacles and lighting. While difficulties in grappling with large amounts of data persist, results indicate that building management and occupancy issues can explain a significant portion of the differences between predicted and actual energy use. Experience working with these data sets also suggests that further efforts are required to demonstrate the value of submetering, in order to ensure submetering systems are not compromised by the value engineering process.

