Psychometric Performance of the Mental Health Implementation Science Tools (mhIST) Across Six Low- and Middle-income Countries

Author(s): Luke R Aldridge, Christopher G Kemp, Judith K Bass, Kristen Danforth, Jeremy C Kane, et al.

Abstract
Background: Existing implementation measures developed in high-income countries may have limited appropriateness for use within low- and middle-income countries (LMIC). In response, researchers at Johns Hopkins University began developing the Mental Health Implementation Science Tools (mhIST) in 2013 to assess priority implementation determinants and outcomes across four key stakeholder groups – consumers, providers, organization leaders, and policy makers – with dedicated versions of scales for each group. These were field tested and refined in several contexts, and criterion validity was established in Ukraine. The Consumer and Provider mhIST have since grown in popularity in mental health research, outpacing psychometric evaluation. Our objective was to establish the cross-context psychometric properties of these versions and inform future revisions.
Methods: We compiled data from seven studies across six LMIC – Colombia, Myanmar, Pakistan, Thailand, Ukraine, and Zambia – to evaluate the psychometric performance of the Consumer and Provider mhIST. We used exploratory factor analysis to identify dimensionality, factor structure, and item loadings for each scale within each stakeholder version. We also used alignment analysis (i.e., multi-group confirmatory factor analysis) to estimate measurement invariance and differential item functioning of the Consumer scales across the six countries.
Findings: All but one scale within the Provider and Consumer versions had a Cronbach's alpha greater than 0.8. Exploratory factor analysis indicated most scales were multidimensional, with factors generally aligning with a priori subscales for the Provider version; the Consumer version has no predefined subscales. Alignment analysis of the Consumer mhIST indicated a range of measurement invariance for scales across settings (R² 0.46 to 0.77). Several items were identified for potential revision due to participant non-response or low or cross-factor loadings. We found only one item – which asked consumers whether their intervention provider was available when needed – to have differential item functioning in both intercept and loading.
Conclusion: We provide evidence that the Consumer and Provider versions of the mhIST are internally valid and reliable across diverse contexts and stakeholder groups for mental health research in LMIC. We recommend that the instrument be revised based on these analyses and that future research examine its utility by linking measurement to other outcomes of interest.
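The internal-consistency threshold reported in the Findings (Cronbach's alpha greater than 0.8) is computed from the item-level variances and the variance of the summed scale score. The following minimal Python sketch is an illustration only, not the study's analysis code; the response matrix and function name are invented for the example.

```python
# Illustrative sketch only (not the mhIST study's analysis code): Cronbach's alpha
# for one hypothetical Likert-type scale, from a respondents-by-items matrix.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]                              # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)           # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)       # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: 5 respondents answering a 4-item scale scored 0-4.
responses = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 3],
    [1, 2, 1, 2],
    [3, 3, 4, 4],
])
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
# Values above 0.8, as reported for nearly all mhIST scales, are conventionally
# read as good internal consistency.
```

In practice, the exploratory factor analysis and alignment (multi-group confirmatory factor analysis) steps described in the Methods would be run with dedicated psychometric software rather than hand-rolled code.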

2009, Vol 195 (4), pp. 364-365
Author(s): Ricardo Araya

Summary
There are huge inequalities in health research within and between countries. It is argued that this may hinder the process of setting and tackling mental health priorities. If this were true, establishing research priorities would be important. However, this is not a simple process, and one must be aware of its limitations. Despite a plethora of declarations, funding for mental health research in low- and middle-income countries remains hard to find. In the absence of funding, establishing research priorities is seen by many as an exercise of lesser importance.


2015, Vol 2
Author(s): Mark Tomlinson, Barak Morgan

Background: Less than 3% of articles published in the peer-reviewed literature include data from low- and middle-income countries – where 90% of the world's infants live.
Methods: In this paper, we discuss the context of infancy in Africa and the conditions of adversity prevailing there.
Results: We discuss the implications of poverty for parenting and, linked to this, outline the impact of maternal depression on infant development.
Conclusions: We outline three features of the field of infant mental health research in Africa and issue a call to action on what we believe is needed to develop the field in the next decade.


Health Policy, 2010, Vol 94 (3), pp. 211-220
Author(s): Denise Razzouk, Pratap Sharan, Carla Gallo, Oye Gureje, Exaltacion E. Lamberte, et al.

2020, Vol 35 (4), pp. 424-439
Author(s): Nicole Votruba, Jonathan Grant, Graham Thornicroft

Abstract
The burden of mental illness is excessive, but many countries lack evidence-based policies to improve practice. Translating mental health research evidence into policymaking is a 'wicked problem', often failing despite a robust evidence base. In a recent systematic review, we identified a gap in frameworks on agenda setting and actionability; pragmatic, effective tools are needed to guide action linking research and policy. Responding to this gap, we developed the new EVITA 1.1 (EVIdence To Agenda setting) conceptual framework for mental health research–policy interrelationships in low- and middle-income countries (LMICs). We (1) drafted a provisional framework (EVITA 1.0); (2) validated it for specific applicability to mental health; (3) conducted expert in-depth interviews to (a) validate components and mechanisms and (b) assess intelligibility, functionality, relevance, applicability and effectiveness, developing a simple evaluation framework to guide the interview validation; and (4) using deductive framework analysis, coded and identified themes and finalized the framework (EVITA 1.1). Theoretical agenda-setting elements were added, as targeting the policy agenda-setting stage was found to lead to greater policy traction. The framework was validated through expert in-depth interviews (n = 13) and revised. EVITA 1.1 consists of six core components [advocacy coalitions, (en)actors, evidence generators, external influences, intermediaries and political context] and four mechanisms (capacity, catalysts, communication/relationship/partnership building and framing). EVITA 1.1 is novel because it specifically addresses the mental health research–policy process in LMICs and includes policy agenda setting as a novel, effective mechanism. Through its thorough methodology and its specific design and mechanisms, EVITA has the potential to improve the challenging process of translating research evidence into policy and practice in LMICs and to increase the engagement and capacity of mental health researchers, policy agencies/planners, think tanks, NGOs and others within the mental health research–policy interface. Next, EVITA 1.1 will be empirically tested in a case study.


2004, Vol 34 (5), p. 954

Late in 2003 the World Health Organisation convened a meeting of psychiatric journal editors to discuss the role of scientific journals in mental health research in developing countries. The joint statement from the meeting is now available at the Journal's website: http://journals.cambridge.org/final_joint_statement.pdf

