Development of a communication system for humanitarian emergencies

Author(s):  
Antonio Sarasa Cabezuelo

One of the main tasks of NGOs in the countries in which they operate is to alert to possible humanitarian emergencies arising from events such as epidemics, famine, and armed conflict. A key element in keeping an emergency under control and avoiding tragic consequences is the speed of information management: rapid communication of the emergency, where it is occurring, and the immediate needs. This information allows NGO managers to take action and decide how to act. It is normally collected by cooperators at the place where it is produced, so specialized computer tools are needed that facilitate the cooperator's transmission of information and the subsequent processing and display of the information collected. This article describes a tool aimed at NGO cooperators that facilitates data collection, transmission, and processing.
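The workflow the article describes (field collection, then transmission for central processing) can be sketched in a few lines. This is a hedged illustration only, not the article's actual system; the report fields, the example location, and the JSON payload format are all hypothetical:

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class EmergencyReport:
    """Minimal field report a cooperator might submit (illustrative fields)."""
    event_type: str        # e.g. "epidemic", "famine", "armed_conflict"
    location: str
    immediate_needs: list
    reported_at: str

def build_report(event_type, location, needs):
    """Package a report for transmission to the NGO's central server."""
    report = EmergencyReport(
        event_type=event_type,
        location=location,
        immediate_needs=needs,
        reported_at=datetime.now(timezone.utc).isoformat(),
    )
    # A compact, text-based payload travels well over poor connections.
    return json.dumps(asdict(report))

payload = build_report("epidemic", "Goma, DRC", ["vaccines", "clean water"])
print(payload)
```

A timestamped, self-describing payload like this lets the receiving side process and display reports as they arrive, which is the speed-of-information-management point the abstract emphasizes.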

2009
Vol 24 (S2)
pp. s202-s205
Author(s):  
Susan Purdin ◽  
Paul Spiegel ◽  
Katelyn P. Mack ◽  
Jennifer Millen

Abstract Introduction: Surveillance is an essential component of health and nutrition information management during humanitarian situations. Changes in the nature and scope of humanitarian assistance activities have created new challenges in health surveillance, particularly outside of camp-based settings. Objectives: The primary aim of the Humanitarian Health Information Management Working Group was to identify challenges and areas needing further elucidation in a range of non-camp settings, including urban and rural settings as well as low- and middle-income countries. Results: Three major themes emerged: (1) standardization of measures and methodologies; (2) context in data collection and management; and (3) hidden populations and the purpose of surveillance in urban settings. Innovative examples of data collection and management in community-based surveillance were discussed, including task-shifting, health worker-to-community member ratios, and literacy needs. Conclusions: Surveillance in non-camp settings can be informed by surveillance activities in camp-based settings, but requires additional consideration of new methods and population needs to achieve its objectives.


2019
Vol 21 (4)
pp. 571-581
Author(s):  
Shobana Sivaraman ◽  
Punit Soni

Public health deals with the promotion of health and the prevention and treatment of communicable and non-communicable diseases by designing appropriate health interventions and services delivered through health systems. Evidence-based planning, strategy development, implementation, monitoring, and evaluation require a robust database on the magnitude of disease burden, socio-demographic characteristics, and associated risk factors. Although India has vast information available through various large-scale surveys and research studies, it still lacks a reliable health information management system. The available data are seldom analysed to draw meaningful conclusions, to build evidence for policies and strategies, or to measure the effectiveness of health programmes. The challenges faced in survey research are multifaceted, from data collection in the field to the rapid transmission of data to central servers. There is an increasing trend of using technology, especially computer-assisted personal interviewing (CAPI), which is not only expensive but also requires extensive training and careful information management for data transmission and storage. This article examines the application of technology in survey research for efficient data management and improved data quality. The software Open Data Kit (ODK) was used for data collection and real-time monitoring of interviewers in the field to improve the quality of data collection, achieve the desired response rate (RR), and better manage field operations. The data collection and field reporting forms designed in ODK demonstrate how technology can articulate research expectations at various levels at lower cost and higher efficiency. The article examines the use of technology in health survey research and aims to prompt further discussion of technology for field data collection and monitoring.
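The real-time monitoring described above, tracking each interviewer's response rate as submissions arrive, can be sketched as follows. This is an illustrative example, not the authors' actual ODK pipeline; the record layout, outcome codes, and RR threshold are hypothetical:

```python
from collections import defaultdict

# Each submission is an (interviewer_id, outcome) pair, where outcome is
# "complete", "refusal", or "not_home". These fields are invented for
# illustration; real ODK exports carry many more variables.
submissions = [
    ("int_01", "complete"), ("int_01", "refusal"), ("int_01", "complete"),
    ("int_02", "complete"), ("int_02", "not_home"), ("int_02", "not_home"),
]

def response_rates(records):
    """Return {interviewer_id: completed / attempted} for field monitoring."""
    totals = defaultdict(int)
    completed = defaultdict(int)
    for interviewer, outcome in records:
        totals[interviewer] += 1
        if outcome == "complete":
            completed[interviewer] += 1
    return {i: completed[i] / totals[i] for i in totals}

def flag_low_rr(rates, threshold=0.5):
    """Interviewers whose response rate falls below the target RR."""
    return sorted(i for i, r in rates.items() if r < threshold)

rates = response_rates(submissions)
print(flag_low_rr(rates))  # ['int_02']
```

Flagging underperforming interviewers while fieldwork is still ongoing, rather than after data entry, is what makes this kind of monitoring cheaper and faster than the paper-based alternative the abstract contrasts it with.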


2009
Vol 44 (3)
pp. 41-62
Author(s):  
Anna K. Jarstad

Why are some elections followed by armed conflict, while others are not? This article begins to explore this question by mapping the prevalence of power-sharing agreements and patterns of post-election peace in states shattered by civil war. While democracy builds on the notion of free political competition and uncertain electoral outcomes, power-sharing reduces this uncertainty by ensuring political power for certain groups. Nevertheless, new data presented in this article – the Post-Accord Elections (PAE) data collection – show that the issues of peace, power-sharing and democracy have become intertwined, as the vast majority of contemporary peace agreements provide for both power-sharing and elections. First, in contrast to previous research suggesting that power-sharing is a tool for ending violence, this study shows that conflict often continues after an agreement has been signed, even if it includes provisions for power-sharing. Second, this investigation shows no evidence of power-sharing facilitating the holding of elections; on the contrary, elections are more commonly held following a peace process without power-sharing. Third, a period of power-sharing ahead of the elections does not seem to provide for post-election peace; such elections are as dangerous as post-accord elections held without a period of power-sharing. The good news is that power-sharing does not seem to have a negative effect on post-election peace.


2019
Vol 37 (4)
pp. 244-249
Author(s):  
Akshay Rajaram ◽  
Trevor Morey ◽  
Sonam Shah ◽  
Naheed Dosani ◽  
Muhammad Mamdani

Background: Considerable gains are being made in data-driven efforts to advance quality improvement in health care. However, organizations providing hospice-oriented palliative care for structurally vulnerable persons with terminal illnesses may not have the enabling data infrastructure or framework to derive such benefits. Methods: We conducted a pilot cross-sectional qualitative study involving a convenience sample of hospice organizations across North America providing palliative care services for structurally vulnerable patients. Through semistructured interviews, we surveyed organizations on the types of data collected, the information systems used, and the challenges they faced. Results: We contacted 13 organizations across North America and interviewed 9. All organizations served structurally vulnerable populations, including the homeless and vulnerably housed, socially isolated, and HIV-positive patients. Common examples of collected data included the number of referrals, the number of admissions, length of stay, and diagnosis. More than half of the organizations (n = 5) used an electronic medical record, although none of the record systems were specifically designed for palliative care. All (n = 9) the organizations used the built-in reporting capacity of their information management systems and more than half (n = 6) augmented this capacity with chart reviews. Discussion: A number of themes emerged from our discussions. Present data collection is heterogeneous, and storage of these data is highly fragmented within and across organizations. Funding appeared to be a key enabler of more robust data collection and use. Future work should address these gaps and examine opportunities for innovative ways of analysis and reporting to improve care for structurally vulnerable populations.


2012
Vol 49 (2)
pp. 351-362
Author(s):  
Ralph Sundberg ◽  
Kristine Eck ◽  
Joakim Kreutz

This article extends the Uppsala Conflict Data Program (UCDP) by presenting new global data on non-state conflict, that is, armed conflict between two groups, neither of which is the state. The dataset includes conflicts between rebel groups and other organized militias, and thus serves as a complement to existing datasets on armed conflict, which have either ignored this kind of violence or aggregated it into civil war. The dataset also includes cases of fighting between supporters of different political parties, as well as cases of communal conflict, that is, conflict between two social groups usually identified along ethnic or religious lines. The dataset thus extends UCDP's conflict data collection to facilitate the study of topics like rebel fractionalization, paramilitary involvement in conflict violence, and communal or ethnic conflict. In the article, we present the background to the data collection, provide descriptive statistics for the period 1989–2008, and illustrate how the data can be used with the case of Somalia. These data move beyond state-centric conceptions of collective violence to facilitate research into the causes and consequences of group violence that occurs without state participation.
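To illustrate the kind of analysis such event data support, a minimal sketch of aggregating fatalities by conflict type for one country follows. The records and field names below are invented for illustration and do not reflect the actual UCDP codebook or figures:

```python
# Hypothetical event records in the spirit of a non-state conflict dataset;
# names, categories, and death counts are illustrative only.
events = [
    {"year": 1992, "location": "Somalia", "org_level": "rebel_vs_militia", "deaths": 400},
    {"year": 1993, "location": "Somalia", "org_level": "communal", "deaths": 150},
    {"year": 1991, "location": "Somalia", "org_level": "communal", "deaths": 90},
]

def deaths_by_type(records, location):
    """Aggregate fatalities per conflict type for one country."""
    totals = {}
    for e in records:
        if e["location"] == location:
            totals[e["org_level"]] = totals.get(e["org_level"], 0) + e["deaths"]
    return totals

print(deaths_by_type(events, "Somalia"))
# {'rebel_vs_militia': 400, 'communal': 240}
```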


2014
Vol 12 (5)
pp. 383
Author(s):  
Nuala M. Cowan, DSc, MA, BA

Objective: An effectual emergency response effort is contingent upon the quality and timeliness of information provided to both the decision-making and coordinating functions; conditions that are hard to guarantee in the urgent climate of the response effort. Substandard, inconsistent information can lead to poorly informed decisions and, subsequently, inappropriate response activities. The purpose of this paper is to present a validated Humanitarian Data Model (HDM) that can assist in the rapid assessment of disaster needs and subsequent decision making. We present a novel, organized, and fluid information management workflow to be applied during the rapid assessment phase of an emergency response. A comprehensive, peer-reviewed geospatial data model not only directs the design of data collection tools but also allows for more systematic data collection and management, leading to improved analysis and response outcomes. Design: This research involved the development of a comprehensive geospatial data model to guide the collection, management, and analysis of geographically referenced assessment information, implemented at the rapid response phase of a disaster using a mobile data collection app based on key outcome parameters. A systematic review of the literature and best practices was used to identify and prioritize the minimum essential data variables. Subjects: The data model was critiqued for variable content, structure, and usability by a group of subject matter experts in the fields of humanitarian information management and geographical information systems. Conclusions: Consensus found that the adoption of a standardized system of data collection, management, and processing, such as the data model presented here, could facilitate the collection and sharing of information between agencies with similar goals, and improve the coordination of efforts by unleashing the power of geographic information for humanitarian decision support.
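A minimal sketch of what a geo-referenced assessment record and a simple prioritization query over such records might look like is given below. The fields are illustrative stand-ins for the kind of minimum essential variables discussed, not the published HDM:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AssessmentPoint:
    """One geo-referenced rapid-assessment record (illustrative fields only)."""
    site_id: str
    latitude: float
    longitude: float
    population_affected: int
    priority_needs: List[str] = field(default_factory=list)  # e.g. "water"

def most_urgent(points, need):
    """Sites reporting a given need, largest affected population first."""
    hits = [p for p in points if need in p.priority_needs]
    return sorted(hits, key=lambda p: p.population_affected, reverse=True)

sites = [
    AssessmentPoint("S1", 18.54, -72.34, 1200, ["water", "shelter"]),
    AssessmentPoint("S2", 18.60, -72.30, 5400, ["water"]),
    AssessmentPoint("S3", 18.57, -72.29, 300, ["medical"]),
]
print([p.site_id for p in most_urgent(sites, "water")])  # ['S2', 'S1']
```

Because every record in a shared model carries the same variables in the same structure, queries like this work across agencies, which is the data-sharing benefit the conclusions point to.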


2008
Vol 38
pp. 88-92
Author(s):  
Vincent F. Hock ◽  
Susan Drozdz ◽  
Andrew Seelinger ◽  
Delmar Doyle

When the coating application does not meet the required standards, the lifetime of the coating can be substantially reduced; in the worst case, the coating may fail catastrophically immediately after being placed into service. This ongoing joint project conducted by the U.S. Navy and the U.S. Army is intended to demonstrate and provide for the automation of data collection for painting projects on critical structures, and to make these data a more effective resource for management decisions on the protection of DoD assets.


2019
Vol 23 (1)
pp. 16-22
Author(s):  
Jolanta Korycka-Skorupa ◽  
Tomasz Nowacki

Abstract Nowadays many people are trying to make maps, especially digital maps. A wide range of computer tools and high graphic capabilities have made maps increasingly popular and seemingly easy to prepare for anyone who can use a computer. It therefore seems necessary to verify the bases of cartographic presentation methods. There is a need for a new, formalized view of the method as a sequence of steps leading from data collection, through correct presentation, to the map. Two terms related to cartographic presentation are distinguished in this article: “methods” and “forms.” A method is understood as the process by which data is transformed into a presentation; a form is the end result of this process, i.e. the resulting graphical image or map. Five types of cartographic presentation are indicated in the article; across the successive types, one can observe an increasing degree of complexity of cartographic presentation.


Author(s):  
Joris S. M. Vergeest ◽  
Imre Horváth

Abstract The shared usage of computer tools among members of a design team relies heavily on the interoperability of the systems involved. Interoperability has been an outstanding issue in engineering information management science for more than twenty years, and is held responsible for multi-billion-dollar economic losses in industry every year. Efforts by standardization bodies (STEP, IGES) and by the software industry, which now delivers web-based platforms such as CORBA and Java, can only superficially address the interoperability problem. Most of the solutions come down to giving clients long fingers to remotely control a centralized model. It is generally recognized that such a centralized approach is far from efficient. However, when the design tasks are really distributed among the team members, a rock-bottom limitation invariably emerges, canceling most of the potential gain in efficiency. In this paper, interoperability is formally defined. It is then shown why, and under which conditions, interoperability is doomed to fail. The prime purpose of the paper is to promote awareness of this issue among researchers and infrastructure designers. Once aware of the fundamental constraints of interoperability, one may intentionally develop compromise solutions rather than implement ad hoc work-around procedures (which are responsible for the bulk of the financial loss mentioned). We present an approach to systematically analyze and model the requirements of a shared infrastructure, and to anticipate the feasibility of interoperability.

