Data collection, management, and score reporting

2020 ◽  
pp. 158-180
Author(s):  
Slobodanka Dimova ◽  
Xun Yan ◽  
April Ginther


2020 ◽  
pp. 1564-1619
Author(s):  
Jeremy Horne

In the last half century, we have gone from storing data on 5¼-inch floppy diskettes to storing it in the cloud, and we now use fog computing. But one should ask why so much data is being collected. Part of the answer is straightforward in light of scientific projects, but why is there so much data on us? Then we ask about its “interface” through fog computing. Such questions prompt this article on the philosophy of big data and fog computing. After some background on definitions, origins, and contemporary applications, the main discussion begins by considering modern data collection, management, and applications from a complexity standpoint. Big data is turned into knowledge, but knowledge is extrapolated from the past and used to manage the future. Yet it is questionable whether humans have the capacity to manage contemporary technological and social complexity, as evidenced by a world in crisis and possibly on the brink of extinction. This calls for a new way of studying societies from a scientific point of view. We are at the center of the observation from which big data emerge and are manipulated, the overall human project being not only to create an artificial brain with an attendant mind but also a society that might be able to survive what “natural” humans cannot.


1984 ◽  
Vol 6 (2) ◽  
pp. 6-6

Oswald Werner was one of the first to actually take data processing equipment into the field. This section is included as an illustration of the difficulties that face the technological innovator. The story illustrates the saying "Anything you can buy is already obsolete." Nonetheless, Werner has successfully implemented a sophisticated data collection, management, and analysis technique using readily available equipment and working under field conditions.


2014 ◽  
Vol 12 (5) ◽  
pp. 383
Author(s):  
Nuala M. Cowan, DSc, MA, BA

Objective: An effective emergency response effort is contingent upon the quality and timeliness of the information provided to both the decision-making and coordinating functions, conditions that are hard to guarantee in the urgent climate of a response effort. The purpose of this paper is to present a validated Humanitarian Data Model (HDM) that can assist in the rapid assessment of disaster needs and in subsequent decision making. Substandard, inconsistent information can lead to poorly informed decisions and, subsequently, inappropriate response activities. Here we present a novel, organized, and fluid information management workflow to be applied during the rapid assessment phase of an emergency response. A comprehensive, peer-reviewed geospatial data model not only directs the design of data collection tools but also allows for more systematic data collection and management, leading to improved analysis and response outcomes.

Design: This research involved the development of a comprehensive geospatial data model to guide the collection, management, and analysis of geographically referenced assessment information, for implementation in the rapid response phase of a disaster using a mobile data collection app based on key outcome parameters. A systematic review of the literature and best practices was used to identify and prioritize the minimum essential data variables.

Subjects: The data model was critiqued for variable content, structure, and usability by a group of subject matter experts in the fields of humanitarian information management and geographic information systems.

Conclusions: Consensus found that the adoption of a standardized system of data collection, management, and processing, such as the data model presented here, could facilitate the collection and sharing of information between agencies with similar goals and improve the coordination of response efforts by unleashing the power of geographic information for humanitarian decision support.
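The abstract does not reproduce the HDM itself, so the following is a purely illustrative sketch of what one geographically referenced rapid-assessment record might look like; all field names here are assumptions for illustration, not variables from the published model.

```python
# Illustrative sketch only: field names are assumptions, not the published HDM.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class AssessmentRecord:
    """One geographically referenced rapid-assessment observation."""
    site_id: str
    latitude: float          # WGS84 decimal degrees
    longitude: float         # WGS84 decimal degrees
    recorded_at: datetime    # UTC timestamp of the observation
    shelter_damaged: bool
    water_access: bool
    people_affected: int

    def validate(self) -> None:
        """Reject coordinates outside valid WGS84 bounds and bad counts."""
        if not -90.0 <= self.latitude <= 90.0:
            raise ValueError(f"latitude out of range: {self.latitude}")
        if not -180.0 <= self.longitude <= 180.0:
            raise ValueError(f"longitude out of range: {self.longitude}")
        if self.people_affected < 0:
            raise ValueError("people_affected must be non-negative")


record = AssessmentRecord(
    site_id="SITE-001",
    latitude=18.54, longitude=-72.34,
    recorded_at=datetime.now(timezone.utc),
    shelter_damaged=True, water_access=False, people_affected=120,
)
record.validate()
```

Validating records at the point of capture, as sketched above, is one way a standardized model can enforce the consistency the authors argue is needed for cross-agency sharing.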


2019 ◽  
Author(s):  
Elizabeth A. Silva ◽  
Alicia B. Mejía ◽  
Elizabeth S. Watkins

Universities are at long last undertaking efforts to collect and disseminate information about student career outcomes, after decades of calls to action. Organizations such as Rescuing Biomedical Research and Future of Research brought this issue to the forefront of graduate education, and the second Future of Biomedical Graduate and Postdoctoral Training conference (FOBGAPT2) featured the collection of career outcomes data in its final recommendations, published in this journal (Hitchcock et al., 2017). More recently, 26 institutions assembled as the Coalition for Next Generation Life Science, committing to the ongoing collection and dissemination of career data for both graduate and postdoc alumni. A few individual institutions have shared snapshots of their data in peer-reviewed publications (Mathur et al., 2018; Silva, des Jarlais, Lindstaedt, Rotman, & Watkins, 2016) and on websites. As more institutions take up this call to action, they will be looking for tools, protocols, and best practices for ongoing career outcomes data collection, management, and dissemination. Here, we describe UCSF's experiences in conducting a retrospective study and in institutionalizing a methodology for annual data collection and dissemination. We describe and share all of the tools we have developed, and we provide calculations of the time and resources required to accomplish both retrospective studies and annual updates. We also include broader recommendations for implementation at your own institution, increasing the feasibility of this endeavor.


2016 ◽  
Vol 1 (2) ◽  
Author(s):  
Lance Lanyon

Professor Lance Lanyon recently published an article in Veterinary Record (Lanyon, 2016) proposing a nationwide Evidence-Based Veterinary Medicine (EBVM) system of veterinary-practice data collection, management, and interrogation. The goal is to use data from UK practices to aid “the understanding of the links between the cause, prevalence and treatment of disease.” His article describes the need for such a system and possible mechanisms to pay for it. Professor Lanyon's article started an important conversation about the role all practices can play in EBVM, so Veterinary Evidence asked Professor Lanyon to expand on some of his ideas.


2021 ◽  
Author(s):  
Vinicius Costa Lima ◽  
Filipe Andrade Bernardi ◽  
Felipe Carvalho Pellison ◽  
Francisco Barbosa Júnior ◽  
Márcio Elói Filho ◽  
...  

The outcomes of clinical research depend directly on the correct definition of the research protocol, the data collection strategy, and the data management plan. Furthermore, researchers often need to work within challenging contexts, such as tuberculosis services, where human and technological resources for research may be scarce. Electronic Data Capture systems, such as REDCap and KoboToolbox, can help to mitigate such risks and enable a reliable environment for conducting health research, promoting results dissemination and data reusability. The proposed solution was based on needs pinpointed by researchers, given the lack of a comprehensive solution for conducting research in low-resource environments. The REDbox framework was built to enhance data collection, management, and sharing in tuberculosis research while providing a better user experience. The relevance of this article lies in its innovative approach to supporting TB research by combining existing technologies and developing supporting features. By building on the strengths of each tool, it is possible to underpin tuberculosis research with improved data collection, management capability, and security. Furthermore, the aggregation of meaning onto raw data helps to promote the quality and availability of research data.
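The abstract does not describe REDbox's internals, but frameworks that build on REDCap typically pull data through REDCap's documented record-export API. As a minimal, hedged sketch of that pattern (the URL and token below are placeholders, not values from the article, and this is not REDbox's actual code):

```python
# Minimal sketch of a REDCap record export, as REDbox-style tooling might use it.
# REDCAP_URL and API_TOKEN are placeholders, not values from the article.
import requests

REDCAP_URL = "https://redcap.example.org/api/"
API_TOKEN = "YOUR_PROJECT_TOKEN"


def export_records() -> list[dict]:
    """Fetch all records from a REDCap project as flat JSON rows."""
    payload = {
        "token": API_TOKEN,
        "content": "record",   # REDCap's record-export action
        "format": "json",      # response serialization
        "type": "flat",        # one row per record/event
    }
    response = requests.post(REDCAP_URL, data=payload, timeout=30)
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    for row in export_records():
        print(row)
```

Centralizing exports behind one function like this makes it straightforward to layer on the kind of aggregation and quality checks the authors describe.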


2021 ◽  
Vol 2 (1) ◽  
pp. 1-9
Author(s):  
Masbullah Masbullah ◽  
◽  
Salmi Yuniar Bahri ◽  

Purpose: This study aimed to examine whether the asset data collection management of the Regional Secretariat of East Lombok Regency is carried out in accordance with the rules from start to finish, and to obtain a clear picture of the research topic. Research methodology: This was a qualitative study. Data were collected through documentation, field observation, document review, and interviews, and were then analyzed using the data analysis model of Miles and Huberman. Results: The results showed that regional asset data collection management in East Lombok Regency was in accordance with Domestic Government Regulation No. 19/2016. However, the management of official vehicles was not yet effective and efficient, as can be seen from management practices that still only follow the scope contained in the regulation. Limitations: The weakness of this study is that most of the data could not be obtained from primary sources and instead relied on secondary data or information from others. Contribution: It is hoped that this research can be used as a reference for further research. Furthermore, the results are expected to provide an overview of how the regional secretariat's asset data management can serve as material for evaluation and improvement in local government policymaking.


2015 ◽  
Vol 162 (4) ◽  
pp. 287 ◽  
Author(s):  
Tianjing Li ◽  
S. Swaroop Vedula ◽  
Nira Hadar ◽  
Christopher Parkin ◽  
Joseph Lau ◽  
...  
