Eye Injury Registries: A Review on Key Registry Processes

Author(s):  
Farkhondeh Asadi ◽  
Nahid Ramezanghorbani ◽  
Sohrab Almasi ◽  
Mehrnaz Hajiabedin Rangraz

Background: Data management related to eye injuries is vital for improving the care process, improving treatment and implementing preventive programs. Implementation of a registry to manage these data is an integral part of this process. This systematic review aimed to identify processes related to eye injury registries. Methods: Databases including PubMed, Web of Science, Embase and Scopus were searched for articles published from 2010 to Oct 2020 using the keywords “eye injuries” and “registry”. The identified processes related to eye injury registries, such as case finding, data collection, abstracting, reporting, follow-up and data quality control, are presented in this review. Results: Of 1493 articles retrieved, 30 articles were selected for this study based on the inclusion and exclusion criteria. The majority of these studies were conducted in the United States. All registries performed case finding, and the most common resources for case finding included medical documents, reports and screening results. Moreover, the majority of registries collected data electronically. However, few registries used data quality attributes to improve the data collected. Conclusion: An eye injury registry plays an important role in the management of eye injury data and, as a result, enables better management of these data. Given that the quality of collected data has a vital role in adopting prevention strategies, it is essential to use high-quality data and quality control methods in planning and designing eye injury registries.
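The case-finding step described above, drawing candidate cases from several sources (medical documents, reports, screening results) and keeping one record per patient, can be sketched as follows. The field names and sample records are hypothetical, for illustration only:

```python
# Hypothetical case-finding sketch for an eye injury registry:
# merge candidate cases from several sources and de-duplicate by patient ID.

def find_cases(*sources):
    """Combine case lists from multiple sources, one record per patient."""
    cases = {}
    for source in sources:
        for record in source:
            # First occurrence wins; duplicates from later sources are skipped.
            cases.setdefault(record["patient_id"], record)
    return list(cases.values())

medical_documents = [{"patient_id": "P1", "injury": "corneal abrasion"}]
physician_reports = [{"patient_id": "P1", "injury": "corneal abrasion"},
                     {"patient_id": "P2", "injury": "hyphema"}]
screening_results = [{"patient_id": "P3", "injury": "foreign body"}]

registry = find_cases(medical_documents, physician_reports, screening_results)
print(len(registry))  # → 3 unique cases from 4 candidate records
```

In a real registry the merge key would be a verified patient identifier and later sources would typically enrich, rather than merely be skipped in favor of, the first record.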

PEDIATRICS ◽  
1992 ◽  
Vol 90 (6) ◽  
pp. 959-965
Author(s):  
Terri A. Slagle ◽  
Jeffrey B. Gould

The purpose of this national survey was to define the extent and features of database use by 445 tertiary-level neonatal intensive care nurseries in the United States. Of the 305 centers responding to our survey, 78% had a database in use in 1989 and 15% planned to develop one in the future. Nurseries varied remarkably in the volume of data collected, the amount of time devoted to completing data collection forms, and the personnel involved in data collection. Although data were used primarily for statistical reports (93% of nurseries), quality assurance (73%) and research activities (61%) were also enhanced by database information. Neonatal databases were used to generate reports for the permanent medical record in 38% of centers. Satisfaction with the database was highest in centers that collected a large volume of information and actually used it. Overall, nurseries expressed a high degree of confidence in the data they collected, and 65% felt their neonatal database information could be used directly in publication of research. It was disturbing that accuracy of data was not monitored formally by the majority of nurseries. Only 27% of centers followed a routine schedule of data quality assurance, and only 53% had built-in error messages for data entry. We caution all who receive database information in the form of morbidity and mortality statistics, clinical reports on patients cared for in neonatal units, and published manuscripts to be attentive to the quality of the data they consume. Our findings stress the importance of, and the need for immediate efforts toward, better data quality control in future database design.
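The built-in data-entry error messages the survey asks about amount to validation rules applied as a form is filled in. A minimal sketch follows; the field names and plausibility ranges are assumptions for illustration, not values taken from the study:

```python
# Illustrative data-entry validation for a neonatal database:
# each rule yields an error message when a value is missing or out of range.

RULES = {
    "birth_weight_g": (300, 6000, "birth weight out of plausible range"),
    "gestational_age_wk": (22, 44, "gestational age out of plausible range"),
}

def validate_entry(entry):
    """Return a list of error messages for one data-entry form."""
    errors = []
    for field, (low, high, message) in RULES.items():
        value = entry.get(field)
        if value is None:
            errors.append(f"{field}: missing value")
        elif not low <= value <= high:
            errors.append(f"{field}: {message} ({value})")
    return errors

print(validate_entry({"birth_weight_g": 150, "gestational_age_wk": 38}))
# → ['birth_weight_g: birth weight out of plausible range (150)']
```

Checks of this kind catch keystroke errors at entry time, which is far cheaper than the retrospective accuracy audits that most surveyed nurseries lacked.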


2012 ◽  
Vol 2012 ◽  
pp. 1-8 ◽  
Author(s):  
Janet E. Squires ◽  
Alison M. Hutchinson ◽  
Anne-Marie Bostrom ◽  
Kelly Deis ◽  
Peter G. Norton ◽  
...  

Researchers strive to optimize data quality in order to ensure that study findings are valid and reliable. In this paper, we describe a data quality control program designed to maximize quality of survey data collected using computer-assisted personal interviews. The quality control program comprised three phases: (1) software development, (2) an interviewer quality control protocol, and (3) a data cleaning and processing protocol. To illustrate the value of the program, we assess its use in the Translating Research in Elder Care Study. We utilize data collected annually for two years from computer-assisted personal interviews with 3004 healthcare aides. Data quality was assessed using both survey and process data. Missing data and data errors were minimal. Mean and median values and standard deviations were within acceptable limits. Process data indicated that in only 3.4% and 4.0% of cases was the interviewer unable to conduct interviews in accordance with the details of the program. Interviewers’ perceptions of interview quality also significantly improved between Years 1 and 2. While this data quality control program was demanding in terms of time and resources, we found that the benefits clearly outweighed the effort required to achieve high-quality data.
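The data cleaning and processing checks described, screening each variable for missing data and for means and standard deviations within acceptable limits, can be sketched as below. The missing-rate threshold and the example Likert item are assumptions for illustration, not the study's actual protocol:

```python
import statistics

def screen_variable(values, expected_range, max_missing=0.05):
    """Flag a survey variable whose missing rate or mean looks suspect."""
    present = [v for v in values if v is not None]
    missing_rate = 1 - len(present) / len(values)
    mean = statistics.mean(present)
    sd = statistics.stdev(present)
    low, high = expected_range
    flags = []
    if missing_rate > max_missing:
        flags.append("missing rate above 5%")
    if not low <= mean <= high:
        flags.append("mean outside acceptable limits")
    return {"missing_rate": round(missing_rate, 3), "mean": round(mean, 2),
            "sd": round(sd, 2), "flags": flags}

# Example: a 1-5 Likert item with one missing response out of ten.
report = screen_variable([4, 5, 3, None, 4, 4, 5, 3, 4, 4], expected_range=(1, 5))
print(report["flags"])  # → ['missing rate above 5%']
```

Running such a screen after every data collection wave, as part of a cleaning protocol, is what keeps "missing data and data errors minimal" in the sense the abstract reports.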


Cancer ◽  
2017 ◽  
Vol 123 ◽  
pp. 4982-4993 ◽  
Author(s):  
Claudia Allemani ◽  
Rhea Harewood ◽  
Christopher J. Johnson ◽  
Helena Carreira ◽  
Devon Spika ◽  
...  

2017 ◽  
Vol 6 (2) ◽  
pp. 505-521 ◽  
Author(s):  
Luděk Vecsey ◽  
Jaroslava Plomerová ◽  
Petr Jedlička ◽  
Helena Munzarová ◽  
Vladislav Babuška ◽  
...  

Abstract. This paper focuses on major issues related to the data reliability and network performance of 20 broadband (BB) stations of the Czech (CZ) MOBNET (MOBile NETwork) seismic pool within the AlpArray seismic experiments. Currently used high-resolution seismological applications require high-quality data recorded for a sufficiently long time interval at seismological observatories and during the entire time of operation of the temporary stations. In this paper we present new hardware and software tools we have been developing during the last two decades while analysing data from several international passive experiments. The new tools help to assure the high-quality standard of broadband seismic data and eliminate potential errors before supplying data to seismological centres. Special attention is paid to crucial issues like the detection of sensor misorientation, timing problems, interchange of record components and/or their polarity reversal, sensor mass centring, or anomalous channel amplitudes due to, for example, imperfect gain. Thorough data quality control should represent an integral constituent of seismic data recording, preprocessing, and archiving, especially for data from temporary stations in passive seismic experiments. Large international seismic experiments require enormous efforts from scientists from different countries and institutions to gather hundreds of stations to be deployed in the field during a limited time period. In this paper, we demonstrate the beneficial effects of the procedures we have developed for acquiring a reliable large set of high-quality data from each group participating in field experiments. The presented tools can be applied manually or automatically on data from any seismic network.
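One of the checks listed, flagging anomalous channel amplitudes due to imperfect gain, can be sketched as a simple RMS-ratio test between a station's horizontal components. The threshold and the synthetic records are assumptions for illustration, not the authors' actual tool:

```python
import math

def rms(samples):
    """Root-mean-square amplitude of a record."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def check_channel_gain(north, east, max_ratio=2.0):
    """Flag a station whose horizontal components differ in RMS amplitude
    by more than max_ratio -- a hint of an imperfect gain setting."""
    ratio = rms(north) / rms(east)
    ok = 1 / max_ratio <= ratio <= max_ratio
    return ok, round(ratio, 2)

# Synthetic records: the east component has a spurious gain factor of 10.
north = [math.sin(0.1 * i) for i in range(1000)]
east = [10 * math.sin(0.1 * i + 0.5) for i in range(1000)]

ok, ratio = check_channel_gain(north, east)
print(ok, ratio)  # the order-of-magnitude mismatch is flagged
```

Real pipelines would compare long-term spectral amplitudes against neighbouring stations rather than raw RMS, but the principle of an automated amplitude-consistency test is the same.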




2020 ◽  
pp. 183335832092936
Author(s):  
Ali Garavand ◽  
Reza Rabiei ◽  
Hassan Emami ◽  
Mehdi Pishgahi ◽  
Mojtaba Vahidi-Asl

Background: The management of data on coronary artery disease (CAD) plays a significant role in controlling the disease and reducing the mortality of patients. Disease registries facilitate the management of these data. Objective: This study aimed to identify the attributes of hospital-based CAD registries with a focus on key registry processes. Method: In this systematic review, we searched for studies published between 2000 and 2019 in PubMed, Scopus, EMBASE and ISI Web of Knowledge. The search terms included coronary artery disease, registry and data management (MeSH terms); the search was conducted in November 2019. Data gathering was conducted using a data extraction form, and the content of selected studies was analysed with respect to key registry processes, including case finding, data gathering, data abstracting, data quality control, reporting and patient follow-up. Results: A total of 17,604 studies were identified in the search, 55 of which were relevant studies addressing 21 registries and were selected for the analysis. Results showed that the most common resources for case finding included admission and discharge documents, physicians’ reports and screening results. Patient follow-up was mainly performed through direct visits or via telephone calls. The key attributes used for checking the data quality included data accuracy, completeness and definition. Conclusion: CAD registries aim to facilitate the assessment of health services provided to patients. Putting the key registry processes in place is crucial for developing and implementing a CAD registry. Data quality control, as a CAD registry process, requires developing standard tools and applying appropriate data quality attributes. Implications: The findings of the current study could lay the foundation for successful design and development of CAD registries based on the key registry processes for effective data management.
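The completeness attribute mentioned among the data quality checks can be sketched as a per-record score over a set of required fields. The field list below is a hypothetical example, not taken from the reviewed registries:

```python
# Hypothetical completeness check for a CAD registry record.
REQUIRED_FIELDS = ["patient_id", "admission_date", "diagnosis",
                   "procedure", "follow_up_date"]

def completeness(record):
    """Fraction of required fields that are present and non-empty."""
    filled = sum(1 for f in REQUIRED_FIELDS if record.get(f) not in (None, ""))
    return filled / len(REQUIRED_FIELDS)

record = {"patient_id": "C-104", "admission_date": "2019-03-02",
          "diagnosis": "three-vessel disease", "procedure": "",
          "follow_up_date": None}
print(completeness(record))  # → 0.6
```

Accuracy and definition checks would sit alongside this: value-range validation per field, and a data dictionary fixing what each field may contain.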


Author(s):  
Antonella D. Pontoriero ◽  
Giovanna Nordio ◽  
Rubaida Easmin ◽  
Alessio Giacomel ◽  
Barbara Santangelo ◽  
...  

2001 ◽  
Vol 27 (7) ◽  
pp. 867-876 ◽  
Author(s):  
Pankajakshan Thadathil ◽  
Aravind K Ghosh ◽  
J.S. Sarupria ◽  
V.V. Gopalakrishna
