Incremental Innovation towards Business Performance: Data Management Challenges in Healthcare Industry in Indonesia

2018 ◽  
Vol 218 ◽  
pp. 04015 ◽  
Author(s):  
Muharman Lubis ◽  
Lubis Arif Ridho ◽  
Bastian Lubis ◽  
Asmin Lubis

Industry 4.0 brings increased competitive pressure, reduced margins, newly available technologies, and new marketing techniques, making decisions more complex to take and success harder to sustain. Data management is essential for organizations because the administrative process by which data are acquired, validated, stored, protected, and processed ensures that data needs are met with respect to accessibility, reliability, and timeliness. It is also the process of developing data architectures, practices, and procedures that address the data, and of implementing these aspects on a regular basis. All these processes ease and smooth the business flow. There are, however, a number of data management challenges: maintaining a sheer volume of data, taking a reactive approach, lacking processes for data handling, fragmented data ownership, and driving a data culture. This study investigates current issues in the health industry and how to overcome them by recognizing both the importance of quality data and the need for a more sophisticated approach to managing data as organizations begin shifting toward a more data-centric model.
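The acquire-validate-store flow the abstract describes can be sketched minimally as a validation gate in front of storage. The record fields, plausibility ranges, and issue messages below are illustrative assumptions, not taken from the study:

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical patient record; field names are assumptions for illustration.
@dataclass
class PatientRecord:
    patient_id: str
    admitted: str   # expected to be an ISO 8601 date string
    age: int

def validate(record: PatientRecord) -> list[str]:
    """Return a list of data-quality issues found in a record."""
    issues = []
    if not record.patient_id:
        issues.append("missing patient_id")
    try:
        datetime.fromisoformat(record.admitted)
    except ValueError:
        issues.append("admitted is not an ISO 8601 date")
    if not (0 <= record.age <= 120):
        issues.append("age out of plausible range")
    return issues

# Records passing validation would be stored; others routed for correction.
good = PatientRecord("P-001", "2018-03-14", 47)
bad = PatientRecord("", "14/03/2018", 150)
print(validate(good))  # []
print(validate(bad))   # three issues
```

Routing invalid records to a correction queue rather than silently dropping them is what keeps the process proactive rather than reactive.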

1980 ◽  
Vol 19 (01) ◽  
pp. 37-41
Author(s):  
R. F. Woolson ◽  
M. T. Tsuang ◽  
L. R. Urban

We are now conducting a forty-year follow-up and family study of 200 schizophrenics, 325 manic-depressives and 160 surgical controls. This study began in 1973 and has continued to the present date. Numerous data handling and data management decisions were made in the course of collecting the data for the project. In this report some of the practical difficulties in the data handling and computer management of such large and bulky data sets are enumerated.


2017 ◽  
Vol 4 (1) ◽  
pp. 25-31 ◽  
Author(s):  
Diana Effendi

The Information Product approach (IP approach) is an information management approach that can be used to manage product information and to analyze data quality. An IP-Map can be used by organizations to facilitate the management of knowledge in collecting, storing, maintaining, and using data in an organized way. The data management process for academic activities at X University has not yet used the IP approach, and the university has paid little attention to the quality of its information, concerning itself so far only with the system applications that automate data management in its academic activities. The IP-Map constructed in this paper can serve as a basis for analyzing the quality of data and information; with it, X University is expected to identify which parts of the process need improvement in data and information quality management.

Index terms: IP approach, IP-Map, information quality, data quality.


2017 ◽  
Vol 47 (1) ◽  
pp. 46-55 ◽  
Author(s):  
S Aqif Mukhtar ◽  
Debbie A Smith ◽  
Maureen A Phillips ◽  
Maire C Kelly ◽  
Renate R Zilkens ◽  
...  

Background: The Sexual Assault Resource Center (SARC) in Perth, Western Australia provides free 24-hour medical, forensic, and counseling services to persons aged over 13 years following sexual assault. Objective: The aim of this research was to design a data management system that maintains accurate quality information on all sexual assault cases referred to SARC, facilitating audit and peer-reviewed research. Methods: The work to develop the SARC Medical Services Clinical Information System (SARC-MSCIS) took place during 2007–2009 as a collaboration between SARC and Curtin University, Perth, Western Australia. Patient demographics, assault details (including injury documentation), and counseling sessions were identified as core data sections. A user authentication system was set up for data security, and data quality checks were incorporated to ensure high-quality data. Results: The SARC-MSCIS was developed with three core data sections comprising 427 data elements to capture patient data. Development of the SARC-MSCIS has resulted in a comprehensive capacity to support sexual assault research; four additional projects are underway to explore both the public health and criminal justice considerations in responding to sexual violence. The data showed that 1,933 sexual assault episodes had occurred among 1,881 patients between January 1, 2009 and December 31, 2015. Sexual assault patients knew the assailant as a friend, carer, acquaintance, relative, partner, or ex-partner in 70% of cases, with the assailant being a stranger to the patient in 16% of cases. Conclusion: This project has resulted in the development of a high-quality data management system to maintain information for the medical and forensic services offered by SARC. The system has also proven to be a reliable resource enabling research in the area of sexual violence.


2013 ◽  
Vol 427-429 ◽  
pp. 2441-2444
Author(s):  
Wei Chen ◽  
Long Chen ◽  
Ming Li

This paper presents a software design for power quality analysis and data management. The software was programmed in LabVIEW and Oracle and runs on Windows on a regular PC. LabVIEW acquires data continuously from the lower machine via TCP/IP and, using its Database Connectivity Toolkit, accesses Oracle to store and retrieve power quality data according to different indicators. A friendly GUI was built for data display and user operation, taking advantage of LabVIEW's powerful data-handling capacity and rich controls. Moreover, Excel reports can be exported using the Report Generation Toolkit in LabVIEW. The software greatly improves data analysis and management capacity.
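The store-by-indicator pattern described above can be sketched outside LabVIEW as well. The following Python sketch uses in-memory SQLite as a stand-in for Oracle; the table schema and indicator names (voltage THD, frequency deviation) are illustrative assumptions, not the paper's:

```python
import sqlite3

# SQLite stands in for Oracle here; schema and indicator names are assumed.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pq_data (ts TEXT, indicator TEXT, value REAL)")

def store(ts, indicator, value):
    """Insert one timestamped measurement for a given power-quality indicator."""
    conn.execute("INSERT INTO pq_data VALUES (?, ?, ?)", (ts, indicator, value))

def retrieve(indicator):
    """Fetch all measurements for one indicator, ordered by timestamp."""
    cur = conn.execute(
        "SELECT ts, value FROM pq_data WHERE indicator = ? ORDER BY ts",
        (indicator,))
    return cur.fetchall()

store("2013-06-01T10:00:00", "voltage_thd", 2.8)
store("2013-06-01T10:00:05", "voltage_thd", 3.1)
store("2013-06-01T10:00:00", "freq_dev", 0.02)
print(retrieve("voltage_thd"))
```

Keying rows by indicator is what lets the GUI retrieve and plot each power-quality metric independently.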


2018 ◽  
Vol 6 (2) ◽  
pp. 89-92 ◽  
Author(s):  
Sarah Whitcher Kansa ◽  
Eric C. Kansa

ABSTRACT This special section stems from discussions that took place in a forum at the Society for American Archaeology's annual conference in 2017. The forum, Beyond Data Management: A Conversation about “Digital Data Realities”, addressed challenges in fostering greater reuse of the digital archaeological data now curated in repositories. Forum discussants considered digital archaeology beyond the status quo of “data management” to better situate the sharing and reuse of data in archaeological practice. The five papers for this special section address key themes that emerged from these discussions, including: challenges in broadening data literacy by making instructional uses of data; strategies to make data more visible, better cited, and more integral to peer-review processes; and pathways to create higher-quality data better suited for reuse. These papers highlight how research data management needs to move beyond mere “check-box” compliance for granting requirements. The problems and proposed solutions articulated by these papers help communicate good practices that can jumpstart a virtuous cycle of better data creation leading to higher impact reuses of data.


Wireless sensor networks incorporate an innovative aspect called data handling technologies for big data organization. In today's research, data aggregation occupies an important position and is emerging rapidly. Data aggregation is the process of accumulating data at a node and then either storing it or transferring it onward toward the destination. This survey describes previous work on data aggregation in WSNs and its impact on different services. A number of data aggregation techniques are available for reducing, processing, and storing data; some of them are discussed here as a review. Data aggregation performed using certain techniques can also aim at energy efficiency, time efficiency, and security in the form of confidentiality, integrity, authentication, freshness, quality, data availability, access control, non-repudiation, and secrecy. These are the relevant performance metrics for maintaining better QoS in WSN applications. The goal of this paper is to present an overview of existing techniques for performance improvement in homogeneous/heterogeneous networks.
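The accumulate-then-forward idea at the heart of data aggregation can be sketched as a node that buffers raw readings and transmits a single summary in their place. The class, window size, and summary fields below are illustrative assumptions, not any specific protocol:

```python
# Minimal sketch of in-network aggregation: a node buffers readings and
# forwards one summary message instead of every raw sample, saving the
# energy cost of per-sample transmissions. Names are illustrative.
class AggregatingNode:
    def __init__(self, window=4):
        self.window = window
        self.buffer = []
        self.sent = []  # messages "transmitted" toward the sink

    def sense(self, reading):
        self.buffer.append(reading)
        if len(self.buffer) >= self.window:
            # One aggregate message replaces `window` raw transmissions.
            self.sent.append({
                "count": len(self.buffer),
                "mean": sum(self.buffer) / len(self.buffer),
                "max": max(self.buffer),
            })
            self.buffer.clear()

node = AggregatingNode(window=4)
for r in [21.0, 21.5, 22.0, 21.5, 23.0]:
    node.sense(r)
print(node.sent)  # one aggregate covering the first four readings
```

Here five sensed values cost one transmission instead of five, which is the energy-efficiency trade-off the surveyed techniques tune against latency and data fidelity.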


2016 ◽  
Author(s):  
Alfred Enyekwe ◽  
Osahon Urubusi ◽  
Raufu Yekini ◽  
Iorkam Azoom ◽  
Oloruntoba Isehunwa

ABSTRACT Significant emphasis on data quality is placed on real-time drilling data for the optimization of drilling operations, and on logging data for quality lithological and petrophysical description of a field. This is evidenced by the huge sums spent on real-time MWD/LWD tools, broadband services, wireline logging tools, etc. However, much more needs to be done to harness quality data for future workover and abandonment operations, where the data relied on may have been entered decades ago and where costs and time are critically linked to already known and certified information. In some cases, the data relied on has been migrated across different data management platforms, during which relevant data may have been lost, misinterpreted, or misplaced. Another common cause of wrong data is poorly documented well intervention operations carried out in so short a time that there is no pressure to document the operation properly. This leads to confusion over simple issues such as the depth at which a plug was set, or what junk was left in the hole. The relative lack of emphasis on this type of data quality has led to high costs for workover and abandonment operations; in some cases, well control incidents and process safety incidents have arisen. This paper looks at over 20 workover operations carried out over a span of 10 years. The wells' original timelines of operation are analyzed, the data management system is assessed, and the issues experienced during the workover operations are categorized. Bottlenecks in data management are defined, and the solutions currently being implemented to manage these problems are listed as recommended good practices.


2018 ◽  
Vol 2 (2) ◽  
pp. 164-176
Author(s):  
Zhiwen Pan ◽  
Wen Ji ◽  
Yiqiang Chen ◽  
Lianjun Dai ◽  
Jun Zhang

Purpose Disability datasets contain information on disabled populations. By analyzing these datasets, professionals who work with disabled populations can better understand their inherent characteristics, so that working plans and policies that effectively help these populations can be made accordingly. Design/methodology/approach In this paper, the authors propose a big data management and analytics approach for disability datasets. Findings Using a set of data mining algorithms, the proposed approach provides the following services. The data management scheme improves the quality of disability data by estimating missing attribute values and detecting anomalous and low-quality data instances. The data mining scheme explores useful patterns that reflect the correlation, association, and interaction among the disability data attributes. Experiments based on a real-world dataset are conducted at the end to prove the effectiveness of the approach. Originality/value The proposed approach can enable data-driven decision-making for professionals who work with disabled populations.
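The two data-management services the abstract names, estimating missing attribute values and flagging low-quality instances, can be sketched with simple statistics. The attribute (age), the median-imputation choice, and the 2-standard-deviation threshold below are assumptions for illustration, not the paper's algorithms:

```python
import statistics

# Illustrative attribute column with missing values and one likely entry error.
ages = [34, 41, None, 38, 29, None, 35, 120]

known = [a for a in ages if a is not None]
median_age = statistics.median(known)

# Step 1: estimate missing attribute values with the median of observed ones.
imputed = [median_age if a is None else a for a in ages]

# Step 2: flag instances more than 2 population standard deviations
# from the mean as anomalous / low-quality.
mu = statistics.mean(imputed)
sigma = statistics.pstdev(imputed)
anomalies = [a for a in imputed if abs(a - mu) > 2 * sigma]
print(imputed, anomalies)
```

A production scheme would use model-based imputation and robust outlier scores, but the division of labor is the same: fill the gaps first, then score each instance for quality.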

