A Capability Model for Learning Analytics Adoption: Identifying Organizational Capabilities from Literature on Learning Analytics, Big Data Analytics, and Business Analytics

Author(s):  
Justian Knobbout ◽  
Esther Van der Stappen

Despite the promises of learning analytics and the existence of several learning analytics implementation frameworks, large-scale adoption of learning analytics within higher education institutions remains low. Extant frameworks either focus on a specific element of learning analytics implementation, such as policy or privacy, or lack operationalization of the organizational capabilities necessary for successful deployment. This literature review therefore addresses the research question “<em>What capabilities for the successful adoption of learning analytics can be identified in existing literature on big data analytics, business analytics, and learning analytics?</em>” Our research is grounded in resource-based view theory; we extend the scope beyond the field of learning analytics to include capability frameworks from the more mature research fields of big data analytics and business analytics. This paper’s contribution is twofold: 1) it provides a literature review of known capabilities for big data analytics, business analytics, and learning analytics, and 2) it introduces a capability model to support the implementation and uptake of learning analytics. During our study, we identified and analyzed 15 key studies. By synthesizing the results, we found 34 organizational capabilities important to the adoption of analytical activities within an institution and provide 461 ways to operationalize these capabilities. Five categories of capabilities can be distinguished: <em>Data, Management, People, Technology</em>, and <em>Privacy &amp; Ethics</em>. Capabilities presently absent from existing learning analytics frameworks concern <em>sourcing and integration, market, knowledge, training, automation,</em> and <em>connectivity</em>.
Based on the results of the review, we present the Learning Analytics Capability Model: a model that provides senior management and policymakers with concrete operationalizations to build the necessary capabilities for successful learning analytics adoption.

2019 ◽  
Author(s):  
Meghana Bastwadkar ◽  
Carolyn McGregor ◽  
S Balaji

BACKGROUND This paper presents a systematic literature review of existing remote health monitoring systems, with special reference to neonatal intensive care units (NICUs). Articles on NICU clinical decision support systems (CDSSs) that used cloud computing and big data analytics were surveyed. OBJECTIVE The aim of this study is to review technologies used to provide NICU CDSSs. The literature review highlights the gaps within frameworks providing the HAaaS paradigm for big data analytics. METHODS Literature searches were performed in Google Scholar, IEEE Digital Library, JMIR Medical Informatics, JMIR Human Factors, and JMIR mHealth; only English-language articles published in or after 2015 were included. The overall search strategy was to retrieve articles that included terms related to “health analytics” and “as a service” or “internet of things” / “IoT” and “neonatal intensive care unit” / “NICU”. Titles and abstracts were reviewed to assess relevance. RESULTS In total, 17 full papers met all criteria and were selected for full review. Results showed that in most cases bedside medical devices such as pulse oximeters were used as the sensor device. Results revealed great diversity in the data acquisition techniques used; however, in most cases the same physiological data (heart rate, respiratory rate, blood pressure, blood oxygen saturation) were acquired. In most cases data analytics involved data mining classification techniques, fuzzy-logic NICU decision support systems (DSSs), and similar approaches, whereas big data analytics involving Artemis cloud data analysis used the CRISP-TDM and STDM temporal data mining techniques to support clinical research studies. In most scenarios both real-time and retrospective analytics were performed. Most of the research has been performed within small and medium-sized urban hospitals, so there is wide scope for research within rural and remote hospitals with NICU setups. Results have shown that creating a HAaaS approach in which data acquisition and data analytics are not tightly coupled remains an open research area. The reviewed articles describe architectures and base technologies for neonatal health monitoring with an IoT approach. CONCLUSIONS The current work supports implementation of the expanded Artemis cloud as a commercial offering to healthcare facilities in Canada and worldwide to provide cloud computing services to critical care. However, no work to date has addressed low-resource settings within healthcare facilities in India, which leaves scope for research. All the big data analytics frameworks reviewed in this study have tight coupling of components within the framework, so there is a need for a framework with functional decoupling of components.


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Rajesh Kumar Singh ◽  
Saurabh Agrawal ◽  
Abhishek Sahu ◽  
Yigit Kazancoglu

Purpose – This article explores the opportunities, challenges and possible outcomes of incorporating big data analytics (BDA) into the health-care sector. The purpose of this study is to find the research gaps in the literature and to investigate the scope of incorporating new strategies in the health-care sector for increasing the efficiency of the system.
Design/methodology/approach – For a state-of-the-art literature review, a systematic literature review has been carried out to find research gaps in the field of healthcare using big data (BD) applications. A detailed research methodology including material collection, descriptive analysis and categorization is used to carry out the literature review.
Findings – BD analysis is rapidly being adopted in the health-care sector to utilize the precious information available in the form of BD. However, it puts forth certain challenges that need to be addressed. The article identifies and explains these challenges thoroughly.
Research limitations/implications – The proposed study will provide useful guidance to health-care sector professionals for managing the health-care system. It will help academicians and physicians to evaluate, improve and benchmark health-care strategies through BDA in the health-care sector. One limitation of the study is that it is based on a literature review; more in-depth studies may be carried out to generalize the results.
Originality/value – Certain effective tools available in the market today are currently being used by both small and large businesses and corporations. One of them is BD, which may be very useful for the health-care sector. A comprehensive literature review is carried out for research papers published between 1974 and 2021.


2021 ◽  
Author(s):  
R. Salter ◽  
Quyen Dong ◽  
Cody Coleman ◽  
Maria Seale ◽  
Alicia Ruvinsky ◽  
...  

The Engineer Research and Development Center, Information Technology Laboratory’s (ERDC-ITL’s) Big Data Analytics team specializes in the analysis of large-scale datasets, with capabilities across four research areas that require vast amounts of data to inform and drive analysis: large-scale data governance, deep learning and machine learning, natural language processing, and automated data labeling. Unfortunately, data transfer between government organizations is a complex and time-consuming process requiring coordination of multiple parties across multiple offices and organizations. Past successes in large-scale data analytics have placed a significant demand on ERDC-ITL researchers, highlighting that few individuals fully understand how to successfully transfer data between government organizations; future project success therefore depends on a small group of individuals being able to efficiently execute a complicated process. The Big Data Analytics team set out to develop a standardized workflow for the transfer of large-scale datasets to ERDC-ITL, in part to educate peers and future collaborators on the process required to transfer datasets between government organizations. Researchers also aim to increase workflow efficiency while protecting data integrity. This report provides an overview of the resulting Data Lake Ecosystem Workflow by focusing on the six phases required to efficiently transfer large datasets to supercomputing resources located at ERDC-ITL.


2019 ◽  
Vol 01 (02) ◽  
pp. 12-20 ◽  
Author(s):  
Smys S ◽  
Vijesh joe C

Big data comprises the enormous flow of data from a variety of applications that does not fit into traditional databases. It deals with storing, managing, and manipulating data acquired from various sources at a rapid rate in order to gather valuable insights. Big data analytics is used to provide new and better ideas that pave the way to improved business strategies through broader, deeper insights and frictionless actions, leading to accurate and reliable systems. The paper proposes big data analytics for improving strategic assets in the health-care industry by providing better services for patients, gaining patient satisfaction, and enhancing customer relationships.

