Efficient Large-scale Medical Data (eHealth Big Data) Analytics in Internet of Things

Author(s):  
Andreas P. Plageras ◽  
Christos Stergiou ◽  
George Kokkonis ◽  
Kostas E. Psannis ◽  
Yutaka Ishibashi ◽  
...  
Author(s):  
Zhihan Lv ◽  
Ranran Lou ◽  
Jinhua Li ◽  
Amit Kumar Singh ◽  
Houbing Song

2018 ◽  
Vol 1018 ◽  
pp. 012013 ◽  
Author(s):  
Waleed Noori Hussein ◽  
L.M. Kamarudin ◽  
Haider N. Hussain ◽  
A. Zakaria ◽  
R Badlishah Ahmed ◽  
...  

2015 ◽  
Vol 2015 ◽  
pp. 1-16 ◽  
Author(s):  
Ashwin Belle ◽  
Raghuram Thiagarajan ◽  
S. M. Reza Soroushmehr ◽  
Fatemeh Navidi ◽  
Daniel A. Beard ◽  
...  

The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has recently been applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space are still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image-, signal-, and genomics-based analytics. Recent research that targets the utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.
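
To make the idea of assimilating multimodal data from disparate sources concrete, the following minimal Python sketch joins a structured table of signal-derived vitals with unstructured clinical notes on a shared patient identifier. The column names and values are hypothetical illustrations and are not taken from the paper.

# Illustrative sketch only: assimilating structured (vital-sign) and
# unstructured (clinical-note) records keyed by a hypothetical patient ID.
import pandas as pd

# Structured, signal-derived features (e.g., averaged vitals per patient)
vitals = pd.DataFrame({
    "patient_id": [101, 102, 103],
    "mean_heart_rate": [72.4, 88.1, 64.9],
    "mean_spo2": [97.8, 94.2, 98.5],
})

# Unstructured free-text notes for a subset of the same patients
notes = pd.DataFrame({
    "patient_id": [101, 103],
    "note": ["Patient stable, no acute distress.",
             "Mild dyspnea reported during exercise."],
})

# Assimilate the two modalities into one patient-level table;
# a left join keeps patients that lack an accompanying note.
merged = vitals.merge(notes, on="patient_id", how="left")
print(merged)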


2021 ◽  
Vol 83 (4) ◽  
pp. 100-111
Author(s):  
Ahmad Anwar Zainuddin

Internet of Things (IoT) is an up-and-coming technology with a wide variety of applications. It enables physical objects to be organized within a specialized framework, improving convenience and saving time, and it aims to bridge the gap between the physical world and the machine world. IoT is already being used across a wide range of current technologies. One of its applications is to monitor and store data from numerous devices over time, which allows the accumulated dataset to be analyzed easily; this analysis can then serve as the basis for decisions. In this study, the concept, architecture, and relationship of IoT and Big Data are described. Next, several use cases of IoT and big data are studied in the research methodology. The opportunities and open challenges, including future directions, are described. Furthermore, this paper adds value by proposing a new architecture for big data analytics in the Internet of Things. Overall, the various types of big IoT data analytics, their methods, and the associated big data mining technologies are discussed.
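
As an illustration of monitoring and storing readings from numerous devices over time and then analyzing the accumulated dataset, the short Python sketch below simulates three sensors and computes per-device statistics with a simple anomaly flag. The device names, sampling loop, and threshold are hypothetical and stand in for whatever the deployed architecture would provide.

# Illustrative sketch only: storing periodic readings from multiple
# (simulated) IoT devices and analyzing the accumulated dataset.
import random
import statistics
from collections import defaultdict

readings = defaultdict(list)  # device_id -> list of temperature samples

# Simulated monitoring: each device reports one reading per time step
for _ in range(60):
    for device_id in ("sensor-a", "sensor-b", "sensor-c"):
        readings[device_id].append(20.0 + random.gauss(0, 1.5))

# Analysis over the stored data: per-device mean and a simple anomaly flag
for device_id, samples in readings.items():
    mean = statistics.mean(samples)
    spread = statistics.pstdev(samples)
    anomalies = [x for x in samples if abs(x - mean) > 3 * spread]
    print(f"{device_id}: mean={mean:.2f} °C, anomalies={len(anomalies)}")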


2021 ◽  
Author(s):  
R. Salter ◽  
Quyen Dong ◽  
Cody Coleman ◽  
Maria Seale ◽  
Alicia Ruvinsky ◽  
...  

The Engineer Research and Development Center, Information Technology Laboratory’s (ERDC-ITL’s) Big Data Analytics team specializes in the analysis of large-scale datasets with capabilities across four research areas that require vast amounts of data to inform and drive analysis: large-scale data governance, deep learning and machine learning, natural language processing, and automated data labeling. Unfortunately, data transfer between government organizations is a complex and time-consuming process requiring coordination of multiple parties across multiple offices and organizations. Past successes in large-scale data analytics have placed a significant demand on ERDC-ITL researchers, highlighting that few individuals fully understand how to successfully transfer data between government organizations; future project success therefore depends on a small group of individuals efficiently executing a complicated process. The Big Data Analytics team set out to develop a standardized workflow for the transfer of large-scale datasets to ERDC-ITL, in part to educate peers and future collaborators on the process required to transfer datasets between government organizations. Researchers also aim to increase workflow efficiency while protecting data integrity. This report provides an overview of the created Data Lake Ecosystem Workflow by focusing on the six phases required to efficiently transfer large datasets to supercomputing resources located at ERDC-ITL.
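
As a hedged illustration of one way to protect data integrity during a large dataset transfer (not the ERDC-ITL workflow itself), the Python sketch below builds a SHA-256 checksum manifest on the sending side and re-verifies it after ingest. The directory path and manifest layout are hypothetical.

# Illustrative sketch only: a checksum manifest built before transfer and
# re-verified at the destination; paths and formats are hypothetical.
import hashlib
import pathlib

def sha256_of(path: pathlib.Path, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so arbitrarily large files fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def build_manifest(root: pathlib.Path) -> dict:
    """Map each file's relative path to its checksum before transfer."""
    return {str(p.relative_to(root)): sha256_of(p)
            for p in sorted(root.rglob("*")) if p.is_file()}

def verify_manifest(root: pathlib.Path, manifest: dict) -> list:
    """Return the relative paths whose checksums no longer match."""
    return [rel for rel, expected in manifest.items()
            if sha256_of(root / rel) != expected]

if __name__ == "__main__":
    source = pathlib.Path("dataset_to_transfer")  # hypothetical directory
    if source.is_dir():
        manifest = build_manifest(source)
        mismatches = verify_manifest(source, manifest)  # re-run after transfer
        print(f"{len(manifest)} files hashed, {len(mismatches)} mismatches")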

