Big Data: Technologies, Challenges, Data Analytics and Management: A Review

2017 ◽  
Vol 12 (01) ◽  
Author(s):  
Shweta Kaushik

The Internet plays an essential role in providing diverse knowledge sources to the world, which enables numerous applications to deliver quality service to their users. As the years go on, the web has become overloaded with data, and it has become difficult to extract the relevant information from it. This gives way to the development of Big Data, and the volume of information keeps expanding rapidly day by day. Big Data has gained much attention from academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds processing capacity. Data mining techniques are used to locate the hidden information within this huge volume of data; they are used to store, manage, and analyze high-velocity data, which can arrive in structured or unstructured form. It is hard to handle such large volumes of data using database approaches such as an RDBMS. On the one hand, Big Data is extremely valuable for improving productivity in businesses and enabling transformative breakthroughs in scientific disciplines, offering many opportunities to make great advances in many fields; there is little doubt that future competition in business productivity and technology will converge on Big Data analytics. On the other hand, Big Data also brings many challenges, such as difficulties in data capture, data storage, data analysis, and data visualization. In this paper we concentrate on a review of Big Data, its data classification methods, and the ways it can be mined using different mining techniques.
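As a minimal illustration of the kind of mining step this review discusses, the sketch below (our own assumption, not taken from the paper) clusters a small synthetic dataset with scikit-learn to surface hidden groupings; the dataset, parameters, and library choice are illustrative only.

```python
# Minimal sketch (not from the reviewed paper): uncovering hidden structure
# in a dataset with a standard clustering technique. All names and
# parameters here are illustrative assumptions.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Synthetic "structured" records standing in for a large collection.
X, _ = make_blobs(n_samples=1_000, centers=3, n_features=4, random_state=42)

# K-means groups similar records, exposing patterns that are not
# visible from simple queries against an RDBMS table.
model = KMeans(n_clusters=3, n_init=10, random_state=42)
labels = model.fit_predict(X)

print("Cluster sizes:", {int(c): int((labels == c).sum()) for c in set(labels)})
```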

Author(s):  
Joseph E. Kasten

The development of vaccines has been one of the most important medical and pharmacological breakthroughs in the history of the world. Besides saving untold lives, they have enabled the human race to live and thrive in conditions thought far too dangerous only a few centuries ago. In recent times, the development of the COVID-19 vaccine has captured the world’s attention as the primary tool to defeat the current pandemic. The tools used to develop these vaccines have changed dramatically over time, with the use of big data technologies becoming standard in many instances. This study performs a structured literature review centered on the development, distribution, and evaluation of vaccines and the role played by big data tools such as data analytics, data mining, and machine learning. Through this review, the paper identifies where these technologies have made important contributions and in what areas further research is likely to be useful.


2020 ◽  
pp. 70-93
Author(s):  
Nayem Rahman

Data mining techniques are widely used to uncover hidden knowledge that cannot be extracted using conventional information retrieval and data analytics tools or through manual techniques. Different data mining techniques have evolved over the last two decades and solve a wide variety of business problems. Practitioners and researchers in both industry and academia continuously develop and experiment with a variety of data mining techniques. This article provides a consolidated list of problems being solved by different data mining techniques. The author presents up to three techniques that can be used to address a particular type of problem. The objective is to assist practitioners and researchers in forming a holistic view of data mining techniques and the problems they solve. This article also provides an overview of data mining problems solved in the healthcare industry, and it highlights how big data technologies are leveraged to handle and process huge amounts of complex data from a data mining perspective.
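To make the idea of matching several techniques to one problem type concrete, here is a hedged sketch (our own illustration, not the article's catalogue) comparing three common classifiers on the same classification task; the dataset and model settings are assumptions chosen for brevity.

```python
# Illustrative sketch only: three interchangeable techniques for one
# problem type (classification), echoing the "up to three techniques
# per problem" framing. Data and model choices are assumptions.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

techniques = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "logistic regression": LogisticRegression(max_iter=5000),
    "k-nearest neighbours": KNeighborsClassifier(n_neighbors=5),
}
for name, model in techniques.items():
    score = model.fit(X_train, y_train).score(X_test, y_test)
    print(f"{name}: accuracy {score:.3f}")
```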


In the current scenario, a huge amount of data is being generated at high speed from various heterogeneous sources such as social networks, business applications, the government sector, marketing, healthcare systems, sensors, and machine log data. Big Data has been chosen as one of the upcoming areas of research by several industries. In this paper, the author presents a wide collection of literature that has been reviewed and analyzed. The paper emphasizes Big Data technologies, applications, and challenges; a comparative study of architectures, methodologies, and tools; and survey results proposed by various researchers.


Author(s):  
Nataliia Geseleva ◽  
Anastasiia Yaroslavtseva

The paper examines the telecommunications industry, its development, and its impact on economic growth in countries including Ukraine. The characteristics of mobile communication, the segment of the telecommunications industry that is progressing most actively both worldwide and in Ukraine, are given. The current state of the Ukrainian mobile communication market is examined and its importance for the national economy is reviewed. The Ukrainian mobile market has been studied, along with the changes that have taken place in recent years in line with global trends in the field of communications. Development trends that encourage mobile operators to develop their own platforms and introduce new products and services are considered. Examples of current developments and services of operators, such as virtual mobile automatic telephone exchange, Big Data Scoring, Vodafone Analytics and others, are given. The article pays special attention to Big Data processing and analysis technologies. Big data is defined as very large datasets that can be analyzed computationally to reveal patterns, trends, and associations, especially in connection with human behavior and interactions. A big data revolution has arrived with the growth of the Internet, wireless networks, smartphones, social media and other technology. These properties of Big Data make it possible to apply Data Mining. Data mining is a process used by companies to turn raw data into useful information. By using software to look for patterns in large batches of data, businesses can learn more about their customers, develop more effective marketing strategies, increase sales and decrease costs. Data mining depends on effective data collection, warehousing, and computer processing. Data mining processes are used to build machine learning models that power applications such as search engine technology and website recommendation programs. The article also describes how Big Data affects the retail industry, namely by helping to optimize merchandising tactics, personalize customer service, increase advertising effectiveness, target offline shoppers (remarketing) and expand cross-selling. In the field of telecommunications, Big Data helps providers automate and optimize the provision of their services. Thus, the introduction of Big Data technologies will allow Ukraine to become a more competitive country on the world market.
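As a small, hedged illustration of the kind of pattern mining described above (not a technique taken from the article itself), the following sketch counts which products are bought together in a toy set of transactions, which is the basic signal behind cross-selling recommendations; the product names are invented.

```python
# Toy sketch of market-basket pattern mining for cross-selling.
# The transactions below are invented for illustration only.
from collections import Counter
from itertools import combinations

transactions = [
    {"sim card", "router", "tv package"},
    {"sim card", "router"},
    {"router", "tv package"},
    {"sim card", "tv package", "cloud storage"},
]

pair_counts = Counter()
for basket in transactions:
    # Count every unordered pair of items that co-occur in a basket.
    pair_counts.update(combinations(sorted(basket), 2))

# Frequently co-purchased pairs are candidates for cross-sell offers.
for (a, b), count in pair_counts.most_common(3):
    print(f"{a} + {b}: bought together {count} times")
```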


2019 ◽  
Author(s):  
Meghana Bastwadkar ◽  
Carolyn McGregor ◽  
S Balaji

BACKGROUND: This paper presents a systematic literature review of existing remote health monitoring systems, with special reference to the neonatal intensive care unit (NICU). Articles on NICU clinical decision support systems (CDSSs) that used cloud computing and big data analytics were surveyed.
OBJECTIVE: The aim of this study is to review the technologies used to provide NICU CDSSs. The literature review highlights the gaps within frameworks providing the HAaaS paradigm for big data analytics.
METHODS: Literature searches were performed in Google Scholar, IEEE Digital Library, JMIR Medical Informatics, JMIR Human Factors, and JMIR mHealth, and only English-language articles published in or after 2015 were included. The overall search strategy was to retrieve articles containing terms related to “health analytics” and “as a service” or “internet of things”/“IoT” and “neonatal intensive care unit”/“NICU”. Titles and abstracts were reviewed to assess relevance.
RESULTS: In total, 17 full papers met all criteria and were selected for full review. Results showed that in most cases bedside medical devices such as pulse oximeters were used as the sensor device. Results revealed great diversity in the data acquisition techniques used; however, in most cases the same physiological data (heart rate, respiratory rate, blood pressure, blood oxygen saturation) were acquired. In most cases data analytics involved data mining classification techniques, fuzzy-logic NICU decision support systems (DSSs), and similar approaches, whereas big data analytics involving Artemis cloud data analysis used the CRISP-TDM and STDM temporal data mining techniques to support clinical research studies. In most scenarios both real-time and retrospective analytics were performed. Most of the research has been performed within small and medium-sized urban hospitals, so there is wide scope for research within rural and remote hospitals with NICU setups. Creating an HAaaS approach in which data acquisition and data analytics are not tightly coupled remains an open research area. The reviewed articles describe architectures and base technologies for neonatal health monitoring with an IoT approach.
CONCLUSIONS: The current work supports implementation of the expanded Artemis cloud as a commercial offering to healthcare facilities in Canada and worldwide to provide cloud computing services to critical care. However, no work to date has addressed low-resource settings within healthcare facilities in India, which leaves scope for research. All of the big data analytics frameworks reviewed in this study have tight coupling of components, so there is a need for a framework with functional decoupling of components.
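The temporal analytics described in the reviewed studies (CRISP-TDM and STDM over physiological streams) are far richer than what fits here, but the following hedged sketch shows the basic shape of windowed analysis over a vital-sign stream; the baseline, threshold, window length, and data are illustrative assumptions, not the Artemis pipeline.

```python
# Hedged sketch: rolling-window analysis of a simulated heart-rate stream.
# This only illustrates windowed temporal analytics in general, not the
# Artemis / CRISP-TDM implementation discussed in the reviewed articles.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
# One reading per second for ten minutes, around an assumed baseline.
hr = pd.Series(155 + rng.normal(0, 6, 600),
               index=pd.date_range("2024-01-01", periods=600, freq="s"))

# A 30-second rolling mean smooths sensor noise before rule evaluation.
rolling_mean = hr.rolling("30s").mean()

# Flag sustained excursions above an illustrative alarm threshold.
alerts = rolling_mean[rolling_mean > 170]
print(f"{len(alerts)} seconds of sustained elevation detected")
```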


2020 ◽  
Vol 4 (2) ◽  
pp. 5 ◽  
Author(s):  
Ioannis C. Drivas ◽  
Damianos P. Sakas ◽  
Georgios A. Giannakopoulos ◽  
Daphne Kyriaki-Manessi

In the Big Data era, search engine optimization deals with the encapsulation of datasets related to website performance in terms of architecture, content curation, and user behavior, with the purpose of converting them into actionable insights and improving visibility and findability on the Web. In this respect, big data analytics expands the opportunities for developing new methodological frameworks composed of valid, reliable, and consistent analytics that are practically useful for developing well-informed strategies for organic traffic optimization. In this paper, a novel methodology is implemented to increase organic search engine visits based on the impact of multiple SEO factors. To achieve this purpose, the authors examined 171 cultural heritage websites and the retrieved data analytics about their performance and user experience. Massive amounts of Web-based collections are included and presented by cultural heritage organizations through their websites. Subsequently, users interact with these collections, producing behavioral analytics in a variety of data types that come from multiple devices, at high velocity, in large volumes. Nevertheless, prior research indicates that these massive cultural collections are difficult to browse while exhibiting low visibility and findability in the semantic Web era. Against this backdrop, this paper proposes the computational development of a search engine optimization (SEO) strategy that utilizes the generated big cultural data analytics to improve the visibility of cultural heritage websites. Going one step further, the statistical results of the study are integrated into a predictive model composed of two stages. First, a fuzzy cognitive mapping process is generated as an aggregated macro-level descriptive model. Second, a micro-level data-driven agent-based model follows. The purpose of the model is to predict the most effective combinations of factors that achieve enhanced visibility and organic traffic on cultural heritage organizations’ websites. To this end, the study contributes to the knowledge expansion of researchers and practitioners in the big cultural analytics sector, with the aim of implementing potential strategies for greater visibility and findability of cultural collections on the Web.
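To give a flavour of the fuzzy cognitive mapping stage mentioned above, here is a hedged numerical sketch of a standard FCM inference loop; the three concepts and their weights are invented placeholders, not the authors' actual model of SEO factors.

```python
# Generic fuzzy cognitive map (FCM) inference sketch. Concepts and
# weights are invented placeholders, not the paper's SEO model.
import numpy as np

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

# Assumed concepts: site speed, content freshness, organic visits.
W = np.array([
    [0.0, 0.3, 0.6],   # site speed influences freshness and visits
    [0.0, 0.0, 0.5],   # freshness influences visits
    [0.0, 0.0, 0.0],   # visits influence nothing in this toy map
])

state = np.array([0.8, 0.5, 0.2])  # initial activation levels
for _ in range(25):
    # Standard update: each concept keeps its memory plus weighted inputs.
    new_state = sigmoid(state @ W + state)
    if np.allclose(new_state, state, atol=1e-4):
        break
    state = new_state

print("Converged activations:", np.round(state, 3))
```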


Author(s):  
Nirmit Singhal ◽  
Amita Goel ◽  
Nidhi Sengar ◽  
Vasudha Bahl

By 2022, the world was generating 52 times the amount of data and drawing on 76 times the number of information sources that it did in 2010. The ability to use this data creates enormous opportunities, and to make these opportunities a reality, people must use data to solve problems. This became especially pressing in the midst of a global pandemic, when people all over the world sought reliable, trustworthy information about COVID-19 (coronavirus). Tableau plays a key role in this scenario because it is an extremely powerful tool for quickly visualizing large amounts of data. It has a simple drag-and-drop interface, attractive infographics are simple to create and take little time, and it works with a wide variety of data sources. COVID-19 analytics with Tableau allows users to create dashboards that assist in this effort. Tableau is a tool that deals with big data analytics and presents its output as visualizations, making results more understandable and presentable. Data blending, real-time reporting, and data collaboration are among its features. Ultimately, this paper provides a clear picture of the growing COVID-19 data and the tools that can help analyze it more effectively, accurately, and efficiently.
Keywords: Data Visualization, Tableau, Data Analysis, COVID-19 analysis, COVID-19 data
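Tableau itself is a point-and-click tool, but a short hedged sketch of the data preparation that typically precedes such a dashboard may help; the column names and case counts below are invented, and the only assumption about Tableau is that it can read a tidy CSV file.

```python
# Hedged sketch: reshaping wide COVID-19 case counts into a tidy long
# table that a visualization tool such as Tableau can consume.
# All figures are invented for illustration.
import pandas as pd

wide = pd.DataFrame({
    "country": ["India", "Brazil"],
    "2021-01": [150_000, 120_000],
    "2021-02": [180_000, 140_000],
})

# Dashboards generally work best with one row per observation.
tidy = wide.melt(id_vars="country", var_name="month", value_name="cases")
tidy.to_csv("covid_cases_tidy.csv", index=False)
print(tidy)
```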

