HR Challenges in Big Data

2016 ◽  
Vol 15 (2) ◽  
pp. 49-55
Author(s):  
Pala SuriyaKala ◽  
Ravi Aditya

Human resources is traditionally an area of measured change, but with big data, data analytics, human capital management, talent acquisition, and performance metrics emerging as new trends, the function is bound to see a sea change. This conceptual paper introspects on and outlines the challenges that HRM faces in the era of big data. Big data refers to the enormous volumes of data now being generated, with data sets measured in exabytes, and it is revolutionizing how governments, companies, and business functions will perform in the decades to come. This immense amount of information, if properly utilized, can deliver efficiency in many fields like never before. To do so, however, the cloud of suspicion, fear, and uncertainty surrounding the use of big data must be lifted from those who could apply it to the benefit of their respective domains. Unlike marketing or finance, HR has traditionally never been very data-centric in the analysis of its decisions.

Author(s):  
Nitigya Sambyal ◽  
Poonam Saini ◽  
Rupali Syal

The world is increasingly driven by huge amounts of data. Big data refers to data sets that are so large or complex that traditional data processing software is inadequate to deal with them. Healthcare analytics is a prominent area of big data analytics and has led to significant reductions in the morbidity and mortality associated with disease. To harness the full potential of big data, various tools such as Apache Sentry, BigQuery, NoSQL databases, Hadoop, and JethroData are available for its processing. However, with such enormous amounts of information comes the complexity of data management; further big data challenges arise during data capture, storage, analysis, search, transfer, information privacy, visualization, querying, and update. The chapter focuses on understanding the meaning and concept of big data, big data analytics, its role in healthcare, various application areas, and the trends and tools used to process big data, along with open challenges.
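As a concrete illustration of the kind of aggregation such tools perform, the following is a minimal Python/pandas sketch (not from the chapter; the patient records and column names are hypothetical) that computes a simple mortality rate per diagnosis cohort:

```python
import pandas as pd

# Hypothetical patient records; in practice these would come from an
# EHR export or a big data store such as BigQuery or Hadoop/Hive.
records = pd.DataFrame({
    "patient_id": [1, 2, 3, 4, 5, 6],
    "diagnosis":  ["cardiac", "cardiac", "diabetes",
                   "diabetes", "cardiac", "diabetes"],
    "deceased":   [0, 1, 0, 0, 0, 1],
})

# Mortality rate per diagnosis cohort: mean of the 0/1 outcome flag.
mortality = (records.groupby("diagnosis")["deceased"]
                    .mean()
                    .rename("mortality_rate"))
print(mortality)
```

At big data scale the same groupby-and-aggregate pattern would run on a distributed engine rather than in memory, but the analytical step is the same.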


Author(s):  
Adriano Fernandes ◽  
Jonathan Barretto ◽  
Jonas Fernandes

Big data analytics is becoming more popular every day as a tool for evaluating large volumes of data on demand. Apache Hadoop, Spark, Storm, and Flink are four of the most widely used big data processing frameworks. Although all four support big data analysis, they differ in how they are used and in the infrastructure that supports them. This paper defines a general set of key performance indicators (KPIs), comprising processing time, CPU utilization, latency, execution time, performance, scalability, and fault tolerance, and contrasts the four frameworks against these KPIs in a literature review. For non-real-time workloads, Spark was found to outperform the Apache Hadoop and Apache Storm frameworks on multiple KPIs, including processing time, CPU usage, latency, execution time, and scalability. In terms of processing time, CPU consumption, latency, execution time, and performance, Flink surpassed both Apache Spark and Apache Storm.
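By way of illustration, one of these KPIs (processing time) can be measured for a single batch job roughly as follows. This is a minimal PySpark sketch, not the paper's benchmark harness, and the input path is hypothetical:

```python
import time
from pyspark.sql import SparkSession

# Start a local Spark session; a real benchmark would target a cluster.
spark = SparkSession.builder.appName("kpi-processing-time").getOrCreate()

start = time.perf_counter()

# A representative batch job: word count over a (hypothetical) text file.
counts = (spark.read.text("input.txt").rdd
          .flatMap(lambda row: row.value.split())
          .map(lambda word: (word, 1))
          .reduceByKey(lambda a, b: a + b))
counts.count()  # force evaluation; Spark transformations are lazy

elapsed = time.perf_counter() - start
print(f"processing time: {elapsed:.2f}s")

spark.stop()
```

Comparable jobs would be written against each framework's API and timed the same way to populate the KPI comparison.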


2018 ◽  
Vol 20 (1) ◽  
Author(s):  
Tiko Iyamu

Background: Over the years, big data analytics has been carried out statically, in a programmed way that does not allow data sets to be interpreted from a subjective perspective. This approach limits an understanding of why and how data sets manifest themselves in the various forms that they do, which negatively affects the accuracy, redundancy, and usefulness of data sets and, in turn, the value of operations and the competitive effectiveness of an organisation. The current single-level approach also lacks the detailed examination of data sets that big data deserves in order to improve purposefulness and usefulness.
Objective: The purpose of this study was to propose a multilevel approach to big data analysis, including an examination of how a sociotechnical theory, actor network theory (ANT), can be used complementarily with analytic tools for big data analysis.
Method: The study employed qualitative methods from an interpretivist perspective.
Results: From the findings, a framework was developed that offers big data analytics at two levels, micro- (strategic) and macro- (operational). Based on the framework, a model was developed that can be used to guide the analysis of heterogeneous data sets existing within networks.
Conclusion: The multilevel approach ensures a fully detailed analysis, intended to increase accuracy, reduce redundancy, and put the manipulation and manifestation of data sets into perspective for improved organisational competitiveness.


2021 ◽  
Vol 9 (1) ◽  
pp. 16-44
Author(s):  
Weiqing Zhuang ◽  
Morgan C. Wang ◽  
Ichiro Nakamoto ◽  
Ming Jiang

Big data analytics (BDA) in e-commerce, an emerging field that began in 2006, deeply affects the development of global e-commerce, especially its layout and performance in the U.S. and China. This paper examines the relative influence of theoretical research on BDA in e-commerce to explain the differences between the U.S. and China, applying a statistical analysis method to samples collected from two major literature databases, Web of Science and CNKI, covering the U.S. and China respectively. The results help clarify doubts regarding the development of China's e-commerce, which today exceeds that of the U.S., in light of the theoretical comparison of BDA in e-commerce between the two countries.
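The comparative step can be pictured with a minimal pandas sketch (not the authors' code; the table layout and column names are hypothetical) that counts publications per year for each corpus before any statistical testing:

```python
import pandas as pd

# Hypothetical export: one row per paper, with its source database and year.
papers = pd.DataFrame({
    "database": ["WoS", "WoS", "CNKI", "CNKI", "CNKI", "WoS"],
    "year":     [2012,  2015,  2015,   2018,   2018,   2018],
})

# Publication counts per database per year - the raw material for a
# statistical comparison of the two research streams.
counts = (papers.groupby(["database", "year"])
                .size()
                .unstack(fill_value=0))
print(counts)
```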


2016 ◽  
Vol 13 (3) ◽  
pp. 110-130 ◽  
Author(s):  
Florence Martin ◽  
Abdou Ndoye

Learning analytics can be used to enhance student engagement and performance in online courses. Using learning analytics, instructors can collect and analyze data about students and improve the design and delivery of instruction to make it more meaningful for them. In this paper, the authors review different categories of online assessments and identify data sets that can be collected and analyzed for each of them. Two different data analytics and visualization tools were used: Tableau for quantitative data and Many Eyes for qualitative data. This paper has implications for instructors, instructional designers, administrators, and educational researchers who use online assessments.
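To make the quantitative side concrete, a minimal Python sketch (not the authors' workflow, which used Tableau and Many Eyes; the quiz data is hypothetical) might summarize online assessment results per item before visualization:

```python
import pandas as pd

# Hypothetical online quiz results: one row per student per question.
results = pd.DataFrame({
    "student":  ["s1", "s1", "s2", "s2", "s3", "s3"],
    "question": ["q1", "q2", "q1", "q2", "q1", "q2"],
    "correct":  [1,     0,    1,    1,    0,    0],
})

# Per-question difficulty (proportion correct) - a typical learning
# analytics signal an instructor could inspect or chart.
difficulty = results.groupby("question")["correct"].mean()

# Per-student totals, useful for spotting students who may need support.
totals = results.groupby("student")["correct"].sum()
print(difficulty, totals, sep="\n")
```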


Author(s):  
Nirmit Singhal ◽  
Amita Goel ◽  
Nidhi Sengar ◽  
Vasudha Bahl

The world generated 52 times the amount of data in 2010 and 76 times the number of information sources in 2022. The ability to use this data creates enormous opportunities, and to make those opportunities a reality, people must use data to solve problems. This need is especially acute in the midst of a global pandemic, when people all over the world seek reliable, trustworthy information about COVID-19 (Coronavirus). Tableau plays a key role in this scenario because it is an extremely powerful tool for quickly visualizing large amounts of data. It has a simple drag-and-drop interface, attractive infographics are simple to create and take little time, and it works with a wide variety of data sources. COVID-19 (Coronavirus) analytics with Tableau allows dashboards to be built that assist in this effort. Tableau is a tool for big data analytics that presents its output through visualization, making results more understandable and presentable; data blending, real-time reporting, and data collaboration are among its features. Ultimately, this paper provides a clear picture of the growing COVID-19 (Coronavirus) data and the tools that can assist more effectively, accurately, and efficiently. Keywords: Data Visualization, Tableau, Data Analysis, COVID-19 analysis, COVID-19 data
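Before visualization in Tableau, case data is typically reshaped into a tidy table. A minimal pandas sketch of that preparation step (the file name and columns are hypothetical, not the paper's pipeline) might look like this:

```python
import pandas as pd

# Hypothetical wide-format case counts: one column per date.
wide = pd.DataFrame({
    "country":    ["India", "Brazil"],
    "2021-01-01": [100,      80],
    "2021-01-02": [140,      95],
})

# Melt to tidy long format (one row per country per date), which
# drag-and-drop tools such as Tableau consume most naturally.
tidy = wide.melt(id_vars="country", var_name="date", value_name="cases")
tidy["date"] = pd.to_datetime(tidy["date"])

# Export a CSV for Tableau to connect to.
tidy.to_csv("covid_cases_tidy.csv", index=False)
print(tidy)
```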


Author(s):  
Pethuru Raj

The implications of the digitization process, among a bevy of trends, are many and memorable. One is the abnormal growth in data generation, gathering, and storage, driven by a steady increase in the number of data sources, structures, scopes, sizes, and speeds. In this chapter, the author shows some of the impactful developments brewing in the IT space: how the tremendous amount of data being produced and processed all over the world impacts the IT and business domains; how next-generation IT infrastructures are accordingly being refactored, remedied, and readied for the impending big data-induced challenges; how the big data analytics discipline is moving towards fulfilling the digital universe's requirement of extracting and extrapolating actionable insights for the knowledge-parched; and, finally, the establishment and sustenance of the dreamt-of smarter planet.


Web Services ◽  
2019 ◽  
pp. 1430-1443
Author(s):  
Louise Leenen ◽  
Thomas Meyer

Governments, military forces, and other organisations responsible for cybersecurity deal with vast amounts of data that must be understood in order to support intelligent decision making. Because of the volume of information pertinent to cybersecurity, automation is required for processing and decision making, specifically to provide advance warning of possible threats. The ability to detect patterns in vast data sets, and to understand the significance of detected patterns, is essential in the cyber defence domain. Big data technologies supported by semantic technologies can improve cybersecurity, and thus cyber defence, by supporting the processing and understanding of the huge amounts of information in the cyber environment. The term big data analytics refers to advanced analytic techniques such as machine learning, predictive analysis, and other intelligent processing techniques applied to large data sets that contain different data types, with the purpose of detecting patterns, correlations, trends, and other useful information. Semantic technologies are a knowledge representation paradigm in which the meaning of data is encoded separately from the data itself. The use of semantic technologies such as logic-based systems to support decision making is becoming increasingly popular; however, most automated systems are currently based on syntactic rules, which are generally not sophisticated enough to deal with the complexity of the decisions that must be made. Incorporating semantic information allows for increased understanding and sophistication in cyber defence systems. This paper argues that both big data analytics and semantic technologies are necessary to provide countermeasures against cyber threats. An overview of the use of semantic technologies and big data technologies in cyber defence is provided, and important areas for future research in the combined domains are discussed.
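To picture how meaning can be encoded separately from raw log data, here is a minimal Python sketch using the rdflib library (a tooling assumption; the vocabulary and the alert rule are hypothetical illustrations, not the paper's system). Events are stored as triples and a semantic rule is expressed as a SPARQL query:

```python
from rdflib import Graph, Literal, Namespace, RDF

# Hypothetical cyber-defence vocabulary.
CY = Namespace("http://example.org/cyber#")

g = Graph()
# Two observed events: their meaning is encoded as triples,
# independent of how the underlying log lines were formatted.
g.add((CY.evt1, RDF.type, CY.LoginFailure))
g.add((CY.evt1, CY.sourceIP, Literal("10.0.0.5")))
g.add((CY.evt2, RDF.type, CY.LoginFailure))
g.add((CY.evt2, CY.sourceIP, Literal("10.0.0.5")))

# A semantic rule as SPARQL: flag any source IP with more than one
# login failure, whatever syntactic form the raw events took.
query = """
SELECT ?ip (COUNT(?evt) AS ?failures) WHERE {
    ?evt a cy:LoginFailure ;
         cy:sourceIP ?ip .
} GROUP BY ?ip HAVING (COUNT(?evt) > 1)
"""
for row in g.query(query, initNs={"cy": CY}):
    print(f"possible brute-force from {row.ip}: {row.failures} failures")
```

A purely syntactic rule would have to match the textual shape of each log format; the semantic encoding lets one rule cover all of them.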


Author(s):  
Abid Ali ◽  
Nursyarizal Mohd Nor ◽  
Taib Ibrahim ◽  
Mohd Fakhizan Romlie ◽  
Kishore Bingi

This chapter proposes big data analytics for the sizing and locating of solar photovoltaic (PV) farms to reduce the total energy loss in distribution networks. Big data analytics, which uses advanced statistical and computational tools for handling large data sets, is adopted for modeling 15 years of solar weather data. A Total Power Loss Index (TPLI) is formulated as the main objective function for the optimization problem, while bus voltage deviations and PV farm penetration levels are also calculated. To solve the optimization problem, the study adopts the Mixed Integer Optimization using Genetic Algorithm (MIOGA) technique. Considering different time-varying, voltage-dependent load models, the proposed algorithm is applied to the IEEE 33-bus and IEEE 69-bus test distribution networks, and optimum results are obtained. The results reveal that, compared to a single PV farm, integrating two PV farms reduces energy loss further while requiring a smaller total PV capacity. Big data analytics is found to be very effective for storing, handling, processing, and visualizing the weather big data.
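As an illustration of the optimization framing only, the toy Python sketch below brute-forces a placement/size pair that minimizes a stand-in loss objective. The loss model, candidate buses, and sizes are all hypothetical; the chapter's actual method is MIOGA over full power-flow models, and the TPLI formula is not given in the abstract:

```python
import itertools

# Hypothetical stand-in loss model: base feeder loss minus a relief term
# that grows with PV output up to local demand, plus an overbuild penalty.
BASE_LOSS_KW = 120.0
DEMAND_KW = {6: 300.0, 13: 450.0, 28: 250.0}  # candidate buses (hypothetical)
SIZES_KW = [100, 200, 300, 400, 500]

def total_loss(bus: int, pv_kw: float) -> float:
    """Toy objective standing in for the chapter's TPLI."""
    relief = 0.08 * min(pv_kw, DEMAND_KW[bus])            # loss reduction
    overbuild = 0.02 * max(0.0, pv_kw - DEMAND_KW[bus])   # reverse-flow penalty
    return BASE_LOSS_KW - relief + overbuild

# Exhaustive search over the tiny candidate space; a GA replaces this
# enumeration once the real decision space becomes too large.
best = min(itertools.product(DEMAND_KW, SIZES_KW),
           key=lambda cand: total_loss(*cand))
print(f"best placement: bus {best[0]}, {best[1]} kW, "
      f"loss {total_loss(*best):.1f} kW")
```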

