Eco-Innovation and Industry 4.0: A Big Data Usage conceptual model

2018 ◽  
Vol 56 ◽  
pp. 05003 ◽  
Author(s):  
Russell Tatenda Munodawafa ◽  
Satirenjit Kaur Johl

Driven by Cyber Physical Systems, Big Data Analytics, the Internet of Things and Automation, Industry 4.0 is expected to revolutionize the world. A new era beckons for enterprises of all sizes, markets, governments, and the world at large as the digital economy fully takes off under Industry 4.0. The United Nations has also expressed its desire to usher in a new era for humanity with the 2030 Sustainable Development Goals (SDGs) replacing the Millennium Development Goals (MDGs). Critical to the achievement of both of the above-mentioned ambitions is the efficient and sustainable use of natural resources. Big Data Analytics, an important arm of Industry 4.0, gives organizations the ability to eco-innovate from a resource perspective. This paper analyzes previously published research literature and contributes to this emerging research area by looking at Big Data Usage from a strategic and organizational perspective. A conceptual framework that can be utilized in future research is developed from the literature. Also discussed is the expected impact of Big Data Usage on firm performance, particularly as the world becomes more concerned about the environment. Data-driven eco-innovation should be in full motion if organizations are to remain relevant in tomorrow's potentially ultra-competitive digital economy.

Author(s):  
Renan Bonnard ◽  
Márcio Da Silva Arantes ◽  
Rodolfo Lorbieski ◽  
Kléber Magno Maciel Vieira ◽  
Marcelo Canzian Nunes

2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Rajesh Kumar Singh ◽  
Saurabh Agrawal ◽  
Abhishek Sahu ◽  
Yigit Kazancoglu

Purpose The proposed article is aimed at exploring the opportunities, challenges and possible outcomes of incorporating big data analytics (BDA) into the health-care sector. The purpose of this study is to find the research gaps in the literature and to investigate the scope of incorporating new strategies in the health-care sector for increasing the efficiency of the system. Design/methodology/approach For a state-of-the-art review, a systematic literature review has been carried out to find research gaps in the field of healthcare using big data (BD) applications. A detailed research methodology, including material collection, descriptive analysis and categorization, is utilized to carry out the literature review. Findings BD analysis is rapidly being adopted in the health-care sector for utilizing the precious information available in the form of BD. However, it puts forth certain challenges that need to be focused upon. The article identifies and explains these challenges thoroughly. Research limitations/implications The proposed study will provide useful guidance to health-care sector professionals for managing the health-care system. It will help academicians and physicians evaluate, improve and benchmark health-care strategies through BDA in the health-care sector. One of the limitations of the study is that it is based on a literature review, and more in-depth studies may be carried out for the generalization of results. Originality/value There are certain effective tools available in the market today that are currently being used by both small and large businesses and corporations. One of them is BD, which may be very useful for the health-care sector. A comprehensive literature review is carried out for research papers published between 1974 and 2021.


2021 ◽  
Author(s):  
Samuel Boone ◽  
Fabian Kohlmann ◽  
Moritz Theile ◽  
Wayne Noble ◽  
Barry Kohn ◽  
...  

The AuScope Geochemistry Network (AGN) and partners Lithodat Pty Ltd are developing AusGeochem, a novel cloud-based platform for Australian-produced geochemistry data from around the globe. The open platform will allow laboratories to upload, archive, disseminate and publish their datasets, as well as perform statistical analyses and data synthesis within the context of large volumes of publicly funded geochemical data. As part of this endeavour, representatives from four Australian low-temperature thermochronology laboratories (University of Melbourne, University of Adelaide, Curtin University and University of Queensland) are advising the AGN and Lithodat on the development of low-temperature thermochronology (LTT)-specific data models for the relational AusGeochem database and its international counterpart, LithoSurfer. These schemas will facilitate the structured archiving of a wide variety of thermochronology data, enabling geoscientists to readily perform LTT Big Data analytics and gain new insights into the thermo-tectonic evolution of Earth’s crust.

Adopting established international data reporting best practices, the LTT expert advisory group has designed database schemas for the fission track and (U-Th-Sm)/He methods, as well as for thermal history modelling results and metadata. In addition to recording the parameters required for LTT analyses, the schemas include fields for reference material results and error reporting, allowing AusGeochem users to independently perform QA/QC on data archived in the database. Development of scripts for the automated upload of data directly from analytical instruments into AusGeochem, using its open-source Application Programming Interface, is currently under way.

By methodically archiving detailed LTT (meta-)data in structured schemas, intractably large datasets comprising thousands of analyses produced by numerous laboratories can be readily interrogated in new and powerful ways. These include rapid derivation of inter-data relationships, facilitating on-the-fly age computation, statistical analysis and data visualisation. With the detailed LTT data stored in relational schemas, measurements can be re-calculated and re-modelled using user-defined constants and kinetic algorithms, enabling analyses determined using different parameters to be equated and compared across regional to global scales.

The development of this novel tool heralds the beginning of a new era of structured Big Data in the field of low-temperature thermochronology, improving laboratories’ ability to manage and share their data in alignment with FAIR data principles.
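The structured, QA/QC-ready records described above can be pictured with a minimal sketch. All class and field names below are hypothetical simplifications for illustration, not the actual AusGeochem or LithoSurfer schema, and the payload format is an assumed JSON shape for a REST-style upload:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical, simplified fission-track record; real AusGeochem/LithoSurfer
# schemas record far more parameters and use their own field names.
@dataclass
class FissionTrackAnalysis:
    sample_id: str
    laboratory: str
    mineral: str
    spontaneous_track_density: float  # tracks/cm^2
    induced_track_density: float      # tracks/cm^2
    age_ma: float                     # central age, Ma
    age_error_ma: float               # 1-sigma uncertainty, Ma
    reference_material: str           # age standard, enabling independent QA/QC

    def to_api_payload(self) -> str:
        # Serialise the structured record for upload via an API
        return json.dumps(asdict(self))

record = FissionTrackAnalysis(
    sample_id="MEL-001", laboratory="University of Melbourne",
    mineral="apatite", spontaneous_track_density=1.2e6,
    induced_track_density=2.4e6, age_ma=95.3, age_error_ma=4.1,
    reference_material="Durango apatite",
)
payload = json.loads(record.to_api_payload())
```

Storing every analytical parameter (rather than only the computed age) is what allows ages to be re-calculated later with user-defined constants and kinetic algorithms.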


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Marwa Rabe Mohamed Elkmash ◽  
Magdy Gamal Abdel-Kader ◽  
Bassant Badr El Din

Purpose This study aims to investigate and explore the impact of big data analytics (BDA) as a mechanism that could develop the ability to measure customers’ performance. To accomplish the research aim, the theoretical discussion was developed by combining the diffusion of innovation theory with the technology acceptance model (TAM), which is less developed for this study’s research field. Design/methodology/approach Empirical data was obtained using Web-based quasi-experiments with 104 Egyptian accounting professionals. The Wilcoxon signed-rank test and the chi-square goodness-of-fit test were used to analyze the data. Findings The empirical results indicate that measuring customers’ performance based on BDA increases organizations’ ability to analyze customers’ unstructured data, decreases the cost of analyzing that data, increases the ability to handle customers’ problems quickly, minimizes the time spent analyzing customers’ data and obtaining customers’ performance reports, and controls managers’ bias when measuring customer satisfaction. The study findings supported the accounting professionals’ acceptance of BDA through the TAM elements: the intention to use (R), perceived usefulness (U) and the perceived ease of use (E). Research limitations/implications This study has several limitations that could be addressed in future research. First, it focuses on customers’ performance measurement (CPM) only and ignores other performance measurements, such as employees’ performance measurement and financial performance measurement; future research can examine these areas. Second, this study conducts a Web-based experiment with Master of Business Administration students as participants; researchers could conduct a laboratory experiment and report whether there are differences. Third, owing to the novelty of the topic, there was a lack of theoretical evidence in developing the study’s hypotheses.
Practical implications This study succeeds in providing the much-needed empirical evidence for BDA’s positive impact on improving CPM efficiency through the proposed framework (i.e. the CPM and BDA framework). Furthermore, this study contributes to the improvement of the performance measurement process, and thus the decision-making process, with meaningful and proper insights through the capability of collecting and analyzing customers’ unstructured data. On a practical level, a company could eventually use this study’s results and the new insights to make better decisions and develop its policies. Originality/value This study holds significance as it provides the much-needed empirical evidence for BDA’s positive impact on improving CPM efficiency. The study findings will contribute to the enhancement of the performance measurement process through the ability to gather and analyze customers’ unstructured data.
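The Wilcoxon signed-rank test used in the methodology compares paired pre/post observations by ranking the absolute differences and summing the ranks of positive and negative changes. A minimal pure-Python sketch of its W statistic, on synthetic data rather than the study’s own:

```python
def signed_rank_W(pre, post):
    # Paired differences; zero differences are dropped (standard practice)
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    # Rank absolute differences, assigning average ranks to ties
    ordered = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(ordered):
        j = i
        while j + 1 < len(ordered) and abs(diffs[ordered[j + 1]]) == abs(diffs[ordered[i]]):
            j += 1
        avg_rank = (i + j) / 2 + 1  # average of 1-based positions i..j
        for k in range(i, j + 1):
            ranks[ordered[k]] = avg_rank
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    # Test statistic: the smaller of the two signed-rank sums
    return min(w_plus, w_minus)
```

In practice a library routine (e.g. SciPy’s `scipy.stats.wilcoxon`) would also supply the p-value; this sketch only shows how the statistic itself is formed.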


Author(s):  
Yali Ren ◽  
Ning Wang ◽  
Jinwei Jiang ◽  
Junxiao Zhu ◽  
Gangbing Song ◽  
...  

In the challenging downhole environment, drilling tools are normally subject to high temperature, severe vibration, and other harsh operating conditions. Drilling activities generate massive field data, namely field reliability big data (FRBD), which includes downhole operation, environment, failure, degradation, and dynamic data. FRBD has large size, high variety, and extreme complexity, and it presents abundant opportunities and great challenges for drilling tool reliability analytics. Consequently, as one of the key factors affecting drilling tool reliability, the downhole vibration factor plays an essential role in reliability analytics based on FRBD. This paper reviews the important parameters of downhole drilling operations, examines the modes of downhole vibration and their physical and reliability impacts, and presents the features of reliability big data analytics. Specifically, this paper explores the application of the vibration factor in reliability big data analytics, covering tool lifetime/failure prediction, prognostics/diagnostics, condition monitoring (CM), and maintenance planning and optimization. Furthermore, the authors highlight future research on how to better apply the downhole vibration factor in reliability big data analytics to further improve tool reliability and optimize maintenance planning.
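As one concrete example of the vibration-based condition monitoring the review covers, a root-mean-square (RMS) amplitude feature distilled from accelerometer data can flag a tool for inspection. The readings and threshold below are illustrative assumptions, not values from the paper:

```python
import math

def rms(samples):
    # Root-mean-square amplitude: a common condition-monitoring feature
    # summarising vibration energy over a window of accelerometer samples
    return math.sqrt(sum(x * x for x in samples) / len(samples))

def exceeds_threshold(samples, limit):
    # Flag a tool for inspection when vibration energy passes a set limit
    return rms(samples) > limit

accel = [0.5, -1.2, 0.9, -0.7, 1.1]   # synthetic accelerometer readings, g
alert = exceeds_threshold(accel, limit=1.5)
```

At FRBD scale, features like this would be computed per tool per time window and fed into lifetime-prediction and maintenance-optimization models rather than compared against a single fixed limit.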


Author(s):  
Nirmit Singhal ◽  
Amita Goel ◽  
Nidhi Sengar ◽  
Vasudha Bahl

Between 2010 and 2022, the world generated 52 times the amount of data and 76 times the number of information sources. The ability to use this data creates enormous opportunities, and to make these opportunities a reality, people must use data to solve problems. This matters especially in the midst of a global pandemic, when people all over the world seek reliable, trustworthy information about COVID-19 (Coronavirus). Tableau plays a key role in this scenario because it is an extremely powerful tool for quickly visualizing large amounts of data. It has a simple drag-and-drop interface, beautiful infographics are simple to create and take little time, and it works with a wide variety of data sources. COVID-19 (Coronavirus) analytics with Tableau will allow you to create dashboards that will assist you. Tableau deals with big data analytics and generates output as visualizations, making it more understandable and presentable. Data blending, real-time reporting, and data collaboration are among its features. Ultimately, this paper provides a clear picture of the growing COVID-19 (Coronavirus) data and the tools that can assist more effectively, accurately, and efficiently. Keywords: Data Visualization, Tableau, Data Analysis, Covid-19 analysis, Covid-19 data
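Drag-and-drop tools such as Tableau generally work best with data in a long ("tidy") layout: one row per entity and date rather than one column per date. A small sketch of that preparation step, using made-up case counts (the column names and figures are illustrative only):

```python
# Hypothetical wide-format COVID-19 case counts: one column per date.
wide = [
    {"country": "A", "2020-03-01": 10, "2020-03-02": 15},
    {"country": "B", "2020-03-01": 3, "2020-03-02": 7},
]

def to_long(rows, id_field="country"):
    # Reshape wide records into long/tidy rows: (country, date, cases).
    long_rows = []
    for row in rows:
        for key, value in row.items():
            if key != id_field:
                long_rows.append({id_field: row[id_field],
                                  "date": key, "cases": value})
    return long_rows

tidy = to_long(wide)
```

Once in this shape, a date field can be dragged onto an axis and case counts onto a measure, which is how dashboards like those described above are typically assembled.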


2021 ◽  
Vol 23 (06) ◽  
pp. 1167-1182
Author(s):  
Shreyas Nopany ◽  
Prof. Manonmani S

The healthcare industry has become increasingly demanding in recent years. The growing number of patients makes it difficult for doctors and staff to manage their work effectively. To achieve their objectives, data analysts collect a large amount of data, analyze it, and use it to derive valuable insights. Data analytics may become a promising solution as healthcare industry demands increase. The paper discusses the challenges of data analytics in the healthcare sector and the benefits of using big data for healthcare analytics. Aside from focusing on the opportunities that big data analytics offers the healthcare sector, the paper also discusses data governance, strategy formulation, and improvements to IT infrastructure. Implementation techniques in big data analytics include Hadoop, HDFS, MapReduce, and other Apache tools. A healthcare management system can be categorized into five divisions, namely drug discovery; disease prevention; diagnosis and treatment; hospital operations; and post-care, each requiring comprehensive data management. Support for big data-driven transformation is identified as a required component of future research on the application of big data in healthcare.
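The MapReduce model named above splits a computation into a map step that emits key–value pairs and a reduce step that aggregates them per key; Hadoop distributes both steps across a cluster over data in HDFS. A minimal pure-Python illustration of the pattern (not an actual Hadoop job), counting hypothetical diagnosis codes across patient records:

```python
from collections import defaultdict

def map_phase(records):
    # Mapper: emit (diagnosis_code, 1) for every patient record
    for rec in records:
        yield rec["diagnosis"], 1

def reduce_phase(pairs):
    # Reducer: sum the emitted counts per diagnosis code
    totals = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

# Hypothetical patient records with ICD-10-style diagnosis codes
records = [
    {"patient": "p1", "diagnosis": "E11"},  # type 2 diabetes
    {"patient": "p2", "diagnosis": "I10"},  # hypertension
    {"patient": "p3", "diagnosis": "E11"},
]
counts = reduce_phase(map_phase(records))
```

In a real Hadoop deployment the mapper and reducer would run as separate tasks with a shuffle/sort stage between them; the logic per record, however, stays this simple.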


Author(s):  
Pethuru Raj

The implications of the digitization process, among a bevy of trends, are definitely many and memorable. One is the abnormal growth in data generation, gathering, and storage due to a steady increase in the number of data sources, structures, scopes, sizes, and speeds. In this chapter, the author shows some of the impactful developments brewing in the IT space: how the tremendous amount of data being produced and processed all over the world impacts the IT and business domains; how next-generation IT infrastructures are accordingly being refactored, remedied, and readied for the impending big data-induced challenges; how the big data analytics discipline is likely to move towards fulfilling the digital universe’s requirement of extracting and extrapolating actionable insights for the knowledge-parched; and, finally, the establishment and sustenance of the dreamt-of smarter planet.

