INFLUENCE OF BIG DATA ON THE DEVELOPMENT OF MANAGEMENT ACCOUNTING

2021, Vol 4 (4), pp. 107-110
Author(s): D. Yu. Rozhkova

Continuous advances in artificial intelligence and machine learning, combined with big data analytics, increase the risk that a wide range of professions, including accountants and management accountants, will be computerized and disappear. The article discusses the emergence of the data scientist profession and the prospects for the development of data analysis functions. Using a data analysis methodology that accounts for the emergence of big data, we position management accounting specialists and data analysts according to their functions and the data they use. We conclude that, at present, the data analyst's functions complement the skills and knowledge of accountants; the further direction of development, however, will depend on the needs of the enterprise and on advances in analytical methods and information products.

2020, Vol 10 (1), pp. 343-356
Author(s): Snezana Savoska, Blagoj Ristevski

Nowadays, big data is a widely used concept that is spreading quickly in almost every domain. For pharmaceutical companies, adopting it is challenging because of the constant pressure and business demands created by legal requirements, research demands and standards that have to be adopted. These legal and standards-related demands concern human healthcare safety and drug control, which require continuous and deep data analysis. Companies continually update their procedures to comply with the relevant laws, standards, market demands and regulations by using contemporary information technology. This paper highlights important aspects of the experience and change methodology of one Macedonian pharmaceutical company that has employed information technology solutions to successfully handle legal and business pressures when dealing with large amounts of data. We used a holistic view and a deliverables-analysis methodology to gain top-down insight into the possibilities of big data analytics. Structured interviews with the company's managers were used to collect information, and a proactive methodology with workshops supported data integration toward the implementation of big data concepts. The paper emphasizes the information and knowledge used in this domain to raise awareness of the need for big data analysis to achieve a competitive advantage. The main results focus on systematizing the company's data, information and knowledge and propose a solution that integrates big data to support managers' decision-making processes.


Big data and data science are two of the top trends of recent years, and they can be combined as big data science. This creates demand for new system architectures that support processes able to handle huge data volumes without sacrificing the agility, flexibility and interactive feel that suit a data scientist's exploratory approach. Businesses today have found ways to use data as the principal factor in value generation, and these data-driven businesses apply a variety of data tools, with data analysis as one of the chief elements of the process. To raise data science to the new computational level required to meet the challenges of big data and interactive advanced analytics, EXASOL has introduced a new technological approach. This tool enables more effective and easier data analysis.
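As a minimal illustration of how an analyst might push work into an analytic database such as EXASOL, the sketch below uses pyexasol, EXASOL's open-source Python driver. The connection details and the SALES table with its columns are hypothetical placeholders, not details taken from the abstract above.

```python
# Minimal sketch: pushing an aggregation into EXASOL and pulling only the
# small result set into pandas for interactive exploration. Connection
# details and the SALES table/columns are hypothetical placeholders.
import pyexasol  # EXASOL's open-source Python driver

conn = pyexasol.connect(
    dsn="exasol-cluster:8563",   # hypothetical host:port
    user="analyst",
    password="secret",
    schema="RETAIL",
)

# The heavy lifting (scan + aggregation) runs inside the database;
# the client receives only the aggregated rows.
df = conn.export_to_pandas(
    """
    SELECT region,
           SUM(revenue) AS total_revenue,
           COUNT(*)     AS n_orders
    FROM   SALES
    GROUP  BY region
    ORDER  BY total_revenue DESC
    """
)
print(df.head())

conn.close()
```

The design choice this illustrates is keeping computation close to the data: the exploratory, interactive part stays in Python, while volume-heavy scans and aggregations stay in the engine.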


2020, Vol 17 (6), pp. 2806-2811
Author(s): Wahidah Hashim, A/L Jayaretnam Prathees, Marini Othman, Andino Maseleno

Data science, also known as analytics, is currently in high demand in industry, and professionals well trained in this field are being recruited by many large companies. Before data science emerged, companies and industries hired software engineers and data analysts to solve IT-related problems. However, as most of the world's population came online, data began pouring in at a volume and velocity that software engineers and data analysts could no longer handle. Analysing data of this tremendous size is called big data analytics. Corporations have started to realize that data scientists are the right people to tackle big data problems, and the low supply of data scientists has driven up their salaries, which are many times higher than those of other IT professionals. Knowledge of data science can be applied to data-related problems in almost any domain. Data scientists are recruited not only by tech giants such as Google and Amazon; medium-sized organizations have also started to understand the importance of data science and recruit data scientists of their own. In this paper, we explore the data science requirements and knowledge that can be covered in UNITEN's computer science syllabus.


2021, Vol 83 (4), pp. 100-111
Author(s): Ahmad Anwar Zainuddin

The Internet of Things (IoT) is an emerging technology with a wide variety of applications. It allows physical objects to be organized within a specialized framework, extending their usefulness in terms of convenience and time savings, and it bridges the gap between the physical world and the machine world. It is already being used across a wide range of current technologies. One of its applications is monitoring and storing data from numerous devices over time, which allows easy analysis of the resulting datasets; that analysis can then inform decisions. In this study, the concept, architecture and relationship of IoT and big data are described. Next, several IoT and big data use cases are examined as part of the research methodology, followed by the opportunities and open challenges, including future directions. Furthermore, this paper adds value by proposing a new architecture for big data analytics in the Internet of Things. Overall, the various types of big IoT data analytics, their methods and the associated big data mining technologies are discussed.
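To make the monitor-store-analyse pattern mentioned above concrete, here is a minimal, self-contained Python sketch that aggregates simulated time-stamped readings from a few hypothetical devices into hourly per-device summaries. The device IDs, sampling interval and temperature values are invented for illustration and are not taken from the paper.

```python
# Minimal sketch: aggregating time-stamped sensor readings from several
# IoT devices into per-device hourly summaries, the kind of batch analysis
# an IoT big data pipeline typically feeds. All readings are simulated.
import random
from datetime import datetime, timedelta

import pandas as pd

# Simulate one day of temperature readings from three hypothetical devices,
# sampled every 15 minutes.
start = datetime(2021, 1, 1)
rows = [
    {
        "device_id": f"sensor-{d}",
        "ts": start + timedelta(minutes=15 * i),
        "temperature": 20 + d + random.gauss(0, 0.5),
    }
    for d in range(3)
    for i in range(96)  # 96 x 15-minute intervals = 24 hours
]
readings = pd.DataFrame(rows)

# Per-device hourly means -- a typical first step before anomaly detection
# or dashboarding.
hourly = (
    readings
    .set_index("ts")
    .groupby("device_id")["temperature"]
    .resample("1H")
    .mean()
    .reset_index()
)
print(hourly.head())
```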


2018, Vol 20 (1)
Author(s): Tiko Iyamu

Background: Over the years, big data analytics has been carried out statically, in a programmed way that does not allow data sets to be interpreted from a subjective perspective. This approach limits understanding of why and how data sets manifest themselves in the forms they do, which negatively affects the accuracy, redundancy and usefulness of data sets and, in turn, the value of operations and the competitive effectiveness of an organisation. The current single approach also lacks the detailed examination of data sets that big data deserves in order to improve purposefulness and usefulness. Objective: The purpose of this study was to propose a multilevel approach to big data analysis, including an examination of how a sociotechnical theory, actor network theory (ANT), can be used to complement analytic tools in big data analysis. Method: Qualitative methods were employed from an interpretivist perspective. Results: From the findings, a framework was developed that offers big data analytics at two levels, micro- (strategic) and macro- (operational). Based on the framework, a model was developed that can guide the analysis of the heterogeneous data sets that exist within networks. Conclusion: The multilevel approach ensures a fully detailed analysis, which is intended to increase accuracy, reduce redundancy and put the manipulation and manifestation of data sets into perspective for improved organisational competitiveness.


2021, Vol ahead-of-print (ahead-of-print)
Author(s): Heru Fahlevi, Irsyadillah Irsyadillah, Mirna Indriani, Rina Suryani Oktari

Purpose: This study aims to provide insights into management accounting changes (MACs) and the potential roles of big data analytics (BDA) in accelerating those changes in an Indonesian public hospital responding to the adoption of the diagnosis-related groups (DRG)-based payment system. Design/methodology/approach: A mixed-method approach was used to collect and analyse data from a referral public hospital in Indonesia. First, a BDA simulation was carried out to reveal its usefulness in predicting and evaluating patient costs and, ultimately, improving the cost recovery rate (CRR) of each DRG case. This part formulated and tested mathematical models that predict patient cost and the CRR from their determinants (length of stay/LOS, severity/SEV, patient age/AGE and gender/SEX). For this purpose, data on the top ten inpatient cases of 2018 were collected and analysed. Second, semi-structured interviews with senior staff and doctors were carried out to understand the cost control strategies implemented in the hospital and the perceptions of management and doctors regarding the application of the tested mathematical models for cost control. Old institutional economics and new institutional sociology were used to gain insight into how and why management accounting practices changed in the hospital. Findings: The findings show that the absence of detailed per-case/patient cost information has not only hindered the further evolution of MACs but also stimulated tensions between the managerial and medical worlds in the studied Indonesian public hospital. The BDA simulation in this study not only identified the determinants of case cost recovery but also enabled the prediction of a patient's CRR immediately after admission. The application of BDA and casemix accounting in the hospital can potentially catalyse discussion and mutual learning between managerial and medical staff in controlling patient costs. Originality/value: This paper provides a more comprehensive picture of the potential roles of BDA in cost control practices. The study assesses the feasibility of BDA application in the hospital and evaluates the potential roles and acceptance of BDA by both management and doctors.
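Purely as an illustrative sketch of the kind of model the abstract describes (the study's actual specification, coefficients and hospital data are not reproduced here), the Python snippet below fits an ordinary least squares model predicting patient cost from LOS, SEV, AGE and SEX on simulated records and derives a cost recovery rate against a hypothetical fixed DRG tariff.

```python
# Illustrative sketch only: simulated inpatient records, not the study's data.
# Fits cost ~ LOS + SEV + AGE + SEX, then computes CRR against a fixed tariff.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500

# Simulated records for one hypothetical DRG case group.
cases = pd.DataFrame({
    "LOS": rng.integers(1, 15, n),   # length of stay (days)
    "SEV": rng.integers(1, 4, n),    # severity level 1-3
    "AGE": rng.integers(18, 90, n),
    "SEX": rng.integers(0, 2, n),    # 0 = female, 1 = male
})
cases["COST"] = (
    1_000 + 450 * cases["LOS"] + 800 * cases["SEV"]
    + 5 * cases["AGE"] + rng.normal(0, 300, n)
)

# Predict patient cost from the determinants named in the abstract.
model = smf.ols("COST ~ LOS + SEV + AGE + SEX", data=cases).fit()
print(model.summary().tables[1])

# Cost recovery rate: hypothetical DRG tariff divided by predicted cost,
# available immediately after admission once the determinants are known.
DRG_TARIFF = 7_500
cases["CRR"] = DRG_TARIFF / model.predict(cases)
print(cases["CRR"].describe())
```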


2021, Vol ahead-of-print (ahead-of-print)
Author(s): Fatao Wang, Di Wu, Hongxin Yu, Huaxia Shen, Yuanjun Zhao

Purpose: Based on the typical service supply chain (SSC) structure, the authors construct a model of the e-tailing SSC to explore coordination relationships in the supply chain; big data analysis provides realistic possibilities for creating such coordination mechanisms. Design/methodology/approach: At present, e-commerce companies have not yet established a mature SSC system or achieved good synergy with other members of the supply chain, so shortages of goods and heavy pressure on express logistics companies coexist. Given uncertain online shopping demand, the authors employ the newsvendor (newsboy) model from operations research to analyse the coordination mechanism of the SSC model. Findings: By analysing the e-tailing SSC coordination mechanism and adjusting the relevant parameters, the authors find that the coordination mechanism can be implemented and optimized; a numerical example confirms the feasibility of this analysis. Originality/value: Big data analysis provides a realistic basis for establishing an online SSC coordination mechanism, which can effectively promote the efficient allocation of supplies and better meet consumers' needs.
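For readers unfamiliar with the newsvendor (newsboy) model the authors employ, the short sketch below computes the optimal order quantity from the critical fractile cu/(cu+co) under normally distributed demand. The prices, costs and demand parameters are illustrative assumptions, not figures from the paper.

```python
# Minimal newsvendor (newsboy) sketch under normally distributed demand.
# All parameter values are illustrative assumptions.
from scipy.stats import norm

price, cost, salvage = 50.0, 30.0, 10.0  # unit selling price, cost, salvage value
cu = price - cost                        # underage cost: margin lost per unit short
co = cost - salvage                      # overage cost: loss per unsold unit

mu, sigma = 1_000, 200                   # assumed demand mean and std. dev.

critical_fractile = cu / (cu + co)       # optimal service level
q_star = norm.ppf(critical_fractile, loc=mu, scale=sigma)

print(f"Critical fractile: {critical_fractile:.3f}")
print(f"Optimal order quantity: {q_star:.0f} units")
```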


Have you ever wondered how companies that adopt big data and analytics have generated value? Which algorithm they use for which situation? And what the results were? These questions are discussed in this chapter in order to highlight the importance of big data analytics. To give a quick introduction to what is being done in data analytics applications and to spark the reader's interest, the author introduces several application examples. These examples will give you more detailed insight into the types and uses of algorithms for data analysis. So, enjoy the examples.


Web Services, 2019, pp. 1301-1329
Author(s): Suren Behari, Aileen Cater-Steel, Jeffrey Soar

The chapter discusses how financial services organizations can take advantage of big data analysis for disruptive innovation, through a case study in the financial services industry. Popular tools for big data analysis are presented, and the challenges of big data are explored along with how those challenges can be met. The work of Hayes-Roth on Valued Information at the Right Time (VIRT) and how it applies to the case study is examined. Boyd's Observe, Orient, Decide, Act (OODA) model is explained in relation to disruptive innovation in financial services. Future trends in big data analysis in the financial services domain are explored.

