Case Study on Data Analytics and Machine Learning Accuracy

2021 ◽  
Vol 09 (04) ◽  
pp. 249-270
Author(s):  
Abdullah Z. Alruhaymi ◽  
Charles J. Kim
Author(s):  
Rathimala Kannan ◽  
Intan Soraya Rosdi ◽  
Kannan Ramakrishna ◽  
Haziq Riza Abdul Rasid ◽  
Mohamed Haryz Izzudin Mohamed Rafy ◽  
...  

Data analytics is an essential component in deriving insights from data obtained from multiple sources. It represents the technology, methods, and techniques used to obtain insights from massive datasets. As data volumes increase, companies are looking for ways to surface relevant business insights from underneath layers of data and information, to help them better understand new business ventures, opportunities, business trends, and complex challenges. However, to date, while the extensive benefits of business data analytics to large organizations are widely published, micro, small, and medium-sized organizations have not fully grasped the potential benefits to be gained from data analytics using machine learning techniques. This study is guided by the research question of how data analytics using machine learning techniques can benefit small businesses. Using the case study method, this paper outlines how small businesses in two different industries, i.e., healthcare and retail, can leverage data analytics and machine learning techniques to gain competitive advantage from their data. Details of the respective benefits gained by the small business owners featured in the two case studies provide important answers to the research question.
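The abstract does not publish the case-study code, but the kind of small-retail analytics it describes can be sketched with a classic RFM (recency, frequency, monetary) segmentation. The transaction data, customer IDs, and the "loyal" rule below are illustrative assumptions, not figures from the paper:

```python
from datetime import date

# Hypothetical toy transaction log: (customer_id, purchase_date, amount)
transactions = [
    ("c1", date(2021, 3, 1), 40.0),
    ("c1", date(2021, 3, 20), 55.0),
    ("c2", date(2021, 1, 5), 15.0),
    ("c3", date(2021, 3, 25), 120.0),
]

today = date(2021, 4, 1)

def rfm_scores(txns, today):
    """Aggregate per-customer recency (days since last purchase),
    purchase frequency, and total monetary value."""
    stats = {}
    for cid, d, amt in txns:
        last, freq, mon = stats.get(cid, (None, 0, 0.0))
        last = d if last is None else max(last, d)
        stats[cid] = (last, freq + 1, mon + amt)
    return {cid: ((today - last).days, freq, mon)
            for cid, (last, freq, mon) in stats.items()}

scores = rfm_scores(transactions, today)
# Flag "loyal" customers: repeat buyers active in the last 30 days
loyal = [cid for cid, (r, f, m) in scores.items() if f > 1 and r <= 30]
```

A small retailer could use such segments to target promotions; the same aggregation pattern extends to appointment histories in the healthcare case.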


2021 ◽  
Author(s):  
Andrei Popa ◽  
Ben Amaba ◽  
Jeff Daniels

Abstract A practical framework that outlines the critical steps of a successful process that uses data, machine learning (ML), and artificial intelligence (AI) is presented in this study. A practical case study is included to demonstrate the process. The use of artificial intelligence and machine learning has not only enhanced but also sped up problem-solving approaches in many domains, including the oil and gas industry. Moreover, these technologies are revolutionizing all key aspects of engineering, including framing approaches, techniques, and outcomes. The proposed framework includes key components to ensure the integrity, quality, and accuracy of data, and governance centered on principles such as responsibility, equitability, and reliability. Industry documentation shows that technology coupled with process advances can improve productivity by 20%. A clear work-breakdown structure (WBS) for creating value using an engineering framework has measurable outcomes. AI and ML technologies enable the use of large amounts of information, combining static and dynamic data, observations, historical events, and behaviors. The Job Task Analysis (JTA) model is a proven framework to manage processes, people, and platforms. JTA is a modern data-focused approach that prioritizes, in order: problem framing, analytics framing, data, methodology, model building, deployment, and lifecycle management. The case study exemplifies how the JTA model optimizes an oilfield production plant, similar to a manufacturing facility. A data-driven approach was employed to analyze and evaluate the impact on production fluids during planned or unplanned facility system disruptions. The workflows include data analytics tools such as ML/AI for pattern recognition and clustering for prompt event mitigation and optimization. The paper demonstrates how an integrated framework leads to significant business value.
The study integrates surface and subsurface information to characterize and understand the production impact of planned and unplanned plant events. The findings led to the design of a relief system to divert back pressure during plant shutdowns. The study avoided the cost of a new plant, saving millions of dollars as well as environmental impact, safety risks, and unnecessary operating and maintenance costs. Moreover, tens of millions of dollars of value per year were created by avoiding production losses from plant upsets and shutdowns. The study required no additional spend to perform, only about two months of part-time effort by a team of five engineers and data scientists. The work provided critical steps in creating a trusted model and in explainability. The methodology was implemented using existing available data and tools; it was the process and engineering knowledge that led to the successful outcome. Having a systematic WBS has become vital in data analytics projects that use AI and ML technologies. An effective governance system creates a 25% productivity improvement and a 70% capital improvement. Poor requirements can consume 40% or more of a development budget. The process, models, and tools should be used on engineering projects where data and physics are present. The proposed framework demonstrates the business impact and value creation generated by integrating models, data, AI, and ML technologies for modeling and optimization. It reflects the collective knowledge and perspectives of diverse professionals from IBM, Lockheed Martin, and Chevron, who joined forces to document a standard framework for achieving success in data analytics/AI projects.
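The abstract names "clustering for prompt event mitigation" without publishing code, so the following is only a minimal sketch of the idea: a 1-D k-means separating normal production rates from upset events. The sensor values, cluster count, and rate levels are hypothetical, not Chevron data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy hourly production rates: normal operation near 1000, upsets near 200
rates = np.concatenate([rng.normal(1000, 30, 95), rng.normal(200, 20, 5)])

def kmeans_1d(x, k=2, iters=50):
    """Minimal 1-D k-means: assign each point to its nearest center,
    then move each center to the mean of its assigned points."""
    centers = np.linspace(x.min(), x.max(), k)
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean()
    return labels, centers

labels, centers = kmeans_1d(rates)
upset_cluster = int(np.argmin(centers))          # lower-rate cluster = upsets
n_upsets = int(np.sum(labels == upset_cluster))  # hours flagged for review
```

In a production workflow, the flagged hours would be cross-referenced with planned-shutdown logs so that only unplanned disruptions trigger mitigation.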


Amicus Curiae ◽  
2020 ◽  
Vol 1 (3) ◽  
pp. 338-360
Author(s):  
Jamie Grace ◽  
Roxanne Bamford

Policymaking is increasingly being informed by ‘big data’ technologies of analytics, machine learning and artificial intelligence (AI). John Rawls used particular principles of reasoning in his 1971 book, A Theory of Justice, which might help explore known problems of data bias, unfairness, accountability and privacy, in relation to applications of machine learning and AI in government. This paper will investigate how the current assortment of UK governmental policy and regulatory developments around AI in the public sector could be said to meet, or not meet, these Rawlsian principles, and what we might do better by incorporating them when we respond legislatively to this ongoing challenge. This paper uses a case study of data analytics and machine-learning regulation as the central means of this exploration of Rawlsian thinking in relation to the redevelopment of algorithmic governance.


2020 ◽  
Author(s):  
Arman Behnam ◽  
Roohollah Jahanmahin

The world is now experiencing a new pandemic caused by the COVID-19 virus, and all countries are affected by this disease, especially Iran. From the beginning of the outbreak until April 30, 2020, over 90,000 confirmed cases of COVID-19 were reported in Iran. Due to the socioeconomic problems caused by this disease, it is necessary to predict the trend of the outbreak and propose a reliable method for identifying the correct trend. In this paper, we compiled a dataset including the number of confirmed cases, the daily number of death cases, and the number of recovered cases. Further, by combining case-number variables, such as behaviors and policies that change over time, with machine learning (ML) algorithms such as logistic-function fitting using the inflection point, we created new rates, such as the weekly death rate and life rate, and new approaches to the mortality rate and recovery rate. Gaussian functions show superior performance, which can help the government improve its awareness of the factors that have significant impacts on future trends of this virus.
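The logistic and Gaussian curve families mentioned in the abstract can be made concrete as follows; the parameter values are illustrative stand-ins, not fitted values from the Iranian dataset. The key relationship the paper exploits is that the inflection point of the cumulative logistic curve coincides with the peak of the bell-shaped daily-case curve:

```python
import numpy as np

def logistic(t, K, r, t0):
    """Logistic growth: K = final outbreak size, r = growth rate,
    t0 = inflection day, where cumulative cases reach K / 2."""
    return K / (1.0 + np.exp(-r * (t - t0)))

def gaussian(t, A, mu, sigma):
    """Gaussian curve for daily new cases: peak A on day mu, spread sigma."""
    return A * np.exp(-((t - mu) ** 2) / (2 * sigma ** 2))

days = np.arange(0, 120)
cumulative = logistic(days, K=90_000, r=0.15, t0=40)  # cumulative cases
daily = np.diff(cumulative)                           # daily new cases

# The inflection point of the cumulative curve is the daily-case peak:
peak_day = int(np.argmax(daily))  # near t0 = 40
half_total = cumulative[40]       # exactly K / 2 = 45,000 at the inflection
```

In practice the parameters K, r, and t0 (or A, mu, sigma) would be estimated from the reported counts by nonlinear least squares before the curves are extrapolated.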


2020 ◽  
Author(s):  
Avinash Wesley ◽  
Bharat Mantha ◽  
Ajay Rajeev ◽  
Aimee Taylor ◽  
Mohit Dholi ◽  
...  
