Artificial Intelligence and Machine Learning Algorithms

Author(s):  
Amit Kumar Tyagi ◽  
Poonam Chahal

With recent developments in technology and the integration of millions of Internet of Things devices, a vast amount of data is generated every day (known as Big Data). This data is needed to support the growth of many organizations and of applications such as e-healthcare. We are also entering an era of a smart world, in which robotics will be deployed in most applications to help solve real-world problems. Implementing robotics in applications such as medicine and the automobile industry is a goal of computer vision. Computer vision (CV) is realized through several components, namely artificial intelligence (AI), machine learning (ML), and deep learning (DL). Here, machine learning and deep learning techniques/algorithms are used to analyze Big Data. Today, organizations such as Google and Facebook use ML techniques to search for particular data or to recommend posts. Hence, the requirements of computer vision are fulfilled through these three terms: AI, ML, and DL.

Author(s):  
Arul Murugan R. ◽  
Sathiyamoorthi V.

Machine learning (ML) is one of the exciting sub-fields of artificial intelligence (AI). The term machine learning generally refers to the ability to learn without being explicitly programmed. In recent years, machine learning has become one of the thrust areas of research across various business verticals. Technical advancements in the field of big data have made it easy to gain access to large volumes of diversified data, and this massive amount of data can be processed at high speed in a reasonable amount of time with the help of emerging hardware capabilities. Hence, machine learning algorithms have been the most effective at leveraging all of this big data to provide near real-time solutions even for complex business problems. This chapter aims to give a solid introduction to various widely adopted machine learning techniques and their applications, categorized into supervised, unsupervised, and reinforcement learning, and will serve as a simplified guide for aspiring data and machine learning enthusiasts.
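As a hedged illustration of the supervised and unsupervised categories the chapter covers (reinforcement learning is omitted for brevity), the following minimal Python sketch assumes scikit-learn and its bundled Iris dataset; neither the library nor the dataset is prescribed by the chapter itself.

```python
# Minimal sketch contrasting supervised and unsupervised learning.
# Assumes scikit-learn; the Iris dataset and model choices are illustrative only.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.cluster import KMeans
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)

# Supervised: learn a mapping from features to known labels.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("supervised accuracy:", accuracy_score(y_test, clf.predict(X_test)))

# Unsupervised: group the same observations without ever seeing the labels.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("cluster sizes:", [int((clusters == k).sum()) for k in range(3)])
```

The contrast is the presence or absence of labels: the classifier is fitted to known targets, while the clustering step sees only the features.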


2021 ◽  
Vol 10 (2) ◽  
pp. 205846012199029
Author(s):  
Rani Ahmad

Background The scope and productivity of artificial intelligence applications in health science and medicine, particularly in medical imaging, are rapidly progressing, driven by relatively recent developments in big data and deep learning and by increasingly powerful computer algorithms. Accordingly, there are a number of opportunities and challenges for the radiological community. Purpose To provide a review of the challenges and barriers experienced in diagnostic radiology on the basis of the key clinical applications of machine learning techniques. Material and Methods Studies published in 2010–2019 that report on the efficacy of machine learning models were selected. A single contingency table was selected for each study to report the highest accuracy of radiology professionals and machine learning algorithms, and a meta-analysis of the studies was conducted based on these contingency tables. Results The specificity for all the deep learning models ranged from 39% to 100%, whereas sensitivity ranged from 85% to 100%. The pooled sensitivity and specificity were 89% and 85% for the deep learning algorithms for detecting abnormalities, compared to 75% and 91% for radiology experts, respectively. The pooled specificity and sensitivity for the comparison between radiology professionals and deep learning algorithms were 91% and 81% for deep learning models and 85% and 73% for radiology professionals (p < 0.000), respectively. The pooled sensitivity of detection was 82% for health-care professionals and 83% for deep learning algorithms (p < 0.005). Conclusion Radiomic information extracted through machine learning programs from images may not be discernible through visual examination and thus may improve the prognostic and diagnostic value of data sets.
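The pooled figures above come from per-study 2 × 2 contingency tables; the brief sketch below shows how sensitivity and specificity are computed from such a table. The counts are invented purely to illustrate the arithmetic and are not drawn from the review.

```python
# Sensitivity and specificity from a single 2x2 contingency table.
# The counts below are hypothetical, purely to illustrate the arithmetic.
true_positive = 89
false_negative = 11
true_negative = 85
false_positive = 15

sensitivity = true_positive / (true_positive + false_negative)   # TP / (TP + FN)
specificity = true_negative / (true_negative + false_positive)   # TN / (TN + FP)

print(f"sensitivity = {sensitivity:.2%}, specificity = {specificity:.2%}")
```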


Author(s):  
Qifang Bi ◽  
Katherine E Goodman ◽  
Joshua Kaminsky ◽  
Justin Lessler

Machine learning is a branch of computer science that has the potential to transform epidemiologic sciences. Amid a growing focus on “Big Data,” it offers epidemiologists new tools to tackle problems for which classical methods are not well-suited. In order to critically evaluate the value of integrating machine learning algorithms and existing methods, however, it is essential to address language and technical barriers between the two fields that can make it difficult for epidemiologists to read and assess machine learning studies. Here, we provide an overview of the concepts and terminology used in machine learning literature, which encompasses a diverse set of tools with goals ranging from prediction to classification to clustering. We provide a brief introduction to 5 common machine learning algorithms and 4 ensemble-based approaches. We then summarize epidemiologic applications of machine learning techniques in the published literature. We recommend approaches to incorporate machine learning in epidemiologic research and discuss opportunities and challenges for integrating machine learning and existing epidemiologic research methods.
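As a hedged illustration of the kind of ensemble-based approach the overview surveys (not the authors' own analysis), the sketch below fits a gradient-boosting classifier to synthetic data with scikit-learn and reports cross-validated discrimination.

```python
# Sketch of an ensemble-based classifier of the kind surveyed in the overview.
# The synthetic data stands in for an epidemiologic dataset; nothing here is from the paper.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, n_features=20, n_informative=8, random_state=42)
model = GradientBoostingClassifier(random_state=42)

# Cross-validated AUC as a simple measure of predictive performance.
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print("mean cross-validated AUC:", scores.mean().round(3))
```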


Scientific knowledge and electronic devices are growing day by day, and many expert systems in the healthcare industry now rely on machine learning algorithms. Deep neural networks outperform classical machine learning techniques and often take raw, unrefined data to compute the target output. Deep learning, or feature learning, focuses on the features that matter most and gives a more complete understanding of the generated model. The existing methodology uses a data mining technique (a rule-based classification algorithm) and a machine learning algorithm (a hybrid logistic regression algorithm) to preprocess data and extract meaningful insights; this, however, requires supervised (labelled) data. The proposed work is based on unsupervised data, that is, data with no labels, and deep neural techniques are deployed to obtain the target output. The machine learning algorithms are compared with the proposed deep learning techniques, implemented using TensorFlow and Keras, in terms of accuracy. The deep learning methodology outperforms the existing rule-based classification and hybrid logistic regression algorithms in terms of accuracy. The designed methodology is tested on the public MIT-BIH arrhythmia database, classifying four kinds of abnormal beats. The proposed approach based on deep learning offers better performance, improving on the results of state-of-the-art machine learning approaches.
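The abstract names TensorFlow and Keras but does not describe the network itself; the sketch below is a hypothetical 1D convolutional autoencoder for unsupervised feature learning on fixed-length beat segments, with the segment length, layer sizes, and placeholder data all assumed for illustration.

```python
# Hypothetical Keras sketch of unsupervised feature learning on ECG beat segments
# (a 1D convolutional autoencoder); the paper's actual architecture is not given in the abstract.
import numpy as np
from tensorflow.keras import layers, models

SEGMENT_LEN = 188   # assumed samples per beat segment (divisible by 4 for the pooling below)

autoencoder = models.Sequential([
    layers.Input(shape=(SEGMENT_LEN, 1)),
    layers.Conv1D(16, 5, activation="relu", padding="same"),
    layers.MaxPooling1D(2, padding="same"),
    layers.Conv1D(8, 5, activation="relu", padding="same"),
    layers.MaxPooling1D(2, padding="same"),          # compressed representation
    layers.Conv1D(8, 5, activation="relu", padding="same"),
    layers.UpSampling1D(2),
    layers.Conv1D(16, 5, activation="relu", padding="same"),
    layers.UpSampling1D(2),
    layers.Conv1D(1, 5, activation="linear", padding="same"),
])
autoencoder.compile(optimizer="adam", loss="mse")

# Random placeholder data standing in for preprocessed MIT-BIH beat segments.
x_dummy = np.random.randn(256, SEGMENT_LEN, 1).astype("float32")
autoencoder.fit(x_dummy, x_dummy, epochs=1, batch_size=32, verbose=0)
```

In a pipeline of this kind, the compressed representations would subsequently be grouped, for example by clustering, into the beat types evaluated against the MIT-BIH annotations.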


2019 ◽  
Vol 2019 (2) ◽  
pp. 103-112
Author(s):  
Dr. Pasumpon pandian

The recent rapid technological growth has paved the way for big data, which denotes the exponential growth of information. Big data analytics is a trending concept that has emerged as a promising technology, offering enhanced insights from the huge volumes of data produced in diverse areas. The review in this paper covers the methods of big data analytics and machine learning used to handle such large data flows. An overview of the use of machine learning algorithms in the analytics of highly voluminous data provides a deeper and richer analysis of the information gathered, so that the valuable parts can be extracted and turned into actionable information. The purpose of this paper is to review the role of machine learning algorithms in the analytics of highly voluminous data.


Author(s):  
Thiyagarajan P.

Digitalization is the buzzword of today: every walk of our life has been computerized, and it has made our lives more sophisticated. On one side, we enjoy the privileges of digitalization; on the other, the security of our information on the internet is a major concern. A variety of security mechanisms, namely cryptography, algorithms that control access to protected information, authentication (including biometrics), and steganography, protect our information on the internet. In addition to these mechanisms, artificial intelligence (AI) has recently contributed to strengthening information security by providing machine learning and deep learning-based security mechanisms. The contribution of AI to cyber security is important, as it provides a ready response to hackers' malicious actions. The purpose of this chapter is to survey recent papers that contribute to information security by using machine learning and deep learning techniques.
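As one concrete, hedged example of the class of machine learning-based security mechanisms surveyed here, the sketch below applies unsupervised anomaly detection to synthetic network-flow style features; the feature set and data are invented, and no specific system from the chapter is implied.

```python
# Illustrative sketch of one class of ML-based security mechanism:
# unsupervised anomaly detection over simple network-flow style features.
# The features and data are synthetic; no system from the chapter is implied.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Columns per flow: bytes sent, bytes received, connection duration (all synthetic).
normal_traffic = rng.normal(loc=[500, 800, 2.0], scale=[50, 80, 0.5], size=(1000, 3))
suspicious = np.array([[5000.0, 10.0, 30.0]])   # an outlier-like flow

detector = IsolationForest(contamination=0.01, random_state=0).fit(normal_traffic)
print("suspicious flow flagged as:", detector.predict(suspicious))  # -1 means anomaly
```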


2021 ◽  
Vol 31 (11) ◽  
pp. 2150173
Author(s):  
Miguel A. F. Sanjuán

Machine learning and deep learning techniques are contributing much to the advancement of science. Their powerful predictive capabilities appear in numerous disciplines, including chaotic dynamics, but they do not provide understanding. The main thesis here is that prediction and understanding are two very different and important ideas that should guide us as we follow the progress of science. Furthermore, the important role played by nonlinear dynamical systems in the process of understanding is emphasized. The path of the future of science will be marked by a constructive dialogue between big data and big theory, without which we cannot achieve understanding.


2021 ◽  
Vol 7 (1) ◽  
Author(s):  
Ashwin A. Phatak ◽  
Franz-Georg Wieland ◽  
Kartik Vempala ◽  
Frederik Volkmar ◽  
Daniel Memmert

With the rising amount of data in the sports and health sectors, a plethora of applications using big data mining have become possible. Multiple frameworks have been proposed to mine, store, preprocess, and analyze physiological vitals data using artificial intelligence and machine learning algorithms. Comparatively less research has been done on collecting potentially high-volume, high-quality ‘big data’ in an organized, time-synchronized, and holistic manner to solve similar problems in multiple fields. Although a large number of data collection devices exist in the form of sensors, they are either highly specialized, univariate, and fragmented in nature, or exist only in a lab setting. The current study proposes the artificial intelligence-based body sensor network framework (AIBSNF), a framework for the strategic use of body sensor networks (BSN), which combines a real-time location system (RTLS) with wearable biosensors to collect multivariate, low-noise, and high-fidelity data. This facilitates the gathering of time-synchronized location and physiological vitals data, which allows artificial intelligence and machine learning (AI/ML)-based time series analysis. The study gives a brief overview of wearable sensor technology and RTLS, and provides use cases of AI/ML algorithms in the field of sensor fusion. The study also elaborates on sample scenarios using a specific sensor network consisting of pressure sensors (insoles), accelerometers, gyroscopes, ECG, EMG, and RTLS position detectors for particular applications in the fields of health care and sports. The AIBSNF may provide a solid blueprint for conducting research and development, forming a smooth end-to-end pipeline from data collection using BSN and RTLS to final-stage analytics based on AI/ML algorithms.
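As a hedged sketch of the time-synchronization step the framework implies, the example below aligns two independently sampled sensor streams onto a common clock using pandas; the sensor names, sampling rates, and data are invented for illustration and are not part of AIBSNF itself.

```python
# Hedged sketch of time-synchronizing two independently sampled sensor streams.
# Sensor names, rates, and data are invented for illustration only.
import numpy as np
import pandas as pd

# Accelerometer sampled at 100 Hz, RTLS position at 10 Hz (both synthetic).
t_acc = pd.date_range("2021-01-01", periods=1000, freq="10ms")
acc = pd.DataFrame({"acc_x": np.random.randn(1000)}, index=t_acc)

t_pos = pd.date_range("2021-01-01", periods=100, freq="100ms")
pos = pd.DataFrame({"x_m": np.cumsum(np.random.randn(100)) * 0.01}, index=t_pos)

# Resample both streams to a shared 50 ms grid and join on timestamps.
fused = (
    acc.resample("50ms").mean()
    .join(pos.resample("50ms").ffill(), how="inner")
)
print(fused.head())
# The fused, time-indexed table is what downstream AI/ML time series models would consume.
```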


2020 ◽  
Vol 4 (1) ◽  
pp. 16-30
Author(s):  
Mehreen Arshad

Purpose: Research on business simulation and machine learning has attracted immense interest in the last few years. The aim of this study was to provide a comprehensive view of machine learning in business simulation and to review the use of artificial intelligence in business simulation analysis. A review of the literature, however, shows few systematic reviews on the application of machine learning techniques to business simulation, even though systematic reviews have gained prominence in academic discourse. Methodology: This study therefore systematically reviews a total of 123 shortlisted articles that focus on machine learning techniques in the business simulation process. Findings: Many machine learning algorithms can be used in business simulation; this study was able to review ten machine learning algorithms in the business simulation process. As a whole, the machine learning algorithms have been deployed to improve lead times in industrial production. In inventory and storage, machine learning has been applied to improve efficiency by identifying inventory patterns that would never have been revealed otherwise, and thus saves on costs. Future directions are also discussed.
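As a hedged illustration of the inventory-pattern use case mentioned in the findings (not a method from any reviewed article), the sketch below clusters synthetic SKU-level features with scikit-learn.

```python
# Illustrative sketch (not from the review): clustering SKU-level inventory features
# to surface patterns of the kind the reviewed studies describe. Data is synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Columns per SKU: average weekly demand, demand variability, lead time in days.
inventory = rng.normal(loc=[100, 20, 5], scale=[40, 10, 2], size=(300, 3))

X = StandardScaler().fit_transform(inventory)
labels = KMeans(n_clusters=3, n_init=10, random_state=1).fit_predict(X)
print("items per cluster:", np.bincount(labels))
```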

