Improving Communication System for Vehicle-to-Everything Networks by Using 5G Technology

2021 ◽  
Author(s):  
Tarik Adnan Almohamad ◽  
Muhammet Tahir Güneşer ◽  
Mohd Nazri Mahmud ◽  
Cihat Şeker

Next-generation wireless communication systems (5G and beyond) are evolving rapidly in contemporary life. These systems can offer vital solutions to many existing challenges across various aspects of our lives, ultimately ensuring stable communications. Such challenges are even greater when it comes to providing ubiquitous coverage and steady interconnection performance for fast-moving vehicles (e.g., trains or airplanes), where blind spots certainly exist. As an early initiative, the Third Generation Partnership Project (3GPP) proposed a specification for a Long Term Evolution (LTE)-based Vehicle-to-Everything (V2X) network in order to offer solid solutions for V2X interconnections. The term V2X comprises vehicle-to-vehicle (V2V), vehicle-to-network (V2N), vehicle-to-infrastructure (V2I), and vehicle-to-pedestrian (V2P) communications. Superior V2X communications have promising potential to improve efficiency, road safety, security, and the accessibility of infotainment services (any user-facing service inside a vehicle). In this chapter, the aforementioned topics are addressed. In addition, the chapter opens the door to investigating the role of wireless cooperative and automatic signal identification schemes in V2X networks, and sheds light on machine learning techniques (e.g., Support Vector Machines (SVMs) and Deep Neural Networks (DNNs)) as they meet the next generations of wireless networks.

2021 ◽  
Author(s):  
Manimegaai C T ◽  
Kali Muthu ◽  
Sabitha Gauni

Abstract These days, drivers take risks on the road and accidents happen in no time; lives are lost through small mistakes made while driving near restricted zones. To prevent these accidents and keep people safe, traffic departments install signboards, yet accidents still occur because drivers ignore them, so "Li-Fi technology" is used here to reduce the number of accidents. Transmission takes place with the help of LEDs (Light Emitting Diodes); text, audio, and video can all be transmitted over Li-Fi. The transmission is done by switching the light on and off, carrying data in the form of zeroes and ones. Compared to Wi-Fi, Li-Fi has several advantages; for example, the light is not harmful to the human body. Therefore, to help avoid accidents, we propose an intelligent, adaptable, and efficient model that utilizes machine learning techniques. The proposed system supports vehicle-to-vehicle and vehicle-to-infrastructure communication.
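The on/off transmission of zeroes and ones described above amounts to on-off keying (OOK). The sketch below is an illustrative model of that scheme only, not the authors' system; the message text is a made-up example:

```python
def encode_ook(text: str) -> list[int]:
    """Map each character to 8 bits; 1 = LED on, 0 = LED off."""
    bits = []
    for byte in text.encode("ascii"):
        bits.extend((byte >> i) & 1 for i in range(7, -1, -1))
    return bits

def decode_ook(bits: list[int]) -> str:
    """Group the received on/off samples back into bytes."""
    chars = []
    for i in range(0, len(bits), 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        chars.append(byte)
    return bytes(chars).decode("ascii")

# e.g. a warning message broadcast near a restricted zone
signal = encode_ook("STOP")
assert decode_ook(signal) == "STOP"
```

In a real Li-Fi link the bit stream would drive the LED at a rate far above what the eye can perceive; the receiver is a photodiode sampling the light level.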


2021 ◽  
Vol 2021 ◽  
pp. 1-8
Author(s):  
Shaker El-Sappagh ◽  
Tamer Abuhmed ◽  
Bader Alouffi ◽  
Radhya Sahal ◽  
Naglaa Abdelhade ◽  
...  

Early detection of Alzheimer’s disease (AD) progression is crucial for proper disease management. Most studies concentrate on neuroimaging data analysis of baseline visits only, ignoring the fact that AD is a chronic disease and patients’ data are naturally longitudinal. In addition, no studies examine the effect of dementia medicines on the behavior of the disease. In this paper, we propose a machine learning-based architecture for early progression detection of AD based on multimodal data combining AD drug and cognitive score data. We compare the performance of five popular machine learning techniques, including support vector machine, random forest, logistic regression, decision tree, and K-nearest neighbor, in predicting AD progression after 2.5 years. Extensive experiments are performed on an ADNI dataset of 1036 subjects. The cross-validation performance of most algorithms improved when the drug and cognitive score data were fused. The results indicate the important role of the drugs a patient takes in the progression of AD.
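A minimal sketch of the kind of cross-validated comparison the abstract describes, using scikit-learn. The five classifiers mirror the techniques named above, but the synthetic features are a stand-in for the (access-restricted) ADNI drug and cognitive-score data, and all hyperparameters are assumptions:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# synthetic stand-in for fused drug + cognitive-score features
X, y = make_classification(n_samples=1036, n_features=20, n_informative=8,
                           n_redundant=4, random_state=0)

models = {
    "support vector machine": SVC(),
    "random forest": RandomForestClassifier(random_state=0),
    "logistic regression": LogisticRegression(max_iter=1000),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "k-nearest neighbor": KNeighborsClassifier(),
}

# 5-fold cross-validated accuracy for each model
scores = {name: cross_val_score(m, X, y, cv=5).mean() for name, m in models.items()}
for name, acc in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {acc:.3f}")
```

In the study's setting, the interesting comparison is the same loop run twice, with and without the drug features fused in, to measure the improvement the abstract reports.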


Energies ◽  
2021 ◽  
Vol 14 (18) ◽  
pp. 5947
Author(s):  
William Mounter ◽  
Chris Ogwumike ◽  
Huda Dawood ◽  
Nashwan Dawood

Advances in metering technologies and emerging energy forecast strategies provide opportunities and challenges for predicting both short- and long-term building energy usage. Machine learning is an important energy prediction technique, and is significantly gaining research attention. The use of different machine learning techniques based on a rolling-horizon framework can help to reduce the prediction error over time. Due to the significant increases in error beyond short-term energy forecasts, most reported energy forecasts based on statistical and machine learning techniques are within the range of one week. The aim of this study was to investigate how facility managers can improve the accuracy of their building’s long-term energy forecasts. This paper presents an extensive study of machine learning and data processing techniques and how they can more accurately predict energy usage within different forecast ranges. The Clarendon building of Teesside University was selected as a case study to demonstrate the prediction of overall energy usage with different machine learning techniques such as polynomial regression (PR), support vector regression (SVR) and artificial neural networks (ANNs). This study further examined how preprocessing training data for prediction models can impact the overall accuracy, such as by segmenting the training data by building modes (active and dormant), or by days of the week (weekdays and weekends). The results presented in this paper illustrate a significant reduction in the mean absolute percentage error (MAPE) for segmented building (weekday and weekend) energy usage prediction when compared to unsegmented monthly predictions. A reduction in MAPE of 5.27%, 11.45%, and 12.03% was achieved with PR, SVR and ANN, respectively.
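The weekday/weekend segmentation idea can be illustrated as follows. This is a toy sketch with synthetic hourly load data, not the authors' models or the Clarendon building's metering data; the load shape, SVR hyperparameters, and train/test split are all assumptions:

```python
import numpy as np
from sklearn.metrics import mean_absolute_percentage_error
from sklearn.svm import SVR

rng = np.random.default_rng(0)
hours = np.arange(24 * 28)                    # four weeks of hourly readings
weekend = ((hours // 24) % 7) >= 5            # days 5 and 6 of each week
# synthetic usage: lower weekend baseline, daily cycle, metering noise
load = np.where(weekend, 30.0, 80.0) + 10 * np.sin(2 * np.pi * hours / 24)
load += rng.normal(0, 2, load.shape)

X = (hours % 24).reshape(-1, 1)               # hour of day as the sole feature
train = hours < 24 * 21                       # first three weeks for training

# one unsegmented model for all days
unseg = SVR(C=100).fit(X[train], load[train])
mape_unseg = mean_absolute_percentage_error(load[~train], unseg.predict(X[~train]))

# one model per segment (weekday vs. weekend)
mapes = []
for mask in (~weekend, weekend):
    seg = SVR(C=100).fit(X[train & mask], load[train & mask])
    test = ~train & mask
    mapes.append(mean_absolute_percentage_error(load[test], seg.predict(X[test])))
mape_seg = float(np.mean(mapes))

print(f"unsegmented MAPE: {mape_unseg:.1%}, segmented MAPE: {mape_seg:.1%}")
```

The unsegmented model is forced to reconcile two incompatible load profiles at every hour of the day, which is exactly the error source that segmentation removes.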


Author(s):  
William Mounter ◽  
Huda Dawood ◽  
Nashwan Dawood

Abstract Advances in metering technologies and machine learning methods provide both opportunities and challenges for predicting building energy usage in the short and long term. However, there are minimal studies comparing machine learning techniques for predicting building energy usage over a rolling horizon, as opposed to comparisons based upon a single forecast range; the majority of forecast ranges are within one week, due to the significant increase in error beyond short-term building energy prediction. The aim of this paper is to investigate how the accuracy of long-term building energy predictions can be improved, as part of a larger study into which machine learning techniques predict more accurately within different forecast ranges. In this case study, the ‘Clarendon building’ of Teesside University was selected, using its Building Management System (BMS) data to predict the building’s overall energy usage with Support Vector Regression, and examining how altering the data used to train the models impacts their overall accuracy, such as by segmenting the model by building modes (active and dormant) or by days of the week (weekdays and weekends). It was observed that modelling weekday and weekend energy usage separately led to a reduction of 11% MAPE on average compared with unsegmented predictions.


2021 ◽  
pp. 1-11
Author(s):  
Stephanie M. Helman ◽  
Elizabeth A. Herrup ◽  
Adam B. Christopher ◽  
Salah S. Al-Zaiti

Abstract Machine learning uses historical data to make predictions about new data. It has been frequently applied in healthcare to optimise diagnostic classification through discovery of hidden patterns in data that may not be obvious to clinicians. Congenital Heart Defect (CHD) machine learning research entails one of the most promising clinical applications, in which timely and accurate diagnosis is essential. The objective of this scoping review is to summarise the application and clinical utility of machine learning techniques used in paediatric cardiology research, specifically focusing on approaches aiming to optimise diagnosis and assessment of underlying CHD. Out of 50 full-text articles identified between 2015 and 2021, 40% focused on optimising the diagnosis and assessment of CHD. Deep learning and support vector machine were the most commonly used algorithms, accounting for an overall diagnostic accuracy > 0.80. Clinical applications primarily focused on the classification of auscultatory heart sounds, transthoracic echocardiograms, and cardiac MRIs. The range of these applications and directions of future research are discussed in this scoping review.


2020 ◽  
Vol 12 (2) ◽  
pp. 84-99
Author(s):  
Li-Pang Chen

In this paper, we investigate analysis and prediction of time-dependent data, focusing on four different stocks selected from the Yahoo Finance historical database. To build models and predict future stock prices, we consider three different machine learning techniques: Long Short-Term Memory (LSTM), Convolutional Neural Networks (CNN), and Support Vector Regression (SVR). By treating close price, open price, daily low, daily high, adjusted close price, and volume of trades as predictors in the machine learning methods, we show that prediction accuracy is improved.
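A toy sketch of one of the three techniques (SVR) using the six predictors the abstract lists. The paper's Yahoo Finance data and exact setup are not reproduced here; a synthetic mean-reverting price series and the pipeline hyperparameters are my own assumptions:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
n = 500
close = np.empty(n)
close[0] = 100.0
for t in range(1, n):                       # mean-reverting synthetic prices
    close[t] = 100 + 0.9 * (close[t - 1] - 100) + rng.normal(0, 1)
open_ = close + rng.normal(0, 0.3, n)
high = np.maximum(open_, close) + np.abs(rng.normal(0, 0.3, n))
low = np.minimum(open_, close) - np.abs(rng.normal(0, 0.3, n))
adj_close = close                           # no splits or dividends here
volume = rng.integers(1_000, 5_000, n).astype(float)

# today's six predictors -> tomorrow's close
X = np.column_stack([open_, high, low, close, adj_close, volume])[:-1]
y = close[1:]

split = 400                                 # chronological train/test split
model = make_pipeline(StandardScaler(), SVR(C=10.0))
model.fit(X[:split], y[:split])
mae = float(np.abs(model.predict(X[split:]) - y[split:]).mean())
print(f"out-of-sample MAE: {mae:.2f}")
```

Note the chronological split: shuffling time-dependent data before splitting would leak future information into training.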


Author(s):  
Anantvir Singh Romana

Accurate diagnostic detection of disease in a patient is critical and may alter subsequent treatment and increase the chances of survival. Machine learning techniques have been instrumental in disease detection and are currently used in various classification problems due to their accurate prediction performance. Different techniques may provide different accuracies, and it is therefore imperative to use the method that provides the best results. This research provides a comparative analysis of Support Vector Machine, Naïve Bayes, J48 Decision Tree, and neural network classifiers on breast cancer and diabetes datasets.
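A hedged sketch of this kind of comparison, using scikit-learn's bundled breast-cancer dataset. J48 (Weka's C4.5 implementation) has no exact scikit-learn counterpart, so a CART decision tree stands in for it; the diabetes dataset and the paper's actual protocol are not reproduced:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# SVM and the neural network are scale-sensitive, so they get a scaler
models = {
    "SVM": make_pipeline(StandardScaler(), SVC()),
    "Naive Bayes": GaussianNB(),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "neural network": make_pipeline(StandardScaler(),
                                    MLPClassifier(max_iter=1000, random_state=0)),
}

results = {name: cross_val_score(m, X, y, cv=5).mean() for name, m in models.items()}
for name, acc in sorted(results.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {acc:.3f}")
```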


Sensors ◽  
2020 ◽  
Vol 21 (1) ◽  
pp. 194
Author(s):  
Sarah Gonzalez ◽  
Paul Stegall ◽  
Harvey Edwards ◽  
Leia Stirling ◽  
Ho Chit Siu

The field of human activity recognition (HAR) often utilizes wearable sensors and machine learning techniques in order to identify the actions of the subject. This paper considers the activity recognition of walking and running using a support vector machine (SVM) trained on principal components derived from wearable sensor data. An ablation analysis is performed in order to select the subset of sensors that yields the highest classification accuracy. The paper also compares principal components across trials to inform the similarity of the trials. Five subjects were instructed to perform standing, walking, running, and sprinting on a self-paced treadmill, and the data were recorded using surface electromyography sensors (sEMGs), inertial measurement units (IMUs), and force plates. When all of the sensors were included, the SVM had over 90% classification accuracy using only the first three principal components of the data with the classes of stand, walk, and run/sprint (combined run and sprint class). It was found that sensors placed only on the lower leg produced higher accuracies than sensors placed on the upper leg. There was a small decrease in accuracy when the force plates were ablated, but the difference may not be operationally relevant. Using only accelerometers without sEMGs was shown to decrease the accuracy of the SVM.
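The PCA-then-SVM pipeline the abstract describes can be sketched as below. Synthetic data stands in for the sEMG/IMU/force-plate features, and the dataset sizes and separability are assumptions, not the study's numbers:

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# three classes standing in for stand, walk, and run/sprint
X, y = make_classification(n_samples=600, n_features=40, n_informative=6,
                           n_classes=3, n_clusters_per_class=1,
                           class_sep=2.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# project onto the first three principal components, then classify
clf = make_pipeline(PCA(n_components=3), SVC())
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"accuracy on the first three principal components: {acc:.2f}")
```

The sensor-ablation analysis in the paper would correspond to dropping feature columns (one sensor at a time) before the pipeline and re-measuring accuracy.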


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Tomoaki Mameno ◽  
Masahiro Wada ◽  
Kazunori Nozaki ◽  
Toshihito Takahashi ◽  
Yoshitaka Tsujioka ◽  
...  

Abstract The purpose of this retrospective cohort study was to create a model for predicting the onset of peri-implantitis using machine learning methods and to clarify interactions between risk indicators. The study evaluated 254 implants, 127 with and 127 without peri-implantitis, from among 1408 implants with at least 4 years in function. Demographic data and parameters known to be risk factors for the development of peri-implantitis were analyzed with three models: logistic regression, support vector machines, and random forests (RF). RF had the highest performance in predicting the onset of peri-implantitis (AUC: 0.71, accuracy: 0.70, precision: 0.72, recall: 0.66, and F1-score: 0.69). The factor with the most influence on prediction was implant functional time, followed by oral hygiene. In addition, a PCR of more than 50% to 60%, smoking more than 3 cigarettes/day, a KMW of less than 2 mm, and the presence of fewer than two occlusal supports tended to be associated with an increased risk of peri-implantitis. Moreover, these risk indicators were not independent and had complex effects on each other. The results of this study suggest that peri-implantitis onset was predicted in 70% of cases by RF, which can handle nonlinear relational data with complex interactions.
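A minimal sketch of a random-forest classifier evaluated with the same metrics the abstract reports. Synthetic data stands in for the 254-implant cohort; the feature count and split are my assumptions, and the resulting numbers are not the study's:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import (accuracy_score, f1_score, precision_score,
                             recall_score, roc_auc_score)
from sklearn.model_selection import train_test_split

# synthetic balanced cohort standing in for the 127/127 implant data
X, y = make_classification(n_samples=254, n_features=12, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

rf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
pred = rf.predict(X_te)

metrics = {
    "AUC": roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1]),
    "accuracy": accuracy_score(y_te, pred),
    "precision": precision_score(y_te, pred),
    "recall": recall_score(y_te, pred),
    "f1": f1_score(y_te, pred),
}
for k, v in metrics.items():
    print(f"{k}: {v:.2f}")
```

The "most influential factor" finding in the abstract corresponds to inspecting `rf.feature_importances_` on the fitted model.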


2014 ◽  
Vol 28 (2) ◽  
pp. 3-28 ◽  
Author(s):  
Hal R. Varian

Computers are now involved in many economic transactions and can capture data associated with these transactions, which can then be manipulated and analyzed. Conventional statistical and econometric techniques such as regression often work well, but there are issues unique to big datasets that may require different tools. First, the sheer size of the data involved may require more powerful data manipulation tools. Second, we may have more potential predictors than appropriate for estimation, so we need to do some kind of variable selection. Third, large datasets may allow for more flexible relationships than simple linear models. Machine learning techniques such as decision trees, support vector machines, neural nets, deep learning, and so on may allow for more effective ways to model complex relationships. In this essay, I will describe a few of these tools for manipulating and analyzing big data. I believe that these methods have a lot to offer and should be more widely known and used by economists.
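The variable-selection point above can be made concrete with an L1-penalised regression (lasso), one standard tool for the "more predictors than appropriate for estimation" problem. This is my illustration, not Varian's; the data are synthetic and the penalty strength is an assumption:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 100, 50                      # 50 candidate predictors, only 3 real
X = rng.normal(size=(n, p))
y = 3 * X[:, 0] - 2 * X[:, 1] + X[:, 2] + rng.normal(0, 0.5, n)

# the L1 penalty drives most coefficients exactly to zero
lasso = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(lasso.coef_)
print("predictors kept:", selected.tolist())
```

Ordinary least squares would instead assign a nonzero coefficient to every one of the 50 candidates, overfitting the noise.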

