Attacks to Automatous Vehicles: A Deep Learning Algorithm for Cybersecurity

Sensors ◽  
2022 ◽  
Vol 22 (1) ◽  
pp. 360
Author(s):  
Theyazn H. H. Aldhyani ◽  
Hasan Alkahtani

Rapid technological development has drastically changed the automotive industry. Network communication has improved, helping vehicles transition from completely machine-controlled to software-controlled technologies. The autonomous vehicle network is controlled by the controller area network (CAN) bus protocol. Nevertheless, the autonomous vehicle network still has cybersecurity issues and weaknesses, because the complexity of its data and traffic behaviors facilitates unauthorized intrusion into the CAN bus and several types of attacks. Therefore, developing systems that rapidly detect message attacks on the CAN bus is one of the biggest challenges. This study presents a high-performance system based on an artificial intelligence approach that protects the vehicle network from cyber threats. The system secures the autonomous vehicle from intrusions by using deep learning approaches. The proposed security system was verified using a real automotive vehicle network dataset, including spoofing, flood, and replay attacks as well as benign packets. Preprocessing was applied to convert the categorical data into numerical data. This dataset was processed using a convolutional neural network (CNN) and a hybrid network combining CNN and long short-term memory (CNN-LSTM) to identify attack messages. The results revealed that the models achieved high performance, as evaluated by the metrics of precision, recall, F1 score, and accuracy. The proposed system achieved high accuracy (97.30%). Along with the empirical demonstration, the proposed system improved detection and classification accuracy compared with existing systems and was shown to have superior performance for real-time CAN bus security.
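As an illustration of the kind of hybrid architecture described above, the following is a minimal sketch of a CNN-LSTM classifier for CAN traffic windows, assuming frames have already been preprocessed into fixed-length numeric feature windows; the layer sizes, window length, and class set are illustrative and not the authors' published configuration.

```python
# Minimal CNN-LSTM intrusion-detection sketch (illustrative; not the authors' exact model).
# Assumes CAN traffic has been preprocessed into windows of `timesteps` frames,
# each frame encoded as `n_features` numeric values (ID, DLC, data bytes, etc.).
import numpy as np
from tensorflow.keras import layers, models

timesteps, n_features, n_classes = 32, 10, 4  # benign, spoofing, flood, replay

model = models.Sequential([
    layers.Input(shape=(timesteps, n_features)),
    layers.Conv1D(64, kernel_size=3, padding="same", activation="relu"),  # local frame patterns
    layers.MaxPooling1D(pool_size=2),
    layers.LSTM(64),                       # temporal dependencies across frames
    layers.Dense(64, activation="relu"),
    layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Dummy data only to show the expected shapes; replace with the real CAN dataset.
X = np.random.rand(256, timesteps, n_features).astype("float32")
y = np.random.randint(0, n_classes, size=256)
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```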

2020 ◽  
Author(s):  
Takemasa Miyoshi ◽  
Takumi Honda ◽  
Shigenori Otsuka ◽  
Arata Amemiya ◽  
Yasumitsu Maejima ◽  
...  

Japan’s Big Data Assimilation (BDA) project started in October 2013 and ended its 5.5-year period in March 2019. The direct follow-on project was accepted and started in April 2019 under the Japan Science and Technology Agency (JST) AIP (Advanced Intelligence Project) Acceleration Research, with an emphasis on the connection with AI technologies, in particular the integration of data assimilation (DA) and AI with high-performance computing (HPC). The BDA project aimed to take full advantage of “big data” from advanced sensors such as the phased array weather radar (PAWR) and the Himawari-8 geostationary satellite, which provide two orders of magnitude more data than previous sensors. We achieved successful case studies with a newly developed 30-second-update, 100-m-mesh numerical weather prediction (NWP) system based on RIKEN’s SCALE model and the local ensemble transform Kalman filter (LETKF) to assimilate PAWR observations in Osaka and Kobe. We have been actively developing the workflow for real-time weather forecasting in Tokyo in summer 2020. In addition, we developed two precipitation nowcasting systems using the 30-second PAWR data: one based on optical flow, the other based on deep learning. We chose the convolutional long short-term memory (Conv-LSTM) network as the deep learning algorithm and found it effective for precipitation nowcasting. The use of Conv-LSTM would lead to an integration of DA and AI with HPC. This presentation will include an overview of the BDA project toward DA-AI-HPC integration under the new AIP Acceleration Research scheme, as well as recent progress of the project.
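For readers unfamiliar with Conv-LSTM nowcasting, the following is a hedged sketch of a Keras ConvLSTM2D model mapping a sequence of radar reflectivity grids to the next grid; the grid size, sequence length, and network depth are assumptions, not the BDA project's actual system.

```python
# Illustrative Conv-LSTM nowcasting sketch (not the BDA project's actual system).
# Input: a sequence of 2-D radar reflectivity grids; output: the next grid.
import numpy as np
from tensorflow.keras import layers, models

seq_len, H, W = 12, 64, 64   # e.g. 12 past 30-second PAWR frames on a 64x64 grid

model = models.Sequential([
    layers.Input(shape=(seq_len, H, W, 1)),
    layers.ConvLSTM2D(32, kernel_size=(3, 3), padding="same", return_sequences=True),
    layers.BatchNormalization(),
    layers.ConvLSTM2D(32, kernel_size=(3, 3), padding="same", return_sequences=False),
    layers.Conv2D(1, kernel_size=(1, 1), activation="sigmoid", padding="same"),  # next frame
])
model.compile(optimizer="adam", loss="mse")

# Shape check with random data standing in for normalized reflectivity fields.
X = np.random.rand(8, seq_len, H, W, 1).astype("float32")
y = np.random.rand(8, H, W, 1).astype("float32")
model.fit(X, y, epochs=1, batch_size=2, verbose=0)
```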


2020 ◽  
Vol 34 (4) ◽  
pp. 437-444
Author(s):  
Lingyan Ou ◽  
Ling Chen

Corporate internet reporting (CIR) offers advantages such as the strong timeliness, large volume, and wide coverage of financial information. However, CIR, like any other online information, faces various risks. With the aid of increasingly sophisticated artificial intelligence (AI) technology, this paper proposes an improved deep learning algorithm for the prediction of CIR risks, aiming to improve the accuracy of CIR risk prediction. After building a reasonable evaluation index system (EIS) for CIR risks, the data involved in risk rating and in the prediction of the risk transmission effect (RTE) were subjected to structured feature extraction and time series construction. Next, a combinatory CIR risk prediction model was established by combining the autoregressive moving average (ARMA) model with long short-term memory (LSTM): the former is good at depicting linear series, while the latter excels at describing nonlinear series. Experimental results demonstrate the effectiveness of the ARMA-LSTM model. The research findings provide a useful reference for applying AI technology to risk prediction in other areas.
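A minimal sketch of such an ARMA-LSTM combination is shown below, assuming the linear component is fitted with statsmodels and the LSTM models the residual series; the input series, model orders, and window length are placeholders rather than the paper's configuration.

```python
# Hedged sketch of an ARMA + LSTM combination (the paper's exact pipeline may differ):
# ARMA captures the linear structure of the risk series, an LSTM models the residuals,
# and the two one-step-ahead forecasts are summed.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from tensorflow.keras import layers, models

series = np.sin(np.linspace(0, 20, 300)) + 0.1 * np.random.randn(300)  # stand-in risk index

arma = ARIMA(series, order=(2, 0, 1)).fit()   # ARMA(2,1) expressed as ARIMA with d=0
resid = arma.resid                            # nonlinear remainder handed to the LSTM

def make_windows(x, lag=10):
    X = np.stack([x[i:i + lag] for i in range(len(x) - lag)])
    y = x[lag:]
    return X[..., None].astype("float32"), y.astype("float32")

X, y = make_windows(resid)
lstm = models.Sequential([
    layers.Input(shape=(X.shape[1], 1)),
    layers.LSTM(32),
    layers.Dense(1),
])
lstm.compile(optimizer="adam", loss="mse")
lstm.fit(X, y, epochs=5, batch_size=16, verbose=0)

# Combined forecast: linear ARMA term plus LSTM residual correction.
next_linear = arma.forecast(steps=1)[0]
next_resid = lstm.predict(resid[-10:].reshape(1, 10, 1), verbose=0)[0, 0]
print("combined forecast:", next_linear + next_resid)
```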


2021 ◽  
Vol 25 (11) ◽  
pp. 6041-6066
Author(s):  
Jiancong Chen ◽  
Baptiste Dafflon ◽  
Anh Phuong Tran ◽  
Nicola Falco ◽  
Susan S. Hubbard

Abstract. Climate change is reshaping vulnerable ecosystems, leading to uncertain effects on ecosystem dynamics, including evapotranspiration (ET) and ecosystem respiration (Reco). However, accurate estimation of ET and Reco remains challenging in sparsely monitored watersheds, where data and field instrumentation are limited. In this study, we developed a hybrid predictive modeling approach (HPM) that integrates eddy covariance measurements, physically based model simulation results, meteorological forcings, and remote-sensing datasets to estimate ET and Reco at high space–time resolution. HPM relies on a long short-term memory (LSTM) deep learning algorithm and requires only air temperature, precipitation, radiation, normalized difference vegetation index (NDVI), and soil temperature (when available) as input variables. We tested and validated HPM estimation results in different climate regions and developed four use cases to demonstrate the applicability and variability of HPM at various FLUXNET sites and Rocky Mountain SNOTEL sites in Western North America. To test the limitations and performance of the HPM approach in mountainous watersheds, an expanded use case focused on the East River Watershed, Colorado, USA. The results indicate that HPM is capable of identifying complicated interactions among meteorological forcings, ET, and Reco variables, as well as providing reliable estimates of ET and Reco across relevant spatiotemporal scales, even in challenging mountainous systems. The study documents that HPM increases our capability to estimate ET and Reco and enhances process understanding in sparsely monitored watersheds.
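The following is a minimal sketch of an LSTM regressor in the spirit of HPM, mapping windows of the listed forcings to ET and Reco; the window length, layer sizes, and dummy data are assumptions for illustration only.

```python
# Minimal sketch of an LSTM regressor in the spirit of HPM (sizes and inputs illustrative).
# Each sample is a window of daily forcings; the target is ET and Reco on the final day.
import numpy as np
from tensorflow.keras import layers, models

window, n_inputs = 30, 5   # air temperature, precipitation, radiation, NDVI, soil temperature
model = models.Sequential([
    layers.Input(shape=(window, n_inputs)),
    layers.LSTM(64),
    layers.Dense(32, activation="relu"),
    layers.Dense(2),                      # [ET, Reco]
])
model.compile(optimizer="adam", loss="mse")

# Random stand-ins for normalized forcings and flux-tower targets.
X = np.random.rand(128, window, n_inputs).astype("float32")
y = np.random.rand(128, 2).astype("float32")
model.fit(X, y, epochs=2, batch_size=16, verbose=0)
```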


2014 ◽  
Vol 494-495 ◽  
pp. 1373-1376
Author(s):  
Yan Hui Cheng

This paper expands the CAN bus interface on an ARM platform so that the embedded CNC system can be networked, combining the respective advantages of fieldbus technology and embedded technology to build a local area network model for the embedded NC system. The CNC system designed in this paper has the following characteristics: high integration, a flexible structure, good expansibility, and a high performance-to-price ratio.
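As a hedged illustration of exchanging frames on such a CAN-networked NC system from a host program, the sketch below uses the python-can package over a Linux SocketCAN interface (e.g., can0 on an ARM board); the message ID and payload layout are hypothetical, not taken from the paper.

```python
# Hedged sketch of exchanging CNC status frames over a CAN interface from Python,
# assuming a Linux SocketCAN device (e.g. can0 on an ARM board) and the python-can package.
import can

bus = can.interface.Bus(channel="can0", bustype="socketcan")

# Send a hypothetical "spindle speed" status frame (ID and payload layout are illustrative).
msg = can.Message(arbitration_id=0x101,
                  data=[0x03, 0xE8, 0x00, 0x00],   # 0x03E8 = 1000 rpm, padded to 4 bytes
                  is_extended_id=False)
bus.send(msg)

# Receive frames from other nodes on the embedded NC network.
frame = bus.recv(timeout=1.0)
if frame is not None:
    print(f"ID=0x{frame.arbitration_id:X} data={frame.data.hex()}")

bus.shutdown()
```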


2021 ◽  
Author(s):  
Xuting Duan ◽  
Huiwen Yan ◽  
Jianshan Zhou

Abstract. Because of the rapid development of automobile intelligence and networking, cyber attackers can invade the vehicle network via wired and wireless interfaces, such as physical interfaces, short-range wireless interfaces, and long-range wireless interfaces. Such interference with regular driving immediately jeopardises the drivers’ and passengers’ personal and property safety. To protect the vehicle CAN (controller area network) bus, we propose an anomaly detection method that calculates the information entropy of the messages within a sliding window. It detects periodic attacks on the vehicle CAN bus, such as replay attacks and flooding attacks. First, we calculate the number of messages per window according to the CAN bus baud rate, the number of bits in a single frame, and the time allotted for computing the information entropy within the window. Second, we compute the window information entropy of regular traffic and determine the normal threshold range by setting a threshold coefficient. Finally, we calculate the information entropy of the data under test, determine whether it is greater or less than the threshold, and detect the anomaly. The experiment uses CANoe software to simulate the vehicle network: the body-frame CAN bus network of an automobile body bench serves as the regular network, attack nodes are simulated to attack the regular network periodically, message data are collected, and the proposed detection method is verified. The results show that the proposed detection method has low false-negative and false-positive rates for attack scenarios such as replay attacks and flood attacks across different attack cycles.
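A minimal sketch of the window-entropy detection idea is given below; the baud rate, frame length, window duration, and threshold coefficient are illustrative values, not the paper's settings.

```python
# Hedged sketch of window-entropy anomaly detection for CAN traffic (parameter values illustrative).
# The window size in messages is derived from the bus baud rate, the bits per frame,
# and the desired window duration; entropy is computed over the CAN IDs in each window.
import math
from collections import Counter

def messages_per_window(baud_rate=500_000, bits_per_frame=108, window_seconds=0.5):
    """Approximate number of frames a fully loaded bus delivers in one window."""
    return int(baud_rate / bits_per_frame * window_seconds)

def window_entropy(can_ids):
    """Shannon entropy (bits) of the CAN ID distribution within one window."""
    counts = Counter(can_ids)
    total = len(can_ids)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def detect(windows, normal_entropy, threshold_coeff=0.2):
    """Flag windows whose entropy leaves the [normal*(1-k), normal*(1+k)] band."""
    lo, hi = normal_entropy * (1 - threshold_coeff), normal_entropy * (1 + threshold_coeff)
    return [not (lo <= window_entropy(w) <= hi) for w in windows]

# Example: replaying/flooding ID 0x100 collapses the entropy of the attacked window.
normal = [0x100, 0x200, 0x300, 0x400] * 25            # balanced benign traffic
attacked = [0x100] * 80 + [0x200, 0x300, 0x400] * 7   # one ID dominates
baseline = window_entropy(normal)
print(detect([normal, attacked], baseline))            # -> [False, True]
```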


10.6036/10007 ◽  
2021 ◽  
Vol 96 (5) ◽  
pp. 528-533
Author(s):  
XAVIER LARRIVA NOVO ◽  
MARIO VEGA BARBAS ◽  
VICTOR VILLAGRA ◽  
JULIO BERROCAL

Cybersecurity has gained prominence in recent years with the aim of protecting information systems. Different methods, techniques, and tools have been used to exploit the existing vulnerabilities in these systems. Therefore, it is essential to develop and improve new technologies, as well as intrusion detection systems, that make it possible to detect potential threats. However, the use of these technologies requires highly qualified cybersecurity personnel to analyze the results and reduce the large number of false positives that these technologies present in their results. This generates the need to research and develop new high-performance cybersecurity systems that allow these results to be analyzed and resolved efficiently. This research presents the application of machine learning techniques to classify real traffic in order to identify possible attacks. The study was carried out using machine learning tools applying deep learning algorithms such as the multilayer perceptron and long short-term memory. Additionally, this document presents a comparison between the results obtained with the aforementioned algorithms and with non-deep-learning algorithms such as random forest and decision tree. Finally, the results obtained are presented, showing that the long short-term memory algorithm provides the best results in terms of precision and logarithmic loss.
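Such a comparison could look roughly like the following sketch, which trains a random forest baseline and a small LSTM on the same (randomly generated stand-in) traffic features and reports precision and logarithmic loss; the features, shapes, and hyperparameters are assumptions, not the study's setup.

```python
# Illustrative comparison of a non-deep baseline and an LSTM on traffic features
# (random data stands in for the labelled flows used in the study).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import log_loss, precision_score
from sklearn.model_selection import train_test_split
from tensorflow.keras import layers, models

n_samples, seq_len, n_feat = 1000, 10, 8
X_seq = np.random.rand(n_samples, seq_len, n_feat).astype("float32")
y = np.random.randint(0, 2, size=n_samples)            # benign vs attack
X_flat = X_seq.reshape(n_samples, -1)                   # flattened view for tree models

Xf_tr, Xf_te, Xs_tr, Xs_te, y_tr, y_te = train_test_split(
    X_flat, X_seq, y, test_size=0.3, random_state=0)

rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xf_tr, y_tr)
rf_prob = rf.predict_proba(Xf_te)[:, 1]

lstm = models.Sequential([
    layers.Input(shape=(seq_len, n_feat)),
    layers.LSTM(32),
    layers.Dense(1, activation="sigmoid"),
])
lstm.compile(optimizer="adam", loss="binary_crossentropy")
lstm.fit(Xs_tr, y_tr, epochs=3, batch_size=32, verbose=0)
lstm_prob = lstm.predict(Xs_te, verbose=0).ravel()

for name, prob in [("random forest", rf_prob), ("LSTM", lstm_prob)]:
    print(name,
          "precision:", precision_score(y_te, prob > 0.5),
          "log loss:", log_loss(y_te, prob))
```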


PeerJ ◽  
2021 ◽  
Vol 9 ◽  
pp. e11262
Author(s):  
Guobin Li ◽  
Xiuquan Du ◽  
Xinlu Li ◽  
Le Zou ◽  
Guanhong Zhang ◽  
...  

DNA-binding proteins (DBPs) play pivotal roles in many biological functions such as alternative splicing, RNA editing, and methylation. Many traditional machine learning (ML) methods and deep learning (DL) methods have been proposed to predict DBPs. However, these methods either rely on manual feature extraction or fail to capture long-term dependencies in the DNA sequence. In this paper, we propose a method, called PDBP-Fusion, to identify DBPs based on the fusion of local features and long-term dependencies from primary sequences alone. We utilize a convolutional neural network (CNN) to learn local features and a bidirectional long short-term memory (Bi-LSTM) network to capture critical long-term dependencies in context. Moreover, feature extraction, model training, and model prediction are performed simultaneously. The PDBP-Fusion approach predicts DBPs with 86.45% sensitivity, 79.13% specificity, 82.81% accuracy, and 0.661 MCC on the PDB14189 benchmark dataset. The MCC of our proposed method is at least 9.1% higher than that of other advanced prediction models. Moreover, PDBP-Fusion also achieves superior performance and robustness on the independent PDB2272 dataset. This demonstrates that PDBP-Fusion can predict DBPs from sequences accurately and effectively; the online server is available at http://119.45.144.26:8080/PDBP-Fusion/.
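A hedged sketch of a CNN + Bi-LSTM fusion classifier of this kind is shown below; the sequence encoding, layer sizes, and padding length are illustrative and not the published PDBP-Fusion configuration.

```python
# Sketch of a CNN + Bi-LSTM fusion classifier in the style of PDBP-Fusion
# (layer sizes and sequence handling are illustrative, not the published configuration).
import numpy as np
from tensorflow.keras import layers, models

max_len, n_amino = 600, 21   # padded sequence length; 20 amino acids + padding token

model = models.Sequential([
    layers.Input(shape=(max_len,)),
    layers.Embedding(input_dim=n_amino, output_dim=32),
    layers.Conv1D(64, kernel_size=7, padding="same", activation="relu"),  # local motifs
    layers.MaxPooling1D(pool_size=2),
    layers.Bidirectional(layers.LSTM(64)),   # long-range dependencies in both directions
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(1, activation="sigmoid"),   # DBP vs non-DBP
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Integer-encoded dummy sequences only to confirm shapes.
X = np.random.randint(1, n_amino, size=(64, max_len))
y = np.random.randint(0, 2, size=64)
model.fit(X, y, epochs=1, batch_size=16, verbose=0)
```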


Author(s):  
Luotong Wang ◽  
Li Qu ◽  
Longshu Yang ◽  
Yiying Wang ◽  
Huaiqiu Zhu

Abstract. Nanopore sequencing is regarded as one of the most promising third-generation sequencing (TGS) technologies. Since 2014, Oxford Nanopore Technologies (ONT) has developed a series of devices based on nanopore sequencing to produce very long reads, with an expected impact on genomics. However, nanopore sequencing reads are susceptible to a fairly high error rate owing to the difficulty of identifying the DNA bases from the complex electrical signals. Although several basecalling tools have been developed for nanopore sequencing over the past years, it is still challenging to correct the sequences after the basecalling procedure. In this study, we developed an open-source DNA basecalling reviser, NanoReviser, based on a deep learning algorithm, to correct the basecalling errors introduced by the default basecallers. In our module, we re-segmented the raw electrical signals based on the basecalled sequences provided by the default basecallers. By employing convolutional neural networks (CNNs) and bidirectional long short-term memory (Bi-LSTM) networks, we took advantage of information from both the raw electrical signals and the basecalled sequences. Our results showed that NanoReviser, as a post-basecalling reviser, significantly improves basecalling quality. After being trained on standard ONT sequencing reads from public E. coli and human NA12878 datasets, NanoReviser reduced the sequencing error rate by over 5% for both the E. coli dataset and the human dataset. The performance of NanoReviser was found to be better than that of all current basecalling tools. Furthermore, we analyzed the modified bases of the E. coli dataset and added the methylation information to train our module. With the methylation annotation, NanoReviser reduced the error rate by 7% for the E. coli dataset and specifically by over 10% for sequence regions rich in methylated bases. To the best of our knowledge, NanoReviser is the first post-basecalling processing tool to accurately correct nanopore sequences without the time-consuming procedure of building a consensus sequence. The NanoReviser package is freely available at https://github.com/pkubioinformatics/NanoReviser.
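As a rough illustration of pairing CNN and Bi-LSTM layers over re-segmented raw current, the sketch below predicts a revised base for each event segment; the architecture, shapes, and loss are simplifications and assumptions, not NanoReviser's actual model (see the GitHub repository for that).

```python
# Hedged sketch of a CNN + Bi-LSTM base reviser on re-segmented raw current
# (architecture and shapes are illustrative; see the NanoReviser repository for the real model).
import numpy as np
from tensorflow.keras import layers, models

seg_len, n_segments, n_bases = 20, 300, 5   # samples per event, events per read chunk, A/C/G/T + gap

inputs = layers.Input(shape=(n_segments, seg_len))          # one current segment per basecalled event
x = layers.TimeDistributed(layers.Dense(32, activation="relu"))(inputs)
x = layers.Conv1D(64, kernel_size=3, padding="same", activation="relu")(x)        # local signal features
x = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(x)               # context in both directions
outputs = layers.TimeDistributed(layers.Dense(n_bases, activation="softmax"))(x)  # revised base per event

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# Random stand-ins for normalized current segments and per-event base labels.
X = np.random.randn(16, n_segments, seg_len).astype("float32")
y = np.random.randint(0, n_bases, size=(16, n_segments))
model.fit(X, y, epochs=1, batch_size=4, verbose=0)
```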

