A Mixed Deep Recurrent Neural Network for MEMS Gyroscope Noise Suppressing

Electronics ◽  
2019 ◽  
Vol 8 (2) ◽  
pp. 181 ◽  
Author(s):  
Changhui Jiang ◽  
Yuwei Chen ◽  
Shuai Chen ◽  
Yuming Bo ◽  
Wei Li ◽  
...  

Currently, positioning, navigation, and timing information is becoming more and more vital for both civil and military applications. Integration of the global navigation satellite system and the inertial navigation system is the most popular solution for positioning various carriers or vehicles. As is well known, global navigation satellite system positioning accuracy degrades in signal-challenged environments. Under this condition, the integrated system falls back to a standalone inertial navigation system to output navigation solutions. However, without external aiding, the positioning errors of the inertial navigation system diverge quickly due to the noise contained in the raw data of the inertial measurement unit. In particular, a micro-electro-mechanical system (MEMS) inertial measurement unit exhibits more complex errors due to its manufacturing technology. One effective approach to improving the navigation accuracy of inertial navigation systems is to model the raw signal noise and suppress it. Commonly, an inertial measurement unit is composed of three gyroscopes and three accelerometers; among them, the gyroscopes play an important role in the accuracy of the inertial navigation system’s navigation solutions. Motivated by this problem, in this paper, an advanced deep recurrent neural network was employed and evaluated for noise modeling of a MEMS gyroscope. Specifically, a deep long short term memory recurrent neural network and a deep gated recurrent unit recurrent neural network were combined to construct a two-layer recurrent neural network for noise modeling. In this method, the gyroscope data were treated as a time series, and a real dataset from a MEMS inertial measurement unit was employed in the experiments. The results showed that, compared to the two-layer long short term memory network, the three-axis attitude errors of the mixed long short term memory–gated recurrent unit model decreased by 7.8%, 20.0%, and 5.1%. When compared with the two-layer gated recurrent unit network, the proposed method showed improvements of 15.9%, 14.3%, and 10.5%. These results support a positive conclusion on the performance of the designed method; specifically, the mixed deep recurrent neural network outperformed both the two-layer gated recurrent unit and the two-layer long short term memory recurrent neural networks.
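
As a concrete illustration of the mixed architecture described above, the sketch below stacks an LSTM layer on a GRU layer to predict the next gyroscope sample from a window of past samples. The hidden sizes, window length, and one-step-ahead regression target are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a two-layer mixed LSTM-GRU model for one-step-ahead
# prediction of a gyroscope signal treated as a time series.
# Hidden sizes, window length, and the training target are assumptions.
import torch
import torch.nn as nn

class MixedLSTMGRU(nn.Module):
    def __init__(self, input_size=1, hidden_size=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)  # first layer: LSTM
        self.gru = nn.GRU(hidden_size, hidden_size, batch_first=True)   # second layer: GRU
        self.head = nn.Linear(hidden_size, input_size)                  # regression output

    def forward(self, x):              # x: (batch, seq_len, 1) raw gyro samples
        h, _ = self.lstm(x)
        h, _ = self.gru(h)
        return self.head(h[:, -1, :])  # predict the next sample from the last step

# Usage sketch: sliding windows of raw gyroscope samples -> next sample.
model = MixedLSTMGRU()
window = torch.randn(32, 100, 1)       # 32 windows of 100 samples (placeholder data)
prediction = model(window)             # shape (32, 1)
loss = nn.MSELoss()(prediction, torch.randn(32, 1))
```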

2021 ◽  
Vol 2021 ◽  
pp. 1-13
Author(s):  
Yiming Zhang ◽  
Hang Zhao ◽  
Jinyi Ma ◽  
Yunmei Zhao ◽  
Yiqun Dong ◽  
...  

A new fault detection scheme for aircraft Inertial Measurement Unit (IMU) sensors is developed in this paper. This scheme adopts a deep neural network with a CNN-LSTM-fusion architecture (CNN: convolutional neural network; LSTM: long short-term memory). The fault detection network (FDN) developed in this paper is independent of the aircraft model and flight condition. Flight data are reshaped into a 2D format for FDN input and mapped by the network directly to fault cases. We simulate different aircraft under various flight conditions and separate them into training and testing sets. Some of the aircraft and flight conditions appear only in the testing set to validate the robustness and scalability of the FDN. Different FDN architectures are studied, and an optimized architecture is obtained via ablation studies. An average detection accuracy of 94.5% on 20 different cases is achieved.
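
A minimal sketch of the CNN-LSTM-fusion idea follows: a 1D CNN extracts local features from each multi-channel flight-data window, an LSTM models their temporal evolution, and a linear head maps the final state to fault cases. The channel count, window length, and number of output classes are placeholder assumptions, not values from the paper.

```python
# Sketch of a CNN-LSTM fusion fault classifier: a 1D CNN extracts local
# features from the flight-data channels, an LSTM models the sequence of
# those features, and a linear head outputs fault-case logits.
# Channel count, window size, and class count are assumptions.
import torch
import torch.nn as nn

class CNNLSTMFusionFDN(nn.Module):
    def __init__(self, n_channels=12, n_classes=20, hidden=64):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
        )
        self.lstm = nn.LSTM(64, hidden, batch_first=True)
        self.classifier = nn.Linear(hidden, n_classes)

    def forward(self, x):                      # x: (batch, n_channels, seq_len)
        f = self.cnn(x)                        # (batch, 64, seq_len)
        f = f.transpose(1, 2)                  # (batch, seq_len, 64) for the LSTM
        h, _ = self.lstm(f)
        return self.classifier(h[:, -1, :])    # logits over fault cases

logits = CNNLSTMFusionFDN()(torch.randn(8, 12, 200))  # 8 windows, 12 channels, 200 steps
```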


Author(s):  
Peter Wintoft ◽  
Magnus Wik

Three different recurrent neural network (RNN) architectures are studied for the prediction of geomagnetic activity. The RNNs studied are the Elman, gated recurrent unit (GRU), and long short-term memory (LSTM) networks. The RNNs take solar wind data as inputs to predict the Dst index, which summarizes complex geomagnetic processes into a single time series. The models are trained and tested with five-fold cross-validation on the hourly-resolution OMNI dataset for the years 1995–2015. The inputs are solar wind plasma parameters (particle density and speed), vector magnetic fields, time of year, and time of day. The RNNs are regularized using early stopping and dropout. We find that both the gated recurrent unit and long short-term memory models perform better than the Elman model; however, we see no significant difference in performance between GRU and LSTM. RNNs with dropout require more weights to reach the same validation error as networks without dropout. However, the gap between training error and validation error becomes smaller when dropout is applied, reducing over-fitting and improving generalization. Another advantage of dropout is that it can be applied during prediction to provide confidence limits on the predictions. The confidence limits increase with increasing Dst magnitude, a consequence of the less populated input-target space for events with large Dst values, which increases the uncertainty in the estimates. The best RNNs have a test-set RMSE of 8.8 nT, a bias close to zero, and a linear correlation of 0.90.
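
The dropout-based confidence limits mentioned above can be obtained by keeping dropout active at prediction time and sampling the network repeatedly (Monte Carlo dropout). The sketch below assumes a single-layer GRU, an arbitrary set of eight hourly input features, and 100 samples; none of these values come from the study.

```python
# Sketch of Monte Carlo dropout for an hourly Dst predictor: dropout stays
# active at prediction time and the network is sampled repeatedly, giving a
# mean forecast and a spread used as a confidence limit.
# Architecture sizes and the number of MC samples are illustrative assumptions.
import torch
import torch.nn as nn

class DstGRU(nn.Module):
    def __init__(self, n_inputs=8, hidden=32, p_drop=0.3):
        super().__init__()
        self.gru = nn.GRU(n_inputs, hidden, batch_first=True)
        self.drop = nn.Dropout(p_drop)
        self.head = nn.Linear(hidden, 1)       # predicted Dst (nT)

    def forward(self, x):                      # x: (batch, hours, n_inputs)
        h, _ = self.gru(x)
        return self.head(self.drop(h[:, -1, :]))

def mc_dropout_predict(model, x, n_samples=100):
    model.train()                              # keep dropout active during prediction
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    return samples.mean(0), samples.std(0)     # forecast and its uncertainty

model = DstGRU()
mean, std = mc_dropout_predict(model, torch.randn(1, 24, 8))  # 24 h of solar-wind inputs
```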


Author(s):  
Md. Asifuzzaman Jishan ◽  
Khan Raqib Mahmud ◽  
Abul Kalam Al Azad

We present a learning model that generates natural language descriptions of images. The model exploits the connections between natural language and visual data by producing text-line-based content from a given image. Our hybrid recurrent neural network model builds on Convolutional Neural Network (CNN), Long Short-Term Memory (LSTM), and Bi-directional Recurrent Neural Network (BRNN) models. We conducted experiments on three benchmark datasets: Flickr8K, Flickr30K, and MS COCO. Our hybrid model uses an LSTM to encode text lines or sentences independently of object location and a BRNN for word representation, which reduces computational complexity without compromising the accuracy of the descriptor. The model produced better accuracy in retrieving natural language based descriptions on these datasets.
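
The following sketch illustrates the hybrid pattern in broad strokes: a CNN encodes the image, a bidirectional RNN produces word-level representations of the partial caption, and an LSTM decodes next-word probabilities. The backbone, vocabulary size, and layer dimensions are placeholder assumptions and do not reproduce the authors' exact wiring.

```python
# Sketch of a hybrid captioning model in the spirit of the abstract: a CNN
# encodes the image, a bidirectional RNN embeds the partial word sequence,
# and an LSTM decodes the next-word distribution. Vocabulary size, layer
# dimensions, and the tiny CNN backbone are placeholder assumptions.
import torch
import torch.nn as nn

class HybridCaptioner(nn.Module):
    def __init__(self, vocab_size=10000, embed=256, hidden=512):
        super().__init__()
        self.cnn = nn.Sequential(                       # tiny stand-in image encoder
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, hidden),
        )
        self.embed = nn.Embedding(vocab_size, embed)
        self.brnn = nn.GRU(embed, embed // 2, batch_first=True, bidirectional=True)
        self.lstm = nn.LSTM(embed, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, image, tokens):                   # tokens: (batch, T) word ids
        img = self.cnn(image).unsqueeze(0)              # (1, batch, hidden) initial state
        w, _ = self.brnn(self.embed(tokens))            # bidirectional word representations
        h, _ = self.lstm(w, (img, torch.zeros_like(img)))
        return self.out(h)                              # next-word logits per position

logits = HybridCaptioner()(torch.randn(2, 3, 224, 224), torch.randint(0, 10000, (2, 12)))
```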

