Real Time LDR Data Prediction using IoT and Deep Learning Algorithm

2020 ◽  
pp. 158-161
Author(s):  
Chandraprabha S ◽  
Pradeepkumar G ◽  
Dineshkumar Ponnusamy ◽  
Saranya M D ◽  
Satheesh Kumar S ◽  
...  

This paper presents an artificial-intelligence-based system for real-time LDR data prediction, applicable to indoor lighting, environments where an enormous amount of heat is produced, agriculture (to increase crop yield), and solar plants (for solar irradiance tracking). To forecast the LDR information, the system uses a sensor that measures light intensity by means of an LDR. The data acquired from the sensor are posted to the Adafruit cloud at two-second intervals using a NodeMCU ESP8266 module, and are also presented on the Adafruit dashboard for observing the sensor variables. A long short-term memory (LSTM) network is used to set up the deep learning model. The LSTM module uses the historical data recorded in the Adafruit cloud, which is paired with the NodeMCU, to obtain real-time long-term time series of the sensor variable measured as light intensity. The data are extracted from the cloud for analytics, and the deep learning model is then applied to predict future light intensity values.
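As a rough illustration of the forecasting step described above, the sketch below trains an LSTM on windows of historical LDR readings and predicts the next light-intensity value. It assumes TensorFlow/Keras; the window size, layer sizes, and the `ldr_feed.csv` export of the Adafruit feed are hypothetical choices, not details from the paper.

```python
# Minimal LSTM forecasting sketch, assuming TensorFlow/Keras.
# Window size, units, and the feed export file are illustrative.
import numpy as np
import tensorflow as tf

def make_windows(series, window=30):
    """Slice a 1-D light-intensity series into (window, 1) inputs and next-step targets."""
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X[..., np.newaxis], y

# series: historical LDR readings pulled from the Adafruit feed (2 s interval)
series = np.loadtxt("ldr_feed.csv")          # hypothetical export of the cloud feed
X, y = make_windows(series)

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(X.shape[1], 1)),
    tf.keras.layers.Dense(1),                # next light-intensity value
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=10, batch_size=32)

# One-step-ahead forecast from the most recent window of readings
next_value = model.predict(series[-30:].reshape(1, 30, 1))
```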

2020 ◽  
Vol 34 (4) ◽  
pp. 437-444
Author(s):  
Lingyan Ou ◽  
Ling Chen

Corporate internet reporting (CIR) offers such advantages as timeliness, volume, and wide coverage of financial information. However, CIR, like any other online information, faces various risks. With the aid of increasingly sophisticated artificial intelligence (AI) technology, this paper proposes an improved deep learning algorithm for the prediction of CIR risks, aiming to improve the accuracy of CIR risk prediction. After building a reasonable evaluation index system (EIS) for CIR risks, the data involved in risk rating and in predicting the risk transmission effect (RTE) were subjected to structured feature extraction and time series construction. Next, a combinatory CIR risk prediction model was established by combining the autoregressive moving average (ARMA) model, which is good at depicting linear series, with long short-term memory (LSTM), which excels at describing nonlinear series. Experimental results demonstrate the effectiveness of the ARMA-LSTM model. The findings provide a good reference for applying AI technology to risk prediction in other areas.
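ARMA-LSTM hybrids are commonly implemented by letting ARMA fit the linear component and training the LSTM on its residuals, with the final forecast as the sum of the two. The sketch below follows that scheme as an assumption; the model orders and hyperparameters are illustrative, and the paper's exact combination may differ.

```python
# Hybrid ARMA + LSTM sketch: ARMA for the linear part, LSTM for the residuals.
# Orders, window size, and layer sizes are illustrative, not from the paper.
import numpy as np
import tensorflow as tf
from statsmodels.tsa.arima.model import ARIMA

def fit_arma_lstm(series, order=(2, 0, 1), window=10):
    # Linear component: ARMA(p, q) expressed as ARIMA(p, 0, q)
    arma = ARIMA(series, order=order).fit()
    residuals = series - arma.fittedvalues

    # Nonlinear component: LSTM trained on the ARMA residuals
    X = np.stack([residuals[i:i + window] for i in range(len(residuals) - window)])
    y = residuals[window:]
    lstm = tf.keras.Sequential([
        tf.keras.layers.LSTM(32, input_shape=(window, 1)),
        tf.keras.layers.Dense(1),
    ])
    lstm.compile(optimizer="adam", loss="mse")
    lstm.fit(X[..., np.newaxis], y, epochs=20, verbose=0)

    # Combined one-step forecast: linear forecast plus predicted residual
    linear = arma.forecast(steps=1)[0]
    nonlinear = lstm.predict(residuals[-window:].reshape(1, window, 1))[0, 0]
    return linear + nonlinear
```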


Author(s):  
Liu Chenang ◽  
Wang Rongxuan ◽  
Zhenyu Kong ◽  
Babu Suresh ◽  
Joslin Chase ◽  
...  

Layer-wise 3D surface morphology information is critical for the quality monitoring and control of additive manufacturing (AM) processes. However, most existing 3D scanning technologies are either contact-based or time-consuming, and are therefore not capable of obtaining 3D surface morphology data in real time during the process. The objective of this study is thus to achieve real-time 3D surface data acquisition in AM through a supervised deep learning-based image analysis approach. The key idea of the proposed method is to capture the correlation between the 2D image and the 3D point cloud, and then to quantify this relationship using a deep learning algorithm, namely a convolutional neural network (CNN). To validate the effectiveness and efficiency of the proposed method, both simulation and real-world case studies were performed. The results demonstrate that the method has strong potential for real-time surface morphology measurement in AM, as well as in other advanced manufacturing processes.
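One plausible way to realize the 2D-image-to-3D-point-cloud mapping is to regress a dense per-pixel height map with a CNN, so that each pixel's (x, y) position plus its predicted z value forms one point of the cloud. The sketch below assumes TensorFlow/Keras and that framing; the architecture and input size are illustrative, not the authors' network.

```python
# CNN sketch mapping a layer-wise 2D image to a dense height (z) map,
# assuming TensorFlow/Keras; architecture and input size are illustrative.
import tensorflow as tf

def build_height_map_cnn(input_shape=(128, 128, 1)):
    """CNN that regresses one z value per pixel of a layer-wise 2D image."""
    inputs = tf.keras.Input(shape=input_shape)
    x = tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)
    x = tf.keras.layers.Conv2D(64, 3, padding="same", activation="relu")(x)
    # One z value per pixel; (x, y, z) triples then form the point cloud
    z = tf.keras.layers.Conv2D(1, 1, padding="same")(x)
    return tf.keras.Model(inputs, z)

model = build_height_map_cnn()
model.compile(optimizer="adam", loss="mse")
# model.fit(images, height_maps, ...)  # pairs registered from an offline 3D scanner
```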


Author(s):  
Luotong Wang ◽  
Li Qu ◽  
Longshu Yang ◽  
Yiying Wang ◽  
Huaiqiu Zhu

Nanopore sequencing is regarded as one of the most promising third-generation sequencing (TGS) technologies. Since 2014, Oxford Nanopore Technologies (ONT) has developed a series of devices based on nanopore sequencing to produce very long reads, with an expected impact on genomics. However, nanopore sequencing reads are susceptible to a fairly high error rate owing to the difficulty of identifying DNA bases from the complex electrical signals. Although several basecalling tools have been developed for nanopore sequencing over the past years, it is still challenging to correct the sequences after the basecalling procedure has been applied. In this study, we developed an open-source DNA basecalling reviser, NanoReviser, based on a deep learning algorithm, to correct the basecalling errors introduced by the current default basecallers. In our module, we re-segmented the raw electrical signals based on the basecalled sequences provided by the default basecallers. By employing convolutional neural networks (CNNs) and bidirectional long short-term memory (Bi-LSTM) networks, we took advantage of the information in both the raw electrical signals and the basecalled sequences. Our results show that NanoReviser, as a post-basecalling reviser, significantly improves basecalling quality. After being trained on standard ONT sequencing reads from public E. coli and human NA12878 datasets, NanoReviser reduced the sequencing error rate by over 5% on both datasets, and its performance was better than that of all current basecalling tools. Furthermore, we analyzed the modified bases of the E. coli dataset and added the methylation information to train our module. With the methylation annotation, NanoReviser reduced the error rate by 7% on the E. coli dataset, and by over 10% in sequence regions rich in methylated bases. To the best of our knowledge, NanoReviser is the first post-basecalling tool to accurately correct nanopore sequences without the time-consuming procedure of building a consensus sequence. The NanoReviser package is freely available at https://github.com/pkubioinformatics/NanoReviser.
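A minimal sketch of the kind of CNN + Bi-LSTM stack described above, assuming TensorFlow/Keras: a 1-D CNN extracts local features from a re-segmented raw-signal chunk, a bidirectional LSTM models long-range context, and a per-position softmax emits revised base calls. The chunk length, filter sizes, and 5-way output (A, C, G, T, blank) are illustrative choices, not NanoReviser's actual configuration.

```python
# Sketch of a CNN + Bi-LSTM per-position base classifier over raw signal chunks,
# assuming TensorFlow/Keras; all sizes here are illustrative.
import tensorflow as tf

def build_reviser(chunk_len=512, n_classes=5):
    signal = tf.keras.Input(shape=(chunk_len, 1))        # re-segmented raw current
    x = tf.keras.layers.Conv1D(64, 5, padding="same", activation="relu")(signal)
    x = tf.keras.layers.Conv1D(64, 5, padding="same", activation="relu")(x)
    x = tf.keras.layers.Bidirectional(
        tf.keras.layers.LSTM(128, return_sequences=True))(x)
    out = tf.keras.layers.Dense(n_classes, activation="softmax")(x)  # per-position call
    return tf.keras.Model(signal, out)

model = build_reviser()
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```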


2021 ◽  
Vol 2114 (1) ◽  
pp. 012067
Author(s):  
Ruba R. Nori ◽  
Rabah N. Farhan ◽  
Safaa Hussein Abed

A novel algorithm for fire detection is introduced: a CNN-based system for real-time fire localization. Deep learning algorithms show excellent results, reaching very high accuracy on the fire image dataset, and YOLO is a deep learning algorithm well suited to detecting and localizing fires in real time. The lack of a large image dataset forced us to limit the system to a binary classification test, and the proposed model was evaluated on a dataset gathered from the internet. In this article, we built an automated alert system integrating multiple sensors and state-of-the-art deep learning algorithms, which produces few false positives and gives our prototype robot reasonable accuracy on real-time data, so that fire events are tracked and recorded as soon as possible.
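For a concrete sense of real-time YOLO-based fire localization, the sketch below runs a detector over a camera stream and draws bounding boxes around detections. It uses the ultralytics package as an assumed stand-in for the paper's YOLO variant, and "fire.pt" is a hypothetical weights file trained on a fire image dataset.

```python
# Real-time fire localization sketch using the ultralytics YOLO API as an
# assumed stand-in; "fire.pt" is a hypothetical fire-detection weights file.
import cv2
from ultralytics import YOLO

model = YOLO("fire.pt")                       # hypothetical fire-detection weights
cap = cv2.VideoCapture(0)                     # robot camera stream

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = model(frame, verbose=False)[0]  # run detection on the current frame
    for box in results.boxes:                 # each box localizes a detected fire
        x1, y1, x2, y2 = map(int, box.xyxy[0])
        cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 0, 255), 2)
    cv2.imshow("fire", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
```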


2021 ◽  
pp. 137-147
Author(s):  
Nilay Nishant ◽  
Ashish Maharjan ◽  
Dibyajyoti Chutia ◽  
P. L. N. Raju ◽  
Ashis Pradhan
