A Total Crop-Diagnosis Platform Based on Deep Learning Models in a Natural Nutrient Environment

2018 ◽  
Vol 8 (10) ◽  
pp. 1992 ◽  
Author(s):  
YiNa Jeong ◽  
SuRak Son ◽  
SangSik Lee ◽  
ByungKwan Lee

This paper proposes a total crop-diagnosis platform (TCP) based on deep learning models in a natural nutrient environment, which collects weather information based on a farm's location, diagnoses the collected weather information and the crop soil sensor data with a deep learning technique, and notifies the farm manager of the diagnosed result. The proposed TCP is composed of one gateway and two modules, as follows. First, the optimized farm sensor gateway (OFSG) collects data by internetworking sensor nodes that use the Zigbee, Wi-Fi and Bluetooth protocols, and reduces the number of sensor data fragmentations through compression of the fragment header. Second, the data storage module (DSM) stores the collected farm data and weather data in a farm central server. Third, the crop self-diagnosis module (CSM) runs in the cloud server and uses deep learning to diagnose whether or not the status of a farm is in good condition for growing crops according to current weather and soil information. The TCP performance shows that the data processing rate of the OFSG is about 7% higher than that of existing sensor gateways. The learning time of the CSM is shorter than that of the long short-term memory (LSTM) model by 0.43 s, and the success rate of the CSM is higher than that of the LSTM by about 7%. Therefore, the TCP based on deep learning interconnects the communication protocols of various sensors, overcomes the limit on the maximum data size a sensor can transfer, predicts crop disease occurrence in an external environment in advance, and helps to create an optimized environment in which to grow crops.
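The OFSG's reduction in fragmentation overhead can be illustrated with a minimal sketch, assuming a scheme in which only the first fragment carries the full header and subsequent fragments carry a compressed one-byte sequence header. The field sizes and layout here are hypothetical, not the paper's actual format:

```python
def fragment(payload: bytes, mtu: int, full_header: bytes) -> list:
    """Split a payload into MTU-sized fragments. Only fragment 0 carries the
    full header; later fragments carry a compressed one-byte sequence header,
    leaving more room for payload and reducing the fragment count."""
    frags = []
    seq = 0
    pos = 0
    while pos < len(payload) or seq == 0:
        header = full_header if seq == 0 else bytes([seq % 256])
        room = mtu - len(header)          # payload bytes this fragment can hold
        frags.append(header + payload[pos:pos + room])
        pos += room
        seq += 1
    return frags
```

With a 32-byte MTU and an 8-byte full header, a 100-byte payload needs four fragments instead of the five required when every fragment repeats the full header.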

Up until the early 2000s, climate predictions were made mainly using statistical methods, and these predictions were not always accurate. With the introduction of deep learning in climate prediction, prediction accuracy has improved dramatically. The sensors in weather stations produce massive amounts of unstructured data. Because of the enormous number of sensors and the volume of data they generate, it is almost impossible to compute all the necessary weather information in time. AI and deep learning help to overcome this problem with models that make this job fast, accurate and simple. Accurate climate prediction is very important for anticipating natural calamities or unexpected changes in weather. This report highlights a few of the deep learning models that scientists can use for climate prediction. This paper only scratches the surface of the capabilities of AI in climate change. Further advancements in this field would lead to better simulations of weather conditions, which could then be used to predict extreme weather conditions accurately. A few of the authors have used unique models to predict temperature, rainfall, pollution levels, etc., which have helped them find discrepancies in the climate, if any.


Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1064
Author(s):  
I Nyoman Kusuma Wardana ◽  
Julian W. Gardner ◽  
Suhaib A. Fahmy

Accurate air quality monitoring requires processing of multi-dimensional, multi-location sensor data, which has previously been considered in centralised machine learning models. These are often unsuitable for resource-constrained edge devices. In this article, we address this challenge by: (1) designing a novel hybrid deep learning model for hourly PM2.5 pollutant prediction; (2) optimising the obtained model for edge devices; and (3) examining model performance running on the edge devices in terms of both accuracy and latency. The hybrid deep learning model in this work comprises a 1D Convolutional Neural Network (CNN) and a Long Short-Term Memory (LSTM) network to predict hourly PM2.5 concentration. The results show that our proposed model outperforms other deep learning models, as evaluated by RMSE and MAE. The proposed model was optimised for two edge devices, the Raspberry Pi 3 Model B+ (RPi3B+) and the Raspberry Pi 4 Model B (RPi4B). This optimisation reduced the file size to a quarter of the original, with further size reduction achieved by applying different post-training quantisation modes. In total, 8272 hourly samples were continuously fed to the edge devices, with the RPi4B executing the model twice as fast as the RPi3B+ in all quantisation modes. Full-integer quantisation produced the lowest execution time, with latencies of 2.19 s and 4.73 s for the RPi4B and RPi3B+, respectively.
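The full-integer quantisation used here maps float32 values to 8-bit integers via a scale and zero-point, which is where the roughly four-fold size reduction comes from. A minimal affine-quantisation sketch in pure Python (illustrative only, not the actual TensorFlow Lite implementation):

```python
def quantize(values, num_bits=8):
    """Affine (asymmetric) quantisation: map floats to ints in [0, 2^b - 1]
    using a scale and zero-point derived from the observed value range."""
    qmin, qmax = 0, 2 ** num_bits - 1
    lo, hi = min(values), max(values)
    scale = (hi - lo) / (qmax - qmin) or 1.0   # avoid zero scale for constants
    zero_point = round(qmin - lo / scale)      # integer that represents 0.0
    q = [max(qmin, min(qmax, round(v / scale + zero_point))) for v in values]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate float values from the quantised integers."""
    return [(qi - zero_point) * scale for qi in q]
```

Each stored value shrinks from 4 bytes to 1, at the cost of a rounding error bounded by the scale; the per-tensor scale and zero-point are kept alongside the integer weights.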


Sensors ◽  
2020 ◽  
Vol 20 (7) ◽  
pp. 1956 ◽  
Author(s):  
Sami Kabir ◽  
Raihan Ul Islam ◽  
Mohammad Shahadat Hossain ◽  
Karl Andersson

Sensor data are gaining increasing global attention due to the advent of the Internet of Things (IoT). Reasoning is applied to such sensor data in order to compute predictions. Generating a health warning based on a prediction of atmospheric pollution, or planning the timely evacuation of people from areas vulnerable to predicted natural disasters, are use cases of sensor data streams where prediction is vital to protect people and assets. Thus, prediction accuracy is of paramount importance for taking preventive steps and averting any untoward situation. Uncertainty in sensor data is a severe factor that hampers prediction accuracy. The Belief Rule Based Expert System (BRBES), a knowledge-driven approach, is a widely employed prediction algorithm that deals with such uncertainties using a knowledge base and an inference engine. In handling uncertainties, it offers higher accuracy than other knowledge-driven techniques, e.g., fuzzy logic and Bayesian probability theory. In contrast, Deep Learning is a data-driven technique that constitutes a part of Artificial Intelligence (AI). By applying analytics to huge amounts of data, Deep Learning learns the hidden representations of the data. Thus, Deep Learning can infer predictions by reasoning over available data, such as historical data and sensor data streams. The combined application of BRBES and Deep Learning can compute predictions with improved accuracy by addressing sensor data uncertainties while utilizing the discovered data patterns. Hence, this paper proposes a novel predictive model based on the integrated approach of BRBES and Deep Learning. The uniqueness of this model lies in the development of a mathematical model that combines Deep Learning with BRBES and captures the nonlinear dependencies among the relevant variables. We further optimized the BRBES by applying parameter and structure optimization to it.
Air pollution prediction has been taken as the use case of our proposed combined approach. The model has been evaluated against two different datasets. One dataset contains synthetic images with corresponding labels of PM2.5 concentrations. The other contains real images, PM2.5 concentrations, and numerical weather data for Shanghai, China. Through the proposed model, we also distinguished whether a hazy image was caused by polluted air or by fog. Our approach outperformed both standalone BRBES and standalone Deep Learning in terms of prediction accuracy.
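The rule-activation step of a belief rule base can be sketched in a few lines. This simplified version computes normalised activation weights from rule weights and antecedent matching degrees; the paper's full model additionally performs evidential-reasoning aggregation and parameter/structure optimization, and the inputs here are hypothetical:

```python
def activation_weights(matching_degrees, rule_weights):
    """Activation weight of each belief rule: the rule weight times the
    product of its antecedents' matching degrees, normalised over all
    rules so the weights sum to 1 (a simplified BRB activation step)."""
    raw = []
    for degrees, w in zip(matching_degrees, rule_weights):
        prod = w
        for d in degrees:
            prod *= d
        raw.append(prod)
    total = sum(raw)
    return [r / total for r in raw] if total else [0.0] * len(raw)
```

Rules whose antecedents match the incoming sensor reading more closely receive proportionally more weight in the final aggregated belief distribution.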


Author(s):  
Ahsen Tahir ◽  
Jawad Ahmad ◽  
Gordon Morison ◽  
Hadi Larijani ◽  
Ryan M. Gibson ◽  
...  

Falls are a major health concern in older adults. Falls lead to mortality, immobility and high costs to social and health care services. Early detection and classification of falls is imperative for a timely and appropriate medical aid response. Traditional machine learning models have been explored for fall classification. Newly developed deep learning techniques can potentially extract high-level features from raw sensor data, providing high accuracy and robustness to the variations in sensor position, orientation and diversity of work environments that may skew traditional classification models. However, frequently used deep learning models like Convolutional Neural Networks (CNN) are computationally intensive. To the best of our knowledge, we present the first instance of a Hybrid Multichannel Random Neural Network (HMCRNN) architecture for fall detection and classification. The proposed architecture provides the highest accuracy of 92.23% with dropout regularization, compared to other deep learning implementations. The performance of the proposed technique is approximately comparable to that of a CNN, yet it requires only half the computation cost of the CNN-based implementation. Furthermore, the proposed HMCRNN architecture provides a 34.12% improvement in accuracy, on average, over a Multilayer Perceptron.
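The random neural network underlying the HMCRNN (the Gelenbe model) computes each neuron's activation as a steady-state firing probability q = T+/(r + T-), where T+ and T- are the arrival rates of positive and negative signals and r is the firing rate. A single-layer sketch under that textbook formulation (not the paper's multichannel architecture; all rates and weights here are hypothetical):

```python
def rnn_layer(q_in, w_plus, w_minus, rate, lam_pos=None, lam_neg=None):
    """Steady-state activations of a feedforward random-neural-network
    layer: q_j = T+_j / (r_j + T-_j), clipped below 1 to stay a valid
    probability. w_plus/w_minus are excitatory/inhibitory weight matrices
    indexed [input][output]; lam_pos/lam_neg are external signal rates."""
    n_out = len(rate)
    lam_pos = lam_pos or [0.0] * n_out
    lam_neg = lam_neg or [0.0] * n_out
    q_out = []
    for j in range(n_out):
        t_plus = lam_pos[j] + sum(qi * w_plus[i][j] for i, qi in enumerate(q_in))
        t_minus = lam_neg[j] + sum(qi * w_minus[i][j] for i, qi in enumerate(q_in))
        q_out.append(min(t_plus / (rate[j] + t_minus), 0.999))
    return q_out
```

Because the activation is a ratio of accumulated rates rather than a learned nonlinearity over dense matrix products, inference is cheap, which is consistent with the halved computation cost reported above.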


Sensors ◽  
2020 ◽  
Vol 20 (11) ◽  
pp. 3307 ◽  
Author(s):  
Caroline König ◽  
Ahmed Mohamed Helmi

Condition monitoring (CM) is a useful application in Industry 4.0, where the machine's health is controlled by computational intelligence methods. Data-driven models, especially from the field of deep learning, are efficient solutions for the analysis of time series sensor data due to their ability to recognize patterns in high-dimensional data and to track the temporal evolution of the signal. Despite the excellent performance of deep learning models in many applications, additional requirements regarding the interpretability of machine learning models are becoming relevant. In this work, we present a study on the sensitivity of sensors in a deep-learning-based CM system, providing high-level information about the relevance of the sensors. Several convolutional neural networks (CNN) were constructed from a multisensory dataset for the prediction of different degradation states in a hydraulic system. An attribution analysis of the input features provided insights into the contribution of each sensor to the prediction of the classifier. Relevant sensors were identified, and CNN models built on the selected sensors matched the original models in prediction quality. This information about the relevance of sensors is useful at the system design stage for making timely decisions about which sensors are required.
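One simple, model-agnostic way to approximate such an attribution analysis (not necessarily the method the authors used) is occlusion: replace one sensor channel at a time with a baseline value and measure the drop in the model's score. A minimal sketch, where `model` is any callable scoring function and the toy data are hypothetical:

```python
def occlusion_attribution(model, sample, baseline=0.0):
    """Per-sensor relevance scores: the drop in the model's output when
    that sensor's whole channel is replaced by a baseline value.
    `sample` is a list of channels, each a list of readings."""
    ref = model(sample)
    scores = []
    for ch in range(len(sample)):
        occluded = list(sample)                    # shallow copy of channel list
        occluded[ch] = [baseline] * len(sample[ch])
        scores.append(ref - model(occluded))       # large drop => relevant sensor
    return scores
```

Sensors whose occlusion barely changes the score are candidates for removal, which is exactly the design decision the study supports.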


Author(s):  
Govind P. Gupta ◽  
Shubham Gaur

Remote monitoring and recognition of the physical activities of elderly people within smart homes, and detection of deviations in their daily activities from previous behavior, is one of the fundamental research challenges for the development of ambient assisted living systems. Such a system is also very helpful in monitoring the health of the swiftly aging population in developed countries. In this chapter, a framework is proposed for remote monitoring and recognition of the physical activities of elderly people from smartphone accelerometer sensor data using deep learning models. The main objective of the proposed framework is to provide preventive measures for emergency health issues such as cardiac arrest, sudden falls, dementia, or arthritis. For the performance evaluation of the proposed framework, two different benchmark accelerometer sensor datasets, UCI and WISDM, are used. Analysis of the results confirms the performance of the proposed scheme in terms of accuracy, F1-score, and root-mean-square error (RMSE).
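Of the metrics listed, the per-class F1-score is the least obvious to compute by hand; a minimal implementation in pure Python (a sketch, independent of the chapter's actual evaluation code):

```python
def f1_score(y_true, y_pred, positive):
    """F1 for one activity class: the harmonic mean of precision
    (how many predicted positives are correct) and recall (how many
    true positives were found)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)
```

For imbalanced activity datasets such as fall detection, F1 is more informative than plain accuracy because it is not dominated by the majority class.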


Sensors ◽  
2022 ◽  
Vol 22 (2) ◽  
pp. 446
Author(s):  
Jay-Shian Tan ◽  
Sawitchaya Tippaya ◽  
Tara Binnie ◽  
Paul Davey ◽  
Kathryn Napier ◽  
...  

Deep learning models developed to predict knee joint kinematics are usually trained on inertial measurement unit (IMU) data from healthy people and only for the activity of walking. Yet, people with knee osteoarthritis have difficulties with other activities and there is a lack of studies using IMU training data from this population. Our objective was to conduct a proof-of-concept study to determine the feasibility of using IMU training data from people with knee osteoarthritis performing multiple clinically important activities to predict knee joint sagittal plane kinematics using a deep learning approach. We trained a bidirectional long short-term memory model on IMU data from 17 participants with knee osteoarthritis to estimate knee joint flexion kinematics for phases of walking, transitioning to and from a chair, and negotiating stairs. We tested two models, a double-leg model (four IMUs) and a single-leg model (two IMUs). The single-leg model demonstrated less prediction error compared to the double-leg model. Across the different activity phases, RMSE (SD) ranged from 7.04° (2.6) to 11.78° (6.04), MAE (SD) from 5.99° (2.34) to 10.37° (5.44), and Pearson’s R from 0.85 to 0.99 using leave-one-subject-out cross-validation. This study demonstrates the feasibility of using IMU training data from people who have knee osteoarthritis for the prediction of kinematics for multiple clinically relevant activities.
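Leave-one-subject-out cross-validation, as used here, holds out every sample from one participant per fold, so a model is never evaluated on a person it trained on. A minimal split generator (a sketch; the (subject_id, data) sample representation is hypothetical):

```python
def leave_one_subject_out(samples):
    """Yield (train, test) splits where each subject's samples are held
    out exactly once; `samples` is a list of (subject_id, data) pairs."""
    subjects = sorted({sid for sid, _ in samples})
    for held_out in subjects:
        train = [s for s in samples if s[0] != held_out]
        test = [s for s in samples if s[0] == held_out]
        yield train, test
```

With 17 participants this yields 17 folds, and the reported RMSE/MAE ranges summarise performance across those held-out subjects.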


Author(s):  
Hangwei Qian ◽  
Sinno Jialin Pan ◽  
Bingshui Da ◽  
Chunyan Miao

Feature-engineering-based machine learning models and deep learning models have been explored for wearable-sensor-based human activity recognition. For both types of methods, one crucial research issue is how to extract proper features from the partitioned segments of multivariate sensor readings. Existing methods have different drawbacks: 1) feature-engineering-based methods are able to extract meaningful features, such as statistical or structural information underlying the segments, but usually require manual designs of features for different applications, which is time consuming, and 2) deep learning models are able to learn temporal and/or spatial features from the sensor data automatically, but fail to capture statistical information. In this paper, we propose a novel deep learning model to automatically learn meaningful features including statistical features, temporal features and spatial correlation features for activity recognition in a unified framework. Extensive experiments are conducted on four datasets to demonstrate the effectiveness of our proposed method compared with state-of-the-art baselines.
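The statistical features that feature-engineering pipelines extract from partitioned segments can be sketched with a sliding window over a single sensor channel (illustrative only; the paper's contribution is a model that learns such statistical information rather than hand-computing it):

```python
import statistics

def window_features(signal, win, step):
    """Per-window statistical features (mean, population std, min, max)
    of the kind hand-crafted HAR pipelines compute for each segment of
    a sensor channel."""
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        seg = signal[start:start + win]
        feats.append((statistics.mean(seg), statistics.pstdev(seg),
                      min(seg), max(seg)))
    return feats
```

Each tuple would normally be concatenated across channels and fed to a classifier; the drawback the paper targets is that choosing which statistics to compute is manual and application-specific.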

