Resource-Efficient Pet Dog Sound Events Classification Using LSTM-FCN Based on Time-Series Data

Sensors, 2018, Vol. 18 (11), pp. 4019
Author(s): Yunbin Kim, Jaewon Sa, Yongwha Chung, Daihee Park, Sungju Lee

The use of IoT (Internet of Things) technology for the management of pet dogs left alone at home is increasing. This includes tasks such as automatic feeding, operation of play equipment, and location detection. Classification of the vocalizations of pet dogs using information from a sound sensor is an important method to analyze the behavior or emotions of dogs that are left alone. These sounds are acquired by attaching an IoT sound sensor to the dog and are then classified into sound events (e.g., barking, growling, howling, and whining). However, sound sensors tend to transmit large amounts of data and consume considerable power, which presents issues for resource-constrained IoT sensor devices. In this paper, we propose a way to classify pet dog sound events and improve resource efficiency without significant degradation of accuracy. To achieve this, we acquire only the intensity of the sounds by using a relatively resource-efficient noise sensor. This presents its own issue: it is difficult to achieve sufficient classification accuracy using only intensity data because information about the sound events is lost. To address this problem without significant degradation of classification accuracy, we apply a long short-term memory-fully convolutional network (LSTM-FCN), a deep learning method for analyzing time-series data, and exploit bicubic interpolation. Experimental results show that the proposed method based on noise sensors (i.e., Shapelet and LSTM-FCN for time series) improves energy efficiency by a factor of 10 without significant degradation of accuracy compared to typical methods based on sound sensors (i.e., mel-frequency cepstrum coefficient (MFCC), spectrogram, and mel-spectrum for feature extraction, and support vector machine (SVM) and k-nearest neighbor (k-NN) for classification).
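
As a rough illustration of the classifier described above, the sketch below builds an LSTM-FCN in Keras for fixed-length sound-intensity sequences. The sequence length, layer widths, and four-class output are assumptions for illustration, not the authors' exact configuration.

```python
# Minimal LSTM-FCN sketch (Keras). Sequence length and layer sizes are illustrative.
import tensorflow as tf
from tensorflow.keras import layers, Model

SEQ_LEN = 128        # assumed length of one intensity time series
NUM_CLASSES = 4      # barking, growling, howling, whining

inputs = layers.Input(shape=(SEQ_LEN, 1))

# LSTM branch: summarizes the whole sequence into one hidden state.
lstm_out = layers.LSTM(64)(inputs)
lstm_out = layers.Dropout(0.5)(lstm_out)

# FCN branch: stacked 1-D convolutions followed by global average pooling.
x = layers.Conv1D(128, 8, padding="same")(inputs)
x = layers.BatchNormalization()(x)
x = layers.Activation("relu")(x)
x = layers.Conv1D(256, 5, padding="same")(x)
x = layers.BatchNormalization()(x)
x = layers.Activation("relu")(x)
x = layers.Conv1D(128, 3, padding="same")(x)
x = layers.BatchNormalization()(x)
x = layers.Activation("relu")(x)
fcn_out = layers.GlobalAveragePooling1D()(x)

# Concatenate both branches and classify into the four sound events.
merged = layers.concatenate([lstm_out, fcn_out])
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(merged)

model = Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```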

Author(s): S. Jing, T. Chao

Abstract. Time-series imagery containing high-dimensional temporal features is conducive to improving classification accuracy. With the steady accumulation of historical images, time-series data become available to exploit, but it is difficult to avoid missing values caused by cloud cover. Meanwhile, obtaining a large number of training labels for long time series also makes data collection troublesome. In this study, we propose a semi-supervised convolutional long short-term memory neural network (Semi-LSTM) for long time series that achieves accurate and automated land cover classification with a small proportion of labels. The three main contributions of this work are as follows: (i) the proposed method achieves excellent classification from a small set of labels in long time-series data, reducing the dependence on training labels; (ii) the algorithm is robust in accuracy to the influence of noise, relaxing the requirement that the sequential data consist of cloudless and lossless images; and (iii) it takes full advantage of spectral-spatial-temporal features, in particular expanding the temporal context to enhance classification accuracy. Finally, the proposed network is validated on time-series imagery from Landsat 8. All quantitative analyses and evaluation indicators of the experimental results demonstrate competitive performance in the suggested modes.
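
As a hedged illustration of training with only a small proportion of labels, the sketch below pairs a ConvLSTM classifier with a generic self-training loop that promotes confident predictions on unlabeled patches to pseudo-labels. The patch size, band count, confidence threshold, and training schedule are assumptions and do not reproduce the paper's Semi-LSTM procedure.

```python
# Sketch: ConvLSTM classifier plus a generic self-training (pseudo-label) loop.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

T, H, W, B, NUM_CLASSES = 12, 9, 9, 6, 5   # assumed: 12 dates, 9x9 patch, 6 bands, 5 classes

def build_model():
    inp = layers.Input(shape=(T, H, W, B))
    x = layers.ConvLSTM2D(32, (3, 3), padding="same")(inp)   # last hidden state only
    x = layers.GlobalAveragePooling2D()(x)
    out = layers.Dense(NUM_CLASSES, activation="softmax")(x)
    model = Model(inp, out)
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def self_train(x_lab, y_lab, x_unlab, rounds=3, threshold=0.9):
    """Iteratively add confident predictions on unlabeled data as pseudo-labels."""
    model = build_model()
    for _ in range(rounds):
        model.fit(x_lab, y_lab, epochs=5, batch_size=32, verbose=0)
        probs = model.predict(x_unlab, verbose=0)
        keep = probs.max(axis=1) >= threshold
        x_lab = np.concatenate([x_lab, x_unlab[keep]])
        y_lab = np.concatenate([y_lab, probs[keep].argmax(axis=1)])
        x_unlab = x_unlab[~keep]
    return model
```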


2020, Vol. 12 (20), pp. 8555
Author(s): Li Huang, Ting Cai, Ya Zhu, Yuliang Zhu, Wei Wang, ...

Accurate forecasts of construction waste are important for recycling the waste and formulating relevant governmental policies. Deficiencies in reliable forecasting methods and historical data hinder the prediction of this waste in long- or short-term planning. To effectively forecast construction waste, a time-series forecasting method is proposed in this study, based on a three-layer long short-term memory (LSTM) network and univariate time-series data with limited sample points. The method involves the design of the network structure and implementation algorithms for network training and the forecasting process. Numerical experiments were performed with statistical construction waste data for Shanghai and Hong Kong. Compared with other time-series forecasting models such as ridge regression (RR), support vector regression (SVR), and back-propagation neural networks (BPNN), the proposed LSTM-based forecasting model is shown to be effective and accurate in predicting construction waste generation.
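
A minimal sketch of such a three-layer LSTM forecaster on a short univariate series is given below. The sliding-window length, layer widths, and random placeholder series are assumptions, not the paper's exact design.

```python
# Sketch: three-layer LSTM forecaster for a short univariate series.
import numpy as np
from tensorflow.keras import layers, Sequential

def make_windows(series, window=4):
    """Turn a 1-D series into (samples, window, 1) inputs and next-step targets."""
    x = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = np.array([series[i + window] for i in range(len(series) - window)])
    return x[..., None], y

def build_forecaster(window=4):
    return Sequential([
        layers.Input(shape=(window, 1)),
        layers.LSTM(32, return_sequences=True),
        layers.LSTM(32, return_sequences=True),
        layers.LSTM(16),
        layers.Dense(1),
    ])

series = np.random.rand(30).astype("float32")   # placeholder for the waste statistics
x, y = make_windows(series)

model = build_forecaster()
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=100, batch_size=4, verbose=0)
next_value = model.predict(x[-1:], verbose=0)[0, 0]   # one-step-ahead forecast
```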


2021, Vol. 11 (1)
Author(s): Tuan D. Pham

Abstract. Automated analysis of physiological time series is utilized for many clinical applications in medicine and life sciences. Long short-term memory (LSTM) is a deep recurrent neural network architecture used for classification of time-series data. Here, time–frequency and time–space properties of time series are introduced as a robust tool for LSTM processing of long sequential data in physiology. Based on classification results obtained from two databases of sensor-induced physiological signals, the proposed approach has the potential for (1) achieving very high classification accuracy, (2) saving tremendous time for data learning, and (3) being cost-effective and user-comfortable for clinical trials by reducing multiple wearable sensors for data recording.
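
One way to realize a time-frequency front end for an LSTM is sketched below: a short-time Fourier spectrogram is computed and each spectrogram column becomes one LSTM time step. The sampling rate, transform, and two-class output are assumptions and may differ from the representations used in the paper.

```python
# Sketch: time-frequency features (spectrogram columns) fed to an LSTM classifier.
import numpy as np
from scipy.signal import spectrogram
from tensorflow.keras import layers, Sequential

fs = 250                                    # assumed sampling rate in Hz
signal = np.random.randn(fs * 60)           # placeholder 60-second physiological signal

# Rows of the spectrogram are frequency bins; columns are short time frames.
freqs, times, sxx = spectrogram(signal, fs=fs, nperseg=256, noverlap=128)
features = np.log1p(sxx).T[None, ...]       # shape: (1, time_steps, freq_bins)

model = Sequential([
    layers.Input(shape=features.shape[1:]),
    layers.LSTM(64),
    layers.Dense(2, activation="softmax"),  # assumed two diagnostic classes
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
probs = model.predict(features, verbose=0)  # class probabilities for the single record
```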


2021, Vol. 13 (3), pp. 67
Author(s): Eric Hitimana, Gaurav Bajpai, Richard Musabe, Louis Sibomana, Jayavel Kayalvizhi

Many countries worldwide face challenges in managing building fire-prevention measures. The most critical issues are the localization, identification, and detection of room occupants. The Internet of Things (IoT), together with machine learning, has been shown to increase the smartness of buildings by providing real-time data acquisition through sensors and actuators for prediction mechanisms. This paper proposes the implementation of an IoT framework to capture indoor environmental parameters as multivariate time-series data for occupancy estimation. The long short-term memory (LSTM) deep learning algorithm is applied to infer the presence of human beings. An experiment is conducted in an office room using the multivariate time series as predictors in a regression forecasting problem. The results demonstrate that the developed system can obtain, process, and store environmental information. The collected information was fed to the LSTM algorithm, which was compared with other machine learning algorithms: the support vector machine, naïve Bayes network, and multilayer perceptron feed-forward network. The outcomes of the parametric calibrations demonstrate that the LSTM performs better in the context of the proposed application.
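
A minimal sketch of a multivariate LSTM regressor for occupancy estimation is given below. The sensor channels, window length, and placeholder data are assumptions made for illustration only.

```python
# Sketch: multivariate LSTM regression on windowed indoor sensor readings.
import numpy as np
from tensorflow.keras import layers, Sequential

FEATURES = ["temperature", "humidity", "co2", "light"]   # hypothetical sensor channels
WINDOW = 30                                              # assumed number of past readings

def build_occupancy_model():
    return Sequential([
        layers.Input(shape=(WINDOW, len(FEATURES))),
        layers.LSTM(64),
        layers.Dense(1),          # regression output: estimated occupancy
    ])

x = np.random.rand(200, WINDOW, len(FEATURES)).astype("float32")  # placeholder windows
y = np.random.rand(200).astype("float32")                         # placeholder occupancy

model = build_occupancy_model()
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.fit(x, y, epochs=10, batch_size=16, verbose=0)
```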


Author(s): Gudipally Chandrashakar

In this article, we use historical time-series data on the gold price up to the current day. In this study of gold price prediction, we consider a few correlated factors: the silver price, copper price, Standard & Poor's 500 index, dollar-rupee exchange rate, and Dow Jones Industrial Average. The prices of each correlated factor and the gold price data cover dates ranging from January 2008 to February 2021. Several machine learning algorithms are used to analyze the time-series data: random forest regression, support vector regression, linear regression, extra-trees regression, and gradient boosting regression. Among the results, the extra-trees regressor predicts the gold price most accurately.
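
The comparison described above can be reproduced in outline with scikit-learn, as in the sketch below. The feature matrix and target are random placeholders, since the study's dataset is not reproduced here.

```python
# Sketch: comparing the regressors named above on placeholder correlated-factor features.
import numpy as np
from sklearn.ensemble import (RandomForestRegressor, ExtraTreesRegressor,
                              GradientBoostingRegressor)
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Placeholder columns: silver, copper, S&P 500, dollar-rupee rate, Dow Jones.
X = np.random.rand(500, 5)
y = np.random.rand(500)            # placeholder gold price

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "RandomForest": RandomForestRegressor(random_state=0),
    "ExtraTrees": ExtraTreesRegressor(random_state=0),
    "GradientBoosting": GradientBoostingRegressor(random_state=0),
    "SVR": SVR(),
    "Linear": LinearRegression(),
}
for name, reg in models.items():
    reg.fit(X_tr, y_tr)
    print(name, round(r2_score(y_te, reg.predict(X_te)), 3))
```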


Open Physics, 2021, Vol. 19 (1), pp. 618-627
Author(s): Weixing Song, Jingjing Wu, Jianshe Kang, Jun Zhang

Abstract. The aim of this study was to improve the low accuracy of equipment spare-parts requirement prediction, which affects the quality and efficiency of maintenance support, based on a summary and analysis of existing research on spare-parts requirement prediction. This article applies the long short-term memory (LSTM) algorithm, which is currently popular and well suited to processing time-series data, to equipment spare-parts requirement prediction, in accordance with the time-series characteristics of spare-parts consumption data. A method for predicting the requirement for maintenance spare parts based on the LSTM recurrent neural network is proposed; the network structure is designed in detail, and the realization of network training and prediction is given. The particle swarm algorithm is introduced to optimize the network parameters, and actual consumption data for three types of equipment spare parts are used in the experiments. A performance comparison with predictive models such as the BP neural network, generalized regression neural network, wavelet neural network, and squeeze-and-excitation network proves that the new method is effective and provides a sound basis for scientifically predicting the requirement for maintenance spare parts and improving the quality of equipment maintenance.
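
A toy sketch of a particle-swarm search over two LSTM hyperparameters (hidden units and learning rate) is shown below, with validation MSE on windowed consumption data as the fitness. The swarm size, coefficients, and placeholder data are illustrative assumptions, not the paper's settings.

```python
# Sketch: toy particle swarm optimization of LSTM hyperparameters (units, learning rate).
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Sequential

def make_windows(series, window=6):
    x = np.array([series[i:i + window] for i in range(len(series) - window)])[..., None]
    y = np.array([series[i + window] for i in range(len(series) - window)])
    return x, y

def fitness(units, lr, x, y):
    """Validation MSE of a small LSTM forecaster with the given hyperparameters."""
    model = Sequential([layers.Input(shape=x.shape[1:]),
                        layers.LSTM(int(units)),
                        layers.Dense(1)])
    model.compile(optimizer=tf.keras.optimizers.Adam(lr), loss="mse")
    hist = model.fit(x, y, epochs=30, batch_size=8, verbose=0, validation_split=0.2)
    return min(hist.history["val_loss"])

series = np.random.rand(60).astype("float32")        # placeholder consumption series
x, y = make_windows(series)

# Positions are (hidden units, learning rate); velocities move toward personal/global bests.
rng = np.random.default_rng(0)
pos = np.column_stack([rng.uniform(8, 64, 5), rng.uniform(1e-4, 1e-2, 5)])
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_fit = np.array([fitness(u, lr, x, y) for u, lr in pos])
gbest = pbest[pbest_fit.argmin()]

for _ in range(3):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, [8, 1e-4], [64, 1e-2])
    fit = np.array([fitness(u, lr, x, y) for u, lr in pos])
    better = fit < pbest_fit
    pbest[better], pbest_fit[better] = pos[better], fit[better]
    gbest = pbest[pbest_fit.argmin()]

best_units, best_lr = gbest        # hyperparameters with the lowest validation MSE
```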


Mathematics, 2020, Vol. 8 (7), pp. 1078
Author(s): Ruxandra Stoean, Catalin Stoean, Miguel Atencia, Roberto Rodríguez-Labrada, Gonzalo Joya

Uncertainty quantification in deep learning models is especially important for the medical applications of this complex and successful type of neural architecture. One popular technique is Monte Carlo dropout, which gives sampled outputs for a record that can be summarized statistically in terms of the average probability and variance for each diagnostic class of the problem. The current paper puts forward a convolutional–long short-term memory network model with a Monte Carlo dropout layer for obtaining information regarding the model uncertainty for the saccadic records of all patients. These are next used in assessing the uncertainty of the learning model at the higher level of sets of multiple records (i.e., registers) that are gathered for one patient case by the examining physician towards an accurate diagnosis. Means and standard deviations are additionally calculated for the Monte Carlo uncertainty estimates of groups of predictions. These serve as a new collection on which a random forest model can perform both classification and ranking of variable importance. The approach is validated on a real-world problem of classifying electrooculography time series for the early detection of spinocerebellar ataxia 2 and reaches an accuracy of 88.59% in distinguishing between the three classes of patients.
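
A minimal sketch of Monte Carlo dropout around a small convolutional-LSTM classifier is shown below: dropout stays active at inference (training=True), so repeated forward passes yield a mean probability and a standard deviation per class. The record length, layer sizes, and three-class output are assumptions made for illustration.

```python
# Sketch: Monte Carlo dropout with a small convolutional-LSTM classifier.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

SEQ_LEN, NUM_CLASSES = 200, 3          # assumed record length and three patient classes

inp = layers.Input(shape=(SEQ_LEN, 1))
x = layers.Conv1D(32, 5, padding="same", activation="relu")(inp)
x = layers.MaxPooling1D(2)(x)
x = layers.LSTM(32)(x)
x = layers.Dropout(0.5)(x)             # the Monte Carlo dropout layer
out = layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = Model(inp, out)                # in practice, train this model before estimating uncertainty

def mc_predict(model, records, passes=50):
    """Mean probability and standard deviation per class over stochastic forward passes."""
    preds = np.stack([model(records, training=True).numpy() for _ in range(passes)])
    return preds.mean(axis=0), preds.std(axis=0)

records = np.random.randn(4, SEQ_LEN, 1).astype("float32")   # placeholder saccadic records
mean_prob, std_prob = mc_predict(model, records)
```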

