Recurrent Neural Networks
Recently Published Documents
Mohammed Al-Shabi ◽  
Anmar Abuhamdah

The development of the internet of things (IoT) has increased exponentially, creating a rapid pace of change and enabling it to become more and more embedded in daily life. This is often achieved through integration: IoT is being integrated into billions of intelligent objects, commonly labeled "things," from which services collect various forms of data about both the "things" themselves and their environment. While IoT and IoT-powered devices can provide invaluable services in various fields, unauthorized access and inadvertent modification are potential issues of tremendous concern. In this paper, we present a process for resolving such IoT issues using adapted long short-term memory (LSTM) recurrent neural networks (RNN). With this method, we utilize specialized deep learning (DL) methods to detect abnormal and/or suspect behavior in IoT systems. LSTM RNNs are adopted to construct a high-accuracy model capable of detecting suspicious behavior based on a dataset of IoT sensor readings. The model is evaluated on the Intel Labs dataset as a test domain, performing four different tests and using three criteria: F1 score, accuracy, and time. The results demonstrate that the proposed LSTM RNN model can detect abnormal behavior in IoT systems with high accuracy.
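The core idea of the abstract, one-step-ahead prediction with an LSTM and flagging readings that deviate strongly from the prediction, can be sketched in miniature. The weights, trace values, and threshold below are hypothetical toy values (a real model would learn its parameters from the sensor dataset); this is a minimal single-unit illustration, not the authors' implementation.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h, c, p):
    """One step of a single-unit LSTM; p holds the 12 scalar weights."""
    f = sigmoid(p["wf"] * x + p["uf"] * h + p["bf"])    # forget gate
    i = sigmoid(p["wi"] * x + p["ui"] * h + p["bi"])    # input gate
    o = sigmoid(p["wo"] * x + p["uo"] * h + p["bo"])    # output gate
    g = math.tanh(p["wc"] * x + p["uc"] * h + p["bc"])  # candidate cell
    c = f * c + i * g
    h = o * math.tanh(c)
    return h, c

def detect_anomalies(readings, p, threshold):
    """Flag time steps whose reading deviates from the LSTM's
    one-step-ahead prediction (h serves as the prediction here)."""
    h = c = 0.0
    flagged = []
    for t, x in enumerate(readings):
        if t > 0 and abs(x - h) > threshold:
            flagged.append(t)
        h, c = lstm_step(x, h, c, p)
    return flagged

# Toy, untrained weights (hypothetical; chosen only for illustration)
params = {"wf": 1.0, "uf": 0.5, "bf": 0.0,
          "wi": 1.0, "ui": 0.5, "bi": 0.0,
          "wo": 1.0, "uo": 0.5, "bo": 0.0,
          "wc": 1.0, "uc": 0.5, "bc": 0.0}

# Smooth temperature-like trace with one injected spike at index 4
trace = [0.2, 0.21, 0.22, 0.2, 0.95, 0.21, 0.2]
print(detect_anomalies(trace, params, threshold=0.5))  # -> [4]
```

A trained multi-unit LSTM works the same way in principle: the recurrent state summarizes recent sensor history, and large prediction errors mark suspicious behavior.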

2022 ◽  
Leon Faure ◽  
Bastien Mollet ◽  
Wolfram Liebermeister ◽  
Jean-Loup Faulon

Metabolic networks have largely been exploited as mechanistic tools to predict the behavior of microorganisms with a defined genotype in different environments. However, flux predictions by constraint-based modeling approaches are limited in quality unless labor-intensive experiments, including the measurement of media intake fluxes, are performed. Using machine learning instead of an optimization of biomass flux, on which most existing constraint-based methods are based, provides ways to improve flux and growth rate predictions. In this paper, we show how recurrent neural networks can serve as a surrogate for constraint-based modeling, making metabolic networks suitable for backpropagation and consequently usable as an architecture for machine learning. We refer to our hybrid mechanistic/neural-network models as Artificial Metabolic Networks (AMNs). We showcase AMNs and illustrate their performance with an experimental dataset of Escherichia coli growth rates in 73 different media compositions, reaching a regression coefficient of R² = 0.78 on cross-validation sets. We expect AMNs to ease the discovery of metabolic insights and prompt new biotechnological applications.
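The recurrent-surrogate idea can be illustrated with a toy relaxation: iterate a flux vector under the network's connectivity until it settles, then read out growth. The 3-flux network, uptake vector, and readout weights below are entirely hypothetical stand-ins for what an AMN would learn; this is a sketch of the computational pattern, not the paper's model.

```python
def relu(x):
    return x if x > 0.0 else 0.0

def amn_growth(medium, W, uptake, readout, steps=50):
    """Recurrently relax a toy flux vector toward a steady state,
    v <- relu(W v + uptake * medium), then read out growth rate
    as a weighted sum of steady-state fluxes."""
    n = len(uptake)
    v = [0.0] * n
    for _ in range(steps):
        v = [relu(sum(W[i][j] * v[j] for j in range(n))
                  + uptake[i] * medium[i])
             for i in range(n)]
    return sum(readout[i] * v[i] for i in range(n))

# Hypothetical 3-flux toy network (an AMN would learn such weights
# by backpropagating through the recurrent relaxation)
W = [[0.0, 0.3, 0.0],
     [0.2, 0.0, 0.1],
     [0.0, 0.4, 0.0]]
uptake  = [1.0, 0.5, 0.8]
readout = [0.1, 0.6, 0.3]

rich = [1.0, 1.0, 1.0]   # all nutrients available
poor = [0.2, 0.0, 0.1]   # minimal medium
print(amn_growth(rich, W, uptake, readout) >
      amn_growth(poor, W, uptake, readout))  # -> True
```

Because every step of the relaxation is differentiable, gradients of the predicted growth rate with respect to the weights can flow back through the unrolled iterations, which is what makes the mechanistic network trainable.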

2022 ◽  
Leo Kozachkov ◽  
John Tauber ◽  
Mikael Lundqvist ◽  
Scott L Brincat ◽  
Jean-Jacques Slotine ◽  

Working memory has long been thought to arise from sustained spiking/attractor dynamics. However, recent work has suggested that short-term synaptic plasticity (STSP) may help maintain attractor states over gaps in time with little or no spiking. To determine whether STSP endows additional functional advantages, we trained artificial recurrent neural networks (RNNs) with and without STSP to perform an object working memory task. We found that RNNs with and without STSP were both able to maintain memories over distractors presented in the middle of the memory delay. However, RNNs with STSP showed activity that was similar to that seen in the cortex of monkeys performing the same task, whereas RNNs without STSP showed activity that was less brain-like. Further, RNNs with STSP were more robust to noise and network degradation than RNNs without STSP. These results show that STSP not only helps maintain working memories but also makes neural networks more robust.
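The "memory without spiking" mechanism behind STSP can be sketched with the classic Tsodyks–Markram synapse model: a facilitation variable rises during stimulus-driven spiking and decays slowly, so it still carries the memory through a silent delay. The time constants and spike train below are hypothetical illustrative values, not parameters from the study.

```python
def stsp_synapse(spikes, dt=0.01, U=0.2, tau_f=1.5, tau_d=0.2):
    """Tsodyks-Markram short-term synaptic plasticity (Euler steps).
    u (facilitation) jumps on presynaptic spikes and decays slowly
    toward baseline U; x (depression) is consumed by spikes and
    recovers quickly. Returns the trace of u over time."""
    u, x = U, 1.0
    trace = []
    for s in spikes:
        u += dt * (U - u) / tau_f    # slow decay toward baseline
        x += dt * (1.0 - x) / tau_d  # fast recovery of resources
        if s:
            u += U * (1.0 - u)       # facilitation jump on a spike
            x -= u * x               # resources consumed by release
        trace.append(u)
    return trace

# 1 s of stimulus-driven spiking followed by 1 s of silent delay
dt = 0.01
spikes = [1] * 100 + [0] * 100
u_trace = stsp_synapse(spikes, dt=dt)

# Facilitation stays well above its 0.2 baseline across the silent
# gap, holding the memory without any ongoing spiking:
print(u_trace[-1] > 0.2)  # -> True
```

In an RNN with STSP, this slowly decaying synaptic variable gives the network a hidden state that persists through distractors and activity gaps, which is consistent with the robustness advantage the abstract reports.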

Yue Huang ◽  
Jiali Yu ◽  
Jinsong Leng ◽  
Bisen Liu ◽  
Zhang Yi
