Estimating time-dependent entropy production from non-equilibrium trajectories

2022 ◽  
Vol 5 (1) ◽  
Author(s):  
Shun Otsubo ◽  
Sreekanth K. Manikandan ◽  
Takahiro Sagawa ◽  
Supriya Krishnamurthy

Abstract
The rate of entropy production provides a useful quantitative measure of a non-equilibrium system, and estimating it directly from experimental time-series data is highly desirable. Several approaches have been considered for stationary dynamics, some of which are based on a variational characterization of the entropy production rate. However, the issue of obtaining it in the case of non-stationary dynamics remains largely unexplored. Here, we solve this open problem by demonstrating that the variational approaches can be generalized to give the exact value of the entropy production rate even for non-stationary dynamics. On the basis of this result, we develop an efficient algorithm that estimates the entropy production rate continuously in time using machine learning techniques, and validate our numerical estimates using analytically tractable Langevin models in experimentally relevant parameter regimes. Our method requires only time-series data for the system of interest, without any prior knowledge of the system's parameters.
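The abstract above concerns estimating entropy production rates directly from trajectories. A minimal sketch of the underlying idea, for a much simpler case than the paper's variational machine-learning estimator: for an overdamped particle driven by a constant force on a ring, the steady-state entropy production rate is known analytically, and it can also be estimated from the simulated time series alone. All parameter values here are illustrative assumptions, not the paper's.

```python
import numpy as np

# Illustrative setup (not the paper's estimator): an overdamped particle
# driven by a constant force f on a ring, with mobility mu and temperature T.
# The exact steady-state entropy production rate is sigma = f^2 * mu / T.
rng = np.random.default_rng(0)
f, mu, T, dt, n_steps = 1.0, 1.0, 1.0, 1e-2, 200_000
D = mu * T                       # Einstein relation

# Euler-Maruyama simulation of dx = mu*f*dt + sqrt(2D) dW
noise = rng.normal(0.0, np.sqrt(2 * D * dt), n_steps)
x = np.cumsum(mu * f * dt + noise)

# Estimate sigma from the time series alone: mean drift velocity times f / T
v_hat = x[-1] / (n_steps * dt)
sigma_hat = f * v_hat / T
print(sigma_hat)                 # close to the exact value f^2 * mu / T = 1
```

The paper's contribution is precisely the non-trivial generalization: obtaining such estimates for non-stationary dynamics without knowing the force or mobility in advance.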

2019 ◽  
Vol 3 (2) ◽  
pp. 282-287
Author(s):  
Ika Oktavianti ◽  
Ermatita Ermatita ◽  
Dian Palupi Rini

Licensing services are among the public services important for supporting increased investment in Indonesia and are currently provided by the Investment and Licensing Services Department. A common problem is the length of time needed to process licenses, and one contributing factor is the limited number of licensing officers. Licensing data form a time series with monthly observations. The Artificial Neural Network (ANN) and Support Vector Regression (SVR) are used as machine learning techniques to predict licensing patterns from the time-series data. For both dataset 1 and dataset 2, the data were split into 70% training and 30% testing, on the consideration that the training set must be larger than the testing set. The results showed that, for dataset 1, the ANN multilayer perceptron (MLP) performed better than SVR, with MSE, MAE, and RMSE values of 251.09, 11.45, and 15.84, respectively. For dataset 2, SVR with a linear kernel performed better than the MLP, with MSE, MAE, and RMSE values of 1839.93, 32.80, and 42.89. Dataset 2 is the one used to predict the number of permits. The study also used the Simple Linear Regression (SLR) method to examine the causal relationship between the number of licenses issued and the number of licensing service officers. This relationship turned out to be not very significant, because other factors also affect the number of licenses issued.
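The error metrics reported in the abstract above (MSE, MAE, RMSE) are standard and can be computed as follows. The monthly counts below are toy values for illustration, not the study's data.

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """Return (MSE, MAE, RMSE) for a forecast against observations."""
    err = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    mse = np.mean(err ** 2)
    mae = np.mean(np.abs(err))
    return mse, mae, np.sqrt(mse)

# Toy monthly licensing counts (hypothetical, not the paper's data)
y_true = [120, 135, 150, 160]
y_pred = [118, 140, 145, 162]
mse, mae, rmse = regression_metrics(y_true, y_pred)
print(mse, mae, rmse)   # 14.5 3.5 3.807...
```

Because RMSE is the square root of MSE, the two always rank models identically; MAE can rank them differently when errors have outliers, which is why studies like this one report all three.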


2017 ◽  
Vol 4 (1) ◽  
pp. 160874 ◽  
Author(s):  
Matteo Smerlak ◽  
Bapu Vaitla

Resilience, the ability to recover from adverse events, is of fundamental importance to food security. This is especially true in poor countries, where basic needs are frequently threatened by economic, environmental and health shocks. An empirically sound formalization of the concept of food security resilience, however, is lacking. Here, we introduce a general non-equilibrium framework for quantifying resilience based on the statistical notion of persistence. Our approach can be applied to any food security variable for which high-frequency time-series data are available. We illustrate our method with per capita kilocalorie availability for 161 countries between 1961 and 2011. We find that resilient countries are not necessarily those that are characterized by high levels or less volatile fluctuations of kilocalorie intake. Accordingly, food security policies and programmes will need to be tailored not only to welfare levels at any one time, but also to long-run welfare dynamics.
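The abstract above quantifies resilience through the statistical notion of persistence. A simplified illustration of that idea (not the paper's exact persistence statistic): measure how long, on average, a welfare series stays below a baseline after a shock. The series and baseline below are hypothetical.

```python
import numpy as np

def mean_recovery_time(series, baseline):
    """Simplified persistence-style resilience proxy: the average number of
    consecutive time steps the series spends below `baseline` per excursion.
    (An illustration only, not the paper's exact persistence statistic.)"""
    below = np.asarray(series) < baseline
    durations, run = [], 0
    for b in below:
        if b:
            run += 1
        elif run:
            durations.append(run)
            run = 0
    if run:
        durations.append(run)
    return float(np.mean(durations)) if durations else 0.0

# Toy per capita kilocalorie series with two shocks of different lengths
kcal = [2200, 2210, 2100, 2120, 2205, 2215, 2150, 2160, 2190, 2220]
print(mean_recovery_time(kcal, 2200))   # 2.5: excursions of length 2 and 3
```

Note how this measure is independent of the series' mean level and variance, which is the abstract's point: a country can have high average intake yet slow recovery from shocks.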


Author(s):  
Daniela A. Gomez-Cravioto ◽  
Ramon E. Diaz-Ramos ◽  
Francisco J. Cantu-Ortiz ◽  
Hector G. Ceballos

Abstract
To understand and address the spread of the SARS-CoV-2 epidemic, machine learning offers fundamental tools. This study presents the use of machine learning techniques for projecting COVID-19 infections and deaths in Mexico. The research has three main objectives: first, to identify which function best fits the growth of the infected population in Mexico; second, to determine the feature importance of climate and mobility; third, to compare the results of a traditional time-series statistical model with a modern machine learning approach. The motivation for this work is to support health care providers in their preparation and planning. The methods compared are linear, polynomial, and generalized logistic regression models for describing the growth of COVID-19 incidence in Mexico. Additionally, machine learning and time-series techniques are used to identify feature importance and to forecast daily cases and fatalities. The study uses the publicly available data sets from the Johns Hopkins University of Medicine in conjunction with the mobility rates obtained from Google's Mobility Reports and climate variables acquired from the Weather Online API. The results suggest that the logistic growth model best fits the pandemic's behavior, that climate and mobility variables correlate sufficiently with the disease numbers, and that a long short-term memory (LSTM) network can be exploited for predicting daily cases. Given this, we propose a model to predict daily cases and fatalities for SARS-CoV-2 using time-series, mobility, and weather data.
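The abstract above finds that a logistic growth model best fits cumulative case counts. A minimal sketch of fitting such a curve, under the simplifying assumption that the carrying capacity K is known: the logistic form can then be linearized and fitted by ordinary least squares. The parameter values below are synthetic, not Mexico's actual epidemic parameters.

```python
import numpy as np

# Logistic growth: C(t) = K / (1 + exp(-r * (t - t0)))
# With K known, log(K / C - 1) = -r * t + r * t0 is linear in t.
K, r_true, t0_true = 100_000.0, 0.15, 40.0
t = np.arange(1, 80, dtype=float)
C = K / (1 + np.exp(-r_true * (t - t0_true)))   # noise-free synthetic data

y = np.log(K / C - 1)                  # linearized response
slope, intercept = np.polyfit(t, y, 1)
r_hat = -slope
t0_hat = intercept / r_hat
print(r_hat, t0_hat)                   # recovers 0.15 and 40 on exact data
```

In practice K is unknown and noisy data call for nonlinear least squares over (K, r, t0) jointly, but the linearization shows why early-epidemic data (far from K) constrain the growth rate r much better than the ceiling K.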


2020 ◽  
Author(s):  
Pavan Kumar Jonnakuti ◽  
Udaya Bhaskar Tata Venkata Sai

Sea surface temperature (SST) is a key variable of the global ocean that affects air-sea interaction processes. Forecasts based on statistics and classical machine learning techniques have not succeeded in capturing the spatial and temporal relationships in the time-series data. Therefore, to achieve precision in SST prediction, we propose a deep-learning-based model that produces a more realistic and accurate account of SST behavior by focusing on both space and time. Our hybrid CNN-LSTM model uses multiple processing layers to learn hierarchical representations, implementing 3D and 2D convolutional neural networks to capture spatial features, and additionally uses an LSTM to model the temporal sequence of relations in SST time-series satellite data. Extensive experiments based on historical satellite datasets spanning 1980 to the present over the Indian Ocean region show that the proposed deep-learning-based CNN-LSTM model is highly capable of accurate short- and mid-term daily SST prediction, based on the error estimates of the forecast data sets.

Keywords: Deep Learning, Sea Surface Temperature, CNN, LSTM, Prediction.
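A CNN-LSTM of the kind described above consumes a gridded series as samples of shape (time steps, lat, lon, channels). A sketch of the data preparation step only, using conventional tensor shapes (the convolutional encoder and LSTM themselves are not shown, and the grid size and window length below are arbitrary assumptions):

```python
import numpy as np

def make_spatiotemporal_windows(sst, seq_len, horizon=1):
    """Slice a gridded SST series of shape (time, lat, lon) into supervised
    samples for a CNN-LSTM: inputs of shape (n, seq_len, lat, lon, 1) and
    targets that are the full grid `horizon` steps ahead."""
    n = sst.shape[0] - seq_len - horizon + 1
    X = np.stack([sst[i:i + seq_len] for i in range(n)])[..., np.newaxis]
    y = np.stack([sst[i + seq_len + horizon - 1] for i in range(n)])
    return X, y

# Toy daily SST grid: 100 days on an 8x12 lat-lon patch (synthetic values)
sst = np.random.default_rng(1).normal(28.0, 0.5, size=(100, 8, 12))
X, y = make_spatiotemporal_windows(sst, seq_len=7)
print(X.shape, y.shape)   # (93, 7, 8, 12, 1) (93, 8, 12)
```

The trailing channel axis lets the convolutional layers treat each daily SST map as a one-channel image, while the leading sequence axis is what the LSTM iterates over.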


Entropy ◽  
2019 ◽  
Vol 22 (1) ◽  
pp. 49
Author(s):  
Mariano Lemus ◽  
João P. Beirão ◽  
Nikola Paunković ◽  
Alexandra M. Carvalho ◽  
Paulo Mateus

Biomedical signals constitute time series that machine learning techniques can classify. These signals are complex, comprising measurements of several features over a potentially extended period. Characterizing whether the data can anticipate prediction is an essential task in time-series mining. The ability to obtain information in advance, by having early knowledge about a specific event, may be of great utility in many areas. Early classification arises as an extension of the time-series classification problem, given the need to obtain a reliable prediction as soon as possible. In this work, we propose an information-theoretic method, named Multivariate Correlations for Early Classification (MCEC), to characterize the early-classification opportunity of a time series. Experimental validation is performed on synthetic and benchmark data, confirming the ability of the MCEC algorithm to trade off accuracy against earliness on a wide spectrum of time-series data, such as those collected from sensors, images, spectrographs, and electrocardiograms.
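The accuracy-versus-earliness trade-off discussed above can be made concrete with a toy experiment: classify series using only a growing prefix of each one. The nearest-centroid classifier and synthetic two-class data below are stand-ins for illustration, not the MCEC algorithm itself.

```python
import numpy as np

def prefix_accuracy(train_X, train_y, test_X, test_y, prefix_len):
    """Nearest-centroid classification using only the first `prefix_len`
    points of each series -- a toy stand-in for early classification."""
    centroids = {c: train_X[train_y == c, :prefix_len].mean(axis=0)
                 for c in np.unique(train_y)}
    correct = 0
    for x, label in zip(test_X, test_y):
        pred = min(centroids,
                   key=lambda c: np.linalg.norm(x[:prefix_len] - centroids[c]))
        correct += pred == label
    return correct / len(test_y)

# Toy two-class series: class 1 ramps upward, class 0 stays flat (plus noise)
rng = np.random.default_rng(2)
t = np.linspace(0, 1, 50)
X = np.array([t * c + rng.normal(0, 0.05, 50) for c in [0, 1] * 40])
y = np.array([0, 1] * 40)
train_X, train_y, test_X, test_y = X[:40], y[:40], X[40:], y[40:]

# Accuracy typically rises as more of each series is observed
accs = {L: prefix_accuracy(train_X, train_y, test_X, test_y, L)
        for L in (5, 25, 50)}
print(accs)
```

An early-classification method like MCEC looks for the shortest prefix at which accuracy has effectively saturated, so a decision can be issued before the full series arrives.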


2008 ◽  
Vol 6 (39) ◽  
pp. 925-940 ◽  
Author(s):  
Melissa Vellela ◽  
Hong Qian

Schlögl's model is the canonical example of a chemical reaction system that exhibits bistability. Because the biological examples of bistability and switching behaviour are increasingly numerous, this paper presents an integrated deterministic, stochastic and thermodynamic analysis of the model. After a brief review of the deterministic and stochastic modelling frameworks, the concepts of chemical and mathematical detailed balance are discussed, and non-equilibrium conditions are shown to be necessary for bistability. Thermodynamic quantities such as the flux, chemical potential and entropy production rate are defined and compared across the two models. In the bistable region, the stochastic model exhibits an exchange of global stability between the two stable states under changes in the pump parameters and volume size. The stochastic entropy production rate shows a sharp transition that mirrors this exchange. A new hybrid model that includes continuous diffusion and discrete jumps is suggested to deal with the multiscale dynamics of the bistable system. Accurate approximations of the exponentially small eigenvalue associated with the time scale of this switching, and of the full time-dependent solution, are calculated using Matlab. A breakdown of previously known asymptotic approximations on small volume scales is observed through comparison with these and Monte Carlo results. Finally, the appendix illustrates how the diffusion approximation of the chemical master equation can fail to represent correctly the mesoscopically interesting steady-state behaviour of the system.
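The deterministic side of the Schlögl model described above reduces to a cubic rate law whose roots are the steady states, two stable and one unstable in the bistable region. A sketch, with rate constants chosen purely for illustration so that the cubic factors nicely (these are not the paper's parameter values):

```python
import numpy as np

# Deterministic Schlogl model: A + 2X <=> 3X, X <=> B gives the rate law
#   dx/dt = -k2*x^3 + (k1*a)*x^2 - k3*x + (k4*b)
# Illustrative constants chosen so the cubic has three positive roots
# (bistability); they are not the parameters used in the paper.
k2, k1a, k3, k4b = 1.0, 7.0, 14.0, 8.0
coeffs = [-k2, k1a, -k3, k4b]

steady_states = np.sort(np.roots(coeffs).real)
print(steady_states)                      # [1. 2. 4.]

# Linear stability: f'(x) < 0 is stable; the middle state is unstable
fprime = np.polyval(np.polyder(coeffs), steady_states)
print(fprime)                             # negative, positive, negative
```

The stochastic analysis in the paper goes beyond this picture: the chemical master equation assigns the two stable states different weights, and which one dominates can switch with volume and pump parameters, which the deterministic cubic alone cannot capture.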


The stock market has been one of the primary revenue streams for many people for years. The stock market is often incalculable and uncertain; therefore, predicting its ups and downs is an uphill task even for financial experts, who have been trying to tackle it with little success. But it is now possible to predict stock markets thanks to rapid improvements in technology, which have led to better processing speed and more accurate algorithms. It is necessary to forswear the misconception that stock market prediction is only meant for people with expertise in finance; hence, an application can be developed to guide the user about the tempo of the stock market and the risk associated with it. The prediction of stock market prices is a complicated task, and various techniques are used to solve the problem; this paper investigates some of these techniques and compares the accuracy of each. Forecasting time-series data is an important topic in economics, statistics, finance, and business. Of the many techniques for forecasting time-series data, such as the autoregressive (AR), moving average (MA), and autoregressive integrated moving average (ARIMA) models, it is ARIMA that offers the highest accuracy and precision. With recent advances in the computational power of processors and in machine learning and deep learning, new algorithms can be devised to tackle the problem of predicting the stock market. This paper investigates one such machine learning algorithm for forecasting time-series data, the Long Short-Term Memory (LSTM) network. It is compared with traditional algorithms such as ARIMA to determine how superior the LSTM is to traditional methods for predicting the stock market.
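The AR component at the heart of the ARIMA family mentioned above can be sketched in a few lines: fit an AR(1) model by least squares on lagged pairs and produce a one-step forecast. The data here are synthetic with a known coefficient, not real stock prices.

```python
import numpy as np

# Minimal sketch of the "AR" building block behind ARIMA: fit
#   x_t = phi * x_{t-1} + eps_t
# by least squares on synthetic data with a known phi.
rng = np.random.default_rng(3)
phi_true, n = 0.8, 5_000
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + rng.normal()

# Least-squares estimate of phi from lagged pairs (x_{t-1}, x_t)
phi_hat = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])
forecast = phi_hat * x[-1]      # one-step-ahead prediction
print(phi_hat)                  # close to 0.8
```

A full ARIMA adds differencing (the "I") and moving-average error terms (the "MA"); an LSTM replaces this fixed linear recurrence with a learned nonlinear one, which is the comparison the paper carries out.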

