Deriving the Fluctuation Theorem for Information-Transmission Systems Using a Cyclic Chain-Reaction Sequence
Abstract
Chemical chain reactions are pathways that can transmit information, as demonstrated by signal-transduction reactions in cell biology. In this study, we defined entropy as the logarithm of the ratio of chemical species concentrations and derived the channel capacity for information transmission by maximizing this entropy. We assumed that the reaction chain has an orientation, i.e., that the characteristic time of the reverse reaction is sufficiently long. Under this model, the logarithm of the ratio of forward to reverse transition probabilities equals the time-averaged entropy production per unit reaction time, which corresponds to the fluctuation theorem of thermodynamics for the entropy production rate. This result illuminates signal transduction in cells and other biochemical systems and may provide insight into the relation between thermodynamic and information entropy.
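The two relations summarized in the abstract can be sketched in standard fluctuation-theorem notation (an illustrative reconstruction; the symbols $c_i$, $c_j$, $P_{+}$, $P_{-}$, $\langle\dot{s}\rangle$, and $\tau$ are our notation and are not taken verbatim from the paper):

$$
s_{ij} \;=\; \ln\frac{c_j}{c_i},
\qquad
\ln\frac{P_{+}}{P_{-}} \;=\; \langle \dot{s} \rangle \,\tau ,
$$

where $s_{ij}$ is the entropy assigned to the concentration ratio of species $i$ and $j$, $P_{+}$ and $P_{-}$ are the forward and reverse transition probabilities along the oriented chain, $\langle\dot{s}\rangle$ is the time-averaged entropy production rate, and $\tau$ is the reaction time, mirroring the form $P(+\sigma)/P(-\sigma) = e^{\sigma\tau}$ of the thermodynamic fluctuation theorem.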