Statistics of the number of equilibria in random social dilemma evolutionary games with mutation

2021 ◽  
Vol 94 (8) ◽  
Author(s):  
Manh Hong Duong ◽  
The Anh Han

In this paper, we study analytically the statistics of the number of equilibria in pairwise social dilemma evolutionary games with mutation, where the game's payoff entries are random variables. Using the replicator-mutator equations, we derive explicit formulas for the probability distributions of the number of equilibria, as well as for other statistical quantities. This analysis is highly relevant when one knows the nature of the social dilemma at hand (e.g., cooperation vs coordination vs anti-coordination) but measuring the exact values of its payoff entries is difficult. Our analysis shows clearly how the mutation probability shapes these probability distributions, providing insight into how varying this important factor impacts the overall behavioural or biological diversity of the underlying evolutionary systems.
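The setting described above can be illustrated numerically. The sketch below is not the paper's analytical derivation but a hedged Monte Carlo counterpart: for a two-strategy game with payoff matrix [[a, b], [c, d]] and uniform mutation probability q, the replicator-mutator dynamics on the frequency x of the first strategy reduce to a cubic polynomial whose roots in [0, 1] are the equilibria. The choice of i.i.d. standard normal payoff entries is an assumption for illustration, not necessarily the distribution used in the paper.

```python
import numpy as np

def num_equilibria(a, b, c, d, q, tol=1e-9):
    """Count equilibria in [0, 1] of the two-strategy replicator-mutator
    dynamics x' = (1-q)*x*fA(x) + q*(1-x)*fB(x) - x*fbar(x), for payoff
    matrix [[a, b], [c, d]] and mutation probability q."""
    P = np.polynomial.Polynomial
    x = P([0.0, 1.0])
    fA = P([b, a - b])               # fA(x) = a*x + b*(1-x)
    fB = P([d, c - d])               # fB(x) = c*x + d*(1-x)
    fbar = x * fA + (1 - x) * fB     # mean payoff
    g = (1 - q) * x * fA + q * (1 - x) * fB - x * fbar
    roots = g.roots()
    real = roots[np.abs(roots.imag) < tol].real
    return int(np.sum((real >= -tol) & (real <= 1 + tol)))

def equilibria_distribution(q, n=2000, seed=0):
    """Monte Carlo estimate of the probability distribution of the number
    of equilibria when payoff entries are i.i.d. standard normal."""
    rng = np.random.default_rng(seed)
    counts = np.zeros(4, dtype=int)  # a cubic has at most 3 roots
    for a, b, c, d in rng.standard_normal((n, 4)):
        counts[num_equilibria(a, b, c, d, q)] += 1
    return counts / n
```

As a sanity check, with q = 0 the dynamics reduce to the plain replicator equation x' = x(1-x)(fA - fB): a coordination (stag hunt) game such as [[2, 0], [0, 1]] then has three equilibria (0, 1/3, 1), while a prisoner's dilemma such as [[3, 0], [5, 1]] has only the two boundary equilibria.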


Author(s):  
RONALD R. YAGER

We examine the problem of obtaining a variance-like measure associated with probability distributions over ordinal sets; we call such measures dissonance measures. We specify some general properties desired of these measures and point out the central role of the cumulative distribution function in formulating the concept of dissonance. We then introduce some specific examples of measures of dissonance.
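To make the role of the cumulative distribution concrete, here is one well-known CDF-based ordinal-dispersion form; it is a plausible instance of a dissonance measure, not necessarily one of the specific measures introduced in the paper.

```python
import numpy as np

def cdf_dissonance(p):
    """A variance-like dissonance measure for a probability distribution
    over an ordered set of categories, built from the CDF:
        D(p) = sum over interior cut-points i of F_i * (1 - F_i).
    D = 0 iff all mass sits on a single category; D is maximal when the
    mass is split evenly between the two extreme categories."""
    F = np.cumsum(np.asarray(p, dtype=float))[:-1]  # interior CDF values
    return float(np.sum(F * (1.0 - F)))
```

Unlike entropy, which ignores category order, this measure ranks a 50/50 split between the two extreme categories as more dissonant than the uniform distribution, since every interior cut-point F_i sits at the worst-case value 1/2.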


Author(s):  
M. Vidyasagar

This chapter provides an introduction to some elementary aspects of information theory, including entropy in its various forms. Entropy quantifies the level of uncertainty associated with a random variable (or, more precisely, with its probability distribution). When there are two or more random variables, it is worthwhile to study the conditional entropy of one random variable with respect to another. The final concept is relative entropy, also known as the Kullback–Leibler divergence, which measures the "disparity" between two probability distributions. The chapter first considers convex and concave functions before discussing the properties of the entropy function, conditional entropy, the uniqueness of the entropy function, and the Kullback–Leibler divergence.
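The three quantities named above have short, standard definitions, sketched here in Python for finite distributions (using bits, i.e., base-2 logarithms, and the convention 0·log 0 = 0):

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits, with 0*log(0) treated as 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def conditional_entropy(joint):
    """H(Y|X) = H(X,Y) - H(X), for a joint probability table joint[x, y]."""
    joint = np.asarray(joint, dtype=float)
    return entropy(joint.ravel()) - entropy(joint.sum(axis=1))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) in bits; assumes q > 0
    wherever p > 0, otherwise the divergence is infinite."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = p > 0
    return float(np.sum(p[m] * np.log2(p[m] / q[m])))
```

For instance, a fair coin has entropy 1 bit, two independent fair coins give H(Y|X) = H(Y) = 1 bit, and D(p||q) = 0 exactly when p = q, illustrating the "disparity" interpretation.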

