Nonextensive Entropy
Latest Publications


TOTAL DOCUMENTS

23
(FIVE YEARS 0)

H-INDEX

0
(FIVE YEARS 0)

Published By Oxford University Press

9780195159769, 9780197562024

Author(s):  
Juan Pérez-Mercader

We present a scenario that is useful for describing hierarchies within classes of many-component systems. Although this scenario may be quite general, it will be illustrated in the case of many-body systems whose space-time evolution can be described by a class of stochastic parabolic nonlinear partial differential equations. The stochastic component we will consider is in the form of additive noise, but other forms of noise, such as multiplicative noise, may also be incorporated. It will turn out that hierarchical behavior is only one of a class of asymptotic behaviors that can emerge when an out-of-equilibrium system is coarse grained. This phenomenology can be analyzed and described using the renormalization group (RG) [6, 15]. It corresponds to the existence of complex fixed points for the parameters characterizing the system. As is well known (see, for example, Hochberg and Pérez-Mercader [8] and Onuki [12] and the references cited there), parameters such as viscosities, noise couplings, and masses evolve with scale. In other words, their values depend on the scale of resolution at which the system is observed. These scale-dependent parameters are called effective parameters. The evolutionary changes due to coarse graining (or, equivalently, to changes in system size) are analyzed using the RG and translate into differential equations for the probability distribution function [8] of the many-body system, or for the n-point correlation functions and the effective parameters. Under certain conditions and for systems away from equilibrium, some of the fixed points of the equations describing the scale dependence of the effective parameters can be complex; this translates into complex anomalous dimensions for the stochastic fields and, therefore, the correlation functions of the field develop a complex piece.
We will see that basic requirements, such as the reality of probabilities and maximal correlation, lead, in the case of complex fixed points, to hierarchical behavior. This is a first step toward generalizing extensive behavior, as described by real power laws, to the case of complex exponents and toward the study of hierarchical behavior.
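As a minimal numerical illustration of the last point (ours, not the chapter's; the exponent values below are arbitrary), a power law with a complex exponent is equivalent to a real power law modulated by log-periodic oscillations, whose preferred scales form a discrete geometric hierarchy:

```python
import math

# The real part of a power law with complex exponent a + i*b is
#     Re(x**(a + 1j*b)) = x**a * cos(b * ln x),
# i.e., a power law modulated by log-periodic oscillations.  The modulation
# repeats whenever x is multiplied by lam = exp(2*pi/b), so the preferred
# scales form a discrete geometric hierarchy x_n = x0 * lam**n.
a, b = -0.5, 3.0                     # illustrative values, not from the chapter
lam = math.exp(2 * math.pi / b)      # the hierarchical scale ratio

def f(x):
    return (x ** complex(a, b)).real

for n in range(3):
    x = lam ** n
    print(x, f(x) / x ** a)          # the cosine factor is the same each time
```

The scale ratio lam is fixed by the imaginary part of the exponent alone, which is why a complex fixed point selects a discrete hierarchy of scales rather than a continuum.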


Author(s):  
Andrea Rapisarda ◽  
Vito Latora

The Boltzmann-Gibbs formulation of equilibrium statistical mechanics depends crucially on the nature of the Hamiltonian of the N-body system under study, but this fact is clearly stated only in the introductions of textbooks and, in general, it is very soon neglected. In particular, the very basic postulate of equilibrium statistical mechanics, the famous Boltzmann principle S = k log W of the microcanonical ensemble, assumes that dynamics can be automatically and easily taken into account, although this is not always justified, as Einstein himself realized [20]. On the other hand, the Boltzmann-Gibbs canonical ensemble is valid only for sufficiently short-range interactions and does not necessarily apply, for example, to gravitational or unscreened Coulomb fields, for which the usually assumed entropy extensivity postulate is not valid [5]. In 1988, Constantino Tsallis proposed a generalized thermostatistics formalism based on a nonextensive entropic form [24]. Since then, this new theory has found an increasing number of successful applications in different fields (for some recent examples, see Abe and Suzuki [1], Baldovin and Robledo [4], Beck et al. [8], Kaniadakis et al. [12], Latora et al. [16], and Tsallis et al. [25]) and seems to be the best candidate for a generalized thermodynamic formalism that should be valid when nonextensivity, long-range correlations, and fractal structures in phase space cannot be neglected: in other words, when the dynamics play a nontrivial role [11] and fluctuations are quite large and non-Gaussian [6, 7, 8, 24, 26]. In this contribution we consider a nonextensive N-body classical Hamiltonian system with infinite-range interaction, the so-called Hamiltonian mean field (HMF) model, which has been intensively studied in the last several years [3, 13, 14, 15, 17, 18, 19].
The out-of-equilibrium dynamics of the model exhibits a series of anomalies like negative specific heat, metastable states, vanishing Lyapunov exponents, and non-Gaussian velocity distributions. After a brief overview of these anomalies, we show how they can be interpreted in terms of nonextensive thermodynamics according to the present understanding.
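For reference, the nonextensive entropic form proposed by Tsallis [24] can be sketched in a few lines; the probability distribution below is an arbitrary illustration of ours, and the q → 1 limit recovers the Boltzmann-Gibbs entropy:

```python
import math

def tsallis_entropy(p, q, k=1.0):
    """Tsallis q-entropy S_q = k * (1 - sum_i p_i**q) / (q - 1).
    In the limit q -> 1 it recovers the Boltzmann-Gibbs entropy
    S = -k * sum_i p_i * ln(p_i)."""
    if abs(q - 1.0) < 1e-12:
        return -k * sum(pi * math.log(pi) for pi in p if pi > 0)
    return k * (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.5, 0.3, 0.2]                    # an arbitrary illustrative distribution
print(tsallis_entropy(p, 2.0))         # the q = 2 entropy
print(tsallis_entropy(p, 1.0))         # the Boltzmann-Gibbs limit
print(tsallis_entropy(p, 1.000001))    # approaches the BG value as q -> 1
```

For q ≠ 1 this entropy is nonextensive: for independent subsystems it does not simply add, which is the defining departure from the Boltzmann-Gibbs framework discussed above.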


Author(s):  
N. Scafetta ◽  
P. Grigolini

A complex process is often a balance between nonscaling and scaling components. We show how the nonextensive Tsallis q-entropy indicator may be interpreted as a measure of the nonscaling condition in time series. This is done by applying the nonextensive entropy formalism to the diffusion entropy analysis (DEA). We apply the analysis to the study of the teen birth phenomenon. We find that the number of unmarried teen births is strongly influenced by social processes that induce an anomalous memory in the data. This memory is related to the strength of the nonscaling component of the signal and is more intense than that in the married teen birth time series. By using a wavelet multiresolution analysis, we attempt to provide a social interpretation of this effect…. One of the most exciting and rapidly developing areas of modern research is the quantitative study of "complexity." Complexity has special interdisciplinary impacts in the fields of physics, mathematics, information science, biology, sociology, and medicine. No definition of a complex system has been universally embraced, so here we adopt the working definition, "an arrangement of parts so intricate as to be hard to understand or deal with." Therefore, the main goal of the science of complexity is to develop mathematical methods in order to discriminate among the fundamental microscopic and macroscopic constituents of a complex system and to describe their interrelations in a concise way. Experiments usually yield results in the form of time series for physical observables. Typically, these time series contain both a slow regular variation, usually called a "signal," and a rapid erratic fluctuation, usually called "noise." Historically, the techniques applied to processing such time series have been based on equilibrium statistical mechanics and, therefore, they are not applicable to phenomena far from equilibrium.
Among the fluctuating phenomena, a particularly important place is occupied by those phenomena characterized by some type of self-similar or scaling-fractal structures [4]. In this chapter we show that the nonextensive Tsallis q-entropy indicator may be interpreted as a measure of the strength of the nonscaling component of a time series.
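The core step of diffusion entropy analysis can be sketched under simplifying assumptions of our own (overlapping window sums standing in for diffusion trajectories, a fixed-width histogram): for uncorrelated Gaussian noise the exponent δ in the scaling S(t) ≈ A + δ ln t comes out close to 0.5, and anomalous memory of the kind discussed above shifts δ away from that value.

```python
import math
import random
from collections import Counter

def diffusion_entropy(series, window, bin_width=1.0):
    """Shannon entropy of the histogram of overlapping window sums.
    In DEA the window sums play the role of diffusion trajectories x(t)."""
    sums = [sum(series[i:i + window]) for i in range(len(series) - window + 1)]
    counts = Counter(math.floor(s / bin_width) for s in sums)
    n = sum(counts.values())
    return -sum(c / n * math.log(c / n) for c in counts.values())

# For uncorrelated Gaussian noise, S(t) grows as A + delta*ln(t), delta = 0.5.
random.seed(0)
noise = [random.gauss(0.0, 1.0) for _ in range(50000)]
delta = (diffusion_entropy(noise, 64) - diffusion_entropy(noise, 4)) / math.log(16)
print(round(delta, 2))   # close to 0.5
```

A two-point slope estimate is crude; a proper DEA fits S(t) against ln t over many window sizes, but the mechanism is the same.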


Author(s):  
Sergio A. Cannas ◽  
Diana E. Marco

Species in an ecosystem can be classified as natives or exotics. Native species are those that have coevolved in the ecosystem, while exotic ones have not. The introduction of exotic species into an ecosystem is usually associated with human influence, which can be intentional or accidental. Some exotic species do not survive, at least not without artificial assistance. But some others do quite well on their own in a new environment. Exotic species may have no natural predators in the new environment or they may make better use of the natural resources than the natives, so they spread in the new territory and compete with some of the natives, who eventually become extinct. Exotic species that successfully establish and spread in an ecosystem are called invaders. The process by which an invader arrives and spreads into the new territory is called biological invasion. It is worth mentioning that, although invaders are usually exotic species, sometimes native species may also behave like invaders. That is, if an ecosystem suffers a strong disturbance, like fire or heavy grazing, some native species whose populations were originally stable may start to grow, outcompeting other native species. There are many examples of introduced species that became invaders, ranging from bacteria to cattle. Accidental or intentional introductions by humans are responsible for most of the present biological invasions, threatening the structure and functioning of many ecosystems. There are many effects associated with biological invasions, perhaps the most important one being the possible loss of biodiversity in the long term. But biological invasions may also introduce changes in different environmental traits, like climate, hydrology (invaders may consume more water than natives), and soil composition (for instance, some plants take up salt from soil and deposit it on the surface, making it unsuitable for some native species). 
All these changes have strong economical impacts, considering their influences in agriculture, forestry, and public health [9]. Hence, it is of interest to understand this phenomenon in order to predict the potential invasiveness of a species before its introduction in an ecosystem, and to develop strategies of control for invasive species that have already been introduced.


Author(s):  
T. J. P. Penna ◽  
J. C. Sartorelli

Here we present our attempt to characterize a time series of drop-to-drop intervals from a dripping faucet as a nonextensive system. We found long-range anticorrelated behavior as evidence of memory in the dynamics of our system. The hypothesis of faucets dripping at the edge of chaos is reinforced by results on the linear rate of increase of the nonextensive Tsallis entropy. We also present some similarities between dripping faucets and healthy hearts…. Many systems in Nature exhibit complex or chaotic behaviors. Chaotic behavior is characterized by short-range correlations and strong sensitivity to small changes in the initial conditions. Complex behavior is characterized by the presence of long-range power-law correlations in its dynamics. In the latter, the sensitivity to a perturbation of the initial condition is weaker than in the former. Because the probability densities are frequently described as inverse power laws, the variance and the mean often diverge. Although it is hard to predict the long-term behavior of such systems, it is still possible to get some information from them and even to find similarities between two apparently very distinct systems. Tools from statistical physics are frequently used because the main task here is to deal with diverse macroscopic phenomena and to try to explain them, starting with the microscopic interactions among many individual components. The microscopic interactions are not necessarily complicated, but the collective behavior can determine a rather intricate macroscopic description. Nonextensive statistical mechanics, since its proposal in 1988 [27], has been applied to an impressive collection of systems in which spatial or temporal long-range correlations appear. Hence, it can also become a useful tool to characterize such systems. Here, we present an attempt to use this formalism to understand the intriguing behavior of an apparently simple system: a dripping faucet.
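Anticorrelated memory of the kind mentioned above is commonly probed with correlation or fluctuation analysis. As a hedged sketch (the toy series below is ours, not the drop-interval data, and a full study would use detrended fluctuation analysis), a sample autocorrelation function makes the idea concrete:

```python
def autocorrelation(x, lag):
    """Sample autocorrelation at a given lag; persistently negative values
    signal the kind of anticorrelated memory discussed above."""
    n = len(x)
    mean = sum(x) / n
    var = sum((xi - mean) ** 2 for xi in x) / n
    cov = sum((x[i] - mean) * (x[i + lag] - mean)
              for i in range(n - lag)) / (n - lag)
    return cov / var

# A toy anticorrelated sequence: every step reverses the previous one.
alternating = [(-1.0) ** i for i in range(1000)]
print(autocorrelation(alternating, 1))    # -1.0: perfect anticorrelation
```

In real drip-interval data the anticorrelation is of course far weaker than in this toy sequence, and its slow decay with lag is what marks the memory as long-range.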


Author(s):  
Fulvio Baldovin

We discuss the sensitivity to initial conditions and the entropy production of low-dimensional conservative maps, focusing on situations where the phase space presents complex (fractal-like) structures. We analyze numerically the standard map as a specific example and we observe a scenario that presents appealing analogies with anomalies detected in long-range Hamiltonian systems. We see how the Tsallis nonextensive formalism handles this situation both from a dynamical and from a statistical mechanics point of view…. In recent years, the Tsallis extension of the Boltzmann-Gibbs (BG) statistical mechanics [9, 26], usually referred to as nonextensive (NE) statistical mechanics, has become an intense and exciting research area (see, e.g., Tsallis [25]). The q-exponential distribution functions that emerge as a consequence of the NE formalism have been applied to an impressive variety of problems, ranging from turbulence to high-energy physics, epilepsy, protein folding, and financial analysis. Yet the foundation of this formalism, as well as the definition of its area of applicability, is still not completely understood, and it stands as a present challenge in the affirmation of the whole proposal. An intensive effort is currently being made to investigate this point, precisely in trying to understand: (1) which mechanisms lead to a crisis of the BG formalism; and (2) whether, in these cases, the NE formalism provides a "way out" of some of the problems. A possible approach to these questions comes from the study of the underlying dynamics that gives the basis for a statistical mechanical treatment of the system. This idea is not new. Einstein, in his critical remark about the validity of the Boltzmann principle [10], was one of the first to call attention to the relevance of a dynamical foundation of statistical mechanics. Another fundamental contribution is Krylov's seminal work [14] on the mixing properties of dynamical systems.
In one-dimensional (dissipative) systems, intensive effort has been made to analyze the properties of the systems at the edge of chaos, i.e., at the critical point that marks the transition between chaoticity and regularity [6, 8, 16, 18, 19, 23, 27].
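The standard map analyzed in the chapter is simple to iterate. The following sketch (parameter values are our own illustrative choices) contrasts the explosive separation of nearby orbits in the strongly chaotic regime with the weak growth in a near-regular one, which is the sensitivity-to-initial-conditions question discussed above:

```python
import math

TWO_PI = 2.0 * math.pi

def standard_map(theta, p, K):
    """One step of the Chirikov standard map, a conservative 2-D map:
    p' = p + K*sin(theta), theta' = theta + p' (both taken mod 2*pi)."""
    p = (p + K * math.sin(theta)) % TWO_PI
    theta = (theta + p) % TWO_PI
    return theta, p

def torus_dist(a, b):
    """Distance between two angles on the circle."""
    d = abs(a - b) % TWO_PI
    return min(d, TWO_PI - d)

def max_separation(K, steps=50, d0=1e-8):
    """Largest separation reached by two orbits started a distance d0 apart,
    relative to d0."""
    t1, p1 = 1.0, 1.0
    t2, p2 = 1.0 + d0, 1.0
    dmax = 0.0
    for _ in range(steps):
        t1, p1 = standard_map(t1, p1, K)
        t2, p2 = standard_map(t2, p2, K)
        dmax = max(dmax, math.hypot(torus_dist(t1, t2), torus_dist(p1, p2)))
    return dmax / d0

print(max_separation(K=10.0))   # strongly chaotic: explosive growth
print(max_separation(K=0.1))    # mostly regular: slow, weak growth
```

At the edge of chaos the separation grows neither exponentially nor linearly but as a power law, which is exactly where the q-exponential description enters.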


Author(s):  
Sumiyoshi Abe

Nonadditive classical information theory is developed in the axiomatic framework and then translated into quantum theory. The nonadditive conditional entropy associated with the Tsallis entropy indexed by q is given in accordance with the formalism of nonextensive statistical mechanics. The theory is applied to the problems of quantum entanglement and separability of the Werner-Popescu-type mixed state of a multipartite system, in order to examine whether it has any points superior to the additive theory with the von Neumann entropy realized in the limit q → 1. It is shown that the nonadditive theory can lead to the necessary and sufficient condition for separability of the Werner-Popescu-type state, whereas the von Neumann theory can give only a much weaker condition…. Tsallis' nonextensive generalization of Boltzmann-Gibbs statistical mechanics [3, 15, 16] and its success in describing the behavior of a large class of complex systems naturally lead to the question of whether information theory can also admit an analogous generalization. If the answer is affirmative, then that will be of particular importance in connection with the problem of quantum entanglement and the quantum theory of measurement [6, 8], in which the need for a nonadditive information measure and information content has been suggested. One should also remember that there exists a conceptual similarity between a complex system and an entangled quantum system. In these systems, a "part" is indivisibly connected with the rest. An external operation on any part drastically influences the whole system, in general. Thus, the traditional reductionistic approach to an understanding of the nature of such a system may not work efficiently. In this chapter, we report a recent development in nonadditive quantum information theory based on the Tsallis entropy indexed by q [15] and its associated nonadditive conditional entropy [1].
This theory includes the ordinary additive theory with the von Neumann entropy as a special limiting case, q → 1. To see if it has points superior to the additive theory, we apply it to the problems of separability and quantum entanglement.
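The nonadditivity underlying this formalism can be checked directly: for a factorized (independent) joint distribution, the Tsallis entropy obeys the pseudo-additivity rule S_q(A,B) = S_q(A) + S_q(B) + (1 − q) S_q(A) S_q(B). A minimal classical check with arbitrary marginals of our own choosing:

```python
def tsallis(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i**q) / (q - 1), with k = 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

q = 0.5
A = [0.7, 0.3]                       # marginal distribution of subsystem A
B = [0.6, 0.4]                       # marginal distribution of subsystem B
AB = [a * b for a in A for b in B]   # factorized (independent) joint state

lhs = tsallis(AB, q)
rhs = tsallis(A, q) + tsallis(B, q) + (1 - q) * tsallis(A, q) * tsallis(B, q)
print(abs(lhs - rhs) < 1e-12)   # True: the pseudo-additivity rule holds
```

The identity is exact for any factorized distribution, since the sum of joint probabilities raised to the power q factorizes; for q → 1 the cross term vanishes and ordinary additivity is recovered.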


Author(s):  
Murray Gell-Mann ◽  
Seth Lloyd

It would take a great many different concepts—or quantities—to capture all of our notions of what is meant by complexity (or its opposite, simplicity). However, the notion that corresponds most closely to what we mean by complexity in ordinary conversation and in most scientific discourse is "effective complexity." In nontechnical language, we can define the effective complexity (EC) of an entity as the length of a highly compressed description of its regularities [6, 7, 8]. For a more technical definition, we need a formal approach both to the notion of minimum description length and to the distinction between regularities and those features that are treated as random or incidental. We can illustrate with a number of examples how EC corresponds to our intuitive notion of complexity. We may call a novel complex if it has a great many different characters, scenes, subplots, and so on, so that the regularities of the novel require a long description. The United States tax code is complex, since it is very long and each rule in it is a regularity. Neckties may be simple, like those with regimental stripes, or complex, like some of those designed by Jerry Garcia. From time to time, an author presents a supposedly new measure of complexity (such as the "self-dissimilarity" of Wolpert and Macready [17]) without recognizing that when carefully defined it is just a special case of effective complexity. Like some other concepts sometimes identified with complexity, the EC of an entity is context-dependent, even subjective to a considerable extent. It depends on the coarse graining (level of detail) at which the entity is described, the language used to describe it, the previous knowledge and understanding that are assumed, and, of course, the nature of the distinction made between regularity and randomness. Like other proposed "measures of complexity," EC is most useful when comparing two entities, at least one of which has a large value of the quantity in question.


Author(s):  
Marcelo A. Montemurro

Human language evolved by natural mechanisms into an efficient system capable of coding and transmitting highly structured information [12, 13, 14]. As a remarkably complex system it allows many levels of description across its organizational hierarchy [1, 11, 18]. In this context, statistical analysis stands as a valuable tool for revealing robust structural patterns that may have resulted from its long evolutionary history. In this chapter we shall address the statistical regularities of human language at its most basic level of description, namely the rank-frequency distribution of words. Around 1932, the philologist George Zipf [6, 19, 20] noted the manifestation of several robust power-law distributions arising in different realms of human activity. Among them, the most striking was undoubtedly the one referring to the distribution of word frequencies in human languages. The best way to introduce Zipf's law for words is by means of a concrete example. Let us take a literary work, say, James Joyce's Ulysses, and perform some basic statistics on it, which simply consists of counting all the words present in the text and noting how many occurrences each distinct word form has. For this particular text we should arrive at the following numbers: the total number of words N = 268,112, and the number of different word forms V = 28,838. We can now order the list of different words according to decreasing number of occurrences, and we can assign to each word a rank index s equal to its position in the list, starting from the most frequent word. Some general features of the rank-ordered list of words can be mentioned at this point. First, the top-rank words are functional components of language devoid of direct meaning, such as the article the and prepositions, for instance. A few ranks down the list, words more related to the contents of the text start to appear.
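The counting procedure described above is easy to sketch; the tokenization rule and the toy sentence below are our own illustrative choices, not Joyce's text:

```python
import re
from collections import Counter

def rank_frequency(text):
    """Count word-form occurrences and return (rank, word, count) triples,
    with rank 1 assigned to the most frequent word form."""
    words = re.findall(r"[a-z']+", text.lower())
    return [(r, w, c)
            for r, (w, c) in enumerate(Counter(words).most_common(), start=1)]

sample = "the cat sat on the mat and the dog sat on the cat"
for rank, word, count in rank_frequency(sample)[:3]:
    print(rank, word, count)
# Zipf's law predicts count(rank) ~ C / rank**a, with a close to 1,
# for large natural-language texts; a 13-word toy sentence only shows
# the mechanics of the rank-ordered list.
```

Even this toy list exhibits the qualitative feature noted above: the top rank is taken by a functional word (the article "the"), with content words appearing further down.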


Author(s):  
Lukasz Debowski

In this chapter, we identify possible links between theoretical computer science, coding theory, and statistics reinforced by subextensivity of Shannon entropy. Our specific intention is to address these links in a way that may arise from a rudimentary theory of human learning from language communication. The semi-infinite stream of language production that a human being experiences during his or her life will be called simply parole (= "speech," [7]). Although modern computational linguistics tries to explain human language competence in terms of explicit mathematical models in order to enable its machine simulation [17, 20], modeling parole itself (widely known as "language modeling") turns out to be nontrivial in a rather obscure way. When a behavior of parole that improves its prediction is newly observed in a finite portion of the empirical data, it often suggests only minor improvements to the current model. When we use larger portions of parole to test the freshly improved model, this model always fails seriously, but in a different way. How to provide the necessary updates to the model, without harming its integrity, is an important problem that experts must continually solve. Is there any sufficiently good definition of parole that is ready-made for industrial applications? Although not all readers of human texts learn continuously, parole is a product of those who can and often do learn throughout their lives. Thus, we assume that the amount of knowledge generalizable from a finite window of parole should diverge to infinity when the length of the window also tends to infinity. Many linguists assume that a very distinct part of the generalizable knowledge is "linguistic knowledge," which can be finite in principle. Nevertheless, for the sake of good modeling of parole in practical applications, it is useless to restrict ourselves solely to "finite linguistic knowledge" [6, 22].
Inspired by Crutchfield and Feldman [5], we will call any processes (distributions of infinite linear data) "finitary" when the amount of knowledge generalizable from them is finite, and "infinitary" when it is infinite. The crucial point is to accurately define the notion of knowledge generalized from a data sample. According to the principle of minimum description length (MDL), generalizable knowledge is the definition of such representation for the data which yields the shortest total description. In this case, we will define infinitarity as computational infinitarity (CIF).
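As a crude, hedged illustration of the MDL idea (compression standing in for the shortest total description; this is our own stand-in, not the author's construction), compressed size cleanly separates a strongly regular sample, whose regularity is generalizable, from incompressible noise:

```python
import os
import zlib

def description_length(data: bytes) -> int:
    """Compressed size in bytes: a crude, computable proxy for the
    minimum description length of the data."""
    return len(zlib.compress(data, 9))

regular = b"ab" * 5000                 # a strongly regular (finitary) sample
incompressible = os.urandom(10000)     # noise: nothing to generalize

print(description_length(regular))        # tiny: the regularity compresses away
print(description_length(incompressible)) # close to 10000 bytes
```

In MDL terms, the short description of the regular sample is the generalizable knowledge; an infinitary process in the sense above is one for which this compressed part keeps growing without bound as the data window lengthens.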

