A HOMOTOPY ALGORITHM FOR SOLVING ENTROPY MAXIMIZATION PROBLEM OF OD MATRIX ESTIMATION

Author(s):  
Wataru MOGI


Entropy ◽  
2019 ◽  
Vol 21 (6) ◽  
pp. 549 ◽  
Author(s):  
Hisa-Aki Tanaka ◽  
Masaki Nakagawa ◽  
Yasutada Oohama

The well-known Hölder inequality has recently been utilized as an essential tool for solving several optimization problems. However, such a role for Hölder's inequality does not seem to have been reported in the context of generalized entropies, including Rényi–Tsallis entropy. Here, we identify a direct link between Rényi–Tsallis entropy and Hölder's inequality. Specifically, we demonstrate yet another elegant proof of the Rényi–Tsallis entropy maximization problem. In particular, for the Tsallis entropy maximization problem, the equality condition of Hölder's inequality alone uniquely specifies the q-Gaussian distribution and establishes its optimality.
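For orientation, a minimal sketch of the objects involved, assuming the standard continuous one-dimensional definitions (the paper's precise constraint set may differ; the Hölder conjugate exponents are written a, b to avoid clashing with the Tsallis parameter q). Hölder's inequality states that for a, b > 1 with 1/a + 1/b = 1,

\[
\int |f(x)g(x)|\,dx \;\le\; \Big(\int |f(x)|^{a}\,dx\Big)^{1/a} \Big(\int |g(x)|^{b}\,dx\Big)^{1/b},
\]

with equality if and only if \(|f|^{a}\) and \(|g|^{b}\) are proportional almost everywhere. The Tsallis entropy of a density p is

\[
S_q(p) \;=\; \frac{1 - \int p(x)^{q}\,dx}{q - 1},
\]

and its maximizer under fixed normalization and second moment is the q-Gaussian

\[
p(x) \;\propto\; \bigl[\,1 - (1-q)\,\beta x^{2}\,\bigr]_{+}^{\frac{1}{1-q}}, \qquad \beta > 0,
\]

which recovers the ordinary Gaussian in the limit q → 1. The abstract's claim is that the equality case of Hölder's inequality alone pins down this family and certifies its optimality.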


1986 ◽  
Vol 29 (1) ◽  
pp. 70-73 ◽  
Author(s):  
Silviu Guiasu

S. Golomb noticed that Riemann's zeta function ζ induces a probability distribution on the positive integers, for any s > 1, and studied some of its properties connected to divisibility. The object of this paper is to show that the probability distribution mentioned above is the unique solution of an entropy-maximization problem.
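For context, a short sketch of the maximum-entropy characterization, assuming the standard setup (the paper's exact constraint may be stated differently). For s > 1, Golomb's zeta distribution on the positive integers is

\[
P_s(n) \;=\; \frac{n^{-s}}{\zeta(s)}, \qquad n = 1, 2, 3, \ldots,
\]

and the usual Lagrangian argument shows that, among all distributions P on the positive integers with prescribed mean \( \sum_{n \ge 1} P(n)\log n \), the Shannon entropy \( H(P) = -\sum_{n \ge 1} P(n)\log P(n) \) is maximized by \( P(n) \propto e^{-s\log n} = n^{-s} \), that is, exactly by \( P_s \); strict concavity of H makes this maximizer unique.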


2016 ◽  
Vol 104 ◽  
pp. 1-15 ◽  
Author(s):  
Vanniyarajan Chellappan ◽  
Krishna M. Sivalingam ◽  
Kamala Krithivasan


Author(s):  
Nguyen N. Tran ◽  
Ha X. Nguyen

A capacity analysis of generally correlated wireless multi-hop multiple-input multiple-output (MIMO) channels is presented in this paper. The channel at each hop is spatially correlated, the source symbols are mutually correlated, and the additive Gaussian noises are colored. First, by invoking the Karush–Kuhn–Tucker (KKT) conditions for the optimality of convex programming, we derive the optimal source symbol covariance that maximizes the mutual information between the channel input and the channel output when full channel knowledge is available at the transmitter. Second, we formulate the average mutual information maximization problem when only the channel statistics are available at the transmitter. Since this problem is analytically intractable, a numerical interior-point method is employed to obtain the optimal solution. Furthermore, to reduce the computational complexity, an asymptotic closed-form solution is derived by maximizing an upper bound of the objective function. Simulation results show that the average mutual information obtained by the asymptotic design is very close to that obtained by the optimal design, at a fraction of the computational cost.
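As an illustration of the first step, here is a minimal numerical sketch of the classical KKT/water-filling solution for a single-hop MIMO channel with a known channel matrix, which is the textbook special case of the covariance optimization described above. The paper's multi-hop, correlated-source formulation is more involved; the function and parameter names below are illustrative, not the authors' code.

import numpy as np

def waterfilling_covariance(H, total_power, noise_cov=None, tol=1e-10):
    """Input covariance Q maximizing log det(I + H Q H^H) subject to
    trace(Q) <= total_power, via the KKT (water-filling) conditions.
    Assumes the noise is white after optional whitening."""
    if noise_cov is not None:
        # Colored noise n ~ N(0, R): whiten with R = L L^H, i.e. use L^{-1} H.
        L = np.linalg.cholesky(noise_cov)
        H = np.linalg.solve(L, H)
    _, s, Vh = np.linalg.svd(H, full_matrices=False)
    g = s ** 2                          # eigenmode power gains
    keep = g > 1e-12                    # drop numerically null modes
    g, Vh = g[keep], Vh[keep]
    # KKT: p_i = max(mu - 1/g_i, 0); find the water level mu by bisection,
    # since the total allocated power is nondecreasing in mu.
    lo, hi = 0.0, total_power + (1.0 / g).max()
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        if np.maximum(mu - 1.0 / g, 0.0).sum() > total_power:
            hi = mu
        else:
            lo = mu
    p = np.maximum(0.5 * (lo + hi) - 1.0 / g, 0.0)   # per-mode powers
    Q = (Vh.conj().T * p) @ Vh          # Q = V diag(p) V^H
    capacity = float(np.sum(np.log2(1.0 + g * p)))   # bits per channel use
    return Q, capacity

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    H = rng.standard_normal((4, 4))
    R = 0.5 * np.eye(4)                 # toy noise covariance (scaled white)
    Q, C = waterfilling_covariance(H, total_power=10.0, noise_cov=R)
    print(f"capacity ~ {C:.3f} bits/use, trace(Q) = {np.trace(Q):.3f}")

The bisection exploits the monotonicity of the allocated power in the water level mu. In the statistics-only setting of the abstract, this per-realization solution would be replaced by optimizing the average mutual information, e.g., with an interior-point method as the authors describe.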

