Reply to “What Is the Maximum Entropy Principle? Comments on ‘Statistical Theory on the Functional Form of Cloud Particle Size Distributions’”

2019 ◽  
Vol 76 (12) ◽  
pp. 3961-3963
Author(s):  
Wei Wu ◽  
Greg M. McFarquhar

Abstract We welcome the opportunity to correct the misunderstandings and misinterpretations in Yano’s comment that led him to state incorrectly that Wu and McFarquhar misunderstood the maximum entropy (MaxEnt) principle. As Yano correctly states, the principle itself does not suffer from a lack of invariance. But, as restated in this reply and in Wu and McFarquhar, the commonly used Shannon–Gibbs entropy does suffer from a lack of invariance under coordinate transformations when applied to continuous variables, a problem that is resolved by the use of the relative entropy. Further, it is restated that the Wu and McFarquhar derivation of the PSD form using MaxEnt is more general than the formulation by Yano and allows more constraints, with arbitrary functional relations, to be applied. Yano’s derivation offers nothing new beyond a representation of PSDs in terms of other variables.

2019 ◽  
Vol 76 (12) ◽  
pp. 3955-3960 ◽  
Author(s):  
Jun-Ichi Yano

Abstract The basic idea of the maximum entropy principle is presented in a succinct, self-contained manner. The presentation points out some misunderstandings of this principle by Wu and McFarquhar. Namely, the principle does not suffer from a lack of invariance under a change of the dependent variable; thus, it does not lead to a need to introduce the relative entropy, as suggested by Wu and McFarquhar. The principle is valid only with a proper choice of a dependent variable, called a restriction variable, for a distribution. Although different results may be obtained with other variables obtained by transforming the restriction variable, these results are simply meaningless. A relative entropy may be used instead of a standard entropy. However, the former does not lead to any new results unobtainable by the latter.


2018 ◽  
Vol 75 (3) ◽  
pp. 787-804 ◽  
Author(s):  
Jun-Ichi Yano ◽  
Andrew J. Heymsfield ◽  
Aaron Bansemer

Abstract The possibility is suggested of estimating particle size distributions (PSDs) solely from the bulk quantities of the hydrometeors. The method, inspired by the maximum entropy principle, can be applied to any predefined general PSD form as long as the number of free parameters is equal to or less than the number of bulk quantities available. As long as an adopted distribution is “physically based,” these bulk characterizations can recover a fairly accurate PSD estimate. The method is tested on ice particle measurements from the Tropical Composition, Cloud and Climate Coupling Experiment (TC4). The total particle number, total mass, and mean size are taken as the bulk quantities. The gamma distribution and two distributions obtained under the maximum entropy principle, taking the size and the particle mass, respectively, as the restriction variable, are adopted for fitting. The fitting error for the two maximum entropy–based distributions is comparable to that of a standard direct fitting method with the gamma distribution. The same procedure works almost equally well when the mean size is removed from the constraints, especially for an exponential distribution. The results suggest that the total particle number and the total mass of the hydrometeors are sufficient for determining the PSD to reasonable accuracy when a “physically based” distribution is assumed. In addition to in situ cloud measurements, remote sensing measurements, such as those from radars and satellites, can be adopted as physical constraints. Possibilities of exploiting different types of measurements should be pursued further.
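The idea of fixing a predefined PSD form from bulk quantities can be sketched for the gamma case. The code below is a hypothetical illustration, not the authors' implementation: it assumes a gamma PSD n(D) = N0·D^μ·exp(−λD) and a power-law mass–dimension relation m(D) = a·D^b (a and b are made-up coefficients), and recovers the three PSD parameters from total number, total mass, and mean size.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.special import gamma as G

# Assumed mass-dimension coefficients m(D) = a * D**b (illustrative only)
a, b = 0.005, 2.1

def fit_gamma_psd(Nt, M, Dbar):
    # Bulk relations for n(D) = N0 * D**mu * exp(-lam*D):
    #   Dbar = (mu + 1) / lam
    #   M    = a * Nt * Gamma(mu+b+1) / (Gamma(mu+1) * lam**b)
    # Eliminating lam leaves a single equation in mu alone.
    target = M / (a * Nt * Dbar**b)
    g = lambda mu: G(mu + b + 1) / (G(mu + 1) * (mu + 1)**b) - target
    mu = brentq(g, -0.9, 50.0)          # bracketing root finder
    lam = (mu + 1) / Dbar
    N0 = Nt * lam**(mu + 1) / G(mu + 1)
    return N0, mu, lam

# Synthetic check: bulk quantities computed analytically from a known PSD
mu_true, lam_true, Nt = 2.0, 10.0, 1000.0
Dbar = (mu_true + 1) / lam_true
M = a * Nt * G(mu_true + b + 1) / (G(mu_true + 1) * lam_true**b)
N0, mu, lam = fit_gamma_psd(Nt, M, Dbar)
print(mu, lam)
```

With three bulk quantities and three free parameters the fit is exact; dropping the mean-size constraint, as in the abstract, corresponds to fixing μ (e.g., μ = 0 for an exponential) and solving the remaining two relations directly.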


2007 ◽  
Vol 24 (11) ◽  
pp. 1860-1879 ◽  
Author(s):  
Paul J. Connolly ◽  
Michael J. Flynn ◽  
Z. Ulanowski ◽  
T. W. Choularton ◽  
M. W. Gallagher ◽  
...  

Abstract This paper explains and develops a correction algorithm for measurement of cloud particle size distributions with the Stratton Park Engineering Company, Inc., Cloud Particle Imager (CPI). Cloud particle sizes, when inferred from images taken with the CPI, will be oversized relative to their “true” size. Furthermore, particles will cease to be “accepted” in the image frame if they lie a distance greater than the depth of field from the object plane. By considering elements of the scalar theory for diffraction of light by an opaque circular disc, a calibration method is devised to overcome these two problems. The method reduces the error in inferring particle size from CPI data and also enables determination of the particles’ distance from the object plane and hence their depth of field. These two quantities are vital for quantitative measurements of cloud particle size distributions (histograms of particle size scaled to the total number concentration of particles) in the atmosphere with the CPI. By using both glass calibration beads and novel ice crystal analogs, these two problems can be quantified for liquid drops and ice particles. Analysis of the calibration method shows that 1) it reduces the oversizing of 15-μm beads (from 24.3 to 14.9 μm for the sample mean), 40-μm beads (from 50.0 to 41.4 μm for the sample mean), and 99.4-μm beads (from 103.7 to 99.8 μm for the sample mean); and 2) it accurately predicts the particles’ distance from the object plane (the relationship between measured and predicted distance shows a strong positive correlation and is almost one to one). Realistic ice crystal analogs were also used to assess the errors in sampling ice clouds, and it was found that size and distance from the object plane could be accurately predicted for ice crystals by use of the particle roundness parameter (defined as the ratio of the projected area of the particle to the area of a circle with the same maximum length).
While the results here are not directly applicable to every CPI, the methods are, as data taken from three separate CPIs fit the calibration model well (not shown).
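The roundness parameter defined in the abstract above can be computed directly from a binary particle image. This is a minimal sketch under stated assumptions (square pixels, small images where a brute-force pairwise maximum length is affordable), not the paper's processing code.

```python
import numpy as np

def roundness(mask):
    """Projected area of the particle divided by the area of a circle
    whose diameter equals the particle's maximum length."""
    ys, xs = np.nonzero(mask)
    pts = np.column_stack([xs, ys]).astype(float)
    # maximum length: largest pairwise pixel distance (brute force)
    d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(axis=-1)
    L = np.sqrt(d2.max())
    area = float(mask.sum())          # projected area in pixel units
    return area / (np.pi * (L / 2.0) ** 2)

# A filled disc should score near 1; a thin column should score near 0.
yy, xx = np.mgrid[0:41, 0:41]
disc = (xx - 20) ** 2 + (yy - 20) ** 2 <= 10 ** 2
needle = np.zeros((41, 41), dtype=bool)
needle[10:31, 20] = True
print(roundness(disc), roundness(needle))
```

For quasi-spherical liquid drops the parameter stays close to 1, while elongated ice crystals score much lower, which is what lets the roundness discriminate the two cases in the calibration.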


RSC Advances ◽  
2015 ◽  
Vol 5 (116) ◽  
pp. 95967-95980 ◽  
Author(s):  
Mehdi Asadollahzadeh ◽  
Meisam Torab-Mostaedi ◽  
Shahrokh Shahhosseini ◽  
Ahad Ghaemi

In this study, the maximum entropy principle is used to predict the drop size distributions in a multi-impeller column extractor.


2009 ◽  
Vol 2 (1) ◽  
pp. 259-271 ◽  
Author(s):  
J. P. Fugal ◽  
R. A. Shaw

Abstract. Holographic data from the prototype airborne digital holographic instrument HOLODEC (Holographic Detector for Clouds), taken during test flights, are digitally reconstructed to obtain the size (equivalent diameters in the range 23 to 1000 μm), three-dimensional position, and two-dimensional image of ice particles; ice particle size distributions and number densities are then calculated using an automated algorithm with minimal user intervention. The holographic method offers the advantage of a well-defined sample volume that does not depend on particle size or airspeed, and offers a unique method of detecting shattered particles. It also allows the volume sample rate to be increased beyond that of the prototype HOLODEC instrument, limited solely by camera technology. HOLODEC size distributions taken in mixed-phase regions of cloud compare well to size distributions from a PMS FSSP probe also onboard the aircraft during the test flights. A conservative algorithm for detecting shattered particles, utilizing their depth position along the optical axis, eliminates the obvious ice particle shattering events from the data set. In this particular case, the size distributions of non-shattered particles are reduced by approximately a factor of two for particles 15 to 70 μm in equivalent diameter, compared with size distributions of all particles.
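The well-defined, size-independent sample volume emphasized in the abstract is what makes the conversion from particle counts to number densities a single division per size bin. The sketch below is an illustrative assumption-laden example (bin edges, diameters, and the sample volume are invented), not HOLODEC processing code.

```python
import numpy as np

def size_distribution(diameters_um, bin_edges_um, sample_volume_cm3):
    """Number density per unit size (cm^-3 um^-1) from per-bin counts,
    assuming a sample volume independent of particle size."""
    counts, _ = np.histogram(diameters_um, bins=bin_edges_um)
    widths = np.diff(bin_edges_um)
    return counts / (sample_volume_cm3 * widths)

edges = np.array([23.0, 50.0, 100.0, 300.0, 1000.0])    # um, illustrative bins
diams = np.array([30.0, 40.0, 60.0, 150.0, 150.0, 600.0])
n = size_distribution(diams, edges, sample_volume_cm3=2.0)
print(n)
```

With a size-dependent sample volume (as in many optical probes), the divisor would instead vary per bin, which is the complication the holographic geometry avoids.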


2009 ◽  
Vol 2 (2) ◽  
pp. 659-688 ◽  
Author(s):  
J. P. Fugal ◽  
R. A. Shaw

Abstract. Holographic data from the prototype airborne digital holographic instrument HOLODEC (Holographic Detector for Clouds), taken during test flights, are digitally reconstructed to obtain the size (equivalent diameters in the range 23 to 1000 μm), three-dimensional position, and two-dimensional profile of ice particles; ice particle size distributions and number densities are then calculated using an automated algorithm with minimal user intervention. The holographic method offers the advantage of a well-defined sample volume that does not depend on particle size or airspeed, and offers a unique method of detecting shattered particles. It also allows the volume sample rate to be increased beyond that of the prototype HOLODEC instrument, limited solely by camera technology. HOLODEC size distributions taken in mixed-phase regions of cloud compare well to size distributions from a PMS FSSP probe also onboard the aircraft during the test flights. A conservative algorithm for detecting shattered particles, utilizing the particles’ depth position along the optical axis, eliminates the obvious ice particle shattering events from the data set. In this particular case, the size distributions of non-shattered particles are reduced by approximately a factor of two for particles 15 to 70 μm in equivalent diameter, compared with size distributions of all particles.


2018 ◽  
Vol 75 (8) ◽  
pp. 2801-2814 ◽  
Author(s):  
Wei Wu ◽  
Greg M. McFarquhar

Abstract Several functional forms of cloud particle size distributions (PSDs) have been used in numerical modeling and remote sensing retrieval studies of clouds and precipitation, including exponential, gamma, lognormal, and Weibull distributions. However, there is no satisfying theoretical explanation as to why certain distribution forms preferentially occur instead of others. Intuitively, the analytical form of a PSD could be derived by directly solving the general dynamic equation, but no analytical solutions have been found yet. Instead of such a process-level approach, the use of the principle of maximum entropy (MaxEnt) for determining the theoretical form of PSDs from the perspective of the system as a whole is examined here. MaxEnt states that, among all probability density functions satisfying the given constraints on the variable, the one with the largest information entropy should be chosen. Here, the lack of invariance under coordinate transformations that arises when using the Shannon–Gibbs definition of entropy is identified, and the use of relative entropy to avoid this problem is discussed. Focusing on cloud physics, the four-parameter generalized gamma distribution is proposed as the analytical form of a PSD using the principle of maximum (relative) entropy, with assumptions of power-law relations among state variables, scale invariance, and a further constraint on the expectation of one state variable (e.g., bulk water mass). The four-parameter generalized gamma distribution is flexible enough to accommodate the various types of constraints that could be assumed for cloud PSDs.
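The four-parameter generalized gamma form proposed in the abstract above, n(D) = N0·D^μ·exp(−λD^b), has the closed-form total ∫n(D)dD = N0·Γ((μ+1)/b)/(b·λ^((μ+1)/b)), so the intercept N0 can always be fixed from a total-number constraint. A quick numerical check (parameter values are illustrative assumptions):

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma as G

# Illustrative generalized gamma PSD parameters (assumptions for the demo)
N0, mu, lam, b = 100.0, 2.0, 5.0, 1.5

n = lambda D: N0 * D**mu * np.exp(-lam * D**b)

# Total number: numerical quadrature vs. the closed form via the substitution
# t = lam * D**b, which reduces the integral to a gamma function.
numeric, _ = quad(n, 0.0, np.inf)
closed = N0 * G((mu + 1) / b) / (b * lam**((mu + 1) / b))
print(numeric, closed)
```

Setting b = 1 recovers the ordinary gamma PSD and μ = 0, b = 1 the exponential, which is how the single four-parameter form subsumes the familiar special cases listed in the abstract.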

