total probability
Recently Published Documents


TOTAL DOCUMENTS

195
(FIVE YEARS 63)

H-INDEX

14
(FIVE YEARS 3)

2021 ◽  
Vol 14 (1) ◽  
pp. 125
Author(s):  
Victor Makarichev ◽  
Irina Vasilyeva ◽  
Vladimir Lukin ◽  
Benoit Vozel ◽  
Andrii Shelestov ◽  
...  

Lossy compression of remote sensing data has found numerous applications. Several requirements are usually imposed on the methods and algorithms to be used: a large compression ratio has to be provided, the introduced distortions should not lead to a significant reduction of classification accuracy, compression has to be realized quickly enough, etc. An additional requirement could be to provide privacy of the compressed data. In this paper, we show that these requirements can be easily and effectively met by compression based on the discrete atomic transform (DAT). Three-channel remote sensing (RS) images that are part of multispectral data are used as examples. It is demonstrated that the quality of images compressed by DAT can be varied and controlled by setting the maximal absolute deviation. This parameter is also strictly related to more traditional metrics such as the root mean square error (RMSE) and peak signal-to-noise ratio (PSNR), which can therefore be controlled as well. It is also shown that there are several variants of DAT having different depths. Their performances are compared from different viewpoints, and recommendations on transform depth are given. Effects of lossy compression on three-channel image classification using the maximum likelihood (ML) approach are studied. It is shown that the total probability of correct classification remains almost the same for a wide range of distortions introduced by lossy compression, although some variation of the correct classification probabilities takes place for particular classes depending on the peculiarities of their feature distributions. Experiments are carried out for multispectral Sentinel images of different complexities.
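
As a rough illustration of how such an aggregate figure behaves, the sketch below (a minimal example, not the authors' code; the class priors and per-class accuracies are made up) computes the total probability of correct classification as the prior-weighted sum of per-class accuracies.

```python
import numpy as np

def total_correct_probability(class_priors, per_class_accuracy):
    """P_total = sum_i P(class_i) * P(correct | class_i); priors must sum to 1."""
    class_priors = np.asarray(class_priors, dtype=float)
    per_class_accuracy = np.asarray(per_class_accuracy, dtype=float)
    assert np.isclose(class_priors.sum(), 1.0), "priors must sum to 1"
    return float(np.dot(class_priors, per_class_accuracy))

# Hypothetical numbers: accuracies for three land-cover classes before and
# after lossy compression; the aggregate barely changes even though one
# class-wise accuracy drops noticeably.
priors = [0.5, 0.3, 0.2]
print(total_correct_probability(priors, [0.95, 0.90, 0.85]))  # before compression
print(total_correct_probability(priors, [0.96, 0.84, 0.86]))  # after compression
```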


Author(s):  
E. V. Vakulina ◽  
V. V. Andreev ◽  
N. V. Maksimenko

In this paper, we obtained a solution of the equation of motion of a charged spinless particle in the field of a plane electromagnetic wave. Relativistic expressions for the cross section of Compton scattering by a charged spin-0 particle interacting with the field of a plane electromagnetic wave are calculated. A numerical simulation of the total probability of radiation as a function of the electromagnetic wave amplitude is carried out. The radiation probability is found to be consistent with the total cross section for Compton scattering by a charged spin-0 particle.


2021 ◽  
Vol 11 (2) ◽  
pp. 300-314
Author(s):  
Tetiana Malovichko

The paper is devoted to the study of the changes the course of probability theory has undergone from the end of the 19th century to our time, based on an analysis of The Theory of Probabilities textbook by Vasyl P. Ermakov published in 1878. To show the competence of the author of this textbook, the biography and scholarly development of V. P. Ermakov, a famous mathematician and Corresponding Member of the St. Petersburg Academy of Sciences, are briefly reviewed. He worked at the Department of Pure Mathematics at Kyiv University, where he received the title of Honored Professor, headed the Department of Higher Mathematics at the Kyiv Polytechnic Institute, published the Journal of Elementary Mathematics, and was one of the founders of the Kyiv Physics and Mathematics Society. The paper contains a comparative analysis of The Theory of Probabilities textbook and modern educational literature. V. P. Ermakov's textbook uses only the classical definition of probability. It does not contain such concepts as the random variable or the distribution function; it does, however, use mathematical expectation. V. P. Ermakov insists on excluding from probability theory the concept of moral expectation, which was accepted in the science of that time. The textbook consists of a preface, five chapters, a synopsis containing the statements of the main results, and a collection of problems with solutions and instructions. The first chapter deals with combinatorics, whose presentation does not differ much from the modern one. The second chapter introduces the concepts of event and probability. Operations on events are not considered at all, although the probabilities of the intersection and the union of events are discussed; however, the rule given for calculating the probability of the union of events is, in general, incorrect for compatible (non-mutually-exclusive) events. The third chapter is devoted to events in repeated trials and mathematical expectation, and contains Bernoulli's theorem, from which the law of large numbers follows. The next chapter discusses conditional probabilities, the simplest version of conditional mathematical expectation, the total probability formula, and Bayes' formula (in modern terminology). The last chapter is devoted to the Jordan method and its applications; this method is not found in modern educational literature. From the above, we can conclude that probability theory has made significant progress since the end of the 19th century: basic concepts are formulated more rigorously, research methods have developed significantly, and new branches have appeared.


2021 ◽  
pp. 258-264
Author(s):  
А.Л. Боран-Кешишьян ◽  
М.В. Заморёнов ◽  
П.Н. Флоря ◽  
А.А. Ярошенко ◽  
С.И. Кондратьев

The paper examines the functioning of a technical system with an instantly replenished time reserve, taking preventive maintenance into account. A description of the functioning of such a system is given. Using the apparatus of semi-Markov processes, an analytical model of a system with an instantly replenished time reserve is constructed that takes into account the effect of preventive maintenance on its performance. When constructing the semi-Markov model, a limitation on the number of preventive maintenance actions during the restoration of a working element is adopted. The semi-Markov states of the system under study are described, and the state graph is given. The sojourn times in the states of the system, the transition probabilities, and the stationary distribution of the embedded Markov chain are determined. To determine the distribution function of the time spent by the system in the subset of operable states, the trajectory method is used: all trajectories of the system's transitions from this subset to the subset of inoperable states, and the probabilities of their realization, are found. The sojourn times of the system along the found trajectories are determined. On the basis of the total probability theorem, the distribution functions of the sojourn times of the system in the subsets of operable and inoperable states, as well as the system availability factor, are determined. A modeling example of the system under study is given. The results obtained are compared with the results of using the theorem on the average stationary sojourn time of the system in a subset of states.
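
A minimal sketch of the total-probability step described above, assuming hypothetical trajectory probabilities and sojourn-time distributions (a generic illustration, not the authors' semi-Markov model): the sojourn-time distribution in the operable subset is a mixture of the sojourn-time distributions along the individual exit trajectories, weighted by the probabilities of realizing those trajectories, and the stationary availability follows from the mean sojourn times.

```python
import numpy as np

def mixture_cdf(t, trajectory_probs, trajectory_cdfs):
    """F(t) = sum_k p_k * F_k(t), by the total probability theorem."""
    assert np.isclose(sum(trajectory_probs), 1.0), "trajectory probabilities must sum to 1"
    return sum(p * F(t) for p, F in zip(trajectory_probs, trajectory_cdfs))

def availability(mean_up, mean_down):
    """Stationary availability from mean sojourn times in the operable and inoperable subsets."""
    return mean_up / (mean_up + mean_down)

# Hypothetical example: two exit trajectories with exponential sojourn times.
t = np.linspace(0.0, 50.0, 6)
F_up = mixture_cdf(t, [0.7, 0.3],
                   [lambda t: 1 - np.exp(-t / 20.0),   # trajectory 1
                    lambda t: 1 - np.exp(-t / 5.0)])   # trajectory 2
print(F_up)
print(availability(mean_up=0.7 * 20.0 + 0.3 * 5.0, mean_down=2.0))
```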


2021 ◽  
Vol 923 (2) ◽  
pp. L32
Author(s):  
S. Dichiara ◽  
R. L. Becerra ◽  
E. A. Chase ◽  
E. Troja ◽  
W. H. Lee ◽  
...  

We report the results of our follow-up campaign for the neutron star–black hole (NSBH) merger GW200115 detected during the O3 run of the Advanced LIGO and Advanced Virgo detectors. We obtained wide-field observations with the Deca-Degree Optical Transient Imager covering ∼20% of the total probability area down to a limiting magnitude of w = 20.5 AB at ∼23 hr after the merger. Our search for counterparts returns a single candidate (AT2020aeo), likely not associated with the merger. In total, only 25 sources of interest were identified by the community and later discarded as unrelated to the GW event. We compare our upper limits with the emission predicted by state-of-the-art kilonova simulations and disfavor high-mass ejecta (>0.1 M⊙), indicating that the spin of the system is not particularly high. By combining our optical limits with gamma-ray constraints from Swift and Fermi, we disfavor the presence of a standard short-duration burst for viewing angles ≲15° from the jet axis. Our conclusions are, however, limited by the large localization region of this GW event, and accurate prompt positions remain crucial to improving the efficiency of follow-up efforts.
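
For context, the quoted coverage figure is the kind of quantity obtained by summing the localization probability over the observed sky area. The sketch below uses a toy probability map and footprint mask (both invented here, not the GW200115 skymap or the actual pointings); a real analysis would read a HEALPix skymap and the telescope tiles.

```python
import numpy as np

rng = np.random.default_rng(0)
prob_map = rng.random(10_000)
prob_map /= prob_map.sum()              # normalized per-pixel localization probabilities

observed = np.zeros_like(prob_map, dtype=bool)
observed[:2_000] = True                 # pixels covered by the imaging tiles (toy mask)

covered_prob = prob_map[observed].sum() # fraction of the total probability area covered
print(f"covered probability: {covered_prob:.1%}")
```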


Author(s):  
Sergey A. Shteingolts ◽  
Julia K. Voronina ◽  
Liliya F. Saifina ◽  
Marina M. Shulaeva ◽  
Vyacheslav E. Semenov ◽  
...  

The crystal and electronic structure of an isocyanuric acid derivative was studied by high-resolution single-crystal X-ray diffraction within the Hansen–Coppens multipole formalism. The observed deformation electron density shows signs of thermal smearing, and the experimental picture was meaningfully assigned to the consequences of unmodelled anharmonic atomic motion. Straightforward simultaneous refinement of all parameters, including the Gram–Charlier coefficients, resulted in a more significant distortion of the apparent static electron density, even though the residual density became significantly flatter and more featureless. Therefore, the method of transferring multipole parameters from a model refined against theoretical structure factors as an initial guess was employed, followed by block refinement of the Gram–Charlier coefficients and the other parameters. This procedure allowed us to appropriately separate the static electron density from the contaminating smearing effects of insufficiently accounted-for atomic motion. In particular, some covalent bonds and the weak π...π interaction between isocyanurate moieties were studied via the mutual penetration of atomic-like kinetic and electrostatic potential φ-basins with complementary atomic ρ-basins. Further, the local electronic temperature was applied as an advanced descriptor for both covalent bonds and noncovalent interactions. The total probability density function (PDF) of nuclear displacement showed virtually no negative regions close to and around the atomic nuclei. The distribution of the anharmonic part of the PDF matched, to a certain extent, the residual electron density from the multipole model before anharmonic refinement. No signs of disordering of the sulfonyl group hidden in the modelled anharmonic motion were found in the PDF.
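
To illustrate why the refined PDF is checked for negative regions, the following one-dimensional sketch (a toy Gram–Charlier expansion with invented coefficients, unrelated to the actual crystallographic refinement) shows how larger anharmonic coefficients drive the expansion negative.

```python
import numpy as np

def hermite_He3(x):
    return x**3 - 3 * x

def hermite_He4(x):
    return x**4 - 6 * x**2 + 3

def gram_charlier_pdf(x, c3=0.0, c4=0.0):
    """Gaussian reference density corrected by third- and fourth-order Gram-Charlier terms."""
    gauss = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
    return gauss * (1 + c3 / 6 * hermite_He3(x) + c4 / 24 * hermite_He4(x))

x = np.linspace(-4, 4, 1601)
for c3 in (0.1, 0.8):                       # hypothetical anharmonic coefficients
    pdf = gram_charlier_pdf(x, c3=c3)
    # the mildly anharmonic case stays non-negative on this grid;
    # the strongly anharmonic one dips below zero
    print(c3, "min value:", pdf.min())
```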


Entropy ◽  
2021 ◽  
Vol 23 (11) ◽  
pp. 1410
Author(s):  
Wojciech M. Kempa ◽  
Rafał Marjasz

The transient behavior of a finite-buffer queueing model with batch arrivals and generally distributed repeated vacations is analyzed. Such a system has potential applications in modeling the functioning of production systems and of computer and telecommunication networks with an energy-saving mechanism based on cyclic monitoring of the queue state (Internet of Things, wireless sensor networks, etc.). Identifying renewal moments in the evolution of the system and applying the continuous version of the law of total probability, a system of Volterra-type integral equations for the time-dependent queue-size distribution, conditioned on the initial buffer state, is derived. A compact-form solution of the corresponding system, written for the Laplace transforms, is obtained using an algebraic approach based on Korolyuk's potential method. An illustrative numerical example presenting the impact of the service rate, arrival rate, initial buffer state and single vacation duration on the queue-size distribution is included as well.
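
As a hedged illustration of the kind of equation such renewal/total-probability arguments produce (not the paper's model or its Laplace-transform solution), the sketch below solves a scalar Volterra equation of the second kind, f(t) = g(t) + ∫₀ᵗ k(t − s) f(s) ds, with the trapezoidal rule; the kernel and forcing term are arbitrary toy choices.

```python
import numpy as np

def solve_volterra(g, k, t_max, n):
    """Trapezoidal-rule solution of f(t) = g(t) + int_0^t k(t - s) f(s) ds on a uniform grid."""
    t, h = np.linspace(0.0, t_max, n + 1, retstep=True)
    f = np.empty(n + 1)
    f[0] = g(t[0])
    for i in range(1, n + 1):
        # trapezoidal quadrature of the convolution over the already-known values,
        # then solve the resulting linear relation for f[i]
        conv = 0.5 * k(t[i] - t[0]) * f[0] + sum(
            k(t[i] - t[j]) * f[j] for j in range(1, i))
        f[i] = (g(t[i]) + h * conv) / (1.0 - 0.5 * h * k(0.0))
    return t, f

# Toy kernel and forcing term chosen only to exercise the solver.
t, f = solve_volterra(g=lambda t: np.exp(-t), k=lambda t: 0.5 * np.exp(-t),
                      t_max=5.0, n=500)
print(f[-1])
```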


2021 ◽  
pp. 875529302110492
Author(s):  
Michael W Greenfield ◽  
Andrew J Makdisi

Since their inception in the 1980s, simplified procedures for the analysis of liquefaction hazards have typically characterized seismic loading using a combination of peak ground acceleration and earthquake magnitude. However, more recent studies suggest that certain evolutionary intensity measures (IMs), such as Arias intensity or cumulative absolute velocity, may be more efficient and sufficient predictors of liquefaction triggering and its consequences. Despite this advantage, widespread hazard characterization for evolutionary IMs is not yet feasible due to the relatively incomplete set of ground motion models (GMMs) needed for probabilistic seismic hazard analysis (PSHA). Without widely available hazard curves for evolutionary IMs, current design codes often rely on spectral targets for ground motion selection and scaling, which are shown in this study to indirectly result in low precision of the evolutionary IMs often associated with liquefaction hazards. This study presents a method to calculate hazard curves for arbitrary intensity measures, such as evolutionary IMs for liquefaction hazard analyses, without requiring an existing GMM. The method converts a known IM hazard curve into a hazard curve for an alternative IM using the total probability theorem. The effectiveness of the method is illustrated by comparing hazard curves calculated using the total probability theorem with the results of a PSHA, demonstrating that the proposed method does not introduce additional uncertainty under idealized conditions and provides a range of possible hazard values under most practical conditions. The total probability theorem method can be utilized by practitioners and researchers to select ground motion time series that target alternative IMs for liquefaction hazard analyses or other geotechnical applications. This method also allows researchers to investigate the efficiency, sufficiency, and predictability of new, alternative IMs without necessarily requiring GMMs.
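
A minimal sketch of the conversion step, assuming a toy hazard curve for the known IM and an invented conditional lognormal relation between the two IMs (illustrative only, not the authors' implementation): the rate of exceeding a target value of the alternative IM is the conditional exceedance probability weighted by the rate increments of the known IM hazard curve, i.e. the total probability theorem applied in rate form.

```python
import numpy as np
from scipy.stats import norm

im1 = np.logspace(-2, 0.5, 200)                 # known IM levels (e.g., PGA in g)
rate_im1 = 1e-2 * (im1 / 0.1) ** -2.5           # toy IM1 hazard curve (annual exceedance rate)
d_rate = -np.diff(rate_im1)                     # rate of IM1 falling in each bin
im1_mid = np.sqrt(im1[:-1] * im1[1:])           # bin midpoints on a log scale

def conditional_exceedance(y, x, sigma_ln=0.6):
    """P(IM2 > y | IM1 = x) under a toy lognormal relation: median IM2 = 2 * IM1."""
    return norm.sf((np.log(y) - np.log(2.0 * x)) / sigma_ln)

im2_targets = np.logspace(-2, 1, 50)
rate_im2 = np.array([np.sum(conditional_exceedance(y, im1_mid) * d_rate)
                     for y in im2_targets])     # total probability theorem, rate form
print(rate_im2[:5])
```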


Molecules ◽  
2021 ◽  
Vol 26 (19) ◽  
pp. 5987
Author(s):  
Pier Luigi Gentili

Human interaction with the world is dominated by uncertainty, and probability theory is a valuable tool for facing such uncertainty. According to the Bayesian definition, probabilities are personal beliefs. Experimental evidence supports the notion that human behavior is highly consistent with Bayesian probabilistic inference in the sensory, motor, and cognitive domains. All the higher-level psychophysical functions of our brain are believed to take the activities of interconnected and distributed networks of neurons in the neocortex as their physiological substrate. Neurons in the neocortex are organized in cortical columns that behave as fuzzy sets. Fuzzy set theory embraced uncertainty modeling when membership functions were reinterpreted as possibility distributions. The terms of Bayes' formula can be conceived as fuzzy sets, and Bayesian inference becomes a fuzzy inference. According to QBism, quantum probabilities are also Bayesian: they are logical constructs rather than physical realities. It follows that the Born rule is nothing but a kind of quantum law of total probability. Wavefunctions and measurement operators are viewed epistemically; both are similar to fuzzy sets. The new link established between fuzzy logic, neuroscience, and quantum mechanics through Bayesian probability could spark new ideas for the development of artificial intelligence and unconventional computing.
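
A small numerical illustration of the role of the law of total probability in Bayes' formula (generic, with made-up priors and likelihoods, not taken from the paper): the denominator is the total probability of the data, obtained by summing the likelihood over all hypotheses weighted by their priors.

```python
import numpy as np

def bayes_posterior(priors, likelihoods):
    """Posterior over hypotheses; the evidence term is the law of total probability."""
    priors = np.asarray(priors, dtype=float)
    likelihoods = np.asarray(likelihoods, dtype=float)
    evidence = np.dot(priors, likelihoods)      # total probability of the observed data
    return priors * likelihoods / evidence

# Hypothetical percept example: three hypotheses about a stimulus and the
# likelihood of the observed sensory data under each.
print(bayes_posterior([0.5, 0.3, 0.2], [0.10, 0.40, 0.70]))
```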

