likelihood functions
Recently Published Documents


TOTAL DOCUMENTS: 258 (five years: 64)
H-INDEX: 28 (five years: 6)

2022 · Vol 107 · pp. 102636
Author(s): Christopher R. Fisher, Joseph W. Houpt, Glenn Gunzelmann

2021 · Vol 49 (6)
Author(s): Mathieu Gerber, Kari Heine

2021 · Vol 81 (11)
Author(s): Peter Athron, Neal Avis Kozar, Csaba Balázs, Ankit Beniwal, Sanjay Bloor, ...

Abstract We assess the status of a wide class of WIMP dark matter (DM) models in light of the latest experimental results using the global fitting framework. We perform a global analysis of effective field theory (EFT) operators describing the interactions between a gauge-singlet Dirac fermion and the Standard Model quarks, the gluons and the photon. In this bottom-up approach, we simultaneously vary the coefficients of 14 such operators up to dimension 7, along with the DM mass, the scale of new physics and several nuisance parameters. Our likelihood functions include the latest data from Planck, direct and indirect detection experiments, and the LHC. For DM masses below 100 GeV, we find that it is impossible to satisfy all constraints simultaneously while maintaining EFT validity at LHC energies. For new physics scales around 1 TeV, our results are influenced by several small excesses in the LHC data and depend on the prescription that we adopt to ensure EFT validity. Furthermore, we find large regions of viable parameter space where the EFT is valid and the relic density can be reproduced, implying that WIMPs can still account for the DM of the universe while being consistent with the latest data.
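As a rough illustration of how such a global fit composes its likelihood functions, the sketch below sums toy per-experiment log-likelihood contributions for a single point in the parameter space. The two contribution functions are hypothetical placeholders, not the paper's actual likelihood implementations.

```python
import numpy as np

# Schematic sketch (not the paper's analysis) of a composite likelihood: the
# total log-likelihood at a parameter point (operator coefficient, DM mass,
# new-physics scale, nuisance parameters) is the sum of independent
# per-experiment contributions. Both contributions below are illustrative.

def log_like_relic_density(params):
    predicted = params["coeff"] ** 2 / params["mass"]      # toy prediction
    return -0.5 * ((predicted - 0.12) / 0.001) ** 2        # Gaussian around observed value

def log_like_direct_detection(params):
    rate = params["coeff"] ** 2 * params["mass"]           # toy scattering rate
    return -max(rate - 1.0, 0.0) ** 2                      # upper-limit-style penalty

def total_log_likelihood(params, contributions):
    return sum(ll(params) for ll in contributions)

point = {"coeff": 0.1, "mass": 100.0, "scale": 1000.0}
print(total_log_likelihood(point, [log_like_relic_density,
                                   log_like_direct_detection]))
```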


2021
Author(s): Gamini Joshi, Vidushi Sharma

Abstract The exposure of IoT nodes to the internet makes them vulnerable to malicious attacks and failures, which compromise the survivability, integrity, and connectivity of the network. Detecting and eliminating attacks in a timely manner is therefore essential to maintaining connectivity. Trust-based techniques are used to understand the behavior of nodes in the network, but the conventional trust models proposed by several researchers are power-hungry and demand large storage space. Hidden Markov models have subsequently been developed to calculate trust, yet the network survivability they achieve remains low. Improving survivability requires treating the selfish and the malicious nodes in the network separately. Hence, this paper develops an improved Hidden Markov Trust (HMT) model that accurately detects the selfish and malicious nodes that illegally intercept the network. An algorithm is presented for learning node behavior with the HMT model and its expected output. The likelihood functions evaluated for each node differentiate selfish nodes from malicious ones and allow each type to be treated independently and in a timely manner. Further, comparative analyses for black-hole, grey-hole, and sink-hole attacks are carried out, and the performance metrics are extended to survivability rate, power consumption, delay, and false-alarm rate for different network sizes and vulnerabilities. Simulation results show a 10% higher packet delivery ratio (PDR), 29% lower overhead, and a 15% higher detection rate than the FUCEM, FTCSPM, and OADM trust models reported in the literature.
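As a loose illustration of how hidden-Markov likelihoods can separate behavior classes, the sketch below (not the paper's HMT model; all probabilities are invented) scores a node's observed packet-handling sequence under two hypothetical behavior models with the forward algorithm and picks the more likely label.

```python
import numpy as np

# Minimal sketch: classify a node by comparing HMM likelihoods under two
# hypothetical behaviour models. Hidden states: 0 = cooperating, 1 = dropping;
# observations: 0 = packet forwarded, 1 = packet dropped.

def hmm_log_likelihood(obs, start_p, trans_p, emit_p):
    """Scaled forward algorithm: returns log P(obs | model)."""
    alpha = start_p * emit_p[:, obs[0]]
    c = alpha.sum()
    log_like = np.log(c)
    alpha = alpha / c
    for o in obs[1:]:
        alpha = (alpha @ trans_p) * emit_p[:, o]   # propagate, then re-weight by emission
        c = alpha.sum()
        log_like += np.log(c)
        alpha = alpha / c
    return log_like

# Illustrative behaviour models (parameters are made up, not from the paper).
selfish = dict(start_p=np.array([0.7, 0.3]),
               trans_p=np.array([[0.8, 0.2], [0.4, 0.6]]),
               emit_p=np.array([[0.9, 0.1], [0.3, 0.7]]))
malicious = dict(start_p=np.array([0.3, 0.7]),
                 trans_p=np.array([[0.5, 0.5], [0.1, 0.9]]),
                 emit_p=np.array([[0.6, 0.4], [0.05, 0.95]]))

observed = [0, 1, 1, 1, 0, 1, 1]
scores = {name: hmm_log_likelihood(observed, **model)
          for name, model in (("selfish", selfish), ("malicious", malicious))}
print(max(scores, key=scores.get), scores)
```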


2021 · pp. 2150006
Author(s): Buu-Chau Truong, Kim-Hung Pho, Cong-Chanh Dinh, Michael McAleer

This paper makes a theoretical contribution by presenting a detailed derivation of the zero-inflated Poisson (ZIP) model and then estimating its parameters from a fishing data set. The model has several practical applications and is mainly used for count data with an excess number of zero counts. Within the scope of the paper, we introduce the complete formulae, the likelihood and log-likelihood functions, and the estimating equations of the ZIP model. We then investigate the large-sample properties of the model under some regularity conditions. A simulation study and a fishing data set are analysed with the ZIP model. The results of the application are meaningful and provide reliable evidence for obtaining the largest catch while fishing, which is the applied contribution of this research. Finally, important practical applications of the model, conclusions, and directions for future work are presented for consideration.
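For reference, a minimal sketch of the standard ZIP log-likelihood and a numerical maximum-likelihood fit is given below; the count data and starting values are illustrative, and the paper's full derivation and estimating equations are not reproduced here.

```python
import numpy as np
from scipy.special import gammaln
from scipy.optimize import minimize

# Standard ZIP probability mass function:
#   P(Y = 0) = pi + (1 - pi) * exp(-lam)
#   P(Y = k) = (1 - pi) * exp(-lam) * lam**k / k!,  k >= 1
# where pi is the zero-inflation probability and lam the Poisson mean.

def zip_neg_log_likelihood(params, y):
    pi, lam = params
    y = np.asarray(y)
    zeros = (y == 0)
    ll_zero = np.log(pi + (1.0 - pi) * np.exp(-lam))           # zero counts
    y_pos = y[~zeros]
    ll_pos = (np.log(1.0 - pi) - lam + y_pos * np.log(lam)     # positive counts
              - gammaln(y_pos + 1.0))
    return -(zeros.sum() * ll_zero + ll_pos.sum())

# Toy count data with excess zeros (not the paper's fishing data set).
counts = [0, 0, 0, 0, 1, 0, 2, 0, 0, 3, 0, 1, 0, 0, 4]
fit = minimize(zip_neg_log_likelihood, x0=np.array([0.5, 1.0]), args=(counts,),
               bounds=[(1e-6, 1 - 1e-6), (1e-6, None)])
pi_hat, lam_hat = fit.x
print("zero-inflation pi:", round(pi_hat, 3), "Poisson mean lam:", round(lam_hat, 3))
```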


2021
Author(s): Yameng Wang, Liguo Fei, Yuqiang Feng, Yanqing Wang, Luning Liu

Abstract Case-based reasoning (CBR) retrieves one or more cases similar to a new problem from an existing case base, based on the characteristics of that problem. The core idea of CBR is that similar cases have similar solutions, so how much advantage a CBR system can offer depends on the quality of its case retrieval strategy. The commonly used case retrieval algorithms are based on the mean operator, which is a hard aggregation: a single low local similarity can drag down the overall result. To compute the global similarity of cases from a new and softer point of view, this paper introduces soft likelihood functions (SLFs) into case retrieval, combines them with KNN, and proposes a hybrid retrieval strategy. The core of the strategy is to define the global similarity through SLFs, aggregating the local similarities and the characteristic similarity while also taking the attitudinal character of the decision maker into account. Simulation experiments on real data sets achieve an accuracy above 81%, which verifies the effectiveness of the retrieval strategy.
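The sketch below illustrates one common soft-likelihood-style aggregation of local similarities, with an attitudinal parameter alpha controlling how strongly high similarities dominate; the paper's exact SLF formulation and its combination with KNN may differ.

```python
import numpy as np

# Simplified sketch of a soft-likelihood-style aggregation (assumed form, not
# necessarily the paper's): local similarities are sorted in descending order,
# cumulative products are formed, and they are combined with weights generated
# by f(x) = x**((1 - alpha) / alpha) for an attitudinal parameter alpha in (0, 1).

def soft_likelihood(similarities, alpha=0.7):
    p = np.sort(np.asarray(similarities, dtype=float))[::-1]   # descending order
    n = len(p)
    j = np.arange(1, n + 1)
    f = lambda x: x ** ((1.0 - alpha) / alpha)
    w = f(j / n) - f((j - 1) / n)          # weights sum to 1
    return float(np.sum(w * np.cumprod(p)))

# Compared with the hard mean, one poorly matching feature does not dominate.
local_sims = [0.95, 0.9, 0.85, 0.2]
print("mean:", np.mean(local_sims))
print("SLF :", soft_likelihood(local_sims, alpha=0.7))
```

Larger alpha puts more weight on the best-matching features (an optimistic attitude), while smaller alpha pushes the aggregation toward the full product of all similarities (a pessimistic one).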


Quantum · 2021 · Vol 5 · pp. 467
Author(s): Marco A. Rodríguez-García, Isaac Pérez Castillo, P. Barberis-Blostein

Estimating correctly the quantum phase of a physical system is a central problem in quantum parameter estimation theory, owing to its wide range of applications from quantum metrology to cryptography. Ideally, the precision of the optimal quantum estimator is given by the so-called quantum Cramér-Rao bound, so any measurement strategy aims to obtain estimates as close to it as possible. However, more often than not, current state-of-the-art methods for estimating quantum phases fail to reach this bound because they rely on maximum likelihood estimators of non-identifiable likelihood functions. In this work we thoroughly review various schemes for estimating the phase of a qubit, identify the underlying problem that prevents these methods from reaching the quantum Cramér-Rao bound, and propose a new adaptive scheme based on covariant measurements to circumvent it. Our findings are carefully checked by Monte Carlo simulations, showing that the proposed method is both mathematically and experimentally more realistic and more efficient than the methods currently available.
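The toy example below (not the authors' covariant-measurement scheme) shows the non-identifiability problem in its simplest form: measuring a qubit in a single fixed basis yields a likelihood that depends on the phase only through cos(phi), so the maximum likelihood estimator cannot distinguish phi from 2*pi - phi.

```python
import numpy as np

# Toy illustration of a non-identifiable phase likelihood: for the state
# (|0> + e^{i*phi}|1>)/sqrt(2) measured in the X basis, P(+|phi) = (1+cos phi)/2,
# so phi and 2*pi - phi produce exactly the same likelihood function.

rng = np.random.default_rng(0)
true_phi = 1.0
p_plus = (1 + np.cos(true_phi)) / 2          # P(+ outcome | true phase)
outcomes = rng.random(500) < p_plus          # simulated X-basis measurement record
n_plus = int(outcomes.sum())
n_minus = len(outcomes) - n_plus

def log_likelihood(phi):
    p = (1 + np.cos(phi)) / 2
    return n_plus * np.log(p) + n_minus * np.log(1 - p)

# Identical values: the MLE cannot tell the two phases apart without a second,
# adaptively chosen measurement basis.
print(log_likelihood(true_phi), log_likelihood(2 * np.pi - true_phi))
```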


2021 · Vol 3 · pp. 1
Author(s): Dan Shiebler

We take a category-theoretic perspective on the relationship between probabilistic modeling and gradient-based optimization. We define two extensions of function composition to stochastic process subordination: one based on a co-Kleisli category and one based on the parameterization of a category with a Lawvere theory. We show how these extensions relate to the category of Markov kernels Stoch through a pushforward procedure. We extend stochastic processes to parametric statistical models and define a way to compose the likelihood functions of these models. We demonstrate how the maximum likelihood estimation procedure defines a family of identity-on-objects functors from categories of statistical models to the category of supervised learning algorithms Learn. Code to accompany this paper can be found on GitHub (https://github.com/dshieble/Categorical_Stochastic_Processes_and_Likelihood).
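As a loose, non-categorical illustration of the last point, the sketch below treats maximum likelihood estimation as a map that sends a parametric statistical model (a negative log-likelihood of parameters and data) to a supervised learning algorithm (a procedure from a data set to fitted parameters); the model and data are hypothetical, and none of the paper's functorial structure is reproduced.

```python
import numpy as np
from scipy.optimize import minimize

# MLE as a "model -> learner" map: given a negative log-likelihood, return a
# learning procedure that maps a data set to a parameter estimate.
def mle(neg_log_likelihood, init):
    def learner(x, y):
        return minimize(neg_log_likelihood, init, args=(x, y)).x
    return learner

# Example model: linear-Gaussian regression y ~ N(a*x + b, 1).
def nll_linear_gaussian(params, x, y):
    a, b = params
    resid = y - (a * x + b)
    return 0.5 * np.sum(resid ** 2)

x = np.linspace(0, 1, 50)
y = 2.0 * x + 0.5 + np.random.default_rng(1).normal(0, 0.1, size=x.size)
learn = mle(nll_linear_gaussian, init=[0.0, 0.0])
print(learn(x, y))   # approximately [2.0, 0.5]
```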

