plausible assumption
Recently Published Documents


TOTAL DOCUMENTS: 45 (FIVE YEARS: 14)
H-INDEX: 9 (FIVE YEARS: 1)

2021, Vol 2 (3), pp. 1-28
Author(s): Yunpu Ma, Volker Tresp

Semantic knowledge graphs are large-scale triple-oriented databases for knowledge representation and reasoning. Implicit knowledge can be inferred by modeling the tensor representations generated from knowledge graphs. However, as knowledge graphs continue to grow in size, classical modeling becomes increasingly computationally expensive. This article investigates how to capitalize on quantum resources to accelerate the modeling of knowledge graphs. In particular, we propose the first quantum machine learning algorithm for inference on tensorized data, i.e., on knowledge graphs. Since most tensor problems are NP-hard [18], it is challenging to devise quantum algorithms to support the inference task. We simplify the modeling task by making the plausible assumption that the tensor representation of a knowledge graph can be approximated by its low-rank tensor singular value decomposition, which is verified by our experiments. The proposed sampling-based quantum algorithm achieves a speedup, with a runtime polylogarithmic in the dimension of the knowledge graph tensor.
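The low-rank assumption at the core of the paper can be illustrated classically. The following sketch (illustrative only, not the proposed quantum algorithm; the toy triples, the chosen unfolding, and the truncation rank are made-up stand-ins) builds a small knowledge-graph adjacency tensor and checks how well a truncated SVD of one of its unfoldings reconstructs it:

```python
import numpy as np

# Toy knowledge graph: (subject, relation, object) triples over 4 entities and
# 2 relations. Entirely illustrative; the paper works with large-scale graphs.
n_entities, n_relations = 4, 2
triples = [(0, 0, 1), (1, 0, 2), (2, 1, 3), (0, 1, 3)]

# Binary adjacency tensor X[s, r, o] = 1 iff the triple (s, r, o) is in the graph.
X = np.zeros((n_entities, n_relations, n_entities))
for s, r, o in triples:
    X[s, r, o] = 1.0

# Classical stand-in for the low-rank assumption: SVD of the mode-1 unfolding,
# truncated to the leading singular values.
unfolding = X.reshape(n_entities, n_relations * n_entities)
U, sigma, Vt = np.linalg.svd(unfolding, full_matrices=False)

rank = 2
approx = (U[:, :rank] * sigma[:rank]) @ Vt[:rank, :]
rel_error = np.linalg.norm(unfolding - approx) / np.linalg.norm(unfolding)
print(f"relative error of the rank-{rank} reconstruction: {rel_error:.3f}")
```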


Synthese, 2021
Author(s): Patryk Dziurosz-Serafinowicz, Dominika Dziurosz-Serafinowicz

Abstract We explore the question of whether cost-free uncertain evidence is worth waiting for in advance of making a decision. A classical result in Bayesian decision theory, known as the value of evidence theorem, says that, under certain conditions, when you update your credences by conditionalizing on some cost-free and certain evidence, the subjective expected utility of obtaining this evidence is never less than the subjective expected utility of not obtaining it. We extend this result to a type of update method, a variant of Judea Pearl’s virtual conditionalization, where uncertain evidence is represented as a set of likelihood ratios. Moreover, we argue that focusing on this method rather than on the widely accepted Jeffrey conditionalization enables us to show that, under a fairly plausible assumption, gathering uncertain evidence not only maximizes expected pragmatic utility, but also minimizes expected epistemic disutility (inaccuracy).
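The structure of the value-of-evidence result can be seen in a small numerical sketch (all numbers are invented for illustration; this is not the paper's proof): with credences updated in proportion to the prior times the likelihoods, as in Pearl-style virtual conditionalization, the expected utility of waiting for the free evidence is at least that of deciding now.

```python
import numpy as np

# A toy two-state, two-act decision problem; all values are hypothetical.
prior = np.array([0.6, 0.4])          # credences in states s1, s2
utility = np.array([[10.0, 0.0],      # utility[a, s]: act a1 pays off only in s1
                    [4.0, 4.0]])      # act a2 is a safe option

# Uncertain evidence represented by likelihoods P(e | s) for two possible signals;
# updating multiplies the prior by these likelihoods and renormalizes.
likelihood = np.array([[0.8, 0.3],    # likelihood[e, s]
                       [0.2, 0.7]])

def best_expected_utility(credence):
    """Expected utility of the act that is optimal under the given credence."""
    return max(utility @ credence)

# Expected utility of deciding now, without waiting for the evidence.
eu_now = best_expected_utility(prior)

# Expected utility of waiting: average the post-update optimum over the signals.
eu_wait = 0.0
for e in range(likelihood.shape[0]):
    p_e = likelihood[e] @ prior                 # marginal probability of signal e
    posterior = likelihood[e] * prior / p_e     # update by likelihood ratios
    eu_wait += p_e * best_expected_utility(posterior)

print(f"decide now: {eu_now:.2f}, wait for evidence: {eu_wait:.2f}")
# Pattern of the theorem: eu_wait >= eu_now (here 6.40 vs 6.00).
```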


2021, Vol 81 (8)
Author(s): A. V. Nefediev

Abstract Recently the LHCb Collaboration announced the first observation of nontrivial structures in the double-$J/\psi$ mass spectrum in the mass range 6.2–7.2 GeV, and a theoretical coupled-channel analysis of these data performed in Dong et al. (Phys Rev Lett 126:132001, 2021) evidenced the existence of a new state X(6200) close to the double-$J/\psi$ threshold. Although its molecular interpretation seems the most plausible assumption, the present data do not exclude an admixture of a compact component in its wave function, for which a fully-charmed compact tetraquark is the most natural candidate. It is argued in this work that the QCD string model is compatible with the existence of a compact $cc\bar{c}\bar{c}$ state bound by QCD forces just below the double-$J/\psi$ threshold. A nontrivial interplay of the quark dynamics associated with this compact state and the molecular dynamics provided by soft gluon exchanges between $J/\psi$ mesons is discussed, and the physical X(6200) is argued to be a shallow bound state, in agreement with the results of the aforementioned coupled-channel analysis of the LHCb data.
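For orientation only (this is textbook material, not a result of the paper): for a shallow S-wave bound state just below the double-$J/\psi$ threshold, the binding energy $E_B$ and the $J/\psi\,J/\psi$ scattering length $a$ are tied together by the universal relation

```latex
% Universal shallow-bound-state relation (natural units); illustrative only,
% not part of the paper's QCD string or coupled-channel analysis.
E_B \;\simeq\; \frac{1}{2\mu a^{2}}, \qquad \mu = \frac{m_{J/\psi}}{2},
```

so a binding energy of only a few MeV corresponds to a scattering length much larger than the typical hadronic scale.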


Erkenntnis, 2021
Author(s): Sebastian Schmidt

Abstract The normative force of evidence can seem puzzling. It seems that having conclusive evidence for a proposition does not, by itself, make it true that one ought to believe the proposition. But spelling out the condition that evidence must meet in order to provide us with genuine normative reasons for belief seems to lead us into a dilemma: the condition either fails to explain the normative significance of epistemic reasons or it renders the content of epistemic norms practical. The first aim of this paper is to spell out this challenge for the normativity of evidence. I argue that the challenge rests on a plausible assumption about the conceptual connection between normative reasons and blameworthiness. The second aim of the paper is to show how we can meet the challenge by spelling out a concept of epistemic blameworthiness. Drawing on recent accounts of doxastic responsibility and epistemic blame, I suggest that the normativity of evidence is revealed in our practice of suspending epistemic trust in response to impaired epistemic relationships. Recognizing suspension of trust as a form of epistemic blame allows us to make sense of a purely epistemic kind of normativity the existence of which has recently been called into doubt by certain versions of pragmatism and instrumentalism.


2021
Author(s): Richard Breen, John Ermisch

Heterogeneity in the effect of a treatment on an outcome is a plausible assumption to make about the vast majority of causal relationships studied in the social sciences. In these circumstances the IV estimator is often interpreted as yielding an estimate of a Local Average Treatment Effect (LATE): a marginal change in the outcome for those whose treatment is changed by the variation of the particular instrument in the study. Our aim is to explain the relationship between the LATE parameter and its IV estimator by using a simple model which is easily accessible to applied researchers, and by relating the model to examples from the demographic literature. A focus of the paper is how additional heterogeneity in the instrument-treatment relationship affects the properties and interpretation of the IV estimator. We show that if the two kinds of heterogeneity are correlated, then the LATE parameter combines both the underlying treatment effects and the parameters of the instrument-treatment relationship. It is then a more complicated concept than many researchers realise.
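A minimal simulation (hypothetical numbers, not the authors' model or data) shows the basic pattern: when individual treatment effects differ across compliance types, the IV (Wald) estimator recovers the effect among compliers, which need not equal the overall average effect.

```python
import numpy as np

# Minimal LATE simulation; all parameter values are hypothetical.
rng = np.random.default_rng(0)
n = 200_000

z = rng.integers(0, 2, n)                          # binary instrument
always = rng.random(n) < 0.2                       # always-takers
complier = (~always) & (rng.random(n) < 0.5)       # compliers (monotonicity: no defiers)
d = np.where(always, 1, np.where(complier, z, 0))  # observed treatment status

# Heterogeneous effects: always-takers benefit more than compliers/never-takers.
effect = 1.0 + 2.0 * always + rng.normal(0.0, 0.5, n)
y = 0.5 + effect * d + rng.normal(0.0, 1.0, n)     # outcome

# Wald / IV estimate: reduced form divided by first stage.
wald = (y[z == 1].mean() - y[z == 0].mean()) / (d[z == 1].mean() - d[z == 0].mean())

print(f"IV (Wald) estimate: {wald:.2f}")
print(f"LATE (effect among compliers): {effect[complier].mean():.2f}")
print(f"ATE  (effect in whole sample): {effect.mean():.2f}")
# The IV estimate tracks the LATE (about 1.0), not the ATE (about 1.4).
```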


Author(s): Ivano Ciardelli

Abstract The view that if-clauses function semantically as restrictors is widely regarded as the only candidate for a fully general account of conditionals. The standard implementation of this view assumes that, where no operator to be restricted is in sight, if-clauses restrict covert epistemic modals. Stipulating such modals, however, lacks independent motivation and leads to wrong empirical predictions. In this paper I provide a theory of conditionals on which if-clauses are uniformly interpreted as restrictors, but no covert modals are postulated. Epistemic if-clauses, like those in bare conditionals, restrict an information state parameter which is used to interpret an expressive layer of the language. I show that this theory yields an attractive account of bare and overtly modalized conditionals and solves various empirical problems for the standard view, while dispensing with its less plausible assumption.
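As a very rough illustration of the restrictor idea (a toy model, not Ciardelli's formal system): a bare conditional "if p, q" can be checked against an information state, modelled as a set of possible worlds, by restricting that state to the p-worlds and asking whether q holds throughout.

```python
from itertools import product

# All combinations of two toy propositions; names are purely illustrative.
worlds = [dict(zip(("rain", "wet"), bits)) for bits in product([True, False], repeat=2)]

def restrict(state, antecedent):
    """Restrict an information state to the worlds satisfying the antecedent."""
    return [w for w in state if antecedent(w)]

def supports_conditional(state, antecedent, consequent):
    """The state supports 'if p, q' iff q holds throughout the p-restricted state."""
    return all(consequent(w) for w in restrict(state, antecedent))

# An information state that rules out rain-without-wetness supports "if rain, wet".
state = [w for w in worlds if not (w["rain"] and not w["wet"])]
print(supports_conditional(state, lambda w: w["rain"], lambda w: w["wet"]))  # True
```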


Author(s): Hamidreza Tahmasbi, Mehrdad Jalali, Hassan Shakeri

Abstract An essential problem in real-world recommender systems is that user preferences are not static: users are likely to change their preferences over time. Recent studies have shown that modelling and capturing the dynamics of user preferences leads to significant improvements in recommendation accuracy and, consequently, user satisfaction. In this paper, we develop a framework to capture user preference dynamics in a personalized manner, based on the fact that changes in user preferences can vary individually. We also consider the plausible assumption that older user activities should have less influence on a user's current preferences. We introduce an individual time decay factor for each user, set according to the rate of that user's preference dynamics, to weigh past user preferences and gradually decrease their importance. We exploit users' demographics as well as the similarities extracted among users over time, aiming to enhance the prior knowledge about user preference dynamics, in addition to the past weighted user preferences, in a coupled tensor factorization technique that provides top-K recommendations. Experimental results on two real social media datasets, Last.fm and Movielens, indicate that our proposed model is more accurate and more robust than other competitive methods and copes better with problems such as cold start and data sparsity.
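A minimal sketch of the individual time-decay idea described above (function names, decay rates, and timestamps are illustrative; this is not the authors' coupled tensor factorization model):

```python
import numpy as np

def decayed_weights(timestamps, now, decay_rate):
    """Exponentially down-weight past interactions by their age, per-user decay_rate."""
    age = now - np.asarray(timestamps, dtype=float)
    return np.exp(-decay_rate * age)

# Two users with the same interaction history but different preference dynamics:
# a "stable" user (small decay rate) vs. a "fast-changing" user (large decay rate).
timestamps = [1, 5, 9, 10]   # e.g. weeks at which items were consumed
now = 10

for name, rate in [("stable user", 0.05), ("fast-changing user", 0.5)]:
    w = decayed_weights(timestamps, now, rate)
    print(name, np.round(w / w.sum(), 3))   # normalized weights on past activities
```

With a large decay rate, almost all of the weight concentrates on the most recent activities, which is exactly the behaviour the individual decay factor is meant to control.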


2020, Vol 2020 (3), pp. 64-74
Author(s): Andriy Viktorovich Goncharenko

Abstract The paper deals with the uncertainty of the hybrid combined optional functions of an operated system's possible states. Traditionally, the probabilities of the system's possible states are treated as reliability measures. In the framework of the proposed doctrine, however, optimality (for example, the maximal probability of a system's state) is determined on the basis of a plausible assumption about intrinsic, objectively existing parameters. The two wings of entropy theory concern, on the one hand, the subjective preference functions of subjective analysis, which address the multi-alternativeness of the operational situation in an individual's choice problems, and, on the other hand, the objectively existing characteristics used in theoretical physics. The entropy paradigm discussed in the paper proceeds from the objectively present phenomena of a state's probability and that probability's maximum. The theoretical arguments and mathematical derivations are illustrated with the necessary plotted diagrams.
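As a generic illustration of the quantities involved (not the paper's hybrid combined optional functions model; the probabilities are invented), one can compute the entropy of a system's state-probability distribution and identify its most probable state:

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])            # probabilities of the system's possible states
entropy = -np.sum(p * np.log(p))         # uncertainty of the state distribution (nats)
most_probable_state = int(np.argmax(p))  # state with maximal probability

print(f"entropy = {entropy:.3f} nats, most probable state = {most_probable_state}")
```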


2020, Vol 2020 (9)
Author(s): Naoyuki Haba, Yukihiro Mimura, Toshifumi Yamada

Abstract The ratio of the partial widths of some dimension-5 proton decay modes can be predicted without detailed knowledge of supersymmetric (SUSY) particle masses, and this allows us to experimentally test various SUSY grand unified theory (GUT) models without discovering SUSY particles. In this paper, we study the ratio of the partial widths of the $p\to K^0\mu^+$ and $p\to K^+\bar{\nu}_\mu$ decays in the minimal renormalizable SUSY $SO(10)$ GUT, under only the plausible assumption that the 1st- and 2nd-generation left-handed squarks are mass-degenerate. In the model, we expect that the Wilson coefficients of the dimension-5 operators responsible for these modes are of the same order and that the ratio of the $p\to K^0\mu^+$ and $p\to K^+\bar{\nu}_\mu$ partial widths is $O(0.1)$. Hence, we may be able to detect both the $p\to K^0\mu^+$ and $p\to K^+\bar{\nu}_\mu$ decays at Hyper-Kamiokande, thereby gaining a hint for the minimal renormalizable SUSY $SO(10)$ GUT. Moreover, since this partial width ratio is strongly suppressed in the minimal $SU(5)$ GUT, it allows us to distinguish the minimal renormalizable SUSY $SO(10)$ GUT from the minimal $SU(5)$ GUT. In the main body of the paper, we perform a fit of the quark and lepton masses and flavor mixings with the Yukawa couplings of the minimal renormalizable $SO(10)$ GUT, and derive a concrete prediction for the partial width ratio based on the fit results. We find that the partial width ratio generally varies in the range $0.05$–$0.6$, confirming the above expectation.
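For reference, the observable studied in the paper is the partial-width ratio written out below (definition only; the quoted $O(0.1)$ estimate and the fitted range $0.05$–$0.6$ are the paper's results, not derived here):

```latex
% Partial-width ratio of the two dimension-5 proton decay modes (definition only).
R \;\equiv\; \frac{\Gamma\!\left(p \to K^{0}\mu^{+}\right)}{\Gamma\!\left(p \to K^{+}\bar{\nu}_{\mu}\right)},
\qquad R \sim \mathcal{O}(0.1) \ \text{in the minimal renormalizable SUSY $SO(10)$ GUT}.
```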


The clinical manifestations of myocardial ischemia are protean in nature and include a variable combination of typical or atypical angina symptoms, electrocardiographic changes, noninvasive findings of regional wall motion abnormalities, and reversible scintigraphic perfusion defects—the changes of which, importantly, may or may not be of epicardial coronary origin. Thus, mounting evidence indicates that the presence or absence of atherosclerotic coronary artery disease (CAD) should no longer be considered a surrogate marker for myocardial ischemia, as suggested by the high prevalence of minor or absent coronary obstruction among patients with proven myocardial ischemia. Whereas the management of CAD has been largely predicated on the plausible assumption that flow-limiting obstructions of the epicardial coronary arteries are the proximate cause of both angina and myocardial ischemia, there is scant evidence from many randomized trials and several meta-analyses that treating epicardial coronary obstructions in patients with stable CAD, particularly with percutaneous coronary intervention (PCI), reduces mortality and morbidity, as compared with optimal medical therapy (OMT). A crucial scientific question for which evidence has been lacking is whether more severe and extensive myocardial ischemia is the driver of adverse cardiovascular outcomes and whether an invasive strategy with myocardial revascularization would be superior to OMT alone in such patients. The results of the recent ISCHEMIA trial (International Study of Comparative Health Effectiveness with Medical and Invasive Approaches), however, have failed to show—even in this higher-risk CAD subset—any incremental clinical benefit of revascularization as compared with OMT alone on cardiac event reduction.

