suitable notion
Recently Published Documents


TOTAL DOCUMENTS: 38 (five years: 15)
H-INDEX: 6 (five years: 2)

Author(s):  
Cristián Soto

Nomological Humeanism has developed into a research program encompassing several variations on a single theme, namely, the view that laws are statements about regularities that we find in nature. After briefly revisiting an early form of nomological Humeanism in Hume’s critique of the idea of necessary connection, this article critically examines Lewis’ two-fold approach based on Humean supervenience and the best system account. We shall point out three limits of nomological Humeanism, which are widely recognized in the literature: its inadequacy in view of physical theories, its explanatory circularity, and its purported anthropomorphism, all of which advocates of nomological Humeanism have attempted to overcome (Jaag and Loew 2020, Loewer 2004, and Massimi 2018). Lastly, we will argue that nomological Humeanism fails to provide a suitable notion of modality for laws of nature. This latter issue continues to represent a live challenge for empiricism in the philosophy of physical laws.


Author(s):  
Claudio Meneses ◽  
Leon A. Takhtajan

Abstract: Moduli spaces of stable parabolic bundles of parabolic degree 0 over the Riemann sphere are stratified according to the Harder–Narasimhan filtration of underlying vector bundles. Over a Zariski open subset $$\mathscr{N}_{0}$$ of the open stratum depending explicitly on a choice of parabolic weights, a real-valued function $$\mathscr{S}$$ is defined as the regularized critical value of the non-compact Wess–Zumino–Novikov–Witten action functional. The definition of $$\mathscr{S}$$ depends on a suitable notion of parabolic bundle ‘uniformization map’ following from the Mehta–Seshadri and Birkhoff–Grothendieck theorems. It is shown that $$-\mathscr{S}$$ is a primitive for a (1,0)-form $$\vartheta$$ on $$\mathscr{N}_{0}$$ associated with the uniformization data of each intrinsic irreducible unitary logarithmic connection. Moreover, it is proved that $$-\mathscr{S}$$ is a Kähler potential for $$(\Omega -\Omega_{\mathrm{T}})|_{\mathscr{N}_{0}}$$, where $$\Omega$$ is the Narasimhan–Atiyah–Bott Kähler form in $$\mathscr{N}$$ and $$\Omega_{\mathrm{T}}$$ is a certain linear combination of tautological (1,1)-forms associated with the marked points. These results provide an explicit relation between the cohomology class $$[\Omega]$$ and tautological classes, which holds globally over certain open chambers of parabolic weights where $$\mathscr{N}_{0} = \mathscr{N}$$.
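For orientation, under one common set of conventions (a smooth function $$f$$ is a primitive of a (1,0)-form $$\vartheta$$ when $$\partial f = \vartheta$$, and a Kähler potential for a form $$\Omega$$ when $$\Omega = \tfrac{i}{2}\partial\bar{\partial} f$$; the paper's own normalization may differ), the two claims about $$-\mathscr{S}$$ read schematically as

$$\partial(-\mathscr{S}) = \vartheta, \qquad (\Omega - \Omega_{\mathrm{T}})\big|_{\mathscr{N}_{0}} = \tfrac{i}{2}\,\partial\bar{\partial}(-\mathscr{S}).$$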


2021 ◽  
Author(s):  
Alessandro Artale ◽  
Andrea Mazzullo ◽  
Ana Ozaki ◽  
Frank Wolter

Definite descriptions are phrases of the form ‘the x such that φ’, used to refer to single entities in a context. They are often more meaningful to users than individual names alone, in particular when modelling or querying data over ontologies. We investigate free description logics with both individual names and definite descriptions as terms of the language, while also accounting for their possible lack of denotation. We focus on the extensions of ALC and, respectively, EL with nominals, the universal role, and definite descriptions. We show that standard reasoning in these extensions is not harder than in the original languages, and we characterise the expressive power of concepts relative to first-order formulas using a suitable notion of bisimulation. Moreover, we lay the foundations for automated support for definite description generation by studying the complexity of deciding the existence of definite descriptions for an individual under an ontology. Finally, we provide a polynomial-time reduction of reasoning in other free description logic languages based on dual-domain semantics to the case of partial interpretations.
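As an illustrative sketch only (not the paper's formal dual-domain semantics, and with all names such as denote_iota invented for the example), the way a description ‘the x such that φ’ can lack denotation is easy to mimic over a finite interpretation:

```python
from typing import Callable, Hashable, Optional, Set

Element = Hashable

def denote_iota(domain: Set[Element],
                phi: Callable[[Element], bool]) -> Optional[Element]:
    """Denotation of 'the x such that phi(x)' over a finite domain.

    Returns the unique element satisfying phi, or None when the
    description lacks denotation (no satisfier, or more than one).
    """
    satisfiers = [d for d in domain if phi(d)]
    return satisfiers[0] if len(satisfiers) == 1 else None

# Toy domain of individuals with two properties.
domain = {"alice", "bob", "carol"}
is_ceo = lambda x: x == "alice"                  # exactly one satisfier -> denotes
is_employee = lambda x: x in {"bob", "carol"}    # two satisfiers -> no denotation

print(denote_iota(domain, is_ceo))       # 'alice'
print(denote_iota(domain, is_employee))  # None
```

Returning None when there is no unique satisfier mirrors the idea of terms that simply fail to denote under partial interpretations.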


Author(s):  
Ugo Bruzzo ◽  
William Montoya

Abstract: We establish the Hodge conjecture for some subvarieties of a class of toric varieties. First we study quasi-smooth intersections in a projective simplicial toric variety, which is a suitable notion to generalize smooth complete intersection subvarieties in the toric environment, and in particular quasi-smooth hypersurfaces. We show that under appropriate conditions the Hodge conjecture holds for a very general quasi-smooth intersection subvariety, generalizing the work on quasi-smooth hypersurfaces of the first author and Grassi in Bruzzo and Grassi (Commun Anal Geom 28: 1773–1786, 2020). We also show that the Hodge conjecture holds asymptotically for suitable quasi-smooth hypersurfaces in the Noether–Lefschetz locus, where “asymptotically” means that the degree of the hypersurface is big enough, under the assumption that the ambient variety $${\mathbb{P}}_\Sigma^{2k+1}$$ has Picard group $${\mathbb{Z}}$$. This extends to a class of toric varieties Otwinowska’s result in Otwinowska (J Alg Geom 12: 307–320, 2003).
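For orientation, the classical statement being generalized is the following: for a smooth complex projective variety $$X$$, the Hodge conjecture asserts that the space of rational Hodge classes

$$\mathrm{Hdg}^{k}(X) = H^{2k}(X,\mathbb{Q}) \cap H^{k,k}(X)$$

is spanned by classes of algebraic cycles of codimension $$k$$; in the quasi-smooth toric setting it is the analogous statement with $$\mathbb{Q}$$-coefficients that is at stake.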


2021 ◽  
Vol 2021 (4) ◽  
Author(s):  
Mario Herrero-Valea ◽  
Stefano Liberati ◽  
Raquel Santos-Garcia

Abstract: The persistence of a suitable notion of black hole thermodynamics in Lorentz-breaking theories of gravity is not only a non-trivial consistency test for such theories but also an interesting investigation per se, as it might help us identify the crucial features at the root of these surprising laws governing such purely gravitational objects. In past investigations, controversial findings were presented in this sense. With the aim of settling this issue, we present here two complementary derivations of Hawking radiation in geometries endowed with universal horizons: a novel feature of black holes in Lorentz-breaking theories of gravity which reproduces several properties normally characterizing Killing horizons. We find that both derivations agree on the fact that the Hawking temperature associated to these geometries is set by the generalized universal horizon peeling surface gravity, as required for consistency with extant derivations of the first law of thermodynamics for these black holes. We shall also comment on the compatibility of our results with previous alternative derivations and on their significance for the survival of the generalized second law of black hole thermodynamics in Lorentz-breaking theories of gravity.
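For reference, with the standard normalization and in natural units ($$\hbar = c = k_{B} = 1$$), a Hawking temperature set by a surface gravity takes the familiar form

$$T_{\mathrm{UH}} = \frac{\kappa_{\mathrm{peeling}}}{2\pi},$$

the abstract's point being that the relevant surface gravity here is the generalized peeling surface gravity of the universal horizon rather than that of a Killing horizon.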


2021 ◽  
Author(s):  
Peter F. Faul

Abstract: It is well known that the set of isomorphism classes of extensions of groups with abelian kernel is characterized by the second cohomology group. In this paper we generalise this characterization of extensions to a natural class of extensions of monoids, the cosetal extensions. An extension $$N \xrightarrow{k} G \xrightarrow{e} H$$ is cosetal if for all $$g, g' \in G$$ with $$e(g) = e(g')$$, there exists a (not necessarily unique) $$n \in N$$ such that $$g = k(n)g'$$. These extensions generalise the notion of special Schreier extensions, which are themselves examples of Schreier extensions. Just as in the group case, where a semidirect product can be associated to each extension with abelian kernel, we show that to each cosetal extension (with abelian group kernel) we can uniquely associate a weakly Schreier split extension. The characterization of weakly Schreier split extensions is combined with a suitable notion of a factor set to provide a cohomology group granting a full characterization of cosetal extensions, as well as supplying a Baer sum.
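To make the definition concrete, here is a toy check of the cosetal condition on a small group example (group extensions are automatically cosetal, so this only illustrates the definition; all names below are invented for the sketch):

```python
from itertools import product

# Toy example: N = Z_2 --k--> G = Z_4 --e--> H = Z_2, written additively,
# with k(n) = 2n mod 4 and e(g) = g mod 2.
N, G = range(2), range(4)
k = lambda n: (2 * n) % 4
e = lambda g: g % 2
op = lambda a, b: (a + b) % 4  # the operation on G, here addition mod 4

def is_cosetal() -> bool:
    """Whenever e(g) == e(g'), some n in N must satisfy g == k(n) * g' (additively: k(n) + g')."""
    return all(
        any(op(k(n), g2) == g1 for n in N)
        for g1, g2 in product(G, G)
        if e(g1) == e(g2)
    )

print(is_cosetal())  # True: every group extension satisfies the cosetal condition
```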


Synthese ◽  
2020 ◽  
Author(s):  
Eleonora Montuschi

Abstract: Several and repeated attempts have been made to say what objectivity consists of and why it should be pursued in research. In the first part of this paper two main strategies are singled out, sharing the assumption that there is a way (or different ways) objectivity can be thought of in the abstract (which does not mean without content), and that it can be instantiated in context, and in enough contexts to justify the abstract case. This assumption, however, is open to the objection that objectivity so conceived does not admit of one clear definition (even a disjunctive one) that is appropriate in many or most contexts where we intend the term to do its work. It also does not seem to pay specific attention to what actually constitutes a context of practice when we think of objectivity in some relation to such a context. The aim of this paper is to question how context works both as a mechanism of meaning formation for the concept of objectivity and as a practical framework for pursuing research objectively. To articulate a suitable notion of context, some insight from recent literature in the philosophy of science is first introduced and then adapted to show how research practices successfully achieve objectivity as one of their aims. It will be argued that an idea of context that includes activities which (in a way to be qualified) are relevant and reliable towards a settled aim is the model of practice that makes objectivity a pursuable task in research. This contextual picture of objectivity, it will be suggested, might better serve the purpose of scientific research (including social research) than either of the two descriptive strategies outlined at the beginning of this paper.


Entropy ◽  
2020 ◽  
Vol 22 (11) ◽  
pp. 1209
Author(s):  
Massimo Tessarotto ◽  
Claudio Cremaschini

This paper deals with the mathematical formulation of the Heisenberg Indeterminacy Principle in the framework of Quantum Gravity. The starting point is the establishment of the so-called time-conjugate momentum inequalities holding for non-relativistic and relativistic Quantum Mechanics. The validity of analogous Heisenberg inequalities in quantum gravity, which must be based on strictly physically observable quantities (i.e., necessarily either 4-scalar or 4-vector in nature), is shown to require the adoption of a manifestly covariant and unitary quantum theory of the gravitational field. Based on the prescription of a suitable notion of Hilbert space scalar product, the relevant Heisenberg inequalities are established. Besides the coordinate-conjugate momentum inequalities, these include a novel proper-time-conjugate extended momentum inequality. Physical implications and the connection with the deterministic limit recovering General Relativity are investigated.
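As a point of reference, the non-relativistic prototype of such inequalities is the standard Heisenberg relation for a coordinate and its conjugate momentum,

$$\Delta q\,\Delta p \ge \frac{\hbar}{2},$$

which the paper's covariant inequalities, built from 4-scalar and 4-vector observables, generalize.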


2020 ◽  
Vol 37 (1-2) ◽  
pp. 25-53 ◽  
Author(s):  
Claudio Albanese ◽  
Yannick Armenti ◽  
Stéphane Crépey

Abstract: Based on an XVA analysis of centrally cleared derivative portfolios, we consider two capital and funding issues pertaining to the efficiency of the design of central counterparties (CCPs). First, we consider an organization of a clearing framework whereby a CCP would also play the role of a centralized XVA calculator and management center. The default fund contributions would become pure capital at risk of the clearing members, remunerated as such at some hurdle rate, i.e. return-on-equity. Moreover, we challenge the current Cover 2 EMIR default fund sizing rule with a broader risk-based approach, relying on a suitable notion of economic capital of a CCP. Second, we compare the margin valuation adjustments (MVAs) resulting from two different initial margin (IM) raising strategies. The first one is unsecured borrowing by the clearing member. As an alternative, the clearing member delegates the posting of its initial margin to a so-called specialist lender, which, in case of default of the clearing member, receives back from the CCP the portion of the IM not used to cover losses. The alternative strategy results in a significant MVA compression. A numerical case study shows that the volatility swings of the IM funding expenses can even be the main contributor to an economic-capital-based default fund of a CCP. This is an illustration of the transfer of counterparty risk into liquidity risk triggered by extensive collateralization.
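The contrast between a Cover-2-style sizing and an economic-capital-style sizing can be illustrated with a deliberately toy simulation; this is not the paper's XVA model, and every distribution, parameter, and quantile below is invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: stressed losses-over-collateral for 20 clearing members
# across 10_000 scenarios (purely illustrative numbers).
n_members, n_scenarios = 20, 10_000
losses = rng.gamma(shape=2.0, scale=1.0, size=(n_scenarios, n_members))

# "Cover 2"-style size (schematic reading of the rule): in each stress
# scenario, sum the two largest member losses, then take the worst scenario.
top2_per_scenario = np.sort(losses, axis=1)[:, -2:].sum(axis=1)
cover2_size = float(top2_per_scenario.max())

# Economic-capital-style size: a tail quantile (here 99.75% value-at-risk)
# of the aggregate loss distribution across all members.
aggregate_loss = losses.sum(axis=1)
economic_capital_size = float(np.quantile(aggregate_loss, 0.9975))

print(f"Cover-2-style default fund:          {cover2_size:,.2f}")
print(f"Economic-capital-style default fund: {economic_capital_size:,.2f}")
```

In this kind of toy comparison, the quantile-based figure responds to the whole tail of the aggregate loss distribution rather than to the two largest names only, which is the broader risk-based spirit the abstract alludes to.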


Author(s):  
S. Montaldo ◽  
C. Oniciuc ◽  
A. Ratto

In recent years, the study of the bienergy functional has attracted the attention of a large community of researchers, but there are not many examples where the second variation of this functional has been thoroughly studied. We shall focus on this problem and, in particular, we shall compute the exact index and nullity of some known examples of proper biharmonic maps. Moreover, we shall analyze a case where the domain is not compact. More precisely, we shall prove that a large family of proper biharmonic maps [Formula: see text] is strictly stable with respect to compactly supported variations. In general, the computations involved in this type of problem are very long. For this reason, we shall also define and apply to specific examples a suitable notion of index and nullity with respect to equivariant variations.
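For readers meeting the terminology here, the index and nullity of a critical point $$\varphi$$ of the bienergy functional (often written $$E_{2}$$) are defined in the usual way from its second variation, i.e. its Hessian; this standard definition is recalled only for orientation:

$$\operatorname{index}(\varphi) = \sup\{\dim V : \operatorname{Hess} E_{2}(\varphi)\big|_{V\times V} \text{ is negative definite}\}, \qquad \operatorname{nullity}(\varphi) = \dim \ker \operatorname{Hess} E_{2}(\varphi).$$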

