Lines and Semi-Countably Differentiable Primes

2021 ◽  
Vol 70 (2) ◽  
pp. 90-98
Author(s):  
Abigaël Alkema

Let l(u) ⊃ |G|. A central problem in higher non-linear graph theory is the construction of projective numbers. We show that recent developments in axiomatic set theory [6] have raised the question of whether E is not dominated by l. On the other hand, the work in [6, 24] did not consider the hyper-real case.

1942 ◽  
Vol 7 (4) ◽  
pp. 133-145 ◽  
Author(s):  
Paul Bernays

Our task in the treatment of general set theory will be to give a survey for the purpose of characterizing the different stages and the principal theorems with respect to their axiomatic requirements from the point of view of our system of axioms. The delimitation of “general set theory” which we have in view differs from that of Fraenkel's general set theory, and also from that of “standard logic” as understood by most logicians. It is adapted rather to the tendency of von Neumann's system of set theory—the von Neumann system having been the first in which the possibility appeared of separating the assumptions which are required for the conceptual formations from those which lead to the Cantor hierarchy of powers. Thus our intention is to obtain general set theory without use of the axioms V d, V c, VI.

It will also be desirable to separate those proofs which can be made without the axiom of choice, and in doing this we shall have to use the axiom V*—i.e., the theorem of replacement taken as an axiom. From V*, as we saw in §4, we can immediately derive V a and V b as theorems, and also the theorem that a function whose domain is represented by a set is itself represented by a functional set; and on the other hand V* was found to be derivable from V a and V b in combination with the axiom of choice. (These statements on deducibility are of course all on the basis of the axioms I–III.)
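For readers unfamiliar with Bernays' notation, axiom V* is the theorem of replacement adopted as an axiom. In modern Zermelo-Fraenkel notation (a paraphrase, not Bernays' own class-theoretic formulation), the replacement schema can be written as:

\[
\forall \vec{p}\,\Bigl[\forall x\,\forall y\,\forall y'\,\bigl(\varphi(x,y,\vec{p}) \wedge \varphi(x,y',\vec{p}) \rightarrow y = y'\bigr) \;\rightarrow\; \forall a\,\exists b\,\forall y\,\bigl(y \in b \leftrightarrow \exists x \in a\;\varphi(x,y,\vec{p})\bigr)\Bigr]
\]

That is, whenever the formula \varphi defines a functional relation, the image of any set a under that relation is again a set.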


1959 ◽  
Vol 24 (2) ◽  
pp. 154-166 ◽  
Author(s):  
Azriel Lévy

Ackermann introduced in [1] a system of axiomatic set theory. The quantifiers of this set theory range over a universe of objects which we call classes. Among the classes we distinguish the sets. Here we shall show that, in some sense, all the theorems of Ackermann's set theory can be proved in Zermelo-Fraenkel's set theory. We shall also show that, on the other hand, it is possible to prove in Ackermann's set theory very strong theorems of the Zermelo-Fraenkel set theory.
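For orientation, the distinctive ingredient of Ackermann's system is his schema of set construction. In modern notation, with V denoting the class of all sets and \varphi a formula not containing V whose parameters p_1, ..., p_n are sets, it is usually rendered roughly as follows (a standard paraphrase rather than the exact formulation of [1]):

\[
p_1,\dots,p_n \in V \;\wedge\; \forall y\,\bigl(\varphi(y,p_1,\dots,p_n) \rightarrow y \in V\bigr) \;\rightarrow\; \exists z \in V\;\forall y\,\bigl(y \in z \leftrightarrow \varphi(y,p_1,\dots,p_n)\bigr)
\]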


Author(s):  
Diego Liberati

In many fields of research, as well as in everyday life, one often has to face a huge amount of data without an immediate grasp of an underlying simple structure, even though one often exists. A typical example is the growing field of bio-informatics, where new technologies such as micro-arrays provide thousands of gene expression values for a single cell in a simple, fast, integrated way. The everyday consumer, on the other hand, is involved in a process that is not so different from a logical point of view: the data associated with his fidelity badge contribute to a large database of many customers, whose underlying consumption trends are of interest to the distribution market. After collecting so many variables (say gene expressions, or goods) for so many records (say patients, or customers), possibly with the help of wrapping or warehousing approaches in order to mediate among different repositories, the problem arises of reconstructing a synthetic mathematical model capturing the most important relations between variables. To this purpose, two critical problems must be solved (a minimal sketch of both steps follows this abstract):

1. Select the most salient variables, in order to reduce the dimensionality of the problem and thus simplify the understanding of the solution.
2. Extract underlying rules involving conjunctions and/or disjunctions of such variables, in order to obtain a first idea of their possibly non-linear relations, as a first step toward designing a representative model whose variables will be the selected ones.

Once the candidate variables are selected, a mathematical model of the dynamics of the underlying generating framework still has to be produced. A first hypothesis of linearity may be investigated, although it is usually only a very rough approximation when the values of the variables are not close to the operating point around which the linear approximation is computed. On the other hand, building a non-linear model is far from easy: the structure of the non-linearity needs to be known a priori, which is not usually the case. A typical approach consists in exploiting a priori knowledge to define a tentative structure, then refining and modifying it on the training subset of the data, and finally retaining the structure that best fits a cross-validation on the testing subset of the data. The problem is even more complex when the collected data exhibit hybrid dynamics, i.e. their evolution in time is a sequence of smooth behaviors and abrupt changes.
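As an illustration of the two steps above, the sketch below ranks variables by a univariate salience score and then reads conjunctive/disjunctive rules off a shallow decision tree. It assumes scikit-learn, which the author does not mention, and the data and names (var_..., the synthetic outcome) are purely hypothetical.

import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical data: 200 records (patients or customers) x 50 variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))
y = (X[:, 3] + X[:, 17] > 0).astype(int)  # outcome driven by two of the variables

# Step 1: keep only the most salient variables to reduce dimensionality.
selector = SelectKBest(score_func=f_classif, k=5).fit(X, y)
salient = selector.get_support(indices=True)
print("selected variables:", salient)

# Step 2: a shallow tree on the selected variables; each root-to-leaf path is a
# conjunction of threshold conditions, and alternative paths to the same class
# act as disjunctions.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X[:, salient], y)
print(export_text(tree, feature_names=[f"var_{i}" for i in salient]))

Only after this first-cut logical description would one attempt the dynamic (linear, non-linear or hybrid) model discussed in the rest of the abstract.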


2020 ◽  
pp. 73-81
Author(s):  
Nicolas Bommarito

This chapter discusses the cycle of birth and death in Buddhism. It is important to distinguish rebirth from reincarnation. Reincarnation is the transfer of a soul from body to body. Rebirth, on the other hand, is a cycle of many births and deaths without any soul linking them. It is commonly called rebirth, but it could equally well be called redeath, since each lifetime involves both a birth and a death. In many traditional forms of Buddhism, this cycle of rebirth and redeath includes supernatural beings and places. Many traditional Buddhists think of these places and beings as real. In fact, this is central to many traditional statements of the central problem Buddhism aims to solve; these different kinds of lives all make up what is called samsara. Beings are constantly being born and dying in these different realms, over and over and over. On this traditional understanding, Buddhism solves the problem by ending this cycle of birth and death. The solution, sometimes called nirvana, is about getting out of the cycle.


1981 ◽  
Vol 3 (2) ◽  
Author(s):  
Klaus Lüderssen

It is shown by means of four examples that the demarcation between law and morals has become problematical. The study of more recent developments in ethics and in law indicates that in both fields the relevance of discourse and consent has grown. Though both law and morals aim at agreement, their degree of dependence on it differs. The definition of law and morals suggested in this article is based on this view. Legitimate law consists of norms which, besides fulfilling other conditions, have attained a certain degree of consent. On the other hand, one can only speak of social morals when a very high degree of consent has been reached. The consequences of this definition are explained by means of the examples presented at the beginning.


2013 ◽  
Vol 22 (04) ◽  
pp. 1330006 ◽  
Author(s):  
PAULO VARGAS MONIZ

This report comprises two parts. On the one hand, I will, based on the talks at the CM4 parallel session "Quantum Cosmology and Quantum Effects in the Early Universe" which I chaired, point to interesting recent developments in quantum cosmology. On the other hand, some of the basics of supersymmetric quantum cosmology are briefly reviewed, pointing to promising lines of research to explore. I will start with the latter, finishing the report with the former.


2017 ◽  
Vol 8 ◽  
pp. 140
Author(s):  
Cristina López Vargas

With growing sustainability concerns in mind, firms seek to implement reverse logistics systems in their operations. However, if these practices are not properly implemented, they can be costly and even ineffective. In order to guide company efforts, the present study provides a comprehensive framework based on two dimensions. On the one hand, it follows a reverse logistics management model stage by stage. On the other hand, the framework brings together concrete measures to optimize supply chain (SC) sustainability from three perspectives: operative, economic and environmental. The proposed framework thus allows reverse logistics practices and SC sustainability to be balanced. Furthermore, we validated it by analysing six real cases in different industries. Findings highlight how reverse logistics activities may improve each SC sustainability dimension.


Author(s):  
Massimiliano Simons

In this article, two different claims about nature are discussed. On the one hand, environmental philosophy has forced us to reflect on our position within nature: we are not the masters of nature we were once claimed to be. On the other hand, there are the recent developments within synthetic biology, which claims that now, at last, we can be the masters of nature we have never been before. The question is then raised of how these two claims relate to one another. Rather than stating that they are completely irreconcilable, I will argue for a dialogue aimed at discussing the differences and similarities. The claim is that we should not see these as two successive temporal phases of our relation to nature, but as two tendencies that can coexist.


PMLA ◽  
1945 ◽  
Vol 60 (4-Part1) ◽  
pp. 1106-1129
Author(s):  
H. E. Briggs

Writers on Keats, when discussing the central problem of the effects upon the poet of the contemptuous criticism directed against Endymion and against Keats personally, have presented various and often contradictory opinions. The two extreme views, namely, that Keats was killed by this criticism, or, on the other hand, that he was scarcely touched by it, are no longer regarded as tenable. The current opinion was first stated, I believe, by Sir Sidney Colvin when he wrote:

the reviews of those days, especially the Edinburgh and Quarterly, had a real power of barring the acceptance and checking the sale of an author's work. What actually happened was that when a year or so later [after Endymion was condemned] Keats began to realise the harm which the reviews had done and were doing to his material prospects, these consequences in his darker hours preyed on him severely and conspired with the forces of disease and passion to his undoing.


1995 ◽  
Vol 10 ◽  
pp. 585-587
Author(s):  
Keith Butler

In this paper I review some recent advances in the use of large amounts of atomic data in the modelling of atmospheres and winds of hot stars. The review is highly selective but representative of current developments. A more general overview is to be found in Kudritzki and Hummer (1990), although the field is changing so rapidly that much has happened since then. The paper breaks down into three parts: work on line formation, in which the atmospheric structure is known and held fixed, is described first; then follows a description of the inclusion of line opacities in non-LTE in the atmosphere problem itself; and finally recent developments in the theory of radiatively driven stellar winds are summarized. Here special emphasis is given to a novel distance determination method based entirely on spectroscopic quantities. I close with a brief shopping list.

In a series of papers, Becker and Butler (1992, 1994a,b,c) have investigated iron and nickel spectra in sub-dwarfs using the complete linearization method of Auer and Heasley (1976). The method scales linearly with the number of frequency points, so they were able to use well over 10000 frequencies to adequately describe the line opacities. Several thousand lines were treated explicitly, and the resultant computed spectra gave excellent fits to observed Hubble spectra in the wavelength ranges dominated by the ions concerned. The different ionization stages gave consistent results for the iron and nickel abundances, but only after line-blocking from millions of spectral lines in the far UV had been included. This was done using the Kurucz (1988) line lists coupled with line grouping as suggested by Anderson (1989) and described briefly in the next section.

The line-blanketed atmospheres of Kurucz (1991) are the best available up to about 30000 K, where non-LTE effects start to become important. Non-LTE line-blanketed atmospheres have become feasible because the computational requirements of the accelerated lambda iteration (ALI) method (Werner and Husfeld, 1985) also scale linearly with the number of frequency points. On the other hand, Anderson (1989) suggested grouping energetically adjacent atomic levels together to form pseudo-levels, on the basis that although they might, as a group, be in non-LTE, they should be in LTE with respect to one another due to the large number of collisions between them. This greatly reduces the number of levels to be considered, but instead gives rise to highly complicated pseudo line-profiles. Grigsby et al. (1992), who did not use ALI, constructed the first grid of line-blanketed non-LTE models by using a variation on the Opacity Distribution Function concept to group line opacities into blocks, thereby reducing the number of frequency points required. Dreizler and Werner (1993), on the other hand, were able to sample the opacity as they used ALI in their models.
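Anderson's level-grouping idea can be made concrete with a small sketch: energetically adjacent levels are binned into pseudo-levels, and within each pseudo-level the member populations are distributed according to Boltzmann statistics at the local temperature (the LTE-with-respect-to-one-another assumption mentioned above). The code below is purely illustrative; the energies, statistical weights and bin width are invented and do not come from the paper.

# Illustrative sketch of grouping atomic levels into pseudo-levels (after Anderson 1989).
# Energies, statistical weights and the grouping width are invented for demonstration only.
import numpy as np

K_BOLTZ_EV = 8.617333e-5  # Boltzmann constant in eV/K

def group_levels(energies_ev, weights, bin_width_ev):
    """Bin energetically adjacent levels into pseudo-levels; each pseudo-level
    gets the summed statistical weight and the g-weighted mean energy of its members."""
    order = np.argsort(energies_ev)
    groups, current = [], [order[0]]
    for idx in order[1:]:
        if energies_ev[idx] - energies_ev[current[0]] <= bin_width_ev:
            current.append(idx)
        else:
            groups.append(current)
            current = [idx]
    groups.append(current)

    pseudo = []
    for members in groups:
        g_tot = weights[members].sum()
        e_mean = np.average(energies_ev[members], weights=weights[members])
        pseudo.append({"members": members, "g": g_tot, "energy_ev": e_mean})
    return pseudo

def boltzmann_fractions(pseudo_level, energies_ev, weights, temperature_k):
    """Distribute a pseudo-level's population over its members assuming LTE among
    them: n_i proportional to g_i * exp(-E_i / kT)."""
    m = pseudo_level["members"]
    w = weights[m] * np.exp(-(energies_ev[m] - energies_ev[m].min())
                            / (K_BOLTZ_EV * temperature_k))
    return w / w.sum()

# Hypothetical level list: 12 levels between 0 and 3 eV.
rng = np.random.default_rng(1)
energies = np.sort(rng.uniform(0.0, 3.0, 12))
g = rng.integers(1, 9, 12).astype(float)

pseudo_levels = group_levels(energies, g, bin_width_ev=0.5)
print(f"{len(energies)} levels -> {len(pseudo_levels)} pseudo-levels")
print("fractions within first group:",
      boltzmann_fractions(pseudo_levels[0], energies, g, temperature_k=30000))

The payoff, as in the abstract, is that the statistical-equilibrium problem is solved for a much smaller set of pseudo-levels, at the cost of more complicated pseudo line-profiles.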

