An Introduction to the Calculus of Environments

2020 ◽  
Vol 31 (5) ◽  
pp. 1-22

The paper deals with the modern appropriation of Lucretius’ atomistic philosophy as presented in Louis Althusser’s late writings. The aleatory materialism that Althusser elaborated in fragments from the 1980s argues for the total contingency of any world, which is nothing but an accidental clutch of atoms resulting from a Lucretian clinamen. Althusser interprets “world” in a broad sense, referring both to cosmological and ontological global arrangements and to particular political and practical states of affairs. By claiming that thought and necessity are always determined by a certain connection among atoms, Althusser touches upon the problem of the “principles of cohesion”: the sub-semantic field which determines the semantic but is not itself semantic. However, Althusser describes these principles only metaphorically and without further elaboration. The paper proposes a further development of these principles derived from aleatory materialism. Althusser’s late writings are placed in the context of Leibniz’s and Kant’s thought in order to clarify the importance of Althusser’s problematics for time dj-ing, or TJ-ing: the immanent protocols for intercutting between and stitching together possible worlds and time-series. Building upon Kant’s concept of transcendental schematism, the paper proposes a system of quaternary gestural code and twelve basic environmental types which provide an immanent answer to the question of which principles govern the clutching of atoms. This in turn forms the basis for the operation of a new kind of computer as an alternative to the two basic New Age kinds of machinery, based either on carbon-energy or on silicon-information.

2021 ◽  
Author(s):  
Patrick Bartlein ◽  
Sandy Harrison

The increasing availability of time-evolving or transient palaeoclimatic simulations makes it imperative to develop “best practices” for comparing simulations with palaeoclimatic observations, including both climate reconstructions and environmental data. There are two sets of considerations, temporal and spatial, that should guide those comparisons. The chronology of simulations can in some ways be viewed as exact, as determined by the insolation forcing, but data archiving and reporting conventions, such as reporting summaries that use the modern calendar (which leads to the long-recognized palaeo-calendar effect), can, if ignored, lead to “built-in” temporal offsets of thousands of years in such features as temperature or precipitation maxima or minima. Likewise, there are age uncertainties in time series of palaeoclimatic data that are often ignored, despite the fact that these are large during “climatically interesting times” such as the Younger Dryas chronozone. Similarly, although model resolution is increasing, there is still a mismatch in topography (and its climatic effects) between a model and the “real world” sensed by the palaeoclimatic data sources.

There are existing approaches for dealing with some of these issues, such as calendar-adjustment programs, Monte-Carlo approaches for describing age uncertainties in palaeoclimate time series, or clustering approaches for objectively defining appropriate regions for the calculation of area averages, but there is certainly room for further development. This abstract is intended to serve as a platform for discussion of some of the best practices for data-model comparisons in transient mode.
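To make the Monte-Carlo treatment of age uncertainty concrete, here is a minimal Python sketch (not code from the authors): it perturbs hypothetical age control points within their stated errors, rebuilds a simple linear age-depth model for each draw, and summarizes the resulting spread in the dated position of a temperature maximum. All depths, ages, and the proxy series are invented for illustration; real applications would use a proper age-depth model such as Bacon or BChron.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical age control points (depth in cm, age in cal yr BP, 1-sigma error).
depths_ctrl = np.array([0.0, 50.0, 100.0, 150.0, 200.0])
ages_ctrl   = np.array([500.0, 3200.0, 6800.0, 9500.0, 12900.0])
age_sigma   = np.array([50.0, 80.0, 90.0, 120.0, 150.0])

# Hypothetical proxy record: sample depths and a reconstructed temperature anomaly.
sample_depths = np.linspace(0.0, 200.0, 101)
temperature   = np.sin(sample_depths / 30.0) + 0.1 * rng.standard_normal(101)

n_draws = 1000
tmax_ages = np.empty(n_draws)

for i in range(n_draws):
    # Perturb the control ages, then enforce monotonicity so the age model stays physical.
    perturbed = ages_ctrl + age_sigma * rng.standard_normal(ages_ctrl.size)
    perturbed = np.maximum.accumulate(perturbed)

    # Linear age-depth model for this realization.
    sample_ages = np.interp(sample_depths, depths_ctrl, perturbed)

    # Age of the temperature maximum in this realization.
    tmax_ages[i] = sample_ages[np.argmax(temperature)]

lo, mid, hi = np.percentile(tmax_ages, [2.5, 50.0, 97.5])
print(f"Temperature maximum dated to {mid:.0f} cal yr BP (95% range {lo:.0f}-{hi:.0f})")
```

The same ensemble of age-model realizations can then be carried through a data-model comparison, so that the timing of a simulated and a reconstructed feature are compared as distributions rather than as single dates.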


1969 ◽  
Vol 1 (01) ◽  
pp. 111-122
Author(s):  
P. D. Finch

Many problems arising in the physical and social sciences relate to processes which happen sequentially. Such processes are usually investigated by means of the theory of stationary stochastic processes, but there have been some attempts to develop techniques which are not subject to the conceptual difficulties inherent in the probabilistic approach. These difficulties stem from the fact that in practice one is often restricted to a single record which, from the probabilistic point of view, is only one sample from an ensemble of possible records. In some instances such a viewpoint seems artificial, and for some time series it is questionable whether any objective reality corresponds to the idea of an ensemble of possible time series. For example, as noted in Feller (1967), a theory of probability based on a frequency interpretation cannot meaningfully attach a probability to a statement such as “the sun will rise tomorrow”, because to do so one would have to set up a conceptual universe of possible worlds.


2020 ◽  
Vol 2020 (66) ◽  
pp. 27-45
Author(s):  
أ.م.د. ابتسام علي حسين ◽  
أ.م د. بدر شحدة حمدان

The aim of the research is to measure the impact of financial development on economic growth in Iraq using annual time series for the period 2004-2018 for a number of monetary and financial variables (broad money supply / GDP, capital accumulation rate / GDP, and the ratio of credit granted to the private sector / GDP) representing the development of the financial sector in Iraq. This period was chosen in line with the relatively high rates of economic growth witnessed in Iraq. The study used descriptive and quantitative approaches to build an appropriate econometric model for measuring the impact of financial development on economic growth in Iraq, employing time-series analysis. The results showed that the economic variables contain a unit root and become stationary after first differencing. The variables were then subjected to a cointegration test using the Johansen method, which established the existence of four cointegrating vectors between the research variables, indicating a long-run relationship among them. The Granger causality test found a unidirectional causal relationship running from the financial variables to economic growth. The research found an effect of both the broad money supply / GDP and the ratio of credit granted to the private sector / GDP on economic growth in Iraq, while the capital accumulation rate / GDP was not statistically significant. In light of these results, the research recommends directing domestic credit to productive investments by paying attention to the rate of capital accumulation directed to local investments, and attracting foreign investment by providing a safe and stable legislative environment that supports financial liberalization for the purpose of increasing economic growth rates in Iraq.
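The econometric pipeline described here (unit-root testing, Johansen cointegration, Granger causality) can be sketched with standard tools. The following Python example uses statsmodels on synthetic annual data standing in for the Iraqi series, which are not reproduced in the abstract; all data and variable names are hypothetical, and lag orders are kept deliberately small given the short sample.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, grangercausalitytests
from statsmodels.tsa.vector_ar.vecm import coint_johansen

# Hypothetical annual data, 2004-2018 (15 observations); the real study uses
# official Iraqi monetary and national-accounts statistics.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "m2_gdp":      np.cumsum(rng.normal(0.5, 1.0, 15)),   # broad money supply / GDP
    "credit_gdp":  np.cumsum(rng.normal(0.3, 1.0, 15)),   # private-sector credit / GDP
    "capital_gdp": np.cumsum(rng.normal(0.2, 1.0, 15)),   # capital accumulation / GDP
    "growth":      np.cumsum(rng.normal(0.4, 1.0, 15)),   # economic growth proxy
}, index=pd.Index(range(2004, 2019), name="year"))

# 1) ADF unit-root tests on levels and on first differences (maxlag kept at 1
#    because of the very short annual sample).
for col in df.columns:
    p_level = adfuller(df[col], maxlag=1)[1]
    p_diff = adfuller(df[col].diff().dropna(), maxlag=1)[1]
    print(f"{col}: ADF p-value level={p_level:.3f}, first difference={p_diff:.3f}")

# 2) Johansen cointegration test (constant term, one lagged difference).
jres = coint_johansen(df, det_order=0, k_ar_diff=1)
print("Trace statistics:   ", np.round(jres.lr1, 2))
print("5% critical values: ", np.round(jres.cvt[:, 1], 2))

# 3) Granger causality: does a financial variable help predict growth?
gres = grangercausalitytests(df[["growth", "m2_gdp"]], maxlag=2)
p_lag1 = gres[1][0]["ssr_ftest"][1]
print(f"Granger causality m2_gdp -> growth, lag 1: p = {p_lag1:.3f}")
```

With the actual series, the same steps would reproduce the study's sequence: confirm I(1) behaviour, count cointegrating vectors against the trace critical values, and test the direction of causality from the financial ratios to growth.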


1993 ◽  
pp. 375-378
Author(s):  
Barbara Motnikar ◽  
Drago Čepar ◽  
Peter Žunko ◽  
Marijan Ribarič

Author(s):  
Harriet E. Baber

According to preferentism, the ‘desire theory’ of well-being, one is made better off to the extent that one’s preferences, or desires, are satisfied. According to narrow preferentism, preferentism as it has traditionally been understood, the preferences that matter in this regard are just actual preferences; preferences we ‘might easily have had’ do not matter. On this account, also, only actual preference satisfaction contributes to well-being. Merely possible preference satisfaction, including the ‘real possibility’ of attaining desired states of affairs, does not contribute to well-being. Broad preferentism makes sense of the intuition that feasibility as such contributes to well-being. On this account, we are made better off not only by the actual satisfaction of our actual preferences but also by the mere feasibility of satisfying preferences that we ‘might easily have had’. In addition to making sense of our intuition that feasibility as such contributes to our well-being, broad preferentism provides a rationale for altruistic behavior. On this account, we have reason to support policies that benefit worldmates whose actual circumstances are different from our own, because their circumstances are our circumstances at nearby possible worlds, and our circumstances at other possible worlds affect our own actual well-being.


Synthese ◽  
2021 ◽  
Author(s):  
Fabio Lampert ◽  
Pedro Merlussi

In a recent article, P. Roger Turner and Justin Capes argue that no one is, or ever was, even partly morally responsible for certain world-indexed truths. Here we present our reasons for thinking that their argument is unsound: it depends on the premise that possible worlds are maximally consistent states of affairs, which is, under plausible assumptions concerning states of affairs, demonstrably false. Our argument to show this is based on Bertrand Russell’s original ‘paradox of propositions’. We should then opt for a different approach to explain world-indexed truths, whose upshot is that we may be (at least partly) morally responsible for some of them. The result to the effect that there are no maximally consistent states of affairs is independently interesting, though, since this notion motivates an account of the nature of possible worlds in the metaphysics of modality. We also register in this article, independently of our response to Turner and Capes, and in the spirit of Russell’s aforementioned paradox and many other versions thereof, a proof of the claim that there is no set of all true propositions one can render false.
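For readers unfamiliar with the paradox of propositions, the Cantorian core of the argument against maximally consistent states of affairs can be sketched as follows. This is the standard reconstruction, under the structured-propositions assumption the abstract alludes to, and not necessarily the exact formulation used in the article.

```latex
% A standard Cantorian reconstruction of the paradox of propositions.
% Assume a maximally consistent state of affairs $w$, and let $S$ be the set of
% all propositions true at $w$. For each $T \subseteq S$, let $p_T$ be the
% proposition that every member of $T$ obtains. On a structured conception of
% propositions, distinct subsets yield distinct propositions, so the map
\[
  T \mapsto p_T
\]
% is injective. Since every member of any $T \subseteq S$ is true at $w$,
% each $p_T$ is itself true at $w$ and hence belongs to $S$. This yields an
% injection
\[
  \mathcal{P}(S) \hookrightarrow S,
\]
% contradicting Cantor's theorem $|\mathcal{P}(S)| > |S|$. So there is no such
% set $S$, and hence no maximally consistent, set-sized state of affairs.
```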



2020 ◽  
Vol 46 (3-4) ◽  
pp. 159-200
Author(s):  
Friederike Moltmann

This paper gives an outline of truthmaker semantics for natural language against the background of standard possible-worlds semantics. It develops a truthmaker semantics for attitude reports and deontic modals based on an ontology of attitudinal and modal objects and on a semantic function of clauses as predicates of such objects. The semantics is applied to factive verbs and response-stance verbs as well as to cases of modal concord. The paper also presents new motivations for ‘object-based truthmaker semantics’ from intensional transitive verbs such as need, look for, own, and buy and gives an outline of their semantics based on a further development of truthmaker semantics.
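For orientation, the contrast with possible-worlds semantics can be stated with the standard exact-truthmaker clauses (in the style of Fine); these are background to, not a statement of, the object-based semantics developed in the paper.

```latex
% Standard exact-truthmaker clauses, given for orientation only.
% States form a part-whole structure with fusion $\sqcup$; $s \Vdash A$ reads
% "state $s$ exactly verifies $A$", and $s \dashv A$ "state $s$ exactly falsifies $A$".
\begin{align*}
  s \Vdash A \wedge B &\iff \exists t, u\,\bigl(s = t \sqcup u \ \wedge\ t \Vdash A \ \wedge\ u \Vdash B\bigr)\\
  s \Vdash A \vee B   &\iff s \Vdash A \ \text{or}\ s \Vdash B\\
  s \Vdash \neg A     &\iff s \dashv A
\end{align*}
% By contrast, possible-worlds semantics evaluates $A$ at entire worlds $w$,
% so the verifying entity is never required to be wholly relevant to $A$.
```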

