standard models
Recently Published Documents

TOTAL DOCUMENTS: 594 (FIVE YEARS: 178)
H-INDEX: 35 (FIVE YEARS: 7)

Author(s):  
Sylvie Geisendorf ◽  
Christian Klippert

The paper proposes an agent-based evolutionary ecological-economic model that captures the link between the economy and the ecosystem in a more inclusive way than standard economic optimization models do. We argue that an evolutionary approach is required to understand the integrated dynamics of both systems, i.e. micro–macro feedbacks. In the paper, we illustrate that claim by analyzing the non-triviality of finding a sustainability policy mix as a use case for such a coupled system. The model has three characteristics distinguishing it from traditional environmental and resource economic models: (1) it implements a multi-dimensional link between the economic and the ecological system, considering side effects of production, and thus combines the analyses of environmental and resource economics; (2) following the biological literature, it uses a discrete-time approach for the biological resource, allowing for the whole range of stability regimes instead of artificially stabilizing the system; and (3) it links this resource system to an evolving, agent-based economy (on the basis of a Nelson-Winter model) with boundedly rational decision makers instead of the standard optimization model. The policy case illustrates the relevance of the proposed integrated assessment, as it delivers some surprising results on the effects of combined and consecutively introduced policies that would go unnoticed in standard models.
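
A minimal Python sketch (not the authors' implementation) of the kind of discrete-time resource dynamics the abstract refers to: a logistic-map resource with a proportional harvest term, whose long-run behaviour ranges from a stable fixed point to cycles and chaos depending on the growth parameter. All names and parameter values are illustrative.

```python
import numpy as np

def resource_trajectory(r, K=1.0, harvest=0.05, x0=0.3, steps=200):
    """Discrete-time logistic resource with a constant proportional harvest:

        x[t+1] = x[t] + r * x[t] * (1 - x[t] / K) - harvest * x[t]

    Depending on r, the map settles to a fixed point, a periodic cycle,
    or chaotic fluctuations -- the 'whole range of stability regimes'.
    """
    x = np.empty(steps)
    x[0] = x0
    for t in range(steps - 1):
        x[t + 1] = max(x[t] + r * x[t] * (1.0 - x[t] / K) - harvest * x[t], 0.0)
    return x

if __name__ == "__main__":
    for r in (1.5, 2.3, 2.9):  # roughly: stable, oscillatory, chaotic (illustrative)
        print(f"r = {r}: last states {np.round(resource_trajectory(r)[-4:], 3)}")
```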


Galaxies ◽  
2021 ◽  
Vol 10 (1) ◽  
pp. 3
Author(s):  
Guillermo Torres ◽  
Gregory A. Feiden ◽  
Andrew Vanderburg ◽  
Jason L. Curtis

Main-sequence stars with convective envelopes often appear larger and cooler than predicted by standard models of stellar evolution for their measured masses. This is believed to be caused by stellar activity. In a recent study, accurate measurements were published for the K-type components of the 1.62-day detached eclipsing binary EPIC 219511354, showing the radii and temperatures of both stars to be affected by these discrepancies. This is a rare example of a system in which the age and chemical composition are known, by virtue of its membership in the well-studied open cluster Ruprecht 147 (age ~3 Gyr, [Fe/H] = +0.10). Here, we report a detailed study of this system with nonstandard models incorporating magnetic inhibition of convection. We show that these calculations are able to reproduce the observations largely within their uncertainties, providing robust estimates of the strength of the magnetic fields on both stars: 1600 ± 130 G and 1830 ± 150 G for the primary and secondary, respectively. Empirical estimates of the magnetic field strengths based on the measured X-ray luminosity of the system are roughly consistent with these predictions, supporting this mechanism as a possible explanation for the radius and temperature discrepancies.


2021 ◽  
Vol 81 (12) ◽  
Author(s):  
Jacob Hollingsworth ◽  
Michael Ratz ◽  
Philip Tanedo ◽  
Daniel Whiteson

Models of physics beyond the Standard Model often contain a large number of parameters. These form a high-dimensional space that is computationally intractable to fully explore. Experimental results project onto a subspace of parameters that are consistent with those observations, but mapping these constraints to the underlying parameters is also typically intractable. Instead, physicists often resort to scanning small subsets of the full parameter space and testing for experimental consistency. We propose an alternative approach that uses generative models to significantly improve the computational efficiency of sampling high-dimensional parameter spaces. To demonstrate this, we sample the constrained and phenomenological Minimal Supersymmetric Standard Models subject to the requirement that the sampled points are consistent with the measured Higgs boson mass. Our method achieves orders of magnitude improvements in sampling efficiency compared to a brute force search.
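
A toy illustration of the general approach (not the authors' pipeline): fit a simple generative model, here a Gaussian mixture standing in for a deep generative model, to parameter points that already pass a constraint, then propose new points from it. The constraint function, dimensionality and all numbers below are hypothetical stand-ins for "consistent with the measured Higgs mass".

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
DIM = 6  # stand-in for a high-dimensional model parameter space

def consistent(theta):
    """Hypothetical 'experimental' constraint: a thin shell in parameter space."""
    return np.abs(np.linalg.norm(theta, axis=-1) - 2.0) < 0.2

# Brute-force scan: uniform proposals, most of which fail the constraint.
brute = rng.uniform(-3, 3, size=(200_000, DIM))
accepted = brute[consistent(brute)]
print(f"brute-force efficiency: {len(accepted) / len(brute):.3%}")

# Generative step: fit a mixture model to the accepted points and
# propose from it instead of from the flat prior.
gm = GaussianMixture(n_components=8, random_state=0).fit(accepted)
proposals, _ = gm.sample(50_000)
print(f"generative efficiency:  {consistent(proposals).mean():.3%}")
```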


2021 ◽  
Vol 12 (1) ◽  
Author(s):  
Lluís Hernández-Navarro ◽  
Ainhoa Hermoso-Mendizabal ◽  
Daniel Duque ◽  
Jaime de la Rocha ◽  
Alexandre Hyafil

Standard models of perceptual decision-making postulate that a response is triggered in reaction to stimulus presentation when the accumulated stimulus evidence reaches a decision threshold. This framework, however, excludes the possibility that informed responses are generated proactively, at a time independent of the stimulus. Here, we find that, in a free reaction time auditory task in rats, reactive and proactive responses coexist, suggesting that choice selection and motor initiation, commonly viewed as serial processes, are decoupled in general. We capture this behavior with a novel model in which proactive and reactive responses are triggered whenever either of two competing processes, respectively Action Initiation or Evidence Accumulation, reaches a bound. In both types of response, the choice is ultimately informed by the Evidence Accumulation process. The Action Initiation process readily explains premature responses, contributes to urgency effects at long reaction times and mediates the slowing of responses as animals become satiated and tired during sessions. Moreover, it successfully predicts reaction time distributions when the stimulus is delayed, advanced or omitted. Overall, these results fundamentally extend standard models of evidence accumulation in decision making by showing that proactive and reactive processes compete for the generation of responses.
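
A schematic simulation of the two-process race described above, with illustrative (not fitted) parameters and functional forms: an Evidence Accumulation (EA) diffusion and a stimulus-independent Action Initiation (AI) ramp race to their bounds; whichever crosses first triggers the response, and the choice is read out from the sign of EA in either case.

```python
import numpy as np

rng = np.random.default_rng(1)

def trial(drift=1.2, ea_bound=1.0, ea_noise=1.0,
          ai_rate=0.8, ai_noise=0.5, ai_bound=1.0,
          dt=0.001, stim_onset=0.3, max_t=3.0):
    """One simulated trial. EA accumulates noisy stimulus evidence after
    stimulus onset; AI ramps up from trial start, independently of the
    stimulus. The first process to hit its bound triggers the response."""
    ea, ai, t = 0.0, 0.0, 0.0
    while t < max_t:
        t += dt
        ai += ai_rate * dt + ai_noise * np.sqrt(dt) * rng.standard_normal()
        if t >= stim_onset:
            ea += drift * dt + ea_noise * np.sqrt(dt) * rng.standard_normal()
        if abs(ea) >= ea_bound:
            return t, "reactive", ea > 0
        if ai >= ai_bound:
            return t, "proactive", ea > 0
    return max_t, "timeout", ea > 0

results = [trial() for _ in range(2000)]
proactive = [r for r in results if r[1] == "proactive"]
print(f"proactive responses: {len(proactive) / len(results):.1%}")
print(f"premature (before stimulus onset at 0.3 s): "
      f"{np.mean([rt < 0.3 for rt, _, _ in proactive]):.1%} of proactive trials")
```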


2021 ◽  
Vol 3 (4) ◽  
pp. 455-470
Author(s):  
David Dillenberger ◽  
Uzi Segal

We study a simple variant of the house allocation problem (one-sided matching). We demonstrate that agents with recursive preferences may systematically prefer one allocation mechanism to the other, even among mechanisms that are considered to be the same in standard models, in the sense that they induce the same probability distribution over successful matchings. Using this, we propose a new priority groups mechanism and provide conditions under which it is preferred to two popular mechanisms, random top cycle and random serial dictatorship. (JEL C78, D44, D82)
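
For concreteness, a toy implementation of one of the mechanisms mentioned, random serial dictatorship, under the usual textbook assumptions (strict ordinal preferences, one object per agent). The recursive-preference evaluation that drives the paper's comparison between mechanisms is not modelled here.

```python
import random

def random_serial_dictatorship(preferences, seed=None):
    """Random serial dictatorship for one-sided matching.

    preferences: dict mapping each agent to a list of objects,
    most preferred first. Agents are ordered uniformly at random;
    each in turn takes their favourite object still available.
    """
    rng = random.Random(seed)
    agents = list(preferences)
    rng.shuffle(agents)
    available = {obj for prefs in preferences.values() for obj in prefs}
    allocation = {}
    for agent in agents:
        for obj in preferences[agent]:
            if obj in available:
                allocation[agent] = obj
                available.remove(obj)
                break
    return allocation

prefs = {
    "a1": ["h1", "h2", "h3"],
    "a2": ["h1", "h3", "h2"],
    "a3": ["h2", "h1", "h3"],
}
print(random_serial_dictatorship(prefs, seed=7))
```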


Author(s):  
ANDREW T. LITTLE ◽  
KEITH E. SCHNAKENBERG ◽  
IAN R. TURNER

Does motivated reasoning harm democratic accountability? Substantial evidence from political behavior research indicates that voters have “directional motives” beyond accuracy, which is often taken as evidence that they are ill equipped to hold politicians accountable. We develop a model of electoral accountability with voters as motivated reasoners. Directional motives have two effects: (1) divergence—voters with different preferences hold different beliefs, and (2) desensitization—the relationship between incumbent performance and voter beliefs is weakened. While motivated reasoning does harm accountability, this is generally driven by desensitized voters rather than polarized partisans with politically motivated divergent beliefs. We also analyze the relationship between government performance and vote shares, showing that while motivated reasoning always weakens this relationship, we cannot infer that accountability is also harmed. Finally, we show that our model can be mapped to standard models in which voters are fully Bayesian but have different preferences or information.
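
A stylized numerical illustration (not the paper's model) of the two effects: here beliefs are a convex combination of a Bayesian update on incumbent performance and a directional target belief, so voters with different targets diverge and the sensitivity of beliefs to performance shrinks. The functional form and every number are hypothetical.

```python
def motivated_posterior(prior, performance, weight_on_signal, target, motive):
    """Belief about incumbent quality for a motivated reasoner.

    weight_on_signal: weight a fully Bayesian voter puts on performance (0..1).
    target: the belief the voter is motivated to hold.
    motive: strength of the directional motive (0 = fully Bayesian).
    """
    bayesian = (1 - weight_on_signal) * prior + weight_on_signal * performance
    return (1 - motive) * bayesian + motive * target

for performance in (-1.0, 0.0, 1.0):
    bayes = motivated_posterior(0.0, performance, 0.6, target=0.0, motive=0.0)
    left = motivated_posterior(0.0, performance, 0.6, target=-1.0, motive=0.4)
    right = motivated_posterior(0.0, performance, 0.6, target=+1.0, motive=0.4)
    print(f"performance {performance:+.1f}: Bayesian {bayes:+.2f}, "
          f"left-motivated {left:+.2f}, right-motivated {right:+.2f}")
# Divergence: left- and right-motivated voters hold different beliefs for the
# same performance. Desensitization: their beliefs move 0.36 per unit of
# performance instead of the Bayesian 0.6.
```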


2021 ◽  
Author(s):  
Irene Cogliati Dezza ◽  
Christina Maher ◽  
Tali Sharot

Information can strongly impact people's affect, their level of uncertainty and their decisions. It is assumed that people seek information with the goal of improving all three. But are they successful at achieving this goal? Answering this question is important for assessing the impact of self-driven information consumption on people's well-being. Here, over four experiments (total N = 518), we show that participants accurately predict the impact of information on their internal states (e.g., affect and cognition) and external outcomes (e.g., material rewards), and use these predictions to guide information-seeking choices. A model incorporating participants' subjective expectations regarding the impact of information on their affective, cognitive, and material outcomes accounted for information-seeking choices better than standard models currently used in the literature, which include objective proxies of those subjective measures. This model also accounted for individual differences in information-seeking choices. By balancing considerations of the impact of information on affective, cognitive and material outcomes when seeking knowledge, participants became happier, more certain and earned more points when they purchased information relative to when they did not, suggesting they adopted an adaptive strategy.
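
A schematic sketch of the kind of choice model described, with hypothetical variable names and weights: the probability of purchasing information is a logistic function of its subjectively expected impact on affective, cognitive (uncertainty) and material outcomes. In the study such weights would be estimated per participant; the form here is illustrative only.

```python
import math

def p_seek_information(exp_affect, exp_uncertainty_drop, exp_reward,
                       w_affect=1.0, w_cognitive=1.0, w_material=1.0,
                       cost=0.5, temperature=1.0):
    """Probability of choosing to purchase information, driven by the
    participant's *expectations* about its impact (not objective proxies)."""
    value = (w_affect * exp_affect
             + w_cognitive * exp_uncertainty_drop
             + w_material * exp_reward
             - cost)
    return 1.0 / (1.0 + math.exp(-value / temperature))

# Information expected to feel bad but to resolve uncertainty and pay off:
print(round(p_seek_information(exp_affect=-0.5, exp_uncertainty_drop=0.8,
                               exp_reward=0.6), 3))
```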


2021 ◽  
Author(s):  
◽  
Konstantin Kvatch

The thesis will have two main parts. First, let us start with an example. In finance, the standard version of the Black-Scholes formula is a beautiful closed-form solution used to price European options. This famous formula is ingenious, but has a flaw that relegates it to something that should be admired, and perhaps not used in the real world: it relies on the assumption that prices of shares evolve according to geometric Brownian motion. This means that we are willing to accept that extreme shocks to prices are almost impossible. Is this a realistic assumption? Of course not. The stock market crashes of 1929 and 1987 are great examples showing that extreme events do happen. More recently, the 1997 Asian crisis and the 2000 crash of the NASDAQ show that, in addition, such events are not so rare. These jumps occur even more frequently, and are larger in magnitude, for the share prices of individual companies. This problem is by no means new, and a plethora of models and pricing techniques have been developed; the standard Black-Scholes formula is just one example, but it illustrates the matter at hand. The process that we use to model a financial time series is of paramount importance, whether we do it for forecasting purposes or for pricing financial derivatives. If we choose a model that does not capture the key empirical aspects of the data, then any subsequent inference may be very unfavourably biased. It is because of this problem that we should investigate the standard modelling that assumes continuity and a normal or log-normal distribution of financial time series. We will begin from the very basics, and we will see that this is a wonderful piece of theory, deserving of its reputation for being simple, groundbreaking and extremely useful. This work should bring us to a position where we can evaluate a second goal. Stochastic processes with jumps and "heavy tails" have existed for some time, but have begun to filter through to the financial industry only recently. This lag is due to the perceived added conceptual difficulty of introducing such models, although we will see that this should not be the case. There is plenty of real evidence that financial time series exhibit discontinuous behaviour and that these series are far from normally or log-normally distributed. Rather than looking at standard models as correct, and jump or stochastic volatility models as complicated, we should look upon standard models as educational but not sufficient for the real world; stochastic volatility or jump models should instead be viewed as natural. The theme of the thesis is the importance of choosing a correct model for the underlying process. Although we may speak of the implications of some models for hedging, we will not actually look at specific hedging techniques. The particular aspect of pricing is also not considered in full scope, although we will see the Black-Scholes pricing formula. We consider the main problem to be specifying the model correctly, with the method of pricing a subsequent technicality; in examples we may take pricing tools such as Monte Carlo simulation as given. We will not strive for full generality or formality, but rather take a physical approach and aim for clarity and understanding. Let us now move on to the beginning, with the introduction of our primary source of randomness.
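
Since the passage turns on the Black-Scholes formula and on Monte Carlo pricing under alternative dynamics, here is a compact sketch of both under textbook assumptions: the closed-form Black-Scholes call price for geometric Brownian motion, and a Monte Carlo price when Poisson jumps in the log-price (a Merton-style jump-diffusion) are added to the same diffusion. All parameter values are illustrative.

```python
import numpy as np
from scipy.stats import norm

def black_scholes_call(S0, K, r, sigma, T):
    """Standard Black-Scholes price of a European call under GBM."""
    d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def mc_call_with_jumps(S0, K, r, sigma, T, lam, mu_j, sigma_j, n=200_000, seed=0):
    """Monte Carlo call price when the log-price also receives Poisson-arriving
    normal jumps (Merton-style jump-diffusion), simulated directly at maturity."""
    rng = np.random.default_rng(seed)
    kappa = np.exp(mu_j + 0.5 * sigma_j**2) - 1.0   # drift compensation for jumps
    n_jumps = rng.poisson(lam * T, n)
    jumps = mu_j * n_jumps + sigma_j * np.sqrt(n_jumps) * rng.standard_normal(n)
    z = rng.standard_normal(n)
    ST = S0 * np.exp((r - 0.5 * sigma**2 - lam * kappa) * T
                     + sigma * np.sqrt(T) * z + jumps)
    return np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()

S0, K, r, sigma, T = 100.0, 100.0, 0.02, 0.2, 1.0
print(f"Black-Scholes (pure GBM):     {black_scholes_call(S0, K, r, sigma, T):.3f}")
print(f"Monte Carlo with jumps added: "
      f"{mc_call_with_jumps(S0, K, r, sigma, T, lam=0.5, mu_j=-0.1, sigma_j=0.15):.3f}")
```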

